I'm not new to casting and know that a number of variables will affect the final weight of cast slugs. The question I have for you folks is: for accuracy, how big a variance in weight is OK? I have begun weighing each bullet and separating them by weight. This is not rocket science, and although time consuming it is a simple process.

The slug I have started with is a Lyman 250 grain slug, .380 diameter, to be shot in a 38/55 Sharps. The vast majority are coming in at 250 to 253 grains. Bullets outside this range are discarded, and those remaining are separated by weight. The first test batches are all plus or minus .5 grains: 249.6 to 250.5 go in the 250 grain batch, 250.6 to 251.5 go in the 251 grain pile.

Does that allowable variance seem too big? Would a tighter allowance really be worth the effort? I can break them down to plus or minus 1 or 2 tenths of a grain, but would it be worth it? Another thought would be to settle on a given percentage of the bullet weight. I don't know, maybe plus or minus 2% of bullet weight?

I will also be weighing and separating slugs for .30 caliber and for both 45/70 and 45/90 rifles. In the past most all my casting has been for Cowboy Action shooting, where "Minute of Cowboy" at a few feet was all that mattered.
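For what it's worth, the sorting scheme above can be sketched in a few lines of code, just to make the bin boundaries explicit. This is only an illustration of the arithmetic, not anything from the original post: the `sort_slugs` function name and the sample weights in `batch` are made up. Note that the post's bins (249.6 to 250.5 in the 250 pile, 250.6 to 251.5 in the 251 pile) mean a weight ending in exactly .5 rounds *down*, so plain rounding won't do.

```python
import math
from collections import defaultdict

def sort_slugs(weights, keep_lo=250, keep_hi=253):
    """Bin bullet weights (grains) into whole-grain piles; discard out-of-range slugs.

    Matches the scheme in the post: x.6 up through (x+1).5 goes in the
    (x+1)-grain pile, e.g. 249.6-250.5 -> 250 gr, 250.6-251.5 -> 251 gr.
    """
    piles = defaultdict(list)
    rejects = []
    for w in weights:
        label = math.ceil(w - 0.5)   # 250.5 -> 250, 250.6 -> 251
        if keep_lo <= label <= keep_hi:
            piles[label].append(w)
        else:
            rejects.append(w)
    return dict(piles), rejects

# Hypothetical weighed batch, purely for illustration:
batch = [249.8, 250.5, 250.6, 251.5, 252.9, 248.9, 253.8]
piles, rejects = sort_slugs(batch)

# For comparison with the +/- 0.5 gr piles: a 2% tolerance on a
# 250 gr slug works out to +/- 5 grains, which is much looser.
tol = 0.02 * 250   # 5.0 grains
```

Running this, the 248.9 and 253.8 slugs land in `rejects`, and the rest sort into the 250, 251, and 253 grain piles. It also makes the percentage idea concrete: 2% of 250 grains is 5 grains, so a percentage that matches the ±0.5 gr piles would be closer to 0.2%.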