Correct use of measurement tools lays the foundation for quality
March 24, 2009
It's more important than ever for your employees—particularly entry-level personnel—to learn to use measurement tools to their full potential.
Measurement tools form the basic foundation of quality at any manufacturer. In the sheet metal world, precision shops need to hold parts to tighter and tighter tolerances. As a result, it's more important than ever for your employees—particularly entry-level personnel—to learn to use these tools to their full potential.
Rulers and tape measures, common in sheet metal shops that measure large parts, are straightforward to use. Versions found at hardware stores, however, may be off significantly: 12 in. isn't likely to be exactly 12 in. A precision steel ruler, by contrast, may be accurate to 0.001 in., depending on the length of the rule (see Figure 1). Longer rulers have larger tolerances.
Tape measures follow the same tolerance guideline, though margins of error are larger. Up to 12 ft., a precision tape measure will be accurate to about ±1⁄32 in.; beyond 12 ft., it's within ±1⁄16 in. Other areas for potential error include the hook at the end, which, if not precisely assembled, can cause inaccurate measurements.
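As a minimal sketch of the rule of thumb above, the tolerance band can be expressed as a simple lookup by length. The function name and interface here are hypothetical, not from the article:

```python
def tape_tolerance_in(length_ft):
    """Approximate worst-case error (in inches) for a precision tape
    measure at a given length: +/-1/32 in. up to 12 ft, +/-1/16 in.
    beyond 12 ft, per the guideline quoted above."""
    return 1 / 32 if length_ft <= 12 else 1 / 16

print(tape_tolerance_in(10))  # 0.03125
print(tape_tolerance_in(20))  # 0.0625
```

This assumes a single step at 12 ft; actual tolerance classes vary by manufacturer and should be taken from the tape's own specification.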
With demand for increasingly tight tolerances, precision sheet metal shops may want to consider calibrated steel rulers. Calibrated with blocks and gauges traceable to a metrology lab (including the national lab at the National Institute of Standards and Technology), precision rulers and tape measures can add a degree of confidence when measuring. These devices come with a certificate showing they have been calibrated, along with data showing how far off each inch mark is to the measurement standard.
The most accurate hand-held tool available for skilled operators, micrometers come in various types, including digital, vernier, inside, bench, and specialized models. For most measurements, you hold the micrometer as shown in Figure 2. The work is placed against the anvil with the left hand while the spindle is turned down to the work with the thumb and index finger of the right hand.
You can adjust micrometers in two steps. First, to eliminate play in the spindle, back off the thimble, insert a spanner wrench (likely furnished with the micrometer) into the adjusting nut, and tighten just enough to eliminate the play.
Next, to adjust to a zero reading, clean all dirt or grit from the measuring faces: gently close the spindle to the anvil with a clean piece of paper between them, then pull the paper out while pressure is still applied. Close the faces, insert the spanner wrench in the small slot of the sleeve, and turn the sleeve until its zero line coincides with the zero line on the thimble.
While they don't offer the same degree of precision as a micrometer, slide calipers provide more measurement range than a single micrometer (see Figure 3). Slide calipers include electronic, mechanical, dial, vernier, and plain versions.
The best digital and dial slide calipers, regardless of resolution, are accurate to within 0.001 in. every 6 in. The best vernier calipers are accurate to 0.0005 in. per foot.
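One common reading of these accuracy figures is that the worst-case error accumulates per interval of travel. The sketch below illustrates that interpretation; the function names are hypothetical, and actual accuracy should come from the caliper's own certificate:

```python
import math

def dial_caliper_accuracy(measured_in):
    """Worst-case error (in.) for a digital/dial slide caliper,
    assuming 0.001 in. per 6 in. of travel, rounded up to the
    next full interval."""
    return 0.001 * math.ceil(measured_in / 6)

def vernier_caliper_accuracy(measured_in):
    """Worst-case error (in.) for a vernier caliper, assuming
    0.0005 in. per foot of travel, rounded up."""
    return 0.0005 * math.ceil(measured_in / 12)

print(dial_caliper_accuracy(6))      # 0.001
print(vernier_caliper_accuracy(12))  # 0.0005
```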
Slide calipers have two knurled thumb pieces on the slide, which make it easy to open or close the jaws, and a knurled clamping screw with a left-hand thread for locking the slide at any desired setting. The thumb on the same hand that holds the tool can be used for both of these adjustments. The slide also has a stop, preventing it from being entirely withdrawn from the body.
Because slide caliper measuring surfaces are not in line with the beam of the caliper, take care not to use too much measuring pressure, which can spring the jaws. A general rule is to use good judgment and a light, minimum measuring pressure, often in the half-pound range.
To check or set the separate ID nibs on a caliper, you can use a micrometer or ring gauge. Individual "feel" is important when measuring an ID because the measuring surfaces are so thin that small pressure changes can affect the reading by as much as 0.001 in. Also, be sure to keep the sliding surfaces clean and lightly oiled.
A micrometer depth gauge (see Figure 4) measures the depth of holes, slots, recesses, and other geometries and is available in electronic, mechanical-digital, and standard readouts. The tool consists of a hardened, ground, and lapped base combined with a micrometer head. Measuring rods are inserted through a hole in the micrometer screw and brought to a positive seat by a knurled nut.
The reading is taken exactly the same as with an outside micrometer except that sleeve graduations run in the opposite direction. To obtain a reading using a rod other than the 0- to 1-in. rod, it is necessary to consider the additional rod length. For example, if the 1- to 2-in. rod is being used, 1 in. must be added to the reading on the sleeve and thimble.
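The rod-offset arithmetic above can be sketched in a few lines. This is an illustrative example only; the function name and the sample readings are hypothetical:

```python
def depth_reading(rod_min_in, sleeve_in, thimble_in):
    """Combine the measuring rod's minimum length with the sleeve
    and thimble readings to get the measured depth in inches.
    E.g., the 1- to 2-in. rod contributes a 1-in. offset."""
    return rod_min_in + sleeve_in + thimble_in

# Using the 1- to 2-in. rod with a sleeve reading of 0.350 in.
# and a thimble reading of 0.012 in.:
print(round(depth_reading(1.0, 0.350, 0.012), 3))  # 1.362
```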
Before using the micrometer depth gauge, be sure that the base and end of the rod and work are wiped clean and that the rod is properly seated in the micrometer head. Hold the base firmly against the work and turn the thimble until the rod contacts the bottom of the slot or recess. Tighten the lock nut and remove the tool from the work to read the measurement.
The sense of touch becomes important when you're using contact measuring tools. A highly skilled worker with a highly developed "feel" can readily detect a difference in contact made by changes in a dimension as small as 0.00025 in. This sensitivity can be developed over time with practice.
The sense of touch is most prominent in the fingertips. Therefore, a contact measuring tool should be balanced properly in the hand and held lightly when handling or moving the tool. If you harshly grasp the tool, you can reduce the sense of touch that is important in precision measuring.
Scott Robinson is technical support manager and Mike Baczewski is calibration certification manager for L.S. Starrett Co., 121 Crescent St., Athol, MA 01331, 978-249-3551, www.starrett.com.