Established in 1952
Thursday, May 23, 2013
There is a cause for everything that happens. Many readers will find what I am going to say to be heresy: there simply is no such thing as a random error. Random errors are simply systematic errors whose causes are, for the present, unknown.
There is an often-told story about a measuring technician at NIST (then NBS). She was excellent at getting repeatable dimensional results. She would measure parts to an exact number, where other technicians only got a jumble of measurements. When a new director took over the lab, he asked her if she could teach him how she was able to do this. She explained to him that she simply made a lot of measurements and threw away the outliers. Now this may inspire a chuckle or two, but when I delve into some of the software being used in metrology labs today, this is actually what is being done.
It seems that there is already an accepted jargon in place for this practice: such routines are referred to as “Thinning Algorithms.”
In one case that I was exposed to, the software simply used an arithmetic band-pass filter that eliminated 27% of the outlying data. That wouldn’t be so bad, but this was C.M.M. calibration software, so the machine that you are using to measure parts has already been “purified.” It only gives you the data that you want to see.
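To make concrete what a “thinning algorithm” amounts to, here is a minimal sketch of a symmetric trim that discards the outermost fraction of readings. The function name, the sample data, and the even split between the two tails are my own illustrative assumptions; the 27% default echoes the case above, not any known commercial implementation.

```python
def trim_outliers(readings, discard_fraction=0.27):
    """Discard the outermost fraction of readings, split evenly
    between the low and high tails, and return the survivors.

    Illustrative sketch only -- not the actual algorithm in any
    CMM package.  The 27% default echoes the case described in
    the text; the even split between tails is an assumption.
    """
    ordered = sorted(readings)
    per_tail = int(len(ordered) * discard_fraction / 2)
    return ordered[per_tail:len(ordered) - per_tail] if per_tail else ordered

# A "jumble" of readings with two genuine outliers:
data = [10.001, 10.002, 9.999, 10.000, 10.003, 9.998,
        10.001, 10.050, 9.950, 10.002]
kept = trim_outliers(data)   # 9.950 and 10.050 are silently gone
```

The outliers never reach the operator, which is exactly the point: whatever physical cause produced them is hidden, not investigated.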
Our Bal-tec division designed and built what may be the most accurate sphericity measuring machine in the world. This was done by manipulating the position of the test sphere to 64 precise positions in each of three precisely orthogonal axes. The test sphere is lifted, repositioned, and then lowered into a kinematic couple by a robotic mechanism.
In each one of the three orthogonal positions, we would get seven or so outliers. If we used any form of averaging, it would obviously pollute the true character of the test sphere. These seven outliers amount to over 10% of all of the readings. If we filtered them out, we would be diluting the data and possibly massaging the outcome of the measurement.
When I approached Dr. Dan DeBra at Stanford University regarding the dilemma of the outliers, he offered to lend us his graduate class for a day to brainstorm the problem. In distilling the literally hundreds of ideas that came out of this session, I came to the conclusion that even though these errors were random, they were actually systematic, or systematically random: they occurred every time, but in different places. The original kinematic couple that we used consisted of three angstrom-quality diamond spheres.
The solution to our dilemma turned out to be the substitution of an electrically grounded, diamond-finished flat gold plate for one of the diamond spheres. All of the outliers, and I do mean all of the outliers, disappeared!
I believe that two things happened. The low Young’s modulus (stiffness) of the gold plate gave a little wrap-around of the ball. This improved the electrical contact between any high-resistance areas, such as a chrome carbide particle in the matrix of the steel ball, and the frame of the machine. In addition, the excellent electrical conductivity of the gold plate greatly reduced the overall contact resistance between the ball and the grounded frame of the machine. We used a capacitance sensor, with a seven-digit voltmeter for the readout, that was downloaded directly to the computer.
On this same machine, we had a slow drift that we attributed to temperature change. We added an extra layer of insulation to the system without any definite results. The environment already consisted of a small copper room, on top of the vibration isolation table, that was insulated and then surrounded by a large, insulated steel room, all of which was housed inside a clean, temperature-controlled lab. The modern terminology for this approach is the “ONION”: layer on layer of insulated chambers. The earliest reference that I have found for this technique was in the 1940s, for the quartz-crystal temperature-controlled ovens of very accurate clocks.
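The logic of the ONION can be sketched numerically. If each insulated chamber is treated as a first-order thermal low-pass stage, the attenuations of the stages multiply. The time constants and the 24-hour lab cycle below are hypothetical round numbers chosen for illustration, not measurements from our machine:

```python
import math

def stage_attenuation(period_h, time_constant_h):
    """Amplitude ratio of a sinusoidal temperature swing after
    passing through one first-order thermal stage (one insulated
    chamber) with the given thermal time constant."""
    omega = 2 * math.pi / period_h
    return 1 / math.sqrt(1 + (omega * time_constant_h) ** 2)

# Hypothetical numbers: a 24-hour lab temperature cycle passing
# through three nested chambers with thermal time constants of
# 2 h, 8 h, and 20 h.
total = 1.0
for tau_h in (2, 8, 20):
    total *= stage_attenuation(24, tau_h)
# total is about 0.072: a 1 degree C daily swing in the lab
# reaches the innermost chamber as roughly 0.07 degrees C.
```

Each additional layer buys another factor of attenuation, which is why nesting several modest chambers beats one heroic one.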
I spoke with a technician from the Lockheed metrology lab in Sunnyvale, California, who reported problems with a changing magnetic field in the lab, caused by the encroachment of sea water into the water table during high tide. We are not too far from the Pacific Ocean ourselves, so we added a heavy steel plate under the machine to improve the magnetic isolation. This did seem to help, but there was still a problem: the persistent slow drift was somehow related to the slightest mechanical movement of anything in the system. The solution was to drive an eleven-foot-long bar of copper into the floor of the building. We then ran a separate grounding cable to each individual piece of electrical equipment in the system. All of the drift abruptly ended. There had been a ground loop, or ground loops, between the various parts of the electrical system.
Bal-tec makes hundreds of roundness measurements of our precision balls and cylinders each and every day. We have five Talyronds in our system. The normal approach for providing physical contact between the pickup transducer and the part in a roundness measuring machine is to use a sapphire or tungsten carbide contact sphere.
It is simply impossible to provide a work environment that is lint-free enough to measure the roundness of a high-quality part without an occasional blip caused by the spherical measuring tip riding up and over a tiny piece of lint. This lint is only microinches thick. The modern approach says, “This really doesn’t matter”: we will simply provide a filter between the gage and the recorder that recognizes this anomaly and ignores it. But what if that blip is a particle of embedded diamond, or a protruding metallic carbide that was exposed by a polishing process? I know of one large company that was put out of business when protruding carbides, left by a new, more economical polishing process, ate the matching bearing liners in their product.
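The blip is much wider than the lint itself, because a spherical tip starts to climb well before its center reaches the fiber. A rough chordal-geometry sketch, with an assumed 1 mm tip radius and lint 10 microinches (about 0.25 µm) high (both numbers hypothetical):

```python
import math

def blip_width_um(tip_radius_um, lint_height_um):
    """Approximate width of the chart blip made by a spherical
    stylus tip riding up and over a thin lint fiber.

    Chordal geometry: the tip begins to climb at a horizontal
    offset of sqrt(2*r*h - h^2) before the fiber and descends
    symmetrically.  Illustrative only -- real lint has finite
    width and compliance.
    """
    r, h = tip_radius_um, lint_height_um
    return 2 * math.sqrt(2 * r * h - h * h)

# Assumed values: 1 mm tip radius, lint 10 microinches (0.25 um) high.
width = blip_width_um(1000.0, 0.25)   # roughly 45 um wide on the trace
```

A sub-microinch-scale fiber thus leaves a feature tens of micrometers wide, which is exactly the kind of anomaly a recognition filter is built to throw away, whatever its true cause.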
A deterministic process for measuring roundness is to remove the tiny pieces of lint. This is done by using a hatchet-shaped contact stylus with a large vertical radius of about 0.1 inches [2.5 mm] but a very small horizontal radius of less than 0.002 inches [0.05 mm]. A byproduct of this lint-removing approach is that the small radius of the hatchet stylus will more accurately reveal flaws and protrusions in the general surface.
I was recently viewing a roundness measuring machine in operation at a trade show exhibit. There were circular scratches on the master ball that were visible to the naked eye, and these scratches crossed the measurement path. Yet the output of the gage was reading less than 20 nanometers. They used a 1/8 inch [0.125", 3.2 mm] diameter probe tip and some unspecified electronic filtration.
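Simple tip geometry suggests how a visible scratch can vanish from the trace: a spherical tip bridges a narrow groove and penetrates only about w²/8r, where w is the groove width and r the tip radius. The scratch dimensions below are assumed for illustration, and treating the hatchet’s 0.05 mm horizontal edge radius as the radius crossing the scratch is also an assumption:

```python
def measured_scratch_depth_um(width_um, depth_um, tip_radius_um):
    """Depth a spherical tip of radius r actually reaches into a
    narrow groove of width w, limited by the tip bridging the
    groove walls: roughly w^2 / (8*r), capped at the true depth.
    Small-angle approximation; illustrative geometry only."""
    return min(depth_um, width_um ** 2 / (8 * tip_radius_um))

# Assumed scratch: 20 um wide, 2 um deep -- easily visible to the eye.
big_tip = measured_scratch_depth_um(20, 2, 1587.5)  # 1/8 inch dia tip
small_tip = measured_scratch_depth_um(20, 2, 50)    # 0.05 mm hatchet edge
# big_tip is about 0.03 um (30 nm); small_tip is 1.0 um -- the large
# tip reports the scratch some thirty times shallower.
```

On that geometry, the 1/8 inch tip alone attenuates a 2 µm deep scratch to roughly 30 nm before any electronic filtering has even been applied.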
When you look at roundness measurements, always, and I do mean always, know what mechanical filter was used, in the form of the measuring-tip geometry, and what electrical massaging of the data may have taken place.
One practitioner of the inspection art recently told me in no uncertain terms that flaws (occasional disruptions of the surface), waviness, and surface or micro finish were no part of the roundness of the subject part. I was told by him that the inspector’s or metrologist’s job today is to simply report the characteristic being inspected, according to specification, without prejudice, and, I may add, without comment. This kind of thinking may or may not be applicable when you are measuring commercial tolerances; but when it comes to nanometric measurements, every factor in the makeup of the part is a “prime factor.”
If a truly competent technician had reported the protruding metallic carbides in the previous case, a multimillion dollar bearing company and hundreds of jobs might have been saved.
I believe that the present legalistic approach to metrology is trying to turn inspectors into well behaved automatons without any Newtonian powers of observation and without the right to make any comments.
The point of this outline is that the random blip caused by a piece of lint should be removed from the picture, not compensated for.
If equipment designers and metrology practitioners would, or could, quit settling for compromises and start solving the real problems that are hiding behind the random errors, we could take the next quantum step toward true nanometric metrology.