There is a common misunderstanding about automated systems on harvest equipment, especially grain loss monitors (GLMs): the assumption that they work accurately straight from the factory, with no calibration needed.
Early GLMs were based on small impact sensors mounted at the rear of the combine’s sieves. Grain coming off the sieves struck the sensor(s), and each tap was translated into an electrical voltage that fed a gauge in the cab. More taps produced more voltage, and the needle on the gauge moved farther.
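To make that signal path concrete, here is a minimal sketch in Python of the tap-to-voltage-to-needle chain. The constants and the sensitivity knob are illustrative assumptions for this sketch only, not specifications from any real monitor.

    # Minimal sketch of the early GLM signal path: grain taps -> voltage -> needle.
    # VOLTS_PER_TAP and GAUGE_MAX_VOLTS are illustrative assumptions, not specs.
    VOLTS_PER_TAP = 0.002    # assumed voltage contribution of one grain impact
    GAUGE_MAX_VOLTS = 1.0    # assumed voltage at full-scale needle deflection

    def needle_position(taps_per_second: float, sensitivity: float = 1.0) -> float:
        """Needle deflection as a fraction of full scale (0.0 to 1.0).

        `sensitivity` stands in for the operator's calibration knob: it
        rescales how much voltage a given tap rate produces.
        """
        volts = taps_per_second * VOLTS_PER_TAP * sensitivity
        return min(volts / GAUGE_MAX_VOLTS, 1.0)  # needle pegs at full scale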
The theory was that the operator would adjust the concave, threshing speed, cleaning fan, and sieve settings for minimum grain loss, then set the gauge so the needle sat in the middle of its dial at an average speed in an average-yielding part of the field.
If grain loss out the back of the machine increased, there were more “taps,” the voltage rose, and the needle moved to the right. The operator then knew to slow down or adjust the machine to get back to that minimal level of grain loss.
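Continuing the same illustrative sketch, the calibration step amounts to choosing the sensitivity that parks the needle at mid-scale under those baseline conditions, so any rise in loss reads as movement to the right. The 40-taps-per-second baseline below is an assumed figure for the example.

    def calibrate_sensitivity(baseline_taps_per_second: float) -> float:
        """Pick the sensitivity that centers the needle (0.5 of full scale)
        for the tap rate measured at average speed in an average-yielding
        part of the field, after the machine is tuned for minimal loss."""
        baseline_volts = baseline_taps_per_second * VOLTS_PER_TAP
        return (0.5 * GAUGE_MAX_VOLTS) / baseline_volts

    # Example with an assumed baseline of 40 taps per second:
    s = calibrate_sensitivity(40.0)
    print(needle_position(40.0, s))  # 0.5 -- needle centered at baseline
    print(needle_position(80.0, s))  # 1.0 -- doubled loss drives it hard right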
Some combine operators didn’t understand the need to calibrate—or recalibrate—their GLMs. They assumed the monitor was set at the factory and, after covering the ground behind their combine with thrown-over grain, disgustedly turned off or ignored the little gauge in their cab. Or they didn’t adjust the sensitivity when they switched between grain varieties with different test weights, so the “danged thing isn’t accurate anymore.” The GLM was working exactly as it was supposed to; its operator just hadn’t calibrated it to prevailing conditions.
Fast forward to today. It’s safe to say that any system on a modern combine that has the term “Auto” in its name requires some degree of calibration or human adjustment. Without that calibration, without human inputs, it’s just a bunch of sensors, gauges and wires.
On modern automated agricultural equipment, when in doubt—recalibrate.