Feature

●Dual K-type inputs; includes two CK21M probes, GK13M, A304, and A908 carrying case (for differential temperature)
●343C3: Model 343 Dual-Input K-Type Thermocouple Thermometer with Field Calibration
●Auto field calibration in less than 10 seconds in ice water achieves ±1°F system (tester plus probe) accuracy over the 30°F to 120°F range
●Temp1, Temp2, and Temp1−Temp2 display modes
●Min / Max Record


Description

TPI 343C3: Model 343 Dual-Input K-Type Thermocouple Thermometer with Field Calibration, supplied with two CK21M probes. The 343 offers auto resolution (0.1°/1°) and is °C/°F selectable. TPI contact temperature testers are accurate, reliable, and built to last.

Auto Field Calibration
The 343 can be field calibrated to a system accuracy (tester plus probe) of ±1°F within the 30°F to 120°F range by performing an ice bath calibration, completed in less than 10 seconds. Calibration is an easy two-step process performed from the instrument keypad; no additional tools are needed.

Preparing the Ice Bath
The temperature of an ice bath is approximately 32°F (0°C). Fill a plastic or metal container with crushed ice and add clean water to a depth of at least 4 inches. Stir the ice and water for 2 to 3 minutes before performing calibration to ensure the water is completely chilled. Make certain there is plenty of ice in the mixture and always use clean water; distilled water works well.

Step-by-Step Procedure
1. Connect both temperature probes to the 343.
2. Press and hold the MIN/MAX and HOLD buttons and turn the 343 on. CALF will be displayed, and the 343 will enter calibration mode and begin cycling between T1 and T2.
3. Insert the probes into the ice bath and allow the readings to stabilize.
4. Once the readings on both inputs have stabilized, press the HOLD button. SAVE will be displayed and calibration is complete.
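Conceptually, the ice-bath procedure measures a known 32°F reference and stores a correction offset so the tester-plus-probe system reads true. A minimal sketch of that idea in Python (hypothetical; the function names, stability window, and tolerance are illustrative assumptions, not the 343's actual firmware):

```python
# Hypothetical ice-bath offset calibration, modeled after the procedure
# described above. All thresholds and names are assumptions for illustration.

ICE_BATH_F = 32.0  # reference temperature of a properly prepared ice bath

def is_stable(readings, window=5, tolerance=0.2):
    """Treat a channel as stable when its last `window` readings
    stay within `tolerance` degrees F of each other."""
    recent = readings[-window:]
    return len(recent) == window and max(recent) - min(recent) <= tolerance

def calibration_offset(readings, window=5):
    """Offset that maps the stabilized ice-bath reading back to 32.0 F."""
    if not is_stable(readings, window):
        raise ValueError("readings have not stabilized yet")
    recent = readings[-window:]
    return ICE_BATH_F - sum(recent) / len(recent)

def corrected(raw_f, offset):
    """Apply the stored calibration offset to a raw probe reading."""
    return raw_f + offset

# Example: a probe that settles about 0.8 F high in ice water.
raw = [33.1, 32.9, 32.8, 32.8, 32.8, 32.8, 32.8]
offset = calibration_offset(raw)           # -0.8
reading = corrected(32.8, offset)          # 32.0
```

The instrument would repeat this per channel (T1 and T2), which is why the 343 cycles between both inputs during calibration before HOLD saves the result.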