
What is the definition of calibration record?

A document displayed with a measuring instrument that contains information about its calibration. Calibration records help maintain accuracy and traceability. Learn more about calibration records in the class "Calibration Fundamentals 210" below.

Calibration Fundamentals 210


Inspection Training
Class Information
Tooling U classes are offered at the beginner, intermediate, and advanced levels. The typical class consists of 12 to 25 lessons and requires at least two hours of instruction time.

Class Name: Calibration Fundamentals 210
Description: This class describes the calibration process and explains how measuring instruments are traced back to national and international standards. Includes an Interactive Lab.
Prerequisites: 350110, 350115
Difficulty: Intermediate
Number of Lessons: 20

Class Outline
Objectives
The Importance of Calibration
Measurement Standards
Hierarchy of Measurement Standards
Traceability
ISO 9000 Requirements
Working Standards
Gage Blocks
Measurement Uncertainty
Uncertainty vs. Error
Random and Systematic Errors
The Calibration Process
Calibrating a Micrometer
Calibration Records
The Calibration Report
Calibration Cycles
Factors Affecting Calibration
In-House vs. Outside Calibration
How Calibration Affects Cost
Summary

Class Objectives
Describe the main purpose of calibration.
Explain why calibration requires measurement standards.
Identify the hierarchy of measurement standards.
Define traceability.
Describe the ISO 9000 calibration requirement.
Describe the role of working standards.
Describe how gage blocks are used in calibration.
Define measurement uncertainty.
Distinguish between uncertainty and error.
Distinguish between random and systematic errors.
List the steps in the calibration process.
Describe how to calibrate a micrometer.
Identify the purpose of a calibration record.
Identify the contents of a calibration report.
Explain the importance of regular calibration.
Identify the key factors that affect calibration.
Describe factors that determine in-house or outside calibration.
Describe how calibration affects cost.

Class Vocabulary

accuracy
The difference between a measurement reading and the true value of that measurement.

anvil
The fixed, nonadjustable block on a micrometer. The face of the anvil is used as the reference from which the dimension is taken.

auditor
An individual outside of an organization who objectively evaluates the effectiveness of a company's quality system.

calibration
The comparison of a device with unknown accuracy to a device with a known, accurate standard to eliminate any variation in the device being checked.

calibration laboratory
A controlled test environment where higher-level calibration is performed. These calibration results should be traced back to NIST.

calibration record
A document displayed with a measuring instrument that contains information about its calibration. Calibration records help maintain accuracy and traceability.

calibration report
A document that contains information about a particular calibration procedure. Calibration reports maintain traceability.

correction factor
The amount of deviation in a measurement that is accounted for in the calibration process. The correction factor can be added to the measured value, or the measuring instrument can be adjusted.

dial indicator
A measuring instrument with a contact point attached to a spindle and gears that move a pointer on the dial. Dial indicators have graduations that are available for reading different measurement values.

drift
The actual change in the measurement value when the same characteristic is measured under the same conditions, with the same operator, at different points in time. Drift indicates how often a measurement needs recalibration.

error
The amount of deviation from a standard or specification. Errors should be eliminated in the measuring process.

gage block
A hardened steel block that is manufactured with highly accurate dimensions. Gage blocks are available in a set of standardized lengths.

hierarchy
A group of items classified lowest to highest according to ability. The hierarchy of measurement standards is classified according to quality.

international standard
A measurement standard recognized by international agreement and used as the basis for assigning values to other standards.

ISO 9000
A collection of documents that lists requirements for the creation and implementation of an effective quality system.

ISO 9001:2000
The section of the ISO 9000 standard containing the list of requirements. ISO 9001:2000 is the auditable section of the standard.

light wave
The pulsation in space that transmits light energy. Light wave values are used to determine primary standards.

machine tool
A power-driven piece of metalworking equipment for cutting or forming metal.

master thread gage
A gage that is used to calibrate thread ring gages. The master thread gage inspects the internal threads of the ring gages.

measurement standard
A recognized true value. Calibration must compare measurement values to a known standard.

metrologist
A scientist of measurement. Metrologists test the highest-quality standards.

micrometer
A hand-held measuring device used to inspect the dimensions of parts. The typical micrometer is accurate within 0.001 in. or 0.02 mm.

National Institute of Standards and Technology (NIST)
An organization required by law to maintain national standards. NIST was formerly known as the National Bureau of Standards.

precision
The ability of a process to repeat the same accurate measurement over time.

primary standard
A measurement standard with the highest quality. A light wave is used to determine a primary standard.

quality system
The objectives and processes of a company designed to focus on quality and customer satisfaction.

random error
An error that results from unpredictable variations in one or more influence quantities. Random errors are inconsistent and easily recognizable.

secondary standard
A measurement standard that is used in comparison with a primary standard. Secondary standards are also known as transfer standards.

spindle
A rotating component on a micrometer that advances toward the anvil to make contact with the part.

systematic error
An error that is not determined by chance but is introduced by an inaccuracy in the system. Systematic errors are predictable and expected.

time meter
A device used with a measuring instrument that records the number of hours an instrument operates. Time meters help determine calibration cycles.

tolerance
An unwanted but acceptable variation from a specified dimension.

traceability
The ability to verify each higher step of calibration until the NIST international standards are reached.

transfer standard
A measurement standard that is used in comparison with a primary standard. Transfer standards are also known as secondary standards.

true value
A measurement value with no errors. The true value can never be known with total certainty.

uncertainty
The measurement range in which the true value of a measurement is expected to lie. Uncertainty is an estimation of error.

variation
A difference between two or more similar things.

working gage block
A type of gage block that is used to calibrate measuring instruments. These are generally either Grade 2 or 3 gage blocks.

working standard
A measurement standard used to calibrate or check measuring instruments. Gage blocks are common working standards.

wringing
Bringing two surfaces of microinch flatness together so that they adhere, leaving only microinch separation. Gage blocks are wrung together in various combinations to form any length.
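Several of the vocabulary terms (true value, systematic error, random error) can be illustrated numerically. The sketch below uses made-up micrometer readings of a 25.000 mm gage block, not values from the class: it treats the consistent offset of the average reading from the true value as the systematic error, and the scatter of repeated readings as an estimate of the random error.

```python
import statistics

# Hypothetical example: ten repeated measurements of a 25.000 mm gage
# block with the same micrometer, same operator, same conditions.
true_value = 25.000  # the standard's accepted value, in mm
readings = [25.004, 25.003, 25.005, 25.004, 25.006,
            25.003, 25.004, 25.005, 25.004, 25.004]

# Systematic error: the consistent offset of the average reading from
# the true value. It is predictable and can be corrected.
systematic_error = statistics.mean(readings) - true_value

# Random error: the scatter of individual readings around their own
# mean, estimated here by the sample standard deviation.
random_scatter = statistics.stdev(readings)

print(f"systematic error: {systematic_error:+.4f} mm")
print(f"random scatter (1 sigma): {random_scatter:.4f} mm")
```

In this made-up data set the micrometer reads about 0.004 mm high on average (systematic, correctable) while individual readings scatter by roughly 0.001 mm (random, only reducible by averaging).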

Calibration is the validation of specific measurement techniques and equipment. At the simplest level, calibration is a comparison between two measurements: one of known magnitude or correctness, made or set with one device, and another made in as similar a way as possible with a second device. In other words, calibration establishes the relationship between a measuring device and the units of measure by comparing the device, or the output of an instrument, to a standard having known measurement characteristics.
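That comparison step can be sketched in a few lines of code. The example below uses hypothetical values and is not a prescribed calibration procedure: it compares an instrument's readings against standards of known length, computes the error at each point, and derives a simple average correction factor that can be applied to later readings.

```python
import statistics

# Hypothetical calibration check: compare an instrument's readings
# against gage blocks of known (standard) length, in millimeters.
standard_lengths = [5.000, 10.000, 25.000]     # known standard values
instrument_readings = [5.003, 10.004, 25.002]  # what the device reports

# Error at each point: deviation of the reading from the standard.
errors = [reading - std
          for reading, std in zip(instrument_readings, standard_lengths)]

# A simple correction factor: the average deviation, subtracted from
# future readings (or the instrument is adjusted by this amount).
correction_factor = statistics.mean(errors)

# Apply the correction to a new raw reading.
raw_reading = 15.004
corrected = raw_reading - correction_factor

print(f"correction factor: {correction_factor:+.4f} mm")
print(f"corrected reading: {corrected:.4f} mm")
```

A real procedure would check the error at each point against the instrument's tolerance and record the results in a calibration report; this sketch only shows the comparison-and-correct idea.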

ISO 14001:2004 specifies requirements for an environmental management system, enabling an organization to develop and implement a policy and objectives that take into account legal and other requirements to which the organization subscribes, as well as information about significant environmental aspects. It applies to those environmental aspects that the organization identifies as ones it can control or influence. It does not itself state specific environmental performance criteria.
