
AEROSPACE TELEMETRY

CH.UMAKALYANI Department of Electrical and Communication Engineering Ramachandra College of Engineering, Eluru.

Abstract:
Aerospace telemetry is the science of transmitting information from air and space vehicles to an accessible location. Aerospace telemetry and the reception of flight test data are key components of flight test in ascertaining positional data for further analysis. Data gathered by a telemetry system is critical to the success of every aeronautical research project, and reliable equipment is needed to keep the telemetry system, from the aircraft to the ground, transmitting and receiving every moment data is needed. This paper introduces and defines telemetry and aerospace telemetry, provides a brief history of how telemetry came about, briefly surveys some telemetry applications, introduces the equipment required for telemetry to occur, such as signal conditioning and sub-carrier oscillator equipment, outlines some telemetry concerns, such as noise and errors, and presents ways to minimize these errors using transducer, data system, and physical end-to-end calibrations.

1. INTRODUCTION
This paper will provide an understanding of the many aspects of aerospace telemetry. The next few sections define telemetry and aerospace telemetry, provide a brief history of telemetry, and further emphasize key telemetry aspects using a few telemetry application examples.

1.1 Aerospace Telemetry: Telemetry is a technology that allows the remote measurement and reporting of information of interest to the system designer or operator. Information, in this case, is data that is organized as a result of processing or manipulation, which adds to the knowledge of the receiver. The Greek root tele means remote and metron means to measure. Other sources define telemetry as the science of the use of the telemeter, and further define a telemeter as an instrument for determining the distance of an object, such as a distant target in gunnery. One early example of this science existed in Italy in the 19th century. An Italian astronomer named Porro used a quick measurement technique called tachymetry to determine distances. Porro used an optical method with fixed stadia hairs in a focal plane and a variable-length graduated horizontal base at a remote location where the distance was observed. Telemetry can also be considered the science of transmission of inaccessible data to accessible locations. Aerospace telemetry narrows this to the science of transmission of information from air and space vehicles to accessible locations. This definition extended the conventional notion of telemetry. For example, although receiving stations are generally located on Earth, they may also be located within air and space vehicles remote from the vehicles containing the transmitting stations.

1.2 History of Aerospace Telemetry: One of the earliest documented uses of telemetry (TM) occurred in the United States (U.S.) in 1885. In that year, a U.S. patent was granted for a telemetry system, a result of the invention of the telegraph and later the telephone. These early TM systems were used by electric power companies to monitor the distribution and use of electricity, and were called supervisory systems because of their monitoring abilities. In 1912, one of the twentieth century's first TM systems was installed in Chicago, IL, and transmitted data on the city's several electric power generating plants to a central control station. This TM system transmitted the data using the city's existing telephone line network [10]. Following World War I (WWI), TM data was transmitted using the electric power lines themselves. The first TM systems were almost exclusively voltage- or current-type systems and didn't involve radio frequency (RF) usage. The current-type system was a balanced system in which a current in the circuit connecting the point of measurement to the point of display applied force to an armature. The voltage-type system also worked on a balanced method; however, the balancing occurred at the receiving end. A voltage, generated as a function of the quantity to be telemetered, was produced at the transmitting end of the circuit and connected to a line. At the receiving end, the line current was reduced to zero by a self-balancing instrument that generated a voltage to buck out the current in the line. If there was no leakage in the line, the bucking voltage was equal to the transmitted voltage, within the sensitivity of the balancing instrument. Pulse-type wired telemetry was also developed during the 1920s. In pulse-type TM, the variable was transmitted as a function of electrical pulse timing rather than voltage or current. This method

allowed the information to be transmitted without serious loss of fidelity through circuits that were less than perfect, as compared to voltage- or current-type systems. Timing of the pulse-type wired TM systems was often mechanically generated using a ratchet-like device. Although electrical telemetry systems are still in use, most modern telemetry systems use radio transmissions to span substantial distances. During WWI, wired telemetry systems were just starting to become more prolific in the U.S. due to the expansion of public utilities. Telemetry equipment was used on a limited basis by aircraft manufacturers before World War II (WWII). However, the postwar years and the advent of the missile age resulted in a phenomenal increase in TM usage, especially in the area of flight testing of aircraft and munitions.

2. TELEMETRY APPLICATIONS
Telemetry applications are highly visible in the areas of agriculture, retail business, medicine, defense, space and resource exploration systems, wildlife study and management, and water management. 2.1 Agriculture and Telemetry Usage: Growing crops today is a very high-tech, competitive business. The timely availability of weather and soil data plays a crucial role in most activities related to healthy crops, and a network of field data stations plays a major role in disease prevention and precision irrigation. Data stations transmit data back in real time to a base station and report the major parameters needed to make good crop-growing decisions, such as air temperature, relative humidity, precipitation and leaf wetness data for disease prediction models, solar radiation, wind speed, and soil moisture. Instant data allows the

agriculturalist to understand the progress of water into the soil and towards the roots. 2.2 Telemetry Usage in Retail Business: Retailers make use of radio frequency identification (RFID) tags to track inventory and prevent shoplifting. Most of these RFID tags passively respond to RFID readers and send data to the cashier. However, active RFID tags are available that periodically transmit telemetry to a particular base station. In 2005, some retail businesses used telemetry equipment to communicate vending machine sales and inventory data to a route truck and headquarters. This telemetry data eliminated the need for vending machine service drivers to make excessive trips to see what items needed to be restocked before bringing the inventory inside. 2.3 Telemetry Usage in Medicine: Medical applications for telemetry include the use of biotelemetry in coronary care units for patients at risk due to abnormal heart activity. These patients are outfitted with measuring, recording, and transmitting devices, and a diagnostic data log of the patient's condition is generated from the transmitted data. Biotelemetric devices are also used by nurses to monitor a patient's acute or dangerous condition. 2.4 Telemetry Usage in Defense: Telemetry data systems enable defense, space, and resource exploration systems in the U.S. TM is the enabling technology for large complex systems such as missiles, chemical plants, oil rigs, and spacecraft because it allows the automatic monitoring, alerting, and record-keeping necessary for safe, efficient operations. Space agencies such as the National Aeronautics and Space Administration (NASA) use telemetry to collect data from

operating spacecraft and satellites. Real-time satellite telemetry data provides continuous monitoring for military activities. Telemetry systems provide vital data in the development phase of missiles, satellites, and aircraft because the system might be destroyed during or after a missile test. Engineers need critical system parameters in order to analyze and improve the performance and safety of the system. Without telemetry, these data parameters would often be unavailable to the engineer. 2.5 Wildlife Management and Telemetry: Wildlife monitoring utilizes animal telemetry, monitoring species at the individual level. Animals under study may be fitted with instruments ranging from simple tags to cameras to global positioning system (GPS) packages, with transceivers to provide position and other basic information to scientists and stewards. 2.6 Telemetry and Water Management: Telemetry remains indispensable in water management applications, such as hydrometry, the monitoring of rainfall, groundwater characteristics, and water quality. Water management applications use telemetry for automatic meter reading, groundwater monitoring, and equipment surveillance for leak detection in distributed pipeline networks. The ability to have real-time data allows quick reaction responses in the field.

3. REQUIRED TELEMETRY EQUIPMENT


The minimum equipment required for a radio telemetry system consists of signal conditioning equipment, sub-carrier oscillator equipment, an RF link, and a ground station [1], as shown in Figure 1. The left portion of the block diagram depicts the components required to transmit over the link, while the right side depicts the components of the ground station.

Figure 1: Telemetry Link Block Diagram

Figure 2 depicts block diagram components for a basic signal source, signal conditioning circuitry, signal conversion and multiplexing, and transmission circuitry.

Figure 2: Telemetry Transmitting System Block Diagram

Figure 3 depicts block diagram components for a basic preamplifier, telemetry receiver, data synchronizer, analog magnetic recorder, and data display.

Figure 3: Telemetry Receiving Block Diagram

3.1 Signal Conditioning Equipment: A signal conditioner converts the output of the transducer into a form more suitable for modulating the sub-carrier oscillator, regardless of the transducer output signal characteristics. Commonly used signal conditioning equipment includes DC amplifiers, AC amplifiers, flow meter converters, and phase-sensitive demodulators.

3.2 Sub-Carrier Oscillator Equipment: A sub-carrier oscillator, with a transducer designed into the oscillator circuit itself, is one way to eliminate the use of signal conditioners. In a resistance-controlled oscillator used with a strain gage bridge circuit, a change in resistance of one or more arms of the bridge causes the oscillator frequency to shift accordingly. Based upon test and data-gathering requirements, the TM engineer determines the necessary TM measuring system, utilizing VCOs and resistance-controlled oscillators where needed. The proper RF link to communicate with the test article and data collection stations determines component selection. The reactance-controlled oscillator uses variable-reactance type transducers. Transducers of this type are commonly installed in flight test operations requiring pressure, altitude, and aircraft position to be measured. Any change in inductance within the transducer causes a proportional change in the oscillator frequency. If system calibrations become a problem, strain gages can still be excited with a DC voltage and the outputs used to modulate voltage-controlled oscillators (VCOs). VCOs are available in numerous types of modulating equipment and come in many customizable configurations for both vacuum tube and transistor variants.
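The voltage-controlled sub-carrier oscillator behavior described in Section 3.2 can be sketched numerically: a minimal VCO model whose instantaneous frequency deviates from a center frequency in proportion to the input voltage. The 10.5 kHz center frequency, deviation sensitivity, and function names here are illustrative assumptions, not values from the source.

```python
import math

def vco_frequency(v_in, f_center_hz, sensitivity_hz_per_v):
    """Instantaneous sub-carrier frequency of an idealized VCO."""
    return f_center_hz + sensitivity_hz_per_v * v_in

def vco_samples(voltages, f_center_hz, sensitivity_hz_per_v, fs_hz):
    """FM sub-carrier generated by phase accumulation, one output sample per input voltage."""
    phase, out = 0.0, []
    for v in voltages:
        phase += 2 * math.pi * vco_frequency(v, f_center_hz, sensitivity_hz_per_v) / fs_hz
        out.append(math.sin(phase))
    return out

# Illustrative example: a 10.5 kHz sub-carrier channel driven by a slow 50 Hz sine wave
fs = 200_000
signal = [0.5 * math.sin(2 * math.pi * 50 * n / fs) for n in range(2000)]
subcarrier = vco_samples(signal, f_center_hz=10_500, sensitivity_hz_per_v=787.5, fs_hz=fs)
```

At the ground station, a sub-carrier discriminator performs the inverse mapping, recovering the transducer voltage from the measured frequency deviation.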

3.3 Radio Frequency (RF) Link: When a particular type of sub-carrier oscillator is used in an FM/FM type telemetry system, the outputs from the oscillators must be combined into a composite signal, called the sub-carrier multiplex, which then modulates the transmitter carrier signal. The modulated carrier signal is transmitted to a receiving radio station. The receiver output, which is the recovered sub-carrier multiplex, routes the information to a set of sub-carrier discriminators and to a magnetic tape recorder in the telemetry station itself. The signals of interest are then separated by the discriminators and reconverted to the form in which they appeared at the output of the transducers in the airborne system. In addition, the multiplexed signals are usually recorded on one track of the magnetic tape recorder so that flight test can perform subsequent data reduction. If an airborne TM system is in relatively close proximity to a particular receiving station, a cable installation provides the multiplexed output of the sub-carrier oscillators directly to the inputs of the discriminators. This particular type of telemetry system is extensively used for industrial applications, such as power distribution systems and pipeline systems. However, if the distance between the transmitting station and the receiving station is too great for direct cable linkage, these industrial telemetry systems often use power line carrier, wire line carrier, or microwave systems. Power line and wire line carriers cannot be used for aerospace telemetry because of distance limitations; however, microwave systems may be used. The RF link in an aerospace telemetry system also requires transmitters, RF amplifiers, specific antennas (transmitting and receiving), preamplifiers, multi-couplers, frequency mixers, intermediate amplifiers,

demodulators, a free space path to radiate, and receivers to make the RF link complete. After the RF link has been established, the two main TM concerns for engineers are noise and errors. 3.4 Transmitters: One of the most important components in the RF link for aerospace telemetry is the transmitter. Most transmitters are either frequency modulated (FM) or phase modulated (PM). As mentioned earlier, the composite signals from the output of the sub-carrier oscillators and from the output of the mixer amplifier are used to modulate the transmitter itself. For an FM transmitter, the modulation is accomplished by shifting the carrier frequency, which is generated within the transmitter. This FM-type transmitter is called a crystal-stabilized transmitter because a crystal is used to determine the necessary frequency and stabilize it. PM transmitter modulation is accomplished by performing a phase shift of the transmitter frequency, and is often called crystal-controlled modulation. 3.5 Antennas: An antenna is a transducer designed to transmit or receive telemetry radio waves, which are a class of electromagnetic waves. Antennas convert radio frequency electrical currents into electromagnetic waves and vice versa. Antennas are used in systems such as radio, television broadcasting, point-to-point radio communication, wireless local area networks (LAN), radar, and space exploration. Antennas transmit and receive RF at specific frequencies via numerous mediums, including air, space, water, soil, or rock, in some cases only over short distances. These antennas, or an antenna system, are required to provide adequate signal strength at the receiving station throughout the flight path of the test

vehicle. For instance, the signal is ready to be transmitted from a test vehicle to a receiving station only after the modulated carrier achieves sufficient amplification. 3.6 Receiver: A receiver is an electronic device or circuit that receives telemetry signals from an antenna and converts these signals into meaningful data, such as navigational position information [8]. Each output of the multi-coupler routes to an FM-type receiver. If only one receiver is required, a multi-coupler is not necessary. The FM receiver is usually a continuously tunable unit that can be tuned to any frequency within the telemetry band being utilized. Another type of FM receiver is the crystal-controlled FM receiver. This type of FM receiver requires the selection of an appropriate crystal to tune the incoming telemetry signal. The telemetry signal is amplified and the sub-carrier composite signal is separated from the detected carrier signal, all within the receiver itself, through the use of frequency-discriminating network and filter circuitry. The sub-carrier composite signal is a replica of the signal used to modulate the transmitter in the telemetry package. 3.7 Preamplifiers: A preamplifier is an electronic amplifier which precedes another amplifier to prepare an electronic signal for further amplification or processing [6]. Preamplifiers are required when acquiring telemetry data because of the large attenuation the telemetry signal suffers in free space. The receiving antenna output signal must have high enough power to overcome any degradation due to signal attenuation or signal loss between the antenna and the receiver itself. The preamplifier maintains the signal strength and signal level compatibility required by the receiver input. Long

cables running between the receiver and the antenna cause large losses in receiving systems. A preamplifier installed at or near the receiving antenna is used to amplify the antenna output signal and overcome excessive cable losses. If the intent is to overcome only cable losses, the preamplifier is installed in very close proximity to the antenna, so it can amplify the antenna output signal before any signal loss is generated in the cable itself. The main advantage of this particular preamplifier configuration is that it provides a higher signal-to-noise ratio. Preamplifiers configured for this purpose are usually mounted in some type of weatherproof enclosure, are controlled from within the receiving station itself, and have self-regulating power supplies. 3.8 Demodulators: A demodulator is an electronic circuit used to recover the information content from the carrier wave of a signal. The carrier wave, or carrier, is a waveform, usually sinusoidal, that is modulated (modified) with the input signal for the purpose of conveying information to be transmitted, such as voice or data. The carrier wave functions at a much higher frequency than the baseband (information-containing) modulating signal. A modem device uses both a modulator and a demodulator. An amplitude modulated (AM) signal encodes the information onto the carrier wave by varying its amplitude in direct sympathy with the signal to be sent. At least two methods are used to demodulate AM signals: (1) an envelope detector or (2) a product detector. The envelope detector, a very simple method of demodulation, consists of anything that will pass current in one direction only, rectifying the signal. The envelope detector may be in the form of a single diode, or may be more complex.

Many natural substances exhibit this rectification behavior, which helps explain why it was the earliest modulation and demodulation technique used in radio. The crystal set exploits the simplicity of AM modulation to produce a receiver using very few parts. The product detector multiplies the incoming signal by the signal of a local oscillator with the same frequency and phase as the carrier of the incoming signal. After filtering, the original audio signal results. This method will decode both AM and single-sideband (SSB) signals, although if the phase cannot be determined, a more complex setup is required. 3.9 Telemetry Ground Station: The telemetry ground station contains the RF equipment necessary to receive the telemetry data (receivers, receiving antenna, preamplifiers, etc.), as well as other processing and recording equipment to monitor telemetry data. If the ground station is used for aircraft telemetry programs, such as monitoring F-16D aircraft operational flight programs (OFP), it must contain sufficient equipment to process and display those data signals required for real-time monitoring. When telemetry ground stations are used for missile or munitions test programs, the stations are usually minimally configured with rack-mounted test equipment at a remote location near the intended target. The test station usually monitors and simulates weapon fly-out data. This station serves only to receive the composite telemetry signal from the receiver and store TM data on magnetic tape for further processing at another location. There are many possible telemetry ground station configurations, and the particular configuration depends on how many data sources are monitored, test requirements, safety, and available funding. These telemetry ground

stations are usually configured in prefabricated trailers and are built using rack-mounted equipment for easy equipment access. Eglin AFB, FL, uses this particular type of telemetry ground station to support the Advanced Medium-Range Air-to-Air Missile (AMRAAM) program. Figure 4, Notional Telemetry Ground Station, depicts a notional telemetry ground station setup.
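The product-detector demodulation described in Section 3.8 can be illustrated with a short numerical sketch: multiply the AM signal by a synchronized local oscillator, then low-pass filter to recover the baseband. The carrier and message frequencies, and the simple moving-average filter, are illustrative assumptions rather than parameters from the source.

```python
import math

fs = 100_000                    # sample rate, Hz (illustrative)
fc, fm = 10_000, 500            # carrier and message frequencies, Hz (illustrative)
n = range(2000)
message = [0.5 * math.cos(2 * math.pi * fm * i / fs) for i in n]
am = [(1 + m) * math.cos(2 * math.pi * fc * i / fs) for i, m in zip(n, message)]

def product_detect(samples, fc_hz, fs_hz):
    """Multiply by a phase-synchronized local oscillator, then low-pass filter.

    The mixer output contains the baseband term plus a component at twice the
    carrier frequency; averaging over one full carrier cycle removes the latter.
    """
    mixed = [s * 2 * math.cos(2 * math.pi * fc_hz * i / fs_hz)
             for i, s in enumerate(samples)]
    w = int(fs_hz / fc_hz)      # window spanning one carrier cycle
    return [sum(mixed[i:i + w]) / w for i in range(len(mixed) - w)]

recovered = product_detect(am, fc, fs)   # approximately 1 + message
```

The recovered waveform tracks 1 + message; subtracting the DC offset yields the original modulating signal, which is the sense in which the detector "reconverts" the signal to transducer form.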

4. TM CONCERNS: NOISE AND ERRORS


One of the main concerns when using telemetry data is the possibility of introducing excessive noise and errors into the TM data stream itself. Errors are inherently present to a greater or lesser degree, as with any measurement process. These errors may be classified as either systematic or random. Systematic errors are those that can be eliminated by some calibration procedure or other form of compensation. Random errors result from the superposition of unrelated events, such as the interference between irregularities in bearing surfaces, the thermal motion of electrons in conductors, or the shot effect of electrons in vacuum tubes, to name a few. Systematic errors may be calibrated out by suitable methods, such as an RF calibration test. The results of a measurement may then be corrected in accordance with the calibration. In many cases there are zero drifts due to factors such as temperature changes. If these drifts are slow enough, they may be corrected for by performing frequent automatic or manual calibrations. In radio telemetry applications involving aircraft and missiles, it is necessary to incorporate an automatic calibration procedure into the telemetry process. Telemetry calibration is discussed in more detail in the next section.

Figure 4: Notional Telemetry Ground Station

However, in many cases it is not practical to vary the physical variable at the input of a particular channel by a known amount during flight test. In this case, a calibration variable is inserted at the first subsequent feasible point in the system. A good example of this technique is the use of a strain-gage bridge to measure bending in a structural member. It is impractical to load the member by a known amount during flight, so the calibration is frequently achieved by disconnecting the output of the strain-gage bridge using a relay, and then inserting known fractions of the bridge-driving voltage into the system at this point. The resulting calibration is used to correct zero drifts and changes in sensitivity of the subsequent parts of the system, but doesn't give information on changes in the bridge circuit. If the calibration bridge circuit is designed properly and the precise location of the bridge elements is known, drifts in zero and sensitivity of the bridge become less likely. In this case, drifts in zero and sensitivity of the remaining parts of the system become more detectable and easier to correct. Random errors, or noise, can also be reduced by good design, by averaging processes, or by using wide-band modulation methods, but can't be eliminated the way that systematic errors can. This is because all physical systems have

performance limited by random effects, such as the grain of photographic emulsions, dry friction, or the granular nature of electric currents. These errors are divided into two broad groups: environmental errors and inherent equipment errors. Environmental errors arise when a particular piece of equipment is subjected to acceleration, temperature changes, and other environmental factors. Inherent equipment errors are caused by such factors as noise in the RF radio link, crosstalk in multi-channel systems, drifts in zeros and gains, and friction in pickup instruments, to name a few. Environmental errors are as difficult to cope with as inherent equipment errors. Many environmental errors are also systematic. For example, certain temperature effects on a piece of equipment can be compensated for or calibrated out. Other temperature effects may be considered random, such as the relative expansion of components in sliding contact with each other, which results in dry friction.

5. TELEMETRY CALIBRATIONS

To calibrate means to check, adjust, or systematically standardize the graduations of a quantitative measuring instrument. A telemetry system measurement begins with the sensing of the item to be measured using a transducer located either on a test vehicle or at a remote test site. The calibration ends at a data storage or display device located at a receiving test site. Telemetry systems can be interconnected by radio links, direct wiring, electro-optical methods, or other combinations. Engineers test and calibrate individual components in a suitable laboratory before installing the system to ensure that data is of the highest possible quality. Additionally, the engineer subjects the telemetry system to carefully conducted end-to-end calibration checks just

before, during, and immediately after the actual test. This section introduces instrumentation engineering guidance on general calibration methodologies, techniques, and cases. For instance, the instrumentation engineer uses many types of transducers to measure many physical quantities: acceleration, velocity, displacement, pressure, acoustics, flow, strain, humidity, corrosion, images, video, temperature, heat flux, position, torque, shaft power, angular rate, time, frequency, and phase angle, to name a few. Each of these measurements may require a different calibration technique. 5.1 Calibration Types: For illustrative purposes, calibration is defined as one of three different types: transducer calibration, data system calibration, or physical end-to-end calibration. 5.2 Transducer Calibration: Transducer calibration focuses on the transducer input-output relationship. The transducer manufacturer usually performs a unit calibration in its laboratory. The instrumentation engineer should become very familiar with the piece of equipment and the techniques used by the manufacturer to prevent unnecessary calibration errors. Once the manufacturer's equipment and calibration techniques are known, the engineer should perform an in-house calibration on the individual transducer to verify the accuracy of the manufacturer's transfer function. If performance deviations are present, the engineer may end up defining a new transfer function for that unit or, in some cases, resetting the device to conform to the original transfer function. In either case, successive calibrations may indicate upcoming failures. Many engineers stop after performing a transducer calibration and then combine the

transducer's transfer function mathematically with the data system signal conditioners' transfer functions. This method provides a calibration estimate under the assumption that the engineer precisely knows all the transfer characteristics of the wiring and other signal conditioning between the transducer and the data storage system. The engineer assumes that all wiring and signal conditioning will function as designed, but one bad connection invalidates the data. It should be noted that relying solely on transducer calibration is too risky for the collection of valid data on an experiment or test. 5.3 Data System Calibration: Data system calibration simulates or models the input of the entire measurement system. The manufacturer usually performs a unit calibration in its laboratory. As stated earlier, an instrumentation engineer should become very familiar with the piece of equipment and the techniques used by the manufacturer to prevent unnecessary calibration errors. The most important consideration for making valid engineering measurements is to determine how the transducer operates in the actual test environment with all signal conditioning attached. The transducer should be calibrated in the laboratory while connected to the same signal conditioning equipment that will be used on the actual test article, although this configuration is not always feasible. A minimum data system calibration should then be performed after mounting the transducer on the test article. This can be accomplished by simulating an excitation of the transducer, as is often done for strain gages, by using a shunt calibration resistor to simulate a change in resistance of the strain gage. Inserting a simulated transducer signal into the system verifies all signal conditioning transfer function predictions and simulates

transducer excitation by its physical input. Installation constraints, such as inaccessible or glued transducers, often mean that a data system calibration is the best an instrumentation engineer can achieve to ensure the acquisition of valid data. Data system calibration simulates the desired item to be measured, rather than physically stimulating the transducer's sensing device. 5.4 Physical End-to-End Calibration: Physical end-to-end calibration, also called mechanical end-to-end calibration, focuses on the relationship between the physical input and measured output throughout the entire measurement system, and is the best method to ensure the collection of valid data. An end-to-end mechanical calibration means a full calibration of the instrumentation from the actual physical input, to the transducer, to the output, where the analog or digital signal will normally be analyzed. An end-to-end calibration verifies the measurement system characteristics and is performed by engineers after installing the measurement system in the test article. A calibration source stimulates the transducer, and the instrumentation engineer monitors the signal entering the data collection unit to ensure the calculated value matches the actual system transfer function. It is highly recommended that end-to-end calibrations be performed before the test, after the test is completed, and before the instrumentation system is removed. The end-to-end calibration checks the measurement system, including wiring and connectors installed on the test article, and allows the engineer to identify and correct potential problems, such as phase and wiring errors, early in the test.
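The shunt-calibration technique described in Section 5.3 has a standard closed-form check: shunting one arm of a Wheatstone bridge (gage resistance Rg) with a calibration resistor Rs simulates a fractional resistance change of Rg/(Rg + Rs), which corresponds to an equivalent strain of that fraction divided by the gage factor. A minimal sketch; the 350-ohm gage, gage factor of 2.0, and 59 kilohm shunt are illustrative values, not from the source.

```python
def shunt_cal_equivalent_strain(r_gage_ohms, r_shunt_ohms, gage_factor):
    """Equivalent strain simulated by shunting one bridge arm with a calibration resistor."""
    delta_r_over_r = r_gage_ohms / (r_gage_ohms + r_shunt_ohms)  # simulated dR/R
    return delta_r_over_r / gage_factor

# Illustrative: 350-ohm gage, gage factor 2.0, 59 kilohm shunt resistor
eps = shunt_cal_equivalent_strain(350.0, 59_000.0, 2.0)
print(round(eps * 1e6, 1), "microstrain")  # prints 2948.6 microstrain
```

Comparing the recorded channel output against this predicted equivalent strain verifies the signal-conditioning transfer function without physically loading the structure.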

6. WHY USE DIFFERENT IRIG FORMATS?


Today, modern electronic systems such as communication systems, missile tracking systems, and other data handling systems require time-of-day and year information to properly correlate data with time. Serial formatted time codes are used to efficiently interface a timing system output with the user system. Standardization of time codes is necessary to ensure system compatibility among the various ranges, ground tracking networks, spacecraft and missile projects, data reduction facilities, and international cooperative projects. These digital codes are typically amplitude modulated onto an audio sine wave carrier or transmitted as fast rise-time TTL signals. 6.1 Summary of Different IRIG Formats: There are six main IRIG formats that standard test range areas readily support. The main differences between the IRIG formats are the rate, usually in pulses per second (PPS) or pulses per minute (PPM), which represents the rate at which data is correlated, and the count interval, usually in seconds, milliseconds, or minutes. The choice of which IRIG serial time code to use is based on the user's needs, the fidelity of the data being analyzed, and the supportability of that particular IRIG standard when performing tests. The six IRIG formats currently available are IRIG-A, IRIG-B, IRIG-D, IRIG-E, IRIG-G, and IRIG-H, with the following rates and count intervals:

IRIG Type    Rate          Count Interval
IRIG-A       1,000 PPS     1 ms
IRIG-B       100 PPS       10 ms
IRIG-D       1 PPM         1 minute
IRIG-E       10 PPS        0.1 second
IRIG-G       10,000 PPS    0.1 ms
IRIG-H       1 PPS         1 second
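The rate and count interval in the table above are reciprocals of one another, which makes for a simple consistency check. In this sketch all rates are expressed in pulses per second (taking the listed count intervals as authoritative), and the helper name is my own.

```python
# Rates in pulses per second; the count interval is the reciprocal of the rate.
irig_rates_pps = {
    "IRIG-A": 1_000,
    "IRIG-B": 100,
    "IRIG-D": 1 / 60,   # 1 pulse per minute
    "IRIG-E": 10,
    "IRIG-G": 10_000,
    "IRIG-H": 1,
}

def count_interval_seconds(fmt):
    """Count interval in seconds for a given IRIG serial time code format."""
    return 1.0 / irig_rates_pps[fmt]

print(count_interval_seconds("IRIG-B"))  # 0.01 s, i.e. the 10 ms interval of IRIG-B
```

The same reciprocal relationship explains the trade-off discussed next: a higher pulse rate gives a finer count interval at the cost of more bandwidth and processing.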

6.2 IRIG-B: The Choice Standard: Although the six standardized time codes have been in existence for quite some time, the most commonly used format today is the IRIG-B serial time code. IRIG-B is primarily chosen because of its 100 PPS rate and 10 ms count interval: many instrumentation applications can make sufficient use of the 100 PPS rate, and a count interval of 10 ms is usually more than enough to accurately analyze the telemetry data. A rate of one pulse per minute is usually too slow for many applications, while 1,000 or 10,000 PPS rates are usually overkill for most applications.

7. CONCLUSION
This paper introduced telemetry and provided a brief history of how telemetry came about. It described some telemetry applications and introduced telemetry equipment requirements, including signal conditioning and sub-carrier oscillator equipment. It also outlined some telemetry concerns, such as noise and errors, and presented ways to minimize these errors using transducer, data system, and physical end-to-end calibrations.

8. REFERENCES
[1] Types of Modulation, retrieved from http://cbdd.wsu.edu/kewlcontent/cdoutput/TR502/page21.htm.
[2] Hydrometry, Wikipedia, retrieved from http://en.wikipedia.org/wiki/Hydrometry.
[3] Lalik, Jay M. Sr. (2007). 46th Test Wing, Flight Test Engineer, Eglin AFB, FL.
[4] Nichols, Myron H. and Rauch, Lawrence L. (1956). Radio Telemetry. John Wiley & Sons, Inc., New York and London.
[5] Pacific Missile Test Center, Point Mugu, California, Handbook.
[6] Stiltz, Harry L. (1961). Aerospace Telemetry. Prentice-Hall, Inc., Englewood Cliffs, NJ.
[7] Stiltz, Harry L. (1961). Aerospace Telemetry: Equipment Requirements and Telemetry Applications. Prentice-Hall, Inc., Englewood Cliffs, NJ.
[8] White Sands Missile Range, RCC On-Line Documents, Document 118-02, Test Methods for Telemetry Systems and Subsystems, Volume 2: Test Methods for Telemetry RF Subsystems, retrieved from https://wsmrc2vger.wsmr.army.mil/rcc/manuals/118v2/index118vol2.htm.
[9] White Sands Missile Range, RCC On-Line Documents, Document 120-01, Telemetry Systems Radio Frequency (RF) Handbook.
[10] White Sands Missile Range, RCC On-Line Documents, Document 121-06, Instrumentation Engineers Handbook.
