
Part 3 Electronic Media

Index of Modules
Radio - Mass Communication
Growth of Broadcasting
Trends in Broadcasting
Introduction to Audiography
Introduction to Television
Programming Process
Production Process
Types of Programming
Electronic Cinematography
Post Production in Television

Radio - Mass Communication
Siddesh

Radio: Introduction
Radio is the transmission of signals by the modulation of electromagnetic waves with frequencies below those of visible light. Information is carried by systematically changing some property of the radiated waves, such as amplitude, frequency, or phase. The meaning and usage of the word "radio" have developed along with the field itself and can be seen to have three distinct phases: a) electromagnetic waves and experimentation; b) wireless communication and technical development; c) radio broadcasting and commercialization.

Equipment
Receivers: The few radio receivers able to pick up the first-ever "outside broadcast" were those at the De Forest Radio Laboratory, on board ships in New York Harbor, in large hotels

on Times Square, and at points in New York where members of the press were stationed at receiving sets. Public receivers with earphones had been set up in several well-advertised locations throughout New York City, and the general public was invited to listen to the broadcast. Lee De Forest's Radio Telephone Company manufactured and sold the first commercial radios in the demonstration room at the Metropolitan Life Building in New York City for this public event.

Transmitter: The wireless transmitter had 500 watts of power. It is reported that this broadcast was heard 20 km away on a ship at sea.

Radio: Means of Mass Communication
Radio was first used by mariners for sending telegraphic messages in Morse code between ships and land. The earliest users included the Japanese Navy scouting the Russian fleet during the Battle of Tsushima in 1905. One of the most memorable uses of marine telegraphy was during the sinking of the Titanic in 1912, which included communications between operators on the sinking ship and nearby vessels, and communications to shore stations listing the survivors. Radio was also used to pass on orders and communications between armies and navies on both sides in World War I. Besides broadcasting, point-to-point radio communication, including telephone messages and relays of radio programs, became widespread in the 1920s and 1930s. Another use of radio in the pre-war years was the development of radar (Radio Detection And Ranging) for detecting and locating aircraft and ships. Today, radio takes many forms, including wireless networks and mobile communications of all types, as well as radio broadcasting. Before the advent of television, commercial radio broadcasts included not only news and music, but dramas, comedies, variety shows, and many other forms of entertainment.
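As noted at the start of this module, information is carried by systematically varying some property of the carrier wave. The sketch below illustrates the simplest case, amplitude modulation, numerically; it is a minimal illustration only, and the sample rate, carrier frequency, message tone, and modulation index are arbitrary values chosen for the example rather than figures from the text.

```python
import numpy as np

# Minimal illustration of amplitude modulation (AM): the message signal varies
# the amplitude of a higher-frequency carrier. All values are illustrative.
fs = 100_000                      # sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)    # 10 milliseconds of signal

f_carrier = 10_000                # carrier frequency, Hz
f_message = 500                   # message (audio) tone, Hz
m = 0.7                           # modulation index, kept below 1.0

message = np.sin(2 * np.pi * f_message * t)
carrier = np.cos(2 * np.pi * f_carrier * t)

# Standard AM: the envelope of the transmitted wave follows the message.
am_signal = (1 + m * message) * carrier

# A simple envelope detector recovers the message: rectify, then smooth
# over one carrier period.
window = int(fs / f_carrier)      # samples per carrier cycle
envelope = np.convolve(np.abs(am_signal), np.ones(window) / window, mode="same")
```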

The 1930s and 1940s brought what is known today as the Golden Age of Radio. During this period, radio took its crown as king of all media. Radio became the greatest form of popular entertainment for families. With dramatic and compelling programming such as sitcoms, dramas, game shows, and soap operas, there was something for everyone to enjoy. Radio is one of our most important ways of communicating. Since the late 1800s, when radio was invented, it has played a huge role in our lives. Communication between two far-distant places became quick and far less expensive than stringing telegraph wire. Suddenly, ship-to-ship and ship-to-shore radios were saving thousands from disaster at sea, radio entertainment broadcasts were going into people's homes, and soldiers in the field were able to keep in touch with friendly units. Broadcasting is the best-known use of radio. Radio stations arrange songs and programs of particular genres to broadcast to listeners who tune in to hear them. Most stations provide short newscasts, and talk radio provides a public forum where people can listen to interviews or call in to speak with the host or his or her guests. Sports events can be broadcast as an announcer provides a play-by-play description of the action. Companies can buy airtime on privately owned stations to run commercials designed to appeal to that station's listeners. Two-way radios are also very important. Emergency personnel such as police, firefighters, and ambulance crews use radio to stay in contact with their bases and with each other. They send and receive reports with radios in their vehicles and carry smaller portable devices with them. Commercial vehicles such as taxis, trucks, and airplanes use radios to receive

directions and report difficulties. Construction crews, farmers, ranchers, and other groups use radio to send and receive information such as instructions and warnings. Radio is used extensively in the military to facilitate communication between bases, ships, planes, military vehicles, and field units. Private individuals may also use radio to communicate with others on citizens band radio. Other uses of radio include remote controls used to direct toys, railroad cars, or unpiloted aircraft. Airplanes depend on radioed navigation signals to stay on course, and a form of radio called radar is used to guide ships, submarines, and aircraft as well as to detect them. Radios may also transmit large amounts of data between electronic devices, such as computers. Devices called bugs allow others to listen in on private conversations to obtain information and are commonly used by intelligence agencies. Doctors can also use radio to diagnose stomach ailments by having the patient swallow a capsule radio and then studying the signals it transmits.

Roles of Radio
As commercial radio developed in its early days, its key strengths were seen as primarily tactical - fast turnaround, low capital cost and local flexibility. These days, however, while the traditional strengths still apply, radio is increasingly being used for strategic roles.

Dominant share of mind: Share of mind can be described as the extent to which a brand makes itself salient within the consumer's mind - this is often the most challenging task in sectors where

there are several top-parity brands, and/or high levels of competitive activity. The ability of radio to create dominant share of mind is a product of its intrusiveness and the high frequency with which ads are broadcast.

Support to other media: Young people are consumers of several media, and campaigns which use only one medium can miss out on the "media multiplier effect". Because of its inherent characteristics, radio can work in a complementary way to other media. Brands in fast-changing areas like retail or financial services often use radio for its ability to put over several different messages as an overlay to a core TV campaign (multiple executions in radio are very inexpensive compared to TV). Radio can also, like TV, bring things to life - for services or corporate advertising this can be very valuable in adding personality and tone of voice.

The "explainer" medium: Young people, because of their inexperience, often need the benefits of products or services explained to them before they can make a decision to purchase - for example, a bank account, or a promotional offer. Radio is particularly useful for this as it uses the human voice in real-time. This means that the young people do not have to wade their way through extensive reading material before they even know what the proposition is. Studies clearly demonstrate that properties created on radio are cost effective and have advantage of high recall. International experience suggests that music oriented properties targeted at youth last long and provide immense benefit to advertisers.

Conclusion
Radio offers tremendous opportunities for advertisers, and media planners need to explore the various options by which they can effectively use radio in their media mix.

Conversely, broadcasters need to develop the market by being more responsive to the advertiser's needs. This will provide an opportunity for the market to arrive at the final verdict on the effectiveness of the medium.

Growth of Broadcasting
Jasmine Gill

Invention of Radio
Broadcasting evolved from electronic communication through wires (hence the term wireless, frequently used in the early years of radio). The roots of broadcasting technology lie in the development of the telegraph (1844) by the American inventor Samuel Morse and of the telephone (1876) by the Scottish-born American inventor Alexander Graham Bell. The idea of using radio for broadcasting to mass audiences was formulated in 1916 by the Russian-born executive of the Marconi Wireless Telegraph Co., David Sarnoff. Four years later Frank Conrad (1874-1941), an engineer with the Westinghouse Electric Corp., attracted considerable attention when a newspaper reported on the growing audience listening on crystal radio sets to his evening and weekend amateur broadcasts; a local music store had provided records to play on the Victrola, and Conrad and his family served as early disc jockeys. Westinghouse vice-president Harry Davis (1868-1931) assigned Conrad to build a more powerful transmitter to announce the outcome of the next U.S. presidential election, and in 1920 station KDKA in Pittsburgh, Pa., announced that Warren G. Harding had been elected president. About 1,000 people heard this first news broadcast by the first U.S. commercial station. Radio communicated news much faster than did newspapers, and because crystal sets were easy to build and inexpensive, radio expanded rapidly in the following years. To stimulate the sale of radio sets, equipment manufacturers provided transmitting facilities. Singers, comedians, and entire orchestras volunteered their services for publicity. The eventual financial basis of the new industry, however, was still unclear.

Invention of Television
As the radio industry matured, inventors were busy working on the next innovation

in the electronic revolution: television. The idea of broadcast television first surfaced in science fiction of the 1880s. In 1884 the German inventor Paul Gottlieb Nipkow (1860-1940) developed a rotating-disk technology to transmit pictures over wire. This limited technology dominated the early years of television research but was ultimately abandoned as impractical. The basic elements of an all-electronic television system became available only after 1927, when the American engineer Philo T. Farnsworth (1906-1971) developed his image dissector. The corporate giants of the electronics field were no longer as skeptical as they had been about a new mass medium. In 1928 the first television drama, The Queen's Messenger, was broadcast on experimental equipment in Schenectady, N.Y. The first TV receivers offered black-and-white pictures on small screens, about 13 cm (about 5 in) across, and sold for almost half the price of a new economy-model automobile. The technology was in place, and the audience was curious and willing, but after 1940 the industry moved into large-scale production of armaments and other materiel for use in World War II. Only six TV transmitters continued to broadcast through the war years. After the war, television expanded rapidly until 1948, when about 70 stations were on the air. In that year the Federal Communications Commission (FCC), concerned about the limited space available for television transmission in the VHF band (very high frequency, channels 2-13), initiated a 4-year freeze on all new licenses. In 1952 the FCC resumed licensing new stations, and it opened the UHF band (ultra high frequency, channels 14-83) for television transmission. Mass production lowered the price of TV sets to an affordable range, so that by 1955, 67 percent of American households were equipped to receive black-and-white programs; the figure increased to 87 percent by 1960. Peter Carl Goldmark (1906-1977) of the Columbia Broadcasting System (CBS) demonstrated the first colour television system in 1940. During the 1940s, RCA developed a colour broadcasting technique that, unlike the CBS prototype, was compatible with existing black-and-white services; standards for a colour system compatible with monochrome receivers gained FCC approval in 1953.

Broadcasting around the World

Britain
The first experimental broadcasts, from Marconi's factory in Chelmsford, began in 1920. Two years later, a consortium of radio manufacturers formed the British Broadcasting Company (BBC). The company broadcast until its licence expired at the

end of 1926. The company then became the British Broadcasting Corporation, a non-commercial organisation. Its governors were appointed by the government but did not answer to it. Lord Reith took a formative role in developing the BBC, especially in radio. Working as its first manager and Director-General, he promoted the philosophy of public service broadcasting, firmly grounded in the moral benefits of education and of uplifting entertainment, eschewing commercial influence and maintaining a maximum of independence from political control. BBC television broadcasts in Britain began on November 2, 1936, and continued until wartime conditions closed the service in 1939.

Germany
Before the Nazi assumption of power in 1933, German radio broadcasting was supervised by the Post Office. A listening fee of 2 Reichsmarks per receiver provided most of the funding. Immediately following Hitler's assumption of power, Joseph Goebbels became head of the Ministry for Propaganda and Public Enlightenment. Non-Nazis were removed from broadcasting and editorial positions, and Jews were fired from all positions. Reichsrundfunk programming began to decline in popularity as the theme of Kampfzeit was continually played. Germany experimented with television broadcasting before the Second World War, using a 180-line raster system beginning before 1935. German propaganda claimed the system was superior to the British mechanical scanning system, but this was disputed by persons who saw the broadcasts.

Sri Lanka
Sri Lanka has the oldest radio station in Asia. The station was known as Radio Ceylon, and it developed into one of the finest broadcasting institutions in the world. It is now known as the Sri Lanka Broadcasting Corporation. Sri Lanka created broadcasting history in Asia when broadcasting was started in Ceylon by the Telegraph Department in 1923 on an experimental footing, just three years after the inauguration of broadcasting in Europe. Gramophone music was broadcast from a tiny room in the Central Telegraph Office with the aid of a small transmitter built by the Telegraph Department engineers from the radio equipment of a captured German submarine. This broadcasting experiment was a huge success, and barely three years later, on December 16, 1925, a regular broadcasting service was instituted. Edward Harper, who came to Ceylon as Chief Engineer of the Telegraph Office in 1921, was the first person to actively promote broadcasting in Ceylon. Sri Lanka occupies an

important place in the history of broadcasting, with broadcasting services inaugurated just three years after the launch of the BBC in the United Kingdom.

The Future of Broadcasting
Beginning in the 1980s, mass communications technologies have undergone dramatic changes. Innovations in video recording, as well as improved cable and satellite TV technology, have brought much greater diversity in programming as well as increased viewer control over what is watched and when. In 1985 the FCC ruled that government restrictions on so-called backyard dish antennas, which receive satellite-delivered signals, could not be used to limit competition in distributing television programs. Another new element in communications is the computer. Through the use of coaxial cable or fiber optic cable, or modified use of the existing telephone system, two-way communication can take place between a home computer terminal and a central facility that provides information and entertainment. The Telecommunications Act of 1996 opened new possibilities by allowing telephone companies to offer television broadcasting services through phone lines or satellites, and by reserving a portion of the broadcasting spectrum for advanced digital television broadcasting. By the early 2000s close to 200 million Americans were estimated to make use of the Internet. Initially broadcasters were wary of the Internet, seeing it as a competitor that might drastically reduce their viewership. More recently, however, advertisers, broadcasters, and Internet content providers have been seeking effective ways of linking the various media. Even as broadcasters have sought to harness Internet technology, electronics manufacturers have found new ways to allow users to interact with TV. Keyboards and other controllers linked to set-top boxes let consumers use their TV screens for e-mail, electronic commerce, Web-surfing, and interactive video games. This growing convergence of television and computers represents one of the most exciting frontiers for broadcasting in the 21st century.

Trends in Broadcasting
Ronie Koran, Sourabh Raghuwanshi

Today the scenario around the world has changed a great deal. Traditional radio as we know it faces stiff competition not only from TV but also from new forms of media that have come about with technological advancement. Not long ago traditional terrestrial radio occupied a unique and seemingly unshakable position among media. It had the portability of a magazine or a newspaper and the content variety of television, and cost nothing to use beyond the cost of a receiver. As broadcast

television struggled to keep its audience from fleeing to cable and later satellite, radio remained stable. Technology certainly offered alternatives, such as portable tape and CD players, but they were clunky and lacked the scope and flexibility of old-fashioned radio. By 2005 that had begun to change dramatically. Seemingly overnight, satellite radio, Internet-only stations, podcasts, MP3s and iPods were changing the way the world listened. And all of it was quickly becoming portable. A listener could carry around everything from an entire home CD collection to a radio show downloaded the night before, and the new audio programmers were capturing and creating content limited only by the scope of imagination. From the early 1920s through the early 1980s, broadcasting was the only effective means of delivering radio programming to the general public. However, functions once exclusive to broadcasting are now shared in industrially advanced societies by two other means of mass communication: (1) cable television and radio systems, such as commercial cable services, pay-per-view channels, and modem-accessible databases, which transmit sounds and images to paid subscribers rather than to the general public; and (2) self-programmable systems, such as the videocassette recorder (VCR), digital video disc (DVD), video game, and digital recording technology, which allow the user more control over content and scheduling. Despite these innovations, in the first years of the 21st century broadcasting remained the single most important component of mass communication, even in countries where the newer systems are available and growing. It is estimated that about 1.8 billion radios are in use worldwide, with more than half concentrated in North America, the European Union countries, and Japan. In developing societies such as China, India, Brazil, and Egypt, nearly all citizens own or have access to a radio; television, on the other hand, remains the privilege of a smaller but expanding class of people. The digitisation of sound and video broadcasting has much more profound implications than the improvements in distortion and signal-to-noise ratio it typically achieves. User terminals for digital reception are far more sophisticated than their analogue predecessors. They can record material for later playback, make use of electronic program guides and, in some cases, communicate back to the broadcaster. This raises questions about copyright and the technical and legal measures which may be employed to prevent end-users from copying material contrary to the wishes of the material's owners. Many of the digital approaches work within the Radio Frequency (RF) spectrum arrangements which were developed between the 1920s

and the 1960s for analogue systems. In a number of instances, entire broadcasting systems have been devised and implemented using proprietary transmission and reception technologies. The creators of these new businesses have taken bold leaps on several fronts at once in order to realise their service. For instance, the WorldSpace, XM and Sirius satellite radio systems, and the Japanese and South Korean Satellite Digital Multimedia Broadcasting (S-DMB) systems, each involved the development of exceedingly complex, low-power microchips and software to run on them, simultaneously with financing, designing, building and successfully launching one or more satellites with capabilities exceeding those of previous satellites. The resulting services are unique and potentially profitable, but there are tremendous risks. At every point in this explosion and cross-fertilisation of various technologies, we find new opportunities for communication and often great capacity for things going wrong due to technical, business and legal complexity. New broadcast delivery systems continue to be developed. In the increasing number of homes equipped with digital cable systems, broadcast radio stations must now compete against scores of commercial-free digital music channels, each offering round-the-clock delivery of a single style or genre of music. The years during which radio and television broadcasting dominated mass communication as the principal means of signal delivery, approximately the 1920s to the 1990s, can be thought of as the broadcast era of communications. It is fair to say that this was perhaps the only time in history when so wide a range of economic and social classes constituted a single audience. Although still technically possible, the assembly of enormous, heterogeneous audiences, a common daily occurrence of the broadcasting era, is becoming increasingly rare, as the number of non-broadcasting alternatives increases and target audiences become narrower. A big trend affecting the broadcasting industry is new technology, especially the Internet. As the web grows in popularity, more and more people turn to it to receive breaking news reports rather than rely on the work of radio and television reporters. Competition has heated up between these three information media, and as a result, employment has decreased in recent years for television and radio broadcasters. To stay competitive, broadcasters have to be faster and better at what they do than ever before. Consumers are continuously moving towards online media and shifting away from traditional entertainment sources. As broadband penetration continues to rise, consumers will continue to use online services to obtain content, and players in the media sector must look to expand their portfolios in order to survive. Content producers are looking to push their content over

multiple channels in order to retain and attract consumers. Video aggregators and social networking sites are the latest disruptive platforms set to be utilized for content distribution. It is essential for players to understand the opportunities and risks present when exploiting the medium.

Introduction to Audiography
Akanksha Singh

Sounds vary according to their intensity (volume or loudness) and their tone (the rate of sound wave vibrations). Hearing occurs when sound waves are conducted to the nerves of the inner ear and from there to the brain. Sound waves can travel to the inner ear by air conduction (through the ear canal, eardrum, and bones of the inner ear) or bone conduction (through the bones around and behind the ear). The intensity of sound is measured in decibels (dB). A whisper is about 20 dB, loud music (some concerts) is around 80 to 120 dB, and a jet engine is about 140 to 180 dB. Usually, sounds greater than 85 dB can cause hearing loss within a few hours of exposure.

The Director of Audiography (DOA) or Sound Director (SD) or Audio Director (AuD) is the designer and manager responsible for the audio experience in filmmaking. The responsibilities range from the sound concept, design, planning and initial budgeting in pre-production, through recording and scheduling in production, to coordinating the final mix in post-production and overall quality control of the audio process in filmmaking. The DOA is mostly found in Bollywood productions, where music is a vital part of the genre. The SD was once a recognised role in Hollywood prior to the 1990s; today, however, this role is largely reduced to either sound designer and sound engineer (in post-production) or sound mixer (in production). Hollywood films are normally dialogue-based, and even the dialogue is often re-recorded in post-production using a technique called ADR. A tension exists between the visual and aural dimensions of filmmaking which is reflected in film history, where silent films preceded the "talkies". Production sound crews often complain about the lack of consideration given to audio issues in some productions. Having a DOA or SD helps alleviate such pressures by providing a powerful presence to defend the dimension of sound in filmmaking. The absence of a DOA or SD can result in a production company failing to plan effectively or budget realistically for sound. Hollywood sound editor David Yewdall bemoans the loss of the SD and tells the true

story of how the producer of the film Airport failed to understand the importance of recording aircraft sound effects during a shoot, costing the film additional expense in post-production. Every dimension of filmmaking requires specialist attention, not least sound, which requires the detailed planning and coordination of an experienced DOA or SD to assure the sound quality of any modern film. This role should not be confused with that of the Recording Director, who was the head of sound recording at a major Hollywood studio in the era before the 1960s. Douglas Shearer was the Recording Director of MGM until 1952. Usually this was the only sound credit on those early MGM films.
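To make the decibel figures quoted at the start of this module concrete, here is a minimal sketch using the standard definition of sound intensity level, L = 10 log10(I / I0), with the conventional reference intensity I0 = 10^-12 W/m^2; the example intensities are illustrative values, not measurements from the text.

```python
import math

def intensity_to_db(intensity_w_per_m2: float, reference: float = 1e-12) -> float:
    """Sound intensity level in decibels, relative to the standard 1e-12 W/m^2."""
    return 10 * math.log10(intensity_w_per_m2 / reference)

# Illustrative values: every tenfold increase in intensity adds 10 dB.
print(round(intensity_to_db(1e-10)))  # 20 dB - roughly a whisper
print(round(intensity_to_db(1e-4)))   # 80 dB - roughly loud music
print(round(intensity_to_db(1e2)))    # 140 dB - roughly a jet engine
```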

Introduction to Television
Sayantani Paul

The origins of what would become today's television system can be traced back to the discovery of the photoconductivity of the element selenium by Willoughby Smith in 1873, the invention of a scanning disk by Paul Gottlieb Nipkow in 1884, and Philo Farnsworth's image dissector of 1927. The 23-year-old German university student Nipkow proposed and patented the first electromechanical television system in 1884. Nipkow's spinning disk design is credited with being the first television image rasterizer. Constantin Perskyi had coined the word television in a paper read to the International Electricity Congress at the International World Fair in Paris on August 25, 1900. Perskyi's paper reviewed the existing electromechanical technologies, mentioning the work of Nipkow and others. The photoconductivity of selenium and Nipkow's scanning disk were first joined for practical use in the electronic transmission of still pictures and photographs, and by the first decade of the 20th century halftone photographs, composed of equally spaced dots of varying size, were being transmitted by facsimile over telegraph and telephone lines as a newspaper service. However, it wasn't until 1907 that developments in amplification tube technology made the design practical. The first demonstration of the instantaneous transmission

of still monochromatic images with continuous tonal variation (as opposed to halftone) was by Georges Rignoux and A. Fournier in Paris in 1909, using a rotating mirror-drum as the scanner and a matrix of 64 selenium cells as the receiver. On March 25, 1925, Scottish inventor John Logie Baird gave a demonstration of televised silhouette images in motion at Selfridge's Department Store in London. But if television is defined as the contemporaneous transmission of moving, monochromatic images with continuous tonal variation (not still, silhouette or halftone images), Baird first achieved this privately on October 2, 1925. Strictly speaking, however, Baird had not yet achieved moving images on October 2: his scanner worked at only five images per second, below the threshold required to give the illusion of motion, usually defined as at least 12 images per second. By January, he had improved the scan rate to 12.5 images per second. He then gave the world's first public demonstration of a working television system to members of the Royal Institution and a newspaper reporter on January 26, 1926 at his laboratory in London. Unlike later electronic systems with several hundred lines of resolution, Baird's vertically scanned image, using a scanning disk embedded with a double spiral of lenses, had only 30 lines, just enough to reproduce a recognizable human face. In 1927, Baird transmitted a signal over 438 miles (705 km) of telephone line between London and Glasgow. In 1928, Baird's company (Baird Television Development Company / Cinema Television) broadcast the first transatlantic television signal, between London and New York, and the first shore-to-ship transmission. In 1929, he became involved in the first experimental electromechanical television service in Germany. In November 1929, Baird and Bernard Natan of Pathe established France's first television company, Télévision-Baird-Natan. In 1931, he made the first live transmission, of the Epsom Derby. In 1932, he demonstrated ultra-short-wave television. Baird's electromechanical system reached a peak of 240 lines of resolution on BBC television broadcasts in 1936, before being discontinued in favor of a 405-line all-electronic system developed by Marconi-EMI. Meanwhile, in Soviet Russia, Léon Theremin had been developing a mirror-drum-based television, starting with 16 lines of resolution in 1925, then 32 lines and eventually 64 using interlacing in 1926; as part of his thesis, on June 7, 1926, he electrically transmitted and then projected near-simultaneous moving images on a five-foot-square screen. By 1927 he had achieved an image of 100 lines, a resolution that was not surpassed until 1931 by RCA, with 120 lines.

Mechanical color
The successful transmission of color images was hindered in television's mechanical era by the sluggish response of selenium photoelectric cells at the transmitting end, which could barely transmit a grayscale image fast enough to simulate motion, much less three times as many images per second to analyze the red, blue and green primary colors of light. To get around this, early color systems typically used three separate scanning, transmission, and reproduction systems that were superimposed into one image at the receiver. John Logie Baird demonstrated the world's first color transmission on July 3, 1928, using scanning discs at the transmitting and receiving ends with three spirals of apertures, each spiral with filters of a different primary color, and three light sources at the receiving end, with a commutator to alternate their illumination. Baird also made the world's first color broadcast on February 4, 1938, sending a mechanically scanned 120-line image from Baird's Crystal Palace studios to a projection screen at London's Dominion Theatre.

TELEVISION IN INDIA
Television first came to India (known as Doordarshan or DD) as the National Television Network of India. The first telecast started on September 15, 1959 in New Delhi. After a gap of about 13 years, a second television station was established in Mumbai (Maharashtra) in 1972, and by 1975 there were five more television stations at Srinagar (Kashmir), Amritsar (Punjab), Calcutta (West Bengal), Madras (Tamil Nadu) and Lucknow (Uttar Pradesh). Transmission was in black and white until 1982, when Doordarshan introduced colour during the 1982 Asian Games. Indian small-screen programming started off in the early 1980s. At that time there was only one national channel, Doordarshan, which was government-owned. The Ramayana and the Mahabharat were the first major television series produced; these serials notched up world records in viewership numbers for a single program. By the late 1980s more and more people started to own television sets. Though there was a single channel, television programming had reached saturation. Hence the government opened up another channel, which had part national programming and part regional. This channel was known as DD 2, later DD Metro. Both channels were broadcast terrestrially.

Post-Liberalisation Television
The central government launched a series of economic and social reforms in 1991 under Prime Minister Narasimha Rao. Under the new policies the government

allowed private and foreign broadcasters to engage in limited operations in India. This process has been pursued consistently by all subsequent federal administrations. Foreign channels like CNN and Star TV, and domestic channels such as Zee TV and Sun TV, started satellite broadcasts. Starting with 41 sets in 1962 and one channel (Audience Research Unit, 1991), TV in India at present covers more than 70 million homes, giving a viewing population of more than 400 million individuals across more than 100 channels. A large, relatively untapped market, easy accessibility of relevant technology and a variety of programmes are the main reasons for the rapid expansion of television in India. It must be stressed that television entertainment in India is one of the cheapest in the world.

Cable Television
In 1992, the government liberalised its markets, opening them up to cable television. Five new channels belonging to the Hong Kong-based STAR TV gave Indians a breath of fresh air: MTV, STAR Plus, BBC, Prime Sports and STAR Chinese Channel. Zee TV was the first privately owned Indian channel to broadcast over cable. A few years later CNN, Discovery Channel and National Geographic Channel made their foray into India. Star expanded its bouquet, introducing STAR World, STAR Sports, ESPN and STAR Gold. Regional channels flourished along with a multitude of Hindi channels and a few English channels. By 2001 HBO and History Channel were the other international channels to enter India. Between 2001 and 2003, other international channels such as Nickelodeon, Cartoon Network, VH1, Disney and Toon Disney entered the fray. In 2003 news channels started to boom.

Programming Process
Audio in Television Programming
Sumit

Equipment Used:
The equipment and listening conditions also greatly affect how we perceive different frequencies. At times this creates problems, which we can compensate for with the bass and treble controls of the playback equipment.
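Bass and treble controls, and the graphic equalizer described next, all work on the same principle: applying an independent gain to each frequency band of the signal. The sketch below is a crude FFT-based illustration of that idea, not production audio code; the band edges, gains, and test signal are arbitrary values invented for the example.

```python
import numpy as np

def apply_band_gains(samples: np.ndarray, sample_rate: int,
                     bands: list[tuple[float, float, float]]) -> np.ndarray:
    """Apply a gain (in dB) to each frequency band of a mono signal.

    bands: list of (low_hz, high_hz, gain_db) tuples - a crude stand-in for
    the per-band sliders of a graphic equalizer or for bass/treble controls.
    """
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1 / sample_rate)
    for low_hz, high_hz, gain_db in bands:
        mask = (freqs >= low_hz) & (freqs < high_hz)
        spectrum[mask] *= 10 ** (gain_db / 20)   # convert dB to linear amplitude
    return np.fft.irfft(spectrum, n=len(samples))

# Illustrative use: boost the bass slightly and cut the treble of a test tone mix.
sr = 44_100
t = np.arange(0, 1.0, 1 / sr)
signal = np.sin(2 * np.pi * 100 * t) + 0.5 * np.sin(2 * np.pi * 8_000 * t)
equalized = apply_band_gains(signal, sr, [(20, 250, +3.0), (4_000, 16_000, -6.0)])
```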

A graphic equalizer allows specific bands of frequencies to be individually adjusted for loudness. A graphic equalizer is also used to help match audio segments recorded under different conditions, or simply to customize audio playback to the acoustics of a specific listening area. Many pieces of audio equipment affect the fidelity of sound, such as microphones, amplifiers, recorders and speakers. A microphone is a device that transduces sound waves into electrical energy; different designs do this in a variety of ways. Microphones can be classified by their physical and electrical characteristics as follows:

1. Camera-mounted microphone: These are permanently mounted on camcorders. They are usually condenser mikes with an omnidirectional pickup pattern that is primarily useful for recording ambient sound, also called natural or wild sound. However, it is often advised not to rely on this microphone, as it picks up ambient sound and its reach is very limited.

2. Hand microphones: These are designed to be hand-held by the talent. Reporters frequently use hand microphones for interviews or stand-ups at news events. They must be physically rugged, and because they are seen by viewers, they usually have non-distracting, non-reflective finishes.

3. Boom microphone: A boom is a device with a long arm that positions the microphone near the talent but outside the camera's view. The simplest boom is the fishpole, a light, hollow, telescopic pole with a microphone attached to one end. A principal reason for using a boom is to keep the microphone out of sight, so a boom-mounted microphone is some distance from the sound source, depending on shot composition. This often requires that the microphone have a highly directional pattern to avoid picking up unwanted sounds. But if the talent moves quickly or you need to pick up the sound of several persons at a time, you cannot use this microphone.

4. Lavaliere and clip microphone: These are microphones physically attached to the talent. Technically, a lav microphone hangs by a cord from the talent's neck while a

clip microphone is attached to the talent's clothing with a small pin or tie-clip device. These often use condenser-type transducers and have an omnidirectional pickup pattern. Microphones worn close to the chest would excessively emphasize the lower frequencies in the talent's voice were they not designed to compensate for this effect; because of this compensation, such microphones can sound tinny when used away from the chest.

5. Wireless microphone: This uses a radio-frequency transmitter and receiver, rather than an audio cable, to deliver the microphone's output to the recorder or console. The part of the system attached to the talent includes a low-powered transmitter that broadcasts the microphone's output to a receiver located somewhere off-camera.

Visuals in Television Programming

TV CAMERA
ENG Camera: Until the mid-1970s, TV journalists used various types of 16 mm film cameras to gather news in the field. Later on, advances in transistor technology and smaller video-image sensing tubes made possible much smaller, portable video cameras. Originally, ENG cameras and their companion video recorders came as two separate units. The videographer typically rested the camera on the right shoulder and carried the relatively heavy recorder by a strap slung over the left shoulder. By the late 1980s, advancing technology allowed the merging of cameras and recorders into a single unit called a camcorder, featuring smaller, lighter cameras and high-quality recorders. Today's ENG cameras produce pictures technically comparable to those made by the best studio cameras of the recent past.

Studio Cameras: Studio cameras produce the finest, sharpest pictures available using existing technology. They are designed to be used co-operatively by more than one individual, and have design features that make productions easier or more efficient within the studio environment. In addition, these cameras use carefully counterbalanced mounting pedestals and other devices to provide smooth, jitter-free movement. The camera's zoom and focus controls are located on the handles attached to the mounts, allowing operators to adjust these controls from behind the

camera while watching the image in the camera's viewfinder. This arrangement also allows easier attachment and balancing of teleprompters and other accessories.

Composition: Whatever a film-maker sees through the viewfinder of the camera is called a shot. The various shots commonly used in the composition of a film are described below.

Camera Techniques: Distance and Angle

Long Shot (LS): A shot which shows all or most of a fairly large subject (for example, a person) and usually much of the surroundings.

Extreme Long Shot (ELS) - see establishing shot: In this type of shot the camera is at its furthest distance from the subject, emphasising the background.

Medium Long Shot (MLS): In the case of a standing actor, the lower frame line cuts off his feet and ankles. Some documentaries with social themes favour keeping people in the longer shots, keeping social circumstances rather than the individual as the focus of attention.

Medium Shot or Mid-Shot (MS): In such a shot the subject or actor and its setting occupy roughly equal areas in the frame. In the case of the standing actor, the lower frame passes through the waist. There is space for hand gestures to be seen.

Medium Close Shot (MCS): The setting can still be seen. The lower frame line passes through the chest of the actor. Medium shots are frequently used for the tight presentation of two actors (the two shot), or with dexterity three (the three shot).

Close-up (CU): A picture which shows a fairly small part of the scene, such as a character's face, in great detail so that it fills the screen. It abstracts the subject from its context.

MCU (Medium Close-Up): head and shoulders.

BCU (Big Close-Up): forehead to chin.

SHOT ANGLES: There are various shot angles used in programming. These are as follows:
1. Low-angle shot
2. Eye-level shot
3. High-angle shot
4. Worm's-eye view
5. Canted
6. Bird's-eye view
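For quick reference, the shot abbreviations described above can be collected into a simple lookup table; the descriptions below paraphrase the text, and the small helper function is a hypothetical convenience for annotating a shot list, not part of any standard production tool.

```python
# Shot abbreviations summarised from the list above; descriptions paraphrase the text.
SHOT_TYPES = {
    "ELS": "Extreme long shot - camera at its furthest, emphasising the background",
    "LS":  "Long shot - most of the subject and much of the surroundings",
    "MLS": "Medium long shot - standing actor cut off at the feet and ankles",
    "MS":  "Medium shot - subject and setting roughly equal; frame cuts at the waist",
    "MCS": "Medium close shot - frame cuts at the chest; setting still visible",
    "MCU": "Medium close-up - head and shoulders",
    "CU":  "Close-up - a small part of the scene, such as a face, fills the screen",
    "BCU": "Big close-up - forehead to chin",
}

def describe_shot(code: str) -> str:
    """Look up a shot abbreviation (hypothetical helper for annotating shot lists)."""
    return SHOT_TYPES.get(code.upper(), "unknown shot type")

print(describe_shot("mcu"))  # Medium close-up - head and shoulders
```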

PROGRAMMING PROCESS
EDIT
Archana

Editing is the process of preparing language, images or sound through correction, condensation, organization and other modification in various media. A person who edits is called an editor. Editing is often done to correct mistakes by cutting out the bad parts and/or replacing them with better ones.

Offline & Online Editing
Offline editing is the film and television production process in which raw footage is copied and edited without affecting the camera-original film or tape, e.g. film editing. Online editing is done while a live programme is going on, e.g. a cricket match or a live interview.

Linear & Non-Linear Editing
Linear video editing is the process of selecting, arranging and modifying the images and sound recorded on the videotape captured by the video camera. Non-linear editing for film and television production is the modern editing method, which allows any frame in a video clip to be accessed with the same ease as any other. This method is similar in concept to the cut-and-paste technique used in film editing from the beginning. In the non-linear system, the timeline is very important: it contains all the relevant information about the edit. While creating the timeline the editor must save, render effects and maintain backup files. The effects and dissolves must be used in good taste, and the timing of the effects can be varied. Similarly, colour corrections can be made when required. The rise of consumerism has increased the manipulation of visuals.

News Editing
Editors who edit news must understand news, have news sense and also be used to doing research. Besides knowing non-linear editing software, news editors must have knowledge of other aspects of editing. Since news has immediacy and is produced

online using multi-camera operation, the news editor must know about vision mixing. The editor should know the proper timing and use of graphics, reporting and newscasting, and must have an aesthetic sense while using the software. Above all, news editors must be responsible. Technology is like a gun that must be handled carefully when news is delivered: the technology of chroma keying, for instance, must not be used to convey incorrect information to the viewers.

Transition
The way in which any two shots are joined together is called a transition. Transitions are very important. The most common transition is the cut, in which one shot changes instantly to the next. The next most common is the crossfade, where one shot gradually fades into the next. Advanced transitions include wipes and digital effects.

Creating Transitions
A cut doesn't need any sort of processing: one shot ends and the next begins. Other transitions are created in several ways:
1) In camera - some cameras come with built-in transitions and fades.
2) Generating device - in live production, transitions can be added in real time using special-effects generators; most vision switchers include a selection of transitions.
3) Post-production - transitions can be added during editing, using appropriate software.

Types of Video Transitions
Cut: The most common transition, an instant change from one shot to another. The raw footage from your camera contains cuts between shots wherever you stop and start recording. In film and television production the vast majority of transitions are cuts.

Mix / Dissolve / Crossfade: These are all terms for the same transition, a gradual fade from one shot to the next. Crossfades have a more relaxed feel than a cut and are useful where a meandering pace or contemplative mood is wanted; scenery sequences work well with crossfades. A crossfade can also convey a sense of passing time or a change of location.

Fade: Fades the shot to a single colour, usually black or white. The fade to black and fade from black are ubiquitous in film and television; they usually signal the beginning and end of a scene. Fades can also be used between shots to create a sort of crossfade which, for example, fades briefly to white before fading up on the next shot.

Wipe: One shot is progressively replaced by another in a geometric pattern. There are many types of wipe, from straight lines to complex shapes. Wipes often have a coloured border to help distinguish the shots during the transition. Wipes are a good way to show a change of location.

Digital Effects: Most editing applications offer a large selection of digital transitions with various effects. There are too many to list here, but these effects include colour replacement, animated effects, pixelization, focus drops, lighting effects, etc.
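The cut, crossfade and fade described above can all be expressed as simple per-pixel operations between two frames. The sketch below shows the idea with NumPy; the frame size, step count and pixel values are placeholders chosen for illustration, and a real editing application would of course operate on decoded video frames rather than synthetic arrays.

```python
import numpy as np

def crossfade(frame_a: np.ndarray, frame_b: np.ndarray, progress: float) -> np.ndarray:
    """Blend two frames; progress runs from 0.0 (all A) to 1.0 (all B)."""
    return ((1.0 - progress) * frame_a + progress * frame_b).astype(np.uint8)

def fade_to_colour(frame: np.ndarray, progress: float, colour: int = 0) -> np.ndarray:
    """Fade a frame towards a single colour: black by default, white with colour=255."""
    return crossfade(frame, np.full_like(frame, colour), progress)

# Placeholder 8-bit RGB frames standing in for the last frame of one shot and
# the first frame of the next (kept small so the example stays lightweight).
shot_a = np.full((180, 320, 3), 200, dtype=np.uint8)
shot_b = np.full((180, 320, 3), 40, dtype=np.uint8)

# A cut needs no processing: shot_b simply follows shot_a. A dissolve or a fade
# is computed over several frames, here 25 steps (about one second at 25 fps).
dissolve = [crossfade(shot_a, shot_b, i / 24) for i in range(25)]
fade_out = [fade_to_colour(shot_a, i / 24) for i in range(25)]
```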

Production Process
Yojana Phadnis

Introduction
Television production comprises the techniques used to create a television program. The entire process of creating a program may involve developing a script, creating a budget, hiring creative talent, designing a set, and rehearsing lines before filming takes place. After filming, the post-production process may include video editing and the addition of sound, music, and optical effects. The three basic forms of television programs are fictional, non-fictional and live television. Fictional programs include daytime soap operas; situation comedies; dramatic series; and motion pictures made for television, including the mini-series (a multiple-part movie). The basic non-fictional, or reality, programs include game shows, talk shows, news, and magazine shows (informational shows exploring a variety of news stories in an entertainment format). Live television is generally restricted to sports, awards shows, news coverage, and several network daily talk shows. Most television programs are produced by production companies unrelated to the television networks and licensed to the networks. The network creates the financing for the production by selling commercial time to sponsors.

The Production Team

The personnel involved in the production of a television program include creative talent such as actors, directors, writers, and producers as well as technical crew members such as camera operators, electrical technicians, and sound technicians. The executive producer is responsible for the complete project and is usually the person who conceives the project and sells it to the network. The executive producer bears final responsibility for the budget and all creative personnel, including the writer, line producer, director, and major cast members. The line producer reports to the executive producer and is responsible for the shooting schedule, budget, crew, and all production logistics. The writer or writers develop the script for each show. They often work during pre-production and rehearsals to correct problems encountered by the actors or directors, or to revise for budgetary or production considerations. Reporting to the executive producer, the director helps choose actors, locations, and the visual design of the production, such as the style of sets and wardrobe. In addition, the director is responsible for the performances of the actors as well as all camera movements. After filming, the director edits the videotape to create what is known as a director's cut. Actors work under the direction of the director to portray a character. Performers include talk-show hosts, newscasters, and sports announcers. Actors and performers are chosen by the producer, and most audition to earn their part. Once they are hired, actors memorize their lines from a script and usually participate in a rehearsal before the program is filmed, or shot. Performers may provide live commentary, or in the case of newscasters, they may read their lines from cue cards or a teleprompter, a machine that displays words on a screen. The production manager is responsible for all physical production elements, including equipment, crew, and location. The assistant directors report to the director and are responsible for controlling the set, managing the extras, and in general carrying out the director's needs. The cinematographer, who operates the camera, is responsible for lighting the set and the care and movement of the camera. The production designer, also called the art director, is responsible for the design, construction, and appearance of the sets and the wardrobe. Often the makeup artists and hair stylists report to the production designer. The key grip is responsible for the camera dolly (the platform that holds and moves the camera) and all on-set logistical support, such as camera mounts, which are used to affix the camera to a car or crane. Videotape production involves a technical director, who is responsible for video recording, and video engineers, who are responsible for the maintenance and quality

of the electronic equipment and its output. The creation of a television show begins with an idea for a program and the development of a script. A television network may also require a commitment from one or more well-known actors before financially committing to film a show. Producing a show involves three main stages: pre-production, principal photography, and post-production.

Pre-Production Activities
Pre-production activities involve the planning, budgeting, and preparation needed before shooting begins. The pre-production period can last as long as a month or more for a movie, or just a week for a single episode of a situation comedy. Productions of great complexity, such as a telethon or a live awards ceremony, may take months of pre-production. Three key people involved in pre-production are the production manager, director, and casting director. The production manager's first tasks are to produce a preliminary budget, hire the location manager, and locate key crew department leaders. The first essential production decisions are the location of shooting and a start-of-production date. The director's first activities are to review the script for creative changes, begin the casting process, and select assistant directors and camera operators. Subsequently, every decision involving cast, creative crew, location, schedule, or visual components will require the director's consultation or approval. The culminating activity of the pre-production process is the final production meeting, attended by all crew members, producers, the director, and often the writer. Led by the director, the pre-production team reviews the script in detail, scene by scene. Each element of production is reviewed and any questions answered. This meeting can last from two hours to a full day, depending on the complexity of the shoot.

Screenplay
A screenplay or script is a written plan, authored by a screenwriter, for a film or television program. Screenplays can be original works or adaptations from existing works such as novels. The major components of a screenplay are action and dialogue, with the "action" being "what we see happening" and "dialogue" being "what we hear" (i.e., what the characters utter). The characters, when first introduced in the screenplay, may also be described visually. Screenplays differ from traditional literary conventions in ways described below; however, screenplays may not involve emotion-related descriptions and other aspects of the story that are, in fact, visual within the end-

product. Screenplays in print are highly formal, conforming to font and margin specifications designed to make one page of screenplay correspond to approximately one minute of action on screen; thus screen directions and descriptions of location are designed to occupy less vertical space than dialogue, and various technical directions, such as settings and camera indications, are set apart from the text with capital letters and/or indentation. Professional screenplays are always printed in 12-point Courier, or another fixed-width font that resembles typewriter type.

Teleplay
A teleplay is a drama which is telecast using many of the same constraints as a theater piece (limited scenery, cast and special effects). Teleplays are typically televised live or filmed at a single television studio using one camera or a few stationary cameras.

Storyboard
Storyboards are graphic organizers such as a series of illustrations or images displayed in sequence for the purpose of previsualizing a motion graphic or interactive media sequence, including website interactivity. The storyboarding process, in the form it is known today, was developed at the Walt Disney studio during the early 1930s, after several years of similar processes being in use at Walt Disney and other animation studios.

Principal Photography
Principal photography is the period in which all the tape or film needed for the project is shot. All television programs are shot using one of two basic methods of photography: single-camera film production and multiple-camera tape production. The single-camera method is used to produce movies for television and most dramatic series. Multiple-camera tape production is used to produce most situation comedies, soap operas, talk shows, game shows, news magazines, and live programs such as sports, awards shows, and the news. Some forms of programming, such as music videos or reality programs (special-interest news presented in an entertaining format), employ both methods, using single-camera shooting for field pieces and multiple cameras for in-studio footage. The single-camera film mode of production is virtually identical to the method of making theatrical movies. The script is broken down into individual scenes. Each scene is shot from a number of angles. The widest shot, which includes all the action, is called the master. Additional shots include closer angles of the characters,

sometimes in groups of two or more, and almost always at least one angle of each actor alone. That shot can be either a medium shot (from waist to head), a close-up (only head and shoulders), or an extreme close-up (of the face only). Many times a scene includes insert shots (such as a close-up of a clock or a gun) or cutaways (a shot of the sky or a tree or another visual that relates to the scene). Scenes are scheduled to be filmed according to production efficiency, not story progression. The film is pieced together in sequential order during post-production. The multiple-camera tape method is most suitable for shooting inside a studio. Three or four videotape cameras are focused on the action taking place on the set, and scenes are shot in sequence. Each camera operator works from a list of camera positions and framing requirements for the full scene. Together the cameras cover all required camera angles. Using headsets to communicate with the camera crew, the director asks for camera adjustments during the filming of the scene and indicates to the technical director which cameras to use at each moment. The technical director ensures the selected shot is recorded on a master tape. The result is a fully edited, complete show, needing only sound effects, music, optical effects, and titles to be complete.

Post-Production Activities
Post-production begins with the completion of filming and continues until the project is delivered to the network for airing. The two main activities of post-production are the editing, or assembling, of video footage and the creation of a complete sound track. Editing may begin during production. In single-camera shoots, the film from each day is reviewed at a later time by the director, producer, and network in the order in which it was shot. These films, called dailies, are then broken down and assembled into scenes by the editors. The first full assemblage is shown to the director, who makes further editing changes and creates the director's cut. Thereafter, the producer and the network make changes until a final cut is created. The final cut is given to the sound department, which is responsible for preparing the music tracks, or recordings; sound effects; and dialogue tracks for final combination into one track. The final mixing of all the sound is called dubbing. During this period, the sound engineers will spot the music, that is, select the points at which music will be inserted, and musicians will write and record the music. Sound engineers also adjust dialogue recording for production quality and record new or replacement dialogue in a process called looping. Sound effects are also added at this time. The

The resulting dubbing session, which can take several days for a movie or just a few hours for a multiple-camera tape production, can involve the combination of 5 to 25 separate sound tracks; a simple sketch of such a mix appears at the end of this section. The final stage of post-production is the addition of optical effects, such as scene fade-outs or dissolves; the insertion of titles and credits; the creation of special visual effects, such as animations; and colour correction. The post-production process can range from about three days for a situation comedy to as long as eight weeks for a movie. Commonly, all optical effects, titles and music are rolled in during the production of soap operas, game shows or talk shows, greatly reducing post-production.
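As a minimal, hypothetical sketch of what that final combination involves, the snippet below sums a few separate mono tracks into one mixed track. The track data, gain levels and the function name are invented for illustration and do not describe any particular dubbing system.

# Minimal sketch of a final mix ("dub"): several separate sound tracks
# (dialogue, music, effects) are combined into a single track.
# All values here are hypothetical; a real dub involves many more tracks
# and per-track level, equalisation and panning decisions.
def mix_tracks(tracks, levels):
    """Sum equal-length mono tracks after applying a gain level to each."""
    length = len(tracks[0])
    mixed = [0.0] * length
    for track, level in zip(tracks, levels):
        for i in range(length):
            mixed[i] += track[i] * level
    peak = max(abs(sample) for sample in mixed) or 1.0
    return [sample / peak for sample in mixed]  # normalise to avoid clipping

dialogue = [0.20, 0.50, -0.30, 0.10]
music    = [0.10, 0.10,  0.20, 0.20]
effects  = [0.00, 0.40, -0.20, 0.00]
final_mix = mix_tracks([dialogue, music, effects], levels=[1.0, 0.6, 0.8])

The point of the sketch is simply that dubbing reduces many parallel tracks to one, with the relative levels decided by the mixing engineers.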

Types of Programming
Rajendra Prasadh

Television programmes can be divided into two broad categories: fiction and non-fiction programmes. Under non-fiction programmes we have:

News and Public Affairs: This category can be subdivided into news bulletins, general news magazines, news magazines for particular ethnic groups, and public affairs discussion.

Features and Documentaries: This is a category not always easy to separate from, for example, news magazines and public affairs discussions. The normal criterion used is that an item is classified as a feature or documentary when a substantial part of it is offered as a direct presentation of the substance of a problem, an experience or a situation, by contrast with the discussion, in which a situation or problem may be illustrated, usually relatively briefly, but in which the main emphasis falls on relatively formal argument about it.

Education: This is defined as items of formal educational intention, as distinct from educational elements in other kinds of programme. It is subdivided into course programmes for schools, colleges and universities; instructional programmes, not normally related to external courses, mainly on crafts, hobbies, etc.; and adult education of a more general kind - specific teaching of general skills which is, however, not related to formal courses and qualifications. There is an overlap between these categories, and the course programmes, for example, are available to and often watched by the general public. The total figure for education is thus more significant than the subsidiary figures.

Arts and Music: This is a difficult category to separate, since it depends on received definitions of the arts - painting, sculpture, architecture, literature - and of music in the sense of the established classical concert repertory and opera. It is given as a separate category because it is usually planned in these terms and is then significant as a received interpretation.

General Entertainment: This is a miscellaneous category, but significant as such. It is subdivided into musical shows, where singers or groups are principally presented, at times with rather different supporting items; variety shows, where the main emphasis is on comedy, in a number of cases with supporting musical items; games and quiz shows, where in many different forms there is some kind of overt game-playing or competition, together with question-and-answer shows of the same competitive kind; and talk shows, a category not always easy to separate from discussions but conventionally defined as a separate form and presented as entertainment, usually late at night, in matter and manner usually strongly linked to show business. A newer genre which can be included in this category is the reality show, which has become quite a craze.

Sport: Televised sport and sports discussion.

Religion: Religious services, discussions and features, presented at specific times.

Publicity (internal): A channel's presentation of its own programmes, by trailers, advance announcements, etc.

Commercials: Advertising programmes of all kinds other than internal publicity, e.g. infomercials.

Fiction programming: Fictional programmes on television range from feature films to soap opera, from innovative ten-minute shorts to popular comedy series that run over many years.
The various types are:

One-off dramas: These are usually grouped together under a regular title and transmitted at the same time every week. The individual dramas usually have a similar theme when they sit under a regular title; examples from Indian television include Rishtey, Aahat and X-zone.

Serials: These are dramas in which the plot develops over several episodes; the number of episodes can vary but is usually short and limited. This format is not popular in India.

Children's Programmes: This is defined as programmes specifically made for and offered to children, at certain special times. Children of course watch many other kinds of programmes, but this separable category is significant. It is subdivided into programmes composed mainly of cartoons and puppet shows; other kinds of entertainment programmes, usually live stories and plays; and educational programmes. This last subdivision needs explanation. Such programmes - Sesame Street, The Electric Company, Playschool - often use cartoons or puppets, and are often entertaining. But they are separately listed because their formal intention is to help with learning, and much specific teaching of skills is included in them, though often in informal ways.

Movies: This is defined as films originally made for distribution in cinemas and movie theatres. Plays, series and serials which may in whole or in part have been filmed are included under drama.

Drama series: These are dramas in which the plot develops over several episodes and which run, on average, for two to five years. They are more predictable for the schedulers than one-offs and also offer reassuring signposts to audiences, since the same characters and settings appear in every episode. The audience gets to know a group of individuals, but can watch a single self-contained episode without needing to follow a cliff-hanging plot line. In the Indian context, suitable examples would be CID and Siddhanth.

Soap Operas: The long-running development of character and the interweaving of multiple story lines that can never be resolved, since the end is never in sight, are the defining characteristics of soap operas; as the characters develop, the audience gets hooked. This is generally where many directors and writers get their first opportunities.
This format is very successful in India; the Ekta Kapoor soaps are prominent examples.

Sitcoms: In situational comedies, or sitcoms, an impossible or insoluble situation usually reveals new comic possibilities each week, e.g. Office Office and Hum Paanch. In the USA and the UK, sitcoms are usually studio-produced in front of a studio audience, whose laughter is heard although the audience is not seen. They break the strict realism of straight drama, which does not admit the presence of an audience.

Electronic Cinematography
Komal Shastri

What is Electronic Cinematography?
Electronic cinematography is cinematography using an electronic video camera to create a videotape that can be viewed on a monitor, edited electronically, and transferred to film for motion-picture projectors. It is the process of capturing motion pictures as digital images, rather than on film. Digital capture may occur on tape, hard disks, flash memory or other media which can record digital data. As digital technology has improved, this practice has become increasingly common. Several mainstream Hollywood and a few Bollywood movies have now been shot digitally. The benefits and drawbacks of electronic versus film acquisition are still hotly debated, but sales of electronic cinematography cameras have surpassed those of mechanical cameras in the classic 35mm format.

Video is the technology of electronically capturing, recording, processing, storing, transmitting and reconstructing a sequence of still images representing scenes in motion. The device used for this is called an electronic cinematography (EC) camera: a high-quality video camera, often resembling a film camera in basic features, which is associated with single-camera video production.

It is commonly believed that the quality of 35mm motion-picture film as viewed on television is better than that of video. If we are talking about artistic differences, then film (still) has a definite advantage, for historical reasons.

What is the difference between Film and Video?
Although artistic differences between film and videotape are difficult to measure, purely technical differences are not. This brings us to the following statement: if production conditions are controlled and if comparisons are made solely on the basis of sharpness and colour fidelity, the best 35mm film will be slightly inferior to the best video, assuming the latest professional-quality video equipment is used and the final result is broadcast. As controversial as this statement might be with some film people, the reason becomes obvious when the production process for each medium is traced.

First, it is important to realize that if a signal from a video camera is recorded using the highest-quality process, no discernible difference will be noted between the picture coming from the camera and the picture that is later electronically reproduced. With film intended for broadcast the process is far more complex. First the image is recorded on negative film. Typically, the original negative film is then used to make a master positive, or intermediate print. From the master positive a "dupe" (duplicate) negative is created, and from that a positive release print is made. This adds up to a minimum of three generations. At each step things happen: colour and quality variations are introduced by film emulsions and processing, there is a general optical degradation of the image, and the inevitable accumulation of dirt and scratches on the film surface begins. (A toy illustration of this compounding loss appears at the end of this section.)

Video is becoming capable of resolving ever-greater levels of fine detail. For example, Eastman Kodak has announced a CCD chip capable of holding 16,777,216 bytes per square inch, which is double the resolution of standard 35mm film. Another company, Foveon, has announced a relatively inexpensive CMOS-type chip that is not only capable of the same resolution as film, but has a tonal scale and brightness range reportedly equal to that of film.

DI - the Intermediate Digital Step
By 2005, major motion pictures were using the advantages of digital imaging (DI) as an intermediate step between the colour negative film shot in the camera and the final release print copied for use within theatres. (Here, we are talking about films made for theatrical release.) Scanning the film into digital form provides much more control over colour correction and artistic colour changes. Of course, once in digital form, special effects with video are much easier and less expensive than with film.
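The argument about generations can be made concrete with a deliberately simple, hypothetical model: assume each photochemical copying step retains only a fixed fraction of the previous generation's quality. The 0.9 retention figure below is invented purely for illustration.

# Toy model of generational loss in the film release chain described above.
# The retention figure is hypothetical; the point is only that losses compound,
# whereas a digitally copied video signal keeps full quality at every step.
retention_per_generation = 0.9

quality = 1.0  # the original camera negative, taken as the reference
for generation in ("master positive", "dupe negative", "release print"):
    quality *= retention_per_generation
    print(f"{generation}: {quality:.2f} of original quality")
# Output: 0.90, 0.81, 0.73 - three generations, three compounding losses.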

E-Cinema
So-called e-cinema (electronic cinematography) is rapidly gaining ground, especially since it is becoming almost impossible for most theater patrons to distinguish between it and film. E-cinema is now preferred by many independent "filmmakers," and major "film" competitions now have more entries on video than on film.

The major weakness in the move to electronic cinema has been with projectors. But the latest generation is based on projector imagers with a 4-megapixel resolution, twice that of the previous generation of projectors. The detail possible with these projectors exceeds that of 35mm film projection. Now the major stumbling block for digital cinema is the great initial investment in equipment: the projector and the associated computer. However, once this investment is made, major savings can be realized.

Directors of photography in film often resist moving to video equipment because "everything is different." It can take decades to move up to a director of photography position, and old habits and patterns of thinking are difficult to break. For this reason, video camera manufacturers have made some of their cameras resemble the operation of film cameras. This means that directors of (film) photography do not have to abandon all that they have learned about lenses.

To make video look even more like film, even the "double-step" judder effect (resulting from the extra video fields that are regularly added when film is transferred to video) can be created electronically; a sketch of this field pattern appears at the end of this section. In fact, everything right down to electronically generated random specks of "dust" can be added to the video image! (For a time, and for questionable reasons, video was being made to look like film, bad film in fact, by adding a host of electronic scratches, dirt and even flash frames.)

This extreme step aside, the first practical step used in creating a "film look" with video is the use of filters; a number of filters are commonly used to make video look like film, if that is the goal. Film can also have a more saturated colour appearance. With sophisticated video equipment this can be simulated by adjusting the colour curves in a video editor.
This can also be addressed in post-production by channelling the video through computer programs such as Photoshop CS3, After Effects, or Chroma Match. By softening the image to smudge the digital grid of video, and by reducing the contrast, you can take additional steps to make video look like film. The feature film 28 Days Later, released in mid-2003, did very well at the box office and was shot with video equipment. By 2007, a number of feature films had been shot in high-definition video and then transferred to 35mm film for release in theaters.
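As a minimal sketch of the "double-step" field pattern mentioned above, the snippet below shows the classic 3:2 pulldown cadence used when 24-frames-per-second film is transferred to 60-fields-per-second (NTSC) video; the function name and sample frames are invented for illustration.

# Illustrative 3:2 pulldown: 24 film frames per second are spread across
# 60 interlaced video fields per second by holding each film frame for
# three fields, then two, alternately. The repeated fields are what give
# film transferred to video its characteristic "double-step" judder.
def three_two_pulldown(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        repeat = 3 if i % 2 == 0 else 2  # alternate 3-field and 2-field holds
        fields.extend([frame] * repeat)
    return fields

film = ["A", "B", "C", "D"]          # four film frames (1/6 second at 24 frames/s)
print(three_two_pulldown(film))      # ten video fields (1/6 second at 60 fields/s)
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']

Simulating a "film look" electronically means reintroducing exactly this kind of repeated-field cadence into material that was originally shot on video.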

Single-Camera, Multiple-Camera Production Differences
Purely technical considerations aside, the primary underlying difference between film and video lies in the way each is shot. Film is normally shot in a single-camera style, and video is normally shot in the studio using a multiple-camera production approach. Video is generally shot with several cameras covering several angles simultaneously. Instead of lighting being optimized for one camera angle, it must hold up for three or more camera angles at the same time. This means that the scene is generally lit in a rather flat manner, which sacrifices dimension and form. And, with the exception of single-camera production, multiple takes in video are not the rule. "By replacing film with videotape and speeding the production process, George Lucas saved at least $3 million on the 2002 Attack of the Clones."

Film and Videotape
The minute-for-minute cost of 16mm and 35mm film and processing is hundreds of times more than the cost of broadcast-quality video recording. And, unlike film, tape is reusable, which results in even greater savings. Offsetting the savings with video is the initial cost of video equipment. Depending on the level of sophistication, the initial investment in video production and post-production equipment can easily be ten times the cost of film equipment. The cost of maintaining professional videotape equipment is also greater, although this is changing with the adoption of computer disk and solid-state recording. On the other hand, there is a substantial cost saving in using video for post-production (special effects, editing, etc.). As we have noted, for these and other reasons film productions intended for television are routinely transferred to videotape. This transfer can take place as soon as the film comes out of the film processor.

Reversal of the negative film to a positive image, complete with the needed colour correction, can be done electronically as the film is being transferred to videotape or computer disk. From this point on, all editing and special effects are done by the video process. The negative film is then locked away in a film vault and kept in perfect condition. Even for film productions intended for theatrical release, major time and cost savings can be realized by transferring the film to videotape for editing. Once edited, the videotape is then used as a "blueprint" for editing the film.

Video Broadcast Standards - NTSC, PAL and SECAM
Most countries around the world use one of three main video broadcast standards: NTSC, PAL and SECAM.

NTSC - National Television System Committee
The first colour TV broadcast system was implemented in the United States in 1953. This was based on the NTSC (National Television System Committee) standard. NTSC is used by many countries on the American continent as well as many Asian countries, including Japan. NTSC runs on 525 lines/frame.

PAL - Phase Alternating Line
The PAL (Phase Alternating Line) standard was introduced in the early 1960s and implemented in most European countries except France. The PAL standard utilises a wider channel bandwidth than NTSC, which allows for better picture quality. PAL runs on 625 lines/frame.

SECAM - Séquentiel Couleur à Mémoire
The SECAM (Séquentiel Couleur à Mémoire, or Sequential Colour with Memory) standard was introduced in the early 1960s and implemented in France. SECAM uses the same bandwidth as PAL but transmits the colour information sequentially. SECAM runs on 625 lines/frame.
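A compact way to keep these systems straight is a small lookup table. The lines per frame come from the text above; the frame rates and example regions are standard values added here for context, and the snippet itself is only an illustrative sketch.

# Quick-reference table for the three analogue broadcast standards.
# Lines per frame are as stated above; frame rates (NTSC ~29.97 frames/s,
# PAL and SECAM 25 frames/s) are the standard values, added for context.
BROADCAST_STANDARDS = {
    "NTSC":  {"lines_per_frame": 525, "frames_per_second": 29.97,
              "example_regions": ["USA", "Japan"]},
    "PAL":   {"lines_per_frame": 625, "frames_per_second": 25.0,
              "example_regions": ["most of Europe", "India"]},
    "SECAM": {"lines_per_frame": 625, "frames_per_second": 25.0,
              "example_regions": ["France"]},
}

def describe(name):
    s = BROADCAST_STANDARDS[name]
    regions = ", ".join(s["example_regions"])
    return (f"{name}: {s['lines_per_frame']} lines/frame, "
            f"{s['frames_per_second']} frames/s, used in {regions}")

print(describe("PAL"))  # PAL: 625 lines/frame, 25.0 frames/s, used in most of Europe, India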

Post Production in Television
Sajna Menon

The processes of editing, sound dubbing and adding music are the final and crucial steps of television production, and are known by the collective term post-production. The post-production period is often liable to create an overspend on the budget, frequently because the broadcaster or financier intervenes to assert control over the programme. Another reason why post-production can be a long, complex and expensive business is that modern digital editing technology allows an extraordinary range of interventions to be made during post-production. The addition of digital effects for cleaning up footage or importing animated elements and complex graphics, for example, is all done after the production shoot, at the editing stage.

Editing is as crucial to television as it is to cinema. Editing can be done in analogue (linear) form or in digital (non-linear) form. Linear editing involves re-recording selected shots from the original camera tape, choosing them from the camera tape in the order they are needed in the finished programme. Picture quality is lost in the re-recording process, and it is time-consuming to change the editing decisions made during the process. Digital tape formats became commonly available in the 1990s, and involve the storage of sound and image information as numerical code stored on the tape, in the same way as data is stored on computer disks. Digital editing involves no loss of picture or sound quality when data is moved from the tape to the editing system and downloaded to the editing master tape. It is also much easier to revise and rework the programme during editing using digital technology. However, the principles of linear editing are useful for thinking through creative decisions.

The production team will have ensured their familiarity with all the footage prior to editing, logged it and noted ideas to be used in the edit. The factors in the minds of the director, producer and editor will be:
Possibilities for creating progressions through the programme
The revelation of dramatic or interesting turning points
Possibilities for intriguing and holding the audience

The rhythm and flow of the programme will depend on the careful structuring of these elements, referring back to the script and storyboard (in drama) to check that the aims of the programme idea are being effectively achieved.

The editing process begins with viewing rushes from the shoot with burnt-in timecode in vision. The rushes are logged so that particular shots or sequences can be identified later and chosen for editing, and this process also enables the director to remind himself or herself of the various takes and their possible usefulness. Particular shots and sequences are logged along with a label that identifies them descriptively, since this is much easier to remember than the number or timecode. The next stage is the paper edit, so called because it is a written list of the shots and sequences that will form the material at the first assembly edit stage off-line. The ingredients of the paper edit are always longer than the finished programme, and the editing process in general works by gradual paring down and simplification. It is likely that chosen shots or sequences will be part of a somewhat longer shot or sequence, so the beginning and ending points desired for the next stage are identified by marking the script of the documentary or drama with the words In and Out to specify where the selection from the shot will be made. Next, the off-line assembly edit is produced by transferring the sequences identified in the paper edit to the editing software. It is followed by the rough cut stage, where the material from the paper edit is trimmed and ordered more precisely so that it is close to the planned length of the finished programme.

Sound as well as music will contribute to the world evoked on screen, either emphasising what is already present in the image or contradicting it. Sound adds dramatic perspective to images by providing a sound point of view on the action: action in long shot can be accompanied by sound that appears to bring the action much closer to the audience through its volume and clarity, or, on the other hand, close-up action can be distanced from the audience by muting or blurring the sound. Recorded sound from various locations (the seaside, a city street, rooms with different kinds of acoustic tone) or sounds available on CD specifically for use in television productions (sound effects) can be assembled to form a library for use in various projects.
Contrasting sound and image provoke moods and tones that shape the predominant interpretations of the action for the audience, while sound montage opens up a whole range of meaning when running parallel with montages of images. These considerations apply not only to sound in general but also to speech. The factors offering connotations in recorded speech include:
The apparent acoustic source of speech (within the represented world, or from outside it)
The gender of the voice
The accent of the voice
The relative volume of the voice
The speed of speech
The timbre or tone of the voice

Broadcast television soundtracks are complex layers of sound edited on to the images in the online edit. Digital off-line editing systems such as Avid are based on the difference between the digital rushes from the shoot that are stored on the hard disk (known as the media) and the control information that the computer uses to determine the order of shots (timeline, timecode and editing instructions, known collectively as the project). As the off-line edit procedure is carried out, the amount of media stored digitally reduces as selections are made from the whole, based on the instructions contained in the project. Eventually the off-line edit will produce a finished tape containing all the right shots and sequences in the right order.

The online edit is where this material is fine-tuned: minor repairs can be carried out on unsatisfactory frames, graphics and visual effects are added, and sound and music tracks are attached to the images (unless complex sound is being added, in which case it is added separately in a dubbing theatre). Online editing effects include transitions between shots, such as when one shot is blended seamlessly (mixed) into another, and three-dimensional effects, where moving images can be pasted on to shots and parts of the shot can be shifted around within the frame. The resulting programme after online editing is called the fine cut, and at this point the director will review the programme and make minor final changes before the master copy is at last produced.
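To make the media/project distinction concrete, here is a minimal, hypothetical sketch of how an off-line system might represent a cut; the clip names, timecodes and the assumed PAL frame rate are invented purely for illustration.

# Hypothetical sketch of "media" versus "project" in off-line editing.
# The media are the digitised rushes themselves; the project is only a list
# of instructions (which clip, and its In/Out timecodes) defining the cut.
FRAME_RATE = 25  # frames per second, assuming PAL material

def timecode_to_frames(tc):
    """Convert an HH:MM:SS:FF timecode string into a frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * FRAME_RATE + ff

# The "project": an edit decision list referring to clips by name and timecode.
edit_decisions = [
    {"clip": "interview_take2", "in": "00:01:10:05", "out": "00:01:32:00"},
    {"clip": "cutaway_street",  "in": "00:00:03:00", "out": "00:00:07:12"},
]

total_frames = sum(timecode_to_frames(d["out"]) - timecode_to_frames(d["in"])
                   for d in edit_decisions)
print(f"Assembled duration: {total_frames / FRAME_RATE:.1f} seconds")

Because the project stores only these references, reworking the cut means changing a few numbers rather than re-recording the pictures themselves, which is why digital editing is so much easier to revise than linear tape editing.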

--- End ---
