
Book Review

HTML 4 for Dummies


by Ed Tittel and Natanya Pitts. IDG Books Worldwide, Foster City, CA, 1999.

This is yet another in the "Dummies" series of books. The series presents information, in a friendly manner, on just about every topic imaginable. The idea is to take a fresh look at a topic and present it in a way that even dummies could understand. These books are not really for dummies (creating web pages using HTML is not a task for dummies). The title is just a bit of tongue-in-cheek humor to suggest that these topics can at least be presented in a friendly and easier-to-read format (each new section in the book starts with a job-related cartoon).

The book begins with an introduction to, in the authors' words, "the wild, wacky, and wonderful possibilities inherent in the World Wide Web." After a not-so-brief internet history lesson, the authors begin their analysis of how the internet works, describing what goes on behind the scenes of web pages (they call it "under the hood"). They provide an overview of clients and servers, responses and requests, and the roles of front and back ends. That leads to a discussion of serving up web resources, which, of course, leads directly to the role and function of HTML. The HyperText Markup Language (HTML) is described as the way web servers and clients talk to each other.

While most books on HTML start with the basics, this book starts with a focus on what the viewer of a web page sees. It takes several chapters to begin the introduction of HTML syntax. While more experienced programmers might get impatient with this approach, it is an easy-to-digest way of providing the necessary background in web function for the less experienced. The review of the history and structure of the internet makes for interesting reading, no matter what your level of expertise.

HTML is a collection of markup codes that must be recognized by the user's browser. But not every new tag is determined by the standards committee: some special tags are recognized only by certain browsers. And, as the authors point out, the number and kinds of HTML tags continue to grow with each new iteration of the markup language. The book describes all of the currently recognized HTML tags and provides an overview of which tags are supported by which browsers. The CD-ROM that comes with the book contains an excellent collection of ready-made web page designs, along with other useful information.

This book provides a good introduction to the internet and how it works. It also provides a very thorough introduction to HTML and does a good job of describing how HTML codes are used. However, when you are ready to begin creating your own web pages, a detailed reference to HTML will also be necessary. We would recommend this book to beginners, with the support of a detailed HTML reference such as Special Edition Using HTML 4, Sixth Edition by Molly E. Holzschlag or Platinum Edition Using HTML 4, XML, and Java by Eric Ladd and Jim O'Donnell.

Book Review: Game Theory: A Nontechnical Introduction


This is pretty intimidating stuff, game theory. All you have to do is use the expression and people start becoming concerned about competing with you. Mention that it was employed to help analyze and develop options for various Cold War scenarios and you are approaching expert status. Finally, toss something into the mix like "this is a classic prisoner's dilemma situation" and the game is over: people will start quietly folding their hands; you've won. And you can accomplish all of this without even showing your cards; that is, without actually understanding game theory at all, much less what it really is.

And guess what: that is precisely what most of the people you hear talking about it are doing. In some cases, though, as so often happens, they simply interpret it as illustrating a theme they themselves wish to promote. Take the prisoner's dilemma, for example. I've heard it characterized variously as describing how, in certain circumstances, two people are driven to the least optimal decision for both, suboptimal for one and optimal for the other, or optimal for both. Only one of these is always correct, but you'd be surprised at who can get it wrong; you may even have read books written by some of them. (A sketch of the standard payoff matrix appears at the end of this review.)

Game theory is worth getting right. It is a branch of mathematics that can be difficult to explain in non-mathematical language. And yet it does indeed provide useful insight into how people comprehend and react to each other in environments of imperfect information and communication. An intellectual tool like this is well worth the effort to acquire.

Fortunately, you can do that with this excellent book by Morton Davis, and I highly recommend that you do. It is well written with the nontechnical reader in mind. Intelligently organized, it begins with an overview, then builds your theoretical knowledge and practical capability with chapters progressing from the two-person zero-sum game to multi-person games. As you go, you encounter exercises designed to test your comprehension. With the aid of these, you grow in real and practical understanding of game theory. You will come to a new appreciation of the complexity of decision-making in environments of imperfect information and communication. Moreover, you will develop new insights into how to remove or overcome these shortcomings.

This is the best book on this topic that I've found for the non-technical reader, and it should definitely be on the reading list of any professional manager. Then, the next time someone starts tossing game theory flak around, you can pose them a real dilemma by advancing the conversation based on real knowledge.

Today's tip: Speaking of making decisions on the fly with imperfect information, and communication impeded by high anxiety, see this collection of BNET articles on how to hire brilliant people. Take a quick look at the contents section on the sidebar of the main site: you will now see a listing of the article series that have been published here. You can click through to view summaries of the pieces, and then read the full series or the selections most appropriate for you. Enjoy!
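As promised above, the standard payoff matrix settles which characterization is right. A minimal sketch in Python, using the conventional textbook payoff values (invented for illustration, not taken from Davis's book): defection is each player's best reply whatever the other does, so rational players end up at mutual defection, an outcome worse for both than mutual cooperation.

```python
# Prisoner's dilemma sketch. Payoffs are the conventional illustrative values
# (higher is better for the player receiving them), keyed by (my_move, their_move).
payoff = {
    ("cooperate", "cooperate"): 3,  # reward for mutual cooperation
    ("cooperate", "defect"):    0,  # sucker's payoff
    ("defect",    "cooperate"): 5,  # temptation to defect
    ("defect",    "defect"):    1,  # punishment for mutual defection
}

for theirs in ("cooperate", "defect"):
    best = max(("cooperate", "defect"), key=lambda mine: payoff[(mine, theirs)])
    print(f"If they {theirs}, my best reply is to {best}")

# Defection dominates in both cases, so both players defect and each receives 1,
# even though mutual cooperation would have given each of them 3: the players are
# driven to a decision that is suboptimal for both.
```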

1. Introduction
Wireless services in India started a little over 100 years ago with the commissioning of a single wireless telegraph link in Calcutta. After many years of slow growth, wireless has in recent years become both an economical and a convenient (e.g., mobile) means to deliver voice and data services. Wireless networks in India have expanded dramatically in the past six years to reach over 700 million mobile wireless phone subscribers (March 2011). This makes India second in the number of wireless subscribers, with China at the top. A large population, low wireline telephony penetration, falling tariffs, and rising income levels have made India the fastest-growing wireless nation. Despite the success of its wireless services sector, India largely relies on imported technology for its networks.

This article has four sections. In the first, we outline the evolution of global wireless technology and networks and discuss the emerging trends. Next, we trace the history of wireless networks in India, policy evolution, and current trends in services. In the third section we survey wireless R&D and manufacturing in India. Finally, in the fourth section, we conclude with suggestions for reviving the telecom equipment industry.

2. Evolution of Wireless Technology

G. Marconi, the name most associated with mass-market wireless services, demonstrated the feasibility of wireless telegraphy, progressing from small home experiments in 1895 to an experimental transatlantic wireless link in 1901. Although Marconi has received much of the credit for the commercialization of wireless technology, the underlying scientific discoveries go back at least 50 years earlier. Most prominent among the pioneers were J.C. Bose in India, O. Lodge in the UK, E. Branly in Paris, and N. Tesla in the US. In 1894, Prof. J.C. Bose demonstrated the first millimeter wave radio transmission, using gunpowder to create a burst of radio energy, which rang a bell a few feet away. Following Marconi's commercial success, technology improvements came rapidly in the early part of the twentieth century. Until 1980, however, wireless communications remained a technology used in defense and police services and did not reach mass markets.

The roots of today's pervasive mobile wireless technology go back to D.H. Ring and W.R. Young, both at Bell Laboratories, who proposed the fundamentals of cellular frequency reuse in 1947. Another key concept, handover, a technique for transferring calls as the user moves across cells, was proposed by A. Joel in 1970, also at Bell Laboratories. Soon, work began on a full-fledged mobile telephony system involving the key principles of cellular frequency reuse, handover, and multiple access, with technology demonstrations in the US in 1973. In the eleven decades since those early beginnings in the nineteenth century, wireless technology has transformed our world, from satellite radio and High Definition Television (HDTV) broadcasting, to mobile telephony, and now the emerging mobile broadband. We have over four billion wireless mobile phone users in a total world population of 6.5 billion, and it is in mobile services that wireless has made the greatest impact on our society. Mobile broadband is still in its early stages and will usher in a new era of wireless services that deliver rich multimedia internet services to a variety of devices, including handheld products.

A number of core technologies have underpinned the growth and success of mobile wireless. Some examples are multiple access, multiple-input and multiple-output (MIMO), and adaptive modulation and coding. We discuss these briefly. The origins of multiple access date back to Marconi's proposal of 'tuned circuits' or, equivalently, Frequency Division Multiplexing (FDM). This allows multiple links to be established from a single transmitter base to a receiver, with each link using a distinct frequency channel. FDM strictly refers to each link terminating at a different, geographically dispersed user. The connection from the transmitter base to the dispersed receivers is referred to as the downlink. When these dispersed users transmit back to the base, again with distinct frequency channels, it is called the uplink, and this access mode is referred to as Frequency Division Multiple Access (FDMA). In a cell with a base station and its multiple dispersed users, the FDM downlink and FDMA uplink together are referred to as FDM/FDMA. Two other multiple access technologies have been developed: Time Division (TDM/TDMA) and Code Division (CDM/CDMA). TDM/TDMA uses time slots instead of frequencies to distinguish users. In CDM/CDMA, the users are separated by unique spreading codes. Henceforth, we will use CDMA to refer to CDM/CDMA and likewise TDMA for TDM/TDMA.
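To make the code-separation idea concrete, here is a minimal numerical sketch (an illustration, not drawn from any particular standard): two users transmit simultaneously in the same band, separated only by orthogonal Walsh spreading codes. The code length, code values, and data bits are invented for the example.

```python
import numpy as np

# Two orthogonal Walsh spreading codes of length 4 (rows of a 4x4 Hadamard matrix)
code_a = np.array([1, 1, 1, 1])
code_b = np.array([1, -1, 1, -1])

# Each user sends one data bit, mapped to +1/-1
bit_a, bit_b = 1, -1

# On-air signal: each user's bit is spread by its code, and the signals add
channel = bit_a * code_a + bit_b * code_b

# Receiver despreads by correlating with each user's code and normalizing
recovered_a = np.dot(channel, code_a) / len(code_a)   # -> 1.0
recovered_b = np.dot(channel, code_b) / len(code_b)   # -> -1.0

print(recovered_a, recovered_b)  # both bits recovered despite sharing the band
```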
There was a vigorous debate in the mid-1990s over CDMA versus TDMA, centered on their relative spectral efficiency, roughly the amount of throughput per cell for a fixed amount of spectrum. The key to such efficiency lay in the clever manner in which the challenges of fading, interference, and handover were dealt with. Although both TDMA and CDMA found equally effective ways to deal with these challenges, there were practical reasons favoring CDMA as the more convenient approach. The arguments for and against CDMA and TDMA were finely balanced, and both technologies became well established, with TDMA-based GSM remaining the dominant technology even today. The 3G standard adopted a 5 MHz channel to support higher bit rates and adopted a CDMA approach, as channel equalization in TDMA became computationally complex at such bandwidths; CDMA used Rake receivers and was simpler to implement. In the late 1990s, as the demand for broadband data further increased, the channel bandwidth had to be increased to 10 or 20 MHz. At these bandwidths CDMA became less efficient, due to the large number of multipaths that became resolved and the resulting high interpath interference. A new approach called Orthogonal Frequency Division Multiple Access (OFDMA), using a large number of narrow, mutually orthogonal sub-channels, emerged as the preferred access technique for 4G broadband systems. In summary, 1G used FDMA, 2G used both TDMA and CDMA, 3G used CDMA, and 4G adopted OFDMA.
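The mutual orthogonality of OFDMA sub-channels can be sketched in the same spirit (again an illustration with arbitrary parameters, not a description of any deployed system): an inverse FFT at the transmitter places one symbol on each subcarrier, and an FFT at the receiver recovers every symbol with no inter-carrier interference over an ideal channel.

```python
import numpy as np

N = 8  # number of mutually orthogonal subcarriers (arbitrary for this sketch)
rng = np.random.default_rng(0)

# One QPSK symbol per subcarrier
constellation = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j])
symbols = constellation[rng.integers(0, 4, size=N)]

# OFDM modulation: the inverse FFT maps frequency-domain symbols to a time-domain signal
tx = np.fft.ifft(symbols)

# Ideal (distortionless) channel; the receiver's FFT separates the subcarriers again
rx = np.fft.fft(tx)

assert np.allclose(rx, symbols)  # every subcarrier recovered with no cross-talk
```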

Multiple Input Multiple Output (MIMO), another key technology, goes back to the work of Stanford University researchers in the mid-1990s, who proposed the use of multiple antennas to transmit and receive and to implement a new concept called spatial multiplexing. Spatial multiplexing works when the spatial signatures at the receiving antenna array, induced by the different transmit antenna streams, are quasi-orthogonal and, hence, separable. MIMO spatial multiplexing multiplies the effective channel bandwidth by the number of antenna pairs and has stirred enormous interest. The multiple transmit and receive antennas used in MIMO can also be used for link diversity. Although many of the theoretical fundamentals of MIMO were developed by Bell Laboratories researchers, the first commercial system to adopt MIMO and OFDMA was developed by Iospan Communication Inc. in the US during the late 1990s, and this technology eventually became the basis of 4G wireless standards. WiMAX and 3GPP Long Term Evolution (LTE) have both adopted MIMO-OFDMA, as has the WiFi IEEE 802.11n standard. MIMO spatial multiplexing has been extended to a multi-user format, wherein a base station transmits dedicated streams to different users. On the downlink, multi-user MIMO requires channel state knowledge to orthogonalize transmissions to different users.

Other key technologies that have contributed to wireless performance improvements include: (a) Turbo and LDPC channel coding, which have enabled links to operate close to Shannon capacity; (b) Hybrid Automatic Repeat reQuest (HARQ), a physical layer ARQ scheme that outperforms regular ARQ; (c) adaptive modulation and coding (AMC), which allows the modulation and coding rate to be chosen to suit the channel SNR; and (d) opportunistic scheduling (OS), the assignment of the most favorable (highest SNR) frequency or time slot to a user, as against random channel assignment. OS can improve spectral efficiency by about 20% in practical networks.
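Spatial multiplexing itself can be sketched in a few lines (a toy example: the 2x2 channel matrix is invented, the channel is noise-free, and perfect channel knowledge is assumed; the simple zero-forcing detector shown here is only one of several detection methods): because the two spatial signatures are linearly independent, the receiver can invert the channel and separate the streams.

```python
import numpy as np

# Invented 2x2 channel matrix: H[i, j] couples transmit antenna j to receive antenna i
H = np.array([[0.9, 0.3],
              [0.2, 1.1]])

# Two independent data streams sent simultaneously over the same band
x = np.array([1.0, -1.0])

# Each receive antenna observes a mixture of both transmitted streams
y = H @ x

# Zero-forcing detection: invert the channel to separate the streams
x_hat = np.linalg.inv(H) @ y

assert np.allclose(x_hat, x)  # both streams recovered: throughput doubles in this ideal case
```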

3. Evolution of Wireless Networks and Services


The advent of modern cellular networks began in 1983, with the launch of the Advanced Mobile Phone Service (AMPS) in the US. This became the first standardized cellular service in the world and is often referred to as 1G. AMPS used analog FM modulation and was based on FDMA with 30 kHz channels. The industry moved to digital modulation and TDMA with 2G technologies to take advantage of voice compression, advanced modulation and coding techniques, and security. GSM first appeared in Europe in the early 1990s and used a version of Gaussian Minimum Shift Keying (GMSK) digital modulation. The CDMA-based IS-95 standard was introduced in the mid-1990s and was helped considerably by South Korea, which developed a commercial system. With advances in semiconductor technology and a rising subscriber base, handset sizes and prices began to drop significantly in the late 1990s, and the worldwide subscriber count reached 300 million by 2000. This period also saw the introduction of the short messaging service (SMS), which proved to be attractive in emerging countries.

The advent of 3G came with the Wideband CDMA (WCDMA) standard, which was developed by the 3GPP standards body and later ratified by the ITU as an IMT-2000 standard. The growth of the WCDMA network was initially slow, due to a lack of handsets, but by 2006 these bottlenecks were overcome and its growth accelerated. Meanwhile, a more advanced version of WCDMA called High Speed Downlink Packet Access (HSDPA) was standardized and deployed worldwide. A further version with a high speed uplink, known as High Speed Packet Access (HSPA), was standardized and saw commercial roll outs in 2009. There is another enhancement known as HSPA+, which adds a higher level of QAM modulation and 2x2 MIMO to HSPA, but it has yet to see significant uptake. Although 3G has remained primarily a voice service, the introduction of HSDPA and HSPA has given 3G significant data capability. Unfortunately, 3G operators still have limited spectrum, which, while adequate for voice, has fallen significantly below the needs of data intensive services such as those used by smart phones, notebook/netbook computers, and tablets/pads. The AT&T network in the US, for example, is unable to handle the demands of Apple iPhone (a smart phone) data traffic. In the past two years, data traffic has been increasing dramatically, exceeding a 100% compound annual rate, greatly stressing the data capability of 3G networks. Even as HSPA is being deployed to meet some of these needs, a transition to a 4G data technology is clearly needed.

The early beginnings of 4G date to the end of the 1990s, when Iospan Inc. developed a MIMO-OFDMA system that demonstrated links with an unprecedented 10 bps/Hz peak spectral efficiency. The robust operation of MIMO-OFDMA, combined with the increasing problems of making MIMO work in CDMA, resulted in a strong shift to MIMO-OFDMA as the new technology for 4G. The first global standards to embrace MIMO-OFDMA were IEEE 802.16d and 802.16e, ratified in 2004 and 2005, respectively. WiMAX (Worldwide Interoperability for Microwave Access) is an industry forum that supported equipment development and certification based on the IEEE 802.16 standards. This victory of MIMO-OFDMA in WiMAX was rapidly followed by a clean sweep across other broadband technologies: WLANs (IEEE 802.11n), UMB (a Qualcomm proprietary system, which has since been abandoned), and 3GPP LTE. WiMAX is a 4G technology which delivers high speed and reliable broadband wireless access.
The technology can be deployed in cell sizes ranging from a 1-2 km cell radius in urban areas to a 0.2-0.5 km radius in dense urban areas. WiMAX can serve mobile/nomadic and fixed users. Its use of OFDMA and MIMO enables it to offer high data rates (large bandwidth channels) and higher spectrum efficiency compared to CDMA-based 3G technologies. Long Term Evolution (LTE), dubbed a 3G-evolution technology, is based on OFDMA and MIMO; it is very similar to WiMAX and enjoys the same advantages. The LTE standard was completed in 2008 and is at least two to three years from being a commercially mature technology. LTE comes in two flavors, FDD (Frequency Division Duplexing) and TDD (Time Division Duplexing). Although FDD-LTE is dubbed an evolution of 3G, it has no real backward compatibility with 3G; it does, however, and very importantly, have spectrum profiles that are compatible with 3G bands. LTE deployment will need new equipment for infrastructure and terminals. Early deployments have started in the US, with roll outs planned soon in Japan. Large scale LTE roll outs are not expected before 2015. India has allocated two channels of 10 MHz each of TDD broadband spectrum in the 2.3 GHz band for private operators and 10 MHz in the 2.5 GHz band to BSNL. The private carriers have opted to deploy TD-LTE. BSNL is deploying WiMAX, albeit very slowly. The government has also announced that blocks in the 700 MHz and 3.3-3.6 GHz bands will be auctioned as they become available.

4. Telecom Services in India


Telecom service in India started in 1851, with an electric telegraph line between Calcutta and Diamond Harbor operated by the Public Works Department of the British East India Company. Its inventor, Dr. William O'Shaughnessy, came to India as a surgeon for the British East India Company (BEIC) in 1823, and invented the electric telegraph in 1839, just a few months behind, and independently of, Samuel F. B. Morse in the US. With the support of the then Governor General, O'Shaughnessy began building a 4000 mile nationwide telegraph network in 1853, connecting Calcutta, Delhi, Peshawar, Bombay, and Madras. The network was completed in 1856, and O'Shaughnessy became India's first Director General of Telegraphs. In 1881, the Government of India (GOI) licensed The Oriental Telephone Company Ltd. to open telephone exchanges at Calcutta, Bombay, Madras, and Ahmedabad. The first commercial telephone call was made in January 1882, in Calcutta.

The first experimental wireless telegraphy links in India were demonstrated as early as 1902, and a Department of Wireless Telegraph was set up. Wireless telegraphy came into routine use at Diamond Harbor, Calcutta, in 1908. By 1920, the Madras - Port Blair wireless telegraph link was established. In 1921, continuous wave (CW) radio transmitters began to replace spark gap systems in India. In 1927, the UK - India wireless telegraph links were established; they were upgraded to wireless telephony in 1933. In 1923 the Indian Radio Telegraph Company was formed; it merged with the Indian Cable Company in 1932. In 1947, after independence, all telephone and telegraph companies were nationalized and run by the Department of Post, Telephone, and Telegraph under the Ministry of Communications. In 1985, the postal services were separated and the new Department of Telecommunication (DOT) was formed, becoming India's local and long distance operator. In 1986, DOT was reorganized, with the services in the four metros carved out into a new Public Sector Unit (PSU), Mahanagar Telephone Nigam Ltd. (MTNL), and international telephone services routed through another PSU, Videsh Sanchar Nigam Ltd. (VSNL). In 2000, the remaining nationwide telephone services inside DOT were incorporated into a third PSU, Bharat Sanchar Nigam Ltd. (BSNL). In 2002, the Tata Group of Companies acquired a 45% stake in VSNL.

Growth of the wired Indian telephone network was very slow. At the time of India's independence in 1947, there were 80,000 telephone subscribers in India. The growth of telephone connections remained painfully slow, reaching 980,000 lines in 1971, 2.15 million lines in 1981, and 5.07 million lines in 1991, all in a country whose population was crossing one billion. Wireless telephony was unknown till the mid-1990s, except in the armed forces and a few private corporate networks like ONGC. The first mobile phone service was launched in 1985, on a non-commercial basis. Mobile services were commercially launched in India only in August 1995. In the initial five years, annual subscriber additions were modest, reaching 10.5 million by 2002. Although mobile services followed the New Telecom Policy 1994, market growth was hampered by high spectrum and equipment costs. The Telecom Regulatory Authority of India (TRAI) was established in 1997 and helped create a strong focus on telecom policy (like the FCC in the US).
The New Telecom Policy of 1999 enabled, in part, the growth of mobile telephony, primarily by moving away from a high fixed spectrum fee to a lower fixed fee plus a revenue sharing model. The current revenue share is 15%. Furthermore, the concept of the Unified Access Service License (UASL) rationalized the licensing policy and allowed UASL operators to provide both fixed and wireless services. Other initiatives, such as Calling Party Pays (CPP) and interconnect charges, helped vitalize the industry. Further accelerating factors were the rapid price erosion in the cost of infrastructure equipment and phones (handsets), and the sheer market volume, which allowed operators to amortize fixed costs more efficiently. The number of mobile phones grew to 16 million in 2003, 32 million in 2005, 200 million in 2007, and 560 million in 2009. The Average Revenue Per User (ARPU) in India is less than $5 per month, the lowest in the world.

The next mass-market wireless service in India is broadband. India currently has only eight million broadband lines (all wired), serviced through DSL (Digital Subscriber Line) technology. There are another six million low speed dial-up lines, but they offer a very limited internet experience. The total number of internet users stands at 81 million, thanks to heavy resource sharing through internet shops. The majority of new internet connections use low speed dial-up connections. DSL-based connections have limited appeal because of the high cost, limited copper loop availability (only 35 million lines nationwide), and the poor quality of the copper local loop. Moreover, DSL is only possible in urban areas, where a local loop exists. Wireless broadband offers significantly better economics and superior convenience (mobility) compared to wired DSL technology.

India seriously lags behind other emerging and developed economies in broadband access. Given the increasingly rich media internet content, only a dedicated broadband connection, at least one per household and better still one per person, can offer a full spectrum internet experience. The US has recently set a goal of 100 million broadband connections at 100 Mbps each, by 2020. Simply put, broadband internet penetration with eight million lines is abysmally low, translating to 0.7% penetration and growing at only 0.1% per year. In India, given our weak physical infrastructure, pervasive internet access can act as our virtual infrastructure and be a powerful enabler of many core segments of Indian society: industry, education, commerce, governance, and social connectivity. There is a tremendous unmet demand for internet access in India, perhaps in excess of 100 million subscribers, based on the currently projected price points for wireless broadband service. This demand will only grow rapidly in the coming years. The Indian Government's own goal for broadband connections for 2010 is 20 million lines, rising to 100 million by 2015. Wireless broadband internet can serve both mobile and fixed users. Apart from 24x7 personal access to high speed web browsing, email, rail/air booking, banking, and social networking, enterprises can expand online customer support, online ordering, and e-commerce; moreover, local/city governments can expand e-governance and public information services. Many new applications, such as video security enabled by machine-to-machine connections, will increasingly dominate broadband internet use after 2012.

5. Indian Telecom Equipment Industry


India imports about $12-14 billion of wireless equipment. Only about $1.5 billion of this can be considered local value addition, mainly through the assembly of phones, carried out by major OEMs like Nokia, Samsung, and Motorola. The participation of Indian companies in wireless technology has been shrinking for the past two decades. A discussion of the major segments of this industry follows.

5.1 Indian Companies

Public sector: The beginning of the telecom industry came with the setting up of Indian Telephone Industries Ltd. (ITI) in 1948, in Bangalore. ITI was the first Public Sector Unit (PSU) in independent India, and is indicative of the importance that Mr. Nehru, then the Indian Prime Minister, gave to the development of an indigenous telecom equipment capability. Till the 1990s, ITI manufactured large Strowger and Crossbar exchanges, small local exchanges, and telephone equipment under license agreements with western companies. A notable exception was switches based on Centre for Development of Telematics (CDOT) technology (see a little later in the text). Later, ITI entered into a number of joint ventures with US, European, and Chinese companies for a diverse range of transmission products. Since the late 1990s, ITI has run at a loss, and it was declared a sick unit in 2003. The Indian Government has tried to revive ITI with large cash grants. Attempts to sell or merge ITI have so far not succeeded, and its future remains uncertain. Many other central and state PSUs also built telecom equipment, such as VSAT terminals and digital STM radios, again mostly based on licensed technology from abroad. They too have faced severe difficulties in recent years and have suffered declines.

CDOT: This was a Government of India (GOI) funded R&D unit formed in 1984 under the chairmanship of Mr. Sam Pitroda, a visionary telecom leader who returned to India after a successful career in the US. CDOT successfully developed and transferred technology for Rural Automatic Exchange (RAX) and Private Automatic Branch Exchange (PABX) switches to a number of small and medium manufacturers from 1988 onwards. The MAX switch (up to 50,000 line capacity) was successfully developed by 1995, and the technology was transferred to ITI. Over 1000 MAX switches based on CDOT technology have been installed and have been the mainstay of the DOT voice network. CDOT's success in switch design remains a singular achievement in indigenous telecom technology. However, CDOT in recent years has not been able to sustain the success of the RAX and MAX developments of the early 1990s. Its inability to develop a mobile switch removed a major opportunity to participate in the exploding mobile wireless market.

Private sector: From the 1950s to the late 1980s, efforts by the private sector to design and manufacture telecom equipment were severely limited by the restrictive licensing policy of the Indian Government, which was focused on protecting the public sector. Early companies in this segment included ARM (now ICOMM Tele), Himachal Futuristic Communications, and BPL (Telecom). Due to licensing restrictions, these companies never developed the market scale to build a credible R&D capability to compete against the giant Telecom Multi-National Companies (T-MNCs). A particularly promising company is Midas Communications Ltd., which developed an innovative wireless local loop product, known as CorDECT, based on Digital Enhanced Cordless Telecommunications (DECT) technology. Midas successfully marketed CorDECT to over six countries and has so far sold over two million lines. CorDECT's share of the Indian market remains limited, and the company is trying to re-enter the market with GSM technology. VNL has also built low power, off grid, solar-powered GSM base stations that show great promise in emerging countries. Shyam Telecom, Terracom, Coral, Pointred, and Matrix are other examples in the private sector with some local capability, but they are no match for the T-MNCs.
Engineering Service Companies: India has built a vibrant engineering services industry which carries out mostly software, and some hardware, development for MNC clients. Wipro, Sasken, Infosys, Mindtree, and others have a sizable business in telecom engineering services. Their revenue from telecom-related engineering services was estimated to be $6 billion in 2009. However, none of these companies has emerged as a significant telecom equipment provider for the Indian market, which continues to be dominated by T-MNCs.

A number of factors played a role in the eclipse of the local telecom equipment industry. First, barriers to entry, including import duties on foreign manufactured equipment, were rapidly reduced after the National Telecom Policy (NTP)-99. This allowed T-MNCs to successfully outbid ITI with long-term vendor financing, which proved irresistible to DOT. Next, the private sector began to take an increasing share of telecom services (current share about 65%), and this sector opted for the most advanced global technology to remain competitive in quality of service and network economics. Finally, there was no determined effort by the Government to help the local industry, which was forced out of product development and manufacturing. Some of these companies have moved to a trading model, wherein telecom equipment is imported in a fully or partially assembled form and only the final assembly is undertaken before supply to the end customers. There is usually only minor, if any, value addition. Clearly, the revival of an indigenous telecom sector has to receive high priority.

5.2 Foreign Companies

T-MNCs: With the growth of the Indian mobile market, Nokia, Motorola, LG, and Samsung have set up cell phone assembly plants to take advantage of tax incentives. The value addition in these plants is estimated to be less than 7% of the end product price. Ericsson and Alcatel-Lucent also do some assembly and testing of wireless infrastructure equipment in India. Alcatel-Lucent entered into a joint venture with CDOT for the development of WiMAX products for the Indian market. The net local value addition remains focused on final assembly and test, and hence remains small. No value addition in core technologies, such as semiconductor design or manufacturing, is undertaken by T-MNCs in India. The Government has not mandated value addition in core technologies, as is currently happening in China.

Venture Funded Small Companies: In recent years, venture funded companies in the US have built significant engineering back ends in India. Two notable companies are Beceem Communications and Tejas Networks. Beceem commands 65% of the WiMAX semiconductor market worldwide and has an 80% share of the US 4G semiconductor market. Beceem was acquired by Broadcom in November 2010. Tejas is an optical switch company that is almost 100% India-based and has done well in the mid and lower segments of SDH switches, both in the Indian and emerging markets.

6. Concluding Remarks

Wireless services in India have seen dramatic growth in recent years through the expansion of mobile voice networks. With over 700 million users, the cell phone has become the symbol of India's growth and dynamism, and has proven to be a truly transformative technology. The cell phone has put into the hands of even the weakest sections of Indian society a technological marvel that allows them to connect with anyone, anywhere in the country or indeed the world. Clearly, the next revolution, broadband wireless, is now close at hand, and may prove to have an even greater impact on India's economic growth and productivity. The dark cloud in all this good news is the inability of Indian technology companies, with very few exceptions, to participate in the huge opportunities in equipment design and manufacturing offered by India's massive telecom expansion. Telecom technology is very R&D intensive, and it takes large and high risk investments to build each of the different pieces of this great industry. It will need concerted policy support to help create a globally competitive Indian telecom equipment industry.

Why Analog Computation?

Unclassified

An introduction to analog computation containing a brief description of the analog computer and problems in which it can be advantageously applied. Both analog computers and systems combining analog and digital techniques are discussed in order to show why the Agency's interest in this computation area has increased.

Why analog computation? With the interest in analog computing equipment rapidly increasing in our digitally oriented Agency, this is a question many of us must ask. The preponderance of digital computing equipment in this Agency would preclude analog computation from consideration if the two types of computers performed the same operations equally well; but this is not the case. A comparison of digital and analog computer applications reveals a basic difference in their operation. The digital computer performs numerical operations on discrete signals; in contrast, the analog computer performs algebraic and integro-differential operations upon continuous signals. Therefore certain operations which are difficult to program on a digital computer are available inherently on the analog machine. In order to appreciate where an analog computer can be advantageously applied, one must become more familiar with what it is and how it is used. Before discussing problem areas in which the analog computer possesses an advantage, let us briefly consider the fundamentals of its operation.

The heart of the computer is the high-gain D.C. amplifier, either vacuum tube or transistor, that, when properly connected with passive components, forms the basic operational element. The schematic representation for an operational amplifier is shown in Fig. 1. If the passive components in both feedback and input arms are entirely resistive, the circuit of Fig. 1 adds the applied voltages in proportion to the ratios of the individual resistors; while if the feedback impedance is capacitive, the circuit integrates the sum of the applied voltages. The schematic diagrams for an amplifier used as a summer (it is called an inverter if it has only one input)
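In equation form, the two behaviors just described are the standard summing-amplifier and integrator relations. A sketch in the usual notation; since Fig. 1 is not reproduced here, the component labels (feedback resistor R_f, input resistors R_1 through R_n, feedback capacitor C) are assumed:

```latex
% Summing amplifier: resistive feedback R_f, inputs V_1..V_n through R_1..R_n
V_{\mathrm{out}} = -\left( \frac{R_f}{R_1} V_1 + \frac{R_f}{R_2} V_2 + \cdots + \frac{R_f}{R_n} V_n \right)

% Integrator: capacitive feedback C, input V_{\mathrm{in}} through resistance R
V_{\mathrm{out}}(t) = -\frac{1}{R C} \int_0^{t} V_{\mathrm{in}}(\tau) \, d\tau
```

With n = 1 and equal feedback and input resistors, the summer reduces to the inverter mentioned above, giving V_out = -V_in.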

Creation Science Rebuttals


Technical Journal (TJ)
Ripples of Galaxies
TJ, Volume 19, Issue 1, April 2005
Review by Greg Neyman Answers In Creation First Published 30 August 2006

In an article in TJ, young earth creation science theorist Jason Lisle comments on a scientific report that there is a ripple pattern in the clustering of galaxies. He claims that this is a blow to the Big Bang model of creation.1 (The article was also featured as the daily feature on the Creation Ministries International website on 30 August 2006).
This article is a little strange for an article in Technical Journal, as it is not very "technical." Lisle gives some general information about galaxies, and then proceeds to describe the scientists' method of interpretation. He claims that both sides have the same data (typical of young earth claims), but that the interpretation of that data is different. This is true. Whereas secular scientists have interpreted the data based solely upon science, Lisle has based his interpretation not on the facts, but upon his preconceived notion that the earth is young, based on his faulty biblical methods. His main point in this article is that the secular scientist "assumes that the big bang is true." This assumption is based upon other scientific facts which confirm the Big Bang. The young earth creation propaganda machine has long said that the Big Bang is in trouble, when in reality it has only been getting stronger as a theory.

Of course, we can say the same thing about Lisle. Whereas he claims the scientists "assume" the Big Bang is true, we can say that he "assumes" that the universe is young. He cannot prove this through science, and can only issue weak arguments against the Big Bang, arguments which contain no science. Another one of his main points is this...

The big bang, however, has been refuted on the basis of both Scripture and good science. For example, the big bang is not compatible with the order, timescale and cause of the events of creation as recorded in Genesis. Really, the big bang is a secular opponent of the biblical framework.
Nothing could be further from the truth. Scripture does not clearly indicate that the earth is young. And, after many years of young earth claims, we are still waiting for the "good science" that refutes the Big Bang. The Big Bang is compatible with Genesis, and millions of Christians believe in both the Big Bang and the Bible. There is no problem accepting a literal, inerrant interpretation of Genesis alongside the Big Bang. Finally, Lisle mentions that the stars and galaxies were created on Day 4 of the creation week. Within the old earth interpretation, the stars, sun, and moon became plainly visible on this day, but already existed. One must keep in mind that the creation account is written from the perspective of a person standing on the surface of the earth. From that point of view, they were first observed on this day.
