History of electronic engineering

This article details the history of electronics engineering. Chambers Twentieth Century Dictionary (1972) defines electronics as "The science and technology of the conduction of electricity in a vacuum, a gas, or a semiconductor, and devices based thereon".

Electronics engineering as a profession sprang from technological improvements in the telegraph industry during the late 19th century and in the radio and telephone industries during the early 20th century. People gravitated to radio, attracted by the technical fascination it inspired, first in receiving and then in transmitting. Many who went into broadcasting in the 1920s had been "amateurs" in the period before World War I. The modern discipline of electronics engineering was to a large extent born out of telephone-, radio-, and television-equipment development and the large amount of electronic-systems development during World War II of radar, sonar, communication systems, and advanced munitions and weapon systems. In the interwar years, the subject was known as radio engineering. The word electronics began to be used in the 1940s. In the late 1950s, the term electronics engineering started to emerge.

Electronic laboratories (Bell Labs, for instance), created and subsidized by large corporations in the radio, television, and telephone-equipment industries, began churning out a series of electronic advances. The electronics industry was revolutionized by the inventions of the first transistor in 1947, the monolithic integrated circuit chip in 1959, and the silicon MOSFET (metal–oxide–semiconductor field-effect transistor) in 1959. In the UK, electronics engineering became distinct from electrical engineering as a university-degree subject around 1960. (Before this time, students of electronics and related subjects such as radio and telecommunications had to enroll in the electrical engineering department of a university, as no university had a department of electronics. Electrical engineering was the nearest subject with which electronics engineering could be aligned, although the similarity of subjects covered, apart from mathematics and electromagnetism, lasted only for the first year of three-year courses.)

Electronics engineering (even before it acquired the name) facilitated the development of many technologies including wireless telegraphy, radio, television, radar, computers, and microprocessors.

Wireless telegraphy and radio
Some of the devices which would enable wireless telegraphy were invented before 1900. These include the spark-gap transmitter and the coherer, with early demonstrations and published findings by David Edward Hughes (1880) and Heinrich Rudolf Hertz (1887 to 1890) and further additions to the field by Édouard Branly, Nikola Tesla, Oliver Lodge, Jagadish Chandra Bose, and Ferdinand Braun. In 1896, Guglielmo Marconi went on to develop the first practical and widely used radio-wave-based communication system.

Millimetre wave communication was first investigated by Jagadish Chandra Bose during 1894–1896, when he reached an extremely high frequency of up to 60 GHz in his experiments. He also introduced the use of semiconductor junctions to detect radio waves, when he patented the radio crystal detector in 1901.

In 1904, John Ambrose Fleming, the first professor of electrical engineering at University College London, invented the first radio tube, the diode. Then, in 1906, Robert von Lieben and Lee De Forest independently developed the amplifier tube, called the triode. Electronics is often considered to have begun with the invention of the diode. Within 10 years, the device was used in radio transmitters and receivers as well as in systems for long-distance telephone calls.

The invention of the triode amplifier, generator, and detector made audio communication by radio practical. (Reginald Fessenden's 1906 transmissions used an electro-mechanical alternator.) In 1912, Edwin H. Armstrong invented the regenerative feedback amplifier and oscillator; he also invented the superheterodyne radio receiver and could be considered the father of modern radio.

The first known radio news program was broadcast 31 August 1920 by station 8MK, the unlicensed predecessor of WWJ (AM) in Detroit, Michigan. Regular wireless broadcasts for entertainment commenced in 1922 from the Marconi Research Centre at Writtle near Chelmsford, England. The station was known as 2MT and was followed by 2LO broadcasting from Strand, London.

While some early radios used some form of amplification powered by electric current or batteries, through the mid-1920s the most common type of receiver was the crystal set. In the 1920s, amplifying vacuum tubes revolutionized both radio receivers and transmitters.

Vacuum tubes remained the preferred amplifying device for 40 years, until researchers working for William Shockley at Bell Labs invented the transistor in 1947. In the following years, transistors made small portable radios, or transistor radios, possible, as well as allowing more powerful mainframe computers to be built. Transistors were smaller than vacuum tubes and required lower voltages to operate.

Before the invention of the integrated circuit in 1959, electronic circuits were constructed from discrete components that had to be assembled and wired by hand. These non-integrated circuits consumed much space and power, were prone to failure, and were limited in speed, although they are still common in simple applications. By contrast, integrated circuits packed a large number (often millions) of tiny electrical components, mainly transistors, into a small chip around the size of a coin.

Television
In 1927, Philo Farnsworth made the first public demonstration of a purely electronic television. During the 1930s, several countries began broadcasting, and after World War II television spread to millions of receivers, eventually worldwide. Electronics have been central to television devices ever since.

Modern televisions and video displays have evolved from bulky electron tube technology to more compact devices, such as plasma and liquid-crystal displays. The trend is toward even lower-power devices such as organic light-emitting diode (OLED) displays, which are likely to replace the LCD and plasma technologies.

Radar and radio location
During World War II, much effort was expended on the electronic location of enemy targets and aircraft. This included radio-beam guidance of bombers, electronic countermeasures, and early radar systems. During this time, little if any effort was expended on consumer electronics developments.

Transistors and integrated circuits
The first working transistor was a point-contact transistor invented by John Bardeen and Walter Houser Brattain at the Bell Telephone Laboratories (BTL) in 1947. William Shockley then invented the bipolar junction transistor at BTL in 1948. While early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, they opened the door for more compact devices.

The first integrated circuits were the hybrid integrated circuit invented by Jack Kilby at Texas Instruments in 1958, and the monolithic integrated circuit chip invented by Robert Noyce at Fairchild Semiconductor in 1959.

The MOSFET (metal–oxide–semiconductor field-effect transistor, or MOS transistor) was invented by Mohamed Atalla and Dawon Kahng at BTL in 1959. It was the first truly compact transistor that could be miniaturised and mass-produced for a wide range of uses. It revolutionized the electronics industry, becoming the most widely used electronic device in the world. The MOSFET is the basic element in most modern electronic equipment.

The MOSFET made it possible to build high-density integrated circuit chips. The earliest experimental MOS IC chip to be fabricated was built by Fred Heiman and Steven Hofstein at RCA Laboratories in 1962. MOS technology enabled Moore's law, the doubling of transistors on an IC chip every two years, predicted by Gordon Moore in 1965. Silicon-gate MOS technology was developed by Federico Faggin at Fairchild in 1968. Since then, the mass-production of silicon MOSFETs and MOS integrated circuit chips, along with continuous MOSFET scaling miniaturization at an exponential pace (as predicted by Moore's law), has led to revolutionary changes in technology, economy, culture, and thinking.
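Moore's law, as stated above, is simply exponential doubling, and the projection can be sketched in a few lines of Python. The figures below are illustrative: the 1971 starting count of 2,300 transistors is the number commonly cited for the Intel 4004, and the fixed two-year doubling period is the idealized form of the law, not a fit to real chip data.

```python
def transistors(initial_count: int, years: float, doubling_period: float = 2.0) -> int:
    """Project a transistor count under an idealized Moore's law:
    the count doubles every `doubling_period` years."""
    return int(initial_count * 2 ** (years / doubling_period))

# Illustrative projection from ~2,300 transistors (1971) forward 20 years:
# ten doublings, i.e. a 1,024-fold increase.
print(transistors(2300, 20))  # 2300 * 2**10 = 2,355,200
```

Real scaling has deviated from a clean two-year doubling, but the exponential form is why a few decades of Moore's law turned thousand-transistor chips into billion-transistor ones.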

Computers
A computer is a programmable machine that receives input, stores and manipulates data, and provides output in a useful format.

Although mechanical examples of computers have existed through much of recorded human history, the first electronic computers were developed in the mid-20th century (1940–1945). These were the size of a large room, consuming as much power as several hundred modern personal computers (PCs). Modern computers based on integrated circuits are millions to billions of times more capable than the early machines, and occupy a fraction of the space. Simple computers are small enough to fit into small pocket devices, and can be powered by a small battery. Personal computers in their various forms are icons of the Information Age and are what most people think of as "computers". However, the embedded computers found in many devices from MP3 players to fighter aircraft and from toys to industrial robots are the most numerous.

The ability to store and execute lists of instructions called programs makes computers extremely versatile, distinguishing them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a certain minimum capability is, in principle, capable of performing the same tasks that any other computer can perform. Therefore, computers ranging from a netbook to a supercomputer are all able to perform the same computational tasks, given enough time and storage capacity.

Microprocessors
By 1964, MOS chips had reached higher transistor density and lower manufacturing costs than bipolar chips. MOS chips further increased in complexity at a rate predicted by Moore's law, leading to large-scale integration (LSI) with hundreds of transistors on a single MOS chip by the late 1960s. The application of MOS LSI chips to computing was the basis for the first microprocessors, as engineers began recognizing that a complete computer processor could be contained on a single MOS LSI chip.

The first multi-chip microprocessors, the Four-Phase Systems AL1 in 1969 and the Garrett AiResearch MP944 in 1970, were developed with multiple MOS LSI chips. A single-chip microprocessor was conceived in 1969 by Marcian Hoff; his concept was part of an order from the Japanese company Busicom for a desktop programmable electronic calculator, which Hoff wanted to build as cheaply as possible. The first realization of the single-chip microprocessor was the Intel 4004, a 4-bit processor released on a single MOS LSI chip in 1971. It was developed by Federico Faggin, using his silicon-gate MOS technology, along with Intel engineers Hoff and Stan Mazor, and Busicom engineer Masatoshi Shima. This ignited the development of the personal computer. In 1974, the Intel 8080, an 8-bit processor, made possible the building of the first personal computer, the MITS Altair 8800, which was announced to the general public on the cover of the January 1975 issue of Popular Electronics.

Many electronics engineers today specialize in the development and programming of microprocessor-based electronic systems, known as embedded systems. Hybrid specializations such as computer engineering have emerged due to the detailed knowledge of the hardware that is required for working on such systems. Software engineers typically do not study microprocessors, unlike computer and electronics engineers. Engineers who exclusively carry out the role of programming embedded systems or microprocessors are referred to as "embedded systems engineers" or "firmware engineers".