From Wikipedia, the free encyclopedia
This article details the history of electrical engineering.
Long before any knowledge of electricity existed, people were aware of shocks from electric fish. Ancient Egyptian texts dating from 2750 BCE referred to these fish as the "Thunderer of the Nile", and described them as the "protectors" of all other fish. Electric fish were again reported millennia later by ancient Greek, Roman and Arabic naturalists and physicians.[1] Several ancient writers, such as Pliny the Elder and Scribonius Largus, attested to the numbing effect of electric shocks delivered by electric catfish and electric rays, and knew that such shocks could travel along conducting objects.[2] Patients with ailments such as gout or headache were directed to touch electric fish in the hope that the powerful jolt might cure them.[3] Possibly the earliest and nearest approach to the discovery of the identity of lightning and electricity from any other source may be attributed to the Arabs, who before the 15th century applied the Arabic word for lightning, ra‘ad (رعد), to the electric ray.[4]
Ancient cultures around the Mediterranean knew that certain objects, such as rods of amber, could be rubbed with cat's fur to attract light objects like feathers. Thales of Miletus, an ancient Greek philosopher, writing around 600 BCE, described a form of static electricity, noting that rubbing fur on various substances, such as amber, would cause a particular attraction between the two. He noted that amber buttons could attract light objects such as hair and that, if the amber was rubbed for long enough, a spark could even be made to jump.
Around 450 BCE Democritus, a later Greek philosopher, developed an atomic theory similar to modern atomic theory; his mentor, Leucippus, is credited with the same theory. Their hypothesis held everything to be composed of atoms, called "atomos", which were indivisible and indestructible. Democritus presciently stated that between atoms lies empty space and that atoms are constantly in motion. He was incorrect only in stating that atoms come in different sizes and shapes, each object having its own shaped and sized atom.[5][6]
An object found in Iraq in 1938, dated to about 250 BCE and called the Baghdad Battery, resembles a galvanic cell and is claimed by some to have been used for electroplating in Mesopotamia, although there is no evidence for this.
Electricity would remain little more than an intellectual curiosity for millennia. In 1600, the English scientist William Gilbert extended the study of Cardano on electricity and magnetism, distinguishing the lodestone effect from static electricity produced by rubbing amber.[7] He coined the Neo-Latin word electricus ("of amber" or "like amber", from ήλεκτρον [elektron], the Greek word for "amber") to refer to the property of attracting small objects after being rubbed.[8] This association gave rise to the English words "electric" and "electricity", which made their first appearance in print in Thomas Browne's Pseudodoxia Epidemica of 1646.[9]
Further work was conducted by Otto von Guericke, who demonstrated electrostatic repulsion; Robert Boyle also published work on electricity.[10]
Though electrical phenomena had been known for centuries, in the 18th century, the systematic study of electricity became known as "the youngest of the sciences", and the public became electrified by the newest discoveries in the field.[11]
By 1705, Francis Hauksbee had discovered that if he placed a small amount of mercury in the glass of his modified version of Otto von Guericke's generator, evacuated the air from it to create a mild vacuum and rubbed the ball to build up a charge, a glow was visible if he placed his hand on the outside of the ball. This glow was bright enough to read by. It seemed to be similar to St. Elmo's Fire. This effect later became the basis of the gas-discharge lamp, which led to neon lighting and mercury vapor lamps. In 1706 he produced an 'Influence machine' to generate this effect.[12] He was elected a Fellow of the Royal Society the same year.[13]
Hauksbee continued to experiment with electricity, making numerous observations and developing machines to generate and demonstrate various electrical phenomena. In 1709 he published Physico-Mechanical Experiments on Various Subjects which summarized much of his scientific work.
Stephen Gray discovered the importance of insulators and conductors. C. F. du Fay, seeing his work, developed a "two-fluid" theory of electricity.[10]
In the 18th century, Benjamin Franklin conducted extensive research in electricity, selling his possessions to fund his work. In June 1752 he is reputed to have attached a metal key to the bottom of a dampened kite string and flown the kite in a storm-threatened sky.[14] A succession of sparks jumping from the key to the back of his hand showed that lightning was indeed electrical in nature.[15] He also explained the apparently paradoxical behavior of the Leyden jar as a device for storing large amounts of electrical charge, by coming up with the single fluid, two states theory of electricity.
In 1791, Italian Luigi Galvani published his discovery of bioelectricity, demonstrating that electricity was the medium by which nerve cells passed signals to the muscles.[10][16][17] Alessandro Volta's battery, or voltaic pile, of 1800, made from alternating layers of zinc and copper, provided scientists with a more reliable source of electrical energy than the electrostatic machines previously used.[16][17]
The first application of electricity that was put to practical use was electromagnetism.[18] William Sturgeon invented the electromagnet in 1825.[19] Electromagnets were then used in the first practical engineering application of electricity by William Fothergill Cooke and Charles Wheatstone, who co-developed a telegraph system that used a number of needles on a board which were moved to point to letters of the alphabet. A five-needle system was used initially, but was given up as too expensive. In 1838 an improvement reduced the number of needles to two, and a patent for this version was taken out by Cooke and Wheatstone.[20] Cooke tested the invention with the London & Blackwall Railway, the London & Birmingham Railway, and the Great Western Railway, which successively allowed the use of their lines for the experiment. Subsequently, railways developed systems with signal boxes along the line communicating with their neighbouring boxes by telegraphic sounding of single-stroke bells and three-position needle telegraph instruments. Such systems implementing signalling block systems remained in use on rural lines well into the 21st century.[21]
Electrical engineering became a profession in the late 19th century. Practitioners had created a global electric telegraph network, and the first electrical engineering institutions to support the new discipline were founded in the UK and US. Although it is impossible to precisely pinpoint a first electrical engineer, Francis Ronalds stands ahead of the field; he created a working electric telegraph system in 1816 and documented his vision of how the world could be transformed by electricity.[22][23] Over 50 years later, he joined the new Society of Telegraph Engineers (soon to be renamed the Institution of Electrical Engineers), where he was regarded by other members as the first of their cohort.[24] The donation of his extensive electrical library was a considerable boon for the fledgling Society.
Development of the scientific basis for electrical engineering, using research techniques, intensified during the 19th century. Notable developments early in this century include the work of Georg Ohm, who in 1827 quantified the relationship between the electric current and potential difference in a conductor, and of Michael Faraday, who discovered electromagnetic induction in 1831.[26] In the 1830s, Georg Ohm also constructed an early electrostatic machine. The homopolar generator was developed first by Michael Faraday during his memorable experiments in 1831. It was the beginning of modern dynamos – that is, electrical generators which operate using a magnetic field. The invention of the industrial generator in 1866 by Werner von Siemens – which did not need external magnetic power – made a large series of other inventions possible.
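In modern notation, Ohm's 1827 result relates the potential difference $V$ across a conductor to the current $I$ through it and its resistance $R$:

$$V = IR$$

so that, for example, a 10 Ω conductor carrying 0.5 A sustains a potential difference of 5 V.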
In 1873, James Clerk Maxwell published a unified treatment of electricity and magnetism in A Treatise on Electricity and Magnetism which stimulated several theorists to think in terms of fields described by Maxwell's equations. In 1878, the British inventor James Wimshurst developed an apparatus that had two glass disks mounted on two shafts. It was not until 1883 that the Wimshurst machine was more fully reported to the scientific community.
During the latter part of the 1800s, the study of electricity was largely considered to be a subfield of physics. It was not until the late 19th century that universities started to offer degrees in electrical engineering. In 1882, Darmstadt University of Technology founded the first chair and the first faculty of electrical engineering worldwide. In the same year, under Professor Charles Cross, the Massachusetts Institute of Technology began offering the first option of Electrical Engineering within a physics department.[27] In 1883, Darmstadt University of Technology and Cornell University introduced the world's first courses of study in electrical engineering and in 1885 the University College London founded the first chair of electrical engineering in the United Kingdom. The University of Missouri subsequently established the first department of electrical engineering in the United States in 1886.[28]
During this period commercial use of electricity increased dramatically. Starting in the late 1870s, cities began installing large-scale electric street lighting systems based on arc lamps.[29] After the development of a practical incandescent lamp for indoor lighting, Thomas Edison switched on the world's first public electric supply utility in 1882, using what was considered a relatively safe 110 volt direct current system to supply customers. Engineering advances in the 1880s, including the invention of the transformer, led electric utilities to begin adopting alternating current, until then used primarily in arc lighting systems, as a distribution standard for outdoor and indoor lighting (eventually replacing direct current for such purposes). In the US there was a rivalry, primarily between Westinghouse's AC system and Edison's DC system, known as the "war of the currents".[30]
"By the mid-1890s the four "Maxwell equations" were recognized as the foundation of one of the strongest and most successful theories in all of physics; they had taken their place as companions, even rivals, to Newton's laws of mechanics. The equations were by then also being put to practical use, most dramatically in the emerging new technology of radio communications, but also in the telegraph, telephone, and electric power industries."[31] By the end of the 19th century, figures in the progress of electrical engineering were beginning to emerge.[32]
Charles Proteus Steinmetz helped foster the development of alternating current that made possible the expansion of the electric power industry in the United States, formulating mathematical theories for engineers.
During the development of radio, many scientists and inventors contributed to radio technology and electronics. In his classic UHF experiments of 1888, Heinrich Hertz demonstrated the existence of electromagnetic waves (radio waves) leading many inventors and scientists to try to adapt them to commercial applications, such as Guglielmo Marconi (1895) and Alexander Popov (1896).
Millimetre wave communication was first investigated by Jagadish Chandra Bose during 1894–1896, when he reached an extremely high frequency of up to 60 GHz in his experiments.[33] He also introduced the use of semiconductor junctions to detect radio waves,[34] when he patented the radio crystal detector in 1901.[35][36]
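The label "millimetre wave" follows directly from the wavelength at such frequencies:

$$\lambda = \frac{c}{f} = \frac{3 \times 10^{8}\ \text{m/s}}{60 \times 10^{9}\ \text{Hz}} = 5\ \text{mm}$$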
John Fleming invented the first radio tube, the diode, in 1904.
Reginald Fessenden recognized that a continuous wave needed to be generated to make speech transmission possible, and by the end of 1906 he sent the first radio broadcast of voice. Also in 1906, Robert von Lieben and Lee De Forest independently developed the amplifier tube, called the triode.[37] Edwin Howard Armstrong developed enabling technology for electronic television in 1931.[38]
In the early 1920s, there was a growing interest in the development of domestic applications for electricity.[39] Public interest led to exhibitions featuring "homes of the future", and in the UK the Electrical Association for Women was established in 1924, with Caroline Haslett as its director, to encourage women to become involved in electrical engineering.[40]
The Second World War saw tremendous advances in the field of electronics, especially in radar and with the invention of the magnetron by Randall and Boot at the University of Birmingham in 1940. Radio location, radio communication and radio guidance of aircraft were all developed at this time. An early electronic computing device, Colossus, was built by Tommy Flowers of the GPO to decipher the coded messages of the German Lorenz cipher machine. Also developed at this time were advanced clandestine radio transmitters and receivers for use by secret agents.
An American invention at the time was a device to scramble the telephone calls between Winston Churchill and Franklin D. Roosevelt. This was called the Green Hornet system and worked by inserting noise into the signal. The noise was then extracted at the receiving end. This system was never broken by the Germans.
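Conceptually, this noise masking works like a one-time pad applied to quantized speech: both ends hold an identical random noise sequence, the sender combines it with the signal sample by sample, and the receiver reverses the operation exactly. The sketch below is a minimal illustration of that principle only; the function names, toy sample values and modulus are illustrative assumptions rather than the historical system's actual parameters.

```python
import random

# Minimal sketch of additive noise masking: both ends share the same
# noise sequence (historically distributed on phonograph records); the
# sender adds it to each quantized sample and the receiver subtracts it.
MODULUS = 6  # illustrative modulus for the quantized sample values

def make_noise(length, seed):
    """Shared random noise sequence known to both sender and receiver."""
    rng = random.Random(seed)
    return [rng.randrange(MODULUS) for _ in range(length)]

def scramble(samples, noise):
    """Sender: add noise to each quantized sample, modulo MODULUS."""
    return [(s + n) % MODULUS for s, n in zip(samples, noise)]

def unscramble(masked, noise):
    """Receiver: subtract the same noise to recover the original samples."""
    return [(m - n) % MODULUS for m, n in zip(masked, noise)]

speech = [0, 3, 5, 2, 4, 1]               # toy quantized speech samples
noise = make_noise(len(speech), seed=42)  # shared secret noise sequence
sent = scramble(speech, noise)            # what goes over the wire
assert unscramble(sent, noise) == speech  # exact recovery at the far end
```

Without the shared noise sequence the transmitted values are statistically uniform, which is why a correctly used one-time mask of this kind cannot be broken by analysis of the intercepted signal alone.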
A great amount of work was undertaken in the United States as part of the War Training Program in the areas of radio direction finding, pulsed linear networks, frequency modulation, vacuum tube circuits, transmission line theory and fundamentals of electromagnetic engineering. These studies were published shortly after the war in what became known as the 'Radio Communication Series' published by McGraw-Hill in 1946.
In 1941 Konrad Zuse presented the Z3, the world's first fully functional and programmable computer.[41]
Prior to the Second World War, the subject was commonly known as 'radio engineering' and was primarily restricted to aspects of communications and radar, commercial radio and early television. At this time, the study of radio engineering at universities could only be undertaken as part of a physics degree.
Later, in the post-war years, as consumer devices began to be developed, the field broadened to include modern TV, audio systems, hi-fi and, latterly, computers and microprocessors. In 1946, the ENIAC (Electronic Numerical Integrator and Computer) of John Presper Eckert and John Mauchly followed, beginning the computing era. The arithmetic performance of these machines allowed engineers to develop completely new technologies and achieve new objectives, including the Apollo missions and the NASA Moon landing.[42]
In the mid-to-late 1950s, the term radio engineering gradually gave way to the name electronics engineering, which then became a stand-alone university degree subject, usually taught alongside electrical engineering with which it had become associated due to some similarities.
The first working transistor was a point-contact transistor invented by John Bardeen and Walter Houser Brattain while working under William Shockley at the Bell Telephone Laboratories (BTL) in 1947.[43] Shockley then invented the bipolar junction transistor in 1948.[44] While early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis,[45] they opened the door for more compact devices.[46]
The first integrated circuits were the hybrid integrated circuit invented by Jack Kilby at Texas Instruments in 1958 and the monolithic integrated circuit chip invented by Robert Noyce at Fairchild Semiconductor in 1959.[47]
In 1955, Carl Frosch and Lincoln Derick accidentally grew a layer of silicon dioxide over a silicon wafer, on which they observed surface passivation effects.[48] By 1957 Frosch and Derick, using masking and predeposition, had manufactured and published silicon dioxide planar transistors, the first field-effect transistors in which drain and source were adjacent at the same surface.[49] They showed that silicon dioxide insulated and protected silicon wafers and prevented dopants from diffusing into the wafer.[48][50]
Following this research, Mohamed Atalla and Dawon Kahng proposed a silicon MOS transistor in 1959[51] and successfully demonstrated a working MOS device with their Bell Labs team in 1960.[52][53] Their team included E. E. LaBate and E. I. Povilonis who fabricated the device; M. O. Thurston, L. A. D’Asaro, and J. R. Ligenza who developed the diffusion processes, and H. K. Gummel and R. Lindner who characterized the device.[54][55] This was a culmination of decades of field-effect research that began with Lilienfeld.
The MOSFET was the first truly compact transistor that could be miniaturised and mass-produced for a wide range of uses.[45] It revolutionized the electronics industry,[56][57] becoming the most widely used electronic device in the world.[58][59][60]
The MOSFET made it possible to build high-density integrated circuit chips.[58] The earliest experimental MOS IC chip to be fabricated was built by Fred Heiman and Steven Hofstein at RCA Laboratories in 1962.[61] MOS technology enabled Moore's law, the doubling of transistors on an IC chip every two years, predicted by Gordon Moore in 1965.[62] Silicon-gate MOS technology was developed by Federico Faggin at Fairchild in 1968.[63] Since then, the MOSFET has been the basic building block of modern electronics.[64][65][66] The mass-production of silicon MOSFETs and MOS integrated circuit chips, along with continuous MOSFET scaling miniaturization at an exponential pace (as predicted by Moore's law), has since led to revolutionary changes in technology, economy, culture and thinking.[67]
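Stated as a growth law, if a chip holds $N_0$ transistors in a reference year, Moore's prediction implies roughly

$$N(t) = N_0 \cdot 2^{t/2}$$

transistors $t$ years later; a design starting at 1,000 transistors would thus be expected to reach about 32,000 transistors after a decade ($2^{10/2} = 32$).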
The Apollo program which culminated in landing astronauts on the Moon with Apollo 11 in 1969 was enabled by NASA's adoption of advances in semiconductor electronic technology, including MOSFETs in the Interplanetary Monitoring Platform (IMP)[68][69] and silicon integrated circuit chips in the Apollo Guidance Computer (AGC).[70]
The development of MOS integrated circuit technology in the 1960s led to the invention of the microprocessor in the early 1970s.[71][72] The first single-chip microprocessor was the Intel 4004, released in 1971.[71][73] The Intel 4004 was designed and realized by Federico Faggin at Intel with his silicon-gate MOS technology,[71] along with Intel's Marcian Hoff and Stanley Mazor and Busicom's Masatoshi Shima.[74] This ignited the development of the personal computer. The 4004, a 4-bit processor, was followed in 1973 by the Intel 8080, an 8-bit processor, which made possible the building of the first personal computer, the Altair 8800.[75]