Journal of Communications Technology and Electronics
Journal of Communications Technology and Electronics is a peer-reviewed journal that publishes articles on a broad spectrum of theoretical, fundamental, and applied issues of radio engineering, communication, and electron physics. It publishes original articles from leading scientific and research centers. The journal covers all essential branches of electromagnetics, wave propagation theory, signal processing, transmission lines, telecommunications, physics of semiconductors, and physical processes in electron devices, as well as applications in biology, medicine, microelectronics, nanoelectronics, electron and ion emission, etc. The journal publishes original manuscripts submitted in English, as well as works translated from several other journals. The sources of content are indicated at the article level. The peer review policy of the journal is independent of the manuscript source, ensuring a fair and unbiased evaluation process for all submissions. As part of its aim to become an international publication, the journal welcomes submissions in English from all countries.
Peer review and editorial policy
The journal follows the Springer Nature Peer Review Policy, Process and Guidance, Springer Nature Journal Editors' Code of Conduct, and COPE's Ethical Guidelines for Peer-reviewers.
Approximately 1% of the manuscripts are rejected without review based on formal criteria as they do not comply with the submission guidelines. Each manuscript is assigned to one peer reviewer, who can either be an external expert or a member of the Editorial Board. When the reviewer is a member of the Editorial Board, they will write the review themselves. In special cases (for example, when the topic of the article is interdisciplinary), two reviewers are assigned to the article. The journal follows a single-blind reviewing procedure. The period from submission to the first decision is up to 6 weeks. The approximate rejection rate is 15%. The final decision on the acceptance of a manuscript for publication is mostly made by the responsible editor and sometimes by a meeting of the most active members of the Editorial Board.
If Editors, including the Editor-in-Chief, publish in the journal, they do not participate in the decision-making process for manuscripts where they are listed as co-authors.
Special issues published in the journal follow the same procedures as all other issues. If not stated otherwise, special issues are prepared by the members of the editorial board without guest editors.
- Examines theoretical, fundamental, and applied issues in radio engineering, communications, and electron physics
- Official Publication of the Russian Academy of Sciences
- Covers electromagnetics, wave propagation theory, signal processing, transmission lines, telecommunications, physics of semiconductors, and physical processes in electron devices
- Sets forth new applications in biology, medicine, microelectronics, nanoelectronics, electron and ion emission, etc.
- Yurii V. Gulyaev
Issue 9, September 2023
Microstrip beam-forming network for generating a sector radiation pattern of an element of a linear array.
- S. E. Bankov
- E. V. Frolova
- Content type: ON THE 70th ANNIVERSARY OF THE INSTITUTE OF RADIOENGINEERING AND ELECTRONICS, RUSSIAN ACADEMY OF SCIENCES
- Published: 16 November 2023
- Pages: 925 - 933
Delaying Medium for Broadband Lens Antennas Based on Corrugated Metal Surfaces
- V. A. Kaloshin
- Bui Van Chung
- Pages: 934 - 939
Influence of a Strong Local Atmospheric Disturbance on the Resonant Structure of the Near Field of a Low-Frequency Loop Antenna Located in the Ionosphere of the Earth
- A. V. Moshkov
- Pages: 940 - 945
Study of Superconducting Transmission Lines and Tunnel Junctions for Signal Detection at Frequencies above 1 THz
Authors (first, second, and last of four):
- N. V. Kinev
- A. M. Chekushkin
- K. I. Rudakov
- Pages: 946 - 951
Parametric Filter Family with a Finite Impulse Response Based on Splines and a Method for Searching for the Optimal Parameter
- K. A. Budunova
- V. F. Kravchenko
- Pages: 952 - 959
About this journal
- Chemical Abstracts Service (CAS)
- Current Contents/Engineering, Computing and Technology
- EBSCO Applied Science & Technology Source
- EBSCO Discovery Service
- EBSCO STM Source
- EI Compendex
- Google Scholar
- Japanese Science and Technology Agency (JST)
- Journal Citation Reports/Science Edition
- OCLC WorldCat Discovery Service
- ProQuest ABI/INFORM
- ProQuest Advanced Technologies & Aerospace Database
- ProQuest-ExLibris Primo
- ProQuest-ExLibris Summon
- Science Citation Index
- Science Citation Index Expanded (SCIE)
- TD Net Discovery Service
- UGC-CARE List (India)
Rights and permissions
© Pleiades Publishing, Inc.
Latest Topics in Electronics and Communication (ECE) for project, research, and thesis
Electronics and Communication is a field that matters to our daily life. There are a number of good topics in electronics and communication engineering (ECE) for a thesis, research, or a project, and new developments and research are constantly underway in this field. It has made our lives easier and more comfortable.
Mobile phones and communication networks have brought the world closer, thanks to the electronics and communication engineers working on these products. In academics, students are often confused about which topic to choose in electronics and communication for a project, thesis, or seminar. M.Tech students find it even more difficult to choose a good master's thesis topic in communication engineering. Even after the choice of topic is made, students often struggle to get proper thesis guidance and thesis assistance in ECE.
So what to do?
Here are some of the latest and best topics in electronics and communication which you can choose for your thesis, projects, and seminars for M.Tech and Ph.D. You can get thesis help in any of these topics from the thesis guidance experts.
Latest Thesis and Research Topics in Electronics and Communication (ECE)
Following is the list of the latest topics in Electronics and Communication (ECE) for projects, research, and theses:
Bluetooth
Fibre Optic Communication
Embedded Systems
Nanoelectronics
VLSI (Very Large Scale Integration)
OLED (Organic Light Emitting Diode)
ZigBee
Human Area Network
GPRS
HSPA
Bluetooth is a low-power wireless technology used for exchanging data over short ranges. Bluetooth builds a Personal Area Network (PAN) for the exchange of data between mobile devices. The technology was invented by Ericsson in 1994 and is based on a radio technique known as frequency-hopping spread spectrum, in which data is transmitted in the form of packets. It is a very good topic for an M.Tech thesis, and thesis help on it can be taken from an expert in the field; it is also a good choice for a major project in ECE. There are two flavours of Bluetooth technology:
Basic Rate/Enhanced Data Rate (BR/EDR) – Uses point-to-point topology to enable continuous wireless communication between two devices; wireless speakers are a common example.
Low Energy (LE) – Uses multiple network topologies for communication: point-to-point (one-to-one), broadcast (one-to-many), and mesh (many-to-many).
Many consumers worldwide use this technology for streaming audio, exchanging data, and broadcasting information. Bluetooth uses a variety of protocols; the Bluetooth protocol stack is divided into two parts: the controller stack and the host stack.
The controller stack is implemented in low-cost silicon devices that contain a Bluetooth radio and a microprocessor. The host stack is implemented on top of the operating system or as an installable package on it.
How does Bluetooth technology work?
The Bluetooth network, also known as a Personal Area Network or piconet, contains 2 to 8 devices. One is the master device that initiates communication, while the others are slaves. The slave devices respond to the actions of the master device, which governs transmission between the slaves. A slave device may begin transmission only in an allotted time slot.
A scatternet is created when a device participates in more than one piconet.
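The master/slave slot discipline above can be sketched in a few lines of Python. This is a toy model under simplifying assumptions (strict alternation of master and slave slots, illustrative device names); real Bluetooth scheduling is considerably more involved:

```python
# Toy sketch of piconet slot scheduling: the master addresses one slave
# per even slot, and only the addressed slave replies in the next slot.
# Device names are illustrative; real Bluetooth scheduling is richer.

def schedule_slots(slaves, num_slots):
    """Alternate master-to-slave and slave-to-master slots."""
    schedule = []
    for slot in range(num_slots):
        slave = slaves[(slot // 2) % len(slaves)]
        if slot % 2 == 0:
            schedule.append(("master", slave))  # master polls a slave
        else:
            schedule.append((slave, "master"))  # that slave replies
    return schedule

for sender, receiver in schedule_slots(["slave1", "slave2", "slave3"], 6):
    print(f"{sender} -> {receiver}")
```

Because only the currently addressed slave transmits, the master fully governs when each slave may speak, which is the point made above.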
Features of Bluetooth Technology
Following are some of the features of Bluetooth technology:
Based on radio technology.
Power consumption is less.
There are fewer complications.
Applications of Bluetooth technology
Wireless mobile phone headset.
Bluetooth-enabled laptops and PCs.
Wireless mouse and keyboard.
Data transfer between mobile devices.
Disadvantages of Bluetooth Technology
Along with benefits, there are certain disadvantages of Bluetooth technology. Some of these are:
High battery consumption.
Security is poor.
The data transfer rate is low.
For transmission of large amounts of data, fibre optic communication is the perfect choice. This type of communication is used to transmit data over long distances across computer networks. The technology converts electronic signals into light signals, which are transmitted through optical fibres. It is a very good choice for an M.Tech thesis project, and thesis help on the topic can be taken from professionals. Some characteristics of this type of communication are:
Long distance communication
Less electromagnetic interference
How does Fibre Optic Communication work?
Unlike other forms of communication, in fibre optics the communication takes place in the form of light signals. The components of fibre optic communication are:
The transmitter receives input in the form of electrical signals, which are converted into light signals using a light source such as an LED or laser. The light signal is transmitted through optical fibre cable to the receiver, where it is converted back into electrical signals. The receiver consists of a photodetector that detects the incoming optical field. Wavelengths near the infrared are used for communication.
Photodetector – A photodetector is a device that converts light signals into electric signals. The two types mainly used in fibre optic communication are the PIN photodiode and the avalanche photodiode.
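One concrete quantity behind long-distance fibre links is attenuation. As a hedged illustration (the figures below are typical textbook values, not from the text), the power reaching the photodetector after L km of fibre with loss α dB/km is P_in · 10^(−αL/10):

```python
# Illustrative fibre attenuation sketch: power remaining after length_km
# of fibre with attenuation alpha_db_per_km. Values are typical textbook
# numbers (about 0.2 dB/km near 1550 nm), not taken from the text.

def output_power(p_in_mw, alpha_db_per_km, length_km):
    loss_db = alpha_db_per_km * length_km
    return p_in_mw * 10 ** (-loss_db / 10)

# 1 mW launched into 50 km at 0.2 dB/km loses 10 dB -> 0.1 mW arrives
print(output_power(1.0, 0.2, 50))
```

This low power loss is what makes fibre practical for the long-distance communication claimed above.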
Advantages of fibre optics communication
Some of the advantages of fibre optics communication are:
Higher transmission bandwidth
Data transmission is higher
Low power loss
Immunity to electromagnetic interference
Disadvantages of fibre optics communication
High installation cost
A larger number of repeaters is needed over long distances
More maintenance is required
Embedded systems are physical hardware systems with software embedded in them. Such a system is microprocessor- or microcontroller-based and can be independent or part of a larger system. It is specifically designed to perform a particular set of tasks. It is a hot topic for a thesis, project, and seminar, and if you know little about it, you can also take thesis guidance on the topic. The three components of embedded systems are:
Hardware
Application software
Real-Time Operating System (RTOS)
Characteristics of Embedded Systems
The characteristics of the embedded systems are:
Single functionality – Embedded systems are specifically designed to perform a single task.
Tightly constrained – Embedded systems work under constraints such as design, cost, size, and power.
Reactive – Embedded systems are reactive in nature, i.e., they react instantaneously to changes in their environment.
Based on microprocessors – Embedded systems are microprocessor- or microcontroller-based.
Memory – These systems have ROM (Read-Only Memory) embedded in them, as there is no need for secondary memory.
Connectivity – These systems have peripherals connected to them for input and output.
What does an Embedded System consist of?
The basic structure of an embedded system consists of the following components:
Sensor – To measure the quantity of a system by converting it into electrical signals.
A/D Converter – It is required to convert analog to digital signals.
Processor – It processes the data and stores it into memory.
D/A Converter – It converts digital to analog signals.
Actuator – An actuator compares the output of the D/A converter with the expected output and produces the final physical action.
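The sensor → A/D → processor → D/A → actuator chain above can be sketched as a minimal Python signal path. The 8-bit converter resolution and the on/off control rule are illustrative assumptions, not from the text:

```python
# Minimal sketch of the embedded signal chain described above:
# sensor -> A/D converter -> processor -> D/A converter -> actuator.
# The 8-bit resolution and on/off control rule are assumed examples.

def adc(voltage, vref=5.0, bits=8):
    """Quantise an analog voltage into an n-bit code (clamped)."""
    code = round(voltage / vref * (2 ** bits - 1))
    return max(0, min(2 ** bits - 1, code))

def dac(code, vref=5.0, bits=8):
    """Convert an n-bit code back to an analog voltage."""
    return code / (2 ** bits - 1) * vref

def process(code, setpoint=128):
    """Toy processor: drive the actuator fully on below the setpoint."""
    return 255 if code < setpoint else 0

sensor_voltage = 1.5                # analog reading from the sensor
code = adc(sensor_voltage)          # A/D converter
command = process(code)             # processor decides an output code
actuator_drive = dac(command)       # D/A converter feeds the actuator
print(code, command, actuator_drive)
```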
Advantages of Embedded Systems
These systems can be easily customized.
These have low power consumption.
The cost is comparatively low.
Performance is enhanced.
Disadvantages of Embedded Systems
High development effort is needed.
Marketing is not easy.
Nanoelectronics is a field that deals with the use of nanotechnology in electronic components, while nanotechnology itself is a branch of engineering that deals with matter at the atomic and molecular level. Nanoelectronics is largely based on transistors, here smaller than 1000 nanometres. They are so small that a separate study is needed to understand the interatomic interactions and quantum mechanical properties involved. These transistors are designed through nanotechnology and are very different from traditional transistors.
The work a nanoelectronic device can do depends upon its size: as the volume increases, so does the power of the device. Development in this field is still in progress, as there are limitations when such devices are used in the real world.
Different approaches to nanotechnology
The two main approaches to nanotechnology are:
Top-down – Larger structures are reduced in size to the nanoscale, as in the lithographic patterning of chips.
Bottom-up – Materials and devices are built up atom by atom or molecule by molecule.
Applications of Nanoelectronics
Certain development and applications have been made in this field of nanotechnology which are as follows:
Nanoradio – These will use nanoprocessors for high speed and performance; carbon nanotubes are being used in this application.
Nanocomputers – Traditional computers may be replaced by nanocomputers for higher performance and speed; detailed research is being carried out in this field.
Medical diagnostics – Nanoelectronic devices can detect biomolecules and will thus help in medical diagnostics.
Energy production – Research is being conducted to create energy-efficient solar cells, galvanic cells, and fuel cells.
VLSI (Very Large Scale Integration)
VLSI is a process for creating Integrated Circuits (ICs) by combining thousands of transistors on a single chip; a microprocessor is an example of VLSI. Before the development of VLSI, integrated circuits had limited functionality and performance. VLSI gives the ability to put a CPU, RAM, ROM, and other such functions on a single chip.
Due to this, the electronics industry has recorded a commendable growth.
Design of VLSI
VLSI design mainly consists of front-end design and back-end design. Front-end design is the digital design, while back-end design is the CMOS (Complementary Metal-Oxide Semiconductor) library design. The steps followed while designing a VLSI chip are:
Problem Specification – In this step, various parameters are studied like size, cost, performance and functionality.
Architecture – In this step, specifications like floating point unit, ALU, RISC/CISC and cache size are studied.
Functional Design – The functional unit along with the input and output are defined in this step using a block diagram.
Logical Design – The main logic of the system is designed at this step. Other developments in this step include boolean expression, register allocation, control flow and word width.
Design of the Circuit – The circuit is designed after the logical design by the use of gates and transistors.
Physical Design – The complete layout of the system is designed at this step through geometrical representation.
Packaging – The final product is obtained after putting together all the chips into a single printed circuit board.
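As a hedged, toy illustration of the logical-design step above, where boolean expressions are written down before gates are chosen, here are the expressions for a 1-bit half adder modelled in Python (a real flow would use an HDL):

```python
# Toy model of a logical-design artifact: boolean expressions for a
# 1-bit half adder, later mapped to gates in the circuit-design step.
# Python stands in for an HDL purely for illustration.

def half_adder(a, b):
    total = a ^ b   # sum bit: XOR of the inputs
    carry = a & b   # carry bit: AND of the inputs
    return total, carry

# exhaustive truth table, the usual check at this design stage
for a in (0, 1):
    for b in (0, 1):
        print(a, b, half_adder(a, b))
```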
Advantages of VLSI
The advantages of VLSI are:
Size of the circuit is reduced.
Cost of the device is reduced.
Increase in overall performance and speed.
It finds use in almost every field, from computers to medicine.
OLED is a type of LED (Light Emitting Diode) in which the component that produces light is made up of a thin layer of organic compounds. This organic semiconductor layer is situated between two electrodes. It is mainly used for flat-panel displays, mobile devices, and smartphones. There are two types of OLEDs:
Based on small molecules
Based on polymers
Working of OLEDs
Organic LEDs work in almost the same way as traditional LEDs, with some changes. Instead of n-type and p-type semiconductors, organic molecules are used to produce electrons and holes. An OLED has six layers: the top layer is known as the seal and the bottom layer as the substrate. Between them are two terminals, the anode (positive) and the cathode (negative), and between these terminals lie the organic layers: the emissive layer and the conductive layer.
A voltage is applied across the anode and cathode. As current flows, the cathode supplies electrons to the emissive layer while the anode withdraws them from the conductive layer. The emissive layer thus becomes negatively charged, while the conductive layer fills with positively charged holes. The holes jump toward the emissive layer, and when a positive hole meets a negatively charged electron, a photon, a particle of light, is produced.
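The colour of the emitted photon is set by its energy. As a rough, illustrative calculation (the 2.3 eV figure is an assumed example, not from the text), wavelength = hc/E:

```python
# Rough illustration: a photon's wavelength (and hence colour) follows
# from its energy via wavelength = h*c / E. The 2.3 eV example energy
# is an assumption for illustration; the text gives no numbers.

H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
EV = 1.602e-19   # joules per electron-volt

def wavelength_nm(energy_ev):
    return H * C / (energy_ev * EV) * 1e9

# a ~2.3 eV photon lies in the green region of the visible spectrum
print(round(wavelength_nm(2.3)))
```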
Advantages of OLEDs
These are superior to LCDs.
These are thinner, lighter and flexible.
The response time is faster.
They produce true colors with better viewing angle.
Disadvantages of OLEDs
These have a comparatively shorter lifetime than LCDs.
The organic molecules degrade over time.
These are very sensitive to water.
ZigBee is an IEEE 802.15.4-based communication system designed for wireless personal area networks. The standard allows the physical and media access control (MAC) layers to handle many devices at very low data rates. The main characteristics of this technology are its low power consumption and low cost. It controls and manages applications within a range of 10-100 metres, and it is less expensive than Bluetooth and Wi-Fi.
Architecture of ZigBee
The system consists of three device types: the coordinator, routers, and end devices.
The ZigBee coordinator acts as the bridge and root of the whole network; it handles and stores information and performs data operations. ZigBee routers are intermediary devices that pass data to and from other devices. End devices communicate with their parent node. The ZigBee protocol consists of the following five layers:
Physical Layer – This layer performs the modulation and demodulation operation.
MAC Layer – This layer accesses the channel using CSMA/CA to provide reliable transmission of data.
Network Layer – This layer looks after all the operations related to the network.
Application Support Sub-Layer – This layer matches two devices according to their services and needs.
Application Framework – This layer provides two types of data services. One is the key value pair and the other one is the generic messages service.
ZigBee Operating Modes
There are two modes of operation in ZigBee:
Non-beacon – In this mode, there is no monitoring of the incoming data by the coordinators and the routers.
Beacon – In this mode, the active state of the incoming data is continuously monitored by the coordinators and routers thereby consuming more power.
Applications of ZigBee Technology
ZigBee finds its application in the following fields:
Smart Grid Monitoring
Human Area Network is a wireless network, also referred to as RedTacton, that uses the human body as a medium for high-speed transmission. It differs from other wireless and infrared technologies in that it uses the tiny electric field emitted on the surface of the human body. It is a very good entry in any ECE thesis topic list.
The human body forms a transmission path whenever a part of it comes into contact with the RedTacton transceiver. Body surface can be hands, legs, arm, feet or face. It can work through clothes and shoes. Whenever the physical contact between the transceiver and the human body is lost, communication ends.
It has the following three main features:
Communication can be triggered by natural human movements such as touching, gripping, walking, sitting, and stepping.
The transmission speed does not degrade when many people are communicating at the same time, since the transmission path is each person's own body surface.
Conductors and dielectrics can be used along with the human body.
Working of Human Area Network
The approach of Human Area Network is different from other networks. It does not use electromagnetic waves or light waves for data transmission. Instead, it uses weak electric signals on the human body for transmission. It works as follows:
The RedTacton transmitter generates a weak electric signal on the human body surface.
Any changes caused by the transmitter to the electric field is sensed by the RedTacton receiver.
RedTacton depends upon the principle that the changes in the weak electric field can cause a change in the optical properties of an electro-optic crystal.
These changes are detected using a laser and the result is produced in the form of electrical signals.
RedTacton uses CSMA/CD (Carrier Sense Multiple Access with Collision Detection) protocols for transmission.
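A core piece of CSMA/CD is truncated binary exponential backoff after a collision. The sketch below uses the classic Ethernet parameters purely for illustration; the text does not specify RedTacton's backoff details:

```python
import random

# Sketch of truncated binary exponential backoff, the collision-recovery
# rule in classic CSMA/CD: after the n-th collision a station waits a
# random number of slot times in 0 .. 2^min(n, 10) - 1. The cap of 10
# is Ethernet's, used here only for illustration.

def backoff_slots(collisions, rng=random.Random(0)):
    k = min(collisions, 10)
    return rng.randrange(2 ** k)

for n in range(1, 5):
    print(n, backoff_slots(n))
```

Randomising the wait makes it unlikely that the same two stations collide again on their retry.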
GPRS stands for General Packet Radio Service. It is a packet-based service for 2G and 3G mobile communication, standardized by the European Telecommunications Standards Institute (ETSI). It provides higher data rates for Internet access on mobile phones. It is based on GSM (Global System for Mobile communications) and adds packet data alongside circuit-switched connections and the Short Message Service (SMS). It is another popular topic for a final-year project, thesis, or seminar.
GPRS has the following main features:
It costs less than circuit-switched services, as the communication channels are shared.
It provides variable throughput and latency.
It provides data rates of 56-114 kbps.
It supports IP, PPP, and X.25 packet-based protocol.
Services offered by GPRS include:
SMS(Short Messaging Service)
MMS(Multimedia Messaging Service)
Point-to-point(P2P) and point-to-multipoint(P2M) services
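The 56-114 kbps figures quoted above translate into concrete download times. A back-of-envelope sketch (the 512 KB file size is an assumed example, not from the text):

```python
# Back-of-envelope sketch: download time for a file at the GPRS data
# rates quoted above (56-114 kbps). The 512 KB file size is an example.

def transfer_time_s(size_kb, rate_kbps):
    bits = size_kb * 8          # kilobytes -> kilobits
    return bits / rate_kbps

print(round(transfer_time_s(512, 56), 1))    # slowest GPRS rate, ~73.1 s
print(round(transfer_time_s(512, 114), 1))   # fastest GPRS rate, ~35.9 s
```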
It stands for High-Speed Packet Access. It is a combination of two technologies, HSDPA and HSUPA, for the downlink and uplink respectively. It provides high-speed data access and improves the performance of existing 3G (WCDMA) mobile networks, whose basic download speed is 384 kbps. It uses WCDMA protocols. Students looking for ECE project ideas can work on this topic.
Components of HSPA
Following are the two main components of HSPA providing a link between the base station and the user:
HSDPA(High-speed Downlink Packet Access) – HSDPA is used to provide support for packet data and a data rate of 14 Mbps. Also, it helps in reducing delays.
HSUPA(High-speed Uplink Packet Access) – It also provides data support with improved features along with data rate of 5.74 Mbps.
Benefits of HSPA
There are a number of benefits of HSPA but following are the significant ones:
HSPA uses a higher order of modulation for data to be transmitted at a higher rate.
It uses a Shorter Transmission Time Interval (TTI) to reduce the round-trip time and latency.
It uses a shared channel for transmission which provides a great level of efficiency.
To maximize channel usage, link adaptation is used.
Fast Node B scheduling is used with adaptive coding and modulation to respond to the constantly varying radio channel and interference.
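The "higher order of modulation" point above comes down to bits per symbol: an M-ary constellation carries log2(M) bits per symbol, so moving from QPSK to 16-QAM doubles the payload per symbol. A small illustrative sketch:

```python
import math

# Illustrative sketch: an M-ary constellation carries log2(M) bits per
# symbol, which is why a higher order of modulation raises the data
# rate (at the cost of needing a cleaner radio channel).

def bits_per_symbol(m):
    return int(math.log2(m))

for name, m in [("QPSK", 4), ("16-QAM", 16), ("64-QAM", 64)]:
    print(name, bits_per_symbol(m))
```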
These were some of the topics in electronics and communication for your project, thesis, and seminar. Thesis help and guidance on ECE thesis topics can be taken from thesis guidance agencies.
Techsparks offers thesis and research help in electronics and communication (ECE). You can contact us at +91-9465330425 or email us at [email protected] for help with any of the latest topics in electronics and communication. You can also fill in the query form on the website.
Electronics Research Paper Topics
This list of electronics research paper topics provides the list of 30 potential topics for research papers and an overview article on the history of electronics.
1. Applications of Superconductivity
The 1986 Applied Superconductivity Conference proclaimed, ‘‘Applied superconductivity has come of age.’’ The claim reflected only 25 years of development, but was justifiable due to significant worldwide interest and investment. For example, the 1976 annual budget for superconducting systems exceeded $30 million in the U.S., with similar efforts in Europe and Japan. By 1986 the technology had matured impressively into applications for the energy industry, the military, transportation, high-energy physics, electronics, and medicine. The announcement of high-temperature superconductivity just two months later brought about a new round of dramatic developments.
2. Discovery of Superconductivity
As the twenty-first century began, an array of superconducting applications in high-speed electronics, medical imaging, levitated transportation, and electric power systems are either having, or will soon have, an impact on the daily life of millions. Surprisingly, at the beginning of the twentieth century, the discovery of superconductivity was completely unanticipated and unimagined.
In 1911, three years after liquefying helium, H. Kamerlingh Onnes of the University of Leiden discovered superconductivity while investigating the temperature-dependent resistance of metals below 4.2 Kelvin. Later reporting on experiments conducted in 1911, he described the disappearance of the resistance of mercury, stating, ‘‘Within some hundredths of a degree came a sudden fall, not foreseen [by existing theories of resistance]. Mercury has passed into a new state, which . . . may be called the superconductive state.’’
3. Electric Motors
The main types of electric motors that drove twentieth century technology were developed toward the end of the nineteenth century, with direct current (DC) motors being introduced before alternating current (AC) ones. Most important initially was the ‘‘series’’ DC motor, used in electric trolleys and trains from the 1880s onward. The series motor exerts maximum torque on starting and then accelerates to its full running speed, the ideal characteristic for traction work. Where speed control independent of the load is required in such applications as crane and lift drives, the ‘‘shunt’’ DC motor is more suitable.
4. Electronic Calculators
The electronic calculator is usually inexpensive and pocket-sized, using solar cells for its power and having a gray liquid crystal display (LCD) to show the numbers. Depending on the sophistication, the calculator might simply perform the basic mathematical functions (addition, subtraction, multiplication, division) or might include scientific functions (square, log, trig). For a slightly higher cost, the calculator will probably include programmable scientific and business functions. At the end of the twentieth century, the electronic calculator was as commonplace as a screwdriver and helped people deal with all types of mathematics on an everyday basis. Its birth and growth were early steps on the road to today’s world of computing.
5. Electronic Communications
The broad use of digital electronic message communications in most societies by the end of the twentieth century can be attributed to a myriad of reasons. Diffusion was incremental and evolutionary. Digital communication technology was seeded by large-scale funding for military projects that broke technological ground; however, social needs drove these systems in unexpected ways and made them popular because those needs were embraced. Key technological developments happened long before diffusion into society, and it was only after the popularity of the personal computer that global and widespread use became commonplace. The Internet was an important medium in this regard, though its popular uses were well established long before its success. Collaborative developments with open, mutually agreed standards were key factors in the broader diffusion of low-level transmission of digital data, and provided resistance to technological lock-in by any commercial player. By the twenty-first century, the concept of interpersonal electronic messaging was accepted as normal and taken for granted by millions around the world, wherever infrastructural and political freedoms permitted. As a result, traditional lines of information control and mass broadcasting were challenged, although it remains to be seen what, if any, long-term impact this will have on society.
6. Electronic Control Technology
The advancement of electrical engineering in the twentieth century made a fundamental change in control technology. New electronic devices including vacuum tubes (valves) and transistors were used to replace electromechanical elements in conventional controllers and to develop new types of controllers. In these practices, engineers discovered basic principles of control theory that could be further applied to design electronic control systems.
7. Fax Machine
Fax technology was especially useful for international commercial communication, which was traditionally the realm of the Telex machine, which only relayed Western alpha-numeric content. A fax machine could transmit a page of information regardless of what information it contained, and this led to rapid and widespread adoption in developing Asian countries during the 1980s. With the proliferation of the Internet and electronic e-mail in the last decade of the twentieth century, fax technology became less used for correspondence. At the close of the 20th century, the fax machine was still widely used internationally for the transmission of documents of all forms, with the ‘‘hard copy’’ aspect giving many a sense of permanence that other electronic communication lacked.
8. Hall Effect Devices
The ‘‘Hall effect,’’ discovered in 1879 by American physicist Edwin H. Hall, is the electrical potential produced when a magnetic field is perpendicular to a conductor or semiconductor that is carrying current. This potential is a product of the buildup of charges in that conductor. The magnetic field makes a transverse force on the charge carriers, resulting in the charge being moved to one of the sides of the conductor. Between the sides of the conductor, measurable voltage is yielded from the interaction and balancing of the polarized charge and the magnetic influence.
Hall effect devices are commonly used as magnetic field sensors, or alternatively if a known magnetic field is applied, the sensor can be used to measure the current in a conductor, without actually plugging into it (‘‘contactless potentiometers’’). Hall sensors can also be used as magnetically controlled switches, and as a contactless method of detecting rotation and position, sensing ferrous objects.
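The physics above can be made concrete with the standard Hall relation V_H = I·B / (n·q·t), where I is the current, B the perpendicular field, n the carrier density, q the elementary charge, and t the strip thickness. A minimal sketch; the numbers are illustrative and not taken from the text:

```python
# Hall voltage across a current-carrying strip: V_H = I*B / (n*q*t).
# Illustrative values only; n below is roughly the carrier density of copper.

Q_E = 1.602e-19  # elementary charge, coulombs

def hall_voltage(current_a, field_t, carrier_density_m3, thickness_m):
    """Transverse (Hall) voltage developed across the strip."""
    return current_a * field_t / (carrier_density_m3 * Q_E * thickness_m)

# A 1 mm-thick copper strip carrying 10 A in a 1 T field:
v = hall_voltage(10.0, 1.0, 8.5e28, 1e-3)
print(f"{v * 1e6:.2f} microvolts")
```

The sub-microvolt result for a metal illustrates why practical Hall sensors use semiconductors, whose far lower carrier densities produce much larger, more easily measured voltages.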
9. Infrared Detectors
Infrared detectors rely on the change of a physical characteristic to sense illumination by infrared radiation (i.e., radiation having a wavelength longer than that of visible light). The origins of such detectors lie in the nineteenth century, although their development, variety, and applications exploded during the twentieth century. William Herschel (c. 1800) employed a thermometer to detect this ‘‘radiant heat’’; Macedonio Melloni (c. 1850) invented the ‘‘thermochrose’’ to display spatial differences of irradiation as color patterns on a temperature-sensitive surface; and in 1882 William Abney found that photographic film could be sensitized to respond to wavelengths beyond the red end of the spectrum. Most infrared detectors, however, convert infrared radiation into an electrical signal via a variety of physical effects. Here, too, nineteenth-century innovations continued in use well into the twenty-first century.
10. Integrated Circuits Design and Use
Integrated circuits (ICs) are electronic devices designed to integrate a large number of microscopic electronic components, normally connected by wires in circuits, within the same substrate material. According to the American engineer Jack S. Kilby, they are the realization of the so-called ‘‘monolithic idea’’: building an entire circuit out of silicon or germanium. ICs are made out of these materials because of their properties as semiconductors—materials that have a degree of electrical conductivity between that of a conductor such as metal and that of an insulator (having almost no conductivity at low temperatures). A piece of silicon containing one circuit is called a die or chip; thus, ICs are also known as microchips. Advances in semiconductor technology in the 1960s (the miniaturization revolution) meant that the number of transistors on a single chip doubled every two years, which lowered microprocessor costs and enabled consumer products such as handheld calculators.
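The two-year doubling trend is easy to sketch. Assuming a fixed two-year doubling period and the Intel 4004's widely quoted count of roughly 2,300 transistors as a 1971 baseline (both figures are simplifications for illustration):

```python
# Transistor count under a fixed two-year doubling period ("Moore's law"),
# anchored to the Intel 4004's ~2,300 transistors in 1971.

def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count for a given year under the doubling rule."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for y in (1971, 1981, 1991, 2001):
    print(y, f"{transistors(y):,.0f}")
```

Real devices tracked this curve only roughly, but it conveys how quickly exponential doubling turns thousands of transistors into millions.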
11. Integrated Circuits Fabrication
The fabrication of integrated circuits (ICs) is a complicated process that consists primarily of the transfer of a circuit design onto a piece of silicon (the silicon wafer). Using a photolithographic technique, the areas of the silicon wafer to be imprinted with electric circuitry are covered with glass plates (photomasks), irradiated with ultraviolet light, and treated with chemicals in order to shape a circuit’s pattern. On the whole, IC manufacture consists of four main stages:
- Preparation of a design
- Preparation of photomasks and silicon wafers
- Wafer fabrication (the photolithographic imprinting described above)
- Testing and packaging
Preparing an IC design consists of drafting the circuit’s electronic functions on the silicon chip. This process has changed radically over the years owing to the increasing complexity of designs and the number of electronic components contained within the same IC. For example, in 1971, the Intel 4004 microprocessor was designed by just three engineers, while in the 1990s the Intel Pentium was designed by a team of 100 engineers. Moreover, the early designs were produced with traditional drafting techniques, while from the late 1970s onward the introduction of computer-aided design (CAD) techniques completely changed the design stage. Computers are used to check the design and simulate the operations of prospective ICs in order to optimize their performance. Thus, the IC design may be modified up to 400 times before going into production.
12. Josephson Junction Devices
One of the most important implications of quantum physics is the existence of so-called tunneling phenomena, in which elementary particles are able to cross an energy barrier on subatomic scales that it would not be possible for them to traverse were they subject to the laws of classical mechanics. In 1973 the Nobel Prize in Physics was awarded to Brian Josephson, Ivar Giaever, and Leo Esaki for their work in this field. Josephson’s contribution consisted of a number of important theoretical predictions made while a doctoral student at Cambridge University. His work was confirmed experimentally within a year of its publication in 1962, and practical applications were commercialized within ten years.
13. Laser Applications
Lasers are employed in virtually every sector of the modern world including industry, commerce, transportation, medicine, education, science, and in many consumer devices such as CD players and laser printers. The intensity of lasers makes them ideal cutting tools since their highly focused beam cuts more accurately than machined instruments and leaves surrounding materials unaffected. Surgeons, for example, have employed carbon dioxide or argon lasers in soft tissue surgery since the early 1970s. These lasers produce infrared wavelengths of energy that are absorbed by water. Water in tissues is rapidly heated and vaporized, resulting in disintegration of the tissue. Visible wavelengths (argon ion laser) coagulate tissue. Far-ultraviolet wavelengths (higher photon energy, as produced by excimer lasers) break down molecular bonds in target tissue and ‘‘ablate’’ tissue without heating. Excimer lasers have been used in corneal surgery since 1984. Short pulses only affect the surface area of interest and not deeper tissues. The extremely small size of the beam, coupled with optical fibers, enables today’s surgeons to conduct surgery deep inside the human body often without a single cut on the exterior. Blue lasers, developed in 1994 by Shuji Nakamura of Nichia Chemical Industries of Japan, promise even more precision than the dominant red lasers currently used and will further revolutionize surgical cutting techniques.
14. Laser Theory and Operation
Lasers (an acronym for light amplification by stimulated emission of radiation) provide intense, focused beams of light whose unique properties enable them to be employed in a wide range of applications in the modern world. The key idea underlying lasers originated with Albert Einstein, who published a paper in 1916 on Planck’s distribution law, within which he described what happens when additional energy is introduced into an atom. Atoms have a heavy and positively charged nucleus surrounded by groups of extremely light and negatively charged electrons. Electrons orbit the atom in a series of ‘‘fixed’’ levels based upon the degree of electromagnetic attraction between each single electron and the nucleus. Various orbital levels also represent different energy levels. Normally electrons remain as close to the nucleus as their energy level permits, with the consequence that an atom’s overall energy level is minimized. Einstein realized that when energy is introduced into an atom, for example through an atomic collision or through electrical stimulation, one or more electrons become excited and move to a higher energy level. This condition exists temporarily before the electron returns to its former energy level. When this decay phenomenon occurs, a photon of light is emitted. Einstein understood that since the energy transitions within the atom are always identical, the energy and the wavelength of the stimulated photon of light are also predictable; that is, a specific type of transition within an atom will yield a photon of light of a specific wavelength. Hendrik Kramers and Werner Heisenberg obtained a series of more extensive calculations of the effects of these stimulated emissions over the next decade. The first empirical evidence supporting these theoretical calculations occurred between 1926 and 1930 in a series of experiments involving electrical discharges in neon.
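The predictability Einstein identified follows from the Planck relation E = hc/λ: a transition of fixed energy always yields a photon of fixed wavelength. A small sketch; the 1.96 eV value is chosen only to match the familiar red helium-neon laser line and is not from the text:

```python
# Wavelength of the photon emitted by an atomic transition: lambda = h*c / E.

H = 6.626e-34   # Planck's constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electron-volt

def wavelength_nm(transition_ev):
    """Emitted wavelength in nanometres for a transition energy in eV."""
    return H * C / (transition_ev * EV) * 1e9

# A ~1.96 eV transition corresponds to the red helium-neon line near 633 nm:
print(f"{wavelength_nm(1.96):.0f} nm")
```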
15. Lasers in Optoelectronics
Optoelectronics, the field combining optics and electronics, is dependent on semiconductor (diode) lasers for its existence. Mass use of semiconductor lasers has emerged with the advent of CD and DVD technologies, but it is the telecommunications sector that has primarily driven the development of lasers for optoelectronic systems. Lasers are used to transmit voice, data, or video signals down fiber-optic cables.
While the success of lasers within telecommunication systems seems unquestioned thanks to their utility in long-distance large-capacity, point-to-point links, these lasers also find use in many other applications and are ubiquitous in the developed world. Their small physical size, low power operation, ease of modulation (via simple input current variation) and small beam size mean that these lasers are now part of our everyday world, from CDs and DVDs, to supermarket checkouts and cosmetic medicine.
16. Light Emitting Diodes
Light emitting diodes, or LEDs, are semiconductor devices that emit monochromatic light when an electric current passes through them. The color of the light emitted from LEDs depends not on the color of any bulb or lens, but on the emission’s wavelength. Typically made of inorganic compounds such as gallium arsenide, LEDs have found frequent use as ‘‘pilot,’’ or indicator, lights for electronic devices. Unlike incandescent light bulbs, which generate light from ‘‘heat glow,’’ LEDs create light more efficiently and are generally more durable than traditional light sources.
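In indicator service, an LED is normally driven through a series resistor sized as R = (V_supply − V_f) / I_f, so that the resistor, not the diode, sets the current. A minimal sketch with typical assumed values (the forward voltage and current are not from the text):

```python
# Series resistor for an LED indicator: R = (V_supply - V_f) / I_f.
# The 2.0 V forward drop and 10 mA current are typical assumed values.

def series_resistor(v_supply, v_forward, i_forward_a):
    """Resistance that sets the LED's forward current from a fixed supply."""
    return (v_supply - v_forward) / i_forward_a

# A red LED run at 10 mA from a 5 V rail:
print(f"{series_resistor(5.0, 2.0, 0.010):.0f} ohms")
```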
17. Lighting Techniques
In 1900 electric lighting in the home was a rarity. Carbon filament incandescent lamps had been around for 20 years, but few households had electricity. Arc lamps were used in streets and large buildings such as railway stations. Domestic lighting was by candle, oil and gas.
The evolution of lighting techniques passed through the following stages:
- Non-Electric Lighting
- Electric Lighting: Filament Lamps
- Electric Lighting: Discharge Lamps
- Electric Lighting: Fluorescent Lamps
- Electric Lighting: LED Lamps
18. Mechanical and Electromechanical Calculators
The widespread use of calculating devices in the twentieth century is intimately linked to the rise of large corporations and to the increasing role of mathematical calculation in science and engineering. In the business setting, calculators were used to efficiently process financial information. In science and engineering, calculators speeded up routine calculations. The manufacture and sale of calculators was a widespread industry, with major firms in most industrialized nations. However, the manufacture of mechanical calculators declined very rapidly in the 1970s with the introduction of electronic calculators, and firms either diversified into other product lines or went out of business. By the end of the twentieth century, slide rules, adding machines, and other mechanical calculators were no longer being manufactured.
19. Mobile (Cell) Telephones
In the last two decades of the twentieth century, mobile or cell phones developed from a minority communication tool, characterized by its prevalence in the 1980s among young professionals, to a pervasive cultural object. In many developed countries, more than three quarters of the population owned a cell phone by the end of the 20th century.
Cell phone technology is a highly evolved form of the personal radio systems used by truck drivers (citizens band, or CB, radio) and police forces in which receiver/transmitter units communicate with one another or a base antenna. Such systems work adequately over short distances with a low volume of traffic but cannot be expanded to cope with mass communication due to the limited space (bandwidth) available in the electromagnetic spectrum. Transmitting and receiving on one frequency, they allow for talking or listening but not both simultaneously.
For mobile radio systems to make the step up to effective telephony, a large number of two-way conversations needed to be accommodated, requiring a duplex channel (two separate frequencies, taking up double the bandwidth). In order to establish national mobile phone networks without limiting capacity or the range of travel of handsets, a number of technological improvements had to occur.
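The bandwidth arithmetic behind this constraint is simple: each duplex conversation consumes two channel-widths of spectrum, so a fixed allocation supports only allocation / (2 × channel width) simultaneous calls. A sketch with AMPS-like but purely illustrative figures:

```python
# Duplex conversations supported by a spectrum allocation when every call
# needs two frequencies (uplink + downlink). Figures are illustrative.

def duplex_channels(allocation_khz, channel_khz):
    """Number of simultaneous two-way conversations in the allocation."""
    return allocation_khz // (2 * channel_khz)

# A hypothetical 25 MHz allocation divided into 30 kHz channels:
print(duplex_channels(25_000, 30), "duplex channels")
```

This fixed ceiling is what pushed designers toward cellular frequency reuse, in which the same channels are reused in geographically separated cells.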
20. Photocopiers
The photocopier, copier, or copying machine, as it is variously known, is a staple of modern life. Copies by the billions are produced not only in the office but also on machines available to the public in libraries, copy shops, stationery stores, supermarkets, and a wide variety of other commercial facilities. Modern xerographic copiers, produced by a number of manufacturers, are available as desktop models suitable for the home as well as the small office. Many modern copiers reproduce in color as well as black and white, and office models can rival printing presses in speed of operation.
21. Photosensitive Detectors
Sensing radiation from ultraviolet to optical wavelengths and beyond is an important part of many devices. Whether analyzing radiation emitted from chemical solutions, detecting lidar signals, receiving on fiber-optic communication systems, or imaging medical ionizing radiation, detectors are the final link in any optoelectronic experiment or process.
Detectors fall into two groups: thermal detectors (where radiation is absorbed and the resulting temperature change is used to generate an electrical output) and photon (quantum) detectors. The operation of photon detectors is based on the photoelectric effect, in which the radiation is absorbed within a metal or semiconductor by direct interaction with electrons, which are excited to a higher energy level. Under the effect of an electric field these carriers move and produce a measurable electric current. The photon detectors show a selective wavelength-dependent response per unit incident radiation power.
22. Public and Private Lighting
At the turn of the 20th century, lighting was in a state of flux. In technical terms, a number of emerging lighting technologies jostled for economic dominance. In social terms, changing standards of illumination began to transform cities, the workplace, and the home. In design terms, the study of illumination as a science, as an engineering profession, and as an applied art was becoming firmly established. In the last decades of the 20th century, the technological and social choices in lighting attained considerable stability both technically and socially. Newer forms of compact fluorescent lighting, despite their greater efficiency, have not significantly replaced incandescent bulbs in homes owing to higher initial cost. Low-pressure sodium lamps, on the other hand, have been adopted increasingly for street and architectural lighting owing to lower replacement and maintenance costs. As with fluorescent lighting in the 1950s, recent lighting technologies have found niche markets rather than displacing incandescents, which have now been the dominant lighting system for well over a century.
23. Quantum Electronic Devices
Quantum theory, developed during the 1920s to explain the behavior of atoms and the absorption and emission of light, is thought to apply to every kind of physical system, from individual elementary particles to macroscopic systems such as lasers. In lasers, stimulated transitions between discrete or quantized energy levels is a quantum electronic phenomena (discussed in the entry Lasers, Theory and Operation). Stimulated transitions are also the central phenomena in atomic clocks. Semiconductor devices such as the transistor also rely on the arrangement of quantum energy levels into a valence band and a conduction band separated by an energy gap, but advanced quantum semiconductor devices were not possible until advances in fabrication techniques such as molecular beam epitaxy (MBE) developed in the 1960s made it possible to grow extremely pure single crystal semiconductor structures one atomic layer at a time.
In most electronic devices and integrated circuits, quantum phenomena such as quantum tunneling and electron diffraction—where electrons behave not as particles but as waves—are of no significance, since the device is much larger than the wavelength of the electron (around 100 nanometers, where one nanometer is 10^-9 meters, or about 4 atoms wide). Since the early 1980s, however, researchers have been aware that as the overall device size of field effect transistors decreased, small-scale quantum mechanical effects between components, plus the limitations of materials and fabrication techniques, would sooner or later inhibit further reduction in the size of conventional semiconductor transistors. Thus, to produce devices on ever-smaller integrated circuits (down to 25 nanometers in length), conventional microelectronic devices would have to be replaced with new device concepts that take advantage of the quantum mechanical effects that dominate on the nanometer scale, rather than functioning in spite of them. Such solid-state ‘‘nanoelectronics’’ offers the potential for increased speed and density of information processing, but mass fabrication on this small scale presented formidable challenges at the end of the 20th century.
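The electron wavelength invoked here comes from the de Broglie relation λ = h / √(2mE). A sketch for a free electron at thermal energy (about 0.025 eV at room temperature); in a semiconductor the smaller effective mass stretches this toward the ~100 nm scale mentioned in the text:

```python
# De Broglie wavelength of an electron: lambda = h / sqrt(2*m*E).
import math

H = 6.626e-34    # Planck's constant, J*s
M_E = 9.109e-31  # free-electron mass, kg
EV = 1.602e-19   # joules per electron-volt

def de_broglie_nm(energy_ev, mass_kg=M_E):
    """Electron wavelength in nanometres for a kinetic energy in eV."""
    return H / math.sqrt(2 * mass_kg * energy_ev * EV) * 1e9

# Thermal electron (~0.025 eV): a few nanometres for the free-electron mass.
print(f"{de_broglie_nm(0.025):.1f} nm")
```

Once device features shrink toward this wavelength, the wave picture, and with it tunneling and diffraction, can no longer be ignored.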
24. Quartz Clocks and Watches
The wristwatch and the domestic clock were completely reinvented with all-new electronic components beginning about 1960. In the new electronic timepieces, a tiny sliver of vibrating quartz in an electrical circuit provides the time base and replaces the traditional mechanical oscillator, the swinging pendulum in the clock or the balance wheel in the watch. Instead of an unwinding spring or a falling weight, batteries power these quartz clocks and watches, and integrated circuits substitute for intricate mechanical gear trains.
25. Radio-Frequency Electronics
Radio was originally conceived as a means for interpersonal communications, either person-to-person or person-to-people, using analog waveforms containing either Morse code or actual sound. The use of radio frequencies (RF) to carry digital data in the form of binary code rather than voice, and to replace physical wired connections between devices, began in the 1970s, but the technology was not commercialized until the 1990s through digital cellular phone networks known as personal communications services (PCS) and an emerging group of wireless data network technologies just reaching commercial viability. The first of these is a so-called wireless personal area network (WPAN) technology known as Bluetooth. There are also two wireless local area network (WLAN) technologies, generally grouped under the name Wi-Fi (wireless fidelity): (1) Wi-Fi, also known by its Institute of Electrical and Electronics Engineers (IEEE) designation 802.11b, and (2) Wi-Fi5 (802.11a).
26. Rectifiers
Rectifiers are electronic devices that are used to control the flow of current. They do this by having conducting and nonconducting states that depend on the polarity of the applied voltage. A major function in electronics is the conversion from alternating current (AC) to direct current (DC), where the output is only one-half (either positive or negative) of the input. Rectifiers that are currently, or have been, in use include: point-contact diodes, plate rectifiers, thermionic diodes, and semiconductor diodes. There are various ways in which rectifiers may be classified in terms of the signals they encounter; this contribution will consider two extremes—high frequency and heavy current—that make significantly different demands on device design.
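The AC-to-DC conversion described above can be sketched numerically: an ideal half-wave rectifier passes one polarity of the input and blocks the other. A minimal simulation (sample count and amplitude are illustrative):

```python
# Ideal half-wave rectification: keep positive half-cycles, block negative ones.
import math

def half_wave(samples):
    """Output of an ideal diode rectifier for a list of voltage samples."""
    return [max(0.0, v) for v in samples]

# One cycle of a 10 V-peak sine wave, 8 samples:
ac = [10 * math.sin(2 * math.pi * n / 8) for n in range(8)]
dc = half_wave(ac)
print([round(v, 2) for v in dc])  # negative half-cycle clipped to zero
```

A real diode also drops a small forward voltage and leaks slightly in reverse; the ideal model above captures only the polarity-dependent conducting and nonconducting states.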
27. Strobe Flashes
Scarcely a dozen years after photography was announced to the world in 1839, William Henry Fox Talbot produced the first known flash photograph. Talbot, the new art’s co-inventor, fastened a printed paper onto a disk, set it spinning as fast as possible, and then discharged a spark to expose a glass plate negative. The words on the paper could be read on the photograph. Talbot believed that the potential for combining electric sparks and photography was unlimited. In 1852, he pronounced, ‘‘It is in our power to obtain the pictures of all moving objects, no matter in how rapid motion they may be, provided we have the means of sufficiently illuminating them with a sudden electric flash.’’
The electronic stroboscope fulfills Talbot’s prediction. It is a repeating, short-duration light source used primarily for visual observation and photography of high-speed phenomena. The intensity of the light emitted from strobes also makes them useful as signal lights on communication towers, airport runways, emergency vehicles, and more. Though ‘‘stroboscope’’ actually refers to a repeating flash and ‘‘electronic flash’’ denotes a single burst, both types are commonly called ‘‘strobes.’’
28. Transistors
Early experiments in transistor technology were based on the analogy between the semiconductor and the vacuum tube: the ability to both amplify and effectively switch an electrical signal on or off (rectification). By 1940, Russell Ohl at Bell Telephone Laboratories, among others, had found that impure silicon had both positive (p-type, with holes) and negative (n-type) regions. When a junction is created between n-type material and p-type material, electrons on the n-type side are attracted across the junction to fill holes in the other layer. In this way, the n-type semiconductor becomes positively charged and the p-type becomes negatively charged. Holes move in the opposite direction, thus reinforcing the voltage built up at the junction. The key point is that current flows from one side to the other when a positive voltage is applied to the layers (‘‘forward biased’’).
29. Travelling Wave Tubes
One of the most important devices for the amplification of radio-frequency (RF) signals— which range in frequency from 3 kilohertz to 300 gigahertz—is the traveling wave tube (TWT). When matched with its power supply unit, or electronic power conditioner (EPC), the combination is known as a traveling wave tube amplifier (TWTA). The amplification of RF signals is important in many aspects of science and technology, since the ability to increase the strength of a very low-power input signal is fundamental to all types of long-range communications, radar and electronic warfare.
30. Vacuum Tubes/Valves
The vacuum tube has its roots in the late nineteenth century when Thomas A. Edison conducted experiments with electric bulbs in 1883. Edison’s light bulbs consisted of a conducting filament mounted in a glass bulb. Passing electricity through the filament caused it to heat up and radiate light. A vacuum in the tube prevented the filament from burning up. Edison noted that electric current would flow from the bulb filament to a positively charged metal plate inside the tube. This phenomenon, the one-way flow of current, was called the Edison Effect. Edison himself could not explain the filament’s behavior. He felt this effect was interesting but unimportant and patented it as a matter of course. It was only fifteen years later that Joseph John Thomson, a physics professor at the Cavendish Laboratory at the University of Cambridge in the U.K., discovered the electron and understood the significance of what was occurring in the tube. He identified the filament rays as a stream of particles, now called electrons. In a range of papers from 1901 to 1916, O.W. Richardson explained the electron behavior. Today the Edison Effect is known as thermionic emission.
History of Electronics
Few of the basic tasks that electronic technologies perform, such as communication, computation, amplification, or automatic control, are unique to electronics. Most were anticipated by the designers of mechanical or electromechanical technologies in earlier years. What distinguishes electronic communication, computation, and control is often linked to the instantaneous action of the devices, the delicacy of their actions compared to mechanical systems, their high reliability, or their tiny size.
The electronics systems introduced between the late nineteenth century and the end of the twentieth century can be roughly divided into the applications related to communications (including telegraphy, telephony, broadcasting, and remote detection) and the more recently developed fields involving digital information and computation. In recent years these two fields have tended to converge, but it is still useful to consider them separately for a discussion of their history.
The origins of electronics as distinguished from other electrical technologies can be traced to 1880 and the work of Thomas Edison. While investigating the phenomenon of the blackening of the inside surface of electric light bulbs, Edison built an experimental bulb that included a third, unused wire in addition to the two wires supporting the filament. When the lamp was operating, Edison detected a flow of electricity from the filament to the third wire, through the evacuated space in the bulb. He was unable to explain the phenomenon, and although he thought it would be useful in telegraphy, he failed to commercialize it. It went unexplained for about 20 years, until the advent of wireless telegraphic transmission by radio waves. John Ambrose Fleming, an experimenter in radio, not only explained the Edison effect but used it to detect radio waves. Fleming’s ‘‘valve,’’ as he called it, acted like a one-way valve for electric waves, and could be used in a circuit to convert radio waves to electric pulses so that incoming Morse code signals could be heard through a sounder or earphone.
As in the case of the Fleming valve, many early electronic devices were used first in the field of communications, mainly to enhance existing forms of technology. Initially, for example, telephony (1870s) and radio (1890s) were accomplished using ordinary electrical and electromechanical circuits, but eventually both were transformed through the use of electronic devices. Many inventors in the late nineteenth century sought a functional telephone ‘‘relay’’; that is, something to refresh a degraded telephone signal to allow long distance telephony. Several people simultaneously recognized the possibility of developing a relay based on the Fleming valve. The American inventor Lee de Forest was one of the first to announce an electronic amplifier using a modified Fleming valve, which he called the Audion. While he initially saw it as a detector and amplifier of radio waves, its successful commercialization occurred first in the telephone industry. The sound quality and long-distance capability of telephony was enhanced and extended after the introduction of the first electronic amplifier circuits in 1907. In the U.S., where vast geographic distances separated the population, the American Telephone and Telegraph Company (AT&T) introduced improved vacuum tube amplifiers in 1913, which were later used to establish the first coast-to-coast telephone service in 1915 (an overland distance of nearly 5000 kilometers).
These vacuum tubes soon saw many other uses, such as public-address systems, constructed as early as 1920, and radio transmitters and receivers. The convergence of telephony and radio in the form of voice broadcasting was technically possible before the advent of electronics, but its application was greatly enhanced through the use of electronics both in the radio transmitter and in the receiver.
World War I saw the applications of electronics diversify somewhat to include military applications. Mostly, these were modifications of existing telegraph, telephone, and radio systems, but applications such as ground-to-air radio telephony were novel. The pressing need for large numbers of electronic components, especially vacuum tubes suitable for military use, stimulated changes in their design and manufacture and contributed to improving quality and falling prices. After the war, the expanded capacity of the vacuum tube industry contributed to a boom in low-cost consumer radio receivers. Yet because of the withdrawal of the military stimulus and the onset of the Great Depression, the pace of change slowed in the 1930s. One notable exception was in the field of television. Radio broadcasting became such a phenomenal commercial success that engineers and businessmen were envisioning how ‘‘pictures with sound’’ would replace ordinary broadcasting, even in the early 1930s. Germany, Great Britain, and the U.S. all had rudimentary television systems in place by 1939, although World War II would bring nearly a complete halt to these early TV broadcasts.
World War II saw another period of rapid change, this one much more dramatic than that of World War I. Not only were radio communications systems again greatly improved, but for the first time the field of electronics engineering came to encompass much more than communication. While it was the atomic bomb that is most commonly cited as the major technological outcome of World War II, radar should probably be called the weapon that won the war. To describe radar as a weapon is somewhat inaccurate, but there is no doubt that it had profound effects upon the way that naval, aerial, and ground combat was conducted. Using radio waves as a sort of searchlight, radar could act as an artificial eye capable of seeing through clouds or fog, over the horizon, or in the dark. Furthermore, it substituted for existing methods of calculating the distance and speed of targets. Radar’s success hinged on the development of new electronic components, particularly new kinds of vacuum tubes such as the klystron and magnetron, which were oriented toward the generation of microwaves. Subsidized by military agencies on both sides of the Atlantic (as well as Japan) during World War II, radar sets were eventually installed in aircraft and ships, used in ground stations, and even built into artillery shells. The remarkable engineering effort that was launched to make radar systems smaller, more energy efficient, and more reliable would mark the beginning of an international research program in electronics miniaturization that continues today. Radar technology also had many unexpected applications elsewhere, such as the use of microwave beams as a substitute for long-distance telephone cables. Microwave communication is also used extensively today for satellite-to-earth communication.
The second major outcome of electronics research during World War II was the effort to build an electronic computer. Mechanical adders and calculators were widely used in science, business, and government by the early twentieth century, and had reached an advanced state of design. Yet the problems peculiar to wartime, especially the rapid calculation of mountains of ballistics data, drove engineers to look for ways to speed up the machines. At the same time, some sought a calculator that could be reprogrammed as computational needs changed. While computers played a role in the war, it was not until the postwar period that they came into their own. In addition, computer research during World War II contributed little to the development of vacuum tubes, although in later years computer research would drive certain areas of semiconductor electron device research.
While the forces of the free market are not to be discounted, the role of the military in electronics development during World War II was of paramount importance. More-or-less continuous military support for research in electronic devices and systems persisted during the second half of the twentieth century too, and many more new technologies emerged from this effort. The sustained effort to develop more compact, rugged devices such as those demanded by military systems would converge with computer development during the 1950s, especially after the invention of the transistor in late 1947.
The transistor was not a product of the war, and in fact its development started in the 1930s and was delayed by the war effort. A transistor is simply a very small substitute for a vacuum tube, but beyond that it is an almost entirely new sort of device. At the time of its invention, its energy efficiency, reliability, and diminutive size suggested new possibilities for electronic systems. The most famous of these possibilities was related to computers and systems derived from or related to computers, such as robotics or industrial automation. The impetus for the transistor was a desire within the telephone industry to create an energy-efficient, reliable substitute for the vacuum tube. Once introduced, the military pressed hard to accelerate its development, as the need emerged for improved electronic navigational devices for aircraft and missiles.
There were many unanticipated results of the substitution of transistors for vacuum tubes. Because they were so energy efficient, transistors made it much more practical to design battery-powered systems. The small transistor radio (known in some countries simply as "the transistor"), introduced in the 1950s, is credited with helping to popularize rock and roll music. It is also worth noting that many developing countries, lacking central-station electric power, could not easily provide broadcasting services until the diffusion of battery-operated transistor receivers. The use of the transistor also allowed designers to enhance existing automotive radios and tape players, contributing eventually to a greatly expanded culture of in-car listening. There were other important outcomes as well: transistor manufacture provided access to the global electronics market for Asian radio manufacturers, who improved manufacturing methods to undercut their U.S. competitors during the 1950s and 1960s. Further, the transistor's high reliability nearly eliminated the profession of television and radio repair, which had supported tens of thousands of technicians in the U.S. alone before about 1980.
However, for all its remarkable features, the transistor also had its limitations; while it was an essential part of nearly every cutting-edge technology of the postwar period, it was easily outperformed by the older technology of vacuum tubes in some areas. The high-power microwave transmitting devices in communications satellites and spacecraft, for example, nearly all relied on special vacuum tubes through the end of the twentieth century, because of the physical limitations of semiconductor devices. For the most part, however, the transistor made the vacuum tube obsolete by about 1960.
The attention paid to the transistor in the 1950s and 1960s made the phrase "solid-state" familiar to the general public, and the new device spawned many new companies. However, its overall impact pales in comparison to that of its successor, the integrated circuit. Integrated circuits emerged in the late 1950s, were immediately adopted by the military for small computer and communications systems, and were then used in civilian computers and related applications from the 1960s. Integrated circuits consist of multiple transistors fabricated simultaneously from layers of semiconductor and other materials. The transistors, the interconnecting "wires," and many of the necessary circuit elements such as capacitors and resistors are fabricated together on the "chip." Such a circuit eliminates much of the laborious process of assembling an electronic system such as a computer by hand, and results in a much smaller product. The ability to miniaturize components through integrated circuit fabrication techniques would lead to circuits so vanishingly small that it became difficult to connect them to the systems of which they were a part. The plastic housings or "packages" containing today's microprocessor chips measure just a few centimeters on a side, yet the actual circuits inside are much smaller. Some of the most complex chips made today contain many millions of transistors, plus millions more solid-state resistors and other passive components.
While used extensively in military and aerospace applications, the integrated circuit became famous as a component in computer systems. The logic and memory circuits of digital computers, which have been the focus of much research, consist mainly of switching devices. Computers were first constructed in the 1930s with electromechanical relays as switching devices, then with vacuum tubes, transistors, and finally integrated circuits. Most early computers used off-the-shelf tubes and transistors, but with the advent of the integrated circuit, designers began to call for components designed especially for computers. It was clear to engineers at the time that all the circuits necessary to build a computer could be placed on one chip (or a small set of chips), and in fact the desire to create a "computer on a chip" led to the microprocessor, introduced around 1970. The commercial impetus underlying later generations of computer chip design was not simply miniaturization or energy efficiency (although there are important exceptions), but also speed of operation, reliability, and lower cost. Nevertheless, the inherent energy efficiency and small size of the resulting systems did enable the construction of smaller computers, and the incorporation of programmable controllers (special-purpose computers) into a wide variety of other technologies. The recent merging of the computer (or computer-like systems) with so many other technologies makes it difficult to summarize the current status of digital electronic systems. As the twentieth century drew to a close, computer chips were widely in use in communications and entertainment devices, in industrial robots, in automobiles, in household appliances, in telephone calling cards, in traffic signals, and in a myriad of other places. The rapid evolution of the computer during the last 50 years of the twentieth century was reflected in the near-meaninglessness of its name, which no longer adequately described its functions.
From an engineering perspective, not only did electronics begin to inhabit, in an almost symbiotic fashion, other technological systems after about 1950, but these electronics systems were increasingly dominated by the use of semiconductor technology. After virtually supplanting the vacuum tube in the 1950s, the semiconductor-based transistor became the technology of choice for most subsequent electronics development projects. Yet semiconducting alloys and compounds proved remarkably versatile in applications at first unrelated to transistors and chips. The laser, for example, was originally operated in a large vacuum chamber and depended on ionized gas for its operation. By the 1960s, laser research was focused on the remarkable ability of certain semiconducting materials to accomplish the same task as the ion chamber version. Today semiconductor devices are used not only as the basis of amplifiers and switches, but also for sensing light, heat, and pressure, for emitting light (as in lasers or video displays), for generating electricity (as in solar cells), and even for mechanical motion (as in micromechanical systems or MEMS).
However, semiconductor devices in "discrete" forms such as individual transistors would probably not have had the remarkable impact of the integrated circuit. By the 1970s, when manufacturing techniques for integrated circuits allowed high-volume production, low cost, tiny size, relatively small energy needs, and enormous complexity, electronics entered a new phase of its history, the chief characteristic of which was that electronic systems could be retrofitted into existing technologies. Low-cost microprocessors, for example, which were available from the late 1970s onward, were used to sense data from their environment, measure it, and use it to control various technological systems from coffee machines to videotape recorders. Even the human body is increasingly invaded by electronics; at the end of the twentieth century, several researchers announced the first microchips for implantation directly in the body. They were to be used to store information for retrieval by external sensors or to help deliver subcutaneous drugs. The integrated circuit has thus become part of innumerable technological and biological systems.
It is this remarkable flexibility of application that enabled designers of electronic systems to make electronics the defining technology of the late twentieth century, eclipsing both the mechanical technologies associated with the industrial revolution and the electrical and information technologies of the so-called second industrial revolution. While many in the post-World War II era once referred to an "atomic age," it was in fact an era in which daily life was increasingly dominated by electronics.
American Research Journal of Electronics and Communication Engineering
[email protected] | ISSN-2643-3486
American Research Journal of Electronics and Communication Engineering is an international, peer-reviewed, open-access online journal. Electronics and Communication Engineering involves the study of electronic devices, circuits, communication equipment such as transmitters and receivers, integrated circuits (ICs), analog and digital transmission and reception of data, voice, and video (for example, AM, FM, and DTH), microprocessors, satellite communication, microwave engineering, antennas, and wave propagation. This journal publishes research on topics such as solid-state devices, VLSI, RF engineering, digital signal processing, image processing, and wireless technology. The journal welcomes and publishes insightful research articles related to electronics and communication in the form of original articles, review articles, case reports, short reviews, short notes, etc.
Given below are some of the key topics of this journal (the list is not exhaustive).
- Antennas and radio propagation
- Applied Electromagnetics and RF Circuits
- Circuit design
- Computer Vision
- Control Systems
- Electromagnetic compatibility
- Embedded Systems
- Energy Science and Engineering
- Engineering Education Research
- GPS and GIS Technology
- Integrated Circuits and VLSI
- Measurement and instrumentation
- MEMS and Microsystems
- Optics and Photonics
- Plasma Science and Engineering
- Power and Energy
- Quantum Science and Technology
- Radio and satellite communications
- Robotics and Autonomous Systems
- Signal & Image Processing and Machine Learning
- Signal and image processing coding
- Simulation and CAD
- Solid-state Devices and Nanotechnology
- Sonar and navigation systems
- Telecommunication networks
- TV and sound broadcasting
Dr. Navid Asadizanjani, Ph.D.
Dr. M. Chandra Shekar
Venkata Ragahavendra Miriampally
Dr. Gagan Singh
Dr. Kumar Keshamoni
Dr. P. V. Rao