
History of computing


A computer might be described with deceptive simplicity as “an apparatus that performs routine calculations automatically.” Such a definition would owe its deceptiveness to a naive and narrow view of calculation as a strictly mathematical process. In fact, calculation underlies many activities that are not normally thought of as mathematical. Walking across a room, for instance, requires many complex, albeit subconscious, calculations. Computers, too, have proved capable of solving a vast array of problems, from balancing a checkbook to even—in the form of guidance systems for robots—walking across a room.


Before the true power of computing could be realized, therefore, the naive view of calculation had to be overcome. The inventors who labored to bring the computer into the world had to learn that the thing they were inventing was not just a number cruncher, not merely a calculator. For example, they had to learn that it was not necessary to invent a new computer for every new calculation and that a computer could be designed to solve numerous problems, even problems not yet imagined when the computer was built. They also had to learn how to tell such a general problem-solving computer what problem to solve. In other words, they had to invent programming.

They had to solve all the heady problems of developing such a device, of implementing the design, of actually building the thing. The history of the solving of these problems is the history of the computer. That history is covered in this section, and links are provided to entries on many of the individuals and companies mentioned. In addition, see the articles computer science and supercomputer.

Early history

Computer precursors

The earliest known calculating device is probably the abacus. It dates back at least to 1100 BCE and is still in use today, particularly in Asia. Now, as then, it typically consists of a rectangular frame with thin parallel rods strung with beads. Long before any systematic positional notation was adopted for the writing of numbers, the abacus assigned different units, or weights, to each rod. This scheme allowed a wide range of numbers to be represented by just a few beads and, together with the invention of zero in India, may have inspired the invention of the Hindu-Arabic number system. In any case, abacus beads can be readily manipulated to perform the common arithmetical operations—addition, subtraction, multiplication, and division—that are useful for commercial transactions and in bookkeeping.

The abacus is a digital device; that is, it represents values discretely. A bead is either in one predefined position or another, representing unambiguously, say, one or zero.

Calculating devices took a different turn when John Napier, a Scottish mathematician, published his discovery of logarithms in 1614. As any person can attest, adding two 10-digit numbers is much simpler than multiplying them together, and the transformation of a multiplication problem into an addition problem is exactly what logarithms enable. This simplification is possible because of the following logarithmic property: the logarithm of the product of two numbers is equal to the sum of the logarithms of the numbers. By 1624, tables with 14 significant digits were available for the logarithms of numbers from 1 to 20,000, and scientists quickly adopted the new labor-saving tool for tedious astronomical calculations.
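The trick is easy to restate in modern terms. Here is a minimal Python sketch (the language is an editorial choice for illustration) that multiplies two 10-digit numbers using only one addition plus the logarithm and antilog lookups that printed tables once supplied:

    import math

    # log10(a * b) == log10(a) + log10(b): a multiplication becomes
    # a single addition between two table lookups.
    a, b = 3_141_592_653, 2_718_281_828        # two 10-digit numbers

    log_sum = math.log10(a) + math.log10(b)    # the only "hard" arithmetic
    product = 10 ** log_sum                    # antilog (a reverse table lookup)

    print(f"via logarithms: {product:.6e}")
    print(f"exact product:  {a * b:.6e}")

The result agrees with the exact product to roughly the precision the logarithms themselves carry, which is why the 14-digit tables of 1624 were such prized working tools.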

Most significant for the development of computing, the transformation of multiplication into addition greatly simplified the possibility of mechanization. Analog calculating devices based on Napier's logarithms—representing digital values with analogous physical lengths—soon appeared. In 1620 Edmund Gunter, the English mathematician who coined the terms cosine and cotangent, built a device for performing navigational calculations: the Gunter scale, or, as navigators simply called it, the gunter. About 1632 an English clergyman and mathematician named William Oughtred built the first slide rule, drawing on Napier's ideas. That first slide rule was circular, but Oughtred also built the first rectangular one in 1633. The analog devices of Gunter and Oughtred had various advantages and disadvantages compared with digital devices such as the abacus. What is important is that the consequences of these design decisions were being tested in the real world.

Calculating Clock

In 1623 the German astronomer and mathematician Wilhelm Schickard built the first calculator. He described it in a letter to his friend the astronomer Johannes Kepler, and in 1624 he wrote again to explain that a machine he had commissioned to be built for Kepler was, apparently along with the prototype, destroyed in a fire. He called it a Calculating Clock, which modern engineers have been able to reproduce from details in his letters. Even general knowledge of the clock had been temporarily lost when Schickard and his entire family perished during the Thirty Years' War.

But Schickard may not have been the true inventor of the calculator. A century earlier, Leonardo da Vinci sketched plans for a calculator that were sufficiently complete and correct for modern engineers to build a working calculator from them.

Arithmetic Machine, or Pascaline

The first calculator or adding machine to be produced in any quantity and actually used was the Pascaline, or Arithmetic Machine, designed and built by the French mathematician-philosopher Blaise Pascal between 1642 and 1644. It could only do addition and subtraction, with numbers being entered by manipulating its dials. Pascal invented the machine for his father, a tax collector, so it was the first business machine too (if one does not count the abacus). He built 50 of them over the next 10 years.

Step Reckoner

In 1671 the German mathematician-philosopher Gottfried Wilhelm von Leibniz designed a calculating machine called the Step Reckoner. (It was first built in 1673.) The Step Reckoner expanded on Pascal's ideas and did multiplication by repeated addition and shifting.
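The Step Reckoner's method survives today as the shift-and-add algorithm. The Python sketch below models the idea, not Leibniz's actual gearwork: for each decimal digit of the multiplier it performs that many additions, then shifts one decimal place and moves on:

    def step_reckoner_multiply(multiplicand: int, multiplier: int) -> int:
        """Multiply two non-negative integers by repeated addition and shifting."""
        total = 0
        while multiplier > 0:
            digit = multiplier % 10        # lowest remaining decimal digit
            for _ in range(digit):
                total += multiplicand      # repeated addition, 'digit' times
            multiplicand *= 10             # shift one decimal place left
            multiplier //= 10              # discard the digit just handled
        return total

    assert step_reckoner_multiply(694, 27) == 694 * 27

On the machine itself, each turn of the crank performed one of those additions, and moving the carriage performed the shift.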

Leibniz was a strong advocate of the binary number system. Binary numbers are ideal for machines because they require only two digits, which can easily be represented by the on and off states of a switch. When computers became electronic, the binary system was particularly appropriate because an electrical circuit is either on or off. This meant that on could represent true, off could represent false, and the flow of current would directly represent the flow of logic.
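A handful of switch states is enough to show both readings. This small Python sketch (purely illustrative) interprets eight on/off positions first as a binary number and then as truth values:

    switches = [1, 0, 1, 1, 0, 1, 0, 0]    # on = 1, off = 0; most significant first

    value = 0
    for bit in switches:
        value = value * 2 + bit            # accumulate the binary number
    print(value)                           # 180

    print([bool(bit) for bit in switches]) # the same states read as True/False

The same physical states carry both the arithmetic and the logic, which is exactly the economy that made binary attractive for machines.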

Leibniz was prescient in seeing the appropriateness of the binary system in calculating machines, but his machine did not use it. Instead, the Step Reckoner represented numbers in decimal form, as positions on 10-position dials. Even decimal representation was not a given: in 1668 Samuel Morland invented an adding machine specialized for British money—a decidedly nondecimal system.

Pascal’s, Leibniz’s, and Morland’s devices were curiosities, but with the Industrial Revolution of the 18th century came a widespread need to perform repetitive operations efficiently. With other activities being mechanized, why not calculation? In 1820 Charles Xavier Thomas de Colmar of France effectively met this challenge when he built his Arithmometer, the first commercial mass-produced calculating device. It could perform addition, subtraction, multiplication, and, with some more elaborate user involvement, division. Based on Leibniz's technology, it was extremely popular and sold for 90 years. In contrast to the modern calculator's credit-card size, the Arithmometer was large enough to cover a desktop.

Calculators such as the Arithmometer remained a fascination after 1820, and their potential for commercial use was well understood. Many other mechanical devices built during the 19th century also performed repetitive functions more or less automatically, but few had any application to computing. There was one major exception: the Jacquard loom, invented in 1804–05 by a French weaver, Joseph-Marie Jacquard.

Jacquard loom

The Jacquard loom was a marvel of the Industrial Revolution. A textile-weaving loom, it could also be called the first practical information-processing device. The loom worked by tugging various-colored threads into patterns by means of an array of rods. By inserting a card punched with holes, an operator could control the motion of the rods and thereby alter the pattern of the weave. Moreover, the loom was equipped with a card-reading device that slipped a new card from a pre-punched deck into place every time the shuttle was thrown, so that complex weaving patterns could be automated.

What was extraordinary about the device was that it transferred the design process from a labor-intensive weaving stage to a card-punching stage. Once the cards had been punched and assembled, the design was complete, and the loom implemented the design automatically. The Jacquard loom, therefore, could be said to be programmed for different patterns by these decks of punched cards.

For those intent on mechanizing calculations, the Jacquard loom provided important lessons: the sequence of operations that a machine performs could be controlled to make the machine do something quite different; a punched card could be used as a medium for directing the machine; and, most important, a device could be directed to perform different tasks by feeding it instructions in a sort of language—i.e., making the machine programmable.
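A toy simulation makes those lessons concrete. In the Python sketch below, which is an illustration rather than a model of the real mechanism, each "card" is a row of holes, the deck is the program, and the "loom" executes whatever deck it is fed:

    deck = [                       # 1 = hole punched (thread lifted), 0 = no hole
        [1, 0, 0, 1, 0, 0, 1, 0],
        [0, 1, 0, 0, 1, 0, 0, 1],
        [0, 0, 1, 0, 0, 1, 0, 0],
    ]

    def weave(deck):
        for card in deck:          # one card per throw of the shuttle
            print("".join("X" if hole else "." for hole in card))

    weave(deck)                    # swap in a different deck and the pattern
                                   # changes; the machine itself is untouched

Changing the weave means punching a new deck, not rebuilding the loom, which is precisely the separation of machine and program described above.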

It is not too great a stretch to say that, in the Jacquard loom, programming was invented before the computer. The close relationship between the device and the program became apparent some 20 years later, with Charles Babbage’s invention of the first computer.


The Emergence of First Generation Computers, 1946-1959

In late 1945 researchers at the Moore School of the University of Pennsylvania powered up a machine that was 100 feet long, 10 feet high, and 3 feet deep. It contained 17,000 vacuum tubes, about 70,000 resistors, 10,000 capacitors, and 6,000 switches. The Electronic Numerical Integrator and Calculator (ENIAC) was the first fully electronic computer. It was designed and developed by J. Presper Eckert and John W. Mauchly under the contract supervision of Lieutenant Herman Goldstine, [25] with funding from the Army's Ballistic Research Laboratory at Aberdeen. ENIAC had a serious defect. Although it could compute several hundred times faster than an electro-mechanical or relay-type machine, [26] the computer required rewiring with each new problem, consuming from 30 minutes to a full day. [27] Nevertheless, in instantiating the first electronic computer, Eckert, Mauchly, and Goldstine advanced the trajectory of computers beyond being just an idea.

Upon completing the design of ENIAC, the team faced a new challenge: to design the next computer, one significantly better, preferably without the impossible constraint of rewiring. In the summer of 1944, Lieutenant Goldstine, by chance, encountered John von Neumann while waiting for a train. Goldstine began to tell von Neumann, the foremost applied mathematician of his time, about ENIAC. Von Neumann was himself involved in a number of secret projects needing more computation than was then available, yet he did not know of ENIAC.

On August 7, 1944, von Neumann visited the Moore School and began contributing ideas immediately. The collaboration between von Neumann, Eckert, Mauchly, and Goldstine proved fruitful when they hit upon the concept of storing the logic instructions in memory: the "stored-program" computer. [28] Instead of manually resetting switches, or worse yet, rewiring, to set up the calculations of a new problem, the programmer could modify the program arithmetically. [29] With these new architectural ideas, they designed ENIAC's successor, the EDVAC (Electronic Discrete Variable Automatic Calculator).
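The stored-program idea is small enough to capture in a few lines. The Python sketch below uses an instruction encoding (opcode*100 + address) invented here purely for illustration; the point is that instructions live in the same memory as data, so reprogramming is arithmetic on memory rather than rewiring:

    LOAD, ADD, STORE, HALT = 1, 2, 3, 0    # toy opcodes for this sketch

    memory = [
        1 * 100 + 8,    # 0: LOAD  the value at address 8 into the accumulator
        2 * 100 + 9,    # 1: ADD   the value at address 9
        3 * 100 + 10,   # 2: STORE the accumulator at address 10
        0,              # 3: HALT
        0, 0, 0, 0,     # 4-7: unused
        5, 7, 0,        # 8-10: data, data, result
    ]

    def run(memory):
        acc, pc = 0, 0
        while True:
            op, addr = divmod(memory[pc], 100)   # fetch and decode
            pc += 1
            if op == LOAD:    acc = memory[addr]
            elif op == ADD:   acc += memory[addr]
            elif op == STORE: memory[addr] = acc
            else:             return             # HALT

    run(memory)
    print(memory[10])             # 12

    memory[1] = LOAD * 100 + 9    # change the program by changing a number
    run(memory)
    print(memory[10])             # 7: same machine, new behavior, no rewiring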

Elsewhere, Thomas J. Watson Jr. had rejoined IBM after serving in the Air Force. In early 1946, Eckert and Mauchly gave Watson Jr. a tour of the Moore School and ENIAC. Although Watson Jr. sensed that Eckert and Mauchly thought they had a product with which to best IBM, he was unimpressed. "The truth was that I reacted to ENIAC the way some people probably reacted to the Wright brothers' airplane: it didn't move me at all. I can't imagine why I didn't think, 'Good God, that's the future of the IBM company.'" [30]

Funding the development of computer systems, such as the ENIAC and EDVAC, represented only one way the government advanced understanding of computer technologies. Government agencies also sponsored meetings and courses, with the objectives of theory development, diffusion of knowledge, and training of personnel needed to explore the computer trajectory. Diffusing the growing knowledge and practice of computers ensured the government a future base of personnel to design and build the computers the military would need.

During the summer of 1946, the Moore School held a six-week course entitled "Theory and Techniques for the Design of Electronic Digital Computers," sponsored by the Office of Naval Research and the Army Ordnance Department. [31] Six months later, the Navy sponsored a four-day conference at Harvard attended by 350 people. [32] Representatives of government agencies, universities, and a wide range of companies attended both of these significant events. Although many companies were learning about computers, none of them would be the first to act; that honor belongs to a start-up formed by Eckert and Mauchly in March 1946.

In 1946, the University of Pennsylvania dismissed Eckert and Mauchly because of their interest in commercializing the ENIAC and EDVAC. The two perceived an economic opportunity in selling computers and together formed the Electronic Control Company. The Census Bureau awarded them their first contract in June, beating out Raytheon. In 1947, the company changed its name to the Eckert-Mauchly Computer Corporation.

Two additional contracts to build computers were signed – with A.C. Nielsen and the Prudential Life Insurance Company – bringing badly needed cash advances to fund continuing product development. Henry Straus, a Delaware racetrack owner and vice president of American Totalizer, committed to invest half a million dollars for 40 percent of Eckert-Mauchly common stock. But Straus died in an airplane crash, and local financial institutions refused to honor the notes Straus had given the company. By 1949, the company verged on bankruptcy, despite having three contracts. [33]

Eckert and Mauchly contacted everyone they knew who might be interested in either funding them or acquiring them: NCR, Remington Rand, IBM, Philco, Burroughs, Hughes Aircraft, and others. They agreed to be acquired by Remington Rand in February 1950. Remington Rand then attempted to cancel the three contracts. Nielsen and Prudential reluctantly agreed, but the Census Bureau refused, insisting that Remington Rand deliver the agreed-upon computer. That refusal influenced the economic history of computers by forcing product instantiation.

In the spring of 1951, the Eckert-Mauchly division of Remington Rand shipped the first UNIVAC I to the Census Bureau. The next five UNIVAC Is shipped to the government as well: two to the Atomic Energy Commission and one each to the Air Force, Army, and Navy. [34]

Another start-up, however, captured the honor of shipping the first commercial computer. The founders of Engineering Research Associates (ERA), after resigning from the Naval Communications Supplementary Activity, signed a contract with the Navy to develop the Atlas I computer, with the right to resell the same technology commercially. The Atlas I, sold commercially as the ERA 1101, shipped in December 1950.

In 1952, Remington Rand, repeating a pattern of organizational development it had used in office machinery, acquired its second computer firm, ERA. This time Remington Rand owned the two leading firms, not firms that sought acquisition because they had trouble competing. Remington Rand controlled the emerging computer market-structure, yet remained a distant second to IBM in office machinery, especially tabulating equipment. In 1954, Remington Rand sold its first UNIVAC to a commercial customer: General Electric.

Notes

25. Goldstine held a Ph.D. and participated actively in the project.

26. Richard R. Nelson, "The Computer Industry," in Government and Technical Progress: A Cross-Industry Analysis, Pergamon Press, 1982, p. 165.

27. George Schussel, "IBM and REMRAND," Datamation, May 1965, p. 55.

28. Stored programming laid the beginnings of software, both as component and architecture.

29. Nelson, p. 167.

30. Thomas J. Watson Jr., Father, Son & Co., Bantam Books, 1990, p. 143.

31. Nelson, p. 167.

32. Ibid., p. 168.

33. Nelson, pp. 169-170.

34. Nelson, p. 170.



A brief history of computers

by Chris Woodford. Last updated: January 19, 2023.

Computers truly came into their own as great inventions in the last two decades of the 20th century. But their history stretches back more than 2500 years to the abacus: a simple calculator made from beads and wires, which is still used in some parts of the world today. The difference between an ancient abacus and a modern computer seems vast, but the principle—making repeated calculations more quickly than the human brain—is exactly the same.

Read on to learn more about the history of computers—or take a look at our article on how computers work.


Photo: A model of one of the world's first computers (the Difference Engine invented by Charles Babbage) at the Computer History Museum in Mountain View, California, USA. Photo by Cory Doctorow published on Flickr in 2020 under a Creative Commons (CC BY-SA 2.0) licence.

Cogs and Calculators

It is a measure of the brilliance of the abacus, invented in the Middle East circa 500 BC, that it remained the fastest form of calculator until the middle of the 17th century. Then, in 1642, aged only 18, French scientist and philosopher Blaise Pascal (1623–1662) invented the first practical mechanical calculator, the Pascaline, to help his tax-collector father do his sums. The machine had a series of interlocking cogs (gear wheels with teeth around their outer edges) that could add and subtract decimal numbers. Several decades later, in 1671, German mathematician and philosopher Gottfried Wilhelm Leibniz (1646–1716) came up with a similar but more advanced machine. Instead of using cogs, it had a "stepped drum" (a cylinder with teeth of increasing length around its edge), an innovation that survived in mechanical calculators for 300 years. The Leibniz machine could do much more than Pascal's: as well as adding and subtracting, it could multiply, divide, and work out square roots. Another pioneering feature was the first memory store or "register."

Apart from developing one of the world's earliest mechanical calculators, Leibniz is remembered for another important contribution to computing: he was the man who invented binary code, a way of representing any decimal number using only the two digits zero and one. Although Leibniz made no use of binary in his own calculator, it set others thinking. In 1854, a little over a century after Leibniz had died, Englishman George Boole (1815–1864) used the idea to invent a new branch of mathematics called Boolean algebra. [1] In modern computers, binary code and Boolean algebra allow computers to make simple decisions by comparing long strings of zeros and ones. But, in the 19th century, these ideas were still far ahead of their time. It would take another 50–100 years for mathematicians and computer scientists to figure out how to use them (find out more in our articles about calculators and logic gates).
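Those simple decisions are one-line operations today. The Python sketch below (illustrative only) applies Boole's operations across whole strings of zeros and ones at once, using bitwise operators:

    a = 0b110010                  # 50
    b = 0b101010                  # 42

    print(bin(a & b))             # 0b100010 : AND, 1 only where both bits are 1
    print(bin(a | b))             # 0b111010 : OR, 1 where either bit is 1
    print(bin(a ^ b))             # 0b11000  : XOR, 1 where the bits differ

    print(a == b)                 # False: a simple decision comparing bit patterns

Every branch a computer takes ultimately reduces to comparisons like these, carried out by logic gates wired to implement Boolean algebra.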

Artwork: Pascaline: Two details of Blaise Pascal's 17th-century calculator. Left: The "user interface": the part where you dial in numbers you want to calculate. Right: The internal gear mechanism. Picture courtesy of US Library of Congress.

Engines of Calculation

Neither the abacus, nor the mechanical calculators constructed by Pascal and Leibniz really qualified as computers. A calculator is a device that makes it quicker and easier for people to do sums—but it needs a human operator. A computer, on the other hand, is a machine that can operate automatically, without any human help, by following a series of stored instructions called a program (a kind of mathematical recipe). Calculators evolved into computers when people devised ways of making entirely automatic, programmable calculators.

Photo: Punched cards: Herman Hollerith perfected the way of using punched cards and paper tape to store information and feed it into a machine. Here's a drawing from his 1889 patent Art of Compiling Statistics (US Patent #395,782), showing how a strip of paper (yellow) is punched with different patterns of holes (orange) that correspond to statistics gathered about people in the US census. Picture courtesy of US Patent and Trademark Office.

The first person to attempt this was a rather obsessive, notoriously grumpy English mathematician named Charles Babbage (1791–1871). Many regard Babbage as the "father of the computer" because his machines had an input (a way of feeding in numbers), a memory (something to store these numbers while complex calculations were taking place), a processor (the number-cruncher that carried out the calculations), and an output (a printing mechanism)—the same basic components shared by all modern computers. During his lifetime, Babbage never completed a single one of the hugely ambitious machines that he tried to build. That was no surprise. Each of his programmable "engines" was designed to use tens of thousands of precision-made gears. It was like a pocket watch scaled up to the size of a steam engine, a Pascal or Leibniz machine magnified a thousand-fold in dimensions, ambition, and complexity. For a time, the British government financed Babbage—to the tune of £17,000, then an enormous sum. But when Babbage pressed the government for more money to build an even more advanced machine, they lost patience and pulled out. Babbage was more fortunate in receiving help from Augusta Ada Byron (1815–1852), Countess of Lovelace, daughter of the poet Lord Byron. An enthusiastic mathematician, she helped to refine Babbage's ideas for making his machine programmable—and this is why she is still, sometimes, referred to as the world's first computer programmer. [2] Little of Babbage's work survived after his death. But when, by chance, his notebooks were rediscovered in the 1930s, computer scientists finally appreciated the brilliance of his ideas. Unfortunately, by then, most of these ideas had already been reinvented by others.

Artwork: Charles Babbage (1791–1871). Picture from The Illustrated London News, 1871, courtesy of US Library of Congress.

Babbage had intended that his machine would take the drudgery out of repetitive calculations. Originally, he imagined it would be used by the army to compile the tables that helped their gunners to fire cannons more accurately. Toward the end of the 19th century, other inventors were more successful in their effort to construct "engines" of calculation. American statistician Herman Hollerith (1860–1929) built one of the world's first practical calculating machines, which he called a tabulator, to help compile census data. Then, as now, a census was taken each decade but, by the 1880s, the population of the United States had grown so much through immigration that a full-scale analysis of the data by hand was taking seven and a half years. The statisticians soon figured out that, if trends continued, they would run out of time to compile one census before the next one fell due. Fortunately, Hollerith's tabulator was an amazing success: it tallied the entire census in only six weeks and completed the full analysis in just two and a half years. Soon afterward, Hollerith realized his machine had other applications, so he set up the Tabulating Machine Company in 1896 to manufacture it commercially. A few years later, it changed its name to the Computing-Tabulating-Recording (C-T-R) company and then, in 1924, acquired its present name: International Business Machines (IBM).

Photo: Keeping count: Herman Hollerith's late-19th-century census machine (blue, left) could process 12 separate bits of statistical data each minute. Its compact 1940 replacement (red, right), invented by Eugene M. La Boiteaux of the Census Bureau, could work almost five times faster. Photo by Harris & Ewing courtesy of US Library of Congress.

Bush and the bomb

Photo: Dr Vannevar Bush (1890–1974). Picture by Harris & Ewing, courtesy of US Library of Congress.

The history of computing remembers colorful characters like Babbage, but others who played important—if supporting—roles are less well known. At the time when C-T-R was becoming IBM, the world's most powerful calculators were being developed by US government scientist Vannevar Bush (1890–1974). In 1925, Bush made the first of a series of unwieldy contraptions with equally cumbersome names: the New Recording Product Integraph Multiplier. Later, he built a machine called the Differential Analyzer, which used gears, belts, levers, and shafts to represent numbers and carry out calculations in a very physical way, like a gigantic mechanical slide rule. Bush's ultimate calculator was an improved machine named the Rockefeller Differential Analyzer, assembled in 1935 from 320 km (200 miles) of wire and 150 electric motors. Machines like these were known as analog calculators—analog because they stored numbers in a physical form (as so many turns on a wheel or twists of a belt) rather than as digits. Although they could carry out incredibly complex calculations, it took several days of wheel cranking and belt turning before the results finally emerged.

Impressive machines like the Differential Analyzer were only one of several outstanding contributions Bush made to 20th-century technology. Another came as the teacher of Claude Shannon (1916–2001), a brilliant mathematician who figured out how electrical circuits could be linked together to process binary code with Boolean algebra (a way of comparing binary numbers using logic) and thus make simple decisions. During World War II, President Franklin D. Roosevelt appointed Bush chairman first of the US National Defense Research Committee and then director of the Office of Scientific Research and Development (OSRD). In this capacity, he was in charge of the Manhattan Project, the secret $2-billion initiative that led to the creation of the atomic bomb. One of Bush's final wartime contributions was to sketch out, in 1945, an idea for a memory-storing and sharing device called Memex that would later inspire Tim Berners-Lee to invent the World Wide Web. [3] Few outside the world of computing remember Vannevar Bush today—but what a legacy! As a father of the digital computer, an overseer of the atom bomb, and an inspiration for the Web, Bush played a pivotal role in three of the 20th century's most far-reaching technologies.

Photo: "A gigantic mechanical slide rule": A differential analyzer pictured in 1938. Picture courtesy of and © University of Cambridge Computer Laboratory, published with permission via Wikimedia Commons under a Creative Commons (CC BY 2.0) licence.

Turing—tested

The first modern computers

The World War II years were a crucial period in the history of computing, when powerful gargantuan computers began to appear. Just before the outbreak of the war, in 1938, German engineer Konrad Zuse (1910–1995) constructed his Z1, the world's first programmable binary computer, in his parents' living room. [4] The following year, American physicist John Atanasoff (1903–1995) and his assistant, electrical engineer Clifford Berry (1918–1963), built a more elaborate binary machine that they named the Atanasoff Berry Computer (ABC). It was a great advance—1000 times more accurate than Bush's Differential Analyzer. These were the first machines that used electrical switches to store numbers: when a switch was "off", it stored the number zero; flipped over to its other, "on", position, it stored the number one. Hundreds or thousands of switches could thus store a great many binary digits (although binary is much less efficient in this respect than decimal, since it takes up to ten binary digits to store a three-digit decimal number). These machines were digital computers: unlike analog machines, which stored numbers using the positions of wheels and rods, they stored numbers as digits.
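The bookkeeping in that parenthetical is easy to check. These few Python lines (a quick illustration) compute how many binary switches are needed to hold every n-digit decimal number:

    import math

    for n in range(1, 6):
        largest = 10 ** n - 1                       # biggest n-digit decimal number
        bits = math.ceil(math.log2(largest + 1))    # switches needed to reach it
        print(f"{n} decimal digits -> {bits} bits (largest value {largest})")

Three decimal digits need ten switches because reaching 999 requires the ten bits 1111100111.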

The first large-scale digital computer of this kind appeared in 1944 at Harvard University, built by mathematician Howard Aiken (1900–1973). Sponsored by IBM, it was variously known as the Harvard Mark I or the IBM Automatic Sequence Controlled Calculator (ASCC). A giant of a machine, stretching 15m (50ft) in length, it was like a huge mechanical calculator built into a wall. It must have sounded impressive, because it stored and processed numbers using "clickety-clack" electromagnetic relays (electrically operated magnets that automatically switched lines in telephone exchanges)—no fewer than 3304 of them. Impressive they may have been, but relays suffered from several problems: they were large (that's why the Harvard Mark I had to be so big); they needed quite hefty pulses of power to make them switch; and they were slow (it took time for a relay to flip from "off" to "on" or from 0 to 1).

Photo: An analog computer being used in military research in 1949. Picture courtesy of NASA on the Commons (where you can download a larger version).

Most of the machines developed around this time were intended for military purposes. Like Babbage's never-built mechanical engines, they were designed to calculate artillery firing tables and chew through the other complex chores that were then the lot of military mathematicians. During World War II, the military co-opted thousands of the best scientific minds: recognizing that science would win the war, Vannevar Bush's Office of Scientific Research and Development employed 10,000 scientists from the United States alone. Things were very different in Germany. When Konrad Zuse offered to build his Z2 computer to help the army, they couldn't see the need—and turned him down.

On the Allied side, great minds began to make great breakthroughs. In 1943, a team of mathematicians based at Bletchley Park near London, England (including Alan Turing) built a computer called Colossus to help them crack secret German codes. Colossus was the first fully electronic computer. Instead of relays, it used a better form of switch known as a vacuum tube (also known, especially in Britain, as a valve). The vacuum tube, each one about as big as a person's thumb (earlier ones were very much bigger) and glowing red hot like a tiny electric light bulb, had been invented in 1906 by Lee de Forest (1873–1961), who named it the Audion. The invention earned de Forest his nickname as "the father of radio" because the tubes' first major use was in radio receivers, where they amplified weak incoming signals so people could hear them more clearly. [5] In computers such as the ABC and Colossus, vacuum tubes found an alternative use as faster and more compact switches.

Just like the codes it was trying to crack, Colossus was top-secret and its existence wasn't confirmed until after the war ended. As far as most people were concerned, vacuum tubes were pioneered by a more visible computer that appeared in 1946: the Electronic Numerical Integrator And Calculator (ENIAC). The ENIAC's inventors, two scientists from the University of Pennsylvania, John Mauchly (1907–1980) and J. Presper Eckert (1919–1995), were originally inspired by Bush's Differential Analyzer; years later Eckert recalled that ENIAC was the "descendant of Dr Bush's machine." But the machine they constructed was far more ambitious. It contained nearly 18,000 vacuum tubes (nine times more than Colossus), was around 24 m (80 ft) long, and weighed almost 30 tons. ENIAC is generally recognized as the world's first fully electronic, general-purpose, digital computer. Colossus might have qualified for this title too, but it was designed purely for one job (code-breaking); since it couldn't store a program, it couldn't easily be reprogrammed to do other things.

Photo: Sir Maurice Wilkes (left), his collaborator William Renwick, and the early EDSAC-1 electronic computer they built in Cambridge, pictured around 1947/8. Picture courtesy of and © University of Cambridge Computer Laboratory, published with permission via Wikimedia Commons under a Creative Commons (CC BY 2.0) licence.

ENIAC was just the beginning. Its two inventors formed the Eckert-Mauchly Computer Corporation in the late 1940s. Working with a brilliant Hungarian mathematician, John von Neumann (1903–1957), who was based at Princeton University, they then designed a better machine called EDVAC (Electronic Discrete Variable Automatic Computer). In a key piece of work, von Neumann helped to define how the machine stored and processed its programs, laying the foundations for how all modern computers operate. [6] After EDVAC, Eckert and Mauchly developed UNIVAC 1 (UNIVersal Automatic Computer) in 1951. They were helped in this task by a young, largely unknown American mathematician and Naval reservist named Grace Murray Hopper (1906–1992), who had originally been employed by Howard Aiken on the Harvard Mark I. Like Herman Hollerith's tabulator over 50 years before, UNIVAC 1 was used for processing data from the US census. It was then manufactured for other users—and became the world's first large-scale commercial computer.

Machines like Colossus, the ENIAC, and the Harvard Mark I compete for significance and recognition in the minds of computer historians. Which one was truly the first great modern computer? All of them and none: these—and several other important machines—evolved our idea of the modern electronic computer during the key period between the late 1930s and the early 1950s. Among those other machines were pioneering computers put together by English academics, notably the Manchester/Ferranti Mark I, built at Manchester University by Frederic Williams (1911–1977) and Thomas Kilburn (1921–2001), and the EDSAC (Electronic Delay Storage Automatic Calculator), built by Maurice Wilkes (1913–2010) at Cambridge University. [7]

Photo: Control panel of the UNIVAC 1, the world's first large-scale commercial computer. Photo by Cory Doctorow published on Flickr in 2020 under a Creative Commons (CC BY-SA 2.0) licence.

The microelectronic revolution

Vacuum tubes were a considerable advance on relay switches, but machines like the ENIAC were notoriously unreliable. The modern term for a problem that holds up a computer program is a "bug." Popular legend has it that this word entered the vocabulary of computer programmers sometime in the 1950s when moths, attracted by the glowing lights of vacuum tubes, flew inside machines like the ENIAC, caused a short circuit, and brought work to a juddering halt. But there were other problems with vacuum tubes too. They consumed enormous amounts of power: the ENIAC used about 2000 times as much electricity as a modern laptop. And they took up huge amounts of space. Military needs were driving the development of machines like the ENIAC, but the sheer size of vacuum tubes had now become a real problem. ABC had used 300 vacuum tubes, Colossus had 2000, and the ENIAC had 18,000. The ENIAC's designers had boasted that its calculating speed was "at least 500 times as great as that of any other existing computing machine." But developing computers that were an order of magnitude more powerful still would have needed hundreds of thousands or even millions of vacuum tubes—which would have been far too costly, unwieldy, and unreliable. So a new technology was urgently required.

The solution appeared in 1947 thanks to three physicists working at Bell Telephone Laboratories (Bell Labs). John Bardeen (1908–1991), Walter Brattain (1902–1987), and William Shockley (1910–1989) were then helping Bell to develop new technology for the American public telephone system, so the electrical signals that carried phone calls could be amplified more easily and carried further. Shockley, who was leading the team, believed he could use semiconductors (materials such as germanium and silicon that allow electricity to flow through them only when they've been treated in special ways) to make a better form of amplifier than the vacuum tube. When his early experiments failed, he set Bardeen and Brattain to work on the task for him. Eventually, in December 1947, they created a new form of amplifier that became known as the point-contact transistor. Bell Labs credited Bardeen and Brattain with the transistor and awarded them a patent. This enraged Shockley and prompted him to invent an even better design, the junction transistor, which has formed the basis of most transistors ever since.

Like vacuum tubes, transistors could be used as amplifiers or as switches. But they had several major advantages. They were a fraction of the size of vacuum tubes (typically about as big as a pea), used no power at all unless they were in operation, and were virtually 100 percent reliable. The transistor was one of the most important breakthroughs in the history of computing and it earned its inventors the world's greatest science prize, the 1956 Nobel Prize in Physics. By that time, however, the three men had already gone their separate ways. John Bardeen had begun pioneering research into superconductivity, which would earn him a second Nobel Prize in 1972. Walter Brattain moved to another part of Bell Labs.

William Shockley decided to stick with the transistor, eventually forming his own corporation to develop it further. His decision would have extraordinary consequences for the computer industry. With a small amount of capital, Shockley set about hiring the best brains he could find in American universities, including young electrical engineer Robert Noyce (1927–1990) and research chemist Gordon Moore (1929–). It wasn't long before Shockley's idiosyncratic and bullying management style upset his workers. In 1957, eight of them—including Noyce and Moore—left Shockley Transistor to found a company of their own, Fairchild Semiconductor, just down the road. Thus began the growth of "Silicon Valley," the part of California centered on Palo Alto, where many of the world's leading computer and electronics companies have been based ever since. [8]

It was in Fairchild's California building that the next breakthrough occurred—although, somewhat curiously, it also happened at exactly the same time in the Dallas laboratories of Texas Instruments. In Dallas, a young engineer from Kansas named Jack Kilby (1923–2005) was considering how to improve the transistor. Although transistors were a great advance on vacuum tubes, one key problem remained. Machines that used thousands of transistors still had to be hand wired to connect all these components together. That process was laborious, costly, and error-prone. Wouldn't it be better, Kilby reflected, if many transistors could be made in a single package? This prompted him to invent the "monolithic" integrated circuit (IC), a collection of transistors and other components that could be manufactured all at once, in a block, on the surface of a semiconductor. Kilby's invention was another step forward, but it also had a drawback: the components in his integrated circuit still had to be connected by hand. While Kilby was making his breakthrough in Dallas, unknown to him, Robert Noyce was perfecting almost exactly the same idea at Fairchild in California. Noyce went one better, however: he found a way to include the connections between components in an integrated circuit, thus automating the entire process.

Photo: An integrated circuit from the 1980s. This is an EPROM chip (effectively a forerunner of flash memory, which you could only erase with a blast of ultraviolet light).

Mainframes, minis, and micros

Photo: An IBM 704 mainframe pictured at NASA in 1958. Designed by Gene Amdahl, this scientific number cruncher was the successor to the 701 and helped pave the way to arguably the most important IBM computer of all time, the System/360, which Amdahl also designed. Photo courtesy of NASA .

Photo: The control panel of DEC's classic 1965 PDP-8 minicomputer. Photo by Cory Doctorow published on Flickr in 2020 under a Creative Commons (CC BY-SA 2.0) licence.

Integrated circuits, as much as transistors, helped to shrink computers during the 1960s. In 1943, IBM boss Thomas Watson had reputedly quipped: "I think there is a world market for about five computers." Just two decades later, the company and its competitors had installed around 25,000 large computer systems across the United States. As the 1960s wore on, integrated circuits became increasingly sophisticated and compact. Soon, engineers were speaking of large-scale integration (LSI), in which hundreds of components could be crammed onto a single chip, and then of very large-scale integration (VLSI), in which the same chip could contain thousands of components.

The logical conclusion of all this miniaturization was that, someday, someone would be able to squeeze an entire computer onto a chip. In 1968, Robert Noyce and Gordon Moore had left Fairchild to establish a new company of their own. With integration very much in their minds, they called it Integrated Electronics, or Intel for short. Originally they had planned to make memory chips, but when the company landed an order to make chips for a range of pocket calculators, history headed in a different direction. Two of their engineers, Federico Faggin (1941–) and Marcian Edward (Ted) Hoff (1937–), realized that instead of making a range of specialist chips for a range of calculators, they could make a universal chip that could be programmed to work in them all. Thus was born the general-purpose, single-chip computer or microprocessor—and that brought about the next phase of the computer revolution.
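The leap from specialist chips to a microprocessor is easier to see with a small sketch. The following Python code (an illustration under loose assumptions, not Intel's actual design) contrasts a fixed-function "adding chip" with a universal chip whose behavior comes from a program held in memory:

```python
# A "specialist" chip: its one behavior is fixed in hardware.
def adding_chip(a, b):
    return a + b

# A "universal" chip: a tiny interpreter whose behavior is defined by a
# program held in memory, so one chip design can serve many products.
def universal_chip(program, x, y):
    stack = [x, y]
    for op in program:
        b, a = stack.pop(), stack.pop()
        if op == "ADD":
            stack.append(a + b)
        elif op == "MUL":
            stack.append(a * b)
    return stack.pop()

print(adding_chip(2, 3))              # 5 -- all this chip can ever do
print(universal_chip(["ADD"], 2, 3))  # 5 -- same universal chip...
print(universal_chip(["MUL"], 2, 3))  # 6 -- ...different program
```

One programmable design could therefore serve every calculator in the range, with only the program changing from product to product.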

Personal computers

By 1974, Intel had launched a popular microprocessor known as the 8080, and computer hobbyists were soon building home computers around it. The first was the MITS Altair 8800, built by Ed Roberts. With its front panel covered in red LED lights and toggle switches, it was a far cry from modern PCs and laptops. Even so, it sold by the thousand and earned Roberts a fortune. The Altair inspired a Californian electronics wizard named Steve Wozniak (1950–) to develop a computer of his own. "Woz" is often described as the hacker's "hacker"—a technically brilliant and highly creative engineer who pushed the boundaries of computing largely for his own amusement. In the mid-1970s, he was working at the Hewlett-Packard computer company in California and spending his free time tinkering away as a member of the Homebrew Computer Club in the Bay Area.

After seeing the Altair, Woz used a 6502 microprocessor (made by an Intel rival, MOS Technology) to build a better home computer of his own: the Apple I. When he showed off his machine to his colleagues at the club, they all wanted one too. One of his friends, Steve Jobs (1955–2011), persuaded Woz that they should go into business making the machine. Woz agreed so, famously, they set up Apple Computer Corporation in a garage belonging to Jobs' parents. After selling 175 of the Apple I for the devilish price of $666.66, Woz built a much better machine called the Apple ][ (pronounced "Apple Two"). While the Altair 8800 looked like something out of a science lab, and the Apple I was little more than a bare circuit board, the Apple ][ took its inspiration from such things as Sony televisions and stereos: it had a neat and friendly looking cream plastic case. Launched in April 1977, it was the world's first easy-to-use home "microcomputer." Soon home users, schools, and small businesses were buying the machine in their tens of thousands—at $1,298 a time. Two things turned the Apple ][ into a really credible machine for small firms: a disk drive unit, launched in 1978, which made it easy to store data; and a spreadsheet program called VisiCalc, which gave Apple users the ability to analyze that data. In just two and a half years, Apple sold around 50,000 of the machines, quickly accelerating out of Jobs' garage to become one of the world's biggest companies. Dozens of other microcomputers were launched around this time, including the TRS-80 from Radio Shack (Tandy in the UK) and the Commodore PET. [9]

Apple's success selling to businesses came as a great shock to IBM and the other big companies that dominated the computer industry. It didn't take a VisiCalc spreadsheet to figure out that, if the trend continued, upstarts like Apple would undermine Big Blue's immensely lucrative business of selling mainframe computers. In 1980, IBM finally realized it had to do something and launched a highly streamlined project to save its business. One year later, it released the IBM Personal Computer (PC), based on an Intel 8088 microprocessor, which rapidly reversed the company's fortunes and stole the market back from Apple.

The PC was successful essentially for one reason. All the dozens of microcomputers that had been launched in the 1970s—including the Apple ][—were incompatible. All used different hardware and worked in different ways. Most were programmed using a simple, English-like language called BASIC, but each one used its own flavor of BASIC, which was tied closely to the machine's hardware design. As a result, programs written for one machine would generally not run on another without a great deal of conversion. Companies that wrote software professionally typically wrote it for just one machine and, consequently, there was no software industry to speak of.

In the mid-1970s, Gary Kildall (1942–1994), a computer science teacher who went on to found Digital Research, figured out a solution to this problem. Kildall wrote an operating system (a computer's fundamental control software) called CP/M that acted as an intermediary between the user's programs and the machine's hardware. With a stroke of genius, Kildall realized that all he had to do was rewrite the machine-specific part of CP/M for each different computer. Then all those machines could run identical user programs—without any modification at all—inside CP/M. That would make all the different microcomputers compatible at a stroke. By the early 1980s, Kildall had become a multimillionaire through the success of his invention: the first personal computer operating system. Naturally, when IBM was developing its personal computer, it approached him hoping to put CP/M on its own machine. Legend has it that Kildall was out flying his personal plane when IBM called, so missed out on one of the world's greatest deals. But the truth seems to have been that IBM wanted to buy CP/M outright for just $200,000, while Kildall recognized his product was worth millions more and refused to sell. Instead, IBM turned to a young programmer named Bill Gates (1955–). His then tiny company, Microsoft, rapidly put together an operating system called DOS, based on a product called QDOS (Quick and Dirty Operating System), which it acquired from Seattle Computer Products. Some believe Microsoft and IBM cheated Kildall out of his place in computer history; Kildall himself accused them of copying his ideas. Others think Gates was simply the shrewder businessman. Either way, the IBM PC, powered by Microsoft's operating system, was a runaway success.
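The trick that made CP/M portable was separating the machine-independent parts of the system from a small machine-specific layer that was rewritten for each computer. Here is a minimal sketch of that idea in Python; the class and method names are hypothetical stand-ins, not CP/M's real interfaces:

```python
# Minimal sketch of a portable OS design: only HardwareLayer subclasses
# need rewriting per machine; the OS core and user programs never change.

class HardwareLayer:
    """The only part that must be rewritten for each different machine."""
    def write_char(self, ch: str) -> None:
        raise NotImplementedError

class MachineA(HardwareLayer):
    def write_char(self, ch: str) -> None:
        print(ch, end="")           # pretend this drives Machine A's display

class MachineB(HardwareLayer):
    def write_char(self, ch: str) -> None:
        print(ch.upper(), end="")   # Machine B's (deliberately different) hardware

class OS:
    """Machine-independent system services, in the spirit of CP/M's core."""
    def __init__(self, hw: HardwareLayer):
        self.hw = hw
    def print_string(self, s: str) -> None:
        for ch in s:
            self.hw.write_char(ch)

def user_program(os: OS) -> None:
    # The same unmodified program runs wherever the OS has been ported.
    os.print_string("hello\n")

user_program(OS(MachineA()))   # prints: hello
user_program(OS(MachineB()))   # prints: HELLO
```

Porting the operating system then meant rewriting only the small hardware layer, while every user program ran unchanged.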

Yet IBM's victory was short-lived. Cannily, Bill Gates had licensed one flavor of DOS (PC-DOS) to IBM while retaining the rights to a very similar version (MS-DOS) for his own use. When other computer manufacturers, notably Compaq and Dell, started making IBM-compatible (or "cloned") hardware, they too came to Gates for the software. IBM charged a premium for machines that carried its badge, but consumers soon realized that PCs were commodities: they contained almost identical components—an Intel microprocessor, for example—no matter whose name they had on the case. As IBM lost market share, the ultimate victors were Microsoft and Intel, who were soon supplying the software and hardware for almost every PC on the planet. Apple, IBM, and Kildall made a great deal of money—but all failed to capitalize decisively on their early success. [10]

Photo: Personal computers threatened companies making large "mainframes" like this one. Picture courtesy of NASA on the Commons (where you can download a larger version).

The user revolution

Fortunately for Apple, it had another great idea. One of the Apple ]['s strongest suits was its sheer "user-friendliness." For Steve Jobs, developing truly easy-to-use computers became a personal mission in the early 1980s. What truly inspired him was a visit to PARC (Palo Alto Research Center), a cutting-edge computer laboratory then run as a division of the Xerox Corporation. Xerox had started developing computers in the early 1970s, believing they would make paper (and the highly lucrative photocopiers Xerox made) obsolete. One of PARC's research projects was an advanced $40,000 computer called the Xerox Alto. Unlike most microcomputers launched in the 1970s, which were programmed by typing in text commands, the Alto had a desktop-like screen with little picture icons that could be moved around with a mouse: it was the very first graphical user interface (GUI, pronounced "gooey")—an idea conceived by Alan Kay (1940–) and now used in virtually every modern computer. The Alto borrowed some of its ideas, including the mouse, from 1960s computer pioneer Douglas Engelbart (1925–2013).

Photo: During the 1980s, computers started to converge on the same basic "look and feel," largely inspired by the work of pioneers like Alan Kay and Douglas Engelbart. Photographs in the Carol M. Highsmith Archive, courtesy of US Library of Congress , Prints and Photographs Division.

Back at Apple, Jobs launched his own version of the Alto project to develop an easy-to-use computer called PITS (Person In The Street). This machine became the Apple Lisa, launched in January 1983—the first widely available computer with a GUI desktop. With a retail price of $10,000, over three times the cost of an IBM PC, the Lisa was a commercial flop. But it paved the way for a better, cheaper machine called the Macintosh that Jobs unveiled a year later, in January 1984. With its memorable launch ad inspired by George Orwell's novel 1984 and directed by Ridley Scott (director of the dystopian movie Blade Runner), Apple took a swipe at IBM's monopoly, criticizing what it portrayed as the firm's domineering—even totalitarian—approach: Big Blue was really Big Brother. Apple's ad promised a very different vision: "On January 24, Apple Computer will introduce Macintosh. And you'll see why 1984 won't be like '1984'." The Macintosh was a critical success and helped to invent the new field of desktop publishing in the mid-1980s, yet it never came close to challenging IBM's position.

Ironically, Jobs' easy-to-use machine also helped Microsoft to dislodge IBM as the world's leading force in computing. When Bill Gates saw how the Macintosh worked, with its easy-to-use picture-icon desktop, he launched Windows, a graphical extension that ran on top of his MS-DOS software. Apple saw this as blatant plagiarism and filed a $5.5 billion copyright lawsuit in 1988. Four years later, the case collapsed, with Microsoft effectively securing the right to use the Macintosh "look and feel" in all present and future versions of Windows. Microsoft's Windows 95 system, launched three years later, had an easy-to-use, Macintosh-like desktop with MS-DOS still running behind the scenes.

Photo: The IBM Blue Gene/P supercomputer at Argonne National Laboratory: one of the world's most powerful computers. Picture courtesy of Argonne National Laboratory published on Wikimedia Commons in 2009 under a Creative Commons Licence .

From nets to the Internet

Standardized PCs running standardized software brought a big benefit for businesses: computers could be linked together into networks to share information. At Xerox PARC in 1973, electrical engineer Bob Metcalfe (1946–) developed a new way of linking computers "through the ether" (empty space) that he called Ethernet. A few years later, Metcalfe left Xerox to form his own company, 3Com, to help companies realize "Metcalfe's Law": a network becomes more valuable the more computers (and people) it connects. As more and more companies explored the power of local area networks (LANs), it became clear, as the 1980s progressed, that there were great benefits to be gained by connecting computers over even greater distances—into so-called wide area networks (WANs).
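Metcalfe's Law is usually given a mathematical form (a later formalization of the idea, not spelled out in Metcalfe's original pitch): with n connected machines, the number of distinct pairwise links, and hence, roughly, the network's potential value V, grows quadratically:

```latex
V(n) \propto \binom{n}{2} = \frac{n(n-1)}{2} \approx \frac{n^{2}}{2}
```

For example, doubling a network from 10 to 20 machines raises the number of possible connections from 45 to 190, more than four times as many.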

Photo: Computers aren't what they used to be: they're much less noticeable because they're much more seamlessly integrated into everyday life. Some are "embedded" into household gadgets like coffee makers or televisions . Others travel round in our pockets in our smartphones—essentially pocket computers that we can program simply by downloading "apps" (applications).

Today, the best known WAN is the Internet—a global network of individual computers and LANs that links up hundreds of millions of people. The history of the Internet is another story, but it began in the late 1960s when, with funding from the US Department of Defense, four American research centers connected their computer systems to make the first WAN, a project called ARPANET (Advanced Research Projects Agency Network). In the mid-1980s, the US National Science Foundation (NSF) launched its own WAN called NSFNET. The convergence of all these networks produced what we now call the Internet later in the 1980s. Shortly afterward, the power of networking gave British computer programmer Tim Berners-Lee (1955–) his big idea: to combine the power of computer networks with the information-sharing idea Vannevar Bush had proposed in 1945. Thus was born the World Wide Web—an easy way of sharing information over a computer network, which made possible the modern age of cloud computing (where anyone can access vast computing power over the Internet without having to worry about where or how their data is processed). It's Tim Berners-Lee's invention that brings you this potted history of computing today!

First generation of computers

The modern computer era owes much to the great technological advances that took place during World War II. Electronic circuits built from vacuum tubes and capacitors replaced the earlier generation of mechanical components, while digital calculation replaced analog calculation. The computers and products of this era constitute the so-called first generation of computers.


  • Date : 1946 to 1958
  • Featured Computers : Atanasoff Berry Computer, MARK I, UNIVAC, ENIAC

What is the first generation of computers?

The first generation of computers emerged in the middle of the 20th century, specifically between 1946 and 1958, a period of great technological advances driven by the search for tools to aid scientific and military work. These computers were notable for their enormous size and for how few organizations could afford to acquire one.

Characteristics of the first generation of computers


The first generation of computers, produced in the mid-twentieth century, contained the earliest antecedents of modern computers. Their main characteristics were their large size, their high cost of acquisition, and their frequent failures and errors, since the machines were experimental.

These computers used vacuum tubes to process information, punched cards for data and program input and output, and magnetic drums to store information and internal instructions.

The first machines on the market reportedly cost approximately $10,000. Because of their large size, they consumed a great deal of electricity and generated so much heat that special auxiliary air-conditioning systems were required to prevent overheating; the enormous ENIAC, for example, weighed about 30 tons.

First-generation computers used magnetic drums as their data storage elements; these were later replaced in the second generation by ferrite-core memories.

History of the first generation of computers

The historical development of the first generation of computers has no exact starting point, since it was the result of earlier discoveries and experiments by many different inventors, but it essentially took shape during the first half of the twentieth century.

The design of Charles Babbage's Analytical Engine gathered, in primitive form, the ideas of giving a machine orders to perform calculations automatically and of storing data. These ideas were incorporated into the design of ENIAC, the first general-purpose electronic computer to be built. ENIAC, in turn, served as the basis for UNIVAC I, the first commercially manufactured computer and the first to use a compiler to translate programs into machine language.

UNIVAC's main advances were its system of magnetic tapes, which could be read backward and forward, and its ability to check for errors. (Much later, the introduction of integrated circuits allowed the appearance of the first desktop computers in the mid-1970s, and their immediate success led to the IBM PC in 1981.)

First generation of computers inventors

  • Howard Aiken (1900–1973) developed the Automatic Sequence Controlled Calculator (ASCC), drawing on Babbage's work on the Analytical Engine, and built the Mark I, an early electromechanical computer (1944). It took a few tenths of a second to add or subtract, about two seconds to multiply two 11-digit numbers, and roughly four seconds to divide.
  • Eckert and Mauchly contributed to the development of first-generation computers by forming a private company and building UNIVAC I, which the US Census Bureau used to help process the 1950 census. At the time, IBM held a near-monopoly on punched-card data-processing equipment.
  • Von Neumann was among the first to use binary arithmetic in an electronic computer. His greatest contribution came through his collaboration with Arthur W. Burks and Herman H. Goldstine on the Preliminary Discussion of the Logical Design of an Electronic Computing Instrument (1946), which has served as a basis for computer construction ever since. Its central idea was that programs and data could be organized in the same medium (memory), an idea embodied in the design of EDVAC (Electronic Discrete Variable Automatic Computer).
Featured PCs from the first generation of computers

  • Atanasoff-Berry Computer (ABC): the first automatic electronic digital computer, developed between 1937 and 1942 and named in honor of its two creators, John Vincent Atanasoff and his assistant Clifford Berry. It could solve, with a high level of accuracy, systems of up to 29 simultaneous equations.
  • MARK I: an early electromechanical computer, the product of Howard Aiken, who based his research on earlier work in the field, such as Babbage's, and led the ASCC (Automatic Sequence Controlled Calculator) project. The machine was about 15 meters long and 2.4 meters high, made up of about 800,000 parts and more than 804 km of wiring. Its speed was not the best, but it was a breakthrough for its time, and it was later followed by the MARK II and MARK III.
  • UNIVAC: the Universal Automatic Computer opened the way for the explosive development and improvement of the computers that followed. It first appeared in 1951 as the first digital computer manufactured for commercial use. Its creators, John Mauchly and J. Presper Eckert, took around five years to build it. Like all computers of this generation, it was characterized by its sheer magnitude: it weighed 7,257 kg, contained 5,000 vacuum tubes, and could perform about 1,000 calculations per second, quite fast for its time.
  • ENIAC (Electronic Numerical Integrator And Computer): begun in 1943 by John W. Mauchly and J. Presper Eckert to help solve calculation problems in the military field. Its construction took about three years, and the machine posed a good number of problems beyond its enormous size and the energy it consumed: in particular, its programming was very complex to modify, and the design was quickly surpassed.


On the Foundations of Computing

8 The First Generation of Computers


This chapter starts with the analysis of the engineering foundation of computing which, proceeding in parallel with the mathematical foundation, led to the design and creation of physical computing machines. It illustrates the historical evolution of the first generation of computers and their technical foundation, known as the von Neumann architecture. From the conceptual point of view, the chapter clarifies the relation between the universal model of computation and the construction of an all-purpose machine.



Generations of Computer 1st to 5th Explained with Pictures.

The history of computer technology is often told in terms of the different generations of computers. From the first to the fifth, each generation is characterized by significant technological developments in its components and memory that essentially changed the way these devices work.

Over the years, each successive generation advanced this technological evolution, leading to today's modern computers with their greater complexity, power, capability, and functionality.

Introduction to Computer Generations

Each of these development periods of electronic computing technology is called a computer generation. Five generations of computers are generally identified, and a sixth generation may already be in development in the early 21st century.

Across this timeline, each generation of computers has improved considerably, undergoing major changes in size, design, and functionality.

By analyzing them, one can trace the evolution of computer technology, see how the computer industry has changed over the years, and appreciate how much progress in hardware and software humankind has made in under a hundred years.

At present, the computer plays a significant part in human existence, because today's digital computers are used for work in every field. If a computer develops a fault or a server goes down, all that work stops. That is how significant this technology has become!

In this article, I will introduce you to all the generations of computers, explaining their characteristics, names, components, and examples.

Generations of Computer From 1st to 5th


Let’s discover the series of computer generations in the following list:

1st Generation of Computer (1940-1956)

The first generation of computers was based on vacuum tube technology, used for calculation, storage, and control. The vacuum tube (diode valve) was invented in 1904 by John Ambrose Fleming, and vacuum tubes and diode valves were the chief components of first-generation computers.


First-generation computers relied on the lowest-level machine language in order to perform operations, and they could only solve a single problem at a time.

Magnetic drums, which were very slow, were used as the memory in these computers. Punched cards and magnetic tape were used for the computer's input and output, with results delivered as printouts, and even then the results were not always accurate.


(Note that first-generation machines long predated the microprocessor, which did not appear until the early 1970s.)

The disadvantages of first-generation computers were that they were enormous in size and heavy in weight (being made of thousands of vacuum tubes), occupying large rooms, and that once installed in one place they were difficult to move. Other drawbacks included the use of the decimal number system in some designs and the huge numbers of switches and cables.

In addition, they were very expensive to operate and used a large amount of electricity. The vacuum tubes produced large amounts of heat, so air conditioning was required for proper functioning; otherwise the heat could cause malfunctions.

The advantage of first-generation computers was that they could calculate in milliseconds (about five thousand additions per second).
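To put that figure in perspective, a quick back-of-the-envelope calculation (based on the roughly five-thousand-additions-per-second rate quoted above):

```latex
t_{\text{add}} = \frac{1\ \text{s}}{5000} = 0.0002\ \text{s} = 0.2\ \text{ms}
```

so each addition took about a fifth of a millisecond, thousands of times faster than a human operator with a desk calculator.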

First-generation computers were used in fields such as weather forecasting, solving mathematical problems, energy tasks, space research, and military and other scientific work.

Within this first generation, ENIAC (Electronic Numerical Integrator and Computer), often called the world's first general-purpose electronic computer, was developed by John Mauchly and J. Presper Eckert between 1943 and 1945.

ENIAC used panel-to-panel wiring and switches for programming, occupied more than 1,000 square feet, used about 18,000 vacuum tubes, and weighed 30 tons.


Characteristics of the 1st Generation of Computer:

  • Vacuum tubes and diode valves were used as the main electronic components of first-generation computers.
  • Punched cards and paper tape were used for input and output operations.
  • Magnetic drums were used for storage.
  • Huge size and weight, with very high power consumption.
  • Very expensive and not very reliable.
  • Programmed in low-level machine language, with low operating speeds.

Examples of first-generation computers include ENIAC (Electronic Numerical Integrator and Computer), UNIVAC (Universal Automatic Computer), EDSAC (Electronic Delay Storage Automatic Calculator), EDVAC (Electronic Discrete Variable Automatic Computer), the IBM 701, and the IBM 650.

ENIAC was the first general-purpose electronic digital computer. It used about 18,000 vacuum tubes for its calculations, making it huge in size: it occupied more than 1,000 square feet and weighed 30 tons. Machines like it were the harbingers of today's digital computers. This first computing machine was designed by J. Presper Eckert and John W. Mauchly.

2nd Generation of Computer (1956-1964)

The second generation of computers replaced vacuum tubes with a more reliable component, the transistor, invented at Bell Labs in 1947 by John Bardeen, Walter Brattain, and William Shockley.


Transistors were a revolution in the computer field: they gave second-generation computers greater performance and operating speed (hundreds of thousands of operations per second) while decreasing their electricity consumption.

Transistors were far superior to vacuum tubes, allowing computers to become faster, cheaper, and more energy-efficient, making it possible to reduce the size of computing equipment, and ultimately reducing heat and improving reliability.

Second-generation computers are characterized by the use of the first high-level programming languages, allowing programmers to specify instructions in words. At this time, early versions of the COBOL, ALGOL, SNOBOL, and FORTRAN languages were developed.

These computers stored their instructions in memory, which moved from magnetic-drum to magnetic-core technology. During this period, the first computer game, Spacewar!, was seen on a PDP-1 computer.


Did you know? The abacus, a calculating machine designed thousands of years ago, is still used in schools today to do calculations.

The concepts of the central processing unit (CPU), multiprogramming operating systems, programming languages, memory, and input/output (I/O) units were also developed during the era of second-generation computers.

The major disadvantages of second-generation computers were that they still relied on punched cards for input and hard copies for output, and that they remained large and difficult to move; some machines still needed air conditioning.

Second-generation computers were still large in size

Second-generation computers were first used in fields such as the atomic energy industry and nuclear power plants, and later in other commercial fields.

Characteristics of the 2nd Generation of Computer:

  • Computers were based on transistors instead of vacuum tubes.
  • Magnetic tape was used to store data.
  • Relatively small in size, with less weight and lower energy consumption than first-generation computers.
  • Faster, more reliable, and less expensive than the first generation.
  • Use of storage devices, printers, and operating systems.
  • High-level languages such as COBOL, ALGOL, SNOBOL, and FORTRAN were developed and used.

Examples of the second generation of computers include the IBM 1620, CDC 1604, IBM 7094, UNIVAC 1108, IBM 620, CDC 3600, IBM 4044, Honeywell 400, IBM 1401 mainframe, and the PDP-1 minicomputer. IBM was especially active in producing transistorized versions of its computers.

3rd Generation of Computer (1964-1971)

The third generation was built on the integrated circuit (first demonstrated by Jack Kilby in 1958). An integrated circuit (IC) consists of many small transistors mounted on a semiconductor chip.

integrated circuits

This chip became the foundation of third-generation computers: by fitting hundreds of transistors into a single circuit, engineers produced a far more powerful electronic component, the integrated circuit.

Multiprogramming was implemented (keeping several executable programs in memory at once), while manufacturing costs fell. In the mid-1960s, IBM popularized the term "computer architecture," and by the end of the decade minicomputers had appeared.

This revolutionary innovation expanded the processing capacity and memory of the machines.

Instead of punched cards and printouts, users interacted through keyboards and monitors, working with an operating system that allowed the machine to run several applications at once under a central program that managed memory.

3rd Generation of Computer

Computer monitors made their first commercial appearance in this period: in 1964 IBM released the IBM 2250 display station, used with the System/360 series. The model had a monochrome vector display measuring 12×12 inches, with a resolution of 1024×1024 and a refresh rate of 40 Hz. It was the forerunner of today's many types of monitors, including LCD, LED, and OLED displays.

The invention of the IC dramatically reduced the size of computers and made them easy to transport. This generation was much faster and more efficient than the previous one, and cheaper as well.

High-level languages such as PASCAL, BASIC, FORTRAN (versions II to IV), COBOL, and ALGOL were developed in this generation.

Because they were smaller and cheaper, computers reached a mass audience for the first time and penetrated many spheres of human activity. They also became more specialized, with different computers built for different tasks.

The third generation was the first move toward the miniaturization of computers, and it quickly expanded their scope: control systems, automation of scientific experiments, data transmission, and more. ICs were also used in the manufacture of radios, TVs, and similar devices.

Characteristics of the 3rd Generation of Computer:

  • Computers based on integrated circuits were more powerful than transistor-based machines.
  • The computers were also smaller, because an IC is much smaller than an equivalent circuit built from discrete transistors.
  • More reliable, less expensive, faster, more energy-efficient, and much lighter than second-generation computers.
  • The first computer mouse and keyboard appeared and were used in this generation.
  • New versions of high-level languages such as BASIC, COBOL, FORTRAN, PASCAL, and ALGOL were used.
  • Computers became available to a mass audience and suitable for general-purpose use.

Some of the most popular third-generation models were the ICL 2903, ICL 1900, TDC-B16, IBM 360 and 370, Honeywell 6000, UNIVAC 1108, PDP-8, and PDP-11, which offered better multiprocessing capability, reliability, and flexibility than earlier generations.

4th Generation of Computer (1971-2010)

The microprocessor ushered in the fourth generation of computers: thousands of integrated circuits, equivalent to millions of transistors, brought the entire central processing unit and the machine's other fundamental elements onto a single small chip, the microprocessor, fitted into the CPU socket.

microprocessor chip

These computers used Very Large Scale Integration, also called VLSI technology. After its invention, the microprocessor became the basis of computing machines in the fourth and fifth generations.

In 1971, the first microprocessor appeared, an unexpected byproduct of Intel's work on calculator circuits, alongside the continued development of minicomputers such as the PDP-11.

The first microprocessor, the Intel 4004

One of the first personal computers, a microcomputer, was the Altair, developed by MITS in 1974. The first microprocessor was the Intel 4004, manufactured in 1971, initially for an electronic calculator. Whereas the computers of the first generation filled an entire room, a fourth-generation microprocessor fits in the palm of a hand.

This generation used operating systems based on the graphical user interface (GUI), which made mathematical and logical tasks much easier to perform.

Computers began to use high-speed memory built on integrated circuits, with capacities of several megabytes, and performance increased significantly, to hundreds of millions of operations per second.

High-level languages such as C, C++, Java, PHP, Python, and Visual Basic were used to write programs for fourth-generation computers.

high-level languages in 4th generation of computers

The advent of the first personal computers in the mid-1970s gave ordinary users the computing resources that only enormous machines had offered during the 1960s. These computers were smaller, faster, and cheap enough to sit on a table or desk, marking the start of the so-called era of personal computers.

Peripheral devices such as mice, joysticks, and handheld devices were developed during the fourth generation. Computers could also be connected in networks to share information, which played an important role in the birth and development of LANs, Ethernet, and the Internet.

Era of personal computers and Internet

Chip companies such as Intel and AMD rose to prominence, while Microsoft and Apple introduced their Windows and Macintosh operating systems in this generation, which in turn launched the era of multimedia.

This is the era in which the personal computer was born, an idea that persists today. It was also the generation of Digital Equipment Corporation (DEC) minicomputers.

Characteristics of the 4th Generation of Computer:

  • Computers based on microprocessors and VLSI technology.
  • Fourth-generation computers were small, lightweight, and in many cases portable.
  • The integration of multiple cores into processors (dual core, octa core, etc.) began.
  • Processing was much faster and more reliable than in the previous three generations.
  • The size and cost of power supply units were reduced.
  • Use of languages such as C, C++, .NET, Java, PHP, Python, and Visual Basic.
  • Use of GUI-based operating systems with more memory capacity.
  • Access to the Internet.
  • Because of their low cost, these computers became available to ordinary people.

Desktops, laptops, workstations, tablets, Chromebooks, and smartphones are examples of fourth-generation computers.

Good to know: Alan Turing, often called the father of modern computing, was born in England in 1912.

5th Generation of Computer (2010-At Present)

Artificial intelligence gives its name to the fifth and latest generation of computers, which is based on ULSI (Ultra Large Scale Integration) technology: the process of embedding millions of transistors on a single silicon microchip.

5th Generation of Computer

Fifth-generation computing is versatile: portable, powerful, lightweight, innovative, and comfortable to use, with low electricity consumption. Thanks to the Internet, its range of uses has extended to limits never before imagined.

The main objective of fifth-generation computing, and of the efforts of computer researchers, is to make machines smart by incorporating artificial intelligence: to develop devices that respond to natural-language input and are capable of learning and self-organization. This work is still under active development.

This new information technology has greatly increased the capacity and capability of the microprocessor, which has spread the use of computers across entertainment, accounting, education, film-making, traffic control, business applications, hospitals, engineering, research, defense, and more.

That is why the fifth generation is also known as the AI (artificial intelligence) generation of computers.

Some computers are intended to act, behave, and communicate like humans. A well-known example of an AI-based computing machine in the fifth generation is the robot Sophia.

Artificial intelligence

Characteristics of the 5th Generation of Computer:

  • The main focus is on AI-based computing.
  • Computers are built with microprocessors based on ULSI (Ultra Large Scale Integration) technology.
  • Processing speed is very high: billions of calculations per second.
  • Computers are portable, cheap, reliable, fast, and available in various forms and sizes, such as desktops, laptops, smartphones, and smartwatches.
  • Operating systems such as Windows, macOS, and the ChromeOS of Chromebooks.
  • Multimedia evolved in this generation, combining sound, graphics or pictures, and text.
  • Development of the Internet of Things.

Fifth-generation computers are being made to think like us, which drives the continuous advancement of technologies such as artificial intelligence, the Internet of Things, and robotics. Examples of AI software already in use today include chatbots, Windows Cortana, Google Assistant, Apple Siri, and speech recognition.

Classification of computers by generation:

  • First generation (1940s–1956): vacuum tubes
  • Second generation (1956–1964): transistors
  • Third generation (1964–1971): integrated circuits
  • Fourth generation (1971–2010): microprocessors (VLSI)
  • Fifth generation (2010–present): artificial intelligence (ULSI)

Factors/Reasons for the development of computer generations:

Below are the general factors associated with the development of, and changes between, the generations of electronic computers:

  • Improvements in the underlying electronic components
  • Reduction in size
  • Technological progress (increased performance, speed, and memory)
  • Reduced cost
  • Development of software
  • Changes in architecture and expansion of the range of tasks computers can solve
  • Simplification and standardization of hardware
  • Changes in the way users interact with the computer

How many generations of computers have there been?

There have been five computer generations so far: vacuum tubes, transistors, integrated circuits, microprocessors, and artificial intelligence. A sixth generation is yet to come, perhaps in the form of quantum computers or a far more advanced development of existing artificial intelligence technology.

What is the 6th generation of computers?

Electronic computers are usually divided into five generations today. A sixth generation is still in development and may well take the form of quantum computing.

Which is the current modern generation of computers today?

Technologies based on artificial intelligence constitute the current, latest generation of computers (the fifth generation).

What is the historical development of computers according to generation?

According to the standard methodology for assessing the development of computer technology, the first generation comprises vacuum-tube computers; the second, transistor computers; the third, computers built on integrated circuits; the fourth, computers using microprocessors; and the fifth, computers based on artificial intelligence.

What is the generation of the Colossus computer?

Colossus was a first-generation computer, designed by Tommy Flowers and put into operation at Bletchley Park in 1944 for the purpose of cracking Hitler's codes.

A sixth generation will also emerge in the future, as the remaining technological shortcomings of the current generation are resolved.


Computer Technology: Evolution and Developments Essay


Evolution of Computers and their Technology


The development of computer technology is characterized by the change in the technology used in building the devices. The evolution of computer technology is divided into several generations, from mechanical devices, followed by analog devices, to the recent digital computers that now dominate the world. This paper examines the evolution of computers and their technology, their use in the early and modern periods, their merits and demerits, and future developments.

Mechanical Age (1800s–1920s)

This period was characterized by the development of the computer to facilitate mathematical calculations that could not be done manually by individuals. The first notable computing device was the "analytical engine" designed by Charles Babbage in 1834 (Zakari 1). The mechanical era saw improvements made to Babbage's first design up until the first generation era.

First Generation (1930s–1950s)

The first generation era is characterized by the development of three electronic computers that used vacuum tubes, unlike the previous devices that used electromechanical relays to perform their tasks (Enzo 4). In this period, the machines were capable of storing data in the form of instructions written manually by the programmers and installed into the device (Zakari 1). The devices developed in this period were primarily used in applied science and engineering to help solve equations.

Second Generation (Mid-1950s – Early 1960s)

The second-generation period saw development in many design areas: both the technology used and the programming languages used to write commands advanced. Unlike in the previous generations, the operations in this era were performed in the hardware (McAfee 141). The period saw the development of the index registers used for numerous operations.

Third Generation (Early 1960s – Early 1970s)

The era saw improvement in the technology used in designing the devices; integrated circuits were introduced into computers. The period saw the introduction of the microprogramming technique and the development of the operating system (Zakari 1). The devices designed in this period were faster than those of the previous eras, and the computers could perform more functions.

Fourth Generation (Early 1970s – Mid-1980s)

This generation saw the development of large-scale integration in computers. The size of the microchips on which the computers' information was stored was reduced, allowing more data to be stored on the same microchip (Zakari 1). The devices were fitted with semiconductor memories to replace the core memories of the previous era, and processors were designed for high speed to allow faster processing of operations (McAfee 141).

Fifth Generation (Mid-1980s – Early 1990s)

The machines designed in this era had many processors that worked simultaneously on a single program (Zakari 1). The semiconductors in the computers were improved to increase the scale of operation with the development of chips (Enzo 2). In this period, the computers developed were capable of parallel processing of commands, which improved their functionality.

Sixth Generation (1990 to Date)

The era is characterized by improvements in all areas of computer design. Devices have become smaller and more portable. Computers now interact more with people and facilitate human functions in society, with increasing interconnection as improved networks link computers together (Zakari 1).

Uses of Computers

The early computers were mainly used to accomplish mathematical functions in applied science and engineering, primarily to solve mathematical calculation problems (Zakari 1). The second-generation devices improved on this functionality and were capable of processing information stored in them by the programmer (Zakari 1). Today, individuals use computers to perform various functions, including facilitating communication, storing data, and processing information. Computer technology is now used in every part of the world, with people in different areas using computers for numerous functions (McAfee 141). The technology is directly applied in agriculture, health and medicine, education, transport, communication, and other areas.

Advantages of Computers and Their Technology

Computer technology has enabled the development of devices like mobile phones that are easy to use and effective, allowing individuals to keep in contact with one another even in different locations (Golosova and Romanovs 3). Computer technology has improved manufacturing: producing goods is now better and more efficient thanks to technology that enhances workers' performance. It supports better healthcare operations by facilitating functions in health, and it enhances learning by giving individuals access to the required learning material (Golosova and Romanovs 6). Computers and computer technology also improve teacher-student interaction during education by providing a medium that can facilitate lessons.

Disadvantages of Computers and Computer Technology

Computers can be hazardous to human health; used excessively, they cause issues such as eye strain from extended exposure to screen light, and sitting for long periods harms an individual's health (Golosova and Romanovs 14). Because computers and computer technology are artificial, they are susceptible to human manipulation, exposing users to risks from those who would harm them by manipulating information (Suma 133). Computers also impact the environment negatively through the carbon footprint they leave when they become obsolete and can no longer be used.

Trends in Computer Technology

The use of artificial intelligence is expected to increase as computers and their technology continue to develop (McAfee 141). Computer technology is expected to increase the automation of processes and functions previously performed by humans in society, and to expand virtual reality and augmented reality among individuals to improve the human experience.

Works Cited

Enzo, Albert, Charles O. Connors, and Walter Curtis. "The Evolution of Computer Science." Computer Science, Murdoch University, Australia. Web.

Golosova, Julija, and Andrejs Romanovs. "The Advantages and Disadvantages of the Blockchain Technology." 2018 IEEE 6th Workshop on Advances in Information, Electronic and Electrical Engineering (AIEEE), 2018. Web.

McAfee, Andrew. "Mastering the Three Worlds of Information Technology." Harvard Business Review, vol. 84, no. 11, 2006, p. 141. Web.

Suma, V. "Computer Vision for Humans-Machines Interaction: Review." Journal of Trends in Computer Science and Smart Technology (TCSST), vol. 1, no. 2, 2019, pp. 131-139. Web.

Zakari, Ishaq. "History of Computers and Its Generations." Umaru Musa Yar'adua University, Katsina State, 2019. Web.


History of computers: A brief timeline

The history of computers began with primitive designs in the early 19th century and went on to change the world during the 20th century.


The history of computers goes back over 200 years. First theorized by mathematicians and entrepreneurs, mechanical calculating machines were designed and built during the 19th century to solve increasingly complex number-crunching challenges. Advancing technology enabled ever more complex computers by the early 20th century, and computers became larger and more powerful.

Today, computers are almost unrecognizable from designs of the 19th century, such as Charles Babbage's Analytical Engine — or even from the huge computers of the 20th century that occupied whole rooms, such as the Electronic Numerical Integrator and Calculator.  

Here's a brief history of computers, from their primitive number-crunching origins to the powerful modern-day machines that surf the Internet, run games and stream multimedia. 

19th century

1801: Joseph Marie Jacquard, a French merchant and inventor, invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

1821: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers. Funded by the British government, the project, called the "Difference Engine," fails due to the lack of technology at the time, according to the University of Minnesota.

1848: Ada Lovelace, an English mathematician and the daughter of poet Lord Byron, writes the world's first computer program. According to Anna Siffert, a professor of theoretical mathematics at the University of Münster in Germany, Lovelace writes the first program while translating a paper on Babbage's Analytical Engine from French into English. "She also provides her own comments on the text. Her annotations, simply called "notes," turn out to be three times as long as the actual transcript," Siffert wrote in an article for The Max Planck Society . "Lovelace also adds a step-by-step description for computation of Bernoulli numbers with Babbage's machine — basically an algorithm — which, in effect, makes her the world's first computer programmer." Bernoulli numbers are a sequence of rational numbers often used in computation.

Babbage's Analytical Engine
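Lovelace's Bernoulli-number procedure was, in effect, an algorithm. As a rough modern illustration (this is the standard textbook recurrence written in Python, not a reconstruction of her actual program), the sketch below computes the numbers exactly using rational arithmetic:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return Bernoulli numbers B_0..B_n (convention B_1 = -1/2), via
    the recurrence B_m = -1/(m+1) * sum_{k<m} C(m+1, k) * B_k."""
    b = [Fraction(0)] * (n + 1)
    b[0] = Fraction(1)
    for m in range(1, n + 1):
        b[m] = -sum(Fraction(comb(m + 1, k)) * b[k] for k in range(m)) / (m + 1)
    return b

# B_0..B_8: 1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0, -1/30
print(bernoulli(8))
```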

1853: Swedish inventor Per Georg Scheutz and his son Edvard design the world's first printing calculator. The machine is significant for being the first to "compute tabular differences and print the results," according to Uta C. Merzbach's book, " Georg Scheutz and the First Printing Calculator " (Smithsonian Institution Press, 1977).

1890: Herman Hollerith designs a punch-card system to help calculate the 1890 U.S. Census. The machine saves the government several years of calculations and the U.S. taxpayer approximately $5 million, according to Columbia University. Hollerith later establishes a company that will eventually become International Business Machines Corporation (IBM).

Early 20th century

1931: At the Massachusetts Institute of Technology (MIT), Vannevar Bush invents and builds the Differential Analyzer, the first large-scale automatic general-purpose mechanical analog computer, according to Stanford University . 

1936: Alan Turing , a British scientist and mathematician, presents the principle of a universal machine, later called the Turing machine, in a paper called "On Computable Numbers…" according to Chris Bernhardt's book " Turing's Vision " (The MIT Press, 2017). Turing machines are capable of computing anything that is computable. The central concept of the modern computer is based on his ideas. Turing is later involved in the development of the Turing-Welchman Bombe, an electro-mechanical device designed to decipher Nazi codes during World War II, according to the UK's National Museum of Computing . 

1937: John Vincent Atanasoff, a professor of physics and mathematics at Iowa State University, submits a grant proposal to build the first electric-only computer, without using gears, cams, belts or shafts.

original garage where Bill Hewlett and Dave Packard started their business

1939: David Packard and Bill Hewlett found the Hewlett Packard Company in Palo Alto, California. The pair decide the name of their new company by the toss of a coin, and Hewlett-Packard's first headquarters are in Packard's garage, according to MIT . 

1941: German inventor and engineer Konrad Zuse completes his Z3 machine, the world's earliest digital computer, according to Gerard O'Regan's book " A Brief History of Computing " (Springer, 2021). The machine was destroyed during a bombing raid on Berlin during World War II. Zuse fled the German capital after the defeat of Nazi Germany and later released the world's first commercial digital computer, the Z4, in 1950, according to O'Regan. 

1941: Atanasoff and his graduate student, Clifford Berry, design the first digital electronic computer in the U.S., called the Atanasoff-Berry Computer (ABC). This marks the first time a computer is able to store information on its main memory, and it is capable of performing one operation every 15 seconds, according to the book "Birthing the Computer" (Cambridge Scholars Publishing, 2016).

1945: Two professors at the University of Pennsylvania, John Mauchly and J. Presper Eckert, design and build the Electronic Numerical Integrator and Calculator (ENIAC). The machine is the first "automatic, general-purpose, electronic, decimal, digital computer," according to Edwin D. Reilly's book "Milestones in Computer Science and Information Technology" (Greenwood Press, 2003). 

Computer technicians operating the ENIAC

1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor . They discover how to make an electric switch with solid materials and without the need for a vacuum.

1949: A team at the University of Cambridge develops the Electronic Delay Storage Automatic Calculator (EDSAC), "the first practical stored-program computer," according to O'Regan. "EDSAC ran its first program in May 1949 when it calculated a table of squares and a list of prime numbers ," O'Regan wrote. In November 1949, scientists with the Council of Scientific and Industrial Research (CSIR), now called CSIRO, build Australia's first digital computer called the Council for Scientific and Industrial Research Automatic Computer (CSIRAC). CSIRAC is the first digital computer in the world to play music, according to O'Regan.
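Those first EDSAC jobs are easy to re-create today. A minimal Python version (the ranges here are arbitrary choices for illustration) prints a table of squares and lists primes using the sieve of Eratosthenes:

```python
def squares(limit):
    """A table of n and n squared, the kind of output EDSAC first produced."""
    return [(n, n * n) for n in range(limit + 1)]

def primes(limit):
    """List primes up to `limit` using the sieve of Eratosthenes."""
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            for multiple in range(p * p, limit + 1, p):
                sieve[multiple] = False
    return [n for n, is_prime in enumerate(sieve) if is_prime]

for n, sq in squares(10):
    print(f"{n:3d} {sq:5d}")
print(primes(50))  # [2, 3, 5, 7, 11, ..., 47]
```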

Late 20th century

1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL, short for COmmon Business-Oriented Language, according to the National Museum of American History. Hopper is later dubbed the "First Lady of Software" in her posthumous Presidential Medal of Freedom citation. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.

1954: John Backus and his team of programmers at IBM publish a paper describing their newly created FORTRAN programming language, an acronym for FORmula TRANslation, according to MIT .

1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby is later awarded the Nobel Prize in Physics for his work.

1968: Douglas Engelbart reveals a prototype of the modern computer at the Fall Joint Computer Conference, San Francisco. His presentation, called "A Research Center for Augmenting Human Intellect" includes a live demonstration of his computer, including a mouse and a graphical user interface (GUI), according to the Doug Engelbart Institute . This marks the development of the computer from a specialized machine for academics to a technology that is more accessible to the general public.

The first computer mouse, invented in 1963 by Douglas C. Engelbart

1969: Ken Thompson, Dennis Ritchie and a group of other developers at Bell Labs produce UNIX, an operating system that made "large-scale networking of diverse computing systems — and the internet — practical," according to Bell Labs. The team behind UNIX continued to develop the operating system using the C programming language, which they also optimized.

1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Random-Access Memory (DRAM) chip.

1971: A team of IBM engineers led by Alan Shugart invents the "floppy disk," enabling data to be shared among different computers.

1972: Ralph Baer, a German-American engineer, releases Magnavox Odyssey, the world's first home game console, in September 1972 , according to the Computer Museum of America . Months later, entrepreneur Nolan Bushnell and engineer Al Alcorn with Atari release Pong, the world's first commercially successful video game. 

1973: Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware.

1975: The magazine cover of the January issue of "Popular Electronics" highlights the Altair 8800 as the "world's first minicomputer kit to rival commercial models." After seeing the magazine issue, two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair, using the new BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft.

1976: Steve Jobs and Steve Wozniak co-found Apple Computer on April Fool's Day. They unveil Apple I, the first computer with a single-circuit board and ROM (Read Only Memory), according to MIT.

Apple I computer 1976

1977: The Commodore Personal Electronic Transactor (PET) is released onto the home computer market, featuring an MOS Technology 8-bit 6502 microprocessor, which controls the screen, keyboard and cassette player. The PET is especially successful in the education market, according to O'Regan.

1977: Radio Shack begins its initial production run of 3,000 TRS-80 Model 1 computers, disparagingly known as the "Trash 80," priced at $599, according to the National Museum of American History. Within a year, the company takes 250,000 orders for the computer, according to the book "How TRS-80 Enthusiasts Helped Spark the PC Revolution" (The Seeker Books, 2007).

1977: The first West Coast Computer Faire is held in San Francisco. Jobs and Wozniak present the Apple II computer at the Faire, which includes color graphics and features an audio cassette drive for storage.

1978: VisiCalc, the first computerized spreadsheet program is introduced.

1979: MicroPro International, founded by software engineer Seymour Rubenstein, releases WordStar, the world's first commercially successful word processor. WordStar is programmed by Rob Barnaby, and includes 137,000 lines of code, according to Matthew G. Kirschenbaum's book " Track Changes: A Literary History of Word Processing " (Harvard University Press, 2016).

1981: "Acorn," IBM's first personal computer, is released onto the market at a price point of $1,565, according to IBM. Acorn uses the MS-DOS operating system from Windows. Optional features include a display, printer, two diskette drives, extra memory, a game adapter and more.

A worker using an Acorn computer by IBM, 1981

1983: The Apple Lisa, standing for "Local Integrated Software Architecture" but also the name of Steve Jobs' daughter, according to the National Museum of American History ( NMAH ), is the first personal computer to feature a GUI. The machine also includes a drop-down menu and icons. Also this year, the Gavilan SC is released and is the first portable computer with a flip-form design and the very first to be sold as a "laptop."

1984: The Apple Macintosh is announced to the world during a Super Bowl advertisement. The Macintosh is launched with a retail price of $2,500, according to the NMAH.

1985: As a response to the Apple Lisa's GUI, Microsoft releases Windows in November 1985, the Guardian reported. Meanwhile, Commodore announces the Amiga 1000.

1989: Tim Berners-Lee, a British researcher at the European Organization for Nuclear Research ( CERN ), submits his proposal for what would become the World Wide Web. His paper details his ideas for Hyper Text Markup Language (HTML), the building blocks of the Web. 

1993: The Pentium microprocessor advances the use of graphics and music on PCs.

1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.

1997: Microsoft invests $150 million in Apple, which at the time is struggling financially.  This investment ends an ongoing court case in which Apple accused Microsoft of copying its operating system. 

1999: Wi-Fi, the abbreviated term for "wireless fidelity," is developed, initially covering a distance of up to 300 feet (91 meters), Wired reported.

21st century

2001: Mac OS X, later renamed OS X then simply macOS, is released by Apple as the successor to its standard Mac Operating System. OS X goes through 16 different versions, each with "10" as its title, and the first nine iterations are nicknamed after big cats, with the first being codenamed "Cheetah," TechRadar reported.  

2003: AMD's Athlon 64, the first 64-bit processor for personal computers, is released to customers. 

2004: The Mozilla Corporation launches Mozilla Firefox 1.0. The Web browser is one of the first major challenges to Internet Explorer, owned by Microsoft. During its first five years, Firefox exceeded a billion downloads by users, according to the Web Design Museum . 

2005: Google buys Android, a Linux-based mobile phone operating system.

2006: The MacBook Pro from Apple hits the shelves. The Pro is the company's first Intel-based, dual-core mobile computer. 

2009: Microsoft launches Windows 7 on July 22. The new operating system features the ability to pin applications to the taskbar, scatter windows away by shaking another window, easy-to-access jumplists, easier previews of tiles and more, TechRadar reported .  

Apple CEO Steve Jobs holds the iPad during the launch of Apple's new tablet computing device in San Francisco

2010: The iPad, Apple's flagship handheld tablet, is unveiled.

2011: Google releases the Chromebook, which runs on Google Chrome OS.

2015: Apple releases the Apple Watch. Microsoft releases Windows 10.

2016: The first reprogrammable quantum computer was created. "Until now, there hasn't been any quantum-computing platform that had the capability to program new algorithms into their system. They're usually each tailored to attack a particular algorithm," said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.

2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular Informatics" program that uses molecules as computers. "Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing," Anne Fischer, program manager in DARPA's Defense Sciences Office, said in a statement. "Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color. This richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based, digital architectures."

2019: A team at Google became the first to demonstrate quantum supremacy, creating a quantum computer that could feasibly outperform the most powerful classical computer, albeit for a very specific problem with no practical real-world application. The team described the computer, dubbed "Sycamore," in a paper that same year in the journal Nature. Achieving quantum advantage, in which a quantum computer solves a problem with real-world applications faster than the most powerful classical computer, is still a ways off.

2022: The first exascale supercomputer, and the world's fastest, Frontier, went online at the Oak Ridge Leadership Computing Facility (OLCF) in Tennessee. Built by Hewlett Packard Enterprise (HPE) at a cost of $600 million, Frontier uses nearly 10,000 AMD EPYC 7453 64-core CPUs alongside nearly 40,000 AMD Radeon Instinct MI250X GPUs. This machine ushered in the era of exascale computing: systems that can reach more than one exaFLOP, a measure of performance. Frontier is currently the only machine capable of such performance, and it is being used as a tool to aid scientific discovery.

What is the first computer in history?

Charles Babbage's Difference Engine, designed in the 1820s, is considered the first "mechanical" computer in history, according to the Science Museum in the U.K . Powered by steam with a hand crank, the machine calculated a series of values and printed the results in a table. 
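The Difference Engine produced such tables with the method of finite differences, which needs only repeated addition. The Python sketch below illustrates that idea (the example polynomial and the number of steps are our own choices, not Babbage's):

```python
def difference_table(first_row, steps):
    """Tabulate a polynomial using additions only, as the Difference
    Engine did. first_row holds [p(0), delta_p(0), delta2_p(0), ...]."""
    row = list(first_row)
    values = [row[0]]
    for _ in range(steps):
        # Cascade the differences: each entry absorbs the one below it.
        for i in range(len(row) - 1):
            row[i] += row[i + 1]
        values.append(row[0])
    return values

# For p(x) = x^2 + x + 1: p(0) = 1, delta_p(0) = 2, delta2_p = 2 (constant).
print(difference_table([1, 2, 2], 8))
# [1, 3, 7, 13, 21, 31, 43, 57, 73]
```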

What are the five generations of computing?

The "five generations of computing" is a framework for assessing the entire history of computing and the key technological advancements throughout it. 

The first generation, spanning the 1940s to the 1950s, covered vacuum tube-based machines. The second then progressed to incorporate transistor-based computing between the 50s and the 60s. In the 60s and 70s, the third generation gave rise to integrated circuit-based computing. We are now in between the fourth and fifth generations of computing, which are microprocessor-based and AI-based computing.

What is the most powerful computer in the world?

As of November 2023, the most powerful computer in the world is the Frontier supercomputer. The machine, which can reach a performance level of up to 1.102 exaFLOPS, ushered in the age of exascale computing in 2022 when it went online at Tennessee's Oak Ridge Leadership Computing Facility (OLCF).

There is, however, a potentially more powerful supercomputer waiting in the wings in the form of the Aurora supercomputer, which is housed at the Argonne National Laboratory (ANL) outside of Chicago.  Aurora went online in November 2023. Right now, it lags far behind Frontier, with performance levels of just 585.34 petaFLOPS (roughly half the performance of Frontier), although it's still not finished. When work is completed, the supercomputer is expected to reach performance levels higher than 2 exaFLOPS.

What was the first killer app?

Killer apps are widely understood to be those so essential that they are core to the technology they run on. There have been many through the years, from Word for Windows in 1989 to iTunes in 2001 to social media apps like WhatsApp in more recent years.

Several pieces of software may stake a claim to be the first killer app, but there is a broad consensus that VisiCalc, a spreadsheet program created by VisiCorp and originally released for the Apple II in 1979, holds that title. Steve Jobs even credits this app for propelling the Apple II to become the success it was, according to co-creator Dan Bricklin .

Additional resources

  • Fortune: A Look Back At 40 Years of Apple
  • The New Yorker: The First Windows
  • "A Brief History of Computing" by Gerard O'Regan (Springer, 2021)


A Comprehensive Guide to Generations of Computers

There are five generations of computers, and a sixth generation is emerging. Over the past decades, computers have evolved significantly, with each generation introducing new capabilities, improved performance, and enhanced features. The journey of computing through these generations is a fascinating tale of innovation, progress, and technological advancement. In this guide, we will delve into the various generations of computers, highlighting their characteristics, key advancements, and the impact they had on shaping the digital landscape.

Computer Generations


Generations of Computers

There are five generations of computers.

  • First generation computers used vacuum tubes.
  • Second generation computers used transistors.
  • Third generation computers used ICs (Integrated Circuits).
  • Microprocessors are used in fourth generation computers.
  • Fifth generation computers are the most modern ones that are commonly used nowadays.

Finally, the sixth generation, AI-powered supercomputers, is still emerging and evolving today, so it is not yet an officially and widely accepted category.


1. First Generation Computers – Vacuum Tubes

The first generation of computers, spanning the 1940s to the early 1950s, represents the initial foray into electronic computing. These machines were huge, expensive and marked by the use of vacuum tubes as their primary electronic component. Here are key aspects of the first generation of computers, along with notable examples.

Vacuum Tubes – Characteristics

Vacuum tubes are glass tubes containing electrodes used to control electrical current. They were the heart of early computers, performing functions like amplification and switching. The first generation marked the shift from mechanical calculating devices to electronic computing. This transition laid the foundation for subsequent generations to build upon. First generation computers processed data in binary code, using ones and zeros to represent information. These computers were primarily designed for scientific and mathematical calculations, often related to military or defense applications.

Vacuum Tube
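To make "ones and zeros" concrete, here is a small Python sketch of encoding a number as a fixed-width binary word and decoding it again; the 8-bit width is an assumption for the example, not a property of any particular first-generation machine.

```python
def to_bits(value, width=8):
    """Encode a non-negative integer as a fixed-width binary string."""
    return format(value, f"0{width}b")

def from_bits(bits):
    """Decode a binary string back into an integer."""
    return int(bits, 2)

word = to_bits(42)            # '00101010'
assert from_bits(word) == 42  # round-trips back to the same value
print(word)
```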

Programming Challenges & Other Issues

Programmers in the first generation had to physically wire the machine to perform specific tasks. This process was time-consuming and required a deep understanding of the machine’s architecture. Debugging and correcting errors in the programs were complex tasks due to the lack of high-level programming languages and debugging tools.

Vacuum tubes generated a considerable amount of heat, were prone to failure and consumed significant amounts of power. This made the machines large, cumbersome and challenging to maintain. Despite being revolutionary at the time, these computers were relatively slow by today’s standards and their applications were limited compared to modern computing.

Interaction with these computers was minimal and users often had to physically reconfigure the machine for different tasks. Skilled operators played a crucial role in the operation of first generation computers, handling tasks like loading programs and managing hardware components.

Examples of First Generation Computers

  • ENIAC (Electronic Numerical Integrator and Computer): Completed in 1945, ENIAC was one of the earliest electronic general-purpose computers. It consisted of around 17,468 vacuum tubes and occupied a large room.
  • UNIVAC I (Universal Automatic Computer): Developed in the early 1950s, UNIVAC I was the first commercially produced computer. It used vacuum tubes and magnetic tape for data storage.

ENIAC

Moving to Second Generation

First generation computers quickly became outdated as technology evolved. The rapid pace of advancements in subsequent generations rendered these machines obsolete within a relatively short time frame. Understanding the challenges and innovations of the first generation of computers provides valuable insights into the monumental strides made in subsequent generations. The transition from vacuum tubes to transistors in the second generation marked a pivotal moment in the history of computing, paving the way for smaller, more reliable and efficient machines.

2. Second Generation Computers – Transistors

The second generation of computers, spanning the late 1950s to the early 1960s, marked a significant leap forward in terms of technology and design compared to the first generation. The key innovation defining this era was the replacement of vacuum tubes with transistors, leading to improvements in size, reliability and efficiency. Here are some crucial aspects of the second generation, along with notable examples.

Transistor

Prominent Features

The most defining feature of second generation computers was the use of transistors as electronic components, replacing the bulky and less reliable vacuum tubes. Transistors were smaller, faster, more durable and consumed less power than vacuum tubes. This transition resulted in more compact and efficient computer systems. It also made them more affordable and accessible to a broader range of organizations and businesses.

  • Magnetic Core Memory – Second generation computers replaced the drum memory used in the first generation with magnetic core memory. This type of memory was faster, more reliable and allowed for random access to data. Magnetic core memory improved the overall performance and efficiency of computers, making them more suitable for a wider range of applications.
  • Printed Circuit Boards – Second generation computers saw the adoption of printed circuit boards, which simplified the construction of electronic circuits and contributed to the overall reliability of the systems. The use of printed circuit boards allowed for easier maintenance and troubleshooting.
  • Speed & Processing – Second generation computers demonstrated substantial improvements in processing speed compared to their predecessors, allowing for more complex calculations and data processing. These computers found applications in scientific research, business data processing and military operations, reflecting the growing versatility of computing technology.

Programming & Processing

With the advent of assembly languages and high-level programming languages like FORTRAN and COBOL, programming became more accessible and less reliant on low-level machine code. This shift allowed for more efficient programming, making it easier for developers to write and debug code.

Second generation computers often operated in batch processing mode, where a series of jobs were submitted for processing together. This mode improved the overall efficiency of computing tasks.
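To make the batch idea concrete, here is a minimal Python sketch (the job names and the representation of jobs as plain functions are our own assumptions): jobs are queued up and run to completion one after another, with no user interaction in between.

```python
from collections import deque

def run_batch(jobs):
    """Run queued jobs strictly first-in, first-out, collecting results."""
    queue = deque(jobs)
    results = []
    while queue:
        job = queue.popleft()
        results.append(job())  # each job finishes before the next starts
    return results

# Illustrative "jobs" standing in for card decks handed to an operator:
payroll = lambda: "payroll done"
census = lambda: "census tabulated"
print(run_batch([payroll, census]))  # ['payroll done', 'census tabulated']
```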

Examples of Second Generation Computers

  • IBM 1401 and CDC 1604 are examples of second generation computers that were widely used for batch processing applications.
  • IBM 7090 and UNIVAC 1107 were examples of second generation computers that were smaller and more commercially viable.

IBM 1401 Computer

Moving to Third Generation

The second generation marked the beginning of the end of the punched card era. While punched cards were still used for input and output, magnetic tapes and disks became more prevalent, offering faster and more efficient data storage solutions. The transition to transistors and other technological advancements during the second generation laid the groundwork for subsequent developments in computing. The improvements in size, speed and reliability set the stage for further innovation in the third generation, which would see the integration of integrated circuits and bring about a new era in computing.

3. Third Generation of Computers – Integrated Circuits

The third generation of computers, spanning the 1960s to the 1970s, marked a significant evolution in computing technology, introducing integrated circuits (ICs) and bringing about improvements in performance, reliability and versatility. This era witnessed a shift from discrete transistors to integrated circuits, enabling more powerful and compact computer systems. Here are key aspects of the third generation, along with notable examples.

Integrated Circuits (ICs)

The defining feature of third generation computers was the use of integrated circuits, which incorporated multiple transistors and other electronic components onto a single semiconductor chip. Integrated circuits significantly reduced the size of computers, enhanced reliability and improved overall performance. The miniaturization allowed for the creation of smaller, more efficient and cost-effective systems.

Microprocessor Chip

Advancements with Third Generation

  • Graphics – Third generation computers started to incorporate basic graphics capabilities, paving the way for the development of graphical user interfaces (GUIs) in subsequent generations. Graphics capabilities found applications in scientific visualization, engineering and early computer-aided design (CAD).
  • High-level Programming Languages –  The use of high-level programming languages continued to evolve in the third generation. Languages such as COBOL, FORTRAN and ALGOL gained popularity, making programming more accessible and efficient. The availability of high-level languages allowed programmers to focus on problem-solving rather than dealing with the complexities of machine code, fostering greater productivity and software development.
  • Time-Sharing Systems – Third generation computers introduced more sophisticated operating systems, facilitating better management of resources and scheduling of tasks. Time-sharing systems emerged, enabling multiple users to access a computer simultaneously. This marked a departure from batch processing, allowing for interactive computing and improved resource utilization. (A toy scheduling sketch follows this list.)
  • Input/Output Devices – The third generation saw improvements in input/output devices. The use of terminals and displays became more widespread, enhancing user interaction and making computing more user-friendly.
  • Remote Data Access – With improvements in communication technology, third generation computers began to support remote data access. This facilitated the sharing of information across different locations and laid the groundwork for the interconnected computing environments of the future.
  • Magnetic Tape and Disk Storage – While magnetic tapes were still used for data storage, third generation computers witnessed the increased adoption of magnetic disk storage. Disk storage allowed for faster access to data and became a standard feature in computer systems.
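As noted in the time-sharing item above, the core trick is giving each user a short slice of processor time in turn. A minimal round-robin sketch (hypothetical Python using generators, nothing like a real third-generation scheduler) looks like this:

```python
# Hypothetical round-robin time-sharing sketch: each generator plays the
# role of one user's program and yields the CPU at the end of its slice.
def user_task(name, steps):
    for step in range(1, steps + 1):
        print(f"{name}: step {step}")
        yield                          # hand the CPU back to the scheduler

tasks = [user_task("alice", 3), user_task("bob", 2), user_task("carol", 3)]

while tasks:
    task = tasks.pop(0)                # take the next task in the queue
    try:
        next(task)                     # run exactly one time slice
        tasks.append(task)             # rotate it to the back of the queue
    except StopIteration:
        pass                           # the task finished; drop it
```

Each user sees steady progress even though only one task runs at any instant, which is exactly the illusion time-sharing systems created.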

Examples – Mainframes & Minicomputers

Third generation computers saw the widespread adoption of mainframe computers, which became the backbone of large-scale data processing for organizations and businesses. IBM System/360, introduced in 1964, was a groundbreaking series of mainframe computers that offered a range of compatible models for different applications. The System/360 architecture set a standard for compatibility across various models and paved the way for future computing systems.

The third generation also saw the rise of minicomputers, which were smaller, more affordable and suitable for medium-scale computing tasks. The DEC PDP-11, introduced in 1970, was a highly successful minicomputer that found applications in research, education and industrial control systems.

Mainframe Computer

Moving to Fourth Generation

The third generation of computers represented a significant step forward in terms of technology, with integrated circuits revolutionizing the design and capabilities of computing systems. The adoption of high-level programming languages, sophisticated operating systems and advancements in storage and communication set the stage for the continued evolution of computers in the fourth generation and beyond.

4. Fourth Generation Computers – Microprocessors

The fourth generation of computers, spanning the late 1970s through the 1980s and into the 1990s, witnessed transformative advancements in technology, introducing microprocessors, personal computers and a shift towards user-friendly interfaces. This era marked a departure from the large, centralized mainframe systems of the previous generations. Here are key aspects of the fourth generation, along with notable examples.

Microprocessor

Features & Advancements

  • Microprocessors – The most significant development of the fourth generation was the integration of microprocessors. A microprocessor places the entire central processing unit (CPU) on a single semiconductor chip, bringing unprecedented computing power to smaller, more affordable systems. Microprocessors enabled the creation of compact, powerful and energy-efficient computers. This innovation paved the way for the personal computer revolution.
  • Personal Computers (PCs) – The fourth generation saw the rise of personal computers, making computing accessible to individuals and small businesses.
  • Storage Advancements – Fourth generation computers saw the widespread adoption of hard disk drives (HDDs) for mass storage. Hard drives offered larger capacities and faster access to data than previous storage technologies. The introduction of CDs as a storage medium for software distribution and multimedia content became prominent during this era.
  • Parallel Processing and Supercomputers – The fourth generation saw advancements in parallel processing, enabling computers to perform multiple tasks simultaneously.
  • Graphical User Interfaces (GUIs) – GUIs became a standard feature in the fourth generation computers, providing users with visual interfaces, icons and point-and-click interactions. GUIs made computers more user-friendly and accessible to individuals with limited technical expertise, contributing to the democratization of computing.
  • Software Development – Fourth generation computers saw a proliferation of software applications for various purposes, including word processing, spreadsheets, databases and entertainment. The availability of commercial software expanded, providing users with a wide range of options to enhance productivity and creativity.

Networking and the Internet

The fourth generation saw the expansion of computer networking, laying the groundwork for the development of the internet.

  • TCP/IP Protocol – The adoption of the TCP/IP protocol standardized communication on the emerging internet, facilitating global connectivity (a minimal socket sketch follows this list).
  • ARPANET – The precursor to the internet, ARPANET, continued to evolve during this era, connecting research institutions and paving the way for the information age.
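TCP/IP's job, reliably moving bytes between two endpoints, can be shown with Python's standard socket module (the address and message here are arbitrary placeholders; this is a minimal sketch, not production networking code):

```python
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 50007        # placeholder loopback address

def echo_server():
    # Accept one TCP connection and echo back whatever arrives.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            conn.sendall(conn.recv(1024))

threading.Thread(target=echo_server, daemon=True).start()
time.sleep(0.2)                        # crude wait for the listener (sketch only)

# Client side: connect over TCP/IP and exchange one message.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"hello over TCP/IP")
    print(cli.recv(1024))              # b'hello over TCP/IP'
```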

Examples of Fourth Generation Computers

The fourth generation witnessed the development of portable computers and laptops, providing users with mobility and flexibility.

  • Personal Computers – Introduced in 1981, the IBM PC became a standard for personal computing. Its open architecture allowed for the use of third-party hardware and software, contributing to the widespread adoption of PCs.
  • Portable Computers – The Osborne 1 (1981) and the IBM ThinkPad (1992) were early examples of portable computers that contributed to the evolution of mobile computing.
  • Apple Macintosh – Launched in 1984, the Macintosh brought a graphical user interface (GUI) to personal computers, enhancing user interaction and making computing more intuitive.
  • Supercomputers – High-performance computing became more accessible, with the development of supercomputers like the Cray-2 (1985) and the Connection Machine (1987).

Apple’s Macintosh System Software (the ancestor of today’s macOS) and Microsoft Windows were prominent examples of operating systems with graphical user interfaces.

Moving to Fifth Generation

The fourth generation of computers revolutionized the landscape by making computing power available to individuals, fostering a new era of accessibility and innovation. The integration of microprocessors, the rise of personal computers and the development of user-friendly interfaces laid the foundation for the diverse and interconnected computing ecosystem we experience today.

Apple Macintosh

5. Fifth Generation of Computers

The fifth generation of computers represents a period of computing that extends from the late 20th century into the early 21st century. This era is characterized by advancements in parallel processing, artificial intelligence (AI) and the development of novel computing architectures. While the exact timeline of the fifth generation can vary, it generally covers the period from the mid-1980s to the present day. Here are key aspects of the fifth generation, along with notable examples.

  • Parallel Processing – Fifth generation computers embraced parallel processing, the simultaneous execution of multiple tasks to enhance computational speed and efficiency. Parallel processing allowed for the development of supercomputers and high-performance computing clusters capable of tackling complex problems in fields like scientific research, weather modeling and cryptography (see the sketch after this list).
  • Artificial Intelligence (AI) – The fifth generation is often synonymous with the integration of artificial intelligence into computing systems. Advanced programming languages, expert systems and neural networks became integral tools in the development of AI applications. AI found use in areas like natural language processing, image recognition and expert systems for decision-making.
  • Knowledge-Based Systems – Knowledge-based systems, also known as expert systems, were developed during the fifth generation. These systems used human knowledge to make decisions and solve complex problems.
  • Natural Language Processing (NLP) – Fifth generation computers focused on improving the ability to understand and respond to human language. NLP applications included language translation, voice recognition and text understanding.
  • Massive Parallelism and Distributed Computing – The fifth generation witnessed a shift towards massive parallelism and distributed computing architectures.
  • Quantum Computing (Emerging) – Towards the latter part of the fifth generation and into the sixth generation, quantum computing emerged as a groundbreaking field. Quantum computers leverage the principles of quantum mechanics to perform computations at speeds that classical computers cannot achieve.
  • Personal Computing Evolution – The fifth generation saw the continued evolution of personal computing, with advancements in hardware, software and user interfaces.
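As referenced in the parallel-processing item above, the essential idea, splitting one workload across several processors, can be sketched with Python's standard multiprocessing module (the simulate function is an invented stand-in for a real scientific kernel):

```python
from multiprocessing import Pool

def simulate(size):
    # Invented stand-in for one unit of a larger scientific workload.
    return sum(i * i for i in range(size))

if __name__ == "__main__":
    workloads = [100_000, 200_000, 300_000, 400_000]
    with Pool() as pool:                      # one worker per CPU core by default
        results = pool.map(simulate, workloads)   # units computed in parallel
    print(results)
```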

Fifth Generation Computer Systems (FGCS) & Internet

The Japanese government launched the Fifth Generation Computer Systems project in the 1980s, aiming to develop advanced computer systems with AI capabilities. The project was focused on parallel processing, knowledge-based systems and natural language processing. While it didn’t achieve all its ambitious goals, it contributed to advancements in AI research.

The fifth generation witnessed the widespread adoption of the internet as a global communication and information-sharing platform. The development of the World Wide Web in the early 1990s transformed how information is accessed and shared, leading to the interconnected digital world we experience today.

Examples – AI Systems & Distributed Computing

  • IBM’s Deep Blue, which defeated a world chess champion in 1997, is a notable example of AI achievements during this era.
  • Systems like IBM’s Watson, known for winning Jeopardy! in 2011, showcased advancements in natural language processing.
  • Distributed computing projects, like SETI@home, utilized the power of networked computers worldwide to analyze radio signals from space in the search for extraterrestrial intelligence.

The proliferation of personal computers, laptops and the eventual rise of smartphones and tablets exemplify the ongoing evolution of computing devices. Companies like IBM, Google and startups like Rigetti and D-Wave are actively working on quantum computing research and development.

IBM Watson

Moving to Sixth Generation

The fifth generation of computers represents a period of profound transformation, with a focus on AI, parallel processing and the development of technologies that continue to shape the digital landscape. As technology continues to advance, the fifth generation sets the stage for ongoing innovations in computing, including the exploration of quantum computing and the continued integration of AI into various aspects of our lives.

6. Sixth Generation of Computers

The sixth generation of computers is still in the early stages of development, and concrete examples have not yet been fully realized. Predictions and expectations for the sixth generation generally involve advancements in technologies such as quantum computing, artificial intelligence (AI) and further integration of computing into various aspects of daily life. Here are key concepts associated with the potential characteristics of the sixth generation.

AI Chips

  • Quantum Computing – Quantum computing represents a paradigm shift in computing, utilizing the principles of quantum mechanics to perform calculations at speeds that surpass classical computers. Quantum computers have the potential to solve complex problems, such as optimization tasks, cryptography and simulations, at a pace that was previously unimaginable (a toy single-qubit simulation follows this list).
  • Biocomputing and Neuromorphic Computing – The sixth generation may explore the integration of biological components into computing systems. This includes the use of DNA computing and other biologically-inspired computing approaches. Drawing inspiration from the human brain, neuromorphic computing aims to create processors that mimic the brain’s architecture, potentially leading to more efficient and powerful computing systems for tasks like pattern recognition and learning.
  • AI Integration – The sixth generation is expected to witness the development of even more advanced and sophisticated AI systems, capable of complex reasoning, problem-solving and decision-making. AI may become further integrated into various aspects of daily life, from autonomous vehicles and smart homes to personalized healthcare and virtual assistants.
  • Advanced Robotics – Sixth generation computers may contribute to the development of more advanced and autonomous robotic systems. These could find applications in fields like healthcare, manufacturing and space exploration.
  • Brain-Computer Interfaces (BCIs) – The integration of computers with the human brain through BCIs could become more sophisticated in the sixth generation, allowing for direct communication between the brain and computing systems.
  • Augmented and Virtual Reality – Advances in augmented and virtual reality technologies may further enhance the integration of computing into human experiences. Spatial computing devices such as the Apple Vision Pro are expected to take computer technology to a new level.
  • Green Computing and Sustainability – The sixth generation may prioritize sustainability and energy efficiency in computing, exploring new technologies to reduce the environmental impact of large-scale computing systems.
  • Edge Computing – This involves processing data closer to the source rather than relying on centralized cloud servers. The sixth generation may see further developments in edge computing for faster data processing and reduced latency.
  • Hybrid Architectures – Hybrid computing architectures that leverage a combination of classical computing, quantum computing and other specialized computing technologies may become prevalent in the sixth generation.
  • Advanced Encryption – With the growing importance of cybersecurity, the sixth generation is likely to bring advancements in encryption and security measures to protect sensitive data.
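To make the quantum item above slightly less abstract, here is a toy single-qubit simulation in plain Python (real quantum hardware and SDKs work very differently; this only illustrates superposition): a qubit's state is a pair of complex amplitudes, a Hadamard gate spreads the state evenly, and measurement probabilities are the squared magnitudes of the amplitudes.

```python
import math

# Toy qubit: the state is (amplitude of |0>, amplitude of |1>).
state = (1 + 0j, 0 + 0j)               # start deterministically in |0>

def hadamard(amp0, amp1):
    # Hadamard gate: sends |0> and |1> into equal superpositions.
    s = 1 / math.sqrt(2)
    return (s * (amp0 + amp1), s * (amp0 - amp1))

state = hadamard(*state)

p0 = abs(state[0]) ** 2                # probability of measuring 0
p1 = abs(state[1]) ** 2                # probability of measuring 1
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")   # both 0.50: an equal superposition
```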

It’s essential to note that the predictions for the sixth generation are speculative and the timeline for its full realization may extend well into the future. Ongoing research and development in various fields, including quantum computing, AI and biotechnology, will play a crucial role in shaping the characteristics of the sixth generation of computers.

Sixth Generation Computers

The evolution of computers across different generations reflects the relentless pursuit of innovation and improvement in the field of computing. Each generation has left an indelible mark on the digital landscape, shaping the way we work, communicate and live. As we look to the future, the ongoing advancements in technology continue to redefine the possibilities of computing, promising a world where the sixth generation and beyond will unlock new frontiers in computational capabilities.

Use of Latest Computers

What Are the 5 Generations of Computers?


Vangie Beal

Reviewed by Web Webster

We’ve come a long way since the first generation of computers, with each new generation bringing significant advances in speed and power to computing tasks. Learn about each of the five generations of computers and the major technology developments that have led to the computing technology we use today.

The history of computer development is a computer science topic that is often used to reference the different generations of computing devices. Each computer generation is characterized by a major technological development that fundamentally changed the way computers operate.

Each major development from the 1940s to the present day (the fifth generation of computers) has introduced smaller, cheaper, more powerful, and more efficient computing machines, shrinking storage hardware and increasing portability.

In this Webopedia Study Guide, you’ll learn more about each of the five generations of computers and the advances in technology that have led to the development of the many computing devices we use today.

Our journey through the five generations of computers starts in 1940 with vacuum tube circuitry and goes to the present day and beyond with artificial intelligence (AI) systems and devices.

Let’s take a look…

  • First Generation: Vacuum Tubes
  • Second Generation: Transistors
  • Third Generation: Integrated Circuits
  • Fourth Generation: Microprocessors
  • Fifth Generation: Artificial Intelligence

The following technology definitions will help you to better understand the five generations of computing:

  • Microprocessor
  • Magnetic drums
  • Integrated circuit
  • Semiconductor
  • Nanotechnology
  • Machine language
  • Assembly language
  • Artificial intelligence

The first generation of computer systems used vacuum tubes for circuitry and magnetic drums for main memory, and they were often enormous, taking up entire rooms. These computers were very expensive to operate, and in addition to using a great deal of electricity, the first computers generated a lot of heat, which was often the cause of malfunctions. The maximum internal storage capacity was 20,000 characters.

First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. It would take operators days or even weeks to set up a new problem. Input was based on punched cards and paper tape, and output was displayed on printouts.

It was in this generation that the von Neumann architecture was introduced, which describes the design of an electronic digital computer in which a single memory holds both the program and its data. Later, the UNIVAC and ENIAC computers, built by J. Presper Eckert and John Mauchly, became examples of first generation computer technology. The UNIVAC was the first commercial computer delivered to a client, the U.S. Census Bureau, in 1951.
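The essence of the von Neumann design, one memory holding both the program and its data, with a loop that fetches, decodes, and executes instructions, can be sketched in a few lines (a hypothetical toy machine in Python; the instruction set is invented for illustration):

```python
# Toy stored-program machine (hypothetical): instructions and data share
# one memory, the defining trait of the von Neumann architecture.
memory = [
    ("LOAD", 6),     # 0: acc = memory[6]
    ("ADD", 7),      # 1: acc = acc + memory[7]
    ("STORE", 8),    # 2: memory[8] = acc
    ("HALT", 0),     # 3: stop
    0, 0,            # 4-5: unused
    20, 22,          # 6-7: data
    0,               # 8: result is stored here
]

acc, pc = 0, 0                     # accumulator and program counter
while True:
    op, addr = memory[pc]          # fetch and decode the next instruction
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break                      # execute: stop the machine

print(memory[8])                   # 42
```

Because the program sits in ordinary memory, it can in principle be modified like any other data, which is what made stored-program machines so flexible.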


The world would see transistors replace vacuum tubes in the second generation of computers. The transistor was invented at Bell Labs in 1947 but did not see widespread use in computers until the late 1950s. This generation of computers also included hardware advances like magnetic core memory, magnetic tape, and the magnetic disk.

The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient, and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that could damage the computer, it was a vast improvement over the vacuum tube. A second-generation computer still relied on punched cards for input and printouts for output.

When Did Computers Start Using Assembly Languages?

Second-generation computers moved from cryptic binary language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.
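At bottom, an assembler is a translation table from human-readable mnemonics to the numeric codes the hardware executes. A hypothetical sketch in Python (the mnemonics and the 4-bit opcode format are invented for illustration):

```python
# Invented instruction format: a 4-bit opcode followed by a 4-bit operand.
OPCODES = {"HALT": 0x0, "LOAD": 0x1, "ADD": 0x2, "STORE": 0x3}

def assemble(source):
    """Translate symbolic assembly lines into numeric machine words."""
    words = []
    for line in source:
        mnemonic, operand = line.split()
        words.append((OPCODES[mnemonic] << 4) | int(operand))
    return words

program = ["LOAD 6", "ADD 7", "STORE 8", "HALT 0"]
for line, word in zip(program, assemble(program)):
    print(f"{line:<8} -> {word:08b}")   # e.g. 'LOAD 6  -> 00010110'
```

Real assemblers of the era also resolved symbolic labels and addresses; the lookup table above is only the core idea.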

The first computers of this generation were developed for the atomic energy industry.

The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.

Instead of punched cards and printouts, users would interact with a third-generation computer through keyboards, monitors, and interfaces with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers, for the first time, became accessible to a mass audience because they were smaller and cheaper than their predecessors.

Did You Know… ? Integrated circuit (IC) chips are small electronic devices made out of semiconductor material. The first integrated circuits were developed independently in the late 1950s by Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor.

The microprocessor ushered in the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. The technology in the first generation that filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, integrated all the components of the computer, from the central processing unit and memory to input/output controls, on a single chip.

In 1981, IBM introduced its first personal computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use the microprocessor chip.

As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. The fourth generation also saw the development of GUIs, the mouse, and handheld devices.

The fifth generation of computer technology, based on artificial intelligence, is still in development. However, some applications, such as voice recognition, are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. This generation has also packed ever larger amounts of storage into compact, portable devices.

Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that will respond to natural language input and are capable of learning and self-organization.


Generations of Computers – Computer Fundamentals

Generations of Computer: The modern computer took shape over a long period of time. The evolution of the computer began around the 16th century. The early computer underwent many changes, obviously for the better, and continuously improved in speed, accuracy, size, and price to arrive at the modern-day computer.

Basic Terms Related to Computers

The basic terms related to generations of computers are listed below.

  • Vacuum Tube: A vacuum tube controls the flow of electrons in a vacuum. Vacuum tubes were generally used in switches, amplifiers, radios, televisions, etc.
  • Transistor: A transistor helps in controlling the flow of electricity in devices, it works as an amplifier or a switch.
  • Integrated Circuit (IC): Integrated circuits are silicon chips that contain circuit elements such as transistors, resistors, etc.
  • Microprocessor: A microprocessor is a single integrated circuit that contains the CPU and its supporting circuitry.
  • Central Processing Unit (CPU): The CPU is called the brain of the computer. CPU performs processing and operations work.
  • Magnetic Drum: A magnetic drum is a cylinder coated with magnetic material on which data is stored.
  • Magnetic Core: Magnetic cores are used to store information. These are arrays of small rings.
  • Machine Language: Machine Language is the language that a computer accepts (in the form of binary digits). It is also called low-level programming language.
  • Memory: Memory is used to store data, information, and program in a computer.
  • Artificial Intelligence: Artificial Intelligence deals with creating intelligent machines and behaviors.

Phases of Computer Generations

This long period is often conveniently divided into the following phases, called computer generations.

  • First Generation Computers (1940-1956)
  • Second Generation Computers (1956-1963)
  • Third Generation Computers (1964-1971)
  • Fourth Generation Computers (1971-Present)
  • Fifth Generation Computers (Present and Beyond)
Generation           Time Period        Evolving Hardware
First Generation     1940s – 1950s      Vacuum tube based
Second Generation    1950s – 1960s      Transistor based
Third Generation     1960s – 1970s      Integrated circuit based
Fourth Generation    1970s – Present    Microprocessor based
Fifth Generation     Present – Future   Artificial intelligence based

Before the era of electronic computers and tools such as calculators, spreadsheets, and computer algebra systems, mathematicians and inventors searched for solutions to ease the burden of calculation.

Below are eight mechanical calculators that preceded the invention of the modern computer.

  • Abacus (ca. 2700 BC)
  • Pascal’s Calculator (1652)
  • Stepped Reckoner (1694)
  • Arithmometer (1820)
  • Comptometer (1887) and Comptograph (1889)
  • The Difference Engine (1822)
  • Analytical Engine (1834)
  • The Millionaire (1893)

First Generation Computers

The technology behind the first generation of computers was a fragile glass device called the vacuum tube. These computers were very heavy and very large. They were not very reliable, and programming on them was a tedious task, as they used low-level programming languages and had no operating system. First-generation computers were used for calculation, storage, and control purposes. They were so bulky and large that they needed a full room, and they consumed a lot of electricity. Punched cards were used for input and external storage, and magnetic tape was also used. Machine and assembly languages were developed in this period.

Examples of some main first-generation computers are mentioned below.

  • ENIAC: The Electronic Numerical Integrator and Computer, built by J. Presper Eckert and John V. Mauchly, was a general-purpose computer. It was cumbersome and large, and it contained 18,000 vacuum tubes.
  • EDVAC: The Electronic Discrete Variable Automatic Computer was designed by von Neumann. It could store instructions as well as data, which enhanced its speed.
  • UNIVAC: The Universal Automatic Computer was developed by Eckert and Mauchly and first delivered in 1951.

Vacuum Tube

Characteristics of First-Generation Computers

  • Main electronic component – Vacuum tube.
  • Programming language – Machine language.
  • Memory – Magnetic tape and magnetic drums.
  • Input/output devices – Paper tape and punched cards.
  • Speed and size – Very slow and very large (often taking up an entire room).
  • Examples of the first generation – IBM 650, IBM 701, ENIAC, UNIVAC 1, etc.

Second Generation Computers

Second-generation computers used transistors rather than bulky vacuum tubes. Another feature was magnetic core storage. A transistor is a device composed of semiconductor material that amplifies a signal or opens or closes a circuit.

Second Generation Computer

Transistors were invented at Bell Labs in 1947. The use of transistors made it possible for computers to run more powerfully and at greater speed, and it reduced their size and cost as well as the heat that vacuum tubes had generated. The central processing unit (CPU), memory, programming languages, and input and output units also came into use in the second generation.

Programming shifted from machine language to assembly language and high-level languages, which made programming a comparatively simple task for programmers. Languages used for programming in this era were FORTRAN (1956), ALGOL (1958), and COBOL (1959).

Transistor

Characteristics of Second-Generation Computers

  • Main electronic component – Transistor.
  • Programming language – Machine language and assembly language.
  • Memory – Magnetic core and magnetic tape/disk.
  • Input/output devices – Magnetic tape and punched cards.
  • Power and size – Smaller in size, lower power consumption, and less heat generated (in comparison with first-generation computers).
  • Examples of the second generation – PDP-8, IBM 1400 series, IBM 7090 and 7094, UNIVAC 1107, CDC 3600, etc.

Third Generation Computers

During the third generation, technology shifted from discrete transistors to integrated circuits, also referred to as ICs, in which a number of transistors were placed on silicon chips, called semiconductors. The main features of this era's computers were speed and reliability. ICs were made from silicon and were also called silicon chips.

Third Generation Computer

Computer programs were designed to make the machine work, and the operating system was a program designed to manage the machine as a whole. Because of the operating system, the machine could execute multiple jobs simultaneously. Integrated circuits replaced the many discrete transistors used in second generation machines.

A single IC has many transistors, registers, and capacitors built on one thin slice of silicon. Cost and size were reduced, and memory capacity and working efficiency increased in this generation. Programming was now done in higher-level languages such as BASIC (Beginners' All-purpose Symbolic Instruction Code). Minicomputers also took shape during this era.

Integrated Circuit

Characteristics of Third-Generation Computers

  • Main electronic component – Integrated circuits (ICs).
  • Programming language – High-level languages.
  • Memory – Large magnetic core, magnetic tape/disk.
  • Input/output devices – Magnetic tape, monitor, keyboard, printer, etc.
  • Examples of the third generation – IBM 360, IBM 370, PDP-11, NCR 395, B6500, UNIVAC 1108, etc.

Fourth Generation Computers

In 1971 the first microprocessors were used: large-scale integration (LSI) circuits built on a single chip. The advantage of this technology is that one microprocessor can contain all the circuits required to perform arithmetic, logic, and control functions on one chip. LSI placed thousands of transistors onto a single chip.

Fourth Generation Computer

The computers using microchips were called microcomputers. This generation provided even smaller computers with larger capacities. Then Very Large Scale Integration (VLSI) circuits replaced LSI circuits. The Intel 4004 chip, developed in 1971, placed all the components of the computer, from the central processing unit and memory to input/output controls, on one chip, allowing computers to shrink drastically in size. VLSI placed several hundred thousand transistors on a single silicon chip, and this silicon chip is known as the microprocessor.

Technologies like multiprocessing, multiprogramming, time-sharing, and virtual memory, along with higher operating speeds, made the computer a more user-friendly and common device. The concepts of personal computers and computer networks came into being in the fourth generation.

Microprocessor

Characteristics of Fourth-Generation Computers

  • Main electronic component – Very-large-scale integration (VLSI) and the microprocessor (VLSI has thousands of transistors on a single microchip).
  • Memory – Semiconductor memory (such as RAM).
  • Input/output devices – Pointing devices, optical scanners, keyboard, monitor, printer, etc.
  • Examples of the fourth generation – IBM PC, STAR 1000, Apple II, Apple Macintosh, Altair 8800, etc.

Fifth Generation Computers

The technology behind the fifth generation of computers is AI. It allows computers to behave like humans and can be seen in programs for voice recognition, in medicine, and in entertainment. In the field of game playing, too, computers have shown remarkable performance, proving capable of beating human competitors.

Fifth Generation Computers

Speed is at its highest, size is at its smallest, and the range of uses has increased remarkably in fifth generation computers. Though one hundred percent AI has not been achieved to date, given present developments it can be said that this dream too will become a reality very soon.

To summarize the features of the various generations of computers: there has been significant improvement in the speed and accuracy of computing, sizes have shrunk over the years, costs keep diminishing, and reliability keeps increasing.

AI-Based Computers

Characteristics of Fifth-Generation Computers

  • Main electronic component – Based on artificial intelligence; uses Ultra Large-Scale Integration (ULSI) technology and parallel processing (ULSI has millions of transistors on a single microchip, and parallel processing uses two or more microprocessors to run tasks simultaneously).
  • Language – Understands natural language (human language).
  • Size – Portable and small in size.
  • Input/output devices – Trackpad (or touchpad), touchscreen, pen, speech input (voice/speech recognition), light scanner, printer, keyboard, monitor, mouse, etc.
  • Examples of the fifth generation – Desktops, laptops, tablets, smartphones, etc.

FAQs on Generations of Computer

What are the five generations of computers?

The five generations of computers are:

1. First Generation (1940s–1950s): Characterized by vacuum tubes and punched cards. Examples: ENIAC, UNIVAC.
2. Second Generation (1950s–1960s): Transistors replaced vacuum tubes, allowing smaller and more efficient computers. Introduction of high-level programming languages. Examples: IBM 1401, IBM 7094.
3. Third Generation (1960s–1970s): Integrated circuits (ICs) replaced transistors, leading to smaller and faster computers. Introduction of operating systems. Examples: IBM System/360, DEC PDP-11.
4. Fourth Generation (1970s–1980s): Microprocessors brought computing power to individual users. Introduction of personal computers. Examples: IBM PC, Apple Macintosh.
5. Fifth Generation (1980s–Present): Focus on parallel processing, artificial intelligence (AI), and natural language processing. Development of supercomputers and expert systems, with ongoing advancements in AI and machine learning. Examples: IBM Watson, Google’s DeepMind.

What is Gen Z technology?

Gen Z technology encompasses the digital tools and platforms that define the experiences of individuals born roughly between the mid-1990s and early 2010s. This generation is characterized by its seamless integration of smartphones, social media, online collaboration, and video content into daily life, shaping their communication, learning, and entertainment habits.

What is Artificial Intelligence?

Artificial Intelligence (AI) is the simulation of human intelligence in machines. It involves programming computers to think, learn, and perform tasks that traditionally require human intelligence, such as problem-solving and decision-making. AI encompasses subfields like machine learning and natural language processing, with applications ranging from virtual assistants to autonomous vehicles.

What was the First Computer?

The ENIAC (Electronic Numerical Integrator and Computer), completed in 1945, is widely regarded as the first electronic general-purpose computer.

Who is Known as the Father of Computers?

Charles Babbage is known as the Father of Computers for his pioneering work on the concept of a programmable mechanical computer in the 19th century.


Essay on Evolution of Computers

Students are often asked to write an essay on Evolution of Computers in their schools and colleges. And if you’re also looking for the same, we have created 100-word, 250-word, and 500-word essays on the topic.

Let’s take a look…

100 Words Essay on Evolution of Computers

The Birth of Computers

Computers were born in the 19th century. Charles Babbage, an English mathematician, designed a machine called the Analytical Engine. It was a mechanical computer that used punched cards.

Early 20th Century

In the 1930s and 1940s, computers like the ENIAC were developed. They were large machines that used vacuum tubes and punch cards. They were used in World War II.

Mid 20th Century

In the 1950s and 1960s, transistors replaced vacuum tubes in computers. This made them smaller, faster, and cheaper, and paved the way for personal computers.

Today’s computers are small and powerful. They use microprocessors and can do complex tasks. They are a part of our daily lives.

250 Words Essay on Evolution of Computers

The Genesis of Computers

The Birth of Modern Computers

The 20th century marked the onset of modern computing. The 1936 invention of the Turing machine by Alan Turing laid the groundwork for theoretical computation. During World War II, the ENIAC (Electronic Numerical Integrator and Computer) was developed, marking the advent of the first large-scale digital computer.

The Era of Personal Computers

The 1970s and 1980s saw the rise of personal computers. Companies like IBM and Apple revolutionized the industry, making computers accessible to the public. This era also marked the birth of the Internet, transforming the way computers were used.

Present and Future of Computers

Today, computers have evolved to become an integral part of our lives, from smartphones to smart homes. The future holds immense possibilities, with quantum computing and AI promising to redefine our understanding of computers. The evolution of computers is a testament to human ingenuity and innovation, and their future continues to inspire awe and anticipation.

500 Words Essay on Evolution of Computers

The Dawn of Computing

The evolution of computers has been an intriguing journey, intertwined with human ingenuity and innovation. The earliest computing device, the abacus, was invented around 2400 BC as a simple manual tool used for calculations. Fast forward to the 19th century: the concept of a programmable computer was introduced by Charles Babbage, who designed the Analytical Engine, a mechanical general-purpose computer.

The Birth of the Modern Computer

The 20th century witnessed the birth of the modern computer. The first generation (1940–1956) was characterized by vacuum tubes and magnetic drums for data storage. Computers like the ENIAC, the UNIVAC, and early IBM machines were monumental in this era. However, they were enormous, consumed large amounts of electricity, and had limited processing power.

Transistors and Integrated Circuits: The Game Changers

Microprocessors and Personal Computers

The fourth generation (1971-Present) brought about the microprocessor, a single chip containing all the elements of a CPU. This led to the advent of personal computers. The IBM PC and Apple Macintosh, introduced in the 1980s, revolutionized the way people interact with computers, making them accessible to the general public.

The Internet and the Digital Age

The rise of the Internet in the 1990s and early 2000s marked a significant milestone in computer evolution. It connected computers globally, enabling information sharing and communication on an unprecedented scale. It also paved the way for the development of sophisticated software applications, online services, and cloud computing.

Present and Future: Towards Quantum Computing

The evolution of computers is a testament to human innovation, transforming from simple calculating devices to powerful tools that shape our society. As we stand on the brink of a new era in computing, one can only wonder what the future holds.

That’s it! I hope the essay helped you.

Happy studying!


When we study the many aspects of computing and computers, it is important to know about the history of computers. It helps us understand the growth and progress of technology through the times. It is also an important topic for competitive and banking exams.


What is a Computer?

A computer is an electronic machine that collects information, stores it, processes it according to user instructions, and then returns the result.

A computer is a programmable electronic device that performs arithmetic and logical operations automatically using a set of instructions provided by the user.

Early Computing Devices

People used sticks, stones, and bones as counting tools before computers were invented. More computing devices were produced as technology advanced and the human intellect improved over time. Let us look at a few of the early-age computing devices used by mankind.

  • Abacus

The abacus was invented by the Chinese around 4000 years ago. It’s a wooden rack with metal rods with beads attached to them. The abacus operator moves the beads according to certain guidelines to complete arithmetic computations.

  • Napier’s Bone

John Napier devised Napier’s Bones, a manually operated calculating apparatus. For calculating, this instrument used 9 separate ivory strips (bones) marked with numerals to multiply and divide. It was also the first machine to calculate using the decimal point system.

  • Pascaline

The Pascaline was invented in 1642 by Blaise Pascal, a French mathematician and philosopher. It is thought to be the first mechanical and automated calculator. It was a wooden box with gears and wheels inside.

  • Stepped Reckoner or Leibniz wheel

In 1673, a German mathematician-philosopher named Gottfried Wilhelm Leibniz improved on Pascal’s invention to create this apparatus. It was a digital mechanical calculator known as the stepped reckoner because it used fluted drums instead of gears.

  • Difference Engine

In the early 1820s, Charles Babbage created the Difference Engine. It was a mechanical computer that could do basic computations. It was a steam-powered calculating machine used to solve numerical tables such as logarithmic tables.

  • Analytical Engine 

Charles Babbage created another calculating machine, the Analytical Engine, in the 1830s. It was a mechanical computer that took input from punched cards. It was capable of solving any mathematical problem and storing data in its own memory.

  • Tabulating machine 

An American Statistician – Herman Hollerith invented this machine in the year 1890. Tabulating Machine was a punch card-based mechanical tabulator. It could compute statistics and record or sort data or information. Hollerith began manufacturing these machines in his company, which ultimately became International Business Machines (IBM) in 1924.

  • Differential Analyzer 

Vannevar Bush introduced the first electrical computer, the Differential Analyzer, in 1930. This machine was made up of vacuum tubes that switched electrical impulses in order to perform calculations. It was capable of performing 25 calculations in a matter of minutes.

  • Mark I

Howard Aiken planned to build a machine in 1937 that could perform massive calculations involving enormous numbers. The Mark I computer was constructed in 1944 as a collaboration between IBM and Harvard.

History of Computers Generation

The word ‘computer’ has a very interesting origin. It was first used in the 16th century for a person who used to compute, i.e. do calculations. The word was used in the same sense as a noun until the 20th century. Women were hired as human computers to carry out all forms of calculations and computations.

By the last part of the 19th century, the word was also used to describe machines that did calculations. The modern-day use of the word is generally to describe programmable digital devices that run on electricity.

Early History of Computer

Devices have been used for calculation for thousands of years. One of the earliest and most well-known was the abacus. Then in 1822, the father of computers, Charles Babbage, began developing what would be the first mechanical computer, and in 1833 he designed the Analytical Engine, a general-purpose computer. It contained an ALU, some basic flowchart principles, and the concept of integrated memory.

Then more than a century later in the history of computers, we got our first electronic computer for general purposes. It was the ENIAC, which stands for Electronic Numerical Integrator and Computer. The inventors of this computer were John W. Mauchly and J. Presper Eckert.

As time passed, the technology developed, computers got smaller, and processing got faster. We got our first laptop in 1981; it was introduced by Adam Osborne and EPSON.


In the history of computers, we often refer to the advancements of modern computers as the generations of computers. We are currently on the fifth generation of computers. So let us look at the important features of these five generations of computers.

  • 1st Generation: This was from the period of 1940 to 1955. This was when machine language was developed for the use of computers. They used vacuum tubes for the circuitry. For the purpose of memory, they used magnetic drums. These machines were complicated, large, and expensive. They were mostly reliant on batch operating systems and punch cards. As output and input devices, magnetic tape and paper tape were implemented. For example, ENIAC, UNIVAC-1, EDVAC, and so on.
  • 2nd Generation: The years 1957–1963 are referred to as the second generation of computers. In second-generation computers, assembly language was employed, and high-level programming languages such as COBOL and FORTRAN were developed. Computers advanced from vacuum tubes to transistors, which made them smaller, faster and more energy-efficient, and from binary machine code to assembly languages. For instance, IBM 1620, IBM 7094, CDC 1604, CDC 3600, and so forth.
  • 3rd Generation: The hallmark of this period (1964–1971) was the development of the integrated circuit. A single integrated circuit (IC) is made up of many transistors, which increases the power of a computer while simultaneously lowering its cost. These computers were quicker, smaller, more reliable, and less expensive than their predecessors. High-level programming languages such as FORTRAN II to IV, COBOL, PASCAL and PL/1 were utilized. For example, the IBM 360 series, the Honeywell 6000 series, and the IBM 370/168.
  • 4th Generation: The invention of the microprocessor brought along the fourth generation of computers. The years 1971–1980 were dominated by fourth generation computers. Languages such as C and C++ were utilized in this generation of computers. For instance, the STAR 1000, PDP 11, CRAY-1, CRAY X-MP, and Apple II. This was when we started producing computers for home use.
  • 5th Generation: These computers have been utilized since 1980 and continue to be used now. This is the present and the future of the computer world. The defining aspect of this generation is artificial intelligence. The use of parallel processing and superconductors is making this a reality and provides a lot of scope for the future. Fifth-generation computers use ULSI (Ultra Large Scale Integration) technology. These are the most recent and sophisticated computers. Programming languages such as C, C++, Java and .NET are used. For instance, desktops, laptops, notebooks, ultrabooks, and so on.

Brief History of Computers

The naive understanding of computation had to be overcome before the true power of computing could be realized. The inventors who worked tirelessly to bring the computer into the world had to realize that what they were creating was more than just a number cruncher or a calculator. They had to address all of the difficulties associated with inventing such a machine, implementing the design, and actually building the thing. The history of the computer is the history of these difficulties being solved.

19th Century

1801 – Joseph Marie Jacquard, a weaver and businessman from France, devised a loom that employed punched wooden cards to automatically weave cloth designs.

1822 – Charles Babbage, a mathematician, designed a steam-powered calculating machine capable of computing tables of numbers. The “Difference Engine” project failed owing to the limits of the technology of the time.

1843 – The world’s first computer program was published by Ada Lovelace, an English mathematician. Lovelace also included a step-by-step description of how to compute Bernoulli numbers using Babbage’s machine.
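Her published notes amount to an algorithm, and the same numbers fall out of the Bernoulli recurrence today (a modern Python sketch using exact fractions, not a reconstruction of her actual table of operations):

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return B_0 .. B_n via the recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0."""
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, k) * B[k] for k in range(m))
        B[m] = -acc / (m + 1)        # solve the recurrence for B_m
    return B

print([str(b) for b in bernoulli(6)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42']
```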

1890 – Herman Hollerith, an inventor, creates the punched card technique used to tabulate the 1890 U.S. census. He would go on to start the corporation that would become IBM.

Early 20th Century

1930 – Vannevar Bush invented and built the Differential Analyzer, the first large-scale automatic general-purpose mechanical analogue computer.

1936 – Alan Turing had an idea for a universal machine, which he called the Turing machine, that could compute anything that could be computed.

1939 – Hewlett-Packard was founded in a garage in Palo Alto, California, by Bill Hewlett and David Packard.

1941 – Konrad Zuse, a German inventor and engineer, completed his Z3 machine, the world’s first digital computer. However, the machine was destroyed during a World War II bombing strike on Berlin.

1941 – J.V. Atanasoff and graduate student Clifford Berry devise a computer capable of solving 29 equations simultaneously. It was the first computer able to store data in its main memory.

1945 – University of Pennsylvania academics John Mauchly and J. Presper Eckert create the Electronic Numerical Integrator and Computer (ENIAC). It was Turing-complete and capable of solving “a vast class of numerical problems” by reprogramming, earning it the title of “Grandfather of computers.”

1946 – Mauchly and Eckert begin work on the UNIVAC I (Universal Automatic Computer), the first general-purpose electronic digital computer designed in the United States for commercial applications; it was delivered in 1951.

1949 – The Electronic Delay Storage Automatic Calculator (EDSAC), developed by a team at the University of Cambridge, is the “first practical stored-program computer.”

1950 – The Standards Eastern Automatic Computer (SEAC) was built in Washington, DC, and it was the first stored-program computer completed in the United States.

Late 20th Century

1953 – Grace Hopper, a computer scientist, develops the first English-like data processing language, work that later became the basis for COBOL (COmmon Business-Oriented Language). It allowed a computer user to give the computer instructions in English-like words rather than numbers.

1954 – John Backus and a team of IBM programmers created the FORTRAN programming language, an acronym for FORmula TRANslation. In addition, IBM developed the 650.

1958 – The integrated circuit, sometimes known as the computer chip, was created independently by Jack Kilby and Robert Noyce.

1962 – The Atlas computer makes its appearance. It was the fastest computer in the world at the time, and it pioneered the concept of “virtual memory.”

1964 – Douglas Engelbart proposes a modern computer prototype that combines a mouse and a graphical user interface (GUI).

1969 – Bell Labs developers, led by Ken Thompson and Dennis Ritchie, revealed UNIX, an operating system developed in the C programming language that addressed program compatibility difficulties.

1970 – The Intel 1103, the first Dynamic Random-Access Memory (DRAM) chip, is unveiled by Intel.

1971 – The floppy disc was invented by Alan Shugart and a team of IBM engineers. In the same year, Xerox developed the first laser printer, which not only generated billions of dollars in revenue but also heralded the beginning of a new age in computer printing.

1973 – Robert Metcalfe, a member of Xerox’s research department, created Ethernet, which is used to connect many computers and other gear.

1974 – Personal computers were introduced to the market. Early models included the Scelbi, the Mark-8, the Altair, the IBM 5100, and Radio Shack’s TRS-80.

1975 – Popular Electronics magazine touted the Altair 8800 as the world’s first minicomputer kit in January. Paul Allen and Bill Gates offer to build software in the BASIC language for the Altair.

1976 – Apple Computer is founded by Steve Jobs and Steve Wozniak, who introduce the world to the Apple I, the first computer with a single-circuit board.

1977 – At the first West Coast Computer Faire, Jobs and Wozniak announce the Apple II. It has colour graphics and an audio cassette drive for storing data.

1978 – The first computerized spreadsheet program, VisiCalc, is introduced.

1979 – WordStar, a word processing tool from MicroPro International, is released.

1981 – IBM unveils the Acorn, its first personal computer, which has an Intel CPU, two floppy drives, and a colour display. It runs Microsoft’s MS-DOS operating system.

1983 – The CD-ROM, which could carry 550 megabytes of pre-recorded data, hit the market. This year also saw the release of the Gavilan SC, the first portable computer with a flip-form design and the first to be offered as a “laptop.”

1984 – Apple launched the Macintosh with a commercial during Super Bowl XVIII. It was priced at $2,500.

1985 – Microsoft introduces Windows, which enables multitasking via a graphical user interface. In addition, the programming language C++ was released.

1990 – Tim Berners-Lee, an English programmer and scientist, creates HyperText Markup Language, widely known as HTML, and coins the term “WorldWideWeb.” His system includes the first browser, a server, HTML, and URLs.

1993 – The Pentium CPU improves the handling of graphics and music on personal computers.

1995 – Microsoft’s Windows 95 operating system was released, backed by a $300 million promotional campaign. Sun Microsystems introduced Java 1.0, followed by Netscape Communications’ JavaScript.

1996 – At Stanford University, Sergey Brin and Larry Page created the Google search engine.

1998 – Apple introduces the iMac, an all-in-one Macintosh desktop computer. These PCs cost $1,300 and came with a 4GB hard drive, 32MB RAM, a CD-ROM, and a 15-inch monitor.

1999 – Wi-Fi, a name popularly said to stand for “wireless fidelity,” is introduced, initially covering distances of up to 300 feet.

21st Century

2000 – The USB flash drive is introduced. Flash drives were faster and offered more storage space than other portable storage options of the time.

2001 – Apple releases Mac OS X, later renamed OS X and eventually simply macOS, as the successor to its conventional Mac Operating System.

2003 – Customers could purchase AMD’s Athlon 64, the first 64-bit CPU for consumer computers.

2004 – Facebook began as a social networking website.

2005 – Google acquires Android, a mobile phone OS based on Linux.

2006 – Apple’s MacBook Pro becomes available. The Pro was the company’s first dual-core, Intel-based mobile computer.

In the same year, Amazon Web Services launched, including Amazon Elastic Compute Cloud (EC2) and Amazon Simple Storage Service (S3).

2007 – Apple produces the first iPhone, bringing many computer functions into the palm of the hand. Amazon also releases the Kindle, one of the first electronic readers, the same year.

2009 – Microsoft released Windows 7.

2011 – Google introduces the Chromebook, which runs Google Chrome OS.

2014 – The University of Michigan Micro Mote (M3), the world’s smallest computer, was constructed.

2015 – Apple introduces the Apple Watch. Windows 10 was also released by Microsoft.

2016 – The world’s first reprogrammable quantum computer is built.

Types of Computers

  • Analog Computers –  Analog computers were built from components such as gears and levers, with no electrical parts. One advantage of analog computation is that designing and building an analog computer to tackle a specific problem can be quite straightforward.
  • Mainframe computers –  A mainframe is a computer generally used by large enterprises for mission-critical activities such as massive data processing. Mainframe computers were distinguished by massive storage capacities, fast components, and powerful computational capabilities. Because they were complicated systems, they were managed by a team of systems programmers who had sole access to the computer. Machines of this class are now referred to as servers rather than mainframes.
  • Supercomputers –  The most powerful computers to date are commonly referred to as supercomputers. Supercomputers are enormous systems purpose-built to solve complicated scientific and industrial problems: quantum mechanics, weather forecasting, oil and gas exploration, molecular modelling, physical simulations, aerodynamics, nuclear fusion research, and cryptanalysis are all run on supercomputers.
  • Minicomputers –  A minicomputer has many of the same features and capabilities as a larger computer but is smaller in size. Minicomputers, which were relatively small and affordable, were often employed in a single department of an organization and were often dedicated to a specific task or shared by a small group.
  • Microcomputers –  A microcomputer is a small computer based on a microprocessor integrated circuit, commonly known as a chip. A microcomputer incorporates, at a minimum, a microprocessor, program memory, data memory, and an input-output (I/O) system. A microcomputer is now commonly referred to as a personal computer (PC).
  • Embedded processors –  These are miniature computers that control electrical and mechanical processes with basic microprocessors. Embedded processors are often simple in design, have limited processing and I/O capability, and need little power. Ordinary microprocessors and microcontrollers are the two primary types. Embedded processors are employed in systems that do not require the computing power of traditional devices such as desktop computers, laptop computers, or workstations; a minimal control-loop sketch follows this list.
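Most embedded work boils down to a sense-decide-actuate loop. The Python sketch below is a hypothetical thermostat loop with simulated I/O; the temperature source, setpoint, and heater call are invented stand-ins for the hardware registers a real microcontroller would read and write.

```python
# Hypothetical embedded-style control loop: read a sensor, decide, actuate.
# The sensor and actuator are simulated; a real microcontroller would
# read and write hardware registers instead.
import random
import time

SETPOINT = 21.0  # target temperature in degrees C (arbitrary choice)

def read_temperature():
    return 18.0 + random.random() * 6.0  # simulated temperature sensor

def set_heater(on):
    print("heater", "ON" if on else "OFF")  # simulated actuator

for _ in range(5):              # a real device would loop forever
    set_heater(read_temperature() < SETPOINT)
    time.sleep(0.1)             # simple fixed control period
```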

FAQs on History of Computers

Q: The principle of modern computers was proposed by ____

  • Adam Osborne
  • Alan Turing
  • Charles Babbage

Ans: Alan Turing, who set out the principle of the modern computer in his 1936 paper on computable numbers.

Q: Who introduced the first computer for home use in 1981?

Ans: IBM. IBM made the first home-use personal computer.

Q: Which programming languages did third-generation computers use?

Ans: High-level languages such as COBOL, FORTRAN, BASIC, and Pascal.


The Evolution Of Computer | Generations of Computer

The development of computers has been a remarkable journey spanning several centuries, defined by a series of inventions and advancements by our greatest scientists. Because of their work, we now use the latest technology in our computer systems.

Now we have laptops, desktop computers, notebooks, and more, which make our lives easier and, most importantly, let us communicate with the world from anywhere.

So, in today’s blog, I invite you to explore with me the journey of computers that our scientists have made.

Note: If you haven’t read our History of Computers blog, read that first and then come back here.

Let’s look at the evolution of computers, generation by generation.

COMPUTER GENERATIONS

Computer generations are essential to understanding the evolution of computing technology. They divide computer history into periods marked by substantial advancements in hardware, software, and computing capabilities. The first period began around 1940 with the first generation of computers. Let us see…

Generations of computer

Computers are classified into five generations:

  • First Generation Computer (1940-1956)
  • Second Generation Computer (1956-1963)
  • Third Generation Computer(1964-1971)
  • Fourth Generation Computer(1971-Present)
  • Fifth Generation Computer(Present and Beyond)
Computer Generations | Periods | Based on
First generation of computer | 1940-1956 | Vacuum tubes
Second generation of computer | 1956-1963 | Transistors
Third generation of computer | 1964-1971 | Integrated Circuits (ICs)
Fourth generation of computer | 1971-present | Microprocessors
Fifth generation of computer | Present and beyond | AI (Artificial Intelligence)

1. FIRST GENERATION COMPUTER: Vacuum Tubes (1940-1956)

The first generation of computers is characterized by the use of vacuum tubes. The vacuum tube was developed in 1904 by the British engineer John Ambrose Fleming. A vacuum tube is an electronic device used to control the flow of electric current in a vacuum, and it was also used in CRT (cathode-ray tube) TVs, radios, and other equipment.

The first general-purpose programmable electronic computer was the ENIAC (Electronic Numerical Integrator and Computer), completed in 1945 and introduced to the public on February 14, 1946. It was built by two American engineers, J. Presper Eckert and John W. Mauchly, at the University of Pennsylvania.

The ENIAC filled a room of roughly 30 by 50 feet, weighed 30 tons, and contained 18,000 vacuum tubes, 70,000 resistors, and 10,000 capacitors. It required 150,000 watts of electricity, which made it very expensive to run.

Later, Eckert and Mauchly developed the first commercially successful computer, the UNIVAC (Universal Automatic Computer), delivered in 1951.

Examples are the ENIAC (Electronic Numerical Integrator and Computer), the EDVAC (Electronic Discrete Variable Automatic Computer), and the UNIVAC-1 (Universal Automatic Computer-1).

  • These computers were designed using vacuum tubes.
  • These computers had a simple architecture.
  • These computers could calculate data in milliseconds.
  • These computers were used for scientific purposes.

DISADVANTAGES

  • These computers were very costly.
  • They were very large.
  • They took up a lot of space and consumed a great deal of electricity.
  • Their speed was very slow.
  • They were not suited to commercial use.
  • They generated a great deal of heat.
  • Cooling was needed to operate these computers because they heated up very quickly.

2. SECOND GENERATION COMPUTER: Transistors (1956-1963)

The second generation of computers is characterized by the use of transistors, developed in 1947 by three American physicists: John Bardeen, Walter Brattain, and William Shockley.

A transistor is a semiconductor device used to amplify or switch electronic signals, opening or closing a circuit. It was invented at Bell Labs, and transistors became the key ingredient of all digital circuits, including computers.

The invention of transistors replaced the bulky electric tubes from the first generation of computers.

Transistors perform the same function as vacuum tubes, except that electrons move through semiconductor material rather than through a vacuum. Transistors are made of semiconducting materials, and they control the flow of electricity.
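Because a transistor acts as a tiny on/off switch, circuits of transistors implement logic gates, and gates compose into arithmetic. The following Python sketch is purely illustrative, modelling gates as Boolean functions and wiring them into a one-bit full adder:

```python
# Sketch: treating switches (transistors) as Boolean functions and
# composing them into a one-bit full adder. Purely illustrative.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    s = XOR(XOR(a, b), carry_in)                        # sum bit
    carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))  # carry bit
    return s, carry_out

print(full_adder(1, 1, 0))  # (0, 1): 1 + 1 = binary 10
```

Chaining such adders bit by bit is, in essence, how a processor’s arithmetic unit is built from switching elements.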

Second-generation computers were smaller, faster, and less expensive than first-generation computers. They also supported high-level programming languages, including FORTRAN (1956), ALGOL (1958), and COBOL (1959).

Examples are the PDP-8 (Programmed Data Processor-8), the IBM 1400 series, the IBM 7090 series, and the CDC 3600 (Control Data Corporation 3600 series).

ADVANTAGES:

  • Smaller in size compared with first-generation computers
  • Used less electricity
  • Did not heat up as much as first-generation computers
  • Offered better speed

DISADVANTAGES:

  • Still costly and not versatile
  • Still too expensive for widespread commercial use
  • Cooling was still needed
  • Punch cards were used for input
  • Each computer was dedicated to a particular purpose

3. THIRD GENERATION COMPUTER: Integrated Circuits (1964-1971)

The third generation of computers is characterized by the use of integrated circuits, developed in 1958 by two American engineers, Jack Kilby and Robert Noyce. An integrated circuit is a set of electronic circuits on a small flat piece of semiconductor material, normally silicon. Transistors were miniaturized and placed on silicon chips, which drastically increased the efficiency and speed of computers.

These ICs (integrated circuits) are popularly known as chips. A single IC contains many transistors, resistors, and capacitors built on a single slice of silicon.

This development made computers smaller, cheaper, and more powerful, with larger memory and faster processing. These computers were fast, efficient, and reliable.

Higher-level languages such as Pascal, PL/I, FORTRAN (versions II to IV), COBOL, ALGOL-68, and BASIC (Beginner’s All-purpose Symbolic Instruction Code) were developed during this period.

Examples are the NCR 395 (National Cash Register), the IBM 360 and 370 series, and the Burroughs B6500.

ADVANTAGES

  • These computers were smaller in size compared with previous generations
  • They consumed less energy and were more reliable
  • They were more versatile
  • They produced less heat compared with previous generations
  • They were used for commercial as well as general-purpose work
  • They used a fan for heat discharge to prevent damage
  • This generation increased the storage capacity of computers

DISADVANTAGES

  • A cooling system was still needed
  • They were still very costly
  • Sophisticated technology was required to manufacture integrated circuits
  • The IC chips were not easy to maintain
  • Performance degraded when large applications were run

4. FOURTH GENERATION OF COMPUTER: Microprocessor (1971-Present)

The fourth generation of computers is characterized by the use of the microprocessor, invented in 1971 by four engineers: Marcian Hoff, Masatoshi Shima, Federico Faggin, and Stanley Mazor. The first microprocessor was Intel’s 4004 CPU.

A microprocessor contains all the circuits required to perform arithmetic, logic, and control functions on a single chip. Because of microprocessors, fourth-generation computers pack more data-processing capacity than equivalent-sized third-generation computers. The microprocessor made it possible to place the entire CPU (central processing unit) on a single chip, and computers built this way are known as microcomputers. The personal computer is a fourth-generation computer, and this is also the period when computer networks evolved.
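As a purely illustrative sketch of what “arithmetic, logic, and control on a single chip” means, here is a toy fetch-decode-execute loop in Python; the three-instruction machine is invented for the example and corresponds to no real processor.

```python
# Toy CPU sketch: fetch-decode-execute over an invented 3-instruction set.
# Instructions: ("LOAD", value), ("ADD", value), ("HALT",)
program = [("LOAD", 5), ("ADD", 3), ("ADD", 2), ("HALT",)]

accumulator = 0
pc = 0  # program counter: the "control" part of the processor

while True:
    op, *args = program[pc]      # fetch and decode the next instruction
    pc += 1
    if op == "LOAD":             # execute: move a value into the accumulator
        accumulator = args[0]
    elif op == "ADD":            # execute: arithmetic in the ALU
        accumulator += args[0]
    elif op == "HALT":
        break

print(accumulator)  # 10
```

A real microprocessor does exactly this cycle billions of times per second, with the program held in memory rather than a Python list.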

Examples are the Apple II and the Altair 8800.

ADVANTAGES

  • These computers are smaller and much more reliable than earlier generations
  • Heating issues are almost negligible
  • No air conditioning is required for a fourth-generation computer
  • All types of high-level languages can be used on these computers
  • They serve general-purpose use
  • They are less expensive and portable

DISADVANTAGES

  • Fans are required to operate these kinds of computers
  • The latest technology is needed to make microprocessors and complex software
  • These computers are highly sophisticated
  • Advanced technology is also required to make the ICs (integrated circuits)

5. FIFTH GENERATION OF COMPUTERS (Present and beyond)

This generation of computers is based on AI (artificial intelligence) technology. Artificial intelligence is the branch of computer science concerned with making computers behave like humans and allowing them to make their own decisions. Currently, no computer exhibits full artificial intelligence (that is, none can fully simulate human behaviour).

In the fifth generation of computers, VLSI (Very Large Scale Integration) and ULSI (Ultra Large Scale Integration) technology is used, and the speed of these computers is extremely high. This generation introduced machines with hundreds of processors that could all work on different parts of a single program. The development of still more powerful computers is in progress, and it has been predicted that such computers will be able to communicate with their users in natural spoken language.
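The sketch below hints at that style of parallelism using Python’s multiprocessing module, splitting one sum across four worker processes; the workload and the number of processes are arbitrary choices for the example.

```python
# Sketch: splitting one computation across several processes,
# in the spirit of parallel machines. Illustrative only.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    chunks = [(0, 250_000), (250_000, 500_000),
              (500_000, 750_000), (750_000, 1_000_000)]
    with Pool(processes=4) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # same answer as sum(range(1_000_000)), computed in parallel
```

Each worker computes its chunk independently, which is the essence of dividing a single program across many processors.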

Computers in this generation also use high-level languages such as C, C++, and Java.

Examples are desktop computers, laptops, notebooks, and MacBooks, the computers we all use today.

ADVANTAGES

  • These computers are smaller in size and more compatible
  • These computers are much cheaper
  • They are used for general-purpose work
  • They use more advanced technology
  • They show progress toward true artificial intelligence
  • They advance parallel processing and superconductor technology

DISADVANTAGES

  • They tend to be sophisticated and complex tools
  • They push the limits of transistor density

Frequently Asked Questions

How many computer generations are there?

There are five main generations:

  • First Generation Computer (1940-1956)
  • Second Generation Computer (1956-1963)
  • Third Generation Computer (1964-1971)
  • Fourth Generation Computer (1971-Present)
  • Fifth Generation Computer (Present and Beyond)

Which things were invented in the first generation of computers?

Vacuum Tubes

What is the fifth generation of computers?

The fifth generation of computers is based entirely on artificial intelligence. It is predicted that these computers will be able to communicate with their users in natural spoken language.

What is the latest computer generation?

The latest generation of computers is the fifth, which is based entirely on artificial intelligence.

Who is the inventor of the Integrated Circuit?

“Robert Noyce” and “Jack Kilby”

What is the full form of ENIAC?

ENIAC stands for “Electronic Numerical Integrator and Computer”.


