Laila Alahaideb

Across 4 generations

Updated: Oct 26, 2022

From vacuum tubes to quantum

Over the decades, digital computers have all worked with bits, values of either 0 or 1, and they share one thing in common: Boolean logic. Going back even before the von Neumann architecture, the first generation opened with a digital machine built on logic, the ABC computer.

First generation:

ABC computer

In 1937, Atanasoff was preoccupied with the problem of automatic computation, and to solve it he envisioned a machine that would:

use condensers (capacitors) for memory, with the binary number system and a regenerative mechanism to prevent lapses resulting from power loss, and calculate by direct logical action rather than the enumeration procedures used in analog calculators.

Professor John Vincent Atanasoff and graduate student Cliff Berry developed the Atanasoff-Berry Computer, known as the ABC, at Iowa State College (now Iowa State University), continuing to work on it until 1942. The ABC was an electronic computer without a CPU that performed digital calculations, including binary arithmetic and Boolean logic, using more than 300 vacuum tubes.
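Atanasoff's "direct logical action," computing with Boolean operations rather than counting, can be illustrated with a half adder, which adds two bits using only logic gates. This is an illustrative sketch in Python, not the ABC's actual circuitry:

```python
def half_adder(a, b):
    """Add two bits using only Boolean logic:
    XOR yields the sum bit, AND yields the carry bit."""
    return a ^ b, a & b  # (sum, carry)

# 1 + 1 in binary: sum bit 0 with a carry of 1, i.e. binary 10 (decimal 2)
print(half_adder(1, 1))  # (0, 1)
```

Chaining such adders bit by bit is enough to add whole binary numbers, which is exactly the kind of calculation the ABC performed with vacuum-tube logic.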

ENIAC Computer

The first fully functional machine followed: beginning in 1943, J. Presper Eckert and John Mauchly built the Electronic Numerical Integrator and Computer (ENIAC) at the University of Pennsylvania. It employed roughly 18,000 vacuum tubes, occupied about 1,800 square feet, weighed close to 50 tons, and took three years to construct.

Von Neumann architecture

In 1945, John von Neumann published the von Neumann architecture. The design consists of an Arithmetic and Logic Unit (ALU), a Memory Unit, a Control Unit, Registers, and Input/Output.

The stored-program computer concept, in which program data and instruction data are both kept in the same memory, is the foundation of Von Neumann's architecture. The vast majority of modern computers still employ this design.
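The stored-program idea, with instructions and data sharing one memory while a control unit fetches and executes them in turn, can be sketched as a toy interpreter. The instruction names and memory layout here are invented for illustration:

```python
def run(memory):
    """Toy von Neumann machine: the program and its data live in the
    same list; the control unit fetches, decodes, and executes in a loop."""
    acc, pc = 0, 0                             # accumulator register, program counter
    while True:
        op, arg = memory[pc], memory[pc + 1]   # fetch instruction and operand
        pc += 2
        if op == "LOAD":
            acc = memory[arg]                  # read data from the shared memory
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc                  # write the result back
        elif op == "HALT":
            return memory

# Cells 0-7 hold the program; cells 8-10 hold data (8: 2, 9: 3, 10: result)
mem = ["LOAD", 8, "ADD", 9, "STORE", 10, "HALT", 0, 2, 3, 0]
print(run(mem)[10])  # 5
```

Because the program is just data in memory, a machine like this can load a new program the same way it loads numbers, which is the key advance over fixed-wiring machines like ENIAC's original configuration.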

In 1947, at Bell Labs, William Shockley, Walter Houser Brattain, and John Bardeen worked together to invent the transistor, a device that amplifies or switches electrical and electronic signals.

Second generation:

TX-0 computer

The TX-0 was the first transistor-based computer, in contrast with the first generation's vacuum-tube machines. During the second generation (1959-1965), transistors were adopted because they were more affordable, used less power, were smaller and more dependable, and were faster. Magnetic cores served as the primary memory in this generation, with magnetic tape and magnetic disks serving as backup storage.

This generation used high-level programming languages such as FORTRAN and COBOL as well as assembly code. The machines employed batch processing and multiprogramming operating systems.


Third generation:

During the third generation (1964-1971), technology shifted from discrete transistors to integrated circuits, often known as ICs. On these silicon chips, sometimes referred to as semiconductors, large numbers of transistors were arranged. Speed and dependability were the hallmarks of this era's computers.

On a single, thin piece of silicon, an integrated circuit (IC) combines many transistors, resistors, and capacitors. During this generation, component size decreased while memory capacity and processing efficiency increased.

During this generation, in 1965, Gordon Moore predicted that the number of transistors on a microchip would double at a regular interval, roughly every two years in its modern formulation (a figure often quoted as 18 months). This observation, known as Moore's Law, implies that over time computers become noticeably faster, smaller, and more efficient.
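The compounding effect of that doubling is easy to underestimate. As a rough arithmetic sketch, assuming a clean two-year doubling period and the Intel 4004's 2,300 transistors as an illustrative 1971 starting point:

```python
def transistors(year, base_year=1971, base_count=2300, period=2):
    """Project transistor counts under an idealized Moore's Law:
    the count doubles once every `period` years after `base_year`."""
    doublings = (year - base_year) // period
    return base_count * 2 ** doublings

# Idealized projection: 5 doublings per decade multiplies the count by 32
for year in (1971, 1981, 1991, 2001):
    print(year, transistors(year))
```

Under these assumptions the count grows 32-fold per decade, from thousands of transistors in the early 1970s to tens of millions by around 2000, which is broadly the trajectory real microprocessors followed.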

Fourth generation:

Very Large Scale Integration (VLSI) circuits

The first large-scale integration (LSI) circuits, single chips known as microprocessors, came into use in 1971. The biggest benefit of this technology is that a single microprocessor can execute all arithmetic, logic, and control operations on one chip.

Computers built around microprocessors were called microcomputers. When LSI proved inadequate, it was superseded by Very Large Scale Integration (VLSI) circuits. The Intel 4004 chip, developed in 1971, enabled a significant reduction in size by consolidating all of a computer's parts, from the central processing unit and memory to the input/output controls, onto a single chip, improving power, size, dependability, and cost. It consequently sparked the personal computer (PC) revolution. Time-sharing, real-time, networked, and distributed operating systems were popular in this generation, as were high-level languages such as C, C++, and dBASE.

Then, in 1998, the first quantum computer that could process data and produce a result was developed by Mark Kubinec of the University of California, Berkeley, Neil Gershenfeld of the Massachusetts Institute of Technology, and Isaac Chuang of Los Alamos National Laboratory.

Across four generations, computing has moved from ten-based (decimal) systems to binary, and now toward quantum bits.
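Where a classical bit is either 0 or 1, a quantum bit holds a superposition of both, described by two complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1. A minimal numerical sketch, using only plain Python rather than any quantum library, applies the standard Hadamard gate to a qubit starting in the definite state 0:

```python
from math import sqrt

def hadamard(amp0, amp1):
    """Apply the Hadamard gate to a qubit's two amplitudes,
    turning a definite state into an equal superposition."""
    s = 1 / sqrt(2)
    return s * (amp0 + amp1), s * (amp0 - amp1)

# Start in state |0> (amplitude 1 on 0, amplitude 0 on 1), then apply H
a0, a1 = hadamard(1.0, 0.0)
p0, p1 = abs(a0) ** 2, abs(a1) ** 2   # measurement probabilities
print(round(p0, 3), round(p1, 3))
```

After the gate, a measurement yields 0 or 1 with equal probability, a behavior no single classical bit can reproduce, and the reason quantum bits mark a genuine break from the Boolean machines of the previous four generations.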


