Generations of Computers

It is important to realize that major changes and trends in computer systems have occurred during the major stages, or generations, of computing, and they will continue into the future. The first generation of computers developed in the early 1950s, the second generation blossomed during the late 1950s and early 1960s, the third generation took computing into the 1970s, and the fourth generation has been the computer technology of the 1980s and 1990s. A fifth generation of computers that accelerates the trends of the previous generations is expected to evolve as we enter the 21st century. Notice that computers continue to become smaller, faster, more reliable, less costly to purchase and maintain, and more interconnected within computer networks.

First-generation computing involved massive computers using hundreds or thousands of vacuum tubes for their processing and memory circuitry. These large computers generated enormous amounts of heat; their vacuum tubes had to be replaced frequently. Thus, they had large electrical power, air conditioning, and maintenance requirements. First-generation computers had main memories of only a few thousand characters and millisecond processing speeds. They used magnetic drums or tape for secondary storage and punched cards or paper tape as input and output media.

Second-generation computing used transistors and other solid-state, semiconductor devices that were wired to circuit boards in the computers. Transistorized circuits were much smaller and much more reliable, generated little heat, were less expensive, and required less power than vacuum tubes. Tiny magnetic cores were used for the computer's memory, or internal storage. Many second-generation computers had main memory capacities of less than 100 kilobytes and microsecond processing speeds. Removable magnetic disk packs were introduced, and magnetic tape emerged as the major input, output, and secondary storage medium for large computer installations.

Third-generation computing saw the development of computers that used integrated circuits, in which thousands of transistors and other circuit elements are etched on tiny chips of silicon. Main memory capacities increased to several megabytes and processing speeds jumped to millions of instructions per second (MIPS) as telecommunications capabilities became common. This made it possible for operating system programs to come into widespread use; these programs automated and supervised the activities of many types of peripheral devices and allowed mainframe computers to process several programs at the same time, frequently involving networks of users at remote terminals. Integrated circuit technology also made possible the development and widespread use of small computers called minicomputers in the third computer generation.

Fourth-generation computing relies on the use of LSI (large-scale integration) and VLSI (very-large-scale integration) technologies that cram hundreds of thousands or millions of transistors and other circuit elements on each chip. This enabled the development of microprocessors, in which all of the circuits of a CPU are contained on a single chip with processing speeds of millions of instructions per second. Main memory capacities ranging from a few megabytes to several gigabytes are achieved with semiconductor memory chips, which replaced magnetic core memories. Microcomputers, which use microprocessor CPUs along with a variety of peripheral devices and easy-to-use software packages to form small personal computer (PC) systems or client/server networks of linked PCs and servers, are a hallmark of the fourth generation of computing, which accelerated the downsizing of computing systems.

Whether we are moving into a fifth generation of computing is a subject of debate, since the concept of generations may no longer fit the continual, rapid changes occurring in computer hardware, software, data, and networking technologies. But in any case, we can be sure that progress in computing will continue to accelerate, and that the development of Internet-based technologies and applications will be one of the major forces driving computing into the 21st century.
