Computer Science, asked by nazjam752, 2 months ago

Discuss the various generations of computers.

1 page answer please

Answers

Answered by havellshavells

Answer:

Each generation of computers is defined by a significant technological development that fundamentally changed how computers operate, leading to more compact, less expensive, but more powerful, efficient and robust machines. The earliest computers, for example, used vacuum tubes for circuitry and magnetic drums for memory.

Answered by Anonymous

Answer:

1940 – 1956: First Generation – Vacuum Tubes

These early computers used vacuum tubes for circuitry and magnetic drums for memory. As a result, they were enormous, literally taking up entire rooms and costing a fortune to run. Vacuum tubes were inefficient components that consumed huge amounts of electricity and generated a great deal of heat, which caused frequent breakdowns.

1956 – 1963: Second Generation – Transistors

The replacement of vacuum tubes by transistors marked the advent of the second generation of computing. Although first invented in 1947, transistors were not used significantly in computers until the end of the 1950s. They were hugely superior to vacuum tubes, making computers smaller, faster, cheaper and far less demanding of electricity, although they still subjected computers to damaging levels of heat.

1964 – 1971: Third Generation – Integrated Circuits

By this phase, transistors were being miniaturised and placed on silicon chips (semiconductors), forming integrated circuits. This led to a massive increase in the speed and efficiency of these machines. These were the first computers with which users interacted through keyboards and monitors connected to an operating system, a significant leap from punched cards and printouts.

1972 – 2010: Fourth Generation – Microprocessors

This revolution can be summed up in one word: Intel. The chip maker developed the Intel 4004 chip in 1971, which placed all of a computer's components (CPU, memory, input/output controls) onto a single chip. What had filled a room in the 1940s now fit in the palm of the hand. The Intel chip housed thousands of integrated circuits. In 1981 IBM introduced the first computer designed specifically for home use, and in 1984 Apple introduced the Macintosh.

2010 – present: Fifth Generation – Artificial Intelligence

Computer devices with artificial intelligence are still in development, but some of these technologies, such as voice recognition, are beginning to emerge and come into use.

AI has been made possible by parallel processing and superconductors. Looking to the future, computers will be radically transformed again by quantum computation, molecular computing and nanotechnology.
