Question
Computer software is classified into ____ generations.
Answers
2
3
4
5
Answer:
5
Explanation:
Technology keeps growing, and considering how long humans have been around, it has all happened in a remarkably short time. Experts say there are five generations, meaning five time periods during which computer science has taken a big leap in its technological development. The first generation began in the 1940s, and the generations run all the way up to today.
First Generation (1940-1956)
Everything started with vacuum tubes. These were widely used in the first computer systems for circuitry, while magnetic drums were used for memory.
As you’re most likely aware, these first computers were huge and would often take up an entire room. They were also expensive to run, used a lot of electricity, and were limited in what they could do; they certainly couldn’t multitask.
In a sense, these machines were just giant calculators.
Second Generation (1956-1963)
Next came the introduction of transistors, which replaced vacuum tubes. The transistor was invented at Bell Labs in 1947, although transistors weren’t commonly used in computers until the late 1950s.
Not only were transistors smaller, they were also cheaper to build, more energy-efficient, and faster. Their main downside was that they generated a lot of heat, which could damage the computer. Even so, they were a great improvement over their predecessor.
Third Generation (1964-1971)
Third-generation computers were where we saw the introduction of integrated circuits (IC), which are still in use today. These reduced the size of the computer even more than the second generation and, again, sped things up.
The first two generations relied on punch cards and printouts, whereas now, we finally start seeing keyboards and monitors that are interfaced with an operating system. Thanks to these advances and a central program to monitor memory, computer devices could now run multiple applications at once.
Fourth Generation (1971-2010)
In the fourth generation of computers, the invention of the microprocessor (a complete CPU on a single chip) helped shrink computers to the desktop and, later, laptop sizes we still know and use today.
In 1981, IBM introduced its first home computer, and in 1984 the first Apple Macintosh was introduced. Over time these small computers became more powerful and, before long, the Internet was developed.
By this time we have not only monitors and keyboards but also mice and, eventually, handheld devices like cell phones.
Fifth Generation (Present Day)
Although we are still using technology from the fourth generation, we are now entering a new age: the fifth generation.
The biggest development to date is the introduction of artificial intelligence (AI), with features such as Apple’s Siri or Amazon’s Alexa. AI is constantly adapting and, moving forward, is expected to become more tailored toward individual business needs.
The hope, as this generation progresses, is that computers can begin to learn self-organization, which sounds pretty appealing if organization isn’t something that comes naturally to you!