Computer Science, asked by sizzlindazz549, 10 months ago

What is the evolution of computers?

Answers

Answered by Anonymous

Computers in the form of personal desktop computers, laptops and tablets have become such an important part of everyday living that it can be difficult to remember a time when they did not exist. In reality, computers as they are known and used today are still relatively new. Although computing devices have technically been in use since the abacus appeared roughly 5,000 years ago, it is modern computers that have had the most profound effect on society. The first full-sized digital computer, the Mark I, was developed in 1944; it was used only for calculations and weighed five tons. Despite its size and limited ability, it was the first of many machines that would set off successive generations of computer development and growth.


First Generation Computers

First-generation computers bore little resemblance to today's computers in either appearance or performance. The first generation spanned roughly 1940 to 1956, and the machines were extremely large. Their inner workings were unsophisticated: they relied on magnetic drums for memory and on vacuum tubes that acted as switches and amplifiers. The vacuum tubes were chiefly responsible for the machines' enormous size and the massive amounts of heat they released; these computers produced so much heat that they regularly overheated despite large cooling units. First-generation computers were also programmed in a very basic language referred to as machine language.


Second Generation Computers

The second generation of computers (1956 to 1963) did away with vacuum tubes in favor of transistors, which allowed the machines to use less electricity and generate less heat. Second-generation computers were also significantly faster and noticeably smaller than their predecessors. This generation also introduced magnetic core memory, which was used alongside magnetic storage.


Third Generation Computers

From 1964 to 1971, computers went through a significant change in speed, courtesy of integrated circuits. An integrated circuit, or semiconductor chip, packed large numbers of miniature transistors onto a single silicon chip. This not only increased the speed of computers but also made them smaller, more powerful and less expensive. In addition, keyboards and monitors replaced the punch cards and printouts of earlier systems, allowing people to interact with the machines directly.


Fourth Generation Computers

The changes with the greatest impact occurred between 1971 and 2010. During this time, technology developed to the point where manufacturers could place millions of transistors on a single chip, an approach known as monolithic integrated circuit technology. This era also produced the Intel 4004, which in 1971 became the first commercially available microprocessor, an invention that marked the dawn of the personal computer industry. By the mid-1970s, personal computers such as the Altair 8800 became available to the public as kits that required assembly. By the late 1970s and early 1980s, fully assembled home computers such as the Commodore PET, the Apple II and the first IBM PC were making their way onto the market. Personal computers and their ability to form networks eventually helped bring the Internet into widespread public use in the early 1990s. The fourth generation also saw the creation of even smaller computers, including laptops and hand-held devices. The graphical user interface, or GUI, was developed during this period, and computer memory and storage went through major improvements in both capacity and speed.


The Fifth Generation of Computers

In the future, computer users can expect even faster and more advanced computer technology. Fifth-generation computing has yet to be truly defined, as there are numerous paths that technology may take toward the future of computer development. Research is ongoing in fields such as nanotechnology, artificial intelligence and quantum computation.


sizzlindazz549: it's correct yaar Thanks 2 u
Anonymous: Welcm
Answered by SouvikBaidya

A complete history of computing would include a multitude of diverse devices such as the ancient Chinese abacus, the Jacquard loom (1805) and Charles Babbage's "analytical engine" (1834). It would also include discussion of mechanical, analog and digital computing architectures. As late as the 1960s, mechanical devices such as the Marchant calculator still found widespread application in science and engineering. During the early days of electronic computing devices, there was much discussion about the relative merits of analog versus digital computers. In fact, as late as the 1960s, analog computers were routinely used to solve systems of finite difference equations arising in oil reservoir modeling. In the end, digital computing devices proved to have the power, economics and scalability necessary to deal with large-scale computations. Digital computers now dominate the computing world in all areas, from the hand calculator to the supercomputer, and are pervasive throughout society. Therefore, this brief sketch of the development of scientific computing is limited to digital, electronic computers.

The evolution of digital computing is often divided into generations. Each generation is characterized by dramatic improvements over the previous one in the technology used to build computers, the internal organization of computer systems, and programming languages. Although it is not usually associated with computer generations, there has also been a steady improvement in algorithms, including those used in computational science. The following history is organized using these widely recognized generations as mileposts.
