How the Computer Appeared: The History of the Creation of the Computer and Its First Devices

Slide 2

In recent years, computer technology has developed rapidly, and the computer is finding its way into almost every area of our lives. Yet few people know where computer technology came from and who invented it. The purpose of my work is to study the history of one of the most important objects of modern life - the computer.

Slide 3

The word “computer” comes from the English verb “to compute”, that is, “to calculate”. At first, counting was inseparable from bending the fingers; the fingers became the first computing tool. A revolution came with the invention of the abacus. Even if you have not heard this word, you have more than once come across the Russian version of this device - the counting frame (schyoty).

Slide 4

But as society developed, calculations became more complex, and people wanted to entrust them to a machine. Around 1623, the German scientist Wilhelm Schickard invented the first calculating mechanism in history. In 1642, the French scientist Blaise Pascal created a machine that could add and subtract. In 1672, Gottfried Wilhelm Leibniz created a calculating machine that could also multiply and divide.

Slide 5

In the 19th century, the Englishman Charles Babbage developed the design of a machine that can be called the first computer. But he was never able to build it, since no one wanted to finance his project.

Slide 6

In 1944, the Mark-1 machine was built at IBM at the request of the US Navy. It was a monster weighing about five tons.

Slide 7

But the Mark-1 did not work fast enough, and in 1946 the first electronic machine, ENIAC, was built. It weighed 30 tons and required 170 m² of floor space. ENIAC contained 18 thousand vacuum tubes, which gave off so much light and heat that insects flying toward them caused malfunctions.

Slide 8

In 1947, American researchers invented the transistor. One transistor replaced 40 vacuum tubes. As a result, speed increased tenfold, while the weight and size of the machines decreased. A new computer era began: second-generation computers appeared.

Slide 9

In 1959, the integrated circuit (chip) was invented. Computer speed again increased tenfold, and the machines became noticeably smaller. The appearance of the chip marked the birth of the third generation of computers. The first kit computer, the Altair-8800, was sold as a case and a set of parts: to work with it, you had to solder and assemble everything yourself and master programming.

Slide 10

In 1976, the American company Apple created its first personal computer, the Apple I. In 1977, the Apple II was released, which already had a keyboard, monitor, sound, and a plastic case.

Slide 11

The first computer to come with a mouse was the Xerox 8010. The device received the name “mouse” because its signal cable, which in early models came out of the back, resembled a mouse's tail.

In the 1940s and 1950s, computers were very large devices: a single computer required a room of impressive size filled with cabinets of electronic equipment. Computers ran on vacuum tubes, which were bulky and expensive. At that time, computers were available only to large companies and institutions.

With the invention of the transistor in 1948 - a miniature electronic device that could replace vacuum tubes - it became possible to make computers smaller. And once very cheap ways of producing transistors were found in the mid-50s, transistor-based computers emerged. They were hundreds of times smaller than tube computers of the same performance. The only part of the computer where transistors could not replace vacuum tubes was the memory units; there, memory on magnetic cores, which had been invented by that time, was used instead of tubes.

In the mid-60s, much more compact external devices for computers appeared, which allowed Digital Equipment to release the first minicomputer, the PDP-8, in 1965.

But producing transistors was a very labor-intensive process: they were manufactured separately, and when circuits were assembled they had to be connected and soldered by hand. In 1958, Jack Kilby worked out how to make several transistors on a single semiconductor wafer. In 1959, Robert Noyce invented a more advanced method that made it possible to create the transistors and all the necessary connections between them on one wafer. Such electronic circuits came to be called integrated circuits, or chips.

The invention of integrated circuits was a major step toward the miniaturization of computers. Subsequently, the number of transistors that could be placed per unit area of an integrated circuit approximately doubled every year. In 1968, Burroughs released the first integrated-circuit computer, and in 1970 Intel began selling memory integrated circuits.

At the same time, another important step was taken on the path to the personal computer: Marcian Edward Hoff designed an integrated circuit similar in its functions to the central processor of a mainframe computer. This is how the first microprocessor, the Intel-4004, appeared; it went on sale at the end of 1971.

Of course, the capabilities of the Intel-4004 were much more modest than those of the central processor of a large computer: it worked much more slowly and could process only 4 bits of information at a time. But microprocessors kept improving: in 1972 Intel released the 8-bit Intel-8008, and in 1974 its improved version, the Intel-8080, which remained the standard for the microcomputer industry until the end of the 70s.

Initially, these microprocessors were used only by electronics hobbyists and in various specialized devices. But in 1974, several companies announced the creation of a computer based on the Intel-8008 microprocessor, i.e. a device performing the same functions as a large computer. At the beginning of 1975, the first commercially distributed personal computer, the Altair-8800, appeared, built on the basis of the Intel-8080 microprocessor.

Despite such shortcomings as a tiny RAM (only 256 bytes) and the lack of a keyboard and screen, its appearance was greeted with great enthusiasm: several thousand kits were sold in the first months. Buyers equipped the machine with additional devices: a monitor for displaying information, a keyboard, memory-expansion units, and so on. Soon these devices began to be produced by other companies as well.

At the end of 1975, Paul Allen and Bill Gates created a BASIC language interpreter for the Altair computer, which allowed users to communicate with the computer and write programs for it with ease. This made the computer much easier to work with and became another milestone in the growing popularity of the PC.

Many companies began producing personal computers, and periodicals devoted to computers began to be published. PCs began to sell in ever greater numbers; soon there were hundreds of thousands of them.

Sales growth was greatly facilitated by numerous useful programs developed for business applications.

Commercially distributed programs also appeared; for example, the WordStar text-editing program came out in 1978. With such programs it became possible to perform accounting calculations, prepare documents, and so on far more efficiently. As a result, many organizations found they could do the computations they needed not on mainframes or minicomputers but on personal computers, which were much cheaper - making the PC a profitable investment that quickly paid for itself.

The growing demand for personal computers by the end of the 70s led to a slight decrease in demand for large computers and minicomputers. This became a matter of serious concern for IBM (International Business Machines Corporation), a leading company in the production of large computers, and in 1979 IBM decided to try to release a personal computer.

The company's management did not regard this as a serious project, but merely as a minor experiment, one of dozens aimed at creating new equipment. In order not to spend too much money on the experiment, management allowed the division responsible for the project not to design the personal computer from scratch, but to use blocks made by other companies. And that division took full advantage of the chance it was given.


Slide 12

After this, computer technology began to develop rapidly. Every year new technologies and new computer models appeared.

History of computer development
10th grade students
Zhilavskaya Ekaterina
Neteplyuk Daria
Goncharenko Valeria
Burchak Maria

Plan
Babbage's Analytical Engine
First computers
Stored program computers
Development of computer element base
The advent of personal computers
The emergence of the IBM PC

Babbage's Analytical Engine
Back in the first half of the 19th century, the English mathematician Charles Babbage tried to build a universal computing device - that is, a computer (Babbage called it the Analytical Engine). It was Babbage who first proposed that a computer should contain memory and be controlled by a program. Babbage wanted to build his machine as a mechanical device and intended to set programs using punched cards - cards of thick paper with information recorded as holes (at that time they were already widely used in looms).
However, Babbage was unable to complete this work: it proved too complex for the technology of the day.

First computers
In the 1940s, several groups of researchers repeated Babbage's attempt. Some of these researchers knew nothing of Babbage's work and rediscovered his ideas. The first of them was the German engineer Konrad Zuse, who in 1941 built a small computer based on electromechanical relays. Because of the war, Zuse's work was not published.
In the USA in 1943, at one of IBM's plants, the American Howard Aiken created a more powerful computer called the Mark-1.

Stored program computers
Beginning in 1943, a group of specialists in the United States led by John Mauchly and Presper Eckert began constructing the ENIAC computer based on vacuum tubes. The machine they created worked a thousand times faster than the Mark-1. To simplify and speed up the setting of programs, Eckert and Mauchly began designing a new computer that could store the program in its own memory.
In 1945, John von Neumann prepared a report describing the logical organization of such a stored-program computer. The first computer embodying von Neumann's principles was built in 1949 by the English researcher Maurice Wilkes.

Development of computer element base
In the 40s and 50s, computers were built with vacuum tubes. In 1948, the transistor was invented, and thanks to it computers became smaller. The first transistor-based computers appeared in the late 50s, and by the mid-60s much more compact external devices had been created as well, which allowed Digital Equipment to release the first minicomputer, the PDP-8, in 1965 - the size of a refrigerator and costing only 20 thousand dollars.
In 1959, Robert Noyce invented a method that made it possible to create transistors and all the necessary connections between them on a single silicon wafer.
In 1968, Burroughs released the first integrated-circuit computer, and in 1970 Intel began selling memory integrated circuits. Subsequently, the number of transistors that could be placed per unit area of an integrated circuit roughly doubled every year, which ensured a constant reduction in the cost of computers and an increase in their performance.

The advent of personal computers
In 1974, several companies announced the creation of a personal computer based on the Intel-8008 microprocessor, that is, a computer intended for a single user. At the beginning of 1975, the first commercially distributed personal computer, the Altair-8800, based on the Intel-8080 microprocessor, appeared. Buyers equipped it with additional devices: a monitor for displaying information, a keyboard, memory-expansion units, and so on.
At the end of 1975, Paul Allen and Bill Gates created a BASIC language interpreter for the Altair computer, which allowed users to communicate with the computer and write programs for it with ease.
In 1976, a new company, Apple Computer, entered the market with the $666 Apple I computer.
But the Apple II computer, which appeared in 1977, became the prototype for most subsequent models, including the IBM PC.

The emergence of the IBM PC
Here is how the openness of the IBM PC architecture influenced the development of personal computers:
The promise and popularity of the IBM PC made producing various components and add-on devices for it very attractive, and competition between manufacturers made those components and devices cheaper.
Very soon, many companies ceased to be content with the role of component suppliers and began assembling their own computers compatible with the IBM PC. Since these companies did not need to bear IBM's huge costs for research and for maintaining the structure of a huge corporation, they were able to sell their computers much cheaper (sometimes 2-3 times) than comparable IBM machines.
Users were able to upgrade their computers themselves and equip them with additional devices from hundreds of different manufacturers.
All this led to a reduction in the cost of IBM PC-compatible computers and a rapid improvement in their characteristics, and therefore to a growth in their popularity.


History of the Development of Computer Technology

Slide 1

Slide 2

The history of the development of computer technology is usually divided into prehistory and 4 generations of computer development:

- Prehistory;
- First generation;
- Second generation;
- Third generation;
- Fourth generation.

Slide 3

Prehistory. In 1941, the German engineer Zuse built a small computer based on electromechanical relays, but because of the war his work was not published. In 1943, in the USA, at one of IBM's plants, Aiken created a more powerful computer, the Mark-1, which was used for military calculations. But electromechanical relays were slow and unreliable.

First generation of computers (1946 - mid-50s). A generation of computers refers to all types and models of computers developed by different design teams but built on the same scientific and technical principles. The appearance of the electron vacuum tube led to the creation of the first electronic computer: in 1946, the ENIAC (Electronic Numerical Integrator and Computer) appeared in the USA. This machine worked a thousand times faster than the Mark-1, but most of the time it stood idle, because setting up a program required spending several hours connecting the wires in the right way.

The set of elements from which a computer is built is called its element base. The element base of first-generation computers consisted of vacuum tubes, resistors, and capacitors, connected by wires using point-to-point wiring. A computer consisted of many bulky cabinets, occupied a dedicated machine room, weighed tens of tons, and consumed hundreds of kilowatts of electricity. ENIAC had 20 thousand vacuum tubes; in one second the machine performed 300 multiplications or 5,000 additions of multi-digit numbers.

In 1945, the famous American mathematician John von Neumann presented a report to the scientific community in which he outlined the formal logical organization of a computer, abstracting away from specific circuits and vacuum tubes.

Slide 4

History of the development of computer technology. Classic principles of functional organization and operation of a computer:

1. Availability of main devices: control unit (CU), arithmetic-logical unit (ALU), storage device (memory), input-output devices;
2. Storing data and commands in memory;
3. The principle of program control;
4. Sequential execution of operations;
5. Binary coding of information (the first computer, the Mark-1, performed calculations in the decimal number system, but such coding is difficult to implement technically and was later abandoned);
6. Use of electronic elements and electrical circuits for greater reliability (instead of electromechanical relays).
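To make these principles more concrete, here is a minimal, hypothetical sketch of a von Neumann-style machine in Python. This example is not from the presentation: the LOAD/ADD/STORE/HALT instruction set and the memory layout are invented purely for illustration. The program and the data share one memory, commands are executed one after another by a fetch-decode-execute loop, and everything is ultimately stored as numbers.

# A minimal model of a von Neumann-style machine (hypothetical instruction set).
# One memory holds both the program and the data as a single list of numbers.

LOAD, ADD, STORE, HALT = 1, 2, 3, 0      # made-up numeric opcodes

memory = [
    LOAD, 10,    # address 0: load the word at address 10 into the accumulator
    ADD, 11,     # address 2: add the word at address 11 to the accumulator
    STORE, 12,   # address 4: store the accumulator at address 12
    HALT, 0,     # address 6: stop
    0, 0,        # addresses 8-9: unused
    2, 3, 0,     # addresses 10-12: data cells
]

accumulator = 0        # the single register of the arithmetic-logic unit
program_counter = 0    # address of the next command (program control)

while True:
    opcode, operand = memory[program_counter], memory[program_counter + 1]   # fetch
    program_counter += 2                                                     # sequential execution
    if opcode == LOAD:
        accumulator = memory[operand]
    elif opcode == ADD:
        accumulator += memory[operand]
    elif opcode == STORE:
        memory[operand] = accumulator
    elif opcode == HALT:
        break

print(memory[12])      # prints 5: the machine computed 2 + 3 under program control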

Slide 5

First generation of computers

The first domestic (Soviet) computer was created in 1951 under the leadership of Academician S. A. Lebedev and was called MESM (Small Electronic Calculating Machine). Later, the BESM-2 (Large Electronic Calculating Machine) was created. The most powerful first-generation computer in Europe was the Soviet M-20, with a speed of 20 thousand operations per second and a RAM capacity of 4,000 machine words. On average, the speed of first-generation computers was 10-20 thousand operations per second. Operating a first-generation computer was complicated by frequent failures: vacuum tubes often burned out and had to be replaced by hand, and a whole staff of engineers was needed to service such a machine. Programs were written in machine code: one had to know all the machine commands and their binary representations. In addition, such computers cost millions of dollars.

Slide 6

Second generation of computers

The invention of the transistor in 1948 made it possible to change the element base of computers to semiconductor components (transistors and diodes), along with improved resistors and capacitors. One transistor replaced 40 vacuum tubes, worked faster, and was cheaper and more reliable. The assembly technology changed as well: the first printed circuit boards appeared - plates of insulating material on which transistors, diodes, resistors, and capacitors were mounted and interconnected by printed wiring. Power consumption dropped and dimensions shrank by a factor of hundreds, while the performance of such computers reached up to 1 million operations per second. If some elements failed, the entire board was replaced rather than each element individually. After the advent of transistors, the most labor-intensive operation in computer manufacturing became connecting and soldering transistors to form electronic circuits. The appearance of algorithmic (programming) languages made writing programs easier, and the principle of time sharing was introduced: different computer devices began to work simultaneously. In 1965, Digital Equipment released the first minicomputer, the PDP-8, the size of a refrigerator and costing only $20,000.

Slide 7

Third generation of computers

In 1958, Jack Kilby created the first prototype of an integrated circuit, or chip. The integrated circuit performed the same functions as an entire electronic circuit board in a second-generation computer: it was a silicon wafer on which transistors and all the connections between them were placed. The element base of this generation was the integrated circuit, and performance reached hundreds of thousands to millions of operations per second. The first computer made on integrated circuits was the IBM-360, released by IBM in 1968, which marked the beginning of a whole series (the higher the model number, the greater the machine's capabilities). In 1970, Intel began selling memory integrated circuits. Subsequently, the number of transistors per unit area of an integrated circuit approximately doubled every year, which ensured a constant reduction in cost and an increase in computer speed. Memory capacity grew, displays and plotters appeared, and programming languages continued to develop. In the USSR, two families of computers were produced: large machines (for example, the ES-1022 and ES-1035) and small ones (for example, the SM-2 and SM-3). A typical computer center of the time was equipped with one or two ES-series machines and a display classroom, where each programmer could connect to the computer in time-sharing mode.
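As a rough, back-of-the-envelope illustration (not data from the presentation) of what doubling roughly every year means, a few lines of Python show how such growth compounds over a decade; the starting count of 1,000 transistors and the year range are arbitrary assumptions:

# Hypothetical illustration of yearly doubling of transistor counts.
transistors = 1_000              # assumed starting count, chosen only for the example
for year in range(1970, 1981):
    print(year, transistors)
    transistors *= 2             # "approximately doubled every year"
# After ten doublings the count is 1,024 times the starting value,
# which is why the cost per transistor fell so quickly.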

Slide 8

Fourth generation of computers

In 1970, Marcian Edward Hoff of Intel designed an integrated circuit similar in function to the central processing unit of a large computer. This is how the first microprocessor, the Intel-4004, appeared; it went on sale in 1971. The microprocessor, less than 3 cm across, packed 2,250 transistors onto a single silicon crystal and was more powerful than a giant early machine. True, it worked much more slowly and could process only 4 bits of information at a time (compared with 16-32 bits for large computers), but it also cost tens of thousands of times less (about $500). Microprocessor performance soon began to grow rapidly. Microprocessors were first used in various computing devices, such as calculators. In 1974, several companies announced the creation of a personal computer based on the Intel-8008 microprocessor, that is, a device intended for a single user.

Slide 9

Widespread sales of personal computers (PCs) are associated with the names of the young Americans S. Jobs and S. Wozniak, founders of Apple Computer, which began producing Apple personal computers in 1977. Sales growth was driven by the many programs designed for business applications (text editing, spreadsheets for accounting calculations).

Slide 10

In the late 1970s, the rise of PCs led to a decline in demand for large computers. This worried the management of IBM, the leading manufacturer of large computers, and as an experiment it decided to try its hand in the PC market. In order not to spend much money on the experiment, the division responsible for the project was allowed not to design a PC from scratch but to use blocks made by other companies. Thus, the latest 16-bit microprocessor, the Intel-8088, was chosen as the main processor, and the software was commissioned from a small company called Microsoft.

In August 1981, the new IBM PC was ready and quickly became very popular among users. IBM did not make its computer a single, sealed unit and did not protect its design with patents. Instead, it assembled the computer from independently manufactured parts and did not keep the way those parts were put together a secret: the IBM PC's specifications were available to everyone. This allowed other companies to develop both hardware and software for it. Very soon these companies were no longer content with the role of component suppliers and began assembling their own computers compatible with the IBM PC. Because they did not have to bear huge research costs, they could sell their computers much cheaper than comparable IBM machines, and competition between manufacturers drove prices down further. Computers compatible with the IBM PC came to be called "clones".

A common feature of the IBM PC family and computers compatible with it is software compatibility and the principle of open architecture, i.e. the ability to add or replace hardware with more modern components without replacing the entire computer. One of the most important ideas of fourth-generation computers is the simultaneous use of several processors to process information (multiprocessing).

Slide 11

A server is a powerful computer in a network that provides services to the computers connected to it and access to other networks.

Supercomputers appeared back in the 70s. Unlike computers with the classical von Neumann structure, they use multiprocessing: the problem being solved is divided into several parts, each of which is processed in parallel on its own processor. This dramatically increases performance; their speed is measured in billions of operations per second. But such computers cost millions of dollars.

Personal computers (PCs) are used everywhere and are affordably priced. A large number of software tools have been developed for them for a variety of applications, helping people process information. The modern PC has become a multimedia machine: it processes not only numbers and text but also works effectively with sound and images.

Portable computers (from the Latin "porto", "to carry") are computers that can be carried around. The most common type is the notebook, a notebook-sized personal computer.

Industrial computers are designed for use in industrial settings, for example to control machine tools, airplanes, and trains. They are subject to stricter requirements for reliability and for resistance to temperature changes, vibration, and so on, which is why ordinary personal computers cannot be used as industrial ones.
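As a toy illustration of the multiprocessing idea described above (one problem split into parts that are solved in parallel on separate processors), here is a small hypothetical Python sketch; the task of summing the numbers from 1 to 1,000,000 in four chunks is invented purely for the example:

# Toy illustration of multiprocessing: one problem split across several worker processes.
from multiprocessing import Pool

def partial_sum(bounds):
    # Sum one chunk of the overall range.
    start, stop = bounds
    return sum(range(start, stop))

if __name__ == "__main__":
    # Split the problem 1 + 2 + ... + 1,000,000 into four independent parts.
    chunks = [(1, 250_001), (250_001, 500_001), (500_001, 750_001), (750_001, 1_000_001)]
    with Pool(processes=4) as pool:      # four worker processes run in parallel
        parts = pool.map(partial_sum, chunks)
    print(sum(parts))                    # 500000500000, the same answer a single processor would get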
