The history of the invention of the computer, told for children: how computers have developed, and have you ever wondered what they will be like in ten years?


Introduction

Personal computers (PCs) are becoming ever more integral to our lives and play far from the last role in them. If some 15 years ago they could be seen only in large organizations, today there is a PC in every store, office, cafe, library and apartment.

Today computers are used in many areas of human activity: accounting, building complex scientific models, design, composing music, storing and searching for information in databases, learning, playing games and listening to music. It pays to know the computer and to be able to use it, yet not everyone who works on a computer has an accurate picture of what a PC is actually made of.

Even professionals working outside the computer field consider knowledge of personal computer hardware, at least of its basic technical characteristics, an indispensable part of their competence. Interest in computers is especially strong among young people, who use them widely for their own purposes.

The topic is relevant because the modern computer market is so diverse that choosing a PC configuration with the required characteristics is not easy, and doing so without special knowledge is almost impossible.

In this regard, the purpose of the course work is to study the main devices of a modern PC. In accordance with the goal, the following tasks were set:

  • learn about the history of computers;
  • learn the basic components of a PC;
  • master their main properties and characteristics.

History of the computer

The word "computer" means "computer" that is, a device for computing. The need for automation of calculations arose a very long time ago. Many thousands of years ago, pebbles, counting sticks, and similar devices were used. More than 1500 years ago, the so-called counting boards were invented, their descendant is the well-known abacus.

In 1642, the French scientist, physicist and philosopher Blaise Pascal invented the adding machine, a mechanical device for adding numbers. Pascal had conceived his calculating machine as early as 1640; the work lasted about five years, during which about fifty different models were built, and was completed in 1645. In 1649 Pascal received a "royal privilege" (patent) giving him the right to manufacture and sell the machine.

A number of such machines were indeed manufactured and sold by him. Many other designs of mechanical calculating machines were proposed later, but they came into wide use only 200 years afterwards, in the 19th century, when industrial production made them practical. Such machines came to be called arithmometers; they mechanized all four arithmetic operations: addition, subtraction, multiplication and division. Arithmometers and their successors, electromechanical keyboard calculating machines, remained in use until the 1960s, when they were replaced by electronic microcalculators.

The mechanical calculators discussed above were manual, that is, they required an operator to take part in the calculation. For each operation the initial data had to be entered into the machine and its counting elements set in motion, and from time to time the results had to be read off, written down and checked for correctness.

Is it possible to create an automatic computing machine that would carry out the required calculations without human intervention? The first to pose this question seriously and to take real steps toward a positive answer was the remarkable English scientist, engineer and inventor Charles Babbage, who tried to build an automatic computing device (he called it the Analytical Engine) that would work without human intervention, under the control of punched cards.

The Analytical Engine was never completed, but Babbage produced more than 200 drawings of its various components and about 30 variants of the machine's general layout, and built some of its devices at his own expense.

At the end of the 19th and the beginning of the 20th century, so-called calculating-and-analytical (punched-card) machines, built on the ideas of Pascal and Babbage, became widespread. Electrocontact devices were used to read the punched cards, and an electric motor drove the counting wheels. Later, machines were built in which numbers were stored in binary form using groups of electric relays. Aiken in the USA, Zuse in Germany and others designed such relay machines, which remained in use until the early 1960s, competing with the electronic computers that had already appeared by then.

The first true electronic computer was built in late 1945; the machine was named ENIAC (Electronic Numerical Integrator and Computer). It contained over 18,000 vacuum tubes and consumed about 150 kW of power.

Beginning in 1944, one of the greatest American mathematicians, John von Neumann, took part in the creation of electronic computers. In the paper "Preliminary Discussion of the Logical Design of an Electronic Computing Instrument", published in 1946 together with H. Goldstine and A. Burks, he set out two ideas that are used in all electronic computers to this day: the binary number system and the stored-program principle. Storing the program in the machine's memory allows instructions to be transformed while the machine is running, which makes the computational process flexible.
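To make the stored-program idea concrete, here is a minimal illustrative sketch in Python. It is a toy machine invented for this text, not any historical design: instructions and data sit in the same memory, and the machine fetches its next instruction from that memory exactly as it fetches data.

```python
# Toy stored-program machine: program and data share one memory.
memory = [
    ("LOAD", 9),    # put the value stored at address 9 into the accumulator
    ("ADD", 10),    # add the value stored at address 10
    ("STORE", 11),  # write the accumulator back to address 11
    ("HALT", 0),
    0, 0, 0, 0, 0,  # unused cells
    2, 3, 0,        # data at addresses 9, 10, 11
]

acc = 0             # accumulator register
pc = 0              # program counter

while True:
    op, addr = memory[pc]   # fetch the next instruction from memory
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[11])   # prints 5 (2 + 3)
```

Because the program itself lives in memory, nothing in principle prevents a program from writing new instructions into that memory, which is exactly the flexibility the stored-program principle provides.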

The computers of the 1940s and 1950s were very large and very expensive devices. However, in the struggle for buyers, the firms that produced them sought to make their products smaller and cheaper. In 1965, Digital Equipment released the first minicomputer, the PDP-8: the size of a refrigerator and priced at about $20,000. Later, with the invention of integrated circuits (chips), it became possible to reduce the size and cost of computers still further. In 1975 the first commercially distributed personal computer, the Altair 8800, built around the Intel 8080 microprocessor, was released; it cost about $500. The production of personal computers began to grow.

In 1979, IBM, the world leader in the design and manufacture of large computers, decided to try its hand at the personal computer market. In 1981, a new computer called the IBM PC was introduced to the public.

A few years later, IBM personal computers became the market leaders. In fact, the IBM PC has become the standard for the personal computer. Now such computers (compatible with the IBM PC) make up about 90% of all personal computers produced in the world.

The main advantage of IBM computers is the so-called open architecture principle, that is, the ability to assemble a computer from various blocks by attaching them to the motherboard using standard connectors - slots. This allows you to increase the amount of memory, install new devices for image processing, etc.

The modern personal computer surpasses the first personal computers in its capabilities just as the first electronic computers surpassed Pascal's calculating machine. Still, there are areas of human activity where even its power is not enough: processing very large amounts of information in scientific research, engineering calculations and the creation of video films. In such cases supercomputers are used, which store and process truly enormous amounts of information. If a personal computer stores hundreds of gigabytes and performs hundreds of millions of operations per second, a supercomputer can store thousands of gigabytes and process them at a speed of several trillion operations per second.

To work successfully on a personal computer it is not strictly necessary to know how it is built. Still, it is better to know what devices a PC includes and their basic principles of operation and characteristics: this lets you use all of the computer's technical capabilities consciously and upgrade it when needed.

Brief history of computers

Today it is difficult for a modern person to imagine life without electronic computers. Nowadays anyone can assemble a full-fledged computing center on their desktop to suit their own needs. It was not always so, of course; humanity's path to this achievement was long and thorny. Many centuries ago people wanted devices that would help them solve various problems. Many of those problems were solved by carrying out a sequence of routine actions or, as we now say, by executing an algorithm. It all began with attempts to invent a device capable of carrying out the simplest of these algorithms: the addition and subtraction of numbers...

The starting point can be considered the beginning of the 17th century (1623), when the scientist Wilhelm Schickard created a machine that could add and subtract numbers. But the first mechanical calculator to become widely known was the adding machine of the famous French scientist and philosopher Blaise Pascal. Its main element was the gear wheel, whose invention was in itself a key event in the history of computing technology. It is worth noting that the evolution of computing technology has been uneven and spasmodic: periods of accumulating strength are followed by breakthroughs in development, after which comes a period of stabilization, when the results achieved are put to practical use and, at the same time, knowledge and strength are accumulated for the next leap forward. After each turn, the process of evolution reaches a new, higher level.

In 1671, the German philosopher and mathematician Gottfried Leibniz also created a calculating machine, based on a gear wheel of special design, the Leibniz stepped wheel. Unlike the machines of his predecessors, Leibniz's arithmometer performed all four basic arithmetic operations. With that, this period ended, and for almost a century and a half mankind accumulated strength and knowledge for the next round of the evolution of computing technology. The 18th and 19th centuries were a time when various sciences, including mathematics and astronomy, developed rapidly and often ran into problems that required long and laborious calculations.

Another famous figure in the history of computing was the English mathematician Charles Babbage. In 1823 Babbage began work on a machine for computing polynomials which, more interestingly, was also meant to output its results by impressing them onto a plate for printing. It was planned that the machine would be driven by a steam engine. Because of technical difficulties Babbage was unable to finish the project, but here for the first time arose the idea of using an external (peripheral) device to output the results of calculations. Note that another inventor, Scheutz, did build the machine Babbage had conceived, in 1853 (it turned out even smaller than planned). Perhaps Babbage enjoyed the creative search for new ideas more than turning them into something material. In 1834 he set out the principles of another machine, which he called the Analytical Engine. Technical difficulties again kept him from fully realizing his ideas, and he brought the machine only to the experimental stage. But it is the idea that drives scientific and technological progress. Babbage's next machine embodied the following ideas:

Control of a production process. Such a machine controlled the work of a loom, changing the pattern of the fabric being woven depending on the combination of holes on a special paper tape. This tape became the forerunner of storage media familiar to all of us: punched cards and punched tape.

Programmability. The operation of the machine was likewise to be controlled by a special paper tape with holes; the order of the holes determined the commands and the data those commands processed. The machine had an arithmetic unit and a memory, and its instruction set even included a conditional jump instruction, which changed the course of the calculation depending on intermediate results.

Countess Ada Augusta Lovelace, who is considered the world's first programmer, took part in the development of this machine.

The ideas of Charles Babbage were developed and used by other scientists. In 1890, on the threshold of the 20th century, the American Herman Hollerith developed a machine that worked with tables of data (the first Excel?). The machine was controlled by a program on punched cards and was used in the 1890 US census. In 1896 Hollerith founded the company that became the predecessor of the IBM Corporation. After Babbage's death there was another pause in the evolution of computing technology, lasting until the 1930s; from then on, the entire development of mankind became unthinkable without computers.

In 1938, the center of development briefly shifted from America to Germany, where Konrad Zuse created a machine that, unlike its predecessors, operated on binary rather than decimal numbers. This machine was still mechanical, but its undoubted merit was that it implemented the idea of processing data in binary code. Continuing his work, Zuse created an electromechanical machine in 1941 whose arithmetic unit was built from relays; it could even perform floating-point operations.

Across the ocean, in America, work on similar electromechanical machines was also under way during this period. In 1944 Howard Aiken completed a machine called the Mark I. Like Zuse's machine, it worked on relays, but because it was clearly influenced by Babbage's work it operated on data in decimal form.

Naturally, because of their large proportion of mechanical parts, these machines were doomed. A new, more technologically advanced element base had to be found, and then engineers remembered the invention of Lee de Forest, who in 1906 had created the three-electrode vacuum tube, the triode. Its functional properties made it the most natural replacement for the relay. In 1946, at the University of Pennsylvania in the USA, the first general-purpose electronic computer, ENIAC, was created. It contained 18,000 vacuum tubes, weighed 30 tons, occupied an area of about 200 square meters and consumed an enormous amount of power. It still used decimal arithmetic, and it was programmed by plugging in connectors and setting switches. Naturally, such "programming" caused many problems, above all because of incorrectly set switches. The name of another key figure in the history of computing technology is associated with the ENIAC project: the mathematician John von Neumann. It was he who first proposed writing the program and its data into the machine's memory so that they could be modified, if necessary, during operation. This key principle was later used in a fundamentally new computer, EDVAC (1951). That machine already used binary arithmetic and had a random-access memory built on mercury ultrasonic delay lines; the memory could store 1,024 words, each of 44 binary digits.

After the creation of EDVAC, humanity realized what heights of science and technology the human-computer tandem could reach. The industry began to develop quickly and dynamically, although here too there was a certain periodicity connected with the need to accumulate knowledge before the next breakthrough. Until the mid-1980s the evolution of computing technology was usually divided into generations. For completeness, here are brief descriptions of these generations:

The first generation of computers (1945-1954). During this period the typical set of structural elements that make up a computer took shape. By this time developers had arrived at roughly the same idea of what elements a typical computer should consist of: a central processing unit (CPU), random access memory (RAM) and input-output (I/O) devices, with the CPU in turn consisting of an arithmetic logic unit (ALU) and a control unit (CU). The machines of this generation were built on vacuum tubes, because of which they consumed a huge amount of energy and were very unreliable. They were used mainly to solve scientific problems. Programs for these machines could already be written not in machine code but in assembly language.

The second generation of computers (1955-1964). The change of generations was determined by a new element base: instead of bulky vacuum tubes, miniature transistors began to be used in computers, and delay lines as random-access memory elements were replaced by magnetic-core memory. This ultimately reduced the size and increased the reliability and performance of computers. Index registers and hardware for floating-point operations appeared in computer architecture, and instructions for calling subroutines were introduced.

High-level programming languages appeared - Algol, FORTRAN, COBOL - creating the prerequisites for portable software that does not depend on the type of computer. With high-level languages came compilers, libraries of standard subroutines and other things that are familiar to us now.

An important innovation worth noting is the appearance of so-called input-output processors. These specialized processors made it possible to free the central processor from controlling input-output and to perform I/O on a dedicated device simultaneously with computation. At this stage the circle of computer users expanded sharply and the range of tasks being solved grew. Operating systems (OS) began to be used to manage machine resources efficiently.

The third generation of computers (1965-1970). The change of generations was again due to a new element base: instead of transistors, integrated circuits of various degrees of integration began to be used in computer components. Microcircuits made it possible to place dozens of elements on a plate a few centimeters across. This, in turn, not only increased the performance of computers but also reduced their size and cost. Comparatively inexpensive and compact machines appeared, the minicomputers, which were actively used to control technological production processes and in systems for collecting and processing information.

The increase in computer power made it possible to execute several programs on one computer at the same time. To do this it was necessary to learn how to coordinate simultaneously running activities with one another, for which the functions of the operating system were expanded.

Alongside active work on hardware and architectural solutions, the share of work on programming technology grew. At this time the theoretical foundations of programming methods, compilation, databases, operating systems and so on were being actively developed, and application software packages were created for the most varied areas of human activity.

Rewriting all programs for every new type of computer was now becoming an unaffordable luxury. A tendency arose to create families of computers, that is, machines compatible from the bottom up at the hardware and software level. The first of these families was the IBM System/360 series and its Soviet counterpart, the ES EVM series.

The fourth generation of computers (1970-1984). Yet another change in the element base led to a change of generations. In the 1970s work went on actively to create large-scale and very-large-scale integrated circuits (LSI and VLSI), which made it possible to place tens of thousands of elements on a single chip. This led to a further significant reduction in the size and cost of computers. Working with software became friendlier, which increased the number of users.

In principle, with such a degree of integration it became possible to try to create a functionally complete computer on a single chip. Such attempts were made, although they were mostly met with an incredulous smile. There would probably have been fewer smiles had it been possible to foresee that this very idea would cause the extinction of large computers some fifteen years later.

Nevertheless, in the early 1970s Intel released the 4004 microprocessor (MP). And if until then there had been only three directions in the world of computing (supercomputers, large computers (mainframes) and minicomputers), a fourth was now added to them: the microprocessor. In general, a processor is a functional unit of a computer designed for the logical and arithmetic processing of information on the basis of microprogram control. By hardware implementation, processors can be divided into microprocessors (which integrate all processor functions completely) and processors of low and medium integration. Structurally this means that microprocessors implement all the functions of the processor on a single chip, whereas other types of processor implement them by connecting a large number of chips.

So, the first microprocessor, the 4004, was created by Intel at the turn of the 1970s. It was a 4-bit parallel computing device, and its capabilities were severely limited: the 4004 could perform the four basic arithmetic operations and was at first used only in pocket calculators. Later its field of application expanded to various control systems (for example, traffic-light control). Intel, having correctly foreseen the promise of microprocessors, continued intensive development, and one of its projects eventually led to a major success that predetermined the future path of computing technology.

That project was the development of the 8-bit 8080 processor (1974). This microprocessor had a fairly advanced instruction set and could divide numbers. It was this chip that was used in the Altair personal computer, for which the young Bill Gates wrote one of his first BASIC interpreters. It is perhaps from this moment that the fifth generation could be counted.

The fifth generation of computers (1984 to the present) can be called the microprocessor generation. Note that the fourth generation ended only in the early 1980s; in other words, the parents, in the shape of the large machines, and their rapidly maturing and strengthening "child" coexisted relatively peacefully for almost ten years. That time served both of them well: the designers of large computers accumulated vast theoretical and practical experience, while microprocessor programmers managed to find their own, at first very narrow, niche in the market.

In 1978, Intel completed development of the 16-bit 8086 processor. It had fairly wide registers (16 bits) and a 20-bit address bus, thanks to which it could address up to 1 MB of RAM.

The 80286 was created in 1982. This processor was an improved version of the 8086 and already supported several modes of operation: real mode, in which addresses were formed according to the rules of the i8086, and protected mode, which implemented multitasking and virtual-memory management in hardware. The 80286 also had a wider address bus, 24 bits against the 8086's 20, so it could address up to 16 MB of RAM. The first computers based on this processor appeared in 1984. In computing power they became comparable to the IBM System/370, so this can be considered the end of the fourth generation of computer development.
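A quick way to check the memory limits quoted above: a processor with n address lines can name 2**n distinct byte addresses. A tiny Python sketch of that arithmetic:

```python
# Each extra address line doubles the number of distinct addresses,
# so n lines give 2**n addressable bytes.
for name, bits in [("8086", 20), ("80286", 24)]:
    size = 2 ** bits
    print(f"{name}: 2**{bits} = {size:,} bytes = {size // 2**20} MB")
# 8086:  2**20 = 1,048,576 bytes = 1 MB
# 80286: 2**24 = 16,777,216 bytes = 16 MB
```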

In 1985, Intel introduced the first 32-bit microprocessor, the 80386, which was hardware compatible from the bottom up with all previous Intel processors. It was much more powerful than its predecessors, had a 32-bit architecture and could directly address up to 4 GB of RAM. The 386 also began to support a new mode of operation, virtual 8086 mode, which not only ran programs written for the 8086 more efficiently but also allowed several such programs to run in parallel. Another important innovation, support for paging of RAM, made it possible to have a virtual memory space of up to 4 TB.
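The paging mechanism mentioned above can be illustrated with a deliberately simplified Python sketch. The 4 KB page size is the real 386 value, but everything else here (the one-level page_table dictionary and the translate function) is an invented toy example; the actual 80386 walks a two-level page table in hardware.

```python
# Simplified paging: a linear address splits into a page number and an
# offset, and a table maps each page to a physical page frame.
PAGE_SIZE = 4096                      # 2**12 bytes per page

page_table = {0: 7, 1: 3}             # hypothetical mapping: page -> frame

def translate(linear_address):
    page = linear_address >> 12       # upper bits select the page
    offset = linear_address & 0xFFF   # lower 12 bits pass through unchanged
    frame = page_table[page]          # a real CPU would fault if absent
    return frame * PAGE_SIZE + offset

print(hex(translate(0x1ABC)))         # page 1 maps to frame 3 -> 0x3abc
```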

The 386 was the first of these microprocessors to use parallel processing: it simultaneously carried out access to memory and input-output devices, placed instructions in the execution queue, decoded them, translated linear addresses into physical ones, and performed paging (information about the 32 most frequently used pages was held in a special cache memory).

Shortly after the 386 came the 486, in which the ideas of parallel processing were developed further. The instruction decoding and execution unit was organized as a five-stage pipeline, so that up to five instructions could be at various stages of execution at once. A first-level cache containing frequently used code and data was placed on the chip, and in addition there was a second-level cache of up to 512 KB. It became possible to build multiprocessor configurations, and new instructions were added to the instruction set. All these innovations, together with a significant increase in the microprocessor's clock frequency (up to 133 MHz), substantially increased the speed of program execution.

Since 1993, Intel has produced Pentium microprocessors. Their debut was initially overshadowed by an error in the floating-point unit. The error was quickly corrected, but distrust of these microprocessors lingered for some time.

The Pentium continued to develop the ideas of parallel processing. A second pipeline was added to the instruction decoding and execution unit, and the two pipelines (called u and v) could together execute two instructions per clock cycle. The internal cache was doubled: 8 KB for code and 8 KB for data. The processor also became more intelligent: branch prediction was added, which significantly increased the efficiency of executing nonlinear algorithms. Although the system architecture remained 32-bit, 128- and 256-bit data buses began to be used inside the microprocessor, and the external data bus was widened to 64 bits. Technologies connected with multiprocessor information processing also continued to develop.

The advent of the Pentium Pro microprocessor divided the market into two sectors: high-performance workstations and low-cost home computers. The most advanced technologies were implemented in the Pentium Pro; in particular, a third pipeline was added to the Pentium's two, so that in one clock cycle the microprocessor could execute up to three instructions.

Moreover, the Pentium Pro allowed dynamic execution of instructions (Dynamic Execution). Its essence is that three instruction decoders, working in parallel, break instructions into smaller parts called micro-ops; these micro-ops can then be executed in parallel by five execution units (two integer units, two floating-point units and one memory-interface unit). At the output the instructions are reassembled in their original form and order. The power of the Pentium Pro is complemented by the improved organization of its cache memory: like the Pentium, it has 8 KB of first-level cache and 256 KB of second-level cache, but thanks to circuit solutions (a dual independent bus architecture) the second-level cache was placed in the same package as the microprocessor, which significantly increased performance. The Pentium Pro also implemented a 36-bit address bus, which made it possible to address up to 64 GB of RAM.

The family of ordinary Pentium processors did not stand still either. While in the Pentium Pro parallelism was achieved through architectural and circuit-level solutions, the newer models of the Pentium took a different path: new instructions were added, and to support them the processor's software model was somewhat changed. These instructions, called MMX (MultiMedia eXtension), made it possible to process several units of the same type of data simultaneously.
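To show what "processing several units of the same type of data simultaneously" means, here is a conceptual Python sketch, not real MMX code: the helper names pack and packed_add are invented for illustration. Four 16-bit values are packed into one 64-bit word, and a single packed add updates all four lanes at once, each lane wrapping at 16 bits the way a wraparound packed add of words does.

```python
def pack(lanes):
    # Pack four 16-bit values into one 64-bit word (lane i in bits 16*i..16*i+15).
    word = 0
    for i, v in enumerate(lanes):
        word |= (v & 0xFFFF) << (16 * i)
    return word

def packed_add(a, b):
    # One "instruction": add all four 16-bit lanes, wrapping each at 16 bits.
    result = 0
    for i in range(4):
        lane = ((a >> (16 * i)) + (b >> (16 * i))) & 0xFFFF
        result |= lane << (16 * i)
    return result

a = pack([1, 2, 3, 4])
b = pack([10, 20, 30, 40])
s = packed_add(a, b)
print([(s >> (16 * i)) & 0xFFFF for i in range(4)])   # [11, 22, 33, 44]
```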

The next processor, the Pentium II, combined the technological achievements of both directions in the development of the Pentium architecture. In addition it had new design features; in particular, its case was manufactured using a new packaging technology. The portable computer market was not forgotten either: the processor supports several power-saving modes.

The Pentium III processor traditionally supports all the achievements of its predecessors; its main (and perhaps only?) advantage is the presence of 70 new instructions. These supplement the group of MMX instructions, but for floating-point numbers, and a special unit was included in the processor architecture to support them.



One day I was sitting at my computer, quietly working away, when the thought suddenly struck me: how did it all begin, and what was the very first computer in the world? Naturally, I decided to find the answer to this question; it really hooked me. And the answer was found! It became the topic of this next blog post about all the most interesting things in the world that leave no one indifferent. As always, deciding who was first turned out not to be simple, but one gets used to that...

The very first computer in the world was designed in the USA by the Harvard University mathematician Howard Aiken back in 1941. Together with four IBM engineers, he created a computer based on the ideas of Charles Babbage. After all the tests it was put into operation on August 7, 1944. Its creators named it "Mark 1", and it was set to work at Harvard.


At the time this computer cost five hundred thousand dollars, a fabulous sum for those days. It was assembled in a special case made of glass and corrosion-resistant steel. The case was about seventeen meters long and more than 2.5 meters high. The machine weighed about 5 tons and occupied several tens of cubic meters of space.
"Mark 1" consisted of many switches and other mechanisms, the total number of which was 765 thousand.
His wires were a total length of about eight hundred kilometers!

The capabilities of the very first computer in the world seem laughable to us now, but at that time there was not a single more powerful computing device on the planet.

The machine could:

  • operate with seventy-two numbers of twenty-three decimal digits each;
  • add and subtract, each such operation taking about a third of a second;
  • multiply and divide as well, spending about six and fifteen seconds respectively on those operations.

To enter information into this apparatus, which was essentially just a faster adding machine, a special perforated paper tape was used. It was the first computer that did not need human intervention for its computing processes.

Back in 1942, a design by John Mauchly gave the impetus for the creation of the first electronic computer, though at the time few paid attention to it. After engineers of the US Army looked at it in 1943, work began on a machine that came to be called ENIAC. The military controlled the funding, allocating about five hundred thousand dollars to the project, since they wanted it for designing new types of weapons.
ENIAC consumed so much energy that while it was running the nearby town constantly suffered electricity shortages, and people were sometimes left without power for hours.

Specifications

Take a look at some very interesting characteristics of the very first computer in the world according to this second version. Impressive, isn't it?

  • It weighed 27 tons.
  • It contained 18,000 vacuum tubes and a multitude of other components.
  • Its memory was 4 KB.
  • It occupied an area of about 135 sq. m and was entangled in a mass of wires.

It was programmed by hand: operators simply set hundreds of switches, and the machine had to be switched off and on again for every new task, since it had no stored program, let alone a hard drive. There was no keyboard and no monitor either. The machine filled dozens of cabinets of vacuum tubes, and it broke down frequently because it kept overheating. It was later used in calculations for the design of the hydrogen bomb. This machine worked for more than ten years, and after the transistor was invented in 1947, computers began to shrink in size.

Where and when was the very first PC sold?

In the following two decades the concept of the computer changed little, but with the introduction of the microprocessor, development picked up speed. In 1975 IBM tried to bring one of the first personal computers to market, the IBM 5100, but there were almost no sales. It stored information on cassettes and was very expensive for its time, about ten thousand dollars, so few people could afford such a device.
It could run programs written in BASIC and in APL, a language created inside IBM. Its display showed sixteen lines of sixty-four characters, and its memory was sixty-four KB. The cassettes themselves looked very much like ordinary audio cassettes. Sales were minimal because of the high price and the poorly thought-out interface, yet there were people who bought it, and they opened a new era in the history of world markets: the computer trade.

And have you thought about what they will be like in ten years?

Not so long ago IBM showed the press the Roadrunner supercomputer, capable of about one quadrillion operations per second. It was built for the US Department of Energy. It includes 6,480 dual-core processors and 12,960 Cell 8i processors, consists of 278 cabinets linked by 88 kilometers of cable, weighs 226 tons, occupies an area of 1,100 m² and cost $133,000,000.

As you can see, supercomputer cabinets are still in fashion; it is all a matter of design...


That is how computer history turned out. Was it interesting or not? Write in the comments!


We are opening the computer club "Click". Our classes will be held on Thursdays, and this is our first lesson, in which you will learn about the history of the computer. There is probably hardly anyone left today who does not use a computer, except perhaps in remote tribes far from civilization, yet the history of the computer passed through several stages before reaching its modern level.

People constantly want to achieve something: fly, drive, build large structures, have pocket TVs, and there seems to be no end in sight.

Millions of people have worked for decades on the creation of computers, which at first occupied several rooms, then fit into meter-sized cabinets, and can now be just a few centimeters thick.

The computer era entered our lives relatively recently. Literally 100 years ago people did not know what a computer was, although its most distant predecessor, the counting board, appeared in ancient Babylon around 3000 BC and later received the Greek name abacus. It was a board with grooves along which pebbles were moved.

A descendant of the abacus, the simple counting frame, was used in stores until not so long ago: a wooden frame with rods inside and beads strung on each rod.

An attempt to create a mechanism for performing simple calculations was made by Leonardo da Vinci at the turn of the 16th century; in essence, it was a design for a mechanical calculator.

After that, several mechanical calculating machines appeared. One of them was the "Pascaline" of Blaise Pascal, a device that added and subtracted eight-digit numbers (1642). It was the first digital calculating machine, and this discovery started it all...

Blaise Pascal, a French mathematician and physicist.

Mankind was striving for the computer era, creating more and more computers that performed more and more complex functions.

And yet the first machine that resembled a modern computer was the Analytical Engine of the British mathematician Charles Babbage.

Although the machine was impressive in size, its main parts resembled the components of an ordinary modern computer: it had both a processor and memory. You must understand that the terms "processor" and "memory" appeared later; Charles Babbage called the processor the "mill" and the memory the "store". Babbage's machine also had the functions of a printer, that is, it could "read" a record and put the results on paper.

Charles Babbage

During World War II, calculating devices were developed mainly for military purposes.

By order of the US Army, ENIAC (the Electronic Numerical Integrator and Computer) was created in 1946: the first electronic digital computer that could be reprogrammed to solve many different problems. The device weighed 27 tons and occupied several rooms.

The first generation of computers required a huge amount of electricity and numerous staff to operate. In addition, they were very expensive; only governments and large research organizations could purchase them.

The creators of ENIAC, John Mauchly and J. Presper Eckert, next to the machine, 1966.


UNIVAC (Universal Automatic Computer) appeared in 1951. Computers then suddenly ceased to be only for governments and became available to business.

In 1961, an experimental computer built on microcircuits was created, and three years later IBM launched production of its IBM System/360 computers.

The possibility of using a computer at home in those years was not even considered. It was tantamount to building a power plant to light a private house. The smallest computer by the early 1970s was the size of a refrigerator and was very expensive.

And suddenly, in 1976, Steve Wozniak and Steve Jobs, two young American technicians with no special education, created in a garage workshop a small programmable device for video games.

They called their invention the Apple ("apple"). Jobs, together with Wozniak, founded Apple Computer and began mass-producing personal computers.

The first Apple Macintosh: the Macintosh 128K computer.

Demand for them exceeded all expectations. In a short time, Jobs's firm grew into a large, prosperous enterprise. This forced other firms to pay attention to the personal computer market.

In 1985, Microsoft, the company co-founded by Bill Gates and Paul Allen, decided to develop not a new computer but the operating system software needed to use one: Windows.

Young Bill Gates, another famous figure among computer geniuses. He would go on to become the richest man in the world.

Later came netbooks, compact laptops designed primarily for Internet access, as well as tablet computers equipped with a touch screen for working with a stylus or fingers, without a keyboard and mouse.

Initially computers were designed only for calculation; later they acquired other functions. Above all, a computer is an information device, since it lets us work with information: receive news, store, record, send and edit it, and so on.

The modern PC is a communication and learning device, as well as an entertainment tool that allows you to listen to music, watch videos and play all kinds of games.

In 1991, Blizzard Entertainment was founded and went on to create online games.

In 1994, Sony released the famous home video game console, the PlayStation.

In 2001, the Wikipedia project was launched. Wikipedia defines itself as “a free web-based multilingual encyclopedia project…. its 17 million articles … have been written collaboratively by volunteers around the world, and nearly all of the articles can be edited by anyone with access to the site.”

Founder of Wikipedia Jimmy Wales.

And more inventions, more novelties, more and more... It is difficult to say where the rapid and ever-accelerating development of computer technology may lead, but some futurologists (scientists who forecast the future) argue that by 2030 humanity may come close to creating artificial intelligence and self-reproducing machines, as well as to the integration of human and computer.
