History of computing (Wikipedia)

Invention

Nobody knows who built the first computer. This is because the word "computer" used to mean a person who did math as their job (a human computer). Because of this, some people say that humans were the first computers. Human computers got bored doing the same math over and over again, and made tools (mostly mechanical calculating devices like abacuses) to help them get the answers to their problems.

Automation

Humans have a problem with math. To show this, try doing 584 x 3,220 in your head. It is hard to remember all the steps! People made tools to help them remember where they were in a math problem. The other problem people have is that they have to do the same problem over and over and over again. A cashier used to make change every day in her head or with a piece of paper. That took a lot of time and people made mistakes. So people made machines that did those same things over and over. This part of computer history is called the "history of automated calculation," which is a fancy phrase for "the history of machines that make it easy for me to do this same math problem over and over without making mistakes."
The abacus, the slide rule, the astrolabe and the Antikythera mechanism (which dates from about 150-100 BC) are examples of automated calculation machines.
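As a rough modern illustration of "automated calculation" (a minimal sketch in Python; the coin values and example prices are invented for illustration, not anything from the history above), the same arithmetic can be repeated endlessly without slips of the pen:

    # A tiny sketch of automated calculation: the same arithmetic, done
    # over and over with no boredom and no mistakes. The coin values and
    # the example prices are made up for illustration.
    COINS = [100, 25, 10, 5, 1]  # coin values in cents

    def make_change(price_cents, paid_cents):
        """Return the coins a cashier hands back, largest first."""
        change = paid_cents - price_cents
        coins_back = []
        for coin in COINS:
            while change >= coin:
                coins_back.append(coin)
                change -= coin
        return coins_back

    print(584 * 3220)              # the "hard to do in your head" example: 1880480
    print(make_change(584, 1000))  # change for 584 cents paid with 1000 cents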

Programming

Some people did not want a machine that would do the same thing over and over again. For example, a music box is a machine that plays the same music over and over again. Some people wanted to be able to tell their machine to do different things. For example, they wanted to tell the music box to play different music every time. They wanted to be able to program the music box: to order it to play different music. This part of computer history is called the "history of programmable machines," which is a fancy phrase for "the history of machines that I can order to do different things if I know how to speak their language."
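As a rough modern analogy (a minimal sketch in Python, not a description of any real music box), the "program" is just data that can be swapped out while the machine stays the same:

    # The machine (play) never changes; the program (the list of notes)
    # can be replaced at will. The note names are purely illustrative.
    def play(tune):
        for note in tune:
            print("playing", note)

    tune_one = ["C", "E", "G", "C"]
    tune_two = ["D", "F", "A", "D"]

    play(tune_one)  # the box plays one piece of music...
    play(tune_two)  # ...and then a different one, without rebuilding the box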
One of the first examples of this was built by Hero of Alexandria (c. 10–70 AD). He built a mechanical theater that performed a play lasting 10 minutes, operated by a complex system of ropes and drums. These ropes and drums were the language of the machine: they told the machine what to do and when. Some people argue that this was the first programmable machine.[1]
Most historians agree that the "castle clock", an astronomical clock invented by Al-Jazari in 1206, is the first known programmable analog computer.[2] It showed the zodiac, the solar and lunar orbits, a crescent moon-shaped pointer travelling across a gateway that caused doors to open every hour,[3][4] and five robotic musicians who played music when levers struck them. The length of day and night could be changed (that is, re-programmed) every day to account for the changing lengths of day and night throughout the year.[2] Some people[who?] consider Ada Lovelace to be the first programmer.[source?]

The Computing Era

At the end of the Middle Ages, people in Europe began to think math and engineering were more important. In 1623, Wilhelm Schickard made a mechanical calculator. Other Europeans made more calculators after him. They were not modern computers because they could only add, subtract, and multiply: you could not change what they did to make them do something like play Tetris. Because of this, we say they were not programmable.
In 1801, Joseph Marie Jacquard used punched paper cards to tell his textile loom what kind of pattern to weave. He could use punch cards to tell the loom what to do, and he could change the punch cards, which means he could program the loom to weave the pattern he wanted. This means the loom was programmable.
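Here is a minimal sketch of the punch-card idea in Python (the pattern and the symbols are invented for illustration): each card is a row of holes, and swapping the cards changes what gets woven.

    # Each "card" is a row of holes: a 1 raises a thread, a 0 leaves it down.
    # The pattern below is made up for illustration.
    cards = [
        "10101010",
        "01010101",
        "11001100",
    ]

    def weave(cards):
        for card in cards:
            row = "".join("#" if hole == "1" else "." for hole in card)
            print(row)

    weave(cards)  # feed in different cards to weave a different pattern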
Modern computers were made when someone (Charles Babbage) had a bright idea. He wanted to make a machine that could do all the boring parts of math (like the automated calculators) and could be told to do them in different ways (like the programmable machines). Charles Babbage was the first to design a fully programmable mechanical computer. He called it the "Analytical Engine".[5] Because Babbage did not have enough money and always changed his design when he had a better idea, he never built his Analytical Engine.
As time went on, computers got more and more popular. This is because people get bored easily doing the same thing over and over. Imagine spending your life writing things down on index cards, storing them, and then having to go find them again. The U.S. Census Bureau in 1890 had hundreds of people doing just that. People got very bored and very frustrated, and would say, "There HAS to be an easier way to do this." Then some bright person figured out how to make machines do a lot of the work. Herman Hollerith figured out how to make a machine that would automatically add up information that the Census Bureau collected. The Computing Tabulating Recording Corporation (which later became IBM) made his machines, and everyone was happy. At least, they were happy until their machines broke down, got jammed, and had to be repaired. This is when the Computing Tabulating Recording Corporation invented tech support.
Because of machines like this, new ways of talking to these machines were invented, and new types of machines were invented, and eventually the computer that we all know and love today was born.

Analog and Digital Computers

In the first half of the 20th century, scientists started using computers, mostly because scientists had a lot of math to figure out and wanted to spend more of their time thinking about the secrets of the universe instead of spending hours adding numbers together. If you remember getting bored doing your times tables, you will know exactly how they felt.
So they put together computers. These early computers used analog circuits, which made them very hard to program. Then, in the 1930s, digital computers were invented, and these were much easier to program.

High-scale computers

Scientists figured out how to make and use digital computers in the 1930s and 1940s. Scientists made a lot of digital computers, and as they did, they figured out how to ask them the right sorts of questions to get the most out of them. Here are a few of the computers they built:
Defining characteristics of some early digital computers of the 1940s
Name | First operational | Numeral system | Computing mechanism | Programming | Turing complete
Zuse Z3 (Germany) | May 1941 | Binary | Electro-mechanical | Program-controlled by punched film stock | Yes (1998)
Atanasoff–Berry Computer (US) | mid-1941 | Binary | Electronic | Not programmable (single purpose) | No
Colossus (UK) | January 1944 | Binary | Electronic | Program-controlled by patch cables and switches | No
Harvard Mark I – IBM ASCC (US) | 1944 | Decimal | Electro-mechanical | Program-controlled by 24-channel punched paper tape (but no conditional branch) | No
ENIAC (US) | November 1945 | Decimal | Electronic | Program-controlled by patch cables and switches | Yes
Manchester Small-Scale Experimental Machine (UK) | June 1948 | Binary | Electronic | Stored-program in Williams cathode ray tube memory | Yes
Modified ENIAC (US) | September 1948 | Decimal | Electronic | Program-controlled by patch cables and switches plus a primitive read-only stored programming mechanism using the Function Tables as program ROM | Yes
EDSAC (UK) | May 1949 | Binary | Electronic | Stored-program in mercury delay line memory | Yes
Manchester Mark 1 (UK) | October 1949 | Binary | Electronic | Stored-program in Williams cathode ray tube memory and magnetic drum memory | Yes
CSIRAC (Australia) | November 1949 | Binary | Electronic | Stored-program in mercury delay line memory | Yes
EDSAC was one of the first computers that kept its program in its own memory, along with the data the program worked on. This is called the stored-program (von Neumann) architecture.
  • Konrad Zuse's electromechanical "Z machines". The Z3 (1941) was the first working machine that used binary arithmetic. Binary arithmetic means using only "yes" and "no" signals to add numbers together (there is a short sketch of this after this list). You could also program it. In 1998 the Z3 was proved to be Turing complete. Turing complete means that it is possible to tell this particular computer anything that it is mathematically possible to tell a computer. Because of this, it is often called the world's first modern computer.
  • The non-programmable Atanasoff–Berry Computer (1941), which used vacuum tubes for its calculations and regenerative capacitor memory to store its "yes" and "no" answers.
  • The secret British Colossus computers (1943),[6] which you could program a little bit. They showed that a machine with thousands of vacuum tubes could still work most of the time. They were used for breaking German wartime codes.
  • The Harvard Mark I (1944), a big computer that you could partly program.
  • The U.S. Army's Ballistics Research Laboratory ENIAC (1946), which could add numbers the way people do (using the numbers 0 through 9) and is sometimes called the first general purpose electronic computer (since Konrad Zuse's Z3 of 1941 used electromagnets instead of electronics). At first, however, the only way you could reprogram ENIAC was by rewiring it.
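To make "adding numbers with yes and no" concrete, here is a minimal Python sketch of binary addition done one bit at a time with a carry; the input numbers are arbitrary examples, and this is an illustration of the idea, not the Z3's actual circuitry.

    # Binary addition using only 0 ("no") and 1 ("yes"), one column at a time.
    def add_binary(a_bits, b_bits):
        """Add two equal-length lists of bits (least significant bit first)."""
        result, carry = [], 0
        for a, b in zip(a_bits, b_bits):
            total = a + b + carry
            result.append(total % 2)  # the bit that stays in this column
            carry = total // 2        # the bit carried into the next column
        result.append(carry)
        return result

    # 6 is 110 in binary and 5 is 101 (written here least significant bit first)
    print(add_binary([0, 1, 1], [1, 0, 1]))  # -> [1, 1, 0, 1], which is 11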
Several developers of ENIAC saw its problems. They invented a way for a computer to remember what it had been told, and a way to change what it remembered. This is known as "stored-program architecture" or von Neumann architecture. John von Neumann described this design in the paper First Draft of a Report on the EDVAC, distributed in 1945. A number of projects to develop computers based on the stored-program architecture started around this time. The first of these was completed in Great Britain. The first to be demonstrated working was the Manchester Small-Scale Experimental Machine (SSEM or "Baby"), while the EDSAC, completed a year after SSEM, was the first really useful computer that used the stored-program design. Shortly afterwards, EDVAC, the machine described in von Neumann's paper, was completed, but it did not see full-time use for another two years.
Nearly all modern computers use the stored-program architecture in some form. It has become the main concept that defines a modern computer. Most of the technologies used to build computers have changed since the 1940s, but many current computers still use the von Neumann architecture.
Microprocessors are miniaturized devices that often implement stored-program CPUs.
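Here is a very small sketch of what "stored-program architecture" means (in Python, with an invented four-instruction machine): the program sits in the same memory the machine reads, so changing the memory reprograms the machine.

    # A toy stored-program machine. The instruction set (LOAD/ADD/PRINT/HALT)
    # is invented for illustration; real machines store instructions as numbers.
    memory = [
        ("LOAD", 7),      # put 7 in the accumulator
        ("ADD", 35),      # add 35 to it
        ("PRINT", None),  # show the result
        ("HALT", None),   # stop
    ]

    def run(memory):
        accumulator = 0
        pc = 0                    # program counter: which memory cell is next
        while True:
            op, arg = memory[pc]  # fetch the next instruction from memory
            pc += 1
            if op == "LOAD":
                accumulator = arg
            elif op == "ADD":
                accumulator += arg
            elif op == "PRINT":
                print(accumulator)
            elif op == "HALT":
                break

    run(memory)  # prints 42; editing the memory list reprograms the machine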
In the 1950s computers were built mostly out of vacuum tubes. Transistors replaced vacuum tubes in the 1960s because they were smaller and cheaper. They also needed less power and did not break down as often as vacuum tubes. In the 1970s, computers were based on integrated circuits. Microprocessors, such as the Intel 4004, made computers smaller, cheaper, faster and more reliable. By the 1980s, computers became small and cheap enough to replace mechanical controls in things like washing machines. The 1980s also saw home computers and personal computers. With the growth of the Internet, personal computers are becoming as common in the household as the television and the telephone.
In 2005 Nokia started to call some of its mobile phones (the N-series) "multimedia computers", and after the launch of the Apple iPhone in 2007, many people now count smartphones among "real" computers. In 2008, if smartphones are included in the count of computers in the world, the biggest computer maker by units sold was no longer Hewlett-Packard, but rather Nokia.

Kinds of computers

There are three main types of computers: personal computers (such as desktops and laptops), mainframes, and embedded computers.
A "desktop computer" is a small machine that has a screen (which is not part of the computer). Most people keep them on top of a desk, which is why they are called "desktop computers." "Laptop computers" are computers small enough to fit on your lap. This makes them easy to carry around. Both laptops and desktops are called personal computers, because one person at a time uses them for things like playing music, surfing the web, or playing video games.
There are bigger computers that many people can use at the same time. These are called "mainframes," and they do much of the work that makes things like the internet function. You can think of a personal computer like this: the personal computer is like your skin: you can see it, other people can see it, and through your skin you feel wind, water, air, and the rest of the world. A mainframe is more like your internal organs: you (hopefully) never see them, and you barely even think about them, but if they suddenly went missing, you would have some very big problems.
There is another type of computer, called an embedded computer. An embedded computer is a computer that does one thing and one thing only, and usually does it very well. For example, an alarm clock is an embedded computer: it tells the time. Unlike your personal computer, you cannot use your clock to play Tetris. Because you cannot install new programs like Tetris on your clock, we say that embedded computers cannot be programmed by the user. Some mobile phones, automatic teller machines, microwave ovens, CD players and cars are examples of embedded computers.
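As a rough sketch of the difference (in Python, purely illustrative), an embedded device runs one fixed job and nothing else:

    # A fixed-function "embedded" program: it shows the time and does nothing
    # else. The tick count is kept small here so the example finishes quickly;
    # a real clock would loop forever.
    import time

    def alarm_clock(ticks=3):
        for _ in range(ticks):
            print(time.strftime("%H:%M:%S"))
            time.sleep(1)

    alarm_clock()  # you cannot tell this program to play Tetris instead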
