History of Computers

A brief history of computers, by Chris Woodford. Last updated: December 2.

Computers truly came into their own as great inventions in the last two decades of the 20th century. But their history stretches back thousands of years to the abacus: a simple calculator made from beads and wires, which is still used in some parts of the world today. The difference between an ancient abacus and a modern computer seems vast, but the principle—making repeated calculations more quickly than the human brain—is exactly the same. Read on to learn more about the history of computers—or take a look at our article on how computers work.

Cogs and Calculators

It is a measure of the brilliance of the abacus, invented in the ancient Middle East, that it remained the fastest form of calculator until the middle of the 17th century. Then, in 1642, aged only 18, French scientist and philosopher Blaise Pascal (1623–1662) invented the first practical mechanical calculator, the Pascaline, to help his tax-collector father do his sums. The machine had a series of interlocking cogs (gear wheels with teeth around their outer edges) that could add and subtract decimal numbers. Several decades later, in 1671, German mathematician and philosopher Gottfried Wilhelm Leibniz (1646–1716) came up with a similar but more advanced machine. Instead of using cogs, it had a "stepped drum" (a cylinder with teeth of increasing length around its edge), an innovation that survived in mechanical calculators for hundreds of years. The Leibniz machine could do much more than Pascal's: as well as adding and subtracting, it could multiply, divide, and work out square roots. Another pioneering feature was the first memory store, or "register."

Photo: Left: the "user interface," the part where you dial in the numbers you want to calculate. Right: the internal gear mechanism.
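
To make the carrying principle concrete, here is a rough sketch of my own (in Python, purely as an illustration; nothing like it appears in the original article) of how each digit wheel passes its overflow to the next one along:

    # Each "wheel" holds one decimal digit, least significant digit first.
    # Turning the first wheel past 9 nudges the next wheel along by one: a carry.
    def pascaline_add(wheels, amount):
        carry = amount                  # a single digit (0-9) added to the first wheel
        for i in range(len(wheels)):
            total = wheels[i] + carry
            wheels[i] = total % 10      # the wheel shows only its own digit
            carry = total // 10         # any overflow turns the next wheel
            if carry == 0:
                break
        return wheels

    print(pascaline_add([9, 9, 0], 1))  # 99 + 1 becomes [0, 0, 1], i.e. 100

The point is simply that "add and carry" is a mechanical, repeatable rule, exactly the kind of operation that cogs (and, later, switches) can embody.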

Turing's ideas were hugely influential in the years that followed and many people regard him as the father of modern computing—the 20th century's equivalent of Babbage. Although essentially a theoretician, Turing did get involved with real, practical machinery, unlike many mathematicians of his time. Today, Alan Turing is best known for conceiving what's become known as the Turing test, a simple way to find out whether a computer can be considered intelligent by seeing whether it can sustain a plausible conversation with a real human being.

The first modern computers The World War II years were a crucial period in the history of computing, when powerful, gargantuan computers began to appear. Just before the outbreak of the war, in 1938, German engineer Konrad Zuse (1910–1995) constructed his Z1, the world's first programmable binary computer, in his parents' living room.

It was a great advance—many times more accurate than Bush's Differential Analyzer.

These were the first machines that used electrical switches to store numbers: when a switch was "off", it stored the number zero; flipped over to its other, "on", position, it stored the number one. Hundreds or thousands of switches could thus store a great many binary digits (although binary is much less efficient in this respect than decimal, since it can take up to ten binary digits to store a three-digit decimal number). These machines were digital computers: unlike analog machines, which stored numbers using the positions of wheels and rods, they stored numbers as digits.
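
As a quick illustration of that efficiency point (a small Python sketch of my own, not something from the original article), you can count how many binary digits a given decimal number needs:

    # How many binary digits (bits) does it take to store a decimal number?
    # 999, the largest three-digit decimal number, needs ten of them.
    for n in (9, 99, 255, 999):
        print(n, "in binary is", format(n, "b"), "which is", n.bit_length(), "bits")

Running it shows that 9 needs 4 bits, 99 needs 7, 255 needs 8, and 999 needs 10, which is why a row of two-state switches has to be noticeably longer than the decimal number it stores.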

The first large-scale digital computer of this kind appeared in 1944 at Harvard University, built by mathematician Howard Aiken (1900–1973). A giant of a machine, stretching 15 m (50 ft) in length, it was like a huge mechanical calculator built into a wall. It must have sounded impressive, because it stored and processed numbers using "clickety-clack" electromagnetic relays (electrically operated magnets that automatically switched lines in telephone exchanges)—thousands of them.

Impressive they may have been, but relays suffered from several problems: they were large (that's why the Harvard Mark I had to be so big); they needed quite hefty pulses of power to make them switch; and they were slow (it took time for a relay to flip from "off" to "on", or from 0 to 1).

Photo: An analog computer being used in military research.

Most of the machines developed around this time were intended for military purposes.

Like Babbage's never-built mechanical engines, they were designed to calculate artillery firing tables and chew through the other complex chores that were then the lot of military mathematicians. During World War II, the military co-opted thousands of the best scientific minds: recognizing that science would win the war, Vannevar Bush's Office of Scientific Research and Development employed 10,000 scientists from the United States alone.

Things were very different in Germany. When Konrad Zuse offered to build his Z2 computer to help the army, they couldn't see the need—and turned him down. On the Allied side, great minds began to make great breakthroughs. In 1943, a team of mathematicians based at Bletchley Park near London, England (including Alan Turing) built a computer called Colossus to help them crack secret German codes. Colossus was the first fully electronic computer. Instead of relays, it used a better form of switch known as a vacuum tube (also known, especially in Britain, as a valve).

The vacuum tube, each one about as big as a person's thumb and glowing red hot like a tiny electric light bulb, had been invented in 1906 by Lee de Forest (1873–1961), who named it the Audion. This breakthrough earned de Forest his nickname as "the father of radio," because the tubes' first major use was in radio receivers, where they amplified weak incoming signals so people could hear them more clearly. Just like the codes it was trying to crack, Colossus was top-secret and its existence wasn't confirmed until after the war ended.

The machine that followed, ENIAC (the Electronic Numerical Integrator And Computer), completed in the United States in 1946, contained nearly 18,000 vacuum tubes (nine times more than Colossus), was around 24 m (80 ft) long, and weighed almost 30 tons.

ENIAC is generally recognized as the world's first fully electronic, general-purpose, digital computer. Colossus might have qualified for this title too, but it was designed purely for one job (code-breaking); since it couldn't store a program, it couldn't easily be reprogrammed to do other things.

ENIAC was just the beginning. Its two inventors, J. Presper Eckert and John Mauchly, formed the Eckert-Mauchly Computer Corporation in the late 1940s. In a key piece of work, mathematician John von Neumann helped to define how the machine stored and processed its programs, laying the foundations for how all modern computers operate. They were helped in this task by a young, largely unknown American mathematician and Naval reservist named Grace Murray Hopper (1906–1992), who had originally been employed by Howard Aiken on the Harvard Mark I.

Their UNIVAC machine was then manufactured for other users—and became the world's first large-scale commercial computer. Which one was truly the first great modern computer? All of them and none: these—and several other important machines—evolved our idea of the modern electronic computer during the key period between the late 1930s and the early 1950s. The modern term for a problem that holds up a computer program is a "bug." But there were other problems with vacuum tubes too.

They consumed enormous amounts of power: the ENIAC used thousands of times as much electricity as a modern laptop.

And they took up huge amounts of space.

Military needs were driving the development of machines like the ENIAC, but the sheer size of vacuum tubes had now become a real problem. The ENIAC's designers had boasted that its calculating speed was far greater than that of any other existing computing machine.

So a new technology was urgently required.

Photo: A typical transistor on an electronic circuit board.

The solution appeared in 1947 thanks to three physicists working at Bell Telephone Laboratories (Bell Labs). John Bardeen (1908–1991), Walter Brattain (1902–1987), and William Shockley (1910–1989) were then helping Bell to develop new technology for the American public telephone system, so the electrical signals that carried phone calls could be amplified more easily and carried further.

Shockley, who was leading the team, believed he could use semiconductors (materials such as germanium and silicon that allow electricity to flow through them only when they've been treated in special ways) to make a better form of amplifier than the vacuum tube.

When his early experiments failed, he set Bardeen and Brattain to work on the task for him. Eventually, in December 1947, they created a new form of amplifier that became known as the point-contact transistor.

Bell Labs credited Bardeen and Brattain with the transistor and awarded them a patent. This enraged Shockley and prompted him to invent an even better design, the junction transistor, which has formed the basis of most transistors ever since.

Like vacuum tubes, transistors could be used as amplifiers or as switches. But they had several major advantages. They were a fraction of the size of vacuum tubes (typically about as big as a pea), used no power at all unless they were in operation, and were virtually 100 percent reliable.

The transistor was one of the most important breakthroughs in the history of computing and it earned its inventors the world's greatest science prize, the 1956 Nobel Prize in Physics. By that time, however, the three men had already gone their separate ways.

John Bardeen had begun pioneering research into superconductivity, which would earn him a second Nobel Prize in 1972. Walter Brattain moved to another part of Bell Labs. William Shockley decided to stick with the transistor, eventually forming his own corporation to develop it further.

His decision would have extraordinary consequences for the computer industry. With a small amount of capital, Shockley set about hiring the best brains he could find in American universities, including young electrical engineer Robert Noyce (1927–1990) and research chemist Gordon Moore (born 1929). It wasn't long before Shockley's idiosyncratic and bullying management style upset his workers. In 1957, eight of them—including Noyce and Moore—left Shockley Transistor to found a company of their own, Fairchild Semiconductor, just down the road.

Thus began the growth of "Silicon Valley," the part of California centered on Palo Alto, where many of the world's leading computer and electronics companies have been based ever since. Meanwhile, in Dallas, a young engineer from Kansas named Jack Kilby (1923–2005) was considering how to improve the transistor. Although transistors were a great advance on vacuum tubes, one key problem remained.

Machines that used thousands of transistors still had to be hand-wired to connect all these components together. That process was laborious, costly, and error-prone.

Transistors

Vacuum tubes were highly inefficient, required a great deal of space, and needed to be replaced often. Computers of the 1940s and 50s had 18,000 tubes in them, and housing all these tubes and cooling the rooms from the heat they produced was not cheap.

The transistor promised to solve all of these problems and it did so.

Transistors, however, had their problems too. The main problem was that transistors, like other electronic components, needed to be soldered together. As a result, the more complex the circuits became, the more complicated and numerous the connections between the individual transistors, and the greater the likelihood of faulty wiring.

In 1958, this problem too was solved by Jack St. Clair Kilby of Texas Instruments. He manufactured the first integrated circuit, or chip. A chip is really a collection of tiny transistors which are connected together when the chip is manufactured.

Thus, the need for soldering together large numbers of transistors was practically nullified; now connections were needed only to other electronic components. In addition to saving space, the speed of the machine increased, since the electrons had a shorter distance to travel.

Mainframes to PCs

The 1960s saw large mainframe computers become much more common in large industries and with the US military and space program.

IBM became the unquestioned market leader in selling these large, expensive, error-prone, and very hard-to-use machines.

Programs and data could be stored on an everyday audio-cassette recorder. Before the end of the fair, Wozniak and Jobs had secured orders for the Apple II, and from there Apple just took off. Also introduced in 1977 was the TRS-80, a home computer manufactured by Tandy Radio Shack.

In its second incarnation, the TRS-80 Model II came complete with a 64,000-character memory and a disk drive to store programs and data on.

Arithmometer: In 1820, Thomas de Colmar invented a mechanical calculator called the arithmometer. It was a successful and useful machine which could perform basic mathematical calculations.

Difference Engine and Analytical Engine: Charles Babbage designed the Difference Engine and the Analytical Engine in 1822 and 1837 respectively. The Difference Engine was an early mechanical computer used to tabulate polynomial functions.

First computer programmer: Augusta Ada Byron wrote programs for the Analytical Engine and suggested that Charles Babbage use the binary system.

Scheutzian calculation engine: Per Georg Scheutz invented the first printing calculator, based on the difference engine.

Tabulating machine: The tabulating machine was invented by Herman Hollerith for the 1890 US census; it was used to express data and accounts in a concise form.
