
Computer History

This category includes questions and answers about the history of the computer. Ask questions about the first computer and major developments in computing.

5,564 Questions

Does the average person's left hand do 56% of the typing?

This is true! On a standard QWERTY keyboard, the average person's left hand does most of the typing.

Who is called the father of the computer, and why?

The Analytical Engine had five units: input, output, store, mill, and control. The store was used for storing numbers, and the mill performed the calculations through the rotation of gears and wheels. The control unit supervised all the other units. Note that these five units are similar to the functional units of a modern digital computer. No wonder he was called the "Father of Computers"!

He was Charles Babbage.

How does a punch card function?

Punch cards are cards with holes punched in them that represent data. You feed them into a (usually large-scale) computer that can read them.
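The idea of holes-as-data can be sketched in a few lines of Python. The row-to-character mapping below is a small illustrative subset of the Hollerith-style punch codes, not the full table.

```python
# A sketch of holes-as-data: decoding one column of a Hollerith-style punch card.
# The mapping below is a small illustrative subset, not the full code table --
# each character is identified by which rows of the column are punched.
HOLLERITH = {
    frozenset([12, 1]): "A",   # zone punch in row 12 + digit punch in row 1
    frozenset([12, 2]): "B",
    frozenset([12, 3]): "C",
    frozenset([1]): "1",       # digits are a single punch in their own row
    frozenset([2]): "2",
    frozenset(): " ",          # no punches = blank
}

def decode_column(punched_rows):
    """Map the set of punched rows in one column to a character."""
    return HOLLERITH.get(frozenset(punched_rows), "?")

card = [[12, 1], [12, 2], [12, 3], [1]]  # four columns of holes
print("".join(decode_column(col) for col in card))  # → "ABC1"
```

A card reader does essentially this in hardware, column by column, 80 columns per card.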

Mammals that use computers?

All mammals use a computer: their own brains. However, in terms of analog and digital computers, humans are the only mammals that choose to use computers. Primates and other mammals can certainly be trained to use computers to perform simple tasks (mostly through game-play), however it's doubtful if they gain any real benefit. They certainly don't use them through choice.

Who invented the first PC modem and when?

Digital modems developed from the need to transmit data for North American air defense during the 1950s. Modems were used to communicate data over the public switched telephone network, or PSTN. Analog telephone circuits can only transmit signals within the frequency range of voice communication. A modem sends and receives data between two computers. "Modem" stands for modulator/demodulator.

In 1962, AT&T manufactured the first commercial modem, the Bell 103. The Bell 103 was also the first modem with full-duplex transmission, used frequency-shift keying (FSK), and had a speed of 300 bits per second (300 baud).

The 56K modem was invented by Dr. Brent Townshend in 1996.
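As a sketch of how FSK works, the snippet below generates Bell 103-style tone bursts in Python, assuming the standard originating-side tone pair (1070 Hz for a 0, 1270 Hz for a 1) at 300 baud; the sample rate is an arbitrary choice for illustration.

```python
import math

# Sketch of Bell 103-style FSK: each bit becomes a burst of sine wave at one of
# two tones. The frequencies are the standard originating-side pair; the sample
# rate is an assumption chosen for this example.
SAMPLE_RATE = 8000              # audio samples per second
BAUD = 300                      # Bell 103 runs at 300 bits per second
SPACE_HZ, MARK_HZ = 1070, 1270  # 0 = "space" tone, 1 = "mark" tone

def fsk_modulate(bits):
    """Return a list of audio samples, one tone burst per bit."""
    samples_per_bit = SAMPLE_RATE // BAUD
    out = []
    for bit in bits:
        freq = MARK_HZ if bit else SPACE_HZ
        for n in range(samples_per_bit):
            out.append(math.sin(2 * math.pi * freq * n / SAMPLE_RATE))
    return out

audio = fsk_modulate([1, 0, 1, 1])
print(len(audio))  # 4 bits x 26 samples per bit = 104
```

Because the two tones both sit inside the voice band, the resulting audio passes through an ordinary telephone circuit, which is the whole point of the design.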

Classification of computer according to generation?

First Generation (1940-1956) Vacuum Tubes

The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.

First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.

The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer delivered to a business client, the U.S. Census Bureau, in 1951.

Second Generation (1956-1963) Transistors

Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.

Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.

The first computers of this generation were developed for the atomic energy industry.

Third Generation (1964-1971) Integrated Circuits

The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.

Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

Fourth Generation (1971-Present) Microprocessors

The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer, from the central processing unit and memory to input/output controls, on a single chip.

In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.

As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.

Fifth Generation (Present and Beyond) Artificial Intelligence

Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.

Why did man invent computers?

To do very complex math problems that would have taken months for a person to calculate. Computers basically all work by math - binary code is 1 and 0 - and everything we have done with computers since then is based on mathematical programming!
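That reduction of everything to 1s and 0s can be made concrete. The sketch below adds two integers using only the bitwise operations a hardware adder circuit is built from.

```python
# Addition built from nothing but 1s and 0s: the carry/XOR loop below mirrors
# how a binary adder circuit combines two numbers, expressed in Python.
def binary_add(a, b):
    """Add two non-negative integers using only bitwise operations."""
    while b:
        carry = a & b      # positions where both bits are 1 generate a carry
        a = a ^ b          # sum of the bits, ignoring carries
        b = carry << 1     # carries shift one place left, added on the next pass
    return a

print(bin(binary_add(0b1011, 0b0110)))  # 11 + 6 = 17 → 0b10001
```

Multiplication, division, and ultimately every program are layered on top of operations like this one.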

What could the Z1 computer do?

The Z1 introduced the computer architecture on which modern computers are designed. Built by Konrad Zuse and completed in 1938, the device performed binary floating-point calculations; it was destroyed in a bombing raid during WWII.

Who received the world's first email message?

First Email: This is an excerpt from Ray Tomlinson's website entitled "First Email": "The first message was sent between two machines that were literally side by side. The only physical connection they had (aside from the floor they sat on) was through the ARPANET. I sent a number of test messages to myself from one machine to the other. The test messages were entirely forgettable and I have, therefore, forgotten them. Most likely the first message was QWERTYUIOP or something similar. When I was satisfied that the program seemed to work, I sent a message to the rest of my group explaining how to send messages over the network. The first use of network email announced its own existence. These first messages were sent in late 1971. The next release of TENEX went out in early 1972 and included the version of SNDMSG with network mail capabilities. The CPYNET protocol was soon replaced with a real file transfer protocol having specific mail handling features. Later, a number of more general mail protocols were developed."

How much floor space did the ENIAC require?

ENIAC was composed of 40 standard 19-inch-wide relay racks plus three mobile function tables plus punched-card readers and punches. The racks occupied three of the four walls of a room and, if they had been in a straight line, would have run 63 and a third feet.

In the floor layout shown in the link the room had 16 panels on each of the long walls and 8 panels on the short wall. Allowing for depth of the panels the room was roughly 27 feet long by 16 feet wide. In area this is 432 square feet of floor space. The panels were roughly 8 feet tall.

What is the difference between the 80186, 80286 and 80386?

Answered 5/2/2012 by Laith Fahim from Baghdad, Iraq:

The 80386 (or 80386DX) is internally and externally a 32-bit microprocessor with a 32-bit address bus. It is capable of handling physical memory of up to 4 GB (2^32 bytes), and virtual memory was increased to 64 terabytes (2^46 bytes).

The 80386SX has a 16-bit external data bus and a 24-bit address bus, which limits it to 16 MB (2^24 bytes) of memory. That makes it cheaper than the 80386DX.

_________________________________________

Source: The 80x86 IBM PC and Compatible Computers, 4th edition, by Muhammad Ali Mazidi & Janice Gillispie Mazidi.
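The memory figures above follow directly from the address-bus widths: an n-bit address bus can select 2^n distinct byte addresses. A quick check in Python:

```python
# The memory limits quoted above follow directly from the address-bus width:
# an n-bit address bus can select 2**n distinct byte addresses.
def max_memory_bytes(address_bits):
    """Bytes addressable with the given number of address lines."""
    return 2 ** address_bits

GB, MB = 2 ** 30, 2 ** 20
print(max_memory_bytes(32) // GB)  # 80386DX, 32-bit bus → 4 (GB)
print(max_memory_bytes(24) // MB)  # 80386SX, 24-bit bus → 16 (MB)
```

The same arithmetic gives the 64 TB virtual-memory figure: a 46-bit virtual address space is 2^46 bytes.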

Who is the world's first computer programmer?

Charles Babbage is often cited, but Ada Lovelace is generally credited as the world's first programmer, for the algorithm she described in 1843 for Babbage's Analytical Engine.

How many operating systems are there in the world?

There are many operating systems, but four major families dominate: Apple macOS, Microsoft Windows, MS-DOS, and Linux.

What is the difference between super computer and mainframe computer?

Supercomputers have multiple processing units, making their speed unimaginably fast. They can even run a whole simulated world, updating it many times per second.

A mainframe computer is similar, only slower, and is built to run large applications, usually against huge amounts of data such as a census for every single household in the US. Otherwise, there is no difference.

What was the first computer generation?

ENIAC, short for Electronic Numerical Integrator And Computer, was unveiled in 1946.

The first personal computer, called the IBM PC, was released in 1981.

How many transistors does a computer have?

Anywhere from none to many billions, depending on the computer.

  1. vacuum tube computers had no transistors
  2. transistorized computers had a few thousand to a few hundred thousand transistors
  3. integrated circuit computers had a few tens of thousands to a few million transistors
  4. microprocessor computers have had a few thousand to many billions of transistors

Who created the first operating system?

This is not clear and probably can never be precisely determined. The development of the first operating systems happened in several places and was an evolutionary process of modification and adaptation of the preceding Batch Monitor programs.

What are some RISC and CISC examples?

  • CISC: Manchester Baby, EDVAC, EDSAC, BINAC, UNIVAC I, IBM 704, IBM 650, IBM 305, IBM AN/FSQ-7, IBM 7090, IBM 1620, IBM 1401, DEC PDP-1, Apollo Guidance Computer, IBM System/360, UNIVAC 1107, DEC PDP-8, DECSYSTEM-10, Intel 4004, Intel 8008, Intel 8080, Zilog Z-80, Zilog Z-8000, DEC PDP-11, Data General Nova, Motorola 68000, Intel Pentium, DEC VAX
  • RISC: CDC 6600, CDC 7600, Cray-1, Cray-2, Cray-3, Sparc, IBM RISC, IBM POWER, Intel i960, MIPS, Motorola 88000, DEC Alpha, PowerPC, ARM

Note: CISC was very important when memory was small and very expensive, as it allowed programs to be reduced in size. It was also very important when more programming was done in assembly languages than in high-order languages, as it made for instructions that were easy for programmers to understand. But with declining memory costs and almost all programming being done with compilers, research showed that most of the instructions in a CISC instruction set were redundant and not needed. Eliminating them and reducing the instruction set to just a core of essential instructions allowed for both a significant reduction in the hardware needed to build a computer and the development of simpler, more powerful compilers. With less hardware in the computer, optimizing its design for higher speeds became much easier. Thus the concept of RISC was formed.

The CDC machines and the Cray-1 listed above were designed before the concept of RISC formally existed, but clearly meet its definition in retrospect. Designed by Seymour Cray, his primary goal was to produce the fastest computer possible at the time, which required elimination of all unnecessary hardware that might limit the speed of the machine. He even eliminated all data synchronization latches/registers, instead using hand tuned delay lines made of loose hanging loops of twisted pair to make sure signals arrived where needed at exactly the right time.
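The CISC/RISC contrast described above can be illustrated with a toy simulation: the same memory-to-memory add done as one complex "instruction" versus a sequence of simple ones. The instruction names and the dict-based "memory" are made up for illustration only.

```python
# Toy illustration of the CISC/RISC contrast: one complex instruction vs. a
# load/op/store sequence. The instruction names and the dict-based "memory"
# and "registers" are invented for this sketch.
memory = {"x": 7, "y": 5}
registers = {}

def cisc_add_mem(dst, src):
    """CISC style: a single instruction loads, adds and stores in hardware."""
    memory[dst] = memory[dst] + memory[src]

# RISC style: only simple instructions exist; the compiler sequences them.
def risc_load(reg, addr):
    registers[reg] = memory[addr]

def risc_add(dst, a, b):
    registers[dst] = registers[a] + registers[b]

def risc_store(addr, reg):
    memory[addr] = registers[reg]

cisc_add_mem("x", "y")      # x becomes 7 + 5 = 12 in one step
risc_load("r1", "x")        # the same add, spelled out as four simple steps
risc_load("r2", "y")
risc_add("r3", "r1", "r2")
risc_store("x", "r3")       # x becomes 12 + 5 = 17
print(memory["x"])  # → 17
```

The RISC version takes more instructions, but each is simple enough that the hardware for it can be small and fast, which is exactly the trade-off the note above describes.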

What was the goal of ARPANET?

The goal of ARPANET was to allow university-based researchers working for the Defense Department to share information with their colleagues in other U.S. cities.