
Computer History

This category includes questions and answers about the history of the computer. Ask questions about the first computer and major developments in computing.

5,564 Questions

Classification of computers according to generation?

First Generation (1940-1956): Vacuum Tubes


The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.

First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.

The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer delivered to a business client, the U.S. Census Bureau, in 1951.

Second Generation (1956-1963): Transistors

Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. It was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that could damage the computer, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.

Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.

The first computers of this generation were developed for the atomic energy industry.

Third Generation (1964-1971): Integrated Circuits

The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.

Instead of punched cards and printouts, users interacted with third-generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

Fourth Generation (1971-Present): Microprocessors

The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer, from the central processing unit and memory to input/output controls, on a single chip.

In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.

As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse and handheld devices.

Fifth Generation (Present and Beyond): Artificial Intelligence

Fifth-generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural-language input and are capable of learning and self-organization.

Why did man invent computers?

To do very complex math problems that would have taken months for a person to calculate. Computers basically all work by math - binary code is 1 and 0 - and everything we have done with computers since then is based on mathematical programming!
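The point about binary can be shown with a minimal sketch (Python here, purely for illustration): every integer a computer handles is stored as a pattern of 1s and 0s, and ordinary arithmetic works the same way on those patterns.

```python
# Every integer a computer stores is a pattern of binary digits (bits).
# Python's built-in bin() shows that pattern for a few small numbers.
for n in [1, 2, 5, 13]:
    print(n, "->", bin(n))
# 1 -> 0b1, 2 -> 0b10, 5 -> 0b101, 13 -> 0b1101

# Arithmetic on binary literals is just ordinary arithmetic:
# 0b101 (5) plus 0b1101 (13) equals 18.
print(0b101 + 0b1101)  # 18
```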

What could the Z1 computer do?

The Z1, built by Konrad Zuse between 1936 and 1938, was a mechanical, program-controlled calculator whose design anticipated much of modern computer architecture. It could perform binary floating-point calculations, reading its instructions from punched tape. It was destroyed in an air raid on Berlin during World War II.

Who received the world's first email message?

In short: Ray Tomlinson himself, since he sent the first test messages to himself. This is an excerpt from Ray Tomlinson's website, entitled "First Email":

"The first message was sent between two machines that were literally side by side. The only physical connection they had (aside from the floor they sat on) was through the ARPANET. I sent a number of test messages to myself from one machine to the other. The test messages were entirely forgettable and I have, therefore, forgotten them. Most likely the first message was QWERTYUIOP or something similar. When I was satisfied that the program seemed to work, I sent a message to the rest of my group explaining how to send messages over the network. The first use of network email announced its own existence. These first messages were sent in late 1971. The next release of TENEX went out in early 1972 and included the version of SNDMSG with network mail capabilities. The CPYNET protocol was soon replaced with a real file transfer protocol having specific mail handling features. Later, a number of more general mail protocols were developed."

How much floor space did the ENIAC require?

ENIAC was composed of 40 standard 19-inch-wide relay racks, plus three mobile function tables, plus punched-card readers and punches. The racks occupied three of the four walls of a room; had they been placed in a straight line, they would have run 63 and a third feet.

In the floor layout shown in the link, the room had 16 panels on each of the long walls and 8 panels on the short wall. Allowing for the depth of the panels, the room was roughly 27 feet long by 16 feet wide, which is 432 square feet of floor space. The panels were roughly 8 feet tall.
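The figures quoted above follow from simple arithmetic, sketched here in Python purely as a check:

```python
# 40 standard racks, each 19 inches wide, laid end to end:
racks = 40
rack_width_in = 19
length_ft = racks * rack_width_in / 12
print(length_ft)  # 63.33... feet, i.e. "63 and a third feet"

# A room roughly 27 feet by 16 feet:
print(27 * 16)  # 432 square feet
```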

Difference between 80186 and 80286 and 80386?

The 80386 (or 80386DX) is internally and externally a 32-bit microprocessor with a 32-bit address bus. It can address up to 4 GB (2^32 bytes) of physical memory; its virtual address space was increased to 64 TB (2^46 bytes).

The 80386SX has a 16-bit external data bus and a 24-bit address bus, which gives 16 MB (2^24 bytes) of addressable memory. This makes it cheaper than the 80386DX.
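The memory figures follow directly from the address-bus widths: an n-bit address bus can select 2^n distinct byte addresses. A small illustrative sketch:

```python
def addressable_bytes(address_bits: int) -> int:
    """Bytes reachable with the given number of address lines."""
    return 2 ** address_bits

# 80386DX: 32-bit physical addresses -> 4 GB
print(addressable_bytes(32) // 2**30, "GB")  # 4 GB
# 80386SX: 24-bit address bus -> 16 MB
print(addressable_bytes(24) // 2**20, "MB")  # 16 MB
# 386 virtual address space: 46 bits -> 64 TB
print(addressable_bytes(46) // 2**40, "TB")  # 64 TB
```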

_________________________________________

The 80x86 IBM PC and Compatible Computers, 4th edition, by Muhammad Ali Mazidi & Janice Gillispie Mazidi

Who is the world's first computer programmer?

Charles Babbage (note the spelling: Babbage, not Baggage) designed the Analytical Engine, but the person generally credited as the world's first computer programmer is Ada Lovelace, who in 1843 published what is considered the first program written for that machine.

How many operating system in the world?

There are far more than four; hundreds of operating systems have been written. The best-known families include Microsoft Windows, Apple's macOS, Linux and other Unix-like systems, and older systems such as MS-DOS.

What is the difference between super computer and mainframe computer?

Supercomputers have many processing units working in parallel, which makes them extremely fast; they are built to run a small number of very demanding calculations, such as large simulations, as quickly as possible.

A mainframe computer is similar, only slower per calculation, and is designed to run larger workloads reliably, typically processing huge amounts of data, such as a census record for every single household in the US. Otherwise, there is little difference.

What is first computer generation?

ENIAC, short for Electronic Numerical Integrator And Computer, was unveiled in 1946 and is a representative first-generation machine.

IBM's first personal computer, the IBM PC, was released in 1981.

How many transistors does a computer have?

Anywhere from none to many billions, depending on the computer.

  1. vacuum tube computers had no transistors
  2. transistorized computers had a few thousand to a few hundred thousand transistors
  3. integrated circuit computers had a few tens of thousands to a few million transistors
  4. microprocessor computers have had a few thousand to many billions of transistors

Who created the Apple computer?

The Apple I computer was designed and hand-built by Steve Wozniak. He and Steve Jobs founded Apple Computer in 1976 to sell it, and the two, together with Ronald Wayne, are credited as Apple's founders.

What are some RISC and CISC examples?

  • CISC: Manchester Baby, EDVAC, EDSAC, BINAC, UNIVAC I, IBM 704, IBM 650, IBM 305, IBM AN/FSQ-7, IBM 7090, IBM 1620, IBM 1401, DEC PDP-1, Apollo Guidance Computer, IBM System/360, UNIVAC 1107, DEC PDP-8, DECSYSTEM-10, Intel 4004, Intel 8008, Intel 8080, Zilog Z-80, Zilog Z-8000, DEC PDP-11, Data General Nova, Motorola 68000, Intel Pentium, DEC VAX
  • RISC: CDC 6600, CDC 7600, Cray-1, Cray-2, Cray-3, Sparc, IBM RISC, IBM POWER, Intel i960, MIPS, Motorola 88000, DEC Alpha, PowerPC, ARM

Note: CISC was very important when memory was small and very expensive, as it allowed programs to be reduced in size. It was also very important when more programming was done in assembly language than in high-level languages, as it made for instructions that were easy for programmers to understand. But with declining memory costs and almost all programming being done with compilers, research showed that most of the instructions in a CISC instruction set were redundant and not needed. Eliminating them and reducing the instruction set to just a core of essential instructions allowed for both a significant reduction in the hardware needed to build a computer and the development of simpler, more powerful compilers. With less hardware in the computer, optimizing its design for higher speeds became much easier. Thus the concept of RISC was formed.

The CDC machines and the Cray-1 listed above were designed before the concept of RISC formally existed, but clearly meet its definition in retrospect. Their designer, Seymour Cray, had as his primary goal producing the fastest computer possible at the time, which required eliminating all unnecessary hardware that might limit the speed of the machine. He even eliminated all data-synchronization latches/registers, instead using hand-tuned delay lines made of loose hanging loops of twisted pair to make sure signals arrived where needed at exactly the right time.

What was the goal of ARPANET?

The goal of ARPANET was to allow university-based researchers working for the Defense Department to share information with their colleagues in other U.S. cities.

In which year computer invented?

Computers were not invented in a single year; they were designed and built over decades, and the first programmable electronic computers appeared in the 1940s. They are among the most complex machines ever built.

Why did Apple name the computer Macintosh?

The Macintosh project started in the late 1970s with Jef Raskin, an Apple employee, who envisioned an easy-to-use, low-cost computer for the average consumer. He wanted to name the computer after his favorite type of apple, the McIntosh, but the name had to be changed for legal reasons.

What are the computer's limitations?

They are unnatural, and therefore unintuitive - the natural way for a human to interact with another human is by verbal and nonverbal communication.

Computers can't think for themselves. They aren't intelligent: if you feed a computer incorrect information, it will produce incorrect output ("garbage in, garbage out").

They cannot adequately deal with 'fuzzy' terms, unlike humans who describe most things in a fuzzy way: "Oh, it was *hot* today," or, "John's car cost him a *fortune*," and "It's *almost* seven o' clock - the takeaway's open."

A computer can't tell you something completely new; it can only derive results from known data it was previously programmed to manipulate. Computers are not creative or imaginative, nor are they prone to arbitrary moments of genius.

They are unquestioning devices. They do not feel and they do not understand ethics.

Computers can only adequately process information that may be quantised. They cannot adequately deal with abstractions.

Finally, the greatest limitation of computers is that they are engineered, built and used by humans. They expose our flaws and oversights.

Computers have many capabilities, too. They are not intelligent machines; they work from instructions, and they can perform many instructions and complete them almost instantly. Computers can connect to the Internet, which means they can share information and resources in real time, expanding their capabilities further. Computers are very efficient and fast, and they give very accurate results for the tasks we ask them to perform.

Is the Internet a technology?

Yes; the Internet is part of information and communications technology. Information and Communications Technology (ICT) is often used as an extended synonym for information technology (IT), but it is a more specific term that stresses the role of unified communications and the integration of telecommunications (telephone lines and wireless signals), computers, and the necessary enterprise software, middleware, storage, and audio-visual systems, which enable users to access, store, transmit, and manipulate information. The term ICT is now also used to refer to the convergence of audio-visual and telephone networks with computer networks through a single cabling or link system.

What is the difference between print and print preview?

Print sends the document to the printer. Print Preview shows on screen how the page will look when printed, so you can check whether anything gets cut off before you print.