When was the computer invented?

The answer to this question depends on your definition of a "computer". The earliest "computers" were mechanical devices used to help people count. The first known counting tools were tally sticks, dating from about 35,000 BCE.

The abacus was invented, possibly by the Babylonians or the Chinese, in about 2400 BCE. The abacus consists of movable counters that can be manipulated to add and subtract, and it is still used today for basic arithmetic.

As mathematics became more complex, inventors sought mechanical devices that could solve math problems automatically. One such device was conceived in 1786 by J. H. Mueller, who called it a "Difference Engine." It was never built.

The idea was forgotten and then revived in 1822 by the mathematics professor Charles Babbage, who proposed a difference engine of his own. His machine used the decimal number system and was powered by cranking a handle. The British government first financed the project but later cut off support. Beginning in 1834, Babbage went on to design a much more general Analytical Engine, but later returned and produced an improved design (his "Difference Engine No. 2") between 1847 and 1849. After Babbage's death in 1871, his son, Henry P. Babbage, carried on the work and eventually built a portion of the Analytical Engine's computing "mill". The Analytical Engine was designed to be powered by a steam engine and was to use punched cards to direct its operation; punched cards were already in use to program mechanical looms at the time.
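A difference engine tabulates polynomial values using the method of finite differences, which needs nothing but repeated addition -- exactly what gears and cranks can do. The original text doesn't spell the method out, so here is a minimal Python sketch of the underlying technique (an illustration of the idea, not Babbage's actual design):

```python
def difference_table(values, steps):
    """Extend a table of polynomial values using only addition.

    `values` must hold enough initial values of the polynomial that
    its highest-order difference column is constant.
    """
    # Build the difference columns from the seed values.
    diffs = [values[:]]
    while len(diffs[-1]) > 1:
        col = diffs[-1]
        diffs.append([col[i + 1] - col[i] for i in range(len(col) - 1)])

    # Take the last entry of each column; for a polynomial the final
    # column is constant, so new values fall out of cascading additions.
    row = [col[-1] for col in diffs]
    out = values[:]
    for _ in range(steps):
        for i in range(len(row) - 2, -1, -1):
            row[i] += row[i + 1]
        out.append(row[0])
    return out

# Tabulate f(x) = x**2 from the seeds f(0), f(1), f(2) -- no multiplication needed.
print(difference_table([0, 1, 4], 3))  # → [0, 1, 4, 9, 16, 25]
```

Each crank of Babbage's handle performed one such cascade of additions across the difference columns, printing the next table entry.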

Between 1936 and 1938, shortly before WWII, Konrad Zuse built the Z1. According to Mary Bellis, the Z1 was the first real functioning binary computer (actually, it was a very large calculator--but a computer nonetheless!). Zuse used it to explore several ground-breaking technologies in calculator development: floating-point arithmetic, high-capacity memory, and switching elements operating on the yes/no principle. Zuse's ideas, not fully implemented in the Z1, succeeded more with each Z prototype.

In 1940, Zuse completed the Z2, an electromechanical successor to the Z1. It was followed in 1941 by the Z3, the first fully functioning, program-controlled electromechanical computer. These machines were used for engineering calculations, such as aerodynamic analyses for the German aircraft industry--not, as is often claimed, for producing secret codes. Germany's encrypted communications did give it a decided advantage for a while, but the British fought back: guided by mathematician Alan Turing, code breakers built electromechanical "bombes" to crack the Enigma cipher, and engineer Tommy Flowers went on to create the Colossus Mark I.

Colossus was the world's first programmable, digital electronic computer, developed in 1943-44 at "Station X", Bletchley Park, England. British code breakers used Colossus to read German high-command teleprinter messages enciphered with the Lorenz cipher (a more complex system than Enigma). The Germans didn't know their ciphers had been broken. This is one reason the D-Day invasion succeeded.

Between 1939 and 1942, John V. Atanasoff and Clifford Berry developed the Atanasoff-Berry Computer (ABC) at Iowa State University; a 1973 US patent ruling later recognized it as the first electronic digital computer. The ABC was built by hand; its design used over 300 vacuum tubes and stored numbers as charges on capacitors fixed in mechanically rotating drums.

ENIAC, created by J. Presper Eckert and John Mauchly, was completed in 1945 and unveiled to the public in February 1946. ENIAC (Electronic Numerical Integrator and Computer) weighed in at 27 tons and filled a large room. Not surprisingly, ENIAC also made big noises, crackling and buzzing while performing up to 5,000 additions per second. Before ENIAC, it took a room full of human "computers" to carry out comparable calculations.

The first electronic computer that could store its own programs was developed in 1948 at Manchester University. Nicknamed the "Baby" (formally the Small-Scale Experimental Machine), it ran its first stored program on 21 June 1948 and is widely considered to be the forerunner of the modern computer.

The UNIVAC I (Universal Automatic Computer) was the first commercially available, "mass-produced" electronic computer. It was manufactured by Remington Rand in the USA, and the first unit was delivered to the US Census Bureau in 1951. UNIVAC I used 5,200 vacuum tubes and consumed 125 kW of power; 46 machines were sold at more than $1 million each. By this time, computer design was limited primarily by the size and heat output of vacuum tubes.

The vacuum tube was eventually replaced by the transistor. Shortly afterward, the monolithic integrated circuit (now called the microchip) was invented--by Jack Kilby at Texas Instruments in Dallas, Texas, in 1958, and independently, a few months later in 1959, by Robert Noyce of Fairchild Semiconductor in California. The two companies were embroiled in legal actions for years but finally decided to cross-license their products. Kilby was awarded the Nobel Prize in Physics in 2000.

The microchip led to the development of the microcomputer--a small, low-cost computer that individuals and small businesses could afford. The first home computers became commercially viable in the mid-to-late 1970s, and more so in the early 1980s. By the 1990s, the microcomputer, or Personal Computer (PC), had become a common household appliance, and it became even more widespread with the advent of the Internet.
It is hard to state an exact date when the computer was invented, since its development was a continuous process. Depending on your definition, the answer ranges from Babbage's 1822 Difference Engine proposal to the stored-program machines of the late 1940s.