How have computers improved since 1945?
A good number of things have changed.
Who invented windows for the computer?
Apple Computer developed an early desktop computer with a GUI (graphical user interface), whose on-screen panels were called "Windows". But to give credit where credit is due: the first graphical user interface was developed at Xerox PARC (Palo Alto Research Center) in the 1970s for the Xerox Alto computer, and its panels were also called "Windows". Steve Jobs visited PARC while the interface was under development, and you know the rest of the story… it was incorporated into the Lisa's software platform, which eventually led to the Macintosh.
The standard windowing system in the Unix world is the X Window System, first released in the mid-1980s. So, for those of us who are as old as dirt, "Windows" were simply pop-up screens in the computer world long before they were a Microsoft product.
History has a way of whitewashing the truth sometimes.
Some people actually believe Bill Gates invented the desktop computer.
When was the first Universal Power Supply Invented?
The first power supply was invented by Thomas Edison. It is unknown exactly when the first universal power supply for computers was invented.
What was Lady Ada Byron Lovelace's first job?
Ada Byron Lovelace inherited the intellectual gifts of her father, Lord Byron. She was a brilliant mathematician who, along with Charles Babbage, worked to create the first programmable computer. Ada spent her life, and did all of her work, in England.
Mac and Microsoft.
Edit:
Also some other contributors who are not as well known, like Linux, Minix, and others.
And millions of people and companies make software; everything that you download is software.
Who was the first computer named after?
The question does not have a simple and straightforward answer.
Charles Babbage invented, but did not build, an "Analytical Engine", which was a mechanical programmable computer, in the 19th century. It has subsequently been proved to be workable.
An abacus is a kind of computer which predates most of Western history.
The codebreakers at Bletchley Park created an electronic computer (called variously a "bombe" or "bombes") which was also programmable, perhaps mainly through the genius of Alan Turing.
Commercially, perhaps ENIAC is recognised as the first electronic computer.
Summarise the current uses of computers?
Computers are tools that can be used for many things, such as:
accounting and business
information gathering
online gaming.
How did george boole contribute to the information age?
He created Boolean logic: the AND, OR, and NOT operators, the same ones you use when you type into a search box. Example: (search:) monkeys NOT chimpanzees, or (search:) boy AND girl.
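In code, those same operators become ordinary boolean expressions. Below is a minimal sketch in Python of how a search filter might apply them; the document list and the matches() helper are invented here purely for illustration.

```python
# A minimal sketch of Boole's AND, OR, and NOT used as search filters.
# The documents and the matches() helper are made up for illustration.

documents = [
    "monkeys and chimpanzees are primates",
    "monkeys live in trees",
    "a boy and a girl went to school",
]

def matches(doc, required=(), any_of=(), excluded=()):
    words = set(doc.split())
    ok_and = all(t in words for t in required)                # AND
    ok_or = not any_of or any(t in words for t in any_of)     # OR
    ok_not = not any(t in words for t in excluded)            # NOT
    return ok_and and ok_or and ok_not

# "monkeys NOT chimpanzees" -> only the document without chimpanzees
print([d for d in documents
       if matches(d, required=["monkeys"], excluded=["chimpanzees"])])

# "boy AND girl" -> only the document mentioning both words
print([d for d in documents if matches(d, required=["boy", "girl"])])
```

Running it prints only the documents that satisfy each query: the NOT query keeps "monkeys live in trees", and the AND query keeps "a boy and a girl went to school".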
What started the computer revolution?
The personal computer revolution is often traced to Cupertino, California, where Apple introduced its first mass-market computers in 1977.
How can you check available memory?
Go to My Computer, right-click the drive, go to Properties, and it will show you the free space on that drive.
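If you would rather read the same numbers from a program than from the Properties dialog, here is a minimal sketch using Python's standard library; the drive path is just an example (use "/" on Linux or macOS).

```python
# Read total, used, and free space for a drive with the standard library.
import shutil

usage = shutil.disk_usage("C:\\")  # example path; "/" on Linux or macOS
gib = 1024 ** 3
print(f"Total: {usage.total / gib:.1f} GiB")
print(f"Used:  {usage.used / gib:.1f} GiB")
print(f"Free:  {usage.free / gib:.1f} GiB")
```

Note that this reports free disk space; checking free RAM is a different operation and varies by operating system.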
Which connection type supports up to 127 peripherals on a single connection?
Theoretically, you can use a USB port to connect up to 127 devices. To do so, you would need to use hubs. The problem with using hubs is that they count as devices: if a hub has 4 sockets, the hub plus the 4 devices plugged into it take up 5 of the 127 device addresses.
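A quick back-of-the-envelope calculation makes the trade-off concrete. This sketch assumes a deliberately simplified model (4-port hubs, a single root port, no tier limits), so the figure is illustrative rather than anything from the USB specification.

```python
# Back-of-the-envelope check of the "hubs count as devices" rule.
# Simplified model, for illustration only: 127 addresses on one bus,
# each 4-port hub consumes one address, and plugging a hub into a
# port trades 1 port for 4 (a net gain of 3 ports).

ADDRESSES = 127   # total device addresses on one USB bus
HUB_PORTS = 4     # ports per hub in this example

hubs, ports = 0, 1                 # start from a single root port
while ports < ADDRESSES - hubs:    # need a port for every end device
    hubs += 1
    ports += HUB_PORTS - 1         # one port is consumed by the hub itself

end_devices = ADDRESSES - hubs     # hubs eat into the address budget
print(f"{hubs} hubs give {ports} ports, enough for {end_devices} end devices")
```

Under those assumptions the loop settles at 32 hubs and 95 addressable end devices, showing how the hubs themselves eat into the 127-address budget.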
How was the modern computer developed?
If you want to get technical, it was in the 1940s, and it was the size of a small house (see http://en.wikipedia.org/wiki/ENIAC). If you are thinking of computers roughly the size you see them today, the answer could be argued among the 70s, 80s, and 90s: in the 70s you had the Xerox Alto, in the 80s you had the Macintosh and the IBM PS/2, and the 90s is when you start seeing more recognisably modern machines.
Why is the Stepped Reckoner unreliable?
The device tended to jam and malfunction because its parts, especially the carry mechanism, could not be made reliably with the manufacturing precision of the day.
Who was Charles Babbage's "girlfriend" who helped him?
"Girlfriend" is certainly NOT the right word to refer to Augusta Ada King, Countess of Lovelace, daughter of poet Lord Byron. But she did work with Charles Babbage, and developed the first algorithm that could be called a "computer program". Her mathematical work with Charles Babbage's "analytical engine" was groundbreaking and a century ahead of its time.
How do analog and digital computers differ?
A digital computer processes discrete data; an analog computer processes continuous data.
Analog computers are typically harder to program: they are either designed for one special purpose with the program permanently built in, or they must be "rewired" to reprogram them.
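To make the distinction concrete, here is a minimal sketch, with invented constants, in which a digital computer approximates in discrete steps a differential equation that an analog computer would solve continuously (for example, with an op-amp integrator wired for that one problem).

```python
# Discrete vs. continuous: a digital computer steps through time,
# while an analog computer would produce the continuous solution
# directly. Constants and names here are arbitrary examples.
import math

k, x0 = 0.5, 1.0        # decay rate and initial value for dx/dt = -k*x
dt, steps = 0.1, 50     # discrete step size and step count

x = x0
for n in range(1, steps + 1):
    x += -k * x * dt    # Euler step: the digital, discrete update
    if n % 10 == 0:
        t = n * dt
        exact = x0 * math.exp(-k * t)   # the continuous solution
        print(f"t={t:.1f}  digital={x:.4f}  continuous={exact:.4f}")
```

The printed "digital" column drifts slightly from the "continuous" one because each Euler step quantizes time; shrinking dt shrinks that gap.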
What computers are made in US?
I was just looking at QVC.com and they sell ZT Systems laptops that are made in the USA!
There are several companies building laptops in the USA: ZT Systems, Systemax, Lotus PC, and Polywell. They are generally supported by Americans too, which is a nice perk.
As computers become more a part of everyday life, many people believe that ___ is/are vital to success?
As computers become more a part of everyday life, many people believe that computer literacy is vital to success.
When were personal computers invented?
"By rights Xerox PARC could claim the PC was invented there and built in 1972, they created ALTO the first PC ever built." (years before Apple built their version and marketed it to the public), "not to mention laying the foundation for the program that eventually became the basis for the Macintosh
and Windows operating systems."
(Research who pirated Apple after they pirated rights from Microsoft. These facts were stated by cutting-edge research among today's foremost scientists, including theoretical physicist Michio Kaku.)
--------------------------------------------------------------------------------------------------------------
The first personal computer was invented by Steve Jobs and Steve Wozniak in 1976. The first PC was an Apple computer.
The PC industry began in 1977, when Apple, along with Radio Shack and Commodore, introduced the first off-the-shelf personal computers as consumer products.
<><><>
Personal computers and microcomputers were made possible by two technical innovations in the field of microelectronics: the integrated circuit, or IC, which was developed in 1959; and the microprocessor, which first appeared in 1971. The IC permitted the miniaturization of computer-memory circuits, and the microprocessor reduced the size of a computer's CPU to the size of a single silicon chip.
The microprocessor, a device that combines the equivalent of thousands of transistors on a single, tiny silicon chip, was developed by Ted Hoff at Intel Corporation in the Santa Clara Valley south of San Francisco, California, an area that was destined to become known to the world as Silicon Valley because of the microprocessor and computer industry that grew up there. Because a CPU calculates, performs logical operations, contains operating instructions, and manages data flows, the potential existed for developing a separate system that could function as a complete microcomputer.
The first such desktop-size system specifically designed for personal use appeared in 1974; it was offered by Micro Instrumentation Telemetry Systems (MITS). The owners of the system were then encouraged by the editor of a popular technology magazine to create and sell a mail-order computer kit through the magazine. The computer, which was called Altair, retailed for slightly less than $400.
The demand for the microcomputer kit was immediate, unexpected, and totally overwhelming. Scores of small entrepreneurial companies responded to this demand by producing computers for the new market. The first major electronics firm to manufacture and sell personal computers, Tandy Corporation (Radio Shack), introduced its model in 1977. It quickly dominated the field because of the combination of two attractive features: a keyboard and a cathode-ray tube (CRT) display terminal. It was also popular because it could be programmed and the user was able to store information by means of cassette tape.
Soon after Tandy's new model was introduced, two engineer-programmers, Stephen Wozniak and Steven Jobs, started a new computer manufacturing company named Apple Computer.
In 1976, in what is now Silicon Valley, Steve Jobs and Steve Wozniak created a homemade microprocessor computer board called the Apple I. Working from Jobs' parents' garage, the two men began to manufacture and market the Apple I to local hobbyists and electronics enthusiasts. Early in 1977, Jobs and Wozniak founded Apple Computer, Inc., and in April of that year introduced the Apple II, one of the world's first mass-marketed personal computers. Based on a board of their own design, the Apple II, complete with keyboard and color graphics capability, retailed for $1290.
Some of the new features they introduced into their own microcomputers were expanded memory, inexpensive disk drives for program and data storage, and color graphics. Apple Computer went on to become the fastest-growing company in U.S. business history. Its rapid growth inspired a large number of similar microcomputer manufacturers to enter the field. Before the end of the decade, the market for personal computers had become clearly defined.
In 1981, IBM introduced its own microcomputer model, the IBM PC. Although it did not make use of the most recent computer technology, the PC was a milestone in this burgeoning field. It proved that the microcomputer industry was more than a current fad, and that the microcomputer was in fact a necessary tool for the business community. The PC's use of a 16-bit microprocessor initiated the development of faster and more powerful micros, and its use of an operating system that was available to all other computer makers led to a de facto standardization of the industry.
In the mid-1980s, a number of other developments were especially important for the growth of microcomputers. One of these was the introduction of a powerful 32-bit computer capable of running advanced multi-user operating systems at high speeds. This blurred the distinction between microcomputers and minicomputers, placing enough computing power on an office desktop to serve all small businesses and most medium-size businesses.
Another innovation was the introduction of simpler, "user-friendly" methods for controlling the operations of microcomputers. By substituting a graphical user interface (GUI) for the conventional operating system, computers such as the Apple Macintosh allow the user to select icons (graphic symbols of computer functions) from a display screen instead of requiring typed commands. Douglas Engelbart invented an "X-Y Position Indicator for a Display System": the prototype of the computer "mouse", whose convenience has revolutionized personal computing. New voice-controlled systems are now available, and users may eventually be able to use the words and syntax of spoken language to operate their microcomputers.