Microprocessors

A microprocessor is the heart of any computer, whether it is a server, a desktop machine, or a laptop. This single chip contains the arithmetic, control, and logic circuitry necessary to interpret and execute computer programs.

Inventions
Computer History
Microprocessors

When was the computer invented?

The answer to this question depends on your definition of a "computer". The earliest "computers" were mechanical devices used to help people count. The first known counting devices were tally sticks, dating from about 35,000 BCE.

The abacus was invented, possibly by the Babylonians or the Chinese, in about 2400 BCE. It consists of movable counters that can be manipulated to add and subtract, and it is still used today for basic arithmetic.

As mathematics became more complex, it became harder and harder to devise mechanical aids for solving math problems. One such device, a "Difference Engine," was conceived in 1786 by J. H. Mueller, but it was never built.

The difference engine was forgotten and then revived in 1822 by the mathematician Charles Babbage. His machine used the decimal number system and was powered by cranking a handle. The British government first financed the project but later cut off support. Babbage went on to design a much more general machine, the Analytical Engine, in the 1830s, but later returned to produce an improved difference engine design (his "Difference Engine No. 2") between 1847 and 1849. After Babbage's death in 1871, his son Henry Prevost Babbage carried on the work, eventually completing part of the machine. The Analytical Engine was designed to be powered by a steam engine and was to use punched cards to direct its operation; punched cards were already in use to program mechanical looms at the time.

Shortly before WWII, Konrad Zuse built the Z1. According to Mary Bellis, the Z1 was the first real functioning binary computer (actually, it was a very large calculator -- but a computer nonetheless!). Zuse used it to explore several ground-breaking technologies in calculator development: floating-point arithmetic, high-capacity memory, and modules or relays operating on the yes/no principle. Zuse's ideas, not fully implemented in the Z1, succeeded more with each Z prototype.

In 1939, Zuse completed the Z2, the first fully functioning electromechanical computer, and followed it with the Z3 in 1941. These machines were used for engineering calculations, such as aircraft design, for the German war effort. Germany's encrypted military communications, meanwhile, were being attacked by the British: building on the work of mathematician Alan Turing, the code breakers at Bletchley Park created the Colossus Mark I.

Colossus was the world's first programmable digital electronic computer, developed in 1943 at "Station X", Bletchley Park, England. British code breakers used Colossus to read encrypted German teleprinter messages (the Lorenz cipher; the better-known "Enigma" traffic was broken with earlier electromechanical machines). The Germans did not know their ciphers had been broken, which is one reason the D-Day invasion succeeded.

Between 1939 and 1942, John V. Atanasoff and Clifford Berry developed the Atanasoff-Berry Computer (ABC) at Iowa State University, later ruled to be the first electronic digital computer. The ABC was built by hand; the design used over 300 vacuum tubes and stored its working memory in capacitors mounted on a mechanically rotating drum.

In 1945, ENIAC, created by J. Presper Eckert and John Mauchly, was completed, and it was unveiled to the public the following year. ENIAC (Electronic Numerical Integrator and Computer) weighed in at 27 tons and filled a large room. Not surprisingly, ENIAC also made big noises, crackling and buzzing while performing some 5,000 additions per second. Before ENIAC, it took a room full of people working mechanical calculators to handle comparable computations.

The first electronic computer that could store its own programs was developed in 1948 at Manchester University. Nicknamed "the Baby", it celebrated its 60th birthday in 2008, and it is widely considered the forerunner of the modern computer.

The UNIVAC I (Universal Automatic Computer) was the first commercially available, "mass-produced" electronic computer. It was manufactured by Remington Rand in the USA, and the first unit was delivered to the US Census Bureau in 1951. UNIVAC I used 5,200 vacuum tubes and consumed 125 kW of power. Forty-six machines were sold, at more than $1 million each. By this time, computer design was limited primarily by the size and heat of vacuum tubes.

The vacuum tube was eventually replaced by the transistor. In 1958 the monolithic integrated circuit (now called the microchip) was demonstrated by Jack Kilby at Texas Instruments in Dallas, Texas, and a few months later Robert Noyce of Fairchild Semiconductor in California independently arrived at a more practical version. The two companies were embroiled in legal actions for years but finally decided to cross-license their products. Kilby was awarded the Nobel Prize in Physics in 2000.

The microchip led to the development of the microcomputer: small, low-cost computers that individuals and small businesses could afford. The first home computers became commercially viable in the mid-to-late 1970s, and more so in the early 1980s. By the 1990s, the microcomputer or personal computer (PC) had become a common household appliance, and it grew even more widespread with the advent of the Internet.
It is hard to state the exact date the computer was invented, as it was a continuous process; by one common convention, though, the story starts with Babbage's difference engine proposal of 1822.

Science
Electronics Engineering
Microprocessors
Intel 8085

What causes eddy current?

Eddy currents are loops of electric current induced within a conductor by a changing magnetic field, or by a conductor moving through a magnetic field. By Faraday's law of induction, the changing magnetic flux induces an electromotive force that drives these circulating currents, which dissipate energy as heat in the conductor.

Computer Hardware
Microprocessors

What are the components of a CPU?

The CPU consists of three main parts:

ALU (Arithmetic Logic Unit): Performs the actual arithmetic operations and logical comparisons that need to be processed.

Control Unit: Fetches and decodes instructions, and directs the ALU, registers, and memory so that results are computed and stored correctly.

Registers: Small, fast storage locations that hold the data and addresses to be operated on next.
More briefly: the control unit and the arithmetic logic unit (ALU), together with the registers that serve them.
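The way these parts cooperate can be sketched in a few lines of C. The toy fetch-decode-execute loop below is an illustration only; its 16-bit instruction encoding (one opcode nibble plus three register nibbles) is invented for the example and does not correspond to any real processor:

    #include <stdio.h>

    /* Toy fetch-decode-execute loop. Instruction format (invented):
       bits 15-12 opcode, 11-8 dest register, 7-4 source 1, 3-0 source 2. */
    enum { OP_HALT = 0, OP_ADD = 1 };

    int reg[4]    = { 0, 7, 35, 0 };    /* register file */
    int program[] = { 0x1312, 0x0000 }; /* ADD r3, r1, r2 ; HALT */

    int main(void)
    {
        int pc = 0;                          /* program counter */
        for (;;) {
            int ir = program[pc++];          /* control unit: fetch */
            int op = (ir >> 12) & 0xF;       /* control unit: decode */
            int rd = (ir >> 8) & 0xF;
            int rs = (ir >> 4) & 0xF;
            int rt = ir & 0xF;
            if (op == OP_HALT) break;
            if (op == OP_ADD)
                reg[rd] = reg[rs] + reg[rt]; /* ALU computes; registers hold data */
        }
        printf("r3 = %d\n", reg[3]);         /* prints r3 = 42 */
        return 0;
    }

The control unit's job is the fetch and decode steps, the ALU does the addition, and the register file holds the operands and the result.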

Another answer lists the parts of a complete computer rather than of the CPU itself:

1) hard disk
2) processor
3) motherboard
4) RAM - random access memory
5) ROM - read only memory
6) various ports
7) floppy drive
8) various resistors
9) cables
10) BIOS
11) SMPS (switched-mode power supply)
12) CMOS

Computer Hardware
Microprocessors

What is a bit-slice processor?

Bit slicing is a technique for constructing a processor from modules of smaller bit width. Each of these components processes one bit field, or "slice", of an operand, and the grouped components together process the full word length chosen for a particular design. Bit-slice processors usually consist of an arithmetic logic unit (ALU) of 1, 2, 4, or 8 bits, plus control lines (including carry and overflow signals that are internal to the processor in non-bit-sliced designs). For example, two 4-bit ALUs could be arranged side by side, with control lines between them, to form an 8-bit ALU; cascading further slices yields 16-bit, 32-bit, or longer words, so the designer can add as many slices as needed to manipulate longer word lengths. A microsequencer or control ROM would then execute the logic that provides data and control signals to regulate the function of the component ALUs. Examples of bit-slice microprocessor modules can be seen in the Intel 3000 family, AMD's Am2900 family, the National Semiconductor IMP-16 and IMP-8 families, and the 74181.
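The carry-chaining idea is easy to demonstrate in software. Here is a minimal C sketch (an illustration only, not a model of any particular slice chip such as the Am2901) in which two 4-bit ALU "slices" are cascaded through their carry lines to form an 8-bit adder:

    #include <stdint.h>
    #include <stdio.h>

    /* One 4-bit ALU "slice": adds two nibbles plus a carry-in, and reports
       the carry-out on a separate line, just as a hardware slice would. */
    uint8_t alu_slice_add(uint8_t a, uint8_t b, uint8_t carry_in, uint8_t *carry_out)
    {
        uint8_t sum = (uint8_t)((a & 0x0F) + (b & 0x0F) + (carry_in & 1));
        *carry_out = (sum >> 4) & 1;    /* bit 4 is the inter-slice carry line */
        return sum & 0x0F;
    }

    /* Cascade two slices into an 8-bit adder; more slices extend the width. */
    uint8_t add8(uint8_t a, uint8_t b)
    {
        uint8_t carry;
        uint8_t low  = alu_slice_add(a & 0x0F, b & 0x0F, 0, &carry);
        uint8_t high = alu_slice_add(a >> 4, b >> 4, carry, &carry);
        return (uint8_t)((high << 4) | low);
    }

    int main(void)
    {
        printf("0x3A + 0x2B = 0x%02X\n", add8(0x3A, 0x2B)); /* prints 0x65 */
        return 0;
    }

Widening the design to 16 or 32 bits is just a matter of chaining more slices, which is exactly the appeal of the technique in hardware.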

Software and Applications (non-game)
Microprocessors

How much does the Jaguar supercomputer cost?

3.5 million dollars

Microprocessors

What is memory space in a microprocessor?

Memory space in a microprocessor usually refers to the cache: memory built into (or placed very close to) the processor that holds recently used instructions and data, so the processor does not have to wait on main memory to perform its functions. Historically, the level 1 (L1) cache was the memory inside the microprocessor itself, while the level 2 (L2) cache sat on the motherboard; in modern processors both levels are on the chip.

Memory space can also mean the total size of virtual memory that a CPU can address, and the layout (flat or segmented) of this space.

Most modern CPUs found in PCs (AMD and Intel x86_64 chips) are 64-bit CPUs but, due to cost and practical-use considerations, limit their virtual memory space to 48 bits (or fewer).
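To make those sizes concrete, here is a small C calculation (plain arithmetic, no particular CPU assumed) of how much memory a 48-bit virtual address can reach:

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        uint64_t space48 = 1ULL << 48;   /* bytes reachable with 48 address bits */
        printf("48-bit space: %llu bytes = %llu TiB\n",
               (unsigned long long)space48,
               (unsigned long long)(space48 >> 40)); /* 2^48 / 2^40 = 256 */
        return 0;
    }

A full 64-bit space would be 16 EiB; 48 bits already give 256 TiB, far more than any current PC can hold, which is why designers stop there.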

Microprocessors
Computer History
Electronics Engineering

Why is it called a microprocessor and not a mini-processor?

It is called a microprocessor because it is the single microchip integrated circuit processor of a microcomputer. The term minicomputer was already taken and referred to small computers whose processor was built of many different microchip integrated circuits. There was never anything called a "miniprocessor" in these computers, just as there was never anything called a "mainframe processor" in the big computers.

Note: Some early microprocessors were not single microchips, but were instead a family of related microchips that could be interconnected in a variety of different ways (e.g. AMD 2900 family). This served two functions:

  1. it avoided the limits on the number of transistors that could be put on a single microchip with the technology available then
  2. it provided greater design flexibility (e.g. the same microprocessor family could be used to build microcomputers having very different word lengths depending on the requirements: 16 bit, 32 bit, 48 bit, 64 bit, etc.)
You do not see this anymore; all microprocessors now are single-microchip integrated circuits.
Microprocessors
Technology Conferences

Are processors interchangeable?

Each motherboard is chip-specific. If your motherboard is for an AMD chip, then you will only be able to use AMD processors in it. If you want to change your processor, you may have to get the right motherboard for that processor.

Answer

Processors (regardless of who makes them) have a set of electrical and communications standards which they use, and a physical design of the package. Processors which have the identical package, electrical characteristics, and bus communication standards can be interchanged with each other (provided the BIOS or PROM can properly understand the identity value the CPU has).

For a variety of reasons, mostly related to patents, interchangeability these days (post-2005) is limited to processors from a single manufacturer, and within a "family".

Ultimately, you will have to refer to the MOTHERBOARD manufacturer's documentation to find out which CPUs can be used in that particular system.

Computer Programming
C Programming
Microprocessors

In the 8085, where do local variables get stored?

In registers or in RAM (memory).
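As a rough illustration, consider how a C compiler for a small processor might place locals. This is a sketch only; any real 8085 C toolchain (an assumption here) has its own calling conventions:

    /* Sketch: the two homes a local variable can have. */
    int sum(int a, int b)
    {
        register int t = a + b; /* "register" hints: keep t in a CPU register */
        int copy = t;           /* may instead be spilled to the stack, i.e. RAM */
        return copy;
    }

On the 8085 the register file is tiny (A, B, C, D, E, H, L), so most locals beyond the first few end up on the stack in RAM.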

Manufacturing
Microprocessors
Motherboards

Which companies manufacture processors?

The most popular manufacturers of CPUs are Intel (8086, 80286, 80386, 80486, Celeron, Pentium, Core, Core 2, i3, i5, i7, and Itanium series) and AMD (Duron, Sempron, Athlon, Phenom, and Opteron CPU families).

Other companies include ARM (for mobile devices), the alliance of Motorola, Apple and IBM (PowerPC), Hewlett-Packard (PA-RISC) and Sun Microsystems (SPARC processors). Sun was later acquired by Oracle Corporation and renamed Oracle America, Inc.

Microprocessors
Intel Core 2

What does dual-core and quad core mean?

A multi-core processor is composed of two or more independent cores; a dual-core chip has two, a quad-core four. One can describe it as an integrated circuit which has two or more individual processors (called cores in this sense). Manufacturers typically integrate the cores onto a single integrated circuit die (known as a chip multiprocessor or CMP), or onto multiple dies in a single chip package.
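A program can ask the operating system how many logical processors it sees. A minimal C example for Linux and most Unix-like systems (the _SC_NPROCESSORS_ONLN query is a widely supported extension rather than core POSIX):

    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        /* Number of logical processors currently online. */
        long n = sysconf(_SC_NPROCESSORS_ONLN);
        printf("Logical processors online: %ld\n", n);
        return 0;
    }

Note that the count includes hardware threads, so a dual-core CPU with Hyper-Threading will typically report 4.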

Microprocessors
Intel Core 2
Intel Microprocessors

What is the worst processor Intel makes?

Celeron!

To be more specific, it is one of the Celeron range with a single core, a single thread, and 1 MB or less of cache. My laptop is proof: it has an Intel Celeron 900 with one core, one thread, and 1 MB of cache, which means it can store and process very little data at a time. With one core it cannot distribute work across multiple cores, and it can only run a single thread, so data trickles through the processor. In practice it takes minutes to load something as simple as a web browser, and even longer to open Word. As for games, Minecraft crashed on the lowest possible settings, it has even crashed using Google, and the fan kicks in at full power just from bringing up the Windows menu. No amount of RAM or hard drive space can save it, and the average mobile phone of today has a better processor. Hope this helped.

Electronics Engineering
Microprocessors

What is the voltage of an 8051 microcontroller?

This question is impossible to answer without knowing a specific part number (and the manufacturer, which is usually easy to determine because it is encoded in the part number). The 8051 comes from the same Intel microcontroller line whose relatives served as keyboard controllers in early IBM PCs, sending key presses to the CPU, and it has since become an architecture that appears in many other places. Several manufacturers produce low-cost, low-power microcontrollers based on the 8051 with other peripherals included in the same package. Because there are numerous 8051 controllers, there is no single standard for connecting them to power; each manufacturer designs its controllers to meet its own guidelines.

Older 8051 microcontrollers usually require a 5 V power supply.

Newer 8051 microcontrollers are more likely to require a 3.3 V power supply.

The only way to tell is to read the part number off the chip and look up the manufacturer's data sheets for that part number.

Fortunately, most data sheets are online these days.

Computer Terminology
Microprocessors
Computer Memory

What is logical cache?

A logical cache (also called a virtual cache) stores data using virtual (logical) addresses, so the processor can look data up without first translating the address through the memory-management unit; a physical cache, by contrast, is addressed with physical memory addresses. Informally, the term is also applied to the L1, L2, and L3 caches in general: fast memory on or near the processor that remembers what the processor has asked for over its most recent accesses.
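The "remembering" behaviour is easy to sketch in code. Below is a toy direct-mapped cache in C; the sizes and the hit/miss printout are invented for the example, and real caches are set-associative and vastly larger:

    #include <stdint.h>
    #include <stdio.h>

    /* Toy direct-mapped cache: 8 lines, one 32-bit word per line. */
    #define LINES 8

    typedef struct {
        int      valid;
        uint32_t tag;
        uint32_t data;
    } line_t;

    static line_t cache[LINES];

    /* Look up an address; on a miss, "fill" the line from a supplied value. */
    uint32_t cache_read(uint32_t addr, uint32_t memory_value)
    {
        uint32_t index = (addr / 4) % LINES;  /* which line the address maps to */
        uint32_t tag   = (addr / 4) / LINES;  /* identifies the block in that line */

        if (cache[index].valid && cache[index].tag == tag) {
            printf("0x%08x: hit\n", (unsigned)addr);
        } else {
            printf("0x%08x: miss, filling line %u\n", (unsigned)addr, (unsigned)index);
            cache[index].valid = 1;
            cache[index].tag   = tag;
            cache[index].data  = memory_value;
        }
        return cache[index].data;
    }

    int main(void)
    {
        cache_read(0x1000, 42); /* miss: first touch of this address */
        cache_read(0x1000, 42); /* hit: the cache "remembered" the access */
        return 0;
    }

The second read of the same address hits because the line filled by the first read is still present - the essence of what any cache level does.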

Computers
Computer Hardware
Microprocessors
Computer Memory

Is it harder to add hardware to a Hewlett Packard than other computers?

You should be able to upgrade or add on most parts to just about any desktop computer available today. The only thing I've found difficult are the case designs of some computers I've worked on, and that only matters if you're replacing something big like the motherboard. Manufacturers save money by using standard parts, so you won't find too many "HP only" or "Gateway only" parts. They may *advertise* it that way, but that's just to get you to spend more money on them. I've not heard of HP being difficult in the past - perhaps you're thinking of PB (Packard Bell), which used to do crazy stuff like solder memory and CPUs in place, making it near impossible to replace them. Luckily, they don't make computers any more.



Some budget PCs from HP, like the a400n, come standard with motherboards that do not have an AGP slot. Just be aware of what you are really getting with a cheap PC.



Find a local non-branded computer shop. Buy your new PC there. You'll be supporting a smaller store, your community, plus you'll probably get better service after-sale, and should get a better deal for your money.



Difficulty of working on an HP desktop depends on the model. Some have a bunch of stuff you have to remove just to get to the RAM, SATA ports and other ports; otherwise it's no different than working on any other computer. The only thing to remember is that HP mounts the motherboard on the right side of the case instead of the left (looking at it from behind), compared with a typical ATX case. Otherwise it's as simple as any other computer.
Microprocessors

What is a subroutine?

In computer programming, a subroutine is an identified sequence of instructions with a start and an end point which may be invoked from another part of the program. When a subroutine is called, the processor executes the instructions until it reaches the end of the subroutine, at which point control is returned to the point in the program immediately following the call.

In most programming languages, a set of conventions is followed that allows values to be passed into the subroutine and a result to be returned, so that the subroutine can be used in many different contexts. This is the most basic form of reusable software.

In higher-level languages, functions and methods are specialized forms of subroutines.
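A short C illustration of the call-and-return flow described above (the function name is arbitrary):

    #include <stdio.h>

    /* A subroutine: control jumps here when called and returns to the
       instruction following the call when the end point is reached. */
    int square(int x)
    {
        return x * x;             /* end point: control returns to the caller */
    }

    int main(void)
    {
        int result = square(7);   /* invoke the subroutine, passing a value in */
        printf("%d\n", result);   /* execution resumes here after the return */
        return 0;
    }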

Inventions
Computer History
Microprocessors

Where was the computer invented?

"The Father of Computers" Charles Babbage who invented the 1st mechanical Computer.. in 1834 he dreamt of designing mechanical calculating machines. "... I was sitting in the rooms of the Analytical Society, at Cambridge, my head leaning forward on the table in a kind of dreamy mood, with a table of logarithms lying open before me. Another member, coming into the room, and seeing me half asleep, called out, "Well, Babbage, what are you dreaming about?" to which I replied "I am thinking that all these tables" (pointing to the logarithms) "might be calculated by machinery. "

Cambridge, England, is where the first computer was invented, by Charles Babbage. Meanwhile Ada Lovelace is credited as the "first computer programmer", since she was writing programs - that is, manipulating symbols according to rules - for Babbage's machine.

While it was Babbage who invented the first computer in Cambridge, another Englishman, Alan Turing, the "father of modern computer science", provided an influential formalisation of the concepts of algorithm and computation with the Turing machine in 1936. Of his role in the modern computer, Time Magazine, naming Turing one of the 100 most influential people of the 20th century, stated: "The fact remains that everyone who taps at a keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of a Turing machine."

Iowa State University (Ames, Iowa), by John Vincent Atanasoff. There is some dispute, as Atanasoff did not patent his digital computer and other computers eventually came out. A federal lawsuit (Honeywell v. Sperry Rand, decided in 1973) invalidated the ENIAC patent and recognized Atanasoff's machine as prior art, which is why the campus is known as the "Birthplace of the electronic digital computer".

Check out the answer to "Why was the first computer made?" - I've answered the question of when and where as well. Specifically, the first machine we classify as a true computer (capable of making decisions) was the ENIAC, built at the University of Pennsylvania between 1943 and 1946 by John Mauchly and J. Presper Eckert. However, much of the credit for the original design of the electronic computer is given to John Atanasoff who, together with his graduate student Clifford Berry, developed a working digital computer on the Iowa State campus between 1939 and 1942. Because of improper handling of their patent application, it took almost 50 years for them to receive full credit for their invention, the ABC (Atanasoff-Berry Computer).

During World War II, "the father of modern computer science" Alan Turing (originator of the Turing machine) worked as a cryptographer, breaking codes and ciphers at one of the British government's top-secret establishments, Bletchley Park. In January 1943, work began there on an electronic machine to decode the German teleprinter cipher. This machine, dubbed COLOSSUS and engineered chiefly by Tommy Flowers, comprised 1,800 vacuum tubes and was completed and working by December of the same year! By any standards COLOSSUS was one of the world's earliest working programmable electronic digital computers. But it was a special-purpose machine, really only suited to a narrow range of tasks (for example, it was not capable of performing decimal multiplications). Having said that, although COLOSSUS was built as a special-purpose computer, it did prove flexible enough to be programmed to execute a variety of different routines.

Computers have been around for a very, very long time, but the definition of what makes something a computer has changed a great deal, and the progress was made by many, many people, not just one "inventor". Some would say that the first "computer" was the abacus, invented in Asia about 5,000 years ago, but that is probably not what you're looking for, so let's look a little more recently.

As time went on, a number of special devices were invented to help with things like tax collecting, taking the census, and so on. At first these were purely mechanical; by the start of the twentieth century they were electromechanical. The first of the "modern" computers was invented during World War II, in 1941, by a German engineer named Konrad Zuse. The computer was called the Z3 and was used to help design German airplanes and missiles. A couple of years later, in 1943, the Allied forces developed a computer called Colossus to help decode German messages. But since the Z3 was developed by the side that lost the war, and Colossus stayed a military secret for many years, these computers didn't contribute much to the ones that came next.

Independent of the Colossus project, the next computer was the Mark I, designed by Howard H. Aiken, an engineer working with Harvard and IBM. The Mark I was positively huge - over fifty feet long - and it helped to create ballistic charts for the US Navy during the war. Shortly after this came the Electronic Numerical Integrator and Computer (ENIAC), developed by John Presper Eckert and John W. Mauchly, working with the government and the University of Pennsylvania. ENIAC was a lot like the Mark I, except that it ran about 1,000 times faster.

Moving along, there were other computers like EDVAC (1945) and UNIVAC I (1951). But all these machines had something in common with the older computers: they were designed for specific purposes and couldn't really be used for anything else, and they all worked by using vacuum tubes, which is what made them take up so much space. The invention of the transistor in 1947 by John Bardeen, Walter Brattain, and William Shockley at Bell Labs made the big difference from here. Using transistors, computers appeared that could store memory and even run programs, and soon there were computer languages so that people could change the programs run by the computer when they wanted to. After a while, the focus of computer research turned to making computers smaller, giving us the kinds of computers we have today. For more detailed history, see the related links.

Konrad Zuse (German) developed and built the first binary digital computer in the world, the Z1, in 1938, and in 1941 built the Z3, the first fully functional program-controlled electromechanical digital computer in the world. Both of these machines were destroyed during WWII.

Different variants on the computer were invented nearly simultaneously by different people in the US, England, and Germany between 1938 and 1945.

Then there was Babbage's work in England in the 1840s, but nothing was built.

The digital computer was invented, but not built, in London, England.

The electronic digital computer was invented in a roadside bar in Illinois and built in Ames, Iowa.

The analog computer was invented and built in ancient Greece (the Antikythera mechanism).

Computer Hardware
Microprocessors

Is a 1 GHz processor on a mobile phone good?

Well, an iPhone 3GS is only 600 MHz (0.6 GHz), so yes, 1 GHz is very good. Most phone brands are putting a 1 GHz Snapdragon in their higher-end models now.

Computers
Microprocessors
Intel Microprocessors
AMD Microprocessors

Will Intel make a Pentium 5?

The Pentium brand has been relegated to low-cost / budget processors. Creating a processor called the "Pentium 5" would confuse consumers, who now expect a Pentium to be a cheaper processor, while the name would imply that it was a flagship successor to the Pentium 4.

Microprocessors
Computer Memory

What are microchips and how are they related to integrated circuits?

Basically they can be considered the same thing.

"Microchip" may refer to the fact that the chip contains a microprocessor.

A microprocessor is an integrated circuit that is designed to handle instructions from a software program and form the core of a computer system.

Integrated circuits are any type of device that has many components etched onto a single piece of silicon and embedded in a plastic package with conducting legs. They include microprocessors, but they can also be amplifiers, logic gates, memory storage, complete radios, or other dedicated circuits that do one job very well.

Microprocessors
The Difference Between

What is an ARM9 RISC 32-bit processor?

It's exactly what it says on the tin: a 32-bit processor that uses the ARM9 architecture, a RISC design.

Microsoft Windows
Microprocessors
Hard Disk Drives

Why would a computer no longer remember passwords after installing a new hard drive and how do you make it remember again?

Hard drives do not come with passwords; you have to set your own (in Windows, through the Control Panel). A new hard drive is empty - it has nothing on it. When you install the operating system, you're starting over from scratch. You'll have to enter your passwords the same way you did before and tell Windows to remember them, just as you'll have to reinstall any other software that you want on the machine.

Microprocessors
Intel Core 2
Intel Microprocessors

Will GTA 4 run on an Intel Core i3?

Yes, but on low settings. If you want better settings you need a Core 2 Quad Q8400 or better. For maximum settings you need a Core i7 and a GTX 260 or ATI 4870 graphics card, plus 4 GB of RAM.

Frankly, most modern games are GPU-bound (limited by the graphics card) more than CPU-bound. A Core i3 will easily run GTA 4 in all but the highest resolutions (1920x1200 or higher) provided it has a good graphics card - generally at least an nVidia GTX 460 or Radeon 6850. You will also need sufficient RAM (2 GB minimum if you run nothing else).

The sad fact about current state-of-the-art games is that many are poorly designed from a software standpoint. Very few scale well with added cores (most barely use two cores well), and they are notoriously poorly multi-threaded. Very, very few games can really make use of a quad-core system; in fact, most games that "recommend" a quad-core do so not for the game itself, but so that the game can have two cores to itself while the additional cores run the system and other apps. Now that most GPUs include physics-modeling support, the vast majority of the work in a modern PC game is done on the GPU, with the CPU generally much less loaded (often doing "housekeeping" functions and audio, and little else).

Microprocessors

Why are CISC chips relatively slow per instruction?

Short answer: one CISC instruction does the work of several RISC instructions. That makes CISC slower per instruction than RISC, even though each instruction accomplishes more.

Compared to other architectures, CISC processors have relatively powerful instructions, which were traditionally implemented as microprograms. For each instruction, the corresponding microprogram has to be fetched and executed, which is considerably slower than executing simple, hardwired instructions directly (compare RISC). According to Wikipedia, microprogrammed processors are no longer common; instead, a special unit decodes CISC instructions into internal RISC-like micro-operations.
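As a concrete illustration, here is a tiny C model (with an invented memory-to-memory "instruction", not any real instruction set) of how one CISC-style operation breaks down into several RISC-like micro-operations:

    #include <stdio.h>

    /* A 16-word "main memory" with two operands pre-loaded. */
    int memory[16] = { [2] = 30, [5] = 12 };

    /* One CISC-style instruction, ADD [dst],[src], expanded into the
       RISC-like micro-operations a modern decoder would emit. */
    void cisc_add_mem_mem(int dst, int src)
    {
        int r1 = memory[dst];   /* micro-op 1: LOAD  r1, [dst] */
        int r2 = memory[src];   /* micro-op 2: LOAD  r2, [src] */
        int r3 = r1 + r2;       /* micro-op 3: ADD   r3, r1, r2 */
        memory[dst] = r3;       /* micro-op 4: STORE [dst], r3 */
    }

    int main(void)
    {
        cisc_add_mem_mem(2, 5);
        printf("memory[2] = %d\n", memory[2]); /* prints memory[2] = 42 */
        return 0;
    }

A RISC machine exposes the four small steps directly; a CISC machine bundles them into one instruction that therefore takes longer per instruction.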

Computer Terminology
Computer Programming
Microprocessors

What is the difference between a cross assembler and a resident assembler?

A resident assembler runs on the same computer (or processor family) for which it produces object code. A cross assembler runs on one computer but produces object code for a different one; it is typically used when the target machine, such as a small microcontroller, is too limited to host its own development tools.

