Q: How many bits in IBM's first computer?

Best Answer

The IBM 701, IBM's first computer available for sale, had 36 bits per word. This word size was used on all of IBM's 700 and 7000 series binary scientific computers.

Continue Learning about General History

When was the computer first developed?

The first programmable electronic computers were developed in the 1940s; ENIAC, often cited as the first general-purpose electronic computer, was completed in 1945.


In computer terms what is a nibble?

In computer terms a nibble = 4 bits = half a byte. The data sizes are sometimes further broken down as: crumb = 2 bits, nibble = 4 bits, byte = 8 bits, word = 16 bits, double word = 32 bits; the jury is still out on names for 64 bits and the "sentence". In keeping with the spelling of "byte", lick and nibble are sometimes spelled "lyck" and "nybble".

A nibble is half a byte, but believe it or not, a byte does not necessarily have to have eight bits. No current computer platform uses anything but 8-bit bytes, but in computer science terms a byte is generally the smallest addressable element, and the size needed to store a character. You also might be surprised to know that not all machines use ASCII characters stored in 8-bit bytes. IBM mainframes still use EBCDIC under their traditional operating systems; that stands for Extended Binary Coded Decimal Interchange Code, and it accounted for the lion's share of data until a few decades ago. EBCDIC is an extended version of BCD, which uses 4 bits per decimal digit, and there is no technical reason a BCD-based machine couldn't have 4-bit bytes. You are unlikely ever to encounter a computer that doesn't use 8-bit bytes, but you may encounter people who studied computer science back in the 1970s.

Back in the "old" days (the 1960s), when computers didn't have operating systems or high-level programming languages, you always dealt with the byte. On some machines the byte was 8 bits; on others it was 8 bits plus a parity bit, for 9 bits. There was "even" parity and "odd" parity: the parity bit was set so that the total number of 1-bits, including the parity bit, came out even (even parity) or odd (odd parity).

The "word" was originally the size of a register, since everything was done through a set of registers. The registers held the instruction you wanted the computer to execute: what kind of action (move a byte, add, subtract, and so on), the length of your data (which determined how many cycles the computer needed to execute the instruction), and where the data was coming from and going to. The word length was pegged to the register length, so, treating the computer like a book, each register held one word. Since the first byte-oriented computers had 8-bit registers, a word was 8 bits; when 16-bit registers were introduced the word became 16 bits, then 32 bits, and now 64 bits, and some computers today even have 128-bit words. So a "word" is the length of the registers in whatever computer you are using, and it is also the biggest chunk of bits the computer can process at one time.

The word "nibble" was coined to name the high-order 4 bits or the low-order 4 bits of a byte (like taking a nibble from a cookie instead of the whole cookie). Since a decimal digit fits in 4 bits, a field that was all numbers could be written out in nibbles, using half the space it would have taken in bytes. Back in those days, space counted: the first "mainframe" computers had 4K of memory (yes, really 4K), so you had no space to waste if you were doing something like payroll or inventory management.

In some cases, individual bits within bytes are used to store flags (yes or no for a given attribute), and in at least one IBM manual these were referred to as "tidbits".
IBM was not known for a sense of humor, and the term never became a generally accepted abbreviation.
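A minimal Python sketch (not part of the original answer; the function names are made up for illustration) showing two of the ideas above: splitting a byte into its high and low nibbles, and computing an even-parity bit for an 8-bit value.

def nibbles(byte):
    # Split an 8-bit value into its high-order and low-order 4-bit nibbles.
    high = (byte >> 4) & 0xF
    low = byte & 0xF
    return high, low

def even_parity_bit(byte):
    # Return the extra bit that makes the total count of 1-bits come out even.
    ones = bin(byte & 0xFF).count("1")
    return ones % 2  # 1 when the data itself has an odd number of 1-bits

print(nibbles(0xA7))                # (10, 7)
print(even_parity_bit(0b10110010))  # four 1-bits in the data, so the parity bit is 0

Packing one decimal digit per nibble, as the answer above describes, stored numeric fields in half the space of one digit per byte.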


In the beginning how many types of computers were there?

There was one: the first computer was ENIAC, short for Electronic Numerical Integrator And Computer.


Why did the first computer have so many cords?

If you are talking about ENIAC, that is how it was programmed: programs were set up by plugging patch cords between its units and setting switches.


The first computer introduced in Nepal?

The National Computer Center of Nepal had an IBM 1401 at least several years before 1978. It was used for many purposes, including the census and Royal Nepal Airlines. It was certainly the first large computer used there.

Related questions

What was the first microprocessor to make it into a home computer, and how many bits could it process at one time?

The microprocessor used in the first home computer was the Intel 8080. It could handle 8 bits at a time.


How many bits does your laptop have?

The easiest way to find out how many bits your laptop has is to open the system information in the Control Panel. First click the "Start" button on your desktop, then go to Control Panel and click "System." All of your computer's operating specifications are listed there, including whether it is running a 32-bit or 64-bit operating system.
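If you would rather check from code, here is a small sketch using Python's standard library (my own example, not part of the original answer) that reports whether the running Python build and machine are 32-bit or 64-bit.

import platform
import struct

# The pointer size in bits tells you whether this Python build is 32-bit or 64-bit.
print(struct.calcsize("P") * 8, "bit Python build")

# The platform module reports the interpreter architecture and the machine type.
print(platform.architecture()[0])   # e.g. '64bit'
print(platform.machine())           # e.g. 'x86_64' or 'AMD64'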


How many multiplexers are there in a bus in a computer?

32 multiplexers, each 16 bits wide.


How many bits are in flabbergasted?

A "flabbergasted" is not a measurement of computer memory. 8 bits are in a byte, 1000 bytes are in a kilobyte, 1000 kilobytes are in a megabyte, Etc.


How many bits could the first microprocessor handle at one time?

The first microprocessor was the 4004. It could handle 4 bits at a time.


Is the number of bits in a word the same in every computer?

No. Computers have been built with word sizes ranging from as few as 1 bit to as many as 72 bits, and architectures have been proposed with words as large as 256 bits.


What is a bit word?

8 bits, 16 bits, 32 bits, 64 bits, and 128 bits refer to a broadside [parallel] output of that many bits of digital information on a bus. Those bits represent one word. Therefore, the longer the word, the more information can be processed at a time; more bits generally mean a faster computer or faster data flow.


How many bits are needed to represent colors?

Most modern digital cameras use 24 bits (8 bits per primary) to represent a color. But more or less can be used, depending on the quality desired. Many early computer graphics cards used only 4 bits to represent a color.
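As a rough illustration of the 24-bit (8 bits per primary) scheme mentioned above, here is a short Python sketch; pack_rgb and unpack_rgb are hypothetical helper names, not a standard API.

def pack_rgb(r, g, b):
    # Pack three 8-bit channels into one 24-bit color value.
    return (r << 16) | (g << 8) | b

def unpack_rgb(color):
    # Split a 24-bit color back into its 8-bit red, green, and blue channels.
    return (color >> 16) & 0xFF, (color >> 8) & 0xFF, color & 0xFF

print(hex(pack_rgb(255, 128, 0)))   # 0xff8000
print(unpack_rgb(0xFF8000))         # (255, 128, 0)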


What is a byte and how many bits does it contain?

A byte is the basic unit of information in a computer. It is usually the smallest addressable object in memory. The size of a byte is a function of the design of the computer, but it has come to mean 8 bits nearly universally. (Octet is a more precise term for exactly 8 bits, in case there is any ambiguity.)
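One small way to see the 8-bit byte in practice (a Python-specific illustration, not part of the original answer) is Python's bytes type, whose elements must each fit in 8 bits.

data = bytes([0, 127, 255])   # each element must be in range(0, 256), i.e. fit in 8 bits
print(len(data))              # 3 bytes
print(list(data))             # [0, 127, 255]
# bytes([256]) raises ValueError, because 256 does not fit in a single 8-bit byte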


How many bits are in ten bytes?

There are 80 bits in 10 bytes; each byte contains eight bits. This is useful to know when shopping for a new computer or adding memory to a computer, as it allows an accurate calculation of the amount of storage space.


How many binary digits create one byte?

A binary digit is 1 bit. Four bits make 1 nibble, and 8 bits make 1 byte. (An obsolete computer type used 9 bits to a byte, but that is history, not modern practice.)


How many bits are in a byte in a computer?

There are 8 bits in a byte. People use a lowercase "b" to mean bits and an uppercase "B" to mean bytes. So if a hard drive holds 500 GB, that is 500 gigabytes; but if a network cable lists its speed as 10 Mbps, that means 10 megabits per second.
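To make the lowercase-b versus uppercase-B distinction concrete, here is a small Python sketch (my own example, using decimal prefixes and ignoring protocol overhead) that estimates how long the 500 GB drive above would take to fill over a 10 Mbps link.

BITS_PER_BYTE = 8

def transfer_seconds(size_gigabytes, speed_megabits_per_second):
    # Convert gigabytes (bytes) to bits, then divide by the link speed in bits per second.
    size_bits = size_gigabytes * 1_000_000_000 * BITS_PER_BYTE
    speed_bits_per_second = speed_megabits_per_second * 1_000_000
    return size_bits / speed_bits_per_second

print(transfer_seconds(500, 10) / 3600, "hours")  # roughly 111 hours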