Best Answer

In Windows:

1. Go to My Computer.

2. Right-click the drive or removable storage device whose capacity you want to check.

3. Click Properties.

4. The Properties window shows the drive's Free Space, Used Space, and Capacity in bytes.

5. You probably mean bytes, not bits, but if you really do mean bits, multiply the number of bytes by 8 (see the sketch just after these steps).
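
If you would rather script the same check than click through Properties, here is a minimal Python sketch of the idea; the drive path "C:\\" is only an example, and it relies on the standard-library call shutil.disk_usage:

    import shutil

    # Query total, used, and free space for a drive; all values come back in bytes.
    usage = shutil.disk_usage("C:\\")            # example path; any drive or folder works
    print("Capacity:  ", usage.total, "bytes =", usage.total * 8, "bits")
    print("Used space:", usage.used,  "bytes =", usage.used * 8,  "bits")
    print("Free space:", usage.free,  "bytes =", usage.free * 8,  "bits")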

For the difference between bytes and bits, read on below; you'll pick up some extra detail along the way.

Based on 1000 KB = 1 MB (and so on):      Based on 1024 KB = 1 MB (and so on):

4 Bits = 1 Nibble                         4 Bits = 1 Nibble
8 Bits = 1 Byte                           8 Bits = 1 Byte
1,000 Bytes = 1 KB                        1,024 Bytes = 1 KB
1,000 KB = 1 MB                           1,024 KB = 1 MB
1,000 MB = 1 GB                           1,024 MB = 1 GB
1,000 GB = 1 TB                           1,024 GB = 1 TB

~and the list goes on~
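
To make the two conventions in the list above concrete, here is a short Python sketch; the helper names are made up just for this example:

    # Convert a byte count to megabytes under each convention.
    def mb_decimal(byte_count):
        # 1 MB = 1000 KB = 1,000,000 bytes
        return byte_count / (1000 * 1000)

    def mb_binary(byte_count):
        # 1 MB = 1024 KB = 1,048,576 bytes
        return byte_count / (1024 * 1024)

    size = 8_000_000_000                 # an 8 GB (decimal) flash drive, in bytes
    print(mb_decimal(size))              # 8000.0 MB by the 1000-based count
    print(mb_binary(size))               # about 7629.4 MB by the 1024-based count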

Don't confuse bits with bytes: for example, 8 Kilobits (Kb) = 1 Kilobyte (KB), and likewise for Megabits (Mb), Gigabits (Gb), and so on.

Most people treat 1 MB as 1000 KB, but the reason 1 MB = 1024 KB exists is the way computer memory is organized in powers of two. That's why you see flash drives that are usually

256 MB, 512 MB, 1024 MB (1 GB), 2 GB, 4 GB, 8 GB, 16 GB, and rarely 32 GB.

Those sizes simply continue the powers-of-two pattern shown here:

1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048, 4096 ...
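
That sequence is just successive powers of two, which a Python one-liner can reproduce:

    # The powers-of-two sequence behind common memory and flash-drive sizes.
    print([2 ** n for n in range(13)])   # 1, 2, 4, 8, ..., 2048, 4096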

I hope this wasn't too confusing, but it should answer future questions and give you a better understanding.

P.S. For future conversions: if you're converting on the basis of 1000 KB = 1 MB, all you have to do is move the decimal point; if you're converting on the basis of 1024 KB = 1 MB, the Google calculator will give you the correct answer. Use it by typing something along the lines of "KB in MB" into the search bar, but remember to keep the letters capitalized, otherwise it will be confused with Kilobits/Gigabits.

Oh, and remember: almost all ISPs (Internet Service Providers such as Charter and Qwest) advertise in Megabits per second, not Megabytes per second, so to find out how many Megabytes per second they are advertising, just divide by 8.
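
For example (the advertised speed here is hypothetical), a 16 Megabit-per-second plan works out to 2 Megabytes per second:

    advertised_mbps = 16                        # hypothetical advertised speed, in megabits per second
    megabytes_per_second = advertised_mbps / 8  # 8 bits per byte
    print(megabytes_per_second)                 # 2.0 MB per second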

Q: How do you figure out how many bits your computer has?
Continue Learning about General History

How many bits were in IBM's first computer?

The IBM 701, their first computer available for sale, had 36 bits per word. This word size was used on all their 700 and 7000 series binary scientific computers.


In computer terms what is a nibble?

In computer terms a nibble = 4 bits = 1/2 byte. You can further define the data segments as: crumb = 2 bits, nibble = 4 bits, byte = 8 bits, word = 16 bits, double word = 32 bits; the jury is still out on a name for 64 bits (and the "sentence"). In keeping with the spelling of "byte", the lick and nibble are sometimes spelled "lyck" and "nybble".

A nibble is half a byte, but believe it or not, a byte does not necessarily have to have eight bits. No current computer platform uses anything but 8-bit bytes, but in computer-science terms a byte is generally the smallest addressable element, and the size needed to store a character. You also might be surprised to know that not all machines use ASCII (eight-bit) characters. IBM mainframes still use EBCDIC under their traditional operating systems; that stands for Extended Binary Coded Decimal Interchange Code, which accounted for the lion's share of data until a few decades ago. It is an extended version of BCD, which uses 4 bits to express numbers, and there is no technical reason a BCD-based machine couldn't have 4-bit bytes. It's unlikely that you will ever encounter a computer that doesn't use eight-bit bytes, but you may encounter people who studied computer science back in the 1970s.

Back in the "old" days (the 1960s), when computers didn't have operating systems or high-level programming languages, you always dealt with the byte. On some machines the byte was 8 bits; on others it was 8 bits plus a parity bit, for 9 bits. There was "even" parity and "odd" parity, meaning the parity bit was set so that the total number of 1 bits (including the parity bit) came out even, or so that it came out odd.

The "word" was originally set to the size of a register (everything was done through a set of registers). The registers were used to assemble the current instruction you wanted the computer to execute: what kind of action (move a byte, add, subtract, etc.), the length of your data (which determined how many cycles the computer had to go through to execute the instruction), and where the data was coming from and going to. The word length was pegged to the length of the register, meaning that, treating the computer like a book, each register held one word. Since the first such computers were byte oriented, a word was 8 bits. When 16-bit registers were implemented, words became 16 bits, then 32 bits, and now 64 bits; some computers today even have 128-bit words. So a "word" is the length of the registers in whatever computer you are using, and it is also the biggest chunk of bits the computer can process at one time.

The word "nibble" was invented to refer to the high-order 4 bits of a byte or the low-order 4 bits of a byte (like taking a nibble from a cookie instead of eating the whole cookie). Since a decimal digit can be stored in 4 bits, you only needed a nibble to store a digit, so if a field was all numbers you could write it out in nibbles, using half the space it would take in bytes. Back in those days, space counted: the first "mainframe" computers had 4K of memory (no, that really is 4K), so you didn't have any space to waste if you were doing something like payroll or inventory management.

In some cases, individual bits within bytes are used to store flags (yes or no for a given attribute) and, in at least one IBM manual, these were referred to as tidbits. IBM was not known for a sense of humor, but the term never became a generally accepted abbreviation.
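
As a small illustration of the even/odd parity idea mentioned above, here is a sketch (not tied to any particular machine) that computes an even-parity bit for one 8-bit byte:

    # Compute an even-parity bit: the parity bit is chosen so that the total
    # number of 1 bits, including the parity bit itself, comes out even.
    def even_parity_bit(byte_value):
        ones = bin(byte_value & 0xFF).count("1")
        return 0 if ones % 2 == 0 else 1

    print(even_parity_bit(0b10110010))   # 4 ones in the data -> parity bit 0
    print(even_parity_bit(0b10110011))   # 5 ones in the data -> parity bit 1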


What is byte addressability?

Early computers were placed in two categories, allowing them to be optimized to their users' needs:

Scientific - these computers had large fixed word sizes (e.g. 24 bits, 36 bits, 40 bits, 48 bits, 60 bits) and their memory could generally only be addressed to the word; no smaller entity could be addressed.

Business - these computers addressed memory by characters (e.g. 6 bits); if they supported the concept of words at all, the machine usually had a variable word length that the programmer could specify in some way according to the needs of the program. Their memory was addressed to the character.

This was true for both first- and second-generation computers, but in the third generation computer manufacturers decided to unify the two categories to reduce the number of different architectures they had to support. IBM, with the introduction of the System/360 in 1964, introduced the concept of the byte (8 bits) as an independently addressable part of a larger fixed word (32 bits). Other computer manufacturers soon followed this practice.


When was the first IBM home computer invented?

The IBM 701 vacuum tube computer was announced May 21, 1952. It had 2K words, each 36 bits long (~9K bytes), of Williams tube CRT memory.


Who described many elements of the modern digital computer?

Related questions

What was the first microprocessor to make it into a home computer, and how many bits could it process at one time?

The microprocessor used in the first home computer was the Intel 8080. It could handle 8 bits at a time.


How many multiplexers are there in a bus in a computer?

32 multiplexers, each 16 bits in size.


How many bits are in flabbergasted?

A "flabbergasted" is not a measurement of computer memory. 8 bits are in a byte, 1000 bytes are in a kilobyte, 1000 kilobytes are in a megabyte, Etc.


Which figure has 0 edges?

A blob with no sharp bits.


Is the number of bits in a word the same in every computer?

No. Computers have been built with anywhere from 1 bit to 72 bits in a word, and architectures have been proposed with as many as 256 bits in a word.



What is a bit word?

8 bits, 16 bits, 32 bits, 64 bits, and 128 bits imply a broadside (parallel) output of that many bits of digital information on a bus. Those bits together represent one word. Therefore, the longer the word, the more information can be processed at a time; more bits generally means a faster computer or faster data flow.


How many bits are needed to represent colors?

Most modern digital cameras use 24 bits (8 bits per primary) to represent a color, but more or fewer can be used depending on the quality desired. Many early computer graphics cards used only 4 bits to represent a color.
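
Since each extra bit doubles the number of values, the count of distinct colors at a given bit depth is 2 raised to that number of bits; a quick Python sketch:

    # Distinct colors representable at a few common bit depths.
    for bits in (4, 8, 16, 24):
        print(bits, "bits ->", 2 ** bits, "colors")
    # 4 bits -> 16 colors ... 24 bits -> 16,777,216 colors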


What are the bits of a computer system?

0 and 1


What is a set of computer bits called?

byte


What is a byte and how many bits does it contain?

A byte is the basic unit of information in a computer. It is usually the smallest addressable object in memory. The size of a byte depends on the design of the computer, but it has come to mean 8 bits nearly universally. ("Octet" is the more precise term for exactly 8 bits, in case there is any ambiguity.)


How many bits are in ten bytes?

There are 80 bits in 10 bytes, since each byte contains eight bits. This is useful to know when shopping for a new computer or adding memory to a computer, as it allows an accurate calculation of the amount of storage space.