
Bits (all set to 1) -- Decimal value
1 -- 1
11 -- 3
111 -- 7
1111 -- 15
11111 -- 31
111111 -- 63
1111111 -- 127
11111111 -- 255
111111111 -- 511
1111111111 -- 1023

Therefore, 10 bits would be more than adequate (unless you also had to represent both capital and lower-case letters, in which case you would need one more bit).
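
To make the table concrete, here is a minimal Python sketch (my own illustration, not part of the original answer) that reproduces it and computes the minimum number of bits for a given unsigned value using int.bit_length():

    def max_value(n_bits):
        # Largest value representable with n_bits bits (all bits set to 1).
        return (1 << n_bits) - 1

    def bits_needed(value):
        # Smallest number of bits that can hold `value` as an unsigned integer.
        return max(1, value.bit_length())

    # Reproduce the table above: 1 -- 1, 11 -- 3, ..., 1111111111 -- 1023
    for n in range(1, 11):
        print(f"{'1' * n} -- {max_value(n)}")

    print(bits_needed(1023))  # 10
    print(bits_needed(26))    # 5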


Related Questions

How many bits will it take to represent the upper case and lower case alphabet?

Each letter of the alphabet, whether upper case or lower case, can be represented with 7 bits.


How many bits are needed to represent decimal 200?

8 bits if unsigned, 9 bits if signed
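
A quick way to check this (a sketch of my own, not part of the original answer) is Python's int.bit_length(), which gives the unsigned width; a signed two's-complement representation of a positive number needs one extra bit for the sign:

    n = 200
    unsigned_bits = n.bit_length()      # 8, since 200 = 11001000 in binary
    signed_bits = unsigned_bits + 1     # 9, one extra bit for the sign
    print(bin(n), unsigned_bits, signed_bits)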


How many bytes make up each letter in the alphabet?

ASCII = 7 bits; Unicode = 16 bits; UTF-8 = 8 bits.


How many bits to represent twenty-six?

26 can be represented in binary as 11010 and would therefore require 5 bits to represent.


How many bits are need to represent colors?

Most modern digital cameras use 24 bits (8 bits per primary) to represent a color. But more or less can be used, depending on the quality desired. Many early computer graphics cards used only 4 bits to represent a color.


How many bits are in the letter l?

8 bits


How many bits does it take to represent 40 billion?

36. 2^35 (about 34.4 billion) is too small, while 2^36 (about 68.7 billion) is enough to cover 40 billion.


8 bits can represent how many different characters?

There are 256 possible values (or characters) in 8 bits.


How many binary bits are necessary to represent 748 different numbers?

10 bits. 2^9 = 512 is too few to give each of the 748 numbers its own code, while 2^10 = 1,024 is enough.


How many bits are needed to represent decimal values ranging from 0 to 12,500?

14 bits. The range 0 to 12,500 contains 12,501 distinct values; 2^13 = 8,192 is too few, while 2^14 = 16,384 is enough.
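
A quick check of that count (a minimal Python sketch of my own, not from the original answer): the number of bits needed to cover a range 0..max is simply the bit length of the maximum value:

    import math

    max_val = 12500
    bits = max_val.bit_length()                       # 14
    assert bits == math.ceil(math.log2(max_val + 1))  # same answer via logarithms
    print(bits)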


How many bits are used to represent a letter?

Well, honey, it depends on the encoding scheme you're using. In good ol' ASCII, it's 7 bits for the basic characters, but if you're feeling fancy with Unicode, it can go up to 32 bits for those special characters. So, long story short, it's anywhere from 7 to 32 bits, sugar.


How many bits are needed to represent the decimal number 200?

8 (200 is 11001000 in binary).