45 in binary is 101101, which takes 6 bits. Likewise, distinguishing 45 different characters requires at least 6 bits, since 2^5 = 32 < 45 ≤ 64 = 2^6.
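The rule above can be sketched in Python; the `bits_needed` helper is a hypothetical illustration, not part of the original answer:

```python
def bits_needed(n: int) -> int:
    """Bits required to distinguish n distinct values: ceil(log2(n))."""
    # (n - 1).bit_length() equals ceil(log2(n)) for n >= 2
    return max(1, (n - 1).bit_length())

print(bits_needed(45))  # 6, since 2**5 = 32 < 45 <= 64 = 2**6
```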

Wiki User

15y ago


Related Questions

A set of bits that represent a single character?

A byte.


How many bits are needed to represent decimal 200?

8 bits if unsigned, 9 bits if signed


What is a byte?

A Byte is a collection of eight bits that can represent a single character.


How many bits are required to represent character in ascii?

7 bits for standard ASCII (codes 0–127); in practice each character is usually stored in an 8-bit byte.


How many bits are needed to represent decimal value ranging from 0 to 12500?

14 bits. There are 12,501 values from 0 to 12,500, and 2^13 = 8,192 < 12,501 ≤ 16,384 = 2^14.


How many hamming bits are required for a single EBCDIC character?

A single EBCDIC character occupies 8 bits. To support error detection and correction, extra parity bits known as Hamming bits are added. For 8 data bits, the rule 2^r ≥ m + r + 1 gives r = 4, so 4 Hamming bits are required, for a total of 12 bits. These Hamming bits allow single-bit errors introduced during transmission or storage to be detected and corrected.
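The parity-bit rule 2^r ≥ m + r + 1 can be checked with a short sketch (the `hamming_parity_bits` helper is hypothetical, written here for illustration):

```python
def hamming_parity_bits(m: int) -> int:
    """Smallest r satisfying 2**r >= m + r + 1 for m data bits."""
    r = 0
    while 2 ** r < m + r + 1:
        r += 1
    return r

print(hamming_parity_bits(8))  # 4 parity bits for an 8-bit EBCDIC character
```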


How many bits are required to represent the phrase recommended setting?

"recommended setting" contains 19 characters, including the space between the two words. Using the old convention of 1 byte per character, we need 19 × 8 = 152 bits. Using a 2-byte-per-character Unicode encoding such as UTF-16, which many modern systems use to accommodate all the world's languages, each character takes 2 bytes, so the count is 304 bits.
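The arithmetic above is easy to verify directly; this sketch just counts characters and multiplies by the assumed bits per character:

```python
phrase = "recommended setting"
print(len(phrase))       # 19 characters, including the space
print(len(phrase) * 8)   # 152 bits at one byte per character (e.g. ASCII)
print(len(phrase) * 16)  # 304 bits at two bytes per character (e.g. UTF-16)
```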


How many characters are in a nibble?

A nibble (also spelled nybble) can represent half a character: two nibbles are needed for one ASCII character. A nibble is made up of 4 bits, usually written as a single hexadecimal digit. 4 bits allow only 16 combinations; 8 bits allow 256. An ASCII character is therefore represented by two hexadecimal digits, which is the same as 8 bits or two nibbles.


How many binary bits are needed to represent decimal number 21?

5


How many bits are needed to represent the decimal number 200?

log2 200 = ln 200 ÷ ln 2 ≈ 7.64, so 8 bits are needed. If a signed number is being stored, then 9 bits are needed, as one bit indicates the sign.


how many bits are needed to represent decimal values ranging from 0 to 12,500?

14 bits. There are 12,501 values from 0 to 12,500, and 2^13 = 8,192 < 12,501 ≤ 16,384 = 2^14.


How many bits to represent twenty-six?

Twenty-six (26) is 11010 in binary and therefore requires 5 bits to represent.