That depends on the character code used.
To represent 45 distinct characters you need at least 6 bits, since 2^5 = 32 is too few and 2^6 = 64 is enough.
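A quick way to check this kind of count (a minimal Python 3 sketch; the rule is ceil(log2(n)) for n distinct symbols):

    import math
    print(math.ceil(math.log2(45)))  # 6 bits for 45 distinct characters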
Bytes
8 bits if unsigned, 9 bits if signed
A byte is a collection of eight bits that can represent a single character.
4
How many bits are needed to represent decimal values ranging from 0 to 12,500?
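Applying the same rule used elsewhere on this page: 0 to 12,500 is 12,501 distinct values, and 2^13 = 8,192 is too few while 2^14 = 16,384 is enough, so 14 bits are needed. A quick Python 3 check:

    import math
    print(math.ceil(math.log2(12501)))  # 14 bits for values 0..12,500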
A single EBCDIC character takes 8 bits. To allow errors to be detected and corrected, extra parity bits known as Hamming bits are added; for 8 data bits, 4 Hamming bits are needed (the smallest r with 2^r >= 8 + r + 1), giving 12 bits in total to represent the character. These Hamming bits make it possible to detect and correct errors that may occur during transmission or storage of the data.
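A short sketch of that check-bit calculation in Python 3 (assuming a standard single-error-correcting Hamming code):

    def hamming_check_bits(m):
        # Smallest r such that 2**r >= m + r + 1, where m is the number of data bits
        r = 0
        while 2 ** r < m + r + 1:
            r += 1
        return r

    print(hamming_check_bits(8))  # 4 check bits + 8 data bits = 12 bits total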
"recommended setting" There are 19 characters including the space between the two words. If the old convention of using 1 byte to represent a character, then we would need (19 x 8) which is 152 bits. If we use unicode as most modern computers use (to accommodate all the languages in the world) then 2 bytes will represent each character and so the number of bits would be 304.
A nibble (also known as a nybble or nyble) can represent half a character: two nibbles are needed for a valid ASCII character. A nibble is made up of 4 bits, and those 4 bits are usually written as a single hexadecimal digit. 4 bits only allow for 16 combinations; 8 bits allow for 256. An ASCII character is represented by two hexadecimal digits, which is the same as 8 bits or two nibbles.
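Splitting a byte into its two nibbles, sketched in Python 3 (the character 'A' is just an example value):

    b = ord("A")                 # 0x41
    high, low = b >> 4, b & 0xF  # the two nibbles
    print(hex(high), hex(low))   # 0x4 0x1 -> the two hex digits of 'A'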
5
log2 200 = ln 200 ÷ ln 2 ≈ 7.64, so 8 bits are needed. If a signed number is being stored, then 9 bits would be needed, as one extra bit is used to indicate the sign of the number.
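In Python 3 the same result falls out of either the logarithm or int.bit_length():

    import math
    print(math.ceil(math.log2(200)))  # 8 bits, unsigned
    print((200).bit_length())         # also 8; add 1 more bit for the sign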
1200