Q: A set of bits that represent a single character?

A byte. A byte is a set of bits (typically eight) used to represent a single character.
Related questions

How many bits would be needed to represent a 45-character set?

To represent 45 distinct characters you need the codes 0 through 44. Since 2^5 = 32 < 45 <= 2^6 = 64, you need at least 6 bits.
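
For illustration, a minimal Java sketch of that calculation (the class name MinimumBits is just an example); it counts the bits needed to hold the largest code, 44:

    public class MinimumBits {
        public static void main(String[] args) {
            int symbols = 45;                  // size of the character set
            int maxCode = symbols - 1;         // codes run from 0 to 44 (binary 101100)
            // Bits needed to hold the largest code value.
            int bits = 32 - Integer.numberOfLeadingZeros(maxCode);
            System.out.println(bits);          // prints 6
        }
    }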


What is the need of unicode in java?

The idea is to have a single character set that can represent all the languages of the world, without the need to switch between different character encodings.
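
As a rough Java illustration of that idea (the sample text and class name are my own), one String can mix scripts freely because every char is a Unicode (UTF-16) code unit:

    public class MixedScripts {
        public static void main(String[] args) {
            // English, Greek and Cyrillic in one string, no encoding switch needed.
            String mixed = "Hi \u0393\u03B5\u03B9\u03B1 \u041F\u0440\u0438\u0432\u0435\u0442";
            for (char c : mixed.toCharArray()) {
                System.out.printf("%c -> U+%04X%n", c, (int) c);
            }
        }
    }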


What exactly is a byte?

A bit is the smallest data unit of modern binary computers; it represents either a 1 or a 0. Bits are grouped into sets of eight, known as bytes. A byte typically represents one piece of data, such as a single letter.
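
A minimal Java sketch of that relationship (the class name is just an example): a byte is 8 bits and comfortably holds one character code:

    public class ByteDemo {
        public static void main(String[] args) {
            System.out.println(Byte.SIZE);      // 8: a byte is a group of 8 bits
            byte letter = (byte) 'A';           // character code 65 fits in one byte
            System.out.println(letter);         // 65
            System.out.println((char) letter);  // A
        }
    }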


How do you represent 300 in 8 bits?

The binary representation of 300 is 100101100, which is 9 bits. Only the 8 bits counted from the LSB (00101100, decimal 44) fit into an 8-bit register; the ninth bit generates a carry, so the carry flag is set to 1.
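
A small Java sketch of the same truncation (names are mine); casting to byte keeps only the 8 least significant bits:

    public class Truncate300 {
        public static void main(String[] args) {
            int value = 300;                                   // 100101100 in binary (9 bits)
            System.out.println(Integer.toBinaryString(value)); // 100101100
            byte low8 = (byte) value;                          // keeps the 8 LSBs: 00101100
            System.out.println(low8);                          // 44
            boolean carry = value > 0xFF;                      // the 9th bit is the "carry"
            System.out.println(carry);                         // true
        }
    }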


How many characters in 32 bits?

If each character is 8 bits, then 32 bits hold 4 of them. ASCII is a 7-bit character set, but in most programming languages a char is 8 bits.
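
For example, a minimal Java sketch (the class name is hypothetical) packing four 8-bit character codes into one 32-bit int and unpacking them again:

    public class PackFourChars {
        public static void main(String[] args) {
            // Four 8-bit character codes in one 32-bit int.
            int packed = ('W' << 24) | ('i' << 16) | ('k' << 8) | 'i';
            for (int shift = 24; shift >= 0; shift -= 8) {
                char c = (char) ((packed >> shift) & 0xFF);
                System.out.print(c);           // prints "Wiki"
            }
            System.out.println();
        }
    }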


single number?

In Numerology, the nine single-digit numbers are the building blocks of the study of Numerology. These numbers in Numerology represent a specific set of character traits, symbolism (spiritual or otherwise), and meaning. In fact, it's almost like each of them has their own unique personality!


How much memory does it take to store the letter A?

You can store any of the 128 characters in the ASCII table using just 7 bits. The letter A has character code 65 (0x41) in all ASCII code pages. The code simply maps to the character's glyph in the current code page, so you're not actually storing the letter, only its code.

On most systems the smallest unit of storage is a byte, which is typically 8 bits long. The 8th bit is used to determine whether the character is in the standard ASCII character set (0 to 127) or the extended ASCII character set (128 to 255). Only the standard character set is guaranteed to be the same on all systems (the glyphs may vary in style but always represent the same character); the extended character set varies depending on which code page is current.

If using UNICODE wide characters, the character code will consume 2 or 4 bytes (on Windows it is always 2 bytes), but if using a multi-byte character encoding or standard ASCII it is always 1 byte.
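
A short Java sketch of those sizes (the class name is my own); it prints the code for 'A' and how many bytes it occupies under a few common encodings:

    import java.nio.charset.StandardCharsets;

    public class LetterA {
        public static void main(String[] args) {
            System.out.println((int) 'A');  // 65 (0x41): we store the code, not the glyph
            System.out.println("A".getBytes(StandardCharsets.US_ASCII).length); // 1 byte
            System.out.println("A".getBytes(StandardCharsets.UTF_8).length);    // 1 byte
            System.out.println("A".getBytes(StandardCharsets.UTF_16BE).length); // 2 bytes
        }
    }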


Are character literals stored as Unicode characters?

Character literals in Java are stored as UTF-16 Unicode characters. Each character takes up 16 bits of memory, allowing for representation of a wide range of characters in the Unicode character set.
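
A minimal Java sketch of this (the class name is just an example): a char is 16 bits, and characters outside the Basic Multilingual Plane take two chars (a surrogate pair):

    public class CharLiterals {
        public static void main(String[] args) {
            System.out.println(Character.SIZE);   // 16 bits per char
            char a = 'A';                          // character literal; UTF-16 code unit 65
            System.out.println(a == 65);           // true
            String emoji = "\uD83D\uDE00";         // U+1F600 needs a surrogate pair
            System.out.println(emoji.length());    // 2 UTF-16 code units
            System.out.println(emoji.codePointCount(0, emoji.length())); // 1 character
        }
    }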


How many characters can 1 bit do?

None. A single binary digit can only be in one of two possible states. What that state physically represents is merely an abstraction: on or off, yes or no, positive or negative, true or false, black or white, the digits 0 or 1 and the characters 'A' or 'B' are all valid abstractions for the two possible states that any one bit may hold. The computer has no notion of numbers, let alone characters; these are simply our interpretation of what the states represent.

The ASCII character set has 128 characters in total, which we can identify using 7 bits (yielding a decimal value in the range 0 through 127). The character 'A' is represented by the binary value 1000001 (65 decimal), while 'B' is 1000010 (66 decimal). Thus we could choose between these two characters by saying that if a particular bit is set we select character 65, otherwise we select character 66. However, the computer cannot do this alone -- it has no logic circuitry capable of deciding which of any two characters a single bit represents. We must program it, which requires a good deal more than just 1 bit. Thus a single bit cannot "do" any characters.
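
As a tiny Java illustration of that last point (names are mine): the bit by itself selects nothing; we must program the mapping from its two states onto two characters we have already chosen:

    public class OneBit {
        public static void main(String[] args) {
            int bit = 1;                              // the single bit: 0 or 1
            // The mapping below is our program's choice, not the bit's meaning.
            char selected = (bit == 1) ? 'A' : 'B';   // 65 if set, 66 otherwise
            System.out.println(selected);             // A
        }
    }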


How many bits are used to encode an ASCII character?

The ASCII character set has exactly 128 characters, so only 7 bits are required to represent each character as an integer in the range 0 to 127 (0x00 to 0x7F). If additional bits are available (most systems use at least an 8-bit byte), all the high-order bits must be zeroed.

ANSI is similar to ASCII but uses 8-bit encodings rather than 7-bit encodings. If bit 7 (the high-order bit of an 8-bit byte) is not set (0), the 8-bit encoding typically represents one of the 128 standard ASCII character codes (0-127). If set (1), it represents a character from the extended ASCII character set (128-255). To ensure correct interpretation of the encodings, most ANSI code pages are standardised to include the standard ASCII character set; however, the extended character set depends upon which ANSI code page was active during encoding, and the same code page must be used during decoding. ANSI typically caters for US/UK-English characters (using ASCII) along with foreign-language support, mostly European (Spanish, German, French, Italian).

Languages which require more characters than ANSI alone can provide must use a multi-byte encoding, such as fixed-width UNICODE or variable-width UTF-8. However, these encodings are standardised such that the first 128 characters (the standard ASCII character set) have the same 7-bit representation (with all high-order bits zeroed).
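
A brief Java sketch of that high-order-bit test (the class name is hypothetical): bit 7 clear means a standard ASCII code (0-127), bit 7 set means the extended range (128-255):

    public class AsciiRange {
        public static void main(String[] args) {
            int[] codes = {65, 200};                         // 'A' and an extended code
            for (int code : codes) {
                boolean standardAscii = (code & 0x80) == 0;  // test bit 7
                System.out.println(code + " is standard ASCII: " + standardAscii);
            }
        }
    }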


Write short note on ASCII and unicode?

In computer memory, characters are represented using a predefined character set. Historically, the 7-bit American Standard Code for Information Interchange (ASCII) code, the 8-bit American National Standards Institute (ANSI) code and the Extended Binary Coded Decimal Interchange Code (EBCDIC) were used. These coding schemes map selected characters to 7- or 8-bit binary codes, but they cannot represent all the characters of all languages in a uniform format.

At present, Unicode is used to represent characters in computer memory. Unicode provides a universal and efficient character representation and has therefore evolved into the modern character representation scheme. The Unicode scheme is maintained by a non-profit organization called the Unicode Consortium, and it is compatible with older coding schemes such as ASCII. Unicode encodings use 16 or 32 bits per character (or a variable number of bytes in UTF-8), and Unicode can represent characters from all the major languages currently in use across the world.
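
A small Java sketch of that ASCII compatibility (the class name is my own): for the first 128 characters, US-ASCII and UTF-8 produce identical bytes, while UTF-16 uses two bytes per character:

    import java.nio.charset.StandardCharsets;
    import java.util.Arrays;

    public class AsciiCompat {
        public static void main(String[] args) {
            String text = "Hello";                           // standard ASCII only
            byte[] ascii = text.getBytes(StandardCharsets.US_ASCII);
            byte[] utf8  = text.getBytes(StandardCharsets.UTF_8);
            System.out.println(Arrays.equals(ascii, utf8));  // true: same 7-bit codes
            byte[] utf16 = text.getBytes(StandardCharsets.UTF_16BE);
            System.out.println(utf16.length);                // 10: 16 bits per character
        }
    }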


What are character constants?

A character constant is a single character in the host's character set, such as 'A', 'a', '0', '%', etc. Note the use of the single quotes instead of double quotes. (Double quotes are used for string constants, not character constants.) A character constant maps to a specific int (integer) value, but assuming anything about that relationship is non-portable.
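
For illustration, a minimal Java sketch (the class name is my own). In Java the value of 'A' is fixed at 65 by Unicode; the portability caveat above applies to languages such as C, where the host character set is implementation-defined:

    public class CharConstant {
        public static void main(String[] args) {
            char c = 'A';              // a character constant uses single quotes
            int code = c;              // widens to its integer value
            System.out.println(code);  // 65
            String s = "A";            // a string constant uses double quotes
            System.out.println(s);
        }
    }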