Q: How many characters can a 32-bit code represent?
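An n-bit code has 2^n distinct bit patterns, so a 32-bit code can represent up to 2^32 = 4,294,967,296 different characters, assuming each character is assigned exactly one pattern. That is far more than enough for every code point Unicode defines.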
Related questions

A 10-bit code could represent how many characters?

2^10 = 1024, so there are 1024 different bit configurations in a 10-bit code.
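The same rule applies at any width. A quick Python sketch (illustrative, not part of the original answer) prints the number of distinct codes for several common widths:

    # An n-bit code has 2**n distinct bit patterns.
    for n in (1, 7, 8, 10, 16, 32):
        print(f"{n:>2}-bit code: {2**n:,} distinct values")

This prints 1,024 for the 10-bit case and 4,294,967,296 for 32 bits.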


Write a short note on ASCII and Unicode.

In computer memory, characters are represented using a predefined character set. Historically, the 7-bit American Standard Code for Information Interchange (ASCII), 8-bit American National Standards Institute (ANSI) codes, and the Extended Binary Coded Decimal Interchange Code (EBCDIC) were used. These schemes map a selected set of characters to 7- or 8-bit binary codes, so they cannot represent all the characters of all languages in a uniform format. At present, Unicode is used to represent characters in computer memory. Unicode provides a universal and efficient character representation and has evolved into the modern character representation scheme. The Unicode standard is maintained by a non-profit organization called the Unicode Consortium, and it is compatible with older coding schemes such as ASCII. Its encodings use 8, 16, or 32 bits per code unit (UTF-8, UTF-16, and UTF-32), and Unicode can represent characters from all the major languages currently in use across the world.
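A small Python sketch (an illustration, not part of the original note) shows how the three Unicode encodings store an ASCII character versus a non-Latin one, and that UTF-8 is byte-for-byte compatible with ASCII:

    # Byte lengths of 'A' (ASCII) and '中' (CJK) in the common Unicode encodings.
    for ch in ("A", "中"):
        for enc in ("utf-8", "utf-16-be", "utf-32-be"):
            data = ch.encode(enc)
            print(f"{ch!r} in {enc}: {len(data)} byte(s) -> {data.hex()}")

    # For ASCII characters, the UTF-8 bytes are identical to the ASCII code.
    assert "A".encode("utf-8") == "A".encode("ascii")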


What is a simple character set that can represent 128 different characters?

The ASCII character set is a simple character set that can represent 128 different characters, including letters, numbers, punctuation marks, and control characters. Each character is represented by a 7-bit code, allowing for 128 unique combinations.
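To make the 7-bit limit concrete, here is a short Python count (illustrative only) of the control and printable characters among the 2^7 = 128 ASCII codes:

    # ASCII codes 0-31 and 127 are control characters; 32-126 are printable.
    control = [c for c in range(128) if c < 32 or c == 127]
    printable = [c for c in range(128) if 32 <= c <= 126]
    print(len(control), "control +", len(printable), "printable =",
          len(control) + len(printable), "characters")  # 33 + 95 = 128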


Which code uses a 7-bit binary code to represent each character?

ASCII uses a 7-bit binary code to represent each character.


Characters use only one byte; why does Java use two bytes for a character?

The number of bytes used by a character varies from language to language. Java uses a 16-bit (two-byte) character so that it can represent many non-Latin characters in the Unicode character set.
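As an illustration (in Python rather than Java), UTF-16, the encoding behind Java's 16-bit char, stores most characters in a single 16-bit code unit, while characters outside the Basic Multilingual Plane need two code units (a surrogate pair):

    # One UTF-16 code unit (2 bytes) covers Latin and most non-Latin characters;
    # characters above U+FFFF take two code units (4 bytes).
    for ch in ("A", "é", "中", "😀"):
        units = len(ch.encode("utf-16-be")) // 2
        print(f"{ch!r} (U+{ord(ch):04X}): {units} UTF-16 code unit(s)")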


What is the largest positive number one can represent in an eight-bit two's complement code?

127. An eight-bit two's complement number uses the top bit as the sign bit, leaving seven bits for the magnitude, so the largest positive value is 2^7 - 1 = 127 (binary 0111 1111).
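A small helper function (hypothetical, for illustration) computes the representable range for any two's complement width:

    def twos_complement_range(bits: int) -> tuple[int, int]:
        # (min, max) representable in `bits`-bit two's complement.
        return -(1 << (bits - 1)), (1 << (bits - 1)) - 1

    print(twos_complement_range(8))  # (-128, 127)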


Which of the following is true about 1 bit: (a) it can represent the decimal values 0 and 9; (b) it can be used to represent one character in the lowercase English alphabet; (c) it represents one binary digit; (d) four binary?

The true statement is that 1 bit represents one binary digit, which can hold the value 0 or 1. A single bit cannot represent the decimal values 0 through 9 (that takes at least 4 bits), nor one character of the lowercase English alphabet (26 letters need at least 5 bits).


The code established so that all computers used the same binary code is called?

ASCII, the American Standard Code for Information Interchange, is the most commonly used way to represent character and numerical information in a seven-bit binary format, for values from 0 to 127. Most modern computer systems tend to use ASCII values of 128 and above for extended character sets. EBCDIC, the Extended Binary Coded Decimal Interchange Code, is an eight-bit binary format used by various IBM mainframe operating systems.


Is it true that each 1 and 0 in the binary system represents 1 bit of computer code?

Yes, 1 bit can represent either on ("1") or off ("0").


In a Hamming code you have 4 data bits. How many extra bits are required for the code?

You need three extra parity bits for 4 data bits in a Hamming code. The number of parity bits r must satisfy 2^r >= m + r + 1 for m data bits; with m = 4, the smallest such r is 3 (2^3 = 8 >= 4 + 3 + 1), giving the (7,4) Hamming code.
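A short Python sketch (illustrative) finds the minimum number of parity bits for any number of data bits using that inequality:

    def hamming_parity_bits(m: int) -> int:
        # Smallest r with 2**r >= m + r + 1 (the Hamming bound for m data bits).
        r = 0
        while 2 ** r < m + r + 1:
            r += 1
        return r

    print(hamming_parity_bits(4))  # 3 -> the (7,4) Hamming code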


How many different values can 1 bit represent?

Two: '0' or '1'


How much memory does it take to store the letter A?

You can store any of the 128 characters in the ASCII table using just 7 bits. The letter A has character code 65 (0x41) in all ASCII code pages. The code simply maps to the character's glyph in the current code page, so you are not actually storing the letter, only its code.

On most systems, the smallest unit of storage is a byte, which is typically 8 bits long. The 8th bit is used to determine whether the character is in the standard ASCII character set (0 to 127) or the extended ASCII character set (128 to 255). Only the standard character set is guaranteed to be the same on all systems (the glyphs may vary in style but always represent the same character); the extended character set varies depending on which code page is current.

If using Unicode wide characters, the character code will consume 2 or 4 bytes; on Windows, it is always 2 bytes. But if using a multi-byte character encoding or standard ASCII, the letter A always takes 1 byte.
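These sizes are easy to verify. A quick Python check (illustrative, not part of the original answer):

    # 'A' is code 65 (0x41); its stored size depends on the encoding, not the letter.
    print(ord("A"), hex(ord("A")))       # 65 0x41
    print(len("A".encode("ascii")))      # 1 byte
    print(len("A".encode("utf-16-le")))  # 2 bytes (Windows wide character)
    print(len("A".encode("utf-32-le")))  # 4 bytes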