Q: What is the size in bits of a unicode character?
Related questions

Are character literals stored as Unicode characters?

Character literals in Java are stored as UTF-16 code units. Each character takes up 16 bits of memory, allowing a single literal to represent a wide range of characters from the Unicode character set.
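
A minimal Java sketch illustrating this; the printed values come from the standard constants on java.lang.Character:

```java
public class CharSize {
    public static void main(String[] args) {
        char c = 'A'; // a character literal, stored as a single UTF-16 code unit

        System.out.println(Character.SIZE);  // 16 (bits per char)
        System.out.println(Character.BYTES); // 2  (bytes per char)
        System.out.println((int) c);         // 65, the Unicode code point of 'A'
    }
}
```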


How many bits are in a Unicode character?

It depends on which Unicode encoding you mean. The one you will typically see is UTF-8, which uses one to four bytes per character (the multi-byte sequences cover characters from the many scripts that are not already covered by the ASCII range). Otherwise, by convention "Unicode" usually refers to UTF-16, which uses 16 bits per character (or 32 bits, for characters outside the Basic Multilingual Plane).
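
A quick way to see UTF-8's variable width is to count the encoded bytes of a few sample characters; a minimal Java sketch (the sample strings are chosen purely for illustration):

```java
import java.nio.charset.StandardCharsets;

public class Utf8Lengths {
    public static void main(String[] args) {
        // ASCII needs 1 byte in UTF-8; other scripts and symbols need 2 to 4 bytes.
        String[] samples = {"A", "é", "€", "😀"};
        for (String s : samples) {
            int bytes = s.getBytes(StandardCharsets.UTF_8).length;
            System.out.println(s + " -> " + bytes + " byte(s)");
        }
    }
}
```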


What size of space in a computer is used to hold a character?

1 byte (8 bits) in single-byte character sets such as ASCII or EBCDIC; Unicode encodings use 1 to 4 bytes per character, depending on the encoding (UTF-8, UTF-16, or UTF-32).


How many bits would be needed to represent 64 characters?

That depends on the character code used:
Baudot: 5 bits per character, 320 bits
FIELDATA: 6 bits per character, 384 bits
BCDIC: 6 bits per character, 384 bits
ASCII: 7 bits per character, 448 bits
Extended ASCII: 8 bits per character, 512 bits
EBCDIC: 8 bits per character, 512 bits
Univac 1100 ASCII: 9 bits per character, 576 bits
Unicode UTF-8: variable bits per character, depends on the characters in the text
Unicode UTF-32: 32 bits per character, 2048 bits
Huffman coding: variable bits per character, depends on the characters in the text
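
The fixed-width totals above are just 64 multiplied by the bits per character; a small Java sketch reproducing them (the variable-width schemes are omitted since they depend on the actual text):

```java
public class BitsFor64Chars {
    public static void main(String[] args) {
        int characters = 64;
        // Widths from the list above: Baudot, FIELDATA/BCDIC, ASCII,
        // extended ASCII/EBCDIC, Univac 1100 ASCII, and UTF-32.
        int[] bitsPerChar = {5, 6, 7, 8, 9, 32};
        for (int bits : bitsPerChar) {
            System.out.println(bits + " bits/char -> " + (characters * bits) + " bits total");
        }
    }
}
```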


What is the unicode symbol E057?

U+E057 lies in Unicode's Private Use Area (U+E000 to U+F8FF), so it has no standardized meaning or glyph; what it displays as depends entirely on the font or application that assigns it one.


Write short note on ASCII and unicode?

In computer memory, characters are represented using a predefined character set. Historically, the 7-bit American Standard Code for Information Interchange (ASCII) code, the 8-bit American National Standards Institute (ANSI) code, and the Extended Binary Coded Decimal Interchange Code (EBCDIC) were used. These coding schemes map a selected set of characters to 7- or 8-bit binary codes, and they cannot represent all the characters of all languages in a uniform format. At present, Unicode is used to represent characters in computer memory. Unicode provides a universal and efficient character representation and has therefore evolved into the modern character-representation scheme. The Unicode standard is maintained by a non-profit organization called the Unicode Consortium. Unicode is also backward compatible with older coding schemes such as ASCII. Its encodings use between 8 and 32 bits per character, and Unicode can represent characters from all the major languages currently in use across the world.
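
The ASCII compatibility mentioned above comes from the fact that the first 128 Unicode code points are identical to 7-bit ASCII; a minimal Java sketch showing this:

```java
public class AsciiUnicode {
    public static void main(String[] args) {
        // For the first 128 code points, the Unicode value equals the ASCII value.
        for (char c : "Hi!".toCharArray()) {
            System.out.printf("%c = U+%04X = ASCII %d%n", c, (int) c, (int) c);
        }
    }
}
```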


How many bits does a Unicode character require?

It depends on the encoding. UTF-8 uses one byte (8 bits) to encode English (ASCII) characters, two or three bytes to encode most other commonly used characters, and four bytes (32 bits) to encode the rest. UTF-16 uses two or four bytes per character, and UTF-32 always uses four.
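
A minimal Java sketch comparing the three encodings for a single character; note that UTF-32 is not listed in StandardCharsets, so it is looked up by name (standard JDKs ship it):

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class EncodingSizes {
    public static void main(String[] args) {
        String s = "€"; // U+20AC
        System.out.println("UTF-8 : " + s.getBytes(StandardCharsets.UTF_8).length + " bytes");      // 3
        System.out.println("UTF-16: " + s.getBytes(StandardCharsets.UTF_16BE).length + " bytes");   // 2
        System.out.println("UTF-32: " + s.getBytes(Charset.forName("UTF-32BE")).length + " bytes"); // 4
    }
}
```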


What is the size of one character in bits?

A character cannot fit in a single bit. In the common single-byte encodings (ASCII, extended ASCII, EBCDIC), 8 bits equal 1 character; Unicode encodings use between 8 and 32 bits per character.


What code uses 16 bits and provides codes for 65000 characters?

Unicode, in its 16-bit encoding form (UCS-2/UTF-16): 16 bits provide 2^16 = 65,536 codes.


How many bits does a Java char contain?

16 bits. Java char values (and the characters inside Java String values) are UTF-16 code units of the Unicode character set.
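
One consequence is that a Unicode character outside the Basic Multilingual Plane does not fit in a single char; a minimal Java sketch (the emoji is just an example character above U+FFFF):

```java
public class CharVsCodePoint {
    public static void main(String[] args) {
        String s = "😀"; // U+1F600, outside the Basic Multilingual Plane

        System.out.println(s.length());                      // 2: stored as a UTF-16 surrogate pair
        System.out.println(s.codePointCount(0, s.length())); // 1: one actual Unicode character
    }
}
```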


What is the difference between binary and character streams?

A binary stream reads raw data 8 bits (one byte) at a time, irrespective of encoding; a character stream reads bytes and decodes them into characters (in Java, two-byte UTF-16 values) according to the Unicode standard and the local character encoding. Binary streams are better for reading from sockets; character streams are better for reading text input from a client.
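
A minimal Java sketch of the difference, using an in-memory byte array in place of a socket or client; the byte and character counts differ because "é" takes two bytes in UTF-8:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.Reader;
import java.nio.charset.StandardCharsets;

public class StreamsDemo {
    public static void main(String[] args) throws IOException {
        byte[] data = "héllo".getBytes(StandardCharsets.UTF_8);

        // Binary stream: returns raw bytes, no decoding.
        try (InputStream in = new ByteArrayInputStream(data)) {
            System.out.println("bytes read: " + in.readAllBytes().length); // 6
        }

        // Character stream: decodes bytes into chars using a charset.
        try (Reader r = new InputStreamReader(new ByteArrayInputStream(data), StandardCharsets.UTF_8)) {
            int count = 0;
            while (r.read() != -1) {
                count++;
            }
            System.out.println("chars read: " + count); // 5
        }
    }
}
```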


Why does a character in Java take twice as much space to store as a character in C?

Different languages use different sizes for their types, for different reasons. In this case, the difference is between ASCII and Unicode: Java uses 2 bytes to store a character as a UTF-16 Unicode value, so as to allow a wider variety of characters in strings, whereas C, at least by default, uses only 1 byte per character.
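
A minimal Java sketch making the point; the code point of the Greek letter pi is 960, which would not fit in C's default single-byte char but fits comfortably in Java's 16-bit char:

```java
public class WideChar {
    public static void main(String[] args) {
        char c = '\u03C0'; // the Greek letter pi

        System.out.println((int) c);         // 960: too large for an 8-bit C char
        System.out.println(Character.BYTES); // 2: bytes per char in Java
    }
}
```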