Q: How many bits are in a Unicode character?

Best Answer

It depends on which Unicode encoding you mean. The one you will most often see is UTF-8, which uses one to four bytes per character; the multi-byte sequences cover characters outside the ASCII range, such as those used in other languages. Otherwise, by convention "Unicode" usually means UTF-16, which stores a character in one or two 16-bit code units.
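
As a quick illustration (a minimal Java sketch added here, not part of the original answer), you can measure how many bytes each encoding actually uses per character:

```java
import java.nio.charset.StandardCharsets;

public class EncodedSizes {
    public static void main(String[] args) {
        // Sample characters: ASCII letter, accented Latin letter, CJK ideograph, emoji.
        String[] samples = { "A", "é", "中", "😀" };
        for (String s : samples) {
            // UTF-8 uses 1-4 bytes per character; UTF-16BE uses 2 or 4 (no BOM).
            int utf8 = s.getBytes(StandardCharsets.UTF_8).length;
            int utf16 = s.getBytes(StandardCharsets.UTF_16BE).length;
            System.out.printf("%s -> UTF-8: %d bytes, UTF-16: %d bytes, code points: %d%n",
                    s, utf8, utf16, s.codePointCount(0, s.length()));
        }
    }
}
```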

More answers

Unicode, in its common UTF-16 form, uses 16 bits per code unit.

16 bits

Related questions

What is the size in bits of a Unicode character?

It depends on the encoding: 8, 16, or 32 bits per code unit for UTF-8, UTF-16, and UTF-32 respectively, with a single character taking one to four UTF-8 code units or one to two UTF-16 code units.


Are character literals stored as Unicode characters?

Character literals in Java are stored as UTF-16 Unicode characters. Each char takes up 16 bits of memory, which covers the Basic Multilingual Plane; characters outside it are represented by a pair of chars (a surrogate pair).
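
A small Java sketch (added as an illustration, not part of the original answer) shows that a char literal is one 16-bit UTF-16 code unit, and that characters outside the Basic Multilingual Plane need two chars, a surrogate pair:

```java
public class CharLiterals {
    public static void main(String[] args) {
        char c = 'ä';                       // one 16-bit UTF-16 code unit
        System.out.println(Character.SIZE); // 16 (bits in a Java char)
        System.out.println((int) c);        // 228, the Unicode code point U+00E4

        // Characters outside the BMP (e.g. U+1F600) do not fit in one char:
        String emoji = new String(Character.toChars(0x1F600));
        System.out.println(emoji.length());                          // 2 chars (surrogate pair)
        System.out.println(emoji.codePointCount(0, emoji.length())); // 1 code point
    }
}
```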


How many bits would be needed to represent a 64-character text?

That depends on the character code used:
Baudot - 5 bits per character - 320 bits
FIELDATA - 6 bits per character - 384 bits
BCDIC - 6 bits per character - 384 bits
ASCII - 7 bits per character - 448 bits
extended ASCII - 8 bits per character - 512 bits
EBCDIC - 8 bits per character - 512 bits
Univac 1100 ASCII - 9 bits per character - 576 bits
Unicode UTF-8 - variable bits per character - depends on the characters in the text
Unicode UTF-32 - 32 bits per character - 2048 bits
Huffman coding - variable bits per character - depends on the characters in the text
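
For instance, a short Java sketch (illustrative only; it assumes Java 11+ for String.repeat) that reproduces a few of the fixed-width figures above and measures the UTF-8 case directly:

```java
import java.nio.charset.StandardCharsets;

public class SixtyFourChars {
    public static void main(String[] args) {
        String text = "x".repeat(64);  // 64 ASCII characters

        // Fixed-width codes: total bits = 64 * bits-per-character.
        System.out.println("7-bit ASCII : " + 64 * 7 + " bits");
        System.out.println("8-bit EBCDIC: " + 64 * 8 + " bits");

        // Variable-width UTF-8: measure the actual encoded size.
        int utf8Bits = text.getBytes(StandardCharsets.UTF_8).length * 8;
        System.out.println("UTF-8       : " + utf8Bits + " bits (512 here, since all 64 are ASCII)");
    }
}
```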


How many bits does a Unicode character require?

It depends on the encoding. UTF-8 uses one byte (8 bits) to encode English (ASCII) characters and two to four bytes for everything else; UTF-16 uses two bytes (16 bits) to encode the most commonly used characters and four bytes for the rest; UTF-32 uses four bytes (32 bits) to encode every character.


How many bits does a Java char contain?

16 bits. Java char values (and Java String values) use Unicode.


What is the Unicode symbol E057?

U+E057 is in the Unicode Private Use Area (U+E000 to U+F8FF), so it has no standard meaning or glyph; what it displays depends entirely on the font that defines it.


Write a short note on ASCII and Unicode.

In computer memory, characters are represented using a predefined character set. Historically, the 7-bit American Standard Code for Information Interchange (ASCII), 8-bit American National Standards Institute (ANSI) codes, and the Extended Binary Coded Decimal Interchange Code (EBCDIC) were used. These schemes map a selected set of characters to 7- or 8-bit binary codes and cannot represent all the characters of all languages in a uniform way. At present, Unicode is used to represent characters in computer memory. Unicode provides a universal and efficient character representation and has therefore evolved into the modern character-representation scheme. It is maintained by a non-profit organization called the Unicode Consortium, and it is compatible with older schemes such as ASCII (the first 128 code points are the ASCII characters). Unicode encodings use 8, 16, or 32 bits per code unit (UTF-8, UTF-16, UTF-32), and Unicode can represent characters from all the major languages in use across the world today.
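
To illustrate the compatibility point, here is a small Java sketch (added as an illustration, not part of the original note); for pure ASCII text, the ASCII and UTF-8 byte sequences are identical because Unicode's first 128 code points are the ASCII characters:

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class AsciiCompat {
    public static void main(String[] args) {
        String ascii = "Hello, ASCII!";
        byte[] asAscii = ascii.getBytes(StandardCharsets.US_ASCII);
        byte[] asUtf8  = ascii.getBytes(StandardCharsets.UTF_8);

        // For pure ASCII text the two encodings produce identical bytes,
        // because Unicode code points 0-127 are the ASCII characters.
        System.out.println(Arrays.equals(asAscii, asUtf8)); // true
        System.out.println((int) 'A');                      // 65 in both ASCII and Unicode
    }
}
```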


How many characters does Unicode support?

The 16-bit Basic Multilingual Plane holds about 65,000 characters (65,536 code points), which is the figure usually quoted; the full Unicode code space can encode 1,114,112 code points across 17 planes.


How many KB in one character?

Far less than one kilobyte. A character in ASCII requires only one byte, and a character in UTF-16 Unicode requires 2 bytes; since a kilobyte is 1,024 bytes, a single character is roughly 1/1024 to 1/512 of a KB.


What code uses 16 bits and provides codes for 65000 characters?

Unicode (specifically its 16-bit UCS-2/UTF-16 encoding).


How many bits are required to represent the phrase "recommended setting"?

"recommended setting" has 19 characters, including the space between the two words. Using the old convention of one byte per character (e.g., ASCII), we would need 19 × 8 = 152 bits. Using the 2-bytes-per-character Unicode representation that most modern systems use internally (to accommodate the world's languages), each character takes 16 bits, so we would need 19 × 16 = 304 bits.


What is the difference between binary and character streams?

A binary stream reads raw data (8-bit bytes) irrespective of any encoding, while a character stream reads bytes and decodes them into characters using a character encoding (for example the platform's locale charset or a Unicode encoding such as UTF-8). Binary streams are better for socket and other raw I/O; character streams are better for reading textual client input.
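
A short Java sketch of the distinction (illustrative only; the file names and the UTF-8 charset are assumptions, not from the original answer): an InputStream delivers raw bytes, while a Reader wrapped around it decodes those bytes into characters using an explicit encoding.

```java
import java.io.*;
import java.nio.charset.StandardCharsets;

public class StreamsDemo {
    public static void main(String[] args) throws IOException {
        // Binary stream: reads raw 8-bit bytes, no encoding is applied.
        try (InputStream in = new FileInputStream("data.bin")) {   // hypothetical file
            int firstByte = in.read();              // a value 0-255, or -1 at end of stream
            System.out.println("first byte: " + firstByte);
        }

        // Character stream: decodes bytes into chars using a charset (here UTF-8).
        try (Reader reader = new InputStreamReader(
                new FileInputStream("text.txt"), StandardCharsets.UTF_8)) {  // hypothetical file
            int firstChar = reader.read();          // a UTF-16 code unit, or -1 at end
            System.out.println("first char: " + (char) firstChar);
        }
    }
}
```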