Q: What is the difference between Unicode and ASCII code?

Best Answer

ASCII is a 7-bit character code that defines only 128 characters (95 printable plus 33 control codes), essentially the English letters, digits and basic punctuation. Unicode is a much larger character set covering many thousands of characters from virtually every writing system, and its first 128 code points are identical to ASCII, so ASCII text is also valid Unicode.
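A quick way to see the difference is to look at code points and try encoding them. A small Python sketch (the example characters are arbitrary):

    # 'A' is in both ASCII and Unicode: code point 65.
    print(ord('A'))                    # 65

    # 'é' and '中' exist in Unicode but are outside 7-bit ASCII.
    print(ord('é'), ord('中'))         # 233 20013

    # Encoding non-ASCII text as ASCII fails...
    try:
        'café'.encode('ascii')
    except UnicodeEncodeError as err:
        print(err)

    # ...while a Unicode encoding such as UTF-8 handles it fine.
    print('café'.encode('utf-8'))      # b'caf\xc3\xa9'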

Related questions

What is the ASCII code of U?

Upper case U in ASCII/Unicode is code number 85, binary 01010101. Lower case u is code number 117, binary 01110101.
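You can verify these values in Python (just a sketch; any language that exposes character codes works the same way):

    print(ord('U'), format(ord('U'), '08b'))   # 85 01010101
    print(ord('u'), format(ord('u'), '08b'))   # 117 01110101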


What is the ASCII code for letter D?

The ASCII code for the letter D is 68 in decimal, 0x44 in hexadecimal (Unicode code point U+0044).
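For example, checked in Python:

    print(ord('D'), hex(ord('D')))   # 68 0x44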


What are the main differences between ASCII code and Unicode?

Range. ASCII has only 128 characters (95 printable, 33 control), while Unicode has many thousands. Note: Unicode includes ASCII (its first 128 characters) and ISO-8859-1 (its first 256 characters). (From this you can deduce that ISO-8859-1 also includes ASCII.)
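A small Python sketch illustrating these relationships (Python's built-in codecs are used here only as a convenient way to inspect code points):

    import sys

    # ASCII defines 128 code points (0-127); Unicode code points run up
    # to U+10FFFF (1,114,111).
    print(sys.maxunicode)                                       # 1114111

    # The first 128 Unicode code points are exactly ASCII,
    # and the first 256 are exactly ISO-8859-1 (Latin-1).
    ascii_chars = bytes(range(128)).decode('ascii')
    latin1_chars = bytes(range(256)).decode('latin-1')
    print([ord(c) for c in ascii_chars] == list(range(128)))    # True
    print([ord(c) for c in latin1_chars] == list(range(256)))   # True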


Computers use special codes for representing letters and numbers, known as the?

ASCII, EBCDIC and Unicode


What does Java use instead of ASCII?

ASCII and Java are two totally different things. ASCII is a character encoding in which each letter, digit, or punctuation mark has a specific numeric code (Carriage Return, CR, is code 13; Line Feed, LF, is 10; capital A is 65). Java is a programming language that handles text in multiple formats as needed: internally its char and String types use Unicode (UTF-16), and it can read and write other encodings such as ASCII and EBCDIC. The two are not intertwined.
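Java stores text as UTF-16 code units rather than ASCII bytes. As a rough illustration (written in Python rather than Java, purely to show what UTF-16 looks like byte by byte):

    # 'A' (U+0041) and '€' (U+20AC) each become one 16-bit code unit.
    print('A€'.encode('utf-16-be').hex(' '))    # 00 41 20 ac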


What are the different types of alphanumeric code?

ASCII, EBCDIC and Unicode. Search Wikipedia to learn more about these alphanumeric codes.


Which encoder creates ASCII?

ASCII (American Standard Code for Information Interchange) is a character-encoding scheme that was standardised in 1963. No encoder is required to create ASCII; every machine supports it as standard, although some implement it via Unicode. The only difference is the number of bytes used to represent each character: the default is one byte per character, yielding 128 standard codes that map exactly onto the first 128 characters of the Unicode encoding.
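For instance, a Python sketch showing the one-byte-per-character point and the overlap with Unicode:

    data = 'Hello'.encode('ascii')
    print(len(data), list(data))            # 5 [72, 101, 108, 108, 111]

    # For pure ASCII text the UTF-8 (Unicode) bytes are identical.
    print('Hello'.encode('utf-8') == data)  # True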


Write a short note on ASCII and Unicode.

In computer memory, characters are represented using a predefined character set. Historically, the 7-bit American Standard Code for Information Interchange (ASCII), the 8-bit American National Standards Institute (ANSI) code and the Extended Binary Coded Decimal Interchange Code (EBCDIC) were used. These schemes map a selected set of characters to 7- or 8-bit binary codes, so they cannot represent all the characters of all languages in a uniform format.

At present, Unicode is used to represent characters in computer memory. Unicode provides a universal and efficient character representation and has therefore evolved into the modern character-representation scheme. It is maintained by a non-profit organisation called the Unicode Consortium, and it is backward compatible with older schemes such as ASCII. Unicode characters are stored using 8, 16 or 32 bits per code unit (UTF-8, UTF-16 or UTF-32), and the standard can represent characters from all the major languages in use across the world today.
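A short Python sketch showing how the same three characters occupy different numbers of bytes under the common Unicode encodings (the sample string is arbitrary):

    s = 'Aé中'
    for encoding in ('utf-8', 'utf-16-be', 'utf-32-be'):
        encoded = s.encode(encoding)
        print(encoding, len(encoded), encoded.hex(' '))
    # utf-8      6  41 c3 a9 e4 b8 ad
    # utf-16-be  6  00 41 00 e9 4e 2d
    # utf-32-be 12  00 00 00 41 00 00 00 e9 00 00 4e 2d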


What is the HTML code for an asterisk?

It depends. In Unicode it is U+002A; if the page is in ASCII it is 0x2A, so the numeric HTML entity is &#42;. But you shouldn't need an HTML entity at all; you should be able to just type * directly.
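A tiny Python check of the code point and the corresponding numeric entity:

    print(ord('*'), hex(ord('*')))   # 42 0x2a
    print('&#%d;' % ord('*'))        # &#42;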


Convert the string "have a nice day" to its equivalent ASCII codes, including the spaces between the words?

"have a nice day" in decimal ASCII, with 32 for each space, is: 104 97 118 101 32 97 32 110 105 99 101 32 100 97 121.
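The conversion is easy to reproduce in Python, for example:

    text = 'have a nice day'
    print([ord(c) for c in text])
    # [104, 97, 118, 101, 32, 97, 32, 110, 105, 99, 101, 32, 100, 97, 121]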


What is the binary code for 'a'?

In ASCII and UTF-8, the character 'a' has the 8-bit binary code 01100001 (which is 97 in decimal). Full lists of character codes can be obtained from several websites (just search for terms like "character codes", "ASCII", "UTF-8", "UTF-16", "Unicode" and so on).
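For example, in Python:

    print(ord('a'), format(ord('a'), '08b'))   # 97 01100001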


What is the purpose of ASCII?

The American Standard Code for Information Interchange was created to standardize 128 numeric codes that represent the English letters, symbols, and numbers. Any US keyboard is made with this standard in mind.
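As a quick illustration in Python, the 95 printable ASCII characters are codes 32 through 126 (the remaining codes are control characters):

    print(''.join(chr(code) for code in range(32, 127)))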