
Related Questions

1 word is equal to how many bytes?

That depends on the encoding and on what you mean by "word". In a 2-byte encoding such as UTF-16, every character consumes 2 bytes, so a word of 4 characters consumes 8 bytes; in ASCII, or for plain English text in UTF-8, each character is 1 byte, so the same word takes 4 bytes. (In computer architecture, a "word" is instead the CPU's natural data size, commonly 2, 4, or 8 bytes depending on the machine.)
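For illustration, here is a minimal Python sketch (the word "code" is just an example) contrasting the byte count of a text word under two encodings with the machine word size of the CPU:

```python
import struct

word = "code"

# Text: the byte count depends on the encoding, not on a fixed
# 2-bytes-per-character rule.
print(len(word.encode("utf-8")))      # 4 bytes (1 byte per ASCII character)
print(len(word.encode("utf-16-le")))  # 8 bytes (2 bytes per character)

# Hardware: the size of a pointer approximates the machine word size.
print(struct.calcsize("P"))  # typically 8 on a 64-bit platform
```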


How many bytes are the first 6 characters of a MAC address?

A MAC address consists of 12 hexadecimal characters, representing 6 bytes in total. The first 6 characters are the Organizationally Unique Identifier (OUI), which identifies the vendor, and they occupy 3 bytes.
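As a small illustration, this Python sketch (the MAC address itself is made up) splits the 12 hex characters into the 3-byte OUI and the 3-byte device-specific part:

```python
mac = "00:1A:2B:3C:4D:5E"                  # hypothetical example address
raw = bytes.fromhex(mac.replace(":", ""))  # 6 bytes in total

oui = raw[:3]     # first 6 hex characters -> 3 bytes (vendor / OUI)
device = raw[3:]  # last 6 hex characters  -> 3 bytes (device-specific)

print(oui.hex(), len(oui))        # 001a2b 3
print(device.hex(), len(device))  # 3c4d5e 3
```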


2 gigabytes equal how many kilocharacters?

Assuming 1 byte per character, 2 gigabytes is 2 × 1024 × 1024 = 2,097,152 kilobytes, or roughly two million kilocharacters (exactly 2,000,000 in decimal units, where 1 GB = 1,000,000 kB).
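A quick Python check of the arithmetic, assuming 1 byte per character:

```python
gb = 2

binary_kchars = gb * 1024 * 1024   # kibibytes in 2 GiB
decimal_kchars = gb * 1000 * 1000  # kilobytes in 2 GB (SI units)

print(binary_kchars)   # 2097152 kilocharacters
print(decimal_kchars)  # 2000000 kilocharacters
```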


How many bytes in UTF-16?

UTF-16 uses either 2 or 4 bytes per character. Most common characters from the Basic Multilingual Plane (BMP) are encoded using 2 bytes, while characters outside this range require 4 bytes, represented as a pair of 2-byte code units known as surrogates. Therefore, the number of bytes needed depends on the specific characters being encoded.
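A minimal Python sketch demonstrating both cases (the sample characters are arbitrary):

```python
bmp_char = "A"              # U+0041, inside the Basic Multilingual Plane
astral_char = "\U0001F600"  # U+1F600 (an emoji), outside the BMP

print(len(bmp_char.encode("utf-16-le")))     # 2 bytes
print(len(astral_char.encode("utf-16-le")))  # 4 bytes (a surrogate pair)
```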


How many bits does a unicode character require?

That depends on the encoding. UTF-8 uses one byte (8 bits) to encode English/ASCII characters and two to four bytes for everything else. UTF-16 uses two bytes (16 bits) to encode the most commonly used characters and four bytes (32 bits) for the rest. UTF-32 always uses four bytes (32 bits) per character.


What is 1024 characters of data?

Assuming 1 byte per character (as in ASCII), 1024 characters is 1,024 bytes, which is exactly one kilobyte.


How many characters are in a kilobyte?

A kilobyte contains 1024 bytes. If the characters are from the standard ASCII character set, where 1 character is 1 byte, then a kilobyte holds 1024 characters.


Explain how bytes that represent keyboard characters are decoded?

The bytes representing keyboard characters (key scan codes) are normally used to index a lookup table, an array or small database, that maps each code to the corresponding character.
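A minimal Python sketch of the lookup-table idea; the scan-code values here are illustrative, not a complete real keyboard code set:

```python
# Maps scan-code bytes to characters; real keyboards have far larger tables.
SCANCODE_TABLE = {
    0x1E: "a",
    0x30: "b",
    0x2E: "c",
    0x39: " ",
}

def decode(scancodes):
    """Decode a sequence of scan-code bytes by indexing the table."""
    return "".join(SCANCODE_TABLE.get(code, "?") for code in scancodes)

print(decode([0x1E, 0x30, 0x2E]))  # abc
```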


A computer has 512 MB of memory; how many characters can be stored in its memory at a time?

Assuming 1 byte per character, 512 MB = 512 × 1024 × 1024 bytes = 536,870,912 characters, ignoring any memory used by the operating system and running programs.


How many bytes of storage would be needed to store the word microprocessors?

The word "microprocessors" is 15 characters long and would need a minimum of 15 bytes to store as ASCII. Some systems may need additional bytes to mark that text is stored there or to record how long the text field is.


How many bytes are used to represent one character?

In UTF-8, the most commonly used standard for representing text, the number of bytes varies by character. The Latin alphabet, digits, and common symbols such as <, >, -, /, \, $, !, %, @, &, ^, (, ), and * each take 1 byte. Accented characters and many other scripts take 2 bytes, most remaining characters (for example, CJK ideographs) take 3 bytes, and the maximum any character can use is 4 bytes (for example, emoji and other supplementary-plane characters).
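A short Python check of these byte counts (the sample characters were chosen for illustration):

```python
samples = {
    "A": 1,           # U+0041, ASCII
    "é": 2,           # U+00E9, accented Latin
    "中": 3,          # U+4E2D, CJK ideograph
    "\U0001F600": 4,  # U+1F600, emoji (supplementary plane)
}
for ch, expected in samples.items():
    n = len(ch.encode("utf-8"))
    print(f"U+{ord(ch):04X} -> {n} bytes")
    assert n == expected
```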


How many bits and bytes are required to store the word "World"?

That depends what encoding is used. One common (fairly old) encoding is ASCII; that one uses one byte for each character (letter, symbol, space, etc.), so "World" (5 characters) takes 5 bytes, or 40 bits. Some systems use 2 bytes per character, doubling that to 10 bytes (80 bits). Many modern systems use Unicode: stored as UTF-8 - a fairly common encoding scheme - the common letters in "World" still use a single byte each, while many special symbols (for example, accented characters) take two or more bytes; stored as UTF-16, each of these characters takes two bytes. The number of bits is simply the number of bytes multiplied by 8.
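A quick Python check for "World" under two of these encodings (bits are just bytes times 8):

```python
word = "World"

for encoding in ("ascii", "utf-16-le"):
    n_bytes = len(word.encode(encoding))
    print(encoding, n_bytes, "bytes =", n_bytes * 8, "bits")

# ascii 5 bytes = 40 bits
# utf-16-le 10 bytes = 80 bits
```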