Best Answer
A better question would be - What is the difference between 16bit and 24bit color?

Colour is usually represented on computers, & displayed using 3 colour elements - Red, Green & Blue.

16bit colour (known as "high colour") refers to the fact that only a total of 16 binary bits are used to represent each colour.

This usually means 5 bits being used for red, 5 bits for blue, and 6 bits for green (because the human eye is more sensitive to green).

This gives 32 shades of red and blue available, and 64 shades of green.

This results in 65,536 different possible colours in total.
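
To make the 5-6-5 split concrete, here is a minimal C sketch of packing a pixel into that 16bit layout. The field order (red in the top bits, blue in the bottom) is an assumed convention; real hardware varies.

    #include <stdint.h>
    #include <stdio.h>

    /* Pack 8-bit per-channel values into a 16-bit 5-6-5 pixel.
       Red and blue keep their top 5 bits, green keeps its top 6. */
    static uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
    {
        return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
    }

    int main(void)
    {
        uint16_t px = pack_rgb565(255, 128, 64);
        printf("packed pixel: 0x%04X\n", px);
        printf("total 16bit colours: %u\n", 1u << 16);   /* 65536 */
        return 0;
    }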

24bit colour (known as "true colour") uses 8 bits for each colour, allowing 256 shades of each, & a total of about 16.7 million colours.

Pretty much all digital displays use 24bit colour & consider it to be the full or "true" colour mode.

On a home computer, 32bit colour means that 24 of the bits are used for red, green & blue.

The extra 8 bits are usually used for storing internal information (like transparency or stencil information).

32bit is often used because memory can be read more easily in steps of 32 bits (4 bytes) than in steps of 24 bits (3 bytes).
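
As a rough sketch of how the spare byte fits, here is one common (but not universal) 32bit layout in C, with 8 bits each of alpha, red, green & blue packed into a single 4-byte value:

    #include <stdint.h>
    #include <stdio.h>

    /* 24 bits carry the colour; the top 8 bits are the spare byte,
       commonly used for transparency (alpha). Each pixel is exactly
       4 bytes, so pixel arrays stay aligned for fast 32-bit access. */
    static uint32_t pack_argb(uint8_t a, uint8_t r, uint8_t g, uint8_t b)
    {
        return ((uint32_t)a << 24) | ((uint32_t)r << 16) |
               ((uint32_t)g << 8)  |  (uint32_t)b;
    }

    int main(void)
    {
        uint32_t px = pack_argb(255, 200, 100, 50);   /* fully opaque */
        printf("pixel = 0x%08X (%zu bytes)\n", px, sizeof px);
        return 0;
    }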

Visually they will look identical on your monitor.

24bit colour will look visually "better" than 16bit colour when looking at high-colour images like photography, or at smooth gradients of colour.

In 16bit colour you can often see the "steps" as the colour transitions from one to the next.

In 24bit the colours are so close together that they are often indiscernible, & on most digital monitors, to most eyes, will appear perfectly smooth.

Most modern computer games require your graphics card to be capable of displaying 24bit colour & will no longer support 16bit.

Wiki User · 14y ago

More answers

Wiki User · 8y ago

16-bit Windows applications were designed to run under Windows 3.0 and 3.1, while 32-bit Windows applications were designed for Windows 95, 98, NT, and 2000. They are written to two different Application Programming Interfaces (APIs) called "Win16" and "Win32".

The main differences between the Win16 and Win32 APIs are:

  • Memory model: Win16 uses a segmented memory model (each memory address is referred to using a segment address, and the offset within that segment), while Win32 uses a flat 32-bit address space.
  • Multitasking: Win16 uses cooperative multitasking. This means that the application must relinquish control before another application or program can run. Win32 uses preemptive multitasking, in which the operating system (Windows NT, 95, 98, or 2000) assigns time slices to each process.
  • Multithreading: Unlike Win16, Win32 supports multithreading. This means that a program can be broken up into multiple threads, which can run concurrently (see the sketch below).
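
As a small illustration of the multithreading point, here is a minimal Win32 sketch that starts a second thread with CreateThread; error handling is pared down to the essentials.

    #include <windows.h>
    #include <stdio.h>

    /* Each thread runs this function independently of main(). */
    static DWORD WINAPI worker(LPVOID param)
    {
        (void)param;                                    /* unused */
        printf("worker thread %lu running\n", GetCurrentThreadId());
        return 0;
    }

    int main(void)
    {
        /* Ask the OS to schedule a second, preemptively multitasked thread. */
        HANDLE h = CreateThread(NULL, 0, worker, NULL, 0, NULL);
        if (h == NULL) {
            fprintf(stderr, "CreateThread failed (%lu)\n", GetLastError());
            return 1;
        }
        WaitForSingleObject(h, INFINITE);               /* wait for it to finish */
        CloseHandle(h);
        printf("main thread %lu done\n", GetCurrentThreadId());
        return 0;
    }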

Windows 3.1 and Windows for Workgroups 3.11 can run a small subset of Win32 applications, mostly older ones, by using a subsystem called "Win32s". Win32s translates Win32 system calls to Win16. This process is called "thunking".

Windows 95, 98, NT, and 2000 can run Win16 applications by running them cooperatively in a Win16 compatibility box. (In the case of Windows NT, this is called "WOW" - Windows on Windows). If a 32-bit application crashes, it will not affect any other 32-bit or 16-bit applications. However, if a 16-bit application crashes, it might affect other 16-bit applications (but not 32-bit applications).

Both APIs contain the mechanisms used to link applications and documents together (e.g., OLE and OLE2).

Wiki User · 14y ago

It's a matter of how many colors you have to choose from. A 16-bit palette has 65,536 colors (also called Hicolor), while a 24-bit palette has 16.7 million colors (also called Truecolor).

Using more colors generally gives higher quality, at the cost of a larger image file.
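
Both palette sizes fall straight out of the bit depth; a tiny C check of the arithmetic:

    #include <stdio.h>

    int main(void)
    {
        /* Number of distinct colors = 2 raised to the bit depth. */
        printf("16-bit palette: %lu colors\n", 1UL << 16);   /* 65,536     */
        printf("24-bit palette: %lu colors\n", 1UL << 24);   /* 16,777,216 */
        return 0;
    }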

See related link for an overview of color depth and the various standards for it.

Wiki User · 14y ago

What you are referring to is known as the processor 'word size'. The word size is a fundamental characteristic of a processor design that determines the basic number size that the processor deals with.

A 16-bit processor will be designed with buses that are 16-bits wide, registers that store 16-bit values and arithmetic and logic units that operate on 16-bit numbers. Similarly for 32-bit, 64-bit and so on.

The word size essentially sets a limit on what size of number can be operated on at any one time. For example, 16 bits can represent 2^16 = 65,536 distinct values, so the largest unsigned number a single 16-bit word can hold is 65,535. Any number larger than this will require more than one word to represent it, and therefore more than one clock cycle to transfer around the processor and perform calculations on.

In lay terms, processors of a larger word size can potentially perform larger calculations with each operation than smaller word-size architectures.

Most modern processors are now 64-bit. The move from 32 to 64-bit was important for a number of reasons. Crucially, a single word is typically used to hold the address of a byte in main memory. Since 32 bits can only represent 2^32 values (4,294,967,296), only 4 GB of memory could be addressed, which presented an upper limit on the amount of RAM that could be used in a modern system. With 64-bit architectures the limit becomes 2^64 ≈ 1.84 × 10^19 bytes (16 exabytes).
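
To see these limits on your own machine, a short C sketch can print the native pointer width and the 4 GB bound that 32-bit addressing implies:

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* The pointer width reflects the native address size of the build. */
        printf("pointer width : %u bits\n", (unsigned)(sizeof(void *) * 8));

        /* One address per byte: 2^32 addresses -> 4 GB of addressable memory. */
        uint64_t limit32 = (uint64_t)1 << 32;
        printf("32-bit limit  : %llu bytes (4 GB)\n",
               (unsigned long long)limit32);

        /* 2^64 does not fit in a 64-bit variable, so just state the figure:
           about 1.84 x 10^19 bytes, i.e. 16 exabytes. */
        printf("64-bit limit  : ~1.8e19 bytes (16 EB)\n");
        return 0;
    }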

Wiki User · 12y ago

Think of it as lanes on a highway: 16 lanes versus 32 lanes versus 64. Hardly any CPUs are 16-bit anymore, not since the 286 and early 386 CPUs. 16-bit sound cards still exist, as CDs still use that technology. There are ways to fool the computer or components into acting as more than 16-bit, but underneath it is still 16-bit.

Q: What is the difference between a 16 bit and a 32 bit?

Related questions

What is the difference between 32-bit drivers and 16-bit drivers?

32-bit drivers are designed for 32-bit operating systems, and 16-bit drivers are designed for 16-bit operating systems. Most modern computers are either 32-bit or 64-bit.


What is the difference between winnt.exe and winnt32.exe?

winnt.exe = 16-bit clean install
winnt32.exe = 32-bit upgrade


What is the difference between windows 3.0 and Windows 7 architecture?

Windows 3.0 uses a 16-bit architecture. Windows 7 uses a 32-bit or 64-bit architecture.


What is the difference between the two installation programs Winnt.exe and Winnt32.exe?

Winnt.exe is the 16-bit version and Winnt32.exe is the 32-bit version


What is the difference between the Intel 80286 and the Intel 80386?

The Intel 80286 is a 16-bit processor, while the Intel 80386 is a 32-bit processor.


What is the difference between 32 bit and 64 bit CPU?

The main difference between a 32-bit and a 64-bit CPU is the word size: a 64-bit CPU processes data in 64-bit chunks and can address far more memory than a 32-bit CPU, which generally makes it faster for large workloads.


What is the difference between 16 GB and 32 GB?

16 GB - a 32 GB device simply holds twice as much data.


How can i tell the difference between vista 32 bit and vista 64 bit?

It should say under Control Panel > System, next to "System type", whether the installation is 32-bit or 64-bit.


Difference between 32 bit and 64 bit?

I want to download Aegisub on Medocow, but I see two files there, aegisub-3.1.3_32.exe (32-bit) and aegisub-3.1.3_64.exe (64-bit), and I do not know which file I need. What is the difference between them?


What is the difference between 16 bit compilers and 32 bit compilers in C?

16-bit compilers compile the program into 16-bit machine code that will run on a computer with a 16-bit processor. 16-bit machine code will run on a 32-bit processor, but 32-bit machine code will not run on a 16-bit processor. 32-bit machine code is usually faster than 16-bit machine code. -DJ Craig

Note: With a 16-bit compiler the type sizes (in bits) are the following: short, int: 16; long: 32; long long: (no such type); pointer: 16/32 (but even 32 means only a 1 MB address space on the 8086).

With a 32-bit compiler the type sizes (in bits) are the following: short: 16; int, long: 32; long long: 64; pointer: 32.

With a 64-bit compiler the type sizes (in bits) are the following: short: 16; int: 32; long: 32 or 64 (!); long long: 64; pointer: 64.

[While the above values are generally correct, they may vary for specific operating systems. Please check your compiler's documentation for the default sizes of standard types.]

Note: the C language itself doesn't say anything about "16-bit compilers" and "32-bit compilers".
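
The safest way to check what your own compiler actually uses is simply to print the sizes; a minimal C program:

    #include <stdio.h>

    int main(void)
    {
        /* Sizes are reported in bytes; multiply by 8 for bits. */
        printf("short     : %zu bytes\n", sizeof(short));
        printf("int       : %zu bytes\n", sizeof(int));
        printf("long      : %zu bytes\n", sizeof(long));
        printf("long long : %zu bytes\n", sizeof(long long));
        printf("pointer   : %zu bytes\n", sizeof(void *));
        return 0;
    }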


What is the difference between the Office 32 and 64 bit programs?

There is not much difference between 32-bit and 64-bit Office in terms of performance. However, 64-bit Office can use much more memory, which helps when working with very large spreadsheets and data sets.


What is the difference between a 64-bit operating system and a 32-bit operating system?

In simple terms, a 32-bit operating system can run only 32-bit applications, while a 64-bit operating system requires a 64-bit CPU but can run both 32-bit and 64-bit applications.