I assume you're speaking of the Intel 80286 and 80386 processors. This family of chips is the reason 32-bit processor architecture is commonly called "x86." The importance of the move from the 80286 to the 80386 lies in the fact that the former could only execute 16-bit code, while the latter supported 32-bit code.

In layman's terms, the 80286 could only natively calculate with numbers up to 2^16 - 1 (65,535); anything larger required extra software to split the work across multiple 16-bit operations. The 80386, however, could natively manipulate numbers up to 2^32 - 1 (4,294,967,295). This let programmers write software more easily, without implementing extra routines just to handle large numbers. For the same reason we now use 64-bit processors, and we may eventually move on to 128-bit or 256-bit designs; when that happens depends on hardware designers building such systems and software developers deciding to support them.

Wiki User

14y ago
