Computer Science, asked by muqaddasshaaban, 1 year ago

Compare the running time needed to determine whether a 16-bit number and a 32-bit number are prime.

Answers

Answered by JESPHINALEX1967

Literally, the difference between 16-bit and 32-bit is 16 more bits, but there are, of course, other differences between the "8/16-bit era" and the "32+ bit era" that matter far more than the number of bits per register.

The 32-bit CPUs, notably the 386 (hence "386 enhanced mode" on Windows 3.1), brought 32-bit protected-mode addressing with paging to personal computers, which improved the stability of multitasking. This is also why the 386 is the bare minimum needed to run the Linux kernel. While 16-bit CPUs such as the 286 could multitask through the use of timer interrupts, they were not suitable for a multi-user environment (where security and privacy are needed) because programs could write over each other's data. Even when privacy was not required (multitasking but not multi-user, e.g. early Mac and Windows 3.1, Amiga, Atari ST), such write-overs still happened by accident and could crash the whole system.

Also, a 16-bit CPU with more than 64 kilobytes of memory needed tricks such as segmentation and bank switching to reach all of that memory, whereas a 32-bit CPU can address 4 GB from a single index register. That is a huge difference that saved a lot of difficulty ("fun" in the Atari 2600 sense) in bare-metal programming on the PC.
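To make the addressing point concrete: in real-mode x86 segmentation, a 20-bit physical address is formed as (segment * 16) + offset, so reaching anything beyond 64 KB means juggling segment registers, whereas a 32-bit flat address is just one number in one register. A minimal C sketch of that arithmetic (the function name and sample values are only illustrative):

```c
#include <stdint.h>
#include <stdio.h>

/* Real-mode x86: a 16-bit segment and a 16-bit offset combine into a
 * 20-bit physical address, (segment << 4) + offset, at most 1 MB. */
static uint32_t segmented_address(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + offset;
}

int main(void)
{
    /* The same physical byte can be named by many segment:offset pairs. */
    printf("A000:0123 -> %05X\n", (unsigned)segmented_address(0xA000, 0x0123));
    printf("A012:0003 -> %05X\n", (unsigned)segmented_address(0xA012, 0x0003));

    /* A 32-bit flat pointer needs no such arithmetic: one register
     * indexes the whole 4 GB range directly. */
    uint32_t flat = 0x000A0123;
    printf("flat      -> %08X\n", (unsigned)flat);
    return 0;
}
```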
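As for the question as actually asked, about primality-test running time rather than CPU width: with simple trial division, the worst-case work grows with the square root of the value, so testing the largest 32-bit number (square root about 65,536) takes roughly 256 times as many division attempts as testing the largest 16-bit number (square root about 256). A rough C sketch, assuming a plain trial-division test (the function name and the two sample primes are just examples):

```c
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* Trial-division primality test: tries odd divisors up to sqrt(n),
 * so the worst-case work grows with the square root of the value. */
static bool is_prime(uint32_t n)
{
    if (n < 2)      return false;
    if (n % 2 == 0) return n == 2;
    for (uint32_t d = 3; (uint64_t)d * d <= n; d += 2)
        if (n % d == 0)
            return false;
    return true;
}

int main(void)
{
    /* Largest 16-bit prime vs largest 32-bit prime:
     * roughly 128 vs 32,000 loop iterations, a ~256x difference. */
    printf("65521 prime?      %d\n", is_prime(65521u));
    printf("4294967291 prime? %d\n", is_prime(4294967291u));
    return 0;
}
```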
