Which microcontroller has the maximum number of pipeline stages?
Answers
A pipeline takes a whole task and breaks it into smaller sub-tasks. The concept has its roots in mass-production manufacturing, such as at the Ford Motor Company. Henry Ford determined long ago that even though it took several hours to physically build a car, he could produce a finished car every minute if he split all of the steps required to assemble a car across different physical stations on an assembly line. One station was responsible for installing the engine, another the tires, another the seats, and so on.
Using this logic, when the assembly line was first started it still took several hours for the first car to come off the end, but because everything was done in stages, the second car was right behind it and nearly complete when the first one rolled off, followed by the third, the fourth, and so on. Thus the assembly line was born and mass production became a reality.
In computers the same basic logic applies, but rather than producing something physical on an assembly line, it is the workload required to carry out the task at hand that gets broken into smaller stages, called the pipeline.
Consider a simple operation: take two numbers, multiply them together, and store the result. As humans, we would just look at the numbers and multiply them (or, if they are too big, punch them into a calculator) and write down the result. We wouldn't give the process much thought; we would just do it.
Computers aren't that smart; they have to be told exactly how to do everything. A programmer has to tell the computer where the first number is, where the second number is, what operation to perform (a multiply), and where to store the result.
This logic can be broken down into the following (greatly simplified) steps, or stages, of the pipeline:

1. Fetch the first number from memory.
2. Fetch the second number from memory.
3. Multiply the two numbers.
4. Store the result back to memory.
This pipeline has four stages. Now suppose each of these operations takes one clock cycle to complete, which is fairly typical in modern processors. The complete task of multiplying two numbers then takes four clock cycles. However, because the stages operate in parallel, each stage can begin working on the next task as soon as it hands the current one off to the stage after it. The first task still takes four clock cycles from start to finish, but once the pipeline is full, one task is "retired" (completed) on every clock cycle. So although each individual task takes four clock cycles, tasks appear to complete at a rate of one per cycle.
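The timing described above can be sketched in a few lines of Python. This is an illustrative model only (the function name and the assumption of an ideal pipeline with no stalls are mine, not from the answer): with a four-stage pipeline, the first task finishes after four cycles, and each subsequent task retires exactly one cycle after the one before it.

```python
# Ideal (stall-free) pipeline timing model for the four-stage example.
STAGES = ["fetch first number", "fetch second number", "multiply", "store result"]

def completion_cycle(task_index: int, num_stages: int = len(STAGES)) -> int:
    """Clock cycle at which task `task_index` (0-based) retires.

    The first task takes `num_stages` cycles to flow through the pipeline;
    every later task retires one cycle after its predecessor.
    """
    return num_stages + task_index

# Five back-to-back multiply tasks: latency is 4 cycles each,
# but throughput settles at one retired task per cycle.
print([completion_cycle(i) for i in range(5)])  # [4, 5, 6, 7, 8]
```

Note that this model captures throughput, not latency: each task still spends four cycles inside the pipeline; it is only the overlap between tasks that makes one complete per cycle.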