"The CPU uses adders for multiplication, right? There is no instruction...for multiplication."
Nearly all modern CPUs have multiplication instructions, and a lot more.
The evolution itself is quite interesting. Starting with the CPUs of the ancient 1950s, most did not have multiplication in hardware. Whether hardware multiplication was (or wasn't) a feature was a particular point to check when purchasing a high-performance computer in the '60s and '70s.
The CPU in the first Apple II, the one in the TRS-80, the Altair, and a number of the other 8-bit CPUs from the '70s did not have hardware multiplication.
Nearly all modern CPUs, since the 8086 and 8088, do have hardware multiplication for integers at the very least.
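To make concrete what those older chips did instead, here is a minimal sketch of the classic shift-and-add routine (the function name and integer widths are mine, for illustration), which builds the product out of additions one bit at a time:

#include <stdint.h>

/* Shift-and-add multiplication: roughly what an 8-bit CPU's
   software multiply routine did, one bit of b at a time. */
uint16_t mul8(uint8_t a, uint8_t b) {
    uint16_t product = 0;
    uint16_t addend = a;
    while (b != 0) {
        if (b & 1)          /* low bit of b set: add in the shifted operand */
            product += addend;
        addend <<= 1;       /* shifting left doubles the addend */
        b >>= 1;            /* move on to the next bit of b */
    }
    return product;
}

A hardware multiplier performs essentially this same work, but in a single instruction.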
Floating point hardware, even for addition and subtraction, was generally optional from the 8086 (about 1980) until the 80486 around 1990. When floating point hardware became generally standard, the entire complement of operations, from multiplication through square root, along with the standard fare of trig functions, came together in one package.
Then, it goes further.
Up to that point, operation was generally like that of any calculator: two operands were combined by some chosen operation to produce a result. This means a statement like this:
a = b + c + d;
is performed in at least two steps. An addition for, say, c + d produces a temporary result, which is then used for the addition with b. This may seem obvious, but there are a number of very common formulae for which this burden is a significant performance problem.
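To spell it out, the compiler effectively rewrites the statement as a dependent pair of additions (the temporary's name is mine, for illustration):

temp = c + d;   /* first addition produces a temporary */
a = b + temp;   /* second addition consumes it */

The second addition cannot start until the first has produced its result.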
This led to the inclusion of vector or SIMD instructions, which themselves trace back to examples on high-performance machines of the late '60s and early '70s.
There is a select set of patterns, but this one illustrates what they are like:
a = b * c + d * e;
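Modern hardware often provides a fused multiply-add for exactly this shape. As a minimal sketch, C99's standard fma() from math.h computes x*y + z, and on CPUs with FMA support compilers typically emit a single instruction for it (the wrapper function is mine, for illustration):

#include <math.h>

/* b*c + d*e: one plain multiply, then one fused multiply-add. */
double muladd(double b, double c, double d, double e) {
    return fma(b, c, d * e);   /* fma(x, y, z) computes x*y + z */
}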
There are, of course, many that are more complex. This is typical of linear algebra and some scientific calculations. A slightly different pattern may be something like:
a.x = b.x + c.x;
a.y = b.y + c.y;
a.z = b.z + c.z;
It isn't important to detail the actual instructions, but to note generally that where such common patterns would lead to a series of instructions in the standard CPU complement, here, in a vector or SIMD processor, there are instructions which process the entire formula. We can fill registers with the operands, call one instruction, and get the result in a single step.
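As a minimal sketch of that idea, here is the 3D-point addition written with Intel's SSE intrinsics; the intrinsics are the real API, while the struct layout and the padding to four floats are my assumptions for illustration:

#include <xmmintrin.h>  /* SSE intrinsics */

/* SSE registers hold four floats; the fourth lane is unused padding here. */
typedef struct { float x, y, z, pad; } Point;

void add_points(Point *a, const Point *b, const Point *c) {
    __m128 vb = _mm_loadu_ps(&b->x);           /* load b.x, b.y, b.z, pad */
    __m128 vc = _mm_loadu_ps(&c->x);           /* load c.x, c.y, c.z, pad */
    _mm_storeu_ps(&a->x, _mm_add_ps(vb, vc));  /* all three sums in one add */
}

One load per operand, one add, and one store, instead of three separate scalar additions.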
In the Intel line, these have followed a progression, each with expanding features, with names like "MMX" and "SSE" (in various versions).
Yet, look again at that last example sequence, which adds two 3D points. Note that the second addition has nothing to do with the first, and the third has nothing to do with either of the others, except for the fact that they are adjacent within a structure holding x, y, and z.
Most modern CPUs can schedule these additions (and other instructions) to execute simultaneously within a single core, even though the compiler writes them as a series of additions; this is known as superscalar execution. It looks something like the SIMD instructions, where a collection of simple operations may be performed at once. The concepts are loosely related in that both gain performance by doing work in parallel where possible, but the SIMD or vector instructions pack more work into fewer instructions defining some particular formula or pattern, like the process for multiplying a vector by a matrix. The latter ends up being more efficient, but those instructions are usually called upon quite deliberately. The CPU's ability to schedule multiple simple operations for simultaneous execution is largely automatic, and independent of any intended formula.
The 8-bit CPUs of the '70s comprised some 3,500 transistors, each performing some portion of the work on individual bits. Most modern CPUs comprise a few billion transistors, with considerably more sophisticated features that complete the basic work far more speedily than their ancestors.
"...sources that took you from beginner++/intermediate to proficient in the language where you understand these more 'advanced' subject matter..."
I'm beginning to think at least some of us should write one.
I come from an era where assembling a computer required a soldering iron and weeks of patience.
I do consider it an asset to have emerged from primitive and relatively naive hardware. I've watched the industry expand many orders of magnitude.
Don Knuth's work may be of interest, as well as other computer scientists' reflections.