The U.S. has made it through another Black Friday of must-have gadgets and electronics. Practically every major retailer takes part in the after-Thanksgiving deep discounting, but the lines are longest for computers, Wi-Fi-enabled televisions, and new Apple products. You might think that these electronics keep improving faster and faster – and you’d be correct. The same principle that makes every iPhone better than the last is also at work in the robotics field.
That principle is known as Moore’s Law (though it’s more of an observation than a law or a principle). Gordon Moore, cofounder of Intel among his other achievements, observed in a 1965 article titled “Cramming More Components Onto Integrated Circuits” that the processing power one could buy was doubling every year. In his words:
“The complexity for minimum component costs has increased at a rate of roughly a factor of two per year… Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least ten years.”
In layman’s terms, this means that every dollar you spend on a computing device buys double the processing power it would have bought the previous year. It’s basic exponential growth, and it has continued long past Moore’s prediction of ten years. Moore later revised the observation to say a doubling happens every two years rather than one, though most professionals agree that 18 months is a more accurate timeframe.
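To see what those different doubling periods imply, here is a minimal sketch of the compounding math. The function and figures are illustrative only; the point is the exponential shape of the curve.

```python
# Illustrative sketch: compounding under Moore's Law for different
# assumed doubling periods (1, 1.5, and 2 years).

def growth_factor(years, doubling_period_years):
    """Processing-power multiplier after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

for period in (1.0, 1.5, 2.0):
    factor = growth_factor(10, period)
    print(f"Doubling every {period} years -> {factor:,.0f}x after 10 years")
```

After a single decade, a one-year doubling period yields roughly a 1,024-fold improvement, while a two-year period yields about 32-fold – which is why the exact doubling interval matters so much to long-range predictions.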
To illustrate Moore’s Law, consider that the Cray-2 supercomputer, which cost more than $35 million (adjusted for inflation) in 1985, had the same calculation speed as the iPad 2 tablet, which cost under $1,000 in 2011. The same trend can be seen in technologies other than integrated circuits: network capacity and pixels per dollar in digital cameras, for example.
The observation Moore made in his article was fully accepted in the computer engineering world within the next decade, but in the process it was also set as an industry goal and standard for growth. Companies pushed to double speeds every one to two years to keep up with the perceived rate. When engineers hit a physical limit, they found ways around it, like stacking computer chips on top of one another when no more circuits could be squeezed onto a single chip. Many think that the very idea of Moore’s Law spurred innovation in the industry, making it a self-fulfilling prophecy.
Because of the exponential growth described by Moore’s Law, we are seeing astounding advances in computing today, and it’s all happening very fast. Things that were once the stuff of science fiction are becoming realities before our eyes, including the functionality of the complex computer software we know as robotic process automation (RPA). There’s reason to believe that Moore’s Law is finally slowing down, but on the other hand, people have been predicting its end for years. It’s unlikely to stop before the year 2020, which leaves quite a bit of time for RPA to improve well beyond where it is today.
This is the first part of a series on the history of computer robotics.