A reader's post about how MCU hardware and software evolution are moving in opposition got me thinking about how today's common design practices came about.
The conflict seems to be that hardware evolution in MCU designs is a matter of growing in complexity and capability by integrating more and more on chip. This has a positive feel to it. But MCU software evolution seems to be a matter of taking software from larger systems and cutting it down to fit, which feels disappointing.
I think this is mostly a matter of perception and comes from the rise of the PC. (After all, the PC seems to be blamed for many of the world's other ills, so why not here as well?) Here's how I see it as having unfolded.
At one time there were just big computers. Then, they started getting smaller. One day, the single-chip CPU -- the microprocessor -- was born, allowing computers to occupy briefcases instead of closets. This was when embedded systems began to emerge, with the CPU sensing and controlling real-world events through discrete I/O peripherals. Around the same time, the PC arrived. A few IC generations later, the MCU, with its integrated I/O and memory, appeared.
This is where things started inverting. As IC technology improved, the PC kept getting more and more powerful, quickly taking over software functions originally developed for larger computers and then expanding upon them. Because PCs used separate CPUs, I/O devices, and memory, each individual component could roughly double in performance and capacity every 18 months, in keeping with Moore's Law. Further, the memory capacity available to a PC -- including ROM, RAM, and magnetic mass storage -- grew even faster than Moore's Law would suggest as costs came down.
The MCU, on the other hand, aimed to keep size to a minimum while integrating as much as possible onto a single chip. As a result, its total performance could not rise nearly as fast as the PC's, and the best MCU fell further and further behind the best PC in terms of what it could accomplish.
Meanwhile, the PC, along with embedded designs that were not restricted to a single chip, grew in popularity. This, in turn, changed user expectations. People have come to expect that anything with even the appearance of intelligence will provide the same kind of performance and user experience as the PC. So it now appears, at the software level anyway, that MCUs offer only cut-down software.
It's a false image, of course. Looked at on their own, MCUs have been steadily rising in capability, and so has the software they run. They are following the same growth pattern as discrete-device systems like the PC and set-top boxes; they are just a few years behind. It is only when you compare the MCU with discrete-device systems side by side, and expect the user experience to be the same, that MCU software feels like a disappointment.
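To see why a fixed lag keeps feeling like a permanent shortfall, here is a quick back-of-envelope sketch (my illustration, not from the original argument). It assumes the popularized 18-month doubling period and an arbitrary three-year lag for the MCU; with both on the same exponential curve, the capability ratio between them stays constant even as both grow.

```python
# Illustrative sketch: two systems on the same exponential growth curve,
# one lagging the other by a fixed number of years.
# The 18-month doubling period and 3-year lag are assumptions for illustration.

DOUBLING_PERIOD_YEARS = 1.5   # popularized Moore's Law doubling period
MCU_LAG_YEARS = 3.0           # assumed head start for discrete-device designs

def relative_capability(years_elapsed: float, lag: float = 0.0) -> float:
    """Capability relative to the discrete-device baseline at year 0."""
    return 2 ** ((years_elapsed - lag) / DOUBLING_PERIOD_YEARS)

for year in range(0, 13, 3):
    pc = relative_capability(year)
    mcu = relative_capability(year, lag=MCU_LAG_YEARS)
    # The ratio stays the same every year, even though both keep doubling.
    print(f"year {year:2d}: PC x{pc:7.1f}  MCU x{mcu:7.1f}  ratio {pc / mcu:.1f}")
```

Run it and the ratio column never changes: the MCU is always a constant factor behind, which is exactly what makes its software look "cut down" next to the PC's, even though it is growing just as fast.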
Think of the MCU as the little brother of the discrete-device design: smaller, not as mature or experienced, but able to fit into places big brother cannot and able to avoid mistakes big brother has made. Sure, some of the clothes are hand-me-downs, but little brother will grow into them and look just as good as, or better than, their original owner.