As we get rolling on this site, let's try to pin down just what a microcontroller is. Sounds simple, right? But what really defines an MCU?
I've been working with microcontrollers and microprocessors all my life, and I still haven't heard a good explanation of what separates one from the other. Most people have some vague notion that microprocessors (or CPUs) are somehow "bigger, better, faster" than microcontrollers (or MCUs). But is that really the case?
One market research firm defines an MCU as any processor chip that has on-chip memory. MCUs can therefore work in a standalone manner, whereas CPU chips require some sort of off-chip memory. That was an OK definition in the 1990s, but today every processor chip has some sort of nonvolatile on-chip memory. So are they all MCUs?
Another company defines MCUs as devices with on-chip peripherals or I/O. CPUs, in their view, are compute-only engines, while MCUs combine both computation and peripherals. OK, but Intel and AMD both make quad-core PC processors with on-chip DRAM controllers and built-in graphics. Does that make them MCUs?
Maybe it has to do with instruction sets. I'm pretty sure an 8051-based chip is going to be a microcontroller, and one running an x86 instruction set probably isn't. But maybe there could be an x86-based MCU -- just not from Intel or AMD. So that's not really a good test.
Also, think about the ARM instruction set, MIPS, or even SPARC. ARM is pretty well defined as an embedded processor family, but does that make all ARM-based chips MCUs? Not necessarily. Companies like Calxeda and Applied Micro make big, fast servers based on ARM chips. Those clearly aren't MCUs, at least not by the usual definition. And MIPS and SPARC were both developed for high-end RISC workstations, so even though they're often used in embedded devices now, I'm not quite ready to call them microcontrollers.
It used to be that MCUs were eight-bit devices, and "computers" always used 16-bit processors (and later 32-bit, then 64-bit, and so on). Nowadays it's pretty safe to label any eight-bit device as an MCU and not a CPU, but how about 16-bitters? Or a 32-bit Cortex-M4 device?
Do high-end features like multicore architecture, big caches, or on-chip accelerators disqualify a chip from MCU-hood? If so, there are a lot of microcontrollers at Texas Instruments, NXP, and other companies that will have to be reclassified. Even low-cost chips today have multiple processor cores and do multitasking.
So what really makes one chip an MCU and the other a CPU? It's hard to pin down, so maybe we have to rely on "I know it when I see it" and see how that works for us here.
Do you know a microcontroller when you see one? If so, let's hear how you recognize it.