A microcontroller, or MCU for short, is a small computer built onto a single integrated circuit chip. Microcontrollers aren’t computers in the sense that a laptop or gaming PC is. They contain one or more CPUs, memory, and input/output pins, but nothing like an operating system or user interface.
Their main interfacing elements are switches, LEDs, and sensors. The programs they can run are limited by their onboard memory; attaching additional memory is costly and inefficient, so whatever a microcontroller is meant to do must fit into the memory available on the chip. Microcontrollers are used for embedded applications rather than personal computing. That means they serve a specific, dedicated purpose within a larger electronics system.
Tip: Compare the function of a microcontroller to that of a cog in a machine. Rather than being directly accessible to the user, it quietly fulfills its purpose of making the system it is in run smoothly.
Microcontrollers are similar to systems-on-a-chip (SoCs). SoCs are somewhat more sophisticated, and the two can appear together – an SoC might, for example, control external microcontrollers connected via a motherboard. Unlike microcontrollers, SoCs usually include some form of GPU and networking hardware, such as a Wi-Fi interface.
Microcontrollers in the Real World
One of their defining characteristics is that they aren’t accessed directly but controlled automatically within larger systems. They might be found in a car’s onboard computer, power tools, or even medical devices. Microcontrollers range in size but tend to be small, making it possible to place them in tiny devices.
They can also be built to consume incredibly little power. While sitting idle or waiting for input, some microcontrollers draw as little as a few nanowatts – billionths of a watt. While not all can be this efficient, many make do with fractions of a watt. This makes them ideally suited for devices that run on limited battery charge.
The History of the Microcontroller
The first microcontroller was created in 1971, though it took until 1974 for one to become commercially available. It featured a simpler design than modern parts and was built specifically for embedded systems. Japanese manufacturers in particular picked up the technology and started making microcontrollers for cars, where they found use in in-car entertainment, automated or sensor-controlled windshield wipers, electronic locks, dashboards, and engine controls.
Tip: A modern, mid-market car will likely have about 30 different microcontrollers. You can also find some in washing machines, ovens, phones, and intercom systems.
Early models were hard to erase and rewrite, and they weren’t easy to manufacture either. This has since ceased to be an issue – since 1993, when a new type of rewritable memory was introduced to microcontrollers, they have become significantly cheaper to make. Most models cost only a few cents to manufacture – and sell for around a dollar, depending on the specifics.
Nowadays, microcontrollers are also used outside dedicated embedded systems – they are popular with hobby engineers who enjoy tinkering with them. Some specific models even have entire online communities devoted to them and their possible uses.
A microcontroller is a small processor. It’s typically used to manage something specific, such as enabling the windshield wipers when water is detected. Microcontrollers are typically entirely automatic, needing no control from a general-purpose CPU, though they can feed back some simple telemetry. As the name suggests, they tend to be physically small.
They also have a small power draw and a low price tag. While early models tended to be locked to their specific function, modern microcontrollers can generally be reprogrammed, though dedicated programming hardware is often required. This programmability enables a community that enjoys tinkering with them and putting them to use in unusual ways.