History of ARM

The Computer Programme #

It all started with a TV show, a 1982 BBC program called "The Computer Programme". This was an attempt by the BBC to educate Britons about computers. The show was part of a larger Computer Literacy Project started by the British government as a response to fears that the UK was deeply and alarmingly unprepared for the new revolution in personal computing that was happening in America.

The BBC wanted to feature a computer on the show that could be used to explain fundamental computing concepts and teach a bit of BASIC programming. The concepts included graphics and sound, the ability to connect to teletext networks, speech synthesis, and even AI. As a result, the computer needed for the show would have to be pretty good. In fact, the producers' demands were initially so high that nothing on the market satisfied the BBC's aspirations.

So the BBC put out a call to the UK's young computer industry, which was then dominated by Sinclair, a company that made its fortune in calculators and tiny televisions. Ultimately, it was a much smaller upstart that won the lucrative contract: Acorn Computers, a Cambridge-based firm started in 1979. Acorn had already developed a home computer called the Atom, and when the BBC opportunity arose, it adapted its plans for the Atom's successor into what would become the BBC Micro.

The BBC's demanding list of features ensured the resulting machine would be quite powerful for the era. That Atom successor was to feature two CPUs: a tried-and-true 6502 and an as-yet undecided 16-bit CPU. Acorn later dropped the second CPU but kept an interface system, called "The Tube", that allowed additional CPUs to be connected to the machine.

The engineering of the BBC Micro really pushed Acorn's limits, resulting in some fascinatingly weird but workable engineering decisions. For example, nobody ever really figured out why the machine only worked when a finger was placed on a certain point on the motherboard, but once the engineers could emulate the touch of a finger with resistors, they were satisfied it worked and moved on.

The BBC Micro proved to be a big success for Acorn, becoming the dominant educational computer in the UK in the 1980s. Sales skyrocketed and were on pace to exceed 1.2 million units.

The 1980s #

The 1980s were a very important time in the history of computing. The world of personal computers was changing. IBM's PC was released in 1981, setting the standard for personal computing for decades to come. The Apple Lisa in 1983 presaged the windows-icons-mouse graphical user interface that would come to dominate computing. The market for cheap 8-bit micros was becoming saturated, and new machines, like the IBM PC and Apple Macintosh, promised significantly more power and ease of use.

Acorn needed a way to compete, and realized that something more powerful than the aging 6502 would have to power its future machines.

ARM #

Acorn had already been experimenting with a number of 16-bit CPUs: the 65816, the Motorola 68000, and the National Semiconductor 32016, but none of them was really doing the job. Acorn reached out to Intel to ask about using the 80286 in its new architecture, but Intel ignored the request completely. So Acorn considered designing its own CPU. But was it even thinkable for tiny Acorn to build a CPU from scratch? To find out, engineers Sophie Wilson and Steve Furber took a trip to National Semiconductor's factory in Israel. They saw hundreds of engineers and a massive amount of expensive equipment, which confirmed their suspicion that such a task might be beyond them.

Then they visited the Western Design Center in Mesa, Arizona. This company was making the 6502 and designing a 16-bit successor, the 65C816. Wilson and Furber found little more than a "bungalow in a suburb" with a few engineers and some students making diagrams using old Apple II computers and bits of sticky tape. Suddenly, making their own CPU seemed like it might be possible. Wilson and Furber's small team had built custom chips before, like the graphics and I/O chips for the BBC Micro, even though those designs were simpler than an actual CPU.

Upper management at Acorn supported their efforts. Acorn co-founder Hermann Hauser gave the team copies of IBM research papers describing a new approach to CPU design: reduced instruction set computing (RISC). A RISC design uses fewer, simpler instructions and less hardware on the chip itself, with the goal of executing every instruction in a single clock cycle. As a result, programs need more instructions, but the chip is simpler and can execute each instruction faster.
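
To make that tradeoff concrete, here is a minimal illustrative sketch in Python (nothing from Acorn; the addresses, register file, and mnemonics are invented for the example) of how one complex memory-to-memory operation on a traditional CISC design breaks down into several simple load/store steps on a RISC design:

```python
# Illustrative toy model of the RISC tradeoff: the same memory-to-memory
# add done as one "complex" CISC operation vs. four simple RISC steps.

def cisc_add(memory, a, b):
    # One complex instruction, e.g. ADD [a], [b]: reads and writes memory directly.
    memory[a] = memory[a] + memory[b]

def risc_add(memory, regs, a, b):
    # Four simple instructions, each doing one small single-cycle task.
    regs[0] = memory[a]               # LDR r0, [a]    -- load
    regs[1] = memory[b]               # LDR r1, [b]    -- load
    regs[0] = regs[0] + regs[1]       # ADD r0, r0, r1 -- register-only add
    memory[a] = regs[0]               # STR r0, [a]    -- store

mem = {0x100: 7, 0x104: 5}
risc_add(mem, [0] * 16, 0x100, 0x104)
print(hex(mem[0x100]))  # 0xc -- same result, more (but simpler) instructions
```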

The development of the first Acorn RISC Machine (ARM) chip took 18 months. To save money, the team spent a lot of time testing the design before committing it to silicon. Furber wrote an emulator for the ARM CPU in interpreted BASIC on the BBC Micro. This was incredibly slow, but it helped prove the concept and validate that Wilson's instruction set would work as designed.
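
As a rough modern analogue of what such an emulator does (a sketch only, with a toy instruction set invented for illustration, not Acorn's actual code or the real ARM encoding), the core is a fetch-decode-execute loop over a model of the registers and memory:

```python
# A minimal sketch of an instruction-level CPU emulator loop, in the spirit
# of Furber's BBC BASIC emulator. The instruction set here is invented.

regs = [0] * 16          # register file
pc = 0                   # program counter

# Each toy instruction: (opcode, dest, src1, src2_or_immediate)
program = [
    ("MOVI", 0, None, 40),   # r0 = 40
    ("MOVI", 1, None, 2),    # r1 = 2
    ("ADD",  2, 0, 1),       # r2 = r0 + r1
    ("HALT", None, None, None),
]

running = True
while running:
    op, dst, src1, src2 = program[pc]   # fetch + decode
    pc += 1
    if op == "MOVI":                    # execute
        regs[dst] = src2
    elif op == "ADD":
        regs[dst] = regs[src1] + regs[src2]
    elif op == "HALT":
        running = False

print(regs[2])  # 42
```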

Furber did much of the layout and design of the chip itself, while Wilson concentrated on the instruction set. The two jobs were deeply intertwined. Each instruction's code number was chosen so that, when translated into binary digits, the appropriate wires along the instruction bus would activate the right decoding and routing circuits.
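
As an illustration of what that means in practice: the ARM data-processing instruction format, as later documented, packs the condition code, opcode, and register numbers into fixed bit fields, so a decoder can route an instruction simply by slicing the 32-bit word. The field layout below is that documented format; the Python decoder itself is a sketch for illustration:

```python
# Slicing the fixed bit fields of a 32-bit ARM data-processing instruction:
# cond[31:28] | 00 | I[25] | opcode[24:21] | S[20] | Rn[19:16] | Rd[15:12] | operand2[11:0]

def decode(word):
    return {
        "cond":      (word >> 28) & 0xF,   # condition code (0b1110 = always)
        "imm":       (word >> 25) & 0x1,   # is operand2 an immediate value?
        "opcode":    (word >> 21) & 0xF,   # 0b0100 = ADD, 0b0010 = SUB, ...
        "set_flags": (word >> 20) & 0x1,   # update condition flags?
        "rn":        (word >> 16) & 0xF,   # first operand register
        "rd":        (word >> 12) & 0xF,   # destination register
        "operand2":   word        & 0xFFF, # second operand (register or immediate)
    }

# 0xE0802001 encodes ADD R2, R0, R1
print(decode(0xE0802001))
```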

Later, Wilson led a team that wrote a more advanced emulator. ARM machine code proved so efficient that, even interpreted, it often outperformed compiled code on the same platform. These results spurred the small team on to finish the job. The design for the first ARM CPU was sent to be fabricated at VLSI Technology, Inc., an American semiconductor manufacturing firm. The first version of the chip came back to Acorn on April 26, 1985. Wilson plugged it into the Tube slot on the BBC Micro, loaded up the ARM version of BBC BASIC, and tested it with a special PRINT command. The chip replied, "Hello World, I am ARM", and the team cracked open a bottle of champagne.

The entire ARM design team consisted of Wilson, Furber, a couple of chip designers, and a four-person team writing testing and verification software. This new 32-bit RISC CPU was created by fewer than 10 people. The V1 chip had only 27,000 transistors (while the 80286 had 134,000) and was fabricated on a 3-micrometer process (3,000 nm), about a thousand times less granular than today's CPUs. It had no onboard cache memory, no multiplication or division circuits, and no floating point unit, though the use of a simple barrel shifter helped with floating point operations. It ran at a very modest 6 MHz, yet it was found to be roughly 10 times faster than an Intel 80286 at the same clock speed and equivalent to a 32-bit Motorola 68020 running at 17 MHz.
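
To see how a barrel shifter can stand in for missing multiply hardware, here is an illustrative Python sketch (not Acorn code) of the classic shift-and-add technique a compiler or programmer would use on a chip without a multiplier:

```python
# Illustrative: multiplication via shifts and adds, the standard technique
# on a CPU with a barrel shifter but no multiply circuit.

def shift_add_multiply(a, b):
    # Examine b one bit at a time, adding a shifted copy of a for each
    # set bit. The barrel shifter makes each shift essentially free.
    result = 0
    shift = 0
    while b:
        if b & 1:
            result += a << shift   # add a, shifted left by the bit position
        b >>= 1
        shift += 1
    return result

print(shift_add_multiply(6, 7))  # 42
# For constants, it collapses further: x * 10 == (x << 3) + (x << 1)
```

On the real chip, the shift genuinely came for free: any ARM data-processing instruction could shift one operand through the barrel shifter within the same cycle.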

Archimedes #

What the company needed was a brand new computer to really showcase the power of this new CPU. Before it could do that, it needed to upgrade the original ARM just a bit. The ARM V2 came out in 1986 and added support for coprocessors (such as a floating point coprocessor) and built-in hardware multiplication circuits. It was fabricated on a 2-micrometer process, which meant that Acorn could boost the clock rate to 8 MHz without consuming any more power.

But a CPU alone wasn't enough to build a complete computer, so the team built a graphics controller chip, an input/output controller, and a memory controller. By 1987, all four chips, including the ARM V2, were ready, along with a prototype computer to put them in: the Acorn Archimedes.

By 1987, personal computers were expected to come with more than just a BASIC interpreter prompt. Users demanded pretty graphical user interfaces like those on the Amiga, the Atari ST, and the Macintosh. To meet that demand, Acorn had set up a remote software development team in Palo Alto, California, to design an operating system for the Archimedes. The quick and dirty OS, called "Project Arthur", was put together in five months, based on an extension of BBC BASIC with a window manager, and burned into the ROM of the first batch of computers.

The first Archimedes models shipped in June 1987 at an introductory price of 800 pounds. The machine got good reviews in the UK press and gained a passionate fan base.

Low Power / Low Heat #

The ARM's low transistor count was a telltale sign of the design's relative simplicity. As a result, the chip used much less power and ran far cooler than its competitors.

The ARM chip was specifically designed to run at very low power, and Wilson explained that this was entirely a cost-saving measure. The team wanted to use a plastic case for the chip instead of a ceramic one, so they set a maximum target of 1 watt of power usage. To make sure they didn't go over the limit and melt the plastic, they were very conservative with every design detail, and the actual power draw ended up around 0.1 watts. In fact, one of the first test boards the team plugged the ARM into had a broken connection and wasn't attached to any power supply at all, yet the CPU had been working the whole time, running on electrical leakage from the support chips.

This low power consumption and low heat production made the ARM a natural fit for mobile devices, which is why Apple came sniffing around in the late 1980s. Apple was looking for a CPU powerful enough to translate handwriting into text while running a GUI, all on AA batteries, without turning the handheld device it would power into a hand-burning block of pain: the Apple Newton.

Apple partnered with Acorn to spin off the ARM division into its own new company, called Advanced RISC Machines. Under this alliance, ARM would develop the ARM6 core; the ARM610 CPU, the first production chip based on that core, would go on to power the Apple Newton in 1993 in a 20 MHz version.

While the Newton was something of a failure, in hindsight it was much, much more: a handheld, battery-powered, touch-screen device powered by an ARM CPU. Today, that same description fits the modern smartphone.

Later, in 2001, an ARM7-core CPU would power Apple's iPod and Nintendo's Game Boy Advance. In 2004, a pair of ARMs would drive the twin screens of the Nintendo DS. Then, in 2007, Apple released the first iPhone with its ARM11-core CPU. From that moment on, everything went ARM: ARM CPUs became the default choice for smartphones.


References #

  1. Jason Torchinsky (2020), "How an obscure British PC maker invented ARM and changed the world", Ars Technica.
  2. Jeremy Reimer (2022), "A history of ARM, part 1: Building the first chip", Ars Technica.