Processor Types
Either way, Acorn made use of the 6502 processor in the Atom, some kits, and some rackmount
machines in the late seventies. As 1980 rolled in, the BBC went looking for a computer to fit a
series of programmes it wanted to produce. Unlike these days, when the programmes are much more
likely to be made to fit the computer, the BBC had in mind the sort of specification it was
looking for. A number of companies well known at the time tendered their designs. Acorn revamped
the Atom design, throwing in as much as possible, and built an entire working machine from the
ground up in a matter of days. That's the stuff legends are made of, and that seems to be the
stuff Acorn was good at, like "Hey, guys, let's pull an all-nighter and write an operating
system".
The BBC loved the machine, and the rather naffly named "The Computer Programme" was broadcast
in 1982 alongside the release of the BBC microcomputer. It filled school computer rooms. Many
were sold. Not many in American terms, but staggering in European terms.
The BBC micro, like earlier Acorn machines, was based around the 6502 processor - as were other
popular computers such as the Apple II.
From the outset, you could have colour graphics and text on-screen. Not to be outdone, the BBC
micro offered seven graphics 'modes' of varying types - from high resolution monochrome to eight
colours (plus eight flashing variants) - as well as an eight colour 'teletext' mode that needed
only 1K of memory per screen (see the quick check below). There was a cassette interface for
cheap and cheerful use; on-board provision for a floppy disc interface (you only needed to add a
couple of ICs, such as the 1772 disc controller); serial; four channel analogue input; eight
channel digital I/O; the Tube for co-processors; and a 1MHz system bus for serious fiddling and
for hard discs. And, by adding a couple of extra components, you had built-in networking.
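To put that 1K teletext figure in context - the whole machine had only 32K of RAM - here is a
quick back-of-envelope check in C (standard BBC micro mode dimensions assumed; the program is
purely illustrative, not anything Acorn shipped):

    /* Screen memory for two BBC micro modes, worked out from first
       principles.  MODE 0 packs 2 colours as 1 bit per pixel; MODE 7
       (teletext) stores one byte per character cell. */
    #include <stdio.h>

    int main(void)
    {
        int mode0 = 640 * 256 / 8;   /* 640x256, 1 bit/pixel = 20480 bytes (20K) */
        int mode7 = 40 * 25;         /* 40x25 cells, 1 byte each = 1000 bytes (~1K) */

        printf("MODE 0: %d bytes (%dK)\n", mode0, mode0 / 1024);
        printf("MODE 7: %d bytes\n", mode7);
        return 0;
    }

So a teletext screen left almost all of a 32K machine free, where a bitmap mode could swallow 20K.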
Econet might have been slow and simple, but it was a revolution in those days - days when, it is
said, Bill Gates (among other notable gaffes) asked "what's a network?", though this may well be
urban legend. In any case, running multiple-processor systems and networking all sorts of
machines was something that Acorn users were au fait with long before the PC marketplace even
kicked off, never mind implemented such things itself.
However, Acorn had their sights set on the future, and between 1983 and 1985 the ARM processor
was designed by Steve Furber and Sophie Wilson (or Roger Wilson, back then). This was a leap of
faith and optimism: only a year previously they had released a 32K 8-bit machine, and now they
were designing a 32-bit processor that could cope with up to 16Mb of RAM, and some ROM as well.
Why?
Acorn continued to produce the BBC micro and variants. Indeed, production of their most successful version of the BBC micro - the Master - only finished in May 1993. However, back a decade in 1983, it was quite clear to the innovators inside Acorn that the next generation of machine should provide something far better than a rehash of old ideas. Therein lay the problem: which processor to use? Nothing stood out from the crowd. Acorn had produced a machine with the 16-bit 6502-alike, the 65C816, but it wasn't up to the vision Acorn had. They tried all of the 16 and 32-bit processors available, building second processor units for the BBC micro to aid in their evaluation.
So there was one idea left: to make the processor that they were looking for. Something that kept the ideals of the 6502 but provided raw power. Something small and cheap - both to produce and to power - and something fairly simple, both internally and to program. The important early design decisions were to use a fixed instruction length (which makes it possible to accurately disassemble any random memory address simply by looking to see what is there, as every instruction is word aligned - see the sketch below) and to use a load/store model.
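To see what the fixed instruction length buys, here is a tiny hypothetical C decoder (my own
illustration, not Acorn code). Pick any word-aligned address in ARM code and it is guaranteed
to hold one complete instruction, with the fields in fixed positions - the condition code in
bits 28-31, and the branch (B/BL) pattern in bits 25-27:

    /* Decode a few real ARM instruction words.  Because every instruction
       is exactly one word, decoding can start at ANY word-aligned point -
       no guessing at instruction boundaries as with variable-length CISC. */
    #include <stdio.h>
    #include <stdint.h>

    static const char *cond_names[16] = {
        "EQ","NE","CS","CC","MI","PL","VS","VC",
        "HI","LS","GE","LT","GT","LE","AL","NV"
    };

    int main(void)
    {
        uint32_t code[] = { 0xE3A00001,    /* MOV R0,#1    */
                            0xE0800000,    /* ADD R0,R0,R0 */
                            0xEAFFFFFE };  /* B . (branch to self) */

        for (int i = 0; i < 3; i++) {
            uint32_t instr = code[i];
            int cond = (instr >> 28) & 0xF;             /* condition field */
            int is_branch = ((instr >> 25) & 7) == 5;   /* 101 = B/BL class */
            printf("%08X  cond=%s%s\n", (unsigned)instr,
                   cond_names[cond], is_branch ? "  (branch)" : "");
        }
        return 0;
    }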
At that time, companies were talking about extending their CISC processors. The 8088 became the
80186 (briefly), the 80286, and so on to the processor it is today. RISC processors existed, but
the majority of them were designed in-house as embedded controllers. Acorn took their ideas and
requirements and wrote a BASIC program that emulated the ARM 1 instruction set (sketched below).
The designers of the processor were new to processor design, and some of the tools used were not
exactly cutting edge. This prevented the design from becoming large and complex, which in its
way was the best thing that could have happened, and it is now quite rightly spun as a 'plus'
for the ARM processor.
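That BASIC model is not reproduced here, but the shape of any such interpreter is easy to
sketch. A toy fetch-decode-execute loop in C, with an invented three-instruction mini-ISA
standing in for the real ARM 1 semantics:

    /* A toy emulator loop in the spirit of the BASIC model: registers in
       an array, the PC as an index, one decode per fixed-length word.
       The encoding below is invented for illustration only. */
    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        uint32_t reg[16] = { 0 };   /* R0-R15, with R15 used as the PC */

        /* Invented encoding: op in bits 24-31, dest register in 20-23,
           source register in 16-19, immediate value in 0-15. */
        uint32_t mem[] = {
            0x01000005,     /* op 1: reg[0]  = 5      */
            0x02100000,     /* op 2: reg[1] += reg[0] */
            0x00000000      /* op 0: halt             */
        };

        for (;;) {
            uint32_t instr = mem[reg[15]++];        /* fetch, advance PC */
            uint32_t op  = instr >> 24;
            uint32_t rd  = (instr >> 20) & 0xF;
            uint32_t rs  = (instr >> 16) & 0xF;
            uint32_t imm = instr & 0xFFFF;

            if (op == 0) break;                     /* halt */
            else if (op == 1) reg[rd] = imm;        /* load immediate */
            else if (op == 2) reg[rd] += reg[rs];   /* register add */
        }
        printf("R1 = %u\n", (unsigned)reg[1]);      /* prints R1 = 5 */
        return 0;
    }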
While Acorn had very clear ideas of what they wanted the processor to do, they also wanted good
all-round performance, rather than something so tailored to the end design that it would quickly
obsolete itself.
So. For the processor, Acorn rolled their own.
Please, take a moment to consider this.
Not only did Acorn create an entire powerful and innovative operating system with a tiny
crew (Microsoft probably employs more people to clean their toilets than Acorn employed in
total); they also designed their own chipset.
So basically these guys designed an entire computer from the ground up, on a tiny budget and
with a tiny workforce.
You can fault Acorn for many things - lack of development, lack of advertising - but you can never fault them for having the sheer balls to pull it off in the first place.
At the time the "Archimedes" was released, it was widely touted as the world's fastest desktop machine. It also boasted a display system that could spit out loads of different resolutions. My A5000 (same video hardware) can output 640x480 in 256 colours, or 800x600 in 16 colours. It doesn't sound impressive now, but this was hardware developed in the mid '80s. The rest of the world (save Apple Macs) was using CGA and the like; or Hercules for the truly deranged!
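For scale, assuming a simple packed framebuffer: 640x480 in 256 colours is one byte per pixel, so 640 x 480 = 307,200 bytes (300K) of screen memory; 800x600 in 16 colours is four bits per pixel, so 800 x 600 / 2 = 240,000 bytes (roughly 234K). A mid-'80s video system pushing 300K of pixels to the screen was no small thing.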
Not a lot was made of the fact that the machines were RISC. Maybe Acorn figured the name of the operating system (RISC OS) was a big hint. Maybe they figured they had enough going for the machine without getting all geeky.
So when, in the early '90s, Apple announced the world's first RISC desktop machine, we laughed. And Acorn ran a good-humoured advert in the Times welcoming Apple to RISC.
The chipset was:
  ARM  - the processor itself
  MEMC - the memory controller
  VIDC - the video and sound controller
  IOC  - the input/output controller
The original operating system of the ARM-based machines was to be ARX, but it was taking too
long and running over budget. So Arthur - in essence a port of the BBC MOS - was designed. It
has been said that Arthur's name derives from "A RISC operating system by Thursday". Sadly, it
has a lot of the hang-ups of the BBC micro, such as the lack of memory protection ('modules'
run in SVC mode, when really only the kernel should), the plethora of unrelated things done
with the OS_Byte SWI, the service call mechanism...
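To give a flavour of that OS_Byte grab-bag, here is a hedged sketch (assuming a RISC OS machine
and the _kernel_osbyte() veneer from Acorn's kernel.h; untested here) showing three entirely
unrelated jobs done through the very same SWI:

    /* Three unrelated operations, one SWI.  _kernel_osbyte() packs the
       returned X and Y values into its result (assumed: X in bits 0-7,
       Y in bits 8-15). */
    #include <stdio.h>
    #include "kernel.h"

    int main(void)
    {
        _kernel_osbyte(19, 0, 0);    /* OS_Byte 19:   wait for vertical sync */
        _kernel_osbyte(15, 0, 0);    /* OS_Byte 15,0: flush all buffers */

        int r = _kernel_osbyte(161, 16, 0);  /* OS_Byte 161: read a CMOS RAM
                                                byte (address in X, value in Y) */
        printf("CMOS byte 16 = %d\n", (r >> 8) & 0xFF);
        return 0;
    }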
From Arthur came RISC OS, which improved certain aspects of the system, but perhaps the most
significant improvement was the Desktop. Instead of a bizarre-looking (and horribly coloured)
thing that could only run one task at a time, it introduced proper co-operative multitasking.
The debate between pre-emptive and co-operative multitasking is legion, but I feel that Acorn
wanted co-operative; that it was a design decision instead of a cop-out. Because, while it makes
the system slightly harder to program and more liable to problems with errant tasks, it fits so
beautifully into Acorn's ethos. There's no process 'protection' like on Unix. You can drop to
a privileged processor mode with little more than a SWI call, and a lot of stuff (that probably
shouldn't) runs in SVC mode. Because, at its heart, RISC OS is a hacker's operating system. Not
the same type of 'hacking' that Linux and NetBSD come from - such things were not known in the
home/office computer sector in those days - but in its way, RISC OS is practically begging for
you to whip out the disassembler and start poking around its internals. The original Arthur PRMs
said that any serious application would be written in assembler (a view later changed to
suggest that serious applications would be written in C).
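And the "little more than a SWI call" point can be made concrete. A hedged sketch (again
assuming Acorn's kernel.h on RISC OS; untested): OS_EnterOS - SWI &16 - simply puts the caller
into SVC mode, no questions asked.

    /* Drop from user mode into SVC mode via a single SWI.  In SVC mode
       you can do more or less anything to the machine - hence 'a lot of
       stuff that probably shouldn't runs in SVC mode'. */
    #include <stdio.h>
    #include "kernel.h"

    #define OS_EnterOS 0x16          /* documented SWI number */

    int main(void)
    {
        _kernel_swi_regs regs = { { 0 } };
        _kernel_oserror *err = _kernel_swi(OS_EnterOS, &regs, &regs);

        if (err != NULL)
            printf("Error: %s\n", err->errmess);
        else
            printf("Now in SVC mode - no permission asked.\n");
        return 0;
    }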
When the ARM processor team split off to become ARM Ltd, they adopted a new numbering system for the processors. Originally, the numerical suffix reflected the revision of the device: the ARM 1, the ARM 2, the ARM 3... followed by the ARM two-and-a-half, called the ARM250 in the grand tradition of multiplying version numbers by a hundred.
Now, a single digit denotes the processor macrocell itself - ARM6, ARM7...
A two-digit number denotes a self-contained processor with basic interface circuitry, like the ARM60 (and the VIDC20 - not strictly a processor, but part of the ARM chipset).
A three-digit number denotes the processor macrocell combined with other macrocells or custom logic, like the ARM610 and the ARM710. Because of the simplicity of the designs and the predefined parts, the ARM610 went from specification to silicon in under four months. Short development times are invaluable for custom devices, where every development day matters... It also matters that ARM's designs arrive on time, so you don't end up with your computer or PDA (or whatever) sitting there awaiting its processor. Within ARM's converted barn, a line of opened champagne bottles lines the staircase - a testament to how many of their designs worked from the very first silicon implementation, which is virtually every single one of them.
So there you have it.
From an idea to a global leader in microprocessors (Intel has said recently it is making more ARM silicon than x86 silicon), the ARM processor's birth is wrapped in spectacular innovation.
While it is not entirely certain where RISC OS is heading, one thing is for sure. The beautiful processor in our RISC OS machines is going from strength to strength.
We at Heyrick wish ARM Ltd all the best...