As Neal Stephenson, the cyberpunk sci-fi author, so insightfully observed in his non-fiction work ‘In the Beginning... Was the Command Line’, this preference for a programmable computer, one that would neatly fit into the corner of every living room, gave rise to an army of young coders. And that, dear reader, included us.

The Spectrum, the VIC-20, the Commodore 64, the quintessentially British BBC Micro: so many 8-bit home computers, with so many incompatible operating systems and vaguely similar versions of the programming language BASIC. It was a fascinating time to be alive.

The indisputable thing that made them marketable, that made people want to buy them, was that you could plug them into a normal television and play the same games that had previously been the exclusive domain of the coin-op arcades. Most importantly, the games were graphical. But unlike the Atari and Nintendo console machines, you could code your own games, cool visual tools and demos.

Those early days of home computing are part of Cadasio’s DNA. Many an hour was spent by our youthful selves hacking about with code copied from one of the ubiquitous computing magazines of the day. Given the graphical nature of Cadasio, we thought it would be interesting to look at how we got from 8-bit 2D games to fully interactive 3D web-based apps, and how the pursuit of entertainment delivered some really useful things.

Before we start, I know what some of you are thinking: what about CAD? Obviously Cadasio, as our name implies, is all about turning CAD data into a whole range of useful things. But the main drivers for consumer adoption of 3D graphical tools were the desire for affordable games consoles and the need for home PCs. CAD systems for engineering have always been expensive, due to the need for raw power and complex software. While CAD technology has been a driver for innovation in many industries, we feel that home computers and games pushed a lot further into the consumer’s purview.

Sprites, Bitmaps and Vectors

In the beginning there were essentially two ways of making computer graphics: sprites/bitmaps and vectors.

Vector graphics represented images as mathematically defined lines and curves, allowing for smooth, scalable shapes. The most notable example of vector graphics during this time was Atari's arcade game "Asteroids," which featured wireframe graphics. The game was widely cloned and arrived on all the 8-bit home computer systems in one form or another during the early 80s.

There are many decent JavaScript versions available to play today.

The classic Asteroids game

A few years later, British developers David Braben and Ian Bell released the classic space combat and trading game Elite, which also used vector graphics. Initially a big hit on the BBC Micro, Elite went on to be released on a total of thirteen different platforms. It inspired a whole generation of developers who went on to make games like Grand Theft Auto and Eve Online.

The introduction of sprite graphics revolutionised the gaming industry in the early 1980s. Sprites were small, movable objects that could be individually controlled and overlaid onto a background, allowing for more detailed and animated graphics. Systems like the ZX Spectrum and Commodore 64 popularised the use of sprites in gaming; Jet Set Willy and Manic Miner are great examples of sprite-based games.

As computing power increased, bitmapped graphics started to gain prominence. Instead of representing images mathematically like vector graphics, bitmapped graphics stored images as a grid of pixels. This approach allowed for more complex and detailed visual representations.

Due to the limitations of the early 8-bit machines, resolutions were low, so everything looked a bit blocky, and the number of colours a machine could display was limited, which made for some interesting art direction. People forget the creativity that was needed to make great graphics under such constraints. Here’s Sylvester Stallone in all his 80s 8-bit, limited-palette glory:

Cobra had a great title screen

Incidentally, Ocean Software (featured in the above image) was a hugely successful British games developer, based in Manchester, that produced some of the most memorable titles of the 80s and eventually became part of the French publisher Infogrames.

To 16-bit and Beyond…

There’s only so much you can do with 8 bits, and a limit to how useful such home computers could be. There was a limited selection of software, like home accountancy applications, for the main platforms, but the focus was always games. As such, the market for home computers in the 1980s was limited to young people into games and enthusiasts. But technology always finds a way, and as the 1990s approached things got a lot better: 16-bit machines arrived.

Atari launched the Atari ST in June 1985; hot on its heels, in July of that year, came Commodore with the first version of the Amiga.

Meanwhile, IBM agreed with Microsoft that MS-DOS would be able to run on compatible PCs not made by IBM. This opened up the market to makers of IBM PC clones (Viglen, Tandy, Amstrad, etc.) and these machines, previously the reserve of the business world, became accessible to home users. But perhaps just as important: Apple was becoming a highly recognisable brand, and it was the leading champion of the graphical user interface - few of us now remember a time before the GUI. More on this later.

But how significant were these chip architecture advances for graphics and what changes did they herald?

First, let’s look at the technology changes.

Increased Memory: One significant advantage of 16-bit computers was their expanded memory capacity. With 16 bits, these machines could address larger amounts of memory, allowing for more extensive graphical data storage. This extra memory allowed for higher-resolution graphics, larger colour palettes, and more detailed images compared to the limited memory of 8-bit machines.

Enhanced Colour Depth: 16-bit computers supported a much wider range of colours than their 8-bit counterparts, with palettes running into the thousands. The increased colour depth facilitated more realistic and visually appealing graphics, enabling smoother gradients, more accurate colour representation, and improved overall visual fidelity.

Improved Graphics Modes: 16-bit computers introduced advanced graphics modes that offered higher resolutions and more flexible display options. For example, the VGA (Video Graphics Array) standard introduced by IBM in the late 1980s allowed for resolutions up to 640x480 pixels and 256-colour modes. This was a significant leap from the lower resolutions and limited colour capabilities of most 8-bit machines, which typically operated at resolutions around 320x200 pixels.

Apart from video game visual improvements there was one huge step change to home computing which the new architecture enabled: the graphical user interface (GUI).

The windows, icons, mouse and pointer (WIMP) user interface had been around since Xerox PARC developed it back in the 1970s, but until home computers got more powerful they could only be controlled with a command line. Apple had been quick to adopt this easy-to-use interface, and the Atari ST and Amiga were also bundled with early forms of WIMP-enabled operating systems.

Going from this:

Commodore 64 BASIC v2

To this:

Amiga Workbench

This was a huge change: the graphical user interface opened up the home computer market to people who had previously never considered owning one. The command line was scary; the GUI was cool and easy to use. Apple Macs were the gold standard for this new approach, but they remained expensive, and the proliferation of IBM PC clones created a competitive marketplace. Microsoft knew it had to step up its GUI game.

MS-DOS was Microsoft’s operating system, and all the games and applications ran on top of it. Microsoft Windows, when it was first released, also ran on top of MS-DOS. MS-DOS was primarily a command-line operating system, which required some specialist knowledge to get the most out of it. With Windows, tasks that would have taken a complicated string of commands could be executed with a click, a drag and a drop. But Microsoft wasn’t too committed to the GUI at first. Windows 1 was released in 1985 and is generally regarded as a bit of a flop. Windows 2 was a bit better, but it wasn’t until Windows 3.1 that Microsoft started taking the WIMP thing seriously. By then we were in the early 90s and Microsoft was a bit behind the curve.

Hardware Accelerated

32-bit computers arrived, Windows 95 came along, and it felt like Microsoft finally had its GUI under control. Apple was still making, as it continues to do, beautiful but expensive products. The home computer market narrowed down to IBM compatibles running Windows, or the Apple Mac. The Amiga and Atari ST had their swan song and, once again, gracefully passed into the realm of the ‘enthusiast’. All the while, something more connected, more distributed and profoundly world-changing was happening behind the scenes…

But before we move on to that rather obvious change, what was happening to the hardware making the graphics go pop? By the mid-90s, console and PC games were moving away from bitmap-based games into the 3D worlds we are familiar with today, where bitmaps became textures applied to vector-style environments.

Doom was a favourite of the team when they were growing up

While there were other 3D games that came before it, Doom by id Software was the first 3D game that got serious traction on the PC. It had a great single-player campaign that was easy to learn but required real skill to master, plus co-op and player-versus-player multiplayer modes. Still loved today, it’s been ported to so many different platforms - including some weird ones like electronic pregnancy test kits. It was quite literally a game changer. Looking at these shifts in consumer desire, hardware manufacturers spotted an opportunity to make hardware specifically to enhance 3D games and applications.

The first big player in this domain was 3dfx with their Voodoo range of ‘graphics cards’. Although much loved, they didn’t last long, filing for bankruptcy in 2002. But they started something that had some unlikely side effects. Graphics Processing Units (GPUs) are designed specifically to draw millions of polygons per second, useful in 3D games, whereas the central processing unit (CPU) of your computer is a generalist and must handle the sorts of calculations needed across a range of activities. This means that GPUs are very good at a certain type of calculation, and it just so happens that these calculations are useful for a couple of other things too.

Firstly, in cryptocurrency, calculations are used to ‘mine’ currencies like Bitcoin and Ethereum. Miners use computational power to solve complex mathematical problems, which allows them to secure the network and earn rewards in the form of newly minted cryptocurrency. It just so happens that GPUs are much better at this task than CPUs. As a result, a couple of years ago it was difficult to find a decent graphics card, as cryptominers were buying them all as soon as they hit the stores. This drove the prices of GPUs through the roof, and they still haven’t fully recovered, even though mining crypto isn’t as lucrative as it once was.

Secondly, the recent boom in AI has been powered largely by advances in GPUs. Thanks to their highly parallel architecture, with far more cores than CPUs, and their effectiveness at matrix operations such as matrix multiplications and convolutions - used intensively in neural networks - GPUs are a relatively cheap way to build things like object recognition systems. For more detail on this you can read our previous article on the subject.

All this started with the market pull for better 3D graphics in games. But it was another pull that eventually led to the technology that enabled us to build Cadasio.

And then there was the web browser…

You can still visit the world's first website if you like.

The world's first webpage

Not a stunner, is it? But we don’t need to tell you what it led to. In the 1980s we had a lot of different home computers with incompatible operating systems, meaning that games and apps had to be redeveloped for each system. Over time, market forces and savvy business decisions rationalised these down to essentially two consumer platforms: PC and Mac. But then along came the internet and the web browser, and this created a kind of meta-platform: a space in which all manner of things could happen, and it didn’t really matter what computer - or, later on, which mobile phone - it was running on.

Just like the creation of the GUI opened up the home computer market to non-technical people, the web browser allowed us all to search for and view content on an endless range of subjects, and it quickly became part of our daily lives. But there is only so much you can do with text and hyperlinks. Even when images and eventually video were added, web browsers didn’t really get more functional than a shopping cart form.

JavaScript and CSS added basic graphical elements, but Macromedia (subsequently Adobe) Flash was the first real push into highly functional visual experiences. It was, for a period of time, huge. Flash games, Flash financial trading systems, entire websites were built out of Flash. People made some really cool stuff with it. Flash was essentially a plug-in that worked in your web browser but was tied into your operating system - so it wasn’t truly independent of the platform. But something happened that killed it pretty much overnight. In 2007 Apple launched the iPhone and decided not to include Flash in the handset’s version of its web browser, Safari. This was almost inconceivable: Flash was so ubiquitous at the time that anyone with the new iPhone couldn’t use a lot of sites properly. But it turned out the iPhone was just so awesome it didn’t matter.

By 2010 Apple had essentially declared war on Flash and that was enough to put an end to it.

But something had been brewing at the research labs of the Mozilla Foundation that would not only go on to replace Flash, but be better and supported by all popular web browsers: WebGL.

WebGL (Web Graphics Library) is a JavaScript API for rendering graphics in compatible web browsers, and all modern browsers worth their salt support it. It’s based on a much older graphics library, OpenGL, which has been around since the early 90s and provides a set of functions and commands that let developers work with a system’s GPU without having to worry about the specifics of each GPU type or the platform it’s plugged into.
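Raw WebGL is pretty low level. Just as a flavour, here’s a minimal, purely illustrative sketch that asks the browser for a WebGL context and clears the canvas to a single colour (the canvas id is our own made-up example):

```javascript
// Minimal sketch: grab a WebGL rendering context from a <canvas>
// element and clear it to a solid colour. Assumes the page contains
// <canvas id="scene"></canvas>; the id is purely for illustration.
const canvas = document.getElementById('scene');
const gl = canvas.getContext('webgl');

if (!gl) {
  console.error('WebGL is not supported in this browser');
} else {
  gl.clearColor(0.1, 0.1, 0.2, 1.0); // RGBA, each channel from 0 to 1
  gl.clear(gl.COLOR_BUFFER_BIT);     // paint the whole canvas that colour
}
```

And that’s before you’ve drawn a single triangle, let alone written any shaders.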

To make the creation of graphically based websites even easier, the very talented developer Ricardo Cabello (better known as Mr Doob) created a JavaScript library and API called three.js, which makes it a lot simpler to create scenes, animations, lighting and so on. Anyway, we don’t need to get bogged down in what happens under the hood; let’s focus on what it does.
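That said, for anyone curious what the code looks like, here’s a minimal illustrative sketch (assuming three.js is installed as a module; this is not Cadasio’s actual code) that puts a spinning, lit cube on the page:

```javascript
import * as THREE from 'three';

// The three basic building blocks: a scene, a camera and a renderer.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  75, window.innerWidth / window.innerHeight, 0.1, 1000
);
camera.position.z = 3;

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// A simple cube: geometry plus material, combined into a mesh.
const cube = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshStandardMaterial({ color: 0x44aa88 })
);
scene.add(cube);

// MeshStandardMaterial reacts to light, so add one.
scene.add(new THREE.DirectionalLight(0xffffff, 1.5));

// The render loop: rotate the cube a little each frame.
function animate() {
  requestAnimationFrame(animate);
  cube.rotation.x += 0.01;
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);
}
animate();
```

Under thirty lines for a lit, animated 3D object running in the browser.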

Slowroads.io is a fantastic browser-based game

Of course, there are many games. Our favourite web-based way to unwind is the procedurally generated slowroads.io. You can drive a car, a bus or a motorbike through winter, spring, summer or autumn landscapes - or even on Mars. Forever. It will just keep randomly generating landscapes. It’s very relaxing.

But the world has adopted WebGL for so much more. We built Cadasio in three.js - it was the obvious choice for us, with built-in functions for setting up animations and creating scenes with lighting and a choice of camera types.

Car manufacturers use interactive, configurable and always beautiful 3D models to try and sell their products. Here is the latest offering from Volkswagen.

Car configurators are becoming extremely popular

Consumer products such as clothing and equipment can also be configured using WebGL. Need a new motorbike helmet?

A 3D helmet configurator is only now possible thanks to recent WebGL technologies

These are great examples of product demonstrators and configurators, but you don’t need to be a developer to make them: for more straightforward product demos and instructional content, you can use Cadasio too.

We could go on; there is so much awesome stuff out there built in three.js and WebGL. But before we summarise, a quick shout-out to Bruno Simon, a fellow three.js developer (seriously, check out his portfolio site) and educator who has created an excellent, straightforward three.js course that our non-technical team members are taking to get up to speed with how Cadasio works. They’re really enjoying it. So if you’re interested in learning three.js, check out his course.

In summary…

Hopefully you’ve enjoyed this journey tracing the origins of home computer graphics: how they ignited an appetite for more beautiful and realistic games, how this in turn pushed hardware manufacturers to give developers more and more powerful tools, and how those advances were repurposed to give users easier ways to interact with their computers. The advent of the internet created a need to merge graphics technology into platform-independent but graphically beautiful web-based user experiences. After the death of Flash (the first attempt), that need delivered WebGL, which made it possible to build games and tools that work in a 3D world of limitless possibilities. One of those possibilities is Cadasio. We love making Cadasio and we’re looking forward to seeing how it will grow and what it will become. We hope you’re with us all the way.