For most people, the history of graphics cards begins with Nvidia and AMD. The reality is completely different.
Long before one of these two companies began to dominate the market, and long before GPUs became the mainstay of modern gaming, we witnessed the early days of computer graphics. To understand how we got to modern GPUs, we have to go back to a time when the term “video card” didn't exist.
What happened before the advent of video cards?
For most people, computer graphics wasn't a big deal yet.
Before we can discuss the new GPUs coming in 2026, someone had to invent computer graphics in the first place – and that's not the same thing as a graphics processor, not even close.
Early computers did not display pixel-addressed images, windows, or graphics of any kind – most output was plain text. Electromechanical teletype machines, such as the 1963 Teletype Model 33, which adopted the then-new ASCII standard, were essentially glorified typewriters, spitting output onto paper one line at a time.
They were painfully slow, loud and very literal. There were no visuals other than the text you told the computer to print.
Next came video terminals, often called “dumb terminals.” They were essentially keyboards attached to screens, but they weren't computers themselves – they connected to a host computer, and the terminal simply displayed whatever the host sent back. The screen was divided into a fixed grid (usually 80 columns wide), and each cell could display one character from a predetermined set. This let people get creative with simple ASCII art, but everything had to be built from that pre-programmed character set.
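The character-grid idea above is easy to sketch in code: the “screen” is a fixed grid of character codes, not pixels, and drawing anything means placing characters into cells. The names (`COLS`, `ROWS`, `put_char`) and the 80×24 grid size are illustrative, not taken from any specific terminal.

```python
# Minimal sketch of a character-cell display like a "dumb" terminal:
# the screen is a fixed grid of character codes, not pixels.

COLS, ROWS = 80, 24          # a common terminal grid size
screen = [[" "] * COLS for _ in range(ROWS)]

def put_char(row, col, ch):
    """Store one character in its grid cell; the terminal's fixed font draws it."""
    screen[row][col] = ch

# "ASCII art" is just characters placed into cells one by one:
for col, ch in enumerate("HELLO"):
    put_char(0, col, ch)

print("".join(screen[0]).rstrip())  # -> HELLO
```

Because the host only ever sends character codes for this grid, the bandwidth and memory needed are tiny compared with a pixel display.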
In the 60s, computer graphics appeared in one form or another.
Although most people still had to deal with text, some researchers were already experimenting with interactive graphics. In 1963, Ivan Sutherland created Sketchpad, a system that allowed users to draw and manipulate lines directly on the screen. Sounds a lot like modern touch screens, doesn't it? Instead of a fingertip, though, Sketchpad users drew with a light pen.
Interactive computer graphics also began finding their way into large corporate systems. IBM began shipping graphics terminals such as the 2250 in 1965.
The advent of the PC accelerated the evolution of graphics.
But this still had little to do with video cards.
The end of the 1970s brought the advent of personal computers – not quite as we know them today, but recognizably personal. Before this, computers were massive machines that filled entire rooms and were used mainly by businesses; the PC made computing widely available. And that brings us closer to computer graphics.
In the early days of computer graphics, we were given low-resolution, often monochrome displays. PCs had limited memory, which meant programmers had to get creative to achieve anything remotely interesting in terms of graphics output.
Machines such as the Radio Shack TRS-80 had raster graphics, but at extremely low resolution (128×48).
Before the advent of graphics cards and accelerators, the CPU and memory played a large role in display output. Because early computers had RAM in kilobytes rather than megabytes, storing full-screen images was expensive and impractical. Graphics were kept minimal and heavily reused as needed.
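A bit of back-of-the-envelope arithmetic shows why full-screen images were so expensive. The resolution and RAM figures below are illustrative examples, not specs from any one machine.

```python
# Why full-screen bitmaps were impractical with kilobytes of RAM.

def framebuffer_bytes(width, height, bits_per_pixel):
    """Memory needed to store one full-screen image."""
    return width * height * bits_per_pixel // 8

# A 640x480 monochrome (1-bit-per-pixel) screen:
mono = framebuffer_bytes(640, 480, 1)   # 38,400 bytes (~37.5 KB)
print(mono, "bytes")
```

On a machine that shipped with 64 KB of RAM, a single uncompressed full-screen bitmap like this would eat more than half of the available memory – which is exactly why graphics were kept minimal and reused.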
This was a time when there were no standard image formats, before JPG, BMP and PNG. The software had to store images as raw bitmaps or custom data structures and compress them like there was no tomorrow.
Early IBM display standards shaped PC graphics
Apple was there too with its Macintosh.
IBM was a major player in the early days of personal computing. In 1981, IBM introduced the “PC” and in 1983, the PC-XT. However, the original IBM PC had virtually no graphics power. Most workloads were still text-based, so text clarity and software reliability were the two main concerns.
The IBM PC could be equipped with a monochrome display adapter (MDA), which was state-of-the-art for its time and provided high-contrast text without any bitmap graphics at all. It was used for word processing, databases and spreadsheets.
The Color Graphics Adapter (CGA), introduced alongside the original IBM PC, finally added basic color graphics – though of course the color palette was tiny and everything ran at ultra-low resolution. But hey, at least we had graphics.
Apple took a different approach with the early Macintosh, or Mac as we know it today. On Mac computers, the screen was treated as a bitmap image rather than a pre-programmed grid of characters, giving the user much more freedom. With this choice, Apple blazed a trail in industries like graphic design and publishing, and remains a leader in similar workloads to this day.
While IBM wasn't very generous with its graphical interfaces back then, it did something much more important: it helped standardize PC display modes across the IBM-compatible ecosystem. As developers and manufacturers optimized their products to be compatible with IBM as an industry standard, the doors to computer graphics were finally wide open.
The advent of 2D graphics was a turning point in computing.
However, we are still a long way from today's GPUs.
By the late 1980s, raster graphics modes had become commonplace on IBM-compatible PCs, and the advent of Windows meant that more software was written for pixel-addressed screens. Text modes didn't disappear overnight, but basic graphics became the norm. And in 1987, the IBM PS/2 introduced VGA, which became the baseline display standard for PCs – and although VGA ports are officially obsolete today, they were revolutionary back then.
VGA finally allowed personal computers to display semi-realistic images, games and movies. It also greatly expanded practical resolution and color options on PCs by introducing the popular 256-color mode.
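Part of what made VGA's 256-color mode so popular with developers was its simple memory layout: at 320×200 with one byte per pixel, the whole frame fits in a single 64 KB memory segment, and each byte is just an index into a 256-entry color palette. The sketch below models that layout in Python; `frame` and `set_pixel` are illustrative names, not real VGA APIs.

```python
# Sketch of VGA's 320x200 256-color framebuffer: one palette index per pixel,
# stored as a flat byte array.

WIDTH, HEIGHT = 320, 200
frame = bytearray(WIDTH * HEIGHT)   # 64,000 bytes: fits in one 64 KB segment

def set_pixel(x, y, color_index):
    """color_index (0-255) selects an entry in the 256-color palette."""
    frame[y * WIDTH + x] = color_index

set_pixel(10, 5, 200)
print(len(frame), "bytes")  # -> 64000 bytes
```

That flat one-byte-per-pixel layout, with no bit-plane juggling, is a big part of why so many early-90s PC games targeted this mode.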
Even if much of the video was still tiny, highly compressed, and generally unimpressive by today's standards, it was groundbreaking for its time. Just as importantly, VGA became a common foundation that developers could build upon, accelerating the graphics revolution. SVGA followed in the late 1980s and early 1990s, and PCs were finally able to handle higher resolutions, up to 1024×768.
However, graphics were still largely handled by the CPU, and dedicated “graphics cards” as we call them today were not yet common, although display adapters such as CGA did exist. By the late 1980s and early 1990s, though, 2D acceleration became increasingly widespread.
Early 2D acceleration took the form of special hardware on add-in video boards (again, they weren't called graphics cards back then). Over the course of the 1990s, these 2D engines were increasingly integrated directly into mainstream VGA/SVGA cards.
These 2D accelerators did a great job of making PCs feel more responsive. More importantly, they made it clear that computer graphics was a workload that deserved its own hardware – a realization that ultimately led to the GPUs we know today.