The Evolution of Video Game Graphics: From Pixels to Realism

In the 80’s, we were happy with the first 8-bit screens with their square pixels, while today we marvel at photorealistic worlds where every leaf and drop of water looks like real life. The evolution of graphics in video games has been made possible by decades of technological breakthroughs. 

We’ll tell you all about how gaming graphics became so realistic, and which technologies have changed gaming.

The Birth of 3D Gaming

In the mid-90’s, the video game industry made a huge leap from 2D games to 3D worlds. The emergence of 3D graphics both transformed games’ visual components and changed the very concept of how players interacted with virtual gaming spaces. Developers were able to create fully 3D objects, and graphics hardware allowed them to render those objects in real time.

How Graphics Hardware Changed The Video Game Market

One of the most important developments in 3D graphics was the emergence of dedicated GPUs. Until the mid-1990’s, video cards were mainly used for processing 2D graphics, while 3D rendering was largely left to the CPU. As the complexity of 3D models and scenes in gaming increased, however, standard graphics cards couldn’t keep up with the load.

In 1996, 3Dfx Interactive released Voodoo Graphics, a revolutionary graphics card that accelerated 3D object rendering. This allowed games to include high-quality textures and effects. Voodoo Graphics quickly became popular with gamers and gaming companies, as it allowed for smoother and better-looking graphics. This was particularly evident in Need for Speed: Special Edition (1996) and Quake (1996).

Need For Speed: Special Edition (1996). Source

Voodoo Graphics could handle complex 3D effects such as texture filtering and lighting, allowing developers to create detailed game worlds and then scale them up. This technology opened up tons of new possibilities for building realistic worlds with dynamic lighting effects and textures, and the quality of games’ visuals increased quite a bit.

But it was the graphics processor built into the Sony PlayStation that really boosted the hardware’s evolution. In 1994, the console was already offering users built-in support for 3D graphics, allowing them to achieve high-quality rendering without additional hardware. PlayStation games such as Tekken and Ridge Racer used this built-in rendering to create 3D worlds that could rotate and change perspective.

Tekken. Source

Nvidia GeForce and ATI Radeon graphics cards were introduced in the late 90’s and early 2000’s. They brought hardware support for transform and lighting, textures, and eventually programmable shaders and complex 3D effects.

Video cards with Direct3D and OpenGL support gave developers tools to create dynamic shadows, reflections, mirrors, and complex destruction scenes. This allowed for games with more elaborate textures and lighting, and the graphics in a lot of the popular games in the late 90’s and early 2000’s became much better.

Pioneers: How True 3D Gaming Came About

In the early 90’s, games with full-fledged 3D graphics didn’t exist yet. Developers used pseudo-3D techniques that created the appearance of volume while the objects themselves remained 2D. This all changed with the advent of more powerful hardware that let developers build real 3D worlds, where players could not only look at the environment but also interact with it.

Doom (1993) from id Software changed the way game graphics were approached. It used 2D sprites and textures, but the developers found a way to create the illusion of 3D space: the level geometry looked three-dimensional thanks to the engine’s clever 2.5D rendering, built around binary space partitioning. This wasn’t true 3D, but at the time the game made a huge impression. Doom became a hugely popular shooter, and inspired a lot of developers.

Doom. Source 
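The pseudo-3D trick of that era is easiest to see in ray casting, the technique used by Doom’s predecessor Wolfenstein 3D: for every column of the screen, march a single ray across a flat 2D map and draw a wall slice whose height shrinks with distance. Here is a minimal sketch of the idea; the map layout, player position, and step size are all made up for illustration:

```python
import math

# A tiny grid map: 1 = wall, 0 = empty space.
MAP = [
    [1, 1, 1, 1, 1],
    [1, 0, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [1, 1, 1, 1, 1],
]

def cast_ray(px, py, angle, step=0.01, max_dist=20.0):
    """March a ray from (px, py) until it hits a wall; return the distance."""
    dx, dy = math.cos(angle), math.sin(angle)
    dist = 0.0
    while dist < max_dist:
        x, y = px + dx * dist, py + dy * dist
        if MAP[int(y)][int(x)] == 1:
            return dist
        dist += step
    return max_dist

def render_columns(px, py, facing, fov=math.pi / 3, columns=8):
    """For each screen column, cast one ray; nearer walls draw taller slices."""
    heights = []
    for i in range(columns):
        ray_angle = facing - fov / 2 + fov * i / (columns - 1)
        dist = cast_ray(px, py, ray_angle)
        # Perceived wall height is inversely proportional to distance.
        heights.append(round(1.0 / max(dist, 1e-6), 2))
    return heights

print(render_columns(px=2.0, py=2.0, facing=0.0))
```

Even though the map is purely 2D, the varying slice heights produce a convincing first-person corridor, which is why the approach is often called 2.5D.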

Quake, introduced to the world in 1996, was a real breakthrough. Unlike Doom, the whole game was rendered in 3D: the weapons, the enemies, and the environment. Players could change the angle of view, rotate the camera, and explore the world from any angle. Quake also added physics simulation, which made interacting with objects more realistic. Its engine, the Quake engine, became the basis for dozens of new projects, and defined what 3D graphics should look like.

Quake. Source

5th Gen Consoles And Their Hits

In the late 90’s, 5th generation consoles formed the basis of future 3D game development. During this time the Sony PlayStation, Nintendo 64, Sega Saturn, and Atari Jaguar were all on the market. These consoles brought significant changes to the industry, and made 3D graphics accessible to the masses. These platforms spawned what would become iconic projects and set new standards for video game development for years to come.

Sony PlayStation

Released in 1994, Sony’s PlayStation broke new ground for three-dimensional video games. The console quickly gained popularity thanks to its extensive library of games, including many that showed the possibilities of 3D graphics in different genres. Developers began to create more complex and elaborate worlds filled with multi-layered detail for PlayStation.

One of the first games to showcase the PlayStation’s capabilities was Gran Turismo (1997). This simulator let gamers feel what it’s like to sit behind the wheel of realistic automobiles on detailed tracks. The game was impressive for its visuals, as well as the effort put into creating physical models of the cars. For the first time, a racing simulator showed how 3D graphics could be used to build an impressive simulation while accurately rendering every element.

Gran Turismo. Source

Another landmark project on the PlayStation was Final Fantasy VII (1997). The game brought 3D graphics to the series, using polygonal character models for exploration and battles alongside cinematic cutscenes. The game’s story gained in popularity, but it was the 3D graphics used in battle scenes and cinematic inserts that made Final Fantasy VII a true sensation: a breakthrough in Japanese RPGs that influenced the entire industry.

Another game, Metal Gear Solid (1998), used 3D graphics to create a rich atmosphere. Players could move around 3D levels and rotate the camera freely, giving them a sense of complete freedom. It was one of the first projects to utilize a 3D environment to create a unique experience that blended stealth and suspense.

Metal Gear Solid. Source

Nintendo 64 

The Nintendo 64, released in 1996, was a unique platform for developing 3D games. Because it used cartridges instead of CDs, the console could load games faster but had limited data capacity. In spite of this, the Nintendo 64 offered unique solutions for displaying 3D graphics, and this was the start of several iconic games.

Super Mario 64 (1996) was the first full-fledged 3D platformer and one of the most significant projects for this console. The game used a 3D environment that players could move around in freely. The camera could be rotated, which allowed for better control and a better view of the scene. Previously, platformers were only 2D, but Super Mario 64 broke new ground in the genre. The game also employed innovative gameplay elements such as level variety and the ability to interact with the game world.

Super Mario 64. Source

The next landmark game for the Nintendo 64 was The Legend of Zelda: Ocarina of Time (1998). It was one of the largest and most ambitious projects on the platform. The game offered players a huge 3D world that was filled with mysteries, complex dungeons, and interesting characters. Ocarina of Time not only used 3D graphics, it also created one of the most memorable game worlds of the time. All of the elements, characters, scenes, and surrounding landscapes were carefully crafted, giving players a sense of completeness and freedom.


The Legend of Zelda: Ocarina of Time. Source

Sega Saturn and Atari Jaguar

While the Sega Saturn and Atari Jaguar couldn’t compete with giants like the PlayStation or Nintendo 64, they still played a role in the evolution of 3D graphics.

Released in 1994, the Sega Saturn offered support for 3D graphics, although its capabilities were limited. One of the first games to showcase this console’s 3D potential was Virtua Fighter (a 1993 arcade hit that launched alongside the Saturn in 1994). Virtua Fighter was a real breakthrough for the fighting game genre, as it showed how 3D could be applied to dynamic fighting scenes.

Virtua Fighter. Source

While the Atari Jaguar failed to achieve significant commercial success, it was still one of the first consoles to bring 3D games to the mass market. One of this console’s most famous projects was Alien vs. Predator (1994), which became popular due to its interesting gameplay and unique perspective: gamers could choose which side to fight on in the conflict between Predators, Aliens, and humans. The game used 3D graphics to create a tense atmosphere and featured detailed textures, which were an innovation at the time.

Alien vs. Predator. Source

The Road to Realism: The Technological Race of Graphics Capabilities

With each passing decade, games have become ever more beautiful and realistic. New generations of consoles and computers have given developers more opportunities to create graphics that reflect real life. Their goal has always been the same: make games as believable as possible, allowing players to immerse themselves in virtual worlds and fully experience them.

Detailed Models And Their Optimization

In the early days of 3D graphics, in-game models were as simple as possible. Developers used low-polygon objects and basic textures so that the hardware wouldn’t get overloaded. This allowed them to create 3D games on weak systems, but the models looked angular and unnatural.

With the development of graphics cards and rendering, games became much more detailed. After the release of the Xbox 360 and PlayStation 3 in the mid-2000’s, the gaming industry began using much more complex 3D models with tens or even hundreds of thousands of polygons. One of the first video games where this is evident is The Elder Scrolls IV: Oblivion (2006), where buildings, characters, and nature looked more realistic thanks to new modeling and texturing techniques.

The Elder Scrolls IV: Oblivion. Source

As the number of polygons increased and the models became more complex, new optimization methods were required. The main goal was to maintain a high level of realism, without compromising performance. To achieve this, developers used several rendering techniques:

  • Normal Mapping. This mapping technique creates the illusion of volume on textures. Even flat surfaces can have details, such as creases, cracks, or scratches, but the number of polygons remains low.
  • Bump Mapping. This type of mapping is used to simulate shallow relief. Examples might be walls, soil, or fabric that appear 3D, although this is only a visual effect.
  • Level of Detail (LOD). The farther an object is from the camera, the simpler the model. Players don’t notice the difference, and the system doesn’t have to process as much data. This technique has become standard in large, open worlds.
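Level-of-detail selection boils down to a simple distance check at render time. Here is a minimal sketch; the distance thresholds and polygon budgets are hypothetical numbers chosen for illustration:

```python
# Hypothetical polygon budgets for one object at three detail levels.
LOD_LEVELS = [
    (10.0, "LOD0_high", 100_000),       # closer than 10 units: full-detail mesh
    (50.0, "LOD1_medium", 20_000),      # 10-50 units: reduced mesh
    (float("inf"), "LOD2_low", 2_000),  # beyond 50 units: cheapest mesh
]

def select_lod(distance_to_camera: float):
    """Pick the simplest mesh that still looks right at this distance."""
    for max_dist, name, polys in LOD_LEVELS:
        if distance_to_camera < max_dist:
            return name, polys

print(select_lod(5.0))    # close-up: full-detail mesh
print(select_lod(120.0))  # far away: cheapest mesh
```

Real engines layer extras on top of this, such as blending between levels to hide the "pop" when a model switches meshes, but the core idea is exactly this distance test.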

Moving to HD Resolution

HD resolution arrived in games with the release of the Xbox 360 and PlayStation 3 in 2005-2006. These consoles supported 720p and 1080p output, which hugely improved the detail on textures and models, leading to:

  • A sharper picture. Before HD, textures in games were blurry and lacked small details. Now, buildings, trees, and even characters’ faces became much more detailed.

For instance, in Grand Theft Auto IV, reflections of light in puddles and raindrops on car windows were now possible.

Grand Theft Auto IV. Source

  • No more jagged edges. Previously, the lines on objects in games often appeared jagged. Higher resolutions smoothed out these sharp edges, making the image look more natural.

The HD era also saw advances in animation technology: motion capture allowed characters to move naturally.

This can be seen in games like Heavy Rain (2010) and L.A. Noire (2011), where the characters’ gestures became more natural. The animation work done on facial expressions was especially impressive, making the characters seem more alive.

Heavy Rain. Source

Shaders And Their Influence on Visual Effects

Shaders are programs that run on the graphics processing unit (GPU) and create visual effects in games. They’re responsible for how objects, textures, light, and shadows are displayed on screens. Each type of shader performs a different task:

  • Vertex shaders control the shape and position of objects.
  • Pixel shaders (called fragment shaders in OpenGL) are responsible for the lighting and color of each pixel.
  • Geometry and compute shaders handle additional tasks, such as generating extra geometry or simulating effects in scenes.
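What a pixel shader computes for each pixel can be illustrated on the CPU. Below is a minimal sketch of Lambert diffuse lighting, the simplest per-pixel lighting model; the surface color, normal, and light direction are made-up inputs, and a real shader would run this math on the GPU in a shading language like HLSL or GLSL:

```python
def normalize(v):
    """Scale a vector to unit length."""
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def dot(a, b):
    """Dot product of two vectors."""
    return sum(x * y for x, y in zip(a, b))

def pixel_shader(surface_color, normal, light_dir):
    """Lambert diffuse: brightness = max(0, N . L), applied to the base color."""
    n = normalize(normal)
    l = normalize(light_dir)
    brightness = max(0.0, dot(n, l))
    return tuple(c * brightness for c in surface_color)

# Light hitting the surface head-on: full brightness.
print(pixel_shader((1.0, 0.5, 0.2), normal=(0, 0, 1), light_dir=(0, 0, 1)))
# Light arriving edge-on: the pixel goes dark.
print(pixel_shader((1.0, 0.5, 0.2), normal=(0, 0, 1), light_dir=(1, 0, 0)))
```

The GPU runs this kind of function millions of times per frame, once per pixel, which is why moving shading onto dedicated hardware mattered so much.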

The evolution of shaders 

  • 1990’s: The fixed-function pipeline

Early graphics processors had no programmable shaders: lighting, shadows, and texturing were handled by fixed functions built into the hardware. For example, Quake (1996) used precomputed lightmaps to create static lighting effects. These techniques were limited, but served as the basis for further developments.

  • 2000’s: Programmable shaders 

With the advent of the Nvidia GeForce 3 in 2001, developers gained the freedom to program shaders themselves. This allowed them to create more complex visual effects:

  • Bump mapping and normal mapping gave textures volume, while keeping the polygon levels low.
  • Screen space ambient occlusion (SSAO), first seen in Crysis (2007), added soft contact shadows to objects, making scenes deeper and more realistic.

Crysis. Source

  • 2010’s — Today: Ray tracing and hybrid solutions

The release of Nvidia RTX graphics cards in 2018 took shaders to a whole new level. They began to be used to calculate ray tracing in real time, adding realistic reflections, dynamic shadows, and complex lighting. These effects are combined with traditional rendering to create more believable scenes.

Examples of shader advancements

  • Deferred rendering. Popularized by Killzone 2 (2009), this rendering approach allowed developers to work with a large number of light sources without overloading the system. This makes for a richer and more detailed scene.

Killzone 2. Source

  • Parallax Occlusion Mapping. This type of mapping was used in Metro Exodus (2019) to create relief and depth on surfaces, helping flat textures look 3D.
  • Screen Space Reflection (SSR). Popularized by Crysis 2 (2011), this technique is responsible for reflections on shiny surfaces, making water, glass, and metal look more realistic.

Physically Based Rendering

Physically based rendering (PBR) is a technique that makes lighting and materials look like the real deal. Developers use the laws of physics to achieve realistic surfaces.

Core Principles:

  • Albedo — the base color of a material, without taking lighting into account.
  • Metallic — whether the surface behaves like a metal, which determines how it reflects light.
  • Roughness — how rough the surface is, which controls how strongly light scatters across it.

These parameters are stored in textures called PBR maps, which allow the renderer to accurately recreate the materials’ appearance. For example, light will reflect differently depending on whether it falls on metal, fabric, or wood. Water becomes transparent depending on its depth, and characters’ skin looks realistic thanks to the proper distribution of light. These effects are possible because PBR takes reflection, refraction, and the scattering of light into account.
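The way these parameters interact can be sketched with a toy shading function. This is a deliberately simplified stand-in for a real PBR BRDF: the albedo values, the cosine inputs, and the Blinn-Phong-style highlight are all illustrative assumptions, chosen only to show how metallic and roughness steer the result:

```python
def shade_pbr_like(albedo, metallic, roughness, n_dot_l, n_dot_h):
    """Toy split into diffuse and specular terms driven by PBR parameters.

    albedo: base color (r, g, b); metallic and roughness: 0..1;
    n_dot_l / n_dot_h: cosines from the usual lighting vectors.
    """
    # Metals have no diffuse component; dielectrics keep their albedo.
    diffuse = tuple(c * (1.0 - metallic) * max(0.0, n_dot_l) for c in albedo)
    # Rougher surfaces get a wider, dimmer highlight (Blinn-Phong-style lobe).
    shininess = 2.0 / max(roughness * roughness, 1e-4) - 2.0
    spec_strength = max(0.0, n_dot_h) ** shininess
    # Metals tint their specular with the albedo; dielectrics reflect ~4% white.
    spec_color = tuple(metallic * c + (1.0 - metallic) * 0.04 for c in albedo)
    specular = tuple(c * spec_strength for c in spec_color)
    return tuple(min(1.0, d + s) for d, s in zip(diffuse, specular))

# Polished metal: no diffuse term, a tight gold-tinted highlight.
gold = shade_pbr_like((1.0, 0.78, 0.34), metallic=1.0, roughness=0.1,
                      n_dot_l=0.9, n_dot_h=0.99)
# Rough wood: mostly diffuse, only a faint white highlight.
wood = shade_pbr_like((0.45, 0.30, 0.18), metallic=0.0, roughness=0.9,
                      n_dot_l=0.9, n_dot_h=0.99)
print(gold, wood)
```

Even in this crude sketch, the two materials diverge the way the text describes: the metal’s color comes almost entirely from its tinted reflection, while the wood’s comes from scattered diffuse light.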

For instance, in The Witcher 3: Wild Hunt PBR is used to create texture on armor, weapons, water, and other objects. We can see the metallic gleam of armor, or the way the river reflects the sky. Thanks to this technology the game world feels real.

With PBR, scenes in games look natural in all lighting conditions. Daylight, rain, or twilight — everything is rendered realistically. For instance, in Cyberpunk 2077 or Horizon Zero Dawn the time of day and weather effects set the whole gaming mood.

HDR: Graphics That Play With Contrasts

HDR stands for high dynamic range, and it was a breakthrough when it came to creating realistic graphics. High dynamic range displays show pictures with more detail in bright and dark areas. With SDR (Standard Dynamic Range), screens show a limited range of brightness: dark scenes may appear too dark, and bright scenes lose their details. HDR fixes:

  • Bright areas, by making the sun or light clearer.
  • Shadows, by retaining texture without turning them into black smudges.
  • Colors, by making them more saturated and helping shades blend seamlessly.
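Under the hood, HDR rendering keeps scene brightness values far above 1.0 and only compresses them into the display’s range at the end of the frame. A minimal sketch using the classic Reinhard tone-mapping curve (the sample luminance values are made up for illustration):

```python
def reinhard_tonemap(hdr_value: float) -> float:
    """Compress an unbounded HDR luminance into the 0..1 display range."""
    return hdr_value / (1.0 + hdr_value)

# Scene luminances: deep shadow, midtone, bright sky, the sun itself.
for lum in (0.05, 1.0, 10.0, 1000.0):
    print(f"scene {lum:8.2f} -> display {reinhard_tonemap(lum):.3f}")
```

Notice that very bright values are squeezed close to 1.0 but never clipped, while shadow values keep their separation, which is exactly why HDR scenes retain detail at both ends of the range.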

Advantages of HDR in games:

  • Realism. High dynamic range allows developers to create an immersive experience by making things on the screen look and feel like real life:

Comparison of HDR and SDR in Horizon Zero Dawn. Source

Some gamers, though, feel that the picture is more vivid without HDR. Source

  • Scene dynamics. HDR improves the contrast between light and dark colored elements in scenes, making transitions more natural. For example, in Shadow of the Tomb Raider, HDR emphasizes the light shining through the dense jungle foliage.

Shadow of the Tomb Raider. Source

  • Depth of color. High dynamic range screens support more shades, thus creating smooth color gradients without banding. For instance, in Cyberpunk 2077 (2020) cityscapes come to life with glowing neon signs and detailed reflections thanks to HDR.

Cyberpunk 2077. Source

To fully utilize HDR:

  • Use a display with a screen that supports HDR10 or Dolby Vision.
  • Play games with integrated HDR technology.
  • Play with a modern console or graphics card.

HDR support is available on new generation consoles like the PlayStation 5 and Xbox Series X/S. Combined with a suitable TV, gamers can enjoy crisp and vivid images on their screens, with pronounced details and colors. 

Ray Tracing: Lights And Shadows Move in Real Time

Ray tracing is a technology that helps create more realistic lighting and shadows in games. It works by tracing rays of light through the scene and calculating how they interact with the objects in the frame. Depending on how the light interacts with each surface, reflections, shadows, and even refractions of light can appear.
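At its core, every traced ray answers one question: what does it hit first? Below is a minimal sketch of the classic ray-sphere intersection test that sits at the heart of any ray tracer; the scene, one sphere and one ray, is made up for illustration:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance to the nearest hit, or None if the ray misses.

    Solves |origin + t*direction - center|^2 = radius^2 for t (a quadratic).
    `direction` must be a unit vector.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * e for d, e in zip(direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# A ray fired straight down the z-axis toward a sphere centered at z = 5.
hit = ray_sphere_hit(origin=(0, 0, 0), direction=(0, 0, 1),
                     center=(0, 0, 5), radius=1.0)
print(hit)  # → 4.0, the distance to the sphere's near surface
```

A real renderer repeats this test for millions of rays per frame (and spawns secondary rays at each hit for reflections and shadows), which is why dedicated RT hardware was needed to make it run in real time.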

Previously, ray tracing technology was only available in the movie industry, where it could take hours to render a single scene. In games, where scenes must be updated 60 times per second, this level of resource consumption was simply impossible. This all changed with the introduction of Nvidia RTX graphics cards in 2018, which made real-time ray tracing possible for the first time. The best uses for this in games are:

  • Realistic reflections. Water, mirrors, and polished surfaces can now display realistic reflections of characters and the surroundings. Battlefield V is an example of where RTX was used.

Reflection in the water in Battlefield V. Source

  • Moving shadows. Shadows no longer look flat. They can become soft or sharp depending on the distance and angle of lighting — like in real life. In Minecraft with RTX enabled, for instance, blocks cast shadows with such precision that scenes take on a new dimension of realism.
  • Refractions and transparency. Glass surfaces look natural. Light passing through glass changes direction, and stained glass windows produce colored rays of light.
  • Dynamic lighting. The sun moves across the sky, changing the angle of light and the length of shadows. In Control, for instance, light sources react dynamically to characters and objects moving around.

The downside of ray tracing technology is its steep hardware requirements. Right now it’s only available to those who own graphics cards with hardware ray-tracing support, such as Nvidia’s RTX series, or the new consoles (PS5, Xbox Series X/S).

A Brief Overview of the Evolution of Video Game Graphics

  • Transition from 2D to 3D graphics
    In the 90’s, 3D graphics began to replace their 2D counterparts. This was made possible by GPUs and new consoles. Games like Quake and Super Mario 64 served as the foundation for continued development.
  • The advent of HD resolution
    With the release of the Xbox 360 and PlayStation 3, games received HD resolution, which increased the details on texture and improved lighting. GTA IV and Oblivion are classic examples.
  • Development of graphics shaders
    Shaders became an important tool for creating effects such as shadows and reflections, as seen in Battlefield V and Metro Exodus.
  • Physically based rendering (PBR)
    PBR allows for highly accurate light modeling, as well as improved lighting and textures, as seen in The Witcher 3 and Red Dead Redemption 2.
  • Implementation of HDR
    High dynamic range improved brightness and contrast, making images more detailed in both dark and light areas, just like in Shadow of the Tomb Raider.
  • Ray tracing
    Ray tracing was introduced with the Nvidia RTX, and improved lighting and reflections in video games such as Battlefield V.

Games are no longer just entertainment — they’re art. Each new technology makes gameplay more exciting, and the future of graphics holds so much more excitement.