My resume is my Gamerscore

I am, and have been, a hardcore gamer since birth. Well, at least since I had enough motor control to use a keyboard. Some of my first gaming memories are of sitting on my dad's lap at his PC in 1992, playing Wolfenstein 3D co-op: he would use the arrow keys to move us around while I pressed the spacebar to open doors and Control to shoot, until the startlingly realistic graphics got so scary that I had to hide behind his office desk. While the state-of-the-art graphics of 1992 look pretty ridiculous today, in 2015 gamers are on the cusp of graphical achievements that might someday make Call of Duty's graphics feel like Minecraft. So, what technologies really matter, and how will they change the way we play games? Grab your controller and follow me for the inside scoop.

Pixels, Polygons, and FPS

When it comes to photo-realism in games, there are three ways we've seen technology improve over the years. The first is increasing the number of points of color, or pixels, that make up your screen. The second is better, faster hardware that lets game developers create more detailed characters and put more objects in a scene at once; compare, for example, similar scenes from The Witcher (2007) and The Witcher 3: Wild Hunt (2015). The third is upping the refresh rate of monitors and of the games themselves, which makes any action on screen look smoother and more pleasing to the eye.

The Age of UltraHighDefinition

No, this isn’t a scrapped Avengers villain, but the future of high def. I know what you’re thinking: “But Devin, didn’t I just spend $$$ upgrading from Dad’s hand-me-down 200 lb. big screen TV to this super svelte LED HDTV? What could be better?” HD is old news, friends, and Ultra High Def is the new kid on the block.

Typically, HDTVs in homes today max out at 1080p, or full HD. When you see an image on your TV, what you're really seeing is a grid of individual pixels that together make up a single image. The 1080 refers to the number of horizontal rows of pixels on the screen, and because HDTVs are wider than they are tall, each of those rows is 1920 pixels across, hence the 1920×1080 resolution you see listed on many computer monitors. The "p" in 1080p stands for "progressive scan," which means each row of pixels is drawn one after the other in a single pass (rather than alternating rows, as in interlaced video), resulting in a smoother image with less flickering.

These days, consumers are starting to see full HD's big brothers appear in stores: 4K and 8K TVs. While still priced high enough to break the bank, 4K and 8K are essentially what you'd expect. 4K packs four times the pixels of full HD into a single display, and 8K is a whopping sixteen times the pixels of full HD (so much detail that you can't fit it all on a Blu-ray disc)! But the ability to show those crisp, highly detailed images won't mean diddly-squat if we don't have the hardware to drive them.
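Don't take my word on those multipliers; the pixel math is easy enough to check yourself. Here's a quick back-of-the-envelope sketch in Python, using the standard consumer resolutions (1920×1080 for full HD, 3840×2160 for 4K UHD, 7680×4320 for 8K UHD):

```python
# Back-of-the-envelope pixel math for the resolutions discussed above.
resolutions = {
    "Full HD (1080p)": (1920, 1080),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
}

full_hd = 1920 * 1080  # 2,073,600 pixels

for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels ({pixels / full_hd:.0f}x Full HD)")
```

Run it and you'll see 4K works out to roughly 8.3 million pixels (4x full HD) and 8K to about 33.2 million (16x full HD). Every one of those pixels has to be computed and redrawn dozens of times per second, which is exactly why the hardware matters.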

Graphic Goodness

Ask any PC gamer to boast about their rig, and the first thing they'll likely mention is their graphics card. While having a fast system with plenty of RAM is important, the graphics card can make or break which detail settings you can apply to your favorite games. Graphics cards are purpose-built for the particular type of number crunching that turns the 3D geometry of high-def, photo-realistic games into the 2D (or 3D!) images that appear on your monitor. The more powerful the graphics card, the more detail and effects can be piled on without bringing your system to its knees.

Full of Pixels

My God, it's full of pixels.

Console gamers have one major thing going for them: all PlayStations, Xboxes, Wiis, etc. have the same hardware in each box. Developers like this because they can optimize for the platform and push out the highest level of detail possible for that system. PCs, on the other hand, come in every configuration from low-end to high, which can be a challenge (to say the least) when testing PC games. That said, the PC is where gamers push the envelope and graphics truly shine. PC gamers can even combine multiple graphics cards like some mythological hardware hydra to display 4K graphics on their beautiful 4K screens. This kind of power doesn't come cheap, but companies like Nvidia see the demand for 4K gaming and recently released the GTX 980 Ti, a $650 graphics card capable of outputting 4K. Sure, it's expensive, but early reviews say it nearly matches the performance of its big brother, the $1,000 GTX Titan X, making it a go-to card for gamers upgrading older systems.

FPS (and I don’t mean First Person Shooter), or The Truth Hertz

FPS stands for frames per second and measures how many images are drawn (rasterized) on your display each second. The higher the frame rate, the smoother the image looks. Games today typically shoot for a smooth 60fps, though 30fps is deemed "acceptable" by much of the gaming public. For reference, theatrical films are typically shot at 24fps. Why are games held to a higher standard? A major factor is that games are highly interactive. The human brain is very good at anticipating how lighting and motion should look, so when a game's frame rate drops, the action appears to stutter and the experience sours. Nobody likes getting shot in a multiplayer match because their frame rate lagged and made them zig when they should have zagged.
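To put those numbers in perspective, here's the time budget a game gets to render each frame at various frame rates. It's nothing fancier than the reciprocal of the frame rate, in milliseconds:

```python
# Time budget (in milliseconds) a game has to render each frame.
for fps in (24, 30, 60, 144):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
```

At 30fps the engine gets about 33 milliseconds per frame; at 60fps, only about 17. Halve the frame time, double the work per second, which is why hitting a rock-solid 60fps is so much harder than a merely acceptable 30.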

Keep in mind, just because you have a beastly graphics card capable of outputting hundreds of frames per second rather than a mere 30 or 60, that doesn't mean you're actually GETTING the smooth-as-butter benefits of that high frame rate. Most computer monitors in use today have a refresh rate of 60Hz, meaning the monitor can only redraw its picture 60 times each second. If your graphics card is outputting 100fps, those extra frames are simply "lost." To address this, gaming monitors with a jaw-dropping 144Hz refresh rate have begun to hit the market. Frankly, I'm trying to avoid seeing one in person, because I'm afraid that once I do, I'll never be able to go back to my 60Hz monitor, and the best 144Hz displays are still pretty brutal on the wallet.
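Here's a deliberately simplified sketch of that ceiling. In reality, displays tear or (with V-Sync enabled) make the GPU wait in more complicated ways, but the point stands: the refresh rate caps what you actually see.

```python
# Simplified model: a monitor can't show more frames per second
# than its refresh rate, no matter how fast the GPU renders.
def frames_shown(gpu_fps: int, refresh_hz: int) -> int:
    return min(gpu_fps, refresh_hz)

for gpu_fps in (30, 60, 100, 144):
    shown = frames_shown(gpu_fps, 60)
    print(f"GPU at {gpu_fps:>3} fps on a 60 Hz monitor: "
          f"{shown} frames shown, {gpu_fps - shown} wasted")
```

On a 60Hz panel, a card pumping out 144fps is throwing away more than half its work. Pair that same card with a 144Hz monitor and suddenly every frame counts, which is exactly why those displays are so tempting.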

If this is the future, what’s next?

The Next Big Thing in gaming is right around the corner, and it goes by the name of virtual/augmented reality. We're finally reaching the point where VR and AR systems can be built affordably, and with them, even more immersive worlds. Since the dawn of 3D gaming we've been exploring 3D worlds through 2D screens, but companies such as Oculus, HTC, Sony, and Microsoft are poised to enter the virtual and augmented reality markets later this year. I have a feeling E3 will be a breakout conference for this tech, so June is shaping up to be a great month for gaming.

What gaming technology are you most excited about?


Featured image via Game Zone. Body image via Vimeo.