How to Upscale Video to 4K, 8K, and Beyond

Over the last five years, video and image upscaling — the process of turning a lower-resolution video or photo into a higher-resolution image — has gone from the realm of research papers and tech demos to commercial products. In addition to invalidating several decades’ worth of nerd complaints about how looking at the screen and saying “Enhance” does absolutely nothing, it’s now possible to upscale video dramatically and improve overall quality in the process.

Upscaling technology is now built into a variety of consumer devices you can purchase, and there are software packages available for video editing as well.

The word “upscale” generically means “to improve the value or quality of something.” In the video and PC space, it’s almost always a reference to increasing the effective resolution of a piece of content. Some AI upscaling approaches work in real time and some do not.

There is a difference between upscaling and what we call native resolution. Upscaling a 1080p image into 4K means taking an image originally encoded in 1920×1080 and algorithmically transforming it into what a higher-resolution version of the same image ought to look like. This upscaled image will not be identical to a native 4K signal, but it should offer a better picture than what was previously available on your 720p or 1080p television.

Keyword: “should.” Video scaler quality in TVs can vary widely between different product families. In some cases, you might be better off using a GPU to drive a picture than relying on the TV’s native rescaling capability, while other TVs have excellent upscalers. Manufacturers rarely disclose their upscaling hardware choices, but newer, higher-end sets should have improved upscaling capabilities. If you have a UHD Blu-ray player paired with an older or lower-quality 1080p or 4K TV, you might even get better results by running all video signals through the Blu-ray player rather than the television. Generally speaking, a good TV upscaler is considered to be as good as or better than a GPU’s.

How the Scaler Sausage Gets Made

The most basic function of a video scaler is to take whatever image it receives — 480i, 720p, 1080p — and stretch it across the entire screen. Without this functionality, a 1080p signal would take up just a fraction of a 4K television’s display. This simple resizing is typically done by taking each individual 1080p pixel and creating four pixels out of it (remember, 4K is four times the pixels of 1080p).

But many TVs and Blu-ray players do more than just perform a simple 1:4 mapping. They also use video processing techniques to extrapolate what details ought to be present in the scene. How well this works depends on the type of content being upscaled.
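To make the difference concrete, here’s a minimal sketch in Python using OpenCV that performs both the naive 1:4 pixel replication described above and a conventional interpolated resize of the sort a smarter scaler would use. The filenames are placeholders, and this is only an illustration of the two approaches, not what any particular TV does internally:

```python
# Minimal sketch: simple 1:4 pixel replication vs. smarter interpolation.
# Requires opencv-python; "frame_1080p.png" is a placeholder filename.
import cv2

frame = cv2.imread("frame_1080p.png")          # 1920x1080 source frame
target = (3840, 2160)                          # 4K UHD, width x height

# Nearest-neighbor: every source pixel simply becomes a 2x2 block (1:4 mapping).
naive_4k = cv2.resize(frame, target, interpolation=cv2.INTER_NEAREST)

# Lanczos resampling: estimates in-between values from neighboring pixels,
# producing smoother edges than straight pixel replication.
smart_4k = cv2.resize(frame, target, interpolation=cv2.INTER_LANCZOS4)

cv2.imwrite("naive_4k.png", naive_4k)
cv2.imwrite("smart_4k.png", smart_4k)
```

Viewed side by side, the nearest-neighbor result shows blocky, stair-stepped edges, while the interpolated version looks smoother at the cost of some softness, which is exactly the blur that the sharpening passes in better upscalers try to claw back.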

[Image: comparison grid showing a simple 1:4 pixel mapping, an upscaled 4K image, and native 4K]

In the image above, you can see how the upscaled 4K image is much more nuanced than the simple 1:4 mapping in the second grid from the left. If you’re having trouble seeing the difference between the 1080p upscale and the native 4K, look at the left-side blocks in the very first row and the right-side blocks in the very last row. The native 4K image resolves into distinctly different colors than the 1080p image in R1C2 (that’s 1st row, 2nd column), R1C3, and R8C8. As the amount of available horsepower in televisions has improved, the quality and sophistication of their integrated upscalers have grown as well. Some modern TVs have sophisticated sharpening algorithms to reverse the blur created by upscaling, and interpolation algorithms good enough to almost match the precision of a native 4K signal.

How does all this look with real content? A 2017 Rtings article answers that question. The first image below is from a 4K set displaying 1080p content upscaled to 4K, while the second is native 1080p.

[Image: 1080p content upscaled to 4K on a 4K set. Image by Rtings]

If you have trouble seeing a difference between the two images, open both of them in a new tab and focus your eyes near the center of the image. See the house with a brown roof near the center of the image, with three windows facing approximately south-southwest and a fourth pointed east? (All directional cues are based on north being “up,” not the direction of the sunlight.) Look at that specific spot in both images, and the roofs of the buildings immediately adjacent. The difference should jump out at you. In this case, even using TVs that date back to 2015, the 1080p upscale to 4K is better than the native 1080p image.

[Image: the same scene at native 1080p. Image by Rtings]

If you have an older TV or a budget 4K model, there’s one obvious method of improving your TV’s upscaling: Buy a better television. Unfortunately, it’s impossible to predict how well this will work without knowing exactly what you own now and what you plan to purchase to replace it. The older your current TV, the better the chances that a new set will deliver upgrades in all respects, but many of those improvements may have nothing to do with the way your upscaler handles sub-4K content.

If you aren’t happy with your current TV, can’t replace it at the moment, and happen to own a high-end Blu-ray or UHD player, you can also try running content through its upscaler rather than relying on the television to handle it. In some cases, a top-end UHD Blu-ray player may deliver a better experience than an entry-level 4K TV from a few years back. If you’re still using a DVD player to feed a picture to a mediocre 1080p or 4K panel when you play DVDs, and you can swap over to a high-end Blu-ray/UHD Blu-ray player instead, I’d try it. It may or may not help, but it definitely won’t hurt. What you’re trying to do here is route the signal through the upscaler that’ll give it the best quality kick.

Still need a higher-quality picture? You’re in luck.

Real-Time AI Processing

I haven’t tested the most recent Nvidia Shield myself, but there’s a demo you can actually play with to see the effect it offers. Here’s a screenshot of the effect. I’ve positioned the slider over the lizard’s eye because it’s the easiest place to see the upscaler’s impact:

Left: Regular upscale
Right: Nvidia Shield

Still not clear? Here’s an enlarged version of the same screenshot.

The image on the left is a traditional upscale; the image on the right is Nvidia’s Shield when asked to scale up 720p or 1080p content to 4K (content below 720p is not supported for upscaling, at least not yet). The AI component of the upscaler obviously improves the overall image quality. In my experience with applications like Topaz Video Enhance AI, this is a fair representation of the improvements that can be achieved.

Third-party reviews of the Shield agree; Slashgear, for example, found the effect impressive when it works. So far as I’m aware, the Shield is currently the only set-top box offering this kind of functionality.

Preprocessing Video

This step is technically optional because there’s no reason you can’t take a DVD source file and feed it directly through upscaling software (more on this below). Often, however, this is not an ideal way to extract maximum quality from your source. It is possible to achieve meaningful improvements in image quality with traditional video editing tools. The first image below is from Deep Space Nine episode 6×06, “Sacrifice of Angels:”


Here’s the same image after being processed but before upscaling. I’ve applied denoising filters followed by a sharpening filter. There’s a careful balance to be struck with some of these techniques, between blending away features and drawing out detail.

The two frames make for a direct comparison; both have been resized to 2560×1920 to make it easier to see the differences between them.
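If you want to experiment with this kind of preprocessing yourself, the sketch below shows the general idea in Python with OpenCV: a denoising pass followed by an unsharp mask. This is not the exact filter chain used on these frames (that work was done with dedicated video tools), and the filter strengths are arbitrary starting points:

```python
# Rough illustration of the preprocessing idea: denoise first, then sharpen.
# Not the exact filter chain used for these frames; "source_frame.png" is a placeholder.
import cv2

frame = cv2.imread("source_frame.png")

# Non-local means denoising smooths out compression noise and grain.
# The strength values here are arbitrary starting points; too much blends away detail.
denoised = cv2.fastNlMeansDenoisingColored(frame, None, 5, 5, 7, 21)

# Unsharp mask: subtract a blurred copy to exaggerate the edges the denoiser softened.
blurred = cv2.GaussianBlur(denoised, (0, 0), sigmaX=2)
sharpened = cv2.addWeighted(denoised, 1.5, blurred, -0.5, 0)

cv2.imwrite("preprocessed_frame.png", sharpened)
```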

Video Upscaling Via Third-Party Software

For best results, you can use a third-party upscaler such as Topaz Video Enhance AI. I’ve made extensive use of Video Enhance AI as part of the Deep Space Nine Upscale Project (DS9UP) and can confirm that the application is capable of yielding stunning results. The software currently supports upscaling on AMD and Nvidia GPUs, as well as Intel integrated solutions. AMD APUs are also supported.

The reason we can now “enhance” images to improve their clarity is that AI-based applications are capable of analyzing a video field and estimating what detail would exist if the image were of higher quality already.
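As a rough illustration of what that inference step looks like, the sketch below uses OpenCV’s dnn_superres module to run a pre-trained super-resolution network over a single frame. This is not Video Enhance AI’s API; it assumes you’ve installed opencv-contrib-python and separately downloaded a model file such as ESPCN_x4.pb, and the filenames are placeholders:

```python
# Sketch of single-image AI upscaling with OpenCV's dnn_superres module
# (requires opencv-contrib-python and a pre-trained model file such as ESPCN_x4.pb,
#  which must be downloaded separately; paths here are placeholders).
import cv2
from cv2 import dnn_superres

sr = dnn_superres.DnnSuperResImpl_create()
sr.readModel("ESPCN_x4.pb")        # pre-trained 4x super-resolution network
sr.setModel("espcn", 4)            # model name and scale factor must match the file

low_res = cv2.imread("frame_low_res.png")
upscaled = sr.upsample(low_res)    # the network predicts the missing high-frequency detail
cv2.imwrite("frame_ai_4x.png", upscaled)
```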

One of the typical ways to train a video-enhancing application such as Video Enhance AI is to provide the neural net with the same image or video in both high and low quality. The neural net is then tasked with finding a way to make the low-quality source look as much like the high-quality source as possible. Instead of trying to teach a computer what lines and curves look like by providing painstaking examples, we’ve developed the ability to make the computer do the work of teaching itself. Saying “Enhance” has gone from a complete joke to a plausible reality in a matter of a few years.
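A toy version of that training setup might look like the sketch below, written in PyTorch with random tensors standing in for real frame pairs. Commercial models are vastly larger and trained on enormous libraries of real footage; the point here is only to show the “make the low-quality input match the high-quality target” loop:

```python
# Toy illustration of the paired high/low-resolution training idea (PyTorch).
# Random tensors stand in for a real dataset; this is not Video Enhance AI's architecture.
import torch
import torch.nn as nn

class TinySR(nn.Module):
    """A deliberately small super-resolution net: upscale 2x, then refine with convolutions."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="bicubic", align_corners=False),
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

model = TinySR()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.L1Loss()

for step in range(100):
    high_res = torch.rand(4, 3, 64, 64)                       # stand-in "ground truth" patches
    low_res = nn.functional.interpolate(high_res, scale_factor=0.5, mode="bicubic",
                                        align_corners=False)  # degraded copies of the same patches
    prediction = model(low_res)
    loss = loss_fn(prediction, high_res)                      # make the low-res look like the high-res
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```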

Applications like Topaz Video Enhance AI can cost several hundred dollars, and they don’t run in real time. The result of using these applications, however, is a vastly better picture than you’ll see from any other source.

And, again, there’s a comparison. This time, the QTGMC pre-processed output is on the left, and the upscale is on the right. If you’d like to see more examples of what Topaz VEAI is capable of, I’ve been experimenting with Star Trek: Voyager as well, as shown in the slideshow below:

Topaz supports multiple models to handle various kinds of video, including models designed for deinterlacing (though not reverse telecine), models designed for video that’s been damaged in various ways, and models that can be tuned to the needs of a specific clip. With a little color grading, a fairly dramatic overall improvement can be achieved. The comparison shows a base frame on one side and the final output, with color grading, on the other.
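The color grading step can be as simple or as involved as you like. As a trivial illustration, the sketch below applies a gamma lift and a mild saturation boost with Python and OpenCV; the values are arbitrary placeholders, not the grade used on these frames, and real grading is done by eye:

```python
# Simple illustration of a basic color grade: gamma lift plus a saturation boost.
# The specific values are arbitrary; real grading is done by eye, shot by shot.
import cv2
import numpy as np

frame = cv2.imread("upscaled_frame.png")

# Gamma correction via lookup table: gamma < 1 lifts shadows, > 1 deepens them.
gamma = 0.9
lut = np.array([((i / 255.0) ** gamma) * 255 for i in range(256)]).astype("uint8")
graded = cv2.LUT(frame, lut)

# Boost saturation slightly in HSV space to counteract faded source colors.
hsv = cv2.cvtColor(graded, cv2.COLOR_BGR2HSV).astype(np.float32)
hsv[..., 1] = np.clip(hsv[..., 1] * 1.15, 0, 255)
graded = cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)

cv2.imwrite("graded_frame.png", graded)
```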

I suspect we’ll see AI video processing and upscaling become more important over time as Intel, Nvidia, and AMD introduce their next-generation graphics processors. There’s an awful lot of old content orphaned on poor-quality sources, and building specialized AI processors to fix it will likely consume some silicon from all of the concerned parties over the next few years. The latest version of the Deep Space Nine credits that I’ve created is shown below:

Upscalers are amazing and only getting better. That’s true no matter how you consume content or what technology you use to do it. Depending on the shows you like and how much time you want to sink into the project, there are tremendous improvements to be had… or you can just wait a few years and let the technology come to you. You can read more about DS9UP at the links below.

The feature image at the top shows the impact of upscaling as well as color correction via Photoshop.

Now Read:
