If you’ve bought a TV since roughly 2020, there’s a good chance it boasts “4K resolution.” 4K has been the standard for a while now, but what does it actually mean? Are all 4K TVs created equal? What came before 4K? What the heck is 8K, and do you need it? The TV resolution landscape isn’t as complicated as it sounds.
What do the TV resolution numbers mean?
There’s nuance here that requires context for a complete answer, but here’s a basic chart to lay the groundwork:
| Name | Total Pixels | Horizontal Pixels | Vertical Pixels | Notes |
| --- | --- | --- | --- | --- |
| 480p | 345,600 | 720 | 480 | This is the resolution of the lowest-data streams from services such as Netflix (typically for streaming on phones over cellular data). Only second-hand TVs will have this resolution as their maximum. |
| 720p | 921,600 | 1280 | 720 | This is also known as “HD.” Some extremely cheap new TVs have this resolution, but they’re very rare. |
| 1080p | Over 2 million | 1920 | 1080 | This is also known as “Full HD.” It’s the lowest-resolution television easily found at retail. |
| 4K | Over 8 million | 3840 | 2160 | This is also known as “Ultra HD.” It’s the highest resolution in which most media (Blu-ray, streaming) is available. |
| 8K | Over 33 million | 7680 | 4320 | Very few native 8K media sources are available, but 8K TVs have technology that improves the image of 4K sources. |
In simplest terms, TV resolution is the amount of discrete visual information a television signal or set can share or display. The higher the resolution (that is, the more discrete pieces of visual information that can be placed adjacent to each other), the more detailed an image can become. Generally speaking, more is better in terms of resolution, at least to a point. These days, due to the way the technology for television screens (as well as monitors, tablets, phones, or anything else with a screen) works, these individual pieces of visual information are called pixels. A pixel can only display one color at any given moment, though the range of colors a pixel can produce has expanded as technology has advanced.
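If you’re curious where the “Total Pixels” column in the chart above comes from, it’s simple multiplication: horizontal pixels times vertical pixels. Here’s a quick Python sketch that reproduces those figures:

```python
# Total pixel count is just horizontal resolution x vertical resolution.
resolutions = {
    "480p":  (720, 480),
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

for name, (width, height) in resolutions.items():
    print(f"{name}: {width} x {height} = {width * height:,} pixels")

# 480p: 720 x 480 = 345,600 pixels
# 720p: 1280 x 720 = 921,600 pixels
# 1080p: 1920 x 1080 = 2,073,600 pixels
# 4K: 3840 x 2160 = 8,294,400 pixels
# 8K: 7680 x 4320 = 33,177,600 pixels
```

Notice that 4K has exactly four times the pixels of 1080p, and 8K has four times the pixels of 4K; each jump doubles both the horizontal and vertical counts.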
While early televisions had absurdly low resolution that, due to differences in technology, was measured not in pixels but in lines, TV resolution standardized in the 1960s with electronic televisions and broadcasts. In those days, the standard television screen had a 4:3 aspect ratio, meaning that no matter the size of the display, there were four units of width for every three units of height. Around the time DVD overtook VHS as the home-video viewing technology of choice at the turn of the 21st century, the standard television aspect ratio went from 4:3 to 16:9. This began a rapid increase in the resolution of TVs (as well as other displays, such as computer monitors and eventually tablets and phones). You’ll often see terms like “1080p,” “4K,” and “8K” in advertising materials for television sets to describe the resolution of the screen. The best resolution available in consumer-level products is 8K, but it’s not the only choice.
Computers can output nearly any resolution, even non-standard ones, based on user choice, and the computer monitor market has many non-16:9 monitors (often dubbed “ultrawide”), but most design choices in computer software and games default to 16:9.
A higher-resolution screen is always capable of displaying a lower-resolution source. Most 4K or 8K TVs have sophisticated additional technology that “upscales” or “upconverts” lower resolutions so they look better on the higher-resolution set. However, when upconversion is poorly implemented or not possible, lower-resolution sources will look “muddy” or “blurry” compared to higher-resolution ones, and may actually be more visually appealing on a TV that matches the source’s resolution. You’ll notice this most with very low-res sources on high-res screens: playing a 480p DVD on a 4K TV, for instance, won’t look great, no matter how good the upscaling is, because the detail simply isn’t in the source.
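To make the idea concrete, here’s a deliberately crude sketch of the simplest possible upscaling method, nearest-neighbor scaling, written in Python with NumPy. Real TVs use far more sophisticated approaches (edge-aware filtering, motion analysis, and increasingly machine learning), so treat this as an illustration of the concept rather than of what any actual set does:

```python
import numpy as np

def nearest_neighbor_upscale(frame, new_height, new_width):
    """Upscale a frame by repeating each source pixel at the nearest target positions."""
    old_height, old_width = frame.shape[:2]
    # Map every target row/column back to its nearest source row/column.
    rows = np.arange(new_height) * old_height // new_height
    cols = np.arange(new_width) * old_width // new_width
    return frame[rows][:, cols]

# A 480p frame (height x width x RGB) stretched to 4K: each source pixel
# ends up covering a block roughly 5 to 6 pixels wide and 4 to 5 tall,
# which is why low-res sources look soft or blocky on high-res screens.
sd_frame = np.random.randint(0, 256, (480, 720, 3), dtype=np.uint8)
uhd_frame = nearest_neighbor_upscale(sd_frame, 2160, 3840)
print(uhd_frame.shape)  # (2160, 3840, 3)
```

Because nearest-neighbor scaling only copies existing pixels, it can’t add detail; that’s the fundamental limit every upscaler works around, however cleverly.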
Another way of making a lower-resolution source look better to the naked eye is by making it physically smaller. If you have a 4K computer monitor and put a 1080p video source in a small window, the image will look crisper and smoother than it would if set to take up the full screen.
To really see the difference in quality between 1080p and 4K, or 4K and 8K, size matters. Larger TV screens benefit more from higher resolutions: spread the same number of pixels across a bigger panel and each pixel gets physically larger, while the same image shrunk onto a smaller area packs those pixels more densely and looks crisper to the human eye.
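Pixel density (pixels per inch, or PPI) puts numbers on this. Here’s a back-of-the-envelope Python calculation using a hypothetical 27-inch 4K monitor as the example; the specific sizes are just for illustration:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch for an image of a given resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_inches

# A 1080p video stretched across the full 27-inch screen...
print(round(ppi(1920, 1080, 27)))    # ~82 PPI: soft, visibly pixelated up close

# ...versus the same video in a quarter-screen window (half the diagonal).
print(round(ppi(1920, 1080, 13.5)))  # ~163 PPI: noticeably crisper
```

Same source, same pixel count; only the physical size changes.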
What the heck is 1080i, and how is it different from 1080p?
There was a time when these two letters mattered for your TV-watching experience, but that’s not so much the case anymore. TV screens don’t refresh every pixel at once. Instead, the image scans from the top to the bottom of the screen so quickly that the human eye can’t detect it. Progressive scanning (the “p” in 1080p or 1440p) works as you’d imagine: each line refreshes in order. Interlaced scanning (the “i” in 1080i) cuts corners by only displaying the even or odd rows at any given time. This also happens too quickly for the eye to notice, but it does have some negative effects on the image, especially when there’s fast-moving action on-screen. You likely won’t run into this choice very often out in the world. Companies don’t typically advertise interlaced resolution like they once did when 720p and 1080i were battling it out in TV press releases.
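For the curious, here’s a rough sketch in Python of how the two approaches slice up a frame. It’s a simplification (in real interlaced video, the two fields are captured at different moments in time, which is what creates the “combing” artifacts on fast motion), but it shows where the halving happens:

```python
import numpy as np

# One grayscale 1080-row frame (values stand in for pixel brightness).
frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)

# Progressive scan (1080p): every row is drawn on every refresh.
progressive = frame

# Interlaced scan (1080i): alternate refreshes draw only even or odd rows,
# so each "field" carries half the vertical information.
even_field = frame[0::2]  # rows 0, 2, 4, ...
odd_field = frame[1::2]   # rows 1, 3, 5, ...

print(progressive.shape)  # (1080, 1920)
print(even_field.shape)   # (540, 1920)
print(odd_field.shape)    # (540, 1920)
```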
How will resolution change my viewing experience?
If you are reading this article, chances are pretty good that you are thinking about buying a 1080p, 4K, or 8K television set, as those are the commonly available resolutions in new consumer models. While displays go up to 8K, the vast majority of media sources (broadcasts you pick up via antenna, Blu-ray players, streamers, video game consoles, etc.) currently max out at 4K resolution. This means you will need at least a 4K television to display the best-quality images. 4K streaming is not available with every service, but most of the major ones, such as Netflix, Max, and Disney+, offer it. Standard Blu-ray is 1080p, but “4K Blu-ray,” the preferred format for most people interested in physical media, displays in 4K. Xbox Series X and PlayStation 5 both offer 4K in most of their games. (Nintendo Switch and Xbox Series S do not, though a new Nintendo Switch may be released in the next 18 months, and it could potentially support 4K; no official details are available for that system yet.) Based on all of this, and the fact that low-end 4K is barely more expensive than 1080p, we recommend 4K as the minimum for any new TV you buy. 8K is more of a luxury due to the small amount of content available in native 8K.
So, do I need 8K?
This is an important and complicated question, so much so that we dedicated an entire article to it not long ago. The thoughts in that article still stand, but if you want a summary, it’s this: 8K is undoubtedly better than 4K, so if you want “the best,” that’s it. However, until streaming, physical media, and games are widely available in 8K resolution, 8K TVs are not a must. Also keep in mind that how far you sit from a television set, and the size of that set, both affect how much information your eye can process, so your particular room configuration may make certain TV sizes and resolutions look better. For example, if you have a 70-inch screen 2 meters away, you can discern the visual difference between an 8K TV and a 4K TV. But move just one more meter away from that same screen, and the 4K TV and 8K TV will look nearly identical (assuming everything else about the televisions is the same).
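You can put rough numbers on that trade-off. A common rule of thumb is that an eye with 20/20 vision resolves somewhere around 60 pixels per degree of vision; once a screen delivers far more pixels per degree than your eye can distinguish, extra resolution is wasted. Here’s a sketch of the geometry in Python (the acuity figure and the screen math are approximations, not a buying formula):

```python
import math

def pixels_per_degree(width_px, diagonal_inches, distance_m, aspect=16 / 9):
    """Horizontal pixels packed into one degree of the viewer's field of vision."""
    # Physical screen width from its diagonal and aspect ratio.
    diag_m = diagonal_inches * 0.0254
    width_m = diag_m * aspect / math.hypot(aspect, 1)
    # Angle the screen subtends at the viewing distance.
    angle_deg = math.degrees(2 * math.atan(width_m / (2 * distance_m)))
    return width_px / angle_deg

for distance_m in (2, 3):
    for name, width_px in (("4K", 3840), ("8K", 7680)):
        ppd = pixels_per_degree(width_px, 70, distance_m)
        print(f"70-inch {name} at {distance_m} m: {ppd:.0f} pixels/degree")

# 70-inch 4K at 2 m: ~91 pixels/degree
# 70-inch 8K at 2 m: ~181 pixels/degree
# 70-inch 4K at 3 m: ~133 pixels/degree
# 70-inch 8K at 3 m: ~265 pixels/degree
```

The farther back you sit, the more pixels per degree either set delivers, and the sooner both sail past what your eye can resolve, which is why the 4K and 8K pictures converge as you move away.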
Streaming, in particular, is going to be a driver of whether or not you need an 8K TV. While YouTube has a small amount of content (mostly nature documentaries) that will stream at 8K, none of the major entertainment production studios produce 8K films or television shows for home viewing; their releases top out at 4K. Digital film projection in movie theaters is a different story, but one irrelevant to television buying. So, no, you don’t need 8K. It’s nice, but it’s not a need. Yet. The technology will become more ubiquitous, 8K gaming and streaming will become standard, and one day the question will change from “Is 4K enough, or do I need 8K?” to “Is 8K enough, or do I need 16K?” That’s the way technology works.
The resolution will be televised
Hopefully this gives you a better idea about what resolution is and what you want. (Hint: It’s probably 4K, but it might be 8K.) But don’t stop there: look at our reporting on the latest 4K TVs, larger-size TVs, and Roku and Android TVs (which have streaming apps and games built into the TV’s operating system). Learn about what choosing between OLED vs. Mini-LED technology brings to the table. There are a lot of options, and we’ve, well, screened them for you.