TV technology is probably second only to phones these days in terms of how fast it moves. Just 10 years ago, many people were still using an old-fashioned Cathode Ray Tube (CRT) TV that worked off an analog TV signal. But that changed in 2009, when most broadcasters switched over to digital transmissions and high-power analog broadcasts were switched off. When that happened, people either needed to get a converter box to watch digital broadcasts on an analog TV, or they had to go out and buy a new TV with a digital tuner. Unsurprisingly, many chose the latter. But those TVs purchased back in 2009 are no longer state of the art. In fact, many of them weren't even progressive scan and topped out at a resolution of 1080i.
What is Progressive Scan?
A QUICK LESSON: Older TVs scan in a picture by beaming one line at a time across the screen. The beam actually skips every other line on the first pass, then returns to scan the missed lines on a second pass. Known as interlaced scanning, this is what the "i" in 1080i designates. While there aren't many interlaced TVs on the market any longer, many HD broadcasts from network and cable providers are still in 1080i.
So, what then is progressive scan? This is when a TV beams the entire picture onto the screen in one all-encompassing pass. No skipping, no alternating lines. As you can imagine, it's a more vivid, uniform picture. But remember: to get the most out of a progressive scan TV, your image sources (Blu-ray, broadcast, streaming) must be outputting a progressive image. So when you hear about 1080p resolution, the "p" stands for progressive. Since progressive scan draws every line of the frame in a single pass, it effectively delivers twice as many lines per refresh as an interlaced source of the same resolution.
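If it helps to see that in concrete terms, here's a toy Python sketch (not tied to any real video library; the line labels are just for illustration) showing how interlaced scanning splits one 1080-line frame across two passes, while progressive scan delivers the whole thing at once:

```python
# A toy illustration: interlaced scanning splits one frame into two passes,
# while progressive scan delivers every line in a single pass.

LINES = 1080  # lines in the frame, as in 1080i/1080p

def interlaced_fields(frame):
    """Split a frame into two fields: the odd-numbered lines, then the even ones."""
    first_pass = frame[0::2]   # lines 1, 3, 5, ...
    second_pass = frame[1::2]  # lines 2, 4, 6, ...
    return first_pass, second_pass

def progressive_frame(frame):
    """Progressive scan: the whole frame goes out in one pass."""
    return frame

frame = [f"line {n}" for n in range(1, LINES + 1)]

odd, even = interlaced_fields(frame)
print(len(odd), len(even))            # 540 540 -> each pass carries half the picture
print(len(progressive_frame(frame)))  # 1080   -> one pass carries the whole picture
```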
Can We Get Some Resolution on This Resolution Thing?
So, the 1080 in 1080p is the resolution, or the number of horizontal lines displayed on the screen. As recently as three or four years ago, 1080p was the highest resolution available for most video sources, but that's changed with the adoption of 4K. Here's a brief rundown of the typical resolutions you'll see currently:
- Standard Definition (SD or SDTV): When you're watching a grainy non-HD channel on TV or a DVD, you're watching an SD source with a resolution of 480i. With our eyes now used to higher-definition sources and the much larger screen sizes of today's TVs, SD sources can be pretty tough to watch and should be avoided when possible.
- High Definition (HD or HDTV): HD sources come in three flavors: 720p, plus the 1080i and 1080p we already mentioned. 720p is often referred to as "HD ready," while 1080i and 1080p are both known as "full HD." Even though 1080i is technically a higher resolution than 720p, a 720p image can offer more clarity and less motion blur than 1080i, since it displays all of its lines at once rather than interlacing them. It's a fairly moot point though, as you don't see many 720p broadcasts or TVs these days.
- 4K Ultra HD (UHD): The current gold standard of resolution is 4K Ultra HD, also known as just 4K or UHD. 4K boasts 3840 x 2160 pixels, four times the pixel count of 1080p, and the difference is breathtaking (see the quick pixel math after this list). It's called 4K because the movie theater standard, known as Cinema 4K, is 4096 pixels wide, which gets rounded off to "4K." Oddly enough, the naming convention used for SD and HD would have 4K labeled as 2160p, but apparently somebody decided that wasn't catchy enough. In the last two years, more and more 4K TVs have become available at more reasonable prices, with some even less than $500. Also proliferating are 4K UHD Blu-ray releases and 4K streaming options from the likes of Netflix and Amazon. How long will 4K remain on top of the heap? Probably not long if things keep going the way they have been: Dell has already introduced an 8K monitor aimed at the gaming market, and more will follow soon. Having said that, 4K will not be abandoned by TV manufacturers or content producers anytime soon.
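Here's the pixel math behind those labels, as a quick Python sketch. The SD figure (640 x 480) is one common approximation; actual SD frame sizes vary by format:

```python
# Quick pixel math for the resolutions above.
resolutions = {
    "SD (480)": (640, 480),          # rough approximation; SD sizes vary
    "HD (720p)": (1280, 720),
    "Full HD (1080p)": (1920, 1080),
    "4K UHD": (3840, 2160),
}

for name, (width, height) in resolutions.items():
    print(f"{name:>16}: {width} x {height} = {width * height:,} pixels")

# 4K vs. Full HD: four times as many pixels
print((3840 * 2160) / (1920 * 1080))  # 4.0
```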
What About HDR, What's That?
High Dynamic Range (HDR) is a newer standard that creates an image with a wider and more accurate range of color, also known as color gamut, and a better contrast ratio. A better contrast ratio means that HDR heightens the difference between dark and light areas of the image to make them look more realistic. While 4K allows for a more detailed image, HDR makes that image more accurate when compared to the real thing. Or to put it simply, 4K is more pixels, while HDR is pixels with enhanced color and light. HDR technology originated in cameras; you've probably seen the HDR setting in your phone's camera. Everyone agrees that content mastered in HDR looks better than content that isn't, so while you're shopping for a TV, make sure it also supports HDR. Fortunately, almost every new 4K display will have HDR built in. But it isn't quite that simple, as there are two competing HDR formats: HDR10 (along with its recently announced HDR10+ successor) and Dolby Vision. HDR10 is the basic standard, while Dolby's licensed version touts enhanced color and contrast over HDR10.
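To give a rough sense of what "a wider range of color" means in numbers, here's a back-of-the-envelope sketch. It assumes the commonly cited bit depths (8 bits per channel for standard dynamic range, 10 for HDR10, up to 12 for Dolby Vision); what a given TV actually reproduces will vary:

```python
# Back-of-the-envelope color math, assuming commonly cited bit depths.
def shades_per_channel(bits):
    return 2 ** bits

def total_colors(bits):
    # three channels: red, green and blue
    return shades_per_channel(bits) ** 3

for label, bits in [("SDR, 8-bit", 8), ("HDR10, 10-bit", 10), ("Dolby Vision, 12-bit", 12)]:
    print(f"{label}: {shades_per_channel(bits):,} shades per channel, "
          f"{total_colors(bits):,} possible colors")
```

The jump from 8 to 10 bits per channel is what lets HDR show smoother gradients between dark and light instead of visible banding.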
Aspect Ratio: Keeping Things in Proportion
Another difference between SD and HD or 4K is the aspect ratio, which is the shape of the image with respect to its relative width and height. SD signals are natively in a 4:3 aspect ratio, with 4 being the width and 3 being the height. This is because old-school CRTs were more of a square shape, as opposed to modern widescreen TVs, so the image was sized to fit the screen. That was fine for TV shows and other broadcasts. The problem was that movie screens were much wider, so when movies were converted to a 4:3 ratio, much of the original image was cropped away and you were only watching a portion of the picture. Back then the solution was "letterboxed" VHS tapes and DVDs, which kept the image in the same ratio as the original theater print by adding black bars to the top and bottom of the screen. None of the image was lost, but the overall size was much reduced. With the advent of HD, relatively wider screens could be produced, and the adopted aspect ratio for both TV screens and content became 16:9. That's still not as wide as the format used in movie theaters, but it's closer, so the black bars when watching movies are now much smaller, which allows for a larger image (the quick calculation below shows just how much smaller).
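Here's a rough, hypothetical calculation of that effect, assuming a 2.39:1 movie (a common cinema ratio) and illustrative screen sizes for a 4:3 SD set and a 16:9 Full HD set:

```python
# How many lines of the screen become black bars when a wide movie is letterboxed.
def letterbox_bar_lines(screen_width, screen_height, source_aspect):
    """Total lines of black bars when the source is scaled to fill the screen's width."""
    image_height = screen_width / source_aspect
    return max(0, round(screen_height - image_height))

# A 2.39:1 movie on an old 4:3 SD set (640 x 480) vs. a 16:9 Full HD set (1920 x 1080):
print(letterbox_bar_lines(640, 480, 2.39))    # ~212 of 480 lines, roughly 44% bars
print(letterbox_bar_lines(1920, 1080, 2.39))  # ~277 of 1080 lines, roughly 26% bars
```

Same movie, but on the widescreen set far less of the screen is wasted on bars, so the picture itself ends up noticeably bigger.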
Smart TVs: Your TV is a Computer
A big leap forward in TV technology has been the rise of Smart TVs, that is, TVs that can connect to the internet and use apps. TVs outfitted with computer processors have actually been around for decades, but in the early 2000s they began adding functionality like web browsers and online gaming. Now, with streaming video and audio making up a huge percentage of our watching and listening, a Smart TV has become a necessity for many, and you'd be hard-pressed to find a new TV that isn't "smart." Many people still access apps through a Blu-ray player or their phones and then connect to their TV wirelessly through Chromecast or Bluetooth. However, it's still nice to have apps like Netflix, Amazon or Hulu pre-loaded on the TV out of the box, so you can use it as a stand-alone unit without any external device. Since they all have WiFi, getting them connected to the internet via your network is a snap, as long as you can remember your password. As mentioned, most Smart TVs also have built-in Bluetooth, so you can stream audio directly from your phone or other Bluetooth-enabled device. One big difference between Smart TVs is the operating system (OS) each uses. A few of the most popular OS options are Android TV, Roku, Amazon Fire TV and Samsung's Tizen OS. When you're shopping for a TV, try out a few of the different OS types at the store to see which you like best.
That concludes our look at the terms and technologies you need to know when getting into the world of today's TV tech. We hope you found it informative, and we wish you happy watching!