Graphic Display Resolutions: What Do SD, HD Ready, Full HD, 720p, 1080p and 1080i Mean?

Graphic display resolutions can be a rather cryptic business, with multiple standards describing the same display resolution in several different ways. Those technical terms tend to change based on the display’s purpose (television versus computer monitor) and even your region (as with the meaning of HD Ready).

Previously, we talked about 7 Important Things To Know When Buying an LCD Monitor, the difference between Full HD and HD Ready, and even how Apple’s retina display works. Today, we’ll help you make sense of the different terms people throw around when describing display resolutions. When buying a computer monitor or a TV screen, knowing what those numbers mean is incredibly useful, not just to differentiate between two displays, but also to determine what kind of display you should be looking for.

Width x Height

The easiest convention is the one used to describe the maximum resolution of computer monitors. Many laptop displays have a maximum resolution of 1280×800, and the resolutions of larger computer screens often go into the neighborhood of 1680×1050.

These numbers describe the width and height of the display in pixels, the building blocks of your display. Some displays have different pixel densities (most famously, Apple’s retina displays), meaning two displays with the same maximum graphic display resolution are not necessarily the same physical size. But the actual resolution (that is, the number of building blocks available to construct a picture) is unambiguous.
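
To see how pixel density ties resolution to physical size, you can compute a display’s pixels per inch (PPI) from its resolution and diagonal size. Here’s a minimal sketch in Python; the 13.3-inch panel sizes and resolutions below are illustrative examples, not the specs of any particular product:

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_inches):
    """Pixel density: diagonal resolution in pixels divided by diagonal size."""
    diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
    return diagonal_px / diagonal_inches

# Two hypothetical 13.3-inch panels with different resolutions:
print(pixels_per_inch(1280, 800, 13.3))   # ~113 PPI
print(pixels_per_inch(2560, 1600, 13.3))  # ~227 PPI, a "retina"-class density
```

Same physical size, four times the pixels: that is the whole retina trick in two lines of arithmetic.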


SD, HD Ready Or Full HD

The difference between SD and anything with ‘HD’ in its name is simple. SD, or Standard Definition, is usually used for television displays that are neither 720p or 1080p High Definition screens nor 480p Enhanced Definition screens. More specifically, the term SD display refers to 576i displays in the PAL and SECAM regions, or 480i displays in the NTSC region.

Full HD is used to describe 1080p displays. The difference between HD Ready and Full HD is more ambiguous, and depends on the region. For more information, read Matt’s article on The Difference Between HD Ready and Full HD.

If some of the words used above made little sense, don’t worry. We’ll explain the meaning of 720p and 1080p in a bit, as well as the difference between 1080p and 1080i (progressive versus interlaced) displays.


720p versus 1080p or 1080i

Modern televisions are often described using terms like 720p, 1080p or 1080i. The number at the front of the term indicates the lines of vertical resolution: 720p and 1080p displays have 720 and 1080 lines of vertical resolution, respectively. 1080i screens likewise have 1080 lines of vertical resolution (we’ll explain the difference between 1080p and 1080i below).

To compare the resolution of these displays to the width x height notation of computer displays, we can derive the lines of horizontal resolution from the aspect ratio: multiply the vertical resolution by the ratio of width to height. For example, a 1080p display with a conventional 16:9 aspect ratio has 1080 × 16/9 = 1920 lines of horizontal resolution, meaning a 16:9 1080p screen has a resolution of 1920 x 1080 pixels.
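
As a sketch of that arithmetic, here’s a small hypothetical helper (the function name and defaults are just for illustration) that derives the full pixel resolution from the vertical line count and the aspect ratio:

```python
from fractions import Fraction

def full_resolution(vertical_lines, aspect_w=16, aspect_h=9):
    """Derive width x height in pixels from vertical lines and aspect ratio."""
    width = vertical_lines * Fraction(aspect_w, aspect_h)
    if width.denominator != 1:
        raise ValueError("aspect ratio does not divide evenly into this line count")
    return int(width), vertical_lines

print(full_resolution(1080))  # (1920, 1080) -- Full HD
print(full_resolution(720))   # (1280, 720)  -- 720p HD
```

Using exact fractions instead of floating point avoids rounding surprises when an aspect ratio doesn’t divide evenly into a line count.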


1080p versus 1080i, or Progressive versus Interlaced

The difference between 1080p and 1080i, or rather the difference between progressive and interlaced displays, comes down to how the image is displayed.

Progressive displays use frames. One frame is one completely rendered image. If you press pause while watching a video, you’re looking at a single frame. If a progressive display is said to have 25 frames per second, that means it renders 25 distinct images every second.

Interlaced displays work very differently. Instead of refreshing the entire picture, an interlaced display refreshes half the lines in the picture, alternating between the odd and even lines. It’s meaningless to talk about frames per second, because an interlaced display never displays a ‘complete frame’. Instead, we express the refresh rate in fields per second, where one field contains half the lines of the display.
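
To make the frame-versus-field distinction concrete, here’s a toy sketch (purely illustrative, not any real video API) that splits one complete frame, represented as a list of scan lines, into its two interlaced fields:

```python
def split_into_fields(frame):
    """Split a full frame (a list of scan lines) into two interlaced fields."""
    top_field = frame[0::2]     # even-numbered lines (0, 2, 4, ...)
    bottom_field = frame[1::2]  # odd-numbered lines (1, 3, 5, ...)
    return top_field, bottom_field

frame = [f"line {n}" for n in range(6)]   # a tiny 6-line "frame"
top, bottom = split_into_fields(frame)
print(top)     # ['line 0', 'line 2', 'line 4']
print(bottom)  # ['line 1', 'line 3', 'line 5']
```

An interlaced display draws those two fields one after the other; your eye blends them into a single picture.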

In an ideal, theoretical world, progressive would always be better than interlaced. However, there are a few problems with that thought. Progressive and interlaced displays don’t have the same refresh rate. Although an interlaced display only renders half the lines with each refresh, it refreshes twice as often as the equivalent progressive display, and each of these fields is a snapshot of a distinct moment in time. On top of that comes the fact that television broadcasting uses interlaced video.

All this makes interlaced pictures appear more fluid in motion than the equivalent progressive pictures. On the other hand, progressive pictures are more easily scaled, paused and edited, which makes the image more adaptable with less loss of quality.
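
To make the trade-off concrete, here’s a back-of-the-envelope comparison, assuming the common PAL-region rates of 25 progressive frames or 50 interlaced fields per second. Both signals push the same number of pixels per second, but the interlaced one samples motion twice as often:

```python
lines, width = 1080, 1920

# 1080p25: 25 complete frames per second
progressive_pixels = width * lines * 25             # pixels drawn per second
progressive_snapshots = 25                          # distinct moments captured

# 1080i50: 50 half-frame fields per second
interlaced_pixels = width * (lines // 2) * 50       # same pixel throughput
interlaced_snapshots = 50                           # twice the motion samples

print(progressive_pixels == interlaced_pixels)      # True: equal bandwidth
print(interlaced_snapshots / progressive_snapshots) # 2.0
```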

What other specs do you look at when shopping for displays? Let us know.