Screen resolution

Watching a properly displayed widescreen movie on a squarish display means black bars at the top and the bottom. That's one way to put a widescreen image onto a 4:3 display.

Another way is to squash the image so it fits.

Or do "pan and scan": crop the image to the display's shape, panning around within the frame and chopping off the sides.

Now, with widescreen TVs, watching older TV shows and movies properly displayed means black bars at the sides. Otherwise the image is stretched to fit.

A lot of movies have a wider aspect ratio than a 16:9 display, so there are still black bars at the top and bottom.

In the original US movie format, what's called "Academy ratio," the image is 1.37:1. Almost the same as the original TV aspect ratio of 4:3, or 1.33:1.

That made it "easy" to show a movie on a 4:3 TV, with very minimal visual alteration.

Hollywood began creating widescreen movies to compete with TV. Get people to pay for an experience they couldn't get for free at home.

I am a "purist." I want to see a movie or TV show how it was originally presented. If that means black bars then so be it.

When it comes to monitors, the image is usually scaled by the OS if the app isn't a full-screen game. How games handle different resolutions/aspect ratios varies from game to game.
Interestingly, most movies are 30 FPS or lower. I think new ones are 60 or so? But all the old ones (before 2000 or so?) are at or below 30. Yet I have seen people freak out when their video games fall below 60, even though you can barely tell unless you're doing high-speed spin-in-place stuff.

I am not doing well with the modern tech here. The older I get and the worse my vision gets, the smaller everything becomes due to increasing resolutions.
@jonnin
Yeah, that’s been my problem. I keep turning the font size up on my phone, but nothing seems to get big enough to read. I’ve got to use those dollar-store +2 glasses.
Aspect ratio and resolution ain't the same thing, though a lot of people confuse them.

The frame rate of movies on film was traditionally 24 frames per second. That caused problems when they were shown on (US) TV: 24 =/= 30, and 30 fps was the standard for analog US TV. Analog TVs are considered Standard Definition (SD).

Well, NTSC (US analog) is actually approx. 29.97 fps, PAL (European) is 25 fps.

Then you get into whether the content is Progressive or Interlaced.

https://www.manchestervideo.com/2013/10/16/quick-guide-to-video-frame-rates/

Showing a movie on US TV required a telecine scan, in which each film frame became two interlaced half-frame fields, with some fields repeated (the "3:2 pulldown") so 24 film frames per second filled 30 video frames and the motion stayed smooth and seamless.

A computer monitor has software that can jigger things, or the content has been smashed around beforehand.

Some movies are now shot at 48 fps, starting with "The Hobbit: An Unexpected Journey." The push for 60 fps is the latest craze.

Now that TVs are digital -- Standard Def, High Def or Ultra Hi Def -- things get really screwy and confusing.

Video games and computer monitors are a whole different ball of wax, and have different ways of dealing with playback content.

Some people hook a computer up to a TV so they can get a really big screen. Computer monitors can be expensive buggers and they aren't made in TV sizes that I know of except for very specialized uses. Sports stadiums, for instance.

Bottom line, to get really good quality playback at a good resolution, TV or monitor, it will cost money.
Yet I have seen people freak out when their video games fall below 60, even though you can barely tell unless you're doing high-speed spin-in-place stuff.
The difference is that a video game is interactive, and you're directly causing things like the camera to quickly look around; more frames = less lag when moving mouse (all other things equal).
Duthomhas wrote:
I keep turning the font size up on my phone, but nothing seems to get big enough to read. I’ve got to use those dollar-store +2 glasses.

Huh, I've got the same problem! Although I use expensive contact lenses because glasses always fall down my nose...

As for height-by-width, in math class, we learned the Cartesian coordinate system, which uses (x, y), or (horizontal, vertical). So that's what I've always used.

Although I think most TV screens are measured on the diagonal, am I correct?

Furry Guy wrote:
Now that TVs are digital -- Standard Def, High Def or Ultra Hi Def -- things get really screwy and confusing.

Ugh. Whenever I watch a newer movie– e.g. The Matrix– and then during the week we'll watch old TV shows, like Hogan's Heroes– I always have to fiddle around with the resolution, aspect ratio, etc. It's too complicated for an old fogey like me who just wants to turn the tube on and laugh at the antics of Werner Klemperer, John Banner, &co.

Some people hook a computer up to a TV so they can get a really big screen.

We used to watch movies on my 2010 iMac– it has a 16:9 aspect ratio and it worked pretty well, has a nice big screen.
LOL, these days you can get a wall-sized TV for $700 at Costco. Good enough to build your own mini-theater.

Yes, TV screens are measured on the diagonal — a super stupid metric, IMO. Here’s Samsung’s guide to TV screens: https://www.samsung.com/uk/tvs/tv-buying-guide/what-size-tv-should-i-get/

Video, on the other hand, is measured by the vertical (number of scan lines, which may be half of what you expect). The corresponding horizontal value is in a standards document somewhere. For example, a 720p video is an HD video, with 720 scan lines and 1280 pixels per line, or 1280×720.
Yes, TV screens are measured on the diagonal — a super stupid metric, IMO


It made sense in 1955. There was only one thing the diagonal number could mean, so it was sufficient. It no longer makes sense -- a pixel count would be a little better as a single number to compare two things at a glance.
US$700 is well outside the comfort zone of my very fixed and limited income and budget.

Besides, I don't have the space for a TV larger than 50"-60" nor do I want to do a wall mount.

Whenever I watch a newer movie– e.g. The Matrix– and then during the week we'll watch old TV shows, like Hogan's Heroes– I always have to fiddle around with the resolution, aspect ratio, etc.

With even a budget-priced Blu-ray player connected via HDMI, switching from HD to SD content and back shouldn't require changing the aspect ratio.

Cable/satellite/over-the-air channel content can be a pain like that, the broadcaster/cable provider needs to set the aspect ratio for all the channels and LEAVE IT THE HELL ALONE!