120 hz movies



ZenOps
04-01-2019, 03:19 PM
https://www.engadget.com/2019/03/13/paramount-ang-lee-sci-fi-film-hfr/

Always said this should be a thing. Years of playing videogames have trained me to recognize 100+hz. Movie theatres at 24hz are just plain painful nowadays. Did I notice the smoothness of the Hobbit? Absolutely.

Asians with fast visual acuity and millennials are demanding more of a film format that is based on 1927 standards.

Mitsu3000gt
04-01-2019, 04:30 PM
Hz and FPS are not the same thing.

I cannot stand non-animated movies shot at 48+ FPS. Action sequences look OK, but any still scenes get a very strong "soap opera" effect and it becomes painfully obvious that you are watching a movie set with props and other fake accessories. It looks like you're watching a school play, or like it was filmed with a handycam. I'm sure it's one of those things where some people are more sensitive to it than others, but I notice immediately and it ruins it for me. The non-action scenes in the new Hobbit trilogy were some of the worst I have seen. I wish I was less sensitive to it. 120 FPS would probably be unwatchable for me unless it was animated haha.

Buster
04-01-2019, 05:20 PM
The Hobbit looked like shit.

ExtraSlow
04-01-2019, 05:50 PM
How is fps different than Hz?

firebane
04-01-2019, 06:16 PM
How is fps different than Hz?

Hz is the number of times a screen updates itself per second. FPS is the number of frames something is capable of producing per second.

ExtraSlow
04-01-2019, 06:22 PM
I'm no cinephile, but those sound the same to me?

firebane
04-01-2019, 06:31 PM
I'm no cinephile, but those sound the same to me?

They are close but not the same. For example, if you are a gamer and want to see a high FPS, you also need a monitor with a high refresh rate.

ExtraSlow
04-01-2019, 06:55 PM
I can see where a monitor could have a higher refresh rate (Hz) than the content frame rate (FPS). That makes sense to me. I guess the convention is that FPS refers to the content, and Hz refers to the display?

killramos
04-01-2019, 07:59 PM
I can see where a monitor could have a higher refresh rate (Hz) than the content frame rate (FPS). That makes sense to me. I guess the convention is that FPS refers to the content, and Hz refers to the display?

It all depends on how autistic you want to be about nomenclature...

I am no mathematician, but something that measures the number of times something occurs per second, and another thing that tells how many times something occurs per second, are the same unit.

ExtraSlow
04-01-2019, 08:04 PM
It all depends on how autistic you want to be about nomenclature...

I am no mathematician, but something that measures the number of times something occurs per second, and another thing that tells how many times something occurs per second, are the same unit.

That was my initial confusion. I'm pretty pedantic, but once I know the common usage of terminology, I'll adhere to it.

Buster
04-01-2019, 08:07 PM
Hz can be applied to any frequency... so FPS can be referred to as Hz.

Edit: beat me to it

Buster
04-01-2019, 09:24 PM
Monitor refresh "hz" is one number.

Video card frames "hz" is a separate number. Normally referred to as FPS.

Your graphics card's output is a frequency, so it can therefore be described in Hz as well.

rage2
04-01-2019, 10:02 PM
Like everyone has already said, it's just semantics. Even though they're technically describing the exact same thing, people have used FPS to describe the content frame rate, and Hz to describe the display's physical refresh rate. Move over to cinematography, where things were shot on film, and it's just FPS to describe the content's frame rate, because you're just projecting the film. There is no display before digital projection. Even with digital projection, to keep in line with film nomenclature, they just refer to what they can play in FPS. No mention of Hz anywhere.

Mitsu3000gt
04-02-2019, 10:34 AM
I can see where a monitor could have a higher refresh rate (Hz) than the content frame rate (FPS). That makes sense to me. I guess the convention is that FPS refers to the content, and Hz refers to the display?

You pretty much got it, but they are quite different.

FPS is how many frames are being built/rendered per second; Hz is simply how many times your display device refreshes the on-screen content. They are separate and often confused. 60Hz is kind of the 'magical' point where most people no longer see any blinking or flickering, and virtually every monitor and TV these days is at least 60Hz.

Put another way, in terms of a computer game (the arena where this is discussed most often), your monitor's refresh rate has absolutely nothing to do with how many frames per second your computer can produce. The catch is that if your refresh rate is lower than the available FPS, your monitor will not be able to display all the frames being produced. Think of the refresh rate as a ceiling on how many frames you can view per second; it has nothing at all to do with how many frames get produced. If you have a beast of a computer that can put out 144+ FPS at max settings but only a 60Hz monitor, you will only ever see 60 unique images per second, despite what is being produced behind the scenes.
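
To make that ceiling concrete, here's a toy Python sketch (my own made-up helper, not any real graphics API; it assumes the display simply shows the newest rendered frame at each refresh):

# Toy model: the GPU renders gpu_fps frames per second, the display
# refreshes hz times per second, and each refresh shows the newest frame
# available. Unique frames seen per second caps at min(gpu_fps, hz).
def visible_frames_per_second(gpu_fps, hz):
    shown = set()
    for i in range(hz):              # one second of refreshes
        t = i / hz                   # time of this refresh
        shown.add(int(t * gpu_fps))  # newest frame rendered by time t
    return len(shown)

print(visible_frames_per_second(144, 60))  # 60 - the other 84 frames are never seen
print(visible_frames_per_second(30, 60))   # 30 - the display just repeats frames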

If your FPS is higher than your refresh rate and the two are not in sync, you can get "screen tearing", a phenomenon where you see parts of multiple frames in a single screen draw. Again, in the computer world, one way to avoid this is to enable V-Sync, which forces the GPU to send a new frame only when your monitor is ready for it (but this also limits your FPS to the refresh rate of the display). The second (and better) way is a display compatible with Nvidia's G-Sync or AMD's FreeSync. These can communicate with the GPU and vary their refresh rate (up to the display's maximum) to match the GPU's frame rate, displaying each frame as soon as it's rendered. In computer games especially, frame rates vary constantly, so this works really well.
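
The tearing condition itself is easy to count with a back-of-the-envelope model (again my own sketch; tears_per_second is a made-up name, and it assumes one buffer swap per rendered frame, counting swaps that miss a refresh boundary):

# Over one second the GPU swaps buffers gpu_fps times while the display
# starts a scan-out hz times. A swap that lands mid-scan-out produces a
# tear; V-Sync instead holds every swap until the next refresh boundary.
def tears_per_second(gpu_fps, hz=60, vsync=False):
    if vsync:
        return 0  # swaps wait for the boundary (and FPS is capped at hz)
    # Swap k happens at time k/gpu_fps; it aligns with a refresh i/hz
    # exactly when k*hz is a multiple of gpu_fps (pure integer math).
    return sum(1 for k in range(gpu_fps) if (k * hz) % gpu_fps != 0)

print(tears_per_second(144))              # 132 - most swaps land mid-refresh
print(tears_per_second(144, vsync=True))  # 0

G-Sync/FreeSync attack it from the other end: the refresh boundary moves to wherever the swap happens, so there is nothing left to misalign and the frame rate isn't forced down to the refresh rate.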

Also, just like most other things, it's only as good as the weakest link - everything in the chain (video card, monitor, ports, etc.) has to be capable of X refresh rate and X FPS to take advantage.

So yeah, FPS and Hz are not the same thing. You could have a 1 FPS feed, or even a still image, and refresh the display a million times per second; or you could have a 1Hz monitor and have a supercomputer feed it a million useless FPS, if you had the means to do so.

In the film world it's not quite as straightforward, as most films are shot at 24 FPS while displays and HT projectors have refresh rates well above 24Hz. Very few HT displays can vary their refresh rate to 48Hz, 72Hz, or whatever (some plasmas could). Ideally you want a refresh rate that is an even multiple of 24.

In that scenario you are never getting more than 24 native/input FPS. If you have a 60Hz display, it uses 3:2 pull-down. If you have a 120Hz or 240Hz display with the "motion enhancement" features turned on (some people are very sensitive to the look), the display performs frame interpolation; with the enhancements turned off, it uses frame repetition. For a 48 FPS movie like the Hobbit you would want to watch at 48 / 96 / 144 / 192 / 240 Hz for no judder.
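
The "even multiple" rule is easy to check in code (my own toy helper, cadence, following the logic above; the 24-on-60Hz pattern it prints is the classic 3:2 pull-down cadence, just phase-shifted):

# How many times each content frame gets repeated on a given display.
# An integer multiple (24 FPS on 120Hz) gives a constant repeat count: no judder.
# A non-multiple (24 FPS on 60Hz) gives an uneven cadence: judder.
def cadence(content_fps, display_hz):
    if display_hz % content_fps == 0:
        return f"judder-free: every frame shown {display_hz // content_fps}x"
    repeats = [(k + 1) * display_hz // content_fps - k * display_hz // content_fps
               for k in range(content_fps)]
    return "uneven: " + ":".join(map(str, repeats[:4])) + " ..."

print(cadence(24, 120))  # judder-free: every frame shown 5x
print(cadence(24, 60))   # uneven: 2:3:2:3 ... (3:2 pull-down)
print(cadence(48, 60))   # uneven: 1:1:1:2 ... (why 48 FPS wants 96/144/240 Hz)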