120Hz, 240Hz, 480Hz...

Just curious if anyone else finds the increasingly-higher refresh rates kinda... weird-looking.
Particularly for more cinematic experiences.

I remember the first time I saw one of these on display in an electronics store, with a familiar movie on demo. It did not look right. The first thing that came to mind was that the playback was "too fast". Not sped up... but you could see the appearance of "more frames". (I know that's the point - that's what it's supposed to do: interpolate more frames.)

To be honest, I had figured the manufacturers are always gonna need a feature to increment, something to keep upping for sales purposes. Once you hit a resolution max for a while... and tout progressive scan... it seems you have to find another feature to market. (You know... "Man, they've got 480Hz TVs now! Gotta go get one!") So I get the gimmicky/sales part of it.

But I was visiting a friend who had a classic war movie going. Again, to me it just looked goofy, particularly smeary on parts with lots of motion. I talked to him about it and he seemed to think it was really quite NICE.

Then I got a little concerned that this was the way things were going and I'd have to accept it. ...and I started thinking maybe I'm just old-fashioned. After all, I watch all of my movies in "theater mode", with flat/muted colors, as opposed to blindingly-bright and saturated. (And I abhor 3D, but that's another thread.)

So... high refresh rates. Do you like them?
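For reference, the "more frames" effect described above comes from frame interpolation. A naive version just averages neighboring frames; actual TVs do motion-compensated interpolation (estimating per-block motion vectors), but this pure-Python sketch shows the basic idea, and a plain blend like this is exactly what produces smearing on fast motion:

```python
# Naive frame interpolation: double the frame rate by blending neighbors.
# Real TVs use motion-compensated interpolation; this simple average is
# only a sketch of the idea.

def double_rate(frames):
    """frames: list of equal-length pixel rows (flat lists of ints)."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append([(p + q) // 2 for p, q in zip(a, b)])  # synthetic in-between frame
    out.append(frames[-1])
    return out

clip = [[0, 0], [100, 100], [200, 200]]  # three tiny "frames"
print(double_rate(clip))
# -> [[0, 0], [50, 50], [100, 100], [150, 150], [200, 200]]
```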
 
They're great for video. News, sports, even made-for-TV things like sitcoms (though most big-budget ones still shoot on film).

For movies, it's not a bad thing. Most people won't be able to tell the difference, and too fast is better than too slow.

For 3D, it's great. Having the glasses and screen sync at a super-high refresh rate was a great idea for separating the right-eye and left-eye images. Way better than polarization IMO.
 
It's just repetition. I noticed immediately that it looked peculiar when I bought a new television, but eight months on, I'm used to it.

Which do I prefer? I'm not so sure...
 
Eventually all the old farts will die, and movies will be at acceptable frame rates! ;)

From a James Cameron interview: http://www.variety.com/article/VR1117983864



I'm hearing that there are already calls to increase the frame rate to at least 30 fps for digital 3-D because certain camera moves, especially pans, look jumpy in 3-D. I saw that in the Imax 3-D "Beowulf." You've been an advocate for both 3-D and higher frame rates. Have you seen the problem and do you have any thoughts on it?


For three-fourths of a century of 2-D cinema, we have grown accustomed to the strobing effect produced by the 24 frame per second display rate. When we see the same thing in 3-D, it stands out more, not because it is intrinsically worse, but because all other things have gotten better. Suddenly the image looks so real it's like you're standing there in the room with the characters, but when the camera pans, there is this strange motion artifact. It's like you never saw it before, when in fact it's been hiding in plain sight the whole time. Some people call it judder, others strobing. I call it annoying. It's also easily fixed, because the stereo renaissance is enabled by digital cinema, and digital cinema supplies the answer to the strobing problem.

The DLP chip in our current generation of digital projectors can currently run up to 144 frames per second, and they are still being improved. The maximum data rate currently supports stereo at 24 frames per second or 2-D at 48 frames per second. So right now, today, we could be shooting 2-D movies at 48 frames and running them at that speed. This alone would make 2-D movies look astonishingly clear and sharp, at very little extra cost, with equipment that's already installed or being installed.

Increasing the data-handling capacity of the projectors and servers is not a big deal, if there is demand. I've run tests on 48 frame per second stereo and it is stunning. The cameras can do it, the projectors can (with a small modification) do it. So why aren't we doing it, as an industry?

Because people have been asking the wrong question for years. They have been so focused on resolution, and counting pixels and lines, that they have forgotten about frame rate. Perceived resolution = pixels x replacement rate. A 2K image at 48 frames per second looks as sharp as a 4K image at 24 frames per second ... with one fundamental difference: the 4K/24 image will judder miserably during a panning shot, and the 2K/48 won't. Higher pixel counts only preserve motion artifacts like strobing with greater fidelity. They don't solve them at all.
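Cameron's arithmetic in the quote above checks out. Taking "2K" and "4K" as 2048 and 4096 horizontal pixels (an assumption here, based on the usual digital-cinema container widths), the two pixels-times-rate products are identical, and the stereo-vs-2D data-rate claim is the same kind of equality:

```python
# Checking the arithmetic in Cameron's quote. The 2048/4096 horizontal
# pixel counts for "2K"/"4K" are an assumption (digital-cinema container widths).

def perceived_resolution(pixels, refresh_rate):
    # Cameron's rough figure of merit: pixels x replacement rate
    return pixels * refresh_rate

print(perceived_resolution(2048, 48))  # 2K at 48 fps -> 98304
print(perceived_resolution(4096, 24))  # 4K at 24 fps -> 98304, same product

# Data-rate check from the same quote: stereo at 24 fps (two eyes) moves
# as many frames per second as 2-D at 48 fps.
print(24 * 2 == 48)
```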
 
James Cameron is the film anti-christ. Somebody needs to put him on a church altar and shove 7 daggers in his body before that fucking slimeball, no-talent hack tortures us with another shitty film.
 
Cue Cracker Funk in 5, 4, 3… :D

HA! No doubt.

He's evil incarnate, the anti-film, everything wrong, bad, and destructive that we should be attempting to destroy before it infects more filmmakers. He is the AIDS of film. The cancer of film that needs to be cut out and blasted with radiation before it metastasizes.
 
You're right. I don't want to change. Also, if I wave my hand in front of my face really quick, I get a "strobing" and/or "blurring" effect. Is there any way to upgrade the framerate of my ocular sensors?
 
HA! No doubt.

He's evil incarnate, the anti-film, everything wrong, bad, and destructive that we should be attempting to destroy before it infects more filmmakers. He is the AIDS of film. The cancer of film that needs to be cut out and blasted with radiation before it metastasizes.

Yeah, but tell us what you really think.
 
hahah..


Then you take that excellent and PERFECT footage and beat it to death to get the pleasing result... like the OP did in this other post:

http://www.indietalk.com/showthread.php?p=151858#post151858


7D quality footage massacre! But looks great!

Not to knock the post processing or the camera (both are pretty sweet) - but you could also just use 8mm in the first place. You could probably knock out a double handful of 8mm projects for the cost of a 7D and AKS.

T-o-m-eh-t-o / T-o-m-ah-t-o AFAIAC though. Once you take away the debate of personal opinion, it just comes down to an artist and their chosen tools. I like certain kinds of tools. I also like old things, not because I don't like technology, but just because I do. I like my 28-year-old truck because it is freaking bulletproof, and in the rare event something goes south, it's easy to fix. I like my AE-1 because I have been using it forever and I like the images that I get with it and the mental process that I go through when I use it. I like my Galaxy phone because I can track mileage for business, listen to music, check email, take cool pics, surf this site, calculate depth of field, or just about anything else that can be done with software. ;)

I'm also talking about the projection though. I've been following the 48fps projection debate elsewhere on the internet and can understand where folks are coming from on both sides. But let's ask a question here.

Why does every movie need to render motion as if it were an HD broadcast of a live sporting event on a rapid-refresh-rate screen? How would an ethereal work like Days of Heaven benefit from HD gloss and game-console-like refresh rates?

Maybe I am an old fart (although, if memory serves, we're not that far off in age, you fart! :D), but I don't see 48fps projection as always being advantageous. I guess my point is this: if frame rates at projection can be varied for different effects on the image (only talking about matching rates like 24@24, 30@30, or 48@48, not under/over-cranking, not pull-downs), then the choice becomes an artistic one as much as a business one. If the projectors can run at 100fps and a camera can shoot at 100fps, then perhaps there is a case to do so. Of course some story/look/style combos are going to be (IMHO) more suited to 24@24.
 