Editing 25 FPS vs. 30 FPS

I'm starting to realize that my present production will look better rendered at 30 fps than at 25 fps. With all of the special effects and CGI I'm doing, too much is getting lost in the fancy effects, especially the laser fire and disintegration.

I know 25 fps is more like film. But in this day and age of digital effects, is the standard shifting over to 30 fps?
 
Yes, a motor or digital system is good. It's not good to have a hand crank all the time, but a switch you could flip to move to manual crank would be useful for controlling how many frames you capture in slow-motion shots and whatnot. It would be even better if they just made cheaper cams with more fps options.
 
I'm interested in what 120 fps looks like after reading this. Are there any easy-to-find movies shot at that rate?

Even if you had a camera that could shoot at 120 fps, you'd have a hell of a time playing it back correctly. When I did these tests I had a fancy high-frequency CRT with some insane vertical refresh rate (somewhere in the neighborhood of 160 Hz -- it was a giant beast of a thing I picked up at Boeing Surplus back when they had some killer equipment for sale).

I don't think most common flat-panel displays can run 100+ fps video, aside from some of the higher-end specialty ones that cost an arm and a spleen. I haven't looked into it recently, though. I think a lot of fancier HD TVs can do 120 fps now, but they're limited to 1920x1080. The higher-resolution computer monitors are lagging behind, I'd bet.

(As a snooty mac user, I won't settle for anything less than 2560x1440 at 27".)
 
In terms of visual effects and the low-budget filmmaker, the higher frame rates are a liability for rendering times and rotoscoping. When I moved from TV to film, the difference was tangible (Perhaps less so with today's speedy machines). I'll stick with good ol' 24 fps.

Rok
 
Oh okay. I thought the TVs just broadcast whatever is going through them, regardless.

No, CRTs (cathode ray tubes) are the old-style televisions, and they are set for one refresh rate. They aren't capable of a progressive image, as they can only show 60 FIELDS per second (NTSC; PAL sets show 50 fields per second), which are not the same as frames.

Even LCDs can't display arbitrary frame rates; they're limited by their refresh rates.
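The field/frame distinction above comes down to simple arithmetic: an interlaced picture is delivered as two fields per frame (odd scan lines, then even), so the effective frame rate is half the field rate. A quick sketch (the nominal NTSC field rate is actually 59.94 in practice):

```python
# Interlaced video: each frame is split into two fields
# (odd and even scan lines), so field rate = 2 x frame rate.
NTSC_FIELDS = 60   # nominally 59.94 in practice
PAL_FIELDS = 50

print(NTSC_FIELDS / 2, PAL_FIELDS / 2)  # effective frame rates: 30.0 25.0
```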
 
Doing your effects shots at a different frame rate than the rest of your footage is not advantageous, for several reasons. You can't match the plate exactly; there will be some "tween" or blend frames that cause a timing issue. And the motion will be smoother, with less motion blur, in your FX shots than in your filmed footage, which is one of the factors that makes CG look like CG.

If your FX shots don't look right, fix the effect, not the frame rate.
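The blend-frame problem described above can be sketched in a few lines. When a clip at one rate is conformed to a timeline at another, any output frame whose source position falls between two source frames has to be interpolated, which is where the "tween" timing artifacts come from. A minimal illustration (frame rates here are just examples):

```python
def blend_frames(src_fps, dst_fps, n_out):
    """Return output frame indices that fall between two source frames
    and would therefore need blending when retimed."""
    blends = []
    for i in range(n_out):
        src_pos = i * src_fps / dst_fps  # position in the source clip
        if abs(src_pos - round(src_pos)) > 1e-9:
            blends.append(i)
    return blends

# Conforming a 24 fps FX clip to a 30 fps timeline: most output
# frames land between source frames and must be blended.
print(blend_frames(24, 30, 10))  # [1, 2, 3, 4, 6, 7, 8, 9]
```

Matching rates (e.g. 24 to 24) produces no blend frames at all, which is the point of the post above.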
 

Agreed. I've worked on a lot of digital FX, and none of them were ever fixed by changing the frame rate. If the effect isn't working, it's because it isn't properly integrated, i.e. it doesn't match the camera settings (field of view, aperture, etc.) or the scene lighting.
 
Allow me to clarify my earlier post.

I'm not suggesting that anyone change frame rate of their source material. My point is that it takes fewer man & machine resources to work with material shot at lower frame rates. This is important info for the budget-minded.
 
I like where your head's at, Rok :) Having a computer process a shot that requires motion tracking at 24 fps instead of 30 fps removes 20% of the processing (six fewer frames every second). That's real time and cost savings... and is the reason film is at 24p in the first place: the least amount of stock to get an image that the eye will accept as fluid motion. Less stock = less cost. Same thing in the digital world, but the savings are more in time than materials.
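The savings estimate above is quick arithmetic: dropping from 30 fps to 24 fps means six fewer frames to process every second, which is 20% of the total work per second of footage:

```python
# Per-second frame savings when working at 24 fps instead of 30 fps.
frames_saved = 30 - 24                 # 6 fewer frames each second
fraction_saved = frames_saved / 30     # share of the per-second workload

print(f"{fraction_saved:.0%}")         # 20%
```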
 
I hear ya. If you're knocking out frames every 15 seconds or so, then even a 50% hit may not hurt that much. We once had to cram a shot through the farm that was simmering at about 40 hours a frame. Thank God we got it right after the second re-render :lol:

Rok
 
40 hours! Wow. What were you rendering, and at what resolution?

(Quick! Throw in per-frame global illumination and blurry reflections! Blinn shading is for wimps!)
 
Escher,

No, nothing with all the current bells and whistles. It was a CG environment used in a science fiction film you may have seen. Renderman, 2K, lots of comp elements to make it look better than it was ;) . Although simple by today's standards, it was cutting edge at the time ('95-'96).

Back then, simply interacting with the software was a slow and arduous process. Just waiting for the wireframe to refresh on the screen was pretty brutal. Today's crop of animators can't fathom how difficult it used to be. We used to set a few keyframes (based on stopwatch timing) and then render a test to see if it worked. In 2011, we can all just scrub back and forth in real time. Occasionally, I miss the old days for the people, but not the hardware.

I did some stopmo in a show you've heard of. Risky but rewarding work.
 
I got everything working at 24 / 25 fps between Sony Vegas Pro and Adobe Premiere. The DVD version shows all of the old animation too. Old, because I've been working every night since last weekend creating new effects at 24 fps with Particle Illusion and using the FX tools in Vegas Pro to fit them into a new video layer right over the live action. The preview screen shows it should work out very nicely. The teleports, warp drive, explosions, and laser fire are better than before. This weekend, I should have Video Copilot. I'm already making use of the bonus pack.

With rendering this slow for effects, a newer computer with a faster processor and a bigger hard drive with a greater percentage of free space will make a big difference. So far, my 35-minute production with effects included takes 2 1/2 hours to render. It slows down for rendering the effects and speeds up for straight editing scenes.

I remember back in the year 2K, when we were making Very Special Agents and editing on a Mac AVID with special accelerator boards and After Effects, the scene of the angel zapping the vampires with lightning took over a day of non-stop rendering for that one effect in After Effects. The longest scene to render in IC2 is Artemis descending onto the alien planet as a ball of energy and transforming. That takes nearly an hour to render. It will probably take longer now, because I'm adding more turbulence effects.

These days, rendering HD special effects with anything less than a dual-core machine is insane. A computer would probably be rendering all week ... at least.
 
It's pretty amazing what can be done with today's machines.

Earlier this year, I went to Canada to shoot something that releases in '12. When I got there, I realized that I'd need to previs a ton of shots to help reverse-engineer some SFX rigs that I needed them to build for me. So, I licensed Maya for my 13" MacBook Pro and gave it a shot. Wow, I can't believe what even a base-level laptop can do.

The tools are finally catching up with our ambition :yes: . The same goes for filmmaking. Not only are these DSLRs allowing us to make gorgeous looking indies, the pipeline finally supports it, as well. Good times.
 