F55 doesn't do 120 hfr in 4k raw?

I was just doing some fantasy reading and I can't seem to find any literature saying that the F55 does 120fps in 4k raw. I know it can do up to 240 in 2k raw, but only up to 60 in 4k raw? If that's true, it's so strange that the fs700 can do 120 in 4k raw even though the F55 costs almost four times more... thanks in advance. And if it doesn't do 120 4k raw, is there any firmware update in the works for it?
 
I wrote that the F55 costs almost four times more... but I didn't include the cost of the fs700 interface unit, etc... so after adding up all the accessories for the fs700, I think the F55 will be about two times more (including the r5 recorder).
 
The FS700 can only record 4-second bursts in 4k 120fps (as of firmware V3). IOW, a 4-second burst of 4k 120fps is practically useless when you can have continuous record at 240fps at 2k.
 
I'm with you on 4 seconds not being much... but I imagined that the short 4 sec burst specifically had to do with the fs700, like the write speed being limited by the fs700's hardware and not by the r5 recorder (I'm just speculating, I don't know if that's true).

Since the F55 does 4k internally, I thought it might be possible... but maybe 4k raw 120fps is just too much for the r5 recorder to handle for any practical length of time. If it's the r5 that's limiting it, I wonder if a future firmware update would even help. I'm thinking of the Odyssey 7q striping the hfr data across cards, but that's not possible on the r5 b/c it only has one card slot, right? I hope it is possible, though. Any thoughts on the feasibility?
 
The AXS R5 recorder records 16-bit raw on the F5 and F55. On the FS700, it is 12-bit, which would allow a tad more bandwidth to be able to get that 4k 120fps. Realistically though, it seems more like a marketing gimmick to be able to say that the FS700 can record at 4k 120fps.

I believe the R5's raw is also mostly uncompressed, which makes it more difficult to get higher frame rates.
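A quick back-of-the-envelope sketch shows why the bit depth matters. This assumes uncompressed Bayer raw with one sample per photosite, and nominal 4096x2160 / 2048x1080 frame sizes; these are my assumptions for illustration, not official Sony figures.

```python
# Rough raw data rates for uncompressed single-sample-per-photosite raw.
# Frame sizes and bit depths are nominal assumptions, not Sony specs.

def raw_rate_MBps(width, height, bits, fps):
    """Approximate data rate in megabytes per second."""
    return width * height * bits * fps / 8 / 1e6

# F55 -> AXS-R5 (16-bit raw)
print(f"F55   4K 16-bit  60fps: {raw_rate_MBps(4096, 2160, 16, 60):5.0f} MB/s")
print(f"F55   2K 16-bit 240fps: {raw_rate_MBps(2048, 1080, 16, 240):5.0f} MB/s")

# FS700 -> AXS-R5 (12-bit raw burst)
print(f"FS700 4K 12-bit 120fps: {raw_rate_MBps(4096, 2160, 12, 120):5.0f} MB/s")
```

By that rough math, the F55's 4k 60fps and 2k 240fps modes land at the same data rate, while 4k 120fps even at 12-bit would need roughly 50% more throughput, which lines up with it being a short burst on the FS700 rather than a continuous mode.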

Realistically, high frame rate is useful for some applications but 99% of the time, it's not the crucial deciding factor in a camera purchase.
Secondly, 2k is perfectly fine for high frame rates, as a vast majority of films are still finished at 2k, so 2k raw is more than enough for high speed, generally.

Also, high speed has traditionally been a specialist application, for which you would generally need a specialist film camera, and churn through rolls and rolls of film.

These days, not a lot has changed, except for the film part. If you want to get incredibly high frame rates, you would rent a Phantom or similar (or buy, if you really needed it).

For a frame of reference, the lower-range Phantom Miro costs between $25,000 and $70,000, shoots in 1080p, and has a maximum frame rate of 1500fps (with a 2.7-second burst/buffer at 1500fps).
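For a rough sense of why those bursts are so short, here's the buffer arithmetic. The 12-bit uncompressed raw and 1920x1080 frame size are my assumptions for illustration, not quoted Phantom Miro specs.

```python
# Rough on-board RAM implied by a high-speed burst buffer.
# 1920x1080 at 12-bit uncompressed raw is an assumption, not a Miro spec.

def burst_buffer_GB(width, height, bits, fps, seconds):
    """Approximate buffer size in gigabytes."""
    return width * height * bits * fps * seconds / 8 / 1e9

print(f"{burst_buffer_GB(1920, 1080, 12, 1500, 2.7):.1f} GB")
```

Under those assumptions, a 2.7-second burst at 1500fps already needs on the order of 12 GB of RAM, which is why these cameras buffer to memory and offload afterwards rather than recording continuously.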
 
Realistically, high frame rate is useful for some applications but 99% of the time, it's not the crucial deciding factor in a camera purchase... Also, high speed has traditionally been a specialist application, for which you would generally need a specialist film camera...

This is the reason I was dreaming about the F55 having 4k hfr. That ultra hfr on the Phantoms really is a niche function, I can't imagine I'll ever need that high of a frame rate. There's a slo-mo scene in The Hurt Locker that comes to mind, where an explosion shakes the dirt on the ground... I'm not sure, but I would guess that she used a special high-speed camera for that... it's always surreal seeing those short real-time moments slowed down enough to see the physics of high-speed motion. And I thought that scene was great b/c it showed the power of the bomb blast translating into the physics... but it's such a specific premise, when else would it make sense to slow something down to 1500fps for dramatic content? Most of the time, I feel that ultra hfr will just be a great visual effect, but an abstract and clumsy tool if used for dramatic content; I think it's great for informative stuff or special interest stuff, but rarely useful for dramatic content. So I don't think I'll ever need a specialty camera like the Phantoms.

Whereas, 120 or 240 is slow enough to capture that surreal slow-down effect, but still fast enough to bookend a dramatic event.... beginning, middle, end... without making the audience sit through 60sec of something that only takes a few seconds to "get the point." I really like 120, I think it's a great creative tool, a fantastic accent for storytelling, really powerful for creating altered perspective for dramatic content, even if it's just for one or two scenes in a feature. I can't even afford the F55, but it's a dream to look forward to, and I'd be over the moon if it could do 120 in 4k... no need for continuous record, but something more substantial than 4 sec would be amazing. The global shutter, and the rest of the features, PLUS the 120fps would make the F55 a fantastic all-around camera for me. I think I would get many years of use out of that camera, and still be competitive.

And I don't know what kind of work I'll have to do to get myself in a position to shoot some spec material... but if I ever have the opportunity to shoot food commercials, the slow motion will be a must; I've actually been paying attention to commercials lately, and nearly every food commercial I see has slow motion. So whether or not the hfr is requisite for a camera, I guess it'll depend on the work someone seeks out.

Secondly, 2k is perfectly fine for high frame rates, as a vast majority of films are still finished at 2k, so 2k raw is more than enough for high speed, generally.

This post is getting a little long, but it'd be great to hear thoughts on 4k vs 2k. Some things I read suggest that 4k is just a fad, that it'll be about as successful as 3D TV, and that 2k is sufficient. Others say that 4k is the next standard. I saw that Arri and BMD are both coming out with 4k cameras, and Sony already has a couple cameras that record 4k internally, in addition to the ones that record 4k raw with the R5, and the C500 does internal 4k. And then I see consumer 4k TVs rolling out... If I had to guess, I would think 4k is going to be here to stay, but who knows... definitely, it seems like 4k is the next product platform the manufacturers are going to push. So at the moment, I feel like 2k is sufficient... but I want to be competitive in the future with a 4k camera, if that's where everyone else is going. If you think 4k is the future, how long do you think it'll take 4k to become ubiquitous?
 
There's a slo-mo scene in The Hurt Locker that comes to mind...
I'll have to watch The Hurt Locker again and try and find the scene, perhaps I'll be able to hazard a guess at the ballpark frame rate.

Honestly, even 50-96fps is pretty slow, and I would personally take higher dynamic range and better colour rendition over resolution and especially over higher frame rates any day.

I'm not sure what your experience is, or how many cameras you've owned. For me, I'm not a fan of owning cameras as I believe it locks you in to that camera, rather than being able to explore different options depending on the project.

That's not to say I wouldn't ever buy one.

If I were to buy one, there's no way I'd purchase a camera without spending an extensive amount of time shooting with the camera, doing tests and simply working with it.
The F55 and even the F5 look great on paper. Their specs are great, the F5 even has a base ISO of 1250!
And yet, I hate the images out of them. They look so... 'Sony' for lack of a better term. Sonys have a certain look, just like Canons and Pannys have a certain look. But that's also why I wouldn't use a Sony on a narrative project (I might consider an F65). I'd also be reluctant to use a Canon, assuming I had the budget for anything else.

Specs are great, but all I care about is how the image looks up on the big screen, and as far as I'm concerned, the F5 and F55 are only worth using if you're shooting a doco. I'd rather shoot on a Scarlet or an Epic for a narrative project.

Of course, if there were specific projects that really suited the Sony or Canon look, I'd shoot on them, but in general I'm not a fan, and I would advise anyone who's making a camera purchase to look at the images, and look at them in a theatre (a tiny web viewer can make anything look decent).
I've seen the F55 projected in 4k and I was less than impressed. But, of course, that's simply my opinion, and there are many that like the look of such cameras. But, make the decisions for yourself, and use your own eyes, rather than those who post blogs and may or may not be on the payroll of the company they're endorsing.

And I don't know what kind of work I'll have to do to get myself in a position to shoot some spec material... but if I ever have the opportunity to shoot food commercials, the slow motion will be a must; I've actually been paying attention to commercials lately, and nearly every food commercial I see has slow motion. So whether or not the hfr is requisite for a camera, I guess it'll depend on the work someone seeks out.
I recently Focus Pulled on a food commercial, and there were one or two shots that hit 50fps. Nothing went over; 98% of the spot was shot at 25fps. It was all wide open on incredibly long lenses (fun for me ;)) but barely any slow-mo.


This post is getting a little long, but it'd be great to hear thoughts on 4k vs 2k. Some things I read suggest that 4k is just a fad, that it'll be about as successful as 3D TV, and that 2k is sufficient. Others say that 4k is the next standard. I saw that Arri and BMD are both coming out with 4k cameras, and Sony already has a couple cameras that record 4k internally, in addition to the ones that record 4k raw with the R5, and the C500 does internal 4k. And then I see consumer 4k TVs rolling out... If I had to guess, I would think 4k is going to be here to stay, but who knows... definitely, it seems like 4k is the next product platform the manufacturers are going to push. So at the moment, I feel like 2k is sufficient... but I want to be competitive in the future with a 4k camera, if that's where everyone else is going. If you think 4k is the future, how long do you think it'll take 4k to become ubiquitous?

This is a complex question, and the landscape of the industry is, well, complex. 4K is certainly the future of theatrical distribution, but most films are still distributed in 2k. Realistically, the layman would struggle to tell the difference between 2k and 4k anyway, unless on a huge screen and in the first half of the cinema. But, it's coming.
Also, the C500 does not record 4k without the help of an external recorder.

Also, 4K broadcast television is years and years away; I'd guess at least 10. Television broadcast currently doesn't have the bandwidth for 1080p, and barely has the bandwidth for 1080i at anything but heavily compressed bitrates, which is why your BluRay looks so much better than your TV network broadcast. It's probably 10 years at least before we start to see some sort of broadcasting in 4k, simply because most television networks only upgraded to HD a few years ago. On top of that, there are currently no ENG/TV style 4k cameras.

And, in addition to all of that, there's currently no mass distribution method for 4k content.

So, the future of 4k is likely to be limited to theatrical distribution, at least for now. The future of 4k at home is certainly on its way, but not for a long long while.

Moving back to the discussion of cameras, the Arri Alexa shoots at a maximum of 3.2k resolution in raw format. And yet, it's (arguably) much more ubiquitous on Hollywood and high-budget sets than any RED, Sony or Canon. The reason is simple: it has the best dynamic range, the easiest workflow, it works like a camera should, and it has the best colour rendition and skin tones of any camera currently on the market.

Alexa footage has been uprezzed to IMAX (Skyfall) without issue. I'd rather shoot on an Alexa in 2k than a RED in 6k. Spatial resolution does not, in any way, beat out a better image. In days gone by, people cared only about the images a Cinematographer could get, and the Cinematographer would use the film stock that gave him the best possible images, deciding by doing extensive tests before each project.

These days, marketing hype has pushed numbers, and people purchase cameras based on the numbers on a piece of paper, often without even seeing a frame of footage out of it.

I'll try to get off my soapbox, but do your own tests and make your own judgements and comparisons. What kind of work are you doing? What kind of work do you expect to be doing? If you're really into slow motion, that's fine - maybe you want to create a niche market for yourself as a slow-mo cinematographer, or technician (e.g. Phantom). But don't base your decision on spatial resolution, or frame rates, or anything you see on paper. Look at the images, especially in comparison to cameras of a similar price range (RED Epic, Amira - same sensor/images as Alexa), and then make your decision.
 