Which Camera?

Alright,

So I'm a huge fan of DSLRs and I love them to death, especially the 5D Mark II and III, but now I'm facing a dilemma.

I'm working out my first feature film, and although I'm used to shooting on DSLRs, I don't know what I should shoot this one on.

So my basic question is this: is DSLR footage shot in 1080p going to look good in a theater?

I know it's a long shot to get my film shown in an actual movie theater, but still, I'd like to know. Would DSLR footage suffice, or am I going to have to look into fitting a RED into my budget?

I really don't want to do that, because I'd rather spend my budget on the film's other aspects than on the camera.
 
I know we say "the camera doesn't matter" a lot, but I have to ask: have you watched a late-'80s/'90s "independent" movie that was shot on "video" lately? Throw in "typical" indie-level acting... and it just plain hurts!

Just sayin..
I have always said that if everything else is excellent, the camera won't matter as much to the audience. Not that the camera doesn't matter.

I still enjoy some movies from the '80s and '90s that were shot on "video" because everything else is great. Sure, the quality of the image isn't up to the standards of film or a DSLR, but when everything else is tops, I'm okay with that.

I may be pretty much alone in that.

Today I still go to 8 to 10 film festivals a year, and I'm seeing movies with an excellent image. I rarely see a movie that really moves me as a moviegoer: makes me really laugh, scares me, makes me cry, makes me think. Sure, I see very shallow DOF, sharp images, lots of color correction and digital efx. I get the impression that that is what is driving many filmmakers. I still see "indie" level acting, no lighting at all, fair audio and uninteresting stories.

But I see I'm in the minority here. Which is fine; I find myself in the minority more often than not. I will, in the future, not mention it.
 

It's not so much that the camera doesn't matter at all; it's more that the camera body matters less than the Story, Production Design, Direction, Acting, Sound, Lenses, etc.

If you had a screenplay written by Aaron Sorkin, A-listers as your actors, and assembled a team of Hollywood professionals for your crew, and then decided to shoot with an iPhone, then it certainly wouldn't be as good as shooting with, say, an Alexa.

But that's not really the level we're talking about here. I'm an advocate of "the camera matters less" mostly because 99% of indie filmmakers and first-time directors get so caught up in the camera body, in having the latest and greatest or the best of the best, and in spending thousands of dollars on renting or buying the best camera body they can afford, while putting no thought or effort into the story, production design, direction, sound, lensing, lighting, etc.

Those indie "video" films you're talking about wouldn't look any better simply by shooting them on a Scarlet. Video of the '80s and '90s wasn't anywhere near where it is now, but there are a lot of indie 16mm and S16 movies from the same time frame, as well as earlier (and even later), that look just as bad for the reasons I mentioned above.

For some reason, these days people think good cinematography is purely about what camera body you shoot on, and that is completely false. I hope the latest Zacuto shootout doco helps to point that out. It's somewhat offensive to cinematographers in general to assume there is some magic button or magic camera that makes everything look beautiful. An ugly, unlit scene will look ugly and unlit whether it's shot on S35, IMAX, Alexa, DSLR or iPhone.
 
If you had a screenplay written by Aaron Sorkin, A-listers as your actors, and assembled a team of Hollywood professionals for your crew, and then decided to shoot with an iPhone, then it certainly wouldn't be as good as shooting with, say, an Alexa.

This is exactly what I'm talking about.

The camera does matter. For instance, I wouldn't want to shoot on a Hi-8 camera if it's going to make my feature look like crap when I'm trying to make a modern-looking action film.

Why spend so much money on the actors, sets, wardrobe, and crew if you're just going to undermine it all with the camera? That just sounds ridiculous to me. You can have an awesome script, but a shitty camera isn't going to help you show your vision (unless your script calls for a shitty camera).

My simple question was whether a DSLR looks good in a theater, or whether I'd have to upgrade to a RED or another higher-end camera.

No one said, "Screw my budget, I'm getting a RED no matter what."

Regardless, the camera does matter. Even though there is no "magic button," you still need a camera that can hold up visually against the competition and serve its story.
 
With that said, however, I did only say it "wouldn't be as good as," not that it would be unwatchable. If you did have all those elements, you could still shoot it on an iPhone and people would watch it.

DSLRs are fine for theatres, though I'd rather shoot on a Red, both as a DP and as an AC. It depends on your budget, though: if you have a budget of $40,000, it makes more sense to get a DSLR and put the money into a really good crew and production value for your film than to spend $30,000 of it on an Epic.
 

Seeing as how film ends up with roughly the resolution of standard def after being duped so many times, from the negative to the interpositive to the internegative to the release print, I think you should be fine on resolution.

That's never been the real issue... the real issue with DSLRs and the like is dynamic range and color reproduction.
Something like the Blackmagic Design Cinema Camera certainly covers the shortcomings of DSLRs, and you get to keep a similar DSLR shooting feel.
 
Seeing as how film ends up with roughly the resolution of standard def after being duped so many times, from the negative to the interpositive to the internegative to the release print, I think you should be fine on resolution.
Wait, what? I don't know where you got that from. Maybe if you telecine in SD, but even then a film telecined in SD is going to look better than an SD handycam (for example).
That's never been the real issue... the real issue with DSLRs and the like is dynamic range and color reproduction.
Something like the Blackmagic Design Cinema Camera certainly covers the shortcomings of DSLRs, and you get to keep a similar DSLR shooting feel.
I've said it before - the BMD is a cinema camera and not for the average DSLR user.
 
Wait, what? I don't know where you got that from. Maybe if you telecine in SD, but even then a film telecined in SD is going to look better than an SD handycam (for example).

It's a fact. Film (35mm) starts off at about 4K/6K, and with every dupe you lose about half the resolution. After four copies, you end up with a resolution of roughly 540 to 850 instead of 4096. You also introduce the grain and dirt from the previous copy onto each new copy.

If you do a digital intermediate, you end up somewhere between 1024 and 2000.
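
To make that arithmetic concrete, here's a rough back-of-the-envelope sketch in Python. The 50%-per-generation figure and the 4K/6K starting points are the claims above, not measured values:

def resolution_after_generations(start_lines, generations, retention=0.5):
    # Approximate horizontal resolution after a number of duplication generations,
    # assuming each generation keeps `retention` of the previous one's resolution.
    return start_lines * (retention ** generations)

# Negative -> interpositive -> internegative -> release print = 3 copying steps.
for start in (4096, 6144):  # "about 4K/6K" for a 35mm negative, per the post above
    print(start, "->", round(resolution_after_generations(start, 3)))
# 4096 -> 512, 6144 -> 768: the same ballpark as the "roughly 540 to 850" figure.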



Edit:
http://savestarwars.com/filmpreservation.html
As you can see, there is a lot of duplication there from the negative to the theatre. Each time you copy the film, you lose quality. First, the grain increases with each generation: the original negative has only the grain of the negative, while the interpositive has the grain from the negative plus the interpositive emulsion, while the internegative has both grains plus its own emulsion grain, and so on. The same applies to dirt and dust, and even hairs (sometimes they are not literally hair, but lint or whatever). Labs are as clean as possible and technicians use white gloves to handle the film, but you still tend to end up with debris of some kind, and this adds up with each generation. A negative might look clean, but by the time you get to a theatrical print, the layers of copied dirt have become noticeable. Dirt on the negative and internegative prints as white specs, while dirt on the interpositive and print itself shows as black specs. Each time you copy the film, you also lose resolution, much like in a photocopy; if, let's say, the interpositive was 90% as sharp as the original negative and each copy was of the same degradation rate, that means by the time you got to an actual print you've lost close to half of your picture information. As a result, although 35mm film is said to resolve about five thousand lines of resolution, an actual theatrical print does not have much more picture information than a high-def projection. Each copy also increases contrast, and also loses colour information.

The quality loss is closer to 50%, not 10%.

The reason film has looked better than digital/video up to this point is its better dynamic range, better color reproduction, and better, more natural highlight roll-off.

I've said it before - the BMD is a cinema camera and not for the average DSLR user.

What I meant was the form-factor is familiar, not that there's not a bit of a learning curve.
 
It's a fact. Film (35mm) starts off at about 4K/6K, and with every dupe you lose about half the resolution. After four copies, you end up with a resolution of roughly 540 to 850 instead of 4096. You also introduce the grain and dirt from the previous copy onto each new copy.

If you do a digital intermediate, you end up somewhere between 1024 and 2000.

Whilst it is true that each time you 'dupe' it you increase contrast slightly and increase grain slightly, it is not 'closer to 50%'. Not only that, but why would you do it four times? You'd generally only do it once: you take the processed negative and dupe it to a positive for projection. Normally you're telecine'ing or DI'ing the negative, and you get full resolution. Then you'd scan back out to a positive projection print. You're not losing any quality by DI'ing or telecine'ing.

Whilst it is true that film of the '70s (the Star Wars era) had to be done the way it says in the link, that is not how film works these days at all, and that's how you get the restorations: they go back to the negative and essentially do processes similar to what they do now.

A DI of 35mm film also only gives you 2K of resolution if you DI it at 2K. You can DI 35mm at 4K. You can telecine 35mm at 4K. You can telecine 16mm at 2K if you have the money.
If you want to go back to the archaic way of cutting on a '70s-style film cutter, duping from one generation to the next, and then creating release prints that aren't even used at most theatres these days, be my guest, but that's not how it's done these days.

Films can be projected at both 2K and 4K, and even a 2K projection has higher resolution than any SD or HD footage if you compare closely.
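
For reference, a quick pixel-count comparison of the delivery formats being discussed (these are standard container sizes; the usable image area varies with aspect ratio):

# Pixel counts for the common delivery/projection formats mentioned in this thread.
formats = {
    "SD (PAL)": (720, 576),
    "HD 1080p": (1920, 1080),
    "2K DCI":   (2048, 1080),
    "4K DCI":   (4096, 2160),
}
for name, (w, h) in formats.items():
    print(f"{name:9s} {w}x{h} = {w * h / 1e6:.2f} megapixels")
# 2K DCI is only marginally wider than 1080p HD, while 4K DCI carries four times the pixels.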

What I meant was the form-factor is familiar, not that there's not a bit of a learning curve.
It's not just a learning curve. It's a completely different way of thinking. Anyone can pick up a DSLR and get decent images, but to be able to make a cinema camera look good, you need to think like a cinematographer. You need to be able to light scenes. You need to have a DIT on set. You need to have your DIT verify and backup your files. You need to have a post workflow in place. Are you going to shoot raw and then convert to ProRes, or simply shoot ProRes Log? You need to have a colourist, and you need to grade every single thing you shoot.
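
On the "verify and backup" point, the core of that job boils down to checksum-verified copies. A minimal sketch, with hypothetical paths, of what a DIT's offload step does (real offload tools add reporting, multiple destinations, LTO archiving and so on):

import hashlib
import shutil
from pathlib import Path

def md5sum(path, chunk_size=1 << 20):
    # Hash a file in chunks so large camera files don't have to fit in memory.
    digest = hashlib.md5()
    with open(path, "rb") as f:
        while block := f.read(chunk_size):
            digest.update(block)
    return digest.hexdigest()

def verified_copy(src_dir, dst_dir):
    # Copy every file from the card to the backup drive, then prove the copy
    # is bit-identical by comparing checksums.
    src_dir, dst_dir = Path(src_dir), Path(dst_dir)
    for src in src_dir.rglob("*"):
        if src.is_file():
            dst = dst_dir / src.relative_to(src_dir)
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)
            assert md5sum(src) == md5sum(dst), f"checksum mismatch: {src}"

# Hypothetical mount points for a camera card and a backup RAID:
# verified_copy("/Volumes/CARD_A001", "/Volumes/RAID/A001")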

Contrary to its name, the Blackmagic camera is not in fact magic. It's a cheaper alternative to a digital cinema camera. It doesn't automatically make your image any better than anything else simply because of the camera it is. There are going to be a lot of casual DSLR users who will buy the BMD and be disappointed with their images, because they won't be that much better than what they got out of a DSLR.
 
It's not just a learning curve. It's a completely different way of thinking. Anyone can pick up a DSLR and get decent images, but to be able to make a cinema camera look good, you need to think like a cinematographer. You need to be able to light scenes. You need to have a DIT on set. You need to have your DIT verify and backup your files. You need to have a post workflow in place. Are you going to shoot raw and then convert to ProRes, or simply shoot ProRes Log? You need to have a colourist, and you need to grade every single thing you shoot.

Contrary to its name, the Blackmagic camera is not in fact magic. It's a cheaper alternative to a digital cinema camera. It doesn't automatically make your image any better than anything else simply because of the camera it is. There are going to be a lot of casual DSLR users who will buy the BMD and be disappointed with their images, because they won't be that much better than what they got out of a DSLR.

You seriously have that exactly backwards.

It'll be significantly easier to use a more high-end professional camera, in the sense of exposure... the ironic thing about professional cameras is that they let you screw up...

Since they have so much more latitude/dynamic range, you can fix missed exposures in post. You can pull detail out of the "blown" highlights or out of the underexposed shadows and get a usable image.

Your standard DSLR has roughly 8 stops of DR (at best), and you're pretty much stuck with what you exposed for when you shot; the BM camera has 13 stops, plus better highlight roll-off. With DSLRs or smaller HD cameras, you're stuck with what you shot... if you don't know what you're doing, you're going to get what is essentially unusable material.
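
To put those stop counts in perspective (taking the 8-stop and 13-stop figures above as given rather than as spec-sheet numbers), each extra stop doubles the scene contrast the camera can hold:

# Each stop of dynamic range doubles the usable scene contrast ratio.
for label, stops in [("typical DSLR (claimed)", 8), ("Blackmagic Cinema Camera (claimed)", 13)]:
    print(f"{label}: {stops} stops ~ {2 ** stops}:1 contrast")
# typical DSLR (claimed): 8 stops ~ 256:1 contrast
# Blackmagic Cinema Camera (claimed): 13 stops ~ 8192:1 contrast
# Those five extra stops are the headroom that lets you pull a missed exposure back in post.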

The same thing goes for color... due to DSLRs' 4:2:0 AVCHD codec (which you're stuck with), as soon as you try to color correct beyond a certain point, everything falls apart and macro-blocks.
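
For anyone unfamiliar with what 4:2:0 actually means, here's a rough illustration of how little colour information is recorded per frame compared to brightness (the frame size is just an example):

# 4:2:0 chroma subsampling for a 1920x1080 frame: luma (Y) is stored at full
# resolution, while each colour-difference channel (Cb, Cr) is stored at half
# resolution in both dimensions, i.e. one colour sample per 2x2 block of pixels.
width, height = 1920, 1080
luma_samples = width * height
chroma_samples = 2 * (width // 2) * (height // 2)  # Cb plane + Cr plane
print("luma samples:  ", luma_samples)    # 2,073,600
print("chroma samples:", chroma_samples)  # 1,036,800 across both colour channels
# Heavy grading stretches that sparse, heavily compressed colour data, which is
# where the macro-blocking described above comes from.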


Not only that, but why would you do it four times? You'd generally only do it once: you take the processed negative and dupe it to a positive for projection.

No you don't... I'm sorry but it seems like you really don't know what you're talking about.

You take your negative and telecine or DI it for your rough cut, then go to an interpositive (whether from your DI or an optical interpositive), made from the original negative via an EDL. The interpositive is your "positive master"; generally about a half-dozen interpositives are made. Then you print your internegatives (a few dozen are made), which are your "master print negatives." Then you move on to release prints and make a few thousand copies... Your internegatives are going to get seriously worn out and deteriorate after about 100-200 release prints have been made. Running a physical piece of film through a machine 100 times doesn't do anything good to it. Plus, you're introducing the dirt and lubricant gunk (most films are printed on "wet head printers") from the sprockets of the release-print printer onto the internegative every time you run it through the machine. So you keep switching out internegatives to produce your release prints. Eventually you'll run out of usable internegatives and have to go back to the interpositive to make new internegatives to make more release prints.
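
If you run rough numbers on that chain (the quantities are the ballpark figures from the paragraph above, with the split of internegatives per interpositive assumed), you can see why it takes dozens of internegatives to feed a wide release:

# Back-of-the-envelope print yield for the duplication chain described above.
interpositives = 6          # "about a half-dozen" made from the original negative
internegs_per_ip = 6        # assumed split of the "few dozen" internegatives
prints_per_interneg = 150   # an internegative wears out after "100-200" release prints

internegatives = interpositives * internegs_per_ip
release_prints = internegatives * prints_per_interneg
print(internegatives, "internegatives ->", release_prints, "release prints")
# 36 internegatives -> 5400 release prints, i.e. "a few thousand copies"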

With a DI the best you're doing is removing the physical inter-positive from the mix and going from the original negative straight to the inter-negative via digital inter-positive (so you retain the resolution of the original negative). So like I said you're coming out with only SLIGHTLY more resolution than you would have, had you not done a DI.

If you were to make release prints from your original negative, after about 100-200 release prints your negative would be completely fucked, and now you no longer have your original source material.


http://hollywoodfilmco.com/HfcDcp.htm

At this point, I'm done with this discussion...
 
You seriously have that exactly backwards.
I have what backwards? The fact that you need a DIT? A post workflow? Need a handle on how to work things? Need to know how to light to get good images? Need a hefty computer to be able to process raw footage in the first place?
I stand by my statement - the casual DSLR users will likely be disappointed that the BMD doesn't immediately make their images look like cinema. There's a lot more to a cinematic image than pure dynamic range or raw workflow, not to mention the fact that an inexperienced colourist is almost as bad as not having a colourist. If you're inexperienced, your image will look like it's shot by someone inexperienced even if it's on S35mm film. I've seen Red footage that's looked no better than something shot on an EX1 because it was in the hands of inexperienced users. I've also seen EX1 footage that's looked like it could've been shot on a cinema camera.


No you don't... I'm sorry but it seems like you really don't know what you're talking about.

I never said that wasn't the way it has been done in the past.

These days, you'd be just as likely to keep everything digital after the telecine or DI.
Some workflows will have things only telecine'd or only DI'd (the quality difference between the two is negligible). I've also seen film telecine'd for the rough cut, then an EDL used to DI the needed parts for the fine cut and SFX work. Then it's either packaged into a DCP or lasered back out using an Arri Laser (http://www.arri.com/digital_intermediate_systems/arrilaser.html).


With a DI the best you're doing is removing the physical inter-positive from the mix and going from the original negative straight to the inter-negative via digital inter-positive (so you retain the resolution of the original negative). So like I said you're coming out with only SLIGHTLY more resolution than you would have, had you not done a DI.
I would still disagree that you're losing 50% of your quality each time.

If you were to make release prints from your original negative, after about 100-200 release prints your negative would be completely fucked, and now you no longer have your original source material.
Yes, I didn't really mean making release prints from the original negative, though I can't say exactly what I did mean...

You also seem to bypass the fact that, traditionally, you'd be going through the same process anyway even if you were capturing digitally. If you need to make release prints, the end process after the film is finished is the same whether you're going to DCP or to a release print, and whether you captured digitally or on film. And if you really are losing all of that information, then I'd rather start at 4K+ and dupe down to 1K than start at less than 2K and dupe down from there.
Not to mention that anyone who's ever worked with a Red can tell you that downsampling a huge image is going to look 10x better than if you simply start at the downsampled size.
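
On that downsampling point, here's a small sketch of why an oversampled source cleans up when scaled to the delivery size; the noise here is synthetic, not real camera data:

import numpy as np

# A flat grey 4K-ish frame with random noise added, then a 2x2 box downsample to 2K.
rng = np.random.default_rng(0)
frame_4k = 0.5 + 0.05 * rng.standard_normal((2160, 4096))
frame_2k = frame_4k.reshape(1080, 2, 2048, 2).mean(axis=(1, 3))

print("4K noise (std):", round(float(frame_4k.std()), 4))  # ~0.05
print("2K noise (std):", round(float(frame_2k.std()), 4))  # ~0.025
# Averaging each 2x2 block roughly halves the random noise, which is part of why
# finishing at 2K/1080p from a 4K+ source looks cleaner than acquiring at 1080p.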
 