
I have a color grading issue.

I was color grading a scene I shot, adding some blue to give it sort of a gritty atmosphere. It looked really good afterward, but when I blow the movie up on a full size HDTV, the actors' lips come out really pink. A collaborator who knows a little more about it than I do told me it's because the footage is not RAW, and therefore will look bad after color grading since it's H.264, which is what I shot it on originally with the Canon T2i.

He says he does not know how to get rid of the pink lips. I also sent it to a colorist and he gave me a sample, but in order to get rid of it he desaturated the movie into almost complete black and white, like Saving Private Ryan, which I don't want. Is there anything I can do to get rid of it? It's only noticeable on the HDTV so far, but I want my footage to look normal on ALL systems. Thanks.
 
One of the (general) primary differences between QuickTime and MP4 is the colour space they expect to see. MP4 (generally) wants to see sRGB (Studio RGB: 16-235 values) but QuickTime wants (generally) cRGB (Computer RGB: 0-255 values). I use (general) a lot in there, as it seems there is very little consensus on who does what with what, on PC at least.

When viewing your results, it's terribly important (as important as capturing the visuals or the audio) to have both properly calibrated equipment and a knowledge of what colour space you're working in.

Now, I must preface this by saying that I work in Sony Vegas. I don't know if this is an issue with other editing programs, but it sounds a lot like this sort of thing.

When you view something on your monitor, it's in cRGB space. That is, 0 is totally "off" and 255 is totally "on" for each of the three Red, Green and Blue colour components. When you're viewing something on a TV, it's in sRGB space, where 16 is totally "off" and 235 is totally "on". Also (again with Vegas), different codecs decode to different spaces; AVCHD, MPEG-2 and MPEG-4 decode into Vegas in sRGB, so "black" is actually the cRGB value of 16 (dark grey) and "white" is 235 (light grey), which is why it looks washed out when viewed on a computer monitor. But if you have a proper preview device hooked up that works in sRGB space, it will look fine, OR if you configure your preview device (not the window, but the whole device) to USE sRGB space, it will also look fine (well, pretty close).
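Roughly, the conversion between the two ranges looks like this (a quick Python sketch of the generic 16-235 <-> 0-255 scaling, not anything Vegas- or Premiere-specific):

# Generic studio <-> computer RGB scaling for 8-bit values.
# This is just the standard "legal range" math, not any particular NLE's implementation.

def studio_to_computer(v):
    """Expand a 16-235 'studio' value to the full 0-255 range."""
    out = round((v - 16) * 255 / 219)
    return max(0, min(255, out))   # anything below 16 or above 235 gets clipped

def computer_to_studio(v):
    """Compress a full-range 0-255 value back into 16-235."""
    out = round(v * 219 / 255 + 16)
    return max(16, min(235, out))

print(studio_to_computer(16), studio_to_computer(235))   # 0 255
print(computer_to_studio(0), computer_to_studio(255))    # 16 235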

So, I would suggest you might have over-graded while looking at it on your wider-range cRGB device; then when it was "converted" to sRGB for your DVD, it was simply recorded with those superblacks and superwhites, so you lost a percentage of the colour you thought you had.

I watched some HORRIBLE YouTube colour "correcting" videos where the "host" advised using sRGB to cRGB filters to "correct" a low contrast looking video, when it was totally supposed to look like that on a computer. Play it back on a TV after his "correcting" (he applied it twice!) and you'd end up with something like a mere 73% of the colour information you started with. Not gonna be able to grade that very well at all.
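If you're curious where a number like that comes from, here's a rough back-of-the-envelope sketch in Python (just applying the simple 16-235 to 0-255 expansion twice and counting what survives; real filters may behave a bit differently, so treat the exact figure loosely):

# Apply the studio -> computer expansion twice, as in that video,
# and count how many of the original 256 levels still map to distinct values.

def expand(v):
    # simple 16-235 -> 0-255 expansion with clipping
    return max(0, min(255, round((v - 16) * 255 / 219)))

outputs = [expand(expand(v)) for v in range(256)]
print(len(set(outputs)), len(set(outputs)) / 256)
# ~189 distinct levels out of 256, i.e. roughly 73-74% of what you started with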

Depending on what you're using for correcting/grading, look into how it handles sRGB and cRGB, or get a real sRGB preview device hooked up if you can afford it. At least, get a reasonable monitor calibration kit and be very careful; there just isn't that much information in these cheap cameras to push too far.

This is almost starting to sound like a Computers & Software thread...

Hope this helps,

CraigL
 
Okay thanks. I am using Premiere Pro. For me actually, it's not that some of the color is missing on the HDTV. The color looks about right and about the same, it's just that the lips are horribly pink for some reason, as well as objects of similar color. But other than that it looks normal. I didn't really overgrade and only adjusted the curve a little. It's actually quite subtle in the program, but just enough to give it a bit of a moody look while still staying subtle. If I take it down, it won't have the blue grit anymore, since there was not much to begin with.
 
White balance, as in what you do with your camera before shooting, is probably what he means.

I've done some color theory, but never tried to do a color grade, so I may have misinterpreted some of this; use at your own risk.

What your friend told you has some truth to it due to the way that most Canons capture their data. Most Canons record 4:2:0, which means most of the colour (chroma) information gets discarded and shared between neighbouring pixels. The problem with this is that skin tones sit in the reds, so they're among the first things to suffer. The Mark 3s are supposed to get a firmware upgrade in March that lets them save to an external device in 4:2:2. The C100, C200 and C300 already have this. Someone correct me if this is incorrect. RAW is going to be easier to correct as all the information is there. That doesn't mean you cannot color correct H.264 footage.

It sounds like you've put a blue filter over the footage to change the feel of your film, and in the process you've turned your actors' lips pink.

Something to pay attention to is that each codec handles information differently. On top of that, some codecs use different color spaces from others, so changing the codec can change the way the color is handled, which can change what you were anticipating. On top of that, each device has its own way of interpreting those color spaces. Try it on another device or two and see if the issue persists; it could be specific to one device.

If the problem persists, go back to the original footage and try color grading while paying special attention not to push the skin tones pink. If it continues after that, consider changing your computer monitor to one that is more color accurate.

It's a good thing that you're using this time to test how you should do it. I'd hate for you to have shot everything needing to be color graded blue, only to find you couldn't do it with your equipment for one reason or another.
 
Interesting.

This subject is the first I've heard of this sRGB (Studio RGB: 16-235 values) vs. cRGB (Computer RGB: 0-255 values) discrepancy between what is recorded in camera, edited on computer, then displayed on both TVs and computers - only to wonder why the images don't look the same on the TV and the computer.

So... I just spent the last twenty minutes in a google search of "H.264 cRGB or sRGB" which of course rendered this same conversation on umpteen different forums that are understandably against IT forum rules to link to.

However, let me assure you there's no simple fix.

Mostly it's a lot of jibber-jabber that's either over my head or runs around in authoritative circles with the short-sweet of it being "Just deal with it."

Additionally, REPEATEDLY the problem comes up when working in Sony Vegas.
Either Sony Vegas is a pain in the @ss to work with IN THIS REGARD - or - d@mn near everyone's using Sony Vegas as their NLE, therefore most of the grievances come up when trying to figure out why SV-rendered video edits don't shuck & jive between TV (DVD) and computer (YouTube & Vimeo) outputs.


IDK.
Looks like a complete FUBAR industry issue.

Video cameras (at the consumer level, it seems) record video to be displayed on TV (sRGB). Duh.
Consumer video cameras for consumer TVs.
sRGB ---> sRGB.
Cool. Fine. Makes sense.

But "computer world" can handle color in a broader spectrum, 0 - 255, cRBG.
So, it "stretches" consumer camera H.264 @ 16 - 235 (sRGB) out to 0 - 255 (cRGB) in most NLEs, then default records the output in 0 - 255 (cRGB) for display on computers, via Youtube, Vimeo, etc.
Well... okay.
I hope close enough is good enough, because there's already been a single episode of distortion going on with your video at that point.
sRGB ---> cRGB

The "problem" is when you input consumer camera sRGB into computer cRGB then back into sRGB for TV display via DVD or BR there've been two distortion events.
sRGB ---> cRGB ---> sRGB
Not so good.

An illustrated example of the math involved in the sRGB ---> cRGB ---> sRGB round trip:
[image: sRGB-cRGB-sRGB conversion loss diagram]
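If you'd rather poke at the numbers yourself, here's a rough Python sketch of that round trip, assuming the plain 16-235 <-> 0-255 scaling (real NLEs may handle the conversion differently):

# Round trip: studio range -> computer range -> back to studio range.
# Assumes the plain 16-235 <-> 0-255 scaling; real NLEs may differ.

def to_computer(v):
    return max(0, min(255, round((v - 16) * 255 / 219)))

def to_studio(v):
    return max(16, min(235, round(v * 219 / 255 + 16)))

changed = [v for v in range(256) if to_studio(to_computer(v)) != v]
print(len(changed))   # 36: everything outside 16-235 gets clipped,
                      # while legal-range values come back unchanged

So with this simple math the main casualties are the super-black and super-white excursions, plus whatever rounding and clipping you pick up once you actually grade in the stretched middle step.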


Some NLEs have some sort of "preservation" setting that will convert or retain some interpretation of the original sRGB input for an sRGB output after it's been converted to cRGB in the NLE - BUT - you gotta specify this, especially if you know you're about to burn a DVD/BR.


Honestly, although this MIGHT seem like a buncha fuss over wienie things, I can see how on some highly visually dependent film products the output differences can be "not what I wanted" when shown on TV compared to what's on the computer, especially if you're sending in DVDs for film festival entries (along with entry fees).
There ain't no telling WTH people are doing on either end of that transaction.
Maybe the sender doesn't know WTH they're doing and sends a FUBAR DVD.
Maybe the film festival receiver reviews the FUBAR DVD on a FUBAR HD TV or HD computer monitor.
It'll possibly make a significant difference. (Although I'm MUCH more inclined to believe more significant issues will doom an entry. Cough, cough.)
Probably won't. But it could.


Maybe I'm something of a video pig: I don't really care about subtle nuances between a lot of these things.
If a shot looks like cr@p then it looks like cr@p.
You're not going to color grade your way out of cr@p.
Garbage in, garbage out.
If a shot looks great then it looks great, and no amount of gilding the lily is really worth too much (some, but not too much) hand-wringing over sh!t no one's gonna remember or care about an hour later.
Care, but don't go ape-shit bananas futzin' with this stuff.
No one's gonna givash!t, except you, on a lot of this.
But I do respect the process, or at least learning your process.
 
On a side note, gotta love it Ray how you do this [trying not to use foul language.]

But end up doing this anyway [using foul language].
Yeah... well... :lol:
I live in a stupid world where I can do a p!ss poor job at work for years on end, but if I get caught using vulgarity (or napping) I'll get fired on the spot. :rolleyes:

Words?!
WORDS?!
I can lose my job over using "the wrong" words?! :rolleyes:
Beats the f#ck outta me.

I've never understood why "this" euphemism word is an acceptable alternative for "that" vulgar word.
Why is "that" word vulgar when "this" word means the exact same thing and everybody knows it?
It's retarded.
The entire concept is just bizarre to me.
Butt eye deel whith itt tha besst eye kan.
Don't always suckseed, tho. ;):lol:


If you like, substitute "distortion" for "fuckage." More betterer?
 
This subject is the first I've heard of this discrepancy between what is recorded in camera, edited on computer, then displayed on both TVs and computers - only to wonder why the images don't look the same on the TV and the computer.

I first learned about it in regard to digital cinema projection systems, where the problem is more important since in a darkened room you're going to notice the differences more than on a TV in a lit room. Even in an industry like cinema projection, each manufacturer has their own way of implementing the color space. On top of that, there are different technologies that handle light differently, all changing the way your picture gets displayed.

To be totally honest, what I read didn't mention anything about TV systems, but it's reasonable to assume the same problems occur there.

One thing to also note: when you're talking about 0-255 values holding your color information, you're talking about 8 bits of data per color channel. Some cameras can record using 10 or 12 bits per color (I'd assume some could even do higher). The more bits used, the more accurately you can depict a precise color.
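Rough numbers, in case it helps (just two to the power of the bit depth, per channel):

# Levels per colour channel at different bit depths (2 ** bits).
for bits in (8, 10, 12):
    print(bits, "bit ->", 2 ** bits, "levels per channel")
# 8 -> 256, 10 -> 1024, 12 -> 4096 levels per channel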

On an aside point: I don't know what the bit depth per color is on the Canon cameras, but it's suggested that you want to use 12 bits per color when your intention is cinema use. Do Canon cameras use 8 bits for each color?

Filming can get rather technical, right? The real question is how does all this apply? That is something I cannot yet tell you. Without those answers, I hope people see the importance of testing.
 
Additionally, REPEATEDLY the problem comes up when working in Sony Vegas.
Either Sony Vegas is a pain in the @ss to work with IN THIS REGARD - or - d@mn near everyone's using Sony Vegas as their NLE, therefore most of the grievances come up when trying to figure out why SV-rendered video edits don't shuck & jive between TV (DVD) and computer (YouTube & Vimeo) outputs.

Yeah, I'm not sure why. Perhaps on Mac, where everything is "according to Steve", the issue is more uniformly handled. Or perhaps the predominant use of QuickTime and ProRes sorta end-runs around it. On Windows, anything (and frequently everything) goes, so it's more of a mess. One of the real issues is that, as far as I know, there's no standard way to ascertain from a codec/container what space it's in. There's also the issue that HDTV uses Rec. 709 as a colour standard (which is compatible with sRGB, I think) but SD uses Rec. 601, which is different, and that can also cause colour shifts if not converted properly.
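For the curious, the two standards weight the colour channels differently when they form luma, which is part of why decoding with the wrong matrix shifts colours. A rough Python sketch using the published coefficients:

# Rec. 601 vs Rec. 709 luma weights: the same RGB pixel gets a
# noticeably different luma value depending on which matrix is assumed.

def luma_601(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

def luma_709(r, g, b):
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

print(luma_601(255, 0, 0), luma_709(255, 0, 0))   # pure red: ~76 vs ~54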

To determine what you're seeing, drop an SMPTE bar generator on your timeline (remember, items generated by the computer will normally be in cRGB) and look at the PLUGE bars, the black area at the bottom. If you see four black bars, you're seeing superblack as different from black. There should be two: black and grey. If you then render that out and watch it in your favourite viewer, you might see two or four depending on both how it was rendered and what your viewer does. If you render it to DVD/Blu-ray, you should definitely only see two - but you should definitely see two and not one. Then, for fun, upload it to YouTube and see what you see there...

I think all systems have this issue, just normally it isn't a big deal or people don't notice. Aside from the lower contrast, it's normally hard to notice in decent footage. It's only in areas of low chrominance difference (like a sky or underwater) that you will see things like banding increasing as you lose colour fidelity.

As for the actual source colour information, Sweetie is correct that many lower-end cameras use 4:2:0 (I don't know about the 5D upgrade, but that'd be cool). This is the chroma subsampling that happens: luminance information (how bright it is) gets recorded for every pixel, but not colour information. 4:2:0 means that for every four pixels across, it records two samples of colour horizontally (one for every two pixels) and zero extra vertically, so each chroma sample covers a whole 2x2 pixel region. 4:2:2 means two extra samples of chroma information vertically as well, so there are two chroma samples for the same 2x2 region (2x1 pixels each). 4:4:4 is the best: one chroma sample for each pixel. Here's a good explanation for anyone who cares to know more:
http://en.wikipedia.org/wiki/Chroma_subsampling
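A toy example, if it helps picture it (this assumes the chroma samples are just averaged over the block, which is a simplification of what real encoders do):

# 4:2:0 on a single 2x2 block: each pixel keeps its own luma (Y),
# but all four end up sharing one Cb and one Cr value.

block_y  = [[200, 180],
            [120, 100]]   # per-pixel brightness is kept intact

block_cb = [[110, 112],
            [ 90,  95]]
block_cr = [[150, 160],
            [140, 135]]

shared_cb = sum(sum(row) for row in block_cb) / 4
shared_cr = sum(sum(row) for row in block_cr) / 4
print(shared_cb, shared_cr)   # one Cb/Cr pair now stands in for all four pixels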

Once you also start throwing the actual image compression into the mix, it's a wonder it produces an image at the other end at all sometimes!

This works because our eyes are (relatively) bad at differentiating chrominance information, but very good at picking up luminance differences. But when you start actually messing around with it, this lack of information starts becoming a serious problem.

So you can probably ignore the whole issue, unless you feel you're seeing an inexplicable loss or increase in overall contrast. But, the more heavily you grade/correct, the more and more it will affect the image as there's less and less information to use.

CraigL
 
On an aside point: I don't know what the bit depth per color is on the Canon cameras, but it's suggested that you want to use 12 bits per color when your intention is cinema use. Do Canon cameras use 8 bits for each color?

The H.264 coming from the Canons is 8-bit, yes. And it's around 40 Mb/s or so, but I think it's also variable, not constant.

To get 12-bit (4096 discrete colour values per component, instead of 256 with 8-bit) you are generally dealing with RAW files (why capture beautiful colour depth and then compress it?). Even the BMCC can do it, so you don't have to go to the monster cameras to get it.

CraigL
 
Yep, it does shed some insight. Well, I suppose I could just tell my friend that we should send the movie out and see what happens. We may not be able to get it to look good on every TV and projector, like Hollywood distributors can do with their movies, but I guess it's the best we can do. I hope it's okay though. I've never seen a movie where pink lips were acceptable enough for the movie to be shown.
 
/smacks forehead upon table.

You don't really like listening to other people, do you?

Read a little bit of technical information or suggestions that require work, and you seem to throw them into the too-hard basket and accept an inferior result instead of spending the extra 30% of work to get an acceptable one.

We may not be able to get it to look good on every TV and projector, like Hollywood distributors can do with their movies, but I guess it's the best we can do.

At a dead minimum, you did at least switch the monitor to see if it was your monitor that was calibrated wrong? If your source monitor is not configured correctly, it will look wrong on every other display device with the exception of monitors that are similarly mis-configured.

I've never seen a movie where pink lips were acceptable enough for the movie to be shown.

If you're failing to plan, then you're planning to fail.
 
Okay thanks. I was speaking more for my friend. It was her goal to release the movie soon to some festivals before their deadlines are up. We kind of think it's better to send it in with the pink lips than miss the dates. But we are trying to correct the pink lips in the meantime. I read the replies. I didn't know I could change the codec once the movie has already been formatted.
 

I have also played around a bit with the HDTV but I don't know which calibration I am supposed to switch it to. Do you mean calibrations such as 'normal mode', 'theater mode', etc., or are we talking about something completely different?
 
Okay thanks. I was speaking more for my friend. It was her goal to release the movie soon to some festivals before their deadlines are up. We kind of think it's better to send it in with the pink lips than miss the dates. But we are trying to correct the pink lips in the meantime. I read the replies. I didn't know I could change the codec once the movie has already been formatted.

I get the suspicion you're not really sure what a codec is, right? If you're going to be a one-man show, you really should start learning some of the more important, somewhat technical subjects. In simple terms, a codec is how images are formatted and stored within a video stream. This includes compression, data rates, color space and so on. I'm far from an expert on the subject, but learning the basics is going to help you get a long way. Also, knowing the benefits and downsides of particular codecs can help if you have to change things.

I have also played around a bit with the HDTV but I don't know which calibration I am supposed to switch it to. Do you mean calibrations such as 'normal mode', 'theater mode', etc., or are we talking about something completely different?

Calibration on your monitor. As in your computer screen. AFAIK most TVs don't really let you calibrate them. I'd suggest borrowing a monitor from a graphic artist friend who dropped a decent amount on theirs to see how yours compares.

When you did the color grade, did you change the reds? I suspect you pushed the red too far in compensating for turning everything blue and turned the lips pink (slightly too far into the red).
 
I did take away the reds in a lot of the shots, so the movie would look more blue than blue-purple. Well, the codec is H.264 on the Canon T2i, if that's what you mean. I also exported it as H.264 after editing, so the final exported copy so far is H.264 and hasn't been downgraded, unless I'm wrong.

I'm doing as much research as I can on this, but I'm not finding a lot of info I understand about what I would do to fix it.
 