Colour (Color) Bars – 100% vs 75% bars – Analog/Digital/HD Questions?

Home Page forums Autodesk/Discreet Flame and Smoke Colour (Color) Bars – 100% vs 75% bars – Analog/Digital/HD Questions?

Viewing 9 posts - 1 through 9 (of 9 total)
  • Author
    Posts
  • #203077
    rob fisher
    Participant

    Hi,

    From what I’ve been taught/told in the past, colour bars on a Master tape sent to a distribution house for broadcast are only necessary for analogue tape dubbing/calibration. However, in my NTSC part of the world, all the distribution houses require the SMPTE test pattern and 1 kHz tone on all tapes.

    At the particular company I work at, we used 75% bars for SD DigiBeta cassettes and 100% bars for HD Masters. However, we noticed that the distribution tape room ops were absolutely killing the grade, making all our spots look incorrect on air, when calibrating the SD spots on DigiBeta. So we started laying off both NTSC Masters and HD Masters with 100% bars out of the Flame, and things started looking correct on air.

    At the end of last week I got a call from one particular distributor who said whoa whoa whoa, the Masters you sent should have 75% bars as they are DigiBeta SD NTSC. However, this was only flagged after the third time this job was sent to them (many slight legal revisions after the job had aired). So there is some definite inconsistency.

    So my question to the forums is this: what is the technical explanation here? I don’t personally know why they are calibrating to bars on a digital tape. I also don’t know why 75% bars are more acceptable than 100% bars for SD NTSC. I can’t find any white papers online, so maybe someone here can provide me with some technical background about which bars to use on which Masters and why.

    Thanks for any responses. I’m sure this thread will come in handy for others googling this same issue – detail is much appreciated. Or just link me to the correct answer, either way.

    -rand

    #218256
    Anonymous
    Inactive

    I don’t know what the answer is, but a potential solution would be to simply label your tapes with whatever kind of bars you put on.

    Oh, I’ve never used 100% on NTSC Digi.

    keep us posted.
    randy
    mill NY

    #218258
    new way
    Participant

    Well, it turns out there is no ‘standard’ per se.

    1st response:

    I can tell you that in HD we do 100% full-frame top-to-bottom bars from the deck, but we often get SMPTE-like patterns at the usual 75% with the 100% box (etc. etc.), and tone usually at -20 dB. The tone sometimes changes, but we *like* it at -20. Yes, we do set levels based on the bars and tone on the master (Digital Betacam). Please set the bars to 75% SMPTE, and the tone to 0 VU / +8 dBm.

    2nd response:

    75% SMPTE bars for both SD & HD. And absolutely we adjust playback levels according to the bars and tone at the head of each tape, using digital waveform monitors and vectorscopes. Black is set to 0 IRE (digital) or 7.5 IRE (analogue) on a waveform monitor, with white levels peaking at 100 IRE. Chroma is calibrated via a vectorscope, ensuring each colour sits in its target position on the vectorscope display. Tone is set to play back at 0 VU.

    3rd response (from the distributor with the initial complaint):

    no response.

    So it seems 75% bars is what people are used to for NTSC SD. Sometimes 100% bars for HD, but 75% bars is also acceptable? Sheesh. We’ve also decided to start labelling all Masters with the bars percentage, as you suggested, Randy.

    Our problem was that the grades were effed up on air, and of course the client is calling us. Maybe someone else can give a better explanation of the calibration process at a dub facility they work(ed) at. I’d love to hear it.
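
    As an aside on the tone levels quoted in the responses above: the dB figures convert to linear amplitude with a one-line formula. A minimal sketch in Python – the function name is mine for illustration, and the mapping of analogue 0 VU to -20 dBFS is the common US practice, not a universal rule:

    ```python
    # Sketch: linear amplitude of a reference tone given its level in dBFS.
    # -20 dBFS is the common US digital alignment level that the analogue
    # 0 VU mark is usually mapped to.
    def dbfs_to_linear(db):
        """Convert a dBFS value to a linear fraction of digital full scale."""
        return 10 ** (db / 20)

    print(dbfs_to_linear(-20))   # 0.1 -> a -20 dBFS tone sits at 10% of full scale
    print(dbfs_to_linear(0))     # 1.0 -> 0 dBFS is digital full scale
    ```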

    -rand.

    #218257
    Anonymous
    Inactive

    wow. sounds nasty.

    good detective work though.

    thanks for sharing.

    good luck.

    randy
    millNY

    #218255
    Martin Furness
    Participant

    Been doing 75% for SD and HD since the beginning of time. Very good initial and interesting question though.

    #218259
    Roger Koller
    Participant

    Hey rand!

    Good question!

    I’d like to know it for PAL.

    Regards

    #218261
    moc mo
    Participant

    In the good old days of analog TV, the level of the color bars (and hence the chroma level in the composite baseband signal) could not exceed 100% of the luminance signal (in PAL, 700 mV) – so it was set to 75% chroma level. Had it been set to 100% chroma level, the peak composite signal would have reached about 133% of the luminance level and caused heavy distortion on the RF carrier over the air. As it was the general understanding that in the real world no natural color is fully saturated, 75% chroma seemed to be sufficient, and it was. (This is actually the same with NTSC, except that 100% luminance is slightly more than 700 mV, as the sync level is slightly less than 300 mV – the total video voltage including sync is 1 Vpp in both NTSC and PAL.)
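
    The 133% figure can be checked with a quick back-of-the-envelope calculation. A minimal sketch in Python, using the standard luma weights and the classic U/V scaling factors; the function name and structure are mine, for illustration (peak excursion for the worst-case fully saturated yellow bar):

    ```python
    # Sketch: peak composite level for a saturated yellow bar at a given
    # bar level (1.0 = 100% bars, 0.75 = 75% bars).
    import math

    def composite_peak(level):
        # Yellow bar: R = G = level, B = 0 (normalized 0..1).
        r = g = level
        b = 0.0
        y = 0.299 * r + 0.587 * g + 0.114 * b   # luma
        u = 0.493 * (b - y)                     # scaled B-Y
        v = 0.877 * (r - y)                     # scaled R-Y
        chroma = math.sqrt(u * u + v * v)       # subcarrier amplitude
        return y + chroma                       # peak excursion above blanking

    print(round(composite_peak(1.00), 3))  # 1.334 -> ~133% for 100% bars
    print(round(composite_peak(0.75), 3))  # 1.001 -> just about 100% for 75% bars
    ```

    Which is exactly why 75% bars kept the composite signal legal while 100% bars blew through the top of it.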

    Even if you recorded the video on a DigiBeta, you had to make sure that the max. chroma level did not exceed this 75% mark, because too much chroma could cause trouble with some studio equipment AND the transmission.

    In the digital domain some things have changed –
    1) There is no NTSC or PAL (NTSC refers to the committee that chose the standard way back, and PAL refers to the method of reversing one of the color difference signals in phase every other line – hence PAL = Phase Alternating Line).
    2) The only thing that is left is the line frequency of 15.734 kHz and the odd 59.94 Hz field frequency for NTSC, and 15.625 kHz and 50 Hz for PAL, respectively.
    3) The color difference signals that were previously modulated as I and Q (NTSC) or as U and V (PAL) onto a subcarrier at 3.579 MHz (NTSC) or 4.43 MHz (PAL) are now digitized and serialized, and sent as Cr and Cb multiplexed line by line with the digitized Y in the SDI data stream. No more modulation of analog subcarriers somehow fitted onto the luminance.
    4) As the signal is SDI now, the level of the chrominance can be either 75% or 100%. It is more or less a matter of convention.
    5) Only if the (digital SD-SDI) signal has to be re-converted to an analog signal (be it NTSC or PAL) must the chroma level again not exceed 75%, or else….
    6) On this one I am not perfectly sure (but I will check): the current convention seems to be 75% chroma level for all SD sources (analog and SDI) and 100% chroma level for HD-SDI sources.
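
    To make point 4 concrete: in the digital domain, bar levels are just code values, and both 75% and 100% are perfectly legal. A minimal sketch using the Rec. 601 8-bit luma quantization (code = 16 + 219 × level); the function name is mine, for illustration:

    ```python
    # Sketch: 8-bit Rec. 601 luma code values for 100% vs 75% white.
    # Both are legal digital levels; the 75% restriction only ever
    # mattered for the analog composite output.
    def luma_code(level):
        """Quantize a normalized luma level (0..1) to an 8-bit Rec. 601 code."""
        return round(16 + 219 * level)

    print(luma_code(1.00))  # 235 -> 100% white (digital peak white)
    print(luma_code(0.75))  # 180 -> 75% white
    print(luma_code(0.00))  # 16  -> black
    ```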

    Hope this helps.

    #218260
    Brent Felix
    Participant

    I’ve always used 75% bars with 1 kHz tone @ -20 dB for both HD & SD. An experienced machine room tech should know right away whether they are looking at 75% or 100% bars on a scope/monitor. It would be fairly obvious that you were doing something incorrectly if you were trying to swing 75% bars to 100%, or the other way around.

    #218262
    Anonymous
    Inactive

    Thank you for the information… I was searching for this for a long time.
