8-bit vs. 10-bit Video | What’s the Difference?

July 30, 2019


8-bit vs. 10-bit – you might have heard these terms before, but what exactly do they mean? When should you use one over the other? For starters, we’re not talking about 8-bit retro graphics. 8-bit video is everywhere, regardless of how it was originally shot: DVDs, Blu-rays, TV, and almost anything you see online are generally in 8-bit. So, if 8 bits are enough for a Hollywood blockbuster that you watch from the comfort of your own home, why would you ever need to shoot in 10-bit?

What we’re really talking about is something known as bit depth, or color depth. Bit depth can be thought of as how many possible gradations of color there are in an image. It is not the same as color gamut, which is the range of possible colors. You can have 256 shades of red, green, or blue in an 8-bit image, but with 10-bit video, you can have 1024 shades! Do we really need that many shades? That depends on who you ask. 10-bit video has existed for many years in the professional broadcast and film world, but only now is it really coming into the mainstream. You can thank three trends for making this possible: cameras with log modes, easily accessible color grading software, and, more recently, delivery in HDR with wider color gamuts.

Ask anyone who’s tried to grade a log image,
and they’ll tell you that you can only do so much with 8-bit video. With only 256 shades per color, some scenes can exhibit what’s called “banding”: visible steps in what should be a smooth gradient. In other instances, banding doesn’t appear until you color grade, meaning you can only push the image so far. You don’t have a lot of wiggle room in 8-bit, so you have to get your exposure correct in camera.

While you can use a log mode with an 8-bit camera, it’s not always a great idea, and some log modes handle it better than others. Canon’s original C-Log, for example, was designed for 8-bit video, and it does hold up pretty well. You see, standard Rec. 709 footage presents roughly 5-6 stops of dynamic range, so when you’re using a non-log, or normal, mode, those 8 bits only have to be spread across a picture that doesn’t utilize the sensor’s entire range. Log modes exist to fit every stop the sensor can capture into that limited space, so when you record this way, you’re spreading those same 8 bits across far more information, which then needs to be stretched back out later.

You can actually visualize this on a waveform monitor. As you add curves or adjust levels, you’re stretching the video’s data apart, revealing “gaps” in the image. Banding is exactly what happens when these
gaps become too prominent. Shooting in 10-bit solves this: all that information from the sensor can now be defined in 1024 steps, allowing for more drastic color grades and smooth gradients in places like skies.

Bit depth also matters for wider gamuts. Let’s say that Rec. 709 has “regular old blue,” while P3 or Rec. 2020 have “brilliant blue” (I just made those names up). 8-bit color can still display “brilliant blue,” but it will have fewer shades between it and “regular old blue.” The larger your gamut, the more challenging it is for an 8-bit mode to represent it accurately. This is why HDR requires a minimum of 10 bits.

But that still doesn’t answer how nearly every form of video gets away with 8-bit. I know this is going to sound weird, but believe it or not, for most types of video, 8-bit works just fine. Provided you have the bitrate, all it takes is a little bit of noise. Don’t believe me? Watch what happens when I add just a small amount of noise to a banded image: the banding becomes far less noticeable. Just as we could see the gaps in the waveform, you can see the noise filling in those gaps. When professional content is finished in 10 bits or more, the version you end up seeing comes from that source, much like how a 4K image looks really sharp and clean when downscaled to 1080p. The process in this case is called “dithering,” and it blends colors together using noise. On top of this, most movies, even those shot digitally, have some form of sensor noise or film grain that hides the limits of 8-bit
recording.

Lastly, just because you shoot 10-bit doesn’t mean you need a 10-bit monitor to appreciate the gains. Yes, it’s true that the most accurate representation of 10-bit can only be seen on a 10-bit display. However, shooting, and especially grading, in higher bit depths still gives you added flexibility that shows in the final image. Those cleaner gradients, for example, are still visible on the 8-bit monitor that you or anyone else has. In fact, it’s your monitor or graphics card that dithers the image before you even see it.

So, if you aren’t doing heavy grading, or better yet, if you’re not using a log mode at all, shooting in 8-bit isn’t all that bad. That said, if you shoot in log modes and want to perform extensive, creative color grades, you will definitely want to shoot in 10-bit. And if you want to produce HDR content, you’re actually required to shoot in at least 10 bits! As always, consider the needs of your production. From B&H, this is Doug, I’ll see you next time.
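The shades arithmetic above is simple enough to check for yourself. Here is a minimal Python sketch (the function names are my own, for illustration): each channel at bit depth b holds 2^b codes, and an RGB pixel can show that number cubed.

```python
def shades_per_channel(bit_depth):
    """Number of distinct levels one color channel can hold."""
    return 2 ** bit_depth

def total_colors(bit_depth):
    """Distinct RGB colors: one level per channel, three channels."""
    return shades_per_channel(bit_depth) ** 3

print(shades_per_channel(8))   # 256 shades of red, green, or blue
print(shades_per_channel(10))  # 1024 shades
print(total_colors(8))         # 16777216 colors ("24-bit color")
print(total_colors(10))        # 1073741824 colors ("30-bit color")
```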
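The “stretching reveals gaps” effect can also be simulated numerically. This toy Python sketch (my own model, not anything from the video) records a smooth dark ramp at 8-bit and at 10-bit, applies a crude 4x shadow lift the way a grade would, and counts how many distinct output shades survive on an 8-bit display. The 8-bit source collapses into far fewer steps, which is exactly the banding described above.

```python
def quantize(value, bits):
    """Round a 0.0-1.0 signal to the nearest code at the given bit depth."""
    levels = 2 ** bits - 1
    return round(value * levels) / levels

def lift_shadows(value, gain=4.0):
    """A crude stand-in for a color grade: a 2-stop push on the shadows."""
    return min(value * gain, 1.0)

# A smooth ramp covering just the darkest quarter of the signal range.
ramp = [i / 4095 for i in range(1024)]

def surviving_shades(source_bits, display_bits=8):
    """Record the ramp at source_bits, grade it, show it at display_bits."""
    graded = (lift_shadows(quantize(v, source_bits)) for v in ramp)
    return len({quantize(v, display_bits) for v in graded})

print(surviving_shades(8))   # 65  -> coarse steps: visible banding
print(surviving_shades(10))  # 256 -> a full 8-bit gradient survives
```

The gaps in the waveform correspond to the output codes that the graded 8-bit source can no longer reach.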
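Dithering can likewise be sketched in a few lines of Python. This illustrates the general technique, not any particular encoder’s implementation: adding up to half a code of random noise before quantizing makes the rounded values average out to the true shade, trading banding for fine grain.

```python
import random

def quantize(value, bits):
    """Round a 0.0-1.0 signal to the nearest code at the given bit depth."""
    levels = 2 ** bits - 1
    return round(value * levels) / levels

def dithered_quantize(value, bits, rng):
    """Add +/- half a code of noise before rounding (simple random dither)."""
    levels = 2 ** bits - 1
    noisy = value + rng.uniform(-0.5, 0.5) / levels
    return quantize(min(max(noisy, 0.0), 1.0), bits)

rng = random.Random(0)
v = 0.32  # a shade that falls between two 4-bit codes

plain = quantize(v, 4)  # always snaps to the same nearby code
avg = sum(dithered_quantize(v, 4, rng) for _ in range(10_000)) / 10_000

print(plain)  # 0.3333... -> off by ~0.013, identically for every pixel
print(avg)    # ~0.32     -> the same error dissolved into noise
```

Averaging here plays the role your eye plays when it blurs neighboring noisy pixels together.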

59 Comments

  1. Guru Raj says:

    Add good sample shots…

  2. Design Media Consultants says:

    B H is our go to source for gear and equipment. Been using you guys for over 30 years. Appreciate your honest and informative videos

  3. Mitch Thompson says:

    Still confused lol

  4. Abbott Racing says:

    OK so why not just always shoot in 10bit? It seems like you are saying 10bit is better in every way but is just not normally used because it is not noticeable. Is there any downside to 10bit?

  5. Prince Peprah says:

    4k 60p-8bit and 1080p-10bit at 60p with 4:2:2 which one has the better quality..??? confused…mostly shot music videos with my GH5 so which 1 will gve me better color grading results..

  6. pleggli says:

    You can only have 256 total colors in an 8bit per pixel image, in an 8 bit per channel RGB image can have 256x256x256 (24 bits) per pixel or 10 bit per pixel (30 bits per pixel).
    It's gets a bit confusing just calling it 8bit without specifying which kind of 8bit you are talking about.. Especially in the title of the video and introduction…

  7. Frank Kujawski says:

    For a suggestion on how 10 bit can work for you. On your two blue locks, if you had put in a "B" in the first one and an "H" in the second one that was almost the same color, and then used curves to enhance the B & H, you would have more dramatically shown how 10 bits can help in certain situations.

  8. Prawda TV says:

    Did you shoot this video on 10bit? Your background looks noisy.

  9. MDF RESCUER says:

    Nicely.

  10. Pieter says:

    Very informative.Great work.

  11. R Rivera says:

    Awesome presentation. Clearly explained although not entirely for the newbie.

  12. Lyle Stavast says:

    That's a great layman's explanation and set of graphics…

  13. John Olson says:

    Nicely done. Great speaking voice for YouTube productions and you're a great "explainer." Keep producing! Oh, and the content was great, too! (smiley thing)

  14. Hokgiarto Saliem says:

    Hasselblad raw can do 16 bit / channel, Canon stil raw 14 bit / color (or channel). I think many pro video should be easy at those level also. Dolby Vision require 12 bit / color to end user. I hope soon all entry level SLR can do 10 / 12 bit HDR as same as end user, and pro is more than that.

  15. theUKtoday says:

    Mp4 or 1080p is more than enough

  16. Aakash Kumar says:

    shoot with log mode or without​ log mode.. which will give me best result for short movie or music video… anyone plzzz suggest…bcoz many person hav their different reviews…so i m confused…

  17. Prince Peprah says:

    All-I on Gh5 & Prores on Black Magic Video Assist, which is best for editing and better color grading results..Most shoot 1080p 60fps 10bit ..

  18. Hiro Nito says:

    I’ve seen a lot of 8bit+frc 4k tv with banding. I guess this video is telling me something that is TRUE.

  19. Rationalific says:

    Awesome overview!

  20. Frank Simmons says:

    perfect! thanks

  21. YVZ STUDIOS says:

    I just have one question. I've found out that YouTube is supporting HDR video for a while now. Apparently YouTube recognizes that if the video is in 10bit. I don't care about HDR though, only the advantages of 10bit, because it gives me better gradients in my videos (for even less filesize).
    But how to create a 10 H.264 mp4?
    That's no problem! Handbrake supports H.264 encoding in 10bit mode now which is awesome! :^)
    My question is: Should I add some dithering (1% noise) to my 10bit video before I encode it with Handbrake and upload it to YouTube? What will look better? 🤔
    I'm using RF (Constant Quality) and not Average Bitrate, btw. The video is also in (fake) 60fps to get even more quality (bitrate) from the YouTube compression.
    https://support.google.com/youtube/answer/7126552?hl=en

  22. 360 grad says:

    thinking about buying a 5D mark IV for video productions, it would be right to upgrade to c-log if i understand you right, no matter if i shoot in full HD 4.2.0 or 4k 4.2.2… correct? btw. thanks for the vid. i learned something today 🙂

  23. M S says:

    So what the conclusion, can i have better image with 10bit video on 8 bit monitor than 8 bit video on 8 bit monitor?

  24. WSWEss says:

    explications not clear, ended up believing in flat earth theory

  25. Reza Rishehri says:

    Thank you so much it helps a lot

  26. james romero says:

    256 shades of Gray . yeah!

  27. Nicolas Mango says:

    Awesome video. Thank you very much. I finally understand.

  28. Shinigama says:

    thanks doug <3

  29. Michael Wanyoike says:

    Nicely explained! As a one-man video production crew, I'll stick to 8-bit as the quality is acceptable for the average consumer, and doesn't require more space and processing requirements like the 10-bit. I don't see the budget justification for using 10- bit.

  30. Sean Phạm says:

    Well, I'll put it simpler: 10 bit to 8 bit, is like 4K to FullHD. When recording in 4K, You'll have more informations, pixels, to edit later, and the final result even if you export in 1080p, will be better than shooting at 1080p at the very beginning

  31. Julius Kingsley says:

    TL;DW 10bit is better in every way, but companies are slow, lazy, and cheap.

    I have watched 10bit video on my 8bit monitor for years (when I can find it) and the difference is clear. You need a decent processor to run it, but it can be better looking and/or smaller in file size.

  32. Art Altman says:

    Excellent presentation, and needed. I've long wondered about the significance of 8 bit vs 10 bit. Thank you.

  33. Stefan Weiberg says:

    And always make sure that your editing software does handle/use 10bit. LumaFusion reduced all my 10bit shots to 8bit and I have heavy color bending in my final results.

  34. Vandicleuza Maria Carvalho says:

BORING VIDEO THAT IAGO MADE 👌👌👌👌👌👌👎👎👎🖕🖕🖕🖕🖕

  35. Vandicleuza Maria Carvalho says:

WORST VIDEO THAT IAGO MADE

  36. Gašper Jager says:

    Great explanation. To the point and just important info. Nicely done.

  37. Jim says:

    That was a good explanation. 10 bits per color channel makes sense. But also I usually don't notice the color banding, and we're up against being meaningful in homes with 4k, but whatever. Vaccinate your kids, you filthy animals.

  38. *ظ ́ ́ʇ ́ ́ظ* says:

    I'm trying to solve the color banding issue. I have an 8 bit Samsung 43" SVA monitor. Is colorbanding an inconvinient feature of such panel or is my unit faulty?

  39. Ai Hang says:

    One question bothered me for a long time. OLED iPhone has 8bit display, but it supports both HDR10 and Dolby Vision, why?

  40. Mauktik Rawat says:

    Very well explained 👍

  41. J Rogers says:

    Is the sony x900f in 49’ 8bit or 10?

  42. Brian Eason says:

    i do real estate video and photograpy. Which would recommend Fuji XT3 with the Atomos V or Sony A7iii with the Atomos V?

  43. NevadaScrubJay says:

    As you said, it is still seen in your and your viewers monitor. Plus, videos might be affected more than stills. I believe Lightroom only uses 8 bit? What about the difference in pixel size due to monitor size; as 27" vs 32"??? Then there is the difference of HDR in 2k or 4k vs monitors without HDR? Is all of this "too much information" for an amateur, or even most professional, still photographers using Lightroom?

  44. Mandilo23 says:

    I'd like to see a bandwidth comparison in shooting [email protected] on 8-bit vs 10-bit. Is it a 4x gain on file size?

  45. Kevin Endella says:

    No wonder there is a term "HDR10". So basically 10 stand for 10-bit?

  46. Pirsigs Journey says:

    Thank you, it's all starting to make better sense now.

  47. Ted Olthof says:

    Thank you for a very well explained video!

  48. This Epic Life says:

    Fantastic.
    Great explanation.

  49. Ali Sadeghy says:

    Really Helpful! Thanks!

  50. //KΛイΛLУZイ// says:

    I have a Ryzen 3 2200g and a Flatron E1940S … and curiously the granulation is extreme, it seems that it had a color depth of 16 bits and not 32, and amd's configuration only lets me put up to 8 bpc (I guess the what the monitor allows me to do …)

    Is there any way to edit this and see how I looked in Windows 7 with my old PC?

  51. Wesley Oostvogels says:

    I’m confused about the following: when a display truly has 10bit color depth (so not 8-bit +FRC) then it should not be dithering right? It should be able to fill up all those 1024 grades just like 8 bit fills 256 grades? Does it mean 10 bit camera files will be shown more smooth? If this makes any sense.

  52. Anton Markaj says:

    If I put my ASUS PG27UQ to 82Hz (of max 144Hz) I can enable Full RGB 12bpc. Is it any usefull or should I stick with 120Hz 8bpc?

  53. Poor youtuber says:

    i have a friend who download only 10bit movies from torrent 😂😂

  54. Gab Larkin says:

    Best explaination

  55. sitcommander says:

    So is all YouTube video 8bit VP9?

  56. Magnifico Official says:

    That was a clear and peaceful explanation

  57. BTMovieSecondChannel says:

    Very informative!

  58. Loganbogan9 says:

    Dark colors on 8-bit look like crap.
