
HDR on Youtube - next big thing? Requirements?


Axel

Consider HDR already?   

57 members have voted

  • Not interested at all. (7)
  • Don't need it now, will evaluate it when it's everywhere. (27)
  • I wasn't aware of the latest developments, but I'm looking into it now. (16)
  • I am already updating my workflow and hardware, HDR is the next big thing. (7)


Recommended Posts

Everything is very specific and standardized, and it is here to stay. HDR is a much greater leap than 4K; the two technologies will simply coexist from now on. I guess it is better to wait a bit until the market stabilizes (after this summer, I'd say): of last year's TVs, only a handful were truly HDR (about half a dozen in total). This year, too, the low-cost sets are only "almost" HDR even though they claim to be, but from the middle of the range up they truly are. There is also the Ultra HD Premium tier, which is where the industry is going, but it is very expensive at the moment (and probably will be for the next couple of years).

Dolby's system is more advanced, but right now it's unrealistic (and it will take some time to reach its maximum potential, hardware-wise). Also note that there are different standards for OLED TVs, because they can go unbelievably low (for blacks) but not as high in peak brightness.

http://4k.com/news/a-clarification-on-sony-4k-tvs-the-ultra-hd-premium-standard-sony-4k-hdr-and-other-hdr-types-14436/



"I'm lost. There are displays that actually show up to 15 stops of dr now? Or still just an image with 15 stops compressed, but not as much? What does 10 bit have to do with hdr? Just another advancement that's buddying up with hdr in youtube and better screens? They definitely don't have to go hand in hand, right? A good hdr display still uses all of its dr with a video not made for hdr, right? Not just essentially lifting the blacks and everything to match a bad display? A video made IN hdr, when played on a normal screen, will clip the boundaries? Or compess it, giving it that flatter look? Will old content be poorly converted in order to appear to be hdr?

Sorry.. last I heard on this topic was like "Wow, there's a display what has EIGHT stops of Dr!! The futuuuuuure!" so I'm pretty behind. I should try to figure it out on my own really, but if I'm missing something important, please share. Also are pretty much all 4k tvs these days that say hdr liars?"

 

We are all a little lost since this is all new, but I have some answers.

1. You are right; "HDR" really bundles two advances, and the specs require they go hand in hand: greater dynamic range (which most people obsess about) and better color. The better color reproduction comes from 4:2:2 chroma sampling, 10 bits (more gradations of each color), and a wider color gamut than good old REC709. Now, many scenes in fact do not have much dynamic range, but almost all scenes have color. So I regard the improvement in color as much more important (hence the HDR video I shot emphasized color: fall trees).
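To put rough numbers on the "more gradations" point, here is a quick back-of-the-envelope sketch in Python (full-range counts from the bit-depth math only; broadcast "video range" encoding would leave slightly fewer usable levels):

```python
# Tonal gradations per channel and total colors for
# 8-bit (SDR) vs. 10-bit (HDR) video, full-range.
for bits in (8, 10):
    levels = 2 ** bits      # gradations per color channel
    colors = levels ** 3    # total RGB combinations
    print(f"{bits}-bit: {levels} levels/channel, ~{colors:,} colors")

# 8-bit:  256 levels/channel,  ~16,777,216 colors
# 10-bit: 1024 levels/channel, ~1,073,741,824 colors
```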

2. What does a true HDR video look like when played on a normal screen? Well, you have a normal screen, right? So here is a true HDR video:

Looks flat. But if you forced your (fictional) true HDR TV into HDR mode, this video would magically appear in full color and full contrast. If an HDR video carries the correct metadata, YouTube converts it to SDR, so you see the video in full color (REC709); you would have to ask them exactly how they convert. The other HDR video I posted earlier (same exact specs as this one) had the metadata, so you see it in full SDR color. But that is not the HDR version.

3. On lying about "HDR": when a camera says it shoots 1080 video, you know it never achieves the full resolution (debayering, pixel binning). When a camera says it shoots 4K, does it really achieve 3840x2160 resolution? No (but it usually means it has higher resolution than Full HD).

So lying is standard in video (do you call Canon a liar?). Same with TVs: do HDR TVs really give the full REC2020 color gamut? None do. Do they reach 1000 nits and above? Almost none do. Are they 10-bit? Yes. Do they have a gamut greater than REC709? Yes. Do they have greater DR than an SDR set? Yes. So you do get better color and DR compared with SDR.

There is, however, one additional fib that is less benign and less standard. You see the same "HDR" logo, but then the fine print says "HDR compatibility". This just means that the TV will convert the HDR video to SDR. You do not see HDR at all; you get a picture with the right colors and so on, but it is HDR converted to SDR.


10 hours ago, Policar said:

The specs I'm hearing are 15 stops DR, rec2020. For acquisition. Then 10 bit 4000 nit wide gamut for the panel itself. Obviously not many current systems meet these specs and there are many, many competing standards. After all, 1024x720 was once "HD."

The result is breathtaking, though. Especially on the 10,000+ nit display. Only one other tech demo impressed me as much this year and it felt less mature. What's cool is you'll be able to see colors you've never seen before.

 

Thanks for responding. Those specs are really scary.


5 hours ago, markr041 said:

3. On lying about "HDR": when a camera says it shoots 1080 video, you know it never achieves the full resolution (debayering, pixel binning). When a camera says it shoots 4K, does it really achieve 3840x2160 resolution? No (but it usually means it has higher resolution than Full HD).

So lying is standard in video (do you call Canon a liar?). Same with TVs: do HDR TVs really give the full REC2020 color gamut? None do. Do they reach 1000 nits and above? Almost none do. Are they 10-bit? Yes. Do they have a gamut greater than REC709? Yes. Do they have greater DR than an SDR set? Yes. So you do get better color and DR compared with SDR.

This is somewhat disingenuous. First of all, not all cameras generate final images the same way. Even within the same camera we see more detail with raw than with a compressed codec (Canon 5D Mk III). Andrew has called out and praised multiple camera manufacturers for lying or telling the truth about resolution. The 1080p from the C300 has been praised; it is downsampled from a 4K sensor. I haven't done the math, but obviously it is higher resolution than the 1080p from a T3i. This has been noted many times by Andrew. Even the C100 is downsampled from a 4K sensor. Slap on an external recorder and you've got some of the best 1080p around.

And of course Samsung NX1, Sony A6300, etc.  All 4k downsampled from 6k.

Point being, people should definitely call out lying by camera makers (as they already do), TV manufacturers, and broadcasters. I praise the people who point out that most TV content isn't even 1080p. To be honest, the only time my TV sees true 1080p content is when I fire up my media center PC.


1 minute ago, Damphousse said:

This is somewhat disingenuous. First of all, not all cameras generate final images the same way. Even within the same camera we see more detail with raw than with a compressed codec (Canon 5D Mk III). Andrew has called out and praised multiple camera manufacturers for lying or telling the truth about resolution. The 1080p from the C300 has been praised; it is downsampled from a 4K sensor. I haven't done the math, but obviously it is higher resolution than the 1080p from a T3i. This has been noted many times by Andrew. Even the C100 is downsampled from a 4K sensor. Slap on an external recorder and you've got some of the best 1080p around.

And of course Samsung NX1, Sony A6300, etc.  All 4k downsampled from 6k.

Point being, people should definitely call out lying by camera makers (as they already do), TV manufacturers, and broadcasters. I praise the people who point out that most TV content isn't even 1080p. To be honest, the only time my TV sees true 1080p content is when I fire up my media center PC.

I don't understand why you used the word "disingenuous" when what you go on to say just repeats what I said: cameras that are labeled 4K don't achieve 4K resolution, just like TVs labeled HDR do not meet all the specs of HDR. I did not say all cameras or all TVs. You did add a lot of info that is irrelevant to my point, though it could be useful. Thanks for the praise. :)


13 minutes ago, markr041 said:

I don't understand why you used the word "disingenuous" when what you go on to say just repeats what I said: cameras that are labeled 4K don't achieve 4K resolution, just like TVs labeled HDR do not meet all the specs of HDR. I did not say all cameras or all TVs. You did add a lot of info that is irrelevant to my point, though it could be useful. Thanks for the praise. :)

Yeah.  Probably misunderstood your post.  I retract that statement.


On 11/18/2016 at 9:49 PM, Liam said:

I'm lost. There are displays that actually show up to 15 stops of DR now? Or still just an image with 15 stops compressed, but not as much? What does 10-bit have to do with HDR? Just another advancement that's buddying up with HDR on YouTube and better screens? They definitely don't have to go hand in hand, right? A good HDR display still uses all of its DR with a video not made for HDR, right? Not just essentially lifting the blacks and everything to match a bad display? A video made IN HDR, when played on a normal screen, will clip the boundaries? Or compress it, giving it that flatter look? Will old content be poorly converted in order to appear to be HDR?

Sorry.. last I heard on this topic was like "Wow, there's a display that has EIGHT stops of DR!! The futuuuuuure!" so I'm pretty behind. I should try to figure it out on my own really, but if I'm missing something important, please share. Also, are pretty much all 4K TVs these days that say HDR liars?

I've never seen stops used as a measurement of a TV's contrast ratio (if that's what you mean by dynamic range; the misnomer is confusing me), but it's trivially easy to convert between a contrast ratio (X:1) and stops. FWIW, LCDs have had greater than eight stops of contrast for many, many years. For reference, nice prints on paper can have at most 4-5 stops of contrast, which is part of the reason high-contrast film (slide film) was popular for print but is insufficient for digital distribution. To convert for a TV, take the log base 2 of the contrast ratio and you have the contrast in stops.

Dolby claims in their Dolby Vision white paper that Blu-ray standards limit content to a range of 100 nits down to 0.117 nits, which is under 10 stops of contrast. Perhaps this is what you are referring to. But that's not true of the display itself, just the arbitrary standard, and OLEDs and plasmas can reach very, very high contrast ratios; they just don't get that bright. (And neither has an intrinsic "dynamic range": that's a measure of the signal captured, not of the display used to show it.) What defines HDR is a combination of contrast ratio, brightness, and color gamut. The massive increase in brightness is what's most significant.
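Since stops are just the log base 2 of the contrast ratio, both conversions are one line of arithmetic. A quick sketch, using the Blu-ray figures quoted above:

```python
import math

def ratio_to_stops(contrast_ratio: float) -> float:
    """Convert a contrast ratio (X:1) to stops: stops = log2(X)."""
    return math.log2(contrast_ratio)

print(ratio_to_stops(1000))         # ~9.97 stops for a 1000:1 LCD
print(ratio_to_stops(100 / 0.117))  # ~9.74 stops -- the Blu-ray range
                                    # from the Dolby white paper, i.e.
                                    # "under 10 stops of contrast"
```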

I'm not sure what the current spec for HDR is, but there are many competing ones and, within those, many tiers. 720p and 1080p were decided upon as the HDTV standards, but they too emerged from a number of competing standards. For HDR we don't even have the set of standards finalized. I think the current spec is 1000 nits for most manufacturers (though you see HDR labels on far dimmer displays) and maybe 10-bit color? Dolby Vision, I believe, is 4000 nits and higher, with a goal of 10,000 nits and higher. I'd read their white paper; it's interesting without getting very technical.
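For the curious, the transfer function underlying Dolby Vision (the PQ curve, standardized as SMPTE ST 2084) encodes absolute luminance up to 10,000 nits into the signal. A minimal sketch of the encoding direction, using the constants published in the standard:

```python
# PQ (SMPTE ST 2084) inverse EOTF: absolute luminance in nits -> signal 0..1
m1 = 2610 / 16384        # constants defined by ST 2084
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

# 100 nits (the SDR peak) lands near half the signal range; the rest of
# the code values are reserved for highlights up to 10,000 nits.
print(round(pq_encode(100), 3))    # ~0.508
print(round(pq_encode(10000), 3))  # 1.0
```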

Today's standards for HDR are not too demanding, but neither are the displays too impressive. The 15-stop standard (for acquisition; the display itself surely has a much higher contrast ratio than 15 stops, and if you think about it the reason is obvious) and the 4000/10,000-nit standards are, I believe, Dolby's goals for the future. It's true that there are still competing standards (HDR10, Dolby Vision, and most requiring only 1000 nits to get the label), but I personally wouldn't consider those HDR based on what I've seen of them. Like 4K, they don't look good enough or different enough to catch on and are mostly just marketing or halo items.


If I remember correctly, Philips research showed that people can't tolerate 10,000 nits. Yes, it's breathtaking for the first minute, but after that it feels like it's destroying your eyes. 4000 was acceptable, and 2000 was the most preferred.

At the moment, this is all we can do to future-proof our work:

Shoot log, 10-bit, in the widest color gamut available. Underexpose to save the highlights, but make sure that unavoidably clipped highlights stay a very small part of the whole image area.


1 hour ago, Eric Calabros said:

If I remember correctly, Philips research showed that people can't tolerate 10,000 nits. Yes, it's breathtaking for the first minute, but after that it feels like it's destroying your eyes. 4000 was acceptable, and 2000 was the most preferred.

At the moment, this is all we can do to future-proof our work:

Shoot log, 10-bit, in the widest color gamut available. Underexpose to save the highlights, but make sure that unavoidably clipped highlights stay a very small part of the whole image area.

If that's what Philips is saying, it's ridiculous. Step outside on a sunny day and you're exposing yourself to brightness far in excess of 10,000 nits all around you; orders of magnitude brighter. Having seen 10,000 and 4000 side by side, I can confirm that neither is fatiguing, even in indoor light, though you might choose to turn the brightness down at night as you would with any display. And simply because a display can get that bright doesn't mean it always will. It's like saying 8K is too sharp: it might be unnecessarily sharp, but you can always show a less sharp image. Only very small parts of the frame ever get that bright.

Dolby, which doesn't have a horse in this race the way Philips does (hardware manufacturers want lower levels to be perceived as preferable so they can sell displays before the technology advances, whereas Dolby just wants the best standard that will eventually get adopted), has found that 10,000 nits is what 90% of consumers consider "adequate." Based on my experiences with HDR, I wouldn't consider anything below 4000 nits even "true" HDR (though it might still be quite impressive), and 10,000 nits for me is the baseline of what's ideal, similar to how I think 2K or 1080p is an ideal "good enough" resolution. Anything beyond that is great but diminishing returns, and only really appeals to that last 10%. Of course, 2000 nits will still look much better than 100, which is the standard now; it's considerably more than halfway to 10,000 in log base 2.
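A quick check of that log-scale claim, under the assumption that the perceived gain scales roughly with stops above the 100-nit SDR reference:

```python
import math

SDR_NITS = 100  # current SDR reference peak
print(math.log2(2000 / SDR_NITS))   # ~4.32 stops above SDR
print(math.log2(10000 / SDR_NITS))  # ~6.64 stops above SDR
# 4.32 / 6.64 ~= 0.65, i.e. 2000 nits gets you roughly two thirds of
# the way to 10,000 nits on a log-2 scale.
```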

As regards future-proofing, I don't think any camera I have will come close to matching the HDR spec, so I don't worry about it. The C300 Mk II with the new Canon Log 2 upgrade is the cheapest thing on the market that is HDR-ready, and it was actually designed as such.


"The C300 Mk II with the new SLOG 2 upgrade is the cheapest thing on the market that is HDR-ready, and it was actually designed as such."

Nonsense. The combination of a Panasonic GH4 + Atomos Shogun Flame gives you 12 stops of dynamic range (using V-Log L), 10-bit 4:2:2 and UHD, with an extended color gamut at very high bitrates. That is about the same as the C300 Mk II ($12,000) provides at that resolution, for far less money. And don't think you don't need the Shogun or something like it for the Canon: the Canon viewfinders cannot show what HDR video looks like in the field. The Shogun can, at 1500 nits and with the appropriate HDR LUTs.

Shooting in log at HDR specs but seeing only REC709 without LUTs is just not going to produce good HDR video. An appropriate HDR monitor is needed. Someone here evidently thinks shooting in log and underexposing is the correct way to shoot HDR; that is exactly the kind of thinking you fall into when you cannot monitor HDR video in the field.


54 minutes ago, markr041 said:

"The C300 Mk II with the new SLOG 2 upgrade is the cheapest thing on the market that is HDR-ready, and it was actually designed as such."

Nonsense. The combination of a Panasonic GH4 + Atomos Shogun Flame gives you 12 stops of dynamic range (using V-Log L), 10-bit 4:2:2 and UHD, with an extended color gamut at very high bitrates. That is about the same as the C300 Mk II ($12,000) provides at that resolution, for far less money. And don't think you don't need the Shogun or something like it for the Canon: the Canon viewfinders cannot show what HDR video looks like in the field. The Shogun can, at 1500 nits and with the appropriate HDR LUTs.

Shooting in log at HDR specs but seeing only REC709 without LUTs is just not going to produce good HDR video. An appropriate HDR monitor is needed. Someone here evidently thinks shooting in log and underexposing is the correct way to shoot HDR; that is exactly the kind of thinking you fall into when you cannot monitor HDR video in the field.

The HDR acquisition spec among those developing next-generation standards is 15 stops, though, and rec2020. Not 12 stops and rec709. I've used the GH4 and it simply doesn't have that much highlight latitude. I have friends at Dolby, Technicolor, Deluxe, CO3, etc. developing the next-gen HDR grading systems, and they're all operating under the 15-stop spec. To reveal my sources more specifically would break NDA, but I know for a fact that Canon targeted 15 stops because it was considered the baseline for HDR.

To be fair, these labs are remastering content from film and from cameras specced at 14 stops and under. I'm sure you can create compelling personal HDR-ready content with that system, but it doesn't meet, or even come close to, the standards being developed for professional use. And when distributors and exhibitors further and more formally standardize, the 15-stop/rec2020 spec (or better) is what's going to be expected. Whether the C300 II meets its 15-stop claim is another question (before the firmware update it probably didn't; I have a number of friends who worked on developing that camera, and it was an ugly, protracted development whose early failures mask the now-impressive sensor unlocked by the Canon Log 2 firmware update), but anything less isn't even close. (FWIW, I would rate the Alexa Mini at 15+ stops, and internally Arri does as well. Likewise, the F65 has footage that can be graded for HDR even if it wouldn't meet specs for official support, so yes, it's not cut and dried.)

I'm not saying this to argue with you but to educate you. I have no doubt that you can make impressive content on your camera that looks impressive on an HDR display. But this is not an approach I would recommend to anyone as anything other than a hobby and the results will never make full use of the technology being developed, not even close. That said, from this experience you are building now, you will be on the forefront of shooting and grading HDR, and that is an extremely valuable skill going forward.


Who is talking about REC709? REC2020 is already an official spec of HDR, this generation. The GH4/Shogun combo can do REC2020 color no worse than the Canon. The Canon beats it by a few stops in DR when the GH4 uses V-Log L (12 stops) (if Canon meets its goal of 15 stops), but the GH4 is equal in providing an expanded color gamut, color gradations, and color sampling, which are just as important for HDR video. In short, that combo is way beyond the specs of REC709. And 12 stops is within the specs of HDR10, an agreed-on standard for HDR.

The phrase "doesn't meet the standard for professional use" is an incorrect statement (not to mention arrogant), as there is no "standard" for "professional" use of HDR, although there are current standards for HDR (like HDR10), which the GH4 meets, as does the Canon.

I have knowledgeable friends who shoot professional video by any definition of professional and use GH4's. They do not have secret relationships with companies that try to sell products by setting arbitrary standards in the interest of maximizing profits (nor do I); they are just professionals creating art and getting paid for the quality of their artistic product.


22 minutes ago, markr041 said:

Who is talking about REC709? REC2020 is already an official spec of HDR, this generation. The GH4/Shogun combo can do REC2020 color no worse than the Canon. The Canon beats it by a few stops in DR when the GH4 uses V-Log L (12 stops) (if Canon meets its goal of 15 stops), but the GH4 is equal in providing an expanded color gamut, color gradations, and color sampling, which are just as important for HDR video. In short, that combo is way beyond the specs of REC709. And 12 stops is within the specs of HDR10, an agreed-on standard for HDR.

The phrase "doesn't meet the standard for professional use" is an incorrect statement (not to mention arrogant), as there is no "standard" for "professional" use of HDR, although there are current standards for HDR (like HDR10), which the GH4 meets, as does the Canon.

I have knowledgeable friends who shoot professional video by any definition of professional and use GH4's. They do not have secret relationships with companies that try to sell products by setting arbitrary standards in the interest of maximizing profits (nor do I); they are just professionals creating art and getting paid for the quality of their artistic product.

Good for you. I have friends shooting and grading major features on Alexas and who are on the ACES board. And I recently worked as a consultant at Dolby for their new Dolby Vision workflows.

I'm glad your made-up HDR standard works for you and your friends with GH4s. Which can't deliver a full rec2020 image, unlike the C300 Mk II, but that's fine that you think it can. I'll let Technicolor know that they should throw out their research. It's true that there is no current standard beyond 10-bit and rec2020 (and a 1000-nit target) for HDR10, because those are display standards and not acquisition standards, and the lowest-end ones, which will soon be abandoned anyway. But it's fine, 28 Days Later is on Blu-ray, and that meets the 1080p standard... because it was up-rezzed to it. It still looks like shit.

I'm trying to provide some insight into the future, even possibly breaking NDA because I'm so enthusiastic about this forthcoming tech that I've been lucky enough to have demoed for me. And for those who are interested in where things are going, I want to offer up some advice so they don't make the wrong purchasing decisions. The standards being developed will require 15 stops and rec2020. At least.

But you're right. You and your friends have it figured out. I'll call Dolby up and tell them to quit it.


Balderdash. Where is the evidence that the Canon can deliver a full REC2020 image, or can't you reveal that? No camera can provide the full REC2020 gamut, my friend. It is next to impossible because of physics (some of my best friends are physicists). No TV in existence can either (check with your Technicolor friends), yet many TVs are still considered HDR, yielding pictures that anyone can see are superior to REC709 images.

We all benefited from your insights about future standards, less so from your arrogance about what is professional and what is not, and your absurd claims.  And everyone now sees that you are trying to influence "purchasing decisions" when you are a paid consultant to those selling products. Lucky you indeed.


Some of that is fair. But for those looking into a camera that will meet future HDR specs for YouTube, Netflix, etc., the C300 Mk II is the lowest-end option that has a good chance of being approved. That said, this site has never been about what standards others have approved, and instead about getting great results with what you have. For the money, the GH4 definitely offers a good image.

The rest... take it or leave it. We all have different goals. (Mine isn't further consultant work, it's simply that I think Dolby's standard is superior.) I don't have a crystal ball but I do have access to information that many people don't. I don't know enough about gamuts and chromaticities to claim any camera can cover any given gamut outside of what is printed in white papers, and it's clear from Canon's white papers that the C300 Mk II does not cover rec2020 in full (I don't believe anything does?). But its native gamut lines up very closely with rec2020 and it covers far, far more of it than most cheaper cameras, and was designed specifically to cover more of it than the competition.


HDR is very new and the standards are competing. Many HDR Blu-ray movies are made from Alexa 2.7K source material or even from 2K final edits. I think they create special effects like sunsets or shiny things artificially, because the original material doesn't have enough "power". It is also easy to add computer graphics to enhance HDR.

The 10-bit HDR HEVC file is rendered with the rec2020 profile, or gamut. Rec2020 is a very wide gamut, and no TV can show the extreme colors it contains. The best TVs can show about 80% of rec2020 and 98% of DCI-P3, which is the current cinema projection standard. No standard says that HDR must use all of the rec2020 colors; rec2020 is just a container. Maybe some day HDR will be able to show all of it.
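One way to see how much bigger that container is: compare the areas of the gamut triangles in CIE 1931 xy space, using the published primaries for each standard (xy area is only a rough proxy, since the space is not perceptually uniform):

```python
# Gamut triangle areas in CIE 1931 xy, via the shoelace formula.
# Primaries are the published (x, y) chromaticities for each standard.
GAMUTS = {
    "Rec709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":  [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def area(tri):
    (x1, y1), (x2, y2), (x3, y3) = tri
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ref = area(GAMUTS["Rec709"])
for name, tri in GAMUTS.items():
    print(f"{name}: {area(tri):.3f} ({area(tri) / ref:.2f}x Rec709)")

# Rec2020 comes out at roughly 1.9x the xy area of Rec709, with DCI-P3
# in between -- which is why rec2020 is described as a container.
```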

The GH4 can shoot about 12 stops of dynamic range, but nothing says how vibrant the colors it captures are. When using the GH4's 10-bit V-Log, you get only about 650 levels of the 1024 possible, because V-Log L heavily compresses the available range to match Panasonic's professional V-Log cameras. So it is like 9-bit video. Here is more about GH4 V-Log L:

http://www.provideocoalition.com/v-log-l-on-the-gh4-don-t-panic/
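The "like 9-bit video" comparison is just log math. A quick sketch, taking the ~650 usable levels figure from the article linked above as given:

```python
import math

usable_levels = 650  # approx. code values V-Log L uses out of 1024,
                     # per the provideocoalition article linked above
print(math.log2(usable_levels))  # ~9.34, i.e. effectively between 9 and 10 bits
```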

The professional cameras have much better sensors, and they record 4:4:4 12-bit or even 16-bit RAW video. When grading the final HDR video for the rec2020 container, the possibilities are much finer and better. The camera records very fine gradients and can capture a very wide color range. Normal life seldom contains very vibrant colors, so the colorist may want to enhance colors to get the HDR impact.


I apologize for any perceived attitude. I'm a camera enthusiast who still shoots as a hobby (I used to shoot TV professionally), but now that I'm working in post I get to work on some even higher-end projects with the most cutting-edge camera systems and lenses. I wanted to share some of the latest news that I was incredibly excited about, as well as my experiences with different camera systems. Every day I'm working with Alexa or Varicam footage shot with the highest-end lenses, which is especially fun for me since I'd previously shot with almost every camera system, but I get an even better impression working with the files in post and interfacing with people on the cutting edge, learning their preferences and prognostications. I never got to the level on set where I could shoot Ultra Primes next to Aluras one day and then C Series anamorphics the next, but I do get to work with that footage now on a daily basis. I get that all this (and my confidence in the people around me and what they say, not in my own opinions, which I try not to mix in; I apologize for my misunderstanding regarding the rec2020 color space*) isn't welcome here because of the perceived arrogance, and I won't be posting here anymore.

*That said, Canon's white papers do indicate a camera that fills most of the gamut (which naturally includes imaginary colors the eye can't see and so arguably needn't be filled), and we've seen before that a camera needn't resolve full "4K" to be 4K-compliant, yet Netflix does exclude from common use some cameras that claim a 4K spec. So there is a middle ground, and I believe this is where the divide falls. The advice I'm hearing from the top brass at top labs is that 15 stops and support for high-bit-depth, high-bitrate rec2020 will be what producers ask for from a camera when shooting HDR. Canon had HDR in mind specifically when developing the C300 Mk II and C700. That said, I will have to trust you that the GH4 can also fill almost the entire rec2020 gamut, as I have not read the white papers and my only experience with that camera is without an external recorder or V-Log, where it still performs well for the price. As regards the 12-stop spec in HDR10 for acquisition, that is news to me, and interesting to consider given that Dolby (where I consulted for free, simply due to my enthusiasm for the product; I have friends with financial interests, but I don't have one) is looking toward much higher numbers.


45 minutes ago, Policar said:

....because of the perceived arrogance, and I won't be posting here anymore.

Don't quit. I have read your writings with great interest. I will buy an HDR TV sooner or later, and I want to make HDR videos and photos. Together we can work out the best, most reasonable workflow for us consumers. I think it needs a better camera than the current consumer models. 10-bit HEVC is also kind of tricky with current editors and computers. We also need a new 10-bit-or-better photo standard; 16-bit TIFF is overkill.

