Everything posted by markr041

  1. Balderdash. Where is the evidence that the Canon can deliver a full REC2020 image - or can't you reveal that? No camera can provide the full REC2020 gamut, my friend. It is next to impossible because of physics (some of my best friends are physicists). No TV in existence can either (check with your Technicolor friends), yet many TVs are still considered HDR, yielding pictures that anyone can see are superior to REC709 images. We all benefited from your insights about future standards, less so from your arrogance about what is professional and what is not, and your absurd claims. And everyone now sees that you are trying to influence "purchasing decisions" when you are a paid consultant to those selling products. Lucky you indeed.
  2. Who is talking about REC709? REC2020 is already an official spec of HDR - this generation. The GH4/Shogun combo can do REC2020 color no worse than the Canon. Canon beats it by a few stops in DR when the GH4 uses Vlog L (12 stops) (if Canon meets the goal of 15 stops), but the GH4 is equal in providing expanded color gamut, color gradations and color sampling, which are just as important for HDR video. In short, that combo is way beyond the specs of REC709. And 12 stops is within the specs of HDR10, an agreed-upon standard for HDR. The phrase "doesn't meet the standard for professional use" is an incorrect statement (not to mention arrogant), as there is no "standard" for "professional" use of HDR, although there are current standards for HDR (like HDR10), which the GH4 meets, as does the Canon. I have knowledgeable friends who shoot professional video by any definition of professional and use GH4s. They do not have secret relationships with companies that try to sell products by setting arbitrary standards in the interest of maximizing profits (nor do I); they are just professionals creating art and getting paid for the quality of their artistic product.
  3. "The C300 Mk II with the new SLOG 2 upgrade is the cheapest thing on the market that is HDR-ready, and it was actually designed as such." Nonsense. The combination of Panasonic GH4 + Atomos Shogun Flame gives you 12-stops dynamic range (using Vlog L), 10-bit 4:2:2 and UHD, with extended color gamut at very high bitrates. That is about the same as the C300 Mk II ($12,000) provides at that resolution, for far less money. And don't think you don't need the Shogun or something like it for the Canon - the Canon viewfinders cannot show what HDR video looks in the field. The Shogun can, at 1500 nits and the appropriate HDR LUTs. Shooting in log at HDR specs but seeing only REC709 without LUTs is just not going to provide good HDR video. An appropriate HDR monitor is needed. Someone here evidently thinks shooting in log and underexposing is the correct way to shoot HDR; that is exactly the kind of thinking that reflects what you think you have to do when you cannot monitor HDR video in the field.
  4. 24p at best in 4K for an action cam (latest firmware). LOL.
  5. I don't understand why you used the word disingenuous when what you go on to say just repeats what I said - cameras that are labeled 4K don't achieve 4K resolution, just like TVs labeled HDR do not meet all the specs of HDR. I did not say all cameras or all TVs. You did add a lot of irrelevant info, which could be useful. Thanks for the praise.
  6. "I'm lost. There are displays that actually show up to 15 stops of dr now? Or still just an image with 15 stops compressed, but not as much? What does 10 bit have to do with hdr? Just another advancement that's buddying up with hdr in youtube and better screens? They definitely don't have to go hand in hand, right? A good hdr display still uses all of its dr with a video not made for hdr, right? Not just essentially lifting the blacks and everything to match a bad display? A video made IN hdr, when played on a normal screen, will clip the boundaries? Or compess it, giving it that flatter look? Will old content be poorly converted in order to appear to be hdr? Sorry.. last I heard on this topic was like "Wow, there's a display what has EIGHT stops of Dr!! The futuuuuuure!" so I'm pretty behind. I should try to figure it out on my own really, but if I'm missing something important, please share. Also are pretty much all 4k tvs these days that say hdr liars?" We are all a little lost since this is all new, but I have some answers. 1. You are right - "HDR" has really two advances and the specs require they go hand in hand - greater dynamic range (that most people are obsessing about) and better color. The better color reproduction comes from: 4:2:2 chroma sampling, 10bits (more gradations of a color), and more colors (wider color gamut compared to good old REC709). Now, many scenes in fact do not actually have much dynamic range, but almost all scenes have color. So I regard the improvement in color to be much more important (hence the HDR video I shot emphasized coloration (Fall trees)). 2. What does a true HDR video look like played on a normal screen - well, you have a normal screen, right? So here is a true HDR video: Looks flat. But if you forced your (fictional) true HDR TV to go into HDR mode, this video would magically appear in full color and full contrast. What YouTube does if the HDR video has the correct metadata is it converts the HDR video to SDR, so you see the video in full color (REC709). You have to ask them what they do exactly to convert. The other HDR video I posted in an earlier post (same exact specs as the one here), had the metadata, so you see it in full SDR color. But it is not the HDR version. 3. On lying about "HDR". When a camera says it shoots 1080 video, you know it never achieves the full resolution (DeBayering, pixel binning). When a camera says it shoots 4K, does it really achieve 3840x1920 resolution? - no (but it usually means it has higher resolution than Full HD). So lying is standard in video (do you call Canon a liar?). Same with TV's - do HDR TVs really give the full REC2020 color gamut? - none do. Do they have 1000 nits and above? almost none do. Are they 10bit - yes; do they have a gamut greater than REC709 - yes. Do they have a DR greater than REC709? - yes. So, you do get better color and DR compared with SDR. There, however, is one additional fib that is less benign and less standard. You see the same "HDR" logo, but then the fine print says "HDR compatibility". This just means that the TV will convert the HDR video to SDR. You do not see HDR at all (you don't see video that looks like the above, you get one that has color etc. but not in HDR). HDR converted to SDR.
  7. "It is not true HDR unless you have a 10bit graphics card and a 10bit HDR standard monitor or TV. How do you say to monitor that now comes HDR material? The display must have a HDR setting." I know that and everyone knows that (I produce HDR videos). What I am saying is that if you have all of the HDR viewing hardware it is not enough to see the HDR YouTube video in HDR. An HDR system - HDR TV with a switchable mode or a combo appropriate graphics card and 10bit monitor capable of REC2020 color - will not show an HDR YouTube video in HDR. You can only see the SDR version. The bottleneck is that you cannot get YouTube to switch to the HDR version. So if you have that equipment you can see the top video I posted in HDR, but you cannot see the bottom one, which has the correct metadata. You will see the REC709 video version in HDR mode, which looks awful.
  8. Let's get back to the topic of this thread: YouTube HDR. And the news is bad. Ok, so I uploaded my HDR video to YouTube because I want people to see a real HDR video (10+ stops, REC2020 color gamut, 10bit, 4:2:2) that is not a promo piece. And I followed YouTube's instructions, and they worked - YouTube converted the HDR video to SDR so that people without HDR viewers can see the video. That is what you are seeing above - SDR. Looks ok; YouTube did a good job converting. But it is REC709 all the way (8bit, 4:2:0, 5-6 stops, limited color gamut). Ok, how can you view the HDR version if you have an HDR-capable screen/TV/monitor? You cannot! There is no way within YouTube to tell it to show the HDR version instead of the SDR version. Does that mean it is impossible to see the HDR version in HDR? No, there is one way, and only one way - using the new $69 Google Chromecast Ultra. It will stream YouTube HDR videos in HDR. Nothing else will. It may be that YouTube has licensed the YouTube player in some Samsung TVs to also play its HDR videos; that is yet to be confirmed. This is the current sorry state of YouTube HDR. It is a proprietary system.
  9. Here is my first true YouTube-compliant 4K HDR (10bit, 4:2:2, REC2020, 12-stop) video. You will see it translated to SDR (REC709) if you have an SDR viewing device, and in HDR if you can watch in HDR: Panasonic GH4 10bit 4:2:2 to Shogun Inferno, graded in Resolve 12.5 in HDR. Output as DNxHR 444 12bit, with injected HDR metadata.
  10. Yes, you are not watching the video in HDR mode. Even if you had an HDR viewing device, YouTube is not communicating to it that the video is HDR. When viewers force their HDR TV into HDR mode, the colors emerge. Any nice-looking videos that claim to be HDR that you see in SDR are not HDR. In HDR mode the video looks better than the SDR version I also posted; on SDR devices the HDR video looks flat and colorless. HDR seen on SDR screens is not like 4K seen on lower-resolution screens, which looks more than fine.
  11. REC709 version of the HDR video mentioned above: https://vimeo.com/190925903 HDR version uploaded to YouTube: https://youtu.be/0c--s8UMb0g Recorded from the GH4's 10bit 4:2:2 Vlog L 12-stop HDMI stream to an Atomos Shogun Inferno in ProRes HQ. Edited in Resolve in REC2020 with Resolve HDR color management and rendered in 12bit DNxHR 444. Monitored in HDR on the Shogun Inferno (10bit, REC2020, 1500 nits).
  12. I produced an HDR video, edited it in Resolve, and followed YouTube's instructions to the letter. When I uploaded the 10bit 4:4:4 REC2020 DNxHR video, with HDR metadata injected by Resolve, YouTube did not recognize it as HDR. It did transcode it and it can be streamed, but it is all washed out in SDR, and viewed on a TV in HDR mode it is colorful in an odd way compared to the original. Early days. (A tagging sketch after this list illustrates the kind of color signaling involved.)
  13. Shots recorded simultaneously (in camera) at 100 Mbps XAVC S and externally via HDMI by the Shogun Inferno using ProRes HQ 4K. The Slog2/SGamut clips were then color graded with exactly the same settings and combined sequentially. Rendered in XAVC Intra, which is a 4:2:2 codec. Atomos claims that even at 8-bit, shooting in 4:2:2 rather than 4:2:0 makes a difference, and arguably the extremely high bitrate of ProRes HQ avoids the macroblocking that some claim to see from XAVC S even at 100 Mbps. This video has plenty of colors and details, including moving leaves that are hard on long-GOP codecs. See any differences? Would 10 bits make any difference to these grades (see any banding)? (Some raw data-rate arithmetic after this list shows how much more information 4:2:2 and 10-bit carry.)
  14. Thanks. I shot between 50 and 100mm, and because it was dim, it was mostly wide open at the maximum aperture for those focal lengths. Dual IS seems to take care of any jitters.
  15. OK, here's a video with lots of skin, with my favorite settings for exclusively outdoors:
  16. This was shot using the Lumix 35-100 f2.8 lens handheld, but using dual IS on the GX85:
  17. "4K @ 60Mbps is not good at all, DJI should give an option of selectable bitrate from low to high 100-200Mbps 4K @ 60Mbps is the best the GoPro 5 offers too. Btw, the Sony X1000V and the new Sony X3000 offer 4K @ 100Mbps. And you can also select 60Mbps.
  18. "10bit 422 internal and 4k 60fps" Yes, but not together. 10bit 422 internal is for UHD at 30p. Otherwise 8bit 420 internal. What is not clear is whether there is 10bit 422 external for UHD 60p.
  19. (Berlin, September 15): Panasonic today announced that the new GH5 will be completely wireless - no headphone jack or mic jack. A spokesman said "We consulted Swedish video and audio experts, and they all agreed that analog audio ports were a thing of the past. They thought wireless micing was the way to go. Since they know more than anyone about these issues, we at Panasonic decided, given the go-ahead by Apple, to save money and space on the camera with this initiative." Panasonic will also be coming out with a new line of wireless mics that will have rechargeable batteries. Battery life was not available at this time. Industry enthusiasts who devote a lot of time to on-line forums welcomed this advance.
  20. "Sony really should make it easier to find information and LUTs for grading log material from their cameras. When I first started using the A7s I couldn't find a proper slog2 to Rec709 LUT anywhere that worked correctly. I spent countless hours looking for a neutral converstion LUT and I knew exactly what I was looking for. The slog curve is not the problem the problem is the sgamut color space that is the default on the camera when shooting in log. That colorspace must be converted to Rec709 via a LUT or color transformation matrix otherwise you can get funky colors." This is bizarre. There is no default gamut. Sony supplies some example "Picture Profiles." One of them happens to use the Slog gamma curve and SGamut. Those "profiles" are just examples, for beginners to try out. You are free to combine the log gamma and the REC709 color space or any of many combinations of curves and color settings. And within any of those there are multitudes of tweaks - knee setting, sharpening, etc. It is ok to say one has a hard time getting the color one wants from a Sony camera. It is another to blame it on Sony, given the number of options it provides. On a more positive note: Sony Catalyst Browse will create a LUT that converts the combination of the log gamma curve and SGamut (or any of its combos) to full REC709 automatically, which you can use in your favorite editing software. It is free.
  21. Mr. Burling said: "Of course they do. They know more about it than most (including you) will ever do. But you misread my post. I suggest not having the defensive glasses on and not apply your needs and standards on others. They are real audiophiles and could care less about the 3.5mm jack in their phone since they would never use the phone in the first place when quality is of concern. They look for "good enough"." I like your posts and videos, but every once in a while you get nasty and personal. You ought to check that. You also missed the point of my later post: having the choice of an analog port and wireless in a phone beats not having the choice. Apple is the one that imposed their values on all of us by removing something that some value. I just advocated leaving a choice, and noted why the analog port has some merit (unlike micro-USB compared to USB-C in the false analogy) and why wireless has some pitfalls. I am guilty of defending choice. And, no, "real" audiophiles by definition do not look for "good enough". Obviously you are not one of them. Audiophiles look for the best, taking into account (for most) expenditure constraints. For audio, Apple is not the best.
  22. Wireless is convenient for some; an analog port is convenient for others (no charging of headphones, smaller headphones). And some would like to listen to good-quality audio, which is not possible over wireless by the standards you all apply to video here (get it - some care as much about audio quality as you care about video). Those folks who "work with audio/video on a daily basis so they know all about quality" obviously do not care about audio quality from portable devices, or do not know about wireless technology and audio. Samsung phones offer both options; they do not cost more and are not bigger or heavier because of the analog port. How is that not superior? Note that the latest LG phone touts its audio quality - so they think there is a market for that. Apple evidently does not care about audio quality. They indeed led the way to highly compressed, artifact-filled audio.
  23. Ports: There is no analogy between going from micro-USB to USB-C and going from an analog audio port (wired connection) to wireless. USB-C is superior in every way to micro-USB (or USB 3.0) - it allows faster data transfers, it is easier to use (symmetric port), and it has additional features. The new technology here totally dominates; go for it. However, the new technology replacing the "50-year-old" analog audio port is inferior in every way. It requires wireless transmission that cannot pass as much data as a wire, which cripples the audio quality quite audibly (this is the opposite of USB-C versus micro-USB). Whatever the quality of the audio source, wireless adds further degradation via compression. We all know what compression does, right? Moreover, now you have to use powered headphones. That makes them larger and requires paying attention to charging and recharging; really inconvenient for long flights. A wireless option is fine, but requiring it is a reduction in both quality and convenience. (The bitrate arithmetic after this list puts numbers on this.)
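
A note on post 6: to check what color signaling a given video file actually carries (bit depth, primaries, transfer function), ffprobe can report it. This is a minimal sketch assuming ffprobe (part of FFmpeg) is installed; the filename is a placeholder.

```python
import json
import subprocess

# Ask ffprobe for the first video stream's color signaling. "clip.mov" is a placeholder.
result = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=pix_fmt,color_primaries,color_transfer,color_space",
     "-of", "json", "clip.mov"],
    capture_output=True, text=True, check=True,
)
stream = json.loads(result.stdout)["streams"][0]
print(stream)
# An HDR10-style master typically reports something like:
#   pix_fmt: yuv422p10le (10-bit), color_primaries: bt2020,
#   color_transfer: smpte2084 (PQ), color_space: bt2020nc
# A plain REC709 file reports bt709 (or nothing at all).
```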
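
A note on post 12: the kind of color signaling involved can be illustrated with an FFmpeg encode that explicitly tags BT.2020 primaries and the PQ (SMPTE ST 2084) transfer. This is only a sketch of the relevant flags, not a statement of YouTube's requirements or of the Resolve export described above; filenames are placeholders and a 10-bit-capable libx265 build is assumed. Whether a platform then treats the file as HDR can also depend on additional mastering metadata, which is a separate step.

```python
import subprocess

# Re-encode a graded master to 10-bit HEVC and explicitly tag BT.2020 / PQ.
# Filenames are placeholders; this is an illustration, not a recommended delivery spec.
cmd = [
    "ffmpeg", "-i", "graded_master.mov",
    "-c:v", "libx265", "-pix_fmt", "yuv420p10le",
    "-color_primaries", "bt2020",
    "-color_trc", "smpte2084",   # PQ transfer (SMPTE ST 2084)
    "-colorspace", "bt2020nc",
    "-c:a", "aac",
    "hdr_tagged.mp4",
]
subprocess.run(cmd, check=True)
```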
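
A note on post 13: some back-of-the-envelope arithmetic on how much raw information the different formats carry per UHD frame, before any codec gets involved.

```python
# Uncompressed bits per UHD frame for different chroma subsamplings and bit depths.
W, H = 3840, 2160

def bits_per_frame(bit_depth: int, chroma: str) -> int:
    luma = W * H
    # 4:2:2 halves chroma horizontally; 4:2:0 halves it horizontally and vertically.
    chroma_samples = {"4:2:2": 2 * (W // 2) * H, "4:2:0": 2 * (W // 2) * (H // 2)}[chroma]
    return (luma + chroma_samples) * bit_depth

for depth, chroma in [(8, "4:2:0"), (8, "4:2:2"), (10, "4:2:2")]:
    print(f"{depth}-bit {chroma}: {bits_per_frame(depth, chroma) / 1e6:.0f} Mbit per frame")
# 8-bit 4:2:0:  ~100 Mbit/frame
# 8-bit 4:2:2:  ~133 Mbit/frame
# 10-bit 4:2:2: ~166 Mbit/frame
```

At 100 Mbps and 30 fps, roughly 3.3 Mbit are available per frame, so the codec is compressing that raw data by around 30:1 either way; the extra chroma and bit depth matter most when a grade pushes the image hard.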
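
A note on post 20: once you have a conversion LUT (for example one exported from Catalyst Browse as a .cube file), one way to apply it outside an NLE is FFmpeg's lut3d filter. A minimal sketch; the filenames and the LUT name are placeholders.

```python
import subprocess

# Apply a 3D LUT (e.g. an SLog2/SGamut -> REC709 conversion exported as a .cube file)
# to a clip using FFmpeg's lut3d filter. Filenames are placeholders.
cmd = [
    "ffmpeg", "-i", "slog2_clip.mp4",
    "-vf", "lut3d=slog2_sgamut_to_rec709.cube",
    "-c:v", "libx264", "-crf", "18",
    "-c:a", "copy",
    "rec709_preview.mp4",
]
subprocess.run(cmd, check=True)
```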
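
A note on post 23: the bandwidth point in rough numbers, comparing uncompressed CD-quality PCM over a wire with the nominal maximum bitrates of common Bluetooth audio codecs. The Bluetooth figures are typical published values, used here only for a rough comparison.

```python
# Wired headphone output carries uncompressed PCM; Bluetooth audio is lossy-compressed.
sample_rate_hz = 44_100
bit_depth = 16
channels = 2

pcm_kbps = sample_rate_hz * bit_depth * channels / 1000
print(f"CD-quality PCM over a wire: {pcm_kbps:.0f} kbps")  # ~1411 kbps

# Typical maximum bitrates of common Bluetooth codecs (nominal figures).
bluetooth_kbps = {"SBC (high quality)": 328, "AAC": 256, "aptX": 352}
for codec, kbps in bluetooth_kbps.items():
    print(f"{codec}: ~{kbps} kbps, about {pcm_kbps / kbps:.1f}x less than uncompressed wired PCM")
```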