Reputation Activity

  1. Like
    Axel got a reaction from kidzrevil in HDR on Youtube - next big thing? Requirements?   
    Meanwhile, on Germany's huge slashCAM forum: I started a controversial HDR thread there - the first one, apart from one by a guy who asked how best to achieve a working YouTube upload. Very few fans there as well, and the usual suspects, like here. I said HDR would change the way films are lit and framed (finally understanding the vanHurkman thoughts) and was barracked: HDR was for nature docs and candy ads, not suitable for serious storytelling. I wrote, what about *light*? There is no light in SDR, only its faint reflection. Answer (by a pro DoP): that's enough, brightness is relative. HDR is a gimmick like 3D was, and it will disappear just the same, get over it ...
    Another article in the news section: in 2017, UHD TVs bigger than 50" reached 40% of all households in Germany; AUO announces 8K TVs for 2018 and projects a 10% market share within two years ...
  2. Like
    Axel reacted to markr041 in 4K Videos Shot for HDR Examples   
    HLG allows more room for highlights. Night video has bright lights surrounded by dark. With little DR (as in SDR), either all the lights are blown out (and thus lack color) or, if shot to avoid blowouts, there is zero information in the dark parts.

    So here is an HLG video of a street scene at night:
  3. Like
    Axel reacted to Matthew Hartman in HDR on Youtube - next big thing? Requirements?   
    Good to know. I'm still getting acquainted with this format. You've shed light on the subject a few times, much appreciated. 
  4. Like
    Axel reacted to jonpais in HDR on Youtube - next big thing? Requirements?   
    @Matthew Hartman The colors can be as subtle or as saturated as you like. Chef’s Table on Netflix is an example of HDR used to great effect for documentary work.
  5. Like
    Axel reacted to Matthew Hartman in HDR on Youtube - next big thing? Requirements?   
    I can see why people would be attracted to HDR technology. Colors pop like nobody's business. It creates a rich viewing experience, as opposed to simply watching something. And I'm all about the tech, propelling the industry forward and questioning those old conventions that were built on misconceptions. 
    That being said, I'm on the fence. Is it possible to create a film that's so vivid and rich in detail and color that it takes your audience right out of the story?
    I'm not sold yet that this is a good platform for narrative work. For advertising, it can't come soon enough. This is going to make product shots absolutely pop and scream "buy me immediately". 
  6. Like
    Axel reacted to jonpais in HDR on Youtube - next big thing? Requirements?   
    Late Q2 or early Q3. Sony and Panasonic each have a 5% stake in JOLED, which makes the panel for this monitor. It's 2018, and still no OLED monitors are on sale to the general public. The 22” Asus will probably go for thousands, even though a top-of-the-line 55” OLED TV sells for $1,500. It's got USB-C ports, which is phenomenal, but dollars to donuts not Thunderbolt.
  7. Like
    Axel got a reaction from jonpais in Vimeo x Youtube high bit depth workflow ?   
    Well, yes, I suppose it IS a bug. Just tried it now: in the same project where nits/cd/m² were available before (in the scope settings dropdown menu), these are now greyed out. Feedback sent. They have to iron out some really annoying things ASAP, I think.
  8. Haha
    Axel got a reaction from jonpais in Vimeo x Youtube high bit depth workflow ?   
    Round-tripping would be an option, but if I remember correctly, 10-bit, HLG and 4K (the limit is UHD) clips from the GH5 are all unsupported in the free version of Resolve. And it makes no difference whether you created Optimized Media (= ProRes) beforehand or not, because Resolve will refer to the original media in the round trip. Your only free option is to create ProRes copies outside of FCP X. Or, though this is a little awkward, extract the Optimized Media from the Library bundle and re-import it into a new Library. I also feel it would be better to work in ProRes HQ, and you can't choose that for Optimized Media.
    Is the fact that the histogram in the FCP scopes doesn't switch to nits in an HDR Library a bug? Another color-related bug? I'm not sure. Note that in this Ripple Training video the histogram isn't even mentioned; it's the waveform in RGB parade:
    And, apart from that, the known Color Wheels issue (behaving weirdly in rec_709) isn't an issue for HDR/rec_2020. And the comparatively simple workflow described above is for the faint of heart. 
    (this is redundant for jonpais, I know)
  9. Thanks
    Axel reacted to markr041 in 4K Videos Shot for HDR Examples   
    By videos shot for HDR I mean videos shot with an emphasis on scenes that would benefit in particular from the extended DR of HDR, especially highlights.
    Here are three examples, shot in 4K using HLG with REC2020 color and uploaded to YouTube so that those with HDR viewing devices can view them in HDR, and those without, in SDR. But because these videos emphasize highlights that are best captured by HDR, the SDR view will miss the point (to exaggerate: like looking at a picture of a rainbow in B&W). The SDR versions will look perfectly fine, but they will be far less interesting. To view in HDR, if you can, you must go to the YouTube site by clicking on the YouTube logo.
    First up (already posted in the contentious YouTube HDR thread) is a video shot on a sunny day after a snowstorm. White snow expanses in bright sun are almost blindingly bright in the real world, ideal for capture in HDR:
    Next are two videos shot at the golden hour. The bright, golden light reflected off surfaces and coloring objects is again captured well in HDR (white surfaces reflecting bright light are the most impressive). 
  10. Like
    Axel got a reaction from jonpais in HDR on Youtube - next big thing? Requirements?   
    SDR, to put it that way, has to compose an image with rough strokes. Shades, lights and colors are there, but the differentiation is too weak to be subtle and seductive. Vignettes, sDoF, background motion blur and *more cuts* are used instead.
    With HDR (and of course UHD), the viewer's gaze can linger. The take can last longer, it can be a wider shot, containing more detail. The rarer the cuts, the more powerful they are. They become lean-forward, hold-your-breath moments. 
    Yes, I can see how HDR may change the way films are made. These are creative possibilities indeed. Not yet explored, and there's a lot to learn.
  11. Like
    Axel reacted to markr041 in HDR on Youtube - next big thing? Requirements?   
    Speaking of inaccuracies! I think you missed the point. There is HLG content now - in fact, right in this thread. And many cameras can now create HLG video directly; it's becoming a standard among creators.
    This whole forum is about creating video, not watching Hulu, or Netflix, or Vudu (or Voodoo). Go to some TV forum, wherever that is, to make your case for sticking with your outmoded TV. That forum is for watchers and couch potatoes, not creators. There is plenty of whining there about standards and which service providers supply which flavor of HDR. You can also argue about which HDR mode is better!
    What is relevant here is that if you get an HDR TV that has HLG or HDR10 you will be able to watch HDR videos you create and those created by many videographers all over the world, including some posting here. By not getting one you will be stuck in SDR land, inclusive of any videos you create. You just have to shoot in some log gamma with an extended color gamut to create HDR videos, and you may already be shooting in those modes (when you are not watching Amazon Video or Netflix). But you won't create HDR videos as long as you can't watch them.
    So, you are missing out creatively; and the fact that you do not have an HDR viewing device means you are unable even to make an informed decision as to whether shooting in, let alone viewing, HDR is worth it. So your SDR TV means you are missing a lot, unless your only purpose is to watch commercial TV. Too bad. 
  12. Thanks
    Axel reacted to jonpais in HDR on Youtube - next big thing? Requirements?   
    1) Our cameras have been shooting HDR for years
    2) We compress our files to rec.709 because that has been the HDTV standard since 1990
    3) Rec.709 has nothing to do with the ‘film look’
    4) Under the best of circumstances, rec.709 is capable of only 6-1/2 stops of dynamic range
    5) HDR allows us to see a much wider dynamic range and more color, closer to what the human eye can see
    6) OLED televisions offer the best image quality today, superior even to what industry standard reference monitors costing $50,000 were capable of just a few years ago
    7) The highest-rated HDR TVs are now selling for little more than the cost of a midrange digital camera 
    8) YouTube and Vimeo, recognizing the demand for HDR content, have both rolled out the format on their video sharing platforms
    9) Nearly all NLEs now support HDR, including DaVinci Resolve and Final Cut Pro
    10) Compared to SDR, the extra storage space and computer processing power required for HDR capture and finishing is negligible
    11) A client who sees the HDR grade will not be satisfied with the SDR (Arri)
    12) To future-proof your videos today, it is best to shoot in either RAW or Log formats
    13) Filmmakers are not the only ones pushing for HDR: gamers are also demanding more HDR content and displays. It also has uses in the medical field
    14) Atomos monitor/recorders may be used for gauging exposure on set and as grading monitors in the grading suite (this is the sticky part)
    15) HDR videos uploaded to YouTube are sharper-looking, have far fewer artifacts and macroblocking and far richer contrast than SDR videos
    16) There will be growing pains. When I purchased my 2016 MacBook Pro, few peripherals had USB-C connectors. Now they are becoming commonplace.
    There really is no answer to those who say they prefer watching movies on their 2011 13” MacBook Air at 540p with the audio turned off. If that’s how you view films, fine. As far as early adoption goes, I could also wait for the remaining 53% of the world’s population to get internet access before uploading videos to YouTube. There are also a few who say it’s not ‘true’ HDR yet. The same could be said of the ‘fake’ 4K video shot with my GH5 or any other consumer camera. If I kept waiting for the perfect camera to come out, I’d never shoot a single clip.
  13. Like
    Axel got a reaction from Mattias Burling in How stills killed casual video for me   
    The following are not my words, I quote and compile the thoughts of others, famous photographers as well as filmmakers. 
    When a filmmaker is reborn, he becomes a photographer. When a photographer is reborn, he becomes a filmmaker (a filmmaker who started as a photographer).
    You freeze one moment of eternity (said by a famous run & gun photographer).
    I compose an image in my mind. It already exists when I push the release (famous photographer).
    A powerful sequence is made of imperfect images. There must be something missing and unsatisfying in every single shot (renowned DP).
    I never shoot anything I don't want (filmmaker).
    There comes a moment when it is no longer you who takes the photograph, but receives the way to do it quite naturally and fully (famous portrait and nature photographer).
  14. Like
    Axel reacted to Rodolfo Fernandes in How stills killed casual video for me   
    When I started shooting (stills), all I did was grab the camera, walk around the city where I live, and look for interesting things happening around me - and I do remember the feeling I had when I actually captured something nice! At the time I was using medium format, so I only had 12 chances to get something awesome. I really miss the ritual of going out shooting and then developing those rolls just to find out what came out of them. They weren't always the best pictures one could take, but it was my own personal ritual, which I think I will go back to.

  15. Like
    Axel reacted to Mattias Burling in How stills killed casual video for me   
    This is just a little something I shot on the subway, absolutely nothing special. But it made me think.
    I used to shoot a lot of casual video on the streets. But ever since I became passionate about stills, more specifically street photography, I never do it. Mostly because when recording video I feel like I'm just missing out on great stills. And this was no exception. In fact, it's been 1.5 years since I shot casual video out and about, if we don't count "test shots" around the house.
    The irony is that I started shooting street photography to become a better videographer. I used to love shooting video. The best part for me was composition and grading. So I started taking RAW stills in order to practice grading in bigger quantities. The plan was to eventually make a video about it and recommend that all aspiring videographers shoot a lot of stills - just like the DPs in the Zacuto shootout say (I forget which one).
    But along the way it completely switched on me. I started liking stills more. Especially the challenge: to capture a whole scene, the entire atmosphere, all the feelings in one single frame. Compressing an entire story into one single moment. 
    I'm not in any way implying that I'm good at it. But I love trying.

    Now when I watch vlogging street shooters on YouTube - you know the type, GoPro in the hot shoe and a montage to hip-hop beats - I just get annoyed. In their video montages they miss out on so many great images. 
    Same thing for me today. In this short video montage of nothingness I see at least four potentially great images.
    So, that's it. I still recommend that all the aspiring DPs here shoot more stills. With something like a GR II you can always have it ready. Look for light, composition and story.
    Just be aware that you might switch to the dark side like I have. 
    Don’t underestimate its power.
    What about you guys? Who shoots stills, why, why not?
    (BTW, really soft lens in action here. It looked awesome with Gorilla Grain on my computer, but I removed it since YouTube always messes it up, IMO. Therefore it's pretty darn soft. A couple of focus misses don't help either.)
  16. Like
    Axel got a reaction from jonpais in HDR on Youtube - next big thing? Requirements?   
    It's time to sum up my findings on the topic so far and invite you to comment.
    First of all, I bought an HDR TV, finally. There is no risk in doing that, and it doesn't have to be about 'adopting new technology'. Everyone here should do this. You will watch content in better quality, any content, including your own.
    Now I can answer my own question, which I asked a couple of times in this thread: does SDR on an HDR display look better or worse?
    Better, in most cases. With one exception: LOG footage that has been graded but is still somewhat flat. Some may find it aesthetically pleasing in SDR; on an HDR TV, you'd simply say it lacks contrast. NOT good.
    HDR does look better. Unfortunately, it's not easy or cheap to monitor and grade it right now. This will change. As we discuss this, not a single TV panel for SDR is being manufactured anymore. It's the future. Our rec_709 stuff doesn't become unendurable overnight; that's the good news. 
    Okay, you may say, it's like the jump from HD to 4K: we kept up with the times, but HD is still okay. Wrong. HDR will change much more. It may prove that your current equipment is not suitable for achieving the level of quality you demand and see elsewhere. That's the bad news.
  17. Like
    Axel got a reaction from Mark Romero 2 in A6500 AF, quick question :)   
    No, I can't tell "with confidence". I've only ever made 5-6 test shots with slomos. One was a fast jump to my microphone shock mount on the table, from 6 feet away to close-up, AF set to center. With this 5x slomo, it nailed the focus perfectly in every frame. But then, the subject was black with sharp outlines and lying on bright wood. 
    What this test showed clearly was that the image quality is really poor then. That's why I didn't do more slomo tests.
    AF is good in general, but it all depends on the mode - if you choose the right one. The most reliable modes are those which involve tracking, for example face detection. They can be called perfect. But there are caveats there as well; see Max Yuryev's and Jason Wong's tips on YouTube. A vlogger's face may be constantly sharp as he or she moves forwards and backwards, but there may be perceptible breathing of the out-of-focus background, and this also depends on the AF speed and (here it gets complicated) on the lens used.
    Fact is, the AF works best when it has little to do: when the lens is stopped down to its maximum resolution and the focal length allows hyperfocal distance. What's the point then? 
    Here is a real-world example from a wedding videographer's view. It's a bright day and you are outside. In 2160p, the display gets so dark you really can't see your image!  In AF mode, there's no peaking to check critical focus. So what I do is either focus manually through the viewfinder (with IBIS and OIS you need only the most basic rig, like a more comfortable *grip*, if anything at all) or, if I am using a gimbal, use a bright field monitor (e.g. the SmallHD Focus, which has peaking in AF mode). Touch AF doesn't work then, for obvious reasons. I then stop down the aperture to - typically - f/7.
    Yes, I do lose nice DoF effects, but the AF won't fail then. One of the biggest advantages of having a camera with good low-light capabilities is being able to use, say, f/5.6 in an available-light situation. The same goes for shooting at 120 fps: you need five times the light for that. Then again, since the HD quality of this camera is nothing to write home about anyway, you'll want to use a lower ISO to get a cleaner image.
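    The "five times the light" figure above is just the frame-rate ratio. A tiny sketch of the arithmetic (my own, assuming a 180-degree shutter, so per-frame exposure time scales inversely with frame rate):

```python
import math

# Hedged sketch (my own arithmetic, assuming a 180-degree shutter, so
# per-frame exposure time scales inversely with frame rate).
base_fps, slomo_fps = 24, 120
light_ratio = slomo_fps / base_fps   # each frame gets 1/5 of the light
stops_lost = math.log2(light_ratio)  # what aperture/ISO must compensate

print(light_ratio)           # 5.0
print(round(stops_lost, 2))  # 2.32 stops
```

    In other words, 120 fps costs a bit over two stops on its own, which is why good low-light capability matters here.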
    Lenses: too many aspects to consider. If good and reliable AF is essential, use good Sony lenses. Or perhaps the new Sigma 16mm, the reviews suggest it's the perfect gimbal lens for the A6500.
    There is no easy answer ...
  18. Like
    Axel got a reaction from Matt James Smith ? in Is FCPX still trash?   
    Yes, of course! Good explanation. If it's the last clip (appended at end), you could also hold p and slightly move it to the right (produces a slug). That'd be the fade-in. Then comes Matthias:
    This will fade out.
  19. Thanks
    Axel reacted to Matt James Smith ? in Is FCPX still trash?   
    @Andrew Reid if you want to fade to black, you need some black to fade to. 
    Snap the cursor to the beginning/end of your clip and hit Alt-W. That drops in a slug. Now you can Cmd-T (or drag any transition) onto your clip and it will transition to black. Simples.
    You have to stop thinking of the timeline as a static, empty canvas. It's more like a sculpture in space. The magnetic timeline is made up only of the elements you add and their interrelationships.
  20. Like
    Axel got a reaction from hyalinejim in The "Annihilation" of Paramount Pictures   
    If it's *just* about money, if the only votes that count are those of the shareholders, this industry is doomed. That applies to everything: if we are measured by the degree to which we can be exploited, our kidneys will be sold and the rest becomes soap. 
    Watch The Cooler. The old casino mafia ruled this frivolous business with cruelty - and passion. Then the bankers appeared and took over. And the world turned to shit.
  21. Haha
    Axel got a reaction from jonpais in Is FCPX still trash?   
    You can save the keyframed opacity as an effect preset for fade-in, fade-out or both at once. Double-clicking will apply either of them. You can lift a clip from the primary storyline and apply Cmd+T to either the left or right bracket or the whole clip. 
    You are married to Premiere, Andrew. Move on.
  22. Like
    Axel got a reaction from Stathman in Is FCPX still trash?   
    Though you can do that (as Deadcode wrote), it's not recommended. FCPX is not good with clips that are isolated from their card-structure bundle. Through the import dialog, FCP will copy selections from the card by wrapping them in a QuickTime container. That's the way it works. But keep in mind: 
    1. because of the skimmer, you can browse the card very quickly, find your shots and import only selections - in case you think you need to save disk space.
    2. you can alternatively just import everything and do the culling and organization later on (you did it in the Finder? - waste of time). This forces you to copy from the card, into the Library or to an external folder. Linking to clips without copies ("leave files in place", the Premiere way) is possible (by copying only the MTS files to disk or by drag & drop), but you shouldn't do that!
    3. either way you can start editing while the clips are still being imported = in effect that's editing directly from card.
    The concept of the primary storyline: in this (E, W, D, Ctrl+Shift+Y) you create the spine of your narration. If you have speech (voice-over comments or an interview), you put that in the primary storyline, and all illustrating clips are then connected (Q). For a music video, you'd first select the music track and hit Q. From there on, you connect everything to the black slug. Connected clips can move freely (no need to use P). The magnetic timeline is locked. To lock a connected clip in its position relative to the music, just overwrite it to the primary storyline by hitting Cmd+Alt+Down.
    The following 10.3 show is staged, of course. But it's done by a master. You think FCPX is inferior to Premiere? Watch it from beginning to end, particularly what he does in the timeline, and imagine what an ordeal this would have been in a track-based timeline:
  23. Like
    Axel got a reaction from hyalinejim in What was the first professional camera to shoot LOG gamma?   
    I'm afraid I'm about to make another TLDR posting ;-)
    What's meant by Dynamic Range?
    Someone above said common rec_709 displays could show 7 stops of light at best. This is the standard. Let's therefore agree to call it SDR.
    Many modern cameras can record 10-13 stops. A linear recording (picture-profile related) would (hypothetically, see below) only be able to store an excerpted range of nine stops. Due to broadcast rules, values below 16 and above 235 were considered 'illegal', because they couldn't be faithfully reproduced. That's the background to '7 stops of light at best'. 
    But none of the non-LOG profiles bakes in a truly linear curve, because that would look terrible. They reserve most of the values for the midtones and rob some of the wasted 20 superwhite values. The image of a standard picture profile - often called Standard - tends to look punchy as well as natural. Most of the time this is actually what one wants to achieve.
    There are also profiles that favor skin tones ('Portrait' - for Sony mirrorless, the creative style Autumn Leaves has become popular because it additionally has a color shift complementary to skin), landscapes and so on. I don't know the exact numbers, but let's assume there would be 50-60 values reserved for the skin range.
    A LOG quantization curve tries to distribute more or less (there is usually a knee for superwhite) the same number of values for every stop of light within that 256-value scheme. For the Sony A7S II, which claims to record 14 stops in S-Log3, this would mean 18 values for each stop (it may be somewhat more complicated, but for the sake of simplicity, let's agree on that).
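    The arithmetic behind those numbers can be sketched in a few lines (my own back-of-the-envelope code, not any camera's actual transfer curve; real S-Log3 is more complicated):

```python
# Back-of-the-envelope sketch (my own arithmetic, not an official
# transfer curve): how 8-bit code values get distributed across stops
# under a linear encoding versus an idealized log encoding.

BITS = 8
LEVELS = 2 ** BITS  # 256 code values in an 8-bit signal

def linear_per_stop(stops):
    """Linear encoding: each brighter stop doubles the light, so the
    brightest stop alone consumes about half of all code values."""
    values, remaining = [], LEVELS
    for _ in range(stops):
        half = remaining // 2
        values.append(half)
        remaining -= half
    return values[::-1]  # darkest stop first

def log_per_stop(stops):
    """Idealized log encoding: the same number of values per stop."""
    return LEVELS // stops

print(linear_per_stop(9))  # darkest stops get almost no values
print(log_per_stop(14))    # 256 // 14 = 18, the figure quoted above
```

    The output makes the point: a linear curve leaves the darkest stops with a handful of values, while the idealized log curve hands every one of the 14 stops the same ~18.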
    One term that recently came up in conjunction with the new GH5S' dual ISO is *usable dynamic range*. Tom Antos' tests to verify the actual dynamic range of cameras showed that all LOG recordings capture the noise floor too, so one has to subtract one stop for the shadows. But there are limiting factors for the highlights too. If a daylight sky's gradient shows banding in S-Log3 in a graded rec_709 version - where in-between values are interpolated with floating-point accuracy in post - one can imagine how this will affect HDR. 
    Admittedly I am not a broadcast engineer. But taking all of the above into account, I'd say that 10-bit or more is needed to really extend the DR. Please comment and correct my arguments.
    Now for a subjective point of view:
    We still live in an SDR world. Seven stops. Light is just white, a dull color mix of RGB. 13 stops of light crammed into rec_709 - lifting shadows, preserving highlights - results in the kind of artificial-looking images we often see when people record in LOG. If light is the subject, if it's prominent in the image, I consider this counterproductive and pathetic. It's better to let the highlights clip, to let the 100 IRE white eat away the detail. I found an image to demonstrate what I mean:

    No, but the unwritten law to never let the highlights clip is stupid, IMO. Light is not just an incidental part of the environment, it's an epiphany. I like a punchy image more than an expressionless one. Oh, Larry, well done! I really see detail in the clouds, terrific!
  24. Like
    Axel got a reaction from webrunner5 in Panasonic GH5S 4K / 240fps low light monster   
    The reasoning behind this:
    The GH5S has a bigger sensor than the GH5 (17.3 mm x 13 mm), but not as big as that of my A6500 (23.5 mm x 15.6 mm). Low-light performance with my camera is still better. But I don't care at all about low light.
    Dynamic range is also slightly better: 13 stops (Sony) versus 12 stops (Panasonic). 
    The Pocket (as well as the Micro, which has the advantage of 50/60p @ 1080) has 13 stops. 
    People have now introduced the term *usable dynamic range* in connection with the cleaner shadows of the GH5S. But there is another side to this: gradation of the highlights. You'd usually ETTR and bring the highlights down to IRE 100 in post. Dave Dugdale has made a video in which he argues that 8-bit is much less problematic compared to 10-bit than is commonly believed. But this is for rec_709, where you squeeze 13 stops into 6 stops: ultra-low dynamic range.
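    The 8-bit versus 10-bit difference can also be put in rough numbers (my own simplification, not from Dave Dugdale's video, treating the encoding as an idealized log curve with equal values per stop):

```python
# Rough sketch (my own simplification): code values available per stop
# for a 13-stop scene, assuming an idealized log curve that spreads
# code values evenly across stops.
STOPS = 13
per_stop = {}
for bits in (8, 10):
    levels = 2 ** bits
    per_stop[bits] = levels // STOPS
    print(f"{bits}-bit: {levels} levels, ~{per_stop[bits]} values per stop")
```

    8-bit leaves about 19 values per stop, 10-bit about 78 - roughly four times the room for pulling highlights around in the grade.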
    The Pocket, with its tiny sensor, is by no means a low-light performer. Shadows will be very noisy (you see it in froess' fairground clip, but also in the shadows in the forest clip). The noise does look a lot like film grain, but of course you would want to avoid it nonetheless. You can do so by setting zebra to 100% and adjusting exposure so that the highlights just start to clip. This camera really is (contrary to its reputation) the easiest of them all in this regard. Of course, there has to be enough light, and/or you have to use fast lenses and additionally put a Speed Booster on them.
    But. Then. The colors are beautiful to start with, and with 10 or 12 bit, they stay beautiful in post. 
    Now, finally, we hear the bell ringing for SDR. I bet if you saw a high-key image with sun and sky in HDR, shot with the Pocket, you'd say wow! Shot with S-Log in 8-bit? Probably eew ...
    I wonder if the GH5S could be the right compromise. Good enough images, good resolution, low noise in the shadows, enough bit depth for good highlights. That's paramount. Missing AF, IBIS and high ISOs don't need to show in the final image.
  25. Like
    Axel got a reaction from PannySVHS in The "Annihilation" of Paramount Pictures   
    If it's *just* about money, if the only votes that count are those of the shareholders, this industry is doomed. That applies to everything: if we are measured by the degree to which we can be exploited, our kidneys will be sold and the rest becomes soap. 
    Watch The Cooler. The old casino mafia ruled this frivolous business with cruelty - and passion. Then the bankers appeared and took over. And the world turned to shit.