studiodc

Reputation Activity

  1. Like
    studiodc reacted to JazzBox in EOSHD Pro Color for Panasonic (GH4, GX85, G85 + more)   
    Ciao!

    This is a music video I shot with the GH4 and G7 (both with the Natural profile, saturation -5, contrast -5).
    I shot it very quickly: two afternoons and an evening, with a week between the two days of shooting. In that time I sold the GH4, so the second day was shot on the G7.

    I shot everything handheld without a shoulder rig, except for part of the indoor scene with the voodoo witch, where I used a tripod while she mixes the potion.
    The drone scenes were shot on a DJI Phantom 3 Professional (not mine, we hired a pilot), which is a little too "digital" for my taste. 

    I used Andrew's EOSHD Pro Color for Panasonic and I'm really happy with the skin tones and the overall colors. I also used FilmConvert's Kodak Vision3 5207 with 20% grain.

    Thank you Andrew

    p.s.: can you spot which is the GH4 and which is the G7?  
     
     
  2. Like
    studiodc got a reaction from kidzrevil in Panasonic G85 review - is there any need to get an Olympus E-M1 Mark II for video?   
    Can't speak for the G85, and I've only just got my GX85, but on a 4K camera I never shoot in 1080p unless I need slow motion, and in that case it's not a big deal if it's a little soft, since the slo-mo I shoot (unless I'm renting a seriously capable rig) is going to be for dreamy stuff anyway. 
    If you plan to work in 1080p, do yourself a huge favour: shoot in 4K and use something like EditReady to transcode to 1080p ProRes as part of your ingest step (possibly applying Andrew's excellent LUTs at the same time). You'll end up with something roughly equivalent to 4:4:4 colour, and the downsampling will increase sharpness and decrease noise considerably. You get essentially 4x oversampled 1080p (2x horizontal, 2x vertical) for your trouble, without horribly larger file sizes. 
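    For anyone who doesn't use EditReady, here's a rough sketch of the same ingest step scripted around ffmpeg (assumptions: ffmpeg with the prores_ks encoder is on the PATH, and the folder names and LUT filename are just placeholders; profile 2 is standard ProRes 422, use 3 for HQ):
```python
# Sketch of a 4K -> 1080p ProRes ingest step (roughly what EditReady would do).
import subprocess
from pathlib import Path

SRC = Path("card_dump")                  # hypothetical folder of 4K camera clips
DST = Path("ingest_1080p")
LUT = "eoshd_pro_color.cube"             # hypothetical LUT file

DST.mkdir(exist_ok=True)
for clip in sorted(SRC.glob("*.MP4")):
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        # apply the LUT, then downscale 4K -> 1080p (the oversampling step)
        "-vf", f"lut3d=file={LUT},scale=1920:1080:flags=lanczos",
        "-c:v", "prores_ks", "-profile:v", "2",   # ProRes 422 (use 3 for HQ)
        "-c:a", "pcm_s16le",
        str(DST / (clip.stem + ".mov")),
    ], check=True)
```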
  3. Like
    studiodc got a reaction from dbp in GH5 Prototype   
    @mercer I think we're all in agreement that it exists, if in perception only. It may mean different things to different people. Personally, in the (admittedly unscientific) studies I've done on the subject, it's got to do with things like precise frame timing, proper frame exposure (there was a bug in the original C300 - not sure if it's still there, I haven't shot on that camera in a while - where footage shot as 24p would always use a 1/60 shutter speed, reducing motion blur from the expected 1/48 by enough to be noticeable; it made all the footage feel slightly less "smooth"), and of course ensuring that playback runs at a matching frame rate. For others it's more to do with colour; for others still, bokeh or the lack thereof, film grain or noise "feel", anamorphic effects, etc., and of course any or all of these in combination. This subject has been beaten to death and we could still talk about it, but I think it really comes down to how you, personally, define it, and whether you can create that effect in post, or whether it's purely the result of the in-camera hardware and encoding. 
    That said, I'd argue you could make stuff that very nearly everyone except a small percentage of other filmmakers finds entirely "cinematic" or "filmic" with nearly any decent video-capable camera these days, and that it's not as much of a hardware thing as people think. Can you do it on all cameras, straight off the card, in available light? Not easily, and certainly not with as much success. 
    My personal perspective on the whole thing is that we've failed to carry forward actual cinema production technique and style and values. The importance of lighting and light modifiers. The use of colour for symbolism and message. The practice of motion. The importance of DOF (not low-DOF) and when/how to use it. Traditional framing versus unconventional. The language of editing. The incredible contributions of the art dept. (even if it's just one person) and wardrobe/makeup. Having a good colourist. To me, these contribute FAR FAR more to the feeling of "cinematic" or "filmic" than any of the gear you shoot with or what LUTs you slap on the footage, and my own eyes, in blind viewings, have held that up so far.
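    For what it's worth, the blur difference in that C300 example is easy to put numbers on (a quick sketch of the standard shutter-angle arithmetic, nothing camera-specific):
```python
# Exposure time from frame rate and shutter angle: t = angle / (360 * fps).
def shutter_time(fps: float, angle_deg: float) -> float:
    return angle_deg / (360.0 * fps)

print(1 / shutter_time(24, 180))   # 48.0 -> the expected 1/48 s at 24p, 180 degrees

# A fixed 1/60 shutter at 24p is the same as shooting at a 144-degree angle,
# i.e. each frame gets about 20% less motion blur than the 1/48 "norm".
print(360 * 24 * (1 / 60))         # 144.0
print((1 / 60) / (1 / 48))         # 0.8
```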
  4. Like
    studiodc got a reaction from Kisaha in GH5 Prototype   
  5. Like
    studiodc got a reaction from Emanuel in GH5 Prototype   
    I can definitely understand what you're talking about - but that's not something I would class as "motion cadence", which to me implies a) regularity in frame exposure duration and b) regularity in frame exposure timing. Judder is a big factor here - the playback interpolation of frames during 24p -> 60p pulldown to match most modern monitor refresh rates makes a far bigger difference to the viewer's perception. For instance, GH4 footage shot at 30p will often be described by clients as more "cinematic", when in fact they're comparing video shown on monitors with what they see projected at 24p in theatres. So there's a big perception difference across frame rates, pulldown effects and frame timing (which is provably inaccurate on some cameras at certain frame rates), all of which I would class as perceived motion cadence issues.
    The "quantized in-frame capture" you mention, on the other hand... that's odd. I'd noticed it before on some footage (can't remember which camera) and wrote it off to the studio lighting (60 Hz AC versus 23.976p) doing something a bit funky, but now that you mention seeing it outdoors, I'll have to comb through the footage again. That said, I can't see it making a significant difference to "motion" perception, although if it's randomized frame to frame instead of consistent it could lead to a flicker effect. Then again, I've seen 35mm film "flicker" in fast pans or fast cross-frame motion too, thanks to our inherent perception of motion differences versus blur in those large-delta situations, so it's entirely possible.
    Then again, I did a lot of my GH4 recording externally direct to ProRes... might make a difference.
    Edit: thinking about it - are you sure this isn't just the telecine frame-interpolation effect? It sure as hell looks like it; some modern telecine implementations (in FCPX, for instance) blend frames instead of just repeating them...
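    To make the judder part concrete, here's a toy sketch of the uneven frame-repeat cadence you get showing 24p on a 60 Hz display (simple repetition only - the frame-blending variants mentioned above smear this instead):
```python
# 60 / 24 = 2.5, so each source frame is shown for 2 or 3 display refreshes,
# alternating -- that uneven cadence is what reads as judder.
def pulldown_60hz(frames: int) -> list[int]:
    shown = []
    for i in range(frames):
        shown.extend([i] * (2 if i % 2 == 0 else 3))
    return shown

print(pulldown_60hz(4))        # [0, 0, 1, 1, 1, 2, 2, 3, 3, 3]
print(len(pulldown_60hz(24)))  # 60 -- one second of 24p fills one second at 60 Hz
```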
  6. Like
    studiodc got a reaction from Zak Forsman in GH5 Prototype   
  7. Like
    studiodc got a reaction from zetty in GH5 Prototype   
  8. Like
    studiodc got a reaction from jonpais in An adventure into the Panasonic GX85/80 begins - and a look at the Leica Nocticron for Micro Four Thirds   
    @Huey, @fuzzynormal Let's keep this discussion focused on the GX85 - there's actually a perfectly good topic on this specific lens for this specific case. Shall we move that discussion there?
    That said, I've just ordered a GX85, and am putting my GH4 up for sale. The main reason is that I need a compact, discreet camera more than I need a full video workhorse, and that I love the rangefinder format for the bulk of my personal work which is candids. However, I'm concerned that I may have bought too early. Is anyone anticipating a successor to this camera (in this format) for CES in January?
  9. Like
    studiodc got a reaction from Orangenz in An adventure into the Panasonic GX85/80 begins - and a look at the Leica Nocticron for Micro Four Thirds   
  10. Like
    studiodc got a reaction from kidzrevil in How to handle out of range video levels (GX80 / GX85 in Premiere Pro)   
    All processing is done internally in 32-bit high resolution colour.

    You'd do this before you start grading, to tell it how to interpret the footage (i.e. where to map the 8-bit signal into that 32-bit space). After that, until you export, everything stays in the very large 32-bit colour space. 
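    As a rough illustration of what that interpretation step means (a sketch of the usual legal-range convention, not Premiere's actual internals): 8-bit video levels put black at code 16 and white at 235, and mapping them into a float working space leaves anything outside that range as values below 0.0 or above 1.0 instead of clipping it, which is why out-of-range levels can still be pulled back while grading.
```python
import numpy as np

def video_range_to_float(codes: np.ndarray) -> np.ndarray:
    # Legal-range 8-bit: 16 -> 0.0 (black), 235 -> 1.0 (white).
    # Out-of-range codes survive as <0 or >1 rather than being clipped.
    return (codes.astype(np.float32) - 16.0) / (235.0 - 16.0)

codes = np.array([0, 16, 128, 235, 255], dtype=np.uint8)
print(video_range_to_float(codes))   # approx [-0.073  0.  0.511  1.  1.091]
```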
  11. Like
    studiodc reacted to scotteverett in Canon sponsored content on DPReview   
    Hey all, I work at DPReview, so I figured I'd chime in and perhaps dispel a few myths. 
    sponsored content and the end of the world as we know it...
    Yes, we do sponsored content. Most for-profit publishers do now, as banner ads do not work, and marketers are realizing this and switching gears. But it is not some clandestine operation or a complex web of ethical quagmires. DPReview hires scientists - PhDs, literally - to design our camera tests. We then perform those tests and write about real-world usability to create a review, which is a combination of facts and opinions. Whether our camera testers get everything right (with their real-world usage opinions) is up for debate, but we strive to ensure our reviews align with the science in such a way that questions about editorial integrity (very common well before we started sponsored content) do not hold weight. Go back to any review we've done and look at the data. You will bore yourself to death looking at test charts before you discover any bias towards a brand. 
    Taking a step back, I can speak personally to how the advertising campaigns (including sponsored content) come to fruition because I manage them (I am the product manager). The reality is that camera manufacturers, if they are smart, realize they cannot put lipstick on a pig, no matter how hard they try. So they are resorting to finding ways to actually engage people. It's the early days, so a lot of the sponsored content is still shit. But the vision is that publishers influence brands to actually make content worth looking at, with the goal that the quality of their product, the soul of their brand (if there is one), actually resonates in an honest way. But it will always be, in some way, advertising. Maybe one day they will turn the corner and actually make content people want to watch/read on the regular, but only time will tell.
    So when we decided to go down this road, we asked a few basic questions. What could we make that would be worthwhile? How could we do it in a way that let us make videos we wanted to make but couldn't otherwise afford? One of the first roads we went down was having pro photographers use cameras in the real world and make little short films showcasing their experiences. On the whole, I think those videos have worked out well: we have been able to showcase photographer workflows, tell stories about real people and places, have cameras used in real-world shooting situations, and do all of this with almost zero manufacturer interaction. There is a ton of room for improvement, and I talk with visitors all the time about what else we could try. 
    But alas, to the topic at hand: we also decided to do "Native Advertising", where we have no role in the creation of the content and simply provide real estate for a brand to promote content they have created. As I mentioned above, the vision is that this type of content will get better over time, but the camera industry is a few years behind the broader CE industry. I remember a good 4-5 years ago Intel sponsored a series of short films telling the stories of a handful of creatives that were excellent, beautifully shot, and overall just worthwhile to watch. There is no reason the camera industry cannot get there, but as I said above, time will tell. 
    DPReview staff are photographers, no really, I swear.
    We are not winning any World Press Photo awards by any means, that's for sure, but we are enthusiasts just like our visitors for the most part. It's an important point, because I think it gets lost when the topics of editorial integrity, advertising, relationships with the brands, and all of these themes come up. Maybe I'm an optimist, but I like to think that generally people are "good" and difficult to corrupt, and that set of assumptions definitely applies to my coworkers at DPReview. Our guys are using cameras every day, shooting photos, and spending a ridiculous number of hours every year thinking about cameras. The last thing they would ever want is to feel like everything they were doing was compromised, without purpose. It would be soul-crushing, demotivating, and completely unsustainable. So yeah, I understand the need to be skeptical, critical, and diligent in pushing for transparency in journalism, and we are no different. But aside from the more potent reasons I just mentioned, there are also the legal realities of sponsored content; we are legally required to disclose when a brand is involved with something that goes on our site, and we follow it to a T because, quite frankly, the world of hurt we'd face if we didn't would be far worse than a 3% increase in advertising revenue. 
    DPReview hates EOSHD, and had a bitter falling out with Andrew Reid.
    Simply put, this is just not the case. We obviously respected EOSHD and Andrew, as we tapped him to be involved in our efforts to begin talking more seriously about video capture. The reality of our publishing process is that it involves several layers of editing, and Andrew is not the first writer we've worked with who experienced this process as frustrating, and won't be the last. Hell, I even tried to write a few articles when I first joined DPR and definitely pulled out a few hairs when my words and voice were changed to be in line with DPR's style. But editing is a necessary step in the publishing process, and in the end, even if DPR and Andrew didn't align on how to do this, none of us here view the situation as anything other than a freelance arrangement not working out - it happens ALL THE TIME. There doesn't need to be a villain. Blogging is a much less rigid workflow, and half of the writers we work with are much happier in that context. It makes sense. No hard feelings are needed. 
  12. Like
    studiodc reacted to Andrew Reid in Canon sponsored content on DPReview   
    Hey Scott.
    This sums up quite well what I loved about DPReview's editorial and why I was so proud to be a contributor for those years. I looked upon DPR as something of a leading light, the most respected review site for digital cameras, one of the first, and that's why I hold it to a higher standard than others. I care passionately enough about it to get upset and to shout about it when it goes wrong and I think it's in danger.
    DPR did go in-depth, especially on the technical side and still does.
    If the new advertising in the form of sponsored content also did this to the same standard, then the quality would remain and not hurt the brand as much but even then there's a problem, because it would only work as long as it was impartial, which advertising never is and never can be.
    Flick through an old fashion magazine from the 1970s and it is almost ALL advertising, yet readers still bought it in droves... you'll see a lot of high-quality advertising, fantastic photos (David Bailey, Helmut Newton) and minimal words, minimal editorial pieces! I am not against advertising culture entirely, nor do I have zero tolerance for ads, and I'm not a communist, although I do live in Berlin.
    The problem I have is that more and more the manufacturers seem to be the boss, the paymaster and the editors, if not directly then certainly in subliminally controlling ways, like the PR-organised events, and it is wrong that this appears to be our only choice as reviewers if we want to get our hands on new gear at the earliest opportunity.
    We join the hype train by doing this and we trade our credibility, or at least it looks that way for the readers.
    I am open for a civilised debate on what we can do in the industry to recover some integrity in what we do. There needs to be some collective action.
    So the PR companies and manufacturers are after our jobs Scott.
    And we are going to just let them take over on the content side?
    But their purpose, if sponsored, is to sell a camera.
    For me that is not the purpose of what we do.
    I hate this insidious influence.
    If, for example, we put out educational content and it is paid for by Canon, then whichever manufacturer sponsors us or pays the most gets the most content on their particular brand, and there's yet another form of bias. Even if the content itself had zero bias, the money still controls the agenda.
    It's our job to create excellent content that's worth watching, not Intel's.
    By taking their money, you are trading your position as a content creator with them and one day you will be without a job.
    Of course! I understand that and always have.
    That's because it's being traded in bit by bit.
    Your voice replaced by somebody else's.
    If it's only a 3% increase in ad revenue and you're owned by Amazon, why do it at all? Why take such a big risk with the brand for the sake of bowing to the manufacturers and 0.001% of their overall ad spend budget? Tell them to fuck off!
    Thanks for the message on here Scott, I do appreciate it.
    If I can ever mend my relationship with DPReview I would.
    I have friends there and the only bad words exchanged were with Barney and Simon Joinson.
    In the end the buck stops with them.
    If they are going to take the site in this direction, they know my opinion on how wrong this is and why it won't turn out the way they hoped.
    They have a responsibility as the senior figures to change tack.
    Their responsibility to the readers should come before their financial obligations to advertisers anyway, because without any readers there won't be any advertisers!
  13. Like
    studiodc reacted to Chris Oh in An adventure into the Panasonic GX85/80 begins - and a look at the Leica Nocticron for Micro Four Thirds   
    Slightly on topic: it's currently on sale at B&H!
    $697.99
    https://www.bhphotovideo.com/c/product/1302079-REG/panasonic_dmc_gx85kk_lk_dmc_gx85_digital_camera_black.html
    GX85 with Kit lens
    Panny 45-150mm
    $100 gift card
     
  14. Like
    studiodc reacted to Jimbo in An adventure into the Panasonic GX85/80 begins - and a look at the Leica Nocticron for Micro Four Thirds   
    Received my GX85 last week in time for a Friday wedding and just wanted to share my initial thoughts on the camera (being a GH4 and GX7 owner) and a short teaser film I made for the couple I filmed with the camera on Friday. Heads up: this is the same post I put on personal-view to avoid any duplicate reading time ;-)
    Firstly, operationally, this camera with its stabilisation is a gem. For a number of years now, every time I've wanted to go handheld at a wedding I've had to remove my baseplate and attach my Zacuto target shooter and Z-Finder (I don't keep the Zacuto baseplate on permanently as I value the GH4's flippy screen way too much). Weddings being as they are, this meant I only used my mini rig at strategic times of the day when the switchover meant I wouldn't miss anything. So having stabilisation built in meant I could go handheld whenever I wanted; and I did. I had it slung around my neck most of the day, operating my GH4 on tripod and monopod as my A-cam and then choosing certain moments to use the GX85. As the day progressed I began using the GX85 more and more, and it gave me a photography-like freedom in the heights and angles I could frame from, without having to hoist up my beloved monopod or drop my tripod into low-leg mode. I can see the pitfalls of a device making shooting so easy... you get lazy... but on Friday I felt a freedom and creativity I haven't felt in a long time. I started looking with my eyes more, knowing that my gear wouldn't stop me getting into position (and quickly at that).
    And the quality of the stabilisation? Well, you can probably already tell from the videos that it's incredible. Where I wouldn't normally go handheld with anything over 25mm on my Zacuto, by the end of the day I had my Nikon 105mm on the Speedbooster (making it 75mm) and it was fantastic, allowing me to grab shots that I'd normally have to set a tripod up for. And with some practice I felt quite comfortable panning gently with it too; it doesn't come to a hard stop like the Olympus stabilisation seems to, it tapers off nicely.
    I won't comment too much on image quality at this stage as I haven't had time to pore over the footage. My initial feeling is that in terms of DR, detail and colour it is on par with the GH4 with a slightly different colour signature. I feel it is definitely less noisy than the GH4 at high ISO (which is a boon for my work, although I'm really not afraid of noise as much as some people) but detail and DR like normal still seem to struggle once you hit 1600. Also, I don't think the slow-mo (50p/100 shutter on a 25p timeline) is as good as the GH4, doesn't seem as fluid. The GH4 has fantastic 50p. Again, please take this with a pinch of salt as I've scanned my footage for all of an hour and never have time or inclination to pixel peep (too busy editing bloody weddings!). It's just my feel.
    Now the cons to the camera.... instantly the moment I took it out of the box and put it up to my face I was disappointed. I love the left-side EVF, it means guests can see more of my face and I can smile to put them at ease and more easily talk to the bride and groom if doing any direction. However, that damned left strap lug just sticks into my nose. The GX7 has the correct placement of this lug and I cannot believe Panasonic engineers changed it and then didn't think about the consequences. Anyone else notice this? I want to hold the camera to my nose for additional stabilisation and can't do it comfortably now. I can live with it, but found myself pushing the viewfinder upwards into my eye socket so I didn't have red marks on my nose all day. It's workable, but the GX7 lug placement was perfect.
    Secondly, no battery charger included. You have to charge the battery in camera which is less handy when you want to have your second battery on charge while using the first. I do this a lot.
    Obviously it's a real shame it doesn't have mic input, but I understand this isn't a pro model. I just have to choose wisely when I use it for my work until the GH5 comes out as audio clips are so important to the way I edit (I use natural, candid audio whenever I can, especially in my longer edits). However, I really will have so much pleasure using it for my work in the meantime, and it is now the perfect personal camera for me too.
    So here is a mini teaser video I put together for my bride and groom from Friday. Shots 1, 2 and 4 were at 50mm, shots 3 and 5 (the cliff ones) at 75mm. Yes, I have slowed them down, but I would be more than happy to (and will) use them at normal speed too. The 75mm on the cliff was a bit of a wow moment for me. With the wind the way it was, there is no way I would have been able to get a shot that steady on a monopod, and I would never have been able to get the variety of shots I did if I were using a tripod (there was no time for a tripod anyway). The GX85 is opening my mind up to new possibilities.
    All shots straight out of camera, natural w/ -2 contrast, -4 sharpness, -3 NR:
     
  15. Like
    studiodc got a reaction from Don Kotlos in A comparison of 4K to 1080p downscaling with GH4 / Ninja Star vs. FCPX   
    So, I've just shot a quick test using my GH4 and a Ninja Star. The test checks the differences between the camera's internal 4K downscale output over HDMI and a 4K -> 1080p downscale done in post from the internally recorded 4:2:0 8-bit H.264.
    The images can be downloaded in full resolution here: https://www.dropbox.com/s/e6o06zoowk7z5fh/GH4%20Test%20Images.zip?dl=0
    See the attached files for some side-by-side 100% comparisons.
    Images are from the GH4 (set to a gently tweaked CineV, 4K 100M 24p - really 23.98, 180 degree shutter, Sigma 18-35 at f/2.8 on a Speedbooster at ISO 400, auto white balance, tungsten lighting).
    They were captured to 1080p using the following workflows:
    Clip 1: Recorded in GH4 (4:2:0 8-bit H.264 4K) then downscaled in Red Giant Bulletproof to 1080p ProRes 422HQ.
    Clip 2: Recorded in GH4 (4:2:0 8-bit H.264 4K) then downscaled in FCPX by adding to a 1080p timeline.
    Clip 3: GH4 HDMI out (4:2:2 8-bit) to Atomos Ninja Star (ProRes 422HQ)
    Clip 4: GH4 HDMI out (4:2:2 10-bit) to Atomos Ninja Star (ProRes 422HQ)
    All clips were then placed in an FCPX timeline (Clip 2 was placed as a 4K file), and a frame was exported from each as an uncompressed TIFF at full 1080p resolution. No effects, grading, or anything else was applied.
    I tried to choose a subject (ColorChecker Passport and a tomato, lit just off-axis with a side-light accent) that would provide both sharp edges and gentle gradients, and the green/red of the tomato should hopefully accentuate any actual chroma subsampling issues that exist. The camera was on a beanbag, so please excuse any pixel-level misregistration; I don't have my tripod handy right now.
    I hope this helps someone.
    GH4 Test Image 1.tiff
    GH4 Test Image 2.tiff
    GH4 Test Image 3.tiff
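    If anyone wants to put a number on the differences rather than pixel-peep, a quick PSNR check between two of the exported frames works (a sketch; assumes Pillow and numpy are installed and that the frames are pixel-aligned, which mine aren't guaranteed to be given the beanbag):
```python
import numpy as np
from PIL import Image

def psnr(path_a: str, path_b: str) -> float:
    # Peak signal-to-noise ratio between two 8-bit frames; higher = more alike.
    a = np.asarray(Image.open(path_a), dtype=np.float64)
    b = np.asarray(Image.open(path_b), dtype=np.float64)
    mse = np.mean((a - b) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

print(psnr("GH4 Test Image 1.tiff", "GH4 Test Image 2.tiff"))
```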
  16. Like
    studiodc got a reaction from TheRenaissanceMan in Gh4 4k downscaled recording to Atomos Ninja Star?   
    I do this with the Ninja Star all the time. Results are very good. The 4K image downscaled in-camera to 1080p 10-bit and captured as ProRes 422 looks fantastic and grades surprisingly well (I lifted shadows 3 stops and still got usable tone, with a very film-grain-like noise structure, much like 16mm film). 
    For me, since I deliver 1080p and 720p, I see this as a major advantage and don't see a benefit to always shooting 4K in camera. I consider the quality difference between 4K recorded in camera and downscaled to 1080p in post, and the in-camera 1080p downscale captured as 10-bit ProRes, to be negligible - with an advantage to ProRes for workflow speed and latitude for grading later, and a slight advantage to in-camera 4K for one less thing to carry on the rig and the option of reframing in post. But you pay for that in render times, and I consider it a very, very bad habit to shoot with the idea that you'll reframe later. So if you light properly, or know how to work with available light well, and expose and balance properly to begin with (which you should be doing anyway), then you'll get fantastic and very pleasing results with the Ninja Star. I shoot nearly all of my client work with it now; it really does produce some gorgeous footage.
  17. Like
    studiodc got a reaction from JazzBox in EOSHD LOG Converter for the GH4   
    JazzBox, LUTs are resolution independent. You can shoot at any resolution and apply them.
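    A 3D LUT is just a per-pixel colour lookup, so the frame's width and height never enter into it - the same .cube works at 1080p, 4K or anything else. A toy sketch (nearest-neighbour only, where real applications interpolate between LUT entries):
```python
import numpy as np

def apply_lut(image: np.ndarray, lut: np.ndarray) -> np.ndarray:
    # image: (H, W, 3) uint8; lut: (N, N, N, 3) uint8 cube.
    # Each pixel's RGB indexes the cube -- resolution never appears.
    n = lut.shape[0]
    idx = (image.astype(np.int32) * (n - 1)) // 255
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

lut = np.random.randint(0, 256, size=(17, 17, 17, 3), dtype=np.uint8)  # stand-in for a real .cube
frame_1080 = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_4k = np.zeros((2160, 3840, 3), dtype=np.uint8)
print(apply_lut(frame_1080, lut).shape, apply_lut(frame_4k, lut).shape)
```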
  18. Like
    studiodc reacted to IronFilm in On Adobe, Apple, BlackMagic and 'being careful' ...   
    One of the hardest things in scriptwriting is to write a good low-budget film. 
     
    http://www.scriptmag.com/features/alt-script-five-good-reasons-to-write-a-no-low-budget-script
     
    http://www.scriptmag.com/features/alt-script-four-ways-to-control-your-scripts-budget-without-compromising-the-film
  19. Like
    studiodc reacted to jax_rox in On Adobe, Apple, BlackMagic and 'being careful' ...   
    If you want to make a commercially viable film - of course you have to be held up to the same standards. Have you ever been to a movie and thought 'well, that looked and sounded like utter sh*t, but hey - they didn't have the same budget, so I guess it's great that they tried'?
    Those who spend years honing their craft are able to make professional-quality projects on much smaller budgets than you would think. You've chosen to be a jack of all trades rather than hone one specific skillset - you could do so and be able to compete with the 'big boys', or even develop a network of people you can call upon to help you with your no/lo-budget projects.
    Your argument strikes me as being this: I'm a DIY builder. I love to build stuff in my spare time. I particularly like to build chairs that hopefully people will be able to sit on. Here's the thing though - there's builders out there who have done apprenticeships and been working for a very long time and they make these amazing chairs that everyone loves and are perfectly sturdy enough to sit on. Why am I expected to be able to make chairs that are good enough, and sturdy enough, for people to be able to sit on, when I haven't spent years working to be a builder??
    ​Why would a sound engineer want the ability to change the colour of the film in their software? I'm sure as a Director, the last thing you would want is the dialogue mixer accidentally screwing with the colour of your film, or your colourist accidentally changing the sound mix..
    Imagine all the potential issues if everyone had to work from one project file on one piece of software...
    I don't understand - are you saying that software companies should make one 'post software to rule them all' in addition to all their other separate, job-specific, stand-alone software..?
    Of course it does - the only thing it doesn't necessarily need to be able to do is apply 'funky' looks via some sort of 'filter bank' a la Instagram. AVID actually has quite powerful colour correction tools.
     
    In terms of being 'lied to' - the problem lies in the fact that companies like Apple, Adobe, Blackmagic - in general they're targeting the consumers who have throwaway cash to spend on what can often be boiled down to essentially a hobby - and so of course they're going to try and wring out as much cash as possible. It's like the gimmicks they use to sell TVs et al. Professionals see right through that because they have a much deeper knowledge and understanding, whereas the consumers who have this as a hobby are continually looking for that something 'better' that's going to make their footage and films look and sound as good as someone who's spent 25 years honing their skills, as quickly and easily as the push of a button.
  20. Like
    studiodc reacted to David Barkan in Another Use for the GH4's Anamorphic Mode (16mm lenses)   
    Hello everyone!
    With the new ability to shoot a 4K 1.33:1 image on the GH4, I was immediately reminded of my film days (as an AC, of course), where you had head and floor room in the viewfinder to work with... and I thought I could use 16mm lenses on the GH4 with a decent crop - meaning that one would be able to shoot with 16mm and S16mm lenses, crop out the heavy vignette, and still get nice 3K-sized footage.
    Here's a little proof test I shot:
     
     
     
  21. Like
    studiodc got a reaction from Peter Rzazewski in Thunderbolt drive options for video editing   
    I edit on a dual-core late-2013 MacBook Pro. I can edit GH4 .mov H.264 files natively without a problem. I can also edit 4K ProRes 422 HQ without a problem. I use a two-drive G-Drive RAID enclosure over Thunderbolt. It wasn't cheap, but not as expensive as the Pegasus. I use it in RAID 1 for drive redundancy, so I don't get the benefit of RAID 0 speeds. 
    However, I can also easily edit off a single G-Drive USB3 external, if I'm only doing one stream of GH4 H.264 4k or ProRes converted footage. 
    But this is all in Final Cut Pro X. Premiere Pro is vastly slower, and a serious pain in the ass to use. It can't be hard drive speed, because the data coming off the drive is the same, so I would say your system itself is probably not fast enough. FireWire 800 is easily fast enough for most video streams (800 Mbit/s is more than the 220 Mbit/s of 1080p ProRes HQ, and not quite enough for the 960 Mbit/s of 4K ProRes HQ, but since most ProRes HQ doesn't need the maximum bitrate, you'd likely handle a single stream on FW800 just fine in the real world), so USB 3 or Thunderbolt wouldn't be absolutely necessary (although really nice to have). 
    My point is that you're not likely being held back by your hard disks unless they are USB 2.0 or FW400 and also internally very slow (5400 rpm or something). You're much more likely to be limited by your GPU/CPU with Premiere, and possibly by RAM although I have 16GB and again, no issues at all.
    If 4K is essential to you, you need a much faster system. Otherwise, I'd strongly recommend transcoding on import to 1080p ProRes 422 (standard, not HQ or LT). It's a perfectly good codec, much, much easier on your CPU/GPU, and your hard disks are likely plenty fast enough to handle the increased bitrate and take some of the load off your processors.
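    The stream-count arithmetic above generalises, if you want to sanity-check your own setup (a sketch using nominal interface speeds and the bitrates quoted above - real-world throughput is lower and ProRes is variable bitrate, so treat it as a ceiling):
```python
# How many streams of a given codec a bus can feed, on nominal numbers (Mbit/s).
INTERFACES = {"FW800": 800, "USB 3.0": 5000, "Thunderbolt": 10000}
STREAMS = {"ProRes HQ 1080p": 220, "ProRes HQ 4K": 960, "GH4 H.264 4K": 100}

for bus, bus_mbps in INTERFACES.items():
    for codec, rate in STREAMS.items():
        print(f"{bus}: ~{bus_mbps // rate} stream(s) of {codec}")
```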
  22. Like
    studiodc got a reaction from IronFilm in Matching BMPCC and LX100   
    Caveat: I don't own either.
    However, I've had to match many cameras before.
    The trick is to first nail your white balance with a proper grey card (do this religiously, in the full light that your subjects' faces will be reflecting) and THEN shoot a colour card (in the same place your grey card was). The LX100, from what I've seen, looks pretty good. Shoot it with settings as close to the final look as you can, and shoot the BMPCC in RAW if you have the time or ProRes if you don't. You won't go wrong either way, since the LX100 will be "driving" the look, as it's more limited in its ability to adapt in post. As long as you find a setting on the LX100 that looks close to what you intend to end up with, you'll be great, since the BMPCC can really stretch to match it. That said, the LX100 can be made to adapt pretty well to the look you want to begin with. Try playing with the photo settings, and then fine-tune contrast, saturation, etc. to get where you want.
    The pros do test shots with both cameras under a variety of settings, then match in post and see what works the best for them. This is highly recommended for anybody trying to get two cameras of different ilk to work together.
    And don't forget to control your lighting as much as possible. Help the camera out rather than forcing it to deal with high-contrast situations it may not be suitable for.
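    If you want something more than eyeballing it, the grey card also gives you numbers to match against - a minimal sketch (the RGB values are hypothetical averages sampled from each camera's grey patch):
```python
# Per-channel gains that bring the camera-to-be-matched in line with the
# reference camera's grey card. Sample values below are made up.
lx100_grey = (118.0, 121.0, 115.0)   # reference camera (it "drives" the look)
bmpcc_grey = (104.0, 110.0, 125.0)   # camera being matched

gains = [ref / cam for ref, cam in zip(lx100_grey, bmpcc_grey)]
print([round(g, 3) for g in gains])  # [1.135, 1.1, 0.92] -> apply to the BMPCC's R, G, B in post
```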
  23. Like
    studiodc got a reaction from Liam in Matching BMPCC and LX100   
  24. Like
    studiodc reacted to AaronChicago in BM micro cinema grip and LCD   
    I love when companies make bare bones cameras. Most of us already own monitors, external batteries, etc. It's nice just to plug-n-play a new sensor in a box.
  25. Like
    studiodc reacted to Nick Hughes in Blackmagic Next Surprise: External 5" 1080p recorder for 495$!   
    ​http://www.bhphotovideo.com/c/product/1137222-REG/video_devices_pix_e5h_5_monitor_with_hdmi.html
     
    Specs here