Posts posted by Axel

  1. 16 minutes ago, Cary Knoop said:

    I think your black levels are way too high.

    Needed for the analog look. As soon as the black slug was projected on the screen after the lights went down, the screen became brighter. So much for the allegedly higher DR in classic cinema.

  2. 1 hour ago, ricardo_sousa11 said:

    This was a specific clip to go for a moody style, but you can go for a more vibrant style very easily and it still looks great.

    Not very vibrant either (Mara-smile). It seems to mimic some film stock, has grain, and the colors look like they're from the '70s. Admit it: it's highly stylized. Skin is rare, but in CUs it looks peculiar (1'37"). And everything that's perfectly in focus looks embossed, almost like AAE's cartoon effect.

    Don't get me wrong, YOUR work is perfect. But imo the NX1 is not the right tool for everybody.

  3. @ricardo_sousa11

    Beautiful clip, exquisite style. But imo too flat, dark and desaturated to judge the camera's DR and colors. As I see it, you successfully managed to work around the NX1's most prominent trait: overly sharp outlines on every detail, mistaken for superior resolution and not looking natural. Search YouTube for "Samsung NX1 wedding" and you'll see what I mean in many instances. You tamed the beast, but your approach is not that of a normal high-key wedding with vivid colors. I even found the fir needles @1'30" a little too much.

  4. 2 hours ago, jgharding said:

    *LOLLES TO SELF AND DIVES UNDER TROLL BRIDGE*

    The troll would be me, fanning resentments in a Premiere jubilation thread, but anyway.

    2 hours ago, jgharding said:

    ... new MacBook "Pro" (...)  are toys for people who like to sit in Starbucks pretending to work...

    Yes, and they always look exactly the way everybody thinks creatives look. Very funny. Reminds me of The Matrix: the roughly crocheted woolen shirts and the designer sunglasses at night. Someone really should analyze the target groups.

    2 hours ago, jgharding said:

    The main reason for thinking of switching was the persistent bugs in Premiere. There are now some horrible bugs that I've been using workarounds for, for years and years.

    FCP X is buggy too. Visit the fcp.co forum. 10.3 already got an update fixing four or five of the worst bugs.

    2 hours ago, jgharding said:

    It makes me sad that they take in hundreds of millions a year and yet don't focus on solidifying core products, but instead spend fortunes on creating frivolous mobile apps and so on, which no one really uses outside of ukelele-soundtracked Instagram adverts.

    The shareholders dictate that. I like Adobe, but find them very conservative. They care about their pro clients, that's true. They are afraid of making too radical changes, so they just keep adding features and overhauling the old routines. Okay for Photoshop and After Effects, I use both. When I watch my buddy edit in Premiere (I knew it well up to CS5.5), I can't help thinking: why is this whole process such a mess? Why can't they kill all this redundancy?

  5. 38 minutes ago, John Matthews said:

    Has anyone been using the effect marked "Hue/Saturation" in Final Cut Pro X for correcting skin tones? I'm having better results than using the color correction tool. I'm just wondering if others have done this and experience major drawbacks....

    I've seen this effect, but off the cuff I'm not sure how it helps to correct skin tones. Please explain.

    I made a Motion template to make Sony skin tones match Canon skin tones, here. Unzip the file and move the "AXS" folder to > User > Movies > Motion Templates > Effects. No risk: if you don't like it, you can simply move it to the trash. In FCP X, you'll find it in the effects tab under "AXS". It's called Hautfarben (German for skin tones). You'll see a preview of it if you select a timeline clip and skim over the effect's icon. The preview is set to 50% mix, but there is an amount slider rigged to go from 0-100%.

    The basis is a key pulled on the typically lifeless skin of correctly white-balanced Sony clips. Applied to Canon clips, the skin will get an unnatural orange cast. You can easily change the color itself within the filter. But if you happen to use a camera with a different color science than Sony's, you'd have to open the filter in Motion (right-click) and reset the keyer.

  6. 1 hour ago, MountneerMan said:

    Am I missing something here?

    Yes. Whatever footage interpretation you make within the AME dialog (even if you just "match" the frame rate, which didn't work in the video above, because Premiere misinterpreted it) is baked into the proxy file and cannot be changed later on. If the original was 60p, you have to manually override the flag in AME to 24.

    BUT: the parallel original media timeline doesn't know that the proxy had been interpreted. 

    The point is, retiming is fundamental for editing. Proxies should mirror original media perfectly. Adobe should fix this asap. 
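    A toy calculation (hypothetical numbers, nothing Adobe-specific) shows why a wrongly flagged proxy can't mirror the original media:

```python
# Illustration: a 59.94p clip whose proxy gets flagged as 23.976p.
# The proxy contains the same frames, but the player derives a much
# longer duration from the wrong fps flag, so edits made against the
# proxy land on the wrong original frames.

def duration_seconds(frame_count: int, fps: float) -> float:
    """Duration a player derives from a frame count and an fps flag."""
    return frame_count / fps

frames = 600  # ten seconds of true 59.94p material

original = duration_seconds(frames, 59.94)
misflagged_proxy = duration_seconds(frames, 23.976)

print(f"original: {original:.2f} s")                  # ~10.01 s
print(f"misflagged proxy: {misflagged_proxy:.2f} s")  # ~25.03 s
print(f"factor: {misflagged_proxy / original:.2f}x")  # 2.50x slower
```

    Overriding the fps flag manually in AME, as described above, avoids exactly this mismatch.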

  7. 6 minutes ago, MountneerMan said:

    So if I ingest two files, one UHD 30p and one 1080 120p, the two proxies would be 960x540 30p and 960x540 120p respectively. You still have to scale the footage up or down to fit either a 4k or 1080 timeline as usual as well. Basically, to me everything is the same; the only difference is that when you have view proxies turned on, it plays the proxy instead of the original.

    Interpreting footage for proxies seems to work only in AME, not in Premiere, and it doesn't seem to be limited to slomos created from higher-fps footage:

    I wouldn't call this a workflow. A flow implies some ease, not this PITA. Instead of render queues via AME and attached reports, there should simply be a single PROXY button. And Adobe should get their act together and make their own proxy codec.

  8. 17 hours ago, MountneerMan said:

    Yes, transcoding takes a long time (~18 hours for 240 GB of footage), but my workflow usually involves working on a project for a lot longer than a week, so it is definitely worth the extra time on the front end to save it on the back end.

    And during this transcoding the computer isn't doing anything else? That seems to me an exceptionally long time. If rendering proxies runs slower than half realtime, something is obviously not working optimally. Give background rendering a try, in earnest. Do some preparatory work in the meantime, so that rendering is paused every few seconds. If your computer isn't a dedicated, aggressively cooled workhorse, rendering slows down significantly at high temperatures (can you touch the housing after ten hours?). That's what common benchmark comparisons don't show. Or you chose the wrong codec for fast transcoding.

    17 hours ago, MountneerMan said:

    What codec is everyone using for your proxies?

    If on a Mac, try ProRes Proxy. I just stopped encoding a 48-second UHD XAVC clip to ProRes Proxy at the 13-second mark in FCP X. If I let transcoding run uninterrupted (for many clips, of course), I hear the fans spin up after a few minutes. I won't run that test now, but by chance I do have a project with 233 GB of original media, and it was transcoded in an afternoon while I was still editing with the original media.

    The clip in question is very small compared to the original:

    EOSHD%20Proxy.jpg

    EOSHD-XAVC.jpg

    ProRes Proxy is always quarter resolution, and if you watch it at quarter resolution (i.e. 1080 instead of 2160), you hardly notice a difference.
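    The 18-hour / 240 GB figures quoted earlier can be sanity-checked with a little arithmetic (the 100 Mbit/s source bitrate is my assumption for a typical 4k camera codec, not a number from the thread):

```python
# Back-of-the-envelope check of the transcode speed discussed above.
# The 240 GB / 18 h figures come from the thread; the source bitrate
# is an assumed average for the original camera files.

SOURCE_GB = 240.0
TRANSCODE_HOURS = 18.0
SOURCE_BITRATE_MBIT_S = 100.0  # assumption, not from the thread

# How fast the transcode actually consumed source data:
throughput_mb_s = SOURCE_GB * 1024 / (TRANSCODE_HOURS * 3600)

# How fast realtime playback of the same files would consume it:
realtime_mb_s = SOURCE_BITRATE_MBIT_S / 8

speed_vs_realtime = throughput_mb_s / realtime_mb_s
print(f"{throughput_mb_s:.1f} MB/s, i.e. {speed_vs_realtime:.2f}x realtime")
```

    Under these assumptions that's well below the "half realtime" threshold mentioned above, which points to thermal throttling or an unsuitable codec rather than normal behavior.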

     

    All right, this may not be suitable for a short timeline or an urgent project, but just try encoding proxies in the background. You can start editing with the not-so-smooth performance you're used to anyway with native 4k (or 6k, 8k, what have you :expressionless:). When the proxies are finally rendered, toggle proxy. Better for your CPU's lifespan, too.

  10. 58 minutes ago, joema said:

    As Axel said, Adobe has long said Premiere is so fast you don't need to transcode -- even at 4k. In fact right now Adobe's Premiere overview video on their web site says that:

    https://helpx.adobe.com/premiere-pro/how-to/what-is-premiere-pro-cc.html?set=premiere-pro--get-started--overview

    "....allows editors to work with 4k and beyond, without time-consuming transcoding....never needing to render until you work is complete"

    Last year I made the point that Adobe seemed almost ashamed to advertise their really, really good Lumetri. I sensed a similar hesitation about proxies in Adobe's NAB 2016 announcement:

    Quote

    This release of Premiere Pro also brings support for more native formats than ever including 6K and 8K files from the RED Weapon camera. For times when the media you need to work with is heavier than your system can manage – for example, when you want to work on a lightweight portable device – you can now generate proxies on ingest, automatically associating them with the native full-resolution media. A single click lets you toggle between full-res and proxy.

    Like when you're sitting in a Greyhound bus with your Lenovo ThinkPad and have to edit some hours of 8k footage. You might think: hey, this stuff is a bit laggy today, I should toggle proxy ...

    Note also how the paragraph starts with native RED in 6k and 8k, and how the proxy workflow becomes a subclause, not a HEADLINE, bracketed again by full-resolution media.

    Very funny.

     

    I don't want to sound like a wisenheimer, but I have predicted the success of an easy proxy workflow in Premiere since the first rumors of its implementation. Good for all Premiere users. Allow me to remind you of the days when Premiere's Mercury Playback Engine required proprietary Nvidia hardware. I could (but I won't) dig out some lengthy threads in which proud Premiere owners (you used to buy it) poured scorn on us FCP people. The mark of a professional NLE, they said, is the ability to edit the native codec. Now it's the other way around. "X" lets you skip intermediates because it's so streamlined; you use proxies mainly for multicam. The mark of a smart editor is that he uses everything that lets him work with fewer problems.

    Grandma Adobe abandons one of her last fortresses. What's next? Tracks?

  12. 9 hours ago, HelsinkiZim said:

    o, and remember... you are the invisible man. Let ur brother do the talking. Know ur shit.

    My style is different. I talk to everybody, pay compliments, make jokes, smile. By this I indirectly ask everybody to look at me, because my job is to portray them in the best light. I casually reassure them, make them feel comfortable. They don't awkwardly avoid me (but forget my presence when it's appropriate), they smile directly into my lens. In other words: I direct them.

    I had this in me before, but I have really refined it as a technique since I began my new job five years ago, nursing people with dementia. It's called validation. You approach the human being in question with the conviction that they are the most valuable person in the world, that they make you smile radiantly; you let them mirror their best experiences. To achieve this, you first have to sense what makes them wary, shy or even hostile. There are quite reliable signals in body language. You charm away their unease just by your own positive reaction to it, by subconscious changes in your eyes, posture and voice too subtle to feign. What I learned from this: you can manipulate others in a good sense and in a bad sense. People, no matter how blatantly they contradict the current ideal of beauty, like themselves in photos and videos when they are happy.

     

  13. 2 hours ago, kaylee said:

    ... what digital camera gets you closest to this? in terms of color whats better than 5d3 raw?

    Part of this is grading with a lot of well-saturated colors (Speed Racer, the terrible but beautifully graded Burlesque) and imo has little to do with the camera or codec used. In terms of color differentiation, I guess you could shoot ProRes log with a BM camera and achieve this.

  14. 5 hours ago, dbp said:

    I'm about to order this mic for the pocket. It's cheap and plugs discreetly into the mic port. Obviously not great for any real sound, but it'll increase the quality of the onboard sound and actually make it usable. Handy for weddings, and one of the reasons why I avoided using the pocket before. 

    Try it, and then report your experience. I am sceptical, because shooting a wedding implies a lot of handheld camera (to move fast and get into narrow places or between the guests). Though I am quite good at avoiding shaky camera even with the tiny Pocket (my first challenge was learning to hold it right), it's tiring. If a mic were mounted directly on the camera, it would have to have a shock mount and be a shotgun, otherwise it would pick up my pressed breathing and focus-pulling noises right in front of the mic. An earwig isolates these influences. If there were a unidirectional version, it would be even better for voices.

    I shot one wedding with my Pocket. It has a notoriously bad internal mic, utterly useless even for reference audio in most cases. Being a one-man band and not into wiring anyone (which I did on one occasion: good audio, but too much interference, and it affected the "performances" too much, aside from being a hassle), I was thinking about a "lenshopper" and my usual two Zoom H1s, which I place or ask people to hold. I researched and found a product from the German private inventor Wolfgang Winne, then called Ohrwurm (earworm), in my case the particularly ugly big headset with the ridiculous dead cats that makes you look like an idiot. I plugged one cable into the phone jack, the other into the mic jack, set the mic level manually to 70% once, done! The microphones have an incredible dynamic range, capturing quiet sounds with no noise, and don't clip or distort below 130 dB (that's when it starts to hurt). They are practically dummy-head mics that let you record what you hear. I originally planned to get usable reference sound for the Zooms (mono recording), but I didn't use their sound at all.

    The smaller earwigs have no headphone monitoring (I haven't tried them myself), but they have a very good reputation. Even if you mistrust them for the final audio, they will deliver better audio than any built-in camera mic.

    Valuable information above. I am no professional wedding videographer; I did 12 weddings over the past 9 years and always demanded a new piece of equipment as compensation. For me, shooting weddings is big fun, a bunch of challenges and a thousand times as rewarding as shooting corporate videos (I know a few people in that business).

    4 hours ago, Davey said:

    The only wedding I won't touch after that feedback will be where either the bride or groom are professional videographers. They would notice the slight moire on the registrar's tie, the crushed shadows miles away in the corner of the church, the purple fringing in one second of the entire production - and then sue you silly for daring to soil their big day.

    I did that two times, no problem. As I said, I'm not a pro, but I went to film school, and I know how to present my point of view. What I wouldn't touch is a wedding where either the groom or the bride is an abominable wedding videographer, the most begrudging species in the world. They always hate their rivals and despise their work, whether they admit it or not. The work of others is either cheesy, poorly photographed, has no story, too long, boring, the music makes my toenails curl in horror (and/or is stolen, I'd like to report you), or it's completely professional, an interchangeable template, lifeless, loveless, everybody can see how much you disliked the bride by the amount of diffusion you applied, are you so dull as to miss that? And so forth. How do I know? I once lurked on a wedding videographer forum, hoping to steal some tricks. Make no mistake: hell didn't exist before, it was invented for this vermin.

    2 hours ago, Jimbo said:

    Editing is a real bitch when it comes to weddings, if you're like me you are going to feel that intensity to film EVERYTHING but every shot you take is going to have a knock-on effect with your editing pipeline. My footage is better when I shoot less and observe more; waiting, looking, for special angles and moments. Of course, sometimes you've just got to grab what you can given the day's timings.

    Good distinction: special angles (beautiful photography) and special moments (emotional mise-en-scène). A successful wedding video always has both. When in doubt, go for the emotion. A sequence of perfect beauty shots that looks like a lavish commercial: unendurable. Grandma fighting her tears, whispering God bless you: YES!

  17. 4 hours ago, Dimitris Stasinos said:

    I tried holding alt, but it doesn't change anything. I did run the process many times, and the final result is miles away from any average monitor.

    Process? You seem to have found the expert mode. I used to calibrate my old display that way. It looked good then, looks terrible now. Perhaps because our own expectations are higher after a few years. The whole grading hype teaches us to be unhappy with what we've got. I think the built-in monitor calibration never was exactly brilliant.

  18. 2 hours ago, Dimitris Stasinos said:

    Hello guys! I need your help here. My screen (iMac 27 late 2012) is totally off, and Yosemite's calibration process seems like a total mess. I am sure many of you have tried all this fancy stuff from X-Rite and other companies. Can you suggest a cost-effective calibration tool please?

    I had a Spyder Pro, now an X-Rite i1Display Pro, used with the free software DisplayCAL. It takes a long time, but is very precise.

    Why is Yosemite's calibration process a "total mess"? Did you miss the old "expert mode"? It appears only if you hold "alt" while clicking Calibrate in > System Preferences > Displays > Color.

  19. 1 hour ago, freeman said:

    When the thing does end up working the way I usually grade over multiple clips is by using a title (with no text) over all the footage I want a Lut applied to. The title acts as an adjustment layer. So you drag the Lut loader onto the title instead of the clip, then everything under that title gets the Lut applied.

    Best way to actually see what's going on. You instantly see the difference, because the LUT doesn't cut off highlights. 

    My proposal for a LUT-alternative:

    If you'd rather avoid a Resolve roundtrip, make primary and secondary corrections on more than one* color instance (I usually have three to five, as many as I'd have serial or parallel nodes in Resolve; apply funkier look effects if you want to create a better look-LUT), save them together as, say, moonlight shadows, and apply moonlight shadows to every clip that fits. You can easily group the clips with a corresponding tag moonlight shadows and select them as a group from the timeline index (start typing moon..., then hit cmd + a to select them all). Each "node" of the stack stays individually editable in the info tab. Best of all: the new super-LUT now lives in the effects browser. By selecting a new clip in the timeline and skimming over the moonlight shadows icon, you see an instant preview of it, fullscreen, in realtime, with no need to apply the effect first. How's that, LUT loader? Come on ...

    *More than one, because then you can adjust things with more precision.

  20. 2 hours ago, Simon Srečković said:

    Thanks for your answer. "Because I became aware that LUTs are not the right approach if you want the best results": what do you recommend? Learning the skill "manually"?

    A LUT can be used for roughly two different purposes. One of them is applying a look; no one who has really concerned himself with CC would do that. I know some will contradict me here; I am interested to hear their arguments. The second purpose is to translate an image to another color space (i.e. RGB to CMYK), or to "normalize" a log recording to, say, Rec.709, the usual procedure nowadays. Because the table has only a limited number of sample points, the result is merely an approximation. There are many thousands of rounding errors. The transformation is very lossy.
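    The sample-point argument can be made concrete with a toy 1D table (a generic sketch, not any specific LUT format): approximate a gamma curve with 17 samples and linear interpolation, the way LUT-based transforms interpolate between grid points, and measure the worst error.

```python
# Toy 1D "LUT": sample a gamma transfer curve at 17 points and
# reconstruct intermediate values by linear interpolation, then
# compare against the exact curve.

SIZE = 17  # common LUT grid sizes are 17, 33 or 65 points per axis

def exact(x: float) -> float:
    """The 'true' transform: a simple gamma curve."""
    return x ** (1 / 2.4)

table = [exact(i / (SIZE - 1)) for i in range(SIZE)]

def lut(x: float) -> float:
    """Look up x via the table, interpolating between sample points."""
    pos = x * (SIZE - 1)
    i = min(int(pos), SIZE - 2)
    frac = pos - i
    return table[i] * (1 - frac) + table[i + 1] * frac

worst = max(abs(lut(i / 1000) - exact(i / 1000)) for i in range(1001))
print(f"worst-case error with {SIZE} samples: {worst:.3f}")
```

    The largest error lands near the steep toe of the curve. Real normalizing LUTs use denser grids and often a shaper curve to put samples where the transfer function is steep, which reduces, but never removes, this approximation error.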

    1 hour ago, Andrew Reid said:

    If you're going to use the same LUT on every clip, I recommend EditReady to apply the LUT during transcoding to ProRes. It will also make your edit a lot smoother than editing the camera files directly.

    That's what is called generation loss. If you recorded flat to capture more shadow and highlight detail, those values are gone for good after the LUT has eliminated them. You'd have been better off shooting in Rec.709 from the start, because then you'd have had more original data.

    FCP X has some rudimentary log-normalizing LUTs built in (>info >settings):

    FCPX-LUTs.jpg

    ... but even if you shot with, say, C-Log, you shouldn't use them. They always degrade the DR and colors, because the essential order of operations can't be changed: the built-in LUT is applied first, underneath your stack of corrections and effects (your 'pipeline'), and everything you tried to preserve is already lost before you can grade it. A good way to deal with LUTs (if you have to) is to apply them to an adjustment layer, underneath which you can change the original video independently. Or, of course, use LUT Utility, because it is an effect, has an amount slider and can be applied downstream. There are few objections anyone could raise against this. Part of the grading process is wysiwyg: you take away unwanted things step by step and can go back any time, like with nodes in Resolve.
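    The order-of-operations point can be shown with toy numbers (an illustration, not a real camera curve): once a normalization step clips, no downstream correction brings the detail back.

```python
# Two bright but distinct values in a flat log recording. If the
# normalization "LUT" runs first, both clip to 1.0 and the detail
# between them is gone; pulling exposure down afterwards only grays
# the flat patch. Correcting before normalizing preserves the detail.

def normalize(x: float) -> float:
    """Toy normalization: expand the flat signal and clip to 0..1."""
    return min(max(x * 1.6 - 0.2, 0.0), 1.0)

def exposure(x: float, stops: float) -> float:
    """Simple linear exposure adjustment."""
    return x * (2.0 ** stops)

highlights = [0.85, 0.95]  # distinct highlight detail in the log file

lut_first = [exposure(normalize(x), -1.0) for x in highlights]
fix_first = [normalize(exposure(x, -1.0)) for x in highlights]

print(lut_first)  # [0.5, 0.5] -- the two values have collapsed
print(fix_first)  # two distinct values -- the detail survives
```

    This is the same reason an amount slider or an adjustment layer applied downstream is the safer place for a LUT: the clip happens after your corrections instead of before them.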

    As mentioned before, a more precise way of translating colors than LUTs would be ACES. To be honest, I have no personal experience with it yet.

    Though FCP X computes with 32-bit floating-point accuracy, its tools for color grading aren't too precise (and before Andrew chimes in: the ones in the wonderful Lumetri panel, no irony, aren't either). They are enough for getting fast, good results. In Resolve (remember: it's free!) you can set the boundaries for what is considered shadows, midtones and highlights (color wheels > log) and how smooth the 'gradient' between them is. It has specialized tools to saturate very specific color ranges, change their hue, their luma, the brightness-saturation ratio, what have you. You have a channel mixer. And so on.

    LUTs are mere crutches.

     

     

     

    I never had LUT Loader, but LUT Utility. I hadn't installed either of them on 10.3 yet, because I became aware that LUTs are not the right approach if you want the best results. With FCP X, you have many options to group clips and batch-apply multiple "layers" of corrections to them using tags, combined effects presets and adjustment layers. Now, with the new easy way to disable attributes, it has become even better. LUTs are too crude.

    But anyway, since LUT Loader is free, I installed it just now on Sierra (clean install) with FCP X 10.3 (clean install). The installation guide says that Sierra only allows manual installation of the plugin. No problem, I always do that anyway. Nothing crashes if I apply it to a clip, but nothing happens either. This is a crop of the instructions JPEG delivered in the download folder:

    Lutloader1.jpg

    ... and this is how it looks in 10.3:

    Lutloader2.jpg

    I can't load LUTs ...

    Maybe since it became freeware, they stopped adapting it to newer FCP X versions; it's practically EOL. Or maybe you installed OS X / macOS another way. Color Finale, which has a 7-day trial with full functionality (perhaps even Color Finale Pro, superior because of ACES), includes LUT Utility. You can download that and see if it crashes. If that's the case, I recommend a clean install.

    EDIT: LUT Utility is still available as a stand-alone version for $29, and it also still has a free trial, here.

    And: I just received a newsletter from CGC saying that on Black Friday every CGC software gets a 40% discount. A coupon will be sent to all subscribers. So hurry to register and download a trial!
