Everything posted by Axel

  1. Needed for the analog look. As soon as the black slug was projected on the screen after the lights went down, the screen became brighter. So much for the allegedly higher DR in classic cinema.
  2. Not very vibrant either (Mara smile). It seems to mimic some film stock, has grain, and the colors look like they're from the '70s. Admit that it's highly stylized. Skin is rare, but in CUs it looks peculiar (1'37"). And everything that's perfectly in focus looks embossed, almost as if prepared for AAE's cartoon effect. Don't get me wrong, YOUR work is perfect. But imo the NX1 is not the right tool for everybody.
  3. @ricardo_sousa11 Beautiful clip, exquisite style. But imo too flat, dark and desaturated to judge the camera's DR and colors. As I see it, you successfully managed to work around the NX1's most prominent trait: overly sharp outlines in every detail, mistaken for superior resolution and not looking natural. Search YouTube for Samsung NX1 wedding, and you'll see what I mean in many instances. You tamed the beast, but your approach is not that of a normal high-key wedding with vivid colors. I even found the fir needles @1'30" a little too much.
  4. The troll would be me, fanning resentment in a Premiere jubilation thread, but anyway. Yes, and they always look exactly how everybody thinks creatives look. Very funny. Reminds me of The Matrix, the coarsely crocheted woolen shirts and the designer sunglasses at night. Someone really should analyze the target groups. FCP X is buggy too. Visit the fcp.co forum. There was already an update to 10.3, fixing four or five of the worst bugs. The shareholders dictate that. I like Adobe, but find them very conservative. They care about their pro clients, that's true. They are afraid to make too-radical changes. Therefore, they just keep adding features and overhauling the old routines. Okay for Photoshop and After Effects, I use both. When I watch my buddy edit in Premiere (I knew it well up to CS5.5), I can't help thinking: why is this whole process such a mess? Why can't they kill all this redundancy?
  5. I've seen this effect, but off the cuff I'm not sure how it helps to correct skin tones. Please explain. I made a Motion template to make Sony skin tones match Canon skin tones, here. Unzip the file and move the "AXS" folder to >Users >your name >Movies >Motion Templates >Effects. No risk: if you don't like it, you can simply move it to the trash. In FCP X, you'll find it in the effects tab under "AXS". It's called Hautfarben (German for skin tones). You'll see a preview of it if you select a timeline clip and skim over the effect's icon. The preview is set to 50% mix, but there's an amount slider rigged to go from 0-100%. The basis is keying the typically lifeless skin of correctly white-balanced Sony clips. Applied to Canon clips, the skin will have an unnatural orange cast. You can easily change the color itself within the filter. But if you happen to use a camera with a different color science than Sony, you'd have to open the filter in Motion (right click) and reset the keyer. (For the curious, a rough sketch of the underlying idea follows below.)
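In code terms, the principle behind such a filter ("key the skin, shift its color, blend by an amount") might look roughly like this minimal Python/numpy sketch. This is only an illustration of the idea, not the actual Motion effect; key_hue, key_width and hue_shift are invented example values, not the template's settings:

```python
# Sketch of "key the skin, then correct its color", the principle the
# Motion template above is built on. Illustration only; the hue numbers
# and mask width are made-up values.
import numpy as np
from matplotlib.colors import rgb_to_hsv, hsv_to_rgb

def shift_skin(rgb, key_hue=0.05, key_width=0.06, hue_shift=0.02, mix=0.5):
    """rgb: float image array in [0, 1], shape (..., 3).
    Softly keys pixels whose hue lies near key_hue (the orange-ish skin
    range), rotates their hue by hue_shift, and blends the result back
    according to mix (the 'amount' slider, 0.0 to 1.0)."""
    hsv = rgb_to_hsv(rgb)
    hue = hsv[..., 0]
    # Hue is circular, so measure distance around the wrap at 1.0.
    dist = np.minimum(np.abs(hue - key_hue), 1.0 - np.abs(hue - key_hue))
    # Soft key: 1.0 at key_hue, falling off to 0.0 at key_width distance.
    mask = np.clip(1.0 - dist / key_width, 0.0, 1.0)
    hsv[..., 0] = (hue + hue_shift * mask) % 1.0   # rotate hue where keyed
    corrected = hsv_to_rgb(hsv)
    return (1.0 - mix) * rgb + mix * corrected
```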
  6. @MountneerMan I must apologize. It was me who didn't get it. So you first interpret the original footage (say, 120 or 60 to 24p), and after you've done this you transcode to proxies that match. You are right.
  7. Yes. Whatever footage interpretation you make within the AME dialog (even if you just "match" the frame rate, which didn't work in the video above because Premiere misinterpreted it), it's baked into the proxy file and cannot be changed later on. If it was 60p, you have to manually override the flag in AME to 24. BUT: the parallel original-media timeline doesn't know that the proxy has been interpreted. The point is, retiming is fundamental for editing. Proxies should mirror original media perfectly. Adobe should fix this asap.
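For anyone unsure what "interpreting" actually does: it retimes the footage simply by declaring a different frame rate; no frames are created or dropped. The arithmetic, as a trivial Python sketch (my own illustration, nothing Adobe ships):

```python
# Interpreting 60p footage as 24p: every recorded frame is kept, but
# played back at the slower rate, which stretches the clip's duration.
def interpret(source_fps: float, interpreted_fps: float, clip_seconds: float):
    speed = interpreted_fps / source_fps      # 24 / 60 = 0.4 -> 40% speed
    new_duration = clip_seconds / speed       # a 10 s clip now runs 25 s
    return speed, new_duration

print(interpret(60, 24, 10))  # (0.4, 25.0)
```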
  8. Interpreting footage for proxies seems to work only in AME, not in Premiere, and it doesn't seem to be limited to slomos created from higher-fps footage. I wouldn't call this a workflow. A flow implies some ease, not this PITA. Instead of render queues via AME and attached reports, there should simply be a button labeled PROXY. And Adobe should get a move on and make their own proxy codec.
  9. And during this transcoding process the computer isn't bothered with any other task? That seems to me an exceptionally long time. If rendering a proxy takes more than half the clip's runtime, then obviously something is not working optimally. Give background rendering a try, in earnest. Do some preparatory work in the meantime, so that rendering is paused every few seconds. If your computer isn't a dedicated, aggressively cooled workhorse, rendering slows down significantly at high temperatures (can you touch the housing after ten hours?). That's what common benchmark comparisons don't show. Or you chose the wrong codec for fast transcoding. If on a Mac, try ProRes Proxy. I just clocked the encoding of a 48-second UHD XAVC clip to ProRes Proxy at 13 seconds in FCP X. If I let transcoding run uninterrupted (for many clips, of course), I hear the fans spin up after minutes. I won't run that test now, but by chance I do have a project with 233 GB of original media, and it was transcoded in an afternoon while I kept editing with the original media. The resulting file is very small compared to the original: ProRes Proxy is always quarter resolution, and if you watch it at quarter resolution (i.e. 1080 instead of 2160), you hardly notice a difference.
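A quick back-of-envelope check of those numbers (trivial Python; the figures are the ones from the post above):

```python
# Transcode speed from the timed example: 48 s of UHD XAVC encoded to
# ProRes Proxy in 13 s of wall-clock time.
clip_seconds = 48
encode_seconds = 13
print(f"{clip_seconds / encode_seconds:.1f}x realtime")  # ~3.7x realtime
# At that rate, even hours of footage transcode in a fraction of their
# runtime; a ten-hour proxy render points to codec choice or throttling.
```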
  10. All right, this may not be suitable for a short timeline or an urgent project, but just try encoding proxies in the background. You can start editing with the not-so-smooth performance you're used to anyway with native 4k (or 6k, 8k, what have you). When the proxies are finally rendered, toggle proxy. It's better for your CPU's lifespan, too.
  11. Last year I made the point that Adobe seemed almost ashamed to advertise their really, really good Lumetri. I sensed a similar hesitation about proxies in Adobe's NAB 2016 announcement: like when you're sitting in a Greyhound bus with your Lenovo ThinkPad and have to edit a few hours of 8k footage. You might think, hey, this stuff is a bit laggy today, I should toggle proxy ... Note also how the paragraph starts with native RED in 6k and 8k and how the proxy workflow becomes a subclause, not a HEADLINE, bracketed again by full-resolution media. Very funny.
  12. I don't want to sound like a wisenheimer, but I predicted the success of an easy proxy workflow in Premiere from the first rumors of its implementation. Good for all Premiere users. Allow me to remind you of the days when Premiere supported the Nvidia-only Mercury Playback Engine. I could (but I won't) dig out some lengthy threads in which proud Premiere owners (you used to buy it) poured scorn on us FCP people. The mark of a professional NLE, they said, is the ability to edit the native codec. Now it's the other way around. "X" lets you skip intermediates because it's so streamlined; you use proxies mainly for multicam. The mark of a smart editor is that he uses everything that lets him work with fewer problems. Grandma Adobe abandons one of her last fortresses. What's next? Tracks?
  13. My style is different. I talk to everybody, pay compliments, make jokes, smile. By this I indirectly ask everybody to look at me, because my job is to portray them in the best light. I casually reassure them, make them feel comfortable. They don't awkwardly avoid me (but forget my presence when it's appropriate); they smile directly into my lens. In other words: I direct them. I had this in me before, but I really refined it as a technique since I began my new job five years ago, nursing people with dementia. It's called validation. You approach the human being in question with the conviction that they are the most valuable person in the world, that they make you smile radiantly; you let them mirror their best experiences. To achieve this, you first have to sense what makes them wary, shy or even hostile. There are quite reliable signals in body language. You charm away their unease simply by your own positive reaction to it, by subconscious changes in your eyes, posture and voice too subtle to feign. What I learned from this: you can manipulate others in a good sense and in a bad sense. People, no matter how blatantly they contradict the current concept of ideal beauty, like themselves in photos and videos when they are happy.
  14. Part of this is grading with a lot of well-saturated colors (Speed Racer, the terrible but beautifully graded Burlesque), and imo it has little to do with the camera or codec used. In terms of color differentiation, I guess you could shoot ProRes log with a BM camera and achieve this.
  15. Try it, and then report your experience. I am sceptical, because shooting a wedding implies a lot of handheld camera work (to move fast and get into narrow places or between the guests). Though I am quite good at avoiding shaky camera even with the tiny Pocket (my first challenge was learning to hold it right), it's tiring. If a mic were mounted directly on the camera, it would have to have a shock mount and be a shotgun; otherwise I would hear my strained breathing and the noises from focus manipulation right in front of the mic. An earwig isolates these influences. If there were a unidirectional version, it would be even better for voices.
  16. They can count on me as a customer! This is the kind of simplicity I admire.
  17. I shot one wedding with my Pocket. It has a notoriously bad internal mic, utterly useless even for reference audio in most cases. Being a one-man band and not into wiring anyone (which I did on one occasion: good audio, but too much interference, and it affected the "performances" too much, aside from being a hassle), I was thinking about a "lenshopper" plus my usual two Zoom H1s, which I place somewhere or ask people to hold. I researched and found a product from the German private inventor Wolfgang Winne, then called Ohrwurm (earworm), in my case the particularly ugly big headset with the ridiculous dead cats that makes you look like an idiot. I plugged one cable into the headphone jack, the other into the mic jack, set the mic level manually to 70% once, done! The microphones have an incredible dynamic range, capturing quiet sounds with no noise, and they don't clip or distort below 130 dB (that's when it starts to hurt). They are practically dummy-head mics that let you record what you hear. I originally planned just to get usable reference sound for the Zooms (mono recordings), but in the end I didn't use the Zooms' sound at all. The smaller earwigs have no headphone monitoring (I haven't tried them myself), but they have a very good reputation. Even if you mistrust them for the final audio, they will deliver better audio than any built-in camera mic.
  18. Why not just *key* the skin and color correct it? I made this into an effect. With a mix slider, everything in one small window.
  19. Valuable information above. I am no professional wedding videographer; I did 12 weddings over the past 9 years and always demanded a new piece of equipment as compensation. For me, shooting weddings is big fun, a bunch of challenges and a thousand times as rewarding as shooting corporate videos (I know a few people in that business). I've done that twice, no problem. As I said, I'm not a pro, but I went to film school, and I know how to present my point of view. What I wouldn't touch is a wedding where either the groom or the bride is a wedding videographer, the most begrudging species in the world. They always hate their rivals and despise their work, whether they admit it or not. The work of others is either cheesy, poorly photographed, has no story, is too long, boring, the music makes my toenails curl in horror (and/or is stolen, I'd like to report you), or it's completely professional, an interchangeable template, lifeless, loveless, everybody can see how much you disliked the bride by the amount of diffusion you applied, are you too dull to notice that? And so forth. How do I know? I was once a reading guest in a wedding videographers' forum, hoping to steal some tricks. Make no mistake: hell didn't exist before, it was invented for this vermin. A good distinction: special angles (beautiful photography) and special moments (emotional mise-en-scène). A successful wedding video always has both. When in doubt, go for the emotion. A sequence of perfect beauty shots that looks like a lavish commercial: unendurable. Grandma fighting back her tears, whispering God bless you: YES!
  20. Process? You seem to have found the expert mode. I used to calibrate my old display that way. It looked good then, looks terrible now. Perhaps because our own expectations are higher after a few years; the whole grading hype educates us to be unhappy with what we've got. I think the built-in monitor calibration was never exactly brilliant.
  21. I had a Spyder Pro, now an X-Rite i1Display Pro, used with the free software DisplayCAL. It takes a long time, but is very precise. Why is Yosemite's calibration process a "total mess"? Did you miss the old "expert mode"? It appears only if you hold "alt" while clicking Calibrate in >System Preferences >Displays >Color.
  22. Best way to actually see what's going on. You instantly see the difference, because the LUT doesn't cut off highlights. My proposal for a LUT alternative, if you'd rather avoid a Resolve roundtrip: make primary and secondary corrections on more than one* color instance (I usually have three to five, as many as I'd have serial or parallel nodes in Resolve; add funkier look effects if you want to create a better 'look-LUT'), save them together as an effects preset called moonlight shadows, and apply moonlight shadows to every clip that fits. You can easily group the clips with a corresponding tag moonlight shadows and select them as a group from the timeline index (start typing moon..., then hit cmd + a to select them all). Each "node" of the stack stays individually editable in the info tab. Best of all: the new super LUT now lives in the effects browser. By selecting a new clip in the timeline and skimming over the moonlight shadows icon, you see an instant preview of it, fullscreen, in realtime, with no need to apply the effect first. How about that? LUT Loader? Come on ... *more than one, because then you can adjust things with more precision.
  23. A LUT can be used for roughly two different purposes. One of them is applying a look. No one who has really concerned himself with CC would do that. I know some will contradict me here; I am interested to hear their arguments. The second purpose is to translate an image into another color space (i.e. RGB to CMYK), or to "normalize" a log recording to, say, Rec. 709, the usual procedure nowadays. Because the table has only a limited number of sample points, the result is merely an approximation. There are many thousands of rounding errors. The transformation is very lossy; that's what is called generation loss. If you recorded flat to capture more shadows and highlights, those values are gone for good after the LUT has eliminated them. You would have been better off shooting in 709 from the start, because then you'd have had more original data. FCP X has some rudimentary log-normalizing LUTs built in (>info >settings): ... but even if you shot with, say, C-Log, you shouldn't use them. They always degrade the DR and colors, because the essential order of operations can't be changed: if you have a stack of corrections and effects (a 'pipeline'), they sit on top, and everything you tried to preserve is lost. A good way to deal with LUTs (if you have to) is to apply them to an adjustment layer, underneath which you can change the original video independently. Or, of course, use LUT Utility, because it is an effect, has an amount slider and can be applied downstream. There are few objections anyone could raise against this. The grading process is wysiwyg: you take away unwanted things step by step and can go back any time, like with nodes in Resolve. As mentioned before, a more precise way of translating colors than LUTs would be ACES. To be honest, I have no personal experience with this yet. Though FCP X computes with 32-bit floating-point accuracy, its tools for color grading aren't too precise (and before Andrew chimes in, the ones in the wonderful Lumetri panel - no irony - aren't either). They are enough for getting fast, good results. In Resolve (remember: it's free!) you can set the borders for what is considered shadows, midtones and highlights (color wheels >log) and how smooth the 'gradient' is. There are specialized tools to saturate very specific color ranges, change their color, their luma, the brightness-saturation ratio, what have you. You have a channel mixer. And so on. LUTs are mere crutches.
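To make the "limited number of sample points" argument concrete: a 3D LUT stores output colors only at grid points and interpolates everything in between, so every in-between color carries a small reconstruction error. A minimal Python sketch (the grid size and the toy gamma 'look' are my own illustration values, not any real camera LUT):

```python
# Toy 3D LUT with trilinear interpolation, to show why a LUT is only
# an approximation of the transform it samples.
import numpy as np

N = 17  # a common LUT grid size (17, 33 or 65 points per axis)
grid = np.linspace(0.0, 1.0, N)
# Bake a slight gamma lift into the table as a stand-in for a "look".
lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1) ** 0.9

def apply_lut(rgb, lut):
    """Look one RGB triple up in the table with trilinear interpolation."""
    n = lut.shape[0] - 1
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * n
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n)
    f = pos - lo  # fractional position inside the grid cell
    out = np.zeros(3)
    for corner in range(8):  # blend the 8 surrounding grid points
        idx = tuple((hi if (corner >> c) & 1 else lo)[c] for c in range(3))
        w = np.prod([f[c] if (corner >> c) & 1 else 1.0 - f[c] for c in range(3)])
        out += w * lut[idx]
    return out

color = np.array([0.3, 0.5, 0.7])
exact = color ** 0.9            # the true transform
approx = apply_lut(color, lut)  # what the table reconstructs
print(exact - approx)           # small but nonzero: the rounding error
```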
  24. I never had LUT Loader, only LUT Utility. I hadn't installed either of them on 10.3 yet, because I became aware that LUTs are not the right approach if you want the best results. With FCP X, you have many options to group clips and to batch-apply multiple "layers" of corrections to them using tags, combined effects presets and adjustment layers. Now, with the new easy way to disable attributes, it has become even better. LUTs are too crude. But anyway, since LUT Loader is free, I installed it just now on Sierra (clean install) with FCP X 10.3 (clean install). The installation guide says that Sierra only allows manual installation of the plugin. No problem, I always do that anyway. Nothing crashes if I apply it to a clip, but nothing happens either. This is a crop of the instructions JPEG delivered in the download folder [screenshot], and this is how it looks in 10.3 [screenshot]: I can't load LUTs ... Maybe, since it became freeware, they stopped adapting it to newer FCP X versions; it's practically EOL. Maybe you installed OS X / macOS another way. Color Finale, which has a 7-day trial with full functionality (perhaps even Color Finale Pro, superior because of ACES), includes LUT Utility. You can download that and see if it crashes. If that's the case, I recommend a clean install. EDIT: LUT Utility is still available as a stand-alone version for $29, and it also still has a free trial, here. And: I just received a newsletter from CGC saying that on Black Friday every CGC software product gets a 40% discount. A coupon will be sent to all subscribers. So hurry to register and download a trial!