Everything posted by Axel

  1. What this demonstrates, imo, is that Blackmagic do their homework. They rewrite and optimize their software, whereas Adobe doesn't. Shouldn't it be the easiest thing on earth to optimize Premiere for Macs? Known parts, drivers, OS. But still it fails ridiculously, outperformed by far by a free piece of software.
  2. S-log is simply a way to stuff more DR into 8-bit or 10-bit files, which - with linear brightness values - can't hold 14 or 15 stops. Of course, once you map the values to your liking into Rec.709, practically all colors and nuances are just interpolations. Therefore, compared to raw, which simply provides quantitatively more information, you will introduce visible artifacts sooner. That's the difference. S-log is sometimes called the poor man's raw.
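
    To see why log buys headroom, here is a minimal sketch of a log transfer curve in Python, using the constants from Sony's published S-Log3 formula (transcribed from the Sony whitepaper; treat it as an illustration, not a reference implementation):

    ```python
    import math

    def slog3_encode(x):
        """Map scene-linear reflectance (0.18 = mid grey) to a
        normalized S-Log3 code value, per Sony's published formula."""
        if x >= 0.01125:
            return (420.0 + math.log10((x + 0.01) / 0.19) * 261.5) / 1023.0
        return (x * (171.2102946929 - 95.0) / 0.01125 + 95.0) / 1023.0

    # Mid grey lands around code 0.41, leaving room above it for many
    # stops of highlights that a linear 8-bit encoding would simply clip:
    for stops in (-6, 0, 6):
        print(f"{stops:+d} stops from grey -> {slog3_encode(0.18 * 2.0 ** stops):.3f}")
    ```
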
  3. Just stumbled upon BetterTouchTool, which seems to work similarly. It lets you assign whole cascades of actions to single clicks, shortcuts, shortcut sequences or gestures. In other words, it lets you fully customize FCP X. It's a beast already; now I can unleash its real ferocity. Love it.
  4. Yes, the Magic Keyboard is junk. The worst thing is the loud keys - I can't type at night. I abandoned it for the old standard keyboard. I've tried a lot of other keyboards but like this one the most.
  5. Axel

    The Sweet Spot

    @BTM_Pix This is a nightmare and a paradise at the same time.
  6. Both are fantastic, but since this is about cinematic grading, I do find something to gripe about in both. Here, the shadows are washed out. We can argue forever about excessive use of slow motion, kitsch (but forget it, I do weddings and have to keep my mouth shut!) and other aspects, but you would never get something like this through a quality-control screening for a cinema release. It's very distracting and unnecessary. And you can always avoid criticism by claiming it was an artistic choice. There is art, and there is craft.
  7. Which clip was that? I am not saying don't use retro looks. They are in the giant toolbox, and everything is permitted. 99% of these tools are there to enhance the experience by reducing the input, to filter out mundane or distracting elements. The content of the images (the 'what' you see) should be as strong in itself as possible, without music enhancing it (and ironically often shifting and lessening the original impact). A well-composed image will make it easy to choose the right style (the 'how'), be it sDoF, HDR, punchy, black & white, strong grain or moody colors. Or reduced image steadiness.
  8. Good point. It's a layer of meaningfulness, if you will. On the one hand, people strive for ever higher resolution and ever more realistic detail; on the other hand, this was never the reason why audiences suspend their disbelief. They are captured by something that looks stylized, intentionally distorted, artistically transformed. It's not about reality or cleanness, it's about transcendence. Film-look recipes over the last decade made these looks interchangeable. If not used with due subtlety and scrupulousness, the look is not perceived as a genuine quality (subconsciously associated with the old and magic techniques of big cinema) but as amateurish. It's then linked instead to millions of cheap music videos on YouTube. EDIT: so if Dan Wake wants to simulate the original "shake" of the old movies in 2017, he should be aware that the average viewer might also subconsciously see through this and spontaneously decide whether the whole simulation is worth suspending his disbelief.
  9. It's funny you mention that. Turn on the English translation in this video (timed to the Tarantino part; poor auto-translation, but you'll get the picture): they talk about 4k, HFR, 3D (they contributed to Billy Lynn). Then it's about H8ful 8. 70 mm. The title wobbles? Nostalgic, pathetic. I agree. And you know what: if there is but one person on this planet who really loves mechanical film, it's me. I hated the Hobbit, and I always wanted to make a mockumentary on analog cinema based on Theodore Roszak's great novel Flicker. Film has had its time. Audiovisual storytelling is here to stay, but we have to move on. Continuing to mimic film's failures is like designing furniture and painting it all with patina. Cheap and uninventive. The holy fathers of cinema would turn in their graves. They used state-of-the-art techniques in their day!
  10. Hans Punk, have you ever been a projectionist? You sound like one. Only you and me and maybe a handful of freaks will notice a difference between a 'scientifically accurate' simulation and a cheap retro effect. This is true for a lot of film effects (light-leak transitions, grain overlays, filmstock-simulating LUTs). Anything goes. In the end, no matter how 'realistic' it is, it's still a travesty. It's like the spice smoke burger restaurants spray on their meat, triggering a vague association with something char-grilled. No chef would fall for that.
  11. That's interesting. Inside I'm a textbook introvert, ridiculously shy, but I can jump over my own shadow at will. I learned that in three years of psychotherapy. The key, in a nutshell, is doing, at any cost, what your fears tell you to avoid. So do speak in front of a camera. The smartest people with the most interesting things to say often don't have professional voices (Stephen Hawking comes to mind, among others). And vice versa: some textbook extroverts, radiating self-confidence, with loud, pleasing voices, bundles of joy, are completely hollow and boring. Some novelist wrote that those extroverts wear their shallowest feelings on the surface. A good voice is an authentic voice. You can't feign that. Analyze what bothers you about your voice and appearance. Are there false notes? Be yourself; say what you mean and feel in the most direct and simple way. Do your body language and facial expression betray your words? Then you may have the perfect radio voice but still won't convince yourself, let alone others.
  12. There: that's exactly my experience with the Zeiss glasses, which allegedly have superior image quality compared to FPV competitors. One would wish they allowed complete immersion. With a handheld camera, I prefer to use the EVF with headphones on, isolated from the real world, just seeing and hearing what my future audience will see and hear. It's not like this with the glasses.
  13. See here. My problem is that for one very special scene I need to be able to move the camera in direct, harsh sunlight. I'm very skeptical about the visibility of the upcoming SmallHD Focus, and I think I will just keep the Cinemizer. To make it look less ridiculous, I spray-painted it matte black. It takes time to get used to, but with some practice I am now able to frame shots in very bright environments. The Vufine seems to have one problem: the right eye needs to focus at a distance of a fraction of an inch, whereas the left eye doesn't. That could be very difficult in the long run.
  14. Yes, of course. It all depends. As I said, there are systems already available that can edit native 4k the way we used to edit DV - actually smoother and with more realtime - without necessarily being considered all too powerful in terms of hardware specs. My iMac with FCP X, for instance. Honestly, sooner or later I also reach the end of my tether. The Premiere answer would be even faster hardware, until, like you said, a timeline loaded with effects again sets the limit. Premiere just never developed an easy and reliable proxy or intermediate workflow; this thread is proof of that. You pay more, monthly, for - to put it mildly - suboptimal performance and far more hassle than others have.
  15. The future is now: Premiere was updated and now officially supports GH5 10-bit video. No need for second-guessing and time-consuming workarounds.
  16. Not gone, but noticeably reduced, to the point where it's no longer irritating. Agreed? This is pathological pixel-peeping anyway, isn't it? One should try Neat, grain and ProRes LT. Maybe there are just too few values to represent the fifty shades of grey of the flapjack, and at this data rate macroblocking can't be stopped at all.
  17. Macroblocking in the shadows is the encoder deciding that noise is not a valid signal and oversimplifying the area. In the case of an H.264 master, that distinction is baked into the upload file already and only gets worse when the original is compressed again by the YT processing. There are three strategies against it (a small sketch of the grain approach follows below):

    1. Get rid of the noise using Neat Video. The downside is that, no matter how well you know Neat's expert mode, you will always sacrifice some texture detail as well - and protecting that detail by any means is the whole idea of 4k.

    2. Dither the shadows by either keeping the noise or applying grain. The first approach is unreliable and can't be fully controlled; the grain structure has proven fail-safe. The amount of grain needs thorough testing, but you can do that before the upload. Philip Bloom does this with FilmConvert. He has also sometimes uploaded ProRes LT, but in my own tests on Vimeo this didn't make a huge difference, if any. The grain (one can use the built-in grain effect of FCP) also reliably helps with fade-ins and fade-outs, which can cause temporal banding.

    3. ETTR. With less noise, there are fewer encoding errors.
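
    A minimal sketch of strategy 2 (grain dithering) in Python, assuming 8-bit luma frames as numpy arrays; the function name and tuning values are made up:

    ```python
    import numpy as np

    def grain_dither(frame, strength=2.0, shadow_cutoff=64):
        """Add fine Gaussian grain, weighted toward the shadows, to an
        8-bit luma frame before encoding. The grain breaks up flat dark
        areas so the encoder keeps spending bits there instead of
        smearing them into macroblocks."""
        f = frame.astype(np.float32)
        # Full grain at black, fading to none above the cutoff value.
        weight = np.clip(1.0 - f / shadow_cutoff, 0.0, 1.0)
        noise = np.random.normal(0.0, strength, f.shape).astype(np.float32)
        return np.clip(f + noise * weight, 0, 255).astype(np.uint8)

    # A flat near-black gradient that would otherwise band and block:
    frame = np.tile(np.linspace(0, 32, 256, dtype=np.uint8), (64, 1))
    dithered = grain_dither(frame)
    ```
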
  18. The first one looks natural, normal. But in a way like video. Not like a raw photo; it must be a JPEG, automatic program. Could have been a smartphone (sDoF? Don't know). In the second one the colors are off. The plants look sick. Maybe a black cloud suddenly rolled in. Don't touch the peppermint, it's poisoned! I like the third the most. It has the best colors of all and looks vivid, three-dimensional and real. I guess it was RAW originally. I don't know about the first two, but the third one was also shot with a good lens, therefore ...
  19. Axel

    iMac Pro

    There is a huge difference now. Do you remember? Now you can transcode in the background and start editing right away. And while with FCP X you can import all footage at once and immediately throw it into the timeline (even without transcoding and, if you are fearless, even with naked files that are left in place!), it's not very smart. In my very humble opinion. This app offers us the best tools to prepare our clips for editing in the browser. In my eyes this is at least as crucial as the efficiency of the magnetic timeline. But I have learned to respect the workflows of others. Mine also changes from time to time.
  20. Axel

    iMac Pro

    "... The greatly improved CPU and GPU performance of the iMac Pro is vitally needed for this ..." Well, financially I couldn't justify 5,000 or 10,000 € for a new machine. My own stuff is indeed XAVC UHD, as is what I get from others (I also edit weddings), and I don't see lags - but that must be the transcoding. In the end, it's not a rational decision. I spend a lot of time every day facing the computer, and I want it to be sexy. So far it has been a good year. Maybe these greedy suckers will get my money again. There you have it, haters, I admit it. I'm a miserable consumerist Apple fanboy!
  21. Axel

    iMac Pro

    jompais and Bioskop.Inc ask about the contradictions in our statements. You work on very long features; mine are never longer than 10 minutes (weddings). I usually have around 100-150 GB of footage for one day's shoot, between 100 and 300 clips, and external audio. I optimize my selections on import, just in case. I never multicam; I tried it just once in FCP X, out of curiosity, with old FCP 7 music video footage (ProRes). My machine isn't configured for that anyway, since I have the 256 GB flash storage and my TB raid (a Pegasus with HDDs) reads just below 500 MB/s.

    Ever since I began editing in 2002, I never had the fastest computers, and on top of that the old FCP could address only 2.5 GB of RAM. Therefore, longer features (a beginner's mistake) had to be split into chapter sequences, or else FCP became slow and would often crash. That happened most reliably when you scrubbed the playhead rapidly over a long timeline. These two habits - structuring a story into sequences (even for ten minutes, I'll have three or four) and not skimming rapidly in the timeline (if at all) - I kept. I limit the number of clips FCP X has to access at once, by filtering in the browser and by compounding chapters. I disable background rendering. Color correction, effects like the seldom-used Neat and so forth I save for the last stage, and once the edit is locked, I don't care too much if I lose realtime. I render selected parts, and that's the extent of it.

    To evaluate the usability of a system, one has to know the workflows and demands of the people who do or don't report bottlenecks. I never found benchmark comparisons particularly useful for my work. By performance I just mean smooth playback and overall responsiveness; I don't care whether the final export takes an hour or half an hour. I don't sit there and stare at the progress bar.
  22. Axel

    iMac Pro

    Neat is notoriously slow. I wonder if the maxed-out nMP gets realtime with it. Who has? Please tell the rest of us! About what joema wrote on proxy encoding: he must have had many, many clips ("on multiple SSD raids" - this gives you an idea). Say you have a library of 1 TB of 4k H.264. That's many hours of footage, and you want to batch-transcode it to proxy. *One* clip is transcoded almost instantly, faster than realtime - AND FASTER THAN WITH A nMP. After a while, the fans spin up, the machine gets warmer and the performance decreases noticeably. Here's Max again: Why? Because an iMac is not the right machine for this. Here is a trick, a little workaround I found for myself: keep the transcoding in the background and do some time-consuming foreground work. Meticulously favorite, tag and sort all clips. You can do that clip by clip without proxies, because this machine's strength is smooth playback of 4k H.264. Actually start editing (not multicam, though). In other words: interrupt the transcoding process. Force it to pause. I had 140 GB of H.264 footage, and it background-transcoded to Optimized AND Proxy in just a few hours - finished shortly after lunch. I rarely heard the fans. But in the first place, decide whether proxy/optimized media make any sense at all, since, as is generally accepted, the machine has no problem playing back 4k H.264. No multicam? No Neat? No five, six or more effects stacked (you can also right-click to transcode selected clips only, or render selected parts of the timeline)? Then stay native!
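
    FCP X handles all of this internally, but to make the idea of batch proxy transcoding concrete outside FCP X, a rough equivalent with ffmpeg could look like this (folder layout and naming are made up):

    ```python
    import subprocess
    from pathlib import Path

    SRC = Path("footage")   # hypothetical folder of 4k H.264 originals
    DST = Path("proxies")
    DST.mkdir(exist_ok=True)

    for clip in sorted(SRC.glob("*.mp4")):
        out = DST / (clip.stem + "_proxy.mov")
        # ProRes Proxy (prores_ks profile 0) at half resolution keeps
        # the files small while staying very easy to decode.
        subprocess.run([
            "ffmpeg", "-i", str(clip),
            "-c:v", "prores_ks", "-profile:v", "0",
            "-vf", "scale=iw/2:ih/2",
            "-c:a", "pcm_s16le",
            str(out),
        ], check=True)
    ```
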
  23. Axel

    iMac Pro

    Sure. You do use proxies for multicam then, don't you? But to be fair and to put things into perspective: you paid around $3200 for your machine, with a fine display included. My only way to directly compare my (same) iMac to other options is my best buddy's $10,000 HP workstation from 2015 (sans monitors), running Adobe. He is a graduate engineer for AV media and was formerly a full-time Adobe teacher. He knows what he's doing. I always find he struggles more with performance, particularly playback. But it's a genuine workstation, so the objection 'an iMac is the wrong machine for this task' doesn't apply to him. And I assume the iMac Pro will also not be ideal for this.
  24. If anybody has to justify his choice, it's only to himself. You often hear that Mac users show Stockholm syndrome. As you can see in discussions like this one, the same applies to Windows users: 'Now that I've made this decision, I need to defend it.' Defending is okay; defiance or evangelizing is ridiculous. And yet we all do it.
  25. Axel

    iMac Pro

    The beauty of working with a compact system (this also applies to laptops like the MBP) is that you are forced to reconsider your workflows. I used to have a "triptych": one monitor for the browser, one for timeline and viewer, one for AV-out.

    > You need a big desk for that.

    > The monitors for the GUI don't have to be state-of-the-art and top notch, but the reference monitor does. The latter needs to beat the color accuracy the system (OS with ColorSync and embedded NLE) can provide. That was justified in the past with iMac displays, when you had to bypass the system's color profiles. It isn't anymore.

    > You need a video card (unless you use a TV set over HDMI, which even some renowned colorists see as a viable option). Additional costs, and more chances that drivers stop working after an update.

    > You need full resolution at only two stages in the process: qualifying your footage and certain tricky post effects. 50-70% of the GUI isn't needed then. For instance, you don't need a timeline to organize footage.

    Actually, you don't need any GUI at all. Toggle fullscreen with continuous playback enabled. Navigate with up, down and JKL, set i + o, hit f to subclip your selections. That's eight keys to get rid of the GUI. At least with a laptop, I'd do it this way. Fortunately, with 10.3's workspaces on a 5k display, I can nonetheless see the event browser AND the inspector AND a 100% UHD viewer. Why should a middle monitor with a timeline and a, er, canvas sit idle? All in all, the more I know what I'm doing at any given time and where I am in my project, the less I need a graphic representation. I admit, though, that this would be asking too much if I had to manage tracks. But I don't, so I can reduce the UI to a minimum (for instance the height of the timeline: if I "zoom" in on video, the audio clips may be just colored lines). When I see a traditional editing suite now, I realize that I don't miss it.