Everything posted by Axel

  1. Digging for the roots: North by Northwest. Few know that this very film was the blueprint for the James Bond style: chases in impossible places, probability be damned, the plot driven by some deliberately silly pretext that Hitchcock himself labelled the "MacGuffin". Before that, Hitchcock had made many comic thrillers. He liked the cinematic approach of the silent movies. A montage of attractions (the term Sergei Eisenstein coined), that's film in its origins. It's all about fun, about being amazed. Of these films I remember leaving the cinema and everybody could just say "Wow-wow-wow!": The first Terminator (hate all the others). RoboCop (the first, hate all the others, but I find the remake quite good). Alien and Alien³. Luc Besson (his approach). Kill Bill (that was pure fun). With modern CGI, anything goes. But little is awesome anymore. Speaking for myself, I am bored to death watching superheroes or disasters.
  2. Imo there are circumstances where raw can be the better choice. Ed mentioned the mixed color temperatures. This is where every average camera fails and where raw cameras shine. Mixed lighting was seldom used before, but with raw it can give your images that special dynamic and depth that make them stand out. On the other hand, there are disadvantages too. Depending on the lens used, you will get considerably more moiré than with ProRes. This is particularly true for the very sharp Sigma 18-35. And ProRes in LOG ("film"), compared to ordinary 8-bit Rec.709 ("video"), also gives you a lot of freedom with WB and exposure if the latter is your target. Make your own tests. In Resolve, raw and ProRes mix with no issues. I don't know your Mac or which other software you use, but if your graphics card is too weak (you should have 2 GB VRAM) to play back HD CinemaDNG in Resolve in real time, you can generate ProRes proxies, either for editing in Premiere or FCP X or even for Resolve as the NLE (a quick scripted sketch below). A few years ago I used Apple's Color, which didn't have real-time playback. But for grading, real time isn't mandatory - not as much as for motion graphics, and AE users don't even have that these days (this is allegedly going to change now). Otherwise the raw feels like DV in Resolve.
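If you go the proxy route, batch transcoding is easy to script. A minimal sketch, assuming ffmpeg with its prores_ks encoder is installed and your sources are already wrapped as .mov (the folder names are made up):

```python
import subprocess
from pathlib import Path

SRC = Path("footage")   # hypothetical source folder
DST = Path("proxies")   # hypothetical proxy folder
DST.mkdir(exist_ok=True)

for clip in sorted(SRC.glob("*.mov")):
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-c:v", "prores_ks", "-profile:v", "0",  # profile 0 = ProRes 422 Proxy
        "-c:a", "copy",                          # leave the audio untouched
        str(DST / clip.name),
    ], check=True)
```

(For CinemaDNG itself you'd render the proxies out of Resolve instead; ffmpeg's DNG support is patchy.)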
  3. I second kidzrevil. This shows perfectly the BMPCC's excellent colors. All three images are good; the first one, rather flat, looks cold and misty and could work just as well to transport that mood. I had the FC trial once but didn't buy it. Yes, I see the idea behind it, but there's nothing it does that can't be accomplished manually within, say, Resolve. But that's just me, and I don't want to argue.
  4. Oh yes, it may well be that the excellent C100 records full range; it will at least do so in C-Log (Canon Log), because otherwise a log profile would make little sense. But let this be confirmed by "someone bright". Let me quote from the 5D2RGB manual: But even if "full range" is the correct setting for your camera and profile, you should think about what they say here: how on earth can a copy be better than the original? Chroma smoothing? You lose information by that. Yes, I know that many posted "proof" that 5D2RGB did some magic on original footage that showed banding. If that's what you are looking for, a baked-in banding filter, go ahead. The whole point of using ProRes is that it's visually lossless: compared to the original, you can't tell the difference. But it doesn't add color depth or chroma samples - a copy can't contain more information than the original (a toy illustration below). The ancient *religion* of Adobe, to always use the native files to compute changes, is theoretically the only way to preserve full quality. There is a loss with ProRes, but no one can see it, and its many advantages outweigh this hypothetical risk.
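To illustrate the point about chroma smoothing: upsampling 4:2:0 chroma is pure interpolation between samples you already have. A toy numpy sketch (emphatically not 5D2RGB's actual algorithm, which I don't know):

```python
import numpy as np

# hypothetical 4:2:0 chroma plane: one Cb sample per 2x2 pixel block
cb = np.array([[100., 140.],
               [120., 160.]])

# nearest-neighbour upsampling: blocky, can look like banding on gradients
nearest = np.repeat(np.repeat(cb, 2, axis=0), 2, axis=1)

# 'smoothed' upsampling: every new value is an average of existing ones,
# so it looks softer, but zero information has been added
smooth = (nearest[:-1, :-1] + nearest[1:, :-1] +
          nearest[:-1, 1:] + nearest[1:, 1:]) / 4

print(nearest, smooth, sep="\n\n")
```

The 'magic' people see on banded footage is exactly this kind of averaging: a filter baked into the copy.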
  5. Yes, we had this before. If your 8-bit camera records in broadcast range, it most probably means 16-235. The range between 236 and 255 is not clipped, it just doesn't show immediately because it's 'superwhite'. The highlights can be recovered in your NLE by pulling 110 down below 100 on the waveform, thereby remapping the original values with a gamma curve. But: if you let 5D2RGB remap the 235 of your recording to 1023 in 10-bit ProRes, you lose these 'illegal' values. That's what happens if you choose "Full Range" (sounds better, but isn't), and that's what your images show - a blown-out sun in full range. Only very few 8-bit cameras record full range, among them the 5Ds, if I don't err. What about floating point? The 10 bits of ProRes don't help much if a mere 8-bit image is the source. With 32-bit float computing, an 8-bit image is treated as if it had a much higher bit depth; there are practically no 'rounding errors'. An 8-bit clip transcoded to ProRes remains an 8-bit image, there is no wizardry going on. Only with floating point precision can you avoid banding and other artifacts during grading - with AVCHD originals just as well as with ProRes copies. The latter are just more edit-friendly, and that's the extent of it (a small numeric sketch below). There are some very good tutorials on the web that demonstrate these simple facts, but I am on a train with a cell phone and won't search for them now.
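A small numeric sketch of both points - the broadcast-range remap and why 32-bit float grading avoids rounding damage (toy numbers, not any NLE's actual pipeline):

```python
import numpy as np

# 8-bit 'broadcast range' ramp: legal video is 16-235, 236-255 is superwhite
src = np.arange(16, 256, dtype=np.uint8)

# naive 8-bit full-range stretch: superwhite clips at 255, and integer
# rounding leaves gaps between output codes - the seed of banding
stretch_8bit = np.clip((src.astype(np.int32) - 16) * 255 // 219, 0, 255)

# the same stretch in 32-bit float: nothing clips, nothing is rounded;
# values above 1.0 (the superwhites) can later be pulled back with a curve
stretch_float = (src.astype(np.float32) - 16.0) / 219.0

print(stretch_8bit.max())                     # 255 - the sun is gone
print(round(float(stretch_float.max()), 3))   # ~1.091, the '110' on the waveform
```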
  6. The Red looked worse than anything else? The Canon colors were off? To be honest, ALL colors were terrible. I think the test was a worst-case scenario, also insofar as it was maximally biased. We didn't need another test to prove that 8-bit can look better than raw - that was the conclusion of the Zacuto shootout 2012, where they tried to make everything look as good as possible. And what they found was that the more 'forgiving' codecs made the DPs sloppy. The test further proves that Sony colors look bad under almost all circumstances and that you'd better have 10-bit to be able to make them acceptable in post. FS5 has better ergonomics for video than the A7RII? Absolutely. That's because Sony tried very hard to make the latter as video-unfriendly as possible. And they succeeded.
  7. ProRes is a DCT-based intraframe codec, similar in design to MPEG-2's I-frames - a very old approach that is not very clever/efficient in terms of compression. H.264 allows for much smaller file sizes at the same quality, and H.265 even more so. But these codecs fall apart very quickly once you change anything. For delivery, i.e. uploading to YouTube or streaming for broadcast, they may be better. But they are not good for editing, let alone grading or compositing.
  8. Well, I am glad you mentioned my poor English, but that was exactly what I was trying to tell you. ProRes isn't known for being a particularly efficient codec; it's famous for its robustness (resilience?). It made uncompressed obsolete. Don't worry about recompression.
  9. You're mixing up interframe (frames that are only partially described, referring to other frames within the GroupOfPictures, backwards and forwards, with variable data rates affecting the precision with which individual pixels are stored) with intRAframe (individually stored frames, with data rates varying only with the complexity of the individual images). I maintain that you can completely neglect recompression artifacts with ProRes. A crude picture of the difference below.
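As promised, a crude way to picture it (a toy model, nothing like a real codec):

```python
import numpy as np

rng = np.random.default_rng(0)
frames = [rng.integers(0, 256, (4, 4)) for _ in range(3)]

# intraframe (ProRes-like): every frame is stored independently
intra = [f.copy() for f in frames]

# interframe (GOP-like): one keyframe plus difference frames that only
# make sense relative to what came before
inter = [frames[0].copy(), frames[1] - frames[0], frames[2] - frames[1]]

# frame 2 of the interframe stream can only be rebuilt by summing the
# chain - so touching anything upstream forces a complete re-encode
rebuilt = inter[0] + inter[1] + inter[2]
assert (rebuilt == frames[2]).all()
print("frame 2 recovered from the GOP chain")
```

Cutting an intraframe stream just means picking frames; changing a pixel anywhere in the GOP means decoding and recompressing the lot.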
  10. You probably read the book The DV Rebel's Guide by Stu Maschwitz. He said that in order to maintain quality you should never recompress. I think it was its own chapter, in capital letters: NO RECOMPRESSION. Therefore he recommended using DV (an intraframe codec as well) only as an acquisition codec and encoding to *uncompressed* immediately before any further changes were made (changes to the pixels, that is, not simple cuts). That was in 2006. Now, ProRes is not uncompressed and it's lossy, though 'visually lossless'. Surely after some number of generations compression artifacts will show; I don't know when. I don't worry about generations though, even if I stabilize and denoise my clips and re-import them before making the Resolve roundtrip (I don't have Studio and don't have the Neat plugin for Resolve). With interframe codecs, the image is always completely recompressed - so-called 'smart rendering' isn't smart. Depending on the data rates and the profiles, you will *see* compression artifacts no later than in the third generation, sometimes in the second. This could be called 'visually lossy'. If you want to see generation loss for yourself, try the toy experiment below.
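The toy experiment, for anyone curious: JPEG is also DCT-based, so it shows the same mechanism much faster than ProRes would (this needs Pillow, and it says nothing about any specific data rate):

```python
import numpy as np
from PIL import Image

img = Image.effect_noise((256, 256), 40).convert("RGB")  # synthetic test image
original = np.asarray(img, dtype=np.float64)

for gen in range(1, 11):
    img.save("/tmp/gen.jpg", quality=90)                 # lossy encode
    img = Image.open("/tmp/gen.jpg").convert("RGB")      # decode = one generation
    err = np.abs(np.asarray(img, dtype=np.float64) - original).mean()
    print(f"generation {gen:2d}: mean error {err:5.2f}")
```

The error climbs over the first generations and then levels off; insert a resize or a grade between generations and it keeps climbing.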
  11. ProRes is an intraframe codec. That means if you don't change anything within the frame itself (grade it or resize it), there will be no recompression.
  12. Best watch a step-by-step tutorial for the basic editing workflow in advance. Editors experienced in other NLEs often feel like a bloody amateur in what seems like an iMovie-like software: entry level, capable of little, exit. A good demonstration of how complex the seemingly low-profile NLE FCP X can get is this old timelapse video: One year after putting the first trial of FCP X into the trash in 2011 (it wasn't stable, to put it mildly: it crashed constantly), I met a guy with a MBP on a set. He sat on the floor and edited something like the above in a very relaxed posture. I didn't recognize the software and asked how many tracks he used. He said: just one, look, here it is. (If you follow the Nepal clip, you can see, though, that the primary storyline = "the" track often changes to secondary storylines.) I was impressed, bought 10.0.6 and never regretted it.
  13. Well said. True. It boils down to how you like to edit. These things can get rather philosophical, like Walter Murch's essays on editing (e.g. In the Blink of an Eye). And I think it's worth thinking about. I started reading Lawrence's excellent article contra the magnetic timeline on Creative Cow. In the comments, a vivid discussion began and grew; I read almost all day. One contributor drew a handy conclusion: to write an NLE, the programmer is forced to introduce a paradigm - that's how you edit. As a user, you have to either accept that or change the NLE (like many FCP 7 editors demanded: give us back tracks!). One argument Lawrence and others often make, that a magnetic timeline is counter-intuitive and arbitrary, is imo not valid. People who have never touched an NLE before understand FCP X with no problems. On the contrary, if they are confronted with independent tracks afterwards, they don't see any logic in them.
  14. I wonder when there will be new Mac Pros, because the current models aren't that new anymore. These questions.
  15. FCP X on El Capitan. I must apologize in advance, because my postings often sound too enthusiastic when it comes to my NLE. I feel like I have to defend it when others complain that it's frustrating because of the magnetic timeline paradigm. I have been editing (mainly weddings) since 2002, and I know FCP up to FCS2 and Premiere up to CS5.5. Why FCP X? The simple answer: because I'm on a Mac. Imo it makes no sense to install Premiere on a Mac, because it doesn't take full advantage of the OS and (with modern Macs) the hardware. You could edit 4k video on a tiny MacBook with 1.1 GHz, 8 GB RAM and no dedicated graphics card using FCP X. Premiere now starts with proxy workflows as well (as I described here), and I think that's a good idea. In FCP classic you always had to log (& capture / & transfer), which meant you knew your footage in great detail before you could actually start editing in the timeline. While with FCP X you could start editing right after import, it's not a wise thing to do. And many of those who are used to other NLEs and 'give FCP X a go' aren't wise, because they tend to skip this part. For instance, when I review freshly imported clips in the browser, I already synchronize external audio, uncheck the camera audio and mark all usable portions of clips with "f"(avorite). I then set up a smart collection with the following rules: show synchronized clips, favorites and unused clips. Then I work from this smart collection. I don't see anything I don't want or need, and as I proceed my browser empties, because all used clips simply disappear. This of course is just the tip of the iceberg, not even using custom tags and roles. Organizing many clips (as typically come from a wedding) is a time-consuming nightmare in Premiere, and I can understand why people do that in the timeline instead of renaming, tagging or sorting things into folders. However, if I were to abandon Apple, I would go Adobe on Windows, simply because I know it. I never touched Avid, so I can't tell. I assume it's good. We have that in common. I have always preferred JKL (even over the funky 'skimmer'), and I always learn and customize all shortcuts. Why? Because it's way faster than shoving the mouse around. I touch-type and just watch the image.
  16. Do you only know cmd+shift+3, or also cmd+shift+4? Couldn't be easier. With Affinity Photo (Mac only), you can save the Photoshop fee. It was $39 last year when I bought it (you pay once & get all updates free); now it's $49. At first glance you think it's just an overly simplified PS clone that offers much less functionality. After you get used to the "persona" workflow, you realize it lets you do the same things, sometimes more easily and logically. BTW: Affinity also brings back the functionality of the old iPhoto (indeed more, and better) to the crippled Photos app OS X now includes. In German, of course, on my system. This is one of the Affinity extensions that Photo gets, called "Develop" (meant for raw); Develop is a "persona". Some of the classic Photoshop tools appear if you switch to "Retouch". These screenshots were both made in no time with cmd+shift+4, drawing a rectangle with the mouse and letting go.
  17. Well, here is the answer. Make a Time Machine backup of your current system (any external drive will do), and you are back there within half an hour if necessary. Drag & drop works. Of course every new OS has some bugs. I always read bug reports before I update. But I do update eventually. I have an office iMac with Mountain Lion, and though it's working fine, it feels slow and antiquated in comparison. Staying with an old OS is not the Apple way ;-) Since Yosemite, the most dramatic change is the appearance. The system font changed from Lucida Grande to Helvetica, which is harder to read on low-res displays but easier on Retina displays. El Capitan supports "Metal" on most Macs built after 2012. Apple's statement that it would make graphics up to ten times faster has been challenged, but an average boost of around 30% has been reported, depending on system and software. My 2009 MP with a GTX 680 feels faster, but I didn't measure anything; it may well be a placebo. The most annoying bug, NO BLUETOOTH, has allegedly been fixed with 10.11.2. I don't know, I use no BT. I didn't find this bug described anywhere, but my WiFi and LAN connection failed reliably after a while on El Capitan. I tested different things and identified the cause: Adobe Flash. I uninstalled it. Only a little AV content will not play back through HTML5 (Firefox announced "Shumway", a plugin that will play the dubious rest too), and Flash is risky anyway. Do not install it, even (and particularly) if you are prompted to.
  18. You are right. I just wanted to stress that one shouldn't make a dogma out of anything. Artists were performing "post" long before computers were invented. Surely the most fun comes from finding, making, forging things worth filming.
  19. You should watch one of those colorists' show reels to see how they treat an average shot, rigorously re-lighting it and applying a multitude of combined nodes with tracked vignettes. I am sure that Resolve contains every filter of the Tiffen Dfx suite - and Resolve is free AND better for general CC, if you know how to do it. So it's my own admission of incapacity that I recommend this collection of instant effects. I haven't bought them yet. I had those sterile FS7 shots, and the trial saved me. Above, some seem to prefer glass diffusion over post. I'm curious to learn why. Remember the first Ursa Mini shots? People suspected some kind of blooming bug: the filmmaker explained he had 'messed with some diffusion filters'. That's the problem. These *expensive* filters have a very subtle effect if they are good, and they are hard to judge through an EVF or on a monitor, even if that's at least an HD one. I think he should have 'taken off the edge' in post - much better to control. But with BM footage, there is no such edge anyway. That glass is better than post imo belongs to those modern myths. Like:
> You should always take an exact white balance reading before shooting! Why on earth? There is no white, neutral light in real life. You'll have a harder time matching all those completely different-looking shots in post, whereas with fixed settings you just have to compensate for clouds and things like that. 3200, 5600 and ("winter") 6500 K is all one ever needs.
> An 8-bit image has all the values baked in, whereas a raw image offers you total freedom! Bullshit. That implies that the two are fundamentally different. What is, for instance, CinemaDNG? A sequence of raw still frames (DNG allows JPEG-style compression) in 12-bit, all values effectively baked in. The difference to, say, AVCHD is that you have more freedom to change them. That's all.
> Don't "fix it in post"! That sounds as if it were better to do everything in-camera. But you can't, because modern cameras are either too smart (too many parameters fixed by the electronics) or too dumb (LOG with little to tweak). So you have to do some post. You just should be aware of that.
  20. Well, for me there is one answer: Tiffen Dfx! It's a bundle - indeed a LARGE bundle - of digital filters, as plugins hosted by Premiere, Avid, FCP X and Resolve (with or without their own GUI) or as a standalone app. It's not exactly cheap, but it does better than any glass filter by the same brand, because you can very subtly choose the luma/chroma range for each of them in post; they have many parameters to tweak, and there simply isn't anything comparable on the market. In this webinar, jump to 52" to see how you can deal with skin tones: for my taste, the filters applied here are almost too crude - these are extreme examples for the audience to see what's going on. I shot a restaurant scene that was way too bright and lacked any atmosphere. There are incredible light effects in Dfx that let you make the candles in the scene appear as if they were the sole light source (fall-off, shape, skin protection, what have you). Highly recommended. There is a trial too.
  21. (Color Finale PRO) @bluefonia Very smart. Would like to see real splitscreens/wipes in CFP nonetheless.
  22. (Color Finale PRO) Grading in Resolve, despite the necessary roundtrip, is still more comfortable than grading in FCP X, for different reasons. The two most prominent for me: (1) Saved grades represented by stills, as described in the Resolve manual, page 600. You don't grade 20 shots separately, you grade the sequence: you start with a defining shot, and you then compare the rest to this reference and match them. Within FCP X, there are two workarounds for this (maybe you know a better one): 1. you can make a freeze frame of the reference by hitting alt+f and move it over all your other shots, making the splitscreen manually (cumbersome); 2. you can park the playhead on the reference frame and compare it quickly with "s" (skimmer on the current frame on/off) - easy, but somewhat limited. For instance, you can't directly compare the scopes, which help tremendously in quickly matching shots (a rough sketch of that idea below). This is what I hope the still store will do in CFP. (2) Every saved grade in Resolve has a very clear node representation that lets you see at once which stages of the correction were specific to the clip and should be adjusted after copying the grade to another clip.
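Until CFP gets there, you can at least compare two shots numerically outside any GUI. A rough sketch of the 'matching by scopes' idea, assuming two exported stills (the file names are made up; needs Pillow and numpy):

```python
import numpy as np
from PIL import Image

def scope_stats(path):
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
    # per-channel black point, midtone and white point - roughly what you
    # read off a waveform when matching two shots
    return {ch: np.percentile(rgb[..., i], [5, 50, 95])
            for i, ch in enumerate("RGB")}

ref = scope_stats("reference_still.png")   # hypothetical exported stills
cur = scope_stats("shot_to_match.png")

for ch in "RGB":
    print(ch, np.round(ref[ch] - cur[ch], 1))  # offsets to dial out per channel
```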
  23. (Color Finale PRO) Just received an NAB newsletter in which Denver Riddle states the following: should the still store mean that I can view grades as splitscreen comparisons, this would be really exciting ...
  24. They officially stopped support (which in Apple's understanding means no further development) FIVE YEARS ago. They occasionally updated some security protocols; that was the extent of it. There was an old rug. They said it wouldn't carry. They ripped nothing - they just watched the competitors trample on the rug. Neither did they pull a plug; they just did nothing. Adobe seems to have done some homework. In their official statement, it sounds as if no QT were needed to read (e.g.) ProRes or other MOVs. And why should it be? Many companies have their own proprietary codecs, and wasn't it always Adobe's pride to support those without the need for plugins (what IS that QT actually, as far as ProRes is concerned)? How about five years' notice? Critical features? Does anybody use the QT player? On Windows machines I see a lot of those orange-white traffic cones, no blue "Q". Guess what: on OS X you see them too. Drop? As I see it, to drop something you must have held it first.