Everything posted by Axel

  1. Oh, sorry, I missed the "5" in H.265
  2. http://blog.trendmicro.com/urgent-call-action-uninstall-quicktime-windows-today/ Obviously many still use QT on Windows, although Apple's support has been poor in recent years (i.e. no upgrade from the obsolete 32-bit QT7). Let's see what comes next. EDIT: as of now, there are a lot of rumors, but apparently no reliable source. Trend Micro is a corporation like McAfee, who also warn Mac users of iWorm. You don't know iWorm? Install Trend Micro's firewall on your Mac, and you'll never make its acquaintance! So perhaps this is just some security jerks frightening the horses. Or, even better, Apple will finally release QTX for Windows?
  3. I'm not sure I understand your question. You had problems with luminance levels while transcoding (camera) MPEG-4 to ProRes? Then it's because the levels were misinterpreted. We discussed this issue before. Different cameras record at different levels, and those are embedded in the metadata of the camera card files. It makes no sense to change them, because aside from a milky or overly contrasty look you will introduce banding. I see: you want to color correct with full range, not just an arbitrarily limited range. But by baking a falsely interpreted range into the ProRes file, you actually introduce shifts and other problems. There were famous bugs in DSLR history. In the beginning, QuickTime didn't correctly interpret the 5D's H.264 RGB. That was when? 2007? With the popular 5D2RGB transcoder, you could change the levels at will, resulting in this phenomenon. But don't take my word for it: 5D2RGB has a free version. Transcode your clips with full range, import them into FCP X and watch what has happened in the waveform monitor. Tell me whether the chances of recovering highlights or shadows have become better or worse compared to the original range. ClipWrap (which also optionally transcodes to ProRes), on the other hand, will correctly read the metadata. FCP X does it all by default. It automatically rewraps the original clips if you imported directly from a card or camera archive (always use camera archives! If a camera archive exists on your Mac, clips can never go offline again, even if you imported optimized or proxy media and accidentally moved or deleted them) and doesn't change anything. It transcodes AVCHD/XAVC to ProRes in the background if you chose that, and it reliably reads the correct levels. So I have no idea why anybody would prefer to transcode their clips in advance. I used to warn people not to orphan their clips by moving them out of the card structure (i.e. by only keeping the STREAM or CLIP folder), and I still do for different reasons (spanned clips!), but I've never experienced wrong levels within FCP X, even when I just imported naked .mts, .mov or .MP4 files (because someone gave them to me).
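The clipping and banding point above can be sketched numerically. This is a minimal illustration of 8-bit video-range (16-235) vs. full-range (0-255) luma following the common BT.709 convention; the function names are mine and nothing here is specific to any camera:

```python
def video_to_full(y):
    """Stretch a video-range luma code (16-235) to full range (0-255)."""
    return round((y - 16) * 255 / 219)

def full_to_video(y):
    """Squeeze a full-range luma code (0-255) into video range (16-235)."""
    return round(y * 219 / 255 + 16)

# Misinterpreting an already full-range clip as video range and stretching
# it again pushes bright values past the 8-bit maximum, clipping highlights:
print(video_to_full(250))  # lands above 255

# The stretch also leaves gaps between output codes -- the source of banding:
used = {video_to_full(y) for y in range(16, 236)}
print(len(used), 256 - len(used))  # 220 codes used, 36 missing
```

In other words, a wrongly applied range expansion is not recoverable by simply compressing again: the clipped highlights and the skipped codes are baked into the ProRes file.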
  4. So another year has passed, and Adobe has surreptitiously dumped their caution in the Guadalupe River and made the Lumetri integration their new major feature: https://blogs.adobe.com/creativecloud/whats-coming-next-in-adobe-premiere-pro-cc-and-media-encoder-cc/ Another novelty: CC now allows for proxies, which are transcoded in the background (where did I last hear of such a thing, in these very words?). Of course, given the incomparable realtime power of Premiere's native capabilities, Adobe feels obliged to clarify the rare occasions where such a feature could be useful. Very carefully put. Let's wait another year and see whether the cloud's crowd embraces a, er, native alternative to natively editing highly compressed 4K.
  5. Would it necessarily have to write 2.5K files? You see it in the BMCC: a 2.5K sensor, but 1080 in ProRes. Thanks to the debayering, it's better HD. As far as I'm concerned (my biggest failure: having sold the Pocket, influenced by a friend who lured me onto the Sony FS7/A7RII 4K route), the somewhat below-HD resolution is the only limitation of the Pocket/Micro sensor: not so apparent in ProRes, but sometimes very annoying in raw (moiré) ... I don't need global shutter if rolling shutter artifacts are few and if GS would mean smaller DR.
  6. A stupid question only if you had the Nikon version of the Metabones, which just has an aperture ring. For the EF version, Metabones points to the controls on the adapter itself: the IRIS button and the up and down buttons. I can't tell if this works reliably, though; I've only ever had the Nikon Speed Booster.
  7. Why would ProRes issues worry you on a Windows system? While it's true that ProRes is much slower in Windows QuickTime (only 32-bit computing there), as the editor you would cut the FS7's XAVC natively anyway. And if someone gave you ProRes to use in your project: of all NLEs on Windows, Premiere handles ProRes best. However, the data rates of XAVC UHD don't force you to use Thunderbolt. You wrote you were on a Mac for 7 years, so perhaps you have a 2009 MacPro, like me. I edit FS7 footage with ease, but I'm using FCP X. It plays back the original files pretty smoothly, and even large projects don't scare me, because in that case I allow ProRes proxies (a completely automated workflow, a no-brainer). I can switch to original media at any time, e.g. for critical work like color correction or before the Resolve roundtrip, which works best with FCP X. Exchange with Premiere on Windows is safe with XtoCC (although you have to take some care to organize your media in such a way that Premiere always has access to the original media; actually simple through "camera archives"). To get the same performance from Premiere on Windows, specs alone don't tell you much. You have to find the right components, and ideally someone with experience who helps you configure your system. I couldn't tell you, because I never built my own machine. 12 GB of graphics memory sounds impressive, but will Premiere as well as Resolve actually use it well? As for the 2013 MacPro, I was occasionally tempted to buy one, but as your friends said, it may become obsolete any time now. Furthermore, you'd need to buy a completely new ecosystem: without the whole periphery being Thunderbolt, you never exhaust its possibilities. And those gadgets still cost a fortune.
  8. None of the rigs convinces me. Particularly this one: how is it supposed to be held? Both front handles rest on the potbelly, the left hand takes the wooden grip (why wood?), and the right hand squeezes its way up from beneath to operate the absolutely useless follow focus ... The handle needs to be on the right side, the SmallHD 501 with Sidefinder on the left. It needs to be a compact unit, like a camcorder.
  9. Thank you. This camera is really interesting.
  10. Just for the record: that was two OS X versions ago and several FCP X updates as well (the current ones can be installed only on Yosemite or El Capitan). Current versions are OS X 10.11.4, FCP X 10.2.3 and Pro Codecs 2.0.4. XAVC has been natively supported for quite a while now; no need to rewrap, transcode or use a Sony plugin. The professional tool monica66 recommends is for FCP 7.
  11. It must fit the narration. However, would you say Birdman or The Revenant appeared real rather than surreal? I can only speak for myself: I expect some kind of meaningful stylization, something that signals, from the very start, that the whole thing is worth watching. In the first seconds of any film the filmmaker must catch my attention; he has to promise a good show. A good director once said that realism doesn't transcend. And anyway, of all the isms it is the most arbitrary one: you need to exclude emotion, mood and passion for something to appear real. Hyperrealism is the right term, meaning a heightened sense of reality. You can't sustain that for feature length. Special moments in life (more adrenaline), special moments in films. And all the more stylized.
  12. That goes without saying. All the tests that proved the Sigma not to be parfocal had their subjects relatively close: six feet away at f/1.8? But if your back focus is correct, the distance is more than 15 feet and the aperture is f/3.5 or above, it will look parfocal enough. The same is true for the breathing behavior: when you make a focus transition at open aperture at 18mm to an object very close to the lens, the image will zoom in slightly during focusing.
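The distance/aperture point above can be checked with the standard thin-lens depth-of-field formulas. This is only a sketch: the 0.025 mm circle of confusion is my assumption for a roughly Super 35-sized sensor, and the function names are mine:

```python
def hyperfocal(f_mm, n, coc_mm=0.025):
    """Hyperfocal distance in mm for focal length f_mm and f-number n."""
    return f_mm ** 2 / (n * coc_mm) + f_mm

def dof_limits(f_mm, n, s_mm, coc_mm=0.025):
    """Near/far limits (mm) of acceptable sharpness for focus distance s_mm."""
    h = hyperfocal(f_mm, n, coc_mm)
    near = s_mm * (h - f_mm) / (h + s_mm - 2 * f_mm)
    far = float("inf") if s_mm >= h else s_mm * (h - f_mm) / (h - s_mm)
    return near, far

# 18 mm at f/1.8, subject at ~6 ft (1830 mm): depth of field is roughly a
# meter, so any zoom-induced focus shift is clearly visible.
print(dof_limits(18, 1.8, 1830))

# 18 mm at f/3.5, subject beyond ~15 ft (4570 mm): this is past the
# hyperfocal distance, so everything out to infinity is acceptably sharp,
# which is why the lens looks "parfocal enough" under those conditions.
print(dof_limits(18, 3.5, 4570))
```

With these assumed numbers the hyperfocal distance at 18 mm f/3.5 is about 3.7 m, so a subject past 15 feet sits comfortably beyond it, while at f/1.8 and 6 feet the total depth of field is only about a meter.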
  13. We've already seen aerial drone shots with the BMMCC. As for the Ursa Mini: I've read it's a different sensor, not just firmware-updatable. If you deem GS essential, you'd have to buy the big Ursa and then exchange the sensor; whether that option still exists, I don't care. To keep a sense of proportion: did you know that even the big fat Alexa Studio has a RS? It only uses an analog trick to simulate a GS. All other models simply have RS with good read-out times. You know the price tag of those cameras? Alexa Mini €38,000, Amira €39,000. How much is an Alexa Studio? From what we've seen so far from the UM, there won't be terrible jello.
  14. Thanks for the review. You forgot to mention that the 501 can be used as a very compact HD EVF with the optional sidefinder. This may be an important feature for many.
  15. My favorite music video was one I did not shoot but edited, for free. It was a wild story provided by the band's lead singer. There were three days of shooting, around ten locations with tons of actors and a big party at the end, on a tight schedule. On the second and third mornings I presented them drafts of the intertwining little stories filmed so far, and they were enthusiastic. Of course I had spent some hours each night editing those. Since it was a poetic story, I cut my own version, with much-shortened story lines in non-chronological order and occasional appearances of the singer, and a third version with the band more prominent. I made I-don't-know-how-many little changes (every single one well motivated) at the request of the band, and that took at least two weeks. A month or so later, the singer phoned to tell me that the manager (a guy who had a camera himself and, I believe, Vegas) was not satisfied with it and had found another editor. That was a shock. It was really bitter. I became friends with the new editor, but his version wasn't approved either. I said to him, this is not going to happen to me again. Later that year, the manager called to ask if I was *free* to edit a multicam session of a gig. I hung up and never heard from them again. Personality rights (and specifically the right to one's own image) are more or less the same in every civilized country. I can't publish any recognizable face if its owner forbids it. And even if posing in front of a camera implies the will to have the image published, I can't do it because of the music: even the band can't give me permission if they have already sold the rights.
  16. Found on fcp.co. I don't want to judge you by my own standards, but in these videos I found some workflow tips, with shortcuts and methods I didn't know of or hadn't deemed useful. Some shortcuts used in an edit-to-music project: 1. To start the project, just select the music clip and hit Q; you instantly get the audio connected to a slug ('gap clip'). 2. To cut to the beat, you can chop the gap clip into placeholders for the actual clips by hitting Cmd+B while the audio is playing. Since I find it a little awkward to have to briefly press two keys at the same time, I will probably remap this shortcut to a single key (like the German ö, perhaps) later. 3. To substitute the images for the placeholder slugs, you can perform a simple three-point edit: first select the in and out points of the gap in the timeline with X, then select an in point on a clip in the browser and hit D to replace the slug from the start, or select an out point in the browser and hit Shift+D to replace the slug from the end. Useful, isn't it? We were all told at some point that we should forget about overwrite (delete), because it was a counter-intuitive command in a magnetic timeline. Wrong. Another video elaborates on footage organization, and though I do use this on a regular basis, I find it entertaining and inspiring to see how others use it: https://vimeo.com/158641571
  17. LUT Utility (integrated into Color Finale) is available as a standalone plugin for $29: http://www.colorgradingcentral.com/lututility/ I get all that for free, because I roundtrip to Resolve. But if I were ever to grade entirely in FCP X (which I did once, using the trial period of CF), I would buy CF and CoreMelt's SliceX+TrackX. By the way: SliceX is free now; it's then only limited to rectangular or oval shapes. Wicked, because with FCP X's own new mask tool and TrackX you could Mocha-track your CC masks in the fashion shown above (EDIT: well, I think not, but anyway, the SliceX/TrackX combo makes sense). For me, CF came too late. I had already learned to make minor adjustments with the simple color board and to move to Resolve when that didn't suffice. In Resolve, adding nodes and organizing them is a smarter way to grade step by step than changing the order of effects in the inspector, where the need to constantly scroll up and down bothers me. Free LUTs, free tracker.
  18. It's a good thing to show some DB stuff for a change. The colors look like what you'd extract from raw stills in ACR, Raw Therapee (or Lightroom? - I don't know Lightroom). Resolve seems to interpret raw differently, but people use it because it's so easy. How did you develop the raw?
  19. I now feel I just never noticed how good the DB can actually be. But in comparison it looks a little more Super 8 than 16mm. I think you are right. Also, with the Micro, one should be able to build a more usable rig (smallHD 501 with sidefinder and rec709 Lut permanently fixed with HDMI connection hidden), because the body itself is so small.
  20. Where have you been hiding all those years? Or had I been just blind? We are finally talking sense.
  21. You must admit that in this thread we saw some of the best examples of how these cameras (DB, Pocket) beat some state-of-the-art high-tech-gadgets made of plastic in terms of image quality. And, it seems to me, in filmic/cinematic/narrative creative FUN. Thanks also to Dave. Very good for wedding videographers also. I'm one of them.
  22. Bioskop, very well done. Only after I had bought a gimbal myself (the Ronin M) did I come to realize how little use it was to me. Another example of terminator vision (I used this term for poor color depiction in another thread). And there is another aspect your clip demonstrates: low-light ability. The whole thing is rather dark. With the A7s it could have looked like a better-lit place, and now I see (to quote Bloom) the problem with that, because obviously the place *was* rather dark. The Pocket renders the backlight very naturally, as well as the shadows (which are called depths, Tiefen, in German; and that's how they appear: deep).
  23. A quite spot-on description of the image quality! I always try to make my 'rigs' Super 8-like. GH2: the DB lacks a big fat eyepiece for three-point stabilization, that's all. Anyway, the best design was the original Bolex: hard to believe, but there weren't lots of shaky, er, clips back then ...