Everything posted by cantsin

  1. This is how I bought my full copy of Resolve, too - I just paid $350. There's a healthy second-hand market, thanks to people who already owned the full version of Resolve when they bought a BM camera that came with a dongle. Neat's noise reduction is infinitely better than Resolve's built-in noise reduction; there's no comparison at all. However, the built-in noise reduction is handy if you just want to quickly get rid of a few noise artifacts in one particular shot without sacrificing real-time performance.
  2. It's definitely not more intuitive, since Premiere (and FCPX) provide one integrated interface for media management, editing and rendering, whereas in Resolve these three functions are separated into three modules/screens that look and feel like independent programs. (Color grading is the fourth module/tab, btw.) So you'll find yourself switching back and forth a lot. The advantage of this interface design is that Blackmagic can improve and extend the editing functionality without making the program convoluted.

     Whether or not Resolve is faster depends on your hardware. Resolve has high hardware requirements, particularly for CPU, GPU and RAM, while Premiere is a great performer even on low-end and older machines. However, if you have a fast machine (for example, with a current-generation i7 CPU, a current-generation Nvidia GTX card, 16-32 GB RAM and a dedicated video hard drive), Resolve can perform much better than Premiere. Thanks to very advanced disk caching, you don't run into the disk performance problems you do with Premiere, and thanks to excellent GPU support, real-time playback even with complicated color grades is great, and rendering times are much shorter.

     At the moment, Resolve 11 can basically do everything that Premiere and FCPX can do (as manifested by the fact that it can import their projects via XML), except for a number of built-in effect filters. However, there are two limitations: (1) The editing tools are quite basic. You get pointer/drag-and-drop, blade and trim/slip, but no ripple and roll edit tools. Keyboard shortcuts are very basic and limited, too, so currently I'm editing much faster with Premiere. (2) While Premiere imports almost any video and audio file, Resolve is limited to H.264 plus all common professional raw, intermediate and camera codecs (CinemaDNG, RED, ArriRaw, ProRes, Cineform, DNxHD/HR, XDCAM, XAVC, etc.). It only imports audio files that have been conformed to 16-bit/48 kHz WAV. Audio editing in Resolve 11 is quite rudimentary, too.

     You can indeed buy Neat Video as an OFX plugin and use it in Resolve. The only problem is that the OFX version costs twice as much as the Premiere version (around $350), and that applying Neat within a Resolve project will kill real-time performance (unless you manually deactivate the nodes where Neat has been applied).
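A hedged aside on the audio point in post 2: below is a minimal sketch of how one might batch-conform audio to 16-bit/48 kHz WAV before importing it into Resolve, assuming ffmpeg is installed and on the PATH. The folder names and file extensions are made up for illustration; any audio tool that writes 48 kHz/16-bit PCM WAV does the same job.

```python
# Minimal sketch: batch-conform audio files to 16-bit/48 kHz WAV for Resolve import.
# Assumes ffmpeg is installed and on the PATH; paths and extensions are illustrative only.
import subprocess
from pathlib import Path

SOURCE_DIR = Path("audio_source")      # hypothetical input folder
TARGET_DIR = Path("audio_conformed")   # hypothetical output folder
TARGET_DIR.mkdir(exist_ok=True)

for src in SOURCE_DIR.glob("*.*"):
    if src.suffix.lower() not in {".mp3", ".m4a", ".wav", ".aif", ".aiff"}:
        continue
    dst = TARGET_DIR / (src.stem + ".wav")
    # -ar 48000 resamples to 48 kHz, -c:a pcm_s16le writes 16-bit linear PCM.
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(src), "-ar", "48000", "-c:a", "pcm_s16le", str(dst)],
        check=True,
    )
```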
  3. As a Premiere user, I've been quite aware of that. But as I wrote, CinemaDNG support in Premiere is abysmal. The tutorial you linked to was written by Matthias (who's with us here on this forum), and it contains the following statement: "you have access to a stripped-down set of source settings which let you change the white balance, tint, and exposure of the RAW files. Unfortunately, you lose about 95% of the control that you would get by performing an intermediate step and processing the files with a more comprehensive tool like Adobe Camera RAW".
  4. Resolve already does, even in the current version 11. It's just not yet as comfortable as Final Cut and Premiere. The business model for Resolve is clear. The full version, with, among other things, noise filtering and 3D support, still costs several hundred dollars and requires a hardware dongle. The real pro version is a combination of software and hardware, including the operation deck, for several tens of thousands. By providing a powerful Lite version, Blackmagic can (a) train students and enthusiasts to become Resolve specialists who later demand Resolve for their pro jobs, and (b) bring advanced color correction and grading to the masses. Once you get used to those possibilities, you will curse all the 8-bit/DSLR material in your project and want a raw camera. Which, bingo, only Blackmagic sells for an affordable price. Conversely, Premiere and FCPX obstruct the mass adoption of Blackmagic cameras because of their lack of proper support for CinemaDNG. It's been three years since BM's first camera came out, and the situation still hasn't improved towards making CinemaDNG truly work inside those NLEs. So Blackmagic had to do something about it and get people to use Resolve instead.
  5. What a pathetic piece of wannabe crap. At least fake Werner's accent correctly if you choose to fake him. Screw you for your willful misleading and for wasting our time. EDIT: It seems Vimeo has taken down the video. Good.
  6. I think there is something wrong with either your lens or your test setup. I just quickly replicated your test using (a) a Sigma 18-35mm/f1.8 and (b) a Nikon 28mm/f2 Ai-s prime, both attached to the Metabones BMPC Speed Booster and a Blackmagic Pocket Cinema Camera mounted on a tripod, shooting a color chart in CinemaDNG raw at 24fps with a 172.5-degree shutter angle and ISO 800, in a completely dark room using only one artificial light source (a camera-mounted LED panel). The CinemaDNG files were imported into Resolve, with white balance/color correction first applied to the Sigma clip and then applied with identical parameters to the two Nikon clips. Sorry that I didn't spend time on more careful framing/focusing - it didn't matter for this test. The first picture is the Sigma at f1.8, the second picture the Nikon at f2.0, the third picture the Nikon at f2.8. I think the pictures speak for themselves.
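As a quick sanity check on the aperture values in post 6: the nominal exposure difference between two f-numbers is 2·log2(N2/N1) stops, ignoring transmission (T-stop) differences between the lenses - which is precisely what a side-by-side chart test like this reveals. A minimal sketch of that arithmetic:

```python
# Nominal exposure difference between two f-numbers, in stops:
# stops = 2 * log2(N2 / N1). Real lenses differ in transmission (T-stops),
# which is what a side-by-side chart test like the one above probes.
from math import log2

def stops(n1: float, n2: float) -> float:
    return 2 * log2(n2 / n1)

print(f"f/1.8 -> f/2.0: {stops(1.8, 2.0):.2f} stops")  # ~0.30
print(f"f/1.8 -> f/2.8: {stops(1.8, 2.8):.2f} stops")  # ~1.27
```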
  7. This page provides valuable information: http://www.drastic.tv/index.php?option=com_content&view=article&id=202:prores-colour-shifts-in-post-production&catid=59&Itemid=94 Scroll down to section "ProRes Problems/Issues". This one here is particularly telling:
  8. I stand corrected - but the problem still seems to be that BT.1886 gamma came late and isn't universally used in Rec709 implementations, particularly not in computer video (where compatibility issues between Rec709 and sRGB, plus non-standard Rec709 implementations, abound).
  9. The problem lies deeper than just with QuickTime - it has to do with the fact that consumer HD video (including HDTV, web video and DSLR video) is recorded in the Rec709 color space, and Rec709 has no defined standard gamma. In other words, color science in today's consumer video technology is a completely screwed-up and unrepairable mess. It's years behind digital photography, where sRGB and AdobeRGB are solid color spaces that make sure the colors of digital stills remain consistent across platforms and software applications. The only fix will be, on the consumer end, Rec2020, the color space defined for 4K/UltraHD, and ACES, the new color space defined for post-production (which has the potential to replace today's vendor-proprietary log color spaces and raw video). Unfortunately, today's consumer 4K/UltraHD cameras, flatscreen TVs and streaming websites do not implement Rec2020 yet, but still use Rec709. ACES 1.0 was standardized just a month ago and will take years, if not a decade, to find its way into cameras and post-production.
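To make the gamma ambiguity in post 9 concrete, here is a minimal sketch of three curves that all get called "Rec709 gamma" in practice: the BT.709 camera-side OETF, the BT.1886 display EOTF (a pure 2.4 gamma at zero black level), and the sRGB display curve. The constants follow the published standards as far as I know; treat this as an illustration, not a reference implementation.

```python
# Illustration of why "Rec709 gamma" is ambiguous: three commonly conflated curves.
# Constants follow the published BT.709, BT.1886 and sRGB formulas (nominal values).

def bt709_oetf(scene: float) -> float:
    # Camera-side encoding defined in BT.709 (scene light -> code value).
    return 4.5 * scene if scene < 0.018 else 1.099 * scene ** 0.45 - 0.099

def bt1886_eotf(code: float) -> float:
    # Display-side decoding defined in BT.1886; with zero black level it
    # reduces to a pure 2.4 gamma.
    return code ** 2.4

def srgb_eotf(code: float) -> float:
    # sRGB display decoding (IEC 61966-2-1).
    return code / 12.92 if code <= 0.04045 else ((code + 0.055) / 1.055) ** 2.4

# A mid-grey scene value encoded by the camera, then decoded two different ways:
code = bt709_oetf(0.18)
print(f"BT.709-encoded 18% grey -> code {code:.3f}")
print(f"  displayed via BT.1886: {bt1886_eotf(code):.3f}")
print(f"  displayed via sRGB:    {srgb_eotf(code):.3f}")
```

The same BT.709-encoded value lands at a visibly different display level depending on which decoding curve the player or OS assumes, which is roughly where the familiar QuickTime/browser gamma-shift complaints come from.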
  10. OK guys, please allow me to be polemical (no personal offense to anyone here intended): Nowadays, "motion cadence" is snake oil and hogwash. Much of this discussion here proves it, since we're happily lumping together a whole number of different things:
      - recording frame rate
      - playback frame rate (including interpolated/software-generated higher frame rates in flatscreen TVs)
      - shutter speed/angle
      - rolling shutter vs. global shutter
      - motion encoding in the codec (such as interframe vs. intraframe)
      You could argue that all these factors combined amount to "motion cadence" - but why not call it, more simply and correctly, the "motion look" of the footage? In video forum discussions, however, "motion cadence" mostly refers to some elusive quality of how a camera renders motion. Typical discussions concern a camera A versus a camera B, both recording video at 24 fps with a 1/48 global shutter, yet people think that footage recorded with camera A has a "more beautiful motion cadence" than camera B - whatever that is.

      I found the earliest mention of "motion cadence" in a 2005 discussion thread on DVXuser.com. I could find zero (in other words: zero) mentions of "motion cadence" in any technical papers on Google Scholar or research databases, and zero occurrences in books (when searching Google Books and Amazon) or periodicals. The original thread on DVXuser, and a Wikipedia article on Sony camcorders, mention "motion cadence" in the context of true vs. fake 24p video recording. Back in the dark ages of ten years ago, most camcorders only recorded interlaced video. Some offered pseudo-24p that in reality was 48i footage deinterlaced in the camera. There's truth to the argument that real 24p and 24p deinterlaced from 48i have different "motion cadences" (even if the word remains fuzzy), because it literally concerns different cadences/intervals of recorded frames as opposed to the intervals of frames encoded in the digital footage. To my knowledge, we no longer live in these dark ages, and all cameras that record 24p today record true 24p - or at least, the cameras we're discussing here on this forum do. Issues with the motion look can stem from all the factors mentioned in the bullet list above, but they have nothing to do with recorded motion cadence/intervals.
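Since shutter speed/angle is one of the factors in the list in post 10, here is the simple conversion between shutter angle and exposure time, as a minimal sketch (the classic 180 degrees at 24 fps gives 1/48 s; the 172.5-degree angle used in the Pocket test in post 6 works out to roughly 1/50 s):

```python
# Shutter angle <-> exposure time for a given frame rate:
# exposure_time = (angle / 360) / fps
def exposure_time(angle_deg: float, fps: float) -> float:
    return (angle_deg / 360.0) / fps

for angle, fps in [(180.0, 24.0), (172.5, 24.0), (180.0, 25.0)]:
    t = exposure_time(angle, fps)
    print(f"{angle:g} deg at {fps:g} fps -> 1/{round(1 / t)} s")
```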
  11. Coincidentally, Gunther Machu just uploaded a film, parts of which were shot at the same location. Also quite interesting as a practical comparison of the image aesthetic produced by the A7s (Brandon's video) vs. the Blackmagic Pocket (Gunther's video). -F
  12. You must make sure that you:
      - use the same camera for both shots;
      - have the scene lit by nothing but artificial light (no daylight/sunlight interference) which remains at the same brightness;
      - keep identical shutter and ISO settings on the camera throughout the comparative shots;
      - make sure that when developing the raw images, your raw converter uses identical settings and applies no automatic exposure/gain adjustment.
  13. Just a word of caution: "motion cadence" is neither a scientific nor a technical term, but a term that only exists on videographer forums...
  14. Canon's demo film for the camera is out, and yes, it's got the Canon Cinema EOS colors and good looks (while being limited to deep-focus cinematography due to lens aperture and sensor size). So, no, cheaper consumer cameras with similar specs on paper won't replace this.
  15. Again, the C300 was everything but a cutting-edge cinema camera when it came out in 2011 (on the spec sheet: way behind what RED and Sony offered at that time), and it cost twice as much as a Scarlet. If - and that is a big hypothetical if - the XC10 produces a great image and turns out to be a robust workhorse, it will find its market even at $2500. EDIT: For example, people who buy a Blackmagic Pocket camera for $1000 and rig it up with lenses, ND filters, audio preamps, cages etc. will also end up paying $2500 for a 1" camera, and still end up being frustrated by its quirks and strong reliance on post-production. I'm a very happy Pocket user, btw. - but different people have different needs.
  16. Remember when the C300 came out, and everybody laughed it off because it looked like nothing against the Red Scarlet, which had been released at the same time, sporting much better specs (on paper) for a much lower price? (Remember this: http://philipbloom.net/2011/11/04/newcameras/ ?) Well, this story did not end the way everyone had expected. The moral: don't judge a camera by its spec sheet. The Canon XC10 doesn't look very compelling on paper, but could still edge out its 1"-fixed-zoom consumer alternatives like the RX10 with only a few details:
      - C300 colors + log profile;
      - no artificial image sharpening;
      - no downsampled, megapixel-overloaded consumer photo camera sensor with bad dynamic range and bad noise;
      - a fully corrected lens and no artificial software lens correction, as opposed to the badly distorting, in-camera-photoshopped "Zeiss" and "Leica" offerings from Sony and Panasonic.
      Admittedly, the demo video doesn't suggest any of the above, but as with the C300, I'd rather wait and see how the camera performs in real life than jump to premature conclusions with no real basis but the camera's spec sheet.
  17. Schacht was a West German brand with no relation to the East German companies you mentioned.
  18. Well, these days the Alexa has become the de facto standard movie camera. Your question was about as easy to answer as asking which two manufacturers most likely built the camera behind some professionally shot still photograph.
  19. Arri Alexa... What's so special about the camera here?
  20. Back on topic: If you make films because it's your passion, not your money job, then expensive equipment can actually make you less productive. It's the typical fallacy that, in the beginning, you think your creativity is limited by your non-professional gear - while every artist will tell you that creativity comes from constraints. (Dogme95 was a prominent example - another one was the French writers' group Oulipo in the 1960s, whose members wrote whole novels without a single occurrence of the letter "e".) But it seems that you first have to go through the pains of learning professional gear before you can throw it away again. Like a Zen student who learns everything from the masters but only achieves enlightenment when s/he ultimately abandons that knowledge. Nothing is more satisfying than shooting with a crappy, limited camera - knowing every one of its quirks and every trick in the bag to get a good-looking film out of it.

      Today, the quality of equipment really is no longer a question of price, since you can buy a Blackmagic cinema camera for $1000 and edit and grade on a $1500 gamer PC running the free version of Resolve. Very soon - even if you're getting good at ETTR raw/log shooting, color balancing and grading - you realize that you will never fully master this system, because it takes a team of professional camera operators, light technicians, sound recordists, film editors and colorists to get the maximum out of it. It's like having a Steinway concert piano almost for free - but still realizing that this won't make you Glenn Gould. But becoming a wizard with some cheapo Casio keyboard, creating truly interesting tunes (think of this: http://www.discogs.com/Various-Sweet-Sweet-Casio/release/3502194), is much more fruitful, more useful for culture (which has no need for another wannabe Glenn Gould) and much more subjectively rewarding.

      I'm currently getting a kick out of making videos with a GoPro Hero (granted, it's not such a bad camera if you use its Protune options - and if you know how to use DaVinci Resolve...), using it not as an action cam, but like a regular video camera. It's incredibly fun and indeed liberating - carrying just that plastic cube thingie with you, never worrying about equipment that you might have forgotten or that might get damaged, making camera moves just by putting it on escalators etc. And quite a kick if you manage to make it look good in the edit. It won't ever be a replacement for my Blackmagics in any more serious scenario. And if this had been my beginner's camera, I would be constantly frustrated with it and longing for more professional gear. You get the idea. Btw., there was a similar downgrading movement among indie filmmakers in the 1980s, when Fisher-Price brought out its audio-cassette-based PXL2000 camcorder: http://en.wikipedia.org/wiki/PXL-2000
  21. Vimeo is not an independent entity, but part of IAC (InterActiveCorp), the company that owns and runs OKCupid, Tinder, Ask.com and The Daily Beast, among others. IAC has a revenue of $3 billion, to which Vimeo contributes about $30 million. In other words, Vimeo is profitable while YouTube still isn't for its parent Google. Vimeo's profits come almost exclusively from its paid "Pro" subscriptions.

      If you work in film/media, then you'll likely know that Vimeo's publicly visible content is just the tip of an iceberg. For film producers, film festivals, advertising companies and everyone else making moving images for a living, it's now standard to preview their productions to clients, critics, festivals and other third parties via non-public, password-protected Vimeo videos. These days, even the big film festivals rarely receive DVD screeners anymore but default to reviewing private Vimeo videos. This is why - speaking purely from a business management perspective - improvements to playback quality or social functions don't matter that much to Vimeo anymore.

      While Vimeo's original concept and selling point was to provide a community site for filmmakers, it's precisely the end-user and community features that are rotting: dysfunctional search, discussion forums that became nearly invisible and are hardly used anymore since Vimeo's site redesign in 2012, largely broken groups/channel functionality, a poor mobile app (Chromecast support in the latest version is incredibly buggy and unreliable), etc., etc. Vimeo also doesn't use any technical infrastructure of its own, but runs everything on Amazon's AWS cloud servers.

      The comparison to Canon might be a good one. Just as Canon decided that the beef of its video market lies in the Cinema EOS line rather than in consumer DSLR video, Vimeo might have decided at some point that its real market lies in people using Pro accounts for non-public videos. For the latter, social and community functions don't matter. For Canon, video DSLR is now just a way to make people future customers of its Cinema EOS line, and therefore video quality is stagnating at a 2009 level. For Vimeo, the filmmaker community now just seems to be a way of making people future customers of Pro accounts, so video quality is stagnating at 2009 levels as well, and community features are slipping into bit rot.
  22. Back to the topic: This move makes a lot of business sense for Canon. The Spartan Analyzer device is now sold as a product to be coupled with netbooks and inkjet printers. Canon will be able to turn this into one integrated device - and the company's printer/photocopier business is already larger and creates more revenue than its camera and lens business. On top of that, Spartan's technology will nicely complement Canon's medical imaging products.

      For Canon, this is a sound strategic bet on a future where cameras will be a saturated market moving towards niche. (Think of HiFi components - a giant, booming, high-end/high-revenue consumer market in the 1970s that collapsed as early as the 1980s when {a} cheap mobile devices like the Walkman took over and {b} consumers felt no more need to upgrade the stereos they had.) All the while, there's a rapidly aging population in most industrialized countries; Japan, btw., is the most drastic example. This will give Japanese tech companies a competitive advantage: the need for technological innovation in the health and care industries will be more urgent there than in the rest of the world. But no matter the country, medical services and improvements in medical technology already are an explosively growing, highly profitable market. (I happen to work for an art school that already trains its students to seek their future in service design for the health industries rather than in the creative industries.)

      Still, one can get philosophical (or melancholic) about the fact that business models no longer focus on trend products for young people, but on electronics for an ever-growing population of old people who will need robots to take care of them. Assuming that the average EOSHD reader is a 20-to-40-something, it's safe to say that, in the future, we won't care about Canon 1Ds or 5Ds, but will depend on Canon Spartans and day-care robots.
  23. Add the Sigma 30mm/1.4, the Sigma 8-16mm/2.8, the Tokina 11-16mm/2.8 and 12-24mm/4.0, the Tamron 17-50mm/2.8, the Samyang 16mm/2.0, 10mm/2.8 and 8mm Fisheye - all excellent APS-C/DX/EF-S lenses.
  24. The two lenses you're considering have distinctly different focal lengths. The 50mm will end up being a portrait tele lens with the Speed Booster on your camera, while the 35mm will behave as a normal lens. Comparing the two is therefore, I'm afraid, comparing apples and oranges. On top of that, the Zeiss will have a modern, cool, rather clinical look, while the MIR will look slightly more vintage - but not by much, since it's a rather modern lens compared to the Russian vintage classics Helios 58mm, Zenit 85mm and MIR 37mm. I did have the MIR 35mm/2.0 in Nikon mount, but sold it after comparing it to my Nikkor Ai-s 35mm/2.0. The two lenses didn't differ that much, since both produced a similar-looking, "warm"/non-clinical and subjectively pleasing image somewhere in between a vintage and a contemporary lens. But the Nikkor had overall better optical quality (color rendition and detail resolution) and a more solid metal construction. Since the two lenses came really close, I'd advise not hesitating to buy the MIR if you can find it for $100 or less. (The Nikkor normally sells for around $150 in good used condition, but since I happened to get mine for around $70, it was an even better value for me than the MIR.)
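To put rough numbers on the "portrait tele vs. normal lens" point in post 24: the full-frame-equivalent field of view behind a focal reducer is simply focal length × reducer factor × sensor crop factor. The original post doesn't name the camera or adapter, so the 0.71x Speed Booster and 2.0x Micro Four Thirds crop in this minimal sketch are assumptions for illustration only.

```python
# Full-frame-equivalent focal length behind a focal reducer:
# equivalent = focal_length * reducer_factor * crop_factor
# Assumed values: 0.71x Speed Booster on a 2.0x-crop Micro Four Thirds body
# (the original post doesn't name the camera, so these are illustrative).
REDUCER = 0.71
CROP = 2.0

for focal in (35.0, 50.0):
    equivalent = focal * REDUCER * CROP
    print(f"{focal:g}mm -> ~{equivalent:.0f}mm full-frame equivalent")
# 35mm -> ~50mm (a "normal" field of view), 50mm -> ~71mm (short portrait tele)
```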