Everything posted by Sean Cunningham

  1. Absolutely gorgeous. Great concept too. I'm curious about the various diopters you mention in the comments section; were they just for the close-ups? Your lead actor has such an interesting face. All that detail was rather stunning. The edges had a nice subtle falloff in the wides. It looks like you might have center-cropped to something more like 2.40:1; would that be correct? (There's a quick crop calculation after this list.)
  2. Apparently it's a danger with the technology used in any battery of that type. It's rare but it happens. This is the first and possibly only time it's happened with a BMCC so far. It's being passed around everywhere. Let's see what BMD's response is. I question the motivation behind releasing it without more information.
  3. What would the light loss be, if any, from the adapter?
  4. The footage shouldn't look different in DNxHD, so this is likely from a setting mismatch somewhere, which is easy to do. It's a terrible design. DNxHD just happens to be the best option on a Windows platform if you want to write files that stay platform-independent. Unfortunately, Avid are just terrible software designers and their codecs have always revealed this. Any codec that is actually installed should be visible and accessible to any program that gives you access to multiple codecs. Some applications won't expose every option, but they will be there. Anyway, you should maybe stick to exporting from Premiere with as large a file as is doable within your 5GB/week limit. You are now battling unfamiliarity on multiple fronts, with different solutions for each.
  5. Since the Helios 44 is a copy of a Zeiss Biotar, I would think wider Jena lenses would be in the ballpark. They tend not to be as fast as the 58mm, however: ...is it the slower speed or other properties that make something like this less appealing, Rich?
  6. This is 100% not true. VIMEO and Youtube will always recompress what you upload, and uploading at their target streaming rate is not how to get the best quality out of their services.
  7. Yeah, some footage had me totally convinced it was a mask shape and then later on I'd see it hit something that looked a little more complex than that would be practical. Digital + analog is often a winning combo ;)
  8. The humanoid shape looks to me to be an animated mask shape most of the time and not "in camera". It could also be put over a shadow that was actually in the footage but that didn't have the ghostly silhouette value they were going for. Much of it feels interpolated in its movements without much of the high-frequency motion that's present.
  9. And remember, "story" doesn't have to be a whole complex narrative. It could be something of a theme explored or imagery related to a mood the music produces or some interpretation of the lyrics, literal or otherwise. Here are some of my favorite examples of this where there's a story but not anything resembling a typical short film: ...these guys mix your typical shoot-the-band-playing-the-song type of music video with images that could be right out of a longer piece. There's a "story" there but not one you need to really explain or explore beyond how the imagery fits the mood of the piece. ...here you don't even have lyrics, just mood and themes, but there is a "story" to be interpreted. If you decide to forgo story, going with the seemingly "simpler" approach of shooting the artist performing, I've been terribly impressed over this last year by how creative and busy a fellow GH2 shooter has been, the young DeShon Dixon. Armed only with his GH2 and a fast 35mm prime he's constantly doing videos for new artists in LA. He has no lights and rides the bus to location. His story is rather incredible and even though most of the videos have nothing approaching "story" he's often able to create something that keeps me watching, even though the music itself isn't to my taste: http://vimeo.com/80451208 ...he's fairly masterful at finding natural lighting scenarios and positioning the talent relative to what's just there, even at night. Check out his VIMEO channel for other examples.
  10. Export from the editor to a high-quality ProRes (or DNxHD) master file. Check out MPEG Streamclip; it gives you more control and better results than Handbrake. This way you can tweak your settings without having to constantly render from your editor, and you keep more control over your codec. Re-compressing an MP4 to a smaller MP4 is you killing your quality before they can kill your quality. 500MB is way smaller than I'd expect for a quality upload master at your length; I don't want to speculate on exactly what it should be, but for an upload master it's small (there's a rough file-size calculation after this list).
  11. The classics are not applicable to everything. Conclusion: there seems to be no getting away from hipsters.
  12. Tweaking the Sequence preference to show full quality doesn't appreciably increase chroma smoothness on an MTS clip in PPro CS6. Adding a 32bit RGB filter and giving it a nudge, to try and force some kind of resampling and smoothing, does not seem to show an increase in smoothness. 5DtoRGB transcode still wins.
  13. The link you posted above discusses the filtering done when spatially scaling or otherwise transforming the image, not chroma scaling. I don't think you can assume the same filtering applies. Perhaps you posted the wrong link, but nothing in that document implies these methods are used to scale up under-sampled chroma.
  14. That article appears to be referring only to resampling from spatial transformation. That doesn't mean they don't switch to filtering when you engage a color effect, of course, but the text of that document is talking about the results of scaling, repositioning and rotation. The 32-bit linearization they describe is necessary to preserve the luminance of small features that might otherwise be filtered away in a normal integer operation, even with the best filtering methods.
  15. Unless Premiere does it better than After Effects, which I doubt since After Effects has the better pipeline thru-and-thru, it won't do as good a job, because After Effects doesn't, if you're just going to rely on the app to do the filtering. And you don't really have the tools in Premiere to process luma and chroma independently of one another. Odds are, if you're really concerned about quality, you're not finishing in Premiere anyway, so carry on. The problem with this is that MTS rewrapped as .mov has lower playback performance in some applications, on some systems, compared to the original MTS, even though internally it's the same contents. That being the case I'd take a little extra up-front processing and just go "all in" and convert to ProRes rather than some half measure.
  16. Two crappy movies do not negate Die Hard, Apocalypse Now, Christine, Rushmore, Boogie Nights, Magnolia, Blade Runner, etc., etc., etc. These Arris can't be digitally made to look the same any more than Panavision's own Primos can reasonably substitute for the C or B series or competitive post-Cinemascope packages from Kowa, etc. These new anamorphics are for people who want what they have to offer. There are, after all, folks who like Hawks or other mid/rear optic anamorphics. They're not appealing to everyone. The classic lens packages are not going to be collecting dust on rental shelves.
  17. They (Panavision) fixed the problem of Cinemascope being incapable of proper-looking close-ups (the "mumps"). That is a simple truth. Decreasing the breathing associated with anamorphic focus pulls is a different matter (that's not what you were talking about when I replied) and more a matter of taste. The value of what Arri/Zeiss did is directly proportional to your like or dislike of the Panavision system. For me it was a "problem" that didn't need to be solved, but, bravo. I like the look of anamorphic focus pulls in my favorite films from the '70s and '80s, a look that would not be meaningfully enhanced by the fairly clinical rendering these seem to bring, so I would never use these lenses.
  18. Panavision fixing this problem buried Cinemascope. What was surprising to some Cinemascope engineers was learning that they did it, in those first Panavision anamorphics, with the very same B&L optics used in Cinemascope lenses. I'd be very surprised if they didn't have a patent lock on doing this sort of variable astigmatising.
  19. Gotcha. With some it seems fairly obvious when looking at a CCTV versus a 16mm cine lens, but I've also been reviewing streams that have various lens types in their descriptions or tags. Admittedly I had only ever considered using the BMPCC with a Speedbooster attached, but playing with that little Fujinon on the D16 got me thinking a little differently, especially after seeing some of the prices compared to what I was expecting, having looked up $5000 and $7000 (or more) Canon and even Century Optics lenses for Super-16mm. It seems like you could build a fairly nice little kit with a fast 25mm, 50mm and one of those sub-12mm lenses. In a few I could tell the lens was quite soft, but I saw one that had a fantastic quality about it: softer than what I know you can get with the Pocket, yet it felt totally unlike anything else I'd ever seen shot on it, and very analog. According to the fellow in the Bolex booth these lenses have a different background falloff than the same focal length on an SLR stills lens. I know that the engineering for something like that is possible (and I assume it's what's incorporated into certain portrait lenses to enhance bokeh), but I wonder if it truly applies to the CCTV lenses versus "true" cine lenses designed for Super-16mm shooting.
  20. You wouldn't uprez your footage; you would scale the 4K grain down to something between 4K and 1080p until you got the look you wanted, then overlay it on top of your footage (see the grain-overlay sketch after this list). Upping the bitrate means that when VIMEO goes to re-compress it has room to work, and room to improve later as internet speeds and compression schemes improve. It goes towards "future proofing" your content, though I'm sure that's not a consideration for everyone.
  21. I confirmed just in the last week my suspicion that 5DtoRGB improves even All-Intra footage, and this was visible with a simple A-B between the two clips lined up on top of each other, in both Premiere and After Effects, without any special tricks to highlight the differences (being slightly zoomed in did help, however). Boundaries where the edges of very colorful objects meet darker parts of the image were smoother in the transcoded ProRes. Where two brightly colored objects overlap or otherwise share an edge, this smoothness was even more apparent. I also noticed areas with chroma noise being overall a little smoother (though that was only really apparent under magnification). Not all footage will show off the improvement, of course. You need to look at the kinds of footage or subject scenarios that hit color under-sampling where it lives, then decide how much "better" is worth the investment of actually working with the better methodology. Premiere and After Effects show a slight difference in their handling of the AVCHD footage. My guess is that's because Premiere's precision is engaged as a render setting, meaning any interactive color correction is only semi-WYSIWYG, unlike After Effects, which has a color-managed viewport. Premiere does no chroma filtering of AVCHD to the viewer; it might when you crank up the rendering precision, which would put it somewhat closer to After Effects quality. After Effects does seem to do some form of chroma filtering on AVCHD footage dropped into a 32-bit project file; it just doesn't do as good a job as 5DtoRGB. Unlike with Premiere, though, I figure I can now manually filter the chroma on the AVCHD in After Effects, match the 5DtoRGB filtering, and skip the transcode if I'm finishing in After Effects, which is generally the case (there's a toy chroma-upsampling sketch after this list).
  22. @Andy (or anyone), do you have experience with other C-mount lenses? I figure the Fujinon more or less automatically means CCTV and old video, so it stands to reason a lens of the same focal length made for 16mm film might be sharper. What about these other C-mounts that I see, like Cosmicars, Computars, Navitars and Kern Switars? I'm familiar with Angenieux and, of course, Pentax, but which manufacturers should you more or less steer clear of for applications like the BMPCC and D16, given those cameras resolve more than any of these lenses were originally designed for?
  23. There are also lenses covering that particular wide end that are even faster, like the SLR Magic 12mm T1.6, and you can get equally fast or faster lenses at 25mm and 35mm, especially at 25mm where you have multiple T0.95 options, which gives you the look and feel of a very fast "nifty 50" on a 5D. And then of course you have the Speedbooster and all that it brings to the table (a rough equivalence sketch follows this list).
  24. None of the presets that ship with editors or compression packages are actually adequate for high-quality results with VIMEO or Youtube. They all seem to make the mistake of targeting too close to the actual streaming bitrate used by the services, based on bad suggestions all over the internet and on old, incomplete and poorly written upload advisories that the sites themselves no longer recommend (and never did to their high-value customers). I'd make the h.264 upload between 20 and 25 Mbit/sec for something like this. Professional content providers often upload double that and are encouraged to, even by Youtube themselves. I know this is VIMEO, but if Youtube is advising folks to upload at four times current streaming quality or more, then that applies to VIMEO as well, which generally holds a higher standard to begin with even though both stream at similar bitrates.
  25. 2.54:1 is close to some older anamorphic processes. Unless you're doing something commercial or going to DCP, you don't really have to sweat hitting 2.35:1 - 2.40:1 exactly. For some commercial applications, getting a viable 16:9 version becomes the big headache no matter the anamorphic ratio. It's not unheard of for lenses to be mismarked or for the original designers to just be off in their estimation; apparently there are a slew of Ultra-Panavision lenses listed as 1.33x when it's actually a 1.25x compression format (the desqueeze arithmetic is sketched below).
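Regarding the 2.40:1 center-crop question in post 1: a minimal sketch of the crop arithmetic. The 1920x1080 source size is an assumption for illustration; the clip's actual resolution isn't stated in the post.

```python
# Rough check of what a 2.40:1 center-crop costs from a 16:9 frame.
# The 1920x1080 source size is assumed for illustration only.

src_w, src_h = 1920, 1080
target_aspect = 2.40

crop_h = round(src_w / target_aspect)   # rows that survive the crop -> 800
removed = src_h - crop_h                # rows lost to the crop -> 280

print(f"cropped frame: {src_w}x{crop_h}")
print(f"rows removed:  {removed} ({removed // 2} per edge)")
```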
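On the 500MB point in post 10 (and the 20-25 Mbit/sec figure in post 24): a back-of-the-envelope file-size check. The 10-minute duration and the candidate bitrates below are placeholders, not figures from the original posts.

```python
# File size of an upload master is roughly bitrate x duration / 8, ignoring
# audio and container overhead. Duration and bitrates here are hypothetical.

duration_min = 10                      # assumed clip length, minutes
for bitrate_mbps in (10, 20, 25, 50):  # candidate average video bitrates, Mbit/s
    size_mb = bitrate_mbps * duration_min * 60 / 8
    print(f"{bitrate_mbps:>2} Mbit/s for {duration_min} min ~ {size_mb:,.0f} MB")

# At 20-25 Mbit/s a 10-minute piece lands around 1.5-1.9 GB, which is why a
# 500 MB file reads as small for an upload master of any real length.
```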
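For the grain-overlay workflow in post 20, here is a minimal sketch of the idea: scale a 4K grain plate down to taste, crop it to the 1080p frame, and add it as zero-centered grain. The file names, the 0.65 scale and the 0.35 strength are placeholders, and the simple additive blend is just one way to do it, not any particular plugin's formula.

```python
# Overlay a downscaled 4K grain plate on a 1080p frame.
# File names and blend amounts are placeholders; both images are assumed RGB.

import numpy as np
from PIL import Image

frame = np.asarray(Image.open("frame_1080p.png").convert("RGB"), dtype=np.float32) / 255.0
grain_4k = Image.open("grain_plate_4k.png").convert("RGB")

# Scale the grain somewhere between native 4K (1.0) and 1080p-matching (~0.5),
# then crop a frame-sized window out of it.
scale = 0.65
grain_img = grain_4k.resize((int(grain_4k.width * scale), int(grain_4k.height * scale)),
                            Image.LANCZOS)
grain = np.asarray(grain_img, dtype=np.float32) / 255.0
grain = grain[: frame.shape[0], : frame.shape[1]]

# Zero-center the grain and add it at reduced strength.
out = np.clip(frame + (grain - grain.mean()) * 0.35, 0.0, 1.0)
Image.fromarray((out * 255).astype(np.uint8)).save("frame_grained.png")
```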
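On the chroma-filtering differences discussed in posts 12-15 and 21: a toy sketch of why the upsampling method matters for 4:2:0 material. It compares pixel-doubling a half-resolution chroma plane against a filtered (cubic) resize; this is a generic illustration, not 5DtoRGB's or After Effects' actual algorithm.

```python
# Toy comparison: bringing a half-resolution (4:2:0-style) chroma plane back to
# full resolution by pixel-doubling versus a filtered resize. The random plane
# stands in for real chroma; the point is the difference in edge smoothness.

import numpy as np
from scipy.ndimage import zoom

rng = np.random.default_rng(0)
chroma_half = rng.random((540, 960)).astype(np.float32)   # stand-in Cb plane

nearest = np.repeat(np.repeat(chroma_half, 2, axis=0), 2, axis=1)  # blocky 2x
filtered = zoom(chroma_half, 2, order=3)                           # cubic 2x

print("mean horizontal step, nearest :", np.abs(np.diff(nearest, axis=1)).mean())
print("mean horizontal step, filtered:", np.abs(np.diff(filtered, axis=1)).mean())
# The filtered result shows smaller sample-to-sample jumps, i.e. smoother
# transitions where saturated colors meet, which is the kind of difference the
# A-B comparison described in post 21 is about.
```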
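For the fast-lens and Speedbooster talk in post 23, a rough full-frame equivalence sketch. The 2.0x Micro Four Thirds crop and the 0.71x Speedbooster factor are assumptions for illustration (the BMPCC-specific Speedbooster uses a different factor), so treat the numbers as ballpark.

```python
# Ballpark full-frame "look" equivalence: focal length and aperture scale with
# the crop factor; a focal reducer (Speedbooster) multiplies both by its factor
# first. Crop factor and reducer factor below are assumptions, not exact specs.

CROP_MFT = 2.0   # assumed Micro Four Thirds crop factor
REDUCER = 0.71   # assumed standard Speedbooster factor (~1 stop gain)

def ff_equivalent(focal_mm, t_stop, crop=CROP_MFT, reducer=1.0):
    eff_focal = focal_mm * reducer
    eff_stop = t_stop * reducer
    return eff_focal * crop, eff_stop * crop   # field-of-view / DoF equivalents

print(ff_equivalent(25, 0.95))                    # ~(50.0, 1.9): the "nifty 50" feel
print(ff_equivalent(25, 0.95, reducer=REDUCER))   # with Speedbooster: ~(35.5, 1.35)
```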
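And for post 25, the desqueeze arithmetic: the delivered aspect ratio is simply the native frame aspect multiplied by the anamorphic squeeze factor. The combinations below are illustrative, and the 2.20:1 figure for 65mm 5-perf is approximate.

```python
# Desqueezed aspect = native frame aspect x anamorphic squeeze factor.

cases = [
    ("16:9 sensor, lens marked 1.33x",                16 / 9, 1.33),
    ("16:9 sensor, same lens if really 1.25x",        16 / 9, 1.25),
    ("4:3 gate, 2x anamorphic",                       4 / 3,  2.00),
    ("65mm 5-perf (~2.20:1), 1.25x Ultra Panavision", 2.20,   1.25),
]

for label, native, squeeze in cases:
    print(f"{label}: {native * squeeze:.2f}:1")
# 2.36:1 vs 2.22:1 for the first two lines -- why a mismarked squeeze factor
# matters if you're trying to hit 2.35-2.40 exactly.
```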