
sunyata

Members
  • Posts

    391
  • Joined

  • Last visited

Everything posted by sunyata

  1. http://www.studiodaily.com/2015/05/gopro-drone-vr-rig/?hq_e=el&hq_m=3089076&hq_l=4&hq_v=f7e4b89c3c "..The GoPro VR rig is due later this year..."
  2. Yes, there were indeed lots of variables involved in Technicolor. Assuming the Technicolor Process 4, you had a light loss from the beam splitter (resulting in an effective ISO of roughly 5), which demanded much brighter sets (maybe you can try adjusting gain or contrast to simulate that). There would be some crosstalk on the bipack strip for the red layer (behind the blue on the dual monochrome strips), probably very minor. They used proprietary dyes and matrices, and their shoots were also somewhat art directed by a Technicolor employee to meet the company's requirements. You had some fringing on the final subtractive prints too, because they were essentially 3-color carbon prints, or gum prints on film, which required precise registration. They used to use a "key" layer (black) to hide it before they ironed out the fringing. But since Technicolor used 3 separate RGB monochrome strips, in theory you should be able to attempt to simulate a print starting with RGB footage, converting to the respective complementary colors, and playing around with some filters / grading on CMY nodes etc. The problem is mainly your source material, i.e. the inherent problem with 3D LUTs.
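The CMY-node idea above can be sketched numerically. Here's a minimal NumPy toy, assuming idealized dyes and a made-up crosstalk fraction; the real Technicolor dyes and matrices were proprietary, so every constant and the `technicolor_print_sim` helper itself are purely illustrative:

```python
import numpy as np

def technicolor_print_sim(rgb, red_crosstalk=0.05):
    """Toy three-strip print simulation.

    rgb: float array in [0, 1], shape (H, W, 3).
    red_crosstalk: hypothetical fraction of the blue record leaking
    into the red record (the bipack effect); purely illustrative.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Camera side: three monochrome separation records.
    r = np.clip(r + red_crosstalk * b, 0.0, 1.0)
    # Print side: each record dyes its complementary (CMY) layer.
    cyan, magenta, yellow = 1.0 - r, 1.0 - g, 1.0 - b
    # Idealized subtractive recombination back to display RGB.
    out = np.stack([1.0 - cyan, 1.0 - magenta, 1.0 - yellow], axis=-1)
    return np.clip(out, 0.0, 1.0)
```

On a pure-blue patch this lifts the red channel slightly, i.e. the minor bipack contamination described above; grading on the CMY nodes would then shape each dye layer individually.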
  3. To be fair, I haven't tried their pasta sauce.
  4. http://www.nytimes.com/video/business/100000003538108/kodak-after-the-bankruptcy.html?playlistId=1194811622182&region=video-grid&version=video-grid-headline&contentCollection=Times+Video&contentPlacement=6&module=recent-videos&action=click&pgType=Multimedia&eventName=video-grid-click After watching this I'm left with the feeling that the death spiral has started.. or is nearing completion. I wish they would innovate their way out of the situation, but that's not the impression you get.
  5. Interesting link showing some common grading / timing looks, or pitfalls depending on how you look at it. http://www.flicks.co.nz/blog/film-is-a-harsh-mistress/making-the-digital-grade/
  6. Hey Ebrahim, the simplest answer is to grade for your final target format, which sometimes will add contrast, but to really know you need a test monitor, since even newer flatscreen TVs can add contrast. I'm not sure how people got the idea that low contrast is filmic; film can be very high contrast depending on the look someone is going for, i.e. skipping bleach or pushing a couple stops. Just make it look the way you want for your target media, which ideally you can test. Buy a vanilla flatscreen TV with HDMI inputs, for example, and use it as a second monitor. You can test title safe that way too.
  7. You make a lot of good points, but if you're concerned about the environment as a whole, you also have to consider the amount of carbon produced to keep all the always-on or charging servers, devices and infrastructure running 24/7, since that's the macro environment that has made digital viable and is also how it will be consumed. As more people come online with broadband, our media consumption has gone through the roof, which takes more servers and more electricity to deliver. I know there's no going backwards, but I just want to emphasize that digital is not clean, you just don't see the pollution when you create it.
  8. Nice explanation, he even weighs in on the HFR debate.
  9. You see a lot more of the television static type of noise in the F35 examples, especially in the neutral backgrounds.
  10. Since there was mention of the price drop on used F35s, I've been meaning to compare similar shots from 2 different seasons of a show known to have switched from the F35 to the Alexa (w/o a heavy grade), and finally got around to it... Alexa on the left, F35 on the right. Seasons one and four are compared (or 4 next to 1). The color grading was less consistent in season one and these are finals, so I'd say take it with a grain of salt. You can download a full size jpeg here to see it at 100% actual (cropped).
  11. You should look up the whole Ed Catmull deposition stuff if you really want to know what goes on. I prefer to remain anonymous when I post stuff online because of the security paranoia with studios (even before the hack); you're always on pins and needles, and that's pathetic in and of itself really. Vfxsoldier just ceased his fundraising efforts to lobby Congress for tariffs (which would prevent uprooting families year after year); he's now deciding how to pay his legal fees. I got a postcard from a girl who went to India to find greater meaning, not for work; she changed her name to something I can't pronounce (many people have recently made resolutions to find another career). Meanwhile VFX movies are topping the box office, Marvel and Disney are breaking records with lesser-known stories, and much more is on the way. Ed Catmull is currently president of Pixar and Disney Animation, and just sold his modest custom-built mansion in Marin for 11 million. Obama comes to Hollywood when he's campaigning and talks convincingly about keeping jobs in Los Angeles; later he attends an exclusive fundraising dinner arranged by David Geffen, and we see jobs go overseas. On and on. It's not just a VFX story.
  12. A little more information on this: The scanline on the right side was not part of the slate; it was applied to the rendered gradients in compositing to test the gamma and make sure it didn't change. It appears that editing in Premiere with settings at max quality / max bit depth made no difference to the render (I'm using CS6 though), but importing a Premiere project into After Effects, setting the comp in AE to 16-bit, then rendering at minimum "Trillions of Colors" comes closer to the top interpolation in Nuke of Y'CbCrA 444 (rec601) to display RGB (AE 32-bit didn't make a noticeable difference). This might allow skipping transcoding and still using Premiere for editing of mjpeg files, if the quality is deemed good enough. Importing into AE with 8-bit project settings but 16-bit output doesn't improve the banding on render; you have to set the project to 16+. Also, color space in AE was set to none.
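For anyone who wants to see why the intermediate bit depth matters, here's a small NumPy sketch. It isn't the actual Premiere/AE pipeline, just the principle: quantize a shallow ramp at 8 vs 16 bits, apply an adjustment, and count how many distinct code values survive.

```python
import numpy as np

def quantize(x, bits):
    """Round a [0, 1] float image to the given bit depth and back."""
    levels = 2 ** bits - 1
    return np.round(x * levels) / levels

def stretch(x, lo=0.40, hi=0.45):
    """A contrast adjustment that magnifies any banding in the ramp."""
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

# A shallow linear ramp like the slate gradient: a narrow code-value
# range is where banding shows up first.
ramp = np.linspace(0.40, 0.45, 4096)

via_8bit = quantize(stretch(quantize(ramp, 8)), 8)    # 8-bit intermediate
via_16bit = quantize(stretch(quantize(ramp, 16)), 8)  # 16-bit intermediate

# Fewer distinct output values means coarser, more visible bands.
print(len(np.unique(via_8bit)), len(np.unique(via_16bit)))
```

Both paths end at 8 bits, but the 8-bit intermediate throws away almost all the levels in the ramp before the adjustment, which is exactly the banding that gets baked into the render.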
  13. I tested importing a Premiere project into AE and rendering the mjpeg clip as an uncompressed tiff sequence from within an 8-bit and a 16-bit project, thinking maybe that would be a workaround, but the banding was still re-baked into the file. Although this banding was a little different, not an identical match to the render test from Premiere. I didn't test rendering float from Premiere after the 16-bit AE test though, or check the video preview box; I looked for something like that but couldn't find it. Renders were tested early on by Andrew, and I also tested in Premiere and AE, but as discussed above, I have not tested rendering float with the settings above for Premiere, only an 8-bit render, and I also used default video preview settings in Premiere. If you want to play around with it, you can use this slate: http://collectfolder.com/temp/slate_test_8bit_srgb_mjpeg.mov.zip
  14. Very interesting, I'm going to have to check that out now.. thanks! A quick test on the transcode thing: I'm averaging around 14 fps transcoding 4k mjpeg to ProRes Proxy at the highest settings in ffmpeg. frame= 1440 fps= 14 q=0.0 Lsize= 1417024kB time=00:01:00.00 bitrate=193470.9kbits/s CentOS 6, Ivy Bridge i7 3.4GHz. Not magical, but not too bad.
  15. Hey Steve, or anyone who can answer this: it's been a long time since I used FCP and I was wondering if there are changes with searchable markers or "cues" on a clip. Say I have 4 dailies that are 4+ minutes each. I need to assign meaningful cues that can be searched later, like "betty_outdoor_gun". Can I do this in FCPX w/o cutting the clip into subclips first? In Lightworks, for example, I can assign cues as I preview a clip, name them and, when done, export a spreadsheet with those names/timecodes for editorial or someone else to use as a reference. Then a search for a term like "betty", "outdoor", or "gun" will return a list of those cues, delimiters ignored, so all "outdoor" shots or all shots of "Betty" etc. This was worth the $25 per month fee! We often have multi-level approvals where someone at the last minute will ask for a shot in a specific context, and unless you have a photographic memory, you need to scrub through miles of footage.
  16. But it only makes the differences more apparent; the main problem does seem to be the conversion from yuvj422 to RGB display space in Premiere, which also happens in After Effects.
  17. Yes, trying to illustrate why you can't rely on QT Player as a frame checker. It's also known to auto-correct aspect ratio, interlacing and colors.
  18. You know what's a great day killer? Shoot C-41 color film, 35 or 120, develop your own negatives, wait for them to get close to dry and scan them as negatives, then do your own reversal in post. You can get the chemicals here: http://www.freestylephoto.biz/20411-Arista-C-41-Liquid-Color-Negative-Developing-Kit-1-Quart The bleach and fix are mixed into one step called "blix". You can also develop ECN-2 film with this kit. Just got a hand-etched screen for my Pentacon Six today actually; it has a 1.85:1 pattern on it, kinda nice. Managed to get a Flektogon 50mm f/4, with a stuck aperture but super clean, with hood and filters, for around $60. Just need to put on a hazmat suit and do the CLA.
  19. It also looks like QuickTime Player is altering the gamma of mjpeg files to 1.8 or something close, vs Premiere, AE and Nuke (sRGB). I looked to see if there were any hackable files in Premiere, and all the import files seem to be compiled. Various containers didn't avoid the problem. It doesn't seem to be a rec709/601 legal-range or scaling issue; I say that because the slate test shows values outside studio swing (it shows full range in the source test). Since this is an issue with Premiere and AE, I don't think you're likely to get a fix any time soon. Importing a Premiere project into AE also didn't remove banding with an 8-bit or 16-bit render to tiff. As was tested, rendering from Premiere also bakes in the banding, although I didn't test the higher bit depth render options; didn't think that would matter. Additionally, Premiere still seems to have virtually no file import options, import LUT, transform matrix, or CDL support. Anyway, a fast lossless codec option is Animation (qtrle) 8-bit RGB (or 24 bits per pixel), although it's not as friendly on slower systems. You can do it from ffmpeg like so: ffmpeg -i mympegfile.mov -c:v qtrle -r 24 -pix_fmt rgb24 muchbetterinpremiere.mov I'm surprised Premiere is still so limited.
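To get a feel for the size of that mismatch, here's a rough numeric comparison, assuming QT applies a plain power-law 1.8 decode (an assumption; its actual transform isn't documented here) against the standard piecewise sRGB curve:

```python
def srgb_decode(v):
    """Piecewise sRGB EOTF for a [0, 1] code value."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def gamma_decode(v, g=1.8):
    """Simple power-law decode, the assumed QT behavior."""
    return v ** g

# Compare decoded (linear) values at a few code values.
for code in (0.25, 0.5, 0.75):
    print(f"code {code:.2f}: sRGB {srgb_decode(code):.3f}  "
          f"gamma 1.8 {gamma_decode(code):.3f}")
```

The two curves agree at black and white but diverge in the midtones, which is why the shift reads as a brightness/contrast change in QT Player rather than a clip.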
  20. I think Digital Cinematography by David Stump is excellent: http://www.amazon.com/Digital-Cinematography-Fundamentals-Techniques-Workflows/dp/0240817915 then there are books on the human visual system, colorimetry etc., as well as public info if you're into reading PDFs online. This paper by Jeremy Selan http://github.com/jeremyselan/cinematiccolor/raw/master/ves/Cinematic_Color_VES.pdf formerly of Imageworks is really great, but it skews towards the VFX industry. To answer your above question though, bits and bytes are the basic units of digital computing; compression (lossy and lossless) involves using algorithms to encode or reduce the resulting data. I agree though that to fixate on the simple formula for color depth is to miss the point of all the discrete processes that happen inside a digital camera; wish I could help with cadence. The 24fps 180-degree film shutter still seems to be the standard I prefer, and the Alexa seems to come the closest IMO, not news.
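For reference, the simple color-depth formula mentioned above is just 2^bits levels per channel, cubed for the three channels; a quick sanity check:

```python
# Representable levels and colors at common bit depths.
for bits in (8, 10, 12, 16):
    per_channel = 2 ** bits   # levels per channel
    total = per_channel ** 3  # all R, G, B combinations
    print(f"{bits}-bit: {per_channel} levels/channel, {total:,} colors")
```

Which is exactly why the number alone says nothing about the debayering, noise reduction, gamma encoding and compression that happen before those bits ever reach a file.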
  21. Ebrahim - Very nice, your article reads like an Apple support page, excellent supporting visuals! Tiny suggestion to disassociate bit depth from compression, but that's all I got. Great post.
  22. Photography is not dead; asking "is photography dead" is dead.
  23. Andrew- I modified a slate to help test the Premiere issue. If you can import and do the same as with the other files, then render out highest settings RGB, I can compare to the source in nuke. Below is a plot scanline expression that tests the linear ramp at the end of the slate. http://collectfolder.com/temp/slate_test_8bit_srgb_mjpeg.mov.zip Would be happy to post the results to compare with the source, which is lossy mjpeg compressed 8bit sRGB gamma, full range.
  24. A couple of things you might want to also check (I see you've done a lot): make sure there is no transcoding or conforming happening on import (I'm assuming there isn't, because it would be an obvious thing to check; this is akin to rendering media on import, and some software does it automatically). Then, to test your theory that "What I think Premiere might be doing is remapping the 0-255 MJPEG material to the broadcast standard 16-235", also check with an eyedropper etc. in Premiere to see if your blacks and whites are still full range (0-255); this will rule out gamma/scaling issues due to possibly working in a rec709 project. If there is no transcoding/conforming on import and no gamma/LUT applied, then I'm not sure why the banding is baked in even when rendering uncompressed, very odd. Also, you can do a test with an image sequence of the same clip at 8-bit uncompressed just to be sure it's definitely something relating to mjpeg movie files.
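The suspected 0-255 to 16-235 remap is easy to pin down numerically. Here's a minimal sketch of the standard full-to-legal scale (not necessarily Premiere's exact math): if the eyedropper reads roughly 16/235 on patches you know are 0/255, this remap is in play.

```python
import numpy as np

def full_to_legal(x):
    """Remap full-range 8-bit code values (0-255) to broadcast
    legal range (16-235), the standard studio-swing scale."""
    scaled = 16 + x.astype(np.float64) * (235 - 16) / 255
    return np.round(scaled).astype(np.uint8)

patches = np.array([0, 128, 255], dtype=np.uint8)  # black, gray, white
print(full_to_legal(patches))
```

Note the mid gray also moves (128 lands near 126), so a waveform of a known gradient will show the whole ramp compressed, not just the endpoints clipped.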