
jcs

Members
  • Posts

    1,839
  • Joined

  • Last visited

Reputation Activity

  1. Like
    jcs got a reaction from Christina Ava in The very underestimated problem of RADIOACTIVE lenses   
Out of curiosity I purchased one of these to measure the computers/electronics around me: http://www.amazon.com/Trifield-100XE-EMF-Meter/dp/B00050WQ1G/ref=sr_1_1?ie=UTF8&qid=1398562001&sr=8-1&keywords=radiation+meter . I ended up moving some equipment around so that I would only be exposed to 2-3 mG (some UPSs were outputting 100+ mG). About a year later I purchased a new MBP and got the meter out to check it. I turned the meter on and it was pegged (100+ mG, perhaps over 200 mG based on the increasing scale). I thought, wow Apple, that's not cool. Then I figured perhaps the batteries were low or the meter was bad, since the reading didn't drop as I walked away from the laptop. I put a new battery in it- same issue. I tried another meter (I had also purchased a directional meter)- same issue. I noticed that as I walked away from the window the reading dropped a little. So I left the apartment and walked down the hall. The reading slowly dropped. I left the building, and out in the middle of the street it was finally back down to 2-3 mG. Then a lightbulb went on: about 6 months after I purchased the meter, SoCal Edison had upgraded the power lines by my window. When I first got the meter I had measured right at the window to check the power lines- it was 3 mG. I returned to the same window position and the meter was pegged- 100+ mG (guessing over 200 mG from the increasing scale).
     
I had developed a weird shoulder issue where the muscles always stayed contracted- it was my right shoulder, and I figured it was due to mouse/computer use. However, every time I left the apartment for a few days or more, my shoulder got better. I never put two and two together regarding the upgraded power line. So I moved everything in those rooms as far from the power line as possible (the new locations read about 15-30 mG- still too high, but much lower). My shoulder got better in about a week. I asked my MD if this could be psychosomatic- he said probably not; he's heard from plenty of patients whose issues got better after reducing EMF exposure.
     
I took the meter with me when looking for a new place to live. I was surprised how bad other places were, but none were as high as the old place (top floor, right by the power line). Surprisingly, landlords and real estate agents said other people did this as well (brought meters with them). The new place I moved to reads less than 2 mG in most areas.
     
High EMF exposure has been linked to brain cancer, ALS, Alzheimer's, and leukemia; however, the evidence is not yet strong enough for the EPA to regulate it (and/or politics and influence from the power companies get in the way): http://www.epa.gov/radtown/power-lines.html. They suggest moving away from the source of EMF, which I did. Another weird symptom was strange allergies- was it something in that apartment other than EMF? I don't know, but I brought all my equipment and furniture from the old place- so far, no more allergies.
     
Regarding ionizing radiation: what does it do? It damages your cells and DNA over time. The good news is that if you're eating healthy, exercising, and getting plenty of antioxidants, your body can repair the damage (including DNA) if the doses aren't too high. The problem with studying the health risks is the wide range of variables. It's hard to prove that low-dose, long-term radiation exposure caused a particular cancer. Smoking won't kill you right away, but it may lead to lung cancer, heart disease, and other diseases. Some folks won't get cancer because their bodies can handle the toxic smoke; others who only got secondhand smoke will. It took a long time to overcome politics, etc., for the truth to get out and for warning labels to be required. That said, people smoke anyway, some even after getting cancer or emphysema, as nicotine is so addictive. Many people think they are invincible, or don't care about living a long, healthy life. However, after getting sick, some decide they want to live and radically change their behavior.
     
I used to snicker a bit at the 'tin foil hat people'. Now, if there is a known risk and it's easy to avoid, I don't think twice about avoiding it. The only issue with the new Faraday cage-like place is that I can't get OTA digital TV signals and Verizon coverage is poor (I'm using ATT; some friends use Verizon and their phones don't work very well) :)
     
    I could go on about heavy metals, but that's further off topic and a story for another day (short summary- avoid mercury and aluminum in vaccines, don't drink tap water (use RO or distillation and add trace minerals), remove amalgam fillings, skip gadolinium contrast if you ever do an MRI, limit large fish consumption (except perhaps wild salmon), don't drink bismuth (Pepto Bismol etc.)). Two books which can be very helpful:
    http://www.amazon.com/Amalgam-Illness-Diagnosis-Treatment-Better/dp/0967616808/ref=sr_1_1?ie=UTF8&qid=1398564805&sr=8-1&keywords=heavy+metal+cutler
    http://www.amazon.com/Hair-Test-Interpretation-Finding-Toxicities/dp/0967616816/ref=sr_1_3?ie=UTF8&qid=1398564805&sr=8-3&keywords=heavy+metal+cutler
  2. Like
    jcs got a reaction from jpfilmz in The very underestimated problem of RADIOACTIVE lenses   
  3. Like
    jcs got a reaction from Christina Ava in Would you buy a 5d mark III now?   
    Canon (and the 5D3) has the nicest color processing in the price range, especially for skintones. In good/bright light, the 5D3 H.264 sharpens nicely and looks really good on a large HDTV at a normal viewing distance. It's very important to use sharp lenses and to nail focus. The 24-105 F4L is ok for closeups, but not so good for anything else (sharpness). The 16-35 F2.8L II, 24-70 F2.8L II, and 70-200 F2.8L II are very sharp and really help the 5D3 H.264 look its best. Nikon and Zeiss primes are also very sharp; the new Sigma 50mm should be spectacular.

Canon L lenses with H.264 (IPB) look good on HDTV:


    I also shoot with the Sony FS700 and Speedbooster. The FS700 is sharper and has greater dynamic range (at least 2 stops) along with smaller H.264 files. When white balance is set properly, color processing is decent but still can't match Canon, especially for skintones.

    The Odyssey 7Q (haven't used one yet) takes the FS700 to another level, though from footage posted so far the color processing is still not as good as 5D3 14-bit raw.

    The GH4 has resolution to spare but falls short on color processing and DR and has a tiny sensor (only S35 with a Speedbooster- an issue if needing full frame FOV). The A7S looks to have excellent color processing and DR though has a fair amount of rolling shutter.

    For run and gun, the 5D3 is better than the FS700 as it's lighter and easier to focus when using a quality viewfinder loupe. The FS700 can use autofocus lenses but they are slow and don't always focus on the desired object.

    5D3 14-bit raw is spectacular, especially when processed with ACR. The tradeoff is card and disk space and processing time. For the price, this is the closest one can get to an ARRI Alexa/Amira- the best cameras in most people's eyes.

    (GH4 can't compete with this color; processed with Resolve- ACR wasn't used)
    http://www.dvxuser.com/V6/showthread.php?322289-Wedding-(RAW)&s=6bee780c6bd52fdfdd928bcee7cec245 (Low light raw, processed with ACR).
  4. Like
    jcs got a reaction from sandro in 5D3 RAW, FS700+SB, iPhone 5S cut together   
Part of the learning process as a filmmaker is to try different cameras to see how well they work and how they cut together. There is no camera (yet!) that's perfect for all situations. Sometimes the camera you have on hand is the best camera (or the one you grab when the batteries died on your A-camera :) ).
     
Here we tested 5D3 raw, processed with raw2cdng (very fast, but no vertical stripe removal yet) and ACR (first shot, outdoors at golden hour), a Sennheiser G3 wireless, an Oscar Sound Tech OST-802 mic (great value at around $110 for the kit wired for the G3), the FS700 + SpeedBooster, and an iPhone 5S for the last shot (FS700 batteries gone). The Canon 24-105 F4L was used on both cameras, same mic.

Lighting (except the first shot- only sun, no reflectors) was two 6-element 'twisted bulb' high-CRI fluorescent studio lights (sorry, don't have brand info anymore- they were low-cost) with diffuser material draped over them.
     
    When the host (Elena) saw the iPhone 5S footage, she exclaimed "I like that better than the Sony!", lol. Getting good color out of the FS700 is possible, but takes work (other folks have stated the same thing- even with a 7Q it's not as good as Canon (or Apple!) color processing (from what I have seen online so far)).
     
Here's 100% FS700+SB + Canon 24-105 F4L at F4, handheld with an HDV-Z96 LED light on camera. This was our first time shooting this kind of thing, and the live audio ended up needing to be replaced with ADR (live quality was excellent with the Audio Technica 4029 stereo shotgun and Sony preamps; however, the dialog had to be re-done). We used a Shure SM7B and a Focusrite Scarlett 2i2 for ADR (great preamps for the price- sold my RME Fireface 800* after getting this!), recorded directly into PPro at my desk in my home office (not a sound-treated room). I had not done this before and it was good practice (didn't try for perfect sync, and some of the words were changed- kind of reminded me of old Clint Eastwood Spaghetti Westerns).

* The Sound Devices USBPre2 has more accurate preamps, while the Scarlett 2i2 is a bit smoother and well suited for vocal recordings. The Fireface 800 is a great, very stable product; however, I never liked the preamps (and while the routing flexibility will blow your mind in a good way, it's so complicated it will also blow your mind in a not-good way when things don't work :) ).
     
    Here are the same clips on youtube: https://www.youtube.com/user/TheHotDishShow
    There was a time when youtube and vimeo offered similar quality- it's clear vimeo is now better (too bad as youtube has more social/virality features).
     
We just shot another piece using 100% 5D3 RAW + the Z96 light and G3 + OST-802. When we ran out of card space (2x 64GB and 1x 32GB Komputerbay), I pulled out the slower CF cards and continued with H.264. With good lighting I believe the H.264 will cut well with the RAW, especially for internet delivery.
     
    I picked up this viewfinder and mini rail system for $130: http://www.amazon.com/AUTHENTIC-KAMERAR-VIEWFINDER-FINDER-NIKON/dp/B00EAXU6UM/ (says Nikon but works perfectly on the 5D3).
     
I almost got the Zacuto for $300 but decided to try this first. I mounted a handle using the rails under the camera for balance: this setup works very well! I can't imagine the Zacuto or another brand would be any better in terms of optical quality and magnification ratio (2.5x) (the Zacuto has anti-fog- not an issue so far in SoCal). Focusing wide shots is still an issue, even with ML peaking (I have to punch in first). What's really needed is an electronic viewfinder with full 1920x1080- I can easily focus when hooked up to an HDTV (no peaking even needed).
  5. Like
    jcs got a reaction from Ratguity in $700 28" 4K Monitor from Samsung / 10-bit 1080p OLED   
    http://www.pcworld.com/article/2137477/samsung-lowballs-the-4k-competition-with-700-display.html
     
    Want 10-bit 1080p 24.5" OLED instead? http://www.flandersscientific.com/index/cm250.php ($6500)
A 10-bit display is needed to see 10-bit material. For 8-bit displays, the image needs to be tone-mapped and dithered down from 10-bit- this usually happens on final render. It's not clear whether apps like Resolve and AE do real-time tone-mapping and dithering to 8-bit (or at least do so when paused).
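As an illustration of the dither step (a minimal sketch of one possible approach- random dither- not what any particular app actually implements):

```python
import random

def dither_10_to_8(samples_10bit, seed=0):
    """Reduce 10-bit samples (0-1023) to 8-bit (0-255) using random
    dithering instead of plain truncation of the two low bits."""
    rng = random.Random(seed)
    out = []
    for s in samples_10bit:
        # adding noise in [0, 4) before dividing by 4 rounds the two
        # discarded bits stochastically rather than truncating them
        dithered = (s + rng.uniform(0.0, 4.0)) / 4.0
        out.append(max(0, min(255, int(dithered))))
    return out

# a subtle 10-bit gradient that plain truncation would flatten to 128
ramp = [512, 513, 514, 515] * 1000
vals = dither_10_to_8(ramp)
avg = sum(vals) / len(vals)  # close to the true 128.375 average level
```

Truncation would map every sample above to 128; dithering trades that banding for a little noise while preserving the average level.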
  6. Like
    jcs reacted to jcs in Calling all colourists - Grade Panasonic GH4 4K ProRes next to Arri Alexa 2K ProRes   
    Sounds about right dhessel- what we'd expect from 8.67 bits and no sensor-provided increase in DR.

HurtinMinorKey- is there an error in the math? Only green gets 2x unique samples in the Bayer array; red and blue are 1/2 resolution. Y ≈ R*0.3 + G*0.6 + B*0.1, and only Y gets 4 samples to combine vs. 1 each for U and V in 4:2:0. If the on-camera debayer is very good, the low-res R and B will be better than 1/2 resolution; however, at best we are still looking at 8.67-bit YUV after downscaling. After conversion to RGB, it will be somewhat less than 8.67 depending on how well the debayer method performs. Thus not being able to see a difference matches what the math predicts.
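The 8.67-bit figure can be reproduced with simple arithmetic (a sketch of one way to derive it, assuming summing four independent 8-bit Y samples yields a 10-bit value while U and V stay 8-bit):

```python
import math

def combined_bits(n_samples, bits_per_sample=8):
    # the sum of n independent b-bit samples needs b + log2(n) bits
    return bits_per_sample + math.log2(n_samples)

y_bits = combined_bits(4)  # four Y samples per output pixel -> 10 bits
u_bits = combined_bits(1)  # one U sample -> 8 bits
v_bits = combined_bits(1)  # one V sample -> 8 bits

average_bits = (y_bits + u_bits + v_bits) / 3  # (10 + 8 + 8) / 3 ≈ 8.67
```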
  7. Like
    jcs got a reaction from Henry Gentles in This new 4K camera will blow the GH4 away   
    http://panasonicprovideo.tumblr.com/post/80965679215/varicam-modularity
    ;)
  8. Like
    jcs got a reaction from maxotics in Simple question about 4k   
Nyquist-Shannon sampling theory shows us that we need twice as many samples as our target frequency to prevent aliasing. This is true for audio, video, and all sampled systems. For example, 48kHz audio can reproduce up to 24kHz without aliasing; all frequencies above 24kHz must be filtered during recording and playback (in practice, higher sampling rates are used to allow for lower-cost analog filters). For video we're sampling light instead of sound, and the same theory applies. 1920 pixels can represent a max of 1920/2 = 960 'pixel pairs' (on/off) before aliasing. For example, a chart with black and white vertical lines can be used to measure 960 line pairs from a 1920 image. Even if the camera sensor and chart aren't perfectly aligned, we can see 960 line pairs. When looking at even higher resolution lines, more than 960 line pairs appear to be visible, but with aliasing artifacts (e.g. on the right side of the vertical line chart): http://www.graphics.cornell.edu/~westin/misc/ISO_12233-reschart.pdf
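The folding behavior is easy to verify numerically- a tone above Nyquist produces exactly the same samples as its mirror frequency (a small sketch using the 48kHz audio example):

```python
import math

def sample_sine(freq_hz, sample_rate_hz, n):
    """Return n samples of a sine wave at the given sample rate."""
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate_hz)
            for i in range(n)]

rate = 48_000
# 30 kHz is above the 24 kHz Nyquist limit of 48 kHz sampling, so it
# folds back: its samples are indistinguishable from an (inverted)
# 48 - 30 = 18 kHz tone
above_nyquist = sample_sine(30_000, rate, 64)
alias = sample_sine(18_000, rate, 64)
max_err = max(abs(a + b) for a, b in zip(above_nyquist, alias))  # ~0
```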
     
Top cameras have optical antialiasing filters which effectively cut off frequencies above the sensor's aliasing frequency. These cameras exhibit very high quality images, which is especially important for moving images/video. While aliasing can provide a false appearance of higher detail, it's one of the giveaways that the image/video is not film. If the aliasing is very high frequency and the image is somewhat noisy, it's not as apparent (e.g. BM cameras with no AA filter). Again, top cameras such as the ARRI Alexa and Sony F55 have very good (changeable/removable) AA filters. When looking at a chart we see excellent, low-aliasing results.
     
The TM700 was the top camcorder IQ-wise in its price range (by quite a bit) when it was new: 3x 2.53-Mpixel sensors (no debayering or related artifacts!), a Leica DICOMAR lens (F1.5-2.8), 1080p60 support for slowmo, excellent power OIS, and decent audio quality with an external mic (indoors, the only negative is fan noise picked up by the internal mics). It wasn't very good in low light; however, a simple LED light on top of the camera, with an external shotgun mic, makes a nice ENG/interview camera for any lighting condition. I still have mine but haven't used it much since getting the 5D3. The 17Mbps AVCHD codec, while pretty good, is a bit over-compressed for large-motion scenes/handheld, etc. (60p is 28Mbps). An external recorder might help, but I haven't tried it. The zoom range is very impressive: 35mm-420mm (35mm equivalent). With power OIS, even at max zoom, the image is pretty stable. Walking while shooting also works very well- no rig needed! :). The autofocus also works pretty well (easier for small-sensor cameras).
     
    http://shop.panasonic.com/shop/model/HDC-TM700K?t=specs&support#tabs
     
    The TM900 is a newer version with slightly better IQ (looks like more contrast):
    http://camcorder-test.slashcam.com/compare-what-i-cmp-u-cmd-i-view-u-mode-i-docompare-u-lang-i-en-u-id-i-167-y-185-u-name-i-Panasonic%20HDC-TM900-u-bname-i-Panasonic%20HDC-TM700-u-cmd-i-vergleich.html
    Note comments: "Sharpness and color at the AVCHD limits". 
     
    Thus the 4K GH4 with one of Panasonic's OIS autofocus lenses should be quite spectacular for doc/ENG/(narrative- some shots).
  9. Like
    jcs got a reaction from Sean Cunningham in POLL: What editing software do you use?   
    If FCPX works for you- it's a great value (+ Motion). I use FCPX for access to some of the effects but use PPro as it runs faster and has superior audio handling & better color grading built in (no color curves in FCPX?).

    Resolve is also evolving into an NLE- very fast and even better color grading tools (somewhat archaic GUI and flow, but nothing a little googling can't solve).
  10. Like
    jcs got a reaction from nazdar in Canon 4K refresh - C200 and C400 coming at NAB?   
    A 10/12-bit 422/444 long GOP codec similar to Sony's XAVC would be excellent for these cameras (part of the H.264 spec). After spending long hours dealing with RAW, I only use RAW when I have no other choice. The 24Mbps FS700 codec still outperforms 5D3 14-bit RAW in resolution and actual dynamic range (due to the better sensor in the FS700). 5D3 RAW wins in color fidelity and post latitude: 10/12-bit 422/444 efficiently compressed is the best of both worlds.
  11. Like
    jcs got a reaction from Ratguity in Write your own custom GPU effects   
    Adobe used to have Pixel Bender (but never for Premiere), now Red Giant is releasing what could be a very cool technology: a fast and relatively easy way for developers (and crafty end users) to write GPU accelerated effects: http://www.redgiant.com/store/universe/
     
    Maybe this will motivate Adobe to open up a GLSL/OpenCL API for Premiere ;)  
  12. Like
    jcs got a reaction from Jolley in Best DSLR For Property Video?   
    Sounds like a GH3/GH4 will give you the detail and quality you are looking for straight from the camera. I currently shoot with a 5D3 (RAW mostly lately) and FS700 + Speedbooster (when I need autofocus, pro audio, or slomo). RAW looks great, but is a lot of work, time, and disk space. The 24Mbit/s files from the FS700 are a bit over compressed (have a Nanoflash external recorder- that's extra weight and complexity). The GH3 has a decent bitrate, and the GH4 does as well (including 4K support, which will make very detailed 1080p in post). A GH3/GH4 with a Speedbooster and the Sigma 18-35 F1.8 would make a nice combo for what you are shooting.
  13. Like
    jcs got a reaction from etidona in Sony AX100 4K video camera - how much rolling shutter is too much?   
    Perfect for babies, kids, puppies, kittens: the target mass market & youtube :)
    Plus a nice $2k way to create very crisp, 444 8.67-bit 1080p.
    XAVC-S looks to be a decent codec.
    120fps, ND's, Zeiss lens, good low-light (per Sony), good image stabilizer, might be a nice camera for the price.
     
    The camera has a very good image stabilizer- that also 'helps' make the rolling shutter more visible: 
    (1080p60 much less RS).
  14. Like
    jcs got a reaction from maxotics in What does Pixel Format (8bit-32bit Floating point Video levels-Full Range) mean?   
There was a time when integer/fixed-point math was faster than floating point. Today, floating point is much faster. GPU-accelerated apps such as Premiere, Resolve, and (FCPX?) always operate in floating point. The only case where 8-bit can be faster is on slower systems where memory bandwidth is the bottleneck (or really old/legacy systems, and perhaps After Effects).
  15. Like
    jcs got a reaction from maxotics in getting the best footage on Vimeo   
    Last I checked both YouTube and Vimeo use customized ffmpeg (with x264) to transcode. x264 has been the best H.264 encoder for a while now. Thus if you want the most efficient upload you could use any tool which uses a recent version of ffmpeg (rendering out ProRes/DNxHD then using Handbrake is a decent way to go).

The challenge with your footage is high detail plus fast motion. Adding grain or more detail (by itself) can make it worse. To help H.264 compress more efficiently in this case, you need less detail in the high-motion areas. You can achieve this by first shooting with a slower shutter speed (1/48 or even slower if possible). Next, use a tool in post which allows you to add motion blur. In this example you could cheat: use masks to isolate the skateboarder and car, then Gaussian blur everything else in motion (mostly the sides, not so much the center/background). You could also apply Neat Video to remove noise and high-frequency detail (in the moving regions only) and skip the additional motion blur, as blur will affect the energy/tension of the shot (though adding more blur to motion will help the most).

    Once you have effectively lowered detail in the high motion areas (however achieved), H.264 will be able to better preserve detail for the lower motion areas- the skateboard, car, and distant background.
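As a concrete example of that pipeline, a hedged ffmpeg/x264 upload recipe (filenames are placeholders and the settings are just a starting point, not a definitive recommendation; a recent ffmpeg build with libx264 is assumed):

```shell
# encode a ProRes/DNxHD master to H.264 with x264 for upload
ffmpeg -i master_prores.mov \
  -c:v libx264 -preset slow -crf 18 \
  -pix_fmt yuv420p \
  -c:a aac -b:a 320k \
  upload.mp4
```

Handbrake wraps the same x264 encoder and exposes similar quality/preset controls.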
  16. Like
    jcs got a reaction from jagnje in getting the best footage on Vimeo   
  17. Like
    jcs got a reaction from nahua in Should I convert MOV files to ProRes 422?   
Going from 420 to 422 is only going to (possibly) improve the horizontal chroma. If the tool improves quality better than an NLE does, ideally you'd go to 444 (both horizontal and vertical). One way to see what's going on is to bring both clips (transcoded and original) into the NLE and place them on the timeline/sequence, one above the other. Then set the top track's blend mode to Subtract. Add a gamma filter/effect to the top clip and crank it up until you can start to see differences.
     
    Another test is to bring the clips in and just A/B (track toggle) the two clips. If you can't see any difference visually, not worth it to transcode (even if you see a difference with the Subtract+gamma test).
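Numerically, the Subtract + gamma trick looks something like this (a toy sketch over pixel values, not any NLE's actual implementation):

```python
def difference_map(frame_a, frame_b, gamma=0.25):
    """Absolute per-pixel difference of two frames (0-255 values),
    boosted with a strong gamma so tiny differences become visible."""
    out = []
    for a, b in zip(frame_a, frame_b):
        diff = abs(a - b) / 255.0  # normalized difference (Subtract)
        boosted = diff ** gamma    # gamma < 1 lifts small values a lot
        out.append(round(boosted * 255))
    return out

original   = [100, 150, 200, 30]
transcoded = [100, 151, 198, 30]  # subtle 1-2 code transcode errors
vis = difference_map(original, transcoded)
# identical pixels stay 0; 1-2 code differences jump to clearly
# visible levels once the gamma boost is applied
```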
     
The difference between Pascal's two clips is a major gamma/contrast shift.
  18. Like
    jcs got a reaction from odie in Resolve or ACR with 5D3 RAW   
I've been experimenting with Resolve and ACR, trying to find the best workflow for RAW- by best, I mean the highest quality possible in the shortest amount of time. ACR+AE runs a few frames/sec on my 4-core MBP and about 6 fps on my 12-core Mac Pro (both with AE multiframe processing on). Resolve runs near real-time on the laptop, and slightly faster on the Mac Pro (the Quadro 5000 is getting long in the tooth- a newer consumer card with a hacked Mac ROM would be faster).
     
    While I have been able to get Resolve looking closer to ACR, I have not been able to match it.
    http://www.dvxuser.com/V6/showthread.php?320727-Results-of-testing-various-5D3-RAW-workflows
     
    The first goal was to get Resolve to show an accurate image of what was shot. All the experiments with BMD Film color space and LUTs weren't able to do this (including the EOSHD and Hunters "Alexa" LUTs). I also experimented with creating 3D LUTs from scratch.
     
The 5D3 RAW footage is pretty much linear RGB. It appears it can be transformed to BT.709/Rec.709 with a simple matrix (a linear transform). Using the Rec.709 color space and gamma in Resolve, I can get much closer to the correct colors for the scene. Unless we are going from 14-bit to 10-bit (for example), it doesn't make sense to convert to a log color space, since we've already got everything captured in linear. From there we can grade for style, then save out the final render in a Rec.709-compliant color space (it's not clear what Resolve does with out-of-gamut colors- clip, etc.). Log makes sense in a camera, for capturing and preserving dynamic range in a file format which can't otherwise store the full range sufficiently. Log can also make sense if one wishes to use a 3D LUT that needs log as input. However, there are many different log formats: ARRI, Canon, Sony, etc. all make different versions. At the end of the day, we need to get back to something like Rec.709 for broadcast (internet test videos excluded, of course).
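A sketch of that linear-to-Rec.709 idea (the 3x3 matrix below is an identity placeholder- a real camera matrix would come from calibration- but the transfer function is the standard Rec.709 OETF):

```python
# hypothetical camera-RGB -> Rec.709 matrix; identity stand-in only
CAMERA_TO_709 = [[1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0],
                 [0.0, 0.0, 1.0]]

def rec709_oetf(v):
    """Standard Rec.709 transfer function for linear light v in [0, 1]."""
    return 4.5 * v if v < 0.018 else 1.099 * v ** 0.45 - 0.099

def linear_to_709(rgb):
    # the 'simple matrix' (linear transform), then the gamma curve
    mixed = [sum(m * c for m, c in zip(row, rgb)) for row in CAMERA_TO_709]
    return [rec709_oetf(max(0.0, min(1.0, c))) for c in mixed]

encoded = linear_to_709([0.18, 0.18, 0.18])  # 18% grey maps to ~0.41
```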
     
After comparing the debayering of ACR, Resolve 10, and AMaZE (used by mlrawviewer: http://www.magiclantern.fm/forum/index.php?topic=9560.0, which plays MLVs in real-time and can output ProRes HQ- a little faster than ACR, but still slow; AMaZE is only used for export), they are all very close. ACR is doing something with color and microcontrast that puts it above everything else right now. It looks very accurate and also 'pops' in a way that I haven't been able to get from Resolve (granted, I'm just now really learning it).
     
    I'm planning for a ~90-minute final product which means lots of footage. Even shooting Robert Rodriguez style (limited takes- edit as much as possible in-camera), there will be terabytes of footage. For now I'm thinking to use Resolve to make ProRes/DNxHD proxies quickly then go back and only process necessary clips with ACR+AE for the final cut (until or unless I can figure out how to make Resolve look as good as ACR). Or batch convert ACR+AE overnight and replace proxies as I go (deleting the massive RAW files).
     
    5D3 RAW is amazing, but requires a lot of planning, work, and most significantly, drive space.
     
     
  19. Like
    jcs got a reaction from odie in Video quality charts - February 2014   
     
    Interesting to see how these lists compare a year later. Resolution isn't everything- I would put the Alexa on top for overall image quality. They designed their sensor after learning from years of experience building film scanners.
     
    We used to argue about whether digital effects for audio would ever match or surpass analog gear. A few years ago even the most stalwart of my musician friends finally agreed that digital had matched analog, with the Axe Fx: http://www.fractalaudio.com
     
    I just watched MiB II again and was blown away with some of the shots in terms of skin tone and color (Eastman EXR 100T 5248). Is it possible to convert my 5D3 (RAW) or FS700 (RAW) to look like that with something like FilmConvert (two cameras I use)? No, not yet, as I don't think anyone has accurately modeled the film process well enough yet to emulate what film does with light. Initial ADCs and DACs sounded harsh and brittle when CD's were first released (folks still dig records and tapes). Now digital audio is fantastic (and most folks listen to highly compressed audio on iPod/Phone/Droid or cheap computer speakers!).
     
    Digital cameras are still very much like early digital audio. Matching the pleasing look of film has not yet been achieved with digital; so far ARRI has gotten the closest, and FilmConvert has made a good start. Accurately modeling and simulating film with digital cameras will happen someday. Why bother? Because it looks better, especially for narrative (where unreality and dreaminess are helpful in storytelling). Dynamic range, resolution, lack of aliasing, and accurate color processing are all very important. The last piece of the puzzle is (optionally) being able to get highlights and skin tones etc. to look like film.
  20. Like
    jcs got a reaction from etidona in Video quality charts - February 2014   
  21. Like
    jcs got a reaction from rafael3d in Discovery: 4K 8bit 4:2:0 on the Panasonic GH4 converts to 1080p 10bit 4:4:4   
    That's true- JPEG supports 444, 422, 420, lossless, etc. However when comparing images the compression level needs to be the same (most importantly, the DCT quantization). The rescaled web images are likely 420 while the downloads could still possibly be originals. The point about 420 is that web-facing content is typically 420 (with rare exceptions due to the desire to conserve bandwidth and save costs).
     
    If at 100% A/B cycling we can't see any difference, then there's nothing gained since the results aren't visible (except when blown up or via difference analysis, etc.).
     
    Scaling an image down (with e.g. bicubic) then scaling it back up acts like a low-pass filter (it smooths out macroblock edges and reduces noise). I downloaded the 4K 420 image, scaled it down then up, and A/B-compared in Photoshop: I couldn't see a difference other than the reduction of macroblock edges (the low-pass filtering), etc. Adding a sharpen filter got both images looking very similar. A difference compare was black (not visible); however, with the gamma cranked up, the differences from the effective low-pass filtering are clear.
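The down/up-then-difference test described above can be sketched with numpy. A plain 2x2 box filter stands in for Photoshop's bicubic resampling for brevity (even image dimensions assumed); `boosted_difference` is the "crank the gamma up" step that makes tiny residuals visible.

```python
import numpy as np

def down_up(img):
    """Halve then restore resolution (2x2 box filter as a stand-in for
    bicubic).  The round trip acts as a crude low-pass filter."""
    img = img.astype(np.float64)           # avoid uint8 overflow when summing
    small = (img[0::2, 0::2] + img[1::2, 0::2] +
             img[0::2, 1::2] + img[1::2, 1::2]) / 4.0
    return np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)

def boosted_difference(a, b, gamma=0.25):
    """Absolute difference, normalized to [0,1], with gamma < 1 cranked up
    so near-black residuals become visible."""
    d = np.abs(a.astype(np.float64) - b.astype(np.float64)) / 255.0
    return np.power(d, gamma)
```

On a noisy image the round trip measurably lowers the variance, which is exactly the smoothing seen in the A/B compare.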
  22. Like
    jcs got a reaction from P337 in Discovery: 4K 8bit 4:2:0 on the Panasonic GH4 converts to 1080p 10bit 4:4:4   
    Julian's images: saving the 4K example at quality 6 creates DCT macroblock artifacts that don't show up in the 444 example at quality 10. All the images posted are 420: that's JPG. To compare the 1080p 444 example to the 4K 420 example: bicubic-scale up the 1080p image to match exactly the same image region as the 4K image (the examples posted are at different regions and scales). The 1080p image will be slightly softer but should have less noise and fewer artifacts. Combining both images as layers in an image editor, then computing the difference (and scaling the brightness/gamma up so the changes are clearly visible), will help show exactly what has happened numerically; helpful if the differences aren't obvious on visual inspection.
     
    We agree that 4K 420 scaled to 1080p 444 will look better than 1080p captured at 420 (you'd need to shoot a scene with the camera on a tripod and compare A/B to really see the benefits clearly). 444 has full color sampling per pixel, vs. 420 having 1/4 the color samples (1/2 vertical and 1/2 horizontal). My point is that we're not really getting the significant color bit-depth improvement, and thus post-grading latitude, that a native 10-bit capture provides (at best there's ~8.5-9 bits of information encoded after this process: it will be hard to see much difference when viewed normally vs. via analysis). Another thing to keep in mind is that anything above 8-bit (24-bit), e.g. 10-bit (30-bit) images, needs a 10-bit graphics card and monitor to view. Very few folks have 10-bit systems (I have a 10-bit graphics card in one of my machines, but am using 8-bit displays). >8-bit images need to be dithered and/or tone-mapped to 8-bit to take advantage of the extra information. Everything currently viewable on the internet is 8-bit (24-bit) and almost all 420 (JPG and H.264).
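The "dithered and/or tone mapped to 8-bit" step mentioned above can be illustrated with a minimal numpy sketch. This is one simple scheme (random dithering before truncation), not any particular product's algorithm: the two dropped bits become fine noise instead of banding, so a shallow 10-bit gradient survives on an 8-bit display on average.

```python
import numpy as np

def dither_10_to_8(img10, rng=None):
    """Reduce 10-bit code values (0..1023) to 8-bit (0..255) with random
    dithering: add uniform [0,1) noise before truncating, so the rounding
    error averages out spatially instead of producing visible banding."""
    rng = rng or np.random.default_rng(0)
    noise = rng.random(np.shape(img10))
    out = np.floor(np.asarray(img10, dtype=np.float64) / 4.0 + noise)
    return np.clip(out, 0, 255).astype(np.uint8)
```

Averaged over many pixels, the dithered 8-bit output preserves the mean level of the 10-bit input, which plain truncation does not.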
     
    re: H.264 being less than 8 bits- it's effectively a lot less than 8 bits, not only from the initial DCT quantization and compression (of the macroblocks), but also from motion vector estimation, motion compensation, and macroblock reconstruction (which includes deblocking the macroblock edges on higher-quality decoders).
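A toy numpy example shows why quantization alone costs effective bit depth. H.264 actually quantizes DCT coefficients, not raw sample values; quantizing plain 8-bit values here is a deliberate simplification to make the level count easy to see.

```python
import numpy as np

# Divide by a quantization step and round, as a lossy codec does to its
# transform coefficients.  With step 4, the 256 possible 8-bit values
# collapse to 65 distinct reconstruction levels -- roughly 6 bits.
step = 4
values = np.arange(256)
reconstructed = np.round(values / step) * step
levels = np.unique(reconstructed).size
print(levels)  # 65
```

Larger quantization steps (lower quality settings) shrink the level count further, which is the sense in which delivered H.264 carries "effectively less than 8 bits".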
  23. Like
    jcs got a reaction from maxotics in Discovery: 4K 8bit 4:2:0 on the Panasonic GH4 converts to 1080p 10bit 4:4:4   
  24. Like
    jcs got a reaction from Ernesto Mantaras in Discovery: 4K 8bit 4:2:0 on the Panasonic GH4 converts to 1080p 10bit 4:4:4   
    Any workflow which can downsample 4K to 1080p at 10 bits or more should be able to produce 9-bit 444 RGB. It won't be a full 10 bits, since the chroma was only 8-bit; the luma, mixed in with the chroma when converting to RGB, will be 10-bit, so really more like 9 bits at the end of the day.
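Where the extra luma bits come from can be shown with a minimal numpy sketch: averaging each 2x2 block of 8-bit samples produces quarter-step values, i.e. up to 2 extra bits of precision. The chroma sample shared by those four pixels stays 8-bit, which is why the combined result is closer to 9 bits than 10.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated 8-bit luma plane for a (tiny) 4K frame; even dimensions assumed.
luma_4k = rng.integers(0, 256, size=(8, 8)).astype(np.float64)

# Average each 2x2 block: the sum of four 8-bit integers divided by 4 lands
# on quarter steps (x.0, x.25, x.5, x.75) -- ~10-bit luma precision, while
# the single shared 8-bit chroma sample gains nothing.
luma_1080 = (luma_4k[0::2, 0::2] + luma_4k[1::2, 0::2] +
             luma_4k[0::2, 1::2] + luma_4k[1::2, 1::2]) / 4.0
```

Multiplying the averaged plane by 4 recovers exact integers, confirming the output sits on a 10-bit (quarter-step) grid.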
  25. Like
    jcs got a reaction from jurgen in Panasonic GH4 in a professional setting - FAQ   
    Scaling down to 1080 in Premiere will be very high quality (Lanczos+bicubic) and will easily run in real-time with CUDA/OpenCL: http://blogs.adobe.com/premierepro/2010/10/scaling-in-premiere-pro-cs5.html (Max Quality is always on for scaling with GPU accel).
     
    420 QuadHD and higher will indeed scale down to 444 1080p (with the additional vertical averaging acting as a low-pass filter to help reduce aliasing). Depending on compression quality, shooting 420 QuadHD can produce higher quality than shooting 1080p directly. If, for example, AVCHD 1080p ultimately does a better job due to bitrate relative to frame size, it might look better in some scenes (>HD resolutions appear to be limited by card write speeds).