
jcs

Members
  • Posts

    1,839
  • Joined

  • Last visited

Reputation Activity

  1. Like
    jcs reacted to Don Kotlos in Don’t forget about Digital Bolex!   
    You will need to shoot the original real life image though  
    I do wish they would allow baking in 3D LUTs with the GH5. So much potential. 
  2. Like
    jcs got a reaction from Jonesy Jones in Don’t forget about Digital Bolex!   
    Take your GH5 and shoot a copy of that image. Bring the GH5 copy and the original into any tool that does shot matching, perform the shot-match operation, then test and tweak further to get both images matching as closely as possible by eye. Save out the resulting 3D LUT. To generalize and automate this process, so the resulting LUT works better in the general case, use machine learning to train a neural network that creates an optimized LUT from GH5 input and <target camera> output.
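    A minimal sketch of the first step of that pipeline, assuming both shots are loaded as pixel-aligned float RGB arrays (numpy): fit a least-squares 3x4 color matrix from GH5 pixels to target-camera pixels, then bake it onto an identity grid the way a .cube 3D LUT stores its table. This is only a first-order, linear stand-in for a real shot match or a trained network; the function names are illustrative.
```python
import numpy as np

def fit_color_matrix(source_rgb, target_rgb):
    """Least-squares 3x4 affine color transform mapping source pixels to
    target pixels. A first-order approximation of a shot match; a real
    3D LUT fit (or a trained network) allows nonlinear corrections."""
    src = source_rgb.reshape(-1, 3)
    tgt = target_rgb.reshape(-1, 3)
    # Augment with a constant column so the fit includes an offset term.
    src_aug = np.hstack([src, np.ones((src.shape[0], 1))])
    # Solve src_aug @ M ~= tgt for M (4x3) in the least-squares sense.
    M, *_ = np.linalg.lstsq(src_aug, tgt, rcond=None)
    return M

def apply_color_matrix(rgb, M):
    flat = rgb.reshape(-1, 3)
    flat_aug = np.hstack([flat, np.ones((flat.shape[0], 1))])
    return np.clip(flat_aug @ M, 0.0, 1.0).reshape(rgb.shape)

def bake_lut(M, size=33):
    """Evaluate the fitted transform on a size^3 identity grid, i.e. the
    table a .cube 3D LUT stores (red varying fastest in the usual layout)."""
    axis = np.linspace(0.0, 1.0, size)
    b, g, r = np.meshgrid(axis, axis, axis, indexing="ij")
    grid = np.stack([r, g, b], axis=-1).reshape(-1, 3)
    return apply_color_matrix(grid, M).reshape(size, size, size, 3)

# Usage: gh5 and target are float32 HxWx3 captures of the same image.
# M = fit_color_matrix(gh5, target); lut = bake_lut(M)
```
    A neural network or nonlinear LUT fit would replace fit_color_matrix while keeping the same bake step.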
  3. Like
    jcs reacted to kaylee in Wes Anderson - Isle of Dogs   
    i had like... 1.25 hot dogs lol. jas got the lion's share
    mmmmmm.... bacon wrapped prawns....

  4. Like
    jcs got a reaction from kaylee in Wes Anderson - Isle of Dogs   
    Happy Birthday Jasper! Lol how many did you have?
    Bacon wraps nicely around prawns too.
  5. Like
    jcs got a reaction from EthanAlexander in Wes Anderson - Isle of Dogs   
    Inspiration on so many levels!
  6. Like
    jcs got a reaction from ade towell in Wes Anderson - Isle of Dogs   
    Inspiration on so many levels!
  7. Thanks
    jcs got a reaction from Aussie Ash in Wes Anderson - Isle of Dogs   
    Inspiration on so many levels!
  8. Like
    jcs got a reaction from Kisaha in Wes Anderson - Isle of Dogs   
    Inspiration on so many levels!
  9. Like
    jcs got a reaction from Adept in Wes Anderson - Isle of Dogs   
    Inspiration on so many levels!
  10. Like
    jcs got a reaction from tellure in Shooting 4K 60P RAW and 4K 120P Run and Gun: FS700R + Shogun Inferno   
    Nice chickentones  
  11. Like
    jcs reacted to Tim Sewell in Very cool 30 day sea voyage timelapse   
  12. Like
    jcs reacted to Dave Maze in The Future is Computational. ANAMORPHIC ON IPHONE!   
    With iOS 11 Apple opened up the dual camera depth data API to developers. I’ve been using the built in portrait mode on my iPhone 7 Plus for a while now and love it. 
    However, today I downloaded a new app that takes full advantage of the depth sensing technology and it’s blowing my mind!! 
    The app is called Anamorphic; it was $2.99, and it may be the best app I've ever purchased. The app allows you to take an image and then adjust where you want the blur to happen in 3D space. You can then feather the blur in the same way you would feather a luma key, but it feathers in Z space. It then applies the most realistic and gorgeous anamorphic bokeh and aberration. Export it and put VSCO Cam on it, and it looks like you took the image with a film camera with a T1.4 vintage Cooke anamorphic lens!!!
    Just testing this app out today has made me giddy and it really feels like magic when you use it. I’m so excited to experiment more with this and even more excited about getting an iPhone X with the faster telephoto lens and OIS! 
    This excites me about the future of cinematography. It’s going to be computational! Imagine shooting on a $100,000 cinema lens that has been profiled that you can simulate in post. Lytro is working on it on the high end, but the fact that my tiny cheap iPhone can do it blows my mind  
    Below you’ll see the normal image that was taken and then the “Anamorphic” with VSCO one. 
    Download the app here: https://itunes.apple.com/us/app/anamorphic/id1247287369?mt=8
    I would love for this thread to become a place where we all can share our shots taken with computational photography methods if the admins allow it. If you have an iPhone 7 Plus or 8 now, get the anamorphic app and start shooting!!
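    For the curious, the depth-feathered blur the post describes can be prototyped in a few lines. This is a guess at the technique, not the app's actual code: a sharp/blurred blend weighted by a smoothstep falloff on depth (the "luma-key style" feather, applied in Z), with a taller-than-wide Gaussian standing in for the oval anamorphic bokeh. Assumes numpy/scipy and float HxWx3 images.
```python
import numpy as np
from scipy.ndimage import gaussian_filter

def smoothstep(edge0, edge1, x):
    """Hermite feathering between edge0 and edge1 (like a luma-key
    feather, but applied to depth values instead of luma)."""
    t = np.clip((x - edge0) / (edge1 - edge0), 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

def depth_blur(image, depth, focus_z, feather, max_sigma=8.0):
    """Blend a blurred copy over the sharp image, weighted by how far
    each pixel's depth is from the chosen focal plane."""
    # Taller-than-wide blur crudely mimics oval anamorphic bokeh;
    # a real app would shape the actual bokeh kernel.
    blurred = gaussian_filter(image, sigma=(2.0 * max_sigma, max_sigma, 0))
    dist = np.abs(depth - focus_z)
    weight = smoothstep(0.0, feather, dist)[..., None]
    return (1.0 - weight) * image + weight * blurred
```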
  13. Thanks
    jcs got a reaction from tweak in What affects a camera's "motion cadence"?   
    It would appear the primary factor in motion cadence is the clock / sampling interval. If the sampling interval is perfect, e.g. each frame sampled at very close to 1s/23.976, that will produce a different perception vs. a system with temporal jitter. It would seem film has some temporal jitter, whereas a digital system could be a lot more precise. The question is what 'looks better' and, if temporal jitter is helpful, how a digital camera would implement it (statistical sampling: average interval, variance, magnitude, etc.). Likewise, if temporal jitter is pleasing, why aren't there post tools which specifically address this? (I've seen film effects which 'damage' footage, however nothing so far that subtly creates temporal jitter based on specific camera systems, in the same way that e.g. Film Convert works for color/grain.)
    With a precise motion pattern, cameras could be measured for temporal jitter / motion cadence (it would appear that cameras with genlocks could be jittered live).
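    To make the statistical-sampling idea concrete, here is a toy sketch (all parameter values invented) of how a post tool could emulate a camera's cadence: jitter nominal 23.976fps sample times with a zero-mean Gaussian term, then retime a high-frame-rate master by nearest-frame lookup.
```python
import numpy as np

def jittered_sample_times(n_frames, fps=23.976, jitter_std=0.0005, seed=0):
    """Nominal frame times plus zero-mean Gaussian jitter. jitter_std is
    in seconds; mean interval, variance, and magnitude are the knobs."""
    rng = np.random.default_rng(seed)
    nominal = np.arange(n_frames) / fps
    return nominal + rng.normal(0.0, jitter_std, n_frames)

def retime_nearest(master_frames, master_fps, sample_times):
    """Emulate a jittery shutter by pulling the nearest frame of a
    high-frame-rate master (e.g. 240 fps) at each jittered time."""
    idx = np.round(np.asarray(sample_times) * master_fps).astype(int)
    idx = np.clip(idx, 0, len(master_frames) - 1)
    return [master_frames[i] for i in idx]
```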
  14. Like
    jcs got a reaction from Drew Allegre in What affects a camera's "motion cadence"?   
    Where people prefer the motion cadence of one camera to another, the difference would be the sampling interval. Is it precise, right on every frame, or does it jitter? If so, is it constant or variable, is the magnitude constant or variable, is it uniform/Gaussian random, etc.? It's interesting that with all the tests folks have posted over the years, I don't recall seeing one on motion cadence / temporal and/or spatial jitter. Motion picture film has spatial jitter during playback in the theater, so even digitally acquired material transferred to film would have playback jitter. Via testing, one could discover how much jitter, if any, helps make the viewing experience more pleasing.
    I'm personally not a fan of large amounts of jitter; however, perhaps small amounts might help create the illusion of something more organic. Or maybe the opposite is happening: cameras with little or no jitter are what people prefer when they say they like the motion cadence of a camera. A scientific test, perhaps with a very accurate strobe, could be used to measure cameras. Another way would be to introduce jitter in a completely synthetic test with computer graphics/animation. I do know from video games running at a constant, vblank-synced 60fps that a single frame drop is massively noticeable; it used to be a criterion to fail a game in QA back when there were no online updates available. Conversely, if a game is running with constant, variable jitter, a single frame drop or game slowdown can be barely noticeable.
    Found research papers on temporal jitter for video: https://www.google.com/search?rlz=1C5CHFA_enUS508US508&q=temporal+jitter+video+quality&oq=temporal+jitter+video+quality. Quickly skimming the conclusions, I didn't see a clear idea presented beyond what I mentioned from video game experience: constant temporal jitter is better than perfect timing with occasional jitter/dropped frames.
    Temporal jitter is also used to remove rendering artifacts. 24Hz judder can be an issue with panning; perhaps temporal jitter can provide a kind of temporal anti-aliasing, resulting in less apparent judder. This would make sense based on the 'perfect 60fps highly noticeable frame-drop video game' effect, and could be tested by adding jitter to juddering video to see if it helps hide the judder.
    Another factor could be effective motion blur with a digital sensor (may not be precisely related to the camera's shutter setting). This too could be scientifically tested.
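    The synthetic test suggested above is easy to set up: render a bar panning at constant speed, sample the same motion at 24fps with clean and jittered timestamps, and view the two sequences side by side to judge apparent judder. A sketch, with made-up jitter magnitude:
```python
import numpy as np

def panning_frame(t, width=256, height=64, speed=300.0, bar_w=8):
    """One frame of a white bar panning across a black field at time t."""
    frame = np.zeros((height, width), dtype=np.float32)
    x = int(speed * t) % width
    frame[:, x:x + bar_w] = 1.0
    return frame

def render_sequence(sample_times):
    return np.stack([panning_frame(t) for t in sample_times])

# Two seconds at 24fps: clean vs. jittered sampling of the same motion.
rng = np.random.default_rng(1)
clean_t = np.arange(48) / 24.0
jitter_t = clean_t + rng.normal(0.0, 0.002, clean_t.shape)
clean_seq, jitter_seq = render_sequence(clean_t), render_sequence(jitter_t)
```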
  15. Like
    jcs reacted to Drew Allegre in What affects a camera's "motion cadence"?   
    I always thought it was the Phillip Bloom woodgrain pocket dolly that gave that authentic organic look to footage.

    It makes sense to me that the playback device is a pretty major factor in perceived cadence.  Once you eliminate variables: playback device, frame rate, shutter speed/angle, rolling shutter, jitter (introduced at any point in the chain), scene and lighting, codec and compression, is there some "thing" that makes one sensor or camera subjectively better than another?  It almost seems like audiophile territory trying to define the marginally perceptible, but I think many of us would agree that there is definitely something there to perceive and define.

    Gimme a good motion cadence mixed with that Canon color science.
  16. Like
    jcs reacted to Ed_David in What affects a camera's "motion cadence"?   
    Well, I mean,
    has this group ever been the cutting edge bastion of science? Or is it, just a bunch of filmmakers and shooters trying to help each other with tips and tricks to improve?
    And if one person asks the question, and another person tries to help and answer it, should you, dear friend, Can'tSin, be upset?
    These are great mysteries of life.
    If you want true, science-based research and answers, go on cinematography.com and ask Art Adams or David Mullen.
    If you want to have a little fun, stay here.
  17. Like
    jcs got a reaction from Jonesy Jones in What affects a camera's "motion cadence"?   
    It would appear the primary factor in motion cadence is the clock / sampling interval. If the sampling interval is perfect, e.g. each frame sampled at very close to 1s/23.976, that will produce a different perception vs. a system with temporal jitter. It would seem film has some temporal jitter, whereas a digital system could be a lot more precise. The question is what 'looks better' and, if temporal jitter is helpful, how a digital camera would implement it (statistical sampling: average interval, variance, magnitude, etc.). Likewise, if temporal jitter is pleasing, why aren't there post tools which specifically address this? (I've seen film effects which 'damage' footage, however nothing so far that subtly creates temporal jitter based on specific camera systems, in the same way that e.g. Film Convert works for color/grain.)
    With a precise motion pattern, cameras could be measured for temporal jitter / motion cadence (it would appear that cameras with genlocks could be jittered live).
  18. Like
    jcs got a reaction from Jonesy Jones in There is an imposter about   
    In the same way cryptocurrency (bitcoin et al.) provides a new means for energy exchange/currency, we need a non-government global identity/trust system. After the 143-million-person Equifax breach, I received a phone call from my own phone number, claiming to be from the phone company about a security issue: enter the last 4 of your social, etc. I did nothing and the call hung up after 30s. I called the phone company and they acknowledged that the phone network can be hacked and is not secure. So if the phone system can be hacked, calling 611 might not route your call to the actual phone company, and thus any information you provide may not be going to the phone company. HTTPS (TLS, typically with AES encryption) is designed to provide a secure connection over the internet; however, it's not 100% secure either (it's mostly secure; the likelihood of exploits for most things is low at this point). There's currently nothing like HTTPS for phone calls, or for email, that is standardized and widely used.
    It's interesting that most people in this thread so far are using their real names (I'm using my initials, with links in my signature that can be followed to see my identity). I think there is value in real identities (Facebook tried to make this case, though it's understood that it's not a 'kind' system, nor is Google ('Do no evil', lol)). So it's understood that some people may wish to remain anonymous (as much as possible) when dealing with 'unkind' systems. However, for a forum such as EOSHD and other friendly communities, perhaps it would be cool if there were game-theory-based reasons/rewards for people using their real identities, as well as positive reinforcement for desired community behavior. Maybe a "Verified ID" badge or similar.
    For email between trusted friends/colleagues, folks used to use PGP; however, it doesn't appear to be widely used anymore. Seems like a great business opportunity for folks to tackle: a universally trusted crypto ID system that is pervasive and not owned or controlled by any government or corporation, similar to bitcoin for currency.
    Rawlite OLPF- was curious what camera this was for... http://rawlite.com/
    EDIT: 'Edgar' was hacked at the SEC: https://www.bloomberg.com/news/articles/2017-09-21/all-about-edgar-and-why-its-hack-shakes-the-sec-quicktake-q-a . This is a very big deal. The world could really use a new trust system...
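    In the PGP spirit mentioned above, the core primitive any such trust system would build on is a keypair-backed signature; here is a minimal sketch using the Python cryptography package. The hard, unsolved part the post is asking for is the surrounding web-of-trust/registration layer, which this does not address.
```python
from cryptography.hazmat.primitives.asymmetric import ed25519

# An identity is just a keypair; the public key is the shareable "ID".
private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()

message = b"I am jcs, posting on EOSHD"
signature = private_key.sign(message)

# Anyone holding the public key can verify the claim came from the
# keyholder; verify() raises InvalidSignature if anything was tampered with.
public_key.verify(signature, message)
```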
  19. Like
    jcs reacted to EthanAlexander in Game of Egos   
    You're presupposing that an inability to know what world we "live" in, and a willingness to question it, means that he will conduct himself with any less meaning than you. 
    Take for example a Catholic and an Atheist: a worshipping Catholic will likely view this life as a time to honor God through every action, a time to follow God's plan. This gives that person immense meaning and a calling to do good. On the other hand, many Atheists, seeing this time on earth as our only time, with no afterlife or greater power, will choose to make the most of every second still breathing. This includes treating others with respect, because this is their only time, too. 
    Two opposite ends of the spectrum, same result in action. 
    So, why can't someone question this reality without being labeled ignorant? If people never questioned anything, we'd be no different than chimpanzees.
  20. Like
    jcs got a reaction from kidzrevil in Multi-spectral Detail Enhancement   
    Sometimes we need to enhance the detail of a shot: a very soft lens, slightly out-of-focus footage, slow motion, post cropping (for story/emotion or after stabilization), and so on. Most are familiar with the sharpen effect and the unsharp masking effect. We can combine both, and also use unsharp masking to create a local contrast enhancement effect.
    Canon 1DX II and Canon 50mm 1.4 at 1.4, 1080p (Filmic Skin picture style):

     
    Multi-spectral Detail Enhancement (let's call it MSDE, based on the physics of Acutance)
    • Fine noise grain: adds texture and increases the perception of detail (Noise effect: 2%, color, not clipped)
    • High frequency sharpening: in PP CC this is called Sharpen (as a standalone effect) or via Lumetri/Creative/Sharpen (as used here: 93.4)
    • Mid frequency sharpening: Unsharp Mask effect with amount 41 and a radius of 5
    • Low frequency sharpening (Local Contrast Enhancement, or LCE): Unsharp Mask effect with amount 50 and a radius of 300
    While this may be a bit too sharp/detailed for some, it illustrates MSDE, and one can add detail to taste using this technique. Note we didn't use a contrast effect or curves to achieve this look.
    MSDE can also be used to improve HD-to-4K upscales: apply it after upscaling. It's also a great way to use Canon's soft-ish 1080p along with DPAF (since DPAF isn't currently available in any other cameras on the market). The GH5 is the new kid on the block with excellent detail; however, Canon still looks more filmic to me and has excellent AF  
    Someday Adobe will GPU accelerate their Unsharp Mask effect (it's a trivially easy effect to code too!), so this can easily run in real-time while editing.
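    Since the whole recipe is a sum of band-passed detail layers plus grain, it's easy to prototype outside Premiere. A sketch with numpy/scipy, assuming float HxWx3 images; the amounts and radii below are loose stand-ins for (not exact conversions of) the Premiere settings listed above:
```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp(img, amount, radius):
    """Classic unsharp mask: add back the detail removed by a blur."""
    blurred = gaussian_filter(img, sigma=(radius, radius, 0))
    return img + amount * (img - blurred)

def msde(img, grain=0.02, seed=0):
    """Multi-spectral detail enhancement: high/mid/low bands plus grain.
    Amounts/radii are illustrative, not the exact Premiere values."""
    rng = np.random.default_rng(seed)
    out = unsharp(img, 0.9, 1)      # high frequency ("Sharpen")
    out = unsharp(out, 0.4, 5)      # mid frequency
    out = unsharp(out, 0.5, 100)    # low frequency / local contrast (LCE)
    out += rng.normal(0.0, grain, img.shape)  # fine color noise grain
    return np.clip(out, 0.0, 1.0)
```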
  21. Like
    jcs reacted to EthanAlexander in Game of Egos   
    Brother, apologies if what I wrote sounded accusatory; that was not my intention. Perhaps I should have phrased it differently. I was simply suggesting that it is just as valid to investigate assumptions we've never questioned before as it is to leave them be. That is what I was attempting to illustrate with the "opposite ends of the spectrum" example of Catholicism and Atheism. 
  22. Like
    jcs got a reaction from EthanAlexander in Game of Egos   
    How can we determine if reality is real, a dream, a simulation, or something else? "I think, therefore I am" (René Descartes).
  23. Like
    jcs got a reaction from EthanAlexander in Game of Egos   
    How do you know you're alive? 
    How will you know when you are dead?
  24. Like
  25. Like
    jcs reacted to Trek of Joy in My guide to buying a cheap Hasselblad medium format camera   
    Move the 35mm back and this image is dramatically different. And yes it will match a 90mm since all that matters is distance to subject, not all this lens compression nonsense. This article explains/demonstrates it perfectly.
    http://admiringlight.com/blog/perspective-correcting-myth/
    "The perspective in a photograph is 100%, completely dependent on the photographer’s physical distance between them and their subject.  That’s it.  Not the lens, not the format, nothing but the distance.
    To create an image with telephoto compression, the photographer backs away from the subject and uses a longer focal length to keep the framing the way they want.  The key point is: It’s the backing up that changes the perspective, not the lens.
    If I am 1 foot from my subject, and the background is 100 feet behind, and I frame the subject with an ultra-wide angle lens, the difference between me and the subject and the subject to the background is 1 to 100.  Now, if I back up 10 feet and frame the subject so they’re the same size in the frame as my original composition by using a short telephoto lens, now that ratio is only 1:10.  This causes the background to appear much closer to my subject than in the first instance."
    Shoot both lenses from 15 or so feet away and crop the 35 to match the 90, and the images will be identical. If you try to get the same framing in camera, the 35 will be much closer to your subject, creating the unflattering look with people and minimizing background elements, like making mountains and buildings look smaller, and so on.
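    The quoted numbers are easy to sanity-check: perspective depends only on the camera-to-subject and camera-to-background distances, so the ratio falls out directly. A quick check of the 1ft/100ft example (backing up 10 feet puts the subject at 11ft and the background at 111ft):
```python
def near_far_ratio(subject_dist, background_behind):
    """Camera-to-subject over camera-to-background distance, which sets
    how large the background renders relative to the subject."""
    return subject_dist / (subject_dist + background_behind)

print(near_far_ratio(1, 100))   # ~0.0099 -> roughly 1:100, wide-angle look
print(near_far_ratio(11, 100))  # ~0.099  -> roughly 1:10, "compressed" look
```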