• I remember reading about this when it was first introduced, and maybe I'm remembering incorrectly, but I think the iPhone 5 had some. My iPhone 6 Plus definitely had some: you would take a single shot and then take a burst, and if you compared them the single shot had far less ISO noise than each frame in the burst. It lowered the ISO noise by taking a burst and combining the frames to average out the noise.

People have criticised Apple for not competing in the megapixel wars, but I remember reading that Apple were putting the horsepower into the image processor instead of the sensor because they were looking at photography in a new way.

The original way is: one exposure from the sensor -> save straight to storage (e.g. film)

The old way is: one exposure from the sensor -> colour and image processing (colour science, lens distortion compensation) -> save to storage

The new way is: many exposures from many sensors -> colour and image processing of each exposure (as above) -> processing many images together to create a single image -> save

The future will be: many exposures from many sensors -> colour and image processing of each exposure (as above) -> processing many images together to create something more than a single image -> save. (This may be combining images into a 3D environment, a VR output file, a file that lets you choose focus point and DoF in post, etc.)

The processing may also do things like combine multiple images to find the best moment from a facial-expression perspective ("peak smile" or "eyes open"), or, more sophisticated, choose separate exposures and combine them to pick the best frame for each person in a group shot, so everyone is at peak smile and no-one has their eyes closed.
Cameras that learn your style by offering you many options and watching which one you choose will come. This will be a precursor to you not needing to hit the shutter button, or even engage the camera at all: once they can analyse a scene they will know what a touching moment looks like, will simply be watching the whole time, and will save (or flag) the best bits, creating little sequences for you.

Pair the Apple Glasses with the Apple Watch and your iPhone: the watch will know when your heart rate is up and hear your vocal intonations to tell whether this exciting moment is applause for your kid in the school play or a car accident, the Apple Glasses will be sending high-speed image sequences, and your iPhone will be processing and storing the whole lot. It will create magical memories from your life, save all the steps in how you made that recipe so you can make it again, create BTS videos, and automatically log who you met with and what you spoke about so you can network and manage business and sales opportunities.

It will be monitoring what you buy, it will be analysing and categorising and diagnosing you, and it will be talking to Apple about what you might want to buy next. It will be magical, people will queue up around huge city blocks for it, and the privacy warriors will go absolutely bananas.
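The burst-averaging idea in the post above is easy to demonstrate numerically. This is a minimal sketch, not Apple's actual pipeline: it assumes the burst frames are already aligned and that sensor noise is independent per frame, in which case averaging N frames shrinks the residual noise by roughly a factor of sqrt(N).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scene: a flat mid-grey exposure (values are arbitrary units).
scene = np.full((64, 64), 128.0)

# Simulated 8-frame burst: the same scene plus independent sensor noise
# (standard deviation 10) on each frame.
burst = [scene + rng.normal(0.0, 10.0, scene.shape) for _ in range(8)]

# Merging the burst by simple averaging cancels the independent noise.
merged = np.mean(burst, axis=0)

single_noise = np.std(burst[0] - scene)  # roughly 10
merged_noise = np.std(merged - scene)    # roughly 10 / sqrt(8), i.e. ~3.5
print(single_noise, merged_noise)
```

Real multi-frame pipelines add alignment and outlier rejection on top of this (hand shake and subject motion break the "already aligned" assumption), but the noise-averaging core is what the post describes.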
    • Massive. Prediction: the sort of processing I do on real-estate pics will be pretty well dead in 5-10 years' time. The results won't be as good, but they will be "good enough". I can imagine the new image-processing software will just have a bunch of looks, and AI will do the rest. This is the reason I'm trying to get out of the game now. It's bound to come to video, too.
    • It is fine on G lenses, but they are not exactly light setups. For example the 24-70 2.8G on the adapter is not a light setup to haul around all day on a gimbal. 
    • Tom Antos has used them quite a bit, so they can't be that bad. 