Andrew Reid

New Sony sensor has 21 stops dynamic range, 5120 native ISO - and destined for a video device NOT a smartphone!


Guest Ebrahim Saadawi

I thought we were always told in ophthalmology classes that the human eye has 6.5 stops of dynamic range in each position/scene/frame, but when you factor in looking at darker areas and brighter areas with your eyes, it shows 20 stops by adapting sensitivity. That's not simultaneous though, so only the 6.5-stop figure should be compared to a single camera frame. So it was 6.5 stops as I remember. At least that's what the professors said, about 30 years ago! :D
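
Those figures are easy to sanity-check, since a stop is just a doubling of light. A quick back-of-the-envelope sketch (nothing camera-specific here, just the arithmetic behind "stops"):

```python
import math

def stops_to_contrast(stops):
    """One stop = one doubling of luminance, so N stops = 2**N : 1 contrast."""
    return 2 ** stops

def contrast_to_stops(ratio):
    """Inverse: how many doublings fit inside a given contrast ratio."""
    return math.log2(ratio)

# ~90:1 contrast -- the eye at one fixed adaptation state (6.5 stops)
print(stops_to_contrast(6.5))
# ~1,000,000:1 -- the eye across adaptation states (20 stops)
print(stops_to_contrast(20))
# The claimed sensor figure, 21 stops, converted back as a round trip
print(contrast_to_stops(2 ** 21))
```

So the static-eye and adapted-eye numbers differ by a factor of roughly ten thousand in contrast ratio, which is why the two shouldn't be compared directly.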

I would like my camera to have much more dynamic range than my eyes do. When I look out of a bright window I can't see the room inside simultaneously, so my dynamic range is quite limited; I want my camera to see that room in full brightness. If I wanted to emulate the real world and the feel of human vision, I would shoot 30p on a Betacam, all with handheld shake and a 50mm-equivalent lens at f11. That's pretty close to what we see.

Cinema is about creating a fantasy world; you enter a theatre to abandon reality and live in an alternate world that you don't know. That's why we use shallow depth of field, slow motion, timelapse, 24p, long compressing lenses, wide distorted lenses, steady cinematic camera movement, film grain and texture, colour grading, etc.

Guest

Yup. Art is created from limitations, not endless possibilities. Film is a medium used to speak a language, not a replicator of reality. Film and photography are the most confusing art forms though, because unlike writing, painting, music, etc., cameras produce an 'indexical' image (like a footprint or fingerprint, a photo has a direct, physical connection to reality). But as soon as you frame something, edit it, grade it, choose a different lens, use a different camera, smile at your subject, ask a question of your subject, point a camera at someone who can see you are filming them, live in a society where cameras are omnipresent, live in a society where we act out our lives massively influenced by the videos we see on TV/online/in cinemas - subjectivity enters the equation. And where you have subjectivity, you have language, and where you have language, you have art. The lie that is photographic "truth" (e.g. Cinema Verite) has been exposed, denounced and thrown to the dogs. Video is not a footprint, it's a paintbrush. See Barthes (Camera Lucida), Mulvey (Death 24X a Second), Peirce, Tom Gunning, David Campany, etc., for more ...


This is most likely a hoax. I mean, someone just sent a whitepaper to sonyalpharumors and that's it; everyone's blogging about it, hoping for clicks.

 

http://image-sensors-world.blogspot.com.es/2014/11/active-pixel-color-sampling-story-goes.html

 

"It looks like someone has taken a machine vision datasheet and modified to make the sensor bigger, with faster frame rate and more DR. Then added some non-sensical numbers."


I thought we were always told in ophthalmology classes that the human eye has 6.5 stops of dynamic range in each position/scene/frame, but when you factor in looking at darker areas and brighter areas with your eyes, it shows 20 stops by adapting sensitivity. That's not simultaneous though, so only the 6.5-stop figure should be compared to a single camera frame. So it was 6.5 stops as I remember. At least that's what the professors said, about 30 years ago! :D

I would like my camera to have much more dynamic range than my eyes do.

Ebrahim, "Tim's Vermeer" is out now (made by the guy who invented the TriCaster, btw). Everyone on this blog would enjoy that documentary. In it he explores how Vermeer must have used optical equipment to paint the proper dynamic range in an image. Anyway...

 

To expand on what you wrote, our eyes have 6.5 stops of dynamic range (and also only about 5 megapixels of centre resolution). What gives us 21 stops of dynamic range isn't the mechanics of our eye but the compositing nature of our brain. When you look from bright to dark, the brain essentially HDRs the image in your head (as you say, the iris changes). There are few "blown out" areas or shadows without detail because our eyes adjust. One day, I believe camera sensors will do this: the camera computer will take sections of sensor data, change the exposure a bit, then create one HDR image. Right now, they can only do it with the whole sensor (unlike our brains, which pick and choose sensory data).
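
That per-section idea can be sketched in a few lines. Assuming a sensor that could expose each tile independently (hypothetical: `per_tile_hdr` and its inputs are invented here for illustration, no current camera works this way), a crude merge might look like:

```python
import numpy as np

def per_tile_hdr(scene_luminance, tile=64):
    """Crude sketch of per-region auto-exposure followed by a merge.

    scene_luminance: 2D float array of linear scene luminance.
    Each tile is scaled so its own highlights just avoid clipping,
    mimicking the brain picking a local 'exposure' per gaze region.
    """
    h, w = scene_luminance.shape
    out = np.empty_like(scene_luminance)
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            block = scene_luminance[y:y+tile, x:x+tile]
            peak = block.max()
            # Local exposure: normalise each tile to its own brightest value.
            out[y:y+tile, x:x+tile] = block / peak if peak > 0 else block
    return out

# A scene with a dark room (luminance 0.001) next to a bright window (1000.0):
scene = np.ones((128, 128)) * 0.001
scene[:, 64:] = 1000.0
merged = per_tile_hdr(scene)
# Both halves now render at full brightness within their own tiles,
# like the eye "HDR-ing" a window view region by region.
```

A real implementation would blend neighbouring tile exposures (as exposure-fusion algorithms in the Mertens style do) to avoid visible block edges; this sketch only shows the "local reference point" idea.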

 

Some people confuse the dynamic range of a camera (a Nikon is close to 20, a Sigma 8, etc.) with the dynamic range of human vision. They are really two different things. The dynamic range of a sensor is the gradations of light it can capture in a scene; however, those gradations are based on a single reference point. When you change one, you change them all. The dynamic range of our vision is not based on one reference point of brightness, but on a reference point of image sense. Our brain is the master photographer. Indeed, philosophically, can we create mechanical images that are better than what's in our head?
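
The single-reference-point limitation can be put in numbers. With a hypothetical 14-stop sensor, exposure only chooses which 14-stop window of the scene gets recorded; everything outside that window clips (toy figures, just to illustrate the point):

```python
def captured_window(exposure_anchor_stops, sensor_stops=14):
    """Return the (low, high) range of scene stops a sensor records.

    exposure_anchor_stops: the scene stop placed at the top of the
    sensor's range. Raising or lowering exposure slides the whole
    window -- with a 14-stop sensor you cannot hold the 20th stop
    and the 2nd stop of the scene at the same time.
    """
    return (exposure_anchor_stops - sensor_stops, exposure_anchor_stops)

print(captured_window(20))  # (6, 20): highlights held, deep shadows lost
print(captured_window(14))  # (0, 14): shadows held, top stops clipped
```

Change the anchor and you change every recorded gradation at once, which is exactly the "single reference point" behaviour described above.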

 

If we take a Nikon image, say, and want to bring back detail in our blown-out areas, we might change the "dynamics" at a part of the image that's at the 19th stop of DR. However, we don't get full detail to work with, because if we need 6 stops we're only getting 17 to 20, not 16 to 22. Our eye gets the full 6 at the "window" to use. And it does so smoothly, which is difficult to do in image editors (assuming we have multiple exposed shots) without tons of time and skill.

 

As for it being a hoax: that's certainly possible! However, I don't see how it isn't feasible if you accept that the color will suffer from temporal and other accuracy issues. They can vibrate a color filter over the sensor just as DIYers vibrated focusing glass in those early DOF adapters. What Sony is doing is what they did with the A7S: creating specialized sensors for specialized needs using known technical trade-offs. (Keep in mind, the A7S doesn't get its low-light ability for free. It loses DR and resolution in good light. Sony believes there are enough people who will pay for that trade-off; time will tell, but so far it looks like a good gamble.)


You need severe learning difficulties to struggle with the ergonomics of the A7S. I mean so severe you've got close to zero motor function and need a full-time carer. It's nearly impossible to struggle with the A7S. I love stupid trolls since they don't even have the ability to upset people via the internet.

Whoa there, tiger - some of us are actually disabled on this board, and while I have very limited motor function and my son has severe learning disabilities, neither of us struggles with the ergonomics of my A7S. Even with our (part-time) care, we are fully capable of using the camera as well as anyone, so don't tar us with the "too incapable to use it" brush. :D :D :D

Guest

Whoa there, tiger - some of us are actually disabled on this board, and while I have very limited motor function and my son has severe learning disabilities, neither of us struggles with the ergonomics of my A7S. Even with our (part-time) care, we are fully capable of using the camera as well as anyone, so don't tar us with the "too incapable to use it" brush. :D :D :D

 

 

No, you are not disabled, actually. Everyone who uses this forum is white, male, college educated, western, without disability, heterosexual and aged between 25 and 45.

 

They have to be. Otherwise we'd always have to consider the possibility that the 'other person' has a worldview different to ours. Which would of course mean we'd have to be tolerant, thoughtful and open to the possibility that what might be true for us is not necessarily true for the other person.

 

I mean come on dude, can you really see that working here?


This is most likely a hoax. I mean, someone just sent a whitepaper to sonyalpharumors and that's it; everyone's blogging about it, hoping for clicks.

 

http://image-sensors-world.blogspot.com.es/2014/11/active-pixel-color-sampling-story-goes.html

 

"It looks like someone has taken a machine vision datasheet and modified to make the sensor bigger, with faster frame rate and more DR. Then added some non-sensical numbers."

 

 

The chances of being a hoax are slim. The chances of the hoaxer having this much technical knowledge of sensors is even slimmer. The chances of the hoaxer having this much creativity with the technical knowledge of sensors is vanishingly small.

