Andrew Reid

EXCLUSIVE: Blackmagic Pocket Cinema Camera 4K gets Time of Flight Autofocus with Prototype add-on

Recommended Posts

Really impressive, but it needs to be smooth like DPAF. Right now it looks way too jumpy to be usable. Any chance this will happen? Either way, this is still insanely impressive! Would love one of these on my G7!

4 minutes ago, Shell64 said:

Really impressive, but it needs to be smooth like DPAF. Right now it looks way too jumpy to be usable. Any chance this will happen? Either way, this is still insanely impressive! Would love one of these on my G7!

Think you missed the part where he said:

Quote

There is no ramping set but as with the 3C app this will be an adjustable parameter.
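For a sense of what an adjustable ramping parameter could mean in practice, here is a minimal hypothetical sketch (the function name and values are mine, not the controller's actual code): a ramp factor blends the current focus position toward the new target instead of snapping straight to it.

```python
def ramp_focus(current, target, ramp=0.5):
    """Ease the focus position toward the target.

    ramp=1.0 snaps instantly; smaller values give a smoother pull.
    Hypothetical illustration only - not the controller's firmware.
    """
    return current + ramp * (target - current)

# Simulate a focus pull from position 0 to 100 over ten updates.
pos = 0.0
for _ in range(10):
    pos = ramp_focus(pos, 100.0)
print(round(pos, 2))  # 99.9 - eases in rather than jumping
```

With ramp exposed as a user setting, the same controller can do both instant snaps and slow cinematic pulls.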

 

17 minutes ago, Stathman said:

You should buy one or two more frying pans, you never know.. :tounge_xd: 

 

Those are actually the spare ones :)

19 minutes ago, Stathman said:

I can see a middle step before it focuses on the target, or are my eyes flipping?

That particular lens is a bit of a heavy breather, which makes it look a bit off, particularly in snap mode, but the sensor needs a bit more tuning too.

More samples in a few days and a bit more of an insight into how the other non-ToF aspects of the focus controller work.

1 hour ago, Márcio Kabke Pinheiro said:

I thought an even simpler device, only the wheel, display, and 1 or 2 buttons (easier to put on a gimbal), but this one could work too. :)

 

The small orange one (the PBC), which won't be orange in the final version, has an expansion module with five buttons and a small wheel.

It communicates with the PBC over Bluetooth, so you can split them if you want or join them into one controller.

So you can have the PBC on top of the camera for example and the expander on a handle.

It's a lot cooler than I'm making it sound.

1 hour ago, thephoenix said:

Of course it will be BMPCC6K compliant? ;)

But of course.

Probably


@BTM_Pix so how does it work for all the different focal lengths? Do you cross-reference what the camera knows about its lens and focal length with the data the LIDAR sensor gets? So as long as it's calibrated, the LIDAR knows what the camera is looking at?

Also, how does it work for continuous AF? Does it use the camera's own algorithm to track?

If you were to speculate on how Panasonic could implement ToF AF in their future cameras, what do you think would be the best way for them to do it?

Is it even possible to give a normal camera sensor some in-sensor LIDAR capability, or does it have to be a completely separate sensor? If it has to be separate, it would need to sit somewhere that won't be easily obstructed, so probably near the EVF - like mounting on the hot shoe. But I wonder if it's possible to arrange it so that the LIDAR sensor sees the same thing as the main sensor/lens.

Sorry for all the questions, but I'm just trying to understand how this works and what a manufacturer could do to make it even better. If they cross the LIDAR data with the lens data - like DFD does - plus what they get from AI for identifying a subject and prioritising eye tracking - like Sony does - and combine all of that, they should get a pretty good AF.

2 hours ago, theSUBVERSIVE said:

@BTM_Pix so how does it work for all the different focal lengths? Do you cross-reference what the camera knows about its lens and focal length with the data the LIDAR sensor gets? So as long as it's calibrated, the LIDAR knows what the camera is looking at?

Also, how does it work for continuous AF? Does it use the camera's own algorithm to track?

If you were to speculate on how Panasonic could implement ToF AF in their future cameras, what do you think would be the best way for them to do it?

Is it even possible to give a normal camera sensor some in-sensor LIDAR capability, or does it have to be a completely separate sensor? If it has to be separate, it would need to sit somewhere that won't be easily obstructed, so probably near the EVF - like mounting on the hot shoe. But I wonder if it's possible to arrange it so that the LIDAR sensor sees the same thing as the main sensor/lens.

Sorry for all the questions, but I'm just trying to understand how this works and what a manufacturer could do to make it even better. If they cross the LIDAR data with the lens data - like DFD does - plus what they get from AI for identifying a subject and prioritising eye tracking - like Sony does - and combine all of that, they should get a pretty good AF.

Each lens is profiled to calibrate real distance and focus position.

The camera's internal AF is bypassed completely and the lens is operated as though it were under manual control, which allows instant switches or ramped changes as it tracks.

In terms of manufacturers making it better, Panasonic et al will implement this in a different way and will definitely improve on it, by virtue of having far, far smarter people than me involved in its development - and far more of them. As for speculating how they would do it internally - whether it's incorporated into the sensor itself or handled as part of the image processing pipeline through AI - they could go either way really.

I've mentioned a couple of times in this thread that what I've shown thus far is very much a jumping-off point. By the time the commercial version becomes available in early 2020, it won't have changed conceptually - it will still be an outboard system that can be added to different cameras - but how that is achieved will change and, without giving too much away at this stage, already is changing.

I'll drip a couple of things into this thread between now and then.
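To make the profiling idea above concrete, here is a rough sketch of how a per-lens profile might map a ToF distance reading to a focus motor position. The table values and function are entirely hypothetical - just an illustration of calibrating real distance against focus position and interpolating between calibration points:

```python
import bisect

# Hypothetical calibration profile for one lens: measured subject
# distance (metres) -> focus motor position recorded during profiling.
PROFILE = [(0.5, 120), (1.0, 420), (2.0, 700), (5.0, 900), (10.0, 980)]

def focus_position(distance_m):
    """Linearly interpolate a focus position from the calibration table."""
    dists = [d for d, _ in PROFILE]
    if distance_m <= dists[0]:
        return PROFILE[0][1]
    if distance_m >= dists[-1]:
        return PROFILE[-1][1]
    i = bisect.bisect_left(dists, distance_m)
    (d0, p0), (d1, p1) = PROFILE[i - 1], PROFILE[i]
    t = (distance_m - d0) / (d1 - d0)
    return p0 + t * (p1 - p0)

print(focus_position(1.5))  # halfway between the 1 m and 2 m entries -> 560.0
```

Since the controller drives the lens as though under manual control, the output of a function like this could be sent either instantly or via a ramp, which is where the adjustable ramping parameter mentioned earlier would come in.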

2 hours ago, BTM_Pix said:

Each lens is profiled to calibrate real distance and focus position.

The camera's internal AF is bypassed completely and the lens is operated as though it were under manual control, which allows instant switches or ramped changes as it tracks.

In terms of manufacturers making it better, Panasonic et al will implement this in a different way and will definitely improve on it, by virtue of having far, far smarter people than me involved in its development - and far more of them. As for speculating how they would do it internally - whether it's incorporated into the sensor itself or handled as part of the image processing pipeline through AI - they could go either way really.

I've mentioned a couple of times in this thread that what I've shown thus far is very much a jumping-off point. By the time the commercial version becomes available in early 2020, it won't have changed conceptually - it will still be an outboard system that can be added to different cameras - but how that is achieved will change and, without giving too much away at this stage, already is changing.

I'll drip a couple of things into this thread between now and then.

Just remember to ship a prototype out here - it will need proper testing in real extremes, and an English summer just won't do :)

I can justify that: two days into spring and we are already having bushfires.

[Attached images: IMG20190906163707.jpg, IMG20190906163844.jpg, IMG20190906164518.jpg]

7 hours ago, BTM_Pix said:

Each lens is profiled to calibrate real distance and focus position.

The camera's internal AF is bypassed completely and the lens is operated as though it were under manual control, which allows instant switches or ramped changes as it tracks.

In terms of manufacturers making it better, Panasonic et al will implement this in a different way and will definitely improve on it, by virtue of having far, far smarter people than me involved in its development - and far more of them. As for speculating how they would do it internally - whether it's incorporated into the sensor itself or handled as part of the image processing pipeline through AI - they could go either way really.

I've mentioned a couple of times in this thread that what I've shown thus far is very much a jumping-off point. By the time the commercial version becomes available in early 2020, it won't have changed conceptually - it will still be an outboard system that can be added to different cameras - but how that is achieved will change and, without giving too much away at this stage, already is changing.

I'll drip a couple of things into this thread between now and then.

@BTM_Pix Thank you very much for your reply.

I wasn't even sure if in-sensor was possible in an interchangeable lens system.

From what I was able to understand, there are a lot of different ways to implement LIDAR technology, and one thing I've read is that LIDAR units can interfere with each other. Is that true for all implementations or just some? Interference would be an issue if two or more cameras are shooting together.

From what I've read, you can use one device that emits the laser and a CMOS sensor as the receiver/detector; if so, I wonder whether that would be a good option for an interchangeable lens system.

Does anyone know the difference between hybrid CDAF/PDAF and Dual-Pixel AF? Does Dual-Pixel AF suffer from the same degradation/banding as hybrid CD/PD AF?
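Whatever the emitter/receiver arrangement, the underlying time-of-flight principle is the same: the sensor times how long an emitted light pulse takes to bounce back, and distance is half the round trip multiplied by the speed of light. A trivial sketch of the arithmetic:

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_s):
    """Distance to the target from the round-trip time of a light pulse."""
    return C * round_trip_s / 2.0

# A subject at ~2 m returns the pulse in roughly 13.3 nanoseconds,
# which is why ToF receivers need very fast timing, or phase-based
# measurement of a modulated signal, rather than naive stopwatch timing.
print(round(tof_distance(13.34e-9), 3))  # -> 2.0
```

In practice, many consumer ToF modules measure the phase shift of a modulated light signal rather than timing individual pulses, which is also one route to rejecting another unit's emissions.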

On 8/30/2019 at 9:38 PM, BTM_Pix said:

The autofocus adapter will be available in early 2020.

If you need any beta testers, let me know.  I would be willing to pay for the prototype and shipping.  I would like to use it on a gimbal to help keep focus.  We shoot a variety of projects from a local TV show to events like weddings.  We shoot short films, web series, and we just completed our first feature.  Plus, we film a fair amount of corporate and commercial work locally.  

We would switch completely to the BMPCC if the autofocus were better for our event work. When it's that run-and-gun, we prefer some good tracking autofocus.

Thanks!

 

 

