Where are the gimbal performance measurements / standards?


kye

We have standards for tonnes of things, why not gimbals?    Specifically, how well they stabilise?

As far as I can tell, a gimbal is a physical device that receives vibrations from the handle and, through its three motors, acts as a low-pass filter so that only large, slow motions make it through to the camera.  This should be easily testable via a test rig of some kind.  I would expect a graph showing dB of attenuation across a range of frequencies for each of the three axes of motion.

That way we'd be able to say things like:
"gimbal X has better attenuation than gimbal Y up to vibrations of strength Z, but above that X runs out of steam and Y is better, therefore for fine work X > Y but for difficult environments Y > X"
or
"gimbal A has much better attenuation of higher frequencies than B or C or D, therefore if you plan on mounting it to a vehicle (which has a vibration frequency distribution shown in the graph below) you're better off with A".

Instead, what we get is "I'm going to watch YouTube videos where people compare two different gimbals by running with each in turn, thereby seeing how well each performs IN STABILISING A COMPLETELY DIFFERENT SET OF VIBRATIONS".  Hardly the best way to compare devices costing hundreds or thousands of dollars.
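To make the measurement idea concrete, here is a minimal sketch (an illustration, not a standard) of how per-axis attenuation in dB could be computed, assuming synchronized accelerometer logs from the handle and from the camera plate; the sample rate and the synthetic example data are assumptions:

```python
# Rough sketch only: estimate per-axis attenuation in dB from two synchronized
# accelerometer logs, one on the handle (input) and one on the camera plate
# (output).  Sample rate and the synthetic test signal are assumptions.
import numpy as np
from scipy.signal import welch

FS = 1000  # sample rate in Hz (assumed)

def attenuation_db(handle_accel, camera_accel, fs=FS):
    """Return (frequencies, attenuation in dB) for one axis.
    Positive values mean the gimbal reduced vibration at that frequency."""
    f, p_in = welch(handle_accel, fs=fs, nperseg=4096)
    _, p_out = welch(camera_accel, fs=fs, nperseg=4096)
    return f, 10 * np.log10(p_in / p_out)

# Synthetic example: a 20 Hz handle vibration that the gimbal knocks down to
# 10% of its amplitude (~20 dB), plus a little sensor noise on both logs.
t = np.arange(0, 30, 1 / FS)
handle = np.sin(2 * np.pi * 20 * t) + 0.01 * np.random.randn(t.size)
camera = 0.1 * np.sin(2 * np.pi * 20 * t) + 0.01 * np.random.randn(t.size)

f, att = attenuation_db(handle, camera)
print(f"attenuation near 20 Hz: {att[np.argmin(np.abs(f - 20))]:.1f} dB")
```

The same ratio-of-spectra approach could be run per axis and per payload, which is essentially all the graph described above would need.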



The problem is the variables... excluding weird terrain, air, different walking patterns and speeds is about all your model can achieve.

But what about camera weight, or the combination with lens x, y or z, this or that sunshade and filter, cage, extra battery, matte box, external mic...

Weight distribution is the real problem with stabilizing (be it gimbal or Steadicam-type devices)...

Recreating the same conditions for the same camera and ONE gimbal is very difficult; now add multiple cameras with multiple lenses and several stabilizers... good luck!


27 minutes ago, elgabogomez said:

The problem is the variables... excluding weird terrain, air, different walking patterns and speeds is about all your model can achieve.

But what about camera weight, or the combination with lens x, y or z, this or that sunshade and filter, cage, extra battery, matte box, external mic...

Weight distribution is the real problem with stabilizing (be it gimbal or Steadicam-type devices)...

Recreating the same conditions for the same camera and ONE gimbal is very difficult; now add multiple cameras with multiple lenses and several stabilizers... good luck!

Not at all.

You simply need an arm that you can mount the handle on and that can output repeatable vibrations; then you mount a camera (with a few different weight setups), record what comes out, and analyse the footage for how much motion came through.  In a way it would be a device like an orbital sander, but one where you can control the direction and amount of vibration.

Think about music: it is infinitely complex and hugely complicated, but that doesn't mean we don't have measurements for frequency response, distortion, etc.  Light is hugely complex with infinite shades and colours, but we can measure devices in terms of DR, colour gamuts, etc.

The testing method would be straightforward - set up and balance the gimbal, mount it on the arm, hit record on the camera, and have the arm do several 'passes' where the vibration gets progressively larger; then you download the footage and analyse it for motion.  You'd see that gimbal A eliminated all motion up until 7s in but gimbal B made it to 11s in, or that gimbal C let through higher-frequency vibrations, etc.
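As a rough sketch of the "analyse it for motion" step (not a finished tool - the filename and the 2-pixel threshold are placeholders, and frame-to-frame phase correlation is just one way to quantify leaked motion):

```python
# Estimate frame-to-frame image shift with phase correlation and report when
# the leaked motion first exceeds a threshold.  Filename and threshold are
# placeholders for whatever the test rig actually produces.
import cv2
import numpy as np

THRESHOLD_PX = 2.0  # arbitrary "motion got through" threshold, in pixels

def leaked_motion(path):
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    ok, frame = cap.read()
    if not ok:
        raise RuntimeError(f"couldn't read {path}")
    prev = np.float32(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    shifts = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = np.float32(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
        (dx, dy), _ = cv2.phaseCorrelate(prev, gray)  # sub-pixel global shift
        shifts.append(np.hypot(dx, dy))
        prev = gray
    cap.release()
    return np.array(shifts), fps

shifts, fps = leaked_motion("gimbal_A_pass.mp4")  # hypothetical test clip
over = np.nonzero(shifts > THRESHOLD_PX)[0]
print(f"motion first leaked through at {over[0] / fps:.1f} s" if over.size
      else "no visible motion leaked through")
```

You could then compare the "first leaked through" time, or the total leaked shift, between gimbals run on identical passes.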

It's not simple, but it's not impossible.

Edit: In order to test different camera setups, you might have a few weights, and for each weight you might test a camera that's well balanced and then one that isn't (e.g. front-heavy, to simulate having a long lens).

If gimbal A with the off-balance setup stabilised better than gimbal B, then you could assume that this difference in performance would translate to all off-balance setups, as this typically comes down to the strength of the motors.

You could also test battery life under identical loads.


43 minutes ago, kye said:

Not at all.

You simply need an arm that you can mount the handle on and that can output repeatable vibrations; then you mount a camera (with a few different weight setups), record what comes out, and analyse the footage for how much motion came through.  In a way it would be a device like an orbital sander, but one where you can control the direction and amount of vibration.

Think about music: it is infinitely complex and hugely complicated, but that doesn't mean we don't have measurements for frequency response, distortion, etc.  Light is hugely complex with infinite shades and colours, but we can measure devices in terms of DR, colour gamuts, etc.

The testing method would be straightforward - set up and balance the gimbal, mount it on the arm, hit record on the camera, and have the arm do several 'passes' where the vibration gets progressively larger; then you download the footage and analyse it for motion.  You'd see that gimbal A eliminated all motion up until 7s in but gimbal B made it to 11s in, or that gimbal C let through higher-frequency vibrations, etc.

It's not simple, but it's not impossible.

Edit: In order to test different camera setups, you might have a few weights, and for each weight you might test a camera that's well balanced and then one that isn't (e.g. front-heavy, to simulate having a long lens).

If gimbal A with the off-balance setup stabilised better than gimbal B, then you could assume that this difference in performance would translate to all off-balance setups, as this typically comes down to the strength of the motors.

You could also test battery life under identical loads.

Agreed, it's all rather non-scientific... Wild West nonsense. No real standards are applied. Highly subjective and non-repeatable.


It can certainly be done, but look at the cinema5D tests for dynamic range... there are several ways results can be compromised. If you're not convinced by that example, look at the tests made by the EBU to determine whether a camera can be used for broadcast: as soon as you add extra tests as technology allows, how can your new results be compared with your old ones?... Or look at the anamorphic test by ShareGrid: how many people do you need involved to get comparable and quantifiable results? That's the challenge, and now consider doing retests because the motors were acting strange on the day you rented the Gemini or the Alexa Mini...


29 minutes ago, elgabogomez said:

It can certainly be done, but look at the cinema5D tests for dynamic range... there are several ways results can be compromised. If you're not convinced by that example, look at the tests made by the EBU to determine whether a camera can be used for broadcast: as soon as you add extra tests as technology allows, how can your new results be compared with your old ones?... Or look at the anamorphic test by ShareGrid: how many people do you need involved to get comparable and quantifiable results? That's the challenge, and now consider doing retests because the motors were acting strange on the day you rented the Gemini or the Alexa Mini...

Are you saying that because it can't be perfect forever, it shouldn't be done at all?

I guess we should stop testing cameras because no one can test Mojo (TM) yet!


I agree, there could be a series of standardized tests developed for gimbals, and it would be nice to have a consistent set of metrics to judge gimbal performance on.  It would probably take a fairly significant effort to develop a legitimate, fair standard that accounts for the variety of conditions gimbals are used in and the variety of equipment that's mounted on them.  It would be nice to see, but I kind of doubt anyone will put enough time into doing it right.

Seems like each gimbal manufacturer probably has some version of these tests already for testing their own hardware (like the camera mfrs do for dynamic range).  If you're really gung-ho about it, it might be worth reaching out to them.  Some mfrs probably already run their tests on competing hardware, and if they know theirs is superior they might want to share their testing methodology so it can be replicated and generate good PR for them.

Interestingly, I think one way to do this kind of testing would be with another gimbal attached to the gimbal being tested.  The SimpleBGC software that many popular gimbals run is open source, and you can also write scripts that produce predetermined movements, like the guy at the link below did for timelapses.  You could probably script a series of movements to apply various forces against the gimbal being tested, then measure a sensor or laser or something mounted to it to see how much it controls for those movements.

https://beyondthetime.net/3-axis-gimbal-timelapse/
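As a hedged sketch of what such a scripted disturbance could look like - the frequencies, amplitudes and update rate below are arbitrary, and send_angles() is a hypothetical placeholder for whatever the driving controller's API (e.g. the SimpleBGC Serial API) actually expects:

```python
# Generate a repeatable disturbance profile: several passes of increasing
# amplitude at a fixed set of frequencies, emitted as pitch/roll/yaw targets.
# How the commands reach the driving gimbal/rig is not shown here.
import math
import time

FREQUENCIES_HZ = [0.5, 2, 5, 10]      # assumed test frequencies
AMPLITUDES_DEG = [1, 2, 5, 10, 20]    # passes of increasing amplitude
PASS_SECONDS = 5
RATE_HZ = 50                          # command update rate (assumed)

def send_angles(pitch, roll, yaw):
    """Hypothetical placeholder: push target angles to the driving gimbal/rig."""
    pass

def run_profile():
    dt = 1.0 / RATE_HZ
    for freq in FREQUENCIES_HZ:
        for amp in AMPLITUDES_DEG:
            for i in range(int(PASS_SECONDS * RATE_HZ)):
                angle = amp * math.sin(2 * math.pi * freq * i * dt)
                send_angles(pitch=angle, roll=0.0, yaw=0.0)  # pitch-only pass
                time.sleep(dt)

if __name__ == "__main__":
    run_profile()
```

Because the profile is generated from fixed numbers, every gimbal under test would see exactly the same sequence of disturbances, which is the repeatability the whole idea depends on.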


I agree too. It would be like Premiere export reviews, where they have different ways to benchmark, like preset sequences with standard files, codecs, effects, etc...

It wouldn't replace the current ways of testing, like the YouTubers do; it would just give us another variable to consider when choosing a product.

 


Kye, I hope such tests get done - gimbals are relatively new and can do great things. I'm just saying that the effort required to do them is bigger and more expensive than just getting all the gear together and building a machine that can test them. I've personally only used a Ronin, as a camera assistant on one feature and as a camera operator on a short. I've played with a friend's Zhiyun Crane for a few hours and found it easier to set up. But the feature was with an Alexa Mini and the short with a C100... the Zhiyun flying my a6300 was obviously a much easier task. For many things I prefer to use a Glidecam, but that's a biased opinion because I've used them for years. So as a user I strongly believe that the end result of a project has more to do with the user/team of users than with the tool.


@tellure I agree that manufacturers probably have equipment that does this - it's a pity they can't (or won't) share their results with us!  Of course, having all the setups calibrated is a whole other thing; I'm just talking about a single test regime applied to all the gimbals.

@capitanazo & @elgabogomez I agree that gimbals are more than just how well they stabilise - a lot matters in terms of ergonomics, features, how well the app works, etc.  This is just one aspect, but a pretty important one!

@elgabogomez I'd imagine that you'd need test setups for different weights, for example: smartphone, large smartphone, 500g, 1kg, 2kg, 3kg, 5kg, 8kg, etc.  Of course you'd need a balanced version of those setups and an unbalanced one to test how well they do without a perfectly balanced load.  You might also find that a gimbal performs worse with a load a lot lighter than its maximum, so you might want to test it near its maximum load and also near its minimum load.
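A quick sketch of what that test matrix could look like - the smartphone weights are rough guesses and run_case() is a placeholder for a single vibration pass on the rig:

```python
# Enumerate the payload / balance combinations suggested above.  Per gimbal you
# would also add payloads near its rated minimum and maximum load.
from itertools import product

PAYLOADS_G = [170, 230, 500, 1000, 2000, 3000, 5000, 8000]  # phone, large phone, then the listed weights
BALANCE = ["balanced", "front-heavy"]  # front-heavy simulates a long lens

def run_case(payload_g, balance):
    """Placeholder: run one vibration pass on the rig and return a leaked-motion score."""
    raise NotImplementedError

for payload_g, balance in product(PAYLOADS_G, BALANCE):
    print(f"test case: {payload_g} g, {balance}")
    # score = run_case(payload_g, balance)
```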

I think there's a business opportunity here for a site that really reviews gimbals instead of the kind-of reviews that are being done now - perhaps the DxOMark of gimbals?

I don't want to be that person, I just want that person to tell me the answers so that I can buy the right device when I'm in the market for one!  This thread is kind of an open letter to that person - please go ahead!!

 


Actually, for gimbals the software and tuning are extremely important - if not more so than the hardware.

Essentially, at the center of the gimbal you have an IMU (motion sensor) with a gyroscope and accelerometer to sense the motion and send counter information to the motors. But then you need to tune the filtering and the PIDs, which is critical. Vibration filtering is only one of the parameters, and you must select a cut-off value, so this is subjective and depends on the use. Handheld use won't have the same profile as drone use, and even drones have a lot of variation based on the vibration frequency induced by the propellers and the resonance of the frame, which in turn depends on the frame rigidity and the rpm of the motors. The loop speed (the refresh rate of the sensors) is also important.
Having oversized motors to handle the load despite non-optimum balancing is also necessary (sometimes the lack of balance is simply due to the CG shifting when the camera moves up and down). So it's hard to give clear metrics. In my personal experience, the hardware is not the limiting factor nowadays. You can find plenty of powerful IMUs and processing units on eBay for a few dollars. The difference between a bad gimbal and a good one is the hardware integration and PID tuning.
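For anyone unfamiliar with PIDs, a single-axis loop is roughly the sketch below; the gains, loop rate and the read_angle()/set_motor_torque() hooks are made up for illustration, and real firmware layers filtering, feed-forward and motor commutation on top of this:

```python
# Illustrative single-axis stabilisation loop: PID on the measured angle error,
# with the output treated as a motor torque command.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target, measured):
        error = target - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def control_loop(read_angle, set_motor_torque, rate_hz=1000):
    """read_angle() and set_motor_torque() stand in for the IMU and motor driver."""
    pid = PID(kp=8.0, ki=0.5, kd=0.05, dt=1.0 / rate_hz)
    target_deg = 0.0  # hold the horizon on this axis
    while True:  # in firmware this runs at the sensor refresh rate
        set_motor_torque(pid.update(target_deg, read_angle()))
```

Tuning means finding gains (and filters) that reject vibration without oscillating, which is exactly the part that separates a cheap gimbal from a good one.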

@kye "I don't want to be that person, I just want that person to tell me the answers so that I can buy the right device when I'm in the market for one!  This thread is kind of an open letter to that person - please go ahead!!"

Unfortunately there is no simple answer. A DxO of gimbals would show the resolution, loop speed, motor force, etc. of each gimbal, but without actual real-life testing such metrics wouldn't be very useful, because integration and tuning are key. For instance, Sony cameras have "shitty" skintones, but when Nikon implements Sony sensors the colors are nicer out of the box.


@OliKMIA You raise excellent points; however, I still believe that "black box" testing as I've described above would be useful.  The same kind of testing would apply, but you'd have to re-test after firmware updates.

It doesn't matter what the mechanisms are within the gimbal: it can be reduced to a "black box" and tested by providing a known input vibration and measuring the output vibration (which would ideally be zero above some cut-off frequency).

In analog audio there are two main parts of a circuit - the signal path and the power supply.  The job of the signal path is to create an output signal as close as possible to the input signal, but amplified (voltage and/or current amplification).  The job of the power supply is to take the awful noisy mess that AC power from the power company normally is and turn it into a DC power source with zero AC on it, both at idle and during heavy amplifier loads.
There are dozens or hundreds of signal path designs with varying architectures (global feedback / local feedback / zero feedback / Class-A / Class-AB / Class-D / MOSFETs / JFETs / pentodes / triodes / etc) and there are as many power supply designs (linear / regulated / passive filtering / active filtering / valve / solid-state / etc), but all of these can still be tested by looking at what they output with a given typical load.  In fact, they don't even require the same testing signal to be applied for calibrated testing setups to create measurements that can be compared to each other.

Everything I said above about audio applies to the analog components of video processing and broadcast as well, just at a higher bandwidth and with the video embedded on a carrier wave instead of passing 'raw' through the circuitry, but the principles remain.  If an analog video signal path had a high-frequency roll-off, or the power supply was noisy or didn't have a low output impedance, it would result in visible degradation of the picture - something that a test pattern would ruthlessly reveal, which is why test patterns were designed and are used.

