top | item 40374879

Earth rotation limits in-body image stabilization to 6.3 stops (2020)

166 points | pwnna | 1 year ago | thecentercolumn.com

105 comments


eggy|1 year ago

Well, if we're nitpicking here, it is not 86,400 s/day (24 hours * 3600 s/hour) and 7.27x10^-5 radians/s, but 86,164.091 s and 7.29x10^-5 radians/s.

24 hours is the time it takes the Sun to return to the same spot in the sky, because the Earth has to rotate for another 3m56s to make up the angle gained by revolving around the Sun in the same direction as its rotation. The same applies to the planets that rotate and revolve in the same direction: Mercury, Earth, Mars, Jupiter, Saturn, and Neptune. A sidereal day, the time for distant stars to return to the same spot in the sky, is 23h 56m 4.091s.
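A quick sanity check of those numbers (a minimal Python sketch of my own, not from the thread):

```python
import math

# Mean sidereal day: 23 h 56 m 4.091 s
sidereal_s = 23 * 3600 + 56 * 60 + 4.091   # 86164.091 s
solar_s = 24 * 3600                        # 86400 s

omega_sidereal = 2 * math.pi / sidereal_s  # rad/s
omega_solar = 2 * math.pi / solar_s        # rad/s

print(f"{omega_sidereal:.3e} rad/s")  # 7.292e-05 rad/s
print(f"{omega_solar:.3e} rad/s")     # 7.272e-05 rad/s
```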

Damn, I knew that is why I botched my 6-stop exposure at my daughter's graduation! She can't blame me now! Thank you HN!

trhway|1 year ago

>Damn, I knew that is why I botched my 6-stop exposure at my daughter's graduation!

How about driving a tank with its stabilized gun trained on the target for the length of a 6-stop exposure before taking the shot? Now the tank gunner has an excuse too.

t0mas88|1 year ago

You don't need GPS to figure out the correction for this. Inertial navigation systems in aircraft (which use very stabilised platforms with a lot of math involved) worked before GPS was available.

It helps to have a rough indication of the current latitude on startup, but you can also figure it out from the gyro outputs. Just takes longer.

With modern sensors (solid state laser gyroscopes) it has all become a lot smaller so if you really want to you can do this in a camera. It's just probably going to be too expensive for what it brings, because 6+ stops of stabilisation is a lot already.
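For the curious, here is an idealized sketch (my own, not from the comment: noise-free measurements, stationary sensor, axes assumed aligned to a north-east-down frame) of how latitude falls out of the gyro outputs alone:

```python
import math

OMEGA_E = 7.2921159e-5  # Earth's rotation rate, rad/s

def latitude_from_gyro(omega_north, omega_down):
    # A stationary, leveled gyro triad senses Earth's rotation as
    # (OMEGA_E*cos(lat), 0, -OMEGA_E*sin(lat)) in north-east-down axes,
    # so latitude follows from the ratio of the down and north parts.
    return math.degrees(math.atan2(-omega_down, omega_north))

# Simulated perfect measurements at 52° N:
lat = math.radians(52.0)
est = latitude_from_gyro(OMEGA_E * math.cos(lat), -OMEGA_E * math.sin(lat))
print(est)  # ≈ 52.0
```

In practice these components sit under sensor noise and bias, which is why real alignment takes minutes rather than being instantaneous.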

crubier|1 year ago

Aerospace grade laser gyroscopes are incredibly expensive (and bulky), and even then, they still have massive drift after several hours. If you don't have GPS to relocalize precisely at least every day, there is no way you can know the location of the camera on earth for more than a day, even with state of the art aerospace stuff

akira2501|1 year ago

> worked before GPS was available.

"Worked" makes it sound like you throw a switch and it just gives you position data. Those units take anywhere from 6 to 10 minutes to align; if you move the platform, the alignment errors out and you must restart it. Current systems take their initial fix from GPS, but with the early systems the operator had to know the position and key it into the unit manually.

"Worked" with extreme care operated by a qualified professional.

svalorzen|1 year ago

I mean, surely if you are doing something that requires this level of precision, you could just ask the user to input their current known location? I doubt the compensation would differ meaningfully even if the user misdialed by ten or twenty meters (or even if the camera was actually moving around).

Rinzler89|1 year ago

> Inertial navigation systems in aircraft (which use very stabilised platforms with a lot of math involved) worked before GPS was available.

Inertial measurement units for aircraft and submarines cost as much as a house in California. Good luck putting those in a phone.

isoprophlex|1 year ago

> The second solution is much more plausible, but still very difficult. The user would have to be pointing the camera at the subject for long enough such that the drift in their aim at the subject is smaller than the drift from the rotation of the earth. This is also implausible. What is concerning though, is that this second method is one that could work very well to cancel out Earth’s rotation on the CIPA specified stabilization test apparatus.

So, basically dieselgate but for image stabilization

nick7376182|1 year ago

It seems the camera could use optical flow to get a baseline reading and calibrate the inertial frame offsets. They don't need to point accurately for a long time?

Or maybe that is the method they assume for the second solution and they calculated that it's infeasible.

moi2388|1 year ago

Can somebody ELI5 this to me?

The image with the two Earths… that only works if the camera is not also on the ground, but it is? How is the rotation of the object and the camera not identical? Why would it rotate ‘upwards’?

Also, if the issue is relative motion or rotation between camera and object, wouldn’t two sensors, one on the camera and one on the subject be able to solve this, since we can see if their rotations/movement match up or not?

SamBam|1 year ago

Imagine the camera were floating just above the surface of the Earth, and also that it had perfect image stabilization. This image stabilization would keep the camera always oriented in the same direction. Same direction relative to what? To the rest of the universe. So if it was pointing right at a star, it would continue pointing directly at that star as it went around and around the Earth. From our perspective on the surface, the camera would appear to be flipping over itself as it kept pointing at that star.

Unfortunately, this would be pretty bad for taking a picture of something that was right in front of the camera (relative to the surface of the Earth). You'd be in front of the camera, ready for your picture, and the camera would appear to start rotating as it kept that distant star in view.

So with a perfect image stabilizer, this is what the camera is actually trying to do, even when standing on the Earth with a tripod. It actually senses the rotation of the Earth, and tries to cancel it out, just like it would cancel out your hands shaking. But while it's good to cancel out your hands shaking (because that's a motion that's independent of the subject of the photo), it's not good to cancel out the rotation of the Earth (because the subject of the photo is actually moving with you).

llm_trw|1 year ago

The position of the gyro is attached to the earth surface but its orientation is not. See Foucault's Pendulum.

mikewarot|1 year ago

We all want to keep missiles out of the hands of bad people.

Parts to make really good cameras could be taken out and used in missiles, to tell them where to go.

So we now have laws to keep those really good parts out of cameras, for safety. Cameras still work fine, but you need a tripod to get good pictures when it's dark out.

DoctorOetker|1 year ago

This can be fixed in software:

you can back-calculate orientations from high-pass-filtered gyro data, use them to rotate the unfiltered gyro data into the current reference frame, then low-pass the unfiltered but rotation-corrected gyro data to get the Earth's rotation axis in the current reference frame; from that one can estimate the expected rotation that should be ignored.
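A toy illustration of the low-pass half of that idea (my own sketch: one gyro axis, zero sensor bias, purely periodic hand shake assumed):

```python
import numpy as np

fs = 200.0                                  # assumed gyro sample rate, Hz
t = np.arange(0, 30, 1 / fs)                # 30 s of samples

earth_rate = 7.292e-5                       # rad/s seen on one axis
shake = 0.02 * np.sin(2 * np.pi * 5 * t)    # hand shake, ~300x larger
measured = earth_rate + shake

# Crudest possible low-pass: average over many shake periods.
# The periodic part integrates away, leaving the near-DC Earth term.
est = measured.mean()
print(est)  # ≈ 7.292e-05
```

The catch is that a real gyro's bias is also near-DC, so a plain low-pass alone cannot separate bias from Earth rotation; that is what the orientation bookkeeping above is trying to work around.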

chris_va|1 year ago

Solution (2) as written seems to imply that the camera can only use the gyroscope signal while the camera is pointed at the subject, but I cannot see why that is a strong limitation.

In theory, you can take the last N seconds of data from the gyroscope (I assume it is running while the camera is active) to get the overall drift, even if it is tumbling around for a while before being pointed at the subject... Assuming the tumbling has enough periods of time that are correlated with the earth's rotation (e.g. someone carrying it, not pointing it at an aircraft or something moving east-west for the window duration that is anticorrelated with the rotation).

felixhandte|1 year ago

That would only work in the case that the camera is fixed on a tripod and has a long period of stable / rigid pointing before the exposure during which to collect this data. This is sometimes the situation in which image stabilization is used. (But if you can be that stable for that long on a tripod, you may not actually need image stabilization.)

By far the more common case for image stabilization is one in which the photographer is hand-holding the camera and may not frame the subject until the moment before the exposure begins. The camera movement will likely be several orders of magnitude (~4 to 7) larger than the drift that you want to measure. A low pass filter will tell you nothing at all.

At a certain point we can just start using guide stars [0].

[0] https://en.wikipedia.org/wiki/Guide_star

Asraelite|1 year ago

> The first isn’t a good solution for many reasons. Don’t have GPS signal? Shooting next to a magnet? Your system won’t work.

These seem trivial to work around. Just store the last known position and use that. It's rare that you'll be without a GPS signal or beside a magnet, and you certainly won't be traveling long distances in those conditions. And since when do magnets block GPS signals?

sokoloff|1 year ago

It’s not that a magnet blocks GPS signals, but it does affect the compass in the context of using 6 of the 9 degrees of freedom in the first proposed solution: “Use the camera’s GPS, accelerometer, and compass to calculate exactly where it is pointed and its latitude. ” (This solution should also do sensor fusion with the gyroscope, not just accelerometer and compass for orientation from a 9DoF system.)

_ph_|1 year ago

Version 2 sounds to me like the probable reason cameras like the OM1-2 can go over 8 stops. Yes, it is probably not a simple task to measure the Earth's drift with the gyroscopes, but there is one thing that might help: the rate of that drift is exactly known - it is the speed of the Earth's rotation. So it should be possible to tune a very narrow filter to it and analyze the gyroscope signal only for that component. With that one could at least partially compensate for the drift.

sib|1 year ago

Nikon claims 8.0 stops of "VR image stabilization" for their Zf camera (released late in 2023).

https://www.nikonusa.com/p/z-f/1761/overview

("Based on CIPA standards; when using the telephoto end of the NIKKOR Z 24-120mm f/4 S" - for clarity, that lens does not have optical VR in the lens itself, so this is all based on in-body stabilization.)

GuB-42|1 year ago

On the other hand, that should be awesome for astrophotography.

dakr|1 year ago

The issue is rotation of the sky about the line of sight axis. Whether the exposures are short or long, over time the sky will rotate more than what an in-camera system can compensate for (the amount of rotation depends on location/time/direction). Over these timescales a rotator that can perform larger movements is needed. This can be provided by an equatorial mount or an internal rotator.

SamBam|1 year ago

I believe that fancy astrophotography tripods already do that rotation for you, right?

I think that for astrophotography, the shutter times are so long that you have to build the rotation into the tripod, instead of relying on the tiny amount of stabilization that can be done in-camera.

Although maybe it would be helpful to cancel out some motor noise or vibrations from the tripod. But the existing image stabilization probably already does this.

bongodongobob|1 year ago

It has no bearing. Tracking is how you keep stars from smearing, not stabilization.

mrandish|1 year ago

Perhaps in some camera firmware bug database there's a closed bug marked: "Won't fix. Tested working in orbit."

cesaref|1 year ago

This is analogous to astro-photography problems with keeping stars as points rather than as blurred lines in long exposures. If you think about it, if a long exposure at night has a static landscape but moving stars, the IBIS equivalent would have static stars and a moving landscape :)

wiml|1 year ago

There are some fairly enjoyable time lapse videos taken in a non rotating frame:

  https://youtu.be/DmwaUBY53YQ
  https://youtu.be/zRTJ5ISmVXE

Delmololo|1 year ago

You should be able to calculate it out by telling the user to press a button and then not rotate the camera away.

Right?

Might just not be practical at all.

On the other hand, shouldn't the Earth rotate fast enough to figure this out in the short timeframe while the photographer starts looking through the finder?

mikhailfranco|1 year ago

Yes, basically Method (2) with a stable measurement window. Just put the camera down on a stable surface and click a button. The system waits a few ms for the click disturbance to pass, then integrates the signal over some fixed time to establish the rotation; then pick the camera up and continue...

somat|1 year ago

Why not stabilize optically?

I am probably missing something huge, but if the goal is a stable image, why use gyros? Use the image itself to derive the correction for the final integration, sort of the same way videos are stabilized.

thrtythreeforty|1 year ago

You can do this two ways: you can take a bunch of images, and align and stack them. Or you can take one motion-blurred image and infer the convolution kernel somehow. The former is undesirable because each frame you take carries a fixed amount of "read noise" from the sensor, so stacking many short frames makes your total noise worse.

The second way is undesirable because it's really hard. There is a lot of research into this and some of the results are good but some are not.

kybernetyk|1 year ago

This would make the resulting image frame smaller. A no-no in the current full-frame meta.

Photo hobbyists are snobs :)

tetris11|1 year ago

I still don't quite follow the explanation. The duck and I are on the surface of the same body and are rotating together, maintaining a constant distance... why does Earth rotation need to be corrected for?

zamalek|1 year ago

In terms of flatland:

Ignore the camera. Instead you have a planet (a circle in flatland), a gyroscope (an arrow that always points in the same direction on the page in flatland), and Mr Square.

        --> [.]
             |
        /----\
        |    |
        \----/

Start off at noon, with Mr Square and the arrow at the top of the planet, the gyroscope to the left of Mr Square pointing at him. Now progress time by 6 hours, by rotating the planet clockwise by 90 degrees. Mr Square and the gyroscope will move with the surface of the planet, resulting in them being on the right side of the circle on the page (the gyroscope above Mr Square on the page). Mr Square's feet will be on the surface of the planet, meaning his rotation matched the planet. However, the gyroscope always points in the same direction on the page. It's now pointing at the sky.

        /----\
        |    | -->
        \----/-[.]

In conclusion: both Mr Square and the gyroscope move with the surface of the planet - in exactly the same way. However, Mr Square will always be standing (along with everything else on the planet), while the gyroscope always points in the same direction on the page (irrespective of the time of day). A camera using the gyroscope would have to account for that.

We wouldn't have the same issue on a (non-rotating) space station. That's why planetary rotation is blamed.

yetihehe|1 year ago

It's about actual gyroscopes (motion sensors), not optical stabilisation. Gyroscopes in cameras are now so good they can pick up the Earth's rotation. Perfect for stabilising an image of the stars, not so good for stabilising an image of a duck translating over those stars. For that you would need optical stabilisation. In-body stabilisation is inertial, not optical.

Gravityloss|1 year ago

There was an escape system in the Soyuz rocket that fired if the rocket tilted too much. It was based on gyroscopes.

Once, a launch was aborted just before liftoff. The rocket stayed on the pad and the cosmonauts were sitting in the spacecraft for some time. Suddenly the abort system fired and pulled the capsule from the rocket. They landed safely on parachutes.

It was discovered that the Earth had rotated under the pad, and the gyroscope read that as the rocket tilting, so it fired the escape system.

seszett|1 year ago

It's explained here:

> Your camera, which is using its IBIS system to attempt to keep everything as still as possible, may not realize that you are rotating with your subject and will instead try to zero out any rotation of the camera, including that of the Earth

The problem is that the stabilization system tries to compensate for the rotation of the Earth, because it can't tell the difference between the rotation of the Earth (which shouldn't be compensated for) and the movement of the person holding the camera (which should be).

So it would work if you were taking a photo of a subject not rotating together with the Earth. Like the stars.

aljgz|1 year ago

Let's do a small thought experiment: assume you have fixed both your camera and the duck to a surface. Then, while taking the photo, you rotate the surface. The motion sensor in the camera tries to cancel out this motion, which is appropriate for photographing something that is not fixed to the surface, but works poorly for the duck that is moving together with the camera.

lolc|1 year ago

The opposite: Earth rotation is measured by the camera and can't be easily distinguished from camera rotation relative to earth. So image stabilization will also correct for earth rotation, which is undesirable.

contravariant|1 year ago

Well if it keeps pointing in the exact same direction then it would stay fixed on whatever star it is currently pointing towards.

Which is normally not a problem, but relative to something on the surface of the Earth the stars do move.

So I guess you should ask people to stand directly in front of Polaris if at all possible.

aidenn0|1 year ago

You should be able to exceed 6.3 stops if you are pointing north/south rather than east/west, right? Maybe they are just measuring it pointing north/south.

quonn|1 year ago

Would it be possible to correct for the rotation by counter rotating if the orientation of the camera is known (or determined by GPS + compass)?

mikewarot|1 year ago

Bullshit. It's ITAR, they don't want parts floating around in the world that can make a dead nuts accurate INS - inertial navigation system, as this enables weapons we don't want in the wild.

You can stabilize out everything and account for the rotation by simply watching the vector of gravity over time.

kybernetyk|1 year ago

Nikon has 8 stops so they somehow beat physics

kqr|1 year ago

6.3 stops is a lot, though. That's basically the fully usable aperture range of a kit zoom lens.

noselasd|1 year ago

What are these "stops" in this context, for the non-photo nerds?

nimbleal|1 year ago

Yes, or considered another way: a 1/25 s shutter vs. almost 1/2000 s, i.e. a lot of motion blur vs. virtually nothing that could provoke blurring.

vouaobrasil|1 year ago

However, it's not the aperture range that matters. Theoretically, if the Earth were not rotating, then 10 stops would still be useful for long-exposure photography. In other words, the stop differences in stabilization are more useful when you think of them in terms of shutter speed, NOT aperture.
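A back-of-envelope check (my own; the 100 mm figure just uses the conventional 1/focal-length handholding rule as an assumption):

```python
import math

stops = 6.3
factor = 2 ** stops
print(round(factor, 1))        # 78.8: 6.3 stops is ~80x longer usable shutter time

# The 1/25 s vs 1/2000 s example mentioned above, in stops:
print(round(math.log2(2000 / 25), 2))   # 6.32

# e.g. 1/100 s handheld at 100 mm becomes roughly:
print(round(factor / 100, 2))  # 0.79 s
```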

imglorp|1 year ago

Is a plain phone gyroscope enough to detect Earth rotation? Is there an app for that?
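For scale (my own numbers; the rotation rate is exact, the sensor figure is a rough general-knowledge range): Earth turns at about 15°/hour, while phone-grade MEMS gyros typically have bias drift quoted anywhere from a few to tens of °/hour, so the signal is at or below the noise floor without long averaging.

```python
import math

OMEGA_E = 7.2921159e-5                      # rad/s, Earth's rotation rate
deg_per_hour = math.degrees(OMEGA_E) * 3600
print(round(deg_per_hour, 2))               # 15.04 °/h
```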

pixelpoet|1 year ago

Yet another example of b0rked / unescaped TeX, specifically log vs \log in this case. Blows my mind that nobody sees it...