top | item 21645249

Astrophotography with Night Sight on Pixel Phones

167 points | JayXon | 6 years ago | ai.googleblog.com

72 comments

[+] emptybits | 6 years ago
This is amazing. Now just FYI, for those curious, some recent "old guard" innovations by pro camera makers for astrophotography include:

1. Leveraging the multi-axis image-stabilizing movement available to an in-camera DSLR sensor with GPS for the purpose of tracking the sky during a long-ish exposure to reduce star trails. Ricoh-Pentax Astrotracer. http://www.ricoh-imaging.co.jp/english/photo-life/astro/

2. Removing the camera's IR filter, allowing it to capture hydrogen-alpha light (656 nm). This captures energy (image, color) not otherwise seen by normal camera sensors. Canon EOS Ra. https://www.usa.canon.com/internet/portal/us/home/products/d...
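For the curious, the math behind why tracking (1) matters is easy to sketch. This is my own back-of-the-envelope in Python, not Ricoh's actual algorithm, and the focal length and pixel pitch in the example are made-up values:

```python
import math

# The sky appears to rotate once per sidereal day, so a fixed camera sees
# stars near the celestial equator drift at ~15 arcseconds per second.
SIDEREAL_DAY_S = 86164.1
DRIFT_ARCSEC_PER_S = 360.0 * 3600.0 / SIDEREAL_DAY_S  # ~15.04

def max_untracked_exposure_s(focal_length_mm, pixel_pitch_um, max_trail_px=1.0):
    """Longest exposure before a star near the celestial equator trails
    across more than max_trail_px pixels on a fixed tripod."""
    # Plate scale: arcseconds of sky per pixel = 206265 * pitch / focal length
    arcsec_per_px = 206.265 * pixel_pitch_um / focal_length_mm
    return max_trail_px * arcsec_per_px / DRIFT_ARCSEC_PER_S
```

With an assumed 20mm lens and 6µm pixels, trails appear after only about 4 seconds; moving the sensor to follow that drift is what buys the longer exposures.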

I know this only because I've been researching a DSLR/mirrorless camera upgrade, but I keep delaying it whenever I'm reminded how excellent phone cameras have become! Unless you're a pro, a pixel-peeper, an artist, or someone who simply enjoys the machinery and the process, a phone may be all you need.

[+] dr_zoidberg | 6 years ago
I didn't quite like the article. At the beginning it shows a good picture of the Milky Way and says:

> The image has not been retouched or post-processed in any way.

Then the whole article describes how they've automated a full astrophotography processing pipeline inside the phone that does heavy post-processing...

[+] 6gvONxR4sf7o | 6 years ago
If you're delaying a DSLR/mirrorless because the camera already in your pocket is good, I would suggest also looking into the highest end point-and-shoots. They're getting crazy useful, they shoot in RAW, they have most of the bells and whistles you'd want outside of physical stuff like changeable lens options. Mine is Good Enough™ for everything hobbyist and I'm super happy with it. And it fits in my pocket.
[+] mobilemidget | 6 years ago
A pixel-peeper, perhaps, but these images only look nice on a smartphone screen. Print one or view it on a large screen, and the pictures from a mobile phone give a whole different feeling and sense of quality.
[+] anovikov | 6 years ago
(2) is about 15 years old, but can you explain how (1) actually works?
[+] saiya-jin | 6 years ago
As a full-frame DSLR shooter (Nikon D750 + 20mm f/1.8), this is wild considering those crappy tiny sensors on phones. To get similar (albeit much, much sharper) results, I have to lug around 2 kg of camera and lens plus a bulky tripod.

Even with this, getting those dark dust clouds and stark colors requires heavy post-processing (which I mostly don't do because I consider it too much of an alteration of the original image, even though it creates a more interesting one). Don't think for a second that those superb images you see everywhere are not literally over-painted in Photoshop (look at online tutorials on how it's done if you don't believe me).

I guess to make things impressive, the Google guys went to some properly remote desert far from any artificial light. And unless I missed something, they still used a tripod of sorts. In the European Alps, this kind of result is practically impossible: there is always some tiny village in every valley, and even if not, light pollution seeps in from afar. One night panorama of mine has quite a strong glow coming from the village of Chamonix some 15 km away, on the other side of the massive Mont Blanc range [1]. Anything can be achieved if you start playing a lot with Photoshop brushes, layers, etc., but for me that's one step too far.

Imagine what results could be had if such algorithms were paired with a full-frame (or bigger) sensor!

[1] https://www.flickr.com/photos/99251154@N04/22790364795/in/al...

[+] lm28469 | 6 years ago
> To get similar (albeit much, much sharper) results

Tbh it's not that hard to get sharper results. This was shot on a 40-year-old camera with a 50+-year-old lens, on a cheapo carbon fiber tripod, in 25+ mph wind that probably rocked my camera quite a bit:

https://i.imgur.com/sdxyyEw.jpg

> And unless I missed something, they still used some tripod.

They did: "Clearly, this cannot work with a handheld camera; the phone would have to be placed on a tripod, a rock, or whatever else might be available to hold the camera steady."

[+] rbritton | 6 years ago
Zoomed out, those photos aren't terrible, but if you click through to a higher-resolution version, you can see the softness created by the noise-removal algorithm. In my experience, the image quality is comparable to DSLRs of about ten years ago. Noise removal is a band-aid: in astrophotography it will virtually always look softer than reality, and it will also have removed actual stellar objects. There is no substitute for a better, lower-noise sensor, but you work with what you have.

Noise level is inversely correlated with sensor size: smaller sensors have more noise, and the least noise is found on full-frame (i.e., 35mm) or larger sensors. Phones have small sensors.
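The scaling argument behind this fits in a few lines. A sketch under simplifying assumptions (photon shot noise dominates; read noise and quantum efficiency are ignored):

```python
import math

# Bigger sensors collect more photons for the same exposure, and shot-noise
# SNR equals the square root of the photon count.
def shot_noise_snr(photons):
    return photons / math.sqrt(photons)  # simplifies to sqrt(photons)

def snr_advantage(area_large_mm2, area_small_mm2):
    """SNR ratio of the larger sensor at equal exposure, under the
    shot-noise-only model above."""
    return math.sqrt(area_large_mm2 / area_small_mm2)

# Full frame (~864 mm^2) vs. a typical phone sensor (~24 mm^2): ~6x SNR.
```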

[+] jillesvangurp | 6 years ago
This is not just simple noise removal but combining multiple photos in which the noise appears in different places (aside from the hot pixels). So it's a strategy to get much of the same effect as a really long exposure, with much less noise than you'd expect from the tiny sensor, because the noise averages out over multiple shots. In principle this is of course possible with a DSLR as well; e.g., Hugin might be able to do this.

Of course regular noise cancellation and other very lossy processing still kicks in after that (which may explain the blurry result). It would be interesting to look at the raw image produced by this.

I use Open Camera on my cheap Nokia 7 Plus (which uses two cameras) and have been getting OK-ish results in Darktable. The DNG file you get combines information from both sensors. One of them is black and white, so these look really flat until you fix it in post-processing. The raw photos have lots of noise (as you would expect), but noise filtering is pretty effective.

I imagine for this it would produce a dng with information from the different stills combined but none of the other post processing (except maybe hot pixel removal).
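The averaging effect is easy to demonstrate with a toy simulation (pure Python, my own illustration rather than Google's or Hugin's actual pipeline):

```python
import random

# Stacking N noisy frames of a static scene: the signal is identical in
# every frame, while the random noise averages toward zero, shrinking its
# standard deviation by roughly sqrt(N).
def stack(frames):
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

def rms_error(frame, true_value):
    return (sum((p - true_value) ** 2 for p in frame) / len(frame)) ** 0.5

random.seed(42)
TRUE_SIGNAL = 100.0
frames = [[TRUE_SIGNAL + random.gauss(0, 10) for _ in range(5000)]
          for _ in range(16)]

single_noise = rms_error(frames[0], TRUE_SIGNAL)       # ~10
stacked_noise = rms_error(stack(frames), TRUE_SIGNAL)  # ~10 / sqrt(16)
```

With 16 frames the residual noise comes out around a quarter of a single frame's, which is the sqrt(N) improvement the burst approach is banking on.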

[+] baq | 6 years ago
that's exactly what is amazing about this tech - 10 years ago you'd have to carry a backpack, now the same quality pictures can be taken with a multi-purpose device which fits in your pocket. nothing to complain about if you ask me!
[+] tigershark | 6 years ago
Not exactly: noise level is inversely correlated with the size of the photosensitive elements. The smaller they are, the more noise you have. For example, an 8 MP APS-C sensor should have less noise than a 40 MP full-frame sensor.
[+] bla3 | 6 years ago
This is super cool tech. But I can't help thinking that this shows that the Pixel camera lead must be someone with an engineering background who got nerd sniped by this problem because it's cool and hard to do. Someone with a product background would've focused on something less niche.

On the other hand, some people who like Google like it because it still sometimes works on geeky, cool, fun stuff instead of being super product focused.

[+] tln | 6 years ago
My impression is that Night Sight as a feature has been pretty impactful in the market. As in, people buy the phones on this feature alone.
[+] cageface | 6 years ago
I'd prefer to see Google focus on just making the Pixel line better everyday phones than working on exotic stuff like this or the motion sensor they added in the Pixel 4. Right now there's not much reason to choose a Pixel unless stock Android is your top priority.
[+] driverdan | 6 years ago
What needs improvement? I have a Pixel 2 and it's a great everyday phone. The only thing I can think of is a bigger battery but that's a problem with every phone.
[+] neogodless | 6 years ago
A few years ago, it seemed like the manufacturers were going haywire with terrible, clunky launchers that slowed down already slow hardware. A clean, stock Android seemed like the only option for reasonable performance, and there was a promise of timely updates.

Well, now hardware is fine and the "updates" are starting to feel less and less valuable. They no longer bring faster, cleaner interfaces. They just bring some new widgets and gizmos.

Now, I think the Pixel line is... OK. My wife and I have a Pixel 2 XL (used, eBay) and a Pixel 3 (spring sale for $400). And the 3a line is close to "everyday" pricing, especially when it goes on sale. But I'm starting to question whether it's worth sticking to Google's stock phones or if it's time to start cross-shopping competitors once again. For so many of us, though, being able to snap photos and have them look pretty good is a nice comfort after the ugly early years of camera phones, which required a lot of patience and persistence to use.

So in my opinion, there's value in putting resources into ensuring good photography, even if that's not the priority for every phone buyer. What are the best alternative phones with "good enough" cameras and "everyday" pricing?

[+] AllanHoustonSt | 6 years ago
I don't think Google has it in them. Something about their vertical integration, supply chain management, cross team coordination, etc etc. They just consistently put out phones that are subpar in the daily usability and quality control fronts. So I'm kinda glad they go for these exotic features. They know they have to stand out somehow.
[+] harrygeez | 6 years ago
Having Chrome on my phone, or my choice of any alternative web browser, is a big enough reason for me to pick Android. Besides, my experience with Android in the last couple of years has been more solid and consistent than iOS, even if the UI and animations are less polished.
[+] habosa | 6 years ago
The photos are gorgeous and I would definitely use this mode.

Astrophotography always kind of rubs me the wrong way, though, because that's not how it looks. Even if you go out somewhere really dark, like a large national park, and wait for a clear night, it's never going to look to your eye like it does on Instagram. Don't get me wrong, what you do see is absolutely magnificent; it's just not what's in those pictures.

Seeing the galaxy with your own eyes is one of the most majestic things you'll ever witness. It's something that has inspired spontaneous prayer throughout history. It doesn't really need a filter.

[+] jakecopp | 6 years ago
If all the advancement in smartphone photography is in software, why aren't DSLR/mirrorless manufacturers doing it too?

I don't want my camera to have a touchscreen/social media/wifi but it'd be cool if Adobe Camera Raw/some alternative could do this stuff!

[+] ygra | 6 years ago
IMHO they've basically missed an opportunity here for many years. They'd be in a perfect position to offer those things, combined with a much better/larger sensor, which enables even better images. On smartphones it's a matter of necessity, as the sensor is (fairly) crappy in comparison, but on a DSLR it could still be a benefit. Personally I'd be perfectly happy to get a pre-processed DNG from the camera instead of having to do this afterwards. And then give me the raw files to do it manually as well.

Perhaps they're trying not to cannibalize their lower market segments or think that professionals would never use those things (on which they might be correct). But I can definitely see that computational photography beyond raw->JPEG conversion with a color profile could have its place in a DSLR.

[+] penagwin | 6 years ago
Idk, I feel like we have far better software on our computers, and mirrorless/DSLRs will always beat our phones in terms of raw specs.

This is more of a "Now anyone can capture decent astrophotography with just their phone!" thing than some revolutionary new technique, if that makes sense. Basically they have to push their phone's sensor as far as it'll go and use software to remove the noise, instead of using a better lens/sensor setup (which is space- and cost-prohibitive for a phone).

You're still welcome to do extensive post-processing on your computer (I don't want my camera doing any processing), and indeed that's what any astrophotographer who wanted those results would do.

That said I've captured some amazing night/moon photos with my A6000 that I have never been able to achieve before, and that's without any extra processing.

[+] lm28469 | 6 years ago
> why aren't DSLR/mirrorless manufacturers doing it too?

Most of them are historic companies that are probably very old-school and slow to adapt. Google probably has access to better software engineers than Nikon or Canon, which seem barely able to develop working Bluetooth/Wi-Fi sync.

[+] PopePompus | 6 years ago
The photos taken in this mode are amazing. But I think they are improved a lot if a vignetting correction is applied. Without that correction, the sky brightness is much greater near the center of the field than near the edges.
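A flat-field style correction is simple to sketch. A minimal pure-Python version using a cos⁴ natural-vignetting model (the model and its strength parameter are illustrative assumptions, not what the Pixel pipeline actually does; real corrections would be calibrated against the lens):

```python
import math

def vignette_gain(x, y, w, h, strength=1.0):
    """Multiplicative gain that brightens pixels toward the corners,
    inverting a cos^4 natural-vignetting falloff."""
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    # Normalized radius: 0 at the image center, 1 at the corners.
    r = math.hypot(x - cx, y - cy) / math.hypot(cx, cy)
    falloff = math.cos(math.atan(r * strength)) ** 4
    return 1.0 / falloff

def correct(image, strength=1.0):
    """Apply the gain to a row-major 2D list of pixel values."""
    h, w = len(image), len(image[0])
    return [[image[y][x] * vignette_gain(x, y, w, h, strength)
             for x in range(w)] for y in range(h)]
```

The gain is 1.0 at the center and grows toward the edges, which is exactly the brightness gradient being described.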
[+] kohtatsu | 6 years ago
Any recommendations for tripods/cases?

iPhone 11 Pro here.

[+] imvetri | 6 years ago
What are the chances that this AI camera app is faking the image?

Here is why I ask:

* It has a gyro, so it knows whether we are pointing at the sky or not.

* AI checks whether it's a clear sky. If yes, post a fake image. If no, don't risk getting caught.

* Time + geolocation give the angle and position of the camera relative to the space above us.

[+] PopePompus | 6 years ago
There is no chance this is being faked. I've been playing with the astrophotography mode on a Pixel 4 XL in a remote country location with no cell phone connectivity. A single 4-minute exposure is able to record stars down to about magnitude 9.5. A single exposure can detect the Crab Nebula, or the two brightest satellite galaxies of the Andromeda Galaxy (M32 and NGC 205). To fake such results without internet connectivity, the camera software would have to have an internal catalog of about 250,000 stars, with accurate colors and coordinates. It would need the sizes and shapes of nebulae, the contours and brightness of the Milky Way, etc. When I add multiple frames, I see fainter stars appear, as they should, so the internal catalog would really need millions of stars, most of which would not show up unless the user carefully aligned multiple individual images and summed them. This is not being faked.
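As a sanity check on those numbers, here is the standard magnitude arithmetic (textbook astronomy formulas, nothing Pixel-specific):

```python
import math

# Magnitudes are a log scale: 5 magnitudes = a factor of 100 in flux.
def flux_ratio(brighter_mag, fainter_mag):
    """How many times brighter the first star is than the second."""
    return 10 ** (0.4 * (fainter_mag - brighter_mag))

# Averaging N aligned frames improves SNR by sqrt(N), pushing the limiting
# magnitude deeper by 2.5 * log10(sqrt(N)).
def limiting_mag_gain(n_frames):
    return 2.5 * math.log10(math.sqrt(n_frames))
```

So summing 16 aligned frames should reach roughly 1.5 magnitudes fainter than a single exposure, consistent with fainter stars appearing as frames are added.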
[+] kevingadd | 6 years ago
"Faking" isn't thinking about it right. There's no point in going through all the trouble of manually constructing a fake sky photo like that when you can just aggressively train a machine learning model to produce good-looking skies in photographs, which they appear to have done for the purposes of things like adjusting the brightness of the sky in night-time photos. In the end, it's not a question of whether it's "fake", just how much of the resulting photo is the invention of a neural net instead of the result of light hitting the sensor.

In the end photography like this is art, though, so if the person taking the shot is happy with it, then it's fine, probably. Just don't enter it in a competition with rules against retouching...

[+] ganitarashid | 6 years ago
As long as you manually control the exposure and exposure time, the same can be achieved with an iPhone camera. This is not specific to Pixel.
[+] oceanofsolaris | 6 years ago
They mention a couple of features that are not just "long exposure" in the post:

* Compensating for moving stars

* "Live viewfinder" during exposure

* Selectively darkening the sky

* Dark current compensation (though that is probably needed for all long-exposure photography...still, not a simple "more exposure" feature)
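The last item is the classic dark-frame subtraction technique. A generic sketch of the method (whether the Pixel does exactly this per frame is my assumption):

```python
# A "dark" exposure taken with the lens covered records only hot pixels and
# thermal (dark-current) signal. Subtracting it from the real "light" frame
# removes that fixed-pattern contribution, clamping results at zero.
def subtract_dark(light_frame, dark_frame):
    return [[max(light - dark, 0) for light, dark in zip(lrow, drow)]
            for lrow, drow in zip(light_frame, dark_frame)]

# Example: the 200-valued hot pixel and uniform thermal offset are removed.
cleaned = subtract_dark([[10, 200], [30, 40]], [[2, 150], [5, 50]])
```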

[+] pixelpoet | 6 years ago
It seems they have some ML stuff in there for specific features, e.g. sky / land light balance and hot pixel removal (probably similar to how denoising for MC path tracing works).

Aside: I randomly recognised Ryan Geiss in the credits; he did the Milkdrop plugin for Winamp back in the day, and also some cool tech demos for Nvidia...