item 44366011

binarystargazer | 8 months ago

I'm the Rubin team member responsible for mapping the data into RGB images. I have been a long-time reader of Hacker News, but finally made an account to comment on this. I wanted to thank everyone here for their interest and for taking the time to check out these images. Seeing everyone interested and engaged makes all the long hours worth it.

phkahler|8 months ago

What range of wavelengths are in the original images? Do you produce multiple RGB images for looking at different things? c'mon, what does that entail? ;-)

binarystargazer|8 months ago

The filters used for this range from near-infrared to near-UV. We used 4 different filters in all (for this image; the telescope has more). In general, yes: to fully appreciate all the color information as a human, we need to generate different color combos so our eyes can pick up different contrasts.
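To make the idea of "different color combos" concrete, here is a minimal sketch (not the actual Rubin pipeline; the band names and weights are purely illustrative) of collapsing four filter-band images into three RGB channels with a linear weight matrix. Different matrices give different "combos" that emphasize different contrasts:

```python
import numpy as np

def bands_to_rgb(bands, weights):
    """Mix N filter-band images into 3 RGB channels.

    bands:   array of shape (4, H, W), one image per filter band
    weights: array of shape (3, 4), each row defines one RGB channel
             as a linear combination of the four bands
    Returns an array of shape (3, H, W).
    """
    # Contract the band axis of `weights` with the band axis of `bands`.
    return np.tensordot(weights, bands, axes=1)

# Illustrative weights only: redder bands feed the red channel,
# bluer bands feed the blue channel. Rows sum to 1 to preserve flux.
weights = np.array([
    [0.0, 0.0, 0.3, 0.7],  # red   <- two reddest bands
    [0.0, 0.7, 0.3, 0.0],  # green <- middle bands
    [0.8, 0.2, 0.0, 0.0],  # blue  <- two bluest bands
])
```

Swapping in a different weight matrix is how you generate an alternative color combo from the same underlying data.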

However, what we strive for is being accurate to "if your eyes COULD see like this, it would look like this". To the best of our ability, of course. We did a lot of research into human perception to create this and tried to map the color and intensity information in a similar way to how your brain constructs that information into an image.
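One widely used scheme for this kind of perceptually motivated mapping in survey astronomy is the asinh stretch of Lupton et al. (2004), which compresses the enormous intensity range while preserving color ratios, so bright regions don't wash out to white. A hedged sketch (parameter names and values are illustrative; the post doesn't say which stretch Rubin uses):

```python
import numpy as np

def asinh_rgb(r, g, b, stretch=5.0, Q=8.0):
    """Map three band images to displayable RGB via an asinh stretch
    (Lupton et al. 2004 style). `stretch` and `Q` control how hard
    the dynamic range is compressed; values here are arbitrary."""
    i = (r + g + b) / 3.0          # mean intensity per pixel
    i = np.maximum(i, 1e-12)       # guard against divide-by-zero
    # asinh is linear for faint pixels and logarithmic for bright ones,
    # loosely matching how the eye perceives brightness.
    f = np.arcsinh(Q * i / stretch) / (Q * i)
    rgb = np.stack([r * f, g * f, b * f], axis=-1)
    return np.clip(rgb, 0.0, 1.0)
```

Because all three channels are scaled by the same factor `f`, a pixel's hue (its color ratios) survives the compression; only its brightness is remapped.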

Let me tell you, I did not appreciate how deep a topic this was before starting, and how limited our file formats and electronic reproduction capabilities are for this. The data has such a range of information (in color and intensity) that it is hard to encode into existing formats that most people are able to display. I really want to spend some time to do this in modern HDR (true HDR, not tone-mapping) where the brightness can actually be encoded separately from just RGB values. The documentation on these (several competing) formats is a bit all over the place though.
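For readers unfamiliar with "true HDR": the common thread in the competing formats is an absolute-luminance transfer function rather than a display-relative gamma. The PQ curve from SMPTE ST 2084 (used by HDR10 and Dolby Vision) is one such function; the constants below come from that spec, while the framing around it is just a sketch, not anything from the Rubin workflow:

```python
import numpy as np

def pq_encode(y):
    """SMPTE ST 2084 (PQ) encoding.

    y: linear luminance normalized so 1.0 == 10,000 nits.
    Returns a perceptually uniform signal in [0, 1]. Constants are
    the exact rational values from the ST 2084 specification.
    """
    m1 = 2610 / 16384          # ~0.1593
    m2 = 2523 / 4096 * 128     # ~78.84
    c1 = 3424 / 4096           # ~0.8359
    c2 = 2413 / 4096 * 32      # ~18.85
    c3 = 2392 / 4096 * 32      # ~18.69
    yp = np.power(np.clip(y, 0.0, 1.0), m1)
    return np.power((c1 + c2 * yp) / (1.0 + c3 * yp), m2)
```

Because the curve is defined against absolute nits, a PQ-encoded pixel means the same brightness on every compliant display, which is exactly the "brightness encoded separately from just RGB values" property the parent comment is after.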

Edit: I wanted to add, if anyone reading this is an expert in HDR formats and/or processing, I'd love to pick your brain a bit!

legohead|8 months ago

Thank you for your work!