This video is a great primer on the current state of the art in display technology, and it seriously changed how I view each tech. It seems like there is a lot of convergence between the main technologies; they all borrow from each other in different ways in their pursuit of the ideal display.
And yet even decent HiDPI displays (I'm talking 300+ ppi) are barely a thing outside the Apple ecosystem, and rarely affordable. It seems like everyone is okay with 20-year-old display resolutions and keeps pushing for higher refresh rates instead. I for one just don't want pixelated/blurry fonts, and I care little about refresh rates.
This is driving me insane. There are literally five desktop monitor models on the market today that provide a natively scaled experience for macOS users, all over $1000. The only monitors that would provide the equivalent screen real estate of my 2005 30" Apple Cinema Display at 2560x1600 (at 2x "retina" scaling, that means 5120x3200) are the $2800 Dell 6K and the $5000 Apple Pro Display XDR.
Apple released "retina" scaling in 2012. It's been more than 10 years.
Apple notebooks have 224-254 ppi; the external displays, 218 ppi.
The only higher-ppi displays from Apple are on the iPhone, and they are not special: most decent Android phones from 5 years ago had 400+ ppi. Funnily enough, Apple was dragging its feet in this space back then.
Apple is more consistent, but it really isn't hard to get 4K laptop displays now.
As I get older I care about bigger fonts, and therefore bigger monitors. Higher resolution does nothing for me, really; I need bigger fonts, not better anti-aliasing. My main monitor is a 43" 4K 144 Hz panel, which is basically the same dpi as a 27" 2560x1440, of which I have four in a semi-sphere setup. If I ran that 4K at half resolution to get nice fonts, I'd have massive eye strain. Instead I can keep the monitors at a comfortable arm's length, so I don't squint up close and still get tons of text on screen.
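A quick sanity check of that dpi claim, just diagonal math from the stated sizes:

    import math

    def ppi(width_px, height_px, diagonal_in):
        # pixels per inch = diagonal pixel count / diagonal size in inches
        return math.hypot(width_px, height_px) / diagonal_in

    print(ppi(3840, 2160, 43))  # ~102 ppi for the 43" 4K
    print(ppi(2560, 1440, 27))  # ~109 ppi for the 27" 1440p

So the two really are within a few percent of each other.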
I also get much better frame rates than I would pushing five 4K or 8K screens.
I tried out Apple's absurd 6K display, and it's just so small that I'd be getting half the text on the screen, which is basically throwing money at Apple for no reason.
The 43" 4K is glorious for games and for Fusion 360. Also as a grow light :)
Outside of the Apple ecosystem you still have apps struggling with high DPI, it is a pain to mix with non-HiDPI displays, and you get worse performance and higher power consumption... and for what? A bit more relative crispness? Maybe I'm old-school, but native resolution, even with visible pixels, looks way sharper for OS UI borders than something that is scaled.
I'm no expert in display manufacturing, but as I understand it, driving a higher refresh rate is a much different challenge than creating more densely packed pixels.
Display overclocking has been a thing for the longest time, which implies that getting more refreshes is often a product of the display controller and of how reliably your panel can run at higher voltage.
Getting high yield on larger displays with high ppi is still tricky, IIRC, especially when some 1080p displays can still come with dead pixels.
I am sure companies would push for higher-pixel-count displays if the economics were aligned.
> Replacing the fluorescent blue with phosphorescent blue will mean a more balanced pixel structure and could enable higher-resolution displays in the future. In the near term, the switch will lead to an approximate 25 percent gain in efficiency
I would have expected a 50% gain. According to the quoted efficiencies, the blue fluorescent subpixel needs 4x more power (at 25% efficiency) than the phosphorescent red and green subpixels (at near 100% efficiency).
So making the blue phosphorescent as well should reduce 1+1+4 to 1+1+1 in power, a 50% reduction (technically a 100% gain in efficiency). Why is the near-term gain only 25%?
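Back-of-the-envelope version of that, taking the quoted efficiencies at face value and assuming each subpixel must deliver the same useful light:

    # relative drive power per subpixel = 1 / internal efficiency
    red, green = 1 / 1.00, 1 / 1.00   # phosphorescent, ~100% efficient
    blue_fl    = 1 / 0.25             # fluorescent blue, ~25% efficient
    blue_ph    = 1 / 1.00             # phosphorescent blue

    before = red + green + blue_fl    # 1 + 1 + 4 = 6
    after  = red + green + blue_ph    # 1 + 1 + 1 = 3

    print(after / before)             # 0.5 -> half the power, double the efficiency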
My professional experience is that blue subpixels for a 5500K balanced white use about 50% of total power (rather than the 66% you show). My understanding is that this is because even though blue (460 nm) is a shorter wavelength (higher photon energy) than green and red (530 and 610 nm), it is also significantly less bright (in photons/sec/solid angle).
I've struggled to find a good webpage, but roughly, in subpixel power it comes to 45% + 35% + 4x(20%) = 160%. By improving blue efficiency it could become 45% + 35% + 20% = 100%, requiring only ~2/3 of the original power and improving total display power efficiency by ~50% (ignoring all the computation, RC losses, comms, etc.).
White-balanced power is independent of the number of pixels or the pixel arrangement (RGB vs RGGB), but RGBW, RGBY, or RGBC can improve efficiency (and reduce this relative improvement).
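Same sketch as the 1+1+4 calculation above, but with these rough white-balance weights instead of equal thirds:

    # approximate share of ideal white-balanced power per subpixel
    red, green, blue = 0.45, 0.35, 0.20

    before = red + green + 4 * blue   # 1.60 with fluorescent blue (4x power penalty)
    after  = red + green + blue       # 1.00 with phosphorescent blue

    print(after / before)             # ~0.625, i.e. roughly 2/3 of the power

which works out to roughly the ~50% panel-level efficiency improvement mentioned above.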
Not just VR; think AR wearables as well. This helps enable smaller power sources without sacrificing capabilities, which in turn improves form factors.
I hope this leads to a phone / smart watch that lasts multiple days. Does anyone know how energy requirements break down between CPU and display in a typical device?
One problem for OLED screens compared to LCDs is their rather low maximum brightness. Unfortunately it doesn't seem like this new blue dye will change much about that.
This might even rub off on glow-in-the-dark technology. Exciting! Most glow-in-the-dark materials in blue are feeble by comparison to green and must rely on various inefficient tricks.
This seems like well-disguised marketing speak. The biggest weakness for OLEDs isn't brightness or battery life; it's burn-in, or rather burn-out. Will blue PHOLEDs make it so device manufacturers will stop telling me to set the Taskbar to auto-hide? If not, I don't see why anyone outside the industry should give a damn.
> Will blue PHOLEDs make it so device manufacturers will stop telling me to set the Taskbar to auto-hide?
It will help; the article says as much. OLED burn-in is a function of how hard the pixel is being driven: greater efficiency means less current required for the same brightness, which means less heat generated, which means longer-lasting displays.
What's the state of MiniLED for gaming/movies? Isn't that the best of both worlds? No burn-in, higher brightness (I know OLED would be a pain to use in a room where the blinds can't be pulled down all the time). And can't the image quality match, or even beat, OLED?
FWIW, I don't think you should wait. I bought this year's Samsung S95C QD-OLED, and it is such a nice-looking display that I feel we're well into the territory of diminishing returns for further improvements. I was struck by how much nicer 4K HDR movies look compared to the last few times I went to a movie theatre.
The only real downside is that now I notice just how much content is not 4K HDR. Improving the upscaler's software would probably make a bigger real-world difference than improving the panel, at least for me.
I went back to an LCD phone. So much better for reading - for me. I have no idea if it was because of PWM flicker or something else. I just hope they keep making phones with LCD screens.
I'm just waiting for a TV that has instant startup times like phone displays. Why they can't replicate the instant sleep/wake that mobile phones manage is a mystery to me.
I don't think it has anything to do with the display.
It's the TV software waking itself up from power-saving sleep mode, possibly combined with some HDMI negotiation, which may involve waking up a second device from sleep like your Apple TV or Xbox.
Isn't that also down to legislation mandating devices to be off and have a maximum off power draw? Mind you, phones seem to be doing alright in that regard.
Their software takes a while to boot. If they had something like VRRoom internally, the user experience would be a lot better: faster input switching, etc.
Sure, that's annoying, but I cannot understand how syncing up to an HDMI signal can take 5 to 10 seconds. Frankly, I can't understand why it's not measured in milliseconds. WTF are TVs doing? Is the protocol that bad at getting a picture to the screen quickly, or is it the TVs? Just switching from SDR to HDR blacks out the screen for multiple seconds. Come on.
https://www.youtube.com/watch?v=TyUA1OmXMXA&pp=ygUjZGlzcGxhe...
IIRC it talks about PHOLED as one of the upcoming technologies for getting to that pinnacle.
But in the last few years they have become noticeably more common.
This year, at an outdoor IT event, I went up to the display wall behind the speaker to check it out.
It was full color, fast, and bright (we are talking a cloudless, hot summer day, with the display sitting in direct sun), and it was LEDs!
Crazy impressive.
https://spectrum.ieee.org/bright-blue-pholeds-almost-ready-f...
I thought 4K was great, but if I can get a 25% increase in dpi or better efficiency, I'm very interested!
The thing that takes ages to boot up is the 'smart' functionality: on-screen display, HDMI link training, etc.
https://hdfury.com/product/8k-vrroom-40gbps/