dbttdft's comments

dbttdft | 1 year ago | on: The Weird Nerd comes with trade-offs

It was *very* hard to decode what this article is about, because it opens with "quickly forgot about their support for “Women in STEM”", which (after three reads of the first two paragraphs, and then closing the page) turns out to have nothing to do with anything, but makes it sound as if the article is about identity politics, which it isn't.

dbttdft | 1 year ago | on: Microsoft AI spying scandal: time to rethink privacy standards

> we need to step back and look at what a healthy technological ecosystem would look like

...aaand, as usual, Mr. Tech Commentator was right up until this point. There doesn't need to be a balance. People are always talking about "balance". What balance? TV and radio didn't have to spy on you to operate a profitable business in the broadcast days. I can do anything on an Amiga, with a UI arguably no worse than the latest versions of Windows, and it will never have to phone home for anything. This opinion itself, ironically, is just a shifting baseline. You are talking about "balance" (translation: compromises) because, of the three maintained pieces of software in your domain (say, a camera in your house), two are by scum corporations and the third is some garbage-quality open source project. Nothing stops someone from making actually good software and hardware, closed or open.

dbttdft | 2 years ago | on: The internet isn't dying, it's changing

The more viewers your website gets, the more money it needs for hosting (unlike proper ways of serving documents, like torrents), and this naturally leads to centralization. The more americunts browse your site, the more likely it needs lawyers, and since you are now a business (which you shouldn't be, because for-profit websites are garbage), you have to act "responsibly", like a business. The more you publish stuff that goes against the grain, the more your site gets DDoSed. Today even just being impolite is enough to get DDoSed by some kind of blue-haired "anarchist". I mean this literally, not in some Nazi way: merely writing the Spanish word for black, regardless of context, is enough to get DDoSed. You need professional publishers to imagine every single problematic thing you could write if you want your blog to reach a mere 1 million viewers today. The more you do anything whatsoever, the more your website gets pushed around, until eventually it has no choice but to go with some big enterprise data center with its own stupid rules, like Cloudflare, which blocks every second user with its broken firewall and constantly insists it's right.

dbttdft | 2 years ago | on: How to Get Out of Vi (2015)

> Given that the vi user interface is logical, and therefore easy to learn

This is the number one problem with the tech industry. Autists don't understand that people other than them don't spend thousands of hours on whatever they happen to be interested in (for example, caring this much about one text editor, when there are literally 1000 other things to learn in operating a computer). While you're at it, explain that you're supposed to press Esc, then wait 500ms (it's obvious why it works this way if you invest a few hundred hours studying terminals), then type :q. And explain how people born in 2010 are supposed to know what a terminal is, when almost nobody used terminals even by the late 90s.

dbttdft | 2 years ago | on: The internet isn't dying, it's changing

You're wrong, because you're just saying Cloudflare is making the web more centralized. The web is centralized by design, and a defective technology. It costs money to host text files, and for no reason; see BitTorrent for a counterexample (and don't talk to me about unseeded content, because you could literally just seed your own website if you care about it, so it would be no different from the current web).

dbttdft | 2 years ago | on: The Secret Life of XY Monitors (2001)

Actually, people who consistently spread the misinformation that powers the corporate racket normalizing low-quality technology deserve to be shamed. You just naively value politeness over correctness, which I'm well aware is a huge problem on sites like HN, Slashdot, and Reddit.

Where are you every time some stupid Apple or Tesla user explains that other people are just "poor", so they won't get why their overpriced garbage is so good? Nowhere. You're just an autistic moderator, and with your social pedantry you will always be a bane to society.

dbttdft | 2 years ago | on: The Secret Life of XY Monitors (2001)

On second thought, I think the duration of persistence of vision (or afterimages) doesn't matter so much as the velocity of the object your eye is following and the angular "resolution" your eye can perceive. If your gaze moves some distance X while the object is lit, and that distance is more than the angular "resolution" your eye can resolve, it will appear as blur trailing behind the object.
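
To put rough numbers on that (all of these values are assumptions for a sanity check, not measurements), in Python:

    # Does the lit phase smear across more of your vision than the eye can resolve?
    # All values are assumed ballpark figures.
    eye_speed_deg_s = 30.0    # smooth-pursuit speed while tracking the object
    on_time_s = 0.004         # how long the object stays lit per refresh (~4 ms)
    eye_res_deg = 1.0 / 60    # ~1 arcminute, a common figure for acuity

    smear_deg = eye_speed_deg_s * on_time_s   # gaze travel while the object is lit
    print(f"smear {smear_deg:.3f} deg vs resolvable {eye_res_deg:.3f} deg")
    # If the smear exceeds what the eye can resolve, it shows up as a trail of blur.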

You made me think about an interesting point I long avoided going into.

dbttdft | 2 years ago | on: The Secret Life of XY Monitors (2001)

Well, ironically, a CRT has less blur than an LCD in practice, unless you only ever look at still images. Both strain my eyes, but CRTs would have much better focus by now if they had kept being developed. And as I said, and you didn't understand: flicker on a CRT can be reduced by raising the refresh rate to whatever you want, even 200Hz (though 85Hz should be enough) on a cheap 1999 model I have. Meanwhile, most LEDs in light bulbs and in absolutely any appliance are already flickering at 120Hz or lower, so you're basically just picking and choosing which things you think affect you.

dbttdft | 2 years ago | on: The Secret Life of XY Monitors (2001)

Hmm, I hadn't thought this far into it; what matters is probably how long human persistence of vision lasts.

For the LG CX OLED (https://tftcentral.co.uk/reviews/lg_cx_oled), oddly one of the only OLEDs with BFI, we can see they have no problem pulsing the pixels on for only 3-4ms (https://tftcentral.co.uk/images/lg_cx_oled/bfi_120_high.png; 5ms divisions), resulting in the images below. I can't tell how representative of a human these are, because I don't know whether the camera exposure matches human vision, and the camera is probably wobbling at such high speeds, adding more blur than a human would see; that's why these pursuit-camera pictures are always blurry to hell. (IIRC, all those LCDs from the last 6 years with half-working BFI aren't actually blurry; they're just buggily implemented, so the top and/or bottom of the screen shows double images. Each image is crisp, just doubled.)

https://www.tftcentral.co.uk/images/lg_cx_oled/pursuit_120hz...

https://tftcentral.co.uk/images/lg_cx_oled/pursuit_60hz.jpg

If you just take a 240Hz OLED and blank 3 of every 4 frames to get back a 60Hz image, I'd be half surprised if that actually looked blurrier than a CRT.

EDIT: yup, I just tested one of my garbage LCDs, at only 75Hz, with BFI enabled in the monitor menu: the image is perfectly crisp, there are just artifacts everywhere, mainly double images. That wouldn't happen on an OLED with BFI (BFI in the hardware/firmware, obviously, to minimize the on-time beyond just dividing the frame rate).
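
The arithmetic behind all of this is just: smear = pan speed * time the pixel stays lit each frame. A quick sketch (the pan speed is an assumption; the on-times match the cases above):

    # Sample-and-hold blur: smear (px) ~= pan speed (px/s) * on-time (s per frame).
    speed_px_s = 960.0   # assumed pan speed, e.g. 16 px per frame at 60 fps

    cases = {
        "60 Hz, full hold": 1 / 60,                    # lit the whole ~16.7 ms frame
        "LG CX BFI pulse": 0.004,                      # ~4 ms pulse per frame
        "240 Hz, 3 of 4 frames blanked": 1 / 240,      # ~4.2 ms lit per 60 Hz image
    }
    for name, on_time in cases.items():
        print(f"{name}: ~{speed_px_s * on_time:.1f} px of smear")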

dbttdft | 2 years ago | on: The Secret Life of XY Monitors (2001)

What are you even talking about? An OLED running at 100Hz with BFI would already have zero motion blur. I'm not sure whether that's implemented yet, but it's only a matter of time before the shitshow of monitor vendors figures it out.

dbttdft | 2 years ago | on: The internet isn't dying, it's changing

You're both wrong. Cloudflare is breaking the internet by doing interactive verification of users, meaning the web is no longer an open protocol. I'm referring to that "enable javascript" page and the "one more step" page. They require you to have Firefox or Chrome, neither of which is acceptable, and they verify this with scripts and deep packet inspection. Contrast this with old-school communication protocols, which I could implement in an hour; I couldn't implement a Firefox in 100 years. Firefox can't even run on 95% of my machines; it just lags to hell once you open two tabs. And I need as many instances of Firefox as possible across many boxes to avoid being data-mined. Cloudflare is the nail in the coffin for the web; they just haven't advanced that far yet. Just wait until "something happens" and Cloudflare ups their policing.
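
For a sense of scale on "implement in an hour", here's roughly what a complete client for an old protocol looks like, using finger (RFC 1288) as the example. A sketch, not battle-tested code:

    import socket

    # Minimal finger client (RFC 1288): connect to port 79, send the
    # username followed by CRLF, then read until the server closes.
    def finger(user: str, host: str, port: int = 79) -> str:
        with socket.create_connection((host, port), timeout=10) as s:
            s.sendall(user.encode("ascii") + b"\r\n")
            chunks = []
            while True:
                data = s.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks).decode(errors="replace")

Try writing the equivalent for a Cloudflare-fronted site: you can't, because the "protocol" is now whatever the latest Chrome happens to do.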

dbttdft | 2 years ago | on: The Secret Life of XY Monitors (2001)

This discussion made me want to refresh my understanding of current Apple monitors and whether they have anything to offer.

I looked at the glossy and matte versions of the Apple Studio Display at the Apple store. The matte version uses a "nano" coating instead of whatever regular matte LCDs use, and is a $200 add-on despite being inferior.

-1 The reflections on the glossy screen are far worse than what you get with most (all?) CRTs. This is standard with LCDs for some reason; they're just perfect mirrors, reflecting even your T-shirt in a well-lit room.

+1 The pixel density is superb, as expected. The difference from average (~100PPI) monitors is night and day. I don't know why ~100PPI has been the standard for the last 20 years. It's terrible. For some reason there are almost no high-density monitors other than this Apple and some Dells.

-2 Contrary to the IPS hype of the last decade, the viewing angle is terrible. At a wide angle (like if you're just standing anywhere near the desk, rather than lowering your head to where it would be if you were sitting), the image is completely washed out. Sitting in front of a white screen at normal distance (a few feet back), the edges are grey. Just moving your head 10 degrees in any direction causes an abrupt color/luminance shift on any content. This is exactly the same as every IPS monitor from $2 to $3000. Nothing unexpected here, unless you thought the $1500 price would magically fix it.

+1 I correct myself on one point from previous discussions: the glossy screen is not obscured by grids like most other glossy LCDs. Normally, a glossy LCD still has a very fine black grid on top of the image, said to be the transistors blocking a small portion of each pixel. Perhaps the high pixel density here alleviates that issue.

+0.5 The color gamut impresses with high saturation. The accuracy and the real (as opposed to claimed) coverage, however, were not tested. Setting the color gamut to "legacy" modes in the Apple OS menus, like BT.709 (basically sRGB; I didn't see an "sRGB" option), made the image too dark and washed out, worse than what you'd get from a standard sRGB IPS LCD, and it disabled the brightness setting for some made-up reason.

-1 The input lag was abysmal, as bad as a TV's. That could be because the store only had wireless Apple mice, but it could also be the monitor itself, or the OS, or many other things. I tested four Macs, each hooked up to one of these monitors. Of the 75 monitors I have tested in recent years, only a few (such as the Dell 2407WFP, Dell 2408WFP, and NEC LCD2070NX) come anywhere close to being this laggy. The lag is too high even for desktop use. For reference, it's twice as bad as playing a game at 60Hz with vsync. Since the monitor itself has gigabytes of RAM, I lean toward the lag coming from the monitor rather than the wireless mouse.

-1 It only does 60Hz, which is terrible, because that means intense motion blur. Just scrolling in a web browser drops the effective resolution below 640x480 and pushes the blur above even the cheapest, oldest CRT from the 80s.

0 The "nano coating" is worse than average LCD matte coatings. With nano coating, all of the above results were the same, except the image is obscured by what looks like rainbow colored sand, that shifts color when you move your head, just like Dell 2408WFP or Dell 2007FP, from ~2007, or later VA crap from 2012 by BenQ. The grains of sand are finer than those older monitors, but still unignorably visible at normal distance (a few feet back) on a screen of one color or white or some pale color. The glossy version is better (and $200 less).

-1 For a monitor, it has terrible unwanted features: a built-in ultra-high-res webcam, a microphone, and built-in chips allegedly just copied from a phone, reportedly with 64GB of RAM of which "only 2GB is used at any given moment" (whatever that means). An absolute nightmare for security; hopefully you could just mod the monitor to take a signal straight to the panel, without any of the massive, pointless chips and the hidden OS in between. I review based on image quality only, but this is too much of an atrocity to ignore; even paying $1500 for hardly anything is not nearly as bad as this.

Rating: 5/10. OLED is better, and the poor image quality is not worth the pixel density increase; you're basically getting a premium calculator screen. You're better off just getting a 3840x2160 27" OLED, which at ~163PPI is only a small step down from the Apple Studio Display's 218PPI. LCD remains a dead-end tech, no matter how much premium "engineering" you add to it. A high-end CRT is still vastly superior to this, and cost half the price. CRTs are slightly blurry, but they have superior contrast, no viewing-angle issues, no lag, and no motion blur, which makes them sharper any time you view a moving image (yep, I'm aware that you people don't understand this).
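
For reference, the PPI figures are just diagonal pixels over diagonal inches. A quick check in Python, taking the Studio Display's published 5120x2880 resolution:

    from math import hypot

    # PPI = diagonal resolution in pixels / diagonal size in inches.
    def ppi(w_px: int, h_px: int, diag_in: float) -> float:
        return hypot(w_px, h_px) / diag_in

    print(f"{ppi(5120, 2880, 27):.0f} PPI")  # Apple Studio Display 27": ~218
    print(f"{ppi(3840, 2160, 27):.0f} PPI")  # 27" 4K: ~163
    print(f"{ppi(1920, 1080, 22):.0f} PPI")  # a typical ~100PPI-class monitor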

dbttdft | 2 years ago | on: The Secret Life of XY Monitors (2001)

That's because the expensive models, which were also the only ones with a remotely decent picture (maybe scratch that, since the sandy coating is abysmal), like the $1000+ Dell 2408WFP, had miles of input lag, to the point where you couldn't even control your mouse on the desktop. Like most TVs now. All LCDs do have high input lag, though; you're just not sensitive to it. On the tamer LCDs, like any random TN panel from 2005 onward, it's just a minor annoyance, although it does make mouse-based 3D games unplayable.

dbttdft | 2 years ago | on: The Secret Life of XY Monitors (2001)

When 85Hz CRTs were expensive, so were LCDs. CRTs were basically free by the time LCDs became remotely usable, so there was basically no reason to ever buy an LCD until well after 2010. Everyone here who thinks otherwise has some dumb, embarrassing reason, like thinking they could finally get laid if they made their desk less cluttered.

> And the flicker would still be crap even at 85 Hz.

Oh yes, I'm sure, and you are just magically fine with the 100Hz LCD flicker that existed in most of them until 2015 or later, and even then was only fixed on every other model. You clearly know what you're talking about, and aren't just assuming the values that matter align with your beliefs. Listen, you have zero clue what you're talking about. Please at least take the time to understand this next fact:

Practically all LEDs flicker, usually at some absurdly low rate like 100Hz, or even 60Hz in really cheap ones (for mains-powered devices those numbers are just rectified line frequency: twice the 50/60Hz mains rate, or the mains rate itself for half-wave-rectified junk). This is true of 99% of modern vehicle tail lights, light bulbs, shaver LEDs, dishwashers, microwaves, practically all electronics (even an external hard drive bay or a USB stick), laundry machines, and every other appliance. And it annoys me. Why? Because it's at the corner of my eye. Human eyes are highly sensitive to flicker, but only in the periphery, not directly in front.

Which brings me back to CRTs. When I'm using a CRT, I'm looking directly at it, and that makes the flicker imperceptible even as low as 75Hz or so. Only if I turned my head so that the CRT sat at the edge of my vision would it become noticeable. For this reason, outside of this backwards world, one would have non-flickering lights everywhere, and the only thing that flickered would be the monitor (because that is needed to prevent motion blur; the only alternative is a huge refresh rate, 500Hz+, plus software that can keep up). You are implying things work otherwise. They don't.

> Talking about the blurry low resolution. LCD is sharp.

Which part did you not get? I said a panning object on an LCD will not be sharp, or anything remotely close to it; it will be worse than on the worst CRT.

> Not talking about pixel response time.

As I said, LCDs have motion blur, and it has nothing to do with pixel response. We had 200Hz CRTs in 1999. A 144Hz LCD is still badly blurry, and these "gamer" LCDs tend to have weird bugs that add more pixel-response artifacts than there would be if "gamers" hadn't designed the firmware.

> Pixel response time on gaming LCDs is 1ms nowadays.

No, it is not. Why did you even edit that in? Those marketing numbers are simply false; in reality there is a different pixel response for every pixel transition (0-255, 0-40, etc.). And you continue to not know what you're talking about by thinking this would reduce the motion blur. The motion blur on an LCD has absolutely nothing to do with pixel response; it comes from sample-and-hold, where the frame sits on screen for the whole refresh period while your eye keeps moving.
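
Rough numbers, with the pan speed assumed and the pixel response set to a literally instant 0ms:

    # Hold-type blur depends on how long each frame stays on screen,
    # not on how fast the pixels transition (assume 0 ms response here).
    speed_px_s = 960.0   # assumed pan speed, e.g. 16 px per frame at 60 fps
    for hz in (60, 144, 240):
        print(f"{hz} Hz full hold: ~{speed_px_s / hz:.1f} px of smear")

Even 240Hz full hold still leaves several pixels of smear; only shortening the lit time (strobing/BFI) removes it.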

> [i have the glossy apple]

Okay, and the glossy ones have terrible glare. Somehow glossy LCDs always have worse glare than most CRTs, probably because the vendor only cares about the wow factor of what you can achieve by just ripping the anti-glare coating off any $2 monitor. LCDs also have worse image clarity on a glossy screen than a CRT: even a glossy LCD set to show one flat color looks gritty, while a CRT looks clear.

dbttdft | 2 years ago | on: The Secret Life of XY Monitors (2001)

CRT black levels compared to LCD are night and day. Even a game like Genshin Impact (shit game, I know, but it has good graphics) looks unimaginably better on a CRT than on any LCD, due to the high contrast that comes with the low black level. CRTs generally have less reflection than a glossy LCD, thanks to some different kind of coating or glass treatment (it's hard to find info on it). Last summer I used a CRT that is 99% dead (brightness/contrast at 99%) with the windows open every day, letting sun in at a ~90 degree angle, and there was never an issue, nor with cheaper ones.

dbttdft | 2 years ago | on: The Secret Life of XY Monitors (2001)

You don't know what you're talking about.

LCDs were completely unusable until the mid-2000s, and even then they were all 60Hz/75Hz with no black frame insertion, which made any animation whatsoever, even scrolling text, blur to crap. It took me years to figure out that this was why I suddenly couldn't read scrolling text in terminal logs.

Yes, CRTs flicker; it's a feature, it prevents motion blur. You're meant to run them at 85Hz or so, at which point flicker becomes less of an issue. There are two different kinds of flicker: the flicker fusion threshold, where humans still see flicker when looking directly at a flickering light source, and general perceptible flicker, which is easier to see at the corner of your eye and in my experience extends into the hundreds of Hz. The latter is an issue for all CRTs, and for basically all LCDs up until 2015 or so, when vendors started pushing "flicker-free". Before that, basically all LCDs flickered, typically somewhere between 100Hz and 300Hz; it was always visible, and it also left artifacts on the screen during image panning (such as in a 3D game or when scrolling a map).

Yes, low-end CRTs had lots of blur, but by the late 90s any mid-range model had somewhat acceptable blur, and moreover much less blur than you get the moment an image pans across an LCD. As an experiment, simply move an image across a 60Hz LCD at 15 pixels per frame: it will already be blurred to hell, because the full-frame hold smears it across the entire 15 pixels it travels each refresh.

Those Apple monitors have a terrible grainy coating, which some monitors choose to have for some reason. There's no trade-off there, since there are plenty of LCDs without it. They also have bad viewing angles, like all IPS, contrary to the marketing. Their only advantages are high pixel density, and a wide color gamut that may or may not be functional or desirable; that's another can of worms, since a wide gamut on a monitor is not automatically good.
