KatrKat's comments

KatrKat | 2 years ago | on: Ask HN: How to deploy LLMs to serve a million users?

I think some domain-specific considerations include:

1. You need a really big in-memory data set that you touch ~all of several times for each request, so you really want to e.g. memory-map it and make sure it actually fits in memory on the machine.

2. If using a GPU, you have to make sure the GPU is hooked up to the serving process. You probably want your processes to be heavier-weight than they otherwise would be.

3. You might want to batch requests from several users for processing in the same stream of commands to the GPU. So you need to collect the right number of requests before processing any of them, without making any request wait too long. You might also need to sort requests by which inference parameters they override and route them to different servers, since only requests with the same parameters can be batched together.

4. You might want to stream the output more or less character by character. Possibly to several users, from one live run on a GPU, after having batched up enough requests to justify a run.

5. Content moderation when you are sending data to the user before you have even seen all of it yourself is an unsolved problem.
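As a rough sketch of point 1, assuming the model weights live in one big flat file (names and sizes here are made up for illustration), memory-mapping might look like:

```python
import mmap
import os
import tempfile

# Hypothetical sketch of point 1: memory-map the weight file read-only so
# the OS page cache holds a single copy that forked serving workers share,
# instead of each worker reading the whole file into its own heap.
def map_weights(path):
    fd = os.open(path, os.O_RDONLY)
    try:
        size = os.fstat(fd).st_size
        return mmap.mmap(fd, size, access=mmap.ACCESS_READ)
    finally:
        os.close(fd)  # the mapping keeps its own reference to the file

# Demo with a tiny stand-in "weight file".
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\x00" * 4096)
    path = f.name

weights = map_weights(path)
mapped_size = len(weights)
print(mapped_size)  # 4096
weights.close()
os.unlink(path)
```

The point of the mapping is that nothing is copied up front: pages fault in on first touch and stay in the page cache, which only works out if the whole thing actually fits in RAM.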
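The batching trade-off in point 3 (fill the batch, but don't make the first request wait forever, and keep parameter overrides homogeneous) could be sketched like this; the `Batcher` class and its parameters are invented for illustration:

```python
import queue
import time
from collections import defaultdict

# Hypothetical sketch of point 3: collect requests until the batch is
# full OR the oldest request has waited too long, then group requests by
# the inference parameters they override so each batch is homogeneous.
class Batcher:
    def __init__(self, max_batch=8, max_wait_s=0.05):
        self.q = queue.Queue()
        self.max_batch = max_batch
        self.max_wait_s = max_wait_s

    def submit(self, request, params):
        # params must be hashable (e.g. a tuple of overrides) to key a group
        self.q.put((request, params))

    def next_batches(self):
        """Block until one request arrives, then drain for up to
        max_wait_s or max_batch items; return one batch per param set."""
        items = [self.q.get()]  # block for the first request
        deadline = time.monotonic() + self.max_wait_s
        while len(items) < self.max_batch:
            remaining = deadline - time.monotonic()
            if remaining <= 0:
                break
            try:
                items.append(self.q.get(timeout=remaining))
            except queue.Empty:
                break
        groups = defaultdict(list)
        for request, params in items:
            groups[params].append(request)
        return dict(groups)

b = Batcher(max_batch=4, max_wait_s=0.01)
b.submit("req1", ("temp=0.7",))
b.submit("req2", ("temp=0.7",))
b.submit("req3", ("temp=1.0",))
batches = b.next_batches()
print(sorted(batches))  # [('temp=0.7',), ('temp=1.0',)]
```

A real server would run `next_batches` in a loop per GPU, but the deadline-vs-batch-size tension is the same.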
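And the fan-out in point 4 (one batched run on the GPU, many users each watching their own stream) might look something like the following, where the step/fragment shapes are assumptions:

```python
import queue

# Hypothetical sketch of point 4: one batched generation loop fans each
# newly produced fragment of text out to a per-user queue, so every user
# in the batch starts seeing output before the run finishes.
def fan_out(step_outputs, user_queues):
    """step_outputs yields {user_id: text_fragment} dicts, one per
    decoding step of the batched run."""
    for step in step_outputs:
        for user_id, fragment in step.items():
            user_queues[user_id].put(fragment)
    for q in user_queues.values():
        q.put(None)  # sentinel: this user's stream is finished

# Simulated batched run producing fragments for two users per step.
steps = [{"a": "Hel", "b": "Wor"}, {"a": "lo", "b": "ld"}]
queues = {"a": queue.Queue(), "b": queue.Queue()}
fan_out(iter(steps), queues)

def drain(q):
    out = []
    while (item := q.get()) is not None:
        out.append(item)
    return "".join(out)

a_text = drain(queues["a"])
b_text = drain(queues["b"])
print(a_text, b_text)  # Hello World
```

In a real deployment the per-user queues would feed SSE or WebSocket connections, and point 5 is exactly the problem that the fragments go out before you've seen the whole completion.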

KatrKat | 3 years ago | on: Tell HN: Do not store any funds in PayPal or use them for anything critical

When PayPal goes to make the withdrawal, the bank doesn't have to honor it and let the account go negative. Usually you can turn off overdraft protection, and withdrawals will bounce if the funds aren't there.

This doesn't stop PayPal from reversing a previous inbound transfer they decide shouldn't have happened, but it ought to stop PayPal from unilaterally deciding they would like some of your money.

KatrKat | 3 years ago | on: Is the FSF Fighting the Previous War?

Not all proprietary software currently insists that you get into a TPM-mediated Dom/sub relationship with its developers. It's currently possible, and might be ethically necessary, not to buy those ones, and instead to buy the other ones.

But it's also probably important to pursue a political avenue as well. The government should absolutely not be using this stuff, and shouldn't be advising citizens to do it to access government services. We could even pass a law requiring purchased hardware and software to meet a fiduciary standard towards its users.

KatrKat | 4 years ago | on: Microsoft no longer signs Windows drivers for Process Hacker

> I don't believe any entirely locked down firmware ever made it into any x86 board.

There are some Android x86 devices that won't boot unsigned firmware and won't let you change the signing keys. But I've only seen that in non-BIOS, non-UEFI devices.

KatrKat | 4 years ago | on: NYT journalist hacked with Pegasus after reporting on previous hacking attempts

iMessage can run over data, right? It sounds like the bugs exploited here were iMessage and WhatsApp holes, not weird mystery-baseband flaws (which are harder to patch but only ever affect a fraction of the phones you want to sell the ability to compromise). So similar Android exploits would just go right through the hotspot and compromise the Android device that does everything.

The only way out of this mess is actually correct code on actually correct hardware. Maybe you have to run Linux and Android at the top to run existing apps, but somewhere below there you need a supervisor that makes security guarantees that are actually true. You can't just port a monolithic C kernel onto hardware that's struggling to be faster than the competition and call it good.

Journalists need to buy communications equipment that doesn't come with that "NO WARRANTY OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE" line in the EULA. Sadly, it is not for sale.

KatrKat | 4 years ago | on: Spy tech that followed kids home for remote learning

Sorry, what I meant to say here I think is that Gaggle isn't being used to enforce a restriction against personal use. It seems like the school district in the article doesn't particularly care whether students are using their laptops or school accounts for personal use or not. They're not surveilling Google chat and e-mail looking for just any personal communications, or excluding academic communications from surveillance.

Look at the "what Gaggle flagged on kids' computers" chart: none of this is "video games" or "wasting time on YouTube" or "visiting sketchy domains that might host ransomware". If that sort of stuff was in "Other", "Other" would be the biggest category. If personal use of the devices or accounts outside the particular areas Gaggle scans for is against policy, the school doesn't seem to be using this tool to try and enforce that policy.

They are instead using the tool to examine everything the students do or store that might be related to these topics, whether it happens during personal or academic use.

KatrKat | 4 years ago | on: Spy tech that followed kids home for remote learning

Sorry, there may be a restriction against personal use; this system is not being used to enforce a restriction against personal use, as described in the article. It's being used to monitor all use, personal or otherwise, for content the school is interested in, no matter in what context it is written. It's not scanning for non-school documents in general, or video games.

The school has a right to impose conditions or monitor, to the same extent that anyone lending someone something has that right. But in this case it is in conflict with the child's rights, in a few ways:

1. The child may not know about the monitoring.

2. The child, being a child, may not actually have a feasible way to use their "own" device. The school-issued device may be the only device they have access to or that they or their family can afford.

3. The school may expect or require them to use the school-issued device in certain situations, and may not appreciate it when the child troops into class with their own personal machine, or tries to take a test from home on an ordinary PC.

4. The child has a right to a good education from the school, to make them into an adult that one is happy to share a society with. A good education should teach a person not to tolerate arbitrary restrictions, conditions, or monitoring without good cause, especially from a governmental agency.

Finally, I believe users of computing devices have the right to be able to rely on those devices as extensions of themselves. Having somebody else all up in your computing experience is a lot like having somebody else inside your head, and in general it shouldn't be allowed.

KatrKat | 4 years ago | on: Spy tech that followed kids home for remote learning

> The point is that Organization A is providing their owned device to Person A, and they have a right to monitor its use.

Why should that right win out in this case over the user's right to fiduciary technology that puts their interests first?

A company car can be worn out by driving it around for frivolous reasons, so the company will impose a condition that you can't do that, and monitor you to make sure you obey the restriction you agreed to.

A school-issued laptop is not going to wear out any faster or slower depending on what you type on it. And there's clearly no restriction here against personal use.

The school might have a legitimate interest in whether the students are leaving their laptops plugged in all night mining Bitcoins and wearing out the battery. They don't have any legitimate interest in the contents of the students' diaries, whether they're written using school-issued tools or not.

Teaching students to expect this sort of treatment from people with power over them is corrosive to society.
