alexanderskates's comments

alexanderskates | 4 years ago | on: ArchiveTeam Warrior backing up Reddit

Their video player is also garbage, especially on mobile. I'll have to open and close a video multiple times to get it to play, the quality will take a nosedive midway through and stay that way for the rest of the video, and videos take forever to load.

alexanderskates | 4 years ago | on: Challenges students face when learning to work with relational databases and SQL

I had this same issue in Redshift and ended up populating a numbers table with values 1 to the maximum number of commas found plus one (e.g. using max(regexp_count(...)) or something), then cross joining that table with the CSV column and calling split_part on the corresponding column and index (with the index coming from the numbers table). The cross join ensures that you index every value of the CSV column.
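
The same idea can be sketched in plain Python (in Redshift this would be regexp_count / split_part over a real numbers table; the names and sample data below are illustrative):

```python
# Toy re-creation of the numbers-table cross-join trick:
# cross join each CSV value against indexes 1..max_parts and take the n-th part.

rows = [
    (1, "a,b,c"),
    (2, "x"),
    (3, "p,q"),
]

# Numbers "table": 1 .. max number of commas + 1 (i.e. the max number of parts).
max_parts = max(value.count(",") for _, value in rows) + 1
numbers = list(range(1, max_parts + 1))

# Cross join, then the equivalent of split_part(value, ',', n);
# indexes past the last part of a given row are skipped.
exploded = []
for row_id, value in rows:
    parts = value.split(",")
    for n in numbers:
        if n <= len(parts):
            exploded.append((row_id, n, parts[n - 1]))

print(exploded)
# [(1, 1, 'a'), (1, 2, 'b'), (1, 3, 'c'), (2, 1, 'x'), (3, 1, 'p'), (3, 2, 'q')]
```

In SQL the `if n <= len(parts)` guard becomes a WHERE clause on the index, since split_part returns an empty string past the last delimiter.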

alexanderskates | 4 years ago | on: Launch HN: BlackOakTV (YC S21) – Netflix for black people

What does "X for white people" look like though? The only reason that "X for black people" exists is that black people are a minority group and aren't sufficiently catered to by X, which is already more or less "X for white people" by default, at least in the US and much of Europe. As such, any product that markets itself explicitly to white people (again, only referencing the US and Europe here) is much more likely to have less socially acceptable intent behind it.

"X for white people" makes more sense in a population where caucasians are the minority, for instance in China.

alexanderskates | 5 years ago | on: I wasted $40k on a fantastic startup idea

At least in the UK, sandwiches are about the most pharma reps can bribe doctors with... Which is not to say they aren't effective (you'll get butts in chairs at least, though there's no guarantee anyone will pay attention to you).

alexanderskates | 5 years ago | on: Yann LeCun on GPT-3

I think an important distinction to make concerns your use of the word "language": how we think of language as it applies to human minds versus as it applies to GPT-3.

In our heads, language is a combination of words and concepts, and knowledge can be encoded by making connections between concepts, not simply words. If there is no concept or idea backing up the words, it can hardly be called knowledge. Consider the case of Nigel Richards, who did not speak French, yet memorised a French dictionary and subsequently went on to win the French-language Scrabble championship. Just because he knows the words, would you say he knows the language?

A language model such as GPT-3 operates only on words, not concepts. It can make connections between words on the basis of statistical correlations, but has no capacity for encoding concepts, and therefore cannot "know" anything.
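
The "statistical correlations between words" point can be made concrete with a toy bigram model (GPT-3 is vastly larger and more sophisticated, but this is the same in kind: word statistics with no representation of the concepts behind them; the corpus here is invented):

```python
# Toy bigram "language model": links words purely by co-occurrence counts.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def most_likely_next(word):
    """Predict the statistically most frequent follower of `word`."""
    return bigrams[word].most_common(1)[0][0]

print(most_likely_next("the"))  # cat -- chosen purely by frequency
```

The model "knows" that "cat" tends to follow "the" without anything resembling a concept of a cat.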

alexanderskates | 5 years ago | on: Show HN: Handwritten.js – Convert typed text to realistic handwriting

The "handwriting" data for this model is basically the coordinates of a pen. The length of the string representation of the text is very different from the length of the coordinate representation of the text, therefore the model "learns" a window corresponding to when it is drawing the current letter, and when to start the next letter. For these letters, as the model doesn't learn how long this window should be, nor how to transition from it to the next letter, it gets stuck and outputs nonsense.

alexanderskates | 5 years ago | on: Why did the A-level algorithm say no?

The problem is that the prior appears to be placed over the school, rather than the individual -- i.e. if your school had a low proportion of high achievers in previous years, students found their grades marked down almost regardless of their own performance. This produces a particularly large grade disparity between independent and state schools. So it is not so much the case that good students get good marks and bad students get bad marks, but rather that good schools get good marks and "bad" schools get bad marks.
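
A toy illustration of the school-level-prior effect (this is not the actual Ofqual algorithm; the blend function, weights, and grade scale are invented for illustration):

```python
# Toy model: when the prior sits on the school's historical distribution
# rather than the individual, a strong student at a historically
# low-achieving school gets pulled down.

def blend(teacher_grade, school_prior_mean, weight_on_prior):
    """Blend a teacher-assessed grade with the school's historical mean.

    weight_on_prior = 0 trusts the student; 1 trusts the school's history.
    Grades are on an illustrative numeric scale (e.g. A* = 6 ... U = 0).
    """
    return (1 - weight_on_prior) * teacher_grade + weight_on_prior * school_prior_mean

# The same strong student (teacher grade 6, i.e. A*) at two schools:
good_school = blend(teacher_grade=6, school_prior_mean=5.5, weight_on_prior=0.7)
weak_school = blend(teacher_grade=6, school_prior_mean=3.0, weight_on_prior=0.7)

print(round(good_school, 2))  # 5.65 -- barely moved
print(round(weak_school, 2))  # 3.9  -- same student, marked well down
```

Identical individual performance, very different outcomes, driven entirely by where the prior is placed.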