llasram
|
7 years ago
|
on: Adobe Flash’s Gaming Legacy and My Efforts To Save It
Say you somehow have advance knowledge of the burning of the Library of Alexandria. Is it legal to steal as many books as you can from the Library beforehand? Of course not. Is it the only way to save unique volumes from irrevocable destruction when you have no way to convince the librarians of its imminent fate, or even to contact them at all? Yes.
llasram
|
8 years ago
|
on: O'Reilly's Decision and Its DRM Implication
I think the broad market trends are pretty suggestive -- consumers in general don't care about DRM, or we wouldn't see e.g. Amazon's success in the ebook business. For more specifics I think we'd need hyper-detailed data. For example, how many people consider Google's use of the broken ACS4 DRM "close enough" to DRM-free?
llasram
|
8 years ago
|
on: O'Reilly's Decision and Its DRM Implication
I'm just a consumer, but Google Play Books are in fact available for DRM-free download when so requested by the publisher. Or at least that's the reason Google gives when directly providing the book as a DRM-free download.
llasram
|
8 years ago
I agree that innumeracy of various degrees is a widespread problem, but I do think the last example is because the term "false positive rate" sounds to many people like it should mean the false discovery rate. I'm sure there are some people who do have trouble reasoning from the correct definitions, but mis-identifying/remembering the semantics of the values provided leaves no chance for successful calculation.
llasram
|
9 years ago
|
on: Paradoxes of probability and other statistical strangeness
That we're modeling the outcome of a coin toss as a sample space set containing at least two event elements, one of which we call "heads," and that we have a probability measure which assigns 0.5 to the subset consisting of only the "heads" event.
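In symbols, a minimal sketch of that model (my notation, not from the article):

```latex
\Omega = \{\text{heads}, \text{tails}\}, \qquad
P : 2^{\Omega} \to [0, 1], \qquad
P(\{\text{heads}\}) = 0.5
```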
llasram
|
9 years ago
|
on: Mastering Bash and Terminal
Worth it just for finding out about `stty -ixon`. I never would have guessed from the `stty` man page description that this option would give me back C-s and C-q to bind to something actually useful.
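For anyone wanting to try it, a minimal sketch for a bash startup file (the specific rebinding is just an illustration of mine, not from the article):

```shell
# Disable XON/XOFF flow control so the terminal stops intercepting C-s / C-q
stty -ixon

# C-s now reaches readline; e.g. bind it to forward history search in bash
bind '"\C-s": forward-search-history'
```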
llasram
|
9 years ago
|
on: Model-Based Machine Learning
My take from the introduction is that the book is going to mostly be about probabilistic graphical models (PGMs).
I look forward to reading this book when finished and hope they find success with this presentation of the core ideas. As a practitioner I see a fair amount of "I have a hammer; now I just need this problem to be a nail" type thinking with regard to using off-the-shelf techniques.
In the intro to this book the authors have an example with Kalman filters. A similar example is how Latent Dirichlet Allocation (LDA) is treated by different communities. In a certain chunk of the CS-dominated topic-modeling literature and in the data science blogosphere, LDA is this received atomic technique; a black-box tool for modeling documents. In the Stan manual, it is one fairly boring example of a mixture model, only worth talking about explicitly because so many people ask about it.
llasram
|
9 years ago
|
on: “They Went to Sea in a Sieve, They Did”
If you want more detail on this story, I found Peter Nichols' book on the race as a whole, A Voyage for Madmen, quite engaging.
llasram
|
9 years ago
|
on: Stanford Degrees in Statistics
I've been enrolled in an MS Statistics program part-time while working full-time. I'm around half-way done, and by the end it will have taken me three years total taking two classes at a time, although that includes a few extra courses beyond what the program strictly requires.
llasram
|
9 years ago
|
on: Quick, How Might the Alien Spacecraft Work?
I had a pretty different reaction to "Story of Your Life." I can maybe see the description of "nihilist," but only in the sense of "positive nihilism" -- nothing outside of conscious life defines value, so it's up to us to define it and find it ourselves. The fact that life ends doesn't make it less valuable: it just frames the urgency of finding meaning within the small window we have.
I find that this is a theme which runs through a number of Chiang's stories. If you haven't read it, you might find "Exhalation" interesting: http://www.lightspeedmagazine.com/fiction/exhalation/ . You may also find it nihilist, but it makes the positive, celebratory element even more explicit.
llasram
|
9 years ago
|
on: Machine Learning in a Year
That's a really good question. The best people I've seen so far have had a willingness to rigorously align their models with reality, plus the knowledge and/or experience to know what to check. Verifying assumptions, picking evaluation metrics appropriate for the problem, checking for model interpretability, checking that model decisions are sane, and so on.
llasram
|
9 years ago
|
on: Machine Learning in a Year
And in my experience the opposite applies as well. Having a PhD in a field providing the necessary background in theory doesn't necessarily imply the skills and experience required to produce useful models.
llasram
|
9 years ago
|
on: The Sigmoid Function in Logistic Regression
I think this is a good example of the problems with autodidactism. For all its flaws, structured education is what makes common knowledge common. When you study on your own, you don't know what you don't know, and there's no one there to point out your obvious-with-the-right-knowledge lapses.
llasram
|
9 years ago
|
on: A Personal Lisp Crisis (2012)
Which perhaps is evidence for their claim of personality-based self-selection?
llasram
|
10 years ago
|
on: Clojure, the Good Parts
I agree, both in the "mostly agree with" and disagreeing with the list regarding timbre. Adding yet another logging library to the mix doesn't fix the mess of Java logging libraries. Especially given that timbre seems to share the logging philosophy which makes Java logging such a mess in the first place; e.g., bundling an e-mail appender.
I'd personally add the potentially controversial "prefer transducers to lazy sequences." Lazy seq laziness is a big source of errors for newbies, and even for old hands -- since 1.7, Iterable-backed lazy seqs have surprising chunked realization behavior. Transducers take a bit more up-front effort to gain familiarity, but then yield fewer surprises.
llasram
|
10 years ago
|
on: The Doom Movement Bible
llasram
|
10 years ago
|
on: Cyclops: a programming language written in undeciphered Greek runes
Quad and squad. The joys of a character set derived from whatever character overstriking on your original development teletype could produce.
llasram
|
10 years ago
|
on: Comparing a recurrent neural network with a Markov chain
No. Even in toy examples with little internal structure, consider the length of the sequence generated. For both models we can include a synthetic "end of sequence" token which the model can produce when run generatively. In the case of the Markov model the length distribution must be essentially geometric -- at each point that it's possible to end the sequence, the model has some possibility of ending the sequence, conditioned only on the previous n characters and not on any other property. The RNN states can model any arbitrary function, and thus are capable of generating sequences matching any arbitrary length distribution.
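A toy sketch of the Markov-model half of that point (the function and names here are mine, for illustration): with a fixed end-of-sequence probability at each step, generated lengths come out geometric.

```python
import random

def markov_lengths(p_end: float, n: int, seed: int = 0) -> list[int]:
    """Sample n sequence lengths from a memoryless generator that emits an
    end-of-sequence token with probability p_end at every step."""
    rng = random.Random(seed)
    lengths = []
    for _ in range(n):
        length = 0
        while rng.random() >= p_end:  # emit another ordinary token
            length += 1
        lengths.append(length)
    return lengths

# With a constant p_end, the length distribution is geometric:
# mean (1 - p) / p, so about 9 for p = 0.1
lengths = markov_lengths(0.1, 100_000)
mean = sum(lengths) / len(lengths)
```

An RNN, whose hidden state can track something like position in the sequence, is not constrained to this shape of length distribution.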
llasram
|
10 years ago
|
on: C++14: Transducers
Ok, bad example :-) Rust's mutable-iterator model of abstract collections supports this directly, but Clojure's functional-reduction model does not. I'll see if I can think of a better example, but it might be hard -- mutable iteration is pretty flexible (which is probably why it's what most languages do...).
llasram
|
10 years ago
I could have been clearer. By the scare-quoted "collection" I meant a value which implements the relevant collection-like interface: in Clojure this is any object `reduce` works on, in Rust any type implementing the `Iterator` trait, etc. In all the examples discussed these transformations are lazy, only executing the composition of any number of transforms when values are requested.
Clojure transducers are different because a transducer is a function which accepts a reducing function and returns a new reducing function, adding behavior by how the input function is composed into the result function. Because the domain and range of transducers are both reducing functions, they can be chained through function composition. A chain is actually applied by transforming a reducing function then using that function to `reduce` (Rust `fold`) a collection.
The cool part is that the transformations don't refer to collections at all, not even through some highly abstract collection-like interface. This makes them applicable to other domains, like the previously-mentioned channels.
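The shape of this is easy to sketch in Python (a hypothetical mini-implementation of mine, not any particular library): each transducer takes a reducing function and returns a new one, and composition chains them.

```python
from functools import reduce

def mapping(f):
    """Transducer: transform each input before handing it to the reducer."""
    def transducer(rf):
        def new_rf(acc, x):
            return rf(acc, f(x))
        return new_rf
    return transducer

def filtering(pred):
    """Transducer: only pass inputs satisfying pred on to the reducer."""
    def transducer(rf):
        def new_rf(acc, x):
            return rf(acc, x) if pred(x) else acc
        return new_rf
    return transducer

def compose(*ts):
    """Chain transducers; like Clojure's comp, the leftmost sees input first."""
    def composed(rf):
        for t in reversed(ts):
            rf = t(rf)
        return rf
    return composed

# Apply a chain by transforming a reducing function, then reducing with it.
xform = compose(mapping(lambda x: x * x), filtering(lambda x: x % 2 == 0))
result = reduce(xform(lambda acc, x: acc + [x]), range(6), [])
# result == [0, 4, 16]: the even squares of 0..5
```

Note that `mapping`, `filtering`, and `compose` never mention a collection; only the final `reduce` does. The same `xform` could just as well wrap the reducing step of a channel.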