item 29523094

Advent of Code 2021 in pure TensorFlow – day 1

80 points | me2too | 4 years ago | pgaleone.eu

27 comments

[+] mlajtos|4 years ago|reply
This is a fun idea. With these kinds of coding tasks you won't get any advantage from the differentiable programming paradigm, but it is a nice reminder of how syntactically bad TensorFlow is. Code for any differentiable program should look identical to a non-differentiable program. Maybe a small annotation à la TorchScript [0] can be tolerated, but not reimplementing everything via function calls with overly descriptive names.

Btw, the link to the GitHub repo is broken. Copy-pasting the URL works.

[0] https://pytorch.org/docs/stable/jit_language_reference.html#...
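
For readers unfamiliar with the complaint, a toy eager-mode contrast between the two styles (my own example, not from the article):

```python
import tensorflow as tf

x = tf.constant([1, 2, 3])
y = tf.constant([2, 2, 2])

# Overloaded-operator style: reads like ordinary Python.
a = x > y

# Explicit-call style: the same comparison, spelled out as tf.greater.
b = tf.greater(x, y)

print(a.numpy())  # [False False  True]
print(b.numpy())  # [False False  True]
```

Both produce identical boolean tensors; the difference is purely syntactic.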

[+] tubby12345|4 years ago|reply
TorchScript won't work. Despite claims to the contrary, the only way to use TorchScript effectively is as a tracing aid.
[+] keyle|4 years ago|reply
Ah yes, the enthusiasm of Day 1, "let's write my own stack DSL and do it on there!"

Day 8 "FML!" checks python version installed...

[+] me2too|4 years ago|reply
rofl, high chance I'll end up that way. But right now I'm excited about this approach :D
[+] not2b|4 years ago|reply
The problems get a lot harder so it would be interesting to see if you can get all the way through with this approach.
[+] me2too|4 years ago|reply
So far I've solved 1, 2, and 3. I'm starting on 4 today while I write the article on the solution to puzzle #2.

Let's see if I'm able to face them all (this is also the first year I've joined AoC, so it's totally new to me)

[+] an-allen|4 years ago|reply
Lovely effort. Looks like the approach to the first one is just programmatic, procedural updates to a variable.

Was hoping to see some training of a model to produce outputs. Good effort nonetheless!
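
For reference, a minimal sketch of that procedural style on the day-1 sample data (variable names are my own, not necessarily the article's):

```python
import tensorflow as tf

# AoC 2021 day 1 sample input: count how often the depth increases.
depths = tf.constant([199, 200, 208, 210, 200, 207, 240, 269, 260, 263])

count = tf.Variable(0)

@tf.function
def count_increases(measurements):
    count.assign(0)
    # Autograph turns this Python loop/if into graph control flow.
    for i in tf.range(1, tf.shape(measurements)[0]):
        if measurements[i] > measurements[i - 1]:
            count.assign_add(1)
    return count.value()

print(count_increases(depths).numpy())  # 7
```

The state lives in a `tf.Variable` that gets mutated step by step, exactly the "procedural updates" the parent describes.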

[+] me2too|4 years ago|reply
Author here. The goal of these challenges (and the related articles) is to demonstrate that TensorFlow can be used like any other programming language. I'm totally against using machine learning where it's not required at all, and I hope to solve all the puzzles without relying on deep-learning-based solutions.

If another puzzle contains some optimization problem that can be expressed in a differentiable way, then I'd use ML for sure. But as long as a deterministic solution exists, ML is just a waste (I say this as an ML researcher :) )

[+] NeutralForest|4 years ago|reply
That's pretty funny. AoC is rule-based, so I don't think there will be much "deep" learning going on, but I hope I'll be surprised!
[+] me2too|4 years ago|reply
Author here. The goal of these challenges (and the related articles) is to demonstrate that TensorFlow can be used like any other programming language. I'm totally against using machine learning where it's not required at all, and I hope to solve all the puzzles without relying on deep-learning-based solutions (but who knows, if there's something that can easily be expressed as an optimization problem, having a differentiable language can help a lot).
[+] exdsq|4 years ago|reply
I’d like to read this but the number of ads navigating the blog on mobile is a horrible UX :(
[+] me2too|4 years ago|reply
That's Google AdSense auto ads (ML-based placement of the ads on the page). I've set the level to "low", but I guess the number and positioning of the ads are modulated by Google using the info it has on you. I see only 3 ads, but maybe different users see different numbers.

That's unfortunate

[+] werdnapk|4 years ago|reply
Ads? Using Firefox and uBlock Origin here (also available on mobile) and it's a nice ad-free experience.
[+] brilee|4 years ago|reply
You wrote this...

  All the comparisons like > are better written using their TensorFlow equivalent (e.g. tf.greater). Autograph can convert them (you could write >), but it's less idiomatic, and I recommend not relying on the automatic conversion, to keep full control.
...but I'm not sure you realized that the for loop and the if statement in your code are being transparently compiled to dataset.map() and tf.cond() for you by Autograph :)
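
One quick way to see that transformation is `tf.autograph.to_code`, which prints the graph-compatible source Autograph generates (toy function, not code from the article):

```python
import tensorflow as tf

def step(x):
    if x > 0:  # Autograph rewrites this Python `if` into graph control flow
        x = x + 1
    return x

# Show the converted source that tf.function would actually trace.
print(tf.autograph.to_code(step))

f = tf.function(step)
print(f(tf.constant(3)).numpy())   # 4
print(f(tf.constant(-3)).numpy())  # -3
```

The printed source references Autograph's `ag__` runtime helpers instead of the plain Python `if`, which is the "transparent compilation" being pointed out.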
[+] me2too|4 years ago|reply
Yup! I realized it. I wrote that because in the past there was a huge problem with the operators, as I explained in this talk: https://pgaleone.eu/tf-function-talk/#slide=28

Even if autograph is now able to convert them correctly, I still prefer to have every operator explicitly converted whenever possible. The loop, luckily, never had these transpilation problems

[+] antpls|4 years ago|reply
Good reading! It would be interesting to have other similar challenges, such as Project Euler, solved in idiomatic TensorFlow and PyTorch, and also some examples of more complicated state-of-the-art algorithms, such as sorting/graph/tree algorithms, reimplemented in these frameworks.

It would be a great introduction to these frameworks for people who have never touched anything ML-related, leaving the neural-network content for later in the learning process.

Learning how to create differentiable algorithms and neural networks would be easier once the way those frameworks work is understood (ingesting data, iterating over datasets, running, debugging, profiling, etc.).

If you are starting with neural networks or differentiable programming, learning both the maths and the frameworks at the same time can be quite overwhelming.

[+] 0-_-0|4 years ago|reply
It would have been more TensorFlow-y if you did this with convolutions (1x2 and 1x3)
[+] me2too|4 years ago|reply
Great suggestion! I guess I'll write a follow-up with the convolution-based solution
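
For the curious, a sketch of that convolutional idea on the day-1 sample input (my own code, not the promised follow-up): a width-2 kernel `[-1, 1]` yields the pairwise differences (part 1), and a width-3 kernel of ones yields the sliding-window sums (part 2).

```python
import tensorflow as tf

# AoC 2021 day 1 sample input.
depths = tf.constant(
    [199, 200, 208, 210, 200, 207, 240, 269, 260, 263], tf.float32
)
# conv1d expects [batch, width, channels].
x = tf.reshape(depths, (1, -1, 1))

# Part 1: the [-1, 1] kernel computes pairwise differences;
# counting the positive ones counts the depth increases.
k2 = tf.reshape(tf.constant([-1.0, 1.0]), (2, 1, 1))
diffs = tf.nn.conv1d(x, k2, stride=1, padding="VALID")
part1 = tf.reduce_sum(tf.cast(diffs > 0, tf.int32))

# Part 2: a kernel of three ones computes the sliding-window sums,
# then the same [-1, 1] kernel counts the window-sum increases.
k3 = tf.reshape(tf.ones(3), (3, 1, 1))
sums = tf.nn.conv1d(x, k3, stride=1, padding="VALID")
part2 = tf.reduce_sum(
    tf.cast(tf.nn.conv1d(sums, k2, stride=1, padding="VALID") > 0, tf.int32)
)

print(part1.numpy(), part2.numpy())  # 7 5
```

Both answers match the procedural solution on the sample data, with no explicit loops at all.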
[+] NotEvil|4 years ago|reply
Site is censored in India.
[+] me2too|4 years ago|reply
wtf, a reddit user also contacted me to tell me that. Do you have any idea why, and what I can do to avoid this censorship? The reddit user guessed it's because of the word "leone" in the domain name, but it's part of my surname, I can't change it :<
[+] udbhavs|4 years ago|reply
Loads for me on ACT fiber