tl;dr: Google provides a Jupyter/IPython research environment, which they call "Colaboratory", with some integration with their existing services. They have now started providing free access to GPU resources from within this environment. Details for using it with TensorFlow here: https://colab.research.google.com/notebook#fileId=/v2/extern...
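For anyone trying it out: before training anything, it's worth confirming the runtime actually has a GPU attached (you have to enable it under Runtime > Change runtime type). A quick sketch using the standard TensorFlow API:

```python
import tensorflow as tf

# Returns the device string (e.g. '/device:GPU:0') if a GPU is
# visible to TensorFlow, or an empty string otherwise.
device_name = tf.test.gpu_device_name()
if device_name:
    print("GPU found:", device_name)
else:
    print("No GPU attached to this runtime")
```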
Does anybody know how to upload large data files into these environments? Possibly from Google Drive? I spent some time on it, and the example loading techniques all run into download limits or I/O failures. Has anybody been successful at loading large data files?
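One route worth trying is mounting Drive as a local path and streaming the file across in chunks rather than reading it into memory at once. A sketch, assuming you're inside a Colab runtime (the `google.colab` package only exists there) and with `big.csv` standing in for whatever your data file is:

```python
import shutil

def copy_in_chunks(src, dst, chunk_bytes=1 << 20):
    """Stream src to dst in 1 MiB chunks so a large file
    never has to fit in memory all at once."""
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        shutil.copyfileobj(fin, fout, length=chunk_bytes)

try:
    # Only available inside a Colab runtime; prompts for OAuth there.
    from google.colab import drive
    drive.mount("/content/drive")
    # 'big.csv' is a hypothetical file name -- substitute your own.
    copy_in_chunks("/content/drive/My Drive/big.csv", "/content/big.csv")
except ImportError:
    pass  # not running in Colab
```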
Is it possible to start a training run, log out for a month or two, and come back once training is complete? Or does it only allow a single browser session?
There's a 1-hour timeout on runs. I'm assuming that means from one command execution to the next, not per batch, but shrug. Once it times out, it looks like they save state and you can reconnect, but I haven't checked very closely. Same for saved data, but I assume there's some way to push it into Drive, which is the storage backend as far as I can tell.
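If the runtime really does recycle after an hour, the usual workaround is to checkpoint often enough that reconnecting loses little work. A generic sketch using pickle — the path and the fake training step are placeholders; on Colab you'd point `CKPT` at a mounted Drive folder:

```python
import os
import pickle

# On Colab, something like "/content/drive/My Drive/checkpoint.pkl"
CKPT = "checkpoint.pkl"

def load_state():
    """Resume from the last checkpoint if one exists."""
    if os.path.exists(CKPT):
        with open(CKPT, "rb") as f:
            return pickle.load(f)
    return {"step": 0, "loss": None}

def save_state(state):
    # Write to a temp file first and rename, so a timeout
    # mid-write can't corrupt the previous checkpoint.
    tmp = CKPT + ".tmp"
    with open(tmp, "wb") as f:
        pickle.dump(state, f)
    os.replace(tmp, CKPT)

state = load_state()
for step in range(state["step"], 100):
    state["loss"] = 1.0 / (step + 1)  # stand-in for a real train step
    state["step"] = step + 1
    if state["step"] % 10 == 0:       # checkpoint every 10 steps
        save_state(state)
```

After a disconnect, re-running the same cell resumes from the last saved step instead of starting over.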
It's previous-gen hardware, so presumably the only cost to Google is energy usage and the like. They probably use the newer P100 and V100 cards internally, which are significantly faster for deep learning.
cstpdk | 8 years ago
joe_the_user | 8 years ago
forestgreen | 8 years ago
drewbuschhorn | 8 years ago
I just copy-pasted a Keras MNIST demo and it seemed to work like a charm.
https://colab.research.google.com/notebook#fileId=1tD_viugd-...
(I think you need a Google account, fwiw)
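For anyone who wants the gist without following the link, a minimal Keras MNIST-style model looks roughly like this. A sketch, not the linked notebook's exact code — here it trains on a handful of random 28×28 "images" so it runs anywhere; swap in `tf.keras.datasets.mnist.load_data()` for the real dataset:

```python
import numpy as np
import tensorflow as tf

# Tiny random stand-in for MNIST so the snippet runs without a download.
x_train = np.random.rand(64, 28, 28).astype("float32")
y_train = np.random.randint(0, 10, size=64)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, verbose=0)

probs = model.predict(x_train, verbose=0)  # one softmax row per image
```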
bitL | 8 years ago
drewbuschhorn | 8 years ago
chickenthief | 8 years ago
minimaxir | 8 years ago
Robadob | 8 years ago
puzzle | 8 years ago
fwdpropaganda | 8 years ago
unknown | 8 years ago
[deleted]
ct520 | 8 years ago