Google Colab Pro is switching to compute credits
93 points | 3wolf | 3 years ago
> Hi-
We’re improving the Terms of Service that apply to your Colab Pro or Colab Pro+ subscription, making them easier for you to understand and improving the ways you can use Colab. The changes will take effect on September 29.
The [updated Terms of Service](https://research.google.com/colaboratory/tos_v3.html) include changes that will allow you to have more control over how and when you use Colab, allowing us to offer new services and features that will enhance your experience using Colab.
We will increase transparency by granting paid subscribers compute quota via compute units which will be visible in your Colab notebooks, allowing you to understand how much compute quota you have left. These compute units are granted monthly and will expire after 3 months. You will be entitled to a certain number of compute units based on your subscription level and will have the ability to purchase more compute units as needed.
Additionally, we will allow paid subscribers to exhaust their compute quota at a much higher rate. This will result in paid subscribers having more flexibility in accessing resources. Read more about these changes at our [FAQ](https://research.google.com/colaboratory/faq.html#compute-units).
If you would like to cancel your Colab Pro or Pro+ subscription, you can do that by going to pay.google.com and clicking Subscriptions and services. If you have any trouble canceling, you can email colab-billing@google.com for assistance. Please include an order number from one of your receipt emails if you email us for assistance.
-The Colab team
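The grant-and-expiry rule described in the announcement can be sketched as a small simulation. This is a hypothetical model, not Colab's actual accounting: the grant size and burn rates are made-up illustrative numbers, and it assumes units are spent oldest-first.

```python
from collections import deque

GRANT = 100          # units granted each month (assumed, illustrative)
EXPIRY_MONTHS = 3    # per the announcement: units expire after 3 months

def simulate(months, burn_per_month):
    """Return units left after `months`, spending oldest batches first."""
    balance = deque()  # batches of (expiry_month, units), oldest first
    for month in range(months):
        balance.append((month + EXPIRY_MONTHS, GRANT))
        # Drop batches that have expired by now.
        while balance and balance[0][0] <= month:
            balance.popleft()
        # Spend this month's usage against the oldest batches.
        need = burn_per_month
        while need > 0 and balance:
            exp, units = balance[0]
            used = min(units, need)
            need -= used
            if used == units:
                balance.popleft()
            else:
                balance[0] = (exp, units - used)
    return sum(units for _, units in balance)

print(simulate(6, 40))  # leftover units after 6 months of moderate use
```

One consequence the model makes visible: with a 3-month expiry, an idle subscriber's balance caps out at three grants' worth, no matter how long they keep paying.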
derefr|3 years ago
Individuals tend to be upset, while professionals are happy that individual free-riders will no longer be sucking up undue amounts of compute power, so QoS on the system will improve for them.
TigeriusKirk|3 years ago
Jugurtha|3 years ago
What I understood from my interactions is that they complain but will not use a paid product, because even though they're paying anywhere from nothing to $49, the actual resources used are in the $800/month ballpark (notebooks running 23 hours a day, seven days a week, using a GPU).
These are clearly hobbyists. The pros had different problems such as not being able to pay for it from certain countries.
In other words, there are people who need a notebook to run without crashing and are willing to pay for that, and there are others working on toy projects, individual pet projects, or projects with no real stakes, who'll complain about it but won't switch, because another company will not really subsidize their usage.
Yes, there are other companies that offer notebooks, but our product was for professionals in the ML field, and there's much more to an ML project than running a notebook (real-time collaborative notebooks, automatic experiment tracking, plugging in compute from any cloud provider, one-click model deployment, object storage like a filesystem, a live monitoring dashboard for deployed models, and more).
discordance|3 years ago
Any other suggestions for where to rent cheap GPUs? I've heard about Hetzner (https://www.hetzner.com/sb?search=gpu), but they only offer 1080s.
frederickgeek8|3 years ago
etaioinshrdlu|3 years ago
sabalaba|3 years ago
AdamJacobMuller|3 years ago
zhl146|3 years ago
Aeolun|3 years ago
I got an A100 after I subscribed, so it worked out for me, but it's still annoying that you don't know what you'll get.
cperry|3 years ago
mark_l_watson|3 years ago
I am working on an AI book in Python. (I usually write about Lisp languages.) About half the examples will be Colab notebooks and half will be Python examples to be run on laptops.
In any case, I like the soon-to-be-implemented changes; getting credits and a readout of usage and what you have left sounds like a good idea.
cperry|3 years ago
I read every feedback submission in Colab so if you ever have feedback you'd like addressed, send away.
goodfight|3 years ago
cperry|3 years ago
No intention to lock it down, whatever that would mean. We ensure notebooks are fully portable to any other Jupyter install you want to move to.
This change is about laying the groundwork for increased transparency for your paid compute consumption, vs. the current model of kind of hiding that away.
xkapastel|3 years ago
TorqueFilet|3 years ago
geogra4|3 years ago
mardifoufs|3 years ago
frognumber|3 years ago
Flat compute units seem simple, but result in a lot of waste.
skybrian|3 years ago
And if it does go idle, it saves energy, which costs money. At scale, compute isn't free.
moralestapia|3 years ago
I guess nothing stops you from buying infra and offering it for "free (or nearly free)".
johndfsgdgdfg|3 years ago
HN used to be a place for interesting discussions. Now it's a grievance forum for entitled freeloaders.
knorker|3 years ago
Mathnerd314|3 years ago
fibrennan|3 years ago
Free notebooks can be run for 6 hours at a time.
More info available in docs: https://docs.paperspace.com/gradient/machines/#free-machines...
moconnor|3 years ago
I even tried and failed to get it up and running with a Google Cloud GPU recently, before just switching to Lambda, which worked the first time (but has since hit availability issues).
stableskeptic|3 years ago
The restrictions listed at https://research.google.com/colaboratory/tos_v3.html differ slightly from the limits listed at https://research.google.com/colaboratory/faq.html; specifically, tos_v3.html does not mention these items from the FAQ.
I can appreciate why those were added; I've read posts and notebooks explaining how you can use ngrok or cloudflare to do those things in violation of the restrictions in the FAQ, and clearly many people aren't using Colab as intended.

Speaking as someone who has been playing around with the Colab free tier with the expectation of moving to a paid service once I know what I'm really doing, I'd like to know whether these restrictions are likely to be eased with the move to a compute credit system.
I'm still learning and haven't had a need to do those things yet but I believe remote ssh access would greatly simplify managing things. The Jupyter interface and integrated Colab debugger are good for experimenting but I'm worried that as I get closer to production I'll need a way to observe and change the state of long-running Colab processes the way I could with ssh, ansible or other existing tooling.
Clearly I can build that myself or use something like Anvil Works https://anvil.works but that's time and effort I'd rather avoid if possible. So I'm hoping that the Colab team will ease the SSH restriction for people like me who want to use it for more traditional ops/monitoring of long running tasks.
Do you anticipate any change or easing of the SSH restriction?
cperry|3 years ago
Both of those address angles of abuse that I don't want to discuss in big forums, and they run counter to interactive notebook compute, our top priority.
etaioinshrdlu|3 years ago
LittlePeter|3 years ago
roboy|3 years ago
cperry|3 years ago
All is not lost though, I've got a few irons in the fire that should help resolve those points of feedback over the coming year.
In the meantime, you can always just buy a GCP VM and have all the certainty you want: https://research.google.com/colaboratory/marketplace.html I find most people don't want that because it's a pain that Colab Pro/Pro+ largely abstracts away.
visarga|3 years ago
minimaxir|3 years ago
> This has been planned for months, it's laying the groundwork to give you more transparency in your compute consumption, which is hidden from users today.
https://twitter.com/thechrisperry/status/1564806305893584896
sabalaba|3 years ago
https://lambdalabs.com/service/gpu-cloud
endisneigh|3 years ago
doesnt_know|3 years ago
frederickgeek8|3 years ago
cperry|3 years ago
TigeriusKirk|3 years ago
cperry|3 years ago
porker|3 years ago
I fine-tuned a model on Colab Pro earlier this year and having to launch and quit 6 or 7 times to get a faster graphics card to ensure it completed within the time limit sucked.
Hope this will give more transparency into whether you are assigned a whole card or a virtual slice of one. Something I could never work out before!
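In the meantime, one way to probe whether you've been handed a whole card or a slice of one is to query the driver directly. This is a small sketch, not anything Colab documents: it reads the device name, total memory, and MIG mode via nvidia-smi (present on Colab GPU runtimes), and degrades gracefully on CPU-only machines.

```python
import shutil
import subprocess

def describe_gpu():
    """Return the assigned GPU's name, memory, and MIG mode, or None if no GPU."""
    if shutil.which("nvidia-smi") is None:
        return None  # not a GPU runtime
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=name,memory.total,mig.mode.current",
         "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    return out.stdout.strip()

print(describe_gpu())
```

On a full A100 you'd expect something like `NVIDIA A100-SXM4-40GB, 40960 MiB, Disabled`; a MIG-partitioned card reports `Enabled` and the device name itself would reflect the slice.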
cperry|3 years ago
And yes, hoping to give you more control over chip type too, stay tuned.
rahidz|3 years ago
theamk|3 years ago
desindol|3 years ago
LuciferSam86|3 years ago
Like, I don't know, $15 per month. Still cheaper than buying a GPU VPS.
Elives|3 years ago
[deleted]
imtemplain|3 years ago
[deleted]
boredemployee|3 years ago