top | item 32656200

Google Colab Pro is switching to compute credits

93 points | 3wolf | 3 years ago

Just got this in my inbox. They haven't updated the FAQs page yet, as far as I can tell.

> Hi-
>
> We’re improving the Terms of Service that apply to your Colab Pro or Colab Pro+ subscription, making them easier for you to understand and improving the ways you can use Colab. The changes will take effect on September 29.
>
> The [updated Terms of Service](https://research.google.com/colaboratory/tos_v3.html) include changes that will give you more control over how and when you use Colab, and allow us to offer new services and features that will enhance your experience using Colab.
>
> We will increase transparency by granting paid subscribers compute quota via compute units, which will be visible in your Colab notebooks, allowing you to see how much compute quota you have left. These compute units are granted monthly and will expire after 3 months. You will be entitled to a certain number of compute units based on your subscription level and will be able to purchase more compute units as needed.
>
> Additionally, we will allow paid subscribers to exhaust their compute quota at a much higher rate. This will give paid subscribers more flexibility in accessing resources. Read more about these changes at our [FAQ](https://research.google.com/colaboratory/faq.html#compute-units).
>
> If you would like to cancel your Colab Pro or Pro+ subscription, you can do so by going to pay.google.com and clicking Subscriptions and services. If you have any trouble canceling, you can email colab-billing@google.com for assistance. Please include an order number from one of your receipt emails if you email us.
>
> -The Colab team
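The grant-and-expiry scheme the email describes can be modeled in a few lines. The oldest-first spending order below is an assumption, since the announcement doesn't specify the accounting:

```python
from collections import deque

class ComputeBalance:
    """Toy model of monthly compute-unit grants that expire after 3 months.

    Spending oldest grants first is an assumption; the announcement does
    not specify the accounting details.
    """
    EXPIRY_MONTHS = 3

    def __init__(self):
        self._grants = deque()  # (month_granted, units_remaining)

    def grant(self, month: int, units: float) -> None:
        self._grants.append((month, units))

    def spend(self, month: int, units: float) -> float:
        """Spend up to `units`, oldest grants first; returns units actually spent."""
        self._expire(month)
        spent = 0.0
        while self._grants and spent < units:
            m, avail = self._grants[0]
            take = min(avail, units - spent)
            spent += take
            if take == avail:
                self._grants.popleft()
            else:
                self._grants[0] = (m, avail - take)
        return spent

    def balance(self, month: int) -> float:
        self._expire(month)
        return sum(u for _, u in self._grants)

    def _expire(self, month: int) -> None:
        # Drop any grant that is 3 or more months old.
        while self._grants and month - self._grants[0][0] >= self.EXPIRY_MONTHS:
            self._grants.popleft()

b = ComputeBalance()
b.grant(0, 100)
b.grant(1, 100)
print(b.balance(3))  # month-0 grant has expired; prints 100
```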

84 comments


derefr|3 years ago

You can really tell, in the comments sections of changes like these, who is speaking from the perspective of having a professional/business vs. a personal use-case.

Individuals tend to be upset, while professionals are happy that individual free riders will no longer be sucking up undue amounts of compute, so QoS on the system will improve for them.

TigeriusKirk|3 years ago

I'd think anyone trying to run a business off a $10 or $49 tier is the one sucking up undue compute power.

Jugurtha|3 years ago

I've explored Colab users as a target audience for our product, especially given that practically all the posts on the GoogleColab subreddit complain about how bad it is. Even those with Pro+ or Pro tend to revert to the free Colab offer because there's no transparency.

What I understood from my interactions is that they complain but will not use a paid product, because even though they're paying anywhere from nothing to $49, the actual resources used are in the $800/month ballpark (notebooks running 23 hours per day, seven days a week, using a GPU).
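That $800/month ballpark roughly checks out as back-of-envelope arithmetic. The hourly rate below is an assumed on-demand GPU price for illustration, not an actual figure:

```python
# Back-of-envelope cost of running a GPU notebook nearly around the clock.
HOURS_PER_DAY = 23
DAYS_PER_MONTH = 30
GPU_RATE_USD_PER_HOUR = 1.15  # assumed on-demand price for a mid-range GPU

monthly_cost = HOURS_PER_DAY * DAYS_PER_MONTH * GPU_RATE_USD_PER_HOUR
print(f"${monthly_cost:.2f}/month")  # roughly $800/month
```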

These are clearly hobbyists. The pros had different problems such as not being able to pay for it from certain countries.

In other words, there are people who need a notebook to run without crashing and are willing to pay for that, and there are others working on toy projects or individual pet projects with no real stakes who'll complain about it but won't switch, because no other company will really subsidize that usage.

Yes, there are other companies that offer notebooks, but our product was for professionals in the ML field, and there's much more to an ML project than running a notebook (real-time collaborative notebooks, automatic experiment tracking, plugging in compute from any cloud provider, one-click model deployment, object storage like a filesystem, live monitoring dashboards for deployed models, and more).

discordance|3 years ago

Have been experimenting lately with GPUs off vast.ai. Has worked well for experiments with Stable Diffusion and is cheap!

Any other suggestions for where to rent cheap GPUs? I've heard about Hetzner (https://www.hetzner.com/sb?search=gpu), but theirs are 1080s.

frederickgeek8|3 years ago

I tried using the Paperspace Gradient "Growth" plan for teams. The product was so buggy it was unusable. Their support and engineers were wonderful to talk to, but they admitted that there are a lot of features that just don't work and they don't have the bandwidth to fix them. It seems like an early product and I wouldn't recommend it if you need something dependable, at least not now. I would love to work with them in the future if the stability improves.

etaioinshrdlu|3 years ago

Vast.ai told me that, as far as they know, they are usually the cheapest option, but that sometimes Lambda offers something similar or slightly lower.

Aeolun|3 years ago

Am I the only one that thinks it’s nice they’re being explicit about how much they’re giving you? I found the original ‘however much we have available and feel like giving to you’ plan limit highly unprofessional.

I got an A100 after I subscribed, so it worked out for me, but it's still annoying not knowing what you'll get.

cperry|3 years ago

Why thank you!

mark_l_watson|3 years ago

I deeply appreciate Colab. I bought a nice home GPU rig a few years ago, but seldom use it. When I am lightly using Colab I use it for free and when I have more time for personal research the $10/month plan works really well. I can see occasionally paying for the $50/month plan as the need arises in the future.

I am working on an AI book in Python. (I usually write about Lisp languages.) About half the examples will be Colab notebooks and half will be Python examples to be run on laptops.

In any case, I like the soon-to-be-implemented changes; getting credits and seeing a readout of your usage and what you have left sounds like a good idea.

cperry|3 years ago

Thanks! I think people will much prefer this over the current opaque system.

I read every feedback submission in Colab so if you ever have feedback you'd like addressed, send away.

goodfight|3 years ago

Reeling us in with unlimited, then locking it down. Classic.

cperry|3 years ago

PM for Colab - I wrote the email.

No intention to lock it down, whatever that would mean. We ensure notebooks are totally portable to any other Jupyter install you want to move to.

This change is about laying the groundwork for increased transparency for your paid compute consumption, vs. the current model of kind of hiding that away.

xkapastel|3 years ago

It wasn't unlimited though, there was always a quota. It just wasn't visible.

TorqueFilet|3 years ago

Used Google Colab for the last 8 months; will fully divest from them with this change...

geogra4|3 years ago

Yep, similar to what they did with Google photos/gmail

mardifoufs|3 years ago

What does this mean in this context? What's being locked down?

frognumber|3 years ago

I like the transparency, but this doesn't feel like the right way to do it. Computation should be free (or nearly free) if there's idle capacity, paid if Google is near capacity, and expensive/bidding if Google is above capacity.

Flat compute units seem simple, but result in a lot of waste.
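As a sketch of what that tiered model could look like: thresholds and rates below are made up for illustration, not anything Google uses:

```python
def price_per_unit(utilization: float, base_rate: float = 0.10) -> float:
    """Price one compute unit given cluster utilization in [0.0, 1.0].

    Thresholds and multipliers are illustrative only.
    """
    if utilization < 0.7:
        # Idle capacity: free or nearly free.
        return 0.0
    if utilization < 0.95:
        # Near capacity: flat paid rate.
        return base_rate
    # Above capacity: price escalates, approximating a bidding market.
    return base_rate * (1 + 10 * (utilization - 0.95))

print(price_per_unit(0.5))  # 0.0
print(price_per_unit(0.8))  # 0.1
```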

skybrian|3 years ago

There might not be servers going idle. Google has plenty of batch jobs they can run in lower-traffic times, both internal and external.

And if it does go idle, it saves energy, which costs money. At scale, compute isn't free.

moralestapia|3 years ago

>Computation should be free (or nearly free) if there's idle capacity

I guess nothing stops you from buying infra and offering it for "free (or nearly free)".

johndfsgdgdfg|3 years ago

> Computation should be free (or nearly free) if there's idle capacity

HN used to be a place for interesting discussions. Now it's a grievance forum for entitled freeloaders.

knorker|3 years ago

How do you mean? Should GCS storage also be free, unless Google is nearing storage capacity?

Mathnerd314|3 years ago

A computation could use fewer compute units if resources are idle. There isn't enough information to make a judgment.

moconnor|3 years ago

At last! I love Colab but the vague promises around availability and quota made it impossible to recommend for my team to use professionally.

I even tried and failed to get it up and running with a Google Cloud GPU recently, before just switching to Lambda, which worked first time (but has since hit availability issues).

stableskeptic|3 years ago

Question for the Colab team:

The restrictions listed at https://research.google.com/colaboratory/tos_v3.html differ slightly from the limits listed at https://research.google.com/colaboratory/faq.html. Specifically, tos_v3.html does not mention these items from the FAQ:

    * using a remote desktop or SSH
    * connecting to remote proxies
I can appreciate why those were added - I've read posts and notebooks explaining how you can use ngrok or Cloudflare to do those things in violation of the restrictions in the FAQ, and clearly many people aren't using Colab as intended.

Speaking as someone who has been playing around with the Colab free tier with the expectation of moving to a paid service once I know what I'm really doing, I'd like to know if it's likely these restrictions will be eased a bit with the move to a compute credit system.

I'm still learning and haven't had a need to do those things yet but I believe remote ssh access would greatly simplify managing things. The Jupyter interface and integrated Colab debugger are good for experimenting but I'm worried that as I get closer to production I'll need a way to observe and change the state of long-running Colab processes the way I could with ssh, ansible or other existing tooling.

Clearly I can build that myself or use something like Anvil Works https://anvil.works but that's time and effort I'd rather avoid if possible. So I'm hoping that the Colab team will ease the SSH restriction for people like me who want to use it for more traditional ops/monitoring of long running tasks.

Do you anticipate any change or easing of the SSH restriction?

cperry|3 years ago

I do not anticipate changes in the short term, but I am always open to changes in the medium term.

Both of those address angles of abuse that I don't want to discuss in big forums, and they run counter to interactive notebook compute, our top priority.

etaioinshrdlu|3 years ago

Lambda Labs has run out of GPUs to rent lately. I think it’s too many people running SD.

LittlePeter|3 years ago

Barely two weeks after Stable Diffusion release, we use SD as its acronym? That's fast.

roboy|3 years ago

I really like the increase in transparency; I found it somewhat disturbing to pay for what feels like a random amount of stuff. How should I know whether I need Pro or Pro+ if there's no estimate of what either might get me? The update does not seem to change that, though. I would love to see a plotted distribution of how much compute I might expect, or at least min/average/max run time until disconnect (right now only the max is known).

cperry|3 years ago

I aspire to offer that level of transparency. I am foiled by the facts that (1) GPU prices can change randomly on me, and (2) it's hard to convey pricing to a user without giving them a huge, incomprehensible price sheet.

All is not lost though, I've got a few irons in the fire that should help resolve those points of feedback over the coming year.

In the meantime, you can always just buy a GCP VM and have all the certainty you want: https://research.google.com/colaboratory/marketplace.html. I find most people don't want that because it's a pain that Colab Pro/Pro+ largely abstracts away.

visarga|3 years ago

They could benchmark a few architectures (ResNet50, BERT) and tell us how many times we can train a model at a specific subscription level.
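A sketch of what such a benchmark-based estimate could look like. All unit costs and grant sizes below are invented for illustration; Google publishes no such figures:

```python
# Hypothetical benchmarked compute-unit cost per full training run.
BENCHMARK_UNITS_PER_RUN = {
    "ResNet50 (ImageNet, 90 epochs)": 40.0,
    "BERT-base fine-tune (GLUE)": 8.0,
}

def runs_per_subscription(monthly_units: float) -> dict:
    """How many complete runs of each benchmark a monthly grant covers."""
    return {model: int(monthly_units // units)
            for model, units in BENCHMARK_UNITS_PER_RUN.items()}

# e.g. with a hypothetical 100-unit monthly grant:
print(runs_per_subscription(100.0))
```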

sabalaba|3 years ago

For those affected and wanting to run your stable diffusion notebook more, you can always spin up a notebook on Lambda cloud with A100s for only $1.10/hr. PyTorch, TensorFlow, and Jupyter notebooks are pre-installed:

https://lambdalabs.com/service/gpu-cloud

endisneigh|3 years ago

Pro tip: if it costs someone something, it’s not unlimited (this is true even if you’re paying a flat fee).

doesnt_know|3 years ago

Within capitalism, that describes everything that exists. The phrase should effectively be banned in all marketing and service contexts across all industries.

frederickgeek8|3 years ago

I just subscribed to Colab Pro+ hours before this announcement (-‸ლ)

cperry|3 years ago

I'll refund you myself if you hate it. This change will give you more transparency in your paid compute consumption, though won't land for 30 days.

TigeriusKirk|3 years ago

Sigh. Unlikely these changes will be of benefit to us users.

cperry|3 years ago

This increases visibility into your paid compute consumption. Net win for folks. Feature doesn't launch for 30 days, but I can't make any changes like this without changing ToS, which I have to pre-announce.

porker|3 years ago

Good. Hopefully this will reduce the randomness of type-of-GPU assignment on the Pro plan.

I fine-tuned a model on Colab Pro earlier this year and having to launch and quit 6 or 7 times to get a faster graphics card to ensure it completed within the time limit sucked.

Hope this will give more transparency into whether you are assigned a whole card or a virtual slice of one. Something I could never work out before!

cperry|3 years ago

You're always getting the whole card, never a slice fwiw. We haven't found a GPU virtualization solution that has a strong enough security boundary between slices, so we keep you isolated.

And yes, hoping to give you more control over chip type too, stay tuned.
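For what it's worth, you can already see which card a session was assigned using standard NVIDIA tooling from inside a notebook; a small sketch:

```python
import subprocess

def gpu_names() -> list:
    """Return the model names of attached NVIDIA GPUs, or [] if none.

    nvidia-smi ships with the NVIDIA driver and is present in Colab GPU
    runtimes; on a machine without it this simply returns [].
    """
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout
    except (FileNotFoundError, subprocess.CalledProcessError):
        return []
    return [line.strip() for line in out.splitlines() if line.strip()]

print(gpu_names())  # e.g. ['Tesla T4'] on a Colab GPU runtime
```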

rahidz|3 years ago

Right when I started using it for StableDiffusion. Lovely.

theamk|3 years ago

That (people using a lot of resources) is probably why the change was made.

desindol|3 years ago

The restriction was always there; the only thing that's changing is that you can see it now…

LuciferSam86|3 years ago

It would be nice to have a plan tailored for AI image generation that leaves the more powerful GPUs for heavier jobs.

Like, I don't know, $15 per month. Still cheaper than buying a GPU VPS.

Elives|3 years ago

[deleted]

boredemployee|3 years ago

For those looking for good alternatives, I recommend vast.ai