top | item 35559497

Amazon announces 'Bedrock' AI platform to take on OpenAI

110 points | bundie | 2 years ago | businessinsider.com | reply

47 comments

[+] popcorncowboy|2 years ago|reply
In case it's not obvious, this is what "landing page" customer discovery looks like for $trillion companies who have nothing to show but smoke and a whole lot of stick rubbing. The signup form is one massive "we don't have a clue what we're doing, here are dozens of options, tell us absolutely everything about how you plan to use our hypothetical cough excuse me awesome and totally real bedrock service".

AWS will play hard in this space but as someone else in this thread eloquently put it: this is what the sound of executive butt clenching looks like writ large. Microsoft can only be laughing.

[+] politician|2 years ago|reply
I don't think you're wrong per se, but all of AWS's landing pages look exactly like this sort of enterprise sales "call us for pricing" pitch. Of course AWS will offer a service in this area -- all of the hyperscalers are AI factories now -- but it wouldn't be a surprise to discover that Bedrock is a repackaging of the SageMaker ecosystem.
[+] hooverd|2 years ago|reply
That's every AWS landing page.
[+] nerpderp82|2 years ago|reply
Grasping at straws here!
[+] andrewstuart|2 years ago|reply
I used to be an AWS true believer. Now I find it increasingly difficult to be enthused about anything AWS. It's all so expensive, complex, and locked in. I recently shut down everything I had on AWS except Route 53 and WorkMail, both of which I really like.

The craziest AWS thing is that they moved all GPU instances to a quota system where you have to request access, specifying the number, type, and location of the instances you want to run. It's like Azure, which has an equally terrible quota system.

Anyhow, I needed to run some performance tests on GPU instances. In the past I would have spun up a bunch of different instance types, run my tests, and moved on. In this case I applied for a quota of one machine type, and 24 hours later AWS got back to me to approve my request to run one instance. At that point I gave up on the idea of AWS being the heart of any GPU-based infrastructure.
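(For context, the quota-request step described above goes through the Service Quotas API. A minimal sketch, assuming boto3 is installed and configured; the quota code is a placeholder, not a real value -- look yours up with `list-service-quotas`.)

```python
def quota_increase_kwargs(quota_code: str, desired_vcpus: int) -> dict:
    """Build kwargs for service-quotas request_service_quota_increase."""
    return {
        "ServiceCode": "ec2",
        "QuotaCode": quota_code,               # e.g. the G/VT on-demand instance quota
        "DesiredValue": float(desired_vcpus),  # EC2 instance quotas count vCPUs
    }

# import boto3
# sq = boto3.client("service-quotas")
# sq.request_service_quota_increase(**quota_increase_kwargs("L-XXXXXXXX", 8))
```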
[+] crazygringo|2 years ago|reply
Everything you say seems equally applicable to any cloud provider.

They're all expensive and they all want to lock you in, because they're businesses. GPUs are in extremely high demand right now, so that's just how it is. And they're complex because different customers have different needs, so that's inherent.

I don't see how any of your complaints are anything a for-profit enterprise cloud provider could change and still survive as a business.

[+] cavisne|2 years ago|reply
GPUs in the cloud are just a bad business, unfortunately.

There is only one vendor (for anything scientific/ML, anyway), and you have to buy specific (overpriced) models from them to stay compliant with the license. So you have zero negotiating power, and everything is proprietary, so you can't build any software or support edge.

You can't oversell a GPU the way you can CPU or memory, and most of the computation happens on the chip, so no one cares how good the rest of your stack is.

And finally, you're in a never-ending battle with crypto miners, who would never be profitable on cloud GPUs and so have no intention of paying you.

[+] mochomocha|2 years ago|reply
They behave this way because they're so tight on GPU capacity given the incredible demand. Getting a GPU instance on GCP is just as miserable an experience.

Try Lambda Labs, there's usually more availability, and more importantly you get some visibility into currently available capacity.

[+] dubcanada|2 years ago|reply
They all are. What did you switch to? I suppose that's kind of the point, really: it's a cloud with a bunch of "tools".

You're welcome to install MySQL or Elasticsearch on another VPS outside of AWS and use that. But that's not what customers want.

[+] qbasic_forever|2 years ago|reply
It's not like "Open" AI is any less locked in or opaque though...
[+] SanderNL|2 years ago|reply
We told you this would happen, but no, we were luddites.
[+] metadat|2 years ago|reply
I know Character.ai prefers Oracle Cloud for GPU compute workloads because OCI can actually deliver significant capacity in a reasonable amount of time (often < 24 hours, almost always within a few days for hundreds [or more] of GPU instances, especially if you aren't picky about the region).

I'm all for bagging on Oracle in general, yet it's undeniable this is one area where they're leagues ahead and winning.

[+] ayhoung|2 years ago|reply
Besides GPU instances, have you tried Lightsail? I find it's quite competitive with DO, Linode, etc., with AWS infra backing it.
[+] sovietmudkipz|2 years ago|reply
I’m waiting on the AI that is an expert at all things AWS. I’d like to learn about AWS services and then be able to have it generate SAM or CloudFormation templates.

I realized this is probably (hopefully) being worked on by some Amazon engineers. It became so obvious once I started imagining what AI tools are around the corner.

Honestly, I wouldn’t be surprised if programming languages each had their own AI that could (try to) serve as an expert in the language.
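(To make the idea concrete: the kind of artifact such an assistant would need to emit is a valid CloudFormation template. A minimal illustrative example, built as a Python dict and serialized to JSON, which CloudFormation accepts; all resource names here are hypothetical.)

```python
import json

# A minimal CloudFormation template: one S3 bucket. A template-generating
# assistant would be judged on emitting structures like this correctly.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "Minimal example template (illustrative only)",
    "Resources": {
        "ExampleBucket": {
            "Type": "AWS::S3::Bucket",
        }
    },
}

print(json.dumps(template, indent=2))
```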

[+] bradhilton|2 years ago|reply
I just read somewhere today that their new code completion tool, CodeWhisperer, is better at Amazon APIs than GitHub Copilot (makes sense).
[+] derwiki|2 years ago|reply
Can't ChatGPT [mostly] do this already? It was trained on all AWS documentation up until its training cutoff.
[+] nycdatasci|2 years ago|reply
Bedrock isn't a rival. It's a layer of abstraction on top of other second-tier foundation models that should allow developers to seamlessly switch between models. "Titan" is their horse in the LLM race and it's not scheduled to arrive at the starting gate for several more months.
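(A hypothetical sketch of what "seamlessly switch between models" behind one API could look like. The per-vendor request shapes and model IDs below are assumptions for illustration, not Bedrock's actual contract -- the service was announcement-only at this point.)

```python
import json

def build_request(model_id: str, prompt: str) -> dict:
    """Map one prompt onto assumed per-vendor request bodies."""
    if model_id.startswith("anthropic."):
        body = {"prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
                "max_tokens_to_sample": 256}
    elif model_id.startswith("amazon.titan"):
        body = {"inputText": prompt}
    else:
        raise ValueError(f"unknown model family: {model_id}")
    return {"modelId": model_id, "body": json.dumps(body)}

# import boto3
# runtime = boto3.client("bedrock-runtime")
# resp = runtime.invoke_model(**build_request("amazon.titan-text-express-v1", "Hello"))
```

The point of the abstraction is that only `model_id` changes from the caller's perspective, while the adapter absorbs each vendor's payload format.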
[+] standardly|2 years ago|reply
The timing of this seems more like an attempt to stay relevant rather than some concrete milestone
[+] Kye|2 years ago|reply
I don't see it. They'll sell you GPU time to run and create models and access to data sets to train them on. This looks more like a proof of concept for using AWS to do that and a way to juice the market for it.

It's in their interest to have a competitive offering so they can use it in their own stuff without depending on others. Amazon.com's search is garbage right now, for example. Productizing their efforts gets other people to pay for it.

[+] baq|2 years ago|reply
They’ve seen what Microsoft has in the pipeline and the TAM created in the past couple months out of thin air and I can almost hear butts clenching in their HQ.
[+] jdlyga|2 years ago|reply
This reminds me of when one week, Webcrawler was the go-to search engine. Then it was Ask Jeeves, then Yahoo, then Google. This is one of those times where companies that used to be on top can slip very far behind.
[+] retox|2 years ago|reply
Jumping on the blockchain bandwagon didn't cost new entrants that much, but this AI fever is going to seriously hurt some of these companies financially.
[+] smotched|2 years ago|reply
This doesn't seem like a ChatGPT rival... it doesn't even seem like a GPT rival.
[+] slowmovintarget|2 years ago|reply
Amazon Titan would be the GPT rival, based on what the article claims. Bedrock is the product name for the suite of services that allow you to spin up text-gen, image-gen, image classification, text summary, search, and chat bots.
[+] nickthegreek|2 years ago|reply
It just seems like a framework to use other released models but on AWS hardware.
[+] alberth|2 years ago|reply
At first, I thought AWS was launching their own SQLite hosted database.

BedrockDB is a SQLite-based database with MySQL-compatible drivers.

https://bedrockdb.com

[+] s09dfhks|2 years ago|reply
Much better headline than the other few articles posted about this. Amazon's docs even fail to mention what Bedrock and Titan actually are!
[+] hospitalJail|2 years ago|reply
My buddy, whose livelihood depends on his Amazon stock vesting, has been extremely anti-GPT.

Wonder if it was due to Amazon being slow to release something.

[+] Pr0ject217|2 years ago|reply
I wonder how John Digweed feels about the name. ^_^