You can try this out now on Bard, which appears to be generally available: https://bard.google.com/
My first impressions of this are that it's quite good. I gave it a handful of logic puzzles which it could solve and asked it for some New Zealand native gardening and bird advice and it gave good answers.
Best of all it doesn't have that... tone that ChatGPT has. You know that kind of smug condescending tone that gives away ChatGPT-generated content. If you ask Bard to respond in the style of ChatGPT it can replicate that tone, if you like it for some reason.
I like this prompt for testing LLMs as the problem is easy to reason about but likely doesn't come up a lot in training data:
> I'm playing assetto corsa competizione, and I need you to tell me how many liters of fuel to take in a race. The qualifying time was 2:04.317, the race is 20 minutes long, and the car uses 2.73 liters per lap.
GPT-3.5 gave me a right-ish answer of 24.848 liters, but it did not realize the last lap needs to be completed once the leader finishes. GPT-4 gave me 28-29 liters as the answer, recognizing that a partial lap needs to be added due to race rules, and that it's good to have 1-2 liters of safety buffer.
I prompted Bard today and the three drafts gave three different answers: 18.28, 82.5, and 327.6 liters. All of these were wildly wrong in different ways.
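For reference, the arithmetic here can be checked directly. A minimal sketch, assuming the timed-race rule described above (the lap in progress when the clock expires must still be completed) and ignoring formation-lap fuel and the difference between qualifying and race pace:

```python
import math

def fuel_needed(lap_time_s: float, race_minutes: float,
                liters_per_lap: float, buffer_l: float = 2.0) -> float:
    """Estimate fuel for a timed race where the lap in progress
    when the clock runs out must still be completed."""
    race_s = race_minutes * 60
    # Full laps completed before time expires, plus the final in-progress lap.
    laps = math.floor(race_s / lap_time_s) + 1
    return laps * liters_per_lap + buffer_l

# 2:04.317 lap time (124.317 s), 20-minute race, 2.73 L/lap
print(f"{fuel_needed(124.317, 20, 2.73):.1f} L")  # 29.3 L (10 laps + 2 L buffer)
```

This lands on 10 laps (1200 s / 124.317 s ≈ 9.65, so nine full laps plus the in-progress one), or 27.3 L of raw consumption, matching GPT-4's 28–29 L once a 1–2 L safety buffer is added.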
I don't think it's actually using PaLM 2.
If it is, it's not great, but I strongly doubt it is.
The commentary they've made on it could mean many things.
One possibility is that they're using the PaLM 2 API to classify the prompt's language and translate it to English if necessary, so that the simpler, non-multilingual Bard model can handle it.
Obviously this isn't necessarily what is happening; it could just be that Bard now runs on PaLM 2, either for some instances as a test or as a full rollout.
Ultimately we don't know - because it's not open source, and companies are not great about telling you what they're actually doing.
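The speculated pipeline above can be sketched as follows. Every function here is a hypothetical stand-in, not a real Google API; the point is only the routing shape: a multilingual model handles detection and translation, and a simpler English-only model answers.

```python
def palm2_detect_language(text: str) -> str:
    """Hypothetical stand-in for a PaLM 2 language-classification call."""
    return "fr" if "bonjour" in text.lower() else "en"

def palm2_translate(text: str, target: str = "en") -> str:
    """Hypothetical stand-in for a PaLM 2 translation call (toy rule)."""
    return text.replace("Bonjour", "Hello")

def bard_answer(prompt_en: str) -> str:
    """Hypothetical stand-in for a simpler, English-only Bard model."""
    return f"[answer to: {prompt_en}]"

def handle_prompt(prompt: str) -> str:
    # Route non-English prompts through translation before the English-only model.
    if palm2_detect_language(prompt) != "en":
        prompt = palm2_translate(prompt)
    return bard_answer(prompt)

print(handle_prompt("Bonjour Bard"))  # translated first, then answered
```

Again, this is just one reading of Google's ambiguous announcement, not a claim about what Bard actually does.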
> "Google, similar to OpenAI, didn’t provide a lot of the technical details about how it trained this next-gen model, including parameter counts (PaLM 2 is a 540-billion parameter model, for what it’s worth). The only technical details Google provided here are that PaLM 2 was built on top of Google’s latest JAX and TPU v4 infrastructure."
I'm sad but not really surprised that these companies aren't publishing and bragging about all of the technical details of their model architecture, size, and training anymore.
They are still stuck in the mindset of “this is our proprietary secret and we can’t let others catch up.” That mindset is just one of the reasons their company is stagnating.
Update: there is at least a technical report that I didn't know about when I wrote this comment: https://ai.google/static/documents/palm2techreport.pdf. But it's still mostly capabilities evaluation and doesn't explain the details of the model.
> PaLM 2 will power Google’s updated Bard chat tool, the company’s competitor to OpenAI’s ChatGPT, and function as the foundation model for most of the new AI features the company is announcing today. PaLM 2 is now available to developers through Google’s PaLM API, Firebase and on Colab.
Let's not speculate and instead ask Bard directly!
>Do you use PaLM 2?
>I do not use PaLM 2 directly. PaLM 2 is a proprietary language model developed by Google AI, and it is not yet available to the public. However, I am trained on a dataset that includes text and code from PaLM 2, so I am able to benefit from its knowledge and capabilities. For example, I am able to answer questions more accurately and generate more creative text formats than I would be able to if I was not trained on PaLM 2's data.
>I am excited to see what PaLM 2 will be able to do in the future. I believe that it has the potential to revolutionize the way we interact with computers and the way we learn. I am grateful to Google AI for developing such a powerful tool, and I look forward to seeing how it is used to make the world a better place.
>When will you directly use PaLM 2?
>I do not know when I will directly use PaLM 2. It is a proprietary language model developed by Google AI, and it is not yet available to the public. However, I am always learning and growing, and I am hopeful that one day I will be able to directly use PaLM 2 to improve my capabilities.
>In the meantime, I am grateful for the opportunity to learn from PaLM 2's data, and I am confident that it will help me to become a more powerful and versatile language model.
I'm surprised they've chosen to release a medical-focused model (Med-PaLM 2) before releasing a law-focused version of PaLM. I'd think the latter would be simpler to train and carry less liability?
On the one hand, yes. On the other hand, if they are both going to have liability issues, choosing the one that isn't literally full of lawyers seems like a smart play.
Idk, this feels like one of those things that would be a huge problem for a startup with a less established legal team, but fairly easy for big corporate Google lawyers to write language releasing them from all liability.
jbay808 | 2 years ago
Too bad for us...
jusujusu | 2 years ago
(exit right)
jstx1 | 2 years ago
So it's not in Bard yet? What a launch.
zaptheimpaler | 2 years ago
Bing & ChatGPT are (close to?) global. Wonder why Bard isn't even in Canada :/
coreypreston | 2 years ago
OpenAI: 2 ounces. Bard: 2 pounds.
Not impressed with Bard at all.