top | item 45495136

MaxPock | 4 months ago

This is honestly useful.

"Find me hotels in Cape Town that have a pool by the beach. Should cost between 200 and 800 dollars a night."

zzo38computer | 4 months ago

I would not want to use LLMs for something like that. SQL queries or some other kind of computer code would be better. You would have to read the documentation, but a query can be specified more precisely and more accurately. If you had a local program that could manage these queries (converting them to each remote service's format; a service could provide a file specifying its schema and the estimated cost of different fields) and interact with multiple services (including local files), that would be better: no dependence on OpenAI, no need for as much power as OpenAI uses, and fewer privacy violations than necessary.
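A minimal sketch of what this comment describes, using an in-memory SQLite database as the "local" side; the table layout and hotel rows are invented for illustration, standing in for the schema file a real service would publish:

```python
import sqlite3

# Hypothetical local schema; a real service would publish a schema file
# that the local program translates queries against.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE hotels (
        name TEXT,
        city TEXT,
        has_pool INTEGER,      -- 0/1
        beachfront INTEGER,    -- 0/1
        price_per_night REAL   -- USD
    )
""")
conn.executemany(
    "INSERT INTO hotels VALUES (?, ?, ?, ?, ?)",
    [
        ("Seaside Palms", "Cape Town", 1, 1, 350.0),
        ("City Budget Inn", "Cape Town", 0, 0, 90.0),
        ("Atlantic Grand", "Cape Town", 1, 1, 950.0),
    ],
)

# The natural-language request from the parent comment, stated precisely:
# hotels in Cape Town with a pool, by the beach, $200-$800 a night.
rows = conn.execute(
    """
    SELECT name, price_per_night FROM hotels
    WHERE city = 'Cape Town'
      AND has_pool = 1
      AND beachfront = 1
      AND price_per_night BETWEEN 200 AND 800
    """
).fetchall()
print(rows)  # [('Seaside Palms', 350.0)]
```

Unlike a generated answer, the query's semantics are exact: the price bounds and amenity filters either match or they do not.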

However, it might be useful for people who do want to use that instead.

pphysch | 4 months ago

[injected with guerrilla ads]

I don't see how this is a significant upgrade over the many existing hotel-finder tools. At best it slightly augments them as a first pass, but I would still rather look at an actual map of options than trust a stream of generated, ad-augmented text.

elpakal | 4 months ago

The benefit I see is that it meets users where they presumably already are (GPT). As other comments here allude to, it's clear they see themselves as a staple of the user's online experience.

AlBentley | 4 months ago

Exactly. Booking.com etc. can just use OpenAI APIs to enable a similar voice/chat interface on top of their search, and then the UX is not limited to 'cards'.

The UI 'cards' will naturally keep growing in complexity, and soon you end up back with a full app within ChatGPT, or ChatGPT just becomes an app launcher.

The only advantage I can see is if ChatGPT can use data from other apps/chats in your searches, e.g. "find me hotels in NYC for my upcoming trip" (when it already knows the types of hotels you like, your budget, and your dates).

b_e_n_t_o_n | 4 months ago

I think the end game is that rather than spitting text back, the LLM transforms your plaintext request into something processable, and then chooses some relevant widgets to display the results.
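A minimal sketch of that pipeline, assuming the model emits a structured query (the `HotelQuery` schema and widget names below are invented): the model's only job is producing the structured form, and ordinary code picks the widget deterministically from it.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical structured form an LLM might emit instead of prose.
@dataclass
class HotelQuery:
    city: str
    amenities: list[str] = field(default_factory=list)
    min_price: Optional[float] = None
    max_price: Optional[float] = None

def choose_widget(query: HotelQuery) -> str:
    """Pick a display widget from the structured request, not the raw text."""
    if query.min_price is not None or query.max_price is not None:
        return "map_with_price_filter"  # invented widget names
    return "plain_result_list"

# The Cape Town request from upthread, after the LLM's transformation step.
q = HotelQuery(city="Cape Town", amenities=["pool", "beachfront"],
               min_price=200, max_price=800)
print(choose_widget(q))  # map_with_price_filter
```

The point of the split is that rendering decisions become testable code, while the model is confined to the one step that needs natural-language understanding.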

aryehof | 4 months ago

I think the future is that models will not be able to answer that well, because sites will move to protect their data/content.

Instead, the model will provide you with a list of (in chat) “apps” that can fulfill your request. SEO becomes AISO (AI Search Optimization). Sites can partly expose data to entice you to choose them.