
Viv Is a New Artificial Intelligence from the Inventors of Siri

41 points | anigbrowl | 11 years ago | esquire.com

37 comments

[+] _benedict | 11 years ago
All they show is its ability to do some specialized airline searching, which presumably means that is the current limit of its functionality. Potentially useful, for sure, but hardly a "New Artificial Intelligence".

Expanding it to generalized product searching, or finding concepts on the internet, is a whole other ball game that frankly we haven't a clue how to do. A specialized tie-in to a well-understood product space like airline ticketing, with a natural language processing engine attached, is hardly world-changing.

[+] sravfeyn | 11 years ago
Is the possibility of human-level AI a widely accepted conjecture? Why does almost everyone use the term so loosely?
[+] dlss | 11 years ago
1. Brains are physical systems

2. Physical systems can be simulated

3. Brains exhibit human-level intelligence

Therefore a simulated brain would have human-level intelligence, so human-level AI is possible. Hopefully something better than a simulated brain is possible, though :p

[+] TheLoneWolfling | 11 years ago
It's odd. On the one hand, what dlss said. On the other hand, we keep moving the goalposts. AI always seems to be defined as "a computer doing something a human can but a computer cannot", with the obvious recursion.

A better question would be how powerful a human-level AI would actually be. Personally, I'd guess not very; or rather, it'd be fragile as all get-out. But that is another matter...

[+] kitd | 11 years ago
This looks rather similar to IBM Watson, but with the added input of the end-user's preferences.
[+] mark_l_watson | 11 years ago
I have used IBM Watson, and this seems very different (and also at a very early stage of development).

This seems more like Google Now than Siri. My friends with iPhones just ask Siri questions, whereas Google Now also takes actions: creating info cards, adding trips it finds in my email to my calendar, providing traffic warnings, etc.

One problem that does not get enough discussion is false positives when taking actions. I saw on my calendar the other day that I was supposed to be checking into a hotel in some random city that day. It turns out that a customer had sent me an email that included his travel plans. Voila! Google Now 'thought' that I was taking a trip.
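That failure mode is easy to reproduce. A minimal sketch (the pattern, function name, and sample email are all invented for illustration, not Google Now's actual pipeline): an extractor that matches itinerary-like phrases in any email body has no notion of whose reservation it found, so a forwarded customer itinerary gets filed as the user's own trip.

```python
import re

# Hypothetical itinerary detector: flag anything that looks like a hotel
# reservation, regardless of who the reservation belongs to.
HOTEL_PATTERN = re.compile(
    r"check[- ]?in at (?P<hotel>[\w' ]+ Hotel) on (?P<date>\w+ \d{1,2})",
    re.IGNORECASE,
)

def extract_trip(email_body):
    """Return (hotel, date) if the text looks like a hotel reservation, else None."""
    match = HOTEL_PATTERN.search(email_body)
    if match:
        return match.group("hotel"), match.group("date")
    return None

# A customer forwards *their* travel plans. The extractor matches, and an
# assistant acting on the match would add the trip to the user's calendar:
# a false positive, because nothing here checks the reservation's owner.
forwarded = "FYI, I'll check in at the Riverside Hotel on June 12 before our meeting."
print(extract_trip(forwarded))
```

Avoiding this requires more than better pattern matching; the system needs some model of context (who sent the email, whose name is on the booking) before turning a match into an action.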