.ai is not one of the ccTLDs that Google considers generic[0].
It would be interesting to know whether Google will make .ai generic, or whether they will make a special exception for themselves, given that they do not allow others to change the geographic targeting of domains registered under a ccTLD.
Isn't the same also true of .io? I'm not sure if they use it anymore; I remember google.io pointing to something, but it doesn't seem to anymore. Really curious to see what the future brings for these domains.
The potential use of TPUs for training is very exciting. They say that they train in floating point, but I don't see any indication of the FP precision they're capable of; perhaps I'm missing it. At any rate, I'm really excited to see resources being piled into their Cloud ML Engine product at this high rate.
I've made this comment in a couple of other threads, which subsequently veered off into other territory, so forgive the repetition, but it's a really interesting topic to me. The open-source distributed tensorflow stuff is pretty nice, but it still requires a huge amount of hand coding and tuning the machinery, reminding me quite a lot of just rolling the damn thing in MPI yourself. I'm very excited to see where distributed tf will be in a year or two, but it's a chore today.
Depending on how much these TPUs and other Cloud ML Engine developments help, I'd gladly abandon the attempt to roll it myself with the distributed tf.
The hope is that using Google's secret sauce to auto-distribute the execution graphs and associated data ingestion makes things "just work". At the moment, the documentation and examples for that are a bit all over the place and require writing models to conform to the newish tf.contrib.learn.Experiment API, which is also a bit underdocumented and underexampled. Using it for very large datasets (say, tens of TB or more) seems pretty challenging at the moment (to me at least). For a lot of use cases, BigTable seems to be the ideal ingestion engine for Cloud ML tf jobs, but there's no native C API. You can use BigTable, but you can only dump complete tables into tensorflow rather than querying for relevant data (since queries cost money, a 5000-core job with even just a few queries per core would cost you a fortune, so the ability to query BigTable in the tensorflow reader is disabled).
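To make the cost concern concrete, here is a back-of-the-envelope sketch. The per-query price and query counts are made up purely for illustration, not actual Cloud Bigtable pricing:

```python
# Why per-query BigTable reads from a large distributed job get expensive:
# the cost multiplies across every core. All numbers below are assumed.

def job_query_cost(cores, queries_per_core, cost_per_query_usd):
    """Total query cost for one run of a distributed training job."""
    return cores * queries_per_core * cost_per_query_usd

# Hypothetical: 5000 cores, 10 queries each over a run, at an
# assumed $0.01 per query.
cost = job_query_cost(cores=5000, queries_per_core=10, cost_per_query_usd=0.01)
print(f"${cost:,.2f}")  # 5000 * 10 * 0.01 = $500.00 per run
```

Even a tiny per-query cost becomes significant once multiplied by thousands of cores, which is presumably why bulk table dumps are the supported path.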
At any rate, I've been banging around on it for a few weeks and am really hopeful. I will follow Cloud ML Engine's career with considerable interest.
I'm not comfortable with Google having and sharing my data.
Very excited about the NVidia chips though. Would be happy to run TensorFlow with them on my own hardware - though I'm more excited about the day when client software and hardware make that easy and cheap.
On one hand, I'm definitely concerned about my privacy and sharing my data.
OTOH, I like to think of Google using my data as a form of a vote. The more data they have on me, and the more they tailor experiences using my usage data, the more useful the product becomes, and it will be designed to reflect that. So while in elections you may only get one vote for your choice of candidate, Google building on my interactions with an app means my voice is taken into consideration.
This is one of the reasons why I tend to share my crash/usage data with developers be it Google, Apple, Microsoft, etc.
Pet peeve: the Google AI effort is the product of ${LARGE_NUMBER} engineers. This marketing page highlights a half-dozen luminaries. Not only do these luminaries get comped ($$) one or more orders of magnitude more than the rank-and-file, but now they get the glory as well. Sigh.
Of course, back in the 1980s all UK domain names were the other way round. UCL was uk.ac.ucl.cs if you used X25 and cs.ucl.ac.uk if you used TCP/IP. UCL was the gateway between the two worlds, and used magic heuristics to figure out which universe to forward email to. For example, if the domain started with "cs" it was a TCP/IP address and if it ended with "cs" it was an X25 address. Which worked well, right up until Czechoslovakia joined the Internet.
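The gateway heuristic described above can be sketched as a toy router. The function and behavior here are illustrative, not UCL's actual gateway code:

```python
# UK X.25 addresses were big-endian (uk.ac.ucl.cs); TCP/IP addresses were
# little-endian (cs.ucl.ac.uk). The gateway guessed the universe from where
# the "cs" label sat.

def route(domain: str) -> str:
    labels = domain.split(".")
    if labels[0] == "cs":    # starts with "cs" -> little-endian -> TCP/IP
        return "tcp/ip"
    if labels[-1] == "cs":   # ends with "cs" -> big-endian -> X.25
        return "x25"
    return "unknown"

print(route("cs.ucl.ac.uk"))  # tcp/ip
print(route("uk.ac.ucl.cs"))  # x25
# The failure mode: .cs was Czechoslovakia's Internet ccTLD, so a TCP/IP
# address ending in .cs gets misrouted to X.25.
print(route("eunet.cs"))      # x25 -- wrong!
```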
"Federated Learning enables mobile phones to collaboratively learn a shared prediction model while keeping all the training data on device."
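The quoted idea can be sketched as a toy round of federated averaging: each device takes a gradient step on its own private data and shares only the updated model weight, which the server averages. This is a minimal plain-Python illustration of the concept, not Google's actual Federated Learning protocol:

```python
# Minimal federated averaging for a one-parameter linear model y ~ w * x.
# The raw (x, y) pairs never leave the device; only weights are shared.

def local_update(w, data, lr=0.1):
    """One gradient-descent step on this device's private (x, y) pairs."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def federated_round(w, devices):
    """Server averages the locally updated weights from all devices."""
    updates = [local_update(w, data) for data in devices]
    return sum(updates) / len(updates)

# Two devices, each holding private data drawn from y = 3x.
devices = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0)]]
w = 0.0
for _ in range(50):
    w = federated_round(w, devices)
print(round(w, 2))  # converges toward 3.0
```

The real system adds compression, secure aggregation, and device scheduling on top, but the core loop is this shape: train locally, upload an update, average.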
Does anyone else find it odd that we're so far through the looking glass this past year that Richard Hendricks' latest venture seems not only plausible but a bit mundane by comparison?
I was sorta hoping that with the announcement of Google.ai they would add the .ai extension to Google Domains. Right now there are very few, and very terrible, registrars that handle .ai. Like 101domain.com, who can only make nameserver changes for you during their 9am to 5pm business hours on weekdays.
Implied here is the emergence of a new business model: develop powerful custom hardware that you do not sell, but only make available as a service in the cloud. This way you get multiple layers of lock-in.
Is it though? Facebook has been pushing hard for open hardware and open datacenter designs. I'm glad they remained an independent company if only to act as a counterweight to Google's dominance in developing customized hardware.
Now, what was that quote again? “We have only bits and pieces of information. But what we know for certain is that at some point in the early 21st century, all of mankind was united in celebration. We marveled at our own magnificence as we gave birth to AI.”
In my experience, Google is getting better at finding general information and worse at finding specific information.
It also tries to "help" too much with fuzzy matching, which starts to make it useless if you are looking for something less common. For example, if you search for "nmake tabs vs spaces" it returns a bunch of results for GNU make and flame wars about tabs vs spaces, instead of nmake-specific info about the usage of tabs and spaces in nmake makefiles.
From my experience, it depends on the type of things you're searching for. If you're doing a search for a local restaurant or something in the news, you don't even realize how good Google's become because it basically gets you exactly what you want in your first result. However, if you're trying to find something more obscure, older, less contextually relevant, or using keywords, Google can get very frustrating because it's trying to contextualize something that shouldn't be contextualized.
Unfortunately this doesn't seem to be for me, even though I'm really interested in AI and I'm currently working on an AI project (look at my profile if you're interested). I wish I could run my own AI algorithms rather than just using theirs. It would probably be cheaper to just buy my own Xeon machines. Training is what takes most of the compute power.
Doesn't appear to be. I'm wondering this myself. For web apps, cloud services are usually the better choice vs. in-house servers. But with ML, the pricing will dictate that more than anything.
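To make the buy-vs-rent question concrete, here is a simple break-even sketch. Every number is an assumption for illustration, not a real quote:

```python
# Rough buy-vs-rent comparison for training hardware: after how many months
# of steady use does owning a machine beat paying a cloud bill?

def breakeven_months(hardware_cost, monthly_power_and_hosting, cloud_monthly):
    """Months of steady use after which owning beats renting."""
    saving_per_month = cloud_monthly - monthly_power_and_hosting
    if saving_per_month <= 0:
        return float("inf")  # the cloud is cheaper indefinitely
    return hardware_cost / saving_per_month

# Hypothetical: a $7,000 GPU/Xeon workstation vs. an assumed $900/month
# cloud bill, with $200/month for power and hosting.
months = breakeven_months(7000, 200, 900)
print(round(months, 1))  # 10.0
```

The crossover is very sensitive to utilization: if the machine sits idle most of the month, the effective cloud bill for equivalent work drops and the break-even point moves out.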
I feel like there have been a series of announcements about AI toolkits and work in the last week. Is there some collaboration, is this a special week?
"We're currently testing Federated Learning in Gboard on Android, the Google Keyboard"
Thank you for reminding me why I don't use Gboard and use the BlackBerry Priv's fine keyboard instead. The obsession with prediction in our culture is absurd.
I might be skeptical, but every single AI experiment or showcase I see either online or on google experiments list is nothing that impressive. The whole AI thing is so overhyped these days...
Is it overhyped? I feel so. Every startup seems to have a machine learning engineering position available, and for what? I have a friend who was hired just to do data analysis (which is necessary for machine learning, since you need a clean dataset that can be consumed for training), but beyond a couple of simple rules, he's not doing the kind of machine learning the cool kids are celebrating. So the hype is everywhere, but everyone's job is different, and a lot of people don't do the "cool" AI stuff.
Most importantly, AI and machine learning are not synonyms at all. People should regard AI as the overall goal: wanting a computer to do something really smart on its own, with very little to no instruction.
Going back 5-7 years to when Siri first came out, it made quite a splash. But I honestly never found a compelling reason to use Siri until I started driving and needed to call somebody. The problem is that I have an accent and a lazy tongue, so I slur words, and Siri does not always understand what I want to say. I am surprised that the voice-to-text feature in Messages is quite accurate (it can auto-correct by learning the next phrase, and it understands pauses so it waits for you to speak again), but Siri isn't. So while I appreciate virtual assistants, their capabilities are limited to a set of commands.
I do feel the AI community has made some good progress: from beating Mario games and top Go players to self-driving cars, the technologies supporting these initiatives are getting more sophisticated than ever (and I feel the tooling is getting very competitive too; too many choices). I am working on some simple home automation involving NLP (for speaking to the program), image recognition (who's in the house), and a couple of self-executing routines, such as making sure all lights are off if no one is in the house, or reminding me of my doctor's appointment every Wednesday. That's not AI; it doesn't do anything beyond what I programmed it to do, and it doesn't try to survive or better itself.
[0]: https://support.google.com/webmasters/answer/62399?hl=en&ref...
kowdermeister | 8 years ago: Is this in the ToS?
tghw | 8 years ago:
https://google.ai/ and https://ai.google/ both point to the same page.
maxmcd | 8 years ago:
One example: limited.international and international.limited are both available
teddyh | 8 years ago:
Google seems to be more and more appropriate for The Matrix quotes: https://news.ycombinator.com/item?id=9780632
theprop | 8 years ago:
So many searches I do are so "SEO'ed", I feel like it's 1999 again.
h4pless | 8 years ago:
https://cloud.google.com/pricing/list