Well, that's more clever than a company I did due diligence for.
Their strategy was to have a pool of API keys attached to new accounts, to take advantage of the Google Maps API free tier, and to monitor their usage. As each key's free-tier quota ran out, the system would automatically roll over to the next key.
Wrote that one up in big red marker in my report...
I did the same years ago. We were providing realtime suburb data for a fleet of trains. Each train reported a GPS coordinate once per minute, and we used that to display the current suburb. That's 1440 updates per day per train; for the whole fleet it was going to be over $100 a day in API costs.
We were going to drop the suburb display because of cost. In the end I found a Creative Commons placename database (geonames.org). For placenames with >500 people it's ~10MB of data, and that covers the entire planet (surprisingly small). I then wrote a KD-tree based library to look up the nearest point in this table extremely efficiently (O(log N) time).
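A minimal sketch of that KD-tree reverse-geocode idea (not the author's library; the three hard-coded places stand in for the geonames dump, and lat/lon is mapped to 3-D unit-sphere points so that Euclidean nearest matches great-circle nearest):

```python
import math

def to_xyz(lat, lon):
    # Map degrees onto the unit sphere so that straight-line (chord) distance
    # orders points the same way as great-circle distance does.
    la, lo = math.radians(lat), math.radians(lon)
    return (math.cos(la) * math.cos(lo), math.cos(la) * math.sin(lo), math.sin(la))

def build(points, depth=0):
    # points: list of ((x, y, z), name) pairs; returns (point, left, right) nodes.
    if not points:
        return None
    axis = depth % 3
    points = sorted(points, key=lambda p: p[0][axis])
    mid = len(points) // 2
    return (points[mid], build(points[:mid], depth + 1), build(points[mid + 1:], depth + 1))

def nearest(node, target, depth=0, best=None):
    # best is (squared_distance, name); prune a subtree only when the splitting
    # plane is farther away than the best match found so far.
    if node is None:
        return best
    (xyz, name), left, right = node
    d2 = sum((a - b) ** 2 for a, b in zip(xyz, target))
    if best is None or d2 < best[0]:
        best = (d2, name)
    axis = depth % 3
    diff = target[axis] - xyz[axis]
    near, far = (left, right) if diff < 0 else (right, left)
    best = nearest(near, target, depth + 1, best)
    if diff ** 2 < best[0]:
        best = nearest(far, target, depth + 1, best)
    return best

# Stand-in for the geonames dump; the real >500-population file is far larger.
places = [(to_xyz(la, lo), name) for la, lo, name in [
    (-37.8136, 144.9631, "Melbourne"),
    (-33.8688, 151.2093, "Sydney"),
    (-27.4698, 153.0251, "Brisbane"),
]]
tree = build(places)
print(nearest(tree, to_xyz(-37.9, 145.0))[1])  # nearest placename to the query
```

With the whole table in memory, each lookup touches only O(log N) nodes, which is why a single small server can absorb the query volume that would otherwise be metered API calls.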
At a previous employer I tried to convince my managers to let me do this for months. They always balked. Their loss.
If you track the vehicles every day and collect the location data, you can easily augment the Open Source Routing Machine to give you traffic-aware estimates [0]. Combined with some Kalman filters, you'd get almost perfect estimates when live.
Of course, this is for a use case where you have similar routes every day, which lets you really tune the Kalman filters.
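For the Kalman part, a minimal scalar filter over one segment's travel time might look like this (a sketch; the process/measurement noise values q and r are illustrative placeholders you would tune from your fleet's data, which is exactly the tuning the parent describes):

```python
class TravelTimeFilter:
    # Scalar Kalman filter: the state is the travel time (seconds) of one
    # road segment, assumed to drift slowly between observations.
    def __init__(self, initial, q=1.0, r=25.0):
        self.x = initial  # current estimate (seconds)
        self.p = r        # estimate variance
        self.q = q        # process noise: how fast the true time drifts
        self.r = r        # measurement noise: timing jitter per observation

    def update(self, measured):
        self.p += self.q                    # predict: uncertainty grows
        k = self.p / (self.p + self.r)      # Kalman gain
        self.x += k * (measured - self.x)   # correct toward the measurement
        self.p *= 1.0 - k
        return self.x

# Feed in per-trip timings as the fleet repeatedly drives the segment.
kf = TravelTimeFilter(initial=600.0)
for seconds in [612, 595, 640, 630, 655]:
    estimate = kf.update(seconds)
```

Each segment gets its own filter; the estimates then replace the segment weights in the routing graph, which is what OSRM's traffic-update mechanism consumes.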
My understanding is that Google does real-time traffic reporting so well because it's constantly pinging the location of Android devices. There have been several write-ups on how to spoof it, but historical data is never going to match a feed like that.
If they have their own fleet they can generate their own (historic) traffic data e.g. via Map Matching and use an open source routing engine like GraphHopper with OpenStreetMap data. (disclaimer: I'm one of the developers of GraphHopper.)
PSA: Open Source Routing Machine (OSRM) was largely abandoned by its maintainers. Several of us are working to reboot it, so if you enjoy map data and/or graph theory and have C++ skills, this would be a great project to work on.
I'm a big fan of OpenTripPlanner [0]. I mainly use it for accurate isochrones of public transit networks (super handy for figuring out where to live) but it has support for foot, bike and car routes as well. There was some PoC code to support traffic data [1] but it was removed [2], I believe because nobody was interested in maintaining it. If someone wants to add it back and maintain it I doubt they'd object.
+1. I've built my own geocoding/reverse-geocoding on OpenStreetMap data. It works 10x faster than Google's, and the server essentially pays for itself while using just 20% of its resources. At our level of usage, we would burn through the free tier in a few hours.
I think real time traffic awareness is exactly what the poster wanted, but yeah it seems like people who don’t need that would be better off using OSS where possible.
(a) No Scraping. Customer will not export, extract, or otherwise scrape Google Maps Content for use outside the Services. For example, Customer will not: (i) pre-fetch, index, store, reshare, or rehost Google Maps Content outside the services; (ii) bulk download Google Maps tiles, Street View images, geocodes, directions, distance matrix results, roads information, places information, elevation values, and time zone details; (iii) copy and save business names, addresses, or user reviews; or (iv) use Google Maps Content with text-to-speech services.
(b) No Caching. Customer will not cache Google Maps Content except as expressly permitted under the Maps Service Specific Terms.
"Customer can temporarily cache latitude (lat) and longitude (lng) values from the Directions API for up to 30 consecutive calendar days, after which Customer must delete the cached latitude and longitude values."
The conclusion I draw from this: don't plan anything around "free" features promoted by SaaS providers, because that can hit you hard in the future. Either plan on something paid and covered by a contract, or come up with another solution: maybe in-house, maybe a free solution with a failover to another provider.
At a cloud storage company I worked for, we kept the most cost-intensive part of our infrastructure, the object storage, ready to be switched over from Azure to AWS, just in case.
Random thought, but I find it funny: about 40 years ago, people would write up their tricks for saving program memory. Now, people write up how to shave the cost of API calls.
Meanwhile, the old maxim "All programming is ultimately an exercise in caching" seems to have been forgotten entirely. This whole thread is just surreal.
It would certainly be interesting if they had compared the quality of Mapbox’s traffic data to Google’s. Perhaps the authors tried this and the data wasn’t as good? But it’s not mentioned.
OpenStreetMap itself is just the base map, not a provider of traffic data or routing.
Since they have actual vehicles on the ground, they could collect data on actual travel times under various traffic conditions and apply a bit of ML to predict how long it's going to take under current conditions. No need for any Google API there.
But now that their Google bill is only ~$50/day, it might not be worth building their own prediction system.
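Even a modest baseline goes a long way before any real ML here: bucket the fleet's observed durations by route, weekday, and hour, and predict the bucket median. A sketch (all names invented; this is a stand-in for the learned model, not anyone's actual system):

```python
from collections import defaultdict
from statistics import median

class TravelTimeTable:
    # Baseline predictor: median observed duration per (route, weekday, hour).
    def __init__(self):
        self.obs = defaultdict(list)

    def record(self, route, weekday, hour, seconds):
        self.obs[(route, weekday, hour)].append(seconds)

    def predict(self, route, weekday, hour):
        bucket = self.obs.get((route, weekday, hour))
        if bucket:
            return median(bucket)
        # No data for this exact bucket: fall back to the route as a whole.
        pool = [s for (r, _, _), xs in self.obs.items() if r == route for s in xs]
        return median(pool) if pool else None
```

Because the fleet runs the same routes daily, each bucket fills quickly, and the median is robust to the occasional breakdown or detour outlier.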
Taking this approach to its logical conclusion, one could sample Google to infer a congestion heat map for all the main routes in an area, and calculate the transit time estimates directly from the inferred heat map.
IME congestion is also often modal; there are probably reductions in sampling effort you could make by noticing patterns in how commuters route, whether school is on break, and what the latest roadworks are.
So could raising prices, under some circumstances, make your customers more creative, more efficient, and more ecologically sustainable by pushing them to consume less of your computing power and bandwidth?
Why not save the route, decode the polyline into its coordinates, and use those to find overlapping road segments?
This would work across all of Google Maps, including bike and walking routes.
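The decoding step is the well-documented encoded-polyline format: each coordinate is a zig-zag-encoded integer delta packed five bits per ASCII character. A minimal decoder:

```python
def decode_polyline(encoded):
    # Decodes Google's encoded polyline format into a list of (lat, lng).
    coords, index, lat, lng = [], 0, 0, 0
    while index < len(encoded):
        for is_lng in (False, True):
            shift = result = 0
            while True:
                b = ord(encoded[index]) - 63
                index += 1
                result |= (b & 0x1F) << shift
                shift += 5
                if b < 0x20:  # continuation bit clear: last chunk of value
                    break
            # Undo zig-zag encoding to recover the signed delta.
            delta = ~(result >> 1) if result & 1 else result >> 1
            if is_lng:
                lng += delta
            else:
                lat += delta
        coords.append((lat / 1e5, lng / 1e5))
    return coords

# Worked example from Google's polyline documentation:
pts = decode_polyline("_p~iF~ps|U_ulLnnqC_mqNvxq`@")
# pts ≈ [(38.5, -120.2), (40.7, -120.95), (43.252, -126.453)]
```

Deltas keep the encoding compact, which is why consecutive points along a road cost only a few characters each.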
That's a pretty cynical interpretation. To me it seems more like they took an easy engineering approach when the cost per API call was low, then, when the price went up, re-designed their software to use API calls more frugally by applying some domain-specific knowledge (which Google couldn't know about, so couldn't build into their pricing).
We also saw an increased Maps bill under the new pricing model, since we loaded a map on every detail page for a real estate listing. The map is now disabled by default: we show a static cached map image, and the interactive map only loads via a "show map" button.
I wonder if they simply tried calling Google and negotiating a volume-based discount. I realize Google is notoriously impersonal, but I always prefer it if you can solve a problem without code.
I worked with a major automotive company that was upgrading its head units to be more user friendly. Their #1 customer request at the time was to have it use Google Maps, so they tried to negotiate a discount with Google. I can't share the discount amount, but it was laughably small and the company would have been shelling out millions of dollars a month to Google for the integration. Needless to say, they stuck with a different mapping provider.
disillusioned | 6 years ago
AnotherGoodName | 6 years ago
I'll admit I haven't updated or maintained it, but the server running it has been chugging along well >5 years later. https://github.com/AReallyGoodName/OfflineReverseGeocode
nerdponx | 6 years ago
eruci | 6 years ago
saila | 6 years ago
https://postgis.net/
BubRoss | 6 years ago
superpermutat0r | 6 years ago
0: https://github.com/Project-OSRM/osrm-backend/wiki/Traffic
cookie_monsta | 6 years ago
wakkaflokka | 6 years ago
karussell | 6 years ago
It is unclear to me whether their current practice is in harmony with the Google Maps TOS. For some places there are also open traffic data sources: https://github.com/graphhopper/open-traffic-collection
pedro_hab | 6 years ago
But I'm not sure if it applies to this use case.
samcheng | 6 years ago
If you don't want features like real-time traffic awareness, it's worth investigating the open source tooling. It can save a LOT of money.
Doctor_Fegg | 6 years ago
https://github.com/Project-OSRM/osrm-backend/
(Reboot discussion at https://github.com/Project-OSRM/osrm-backend/issues/5209)
Youden | 6 years ago
[0]: https://github.com/opentripplanner/OpenTripPlanner
[1]: https://github.com/opentripplanner/OpenTripPlanner/pull/2077
[2]: https://github.com/opentripplanner/OpenTripPlanner/pull/2698
yetihehe | 6 years ago
sudhirj | 6 years ago
superpermutat0r | 6 years ago
https://github.com/Project-OSRM/osrm-backend/wiki/Traffic
It works quite well.
winrid | 6 years ago
kartayyar | 6 years ago
https://cloud.google.com/maps-platform/terms
xtony | 6 years ago
"Customer can temporarily cache latitude (lat) and longitude (lng) values from the Directions API for up to 30 consecutive calendar days, after which Customer must delete the cached latitude and longitude values."
KingOfCoders | 6 years ago
https://www.eventsofa.de/campus/migrating-away-from-google-m...
Shoutout to the Stadia Maps CEO, who was always very, very helpful.
lukeqsee | 6 years ago
I'm cofounder of Stadia Maps, and I'm happy to field any questions folks may have.
DeathArrow | 6 years ago
sm2i | 6 years ago
mkchoi212 | 6 years ago
CamperBob2 | 6 years ago
einpoklum | 6 years ago
However, it is also important to _contribute_ funds to OpenStreetMap, for cityflo and for us.
Donations: https://wiki.osmfoundation.org/wiki/Donate
Individual membership: https://wiki.osmfoundation.org/wiki/Membership
For organizations: https://welcome.openstreetmap.org/how-to-give-back/
The OSM Foundation in general: https://wiki.osmfoundation.org/wiki/Main_Page
chowes | 6 years ago
rkda | 6 years ago
https://www.mapbox.com/navigation/
Reason077 | 6 years ago
graynk | 6 years ago
I still can't believe Google does not have a GUI to both style a map AND add markers to it, but they have GUIs for each of those things separately.
kijin | 6 years ago
DeathArrow | 6 years ago
I used an app that predicts public transport arrival times from historical data, and it's wrong more often than not, sometimes by a lot.
barrkel | 6 years ago
blumomo | 6 years ago
ant6n | 6 years ago
melbourne_mat | 6 years ago
cookie_monsta | 6 years ago
punnerud | 6 years ago
Example here on how to decode: https://github.com/geodav-tech/decode-google-maps-polyline
austurist | 6 years ago
cookie_monsta | 6 years ago
Besides, I suspect that Google knows exactly what their data is worth and keeps a very close eye on the price/demand model.
Just because something is subjectively expensive doesn't make it objectively overpriced.
izacus | 6 years ago
remus | 6 years ago
prepend | 6 years ago
Perhaps they have some margin formula for “buying” data from their internal sources.
It seems to me that they are value pricing, and since they are close to a monopoly on this service, they are testing the price elasticity.
bartkappenburg | 6 years ago
cma | 6 years ago
Enginerrrd | 6 years ago
darkerside | 6 years ago
mdorazio | 6 years ago