midgetjones|2 years ago
I spotted that the website served its data to the frontend via an unsecured internal JSON API, so I built an Elixir app that would poll the API endpoint and upsert the cat data into the database. Any new records would get posted to a twitter account (a free way to get notifications on my phone).
It worked beautifully, and when a black cat called "Fluff" popped up, we both knew he was the right one, and we were able to phone them and arrange a meeting before anyone else. Fast forward five years, and he's sitting next to me on the sofa right now, purring away.
fortydegrees|2 years ago
I left my AirPods in a car I rented using Zipcar. I spoke to support etc., but nothing had been handed in. I checked whether the car was still where I'd left it so that I could re-hire it and claim them back, but it had been moved.
The app tells you the 'name' of the car you rented which is used as an identifier. It also shows a map of where all available cars are. I sniffed the requests the app made to display this map, and was able to filter it by the car name. From this I was able to locate where the car I left my airpods in was. Was able to head there, unlock the car, and to my amazement the airpods were still there!
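The core of the trick is just filtering the map feed by the car's name. A minimal sketch, assuming a JSON list of vehicles with made-up field names (`name`, `lat`, `lng`) that are not Zipcar's real API:

```python
# Hypothetical sketch: find one named car in a sniffed vehicle-map feed.
# The field names here are invented for illustration.
import json

def find_car(feed_json: str, car_name: str):
    """Return (lat, lng) of the named car, or None if it isn't listed."""
    vehicles = json.loads(feed_json)
    for v in vehicles:
        if v.get("name") == car_name:
            return (v["lat"], v["lng"])
    return None

feed = '[{"name": "Bessie", "lat": 51.5, "lng": -0.12}, {"name": "Trevor", "lat": 51.4, "lng": -0.1}]'
print(find_car(feed, "Bessie"))  # (51.5, -0.12)
```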
teleforce|2 years ago
If the data is read-only, it's a GOOD thing, especially for non-confidential data that is meant to be public. Every government agency should open up its public data like this.
midgetjones|2 years ago
I quickly noticed that they had employed lazy loading, which would have made that all but impossible. It took me a good few minutes to realise that if they had lazy loading, there had to be a backend, and I was overjoyed when I found out it was serving JSON.
All in all, it was probably much cheaper for them to have me hitting the API endpoint every minute than scraping the website even once a day.
xxriss|2 years ago
Standard story: the website is bad and hard to use, but there's an unofficial JSON feed of the useful data, so I hacked up an alternative view. (They've changed its format slightly once so far.)
This one is for cinema: my local cinema (https://thelight.co.uk/, a 5-minute walk) has a monthly membership with unlimited films, but it's hard to keep track of what's on time-wise, e.g. planning to watch one film as another ends. It's also hard to tell which is the last showing of a film, or even what's on right now.
So: a simple table view, sorted by time.
https://notshi.github.io/dump/ The source is on GitHub, but since it's just a single HTML page with embedded JavaScript, the source is also the page :)
Kinda nice that it is such a simple hack.
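The page itself is JavaScript, but the logic amounts to sorting showings by start time and taking the latest start per film. A rough Python sketch with invented sample data (ISO-8601 start times sort correctly as strings):

```python
# Sketch of the table-view logic: one flat list sorted by time,
# plus the last showing of each film. Sample data is made up.
showings = [
    {"film": "Dune", "start": "2024-05-01T20:30"},
    {"film": "Dune", "start": "2024-05-01T17:00"},
    {"film": "Wicked", "start": "2024-05-01T18:15"},
]

# One table sorted by time makes back-to-back planning easy.
by_time = sorted(showings, key=lambda s: s["start"])

# The last showing of each film is just the max start time per title.
last = {}
for s in showings:
    if s["film"] not in last or s["start"] > last[s["film"]]:
        last[s["film"]] = s["start"]
```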
kej|2 years ago
In the course of that I stumbled on https://ntfy.sh/ which solved the notification problem without needing Twitter, and I've used it since then to let me know when long-running scripts complete.
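Per ntfy's docs, publishing is just an HTTP POST of the message body to `https://ntfy.sh/<topic>`. A small sketch (the topic name is made up; the function builds the request but leaves the actual `urlopen` call to the caller so it can be shown without network access):

```python
# Minimal ntfy.sh push notification. Publishing is a plain POST to
# https://ntfy.sh/<topic>; "my-script-alerts" is a placeholder topic.
import urllib.request

def notify(topic: str, message: str, title: str = "script done") -> urllib.request.Request:
    """Build the ntfy publish request; pass it to urlopen() to send."""
    return urllib.request.Request(
        f"https://ntfy.sh/{topic}",
        data=message.encode("utf-8"),
        headers={"Title": title},
        method="POST",
    )

req = notify("my-script-alerts", "backup finished")
# urllib.request.urlopen(req)  # uncomment to actually send
```

Anyone subscribed to that topic in the ntfy app (or via `curl -s ntfy.sh/my-script-alerts/json`) gets the message on their phone.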
fsniper|2 years ago
I was trying to find a used motorcycle, so I created an in-browser JavaScript app that would go over the listings on a local second-hand site and score them to my liking: decrease for high mileage, increase for a younger bike.
That worked pretty well and found me a great one. Good times.
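The scoring idea sketched in Python (the original was in-browser JavaScript; weights and field names here are invented for illustration):

```python
# Hypothetical scoring function: penalise mileage, reward youth.
def score(listing: dict, current_year: int = 2024) -> float:
    s = 100.0
    s -= listing["mileage_km"] / 1000          # lose a point per 1,000 km
    s -= (current_year - listing["year"]) * 5  # lose 5 points per year of age
    return s

bikes = [
    {"title": "SV650", "year": 2019, "mileage_km": 30000},
    {"title": "MT-07", "year": 2021, "mileage_km": 12000},
]
best = max(bikes, key=score)
print(best["title"])  # MT-07
```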
qup|2 years ago
I did a similar thing like twenty years ago to nab free stuff on the local Craigslist.
midgetjones|2 years ago
The primary reason was to learn Elixir, so this was just a well-timed excuse to explore the language (and Phoenix, the web framework).
The secondary reason was that my wife was the main client, and she doesn't respond well to raw JSON. Each tweet would be just the cat's name, photo, and a link to the website. I also did some filtering, since certain cats have safety requirements we couldn't meet (e.g. no neighbouring cats, no children).
One of the main issues I had to figure out early on was "how do I distinguish which cats are new compared to the previous response?". This was made harder because I couldn't rely on the ordering; occasionally previously-posted cats would have their details updated and move position. Postgres's UPSERT (INSERT ... ON CONFLICT) was new to me at the time, and it seemed like a very handy way to offload the responsibility. There were never more than 50 cats listed at any one time, so it was reasonable to request all the animals at once and let the database figure out which cats were new, based on a combination of identifiers that made them unique. I could also filter the updated records to see _what_ had changed, e.g. that a cat had now been rehomed.
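The upsert idea can be sketched in Python with SQLite's `ON CONFLICT` clause (SQLite ≥ 3.24), standing in for the Postgres + Elixir original; the table and column names are invented:

```python
# Sketch of the upsert: insert new cats, update existing ones, and report
# which were new. The author used Postgres; this uses SQLite for portability.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE cats (
        name   TEXT,
        centre TEXT,
        status TEXT,
        PRIMARY KEY (name, centre)  -- the "combination of identifiers"
    )
""")

def upsert(cat: dict) -> bool:
    """Insert or update one cat; return True if it was new."""
    existing = conn.execute(
        "SELECT 1 FROM cats WHERE name = ? AND centre = ?",
        (cat["name"], cat["centre"]),
    ).fetchone()
    conn.execute(
        """INSERT INTO cats (name, centre, status) VALUES (?, ?, ?)
           ON CONFLICT (name, centre) DO UPDATE SET status = excluded.status""",
        (cat["name"], cat["centre"], cat["status"]),
    )
    return existing is None

print(upsert({"name": "Fluff", "centre": "Leeds", "status": "available"}))  # True
print(upsert({"name": "Fluff", "centre": "Leeds", "status": "rehomed"}))    # False
```

In Postgres proper, the existence check can be folded into the statement itself (e.g. with `RETURNING`), but the check-then-upsert shape above keeps the sketch simple.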
Another thing Elixir did really well was the polling mechanism. It's absolutely trivial to spawn a worker that can repeatedly perform a task and asynchronously hand it off to be processed.
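The worker boils down to "run this task every interval until told to stop". The real thing was an Elixir process; a minimal Python analogue using a background thread:

```python
# Minimal polling worker: run `task` every `interval` seconds until
# the stop event is set. A stand-in for the Elixir worker described above.
import threading

def poll_every(interval: float, task, stop: threading.Event) -> threading.Thread:
    """Run `task` repeatedly in the background until `stop` is set."""
    def loop():
        while not stop.wait(interval):  # wait() returns True once stop is set
            task()
    t = threading.Thread(target=loop, daemon=True)
    t.start()
    return t
```

Using `stop.wait(interval)` as the sleep means the worker shuts down promptly when the event is set, instead of finishing a full sleep first.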
Hope that answers your question!