kuzee | 4 years ago
I think a few companies use Elixir to power their web crawling/scraping tools. This makes intuitive sense as a good candidate for the process supervisor and parallel work architecture OTP encourages.
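To make the "parallel work" point concrete, here's a minimal sketch of supervised, concurrent fetching with `Task.async_stream`. The `Crawler` module, `fetch/1` stub, and URLs are all hypothetical; a real version would swap in an actual HTTP client (e.g. Req or HTTPoison) and likely run the tasks under a `Task.Supervisor`.

```elixir
defmodule Crawler do
  # Hypothetical fetch function: a real implementation would issue an
  # HTTP GET here instead of returning a placeholder body.
  def fetch(url) do
    {:ok, "<html>fetched #{url}</html>"}
  end

  # Fetch a list of URLs concurrently, bounded by max_concurrency.
  # Task.async_stream runs each fetch in its own supervised task process,
  # so one crashed or slow fetch doesn't take down the caller.
  def crawl(urls, concurrency \\ 4) do
    urls
    |> Task.async_stream(&fetch/1, max_concurrency: concurrency, timeout: 10_000)
    |> Enum.map(fn {:ok, result} -> result end)
  end
end

Crawler.crawl(["https://example.com/a", "https://example.com/b"])
```

The nice part is that back-pressure and per-request isolation come from OTP itself, not from the crawling code.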
Nerves (embedded Elixir) has come a long way. I switched to Nerves for some Raspberry Pi projects and the amount of time I waste dealing with hardware/config has dropped to nearly zero. I am a hardware novice and was able to set up over-the-air firmware updates to the Pi with very little effort. I'm sure the companies that use Nerves in production have more to say about it.
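For a sense of what that workflow looks like, here's a rough sketch of a typical Nerves build-and-deploy loop. The target name and hostname are placeholders; exact steps depend on your project setup.

```shell
export MIX_TARGET=rpi4     # placeholder: pick the target for your board
mix deps.get               # fetch deps for that target
mix firmware               # build the firmware image
mix upload nerves.local    # push the new firmware to the device over the network
```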
I'm not very tuned into the updates to Scenic, a project for display/UI on embedded screens, but it looks like they've hit some big release/stability milestones.
Phoenix is the way to go for web interfaces, and it's an excellent toolset, so there hasn't been much demand for alternatives. For more lightweight HTTP, people usually reach for Plug, a key building block of Phoenix, when you don't need the full bird.
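As a quick illustration of the "Plug without the full bird" idea, here's a minimal router. This sketch assumes `plug` and `plug_cowboy` are in your mix deps; the module name and routes are made up.

```elixir
defmodule MyApp.Router do
  use Plug.Router

  # :match finds the route, :dispatch invokes it
  plug :match
  plug :dispatch

  get "/health" do
    send_resp(conn, 200, "ok")
  end

  match _ do
    send_resp(conn, 404, "not found")
  end
end

# Started under a supervision tree with something like:
#   children = [{Plug.Cowboy, scheme: :http, plug: MyApp.Router, port: 4000}]
```

That's the whole app: no controllers, views, or generators, just a router module and an HTTP server.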
prophesi | 4 years ago
What do they use for headless browser scraping? I tried Hound a few months ago, but it seems too geared towards testing to be used more generically. We ended up just using Oclif and Puppeteer for scraping via NodeJS.
vereis | 4 years ago
Otherwise, have you heard of Crawly?
michaelcampbell | 4 years ago
Never heard this phrase before, but I like it.