I know this is not what's happening here, but I just love the idea of a MySQL function that spins up a new instance for every connection and promptly throws away the data after executing.
You can somewhat accomplish this with SQLite stored in S3.
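The fetch/transact/push cycle could be sketched like this. This is a hypothetical illustration, not Zappa's actual code: a local "remote store" directory stands in for the S3 bucket so the example is runnable, and the table schema and function names are made up for the demo.

```python
import os
import shutil
import sqlite3
import tempfile

# A local directory stands in for the S3 bucket in this sketch; in the real
# setup the two copy steps would be S3 get_object / put_object calls.
REMOTE = tempfile.mkdtemp()
REMOTE_DB = os.path.join(REMOTE, "app.sqlite3")

def bootstrap_remote_db():
    # Create the "remote" database once, as if it had been uploaded to S3.
    conn = sqlite3.connect(REMOTE_DB)
    conn.execute("CREATE TABLE hits (path TEXT, n INTEGER)")
    conn.commit()
    conn.close()

def handle_request(path):
    # 1. Fetch: copy the DB down (an S3 GET in the real setup).
    local = os.path.join(tempfile.mkdtemp(), "app.sqlite3")
    shutil.copy(REMOTE_DB, local)
    # 2. Transact against the local copy.
    conn = sqlite3.connect(local)
    conn.execute("INSERT INTO hits VALUES (?, 1)", (path,))
    conn.commit()
    count = conn.execute("SELECT COUNT(*) FROM hits").fetchone()[0]
    conn.close()
    # 3. Push: copy the modified DB back (an S3 PUT in the real setup).
    # Last writer wins here, which is exactly the write-conflict hazard
    # with more than one concurrent writer.
    shutil.copy(local, REMOTE_DB)
    return count

bootstrap_remote_db()
print(handle_request("/a"))  # 1
print(handle_request("/b"))  # 2
```

Step 3 is where the pattern breaks down under concurrency: two overlapping requests each fetch the same snapshot, and whichever pushes last silently discards the other's write.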
RNCTX|5 years ago
Zappa (Python) has a deployment configuration that allows this. It's essentially a Lambda that keeps itself alive all the time and, for each request, fetches the SQLite DB from S3, runs its transaction, then puts the modified database back on S3.
The upside is that it's basically free for low-traffic, read-only apps; the downside is the obvious risk of write conflicts if you have more than one write-capable user at any given time.
If you were to use the Django test framework to generate a new SQLite DB on each request, you'd have what you're talking about.
coredog64|5 years ago
I've been contemplating pushing SQLite data files to my Lambda function via a custom layer. Individual executions can update the database within the scope of their execution, but you'd only get a lasting update by pushing a new layer.
It's one fewer network hop compared to DynamoDB, and for something that might get an update once a week, or even once a month, I get low latency without having to subscribe to yet another service.