spamalot159 | 4 years ago

Very cool! I have run into the issue of deploying models recently. I see that this is using AWS lambda functions. How does this hold up with larger models (3-5gb)?

Trying to deploy a model to Azure Functions didn't play nice because of my file sizes.

sameerank | 4 years ago

Thanks! AWS Lambda supports Docker image sizes up to 10 GB (according to their docs), so on the back end, 3-5 GB could still work.

Convect, in its current state, is still limited to small models (e.g., < 0.5 MB). This is because deployment happens by POSTing the model to a REST API, which hasn't yet been tested with large payloads.
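For what it's worth, a deployment flow like that can be sketched roughly as below. This is only an illustration of the general pattern (serialize, check size, POST) — the size check, function names, and the example endpoint are my assumptions, not Convect's actual API:

```python
import io
import pickle

def serialize_model(model) -> bytes:
    """Pickle a model into an in-memory buffer for upload."""
    buf = io.BytesIO()
    pickle.dump(model, buf)
    return buf.getvalue()

def check_payload_size(payload: bytes, limit_bytes: int = 500_000) -> bool:
    """Guard against exceeding a ~0.5 MB payload limit before POSTing."""
    return len(payload) <= limit_bytes

# Example with a trivial stand-in "model":
model = {"weights": [0.1, 0.2, 0.3]}
payload = serialize_model(model)
assert check_payload_size(payload)  # small enough to upload

# The actual upload would then be a single HTTP POST, e.g. with `requests`
# (hypothetical URL):
# requests.post("https://api.example.com/deploy", files={"model": payload})
```

The size guard is the key constraint here: a single REST POST works fine for sub-megabyte payloads, but multi-gigabyte models would need chunked or multipart uploads.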

I wrote up some tips for decoupling a large model from the data that's causing it to be large (https://convect.readme.io/docs/news-topic-model). However, it sounds like you're asking about a truly large model. Convect is still in the MVP stage right now, but we plan to handle larger models. 3-5 GB models will likely be feasible in the future.
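The "decoupling" idea can be shown with a minimal sketch: instead of pickling an object that still holds a reference to its training corpus, persist only the small arrays needed at inference time. The class and field names here are invented for illustration:

```python
import pickle

class TopicModel:
    """Toy model that (wastefully) keeps its training corpus around."""

    def __init__(self, corpus):
        self.corpus = corpus  # large: the raw training documents
        self.vocab = sorted({w for doc in corpus for w in doc.split()})
        self.weights = [1.0] * len(self.vocab)  # stand-in for learned weights

    def inference_state(self):
        """Return only what prediction needs, dropping the corpus."""
        return {"vocab": self.vocab, "weights": self.weights}

corpus = ["breaking news today", "more news today"] * 5_000  # pretend this is huge
model = TopicModel(corpus)

full = pickle.dumps(model)                     # includes the whole corpus
small = pickle.dumps(model.inference_state())  # corpus decoupled

assert len(small) < len(full)
```

The same principle applies to real libraries: serialize just the fitted parameters (vocabulary, coefficients, embeddings) rather than any object graph that drags the training data along with it.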