gsharma | 2 years ago
Ingest Client Data - You'll need to find customers who ingest data with dynamic schemas. The example in the video shows a fairly standard schema, which can be mapped once (using Lume or otherwise); there's no need to add the overhead or extra cost of running that pipeline on every batch.
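To make the distinction concrete, here's a minimal sketch (not Lume's API; field names are hypothetical) of why a standard schema only needs a one-time static mapping:

```python
# A standard schema can be mapped once with a fixed source->target rename;
# dynamic schemas would change this mapping per customer or per batch.
STATIC_MAPPING = {"cust_name": "name", "cust_email": "email"}  # hypothetical fields

def map_record(record: dict, mapping: dict) -> dict:
    """Rename source fields to target fields using a fixed mapping."""
    return {target: record[source]
            for source, target in mapping.items()
            if source in record}

row = {"cust_name": "Ada", "cust_email": "ada@example.com"}
print(map_record(row, STATIC_MAPPING))  # {'name': 'Ada', 'email': 'ada@example.com'}
```

Once that dict is written, the pipeline is just a cheap rename; the hard (and valuable) case is when the mapping itself keeps changing.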
Normalize Data - One of the challenges will be establishing quality metrics for these mappings. In the demo, the quality score is 100% every time, but that's far from the truth: real data is messy. Validations will catch a lot of issues, but there will still be cases where incorrect mappings slip through. The ability to surface metrics around this would help adoption a lot.
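One way such a metric could work (a hedged sketch, not how Lume scores mappings; the fields and validation rules are made up) is the fraction of mapped values that pass per-field validations:

```python
import re

# Hypothetical per-field validators for a normalized schema.
VALIDATORS = {
    "email": lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "") is not None,
    "age": lambda v: isinstance(v, int) and 0 <= v < 130,
}

def quality_score(records: list[dict]) -> float:
    """Fraction of (record, field) checks that pass validation."""
    checks = passed = 0
    for rec in records:
        for field, validate in VALIDATORS.items():
            checks += 1
            if validate(rec.get(field)):
                passed += 1
    return passed / checks if checks else 0.0

rows = [
    {"email": "a@example.com", "age": 30},
    {"email": "not-an-email", "age": 30},  # a bad mapping that validation catches
]
print(quality_score(rows))  # 0.75 - not 100%
```

Even this naive score is more honest than a constant 100%, and it still misses mappings that produce syntactically valid but semantically wrong values.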
Response Time - I'm not sure if you're using OAI or your own models in the background. Either way, even a small per-record latency, multiplied across hundreds of millions of records, adds up to hours or days of delay.
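The back-of-envelope math (assumed numbers, fully serialized processing) shows how quickly this compounds:

```python
# Even 1 ms of model latency per record, processed serially,
# turns 100 million records into more than a day of wall-clock time.
records = 100_000_000
latency_s = 0.001  # assumed 1 ms per record
total_hours = records * latency_s / 3600
print(round(total_hours, 1))  # 27.8 hours
```

Batching or parallelism divides this down, but any per-record model call puts a hard floor under pipeline throughput.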
All the best!