hwsw's comments

hwsw | 2 years ago | on: MetaCLIP – Meta AI Research

We have ported CLIP to our Bottlenose camera. The results are very exciting and the possibilities are, for lack of a better term, endless. You can now tell the camera what to look for. For example, in a manufacturing-automation task where the goal is to detect whether any product is missing a label, our customers can use the natural-language prompts "unlabelled product" and "labelled product". The system can then differentiate between the two and send results to a PLC. Previously, this would have required training and deploying a new model.
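The decision step here reduces to comparing embeddings in CLIP's shared image/text space: embed the two prompts, embed each camera frame, and pick the prompt with the higher cosine similarity. A minimal sketch (the embedding values below are placeholders, not real CLIP outputs, and `classify` is a hypothetical helper name):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def classify(image_emb, prompt_embs):
    # prompt_embs: {prompt text: text embedding}; return the best-matching prompt.
    return max(prompt_embs, key=lambda p: cosine_similarity(image_emb, prompt_embs[p]))

# Placeholder embeddings -- in practice these come from the CLIP text and
# image encoders respectively.
prompts = {
    "labelled product": [0.9, 0.1, 0.2],
    "unlabelled product": [0.1, 0.9, 0.2],
}
image_embedding = [0.8, 0.2, 0.1]  # hypothetical frame embedding
print(classify(image_embedding, prompts))  # prints "labelled product"
```

Because only the prompt embeddings change when the task changes, switching what the camera looks for needs no retraining, just new text.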

We generate the embeddings on the camera and send them out as chunk data over the GigE Vision 2.1 protocol.
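In GigE Vision chunk mode, extra payloads (here, the embedding) ride along after the image data, with each chunk followed by a trailer identifying it, and a receiver walks those trailers backwards from the end of the buffer. A minimal parser sketch, assuming a simplified layout where each trailer is a 4-byte big-endian chunk ID followed by a 4-byte big-endian body length (the real wire format and chunk IDs are camera- and spec-defined):

```python
import struct

def parse_chunks(payload: bytes) -> dict[int, bytes]:
    """Split a chunk-mode payload into {chunk_id: body}.

    Assumed layout: [body][ID (4B BE)][length (4B BE)] repeated,
    parsed back-to-front from the end of the buffer.
    """
    chunks = {}
    pos = len(payload)
    while pos > 0:
        # Read the 8-byte trailer that ends the current chunk.
        chunk_id, length = struct.unpack(">II", payload[pos - 8:pos])
        pos -= 8
        chunks[chunk_id] = payload[pos - length:pos]
        pos -= length
    return chunks

# Hypothetical chunk IDs for illustration only.
IMAGE_CHUNK, EMBEDDING_CHUNK = 0xA5A50001, 0xA5A50002
image = b"\x00" * 16
embedding = struct.pack(">4f", 0.8, 0.2, 0.1, 0.0)
payload = (
    image + struct.pack(">II", IMAGE_CHUNK, len(image))
    + embedding + struct.pack(">II", EMBEDDING_CHUNK, len(embedding))
)
print(parse_chunks(payload)[EMBEDDING_CHUNK] == embedding)  # prints True
```

The attraction of chunk data is that the embedding stays synchronized with the frame it was computed from, since both travel in the same stream packet block.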
