I updated the blog with the reference. Basically, compiling the model with https://github.com/NodLabs/shark-samples/blob/main/examples/... crashes. The coremltools converter is very version-specific (like all vendor conversion kits) and is still pinned to a version of TF I couldn't get on conda. It also doesn't allow training, and inference on the ANE is FP16 only. All our tests were with FP32.
magic_at_nodai|4 years ago
//part of nod.ai/shark team.