If I remember correctly, the "secret API call" was to the framebuffer. They were rendering the maps on the iPhone's CPU/GPU and then pushing the rasters to the watch; the watch was not yet able to render maps in real time. Uber needed to be able to generate screens even while backgrounded, so the API call could have been taking a snapshot of the active part of the screen buffer.
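For illustration, here's a rough sketch of that phone-renders/watch-displays pipeline using only public APIs (the reported private entitlement gave direct framebuffer access, which public APIs don't; `pushMapFrame` and its parameters are hypothetical names, not Uber's actual code):

```swift
import UIKit
import WatchConnectivity

// Hypothetical sketch: rasterize the map view on the phone's CPU/GPU,
// then ship the bitmap to the watch, which only displays it.
// (Uber reportedly used a private screen-capture entitlement instead.)
func pushMapFrame(from mapView: UIView, session: WCSession) {
    // Render the view hierarchy into a bitmap.
    let renderer = UIGraphicsImageRenderer(bounds: mapView.bounds)
    let frame = renderer.image { _ in
        mapView.drawHierarchy(in: mapView.bounds, afterScreenUpdates: false)
    }
    // Send the raster frame over to the watch extension.
    guard session.isReachable, let png = frame.pngData() else { return }
    session.sendMessageData(png, replyHandler: nil, errorHandler: nil)
}
```

Note that `drawHierarchy(in:afterScreenUpdates:)` only captures your own app's views in the foreground, which is exactly the limitation the framebuffer entitlement would have sidestepped.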