pliu | 9 years ago

If you read the PDF, it seems the FPGAs can be used in a number of ways:

1) As a local co-processor, they have been deployed to accelerate selected Bing search ranking operations. This is described a little bit in the white paper, but I think the most interesting parts are going to be proprietary.

2) As a network accelerator, because they sit inline with the server network interface. The white paper describes using the FPGA for crypto acceleration, which is cool. The CPU overhead of encrypting 40Gbps is significant, and managing keys for this many hosts is probably non-trivial. Moving encryption to another layer would, I think, simplify things significantly: app developers wouldn't have to think about it, and datacenter peeps could manage keys independently of whatever is going on in the host.
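The 40Gbps point is easy to sanity-check with back-of-envelope arithmetic. The cycles-per-byte and clock figures below are my assumptions, not numbers from the paper (AES-GCM with hardware AES support is commonly quoted at low single-digit cycles per byte):

```python
# Back-of-envelope: CPU cost of encrypting a 40 Gbps link in software.
# CYCLES_PER_BYTE and CORE_GHZ are assumed values; adjust for your hardware.

LINK_GBPS = 40
CYCLES_PER_BYTE = 1.5   # assumed AES-GCM cost with hardware AES instructions
CORE_GHZ = 2.4          # assumed server core clock

bytes_per_sec = LINK_GBPS * 1e9 / 8                 # 5.0e9 bytes/s
cycles_per_sec = bytes_per_sec * CYCLES_PER_BYTE    # 7.5e9 cycles/s
cores_needed = cycles_per_sec / (CORE_GHZ * 1e9)

print(f"{bytes_per_sec / 1e9:.1f} GB/s -> ~{cores_needed:.1f} cores at {CORE_GHZ} GHz")
# -> 5.0 GB/s -> ~3.1 cores at 2.4 GHz
```

So even under generous assumptions, line-rate software crypto eats multiple cores per host, which is exactly the kind of overhead an inline FPGA recovers.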

3) As a large-scale distributed processing grid. Because the FPGAs have their own network interfaces, they can be used independently of the host. Regular host network traffic passes through unaffected while the FPGA simultaneously runs distributed compute tasks from the grid. The white paper describes training a DNN this way, and also describes a Hardware-as-a-Service delivery model, meaning, I guess, that developers have access to the grid and can deploy whatever they want. I would guess there are a lot of machine learning and map-reduce-type tasks going on here, but who knows. The white paper also tantalizingly contains the phrase "cross-datacenter", which implies a globally distributed network of these things. Rad.

So that's a long way of saying there are many use cases for this kind of thing. Coupling the FPGAs with regular compute nodes is, I think, operationally beneficial - they can be co-processors or they can be part of the grid. Microsoft is doing some pretty cool shit.

Edit: This whole thing is called Project Catapult. There are some other good papers here https://www.microsoft.com/en-us/research/project/project-cat...
