> CNTK (http://www.cntk.ai/), the Computational Network Toolkit by Microsoft Research, is a unified deep-learning toolkit that describes neural networks as a series of computational steps via a directed graph. In this directed graph, leaf nodes represent input values or network parameters, while other nodes represent matrix operations upon their inputs. CNTK allows to easily realize and combine popular model types such as feed-forward DNNs, convolutional nets (CNNs), and recurrent networks (RNNs/LSTMs). It implements stochastic gradient descent (SGD, error backpropagation) learning with automatic differentiation and parallelization across multiple GPUs and servers. CNTK has been available under an open-source license since April 2015. It is our hope that the community will take advantage of CNTK to share ideas more quickly through the exchange of open source working code.
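The quoted description frames a network as a directed graph whose leaves are inputs or parameters and whose interior nodes are operations on their inputs. A toy pure-Python sketch of that idea (this is not CNTK's actual API, just an illustration of the graph concept) might look like:

```python
# A toy "computational network": leaf nodes hold input values or
# parameters; interior nodes apply an operation to their input nodes.
class Node:
    def __init__(self, op=None, inputs=(), value=None):
        self.op, self.inputs, self.value = op, tuple(inputs), value

    def eval(self):
        if self.op is None:              # leaf: input or parameter
            return self.value
        return self.op(*(n.eval() for n in self.inputs))

x = Node(value=2.0)                      # input leaf
w = Node(value=3.0)                      # parameter leaf
b = Node(value=1.0)                      # parameter leaf
mul = Node(op=lambda a, c: a * c, inputs=[x, w])
out = Node(op=lambda a, c: a + c, inputs=[mul, b])
print(out.eval())                        # 7.0  (2*3 + 1)
```

Training then amounts to differentiating such a graph with respect to the parameter leaves and updating them with SGD, which the README says CNTK does automatically.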
> CNTK allows to
OT pet peeve: the tech culture's distaste for pronouns has made this all too common. It doesn't even have to be a pronoun: "users", "people", "clients", etc. would all work -- but without one, the thing the author is referring to is never specified. That offends me as a technical person far more than pronouns ever will.
In particular, project descriptions and readmes are rife with "X allows to".
Finally Python bindings! I wanted to use this because Tensorflow is impossible on Windows, but the lack of programming language bindings made it a non-starter. Glad this is finally here.
Besides the rebranding, the Python bindings seem relatively new (about two months). The docs seem to imply it is pretty high-level compared to other frameworks: https://www.cntk.ai/pythondocs/
CNTK contributor here - Keras indeed is pretty high on our list of things to cover soon. But then, all our code is out there on GitHub and we welcome PRs :-)
Here's an article comparing TensorFlow to an earlier version, when it was still an MSR project called CNTK: https://esciencegroup.com/2016/02/08/tensorflow-meets-micros.... The author concluded that they both seemed very useful; he dinged CNTK for its lack of Python bindings at the time, but that seems fixed now.
CNTK has significantly higher performance on one or more machines, with great multi-GPU scalability, so you can train on bigger datasets with the resources you have.
The fastest for distributed deep-learning workloads... the proven toolkit for Microsoft production systems... and now Python support is native! Cognitive Toolkit rocks!
mrdrozdov | 9 years ago
Here's the github repo: https://github.com/Microsoft/CNTK
CNTK homepage (http://www.cntk.ai/) now redirects to https://www.microsoft.com/en-us/research/product/cognitive-t...
Eridrus | 9 years ago
One interesting note is that there seem to be plans to create a Keras backend that lets you run Keras models on CNTK: https://github.com/Microsoft/CNTK/issues/797
akssri | 9 years ago
New ops in TensorFlow seem to be oriented towards forward-mode AD rather than reverse-mode (for which one needs a pullback op on a dual).
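Concretely, the two modes differ in what each primitive op must supply. A toy pure-Python sketch (not TensorFlow's or CNTK's actual machinery) of forward mode via dual numbers versus a hand-written reverse-mode pullback:

```python
# Forward mode: a dual number carries a value and one directional
# derivative, so a single forward pass yields df/da for one seed.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        return Dual(self.val + other.val, self.dot + other.dot)

    def __mul__(self, other):
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)

a = Dual(3.0, 1.0)           # seed: differentiate w.r.t. a
b = Dual(4.0, 0.0)
z = a * a + a * b            # f(a, b) = a^2 + a*b
print(z.val, z.dot)          # 21.0 10.0  (df/da = 2a + b)

# Reverse mode runs the other way: each primitive needs a pullback
# (vector-Jacobian product) mapping an output cotangent back to
# input cotangents -- the extra op the comment is alluding to.
def f_pullback(a, b, gout=1.0):
    # hand-written VJP for f(a, b) = a^2 + a*b
    return (2 * a + b) * gout, a * gout

print(f_pullback(3.0, 4.0))  # (10.0, 3.0)
```

Forward mode costs one pass per input direction, while reverse mode gets all input gradients from one backward pass, which is why deep-learning frameworks lean on reverse mode for training.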
PacoLoco | 9 years ago
[deleted]