PyTorch versus Tensorflow

If you are reading this, I'm presuming you are already knee-deep in your deep learning journey. There are several open-source deep learning libraries out there, and yesterday I was asked a very good question: which deep learning library is better? This comes up often because users tend to get siloed into one library and worry that they are missing out on the benefits of the other.

#1:

Both are open-source frameworks created by different companies. TensorFlow follows a design similar to Theano's and has been developed by Google, whereas PyTorch is based on Torch and has been developed by Facebook.

Note: it was announced in September 2017 that the Theano team (headed by Yoshua Bengio) would cease all major development of the library.

#2:

The most important difference between the two is how the frameworks define the computational graph. TensorFlow builds a static graph, whereas PyTorch uses a dynamic graph. In TensorFlow you first have to define the entire computational graph of the model and then run it; in PyTorch you can define and manipulate the graph on the go. This is particularly helpful when working with variable-length inputs in RNNs, as the sketch below illustrates.
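To make that concrete, here is a minimal sketch (not from the original article) of the dynamic-graph style: an ordinary Python loop drives an `nn.RNNCell`, so sequences of different lengths go through the same code while the graph is built step by step as it runs. The layer sizes and inputs are arbitrary, purely for illustration.

```python
# Minimal sketch: in PyTorch the graph is built as the Python code runs,
# so a plain for-loop can handle sequences of different lengths without
# any up-front declaration or padding.
import torch
import torch.nn as nn

rnn_cell = nn.RNNCell(input_size=8, hidden_size=16)

def encode(sequence):                      # sequence: tensor of shape (seq_len, 8)
    hidden = torch.zeros(1, 16)            # initial hidden state
    for step in sequence:                  # graph grows step by step, on the go
        hidden = rnn_cell(step.unsqueeze(0), hidden)
    return hidden

short = encode(torch.randn(5, 8))          # 5 time steps
long_ = encode(torch.randn(42, 8))         # 42 time steps -- same code path
```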

#3:

TensorFlow is relatively low-level and therefore has a steeper learning curve, while PyTorch is more pythonic and higher-level. With TensorFlow you have to learn its particular way of working (sessions, placeholders, etc.), which makes it a bit harder to pick up than PyTorch.
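As a rough illustration of what "sessions and placeholders" mean in practice, the sketch below writes the same tiny computation twice: once against the TensorFlow 1.x-style graph API the article refers to (using the `tf.compat.v1` module that ships with TensorFlow 2.x), and once in plain PyTorch. The numbers and shapes are arbitrary, purely for illustration.

```python
# TensorFlow 1.x style: declare placeholders, build a static graph, run it in a session.
# (Uses the compat.v1 module available in TensorFlow 2.x; with an actual TF 1.x
# install, `import tensorflow as tf` would be used instead.)
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

x = tf.placeholder(tf.float32, shape=(None, 3))   # inputs must be declared up front
y = tf.reduce_sum(x * 2.0)                        # this only builds the graph
with tf.Session() as sess:                        # the graph runs inside a session
    print(sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]}))

# PyTorch: just call operations on tensors, like ordinary Python/NumPy code.
import torch
t = torch.tensor([[1.0, 2.0, 3.0]])
print((t * 2.0).sum())
```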

#4:

TensorFlow also has a much bigger community behind it than PyTorch. This makes it easier to find resources for learning TensorFlow and solutions to your problems.

#5:

Broadly speaking, TensorFlow is better for production models and scalability; it was built to be production-ready. PyTorch, on the other hand, is easier to learn and lighter to work with, and hence is relatively better for passion projects and rapid prototyping.

Also, quoting an interesting fact:

Unique mentions of deep learning frameworks in arXiv papers (full text) over time, based on 43K ML papers over the last 6 years. So far TF is mentioned in 14.3% of all papers, PyTorch in 4.7%, Keras in 4.0%, Caffe in 3.8%, Theano in 2.3%, Torch in 1.5%, and mxnet/chainer/cntk in <1% each. (As of 10 March 2018)

Hope this helps you select a library of your choice!


