
Why does Keras need TensorFlow as backend?

Data Science Asked by Aj_MLstater on December 8, 2020

Why does Keras need the TensorFlow engine? I have not found a clear explanation of why we need Keras at all. We can use TensorFlow to build a neural network model, so why do most people use Keras with TensorFlow as the backend?

7 Answers

Keras is an application programming interface (API). It is a single interface that can support multiple backends, which means a programmer can write Keras code once and have it executed on a variety of neural network frameworks (e.g., TensorFlow, CNTK, or Theano).

TensorFlow 2.0 is the suggested backend starting with Keras 2.3.0.
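
For illustration only, here is a minimal sketch (not part of the original answer) of how multi-backend Keras selected its engine: the KERAS_BACKEND environment variable (or the "backend" field in ~/.keras/keras.json) is the documented switch, and the toy model below is made up.

import os
os.environ["KERAS_BACKEND"] = "tensorflow"  # multi-backend Keras also accepted "theano" or "cntk"

from keras.models import Sequential  # with the bundled version: from tensorflow.keras import ...
from keras.layers import Dense

# The model definition itself is backend-agnostic; only the setting above changes.
model = Sequential([
    Dense(8, activation="relu", input_shape=(4,)),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")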

Answered by Brian Spiering on December 8, 2020

Additionally: Think of it as an abstraction layer.

Keras gives a nice and intuitive way to build and think about a neural network, but you have to understand that this is not how the computer actually takes its orders. Hiding this complexity behind TensorFlow allows us to think naturally about building a neural network rather than about all the implementation details.

(On a general note, that's why Python is so popular: it abstracts the complexity away and allows you to think about and write down a solution more naturally and intuitively.)
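
To make the abstraction concrete, here is a rough sketch (assuming TensorFlow 2.x; the layer sizes are arbitrary) of the same dense layer written once with low-level TensorFlow ops and once with Keras:

import tensorflow as tf

x = tf.random.normal((1, 4))

# Low level: you manage the weights, the matmul and the activation yourself
w = tf.Variable(tf.random.normal((4, 8)))
b = tf.Variable(tf.zeros((8,)))
y_low_level = tf.nn.relu(tf.matmul(x, w) + b)

# Keras: the layer object hides that bookkeeping
dense = tf.keras.layers.Dense(8, activation="relu")
y_keras = dense(x)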

Answered by Noah Weber on December 8, 2020

The first point to note is that Keras can potentially use many backends (e.g. Theano before it was discontinued, Microsoft Cognitive Toolkit, to name a couple). It just so happens that Keras has proven to be the most popular among the community. As a result, TensorFlow has adapted to the extent that Keras is now the default API in TensorFlow 2.0.

One of the biggest changes is the way libraries are now loaded using tf.keras.

Consider this example. Let's say one wishes to run a Sequential model using Keras. To do so, one must import the relevant libraries.

In the first version of TensorFlow, it would be done as follows:

from tensorflow.python.keras.models import Sequential
from tensorflow.python.keras.layers import Dense
from tensorflow.python.keras.wrappers.scikit_learn import KerasRegressor

The model is defined as such:

model = Sequential()
model.add(Dense(8, activation='relu', input_shape=(4,)))
model.add(Dense(1, activation='sigmoid'))

Now, let's contrast this to the TensorFlow 2.0 notation:

from tensorflow.keras import models
from tensorflow.keras import layers

model = models.Sequential()
model.add(layers.Dense(8, activation='relu', input_shape=(4,)))
model.add(layers.Dense(1, activation='sigmoid'))

Sequential is now accessed through the models module, and layers is the only other module imported, whereas in TensorFlow v1.0, Sequential, Dense, and KerasRegressor each had to be imported separately to generate the model.

Using the above example as a reference point, one can say that Keras now uses TensorFlow as a backend most frequently, simply because that combination has proven to be the most popular. As a result, TensorFlow has adapted by making the syntax for calling Keras more user-friendly, and Keras has become the default API in v2.0.

You might also find this article of use for further information on this topic: https://www.pyimagesearch.com/2019/10/21/keras-vs-tf-keras-whats-the-difference-in-tensorflow-2-0/

Answered by Michael Grogan on December 8, 2020

This makes more sense when understood in its historical context. These were the chronological events:

  • April 2009 Theano 0.1 is released. It would dominate the deep learning framework scene for many many years.
  • June 2015 Keras is created by François Chollet. The goal was to create an abstraction layer to make Theano easier to use, enabling fast prototyping.
  • August 2015 Google hires François Chollet.
  • November 2015 Tensorflow is released by Google, with much inspiration from Theano and its declarative computational graph paradigm.
  • December 2015 Keras is refactored to allow for pluggable backend engines, and now it offers backend implementations for Theano and Tensorflow.

Other backends were later supported by Keras (CNTK, MxNet), but they never got much traction.

Time passes by and the overlap between Tensorflow and Keras grows. Tensorflow ends up duplicating many of the functionalities in Keras (apart from the multiple APIs within Tensorflow that also had big overlaps).

  • September 2017 Theano is discontinued.
  • November 2017 Keras is bundled with Tensorflow as tf.keras. From this point on there are 2 different Keras: the one bundled with Tensorflow and the one that supports multiple backend engines. Both are maintained by the same people and are kept in sync.

At some point, the roadmap for Tensorflow 2.0 is defined, choosing to pursue an imperative model like PyTorch. The person leading the Tensorflow API refactoring is François Chollet. This refactoring included a reorganization of the functionality to avoid duplications.

Now, THE ANSWER to your question: Tensorflow is the most used Keras backend because it is the only one with a relevant user base that is under active development and, furthermore, the only version of Keras that is actively developed and maintained is the one bundled with Tensorflow.

So, summing up:

  1. At the beginning of Keras, the overlap with Tensorflow was small. Tensorflow was a bit difficult to use, and Keras simplified it a lot.
  2. Later, Tensorflow incorporated many functionalities similar to Keras'. Keras became less necessary.
  3. Then, apart from the multi-backend version, Keras was bundled with Tensorflow. Their separation line blurred over the years.
  4. The multi-backend Keras version was discontinued. Now the only Keras is the one bundled with Tensorflow.

Update: the relationship between Keras and Tensorflow is best understood with an example:

The dependency between Keras and Tensorflow is internal to Keras; it is not exposed to the programmer working with Keras. For example, the Keras source code contains an implementation of a convolutional layer. That implementation calls the keras.backend package to actually run the convolution computation. Depending on the Keras configuration file, the backend is set to the Tensorflow implementation in keras.backend.tensorflow_backend; this Keras module simply invokes Tensorflow to compute the convolution.
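
As a hedged illustration of that dispatch (using the backend API bundled with tf.keras; the numbers are made up), user code calls the backend-agnostic functions and TensorFlow does the actual computation:

from tensorflow.keras import backend as K  # multi-backend Keras used: from keras import backend as K

a = K.constant([[1.0, 2.0], [3.0, 4.0]])
b = K.constant([[1.0], [1.0]])
c = K.dot(a, b)   # backend-agnostic call; ultimately a TensorFlow matmul
print(K.eval(c))  # [[3.], [7.]]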

Answered by ncasas on December 8, 2020

Keras used to support multiple backends (most notably Theano and Tensorflow), but now only supports Tensorflow, in part because Theano was discontinued. The reason Keras needs Tensorflow as its backend is that Keras itself is just an abstraction layer.

It is one of the easiest ways to get started with AI and machine learning, because all of the core algorithms are implemented in TensorFlow and Keras lets you just call the classes/functions without writing any additional code. It is a great starter library for beginners and AI enthusiasts who have little coding experience.
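
A toy end-to-end sketch of that "just call the classes/functions" workflow, with made-up random data purely for illustration:

import numpy as np
import tensorflow as tf

# Fake data: 100 samples with 4 features and a binary label
X = np.random.rand(100, 4).astype("float32")
y = np.random.randint(0, 2, size=(100, 1)).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=16, verbose=0)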

Answered by Shashank Reddy on December 8, 2020

Let's go back to basics here.

It is not possible to use Keras on its own without a backend such as Tensorflow, because Keras is only an interface that makes machine learning programs easier to read and write. The actual calculations needed to create models are not implemented in Keras, which is why you need a backend library for anything to work.

When you are creating a model in Keras, you are actually still creating a model using Tensorflow, Keras just makes it easier to code.

Answered by Duodenum on December 8, 2020

Imagine you have a basic maths framework, a lot of functions doing addition, subtraction, multiplication and division.

Imagine in everyday life you often need to compute averages.

Then you write a function (using the framework's functions inside it) that takes an array of numbers as a parameter and returns the mean.

The framework is actually doing the work; it is still a lot of additions and one division, but your API-like function is a much nicer way to do what you need.

Let's say you were using Numpy (an algebra framework on CPU) to do your work, and Numba is its equivalent on GPU. If your code had a lot of "numpy.add(a, b)" everywhere you needed an addition, you would need to change it everywhere to "numba.add(a, b)", which is a lot of tedious work. But if instead you were using your homemade function "add(a, b)", then you would only have to change the framework used inside that one function, easy peasy! So yes, you understood correctly: it is better to update the API than the framework. To come back to this simple example, Numpy is a "CPU computing framework", so it would not make any sense to change it to use the GPU (Numba was created for that). But your custom function can easily be modified, as its purpose is to "do the job the best way for you". So it is good practice to stick to using your "API" everywhere, even if it sometimes seems unnecessary.
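
Here is that analogy as a small, made-up sketch (the functions add and mean are hypothetical, not real library APIs):

import numpy as np

def add(a, b):
    # Today this delegates to NumPy; if the framework changed,
    # only this one line would need to change, not every caller.
    return np.add(a, b)

def mean(values):
    total = values[0]
    for v in values[1:]:
        total = add(total, v)  # the "API" does the work through the framework
    return total / len(values)

print(mean([1.0, 2.0, 3.0, 4.0]))  # 2.5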

Now picture Keras as this function, and Tensorflow as the algebra framework. Sure, most of the time you can use the framework directly, but if you want your code to be cleaner, you'll use your API.

As of today, Keras and Tensorflow are bundled together and the Tensorflow interface is getting closer to Keras, but that was the idea.

If you can build the same model just as easily with Keras as directly with Tensorflow, it might seem better to get rid of the "useless" middle-man (Keras), but beware! If one day Tensorflow implements a better way of doing something, Keras will use it, whereas if you use Tensorflow directly, you'll need to update your code...

I am oversimplifying everything, I know, but you seem to have a hard time distinguishing between a framework and an API.

You can see that the API is "dumb", meaning that the API by itself relies on the algebra framework and would be useless without it. Otherwise it would need to implement all those operations itself and become a fully-fledged framework instead of a simple API. The API needs the framework to work, just as Keras needs TensorFlow.

Answered by Kévin Azoulay on December 8, 2020
