Neural Learning with Tensorflow2.0 Part-3 ( Tensorflow Model Graph in Neo4j and Linkurious)

In Part-2 of Neural Learning, we built a simple model that computes the sum of two numbers.

In this part we will use Neo4j and Linkurious (a graph visualization tool) to view the computational graph of the model we trained.

A computational graph is a way to represent a mathematical function in the language of graph theory. In such a graph, nodes are either input values or functions that combine values, and edges receive their weights as data flows through the graph.
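As a quick illustration (a minimal sketch, not taken from this series), TensorFlow 2 builds exactly such a graph when a Python function is traced with tf.function, and the nodes (ops) of the resulting graph can be listed from the concrete function:

    import tensorflow as tf

    @tf.function
    def weighted_sum(x, y):
        # two input values combined by multiply/add ops - a tiny computational graph
        return 0.3 * x + 0.7 * y

    # tracing the function once builds its graph; list the ops (nodes) it contains
    concrete = weighted_sum.get_concrete_function(
        tf.TensorSpec([], tf.float32), tf.TensorSpec([], tf.float32))
    for op in concrete.graph.get_operations():
        print(op.name, op.type)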

With the TensorFlow Keras APIs we can easily extract the variables, weights, biases, activation functions, etc. used in a model. The structure of the model we built in the previous part looks like this:

[Image: structure of the model built in Part-2]
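For reference, a model along those lines can be rebuilt and summarized with a few lines of Keras; the layer sizes below are assumptions, not necessarily the exact architecture from Part-2:

    import numpy as np
    import tensorflow as tf

    # assumed architecture: two inputs -> one small hidden layer -> one output
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(4, activation="relu", input_shape=(2,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")

    # train on random pairs of numbers whose target is their sum
    x = np.random.rand(10000, 2)
    y = x.sum(axis=1)
    model.fit(x, y, epochs=10, verbose=0)

    model.summary()   # prints the layer structure shown above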

Once the model is well trained, we extract the weights and biases that gradient descent updated on each neuron and load them into Neo4j.
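A minimal sketch of that extraction step with the Keras API (the variable names here are illustrative):

    # collect per-layer kernels (edge weights) and biases (node properties)
    extracted = []
    for layer in model.layers:
        kernel, bias = layer.get_weights()   # kernel: (inputs, units); bias: (units,)
        extracted.append({
            "layer": layer.name,
            "weights": kernel.tolist(),
            "biases": bias.tolist(),
        })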

Once these are loaded into Neo4j, the graph looks like this:

[Image: the model graph loaded in Neo4j]

'HAS_NEXT' is the relationship between nodes of successive layers. Each node carries its bias as a property, and each relationship holds a 'weight' property.
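One way to create that structure is with the official neo4j Python driver; the sketch below assumes the Neuron label, the connection details, and the extracted list from the previous snippet:

    from neo4j import GraphDatabase

    # placeholder connection details
    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

    def load_layer(tx, prev_layer, layer, kernel, biases):
        # one Neuron node per unit, carrying its bias as a node property
        for j, b in enumerate(biases):
            tx.run("MERGE (n:Neuron {layer: $layer, idx: $j}) SET n.bias = $b",
                   layer=layer, j=j, b=float(b))
        # one HAS_NEXT relationship per connection, carrying its weight
        for i, row in enumerate(kernel):
            for j, w in enumerate(row):
                tx.run("MATCH (a:Neuron {layer: $prev, idx: $i}), "
                       "(b:Neuron {layer: $layer, idx: $j}) "
                       "MERGE (a)-[r:HAS_NEXT]->(b) SET r.weight = $w",
                       prev=prev_layer, layer=layer, i=i, j=j, w=float(w))

    with driver.session() as session:
        # input-layer nodes have no bias of their own
        for i in range(model.input_shape[1]):
            session.run("MERGE (n:Neuron {layer: 'input', idx: $i})", i=i)
        prev = "input"
        for item in extracted:
            session.execute_write(load_layer, prev, item["layer"],
                                  item["weights"], item["biases"])
            prev = item["layer"]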

Let's view the graph in Linkurious and try to get some insights.

TensorFlow graph in Linkurious


[Image: the TensorFlow model graph in Linkurious]

Using Linkurious we can quickly explore the components (weights on the edges and biases on the nodes). As the graph grows more complex, for example when each layer contains many neurons, a visualization tool like this makes it much easier to extract information from it. We can readily answer questions such as: What does the distribution of weights between layers look like? Are the weights in each layer positive or negative? How many nodes have a zero or undefined bias?
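Questions like these also reduce to short Cypher queries against the loaded graph (a sketch, assuming the Neuron/HAS_NEXT model above):

    with driver.session() as session:
        # how many negative vs. positive weights leave each layer?
        result = session.run(
            "MATCH (a:Neuron)-[r:HAS_NEXT]->() "
            "RETURN a.layer AS layer, "
            "       sum(CASE WHEN r.weight < 0 THEN 1 ELSE 0 END) AS negative, "
            "       sum(CASE WHEN r.weight >= 0 THEN 1 ELSE 0 END) AS positive")
        for record in result:
            print(record["layer"], record["negative"], record["positive"])

        # how many nodes have a zero or undefined bias?
        zero_bias = session.run(
            "MATCH (n:Neuron) WHERE n.bias = 0 OR n.bias IS NULL "
            "RETURN count(n) AS nodes").single()["nodes"]
        print("nodes with zero/undefined bias:", zero_bias)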

Let's apply some styles to the Linkurious graph.

[Image: the styled graph in Linkurious]

In the styled graph we can easily distinguish negative weights (grey), nodes with zero bias (red), and the edges with the highest weights (orange).

We can also record which activation function was applied to each neuron, and how the weights and biases were affected by those activations.
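The activation of each layer can be read from its Keras config and stored as another node property; a sketch (the activation property name is an assumption):

    with driver.session() as session:
        for layer in model.layers:
            activation = layer.get_config().get("activation", "linear")
            # tag every neuron in the layer with the activation applied to it
            session.run("MATCH (n:Neuron {layer: $layer}) SET n.activation = $act",
                        layer=layer.name, act=activation)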

Let's see what a more complex network looks like in Linkurious.


[Image: a more complex network visualized in Linkurious]

We can click on each node to inspect its bias and on each edge to inspect its weight. This is one way of visualizing a TensorFlow model.



Thanks for reading !!
