A Basic Use Case of TFJS
Aman Patel
Machine Learning | Deep Learning | NLP | SQL | MongoDB | Django | Python | Docker | Kubernetes | AWS | GCP
While learning some new concepts in TensorFlow for Natural Language Processing on LinkedIn (Course link), I came across the idea of implementing a chatbot without any server, API, SDK, or platform. We can accomplish this with TensorFlow.js (a JavaScript library for working with TensorFlow models). So as a proof of concept, I first created a basic binary (dichotomous) text classification model.
Created a Keras model to detect sarcastic sentences.
To create the same model you can follow the instructions from the course. (TFjs is not part of this course.)
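For reference, here is a minimal sketch of such a binary classifier. The architecture and hyperparameters below (vocabulary size, embedding dimension, Conv1D filters) are placeholders rather than the exact values from the course; only the 150-token input length is fixed, because the React code later pads every input to 150 tokens.

import tensorflow as tf

# Placeholder hyperparameters -- substitute the values used in the course.
VOCAB_SIZE = 10000
EMBEDDING_DIM = 16
MAX_LEN = 150  # must match the padding length used later in the React app

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBEDDING_DIM, input_length=MAX_LEN),
    tf.keras.layers.Conv1D(128, 5, activation="relu"),
    tf.keras.layers.GlobalMaxPooling1D(),
    tf.keras.layers.Dense(24, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # 1 = sarcastic, 0 = not sarcastic
])
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
# ...train on the sarcasm dataset, then save the model for conversion:
model.save("model.h5")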
After creating the model, to get tokenized data I was instructed to create an object/dictionary whose keys are the words from the sentences and whose values are the token numbers.
I downloaded this object/dictionary as a JSON file (word_token.json) so that we can use it to generate the same tokens for our TFjs model.
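A sketch of that export, assuming the standard Keras Tokenizer from the course (its word_index attribute is exactly the word-to-token dictionary described above; sentences stands in for the training corpus):

import json
from tensorflow.keras.preprocessing.text import Tokenizer

tokenizer = Tokenizer(oov_token="<OOV>")  # settings as in the course
tokenizer.fit_on_texts(sentences)         # `sentences` is the training corpus

# word_index maps each word to its token number, e.g. {"the": 1, "to": 2, ...}
with open("word_token.json", "w") as f:
    json.dump(tokenizer.word_index, f)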
Converted the Keras model to TFjs model (a model.json file).
Suppose model.h5 is my model and /output/dir is my output folder; then I run the commands below to get the model.json file and a group1-shard1of1.bin file containing the weights.
We need to install tensorflowjs for this conversion.
pip install tensorflowjs
tensorflowjs_converter --input_format keras model.h5 /output/dir/
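If you prefer to stay in Python, the tensorflowjs package also exposes the same conversion programmatically; a small sketch using the same placeholder paths:

import tensorflowjs as tfjs
from tensorflow import keras

model = keras.models.load_model("model.h5")
# Writes model.json plus the binary weight shard(s) into /output/dir/
tfjs.converters.save_keras_model(model, "/output/dir/")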
Create a React Application (using npx)
The basic command to create a React application (link):
npx create-react-app my-app
Some npm packages to install (Material UI and tfjs):
npm i @material-ui/core @material-ui/icons
npm i @tensorflow/tfjs
Copy the word_token.json, model.json, and group1-shard1of1.bin files to the public/static directory so that we can serve them as static files.
Implemented the TFjs model in the React app.
Replace App.js with our own layout. For a React user, the code should be self-explanatory.
import "./App.css"; import * as tf from "@tensorflow/tfjs"; import { useEffect, useState } from "react"; import { Check, Clear, RecordVoiceOver, TextFields } from "@material-ui/icons"; import { Typography, TextField, LinearProgress } from "@material-ui/core"; function App() { const [nlpModel, setNlpModel] = useState(null); const [wordIndex, setWordIndex] = useState({}); const [inputstring, setInputstring] = useState( "Why don't you enter some text here ?" ); async function loadModel() { const model = await tf.loadLayersModel("/static/pureconvModel.json"); console.log("model loaded"); setNlpModel(model); } useEffect(() => { loadModel(); fetch("/static/word_token.json") .then((response) => response.json()) .then((data) => setWordIndex(data)); }, []); const predictIsSarcastic = (text) => { const token = text.split(" "); const tok = token.map((t) => wordIndex[t.toLowerCase()] !== undefined ? wordIndex[t.toLowerCase()] : 0 ); var inputStr = tok.concat(Array(150 - tok.length).fill(0)); if (nlpModel !== null) { const prediction = nlpModel.predict(tf.tensor([inputStr])).dataSync(); return [prediction >= 0.5 ? 1 : 0, prediction]; } return null; }; return ( <div className="App"> <header className="App-header"> <RecordVoiceOver></RecordVoiceOver> <Typography>Detection of Sarcastic Sentence</Typography> <TextFields></TextFields> <TextField value={inputstring ? inputstring : ""} onChange ?={(e) => setInputstring(e.target.value)} style={{ backgroundColor: "white", width: "50%", padding: 2, }} ></TextField> <br /> <LinearProgress style={{ width: "50%" }} variant="determinate" value={ predictIsSarcastic(inputstring) ? predictIsSarcastic(inputstring)[1] * 100 : 0 } /> {predictIsSarcastic(inputstring) && predictIsSarcastic(inputstring)[0] ? ( <Check color="secondary"></Check> ) : ( <Clear color="primary"></Clear> )} </header> </div> ); } export default App;
Now we are ready to try it out: just run npm start and play with the web app.
Link to my repository link
Link to the Notebook link
Link to the hosted site
Special thanks to Harshit Tyagi for this course (Deep Learning Foundations: Natural Language Processing with TensorFlow)