Deep diving into NodeJS, Part 2:
Understanding event-driven architecture
Node.js couples JavaScript with an event loop that quickly dispatches operations when events occur. Many JavaScript environments use an event loop, but in Node.js it is a core feature. Node's philosophy is to give you low-level access to the event loop and to system resources. Put another way: in Node, everything runs in parallel except your code.
As long as there's something left to do, Node's event loop will keep spinning. Whenever an event occurs, Node invokes any callbacks (event handlers) that are listening for that event.
As a Node developer, your job is to create the callback functions that get executed in response to events. Any number of callbacks can respond to any event, but only one callback function will ever be executing at any time. Everything else your program might do—like waiting for data from a file or an incoming HTTP request—is handled by Node, in parallel, behind the scenes.
Your application code will never be executed at the same time as anything else. It will always have the full attention of Node’s JavaScript engine while it’s running.
Events everywhere
Most of Node's objects — like HTTP requests, responses, and streams — inherit from EventEmitter so they can provide a way to emit and listen to events.
const EventEmitter = require('events')
The EventEmitter is a module that facilitates communication between objects in Node. EventEmitter is at the core of Node's asynchronous event-driven architecture. Many of Node's built-in modules inherit from EventEmitter.
The concept is simple: emitter objects emit named events that cause previously registered listeners to be called. So, an emitter object basically has two main features:
First, we can use the emit method to emit any named event we want:
myEmitter.emit('something happened');
Emitting an event is the signal that some condition has occurred. This condition is usually about a state change in the emitting object.
Second, we can add listener functions using the on method, and those listeners will be executed every time the emitter object emits their associated named event.
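Here's a minimal sketch putting both features together (the emitter and the event name are just placeholders):

const EventEmitter = require('events');

const myEmitter = new EventEmitter();

// Register the listener first...
myEmitter.on('something happened', () => {
  console.log('handling the event');
});

// ...then emit the event; the registered listener is invoked.
myEmitter.emit('something happened');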
Listeners are executed synchronously when an event is emitted. To emit an event after an asynchronous function is done, we'll need to combine callbacks (or promises) with this event-based communication.
One benefit of using events instead of regular callbacks is that we can react to the same signal multiple times by defining multiple listeners. To accomplish the same with callbacks, we have to write more logic inside the single available callback. Events are a great way for applications to allow multiple external plugins to build functionality on top of the application’s core. You can think of them as hook points to allow for customizing the story around a state change.
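For example, an application core could emit a single event and let several independent "plugins" hook into it. A small sketch (the app emitter and the user:created event are hypothetical names):

const EventEmitter = require('events');

const app = new EventEmitter();

// Two independent plugins react to the same state change.
app.on('user:created', (user) => console.log(`welcome email queued for ${user.name}`));
app.on('user:created', (user) => console.log(`signup logged for ${user.name}`));

app.emit('user:created', { name: 'Ada' });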
Let's look at an example where we first write a synchronous version of some logging code, and then write the same thing in an asynchronous fashion.
const EventEmitter = require('events');

class PrintLog extends EventEmitter {
  execute(taskFunc) {
    console.log('Before executing');
    this.emit('begin');
    taskFunc();
    this.emit('end');
    console.log('After executing');
  }
}
const printLog = new PrintLog();
printLog.on('begin', () => console.log('Starting execution'));
printLog.on('end', () => console.log('execution done'));
printLog.execute(() => console.log('Executing task'));
OUT:
Before executing
Starting execution
Executing task
execution done
After executing
The PrintLog class is an event emitter. It defines one instance method, execute, which receives a task function and wraps its execution with log statements, firing events before and after the execution. What I want you to notice about the output above is that it all happens synchronously; there is nothing asynchronous about this code.
const fs = require('fs');
const EventEmitter = require('events');

class PrintLog extends EventEmitter {
  execute(asyncFunc, ...args) {
    this.emit('begin');
    asyncFunc(...args, (err, data) => {
      if (err) {
        return this.emit('error', err);
      }
      this.emit('data', data);
      this.emit('end');
    });
  }
}
const printLog = new PrintLog();
printLog.on('begin', () => console.log('About to execute'));
printLog.on('end', () => console.log('Done with execute'));
printLog.execute(fs.readFile, __filename);
OUT:
About to execute
Done with execute
When we execute this code, we get the right sequence of events, as expected. The PrintLog class executes an asyncFunc and emits the right sequence of events before and after the execution. It also emits error/data events to work with the usual signals of asynchronous calls. We test the printLog emitter by passing it an fs.readFile call, which is an asynchronous function. Instead of handling file data with a callback, we can now listen to the data event.
Note how we needed to combine a callback with an event emitter to accomplish that. If the asyncFunc supported promises as well, we could use the async/await feature to do the same and maintain clean code:
class PrintLog extends EventEmitter {
  async execute(asyncFunc, ...args) {
    this.emit('begin');
    try {
      const data = await asyncFunc(...args);
      this.emit('data', data);
      this.emit('end');
    } catch (err) {
      this.emit('error', err);
    }
  }
}
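As a quick sketch of how this promise-based version could be used: fs.promises.readFile already returns a promise, so we can pass it straight in (the listeners mirror the earlier example):

const fs = require('fs');

const printLog = new PrintLog();
printLog.on('begin', () => console.log('About to execute'));
printLog.on('end', () => console.log('Done with execute'));
printLog.on('data', (data) => console.log(`Read ${data.length} bytes`));

// fs.promises.readFile returns a promise, so await works on it directly.
printLog.execute(fs.promises.readFile, __filename);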
CALLBACKS & PROMISES:
The simplest form of the event-driven nature is the callback style of some of the popular Node.js functions — for example, fs.readFile. In this analogy, the event will be fired once (when Node is ready to call the callback) and the callback acts as the event handler.
Let’s explore a simple example of a typical asynchronous Node function that’s written with a callback style
const fs = require('fs');

const printEachLine = function(file, callback) {
  fs.readFile(file, function(err, data) {
    if (err) {
      return callback(err);
    }
    const lines = data.toString().split('\n');
    callback(null, lines);
  });
};

printEachLine('./file.txt', (err, lines) => {
  if (err) throw err;
  lines.map(line => console.log(line));
});
printEachLine takes a file path and a callback function. It reads the file content, splits it into an array of lines, and calls the callback function with that array. Node's callback style is used in its pure form here: the callback takes an error-first argument err that's nullable, and we pass the callback as the last argument of the host function. You should always do that in your functions because users will probably assume it: make the host function receive the callback as its last argument, and make the callback expect an error object as its first argument.
In modern JavaScript, we have promise objects. Promises can be an alternative to callbacks for asynchronous APIs. Instead of passing a callback as an argument and handling the error in the same place, a promise object allows us to handle success and error cases separately and it also allows us to chain multiple asynchronous calls instead of nesting them.
If the printEachLine function supported promises, we could use it as follows:
printEachLine('./file.txt')
  .then(lines => {
    lines.map(line => console.log(line));
  })
  .catch(console.error);
Instead of passing in a callback function, we call a .then function on the return value of the host function. This .then function gives us access to the same lines array that we get in the callback version, and we can process it as before. To handle errors, we add a .catch call on the result, which gives us access to the error when it happens. Using this structure, we avoid callback hell.
We can also make our printEachLine function return a promise by instantiating a new Promise:
const fs = require('fs');

const printEachLine = function(file, callback = () => {}) {
  return new Promise((resolve, reject) => {
    fs.readFile(file, function(err, data) {
      if (err) {
        reject(err);
        return callback(err);
      }
      const lines = data.toString().split('\n');
      resolve(lines);
      callback(null, lines);
    });
  });
};
So we make the function return a Promise object, which wraps the fs.readFile async call. The promise executor exposes two arguments: a resolve function and a reject function.
Whenever we want to invoke the callback with an error, we also call the promise's reject function, and whenever we want to invoke the callback with data, we also call the promise's resolve function. This way the function supports both the callback and the promise interface at once.
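And since printEachLine now returns a promise, a consumer can also use async/await; a minimal sketch:

async function main() {
  try {
    const lines = await printEachLine('./file.txt');
    lines.map(line => console.log(line));
  } catch (err) {
    console.error(err);
  }
}

main();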
HTTP in NodeJS
HTTP is a protocol that allows the fetching of resources, such as HTML documents. It is the foundation of any data exchange on the Web and a client-server protocol, which means requests are initiated by the recipient, usually the Web browser. A complete document is reconstructed from the different sub-documents fetched, for instance text, layout description, images, videos, scripts, and more.
var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/html'});
  res.write('Hello World!');
  res.end();
}).listen(8080, '127.0.0.1');
Now when we run this code, the server will sit there in the Node engine, waiting for a request so it can emit a response.
The easiest way to make a request is to use our browser: visiting http://localhost:8080 will get the response Hello World!
Outputting HTML and templates
var http = require('http');
var fs = require('fs');

http.createServer(function(req, res) {
  res.writeHead(200, { 'Content-Type': 'text/html' });
  var html = fs.readFileSync(__dirname + '/index.htm', 'utf8');
  var message = 'Hello world...';
  html = html.replace('{Message}', message);
  res.end(html);
}).listen(1337, '127.0.0.1');
And the index.htm template it fills in:

<html>
<head></head>
<body>
<h1>{Message}</h1>
</body>
</html>
Note that this approach can add latency when you are delivering a large amount of data, because the whole file is read before the response is sent. Instead, use a pipe to send chunks of data via the response stream:
var http = require('http');
var fs = require('fs');

http.createServer(function(req, res) {
  res.writeHead(200, { 'Content-Type': 'text/html' });
  fs.createReadStream(__dirname + '/index.htm').pipe(res);
}).listen(1337, '127.0.0.1');
Routing
A server stores lots of files. When browsers make requests, they tell the server what they are looking for, and the server responds by giving them the files they ask for. This is called routing.
In NodeJs, we need to manually define our routes. It's no big deal; here's how to do a basic one.
//server.js
const http = require('http'),
  url = require('url'),
  makeServer = function (request, response) {
    let path = url.parse(request.url).pathname;
    console.log(path);
    if (path === '/') {
      response.writeHead(200, {'Content-Type': 'text/plain'});
      response.write('Hello world');
    } else if (path === '/about') {
      response.writeHead(200, {'Content-Type': 'text/plain'});
      response.write('About page');
    } else if (path === '/blog') {
      response.writeHead(200, {'Content-Type': 'text/plain'});
      response.write('Blog page');
    } else {
      response.writeHead(404, {'Content-Type': 'text/plain'});
      response.write('Error page');
    }
    response.end();
  },
  server = http.createServer(makeServer);

server.listen(3000, () => {
  console.log('Node server created at port 3000');
});
Paste this code into your server.js file, run it with node server.js, head to your browser, and hit localhost:3000 and localhost:3000/about. Then try something like localhost:3000/somethingelse. Our error page, right?
While this may satisfy our immediate need of starting a server, it would be crazy to do this for every web page on your server. Nobody actually does this; it's just to give you an idea of how things work.
If you observed, we required another module: url. It provides us with an easy way to work with URLs.
The .parse() method takes a URL as an argument and breaks it into the protocol, host, path, query string, etc. If you don't get that part, it's all right.
So when we do url.parse(request.url).pathname, it gives us the pathname of the URL, which is primarily what we use to route requests. But things can get easier.
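A quick sketch of what .parse() gives us (the example URL is arbitrary):

const url = require('url');

const parts = url.parse('http://localhost:3000/blog?page=2');
console.log(parts.protocol); // 'http:'
console.log(parts.host);     // 'localhost:3000'
console.log(parts.pathname); // '/blog'
console.log(parts.query);    // 'page=2'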
Routing with Express
If you've done some underground research, you must have heard about Express. It's a NodeJs framework for easily building web apps and APIs. Since you want to write NodeJs apps, you'll need Express too. It makes everything easier. You'll see what I mean.
//server.js
const express = require('express'),
  server = express();

server.set('port', process.env.PORT || 3000);

//Basic routes
server.get('/', (request, response) => {
  response.send('Home page');
});

server.get('/about', (request, response) => {
  response.send('About page');
});

//Express error handling middleware
server.use((request, response) => {
  response.type('text/plain');
  response.status(404);
  response.send('Error page');
});

//Binding to a port
server.listen(server.get('port'), () => {
  console.log(`Express server started at port ${server.get('port')}`);
});
Now, this looks clean, right? I believe you might be getting tuned in already. After importing the express module, we call it as a function, which creates our server.
Next, we set the port with server.set(). process.env.PORT reads the port from the environment the app is running on; if it's not available, we default to 3000.
If you observed the code above, routing in Express follows a pattern.
server.VERB('route',callback);
VERB here is any of GET, POST, etc.; 'route' is a path string that gets appended to the domain; and the callback is any function we want to fire when requests come in.
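For instance, handling a POST request follows the same pattern (the /users path and response text here are just illustrative):

server.post('/users', (request, response) => {
  // read the incoming data, then respond
  response.send('User created');
});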
There’s one more thing.
server.use(callback);
Whenever you see a .use() method in Express, it registers a middleware. Middlewares are functions that do some HTTP heavy lifting for us. In the code above, the middleware is an error-handling one.
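Middlewares normally receive a third argument, next, which hands control to the following handler. A minimal logging middleware sketch:

server.use((request, response, next) => {
  console.log(`${request.method} ${request.url}`);
  next(); // pass control to the next matching route or middleware
});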
Finally, we do our normal server.listen(). Remember?
This is what routing looks like in NodeJs applications.
Building RESTful APIs with Node and Express.
A RESTful API is one where the server and the client have no idea of each other's state. By using a REST interface, different clients hit the same REST endpoints, perform the same actions, and receive the same responses without minding each other's state.
An API endpoint is a single function in an API that returns data.
Creating a RESTful API involves sending data in JSON or XML format. Let's try to make one in NodeJs. We'll make one that returns dummy JSON data when a client requests it via Ajax. It's not a fancy API, but it'll help us understand how things work in Node. So…
//users.js
module.exports.users = [
  {
    name: 'Saurabh',
    age: 19,
    occupation: 'Developer',
  },
  {
    name: 'Abhishek',
    age: 27,
    occupation: 'Support',
  }
];
This is the data we want to share with other applications. We export it so every program can easily make use of it. That's the idea: we store the users array on the module.exports object.
//server.js
const express = require('express'),
  server = express(),
  users = require('./users').users; // grab the exported array

//setting the port.
server.set('port', process.env.PORT || 3000);

//Adding routes
server.get('/', (request, response) => {
  response.sendFile(__dirname + '/index.html');
});

server.get('/users', (request, response) => {
  response.json(users);
});

//Binding to localhost://3000
server.listen(3000, () => {
  console.log('Express server started at port 3000');
});
Here, we require('express') and create our server with express(). If you watch closely, you'll also see that we're requiring something else: that's our users.js, where we stored the data we're sharing. We need it for the program to work.
Express has some methods that help us send certain content to the browser. response.sendFile() searches for a file and sends it to the browser. Here, we use __dirname to get the root folder where our server is running from, then we add + '/index.html' to make sure we target the right file.
response.json() sends JSON content to the requesting client. We pass it one argument, the users array, which is what we're actually sharing.
<script src="https://code.jquery.com/jquery-3.2.1.min.js" integrity="sha256-hwg4gsxgFZhOsEEamdOYGBf13FyQuiTwlAQgxVSNgt4=" crossorigin="anonymous"></script>
<script type="text/javascript">
  const btn = document.querySelector('button');
  btn.addEventListener('click', getData);

  function getData(e) {
    $.ajax({
      url: '/users',
      method: 'GET',
      success: function(data) {
        console.log(data);
      },
      error: function(err) {
        console.log('Failed');
      }
    });
  }
</script>
btn.addEventListener('click', getData) wires the button up, and getData makes a GET Ajax request when clicked. It contains a $.ajax({...}) call that sets parameters like url, success, and error.
In real-world systems, you won't just be reading JSON files. You will want to Create, Read, Update, and Delete data. Express allows these through the HTTP verbs POST, GET, PUT, and DELETE respectively.
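As a rough sketch, CRUD routes on our users resource could look like this (the handlers are placeholders; a real API would persist the changes):

// Create
server.post('/users', (request, response) => {
  response.send('user added');
});

// Update
server.put('/users/:id', (request, response) => {
  response.send(`user ${request.params.id} updated`);
});

// Delete
server.delete('/users/:id', (request, response) => {
  response.send(`user ${request.params.id} removed`);
});

// Read is the GET /users route we already wrote above.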
Networking with NodeJs sockets.
A computer network is a connection of computers to share and receive information. To do networking in NodeJs, we require the net module:
const net = require('net');
In Transmission Control Protocol (TCP), there must be two endpoints: one endpoint (computer) binds to a numbered port while the other connects to that port.
Let's implement the socket. Put the following into filewatcher.js:
const net = require('net');
const fs = require('fs');

const filename = process.argv[2];

const server = net.createServer((connection) => {
  console.log('Subscriber connected');
  connection.write(`watching ${filename} for changes`);

  // fs.watch invokes its listener with (eventType, filename), not (err, data)
  let watcher = fs.watch(filename, (eventType) => {
    connection.write(`${filename} has changed`);
  });

  connection.on('close', () => {
    console.log('Subscriber disconnected');
    watcher.close();
  });
});

server.listen(3000, () => console.log('listening for subscribers'));
Next, we should provide a file to watch. Put this into textfile.txt:
Hello world
Next, let's create our client. Put this into client.js:
const net = require('net');

let client = net.connect({ port: 3000 });

client.on('data', (data) => {
  console.log(data.toString());
});
Now we run filewatcher.js with the name of the file to watch, like so:
//the textfile.txt will get stored in our filename variable
node filewatcher.js textfile.txt
//listening for subscribers
In another terminal, run node client.js. Now go ahead and change textfile.txt (make sure to save the changes), then take a look at the client.js terminal and notice the additional line:
//textfile.txt has changed.
One major characteristic of a network is that many clients can connect at the same time. Start up another command line, run node client.js again to add a second subscriber, and change the textfile.txt file once more (don't forget to save).
What are we doing?
Our filewatcher.js basically does three things:
1. Creates a server to send messages to many clients, via net.createServer().
2. Tells us that a client connected, and tells the client that there's a file being watched, via console.log() and connection.write().
3. Watches the file via the watcher variable, and closes the watcher when the client disconnects.
Events Arguments and Error Handling
In the previous PrintLog example, there were two events that were emitted with extra arguments. The error event is emitted with an error object:
this.emit('error', err);
The data event is emitted with a data object.
this.emit('data', data);
We can use as many arguments as we need after the named event, and all these arguments will be available inside the listener functions we register for these named events.
For example, to work with the data event, the listener function that we register will get access to the data argument that was passed to the emitted event and that data object is exactly what the asyncFunc exposes.
printLog.on('data', (data) => {
  // do something with data
});
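Since emit accepts any number of extra arguments, a listener can receive several values at once. A small sketch with a hypothetical progress event:

printLog.on('progress', (bytesRead, totalBytes) => {
  console.log(`progress: ${bytesRead}/${totalBytes}`);
});

// Hypothetical event carrying two values
printLog.emit('progress', 512, 2048);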
The error event is usually a special one. In our callback-based example, if we don’t handle the error event with a listener, the node process will actually exit.
class PrintLog extends EventEmitter {
  execute(asyncFunc, ...args) {
    console.time('execute');
    asyncFunc(...args, (err, data) => {
      if (err) {
        return this.emit('error', err); // Not handled
      }
      console.timeEnd('execute');
    });
  }
}
const printLog = new PrintLog();
printLog.execute(fs.readFile, ''); // BAD CALL
printLog.execute(fs.readFile, __filename);
The first execute call above will trigger an error. The node process is going to crash and exit:
events.js:163
      throw er; // Unhandled 'error' event
      ^

Error: ENOENT: no such file or directory, open ''
The second execute call will be affected by this crash and will potentially not get executed at all.
If we register a listener for the special error event, the behavior of the node process will change. For example:
printLog.on('error', (err) => {
  // do something with err, for example log it somewhere
  console.log(err);
});
If we do the above, the error from the first execute call will be reported but the node process will not crash and exit. The other execute call will finish normally:
{ Error: ENOENT: no such file or directory, open '' errno: -2, code: 'ENOENT', syscall: 'open', path: '' }
Note that Node currently behaves differently with promise-based functions and just outputs a warning, but that will eventually change:
UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): Error: ENOENT: no such file or directory, open ''
DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
The other way to handle exceptions from emitted errors is to register a listener for the global uncaughtException process event. However, catching errors globally with that event is a bad idea.
The standard advice about uncaughtException is to avoid using it, but if you must (say, to report what happened or to do cleanup), you should let the process exit anyway:
process.on('uncaughtException', (err) => {
  // something went unhandled.
  // Do any cleanup and exit anyway!
  console.error(err); // don't do just that.
  // FORCE exit the process too.
  process.exit(1);
});
However, imagine that multiple error events happen at the exact same time. This means the uncaughtException listener above will be triggered multiple times, which might be a problem for some cleanup code. An example of this is when multiple calls are made to a database shutdown action.
The EventEmitter module exposes a once method. This method signals to invoke the listener just once, not every time the event is emitted. So, this is a practical method to use with uncaughtException, because with the first uncaught exception we'll start doing the cleanup and we know that we're going to exit the process anyway.
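A minimal sketch of that pattern with once instead of on:

process.once('uncaughtException', (err) => {
  // Fires only for the first uncaught exception.
  console.error(err);
  // Do cleanup (close DB connections, flush logs, ...), then exit anyway.
  process.exit(1);
});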
Order of Listeners
If we register multiple listeners for the same event, the invocation of those listeners will be in order. The first listener that we register is the first listener that gets invoked.
// first
printLog.on('data', (data) => {
  console.log(`Length: ${data.length}`);
});

// second
printLog.on('data', (data) => {
  console.log(`Characters: ${data.toString().length}`);
});

printLog.execute(fs.readFile, __filename);
The above code will cause the “Length” line to be logged before the “Characters” line, because that’s the order in which we defined those listeners.
If you need to define a new listener, but have that listener invoked first, you can use the prependListener method:
// first
printLog.on('data', (data) => {
  console.log(`Length: ${data.length}`);
});

// second
printLog.prependListener('data', (data) => {
  console.log(`Characters: ${data.toString().length}`);
});

printLog.execute(fs.readFile, __filename);
The above will cause the “Characters” line to be logged first.
And finally, if you need to remove a listener, you can use the removeListener method.
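A small sketch, reusing the printLog emitter (the listener must be a named reference so we can remove that exact function):

const logData = (data) => console.log(`Length: ${data.length}`);

printLog.on('data', logData);

// Later, stop reacting to data events:
printLog.removeListener('data', logData);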
In the next part, we'll explore stream events in NodeJs, the different types of streams, and their usage.