Introduction

Node.js is a JavaScript runtime environment that allows you to build scalable network applications capable of handling thousands of simultaneous connections on a single machine. This is possible because of JavaScript's event-driven nature. Node.js currently runs on top of the V8 JavaScript engine.

As Node.js is built mostly in C++, it is fast, cross-platform and, best of all, open source. You can contribute and see what is going on under the hood.

What is Node?

  • Awesome;
  • Event Based;
  • Low Level;
  • Good tool to build Real-Time data apps.

What is it not?

  • A Web Framework;
  • Multi-Threaded;
  • Slow.

Blocking and Non-Blocking

So, what is so great about this non-blocking thing? Well my friend, let's take a look at some examples. Imagine that you want to serve a file to anyone who requests it from your application. What do you have to do? Read the file, load it into memory and then send it to the requester. To illustrate, let's take a look at the code below:

'use strict';

const fs = require('fs');

const contents = fs.readFileSync('file.txt', 'utf8');

console.log(contents);
An example of Blocking I/O

Looking at the code, you can't find any problem with it; it gets the job done. However, we block the entire application for this action. It imports the library, reads and loads the file's contents and then prints them to the console, but in a synchronous way. That means the thread is blocked throughout the whole operation, becoming a bottleneck.

JavaScript is a powerful tool for building asynchronous code. Using callbacks and the EventEmitter we can take it up a notch and refactor this code. Don't worry, we will talk about the EventEmitter and callbacks in a minute.

'use strict';

const fs = require('fs');

fs.readFile('file.txt', 'utf8', function(err, contents) {
    console.log(contents);
});

console.log('after calling readFile');
An example of Non-Blocking I/O

What did we do here? We changed the method from synchronous to asynchronous and added a new parameter. This new parameter is what we call a callback. The readFile method listens for a specific event; once that event is triggered, our callback is executed and the file's contents are printed to the console. And you know what the best part is? It does all that asynchronously! Yes sir! We do not block any of the application's resources, making it possible to request a bunch of files at once and still be able to execute any number of other tasks in the meantime.

WARNING: When we say asynchronous, we do not mean parallel execution.
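A tiny sketch (ours, not from the examples above) makes the difference visible: a callback scheduled "for later" still has to wait for the single thread to become free.

```javascript
'use strict';

const scheduledAt = Date.now();

// Ask for this callback to run "in 0 ms"...
setTimeout(function () {
  // ...but it only runs once the call stack is empty, so the busy
  // loop below pushes it well past the requested 0 ms.
  console.log('Timer fired after ' + (Date.now() - scheduledAt) + ' ms');
}, 0);

// A synchronous busy loop: while it runs, nothing else can.
const start = Date.now();
while (Date.now() - start < 200) {
  // blocking the one and only thread
}

console.log('Busy loop done');
```

Asynchronous, yes; parallel, no. There is one thread, and synchronous code always hogs it.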

Concepts Recap

The reason behind choosing JavaScript as the language for Node is simple:

"JavaScript has certain characteristics that make it very different from other dynamic languages, namely that it has no concept of threads. Its model of concurrency is completely based around events." - Ryan Dahl

What that means is that Node.js has a procedure that watches a queue of events: every time an event is triggered, the Event Loop picks it up, executes its handler and goes back to looking for the next event. And JavaScript has a code structure that makes coding for this kind of problem smooth. You can have a look into this awesome video about how the event loop works.
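As a quick illustration (a hypothetical snippet, not from the sections above), notice that the current synchronous pass always finishes before the event loop picks any queued callback off the queue:

```javascript
'use strict';

const order = [];

order.push('sync 1');

// This callback goes into the event queue; the event loop will only
// pick it up after the current synchronous pass has finished.
setTimeout(function () {
  order.push('callback');
  console.log(order.join(' -> ')); // sync 1 -> sync 2 -> callback
}, 0);

order.push('sync 2');
```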

Event Loop

Event Loop Diagram

As you can see, it is a single-threaded procedure that walks a list of events. You can take a look at the EventEmitter API to understand its core and how it works. Nevertheless, this is what you need to know:

  • Event Emitter is single threaded;
  • You can add multiple event handlers to one event;
  • Event handlers will be executed in the order they were added;
  • You can trigger an event anytime you want.

It is possible to create our own EventEmitter and trigger these events. Let's build a simple log event emitter.

'use strict';

const EventEmitter = require('events').EventEmitter;

const logger = new EventEmitter();

logger.on('error', function(message){
  console.log('Error: ' + message);
});

logger.emit('error', 'All your base are belong to us.');
Our First Event Emitter
Event Emitter using ES6 Classes
Event Emitter as Object Composition

Callbacks

Callbacks are functions that have their scope encapsulated and stored in memory for later use. JavaScript gives us the ability to store functions in variables and pass them as parameters to other functions, like event handlers. You may know this behavior as a closure.

setTimeout(function() {
    console.log('Hi, I\'m inside the call back!');
}, 2000);

console.log('Waiting 2 seconds...');
Callback Example
'use strict';

const message = 'Hi, I\'m inside the call back!';

const callback = function() {
    console.log(message);
};

setTimeout(callback, 2000);

console.log('Waiting 2 seconds...');
Callback Example With Variable (closure)
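Because each callback carries its scope with it, two callbacks produced by the same function keep separate state. A small sketch (the makeCounter name is ours, just for illustration):

```javascript
'use strict';

const makeCounter = function () {
  let count = 0; // captured by the closure below

  return function () {
    count += 1;
    return count;
  };
};

const counterA = makeCounter();
const counterB = makeCounter();

console.log(counterA()); // 1
console.log(counterA()); // 2
console.log(counterB()); // 1 -- its own captured count
```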


Code Import, Using Require

Now that we have the basic concepts explained, we can start with some Node.js code. You will see that it is good old JavaScript, but with benefits. We can import and export files in an organized way, with simple commands. First, let's take a look at how we can import code that we created. So, one step at a time: we need to create a code snippet to be imported. Something simple, a class:

'use strict';

const logMsg = function (msg) {
  console.log(msg);
};

const Dog = function (name) {
  const _name = name;

  this.whoIsAGoodBoy = function () {
    logMsg(_name + ' is a good boy!');
  };

  this.wiggle = function () {
    logMsg(_name + ' is wiggling.');
  };
};

module.exports = Dog;
Exporting Dog Class
Dog Class as ES6 Standard
Dog Class as Composition

The line "module.exports = Dog;" is the Node way to export something. By doing this we make our code available to the Node environment. What is important to know is that Node caches a module the first time it is required; this way Node avoids parsing the file every time it is required again. We should be very careful about how we export our code, because this shared cache can lead to very weird bugs.

It is time to import our code somewhere else. And how do we do it? Simple: we require it. Now that we have done the export in the file, this class of ours is available to us:

'use strict';

const Dog = require('./dog');

const brian = new Dog('Brian');

brian.whoIsAGoodBoy();
brian.wiggle();
Importing Dog Class
Instantiating with Dog as ES6 Class
Instantiating with Dog as Object Composition

In the code snippet above you can see that we imported the Dog class using "const Dog = require('./dog')". So, we are passing a relative path to require, saying which file we want to import. You can give any kind of path you want: relative, absolute or just a module's name (I will talk about this one later). It is not required to add the file extension at the end of the path, as Node assumes a JavaScript file. Even though the code is cached after the first require, we still need to call require in every file into which we want to import our code.

So, imagine that we have a different folder structure for our code, where index.js is not where our Dog import snippet is:

                    └── nodeCode
                        ├── lib
                        │   └── Dog.js
                        └── src
                            └── index.js
                  
Project Folder Structure

We could change the import to const Dog = require('../lib/Dog'), a relative path to the file that we want to import. Node is smart enough for the same relative path to work on Unix or Windows machines.

Package Management

Node Package Manager

Remember when I said that it is possible to import a module just by passing its name to the require method? Well, that is possible because of Node's package manager, a.k.a. NPM. NPM has a huge amount of libraries that you can use. If you want, you can create your own package and publish it there, and you can contribute your ideas to other libraries too.

To install a new package you have to open your terminal and execute this command:

$ npm install <package_name>
  
Installing an NPM Package

That is all it takes, seriously. No environment setup, no path settings, no bash_profile configuration. Cool, isn't it? Ok, but this instruction will only add the package to your current project. So, after executing this command, NPM will add a new folder named node_modules to your application's root folder. It will look something like this:

                      └── nodeCode
                          ├── lib
                          │   └── Dog.js
                          ├── node_modules
                          │   └── ...
                          └── src
                              └── index.js
                    
Project Folder with node_modules

Now, every package that you install locally will be available to be required in your code. For instance, let's say we decided that console.log is no longer suitable for our Dog class. And our super cool work colleague said that he worked with this super duper package named winston and it has all those cool colored alerts. We will give it a try. First we need to install it:

$ npm install winston
  
Installing winston via NPM

Now we create a file to implement our new logger:

'use strict';

const winston = require('winston');

// Note: this is the winston 2.x API; winston 3.x replaced
// the Logger constructor with winston.createLogger().
const coolLogger = new (winston.Logger)({
  transports: [
    new (winston.transports.Console)({
      level: 'silly',
      colorize: true,
      timestamp: function() {
        return (new Date()).toISOString();
      }
    })
  ]
});

module.exports = coolLogger;
Winston Logger

Import it into the Dog class:

'use strict';

const log = require('./winstonConfig.js');

const logMsg = function (msg) {
  log.info(msg);
};

const Dog = function (name) {
  const _name = name;

  this.whoIsAGoodBoy = function () {
    logMsg(_name + ' is a good boy!');
  };

  this.wiggle = function () {
    logMsg(_name + ' is wiggling.');
  };
};

module.exports = Dog;
Dog Class using new logger
With Dog ES6 Class
With Dog Object Composition

Voilà! Now we can use winston in our code! Every time we call any method on brian, the message will show up in the console in bright colors and with a timestamp.

What if I want to use a package without installing it in every project? Calm down, my friend, we have the answer for you. You can install any package globally by executing this command in your terminal:

$ npm install <package_name> -g
  
Installing a Global Package

This will install the package in Node's global node_modules folder, ensuring that the package will be available to any project you may have. In my humble opinion, I do not like installing global packages, even when I will use them in other projects. Installing packages locally encapsulates your project and avoids version conflicts with other projects you might have, and it keeps your project's workspace clean. Let's keep it clean when possible!

You may be asking yourself: "Okay, but how can I keep track of all the packages installed in a project?". Node has a standard file that NPM reads to know which packages, at which versions, were installed in each project. This file is called package.json.

Package.json

This file, package.json, lives in your application's root folder and acts as your application's description.

                      └── nodeCode
                          ├── lib
                          │   └── Dog.js
                          ├── node_modules
                          │   └── ...
                          ├── src
                          │   └── index.js
                          └── package.json
                    
Project Folder with package.json

It is written in JSON. It holds all kinds of information about your application, from its name, version and a quick description to its dependencies. A standard file looks like this:

{
    "name": "AC-socialAPI",
    "version": "0.0.0",
    "description": "Avenue Code Training Code",
    "main": "index.js",
    "scripts": {
      "preinstall": "npm install grunt-cli -g"
    },
    "repository": {
      "type": "git",
      "url": "https://github.com/andrebot/AC-socialApi.git"
    },
    "keywords": [
      "Node",
      "Express"
    ],
    "author": "Andre Botelho, Henrique Elias",
    "license": "ISC",
    "bugs": {
      "url": "https://github.com/andrebot/AC-socialApi/issues"
    },
    "homepage": "https://github.com/andrebot/AC-socialApi",
    "dependencies": {
      "body-parser": "^1.13.1",
      "cookie-parser": "^1.3.5",
      "express": "^4.13.0",
      "jsonwebtoken": "^5.0.2",
      "mongoose": "^4.0.6",
      "mpromise": "^0.5.5"
    },
    "devDependencies": {
      "blanket": "^1.1.7",
      "bootstrap": "^3.3.5",
      "grunt": "^0.4.5",
      "grunt-contrib-copy": "^0.8.0",
      "grunt-mocha-test": "^0.12.7",
      "grunt-nodemon": "^0.4.0",
      "jquery": "^2.1.4",
      "jquery.cookie": "^1.4.1",
      "mocha": "^2.2.5",
      "mongodb": "^2.0.36",
      "should": "^7.0.1",
      "supertest": "^1.0.1"
    }
  }
  
Standard package.json file

You can find the whole package.json documentation here. It explains all the possible attributes and what you can do with each one of them.

The only things that concern us right now are the last two attributes, dependencies and devDependencies. As I said, we can control which version to install in the current project. Let's take the dependency "body-parser". It says that it is a dependency (duh!); however, it also says "^1.13.1". Obviously this is the version we want to use in our beloved app. But what is that symbol before the version number? That is semver range notation. We can define whether we want an exact version, any version above it and so on. Here is a list of what we can use:

  • ">"
  • ">="
  • "<"
  • "<="
  • "~"
  • ...

The list goes on! You can see the full list and its meanings at the same link as the package.json documentation. So you can pretty much control every aspect of your packages. Therefore, to add any new package to your project, you can just edit this file and add anything you'd like, at any version that works for you.
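To make those symbols concrete, here is a quick cheat sheet (using express just as an example package name):

```
"express": "4.13.0"    exactly version 4.13.0
"express": "^4.13.0"   >=4.13.0 and <5.0.0 (compatible minor/patch updates)
"express": "~4.13.0"   >=4.13.0 and <4.14.0 (patch updates only)
"express": ">=4.13.0"  4.13.0 or anything newer
"express": "*"         any version at all
```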

If you prefer, you can add packages and save them from your terminal. It is as simple as adding a new parameter to the command that we already know and love:

$ npm install <package_name> --save
  
Saving package at your package.json

Executing this, NPM will add a new line to dependencies with the latest version. You can change it by editing the file itself, or by requesting the version you would like:

$ npm install <package_name>@<version> --save
  
Saving package at your package.json defining version

The whole documentation can be found here.

You may have noticed that we have two places to add dependencies. We've just talked about where we add the packages that the application needs to run. However, there are a bunch of other libraries and tools we can add to a project to help us out. These packages are solely for development purposes, and that is why we have the devDependencies attribute in our package.json. This is the place to add task runners, test frameworks and so on. It makes it simple for a developer to know what to install for what they want to do. For instance, if I just want to run the application, I can install only the packages under dependencies. This is handy for deployment environments, where I don't need any fancy development packages because I just want to deploy the app. To install packages and save them under devDependencies we do:

$ npm install <package_name> --save-dev
  
Saving package at your package.json devDependencies

Now that we have your application's package.json rocking, we can easily install all its packages in a new environment any time we want. We just need to execute the install command:

$ npm install
  
Installing All Packages By package.json

If we want to install just the packages required to run the project, as in a production environment, we do:

$ npm install --production
  
Excluding Development Packages

And we can easily keep things up to date:

$ npm update
  
Updating Application's Packages

This command will update the project's packages to the latest versions allowed by their semver ranges. We can also update a single package, or the global packages with -g, and all of that is described here.

Of course, we can't forget how to uninstall a package. We can always be fooled by a poorly written description, or we may discover a better package at some point. So, to uninstall any package we do:

$ npm uninstall <package_name>
Uninstalling a Package

To record this change in your package.json we still have to add --save or --save-dev.

Another thing you might have noticed is an attribute called scripts. This field is where you can define shell commands (or shell scripts) to be run. There are predefined keys that we can add under scripts:

  • prepublish;
  • publish & postpublish;
  • preinstall;
  • install & postinstall;
  • preuninstall & uninstall;
  • postuninstall;
  • preversion & version;
  • postversion;
  • pretest;
  • test;
  • posttest;
  • prestop;
  • stop;
  • poststop;
  • prestart;
  • start;
  • poststart;
  • prerestart;
  • restart;
  • postrestart.

You can find the whole documentation for that in this link. This is a long list, but I want you to pay attention to this set of values: preinstall, install, postinstall, pretest, test, posttest and start. As you may have figured out, all the values that start with "pre" and "post" are scripts run before and after their actions (install, test and start), respectively. This way you can define commands to be executed in a predetermined order so your app works as expected.

$ npm start
Calling a NPM script in the terminal

Another useful thing about the scripts attribute is that you can add any other key you want, to run any other task/command you can think of. This way you can define flows to be executed and give them meaningful names.

{
  ...
  "scripts": {
    "clearModules": "rm -rf node_modules"
  },
  ...
}
Example of a custom script

And we can easily execute them using:

$ npm run clearModules
Executing custom script

Ok, awesome features! Now, I want to tell you a secret. However, I don't want it to be a secret anymore, so tell everybody about this mind-blowing information. Remember those libraries I told you needed to be installed globally so we could use them on the command line? Well, asking your team members to install global packages has some problems: they can't be described in the package.json; you have to document somewhere else that they need to be installed; it is one more step in the installation process for each global package; it is hard to manage the package's version across the whole team; and we could face permission issues. But fear not, NPM has a solution.

Instead of installing your pretty global packages globally, install them locally, without the "-g" parameter (yes! you read that right). "So how do I run those global packages now?", is what you are thinking. I know. Come closer. Read this carefully: all the packages that you've installed locally can be used in the scripts that you define under the scripts attribute in your package.json. That said, you can install your "global" packages locally and use them in your predefined scripts.
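This works because NPM prepends node_modules/.bin to the PATH while running scripts, so locally installed executables behave like global ones there. A hypothetical setup using mocha as the example tool:

```json
{
  "scripts": {
    "test": "mocha"
  },
  "devDependencies": {
    "mocha": "^2.2.5"
  }
}
```

Running npm test will find the mocha binary inside node_modules/.bin — no global install required, and the version is pinned in package.json for the whole team.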

Node Server

Now that we know what an EventEmitter is, what a callback is, how the event loop works and how to import and export code, we can start building our first server!!!!!!!11!1!1!!one! Let's keep it simple: for now we will build a server that just says something useless.

'use strict';

const http = require('http');

http.createServer((request, response) => {
  response.writeHead(200);
  response.write('I\'m alive!');
  response.end();
}).listen(8080, () => {
  console.log('Listening on port 8080...');
});
First server

Basically this server replies "I'm alive!" to the requester. Cool! But useless. However, we just used almost everything we talked about: we imported a module, used it, listened to an event, triggered an event and executed an event handler! Awesome! What else can we demonstrate about all the yada yada written above? Oh, I know! Non-blocking execution:

'use strict';

const http = require('http');

http.createServer(function(request, response){
  response.writeHead(200);
  response.write('I\'m alive, wait and see...');

  setTimeout(function(){
    response.write('Took me 5 seconds to finish.');
    response.end();
  }, 5000);

  response.write('Not yet done.');
}).listen(8080, function(){
  console.log('Listening on port 8080...');
});
Non-Blocking Server

Pretty neat, huh? We kept writing data to the requester's response and only ended the request after 5 seconds, yet we could still execute other functions while that action was pending. With the power of JavaScript's syntax it was easy to code something like this; it didn't hurt, and it is readable.

Hold on! I didn't explain much about what was going on. When we passed the function to the createServer method, we defined two parameters, request and response. Those are the readable and writable streams: we can read data from the former and write data to the latter. Now let's print the data coming from the requester:

'use strict';

const http = require('http');

http.createServer((request, response) => {
  response.writeHead(200);
  response.write('Thanks for sharing ...');

  request.on('readable', () => {
    let data = null;

    console.log('Reading');

    while (null !== (data = request.read())) {
      console.log(data.toString());
    }

  });

  request.on('end', () => {
    console.log('Ending');

    response.end();
  });

}).listen(8080,  () => {
  console.log('Listening on port 8080...');
});
Reading Data From Requester

Now that does something that is not quite so useless. With this code we can even upload files! As long as the request stream has data to provide, the end event will not be triggered, and we can keep reading data and writing it wherever we want.

Make Request

Not only can we receive requests, we can also make requests to a server.

'use strict';

const http = require('http');
const StringDecoder = require('string_decoder').StringDecoder;

const options = {
    host: 'localhost',
    port: 8080,
    path: '/',
    method: 'POST'
};

const request = http.request(options, function(response){
    response.on('data', function(data){
        const decoder = new StringDecoder('utf8');
        console.log(decoder.write(data));
    });
});

request.write('Are you alive?');
request.end();
Sending Data To The Server
Server receiving chunk by chunk
Server receiving by event 'readable'
Sending chunk by chunk
Sending by pipe

Exercises

Exercise 1:

Create a program that prints to the console any string that you like.

Exercise 2:

Create a program that reads the contents of a file and prints them to the console.

Exercise 3:

Create a program that reads the contents of a file and writes them to another file.

Exercise 4:

Create a class whose constructor receives a string and, inside this class, create a method that prints this string to the console. Export this class. Create a program that imports the class you've just created, instantiates it with your name and prints it to the console.

Exercise 5:

Create a class whose constructor receives a string and, inside this class, create a method that prints this string to the console using a new EventEmitter, calling the event 'log'. Export this class. Create a program that imports the class you've just created, instantiates it with your name and prints it to the console.

Exercise 6:

Create a class whose constructor receives a string and, inside this class, create a method that prints this string to the console using the winston package. Export this class. Create a program that imports the class you've just created, instantiates it with your name and prints it to the console.

Exercise 7:

Create a simple package.json file with all the dependencies and info required to describe what you did in exercise 6.

Exercise 8:

Create a simple server that returns the string "Hello World" when requested.

Exercise 9:

Create a program that makes a request to the local server.

Exercise 10:

Create a server that echoes back to the requester the same data they sent.

Exercise 11:

Create a server that prints to the console the data that the requester sent.

Exercise 12:

Create a server that writes to a file the data that the requester sent.

Exercise 13:

Create a server that writes to a file the data that the requester sent, using the pipe method of the request stream.

Exercise 14:

Create an upload server that receives a file, saves it to disk and returns the upload progress to the requester in real time.