Node.js is an open source, cross-platform runtime environment for executing JavaScript code. It is used for server-side programming, which makes it possible for developers to use JavaScript for both client-side and server-side code without learning an extra language. Whether you are a beginner, an intermediate, or an experienced Node.js professional, these questions will help you build confidence and deepen your knowledge of Node.js. We have compiled a list of basic and advanced Node.js interview questions that cover the basics of Node.js, its features, npm and its components, and more. This guide will also explain each question and help you understand the concepts in detail. With these Node.js interview questions by your side, you can be sure of preparing well for your upcoming interview.
It is not advisable to use Node.js for CPU-intensive applications.
Node.js is designed around using a single thread very efficiently. Its event-based model dispatches code fragments when specific events occur. Those code fragments are supposed to execute very quickly and then return control to Node.js, which then dispatches the next event.
If one of those code fragments performs a long-running task, then no more events will be dispatched, and the whole system appears to hang.
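As a minimal sketch of this behaviour (the timings and messages are illustrative), a CPU-bound loop delays every pending callback on the single thread:

setTimeout(function () {
  console.log('timer fired'); // expected after ~100 ms
}, 100);

var end = Date.now() + 5000; // busy-loop for five seconds
while (Date.now() < end) { } // CPU-bound work keeps the single thread occupied

console.log('busy work done'); // prints first; 'timer fired' is delayed about five seconds

Because the while loop never yields back to the event loop, the 100 ms timer cannot fire until the loop finishes.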
Microsoft, PayPal, and Uber are a few of the companies using Node.js.
Here is the list of important features of Node.js:
Streams are a way of handling the following:
It is the movement of data from one point to another.
When a program is supposed to read a file consisting of a single page (three to four lines), it is initially read into memory from start to finish before processing starts. If the file is an e-book consisting of 500+ pages, it takes a lot of storage space and time to be loaded into memory before processing can start. This is where streams make a difference.
Using streams, you read it piece by piece, processing its content without keeping it in memory.
The following are the advantages of streams:
The following code statement refers to the stream module:
const stream = require('stream');
The following are the types of streams in Node.js:
1) Readable Streams: A stream from which you can receive data but to which you cannot send it. When you push data into a readable stream, it is buffered until a consumer starts to read the data.
Example: an HTTP request received by the server. Based on the request, the server sends an HTTP response to the client, and the client reads that response as a readable stream. Another example: RSS feeds fetched from remote servers are read-only streams on the HTTP client.
Module used: fs.createReadStream
2) Writable Streams: A stream to which you can send data but from which you cannot receive it.
Example: data entered by the user on an HTTP client is written to the request stream and sent to the server, where it is received.
Module used: fs.createWriteStream()
3) Duplex Streams: Streams that are both readable and writable.
Example: TCP sockets
Module used: net.Socket
4) Transform Streams: A type of duplex stream where the output is in some way related to the input. Like duplex streams, transform streams also implement both the Readable and Writable interfaces.
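As a hedged sketch of a transform stream (the upper-casing behaviour is purely illustrative), the simplified constructor lets you supply the transform function directly:

var Transform = require('stream').Transform;

// A transform stream: the output is derived from the input (here, upper-cased text).
var upperCase = new Transform({
  transform: function (chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});

process.stdin.pipe(upperCase).pipe(process.stdout); // echoes typed input in upper case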
Node.js is a JavaScript server environment built on Google Chrome's JavaScript engine (the V8 engine). It is very well suited for highly scalable I/O-intensive web applications, real-time web applications and network applications. Node.js is the server-side technology in the MEAN stack, one of the most popular software stacks for building dynamic web sites and web applications. Node.js is open source and hence has great community support. The main features of Node.js are:
Node.js is often not preferred for relational databases and CPU-intensive applications.
npm, or Node Package Manager, is the default package manager for Node.js. It works as:
A few of the important npm commands are:
npm install <name of the package>
This will install the module under the path "./node_modules/". Once installed, the module can be used just as if it were built in. Dependency management can also be done with npm. Our Node project will have a package.json listing all the dependencies needed for the project. If we run "npm install" from the project root, all the dependencies listed in the package.json file will be installed.
npm <command> -h
A callback function is called at the end of a specific task, or simply when another function has finished executing. Callback functions are used to support the asynchronous, non-blocking nature of Node.js. In asynchronous programming with Node.js, servers don't wait for an action such as an API call to complete before starting a new one.
For eg:
Let's read a file, say "input.txt", and print its contents to the console. The synchronous or blocking code for this requirement is shown below:
var fs = require("fs"); var data = fs.readFileSync('input.txt'); // execution stops and waits for the read to finish console.log(data.toString()); console.log("Program Ended"); Let’s rephrase the code with the callback function. var fs = require("fs"); fs.readFile('input.txt', function (err, data) { if (err) return console.error(err); console.log(data.toString()); }); console.log("Program Ended");
Here the program does not wait for the file read to complete but proceeds to print "Program Ended". If an error occurs during readFile(), the err object will contain the corresponding error, and if the read is successful the data object will contain the contents of the file. readFile() passes the err and data objects to the callback function after the read operation is complete, which finally prints the content.
Pyramid of Doom or callback hell happens when Node.js programs are very complex in nature and have heavily nested callback functions. The name comes from the unreadable pattern created by the nested callbacks.
For eg:
Let’s assume that we have 3 different asynchronous tasks and each one depends on the previous result causing a mess in our code.
asyncFuncA(function (x) {
  asyncFuncB(x, function (y) {
    asyncFuncC(y, function (z) {
      ...
    });
  });
});
Callback hell can be avoided by the following methods (see the promise-chaining sketch after this list):
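For instance, if asyncFuncA, asyncFuncB and asyncFuncC from the example above are rewritten to return promises (an assumption for this sketch), the pyramid flattens into a readable chain:

asyncFuncA()
  .then(function (x) { return asyncFuncB(x); })
  .then(function (y) { return asyncFuncC(y); })
  .then(function (z) { /* use the final result z */ })
  .catch(function (err) { console.error(err); }); // one error handler for the whole chain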
There are two types of API functions available in Node.js:
1. Synchronous or blocking functions, where all other code execution is blocked until the I/O event being waited on completes. These are executed synchronously, one after the other.
For eg:
Reading a file called 'file.txt':
const fs = require('fs');
const data = fs.readFileSync('file.txt'); // blocks here until file is read
Here the execution of further lines in the program is blocked. If any error is thrown, it needs to be caught immediately to avoid crashing the program. readFileSync() reads the entire content into memory and then prints the data to the console. Blocking functions have an adverse effect on the application's performance.
2. Asynchronous or non-blocking functions, where multiple I/O calls can be performed without blocking the execution of the program.
For eg:
Reading a file "file.txt":
const fs = require('fs');
fs.readFile('file.txt', function (err, data) {
  if (err) throw err;
});
Here the file read (readFile()) doesn't block the execution of the next instruction in the program. The function takes the file name and passes the file's data to the callback handler. The file system object then remains ready to take up any other file system operation. Asynchronous API functions increase throughput by increasing the number of instructions handled per cycle.
Streams are an abstract interface available in Node.js. The stream module helps in implementing streaming data. There are four types of streams.
The important events on a readable stream are:
For eg:
Reading a file “input.txt”
var fs = require('fs');
var readableStream = fs.createReadStream('input.txt'); // creates a readable stream
var data = '';
readableStream.on('data', function (txt) { // the data event produces the flow of data
  data += txt;
});
readableStream.on('end', function () { // the end event is triggered when there is no more data to read
  console.log(data);
});
The important events on a writable stream are:
For eg:
Write "Hello World" to file.txt:
var fs = require("fs"); var data = 'Hello world'; var writerStream = fs.createWriteStream('file.txt'); // Create a writable stream writerStream.write(data,'UTF8'); // Write the data to stream // Mark the end of file writerStream.end(); writerStream.on('finish', function() { // finish triggered when all data is written to console.log("Write completed."); });
Piping streams is one of the most popular mechanisms in Node.js programs, where the output of one stream is provided as the input to another stream. For eg:
var fs = require("fs"); var readerStream = fs.createReadStream('example.txt'); // Readable stream var writerStream = fs.createWriteStream('exampleOutput.txt'); // Writable stream readerStream.pipe(writerStream);// Pipe the readable stream as input to writable stream console.log("Program Ended");
The REPL module in Node.js is a Read-Eval-Print-Loop (REPL) implementation. It works just like a shell or command prompt. REPL is available both as a standalone program and for inclusion in other applications. It can be accessed using the statement:
const repl = require('repl');
REPL accepts individual lines of user input, evaluates them, and then outputs the result. Input and output use stdin and stdout respectively, or any Node.js stream. REPL is mainly used for testing, debugging, or experimenting, as it helps execute ad-hoc JavaScript statements. The repl module exports the repl.REPLServer class, which supports automatic completion of multi-line inputs, Emacs-style line editing, ANSI-styled output, saving and restoring the current REPL session state, error recovery, and customizable evaluation functions. The REPL environment can be started by opening a terminal (Unix/Linux) or command prompt (Windows) and typing "node". Some of the commands supported by the REPL environment are below:
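As a minimal sketch of embedding a REPL in an application (the prompt string and the exposed value are illustrative):

var repl = require('repl');

// Start a custom REPL; everything typed at the prompt is evaluated as JavaScript.
var server = repl.start({ prompt: 'myapp> ' });

// Expose a value to the session through the REPL context.
server.context.greeting = 'hello from the embedded REPL';

Typing greeting at the myapp> prompt then prints the exposed string.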
The test pyramid is a pictorial representation of the ratio of unit tests, integration tests and end-to-end tests required for developing a good-quality Node.js project.
Unit tests help to check the working of a single component or module. All dependencies are stubbed providing tests for exposed methods. Modules used for Node.js Unit Testing are:
Some of the above tools, for example SuperTest, Mocha and Chai, can also be used for integration tests, which detect defects at an early stage. Integration tests run faster than end-to-end tests.
Testing your application through its user interface is the most popular end-to-end way of testing any application. End-to-end tests give us confidence that the system works well as a whole. Their main disadvantage is that they require a lot of maintenance and run slowly.
Promises, in simple words, can be explained as advanced callback functions. Whenever multiple callback functions would need to be nested together, promises can be used instead. Promises avoid the callback hell produced by nesting many callback functions. A promise can be in one of three states: pending (the initial state), fulfilled, or rejected; the outcome is handled through the 'then' clause.
Let’s take the example of reading a file and parsing it as JSON
1. Synchronous method of writing the code
function readJSONSync(filename) {
  return JSON.parse(fs.readFileSync(filename, 'utf8'));
}
2. Asynchronous method of writing the code using a callback. Introducing callbacks makes the I/O functions asynchronous.
function readJSON(filename, callback) {
  fs.readFile(filename, 'utf8', function (err, res) {
    if (err) return callback(err);
    callback(null, JSON.parse(res));
  });
}
The extra callback parameter is a bit confusing, so we replace it with a promise:
3. Implementation Using Promise
function readFile(filename, enc) {
  return new Promise(function (fulfill, reject) {
    fs.readFile(filename, enc, function (err, res) {
      if (err) reject(err);
      else fulfill(res);
    });
  });
}
Here we use “new Promise” to construct the promise and pass a function to the constructor with two arguments. The first one fulfills the promise and the second one rejects the promise.
To start working with the "promise" module, we first need to install it using the command:
npm install promise
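Using the promise-returning readFile defined above (the file name here is illustrative), the result is consumed with then and errors with catch:

readFile('input.json', 'utf8')
  .then(function (res) {
    console.log(JSON.parse(res)); // runs when the promise is fulfilled
  })
  .catch(function (err) {
    console.error(err); // runs when the promise is rejected
  });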
libuv is a multi-platform support library used by Node.js for asynchronous I/O. It is written in C. It was developed for Node.js, but it is also used by Luvit, Julia, pyuv, etc. The libuv library handles the file system, DNS, child processes, pipes, signal handling, polling and streaming, and it provides the event loop to Node.js. The important features of libuv are:
In event-driven programming, an application watches for certain events and responds to them when they occur. libuv gathers events from the operating system or other sources, and the user registers callbacks to be called when an event occurs.
Some examples of events are:
libuv also provides two types of abstractions to users: handles and requests. Handles represent long-lived objects, like a TCP server handle whose connection callback is called every time there is a new connection. Requests are short-lived operations performed on a handle, like a write request that writes data to a handle.
Node.js follows a single-threaded event loop model architecture. One process on one CPU is not always enough to handle the application workload, so we create child processes. The "child_process" module supports child processes in Node.js. These child processes can communicate with each other using a built-in messaging system. Child processes can be created in four different ways in Node.js: spawn(), fork(), exec(), and execFile().
The spawn() method launches child processes asynchronously. It is designed to run system commands in their own process. With spawn(), no new V8 instance is created and only one copy of your Node module is active. spawn() can be used when your child process returns a large amount of data to Node.
Syntax:
child_process.spawn(command[, args][, options])
The spawn method returns streams (stdout and stderr); its main advantages are below, followed by a usage sketch:
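A hedged usage sketch (the command and its arguments are illustrative):

var spawn = require('child_process').spawn;

// Run a system command in its own process.
var ls = spawn('ls', ['-lh', '/usr']);

ls.stdout.on('data', function (data) { // stdout arrives as a stream, chunk by chunk
  console.log('stdout: ' + data);
});

ls.on('close', function (code) {
  console.log('child process exited with code ' + code);
});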
With fork(), a fresh instance of the V8 engine is created. The fork() method can be used as below:
Syntax:
child_process.fork(modulePath[, args][, options])
With fork(), a communication channel is established between the parent and child processes, and an object is returned. We use the EventEmitter interface to exchange messages between them.
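A hedged sketch of that channel, assuming a child script named child.js sits alongside the parent:

// parent.js
var fork = require('child_process').fork;
var child = fork('./child.js'); // a fresh V8 instance for the child

child.on('message', function (msg) { // messages arriving from the child
  console.log('parent received:', msg);
});
child.send({ hello: 'from parent' }); // messages going to the child

// child.js
process.on('message', function (msg) {
  console.log('child received:', msg);
  process.send({ hello: 'from child' });
});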
Node.js is a web application framework built on Google Chrome's JavaScript engine (the V8 engine).
Node.js comes with a runtime environment in which JavaScript-based code can be interpreted and executed (it is analogous to the JVM for Java bytecode). This runtime allows JavaScript code to be executed on any machine outside a browser. Because of this runtime, JavaScript can now be executed on the server too.
Node.js also provides a rich library of various JavaScript modules which simplifies the development of web applications to a great extent.
Node.js = Runtime Environment + JavaScript Library
JavaScript is asynchronous in nature, and so is Node. Asynchronous programming is a design pattern that ensures non-blocking code execution.
Non-blocking code does not prevent the execution of other pieces of code. If we execute synchronously, i.e. one statement after another, we needlessly hold up code that does not depend on the statement currently executing.
An asynchronous API makes a request for resources, services or data that is fulfilled at a later time, when they are available. In other words, asynchronous code executes without depending on strict ordering. This improves system efficiency and throughput.
Asynchronous programs allow quicker execution, but at a cost: they are harder to write, and more often than not we end up in callback-hell situations.
This section explains the asynchronous situations you may face while coding.
All APIs of the Node.js library are asynchronous, that is, non-blocking. It essentially means that a Node.js-based server never waits for an API to return data. The server moves on to the next API after calling it, and a notification mechanism based on Node.js events helps the server get the response from the previous API call.
Stubs are used during top-down integration testing to simulate the behaviour of lower-level modules that are not yet integrated. Stubs are modules that act as a temporary replacement for the module under consideration and give the same output as the real component.
Stubs are also used when the software needs to interact with an external system.
Stubs are functions or programs that mimic the behaviour of a component or module. They give predetermined responses to function calls made during testing; a hand-rolled sketch is shown below.
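A hand-rolled sketch of the idea (the function names are illustrative; libraries such as Sinon provide ready-made stubs):

// The real module would talk to an external system.
function fetchUserFromDb(id) {
  // ...would query a real database here...
}

// The stub gives a predetermined response to calls made during testing.
function fetchUserStub(id) {
  return { id: id, name: 'Test User' };
}

// The code under test accepts the dependency, so tests can pass the stub.
function greetUser(id, fetchUser) {
  var user = fetchUser(id);
  return 'Hello, ' + user.name;
}

console.log(greetUser(1, fetchUserStub)); // "Hello, Test User"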
Stub Workflow
The figure shows that Modules 1, 2 and 3 are ready for testing, whereas all remaining modules are still in development. The order of integration is as below:-
The test pyramid concept was given by Mike Cohn. Its essential point is that you should have many more low-level unit tests than high-level end-to-end tests running through a GUI. A test pyramid depicts the proportions of unit tests, integration tests and end-to-end tests you ought to write.
Advantages of Node.js are mentioned as below:-
Disadvantages of Node.js are mentioned as below:-
The following tools are very popular:-
Node.js is an open source, server-side JavaScript runtime environment built on Chrome's JavaScript engine (V8). Node.js is used to build fast and scalable applications and follows an event-driven, non-blocking I/O model.
REPL (READ, EVAL, PRINT, LOOP) is a computer environment similar to a Unix/Linux shell or a command prompt. A command is entered and the system responds with an output. Node comes with the REPL environment when it is installed, and it interacts with the user through the output of the commands/expressions entered.
Node.js or Node comes bundled with a REPL environment that performs the following desired tasks.
We have the repl module, which is available for standalone applications as well, and it can be loaded with the command below:-
const repl = require('repl');
The underscore variable is a special variable which stores the value of the last evaluated expression. Whenever you need to use the output of one expression as the input of the next, you may use this variable. It serves a similar purpose to $? in bash. Please find below an example for a more detailed explanation:-
Use _ to get the last result.
C:\Nodejs_WorkSpace>node
> var x = 10
undefined
> var y = 20
undefined
> x + y
30
> var sum = _
undefined
> console.log(sum)
30
undefined
>
In version 6.x or higher, the underscore variable gives you the following result:-
> [ 'a', 'b', 'c' ]
[ 'a', 'b', 'c' ]
> _.length
3
> _ += 1
Expression assignment to _ now disabled.
4
> 1 + 1
2
> _
4
In older versions you will get a different result:-
> [ 'a', 'b', 'c' ]
[ 'a', 'b', 'c' ]
> _.length
3
> _ += 1
4
> 1 + 1
2
> _
2
Applications made using Node.js run on a single thread, meaning a request is received and processed before the next one is taken up. So if you are developing a streaming application or an event-based application, Node.js is best suited for it. Some examples of such applications are as follows:-
The package.json file is at the core of the Node.js ecosystem. It is the main file of any Node.js project and contains complete information about the project. Developers need a deep understanding of package.json to work with Node.js; it is the first step in learning Node.js development. package.json is mostly used for defining the properties of a package, and it is always present in the root directory of any Node.js application. The most common attributes of package.json are as follows:-
Node Package Manager (NPM) provides two primary functionalities −
www.npmjs.com hosts thousands of free packages to download and use. The npm program is installed on your computer when you install Node.js. There might be a scenario where we need to either uninstall a dependency or update an existing one; please use the following command for uninstalling any dependency through npm:
C:\Nodejs_WorkSpace>npm uninstall <Name of dependency here>
Please use the following command for updating any dependency through npm:
C:\Nodejs_WorkSpace>npm update <Name of dependency here>
As of npm@5.0.0, npm update changes package.json to save the new version as the minimum required dependency; to get the old behavior, use npm update --no-save. To update nested dependencies as in older versions, use npm --depth 9999 update.
In general programming terms, event-driven programming is a programming technique in which the flow of the program is determined by events such as user actions (mouse clicks, key presses), sensor outputs, or messages from other programs or threads.
As we know, Node.js is a single-threaded application, but it supports concurrency through events and callback functions. Event-driven programming runs on a request-and-response technique. We define a target, which may be a button or a click event, so whenever our application receives a request on that target, it accepts and processes the request and provides a response back to the user. This is usually achieved through a callback function. You can review the below example for reference:-
function addtoStudent(studentId) {
  event.send("student.add", { id: studentId });
}

event.on("student.add", function (event) {
  show("Adding New Student " + event.id);
});
npm is the world's largest software repository. Globally, open source developers use npm to share and borrow packages; for example, you need to install Node and npm before getting the packages necessary for Angular development. Packages are needed to bring modularity to code development.
npm consists of three distinct components:
Some of the uses of npm are:
Blocking program
Example: Let us create text file named blk.txt
blk.txt
A technology that has emerged as the frontrunner for handling Big Data processing is Hadoop.
Create a JavaScript file as follows and save it as blk.js:
var fs = require("fs"); var data = fs.readFileSync('blk.txt'); console.log(data.toString()); console.log('Program Ended');
Execute the code. The result is
A technology that has emerged as the frontrunner for handling Big Data processing is Hadoop.
Program Ended
In this example, the program blocks until it has read the file, and only then proceeds to the next statement and the end of the program.
Non Blocking program
Example: Use the same input file defined for blocking code example.
Create a JavaScript file as follows and save it as nblk.js:
var fs = require("fs"); fs.readFile('blk.txt', function (err,data) { if (err) return console.error(err); console.log(data.toString()); }); console.log("Program Ended");
Execute the code. The result is
Program Ended
A technology that has emerged as the frontrunner for handling Big Data processing is Hadoop.
This example shows that the program doesn't wait for the file read and prints "Program Ended" first. Meanwhile, the program continues to read the file without blocking.
Every action on a computer is an event.
Example: File opening is an event.
Objects in Node.js can fire events; for example, a createReadStream object fires events when opening or closing a file.
Example: to read a stream of characters from an existing file trnk.txt:
var fs = require("fs"); var rdstream = fs.createReadStream('trnk.txt'); rdstream.on('open', function(){ console.log("File Opened"); });
Executing the above code, you will get the result:
File Opened
You can create, fire and listen to your own events using the Events module's EventEmitter class.
The Events module and the EventEmitter class are used to bind events to event listeners.
To fire an event, use eventEmitter.emit('eventName').
To bind an event handler to an event, use eventEmitter.on('eventName', eventHandler).
Example: Refer to the following example.
// Import the Events module
var events = require('events');

// Create an EventEmitter object
var eventEmitter = new events.EventEmitter();

// Create an event handler
var myEventHandler = function () {
  console.log('I have completed');
}

// Assign the event handler to an event
eventEmitter.on('complete', myEventHandler);

// Fire the complete event
eventEmitter.emit('complete');
Following is the result after executing the code.
Result: I have completed
The buffer module provides a way of handling streams of binary data.
Node.js implements buffers using the Buffer class.
Typically, the movement of data is done with the purpose of processing it, or reading it and making decisions based on it. But there is a minimum and a maximum amount of data a process can handle over time. So if the rate at which the data arrives is faster than the rate at which the process consumes it, the excess data needs to wait somewhere for its turn to be processed.
On the other hand, if the process consumes data faster than it arrives, the data that arrives earlier needs to wait for a certain amount of data to accumulate before being sent out for processing.
That "waiting area" is the buffer! It is a small physical location in your computer, usually in RAM, where data is temporarily gathered, waits, and is eventually sent out for processing during streaming.
Example: you can see buffering in action when you try to read an e-book (say, 500 pages with graphics) in Google Books. If the internet is fast enough, the stream fills up the buffer and sends it out for processing, then fills another one and sends it out, until the stream is finished.
If your internet connection is slow, Google Books displays a loading icon, which means it is gathering or expecting more data. When the buffer is filled up and processed, Google Books shows the page. While the page is displayed, more data continues to arrive and wait in the buffer.
No. The Buffer class is part of the global modules, so it is available without require().
Example:
How to convert the binary equivalent of the string "xyz" to JSON format:
Create a JavaScript file with the following code.

var buf = Buffer.from('xyz');
console.log(buf.toJSON());

In the first line, buf is the variable and Buffer is the Buffer class. Using the toJSON() method we can convert the buffer, as shown in the result below.
Execute the code. Following is the result:
{ type: 'Buffer', data: [ 120, 121, 122 ] }
Duplex: Duplex streams are streams that implement both the readable and writable interfaces.
Examples:
Transform: A type of duplex stream where the output is in some way related to the input. Like duplex streams, transform streams also implement both the Readable and Writable interfaces.
Examples:
Piping is a mechanism in which the output of one stream is provided as the input to another stream.
There is no limit on piping operations.
Example: Create a text file dataSinput.txt with the following content.
After executing the following code, you can view the contents in the output file.
var fs = require("fs"); //import fs module //creating a readstream to read our inputdatafile dataSinput.txt var readStream = fs.createReadStream("F://dataSinput.txt"); //creating a writestream(initially empty) which is destination for transferred data var writeStream = fs.createWriteStream("F://dataSoutput.txt"); //Use Pipe command to transfer from readstream to writestream. //Pipe command takes all data from readstream and pushes it to writestream readStream.pipe(writeStream);
Output in dataSoutput.txt can be seen as
Chaining is a mechanism of connecting the output of one stream to another stream, creating a chain of multiple stream operations. It is normally used with piping.
Example: Create a text file dataSinput.txt with the following content.
After executing the following code:
var fs = require("fs"); //import fs module var zlib = require("zlib"); //import zlib module //creating a readstream to read our inputdatafile dataSinput.txt var readStream = fs.createReadStream("F://dataSinput.txt"); //create a compressed folder zlib var czlib = zlib.createGzip(); //creating a writestream(initially empty) which is destination for transferred data var writeStream = fs.createWriteStream("F://dataSoutput.txt.gz"); //Use Pipe command to transfer from readstream to gzip. //Pipe commands takes all data from readstream and pushes it to compressed writestream file readStream.pipe(czlib).pipe(writeStream); console.log("File Compressed");
You get the result "File Compressed", and the compressed file dataSoutput.txt.gz as output, which contains the compressed contents of dataSinput.txt.
Yes. Every method of the fs module supports both synchronous and asynchronous forms.
Asynchronous methods take a completion callback as their last parameter, and the first parameter of that callback is reserved for an error.
It is better to use the asynchronous methods, as they never block the program during execution, whereas synchronous methods do block it.
Import the fs module and declare a buffer.
Example: Create a text file named trnk.txt with Knowledgehut tutorials as text in it
Create a JavaScript file as follows. Save it as trnkexmpl.js:
var fs = require("fs"); //import module var buf = new Buffer(1024); //define buffer fs.open('trnk.txt', 'r+', function (err,fd) { if (err) { return console.error(err); } console.log("File opened"); //Truncate the open file) fs.ftruncate(fd, 12, function(err) { if (err) { return console.log(err); } console.log("File Truncated") console.log("Going to read same file") fs.read(fd, buf, 0, buf.length, 0 ,function(err, bytes) { if(err) { return console.log(err); } //Print only read bytes if(bytes > 0) { console.log(buf.slice(0, bytes).toString()); } //Close the opened file fs.close(fd, function(err){ if (err) { console.log(err); } console.log("File closed successfully"); }); }); }); });
Execute the code. You get the following result:
Create a text file “demo_file_del.txt” file to be deleted.
var fs = require("fs"); //import fs module console.log("going to delete demo_file_del.txt file") fs.unlink('demo_file_del.txt', function(err) { // call unlink method) if (err) { return console.err(err); } console.log("File deleted successfully") });
Result:
The "error-first" callback is also known as an "errorback", "errback", or "node-style callback". Error-first callbacks are used to pass errors as well as data. You pass the error as the first parameter, and you must check whether something went wrong. Additional arguments are used to pass data. There are two rules for defining an error-first callback:
Code example is as below:-
fs.readFile(filePath, function (err, data) {
  if (err) {
    // handle the error; the return is important here
    // so execution stops at this point
    return console.log(err);
  }
  // use the data object
  console.log(data);
});
Callback hell can be eliminated with promises and their close partner, generators. Callback hell consists of numerous nested callbacks, which make code hard to read and debug. One may unknowingly get caught in callback hell while handling asynchronous logic; it is a phenomenon that plagues JavaScript developers who try to execute multiple asynchronous operations one after another. You may use any of the below options for resolving the issue of callback hell:
Promises are a concurrency primitive, first described in the 1980s. They are now part of most modern programming languages to make your life easier. Promises can help you handle async operations.
Please see the below example, which will help you understand the code simplicity and scalability using or without using Promise.
Code example without using Promise:-
var MongoClient = require('mongodb').MongoClient;
var url = 'mongodb://localhost/EmployeeDB';

MongoClient.connect(url, function (err, db) {
  db.collection('Employee').insertOne({
    Employeeid: 4,
    EmployeeName: "NewEmployee"
  });
});
The function(err, db) portion of the above code is known as an anonymous or callback function declaration. When the MongoClient connects to the MongoDB database, it invokes the callback function once the connection is complete. In a way, the connection happens in the background, and our callback is called when it is done. Note that this is one of Node.js's key traits: allowing multiple operations to occur concurrently without blocking any user from carrying out an operation.
So then, what is a promise? A promise in Node.js is simply an improvement on callback functions. During the software lifecycle there may be instances where several callback functions need to be nested together. At some point this can get messy and difficult to maintain. In short, a promise is an improvement over callbacks that aims to mitigate these issues.
The basic syntax of a promise is shown below;
var promise = doSomethingAsync();
promise.then(onFulfilled, onRejected);
Code example using Promise:-
var Promise = require('promise');
var MongoClient = require('mongodb').MongoClient;
var url = 'mongodb://localhost/EmployeeDB';

MongoClient.connect(url)
  .then(function (db) { // the then handler receives the database object
    db.collection('Employee').updateOne(
      { "EmployeeName": "Martin" },
      { $set: { "EmployeeName": "Mohan" } }
    );
  });
We can easily load HTML in Node.js by changing the content type. For plain text the content type is defined as "text/plain"; to serve HTML from Node.js we change it to "text/html". Please see the example below for more detail:-
fs.readFile(filename, "binary", function (err, file) {
  if (err) {
    response.writeHead(500, { "Content-Type": "text/plain" });
    response.write(err + "\n");
    response.end();
    return;
  }
  response.writeHead(200);
  response.write(file, "binary");
  response.end();
});
Now we modify the above code to load an HTML page instead of plain text, as below:-
fs.readFile(filename, "binary", function (err, file) {
  if (err) {
    response.writeHead(500, { "Content-Type": "text/html" });
    response.write(err + "\n");
    response.end();
    return;
  }
  response.writeHead(200, { "Content-Type": "text/html" });
  response.write(file);
  response.end();
});
The mechanism designed to handle asynchronous callbacks is known as the event loop. Node.js is a single-threaded, event-driven runtime. We can attach a listener to a request in Node, and whenever it is triggered by a known request, the listener accepts and processes it based on the predefined callback functions we have set up in our application.
Whenever we call setTimeout, http.get or fs.readFile, Node.js runs these operations and continues to run other code without waiting for the output. When the operation is done, Node.js gets the output and runs our callback function.
So all the callback functions are queued in a loop and run one by one once the response is received.
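A minimal sketch of that ordering (the messages are illustrative):

console.log('start');

setTimeout(function () {
  console.log('timer callback'); // queued; runs after the current code finishes
}, 0);

console.log('end'); // prints before 'timer callback' even with a 0 ms delay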
module.exports (or exports) is a special object which is included in every JS file in a Node.js application by default. module is a variable that represents the current module, and exports is an object that will be exposed as the module's public interface. So, whatever you assign to module.exports (or exports) will be exposed as a module. Each module in Node.js has its own scope and cannot interfere with other modules.
A module encapsulates related code into a single unit of code. This can be interpreted as moving all related functions into one file. Imagine that we created a file called mathematics.js and it contains the following two functions:
module.exports = {
  printAdditionInMaths: function () { return "+"; },
  printSubtractionInMaths: function () { return "-"; }
};
In the above example, module.exports exposes two functions, which can be called from any other program as shown below:-
var mathematics = require("./mathematics.js");
mathematics.printAdditionInMaths();
mathematics.printSubtractionInMaths();
Asynchronous simply means not synchronous. If we make HTTP requests that are asynchronous, it means we do not wait for the server's response before doing other work.
The term non-blocking is widely used with I/O. For example, non-blocking read/write calls return immediately with whatever they can do and expect the caller to execute the call again if needed. A blocking read, by contrast, waits until it has some data, putting the calling thread to sleep.
An asynchronous call requests a transfer that will be performed in its entirety but will complete at some future time. Non-blocking means the call will not wait while on the stack. Synchronous is defined as happening at the same time; asynchronous is defined as not happening at the same time.
Please review the below example to understand better:-
Synchronous & blocking
Synchronous & non-blocking
Asynchronous
A reusable block of code whose presence does not impact other code is called a module. Modules were introduced in ES6. Modules are important for the maintainability, reusability, and namespacing of code.
A module in Node.js is a simple or complex piece of functionality, organized in one or more JavaScript files, which can be reused throughout a Node.js application.
Every module in Node.js has its own context, so it cannot interfere with other modules or pollute the global scope. Additionally, every module can be placed in a separate .js file under a separate folder.
Node.js implements the CommonJS modules standard. CommonJS is a group of volunteers who define JavaScript standards for web server, desktop, and console applications.
Node.js includes three types of modules:
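These are commonly described as core, local, and third-party modules; as a hedged sketch (the local path and package name are illustrative), all three kinds are loaded the same way with require():

// Core module: ships with Node.js itself.
var fs = require('fs');

// Local module: your own file, referenced by a relative path.
var mathematics = require('./mathematics.js');

// Third-party module: installed into node_modules via npm.
var express = require('express');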
libuv is a fabulous asynchronous I/O library. It has a highly efficient event loop, as well as a separate solution for blocking I/O operations: an internal worker thread pool (e.g. blocking work can be submitted through uv_queue_work). So it achieves excellent performance by combining an asynchronous event loop for non-blocking work with thread pools for blocking I/O, and it is a good choice for a high-performance server.
If the synchronous thread-pool model is what you are used to on an everyday basis, you may find the asynchronous model somewhat hard, particularly when you have to decide the best time to release the handles. If you do not get that right, libuv will crash and make your troubleshooting difficult.
libuv is a cross-platform I/O abstraction library that supports asynchronous I/O based on event loops. It is written in C and released under the MIT licence.
libuv supports Windows IOCP, epoll(4), kqueue(2), and Solaris event ports. It was initially designed for Node.js, but it is now also used by other software projects.
Reference: https://en.wikipedia.org/wiki/Libuv
The zlib module provides a way to compress and decompress (zip and unzip) data. zlib is a cross-platform data compression library written by Jean-loup Gailly and Mark Adler. In Node.js, zlib is used for threadpool work, HTTP request and response compression, and memory usage tuning. zlib is a core Node.js module, so no separate installation is needed. Below is sample code using zlib.
var Buffer = require('buffer').Buffer;
var zlib = require('zlib');

var input = new Buffer('lorem ipsum dolor sit amet');
var compressed = zlib.deflateSync(input); // compress the buffer
var output = zlib.inflateSync(compressed); // decompress back to the original
Following are a few important Zlib properties and Methods:-
The Node.js file system module is capable of working with files either on your local system or on a server, but to work with the file system we have to load the fs module by calling require(). Through Node.js we can even serve a file over HTTP and perform read operations on its content.
Node.js reads the content of a file in a non-blocking, asynchronous way. Node.js uses its fs core API to deal with files. The simplest way to read the entire contents of a file in Node.js is with the fs.readFile method. Below is sample code to read a file in Node.js asynchronously and synchronously.
Reading a file in Node asynchronously (non-blocking):

var fs = require('fs');
fs.readFile('DATA', 'utf8', function (err, contents) {
  console.log(contents);
});
console.log('after calling readFile');

Reading a file in Node synchronously (blocking):

var fs = require('fs');
var contents = fs.readFileSync('DATA', 'utf8');
console.log(contents);
Streams are collections of data, just like arrays or strings. The difference is that streams might not be available all at once, and they don't have to fit in memory. This makes streams extremely powerful when working with large amounts of data, or with data that arrives from an external source one chunk at a time.
However, streams are not just about working with big data. They also give us the power of composability in our code. Just as we can create powerful Linux commands by piping together smaller Linux commands, we can do exactly the same in Node with streams.
Streams are special kinds of objects in Node that enable us to read data from a source or write data to a destination continuously. There are 4 kinds of streams available in Node.js; they are:
Node.js is a runtime framework that has become popular with most coders and is used widely for creating server-side applications. Node.js is best known for creating real-time APIs and building a new layer of interoperability across the web. There are two ways in which a file can be read and sent for execution in Node.js: readFile and createReadStream. Please see a few major differences between readFile and createReadStream below:-
The Node.js file system module is capable of working with files either on your local system or on a server, but to work with the file system we have to load the fs module by calling require(). This can be understood from the following example:-
var fs = require('fs');
Basic activities or operations which are required for working on file are as mentioned below:-
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
  fs.readFile('demofile1.html', function (err, data) {
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.write(data);
    res.end();
  });
}).listen(8080);
fs.open(); fs.writeFile();
fs.writeFile();
The code that runs between multiple asynchronous function calls is called the control flow. You may describe it as a function because it takes some input and provides an output to you or to the next function. Control flow usually manages execution using the following steps:-
We have three different patterns which can explain more about control flow functions (a serial-execution sketch follows below):-
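A hedged sketch of the serial pattern (the task bodies are illustrative): a small runner controls the order, and each task signals completion through a callback:

// Run asynchronous tasks one after another; the runner controls the order.
function runInSeries(tasks, done) {
  var i = 0;
  function next(err) {
    if (err) return done(err); // stop the chain on the first error
    if (i === tasks.length) return done(null); // all tasks finished
    var task = tasks[i++];
    task(next); // each task calls next() when it completes
  }
  next(null);
}

runInSeries([
  function (cb) { setTimeout(function () { console.log('task 1'); cb(); }, 100); },
  function (cb) { setTimeout(function () { console.log('task 2'); cb(); }, 50); }
], function (err) {
  console.log(err ? 'failed' : 'all done'); // prints: task 1, task 2, all done
});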
For those coming from Java development background , here is the analogy.
This is a frequently asked question in Node js interview questions for freshers.
It is not advisable to use Node.js for CPU-intensive applications
Because node.js is designed around using a single thread very efficiently. Its event-based model dispatches code fragments when specific events occur. Those code fragments are supposed to execute very quickly and then return control to node.js, which then dispatches the next event.
If one of those code fragments performs a long-running task, then no more events will be dispatched, and the whole system appears to hang.
Microsoft, Paypal, Uber
Here is the list of important features of Node.JS
Expect to come across this popular question in Nodejs interview questions.
One of the most frequently posed Node js interview questions, be ready for it.
Streams are a way of handling the following:
It is the movement of data from one point to another.
When a program is supposed to read a file consisting of single page(three to four lines), it will be initially read into memory from start to finish and then starts processing. If the file is an e-book consisting of 500+ pages, then it takes a lot of storage space and time to be loaded into memory before starting processing. This is where Streams make a difference.
Using streams, you read it piece by piece, processing its content without keeping it in memory.
The following are the advantages of streams:
The following code statement refers to the stream module:
const stream = require(‘Stream’);
The following are the types of streams in Node.Js:
1) Readable Streams: A stream where you can recieve data but cannot send it. When you push data into a readable stream , it is buffered, until customer starts to read the data
Example: HTTP Requests to the server. Based on the HTTP request, server sends HTTP response to the client which is a readable stream. Another example is RSS feed posted by remote servers on the HTTP clients are readonly streams.
Module used: fs.createReadStream
2) Writable Streams: A stream where you can send the data but not recieve from it.
Example: user entered data on HTTP clients go as HTTP Responses to the server where data is written.
Module used: fs.createWriteStream()
3) Duplex Streams: Streams that are both readable and writable.
Example: TCP sockets
Module used: net.socket
4) Transform Streams: A type of duplex streams where output is in someway related to the input. Like duplex streams, Transform streams also implement both Readable and Writable interfaces.
Node.js is a javascript server environment built on Google Chrome’s Javascript engine(V8 Engine). It’s very well suited for highly scalable I/O intensive web applications, real-time web applications and network applications. Node.js is the server side technology for the MEAN stack which is one of the most popular software stack for building dynamic web sites and web applications .Node.js is open source and hence has great community support. Main features of Node.js are
Node.js is often not preferred with relational database and CPU intensive applications.
This question is a regular feature in Nodejs interview questions for freshers, be ready to tackle it.
Npm or Node Package Manager is the default package manager for Node.js.It works as:
Few of the important npm commands are:
“ npm install <name of the package> “.
This will install the module under the path, “./node_modules/”. Once installed the module could be used just like they were built-ins. Dependency management could also be done with npm. Our node project will have a package.json which will have all the dependencies needed for our project. If we perform “npm install” from project root all the dependencies listed in the package.json file will be installed.
“npm <command> -h”
A callback function is called at the end of a specific task or simply when another function has finished executing. Callback functions are used exclusively to support the asynchronous feature or non-blocking feature of Node.js. In the asynchronous programming with Node.js, servers don’t wait for any action like API call to complete to start a new API call.
For eg:
Let’s read a file say “input.txt” and print its output in the console.The synchronous or blocking code for the above requirement is shown below:
var fs = require("fs"); var data = fs.readFileSync('input.txt'); // execution stops and waits for the read to finish console.log(data.toString()); console.log("Program Ended"); Let’s rephrase the code with the callback function. var fs = require("fs"); fs.readFile('input.txt', function (err, data) { if (err) return console.error(err); console.log(data.toString()); }); console.log("Program Ended");
Here program does not wait for reading the file to complete but proceeds to print "Program Ended".If an error occurs during the read function readFile(), err object will contain the corresponding error and if the read is successful data object will contain the contents of the file. readFile() passes err and data object to the callback function after the read operation is complete, which finally prints the content.
Pyramid of Doom or Callback hell happens when the node.js programs are very complex in nature and having heavily nested callback functions. The name is attained by the pattern caused by nested callbacks which are unreadable.
For eg:
Let’s assume that we have 3 different asynchronous tasks and each one depends on the previous result causing a mess in our code.
asyncFuncA(function(x){ asyncFuncB(x, function(y){ asyncFuncC(y, function(z){ ... }); }); });
Callback hell could be avoided by the following methods :
There are two types of API functions available in Node.js:
1.Synchronous or Blocking functions where all other code execution is blocked till an I/O event that is being waited on completes. These are executed synchronously one after the other.
For eg:
Reading a file called ‘file.txt’
const fs = require('fs'); const data = fs.readFileSync('file.txt’); // blocks here until file is read
Here the execution of further lines in the program will be blocked. If any error is thrown it needs to be caught immediately to avoid the crashing of the program.readFileSync() completely reads the content to the memory and then prints the data in the console. The blocking function has an adverse effect on the application’s performance.
2.Asynchronous or Non-blocking functions are another type of API functions where multiple I/O calls can be performed without the execution of the program being blocked.
For eg:
Reading a file “file.txt”
const fs = require('fs'); fs.readFile('file.txt’, function(err, data) => { if (err) throw err; });
Here reading of the file (readFile()) doesn’t block further execution of the next instruction in the program. The above function takes the file name and passes the data of the file as a reference to the callback handler. Then the file system object remains ready to take up any other file system operation. Asynchronous API functions increase the throughput by increasing the number of instructions handled per cycle time.
Streams are abstract interface available with Node.js.The stream module helps in implementation of streaming data. There are four types of streams.
The important events on a readable stream are:
For eg:
Reading a file “input.txt”
var fs = require('fs'); var readableStream = fs.createReadStream('input.txt'); // creates readable stream var data = ''; readableStream.on('data', function(txt) { // data event produces the flow of data data+=txt; }); readableStream.on('end', function() // end event is triggered when no data to read { console.log(data); });
The important events on a writable stream are:
For eg:
Write “Hello World “ to file.txt
var fs = require("fs"); var data = 'Hello world'; var writerStream = fs.createWriteStream('file.txt'); // Create a writable stream writerStream.write(data,'UTF8'); // Write the data to stream // Mark the end of file writerStream.end(); writerStream.on('finish', function() { // finish triggered when all data is written to console.log("Write completed."); });
Piping the streams is one of the most popular mechanisms in Node.js programs where output of one stream is provided as the input to another stream.For eg:
var fs = require("fs"); var readerStream = fs.createReadStream('example.txt'); // Readable stream var writerStream = fs.createWriteStream('exampleOutput.txt'); // Writable stream readerStream.pipe(writerStream);// Pipe the readable stream as input to writable stream console.log("Program Ended");
REPL module in Node.js is Read-Eval-Print-Loop (REPL) implementation. It’s just like a shell and command prompt.REPL is available both as a standalone program or included in other applications. It can be accessed using the command:
“const repl = require('repl');”
REPL accept individual lines of user input, evaluate them and then output the result. Input and output use stdin and stdout, respectively, or use any Node.js stream.REPL is mainly used for testing, debugging, or experimenting as it helps to execute ad-hoc javascript statements. The repl module exports the “repl.REPLServer” class which supports automatic completion of multi-line inputs, Emacs-style line editing, ANSI-styled output, saving and restoring current REPL session state, error recovery, and customizable evaluation functions. REPL environment could be started by the opening terminal in case of Unix/Linux or command prompt in case of windows and typing “node”. Some of the commands supported by the REPL environment are below:
Test pyramid is the pictorial representation of the ratio of unit tests, integration tests and end-to-end tests required for developing a good quality node.js project.
Unit tests help to check the working of a single component or module. All dependencies are stubbed providing tests for exposed methods. Modules used for Node.js Unit Testing are:
Some of the above tools could also be used for integration tests for eg: SuperTest, Mocha and Chai that detects the defects in the early stage itself. Integration tests run faster than end-to-end tests.
Testing your application through its user interface is the most popular end-to-end way of testing for any application. End-to-end tests provide us with confidence that our system works all well together. The main disadvantage of end to end testing is that it requires a lot of maintenance and run pretty slowly.
This is a frequently asked question in Node js coding questions.
Promises in simple words could be explained as advanced call-back functions. Whenever multiple callback functions needed to be nested together, Promises could be used. Promises avoid the callback hell produced by nesting together many callback functions. A promise could take up three states defined by the 'then clause'. Fulfilled state, rejected state, and pending state which is the initial state of promise.
Let’s take the example of reading a file and parsing it as JSON
1. Synchronous method of writing the code
function readJSONSync(filename) { return JSON.parse(fs.readFileSync(filename, 'utf8')); }
2. Asynchronous method of writing the code using callback. Introducing callbacks make all I/O functions asynchronous.
function readJSON(filename, callback){ fs.readFile(filename, 'utf8', function (err, res){ if (err) return callback(err); callback(null, JSON.parse(res)); }); }
Here a callback parameter confuses a bit so we replace it with promise
3. Implementation Using Promise
function readFile(filename, enc){ return new Promise(function (fulfill, reject){ fs.readFile(filename, enc, function (err, res){ if (err) reject(err); else fulfill(res); }); }); }
Here we use “new Promise” to construct the promise and pass a function to the constructor with two arguments. The first one fulfills the promise and the second one rejects the promise.
To start working with promises we need to install the “promise” module first using the command
“npm install promise”
A must-know for anyone heading into a Nodejs interview, this question is frequently asked in Node js interview.
libuv is a multi-platform library of Node.js that supports asynchronous I/O. It’s written in C.It was developed for Node.js, but it’s also used by Luvit, Julia, pyuv etc.libuv library handles file system, DNS, child processes, pipes, signal handling, polling and streaming.libuv provides the event loop to Node.js.The important features of libuv are:
In event-driven programming, an application follows certain events and respond to them when they occur. libuv gathers events from the operating system or other sources of events and then user registers callbacks to be called when an event occurs.
Some examples of events are:
Libuv also provides two types of abstractions to users. These are handles and requests. Handles represent long living objects like TCP server handle where its connection callback is called every time when there is a new connection. Requests are short-lived operations performed on the handle, like writing requests to write data on a handle.
Node.js follows a single-threaded, event-loop model architecture. One process on one CPU is not always enough to handle the application workload, so we create child processes. The child_process module supports child processes in Node.js, and these child processes can communicate with each other using a built-in messaging system. Child processes can be created in four different ways in Node.js: spawn(), fork(), exec(), and execFile().
The spawn() method launches child processes asynchronously. It is designed to run system commands in their own process. With spawn() no new V8 instance is created, and only one copy of your Node module is active. spawn() is a good choice when the child process returns a large amount of data to Node.
Syntax:
child_process.spawn(command[, args][, options])
The spawn method returns streams (stdout and stderr), as the sketch below shows, and its main advantages are:
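A minimal sketch of spawn(); the command being run here ('ls' with the '-l' flag) is just an illustrative assumption and needs a Unix-like system:

var spawn = require('child_process').spawn;

// Run a system command in its own process; output arrives as a stream.
var ls = spawn('ls', ['-l']);

ls.stdout.on('data', function (chunk) {
  console.log('stdout: ' + chunk);
});

ls.stderr.on('data', function (chunk) {
  console.error('stderr: ' + chunk);
});

ls.on('close', function (code) {
  console.log('child process exited with code ' + code);
});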
In fork() a fresh instance of the V8 engine is created. The fork() method can be used as below:
Syntax:
child_process.fork(modulePath[, args][, options])
In fork() a communication channel is established between the parent and child processes, and fork() returns an object. Messages are exchanged between them through an EventEmitter-style interface.
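A minimal sketch of fork() messaging, assuming a hypothetical child module child.js sitting next to the parent script:

// parent.js
var fork = require('child_process').fork;
var child = fork('./child.js'); // './child.js' is a hypothetical module path

child.on('message', function (msg) {
  console.log('parent received:', msg);
});
child.send({ hello: 'child' });

// child.js (contents of the hypothetical child module):
// process.on('message', function (msg) {
//   console.log('child received:', msg);
//   process.send({ hello: 'parent' });
// });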
Node.js is a server-side platform built on Google Chrome's JavaScript engine (the V8 engine).
Node.js ships with a runtime environment on which JavaScript code can be interpreted and executed (analogous to the JVM for Java bytecode). This runtime allows JavaScript code to execute on any machine outside a browser; as a result, JavaScript can now run on the server as well.
Node.js also provides a rich library of JavaScript modules, which simplifies the development of web applications to a great extent.
Node.js = Runtime Environment + JavaScript Library
JavaScript is asynchronous in nature, and so is Node. Asynchronous programming is a design pattern that ensures non-blocking code execution.
Non-blocking code does not prevent the execution of other pieces of code. In general, if we execute synchronously, i.e. one statement after another, we needlessly stop the execution of code that does not depend on the one currently executing.
An asynchronous API makes a scheduled request for resources, services, or data to be fulfilled at a later time when they are available. In other words, asynchronous code executes without waiting on unrelated work, which improves system efficiency and throughput.
Asynchronous programming allows faster execution of programs, but it comes at a cost: it is harder to program, and more often than not we end up in callback-hell situations.
This section explains the asynchronous situations you may face while coding.
All APIs of the Node.js library are asynchronous, that is, non-blocking. This essentially means a Node.js based server never waits for an API to return data. The server moves on to the next API after calling it, and Node's event notification mechanism delivers the response from the previous API call.
Stubs are used during top-down integration testing to simulate the behavior of lower-level modules that are not yet integrated. Stubs are modules that act as temporary replacements for the called module and return the same output as the real product.
Stubs are also used when the software needs to interact with an external system.
Stubs are functions or programs that mimic the behavior of a component or module. They give predetermined responses to function calls made during testing, as the sketch below illustrates.
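An illustrative sketch of a hand-rolled stub (the payment gateway and its charge method here are hypothetical, invented only for this example):

// A hypothetical module under test that depends on an external gateway.
function checkout(gateway, amount, callback) {
  gateway.charge(amount, function (err, receipt) {
    if (err) return callback(err);
    callback(null, 'paid: ' + receipt.id);
  });
}

// A stub standing in for the real payment gateway: it gives a
// predetermined response instead of calling the external system.
var gatewayStub = {
  charge: function (amount, cb) { cb(null, { id: 'test-123' }); }
};

checkout(gatewayStub, 100, function (err, result) {
  console.log(result); // "paid: test-123"
});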
Stub Workflow
The stub workflow figure shows that Modules 1, 2, and 3 are ready for testing while all remaining modules are still in development. The order of integration is as below:
The Test Pyramid concept was given by Mike Cohn. Its essential point is that you should have many more low-level unit tests than high-level end-to-end tests running through a GUI. A test pyramid depicts the proportion of unit tests, integration tests, and E2E tests you ought to write.
Advantages of Node.js are mentioned as below:-
Disadvantages of Node.js are mentioned as below:-
The following tools are very popular:-
Node.js is an open-source, server-side JavaScript runtime environment built on Chrome's JavaScript engine (V8). Node.js is used to build fast and scalable applications and follows an event-driven, non-blocking I/O model.
REPL (Read, Eval, Print, Loop) is a computer environment similar to a shell (Unix/Linux) or a command prompt. A command is entered and the system responds with an output. Node comes bundled with the REPL environment when it is installed, and the system interacts with the user through the output of the commands/expressions entered.
Node.js or Node comes bundled with a REPL environment that performs the following desired tasks.
The repl module is available for standalone applications as well, and can be loaded with the following command:
const repl = require('repl');
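A minimal sketch of starting a custom REPL with the repl module (the prompt string is an arbitrary choice):

var repl = require('repl');

// Start a REPL with a custom prompt; it reads, evaluates,
// prints, and loops just like the default node REPL.
repl.start('my-app> ');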
The underscore variable is a special variable that stores the value of the last evaluated expression. Whenever you need to use the output of one expression as the input of the next, you may use this variable. It serves a similar purpose to $? in bash. Please find below an example for a more detailed explanation:
Use _ to get the last result.
C:\Nodejs_WorkSpace>node
> var x = 10
undefined
> var y = 20
undefined
> x + y
30
> var sum = _
undefined
> console.log(sum)
30
undefined
>
In version 6.x or higher, the underscore variable gives you the following result:
> [ 'a', 'b', 'c' ]
[ 'a', 'b', 'c' ]
> _.length
3
> _ += 1
Expression assignment to _ now disabled.
4
> 1 + 1
2
> _
4
In older version you will get a different result:-
> [ 'a', 'b', 'c' ]
[ 'a', 'b', 'c' ]
> _.length
3
> _ += 1
4
> 1 + 1
2
> _
2
Applications made using Node.js run on a single thread, receiving and dispatching one request at a time while I/O proceeds in the background. This makes Node.js best suited for streaming or event-based applications. Some examples of such applications are as follows:
The package.json file is the core of the Node.js ecosystem. It is the main file of any Node.js project and contains the complete information about the project. Developers need a solid understanding of package.json to work with Node.js; it is the first step in learning Node.js development. package.json is mostly used to define the properties of a package, and it is always present in the root directory of any Node.js application. The most common attributes of package.json are:
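A minimal, illustrative package.json; every name and value below is a placeholder, not a prescribed configuration:

{
  "name": "sample-app",
  "version": "1.0.0",
  "description": "Placeholder description",
  "main": "index.js",
  "scripts": {
    "start": "node index.js",
    "test": "mocha"
  },
  "dependencies": {
    "express": "^4.17.1"
  }
}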
Node Package Manager (NPM) provides two primary functionalities −
www.npmjs.com hosts thousands of free packages to download and use. The npm program is installed on your computer when you install Node.js. When we need to uninstall a dependency or update an existing one, we can use the following commands. To uninstall a dependency through npm:
C:\Nodejs_WorkSpace>npm uninstall <name of dependency>
Please use the following command for updating any dependencies through NPM
C:\Nodejs_WorkSpace>npm update <name of dependency>
As of npm@5.0.0, npm update changes package.json to save the new version as the minimum required dependency; to get the old behavior, use npm update --no-save. To update nested dependencies to greater depth as older versions did, use npm --depth 9999 update.
In general programming terms, event-driven programming is a programming technique in which the flow of the program is determined by events such as user actions (mouse clicks, key presses), sensor outputs, or messages from other programs or threads.
As we know, Node.js is single-threaded, but it supports concurrency through events and callback functions. Event-driven programming works on a request-and-response technique: we define a target, which may be any button or click event, and whenever our application receives a request on that target it accepts and processes the request, then provides a response back to the user. This is usually achieved through callback functions. You can review the example below for reference:
var events = require('events');
var emitter = new events.EventEmitter();

// Register a listener for the target event.
emitter.on('student.add', function (payload) {
  console.log('Adding new student ' + payload.id);
});

// Fire the event; the listener above processes the request.
function addToStudent(studentId) {
  emitter.emit('student.add', { id: studentId });
}

addToStudent(42);
npm is the world's largest software repository. Open source developers around the globe use npm to share and borrow packages; for example, you need to install Node and npm before getting the packages necessary for Angular development. Packages bring modularity to code development.
npm consists of three distinct components:
Some of the uses of npm are:
Blocking program
Example: Let us create text file named blk.txt
blk.txt
A technology that has emerged as the frontrunner for handling Big Data processing is Hadoop.
Create a javascript code as follows and save as blk.js
var fs = require("fs"); var data = fs.readFileSync('blk.txt'); console.log(data.toString()); console.log('Program Ended');
Execute the code. The result is
A technology that has emerged as the frontrunner for handling Big Data processing is Hadoop.
Program Ended
In this example, the program blocks until it has read the file; only then does it proceed to the next statement and end the program.
Non Blocking program
Example: Use the same input file defined for blocking code example.
Create a javascript code as follows and save as nblk.js
var fs = require("fs"); fs.readFile('blk.txt', function (err,data) { if (err) return console.error(err); console.log(data.toString()); }); console.log("Program Ended");
Execute the code. The result is
Program Ended
A technology that has emerged as the frontrunner for handling Big Data processing is Hadoop.
This example shows that the program does not wait for the file read and prints “Program Ended” first, while the file continues to be read in the background without blocking.
Every action on a computer is an event.
Example: File opening is an event.
Objects in Node.js can fire events; for example, a createReadStream object fires events when opening or closing a file.
Example: to read a stream of characters in existing file trnk.txt
var fs = require("fs"); var rdstream = fs.createReadStream('trnk.txt'); rdstream.on('open', function(){ console.log("File Opened"); });
Executing above code you will get result as
File Opened
You can create, fire and listen to your events using Events module, eventEmitter class.
The events module and the EventEmitter class are used to bind events and event listeners.
A must-know for anyone heading into a Nodejs interview, this question is frequently asked in Node js interview questions for experienced.
To fire an event, use eventEmitter.emit('eventName').
To bind an event handler to an event, use eventEmitter.on('eventName', eventHandler).
Example: Refer to the following example.
// Import the events module
var events = require('events');

// Create an EventEmitter object
var eventEmitter = new events.EventEmitter();

// Create an event handler
var myEventHandler = function () {
  console.log('I have completed');
};

// Assign the event handler to an event
eventEmitter.on('complete', myEventHandler);

// Fire the complete event
eventEmitter.emit('complete');
Following is the result after executing the code.
Result: I have completed
It's no surprise that this one pops up often in Node js interview questions for experienced professionals.
The buffer module provides a way of handling streams of binary data.
Node.js implements buffers using the Buffer class.
Typically, data is moved with the purpose of processing it, reading it, and making decisions based on it. But there is a minimum and a maximum amount of data a process can handle at a time. So if the rate at which data arrives is faster than the rate at which the process consumes it, the excess data needs to wait somewhere for its turn to be processed.
On the other hand, if the process consumes data faster than it arrives, the data that arrives early needs to wait for a certain amount of data to accumulate before being sent out for processing.
That “waiting area” is the buffer! It is a small physical location in your computer, usually in RAM, where data is temporarily gathered, waits, and is eventually sent out for processing during streaming.
Example: you can see buffering in action when you try to read an e-book (say, 500 pages with graphics) in Google Books. If the internet connection is fast enough, the stream fills the buffer quickly and sends it out for processing, then fills another, and so on until the stream is finished.
If your internet connection is slow, Google Books displays a loading icon, meaning it is gathering or expecting more data. When the buffer fills up and is processed, Google Books shows the page. While the page is displayed, more data continues to arrive and wait in the buffer.
No. The Buffer class is global in Node.js, so it can be used without requiring any module.
Example:
How to convert the binary equivalent of the string 'xyz' to JSON format:
Create a JavaScript file with the following code.
var buf = Buffer.from('xyz');
console.log(buf.toJSON());
In the first line, buf is the variable and Buffer is the Buffer class. Using the toJSON method we can convert the buffer as shown in the result below.
Execute the code. Following is the result:
{ type: 'Buffer', data: [ 120, 121, 122 ] }
Duplex: Duplex streams are streams that implement both the readable and writable interfaces.
Examples:
Transform: A type of duplex stream where the output is in some way related to the input. Like duplex streams, transform streams implement both the Readable and Writable interfaces.
Examples:
Piping is a mechanism for connecting the output of one stream to the input of another stream, so data read from a source flows directly into a destination.
There is no limit on piping operations, so pipes can be chained one after another.
Example: Create a text file dataSinput.txt with the following content.
After executing the following code you can view the contents in the outputfile.
var fs = require("fs"); //import fs module //creating a readstream to read our inputdatafile dataSinput.txt var readStream = fs.createReadStream("F://dataSinput.txt"); //creating a writestream(initially empty) which is destination for transferred data var writeStream = fs.createWriteStream("F://dataSoutput.txt"); //Use Pipe command to transfer from readstream to writestream. //Pipe command takes all data from readstream and pushes it to writestream readStream.pipe(writeStream);
Output in dataSoutput.txt can be seen as
Chaining is a mechanism for connecting the output of one stream to another stream, creating a chain of multiple stream operations. It is normally used with piping.
Example: Create a text file dataSinput.txt with the following content.
After executing following code.
var fs = require("fs"); //import fs module var zlib = require("zlib"); //import zlib module //creating a readstream to read our inputdatafile dataSinput.txt var readStream = fs.createReadStream("F://dataSinput.txt"); //create a compressed folder zlib var czlib = zlib.createGzip(); //creating a writestream(initially empty) which is destination for transferred data var writeStream = fs.createWriteStream("F://dataSoutput.txt.gz"); //Use Pipe command to transfer from readstream to gzip. //Pipe commands takes all data from readstream and pushes it to compressed writestream file readStream.pipe(czlib).pipe(writeStream); console.log("File Compressed");
You get result as “File Compressed”. And compressed file dataSoutput.txt.gz as output Which consists of text file dataSoutput.txt.
Yes. Every method of the fs module supports both synchronous and asynchronous forms.
Asynchronous methods take a completion callback as their last parameter, and the first parameter of that callback is reserved for an error.
It is better to use the asynchronous methods, as they never block the program during execution, whereas synchronous methods do block it.
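A small sketch of the two forms of one fs method (appendFile), where 'notes.txt' is a hypothetical file name:

var fs = require('fs');

// Asynchronous form: the callback's first parameter is the error.
fs.appendFile('notes.txt', 'hello\n', function (err) {
  if (err) return console.error(err);
  console.log('appended asynchronously');
});

// Synchronous form: blocks until the write completes; errors are thrown.
fs.appendFileSync('notes.txt', 'world\n');
console.log('appended synchronously');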
A common question in Node js advanced interview questions, don't miss this one.
Import the fs module and declare a buffer:
Example: Create a text file named trnk.txt with Knowledgehut tutorials as text in it
Create a javascript code as follows. Save it as trnkexmpl.js
var fs = require("fs"); //import module var buf = new Buffer(1024); //define buffer fs.open('trnk.txt', 'r+', function (err,fd) { if (err) { return console.error(err); } console.log("File opened"); //Truncate the open file) fs.ftruncate(fd, 12, function(err) { if (err) { return console.log(err); } console.log("File Truncated") console.log("Going to read same file") fs.read(fd, buf, 0, buf.length, 0 ,function(err, bytes) { if(err) { return console.log(err); } //Print only read bytes if(bytes > 0) { console.log(buf.slice(0, bytes).toString()); } //Close the opened file fs.close(fd, function(err){ if (err) { console.log(err); } console.log("File closed successfully"); }); }); }); });
Execute the code . You get the following result:
Create a text file “demo_file_del.txt” file to be deleted.
var fs = require("fs"); //import fs module console.log("going to delete demo_file_del.txt file") fs.unlink('demo_file_del.txt', function(err) { // call unlink method) if (err) { return console.err(err); } console.log("File deleted successfully") });
Result:
The “error-first” callback is also known as an “errorback”, “errback”, or “node-style callback”. Error-first callbacks are used to pass errors as well as data. The error must be passed as the first parameter, and you must check whether something went wrong; the remaining arguments are used to pass data. There are two rules for defining an error-first callback:
Code example is as below:-
fs.readFile(filePath, function (err, data) {
  if (err) {
    // Handle the error; the return is important here
    // so execution stops at this point
    return console.log(err);
  }
  // Use the data object
  console.log(data);
});
Callback hell consists of many nested callbacks, which makes code hard to read and debug. A developer may unknowingly get caught in callback hell when trying to execute multiple asynchronous operations one after another. Promises and their close partner, generators, put an end to callback hell. You may use any of the options below to resolve the problem of callback hell:
Promises are a concurrency primitive first described in the 1980s. Today they are part of most modern programming languages and make handling asynchronous activity much easier.
Please see the below example, which will help you understand the code simplicity and scalability using or without using Promise.
Code example without using Promise:-
var MongoClient = require('mongodb').MongoClient;
var url = 'mongodb://localhost/EmployeeDB';

MongoClient.connect(url, function (err, db) {
  db.collection('Employee').insertOne({
    Employeeid: 4,
    EmployeeName: "NewEmployee"
  });
});
The function(err, db) portion of the above code is an anonymous callback function declaration. When the MongoClient connects to the MongoDB database, it invokes the callback function once the connection is complete. In effect, the connection happens in the background, and our callback is called when it has finished. This is one of Node.js' key traits: allowing multiple operations to occur concurrently without blocking any user's operation.
So what is a promise? A promise in Node.js is simply an improvement on callback functions. During the software lifecycle there may be cases where several callback functions need to be nested together; at some point this can get messy and difficult to maintain. In short, a promise is a callback improvement that aims to mitigate these issues.
The basic syntax of a promise is shown below;
var promise = doSomethingAsync();
promise.then(onFulfilled, onRejected);
Code example using Promise:-
var Promise = require('promise');
var MongoClient = require('mongodb').MongoClient;
var url = 'mongodb://localhost/EmployeeDB';

MongoClient.connect(url)
  .then(function (db) { // the then handler receives the db, not an error
    db.collection('Employee').updateOne(
      { "EmployeeName": "Martin" },
      { $set: { "EmployeeName": "Mohan" } }
    );
  });
We can easily serve HTML from Node.js by changing the content type. With the content type “text/plain” the response is treated as plain text; to have the browser render HTML we change it to “text/html”. Please see the example below for more detail:

fs.readFile(filename, "binary", function (err, file) {
  if (err) {
    response.writeHead(500, { "Content-Type": "text/plain" });
    response.write(err + "\n");
    response.end();
    return;
  }
  response.writeHead(200);
  response.write(file, "binary");
  response.end();
});
Now we have to modify the above code to load an HTML page instead of plain text like as below:-
fs.readFile(filename, "binary", function(err, file) { if(err) { response.writeHead(500, {"Content-Type": "text/html"}); response.write(err + "\n"); response.end(); return; } response.writeHead(200, {"Content-Type": "text/html"}); response.write(file); response.end(); });
The event loop is the mechanism designed to handle asynchronous callbacks. Node.js is single-threaded and event-driven: we attach listeners for requests, and whenever a known request arrives, the corresponding listener accepts and processes it using the callback functions we have set up in our application.
Whenever we call setTimeout, http.get, or fs.readFile, Node.js starts these operations and keeps running other code without waiting for their output. When an operation completes, Node picks up its output and runs our callback function.
So all the callback functions are queued in the event loop and run one by one as their responses arrive.
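A small sketch of the event loop's ordering: the synchronous logs run first, and the timer callback runs afterwards even with a zero delay:

console.log('first');

// Scheduled on the event loop: runs only after the current
// synchronous code has finished, even with a 0 ms delay.
setTimeout(function () {
  console.log('third (from the event loop)');
}, 0);

console.log('second');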
module.exports (or exports) is a special object included in every JS file of a Node.js application by default. module is a variable that represents the current module, and exports is an object that is exposed as the module's public interface. Whatever you assign to module.exports (or exports) is exposed as the module. Each module in Node.js has its own functionality and cannot interfere with other modules.
A module encapsulates related code into a single unit of code, which in practice means moving all related functions into one file. Imagine that we created a file called mathematics.js containing the following two functions:
module.exports = {
  printAdditionInMaths: function () {
    return "+";
  },
  printSubtractionInMaths: function () {
    return "-";
  }
};

In the above example, module.exports exposes two functions which can be called from any other program, as shown below:

var mathematics = require("./mathematics.js");

mathematics.printAdditionInMaths();
mathematics.printSubtractionInMaths();
Asynchronous literally means not synchronous. If we make HTTP requests that are asynchronous, it means we do not wait for the server's response before continuing.
The term non-blocking is widely used with I/O. For instance, non-blocking read/write calls return immediately with whatever they can do and expect the caller to retry the call later, whereas a blocking read waits until it has some data, putting the calling thread to sleep.
An asynchronous call requests an operation that will be performed in its entirety but will complete at some future time. Non-blocking means the function will not wait on the stack. Synchronous means happening at the same time; asynchronous means not happening at the same time.
Please review the below example to understand better:-
Synchronous & blocking
Synchronous & non-blocking
Asynchronous
A reusable block of code whose presence does not impact other code is called a module. ES6 introduced a standard module syntax for JavaScript. Modules are important for the maintainability, reusability, and namespacing of code.
A module in Node.js is a simple or complex piece of functionality organized into one or more JavaScript files, which can be reused throughout a Node.js application.
Every module in Node.js has its own context, so it cannot interfere with other modules or pollute the global scope. Additionally, each module can be placed in a separate .js file under a separate folder.
Node.js implements the CommonJS module standard. CommonJS is a group of volunteers who define JavaScript standards for web server, desktop, and console applications.
Node.js includes three types of modules:
libuv is an excellent asynchronous I/O library. It has a highly efficient event loop, and it also has a separate solution for blocking I/O: an internal worker thread pool (e.g., blocking work can be submitted through uv_queue_work). It achieves great performance by combining an asynchronous event loop for non-blocking activity with thread pools for blocking activity, making it a good choice for high-performance servers.
If the synchronous thread-pool model is what you are used to day to day, you may find the asynchronous model somewhat hard, particularly when you have to decide the best time to release the handles. If you do not get that right, libuv will crash and make troubleshooting difficult.
libuv is a cross-platform I/O abstraction library that supports asynchronous I/O based on event loops. It is written in C and released under the MIT Licence.
libuv supports Windows IOCP, epoll(4), kqueue(2), and Solaris event ports. Initially it was designed for Node.js, but it is also used by other software projects.
Reference : https://en.wikipedia.org/wiki/Libuv
It's no surprise that this one pops up often in Node js interview questions for experienced.
The zlib module provides a way to compress and decompress (zip and unzip) data. zlib is a cross-platform data compression library written by Jean-loup Gailly and Mark Adler. In Node.js you can use zlib for threadpool work, HTTP request and response compression, and memory usage tuning. The zlib module ships with Node.js core, so no separate installation is needed. Below is sample code using zlib.

var zlib = require('zlib');

var input = Buffer.from('lorem ipsum dolor sit amet');

// deflateSync/inflateSync are the synchronous forms;
// zlib.deflate/inflate take callbacks instead of returning a value.
var compressed = zlib.deflateSync(input);
var output = zlib.inflateSync(compressed);

console.log(output.toString()); // "lorem ipsum dolor sit amet"

Following are a few important zlib properties and methods:
The Node.js file system module can work on files on your local system; to use it, we must call the require() method to load the fs module. Through Node.js we can even serve a file over HTTP and read its content.
Node.js reads the content of a file in a non-blocking, asynchronous way, using its fs core API to deal with files. The simplest way to read the entire content of a file in Node.js is the fs.readFile method. Below is sample code to read a file asynchronously and synchronously.
Reading a file in Node asynchronously (non-blocking)

var fs = require('fs');

fs.readFile('DATA', 'utf8', function (err, contents) {
  console.log(contents);
});
console.log('after calling readFile');

Reading a file in Node synchronously (blocking)

var fs = require('fs');

var contents = fs.readFileSync('DATA', 'utf8');
console.log(contents);
One of the most frequently posed Node js interview questions for experienced, be ready for it.
Streams are collections of data, just like arrays or strings. The difference is that streams might not all be available at once, and they don't have to fit in memory. This makes streams extremely powerful when working with large amounts of data, or data that arrives from an external source one chunk at a time.
However, streams are not just about working with big data. They also give us the power of composability in our code. Just as we can compose powerful Linux commands by piping smaller commands together, we can do exactly the same in Node with streams.
Streams are special kinds of objects in Node that let us read data from a source or write data to a destination continuously. There are four types of streams available in Node.js; they are:
Node.js is a runtime environment that has become popular with most developers and is widely used for building server-side applications. Node.js is best known for creating real-time APIs and building a new layer of interoperability across the web. There are two ways a file can be read and sent for processing in Node.js: readFile and createReadStream. A few major differences between readFile and createReadStream, illustrated by the sketch below, are:
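A hedged sketch of the practical difference (the file name 'big.txt' is a placeholder): readFile buffers the whole file in memory before the callback fires, while createReadStream hands the data over chunk by chunk:

var fs = require('fs');

// readFile: the entire file is loaded into memory first.
fs.readFile('big.txt', function (err, data) {
  if (err) return console.error(err);
  console.log('got whole file: ' + data.length + ' bytes');
});

// createReadStream: data arrives in chunks and can be processed
// (or piped) without holding the whole file in memory.
fs.createReadStream('big.txt')
  .on('data', function (chunk) {
    console.log('got chunk: ' + chunk.length + ' bytes');
  })
  .on('end', function () {
    console.log('stream finished');
  });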
The Node.js file system module can work on files on your local system, but to use it we must call the require() method to load the fs module. This can be understood from the following example:
var fs = require("fs");
The basic activities or operations required for working with files are as mentioned below:
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
  fs.readFile('demofile1.html', function (err, data) {
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.write(data);
    res.end();
  });
}).listen(8080);
fs.open();
fs.writeFile();
Control flow is the code that runs between multiple asynchronous function calls. You can think of it as a function itself: it takes some input and provides output to the next function. Control flow usually manages execution through the following steps:
Three different patterns explain more about control flow functions (a small series-execution sketch follows):
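An illustrative sketch of the series pattern (the task functions here are hypothetical): callbacks are chained so each step starts only after the previous one finishes, with errors stopping the chain:

// Hypothetical async tasks, each following the error-first convention.
function stepOne(cb) { setTimeout(function () { cb(null, 'one'); }, 100); }
function stepTwo(cb) { setTimeout(function () { cb(null, 'two'); }, 100); }

// Series control flow: collect the output of one step, invoke the
// next function afterwards, and stop on the first error.
stepOne(function (err, resultOne) {
  if (err) return console.error(err);
  console.log('finished ' + resultOne);
  stepTwo(function (err, resultTwo) {
    if (err) return console.error(err);
    console.log('finished ' + resultTwo);
  });
});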
Node.js is a popular open-source server environment that runs on various platforms (Windows, macOS, Linux, Unix, etc.). It allows you to build an entire website using one programming language: JavaScript. It is one of the most sought-after development tools to learn, as demand for it has increased and continues to increase. According to Ziprecruiter.com, the average salary of a Node.js developer is $117,350 per year.
Many reputed companies are hunting for good web developers, and if you're passionate about becoming a web developer and planning to opt for Node.js as a career-building tool, you are already on the right track! Make the best use of your time and be thorough with these Node.js interview questions and answers. They have been designed to familiarize you with the types of questions you may encounter in your interviews. Our basic and advanced Node.js interview questions are accompanied by answers from industry experts so that you can prepare better for your upcoming interviews. These top Node.js interview questions will save you preparation time and help you demonstrate your deep knowledge of Node.js to your interviewer.
We’ve listed all the frequently asked questions and answers which will help you get a clear understanding of Node.js and they are simple to remember as well. The answers you find here have been prepared by industry experts.
All our interview questions for Node.js are kept up to date so that you are always prepared with the latest questions. These Node.js interview questions and answers will definitely help you crack your interview and pursue your dream career as a Node.js developer.
Practice well with these interview questions. Be confident, gear up. All the best!