Streams are a core concept of Node.js. A stream is a sequence of data that is processed piece by piece rather than all at once. Using streams makes reading/writing files, network communication, or any other kind of end-to-end information exchange efficient.
When we use streams, our program does not need to load everything at once; instead it receives the data chunk by chunk, which makes it easy to read a large file. YouTube is a good example: the video data arrives as chunks, and YouTube shows us each chunk as it arrives.
Under the hood, incoming data arrives as chunks and fills up a buffer. When the buffer is filled up, it passes everything it has collected, combined, to the next stage. This is the core duty of the buffer.
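To see this buffering in action, here is a minimal sketch (assuming a file.txt exists next to the script); the highWaterMark option sets the buffer size, and the tiny 16-byte value is only there to make the chunking visible:

```js
const fs = require('fs')

// highWaterMark controls how many bytes are buffered before each
// 'data' event fires; 16 bytes is artificially small for demo purposes
const stream = fs.createReadStream('file.txt', { highWaterMark: 16 })

stream.on('data', function (chunk) {
  console.log(`received a ${chunk.length}-byte chunk`)
})
```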
Node.js streams give us two main benefits:
- Time: Because data arrives in chunks, we can start processing it as soon as the first chunk is available, which can significantly reduce the total time needed.
- Memory: We don't need to hold a large amount of data in memory at once, which has a positive impact on memory usage.
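To make the memory benefit concrete, here is a small comparison sketch; big-file.txt is a hypothetical large file:

```js
const fs = require('fs')

// without streams: the entire file is loaded into memory first
fs.readFile('big-file.txt', function (err, data) {
  if (err) throw err
  console.log(`loaded ${data.length} bytes in one go`)
})

// with streams: only one chunk lives in memory at a time
fs.createReadStream('big-file.txt').on('data', function (chunk) {
  console.log(`got a ${chunk.length}-byte chunk`)
})
```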
There are four kinds of streams available:
- Readable: streams from which data can be read, e.g. `fs.createReadStream()`.
- Writable: streams to which data can be written, e.g. `fs.createWriteStream()`.
- Duplex: streams that are both readable and writable, e.g. `net.Socket`.
- Transform: streams that can modify or transform the data as it is written and read, e.g. compressing data as it is written and decompressing it as it is read (see the sketch just below).
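As a quick illustration of a Transform stream, here is a sketch using the built-in zlib module: its createGzip() returns a Transform stream, so whatever is piped through it comes out compressed (file.txt is assumed to exist):

```js
const fs = require('fs')
const zlib = require('zlib')

// data read from file.txt flows through the gzip Transform stream
// and comes out compressed on the other side
fs.createReadStream('file.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('file.txt.gz'))
```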
Practical Examples
Create a Readable Stream
```js
const fs = require('fs')

let data = ''

// create a readable stream
let readerStream = fs.createReadStream('file.txt')

// fires every time a new chunk of data is available
readerStream.on('data', function (chunk) {
  data += chunk
})

// fires when there is no more data to read
readerStream.on('end', function () {
  console.log(data)
})

// fires if anything goes wrong while reading
readerStream.on('error', function (err) {
  console.log(err)
})

console.log('Program ended')
```
`fs` is the file system module, which has a `createReadStream` method that takes the relative path of the file. We attach three kinds of event listeners to the stream. The `data` listener fires every time a new chunk of data is available. The `end` listener fires when the program has finished receiving data. The `error` listener fires when something goes wrong.
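One detail worth knowing: by default each chunk is a Buffer object, and `data += chunk` only works because the Buffer is implicitly converted to a string. You can make every chunk arrive as a string by setting the encoding up front:

```js
// make 'data' events deliver utf-8 strings instead of Buffers
readerStream.setEncoding('utf8')
```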
Create a Writable Stream
```js
const fs = require('fs')

// create a readable stream and a writable stream
let readerStream = fs.createReadStream('file.txt')
let writableStream = fs.createWriteStream('secondary-file.txt')

// write each chunk we read into the writable stream
readerStream.on('data', function (chunk) {
  writableStream.write(chunk)
})
```
In this example, the `createWriteStream` method takes the path of the writable file, which will be created when the program runs if it does not already exist. In the `data` callback of `readerStream` we call the `writableStream.write` method, which means every time our program gets a new chunk of data, it writes that chunk into the secondary-file.txt file.
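In practice it is also worth closing the writable stream once the readable side is done, and listening for the 'finish' event to know when everything has been flushed; a small sketch building on the code above:

```js
readerStream.on('end', function () {
  // no more data will arrive, so close the writable side
  writableStream.end()
})

writableStream.on('finish', function () {
  console.log('all data written to secondary-file.txt')
})
```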
Pipe Stream
The `pipe` method does the same thing we did manually in the previous example. It is a method of the readable stream, and it accepts a writable stream as its argument. Every time our program gets a new chunk of data, the `pipe()` method writes that chunk into the secondary-file.txt file.
```js
let fs = require("fs");

let readableStream = fs.createReadStream("file.txt");
let writableStream = fs.createWriteStream("secondary-file.txt");

readableStream.pipe(writableStream);
```
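Two nice properties of `pipe()` are worth knowing: it returns the destination stream, so pipes can be chained, and it manages backpressure automatically, pausing the source while the destination's buffer is full. For error handling across a whole chain, the `stream` module's `pipeline()` function (available since Node 10) is a safer alternative; a minimal sketch:

```js
const fs = require('fs');
const { pipeline } = require('stream');

// pipeline() wires the streams together like pipe(), but also
// forwards errors from any stream in the chain to one callback
// and cleans everything up on failure
pipeline(
  fs.createReadStream('file.txt'),
  fs.createWriteStream('secondary-file.txt'),
  function (err) {
    if (err) console.error('pipeline failed:', err);
    else console.log('pipeline succeeded');
  }
);
```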
Stream Module
Node.js has a native built-in `stream` module, which is useful for creating new types of stream instances. It is usually not necessary to use the `stream` module just to consume streams.
```js
const stream = require('stream');
```
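As a sketch of what creating a new stream instance looks like, here is a minimal custom Readable built on the `stream` module (the chunk contents are of course arbitrary):

```js
const { Readable } = require('stream');

// a custom Readable that emits a few chunks and then ends
const myStream = new Readable({
  read() {
    this.push('first chunk\n');
    this.push('second chunk\n');
    this.push(null); // null signals the end of the stream
  }
});

myStream.pipe(process.stdout);
```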
These are the basics of streams. Use streams wherever you can; they play a great role in performing I/O efficiently.
One final note on the pipe example: the direction matters. It is not possible to call `writableStream.pipe(readableStream)`, because `pipe` is a method of readable streams. The correct way is `readableStream.pipe(writableStream);`.