Streams in Node.js

Streams are one of the most important concepts in Node.js, and also one of the trickiest to understand. It is fascinating to learn how we stream a large amount of data and how to deal with it.🧐

In this article, we will get a basic understanding of streams, and I will try to give you enough information so that you can explore streams further on other platforms.

Many popular tech companies use Node.js streams, such as Netflix, NASA, Trello, PayPal, LinkedIn, Walmart, Uber, Twitter, Yahoo, and eBay.

For more, visit the Node.js v20.5.1 documentation.

What?🤔

Streams are objects that let you read data from a source or write data to a destination in a continuous fashion.

There are four types of streams:

  • Readable − a stream used for read operations.

  • Writable − a stream used for write operations.

  • Duplex − a stream that can be used for both read and write operations.

  • Transform − a type of duplex stream where the output is computed based on the input.

In this article, we will focus on two of them: Readable and Writable. A small sketch of all four types follows below.

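As a minimal sketch (using only Node's core fs, zlib, and stream modules; sample.txt and output.txt are just placeholder file names), here is roughly how each of the four types can be obtained:

const fs = require('fs');
const zlib = require('zlib');
const { Duplex } = require('stream');

// Readable − data flows out of it (e.g. reading a file)
const readable = fs.createReadStream('sample.txt');

// Writable − data flows into it (e.g. writing a file)
const writable = fs.createWriteStream('output.txt');

// Duplex − readable and writable at the same time (a TCP socket is the
// classic example); here a tiny custom Duplex just to show the shape
const duplex = new Duplex({
  read() {},                                        // called when the consumer wants data
  write(chunk, encoding, callback) { callback(); }  // accepts data written into it
});

// Transform − a Duplex whose output is computed from its input, e.g. gzip
const gzip = zlib.createGzip();

// a typical chain: read a file -> compress it -> write the compressed copy
readable.pipe(gzip).pipe(fs.createWriteStream('sample.txt.gz'));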

Why?🤨

Streams give us a way to handle reading/writing files, network communication, or any kind of end-to-end information exchange in an efficient way.

In simple words: if we want to send a file that is heavier than the available RAM, it is difficult to send it all at once, and if the number of users is large, it is even more difficult to handle all of them simultaneously. Streams solve this problem.
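As a rough sketch of the difference (big-file.txt is just a placeholder name), reading the whole file at once versus streaming it looks like this:

const fs = require('fs');

// without streams: the whole file is loaded into RAM before anything happens;
// for a file bigger than the available memory this fails or stalls the process
fs.readFile('big-file.txt', (err, data) => {
  if (err) throw err;
  console.log('loaded', data.length, 'bytes all at once');
});

// with streams: the file is read piece by piece, so memory use stays small
// no matter how big the file is
const stream = fs.createReadStream('big-file.txt');
let total = 0;
stream.on('data', (chunk) => { total += chunk.length; });
stream.on('end', () => console.log('streamed', total, 'bytes in small chunks'));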

How it works😶‍🌫️

This is roughly how a stream works: the data is passed to the program in small chunks. The program keeps a buffer, a fixed-size region of memory, and that buffer receives the chunks (each holding a fixed amount of data) and processes them one by one.

With this method, the data travels in small chunks, so it doesn't matter how heavy the file is; we can send it easily with the help of streams.
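If you want to see the chunks and the buffer size in action, here is a minimal sketch; the highWaterMark option controls how big each chunk is (fs read streams default to 64 KB), and sample.txt is assumed to exist:

const fs = require('fs');

// lower the internal buffer to 16 KB just so more (smaller) chunks show up
const readableStream = fs.createReadStream('sample.txt', { highWaterMark: 16 * 1024 });

readableStream.on('data', (chunk) => {
  console.log('received a chunk of', chunk.length, 'bytes');
});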

The real experience of streams😲

We all surf on YouTube, and we see the light line in the progress bar that always stays a little ahead of our videos (the buffered part): that is an example of streams.

Let's Talk about

Readable Streams − used for read operations. We can pipe a readable stream into a writable stream.

Writable Streams − used for write operations.

Let's Code😎

This example is for Readable Streams.

For this, I am using the Node.js file system (fs) module.

It is good if you have a little bit of prior knowledge of the fs module.

Steps

Create a server.

Require the file system (fs) module.

We need a file (sample.txt) that contains a large amount of data, so you can easily watch the process and its memory usage in your system monitor and see how much RAM is used while the file is sent to a page/webpage.

Create a stream with the help of the fs module.

Attach an event to the stream for better understanding.

Pipe it to res (the response parameter).

The pipe() method of a Readable stream is used to attach a Writable stream to it, so the readable stream switches into flowing mode and pushes all of its data to the attached Writable. (source)

NOTE: req and res are themselves streams: res is a Writable stream, and req is a Readable stream.
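To make that note concrete, here is a small sketch (upload.txt is just an example destination) that pipes the readable req straight into a file and uses the writable res to reply:

const http = require('http');
const fs = require('fs');

const server = http.createServer((req, res) => {
  // req is Readable, so its body can be piped into a writable file stream
  req.pipe(fs.createWriteStream('upload.txt'));

  // res is Writable, so we write the reply into it
  req.on('end', () => res.end('received'));
});

server.listen(5000);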

sample.txt - a normal file that contains a large amount of data (lorem ipsum)

const http = require('http');
const fs = require('fs');

const server = http.createServer((req, resp) => {
  // ignore requests other than '/' (e.g. /favicon.ico)
  if (req.url !== '/') {
    return resp.end();
  }

  // streaming
  const readableStream = fs.createReadStream('sample.txt');

  // pipeline
  // readableStream -> writable stream (resp)
  readableStream.pipe(resp);

  // event on readableStream
  readableStream.on('data', (chunk) => {
    console.log("Chunk: ", chunk);
  });
});

server.listen(5000, () => {
  console.log('Server is running on port 5000');
});

You can call different functions on a chunk (e.g. toJSON(), toString(), toLocaleString()), depending on your needs 🤗

Without any function, you will see a raw buffer printed for each chunk: that is the data (the sample.txt file) streaming in chunks. If you use a function like chunk.toJSON(), the buffer is printed as a JSON object instead.
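For example (assuming sample.txt starts with the usual "Lorem ipsum" text), the same chunk printed three ways looks roughly like this:

const fs = require('fs');

const readableStream = fs.createReadStream('sample.txt');

readableStream.on('data', (chunk) => {
  console.log(chunk);            // raw Buffer, e.g. <Buffer 4c 6f 72 65 6d 20 69 70 ...>
  console.log(chunk.toJSON());   // { type: 'Buffer', data: [ 76, 111, 114, 101, 109, ... ] }
  console.log(chunk.toString()); // the readable text, e.g. 'Lorem ipsum ...'
});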

Let's Talk about

This example is for Writable Streams.

For this, we are copying the sample.txt file into output.txt, and for that purpose we use a writable stream.

// streams
const http = require('http');
const fs = require('fs');

const server = http.createServer((req, resp) => {
  if (req.url !== '/') {
    return resp.end();
  }

  // now for the writable stream
  const readableStream = fs.createReadStream('sample.txt');

  // creating a writable stream with fs
  const writableStream = fs.createWriteStream('output.txt');

  // event
  readableStream.on('data', (chunk) => {
    console.log("Chunk: ", chunk.toString());

    writableStream.write(chunk);
  });

  // close the output file and the response once reading is finished
  readableStream.on('end', () => {
    writableStream.end();
    resp.end('sample.txt copied to output.txt');
  });
});

server.listen(5000, () => {
  console.log('Server is running on port 5000');
});

Note − if the output file does not already exist, it will be created automatically.

For writing to the file we use writableStream.write(chunk); this copies the data from the readable file to the writable file, i.e. output.txt.

This time I use chunk.toString() so the chunks are logged as readable text.
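As a side note, the same copy can also be done with pipe(), which handles backpressure (pausing the read side when the write side is busy) for you; this is just a sketch using the same file names:

const fs = require('fs');

const readableStream = fs.createReadStream('sample.txt');
const writableStream = fs.createWriteStream('output.txt');

// pipe the readable stream straight into the writable one
readableStream.pipe(writableStream);

// 'finish' fires once everything has been written to output.txt
writableStream.on('finish', () => {
  console.log('sample.txt copied to output.txt');
});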

There is a lot more to streams than this article covers; I am just trying to give an introduction to streams in Node.js. If you find something wrong, please correct me. Your comments are valuable to me🫡.

Thank you for reading my content. Be sure to follow and comment on what you want me to write about next 🤓
