Contributing a Limited Amount of DataNode Storage to a Hadoop Cluster

Task 4.1 Description :-

In a Hadoop cluster, how do we contribute only a limited/specific amount of storage from a slave (DataNode) to the cluster?

Let's begin :-

Configure NameNode -

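The configuration screenshot is not shown here, but on a classic (Hadoop 1.x-style) setup the NameNode side touches two files. A minimal sketch, assuming a metadata directory named /nn and port 9001 (both are my placeholders, not from the original; newer Hadoop releases use the property names dfs.namenode.name.dir and fs.defaultFS instead):

```xml
<!-- hdfs-site.xml on the NameNode: where the NameNode stores its metadata -->
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>/nn</value> <!-- assumed directory name -->
  </property>
</configuration>
```

```xml
<!-- core-site.xml on the NameNode: the address DataNodes and clients connect to -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://0.0.0.0:9001</value> <!-- assumed port -->
  </property>
</configuration>
```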

Starting NameNode -

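The start step in the screenshot is typically these commands (a sketch using the older Hadoop 1.x-style scripts; exact script names depend on your Hadoop version and install path):

```shell
# Format the NameNode metadata directory (first run only -- this erases existing metadata)
hadoop namenode -format

# Start the NameNode daemon
hadoop-daemon.sh start namenode

# Confirm the Java process is running -- "NameNode" should appear in the list
jps
```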

Checking if any Datanode is connected -


So far, no DataNode is connected.

To connect a DataNode to the NameNode, we first have to configure the DataNode.
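The connectivity check shown in the screenshot is usually the dfsadmin report (a sketch; newer releases spell it `hdfs dfsadmin -report`):

```shell
# Report total cluster capacity and the list of live DataNodes
hadoop dfsadmin -report
# "Datanodes available: 0" in the output means nothing is connected yet
```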

Configuring DataNode -

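On the DataNode side the configuration mirrors the NameNode's: hdfs-site.xml names the local directory this node contributes, and core-site.xml points at the NameNode. A sketch, assuming the directory /dn1 (which matches the mount point used later in this article) and a placeholder NameNode IP:

```xml
<!-- hdfs-site.xml on the DataNode: the local directory contributed to HDFS -->
<configuration>
  <property>
    <name>dfs.data.dir</name>
    <value>/dn1</value>
  </property>
</configuration>
```

```xml
<!-- core-site.xml on the DataNode: where to find the NameNode -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://NAMENODE_IP:9001</value> <!-- placeholder address -->
  </property>
</configuration>
```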

Starting DataNode -

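Starting the DataNode is the same daemon script with a different argument (a sketch, same version caveat as above):

```shell
# Start the DataNode daemon; it registers itself with the NameNode
hadoop-daemon.sh start datanode

# "DataNode" should now appear in the process list
jps
```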

After starting, the DataNode connects to the NameNode.

Now the DataNode is connected to the NameNode, and it contributes all of its storage.

But we can also contribute only a limited amount of storage to the cluster.

SO LET'S SEE HOW TO CONTRIBUTE LIMITED STORAGE -

Add a New Hard Disk to the DataNode -

Here I am using Oracle VirtualBox.

To add a new hard disk, the DataNode VM must be in the "Stopped" (powered-off) state. Then follow these steps -

Go to "Storage" in the Settings of the DataNode VM -


Now click on the "+" (Add Hard Disk) icon to the right of "Controller: SATA" -


Now click on "Create".


Now click on "Next".


Again, click "Next".


Give your hard disk a name, choose its size, and then click on "Create".

In my case, the hard disk size is 8 GiB.


Select your newly created hard disk & then click on "Choose" to attach it.


Now our newly created hard disk is attached.

To check whether the hard disk is attached, run the "fdisk -l" command -


You will see an entry like "/dev/sdb: 8 GiB".

To use the new hard disk, we have to create a partition, then format and mount it.

Now create a partition on the newly added device at the DataNode.

Run the "fdisk /dev/sdb" command -

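The interactive fdisk session in the screenshot roughly follows this sequence (a sketch; the answers typed at each prompt are shown as comments):

```shell
fdisk /dev/sdb
# n       -> create a new partition
# p       -> primary partition
# 1       -> partition number 1
# <Enter> -> accept the default first sector
# <Enter> -> accept the default last sector (use the whole disk)
# w       -> write the partition table and exit
# The new partition then appears as /dev/sdb1
```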

Format & Mount the Partition on the DataNode -

Run this command to format the partition: "mkfs.ext4 /dev/sdb1"

We want to mount the partition on the directory that the DataNode contributes to the Hadoop distributed file system -

Run this command to mount it: "mount /dev/sdb1 /dn1"

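Put together, the format-and-mount step looks like this (a sketch; /dn1 is the DataNode storage directory used in this article):

```shell
# Format the new partition with an ext4 filesystem
mkfs.ext4 /dev/sdb1

# Make sure the DataNode storage directory exists, then mount the partition on it
mkdir -p /dn1
mount /dev/sdb1 /dn1

# Verify: /dev/sdb1 should be listed as mounted on /dn1
df -h /dn1
```

Note that this mount does not survive a reboot; adding an entry for /dev/sdb1 in /etc/fstab makes it permanent.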

Our hard disk is mounted.

As you can see, the DataNode is connected and contributes around 7.81 GiB (the 8 GiB partition minus filesystem overhead).

Thus we can limit how much storage a DataNode contributes to the Hadoop cluster.
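The capacity check in the final screenshot can be repeated from the command line (a sketch):

```shell
hadoop dfsadmin -report
# In the DataNode's entry, "Configured Capacity" now reflects only the
# mounted /dn1 partition (~7.81 GiB), not the VM's whole root disk
```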



TASK COMPLETED !!!!!!

Thank you for reading.

