How to Set Up Elasticsearch, Logstash, and Kibana Stack Locally

Introduction

ELK (Elasticsearch, Logstash, Kibana) is a powerful open-source technology stack that has revolutionized log management and analytics, making it a crucial tool for DevOps professionals. Setting up an observability stack with ELK is a valuable skill, as it provides insight into the performance, health, and behavior of applications and infrastructure. In this article, we'll explore what ELK is, why it matters (especially for DevOps), and how you can get started with this trending technology.

What is ELK?

The ELK stack is comprised of three essential components: Elasticsearch, Logstash, and Kibana. Elasticsearch acts as a distributed, multi-tenant capable full-text search engine, providing remarkable search capabilities across massive volumes of data in mere seconds. Logstash serves as a data processing pipeline that ingests, transforms, and enriches data before forwarding it to Elasticsearch. Kibana offers a user-friendly web interface to visualize and analyze data stored in Elasticsearch.
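The flow through the stack can be sketched in a few lines. Below is a hypothetical illustration, not real Logstash code: the field names (like @timestamp) follow Logstash's event conventions, and the field extraction is deliberately naive.

```python
import json
from datetime import datetime, timezone

# Hypothetical sketch of the ELK data flow: Logstash turns a raw log line
# into a JSON event, Elasticsearch stores it as a document, and Kibana
# visualizes it. The parsing below is deliberately naive.
raw_line = "2023-08-01 12:00:00 ERROR payment service timeout"

event = {
    "message": raw_line,                              # original line, kept verbatim
    "@timestamp": datetime.now(timezone.utc).isoformat(),
    "level": raw_line.split()[2],                     # naive level extraction
    "host": "localhost",
}

print(json.dumps(event, indent=2))
```

Each such event becomes one Elasticsearch document, which is what Kibana queries and charts later on.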

Why is ELK Important?

The popularity of ELK stems from its exceptional performance and scalability. Companies are migrating to ELK because it enables them to search through terabytes of data with incredible speed, far surpassing traditional databases. Moreover, ELK's extensive ecosystem of plugins and client libraries enhances its functionality and makes it indispensable for managing logs and metrics efficiently.

ELK and DevOps

For DevOps teams, ELK is an invaluable tool for monitoring, troubleshooting, and centralizing logs from various systems and applications. By storing logs in Elasticsearch, DevOps professionals can analyze system performance, detect issues, and swiftly resolve application problems. Additionally, ELK's real-time analytics capabilities allow for rapid insights into data trends and anomalies, ensuring proactive measures to maintain optimal system performance.

Getting Started with ELK

Configuring Elasticsearch

  • Download and unzip the .zip archive for Elasticsearch 8.9.0.
  • Allow the commercial features to create the required indices by adding the following line to the elasticsearch.yml file:

action.auto_create_index: .monitoring*,.watches,.triggered_watches,.watcher-history*,.ml*        

  • Uncomment and edit cluster.name and node.name as shown:

cluster.name: first-elk-stack
node.name: pr1es        

  • If Elasticsearch is already running, stop and restart it for the name changes to take effect.
  • Start Elasticsearch from the command line using the following command:

elasticsearch.bat        

  • Retrieve the password for the elastic user and the enrollment token for Kibana from the terminal and store them safely.
  • If you forget the password, reset it using the command:

elasticsearch-reset-password -u elastic -v        

  • If you encounter an error while resetting the password, use the following command:

elasticsearch-reset-password -i -u elastic --url https://localhost:9200        

  • Stop Elasticsearch and edit the elasticsearch.yml file to set xpack.security.http.ssl.enabled and xpack.security.transport.ssl.enabled to false.

xpack.security.http.ssl:
  enabled: false
  keystore.path: certs/http.p12

xpack.security.transport.ssl:
  enabled: false
  verification_mode: certificate
  keystore.path: certs/transport.p12

cluster.initial_master_nodes: ["your-desktop-name"]

  • Elasticsearch runs on port 9200. Start it again from the command line:

elasticsearch.bat

Then head over to your browser and open:

localhost:9200

Enter your credentials:

  • Username: elastic
  • Password: <your password>

[Image: elasticsearch]
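If everything is wired up correctly, the browser shows a small JSON summary of the node. A hedged sketch of that response follows: the values here are illustrative (taken from the names we configured above), but the field names match Elasticsearch's standard root-endpoint response.

```python
import json

# Illustrative (trimmed) response from opening localhost:9200 in the
# browser; values are made up, field names match Elasticsearch 8.x.
sample = """
{
  "name": "pr1es",
  "cluster_name": "first-elk-stack",
  "version": { "number": "8.9.0" },
  "tagline": "You Know, for Search"
}
"""

info = json.loads(sample)
print(info["cluster_name"], info["version"]["number"])
```

Seeing your configured cluster.name and node.name in this response confirms the edits to elasticsearch.yml took effect.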

It's time to head over to Kibana.

Configuring Kibana

  • Download and unzip the .zip archive for Kibana 8.9.0.
  • Uncomment and edit the relevant lines in the kibana.yml file, including server.port, server.host, elasticsearch.username, and elasticsearch.password.

server.port: 5601
server.host: "localhost"
elasticsearch.username: "kibana_system"
elasticsearch.password: "pass"        

  • Reset the Kibana password using the command:

elasticsearch-reset-password -u kibana_system -v        

Start the Kibana server using the command:

kibana.bat        

Access Kibana on your browser at localhost:5601 and enter your elastic credentials.

[Image: kibana first screen]
[Image: kibana home page]

Now it's time to move on to logstash.

Configuring Logstash

  • Download and unzip the .zip archive for Logstash 8.9.0.
  • Open the logstash-sample.conf file and uncomment/edit the user and password parameters.

Save the file and run the following command from the logstash bin directory:

logstash -f .\config\logstash-sample.conf --config.reload.automatic        
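For reference, a minimal pipeline along the lines of logstash-sample.conf might look like this. This is a sketch under stated assumptions: a local Elasticsearch on port 9200, and the index name myindex1 and credentials are placeholders you should replace with your own.

```
# Minimal sketch of a Logstash pipeline (assumptions: local Elasticsearch,
# placeholder index name and credentials)
input {
  stdin { }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "myindex1"
    user => "elastic"
    password => "<your password>"
  }
}
```

With --config.reload.automatic, edits to this file are picked up without restarting Logstash.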

Displaying documents in Kibana and writing our first queries

Head over to Kibana at localhost:5601, click the menu icon in the upper-left corner, scroll down to Management, and click Dev Tools.

We will write requests to Elasticsearch in the left panel and receive the responses in the right panel.

  • Let's check the health of our single-node cluster. Query syntax follows the pattern GET <API>/<endpoint>, so we run GET _cluster/health.

[Image: cluster API responding back with cluster status]

  • Create an index named myindex1 by sending a PUT myindex1 request. You should get a 200 OK response with "acknowledged": true. Then list all indices by running GET _cat/indices.

[Image: creating and listing indices]
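The _cat/indices output is plain text, one row per index. Here is a hedged sketch of reading one such row: the sample line below is made up, but the leading columns (health, status, index name) follow the default _cat column order.

```python
# Hypothetical sample row from GET _cat/indices; the first three columns
# are health, status, and index name in the default _cat output.
sample_row = "yellow open myindex1 aBcDeFgHiJ 1 1 0 0 225b 225b"

health, status, index_name = sample_row.split()[:3]
print(f"{index_name}: health={health}, status={status}")
```

A yellow health status is normal for a single-node cluster, since replica shards have no second node to live on.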

Now that we have created an index, let's generate some logs using the Logstash command line.

[Image: generating logs]

  • Run GET myindex1/_search to display the documents that were inserted by logstash.

[Image: displaying logs in real time]
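The search response nests the stored documents under hits.hits, each wrapped with metadata and carrying the original document in _source. A sketch of pulling the documents out: the sample response below is illustrative, but the nesting follows the standard search-response shape.

```python
import json

# Illustrative GET myindex1/_search response, trimmed to the fields we
# care about; the hits.hits[]._source nesting is the standard shape.
sample = """
{
  "hits": {
    "total": { "value": 2, "relation": "eq" },
    "hits": [
      { "_index": "myindex1", "_source": { "message": "hello from logstash" } },
      { "_index": "myindex1", "_source": { "message": "second event" } }
    ]
  }
}
"""

resp = json.loads(sample)
docs = [hit["_source"] for hit in resp["hits"]["hits"]]
print(docs)
```

hits.total.value tells you how many documents matched in total, even when only the first page of hits is returned.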

Conclusion

Congratulations, you have set up your first ELK stack locally on Windows!

As we have explored, ELK provides us with powerful log management and analytics capabilities. Elasticsearch, Logstash, and Kibana work seamlessly together to allow easy ingestion, processing, and visualization of data. By mastering ELK, you can efficiently monitor and troubleshoot applications and infrastructure, ensuring optimal performance and stability as a DevOps practitioner.

Embrace ELK to take your observability to the next level!

Next: How do you set up an observability stack on #AWS?

#elasticsearch #observability #devops #elk
