Visualizing real-time Bosch XDK Data in AWS with Elasticsearch, Kibana, IoT Core and the Rules Engine
Carlos Hernandez-Vaquero, PMP®
Motivation
Some weeks ago I was playing with my Bosch XDK and I wanted both to get the data into AWS and to visualize it in real time. The XDK comes with the Virtual XDK mobile app, which allows you to do this directly over BLE on your phone, but then you don't have the data in the cloud. So I thought "this sounds like a plan for the weekend" :)
Disclaimer: At the end of the article there is a video with a demo of the result
For those who don't know the XDK, you are invited to watch the video below
Overview
In this article I will explain how to:
- Connect XDK to AWS IoT Core. Enable logs to debug in AWS CloudWatch
- Use the AWS IoT Rules engine to prepare the data, including all the measurements from XDK and a timestamp, so that it can be used by Elasticsearch
- Create an Elasticsearch cluster and visualize the data received in Kibana
For the sake of the demo, we will use services which are covered by the AWS free tier. Please think twice about the setup if you want to use this in production.
Connecting XDK to AWS IoT Core
When we install the XDK Workbench, we get a list of official examples, one of them called "AwsSendDataOverMQTT". By using this example and its documentation we will be able to configure our XDK, including the WiFi settings and certificates, and follow the basic steps to set up AWS IoT Core and receive MQTT messages there. I will not repeat here what is already explained in the example; just follow the guide.
Sensor data as numbers
I would recommend removing the quotes around the sensor values in the MQTT messages generated by the XDK. This can be done by editing the source code in the XDK Workbench before compiling it. With this change, Elasticsearch will interpret the data as numbers, which is what they are, and not as text. In my view this is a general improvement suggestion for the example provided with the XDK.
Enable logs to AWS CloudWatch
We would like to be able to debug any possible errors in AWS IoT, so we will enable logs under Settings in the AWS IoT Core service. For this we will select the level of verbosity (Warning or Error should be enough, unless you really want lots of rubbish) and attach a new role that allows AWS IoT to send the logs to AWS CloudWatch. This step really helped me understand what was going on.
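If you prefer the CLI over the console, something along these lines should achieve the same (the role ARN is a placeholder; you need to create a role with CloudWatch Logs permissions first):

```shell
# Assumption: an IAM role that AWS IoT can assume to write to CloudWatch
# Logs already exists under this (placeholder) ARN.
aws iot set-v2-logging-options \
  --role-arn arn:aws:iam::123456789012:role/IoTLoggingRole \
  --default-log-level WARN
```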
Preparing the data for Elasticsearch
Once the data is in AWS, we will forward it to the Elasticsearch service from AWS. Elasticsearch is part of the Elastic Stack, initially meant for searching documents (e.g. JSON) at scale. However, it comes with Kibana out of the box, which makes it a very powerful combination for real-time applications that require visualization. AWS offers a fully managed service, where you don't need to worry about anything other than using Elasticsearch. No maintenance, no patching, no complex configuration steps, no sleepless nights. No useless stuff. Is this NoOps? I don't know, but I like it.
Creating an Elasticsearch domain
We first need to create an Elasticsearch cluster. We will log in to the AWS console and find the Elasticsearch Service. Then we will click on "Create domain", select "Development and testing" and click on "Next".
On the next page we will select the smallest EC2 instance from the list (t2.small). The most prominent AWS Free Tier documentation always mentions 750 free hours of a t2.micro instance, but t2.small is also part of the free offering, at least for Elasticsearch; I can confirm that I was not charged for using it. The rest of the configuration can be left at its defaults.
In the next step we will configure the security. We should be careful here and follow the principle of least privilege, but for the demo I will just set it to public and allow all traffic.
Once we click on "Next", we will get an overview of the whole setup and we can "Confirm" to start the Elasticsearch cluster. We will then need to wait for ~15 minutes until the "Domain status" changes to "Active".
On the same page, we will now try to open the Kibana URL (".../_plugin/kibana"). If everything went fine, we should see Kibana loading. Copy the "Endpoint" address (https://search-...), because we will need it in the next steps.
Note: If you get the following when you try to access Kibana:
{"Message":"User: anonymous is not authorized to perform: es:ESHttpGet"}
You may need to go back to Elasticsearch, click on "Actions", then "Modify access policy", and then select "Allow open access to the domain".
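For reference, "Allow open access to the domain" generates a resource-based policy roughly like the one below (the region, account ID and domain name are placeholders; this allows anyone on the internet to reach your cluster, so use it for the demo only):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "*" },
      "Action": "es:*",
      "Resource": "arn:aws:es:us-east-1:123456789012:domain/xdk-demo/*"
    }
  ]
}
```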
In Kibana we will go to "Dev Tools" and execute (press the play icon, or Cmd+Enter on Mac) the two commands shown in the picture below. This will create an index named xdk and map the "timestamp" field to the type "date". This will help in the visualization steps at the end of this article.
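In case the picture does not load, the two commands look roughly like this (assuming Elasticsearch 6.x, where mappings still carry a type name such as "_doc"; adjust the syntax to your cluster version):

```
PUT xdk

PUT xdk/_mapping/_doc
{
  "properties": {
    "timestamp": { "type": "date" }
  }
}
```

Since timestamp() in the rule below produces epoch milliseconds, the default "date" format already accepts the values without any extra configuration.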
Configuring the Rule in AWS IoT
We will come back to AWS IoT, click on "Act" and then "Rules", and create a new rule here.
In the Rule Query Statement we will write the following, where BCDS/XDK110/example/out is the MQTT topic the XDK is publishing to. If you followed the official documentation of the example, this topic will match; otherwise, change it accordingly. Note that the timestamp() as timestamp part adds to the sensor data the timestamp at which the MQTT message was received by the MQTT broker.
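Putting the pieces described above together, the query statement should look like this:

```sql
SELECT *, timestamp() AS timestamp FROM 'BCDS/XDK110/example/out'
```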
We will then add an Action to "Send a message to the Amazon Elasticsearch Service" and configure the parameters as in the picture. The index should match what we already defined in the Dev Tools in Kibana. The Type is, in theory, a deprecated value in newer versions of Elasticsearch, so we better leave it as recommended in the Elasticsearch docs ("_doc"). Finally, the ID should be a different value each time we send anything to Elasticsearch: if we write a static value, the document (JSON) in Elasticsearch gets updated with every new message instead of a new one being created. ${newuuid()} generates a random ID for each message.
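In summary, the action parameters described above (the domain name and endpoint depend on your own cluster):

```
Domain name: <your Elasticsearch domain>
Endpoint:    https://search-... (the address copied earlier)
ID:          ${newuuid()}
Index:       xdk
Type:        _doc
```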
We will finally add the required role for AWS IoT to access Elasticsearch and we are good to start sending data from our XDK all the way to Elasticsearch. Switch on your XDK!
Visualizing the data in Kibana
While the XDK is successfully sending measurements to Elasticsearch, we can now prepare the visualization in Kibana.
We should first create the index pattern. For this we will go to Management and start typing "xdk". If any message from AWS IoT has successfully reached Elasticsearch, we should see the message "Success! Your index pattern matches 1 index".
In the next step, you should be able to select "timestamp" as the "Time Filter field name". If this is the case, we can create the index pattern.
Now we will go to Visualize and click on "Create a visualization"
Here we will choose the "Line" graph and select the Index pattern xdk.
We will then plot each of the measurements on the Y-axis, aggregating the data using Average.
Similarly for the X-axis, we will aggregate the timestamp using a Date Histogram with an interval of one second.
To force the visualization to update every second, we will set the auto-refresh configuration in the top right corner of this page. We should then see the data updating on the screen. How cool is that?
We will create as many visualizations as desired and save each of them. Once we are done, we can combine them in a Dashboard, and that's it!
Demo time!
Finally, a demo of the result. Maybe you can guess what I am doing with the XDK? :)