Smart irrigation system part 1: Obtaining weather data via Azure Functions

Good morning again!

I am translating the previous article into English so you can see how I am updating my existing projects to make the most of the cloud. This is one of the most complete projects I have done, combining programming, IoT and DIY, and it has been running locally for more than three years.


I use PostgreSQL to store the data, and a Python program then processes it intelligently to activate the sprinklers through relays and solenoid valves.

The initial idea was simply to water the lawn automatically, with a timer like the ones sold in any hardware store and some solenoid valves to switch on the irrigation zones. But what if I could:

  1. Get data from the nearest sensor of the Spanish Meteorological Agency (AEMET) to know the temperature and how much rain has fallen? That way I can irrigate less, or not at all, if it rains a lot, and irrigate more on a very hot day.
  2. Use the AEMET forecast models to skip irrigation when the probability and estimated volume of rain are high? They surely build better models than I do.
  3. Have an algorithm that adapts the watering time to daylight hours, temperature and rainfall?
  4. Use Siri, so the kids and I can also play at running through the garden dodging the sprinklers you activate with your voice... :-D

[Image: Current state of the lawn]

Since the system has been in operation for several years, by storing the data in PostgreSQL and using a visualization tool like Metabase you can see how the irrigation time varies with sunlight (more in summer, less in winter), how there are irrigation peaks (which usually coincide with heat waves), and how there are "gaps" corresponding to rain: after some rain the system irrigates less, or stops entirely.

[Image: Data from the PostgreSQL database]

In this first article we will focus on point 1, so the post does not get too long and stays enjoyable.

We will approach it as an ETL process (Extraction, Transformation and Loading of data).

DATA EXTRACTION

To retrieve the data, I use the open data provided by AEMET; perhaps the only downside is that the readings arrive about two hours late. It is very simple:

  • You generate your token to be able to use the API.
  • You call the endpoint that returns the data for the sensor (idema) you want.
  • It returns a URL with the data and a URL with the metadata.
  • You load the data as JSON and then process it (a minimal sketch follows below).
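
To make the flow concrete, here is a minimal sketch of that two-step call against the AEMET OpenData conventional-observation endpoint; the token and station id are placeholders you must replace with your own:

```python
import requests

AEMET_API_KEY = "<your-token>"   # the token you generated
IDEMA = "<station-id>"           # id of the AEMET sensor nearest to you

# Step 1: ask the API for the observation data of the station (idema).
resp = requests.get(
    "https://opendata.aemet.es/opendata/api/observacion/convencional/"
    f"datos/estacion/{IDEMA}",
    params={"api_key": AEMET_API_KEY},
    timeout=30,
)
resp.raise_for_status()
answer = resp.json()  # contains the "datos" URL and the "metadatos" URL

# Step 2: follow the "datos" URL to download the observations as JSON.
data = requests.get(answer["datos"], timeout=30).json()
print(f"Fetched {len(data)} observations")
```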

[Image: Output from the Python program that fetches the AEMET open data URLs]

DATA TRANSFORMATION

Once the data are in JSON, we load them into a pandas DataFrame so we can work with them more easily.

Keep in mind that sometimes readings are missing or have not been taken yet (they appear as "NaN"), so you have to check for them and replace them with an empty string ("") so that saving them to the database does not raise an error.
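
A minimal pandas sketch of that step, assuming `data` is the JSON list fetched above:

```python
import pandas as pd

df = pd.DataFrame(data)  # "data" is the JSON list fetched above
df = df.fillna("")       # replace NaN with "" so the database insert won't fail
print(df.head())
```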

The formatted data in a DataFrame looks like this:

[Image: Table with all the weather data from the sensor: temperature, wind, etc.]


DATA LOADING

Finally, we save the data in the database using psycopg2, a library for connecting to PostgreSQL.
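
A minimal loading sketch with psycopg2; the connection details and the `weather_obs` table with its columns are assumptions you should adapt to your own schema:

```python
import psycopg2
from psycopg2.extras import execute_values

conn = psycopg2.connect(
    host="localhost", dbname="irrigation", user="postgres", password="<password>"
)
with conn, conn.cursor() as cur:
    # Hypothetical table/columns; "fint", "ta" and "prec" are used here as
    # example field names for timestamp, air temperature and precipitation.
    rows = list(df[["fint", "ta", "prec"]].itertuples(index=False, name=None))
    execute_values(
        cur,
        "INSERT INTO weather_obs (observed_at, temperature, precipitation) VALUES %s",
        rows,
    )
conn.close()
```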

AZURE FUNCTION

To adapt the code to Azure Functions I followed this guide from Microsoft, where you have to take several aspects into account:

The function must return the data as an HTTP response (as if it were a web page), otherwise nothing will be shown.

You have to put your code inside the function project created with the command "func init <project_name> --python -m V2".
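
As a minimal sketch of what that looks like with the Python V2 programming model (the route name `weather` and the `run_etl` helper are just examples standing in for the ETL code above):

```python
import azure.functions as func

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)

def run_etl() -> str:
    # Placeholder for the extract/transform/load steps described above.
    return "<p>ETL run finished</p>"

@app.route(route="weather")  # the function is served under /api/weather
def weather(req: func.HttpRequest) -> func.HttpResponse:
    # Return the output as an HTTP response so it shows up as a web page.
    return func.HttpResponse(run_etl(), mimetype="text/html")
```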

[Image: From the Microsoft example: the "route" setting is the URL path under which the function will be exposed]

In total, the guide walks you through about five CLI calls (sketched after this list) to:

  • Create a resource group, if you don't want to use an existing one.
  • Create a storage account, if you don't want to use an existing one.
  • Create the function app in Azure.
  • Publish the function to Azure.
  • Update some app settings so Python uses the V2 programming model.
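
Roughly, those calls look like this; all the names below are example placeholders, so check the guide for the exact options:

```bash
az group create --name irrigation-rg --location westeurope
az storage account create --name irrigationstore \
  --resource-group irrigation-rg --sku Standard_LRS
az functionapp create --resource-group irrigation-rg --name weather-etl-func \
  --storage-account irrigationstore --consumption-plan-location westeurope \
  --os-type linux --runtime python --runtime-version 3.10 --functions-version 4
func azure functionapp publish weather-etl-func
az functionapp config appsettings set --name weather-etl-func \
  --resource-group irrigation-rg \
  --settings AzureWebJobsFeatureFlags=EnableWorkerIndexing
```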

After that, the function appears in Azure Functions:

[Image: The Azure Function in the Azure portal]

And inside it we find the function we have programmed, where we can copy the complete URL to invoke it.

[Image: The Python code inside the Azure Function; in this area you can copy the URL used to invoke it]

The important thing is the result, which, as you can see, looks like this:

[Image: The same Python output, now rendered as an HTML page]

Now all that remains is to schedule it to run every X minutes. One of the things I like most is adding functionality to send an email on each execution, so I know how it went without having to check Azure.
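
As a sketch, the scheduling can be done with a timer trigger in the same V2 model, and the email can be sent with plain `smtplib`; the NCRONTAB schedule, SMTP server and addresses below are placeholder assumptions:

```python
import smtplib
from email.message import EmailMessage

import azure.functions as func

app = func.FunctionApp()

@app.timer_trigger(schedule="0 */30 * * * *", arg_name="timer")  # every 30 minutes
def scheduled_etl(timer: func.TimerRequest) -> None:
    status = "ETL run finished"  # replace with the real ETL result/summary

    # Send a short status email so you know how it went without opening Azure.
    msg = EmailMessage()
    msg["Subject"] = "Irrigation weather ETL"
    msg["From"] = "bot@example.com"
    msg["To"] = "me@example.com"
    msg.set_content(status)
    with smtplib.SMTP("smtp.example.com", 587) as smtp:
        smtp.starttls()
        smtp.login("bot@example.com", "<password>")
        smtp.send_message(msg)
```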


Finally, to close this first part, I have uploaded the Azure Function files to GitHub, and here is a screenshot of how the data looks when displayed in a BI tool like Metabase.

[Image: Metabase dashboard with the data we fetched]

The second part will come soon, where we will see how to fetch other data from AEMET, such as sunrise and sunset times, or rain forecasts.


Best regards to all of you,

Santiago
