Building a Raspberry Pi 4-based weather monitoring solution using RuuviTag sensors, Azure IoT Hub and Azure Functions
The Raspberry Pi Foundation announced the Raspberry Pi 4 just a few weeks ago. I had to get one, and after placing an order, I received my RPi4 in only a few days.
It’s a powerful little computer, with 4 GB of RAM and dual HDMI outputs. It’s powered via USB-C, which is an excellent addition. I installed Raspbian on it and connected it to one of the 4K displays I use at my home office.
I also had a spare RuuviTag Bluetooth beacon that I hadn’t really used in recent months. It’s a small, lightweight, weather-proofed plastic puck. It packs all sorts of sensors – an accelerometer plus temperature, humidity, and air pressure sensors – and an NFC antenna. It uses Bluetooth Low Energy, and it’s also mesh enabled. You can buy one for only 29 € (excluding VAT).
Since Google more or less killed the Physical Web some time ago, I’d given up on mobile-based solutions for tracking weather and humidity at home. The idea of always needing to use my phone to get the data felt cumbersome – but perhaps I could use the Raspberry Pi 4 to retrieve data from my RuuviTag and somehow store that data in the cloud?
Turns out it’s possible, but not as easy as I’d hoped.
My solution for accessing and using weather data from RuuviTag
I wanted to retrieve data from the RuuviTag automatically and push it to Azure. I’m a huge fan of Azure Functions, so my initial proof-of-concept solution was embarrassingly hacky, but it worked (a rough sketch follows the list):
- Use Raspberry Pi 4 to ping RuuviTag to retrieve weather data
- Call Azure Function via REST and submit data
- Use Azure Function to store data.. somewhere (Azure Storage, or similar)
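For the curious, the device side of that hack looked roughly like the sketch below. This is a minimal illustration rather than the code I actually kept: the Function name, URL and key are hypothetical placeholders for an HTTP-triggered Azure Function.
# Rough sketch of the initial hack: read the RuuviTag and POST the reading
# to a hypothetical HTTP-triggered Azure Function (all names are placeholders).
import requests
from ruuvitag_sensor.ruuvi import RuuviTagSensor

FUNCTION_URL = "https://myfunctionapp.azurewebsites.net/api/StoreWeatherData"  # hypothetical
FUNCTION_KEY = "FUNCTIONKEYHERE"  # hypothetical

macs = ['RUUVITAG:MAC:ADDRESS:HERE']
datas = RuuviTagSensor.get_data_for_sensors(macs, 5)

for mac, reading in datas.items():
    # The Function key goes in the 'code' query parameter; the Function
    # decides where the JSON payload ends up (Azure Storage, or similar)
    requests.post(FUNCTION_URL, params={"code": FUNCTION_KEY}, json=reading, timeout=10)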
While this approach worked, it was a kludge at best. Azure Functions are amazingly useful in a lot of different scenarios, but in IoT-driven scenarios you need something to manage and govern all that incoming data. My weather data, after all, might require enterprise-grade scalability and high availability!
Azure and IoT
I remembered attending a briefing on Azure IoT at Build 2018. Azure IoT Hub was something I’d only read about at a glance, and I hadn’t used it beyond clicking through the Azure Portal panes to see what it’s all about.
It turns out Azure IoT Hub has several pricing tiers, and there is even a free tier to be had! It allows for 8,000 messages per day, and each message can be up to 0.5 KB in size. A perfect fit for my weather data! The free tier even offers all the features of the paid tiers, just with tighter limits.
Initial configuration for Azure IoT Hub
I provisioned Azure IoT Hub through Azure Portal, and named it ruuvitagiothub.
It’s relatively simple to manage through the portal, as there isn’t too much to configure for a simple solution like mine.
I added my Raspberry Pi 4 device as an IoT Device through the IoT Devices pane:
This registers the device ID and, more importantly, generates the primary key (and secondary key) the device uses for authentication. My plan is to send messages from the device to the cloud.
Configuring Raspberry Pi 4 to read data from RuuviTag Bluetooth beacon
Next, I SSH’d into my Raspberry Pi 4 and, to keep track of things, renamed the device to rpi4iot. This matches the device name in Azure IoT Hub; it’s not required, but it makes things cleaner, especially when troubleshooting, which I had to do a lot.
RuuviTag has an active community, and through it I found an open source Python library for communicating with the beacon. I’m very inexperienced with Python, so I checked whether there was a PowerShell or .NET Core library I could benefit from instead. I found some libraries (see here), but they were either overly complex with a lot of moving parts, or they required Windows 10 IoT, which I didn’t want to use.
I chose to try out the Python library for RuuviTag. Installation is a breeze:
sudo apt-get install bluez-hcidump && echo +++ install successful +++
sudo pip3 install --upgrade setuptools
pip3 install ruuvitag-sensor
This adds the necessary libraries for RuuviTag, and there’s also a command-line utility to quickly find nearby RuuviTag sensors:
The default data set the library returns for the RuuviTag is a Python dictionary that maps straight to JSON, and looks like this:
{'pressure': 1005.0, 'humidity': 78.0, 'data_format': 4, 'identifier': 'G', 'temperature': 14.0}
The data format is described here. By default, it gives me pressure, humidity, and temperature, along with a semi-unique identifier for the beacon (‘G’). Azure IoT Hub has a REST API that I can connect to directly, and it supports JSON (and Apache Avro), so I should be good here.
Using my rudimentary Python skills, I wrote a simple script to pull data from the RuuviTag:
from ruuvitag_sensor.ruuvi import RuuviTagSensor

# MAC address of my RuuviTag (see below for how I found it)
macs = ['RUUVITAG:MAC:ADDRESS:HERE']
timeout_in_sec = 5

# Returns a dict keyed by MAC address, one entry per sensor that responded
datas = RuuviTagSensor.get_data_for_sensors(macs, timeout_in_sec)
weatherdata = datas['RUUVITAG:MAC:ADDRESS:HERE']
I got the MAC address of the RuuviTag through the ruuvitag_sensor -f command-line utility. If I had more RuuviTag sensors, I could use the mobile app to scan for each of them, as the app also reveals the MAC addresses.
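The discovery can seemingly also be done from Python itself: if I read the library’s documentation correctly, passing an empty MAC list makes get_data_for_sensors listen to every beacon in range, which reveals the MAC addresses as well. A quick sketch:
# Listen for any RuuviTag broadcasts for 5 seconds and print what was heard;
# with an empty MAC list the library should report every beacon in range.
from ruuvitag_sensor.ruuvi import RuuviTagSensor

found = RuuviTagSensor.get_data_for_sensors([], 5)
for mac, reading in found.items():
    print(mac, reading)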
Configuring Raspberry Pi 4 to send messages to Azure IoT Hub
I now had access to data from my RuuviTag and a simple script to automate the retrieval of this data. I used crontab to run the script every minute:
# m h dom mon dow command
* * * * * python3 send-to-iothub.py
Next, I needed to somehow call the Azure IoT Hub REST API. I found a great-looking lab that builds something similar to what I needed. It uses an external sensor wired to a breadboard, but otherwise the idea is the same. The sample application looked quite complex, though, and it relies on C, so modifying the underlying logic seemed too challenging. There’s also a Python SDK for Azure IoT Hub, but I couldn’t get it to work on the Raspberry Pi 4.
To be fair, the README states that only a “handful of golden configurations” are supported, and apparently my setup wasn’t one of them. I ran out of disk space while trying to compile the SDK, and after a few hours of compiling and fixing numerous errors, I gave up.
I then tried building something simple for my needs with some drive-by Googling and a few hours of trawling Stack Overflow. I sort of got it to work, but not really. The issue I ran into was with SAS tokens, which IoT Hub relies on for authentication. Each SAS token expires, by default after 3,600 seconds, and generating a new one programmatically seemed too tricky.
I was able to generate valid tokens with az cli, the command-line interface for Azure, using the following command:
az iot hub generate-sas-token --device-id DEVICENAME --hub-name AZUREIOTHUBNAME
But first retrieving the RuuviTag data with Python and then shelling out to az cli didn’t feel like a proper solution.
Thankfully, Štěpán Bechynský from Microsoft has built an Azure IoT Hub client implementation for Python that works on the Raspberry Pi. You can find the repo here.
A separate library in the repo, DeviceClient.py, was all that I needed. It has a method called create_sas that generates working tokens.
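To demystify what such a method does: a device-scoped SAS token for IoT Hub is essentially an HMAC-SHA256 signature over the resource URI and an expiry timestamp, computed with the device’s primary key. Below is my own simplified sketch of that logic (not the code from the repo), just to show there’s no magic involved:
import base64
import hashlib
import hmac
import time
import urllib.parse

def create_sas_token(hub_name, device_name, device_key, ttl_seconds=600):
    # Resource the token grants access to: <hub>.azure-devices.net/devices/<device>
    uri = f"{hub_name}.azure-devices.net/devices/{device_name}"
    encoded_uri = urllib.parse.quote(uri, safe="")
    expiry = int(time.time()) + ttl_seconds

    # Sign "<encoded uri>\n<expiry>" with the Base64-decoded device key
    to_sign = f"{encoded_uri}\n{expiry}".encode("utf-8")
    key = base64.b64decode(device_key)
    signature = base64.b64encode(hmac.new(key, to_sign, hashlib.sha256).digest())
    encoded_signature = urllib.parse.quote(signature, safe="")

    # The resulting string goes into the Authorization header of IoT Hub requests
    return f"SharedAccessSignature sr={encoded_uri}&sig={encoded_signature}&se={expiry}"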
Using Štěpán’s repo and sample code, I was able to pull it together in Python:
import json

import DeviceClient
from ruuvitag_sensor.ruuvi import RuuviTagSensor

# START: Azure IoT Hub settings
KEY = "DEVICEPRIMARYKEY"
HUB = "AZUREIOTHUBNAME"
DEVICE_NAME = "DEVICENAME"
# END: Azure IoT Hub settings

macs = ['RUUVITAG:MAC:ADDRESS:HERE']
timeout_in_sec = 5

# Read the latest values from the RuuviTag (a dict keyed by MAC address)
datas = RuuviTagSensor.get_data_for_sensors(macs, timeout_in_sec)

# Authenticate against Azure IoT Hub with a SAS token valid for 600 seconds
device = DeviceClient.DeviceClient(HUB, DEVICE_NAME, KEY)
device.create_sas(600)

# Serialize the reading as UTF-8-encoded JSON
weatherdata = datas['RUUVITAG:MAC:ADDRESS:HERE']
encode_weatherdata = json.dumps(weatherdata, indent=1).encode('utf-8')
print(encode_weatherdata)

# Device to Cloud: send the message to IoT Hub
print(device.send(encode_weatherdata))
I need to encode the JSON as UTF-8, as Azure IoT Hub message routing expects content to be encoded in UTF-8, UTF-16 or UTF-32.
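As for the send itself, as far as I can tell it boils down to a POST against IoT Hub’s device-to-cloud REST endpoint with the SAS token in the Authorization header. Here’s my own rough sketch of that call (not the repo’s code, and the api-version value is an assumption):
import requests

def send_to_iot_hub(hub_name, device_name, sas_token, payload):
    # Device-to-cloud message endpoint of Azure IoT Hub;
    # the api-version value here is an assumption and may need updating
    url = (f"https://{hub_name}.azure-devices.net/devices/{device_name}"
           "/messages/events?api-version=2018-06-30")
    headers = {
        "Authorization": sas_token,        # token from create_sas
        "Content-Type": "application/json",
    }
    return requests.post(url, data=payload, headers=headers, timeout=10)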
Running my script locally via SSH on the Raspberry Pi 4, I can see that data is submitted to Azure IoT Hub successfully:
Using Azure IoT Hub metrics I can track that messages are being received in the cloud:
Visual Studio Code also has Azure IoT Hub tooling, via an extension, that allows for tracking and monitoring of messages. This tooling is also useful for one-off testing of SAS tokens and for sending test messages.
Storing RuuviTag beacon data in Azure
Finally, with my Raspberry Pi happily sending data from the RuuviTag beacon to Azure IoT Hub, it was time to do something with that data. There are two ways to get data out of Azure IoT Hub for storage: using message routing, or reading from the built-in endpoints.
Message routing lets you configure rules that route incoming messages based on device IDs, message contents and the like. It’s useful for executing custom logic when certain events appear in messages – such as when the temperature rises above a pre-defined value. Endpoints for message routing include Azure Blob Storage, Service Bus and Event Hubs.
I fiddled around with Blob Storage and got it working. But I also found out that my UTF-8-encoded JSON data got mangled along the way: it ended up stored in Azure Blob Storage as Base64, and after decoding it I was left with a JSON entry like this:
{\n "data_format": 4,\n "temperature": 13.0,\n "humidity": 78.0,\n "pressure": 1005.0,\n "identifier": "G"\n}
Cleaning this wouldn’t be too hard, I reasoned – especially when using Azure Functions for small operations like this.
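Something along these lines would have done it – a minimal sketch, assuming each routed message body lands in the blob simply as a Base64-encoded UTF-8 JSON string, as it appeared to in my case:
# Decode a routed message body back into a Python dict; json.loads copes fine
# with the embedded newlines that json.dumps(indent=1) produced on the device.
import base64
import json

def decode_routed_body(base64_body: str) -> dict:
    raw_json = base64.b64decode(base64_body).decode("utf-8")
    return json.loads(raw_json)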
Microsoft has published a comprehensive guide to Azure IoT reference architectures, which I hoped would give me some insight into how to build my storage solution. In the reference architecture, a Function app is recommended as a processing step before the data is stored – and I was now doing the opposite!
Back to the drawing board
I removed my Azure IoT Hub Message routing rules and had a look at the built-in endpoints.
Using the built-in endpoints, I could push my messages to Azure IoT Hub and have them retained there for up to a few days until I retrieved and processed them. This proved especially useful when debugging, as I didn’t lose any data in the meantime.
I decided to use an Azure Function that triggers whenever a new message arrives at the IoT Hub’s default endpoint. I built the Azure Function using Visual Studio 2019, and while the SDK helps, there’s a bit of trickery required to get this working.
Note: You can access the source code in my GitHub repo here.
To wire up the Function to trigger for IoT Hub messages, I’m using IoTHubTrigger for the messages/events endpoint.
public static void Run(
[IoTHubTrigger("messages/events", Connection = "ConnectionString")] EventData message,
ILogger log, ExecutionContext context)
Each message I receive arrives as a message object of type EventData. I can parse the JSON my device is sending simply by reading the body of the message:
var jsonData = JObject.Parse(Encoding.UTF8.GetString(message.Body.Array));
I also wanted to get the device ID (“rpi4iot”) and the timestamp of when the message hit IoT Hub:
var deviceId = message.SystemProperties["iothub-connection-device-id"].ToString();
var timestamp = message.SystemProperties["iothub-enqueuedtime"].ToString();
I struggled a bit with the timestamp, as I hadn’t bothered to set a locale or culture. I decided to convert the timestamp to the sortable ISO 8601 format ‘yyyy-MM-ddTHH:mm:ss’, which is what the “s” format specifier produces:
timestamp = DateTime.Parse(timestamp).ToString("s");
I opted to use Azure SQL Database to store my data, as it’s a mature technology and very affordable. Cosmos DB could have been another choice, but since I knew the amount of data I’d be generating was relatively low (about 44,640 rows per month at one row per minute), Azure SQL felt like the easy choice for now. Remember that I had already spent considerable time trying to figure out the Python SDK for Azure IoT Hub, so from here on I wanted to see results quickly.
I provisioned an Azure SQL Database along with an Azure SQL server, and created a simple table with the following structure:
CREATE TABLE [dbo].[RuuviTagData](
[timestamp] [datetime] NOT NULL,
[deviceId] [varchar](16) NOT NULL,
[identifier] [char](1) NOT NULL,
[temperature] [float] NOT NULL,
[humidity] [float] NOT NULL,
[pressure] [float] NOT NULL
) ON [PRIMARY]
GO
I then stored the connection string for my database in local.settings.json within my Visual Studio project. These settings are only used locally and are not published to the cloud.
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "ConnectionString": "Endpoint=sb://address-to-azure-iot-hub"
  },
  "ConnectionStrings": {
    "SqlConnection": "Server=fqdn-to-azure-sql-and-auth-info"
  }
}
I learned along the way that since I’m using Azure Functions 2.0, configuration values are read with the new ConfigurationBuilder(). I found some great code for figuring this out here.
To get the connection string in code, I’m using this:
var config = new ConfigurationBuilder()
    .SetBasePath(context.FunctionAppDirectory)
    .AddJsonFile("local.settings.json", optional: true, reloadOnChange: true)
    .AddEnvironmentVariables()
    .Build();

var sqlConnection = config.GetConnectionString("SqlConnection");
And then all that’s needed is a SqlConnection and a SqlCommand to write the row into the Azure SQL Database:
using (SqlConnection connection = new SqlConnection(sqlConnection))
{
    connection.Open();

    // Build the INSERT statement from the parsed message values. Quick and dirty
    // string concatenation is acceptable here since the values come from my own
    // device's telemetry, but parameterized queries would be the safer choice.
    StringBuilder sb = new StringBuilder();
    sb.Append("INSERT INTO RuuviTagData VALUES (");
    sb.Append("'" + timestamp + "', ");
    sb.Append("'" + deviceId + "', ");
    sb.Append("'" + jsonData["identifier"].ToString() + "', ");
    sb.Append(jsonData["temperature"].ToString() + ", ");
    sb.Append(jsonData["humidity"].ToString() + ", ");
    sb.Append(jsonData["pressure"].ToString());
    sb.Append(")");
    log.LogInformation(sb.ToString());

    // Execute the INSERT and close the connection
    using (SqlCommand cmd = new SqlCommand(sb.ToString(), connection))
    {
        cmd.ExecuteNonQuery();
    }
    connection.Close();
}
Running this locally shows that it works:
Upon publishing the Function to Azure, I needed to add my connection string to the Function App’s settings, under Connection Strings:
Testing and testing again
I tested my solution by running the Python script manually on the Raspberry Pi 4. I could then see on the Azure Function’s monitoring page that messages were being processed through IoT Hub:
Since Azure Functions use Application Insights by default, I also get a near-real-time metrics stream:
The Azure SQL Database is doing fine in terms of utilization:
And querying the database, I can see that rows are being added – one row each minute, as that’s what I set in crontab on the Raspberry Pi:
You can probably tell it’s summer in Finland, as the temperature is consistently between 12 and 14 degrees Celsius.
In summary
I learned a lot while building this nice little solution. It took me about 10 hours in total, including the few hours I lost to the Python SDK compilation issues.
I placed the RuuviTag beacon on my balcony, as it’s weather-proofed and can easily withstand the Finnish winter weather as well. The Raspberry Pi 4 sits about 5 meters away in my home office, connected to the Internet via Wi-Fi and using Bluetooth to pick up the sensor readings.
For Azure SQL, I continue using the Basic tier, which costs me 4.12 € a month. Azure IoT Hub is essentially free, and Azure Functions are also free under the consumption plan (I execute the Function about 45K times a month).
The hardest part was figuring out how to send data from the Raspberry Pi to Azure IoT Hub. A lot of adjustments and enhancements could be made to this solution, including:
- Visualizing the data using Power BI (and Direct Query)
- Using Stream Analytics for automated insights into the data
- Using Microsoft Flow to act depending on data – such as when the temperature exceeds a pre-defined limit
- Using Message Routing for long-term archival of data
Sources
You can find the source code for this project in my GitHub repo here.