Monitoring temperature & other parameters with Wireless Sensor Tags, InfluxDB and Grafana
I have been looking for something that would enable monitoring the temperature — among other metrics — in my apartment. I wanted something that:
- enables temperature monitoring in multiple rooms
- has small sensors
- has reasonable battery life
- provides some sort of an API to access the measurements
… and there was of course the price & the security aspect as well.
Wireless Sensor Tags
The top candidate after some research was Wireless Sensor Tags, so I ordered what was necessary: one tag manager, and a selection of sensors (temperature/humidity, light, water), a.k.a. tags.
The communication between the tags and the Tag Manager happens in an open frequency band around 433MHz, over a proprietary protocol. This enables more efficient operation than WiFi/Bluetooth, saving a considerable amount of power without sacrificing range.
The Tag Manager then uploads the measurements to their cloud, the same place where one can administer the tags and view the data.
After connecting the Tag Manager to my home network, I had to register on the https://my.wirelesstag.net website and activate the tag manager using its serial number.
This page can be used to add/configure the sensors as well as to look at the data collected by the tag manager.
As the image above shows, there are plenty of settings when it comes to monitoring/alerting on the various metrics.
On the analytics side, however, the options are limited:
The data can be viewed either grouped by sensor (combined data across different metrics), or by type — all the temperatures on one chart.
The available options are probably enough for the majority of users, but I missed the flexible features of Grafana and the possibility to query the data like in a database.
Honestly speaking, I have no idea how & where the data is stored, nor was I interested in reverse-engineering the HTTP endpoints to be able to query the data myself.
Instead, I wanted to have the data on my premises (too). To achieve this, there is an interesting option present for all the sensors:
Choosing this opens a window with a lot of options — each of them is essentially a webhook calling option that gets triggered under various circumstances.
Each of these triggers accept a URL, an HTTP method, and optionally a request payload. But before setting up anything here, I needed some place to store the data first.
As a side-story: I have a Linux machine at home in the storage room, which I usually call my home server. Although it used to be much worse, it's still like a pretty piece of porcelain: when it shatters, I cry.
In other words, its setup is the result of hard labor spanning many, many hours, so when the motherboard or the storage dies, or I manage to permanently disable the system while experimenting on it, I have to start from scratch.
Long story short, I containerize all I can now, so at least, those parts are not lost (well, maybe except the data, sometimes… ).
I encountered InfluxDB at my previous job, and it fits the purpose perfectly. It is a database that was created specifically for time series, which means that it has special features to handle the time aspect of the data.
Among others, it has retention policy (to drop data after a certain age) and various time-related aggregation functions to make data more meaningful when querying.
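To give an idea of what these features look like, here is a sketch in InfluxQL; the database, measurement, and tag names are placeholders for illustration:

```sql
-- keep raw data for a year, then drop it automatically
CREATE RETENTION POLICY "one_year" ON "home" DURATION 52w REPLICATION 1 DEFAULT

-- hourly average temperature per tag over the last day
SELECT MEAN("value") FROM "temperature"
WHERE time > now() - 1d
GROUP BY time(1h), "tag"
```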
Luckily, there is an official Docker image for InfluxDB, so we can start building our stack with it.
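A minimal docker-compose service for it could look like this; the image tag and volume path are just examples, pick what fits your setup:

```yaml
version: "3"
services:
  influxdb:
    image: influxdb:1.8
    ports:
      - "8086:8086"
    volumes:
      # persist the data outside the container
      - ./influxdb-data:/var/lib/influxdb
    restart: unless-stopped
```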
So that should take care of the database.
At this point, I could have just tried to call InfluxDB directly from the Tag Manager — after all, InfluxDB does have an API for this purpose.
However, I wanted to have more control over the process, especially because I wanted to use separate measurements for temperature, humidity, etc.
Why, you ask? I found a question on Stack Overflow about this, and the answer made sense: one measurement should have as many fields as needed for the data to make sense, but no more.
So if we have a measurement that consists of two different metrics that are both needed to interpret the data, then they should be two fields on one measurement. Otherwise, use separate measurements.
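In InfluxDB line protocol terms, this guideline means writing points like the following (measurement and tag names are hypothetical), instead of cramming every metric into a single measurement:

```
temperature,tag=livingroom value=21.4
humidity,tag=livingroom value=43.1
```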
Implementing the Wireless Tags Receiver
The plan was that the Tag Manager calls the Receiver, which saves the data in InfluxDB, and returns 200 OK:
As for the infrastructure, just like for InfluxDB, I wanted to use a container for the Receiver, hence the boxes around those components.
I quickly set up a new Node.js project with Express, and added the InfluxDB NPM package. With those two, the core logic turned out to be quite simple:
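The heart of that logic is mapping the incoming JSON body to InfluxDB points. Here is a simplified sketch of that mapping; the payload field names (tagName, temperature, humidity) and measurement names are my assumptions for illustration, and the Express/influx wiring is shown only in the comments:

```javascript
// Map a webhook payload to InfluxDB points, one point per metric.
// The payload shape (tagName, temperature, humidity) is an assumption;
// adjust it to whatever the webhook is configured to send.
function toPoints(body) {
  const points = [];
  const tags = { tag: body.tagName };
  if (body.temperature !== undefined) {
    points.push({ measurement: 'temperature', tags, fields: { value: body.temperature } });
  }
  if (body.humidity !== undefined) {
    points.push({ measurement: 'humidity', tags, fields: { value: body.humidity } });
  }
  return points;
}

// In the real service this would run inside an Express POST handler,
// roughly like:
//   app.post('/measurement', (req, res) => {
//     influx.writePoints(toPoints(req.body))
//       .then(() => res.sendStatus(200))
//       .catch(() => res.sendStatus(500));
//   });

module.exports = { toPoints };
```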
This code expects that the data is sent in JSON format, having the following fields:
The next step was to inject the configuration (username, password, etc) — which is solved easily by sending these in as environment variables to the Docker container.
The following code snippet reads the respective settings and sets up the InfluxDB client:
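A sketch of that setup, with the environment variable names being my own placeholders; the actual client creation with the influx NPM package is indicated in the comment:

```javascript
// Build the InfluxDB client configuration from environment variables.
// The variable names (INFLUX_HOST etc.) and defaults are illustrative;
// use whatever names you pass into the Docker container.
function configFromEnv(env) {
  return {
    host: env.INFLUX_HOST || 'influxdb',
    port: parseInt(env.INFLUX_PORT || '8086', 10),
    database: env.INFLUX_DATABASE || 'home',
    username: env.INFLUX_USERNAME,
    password: env.INFLUX_PASSWORD,
  };
}

// With the influx NPM package, the client would then be created as:
//   const Influx = require('influx');
//   const influx = new Influx.InfluxDB(configFromEnv(process.env));

module.exports = { configFromEnv };
```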
You can find the full code here — there is some additional code for setting up Express, adding an HTTP endpoint, error handling, etc.
To wrap all these in a Docker container, I used the following Dockerfile:
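For a Node.js/Express service it would look something like this; the base image, entry point, and port are assumptions, not the exact file from the project:

```dockerfile
FROM node:lts-alpine
WORKDIR /app

# install dependencies first to make better use of the build cache
COPY package*.json ./
RUN npm ci --only=production

COPY . .

EXPOSE 3000
CMD ["node", "index.js"]
```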
Then I just ran docker build -t wirelesstags-receiver . and I had the image ready to be used in the updated docker-compose file. After running
docker-compose up -d, I had the following:
- an instance of InfluxDB listening on port 8086
- an instance of the Receiver, failing to connect to InfluxDB — as I forgot to create the database & the user!
Let’s fix that:
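Creating the database and the user can be done from the influx shell inside the container; the names and password below are placeholders, and the container name assumes the compose service is called influxdb:

```
$ docker exec -it influxdb influx
> CREATE DATABASE home
> CREATE USER receiver WITH PASSWORD 'secret'
> GRANT ALL ON home TO receiver
```

After restarting the Receiver container, it should connect without errors.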
Now it’s time to send some data!
Configuring the webhooks
The containers were running on my server (192.168.42.64), which is on the same subnet as the Tag Manager. This came in handy, because I could avoid exposing the Receiver endpoint to the internet.
By default, all the calls come from an external IP, from wherever the Wireless Tags servers are running, so the endpoints are expected to be reachable from outside my home network.
However, there is an option to trigger the call to the Receiver from the Tag Manager itself, which is already inside my home network, so no need to set up port forwarding (and authentication…).
With that, first I had to enter the Receiver’s URL:
Then set the payload, which is the data in the JSON format that the Receiver expects:
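As an illustration, a payload with hypothetical field names could look like the following; the actual readings are substituted by the Tag Manager's own placeholder variables, which I'm not reproducing here:

```json
{
  "tagName": "livingroom",
  "temperature": 21.5,
  "humidity": 40.2
}
```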
After quickly verifying that there are no errors in the Receiver logs, I set up all the tags the same way, so that finally I can gather some data.
To save power, I set the measurement interval on the tags to 10 minutes — so to get the graphs you’ll see below I had to wait quite a bit…
The visuals — finally
Now on to the fun part — visualizing the data!
Grafana too has an official Docker image, so it was relatively easy to add it to the stack:
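A sketch of the additional service in the same compose file; the volume path is an example, and persisting it keeps dashboards across container restarts:

```yaml
  grafana:
    image: grafana/grafana
    ports:
      - "3000:3000"
    volumes:
      # persist dashboards, users, and data sources
      - ./grafana-data:/var/lib/grafana
    restart: unless-stopped
```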
After logging in, the first thing I had to do was to add my InfluxDB database as a data source:
As for the username & password, using the same credentials for reading and writing is good for simplicity — bad for security. Up to you ;)
The rest is probably quite familiar for those who have seen Grafana before. I added a chart that used the newly added data source, querying data from one of the tags:
Doing the same for the rest of the sensors gave me the very first version of my home environment monitoring dashboard:
I quickly got tired of using basic credentials — username and password — to log in, so I set up Grafana to use GitHub as authentication provider.
That way I could log in by a click of a button.
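Grafana can be configured through environment variables, so enabling GitHub login in the compose file looks roughly like this; the client ID and secret come from a GitHub OAuth app you register, and the values below are placeholders:

```yaml
  grafana:
    environment:
      - GF_AUTH_GITHUB_ENABLED=true
      - GF_AUTH_GITHUB_CLIENT_ID=<client id>
      - GF_AUTH_GITHUB_CLIENT_SECRET=<client secret>
      - GF_AUTH_GITHUB_SCOPES=user:email,read:org
```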
Also, some of the tags are capable of measuring ambient light, so later I added support for that too: I just had to add a new field to the webhook's payload and handle it in the Receiver.
Now that I was able to monitor the temperature in the whole apartment, it was interesting (and sometimes frightening) to see the humidity and temperature values throughout the year.
But this was just the first step — read on for the rest: