How to control Somfy IO with a Raspberry Pi

Another blog about home automation. We added Somfy IO motorized rolling shutters to our house recently with a Situo 5 IO remote to control them. But what about controlling these rolling shutters with a Raspberry Pi, preferably without an expensive hub? I did some tinkering and found a much cheaper solution.

Somfy IO or RTS

Somfy uses two types of wireless control for motorized rolling shutters: RTS and io-homecontrol. RTS stands for Radio Technology Somfy and uses the open 433 MHz radiofrequency for one-way control. Io-homecontrol is the newer two-way wireless technology. This technology is more secure and more reliable, but this also means that it is more difficult to control with products like a Raspberry Pi.

If you have rolling shutters with an RTS wireless receiver you should look for blogs about Raspberry Pi and rf433. This blog is about controlling Somfy IO rolling shutters with a Raspberry Pi. This solution won’t work with RTS rolling shutters.

What do you need?

Well, Somfy motorized rolling shutters and a Raspberry Pi of course. But what else? You’ll find a small shopping list below.

Do you need a hub?

Somfy offers two devices for smart home control of their rolling shutters: the TaHoma hub and the Connexoon io hub. The TaHoma costs about 300 euro and the Connexoon about 170 euro. That’s quite expensive! I already use a Raspberry Pi to control most of my smart home devices and I don’t want to add another hub just to move my rolling shutters. I want to control my rolling shutters without a hub.

Other requirements

I did some Google searches and I found the Somfy Izymo transmitter module. This module is meant for people who already have a wired push-button or rocker switch that they want to transform into an io control. When I got this module it cost about 35 euro, but the price has gone up a bit to about 50 euro.

Somfy Izymo transmitter

As you’ll see below, you’ll also need a relay with at least two channels. I used this two-channel relay that cost 6.95 euro.

Two-channel relay

And finally, you’ll need some wires to connect the Raspberry Pi, the Izymo module, and the relay. You could also use a breadboard or a small terminal block to make connections without soldering. This shouldn’t cost much. I had this stuff lying around from other projects.

You’ll also need a remote to pair the Izymo transmitter to the rolling shutters. You’ll only need to do this once and you can still use the remote like a regular remote afterward.

So here’s your shopping list:

  • A Somfy Izymo io transmitter module
  • A relay board with at least two channels
  • Some jumper wires, and optionally a breadboard or small terminal block
  • A Raspberry Pi
  • A Somfy io remote (such as the Situo 5 IO) that is already paired with your rolling shutters

Even cheaper option

During my search for a solution, I also found some blogs about soldering a Somfy remote directly to an Arduino or a Raspberry Pi. Since a single remote costs less than the Izymo module, this option works out a bit cheaper altogether. I did not choose this option, because I don’t want to butcher a remote for this. But it is an option.

Pairing the transmitter

Before you can use the Izymo transmitter it must be paired with the rolling shutters. The easiest way is to use another remote that is already paired with the rolling shutters. There’s a small button on the back of the Situo 5 IO remote that can be used to put the rolling shutters in association mode. Then briefly press the PROG 1 button (for channel S1) or PROG 2 button (for channel S2) on the transmitter and that should be it.

Wiring schematics

As you can see in the picture, the Izymo module has three wires: green, white, and yellow. If you connect the white wire to the yellow wire, the rolling shutters go up. And if you connect the white wire to the green wire, the rolling shutters go down. So to control the rolling shutters, you need to physically ‘switch’ between these wires. That’s what we’ll use the relay mentioned earlier for.

The relay has four pins to connect it to the Raspberry Pi. On the relay I got, the VCC pin is connected to 5 volts, the GND pin is connected to ground and the other two pins (IN1 and IN2) are connected to the GPIO pins. In my case, I used pins 13 and 26 as GPIO pins, but feel free to use other pins if that suits your project.

Wiring schematic

A Python code example

After connecting all the wires we can start writing some code. There are plenty of blogs about using the GPIO pins on a Raspberry Pi with Python. I can recommend this blog if this is your first project: https://pythonprogramming.net/gpio-raspberry-pi-tutorials/. The code below is actually not very different from the example in the other blog.

You can change the pin numbers to correspond with your own project. Then call the function move_shutters() with a direction (‘up’ or ‘down’) and the duration in seconds that you want to move the rolling shutters.

import RPi.GPIO as GPIO
import time

GPIO.setmode(GPIO.BCM)

# GPIO pins connected to the relay inputs (IN1 and IN2)
pin_up = 26
pin_down = 13

GPIO.setup(pin_up, GPIO.OUT)
GPIO.setup(pin_down, GPIO.OUT)

# Start with both relay channels switched off
GPIO.output(pin_up, GPIO.LOW)
GPIO.output(pin_down, GPIO.LOW)

def move_shutters(direction, sec):
    # Energize the relay channel that connects the Izymo wires for this direction,
    # wait for the given number of seconds, then release it again
    if direction == 'up':
        GPIO.output(pin_up, GPIO.HIGH)
        time.sleep(sec)
        GPIO.output(pin_up, GPIO.LOW)
    elif direction == 'down':
        GPIO.output(pin_down, GPIO.HIGH)
        time.sleep(sec)
        GPIO.output(pin_down, GPIO.LOW)

move_shutters('down', 20)

A Node.js example

The example below is a bit more elaborate. This is what I actually use in my own home automation project. It creates a Router for a Node.js Express app. I also added some lines to make it work on devices without GPIO pins, for testing purposes.

Controlling a Google Nest thermostat with Python

Google recently released its Device Access API. This article is a short explanation of how to access and use this API to control a Google Nest thermostat.

We moved to a new house a few months ago and the previous owners left their thermostat behind. This thermostat turned out to be a brand new Nest smart thermostat. Of course, I wanted to connect this smart device to my home automation. But unfortunately, after some searching, I found out that most connections were deprecated after Google acquired Nest a couple of years ago, along with the API access for consumers. The only option was to sign up for a waitlist.

But to my surprise, I got an email from Google a couple of weeks ago! The Device Access Console is now available to individual consumers as well as commercial partners. I wanted to find out if I could control my Google Nest thermostat so I wrote some Python code for this.

E-mail from Google

Get started with Google documentation

Google has written some quite elaborate documentation about all the steps needed to access the API. The purpose of this article is not to rewrite their help articles. These articles will get you started:

And you’ll also need these links:

How much does it cost?

Before creating your first project, you must pay a one-time, non-refundable fee (US$5) per account. There can also be costs involved in using Google Cloud services, but I have been polling my thermostat every 5 minutes for a couple of weeks now without any charges.

Gather all requirements

It’s tempting to begin creating a Device Access project right away (I did so too), but I found out that it is probably easier to get all the necessary prerequisites first. Before you can start coding you should follow these steps:

  1. Create a Google Cloud Console project at console.cloud.google.com
    1. Create Cloud Console OAuth 2.0 credentials
    2. Set a redirect URI
    3. Turn on the Smart Device Management API
  2. Create a Device Access project at console.nest.google.com

I made some screenshots from each step:

After creating an OAuth 2.0 client ID and a Device Access project you should have these values:

  • A project ID. You can find this in your Device Access console.
  • A client ID. You can find this with your OAuth credentials.
  • A client secret. You can find this with your OAuth credentials.
  • A redirect URI. This URI can be almost any URL you want, as long as this URL matches a URI that is entered as an authorized redirect URI in your Cloud Console project. 

Some example API calls with Python

Now we can start making API calls! Below is a Jupyter Notebook that walks through all the steps to control a Nest thermostat.

You can also make a copy of the Google Colab notebook I made.


Example for making Nest API calls

Please start by making a copy of this notebook.

Enter your credentials

Enter your credentials and run the cell to generate a login URL. Click the URL and log in to your Google account.

project_id = 'your-project-id'
client_id = 'your-client-id.apps.googleusercontent.com'
client_secret = 'your-client-secret'
redirect_uri = 'https://www.google.com'

url = 'https://nestservices.google.com/partnerconnections/'+project_id+'/auth?redirect_uri='+redirect_uri+'&access_type=offline&prompt=consent&client_id='+client_id+'&response_type=code&scope=https://www.googleapis.com/auth/sdm.service'
print("Go to this URL to log in:")
print(url)

After logging in you are sent to the URL you specified as redirect_uri. Google adds a query string to the end that looks like this: ?code=.....&scope=... Copy the part between code= and &scope= and add it below:

code = '4/add-your-code-here'

Get tokens

Now we can use this code to retrieve an access token and a refresh token:

# Get tokens

import requests

params = (
    ('client_id', client_id),
    ('client_secret', client_secret),
    ('code', code),
    ('grant_type', 'authorization_code'),
    ('redirect_uri', redirect_uri),
)

response = requests.post('https://www.googleapis.com/oauth2/v4/token', params=params)

response_json = response.json()
access_token = response_json['token_type'] + ' ' + str(response_json['access_token'])
print('Access token: ' + access_token)
refresh_token = response_json['refresh_token']
print('Refresh token: ' + refresh_token)

Refresh access token

The access token is only valid for 60 minutes. You can use the refresh token to renew it.

# Refresh token

params = (
    ('client_id', client_id),
    ('client_secret', client_secret),
    ('refresh_token', refresh_token),
    ('grant_type', 'refresh_token'),
)

response = requests.post('https://www.googleapis.com/oauth2/v4/token', params=params)

response_json = response.json()
access_token = response_json['token_type'] + ' ' + response_json['access_token']
print('Access token: ' + access_token)

Get structures and devices

Now let's get some information about which devices we have access to and where these are "located". Devices are part of a structure (such as your home). We can get information about the structures we have access to:

# Get structures

url_structures = 'https://smartdevicemanagement.googleapis.com/v1/enterprises/' + project_id + '/structures'

headers = {
    'Content-Type': 'application/json',
    'Authorization': access_token,
}

response = requests.get(url_structures, headers=headers)

print(response.json())

But we can also directly retrieve the devices we have access to:

# Get devices

url_get_devices = 'https://smartdevicemanagement.googleapis.com/v1/enterprises/' + project_id + '/devices'

headers = {
    'Content-Type': 'application/json',
    'Authorization': access_token,
}

response = requests.get(url_get_devices, headers=headers)

print(response.json())

response_json = response.json()
device_0_name = response_json['devices'][0]['name']
print(device_0_name)

Get device stats

For this example I simply took the first item of the array of devices. I assume most people probably have one Nest thermostat anyway.

The name of a device can be used to retrieve data from this device and to send commands to it. Let's get some stats first:

# Get device stats

url_get_device = 'https://smartdevicemanagement.googleapis.com/v1/' + device_0_name

headers = {
    'Content-Type': 'application/json',
    'Authorization': access_token,
}

response = requests.get(url_get_device, headers=headers)

response_json = response.json()
humidity = response_json['traits']['sdm.devices.traits.Humidity']['ambientHumidityPercent']
print('Humidity:', humidity)
temperature = response_json['traits']['sdm.devices.traits.Temperature']['ambientTemperatureCelsius']
print('Temperature:', temperature)

Set thermostat to HEAT

And last but not least, let's send some commands to our thermostat. The cell below contains the code to set the mode to "HEAT":

# Set mode to "HEAT"

url_set_mode = 'https://smartdevicemanagement.googleapis.com/v1/' + device_0_name + ':executeCommand'

headers = {
    'Content-Type': 'application/json',
    'Authorization': access_token,
}

data = '{ "command" : "sdm.devices.commands.ThermostatMode.SetMode", "params" : { "mode" : "HEAT" } }'

response = requests.post(url_set_mode, headers=headers, data=data)

print(response.json())

Set a new temperature

And finally we can set a temperature by executing this command:

set_temp_to = 21.0

# Set temperature to set_temp_to degrees

url_set_mode = 'https://smartdevicemanagement.googleapis.com/v1/' + device_0_name + ':executeCommand'

headers = {
    'Content-Type': 'application/json',
    'Authorization': access_token,
}

data = '{"command" : "sdm.devices.commands.ThermostatTemperatureSetpoint.SetHeat", "params" : {"heatCelsius" : ' + str(set_temp_to) + '} }'

response = requests.post(url_set_mode, headers=headers, data=data)

print(response.json())

Using Google Cloud Functions to get around Google Ads Scripts limitations

Cloudy lake

My first real venture into the more technical side of online marketing was when I learned some JavaScript to write my own Google Ads (AdWords) Scripts. With Google Ads Scripts you can automate tasks within your Google Ads account, even beyond the standard functionality of the regular interface. So if you want to go further than the options and buttons that any SEA specialist can use, Google Ads Scripts is the way to go.

But you probably already know this if you’re reading this article. You’re probably here because you found out that Ads Scripts also have limitations. This article explores the possibility of using Google Cloud Functions together with Google Ads Scripts as a workaround for these limitations. (Spoiler: it’s possible!)

I’ll show you a real-life example of something quite hard to do directly within Ads Scripts: fuzzy string matching on search terms. This is useful for classifying search terms that have spelling errors for example. We’ll get to that further down this article.

The limitations of Google Ads Scripts

Like I said, Google Ads Scripts are powerful, but they also have limitations. These limitations include:

  • Maximum execution time of 30 minutes. This sounds like a lot, but it’s a real limitation, mainly because:
    • You can only use synchronous JavaScript from the older JavaScript 1.6 standard, plus a few features from 1.7 and 1.8.
    • Your script runs on a server somewhere that is probably not solely dedicated to the fastest possible execution of your script, so execution is quite slow.
  • No external libraries like lodash or async. You can only use the methods that Google provides, so if you need something else you’re out of luck.

All in all, it’s easy to find yourself stuck in a situation where you’re looping over a loop inside another loop with runtimes that easily exceed 30 minutes. What if we’d move some of the heavy work outside Ads Scripts to another tool that does not have these limitations? That’s where Google Cloud functions come in!

Google Cloud Functions to the rescue

Cloud Functions is Google Cloud’s event-driven serverless compute platform. This is a complicated way of saying that you can execute functions written in Node.js, Python or Go in the cloud. It’s not a free tool, but there is a free tier that lets you make up to two million invocations per month without charge. This is plenty for our use case. I won’t go into detail here about how to set up a new Cloud Function. Check out one of these blogs to learn about setting up Cloud Functions: here, here or here.

With Cloud Functions, we can run our code on faster machines. We can also import external libraries from NPM for Node.js or pip for Python. And last but not least, we can run asynchronous code. The only limitation here is a maximum execution time of 9 minutes. But if this is a problem you could divide the workload over multiple Cloud Functions that run in parallel.

Google provides several ways to trigger a Cloud Function. For our use case, we need to create a Cloud Function that uses an HTTP trigger. This means that it can be triggered by either a GET request or a POST request. These requests, made from our Google Ads Script, are our gateway for passing information between Google Ads and Google Cloud.

How to make a POST request from Ads Scripts to a Cloud Function

Making a POST request from a Google Ads Script is fairly straightforward. We can use UrlFetchApp.fetch() to make the request and add an options object that contains a JSON payload. We can use this payload to send data from our Google Ads account to the Cloud Function.

This specific example populates the JSON payload with two key-value pairs: ‘example_content’ and ‘another_example_content’. The next section shows how to access this data in the Cloud Function.

The Cloud Function code

When you create a new Cloud Function, you can choose between a Node.js, Python or Go runtime. Google provides you with a code example for the runtime of your choice. I’ll give you an example that works with the Google Ads Script code example above for both Node.js and Python. I’m not familiar with Go, so I’ll skip that programming language if you don’t mind. 🙂

One thing worth mentioning is that the Cloud Function should always return a string. So keep this in mind when you’re writing your code.

Python

With Python, you can get to the ‘example_content’ from Google Ads through request.get_json().
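Here's a minimal sketch of what that could look like. The entry point name (handle_request) is my own choice, and the payload keys mirror the hypothetical example above:

import json

def handle_request(request):
    # Parse the JSON payload sent by the Google Ads Script
    request_json = request.get_json(silent=True) or {}
    example_content = request_json.get('example_content')
    another_example_content = request_json.get('another_example_content')

    # ... do the heavy lifting here ...

    # A Cloud Function should always return a string (see the note above)
    return json.dumps({
        'example_content': example_content,
        'another_example_content': another_example_content,
    })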

Node.js

With Node.js, you can access the ‘example_content’ from Google Ads through the express.js request with req.body.

Example use case: fuzzy string matching

Let’s see if we can use the ideas above in practice with an example. Let’s say you want to classify incoming search terms based on whether they contain a word from a list of words. This list of words could, for example, be a list of brand names. In JavaScript this could be accomplished like this:

However, this only works for exact matches. If the search term contains a spelling error it doesn’t work. We need fuzzy string matching for this. 

We could do this with JavaScript, as explained in this article, but there is an extra problem. That code compares two strings, such as “facebook” and “facebok”. But we want to find a string that might be part of a longer search query, such as “sign up for facebok online”. This requires us to compare the word we’re looking for to every part of this search query. These parts are called N-grams.

A search query with 5 words consists of 5+4+3+2+1 = 15 N-grams. Our example search query consists of these N-grams: “sign”, “up”, “for”, “facebok”, “online”, “sign up”, “up for”, “for facebok”, “facebok online”, “sign up for”, “up for facebok”, “for facebok online”, “sign up for facebok”, “up for facebok online”, and finally “sign up for facebok online”.
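As a side note, here's a small Python sketch (not part of the Ads Script itself) of how such N-grams can be generated, just to make the counting above concrete:

def ngrams(query):
    # Return every contiguous N-gram of a search query as a string
    words = query.split()
    grams = []
    for size in range(1, len(words) + 1):
        for start in range(len(words) - size + 1):
            grams.append(' '.join(words[start:start + size]))
    return grams

print(len(ngrams('sign up for facebok online')))  # prints 15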

Now you probably see why this could lead to problems with the maximum execution time. 😉 We would end up looping over every N-gram inside a loop over every search term. And if we’d be looking for multiple words we’d get a nested loop that is three layers deep. That’s not efficient at all! Can we find a more efficient way?

Requesting a Cloud Function from Google Ads

The code snippet below looks a lot like one of the examples above. Instead of example content, we filled the JSON object with some search terms and a list of brands. In the real world, these arrays could contain hundreds or even thousands of items.

A Cloud Function with FuzzyWuzzy

The Python package FuzzyWuzzy uses Levenshtein distance to calculate the differences between strings. It also has a method that takes a word and an entire list (array) of items at once and returns the item that is most alike, together with a score.

Importing FuzzyWuzzy

Before we can use FuzzyWuzzy in our Cloud Function, we must include it in our requirements.txt. To do this, add these two lines to requirements.txt:

python-Levenshtein>=0.12.0
fuzzywuzzy>=0.17.0

The Cloud Function code

Just as with the examples above, this code snippet accepts a request with a JSON object. This JSON object contains the two lists or arrays with search terms and brands. For each search term, it finds the brand that is most alike and returns that brand together with a score.
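Here's a sketch of what that Cloud Function could look like. The entry point name (fuzzy_match) and the payload keys (search_terms and brands) are my own assumptions; the matching itself is done by FuzzyWuzzy's process.extractOne():

from fuzzywuzzy import process

def fuzzy_match(request):
    # Parse the JSON payload sent by the Google Ads Script
    request_json = request.get_json(silent=True) or {}
    search_terms = request_json.get('search_terms', [])
    brands = request_json.get('brands', [])

    results = {}
    for term in search_terms:
        # extractOne returns the best matching brand plus a similarity score (0-100)
        results[term] = process.extractOne(term, brands)

    # A Cloud Function should return a string
    return str(results)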

The result

Our example returns this output:

{'buy nike shoes': ('nike', 100), 'adidas online': ('adidas', 100), 'rebok cloths': ('reebok', 56)}

The search term ‘buy nike shoes’ matches with the brand Nike with a 100% score. The search term ‘rebok cloths’ on the other hand matches the brand Reebok, but with a 56% score. A higher score means a closer match.

Now, all that’s left is to process these results. I’ll leave it up to you to come up with ideas on how to use Cloud Functions together with Google Ads Scripts! 🙂

How to find the fastest section within a GPX file with Python & Jupyter Notebooks

One of the fun things when you like both data and running is that you can collect loads of data by running with a GPS tracker or a mobile phone. With apps such as Strava, Runkeeper or Endomondo you get all kinds of statistics about your workouts. I’ve been using Endomondo for several years now and when I found this really interesting article by Steven van Dorpe about how to analyze GPS data with Python I wanted to try doing some analysis for myself.

Specifically, in this blog we’ll go through the steps to find the best or fastest section within a workout. For example, if I ran 12 kilometers, started slow and finished fast, what was the time for the fastest 10 kilometers during that entire workout? And what was my fastest 10 kilometer this year for example? Let’s find out!

There is this specific chart from Endomondo that I like a lot: my personal best for a specific distance over time. The interesting thing about this graph is that it shows my personal best for a specific distance from within a longer workout.

Endomondo graph for my personal best for 5K over time.

Unfortunately this graph has some limitations, especially when you’re not a premium (paying) user of Endomondo. The free version only shows personal bests for 5 kilometers or 3 miles. PB’s for other distances are only available for premium users. (I leave it up to you to decide if you find it ethical to replicate some functionality from the premium features.) 😉 And interesting stats such as my PB’s for specific combinations are not available at all, such as my fastest 5K from workouts over 10K over the past 12 months. (Sidenote: I’ve been a paying user for quite some time in the past and I also bought the app for a fixed price many years ago.)

Parsing and preparing the data

Anyway, let's see if we can build on Steven's instructions to find the fastest 5K, 10K, and so on within a workout. Endomondo and most other running apps let you download your workouts as GPX files. These files contain data about your GPS location, altitude and time. I use Python and JupyterLab notebooks (included with Anaconda) for parsing and analysing these files.

Let’s get coding! The first step is to include some packages:
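Something along these lines, assuming gpxpy for parsing the files, pandas for the DataFrame work and geopy for the distance calculations:

import os

import gpxpy
import pandas as pd
from geopy.distance import geodesic  # distance between two GPS coordinates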

Then declare some variables and parse the GPX file:
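A sketch of what this could look like; the file path is a placeholder, and the list of sections matches the distances in the tables further down (1 km, 1 mile, 3 km, 5 km and 10 km, in meters):

# Sections to search for, in meters
sections = [1000, 1609.34, 3000, 5000, 10000]

gpx_file = 'tracks/example_workout.gpx'  # placeholder path to a single GPX file

with open(gpx_file, 'r') as f:
    gpx = gpxpy.parse(f)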

In Steven van Dorpe’s article, all data is added using data = gpx.tracks[0].segments[0].points, but I found that GPX files differ: some contain multiple segments with relevant data, some don’t, and some contain multiple segments where the last segment should be omitted. You should really inspect your data to see what you need. For recent Endomondo GPX files I found that I should import all segments except the last one, unless the file contains only one segment, in which case we need that single segment.

For this blog let’s assume you can include all segments. Oh, and let’s also apply some sorting and filling to clean any problems in the data:
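A sketch under that assumption, collecting the points from every segment into one DataFrame and then sorting and filling:

# Collect the points from all segments of the first track
points = []
for segment in gpx.tracks[0].segments:
    points.extend(segment.points)

df = pd.DataFrame([{
    'lon': p.longitude,
    'lat': p.latitude,
    'alt': p.elevation,
    'time': p.time,
} for p in points])

# Sort by timestamp and fill any missing values
df = df.sort_values(by='time').reset_index(drop=True)
df = df.ffill().bfill()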

So far we’ve been following the instructions from the article by Steven, but from here we’ll be doing some things different. Instead of looping over data to calculate distances I decided to stick with Pandas and use vectorization to try to speed things up. At this point we’re working on a single GPX file, but eventually we may want to loop over a folder with multiple GPX files (270 in my case) and speed suddenly becomes important (as we’ll see later). Using vectorization and the apply() method to calculate distances involved adding some extra columns to the DataFrame with values that are “shifted one row backwards”:
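The column names below are my own; the idea is simply to put each point and its successor on the same row:

# Add the values of the next row as extra columns ('shifted one row backwards')
df['lon_next'] = df['lon'].shift(-1)
df['lat_next'] = df['lat'].shift(-1)
df['alt_next'] = df['alt'].shift(-1)
df['time_next'] = df['time'].shift(-1)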

Some GPX files gave me a hard time about timezones. These lines fixed this:
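Something along these lines should do it, normalizing all timestamps to UTC (the exact fix depends on how your GPX files store their timestamps):

# Make all timestamps timezone-aware UTC datetimes
df['time'] = pd.to_datetime(df['time'], utc=True)
df['time_next'] = pd.to_datetime(df['time_next'], utc=True)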

And then come the actual calculations for distances and time differences with Pandas apply(). The math behind this is explained thoroughly in the article by Steven van Dorpe mentioned earlier. If you haven't read his article yet, I definitely recommend doing that first and then continuing with the code below:
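A sketch using geopy's geodesic distance inside apply(); see Steven's article for the full derivation of the math:

# Distance (meters) and elapsed time (seconds) between consecutive points
df['dist'] = df.apply(
    lambda r: geodesic((r['lat'], r['lon']), (r['lat_next'], r['lon_next'])).meters
    if pd.notnull(r['lat_next']) else 0.0,
    axis=1)

df['time_delta'] = df.apply(
    lambda r: (r['time_next'] - r['time']).total_seconds()
    if pd.notnull(r['time_next']) else 0.0,
    axis=1)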

Then, as a final substep before the actual search for the fastest sections, I create a new DataFrame that only contains the columns needed for further calculations, to speed things up a little more. Here I also add cumulative sums of both columns.
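In code this could look like this (again, the column names are my own):

# Keep only the columns needed for the section search, plus their cumulative sums
df_section = df[['dist', 'time_delta']].copy()
df_section['dist_cum'] = df_section['dist'].cumsum()
df_section['time_cum'] = df_section['time_delta'].cumsum()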

Finding the actual fastest section

Remember that at the beginning of this blog we created a list with sections? Now it’s time to loop over this list and find the fastest kilometer, fastest 5 kilometer, fastest 5 miles, and so on.

If the current section is longer than the total distance of the entire workout there is no need to do any further analysis and we can skip ahead to the next iteration.

To find the fastest section within the entire workout we must loop over the DataFrame.* For every row, we locate the first row further ahead where the cumulative distance minus the cumulative distance at the current row is greater than or equal to the section distance.

*) I know that looping over a DataFrame is not the best solution in terms of speed. However, I have not found a way to get the same results without this loop. Did you find a faster way to do this? Let me know! 🙂

That’s a lot to take in… 😉 So let me give you an example. Let's say we want to find the fastest 50 meters from the virtual workout below:

Time          Distance
0 seconds     0 meters
10 seconds    10 meters
20 seconds    25 meters
30 seconds    35 meters
40 seconds    45 meters
50 seconds    55 meters
60 seconds    60 meters
70 seconds    70 meters
80 seconds    85 meters
90 seconds    90 meters
  • At 0 seconds and 0 meters we find the first row that is 50 meters or more ahead, which is the row at 50 seconds and 55 meters. The speed over this section was (55 meters / 50 seconds) = 1.1 meters per second.
  • Then we start at the second row with 10 seconds and 10 meters and find the first row that is 50 meters or more ahead, which is the row at 60 seconds and 60 meters. The speed over this section was ((60 meters – 10 meters) / (60 seconds – 10 seconds)) = 1.0 meters per second.
  • Then we start at the third row with 20 seconds and 25 meters and find the first row that is 50 meters or more ahead, which is the row at 80 seconds and 85 meters. The speed over this section was ((85 meters – 25 meters) / (80 seconds – 20 seconds)) = 1.0 meters per second.
  • … and so on…

The first iteration, with 1.1 meters per second, is the fastest section, because it covered the 50-meter section at the highest average speed.

This is what this looks like in code:
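Here's a sketch of that loop, using the variable names from the earlier snippets; the fastest candidate per section is the one with the highest average speed:

results = []
total_distance = df_section['dist_cum'].iloc[-1]

for section in sections:
    # Skip sections that are longer than the entire workout
    if section > total_distance:
        continue

    best = None  # (speed in m/s, elapsed seconds, covered meters)

    for i in range(len(df_section)):
        dist_start = df_section['dist_cum'].iloc[i]
        time_start = df_section['time_cum'].iloc[i]

        # First row where at least `section` meters have been covered since row i
        ahead = df_section[df_section['dist_cum'] - dist_start >= section]
        if ahead.empty:
            break  # no row far enough ahead anymore, so we can stop

        covered = ahead['dist_cum'].iloc[0] - dist_start
        elapsed = ahead['time_cum'].iloc[0] - time_start
        speed = covered / elapsed if elapsed > 0 else 0.0

        if best is None or speed > best[0]:
            best = (speed, elapsed, covered)

    if best is not None:
        results.append({
            'section_m': section,
            'seconds': best[1],
            'distance_m': best[2],
            'min_per_km': (best[1] / 60) / (best[2] / 1000),
        })

df_final = pd.DataFrame(results)
print(df_final)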

The result in df_final looks something like this:

The output of df_final
The best sections according to Endomondo
Section    Calculations based on time    Endomondo
1 km       283 seconds = 4:43            4:42
1 mi       468 seconds = 7:48            7:47
3 km       887 seconds = 14:47           14:45
5 km       1485 seconds = 24:45          24:43
10 km      3031 seconds = 50:31          50:31

As you can see, our calculations are pretty close, but not exactly the same. When you study the screenshot of the DataFrame more closely, you'll see that the distance of each section is also just a little bit more than the target section. So for the time it took to travel 3 km, I actually travelled 3.005 km. This would explain why we got one or two seconds more for each section. Let's see if the results are better when we account for this by using the average speed over the fastest section for our calculations:

Section    Calculations based on minutes per kilometer    Endomondo
1 km       4.699604 * 1 = 4:42                            4:42
1 mi       4.839385 * 1.60934 = 7:47                      7:47
3 km       4.918145 * 3 = 14:45                           14:45
5 km       4.944354 * 5 = 24:43                           24:43
10 km      5.051423 * 10 = 50:31                          50:31

Now my calculations exactly match the results from Endomondo! Nice!

Working with multiple GPX files

So far we’ve used only a single GPX file, but for statistics like the fastest 10K over the past 12 months we need to import all workouts during that timeframe. For my personal project I wanted to include all my workouts. So that’s what I did!

The following code is a combination of all the code snippets above, but it also collects all GPX files included in a subfolder /tracks/ and loops over all of them. This might take a while to run though, so make sure not to include too many files.
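A condensed sketch of that outer loop; fastest_sections() is a hypothetical helper that wraps the parsing, preparation and section-finding steps from the snippets above and returns a DataFrame for one file:

import glob

all_results = []

for gpx_path in glob.glob('tracks/*.gpx'):
    # fastest_sections() is assumed to bundle all the per-file steps shown earlier
    file_results = fastest_sections(gpx_path, sections)
    file_results['file'] = os.path.basename(gpx_path)
    all_results.append(file_results)

df_final = pd.concat(all_results, ignore_index=True)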

Final result

With the output data in df_final we can continue to make plots, extract data and so on. With just a few more lines of code I can, for example, find out how far I've run on my current shoes so far (709 km), make scatter plots of distance and date (apparently I had a habit of stopping at exactly 10 kilometers in 2016) or make my own version of the Endomondo graph at the top of this blog (it's not as fancy yet, but close).

All my workouts displayed as a scatter plot
Best 10 km’s per month

So this is it! We’ve looped over a DataFrame to find the fastest sections from a list of GPX files. Perhaps I’ll build on this script to build a nice dashboard for my running activities, but that’s something for later.

Some final thoughts

Probably my biggest “concern” with this entire endeavour is the contents of the GPX files compared to the Endomondo app and interface. As it turns out, the total distance of the workouts from the GPX files is just slightly lower than what Endomondo reports. And I know this discrepancy is not caused by my calculations, because I found that if I upload a GPX file back to Endomondo, it also shows a lower total distance!

An example workout sums up to a maximum total distance of 13303 meters or 13.30 kilometers in my calculations. According to the Endomondo interface, this workout should actually be 13.32 kilometers. However, when I upload this GPX file back to Endomondo, it reports only 13.29 kilometers. Weird…

Endomondo reports 13.32 km
Uploading the same workout would report 13.28 km

Also, I learned a lot about the importance of speed with this project. The original instructions by Steven van Dorpe were too slow for my use case (but very educational nonetheless). But even with all the vectorization and apply() methods that I could get to work in this project it will still take minutes or more to loop over multiple GPX files. So keep this in mind if you’re trying to build on my examples. 🙂