Retrieving public data from Netatmo
I have Netatmo weather stations both at home and at our cabin. I retrieve the data from the stations to my own server, store it in CSV files as well as in an InfluxDB database, and use Grafana to plot the data and to generate monthly and annual reports. The retrieval is done with bash and Python scripts.
About a year ago I noticed that Netatmo offers retrieval of data from publicly shared weather stations through their API. For an area defined by latitude and longitude, the data from the stations in that area can be retrieved. I managed to retrieve the data, but as my Python skills were limited I struggled to extract specific values (e.g. temperature).
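For reference, the request boils down to a GET against the getpublicdata endpoint with an access token and a bounding box (the actual request is shown further down). A minimal sketch of the kind of thing that goes into params, with a placeholder token and a bounding box roughly matching the map below; the bounding-box keys are the lat_ne/lon_ne/lat_sw/lon_sw parameters documented for getpublicdata:

params = {
    'access_token': 'YOUR_ACCESS_TOKEN',  # OAuth token from the Netatmo developer portal
    'lat_ne': 65.0, 'lon_ne': 12.0,       # north-east corner of the area
    'lat_sw': 62.5, 'lon_sw': 9.0,        # south-west corner of the area
}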
Data typically looks like this, with some variety depending on the public data offered by the individual stations:
[ { "_id": "70:ee:50:36:dd:2e", "place": { "location": [ 9.877501, 62.842563 ], "timezone": "Europe/Oslo", "country": "NO", "altitude": 221, "street": "Gunnesgrenda" }, "mark": 14, "measures": { "02:00:00:36:e1:6e": { "res": { "1588713389": [ 5.7, 62 ] }, "type": [ "temperature", "humidity" ] }, "70:ee:50:36:dd:2e": { "res": { "1588713401": [ 1013.6 ] }, "type": [ "pressure" ] }, ...
Converting the JSON to a Python dictionary, I struggled with the nested loops needed to retrieve specific data: indexing by keys and values always ended in errors. I sought help from a Python-skilled colleague. He made a class that would parse a single dataset and extract the temperature information. I could, however, not convert it to parse through whole datasets, and I remember struggling with the extraction of the timestamps, which are keys in the dictionaries.
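In hindsight, the core of the idea can be sketched like this for a single station record such as the one above (my own reconstruction, not his code):

def extract_temperature(station):
    # station is one element of the 'body' list in the JSON above
    for module in station['measures'].values():
        if 'type' in module and module['type'][0] == 'temperature':
            # 'res' maps a single timestamp (a string key) to a list of values
            timestamp, values = list(module['res'].items())[0]
            return timestamp, values[0]
    return None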
About a year later, I decided to read up on how Python dictionaries work and how data can be retrieved from them. Suddenly I understood more of the dictionary structures and decided to have another go at parsing the data.
After quite a few hours of trying to parse the nested keys and values, I found a working solution. I guess the main problem was indexing dictionaries and lists interchangeably. I was particularly fascinated by the list comprehensions possible in Python (e.g. [x for x in range(20) if x % 2 == 0]).
Anyway, the code I ended up using looks like this (I need not remind you that I am not a professional programmer):

import json
import requests

# Authentication token and lat/lon parameters set in params
response = requests.get('https://api.netatmo.com/api/getpublicdata', params=params)
string = json.loads(response.text)

temperatures = []
timestamps = []
location = []
altitude = []

for i in string['body']:
    location.append(i['place']['location'])
    altitude.append(i['place']['altitude'])
    for j in i['measures']:
        try:
            if i['measures'][j]['type'][0] == 'temperature':
                temperatures.append(list(i['measures'][j]['res'].values())[0][0])
                timestamps.append(list(i['measures'][j]['res'].keys())[0])
        except KeyError:
            pass  # Wind and rain modules raise KeyError; indoor modules only report pressure
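As an aside, given my newfound fondness for list comprehensions, the temperature and timestamp extraction can also be written as comprehensions. A sketch that should produce the same two lists from the string variable above:

temperature_modules = [
    m for i in string['body'] for m in i['measures'].values()
    if m.get('type', [''])[0] == 'temperature'
]
temperatures = [list(m['res'].values())[0][0] for m in temperature_modules]
timestamps = [list(m['res'].keys())[0] for m in temperature_modules]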
Finally I had lists with lat/lon and temperature data. So what to do with the data? Calculating averages does not make sense, so I decided to plot the data with the Basemap extension of matplotlib. I found some nice examples and finally made a plot I was reasonably happy with. The generating code is as follows:
from mpl_toolkits.basemap import Basemap
import matplotlib.pyplot as plt

fig = plt.figure(figsize=(16, 16), dpi=300)
# Transverse Mercator projection over the area between the lower-left and upper-right corners
map = Basemap(llcrnrlon=9, llcrnrlat=62.5, urcrnrlon=12, urcrnrlat=65,
              resolution='f', projection='tmerc', lat_0=63, lon_0=10)
map.drawcoastlines()
map.shadedrelief()
map.drawcountries(color='gray')
map.drawstates(color='gray')

# Convert station lon/lat to map coordinates and plot the stations as black dots
xs, ys = map([i[0] for i in location], [i[1] for i in location])
map.plot(xs, ys, 'ko', markersize=5)

# Label each station with its temperature
for x, y, t in zip(xs, ys, temperatures):
    plt.text(x, y, str(t), fontsize=12)

plt.savefig('firstmap.png', format='png', dpi=300, bbox_inches='tight')
plt.show()
The plot looks like this:
I am only reasonably happy with the resolution of the basemap, so next I am going to dump the data into InfluxDB and see what Grafana can do with the Worldmap plugin.
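A rough sketch of what I have in mind, using the influxdb Python client; the measurement and tag names are just placeholders of mine, and it assumes the location, timestamps and temperatures lists line up one-to-one as in the plot above:

from influxdb import InfluxDBClient

client = InfluxDBClient(host='localhost', port=8086, database='netatmo')

points = []
for (lon, lat), ts, temp in zip(location, timestamps, temperatures):
    points.append({
        'measurement': 'public_temperature',          # placeholder measurement name
        'tags': {'lat': str(lat), 'lon': str(lon)},   # station position as tags
        'time': int(ts),                              # epoch seconds from the 'res' keys
        'fields': {'temperature': float(temp)},
    })

client.write_points(points, time_precision='s')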
So I guess the question remains whether this was a useful exercise. Plotting the temperature data in itself is probably not. But I have plans to place distributed gas sensors (VOC, ozone, nitrogen oxides) in the future, and this could be used to display live data from those sensors.
It was great fun, and I learned a lot about Python coding.