Air Quality Sensor Blog


My partner and I wanted to measure the air quality in our living room. We live on a main road in a busy part of South East London and have often wondered just how exposed we are to pollutants on a daily basis. That question has become even more pressing during the pandemic, since we now spend the majority of our time in the flat. Then we found a blog post on Hacker News on how to build your own air quality sensor.

That post tells you where to get the necessary parts and how to put together a PMS7003 air quality sensor at a small price. Whilst that set-up uses Java, I decided to use Python, because a very handy Python package for this exact sensor already exists. Plus, I wanted to focus on Python as much as possible to build up this skill. We also had a Raspberry Pi lying around, which was handy! So this project is sort of perfect: we want to know how gross our lungs are at any given moment in our flat, we've got the parts to build a sensor, and it gives me the opportunity to build APIs, work with live data and do some data warehousing.

With the sensor ready and whirring, the next part is to set up an API that allows us to send readings from the sensor to our database, as well as to read this information back (maybe as a dashboard?). So the next tasks on the agenda are:

  1. Construct a database (going for a PostgreSQL database to allow us to have more freedom later on)
  2. Set-up a basic REST API with Flask (allowing for POST and GET requests) in Python
  3. Create a script to take sensor readings at some frequency (to be decided) which are then posted to the database.
  4. Finally, do something with the data – once enough has been collected, we will do some analysis of air quality in our living room over time. We will probably also create a little live dashboard – because why the hell not!
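As a small taste of Task 3, here is a sketch of what decoding one raw sensor frame involves, based on the frame layout in the PMS7003 datasheet (header bytes, thirteen big-endian 16-bit fields, and a trailing checksum). The field names are chosen to match the table columns used later in this post; in practice the PMS7003 Python package handles all of this for you.

```python
import struct

# A PMS7003 frame is 32 bytes: the header b'\x42\x4d', a 2-byte frame
# length, thirteen big-endian 16-bit data fields, and a 2-byte checksum
# equal to the sum of every byte before it (per the datasheet).
def parse_frame(frame: bytes) -> dict:
    if len(frame) != 32 or frame[:2] != b'\x42\x4d':
        raise ValueError('not a PMS7003 frame')
    if sum(frame[:30]) != struct.unpack('>H', frame[30:])[0]:
        raise ValueError('checksum mismatch')
    fields = struct.unpack('>13H', frame[4:30])
    keys = ['pm1_0cf1', 'pm2_5cf1', 'pm10cf1', 'pm1_0', 'pm2_5', 'pm10',
            'n0_3', 'n0_5', 'n1_0', 'n2_5', 'n5_0', 'n10', 'reserved']
    return dict(zip(keys, fields))
```

The twelve readings we store are the first twelve fields; the thirteenth is reserved by the sensor.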

This post is part 1 of 4 and covers Task 1 of this list; it is intended to read like a diary. Links to the blogs and tutorials I followed can be found at the end.

I opted for a PostgreSQL database and I decided to use Heroku – mostly because I had used it before meaning that set up would be simple. For any sections where I couldn’t remember or figure out what I had to do, I followed Heroku’s own documentation.

With the database ready, I set about creating my project folder. The blogs and tutorials that I have followed suggest setting up a basic test Flask web app to make sure everything works. This will be edited and tailored later, when we're ready to set up our GET and POST routes.

from flask import Flask, request
from flask_sqlalchemy import SQLAlchemy
import os

app = Flask(__name__)
# Load the settings class named in the APP_SETTINGS env var,
# e.g. config.DevelopmentConfig
app.config.from_object(os.environ['APP_SETTINGS'])

db = SQLAlchemy(app)

# Imported after db is created to avoid a circular import
from models import Particles


@app.route('/')
def hello():
    return "Testing Flask app is working"


if __name__ == '__main__':
    app.run()

This already includes the SQLAlchemy and os imports. It also imports a SQLAlchemy model called Particles, which is the table that I have created to go into the Heroku PostgreSQL database – more on this shortly.

I then set up a database configuration script, also in Python; this is how I link my app to my Heroku PostgreSQL database (this is my script).

import os
basedir = os.path.abspath(os.path.dirname(__file__))

class Config(object):
    DEBUG = False
    TESTING = False
    # Connection string supplied by Heroku in the DATABASE_URL env var
    SQLALCHEMY_DATABASE_URI = os.environ.get('DATABASE_URL')
    SQLALCHEMY_TRACK_MODIFICATIONS = False

class ProductionConfig(Config):
    DEBUG = False

class StagingConfig(Config):
    DEBUG = True

class DevelopmentConfig(Config):
    DEBUG = True

class TestingConfig(Config):
    TESTING = True

In the base Config object, I include the key database connection details, which can be found in the Heroku app's Database Credentials tab. I then derive production, staging and development settings classes from it, with debugging enabled everywhere except production.
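One gotcha worth noting (this is my own addition, and only applies to newer library versions): Heroku's DATABASE_URL uses the postgres:// scheme, which SQLAlchemy 1.4+ no longer accepts. A small helper like this sketch can rewrite the prefix before it goes into the config:

```python
def database_uri(url):
    # SQLAlchemy 1.4+ rejects the 'postgres://' scheme that Heroku's
    # DATABASE_URL still uses, so rewrite it to 'postgresql://'.
    if url.startswith('postgres://'):
        url = url.replace('postgres://', 'postgresql://', 1)
    return url

# Hypothetical credentials for illustration; the real value comes
# from the DATABASE_URL environment variable on Heroku.
print(database_uri('postgres://user:secret@host:5432/sensordb'))
```

If your SQLAlchemy version accepts postgres:// URLs, you can skip this entirely.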

To store the air quality data, I needed to create a table in the Heroku database. Whilst it can be done through the Heroku command line interface, I decided to do this in Python. I wanted to gain experience working with SQLAlchemy and this felt like a good opportunity! Plus, it meant that I could keep a record of how everything was constructed. In this script, I specify the column names and the column value datatypes. The PMS7003 gives you 12 readings in total (you can check out the documentation for the sensor here and the PMS7003 Python package code here); to these I add ID and timestamp columns.

from app import db

class Particles(db.Model):
    __tablename__ = 'particles'

    id = db.Column(db.Integer, primary_key=True)
    TimeStamp = db.Column(db.DateTime())
    pm1_0cf1 = db.Column(db.Numeric())
    pm2_5cf1 = db.Column(db.Numeric())
    pm10cf1 = db.Column(db.Numeric())
    pm1_0 = db.Column(db.Numeric())
    pm2_5 = db.Column(db.Numeric())
    pm10 = db.Column(db.Numeric())
    n0_3 = db.Column(db.Numeric())
    n0_5 = db.Column(db.Numeric())
    n1_0 = db.Column(db.Numeric())
    n2_5 = db.Column(db.Numeric())
    n5_0 = db.Column(db.Numeric())
    n10 = db.Column(db.Numeric())

    def __init__(self, pm1_0cf1, pm2_5cf1, pm10cf1, pm1_0, pm2_5, pm10, n0_3, n0_5, n1_0, n2_5, n5_0, n10, TimeStamp):
        self.pm1_0cf1 = pm1_0cf1
        self.pm2_5cf1 = pm2_5cf1
        self.pm10cf1 = pm10cf1
        self.pm1_0 = pm1_0
        self.pm2_5 = pm2_5
        self.pm10 = pm10
        self.n0_3 = n0_3
        self.n0_5 = n0_5
        self.n1_0 = n1_0
        self.n2_5 = n2_5
        self.n5_0 = n5_0
        self.n10 = n10
        self.TimeStamp = TimeStamp

    def __repr__(self):
        return '<id {}>'.format(self.id)

    def serialize(self):
        return {
            'TimeStamp': self.TimeStamp,
            'pm1_0cf1': self.pm1_0cf1,
            'pm2_5cf1': self.pm2_5cf1,
            'pm10cf1': self.pm10cf1,
            'pm1_0': self.pm1_0,
            'pm2_5': self.pm2_5,
            'pm10': self.pm10,
            'n0_3': self.n0_3,
            'n0_5': self.n0_5,
            'n1_0': self.n1_0,
            'n2_5': self.n2_5,
            'n5_0': self.n5_0,
            'n10': self.n10
        }

I create the model (my table) called Particles, which I will then import into my scripts. The serialize method is not needed for the migration, but it is useful for returning a particle-readings object as a JSON response.
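To see why serialize is handy, here is a standalone sketch of turning a reading into a JSON response body, with a plain dict (and made-up values) standing in for the model; in the Flask app, jsonify does this final step for you.

```python
import json
from datetime import datetime

# A made-up reading, shaped like the output of Particles.serialize()
# (a few fields only, for brevity).
reading = {
    'TimeStamp': datetime(2021, 3, 1, 12, 30),
    'pm1_0': 4, 'pm2_5': 7, 'pm10': 9,
}

# datetime isn't JSON-serialisable by default, so supply a converter;
# Flask's jsonify handles this kind of thing for you.
body = json.dumps(reading, default=lambda v: v.isoformat())
print(body)
```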

Finally, I set up the Python script that manages the database migration, and then ran the first migration to send this table to my PostgreSQL database.

from flask_script import Manager
from flask_migrate import Migrate, MigrateCommand

from app import app, db

migrate = Migrate(app, db)
manager = Manager(app)

manager.add_command('db', MigrateCommand)

if __name__ == '__main__':
    manager.run()

To migrate everything, I execute the following in the terminal (assuming the migration script above is saved as manage.py):

python manage.py db init
python manage.py db migrate
python manage.py db upgrade

So, in summary…

Here is a list of a few blogs and tutorials that I have used in this part of the development:

  • This great blog post by Dushan Kumarasing on creating a web application with Python, Flask, PostgreSQL and deploying on Heroku
  • Also, this amazing blog by Jeremy Morgan on building APIs in Python. This one will be particularly useful in the next stages as he is working with a weather station connected to a Raspberry Pi which is similar to what I’m doing here

And here are the main packages used: Flask, Flask-SQLAlchemy, Flask-Migrate and Flask-Script, plus the PMS7003 Python package for the sensor itself.

Takeaways and next steps:

On the whole, this part of the set-up was pretty straightforward, especially since I just followed the steps outlined in the above tutorials. I would suggest following Dushan Kumarasing's blog for this stage. For the next part (Part 2: setting up a basic RESTful API with Flask), it will be better to lean on Jeremy Morgan's blog a little more, given the type of information that will be sent to and retrieved from the database, and because we will be working with the Raspberry Pi more.
