The Air Quality and Sustainable Distancing sprint is exploring ways to make physical distancing sustainable and to understand its applications and implications. The Sensor Platform effort is prototyping a collection of different sensor capabilities: CO2 levels, motion sensing, object recognition, and atmospheric data. The goal is to evaluate these capabilities and experiment with ways to get actionable data.
Our hardware approaches:
- Arduino-based “mix & match” sensor modules
- Computer vision / object recognition
For instrumenting and monitoring the two elements of interest (air quality and sustainable distancing), we knew we would need some suite of sensors, but we didn’t yet know exactly which approaches would work. With this in mind, we started with a low-cost, open-source prototyping board that would allow us to buy a selection of sensors and “mix & match” them as we experiment. We chose an Arduino-based platform for its wide user base, support, and ecosystem of compatible devices. Of the many Arduino boards and third-party variants on the market, we identified one that fit the design criteria for this project. The Arduino MKR 1010 features:
- WiFi / Bluetooth / BLE enabled out of the box
- Small form factor
- Compatible with many types of sensors & devices (Grove ecosystem)
- Battery port
- SD card shield
The Grove ecosystem is a collection of “plug and play” devices designed to interface with Arduino through an attachable shield that remaps the pins to standardized 4-wire connectors, eliminating the need for breadboards.
Basic sensors and devices we thought could be useful:
- PIR (infrared) motion sensor
- Ultrasonic (distance) sensor
- Analog microphone
- CO2, temperature & humidity sensor
- PM2.5 particulate matter sensor
Covid-19 spreads through aerosols, but there is no commercial sensor (yet) for directly measuring those airborne viral particles, so we needed another way to estimate indoor air quality and the associated risk of Covid-19 transmission. We decided to work from the assumption that, since exhaled breath contains CO2, measuring the CO2 concentration in a room would approximate the amount of exhaled breath from occupants and, by proxy, the risk of exposure. This assumption has not been rigorously validated, but CO2 looks promising as a low-cost solution.

In addition to CO2, we measure temperature, humidity, and PM2.5 (particulate matter 2.5 micrometers or smaller). In previous work done by the TNL as part of the Hyper-Local Air Quality project, we used commercial Purple Air devices to track outdoor particulate matter levels. For this project, we took the Plantower PM2.5 sensors used by Purple Air and created an open-source version using Arduino. Although we were unable to establish serial communication between the Plantower PM2.5 and the MKR 1010 board due to a processor architecture incompatibility, we were successful with an Arduino Uno.
Plantower PM2.5—Arduino Uno schematic:
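Once the serial connection works, reading the sensor is mostly a matter of parsing its fixed-size data frames. As a hedged sketch (field offsets follow the publicly documented Plantower frame layout; the function name and returned keys are ours), a host-side Python parser for one 32-byte frame might look like:

```python
import struct

FRAME_LEN = 32  # full Plantower data frame: header + length + data + checksum

def parse_plantower_frame(frame: bytes) -> dict:
    """Parse one 32-byte Plantower data frame into PM values (ug/m3)."""
    if len(frame) != FRAME_LEN or frame[0:2] != b"\x42\x4d":
        raise ValueError("bad header or frame length")
    # Checksum is the 16-bit sum of every byte before the checksum field.
    (checksum,) = struct.unpack(">H", frame[30:32])
    if sum(frame[:30]) & 0xFFFF != checksum:
        raise ValueError("checksum mismatch")
    # Payload is big-endian 16-bit fields; indices 3-5 hold the
    # "atmospheric environment" PM1.0 / PM2.5 / PM10 readings.
    fields = struct.unpack(">13H", frame[4:30])
    return {"pm1.0": fields[3], "pm2.5": fields[4], "pm10": fields[5]}
```

On the Arduino Uno the equivalent logic runs in the sketch itself; this version is useful when the sensor is wired to a computer over USB-serial.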
Tabletop CO2 Monitor
We prototyped a tabletop CO2 monitor that signals the occupants of a room when the concentration of breath, and by proxy the risk, exceeds a predetermined threshold. To accomplish this, we used the MKR 1010 with a CO2 sensor, motion detector, LED, LCD display, and a single-cell 3.7V LiPo battery. The motion sensor is used to set a baseline CO2 level when no one has been in the room for some time. At baseline the LED indicates blue. Once it detects motion, the LED changes to green, indicating lower risk. As CO2 levels climb and exceed set thresholds, the LED changes to yellow and then red, indicating moderate and elevated risk respectively.
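The decision logic behind the LED is simple thresholding against the motion-calibrated baseline. The firmware itself is an Arduino sketch, but the logic can be sketched in Python (the margin values below are illustrative placeholders, not our calibrated settings):

```python
def led_color(co2_ppm, baseline_ppm, occupied,
              yellow_margin=300, red_margin=600):
    """Map a CO2 reading to an LED color.

    blue   = unoccupied baseline state
    green  = occupied, CO2 near baseline (lower risk)
    yellow = CO2 exceeds baseline by yellow_margin (moderate risk)
    red    = CO2 exceeds baseline by red_margin (elevated risk)
    """
    if not occupied:
        return "blue"
    excess = co2_ppm - baseline_ppm
    if excess >= red_margin:
        return "red"
    if excess >= yellow_margin:
        return "yellow"
    return "green"
```

Keying the thresholds to the baseline, rather than to absolute ppm values, lets the same device work in rooms with different outdoor-air CO2 levels.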
Using this prototype and a $15 anemometer to measure HVAC airflow, we ran some experiments.
Dinner Experiment #1:
- 4 occupants
- 41m^3 room volume
- 2 hour experiment (1 hour meeting)
- Poor ventilation: HVAC off, windows closed
Dinner Experiment #2:
- Same 4 occupants and room
- 1.75 hour experiment (1.5 hour meeting)
- Better ventilation: HVAC on, windows closed
  - 9.5 air changes per hour (ACH)
  - Standard range for a business conference room is 8-12 ACH
Night Experiment #1:
- 2 occupants
- Smaller room–35m^3 volume
- 11.5 hours
- Poor ventilation
Night Experiment #2:
- Same 2 occupants
- 9 hours
- Good ventilation
  - 33.5 air changes per hour
  - Standard for a bedroom is 5-6 ACH
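The ACH figures above come from the anemometer readings. As a hedged sketch (assuming a single supply vent and steady flow; the vent dimensions and velocity below are illustrative, not our measurements), air changes per hour is simply vent airflow divided by room volume:

```python
def air_changes_per_hour(velocity_m_s, vent_area_m2, room_volume_m3):
    """ACH = (vent face velocity * vent area * 3600 s/h) / room volume."""
    airflow_m3_per_hour = velocity_m_s * vent_area_m2 * 3600
    return airflow_m3_per_hour / room_volume_m3

# Illustrative: a 0.3 m x 0.3 m vent (0.09 m^2) in the 41 m^3 room.
# A face velocity of about 1.2 m/s would yield roughly 9.5 ACH.
```

With multiple vents, the per-vent airflows would be summed before dividing by the room volume.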
Further development on the tabletop CO2 device included additional motion sensors for a panoramic field of view and a custom 3D-printed case. In prototyping the new case, we saw the need for a smaller version designed to house individual sensors. Both versions feature sliding polycarbonate panels for easy access to the Arduino boards. They also have swappable lids, so different sensor configurations can be selected simply by swapping out the top.
Initial whiteboard sketches
3D CAD models
Testing the space constraints / layout
Failed prints and various iterations
For monitoring physical distancing, we have two different computer vision approaches:
- “Occupi” Occupancy Counter using Object Recognition
  - Raspberry Pi + Pi Camera + TensorFlow Lite
- Occupant Distance Detection using Depth Camera
  - Nvidia Jetson Nano + Intel RealSense D435
Raspberry Pi + Pi Camera
Occupi detecting Eddie Lopez
Nvidia Jetson Nano + RealSense D435 Depth Camera
Depth Camera Feed. Blue is closest to camera, red is furthest.
The TensorFlow Lite model we use is pretrained on several dozen common object classes, including people. In our testing, it was able to consistently identify us (as “person”) with a high degree of certainty, even with masks or hoods on. We wrote a Python script that runs on the Raspberry Pi and counts the number of objects identified as “person” above a certain certainty threshold. We can then upload this count to the cloud to be used by other sensors or to inform decision making. For example, we networked the Occupi to an Arduino with an LED so that when the occupancy limit is exceeded, the LED changes from green to red. A use case for this would be to signal those outside a room or confined area whether it is safe to enter.
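The counting step is a small filter over the detector's output. A hedged sketch of that logic (the `(label, score)` detection format is an assumption modeled on common TensorFlow Lite object-detection examples; the full script also handles camera capture and model setup):

```python
def count_people(detections, threshold=0.5):
    """Count detections labeled 'person' above a confidence threshold.

    detections: iterable of (label, score) pairs from the object detector.
    """
    return sum(1 for label, score in detections
               if label == "person" and score >= threshold)

def over_limit(detections, occupancy_limit, threshold=0.5):
    """True when the occupancy limit is exceeded (networked LED goes red)."""
    return count_people(detections, threshold) > occupancy_limit
```

The threshold filters out low-confidence detections so that, for example, a coat on a chair does not inflate the count.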
The Occupi can determine how many people are in a room, but it does not tell you anything about their distancing behavior. The depth camera gives us information about each person’s relative position in space and could be used to track physical distancing within a room. We have yet to implement this functionality.
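One way this could work (a sketch of the idea, not our implementation): deproject each detected person's pixel into 3-D camera coordinates using its depth value (the RealSense SDK provides `rs2_deproject_pixel_to_point` for this), then compute pairwise distances between people. Here the 3-D points are assumed to be already available:

```python
import math
from itertools import combinations

def pairwise_distances(points_3d):
    """Distances (m) between every pair of detected people.

    points_3d: list of (x, y, z) camera-space coordinates in meters,
    e.g. from deprojecting each person's bounding-box center.
    """
    return [(i, j, math.dist(points_3d[i], points_3d[j]))
            for i, j in combinations(range(len(points_3d)), 2)]

def too_close(points_3d, min_distance_m=1.8):
    """Index pairs violating a distancing threshold (~6 ft = 1.8 m)."""
    return [(i, j) for i, j, d in pairwise_distances(points_3d)
            if d < min_distance_m]
```

Because distances are computed in camera space rather than image space, two people who merely overlap in the 2-D frame but stand at different depths would not be flagged.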
Configure AWS Data Pipeline:
- Got RAND AWS access
- Registered the MKRWIFI and Raspberry Pi devices on AWS IoT
- Set certificates / rules / policies to establish communication
- Can publish / receive messages to / from IoT-connected devices
- Set up a DynamoDB database using an AWS Lambda function
- Transmit data from the device COM port to AWS in JSON or .xls format
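The device-to-cloud hop in this pipeline is an MQTT publish of a JSON payload to an AWS IoT Core topic. A hedged sketch of the payload side (the topic name, device ID, and field names are ours; the certificate paths and endpoint come from the AWS IoT device registration above):

```python
import json
import time

def sensor_message(device_id, co2_ppm, temp_c, humidity_pct):
    """Build the JSON payload a device publishes to AWS IoT Core."""
    return json.dumps({
        "device": device_id,
        "ts": int(time.time()),       # Unix timestamp of the reading
        "co2_ppm": co2_ppm,
        "temp_c": temp_c,
        "humidity_pct": humidity_pct,
    })

# Publishing (requires the account's AWS IoT endpoint and the device
# certificates issued during registration):
#   import paho.mqtt.client as mqtt
#   client = mqtt.Client()
#   client.tls_set("root-CA.pem", "device.pem.crt", "private.pem.key")
#   client.connect("<your-endpoint>.iot.us-east-1.amazonaws.com", 8883)
#   client.publish("tnl/sensors/co2",
#                  sensor_message("mkr1010-1", 612, 22.5, 41))
```

On the cloud side, an IoT rule can forward each message to the Lambda function that writes it into DynamoDB.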
Continuing this work, we will streamline our AWS pipeline and fully instrument a conference room and/or open workspace within PRGS. Further development can also be done on the air quality sensors, by integrating the CO2 and PM sensors, and on the depth camera rig.
TNL disclaimer: this work represents experimental, exploratory, and often in-progress or preliminary efforts. One goal of the TNL is to get interesting approaches, topics, and concepts discussed and presented quickly. This work has not been peer-reviewed and is not an official RAND publication. No warranties implied or expressed, your mileage may vary, enter as often as you like.