This work contains data gathered by a series of sensors (PM10, PM2.5, temperature, relative humidity, and pressure) in the city of Turin, in northern Italy (more precisely, at coordinates 45.041903N, 7.625850E). The data were collected over a period of 5 months, from October 2018 to February 2019. The aim of the study was to address the calibration of low-cost particulate matter sensors and to compare their readings against official measurements provided by the Italian environmental agency (ARPA Piemonte).
A Densely-Deployed, High Sampling Rate, Open-Source Air Pollution Monitoring WSN
Documentation for the air pollution monitoring station developed at Politecnico di Torino by:
Edoardo Giusto, Mohammad Ghazi Vakili under the supervision of Prof. Bartolomeo Montrucchio.
This section describes our architecture from several points of view, from the hardware and software design to the communication protocols.
We target the following key characteristics of our system:
- Rapid and easy prototyping,
- Flexibility in connection scenarios, and
- Low cost combined with dependable components.
As each board has to include only a limited number of modules, and to facilitate prototype development, we select the Raspberry Pi single-board computer as the monitoring board. Given our constraints in terms of cost, size, and power consumption, we select its Zero Wireless (RPi0) version.
The basic operating principle of the system is the following. The data gathered from the sensors are stored on the MicroSD card of the RPi. At certain time intervals the RPi tries to connect to a Wi-Fi network and, if such a connection is established, it uploads the newly acquired data to a remote server. The Wi-Fi network is created by a mobile phone set to operate as a personal hotspot, while the remote server hosts the database storing all the performed measurements.
Wi-Fi connectivity was one of the requirements for the system; at the same time, the system itself should not produce unnecessary electromagnetic noise, which could impact the operation of the host's appliances.
To reduce the time during which the Wi-Fi connection was active, the Linux OS was configured to bring up the wireless interface at predefined time instants in order to connect to the portable hotspot.
Once connected to the network, the system performed the following tasks:
- synchronization of the system and RTC clocks with a remote Network Time Protocol (NTP) server,
- synchronization of the local samples directory with the remote directory residing on the server.
The latter task is performed using the UNIX rsync utility, which has to be installed on both machines.
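The two tasks above can be sketched as a small shell script, triggered at the predefined time instants (e.g. by a cron entry or a systemd timer). The interface name, NTP server, remote host, and paths below are illustrative assumptions, not the repository's actual configuration.

```shell
#!/bin/sh
# Hypothetical periodic upload script -- interface, server, and paths
# are assumptions for illustration only.

ip link set wlan0 up          # bring the Wi-Fi interface up
sleep 15                      # give it time to associate with the hotspot

# 1) synchronize the system clock with a remote NTP server,
#    then write it to the RTC
ntpdate pool.ntp.org && hwclock -w

# 2) mirror the local samples directory onto the remote server
rsync -az /home/alarm/ws/data/ user@server:/srv/ws/board01/

ip link set wlan0 down        # radio off again to limit electromagnetic noise
```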
To gather data from the sensors, a Python program has been implemented; it runs continuously, with a separate process reading from each physical sensor plugged into the board and writing to the MicroSD card.
It has to be noted that, for the PM sensors, the UART communication has to take place over GPIOs; the pigpiod daemon is therefore leveraged to create software serial ports on the Pi's pins.
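To illustrate the pigpio-based acquisition, the sketch below polls a single HPMA115S0 through a bit-banged serial port and parses its "read measurement" response (header 0x40 0x05 0x04, PM2.5 and PM10 as big-endian 16-bit values, then a checksum, following the sensor datasheet). The GPIO numbers and the polling loop are illustrative assumptions, not the repository's actual script.

```python
import time


def parse_hpma_response(frame):
    """Parse the 8-byte 'read measurement' response of the HPMA115S0.

    Layout (per the datasheet): 0x40 0x05 0x04, PM2.5 high/low,
    PM10 high/low, checksum = (65536 - sum of the first 7 bytes) % 256.
    Returns (pm25, pm10) in ug/m3, or None if the frame is invalid.
    """
    if len(frame) != 8 or frame[0:3] != b"\x40\x05\x04":
        return None
    if (65536 - sum(frame[:7])) % 256 != frame[7]:
        return None  # checksum mismatch
    return (frame[3] << 8 | frame[4], frame[5] << 8 | frame[6])


def main():
    import pigpio  # requires the pigpiod daemon to be running

    RX, TX = 14, 15                       # assumed GPIO wiring
    pi = pigpio.pi()
    pi.set_mode(TX, pigpio.OUTPUT)
    pi.bb_serial_read_open(RX, 9600)      # software UART RX on a GPIO
    # "read particle measurement" command from the datasheet
    pi.wave_add_serial(TX, 9600, b"\x68\x01\x04\x93")
    wid = pi.wave_create()
    pi.wave_send_once(wid)
    time.sleep(0.2)
    _, data = pi.bb_serial_read(RX)
    print(parse_hpma_response(bytes(data[-8:])))
    pi.bb_serial_read_close(RX)
    pi.stop()


if __name__ == "__main__":
    main()
```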
The directories on the remote server are a simple copy of the MicroSD cards mounted on the boards.
The data in these directories have then been inserted into a MySQL database.
Mechanical Design and Hardware Components
In order to easily stack more than one device together, a 3D printed modular case has been designed.
Several enclosing frames can be tied together using nuts and bolts, with the use of a single cap on top.
The figure shows the 3D board design, together with the final sensor and board configurations.
Each platform is equipped with 4 PM sensors (a good trade-off between size and redundancy),
1 Temperature (T) and
Relative Humidity (HT) sensor and
1 Pressure (P) sensor.
As our target was to capture significant data samples for particulate matter, we adopt the following sensors:
Honeywell HPMA115S0-XXX as PM sensor.
As one of our targets was to evaluate the suitability of these sensors for air pollution monitoring applications, we install 4 instances of this sensor on every platform.
This redundancy allows us to detect anomalous phenomena and to tolerate several kinds of malfunction, making the overall system more stable.
DHT22 as temperature and relative humidity sensor.
This sensor is very widespread in prototyping applications, with several open-source implementations of its driver library publicly available on the internet.
BME280 as pressure sensor.
This is a cheap but precise barometric pressure and temperature sensor which comes pre-soldered on a small PCB for easy prototyping.
The system also includes a Real Time Clock (RTC) module, which allows the operating system to retrieve the correct time after a sudden power loss. The chosen device is the DS3231, which communicates via the I2C interface and has native support in the Linux kernel.
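On Raspberry Pi kernels, one common way to make the DS3231 available to the OS is a device-tree overlay; the fragment below is a generic configuration sketch (file locations and tools can differ between distributions), not necessarily the exact setup of our image.

```shell
# /boot/config.txt -- enable the kernel ds3231 driver on the I2C bus
dtoverlay=i2c-rtc,ds3231

# After a reboot the module appears as /dev/rtc0 and can be used
# with hwclock, e.g.:
#   hwclock -r    # read the RTC
#   hwclock -s    # set system time from the RTC (after a power loss)
#   hwclock -w    # write system time to the RTC (after an NTP sync)
```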
As a last comment, notice that a Printed Circuit Board (PCB) was designed to facilitate the connection and soldering of the various sensors and other components.
The database structure can be created using the scripts located in the mysql_insertion folder:
mysql -u <user> [-h <host>] [-p] < create_db.sql
Load SQL data (SQL Format)
Data formatted as SQL can be loaded using the mysql command
mysql -u <user> -p WEATHER_STATION < db_whole_data.sql
The db_whole_data.sql file is available (gzip-compressed) in the SQL_data/ folder.
Load RAW data (CSV)
Data can be loaded using the Python script
sql_ins.py, available in the mysql_insertion folder:
python sql_ins.py <data_folder>
The script assumes the following folder structure:
Each folder contains a set of CSV files. The script automatically loads the data into the appropriate table, using the correct fields, which are specified as a list of parameters in the script. It is possible to edit the script to load only a subset of the folders.
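A minimal sketch of this loading step is given below; note that the sub-folder and table names are assumptions for illustration (the real mapping is the parameter list inside sql_ins.py), and that the rows are only collected here instead of being inserted into MySQL.

```python
import csv
import os

# Hypothetical mapping from data sub-folders to database tables;
# the actual names are defined inside sql_ins.py.
FOLDER_TO_TABLE = {"pm": "PM_MEASURE", "dht22": "HT_MEASURE", "bme280": "P_MEASURE"}


def collect_rows(data_folder):
    """Return {table_name: [csv_rows...]} for every known sub-folder."""
    out = {}
    for sub, table in FOLDER_TO_TABLE.items():
        folder = os.path.join(data_folder, sub)
        if not os.path.isdir(folder):
            continue  # allows loading only a subset of the folders
        rows = []
        for name in sorted(os.listdir(folder)):
            if name.endswith(".csv"):
                with open(os.path.join(folder, name), newline="") as f:
                    rows.extend(list(csv.reader(f)))
        out[table] = rows
    return out
```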
To replicate the experiments, the user should clone the Raspberry Pi image onto a MicroSD card (16-32 GB).
On Linux, this can be done with the command
dd if=/path/to/image of=/path/of/microsd bs=4M
The sampling scripts are run automatically at system startup by a systemd unit, which also handles the automatic respawn of the processes if problems occur. The data are stored in the
/home/alarm/ws/data directory, with filenames corresponding to the date of acquisition.
In order to upload these data to a database, it is possible to use the guide contained in the "database" directory.
In order to perform calibration and tests, it is recommended to take a look at the guide contained in the "analysis" directory. A Python class has been implemented to perform calibration of sensors against the ARPA reference ones. The resulting calibration can then be applied to a time window of choice.
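In its simplest form, such a calibration is a linear correction of the low-cost readings fitted against the co-located ARPA reference over a training window. The sketch below is a minimal self-contained version of that idea, not the repository's ws_analysis class.

```python
def fit_linear_calibration(raw, reference):
    """Ordinary least-squares fit of reference ~ a * raw + b.

    raw        -- low-cost sensor readings (list of floats)
    reference  -- co-located reference readings (same length)
    Returns the pair (a, b).
    """
    n = len(raw)
    mean_x = sum(raw) / n
    mean_y = sum(reference) / n
    sxx = sum((x - mean_x) ** 2 for x in raw)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(raw, reference))
    a = sxy / sxx
    return a, mean_y - a * mean_x


def apply_calibration(raw, a, b):
    """Apply a previously fitted calibration to a window of readings."""
    return [a * x + b for x in raw]
```

The fitted pair (a, b) can then be applied to any later time window of the same sensor, as in apply_calibration(window, a, b).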
The 3D model of the case has been developed using the SketchUp online software. The resulting model is split into 5 different parts, each small enough to fit in our 3D printer (Makerbot Replicator 2X).
The model is stackable, meaning that several cases can be put on top of each other, with a single roof piece.
Printed Circuit Board
The PCB has been developed using the KiCad software, creating a hat for the RPi0 that connects all the sensors.
WS Analysis library documentation (v0.2)
The aim of this package is to provide fast and easy access to, and analysis of, the Weather Station database. The package is located in the analysis directory and is compatible only with Python 3. Please refer to the readme file for more information.
│ ├── Cap_v0_1stpart.skp
│ ├── Cap_v0_2dpart.skp
│ ├── ws_rpzero_noGPS_v1.skp
│ ├── ws_sensors_2d_half_v2.skp
│ └── ws_sensors_half_v2.skp
│ ├── arpa_station.json
│ ├── board.json
│ ├── example.py
│ ├── extract.py
│ ├── out.pdf
│ ├── requirements.txt
│ ├── ws_analysis
│ │ ├── __pycache__
│ │ │ └── ws_analysis.cpython-37.pyc
│ │ ├── rpt.txt
│ │ └── script_offset.py
│ ├── ws_analysis.md
│ ├── ws_analysis.pdf
│ ├── ws_analysis.py
│ └── ws_analysis.pyc
│ ├── db_setup.html
│ ├── db_setup.md
│ ├── db_setup.pdf
│ ├── er_diagram.pdf
│ ├── mysql_insertion
│ │ ├── extract_to_file.py
│ │ ├── remove_duplicate.py
│ │ └── sql_ins.py
│ ├── SQL_Table
│ │ ├── create_db.sql
│ │ ├── create_measure_table.sql
│ │ └── load_data.sql
│ └── SQL_data
│ └── db_whole_data.sql.gz
│ └── WS_v2_output.tar.xz
│ ├── csv
│ │ ├── arpa_retrieve.py
│ │ ├── filemerge.py
│ │ ├── gpx2geohash.py
│ │ ├── parse_csv.py
│ │ └── validation.py
│ └── mpu9250
│ └── gyro.py