Making use of a specifically designed software (SW) tool, the authors present the results of an activity to evaluate the energy consumption of buses for urban applications. Both conventional and innovative transport means are considered in order to draw comparative conclusions. The SW tool simulates the dynamic behaviour of the vehicles on measured real-world routes, making it possible to evaluate their energy performance on a Tank-to-Wheel (TTW) basis. Data of this kind, over such a wide and comparable range, were previously unavailable in the literature.
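As an illustration of the kind of Tank-to-Wheel evaluation described above, the following minimal Python sketch integrates wheel-side traction power over a speed profile using a simple longitudinal-dynamics model. All parameters and the speed trace are hypothetical placeholders, not values from the study, and a powertrain efficiency would still be needed to go from wheel energy to tank energy.

# Minimal TTW-style traction-energy sketch over a placeholder speed profile.
# All vehicle parameters and the speed trace are illustrative assumptions.
import numpy as np

mass = 18000.0                 # bus mass [kg] (assumed)
c_rr = 0.008                   # rolling-resistance coefficient (assumed)
rho, cd_a = 1.2, 6.5           # air density [kg/m^3], drag area Cd*A [m^2] (assumed)
g, dt = 9.81, 1.0              # gravity [m/s^2], sampling step [s]

v = np.abs(np.sin(np.linspace(0.0, 6.0, 600))) * 12.0   # placeholder speed trace [m/s]
a = np.gradient(v, dt)                                   # acceleration [m/s^2]

f_trac = mass * a + c_rr * mass * g + 0.5 * rho * cd_a * v**2   # tractive force [N]
p_wheel = np.maximum(f_trac * v, 0.0)                    # traction power, braking ignored [W]
e_wheel_kwh = float(np.sum(p_wheel) * dt) / 3.6e6        # wheel-side energy [kWh]
print(f"Traction energy over the placeholder cycle: {e_wheel_kwh:.2f} kWh")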


To study the driver's behavior in real traffic situations, we conducted experiments using an instrumented vehicle, which comprises:

(i) a camera, installed above the vehicle's side window and oriented toward the driver, and (ii) a Mobile Digital Video Recorder (MDVR).


These measurements were taken at the point of common coupling, where about 30 EV chargers are installed and operated by the utility, using the PQ-Box 200 power quality analyzer. For this reason, this data set only considers the charging behavior of the vehicles employed by the enterprise, namely the Renault Kangoo ZE and Renault Zoe. The period under consideration starts on 05.11.2018 and ends on 07.01.2020. Because of the large amount of data, values at a 10-minute interval are extracted and used in this data set.
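As an illustration only, the 10-minute extraction could be reproduced with a short pandas sketch; the file name and column names below are assumptions, not the actual layout of this data set.

# Hypothetical sketch: down-sample raw power-quality measurements to 10-minute values.
# 'pq_raw.csv', 'timestamp' and 'active_power_kW' are assumed names, not the real schema.
import pandas as pd

raw = pd.read_csv("pq_raw.csv", parse_dates=["timestamp"])
raw = raw.set_index("timestamp").sort_index()
ten_min = raw["active_power_kW"].resample("10min").mean()   # one value per 10-minute slot
ten_min.to_csv("pq_10min.csv")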


The T-Drive dataset, published by Microsoft Research, was used to construct the network topology.


Driving behavior plays a vital role in maintaining safe and sustainable transport. In the area of traffic management and control in particular, driving behavior is of great importance, since specific driving behaviors are significantly related to traffic congestion levels. Beyond that, it affects fuel consumption, air pollution, public health, and personal mental health and psychology. The use of smartphone sensors for data acquisition has emerged as a means to understand and model driving behavior. Our aim is to analyze driving behavior using smartphone sensor data streams.

Instructions: 

The datasets folder includes .csv files of sensor data such as accelerometer and gyroscope readings. The data were recorded in live traffic while the driver was executing certain driving events. Each one-way trip covered approximately 5-20 km. The smartphone was fixed horizontally in the vehicle's utility box. The vehicle type used for data recording was a light motor vehicle (LMV).
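A minimal sketch of how such a sensor stream might be screened for driving events is given below; the file name, column names, and threshold are assumptions, since the exact schema of the .csv files is not specified here.

# Hypothetical sketch: flag harsh-braking candidates in an accelerometer CSV.
# 'accelerometer.csv', 'time_s' and 'acc_y' (longitudinal axis) are assumed names.
import pandas as pd

acc = pd.read_csv("accelerometer.csv")           # columns assumed: time_s, acc_x, acc_y, acc_z
BRAKE_THRESHOLD = -3.0                           # m/s^2, illustrative threshold only
events = acc[acc["acc_y"] < BRAKE_THRESHOLD]     # samples with strong deceleration
print(f"{len(events)} harsh-braking samples out of {len(acc)}")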


This dataset was used in our work "See-through a Vehicle: Augmenting Road Safety Information using Visual Perception and Camera Communication in Vehicles" published in the IEEE Transactions on Vehicular Technology (TVT). In this work, we present the design, implementation and evaluation of non-line-of-sight (NLOS) perception to achieve a virtual see-through functionality for road vehicles.

Instructions: 

Non-Line of Sight Perception Vehicular Camera Communication

This project is an end-to-end Python 3 application whose continuous loop captures and analyses 100 frames per second to derive appropriate safety warnings.

Contact

Dr. Ashwin Ashok, Assistant Professor, Computer Science, Georgia State University

Collaborators

Project contents

This project contains 3 modules that should be run in parallel and interact with each other using 3 CSV files.

Modules

  1. non-line-of-sight-perception
  2. intelligent-vehicular-perception_ivp
  3. warning-transmission

CSV Files

  1. packet.csv
  2. perceived_info.csv
  3. receiver_action.csv

Usage:

The following commands must be run in parallel. For more information on the libraries needed for execution, see the detailed sections below.

# Terminal 1
python3 non-line-of-sight-perception/VLC_project_flow.py zed

# Terminal 2
python3 intelligent-vehicular-perception_ivp/src-code/ivp_impl.py

# Terminal 3
python3 warning-transmission/send_bits_to_transmitter.py

1. non-line-of-sight-perception : Object Detection and Scene Perception Module

For YOLO-v3 training and inference, this module is a fork of the public keras-yolo3 repository; refer to the readme of that repository for details. The relevant folders from that repository have been placed in the training and configuration folders of this repository.

Installation of python libraries

Use the package manager pip to install the required libraries.

pip install opencv-python
pip install tensorflow-gpu
pip install Keras
pip install Pillow

Hardware requirements

  1. This code was tested on a Jetson Xavier, but any GPU-enabled machine should be sufficient.
  2. ZED camera: a ZED camera is used to capture the images (a GPU is required to operate at 100 fps).
  3. (Optional) The code can be modified, as per the comments in the file, to use 'zed', 0 for a webcam, or a video path for .mp4 or .svo files; a minimal sketch of this source selection follows the list.
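The sketch below illustrates the capture-source choice in plain OpenCV terms only; it is not the project's actual capture code, and the 'zed' branch is left as a comment because the ZED SDK calls live in the module itself.

# Illustrative sketch of the capture-source selection (not the project's actual code).
import cv2

source = 0                      # 0 for a webcam, or e.g. "clip.mp4" for a video file
# source = "zed"                # the real module switches to the ZED SDK here (not shown)

cap = cv2.VideoCapture(source)  # OpenCV accepts integer camera indices and file paths
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # ... object detection / scene perception would run on 'frame' here ...
cap.release()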

Output

perceived_info.csv

2. intelligent-vehicular-perception_ivp : Safety Message/Warning Mapping Module

This module is responsible for making intelligent recommendations to the driver as well as generating safety warnings for the following vehicles on the road. The module's output is a fusion of the safety warning received through the VLC channel and the vehicle's own scene perception data.

Python Library Dependencies

  • json
  • operator
  • csv
  • enum
  • fileinput
  • re

Input

Output

The output is two-fold.

  • packet.csv : Intelligent recommendation to the driver.
  • receiver_action.csv : Generated packet bits. Each packet's bits are logged into the 'packet.csv' file, which works as a queue: every new packet logged there is eventually transmitted by the VLC transmission module (see the sketch after this list).
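The CSV-as-queue handoff between the modules could look like the following minimal sketch; the single-column layout is an assumption, since the real packet format is defined inside the modules.

# Hypothetical sketch of the CSV-as-queue handoff between modules.
# The producer appends rows; the consumer remembers how many rows it has processed.
# The single-column layout is an assumption, not the project's actual packet format.
import csv

def append_packet(path, packet_bits):
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([packet_bits])          # producer side: enqueue

def read_new_packets(path, already_read):
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    return rows[already_read:], len(rows)              # consumer side: fetch only new rows

append_packet("packet.csv", "101001110")               # e.g. a freshly generated packet
new_rows, seen = read_new_packets("packet.csv", 0)
print(new_rows)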

3. warning-transmission: Communication Module

Detailed transmitter notes, including hardware requirements, are provided in transmitter_notes.txt.

Python Library Dependencies

  • serial

Input

  • packet.csv : Intelligent Recommendation to the Driver.

Output

The LED flashes high/low in correspondence with the packets in the input file.
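A minimal transmitter sketch using pyserial is shown below; the serial port, baud rate, symbol duration, and one-byte-per-bit protocol are assumptions, and the real hardware details are in transmitter_notes.txt.

# Hypothetical sketch: flash packet bits out through a serial-connected LED driver.
# Port name, baud rate and the one-byte-per-bit protocol are assumptions.
import time
import serial

with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as port:
    packet_bits = "101001110"                      # would normally be read from packet.csv
    for bit in packet_bits:
        port.write(b"1" if bit == "1" else b"0")   # LED high for '1', low for '0'
        time.sleep(0.001)                          # illustrative symbol duration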

Dataset used for training the model

The dataset has been generated using Microsoft VoTT.

This is a "Brakelight" labelled dataset that can be used for training Brakelight detection models. The dataset contains brakelights labelled on images from

*Reference: Cui, Z., Yang, S. W., & Tsai, H. M. (2015). A vision-based hierarchical framework for autonomous front-vehicle taillights detection and signal recognition. In 2015 IEEE 18th International Conference on Intelligent Transportation Systems (ITSC) (pp. 931-937). IEEE.

The labelled dataset contains 1720 training images as well as a CSV file that lists 4102 bounding boxes in the format: image | xmin | ymin | xmax | ymax | label

This can be further converted into the format required by the training module using convert_dataset_for_training.py (replace annotations.txt with the Microsoft VoTT-generated CSV); a sketch of this kind of conversion is shown below.
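The hedged sketch here assumes the VoTT CSV columns are image, xmin, ymin, xmax, ymax, label, and that the training module expects keras-yolo3-style annotation lines (image path followed by space-separated xmin,ymin,xmax,ymax,class_id boxes). The file names and class mapping are placeholders, not the shipped convert_dataset_for_training.py.

# Hypothetical conversion sketch (the project ships its own convert_dataset_for_training.py).
# Assumed VoTT columns: image, xmin, ymin, xmax, ymax, label.
import csv
from collections import defaultdict

classes = {"Brakelight": 0}                            # assumed single-class mapping
boxes = defaultdict(list)
with open("vott_export.csv", newline="") as f:         # assumed export file name
    for row in csv.DictReader(f):
        boxes[row["image"]].append(
            f'{int(float(row["xmin"]))},{int(float(row["ymin"]))},'
            f'{int(float(row["xmax"]))},{int(float(row["ymax"]))},{classes[row["label"]]}'
        )
with open("train_annotations.txt", "w") as out:        # assumed output file name
    for image, b in boxes.items():
        out.write(image + " " + " ".join(b) + "\n")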

 

Acknowledgements

This work has been partially supported by the US National Science Foundation (NSF) through grants 1755925, 1929171, and 1901133.


The data collection was carried out over several months and across several cities, including but not limited to Quetta, Islamabad, and Karachi, Pakistan. Ultimately, the number of images collected for the Pakistani dataset was small: 359 in total. As in the German dataset, the images were distributed unevenly across the classes. All 359 images were then manually cropped to remove unwanted background, and the images were sorted into folders named after their corresponding labels.

Instructions: 

The dataset is divided by class; the images inside each folder are named randomly and their file names contain no useful labels.
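Because the folder name is the only label, a loader along the following lines could be used; the root directory name and the .jpg extension are placeholders.

# Hypothetical sketch: build an (image path, label) list from the folder-per-class layout.
# 'pakistani_signs' and the .jpg extension are placeholders.
from pathlib import Path

root = Path("pakistani_signs")
samples = [
    (str(img), class_dir.name)                       # label = folder name
    for class_dir in sorted(root.iterdir()) if class_dir.is_dir()
    for img in sorted(class_dir.glob("*.jpg"))       # file names themselves carry no labels
]
print(f"{len(samples)} images across {len(set(lbl for _, lbl in samples))} classes")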


This data set is shared to help readers reproduce the results (Figure 5 and Figure 6) of the manuscript entitled "Online System Identification of a Fuel Cell Stack with Guaranteed Stability for Energy Management Applications", published in IEEE Transactions on Energy Conversion.

If you use this data, please cite the following paper:


Ten use cases of container sway speed along the X-axis during loading and unloading procedures using a quay crane in the Klaipeda container terminal.

Instructions: 

The file includes 10 use cases of container sway speed, including the spreader.

Data samples dataX_1, dataX_3, dataX_5, and dataX_10 provide the sway speed along the X-axis during container unloading from a ship, while the other samples cover the opposite (loading) procedure.

The data sample called Y provides the time stamps.
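Assuming the samples are stored as MATLAB-style arrays in a single file (an assumption; the actual file format and file name are not stated here), one unloading case could be inspected with a short sketch like this:

# Hypothetical sketch: plot one unloading use case against the shared time stamps.
# 'sway_data.mat' is an assumed container holding the dataX_* samples and Y.
from scipy.io import loadmat
import matplotlib.pyplot as plt

data = loadmat("sway_data.mat")
t = data["Y"].squeeze()                 # time stamps
v = data["dataX_1"].squeeze()           # sway speed along the X-axis, unloading case 1
plt.plot(t, v)
plt.xlabel("time")
plt.ylabel("sway speed, X-axis")
plt.title("Container unloading: use case dataX_1")
plt.show()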

