Artificial Intelligence
This dataset was collected with the goal of providing researchers with access to hundreds of images for efficient classification of plant attributes and multi-instance plant localisation and detection. There are two folders, Side View and Top View. Each folder includes image files in .jpg format and label files in .txt format. Images of 30 plants of three species (Petunia, Pansy and Calendula), grown in 5 hydroponic systems, were collected over 66 days for the purpose of collecting and analysing images.
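As a rough illustration of the folder layout described above, the Python sketch below pairs each .jpg image with a same-named .txt label file in the Side View and Top View folders. The folder names on disk and the image-to-label naming convention are assumptions, not details confirmed by the dataset description.

```python
import glob
import os

def pair_images_with_labels(view_dir):
    """Pair each .jpg image with a same-named .txt label file, if one exists."""
    pairs = []
    for img_path in sorted(glob.glob(os.path.join(view_dir, "*.jpg"))):
        label_path = os.path.splitext(img_path)[0] + ".txt"
        pairs.append((img_path, label_path if os.path.exists(label_path) else None))
    return pairs

# Folder names taken from the description ("Side View", "Top View");
# the exact directory names in the download may differ.
for view in ("Side View", "Top View"):
    pairs = pair_images_with_labels(view)
    print(f"{view}: {len(pairs)} images found")
```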
This dataset contains continuous gesture data for both Chinese and English, including 14 Chinese characters and 4 English words. The Chinese characters are: 不 (bù), 程 (chéng), 刀 (dāo), 工 (gōng), 古 (gǔ), 今 (jīn), 力 (lì), 刘 (liú), 木 (mù), 石 (shí), 土 (tǔ), 外 (wài), 中 (zhōng), 乙 (yǐ). The English words included are: 'can', 'NO', 'Who', 'yes'.
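As a minimal sketch of how these 18 gesture classes might be indexed for training a classifier, the snippet below builds a label map; the class ordering and the use of the raw characters and words as label names are assumptions, since the dataset's own label scheme is not described here.

```python
# Hypothetical label map for the 18 gesture classes listed above.
# The ordering is an assumption; the dataset may define its own indexing.
CHINESE_CLASSES = ["不", "程", "刀", "工", "古", "今", "力", "刘", "木", "石", "土", "外", "中", "乙"]
ENGLISH_CLASSES = ["can", "NO", "Who", "yes"]
LABEL_TO_INDEX = {name: i for i, name in enumerate(CHINESE_CLASSES + ENGLISH_CLASSES)}

print(len(LABEL_TO_INDEX))  # 18 classes in total
```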
With a plethora of new connections, features, and services introduced, the 5th generation (5G) wireless technology reflects the evolution of mobile communication networks and is here to stay for the next decade. The multitude of services and technologies that 5G incorporates has made modern communication networks highly complex and sophisticated. This complexity, along with the incorporation of Machine Learning (ML) and Artificial Intelligence (AI), gives attackers the opportunity to launch intelligent attacks against the network and network devices.
Ear biting is a welfare challenge in commercial pig farming. Pigs sustain injuries at the bite site, paving the way for bacterial infections. Early detection and management of this behaviour are important to enhance animal health and welfare and to increase productivity whilst minimising medication inputs. Pig management by physical observation is not practical given the scale of modern pig production systems, and the same applies to manual analysis of video captured in pig houses. Therefore, a method of automated detection is desirable.
The synthetic dataset has been produced by an industrial simulator that is able to generate a set of samples representing the working parameters of a device during its operation.
The simulator is not publicly available. The formalization of the domain used by the simulator is now the same as that adopted in the control system of the chillers. Thus, the simulator provides a good starting point for collecting data that describes the studied domain.
The deployment of unmanned aerial vehicles (UAVs) for logistics and other civil purposes is consistently disrupting airspace security. At the same time, there is a scarcity of robust datasets for developing real-time systems that can counter the incessant use of UAVs for criminal or terrorist activities. VisioDECT is a robust vision-based drone dataset for classifying, detecting, and countering unauthorized drone deployment using visual and electro-optical infra-red detection technologies.
To contribute to the development of automatic methods for the detection of bacilli, TBimages provides an image dataset composed of two subsets. TbImages_SS1 contains 10 images per field, captured at different focal depths, and aims to support the definition of autofocus metrics as well as the development of extended-focus imaging methods that facilitate the detection of bacilli in smear microscopy imaging. TbImages_SS2 aims to support the development of automatic bacilli detection.
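Because TbImages_SS1 is intended for defining autofocus metrics over 10-image focal stacks, a generic sharpness measure such as the variance of the Laplacian illustrates how the best-focused image of a field could be selected. This is only a sketch of a common baseline metric, not the metric proposed with the dataset, and the per-field folder layout shown is an assumption.

```python
import glob

import cv2  # OpenCV

def sharpness(image_path):
    """Variance of the Laplacian: a common focus score (higher means sharper)."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

# Hypothetical layout: one folder per field containing its 10 focal-depth images.
field_images = sorted(glob.glob("TbImages_SS1/field_001/*.jpg"))
if field_images:
    best = max(field_images, key=sharpness)
    print("Sharpest image in the stack:", best)
```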
For the purpose of experimentation, the historical stock prices of three petroleum companies, Pakistan State Oil (PSO), Hascol, and Attock Petroleum Limited (APL), are extracted for the last four years from the Pakistan Stock Exchange (PSX) website through a web scraper. Different attributes related to the stocks of each of these companies are extracted for each day. In addition, for each of these companies, Twitter data for sentiment analysis is extracted using Twint.
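For the Twitter collection step, Twint is driven through its Config object and twint.run.Search; the sketch below shows that pattern, but the query string, date range, and output file are illustrative assumptions rather than the exact settings used to build this dataset.

```python
import twint

# Example query for one of the three companies; terms and dates are placeholders.
c = twint.Config()
c.Search = "Pakistan State Oil"   # or "Hascol", "Attock Petroleum"
c.Since = "2018-01-01"
c.Until = "2021-12-31"
c.Lang = "en"
c.Store_csv = True
c.Output = "pso_tweets.csv"

twint.run.Search(c)
```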
In this study, we present advances in the development of proactive control for online individual user adaptation in a welfare robot guidance scenario, integrating three main modules: navigation control, visual human detection, and temporal error correlation-based neural learning. The proposed control approach can drive a mobile robot to autonomously navigate relevant indoor environments. At the same time, it can predict human walking speed based on visual information, without prior knowledge of the person's personality or preferences (i.e., preferred walking speed).
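As a loose illustration of correlation-based learning of this kind, the sketch below updates a linear predictor of walking speed by correlating input features with the temporal derivative of the prediction error. It is a generic ICO-style update under toy assumptions (random features, constant target speed), not the authors' module.

```python
import numpy as np

def correlation_learning_step(weights, inputs, error, prev_error, mu=1e-3):
    """Correlate each input with the temporal derivative of the error signal
    (a generic ICO-style update, not the exact rule used in the paper)."""
    d_error = error - prev_error              # temporal derivative of the error
    return weights + mu * inputs * d_error

# Toy usage with hypothetical visual features and a placeholder target speed.
rng = np.random.default_rng(0)
w = np.zeros(4)
prev_err = 0.0
for _ in range(100):
    x = rng.normal(size=4)                    # hypothetical visual features
    err = 1.2 - float(w @ x)                  # 1.2 m/s placeholder walking speed
    w = correlation_learning_step(w, x, err, prev_err)
    prev_err = err
print("learned weights:", w)
```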