Facial Expression

In the realm of real-time communications, WebRTC-based multimedia applications are increasingly prevalent because they can be integrated seamlessly into Web browsing sessions. The browsing experience improves significantly compared with scenarios that rely on browser add-ons and/or plug-ins; still, the end user's Quality of Experience (QoE) in WebRTC sessions may be degraded by network impairments such as delays and losses.


The ability to perceive human facial emotions is an essential feature of various multi-modal applications, especially in the intelligent human-computer interaction (HCI) area. In recent decades, considerable effort has been put into researching automatic facial emotion recognition (FER). However, most existing FER methods focus only on either basic emotions in the classic seven/eight categories (e.g., happiness, anger, and surprise) or abstract dimensions (valence, arousal, etc.), while neglecting the rich nature of human emotional states.


This dataset contains facial expressions captured from different sides. The top-level videos were shot with a Logitech C270 webcam, and the bottom ones with an LG G6. The videos are continuous shots at 480p, recorded from different angles.
This dataset is meant to support facial expression recognition under varying angles and poses.