We list some face databases widely used for facial-expression-related studies and summarize their specifications below.
1. Binghamton University facial expression databases.
o Source: The Binghamton University facial expression databases are built by Lijun Yin's research group at Binghamton University (see the website under Reference).
o Purpose: The Binghamton University facial expression databases record images or videos of subjects with various facial expressions. They comprise multiple subsets; some contain 4D facial data, and some contain multi-modality facial data.
o Properties:
# of subjects: Varies with different data subsets.
# of images/videos: -
Static/Videos: Static and videos
Single/Multiple faces: Single
Gray/Color: Color
Resolution: -
Face pose: -
Facial expression: Various expressions
Illumination: -
3D data: 3D face scans
Ground truth: Facial expression and facial action unit annotations; some data subsets contain tracked facial landmark locations.
o Reference: Refer to the website: http://www.cs.binghamton.edu/~lijun/Research/3DFE/3DFE_Analysis.html
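The property tables in this list share a fixed set of fields, so the entries can be represented programmatically for side-by-side comparison. A minimal sketch of such a record type (the field names are our own, mirroring the table rows, not any official schema):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical record mirroring the property tables in this list.
@dataclass
class FaceDatabaseSpec:
    name: str
    n_subjects: Optional[int] = None           # None where the table says "-" or "N/A"
    n_items: Optional[int] = None              # number of images or videos
    media: str = "static"                      # "static", "video", or "static+video"
    color: str = "color"                       # "color" or "gray"
    resolution: Optional[Tuple[int, int]] = None  # (width, height) when known
    has_3d: bool = False
    ground_truth: str = ""

# The first entry above, encoded with this record type:
bu = FaceDatabaseSpec(
    name="Binghamton University facial expression databases",
    media="static+video",
    has_3d=True,
    ground_truth="expression and AU annotations; some subsets include tracked landmarks",
)
print(bu.name, bu.has_3d)
```

With all entries encoded this way, questions like "which databases provide 3D data?" become simple list comprehensions over the records.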
o Source: The AFEW and SFEW databases are built by Abhinav Dhall, Roland Goecke, Simon Lucey, and Tom Gedeon.
o Purpose: Acted Facial Expressions in the Wild (AFEW) is a dynamic temporal facial expression corpus of clips extracted from movies in close-to-real-world conditions. Static Facial Expressions in the Wild (SFEW) was developed by selecting frames from AFEW.
o Properties:
# of subjects: 330
# of images/videos: 1426 video sequences in AFEW; 700 images in SFEW (SPI category)
Static/Videos: Videos in AFEW; static images in SFEW
Single/Multiple faces: Multiple
Gray/Color: Color
Resolution: -
Face pose: Various poses
Facial expression: Angry, Disgust, Fear, Happy, Neutral, Sad, Surprise
Illumination: Various illuminations
3D data: Coarse head pose labels
Ground truth: Five facial landmark annotations for some images
o Reference: Refer to the papers: Abhinav Dhall, Roland Goecke, Simon Lucey, and Tom Gedeon, "Collecting Large, Richly Annotated Facial-Expression Databases from Movies", IEEE Multimedia, 2012. Abhinav Dhall, Roland Goecke, Simon Lucey, and Tom Gedeon, "Static Facial Expressions in Tough Conditions: Data, Evaluation Protocol and Benchmark", First IEEE International Workshop on Benchmarking Facial Image Analysis Technologies (BeFIT), IEEE International Conference on Computer Vision (ICCV 2011), Barcelona, Spain, 6-13 November 2011.
o Source: This database is provided by the Second Emotion Recognition in the Wild (EmotiW) Challenge and Workshop.
o Purpose: This database is primarily used to evaluate emotion recognition methods in real-world conditions.
o Properties:
# of subjects: N/A
# of images/videos: Training (578 videos), validation (383 videos), and test sets (N/A)
Static/Videos: Video and audio
Single/Multiple faces: Single
Gray/Color: Color
Resolution: N/A
Face pose: Various face poses
Facial expression: 7 basic facial expressions: Anger, Disgust, Fear, Happiness, Neutral, Sadness, and Surprise
Description of facial expression: Video clips from movies
Illumination: N/A
Accessories: N/A
3D data: N/A
Frame rate: N/A
Ground truth: Facial expression label for each video
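Since the ground truth is one categorical label per video, training a classifier on this data starts with a label-to-index mapping over the seven expression classes. A minimal sketch (the alphabetical ordering here is our own assumption for illustration; the challenge protocol may define a different one):

```python
# Seven expression classes from the table above, in alphabetical order
# (an assumed ordering, not the challenge's official one).
EMOTIONS = ["Anger", "Disgust", "Fear", "Happiness", "Neutral", "Sadness", "Surprise"]
LABEL_TO_INDEX = {name: i for i, name in enumerate(EMOTIONS)}

def encode(labels):
    """Map per-video label strings to integer class indices."""
    return [LABEL_TO_INDEX[label] for label in labels]

print(encode(["Happiness", "Neutral", "Anger"]))  # → [3, 4, 0]
```

Keeping the mapping in one place ensures the training and validation splits are encoded consistently.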
o Source: This database, the Cohn-Kanade database, is provided by Jeff Cohn from Carnegie Mellon University.
o Purpose: This database is widely used as the standard benchmark for facial action unit recognition systems. It may also be used for facial expression recognition and face recognition.
o Properties:
# of subjects: 100 university students (released version); 65% female, 15% African-American, and 3% Asian or Latino
# of images/videos: 486
Static/Videos: Videos
Single/Multiple faces: Single
Gray/Color: Eight-bit gray
Resolution: 640 × 490
Face pose: Frontal view only
Facial expression: 23 facial displays including single AUs or combinations of AUs
Description of facial expression: Neutral to apex; posed facial expressions
Illumination: N/A
Accessories: N/A
3D data: N/A
Frame rate: 12 frames/sec
Ground truth: AU label for the final frame in each image sequence; identifications of subjects
o Source: This database, the MMI Facial Expression Database, is provided by M. Pantic and M. F. Valstar.
o Purpose: This database is primarily used to evaluate facial action unit recognition systems. It may also be used for face recognition.
o Properties:
# of subjects: 43
# of images/videos: 1280 videos and over 250 images
Static/Videos: Videos and static images
Single/Multiple faces: Single
Gray/Color: Color
Resolution: 720 × 576
Face pose: Frontal view or dual view (frontal and profile) captured by two cameras simultaneously
Facial expression: 79 facial displays including single AUs or combinations of AUs
Description of facial expression: Neutral-apex-neutral; posed facial expressions
Illumination: N/A
Accessories: N/A
3D data: N/A
Frame rate: 24 frames/sec
Ground truth: AU label for the image frame with the apex facial expression in each image sequence; some image sequences have been FACS-coded for each frame; extensive subject metadata
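For the sequences that are FACS-coded frame by frame, a simple heuristic for locating the apex frame is to pick the frame with the most active action units. A sketch under the assumption that per-frame codes are available as lists of active AU numbers (a hypothetical representation for illustration; the database's actual annotation format differs):

```python
def apex_frame(frame_aus):
    """Return the index of the frame with the most active action units.

    frame_aus: list where frame_aus[t] is the collection of AU numbers
    active at frame t. Ties go to the earliest such frame, which suits
    a neutral-apex-neutral sequence.
    """
    return max(range(len(frame_aus)), key=lambda t: len(frame_aus[t]))

# Toy neutral -> apex -> neutral sequence: AUs ramp up, peak, then release.
sequence = [[], [12], [6, 12], [6, 12, 25], [6, 12], []]
print(apex_frame(sequence))  # → 3
```

Where AU intensities are coded rather than just presence, summing intensities per frame instead of counting AUs would be the analogous criterion.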
o Source: This database, the Japanese Female Facial Expression (JAFFE) database, was planned and assembled by Miyuki Kamachi, Michael Lyons, and Jiro Gyoba.
o Purpose: This database is primarily used to evaluate facial expression recognition systems. It may also be used for face recognition.
o Properties:
# of subjects: 10
# of images/videos: 213
Static/Videos: Static
Single/Multiple faces: Single
Gray/Color: Eight-bit gray
Resolution: 256 × 256
Face pose: Frontal view
Facial expression: 7 facial expressions: neutral, sadness, surprise, happiness, fear, anger, and disgust
Description of facial expression: Posed facial expressions
Illumination: N/A
Accessories: N/A
3D data: N/A
Frame rate: N/A
Ground truth: Facial expression label; identifications of subjects
o Reference: Refer to the paper: Michael J. Lyons, Shigeru Akamatsu, Miyuki Kamachi, and Jiro Gyoba, "Coding Facial Expressions with Gabor Wavelets", Proc. of FGR98, pp. 200-205, 1998.
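The reference above codes expressions via Gabor wavelet responses. A minimal NumPy sketch of one Gabor kernel and its response map on a 256 × 256 grayscale image, the resolution listed in the table (the kernel parameters are illustrative, not those of the paper):

```python
import numpy as np

def gabor_kernel(size=31, sigma=4.0, theta=0.0, wavelength=8.0):
    """Real part of a Gabor filter: a Gaussian-windowed cosine grating."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_t = x * np.cos(theta) + y * np.sin(theta)        # rotate coordinates
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))  # Gaussian window
    carrier = np.cos(2 * np.pi * x_t / wavelength)      # cosine grating
    return envelope * carrier

def filter_image(img, kernel):
    """Convolve image with kernel via FFT (circular boundary)."""
    pad = np.zeros_like(img, dtype=float)
    kh, kw = kernel.shape
    pad[:kh, :kw] = kernel
    # Shift the kernel so its center sits at the origin before the FFT.
    pad = np.roll(pad, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(pad)))

img = np.random.default_rng(0).random((256, 256))  # stand-in for a face image
response = filter_image(img, gabor_kernel())
print(response.shape)  # (256, 256)
```

In Gabor-based coding, a bank of such kernels at several orientations and wavelengths is applied, and response magnitudes at facial landmarks form the feature vector.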
o Source: This database is created by Queen's University Belfast.
o Purpose: This database is primarily used for emotion recognition. It may also be used to evaluate algorithms for facial expression and facial action unit recognition under spontaneous conditions.
o Properties:
# of subjects: 125 (31 males and 94 females)
# of images/videos: >250
Static/Videos: Videos (audio-visual)
Single/Multiple faces: Single
Gray/Color: Color
Resolution: N/A
Face pose: Various
Facial expression: Wide range of facial expressions
Description of facial expression: Neutral-apex-neutral; spontaneous facial expressions
Illumination: Indoor
Accessories: N/A
3D data: N/A
Frame rate: N/A
Ground truth: Identifications of subjects; emotional descriptors for each sequence
o Reference: Refer to the paper: E. Douglas-Cowie, R. Cowie, and M. Schroeder, "The Description of Naturally Occurring Emotional Speech", 15th ICPhS, Barcelona.