The VIPL-HR database is a database for remote heart rate (HR) estimation from face videos captured under less-constrained conditions. It contains 2,378 visible-light (VIS) videos and 752 near-infrared (NIR) videos of 107 subjects, recorded under nine different conditions covering various head movements and illumination settings. All videos were recorded using a Logitech C310 webcam, a RealSense F200 camera, and the front camera of a HUAWEI P9 smartphone, and the ground-truth HR was recorded using a CONTEC CMS60C BVP sensor (an FDA-approved device). More detailed information about the database can be found in the Readme file in the package.
2. Evaluation Protocol
For methods that do not require training, we suggest using all videos for testing. For machine-learning methods, a five-fold subject-exclusive testing protocol is suggested; the detailed subject partition is also included in the package. The average HR over each video is suggested as its ground-truth HR.
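The defining property of a subject-exclusive protocol is that all videos of a given subject fall into exactly one fold, so no subject appears in both the training and test sets of the same fold. The official five-fold partition ships with the package; the sketch below is only a hypothetical illustration of that constraint (the subject-ID format and fold assignment are assumptions, not the released partition).

```python
# Hypothetical sketch of a five-fold subject-exclusive split.
# The official partition is included in the VIPL-HR package;
# this only illustrates the constraint that a subject's videos
# never appear in both train and test of the same fold.

def subject_exclusive_folds(subject_ids, n_folds=5):
    """Assign each unique subject to exactly one fold (round-robin)."""
    unique = sorted(set(subject_ids))
    return {s: i % n_folds for i, s in enumerate(unique)}

def split(videos, folds, test_fold):
    """videos: list of (video_name, subject_id) pairs.
    Returns (train_videos, test_videos) for the given test fold."""
    train = [v for v, s in videos if folds[s] != test_fold]
    test = [v for v, s in videos if folds[s] == test_fold]
    return train, test
```

Because folds are defined over subjects rather than videos, a model evaluated this way is never tested on a person it has seen during training, which is the point of the protocol.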
The VIPL-HR database is released to universities and research institutes for research purposes only. To request a copy of the VIPL-HR database, please do the following:
• Download the VIPL-HR Database Release Agreement, read it carefully, and complete it appropriately. Note that the agreement must be signed by a full-time staff member (students are not acceptable). Then scan the signed agreement and send it to Dr. Han (hanhu[at]ict.ac.cn) from an official email address (i.e., a university or institute address; non-official addresses such as Gmail and 163 are not acceptable). Once we receive your request, we will provide you with the download link.
• When using the VIPL-HR database, you are recommended to cite the following papers:
 Xuesong Niu, Shiguang Shan*, Hu Han, and Xilin Chen, "RhythmNet: End-to-end Heart Rate Estimation from Face via Spatial-temporal Representation," IEEE Transactions on Image Processing (T-IP), vol. 29, pp. 2409-2423, 2020.
 Xuesong Niu, Hu Han, Shiguang Shan, and Xilin Chen, "VIPL-HR: A Multi-modal Database for Pulse Estimation from Less-constrained Face Video," Asian Conference on Computer Vision (ACCV), 2018.