WEAR:
An Outdoor Sports Dataset for Wearable and Egocentric Activity Recognition

University of Siegen, University of Bonn
In review, 2023



REGISTER FOR THE FIRST WEAR DATASET CHALLENGE NOW!


Abstract

Though research has shown the complementarity of camera- and inertial-based data, datasets offering both egocentric video and inertial sensor data remain scarce. In this paper, we introduce WEAR, an outdoor sports dataset for both vision- and inertial-based human activity recognition (HAR). The dataset comprises data from 18 participants performing a total of 18 different workout activities, with untrimmed inertial (acceleration) and camera (egocentric video) data recorded at 10 different outdoor locations. Unlike previous egocentric datasets, WEAR provides a challenging prediction scenario, marked by purposely introduced activity variations and an overall small information overlap across modalities. Benchmark results obtained with each modality separately show that the two modalities offer complementary strengths and weaknesses in their prediction performance. Further, in light of the recent success of temporal action localization models following the architecture design of the ActionFormer, we demonstrate their versatility by applying them in a plain fashion to vision, inertial and combined (vision + inertial) features. Results show both that vision-based temporal action localization models are applicable to inertial data and that the two modalities can be fused by simple concatenation, with the combined approach (vision + inertial features) producing the highest mean average precision and a close-to-best F1-score.
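
As a concrete illustration of the fusion scheme mentioned above, the following sketch concatenates temporally aligned vision and inertial features along the feature axis. All shapes and dimensions are illustrative assumptions, not the exact values used in the paper:

# A minimal sketch of the fusion-by-concatenation idea from the abstract.
# Shapes are illustrative assumptions (e.g. 2048-d I3D clip features and
# 128-d inertial features), not the exact dimensions used in the paper.
import numpy as np

T = 900                                   # number of temporally aligned feature steps
vision_feats = np.random.randn(T, 2048)   # stand-in for precomputed I3D features
inertial_feats = np.random.randn(T, 128)  # stand-in for precomputed inertial features

# Fuse by simple concatenation along the feature axis; the result can be
# fed to an ActionFormer-style temporal action localization model.
combined = np.concatenate([vision_feats, inertial_feats], axis=1)
print(combined.shape)  # (900, 2176)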

The Sensors

We provide subject-wise raw and processed acceleration and egocentric-video data. 3D-accelerometer data (50 Hz, ±8g) was collected using four open-source Bangle.js smartwatches running a custom, open-source firmware. The watches were placed in a fixed orientation on the left and right wrists and ankles of each participant (see Figure below). Egocentric video data (1080p@60FPS) was captured using a GoPro Hero 8 action camera, mounted on each participant's head using a head strap. The camera was tilted downwards at a 45-degree angle during recording.
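
To illustrate how the inertial streams can be handled, here is a minimal windowing sketch based on the sensor setup above (four 3-axis sensors at 50 Hz). The synthetic stand-in signal and the 1-second window size are assumptions for illustration only; the actual file format of the release may differ:

# Illustrative preprocessing of the raw accelerometer streams. The channel
# layout (4 sensor positions x 3 axes = 12 channels) follows the sensor
# setup described above.
import numpy as np

SAMPLE_RATE = 50               # Hz, as recorded by the Bangle.js watches
N_CHANNELS = 12                # left/right wrist and ankle, each with x/y/z
signal = np.random.randn(300 * SAMPLE_RATE, N_CHANNELS)  # stand-in for a recording

# Segment the continuous stream into 1-second non-overlapping windows,
# a common preprocessing step for inertial-based HAR.
win = SAMPLE_RATE
n_windows = signal.shape[0] // win
windows = signal[: n_windows * win].reshape(n_windows, win, N_CHANNELS)
print(windows.shape)  # (300, 50, 12)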

The Activities

Each participant performed a set of 18 workout activities. These include running-, stretching- and strength-based exercises, with base activities like push-ups complemented by complex variations that alter and/or extend the movement performed during the exercise. Activities were divided across multiple recording sessions, with each session consisting of uninterrupted data streams of all modalities. Each participant was asked to perform each exercise for at least 90 seconds, but was free to choose the order of activities and take breaks as desired.
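
Since the recordings are untrimmed, labeled activity segments have to be recovered from start/end annotations. A minimal sketch, assuming annotations given as (start, end, label) tuples in seconds (the segment values below are made up purely for illustration):

# Sketch: recovering labeled segments from an untrimmed 50 Hz inertial
# stream given (start, end, label) annotations in seconds.
import numpy as np

SAMPLE_RATE = 50
stream = np.random.randn(600 * SAMPLE_RATE, 12)   # stand-in for one session

segments = [(12.4, 105.1, "push-up"), (140.0, 233.7, "jogging")]  # illustrative

labeled = []
for start, end, label in segments:
    s, e = int(start * SAMPLE_RATE), int(end * SAMPLE_RATE)
    labeled.append((stream[s:e], label))

for window, label in labeled:
    print(label, window.shape)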


Talk on WEAR @ University of Cambridge

Watch the recording of the talk I gave on WEAR at Prof. Dr. Schönlieb's group at the University of Cambridge (15.09.2023).



Download

The full dataset can be downloaded via this link. The download folder is divided into three subdirectories:

  • annotations (< 1MB): per-subject JSON files containing annotations in THUMOS14 style (see the loading sketch after this list)
  • processed (15GB): precomputed I3D, inertial and combined per-subject features
  • raw (130GB): raw, per-subject video and inertial data
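
As a starting point for working with the annotations, the following sketch reads one of the per-subject JSON files. It assumes the common THUMOS14-style layout used by ActionFormer-like codebases; the file name and exact keys are assumptions, so check the downloaded files for the actual schema:

# Sketch of reading a per-subject annotation file. We assume a
# THUMOS14-style JSON layout ("database" -> per-recording entries with an
# "annotations" list of {"segment": [start, end], "label": ...}).
import json

with open("annotations/sbj_0.json") as f:   # hypothetical file name
    database = json.load(f)["database"]

for recording, entry in database.items():
    for ann in entry["annotations"]:
        start, end = ann["segment"]
        print(f"{recording}: {ann['label']} from {start:.1f}s to {end:.1f}s")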

Reproduce Benchmarks

To reproduce the experiments reported in the paper, please refer to the instructions provided in the GitHub repository.

How to contribute

If you want to contribute to the WEAR dataset, check out our How-To: Record your own data guide or get in touch with us via marius.bock@uni-siegen.de.

License

WEAR is offered under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. You are free to use, copy, and redistribute the material for non-commercial purposes provided you give appropriate credit, provide a link to the license, and indicate if changes were made. If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original. You may not use the material for commercial purposes.

Contact

For questions regarding the dataset, please get in touch via marius.bock@uni-siegen.de.

Cite as

@article{bock2023wear,
  title={WEAR: An Outdoor Sports Dataset for Wearable and Egocentric Activity Recognition},
  author={Bock, Marius and Kuehne, Hilde and Van Laerhoven, Kristof and Moeller, Michael},
  volume={abs/2304.05088},
  journal={CoRR},
  year={2023},
  url={https://arxiv.org/abs/2304.05088}
}