The First WEAR Dataset Challenge is a Human Activity Recognition (HAR) prediction challenge based on the inertial data of the original WEAR dataset publication [1].
Challenge participants are tasked to predict the activity labels of a yet unreleased test dataset of newly recorded and re-recorded participants. Winners are determined based on the sample-wise macro F1-score averaged across all activities and participants. The challenge is part of the HASCA Workshop at UbiComp/ISWC 2024.
The challenge dataset download contains accelerometer data of 18 participants performing multiple outdoor fitness workouts. In total, each participant performed 18 workout activities. Data was captured at all four limbs of each participant, sampled at 50 Hz with a sensitivity of ±8g. The test dataset contains data of re-recorded and new participants performing the same workouts and activities as the original participants. Challenge participants are tasked to predict the activity label of each data record of the new participants using a prediction algorithm of their choice. Submissions should contain the predicted activity label for each data record of the test dataset (see below for more details).
The WEAR dataset is an outdoor sports dataset for inertial- and video-based human activity recognition (HAR). The dataset comprises data from 18 participants performing a total of 18 different workout activities with untrimmed inertial (acceleration) and egocentric video data recorded at 10 different outside locations. WEAR provides a challenging prediction scenario marked by purposely introduced activity variations as well as an overall small information overlap across modalities. This challenge focuses on the inertial data only.
Registration is now open! In order to participate, each team must send a registration email to wear.challenge@gmail.com, stating the:
Evaluation of submissions will be based on the sample-wise macro F1-score averaged across all participants. The macro F1-score is the average of the F1-scores of each activity class. Please check the WEAR dataset repository for sample code on how to translate windowed data back to record-wise data and calculate the macro F1-score.
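The evaluation described above can be sketched as follows. This is not the official challenge code (see the WEAR dataset repository for that); it is a minimal illustration, with illustrative window length, stride, and two toy classes, of how per-window predictions can be mapped back to per-sample labels by majority vote before computing the macro F1-score:

```python
# Sketch: translate window-wise predictions back to sample-wise labels
# (majority vote over all windows covering a sample), then compute the
# sample-wise macro F1-score. Window length, stride, and the toy labels
# below are illustrative assumptions, not the official challenge setup.
import numpy as np
from sklearn.metrics import f1_score

def windows_to_samples(window_preds, window_len, stride, n_samples):
    """Map one prediction per sliding window to one prediction per sample."""
    votes = [[] for _ in range(n_samples)]
    for w, pred in enumerate(window_preds):
        start = w * stride
        for t in range(start, min(start + window_len, n_samples)):
            votes[t].append(pred)
    # majority vote per sample (ties broken towards the smaller label id)
    return np.array([np.bincount(v).argmax() for v in votes])

# toy example: 2 classes, 10 samples, windows of length 4 with stride 2
y_true = np.array([0, 0, 0, 0, 1, 1, 1, 1, 0, 0])
window_preds = [0, 0, 1, 1]  # one prediction per window
y_pred = windows_to_samples(window_preds, window_len=4, stride=2, n_samples=10)
print(f1_score(y_true, y_pred, average="macro"))
```

The macro average weights every activity class equally, so rare classes count as much as frequent ones; this is why a classifier that only predicts the dominant (null) class scores poorly.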
Each participant performed a set of 18 workout activities. These activities include running-, stretching- and strength-based exercises, with base activities like push-ups being complemented with complex variations that alter and/or extend the performed movement during the exercise. Activities were divided across multiple recording sessions, with each session consisting of uninterrupted data streams of all modalities. Each participant was tasked to perform each exercise for at least 90 seconds, but had the freedom to choose the order of activities and take breaks as desired.
The original dataset comprises outdoor workouts of 18 participants. Each workout is divided across multiple sessions. In total, more than 15 hours were recorded at 10 outdoor locations. The training data contains the raw sensor data of the 18 participants which were part of the original WEAR dataset [1]. The sensors were placed at four body locations (right wrist, left wrist, right ankle and left ankle). Each sensor sampled 3D-accelerometer data. During all recording sessions, sensor orientation was fixed according to one pre-defined sensor placement. Each sampled data record is labeled as one of the 18 (+ null-class) possible activities.
Training Data: The training data is provided as a separate CSV file per subject. Each file contains the following columns:

- subject identifier (int, between 0 and 17)
- 12 accelerometer columns, i.e. 3 axes (x, y, z) for each of the four sensor locations (float, between -8.0 and +8.0)
- activity label (str, being one of the 18 workout activities or null during breaks)
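The per-subject CSV layout described above (1 subject id + 12 accelerometer axes + 1 label = 14 columns) can be mimicked with a tiny synthetic example. The header names and the "push-up" label used here are illustrative assumptions, not the official ones; check the actual files in the challenge download:

```python
# Sketch of the per-subject CSV layout, built as a synthetic DataFrame.
# Column names are illustrative assumptions; the real headers come from
# the challenge download.
import pandas as pd

axes = [f"{loc}_acc_{ax}"
        for loc in ("right_wrist", "left_wrist", "right_ankle", "left_ankle")
        for ax in "xyz"]                      # 12 accelerometer columns
columns = ["sbj_id"] + axes + ["label"]       # 14 columns in total

# two synthetic records: one labeled activity sample, one null-class sample
df = pd.DataFrame(
    [[0] + [0.1] * 12 + ["push-up"],
     [0] + [-0.2] * 12 + ["null"]],
    columns=columns,
)

# all accelerometer values lie within the ±8g sensitivity range
acc = df[axes].to_numpy()
assert ((acc >= -8.0) & (acc <= 8.0)).all()
```

A real file would be read with `pd.read_csv(...)` in the same shape; the null class marks breaks between exercises and must also be predicted for the test data.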
[1] Marius Bock, Hilde Kuehne, Kristof Van Laerhoven and Michael Moeller. 2023. WEAR: An Outdoor Sports Dataset for Wearable and Egocentric Activity Recognition. CoRR abs/2304.05088. https://arxiv.org/abs/2304.05088
WEAR is offered under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. You are free to use, copy, and redistribute the material for non-commercial purposes provided you give appropriate credit, provide a link to the license, and indicate if changes were made. If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original. You may not use the material for commercial purposes.