BlurBall: Joint Ball and Motion Blur Estimation for Table Tennis Ball Tracking

Thomas Gossard*, Filip Radovic, Andreas Ziegler, Andreas Zell
University of Tübingen

Abstract

Motion blur is ubiquitous in broadcast footage of fast racket sports and encodes valuable cues about the ball's velocity. We propose BlurBall, a detector that jointly estimates ball position and motion-blur attributes (orientation and length). We also introduce a blur-aware labeling convention that defines the ball position at the center of the blur and annotates the blur streak itself.

Built on an HRNet backbone with Squeeze-and-Excitation attention and trained with blur-aware heatmaps, BlurBall improves detection accuracy and enables velocity-informed downstream tasks such as trajectory prediction. We release a diverse table-tennis dataset with blur annotations and camera calibration.
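The Squeeze-and-Excitation attention mentioned above is, at its core, a channel-reweighting step: globally pool each feature channel, pass the result through a small bottleneck, and rescale the channels with the resulting gates. A minimal NumPy sketch of that idea (the weight matrices `w1`, `w2`, the reduction ratio, and all shapes are illustrative placeholders, not the paper's actual configuration):

```python
import numpy as np

def se_block(x, w1, w2):
    """Squeeze-and-Excitation on a (C, H, W) feature map:
    squeeze via global average pooling, excite via a two-layer
    bottleneck, then rescale each channel by its learned gate."""
    s = x.mean(axis=(1, 2))              # squeeze: per-channel statistic, (C,)
    z = np.maximum(w1 @ s, 0.0)          # reduction layer + ReLU
    g = 1.0 / (1.0 + np.exp(-(w2 @ z)))  # expansion layer + sigmoid gate in (0, 1)
    return x * g[:, None, None]          # channel-wise rescaling

rng = np.random.default_rng(0)
C, r = 8, 2                              # channels and reduction ratio (placeholders)
x = rng.normal(size=(C, 16, 16))
out = se_block(x, rng.normal(size=(C // r, C)), rng.normal(size=(C, C // r)))
# Output keeps the input shape (8, 16, 16); only channel magnitudes change.
```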

Blur example in dataset

Motion blur frequently appears in broadcast footage but is typically disregarded, even though it offers valuable cues for estimating the ball's velocity. The blue cross denotes the classical labeling approach, which introduces asymmetry and ambiguity into the detection task. We propose a refined annotation strategy: relabel the ball center as the middle of the blur (red cross) and add a directional blur label (green line) to better capture motion information.

Blur Labeling

Instead of marking the ball as a single point, we annotate the entire motion blur streak. Each annotation specifies the blur's center, length, and orientation, providing richer supervision and allowing the model to infer velocity cues directly from a single frame.
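This (center, length, orientation) parameterization is equivalent to storing the streak's two endpoints, which is convenient for drawing the annotation or deriving a velocity direction. A small sketch of the conversion (the helper name `blur_endpoints` is ours, not from the released tooling):

```python
import math

def blur_endpoints(cx, cy, length, theta):
    """Convert a (center, length, orientation) blur annotation
    into the two endpoints of the blur streak."""
    dx = 0.5 * length * math.cos(theta)
    dy = 0.5 * length * math.sin(theta)
    return (cx - dx, cy - dy), (cx + dx, cy + dy)

# A horizontal 20-pixel streak centered at (100, 50):
p0, p1 = blur_endpoints(100.0, 50.0, 20.0, 0.0)
# p0 == (90.0, 50.0), p1 == (110.0, 50.0)
```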

Example of blur labeling with streak annotation

Blur-aware Heatmaps

From these blur annotations, we generate heatmaps that cover the entire streak rather than a point-like Gaussian. These blur-aware heatmaps guide the network to capture both the ball center and its motion extent.
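One simple way to realize such a streak-covering target, consistent with the description above, is to give every pixel a Gaussian falloff of its distance to the blur segment, so the heatmap peaks along the entire streak rather than at a single point. A minimal sketch under that assumption (function name, `sigma`, and shapes are illustrative, not the paper's exact recipe):

```python
import numpy as np

def blur_heatmap(h, w, p0, p1, sigma=2.0):
    """Heatmap with a Gaussian profile around the blur segment p0 -> p1:
    each pixel gets exp(-d^2 / (2 sigma^2)), where d is its distance
    to the segment, producing a capsule-shaped target over the streak."""
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    p0 = np.asarray(p0, dtype=float)
    p1 = np.asarray(p1, dtype=float)
    v = p1 - p0
    vv = max(v @ v, 1e-12)  # guard against zero-length streaks (sharp balls)
    # Project every pixel onto the segment, clamped to its endpoints.
    t = np.clip(((xs - p0[0]) * v[0] + (ys - p0[1]) * v[1]) / vv, 0.0, 1.0)
    dx = xs - (p0[0] + t * v[0])
    dy = ys - (p0[1] + t * v[1])
    return np.exp(-(dx**2 + dy**2) / (2.0 * sigma**2))

hm = blur_heatmap(64, 64, (20, 32), (44, 32))
# Peak value 1.0 everywhere along the streak, decaying away from it.
```

A zero-length segment reduces this to the usual point-like Gaussian, so the same target covers both blurred and sharp balls.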

Label (left) and generated blur-aware heatmap (right)

Dataset

The BlurBall dataset is collected from diverse table tennis scenes under varying conditions, including camera viewpoints, lighting, and player styles. Each frame is annotated with both the ball position and its motion blur streak, enabling training of blur-aware detectors. The dataset also provides camera calibration for geometric analysis.

Dataset example 00
Dataset example 01
Dataset example 02
Dataset example 05
Dataset example 06
Dataset example 11
Dataset example 12
Dataset example 17
Dataset example 18
Dataset example 21
Dataset example 24
Dataset example 25

Example scenes from our dataset, showcasing a diverse range of contexts to ensure comprehensive coverage.

BibTeX

@article{gossard2025blurball,
  title   = {BlurBall: Joint Ball and Motion Blur Estimation for Table Tennis Ball Tracking},
  author  = {Gossard, Thomas and Radovic, Filip and Ziegler, Andreas and Zell, Andreas},
  journal = {arXiv preprint arXiv:2509.18387},
  year    = {2025}
}