eWand: A calibration framework for wide baseline frame-based and event-based camera systems
Thomas Gossard*
Andreas Ziegler*
Levin Kolmar
Jonas Tebbe
Andreas Zell
*equal contribution
[Paper]
[Code]
[CAD]
An overview of our proposed calibration method.

Abstract

Accurate calibration is crucial when using multiple cameras to triangulate object positions precisely. However, it is also a time-consuming process that must be repeated whenever the cameras are moved. The standard approach uses a printed pattern with known geometry to estimate the intrinsic and extrinsic parameters of the cameras. The same idea can be applied to event-based cameras, although it requires extra work: either a printed pattern is detected in frames reconstructed from the events, or a blinking pattern is displayed on a screen and detected directly from the events. Such calibration methods provide accurate intrinsic calibration for both frame- and event-based cameras. However, 2D patterns have several limitations for the extrinsic calibration of multi-camera setups with widely differing viewpoints and a wide baseline: the pattern can only be detected from one direction, and it must be large enough to compensate for its distance to the cameras. This makes extrinsic calibration time-consuming and cumbersome. To overcome these limitations, we propose eWand, a new method that uses blinking LEDs inside opaque spheres instead of a printed or displayed pattern. Our method provides a faster, easier-to-use extrinsic calibration approach that maintains high accuracy for both event- and frame-based cameras.
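Once the cameras are calibrated, the 3D position of a marker (such as one of the wand's LED spheres) observed in several views can be recovered by triangulation. The sketch below is not the paper's implementation, only a minimal illustration of the standard Direct Linear Transform (DLT), assuming each camera's 3x4 projection matrix and the marker's pixel location in each view are already known:

```python
import numpy as np

def triangulate_dlt(proj_mats, points_2d):
    """Triangulate one 3D point from its 2D observations in multiple
    calibrated views via the Direct Linear Transform (DLT).

    proj_mats: list of 3x4 camera projection matrices P = K [R | t]
    points_2d: list of (u, v) image observations, one per camera
    Returns the 3D point as a length-3 numpy array.
    """
    rows = []
    for P, (u, v) in zip(proj_mats, points_2d):
        # Each observation contributes two linear constraints on the
        # homogeneous 3D point X: u*(P[2]·X) - P[0]·X = 0, and likewise for v.
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.asarray(rows)
    # The solution is the right singular vector of A with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

In practice the marker center would first be localized per camera (e.g. blob detection in frames, or clustering of events at the LED's blink frequency for event cameras), and a full calibration pipeline would combine many wand poses with bundle adjustment rather than triangulating a single point.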


Talk

Paper

T. Gossard*, A. Ziegler*, L. Kolmar, J. Tebbe, A. Zell.
eWand: A calibration framework for wide baseline frame-based and event-based camera systems.
In IEEE International Conference on Robotics and Automation (ICRA), 2024.
(hosted on ArXiv)


[Bibtex]


Acknowledgements

This research was funded by Sony AI.