Course content
Event-based vision is an emerging technology that promises to overcome some of the limitations of traditional, frame-based cameras and visual processing pipelines (from sensors to actionable output), such as latency, dynamic range, bandwidth, and power consumption. To unlock these advantages, new algorithms are needed to process the unconventional output of event-based cameras: a stream of asynchronous, pixel-wise intensity changes, as opposed to the familiar video frames of standard cameras. This project involves the investigation and development of tailored algorithms and methods to tackle specific problems in event-based vision (motion estimation, segmentation, object detection and recognition, etc.).
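To make the data format concrete, the following is a minimal sketch (the `Event` class and field names are illustrative, not tied to any specific camera driver) of how such an asynchronous event stream can be accumulated into a simple 2D "event frame":

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Event:
    # One asynchronous event: pixel location, timestamp (seconds),
    # and polarity (+1 = brightness increase, -1 = brightness decrease).
    x: int
    y: int
    t: float
    polarity: int

def accumulate_events(events, width, height):
    """Sum event polarities per pixel into a 2D 'event frame'.

    This is the simplest possible event representation; practical
    pipelines often use richer ones such as time surfaces or voxel grids.
    """
    frame = np.zeros((height, width), dtype=np.int32)
    for e in events:
        frame[e.y, e.x] += e.polarity
    return frame

# Example: three events on a 4x4 sensor.
events = [Event(1, 2, 0.001, +1), Event(1, 2, 0.002, +1), Event(3, 0, 0.003, -1)]
frame = accumulate_events(events, width=4, height=4)
print(frame[2, 1])  # 2
print(frame[0, 3])  # -1
```

Unlike a standard video frame, this representation only carries information where brightness changed, which is what makes tailored algorithms necessary.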
At the beginning of the module, students receive or select project topics from a list of proposed ones, along with introductory material related to the chosen problem. After the project teams and topics are set, suitable tools for carrying out the project are discussed and set up. The students prepare a project plan, specifying the data they will be working on and the steps anticipated for successful completion of the project. During the remaining weeks, the students develop their projects and discuss progress with the instructor, who guides future action items. At the end of the project, the students present their findings to the other students in the module in an oral presentation. They summarize not only the technical outcome of the project but also the difficulties encountered and lessons learned along the way.
The general topics include but are not limited to:
- Algorithms: visual odometry, SLAM, 3D reconstruction, optical flow estimation, image intensity reconstruction, recognition, stereo depth reconstruction, feature/object detection, tracking, calibration, sensor fusion (video synthesis, visual-inertial odometry, etc.).
- Event camera datasets and/or simulators.
- Event-based signal processing, representation, control, bandwidth control.
- Event-based active vision, event-based sensorimotor integration.
- Applications in: robotics (navigation, manipulation, drones, etc.), automotive, IoT, AR/VR, space science, inspection, surveillance, crowd counting, physics, biology.
- Model-based, embedded, or learning approaches.
- Novel hardware (cameras, neuromorphic processors, etc.) and/or software platforms.
- New trends and challenges in event-based and/or biologically-inspired vision (SNNs, etc.).
- Event-based vision for computational photography.
A longer list of related topics is available in the table of contents of this repository: https://github.com/uzh-rpg/event-based_vision_resources/
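As a taste of one of the listed topics (event camera simulators), here is a simplified sketch of the core idea: given two log-intensity frames, an event is generated at each pixel for every crossing of a contrast threshold, with timestamps interpolated in between. The function name and threshold value are illustrative; real simulators track per-pixel reference levels across many frames and model sensor noise.

```python
import numpy as np

def events_from_frames(log_I0, log_I1, t0, t1, threshold=0.2):
    """Generate (x, y, t, polarity) events between two log-intensity
    frames, firing one event per contrast-threshold crossing per pixel.

    Simplified illustration: timestamps are linearly interpolated
    between t0 and t1.
    """
    events = []
    diff = log_I1 - log_I0
    # Number of full threshold crossings at each pixel.
    num_crossings = np.floor(np.abs(diff) / threshold).astype(int)
    for y, x in zip(*np.nonzero(num_crossings)):
        pol = 1 if diff[y, x] > 0 else -1
        n = num_crossings[y, x]
        for k in range(1, n + 1):
            # Timestamp of the k-th threshold crossing at this pixel.
            t = t0 + (t1 - t0) * k / (n + 1)
            events.append((x, y, t, pol))
    events.sort(key=lambda e: e[2])  # emit events in time order
    return events

# Example: one pixel brightens by 0.5 (two 0.2-crossings), one dims by 0.25.
I0 = np.zeros((2, 2))
I1 = np.array([[0.5, 0.0], [0.0, -0.25]])
evs = events_from_frames(I0, I1, t0=0.0, t1=1.0)
```

Even this toy version shows the key property of the sensor model: the event rate adapts to the scene, with fast-changing pixels producing more events than static ones.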