TimeLens: Event-based Video Frame Interpolation (CVPR 2021)

UZH Robotics and Perception Group

3 years ago

12,808 views

Comments:

Silva He - 25.03.2023 12:21

I was wondering how you get the normal camera and the event camera to see the same scene.

REEL60FRAMES - 22.11.2021 22:36

My videos are a reference for studying these issues (artefacts from highly dynamic motion).

Bob™ - 06.11.2021 19:29

Amazing! Now I just need a Prophesee Gen4M to fall off a truck...

Kibbles - 05.10.2021 02:39

nice work!

S!KK - 10.09.2021 02:16

This is optical flow on steroids. Someone should write this to run on the Apple M1 chip... it would rock.

Ardeact - 19.08.2021 00:12

I thought DAIN was impressive, but the fact that there are basically no artifacts on the moving image is amazing.

kk old - 11.08.2021 18:33

Oh man, really a great technology.
But apparently it only works with special hardware.
Great, nonetheless!

The Slow Mo Guys - 11.08.2021 01:44

Well we had a good innings, lads. 😂

Upscaled - 09.08.2021 05:00

Great video, looks awesome

Zenn22 - 08.08.2021 08:21

Uploaded at 25 fps 🤣🤣🤣🤣🤣

FloraSora - 07.08.2021 23:03

I need an in-depth tutorial for this so badly!!! I'm dying to use this.

Eran Boodnero - 07.08.2021 18:44

Man. Obviously this could be turned into an application that you feed a video into and it spits out a high-refresh-rate version. Incredible.

CaHydra - 07.08.2021 18:27

It would be nice if this were released to the public, or, if it already is, made more user-friendly, since it would be amazing to try.

turgor127 - 26.06.2021 22:21

This is gonna be on Two Minute Papers.

roidroid - 14.06.2021 16:51

It would be interesting to gradually reduce the framerate of the RGB camera, to find the limits of how much of the missing data your system is capable of reconstructing.

I'm curious how FEW keyframes are truly required to get reliable results. Or maybe a system could automatically request an RGB keyframe only as required, and only for the necessary part of the image rather than the entire frame.

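As a rough sketch of the keyframe ablation described in the comment above: take a high-FPS ground-truth clip, keep only every N-th frame as a keyframe, reconstruct the skipped frames, and track how mean PSNR degrades as N grows. The code below is a minimal, self-contained illustration in plain NumPy; `interpolate_frames` is a hypothetical placeholder (a simple linear blend) standing in for whatever real interpolation model you would plug in, and is not part of any released TimeLens API.

```python
# Sketch of the keyframe-ablation experiment: keep every N-th frame of a
# high-FPS ground-truth clip as a "keyframe", reconstruct the skipped frames,
# and measure how quality degrades as N grows.

import numpy as np

def psnr(a: np.ndarray, b: np.ndarray, max_val: float = 255.0) -> float:
    """Peak signal-to-noise ratio between two frames."""
    mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(max_val ** 2 / mse)

def interpolate_frames(left: np.ndarray, right: np.ndarray, n: int) -> list:
    """Placeholder interpolator: linear blend between keyframes.
    Swap in a real frame-interpolation model here."""
    return [((1 - t) * left + t * right).astype(left.dtype)
            for t in np.linspace(0, 1, n + 2)[1:-1]]

def keyframe_ablation(frames: list, skip: int) -> float:
    """Mean PSNR of reconstructed frames when keeping every `skip`-th frame."""
    scores = []
    for i in range(0, len(frames) - skip, skip):
        left, right = frames[i], frames[i + skip]
        preds = interpolate_frames(left, right, skip - 1)
        truth = frames[i + 1:i + skip]
        scores += [psnr(p, t) for p, t in zip(preds, truth)]
    return float(np.mean(scores))

# Example with a random stand-in clip; replace with real decoded frames.
clip = [np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8) for _ in range(60)]
for skip in (2, 4, 8, 16):
    print(f"keep every {skip}th frame -> mean PSNR {keyframe_ablation(clip, skip):.2f} dB")
```

Plotting mean PSNR against the keyframe spacing N gives a rough idea of the sparsest keyframe rate that still yields acceptable reconstructions.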