Now that many top-of-the-line smartphones can record high-quality video, mobile device manufacturers have begun to compete on high-speed shooting, with most flagship phones boasting capture at up to 960 frames per second. NVIDIA has come up with a more elegant solution: a slow-motion effect produced from regular, full-resolution footage, using neural networks to generate new frames and render the video fluidly.
High-resolution shooting at rates above 24, 30 or 60 frames per second usually requires very expensive, specialized equipment. NVIDIA instead proposes an algorithm that processes ordinary video clips on the company's graphics cards, inserting newly synthesized frames between the existing ones based on information from the preceding and following frames.
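To show where those synthesized frames slot into a sequence, here is a minimal sketch using a naive linear blend between two neighboring frames. This is only an illustration of the insertion step: NVIDIA's actual method uses a neural network to estimate optical flow and warp pixels along motion paths, which this simple cross-fade does not attempt.

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, n_new):
    """Generate n_new intermediate frames between frame_a and frame_b.

    Uses a plain linear blend (cross-fade) as a stand-in for the
    learned optical-flow interpolation described in the article.
    """
    frames = []
    for i in range(1, n_new + 1):
        t = i / (n_new + 1)  # interpolation weight, strictly between 0 and 1
        blended = (1 - t) * frame_a + t * frame_b
        frames.append(blended.astype(frame_a.dtype))
    return frames

# Example: slow a two-frame clip down 4x by inserting 3 frames between them.
a = np.zeros((4, 4), dtype=np.uint8)       # dark frame
b = np.full((4, 4), 200, dtype=np.uint8)   # bright frame
mid = interpolate_frames(a, b, 3)
print(len(mid), mid[1][0, 0])  # 3 new frames; the middle one is halfway (100)
```

A linear blend produces ghosting on fast-moving objects, which is exactly the kind of artifact motion-aware neural interpolation is designed to avoid.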
The result is very convincing, even near-perfect to an untrained eye, but the technology has its limitations. First, some elements in the frame can show processing artifacts in areas where there is not enough information to generate a perfect frame. Processing the clips also takes a lot of time, even on the company's top graphics cards.
To achieve these results, NVIDIA trained the algorithm on over 11,000 sports videos shot natively at 240 frames per second. Based on these, the algorithm can now generate playback at up to 210 frames per second. Interestingly, the technique also works on clips shot at higher frame rates, so footage recorded natively in slow motion can be slowed down even further.
This technology probably will not be integrated into smartphones any time soon, but it could turn ordinary phone footage into slow-motion clips on a PC. The advantage is that the high resolution (Full HD, 4K, etc.) remains intact, unlike the native slow-motion modes of current smartphones, which capture 960 frames per second only at resolutions below 720p.