Getting Started with NeRFs
Let's back up a bit and talk about what NeRFs are and how to get started.
With so much innovation released seemingly every day, it can be easy to feel overwhelmed, or like the boat is leaving without you.
So, for this week, let’s slow it down a little and look at what NeRFs are, the current platforms, and how to get started.
Jonathan Stephens, Matt Tancik, and Jared Heinly tackle the difference between a NeRF and photogrammetry, along with what a NeRF is not.
Most NeRFs utilize an AABB, an axis-aligned bounding box; what does that mean?
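In short, an AABB is a rectangular volume whose faces line up with the coordinate axes, and NeRF renderers only place samples along the portion of each camera ray that passes through it. To make that concrete, here is a minimal sketch of the classic "slab" ray-box test in NumPy; the function name and the example values are illustrative, not taken from any particular NeRF codebase:

```python
import numpy as np

def ray_aabb_intersect(origin, direction, box_min, box_max):
    """Slab-method intersection of a ray with an axis-aligned bounding box.

    Returns (t_near, t_far) along the ray, or None if the ray misses the box.
    """
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)
    inv_dir = 1.0 / direction  # assumes no zero components, for simplicity
    t0 = (np.asarray(box_min, dtype=float) - origin) * inv_dir
    t1 = (np.asarray(box_max, dtype=float) - origin) * inv_dir
    t_near = np.minimum(t0, t1).max()  # latest entry across the three slabs
    t_far = np.maximum(t0, t1).min()   # earliest exit across the three slabs
    if t_near > t_far or t_far < 0:
        return None  # the ray misses the box, or the box is behind the ray
    return t_near, t_far

# A ray starting outside the unit cube and pointing through it:
hit = ray_aabb_intersect([-1, 0.25, 0.25], [1.0, 0.25, 0.25],
                         [0, 0, 0], [1, 1, 1])
# hit is (1.0, 2.0): samples would only be placed between t=1 and t=2
```

A NeRF trainer would then distribute its samples between `t_near` and `t_far` instead of along the whole ray, which is why the AABB matters: it tells the model where the scene is allowed to live.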
Platforms
With the foundation in place, it’s important to understand the current players within the space.
NeRF has existed since 2020, but it wasn’t until early 2022 that Instant-NGP was introduced and allowed for local NeRF generation.
Today, I would like to cover four of them that can be used primarily by everyday consumers.
Instant-NGP (Instant-NeRF)
On January 14th, 2022, Nvidia Labs released the code for Instant-NGP on GitHub. Instant-NGP, or Instant-NeRF as it's also referred to, was the first platform that allowed for rapid NeRF training and could run on consumer-grade GPUs, hence the name Instant-NeRF.
Nvidia ran an Instant NeRF contest last year, which was won by Vibrant Nebula and Mason McGough.
Since then, Instant-NeRF was the 8th most cited AI paper last year and was named one of Time Magazine’s top inventions of 2022.
As a great Christmas gift, Nvidia released executables to greatly simplify the build process. If you use Linux, want the developer Python bindings, or would simply prefer to build from source, you'll need to compile it yourself with these instructions.
This February, Nvidia unveiled the INGP file type along with Virtual Reality support, plus a follow-up contest.
For those who want to create NeRFs from videos, make sure you have FFmpeg. If you're getting stuck entering the commands, this article walks you through what to change. You'll also want to make sure that you have a system that can run Instant-NGP.
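The FFmpeg step boils down to sampling your video into still frames before the camera poses are estimated. As a hedged sketch, this helper builds the kind of frame-extraction command you'd typically run; the file names, output directory, and frame rate are all placeholders to adjust for your own footage:

```python
import subprocess
from pathlib import Path

def build_extract_cmd(video: str, out_dir: str, fps: float = 2.0) -> list[str]:
    """Build an FFmpeg command that samples `fps` frames per second
    from `video` into numbered JPEGs under `out_dir`."""
    return [
        "ffmpeg", "-i", video,
        "-vf", f"fps={fps}",    # sampling rate; ~2 frames/sec is a common start
        "-qscale:v", "1",       # highest JPEG quality
        f"{out_dir}/%04d.jpg",  # writes 0001.jpg, 0002.jpg, ...
    ]

# "walkthrough.mp4" and "frames" are placeholder paths:
cmd = build_extract_cmd("walkthrough.mp4", "frames")
# To actually run it (requires FFmpeg on your PATH):
#   Path("frames").mkdir(exist_ok=True)
#   subprocess.run(cmd, check=True)
```

Too high a frame rate produces many near-duplicate images that slow down pose estimation; too low leaves gaps in coverage, so the rate is worth tuning per capture.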
Want to try Instant-NGP, but don't have any data? Check out the .INGP library, where you can download ready-to-go NeRF files. More samples and different scenes will be added over the next couple of days.
Luma AI
Luma AI entered the NeRF world in mid-2022 with the launch of its private beta. Luma was quick to expand access, and today it is available to download on the Apple App Store.
It's difficult to argue that any other platform has lowered the entry barrier to NeRFs more than Luma. Their app is mobile-first on iOS; if you're on Android or want to use existing videos or photos, there's a very easy method!
The Luma team has been hard at work introducing clamored-for features such as Unreal Engine plugins, Augmented Reality, and embeddable NeRFs.
Luma CEO Amit Jain is a tech veteran and an Apple alumnus who worked on the LiDAR scanner. With that in mind, it's easy to see where the Luma app's clean mobile experience comes from.
Jain also spoke to his vision for Luma and the evolution of NeRFs in this interview with Building the Open Metaverse.
nerfstudio
It’s hard to overstate Matt Tancik’s contribution to NeRF. He is one of the authors of the original NeRF paper, NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis. You might recognize a few other names from the Zip-NeRF paper.
Matt is also an amazing guy who can regularly be found answering product questions in the nerfstudio Discord channel. Other founding members of nerfstudio include Ethan Weber and Evonne Ng, who both have a strong foundation in Neural Radiance Fields.
nerfstudio launched on October 5th, 2022 and is the current industry leader for those who want to try out new papers and features, such as Instruct-NeRF2NeRF, Blender integration, and Volinga AI's Unreal Engine plugin. Because their team is at the forefront of the research, it's easier for them to integrate new features.
If you're coming over from LiDAR/photogrammetry, you can use Polycam, or use Google Colab if you don't have a usable GPU.
I see nerfstudio as a Hugging Face equivalent, but for NeRFs, allowing people to easily customize and plug in new methods to try out.
TurboNeRF
TurboNeRF is a one-man project founded by James Perlman.
James called me one afternoon and told me he was thinking of creating his own NeRF algorithm from scratch. Less than two months later, TurboNeRF was announced on Pi Day.
What sets TurboNeRF apart is that NeRF training and rendering happen directly inside Blender, letting users leverage Blender's capabilities.
In case you’d like to see what goes into creating a NeRF platform, James streams his journey on Twitch. James will also occasionally tease upcoming features on his Twitter.
James has been very transparent, publishing his product roadmap and making the codebase open source under the MIT license.
How to Get Started
Depending on the resources available to you, one platform might be better suited for you than another. Here are download links for all of them.
Together, these platforms are driving NeRF adoption forward and making the technology usable for everyday use cases.