Instant neural graphics primitives with a multiresolution hash encoding

Neural graphics primitives, parameterized by fully connected neural networks, can be costly to train and evaluate. We reduce this cost with a versatile new input encoding that permits the use of a smaller network without sacrificing quality, thus significantly reducing the number of floating point and memory access operations. A small neural network is augmented by a multiresolution hash table of trainable feature vectors whose values are optimized through stochastic gradient descent. The multiresolution structure allows the network to disambiguate hash collisions, making for a simple architecture that is trivial to parallelize on modern GPUs. We achieve a combined speedup of several orders of magnitude, enabling training of high-quality neural graphics primitives in a matter of seconds, and rendering in tens of milliseconds at a resolution of 1920×1080.

Please send feedback and questions to Thomas Müller. We would like to thank Towaki Takikawa, David Luebke, Koki Nagano, and Nikolaus Binder for profound discussions, and Anjul Patney, Jacob Munkberg, Jonathan Granskog, Jonathan Tremblay, Marco Salvi, James Lucas, and Towaki Takikawa for proof-reading, feedback, profound discussions, and early testing.
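The encoding described above can be sketched in a few lines. The following is a minimal NumPy illustration under stated assumptions, not the paper's CUDA implementation: function and parameter names such as `encode`, `base_res`, and `growth` are illustrative, the spatial-hash primes follow the paper's choice, and coarse levels are hashed here even when they would fit in the table collision-free.

```python
# Minimal sketch of a multiresolution hash encoding for 2D inputs.
# Each level scales the input to a grid resolution, hashes the surrounding
# grid corners into a per-level feature table, bilinearly interpolates the
# looked-up features, and concatenates the result across levels.
import numpy as np

PRIMES = np.array([1, 2654435761], dtype=np.uint64)  # spatial-hash primes (2D case)

def hash_grid(coords, table_size):
    """Hash integer grid coordinates into [0, table_size)."""
    h = np.zeros(coords.shape[:-1], dtype=np.uint64)
    for d in range(coords.shape[-1]):
        h ^= coords[..., d].astype(np.uint64) * PRIMES[d]
    return h % np.uint64(table_size)

def encode(x, tables, base_res=16, growth=1.5):
    """Encode points x in [0,1)^2 using one trainable hash table per level."""
    features = []
    for level, table in enumerate(tables):
        res = int(base_res * growth ** level)
        pos = x * res                       # continuous grid position
        lo = np.floor(pos).astype(np.int64)
        frac = pos - lo                     # bilinear interpolation weights
        # Gather features at the 4 surrounding corners and interpolate.
        f = 0.0
        for corner in [(0, 0), (0, 1), (1, 0), (1, 1)]:
            idx = hash_grid(lo + np.array(corner), table.shape[0])
            w = np.prod(np.where(np.array(corner) == 1, frac, 1.0 - frac), axis=-1)
            f = f + w[..., None] * table[idx]
        features.append(f)
    return np.concatenate(features, axis=-1)  # fed to a small MLP downstream

# Usage: 8 levels, 2^14-entry tables, 2 features per entry.
# In practice the tables are trainable parameters updated by SGD.
rng = np.random.default_rng(0)
tables = [rng.normal(scale=1e-4, size=(2**14, 2)) for _ in range(8)]
x = rng.random((5, 2))
print(encode(x, tables).shape)  # (5, 16)
```

Because the hash tables at different resolutions collide on different sets of grid cells, the small downstream network can resolve the collisions from the combined multi-level features, which is the property the abstract refers to.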