Thursday, March 23, 2023

Instruct-NeRF2NeRF: Editing 3D Scenes with Instructions

UC Berkeley

TL;DR: Instruct-NeRF2NeRF enables instruction-based editing of NeRFs via a 2D diffusion model

We propose a method for editing NeRF scenes with text instructions. Given a NeRF of a scene and the collection of images used to reconstruct it, our method uses an image-conditioned diffusion model (InstructPix2Pix) to iteratively edit the input images while optimizing the underlying scene, resulting in an optimized 3D scene that respects the edit instruction. We demonstrate that our proposed method can edit large-scale, real-world scenes and accomplish more realistic, targeted edits than prior work.

Our method gradually updates a reconstructed NeRF scene by iteratively editing the dataset images while the NeRF trains (see the sketch after this list):

  1. An image is rendered from the scene at a training viewpoint.
  2. It is edited by InstructPix2Pix given a global text instruction.
  3. The training dataset image is replaced with the edited image.
  4. The NeRF continues training as usual.
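To make the update loop concrete, here is a minimal Python sketch. All names here (nerf, dataset, instruct_pix2pix, edit_every) are hypothetical stand-ins for the actual renderer, dataset container, and diffusion model, and the edit schedule is illustrative only; the point is that per-view edits are interleaved with ordinary NeRF training steps.

    import random

    def iterative_dataset_update(nerf, dataset, instruction, num_steps, edit_every=10):
        # Alternate per-view image edits with standard NeRF optimization.
        for step in range(num_steps):
            if step % edit_every == 0:
                # 1. Render the scene from a training viewpoint.
                idx = random.randrange(len(dataset))
                rendered = nerf.render(dataset.camera(idx))
                # 2. Edit the render with InstructPix2Pix, conditioned on the
                #    original capture and the global text instruction.
                edited = instruct_pix2pix(image=rendered,
                                          condition=dataset.original_image(idx),
                                          prompt=instruction)
                # 3. Replace that viewpoint's training image with the edit.
                dataset.replace_image(idx, edited)
            # 4. Continue training the NeRF as usual on the mixed
            #    (edited and not-yet-edited) dataset.
            nerf.train_step(dataset)

Because only one image is replaced at a time, the training set is temporarily inconsistent across views; the NeRF's multi-view consistency gradually reconciles the edits, which is what the visualization below shows.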

Example updates to the training dataset images over time. Notice that the edits gradually become more consistent.

"Make it look like autumn"

"Make it look like autumn"

Citation

If you use this work or find it helpful, please consider citing:

@article{instructnerf2023,
    author  = {Haque, Ayaan and Tancik, Matthew and Efros, Alexei and Holynski, Aleksander and Kanazawa, Angjoo},
    title   = {Instruct-NeRF2NeRF: Editing 3D Scenes with Instructions},
    journal = {arXiv preprint arXiv:XXXXXX},
    year    = {2023},
}

We thank our colleagues for their insightful feedback and helpful discussions, in particular Ethan Weber, Frederik Warburg, Ben Poole, Richard Szeliski, Jon Barron, Alexander Kristoffersen, Rohan Mathur, Alejandro Escontrela, and the Nerfstudio team.



