Update URLs for readme and image references (nerfstudio-project#2527)
* Update URLs for readme and image references

Update the image URLs and documentation links due to a change in the docs URL structure. Files edited include the readme, documentation, and legacy viewer files.

* Updating url in custom dataset
cvachha authored Oct 14, 2023
1 parent d393473 commit e9a3bae
Showing 8 changed files with 28 additions and 28 deletions.
30 changes: 15 additions & 15 deletions README.md
@@ -23,18 +23,18 @@
<p align="center">
<!-- pypi-strip -->
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://docs.nerf.studio/en/latest/_images/logo-dark.png">
<source media="(prefers-color-scheme: light)" srcset="https://docs.nerf.studio/en/latest/_images/logo.png">
<source media="(prefers-color-scheme: dark)" srcset="https://docs.nerf.studio/_images/logo-dark.png">
<source media="(prefers-color-scheme: light)" srcset="https://docs.nerf.studio/_images/logo.png">
<!-- /pypi-strip -->
<img alt="nerfstudio" src="https://docs.nerf.studio/en/latest/_images/logo.png" width="400">
<img alt="nerfstudio" src="https://docs.nerf.studio/_images/logo.png" width="400">
<!-- pypi-strip -->
</picture>
<!-- /pypi-strip -->
</p>

<!-- Use this for pypi package (and disable above). Hacky workaround -->
<!-- <p align="center">
<img alt="nerfstudio" src="https://docs.nerf.studio/en/latest/_images/logo.png" width="400">
<img alt="nerfstudio" src="https://docs.nerf.studio/_images/logo.png" width="400">
</p> -->

<p align="center"> A collaboration friendly studio for NeRFs </p>
@@ -70,7 +70,7 @@ Nerfstudio initially launched as an opensource project by Berkeley students in [

We are committed to providing learning resources to help you understand the basics of all things NeRF (if you're just getting started) and keep up to date (if you're a seasoned veteran). As researchers, we know just how hard it is to get onboarded with this next-gen technology. So we're here to help with tutorials, documentation, and more!

Have feature requests? Want to add your brand-spankin'-new NeRF model? Have a new dataset? **We welcome [contributions](https://docs.nerf.studio/en/latest/reference/contributing.html)!** Please do not hesitate to reach out to the nerfstudio team with any questions via [Discord](https://discord.gg/uMbNqcraFc).
Have feature requests? Want to add your brand-spankin'-new NeRF model? Have a new dataset? **We welcome [contributions](https://docs.nerf.studio/reference/contributing.html)!** Please do not hesitate to reach out to the nerfstudio team with any questions via [Discord](https://discord.gg/uMbNqcraFc).

Have feedback? We'd love for you to fill out our [Nerfstudio Feedback Form](https://forms.gle/sqN5phJN7LfQVwnP9) if you want to let us know who you are, why you are interested in Nerfstudio, or provide any feedback!

@@ -118,7 +118,7 @@ You must have an NVIDIA video card with CUDA installed on the system. This libra

### Create environment

Nerfstudio requires `python >= 3.8`. We recommend using conda to manage dependencies. Make sure to install [Conda](https://docs.conda.io/en/latest/miniconda.html) before proceeding.
Nerfstudio requires `python >= 3.8`. We recommend using conda to manage dependencies. Make sure to install [Conda](https://docs.conda.io/miniconda.html) before proceeding.

```bash
conda create --name nerfstudio -y python=3.8
@@ -281,23 +281,23 @@ We support four different methods to track training progress, using the viewer[t

And that's it for getting started with the basics of nerfstudio.

If you're interested in learning more on how to create your own pipelines, develop with the viewer, run benchmarks, and more, please check out some of the quicklinks below or visit our [documentation](https://docs.nerf.studio/en/latest/) directly.
If you're interested in learning more on how to create your own pipelines, develop with the viewer, run benchmarks, and more, please check out some of the quicklinks below or visit our [documentation](https://docs.nerf.studio/) directly.

| Section | Description |
| -------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------- |
| [Documentation](https://docs.nerf.studio/en/latest/) | Full API documentation and tutorials |
| [Documentation](https://docs.nerf.studio/) | Full API documentation and tutorials |
| [Viewer](https://viewer.nerf.studio/) | Home page for our web viewer |
| 🎒 **Educational** |
| [Model Descriptions](https://docs.nerf.studio/en/latest/nerfology/methods/index.html) | Description of all the models supported by nerfstudio and explanations of component parts. |
| [Component Descriptions](https://docs.nerf.studio/en/latest/nerfology/model_components/index.html) | Interactive notebooks that explain notable/commonly used modules in various models. |
| [Model Descriptions](https://docs.nerf.studio/nerfology/methods/index.html) | Description of all the models supported by nerfstudio and explanations of component parts. |
| [Component Descriptions](https://docs.nerf.studio/nerfology/model_components/index.html) | Interactive notebooks that explain notable/commonly used modules in various models. |
| 🏃 **Tutorials** |
| [Getting Started](https://docs.nerf.studio/en/latest/quickstart/installation.html) | A more in-depth guide on how to get started with nerfstudio from installation to contributing. |
| [Using the Viewer](https://docs.nerf.studio/en/latest/quickstart/viewer_quickstart.html) | A quick demo video on how to navigate the viewer. |
| [Getting Started](https://docs.nerf.studio/quickstart/installation.html) | A more in-depth guide on how to get started with nerfstudio from installation to contributing. |
| [Using the Viewer](https://docs.nerf.studio/quickstart/viewer_quickstart.html) | A quick demo video on how to navigate the viewer. |
| [Using Record3D](https://www.youtube.com/watch?v=XwKq7qDQCQk) | Demo video on how to run nerfstudio without using COLMAP. |
| 💻 **For Developers** |
| [Creating pipelines](https://docs.nerf.studio/en/latest/developer_guides/pipelines/index.html) | Learn how to easily build new neural rendering pipelines by using and/or implementing new modules. |
| [Creating datasets](https://docs.nerf.studio/en/latest/quickstart/custom_dataset.html) | Have a new dataset? Learn how to run it with nerfstudio. |
| [Contributing](https://docs.nerf.studio/en/latest/reference/contributing.html) | Walk-through for how you can start contributing now. |
| [Creating pipelines](https://docs.nerf.studio/developer_guides/pipelines/index.html) | Learn how to easily build new neural rendering pipelines by using and/or implementing new modules. |
| [Creating datasets](https://docs.nerf.studio/quickstart/custom_dataset.html) | Have a new dataset? Learn how to run it with nerfstudio. |
| [Contributing](https://docs.nerf.studio/reference/contributing.html) | Walk-through for how you can start contributing now. |
| 💖 **Community** |
| [Discord](https://discord.gg/uMbNqcraFc) | Join our community to discuss more. We would love to hear from you! |
| [Twitter](https://twitter.com/nerfstudioteam) | Follow us on Twitter @nerfstudioteam to see cool updates and announcements |
2 changes: 1 addition & 1 deletion docs/nerfology/methods/instant_ngp.md
@@ -34,7 +34,7 @@ Instant-NGP breaks NeRF training into 3 pillars and proposes improvements to eac

The core idea behind the improved sampling technique is that sampling over empty space should be skipped and sampling behind high density areas should also be skipped. This is achieved by maintaining a set of multiscale occupancy grids which coarsely mark empty and non-empty space. Occupancy is stored as a single bit, and a sample on a ray is skipped if its occupancy is too low. These occupancy grids are stored independently of the trainable encoding and are updated throughout training based on the updated density predictions. The authors find they can increase sampling speed by 10-100x compared to naive approaches.
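As a rough sketch (illustrative NumPy only, not NerfAcc's or nerfstudio's actual implementation), culling samples against a boolean occupancy grid might look like this:

```python
import numpy as np

def keep_occupied(samples, grid, aabb_min, aabb_max):
    """Drop ray samples that fall in grid cells marked as empty.

    samples: (N, 3) world-space points along rays.
    grid:    (R, R, R) boolean occupancy grid (True = occupied).
    """
    res = np.array(grid.shape)
    # Map each point into an integer cell index of the occupancy grid.
    idx = ((samples - aabb_min) / (aabb_max - aabb_min) * res).astype(int)
    idx = np.clip(idx, 0, res - 1)
    mask = grid[idx[:, 0], idx[:, 1], idx[:, 2]]
    # Only the surviving points are ever passed to the NeRF network.
    return samples[mask]
```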

Nerfstudio uses [NerfAcc](https://www.nerfacc.com/en/latest/index.html) as the sampling algorithm implementation. The details of NerfAcc's sampling and occupancy grid are discussed [here](https://www.nerfacc.com/en/stable/methodology/sampling.html#occupancy-grid-estimator).
Nerfstudio uses [NerfAcc](https://www.nerfacc.com/index.html) as the sampling algorithm implementation. The details of NerfAcc's sampling and occupancy grid are discussed [here](https://www.nerfacc.com/en/stable/methodology/sampling.html#occupancy-grid-estimator).

Another major bottleneck for NeRF's training speed has been querying the neural network. The authors of this work implement the network such that it runs entirely on a single CUDA kernel. The network is also shrunk down to be just 4 layers with 64 neurons in each layer. They show that their fully-fused neural network is 5-10x faster than a Tensorflow implementation.
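If you want to experiment with such a network directly, the [tiny-cuda-nn](https://github.com/NVlabs/tiny-cuda-nn) PyTorch bindings expose a fully-fused MLP. The snippet below is a hedged sketch (illustrative sizes, requires a CUDA GPU), not the exact network used by Instant-NGP or nerfstudio:

```python
import tinycudann as tcnn  # PyTorch bindings for tiny-cuda-nn; needs CUDA

# A small fused MLP: the forward/backward pass runs in a single CUDA kernel.
mlp = tcnn.Network(
    n_input_dims=32,    # e.g. features produced by a hash-grid encoding
    n_output_dims=16,
    network_config={
        "otype": "FullyFusedMLP",
        "activation": "ReLU",
        "output_activation": "None",
        "n_neurons": 64,      # narrow layers keep the kernel fully fused
        "n_hidden_layers": 2,
    },
)
```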

4 changes: 2 additions & 2 deletions docs/quickstart/custom_dataset.md
@@ -406,7 +406,7 @@ This outputs two 180 deg equirectangular renders horizontally stacked, one for e
</center>

### Setup instructions
To render VR video, it is essential to adjust the NeRF to an approximately true-to-life, real-world scale (adjustable in the camera path) so that the scene depth and IPD (distance between the eyes) are appropriate for the render to be viewable in VR. You can adjust the scene scale with the [Nerfstudio Blender Add-on](https://docs.nerf.studio/en/latest/extensions/blender_addon.html) by appropriately scaling a point cloud representation of the NeRF.
To render VR video, it is essential to adjust the NeRF to an approximately true-to-life, real-world scale (adjustable in the camera path) so that the scene depth and IPD (distance between the eyes) are appropriate for the render to be viewable in VR. You can adjust the scene scale with the [Nerfstudio Blender Add-on](https://docs.nerf.studio/extensions/blender_addon.html) by appropriately scaling a point cloud representation of the NeRF.
Results may be unviewable if the scale is not set appropriately. The IPD is set at 64mm by default but is only accurate when the NeRF scene is true to scale.

For good quality renders, it is recommended to render at high resolutions (For ODS: 4096x2048 per eye, or 2048x1024 per eye. For VR180: 4096x4096 per eye or 2048x2048 per eye). Render resolutions for a single eye are specified in the camera path. For VR180, resolutions must be in a 1:1 aspect ratio. For ODS, resolutions must be in a 2:1 aspect ratio. The final stacked render output will automatically be constructed (with aspect ratios for VR180 as 2:1 and ODS as 1:1).
@@ -417,7 +417,7 @@ If you are rendering an image sequence, it is recommended to render as png inste
:::

To render with the VR videos camera:
1. Use the [Nerfstudio Blender Add-on](https://docs.nerf.studio/en/latest/extensions/blender_addon.html) to set the scale of the NeRF scene and create the camera path
1. Use the [Nerfstudio Blender Add-on](https://docs.nerf.studio/extensions/blender_addon.html) to set the scale of the NeRF scene and create the camera path
- Export a point cloud representation of the NeRF
- Import the point cloud representation in Blender and enable the Nerfstudio Blender Add-on
- Create a reference object, such as a cube that is 1x1x1 meter. You could also create a cylinder and scale it to the approximate height of a viewer.
2 changes: 1 addition & 1 deletion docs/reference/contributing.md
@@ -94,7 +94,7 @@ python nerfstudio/scripts/docs/build_docs.py

### Auto build

As you change or add models/components, the auto-generated [Reference API](https://docs.nerf.studio/en/latest/reference/api/index.html) may change.
As you change or add models/components, the auto-generated [Reference API](https://docs.nerf.studio/reference/api/index.html) may change.
If you want the docs to rebuild on save, you can use [sphinx autobuild](https://github.com/executablebooks/sphinx-autobuild).
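A typical invocation looks like the sketch below; it assumes the Sphinx sources live in `docs/` and the HTML output goes to `docs/_build/html`, so adjust the paths to match the repository's docs setup:

```bash
pip install sphinx-autobuild
# Serve the docs locally and rebuild them whenever a source file changes.
sphinx-autobuild docs docs/_build/html
```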

:::{admonition} Tip
12 changes: 6 additions & 6 deletions nerfstudio/configs/external_methods.py
@@ -45,7 +45,7 @@ class ExternalMethod:
external_methods.append(
ExternalMethod(
"""[bold yellow]Instruct-NeRF2NeRF[/bold yellow]
For more information visit: https://docs.nerf.studio/en/latest/nerfology/methods/in2n.html
For more information visit: https://docs.nerf.studio/nerfology/methods/in2n.html
To enable Instruct-NeRF2NeRF, you must install it first by running:
[grey]pip install git+https://github.com/ayaanzhaque/instruct-nerf2nerf[/grey]""",
@@ -62,7 +62,7 @@ class ExternalMethod:
external_methods.append(
ExternalMethod(
"""[bold yellow]K-Planes[/bold yellow]
For more information visit https://docs.nerf.studio/en/latest/nerfology/methods/kplanes.html
For more information visit https://docs.nerf.studio/nerfology/methods/kplanes.html
To enable K-Planes, you must install it first by running:
[grey]pip install kplanes-nerfstudio[/grey]""",
@@ -78,7 +78,7 @@ class ExternalMethod:
external_methods.append(
ExternalMethod(
"""[bold yellow]LERF[/bold yellow]
For more information visit: https://docs.nerf.studio/en/latest/nerfology/methods/lerf.html
For more information visit: https://docs.nerf.studio/nerfology/methods/lerf.html
To enable LERF, you must install it first by running:
[grey]pip install git+https://github.com/kerrj/lerf[/grey]""",
@@ -95,7 +95,7 @@ class ExternalMethod:
external_methods.append(
ExternalMethod(
"""[bold yellow]Tetra-NeRF[/bold yellow]
For more information visit: https://docs.nerf.studio/en/latest/nerfology/methods/tetranerf.html
For more information visit: https://docs.nerf.studio/nerfology/methods/tetranerf.html
To enable Tetra-NeRF, you must install it first. Please follow the instructions here:
https://github.com/jkulhanek/tetra-nerf/blob/master/README.md#installation""",
@@ -110,7 +110,7 @@ class ExternalMethod:
external_methods.append(
ExternalMethod(
"""[bold yellow]NeRFPlayer[/bold yellow]
For more information visit: https://docs.nerf.studio/en/latest/nerfology/methods/nerfplayer.html
For more information visit: https://docs.nerf.studio/nerfology/methods/nerfplayer.html
To enable NeRFPlayer, you must install it first by running:
[grey]pip install git+https://github.com/lsongx/nerfplayer-nerfstudio[/grey]""",
@@ -125,7 +125,7 @@ class ExternalMethod:
external_methods.append(
ExternalMethod(
"""[bold yellow]Volinga[/bold yellow]
For more information visit: https://docs.nerf.studio/en/latest/extensions/unreal_engine.html
For more information visit: https://docs.nerf.studio/extensions/unreal_engine.html
To enable Volinga, you must install it first by running:
[grey]pip install git+https://github.com/Volinga/volinga-model[/grey]""",
2 changes: 1 addition & 1 deletion nerfstudio/viewer/app/src/modules/Banner/Banner.jsx
@@ -62,7 +62,7 @@ export default function Banner() {
<div className="banner-logo">
<img
style={{ height: 30, margin: 'auto' }}
src="https://docs.nerf.studio/en/latest/_images/logo-dark.png"
src="https://docs.nerf.studio/_images/logo-dark.png"
alt="The favicon."
/>
</div>
@@ -100,7 +100,7 @@ export default function LandingModel(props: LandingModalProps) {
<center>
<img
style={{ height: 37, margin: 'auto' }}
src="https://docs.nerf.studio/en/latest/_images/logo-dark.png"
src="https://docs.nerf.studio/_images/logo-dark.png"
alt="The favicon."
/>
</center>
@@ -30,7 +30,7 @@ export default function ControlsModal() {
<center>
<img
style={{ height: 37, margin: 'auto' }}
src="https://docs.nerf.studio/en/latest/_images/logo-dark.png"
src="https://docs.nerf.studio/_images/logo-dark.png"
alt="The favicon."
/>
<img
