Updated about page
JadenFiotto-Kaufman committed Dec 3, 2023
1 parent d02b937 commit a3221e9
Showing 5 changed files with 7 additions and 7 deletions.
Binary file modified public/_images/remote_execution.png
6 changes: 3 additions & 3 deletions public/_sources/about.rst.txt
@@ -12,8 +12,8 @@ An API for transparent science on black-box AI
 models, but they do not let you see model internals.
 
 The nnsight library is different: it gives you full access to all the neural network internals.
-When used together with a remote service like the National Deep Inference Facility (NDIF),
-it lets you run expriments on huge open models easily, with full transparent access.
+When used together with a remote service like the `National Deep Inference Facility <https://ndif.us/>`_ (NDIF),
+it lets you run experiments on huge open models easily, with full transparent access.
 The nnsight library is also terrific for studying smaller local models.
 
 .. figure:: _static/images/remote_execution.png
@@ -25,7 +25,7 @@ How you use nnsight
 
 Nnsight is built on pytorch.
 
-Running inference on a huge remote model with nnsight is very similar to running a neural network locally on your own workstataion. In fact, with nnsight, the same code for running experiments locally on small models can also be used on large models, just by changing a few arguments.
+Running inference on a huge remote model with nnsight is very similar to running a neural network locally on your own workstation. In fact, with nnsight, the same code for running experiments locally on small models can also be used on large models, just by changing a few arguments.
 
 The difference between nnsight and normal inference is that when you use nnsight, you do not treat the model as an opaque black box.
 Instead, you set up a python ``with`` context that enables you to get direct access to model internals while the neural network runs.
Binary file modified public/_static/images/remote_execution.png
6 changes: 3 additions & 3 deletions public/about/index.html
@@ -428,8 +428,8 @@ <h2>An API for transparent science on black-box AI<a class="headerlink" href="#a
 that are hard to run. Ordinary commercial inference service APIs let you interact with huge
 models, but they do not let you see model internals.</p>
 <p class="sd-card-text">The nnsight library is different: it gives you full access to all the neural network internals.
-When used together with a remote service like the National Deep Inference Facility (NDIF),
-it lets you run expriments on huge open models easily, with full transparent access.
+When used together with a remote service like the <a class="reference external" href="https://ndif.us/">National Deep Inference Facility</a> (NDIF),
+it lets you run experiments on huge open models easily, with full transparent access.
 The nnsight library is also terrific for studying smaller local models.</p>
 </div>
 </div>
@@ -443,7 +443,7 @@ <h2>An API for transparent science on black-box AI<a class="headerlink" href="#a
 <section id="how-you-use-nnsight">
 <h2>How you use nnsight<a class="headerlink" href="#how-you-use-nnsight" title="Link to this heading">#</a></h2>
 <p>Nnsight is built on pytorch.</p>
-<p>Running inference on a huge remote model with nnsight is very similar to running a neural network locally on your own workstataion. In fact, with nnsight, the same code for running experiments locally on small models can also be used on large models, just by changing a few arguments.</p>
+<p>Running inference on a huge remote model with nnsight is very similar to running a neural network locally on your own workstation. In fact, with nnsight, the same code for running experiments locally on small models can also be used on large models, just by changing a few arguments.</p>
 <p>The difference between nnsight and normal inference is that when you use nnsight, you do not treat the model as an opaque black box.
 Instead, you set up a python <code class="docutils literal notranslate"><span class="pre">with</span></code> context that enables you to get direct access to model internals while the neural network runs.
 Here is how it looks:</p>
2 changes: 1 addition & 1 deletion public/searchindex.js

Large diffs are not rendered by default.
