Fix code block syntax and language specification
V0XNIHILI committed Dec 7, 2023
1 parent 466b9a4 commit 2d7cd49
Showing 1 changed file with 8 additions and 2 deletions.
frameworks.qmd: 10 changes (8 additions, 2 deletions)
@@ -226,15 +226,21 @@ Deep learning frameworks have traditionally followed one of two approaches for e

For example:

-```{{python}} x = tf.placeholder(tf.float32) y = tf.matmul(x, weights) + biases ```
+```{.python}
+x = tf.placeholder(tf.float32)
+y = tf.matmul(x, weights) + biases
+```

The model is defined separately from execution, like drawing up a blueprint before construction. In TensorFlow 1.x this is done with tf.Graph(): all ops and variables must be declared upfront, the graph is then compiled and optimized, and execution happens later by feeding in tensor values (see the sketch after this excerpt).

**Dynamic graphs (define-by-run):** In contrast to declaring everything first and executing later, the graph is built dynamically as execution happens. There is no separate declaration phase; operations execute immediately as they are defined. This style is more imperative and flexible, which makes experimentation easier.

PyTorch uses dynamic graphs, building the graph on the fly as execution happens. For example, consider the following snippet, in which the graph is constructed as each operation runs:

-```{{python}} x = torch.randn(4,784) y = torch.matmul(x, weights) + biases ```
+```{.python}
+x = torch.randn(4,784)
+y = torch.matmul(x, weights) + biases
+```

In the above example, there are no separate compile, build, and run phases; ops are defined and executed immediately. With dynamic graphs, definition is intertwined with execution, which gives a more intuitive, interactive workflow. The downside is less potential for optimization, since the framework only ever sees the graph as it is being built.
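To make the define-then-run contrast concrete, the following is a minimal sketch of the two-phase TensorFlow 1.x workflow described in the excerpt above. It is an illustration only: the shapes, variable names, and the `tf.Session`/`feed_dict` execution step are assumptions based on the standard TF 1.x API, not code taken from frameworks.qmd.

```{.python}
import numpy as np
import tensorflow as tf  # assumes TensorFlow 1.x

# Declaration phase: build the graph up front. No computation happens yet.
graph = tf.Graph()
with graph.as_default():
    x = tf.placeholder(tf.float32, shape=(None, 784))  # values fed in later
    weights = tf.Variable(tf.zeros((784, 10)))          # illustrative shapes
    biases = tf.Variable(tf.zeros(10))
    y = tf.matmul(x, weights) + biases
    init = tf.global_variables_initializer()

# Execution phase: the compiled graph is run by feeding concrete tensor values.
with tf.Session(graph=graph) as sess:
    sess.run(init)
    batch = np.random.randn(4, 784).astype(np.float32)
    result = sess.run(y, feed_dict={x: batch})
    print(result.shape)  # (4, 10)
```

In the dynamic-graph PyTorch snippet, by contrast, the same matmul produces a concrete tensor the moment the line runs, with no session or feed step.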
