tensor shapes not displayed in rendered graph #83

Open
asberk opened this issue Sep 23, 2020 · 5 comments

Comments
@asberk commented Sep 23, 2020

Using the following code, which comes from the PyTorch demo notebook, I get the output shown in the image below. The demo notebook's first example suggests that tensor shapes are printed on the edges between nodes, but I have found this not to be the case in my attempts. How can I get the tensor shapes printed along the edges? Is this a bug, or is there documentation I can refer to in order to fix the output?

(By the way, I am running Python 3.8 with the latest versions of PyTorch (1.6.0) and hiddenlayer.)

import torch
import torchvision
import hiddenlayer as hl

model = torchvision.models.resnet101()

# Rather than using the default transforms, build custom ones to group
# nodes of residual and bottleneck blocks.
transforms = [
    # Fold Conv, BN, RELU layers into one
    hl.transforms.Fold("Conv > BatchNorm > Relu", "ConvBnRelu"),
    # Fold Conv, BN layers together
    hl.transforms.Fold("Conv > BatchNorm", "ConvBn"),
    # Fold bottleneck blocks
    hl.transforms.Fold(
        """
        ((ConvBnRelu > ConvBnRelu > ConvBn) | ConvBn) > Add > Relu
        """,
        "BottleneckBlock",
        "Bottleneck Block",
    ),
    # Fold residual blocks
    hl.transforms.Fold(
        """ConvBnRelu > ConvBnRelu > ConvBn > Add > Relu""",
        "ResBlock",
        "Residual Block",
    ),
    # Fold repeated blocks
    hl.transforms.FoldDuplicates(),
]

# Display graph using the transforms above
res_graph = hl.build_graph(
    model, torch.zeros([1, 3, 224, 224]), transforms=transforms
)
res_graph.save(path="./log/res_graph.pdf")

@peterqtr11 commented:

I'm facing the same problem.

@Alexis-Martin commented:

There is a fix for that. You need to replace the function get_shape(torch_node) in the file pytorch_builder.py with this one:

def get_shape(torch_node):
    """Return the output shape of the given PyTorch node."""
    # Read the output shape from the traced node's type annotation.
    # See the question in the PyTorch forum:
    # https://discuss.pytorch.org/t/node-output-shape-from-trace-graph/24351/2
    # TODO: Assuming the node has one output. Update if we encounter a multi-output node.
    shape = torch_node.output().type().sizes()
    return shape

The solution comes from the discussion linked above.
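
If you would rather not edit the installed package, the same fix can be applied at runtime by monkey-patching the function before building the graph. A minimal sketch, assuming hiddenlayer still defines get_shape at module level in pytorch_builder.py:

import hiddenlayer.pytorch_builder as pytorch_builder

def patched_get_shape(torch_node):
    """Return the output shape of the given PyTorch node."""
    return torch_node.output().type().sizes()

# import_graph() looks get_shape up through the module's globals,
# so rebinding it here affects subsequent hl.build_graph() calls.
pytorch_builder.get_shape = patched_get_shape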

ahrnbom added a commit to ahrnbom/hiddenlayer that referenced this issue Jun 12, 2021
Fixed shapes for PyTorch plots. See waleedka#83 for more info.
@ahrnbom commented Jun 12, 2021

I had this problem too, and Alexis-Martin's solution worked for me, so I created a pull request: #89

@lxfhfut commented Jan 21, 2022

If Alexis-Martin's solution does not work, try:

def get_shape(torch_node):
    """Return the output shape of the given PyTorch node, or None."""
    try:
        shape = torch_node.output().type().sizes()
    except RuntimeError:
        # Fall back when the node has no usable shape information.
        shape = None
    return shape

Refer here for more details.

@markytools commented, quoting lxfhfut's suggestion above:

This fixed the "RuntimeError: outputs_.size() == 1 INTERNAL ASSERT FAILED" problem for me.
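
For context, the assert appears to come from torch_node.output(), which requires the traced node to have exactly one output; catching the exception means such nodes simply get no shape label instead of aborting the whole build. A variant that checks the outputs explicitly is sketched below, assuming the torch._C.Node API (outputs() on nodes, type().sizes() on values):

def get_shape(torch_node):
    """Return the output shape of the given PyTorch node, or None."""
    # output() asserts that the node has exactly one output, so
    # inspect outputs() directly to handle multi- and zero-output nodes.
    outputs = list(torch_node.outputs())
    if len(outputs) != 1:
        return None
    try:
        # sizes() can still fail for outputs without static shape info.
        return outputs[0].type().sizes()
    except (RuntimeError, AttributeError):
        return None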

Strasser-Pablo added a commit to Strasser-Pablo/hiddenlayer that referenced this issue Mar 31, 2022
Used code from waleedka#83.