

An option to minimize/omit Normalization and Activation layers #114

Open
caiuspetronius opened this issue Oct 31, 2024 · 0 comments

Is your feature request related to a problem? Please describe.
Large networks appear about three times larger than they should because of the omnipresent normalization and activation layers. There is little need to see normalizations and activations rendered as blocks the same size as more important layers, e.g. convolutions.

Describe the solution you'd like
Ideally, an option to fold each of them into a thin line, one color for normalization and another for activation, attached to the bottom of the block it follows.

Describe alternatives you've considered
An option to omit them altogether would be another solution; a rough sketch of both ideas follows.
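
For illustration, here is a minimal sketch of how the filtering side of this might look, assuming PyTorch modules. The `MINOR_TYPES` tuple, the `layers_to_draw` helper, and the `collapse_minor_layers` flag are hypothetical names for this write-up, not part of this project's existing API:

```python
import torch.nn as nn

# Hypothetical classification of "minor" layers a visualizer could fold
# into thin strips or skip entirely. Extend as needed.
MINOR_TYPES = (
    nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d,
    nn.LayerNorm, nn.GroupNorm,
    nn.ReLU, nn.LeakyReLU, nn.GELU, nn.SiLU, nn.Sigmoid, nn.Tanh,
)

def layers_to_draw(model: nn.Module, collapse_minor_layers: bool = True):
    """Yield (name, module, is_minor) for leaf modules.

    Callers could render minor layers as thin colored lines under the
    preceding block, or drop them when collapse_minor_layers is True.
    """
    for name, module in model.named_modules():
        if list(module.children()):  # skip containers, keep leaf layers
            continue
        is_minor = isinstance(module, MINOR_TYPES)
        if collapse_minor_layers and is_minor:
            continue
        yield name, module, is_minor

if __name__ == "__main__":
    model = nn.Sequential(
        nn.Conv2d(3, 16, 3), nn.BatchNorm2d(16), nn.ReLU(),
        nn.Conv2d(16, 32, 3), nn.BatchNorm2d(32), nn.ReLU(),
    )
    # With the flag on, only the two Conv2d layers are emitted.
    for name, module, _ in layers_to_draw(model):
        print(name, type(module).__name__)
```

With the flag off, the same iterator still yields the norm/activation layers but tags them, so the drawing code could size them differently instead of dropping them.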

