
[quidditch_snitch] Add start_zero_mem_transfer operation #113

Merged
merged 1 commit into main from zero-mem-transfer on Aug 18, 2024

Conversation

zero9178
Member

This operation is a special kind of DMA transfer that leverages both Snitch's DMA and a special address space in the cluster's memory that always reads back 0. This allows zeroing memory in bursts of 64 kB using the DMA.

While currently unused in the pipeline, the operation will soon be used to implement a lowering of `tensor.pad`. Occurrences of `linalg.fill` with zero can also be optimized in the future.

@zero9178 zero9178 merged commit d35e424 into main Aug 18, 2024
1 check passed
@zero9178 zero9178 deleted the zero-mem-transfer branch August 18, 2024 13:33