refactor for softmax, split, topk, transpose #1404
Conversation
Codecov Report
All modified and coverable lines are covered by tests ✅

Additional details and impacted files

@@             Coverage Diff             @@
##       repo-refactor    #1404      +/-   ##
=============================================
+ Coverage       38.10%   41.32%    +3.22%
=============================================
  Files             167      181       +14
  Lines            5026     5265      +239
  Branches          246      271       +25
=============================================
+ Hits             1915     2176      +261
+ Misses           3111     3089       -22
Flags with carried forward coverage won't be shown.
Reviewable status: 0 of 4 files reviewed, 4 unresolved discussions (waiting on @Bob-Chen222)
lib/kernels/src/hip/topk_kernels.cpp
line 24 at r1 (raw file):
namespace Kernels { namespace TopK {
Missing init_kernel:
TopKPerDeviceState init_kernel(bool sorted) {
  TopKPerDeviceState per_device_state = {sorted};
  return per_device_state;
}
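For reference, a minimal sketch of how the suggested init_kernel could be slotted into the HIP source. The enclosing FlexFlow namespace and the exact layout of TopKPerDeviceState (a single sorted flag) are assumptions inferred from the quoted excerpt, not verified against the repository.

```cpp
// Sketch only: assumes TopKPerDeviceState is an aggregate declared in the
// corresponding header with a single `sorted` member, and that the file uses
// the same Kernels::TopK nesting as the quoted excerpt (outer FlexFlow
// namespace is an assumption).
namespace FlexFlow {
namespace Kernels {
namespace TopK {

TopKPerDeviceState init_kernel(bool sorted) {
  // No device work is needed at init time; just record the configuration.
  TopKPerDeviceState per_device_state = {sorted};
  return per_device_state;
}

} // namespace TopK
} // namespace Kernels
} // namespace FlexFlow
```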
lib/kernels/src/hip/transpose_kernels.cpp
line 46 at r1 (raw file):
} __global__ void transpose_simple_kernel(coord_t volume,
Use size_t instead of coord_t.
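To make the suggestion concrete, a hedged before/after sketch of the signature change. Only `volume` comes from the quoted code; the other parameter names and the loop body are illustrative (the real kernel applies the transpose index mapping rather than a straight copy).

```cpp
// Before (r1): coord_t, the Legion coordinate typedef, leaks into the kernel.
// __global__ void transpose_simple_kernel(coord_t volume, ...);

// After (sketch): plain size_t keeps the HIP kernel free of Legion types.
__global__ void transpose_simple_kernel(size_t volume,
                                        float const *in_ptr,
                                        float *out_ptr) {
  // Illustrative grid-stride loop over `volume` elements; the real kernel
  // permutes indices according to the transpose's strides/extents.
  for (size_t i = blockIdx.x * blockDim.x + threadIdx.x; i < volume;
       i += (size_t)blockDim.x * gridDim.x) {
    out_ptr[i] = in_ptr[i];
  }
}
```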
lib/kernels/src/hip/transpose_kernels.cpp
line 67 at r1 (raw file):
float const *input_ptr, float *output_ptr, Domain in_domain,
Refactor to avoid Domain, since that's a Legion object. Should use ArrayShape instead; check out the CUDA kernel for this.
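For illustration, a sketch of the interface shape being asked for here. ArrayShape usage, the stream and per-device-state parameters, and all names below are assumptions patterned on this comment and the repository's general kernel style, not the actual CUDA signature.

```cpp
// Before (r1): Legion's Domain leaks into the kernel-library interface.
// void forward_kernel(..., float const *input_ptr, float *output_ptr,
//                     Domain in_domain, ...);

// After (sketch): describe tensor extents with the kernel library's own
// ArrayShape, mirroring the CUDA transpose kernel. Names are illustrative.
void forward_kernel(hipStream_t stream,
                    TransposePerDeviceState const &state,
                    float const *input_ptr,
                    float *output_ptr,
                    ArrayShape const &in_shape,
                    ArrayShape const &out_shape);
```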
lib/kernels/src/hip/transpose_kernels.cpp
line 96 at r1 (raw file):
float *input_grad_ptr, float const *output_grad_ptr, Domain in_grad_domain,
See above on Domain
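And the matching backward entry point, again as a hedged sketch with illustrative names; only input_grad_ptr and output_grad_ptr come from the quoted code.

```cpp
// Sketch: the gradient pass mirrors the forward signature, taking ArrayShape
// for the input-/output-gradient extents instead of Legion's Domain.
void backward_kernel(hipStream_t stream,
                     TransposePerDeviceState const &state,
                     float *input_grad_ptr,
                     float const *output_grad_ptr,
                     ArrayShape const &in_grad_shape,
                     ArrayShape const &out_grad_shape);
```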
Reviewable status: 0 of 4 files reviewed, 4 unresolved discussions (waiting on @reyna-abhyankar)
lib/kernels/src/hip/topk_kernels.cpp
line 24 at r1 (raw file):
Previously, reyna-abhyankar (Reyna Abhyankar) wrote…
Missing init_kernel:
TopKPerDeviceState init_kernel(bool sorted) {
  TopKPerDeviceState per_device_state = {sorted};
  return per_device_state;
}
Done.
lib/kernels/src/hip/transpose_kernels.cpp
line 46 at r1 (raw file):
Previously, reyna-abhyankar (Reyna Abhyankar) wrote…
size_t instead of coord_t.
Done.
lib/kernels/src/hip/transpose_kernels.cpp
line 67 at r1 (raw file):
Previously, reyna-abhyankar (Reyna Abhyankar) wrote…
Refactor to avoid Domain, since that's a Legion object. Should use ArrayShape instead; check out the CUDA kernel for this.
Done.
lib/kernels/src/hip/transpose_kernels.cpp
line 96 at r1 (raw file):
Previously, reyna-abhyankar (Reyna Abhyankar) wrote…
See above on Domain.
Done.
Description of changes:
Refactor for softmax, split, topk, transpose
Related Issues:
Linked Issues:
Issues closed by this PR: