
refactor for softmax, split, topk, transpose #1404

Merged

Conversation

Contributor

@Bob-Chen222 Bob-Chen222 commented Jun 1, 2024

Description of changes:

Refactor for softmax, split, topk, transpose



codecov bot commented Jun 1, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 41.32%. Comparing base (89cbe93) to head (c46535f).

Current head c46535f differs from pull request most recent head 09c1aa6

Please upload reports for the commit 09c1aa6 to get more accurate results.

Additional details and impacted files
@@                Coverage Diff                @@
##           repo-refactor    #1404      +/-   ##
=================================================
+ Coverage          38.10%   41.32%   +3.22%     
=================================================
  Files                167      181      +14     
  Lines               5026     5265     +239     
  Branches             246      271      +25     
=================================================
+ Hits                1915     2176     +261     
+ Misses              3111     3089      -22     
Flag       Coverage Δ
unittests  41.32% <ø> (+3.22%) ⬆️

Flags with carried forward coverage won't be shown.

see 223 files with indirect coverage changes

Collaborator

@reyna-abhyankar reyna-abhyankar left a comment


Reviewable status: 0 of 4 files reviewed, 4 unresolved discussions (waiting on @Bob-Chen222)


lib/kernels/src/hip/topk_kernels.cpp line 24 at r1 (raw file):

namespace Kernels {
namespace TopK {

Missing init_kernel

TopKPerDeviceState init_kernel(bool sorted) {
  TopKPerDeviceState per_device_state = {sorted};
  return per_device_state;
}

lib/kernels/src/hip/transpose_kernels.cpp line 46 at r1 (raw file):

}

__global__ void transpose_simple_kernel(coord_t volume,

size_t instead of coord_t


lib/kernels/src/hip/transpose_kernels.cpp line 67 at r1 (raw file):

                    float const *input_ptr,
                    float *output_ptr,
                    Domain in_domain,

Refactor to avoid Domain, since that's a Legion object. Use ArrayShape instead; check out the CUDA kernel for this.


lib/kernels/src/hip/transpose_kernels.cpp line 96 at r1 (raw file):

                     float *input_grad_ptr,
                     float const *output_grad_ptr,
                     Domain in_grad_domain,

See above on Domain
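The Domain-to-ArrayShape suggestion above can be sketched on the CPU as follows. Note that this `ArrayShape` stand-in and the index arithmetic are illustrative assumptions, not the actual flexflow types: the point is that each flat output index is mapped back to an input index via per-dimension strides and a permutation, which is the same arithmetic the HIP kernel would do per thread.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Illustrative stand-in for an ArrayShape-like type (not the real flexflow
// ArrayShape): dimension sizes plus derived row-major strides.
struct ArrayShape {
  std::vector<std::size_t> dims;

  std::size_t volume() const {
    std::size_t v = 1;
    for (std::size_t d : dims) v *= d;
    return v;
  }

  std::vector<std::size_t> strides() const {
    std::vector<std::size_t> s(dims.size(), 1);
    for (int i = (int)dims.size() - 2; i >= 0; --i)
      s[i] = s[i + 1] * dims[i + 1];
    return s;
  }
};

// CPU analogue of a transpose kernel body: for each flat output index,
// recover the multi-dimensional coordinates, apply the permutation `perm`
// (output dim i comes from input dim perm[i]), and read the input.
void transpose_simple(std::size_t volume,
                      float const *input_ptr,
                      float *output_ptr,
                      ArrayShape const &in_shape,
                      std::vector<std::size_t> const &perm) {
  ArrayShape out_shape;
  for (std::size_t p : perm)
    out_shape.dims.push_back(in_shape.dims[p]);
  std::vector<std::size_t> in_strides = in_shape.strides();
  std::vector<std::size_t> out_strides = out_shape.strides();

  for (std::size_t o = 0; o < volume; ++o) {
    std::size_t rem = o;
    std::size_t in_idx = 0;
    for (std::size_t d = 0; d < out_shape.dims.size(); ++d) {
      std::size_t coord = rem / out_strides[d];
      rem %= out_strides[d];
      in_idx += coord * in_strides[perm[d]];
    }
    output_ptr[o] = input_ptr[in_idx];
  }
}
```

Transposing a row-major 2x3 array with `perm = {1, 0}` yields its 3x2 transpose; in the HIP kernel the outer loop becomes the per-thread flat index. The `volume` parameter is `std::size_t`, in line with the size_t-over-coord_t comment above.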

Contributor Author

@Bob-Chen222 Bob-Chen222 left a comment


Reviewable status: 0 of 4 files reviewed, 4 unresolved discussions (waiting on @reyna-abhyankar)


lib/kernels/src/hip/topk_kernels.cpp line 24 at r1 (raw file):

Previously, reyna-abhyankar (Reyna Abhyankar) wrote…

Missing init_kernel

TopKPerDeviceState init_kernel(bool sorted) {
  TopKPerDeviceState per_device_state = {sorted};
  return per_device_state;
}

Done.


lib/kernels/src/hip/transpose_kernels.cpp line 46 at r1 (raw file):

Previously, reyna-abhyankar (Reyna Abhyankar) wrote…

size_t instead of coord_t

Done.


lib/kernels/src/hip/transpose_kernels.cpp line 67 at r1 (raw file):

Previously, reyna-abhyankar (Reyna Abhyankar) wrote…

Refactor to avoid Domain, since that's a Legion object. Use ArrayShape instead; check out the CUDA kernel for this.

Done.


lib/kernels/src/hip/transpose_kernels.cpp line 96 at r1 (raw file):

Previously, reyna-abhyankar (Reyna Abhyankar) wrote…

See above on Domain

Done.

@reyna-abhyankar reyna-abhyankar enabled auto-merge (squash) June 5, 2024 01:21
@reyna-abhyankar reyna-abhyankar merged commit af1caf5 into flexflow:repo-refactor Jun 5, 2024
7 of 8 checks passed

3 participants