This repository has been archived by the owner on Aug 7, 2024. It is now read-only.

[DISCUSSION] fix float8 all-gather in FSDP2 + TP: DTensor(WeightWithDynamicFloat8CastTensor) #326

Draft: wants to merge 10 commits into base main
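For context, a minimal sketch of the kind of 2D setup this PR targets: FSDP2 (fully_shard) composed with tensor parallelism on a 2D device mesh, with float8 linears in between. This is illustrative only, not code from the PR: the FeedForward module is a toy stand-in, the mesh shape assumes a 4-GPU torchrun launch, and the float8 conversion step is left as a comment because the exact float8_experimental entry point and all-gather config flag are not shown on this page.

```python
import os

import torch
import torch.nn as nn
from torch.distributed._composable.fsdp import fully_shard
from torch.distributed.device_mesh import init_device_mesh
from torch.distributed.tensor.parallel import (
    ColwiseParallel,
    RowwiseParallel,
    parallelize_module,
)


class FeedForward(nn.Module):
    """Toy stand-in for a transformer MLP with two float8-able linears."""

    def __init__(self, dim: int = 256):
        super().__init__()
        self.w1 = nn.Linear(dim, 4 * dim, bias=False)
        self.w2 = nn.Linear(4 * dim, dim, bias=False)

    def forward(self, x):
        return self.w2(torch.relu(self.w1(x)))


def main():
    torch.cuda.set_device(int(os.environ["LOCAL_RANK"]))
    # 2D mesh: "dp" dim for FSDP2 sharding, "tp" dim for tensor parallelism.
    mesh = init_device_mesh("cuda", (2, 2), mesh_dim_names=("dp", "tp"))
    model = FeedForward().cuda()

    # Float8 conversion would happen here (swapping nn.Linear for Float8Linear so the
    # weights become WeightWithDynamicFloat8CastTensor when float8 all-gather is on);
    # the exact float8_experimental call is not shown in this PR page, so it is left
    # as a placeholder comment.

    # TP first: parameters become DTensors sharded over the "tp" mesh dim.
    parallelize_module(
        model,
        mesh["tp"],
        {"w1": ColwiseParallel(), "w2": RowwiseParallel()},
    )
    # Then FSDP2 shards the (already-DTensor) parameters over the "dp" mesh dim.
    fully_shard(model, mesh=mesh["dp"])

    x = torch.randn(8, 256, device="cuda")
    model(x).sum().backward()


if __name__ == "__main__":
    main()
```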

Commits on Jul 17, 2024

  1. add unit test for FSDP2 + torch.compile(transformer block)
     weifengpy committed Jul 17, 2024 (commit b5cad8d). A sketch of this pattern follows this list.
  2. commit a6b8913
  3. remove debug lines
     weifengpy committed Jul 17, 2024 (commit 272e85b)
  4. fix linter
     weifengpy committed Jul 17, 2024 (commit 097ceed)
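A minimal sketch of what a per-block FSDP2 + torch.compile test might look like. Assumptions: a toy Block module stands in for a transformer block, the script is launched with torchrun on NCCL, and the shard-then-compile ordering is only one reasonable choice; this is not the test code from the commit above.

```python
import os

import torch
import torch.distributed as dist
import torch.nn as nn
from torch.distributed._composable.fsdp import fully_shard
from torch.distributed.device_mesh import init_device_mesh


class Block(nn.Module):
    """Toy stand-in for a transformer block."""

    def __init__(self, dim: int = 64):
        super().__init__()
        self.ffn = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x):
        return x + self.ffn(x)


def main():
    dist.init_process_group("nccl")
    torch.cuda.set_device(int(os.environ["LOCAL_RANK"]))
    mesh = init_device_mesh("cuda", (dist.get_world_size(),))

    model = nn.Sequential(Block().cuda(), Block().cuda())
    for block in model:
        block.compile()                # compile each block, not the whole model
        fully_shard(block, mesh=mesh)  # shard each block individually
    fully_shard(model, mesh=mesh)      # root wrap for any remaining parameters

    x = torch.randn(8, 64, device="cuda")
    # Forward + backward exercises FSDP2 all-gather/reduce-scatter around the compiled blocks.
    model(x).sum().backward()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```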

Commits on Jul 18, 2024

  1. numeric baseline against compiled model
     weifengpy committed Jul 18, 2024 (commit b6ebf8d). A sketch of this kind of check follows this list.
  2. update README and CI
     weifengpy committed Jul 18, 2024 (commit 2eaa51b)
  3. commit 969f91f
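An illustrative sketch (not the commit's test code) of how a numeric baseline against a compiled model is usually taken: run the same input through an eager reference and a torch.compile'd deep copy, then compare outputs and gradients with torch.testing.assert_close. The toy model and tolerances are assumptions.

```python
import copy

import torch
import torch.nn as nn

torch.manual_seed(0)
model_ref = nn.Sequential(nn.Linear(32, 32), nn.ReLU(), nn.Linear(32, 32)).cuda()
model_cmp = torch.compile(copy.deepcopy(model_ref))

x = torch.randn(4, 32, device="cuda")
out_ref = model_ref(x)
out_cmp = model_cmp(x)

# Compiled kernels may reorder floating-point math, so compare against the eager
# baseline with tolerances rather than exact equality.
torch.testing.assert_close(out_cmp, out_ref, rtol=1e-4, atol=1e-4)

out_ref.sum().backward()
out_cmp.sum().backward()
for p_ref, p_cmp in zip(model_ref.parameters(), model_cmp.parameters()):
    torch.testing.assert_close(p_cmp.grad, p_ref.grad, rtol=1e-4, atol=1e-4)
```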

Commits on Jul 21, 2024

  1. commit f475c40

Commits on Jul 24, 2024

  1. fix float8 all-gather in 2d
     weifengpy committed Jul 24, 2024 (commit cc763ce). A sketch of the layout this fix targets follows this list.
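A conceptual sketch of the invariant the PR title points at for the 2D case: the parameter should be a DTensor whose local shard is the WeightWithDynamicFloat8CastTensor subclass, i.e. DTensor(WeightWithDynamicFloat8CastTensor), so that FSDP2 can cast the local shard to float8 before all-gather. The import path and the helper below are assumptions for illustration, not code from this PR.

```python
import torch
from torch.distributed._tensor import DTensor

# Assumed import path: the tensor subclass float8_experimental uses so FSDP2-managed
# weights are cast to float8 before the all-gather.
from float8_experimental.fsdp_utils import WeightWithDynamicFloat8CastTensor


def has_2d_float8_allgather_layout(param: torch.nn.Parameter) -> bool:
    """Return True if the parameter has the nesting this fix aims for:
    DTensor on the outside (TP/FSDP2 sharding), float8-cast weight subclass inside."""
    return isinstance(param, DTensor) and isinstance(
        param.to_local(), WeightWithDynamicFloat8CastTensor
    )
```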

Commits on Aug 1, 2024

  1. tested successfully
     weifengpy committed Aug 1, 2024 (commit 7fbb867)