
Add prefill/decode from seq lens in BaseCausalLMModel #383

Open · wants to merge 2 commits into base: main

Conversation

sogartar
Contributor

@sogartar sogartar commented Oct 30, 2024

We do not have a clearly defined interface for LMs. Decode and prefill have different signatures when exporting to IREE: one of them takes an attention mask, the other sequence lengths.

This change adds default implementations of the new prefill_from_seq_lens and decode_from_seq_lens methods to BaseCausalLMModel.

The export script export_paged_llm_v1 does too much in its exported functions: it computes the attention mask, shards its arguments, and unshards its results. This change lets it be a thinner wrapper around the new functions.

Make paged_llm_v1.TorchGenerator use the new interface methods.
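The shape of the change described above can be sketched roughly as follows. This is illustrative only, not the sharktank implementation: the helper name, the class name, and the method bodies are made up here; only prefill_from_seq_lens and the mask-vs-seq-lens distinction come from the PR.

```python
# Illustrative sketch: a default *_from_seq_lens entry point that derives an
# attention mask from per-sequence lengths and delegates to the mask-based path.
from typing import List


def attention_mask_from_seq_lens(seq_lens: List[int], max_len: int) -> List[List[bool]]:
    """True marks valid (non-padded) token positions for each sequence."""
    return [[pos < n for pos in range(max_len)] for n in seq_lens]


class CausalLMSketch:
    def prefill(self, tokens, attention_mask):
        # Stand-in for a real model prefill; here it just counts unmasked tokens.
        return [sum(row) for row in attention_mask]

    def prefill_from_seq_lens(self, tokens, seq_lens):
        # Default implementation: build the mask once, then reuse the
        # mask-based prefill, so the export script need not do it.
        max_len = len(tokens[0])
        return self.prefill(tokens, attention_mask_from_seq_lens(seq_lens, max_len))
```

With this split, an export script can expose prefill_from_seq_lens directly instead of reimplementing the mask computation inside each exported function.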

We do not have a clearly defined interface for LMs. Decode and prefill have different signatures when exporting to IREE.
This adds a new ABC, CausalLMModelABC, that distinguishes between the two variants.
The BaseCausalLMModel provides a default implementation for the new prefill_from_seq_lens and decode_from_seq_lens methods.

The export script export_paged_llm_v1 does too much in its exported functions.
It computes the attention mask, shards its arguments, and unshards its result.
This change lets it be a thinner wrapper around the new interface functions.

Make paged_llm_v1.TorchGenerator use the new interface methods.
@@ -27,7 +27,7 @@
################################################################################


-class PagedLlamaModelV1(BaseCausalLMModel):
+class PagedLlamaModelV1(BaseCausalLMModel, CausalLMModelABC):
Contributor


Double inheritance is almost always a bad idea, especially considering these are both LLM interfaces. Can we have an alternative option? Why can't we expand this functionality in the existing BaseCausalLMModel?

Contributor Author


I removed CausalLMModelABC. I added the prefill and decode unimplemented methods to BaseCausalLMModel.
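The resolution described in this reply (unimplemented stubs on the existing base class instead of a separate ABC) is a common pattern; a minimal sketch, assuming nothing about the real sharktank signatures beyond the method names:

```python
# Illustrative sketch of the "stubs on the base class" pattern: the base
# declares prefill/decode but leaves them unimplemented, so subclasses must
# override them, without introducing a second interface class.
class BaseCausalLMModelSketch:
    def prefill(self, tokens, attention_mask):
        raise NotImplementedError("subclasses must implement prefill")

    def decode(self, tokens, attention_mask):
        raise NotImplementedError("subclasses must implement decode")


class PagedModelSketch(BaseCausalLMModelSketch):
    def prefill(self, tokens, attention_mask):
        return "prefilled"  # placeholder body

    def decode(self, tokens, attention_mask):
        return "decoded"  # placeholder body
```

Compared with multiple inheritance from an ABC, this keeps a single class hierarchy while still failing loudly when a subclass forgets to implement one of the entry points.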

@sogartar sogartar changed the title Introduce CausalLMModelABC interface Add prefill/decode from seq lens in BaseCausalLMModel Nov 7, 2024
@sogartar sogartar requested a review from rsuderman November 7, 2024 13:15
2 participants