
# simple-llama3

A simple, tested PyTorch implementation of Llama 3 without fairscale.

If you want to understand the transformer model, I recommend reading my implementation of a vanilla transformer first, since I reuse some of that code here.
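As a taste of the kind of component such an implementation contains, here is a minimal sketch of RMSNorm, the normalization that Llama-family models use in place of LayerNorm. This is an illustrative example only, assuming plain PyTorch; the class name and details here are hypothetical and may differ from the code in this repo.

```python
import torch
import torch.nn as nn


class RMSNorm(nn.Module):
    """Root-mean-square normalization (no mean subtraction, no bias)."""

    def __init__(self, dim: int, eps: float = 1e-6):
        super().__init__()
        self.eps = eps
        # Learnable per-channel scale, initialized to 1.
        self.weight = nn.Parameter(torch.ones(dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Scale each token vector by the reciprocal of its RMS
        # over the last (feature) dimension.
        rms = x.pow(2).mean(dim=-1, keepdim=True).add(self.eps).rsqrt()
        return x * rms * self.weight


x = torch.randn(2, 4, 8)      # (batch, sequence, features)
y = RMSNorm(8)(x)             # same shape, per-token RMS ≈ 1
```

Unlike LayerNorm, RMSNorm skips the mean-centering step, which makes it slightly cheaper while working just as well in practice for these models.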