
Finetuned weights for other compilers. #4

Open
jupiterepoch opened this issue Mar 19, 2021 · 1 comment

Comments

@jupiterepoch

Hello, I was wondering if you could provide finetuned weights (and maybe even pretrained weights) for other compilers (gcc and clang). If you already have them on hand (kept from your experiments), it would save me a lot of time. Thank you!

Also, may I ask how large a training set you fine-tuned on? My current training set has 60,000 lines, each line containing 512 bytes per your specification. The fine-tuning process seems quite time-consuming. What GPU did you train on?
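(For context, the "each line containing 512 bytes" format above could be produced with a sketch like the following. The exact line encoding is defined by the repo's data spec; representing each byte as space-separated two-digit hex is an assumption here, and `binary_to_lines` is an illustrative helper, not part of the repo.)

```python
def binary_to_lines(path, chunk_size=512):
    """Split a raw binary into fixed-size chunks, one chunk per output line.

    Assumption: each byte is rendered as two-digit hex, space-separated.
    Follow the repo's actual data spec for the real token format.
    """
    with open(path, "rb") as f:
        data = f.read()
    lines = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        lines.append(" ".join(f"{b:02x}" for b in chunk))
    return lines
```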

@peikexin9
Member

Hi @jupiterepoch, my apologies. I just searched my archive and it seems I only kept the pretrained files. My training set had about 640K, e.g., for CPU2017_Windows_x64. The fine-tuning was run on 4 2080Ti GPUs. I suggest you start with only 10K; usually I observe that, with a pretrained model, a small fine-tuning training set already gives strong enough results :-)
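(The "start with only 10K" suggestion above amounts to subsampling the fine-tuning set before training. A minimal sketch, assuming the line-oriented data format discussed in this thread; `subsample_training_set` and the file paths are illustrative, not part of the repo:)

```python
import random

def subsample_training_set(src_path, dst_path, n=10_000, seed=0):
    """Randomly draw n lines from the full fine-tuning set and write them
    to a smaller training file, per the suggestion to start with ~10K.

    Returns the number of lines actually written (capped at the source size).
    """
    with open(src_path) as f:
        lines = f.readlines()
    random.seed(seed)  # fixed seed so the subsample is reproducible
    sample = random.sample(lines, min(n, len(lines)))
    with open(dst_path, "w") as f:
        f.writelines(sample)
    return len(sample)
```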
