Hello, I was wondering if you could provide fine-tuned weights for other compilers (gcc and clang). If you already have them on hand (kept from your experiments), it would save me a lot of time. Thank you!
Also, could I ask how large a training set you fine-tuned on? My current training set has 60,000 lines, each line containing 512 bytes per your specification. The fine-tuning process seems quite time-consuming. Which GPU did you train on?
Hi @jupiterepoch, my apologies. I just searched my archive, and it seems I only kept the pretrained files. My training set has about 640K samples, e.g., for CPU2017_Windows_x64. Fine-tuning was run on 4 2080Ti GPUs. I suggest you start with only 10K samples; usually, with a pretrained model, a small fine-tuning set already gives strong enough results :-)
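Following the suggestion above to start with a 10K subset, here is a minimal sketch of randomly subsampling lines from the full line-per-sample training file. The function name and file paths are hypothetical, not part of the original project; it only assumes the format described above (one training sample per line).

```python
import random

def subsample_lines(src_path, dst_path, k=10_000, seed=0):
    """Randomly keep k lines from a larger training file.

    Assumes one training sample per line, as in the format
    described in the thread. Paths and name are hypothetical.
    """
    with open(src_path, "r") as f:
        lines = f.readlines()
    random.seed(seed)  # fixed seed so the subset is reproducible
    sample = random.sample(lines, min(k, len(lines)))
    with open(dst_path, "w") as f:
        f.writelines(sample)
```

For a 60,000-line set, `subsample_lines("train.txt", "train_10k.txt")` would write a reproducible 10K-line subset to fine-tune on first.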