Results on wikisql_test generated by the model downloaded from Hugging Face #5
Comments
Did you encounter any issues during the inference? The normal prediction should be like this for your examples:
There was a warning:
No other useful information.
Here is the list of GPU types that flash attention supports: Dao-AILab/flash-attention#148. You can check whether your machine is included. If not, you can try the following script to see whether you get the correct predictions (it basically sets flash attention to False).
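For reference, here is a minimal sketch of how to check your GPU against that list locally. The compute-capability threshold used below is an assumption based on the flash-attention project's stated requirements (FlashAttention 2 targets Ampere-class GPUs, compute capability 8.0, and newer); verify it against the linked issue for your exact card.

```python
# Sketch: check whether the local CUDA GPU is likely supported by
# FlashAttention 2. The >= (8, 0) threshold (Ampere or newer) is an
# assumption taken from the flash-attention project's requirements,
# not an authoritative compatibility list.

def supports_flash_attention_2(major: int, minor: int) -> bool:
    """Return True if compute capability (major, minor) is Ampere (8.0) or newer."""
    return (major, minor) >= (8, 0)

if __name__ == "__main__":
    try:
        import torch
        if torch.cuda.is_available():
            cap = torch.cuda.get_device_capability(0)
            print(f"compute capability: {cap}, "
                  f"flash-attn 2 supported: {supports_flash_attention_2(*cap)}")
        else:
            print("No CUDA device detected; flash attention is unavailable.")
    except ImportError:
        print("PyTorch is not installed; cannot query GPU capability.")
```

If this prints `False` for your GPU, disabling flash attention in the inference script (as suggested above) is the thing to try first.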
Still the same. Should I fine-tune the model first?
I tried the model downloaded from Hugging Face on the wikisql_test dataset and got results like the following:
There are a lot of meaningless tokens in the prediction results. Is this normal, or did I make a mistake?
The shell command I used:
The machine I'm using: