This repository has been archived by the owner on Apr 27, 2024. It is now read-only.

Do not limit pillow version #79

Open · wants to merge 1 commit into main

Conversation


@wangkuiyi commented Jan 23, 2023

Fix #78

After this change, I successfully installed the dependencies with python -m pip install -r requirements.txt

I can then build iree-torch with python setup.py develop, after which python -c 'import iree_torch' works for me.


However, when I ran python examples/bert.py, the following line

from transformers import AutoTokenizer, AutoModelForSequenceClassification

raised the following error:

Traceback (most recent call last):
  File "/Users/y/w/iree-torch/examples/bert.py", line 21, in <module>
    from transformers import AutoTokenizer, AutoModelForSequenceClassification
ModuleNotFoundError: No module named 'transformers'

Running pip install transformers fixed it, but I guess that fix belongs in another pull request, so I made #80.

$ python examples/bert.py
Parsing sentence tokens.
Downloading: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 512/512 [00:00<00:00, 318kB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 710/710 [00:00<00:00, 468kB/s]
Downloading: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 232k/232k [00:00<00:00, 638kB/s]
Downloading: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 466k/466k [00:00<00:00, 977kB/s]
Downloading: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 112/112 [00:00<00:00, 46.4kB/s]
Instantiating model.
Downloading: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 90.9M/90.9M [00:11<00:00, 8.22MB/s]
Tracing model.
Compiling with Torch-MLIR
Compiling with IREE
Loading in IREE
Running on IREE
RESULT: tensor([[ 1.8574, -1.8036]])
Model execution took 0.009077072143554688 seconds.

@dellis23 (Contributor)

@silvasean do you know the original reason for limiting the pillow version?

@silvasean (Contributor) commented Jan 24, 2023

It is documented in the comment just above the restriction.

# For torchvision, use pillow<7 to avoid `ImportError: cannot import name 'PILLOW_VERSION' from 'PIL'`
# See https://github.com/pytorch/vision/issues/1712

If that doesn't happen anymore, then we can remove the limitation (we should remove that comment too, though).
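One way to check whether the old failure mode is still possible: newer Pillow releases no longer export the legacy PILLOW_VERSION constant, which is exactly what older torchvision tried to import (pytorch/vision#1712). A minimal probe (hypothetical, not part of this repo) could look like this:

```python
def pillow_version_constant_exists():
    """Return True if PIL still exports the legacy PILLOW_VERSION constant.

    Older torchvision did `from PIL import PILLOW_VERSION`, which fails on
    newer Pillow releases; the pillow<7 pin guarded against that ImportError.
    """
    try:
        from PIL import PILLOW_VERSION  # noqa: F401
        return True
    except ImportError:  # also covers PIL not being installed at all
        return False


if __name__ == "__main__":
    print(pillow_version_constant_exists())
```

If this returns False with the unpinned Pillow but the current torchvision imports cleanly, the pin (and the comment above it) can safely go.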

Successfully merging this pull request may close these issues.

May need to remove pillow<7 from requirements.txt
3 participants