InitFluxLoRATraining
Couldn't build proto file into descriptor pool: duplicate file name sentencepiece_model.proto
ComfyUI Error Report
Error Details
Node Type: InitFluxLoRATraining
Exception Type: TypeError
Exception Message: Couldn't build proto file into descriptor pool: duplicate file name sentencepiece_model.proto
Stack Trace
  File "E:\ComfyUI-aki-v1.3-2\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "E:\ComfyUI-aki-v1.3-2\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "E:\ComfyUI-aki-v1.3-2\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "E:\ComfyUI-aki-v1.3-2\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "E:\ComfyUI-aki-v1.3-2\custom_nodes\ComfyUI-FluxTrainer\nodes.py", line 523, in init_training
    training_loop = network_trainer.init_train(args)
  File "E:\ComfyUI-aki-v1.3-2\custom_nodes\ComfyUI-FluxTrainer\train_network.py", line 269, in init_train
    tokenize_strategy = self.get_tokenize_strategy(args)
  File "E:\ComfyUI-aki-v1.3-2\custom_nodes\ComfyUI-FluxTrainer\flux_train_network_comfy.py", line 157, in get_tokenize_strategy
    return strategy_flux.FluxTokenizeStrategy(t5xxl_max_token_length, args.tokenizer_cache_dir)
  File "E:\ComfyUI-aki-v1.3-2\custom_nodes\ComfyUI-FluxTrainer\library\strategy_flux.py", line 28, in __init__
    self.t5xxl = self._load_tokenizer(T5TokenizerFast, T5_XXL_TOKENIZER_ID, tokenizer_cache_dir=tokenizer_cache_dir)
  File "E:\ComfyUI-aki-v1.3-2\custom_nodes\ComfyUI-FluxTrainer\library\strategy_base.py", line 44, in _load_tokenizer
    tokenizer = model_class.from_pretrained(model_id, subfolder=subfolder)
  File "E:\ComfyUI-aki-v1.3-2\python\lib\site-packages\transformers\tokenization_utils_base.py", line 2213, in from_pretrained
    return cls._from_pretrained(
  File "E:\ComfyUI-aki-v1.3-2\python\lib\site-packages\transformers\tokenization_utils_base.py", line 2447, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "E:\ComfyUI-aki-v1.3-2\python\lib\site-packages\transformers\models\t5\tokenization_t5_fast.py", line 119, in __init__
    super().__init__(
  File "E:\ComfyUI-aki-v1.3-2\python\lib\site-packages\transformers\tokenization_utils_fast.py", line 119, in __init__
    fast_tokenizer = convert_slow_tokenizer(slow_tokenizer)
  File "E:\ComfyUI-aki-v1.3-2\python\lib\site-packages\transformers\convert_slow_tokenizer.py", line 1628, in convert_slow_tokenizer
    return converter_class(transformer_tokenizer).converted()
  File "E:\ComfyUI-aki-v1.3-2\python\lib\site-packages\transformers\convert_slow_tokenizer.py", line 553, in __init__
    model_pb2 = import_protobuf()
  File "E:\ComfyUI-aki-v1.3-2\python\lib\site-packages\transformers\convert_slow_tokenizer.py", line 38, in import_protobuf
    from sentencepiece import sentencepiece_model_pb2
  File "E:\ComfyUI-aki-v1.3-2\python\lib\site-packages\sentencepiece\sentencepiece_model_pb2.py", line 16, in <module>
    DESCRIPTOR = _descriptor.FileDescriptor(
  File "E:\ComfyUI-aki-v1.3-2\python\lib\site-packages\google\protobuf\descriptor.py", line 1228, in __new__
    return _message.default_pool.AddSerializedFile(serialized_pb)
TypeError: Couldn't build proto file into descriptor pool: duplicate file name sentencepiece_model.proto
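For context, the failure mode at the bottom of the trace can be illustrated with a small sketch. This is a simplified stand-in for protobuf's descriptor pool, not the real `google.protobuf` API: the pool is process-global and keyed by proto file *name*, so when two installed packages each try to register a file called `sentencepiece_model.proto` (for example, sentencepiece's own generated module plus a second copy pulled in by another dependency), the second registration fails with exactly this kind of `TypeError`.

```python
# Simplified model of a process-global protobuf descriptor pool
# (assumption: this mimics, but is not, google.protobuf's behavior).
class DescriptorPool:
    def __init__(self):
        self._files = {}  # proto file name -> serialized descriptor bytes

    def add_serialized_file(self, name, serialized):
        # Registration is keyed by file name alone, not by import path,
        # so a second copy of the same .proto file is rejected.
        if name in self._files:
            raise TypeError(
                "Couldn't build proto file into descriptor pool: "
                f"duplicate file name {name}"
            )
        self._files[name] = serialized

default_pool = DescriptorPool()  # one pool per process

# First import of a generated *_pb2 module registers the file and succeeds:
default_pool.add_serialized_file("sentencepiece_model.proto", b"\x00")

# A second registration of the same file name reproduces the reported error:
try:
    default_pool.add_serialized_file("sentencepiece_model.proto", b"\x00")
except TypeError as e:
    print(e)
```

A commonly reported workaround for this class of error is to force the pure-Python protobuf backend by setting the environment variable `PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python` before launching ComfyUI, or to reinstall `protobuf` and `sentencepiece` so that only one copy of the generated module ends up registered; whether either resolves this specific setup is an assumption, not confirmed by this report.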
System Information
ComfyUI Version: v0.2.7
Arguments: E:\ComfyUI-aki-v1.3-2\main.py --auto-launch --preview-method auto --disable-cuda-malloc
Devices
Logs
Attached Workflow
Please make sure that workflow does not contain any sensitive information such as API keys or passwords.
Additional Context
(Please add any additional context or steps to reproduce the error here)