Failed to import transformers.models.cohere.configuration_cohere because of the following error (look up to see its traceback):
I am using Meta-Llama-3-8B-Instruct in a Google Colab notebook and I am getting the following error:
ModuleNotFoundError: No module named 'transformers.models.cohere.configuration_cohere'
The above exception was the direct cause of the following exception:
RuntimeError Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py in _get_module(self, module_name)
RuntimeError: Failed to import transformers.models.cohere.configuration_cohere because of the following error (look up to see its traceback):
No module named 'transformers.models.cohere.configuration_cohere'
The error is raised at this line:
from transformers import AutoModelForCausalLM

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"

# Load the model from HuggingFace
llama3 = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    trust_remote_code=True,
    load_in_8bit=True,
)
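In case it helps, this is the quick check I run in the same Colab runtime to see which transformers install is actually being used (a minimal sanity check; my assumption is that a stale or partially upgraded install could explain the missing cohere submodule, I have not confirmed that):

import importlib.metadata
import transformers

# Compare the version the Python process actually imported with the version
# pip reports as installed; if they disagree, the runtime is still holding an
# older transformers build and would need a restart after upgrading.
print("imported transformers:", transformers.__version__)
print("installed metadata:", importlib.metadata.version("transformers"))
print("loaded from:", transformers.__file__)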