```
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
Loading pretrained model
Fetching 15 files: 100%|████████████████████| 15/15 [00:00<00:00, 137368.03it/s]
ERROR:root:Model type mllama not supported.
Traceback (most recent call last):
  File "/python3.10/site-packages/mlx_lm/utils.py", line 55, in _get_classes
    arch = importlib.import_module(f"mlx_lm.models.{model_type}")
  File "/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1004, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'mlx_lm.models.mllama'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "versions/3.10/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "versions/3.10/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/python3.10/site-packages/mlx_lm/lora.py", line 295, in <module>
    main()
  File "/python3.10/site-packages/mlx_lm/lora.py", line 291, in main
    run(types.SimpleNamespace(**args))
  ...
    model_class, model_args_class = get_model_classes(config=config)
  File "/python3.10/site-packages/mlx_lm/utils.py", line 59, in _get_classes
    raise ValueError(msg)
ValueError: Model type mllama not supported.
```
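For context on why this fails: the traceback shows that `mlx_lm.utils._get_classes` resolves the architecture by dynamically importing `mlx_lm.models.{model_type}`, so any `model_type` in the checkpoint's config (here `mllama`) without a matching module in `mlx_lm/models/` raises `ValueError`. A minimal sketch of that import pattern, using stdlib package names purely for illustration (the `load_arch_module` helper and the `json` package stand-ins are not part of mlx_lm):

```python
import importlib


def load_arch_module(package: str, model_type: str):
    """Mirror the dynamic lookup seen in mlx_lm.utils._get_classes:
    the model_type from the checkpoint config selects a submodule,
    and a missing submodule means the architecture is unsupported."""
    try:
        return importlib.import_module(f"{package}.{model_type}")
    except ImportError:
        # mlx_lm re-raises the failed import as a ValueError
        raise ValueError(f"Model type {model_type} not supported.")


# json.decoder exists, so this resolves cleanly:
mod = load_arch_module("json", "decoder")

# json.mllama does not exist, reproducing the error shape above:
try:
    load_arch_module("json", "mllama")
except ValueError as e:
    print(e)  # Model type mllama not supported.
```

Because the check is a plain import, the fix is either upgrading to an mlx-lm release that ships a module for the architecture, or supplying a model whose `model_type` matches one of the files under `mlx_lm/models/`.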