
Cannot train llama vision 3.2 #1071

Open
pdufour opened this issue Oct 24, 2024 · 0 comments


pdufour commented Oct 24, 2024

Example code:

!pip install mlx_lm
!python -m mlx_lm.lora --model meta-llama/Llama-3.2-11B-Vision --data HuggingFaceM4/WebSight --train --iters 1000

Output

None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
Loading pretrained model
Fetching 15 files: 100%|████████████████████| 15/15 [00:00<00:00, 137368.03it/s]
ERROR:root:Model type mllama not supported.
Traceback (most recent call last):
  File "/python3.10/site-packages/mlx_lm/utils.py", line 55, in _get_classes
    arch = importlib.import_module(f"mlx_lm.models.{model_type}")
  File "/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1004, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'mlx_lm.models.mllama'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "versions/3.10/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "versions/3.10/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/python3.10/site-packages/mlx_lm/lora.py", line 295, in <module>
    main()
  File "/python3.10/site-packages/mlx_lm/lora.py", line 291, in main
    run(types.SimpleNamespace(**args))
...
    model_class, model_args_class = get_model_classes(config=config)
  File "/python3.10/site-packages/mlx_lm/utils.py", line 59, in _get_classes
    raise ValueError(msg)
ValueError: Model type mllama not supported.
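For context, the traceback shows how mlx_lm resolves architectures: it dynamically imports `mlx_lm.models.<model_type>` based on the `model_type` in the checkpoint's config, and since no `mllama` module exists there, the import fails and is re-raised as `ValueError`. Below is a minimal sketch of that dispatch pattern; the function name `load_arch` and the `package` parameter are illustrative, not mlx_lm's actual API:

```python
import importlib


def load_arch(model_type: str, package: str = "mlx_lm.models"):
    """Dynamically import the module implementing a model architecture.

    Mirrors the dispatch in the traceback: the architecture name from the
    model config is mapped to a module path and imported. An architecture
    without a matching module (here, ``mllama``) surfaces as a ValueError.
    """
    try:
        return importlib.import_module(f"{package}.{model_type}")
    except ImportError as e:
        raise ValueError(f"Model type {model_type} not supported.") from e
```

Calling `load_arch("mllama")` without an `mlx_lm.models.mllama` module reproduces the `ValueError: Model type mllama not supported.` seen above.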