Type and doc fixes
danieldk committed Apr 11, 2024
1 parent 72c0f7c · commit 842bbea
Showing 2 changed files with 13 additions and 12 deletions.
spacy/cli/distill.py (2 changes: 1 addition & 1 deletion)
@@ -55,7 +55,7 @@ def distill_cli(


 def distill(
-    teacher_model: str,
+    teacher_model: Union[str, Path],
     student_config_path: Union[str, Path],
     output_path: Optional[Union[str, Path]] = None,
     *,
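For context, a minimal sketch of what the widened annotation allows, using only the parameters visible in this hunk and assuming the remaining keyword-only arguments have defaults; the teacher model name and all paths are hypothetical placeholders.

```python
from pathlib import Path

from spacy.cli.distill import distill

# Before this commit, teacher_model was annotated as str, so passing a
# Path to a teacher pipeline on disk failed static type checking even
# though loading a pipeline from a directory is routine.
# Hypothetical names and paths; both calls now type-check.
distill("en_core_web_lg", "student_config.cfg", output_path=Path("distilled"))
distill(Path("./teacher"), Path("student_config.cfg"), output_path=Path("distilled"))
```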
website/docs/api/cli.mdx (23 changes: 12 additions & 11 deletions)
@@ -1707,17 +1707,18 @@ $ python -m spacy project dvc [project_dir] [workflow] [--force] [--verbose] [--
Distill a _student_ pipeline from a _teacher_ pipeline. Distillation trains the
models in the student pipeline on the activations of the teacher's models. A
typical use case for distillation is to extract a smaller, more performant model
-from a larger high-accuracy model. Since distillation uses the activations of the
-teacher, distillation can be performed on a corpus of raw text without (gold standard)
-annotations.
-`distill` will save out the best performing pipeline across all epochs, as well as the final
-pipeline. The `--code` argument can be used to provide a Python file that's
-imported before the training process starts. This lets you register
+from a larger high-accuracy model. Since distillation uses the activations of
+the teacher, distillation can be performed on a corpus of raw text without (gold
+standard) annotations. A development set of gold annotations _is_ needed to
+evaluate the distilled model on during distillation.
+`distill` will save out the best performing pipeline across all epochs, as well
+as the final pipeline. The `--code` argument can be used to provide a Python
+file that's imported before the training process starts. This lets you register
[custom functions](/usage/training#custom-functions) and architectures and refer
to them in your config, all while still using spaCy's built-in `train` workflow.
-If you need to manage complex multi-step training workflows, check out the new
-[spaCy projects](/usage/projects).
+If you need to manage complex multi-step training workflows, check out the
+[Weasel](https://github.com/explosion/weasel).
> #### Example
>
@@ -1731,14 +1732,14 @@ $ python -m spacy distill [teacher_model] [student_config_path] [--output] [--co
| Name | Description |
| --------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
-| `teacher_model` | The teacher pipeline to distill the student from. ~~Path (positional)~~ |
+| `teacher_model` | The teacher pipeline (name or path) to distill the student from. ~~Union[str, Path] (positional)~~ |
| `student_config_path` | The configuration of the student pipeline. ~~Path (positional)~~ |
| `--output`, `-o` | Directory to store the distilled pipeline in. Will be created if it doesn't exist. ~~Optional[Path] \(option)~~ |
| `--code`, `-c` | Comma-separated paths to Python files with additional code to be imported. Allows [registering custom functions](/usage/training#custom-functions) for new architectures. ~~Optional[Path] \(option)~~ |
| `--verbose`, `-V` | Show more detailed messages during distillation. ~~bool (flag)~~ |
| `--gpu-id`, `-g` | GPU ID or `-1` for CPU. Defaults to `-1`. ~~int (option)~~ |
| `--help`, `-h` | Show help message and available arguments. ~~bool (flag)~~ |
-| **CREATES** | A `dvc.yaml` file in the project directory, based on the steps defined in the given workflow. |
+| **CREATES** | The final trained pipeline and the best trained pipeline. |
## huggingface-hub {id="huggingface-hub",version="3.1"}
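To make the `--code` option described above concrete, a hedged sketch of a file it could point to; the registry name, callback body, and file name are hypothetical, though `spacy.registry.callbacks` is spaCy's standard registration mechanism for custom functions.

```python
# functions.py (hypothetical): when passed via --code, spaCy imports this
# file before distillation starts, so the registered name below becomes
# available for use in the student config.
import spacy


@spacy.registry.callbacks("distill_logger.v1")
def create_distill_logger():
    def log_pipeline(nlp):
        # Report which components the pipeline contains.
        print("Components:", ", ".join(nlp.pipe_names))
        return nlp

    return log_pipeline
```

Such a file would be passed alongside the documented arguments, for example (placeholder names and paths): `python -m spacy distill en_core_web_lg ./student_config.cfg --output ./distilled --code functions.py --gpu-id 0`.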
