[Bug]: Anomalib does not work with Python<=3.11 #2393

Open
Kogia-sima opened this issue Oct 23, 2024 · 0 comments

Describe the bug

Anomalib depends on a feature introduced in Python 3.12, which causes an error on Python 3.11.

When I simply run the train command from the tutorial, anomalib raises the following error:

TypeError: TarFile.extract() got an unexpected keyword argument 'filter'

This happens because the filter argument of TarFile.extract() was introduced in Python 3.12, while I'm running Python 3.11 (I can't upgrade to 3.12 because I'm using the Python that ships with Databricks).
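
As a possible workaround (since I can't change the Python version on Databricks), the call site in anomalib/data/utils/download.py::safe_extract could pass the filter keyword only when the running interpreter supports it. The sketch below is only an illustration of that idea, not the project's actual code; the hasattr(tarfile, "data_filter") check is the feature test suggested by PEP 706, since data_filter was added together with the filter keyword.

import tarfile
from pathlib import Path

def safe_extract(tar_file: tarfile.TarFile, root: Path, members: list[tarfile.TarInfo]) -> None:
    """Sketch of a version-tolerant extract loop (not anomalib's real helper)."""
    # tarfile.data_filter ships together with the `filter` keyword, so its
    # presence tells us whether extract() accepts filter=.
    supports_filter = hasattr(tarfile, "data_filter")
    for member in members:
        # keep the existing behaviour: skip files that were already extracted
        if not (root / member.name).exists():
            if supports_filter:
                tar_file.extract(member, root, filter="data")
            else:
                tar_file.extract(member, root)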

Dataset

MVTec

Model

PatchCore

Steps to reproduce the behavior

  1. Install anomalib with pip install anomalib==1.1.1
  2. Perform training using the following command:
$ anomalib train --model Patchcore --data anomalib.data.MVTec
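
The same TypeError can also be reproduced without anomalib. The snippet below is a stand-alone sketch that builds a throwaway archive and then calls TarFile.extract() with the filter keyword, mirroring the failing call in anomalib's safe_extract; on an interpreter without the tar-filter feature (such as this 3.11.0rc1 build) it raises the same error, while on Python 3.12 it succeeds.

import tarfile
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    archive = Path(tmp) / "example.tar"
    payload = Path(tmp) / "payload.txt"
    payload.write_text("hello")

    # build a tiny archive to extract from
    with tarfile.open(archive, "w") as tar:
        tar.add(payload, arcname="payload.txt")

    # the failing call: filter= is only accepted on interpreters
    # that ship the tar-filter feature
    with tarfile.open(archive) as tar:
        member = tar.getmembers()[0]
        tar.extract(member, Path(tmp) / "out", filter="data")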

OS information

  • OS: Ubuntu 22.04.4 LTS
  • Python version: 3.11.0rc1
  • Anomalib version: 1.1.1 (latest)
  • PyTorch version: 2.3.1+cu121
  • CUDA/cuDNN version: 12.1
  • GPU models and configuration: Tesla T4

Expected behavior

  1. Download MVTec dataset
  2. Extract the dataset archive
  3. Perform training

Screenshots

No response

Pip/GitHub

pip

What version/branch did you use?

1.1.1

Configuration YAML

No configuration file

Logs

Could not find wandb. To use this feature, ensure that you have wandb installed.
2024-10-22 01:35:27,331 - anomalib.utils.config - WARNING - Anomalib currently does not support multi-gpu training. Setting devices to 1.
2024-10-22 01:35:27,390 - anomalib.models.components.base.anomaly_module - INFO - Initializing Patchcore model.
2024-10-22 01:35:29,073 - timm.models._builder - INFO - Loading pretrained weights from Hugging Face hub (timm/wide_resnet50_2.racm_in1k)
model.safetensors: 100%|██████████████████████| 276M/276M [00:01<00:00, 254MB/s]
2024-10-22 01:35:30,640 - timm.models._hub - INFO - [timm/wide_resnet50_2.racm_in1k] Safe alternative available for 'pytorch_model.bin' (as 'model.safetensors'). Loading weights using safetensors.
2024-10-22 01:35:30,670 - timm.models._builder - INFO - Missing keys (fc.weight, fc.bias) discovered while loading pretrained weights. This is expected if model is being adapted.
2024-10-22 01:35:30,696 - anomalib.callbacks - INFO - Loading the callbacks
2024-10-22 01:35:30,697 - anomalib.engine.engine - INFO - Overriding gradient_clip_val from None with 0 for Patchcore
2024-10-22 01:35:30,699 - anomalib.engine.engine - INFO - Overriding max_epochs from None with 1 for Patchcore
2024-10-22 01:35:30,699 - anomalib.engine.engine - INFO - Overriding num_sanity_val_steps from None with 0 for Patchcore
2024-10-22 01:35:30,704 - lightning.pytorch.utilities.rank_zero - INFO - Trainer already configured with model summary callbacks: [<class 'lightning.pytorch.callbacks.rich_model_summary.RichModelSummary'>]. Skipping setting a default `ModelSummary` callback.
2024-10-22 01:35:30,716 - lightning.pytorch.utilities.rank_zero - INFO - GPU available: False, used: False
2024-10-22 01:35:30,717 - lightning.pytorch.utilities.rank_zero - INFO - TPU available: False, using: 0 TPU cores
2024-10-22 01:35:30,718 - lightning.pytorch.utilities.rank_zero - INFO - HPU available: False, using: 0 HPUs
2024-10-22 01:35:30,719 - anomalib.data.utils.download - INFO - Downloading the mvtec dataset.
mvtec:   7%|██▎                             | 370M/5.26G [00:34<07:17, 11.2MB/s]

*** WARNING: max output size exceeded, skipping output. ***

mvtec: 5.26GB [07:42, 11.4MB/s]                                                 
2024-10-22 01:43:13,005 - anomalib.data.utils.download - INFO - Checking the hash of the downloaded file.
2024-10-22 01:43:18,796 - anomalib.data.utils.download - INFO - Extracting dataset into datasets/MVTec folder.
╭───────────────────── Traceback (most recent call last) ──────────────────────╮
│ /local_disk0/.ephemeral_nfs/envs/pythonEnv-627b69f2-c4d0-4495-bc6e-7580018e3 │
│ 8e0/bin/anomalib:8 in <module>                                               │
│                                                                              │
│   5 from anomalib.cli.cli import main                                        │
│   6 if __name__ == '__main__':                                               │
│   7 │   sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])     │
│ ❱ 8 │   sys.exit(main())                                                     │
│   9                                                                          │
│                                                                              │
│ /local_disk0/.ephemeral_nfs/envs/pythonEnv-627b69f2-c4d0-4495-bc6e-7580018e3 │
│ 8e0/lib/python3.11/site-packages/anomalib/cli/cli.py:486 in main             │
│                                                                              │
│   483 def main() -> None:                                                    │
│   484 │   """Trainer via Anomalib CLI."""                                    │
│   485 │   configure_logger()                                                 │
│ ❱ 486 │   AnomalibCLI()                                                      │
│   487                                                                        │
│   488                                                                        │
│   489 if __name__ == "__main__":                                             │
│                                                                              │
│ /local_disk0/.ephemeral_nfs/envs/pythonEnv-627b69f2-c4d0-4495-bc6e-7580018e3 │
│ 8e0/lib/python3.11/site-packages/anomalib/cli/cli.py:65 in __init__          │
│                                                                              │
│    62 │   │   │   self.before_instantiate_classes()                          │
│    63 │   │   │   self.instantiate_classes()                                 │
│    64 │   │   if run:                                                        │
│ ❱  65 │   │   │   self._run_subcommand()                                     │
│    66 │                                                                      │
│    67 │   def init_parser(self, **kwargs) -> ArgumentParser:                 │
│    68 │   │   """Method that instantiates the argument parser."""            │
│                                                                              │
│ /local_disk0/.ephemeral_nfs/envs/pythonEnv-627b69f2-c4d0-4495-bc6e-7580018e3 │
│ 8e0/lib/python3.11/site-packages/anomalib/cli/cli.py:365 in _run_subcommand  │
│                                                                              │
│   362 │   │   elif self.config["subcommand"] in (*self.subcommands(), "train │
│   363 │   │   │   fn = getattr(self.engine, self.subcommand)                 │
│   364 │   │   │   fn_kwargs = self._prepare_subcommand_kwargs(self.subcomman │
│ ❱ 365 │   │   │   fn(**fn_kwargs)                                            │
│   366 │   │   elif PIPELINE_REGISTRY is not None and self.subcommand in pipe │
│   367 │   │   │   run_pipeline(self.config)                                  │
│   368 │   │   else:                                                          │
│                                                                              │
│ /local_disk0/.ephemeral_nfs/envs/pythonEnv-627b69f2-c4d0-4495-bc6e-7580018e3 │
│ 8e0/lib/python3.11/site-packages/anomalib/engine/engine.py:863 in train      │
│                                                                              │
│    860 │   │   │   # if the model is zero-shot or few-shot, we only need to  │
│    861 │   │   │   self.trainer.validate(model, val_dataloaders, None, verbo │
│    862 │   │   else:                                                         │
│ ❱  863 │   │   │   self.trainer.fit(model, train_dataloaders, val_dataloader │
│    864 │   │   self.trainer.test(model, test_dataloaders, ckpt_path=ckpt_pat │
│    865 │                                                                     │
│    866 │   def export(                                                       │
│                                                                              │
│ /local_disk0/.ephemeral_nfs/envs/pythonEnv-627b69f2-c4d0-4495-bc6e-7580018e3 │
│ 8e0/lib/python3.11/site-packages/lightning/pytorch/trainer/trainer.py:538 in │
│ fit                                                                          │
│                                                                              │
│    535 │   │   self.state.fn = TrainerFn.FITTING                             │
│    536 │   │   self.state.status = TrainerStatus.RUNNING                     │
│    537 │   │   self.training = True                                          │
│ ❱  538 │   │   call._call_and_handle_interrupt(                              │
│    539 │   │   │   self, self._fit_impl, model, train_dataloaders, val_datal │
│    540 │   │   )                                                             │
│    541                                                                       │
│                                                                              │
│ /local_disk0/.ephemeral_nfs/envs/pythonEnv-627b69f2-c4d0-4495-bc6e-7580018e3 │
│ 8e0/lib/python3.11/site-packages/lightning/pytorch/trainer/call.py:47 in     │
│ _call_and_handle_interrupt                                                   │
│                                                                              │
│    44 │   try:                                                               │
│    45 │   │   if trainer.strategy.launcher is not None:                      │
│    46 │   │   │   return trainer.strategy.launcher.launch(trainer_fn, *args, │
│ ❱  47 │   │   return trainer_fn(*args, **kwargs)                             │
│    48 │                                                                      │
│    49 │   except _TunerExitException:                                        │
│    50 │   │   _call_teardown_hook(trainer)                                   │
│                                                                              │
│ /local_disk0/.ephemeral_nfs/envs/pythonEnv-627b69f2-c4d0-4495-bc6e-7580018e3 │
│ 8e0/lib/python3.11/site-packages/lightning/pytorch/trainer/trainer.py:574 in │
│ _fit_impl                                                                    │
│                                                                              │
│    571 │   │   │   model_provided=True,                                      │
│    572 │   │   │   model_connected=self.lightning_module is not None,        │
│    573 │   │   )                                                             │
│ ❱  574 │   │   self._run(model, ckpt_path=ckpt_path)                         │
│    575 │   │                                                                 │
│    576 │   │   assert self.state.stopped                                     │
│    577 │   │   self.training = False                                         │
│                                                                              │
│ /local_disk0/.ephemeral_nfs/envs/pythonEnv-627b69f2-c4d0-4495-bc6e-7580018e3 │
│ 8e0/lib/python3.11/site-packages/lightning/pytorch/trainer/trainer.py:941 in │
│ _run                                                                         │
│                                                                              │
│    938 │   │   self.__setup_profiler()                                       │
│    939 │   │                                                                 │
│    940 │   │   log.debug(f"{self.__class__.__name__}: preparing data")       │
│ ❱  941 │   │   self._data_connector.prepare_data()                           │
│    942 │   │                                                                 │
│    943 │   │   call._call_setup_hook(self)  # allow user to set up Lightning │
│    944 │   │   log.debug(f"{self.__class__.__name__}: configuring model")    │
│                                                                              │
│ /local_disk0/.ephemeral_nfs/envs/pythonEnv-627b69f2-c4d0-4495-bc6e-7580018e3 │
│ 8e0/lib/python3.11/site-packages/lightning/pytorch/trainer/connectors/data_c │
│ onnector.py:93 in prepare_data                                               │
│                                                                              │
│    90 │   │   │   prepare_data_per_node = datamodule.prepare_data_per_node   │
│    91 │   │   │   with _InfiniteBarrier():                                   │
│    92 │   │   │   │   if (prepare_data_per_node and local_rank_zero) or (not │
│ ❱  93 │   │   │   │   │   call._call_lightning_datamodule_hook(trainer, "pre │
│    94 │   │                                                                  │
│    95 │   │   # handle lightning module prepare data:                        │
│    96 │   │   if lightning_module is not None and is_overridden("prepare_dat │
│                                                                              │
│ /local_disk0/.ephemeral_nfs/envs/pythonEnv-627b69f2-c4d0-4495-bc6e-7580018e3 │
│ 8e0/lib/python3.11/site-packages/lightning/pytorch/trainer/call.py:189 in    │
│ _call_lightning_datamodule_hook                                              │
│                                                                              │
│   186 │   fn = getattr(trainer.datamodule, hook_name)                        │
│   187 │   if callable(fn):                                                   │
│   188 │   │   with trainer.profiler.profile(f"[LightningDataModule]{trainer. │
│ ❱ 189 │   │   │   return fn(*args, **kwargs)                                 │
│   190 │   return None                                                        │
│   191                                                                        │
│   192                                                                        │
│                                                                              │
│ /local_disk0/.ephemeral_nfs/envs/pythonEnv-627b69f2-c4d0-4495-bc6e-7580018e3 │
│ 8e0/lib/python3.11/site-packages/anomalib/data/image/mvtec.py:414 in         │
│ prepare_data                                                                 │
│                                                                              │
│   411 │   │   if (self.root / self.category).is_dir():                       │
│   412 │   │   │   logger.info("Found the dataset.")                          │
│   413 │   │   else:                                                          │
│ ❱ 414 │   │   │   download_and_extract(self.root, DOWNLOAD_INFO)             │
│   415                                                                        │
│                                                                              │
│ /local_disk0/.ephemeral_nfs/envs/pythonEnv-627b69f2-c4d0-4495-bc6e-7580018e3 │
│ 8e0/lib/python3.11/site-packages/anomalib/data/utils/download.py:346 in      │
│ download_and_extract                                                         │
│                                                                              │
│   343 │   │   │   msg = f"Invalid URL to download dataset. Supported 'http:/ │
│   344 │   │   │   raise RuntimeError(msg)                                    │
│   345 │                                                                      │
│ ❱ 346 │   extract(downloaded_file_path, root)                                │
│   347                                                                        │
│   348                                                                        │
│   349 def is_within_directory(directory: Path, target: Path) -> bool:        │
│                                                                              │
│ /local_disk0/.ephemeral_nfs/envs/pythonEnv-627b69f2-c4d0-4495-bc6e-7580018e3 │
│ 8e0/lib/python3.11/site-packages/anomalib/data/utils/download.py:306 in      │
│ extract                                                                      │
│                                                                              │
│   303 │   │   with tarfile.open(file_name) as tar_file:                      │
│   304 │   │   │   members = tar_file.getmembers()                            │
│   305 │   │   │   safe_members = [member for member in members if not is_fil │
│ ❱ 306 │   │   │   safe_extract(tar_file, root, safe_members)                 │
│   307 │                                                                      │
│   308 │   else:                                                              │
│   309 │   │   msg = f"Unrecognized file format: {file_name}"
│                                                                              │
│ /local_disk0/.ephemeral_nfs/envs/pythonEnv-627b69f2-c4d0-4495-bc6e-7580018e3 │
│ 8e0/lib/python3.11/site-packages/anomalib/data/utils/download.py:234 in      │
│ safe_extract                                                                 │
│                                                                              │
│   231 │   for member in members:                                             │
│   232 │   │   # check if the file already exists                             │
│   233 │   │   if not (root / member.name).exists():                          │
│ ❱ 234 │   │   │   tar_file.extract(member, root, filter="data")              │
│   235                                                                        │
│   236                                                                        │
│   237 def generate_hash(file_path: str | Path, algorithm: str = "sha256") -> │
╰──────────────────────────────────────────────────────────────────────────────╯
TypeError: TarFile.extract() got an unexpected keyword argument 'filter'

Code of Conduct

  • I agree to follow this project's Code of Conduct