Proposal for lazy-loaded annotations. #1484
Closed
This branch contains a proposal for lazy-loading annotations for DetectionsDataset. It has only been lightly tested: enough to show that memory usage does not grow when loading a large YOLO dataset. The main purpose is to provide a code reference for further discussion.
Description

- Added lazy_detections objects, which store masks as polygons (in .data, along with resolution_xy) until loaded.
- yolo_annotations_to_detections was changed to produce lazy values.
- The annotations type becomes Union[Dict[str, Detections], LazyDict[str, Detections, Detections]].
Type of change
How has this change been tested, please provide a testcase or example of how you tested the change?
Ran pytest, and loaded a YOLO dataset with the lazy method, comparing memory use and outputs against the non-lazy method. Memory use does not grow, and the results are the same. I have not tested this thoroughly so far.
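The memory comparison described above could be reproduced with the standard library's tracemalloc; the builders below are stand-ins (an eager list of large buffers vs. a generator that yields them on demand), not the PR's actual dataset code:

```python
# Hedged sketch: measure peak allocation of an eager vs. a lazy builder
# using tracemalloc from the standard library.
import tracemalloc


def peak_memory(build):
    """Return (result, peak traced allocation in bytes) of build()."""
    tracemalloc.start()
    result = build()
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return result, peak


# Eager: all 50 one-megabyte buffers exist at once.
eager_result, eager_peak = peak_memory(
    lambda: [bytes(1_000_000) for _ in range(50)]
)

# Lazy: only a generator object is created; buffers are made on demand.
lazy_result, lazy_peak = peak_memory(
    lambda: (bytes(1_000_000) for _ in range(50))
)
```

Consuming the lazy generator item by item keeps only one buffer alive at a time, which mirrors the claim that memory does not grow when loading a large dataset lazily.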
Any specific deployment considerations
Docs