
Clean up relationship between deployment_tar and deployment #389

Merged: 9 commits, Dec 4, 2023

Conversation

@bfineran (Contributor) commented Nov 21, 2023

Feature Description

With this change, any directory that is available as a tar.gz archive is, by default, downloaded as that archive and then extracted. This does not prevent us from downloading single, "loose" files from the directory in question.
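
For reference, here is a minimal sketch of the download-as-archive idea. The helper name and URL handling below are illustrative only, not the actual sparsezoo internals: the archive is saved next to its destination and extracted in place, which matches the directory listings shown in the examples that follow.

import os
import tarfile
import urllib.request

def download_directory_as_archive(archive_url: str, dest_dir: str) -> str:
    # Download [some_directory].tar.gz into dest_dir and extract it there,
    # so both the archive and the extracted folder end up side by side.
    os.makedirs(dest_dir, exist_ok=True)
    archive_path = os.path.join(dest_dir, os.path.basename(archive_url))
    urllib.request.urlretrieve(archive_url, archive_path)
    with tarfile.open(archive_path, "r:gz") as tar:
        tar.extractall(path=dest_dir)
    return archive_path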

Downloading the deployment directory

Note that the deployment directory is downloaded as a tar.gz archive and then extracted:

import os

from sparsezoo import Model

stub = "zoo:mobilenet_v1-1.0-imagenet-pruned.4block_quantized"

model = Model(stub)
deployment_path = model.deployment.path
print(os.listdir(deployment_path))
print(os.listdir(model._path))
Downloading (…)ed/deployment.tar.gz: 100%|██████████| 4.48M/4.48M [00:00<00:00, 12.7MB/s]
['model.onnx']
['deployment', 'deployment.tar.gz']

Downloading the whole model

Note that the appropriate folders get downloaded as tar.gz archives and then extracted:

model = Model(stub)
print(os.listdir(model.path))
Downloading (…)d/training/model.pth: 100%|██████████| 32.6M/32.6M [00:02<00:00, 12.5MB/s]
Downloading (…)training-metric.yaml: 100%|██████████| 168/168 [00:00<00:00, 89.7kB/s]
Downloading (…)ed/deployment.tar.gz: 100%|██████████| 4.48M/4.48M [00:00<00:00, 12.0MB/s]
Downloading (…)ple-originals.tar.gz: 100%|██████████| 4.07M/4.07M [00:00<00:00, 9.58MB/s]
Downloading (…)sample-inputs.tar.gz: 100%|██████████| 3.31M/3.31M [00:00<00:00, 11.1MB/s]
Downloading (…)ample-outputs.tar.gz: 100%|██████████| 152k/152k [00:00<00:00, 4.29MB/s]
Downloading (…)sample-labels.tar.gz: 100%|██████████| 170k/170k [00:00<00:00, 3.43MB/s]
Downloading (…)k_quantized/model.md: 100%|██████████| 1.98k/1.98k [00:00<00:00, 1.19MB/s]
Downloading (…)ed/model.onnx.tar.gz: 100%|██████████| 4.48M/4.48M [00:00<00:00, 10.6MB/s]
['sample-inputs', 'sample-labels.tar.gz', 'deployment', 'sample-originals', 'sample-originals.tar.gz', 'deployment.tar.gz', 'model.md', 'sample-outputs.tar.gz', 'sample-inputs.tar.gz', 'model.onnx', 'sample-outputs', 'training', 'model.onnx.tar.gz', 'sample-labels']

Downloading a single "loose" deployment file

model = Model(stub)
deployment_onnx_file = model.deployment.get_file("model.onnx").path
print(os.listdir(os.path.dirname(deployment_onnx_file)))
print(os.listdir(model._path))
Downloading (…)eployment/model.onnx: 100%|██████████| 7.00M/7.00M [00:00<00:00, 11.4MB/s]
['model.onnx']
['deployment']

@dbogunowicz dbogunowicz changed the title [WIP] clean up relationship between deployment_tar and deployment Clean up relationship between deployment_tar and deployment Nov 27, 2023
@dbogunowicz (Contributor) commented:

Reminder: once this PR lands, we need to update the relevant paths to the deployment directory in sparseml and deepsparse.

src/sparsezoo/objects/directory.py (outdated, resolved)
# is used when we initialize the model from
# local directory (not stub) and both [some_directory]
# and [some_directory].tar.gz are present
allow_picking_one_from_multiple: bool = False,
@bfineran (Contributor, Author): needs better name

@bfineran (Contributor, Author): like download_from_tar_if_available
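
For context, a minimal sketch of the ambiguity this flag guards against when a model is initialized from a local directory rather than a stub. The function name, signature, and the preference for the extracted directory are assumptions for illustration, not sparsezoo code.

import os

def resolve_directory_source(
    parent_dir: str,
    name: str,
    allow_picking_one_from_multiple: bool = False,
) -> str:
    # A local model folder may contain both [some_directory] and
    # [some_directory].tar.gz; decide which one to use.
    dir_path = os.path.join(parent_dir, name)
    tar_path = dir_path + ".tar.gz"
    dir_exists = os.path.isdir(dir_path)
    tar_exists = os.path.isfile(tar_path)

    if dir_exists and tar_exists:
        if not allow_picking_one_from_multiple:
            raise ValueError(
                f"Both {dir_path} and {tar_path} exist; "
                "set allow_picking_one_from_multiple=True to pick one"
            )
        return dir_path  # illustrative choice: prefer the extracted directory
    if dir_exists:
        return dir_path
    if tar_exists:
        return tar_path
    raise FileNotFoundError(f"Neither {dir_path} nor {tar_path} found")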

@dbogunowicz (Contributor) left a comment:

best pr ever
