Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage MKMLizer
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-06-full-30622-v28-mkmlizer
Waiting for job on junhua024-chai-06-full-30622-v28-mkmlizer to finish
junhua024-chai-06-full-30622-v28-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
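The Xet warning above is advisory: downloads still succeed over plain HTTP. A minimal pre-flight check for the optional backend, assuming a recent huggingface_hub (the hf_xet package name is taken from the warning itself):

    import importlib.util

    # hf_xet is the optional Xet-backed transfer backend named in the warning;
    # without it, huggingface_hub falls back to plain HTTP downloads.
    if importlib.util.find_spec("hf_xet") is None:
        print("hf_xet not installed; run: pip install 'huggingface_hub[hf_xet]'")
    else:
        print("hf_xet available; Xet-backed downloads will be used")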
Failed to get response for submission junhua024-chai-16-full-_94000_v4: HTTPConnectionPool(host='junhua024-chai-16-full-94000-v4-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-06-full-30622-v28-mkmlizer: Downloaded to shared memory in 136.092s
junhua024-chai-06-full-30622-v28-mkmlizer: Checking if junhua024/chai_06_full_02102_1925 already exists in ChaiML
junhua024-chai-06-full-30622-v28-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpak3ayskr, device:0
junhua024-chai-06-full-30622-v28-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-06-full-30622-v28-mkmlizer: quantized model in 32.002s
junhua024-chai-06-full-30622-v28-mkmlizer: Processed model junhua024/chai_06_full_02102_1925 in 168.168s
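The 168.168 s total for the model is essentially the 136.092 s download plus the 32.002 s quantization. A sketch of the wall-clock bookkeeping behind lines like these (the stage functions themselves are hypothetical):

    import time

    def timed(label, fn, *args):
        """Run one pipeline step and log its wall-clock duration."""
        start = time.perf_counter()
        result = fn(*args)
        print(f"{label} in {time.perf_counter() - start:.3f}s")
        return result

    # e.g. timed("quantized model", quantize, model_dir)  # quantize() is hypothetical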
junhua024-chai-06-full-30622-v28-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-06-full-30622-v28-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-06-full-30622-v28-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-06-full-30622-v28/nvidia
junhua024-chai-06-full-30622-v28-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-06-full-30622-v28/nvidia/config.json
junhua024-chai-06-full-30622-v28-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-06-full-30622-v28/nvidia/special_tokens_map.json
junhua024-chai-06-full-30622-v28-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-06-full-30622-v28/nvidia/tokenizer_config.json
junhua024-chai-06-full-30622-v28-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-06-full-30622-v28/nvidia/tokenizer.json
junhua024-chai-06-full-30622-v28-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-06-full-30622-v28/nvidia/flywheel_model.0.safetensors
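The five cp lines above copy the quantized artifacts from shared memory into the model bucket. A rough boto3 equivalent, purely illustrative since the mkmlizer's actual upload tool is not shown in the log (bucket and prefix are taken from the lines above):

    import boto3

    s3 = boto3.client("s3")
    bucket = "guanaco-mkml-models"
    prefix = "junhua024-chai-06-full-30622-v28/nvidia"
    for name in ("config.json", "special_tokens_map.json", "tokenizer_config.json",
                 "tokenizer.json", "flywheel_model.0.safetensors"):
        # Mirrors: cp /dev/shm/model_cache/<name> s3://<bucket>/<prefix>/<name>
        s3.upload_file(f"/dev/shm/model_cache/{name}", bucket, f"{prefix}/{name}")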
junhua024-chai-06-full-30622-v28-mkmlizer:
Loading 0: 0%| | 0/363 [00:00<?, ?it/s]
Loading 0: 98%|█████████▊| 357/363 [00:10<00:00, 26.22it/s]
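The Loading 0: lines are a tqdm progress bar ticking over the 363 tensors in the checkpoint. A minimal sketch that reads a safetensors shard with the same style of progress output (assuming torch is installed for framework="pt"; the path matches the upload above):

    from safetensors import safe_open
    from tqdm import tqdm

    # Iterate tensors by name, mirroring the "Loading 0: n/363" ticks above.
    with safe_open("/dev/shm/model_cache/flywheel_model.0.safetensors",
                   framework="pt", device="cpu") as f:
        for name in tqdm(f.keys(), desc="Loading 0"):
            tensor = f.get_tensor(name)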
Job junhua024-chai-06-full-30622-v28-mkmlizer completed after 199.07s with status: succeeded
Stopping job with name junhua024-chai-06-full-30622-v28-mkmlizer
Pipeline stage MKMLizer completed in 199.61s
run pipeline stage MKMLTemplater
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.52s
run pipeline stage MKMLDeployer
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-06-full-30622-v28
Waiting for inference service junhua024-chai-06-full-30622-v28 to be ready
Failed to get response for submission chaiml-nis-qwen32b-sim_98336_v34: HTTPConnectionPool(host='chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission junhua024-chai-16-full-_96988_v4: HTTPConnectionPool(host='junhua024-chai-16-full-96988-v4-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission junhua024-chai-16-full-_96988_v1: HTTPConnectionPool(host='junhua024-chai-16-full-96988-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
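The Read timed out entries come from scoring requests hitting a 12 s read timeout while services churn. A minimal retry wrapper in the same spirit (host copied from the log; the request path and payload are hypothetical):

    import time
    import requests

    HOST = "http://chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com"

    def get_response(payload, attempts=3, read_timeout=12.0):
        """POST with the logged 12 s read timeout, backing off between retries."""
        for attempt in range(1, attempts + 1):
            try:
                # timeout=(connect, read); read=12.0 matches the errors above
                return requests.post(HOST, json=payload, timeout=(3.05, read_timeout))
            except requests.exceptions.ReadTimeout:
                if attempt == attempts:
                    raise
                time.sleep(2 ** attempt)  # exponential backoff between retries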
Inference service junhua024-chai-06-full-30622-v28 ready after 321.47s
Pipeline stage MKMLDeployer completed in 322.04s
run pipeline stage StressChecker
Running pipeline stage StressChecker
Received healthy response to inference request in 2.594s
Received healthy response to inference request in 1.884s
Received healthy response to inference request in 1.888s
Received healthy response to inference request in 1.675s
Received healthy response to inference request in 1.799s
5 requests
0 failed requests
5th percentile: 1.699s
10th percentile: 1.724s
20th percentile: 1.774s
30th percentile: 1.816s
40th percentile: 1.850s
50th percentile: 1.884s
60th percentile: 1.886s
70th percentile: 1.887s
80th percentile: 2.029s
90th percentile: 2.312s
95th percentile: 2.453s
99th percentile: 2.566s
mean time: 1.968s
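The percentile table is consistent with linear interpolation over the five samples; a minimal reproduction (numpy's default interpolation; the 5th and 10th values may differ by 1 ms from the table because the inputs here are rounded):

    import numpy as np

    # The five healthy-response latencies logged above (seconds)
    latencies = [2.594, 1.884, 1.888, 1.675, 1.799]

    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f"{p}th percentile: {np.percentile(latencies, p):.3f}s")
    print(f"mean time: {np.mean(latencies):.3f}s")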
Pipeline stage StressChecker completed in 11.35s
run pipeline stage OfflineFamilyFriendlyTriggerPipeline
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.85s
run pipeline stage TriggerMKMLProfilingPipeline
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 1.04s
Shutdown handler de-registered
junhua024-chai-06-full_30622_v28 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage OfflineFamilyFriendlyScorer
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3309.27s
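The family-friendly stage fans its evaluation out over worker threads (the thread count is left uninterpolated in the log line above). A minimal sketch of that pattern, with score_sample() purely hypothetical:

    from concurrent.futures import ThreadPoolExecutor

    def score_sample(sample):
        # Hypothetical per-sample family-friendly classifier; not shown in the log.
        return 1.0

    def evaluate(samples, num_threads=8):
        """Fan scoring out over num_threads workers and average the results."""
        with ThreadPoolExecutor(max_workers=num_threads) as pool:
            scores = list(pool.map(score_sample, samples))
        return sum(scores) / len(scores)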
Shutdown handler de-registered
junhua024-chai-06-full_30622_v28 status is now inactive due to auto-deactivation of underperforming models
junhua024-chai-06-full_30622_v28 status is now torndown due to DeploymentManager action