Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-16-full-96988-v4-mkmlizer
Waiting for job on junhua024-chai-16-full-96988-v4-mkmlizer to finish
junhua024-chai-16-full-96988-v4-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
Failed to get response for submission chaiml-nis-qwen32b-sim_98336_v34: HTTPConnectionPool(host='chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-16-full-96988-v4-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
junhua024-chai-16-full-96988-v4-mkmlizer: Downloaded to shared memory in 79.068s
junhua024-chai-16-full-96988-v4-mkmlizer: Checking if junhua024/chai_16_full_qkv106_o106_ffn106_1925 already exists in ChaiML
junhua024-chai-16-full-96988-v4-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmplkhqkbey, device:0
junhua024-chai-16-full-96988-v4-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-16-full-96988-v4-mkmlizer: quantized model in 32.854s
junhua024-chai-16-full-96988-v4-mkmlizer: Processed model junhua024/chai_16_full_qkv106_o106_ffn106_1925 in 112.011s
junhua024-chai-16-full-96988-v4-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-16-full-96988-v4-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-16-full-96988-v4-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-16-full-96988-v4/nvidia
junhua024-chai-16-full-96988-v4-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-16-full-96988-v4/nvidia/config.json
junhua024-chai-16-full-96988-v4-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-16-full-96988-v4/nvidia/special_tokens_map.json
junhua024-chai-16-full-96988-v4-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-16-full-96988-v4/nvidia/tokenizer_config.json
junhua024-chai-16-full-96988-v4-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-16-full-96988-v4/nvidia/flywheel_model.0.safetensors
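The four cp lines above copy the quantized artifacts from /dev/shm/model_cache into the guanaco-mkml-models bucket. As an illustration only (the actual upload tool used by the mkmlizer is not identifiable from the log), an equivalent transfer with boto3 could look like this:

import boto3

s3 = boto3.client("s3")
bucket = "guanaco-mkml-models"
prefix = "junhua024-chai-16-full-96988-v4/nvidia"
files = [
    "config.json",
    "special_tokens_map.json",
    "tokenizer_config.json",
    "flywheel_model.0.safetensors",
]
for name in files:
    # Mirrors: cp /dev/shm/model_cache/<name> s3://guanaco-mkml-models/junhua024-chai-16-full-96988-v4/nvidia/<name>
    s3.upload_file(f"/dev/shm/model_cache/{name}", bucket, f"{prefix}/{name}")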
junhua024-chai-16-full-96988-v4-mkmlizer:
Loading 0: 0%| | 0/363 [00:00<?, ?it/s]
Loading 0: 100%|█████████▉| 362/363 [00:11<00:00, 27.87it/s]
Job junhua024-chai-16-full-96988-v4-mkmlizer completed after 137.28s with status: succeeded
Stopping job with name junhua024-chai-16-full-96988-v4-mkmlizer
Pipeline stage MKMLizer completed in 137.75s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.14s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-16-full-96988-v4
Waiting for inference service junhua024-chai-16-full-96988-v4 to be ready
Failed to get response for submission chaiml-nis-qwen32b-sim_98336_v34: HTTPConnectionPool(host='chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission chaiml-nis-qwen32b-sim_98336_v34: HTTPConnectionPool(host='chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Inference service junhua024-chai-16-full-96988-v4 ready after 332.0049195289612s
Pipeline stage MKMLDeployer completed in 332.75s
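The two read-timeout failures recorded while waiting for the inference service are standard requests/urllib3 errors. A minimal sketch of how they arise, assuming a hypothetical /v1/predict endpoint and payload (only the host, port 80, and the 12-second read timeout come from the log):

import requests

HOST = "chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com"

def get_submission_response(payload: dict):
    try:
        # timeout=12.0 bounds both connect and read; if the model server takes longer
        # than 12 s to answer, requests raises ReadTimeout with the message logged above.
        resp = requests.post(f"http://{HOST}:80/v1/predict", json=payload, timeout=12.0)
        resp.raise_for_status()
        return resp.json()
    except requests.exceptions.ReadTimeout as err:
        print(f"Failed to get response for submission: {err}")
        return None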
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.2936201095581055s
Received healthy response to inference request in 2.101314067840576s
Received healthy response to inference request in 1.840669870376587s
Received healthy response to inference request in 1.9814965724945068s
Received healthy response to inference request in 1.885014533996582s
5 requests
0 failed requests
5th percentile: 1.849538803100586
10th percentile: 1.858407735824585
20th percentile: 1.876145601272583
30th percentile: 1.904310941696167
40th percentile: 1.942903757095337
50th percentile: 1.9814965724945068
60th percentile: 2.0294235706329347
70th percentile: 2.0773505687713625
80th percentile: 2.139775276184082
90th percentile: 2.2166976928710938
95th percentile: 2.2551589012145996
99th percentile: 2.2859278678894044
mean time: 2.0204230308532716
Pipeline stage StressChecker completed in 11.71s
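For reference, the percentile and mean figures reported by the StressChecker can be reproduced (up to floating-point rounding) from the five healthy-response latencies above, assuming numpy's default linear-interpolation percentile method; this is a sketch, not the StressChecker implementation:

import numpy as np

# The five healthy-response latencies, in seconds, as logged above.
latencies = [
    2.2936201095581055,
    2.101314067840576,
    1.840669870376587,
    1.9814965724945068,
    1.885014533996582,
]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print(f"mean time: {np.mean(latencies)}")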
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.82s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.62s
Shutdown handler de-registered
junhua024-chai-16-full-_96988_v4 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
clean up pipeline due to error=DeploymentChecksError('None: None')
Shutdown handler de-registered
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
clean up pipeline due to error=DeploymentChecksError('None: None')
Shutdown handler de-registered
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
clean up pipeline due to error=DeploymentChecksError('None: None')
Shutdown handler de-registered
junhua024-chai-16-full-_96988_v4 status is now inactive due to auto deactivation of underperforming models
junhua024-chai-16-full-_96988_v4 status is now torndown due to DeploymentManager action