Shutdown handler not registered because Python interpreter is not running in the main thread
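This message appears because Python's `signal.signal` may only install handlers from the main thread of the main interpreter; called anywhere else it raises `ValueError`. A minimal sketch of the guard the pipeline presumably applies (the helper name and handler wiring are assumptions):

```python
import signal
import threading

def register_shutdown_handler(handler):
    # Hypothetical helper: signal.signal() may only be called from the
    # main thread; elsewhere it raises ValueError, which is why the
    # pipeline logs the message above and skips registration.
    if threading.current_thread() is not threading.main_thread():
        print("Shutdown handler not registered because Python interpreter "
              "is not running in the main thread")
        return False
    signal.signal(signal.SIGTERM, handler)
    signal.signal(signal.SIGINT, handler)
    return True
```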
run pipeline %s
run pipeline stage MKMLizer
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-16-full-63041-v4-mkmlizer
Waiting for job on junhua024-chai-16-full-63041-v4-mkmlizer to finish
junhua024-chai-16-full-63041-v4-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-16-full-63041-v4-mkmlizer: ║ ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ belonging to: ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-16-full-63041-v4-mkmlizer: ║ ║
junhua024-chai-16-full-63041-v4-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-16-full-63041-v4-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
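The warning above comes from `huggingface_hub`: the repo advertises Xet storage, but the optional `hf_xet` backend is not installed, so each file falls back to a plain HTTP download. A stdlib-only sketch of checking for the backend before downloading:

```python
import importlib.util

# The Xet-accelerated download path needs the optional hf_xet package;
# without it huggingface_hub silently falls back to HTTP, as logged above.
if importlib.util.find_spec("hf_xet") is None:
    print("hf_xet not installed; falling back to HTTP downloads. "
          "For better performance: pip install 'huggingface_hub[hf_xet]'")
```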
Failed to get response for submission chaiml-nis-qwen32b-sim_98336_v34: HTTPConnectionPool(host='chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-16-full-63041-v4-mkmlizer: Downloaded to shared memory in 109.175s
junhua024-chai-16-full-63041-v4-mkmlizer: Checking if junhua024/chai_16_full_11_qkv_o_ffn_1925 already exists in ChaiML
junhua024-chai-16-full-63041-v4-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmplv5bbq7z, device:0
junhua024-chai-16-full-63041-v4-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-16-full-63041-v4-mkmlizer: quantized model in 31.649s
junhua024-chai-16-full-63041-v4-mkmlizer: Processed model junhua024/chai_16_full_11_qkv_o_ffn_1925 in 140.901s
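The 140.901 s total above is roughly the sum of its logged parts: 109.175 s download plus 31.649 s quantization, with the remaining ~0.1 s spent on the existence check and saving. A minimal sketch of a stage timer that would produce lines in this format (the helper name is an assumption, not the mkmlizer's actual code):

```python
import time

def timed_stage(label, fn, *args, **kwargs):
    # Runs one pipeline step and logs its wall-clock duration in the
    # "<label> in <seconds>s" format seen above.
    start = time.monotonic()
    result = fn(*args, **kwargs)
    print(f"{label} in {time.monotonic() - start:.3f}s")
    return result
```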
junhua024-chai-16-full-63041-v4-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-16-full-63041-v4-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-16-full-63041-v4/nvidia/config.json
junhua024-chai-16-full-63041-v4-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-16-full-63041-v4/nvidia/special_tokens_map.json
junhua024-chai-16-full-63041-v4-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-16-full-63041-v4/nvidia/tokenizer_config.json
junhua024-chai-16-full-63041-v4-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-16-full-63041-v4/nvidia/tokenizer.json
junhua024-chai-16-full-63041-v4-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-16-full-63041-v4/nvidia/flywheel_model.0.safetensors
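The `cp` lines above copy the quantized artifacts into the `guanaco-mkml-models` bucket. A hedged boto3 equivalent (the pipeline may well shell out to an S3 CLI instead; local paths, bucket, and key layout are taken from the log):

```python
import boto3

s3 = boto3.client("s3")
for name in ("config.json", "special_tokens_map.json",
             "tokenizer_config.json", "tokenizer.json",
             "flywheel_model.0.safetensors"):
    # Mirrors: cp /dev/shm/model_cache/<name>
    #          s3://guanaco-mkml-models/junhua024-chai-16-full-63041-v4/nvidia/<name>
    s3.upload_file(f"/dev/shm/model_cache/{name}",
                   "guanaco-mkml-models",
                   f"junhua024-chai-16-full-63041-v4/nvidia/{name}")
```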
junhua024-chai-16-full-63041-v4-mkmlizer:
Loading 0: 0%| | 0/363 [00:00<?, ?it/s]
Loading 0: 98%|█████████▊| 357/363 [00:10<00:00, 25.39it/s]
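The bar above tracks 363 tensors being read out of the safetensors shard in about 10 s. A minimal sketch of the same loop, assuming the standard `safetensors` and `tqdm` APIs (and PyTorch available for `framework="pt"`):

```python
from safetensors import safe_open
from tqdm import tqdm

tensors = {}
with safe_open("/dev/shm/model_cache/flywheel_model.0.safetensors",
               framework="pt", device="cpu") as f:
    # 363 keys in this shard, matching the 0/363 total in the bar above.
    for key in tqdm(f.keys(), desc="Loading 0"):
        tensors[key] = f.get_tensor(key)
```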
Job junhua024-chai-16-full-63041-v4-mkmlizer completed after 162.65s with status: succeeded
Stopping job with name junhua024-chai-16-full-63041-v4-mkmlizer
Pipeline stage MKMLizer completed in 163.16s
run pipeline stage MKMLTemplater
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.17s
run pipeline stage MKMLDeployer
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-16-full-63041-v4
Waiting for inference service junhua024-chai-16-full-63041-v4 to be ready
Failed to get response for submission chaiml-nis-qwen32b-sim_98336_v34: HTTPConnectionPool(host='chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission chaiml-nis-qwen32b-sim_98336_v34: HTTPConnectionPool(host='chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission chaiml-nis-qwen32b-sim_98336_v34: HTTPConnectionPool(host='chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
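The repeated failures above concern a different submission's predictor endpoint (chaiml-nis-qwen32b-sim_98336_v34), not the deployment in progress: the client gives up after waiting 12 s for a response body. A hedged `requests` sketch of such a call (the URL path and payload are placeholders):

```python
import requests

try:
    requests.post(
        "http://chaiml-nis-qwen32b-sim-98336-v34-predictor"
        ".tenant-chaiml-guanaco.k.chaiverse.com/generate",  # path assumed
        json={"text": "..."},                               # payload assumed
        timeout=12.0,  # the read timeout that produces the error above
    )
except requests.exceptions.ReadTimeout as err:
    print(f"Failed to get response for submission "
          f"chaiml-nis-qwen32b-sim_98336_v34: {err}")
```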
Inference service junhua024-chai-16-full-63041-v4 ready after 322.3396053314209s
Pipeline stage MKMLDeployer completed in 322.91s
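Readiness took about 322 s above. A sketch of the wait loop, assuming the deployer simply polls a status check; `is_ready` is a caller-supplied stand-in for whatever condition is actually consulted (e.g. a KServe InferenceService Ready condition):

```python
import time

def wait_for_ready(name, is_ready, poll_interval=5.0, timeout=1800.0):
    # Poll until the service reports ready, then log the elapsed time
    # in the "ready after <t>s" format seen above.
    start = time.time()
    while time.time() - start < timeout:
        if is_ready(name):
            print(f"Inference service {name} ready after {time.time() - start}s")
            return
        time.sleep(poll_interval)
    raise TimeoutError(f"Inference service {name} not ready within {timeout}s")
```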
run pipeline stage StressChecker
Running pipeline stage StressChecker
Received healthy response to inference request in 2.5143654346466064s
Received healthy response to inference request in 1.6115913391113281s
Received healthy response to inference request in 2.080272674560547s
Received healthy response to inference request in 1.7069854736328125s
Received healthy response to inference request in 1.5504100322723389s
5 requests
0 failed requests
5th percentile: 1.5626462936401366
10th percentile: 1.5748825550079346
20th percentile: 1.5993550777435304
30th percentile: 1.630670166015625
40th percentile: 1.6688278198242188
50th percentile: 1.7069854736328125
60th percentile: 1.8563003540039062
70th percentile: 2.005615234375
80th percentile: 2.167091226577759
90th percentile: 2.3407283306121824
95th percentile: 2.4275468826293944
99th percentile: 2.497001724243164
mean time: 1.8927249908447266
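The percentile table above is reproducible from the five logged latencies with NumPy's default linear interpolation; for example, the 50th percentile is exactly the median response (1.7069854736328125 s), and the mean works out to 1.8927 s:

```python
import numpy as np

latencies = [2.5143654346466064, 1.6115913391113281, 2.080272674560547,
             1.7069854736328125, 1.5504100322723389]
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    # np.percentile's default linear interpolation reproduces the log values
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print(f"mean time: {np.mean(latencies)}")
```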
Pipeline stage StressChecker completed in 11.39s
run pipeline stage OfflineFamilyFriendlyTriggerPipeline
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.69s
run pipeline stage TriggerMKMLProfilingPipeline
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.72s
Shutdown handler de-registered
junhua024-chai-16-full-_63041_v4 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage OfflineFamilyFriendlyScorer
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 5137.33s
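The "%s, retrying in %s seconds..." line above is a retry message whose values were never interpolated into the format string; the scorer evidently failed once and retried before completing. A generic sketch of the pattern, with the exception handling, attempt count, and delay all assumptions rather than the scorer's actual code:

```python
import time

def with_retries(fn, attempts=3, delay=30.0):
    for attempt in range(attempts):
        try:
            return fn()
        except Exception as err:  # the scorer's real exception types are unknown
            if attempt == attempts - 1:
                raise
            print(f"{err}, retrying in {delay} seconds...")
            time.sleep(delay)
```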
Shutdown handler de-registered
junhua024-chai-16-full-_63041_v4 status is now inactive due to auto-deactivation of underperforming models
junhua024-chai-16-full-_63041_v4 status is now torndown due to DeploymentManager action