Shutdown handler not registered because the Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage MKMLizer
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-16-full-qkv-72-v5-mkmlizer
Waiting for job on junhua024-chai-16-full-qkv-72-v5-mkmlizer to finish
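The two lines above start the quantization job and then block until it finishes. A minimal sketch of that wait, assuming the mkmlizer runs as a Kubernetes Job (the namespace and poll interval are guesses, not taken from the log):

    import time
    from kubernetes import client, config

    # Assumes this runs inside the cluster; use load_kube_config() otherwise.
    config.load_incluster_config()
    batch = client.BatchV1Api()

    def wait_for_job(name: str, namespace: str) -> str:
        # Poll the Job's status until it either succeeds or fails.
        while True:
            status = batch.read_namespaced_job_status(name, namespace).status
            if status.succeeded:
                return "succeeded"
            if status.failed:
                return "failed"
            time.sleep(5)

    # e.g. wait_for_job("junhua024-chai-16-full-qkv-72-v5-mkmlizer",
    #                   "tenant-chaiml-guanaco")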
Failed to get response for submission junhua024-chai-16-full-_94000_v7: HTTPConnectionPool(host='junhua024-chai-16-full-94000-v7-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission chaiml-nis-qwen32b-sim_98336_v34: HTTPConnectionPool(host='chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
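The "Failed to get response" lines above appear to come from health polls against other, already-deployed submissions and are unrelated to the job being built here. A sketch of such a poll, assuming a plain HTTP predict endpoint (the URL path and payload are invented; the 12-second read timeout matches the error text):

    import requests

    def poll_submission(submission_id: str, host: str) -> str | None:
        url = f"http://{host}/v1/models/model:predict"  # hypothetical path
        try:
            resp = requests.post(url, json={"text": "hello"}, timeout=12.0)
            resp.raise_for_status()
            return resp.text
        except requests.exceptions.RequestException as err:
            print(f"Failed to get response for submission {submission_id}: {err}")
            return None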
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ belonging to: ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ║ ║
junhua024-chai-16-full-qkv-72-v5-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-16-full-qkv-72-v5-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
junhua024-chai-16-full-qkv-72-v5-mkmlizer: Downloaded to shared memory in 114.584s
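A sketch of the download step with huggingface_hub, assuming snapshot_download stages the checkpoint in shared memory as the line above suggests (the repo id is taken from the "Processed model" line below; the local path is a guess). Installing the optional hf_xet extra would also silence the repeated Xet Storage warning above:

    from huggingface_hub import snapshot_download

    # pip install "huggingface_hub[hf_xet]"  # enables the faster Xet backend
    local_dir = snapshot_download(
        repo_id="junhua024/chai_16_full_qkv100_o106_ffn106_1624",
        local_dir="/dev/shm/model_download",  # assumed staging path
    )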
junhua024-chai-16-full-qkv-72-v5-mkmlizer: Checking if junhua024/chai_16_full_qkv100_o106_ffn106_1624 already exists in ChaiML
junhua024-chai-16-full-qkv-72-v5-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp_m_4jyai, device:0
junhua024-chai-16-full-qkv-72-v5-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-16-full-qkv-72-v5-mkmlizer: quantized model in 32.941s
junhua024-chai-16-full-qkv-72-v5-mkmlizer: Processed model junhua024/chai_16_full_qkv100_o106_ffn106_1624 in 147.602s
junhua024-chai-16-full-qkv-72-v5-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-16-full-qkv-72-v5-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-16-full-qkv-72-v5-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-16-full-qkv-72-v5/nvidia
junhua024-chai-16-full-qkv-72-v5-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-16-full-qkv-72-v5/nvidia/special_tokens_map.json
junhua024-chai-16-full-qkv-72-v5-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-16-full-qkv-72-v5/nvidia/config.json
junhua024-chai-16-full-qkv-72-v5-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-16-full-qkv-72-v5/nvidia/tokenizer_config.json
junhua024-chai-16-full-qkv-72-v5-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-16-full-qkv-72-v5/nvidia/tokenizer.json
Failed to get response for submission chaiml-nis-qwen32b-sim_98336_v34: HTTPConnectionPool(host='chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-16-full-qkv-72-v5-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-16-full-qkv-72-v5/nvidia/flywheel_model.0.safetensors
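The cp lines above mirror the quantized model cache into S3. An equivalent sketch with boto3 (the bucket and key layout are copied from the log; the tool that actually performs the copy is not shown, so boto3 is an assumption):

    import boto3

    s3 = boto3.client("s3")
    bucket = "guanaco-mkml-models"
    prefix = "junhua024-chai-16-full-qkv-72-v5/nvidia"
    for name in [
        "special_tokens_map.json",
        "config.json",
        "tokenizer_config.json",
        "tokenizer.json",
        "flywheel_model.0.safetensors",
    ]:
        s3.upload_file(f"/dev/shm/model_cache/{name}", bucket, f"{prefix}/{name}")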
junhua024-chai-16-full-qkv-72-v5-mkmlizer:
Loading 0: 0%| | 0/363 [00:00<?, ?it/s]
Loading 0: 50%|████▉ | 181/363 [00:05<00:05, 33.20it/s]
Loading 0: 98%|█████████▊| 357/363 [00:10<00:00, 23.88it/s]
Job junhua024-chai-16-full-qkv-72-v5-mkmlizer completed after 170.15s with status: succeeded
Stopping job with name junhua024-chai-16-full-qkv-72-v5-mkmlizer
Pipeline stage MKMLizer completed in 170.67s
run pipeline stage MKMLTemplater
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.17s
run pipeline stage MKMLDeployer
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-16-full-qkv-72-v5
Waiting for inference service junhua024-chai-16-full-qkv-72-v5 to be ready
Failed to get response for submission chaiml-nis-qwen32b-sim_98336_v34: HTTPConnectionPool(host='chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission chaiml-nis-qwen32b-sim_98336_v34: HTTPConnectionPool(host='chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission junhua024-chai-16-full-_94000_v6: HTTPConnectionPool(host='junhua024-chai-16-full-94000-v6-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission junhua024-chai-16-full-_94000_v6: HTTPConnectionPool(host='junhua024-chai-16-full-94000-v6-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Inference service junhua024-chai-16-full-qkv-72-v5 ready after 331.57129645347595s
Pipeline stage MKMLDeployer completed in 332.37s
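A sketch of the readiness wait inside MKMLDeployer, assuming the service is a KServe InferenceService (the group/version/plural follow KServe's v1beta1 API; the namespace is inferred from the predictor hostnames above and may be wrong):

    import time
    from kubernetes import client, config

    config.load_incluster_config()  # assumes in-cluster execution
    api = client.CustomObjectsApi()

    def wait_until_ready(name: str, namespace: str, poll: float = 5.0) -> float:
        # Return the seconds elapsed until the service reports Ready.
        start = time.time()
        while True:
            obj = api.get_namespaced_custom_object(
                group="serving.kserve.io", version="v1beta1",
                namespace=namespace, plural="inferenceservices", name=name)
            conditions = obj.get("status", {}).get("conditions", [])
            if any(c.get("type") == "Ready" and c.get("status") == "True"
                   for c in conditions):
                return time.time() - start
            time.sleep(poll)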
run pipeline stage StressChecker
Running pipeline stage StressChecker
Received healthy response to inference request in 2.4908838272094727s
Received healthy response to inference request in 2.0420637130737305s
Received healthy response to inference request in 1.575993299484253s
Received healthy response to inference request in 1.7192416191101074s
Received healthy response to inference request in 1.6810956001281738s
5 requests
0 failed requests
5th percentile: 1.5970137596130372
10th percentile: 1.6180342197418214
20th percentile: 1.6600751399993896
30th percentile: 1.6887248039245606
40th percentile: 1.703983211517334
50th percentile: 1.7192416191101074
60th percentile: 1.8483704566955566
70th percentile: 1.977499294281006
80th percentile: 2.131827735900879
90th percentile: 2.311355781555176
95th percentile: 2.4011198043823243
99th percentile: 2.472931022644043
mean time: 1.9018556118011474
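The summary above is reproducible exactly with numpy's default linear-interpolation percentiles over the five logged response times, which is presumably how StressChecker computes it:

    import numpy as np

    latencies = [2.4908838272094727, 2.0420637130737305, 1.575993299484253,
                 1.7192416191101074, 1.6810956001281738]
    for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f"{q}th percentile: {np.percentile(latencies, q)}")
    print(f"mean time: {np.mean(latencies)}")  # 1.9018556118011474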
Pipeline stage StressChecker completed in 11.15s
run pipeline stage OfflineFamilyFriendlyTriggerPipeline
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.75s
run pipeline stage TriggerMKMLProfilingPipeline
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.77s
Shutdown handler de-registered
junhua024-chai-16-full-qkv_72_v5 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage OfflineFamilyFriendlyScorer
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 5078.30s
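A hypothetical sketch of the scorer stage just finished: evaluating conversations across a thread pool and retrying transient failures, matching the "Evaluating ... with ... threads" and "retrying in ... seconds" lines above; score_one, the thread count, and the retry policy are all inventions:

    import time
    from concurrent.futures import ThreadPoolExecutor

    def score_one(conversation: str) -> float:
        raise NotImplementedError  # placeholder for the real classifier call

    def with_retries(fn, arg, attempts: int = 3, delay: float = 10.0):
        for i in range(attempts):
            try:
                return fn(arg)
            except Exception as err:
                if i == attempts - 1:
                    raise
                print(f"{err}, retrying in {delay} seconds...")
                time.sleep(delay)

    def evaluate(conversations: list[str], n_threads: int = 8) -> list[float]:
        with ThreadPoolExecutor(max_workers=n_threads) as pool:
            return list(pool.map(lambda c: with_retries(score_one, c),
                                 conversations))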
Shutdown handler de-registered
junhua024-chai-16-full-qkv_72_v5 status is now inactive due to auto-deactivation of underperforming models
junhua024-chai-16-full-qkv_72_v5 status is now torndown due to DeploymentManager action