Shutdown handler not registered because Python interpreter is not running in the main thread
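This first line is CPython's standard restriction: signal handlers can only be installed from the main thread of the main interpreter. A minimal sketch, assuming only the standard library (the pipeline's actual runner code is not shown in this log, so the function name here is illustrative), reproducing why a worker thread logs this warning instead of registering a handler:

```python
# Why "Shutdown handler not registered ..." appears: CPython raises
# ValueError if signal.signal() is called outside the main thread,
# so pipeline runners typically catch it and log a warning instead.
import signal
import threading

def register_shutdown(handler):
    try:
        signal.signal(signal.SIGTERM, handler)
        print("Shutdown handler registered")
    except ValueError:
        print("Shutdown handler not registered because Python interpreter "
              "is not running in the main thread")

worker = threading.Thread(target=register_shutdown,
                          args=(lambda sig, frame: None,))
worker.start()
worker.join()
```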
run pipeline %s
run pipeline stage MKMLizer
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-06-full-30622-v26-mkmlizer
Waiting for job on junhua024-chai-06-full-30622-v26-mkmlizer to finish
Failed to get response for submission chaiml-nis-qwen32b-sim_98336_v34: HTTPConnectionPool(host='chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-06-full-30622-v26-mkmlizer: [MK1 MKMLizer startup banner]
junhua024-chai-06-full-30622-v26-mkmlizer: Version: 0.29.15
junhua024-chai-06-full-30622-v26-mkmlizer: Features: FLYWHEEL, CUDA
junhua024-chai-06-full-30622-v26-mkmlizer: Copyright 2023-2025 MK ONE TECHNOLOGIES Inc.
junhua024-chai-06-full-30622-v26-mkmlizer: https://mk1.ai
junhua024-chai-06-full-30622-v26-mkmlizer: The license key for the current software has been verified as belonging to: Chai Research Corp.
junhua024-chai-06-full-30622-v26-mkmlizer: Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f
junhua024-chai-06-full-30622-v26-mkmlizer: Expiration: 2028-03-31 23:59:59
junhua024-chai-06-full-30622-v26-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
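The warning above is emitted once per downloaded file; installing the optional backend removes it. A minimal sketch, assuming a standard Python environment (this check is not part of the mkmlizer, and the function name is illustrative), for verifying whether hf_xet is importable before kicking off large downloads:

```python
# Checks for the optional hf_xet backend that the warning above refers to;
# without it, huggingface_hub falls back to plain HTTP downloads.
import importlib.util

def xet_available() -> bool:
    return importlib.util.find_spec("hf_xet") is not None

if not xet_available():
    print("hf_xet missing; downloads fall back to HTTP. "
          "Install with: pip install 'huggingface_hub[hf_xet]'")
```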
junhua024-chai-06-full-30622-v26-mkmlizer: Downloaded to shared memory in 79.152s
junhua024-chai-06-full-30622-v26-mkmlizer: Checking if junhua024/chai_06_full_02102_1925 already exists in ChaiML
junhua024-chai-06-full-30622-v26-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp5j94oubd, device:0
junhua024-chai-06-full-30622-v26-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Failed to get response for submission blend_fader_2025-07-10: ('http://chaiml-mistral-24b-dpo-59605-v3-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:52970->127.0.0.1:8080: read: connection reset by peer\n')
junhua024-chai-06-full-30622-v26-mkmlizer: quantized model in 34.347s
junhua024-chai-06-full-30622-v26-mkmlizer: Processed model junhua024/chai_06_full_02102_1925 in 113.616s
junhua024-chai-06-full-30622-v26-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-06-full-30622-v26-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-06-full-30622-v26-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-06-full-30622-v26/nvidia
junhua024-chai-06-full-30622-v26-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-06-full-30622-v26/nvidia/special_tokens_map.json
junhua024-chai-06-full-30622-v26-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-06-full-30622-v26/nvidia/config.json
junhua024-chai-06-full-30622-v26-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-06-full-30622-v26/nvidia/tokenizer_config.json
junhua024-chai-06-full-30622-v26-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-06-full-30622-v26/nvidia/tokenizer.json
junhua024-chai-06-full-30622-v26-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-06-full-30622-v26/nvidia/flywheel_model.0.safetensors
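The cp lines above show the artifact layout under s3://guanaco-mkml-models/. A minimal boto3 sketch that would reproduce the same uploads, assuming AWS-style credentials are configured for that endpoint; the actual uploader used by the mkmlizer is not shown in this log:

```python
# Uploads the quantized model artifacts listed above to the
# guanaco-mkml-models bucket, preserving the <submission>/nvidia/ layout.
import boto3

s3 = boto3.client("s3")
bucket = "guanaco-mkml-models"
prefix = "junhua024-chai-06-full-30622-v26/nvidia"
files = [
    "special_tokens_map.json",
    "config.json",
    "tokenizer_config.json",
    "tokenizer.json",
    "flywheel_model.0.safetensors",
]
for name in files:
    s3.upload_file(f"/dev/shm/model_cache/{name}", bucket, f"{prefix}/{name}")
```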
junhua024-chai-06-full-30622-v26-mkmlizer:
Loading 0: 0%| | 0/363 [00:00<?, ?it/s]
[... intermediate tqdm progress updates elided ...]
Loading 0: 98%|█████████▊| 357/363 [00:12<00:00, 19.96it/s]
Job junhua024-chai-06-full-30622-v26-mkmlizer completed after 140.49s with status: succeeded
Stopping job with name junhua024-chai-06-full-30622-v26-mkmlizer
Pipeline stage MKMLizer completed in 141.12s
run pipeline stage MKMLTemplater
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.18s
run pipeline stage MKMLDeployer
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-06-full-30622-v26
Waiting for inference service junhua024-chai-06-full-30622-v26 to be ready
Failed to get response for submission chaiml-nis-qwen32b-sim_98336_v34: HTTPConnectionPool(host='chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Inference service junhua024-chai-06-full-30622-v26 ready after 321.58s
Pipeline stage MKMLDeployer completed in 322.33s
run pipeline stage StressChecker
Running pipeline stage StressChecker
Received healthy response to inference request in 2.8340747356414795s
Received healthy response to inference request in 1.8861279487609863s
Received healthy response to inference request in 1.5429060459136963s
Received healthy response to inference request in 1.671539068222046s
Received healthy response to inference request in 2.388829469680786s
5 requests
0 failed requests
5th percentile: 1.5686326503753663
10th percentile: 1.5943592548370362
20th percentile: 1.645812463760376
30th percentile: 1.714456844329834
40th percentile: 1.80029239654541
50th percentile: 1.8861279487609863
60th percentile: 2.087208557128906
70th percentile: 2.2882891654968263
80th percentile: 2.477878522872925
90th percentile: 2.6559766292572022
95th percentile: 2.7450256824493406
99th percentile: 2.816264925003052
mean time: 2.064695453643799
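The percentile block above is consistent with numpy's default linear interpolation over the five request latencies; the 5th, 95th, and 99th values match np.percentile to full precision. A short sketch reproducing the numbers (assuming numpy is available; the StressChecker's actual implementation is not shown in this log):

```python
# Reproduces the StressChecker statistics above from the five logged latencies.
import numpy as np

latencies = [
    2.8340747356414795,
    1.8861279487609863,
    1.5429060459136963,
    1.671539068222046,
    2.388829469680786,
]  # seconds, in arrival order
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print(f"mean time: {np.mean(latencies)}")
```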
Pipeline stage StressChecker completed in 12.11s
run pipeline stage OfflineFamilyFriendlyTriggerPipeline
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.77s
run pipeline stage TriggerMKMLProfilingPipeline
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.71s
Shutdown handler de-registered
junhua024-chai-06-full_30622_v26 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage OfflineFamilyFriendlyScorer
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 4690.14s
Shutdown handler de-registered
junhua024-chai-06-full_30622_v26 status is now inactive due to auto deactivation (removal of underperforming models)
junhua024-chai-06-full_30622_v26 status is now torndown due to DeploymentManager action