Shutdown handler not registered because Python interpreter is not running in the main thread
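The line above is the pipeline runner noting that it skipped signal-based shutdown handling because it is not on the main thread. A minimal sketch of how such a guard is commonly written (names and cleanup logic here are illustrative, not the actual chaiverse implementation):

    import atexit
    import signal
    import threading

    def _shutdown(*_args):
        # hypothetical cleanup hook; the real pipeline's handler is not shown in this log
        print("Shutdown handler invoked")

    if threading.current_thread() is threading.main_thread():
        # signal.signal() may only be called from the main thread of the main interpreter
        signal.signal(signal.SIGTERM, _shutdown)
        signal.signal(signal.SIGINT, _shutdown)
        print("Shutdown handler registered")
    else:
        # atexit works from any thread, so fall back to it and log the skip
        atexit.register(_shutdown)
        print("Shutdown handler not registered because Python interpreter "
              "is not running in the main thread")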
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-1-full-066126-v48-mkmlizer
Waiting for job on junhua024-chai-1-full-066126-v48-mkmlizer to finish
junhua024-chai-1-full-066126-v48-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-1-full-066126-v48-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ belonging to: ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-1-full-066126-v48-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v48-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-1-full-066126-v48-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
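The warning above is emitted by huggingface_hub once per downloaded file: the repo has Xet Storage enabled, but the optional hf_xet backend is not installed, so each file falls back to plain HTTP. A hedged sketch of the download step that would trigger it (the target directory is an assumption; the repo id is taken from the log):

    from huggingface_hub import snapshot_download

    # Assumed values; the actual repo and cache path come from the mkmlizer job config.
    snapshot_download(
        repo_id="junhua024/chai-1-full-066126",
        local_dir="/dev/shm/model_tmp",  # hypothetical path; the log only says "shared memory"
    )
    # Installing the optional backend silences the warning and speeds up downloads:
    #   pip install "huggingface_hub[hf_xet]"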
junhua024-chai-1-full-066126-v48-mkmlizer: Downloaded to shared memory in 102.672s
junhua024-chai-1-full-066126-v48-mkmlizer: Checking if junhua024/chai-1-full-066126 already exists in ChaiML
junhua024-chai-1-full-066126-v48-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpd0fcby5s, device:0
junhua024-chai-1-full-066126-v48-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-1-full-066126-v48-mkmlizer: quantized model in 31.025s
junhua024-chai-1-full-066126-v48-mkmlizer: Processed model junhua024/chai-1-full-066126 in 133.884s
junhua024-chai-1-full-066126-v48-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-1-full-066126-v48-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-1-full-066126-v48-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v48/nvidia
junhua024-chai-1-full-066126-v48-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v48/nvidia/config.json
junhua024-chai-1-full-066126-v48-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v48/nvidia/special_tokens_map.json
junhua024-chai-1-full-066126-v48-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v48/nvidia/tokenizer_config.json
junhua024-chai-1-full-066126-v48-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v48/nvidia/tokenizer.json
junhua024-chai-1-full-066126-v48-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v48/nvidia/flywheel_model.0.safetensors
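The cp lines above copy the quantized artifacts from /dev/shm/model_cache into s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v48/nvidia/. A minimal boto3 sketch of the same upload pattern (the real job may use a different client or CLI; bucket, prefix, and file names are taken from the log):

    import os
    import boto3

    s3 = boto3.client("s3")
    bucket = "guanaco-mkml-models"
    prefix = "junhua024-chai-1-full-066126-v48/nvidia"
    local_dir = "/dev/shm/model_cache"

    # corresponds to the "creating bucket" line; regions other than us-east-1
    # would also need a CreateBucketConfiguration
    s3.create_bucket(Bucket=bucket)

    for name in ("config.json", "special_tokens_map.json", "tokenizer_config.json",
                 "tokenizer.json", "flywheel_model.0.safetensors"):
        # mirrors the cp lines above: local cache file -> s3://bucket/prefix/name
        s3.upload_file(os.path.join(local_dir, name), bucket, f"{prefix}/{name}")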
junhua024-chai-1-full-066126-v48-mkmlizer:
Loading 0: 0%| | 0/363 [00:00<?, ?it/s]
Loading 0: 98%|█████████▊| 357/363 [00:10<00:00, 26.32it/s]
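The progress output above (condensed here) is a per-tensor loading bar: 363 tensors streamed in roughly 10 seconds. A sketch of how such a bar is typically produced when repacking a checkpoint (illustrative only; the actual mkmlizer loader is not shown in this log):

    import torch
    from tqdm import tqdm
    from safetensors.torch import save_file

    def repack(state_dict: dict, out_path: str) -> None:
        packed = {}
        # iterating the 363 tensors with tqdm yields "Loading 0: x%|...| n/363" lines
        for name, tensor in tqdm(state_dict.items(), desc="Loading 0", total=len(state_dict)):
            packed[name] = tensor.contiguous()
        save_file(packed, out_path)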
Job junhua024-chai-1-full-066126-v48-mkmlizer completed after 158.18s with status: succeeded
Stopping job with name junhua024-chai-1-full-066126-v48-mkmlizer
Pipeline stage MKMLizer completed in 158.66s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.16s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-1-full-066126-v48
Waiting for inference service junhua024-chai-1-full-066126-v48 to be ready
Failed to get response for submission junhua024-chai-1-full-066126_v47: HTTPConnectionPool(host='junhua024-chai-1-full-066126-v47-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
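The Read timed out error above comes from an HTTP client probing the previous submission's predictor with a 12-second read timeout. A hedged sketch of such a probe using requests (the request path and payload shape are assumptions; the hostname and timeout are from the log):

    import requests

    url = ("http://junhua024-chai-1-full-066126-v47-predictor"
           ".tenant-chaiml-guanaco.k.chaiverse.com/v1/models/model:predict")  # path is a guess
    try:
        # timeout=(connect, read); the log reports read timeout=12.0
        resp = requests.post(url, json={"inputs": "..."}, timeout=(5.0, 12.0))
        resp.raise_for_status()
    except requests.exceptions.ReadTimeout:
        print("Failed to get response for submission junhua024-chai-1-full-066126_v47")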
Inference service junhua024-chai-1-full-066126-v48 ready after 230.8318727016449s
Pipeline stage MKMLDeployer completed in 231.35s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.8285117149353027s
Received healthy response to inference request in 1.8165266513824463s
Received healthy response to inference request in 1.6141252517700195s
Received healthy response to inference request in 1.6144113540649414s
Received healthy response to inference request in 1.560880422592163s
5 requests
0 failed requests
5th percentile: 1.5715293884277344
10th percentile: 1.5821783542633057
20th percentile: 1.6034762859344482
30th percentile: 1.614182472229004
40th percentile: 1.6142969131469727
50th percentile: 1.6144113540649414
60th percentile: 1.6952574729919434
70th percentile: 1.7761035919189452
80th percentile: 2.018923664093018
90th percentile: 2.4237176895141603
95th percentile: 2.626114702224731
99th percentile: 2.7880323123931885
mean time: 1.8868910789489746
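The StressChecker summary above is consistent with linearly interpolated percentiles over the five response times (for example, the 5th percentile 1.5715... lies a fifth of the way between the two fastest samples). A sketch that reproduces these figures with numpy:

    import numpy as np

    latencies = np.array([
        2.8285117149353027,
        1.8165266513824463,
        1.6141252517700195,
        1.6144113540649414,
        1.560880422592163,
    ])

    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        # numpy's default linear interpolation matches the values reported in the log
        print(f"{p}th percentile: {np.percentile(latencies, p)}")
    print(f"mean time: {latencies.mean()}")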
Pipeline stage StressChecker completed in 10.76s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 1.10s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.90s
Shutdown handler de-registered
junhua024-chai-1-full-066126_v48 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3065.16s
Shutdown handler de-registered
junhua024-chai-1-full-066126_v48 status is now inactive due to auto deactivation (removed underperforming models)
junhua024-chai-1-full-066126_v48 status is now torndown due to DeploymentManager action