Shutdown handler not registered because Python interpreter is not running in the main thread
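This message reflects CPython's rule that signal handlers may only be installed from the main thread of the main interpreter. A minimal illustrative guard (not the pipeline's actual code) looks like:

    import signal
    import threading

    def register_shutdown_handler(handler):
        # signal.signal() raises ValueError when called off the main thread,
        # so skip registration instead of crashing the worker thread.
        if threading.current_thread() is not threading.main_thread():
            print("Shutdown handler not registered because Python interpreter "
                  "is not running in the main thread")
            return
        signal.signal(signal.SIGTERM, handler)
        signal.signal(signal.SIGINT, handler)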
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-1-full-002-v22-mkmlizer
Waiting for job on junhua024-chai-1-full-002-v22-mkmlizer to finish
junhua024-chai-1-full-002-v22-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-1-full-002-v22-mkmlizer: ║ ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ Version: 0.29.3 ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ belonging to: ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-1-full-002-v22-mkmlizer: ║ ║
junhua024-chai-1-full-002-v22-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-1-full-002-v22-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
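The warning above only means the optional hf_xet package is missing, so huggingface_hub falls back to plain HTTP for each file it fetches. A small sketch to detect the fallback before downloading (the check itself is generic Python; a recent huggingface_hub with Xet support is assumed):

    import importlib.util

    if importlib.util.find_spec("hf_xet") is None:
        # Install with: pip install "huggingface_hub[hf_xet]"  (per the warning)
        print("hf_xet not installed; downloads will use regular HTTP")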
junhua024-chai-1-full-002-v22-mkmlizer: Downloaded to shared memory in 77.694s
junhua024-chai-1-full-002-v22-mkmlizer: Checking if junhua024/chai_1-full_002 already exists in ChaiML
junhua024-chai-1-full-002-v22-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp7kjcg6a4, device:0
junhua024-chai-1-full-002-v22-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-1-full-002-v22-mkmlizer: quantized model in 31.032s
junhua024-chai-1-full-002-v22-mkmlizer: Processed model junhua024/chai_1-full_002 in 108.807s
junhua024-chai-1-full-002-v22-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-1-full-002-v22-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-1-full-002-v22-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-1-full-002-v22
junhua024-chai-1-full-002-v22-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-1-full-002-v22/config.json
junhua024-chai-1-full-002-v22-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-1-full-002-v22/special_tokens_map.json
junhua024-chai-1-full-002-v22-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-1-full-002-v22/tokenizer_config.json
junhua024-chai-1-full-002-v22-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-1-full-002-v22/tokenizer.json
junhua024-chai-1-full-002-v22-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-1-full-002-v22/flywheel_model.0.safetensors
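A rough boto3 equivalent of the cp commands above, using the bucket and paths from the log; boto3 and ambient AWS credentials are assumptions here, and the mkmlizer's actual S3 client is not shown:

    import boto3

    s3 = boto3.client("s3")
    bucket = "guanaco-mkml-models"
    prefix = "junhua024-chai-1-full-002-v22"
    files = ["config.json", "special_tokens_map.json", "tokenizer_config.json",
             "tokenizer.json", "flywheel_model.0.safetensors"]
    for name in files:
        # upload_file(local_path, bucket, key) streams each file to S3
        s3.upload_file(f"/dev/shm/model_cache/{name}", bucket, f"{prefix}/{name}")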
junhua024-chai-1-full-002-v22-mkmlizer:
Loading 0:   0%|          | 0/363 [00:00<?, ?it/s]
Loading 0:  98%|█████████▊| 357/363 [00:10<00:00, 26.13it/s]
Job junhua024-chai-1-full-002-v22-mkmlizer completed after 139.08s with status: succeeded
Stopping job with name junhua024-chai-1-full-002-v22-mkmlizer
Pipeline stage MKMLizer completed in 139.57s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.16s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-1-full-002-v22
Waiting for inference service junhua024-chai-1-full-002-v22 to be ready
Inference service junhua024-chai-1-full-002-v22 ready after 180.87465691566467s
Pipeline stage MKMLDeployer completed in 181.40s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.8922946453094482s
Received healthy response to inference request in 2.108288049697876s
Received healthy response to inference request in 1.6594512462615967s
Received healthy response to inference request in 1.7417757511138916s
Received healthy response to inference request in 2.0793232917785645s
5 requests
0 failed requests
5th percentile: 1.6759161472320556
10th percentile: 1.6923810482025146
20th percentile: 1.7253108501434327
30th percentile: 1.8092852592468263
40th percentile: 1.9443042755126954
50th percentile: 2.0793232917785645
60th percentile: 2.0909091949462892
70th percentile: 2.1024950981140136
80th percentile: 2.2650893688201905
90th percentile: 2.578692007064819
95th percentile: 2.7354933261871337
99th percentile: 2.8609343814849852
mean time: 2.0962265968322753
Pipeline stage StressChecker completed in 11.82s
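The StressChecker statistics above follow from the five reported latencies; they match numpy's default linear percentile interpolation, so they can be reproduced with a short verification sketch (numpy here is an assumption, the checker's own code is not shown):

    import numpy as np

    latencies = [2.8922946453094482, 2.108288049697876, 1.6594512462615967,
                 1.7417757511138916, 2.0793232917785645]
    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        # np.percentile defaults to linear interpolation between order statistics
        print(f"{p}th percentile: {np.percentile(latencies, p)}")
    print(f"mean time: {np.mean(latencies)}")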
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.66s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.97s
Shutdown handler de-registered
junhua024-chai-1-full-002_v22 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 5192.65s
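The "retrying in %s seconds" line above indicates the scorer retries transient failures after a delay. A generic retry sketch for illustration (the scorer's real retry policy, attempt count, and exception types are not shown):

    import time

    def with_retries(fn, attempts=3, delay=5.0):
        for attempt in range(1, attempts + 1):
            try:
                return fn()
            except Exception as err:
                if attempt == attempts:
                    raise
                print(f"{err}, retrying in {delay} seconds...")
                time.sleep(delay)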
Shutdown handler de-registered
junhua024-chai-1-full-002_v22 status is now inactive due to auto deactivation of underperforming models
junhua024-chai-1-full-002_v22 status is now torndown due to DeploymentManager action