Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-1-full-066126-v58-mkmlizer
Waiting for job on junhua024-chai-1-full-066126-v58-mkmlizer to finish
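The pipeline launches the mkmlizer as a batch job and blocks until it finishes. The log does not say which scheduler is used; below is a minimal sketch, assuming a Kubernetes Job and the official kubernetes Python client, of what "waiting for the job to finish" can look like. Only the job name comes from the log; the namespace and polling logic are assumptions.

```python
# Hypothetical sketch: poll a Kubernetes Job until it succeeds or fails.
# Assumes in-cluster credentials and that the mkmlizer runs as a batch Job;
# the namespace below is a guess, not something the log confirms.
import time
from kubernetes import client, config

def wait_for_job(name: str, namespace: str, poll_seconds: float = 5.0) -> str:
    config.load_incluster_config()  # or config.load_kube_config() when run locally
    batch = client.BatchV1Api()
    while True:
        status = batch.read_namespaced_job_status(name, namespace).status
        if status.succeeded:
            return "succeeded"
        if status.failed:
            return "failed"
        time.sleep(poll_seconds)

# wait_for_job("junhua024-chai-1-full-066126-v58-mkmlizer", "tenant-chaiml-guanaco")
```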
junhua024-chai-1-full-066126-v58-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-1-full-066126-v58-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ belonging to: ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-1-full-066126-v58-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v58-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-1-full-066126-v58-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
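The warning above is emitted by huggingface_hub: the repo supports Xet Storage, but without the optional hf_xet package downloads fall back to plain HTTP. A minimal check, assuming the environment matches the mkmlizer image, that mirrors what the warning is reporting:

```python
# Check for the optional Xet backend that the warning above refers to.
# If hf_xet is missing, huggingface_hub falls back to regular HTTP downloads,
# exactly as logged.
import importlib.util

if importlib.util.find_spec("hf_xet") is None:
    print("hf_xet not installed; downloads will use regular HTTP.")
    print("Install with: pip install 'huggingface_hub[hf_xet]'")
else:
    print("hf_xet available; Xet-backed downloads enabled.")
```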
Connection pool is full, discarding connection: %s. Connection pool size: %s
Failed to get response for submission zmeeks-capitanito-45-e2_v5: HTTPConnectionPool(host='zmeeks-capitanito-45-e2-v5-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-1-full-066126-v58-mkmlizer: Downloaded to shared memory in 77.812s
junhua024-chai-1-full-066126-v58-mkmlizer: Checking if junhua024/chai-1-full-066126 already exists in ChaiML
junhua024-chai-1-full-066126-v58-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpkzr91q8f, device:0
junhua024-chai-1-full-066126-v58-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-1-full-066126-v58-mkmlizer: quantized model in 32.060s
junhua024-chai-1-full-066126-v58-mkmlizer: Processed model junhua024/chai-1-full-066126 in 109.966s
junhua024-chai-1-full-066126-v58-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-1-full-066126-v58-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-1-full-066126-v58-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v58/nvidia
junhua024-chai-1-full-066126-v58-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v58/nvidia/config.json
junhua024-chai-1-full-066126-v58-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v58/nvidia/special_tokens_map.json
junhua024-chai-1-full-066126-v58-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v58/nvidia/tokenizer_config.json
junhua024-chai-1-full-066126-v58-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v58/nvidia/tokenizer.json
junhua024-chai-1-full-066126-v58-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v58/nvidia/flywheel_model.0.safetensors
junhua024-chai-1-full-066126-v58-mkmlizer:
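The quantized artifacts in /dev/shm/model_cache are copied to the guanaco-mkml-models bucket under the submission's nvidia prefix. The log only shows cp-style commands, not the tool behind them; here is a minimal boto3 sketch, assuming standard AWS credentials, that would produce the same object layout. Paths and bucket name come from the log; the boto3 usage is an assumption.

```python
# Illustrative only: mirror a local model cache into the S3 prefix shown in the
# log. The real pipeline may use a CLI copy tool instead.
from pathlib import Path
import boto3

def upload_model_cache(local_dir: str, bucket: str, prefix: str) -> None:
    s3 = boto3.client("s3")
    for path in Path(local_dir).rglob("*"):
        if path.is_file():
            key = f"{prefix}/{path.relative_to(local_dir)}"
            s3.upload_file(str(path), bucket, key)

# upload_model_cache("/dev/shm/model_cache", "guanaco-mkml-models",
#                    "junhua024-chai-1-full-066126-v58/nvidia")
```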
Loading 0:   0%|          | 0/363 [00:00<?, ?it/s]
Loading 0: 99%|█████████▉| 359/363 [00:11<00:00, 28.61it/s]
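The "Loading 0: …/363" lines are a tqdm-style progress bar printed while the 363 model tensors are read in. A minimal sketch of how such output can be produced when iterating over a safetensors checkpoint; the file path and device are placeholders, not values taken from the log.

```python
# Illustrative sketch: iterate over tensors in a safetensors file with a tqdm
# bar, which renders lines like "Loading 0:  50%|█████     | 181/363 [...]".
from safetensors import safe_open
from tqdm import tqdm

def load_tensors(path: str, device: str = "cpu") -> dict:
    tensors = {}
    with safe_open(path, framework="pt", device=device) as f:
        keys = list(f.keys())
        for key in tqdm(keys, desc="Loading 0", total=len(keys)):
            tensors[key] = f.get_tensor(key)
    return tensors

# load_tensors("flywheel_model.0.safetensors")  # hypothetical local path
```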
Job junhua024-chai-1-full-066126-v58-mkmlizer completed after 138.4s with status: succeeded
Stopping job with name junhua024-chai-1-full-066126-v58-mkmlizer
Pipeline stage MKMLizer completed in 139.04s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.18s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-1-full-066126-v58
Waiting for inference service junhua024-chai-1-full-066126-v58 to be ready
Failed to get response for submission junhua024-chai-1-full-066126_v56: ('http://junhua024-chai-1-full-066126-v56-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:42652->127.0.0.1:8080: read: connection reset by peer\n')
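The "Failed to get response" lines here and earlier in the log are health probes against other live submissions that timed out or had their connection reset; the read timeout is logged as 12.0 s. A minimal sketch, assuming a plain requests POST to the predictor URL format shown in the log; the payload shape and error handling are illustrative, not the real probe implementation.

```python
# Illustrative probe against a predictor endpoint with the 12 s read timeout
# seen in the log. The payload is a placeholder, not the real request schema.
import requests

def probe(url: str, payload: dict, read_timeout: float = 12.0):
    try:
        resp = requests.post(url, json=payload, timeout=(3.05, read_timeout))
        resp.raise_for_status()
        return resp.json()
    except requests.exceptions.RequestException as err:
        print(f"Failed to get response: {err}")
        return None
```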
Inference service junhua024-chai-1-full-066126-v58 ready after 230.97090411186218s
Pipeline stage MKMLDeployer completed in 231.73s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.5486018657684326s
Received healthy response to inference request in 1.69643235206604s
Received healthy response to inference request in 1.7000823020935059s
Received healthy response to inference request in 1.5705363750457764s
Received healthy response to inference request in 1.7269775867462158s
5 requests
0 failed requests
5th percentile: 1.5957155704498291
10th percentile: 1.620894765853882
20th percentile: 1.6712531566619873
30th percentile: 1.6971623420715332
40th percentile: 1.6986223220825196
50th percentile: 1.7000823020935059
60th percentile: 1.7108404159545898
70th percentile: 1.7215985298156737
80th percentile: 1.8913024425506593
90th percentile: 2.219952154159546
95th percentile: 2.384277009963989
99th percentile: 2.515736894607544
mean time: 1.8485260963439942
Pipeline stage StressChecker completed in 10.66s
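The StressChecker statistics can be reproduced directly from the five response times above: numpy's default linear-interpolation percentile gives the same values reported in the log (e.g. a 5th percentile of ~1.596 s and a mean of ~1.849 s).

```python
# Recompute the stress-check statistics from the five logged response times.
import numpy as np

latencies = np.array([2.5486018657684326, 1.69643235206604, 1.7000823020935059,
                      1.5705363750457764, 1.7269775867462158])

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print(f"mean time: {latencies.mean()}")
```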
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.69s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.97s
Shutdown handler de-registered
junhua024-chai-1-full-066126_v58 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
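The family-friendly scorer evaluates the submission with a pool of worker threads (the count is not interpolated into the log line). A minimal ThreadPoolExecutor sketch of that fan-out pattern; the scoring function and inputs are hypothetical, not the pipeline's actual implementation.

```python
# Hypothetical sketch of fanning a scoring function out over worker threads.
from concurrent.futures import ThreadPoolExecutor

def score_sample(text: str) -> float:
    # Placeholder: the real family-friendly classifier is not shown in the log.
    return 0.0

def evaluate(samples: list[str], num_threads: int) -> float:
    with ThreadPoolExecutor(max_workers=num_threads) as pool:
        scores = list(pool.map(score_sample, samples))
    return sum(scores) / len(scores)
```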
Pipeline stage OfflineFamilyFriendlyScorer completed in 3146.83s
Shutdown handler de-registered
junhua024-chai-1-full-066126_v58 status is now inactive due to auto deactivation removed underperforming models
admin requested tearing down of junhua024-chai-1-full-066126_v58
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLDeleter
Checking if service junhua024-chai-1-full-066126-v58 is running
Tearing down inference service junhua024-chai-1-full-066126-v58
Service junhua024-chai-1-full-066126-v58 has been torn down
Pipeline stage MKMLDeleter completed in 4.02s
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key junhua024-chai-1-full-066126-v58/nvidia/config.json from bucket guanaco-mkml-models
Deleting key junhua024-chai-1-full-066126-v58/nvidia/flywheel_model.0.safetensors from bucket guanaco-mkml-models
admin requested tearing down of chaiml-mattheo-riddle-m_84066_v1
Deleting key junhua024-chai-1-full-066126-v58/nvidia/special_tokens_map.json from bucket guanaco-mkml-models
Shutdown handler not registered because Python interpreter is not running in the main thread
Deleting key junhua024-chai-1-full-066126-v58/nvidia/tokenizer.json from bucket guanaco-mkml-models
run pipeline %s
Deleting key junhua024-chai-1-full-066126-v58/nvidia/tokenizer_config.json from bucket guanaco-mkml-models
run pipeline stage %s
Running pipeline stage MKMLDeleter
Pipeline stage MKMLModelDeleter completed in 3.15s
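Model cleanup deletes each of the keys listed above from the guanaco-mkml-models bucket. A minimal boto3 sketch of equivalent cleanup, assuming the prefix layout shown in the log; the real deleter may work differently.

```python
# Illustrative cleanup: remove every object under the submission's prefix.
import boto3

def delete_model_objects(bucket: str, prefix: str) -> None:
    s3 = boto3.resource("s3")
    s3.Bucket(bucket).objects.filter(Prefix=prefix).delete()

# delete_model_objects("guanaco-mkml-models", "junhua024-chai-1-full-066126-v58/nvidia/")
```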
Pipeline stage %s skipped, reason=%s
Shutdown handler de-registered
Pipeline stage MKMLDeleter completed in 0.40s
junhua024-chai-1-full-066126_v58 status is now torndown due to DeploymentManager action