Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-02-full-27473-v1-mkmlizer
Waiting for job on junhua024-chai-02-full-27473-v1-mkmlizer to finish
junhua024-chai-02-full-27473-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-02-full-27473-v1-mkmlizer: ║ ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ belonging to: ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-02-full-27473-v1-mkmlizer: ║ ║
junhua024-chai-02-full-27473-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-02-full-27473-v1-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
Failed to get response for submission chaiml-next-door-annoyi_15417_v1: ('http://chaiml-next-door-annoyi-15417-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '')
Failed to get response for submission chaiml-rirv938-mistral-_56339_v4: HTTPConnectionPool(host='guanaco-model-mesh.k2.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-02-full-27473-v1-mkmlizer: Downloaded to shared memory in 302.799s
junhua024-chai-02-full-27473-v1-mkmlizer: Checking if junhua024/chai_02_full_02_0200 already exists in ChaiML
junhua024-chai-02-full-27473-v1-mkmlizer: Creating repo ChaiML/chai_02_full_02_0200 and uploading /tmp/tmptaj5y9xa to it
Failed to get response for submission blend_fihal_2025-07-23: ('http://guanaco-model-mesh.k2.chaiverse.com/models/chaiml-exp-grpo-cp312-_36146_v14/predict', 'no healthy upstream')
junhua024-chai-02-full-27473-v1-mkmlizer:
0%| | 0/26 [00:00<?, ?it/s]
100%|██████████| 26/26 [00:59<00:00, 2.28s/it]
junhua024-chai-02-full-27473-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:q4, folder:/tmp/tmptaj5y9xa, device:0
junhua024-chai-02-full-27473-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Failed to get response for submission blend_fihal_2025-07-23: HTTPConnectionPool(host='guanaco-model-mesh.k2.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-02-full-27473-v1-mkmlizer: quantized model in 162.251s
junhua024-chai-02-full-27473-v1-mkmlizer: Processed model junhua024/chai_02_full_02_0200 in 551.977s
junhua024-chai-02-full-27473-v1-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-02-full-27473-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-02-full-27473-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-02-full-27473-v1/nvidia
junhua024-chai-02-full-27473-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-02-full-27473-v1/nvidia/tokenizer_config.json
junhua024-chai-02-full-27473-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-02-full-27473-v1/nvidia/tokenizer.json
junhua024-chai-02-full-27473-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-02-full-27473-v1/nvidia/flywheel_model.0.safetensors
junhua024-chai-02-full-27473-v1-mkmlizer:
Loading 0: 0%| | 0/363 [00:00<?, ?it/s]
Loading 0: 100%|█████████▉| 362/363 [02:29<00:00, 3.14it/s]
Job junhua024-chai-02-full-27473-v1-mkmlizer completed after 571.92s with status: succeeded
Stopping job with name junhua024-chai-02-full-27473-v1-mkmlizer
Pipeline stage MKMLizer completed in 572.95s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.19s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-02-full-27473-v1
Waiting for inference service junhua024-chai-02-full-27473-v1 to be ready
Inference service junhua024-chai-02-full-27473-v1 ready after 160.97557187080383s
Pipeline stage MKMLDeployer completed in 161.51s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.819242477416992s
Received healthy response to inference request in 2.718595266342163s
Retrying (%r) after connection broken by '%r': %s
Received healthy response to inference request in 1.5064935684204102s
Received healthy response to inference request in 1.5726425647735596s
Received healthy response to inference request in 2.193681001663208s
5 requests
0 failed requests
5th percentile: 1.51972336769104
10th percentile: 1.5329531669616698
20th percentile: 1.5594127655029297
30th percentile: 1.6968502521514892
40th percentile: 1.9452656269073487
50th percentile: 2.193681001663208
60th percentile: 2.40364670753479
70th percentile: 2.613612413406372
80th percentile: 2.738724708557129
90th percentile: 2.7789835929870605
95th percentile: 2.7991130352020264
99th percentile: 2.815216588973999
mean time: 2.1621309757232665
Pipeline stage StressChecker completed in 12.20s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.72s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.95s
Shutdown handler de-registered
junhua024-chai-02-full-_27473_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3107.97s
Shutdown handler de-registered
junhua024-chai-02-full-_27473_v1 status is now inactive due to auto-deactivation of underperforming models
junhua024-chai-02-full-_27473_v1 status is now torndown due to DeploymentManager action