developer_uid: junhua024
submission_id: junhua024-chai-1-full-06611_v1
model_name: junhua024-chai-1-full-06611_v1
model_group: junhua024/chai_1-full_06
status: torndown
timestamp: 2025-06-29T16:32:59+00:00
num_battles: 8863
num_wins: 4141
celo_rating: 1258.61
family_friendly_score: 0.559
family_friendly_standard_error: 0.007021666468866205
submission_type: basic
model_repo: junhua024/chai_1-full_06611
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies (per batch size; latency in seconds):
  batch_size  throughput  latency_mean  latency_p50  latency_p90
  1           0.593       1.686         1.689        1.870
  3           1.078       2.771         2.764        3.055
  5           1.301       3.818         3.821        4.293
  6           1.357       4.385         4.350        4.871
  8           1.417       5.600         5.580        6.331
  10          1.448       6.842         6.788        7.851
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-1-full-06611_v1
is_internal_developer: False
language_model: junhua024/chai_1-full_06611
model_size: 13B
ranking_group: single
throughput_3p7s: 1.29
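The throughput_3p7s value is not derived anywhere in this log. One plausible reading (an assumption, not confirmed by the source) is the throughput at a mean latency of 3.7 s, read off the batch-size sweep in the latencies table; a simple linear interpolation lands close to the reported figure:

    # Hypothetical reconstruction of throughput_3p7s from the latencies table above.
    import numpy as np

    latency_mean = [1.686, 2.771, 3.818, 4.385, 5.600, 6.842]   # seconds, per batch size
    throughput   = [0.593, 1.078, 1.301, 1.357, 1.417, 1.448]

    # np.interp expects increasing x-values, which latency_mean provides.
    print(round(float(np.interp(3.7, latency_mean, throughput)), 2))  # ~1.28

This gives roughly 1.28 against the reported 1.29, so the exact method used by the leaderboard (interpolation scheme or latency metric) may differ slightly.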
us_pacific_date: 2025-06-29
win_ratio: 0.46722328782579264
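win_ratio follows directly from the battle counts above:

    # Sanity check: win_ratio == num_wins / num_battles.
    num_battles = 8863
    num_wins = 4141
    print(num_wins / num_battles)  # 0.46722328782579264, matching win_ratio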
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': '<|im_start|>system\n{memory}<|im_end|>\n', 'prompt_template': '<|im_start|>user\n{prompt}<|im_end|>\n', 'bot_template': '<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n', 'user_template': '<|im_start|>user\n{user_name}: {message}<|im_end|>\n', 'response_template': '<|im_start|>assistant\n{bot_name}:', 'truncate_by_message': False}
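The formatter is a ChatML-style template set. Below is a minimal sketch (not the production serving code) of how these templates could assemble a prompt; the character and user names are invented for illustration, and truncation to max_input_tokens as well as best-of-8 candidate selection (presumably scored by the reward model) happen elsewhere in the serving stack.

    # Illustrative only: assemble a prompt from the formatter templates above.
    formatter = {
        'memory_template': '<|im_start|>system\n{memory}<|im_end|>\n',
        'prompt_template': '<|im_start|>user\n{prompt}<|im_end|>\n',
        'bot_template': '<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n',
        'user_template': '<|im_start|>user\n{user_name}: {message}<|im_end|>\n',
        'response_template': '<|im_start|>assistant\n{bot_name}:',
    }

    def build_prompt(memory, prompt, turns, bot_name):
        """turns: list of ('user'|'bot', name, message), oldest first."""
        parts = [formatter['memory_template'].format(memory=memory),
                 formatter['prompt_template'].format(prompt=prompt)]
        for speaker, name, message in turns:
            if speaker == 'bot':
                parts.append(formatter['bot_template'].format(bot_name=name, message=message))
            else:
                parts.append(formatter['user_template'].format(user_name=name, message=message))
        # The model continues from here; stopping_words=['\n'] ends the reply at the first newline.
        parts.append(formatter['response_template'].format(bot_name=bot_name))
        return ''.join(parts)

    # Hypothetical example conversation (names invented for illustration).
    print(build_prompt('Luna is a friendly astronomer.',
                       'Chat about the night sky.',
                       [('user', 'Alex', 'What can we see tonight?')],
                       'Luna'))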
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-1-full-06611-v1-mkmlizer
Waiting for job on junhua024-chai-1-full-06611-v1-mkmlizer to finish
junhua024-chai-1-full-06611-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-1-full-06611-v1-mkmlizer: ║ ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ Version: 0.29.3 ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ belonging to: ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-1-full-06611-v1-mkmlizer: ║ ║
junhua024-chai-1-full-06611-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-1-full-06611-v1-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
junhua024-chai-1-full-06611-v1-mkmlizer: Downloaded to shared memory in 141.077s
junhua024-chai-1-full-06611-v1-mkmlizer: Checking if junhua024/chai_1-full_06611 already exists in ChaiML
junhua024-chai-1-full-06611-v1-mkmlizer: Creating repo ChaiML/chai_1-full_06611 and uploading /tmp/tmp5b2q372k to it
junhua024-chai-1-full-06611-v1-mkmlizer: 100%|██████████| 26/26 [00:59<00:00, 2.30s/it]
junhua024-chai-1-full-06611-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp5b2q372k, device:0
junhua024-chai-1-full-06611-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-1-full-06611-v1-mkmlizer: quantized model in 54.189s
junhua024-chai-1-full-06611-v1-mkmlizer: Processed model junhua024/chai_1-full_06611 in 305.234s
junhua024-chai-1-full-06611-v1-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-1-full-06611-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-1-full-06611-v1/config.json
junhua024-chai-1-full-06611-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-1-full-06611-v1/tokenizer_config.json
junhua024-chai-1-full-06611-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-1-full-06611-v1/tokenizer.json
junhua024-chai-1-full-06611-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-1-full-06611-v1/flywheel_model.0.safetensors
junhua024-chai-1-full-06611-v1-mkmlizer: Loading 0: 100%|█████████▉| 362/363 [00:15<00:00, 21.81it/s]
Job junhua024-chai-1-full-06611-v1-mkmlizer completed after 342.22s with status: succeeded
Stopping job with name junhua024-chai-1-full-06611-v1-mkmlizer
Pipeline stage MKMLizer completed in 342.75s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.21s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-1-full-06611-v1
Waiting for inference service junhua024-chai-1-full-06611-v1 to be ready
Inference service junhua024-chai-1-full-06611-v1 ready after 191.16s
Pipeline stage MKMLDeployer completed in 191.76s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.6353812217712402s
Received healthy response to inference request in 1.93247652053833s
Received healthy response to inference request in 1.6120669841766357s
Received healthy response to inference request in 1.489119052886963s
Received healthy response to inference request in 1.7482962608337402s
5 requests
0 failed requests
5th percentile: 1.5137086391448975
10th percentile: 1.5382982254028321
20th percentile: 1.5874773979187011
30th percentile: 1.6393128395080567
40th percentile: 1.6938045501708985
50th percentile: 1.7482962608337402
60th percentile: 1.8219683647155762
70th percentile: 1.895640468597412
80th percentile: 2.0730574607849124
90th percentile: 2.354219341278076
95th percentile: 2.494800281524658
99th percentile: 2.607265033721924
mean time: 1.883468008041382
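For reference, the percentile summary above is reproducible from the five response times with standard linear interpolation:

    # Reproduce the StressChecker summary from the five measured response times.
    import numpy as np

    times = [2.6353812217712402, 1.93247652053833, 1.6120669841766357,
             1.489119052886963, 1.7482962608337402]

    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f"{p}th percentile: {np.percentile(times, p)}")
    print("mean time:", np.mean(times))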
Pipeline stage StressChecker completed in 11.02s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.68s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.93s
Shutdown handler de-registered
junhua024-chai-1-full-06611_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.14s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.12s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service junhua024-chai-1-full-06611-v1-profiler
Waiting for inference service junhua024-chai-1-full-06611-v1-profiler to be ready
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 5388.76s
Shutdown handler de-registered
junhua024-chai-1-full-06611_v1 status is now inactive due to auto-deactivation of underperforming models
junhua024-chai-1-full-06611_v1 status is now torndown due to DeploymentManager action