developer_uid: junhua024
submission_id: junhua024-chai-06-full-_59231_v2
model_name: junhua024-chai-06-full-_59231_v2
model_group: junhua024/chai_06_full_0
status: torndown
timestamp: 2025-07-17T18:33:00+00:00
num_battles: 6859
num_wins: 3374
celo_rating: 1278.51
family_friendly_score: 0.5538000000000001
family_friendly_standard_error: 0.0070300150782199615
submission_type: basic
model_repo: junhua024/chai_06_full_02108_2024
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies: [{'batch_size': 1, 'throughput': 0.5999189877712152, 'latency_mean': 1.666760801076889, 'latency_p50': 1.6549627780914307, 'latency_p90': 1.85354642868042}, {'batch_size': 3, 'throughput': 1.0773180912040117, 'latency_mean': 2.777620116472244, 'latency_p50': 2.766732096672058, 'latency_p90': 3.13109815120697}, {'batch_size': 5, 'throughput': 1.303218756326039, 'latency_mean': 3.8243135237693786, 'latency_p50': 3.8076876401901245, 'latency_p90': 4.239580655097962}, {'batch_size': 6, 'throughput': 1.3628307355784233, 'latency_mean': 4.375827633142471, 'latency_p50': 4.361254453659058, 'latency_p90': 4.91368556022644}, {'batch_size': 8, 'throughput': 1.4129407744305065, 'latency_mean': 5.6321450400352475, 'latency_p50': 5.673267126083374, 'latency_p90': 6.347490859031677}, {'batch_size': 10, 'throughput': 1.4473888384700464, 'latency_mean': 6.845866482257843, 'latency_p50': 6.8479450941085815, 'latency_p90': 7.632736158370972}]
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-06-full-_59231_v2
is_internal_developer: False
language_model: junhua024/chai_06_full_02108_2024
model_size: 13B
ranking_group: single
throughput_3p7s: 1.29
us_pacific_date: 2025-07-17
win_ratio: 0.4919084414637702
generation_params: {'temperature': 1.0, 'top_p': 0.88, 'min_p': 0.0, 'top_k': 10, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
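For reference, the formatter and generation_params fields above fully determine how a request to this submission is assembled and sampled. Below is a minimal sketch of that assembly, assuming the templates are filled with str.format-style substitution and conversation turns are appended oldest-first; the build_prompt helper and the example conversation are illustrative, not the actual Chaiverse serving code.

# Sketch of prompt assembly from the formatter fields above. Assumptions:
# str.format-style substitution, turns appended oldest-first, and
# response_template used as the generation prefix. Not the real serving code.

FORMATTER = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

GENERATION_PARAMS = {  # copied verbatim from generation_params above
    "temperature": 1.0, "top_p": 0.88, "min_p": 0.0, "top_k": 10,
    "presence_penalty": 0.0, "frequency_penalty": 0.0,
    "stopping_words": ["\n"], "max_input_tokens": 1024,
    "best_of": 8, "max_output_tokens": 64,
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """Assemble the model input; turns is a list of (speaker, message)."""
    parts = [
        FORMATTER["memory_template"].format(bot_name=bot_name, memory=memory),
        FORMATTER["prompt_template"].format(prompt=prompt),
    ]
    for speaker, message in turns:
        key = "bot_template" if speaker == bot_name else "user_template"
        parts.append(FORMATTER[key].format(bot_name=bot_name,
                                           user_name=user_name,
                                           message=message))
    # response_template primes the model to continue speaking as the bot.
    parts.append(FORMATTER["response_template"].format(bot_name=bot_name))
    return "".join(parts)

print(build_prompt("Aria", "User", "A cheerful barista.", "Coffee-shop RP.",
                   [("User", "Hi!"), ("Aria", "Welcome in!")]))

With best_of: 8, the stack samples eight candidate completions per request and keeps one, presumably the candidate ranked highest by the configured reward_model; and since stopping_words is ['\n'], every reply ends at the first newline, which is consistent with bot_template terminating each message with \n.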
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-06-full-59231-v2-mkmlizer
Waiting for job on junhua024-chai-06-full-59231-v2-mkmlizer to finish
junhua024-chai-06-full-59231-v2-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-06-full-59231-v2-mkmlizer: ║ ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ belonging to: ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-06-full-59231-v2-mkmlizer: ║ ║
junhua024-chai-06-full-59231-v2-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-06-full-59231-v2-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
junhua024-chai-06-full-59231-v2-mkmlizer: Downloaded to shared memory in 137.940s
junhua024-chai-06-full-59231-v2-mkmlizer: Checking if junhua024/chai_06_full_02108_2024 already exists in ChaiML
junhua024-chai-06-full-59231-v2-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp32_subaw, device:0
junhua024-chai-06-full-59231-v2-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-06-full-59231-v2-mkmlizer: quantized model in 32.805s
junhua024-chai-06-full-59231-v2-mkmlizer: Processed model junhua024/chai_06_full_02108_2024 in 170.840s
junhua024-chai-06-full-59231-v2-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-06-full-59231-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-06-full-59231-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-06-full-59231-v2/nvidia
junhua024-chai-06-full-59231-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-06-full-59231-v2/nvidia/config.json
junhua024-chai-06-full-59231-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-06-full-59231-v2/nvidia/special_tokens_map.json
junhua024-chai-06-full-59231-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-06-full-59231-v2/nvidia/tokenizer_config.json
junhua024-chai-06-full-59231-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-06-full-59231-v2/nvidia/tokenizer.json
junhua024-chai-06-full-59231-v2-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-06-full-59231-v2/nvidia/flywheel_model.0.safetensors
junhua024-chai-06-full-59231-v2-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... 98%|█████████▊| 357/363 [00:11<00:00, 24.33it/s]
Job junhua024-chai-06-full-59231-v2-mkmlizer completed after 200.02s with status: succeeded
Stopping job with name junhua024-chai-06-full-59231-v2-mkmlizer
Pipeline stage MKMLizer completed in 200.63s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.17s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-06-full-59231-v2
Waiting for inference service junhua024-chai-06-full-59231-v2 to be ready
Failed to get response for submission junhua024-chai-06-full-_60967_v6: HTTPConnectionPool(host='junhua024-chai-06-full-60967-v6-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Inference service junhua024-chai-06-full-59231-v2 ready after 301.42208552360535s
Pipeline stage MKMLDeployer completed in 302.20s
run pipeline stage %s
Running pipeline stage StressChecker
Failed to get response for submission junhua024-chai-06-full-_60967_v6: HTTPConnectionPool(host='junhua024-chai-06-full-60967-v6-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Received healthy response to inference request in 6.450276613235474s
Received healthy response to inference request in 2.190920114517212s
Received healthy response to inference request in 2.548283338546753s
Received healthy response to inference request in 6.7161829471588135s
Received healthy response to inference request in 3.737779140472412s
5 requests
0 failed requests
5th percentile: 2.26239275932312
10th percentile: 2.3338654041290283
20th percentile: 2.4768106937408447
30th percentile: 2.786182498931885
40th percentile: 3.2619808197021487
50th percentile: 3.737779140472412
60th percentile: 4.822778129577636
70th percentile: 5.907777118682861
80th percentile: 6.503457880020141
90th percentile: 6.609820413589477
95th percentile: 6.663001680374146
99th percentile: 6.70554669380188
mean time: 4.328688430786133
%s, retrying in %s seconds...
Received healthy response to inference request in 6.9441258907318115s
Received healthy response to inference request in 1.6449487209320068s
Received healthy response to inference request in 7.523065805435181s
Received healthy response to inference request in 1.7045161724090576s
Received healthy response to inference request in 1.6332566738128662s
5 requests
0 failed requests
5th percentile: 1.6355950832366943
10th percentile: 1.6379334926605225
20th percentile: 1.6426103115081787
30th percentile: 1.656862211227417
40th percentile: 1.6806891918182374
50th percentile: 1.7045161724090576
60th percentile: 3.800360059738159
70th percentile: 5.89620394706726
80th percentile: 7.059913873672485
90th percentile: 7.291489839553833
95th percentile: 7.407277822494507
99th percentile: 7.499908208847046
mean time: 3.8899826526641847
%s, retrying in %s seconds...
Received healthy response to inference request in 1.674199104309082s
Received healthy response to inference request in 1.9421217441558838s
Received healthy response to inference request in 1.5697929859161377s
Received healthy response to inference request in 1.6522722244262695s
Received healthy response to inference request in 1.798034429550171s
5 requests
0 failed requests
5th percentile: 1.586288833618164
10th percentile: 1.6027846813201905
20th percentile: 1.6357763767242433
30th percentile: 1.656657600402832
40th percentile: 1.665428352355957
50th percentile: 1.674199104309082
60th percentile: 1.7237332344055176
70th percentile: 1.7732673645019532
80th percentile: 1.8268518924713135
90th percentile: 1.8844868183135985
95th percentile: 1.9133042812347412
99th percentile: 1.9363582515716553
mean time: 1.7272840976715087
Pipeline stage StressChecker completed in 54.28s
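The percentile lines in each StressChecker round are consistent with linear interpolation over that round's five latencies, i.e. the default behaviour of numpy.percentile; the sketch below reproduces the first round's figures. The interpolation scheme is inferred from the numbers, not stated in the log.

# Reproduce the first StressChecker round's percentile lines from its five
# observed latencies (seconds). numpy.percentile's default linear
# interpolation matches the logged values.
import numpy as np

latencies = [6.450276613235474, 2.190920114517212, 2.548283338546753,
             6.7161829471588135, 3.737779140472412]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print(f"mean time: {np.mean(latencies)}")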
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.74s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.79s
Shutdown handler de-registered
junhua024-chai-06-full-_59231_v2 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.18s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.14s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service junhua024-chai-06-full-59231-v2-profiler
Waiting for inference service junhua024-chai-06-full-59231-v2-profiler to be ready
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 5349.65s
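One note on the family_friendly_score fields at the top: the logged standard error is exactly what a mean over roughly 5,000 binary judgments would produce, since sqrt(p*(1-p)/n) with p = 0.5538 and n = 5000 gives about 0.00703. The check below is a back-of-envelope inference; the log states neither the sample size nor that the judgments are binary.

# If family_friendly_score is a mean of n Bernoulli (0/1) judgments, its
# standard error is sqrt(p*(1-p)/n). Inverting with the logged values gives
# n ~ 5000. Both the Bernoulli assumption and n are inferred, not logged.
import math

p = 0.5538000000000001
se = 0.0070300150782199615

print("implied n:", p * (1 - p) / se**2)               # ~5000
print("SE at n=5000:", math.sqrt(p * (1 - p) / 5000))  # ~0.0070300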
Shutdown handler de-registered
junhua024-chai-06-full-_59231_v2 status is now inactive due to auto-deactivation of underperforming models
junhua024-chai-06-full-_59231_v2 status is now torndown due to DeploymentManager action