developer_uid: junhua024
submission_id: junhua024-chai-06-full_71610_v12
model_name: junhua024-chai-06-full_71610_v12
model_group: junhua024/chai_06_full_0
status: torndown
timestamp: 2025-07-18T05:13:44+00:00
num_battles: 6594
num_wins: 3279
celo_rating: 1273.71
family_friendly_score: 0.5558000000000001
family_friendly_standard_error: 0.0070268963276826565
submission_type: basic
model_repo: junhua024/chai_06_full_02102_1619_2024
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies: [{'batch_size': 1, 'throughput': 0.6005989140941996, 'latency_mean': 1.6648979234695434, 'latency_p50': 1.6580144166946411, 'latency_p90': 1.830347228050232}, {'batch_size': 3, 'throughput': 1.101203078772153, 'latency_mean': 2.720709590911865, 'latency_p50': 2.730709433555603, 'latency_p90': 3.0330628633499144}, {'batch_size': 5, 'throughput': 1.3257458899858117, 'latency_mean': 3.7505234026908876, 'latency_p50': 3.7709919214248657, 'latency_p90': 4.1529449939727785}, {'batch_size': 6, 'throughput': 1.3795119159409261, 'latency_mean': 4.327033559083938, 'latency_p50': 4.328305125236511, 'latency_p90': 4.835097146034241}, {'batch_size': 8, 'throughput': 1.4465453736383422, 'latency_mean': 5.496317850351334, 'latency_p50': 5.5067901611328125, 'latency_p90': 6.091724562644958}, {'batch_size': 10, 'throughput': 1.4943923819303147, 'latency_mean': 6.630015799999237, 'latency_p50': 6.5919530391693115, 'latency_p90': 7.551863169670105}]
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-06-full_71610_v12
is_internal_developer: False
language_model: junhua024/chai_06_full_02102_1619_2024
model_size: 13B
ranking_group: single
throughput_3p7s: 1.32
us_pacific_date: 2025-07-17
win_ratio: 0.49727024567788897
generation_params: {'temperature': 1.0, 'top_p': 0.88, 'min_p': 0.0, 'top_k': 10, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
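The formatter above is a set of plain string templates. Below is a minimal sketch, assuming a straightforward str.format-based assembly (not the actual serving code), of how a persona, scenario prompt, chat history, and response stub might be combined into the text sent to the model; the bot name, user name, and messages are hypothetical, and the assembled text would still be subject to the max_input_tokens limit (1024) before generation. A quick consistency check of the win_ratio field is included.

# Minimal sketch (not the production pipeline) of applying the formatter
# templates listed above. Names and messages below are hypothetical examples.

# Consistency check: win_ratio = num_wins / num_battles from the metadata above.
assert abs(3279 / 6594 - 0.49727024567788897) < 1e-12

formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """Assemble persona, scenario prompt, chat history, and the final response stub."""
    text = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    text += formatter["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        if speaker == "bot":
            text += formatter["bot_template"].format(bot_name=bot_name, message=message)
        else:
            text += formatter["user_template"].format(user_name=user_name, message=message)
    # The model continues from "{bot_name}:" and, per generation_params above,
    # stops at "\n" after at most 64 output tokens.
    return text + formatter["response_template"].format(bot_name=bot_name)

print(build_prompt(
    bot_name="Aria",
    user_name="Traveler",
    memory="A friendly tour guide who loves hidden places.",
    prompt="Aria meets a new visitor at the station.",
    turns=[("bot", "Welcome! Where would you like to go first?"),
           ("user", "Somewhere quiet, please.")],
))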
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-06-full-71610-v12-mkmlizer
Waiting for job on junhua024-chai-06-full-71610-v12-mkmlizer to finish
Failed to get response for submission zmeeks-capitanito-54-2600_v9: HTTPConnectionPool(host='zmeeks-capitanito-54-2600-v9-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-06-full-71610-v12-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-06-full-71610-v12-mkmlizer: ║ ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ belonging to: ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-06-full-71610-v12-mkmlizer: ║ ║
junhua024-chai-06-full-71610-v12-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-06-full-71610-v12-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
Failed to get response for submission chaiml-bat-boys-azeril-_87348_v1: ('http://chaiml-bat-boys-azeril-87348-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '')
Failed to get response for submission zmeeks-capitanito-54-2600_v10: HTTPConnectionPool(host='zmeeks-capitanito-54-2600-v10-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-06-full-71610-v12-mkmlizer: Downloaded to shared memory in 95.723s
junhua024-chai-06-full-71610-v12-mkmlizer: Checking if junhua024/chai_06_full_02102_1619_2024 already exists in ChaiML
junhua024-chai-06-full-71610-v12-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp3eplndmn, device:0
junhua024-chai-06-full-71610-v12-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-06-full-71610-v12-mkmlizer: quantized model in 31.948s
junhua024-chai-06-full-71610-v12-mkmlizer: Processed model junhua024/chai_06_full_02102_1619_2024 in 127.755s
junhua024-chai-06-full-71610-v12-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-06-full-71610-v12-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-06-full-71610-v12-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-06-full-71610-v12/nvidia
junhua024-chai-06-full-71610-v12-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-06-full-71610-v12/nvidia/config.json
junhua024-chai-06-full-71610-v12-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-06-full-71610-v12/nvidia/special_tokens_map.json
junhua024-chai-06-full-71610-v12-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-06-full-71610-v12/nvidia/tokenizer_config.json
junhua024-chai-06-full-71610-v12-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-06-full-71610-v12/nvidia/tokenizer.json
junhua024-chai-06-full-71610-v12-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-06-full-71610-v12/nvidia/flywheel_model.0.safetensors
junhua024-chai-06-full-71610-v12-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s]
junhua024-chai-06-full-71610-v12-mkmlizer: Loading 0: 98%|█████████▊| 357/363 [00:11<00:00, 25.33it/s]
Job junhua024-chai-06-full-71610-v12-mkmlizer completed after 149.31s with status: succeeded
Stopping job with name junhua024-chai-06-full-71610-v12-mkmlizer
Pipeline stage MKMLizer completed in 149.96s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.17s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-06-full-71610-v12
Waiting for inference service junhua024-chai-06-full-71610-v12 to be ready
Failed to get response for submission zmeeks-capitanito-54-3000_v10: HTTPConnectionPool(host='zmeeks-capitanito-54-3000-v10-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission zmeeks-capitanito-54-3000_v11: HTTPConnectionPool(host='zmeeks-capitanito-54-3000-v11-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission chaiml-bat-boys-azeril-_87348_v1: ('http://chaiml-bat-boys-azeril-87348-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '')
Inference service junhua024-chai-06-full-71610-v12 ready after 311.7948248386383s
Pipeline stage MKMLDeployer completed in 312.43s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.509059429168701s
Received healthy response to inference request in 1.5530595779418945s
Received healthy response to inference request in 1.61356782913208s
Received healthy response to inference request in 1.5245616436004639s
Received healthy response to inference request in 1.7319343090057373s
5 requests
0 failed requests
5th percentile: 1.53026123046875
10th percentile: 1.535960817337036
20th percentile: 1.5473599910736084
30th percentile: 1.5651612281799316
40th percentile: 1.5893645286560059
50th percentile: 1.61356782913208
60th percentile: 1.660914421081543
70th percentile: 1.7082610130310059
80th percentile: 1.8873593330383303
90th percentile: 2.1982093811035157
95th percentile: 2.353634405136108
99th percentile: 2.4779744243621824
mean time: 1.7864365577697754
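The stress-check summary above can be reproduced from the five reported response times. A minimal sketch follows, assuming the percentiles are computed with simple linear interpolation over the five samples (as numpy.percentile does by default), which matches the figures printed above.

import numpy as np

# The five healthy response times reported by the StressChecker stage above.
times = [
    2.509059429168701,
    1.5530595779418945,
    1.61356782913208,
    1.5245616436004639,
    1.7319343090057373,
]

# Assumption: percentiles use linear interpolation over the five samples;
# under that assumption the printed values are reproduced exactly.
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(times, p)}")
print(f"mean time: {np.mean(times)}")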
Pipeline stage StressChecker completed in 10.96s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.87s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.71s
Shutdown handler de-registered
junhua024-chai-06-full_71610_v12 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.15s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.11s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service junhua024-chai-06-full-71610-v12-profiler
Waiting for inference service junhua024-chai-06-full-71610-v12-profiler to be ready
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 5362.08s
Shutdown handler de-registered
junhua024-chai-06-full_71610_v12 status is now inactive due to auto deactivation of underperforming models
junhua024-chai-06-full_71610_v12 status is now torndown due to DeploymentManager action