developer_uid: junhua024
submission_id: junhua024-chai-1-full-066126_v22
model_name: junhua024-chai-1-full-066126_v22
model_group: junhua024/chai-1-full-06
status: torndown
timestamp: 2025-07-14T09:09:35+00:00
num_battles: 17882
num_wins: 8445
celo_rating: 1273.02
family_friendly_score: 0.5478
family_friendly_standard_error: 0.007038681126461122
submission_type: basic
model_repo: junhua024/chai-1-full-066126
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies:
  batch_size | throughput | latency_mean (s) | latency_p50 (s) | latency_p90 (s)
  1          | 0.5958     | 1.6781           | 1.6757          | 1.8583
  3          | 1.0753     | 2.7848           | 2.7766          | 3.0555
  5          | 1.3022     | 3.8189           | 3.8059          | 4.2503
  6          | 1.3489     | 4.4187           | 4.4434          | 4.9311
  8          | 1.4218     | 5.5834           | 5.5653          | 6.2533
  10         | 1.4440     | 6.8696           | 6.8943          | 7.8690
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-1-full-066126_v22
is_internal_developer: False
language_model: junhua024/chai-1-full-066126
model_size: 13B
ranking_group: single
throughput_3p7s: 1.29
us_pacific_date: 2025-07-14
win_ratio: 0.47226261044625883
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 10, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': '{memory} What nuanced feelings are behind these words?', 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
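For reference, a minimal sketch of how the formatter and generation_params fields above could drive a single response: render the conversation with the templates, sample best_of=8 completions, and keep the highest-scoring candidate under the reward model. The `generate` and `reward_score` callables are hypothetical stand-ins; the platform's actual serving code is not part of this log.

```python
# Hedged sketch only: `generate` and `reward_score` are hypothetical
# stand-ins for the serving stack; the templates and sampling settings
# mirror the formatter / generation_params fields above.
FORMATTER = {
    "memory_template": "{memory} What nuanced feelings are behind these words?",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_input(memory: str, prompt: str, turns: list[tuple[str, str]], bot_name: str) -> str:
    """Assemble the model input from memory, scenario prompt, and chat turns."""
    parts = [FORMATTER["memory_template"].format(memory=memory),
             FORMATTER["prompt_template"].format(prompt=prompt)]
    for speaker, message in turns:
        if speaker == bot_name:
            parts.append(FORMATTER["bot_template"].format(bot_name=speaker, message=message))
        else:
            parts.append(FORMATTER["user_template"].format(user_name=speaker, message=message))
    parts.append(FORMATTER["response_template"].format(bot_name=bot_name))
    return "".join(parts)

def pick_response(text: str, generate, reward_score, n: int = 8) -> str:
    """best_of sampling: draw n candidates, return the top-scoring one."""
    # generation_params: temperature=1.0, top_p=1.0, top_k=10,
    # stopping_words=['\n'], max_output_tokens=64
    candidates = [generate(text, temperature=1.0, top_p=1.0, top_k=10,
                           stop=["\n"], max_tokens=64) for _ in range(n)]
    return max(candidates, key=reward_score)
```

The stopping_words entry ['\n'] is consistent with the per-line chat templates: each turn ends with a newline, so generation halts after a single bot line.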
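Similarly, the throughput_3p7s value above (1.29) reads like throughput taken from the latencies table at a mean latency of roughly 3.7 s. A minimal sketch, assuming simple linear interpolation over the (latency_mean, throughput) pairs; this yields about 1.28, close to but not exactly the reported value, so the platform's actual derivation may differ:

```python
# Hedged sketch: estimate throughput at a 3.7 s mean latency by linear
# interpolation over the (latency_mean, throughput) pairs from the
# latencies table above. The platform's exact method is not shown in
# this log; this approximation gives ~1.28 vs. the reported 1.29.
import numpy as np

latency_mean = [1.6781, 2.7848, 3.8189, 4.4187, 5.5834, 6.8696]  # seconds
throughput   = [0.5958, 1.0753, 1.3022, 1.3489, 1.4218, 1.4440]

print(round(float(np.interp(3.7, latency_mean, throughput)), 2))  # ~1.28
```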
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-1-full-066126-v22-mkmlizer
Waiting for job on junhua024-chai-1-full-066126-v22-mkmlizer to finish
junhua024-chai-1-full-066126-v22-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-1-full-066126-v22-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ belonging to: ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-1-full-066126-v22-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v22-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-1-full-066126-v22-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
junhua024-chai-1-full-066126-v22-mkmlizer: [identical message repeated 23 more times]
junhua024-chai-1-full-066126-v22-mkmlizer: Downloaded to shared memory in 82.281s
junhua024-chai-1-full-066126-v22-mkmlizer: Checking if junhua024/chai-1-full-066126 already exists in ChaiML
junhua024-chai-1-full-066126-v22-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpnfj6owdq, device:0
junhua024-chai-1-full-066126-v22-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-1-full-066126-v22-mkmlizer: quantized model in 37.551s
junhua024-chai-1-full-066126-v22-mkmlizer: Processed model junhua024/chai-1-full-066126 in 119.910s
junhua024-chai-1-full-066126-v22-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-1-full-066126-v22-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-1-full-066126-v22-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v22/nvidia
junhua024-chai-1-full-066126-v22-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v22/nvidia/config.json
junhua024-chai-1-full-066126-v22-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v22/nvidia/special_tokens_map.json
junhua024-chai-1-full-066126-v22-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v22/nvidia/tokenizer_config.json
junhua024-chai-1-full-066126-v22-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v22/nvidia/tokenizer.json
junhua024-chai-1-full-066126-v22-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v22/nvidia/flywheel_model.0.safetensors
junhua024-chai-1-full-066126-v22-mkmlizer: [weight-loading progress updates collapsed; final frame:] Loading 0: 100%|█████████▉| 362/363 [00:12<00:00, 23.69it/s]
Job junhua024-chai-1-full-066126-v22-mkmlizer completed after 159.97s with status: succeeded
Stopping job with name junhua024-chai-1-full-066126-v22-mkmlizer
Pipeline stage MKMLizer completed in 160.47s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.17s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-1-full-066126-v22
Waiting for inference service junhua024-chai-1-full-066126-v22 to be ready
Inference service junhua024-chai-1-full-066126-v22 ready after 231.06s
Pipeline stage MKMLDeployer completed in 231.61s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 2.4972972869873047s
Received healthy response to inference request in 1.495042085647583s
Received healthy response to inference request in 1.8156726360321045s
Received healthy response to inference request in 1.7614924907684326s
5 requests
1 failed request
5th percentile: 1.5483321666717529
10th percentile: 1.6016222476959228
20th percentile: 1.7082024097442627
30th percentile: 1.772328519821167
40th percentile: 1.7940005779266357
50th percentile: 1.8156726360321045
60th percentile: 2.0883224964141847
70th percentile: 2.3609723567962644
80th percentile: 6.032566881179813
90th percentile: 13.10310606956482
95th percentile: 16.638375663757323
99th percentile: 19.46659133911133
mean time: 5.548629951477051
%s, retrying in %s seconds...
Received healthy response to inference request in 1.759671926498413s
Received healthy response to inference request in 1.4781477451324463s
Received healthy response to inference request in 1.8436429500579834s
Received healthy response to inference request in 1.6119284629821777s
Received healthy response to inference request in 1.5729336738586426s
5 requests
0 failed requests
5th percentile: 1.4971049308776856
10th percentile: 1.516062116622925
20th percentile: 1.5539764881134033
30th percentile: 1.5807326316833497
40th percentile: 1.5963305473327636
50th percentile: 1.6119284629821777
60th percentile: 1.671025848388672
70th percentile: 1.730123233795166
80th percentile: 1.7764661312103271
90th percentile: 1.8100545406341553
95th percentile: 1.8268487453460693
99th percentile: 1.8402841091156006
mean time: 1.6532649517059326
Pipeline stage StressChecker completed in 39.07s
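The percentile readouts above are consistent with linear-interpolation percentiles (NumPy's default method) over each batch of five response times; in the first attempt, the timed-out request appears to enter the statistics at roughly 20.17 s, which accounts for the inflated 80th-99th percentiles and mean. A minimal sketch, under that assumption, reproducing the second run's numbers:

```python
# Hedged sketch: reproduce the second StressChecker run's statistics,
# assuming NumPy's default linear-interpolation percentile method.
import numpy as np

# Healthy response times (s) from the log, in order received.
times = [1.759671926498413, 1.4781477451324463, 1.8436429500579834,
         1.6119284629821777, 1.5729336738586426]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(times, p)}")
print(f"mean time: {np.mean(times)}")
```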
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.91s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.71s
Shutdown handler de-registered
junhua024-chai-1-full-066126_v22 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 6022.26s
Shutdown handler de-registered
junhua024-chai-1-full-066126_v22 status is now inactive due to auto-deactivation of underperforming models
junhua024-chai-1-full-066126_v22 status is now torndown due to DeploymentManager action