developer_uid: junhua024
submission_id: junhua024-chai-1-full-066126_v91
model_name: junhua024-chai-1-full-066126_v91
model_group: junhua024/chai-1-full-06
status: torndown
timestamp: 2025-07-16T03:06:29+00:00
num_battles: 7758
num_wins: 3723
celo_rating: 1276.09
family_friendly_score: 0.556
family_friendly_standard_error: 0.007026578114559035
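If the family-friendly score is read as a sample proportion, the reported standard error is consistent with roughly 5,000 scored samples. A minimal back-of-envelope check in Python (the binomial model and the implied sample size are inferences, not values from this log):

p = 0.556                  # family_friendly_score
se = 0.007026578114559035  # family_friendly_standard_error

# Under a binomial model, se = sqrt(p * (1 - p) / n); solving for n:
n = p * (1 - p) / se ** 2
print(round(n))  # ~5000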
submission_type: basic
model_repo: junhua024/chai-1-full-066126
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies: [{'batch_size': 1, 'throughput': 0.5949123052575457, 'latency_mean': 1.6808135211467743, 'latency_p50': 1.6871768236160278, 'latency_p90': 1.847775912284851}, {'batch_size': 3, 'throughput': 1.0747053900647858, 'latency_mean': 2.783099205493927, 'latency_p50': 2.780480146408081, 'latency_p90': 3.0767703533172606}, {'batch_size': 5, 'throughput': 1.290014362441466, 'latency_mean': 3.851796588897705, 'latency_p50': 3.8614706993103027, 'latency_p90': 4.325785827636719}, {'batch_size': 6, 'throughput': 1.353659116253372, 'latency_mean': 4.407068272829056, 'latency_p50': 4.419890880584717, 'latency_p90': 4.887210345268249}, {'batch_size': 8, 'throughput': 1.402584728143822, 'latency_mean': 5.661023179292679, 'latency_p50': 5.6874412298202515, 'latency_p90': 6.325840139389038}, {'batch_size': 10, 'throughput': 1.4589375178263617, 'latency_mean': 6.790407216548919, 'latency_p50': 6.791056156158447, 'latency_p90': 7.753618454933166}]
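To within about 1%, the throughput column is simply batch_size / latency_mean, and throughput_3p7s below looks like that curve read off at a mean latency of about 3.7 s (an inference from the numbers, not something the log states). A quick check:

# (batch_size, latency_mean) pairs from the latencies entry above
pairs = [(1, 1.6808), (3, 2.7831), (5, 3.8518),
         (6, 4.4071), (8, 5.6610), (10, 6.7904)]

for batch, lat in pairs:
    # batch 1 -> 0.595 req/s, batch 5 -> 1.298 req/s, batch 10 -> 1.473 req/s,
    # each within ~1% of the reported throughput values
    print(f"batch {batch:2d}: ~{batch / lat:.3f} req/s")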
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-1-full-066126_v91
is_internal_developer: False
language_model: junhua024/chai-1-full-066126
model_size: 13B
ranking_group: single
throughput_3p7s: 1.27
us_pacific_date: 2025-07-15
win_ratio: 0.47989172467130703
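win_ratio is exactly num_wins / num_battles. The second half of this check is an assumption: if battles followed the standard Elo expectation, a 0.48 win rate at this rating would put the average opposition near 1290 (how celo_rating actually pairs models is not documented here):

import math

num_wins, num_battles = 3723, 7758
win_ratio = num_wins / num_battles
print(win_ratio)  # 0.47989172467130703, matching the value above

# Hypothetical: invert E = 1 / (1 + 10 ** ((R_opp - R) / 400)) for R_opp
R = 1276.09
R_opp = R + 400 * math.log10(1 / win_ratio - 1)
print(round(R_opp, 1))  # ~1290.1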
generation_params: {'temperature': 1.0, 'top_p': 0.88, 'min_p': 0.0, 'top_k': 10, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
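These generation_params compose temperature scaling with top-k, top-p (nucleus), and min-p filtering before sampling. A minimal NumPy sketch of that filtering chain, not the serving stack's actual implementation:

import numpy as np

def filter_logits(logits, temperature=1.0, top_k=10, top_p=0.88, min_p=0.0):
    scaled = np.asarray(logits, dtype=np.float64) / temperature
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()

    order = np.argsort(probs)[::-1]        # tokens by descending probability
    keep = np.zeros(probs.shape, dtype=bool)
    keep[order[:top_k]] = True             # top_k: keep the 10 most likely tokens

    cum = np.cumsum(probs[order])
    nucleus = np.zeros(probs.shape, dtype=bool)
    nucleus[order[:int(np.searchsorted(cum, top_p)) + 1]] = True
    keep &= nucleus                        # top_p: smallest prefix with 0.88 mass

    keep &= probs >= min_p * probs.max()   # min_p is 0.0 here, so a no-op

    filtered = np.where(keep, probs, 0.0)
    return filtered / filtered.sum()

rng = np.random.default_rng(0)
next_token = rng.choice(32000, p=filter_logits(rng.normal(size=32000)))

With best_of set to 8, eight such completions are drawn per request and one is kept, presumably ranked by the configured reward model.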
formatter: {'memory_template': '{memory} What subtle feelings lie behind these words?', 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
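The formatter templates assemble each request from persona memory, a scene prompt, the chat history, and an open response slot. A sketch of the assumed assembly order (build_prompt is a hypothetical helper, and the newline after the memory line is a guess, not the pipeline's code):

FORMATTER = {
    'memory_template': '{memory} What subtle feelings lie behind these words?',
    'prompt_template': '{prompt}\n<START>\n',
    'bot_template': '{bot_name}: {message}\n',
    'user_template': '{user_name}: {message}\n',
    'response_template': '{bot_name}:',
}

def build_prompt(memory, prompt, turns, bot_name):
    # turns: (speaker, message, is_bot) triples, oldest first
    parts = [FORMATTER['memory_template'].format(memory=memory) + '\n',
             FORMATTER['prompt_template'].format(prompt=prompt)]
    for speaker, message, is_bot in turns:
        template = FORMATTER['bot_template' if is_bot else 'user_template']
        name_key = 'bot_name' if is_bot else 'user_name'
        parts.append(template.format(**{name_key: speaker, 'message': message}))
    parts.append(FORMATTER['response_template'].format(bot_name=bot_name))
    return ''.join(parts)

print(build_prompt('Kai is a quiet botanist.', 'A rainy afternoon.',
                   [('You', 'Hi Kai.', False), ('Kai', 'Oh! Hello.', True)], 'Kai'))

Note how the stopping word '\n' in generation_params pairs with the newline-terminated message templates: generation halts after a single chat line, with max_output_tokens capping it at 64 tokens otherwise.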
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-1-full-066126-v91-mkmlizer
Waiting for job on junhua024-chai-1-full-066126-v91-mkmlizer to finish
junhua024-chai-1-full-066126-v91-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-1-full-066126-v91-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ belonging to: ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-1-full-066126-v91-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v91-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-1-full-066126-v91-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
junhua024-chai-1-full-066126-v91-mkmlizer: Downloaded to shared memory in 74.469s
junhua024-chai-1-full-066126-v91-mkmlizer: Checking if junhua024/chai-1-full-066126 already exists in ChaiML
junhua024-chai-1-full-066126-v91-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmphsx4hixp, device:0
junhua024-chai-1-full-066126-v91-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-1-full-066126-v91-mkmlizer: quantized model in 31.760s
junhua024-chai-1-full-066126-v91-mkmlizer: Processed model junhua024/chai-1-full-066126 in 106.313s
junhua024-chai-1-full-066126-v91-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-1-full-066126-v91-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-1-full-066126-v91-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v91/nvidia
junhua024-chai-1-full-066126-v91-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v91/nvidia/config.json
junhua024-chai-1-full-066126-v91-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v91/nvidia/special_tokens_map.json
junhua024-chai-1-full-066126-v91-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v91/nvidia/tokenizer_config.json
junhua024-chai-1-full-066126-v91-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v91/nvidia/tokenizer.json
junhua024-chai-1-full-066126-v91-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v91/nvidia/flywheel_model.0.safetensors
junhua024-chai-1-full-066126-v91-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... Loading 0: 98%|█████████▊| 357/363 [00:10<00:00, 24.45it/s]
Job junhua024-chai-1-full-066126-v91-mkmlizer completed after 128.87s with status: succeeded
Stopping job with name junhua024-chai-1-full-066126-v91-mkmlizer
Pipeline stage MKMLizer completed in 129.44s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.18s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-1-full-066126-v91
Waiting for inference service junhua024-chai-1-full-066126-v91 to be ready
Inference service junhua024-chai-1-full-066126-v91 ready after 271.0946810245514s
Pipeline stage MKMLDeployer completed in 271.63s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 3.098212242126465s
Received healthy response to inference request in 1.7387068271636963s
Received healthy response to inference request in 1.8332030773162842s
Received healthy response to inference request in 1.5608947277069092s
5 requests
1 failed requests
5th percentile: 1.5964571475982665
10th percentile: 1.632019567489624
20th percentile: 1.703144407272339
30th percentile: 1.757606077194214
40th percentile: 1.795404577255249
50th percentile: 1.8332030773162842
60th percentile: 2.3392067432403563
70th percentile: 2.8452104091644284
80th percentile: 6.567633581161502
90th percentile: 13.506476259231569
95th percentile: 16.9758975982666
99th percentile: 19.75143466949463
mean time: 5.735267162322998
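These figures are consistent with linearly interpolated percentiles (numpy's default) over all five request times, with the timed-out request entering at about 20.445 s, a value inferred from the reported mean rather than logged:

import numpy as np

# four healthy latencies from above, plus the inferred time of the failed request
times = [1.5608947, 1.7387068, 1.8332031, 3.0982122, 20.4453190]

for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f'{q}th percentile: {np.percentile(times, q):.6f}')
print('mean time:', np.mean(times))  # ~5.735267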
%s, retrying in %s seconds...
Received healthy response to inference request in 1.6072895526885986s
Received healthy response to inference request in 1.9751131534576416s
Received healthy response to inference request in 1.508406639099121s
Received healthy response to inference request in 1.6944222450256348s
Received healthy response to inference request in 1.6498379707336426s
5 requests
0 failed requests
5th percentile: 1.5281832218170166
10th percentile: 1.547959804534912
20th percentile: 1.5875129699707031
30th percentile: 1.6157992362976075
40th percentile: 1.632818603515625
50th percentile: 1.6498379707336426
60th percentile: 1.6676716804504395
70th percentile: 1.6855053901672363
80th percentile: 1.7505604267120363
90th percentile: 1.8628367900848388
95th percentile: 1.91897497177124
99th percentile: 1.9638855171203613
mean time: 1.6870139122009278
Pipeline stage StressChecker completed in 40.96s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.88s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.78s
Shutdown handler de-registered
junhua024-chai-1-full-066126_v91 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.14s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.13s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service junhua024-chai-1-full-066126-v91-profiler
Waiting for inference service junhua024-chai-1-full-066126-v91-profiler to be ready
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 5450.82s
Shutdown handler de-registered
junhua024-chai-1-full-066126_v91 status is now inactive due to auto deactivation of underperforming models
junhua024-chai-1-full-066126_v91 status is now torndown due to DeploymentManager action