developer_uid: junhua024
submission_id: junhua024-chai-1-full_94469_v123
model_name: junhua024-chai-1-full_94469_v123
model_group: junhua024/chai-1-full-06
status: torndown
timestamp: 2025-07-18T05:15:29+00:00
num_battles: 6164
num_wins: 3069
celo_rating: 1272.41
family_friendly_score: 0.5546
family_friendly_standard_error: 0.007028781402206217
submission_type: basic
model_repo: junhua024/chai-1-full-066126
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies: [{'batch_size': 1, 'throughput': 0.5962408957227326, 'latency_mean': 1.6770708537101746, 'latency_p50': 1.679550290107727, 'latency_p90': 1.86271071434021}, {'batch_size': 3, 'throughput': 1.060366634124189, 'latency_mean': 2.81469371676445, 'latency_p50': 2.819006562232971, 'latency_p90': 3.038171672821045}, {'batch_size': 5, 'throughput': 1.2883599771224248, 'latency_mean': 3.8643734002113344, 'latency_p50': 3.8656980991363525, 'latency_p90': 4.358347916603089}, {'batch_size': 6, 'throughput': 1.3457362722814825, 'latency_mean': 4.429242091178894, 'latency_p50': 4.43019163608551, 'latency_p90': 5.056080865859985}, {'batch_size': 8, 'throughput': 1.40503021642848, 'latency_mean': 5.658606249094009, 'latency_p50': 5.621620178222656, 'latency_p90': 6.361812496185303}, {'batch_size': 10, 'throughput': 1.4255939313064983, 'latency_mean': 6.952765865325928, 'latency_p50': 6.881294012069702, 'latency_p90': 7.850258755683899}]
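The `throughput_3p7s` figure further down (1.27) plausibly comes from interpolating the measured (latency_mean, throughput) pairs above at a 3.7 s target latency; the log does not show the exact method, so the sketch below assumes simple piecewise-linear interpolation, which lands near but not exactly on the reported value.

```python
# Hypothetical sketch: linearly interpolate throughput at a target mean
# latency from the measured (latency_mean, throughput) pairs in the log.
# The actual derivation of throughput_3p7s is not shown; this is a guess.
latencies = [
    {"batch_size": 1, "throughput": 0.5962, "latency_mean": 1.6771},
    {"batch_size": 3, "throughput": 1.0604, "latency_mean": 2.8147},
    {"batch_size": 5, "throughput": 1.2884, "latency_mean": 3.8644},
    {"batch_size": 6, "throughput": 1.3457, "latency_mean": 4.4292},
    {"batch_size": 8, "throughput": 1.4050, "latency_mean": 5.6586},
    {"batch_size": 10, "throughput": 1.4256, "latency_mean": 6.9528},
]

def throughput_at(target_latency: float) -> float:
    """Piecewise-linear interpolation of throughput vs. mean latency."""
    pts = sorted((d["latency_mean"], d["throughput"]) for d in latencies)
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= target_latency <= x1:
            frac = (target_latency - x0) / (x1 - x0)
            return y0 + frac * (y1 - y0)
    raise ValueError("target latency outside measured range")

tp_3p7 = throughput_at(3.7)  # close to, but not exactly, the reported 1.27
```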
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-1-full_94469_v123
is_internal_developer: False
language_model: junhua024/chai-1-full-066126
model_size: 13B
ranking_group: single
throughput_3p7s: 1.27
us_pacific_date: 2025-07-17
win_ratio: 0.49789097988319275
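The `win_ratio` above is simply `num_wins / num_battles` from the counters earlier in the header:

```python
# Reproduce win_ratio from the num_battles / num_wins fields above.
num_battles = 6164
num_wins = 3069
win_ratio = num_wins / num_battles  # ≈ 0.4979
```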
generation_params: {'temperature': 1.0, 'top_p': 0.88, 'min_p': 0.0, 'top_k': 10, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
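The `top_k` and `top_p` settings above jointly restrict which tokens are eligible for sampling. The following is a pure-Python illustration of that filtering, not the serving stack's actual implementation:

```python
# Hedged sketch of top_k / top_p filtering as configured above
# (top_k=10, top_p=0.88). Illustrative only.
def filter_probs(probs, top_k=10, top_p=0.88):
    """Keep the top_k most likely tokens, then the smallest prefix of them
    whose cumulative probability reaches top_p; renormalize the rest."""
    ranked = sorted(enumerate(probs), key=lambda kv: kv[1], reverse=True)[:top_k]
    kept, cum = [], 0.0
    for tok, p in ranked:
        kept.append((tok, p))
        cum += p
        if cum >= top_p:
            break
    total = sum(p for _, p in kept)
    return {tok: p / total for tok, p in kept}

# With these example probabilities, tokens 0-2 survive (0.5+0.3+0.15 >= 0.88).
dist = filter_probs([0.5, 0.3, 0.15, 0.05])
```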
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
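The formatter dict above defines how persona, prompt, and chat turns are assembled into the model input. A minimal sketch of that assembly, assuming straightforward `str.format` substitution (the helper `build_prompt` and its example values are hypothetical, not part of the pipeline):

```python
# Hypothetical sketch of prompt assembly from the formatter templates above.
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, memory, prompt, turns):
    parts = [
        formatter["memory_template"].format(bot_name=bot_name, memory=memory),
        formatter["prompt_template"].format(prompt=prompt),
    ]
    for speaker, message in turns:
        if speaker == bot_name:
            parts.append(formatter["bot_template"].format(bot_name=speaker, message=message))
        else:
            parts.append(formatter["user_template"].format(user_name=speaker, message=message))
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)

text = build_prompt("Ava", "a friendly guide", "Greet the user.",
                    [("User", "Hi!"), ("Ava", "Hello!")])
```

The trailing `response_template` leaves the prompt ending in `{bot_name}:`, which together with the `'\n'` stopping word yields exactly one bot turn per generation.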
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-1-full-94469-v123-mkmlizer
Waiting for job on junhua024-chai-1-full-94469-v123-mkmlizer to finish
junhua024-chai-1-full-94469-v123-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-1-full-94469-v123-mkmlizer: ║ ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ belonging to: ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-1-full-94469-v123-mkmlizer: ║ ║
junhua024-chai-1-full-94469-v123-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-1-full-94469-v123-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
Failed to get response for submission chaiml-bat-boys-azeril-_87348_v1: ('http://chaiml-bat-boys-azeril-87348-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '')
Failed to get response for submission zmeeks-capitanito-54-3000_v11: HTTPConnectionPool(host='zmeeks-capitanito-54-3000-v11-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-1-full-94469-v123-mkmlizer: Downloaded to shared memory in 85.889s
junhua024-chai-1-full-94469-v123-mkmlizer: Checking if junhua024/chai-1-full-066126 already exists in ChaiML
junhua024-chai-1-full-94469-v123-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpr3ym1p10, device:0
junhua024-chai-1-full-94469-v123-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-1-full-94469-v123-mkmlizer: quantized model in 37.782s
junhua024-chai-1-full-94469-v123-mkmlizer: Processed model junhua024/chai-1-full-066126 in 123.778s
junhua024-chai-1-full-94469-v123-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-1-full-94469-v123-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-1-full-94469-v123-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-1-full-94469-v123/nvidia
junhua024-chai-1-full-94469-v123-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-1-full-94469-v123/nvidia/config.json
junhua024-chai-1-full-94469-v123-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-1-full-94469-v123/nvidia/special_tokens_map.json
junhua024-chai-1-full-94469-v123-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-1-full-94469-v123/nvidia/tokenizer_config.json
junhua024-chai-1-full-94469-v123-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-1-full-94469-v123/nvidia/tokenizer.json
junhua024-chai-1-full-94469-v123-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-1-full-94469-v123/nvidia/flywheel_model.0.safetensors
junhua024-chai-1-full-94469-v123-mkmlizer: Loading 0: 98%|█████████▊| 357/363 [00:12<00:00, 20.05it/s] (intermediate progress-bar updates elided)
Job junhua024-chai-1-full-94469-v123-mkmlizer completed after 149.43s with status: succeeded
Stopping job with name junhua024-chai-1-full-94469-v123-mkmlizer
Pipeline stage MKMLizer completed in 149.95s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.18s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-1-full-94469-v123
Waiting for inference service junhua024-chai-1-full-94469-v123 to be ready
Failed to get response for submission zmeeks-capitanito-54-3000_v10: HTTPConnectionPool(host='zmeeks-capitanito-54-3000-v10-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission zmeeks-capitanito-54-2800_v5: HTTPConnectionPool(host='zmeeks-capitanito-54-2800-v5-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission zmeeks-capitanito-54-2800_v6: HTTPConnectionPool(host='zmeeks-capitanito-54-2800-v6-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission chaiml-bat-boys-azeril-_87348_v1: ('http://chaiml-bat-boys-azeril-87348-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '')
Inference service junhua024-chai-1-full-94469-v123 ready after 321.3289921283722s
Pipeline stage MKMLDeployer completed in 322.00s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.285862445831299s
Received healthy response to inference request in 1.8761579990386963s
Received healthy response to inference request in 1.5441346168518066s
Received healthy response to inference request in 1.6105523109436035s
Received healthy response to inference request in 1.8995375633239746s
5 requests
0 failed requests
5th percentile: 1.557418155670166
10th percentile: 1.5707016944885255
20th percentile: 1.597268772125244
30th percentile: 1.6636734485626221
40th percentile: 1.769915723800659
50th percentile: 1.8761579990386963
60th percentile: 1.8855098247528077
70th percentile: 1.894861650466919
80th percentile: 1.9768025398254396
90th percentile: 2.1313324928283692
95th percentile: 2.208597469329834
99th percentile: 2.2704094505310057
mean time: 1.8432489871978759
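The StressChecker percentiles above are consistent with linear interpolation over the five sorted response times (numpy-style "linear" percentile semantics); a self-contained sketch reproducing them:

```python
# Reproduce the StressChecker summary statistics from the five response
# times logged above, using linear-interpolation percentiles.
samples = sorted([
    2.285862445831299, 1.8761579990386963, 1.5441346168518066,
    1.6105523109436035, 1.8995375633239746,
])

def percentile(data, p):
    """Percentile via linear interpolation between closest ranks
    (matches numpy.percentile's default method on sorted data)."""
    idx = (p / 100) * (len(data) - 1)
    lo = int(idx)
    hi = min(lo + 1, len(data) - 1)
    return data[lo] + (idx - lo) * (data[hi] - data[lo])

p5 = percentile(samples, 5)        # ≈ 1.5574, as reported
p50 = percentile(samples, 50)      # ≈ 1.8762, as reported
mean = sum(samples) / len(samples)  # ≈ 1.8432, as reported
```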
Pipeline stage StressChecker completed in 10.87s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.71s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 1.00s
Shutdown handler de-registered
junhua024-chai-1-full_94469_v123 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.14s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.13s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service junhua024-chai-1-full-94469-v123-profiler
Waiting for inference service junhua024-chai-1-full-94469-v123-profiler to be ready
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3048.05s
Shutdown handler de-registered
junhua024-chai-1-full_94469_v123 status is now inactive due to auto-deactivation of underperforming models
junhua024-chai-1-full_94469_v123 status is now torndown due to DeploymentManager action