developer_uid: junhua024
submission_id: junhua024-chai-1-full_94469_v133
model_name: junhua024-chai-1-full_94469_v133
model_group: junhua024/chai-1-full-06
status: torndown
timestamp: 2025-07-18T12:24:11+00:00
num_battles: 5619
num_wins: 2763
celo_rating: 1268.36
family_friendly_score: 0.5516
family_friendly_standard_error: 0.007033312732987209
submission_type: basic
model_repo: junhua024/chai-1-full-066126
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies: [{'batch_size': 1, 'throughput': 0.593571382383346, 'latency_mean': 1.6845678544044496, 'latency_p50': 1.692692756652832, 'latency_p90': 1.8519887924194336}, {'batch_size': 3, 'throughput': 1.0558847032501657, 'latency_mean': 2.831224924325943, 'latency_p50': 2.8348522186279297, 'latency_p90': 3.08767671585083}, {'batch_size': 5, 'throughput': 1.2636183058226726, 'latency_mean': 3.9391052508354187, 'latency_p50': 3.948301315307617, 'latency_p90': 4.4386893033981325}, {'batch_size': 6, 'throughput': 1.3243593806769969, 'latency_mean': 4.510546909570694, 'latency_p50': 4.510810494422913, 'latency_p90': 5.0188984155654905}, {'batch_size': 8, 'throughput': 1.3786487397776919, 'latency_mean': 5.764431101083756, 'latency_p50': 5.789395213127136, 'latency_p90': 6.485280537605286}, {'batch_size': 10, 'throughput': 1.4107031041931697, 'latency_mean': 7.034316945075989, 'latency_p50': 7.101789593696594, 'latency_p90': 7.920010757446289}]
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-1-full_94469_v133
is_internal_developer: False
language_model: junhua024/chai-1-full-066126
model_size: 13B
ranking_group: single
throughput_3p7s: 1.23
us_pacific_date: 2025-07-18
win_ratio: 0.49172450613988256
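The derived metrics in this record can be cross-checked from the raw fields. A minimal sketch, assuming win_ratio is simply num_wins / num_battles, that family_friendly_standard_error follows the usual binomial formula sqrt(p*(1-p)/n) (the sample size n is not reported here; the value below is inferred, not stated), and that throughput_3p7s is an interpolated throughput at roughly 3.7 s latency from the sweep in the latencies field (the exact interpolation scheme is not documented):

```python
num_battles = 5619
num_wins = 2763
ff_score = 0.5516
ff_se = 0.007033312732987209

# win_ratio reproduces the reported value exactly.
win_ratio = num_wins / num_battles
print(win_ratio)  # 0.49172450613988256

# Assuming a binomial standard error sqrt(p*(1-p)/n), the implied sample size
# is roughly 5,000 scored samples (an inference, not stated in this record).
implied_n = ff_score * (1 - ff_score) / ff_se**2
print(round(implied_n))  # ~5000

# throughput_3p7s (reported 1.23) looks like throughput at ~3.7 s latency.
# Naive linear interpolation over latency_mean between the batch-3 and
# batch-5 points from the latencies field lands in the same neighbourhood.
(l0, t0) = (2.831224924325943, 1.0558847032501657)   # batch_size 3
(l1, t1) = (3.9391052508354187, 1.2636183058226726)  # batch_size 5
tp_3p7 = t0 + (3.7 - l0) / (l1 - l0) * (t1 - t0)
print(round(tp_3p7, 2))  # ~1.22 vs the reported 1.23 (exact scheme unknown)
```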
generation_params: {'temperature': 1.0, 'top_p': 0.88, 'min_p': 0.0, 'top_k': 10, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
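For readers unfamiliar with the formatter fields, here is a minimal sketch of how these templates plausibly combine into the prompt sent to the model. The actual MKML serving code is not shown in this log, so this is an illustrative reconstruction; the persona, user name, and messages are invented:

```python
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """Assemble persona + scenario + chat history, then cue the bot's reply."""
    text = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    text += formatter["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        template = formatter["bot_template"] if speaker == "bot" else formatter["user_template"]
        name = bot_name if speaker == "bot" else user_name
        # str.format ignores unused keyword arguments, so both templates work here.
        text += template.format(bot_name=name, user_name=name, message=message)
    return text + formatter["response_template"].format(bot_name=bot_name)

# Hypothetical conversation purely for illustration.
print(build_prompt(
    bot_name="Astra", user_name="Sam",
    memory="A cheerful starship pilot.", prompt="Sam meets Astra in the hangar.",
    turns=[("user", "Ready for launch?"), ("bot", "Always."), ("user", "Then let's go.")],
))
```

Sampling then presumably applies the generation_params above (temperature 1.0, top_p 0.88, top_k 10, best_of 8, stopping at "\n", at most 64 output tokens), so each call produces only the bot's next single line.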
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-1-full-94469-v133-mkmlizer
Waiting for job on junhua024-chai-1-full-94469-v133-mkmlizer to finish
junhua024-chai-1-full-94469-v133-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-1-full-94469-v133-mkmlizer: ║ ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ belonging to: ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-1-full-94469-v133-mkmlizer: ║ ║
junhua024-chai-1-full-94469-v133-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-1-full-94469-v133-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
junhua024-chai-1-full-94469-v133-mkmlizer: Downloaded to shared memory in 63.375s
junhua024-chai-1-full-94469-v133-mkmlizer: Checking if junhua024/chai-1-full-066126 already exists in ChaiML
junhua024-chai-1-full-94469-v133-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpiy93y7ll, device:0
junhua024-chai-1-full-94469-v133-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Failed to get response for submission zmeeks-capitanito-53-2000_v13: HTTPConnectionPool(host='zmeeks-capitanito-53-2000-v13-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-1-full-94469-v133-mkmlizer: quantized model in 30.892s
junhua024-chai-1-full-94469-v133-mkmlizer: Processed model junhua024/chai-1-full-066126 in 94.327s
junhua024-chai-1-full-94469-v133-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-1-full-94469-v133-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-1-full-94469-v133-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-1-full-94469-v133/nvidia
junhua024-chai-1-full-94469-v133-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-1-full-94469-v133/nvidia/special_tokens_map.json
junhua024-chai-1-full-94469-v133-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-1-full-94469-v133/nvidia/config.json
junhua024-chai-1-full-94469-v133-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-1-full-94469-v133/nvidia/tokenizer_config.json
junhua024-chai-1-full-94469-v133-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-1-full-94469-v133/nvidia/tokenizer.json
junhua024-chai-1-full-94469-v133-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-1-full-94469-v133/nvidia/flywheel_model.0.safetensors
junhua024-chai-1-full-94469-v133-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... 98%|█████████▊| 357/363 [00:10<00:00, 26.96it/s]
Job junhua024-chai-1-full-94469-v133-mkmlizer completed after 117.17s with status: succeeded
Stopping job with name junhua024-chai-1-full-94469-v133-mkmlizer
Pipeline stage MKMLizer completed in 117.67s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.16s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-1-full-94469-v133
Waiting for inference service junhua024-chai-1-full-94469-v133 to be ready
Failed to get response for submission chaiml-bat-boys-azeril-_87348_v1: ('http://chaiml-bat-boys-azeril-87348-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '')
Failed to get response for submission zmeeks-capitanito-53-2000_v13: HTTPConnectionPool(host='zmeeks-capitanito-53-2000-v13-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission zmeeks-capitanito-54-2800_v10: HTTPConnectionPool(host='zmeeks-capitanito-54-2800-v10-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Inference service junhua024-chai-1-full-94469-v133 ready after 322.426283121109s
Pipeline stage MKMLDeployer completed in 322.98s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.448333978652954s
Received healthy response to inference request in 1.6261639595031738s
Received healthy response to inference request in 1.6056509017944336s
Received healthy response to inference request in 1.8838889598846436s
Received healthy response to inference request in 1.7753660678863525s
5 requests
0 failed requests
5th percentile: 1.6097535133361816
10th percentile: 1.6138561248779297
20th percentile: 1.6220613479614259
30th percentile: 1.6560043811798095
40th percentile: 1.7156852245330811
50th percentile: 1.7753660678863525
60th percentile: 1.818775224685669
70th percentile: 1.8621843814849854
80th percentile: 1.9967779636383058
90th percentile: 2.22255597114563
95th percentile: 2.335444974899292
99th percentile: 2.4257561779022216
mean time: 1.8678807735443115
Pipeline stage StressChecker completed in 11.35s
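The StressChecker statistics above can be reproduced from the five reported response times with standard linear-interpolation percentiles. A minimal sketch using NumPy (the checker's actual implementation is not shown in this log, but the default np.percentile method matches the logged values exactly):

```python
import numpy as np

# The five healthy response times reported above, in seconds.
times = np.array([
    2.448333978652954,
    1.6261639595031738,
    1.6056509017944336,
    1.8838889598846436,
    1.7753660678863525,
])

for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    # np.percentile's default linear interpolation reproduces the log,
    # e.g. the 90th percentile comes out as 2.22255597114563.
    print(f"{q}th percentile: {np.percentile(times, q)}")

print("mean time:", times.mean())  # 1.8678807735443115
```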
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.68s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.76s
Shutdown handler de-registered
junhua024-chai-1-full_94469_v133 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 2996.89s
Shutdown handler de-registered
junhua024-chai-1-full_94469_v133 status is now inactive due to auto deactivation of underperforming models
Service rica40325-v4sft-v1 has been torndown
junhua024-chai-1-full_94469_v133 status is now torndown due to DeploymentManager action