developer_uid: junhua024
submission_id: junhua024-chai-1-full-002_v11
model_name: junhua024-chai-1-full-002_v11
model_group: junhua024/chai_1-full_00
status: torndown
timestamp: 2025-06-29T04:41:02+00:00
num_battles: 6826
num_wins: 3137
celo_rating: 1245.93
family_friendly_score: 0.5928
family_friendly_standard_error: 0.006948210704922527
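As a rough sanity check on the error bar: if the family-friendly score is treated as a simple proportion over n independent samples (an assumption; the scoring procedure is not shown in this log), the reported standard error implies n ≈ 5000.

```python
from math import sqrt

p = 0.5928                    # family_friendly_score
se = 0.006948210704922527     # family_friendly_standard_error

# Assuming a binomial standard error, se = sqrt(p * (1 - p) / n),
# the implied sample size is:
n = p * (1 - p) / se ** 2
print(round(n))                  # 5000
print(sqrt(p * (1 - p) / 5000))  # reproduces the reported standard error
```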
submission_type: basic
model_repo: junhua024/chai_1-full_002
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies: [{'batch_size': 1, 'throughput': 0.6008203918661943, 'latency_mean': 1.6642136943340302, 'latency_p50': 1.664075493812561, 'latency_p90': 1.8289497137069701}, {'batch_size': 3, 'throughput': 1.0802466607988337, 'latency_mean': 2.7695375406742095, 'latency_p50': 2.7703323364257812, 'latency_p90': 3.0341217041015622}, {'batch_size': 5, 'throughput': 1.3014552188894835, 'latency_mean': 3.8220145332813265, 'latency_p50': 3.8012924194335938, 'latency_p90': 4.259822368621826}, {'batch_size': 6, 'throughput': 1.3540381209904722, 'latency_mean': 4.404426704645157, 'latency_p50': 4.422246336936951, 'latency_p90': 4.941321277618409}, {'batch_size': 8, 'throughput': 1.4280357156624908, 'latency_mean': 5.554188023805619, 'latency_p50': 5.583679676055908, 'latency_p90': 6.163592076301574}, {'batch_size': 10, 'throughput': 1.4637781534803624, 'latency_mean': 6.76681661605835, 'latency_p50': 6.777976155281067, 'latency_p90': 7.6641926765441895}]
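A quick consistency check on the latency profile, assuming the benchmark keeps each batch size saturated so that throughput × mean latency ≈ batch size (an assumption about how these numbers were measured; values below are abbreviated from the latencies field above):

```python
# (batch_size, throughput, latency_mean) triples, abbreviated from the field above.
profile = [
    (1, 0.6008, 1.6642),
    (3, 1.0802, 2.7695),
    (5, 1.3015, 3.8220),
    (6, 1.3540, 4.4044),
    (8, 1.4280, 5.5542),
    (10, 1.4638, 6.7668),
]
for batch_size, throughput, latency_mean in profile:
    # Each product should land close to the batch size.
    print(batch_size, round(throughput * latency_mean, 2))

# Linear interpolation of throughput at a 3.7 s mean latency gives roughly 1.28,
# in line with the throughput_3p7s field reported below (1.29).
```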
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-1-full-002_v11
is_internal_developer: False
language_model: junhua024/chai_1-full_002
model_size: 13B
ranking_group: single
throughput_3p7s: 1.29
us_pacific_date: 2025-06-28
win_ratio: 0.4595663639027249
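The win ratio is simply num_wins over num_battles:

```python
num_battles = 6826
num_wins = 3137
print(num_wins / num_battles)  # 0.4595663639027249, the reported win_ratio
```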
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.3, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
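For reference, a minimal NumPy sketch of what these sampling parameters do to a logit vector (illustrative only; the serving stack's actual sampler is not shown in this log):

```python
import numpy as np

def filter_probs(logits, temperature=1.0, top_k=40, top_p=1.0, min_p=0.3):
    """Turn raw logits into a filtered sampling distribution (illustrative sketch)."""
    scaled = np.asarray(logits, dtype=np.float64) / max(temperature, 1e-8)
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()

    keep = np.ones(probs.shape, dtype=bool)

    if 0 < top_k < probs.size:
        # top_k: keep only the k highest-probability tokens (ties may keep a few more).
        kth_largest = np.sort(probs)[-top_k]
        keep &= probs >= kth_largest

    if top_p < 1.0:
        # top_p (nucleus): smallest set of tokens whose cumulative probability >= top_p.
        order = np.argsort(probs)[::-1]
        cutoff = np.searchsorted(np.cumsum(probs[order]), top_p) + 1
        nucleus = np.zeros(probs.shape, dtype=bool)
        nucleus[order[:cutoff]] = True
        keep &= nucleus

    if min_p > 0.0:
        # min_p: drop tokens whose probability is below min_p times the top probability.
        keep &= probs >= min_p * probs.max()

    filtered = np.where(keep, probs, 0.0)
    return filtered / filtered.sum()

# With temperature=1.0, top_p=1.0, min_p=0.3, top_k=40 (the values above), sampling is
# restricted to tokens at least 30% as likely as the top token; best_of=8 presumably
# draws eight such candidates and keeps the highest-scoring one under the reward model.
rng = np.random.default_rng(0)
dist = filter_probs(rng.normal(size=32000))
next_token = rng.choice(dist.size, p=dist)
```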
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
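And a small sketch of how the formatter templates compose into a prompt (the persona, names, and messages below are made up for illustration; the real assembly and truncation logic live in the serving code):

```python
memory_template = "{bot_name}'s Persona: {memory}\n####\n"
prompt_template = "{prompt}\n<START>\n"
bot_template = "{bot_name}: {message}\n"
user_template = "{user_name}: {message}\n"
response_template = "{bot_name}:"

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """Assemble the model input from the templates; generation continues after '{bot_name}:'."""
    text = memory_template.format(bot_name=bot_name, memory=memory)
    text += prompt_template.format(prompt=prompt)
    for speaker, message in turns:
        if speaker == "bot":
            text += bot_template.format(bot_name=bot_name, message=message)
        else:
            text += user_template.format(user_name=user_name, message=message)
    return text + response_template.format(bot_name=bot_name)

print(build_prompt(
    bot_name="Mika", user_name="Alex",                        # hypothetical names
    memory="A cheerful barista.", prompt="Morning rush at the cafe.",
    turns=[("bot", "Welcome in! What can I get you?"),
           ("user", "Just a flat white, thanks.")],
))
```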
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-1-full-002-v11-mkmlizer
Waiting for job on junhua024-chai-1-full-002-v11-mkmlizer to finish
junhua024-chai-1-full-002-v11-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-1-full-002-v11-mkmlizer: ║ ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ Version: 0.29.3 ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ belonging to: ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-1-full-002-v11-mkmlizer: ║ ║
junhua024-chai-1-full-002-v11-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-1-full-002-v11-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
junhua024-chai-1-full-002-v11-mkmlizer: Downloaded to shared memory in 79.550s
junhua024-chai-1-full-002-v11-mkmlizer: Checking if junhua024/chai_1-full_002 already exists in ChaiML
junhua024-chai-1-full-002-v11-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp18hds8kn, device:0
junhua024-chai-1-full-002-v11-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-1-full-002-v11-mkmlizer: quantized model in 31.940s
junhua024-chai-1-full-002-v11-mkmlizer: Processed model junhua024/chai_1-full_002 in 111.575s
junhua024-chai-1-full-002-v11-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-1-full-002-v11-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-1-full-002-v11-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-1-full-002-v11
junhua024-chai-1-full-002-v11-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-1-full-002-v11/config.json
junhua024-chai-1-full-002-v11-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-1-full-002-v11/special_tokens_map.json
junhua024-chai-1-full-002-v11-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-1-full-002-v11/tokenizer_config.json
junhua024-chai-1-full-002-v11-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-1-full-002-v11/tokenizer.json
junhua024-chai-1-full-002-v11-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-1-full-002-v11/flywheel_model.0.safetensors
junhua024-chai-1-full-002-v11-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... Loading 0: 98%|█████████▊| 357/363 [00:11<00:00, 24.91it/s]
Job junhua024-chai-1-full-002-v11-mkmlizer completed after 137.42s with status: succeeded
Stopping job with name junhua024-chai-1-full-002-v11-mkmlizer
Pipeline stage MKMLizer completed in 138.05s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.20s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-1-full-002-v11
Waiting for inference service junhua024-chai-1-full-002-v11 to be ready
Inference service junhua024-chai-1-full-002-v11 ready after 181.75087022781372s
Pipeline stage MKMLDeployer completed in 182.41s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.5267930030822754s
Received healthy response to inference request in 1.614577293395996s
Received healthy response to inference request in 1.7803966999053955s
Received healthy response to inference request in 1.5744819641113281s
Received healthy response to inference request in 1.4682235717773438s
5 requests
0 failed requests
5th percentile: 1.4894752502441406
10th percentile: 1.5107269287109375
20th percentile: 1.5532302856445312
30th percentile: 1.5825010299682618
40th percentile: 1.598539161682129
50th percentile: 1.614577293395996
60th percentile: 1.6809050559997558
70th percentile: 1.7472328186035155
80th percentile: 1.9296759605407716
90th percentile: 2.2282344818115236
95th percentile: 2.377513742446899
99th percentile: 2.4969371509552003
mean time: 1.7928945064544677
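The figures above can be reproduced from the five response times using NumPy's default linear-interpolation percentile method (an assumption about how the StressChecker computes them, but the 50th and 90th percentiles match exactly):

```python
import numpy as np

# The five StressChecker response times, in seconds (from the log lines above).
times = [2.5267930030822754, 1.614577293395996, 1.7803966999053955,
         1.5744819641113281, 1.4682235717773438]

for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {np.percentile(times, q)}")
print("mean time:", np.mean(times))
```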
Pipeline stage StressChecker completed in 10.63s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.70s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.92s
Shutdown handler de-registered
junhua024-chai-1-full-002_v11 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.14s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.13s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service junhua024-chai-1-full-002-v11-profiler
Waiting for inference service junhua024-chai-1-full-002-v11-profiler to be ready
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 4886.61s
Shutdown handler de-registered
junhua024-chai-1-full-002_v11 status is now inactive due to auto-deactivation of underperforming models
junhua024-chai-1-full-002_v11 status is now torndown due to DeploymentManager action