developer_uid: junhua024
submission_id: junhua024-chai-1-full-06611_v4
model_name: junhua024-chai-1-full-06611_v4
model_group: junhua024/chai_1-full_06
status: torndown
timestamp: 2025-06-29T18:58:10+00:00
num_battles: 6804
num_wins: 3301
celo_rating: 1268.47
family_friendly_score: 0.5604
family_friendly_standard_error: 0.007019285433717595
submission_type: basic
model_repo: junhua024/chai_1-full_06611
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies: [{'batch_size': 1, 'throughput': 0.5906395258945286, 'latency_mean': 1.6929598677158355, 'latency_p50': 1.7121572494506836, 'latency_p90': 1.8556434154510497}, {'batch_size': 3, 'throughput': 1.0657291653592962, 'latency_mean': 2.8029342246055604, 'latency_p50': 2.79615318775177, 'latency_p90': 3.144796133041382}, {'batch_size': 5, 'throughput': 1.2727467553701397, 'latency_mean': 3.9061718082427976, 'latency_p50': 3.9125618934631348, 'latency_p90': 4.319064331054688}, {'batch_size': 6, 'throughput': 1.3299687583682829, 'latency_mean': 4.491301836967469, 'latency_p50': 4.512812256813049, 'latency_p90': 5.021072292327881}, {'batch_size': 8, 'throughput': 1.3863330959544944, 'latency_mean': 5.725544406175613, 'latency_p50': 5.772651433944702, 'latency_p90': 6.364241361618042}, {'batch_size': 10, 'throughput': 1.4388808448087014, 'latency_mean': 6.883743841648101, 'latency_p50': 6.854722499847412, 'latency_p90': 7.980801320075988}]
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-1-full-06611_v4
is_internal_developer: False
language_model: junhua024/chai_1-full_06611
model_size: 13B
ranking_group: single
throughput_3p7s: 1.25
us_pacific_date: 2025-06-29
win_ratio: 0.4851557907113463
generation_params: {'temperature': 0.9, 'top_p': 0.9, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': '<|im_start|>system\n{memory}<|im_end|>\n', 'prompt_template': '<|im_start|>user\n{prompt}<|im_end|>\n', 'bot_template': '<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n', 'user_template': '<|im_start|>user\n{user_name}: {message}<|im_end|>\n', 'response_template': '<|im_start|>assistant\n{bot_name}:', 'truncate_by_message': False}
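The formatter templates above describe how a conversation is assembled into a ChatML-style prompt before sampling with the generation_params. The following is a minimal sketch of that assembly, assuming a straightforward concatenation of the templates; `build_prompt` and the sample arguments are hypothetical illustrations, and only the template strings, stop condition, and the win-ratio figures come from the record above.

```python
# Sketch (assumed): applying the submission's formatter templates to a chat.
# Only the template strings are taken from the record; build_prompt and any
# example inputs are hypothetical.
formatter = {
    "memory_template": "<|im_start|>system\n{memory}<|im_end|>\n",
    "prompt_template": "<|im_start|>user\n{prompt}<|im_end|>\n",
    "bot_template": "<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n",
    "user_template": "<|im_start|>user\n{user_name}: {message}<|im_end|>\n",
    "response_template": "<|im_start|>assistant\n{bot_name}:",
}

def build_prompt(memory, prompt, turns, bot_name):
    """Concatenate memory, scenario prompt, and chat turns, then open the
    assistant turn so the model completes it (generation stops at a newline,
    at most 64 output tokens per the generation_params above)."""
    text = formatter["memory_template"].format(memory=memory)
    text += formatter["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        if speaker == bot_name:
            text += formatter["bot_template"].format(bot_name=bot_name, message=message)
        else:
            text += formatter["user_template"].format(user_name=speaker, message=message)
    return text + formatter["response_template"].format(bot_name=bot_name)

# Sanity check on the record: win_ratio is num_wins / num_battles.
assert abs(3301 / 6804 - 0.4851557907113463) < 1e-12
```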
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-1-full-06611-v4-mkmlizer
Waiting for job on junhua024-chai-1-full-06611-v4-mkmlizer to finish
junhua024-chai-1-full-06611-v4-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-1-full-06611-v4-mkmlizer: ║ ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ Version: 0.29.3 ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ belonging to: ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-1-full-06611-v4-mkmlizer: ║ ║
junhua024-chai-1-full-06611-v4-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-1-full-06611-v4-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
junhua024-chai-1-full-06611-v4-mkmlizer: Downloaded to shared memory in 78.940s
junhua024-chai-1-full-06611-v4-mkmlizer: Checking if junhua024/chai_1-full_06611 already exists in ChaiML
junhua024-chai-1-full-06611-v4-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpx9a6x4mb, device:0
junhua024-chai-1-full-06611-v4-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-1-full-06611-v4-mkmlizer: quantized model in 30.559s
junhua024-chai-1-full-06611-v4-mkmlizer: Processed model junhua024/chai_1-full_06611 in 109.580s
junhua024-chai-1-full-06611-v4-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-1-full-06611-v4-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-1-full-06611-v4-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-1-full-06611-v4
junhua024-chai-1-full-06611-v4-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-1-full-06611-v4/config.json
junhua024-chai-1-full-06611-v4-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-1-full-06611-v4/special_tokens_map.json
junhua024-chai-1-full-06611-v4-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-1-full-06611-v4/tokenizer_config.json
junhua024-chai-1-full-06611-v4-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-1-full-06611-v4/tokenizer.json
junhua024-chai-1-full-06611-v4-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-1-full-06611-v4/flywheel_model.0.safetensors
junhua024-chai-1-full-06611-v4-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] … Loading 0: 99%|█████████▉| 360/363 [00:10<00:00, 31.70it/s]
Job junhua024-chai-1-full-06611-v4-mkmlizer completed after 137.56s with status: succeeded
Stopping job with name junhua024-chai-1-full-06611-v4-mkmlizer
Pipeline stage MKMLizer completed in 138.09s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.17s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-1-full-06611-v4
Waiting for inference service junhua024-chai-1-full-06611-v4 to be ready
Inference service junhua024-chai-1-full-06611-v4 ready after 191.2642891407013s
Pipeline stage MKMLDeployer completed in 191.80s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 2.562495708465576s
Received healthy response to inference request in 1.7321858406066895s
Received healthy response to inference request in 1.826948642730713s
Received healthy response to inference request in 1.7465856075286865s
5 requests
1 failed request
5th percentile: 1.735065793991089
10th percentile: 1.7379457473754882
20th percentile: 1.743705654144287
30th percentile: 1.7626582145690919
40th percentile: 1.7948034286499024
50th percentile: 1.826948642730713
60th percentile: 2.121167469024658
70th percentile: 2.415386295318603
80th percentile: 6.086472845077518
90th percentile: 13.134427118301392
95th percentile: 16.65840425491333
99th percentile: 19.47758596420288
mean time: 5.610119438171386
%s, retrying in %s seconds...
Received healthy response to inference request in 1.6820979118347168s
Received healthy response to inference request in 1.9260532855987549s
Received healthy response to inference request in 1.737525463104248s
Received healthy response to inference request in 1.7358922958374023s
Received healthy response to inference request in 1.5290865898132324s
5 requests
0 failed requests
5th percentile: 1.5596888542175293
10th percentile: 1.5902911186218263
20th percentile: 1.6514956474304199
30th percentile: 1.692856788635254
40th percentile: 1.7143745422363281
50th percentile: 1.7358922958374023
60th percentile: 1.7365455627441406
70th percentile: 1.737198829650879
80th percentile: 1.7752310276031495
90th percentile: 1.850642156600952
95th percentile: 1.8883477210998534
99th percentile: 1.9185121726989747
mean time: 1.722131109237671
Pipeline stage StressChecker completed in 39.89s
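The percentile figures reported by the StressChecker are consistent with linearly interpolated percentiles over the five response times (the default behaviour of numpy.percentile). A small sketch reproducing the second batch's numbers, under that assumption:

```python
# Sketch (assumed method): the reported percentiles and mean match
# numpy.percentile's default linear interpolation and numpy.mean over the
# five healthy response times from the second StressChecker batch above.
import numpy as np

times = [1.6820979118347168, 1.9260532855987549, 1.737525463104248,
         1.7358922958374023, 1.5290865898132324]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(times, p)}")
print(f"mean time: {np.mean(times)}")
# e.g. 50th percentile -> 1.7358922958374023, 90th -> 1.850642156600952,
# mean time -> 1.7221311092376709, matching the log lines above.
```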
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.70s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.77s
Shutdown handler de-registered
junhua024-chai-1-full-06611_v4 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3421.22s
Shutdown handler de-registered
junhua024-chai-1-full-06611_v4 status is now inactive due to auto deactivation of underperforming models
junhua024-chai-1-full-06611_v4 status is now torndown due to DeploymentManager action