developer_uid: junhua024
submission_id: junhua024-chai-1-full-066126_v60
model_name: junhua024-chai-1-full-066126_v60
model_group: junhua024/chai-1-full-06
status: torndown
timestamp: 2025-07-15T10:27:36+00:00
num_battles: 8596
num_wins: 4166
celo_rating: 1272.91
family_friendly_score: 0.5542
family_friendly_standard_error: 0.007029400543431851
submission_type: basic
model_repo: junhua024/chai-1-full-066126
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies:
  {'batch_size': 1, 'throughput': 0.5949682848325383, 'latency_mean': 1.680579743385315, 'latency_p50': 1.6686900854110718, 'latency_p90': 1.851362681388855}
  {'batch_size': 3, 'throughput': 1.076117367724448, 'latency_mean': 2.777814255952835, 'latency_p50': 2.8018404245376587, 'latency_p90': 3.0396554231643678}
  {'batch_size': 5, 'throughput': 1.2694052293165587, 'latency_mean': 3.927778632640839, 'latency_p50': 3.901508927345276, 'latency_p90': 4.4432701587677}
  {'batch_size': 6, 'throughput': 1.3329644948276274, 'latency_mean': 4.471781673431397, 'latency_p50': 4.449207425117493, 'latency_p90': 5.027526259422302}
  {'batch_size': 8, 'throughput': 1.4029771442302865, 'latency_mean': 5.658961888551712, 'latency_p50': 5.681788682937622, 'latency_p90': 6.3498996019363405}
  {'batch_size': 10, 'throughput': 1.4260763910582386, 'latency_mean': 6.956271834373474, 'latency_p50': 6.971898555755615, 'latency_p90': 7.860449457168579}
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-1-full-066126_v60
is_internal_developer: False
language_model: junhua024/chai-1-full-066126
model_size: 13B
ranking_group: single
throughput_3p7s: 1.24
us_pacific_date: 2025-07-15
win_ratio: 0.4846440204746394
generation_params: {'temperature': 1.0, 'top_p': 0.88, 'min_p': 0.0, 'top_k': 8, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
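The formatter templates above define how a character's persona, scenario, and chat history are flattened into the single prompt string the model completes; generation continues after "{bot_name}:" and stops at the first newline (see stopping_words in generation_params). A minimal rendering sketch, using the template strings verbatim from the formatter field; the build_prompt helper and the sample conversation are illustrative, not a Chaiverse API:

    formatter = {
        "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
        "prompt_template": "{prompt}\n<START>\n",
        "bot_template": "{bot_name}: {message}\n",
        "user_template": "{user_name}: {message}\n",
        "response_template": "{bot_name}:",
    }

    def build_prompt(bot_name, user_name, memory, prompt, turns):
        """Flatten persona, scenario, and chat turns into one prompt string."""
        text = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
        text += formatter["prompt_template"].format(prompt=prompt)
        for speaker, message in turns:
            template = formatter["bot_template"] if speaker == "bot" else formatter["user_template"]
            text += template.format(bot_name=bot_name, user_name=user_name, message=message)
        # The model is asked to continue from "{bot_name}:".
        return text + formatter["response_template"].format(bot_name=bot_name)

    print(build_prompt("Mina", "Traveler", "a cheerful innkeeper", "A quiet evening at the inn.",
                       [("bot", "Welcome in!"), ("user", "Thanks, what a storm out there.")]))

best_of: 8 together with reward_model: default suggests best-of-N sampling: several candidate completions are drawn per request and a reward model selects the one that is served. A sketch under that assumption; generate and score stand in for the deployed language model and reward model:

    def best_of_n(prompt, generate, score, n=8):
        """Sample n candidate replies and return the highest-scoring one."""
        candidates = [generate(prompt) for _ in range(n)]
        return max(candidates, key=score)

Some summary fields are simple functions of the raw numbers: win_ratio is exactly num_wins / num_battles, and each latencies entry's throughput is roughly batch_size / latency_mean (close but not exact, so throughput is presumably measured directly rather than derived):

    print(4166 / 8596)            # 0.4846440204746394, matches win_ratio exactly
    print(1 / 1.680579743385315)  # ~0.59503, close to the recorded throughput of 0.59497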
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-1-full-066126-v60-mkmlizer
Waiting for job on junhua024-chai-1-full-066126-v60-mkmlizer to finish
junhua024-chai-1-full-066126-v60-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-1-full-066126-v60-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ belonging to: ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-1-full-066126-v60-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v60-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-1-full-066126-v60-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
Failed to get response for submission zmeeks-capitanito-45-e2_v5: HTTPConnectionPool(host='zmeeks-capitanito-45-e2-v5-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-1-full-066126-v60-mkmlizer: quantized model in 38.101s
junhua024-chai-1-full-066126-v60-mkmlizer: Processed model junhua024/chai-1-full-066126 in 116.892s
junhua024-chai-1-full-066126-v60-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-1-full-066126-v60-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-1-full-066126-v60-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v60/nvidia
junhua024-chai-1-full-066126-v60-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v60/nvidia/config.json
junhua024-chai-1-full-066126-v60-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v60/nvidia/special_tokens_map.json
junhua024-chai-1-full-066126-v60-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v60/nvidia/tokenizer_config.json
junhua024-chai-1-full-066126-v60-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v60/nvidia/tokenizer.json
junhua024-chai-1-full-066126-v60-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v60/nvidia/flywheel_model.0.safetensors
junhua024-chai-1-full-066126-v60-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... Loading 0: 98%|█████████▊| 357/363 [00:12<00:00, 23.10it/s]
Job junhua024-chai-1-full-066126-v60-mkmlizer completed after 150.62s with status: succeeded
Stopping job with name junhua024-chai-1-full-066126-v60-mkmlizer
Pipeline stage MKMLizer completed in 151.69s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.16s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-1-full-066126-v60
Waiting for inference service junhua024-chai-1-full-066126-v60 to be ready
Inference service junhua024-chai-1-full-066126-v60 ready after 241.96445035934448s
Pipeline stage MKMLDeployer completed in 242.73s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.6303775310516357s
Received healthy response to inference request in 1.5358126163482666s
Received healthy response to inference request in 1.6381285190582275s
Received healthy response to inference request in 1.8425467014312744s
Received healthy response to inference request in 2.1056532859802246s
5 requests
0 failed requests
5th percentile: 1.5562757968902587
10th percentile: 1.5767389774322509
20th percentile: 1.6176653385162354
30th percentile: 1.679012155532837
40th percentile: 1.7607794284820557
50th percentile: 1.8425467014312744
60th percentile: 1.9477893352508544
70th percentile: 2.0530319690704344
80th percentile: 2.2105981349945067
90th percentile: 2.4204878330230715
95th percentile: 2.5254326820373536
99th percentile: 2.6093885612487795
mean time: 1.9505037307739257
Pipeline stage StressChecker completed in 11.63s
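The percentile figures above follow from linear interpolation over the five healthy response times; a minimal check, assuming numpy's default linear interpolation method:

    import numpy as np

    # Response times (seconds) from the five StressChecker requests above.
    times = [2.6303775310516357, 1.5358126163482666, 1.6381285190582275,
             1.8425467014312744, 2.1056532859802246]

    for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f"{q}th percentile: {np.percentile(times, q)}")
    print("mean time:", np.mean(times))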
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.75s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.98s
Shutdown handler de-registered
junhua024-chai-1-full-066126_v60 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 5513.30s
Shutdown handler de-registered
junhua024-chai-1-full-066126_v60 status is now inactive due to auto deactivation (removal of underperforming models)
junhua024-chai-1-full-066126_v60 status is now torndown due to DeploymentManager action