developer_uid: junhua024
submission_id: junhua024-chai-06-full-_34362_v2
model_name: junhua024-chai-06-full-_34362_v2
model_group: junhua024/chai_06_full_0
status: torndown
timestamp: 2025-07-18T22:01:54+00:00
num_battles: 6964
num_wins: 3296
celo_rating: 1268.35
family_friendly_score: 0.5474
family_friendly_standard_error: 0.00703922211611482
submission_type: basic
model_repo: junhua024/chai_06_full_021028_0106
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies: [{'batch_size': 1, 'throughput': 0.602764411508422, 'latency_mean': 1.658899539709091, 'latency_p50': 1.6506624221801758, 'latency_p90': 1.8189516544342041}, {'batch_size': 3, 'throughput': 1.0828039498950692, 'latency_mean': 2.765225486755371, 'latency_p50': 2.7683757543563843, 'latency_p90': 3.000771427154541}, {'batch_size': 5, 'throughput': 1.3093402847208195, 'latency_mean': 3.807856595516205, 'latency_p50': 3.834325909614563, 'latency_p90': 4.2380657434463505}, {'batch_size': 6, 'throughput': 1.3805261471651418, 'latency_mean': 4.3158864736557, 'latency_p50': 4.295570731163025, 'latency_p90': 4.816239309310913}, {'batch_size': 8, 'throughput': 1.4299926504762013, 'latency_mean': 5.539864300489426, 'latency_p50': 5.617244243621826, 'latency_p90': 6.195995903015136}, {'batch_size': 10, 'throughput': 1.4607610333089731, 'latency_mean': 6.794132753610611, 'latency_p50': 6.78315544128418, 'latency_p90': 7.65176146030426}]
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-06-full-_34362_v2
is_internal_developer: False
language_model: junhua024/chai_06_full_021028_0106
model_size: 13B
ranking_group: single
throughput_3p7s: 1.3
us_pacific_date: 2025-07-18
win_ratio: 0.4732912119471568
generation_params: {'temperature': 1.0, 'top_p': 0.88, 'min_p': 0.0, 'top_k': 10, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
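The formatter and generation_params fields above define how a conversation is rendered into a single prompt string and which sampling settings apply at decode time. The sketch below is a minimal illustration of applying those templates to a chat history; the helper function and the example conversation are hypothetical, and the mapping of field names such as stopping_words to a generic sampler argument is an assumption, not the platform's actual serving interface.

```python
# Minimal sketch (not the production serving code): assemble a prompt from the
# formatter templates above and bundle the generation_params. Helper names and
# the example conversation are hypothetical illustrations.
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

generation_params = {
    "temperature": 1.0,
    "top_p": 0.88,
    "min_p": 0.0,
    "top_k": 10,
    "presence_penalty": 0.0,
    "frequency_penalty": 0.0,
    "stop": ["\n"],     # stopping_words above (name mapping is an assumption)
    "max_tokens": 64,   # max_output_tokens above
    "best_of": 8,
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """Apply the templates in order: persona memory, scenario prompt,
    alternating chat turns, then the open-ended response template."""
    text = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    text += formatter["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:  # turns: list of ("user" | "bot", message)
        template = formatter["user_template"] if speaker == "user" else formatter["bot_template"]
        name = user_name if speaker == "user" else bot_name
        text += template.format(user_name=name, bot_name=name, message=message)
    return text + formatter["response_template"].format(bot_name=bot_name)

if __name__ == "__main__":
    print(build_prompt("Bot", "User", "A friendly assistant.", "A casual chat.",
                       [("user", "Hi!"), ("bot", "Hello!"), ("user", "How are you?")]))
```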
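Two of the summary fields can be checked directly from the raw numbers: win_ratio is num_wins / num_battles, and throughput_3p7s matches what linear interpolation of the latencies table gives at a 3.7 s mean latency. The snippet below is a quick sanity check; reading throughput_3p7s as a linear interpolation is an assumption on my part, not documented behaviour.

```python
# Sanity checks on the derived fields above. win_ratio follows directly from
# num_wins / num_battles; throughput_3p7s is consistent with linearly
# interpolating the measured throughput at a mean latency of 3.7 s (assumed method).
num_battles, num_wins = 6964, 3296
print(num_wins / num_battles)            # 0.4732912119... -> matches win_ratio

latencies = [  # (latency_mean, throughput) pairs from the latencies field (rounded)
    (1.6589, 0.6028), (2.7652, 1.0828), (3.8079, 1.3093),
    (4.3159, 1.3805), (5.5399, 1.4300), (6.7941, 1.4608),
]

def throughput_at(target):
    """Piecewise-linear interpolation of throughput over mean latency."""
    for (x0, y0), (x1, y1) in zip(latencies, latencies[1:]):
        if x0 <= target <= x1:
            return y0 + (y1 - y0) * (target - x0) / (x1 - x0)
    raise ValueError("target latency outside measured range")

print(round(throughput_at(3.7), 1))      # 1.3 -> matches throughput_3p7s
```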
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-06-full-34362-v2-mkmlizer
Waiting for job on junhua024-chai-06-full-34362-v2-mkmlizer to finish
Failed to get response for submission chaiml-mistral31-24b-s_69496_v19: HTTPConnectionPool(host='chaiml-mistral31-24b-s-69496-v19-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission chaiml-exp-grpo-cp312-_36146_v12: HTTPConnectionPool(host='chaiml-exp-grpo-cp312-36146-v12-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission blend_fader_2025-07-10: ('http://chaiml-0sw-96p-4ff-chai-45017-v3-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:55796->127.0.0.1:8080: read: connection reset by peer\n')
Failed to get response for submission chaiml-exp-grpo-cp312-_36146_v12: HTTPConnectionPool(host='chaiml-exp-grpo-cp312-36146-v12-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission chaiml-mistral31-24b-s_69496_v19: HTTPConnectionPool(host='chaiml-mistral31-24b-s-69496-v19-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission chaiml-20250611-retune-_1558_v12: HTTPConnectionPool(host='chaiml-20250611-retune-1558-v12-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission chaiml-mistral31-24b-s_69496_v19: HTTPConnectionPool(host='chaiml-mistral31-24b-s-69496-v19-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission junhua024-chai-06-full_30622_v23: HTTPConnectionPool(host='junhua024-chai-06-full-30622-v23-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-06-full-34362-v2-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-06-full-34362-v2-mkmlizer: ║ [MK1 ASCII-art logo] ║
junhua024-chai-06-full-34362-v2-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-06-full-34362-v2-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-06-full-34362-v2-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-06-full-34362-v2-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-06-full-34362-v2-mkmlizer: ║ ║
junhua024-chai-06-full-34362-v2-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-06-full-34362-v2-mkmlizer: ║ belonging to: ║
junhua024-chai-06-full-34362-v2-mkmlizer: ║ ║
junhua024-chai-06-full-34362-v2-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-06-full-34362-v2-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-06-full-34362-v2-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-06-full-34362-v2-mkmlizer: ║ ║
junhua024-chai-06-full-34362-v2-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-06-full-34362-v2-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
Failed to get response for submission chaiml-nis-8b-v1-llama3_48598_v5: HTTPConnectionPool(host='chaiml-nis-8b-v1-llama3-48598-v5-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-06-full-34362-v2-mkmlizer: Downloaded to shared memory in 80.212s
junhua024-chai-06-full-34362-v2-mkmlizer: Checking if junhua024/chai_06_full_021028_0106 already exists in ChaiML
junhua024-chai-06-full-34362-v2-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp7kjozahu, device:0
junhua024-chai-06-full-34362-v2-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Failed to get response for submission chaiml-20250611-retune-_1558_v12: HTTPConnectionPool(host='chaiml-20250611-retune-1558-v12-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-06-full-34362-v2-mkmlizer: quantized model in 32.165s
junhua024-chai-06-full-34362-v2-mkmlizer: Processed model junhua024/chai_06_full_021028_0106 in 112.488s
junhua024-chai-06-full-34362-v2-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-06-full-34362-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-06-full-34362-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-06-full-34362-v2/nvidia
junhua024-chai-06-full-34362-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-06-full-34362-v2/nvidia/config.json
junhua024-chai-06-full-34362-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-06-full-34362-v2/nvidia/special_tokens_map.json
junhua024-chai-06-full-34362-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-06-full-34362-v2/nvidia/tokenizer_config.json
junhua024-chai-06-full-34362-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-06-full-34362-v2/nvidia/tokenizer.json
junhua024-chai-06-full-34362-v2-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-06-full-34362-v2/nvidia/flywheel_model.0.safetensors
junhua024-chai-06-full-34362-v2-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s]
junhua024-chai-06-full-34362-v2-mkmlizer: Loading 0: 98%|█████████▊| 357/363 [00:11<00:00, 24.05it/s]
Job junhua024-chai-06-full-34362-v2-mkmlizer completed after 593.75s with status: succeeded
Stopping job with name junhua024-chai-06-full-34362-v2-mkmlizer
Pipeline stage MKMLizer completed in 594.28s
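The MKMLizer timings above account for themselves almost exactly: download plus quantization matches the reported processing time, and the remainder of the 593.75 s job is presumably the safetensors upload to S3 plus job overhead (that split is an assumption, not something the log states).

```python
# Rough accounting of the MKMLizer job time from the log lines above; attributing
# the remaining time to the S3 upload and teardown is an assumption.
download_s, quantize_s, processed_s, job_s = 80.212, 32.165, 112.488, 593.75
print(processed_s - (download_s + quantize_s))  # ~0.1 s of in-stage overhead
print(job_s - processed_s)                      # ~481 s left for upload and teardown
```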
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.16s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-06-full-34362-v2
Waiting for inference service junhua024-chai-06-full-34362-v2 to be ready
Failed to get response for submission chaiml-20250611-retune-_1558_v12: HTTPConnectionPool(host='chaiml-20250611-retune-1558-v12-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission chaiml-mistral31-24b-s_69496_v19: HTTPConnectionPool(host='chaiml-mistral31-24b-s-69496-v19-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission chaiml-mistral31-24b-s_69496_v19: HTTPConnectionPool(host='chaiml-mistral31-24b-s-69496-v19-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Inference service junhua024-chai-06-full-34362-v2 ready after 332.086407661438s
Pipeline stage MKMLDeployer completed in 332.69s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.522512435913086s
Received healthy response to inference request in 1.710143804550171s
Received healthy response to inference request in 1.7804148197174072s
Received healthy response to inference request in 2.1962010860443115s
Received healthy response to inference request in 1.6893978118896484s
5 requests
0 failed requests
5th percentile: 1.693547010421753
10th percentile: 1.6976962089538574
20th percentile: 1.7059946060180664
30th percentile: 1.724198007583618
40th percentile: 1.7523064136505127
50th percentile: 1.7804148197174072
60th percentile: 1.9467293262481689
70th percentile: 2.1130438327789305
80th percentile: 2.2614633560180666
90th percentile: 2.3919878959655763
95th percentile: 2.457250165939331
99th percentile: 2.5094599819183347
mean time: 1.9797339916229248
Pipeline stage StressChecker completed in 11.60s
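The StressChecker statistics above are reproducible from the five response times with standard linear-interpolation percentiles. Using numpy's default percentile method here is my assumption about how they were computed, but the values line up exactly.

```python
# Reproduce the reported percentiles and mean from the five healthy-response
# latencies above; numpy's default linear-interpolation percentile is assumed.
import numpy as np

times = np.array([2.522512435913086, 1.710143804550171, 1.7804148197174072,
                  2.1962010860443115, 1.6893978118896484])

for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {np.percentile(times, q)}")
print("mean time:", times.mean())
```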
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.80s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.76s
Shutdown handler de-registered
junhua024-chai-06-full-_34362_v2 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3295.73s
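The family_friendly_score and its standard error reported in the header are consistent with a simple binomial standard error sqrt(p(1-p)/n) over roughly 5,000 scored samples. Both the binomial model and the implied sample count are inferences from the numbers, not anything stated in the log.

```python
# Back-of-the-envelope check: the reported standard error matches a binomial
# SE sqrt(p * (1 - p) / n) with n ~= 5000. Model and sample count are assumptions.
from math import sqrt

p, se = 0.5474, 0.00703922211611482
n_implied = p * (1 - p) / se ** 2
print(n_implied)                      # ~5000
print(sqrt(p * (1 - p) / 5000))       # ~0.00704, close to the reported SE
```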
Shutdown handler de-registered
junhua024-chai-06-full-_34362_v2 status is now inactive due to auto deactivation of underperforming models
junhua024-chai-06-full-_34362_v2 status is now torndown due to DeploymentManager action