developer_uid: junhua024
submission_id: junhua024-chai-16-full-_74386_v6
model_name: junhua024-chai-16-full-_74386_v6
model_group: junhua024/chai_16_full_1
status: torndown
timestamp: 2025-07-19T15:05:27+00:00
num_battles: 7404
num_wins: 3809
celo_rating: 1280.81
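Note: celo_rating appears to follow the Elo convention. A minimal sketch of the standard expected-score formula, assuming the usual 400-point logistic scale (the platform's exact rating update is not shown in this log):

# Sketch: standard Elo expected-score formula (assumption: celo_rating
# uses the conventional 400-point logistic scale).
def elo_expected_score(r_a: float, r_b: float) -> float:
    """Probability that a player rated r_a beats a player rated r_b."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))

# e.g. this 1280.81-rated model vs. a hypothetical 1250-rated opponent:
print(elo_expected_score(1280.81, 1250.0))  # ~0.544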
family_friendly_score: 0.5522
family_friendly_standard_error: 0.007032427177013638
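Note: the reported standard error is consistent with a binomial standard error, sqrt(p * (1 - p) / n), over roughly 5,000 scored samples. A minimal check (n = 5000 is inferred from the numbers above, not stated in this log):

import math

# Sketch: family_friendly_standard_error matches a binomial standard error.
# n = 5000 is an inference from the reported values, not logged anywhere.
p = 0.5522
n = 5000
print(math.sqrt(p * (1 - p) / n))  # 0.007032427177013638 -- matches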
submission_type: basic
model_repo: junhua024/chai_16_full_102_o_ffn_1925
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies: [{'batch_size': 1, 'throughput': 0.5957648422688411, 'latency_mean': 1.67835022687912, 'latency_p50': 1.6762744188308716, 'latency_p90': 1.859789776802063}, {'batch_size': 3, 'throughput': 1.054439869281961, 'latency_mean': 2.8376157200336456, 'latency_p50': 2.836238741874695, 'latency_p90': 3.1457369327545166}, {'batch_size': 5, 'throughput': 1.2689177414746373, 'latency_mean': 3.9260709750652314, 'latency_p50': 3.9117809534072876, 'latency_p90': 4.423241233825683}, {'batch_size': 6, 'throughput': 1.3303901868485926, 'latency_mean': 4.476535472869873, 'latency_p50': 4.4929540157318115, 'latency_p90': 5.101133108139038}, {'batch_size': 8, 'throughput': 1.3904773206918954, 'latency_mean': 5.722205840349197, 'latency_p50': 5.693834066390991, 'latency_p90': 6.375933361053467}, {'batch_size': 10, 'throughput': 1.4105622472786037, 'latency_mean': 7.034717104434967, 'latency_p50': 7.071342349052429, 'latency_p90': 8.077599549293518}]
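Note: the throughput_3p7s figure reported below (1.24) is roughly what linear interpolation of the (latency_mean, throughput) pairs above yields at a 3.7 s latency budget. A sketch assuming that interpolation rule (the platform's exact method may differ slightly):

# Sketch: estimate throughput at a 3.7 s mean-latency budget by linear
# interpolation over the (latency_mean, throughput) pairs logged above
# (values rounded to 3 decimals). The exact derivation rule is an assumption.
points = [
    (1.678, 0.596), (2.838, 1.054), (3.926, 1.269),
    (4.477, 1.330), (5.722, 1.390), (7.035, 1.411),
]

def throughput_at(target_latency: float) -> float:
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= target_latency <= x1:
            t = (target_latency - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise ValueError("target latency outside measured range")

print(round(throughput_at(3.7), 2))  # ~1.22, close to the reported 1.24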
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-16-full-_74386_v6
is_internal_developer: False
language_model: junhua024/chai_16_full_102_o_ffn_1925
model_size: 13B
ranking_group: single
throughput_3p7s: 1.24
us_pacific_date: 2025-07-19
win_ratio: 0.5144516477579687
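Note: win_ratio is simply the raw win fraction over ranked battles; a one-line check:

# Sketch: win_ratio = num_wins / num_battles, using the counts logged above.
print(3809 / 7404)  # 0.5144516477579687 -- matches win_ratio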
generation_params: {'temperature': 1.0, 'top_p': 0.88, 'min_p': 0.0, 'top_k': 10, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
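Note: a minimal sketch of one decoding step under these generation_params, applying temperature scaling, then top-k, then top-p (nucleus) filtering. The logits and the sample_token helper are hypothetical; the serving stack's actual sampling kernel is not shown in this log. best_of=8 means eight completions are drawn this way and one is selected by the reward model.

import numpy as np

# Sketch of one sampling step: temperature, top-k, then top-p filtering.
# Hypothetical helper; not the serving stack's actual implementation.
def sample_token(logits, temperature=1.0, top_k=10, top_p=0.88, seed=None):
    rng = np.random.default_rng(seed)
    logits = np.asarray(logits, dtype=np.float64) / temperature
    # Top-k: mask everything below the k-th largest logit.
    kth_largest = np.sort(logits)[-top_k]
    logits = np.where(logits >= kth_largest, logits, -np.inf)
    # Softmax over the survivors (masked entries get zero probability).
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    # Top-p: keep the smallest prefix of tokens whose cumulative mass
    # reaches top_p, then renormalize and sample.
    order = np.argsort(probs)[::-1]
    cutoff = np.searchsorted(np.cumsum(probs[order]), top_p) + 1
    keep = order[:cutoff]
    return int(rng.choice(keep, p=probs[keep] / probs[keep].sum()))

print(sample_token(np.random.default_rng(0).normal(size=32000), seed=0))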
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
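Note: a minimal sketch of how these formatter templates could assemble a prompt. build_prompt and its arguments are hypothetical, mirroring the template placeholders; the server's actual assembly and truncation logic is assumed, not logged.

# Sketch: assembling a prompt from the formatter templates above.
FORMATTER = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    # turns: list of ("bot" | "user", message) pairs, oldest first.
    parts = [
        FORMATTER["memory_template"].format(bot_name=bot_name, memory=memory),
        FORMATTER["prompt_template"].format(prompt=prompt),
    ]
    for speaker, message in turns:
        tpl = FORMATTER["bot_template"] if speaker == "bot" else FORMATTER["user_template"]
        parts.append(tpl.format(bot_name=bot_name, user_name=user_name, message=message))
    parts.append(FORMATTER["response_template"].format(bot_name=bot_name))
    return "".join(parts)

print(build_prompt("Aria", "Sam", "a witty pilot", "A hangar at dawn.",
                   [("user", "Ready to fly?"), ("bot", "Always.")]))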
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-16-full-74386-v6-mkmlizer
Waiting for job on junhua024-chai-16-full-74386-v6-mkmlizer to finish
junhua024-chai-16-full-74386-v6-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-16-full-74386-v6-mkmlizer: ║ ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ belonging to: ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-16-full-74386-v6-mkmlizer: ║ ║
junhua024-chai-16-full-74386-v6-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
Failed to get response for submission junhua024-chai-16-full-_74386_v3: HTTPConnectionPool(host='junhua024-chai-16-full-74386-v3-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-16-full-74386-v6-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
junhua024-chai-16-full-74386-v6-mkmlizer: quantized model in 30.205s
junhua024-chai-16-full-74386-v6-mkmlizer: Processed model junhua024/chai_16_full_102_o_ffn_1925 in 107.366s
junhua024-chai-16-full-74386-v6-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-16-full-74386-v6-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-16-full-74386-v6-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-16-full-74386-v6/nvidia
junhua024-chai-16-full-74386-v6-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-16-full-74386-v6/nvidia/config.json
junhua024-chai-16-full-74386-v6-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-16-full-74386-v6/nvidia/special_tokens_map.json
junhua024-chai-16-full-74386-v6-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-16-full-74386-v6/nvidia/tokenizer_config.json
junhua024-chai-16-full-74386-v6-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-16-full-74386-v6/nvidia/tokenizer.json
junhua024-chai-16-full-74386-v6-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-16-full-74386-v6/nvidia/flywheel_model.0.safetensors
junhua024-chai-16-full-74386-v6-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s]
junhua024-chai-16-full-74386-v6-mkmlizer: Loading 0: 98%|█████████▊| 357/363 [00:09<00:00, 26.41it/s]
Job junhua024-chai-16-full-74386-v6-mkmlizer completed after 139.84s with status: succeeded
Stopping job with name junhua024-chai-16-full-74386-v6-mkmlizer
Pipeline stage MKMLizer completed in 140.34s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.16s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-16-full-74386-v6
Waiting for inference service junhua024-chai-16-full-74386-v6 to be ready
Failed to get response for submission junhua024-chai-16-full-_74386_v2: HTTPConnectionPool(host='junhua024-chai-16-full-74386-v2-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission junhua024-chai-16-full-_74386_v4: HTTPConnectionPool(host='junhua024-chai-16-full-74386-v4-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission junhua024-chai-16-full-_74386_v1: HTTPConnectionPool(host='junhua024-chai-16-full-74386-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission chaiml-nis-qwen32b-sim_98336_v34: HTTPConnectionPool(host='chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Inference service junhua024-chai-16-full-74386-v6 ready after 331.7070415019989s
Pipeline stage MKMLDeployer completed in 332.28s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.535053253173828s
Received healthy response to inference request in 1.5507535934448242s
Received healthy response to inference request in 2.175352096557617s
Received healthy response to inference request in 2.07924485206604s
Received healthy response to inference request in 2.2603354454040527s
5 requests
0 failed requests
5th percentile: 1.6564518451690673
10th percentile: 1.7621500968933106
20th percentile: 1.973546600341797
30th percentile: 2.0984663009643554
40th percentile: 2.1369091987609865
50th percentile: 2.175352096557617
60th percentile: 2.2093454360961915
70th percentile: 2.243338775634766
80th percentile: 2.315279006958008
90th percentile: 2.425166130065918
95th percentile: 2.480109691619873
99th percentile: 2.524064540863037
mean time: 2.1201478481292724
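Note: the percentiles above are reproducible from the five response times using linearly interpolated quantiles (NumPy's default method); a quick check:

import numpy as np

# Sketch: StressChecker percentiles match numpy's default linearly
# interpolated quantiles over the five response times logged above.
times = [2.535053253173828, 1.5507535934448242, 2.175352096557617,
         2.07924485206604, 2.2603354454040527]
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(times, p)}")
print("mean time:", np.mean(times))  # 2.1201478481292724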
Pipeline stage StressChecker completed in 12.67s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.83s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.71s
Shutdown handler de-registered
junhua024-chai-16-full-_74386_v6 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.15s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.14s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service junhua024-chai-16-full-74386-v6-profiler
Waiting for inference service junhua024-chai-16-full-74386-v6-profiler to be ready
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 4880.19s
Shutdown handler de-registered
junhua024-chai-16-full-_74386_v6 status is now inactive due to auto deactivation of underperforming models
junhua024-chai-16-full-_74386_v6 status is now torndown due to DeploymentManager action