developer_uid: junhua024
submission_id: junhua024-chai-02-full-02_v3
model_name: junhua024-chai-02-full-02_v3
model_group: junhua024/chai_02-full_0
status: torndown
timestamp: 2025-07-12T09:34:41+00:00
num_battles: 20664
num_wins: 10057
celo_rating: 1254.49
family_friendly_score: 0.6108
family_friendly_standard_error: 0.006895264461933276
submission_type: basic
model_repo: junhua024/chai_02-full_02
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies:
  batch_size | throughput | latency_mean (s) | latency_p50 (s) | latency_p90 (s)
  1          | 0.598      | 1.673            | 1.678           | 1.842
  3          | 1.089      | 2.747            | 2.742           | 3.044
  5          | 1.299      | 3.833            | 3.840           | 4.264
  6          | 1.372      | 4.340            | 4.348           | 4.844
  8          | 1.417      | 5.603            | 5.606           | 6.282
  10         | 1.458      | 6.785            | 6.805           | 7.669
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-02-full-02_v3
is_internal_developer: False
language_model: junhua024/chai_02-full_02
model_size: 13B
ranking_group: single
throughput_3p7s: 1.29
us_pacific_date: 2025-07-12
win_ratio: 0.4866918312040263
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.2, 'top_k': 10, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
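
A few of the fields above can be sanity-checked directly. A minimal sketch follows (the linear interpolation behind throughput_3p7s and the binomial model behind family_friendly_standard_error are assumptions, not documented formulas; the win_ratio check is exact):

```python
# Hedged sanity checks on the submission metrics above; all input values are
# copied from the fields in this card.
import numpy as np

# win_ratio == num_wins / num_battles
print(10057 / 20664)                         # 0.48669..., matches win_ratio

# throughput_3p7s: plausibly throughput interpolated at a 3.7 s mean-latency
# budget from the latencies table (assumption: linear interpolation).
lat_mean = [1.673, 2.747, 3.833, 4.340, 5.603, 6.785]
throughput = [0.598, 1.089, 1.299, 1.372, 1.417, 1.458]
print(np.interp(3.7, lat_mean, throughput))  # ~1.27, near the reported 1.29

# If family_friendly_score is a sample proportion p with SE = sqrt(p(1-p)/n),
# the reported numbers imply a sample of about 5000 conversations.
p, se = 0.6108, 0.006895264461933276
print(p * (1 - p) / se**2)                   # ~5000.0
```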
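The generation_params map onto standard sampling controls. Below is an illustrative stand-in using vLLM's SamplingParams; the actual serving stack here is MK1 Flywheel, whose API is not shown in this log, so this only conveys the semantics of each knob:

```python
# Illustrative stand-in only: the submission is served via MK1 Flywheel, not
# vLLM. SamplingParams is used because its parameters have the same meaning.
from vllm import SamplingParams

sampling = SamplingParams(
    temperature=1.0,
    top_p=1.0,             # nucleus sampling effectively disabled
    top_k=10,              # restrict sampling to the 10 most likely tokens
    min_p=0.2,             # drop tokens below 20% of the top token's probability
    presence_penalty=0.0,
    frequency_penalty=0.0,
    stop=["\n"],           # stopping_words: end each reply at the first newline
    max_tokens=64,         # max_output_tokens
    n=8,                   # stands in for best_of=8: sample 8 candidates,
)                          # which the reward model then reranks to pick one
```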
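The formatter templates compose the prompt the model actually sees. A minimal sketch of the likely assembly order (persona memory, prompt, chat history, then response_template as the open generation prefix); the order is the conventional Chai layout and an assumption here, and the character and messages are invented:

```python
# Minimal sketch of prompt assembly from the formatter above. Assembly order
# (memory -> prompt -> history -> response prefix) is assumed; the example
# bot, persona, and messages are invented for illustration.
FORMATTER = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, memory, prompt, turns, user_name="You"):
    text = FORMATTER["memory_template"].format(bot_name=bot_name, memory=memory)
    text += FORMATTER["prompt_template"].format(prompt=prompt)
    for role, message in turns:  # turns: [("user" | "bot", message), ...]
        key = "bot_template" if role == "bot" else "user_template"
        text += FORMATTER[key].format(bot_name=bot_name, user_name=user_name,
                                      message=message)
    # response_template is left open-ended so the model continues as the bot
    return text + FORMATTER["response_template"].format(bot_name=bot_name)

print(build_prompt("Nova", "a cheerful ship's AI", "Dialogue aboard the Meridian.",
                   [("user", "Anyone there?"), ("bot", "Right here, captain!")]))
```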
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-02-full-02-v3-mkmlizer
Waiting for job on junhua024-chai-02-full-02-v3-mkmlizer to finish
junhua024-chai-02-full-02-v3-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-02-full-02-v3-mkmlizer: ║ ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ belonging to: ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-02-full-02-v3-mkmlizer: ║ ║
junhua024-chai-02-full-02-v3-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-02-full-02-v3-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet` [warning repeated 25 times]
junhua024-chai-02-full-02-v3-mkmlizer: Downloaded to shared memory in 75.541s
junhua024-chai-02-full-02-v3-mkmlizer: Checking if junhua024/chai_02-full_02 already exists in ChaiML
junhua024-chai-02-full-02-v3-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpkcaiisct, device:0
junhua024-chai-02-full-02-v3-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-02-full-02-v3-mkmlizer: quantized model in 32.838s
junhua024-chai-02-full-02-v3-mkmlizer: Processed model junhua024/chai_02-full_02 in 108.512s
junhua024-chai-02-full-02-v3-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-02-full-02-v3-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-02-full-02-v3-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-02-full-02-v3/nvidia
junhua024-chai-02-full-02-v3-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-02-full-02-v3/nvidia/config.json
junhua024-chai-02-full-02-v3-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-02-full-02-v3/nvidia/special_tokens_map.json
junhua024-chai-02-full-02-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-02-full-02-v3/nvidia/tokenizer_config.json
junhua024-chai-02-full-02-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-02-full-02-v3/nvidia/tokenizer.json
junhua024-chai-02-full-02-v3-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-02-full-02-v3/nvidia/flywheel_model.0.safetensors
junhua024-chai-02-full-02-v3-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... 98%|█████████▊| 357/363 [00:11<00:00, 24.47it/s]
Job junhua024-chai-02-full-02-v3-mkmlizer completed after 136.96s with status: succeeded
Stopping job with name junhua024-chai-02-full-02-v3-mkmlizer
Pipeline stage MKMLizer completed in 137.47s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.14s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-02-full-02-v3
Waiting for inference service junhua024-chai-02-full-02-v3 to be ready
Inference service junhua024-chai-02-full-02-v3 ready after 201.5800440311432s
Pipeline stage MKMLDeployer completed in 202.10s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.3517301082611084s
Received healthy response to inference request in 1.1215617656707764s
Received healthy response to inference request in 0.9433779716491699s
Received healthy response to inference request in 1.5140209197998047s
Received healthy response to inference request in 1.5094072818756104s
5 requests
0 failed requests
5th percentile: 0.9790147304534912
10th percentile: 1.0146514892578125
20th percentile: 1.085925006866455
30th percentile: 1.1991308689117433
40th percentile: 1.3542690753936768
50th percentile: 1.5094072818756104
60th percentile: 1.5112527370452882
70th percentile: 1.5130981922149658
80th percentile: 1.6815627574920655
90th percentile: 2.016646432876587
95th percentile: 2.1841882705688476
99th percentile: 2.3182217407226564
mean time: 1.4880196094512939
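The percentile table above is reproducible from the five healthy response times using NumPy's default linear interpolation; a quick check, with the times copied from the log:

```python
import numpy as np

# The five healthy response times from the stress check above, in seconds.
times = [2.3517301082611084, 1.1215617656707764, 0.9433779716491699,
         1.5140209197998047, 1.5094072818756104]
for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {np.percentile(times, q)}")  # matches the log
print("mean time:", np.mean(times))                        # 1.4880196...
```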
Pipeline stage StressChecker completed in 8.71s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.78s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.89s
Shutdown handler de-registered
junhua024-chai-02-full-02_v3 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3040.93s
Shutdown handler de-registered
junhua024-chai-02-full-02_v3 status is now inactive due to auto-deactivation of underperforming models
junhua024-chai-02-full-02_v3 status is now torndown due to DeploymentManager action