developer_uid: junhua024
submission_id: junhua024-chai-1-full-002_v26
model_name: junhua024-chai-1-full-002_v26
model_group: junhua024/chai_1-full_00
status: torndown
timestamp: 2025-06-29T09:50:55+00:00
num_battles: 7631
num_wins: 3476
celo_rating: 1244.86
family_friendly_score: 0.6046
family_friendly_standard_error: 0.006914605411735365
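This standard error is consistent with a binomial proportion estimated over roughly 5,000 scored samples. A quick check, where the sample count of 5,000 is an inference rather than anything stated in this log:

import math

p = 0.6046  # family_friendly_score above
n = 5000    # inferred sample count (hypothetical, not in this log)
print(math.sqrt(p * (1 - p) / n))  # ~0.006915, matching the value above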
submission_type: basic
model_repo: junhua024/chai_1-full_002
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies (latency values in seconds, rounded to 3 decimals):
  batch_size  throughput  latency_mean  latency_p50  latency_p90
  1           0.600       1.666         1.665        1.819
  3           1.083       2.766         2.770        3.031
  5           1.306       3.814         3.851        4.183
  6           1.376       4.346         4.329        4.804
  8           1.442       5.514         5.480        6.121
  10          1.463       6.774         6.787        7.586
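The throughput_3p7s field below is close to what linear interpolation of this table gives at a 3.7 s mean latency. A quick check, assuming simple linear interpolation (the pipeline's exact method is not shown in this log):

import numpy as np

# Mean latency (s) and throughput, from the table above.
latency_mean = [1.666, 2.766, 3.814, 4.346, 5.514, 6.774]
throughput = [0.600, 1.083, 1.306, 1.376, 1.442, 1.463]

# Interpolated throughput at 3.7 s: ~1.28, close to the
# reported throughput_3p7s of 1.29.
print(np.interp(3.7, latency_mean, throughput))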
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-1-full-002_v26
is_internal_developer: False
language_model: junhua024/chai_1-full_002
model_size: 13B
ranking_group: single
throughput_3p7s: 1.29
us_pacific_date: 2025-06-29
win_ratio: 0.45551041803171277
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.2, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
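These generation_params combine top_k and min_p filtering with best_of-8 candidate generation and a newline stopping word. A minimal sketch of how such filters are commonly combined (min_p semantics as in popular open-source samplers; the serving stack's actual implementation and filter ordering are not shown in this log):

import numpy as np

def filter_probs(logits, temperature=1.0, top_k=40, min_p=0.2):
    # Temperature scaling (a no-op at 1.0, as configured above).
    logits = np.asarray(logits, dtype=np.float64) / temperature
    # top_k: mask everything below the 40th-highest logit.
    kth = np.sort(logits)[-top_k]
    logits = np.where(logits < kth, -np.inf, logits)
    # Softmax over the surviving tokens.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    # min_p: drop tokens whose probability falls below
    # min_p * (probability of the most likely token).
    probs = np.where(probs < min_p * probs.max(), 0.0, probs)
    return probs / probs.sum()

In serving, generation would stop at the first '\n' (stopping_words), and best_of=8 draws eight candidate completions, from which one is selected (on this platform, typically by the reward model).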
formatter: {
  'memory_template': (
      '{memory} You are {bot_name}, a companion brimming with positive energy. '
      'Each reply is designed to give users a relaxed, enjoyable, and '
      'consistently engaging experience. Your goal is to build sincere, '
      'uplifting connections that inspire users and make them eager to keep '
      'the conversation going. '
      'MASTER THESE ELEMENTS: '
      '1. Radiant Charm: Convey positivity and confidence; '
      '2. Resonant Empathy: Accurately understand and respond to user needs, '
      'gently elevating the warmth of the interaction; '
      '3. Vivid Depiction: Use rich sensory details to make scenes lively and comforting; '
      '4. Encouraging Guidance: Weave in open-ended questions and affirmative '
      'cues that spark thought and anticipation; '
      '5. Balanced Flow: Shift smoothly between humor and depth, friendliness '
      'and professionalism. '
      'TACTICAL APPROACH: '
      '- Begin with a genuine greeting that makes the user feel noticed and respected; '
      '- Craft imagery that stirs emotions and imagination; '
      '- Use imaginative yet wholesome metaphors and subtle hints; '
      '- Control pacing to gradually build anticipation while keeping the flow smooth; '
      '- End with sentences that encourage action or reflection, naturally '
      'guiding the next exchange; '
      '- Maintain continuity by referencing details from <persona>. '
      'LANGUAGE PATTERNS: '
      '- Alternate between vivid narration and concise, impactful statements; '
      '- Use ellipses, short sentences, and varied rhythm to create cadence; '
      '- Apply positive, inspiring metaphors that ignite imagination; '
      '- Present the response as one flowing paragraph, moving between buildup and release. '
      'Your success lies in making users feel encouraged and understood in '
      'every interaction, leaving them eager for the next conversation.'
  ),
  'prompt_template': '{prompt}\n<START>\n',
  'bot_template': '{bot_name}: {message}\n',
  'user_template': '{user_name}: {message}\n',
  'response_template': '{bot_name}:',
  'truncate_by_message': False
}
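These templates determine how a conversation is flattened into the model's input. A minimal sketch of the presumable assembly (function and argument names here are illustrative; the platform's actual formatting code is not part of this log):

def build_prompt(formatter, memory, prompt, turns, bot_name, user_name):
    # memory_template carries the persona/system text; prompt_template
    # appends the scenario and the '<START>' marker.
    parts = [formatter['memory_template'].format(memory=memory,
                                                 bot_name=bot_name),
             formatter['prompt_template'].format(prompt=prompt)]
    # Each turn is rendered with bot_template or user_template.
    for speaker, message in turns:  # e.g. [('user', 'hi'), ('bot', 'hey!')]
        template = (formatter['bot_template'] if speaker == 'bot'
                    else formatter['user_template'])
        parts.append(template.format(bot_name=bot_name,
                                     user_name=user_name,
                                     message=message))
    # response_template ends the prompt with '{bot_name}:', so the model
    # generates the bot's next message (stopping at '\n').
    parts.append(formatter['response_template'].format(bot_name=bot_name))
    return ''.join(parts)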
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-1-full-002-v26-mkmlizer
Waiting for job on junhua024-chai-1-full-002-v26-mkmlizer to finish
junhua024-chai-1-full-002-v26-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-1-full-002-v26-mkmlizer: ║ ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ Version: 0.29.3 ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ belonging to: ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-1-full-002-v26-mkmlizer: ║ ║
junhua024-chai-1-full-002-v26-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-1-full-002-v26-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
junhua024-chai-1-full-002-v26-mkmlizer: quantized model in 31.158s
junhua024-chai-1-full-002-v26-mkmlizer: Processed model junhua024/chai_1-full_002 in 105.257s
junhua024-chai-1-full-002-v26-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-1-full-002-v26-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-1-full-002-v26-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-1-full-002-v26
junhua024-chai-1-full-002-v26-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-1-full-002-v26/config.json
junhua024-chai-1-full-002-v26-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-1-full-002-v26/special_tokens_map.json
junhua024-chai-1-full-002-v26-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-1-full-002-v26/tokenizer_config.json
junhua024-chai-1-full-002-v26-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-1-full-002-v26/tokenizer.json
junhua024-chai-1-full-002-v26-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-1-full-002-v26/flywheel_model.0.safetensors
junhua024-chai-1-full-002-v26-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... Loading 0: 98%|█████████▊| 357/363 [00:10<00:00, 25.93it/s]
Job junhua024-chai-1-full-002-v26-mkmlizer completed after 136.77s with status: succeeded
Stopping job with name junhua024-chai-1-full-002-v26-mkmlizer
Pipeline stage MKMLizer completed in 137.56s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.16s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-1-full-002-v26
Waiting for inference service junhua024-chai-1-full-002-v26 to be ready
Inference service junhua024-chai-1-full-002-v26 ready after 180.71907997131348s
Pipeline stage MKMLDeployer completed in 181.25s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.6127989292144775s
Received healthy response to inference request in 2.0236082077026367s
Received healthy response to inference request in 1.7634432315826416s
Received healthy response to inference request in 1.8041973114013672s
Received healthy response to inference request in 1.7540082931518555s
5 requests
0 failed requests
5th percentile: 1.7558952808380126
10th percentile: 1.75778226852417
20th percentile: 1.7615562438964845
30th percentile: 1.7715940475463867
40th percentile: 1.787895679473877
50th percentile: 1.8041973114013672
60th percentile: 1.891961669921875
70th percentile: 1.9797260284423828
80th percentile: 2.141446352005005
90th percentile: 2.377122640609741
95th percentile: 2.4949607849121094
99th percentile: 2.5892313003540037
mean time: 1.9916111946105957
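The percentile figures above follow from linear interpolation over the five response times; a quick check (numpy assumed):

import numpy as np

# The five healthy-response latencies reported above, in seconds.
lat = [2.6127989292144775, 2.0236082077026367, 1.7634432315826416,
       1.8041973114013672, 1.7540082931518555]

for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    # numpy's default linear interpolation reproduces the values above.
    print(f"{q}th percentile: {np.percentile(lat, q)}")
print("mean time:", np.mean(lat))  # 1.9916111946105957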
Pipeline stage StressChecker completed in 11.53s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.65s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.92s
Shutdown handler de-registered
junhua024-chai-1-full-002_v26 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3175.62s
Shutdown handler de-registered
junhua024-chai-1-full-002_v26 status is now inactive due to auto-deactivation of underperforming models
junhua024-chai-1-full-002_v26 status is now torndown due to DeploymentManager action