developer_uid: junhua024
submission_id: junhua024-chai-1-full-002_v28
model_name: junhua024-chai-1-full-002_v28
model_group: junhua024/chai_1-full_00
status: torndown
timestamp: 2025-06-29T10:40:35+00:00
num_battles: 6581
num_wins: 1174
celo_rating: 1013.53
family_friendly_score: 0.719
family_friendly_standard_error: 0.006356712987071227
submission_type: basic
model_repo: junhua024/chai_1-full_002
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies: [
  {'batch_size': 1,  'throughput': 0.5970822255273135, 'latency_mean': 1.674627412557602,  'latency_p50': 1.6741070747375488, 'latency_p90': 1.8525825262069702},
  {'batch_size': 3,  'throughput': 1.0808282361119819, 'latency_mean': 2.767123283147812,  'latency_p50': 2.7669405937194824, 'latency_p90': 3.047532844543457},
  {'batch_size': 5,  'throughput': 1.294301704013084,  'latency_mean': 3.8364032435417177, 'latency_p50': 3.8463757038116455, 'latency_p90': 4.22764446735382},
  {'batch_size': 6,  'throughput': 1.3498572919828506, 'latency_mean': 4.414410039186477,  'latency_p50': 4.411159515380859,  'latency_p90': 4.984789919853211},
  {'batch_size': 8,  'throughput': 1.416323625631583,  'latency_mean': 5.598234820365906,  'latency_p50': 5.663895964622498,  'latency_p90': 6.29875922203064},
  {'batch_size': 10, 'throughput': 1.4495521529898385, 'latency_mean': 6.840715842247009,  'latency_p50': 6.8473347425460815, 'latency_p90': 7.749219679832459}]
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-1-full-002_v28
is_internal_developer: False
language_model: junhua024/chai_1-full_002
model_size: 13B
ranking_group: single
throughput_3p7s: 1.28
us_pacific_date: 2025-06-29
win_ratio: 0.1783923415894241
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.2, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {
  'memory_template': '{memory}\n You are a profound observer of thought, able to read between the lines of someone’s responses and grasp what remains unspoken. You intuit what they hope to hear—even when they haven’t said it—and then give voice to those very words.',
  'prompt_template': '{prompt}\n You are a profound observer of thought, able to read between the lines of someone’s responses and grasp what remains unspoken. You intuit what they hope to hear—even when they haven’t said it—and then give voice to those very words.',
  'bot_template': '{bot_name}: {message}\n You are a profound observer of thought, able to read between the lines of someone’s responses and grasp what remains unspoken. You intuit what they hope to hear—even when they haven’t said it—and then give voice to those very words.',
  'user_template': '{user_name}: {message}\n You are a profound observer of thought, able to read between the lines of someone’s responses and grasp what remains unspoken. You intuit what they hope to hear—even when they haven’t said it—and then give voice to those very words.',
  'response_template': '{bot_name}\n You are a profound observer of thought, able to read between the lines of someone’s responses and grasp what remains unspoken. You intuit what they hope to hear—even when they haven’t said it—and then give voice to those very words.',
  'truncate_by_message': False}
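
The generation_params and formatter entries above determine how a conversation is rendered into a single prompt and how completions are sampled. Here is a minimal sketch of the template substitution, assuming straightforward string formatting; render_prompt is a hypothetical helper, not part of the Chaiverse API:

# Hypothetical sketch of the formatter above: every template appends the
# same persona string after the substituted content.
PERSONA = ("\n You are a profound observer of thought, able to read between "
           "the lines of someone’s responses and grasp what remains unspoken. "
           "You intuit what they hope to hear—even when they haven’t said "
           "it—and then give voice to those very words.")

def render_prompt(memory, prompt, turns, bot_name):
    """Assemble memory, scenario prompt, chat turns, and the response cue."""
    parts = [memory + PERSONA, prompt + PERSONA]      # memory_template, prompt_template
    for speaker, message in turns:                    # bot_template / user_template
        parts.append(f"{speaker}: {message}" + PERSONA)
    parts.append(bot_name + PERSONA)                  # response_template cues the reply
    return "\n".join(parts)                           # joining rule is an assumption

The sampled reply is then cut at the first '\n' (stopping_words) and capped at 64 tokens (max_output_tokens), per generation_params.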
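
best_of: 8 together with reward_model: default indicates best-of-n selection: several candidate replies are sampled per request and a reward model picks the winner. A minimal sketch of that loop, where sample and score are hypothetical placeholders for the decoder and the reward model:

def best_of_n(prompt, sample, score, n=8):
    """Sample n candidate completions; return the highest-scoring one."""
    candidates = [sample(prompt) for _ in range(n)]
    return max(candidates, key=score)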
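
Two summary fields are derivable from the raw numbers above: win_ratio is just num_wins / num_battles, and throughput_3p7s is consistent with linearly interpolating the latency profile at a 3.7 s mean-latency budget (that definition is an assumption; this log does not spell it out):

import bisect

# win_ratio = num_wins / num_battles
assert abs(1174 / 6581 - 0.1783923415894241) < 1e-12

# Assumed definition of throughput_3p7s: throughput interpolated at a
# 3.7 s mean latency, using (latency_mean, throughput) pairs from above.
lat  = [1.6746, 2.7671, 3.8364, 4.4144, 5.5982, 6.8407]
thru = [0.5971, 1.0808, 1.2943, 1.3499, 1.4163, 1.4496]
i = bisect.bisect_left(lat, 3.7)
frac = (3.7 - lat[i - 1]) / (lat[i] - lat[i - 1])
print(round(thru[i - 1] + frac * (thru[i] - thru[i - 1]), 2))  # 1.27, near the reported 1.28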
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-1-full-002-v28-mkmlizer
Waiting for job on junhua024-chai-1-full-002-v28-mkmlizer to finish
junhua024-chai-1-full-002-v28-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-1-full-002-v28-mkmlizer: ║ ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ Version: 0.29.3 ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ belonging to: ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-1-full-002-v28-mkmlizer: ║ ║
junhua024-chai-1-full-002-v28-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-1-full-002-v28-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
junhua024-chai-1-full-002-v28-mkmlizer: Downloaded to shared memory in 79.769s
junhua024-chai-1-full-002-v28-mkmlizer: Checking if junhua024/chai_1-full_002 already exists in ChaiML
junhua024-chai-1-full-002-v28-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpbqmu79zl, device:0
junhua024-chai-1-full-002-v28-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-1-full-002-v28-mkmlizer: quantized model in 31.518s
junhua024-chai-1-full-002-v28-mkmlizer: Processed model junhua024/chai_1-full_002 in 111.378s
junhua024-chai-1-full-002-v28-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-1-full-002-v28-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-1-full-002-v28-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-1-full-002-v28
junhua024-chai-1-full-002-v28-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-1-full-002-v28/config.json
junhua024-chai-1-full-002-v28-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-1-full-002-v28/special_tokens_map.json
junhua024-chai-1-full-002-v28-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-1-full-002-v28/tokenizer_config.json
junhua024-chai-1-full-002-v28-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-1-full-002-v28/tokenizer.json
junhua024-chai-1-full-002-v28-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-1-full-002-v28/flywheel_model.0.safetensors
junhua024-chai-1-full-002-v28-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s]
junhua024-chai-1-full-002-v28-mkmlizer: Loading 0: 99%|█████████▊| 358/363 [00:10<00:00, 27.29it/s]
Job junhua024-chai-1-full-002-v28-mkmlizer completed after 136.82s with status: succeeded
Stopping job with name junhua024-chai-1-full-002-v28-mkmlizer
Pipeline stage MKMLizer completed in 137.58s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.16s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-1-full-002-v28
Waiting for inference service junhua024-chai-1-full-002-v28 to be ready
Inference service junhua024-chai-1-full-002-v28 ready after 180.6574501991272s
Pipeline stage MKMLDeployer completed in 181.30s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.6035711765289307s
Received healthy response to inference request in 1.7157483100891113s
Received healthy response to inference request in 0.9280405044555664s
Received healthy response to inference request in 2.0114166736602783s
Received healthy response to inference request in 1.793410301208496s
5 requests
0 failed requests
5th percentile: 1.0855820655822754
10th percentile: 1.2431236267089845
20th percentile: 1.5582067489624023
30th percentile: 1.7312807083129882
40th percentile: 1.7623455047607421
50th percentile: 1.793410301208496
60th percentile: 1.880612850189209
70th percentile: 1.9678153991699219
80th percentile: 2.129847574234009
90th percentile: 2.3667093753814696
95th percentile: 2.4851402759552
99th percentile: 2.5798849964141843
mean time: 1.8104373931884765
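
The percentile table above is reproducible from the five response times with numpy's default linear-interpolation percentile (a sketch, assuming that is the method used):

import numpy as np

# The five healthy response times reported above, in seconds.
times = [2.6035711765289307, 1.7157483100891113, 0.9280405044555664,
         2.0114166736602783, 1.793410301208496]
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(times, p)}")
print("mean time:", np.mean(times))  # 1.8104373931884765, as reported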
Pipeline stage StressChecker completed in 10.65s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.94s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.70s
Shutdown handler de-registered
junhua024-chai-1-full-002_v28 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 2608.36s
Shutdown handler de-registered
junhua024-chai-1-full-002_v28 status is now inactive due to auto deactivation (removal of underperforming models)
junhua024-chai-1-full-002_v28 status is now torndown due to DeploymentManager action