developer_uid: junhua024
submission_id: junhua024-chai-06-full-_72764_v3
model_name: junhua024-chai-06-full-_72764_v3
model_group: junhua024/chai_06_full_0
status: torndown
timestamp: 2025-07-17T11:31:40+00:00
num_battles: 5915
num_wins: 2964
celo_rating: 1278.69
family_friendly_score: 0.5496
family_friendly_standard_error: 0.007036189878051899
submission_type: basic
model_repo: junhua024/chai_06_full_02102_2024
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies: [{'batch_size': 1, 'throughput': 0.5949437511737816, 'latency_mean': 1.6807180333137512, 'latency_p50': 1.6798992156982422, 'latency_p90': 1.8561721563339233}, {'batch_size': 3, 'throughput': 1.0820809234000204, 'latency_mean': 2.759508856534958, 'latency_p50': 2.771775960922241, 'latency_p90': 3.0540833711624145}, {'batch_size': 5, 'throughput': 1.3156729742559279, 'latency_mean': 3.775005030632019, 'latency_p50': 3.7633674144744873, 'latency_p90': 4.198205518722534}, {'batch_size': 6, 'throughput': 1.3745128419547625, 'latency_mean': 4.344774649143219, 'latency_p50': 4.388383865356445, 'latency_p90': 4.853987455368042}, {'batch_size': 8, 'throughput': 1.4356809694425559, 'latency_mean': 5.534211057424545, 'latency_p50': 5.561307787895203, 'latency_p90': 6.176820206642151}, {'batch_size': 10, 'throughput': 1.4724768481778097, 'latency_mean': 6.736873826980591, 'latency_p50': 6.6828625202178955, 'latency_p90': 7.6497371912002565}]
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-06-full-_72764_v3
is_internal_developer: False
language_model: junhua024/chai_06_full_02102_2024
model_size: 13B
ranking_group: single
throughput_3p7s: 1.31
us_pacific_date: 2025-07-17
win_ratio: 0.5010989010989011
generation_params: {'temperature': 1.0, 'top_p': 0.88, 'min_p': 0.0, 'top_k': 10, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
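The formatter and generation_params entries above fully determine how a conversation is serialized and sampled. Below is a minimal sketch of how these templates might be applied client-side: the template strings and sampling values are copied verbatim from the metadata above, while `build_prompt` and its arguments are hypothetical illustration, not part of any Chai or MKML API.

```python
# Template strings and sampling values copied verbatim from the
# submission metadata above; build_prompt is a hypothetical helper.
FORMATTER = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

GENERATION_PARAMS = {
    "temperature": 1.0,
    "top_p": 0.88,
    "min_p": 0.0,
    "top_k": 10,
    "presence_penalty": 0.0,
    "frequency_penalty": 0.0,
    "stopping_words": ["\n"],
    "max_input_tokens": 1024,
    "best_of": 8,
    "max_output_tokens": 64,
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """Assemble the model input: persona block, scenario prompt,
    alternating chat turns, then the response template the model
    is asked to complete."""
    parts = [FORMATTER["memory_template"].format(bot_name=bot_name, memory=memory),
             FORMATTER["prompt_template"].format(prompt=prompt)]
    for speaker, message in turns:
        if speaker == "bot":
            parts.append(FORMATTER["bot_template"].format(bot_name=bot_name, message=message))
        else:
            parts.append(FORMATTER["user_template"].format(user_name=user_name, message=message))
    parts.append(FORMATTER["response_template"].format(bot_name=bot_name))
    return "".join(parts)

if __name__ == "__main__":
    print(build_prompt("Bot", "User", "A friendly assistant.",
                       "A casual chat.", [("user", "Hi!"), ("bot", "Hello!")]))
```

Note that stopping_words ['\n'] combined with the '{bot_name}: {message}\n' turn format means generation halts at the first newline, i.e. after a single chat turn, and best_of: 8 samples eight candidates per request (presumably ranked by the configured reward model, here 'default'). The derived metrics in the header are self-consistent: win_ratio = num_wins / num_battles = 2964 / 5915 ≈ 0.5011, and throughput_3p7s: 1.31 appears to be the batch_size-5 throughput (1.3157…, the point whose latency_mean of ≈3.78 s is closest to 3.7 s) truncated to two decimals.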
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-06-full-72764-v3-mkmlizer
Waiting for job on junhua024-chai-06-full-72764-v3-mkmlizer to finish
junhua024-chai-06-full-72764-v3-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-06-full-72764-v3-mkmlizer: ║ ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ belonging to: ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-06-full-72764-v3-mkmlizer: ║ ║
junhua024-chai-06-full-72764-v3-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-06-full-72764-v3-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
junhua024-chai-06-full-72764-v3-mkmlizer: Downloaded to shared memory in 105.652s
junhua024-chai-06-full-72764-v3-mkmlizer: Checking if junhua024/chai_06_full_02102_2024 already exists in ChaiML
junhua024-chai-06-full-72764-v3-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp7uwjm6rz, device:0
junhua024-chai-06-full-72764-v3-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-06-full-72764-v3-mkmlizer: quantized model in 31.582s
junhua024-chai-06-full-72764-v3-mkmlizer: Processed model junhua024/chai_06_full_02102_2024 in 137.312s
junhua024-chai-06-full-72764-v3-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-06-full-72764-v3-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-06-full-72764-v3/nvidia/special_tokens_map.json
junhua024-chai-06-full-72764-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-06-full-72764-v3/nvidia/tokenizer_config.json
junhua024-chai-06-full-72764-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-06-full-72764-v3/nvidia/tokenizer.json
junhua024-chai-06-full-72764-v3-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-06-full-72764-v3/nvidia/flywheel_model.0.safetensors
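For reference, the four cp lines above amount to the following, shown here as a minimal boto3 sketch; the real pipeline may use a different S3 client, and only the bucket, key prefix, and file names are taken from the log:

```python
# Hedged sketch of the artifact-upload step, assuming plain boto3.
import boto3

s3 = boto3.client("s3")
bucket = "guanaco-mkml-models"
prefix = "junhua024-chai-06-full-72764-v3/nvidia"
for name in ("special_tokens_map.json", "tokenizer_config.json",
             "tokenizer.json", "flywheel_model.0.safetensors"):
    # Mirrors: cp /dev/shm/model_cache/<name> s3://<bucket>/<prefix>/<name>
    s3.upload_file(f"/dev/shm/model_cache/{name}", bucket, f"{prefix}/{name}")
```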
junhua024-chai-06-full-72764-v3-mkmlizer: Loading 0: 99%|█████████▉| 360/363 [00:10<00:00, 31.87it/s]
Job junhua024-chai-06-full-72764-v3-mkmlizer completed after 170.15s with status: succeeded
Stopping job with name junhua024-chai-06-full-72764-v3-mkmlizer
Pipeline stage MKMLizer completed in 170.87s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.18s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-06-full-72764-v3
Waiting for inference service junhua024-chai-06-full-72764-v3 to be ready
Inference service junhua024-chai-06-full-72764-v3 ready after 292.56725907325745s
Pipeline stage MKMLDeployer completed in 293.32s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.4872443675994873s
Received healthy response to inference request in 1.695481300354004s
Received healthy response to inference request in 1.6501891613006592s
Received healthy response to inference request in 1.9147629737854004s
Received healthy response to inference request in 1.795112133026123s
5 requests
0 failed requests
5th percentile: 1.659247589111328
10th percentile: 1.668306016921997
20th percentile: 1.686422872543335
30th percentile: 1.7154074668884278
40th percentile: 1.7552597999572754
50th percentile: 1.795112133026123
60th percentile: 1.842972469329834
70th percentile: 1.8908328056335448
80th percentile: 2.029259252548218
90th percentile: 2.2582518100738525
95th percentile: 2.3727480888366697
99th percentile: 2.464345111846924
mean time: 1.9085579872131349
Pipeline stage StressChecker completed in 10.99s
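The percentile lines above are consistent with linear-interpolation percentiles over the five logged response times (numpy's default method). A minimal sketch to reproduce them; that the pipeline computes them this way is inferred from the numbers, not documented behaviour:

```python
# Reproduces the StressChecker statistics from the five logged
# response times, assuming numpy's default linear interpolation.
import numpy as np

times = [
    2.4872443675994873,
    1.695481300354004,
    1.6501891613006592,
    1.9147629737854004,
    1.795112133026123,
]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(times, p)}")
print(f"mean time: {np.mean(times)}")
```

Run against the five values above, this prints the same 5th-through-99th percentiles and mean (e.g. 5th percentile 1.659247589111328, mean 1.9085579872131349) as the log.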
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.73s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.72s
Shutdown handler de-registered
junhua024-chai-06-full-_72764_v3 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3043.04s
Shutdown handler de-registered
junhua024-chai-06-full-_72764_v3 status is now inactive due to auto deactivation (removal of underperforming models)
junhua024-chai-06-full-_72764_v3 status is now torndown due to DeploymentManager action