developer_uid: junhua024
submission_id: junhua024-chai-1-full-002_v7
model_name: junhua024-chai-1-full-002_v7
model_group: junhua024/chai_1-full_00
status: torndown
timestamp: 2025-06-29T01:16:45+00:00
num_battles: 5762
num_wins: 2639
celo_rating: 1249.11
family_friendly_score: 0.5824
family_friendly_standard_error: 0.0069743851341892505
submission_type: basic
model_repo: junhua024/chai_1-full_002
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies (latency_* columns in seconds):
    batch_size  throughput          latency_mean        latency_p50         latency_p90
    1           0.5936531470695178  1.684362895488739   1.6774836778640747  1.848348569869995
    3           1.0774366476724166  2.779968489408493   2.754416346549988   3.0546260833740235
    5           1.2809632653769016  3.888007072210312   3.8742549419403076  4.338921785354614
    6           1.3424501238590134  4.447542771100998   4.467798829078674   4.962753272056579
    8           1.390835771843456   5.700652407407761   5.74461817741394    6.383651804924011
    10          1.4244974733177194  6.9691355240345     6.939785122871399   7.862961912155151
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-1-full-002_v7
is_internal_developer: False
language_model: junhua024/chai_1-full_002
model_size: 13B
ranking_group: single
throughput_3p7s: 1.26
us_pacific_date: 2025-06-28
win_ratio: 0.4580006942034016
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 20, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
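
Read together, the formatter and generation_params fields describe how a chat turn is served: the persona, scene prompt, and message history are flattened into one text prompt ending in the response prefix, and best_of=8 candidates are sampled with the listed settings, presumably reranked by the configured reward model. Below is a minimal sketch of that flow; llm_generate and reward_model are hypothetical stand-ins (as are the example names), not the platform's actual API.

# Sketch of the formatter + generation_params above. llm_generate() and
# reward_model.score() are hypothetical placeholders for the serving stack.
MEMORY_TEMPLATE = "{bot_name}'s Persona: {memory}\n####\n"
PROMPT_TEMPLATE = "{prompt}\n<START>\n"
BOT_TEMPLATE = "{bot_name}: {message}\n"
USER_TEMPLATE = "{user_name}: {message}\n"
RESPONSE_TEMPLATE = "{bot_name}:"

def build_prompt(bot_name, user_name, memory, prompt, history):
    """Flatten persona, scene prompt, and history into the text the model sees."""
    parts = [
        MEMORY_TEMPLATE.format(bot_name=bot_name, memory=memory),
        PROMPT_TEMPLATE.format(prompt=prompt),
    ]
    for speaker, message in history:  # history: list of ("bot" | "user", text)
        template = BOT_TEMPLATE if speaker == "bot" else USER_TEMPLATE
        parts.append(template.format(bot_name=bot_name, user_name=user_name,
                                      message=message))
    parts.append(RESPONSE_TEMPLATE.format(bot_name=bot_name))
    return "".join(parts)  # then truncated to max_input_tokens=1024 tokens

# Example (made-up values):
print(build_prompt(
    bot_name="Nova", user_name="You",
    memory="Nova is a cheerful space pilot.",
    prompt="Nova chats with a passenger between flights.",
    history=[("bot", "Welcome aboard!"), ("user", "Thanks, happy to be here.")],
))

GENERATION_PARAMS = dict(temperature=1.0, top_p=1.0, min_p=0.0, top_k=20,
                         presence_penalty=0.0, frequency_penalty=0.0,
                         stop=["\n"], max_output_tokens=64)

def respond(prompt_text, best_of=8):
    """Sample best_of completions and keep the one the reward model prefers.

    best_of=8 with reward_model='default' suggests rerank-style decoding;
    this is a sketch of that idea, not the platform's implementation."""
    candidates = [llm_generate(prompt_text, **GENERATION_PARAMS)  # hypothetical
                  for _ in range(best_of)]
    return max(candidates, key=reward_model.score)                # hypothetical

Because stopping_words is ['\n'] and every templated message ends with a newline, sampling naturally stops after a single chat line of at most 64 tokens.

A few of the other summary fields can be cross-checked directly. A minimal sketch follows; the binomial standard-error model for the family-friendly score is an assumption on my part (the platform's exact formula is not shown in this log):

# Sanity checks on the metadata fields above.
num_battles, num_wins = 5762, 2639
win_ratio = num_wins / num_battles            # 0.458000694..., matches win_ratio

num_parameters = 12_772_070_400
print(f"{num_parameters / 1e9:.1f}B")         # 12.8B, reported as model_size 13B
fp16_bytes = 2 * num_parameters               # ~25.5e9 bytes of raw fp16 weights:
                                              # essentially all of a 24 GB RTX A5000
                                              # before any KV cache, consistent with
                                              # the quantization step in the log below

# Assuming the family-friendly score is a proportion with standard error
# sqrt(p * (1 - p) / n), the implied evaluation set size is about 5000:
p, se = 0.5824, 0.0069743851341892505
implied_n = p * (1 - p) / se ** 2             # ≈ 5000.0

The latencies table traces a throughput/latency trade-off as batch size grows, and throughput_3p7s looks like the measured throughput interpolated at a 3.7 s mean latency. That reading is an assumption, but it reproduces the reported value to within a couple of percent. A short sketch (values abbreviated from the table above):

import numpy as np

# (batch_size, throughput, latency_mean in seconds), abbreviated from the table above
latencies = [
    (1,  0.594, 1.684),
    (3,  1.077, 2.780),
    (5,  1.281, 3.888),
    (6,  1.342, 4.448),
    (8,  1.391, 5.701),
    (10, 1.424, 6.969),
]

for bs, tput, mean_s in latencies:
    # Throughput tracks batch_size / latency_mean only approximately,
    # so it was presumably measured rather than derived.
    print(bs, tput, round(bs / mean_s, 3))

xs = [mean_s for _, _, mean_s in latencies]   # mean latency (s), increasing
ys = [tput for _, tput, _ in latencies]       # measured throughput
print(np.interp(3.7, xs, ys))                 # ≈ 1.25, close to throughput_3p7s = 1.26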
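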
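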
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-1-full-002-v7-mkmlizer
Waiting for job on junhua024-chai-1-full-002-v7-mkmlizer to finish
junhua024-chai-1-full-002-v7-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-1-full-002-v7-mkmlizer: ║ ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ Version: 0.29.3 ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ belonging to: ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-1-full-002-v7-mkmlizer: ║ ║
junhua024-chai-1-full-002-v7-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-1-full-002-v7-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
junhua024-chai-1-full-002-v7-mkmlizer: Downloaded to shared memory in 74.566s
junhua024-chai-1-full-002-v7-mkmlizer: Checking if junhua024/chai_1-full_002 already exists in ChaiML
junhua024-chai-1-full-002-v7-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp0ls83klh, device:0
junhua024-chai-1-full-002-v7-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-1-full-002-v7-mkmlizer: quantized model in 31.724s
junhua024-chai-1-full-002-v7-mkmlizer: Processed model junhua024/chai_1-full_002 in 106.367s
junhua024-chai-1-full-002-v7-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-1-full-002-v7-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-1-full-002-v7-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-1-full-002-v7
junhua024-chai-1-full-002-v7-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-1-full-002-v7/tokenizer.json
junhua024-chai-1-full-002-v7-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-1-full-002-v7/flywheel_model.0.safetensors
junhua024-chai-1-full-002-v7-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... Loading 0: 98%|█████████▊| 357/363 [00:11<00:00, 24.34it/s]
Job junhua024-chai-1-full-002-v7-mkmlizer completed after 126.23s with status: succeeded
Stopping job with name junhua024-chai-1-full-002-v7-mkmlizer
Pipeline stage MKMLizer completed in 126.77s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.22s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-1-full-002-v7
Waiting for inference service junhua024-chai-1-full-002-v7 to be ready
Failed to get response for submission junhua024-chai-1-full-002_v4: HTTPConnectionPool(host='junhua024-chai-1-full-002-v4-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Inference service junhua024-chai-1-full-002-v7 ready after 170.67275309562683s
Pipeline stage MKMLDeployer completed in 171.13s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.2273097038269043s
Received healthy response to inference request in 1.4671084880828857s
Received healthy response to inference request in 1.6374330520629883s
Received healthy response to inference request in 1.67354416847229s
Received healthy response to inference request in 1.6022660732269287s
5 requests
0 failed requests
5th percentile: 1.4941400051116944
10th percentile: 1.5211715221405029
20th percentile: 1.57523455619812
30th percentile: 1.6092994689941407
40th percentile: 1.6233662605285644
50th percentile: 1.6374330520629883
60th percentile: 1.651877498626709
70th percentile: 1.6663219451904296
80th percentile: 1.784297275543213
90th percentile: 2.0058034896850585
95th percentile: 2.116556596755981
99th percentile: 2.20515908241272
mean time: 1.7215322971343994
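
The percentiles above are consistent with linear interpolation over the five response times (numpy's default percentile method); a minimal sketch that reproduces them, assuming that is how they were computed:

import numpy as np

times = [
    2.2273097038269043,
    1.4671084880828857,
    1.6374330520629883,
    1.67354416847229,
    1.6022660732269287,
]

for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {np.percentile(times, q)}")
print(f"mean time: {np.mean(times)}")   # 1.7215322971343994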
Pipeline stage StressChecker completed in 10.17s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.66s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.70s
Shutdown handler de-registered
junhua024-chai-1-full-002_v7 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3184.96s
Shutdown handler de-registered
junhua024-chai-1-full-002_v7 status is now inactive due to auto deactivation (removal of underperforming models)
junhua024-chai-1-full-002_v7 status is now torndown due to DeploymentManager action