developer_uid: bogoconic1
submission_id: chaiml-gy-exp35-sftlora_21148_v2
model_name: chaiml-gy-exp35-sftlora_21148_v2
model_group: ChaiML/gy-exp35-sftlora-
status: torndown
timestamp: 2025-06-28T14:57:04+00:00
num_battles: 10024
num_wins: 5262
celo_rating: 1290.84
family_friendly_score: 0.5134
family_friendly_standard_error: 0.00706852799386124
submission_type: basic
model_repo: ChaiML/gy-exp35-sftlora-exp32ep8stg2-gy-exp24payloads-grok-v2prompt-ep2
model_architecture: MistralForCausalLM
model_num_parameters: 24096691200
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
latencies: [
  {'batch_size': 1, 'throughput': 0.5156992732390464, 'latency_mean': 1.9389977955818176, 'latency_p50': 1.951043725013733, 'latency_p90': 2.1322076320648193},
  {'batch_size': 3, 'throughput': 1.0446782597083544, 'latency_mean': 2.863799488544464, 'latency_p50': 2.860464334487915, 'latency_p90': 3.1549081325531008},
  {'batch_size': 5, 'throughput': 1.3346377341996152, 'latency_mean': 3.7222204291820526, 'latency_p50': 3.7084256410598755, 'latency_p90': 4.113006258010865},
  {'batch_size': 6, 'throughput': 1.4489644607923893, 'latency_mean': 4.126522312164306, 'latency_p50': 4.110745191574097, 'latency_p90': 4.533698177337646},
  {'batch_size': 8, 'throughput': 1.5704450925004307, 'latency_mean': 5.054507212638855, 'latency_p50': 5.053735017776489, 'latency_p90': 5.732370519638062},
  {'batch_size': 10, 'throughput': 1.6628695970504639, 'latency_mean': 5.977242769002914, 'latency_p50': 6.041674494743347, 'latency_p90': 6.662069296836853}]
gpu_counts: {'NVIDIA A100-SXM4-80GB': 1}
display_name: chaiml-gy-exp35-sftlora_21148_v2
is_internal_developer: True
language_model: ChaiML/gy-exp35-sftlora-exp32ep8stg2-gy-exp24payloads-grok-v2prompt-ep2
model_size: 24B
ranking_group: single
throughput_3p7s: 1.34
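The throughput_3p7s field is not explained in the record; it most plausibly denotes throughput at a mean latency of about 3.7 s, derived from the latencies list above. A minimal sketch, assuming simple linear interpolation over the measured (latency_mean, throughput) pairs (the pipeline's actual method is not shown in the log), lands at roughly 1.33, close to the reported 1.34:

# Hypothetical reconstruction of throughput_3p7s via linear interpolation.
# The (latency_mean, throughput) pairs come from the latencies field above;
# the interpolation itself is an assumption, not the pipeline's documented method.
points = [
    (1.9389977955818176, 0.5156992732390464),
    (2.863799488544464, 1.0446782597083544),
    (3.7222204291820526, 1.3346377341996152),
    (4.126522312164306, 1.4489644607923893),
    (5.054507212638855, 1.5704450925004307),
    (5.977242769002914, 1.6628695970504639),
]

def throughput_at(latency: float) -> float:
    # Linearly interpolate between the two measurements bracketing `latency`.
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= latency <= x1:
            return y0 + (y1 - y0) * (latency - x0) / (x1 - x0)
    raise ValueError("latency outside measured range")

print(round(throughput_at(3.7), 2))  # ~1.33, vs. the reported 1.34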
us_pacific_date: 2025-06-28
win_ratio: 0.5249401436552275
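The derived statistics in this record are internally consistent: win_ratio equals num_wins / num_battles, and family_friendly_standard_error matches the binomial standard error sqrt(p * (1 - p) / n) for p = 0.5134 with n = 5000 rated samples. The sample count itself is not logged, so n = 5000 is an inference; a quick check:

import math

# Sanity checks on the derived fields in this record.
num_battles, num_wins = 10024, 5262
print(num_wins / num_battles)  # 0.524940... == win_ratio

# n = 5000 is inferred (not logged): this reproduces the reported
# family_friendly_standard_error of 0.00706852799386124.
p = 0.5134
print(math.sqrt(p * (1 - p) / 5000))  # ~0.0070685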
generation_params: {'temperature': 0.8, 'top_p': 0.95, 'min_p': 0.025, 'top_k': 60, 'presence_penalty': 0.4, 'frequency_penalty': 0.4, 'stopping_words': ['<|im_end|>', '<|im_start|>', '\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': '<|system|>Family Friendly{memory}\n', 'prompt_template': '', 'bot_template': '<|im_start|>assistant\n{message}<|im_end|>\n', 'user_template': '<|im_start|>user\nYou:{message}<|im_end|>\n', 'response_template': '<|im_start|>assistant\n', 'truncate_by_message': True}
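The formatter fields define how a conversation is serialized before sampling. Below is a minimal sketch of the implied prompt assembly, using a hypothetical build_prompt helper; the production formatter (including its truncate_by_message truncation down to max_input_tokens) is not shown in the log, so treat this as an illustration rather than the actual implementation:

# Hypothetical reconstruction of the prompt assembly implied by `formatter`.
# Template strings are taken verbatim from the submission record; the helper
# name and structure are assumptions for illustration only.
FORMATTER = {
    "memory_template": "<|system|>Family Friendly{memory}\n",
    "bot_template": "<|im_start|>assistant\n{message}<|im_end|>\n",
    "user_template": "<|im_start|>user\nYou:{message}<|im_end|>\n",
    "response_template": "<|im_start|>assistant\n",
}

def build_prompt(memory: str, turns: list[tuple[str, str]]) -> str:
    parts = [FORMATTER["memory_template"].format(memory=memory)]
    for role, message in turns:
        template = FORMATTER["bot_template" if role == "bot" else "user_template"]
        parts.append(template.format(message=message))
    # The model completes from the open assistant turn; generation stops on
    # '<|im_end|>', '<|im_start|>', or '\n' (see stopping_words above).
    parts.append(FORMATTER["response_template"])
    return "".join(parts)

print(build_prompt(" Talk about hiking.", [("user", "Hi!"), ("bot", "Hello!"), ("user", "Any plans?")]))

Decoding is then governed by generation_params: temperature-0.8 sampling filtered by top_p/min_p/top_k with presence and frequency penalties of 0.4, producing 8 candidate completions (best_of: 8) of at most 64 tokens each.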
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name chaiml-gy-exp35-sftlora-21148-v2-mkmlizer
Waiting for job on chaiml-gy-exp35-sftlora-21148-v2-mkmlizer to finish
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: [MK1 ASCII-art banner]
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: Version: 0.29.3
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: Features: FLYWHEEL, CUDA
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. (https://mk1.ai)
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: The license key for the current software has been verified as belonging to:
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: Chai Research Corp. (Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f, Expiration: 2028-03-31 23:59:59)
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: Downloaded to shared memory in 59.728s
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: Checking if ChaiML/gy-exp35-sftlora-exp32ep8stg2-gy-exp24payloads-grok-v2prompt-ep2 already exists in ChaiML
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp5__bq5hd, device:0
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: Saving flywheel model at /dev/shm/model_cache
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: quantized model in 48.949s
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: Processed model ChaiML/gy-exp35-sftlora-exp32ep8stg2-gy-exp24payloads-grok-v2prompt-ep2 in 108.677s
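The stage timings above are consistent: the 108.677 s total is exactly the 59.728 s shared-memory download plus the 48.949 s quantization, e.g.:

print(round(59.728 + 48.949, 3))  # 108.677, matching the reported total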
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: creating bucket guanaco-mkml-models
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/chaiml-gy-exp35-sftlora-21148-v2
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/chaiml-gy-exp35-sftlora-21148-v2/config.json
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/chaiml-gy-exp35-sftlora-21148-v2/special_tokens_map.json
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/chaiml-gy-exp35-sftlora-21148-v2/tokenizer_config.json
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/chaiml-gy-exp35-sftlora-21148-v2/tokenizer.json
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/chaiml-gy-exp35-sftlora-21148-v2/flywheel_model.0.safetensors
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/chaiml-gy-exp35-sftlora-21148-v2/flywheel_model.1.safetensors
chaiml-gy-exp35-sftlora-21148-v2-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... 55%|█████▌ | 201/363 [00:22<03:06, 1.15s/it] ... 99%|█████████▉| 359/363 [00:29<00:00, 24.13it/s] (intermediate progress-bar updates elided; loading stalled for ~15 s around shard 200)
Job chaiml-gy-exp35-sftlora-21148-v2-mkmlizer completed after 146.68s with status: succeeded
Stopping job with name chaiml-gy-exp35-sftlora-21148-v2-mkmlizer
Pipeline stage MKMLizer completed in 147.28s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.17s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service chaiml-gy-exp35-sftlora-21148-v2
Waiting for inference service chaiml-gy-exp35-sftlora-21148-v2 to be ready
Failed to get response for submission junhua024-chai-1-full-002_v1: HTTPConnectionPool(host='junhua024-chai-1-full-002-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0) [repeated 3x]
Inference service chaiml-gy-exp35-sftlora-21148-v2 ready after 161.07369947433472s
Pipeline stage MKMLDeployer completed in 161.73s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.9320993423461914s
Received healthy response to inference request in 2.0257818698883057s
Received healthy response to inference request in 1.8386421203613281s
Received healthy response to inference request in 2.2730648517608643s
Failed to get response for submission junhua024-chai-1-full-002_v1: HTTPConnectionPool(host='junhua024-chai-1-full-002-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Received healthy response to inference request in 2.2993946075439453s
5 requests
0 failed requests
5th percentile: 1.8760700702667237
10th percentile: 1.9134980201721192
20th percentile: 1.9883539199829101
30th percentile: 2.0752384662628174
40th percentile: 2.174151659011841
50th percentile: 2.2730648517608643
60th percentile: 2.2835967540740967
70th percentile: 2.294128656387329
80th percentile: 2.425935554504395
90th percentile: 2.679017448425293
95th percentile: 2.805558395385742
99th percentile: 2.9067911529541015
mean time: 2.273796558380127
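The StressChecker summary can be reproduced from the five healthy response times logged above. Assuming numpy's default linear-interpolation percentile method (which matches the logged values exactly), a sketch:

import numpy as np

# The five healthy response times (seconds) from the log above.
times = [2.9320993423461914, 2.0257818698883057, 1.8386421203613281,
         2.2730648517608643, 2.2993946075439453]

# numpy's default linear-interpolation percentiles reproduce the logged
# summary, e.g. 5th -> 1.87607..., 50th -> 2.27306..., 99th -> 2.90679...
for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {np.percentile(times, q)}")
print("mean time:", np.mean(times))  # 2.273796558380127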
Pipeline stage StressChecker completed in 12.89s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.71s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.76s
Shutdown handler de-registered
chaiml-gy-exp35-sftlora_21148_v2 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.12s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.11s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service chaiml-gy-exp35-sftlora-21148-v2-profiler
Waiting for inference service chaiml-gy-exp35-sftlora-21148-v2-profiler to be ready
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3327.97s
Shutdown handler de-registered
chaiml-gy-exp35-sftlora_21148_v2 status is now inactive due to auto deactivation (removal of underperforming models)
chaiml-gy-exp35-sftlora_21148_v2 status is now torndown due to DeploymentManager action