developer_uid: bogoconic1
submission_id: chaiml-gy-exp35-sftlora-_2298_v3
model_name: chaiml-gy-exp35-sftlora-_2298_v3
model_group: ChaiML/gy-exp35-sftlora-
status: torndown
timestamp: 2025-06-29T07:36:22+00:00
num_battles: 6543
num_wins: 3484
celo_rating: 1299.65
family_friendly_score: 0.5182
family_friendly_standard_error: 0.0070663818181584265
submission_type: basic
model_repo: ChaiML/gy-exp35-sftlora-exp32ep8stg2-gy-exp24payloads-grok-v2prompt-ep6
model_architecture: MistralForCausalLM
model_num_parameters: 24096691200.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
latencies: [{'batch_size': 1, 'throughput': 0.5256003527963847, 'latency_mean': 1.9024700891971589, 'latency_p50': 1.885946273803711, 'latency_p90': 2.121420168876648}, {'batch_size': 3, 'throughput': 1.0553047537035773, 'latency_mean': 2.832942923307419, 'latency_p50': 2.8272993564605713, 'latency_p90': 3.11845178604126}, {'batch_size': 5, 'throughput': 1.316306094556055, 'latency_mean': 3.7816186153888705, 'latency_p50': 3.7854665517807007, 'latency_p90': 4.222726058959961}, {'batch_size': 6, 'throughput': 1.435251831893972, 'latency_mean': 4.153407069444657, 'latency_p50': 4.160548567771912, 'latency_p90': 4.612574768066406}, {'batch_size': 8, 'throughput': 1.5562207366647653, 'latency_mean': 5.097978355884552, 'latency_p50': 5.150537133216858, 'latency_p90': 5.747073411941528}, {'batch_size': 10, 'throughput': 1.6475548638526387, 'latency_mean': 6.0195935702323915, 'latency_p50': 6.055419921875, 'latency_p90': 6.784937810897826}]
gpu_counts: {'NVIDIA A100-SXM4-80GB': 1}
display_name: chaiml-gy-exp35-sftlora-_2298_v3
is_internal_developer: True
language_model: ChaiML/gy-exp35-sftlora-exp32ep8stg2-gy-exp24payloads-grok-v2prompt-ep6
model_size: 24B
ranking_group: single
throughput_3p7s: 1.3
us_pacific_date: 2025-06-29
win_ratio: 0.5324774568240868
generation_params: {'temperature': 0.8, 'top_p': 0.95, 'min_p': 0.025, 'top_k': 60, 'presence_penalty': 0.4, 'frequency_penalty': 0.4, 'stopping_words': ['<|im_start|>', '\n', '<|im_end|>'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': '<|system|>Family Friendly{memory}\n', 'prompt_template': '', 'bot_template': '<|im_start|>assistant\n{message}<|im_end|>\n', 'user_template': '<|im_start|>user\nYou:{message}<|im_end|>\n', 'response_template': '<|im_start|>assistant\n', 'truncate_by_message': True}
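The win_ratio and throughput_3p7s fields above follow from the other metadata: win_ratio is num_wins divided by num_battles, and throughput_3p7s is consistent with linearly interpolating the latencies table at a mean latency of 3.7 s. A minimal sketch in Python (the interpolation rule is an assumption; the pipeline's actual calculation is not shown in this log):

# Sketch: reproduce win_ratio and the 3.7 s throughput figure from the
# metadata above. The linear-interpolation rule is assumed, not confirmed.
num_battles, num_wins = 6543, 3484
print(num_wins / num_battles)  # 0.5324774568240868, matching win_ratio

# (throughput, latency_mean) pairs copied from the latencies field
points = [(0.5256, 1.9025), (1.0553, 2.8329), (1.3163, 3.7816),
          (1.4353, 4.1534), (1.5562, 5.0980), (1.6476, 6.0196)]

def throughput_at(target, points):
    # Piecewise-linear interpolation of throughput against mean latency.
    for (t0, l0), (t1, l1) in zip(points, points[1:]):
        if l0 <= target <= l1:
            return t0 + (t1 - t0) * (target - l0) / (l1 - l0)
    raise ValueError("target latency outside measured range")

print(round(throughput_at(3.7, points), 1))  # 1.3, matching throughput_3p7s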
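The formatter field is the chat template applied before generation, and generation_params is the per-request sampling configuration (temperature 0.8, top_p 0.95, best_of 8, up to 64 output tokens, stopping on the chat delimiters or a newline). How the serving stack performs substitution and truncation is not shown here; the sketch below assumes plain str.format substitution and no truncation, with an invented two-turn conversation purely for illustration:

# Sketch: render a short exchange with the formatter templates above.
# Assumes simple str.format substitution; truncation is not modelled.
formatter = {
    'memory_template': '<|system|>Family Friendly{memory}\n',
    'prompt_template': '',
    'bot_template': '<|im_start|>assistant\n{message}<|im_end|>\n',
    'user_template': '<|im_start|>user\nYou:{message}<|im_end|>\n',
    'response_template': '<|im_start|>assistant\n',
}

def render(memory, turns):
    # turns: list of (role, message), role in {'user', 'bot'}
    prompt = formatter['memory_template'].format(memory=memory)
    for role, message in turns:
        key = 'user_template' if role == 'user' else 'bot_template'
        prompt += formatter[key].format(message=message)
    return prompt + formatter['response_template']

print(render(' Keep replies wholesome.',
             [('user', 'Hi there!'), ('bot', 'Hello! How can I help?')]))

The model then completes from the trailing response_template, and generation halts at the first stopping word ('<|im_start|>', a newline, or '<|im_end|>') or after max_output_tokens.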
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name chaiml-gy-exp35-sftlora-2298-v3-mkmlizer
Waiting for job on chaiml-gy-exp35-sftlora-2298-v3-mkmlizer to finish
Failed to get response for submission junhua024-chai-1-full-002_v18: HTTPConnectionPool(host='junhua024-chai-1-full-002-v18-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ Version: 0.29.3 ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ https://mk1.ai ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ The license key for the current software has been verified as ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ belonging to: ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ Chai Research Corp. ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ║ ║
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: Downloaded to shared memory in 63.189s
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: Checking if ChaiML/gy-exp35-sftlora-exp32ep8stg2-gy-exp24payloads-grok-v2prompt-ep6 already exists in ChaiML
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp52s1ub4n, device:0
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: Saving flywheel model at /dev/shm/model_cache
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: quantized model in 48.033s
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: Processed model ChaiML/gy-exp35-sftlora-exp32ep8stg2-gy-exp24payloads-grok-v2prompt-ep6 in 111.222s
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: creating bucket guanaco-mkml-models
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/chaiml-gy-exp35-sftlora-2298-v3
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/chaiml-gy-exp35-sftlora-2298-v3/config.json
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/chaiml-gy-exp35-sftlora-2298-v3/special_tokens_map.json
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/chaiml-gy-exp35-sftlora-2298-v3/tokenizer_config.json
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/chaiml-gy-exp35-sftlora-2298-v3/tokenizer.json
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/chaiml-gy-exp35-sftlora-2298-v3/flywheel_model.1.safetensors
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/chaiml-gy-exp35-sftlora-2298-v3/flywheel_model.0.safetensors
chaiml-gy-exp35-sftlora-2298-v3-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... Loading 0: 99%|█████████▉| 359/363 [00:28<00:00, 26.45it/s]
Job chaiml-gy-exp35-sftlora-2298-v3-mkmlizer completed after 146.03s with status: succeeded
Stopping job with name chaiml-gy-exp35-sftlora-2298-v3-mkmlizer
Pipeline stage MKMLizer completed in 146.61s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.17s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service chaiml-gy-exp35-sftlora-2298-v3
Waiting for inference service chaiml-gy-exp35-sftlora-2298-v3 to be ready
Inference service chaiml-gy-exp35-sftlora-2298-v3 ready after 180.6645998954773s
Pipeline stage MKMLDeployer completed in 181.29s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.578077793121338s
Received healthy response to inference request in 1.9216253757476807s
Received healthy response to inference request in 1.8343913555145264s
Received healthy response to inference request in 2.0297186374664307s
Received healthy response to inference request in 2.0325703620910645s
5 requests
0 failed requests
5th percentile: 1.8518381595611573
10th percentile: 1.8692849636077882
20th percentile: 1.9041785717010498
30th percentile: 1.9432440280914307
40th percentile: 1.9864813327789306
50th percentile: 2.0297186374664307
60th percentile: 2.0308593273162843
70th percentile: 2.032000017166138
80th percentile: 2.1416718482971193
90th percentile: 2.3598748207092286
95th percentile: 2.468976306915283
99th percentile: 2.5562574958801267
mean time: 2.079276704788208
Pipeline stage StressChecker completed in 11.89s
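The StressChecker summary above is straightforward to reproduce: with only five samples, the reported percentiles match NumPy's default linear interpolation over the sorted response times (an assumption about the checker's method, but the numbers agree). A minimal sketch:

# Sketch: recompute the StressChecker statistics from the five response
# times logged above, assuming numpy-style linear percentile interpolation.
import numpy as np

times = [2.578078, 1.921625, 1.834391, 2.029719, 2.032570]
print('mean time:', np.mean(times))  # ~2.0793, as reported
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f'{p}th percentile:', np.percentile(times, p))  # 5th ~1.8518, 50th ~2.0297, 90th ~2.3599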
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.81s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.71s
Shutdown handler de-registered
chaiml-gy-exp35-sftlora-_2298_v3 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.11s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.10s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service chaiml-gy-exp35-sftlora-2298-v3-profiler
Waiting for inference service chaiml-gy-exp35-sftlora-2298-v3-profiler to be ready
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3361.45s
Shutdown handler de-registered
chaiml-gy-exp35-sftlora-_2298_v3 status is now inactive due to auto-deactivation of underperforming models
chaiml-gy-exp35-sftlora-_2298_v3 status is now torndown due to DeploymentManager action