developer_uid: rirv938
submission_id: rirv938-mistral-24b-grp_82678_v1
model_name: rirv938-mistral-24b-grp_82678_v1
model_group: rirv938/mistral_24b_grpo
status: torndown
timestamp: 2025-04-29T17:19:40+00:00
num_battles: 6100
num_wins: 3261
celo_rating: 1327.6
family_friendly_score: 0.5618
family_friendly_standard_error: 0.007016847725296595
submission_type: basic
model_repo: rirv938/mistral_24b_grpo_40k_cp1184_92ff_merged
model_architecture: MistralForCausalLM
model_num_parameters: 24096691200
best_of: 8
max_input_tokens: 768
max_output_tokens: 64
reward_model: default
latencies:
  batch_size 1:  throughput 0.554, latency mean 1.806 s, p50 1.796 s, p90 2.000 s
  batch_size 3:  throughput 1.167, latency mean 2.562 s, p50 2.585 s, p90 2.870 s
  batch_size 5:  throughput 1.535, latency mean 3.231 s, p50 3.209 s, p90 3.593 s
  batch_size 6:  throughput 1.676, latency mean 3.556 s, p50 3.543 s, p90 3.959 s
  batch_size 8:  throughput 1.840, latency mean 4.312 s, p50 4.325 s, p90 4.871 s
  batch_size 10: throughput 1.983, latency mean 5.005 s, p50 5.007 s, p90 5.571 s
gpu_counts: {'NVIDIA A100-SXM4-80GB': 1}
display_name: rirv938-mistral-24b-grp_82678_v1
is_internal_developer: True
language_model: rirv938/mistral_24b_grpo_40k_cp1184_92ff_merged
model_size: 24B
ranking_group: single
throughput_3p7s: 1.72
us_pacific_date: 2025-04-29
win_ratio: 0.5345901639344263
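Note: win_ratio, family_friendly_standard_error, and throughput_3p7s appear to be derived from the raw fields above. The sketch below reproduces them under stated assumptions — a binomial standard error over roughly 5,000 scored samples, and linear interpolation of throughput at a 3.7 s mean latency; the interpolation lands at about 1.71 versus the reported 1.72, so the platform's exact scheme may differ slightly. The variable names and the 5,000-sample count are assumptions, not platform code.

```python
# Hypothetical sketch: reproducing the derived metrics from the raw fields above.
import math

num_battles = 6100
num_wins = 3261
family_friendly_score = 0.5618

# win_ratio is wins over battles.
win_ratio = num_wins / num_battles            # 0.5345901639344263

# family_friendly_standard_error is consistent with a binomial proportion
# over roughly n = 5000 scored samples: sqrt(p * (1 - p) / n).
n_ff_samples = 5000                           # assumed sample count
ff_se = math.sqrt(family_friendly_score * (1 - family_friendly_score) / n_ff_samples)
# ff_se ≈ 0.00702, matching family_friendly_standard_error above.

# throughput_3p7s looks like throughput interpolated at a 3.7 s mean latency.
# Linear interpolation between the batch-size 6 and 8 rows of the latency table:
lat_lo, thr_lo = 3.556, 1.676                 # batch_size 6
lat_hi, thr_hi = 4.312, 1.840                 # batch_size 8
frac = (3.7 - lat_lo) / (lat_hi - lat_lo)
throughput_3p7s = thr_lo + frac * (thr_hi - thr_lo)
# ≈ 1.71, close to the reported 1.72; the exact interpolation may differ.
```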
generation_params: {'temperature': 0.9, 'top_p': 0.9, 'min_p': 0.2, 'top_k': 80, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['You:', '###', '\n', '</s>'], 'max_input_tokens': 768, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': '', 'prompt_template': '', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
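The formatter and generation_params fields describe how chat turns are rendered into a prompt and how sampling is configured. Below is a minimal sketch of applying them; build_prompt and the example conversation are illustrative only, and the params dict simply mirrors generation_params using the key names most samplers accept (stop for stopping_words, max_tokens for max_output_tokens).

```python
# Minimal sketch of the formatter templates and sampling parameters above.
# Only the template strings and parameter values come from this submission's
# metadata; the helper and example conversation are hypothetical.

BOT_TEMPLATE = "{bot_name}: {message}\n"
USER_TEMPLATE = "{user_name}: {message}\n"
RESPONSE_TEMPLATE = "{bot_name}:"

def build_prompt(conversation, bot_name):
    """Render (speaker, message) turns into the prompt the model completes."""
    prompt = ""
    for speaker, message in conversation:
        template = BOT_TEMPLATE if speaker == bot_name else USER_TEMPLATE
        prompt += template.format(
            bot_name=bot_name, user_name=speaker, message=message
        )
    # The model continues from "{bot_name}:".
    return prompt + RESPONSE_TEMPLATE.format(bot_name=bot_name)

# Sampling configuration mirroring generation_params; since "\n" is a stop
# sequence, each completion is a single chat turn of at most 64 tokens.
params = {
    "temperature": 0.9,
    "top_p": 0.9,
    "min_p": 0.2,
    "top_k": 80,
    "presence_penalty": 0.5,
    "frequency_penalty": 0.5,
    "stop": ["You:", "###", "\n", "</s>"],
    "max_tokens": 64,
    "best_of": 8,   # 8 candidates per request; one is served (see reward_model)
}

print(build_prompt(
    [("You", "Hi there!"), ("Bot", "Hello!"), ("You", "How are you?")], "Bot"
))
```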
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name rirv938-mistral-24b-grp-82678-v1-mkmlizer
Waiting for job on rirv938-mistral-24b-grp-82678-v1-mkmlizer to finish
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ║ _____ __ __ ║
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ║ /___/ ║
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ║ ║
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ║ Version: 0.12.8 ║
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ║ https://mk1.ai ║
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ║ ║
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ║ The license key for the current software has been verified as ║
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ║ belonging to: ║
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ║ ║
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ║ Chai Research Corp. ║
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ║ ║
rirv938-mistral-24b-grp-82678-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
rirv938-mistral-24b-grp-82678-v1-mkmlizer: Downloaded to shared memory in 180.238s
rirv938-mistral-24b-grp-82678-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp4qs9cbhp, device:0
rirv938-mistral-24b-grp-82678-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
rirv938-mistral-24b-grp-82678-v1-mkmlizer: quantized model in 62.120s
rirv938-mistral-24b-grp-82678-v1-mkmlizer: Processed model rirv938/mistral_24b_grpo_40k_cp1184_92ff_merged in 242.358s
rirv938-mistral-24b-grp-82678-v1-mkmlizer: creating bucket guanaco-mkml-models
rirv938-mistral-24b-grp-82678-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
rirv938-mistral-24b-grp-82678-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/rirv938-mistral-24b-grp-82678-v1
rirv938-mistral-24b-grp-82678-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/rirv938-mistral-24b-grp-82678-v1/config.json
rirv938-mistral-24b-grp-82678-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/rirv938-mistral-24b-grp-82678-v1/special_tokens_map.json
rirv938-mistral-24b-grp-82678-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/rirv938-mistral-24b-grp-82678-v1/tokenizer_config.json
rirv938-mistral-24b-grp-82678-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/rirv938-mistral-24b-grp-82678-v1/tokenizer.json
rirv938-mistral-24b-grp-82678-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/rirv938-mistral-24b-grp-82678-v1/flywheel_model.1.safetensors
rirv938-mistral-24b-grp-82678-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/rirv938-mistral-24b-grp-82678-v1/flywheel_model.0.safetensors
rirv938-mistral-24b-grp-82678-v1-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... Loading 0: 100%|█████████▉| 362/363 [00:42<00:00, 2.08it/s]
Job rirv938-mistral-24b-grp-82678-v1-mkmlizer completed after 268.55s with status: succeeded
Stopping job with name rirv938-mistral-24b-grp-82678-v1-mkmlizer
Pipeline stage MKMLizer completed in 269.02s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.17s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service rirv938-mistral-24b-grp-82678-v1
Waiting for inference service rirv938-mistral-24b-grp-82678-v1 to be ready
Inference service rirv938-mistral-24b-grp-82678-v1 ready after 140.48210644721985s
Pipeline stage MKMLDeployer completed in 140.96s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 2.399139165878296s
Received healthy response to inference request in 1.9371159076690674s
Received healthy response to inference request in 2.22345232963562s
Received healthy response to inference request in 1.9379632472991943s
5 requests
1 failed request
5th percentile: 1.9372853755950927
10th percentile: 1.937454843521118
20th percentile: 1.937793779373169
30th percentile: 1.9950610637664794
40th percentile: 2.10925669670105
50th percentile: 2.22345232963562
60th percentile: 2.2937270641326903
70th percentile: 2.3640017986297606
80th percentile: 5.941725540161135
90th percentile: 13.026898288726809
95th percentile: 16.56948466300964
99th percentile: 19.403553762435912
mean time: 5.721948337554932
%s, retrying in %s seconds...
Received healthy response to inference request in 1.858776569366455s
Received healthy response to inference request in 2.118079423904419s
Received healthy response to inference request in 2.1863534450531006s
Received healthy response to inference request in 1.9361381530761719s
Received healthy response to inference request in 2.0393335819244385s
5 requests
0 failed requests
5th percentile: 1.8742488861083983
10th percentile: 1.8897212028503418
20th percentile: 1.9206658363342286
30th percentile: 1.956777238845825
40th percentile: 1.9980554103851318
50th percentile: 2.0393335819244385
60th percentile: 2.0708319187164306
70th percentile: 2.1023302555084227
80th percentile: 2.1317342281341554
90th percentile: 2.1590438365936278
95th percentile: 2.172698640823364
99th percentile: 2.183622484207153
mean time: 2.027736234664917
Pipeline stage StressChecker completed in 40.93s
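For reference, the StressChecker percentiles above are consistent with linear-interpolation percentiles over the five per-request times; in the first run the timed-out request appears to be counted at its elapsed time, which is why the upper percentiles jump toward 20 s. A minimal sketch, assuming numpy's default linear interpolation, reproducing the second run's statistics:

```python
# Sketch: recomputing the second StressChecker run's statistics from its five
# response times, assuming numpy's default (linear) percentile interpolation.
import numpy as np

times = [1.858776569366455, 2.118079423904419, 2.1863534450531006,
         1.9361381530761719, 2.0393335819244385]

print(f"mean time: {np.mean(times)}")                 # ≈ 2.0277
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(times, p)}")
# e.g. the 5th percentile comes out as 1.8742..., matching the log line above.
```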
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.59s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.67s
Shutdown handler de-registered
rirv938-mistral-24b-grp_82678_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 2870.70s
Shutdown handler de-registered
rirv938-mistral-24b-grp_82678_v1 status is now inactive due to auto deactivation (removal of underperforming models)
rirv938-mistral-24b-grp_82678_v1 status is now torndown due to DeploymentManager action