developer_uid: rirv938
submission_id: rirv938-gy-grpo-r2-cp31_81326_v1
model_name: rirv938-gy-grpo-r2-cp31_81326_v1
model_group: rirv938/gy_grpo_r2_cp312
status: torndown
timestamp: 2025-07-09T05:02:02+00:00
num_battles: 6392
num_wins: 3392
celo_rating: 1314.52
family_friendly_score: 0.4948
family_friendly_standard_error: 0.007070685398177464
submission_type: basic
model_repo: rirv938/gy_grpo_r2_cp312_98ff_b5_merged
model_architecture: MistralForCausalLM
model_num_parameters: 24096691200.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies:
  batch_size 1: throughput 0.520, latency mean 1.922 s, p50 1.908 s, p90 2.124 s
  batch_size 3: throughput 1.029, latency mean 2.903 s, p50 2.921 s, p90 3.208 s
  batch_size 5: throughput 1.319, latency mean 3.775 s, p50 3.751 s, p90 4.245 s
  batch_size 6: throughput 1.412, latency mean 4.213 s, p50 4.240 s, p90 4.654 s
  batch_size 8: throughput 1.544, latency mean 5.137 s, p50 5.100 s, p90 5.708 s
  batch_size 10: throughput 1.631, latency mean 6.064 s, p50 6.059 s, p90 6.854 s
gpu_counts: {'NVIDIA A100-SXM4-80GB': 1}
display_name: rirv938-gy-grpo-r2-cp31_81326_v1
is_internal_developer: True
language_model: rirv938/gy_grpo_r2_cp312_98ff_b5_merged
model_size: 24B
ranking_group: single
throughput_3p7s: 1.3
us_pacific_date: 2025-07-08
win_ratio: 0.5306633291614519
generation_params: {'temperature': 1.0, 'top_p': 0.95, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.45, 'frequency_penalty': 0.45, 'stopping_words': ['You:', '</s>', '\n', '###'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': '', 'prompt_template': '', 'bot_template': '<|im_start|>assistant\n{message}<|im_end|>\n', 'user_template': '<|im_start|>user\nYou:{message}<|im_end|>\n', 'response_template': '<|im_start|>assistant\n', 'truncate_by_message': False}
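A few of the fields above are simple derivations of the others: win_ratio is num_wins / num_battles = 3392 / 6392 ≈ 0.5307, and throughput_3p7s is consistent with linearly interpolating the latency table to a 3.7 s mean latency. The sketch below reproduces those two numbers and shows one plausible way the formatter templates and generation_params could be applied when building a request. The helper names (`throughput_at`, `build_prompt`) and the plain dicts are illustrative assumptions, not the Chaiverse/MKML serving code.

```python
# Illustrative sketch only: reproduces the derived metrics above and shows how the
# formatter / generation_params fields might be applied. Not the actual serving code.

latencies = [  # (batch_size, throughput, latency_mean) from the table above
    (1, 0.520, 1.922), (3, 1.029, 2.903), (5, 1.319, 3.775),
    (6, 1.412, 4.213), (8, 1.544, 5.137), (10, 1.631, 6.064),
]

def throughput_at(target_latency: float) -> float:
    """Linearly interpolate throughput at a target mean latency
    (assumed definition of throughput_3p7s)."""
    for (_, t0, l0), (_, t1, l1) in zip(latencies, latencies[1:]):
        if l0 <= target_latency <= l1:
            return t0 + (target_latency - l0) / (l1 - l0) * (t1 - t0)
    raise ValueError("target latency outside measured range")

print(round(3392 / 6392, 4))         # win_ratio: num_wins / num_battles ~ 0.5307
print(round(throughput_at(3.7), 1))  # ~ 1.3, matching throughput_3p7s

# Prompt assembly per the ChatML-style formatter templates (hypothetical helper).
BOT_TEMPLATE = "<|im_start|>assistant\n{message}<|im_end|>\n"
USER_TEMPLATE = "<|im_start|>user\nYou:{message}<|im_end|>\n"
RESPONSE_TEMPLATE = "<|im_start|>assistant\n"

def build_prompt(turns):
    """turns: list of (role, message); returns the string sent to the model."""
    prompt = ""
    for role, message in turns:
        template = USER_TEMPLATE if role == "user" else BOT_TEMPLATE
        prompt += template.format(message=message)
    return prompt + RESPONSE_TEMPLATE  # model continues from the assistant header

generation_params = {  # copied verbatim from the generation_params field above
    "temperature": 1.0, "top_p": 0.95, "min_p": 0.05, "top_k": 80,
    "presence_penalty": 0.45, "frequency_penalty": 0.45,
    "stopping_words": ["You:", "</s>", "\n", "###"],
    "max_input_tokens": 1024, "best_of": 8, "max_output_tokens": 64,
}
```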
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer
Waiting for job on rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer to finish
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: Version: 0.29.15
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: Features: FLYWHEEL, CUDA
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. (https://mk1.ai)
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: The license key for the current software has been verified as belonging to: Chai Research Corp.
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: Expiration: 2028-03-31 23:59:59
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: Downloaded to shared memory in 170.371s
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: Checking if rirv938/gy_grpo_r2_cp312_98ff_b5_merged already exists in ChaiML
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: Creating repo ChaiML/gy_grpo_r2_cp312_98ff_b5_merged and uploading /tmp/tmpc4cw24_b to it
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: 100%|██████████| 22/22 [01:45<00:00, 4.79s/it]
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpc4cw24_b, device:0
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: quantized model in 57.030s
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: Processed model rirv938/gy_grpo_r2_cp312_98ff_b5_merged in 425.904s
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: creating bucket guanaco-mkml-models
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/rirv938-gy-grpo-r2-cp31-81326-v1/nvidia
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/rirv938-gy-grpo-r2-cp31-81326-v1/nvidia/config.json
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/rirv938-gy-grpo-r2-cp31-81326-v1/nvidia/special_tokens_map.json
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/rirv938-gy-grpo-r2-cp31-81326-v1/nvidia/tokenizer_config.json
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/rirv938-gy-grpo-r2-cp31-81326-v1/nvidia/tokenizer.json
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/rirv938-gy-grpo-r2-cp31-81326-v1/nvidia/flywheel_model.1.safetensors
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/rirv938-gy-grpo-r2-cp31-81326-v1/nvidia/flywheel_model.0.safetensors
rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer: Loading 0: 100%|█████████▉| 362/363 [00:37<00:00, 11.57it/s]
Job rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer completed after 452.34s with status: succeeded
Stopping job with name rirv938-gy-grpo-r2-cp31-81326-v1-mkmlizer
Pipeline stage MKMLizer completed in 452.89s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.17s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service rirv938-gy-grpo-r2-cp31-81326-v1
Waiting for inference service rirv938-gy-grpo-r2-cp31-81326-v1 to be ready
Inference service rirv938-gy-grpo-r2-cp31-81326-v1 ready after 202.74814295768738s
Pipeline stage MKMLDeployer completed in 203.32s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.789336919784546s
Received healthy response to inference request in 2.160989761352539s
Received healthy response to inference request in 1.9150183200836182s
Received healthy response to inference request in 2.051609754562378s
Received healthy response to inference request in 1.9172642230987549s
5 requests
0 failed requests
5th percentile: 1.9154675006866455
10th percentile: 1.9159166812896729
20th percentile: 1.9168150424957275
30th percentile: 1.9441333293914795
40th percentile: 1.9978715419769286
50th percentile: 2.051609754562378
60th percentile: 2.095361757278442
70th percentile: 2.139113759994507
80th percentile: 2.2866591930389406
90th percentile: 2.5379980564117433
95th percentile: 2.6636674880981444
99th percentile: 2.7642030334472656
mean time: 2.166843795776367
Pipeline stage StressChecker completed in 12.22s
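The percentile and mean figures reported by the StressChecker stage are consistent with numpy-style linear interpolation over the five healthy request latencies. A minimal sketch that reproduces them is below; it is an assumption about how the stage aggregates, not its actual code.

```python
import numpy as np

# The five request latencies reported above, in seconds.
times = [2.789336919784546, 2.160989761352539, 1.9150183200836182,
         2.051609754562378, 1.9172642230987549]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    # np.percentile defaults to linear interpolation between order statistics.
    print(f"{p}th percentile: {np.percentile(times, p)}")
print("mean time:", np.mean(times))
```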
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.63s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.73s
Shutdown handler de-registered
rirv938-gy-grpo-r2-cp31_81326_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3330.94s
Shutdown handler de-registered
rirv938-gy-grpo-r2-cp31_81326_v1 status is now inactive due to auto deactivation of underperforming models
rirv938-gy-grpo-r2-cp31_81326_v1 status is now torndown due to DeploymentManager action