developer_uid: bogoconic1
submission_id: bogoconic1-mistral-nemo_85251_v1
model_name: bogoconic1-mistral-nemo_85251_v1
model_group: bogoconic1/mistral-nemo-
status: torndown
timestamp: 2025-05-01T11:51:45+00:00
num_battles: 8729
num_wins: 4095
celo_rating: 1262.74
family_friendly_score: 0.599
family_friendly_standard_error: 0.006931074952703946
submission_type: basic
model_repo: bogoconic1/mistral-nemo-13b-280k-step400
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
latencies: [{'batch_size': 1, 'throughput': 0.6098335850237565, 'latency_mean': 1.6397270131111146, 'latency_p50': 1.645824909210205, 'latency_p90': 1.8308682441711426}, {'batch_size': 3, 'throughput': 1.1015589793137517, 'latency_mean': 2.7096137976646424, 'latency_p50': 2.714420437812805, 'latency_p90': 2.995668959617615}, {'batch_size': 5, 'throughput': 1.3253582149396268, 'latency_mean': 3.74959591627121, 'latency_p50': 3.7660634517669678, 'latency_p90': 4.1819534540176395}, {'batch_size': 6, 'throughput': 1.4095182763245517, 'latency_mean': 4.236366754770279, 'latency_p50': 4.245135188102722, 'latency_p90': 4.742019486427307}, {'batch_size': 8, 'throughput': 1.4630128456524787, 'latency_mean': 5.413874696493149, 'latency_p50': 5.378474593162537, 'latency_p90': 6.101340866088867}, {'batch_size': 10, 'throughput': 1.5023635051734412, 'latency_mean': 6.604972856044769, 'latency_p50': 6.626284718513489, 'latency_p90': 7.520234084129333}]
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: bogoconic1-mistral-nemo_85251_v1
is_internal_developer: False
language_model: bogoconic1/mistral-nemo-13b-280k-step400
model_size: 13B
ranking_group: single
throughput_3p7s: 1.33
us_pacific_date: 2025-05-01
win_ratio: 0.46912590216519645
generation_params: {'temperature': 0.9, 'top_p': 0.9, 'min_p': 0.0, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
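The win_ratio field above follows directly from num_battles and num_wins; a quick check in Python, with the numbers copied from the fields above:

```python
# win_ratio is num_wins / num_battles for this submission.
num_battles = 8729
num_wins = 4095

win_ratio = num_wins / num_battles
print(win_ratio)  # ~0.4691, matching the win_ratio field above
```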
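The family_friendly_standard_error is numerically consistent with a binomial standard error sqrt(p * (1 - p) / n) over roughly 5,000 scored samples. That sample size is an inference from the reported numbers, not something stated in the log; a minimal sketch under that assumption:

```python
import math

# Assumption: family_friendly_score is treated as a proportion p, and the
# reported error is the binomial standard error sqrt(p * (1 - p) / n).
# n = 5000 is inferred from the numbers above, not stated in the log.
p = 0.599
n = 5000

standard_error = math.sqrt(p * (1 - p) / n)
print(standard_error)  # ~0.006931, close to family_friendly_standard_error
```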
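throughput_3p7s is not defined anywhere in this log. One reading consistent with the latencies field is the measured throughput at the batch size whose mean latency is closest to 3.7 s (batch_size 5, throughput ≈ 1.325, which rounds to 1.33). A minimal sketch under that assumption, with values rounded from the latencies field above:

```python
# Guess at the metric's definition: pick the throughput of the measured batch
# size whose mean latency is closest to 3.7 s. The log does not define it.
latencies = [
    {"batch_size": 1, "throughput": 0.6098, "latency_mean": 1.6397},
    {"batch_size": 3, "throughput": 1.1016, "latency_mean": 2.7096},
    {"batch_size": 5, "throughput": 1.3254, "latency_mean": 3.7496},
    {"batch_size": 6, "throughput": 1.4095, "latency_mean": 4.2364},
    {"batch_size": 8, "throughput": 1.4630, "latency_mean": 5.4139},
    {"batch_size": 10, "throughput": 1.5024, "latency_mean": 6.6050},
]

target = 3.7
closest = min(latencies, key=lambda row: abs(row["latency_mean"] - target))
print(round(closest["throughput"], 2))  # 1.33, for batch_size 5
```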
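The generation_params map onto standard sampling settings. Below is a sketch of how they could be passed to a vLLM-style engine; this is an assumption about the serving stack (the log does not name the engine), and best_of on this platform reflects reranking of 8 sampled candidates, approximated here with n=8:

```python
# Sketch only: assumes a vLLM-like engine; parameter names follow vLLM's
# SamplingParams, not the platform's internal serving code.
from vllm import LLM, SamplingParams

sampling_params = SamplingParams(
    temperature=0.9,
    top_p=0.9,
    min_p=0.0,
    top_k=80,
    presence_penalty=0.0,
    frequency_penalty=0.0,
    stop=["\n"],    # stopping_words: generation ends at a newline
    max_tokens=64,  # max_output_tokens
    n=8,            # stand-in for best_of: sample 8 candidates to rerank
)

# max_input_tokens (1024) is enforced upstream by prompt truncation;
# max_model_len here is only an approximation of that limit.
llm = LLM(model="bogoconic1/mistral-nemo-13b-280k-step400", max_model_len=1024)
outputs = llm.generate(["Hello"], sampling_params)
```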
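The formatter templates describe how a conversation is flattened into a single prompt string before sampling. A minimal sketch of that assembly; the helper function and example conversation are illustrative, not part of the platform code:

```python
# Illustrative assembly of a prompt from the formatter templates above.
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """Hypothetical helper: turns is a list of (speaker, message) pairs."""
    text = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    text += formatter["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        template = formatter["bot_template"] if speaker == "bot" else formatter["user_template"]
        name = bot_name if speaker == "bot" else user_name
        text += template.format(bot_name=name, user_name=name, message=message)
    # The model continues from "{bot_name}:" and stops at "\n" (see stopping_words).
    return text + formatter["response_template"].format(bot_name=bot_name)

print(build_prompt("Bot", "User", "A friendly assistant.", "Casual chat.",
                   [("user", "Hi!"), ("bot", "Hey there!"), ("user", "How are you?")]))
```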
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name bogoconic1-mistral-nemo-85251-v1-mkmlizer
Waiting for job on bogoconic1-mistral-nemo-85251-v1-mkmlizer to finish
bogoconic1-mistral-nemo-85251-v1-mkmlizer: [flywheel ASCII banner]
bogoconic1-mistral-nemo-85251-v1-mkmlizer: Version: 0.12.8
bogoconic1-mistral-nemo-85251-v1-mkmlizer: Copyright 2023 MK ONE TECHNOLOGIES Inc.
bogoconic1-mistral-nemo-85251-v1-mkmlizer: https://mk1.ai
bogoconic1-mistral-nemo-85251-v1-mkmlizer: The license key for the current software has been verified as belonging to:
bogoconic1-mistral-nemo-85251-v1-mkmlizer: Chai Research Corp.
bogoconic1-mistral-nemo-85251-v1-mkmlizer: Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f
bogoconic1-mistral-nemo-85251-v1-mkmlizer: Expiration: 2028-03-31 23:59:59
bogoconic1-mistral-nemo-85251-v1-mkmlizer: Downloaded to shared memory in 46.523s
bogoconic1-mistral-nemo-85251-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpanya8uy0, device:0
bogoconic1-mistral-nemo-85251-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
bogoconic1-mistral-nemo-85251-v1-mkmlizer: quantized model in 35.988s
bogoconic1-mistral-nemo-85251-v1-mkmlizer: Processed model bogoconic1/mistral-nemo-13b-280k-step400 in 82.512s
bogoconic1-mistral-nemo-85251-v1-mkmlizer: creating bucket guanaco-mkml-models
bogoconic1-mistral-nemo-85251-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
bogoconic1-mistral-nemo-85251-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/bogoconic1-mistral-nemo-85251-v1
bogoconic1-mistral-nemo-85251-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/bogoconic1-mistral-nemo-85251-v1/config.json
bogoconic1-mistral-nemo-85251-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/bogoconic1-mistral-nemo-85251-v1/special_tokens_map.json
bogoconic1-mistral-nemo-85251-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/bogoconic1-mistral-nemo-85251-v1/tokenizer_config.json
bogoconic1-mistral-nemo-85251-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/bogoconic1-mistral-nemo-85251-v1/tokenizer.json
bogoconic1-mistral-nemo-85251-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/bogoconic1-mistral-nemo-85251-v1/flywheel_model.0.safetensors
bogoconic1-mistral-nemo-85251-v1-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... 99%|█████████▉| 360/363 [00:15<00:00, 28.38it/s]
Job bogoconic1-mistral-nemo-85251-v1-mkmlizer completed after 104.51s with status: succeeded
Stopping job with name bogoconic1-mistral-nemo-85251-v1-mkmlizer
Pipeline stage MKMLizer completed in 104.98s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.15s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service bogoconic1-mistral-nemo-85251-v1
Waiting for inference service bogoconic1-mistral-nemo-85251-v1 to be ready
Failed to get response for submission bogoconic1-mistral-nemo_61673_v1: ('http://bogoconic1-mistral-nemo-61673-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:59674->127.0.0.1:8080: read: connection reset by peer\n')
Inference service bogoconic1-mistral-nemo-85251-v1 ready after 150.57128190994263s
Pipeline stage MKMLDeployer completed in 151.01s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.2296719551086426s
Received healthy response to inference request in 1.6753311157226562s
Received healthy response to inference request in 2.4888312816619873s
Received healthy response to inference request in 1.7051889896392822s
Received healthy response to inference request in 1.660341501235962s
5 requests
0 failed requests
5th percentile: 1.6633394241333008
10th percentile: 1.6663373470306397
20th percentile: 1.6723331928253173
30th percentile: 1.6813026905059814
40th percentile: 1.6932458400726318
50th percentile: 1.7051889896392822
60th percentile: 1.9149821758270262
70th percentile: 2.1247753620147702
80th percentile: 2.2815038204193114
90th percentile: 2.3851675510406496
95th percentile: 2.4369994163513184
99th percentile: 2.4784649085998534
mean time: 1.951872968673706
Pipeline stage StressChecker completed in 11.18s
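The stress-checker summary above can be reproduced from the five healthy response times with linear interpolation, e.g. NumPy's default percentile method; a quick check:

```python
import numpy as np

# The five healthy response times reported by the stress checker, in seconds.
times = [2.2296719551086426, 1.6753311157226562, 2.4888312816619873,
         1.7051889896392822, 1.660341501235962]

# NumPy's default (linear interpolation) matches the reported percentiles.
for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {np.percentile(times, q)}")
print("mean time:", np.mean(times))
```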
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.72s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.70s
Shutdown handler de-registered
bogoconic1-mistral-nemo_85251_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 4768.83s
Shutdown handler de-registered
bogoconic1-mistral-nemo_85251_v1 status is now inactive due to auto deactivation of underperforming models
bogoconic1-mistral-nemo_85251_v1 status is now torndown due to DeploymentManager action