developer_uid: bogoconic1
submission_id: bogoconic1-nemo-280k-av_88931_v1
model_name: bogoconic1-nemo-280k-av_88931_v1
model_group: bogoconic1/nemo-280k-avg
status: torndown
timestamp: 2025-05-02T07:59:58+00:00
num_battles: 6800
num_wins: 3292
celo_rating: 1277.98
family_friendly_score: 0.5754
family_friendly_standard_error: 0.0069902051472041935
submission_type: basic
model_repo: bogoconic1/nemo-280k-avg-chai-step200
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
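The `best_of: 8` entry above typically means the server samples several candidate completions per request and returns the highest-scoring one. How candidates are scored is not stated in this record; the sketch below assumes a generic scoring callable (a reward model in production, string length in the toy demo), and `best_of_n`, `generate`, and `score` are hypothetical names.

```python
def best_of_n(generate, score, n=8):
    """Draw n candidates from `generate` and keep the best under `score`.

    `generate` and `score` are stand-ins for the real serving stack's
    sampler and ranking model, which this log does not describe.
    """
    candidates = [generate() for _ in range(n)]
    return max(candidates, key=score)

# Deterministic demo: cycle through fixed candidates, prefer the longest.
cands = iter(["hi", "hello", "hello there", "hey"])
pick = best_of_n(cands.__next__, score=len, n=4)
print(pick)  # hello there
```

With `best_of: 8` and `max_output_tokens: 64`, each user request costs up to eight 64-token generations, which is consistent with the per-request latencies in the batch sweep below.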
latencies: [{'batch_size': 1, 'throughput': 0.6039048527135414, 'latency_mean': 1.6558190166950226, 'latency_p50': 1.6600618362426758, 'latency_p90': 1.814192271232605}, {'batch_size': 3, 'throughput': 1.109693247523293, 'latency_mean': 2.696892684698105, 'latency_p50': 2.7110408544540405, 'latency_p90': 2.9555413007736204}, {'batch_size': 5, 'throughput': 1.343242449770789, 'latency_mean': 3.7015683579444887, 'latency_p50': 3.725242257118225, 'latency_p90': 4.103866100311279}, {'batch_size': 6, 'throughput': 1.4168579372105459, 'latency_mean': 4.216242861747742, 'latency_p50': 4.234714508056641, 'latency_p90': 4.766851139068604}, {'batch_size': 8, 'throughput': 1.4793829003646743, 'latency_mean': 5.3746647548675535, 'latency_p50': 5.38042151927948, 'latency_p90': 5.963693833351135}, {'batch_size': 10, 'throughput': 1.529754063829574, 'latency_mean': 6.4900425744056705, 'latency_p50': 6.449192404747009, 'latency_p90': 7.428975057601929}]
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: bogoconic1-nemo-280k-av_88931_v1
is_internal_developer: False
language_model: bogoconic1/nemo-280k-avg-chai-step200
model_size: 13B
ranking_group: single
throughput_3p7s: 1.35
us_pacific_date: 2025-05-02
win_ratio: 0.48411764705882354
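The `win_ratio` above is simply `num_wins / num_battles`; a quick arithmetic check against the record:

```python
# Values taken directly from this submission record.
num_battles, num_wins = 6800, 3292
win_ratio = num_wins / num_battles
print(win_ratio)  # 0.48411764705882354
```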
generation_params: {'temperature': 0.9, 'top_p': 0.9, 'min_p': 0.0, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
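The sampling parameters above combine temperature scaling with top-k, top-p (nucleus), and min-p filtering; `stopping_words: ['\n']` halts generation at the first newline, consistent with single-line chat replies. A minimal sketch of how such a filter chain is commonly applied to one logit vector (the `sample_next_token` helper is ours, not the serving stack's API):

```python
import math
import random

def sample_next_token(logits, temperature=0.9, top_p=0.9, top_k=80, min_p=0.0):
    # Temperature-scale logits, then softmax to probabilities.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Top-k: keep only the k most probable token ids.
    order = sorted(range(len(probs)), key=lambda i: -probs[i])[:top_k]
    # Top-p: keep the smallest prefix whose cumulative mass reaches top_p.
    kept, mass = [], 0.0
    for i in order:
        kept.append(i)
        mass += probs[i]
        if mass >= top_p:
            break
    # Min-p: drop tokens below min_p * max probability (a no-op at 0.0).
    pmax = probs[kept[0]]
    kept = [i for i in kept if probs[i] >= min_p * pmax]
    weights = [probs[i] for i in kept]
    return random.choices(kept, weights=weights)[0]

# With one dominant logit, the filters leave a single candidate.
print(sample_next_token([10.0, 1.0, 0.5]))  # 0
```

At `min_p: 0.0` and `presence_penalty`/`frequency_penalty` of 0.0, only temperature, top-k, and top-p actually shape the distribution here.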
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
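The formatter templates above assemble the model's prompt from the persona memory, a scenario prompt, and the alternating chat turns, ending with the bare `response_template` so the model continues as the bot. A minimal sketch of one plausible assembly order, assuming persona first, then prompt, then turns (the `build_prompt` helper and its argument names are ours; the real truncation logic, e.g. `max_input_tokens` and `truncate_by_message`, is omitted):

```python
formatter = {
    'memory_template': "{bot_name}'s Persona: {memory}\n####\n",
    'prompt_template': '{prompt}\n<START>\n',
    'bot_template': '{bot_name}: {message}\n',
    'user_template': '{user_name}: {message}\n',
    'response_template': '{bot_name}:',
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """turns: list of ('bot' | 'user', message) tuples, oldest first."""
    parts = [
        formatter['memory_template'].format(bot_name=bot_name, memory=memory),
        formatter['prompt_template'].format(prompt=prompt),
    ]
    for speaker, message in turns:
        if speaker == 'bot':
            parts.append(formatter['bot_template'].format(
                bot_name=bot_name, message=message))
        else:
            parts.append(formatter['user_template'].format(
                user_name=user_name, message=message))
    # Ends with "{bot_name}:" so the model writes the bot's next line.
    parts.append(formatter['response_template'].format(bot_name=bot_name))
    return ''.join(parts)

text = build_prompt('Alice', 'Bob', 'cheerful', 'A chat.',
                    [('user', 'hi'), ('bot', 'hello')])
print(text)
```

Note how the `stopping_words: ['\n']` in the generation params pairs with `bot_template` ending in `\n`: each reply is a single line in the transcript format.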
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name bogoconic1-nemo-280k-av-88931-v1-mkmlizer
Waiting for job on bogoconic1-nemo-280k-av-88931-v1-mkmlizer to finish
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ║ _____ __ __ ║
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ║ /___/ ║
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ║ ║
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ║ Version: 0.12.8 ║
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ║ https://mk1.ai ║
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ║ ║
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ║ The license key for the current software has been verified as ║
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ║ belonging to: ║
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ║ ║
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ║ Chai Research Corp. ║
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ║ ║
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: Downloaded to shared memory in 42.869s
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp2yp0nfq7, device:0
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: quantized model in 36.263s
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: Processed model bogoconic1/nemo-280k-avg-chai-step200 in 79.133s
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: creating bucket guanaco-mkml-models
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/bogoconic1-nemo-280k-av-88931-v1
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/bogoconic1-nemo-280k-av-88931-v1/config.json
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/bogoconic1-nemo-280k-av-88931-v1/special_tokens_map.json
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/bogoconic1-nemo-280k-av-88931-v1/tokenizer_config.json
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/bogoconic1-nemo-280k-av-88931-v1/tokenizer.json
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/bogoconic1-nemo-280k-av-88931-v1/flywheel_model.0.safetensors
bogoconic1-nemo-280k-av-88931-v1-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] … Loading 0: 99%|█████████▊| 358/363 [00:15<00:00, 27.48it/s] (per-step progress elided)
Job bogoconic1-nemo-280k-av-88931-v1-mkmlizer completed after 104.63s with status: succeeded
Stopping job with name bogoconic1-nemo-280k-av-88931-v1-mkmlizer
Pipeline stage MKMLizer completed in 105.12s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.16s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service bogoconic1-nemo-280k-av-88931-v1
Waiting for inference service bogoconic1-nemo-280k-av-88931-v1 to be ready
Inference service bogoconic1-nemo-280k-av-88931-v1 ready after 150.86253213882446s
Pipeline stage MKMLDeployer completed in 151.39s
run pipeline stage %s
Running pipeline stage StressChecker
Failed to get response for submission cycy233-modelv-cc1-step1500_v1: HTTPConnectionPool(host='cycy233-modelv-cc1-step1500-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission cycy233-modelv-cc1-step2500_v1: HTTPConnectionPool(host='cycy233-modelv-cc1-step2500-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 2.450457811355591s
Received healthy response to inference request in 1.239197015762329s
Received healthy response to inference request in 1.7858471870422363s
Received healthy response to inference request in 1.6137006282806396s
5 requests
1 failed requests
5th percentile: 1.3140977382659913
10th percentile: 1.3889984607696533
20th percentile: 1.5387999057769775
30th percentile: 1.648129940032959
40th percentile: 1.7169885635375977
50th percentile: 1.7858471870422363
60th percentile: 2.051691436767578
70th percentile: 2.3175356864929197
80th percentile: 5.988102149963382
90th percentile: 13.063390827178956
95th percentile: 16.60103516578674
99th percentile: 19.431150636672974
mean time: 5.445576429367065
%s, retrying in %s seconds...
Received healthy response to inference request in 1.511340618133545s
Received healthy response to inference request in 1.9330668449401855s
Received healthy response to inference request in 1.5250778198242188s
Received healthy response to inference request in 1.6054954528808594s
Received healthy response to inference request in 1.522611141204834s
5 requests
0 failed requests
5th percentile: 1.5135947227478028
10th percentile: 1.5158488273620605
20th percentile: 1.520357036590576
30th percentile: 1.523104476928711
40th percentile: 1.5240911483764648
50th percentile: 1.5250778198242188
60th percentile: 1.557244873046875
70th percentile: 1.5894119262695312
80th percentile: 1.6710097312927246
90th percentile: 1.8020382881164552
95th percentile: 1.8675525665283204
99th percentile: 1.9199639892578124
mean time: 1.6195183753967286
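The stress-check percentiles above are consistent with linear interpolation over the five measured response times (the default method of `numpy.percentile`). A sketch reproducing the second, all-healthy batch's numbers with a small hand-rolled helper (`percentile` is ours, mirroring that interpolation rule):

```python
def percentile(samples, p):
    """Percentile via linear interpolation between closest ranks."""
    xs = sorted(samples)
    k = (len(xs) - 1) * p / 100.0
    lo = int(k)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (k - lo) * (xs[hi] - xs[lo])

# Healthy response times from the second stress-check batch (seconds).
times = [1.511340618133545, 1.9330668449401855, 1.5250778198242188,
         1.6054954528808594, 1.522611141204834]

print(round(percentile(times, 5), 6))    # 1.513595
print(round(percentile(times, 50), 6))   # 1.525078
print(round(sum(times) / len(times), 6)) # 1.619518
```

The same rule explains the first batch's inflated upper percentiles: with one failed request counted at its timeout, the 80th–99th percentiles interpolate toward that outlier, which is why they jump past 5 s while the median stays near 1.8 s.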
Pipeline stage StressChecker completed in 38.12s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.68s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.71s
Shutdown handler de-registered
bogoconic1-nemo-280k-av_88931_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.11s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.09s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service bogoconic1-nemo-280k-av-88931-v1-profiler
Waiting for inference service bogoconic1-nemo-280k-av-88931-v1-profiler to be ready
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 5021.51s
Shutdown handler de-registered
bogoconic1-nemo-280k-av_88931_v1 status is now inactive due to auto-deactivation of underperforming models
bogoconic1-nemo-280k-av_88931_v1 status is now torndown due to DeploymentManager action