developer_uid: azuruce
submission_id: wassimm-mistral-nemo-in_91128_v1
model_name: wassimm-mistral-nemo-in_91128_v1
model_group: wassimm/Mistral-Nemo-Ins
status: torndown
timestamp: 2025-02-03T21:18:45+00:00
num_battles: 6773
num_wins: 3035
celo_rating: 1222.23
family_friendly_score: 0.5828
family_friendly_standard_error: 0.006973437602789603
submission_type: basic
model_repo: wassimm/Mistral-Nemo-Instruct-2407-400k_data-w
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
display_name: wassimm-mistral-nemo-in_91128_v1
is_internal_developer: False
language_model: wassimm/Mistral-Nemo-Instruct-2407-400k_data-w
model_size: 13B
ranking_group: single
us_pacific_date: 2025-02-03
win_ratio: 0.4481027609626458
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.001, 'top_k': 100, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['<|eot_id|>', '###', '<|im_end|>', 'User:', '</s>', '####', 'Me:', '\n', 'Bot:', '<|end_of_text|>', 'You:'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': '', 'prompt_template': '', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
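The formatter above defines only per-message templates (the memory and prompt templates are empty), and the generation_params are passed to the sampler as-is, with best_of=8 candidate completions per turn and a stop-word list that cuts generation at role markers or newlines. As a rough illustration of how such templates compose into a prompt, here is a minimal Python sketch; build_prompt, the example names, and the chat history are illustrative assumptions, not part of the submission config.

# Minimal sketch: assemble a prompt from the formatter templates listed above.
# build_prompt and the sample conversation are hypothetical, for illustration only.
FORMATTER = {
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(history, bot_name, user_name):
    parts = []
    for role, message in history:
        template = FORMATTER["bot_template"] if role == "bot" else FORMATTER["user_template"]
        name = bot_name if role == "bot" else user_name
        # str.format ignores unused keyword arguments, so both names can be passed.
        parts.append(template.format(bot_name=name, user_name=name, message=message))
    # The response template primes the model to reply in character; generation then
    # stops at any of the configured stopping_words (e.g. '\n', 'User:', 'Bot:').
    parts.append(FORMATTER["response_template"].format(bot_name=bot_name))
    return "".join(parts)

print(build_prompt([("user", "Hi!"), ("bot", "Hello there.")], "Bot", "User"))
# -> "User: Hi!\nBot: Hello there.\nBot:"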
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name wassimm-mistral-nemo-in-91128-v1-mkmlizer
Waiting for job on wassimm-mistral-nemo-in-91128-v1-mkmlizer to finish
wassimm-mistral-nemo-in-91128-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
wassimm-mistral-nemo-in-91128-v1-mkmlizer: ║ flywheel (ASCII art banner) ║
wassimm-mistral-nemo-in-91128-v1-mkmlizer: ║ Version: 0.11.12 ║
wassimm-mistral-nemo-in-91128-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
wassimm-mistral-nemo-in-91128-v1-mkmlizer: ║ https://mk1.ai ║
wassimm-mistral-nemo-in-91128-v1-mkmlizer: ║ ║
wassimm-mistral-nemo-in-91128-v1-mkmlizer: ║ The license key for the current software has been verified as ║
wassimm-mistral-nemo-in-91128-v1-mkmlizer: ║ belonging to: ║
wassimm-mistral-nemo-in-91128-v1-mkmlizer: ║ ║
wassimm-mistral-nemo-in-91128-v1-mkmlizer: ║ Chai Research Corp. ║
wassimm-mistral-nemo-in-91128-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
wassimm-mistral-nemo-in-91128-v1-mkmlizer: ║ Expiration: 2025-04-15 23:59:59 ║
wassimm-mistral-nemo-in-91128-v1-mkmlizer: ║ ║
wassimm-mistral-nemo-in-91128-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
wassimm-mistral-nemo-in-91128-v1-mkmlizer: Downloaded to shared memory in 53.955s
wassimm-mistral-nemo-in-91128-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp41dwh0a0, device:0
wassimm-mistral-nemo-in-91128-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Failed to get response for submission nitral-ai-captain-eris_59667_v36: ('http://nitral-ai-captain-eris-59667-v36-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:54058->127.0.0.1:8080: read: connection reset by peer\n')
wassimm-mistral-nemo-in-91128-v1-mkmlizer: quantized model in 34.660s
wassimm-mistral-nemo-in-91128-v1-mkmlizer: Processed model wassimm/Mistral-Nemo-Instruct-2407-400k_data-w in 88.615s
wassimm-mistral-nemo-in-91128-v1-mkmlizer: creating bucket guanaco-mkml-models
wassimm-mistral-nemo-in-91128-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
wassimm-mistral-nemo-in-91128-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/wassimm-mistral-nemo-in-91128-v1
wassimm-mistral-nemo-in-91128-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/wassimm-mistral-nemo-in-91128-v1/config.json
wassimm-mistral-nemo-in-91128-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/wassimm-mistral-nemo-in-91128-v1/special_tokens_map.json
wassimm-mistral-nemo-in-91128-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/wassimm-mistral-nemo-in-91128-v1/tokenizer_config.json
wassimm-mistral-nemo-in-91128-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/wassimm-mistral-nemo-in-91128-v1/tokenizer.json
wassimm-mistral-nemo-in-91128-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/wassimm-mistral-nemo-in-91128-v1/flywheel_model.0.safetensors
wassimm-mistral-nemo-in-91128-v1-mkmlizer: Loading 0: 363/363 shards loaded in ~15s (throughput mostly 35-55 it/s, with a brief stall around shard 307)
Job wassimm-mistral-nemo-in-91128-v1-mkmlizer completed after 114.57s with status: succeeded
Stopping job with name wassimm-mistral-nemo-in-91128-v1-mkmlizer
Pipeline stage MKMLizer completed in 115.02s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.15s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service wassimm-mistral-nemo-in-91128-v1
Waiting for inference service wassimm-mistral-nemo-in-91128-v1 to be ready
Inference service wassimm-mistral-nemo-in-91128-v1 ready after 160.5482439994812s
Pipeline stage MKMLDeployer completed in 160.97s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 2.065361499786377s
Received healthy response to inference request in 1.6310420036315918s
Received healthy response to inference request in 1.3101646900177002s
Received healthy response to inference request in 1.4003710746765137s
5 requests
1 failed request
5th percentile: 1.3282059669494628
10th percentile: 1.3462472438812256
20th percentile: 1.382329797744751
30th percentile: 1.4465052604675293
40th percentile: 1.5387736320495606
50th percentile: 1.6310420036315918
60th percentile: 1.804769802093506
70th percentile: 1.9784976005554198
80th percentile: 5.677158308029178
90th percentile: 12.900751924514772
95th percentile: 16.512548732757566
99th percentile: 19.401986179351805
mean time: 5.30625696182251
%s, retrying in %s seconds...
Received healthy response to inference request in 1.7085869312286377s
Received healthy response to inference request in 1.3042047023773193s
Received healthy response to inference request in 1.651991367340088s
Received healthy response to inference request in 1.7392213344573975s
Received healthy response to inference request in 1.7480342388153076s
5 requests
0 failed requests
5th percentile: 1.373762035369873
10th percentile: 1.4433193683624268
20th percentile: 1.5824340343475343
30th percentile: 1.6633104801177978
40th percentile: 1.6859487056732179
50th percentile: 1.7085869312286377
60th percentile: 1.7208406925201416
70th percentile: 1.7330944538116455
80th percentile: 1.7409839153289794
90th percentile: 1.7445090770721436
95th percentile: 1.7462716579437256
99th percentile: 1.7476817226409913
mean time: 1.63040771484375
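The reported percentiles are consistent with simple linear interpolation over the five sorted latencies (numpy's default behaviour); the first run's ~5.3s mean and inflated upper percentiles come from the single failed request being counted at roughly its 20s read timeout. A minimal sketch of the calculation for the second run, assuming numpy is acceptable here:

import numpy as np

# Latencies (seconds) of the five healthy responses in the second stress-check run.
latencies = [1.7086, 1.3042, 1.6520, 1.7392, 1.7480]

# numpy's default linear interpolation reproduces the figures above,
# e.g. 5th percentile ~1.3738s, median ~1.7086s, mean ~1.6304s.
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p):.4f}")
print(f"mean time: {np.mean(latencies):.4f}")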
Pipeline stage StressChecker completed in 37.05s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.64s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.65s
Shutdown handler de-registered
wassimm-mistral-nemo-in_91128_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 4768.47s
Shutdown handler de-registered
wassimm-mistral-nemo-in_91128_v1 status is now inactive due to auto-deactivation of underperforming models
wassimm-mistral-nemo-in_91128_v1 status is now torndown due to DeploymentManager action