developer_uid: zonemercy
submission_id: mistralai-mistral-nem_93303_v236
model_name: tempv1-3
model_group: mistralai/Mistral-Nemo-I
status: torndown
timestamp: 2024-12-28T21:10:19+00:00
num_battles: 14465
num_wins: 6308
celo_rating: 1229.77
family_friendly_score: 0.5946
family_friendly_standard_error: 0.006943354232645775
submission_type: basic
model_repo: mistralai/Mistral-Nemo-Instruct-2407
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 4
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies: [{'batch_size': 1, 'throughput': 0.641221881581438, 'latency_mean': 1.5594536232948304, 'latency_p50': 1.5544320344924927, 'latency_p90': 1.7169620990753174}, {'batch_size': 3, 'throughput': 1.270716981612357, 'latency_mean': 2.3505579805374146, 'latency_p50': 2.336992859840393, 'latency_p90': 2.696609282493591}, {'batch_size': 5, 'throughput': 1.6078916656891689, 'latency_mean': 3.099547505378723, 'latency_p50': 3.1165976524353027, 'latency_p90': 3.445496344566345}, {'batch_size': 6, 'throughput': 1.7282057742570187, 'latency_mean': 3.4505859100818634, 'latency_p50': 3.4539579153060913, 'latency_p90': 3.944593977928162}, {'batch_size': 8, 'throughput': 1.8852335304008707, 'latency_mean': 4.211586371660233, 'latency_p50': 4.2080957889556885, 'latency_p90': 4.814598608016968}, {'batch_size': 10, 'throughput': 1.961549949041396, 'latency_mean': 5.061745156049728, 'latency_p50': 5.073854565620422, 'latency_p90': 5.719829678535461}]
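One plausible reading of the throughput_3p7s field further down (1.8) is the throughput linearly interpolated at a 3.7 s mean latency between the batch-size 6 and batch-size 8 measurements above. This is an assumption about how the field is derived, not documented behavior; the sketch below reproduces the reported value under that assumption.

```python
# Assumption (not documented in this log): throughput_3p7s is the throughput
# linearly interpolated at a 3.7 s mean latency from the latencies field.
def interpolate_throughput(points, target_latency):
    """Linearly interpolate throughput at a target mean latency (seconds)."""
    pts = sorted(points, key=lambda p: p["latency_mean"])
    for lo, hi in zip(pts, pts[1:]):
        if lo["latency_mean"] <= target_latency <= hi["latency_mean"]:
            frac = (target_latency - lo["latency_mean"]) / (
                hi["latency_mean"] - lo["latency_mean"]
            )
            return lo["throughput"] + frac * (hi["throughput"] - lo["throughput"])
    raise ValueError("target latency outside the measured range")

# The two bracketing measurements from the latencies field above.
latencies = [
    {"batch_size": 6, "throughput": 1.7282057742570187, "latency_mean": 3.4505859100818634},
    {"batch_size": 8, "throughput": 1.8852335304008707, "latency_mean": 4.211586371660233},
]
print(round(interpolate_throughput(latencies, 3.7), 1))  # 1.8
```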
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: tempv1-3
is_internal_developer: True
language_model: mistralai/Mistral-Nemo-Instruct-2407
model_size: 13B
ranking_group: single
throughput_3p7s: 1.8
us_pacific_date: 2024-12-28
win_ratio: 0.43608710680954027
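The win_ratio field is simply num_wins divided by num_battles from the fields above, as a quick sanity check confirms:

```python
# win_ratio = num_wins / num_battles, using the values logged above.
num_battles = 14465
num_wins = 6308
win_ratio = num_wins / num_battles
print(win_ratio)  # 0.43608710680954027
```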
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|im_end|>', '####', '</s>', 'You:', 'User:', 'Bot:', '<|eot_id|>'], 'max_input_tokens': 1024, 'best_of': 4, 'max_output_tokens': 64}
formatter: {'memory_template': '', 'prompt_template': '', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
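A minimal sketch of how the formatter templates above might assemble a prompt: each turn is rendered with bot_template or user_template, and response_template is appended so generation continues from the bot's name. The conversation content, speaker names, and the `render_prompt` helper are invented for illustration; memory_template and prompt_template are empty in this submission and are omitted here.

```python
# Formatter templates copied from the submission record above.
formatter = {
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def render_prompt(turns, bot_name):
    """Render (role, name, message) turns into a single prompt string.

    Hypothetical helper: the actual templater's interface is not shown
    in this log. str.format ignores unused keyword arguments, so both
    names can be passed to either template.
    """
    prompt = ""
    for role, name, message in turns:
        template = formatter["bot_template"] if role == "bot" else formatter["user_template"]
        prompt += template.format(bot_name=name, user_name=name, message=message)
    return prompt + formatter["response_template"].format(bot_name=bot_name)

turns = [("user", "Alice", "Hi there!"), ("bot", "Nemo", "Hello, Alice.")]
print(render_prompt(turns, "Nemo"))
```

Note how the stopping_words list in generation_params includes '\n', 'User:', and 'Bot:', which matches this line-per-turn prompt layout: generation stops as soon as the model starts a new turn.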
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name mistralai-mistral-nem-93303-v236-mkmlizer
Waiting for job on mistralai-mistral-nem-93303-v236-mkmlizer to finish
mistralai-mistral-nem-93303-v236-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
mistralai-mistral-nem-93303-v236-mkmlizer: ║ _____ __ __ ║
mistralai-mistral-nem-93303-v236-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
mistralai-mistral-nem-93303-v236-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
mistralai-mistral-nem-93303-v236-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
mistralai-mistral-nem-93303-v236-mkmlizer: ║ /___/ ║
mistralai-mistral-nem-93303-v236-mkmlizer: ║ ║
mistralai-mistral-nem-93303-v236-mkmlizer: ║ Version: 0.11.12 ║
mistralai-mistral-nem-93303-v236-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
mistralai-mistral-nem-93303-v236-mkmlizer: ║ https://mk1.ai ║
mistralai-mistral-nem-93303-v236-mkmlizer: ║ ║
mistralai-mistral-nem-93303-v236-mkmlizer: ║ The license key for the current software has been verified as ║
mistralai-mistral-nem-93303-v236-mkmlizer: ║ belonging to: ║
mistralai-mistral-nem-93303-v236-mkmlizer: ║ ║
mistralai-mistral-nem-93303-v236-mkmlizer: ║ Chai Research Corp. ║
mistralai-mistral-nem-93303-v236-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
mistralai-mistral-nem-93303-v236-mkmlizer: ║ Expiration: 2025-01-15 23:59:59 ║
mistralai-mistral-nem-93303-v236-mkmlizer: ║ ║
mistralai-mistral-nem-93303-v236-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
mistralai-mistral-nem-93303-v236-mkmlizer: Downloaded to shared memory in 58.106s
mistralai-mistral-nem-93303-v236-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpucfmkd36, device:0
mistralai-mistral-nem-93303-v236-mkmlizer: Saving flywheel model at /dev/shm/model_cache
mistralai-mistral-nem-93303-v236-mkmlizer: quantized model in 37.168s
mistralai-mistral-nem-93303-v236-mkmlizer: Processed model mistralai/Mistral-Nemo-Instruct-2407 in 95.274s
mistralai-mistral-nem-93303-v236-mkmlizer: creating bucket guanaco-mkml-models
mistralai-mistral-nem-93303-v236-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
mistralai-mistral-nem-93303-v236-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/mistralai-mistral-nem-93303-v236
mistralai-mistral-nem-93303-v236-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/mistralai-mistral-nem-93303-v236/config.json
mistralai-mistral-nem-93303-v236-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/mistralai-mistral-nem-93303-v236/special_tokens_map.json
mistralai-mistral-nem-93303-v236-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/mistralai-mistral-nem-93303-v236/tokenizer_config.json
mistralai-mistral-nem-93303-v236-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/mistralai-mistral-nem-93303-v236/tokenizer.json
mistralai-mistral-nem-93303-v236-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/mistralai-mistral-nem-93303-v236/flywheel_model.0.safetensors
mistralai-mistral-nem-93303-v236-mkmlizer: Loading 0: 100%|█████████▉| 362/363 [00:15<00:00, 32.80it/s]
Job mistralai-mistral-nem-93303-v236-mkmlizer completed after 135.26s with status: succeeded
Stopping job with name mistralai-mistral-nem-93303-v236-mkmlizer
Pipeline stage MKMLizer completed in 135.75s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.15s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service mistralai-mistral-nem-93303-v236
Waiting for inference service mistralai-mistral-nem-93303-v236 to be ready
Failed to get response for submission alexdaoud-trainer-bagir-_3620_v2: ('http://alexdaoud-trainer-bagir-3620-v2-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'request timeout')
Inference service mistralai-mistral-nem-93303-v236 ready after 321.1545090675354s
Pipeline stage MKMLDeployer completed in 321.66s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.3416013717651367s
Received healthy response to inference request in 0.6619718074798584s
Received healthy response to inference request in 0.9006948471069336s
Received healthy response to inference request in 1.3628027439117432s
Received healthy response to inference request in 0.6395649909973145s
5 requests
0 failed requests
5th percentile: 0.6440463542938233
10th percentile: 0.648527717590332
20th percentile: 0.6574904441833496
30th percentile: 0.7097164154052734
40th percentile: 0.8052056312561036
50th percentile: 0.9006948471069336
60th percentile: 1.0770574569702147
70th percentile: 1.253420066833496
80th percentile: 1.345841646194458
90th percentile: 1.3543221950531006
95th percentile: 1.3585624694824219
99th percentile: 1.361954689025879
mean time: 0.9813271522521972
Pipeline stage StressChecker completed in 6.23s
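The percentile figures above are consistent with linear interpolation over the five healthy response times (the same convention numpy.percentile uses by default). The sketch below is a reconstruction under that assumption, not the StressChecker's actual code:

```python
# The five healthy response times logged above, sorted ascending.
times = sorted([
    1.3416013717651367, 0.6619718074798584, 0.9006948471069336,
    1.3628027439117432, 0.6395649909973145,
])

def percentile(data, q):
    """Percentile of sorted data with linear interpolation between ranks."""
    pos = (q / 100) * (len(data) - 1)
    lo = int(pos)
    frac = pos - lo
    if lo + 1 < len(data):
        return data[lo] + frac * (data[lo + 1] - data[lo])
    return data[lo]

print(percentile(times, 50))    # 0.9006948471069336 (the median)
print(percentile(times, 95))    # ≈ 1.3585624694824219
print(sum(times) / len(times))  # ≈ 0.9813271522521972 (mean time)
```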
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.65s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.62s
Shutdown handler de-registered
mistralai-mistral-nem_93303_v236 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 2407.87s
Shutdown handler de-registered
mistralai-mistral-nem_93303_v236 status is now inactive due to auto deactivation (underperforming models removed)
mistralai-mistral-nem_93303_v236 status is now torndown due to DeploymentManager action