developer_uid: azuruce
submission_id: mistralai-mistral-nemo-_9330_v92
model_name: ebony-horror-baseline-no-memory
model_group: mistralai/Mistral-Nemo-I
status: torndown
timestamp: 2024-09-19T08:28:04+00:00
num_battles: 12434
num_wins: 5803
celo_rating: 1220.42
family_friendly_score: 0.0
submission_type: basic
model_repo: mistralai/Mistral-Nemo-Instruct-2407
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 4
max_input_tokens: 1024
max_output_tokens: 64
latencies:
  batch_size 1:  throughput 0.632, latency_mean 1.582s, latency_p50 1.577s, latency_p90 1.752s
  batch_size 3:  throughput 1.247, latency_mean 2.395s, latency_p50 2.384s, latency_p90 2.626s
  batch_size 5:  throughput 1.539, latency_mean 3.234s, latency_p50 3.236s, latency_p90 3.591s
  batch_size 6:  throughput 1.620, latency_mean 3.682s, latency_p50 3.702s, latency_p90 4.109s
  batch_size 8:  throughput 1.729, latency_mean 4.597s, latency_p50 4.625s, latency_p90 5.257s
  batch_size 10: throughput 1.775, latency_mean 5.596s, latency_p50 5.607s, latency_p90 6.423s
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: ebony-horror-baseline-no-memory
is_internal_developer: True
language_model: mistralai/Mistral-Nemo-Instruct-2407
model_size: 13B
ranking_group: single
throughput_3p7s: 1.63
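The throughput_3p7s figure is presumably the sustained throughput at roughly a 3.7-second mean latency, read off the latency sweep above. How the pipeline actually derives it is not shown in this log; a minimal sketch, assuming simple linear interpolation over the measured (latency_mean, throughput) pairs:

```python
# Hypothetical reconstruction of throughput_3p7s: linearly interpolate
# throughput at a 3.7 s mean latency from the measured latency sweep.
# The actual derivation used by the pipeline is an assumption here.
latencies = [
    (1.5816, 0.6322),  # (latency_mean, throughput) at batch_size 1
    (2.3954, 1.2469),  # batch_size 3
    (3.2343, 1.5394),  # batch_size 5
    (3.6821, 1.6199),  # batch_size 6
    (4.5965, 1.7286),  # batch_size 8
    (5.5959, 1.7747),  # batch_size 10
]

def throughput_at(target_latency, points):
    """Linearly interpolate throughput at a target mean latency."""
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= target_latency <= x1:
            frac = (target_latency - x0) / (x1 - x0)
            return y0 + frac * (y1 - y0)
    raise ValueError("target latency outside measured range")

print(round(throughput_at(3.7, latencies), 2))  # ~1.62
```

This yields about 1.62, close to but not exactly the reported 1.63, so the pipeline likely rounds its inputs differently or interpolates on another latency statistic.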
us_pacific_date: 2024-09-19
win_ratio: 0.46670419816631814
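The win_ratio field follows directly from the battle counts reported above:

```python
# win_ratio is num_wins / num_battles, using the counts from this record.
num_battles = 12434
num_wins = 5803
win_ratio = num_wins / num_battles
print(win_ratio)  # ~0.4667, matching the reported win_ratio
```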
generation_params: {'temperature': 0.9, 'top_p': 0.9, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['\n', '</s>'], 'max_input_tokens': 1024, 'best_of': 4, 'max_output_tokens': 64}
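Among the sampling parameters above, min_p is the least common: tokens whose probability falls below min_p times the top token's probability are excluded from the candidate pool before sampling. A sketch of that standard semantics (an illustration only, not the serving stack's actual implementation):

```python
import numpy as np

def min_p_filter(probs, min_p=0.05):
    """Keep tokens whose probability is at least min_p times the largest
    probability, then renormalise the survivors. This illustrates the
    standard min_p semantics, not the serving stack's actual code."""
    probs = np.asarray(probs, dtype=float)
    keep = probs >= min_p * probs.max()
    filtered = np.where(keep, probs, 0.0)
    return filtered / filtered.sum()

# With min_p=0.05 (as in generation_params), any token below 5% of the
# top token's probability is dropped: here 0.02 < 0.05 * 0.60 = 0.03.
p = min_p_filter([0.60, 0.30, 0.08, 0.02], min_p=0.05)
```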
formatter: {'memory_template': '', 'prompt_template': '', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
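The formatter's template strings can be exercised with a small sketch. The empty memory_template and prompt_template are consistent with the "no-memory" display name; render_prompt below is a hypothetical helper, but the template strings are the ones reported above:

```python
# Template strings taken from this submission's formatter; the helper
# itself is a hypothetical reconstruction of how they are applied.
formatter = {
    'bot_template': '{bot_name}: {message}\n',
    'user_template': '{user_name}: {message}\n',
    'response_template': '{bot_name}:',
}

def render_prompt(messages, bot_name, formatter):
    """Format a chat history with the submission's templates and append
    the response stub that the model is asked to complete."""
    out = []
    for sender, message in messages:
        template = (formatter['bot_template'] if sender == bot_name
                    else formatter['user_template'])
        out.append(template.format(bot_name=sender, user_name=sender,
                                   message=message))
    out.append(formatter['response_template'].format(bot_name=bot_name))
    return ''.join(out)

prompt = render_prompt([('Alice', 'Hi there'), ('Bot', 'Hello!')],
                       bot_name='Bot', formatter=formatter)
# "Alice: Hi there\nBot: Hello!\nBot:"
```

The trailing "Bot:" stub explains the '\n' stopping word in generation_params: the model writes one line in the bot's voice and halts at the newline.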
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name mistralai-mistral-nemo-9330-v92-mkmlizer
Waiting for job on mistralai-mistral-nemo-9330-v92-mkmlizer to finish
mistralai-mistral-nemo-9330-v92-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
mistralai-mistral-nemo-9330-v92-mkmlizer: ║ _____ __ __ ║
mistralai-mistral-nemo-9330-v92-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
mistralai-mistral-nemo-9330-v92-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
mistralai-mistral-nemo-9330-v92-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
mistralai-mistral-nemo-9330-v92-mkmlizer: ║ /___/ ║
mistralai-mistral-nemo-9330-v92-mkmlizer: ║ ║
mistralai-mistral-nemo-9330-v92-mkmlizer: ║ Version: 0.10.1 ║
mistralai-mistral-nemo-9330-v92-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
mistralai-mistral-nemo-9330-v92-mkmlizer: ║ https://mk1.ai ║
mistralai-mistral-nemo-9330-v92-mkmlizer: ║ ║
mistralai-mistral-nemo-9330-v92-mkmlizer: ║ The license key for the current software has been verified as ║
mistralai-mistral-nemo-9330-v92-mkmlizer: ║ belonging to: ║
mistralai-mistral-nemo-9330-v92-mkmlizer: ║ ║
mistralai-mistral-nemo-9330-v92-mkmlizer: ║ Chai Research Corp. ║
mistralai-mistral-nemo-9330-v92-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
mistralai-mistral-nemo-9330-v92-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
mistralai-mistral-nemo-9330-v92-mkmlizer: ║ ║
mistralai-mistral-nemo-9330-v92-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
Connection pool is full, discarding connection: %s. Connection pool size: %s
mistralai-mistral-nemo-9330-v92-mkmlizer: Downloaded to shared memory in 49.163s
mistralai-mistral-nemo-9330-v92-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp_ske7pay, device:0
mistralai-mistral-nemo-9330-v92-mkmlizer: Saving flywheel model at /dev/shm/model_cache
mistralai-mistral-nemo-9330-v92-mkmlizer: quantized model in 36.184s
mistralai-mistral-nemo-9330-v92-mkmlizer: Processed model mistralai/Mistral-Nemo-Instruct-2407 in 85.347s
mistralai-mistral-nemo-9330-v92-mkmlizer: creating bucket guanaco-mkml-models
mistralai-mistral-nemo-9330-v92-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v92/config.json
mistralai-mistral-nemo-9330-v92-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v92/special_tokens_map.json
mistralai-mistral-nemo-9330-v92-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v92/tokenizer_config.json
mistralai-mistral-nemo-9330-v92-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v92/tokenizer.json
mistralai-mistral-nemo-9330-v92-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v92/flywheel_model.0.safetensors
mistralai-mistral-nemo-9330-v92-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... Loading 0: 100%|█████████▉| 362/363 [00:15<00:00, 30.12it/s]
Job mistralai-mistral-nemo-9330-v92-mkmlizer completed after 113.75s with status: succeeded
Stopping job with name mistralai-mistral-nemo-9330-v92-mkmlizer
Pipeline stage MKMLizer completed in 114.47s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.09s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service mistralai-mistral-nemo-9330-v92
Waiting for inference service mistralai-mistral-nemo-9330-v92 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Inference service mistralai-mistral-nemo-9330-v92 ready after 191.23732018470764s
Pipeline stage MKMLDeployer completed in 191.62s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.8077218532562256s
Received healthy response to inference request in 1.3766798973083496s
Received healthy response to inference request in 0.6613223552703857s
Received healthy response to inference request in 1.4591174125671387s
Received healthy response to inference request in 1.8455865383148193s
5 requests
0 failed requests
5th percentile: 0.8043938636779785
10th percentile: 0.9474653720855712
20th percentile: 1.2336083889007567
30th percentile: 1.3931674003601073
40th percentile: 1.426142406463623
50th percentile: 1.4591174125671387
60th percentile: 1.5985591888427735
70th percentile: 1.738000965118408
80th percentile: 1.8152947902679444
90th percentile: 1.8304406642913817
95th percentile: 1.8380136013031005
99th percentile: 1.8440719509124757
mean time: 1.4300856113433837
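The stress-check statistics above can be reproduced from the five response times; the reported percentiles match numpy's default linear-interpolation method:

```python
import numpy as np

# The five healthy response times logged by the StressChecker stage.
times = [1.8077218532562256, 1.3766798973083496, 0.6613223552703857,
         1.4591174125671387, 1.8455865383148193]

# numpy's default linear interpolation reproduces the reported figures,
# e.g. 5th percentile 0.8043938636779785 and mean 1.4300856113433837.
for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {np.percentile(times, q)}")
print("mean time:", np.mean(times))
```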
Pipeline stage StressChecker completed in 7.92s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 4.70s
Shutdown handler de-registered
mistralai-mistral-nemo-_9330_v92 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.10s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.10s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service mistralai-mistral-nemo-9330-v92-profiler
Waiting for inference service mistralai-mistral-nemo-9330-v92-profiler to be ready
Inference service mistralai-mistral-nemo-9330-v92-profiler ready after 190.44166326522827s
Pipeline stage MKMLProfilerDeployer completed in 190.77s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/mistralai-mistral-ne5cf1bfcf44dcd4f2a6da478c87dacc31-deplolfdxr:/code/chaiverse_profiler_1726735037 --namespace tenant-chaiml-guanaco
kubectl exec -it mistralai-mistral-ne5cf1bfcf44dcd4f2a6da478c87dacc31-deplolfdxr --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1726735037 && python profiles.py profile --best_of_n 4 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 1024 --output_tokens 64 --summary /code/chaiverse_profiler_1726735037/summary.json'
kubectl exec -it mistralai-mistral-ne5cf1bfcf44dcd4f2a6da478c87dacc31-deplolfdxr --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1726735037/summary.json'
Pipeline stage MKMLProfilerRunner completed in 963.79s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service mistralai-mistral-nemo-9330-v92-profiler is running
Tearing down inference service mistralai-mistral-nemo-9330-v92-profiler
Service mistralai-mistral-nemo-9330-v92-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 2.01s
Shutdown handler de-registered
mistralai-mistral-nemo-_9330_v92 status is now inactive due to auto deactivation (removal of underperforming models)
mistralai-mistral-nemo-_9330_v92 status is now torndown due to DeploymentManager action