submission_id: rirv938-mistral-22b-14k-_4333_v1
developer_uid: robert_irvine
best_of: 8
celo_rating: 1232.16
display_name: rirv938-mistral-22b-14k-_4333_v1
family_friendly_score: 0.0
formatter: {'memory_template': '', 'prompt_template': '', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.9, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '</s>', '###', 'Bot:', 'User:', 'You:', '<|im_end|>'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
gpu_counts: {'NVIDIA RTX A6000': 1}
is_internal_developer: True
language_model: rirv938/mistral_22b_14k_virgo_edit_step_436
latencies: [{'batch_size': 1, 'throughput': 0.38009489698982063, 'latency_mean': 2.6308402597904204, 'latency_p50': 2.632531762123108, 'latency_p90': 2.9290745496749877}, {'batch_size': 2, 'throughput': 0.5964658819412691, 'latency_mean': 3.344731991291046, 'latency_p50': 3.357685685157776, 'latency_p90': 3.6741934537887575}, {'batch_size': 3, 'throughput': 0.7538270608910224, 'latency_mean': 3.9659270632266996, 'latency_p50': 3.949905514717102, 'latency_p90': 4.415823674201965}, {'batch_size': 4, 'throughput': 0.8722682672878436, 'latency_mean': 4.564331713914871, 'latency_p50': 4.546615242958069, 'latency_p90': 5.063575577735901}, {'batch_size': 5, 'throughput': 0.9687211961516706, 'latency_mean': 5.12864028096199, 'latency_p50': 5.0925257205963135, 'latency_p90': 5.67385630607605}]
max_input_tokens: 1024
max_output_tokens: 64
model_architecture: MistralForCausalLM
model_group: rirv938/mistral_22b_14k_
model_name: rirv938-mistral-22b-14k-_4333_v1
model_num_parameters: 22247282688.0
model_repo: rirv938/mistral_22b_14k_virgo_edit_step_436
model_size: 22B
num_battles: 12143
num_wins: 5977
ranking_group: single
status: torndown
submission_type: basic
throughput_3p7s: 0.7
timestamp: 2024-09-23T20:13:27+00:00
us_pacific_date: 2024-09-23
win_ratio: 0.4922177386148398
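The formatter field above defines how a conversation is flattened into a single prompt string: empty memory and prompt templates, one line per turn via user_template/bot_template, and a trailing response_template that leaves the bot's next line open for the model to complete (the stopping_words list, which includes '\n', 'Bot:', 'User:' and 'You:', then cuts generation at the next turn marker). The sketch below is an illustrative reconstruction under those assumptions; build_prompt and the names "Bot"/"You" are hypothetical and only the template strings come from the formatter field.

```python
# Illustrative reconstruction of how the formatter templates could be applied
# to a chat history. build_prompt and the names "Bot"/"You" are hypothetical;
# only the template strings come from the formatter field above.
def build_prompt(history, bot_name="Bot", user_name="You"):
    bot_template = "{bot_name}: {message}\n"
    user_template = "{user_name}: {message}\n"
    response_template = "{bot_name}:"

    parts = []
    for speaker, message in history:  # history: list of ("user" | "bot", text)
        template = user_template if speaker == "user" else bot_template
        parts.append(template.format(bot_name=bot_name, user_name=user_name,
                                     message=message))
    # Leave the bot's next turn open for the model to complete.
    parts.append(response_template.format(bot_name=bot_name))
    return "".join(parts)

print(build_prompt([("user", "Hi there!"), ("bot", "Hello!"), ("user", "How are you?")]))
# You: Hi there!
# Bot: Hello!
# You: How are you?
# Bot:
```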
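The generation_params field describes the sampling configuration: temperature 0.9, top_k 80, min_p 0.05, top_p 1.0 (effectively disabled), no repetition penalties, and best_of 8, which suggests eight candidate completions are drawn per request and one is selected, though the selection mechanism is not shown in this log. Below is a minimal sketch of what the filtering parameters mean for a single next-token distribution, assuming the conventional definitions of top_k, min_p and top_p; the serving engine's actual sampler is not visible here.

```python
import numpy as np

# Illustrative sketch of the generation_params filters (temperature, top_k,
# min_p, top_p) applied to one next-token distribution. This is an assumed,
# conventional implementation, not the serving engine's actual sampler.
def sample_next_token(logits, temperature=0.9, top_p=1.0, min_p=0.05, top_k=80, rng=None):
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / temperature

    # top_k: keep only the k highest-scoring tokens.
    if 0 < top_k < logits.size:
        kth_best = np.sort(logits)[-top_k]
        logits = np.where(logits < kth_best, -np.inf, logits)

    probs = np.exp(logits - logits.max())
    probs /= probs.sum()

    # min_p: drop tokens whose probability is below min_p * (highest probability).
    probs = np.where(probs < min_p * probs.max(), 0.0, probs)
    probs /= probs.sum()

    # top_p (nucleus): keep the smallest high-probability prefix whose cumulative
    # mass reaches top_p; with top_p = 1.0 this is a no-op.
    order = np.argsort(probs)[::-1]
    cumulative = np.cumsum(probs[order])
    keep = (cumulative - probs[order]) < top_p
    mask = np.zeros(probs.size, dtype=bool)
    mask[order[keep]] = True
    probs = np.where(mask, probs, 0.0)
    probs /= probs.sum()

    return int(rng.choice(probs.size, p=probs))

# Example: sample from a toy 10-token vocabulary.
print(sample_next_token(np.random.default_rng(0).normal(size=10)))
```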
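Two derived fields can be cross-checked against the raw metadata: win_ratio is num_wins / num_battles, and throughput_3p7s is consistent with the batch-sweep throughput interpolated at a mean latency of 3.7 s. The latter interpretation is an assumption, not documented behaviour; a hedged sketch:

```python
import numpy as np

# Hedged cross-check of two derived fields against the raw metadata above.
# Treating throughput_3p7s as "throughput interpolated at a 3.7 s mean
# latency" is an assumption; the field is not documented in this log.
num_wins, num_battles = 5977, 12143
print(num_wins / num_battles)  # ~0.4922, matches win_ratio

# (latency_mean, throughput) pairs from the latencies field, batch sizes 1-5.
points = [
    (2.6308402597904204, 0.38009489698982063),
    (3.344731991291046, 0.5964658819412691),
    (3.9659270632266996, 0.7538270608910224),
    (4.564331713914871, 0.8722682672878436),
    (5.12864028096199, 0.9687211961516706),
]
means, throughputs = zip(*points)
print(round(np.interp(3.7, means, throughputs), 2))  # ~0.69, consistent with throughput_3p7s = 0.7
```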
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name rirv938-mistral-22b-14k-4333-v1-mkmlizer
Waiting for job on rirv938-mistral-22b-14k-4333-v1-mkmlizer to finish
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ║ _____ __ __ ║
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ║ /___/ ║
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ║ ║
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ║ Version: 0.10.1 ║
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ║ https://mk1.ai ║
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ║ ║
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ║ The license key for the current software has been verified as ║
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ║ belonging to: ║
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ║ ║
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ║ Chai Research Corp. ║
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ║ ║
rirv938-mistral-22b-14k-4333-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
Connection pool is full, discarding connection: %s. Connection pool size: %s
rirv938-mistral-22b-14k-4333-v1-mkmlizer: Downloaded to shared memory in 175.052s
rirv938-mistral-22b-14k-4333-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpnfkksbhj, device:0
rirv938-mistral-22b-14k-4333-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
rirv938-mistral-22b-14k-4333-v1-mkmlizer: quantized model in 50.020s
rirv938-mistral-22b-14k-4333-v1-mkmlizer: Processed model rirv938/mistral_22b_14k_virgo_edit_step_436 in 225.072s
rirv938-mistral-22b-14k-4333-v1-mkmlizer: creating bucket guanaco-mkml-models
rirv938-mistral-22b-14k-4333-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
rirv938-mistral-22b-14k-4333-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/rirv938-mistral-22b-14k-4333-v1
rirv938-mistral-22b-14k-4333-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/rirv938-mistral-22b-14k-4333-v1/config.json
rirv938-mistral-22b-14k-4333-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/rirv938-mistral-22b-14k-4333-v1/special_tokens_map.json
rirv938-mistral-22b-14k-4333-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/rirv938-mistral-22b-14k-4333-v1/tokenizer_config.json
rirv938-mistral-22b-14k-4333-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/rirv938-mistral-22b-14k-4333-v1/tokenizer.json
Connection pool is full, discarding connection: %s. Connection pool size: %s
rirv938-mistral-22b-14k-4333-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/rirv938-mistral-22b-14k-4333-v1/flywheel_model.1.safetensors
rirv938-mistral-22b-14k-4333-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/rirv938-mistral-22b-14k-4333-v1/flywheel_model.0.safetensors
rirv938-mistral-22b-14k-4333-v1-mkmlizer: Loading 0: 0%| | 0/507 [00:00<?, ?it/s] ... Loading 0: 100%|█████████▉| 506/507 [00:38<00:00, 17.52it/s]
Job rirv938-mistral-22b-14k-4333-v1-mkmlizer completed after 248.21s with status: succeeded
Stopping job with name rirv938-mistral-22b-14k-4333-v1-mkmlizer
Pipeline stage MKMLizer completed in 249.09s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.30s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service rirv938-mistral-22b-14k-4333-v1
Waiting for inference service rirv938-mistral-22b-14k-4333-v1 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s
Inference service rirv938-mistral-22b-14k-4333-v1 ready after 200.95556902885437s
Pipeline stage MKMLDeployer completed in 201.34s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.8051948547363281s
Received healthy response to inference request in 2.4151463508605957s
Received healthy response to inference request in 3.0356950759887695s
Received healthy response to inference request in 2.5146172046661377s
Received healthy response to inference request in 1.8855516910552979s
5 requests
0 failed requests
5th percentile: 1.8212662220001221
10th percentile: 1.837337589263916
20th percentile: 1.8694803237915039
30th percentile: 1.9914706230163575
40th percentile: 2.2033084869384765
50th percentile: 2.4151463508605957
60th percentile: 2.4549346923828126
70th percentile: 2.4947230339050295
80th percentile: 2.6188327789306642
90th percentile: 2.827263927459717
95th percentile: 2.931479501724243
99th percentile: 3.0148519611358644
mean time: 2.3312410354614257
Pipeline stage StressChecker completed in 13.17s
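The StressChecker statistics above are consistent with linearly interpolated percentiles over the five healthy response times. The sketch below reproduces them with numpy.percentile as an illustration; it is a reconstruction, not the checker's actual code.

```python
import numpy as np

# Reconstruction of the StressChecker statistics from the five healthy
# response times; numpy's default linear-interpolation percentiles reproduce
# the figures above. Illustrative only, not the checker's actual code.
times = [
    1.8051948547363281,
    2.4151463508605957,
    3.0356950759887695,
    2.5146172046661377,
    1.8855516910552979,
]

for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {np.percentile(times, q)}")
print(f"mean time: {np.mean(times)}")
```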
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 6.47s
Shutdown handler de-registered
rirv938-mistral-22b-14k-_4333_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.14s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.13s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service rirv938-mistral-22b-14k-4333-v1-profiler
Waiting for inference service rirv938-mistral-22b-14k-4333-v1-profiler to be ready
Inference service rirv938-mistral-22b-14k-4333-v1-profiler ready after 200.47300219535828s
Pipeline stage MKMLProfilerDeployer completed in 200.87s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/rirv938-mistral-22b-0dcab6ad7fa1c3c6763ac7d2f65e74da-deplop8rt7:/code/chaiverse_profiler_1727123142 --namespace tenant-chaiml-guanaco
kubectl exec -it rirv938-mistral-22b-0dcab6ad7fa1c3c6763ac7d2f65e74da-deplop8rt7 --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1727123142 && python profiles.py profile --best_of_n 8 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 1024 --output_tokens 64 --summary /code/chaiverse_profiler_1727123142/summary.json'
kubectl exec -it rirv938-mistral-22b-0dcab6ad7fa1c3c6763ac7d2f65e74da-deplop8rt7 --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1727123142/summary.json'
Pipeline stage MKMLProfilerRunner completed in 1568.28s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service rirv938-mistral-22b-14k-4333-v1-profiler is running
Tearing down inference service rirv938-mistral-22b-14k-4333-v1-profiler
Service rirv938-mistral-22b-14k-4333-v1-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 2.28s
Shutdown handler de-registered
rirv938-mistral-22b-14k-_4333_v1 status is now inactive due to auto deactivation of underperforming models
rirv938-mistral-22b-14k-_4333_v1 status is now torndown due to DeploymentManager action