submission_id: rirv938-llama-8b-mixtral_8633_v3
developer_uid: robert_irvine
best_of: 1
celo_rating: 1263.3
display_name: rirv938-llama-8b-mixtral_8633_v3
family_friendly_score: 0.0
formatter: {'memory_template': '', 'prompt_template': '', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '', 'truncate_by_message': False}
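The formatter above leaves the memory, prompt, and response templates empty, so only the per-turn wrappers matter. A minimal sketch of how such turn templates are conventionally applied (the speaker names and messages below are illustrative, not taken from real traffic):

```python
# Minimal sketch: applying the submission's user/bot turn templates.
# The memory/prompt/response templates are empty strings, so each turn
# is just a speaker prefix plus the message and a newline.
bot_template = "{bot_name}: {message}\n"
user_template = "{user_name}: {message}\n"

def render_turn(template: str, **fields) -> str:
    """Fill one turn template with its speaker name and message."""
    return template.format(**fields)

conversation = (
    render_turn(user_template, user_name="User", message="Hi there!")
    + render_turn(bot_template, bot_name="Bot", message="Hello!")
)
# conversation == "User: Hi there!\nBot: Hello!\n"
```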
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 256, 'best_of': 1, 'max_output_tokens': 1}
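These generation parameters follow the standard temperature / top-k / top-p scheme (note that with max_output_tokens = 1 and a sequence-classification head, sampling is largely vestigial here). A hedged sketch of how such settings are conventionally applied to next-token logits; the function name and toy logits are illustrative, not part of the actual serving stack:

```python
import math

# Illustrative sketch of temperature scaling, top-k truncation, and
# nucleus (top-p) truncation over a token->logit mapping.
def filter_and_normalize(logits, temperature=1.0, top_k=40, top_p=1.0):
    """Return a token->probability dict after applying the three filters."""
    scaled = {tok: l / temperature for tok, l in logits.items()}
    # top-k: keep only the k highest-scoring tokens
    kept = sorted(scaled.items(), key=lambda kv: kv[1], reverse=True)[:top_k]
    # numerically stable softmax over the kept tokens
    m = max(v for _, v in kept)
    exps = {tok: math.exp(v - m) for tok, v in kept}
    z = sum(exps.values())
    probs = {tok: e / z for tok, e in exps.items()}
    # top-p: keep the smallest prefix whose cumulative mass reaches top_p
    out, cum = {}, 0.0
    for tok, p in sorted(probs.items(), key=lambda kv: kv[1], reverse=True):
        out[tok] = p
        cum += p
        if cum >= top_p:
            break
    z2 = sum(out.values())  # renormalize after truncation
    return {tok: p / z2 for tok, p in out.items()}
```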
gpu_counts: {'NVIDIA RTX A5000': 1}
ineligible_reason: max_output_tokens!=64
is_internal_developer: True
language_model: rirv938/llama_8b_mixtral_vs_mixtral_250k_388
latencies: [{'batch_size': 1, 'throughput': 0.10653750596671728, 'latency_mean': 9.386307377815246, 'latency_p50': 9.378269791603088, 'latency_p90': 9.507824683189392}]
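The reported throughput is, to a good approximation, batch_size / latency_mean; a quick sanity check (the tiny residual likely reflects how the mean was aggregated across profiling samples):

```python
# Sanity check: throughput ~= batch_size / latency_mean for the
# batch_size=1 latency profile reported above.
batch_size = 1
latency_mean = 9.386307377815246           # seconds per request
reported_throughput = 0.10653750596671728  # requests per second

estimated = batch_size / latency_mean
assert abs(estimated - reported_throughput) < 1e-4
```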
max_input_tokens: 256
max_output_tokens: 1
model_architecture: LlamaForSequenceClassification
model_group: rirv938/llama_8b_mixtral
model_name: rirv938-llama-8b-mixtral_8633_v3
model_num_parameters: 8030261248.0
model_repo: rirv938/llama_8b_mixtral_vs_mixtral_250k_388
model_size: 8B
num_battles: 11062
num_wins: 5724
ranking_group: single
status: torndown
submission_type: basic
timestamp: 2024-09-14T23:42:42+00:00
us_pacific_date: 2024-09-14
win_ratio: 0.517447116253842
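The win_ratio field is simply num_wins / num_battles, and the figures above check out exactly:

```python
# Sanity check: win_ratio is num_wins / num_battles.
num_wins = 5724
num_battles = 11062

win_ratio = num_wins / num_battles
assert abs(win_ratio - 0.517447116253842) < 1e-12
```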
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name rirv938-llama-8b-mixtral-8633-v3-mkmlizer
Waiting for job on rirv938-llama-8b-mixtral-8633-v3-mkmlizer to finish
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ║ _____ __ __ ║
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ║ /___/ ║
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ║ ║
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ║ Version: 0.10.1 ║
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ║ https://mk1.ai ║
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ║ ║
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ║ The license key for the current software has been verified as ║
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ║ belonging to: ║
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ║ ║
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ║ Chai Research Corp. ║
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ║ ║
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
Failed to get response for submission blend_hokok_2024-09-09: ('http://neversleep-noromaid-v0-8068-v150-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '')
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: Downloaded to shared memory in 19.413s
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: quantizing model to /dev/shm/model_cache, profile:t0, folder:/tmp/tmpk267oda7, device:0
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Stopping job with name rirv938-llama-8b-mixtral-8633-v3-mkmlizer
%s, retrying in %s seconds...
Stopping job with name rirv938-llama-8b-mixtral-8633-v3-mkmlizer
%s, retrying in %s seconds...
Stopping job with name rirv938-llama-8b-mixtral-8633-v3-mkmlizer
%s, retrying in %s seconds...
Starting job with name rirv938-llama-8b-mixtral-8633-v3-mkmlizer
Waiting for job on rirv938-llama-8b-mixtral-8633-v3-mkmlizer to finish
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: Downloaded to shared memory in 19.762s
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: quantizing model to /dev/shm/model_cache, profile:t0, folder:/tmp/tmp6s_rxccz, device:0
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Connection pool is full, discarding connection: %s. Connection pool size: %s
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: quantized model in 84.107s
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: Processed model rirv938/llama_8b_mixtral_vs_mixtral_250k_388 in 103.869s
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: creating bucket guanaco-mkml-models
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/rirv938-llama-8b-mixtral-8633-v3
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/rirv938-llama-8b-mixtral-8633-v3/special_tokens_map.json
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/rirv938-llama-8b-mixtral-8633-v3/config.json
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/rirv938-llama-8b-mixtral-8633-v3/tokenizer_config.json
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/rirv938-llama-8b-mixtral-8633-v3/tokenizer.json
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/rirv938-llama-8b-mixtral-8633-v3/flywheel_model.0.safetensors
rirv938-llama-8b-mixtral-8633-v3-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s] ... 99%|█████████▉| 288/291 [01:12<00:00, 3.38it/s] (intermediate progress-bar frames condensed)
Job rirv938-llama-8b-mixtral-8633-v3-mkmlizer completed after 128.99s with status: succeeded
Stopping job with name rirv938-llama-8b-mixtral-8633-v3-mkmlizer
Pipeline stage MKMLizer completed in 203.96s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 2.72s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service rirv938-llama-8b-mixtral-8633-v3
Waiting for inference service rirv938-llama-8b-mixtral-8633-v3 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s
Failed to get response for submission blend_hokok_2024-09-09: ('http://neversleep-noromaid-v0-8068-v150-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '')
Inference service rirv938-llama-8b-mixtral-8633-v3 ready after 186.20277571678162s
Pipeline stage MKMLDeployer completed in 186.76s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 4.133759260177612s
Received healthy response to inference request in 4.871780157089233s
Received healthy response to inference request in 5.45391321182251s
Received healthy response to inference request in 5.975692987442017s
Received healthy response to inference request in 4.2572572231292725s
5 requests
0 failed requests
5th percentile: 4.158458852767945
10th percentile: 4.183158445358276
20th percentile: 4.23255763053894
30th percentile: 4.3801618099212645
40th percentile: 4.625970983505249
50th percentile: 4.871780157089233
60th percentile: 5.104633378982544
70th percentile: 5.337486600875854
80th percentile: 5.5582691669464115
90th percentile: 5.766981077194214
95th percentile: 5.871337032318115
99th percentile: 5.954821796417236
mean time: 4.938480567932129
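The StressChecker percentiles above are consistent with linear interpolation between order statistics (the numpy-style default method), reproduced here from the five healthy-response latencies:

```python
# Reproduce the reported percentiles and mean from the five
# healthy-response latencies using linear interpolation between
# order statistics.
def percentile(samples, p):
    """p-th percentile with linear interpolation (numpy's default)."""
    s = sorted(samples)
    rank = (p / 100) * (len(s) - 1)
    lo = int(rank)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (rank - lo) * (s[hi] - s[lo])

latencies = [
    4.133759260177612,
    4.871780157089233,
    5.45391321182251,
    5.975692987442017,
    4.2572572231292725,
]

assert abs(percentile(latencies, 5) - 4.158458852767945) < 1e-9    # 5th
assert abs(percentile(latencies, 50) - 4.871780157089233) < 1e-12  # median
assert abs(percentile(latencies, 90) - 5.766981077194214) < 1e-9   # 90th
assert abs(sum(latencies) / len(latencies) - 4.938480567932129) < 1e-9  # mean
```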
%s, retrying in %s seconds...
Received healthy response to inference request in 3.863126277923584s
Received healthy response to inference request in 4.405805349349976s
Received healthy response to inference request in 2.691866159439087s
Received healthy response to inference request in 2.8128740787506104s
Received healthy response to inference request in 3.104231357574463s
5 requests
0 failed requests
5th percentile: 2.716067743301392
10th percentile: 2.740269327163696
20th percentile: 2.7886724948883055
30th percentile: 2.8711455345153807
40th percentile: 2.987688446044922
50th percentile: 3.104231357574463
60th percentile: 3.4077893257141114
70th percentile: 3.7113472938537595
80th percentile: 3.9716620922088626
90th percentile: 4.188733720779419
95th percentile: 4.297269535064697
99th percentile: 4.38409818649292
mean time: 3.375580644607544
Pipeline stage StressChecker completed in 44.04s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Pipeline stage TriggerMKMLProfilingPipeline completed in 10.54s
Shutdown handler de-registered
rirv938-llama-8b-mixtral_8633_v3 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.15s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.13s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service rirv938-llama-8b-mixtral-8633-v3-profiler
Waiting for inference service rirv938-llama-8b-mixtral-8633-v3-profiler to be ready
Inference service rirv938-llama-8b-mixtral-8633-v3-profiler ready after 170.39417839050293s
Pipeline stage MKMLProfilerDeployer completed in 170.81s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/rirv938-llama-8b-mixbc047523a2abef6cde9cd935eb00d6ae-deplo8tfrp:/code/chaiverse_profiler_1726358054 --namespace tenant-chaiml-guanaco
kubectl exec -it rirv938-llama-8b-mixbc047523a2abef6cde9cd935eb00d6ae-deplo8tfrp --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1726358054 && python profiles.py profile --best_of_n 1 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 256 --output_tokens 1 --summary /code/chaiverse_profiler_1726358054/summary.json'
kubectl exec -it rirv938-llama-8b-mixbc047523a2abef6cde9cd935eb00d6ae-deplo8tfrp --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1726358054/summary.json'
Pipeline stage MKMLProfilerRunner completed in 1887.15s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service rirv938-llama-8b-mixtral-8633-v3-profiler is running
Tearing down inference service rirv938-llama-8b-mixtral-8633-v3-profiler
Service rirv938-llama-8b-mixtral-8633-v3-profiler has been torn down
Pipeline stage MKMLProfilerDeleter completed in 2.16s
Shutdown handler de-registered
rirv938-llama-8b-mixtral_8633_v3 status is now inactive due to auto deactivation (removal of underperforming models)
rirv938-llama-8b-mixtral_8633_v3 status is now torndown due to DeploymentManager action