developer_uid: rirv938
submission_id: rirv938-llama-8b-multihe_9554_v4
model_name: rirv938-llama-8b-multihe_9554_v4
model_group: rirv938/llama_8b_multihe
status: inactive
timestamp: 2024-11-28T20:42:49+00:00
num_battles: 29712
num_wins: 15035
celo_rating: 1264.68
family_friendly_score: 0.588
family_friendly_standard_error: 0.00696068962100739
submission_type: basic
model_repo: rirv938/llama_8b_multihead_204m_preference
model_architecture: LlamaForSequenceClassification
model_num_parameters: 8030261248
best_of: 1
max_input_tokens: 256
max_output_tokens: 1
display_name: rirv938-llama-8b-multihe_9554_v4
ineligible_reason: max_output_tokens!=64
is_internal_developer: True
language_model: rirv938/llama_8b_multihead_204m_preference
model_size: 8B
ranking_group: single
us_pacific_date: 2024-11-28
win_ratio: 0.5060245018847603
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 256, 'best_of': 1, 'max_output_tokens': 1}
formatter: {'memory_template': '', 'prompt_template': '', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '', 'truncate_by_message': False}
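As a quick sanity check on the metadata above, win_ratio is simply num_wins / num_battles, and the formatter entry describes how each chat turn is rendered before truncation to max_input_tokens. A minimal sketch, with the field values copied from the metadata and an illustrative helper that is not the production formatter:

```python
# Illustrative sketch only: reproduces win_ratio from the metadata and shows how the
# bot/user templates above could be applied to a short conversation.
num_battles, num_wins = 29712, 15035
win_ratio = num_wins / num_battles
print(win_ratio)  # ~0.50602, matching the win_ratio field above

bot_template = "{bot_name}: {message}\n"
user_template = "{user_name}: {message}\n"

def render_turn(is_bot, name, message):
    # Pick the template for the speaker and fill in its placeholders;
    # unused keyword arguments are ignored by str.format.
    template = bot_template if is_bot else user_template
    return template.format(bot_name=name, user_name=name, message=message)

prompt = render_turn(False, "User", "hi there") + render_turn(True, "Bot", "hello!")
print(prompt)
```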
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name rirv938-llama-8b-multihe-9554-v4-mkmlizer
Waiting for job on rirv938-llama-8b-multihe-9554-v4-mkmlizer to finish
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ║ _____ __ __ ║
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ║ /___/ ║
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ║ ║
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ║ Version: 0.11.12 ║
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ║ https://mk1.ai ║
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ║ ║
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ║ The license key for the current software has been verified as ║
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ║ belonging to: ║
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ║ ║
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ║ Chai Research Corp. ║
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ║ Expiration: 2025-01-15 23:59:59 ║
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ║ ║
rirv938-llama-8b-multihe-9554-v4-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
rirv938-llama-8b-multihe-9554-v4-mkmlizer: Downloaded to shared memory in 44.334s
rirv938-llama-8b-multihe-9554-v4-mkmlizer: quantizing model to /dev/shm/model_cache, profile:t0, folder:/tmp/tmpjxu5zprl, device:0
rirv938-llama-8b-multihe-9554-v4-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
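The repeated "Connection pool is full, discarding connection" messages are urllib3 warnings emitted when more concurrent transfers are in flight than the HTTP connection pool allows; they are harmless but noisy. A hedged sketch of one common mitigation, enlarging the pool behind a requests session (the pool sizes below are illustrative, not what the uploader actually uses):

```python
# Sketch: raise the urllib3 connection pool size on a requests Session so that
# highly parallel requests stop discarding connections. Sizes are illustrative.
import requests
from requests.adapters import HTTPAdapter

session = requests.Session()
adapter = HTTPAdapter(pool_connections=32, pool_maxsize=32)  # default is 10
session.mount("https://", adapter)
session.mount("http://", adapter)
```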
rirv938-llama-8b-multihe-9554-v4-mkmlizer: quantized model in 88.931s
rirv938-llama-8b-multihe-9554-v4-mkmlizer: Processed model rirv938/llama_8b_multihead_204m_preference in 133.265s
rirv938-llama-8b-multihe-9554-v4-mkmlizer: creating bucket guanaco-mkml-models
rirv938-llama-8b-multihe-9554-v4-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
rirv938-llama-8b-multihe-9554-v4-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/rirv938-llama-8b-multihe-9554-v4
rirv938-llama-8b-multihe-9554-v4-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/rirv938-llama-8b-multihe-9554-v4/config.json
rirv938-llama-8b-multihe-9554-v4-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/rirv938-llama-8b-multihe-9554-v4/tokenizer_config.json
rirv938-llama-8b-multihe-9554-v4-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/rirv938-llama-8b-multihe-9554-v4/special_tokens_map.json
rirv938-llama-8b-multihe-9554-v4-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/rirv938-llama-8b-multihe-9554-v4/tokenizer.json
rirv938-llama-8b-multihe-9554-v4-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/rirv938-llama-8b-multihe-9554-v4/flywheel_model.0.safetensors
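The cp lines above mirror the quantized model cache into the guanaco-mkml-models bucket file by file. A minimal boto3 sketch of the same copy pattern, assuming the bucket and prefix shown in the log (the pipeline's actual uploader is not shown here):

```python
# Sketch only: upload each file from the local model cache to the S3 prefix above.
import os
import boto3

s3 = boto3.client("s3")
local_dir = "/dev/shm/model_cache"
bucket = "guanaco-mkml-models"
prefix = "rirv938-llama-8b-multihe-9554-v4"

for name in os.listdir(local_dir):
    path = os.path.join(local_dir, name)
    if os.path.isfile(path):
        s3.upload_file(path, bucket, f"{prefix}/{name}")
```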
rirv938-llama-8b-multihe-9554-v4-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s] ... Loading 0: 99%|█████████▉| 288/291 [01:16<00:01, 2.53it/s] (progress-bar output for the 291 flywheel model shards, truncated)
Job rirv938-llama-8b-multihe-9554-v4-mkmlizer completed after 166.07s with status: succeeded
Stopping job with name rirv938-llama-8b-multihe-9554-v4-mkmlizer
Pipeline stage MKMLizer completed in 167.18s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.18s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service rirv938-llama-8b-multihe-9554-v4
Waiting for inference service rirv938-llama-8b-multihe-9554-v4 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Inference service rirv938-llama-8b-multihe-9554-v4 ready after 130.5803689956665s
Pipeline stage MKMLDeployer completed in 131.13s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 5.206478595733643s
Received healthy response to inference request in 4.844113111495972s
Received healthy response to inference request in 2.8236026763916016s
Received healthy response to inference request in 2.2667665481567383s
Received healthy response to inference request in 5.072033405303955s
5 requests
0 failed requests
5th percentile: 2.378133773803711
10th percentile: 2.4895009994506836
20th percentile: 2.712235450744629
30th percentile: 3.2277047634124756
40th percentile: 4.035908937454224
50th percentile: 4.844113111495972
60th percentile: 4.935281229019165
70th percentile: 5.026449346542359
80th percentile: 5.098922443389893
90th percentile: 5.152700519561767
95th percentile: 5.179589557647705
99th percentile: 5.201100788116455
mean time: 4.042598867416382
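The percentile summary above follows from plain linear interpolation over the five request latencies; a short numpy check that reproduces it (latency values copied from the log lines above):

```python
# Sketch: reproduce the StressChecker summary for the first batch of five requests
# using numpy's default linear percentile interpolation.
import numpy as np

latencies = [5.206478595733643, 4.844113111495972, 2.8236026763916016,
             2.2667665481567383, 5.072033405303955]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print("mean time:", np.mean(latencies))  # ~4.0426 s, matching the log
```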
%s, retrying in %s seconds...
Received healthy response to inference request in 5.524217128753662s
Received healthy response to inference request in 3.162252187728882s
Received healthy response to inference request in 4.226128578186035s
Received healthy response to inference request in 3.339550018310547s
Received healthy response to inference request in 2.225590705871582s
5 requests
0 failed requests
5th percentile: 2.412923002243042
10th percentile: 2.6002552986145018
20th percentile: 2.974919891357422
30th percentile: 3.1977117538452147
40th percentile: 3.2686308860778808
50th percentile: 3.339550018310547
60th percentile: 3.694181442260742
70th percentile: 4.0488128662109375
80th percentile: 4.485746288299561
90th percentile: 5.0049817085266115
95th percentile: 5.264599418640136
99th percentile: 5.472293586730957
mean time: 3.6955477237701415
%s, retrying in %s seconds...
Received healthy response to inference request in 3.3954780101776123s
Received healthy response to inference request in 3.622514009475708s
Received healthy response to inference request in 4.660444021224976s
Received healthy response to inference request in 3.5513248443603516s
Received healthy response to inference request in 2.767172336578369s
5 requests
0 failed requests
5th percentile: 2.8928334712982178
10th percentile: 3.0184946060180664
20th percentile: 3.2698168754577637
30th percentile: 3.4266473770141603
40th percentile: 3.488986110687256
50th percentile: 3.5513248443603516
60th percentile: 3.579800510406494
70th percentile: 3.6082761764526365
80th percentile: 3.8301000118255617
90th percentile: 4.245272016525268
95th percentile: 4.452858018875122
99th percentile: 4.6189268207550045
mean time: 3.5993866443634035
Pipeline stage StressChecker completed in 60.86s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 2.15s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 2.10s
Shutdown handler de-registered
rirv938-llama-8b-multihe_9554_v4 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.15s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.11s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service rirv938-llama-8b-multihe-9554-v4-profiler
Waiting for inference service rirv938-llama-8b-multihe-9554-v4-profiler to be ready
Inference service rirv938-llama-8b-multihe-9554-v4-profiler ready after 140.3129620552063s
Pipeline stage MKMLProfilerDeployer completed in 140.67s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/rirv938-llama-8b-mul6a96d23f56a5d2a925ebbec3a80992c8-deploj74xf:/code/chaiverse_profiler_1732827112 --namespace tenant-chaiml-guanaco
kubectl exec -it rirv938-llama-8b-mul6a96d23f56a5d2a925ebbec3a80992c8-deploj74xf --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1732827112 && python profiles.py profile --best_of_n 1 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 256 --output_tokens 1 --summary /code/chaiverse_profiler_1732827112/summary.json'
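The profiling command sweeps batch sizes 1 and then 5 through 195 in steps of 5, with 256 input tokens and a single output token per sample. A small sketch of how that command line could be assembled; the profiles.py flags are quoted from the log, while the wrapper itself is illustrative:

```python
# Sketch: build the batch sweep and the shell command seen in the log above.
batches = [1] + list(range(5, 200, 5))           # 1, 5, 10, ..., 195
workdir = "/code/chaiverse_profiler_1732827112"  # timestamped dir from the log

cmd = (
    f"cd {workdir} && python profiles.py profile "
    f"--best_of_n 1 --auto_batch 5 "
    f"--batches {','.join(map(str, batches))} "
    f"--samples 200 --input_tokens 256 --output_tokens 1 "
    f"--summary {workdir}/summary.json"
)
print(cmd)
```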
%s, retrying in %s seconds...
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/rirv938-llama-8b-mul6a96d23f56a5d2a925ebbec3a80992c8-deploj74xf:/code/chaiverse_profiler_1732829879 --namespace tenant-chaiml-guanaco
%s, retrying in %s seconds...
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/rirv938-llama-8b-mul6a96d23f56a5d2a925ebbec3a80992c8-deploj74xf:/code/chaiverse_profiler_1732829880 --namespace tenant-chaiml-guanaco
kubectl exec -it rirv938-llama-8b-mul6a96d23f56a5d2a925ebbec3a80992c8-deploj74xf --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1732829880 && python profiles.py profile --best_of_n 1 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 256 --output_tokens 1 --summary /code/chaiverse_profiler_1732829880/summary.json'
Received signal 2, running shutdown handler
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service rirv938-llama-8b-multihe-9554-v4-profiler is running
Tearing down inference service rirv938-llama-8b-multihe-9554-v4-profiler
Service rirv938-llama-8b-multihe-9554-v4-profiler has been torn down
Pipeline stage MKMLProfilerDeleter completed in 1.40s
Shutdown handler de-registered
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service rirv938-llama-8b-multihe-9554-v4-profiler is running
Skipping teardown as no inference service was found
Pipeline stage MKMLProfilerDeleter completed in 1.51s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.15s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service rirv938-llama-8b-multihe-9554-v4-profiler
Waiting for inference service rirv938-llama-8b-multihe-9554-v4-profiler to be ready
Inference service rirv938-llama-8b-multihe-9554-v4-profiler ready after 40.1140251159668s
Pipeline stage MKMLProfilerDeployer completed in 40.46s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/rirv938-llama-8b-mul6a96d23f56a5d2a925ebbec3a80992c8-deploq6xwp:/code/chaiverse_profiler_1732830642 --namespace tenant-chaiml-guanaco
kubectl exec -it rirv938-llama-8b-mul6a96d23f56a5d2a925ebbec3a80992c8-deploq6xwp --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1732830642 && python profiles.py profile --best_of_n 1 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 256 --output_tokens 1 --summary /code/chaiverse_profiler_1732830642/summary.json'
%s, retrying in %s seconds...
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/rirv938-llama-8b-mul6a96d23f56a5d2a925ebbec3a80992c8-deploq6xwp:/code/chaiverse_profiler_1732833411 --namespace tenant-chaiml-guanaco
%s, retrying in %s seconds...
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/rirv938-llama-8b-mul6a96d23f56a5d2a925ebbec3a80992c8-deploq6xwp:/code/chaiverse_profiler_1732833411 --namespace tenant-chaiml-guanaco
kubectl exec -it rirv938-llama-8b-mul6a96d23f56a5d2a925ebbec3a80992c8-deploq6xwp --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1732833411 && python profiles.py profile --best_of_n 1 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 256 --output_tokens 1 --summary /code/chaiverse_profiler_1732833411/summary.json'
Received signal 2, running shutdown handler
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service rirv938-llama-8b-multihe-9554-v4-profiler is running
Tearing down inference service rirv938-llama-8b-multihe-9554-v4-profiler
Service rirv938-llama-8b-multihe-9554-v4-profiler has been torn down
Pipeline stage MKMLProfilerDeleter completed in 1.66s
Shutdown handler de-registered
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service rirv938-llama-8b-multihe-9554-v4-profiler is running
Skipping teardown as no inference service was found
Pipeline stage MKMLProfilerDeleter completed in 1.54s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.16s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service rirv938-llama-8b-multihe-9554-v4-profiler
Waiting for inference service rirv938-llama-8b-multihe-9554-v4-profiler to be ready
Inference service rirv938-llama-8b-multihe-9554-v4-profiler ready after 100.23359084129333s
Pipeline stage MKMLProfilerDeployer completed in 100.61s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/rirv938-llama-8b-mul6a96d23f56a5d2a925ebbec3a80992c8-deplottwvc:/code/chaiverse_profiler_1732834328 --namespace tenant-chaiml-guanaco
kubectl exec -it rirv938-llama-8b-mul6a96d23f56a5d2a925ebbec3a80992c8-deplottwvc --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1732834328 && python profiles.py profile --best_of_n 1 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 256 --output_tokens 1 --summary /code/chaiverse_profiler_1732834328/summary.json'
%s, retrying in %s seconds...
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/rirv938-llama-8b-mul6a96d23f56a5d2a925ebbec3a80992c8-deplottwvc:/code/chaiverse_profiler_1732837115 --namespace tenant-chaiml-guanaco
kubectl exec -it rirv938-llama-8b-mul6a96d23f56a5d2a925ebbec3a80992c8-deplottwvc --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1732837115 && python profiles.py profile --best_of_n 1 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 256 --output_tokens 1 --summary /code/chaiverse_profiler_1732837115/summary.json'
Received signal 2, running shutdown handler
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service rirv938-llama-8b-multihe-9554-v4-profiler is running
Tearing down inference service rirv938-llama-8b-multihe-9554-v4-profiler
Service rirv938-llama-8b-multihe-9554-v4-profiler has been torn down
Pipeline stage MKMLProfilerDeleter completed in 1.47s
Shutdown handler de-registered
rirv938-llama-8b-multihe_9554_v4 status is now inactive due to auto deactivation of underperforming models