submission_id: kaoeiri-pantheramax-l3-r_7856_v5
developer_uid: helkero
best_of: 4
celo_rating: 1134.59
display_name: kaoeiri-pantheramax-l3-r_7856_v5
family_friendly_score: 0.0
formatter: {'memory_template': '### Instruction:\n{memory}\n', 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '<|im_start|>user\n{user_name}: {message}<|im_end|>\n', 'response_template': '<|im_start|>assistant\n{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.94, 'top_p': 0.95, 'min_p': 0.075, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|eot_id|>'], 'max_input_tokens': 512, 'best_of': 4, 'max_output_tokens': 64}
is_internal_developer: False
language_model: Kaoeiri/PantheraMax-L3-RP-TestProbe-5-8x8B
max_input_tokens: 512
max_output_tokens: 64
model_architecture: MixtralForCausalLM
model_group: Kaoeiri/PantheraMax-L3-R
model_name: kaoeiri-pantheramax-l3-r_7856_v5
model_num_parameters: 47491321856.0
model_repo: Kaoeiri/PantheraMax-L3-RP-TestProbe-5-8x8B
model_size: 47B
num_battles: 9297
num_wins: 3936
ranking_group: single
reward_formatter: {'bot_template': '<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '<|im_start|>user\n{prompt}<|im_end|>\n', 'response_template': '<|im_start|>assistant\n{bot_name}:', 'truncate_by_message': False, 'user_template': '<|im_start|>user\n{user_name}: {message}<|im_end|>\n'}
reward_repo: Jellywibble/CHAI_alignment_reward_model
status: torndown
submission_type: basic
timestamp: 2024-07-18T15:02:15+00:00
us_pacific_date: 2024-07-18
win_ratio: 0.4233623749596644
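The `win_ratio` above is simply `num_wins / num_battles` from the same record. A minimal check of that arithmetic:

```python
# Recompute win_ratio from the num_wins / num_battles fields in this record.
num_wins = 3936
num_battles = 9297

win_ratio = num_wins / num_battles

# Agrees with the reported win_ratio of 0.4233623749596644
assert abs(win_ratio - 0.4233623749596644) < 1e-12
```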
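The `formatter` templates above compose the conversation string sent to the model. A minimal sketch of how such a template set can be rendered with Python `str.format`; the persona, prompt, and messages are hypothetical examples, and the real serving code (token budgeting, truncation) is not reproduced here:

```python
# The template set from this submission's `formatter` field.
formatter = {
    'memory_template': '### Instruction:\n{memory}\n',
    'prompt_template': '{prompt}\n<START>\n',
    'bot_template': '{bot_name}: {message}\n',
    'user_template': '<|im_start|>user\n{user_name}: {message}<|im_end|>\n',
    'response_template': '<|im_start|>assistant\n{bot_name}:',
}

def render(formatter, memory, prompt, turns, bot_name):
    """Compose a conversation string from the template set.

    `turns` is a list of (role, speaker_name, message) tuples with role
    'user' or 'bot'. This only mirrors the template keys; truncation
    (`truncate_by_message`, max_input_tokens) is omitted.
    """
    parts = [formatter['memory_template'].format(memory=memory),
             formatter['prompt_template'].format(prompt=prompt)]
    for role, name, message in turns:
        if role == 'user':
            parts.append(formatter['user_template'].format(
                user_name=name, message=message))
        else:
            parts.append(formatter['bot_template'].format(
                bot_name=name, message=message))
    # The response template leaves the assistant turn open for generation.
    parts.append(formatter['response_template'].format(bot_name=bot_name))
    return ''.join(parts)

text = render(formatter, 'Stay in character.', 'A tavern scene.',
              [('user', 'Alice', 'Hello!'), ('bot', 'Panthera', 'Well met.')],
              'Panthera')
```

Note that the string ends with `<|im_start|>assistant\n{bot_name}:`, so the model continues from the open assistant turn and generation stops on the configured stopping words.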
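The `generation_params` combine several truncation filters before sampling. A sketch of how these knobs interact, assuming the common application order (temperature, then top-k, then top-p, then min-p); the serving engine's exact order and implementation are not documented in this log:

```python
import math
import random

# Sampling knobs from this submission's generation_params.
params = {'temperature': 0.94, 'top_k': 80, 'top_p': 0.95, 'min_p': 0.075}

def sample(logits, temperature, top_k, top_p, min_p, rng=random.random):
    """Draw one token index from `logits` after the usual filters."""
    # Temperature scaling, then a numerically stable softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    z = sum(exps)
    probs = sorted(((i, e / z) for i, e in enumerate(exps)),
                   key=lambda p: p[1], reverse=True)
    # top_k: keep only the k most likely tokens.
    probs = probs[:top_k]
    # top_p (nucleus): smallest prefix whose cumulative prob reaches top_p.
    kept, cum = [], 0.0
    for i, p in probs:
        kept.append((i, p))
        cum += p
        if cum >= top_p:
            break
    # min_p: drop tokens below min_p times the top token's probability.
    floor = min_p * kept[0][1]
    kept = [(i, p) for i, p in kept if p >= floor]
    # Renormalise over the surviving tokens and draw.
    z = sum(p for _, p in kept)
    r = rng() * z
    for i, p in kept:
        r -= p
        if r <= 0:
            return i
    return kept[-1][0]

token = sample([2.0, 1.0, 0.5, -1.0], **params)
```

With `best_of: 4`, four such completions are drawn and the reward model picks the highest-scoring one.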
Resubmit model
Running pipeline stage MKMLizer
Starting job with name kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer
Waiting for job on kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer to finish
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ║ _____ __ __ ║
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ║ /___/ ║
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ║ ║
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ║ Version: 0.9.5.post3 ║
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ║ https://mk1.ai ║
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ║ ║
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ║ The license key for the current software has been verified as ║
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ║ belonging to: ║
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ║ ║
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ║ Chai Research Corp. ║
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ║ ║
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: Downloaded to shared memory in 154.715s
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp362i5vw1, device:0
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Connection pool is full, discarding connection: %s. Connection pool size: %s
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: Loading 0: 0%| | 0/995 [00:00<?, ?it/s] ... Loading 0: 99%|█████████▊| 982/995 [01:31<00:00, 17.08it/s]
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: quantized model in 108.767s
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: Processed model Kaoeiri/PantheraMax-L3-RP-TestProbe-5-8x8B in 263.482s
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: creating bucket guanaco-mkml-models
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/kaoeiri-pantheramax-l3-r-7856-v5
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/kaoeiri-pantheramax-l3-r-7856-v5/special_tokens_map.json
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/kaoeiri-pantheramax-l3-r-7856-v5/config.json
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/kaoeiri-pantheramax-l3-r-7856-v5/tokenizer_config.json
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/kaoeiri-pantheramax-l3-r-7856-v5/tokenizer.json
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: cp /dev/shm/model_cache/flywheel_model.5.safetensors s3://guanaco-mkml-models/kaoeiri-pantheramax-l3-r-7856-v5/flywheel_model.5.safetensors
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/kaoeiri-pantheramax-l3-r-7856-v5/flywheel_model.1.safetensors
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: cp /dev/shm/model_cache/flywheel_model.3.safetensors s3://guanaco-mkml-models/kaoeiri-pantheramax-l3-r-7856-v5/flywheel_model.3.safetensors
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/kaoeiri-pantheramax-l3-r-7856-v5/flywheel_model.0.safetensors
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: cp /dev/shm/model_cache/flywheel_model.2.safetensors s3://guanaco-mkml-models/kaoeiri-pantheramax-l3-r-7856-v5/flywheel_model.2.safetensors
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: cp /dev/shm/model_cache/flywheel_model.4.safetensors s3://guanaco-mkml-models/kaoeiri-pantheramax-l3-r-7856-v5/flywheel_model.4.safetensors
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: loading reward model from Jellywibble/CHAI_alignment_reward_model
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:950: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: warnings.warn(
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:778: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: warnings.warn(
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: warnings.warn(
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: Saving duration: 0.196s
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: Processed model Jellywibble/CHAI_alignment_reward_model in 4.188s
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: creating bucket guanaco-reward-models
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: Bucket 's3://guanaco-reward-models/' created
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/kaoeiri-pantheramax-l3-r-7856-v5_reward
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/kaoeiri-pantheramax-l3-r-7856-v5_reward/config.json
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/kaoeiri-pantheramax-l3-r-7856-v5_reward/special_tokens_map.json
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/kaoeiri-pantheramax-l3-r-7856-v5_reward/tokenizer_config.json
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/kaoeiri-pantheramax-l3-r-7856-v5_reward/merges.txt
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/kaoeiri-pantheramax-l3-r-7856-v5_reward/vocab.json
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/kaoeiri-pantheramax-l3-r-7856-v5_reward/tokenizer.json
kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/kaoeiri-pantheramax-l3-r-7856-v5_reward/reward.tensors
Job kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer completed after 320.08s with status: succeeded
Stopping job with name kaoeiri-pantheramax-l3-r-7856-v5-mkmlizer
Pipeline stage MKMLizer completed in 321.92s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service kaoeiri-pantheramax-l3-r-7856-v5
Waiting for inference service kaoeiri-pantheramax-l3-r-7856-v5 to be ready
Inference service kaoeiri-pantheramax-l3-r-7856-v5 ready after 193.40s
Pipeline stage ISVCDeployer completed in 196.38s
Running pipeline stage StressChecker
Received healthy response to inference request in 4.3627166748046875s
Received healthy response to inference request in 3.3239259719848633s
Received healthy response to inference request in 3.4946954250335693s
Received healthy response to inference request in 3.3808915615081787s
Received healthy response to inference request in 3.3233306407928467s
5 requests
0 failed requests
5th percentile: 3.32344970703125
10th percentile: 3.3235687732696535
20th percentile: 3.32380690574646
30th percentile: 3.335319089889526
40th percentile: 3.3581053256988525
50th percentile: 3.3808915615081787
60th percentile: 3.426413106918335
70th percentile: 3.4719346523284913
80th percentile: 3.668299674987793
90th percentile: 4.01550817489624
95th percentile: 4.189112424850464
99th percentile: 4.327995824813843
mean time: 3.577112054824829
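The StressChecker summary can be reproduced from the five latencies above. A sketch, assuming percentiles use linear interpolation between closest ranks (numpy's default method, which matches the reported values):

```python
# The five healthy-response latencies reported by StressChecker, sorted.
times = sorted([4.3627166748046875, 3.3239259719848633, 3.4946954250335693,
                3.3808915615081787, 3.3233306407928467])

def percentile(xs, q):
    """q-th percentile of sorted xs via linear interpolation (assumed method)."""
    pos = (q / 100) * (len(xs) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (pos - lo) * (xs[hi] - xs[lo])

mean = sum(times) / len(times)
p50 = percentile(times, 50)   # the median, 3.3808915615081787
p90 = percentile(times, 90)
```

With only five samples, every percentile is an interpolation between at most two observations, so the tails (95th, 99th) are dominated by the single 4.36s outlier.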
Pipeline stage StressChecker completed in 18.72s
kaoeiri-pantheramax-l3-r_7856_v5 status is now deployed due to DeploymentManager action
kaoeiri-pantheramax-l3-r_7856_v5 status is now inactive due to auto deactivation of underperforming models
admin requested tearing down of kaoeiri-pantheramax-l3-r_7856_v5
Running pipeline stage ISVCDeleter
Checking if service kaoeiri-pantheramax-l3-r-7856-v5 is running
Tearing down inference service kaoeiri-pantheramax-l3-r-7856-v5
Service kaoeiri-pantheramax-l3-r-7856-v5 has been torn down
Pipeline stage ISVCDeleter completed in 5.78s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key kaoeiri-pantheramax-l3-r-7856-v5/config.json from bucket guanaco-mkml-models
Deleting key kaoeiri-pantheramax-l3-r-7856-v5/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key kaoeiri-pantheramax-l3-r-7856-v5/flywheel_model.1.safetensors from bucket guanaco-mkml-models
Deleting key kaoeiri-pantheramax-l3-r-7856-v5/flywheel_model.2.safetensors from bucket guanaco-mkml-models
Deleting key kaoeiri-pantheramax-l3-r-7856-v5/flywheel_model.3.safetensors from bucket guanaco-mkml-models
Deleting key kaoeiri-pantheramax-l3-r-7856-v5/flywheel_model.4.safetensors from bucket guanaco-mkml-models
Deleting key kaoeiri-pantheramax-l3-r-7856-v5/flywheel_model.5.safetensors from bucket guanaco-mkml-models
Deleting key kaoeiri-pantheramax-l3-r-7856-v5/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key kaoeiri-pantheramax-l3-r-7856-v5/tokenizer.json from bucket guanaco-mkml-models
Deleting key kaoeiri-pantheramax-l3-r-7856-v5/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key kaoeiri-pantheramax-l3-r-7856-v5_reward/config.json from bucket guanaco-reward-models
Deleting key kaoeiri-pantheramax-l3-r-7856-v5_reward/merges.txt from bucket guanaco-reward-models
Deleting key kaoeiri-pantheramax-l3-r-7856-v5_reward/reward.tensors from bucket guanaco-reward-models
Deleting key kaoeiri-pantheramax-l3-r-7856-v5_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key kaoeiri-pantheramax-l3-r-7856-v5_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key kaoeiri-pantheramax-l3-r-7856-v5_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key kaoeiri-pantheramax-l3-r-7856-v5_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 9.70s
kaoeiri-pantheramax-l3-r_7856_v5 status is now torndown due to DeploymentManager action