submission_id: mistralai-mixtral-8x7b-_3473_v65
developer_uid: chai_backend_admin
status: inactive
model_repo: mistralai/Mixtral-8x7B-Instruct-v0.1
reward_repo: ChaiML/gpt2_xl_pairwise_89m_step_347634
generation_params: {'temperature': 0.9, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 50, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['\n', '</s>', '<|user|>', '###'], 'max_input_tokens': 512, 'best_of': 4, 'max_output_tokens': 64}
formatter: {'memory_template': '<s>[INST] This is an entertaining conversation. You are {bot_name} who has the persona: {memory}.\nEngage in a chat with {user_name} while staying in character. Try to flirt with {user_name}. Engage in *roleplay* actions. Describe the scene dramatically. \n', 'prompt_template': '{prompt}\n', 'bot_template': '{bot_name}: {message}</s>', 'user_template': '[INST] {user_name}: {message} [/INST]', 'response_template': '{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'memory_template': "''", 'prompt_template': "''", 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
timestamp: 2024-07-09T22:15:27+00:00
model_name: mistralai-mixtral-8x7b-_3473_v65
model_group: mistralai/Mixtral-8x7B-I
num_battles: 36601
num_wins: 17609
celo_rating: 1179.15
propriety_score: 0.722213337597953
propriety_total_count: 6253.0
submission_type: basic
model_architecture: MixtralForCausalLM
model_num_parameters: 46702792704.0
best_of: 4
max_input_tokens: 512
max_output_tokens: 64
display_name: mistralai-mixtral-8x7b-_3473_v65
ineligible_reason: None
language_model: mistralai/Mixtral-8x7B-Instruct-v0.1
model_size: 47B
reward_model: ChaiML/gpt2_xl_pairwise_89m_step_347634
us_pacific_date: 2024-07-09
win_ratio: 0.48110707357722465
preference_data_url: None
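The formatter entries above are plain `str.format` templates. A minimal sketch of how they could be combined into a single prompt; the character/user names and the assembly order here are illustrative assumptions, not the production formatter code:

```python
# Templates copied from the submission's formatter config.
memory_template = (
    "<s>[INST] This is an entertaining conversation. You are {bot_name} "
    "who has the persona: {memory}.\n"
    "Engage in a chat with {user_name} while staying in character. "
    "Try to flirt with {user_name}. Engage in *roleplay* actions. "
    "Describe the scene dramatically. \n"
)
user_template = "[INST] {user_name}: {message} [/INST]"
response_template = "{bot_name}:"

# Assemble a one-turn prompt (names and message are made up for illustration).
prompt = memory_template.format(bot_name="Aria", memory="a witty pirate", user_name="Sam")
prompt += user_template.format(user_name="Sam", message="Ahoy!")
prompt += response_template.format(bot_name="Aria")
print(prompt)
```

Generation would then continue from the trailing `{bot_name}:` cue, stopping at any of the configured stopping_words.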
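With best_of set to 4 and a reward_repo configured, serving presumably samples several candidate replies and keeps the one the reward model scores highest. A minimal best-of-N sketch under that assumption; `generate()` and `score()` are hypothetical stand-ins, not the actual serving or reward-model calls:

```python
import random

def generate(prompt: str) -> str:
    # Placeholder for sampling one reply with the generation_params above.
    return f"candidate-{random.randint(0, 9999)}"

def score(prompt: str, reply: str) -> float:
    # Placeholder for the pairwise reward model's scalar score.
    return random.random()

def best_of_n(prompt: str, n: int = 4) -> str:
    # Sample n candidates and return the highest-scoring one.
    candidates = [generate(prompt) for _ in range(n)]
    return max(candidates, key=lambda c: score(prompt, c))

print(best_of_n("Sam: Ahoy!"))
```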
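The battle statistics above are internally consistent: win_ratio is simply num_wins divided by num_battles, which a quick check reproduces.

```python
# Values copied from the metadata fields above.
num_battles = 36601
num_wins = 17609

win_ratio = num_wins / num_battles
# Agrees with the reported win_ratio of 0.48110707357722465.
print(win_ratio)
```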
Running pipeline stage MKMLizer
Starting job with name mistralai-mixtral-8x7b-3473-v65-mkmlizer
Waiting for job on mistralai-mixtral-8x7b-3473-v65-mkmlizer to finish
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ║ _____ __ __ ║
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ║ /___/ ║
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ║ ║
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ║ Version: 0.9.5.post1 ║
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ║ https://mk1.ai ║
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ║ ║
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ║ The license key for the current software has been verified as ║
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ║ belonging to: ║
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ║ ║
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ║ Chai Research Corp. ║
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ║ ║
mistralai-mixtral-8x7b-3473-v65-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
mistralai-mixtral-8x7b-3473-v65-mkmlizer: Downloaded to shared memory in 152.937s
mistralai-mixtral-8x7b-3473-v65-mkmlizer: quantizing model to /dev/shm/model_cache
mistralai-mixtral-8x7b-3473-v65-mkmlizer: Saving flywheel model at /dev/shm/model_cache
mistralai-mixtral-8x7b-3473-v65-mkmlizer: quantized model in 69.669s
mistralai-mixtral-8x7b-3473-v65-mkmlizer: Processed model mistralai/Mixtral-8x7B-Instruct-v0.1 in 222.606s
mistralai-mixtral-8x7b-3473-v65-mkmlizer: creating bucket guanaco-mkml-models
mistralai-mixtral-8x7b-3473-v65-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
mistralai-mixtral-8x7b-3473-v65-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v65
mistralai-mixtral-8x7b-3473-v65-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v65/config.json
mistralai-mixtral-8x7b-3473-v65-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v65/special_tokens_map.json
mistralai-mixtral-8x7b-3473-v65-mkmlizer: cp /dev/shm/model_cache/tokenizer.model s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v65/tokenizer.model
mistralai-mixtral-8x7b-3473-v65-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v65/tokenizer_config.json
mistralai-mixtral-8x7b-3473-v65-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v65/tokenizer.json
mistralai-mixtral-8x7b-3473-v65-mkmlizer: cp /dev/shm/model_cache/flywheel_model.5.safetensors s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v65/flywheel_model.5.safetensors
mistralai-mixtral-8x7b-3473-v65-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v65/flywheel_model.1.safetensors
mistralai-mixtral-8x7b-3473-v65-mkmlizer: cp /dev/shm/model_cache/flywheel_model.3.safetensors s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v65/flywheel_model.3.safetensors
mistralai-mixtral-8x7b-3473-v65-mkmlizer: cp /dev/shm/model_cache/flywheel_model.2.safetensors s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v65/flywheel_model.2.safetensors
mistralai-mixtral-8x7b-3473-v65-mkmlizer: cp /dev/shm/model_cache/flywheel_model.4.safetensors s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v65/flywheel_model.4.safetensors
mistralai-mixtral-8x7b-3473-v65-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v65/flywheel_model.0.safetensors
mistralai-mixtral-8x7b-3473-v65-mkmlizer: loading reward model from ChaiML/gpt2_xl_pairwise_89m_step_347634
mistralai-mixtral-8x7b-3473-v65-mkmlizer: Loading 0: 99%|█████████▉| 987/995 [01:01<00:00, 34.30it/s] (intermediate tqdm progress updates elided)
mistralai-mixtral-8x7b-3473-v65-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:950: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
mistralai-mixtral-8x7b-3473-v65-mkmlizer: warnings.warn(
mistralai-mixtral-8x7b-3473-v65-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:778: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
mistralai-mixtral-8x7b-3473-v65-mkmlizer: warnings.warn(
mistralai-mixtral-8x7b-3473-v65-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
mistralai-mixtral-8x7b-3473-v65-mkmlizer: warnings.warn(
mistralai-mixtral-8x7b-3473-v65-mkmlizer: Downloading shards: 100%|██████████| 2/2 [00:08<00:00, 4.25s/it]
mistralai-mixtral-8x7b-3473-v65-mkmlizer: Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 2.35it/s]
mistralai-mixtral-8x7b-3473-v65-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
mistralai-mixtral-8x7b-3473-v65-mkmlizer: Saving duration: 0.978s
mistralai-mixtral-8x7b-3473-v65-mkmlizer: Processed model ChaiML/gpt2_xl_pairwise_89m_step_347634 in 12.160s
mistralai-mixtral-8x7b-3473-v65-mkmlizer: creating bucket guanaco-reward-models
mistralai-mixtral-8x7b-3473-v65-mkmlizer: Bucket 's3://guanaco-reward-models/' created
mistralai-mixtral-8x7b-3473-v65-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/mistralai-mixtral-8x7b-3473-v65_reward
mistralai-mixtral-8x7b-3473-v65-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/mistralai-mixtral-8x7b-3473-v65_reward/special_tokens_map.json
mistralai-mixtral-8x7b-3473-v65-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/mistralai-mixtral-8x7b-3473-v65_reward/config.json
mistralai-mixtral-8x7b-3473-v65-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/mistralai-mixtral-8x7b-3473-v65_reward/vocab.json
mistralai-mixtral-8x7b-3473-v65-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/mistralai-mixtral-8x7b-3473-v65_reward/tokenizer_config.json
mistralai-mixtral-8x7b-3473-v65-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/mistralai-mixtral-8x7b-3473-v65_reward/merges.txt
mistralai-mixtral-8x7b-3473-v65-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/mistralai-mixtral-8x7b-3473-v65_reward/tokenizer.json
mistralai-mixtral-8x7b-3473-v65-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/mistralai-mixtral-8x7b-3473-v65_reward/reward.tensors
Job mistralai-mixtral-8x7b-3473-v65-mkmlizer completed after 621.64s with status: succeeded
Stopping job with name mistralai-mixtral-8x7b-3473-v65-mkmlizer
Pipeline stage MKMLizer completed in 622.97s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.26s
Running pipeline stage ISVCDeployer
Creating inference service mistralai-mixtral-8x7b-3473-v65
Waiting for inference service mistralai-mixtral-8x7b-3473-v65 to be ready
Inference service mistralai-mixtral-8x7b-3473-v65 ready after 222.7401304244995s
Pipeline stage ISVCDeployer completed in 229.37s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.5248048305511475s
Received healthy response to inference request in 1.8505666255950928s
Received healthy response to inference request in 2.2311792373657227s
Received healthy response to inference request in 1.7951364517211914s
Received healthy response to inference request in 1.5175755023956299s
5 requests
0 failed requests
5th percentile: 1.5730876922607422
10th percentile: 1.6285998821258545
20th percentile: 1.739624261856079
30th percentile: 1.8062224864959717
40th percentile: 1.8283945560455321
50th percentile: 1.8505666255950928
60th percentile: 2.0028116703033447
70th percentile: 2.1550567150115967
80th percentile: 2.2899043560028076
90th percentile: 2.4073545932769775
95th percentile: 2.4660797119140625
99th percentile: 2.5130598068237306
mean time: 1.9838525295257567
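The percentile summary above can be reproduced from the five response times using linear interpolation between sorted samples (a sketch; the actual StressChecker implementation is not shown in this log):

```python
# The five latencies (seconds) copied from the StressChecker output above.
times = sorted([
    2.5248048305511475,
    1.8505666255950928,
    2.2311792373657227,
    1.7951364517211914,
    1.5175755023956299,
])

def percentile(data, p):
    """Linear-interpolated percentile of already-sorted data, p in [0, 100]."""
    k = (len(data) - 1) * p / 100
    lo = int(k)
    hi = min(lo + 1, len(data) - 1)
    return data[lo] + (k - lo) * (data[hi] - data[lo])

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {percentile(times, p)}")
print(f"mean time: {sum(times) / len(times)}")
```

With only 5 samples, each percentile is an interpolation between two adjacent observations, so e.g. the 50th percentile equals the median sample exactly.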
Pipeline stage StressChecker completed in 12.11s
mistralai-mixtral-8x7b-_3473_v65 status is now deployed due to DeploymentManager action
mistralai-mixtral-8x7b-_3473_v65 status is now inactive due to auto-deactivation of underperforming models
Usage Metrics (interactive chart not captured in this log)

Latency Metrics (interactive chart not captured in this log)