submission_id: mistralai-mixtral-8x7b-_3473_v86
developer_uid: chai_backend_admin
status: inactive
model_repo: mistralai/Mixtral-8x7B-Instruct-v0.1
reward_repo: ChaiML/gpt2_xl_pairwise_89m_step_347634
generation_params: {'temperature': 0.9, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 50, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['\n', '</s>', '<|user|>', '###'], 'max_input_tokens': 512, 'best_of': 4, 'max_output_tokens': 64}
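The `best_of: 4` setting means each turn samples several candidate replies and keeps the one the reward model scores highest. A minimal sketch of that reranking loop, where `generate` and `score` are hypothetical stand-ins rather than the platform's actual API:

```python
def pick_best_of(prompt, n, generate, score):
    """Sample n candidate completions and return the highest-scoring one."""
    candidates = [generate(prompt) for _ in range(n)]
    return max(candidates, key=score)

# Toy stand-ins: a canned generator and a length-based "reward model".
replies = iter(["hi", "hello there", "hey!", "greetings, traveler"])
best = pick_best_of("Hello?", 4, lambda p: next(replies), score=len)
```

In production the scorer would be the pairwise reward model listed above, not string length; the shape of the loop is the same.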
formatter: {'memory_template': '<s>[INST] This is an entertaining conversation. You are {bot_name} who has the persona: {memory}.\nEngage in a chat with {user_name} while staying in character. Try to flirt with {user_name}. Engage in *roleplay* actions. Describe the scene dramatically. \n', 'prompt_template': '{prompt}\n', 'bot_template': '{bot_name}: {message}</s>', 'user_template': '[INST] {user_name}: {message} [/INST]', 'response_template': '{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'memory_template': 'Memory: {memory}\n', 'prompt_template': '{prompt}\n', 'bot_template': 'Bot: {message}\n', 'user_template': 'User: {message}\n', 'response_template': 'Bot:', 'truncate_by_message': False}
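The formatter templates above are plain `str.format` patterns; a conversation is rendered by filling each template and concatenating the pieces. A sketch using the templates from the formatter config (the bot/user names and the exact concatenation order are assumptions for illustration):

```python
# Templates copied from the formatter config above.
memory_template = ('<s>[INST] This is an entertaining conversation. You are {bot_name} '
                   'who has the persona: {memory}.\n'
                   'Engage in a chat with {user_name} while staying in character. '
                   'Try to flirt with {user_name}. Engage in *roleplay* actions. '
                   'Describe the scene dramatically. \n')
user_template = '[INST] {user_name}: {message} [/INST]'
response_template = '{bot_name}:'

# Hypothetical names and message; real serving code may join the parts differently.
prompt = (memory_template.format(bot_name='Luna', memory='a witty space pirate',
                                 user_name='Alex')
          + user_template.format(user_name='Alex', message='Hi!')
          + response_template.format(bot_name='Luna'))
```

The trailing `response_template` (`Luna:`) is what cues the model to speak as the bot; the `stopping_words` in `generation_params` then cut the completion off at the next turn boundary.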
timestamp: 2024-07-11T21:52:37+00:00
model_name: mistralai-mixtral-8x7b-_3473_v86
model_group: mistralai/Mixtral-8x7B-I
num_battles: 5182403
num_wins: 2375951
celo_rating: 1173.77
alignment_score: None
alignment_samples: 0
propriety_score: 0.7386531261146132
propriety_total_count: 557929.0
submission_type: basic
model_architecture: MixtralForCausalLM
model_num_parameters: 46702792704.0
best_of: 4
max_input_tokens: 512
max_output_tokens: 64
display_name: mistralai-mixtral-8x7b-_3473_v86
ineligible_reason: None
language_model: mistralai/Mixtral-8x7B-Instruct-v0.1
model_size: 47B
reward_model: ChaiML/gpt2_xl_pairwise_89m_step_347634
us_pacific_date: 2024-07-11
win_ratio: 0.4584651174368338
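The reported `win_ratio` is simply wins over battles, which checks out against the raw counts above:

```python
num_battles = 5_182_403
num_wins = 2_375_951
win_ratio = num_wins / num_battles  # ~0.4585, matching the reported value
```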
preference_data_url: None
Running pipeline stage MKMLizer
Starting job with name mistralai-mixtral-8x7b-3473-v86-mkmlizer
Waiting for job on mistralai-mixtral-8x7b-3473-v86-mkmlizer to finish
Job mistralai-mixtral-8x7b-3473-v86-mkmlizer completed after 534.81s with status: failed
Stopping job with name mistralai-mixtral-8x7b-3473-v86-mkmlizer
%s, retrying in %s seconds...
Starting job with name mistralai-mixtral-8x7b-3473-v86-mkmlizer
Waiting for job on mistralai-mixtral-8x7b-3473-v86-mkmlizer to finish
mistralai-mixtral-8x7b-3473-v86-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
mistralai-mixtral-8x7b-3473-v86-mkmlizer: ║ [flywheel ASCII-art banner] ║
mistralai-mixtral-8x7b-3473-v86-mkmlizer: ║ ║
mistralai-mixtral-8x7b-3473-v86-mkmlizer: ║ Version: 0.9.5.post1 ║
mistralai-mixtral-8x7b-3473-v86-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
mistralai-mixtral-8x7b-3473-v86-mkmlizer: ║ https://mk1.ai ║
mistralai-mixtral-8x7b-3473-v86-mkmlizer: ║ ║
mistralai-mixtral-8x7b-3473-v86-mkmlizer: ║ The license key for the current software has been verified as ║
mistralai-mixtral-8x7b-3473-v86-mkmlizer: ║ belonging to: ║
mistralai-mixtral-8x7b-3473-v86-mkmlizer: ║ ║
mistralai-mixtral-8x7b-3473-v86-mkmlizer: ║ Chai Research Corp. ║
mistralai-mixtral-8x7b-3473-v86-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
mistralai-mixtral-8x7b-3473-v86-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
mistralai-mixtral-8x7b-3473-v86-mkmlizer: ║ ║
mistralai-mixtral-8x7b-3473-v86-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
mistralai-mixtral-8x7b-3473-v86-mkmlizer: Downloaded to shared memory in 278.970s
mistralai-mixtral-8x7b-3473-v86-mkmlizer: quantizing model to /dev/shm/model_cache
mistralai-mixtral-8x7b-3473-v86-mkmlizer: Saving flywheel model at /dev/shm/model_cache
mistralai-mixtral-8x7b-3473-v86-mkmlizer: quantized model in 102.056s
mistralai-mixtral-8x7b-3473-v86-mkmlizer: Processed model mistralai/Mixtral-8x7B-Instruct-v0.1 in 381.026s
mistralai-mixtral-8x7b-3473-v86-mkmlizer: creating bucket guanaco-mkml-models
mistralai-mixtral-8x7b-3473-v86-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
mistralai-mixtral-8x7b-3473-v86-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v86
mistralai-mixtral-8x7b-3473-v86-mkmlizer: cp /dev/shm/model_cache/flywheel_model.5.safetensors s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v86/flywheel_model.5.safetensors
mistralai-mixtral-8x7b-3473-v86-mkmlizer: cp /dev/shm/model_cache/flywheel_model.3.safetensors s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v86/flywheel_model.3.safetensors
mistralai-mixtral-8x7b-3473-v86-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v86/flywheel_model.1.safetensors
mistralai-mixtral-8x7b-3473-v86-mkmlizer: cp /dev/shm/model_cache/flywheel_model.2.safetensors s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v86/flywheel_model.2.safetensors
mistralai-mixtral-8x7b-3473-v86-mkmlizer: cp /dev/shm/model_cache/flywheel_model.4.safetensors s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v86/flywheel_model.4.safetensors
mistralai-mixtral-8x7b-3473-v86-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/mistralai-mixtral-8x7b-3473-v86/flywheel_model.0.safetensors
mistralai-mixtral-8x7b-3473-v86-mkmlizer: loading reward model from ChaiML/gpt2_xl_pairwise_89m_step_347634
mistralai-mixtral-8x7b-3473-v86-mkmlizer: Loading 0: 0%| | 0/995 [00:00<?, ?it/s] ... Loading 0: 99%|█████████▉| 984/995 [01:30<00:00, 29.06it/s]
mistralai-mixtral-8x7b-3473-v86-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:950: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
mistralai-mixtral-8x7b-3473-v86-mkmlizer: warnings.warn(
mistralai-mixtral-8x7b-3473-v86-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:778: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
mistralai-mixtral-8x7b-3473-v86-mkmlizer: warnings.warn(
mistralai-mixtral-8x7b-3473-v86-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
mistralai-mixtral-8x7b-3473-v86-mkmlizer: warnings.warn(
mistralai-mixtral-8x7b-3473-v86-mkmlizer: Downloading shards: 100%|██████████| 2/2 [00:11<00:00, 5.55s/it]
mistralai-mixtral-8x7b-3473-v86-mkmlizer: Loading checkpoint shards: 100%|██████████| 2/2 [00:01<00:00, 1.83it/s]
mistralai-mixtral-8x7b-3473-v86-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
mistralai-mixtral-8x7b-3473-v86-mkmlizer: Saving duration: 2.217s
mistralai-mixtral-8x7b-3473-v86-mkmlizer: Processed model ChaiML/gpt2_xl_pairwise_89m_step_347634 in 16.328s
mistralai-mixtral-8x7b-3473-v86-mkmlizer: creating bucket guanaco-reward-models
mistralai-mixtral-8x7b-3473-v86-mkmlizer: Bucket 's3://guanaco-reward-models/' created
mistralai-mixtral-8x7b-3473-v86-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/mistralai-mixtral-8x7b-3473-v86_reward
mistralai-mixtral-8x7b-3473-v86-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/mistralai-mixtral-8x7b-3473-v86_reward/special_tokens_map.json
mistralai-mixtral-8x7b-3473-v86-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/mistralai-mixtral-8x7b-3473-v86_reward/tokenizer_config.json
mistralai-mixtral-8x7b-3473-v86-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/mistralai-mixtral-8x7b-3473-v86_reward/config.json
mistralai-mixtral-8x7b-3473-v86-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/mistralai-mixtral-8x7b-3473-v86_reward/merges.txt
mistralai-mixtral-8x7b-3473-v86-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/mistralai-mixtral-8x7b-3473-v86_reward/vocab.json
mistralai-mixtral-8x7b-3473-v86-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/mistralai-mixtral-8x7b-3473-v86_reward/tokenizer.json
mistralai-mixtral-8x7b-3473-v86-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/mistralai-mixtral-8x7b-3473-v86_reward/reward.tensors
Job mistralai-mixtral-8x7b-3473-v86-mkmlizer completed after 446.22s with status: succeeded
Stopping job with name mistralai-mixtral-8x7b-3473-v86-mkmlizer
Pipeline stage MKMLizer completed in 984.15s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.27s
Running pipeline stage ISVCDeployer
Creating inference service mistralai-mixtral-8x7b-3473-v86
Waiting for inference service mistralai-mixtral-8x7b-3473-v86 to be ready
Inference service mistralai-mixtral-8x7b-3473-v86 ready after 82.003093957901s
Pipeline stage ISVCDeployer completed in 83.35s
Running pipeline stage StressChecker
HTTP Request: %s %s "%s %d %s"
Received healthy response to inference request in 2.5672624111175537s
Received healthy response to inference request in 1.9091684818267822s
Received healthy response to inference request in 1.9534218311309814s
Received healthy response to inference request in 3.108717679977417s
Received healthy response to inference request in 1.2926456928253174s
5 requests
0 failed requests
5th percentile: 1.4159502506256103
10th percentile: 1.5392548084259032
20th percentile: 1.7858639240264893
30th percentile: 1.9180191516876222
40th percentile: 1.9357204914093018
50th percentile: 1.9534218311309814
60th percentile: 2.1989580631256103
70th percentile: 2.444494295120239
80th percentile: 2.6755534648895263
90th percentile: 2.892135572433472
95th percentile: 3.0004266262054444
99th percentile: 3.0870594692230227
mean time: 2.1662432193756103
Pipeline stage StressChecker completed in 13.03s
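The percentile figures reported by the stress checker are consistent with linear-interpolation percentiles over the five request latencies. A quick check (the `percentile` helper is our own, written to match the common linear-interpolation definition, e.g. numpy's default):

```python
def percentile(xs, p):
    """Linear-interpolation percentile over a small sample."""
    xs = sorted(xs)
    k = (len(xs) - 1) * p / 100.0
    f = int(k)
    if f == k:
        return xs[f]
    # Interpolate between the two neighbouring order statistics.
    return xs[f] + (k - f) * (xs[f + 1] - xs[f])

# The five response times logged above, in seconds.
latencies = [2.5672624111175537, 1.9091684818267822, 1.9534218311309814,
             3.108717679977417, 1.2926456928253174]
mean = sum(latencies) / len(latencies)  # ~2.166, matching the logged mean
```

For example, the 5th percentile interpolates 20% of the way from the fastest request toward the second fastest, giving the 1.4159s value in the log.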
mistralai-mixtral-8x7b-_3473_v86 status is now deployed due to DeploymentManager action

Usage Metrics

Latency Metrics