submission_id: mistralai-mistral-nemo-_9330_v34
developer_uid: zonemercy
alignment_samples: 0
best_of: 1
celo_rating: 1150.75
display_name: 0801v1-0
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.95, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['</s>', '###', 'Bot:', 'User:', 'You:', '<|im_end|>'], 'max_input_tokens': 1024, 'best_of': 1, 'max_output_tokens': 64, 'reward_max_token_input': 256}
ineligible_reason: propriety_total_count < 800
is_internal_developer: True
language_model: mistralai/Mistral-Nemo-Instruct-2407
max_input_tokens: 1024
max_output_tokens: 64
model_architecture: MistralForCausalLM
model_group: mistralai/Mistral-Nemo-I
model_name: 0801v1-0
model_num_parameters: 12772070400.0
model_repo: mistralai/Mistral-Nemo-Instruct-2407
model_size: 13B
num_battles: 8810
num_wins: 3743
propriety_score: 0.732824427480916
propriety_total_count: 786.0
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': '', 'prompt_template': '', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: ChaiML/gpt2_xl_pairwise_89m_step_347634
status: torndown
submission_type: basic
timestamp: 2024-08-01T21:56:56+00:00
us_pacific_date: 2024-08-01
win_ratio: 0.4248581157775255
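The formatter templates and battle counts above can be sanity-checked with a short script. This is a hedged sketch: the `render_prompt` helper is hypothetical and not part of the Chaiverse pipeline; only the template strings and the counts come from the record.

```python
# Sketch: assemble a model input from the submission's formatter templates,
# then cross-check win_ratio against num_wins / num_battles.
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def render_prompt(memory, prompt, turns, bot_name="Bot", user_name="You"):
    # Persona block, scenario prompt, alternating chat turns,
    # then the open-ended response stub the model completes.
    out = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    out += formatter["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        if speaker == "bot":
            out += formatter["bot_template"].format(bot_name=bot_name, message=message)
        else:
            out += formatter["user_template"].format(user_name=user_name, message=message)
    out += formatter["response_template"].format(bot_name=bot_name)
    return out

text = render_prompt("A helpful pirate.", "A chat on deck.",
                     [("user", "Ahoy!"), ("bot", "Ahoy, matey!")])
assert text.endswith("Bot:")

# win_ratio is simply num_wins / num_battles.
num_battles, num_wins = 8810, 3743
assert abs(num_wins / num_battles - 0.4248581157775255) < 1e-12
```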
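The generation_params combine several sampling filters (temperature 0.95, top_k 80, min_p 0.05). As a rough illustration only (not the serving stack's actual implementation), min_p filtering keeps tokens whose probability is at least min_p times the top token's probability, after temperature scaling and top_k truncation:

```python
import math

def filter_logits(logits, temperature=0.95, top_k=80, min_p=0.05):
    # Temperature-scale, convert to probabilities via a stable softmax,
    # then apply the top_k and min_p filters from generation_params.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    probs = [math.exp(s - m) for s in scaled]
    total = sum(probs)
    probs = [p / total for p in probs]
    # top_k: keep the k highest-probability token indices.
    keep = set(sorted(range(len(probs)), key=probs.__getitem__, reverse=True)[:top_k])
    # min_p: drop tokens whose probability is below min_p * max probability.
    cutoff = min_p * max(probs)
    return [i for i in keep if probs[i] >= cutoff]

# With these toy logits the low-probability tail is filtered out.
surviving = filter_logits([5.0, 4.8, 1.0, -3.0])
assert 0 in surviving and 3 not in surviving
```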
Running pipeline stage MKMLizer
Starting job with name mistralai-mistral-nemo-9330-v34-mkmlizer
Waiting for job on mistralai-mistral-nemo-9330-v34-mkmlizer to finish
mistralai-mistral-nemo-9330-v34-mkmlizer: [flywheel ASCII-art banner]
mistralai-mistral-nemo-9330-v34-mkmlizer: Version: 0.9.7
mistralai-mistral-nemo-9330-v34-mkmlizer: Copyright 2023 MK ONE TECHNOLOGIES Inc.
mistralai-mistral-nemo-9330-v34-mkmlizer: https://mk1.ai
mistralai-mistral-nemo-9330-v34-mkmlizer: The license key for the current software has been verified as belonging to:
mistralai-mistral-nemo-9330-v34-mkmlizer: Chai Research Corp.
mistralai-mistral-nemo-9330-v34-mkmlizer: Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f
mistralai-mistral-nemo-9330-v34-mkmlizer: Expiration: 2024-10-15 23:59:59
Failed to get response for submission chaiml-sao10k-l3-rp-v3-3_v76: ('http://chaiml-sao10k-l3-rp-v3-3-v76-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'request timeout')
mistralai-mistral-nemo-9330-v34-mkmlizer: Downloaded to shared memory in 53.201s
mistralai-mistral-nemo-9330-v34-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp1dwi7z77, device:0
mistralai-mistral-nemo-9330-v34-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Failed to get response for submission chaiml-sao10k-l3-rp-v3-3_v73: ('http://chaiml-sao10k-l3-rp-v3-3-v73-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'request timeout')
mistralai-mistral-nemo-9330-v34-mkmlizer: quantized model in 37.041s
mistralai-mistral-nemo-9330-v34-mkmlizer: Processed model mistralai/Mistral-Nemo-Instruct-2407 in 90.242s
mistralai-mistral-nemo-9330-v34-mkmlizer: creating bucket guanaco-mkml-models
mistralai-mistral-nemo-9330-v34-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
mistralai-mistral-nemo-9330-v34-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v34
mistralai-mistral-nemo-9330-v34-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v34/config.json
mistralai-mistral-nemo-9330-v34-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v34/special_tokens_map.json
mistralai-mistral-nemo-9330-v34-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v34/tokenizer_config.json
mistralai-mistral-nemo-9330-v34-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v34/tokenizer.json
mistralai-mistral-nemo-9330-v34-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v34/flywheel_model.0.safetensors
mistralai-mistral-nemo-9330-v34-mkmlizer: loading reward model from ChaiML/gpt2_xl_pairwise_89m_step_347634
mistralai-mistral-nemo-9330-v34-mkmlizer: [tensor-loading progress bar omitted: 363 entries, ~16s]
mistralai-mistral-nemo-9330-v34-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
mistralai-mistral-nemo-9330-v34-mkmlizer: warnings.warn(
mistralai-mistral-nemo-9330-v34-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
mistralai-mistral-nemo-9330-v34-mkmlizer: warnings.warn(
mistralai-mistral-nemo-9330-v34-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
mistralai-mistral-nemo-9330-v34-mkmlizer: warnings.warn(
mistralai-mistral-nemo-9330-v34-mkmlizer: Downloading shards: 100%|██████████| 2/2 [00:08<00:00, 4.08s/it]
mistralai-mistral-nemo-9330-v34-mkmlizer: Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 3.55it/s]
mistralai-mistral-nemo-9330-v34-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
mistralai-mistral-nemo-9330-v34-mkmlizer: Saving duration: 1.390s
mistralai-mistral-nemo-9330-v34-mkmlizer: Processed model ChaiML/gpt2_xl_pairwise_89m_step_347634 in 13.129s
mistralai-mistral-nemo-9330-v34-mkmlizer: creating bucket guanaco-reward-models
mistralai-mistral-nemo-9330-v34-mkmlizer: Bucket 's3://guanaco-reward-models/' created
mistralai-mistral-nemo-9330-v34-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/mistralai-mistral-nemo-9330-v34_reward
mistralai-mistral-nemo-9330-v34-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/mistralai-mistral-nemo-9330-v34_reward/config.json
mistralai-mistral-nemo-9330-v34-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/mistralai-mistral-nemo-9330-v34_reward/special_tokens_map.json
mistralai-mistral-nemo-9330-v34-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/mistralai-mistral-nemo-9330-v34_reward/tokenizer_config.json
mistralai-mistral-nemo-9330-v34-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/mistralai-mistral-nemo-9330-v34_reward/merges.txt
mistralai-mistral-nemo-9330-v34-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/mistralai-mistral-nemo-9330-v34_reward/vocab.json
mistralai-mistral-nemo-9330-v34-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/mistralai-mistral-nemo-9330-v34_reward/tokenizer.json
mistralai-mistral-nemo-9330-v34-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/mistralai-mistral-nemo-9330-v34_reward/reward.tensors
Job mistralai-mistral-nemo-9330-v34-mkmlizer completed after 136.96s with status: succeeded
Stopping job with name mistralai-mistral-nemo-9330-v34-mkmlizer
Pipeline stage MKMLizer completed in 137.94s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.11s
Running pipeline stage ISVCDeployer
Creating inference service mistralai-mistral-nemo-9330-v34
Waiting for inference service mistralai-mistral-nemo-9330-v34 to be ready
Inference service mistralai-mistral-nemo-9330-v34 ready after 141.1975724697113s
Pipeline stage ISVCDeployer completed in 142.98s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.2983391284942627s
Received healthy response to inference request in 1.1029915809631348s
Received healthy response to inference request in 1.5108482837677002s
Received healthy response to inference request in 0.5345253944396973s
Received healthy response to inference request in 1.4978406429290771s
5 requests
0 failed requests
5th percentile: 0.6482186317443848
10th percentile: 0.7619118690490723
20th percentile: 0.9892983436584473
30th percentile: 1.1819613933563233
40th percentile: 1.3399010181427002
50th percentile: 1.4978406429290771
60th percentile: 1.5030436992645264
70th percentile: 1.5082467555999757
80th percentile: 1.6683464527130127
90th percentile: 1.9833427906036378
95th percentile: 2.14084095954895
99th percentile: 2.2668394947052
mean time: 1.3889090061187743
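The StressChecker percentiles above are consistent with linear-interpolation percentiles over the five observed latencies. A minimal pure-Python reproduction (assuming the standard linear-interpolation definition, as in NumPy's default method) looks like:

```python
# The five healthy-response latencies reported by StressChecker, in seconds.
latencies = [2.2983391284942627, 1.1029915809631348, 1.5108482837677002,
             0.5345253944396973, 1.4978406429290771]

def pctl(xs, q):
    # Linear-interpolation percentile: position q/100 * (n-1) in the
    # sorted sample, interpolating between the two neighbouring values.
    xs = sorted(xs)
    pos = q / 100 * (len(xs) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(xs) - 1)
    frac = pos - lo
    return xs[lo] + frac * (xs[hi] - xs[lo])

# These reproduce the logged figures.
assert abs(pctl(latencies, 5) - 0.6482186317443848) < 1e-9
assert abs(pctl(latencies, 50) - 1.4978406429290771) < 1e-9
assert abs(pctl(latencies, 95) - 2.14084095954895) < 1e-9
assert abs(sum(latencies) / len(latencies) - 1.3889090061187743) < 1e-9
```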
Pipeline stage StressChecker completed in 8.22s
mistralai-mistral-nemo-_9330_v34 status is now deployed due to DeploymentManager action
mistralai-mistral-nemo-_9330_v34 status is now inactive due to admin request
admin requested tearing down of mistralai-mistral-nemo-_9330_v34
Running pipeline stage ISVCDeleter
Checking if service mistralai-mistral-nemo-9330-v34 is running
Tearing down inference service mistralai-mistral-nemo-9330-v34
Service mistralai-mistral-nemo-9330-v34 has been torndown
Pipeline stage ISVCDeleter completed in 4.63s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key mistralai-mistral-nemo-9330-v34/config.json from bucket guanaco-mkml-models
Deleting key mistralai-mistral-nemo-9330-v34/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key mistralai-mistral-nemo-9330-v34/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key mistralai-mistral-nemo-9330-v34/tokenizer.json from bucket guanaco-mkml-models
Deleting key mistralai-mistral-nemo-9330-v34/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key mistralai-mistral-nemo-9330-v34_reward/config.json from bucket guanaco-reward-models
Deleting key mistralai-mistral-nemo-9330-v34_reward/merges.txt from bucket guanaco-reward-models
Deleting key mistralai-mistral-nemo-9330-v34_reward/reward.tensors from bucket guanaco-reward-models
Deleting key mistralai-mistral-nemo-9330-v34_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key mistralai-mistral-nemo-9330-v34_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key mistralai-mistral-nemo-9330-v34_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key mistralai-mistral-nemo-9330-v34_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 6.77s
mistralai-mistral-nemo-_9330_v34 status is now torndown due to DeploymentManager action