submission_id: mistralai-mistral-nemo-_9330_v36
developer_uid: chai_backend_admin
alignment_samples: 155206
alignment_score: 0.23341543019871236
best_of: 16
celo_rating: 1222.05
display_name: mistralai-mistral-nemo-_9330_v36
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.95, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['</s>', '###', 'Bot:', 'User:', 'You:', '<|im_end|>'], 'max_input_tokens': 1024, 'best_of': 16, 'max_output_tokens': 64, 'reward_max_token_input': 256}
is_internal_developer: True
language_model: mistralai/Mistral-Nemo-Instruct-2407
max_input_tokens: 1024
max_output_tokens: 64
model_architecture: MistralForCausalLM
model_group: mistralai/Mistral-Nemo-I
model_name: mistralai-mistral-nemo-_9330_v36
model_num_parameters: 12772070400.0
model_repo: mistralai/Mistral-Nemo-Instruct-2407
model_size: 13B
num_battles: 164734
num_wins: 82676
propriety_score: 0.7253583038580278
propriety_total_count: 48911.0
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': '', 'prompt_template': '', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: ChaiML/gpt2_xl_pairwise_89m_step_347634
status: torndown
submission_type: basic
timestamp: 2024-08-02T02:43:50+00:00
us_pacific_date: 2024-08-01
win_ratio: 0.5018757512110432
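The formatter templates above are plain string templates. As a hedged sketch (bot/user names and messages below are made up, and the memory → prompt → turns → response assembly order is an assumption, not confirmed by this page), a final prompt would be composed roughly like this:

```python
def build_prompt(bot_name, memory, prompt, turns, user_name):
    """Assemble a conversation prompt from the formatter template fields.

    Assumed order: memory_template, prompt_template, one bot/user template
    per turn, then response_template as the generation prefix.
    """
    memory_part = f"{bot_name}'s Persona: {memory}\n####\n"   # memory_template
    prompt_part = f"{prompt}\n<START>\n"                       # prompt_template
    body = ""
    for speaker, message in turns:
        if speaker == "bot":
            body += f"{bot_name}: {message}\n"                 # bot_template
        else:
            body += f"{user_name}: {message}\n"                # user_template
    return memory_part + prompt_part + body + f"{bot_name}:"   # response_template

# Hypothetical example conversation:
example = build_prompt(
    bot_name="Nemo",
    memory="A helpful pirate.",
    prompt="A chat on the high seas.",
    turns=[("user", "Ahoy!"), ("bot", "Ahoy, matey!")],
    user_name="You",
)
```

Note that the stopping words ('Bot:', 'User:', 'You:') line up with the `{user_name}:`/`{bot_name}:` turn prefixes, which is what prevents the model from continuing past its own turn.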
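The win_ratio field is consistent with a straight division of the two counters above, as a quick check confirms:

```python
# Reported counters from the submission metadata.
num_battles = 164734
num_wins = 82676

# win_ratio is simply wins over battles.
win_ratio = num_wins / num_battles
# agrees with the reported 0.5018757512110432
```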
Running pipeline stage MKMLizer
Starting job with name mistralai-mistral-nemo-9330-v36-mkmlizer
Waiting for job on mistralai-mistral-nemo-9330-v36-mkmlizer to finish
mistralai-mistral-nemo-9330-v36-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
mistralai-mistral-nemo-9330-v36-mkmlizer: ║ [flywheel ASCII-art logo] ║
mistralai-mistral-nemo-9330-v36-mkmlizer: ║ ║
mistralai-mistral-nemo-9330-v36-mkmlizer: ║ Version: 0.9.9 ║
mistralai-mistral-nemo-9330-v36-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
mistralai-mistral-nemo-9330-v36-mkmlizer: ║ https://mk1.ai ║
mistralai-mistral-nemo-9330-v36-mkmlizer: ║ ║
mistralai-mistral-nemo-9330-v36-mkmlizer: ║ The license key for the current software has been verified as ║
mistralai-mistral-nemo-9330-v36-mkmlizer: ║ belonging to: ║
mistralai-mistral-nemo-9330-v36-mkmlizer: ║ ║
mistralai-mistral-nemo-9330-v36-mkmlizer: ║ Chai Research Corp. ║
mistralai-mistral-nemo-9330-v36-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
mistralai-mistral-nemo-9330-v36-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
mistralai-mistral-nemo-9330-v36-mkmlizer: ║ ║
mistralai-mistral-nemo-9330-v36-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
mistralai-mistral-nemo-9330-v36-mkmlizer: Downloaded to shared memory in 51.391s
mistralai-mistral-nemo-9330-v36-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp1hgk8n59, device:0
mistralai-mistral-nemo-9330-v36-mkmlizer: Saving flywheel model at /dev/shm/model_cache
mistralai-mistral-nemo-9330-v36-mkmlizer: quantized model in 36.057s
mistralai-mistral-nemo-9330-v36-mkmlizer: Processed model mistralai/Mistral-Nemo-Instruct-2407 in 87.448s
mistralai-mistral-nemo-9330-v36-mkmlizer: creating bucket guanaco-mkml-models
mistralai-mistral-nemo-9330-v36-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
mistralai-mistral-nemo-9330-v36-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v36
mistralai-mistral-nemo-9330-v36-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v36/config.json
mistralai-mistral-nemo-9330-v36-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v36/special_tokens_map.json
mistralai-mistral-nemo-9330-v36-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v36/tokenizer_config.json
mistralai-mistral-nemo-9330-v36-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v36/tokenizer.json
mistralai-mistral-nemo-9330-v36-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v36/flywheel_model.0.safetensors
mistralai-mistral-nemo-9330-v36-mkmlizer: loading reward model from ChaiML/gpt2_xl_pairwise_89m_step_347634
mistralai-mistral-nemo-9330-v36-mkmlizer: Loading 0: 100%|█████████▉| 362/363 [00:15<00:00, 28.25it/s]
mistralai-mistral-nemo-9330-v36-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
mistralai-mistral-nemo-9330-v36-mkmlizer: warnings.warn(
mistralai-mistral-nemo-9330-v36-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
mistralai-mistral-nemo-9330-v36-mkmlizer: warnings.warn(
mistralai-mistral-nemo-9330-v36-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
mistralai-mistral-nemo-9330-v36-mkmlizer: warnings.warn(
mistralai-mistral-nemo-9330-v36-mkmlizer: Downloading shards: 100%|██████████| 2/2 [00:08<00:00, 4.31s/it]
mistralai-mistral-nemo-9330-v36-mkmlizer: Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 3.60it/s]
mistralai-mistral-nemo-9330-v36-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
mistralai-mistral-nemo-9330-v36-mkmlizer: Saving duration: 1.337s
mistralai-mistral-nemo-9330-v36-mkmlizer: Processed model ChaiML/gpt2_xl_pairwise_89m_step_347634 in 13.766s
mistralai-mistral-nemo-9330-v36-mkmlizer: creating bucket guanaco-reward-models
mistralai-mistral-nemo-9330-v36-mkmlizer: Bucket 's3://guanaco-reward-models/' created
mistralai-mistral-nemo-9330-v36-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/mistralai-mistral-nemo-9330-v36_reward
mistralai-mistral-nemo-9330-v36-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/mistralai-mistral-nemo-9330-v36_reward/reward.tensors
Job mistralai-mistral-nemo-9330-v36-mkmlizer completed after 134.66s with status: succeeded
Stopping job with name mistralai-mistral-nemo-9330-v36-mkmlizer
Pipeline stage MKMLizer completed in 136.37s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.42s
Running pipeline stage ISVCDeployer
Creating inference service mistralai-mistral-nemo-9330-v36
Waiting for inference service mistralai-mistral-nemo-9330-v36 to be ready
Inference service mistralai-mistral-nemo-9330-v36 ready after 132.29766821861267s
Pipeline stage ISVCDeployer completed in 133.85s
Running pipeline stage StressChecker
HTTP Request: %s %s "%s %d %s"
Received healthy response to inference request in 3.032381057739258s
Received healthy response to inference request in 2.2316031455993652s
Received healthy response to inference request in 2.3702189922332764s
Received healthy response to inference request in 2.2918930053710938s
Received healthy response to inference request in 2.293593168258667s
5 requests
0 failed requests
5th percentile: 2.243661117553711
10th percentile: 2.2557190895080566
20th percentile: 2.279835033416748
30th percentile: 2.292233037948608
40th percentile: 2.2929131031036376
50th percentile: 2.293593168258667
60th percentile: 2.324243497848511
70th percentile: 2.3548938274383544
80th percentile: 2.502651405334473
90th percentile: 2.7675162315368653
95th percentile: 2.8999486446380613
99th percentile: 3.0058945751190187
mean time: 2.443937873840332
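The percentile figures above can be reproduced from the five response times with linear interpolation between closest ranks (numpy-style `linear` method); the pure-Python sketch below recomputes the 5th and 50th percentiles and the mean:

```python
def percentile(data, p):
    """p-th percentile with linear interpolation between closest ranks
    (matches numpy.percentile's default 'linear' method)."""
    xs = sorted(data)
    k = (len(xs) - 1) * p / 100.0
    lo, hi = int(k), min(int(k) + 1, len(xs) - 1)
    return xs[lo] + (k - lo) * (xs[hi] - xs[lo])

# The five healthy response times logged by StressChecker, in seconds.
times = [
    3.032381057739258,
    2.2316031455993652,
    2.3702189922332764,
    2.2918930053710938,
    2.293593168258667,
]

p5 = percentile(times, 5)        # ≈ 2.2437, the logged 5th percentile
p50 = percentile(times, 50)      # ≈ 2.2936, the logged 50th percentile (median)
mean = sum(times) / len(times)   # ≈ 2.4439, the logged mean time
```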
Pipeline stage StressChecker completed in 15.71s
mistralai-mistral-nemo-_9330_v36 status is now deployed due to DeploymentManager action
mistralai-mistral-nemo-_9330_v36 status is now inactive due to auto deactivation (removal of underperforming models)
admin requested tearing down of mistralai-mistral-nemo-_9330_v36
Running pipeline stage ISVCDeleter
Checking if service mistralai-mistral-nemo-9330-v36 is running
Tearing down inference service mistralai-mistral-nemo-9330-v36
Service mistralai-mistral-nemo-9330-v36 has been torn down
Pipeline stage ISVCDeleter completed in 4.49s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key mistralai-mistral-nemo-9330-v36/config.json from bucket guanaco-mkml-models
Deleting key mistralai-mistral-nemo-9330-v36/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key mistralai-mistral-nemo-9330-v36/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key mistralai-mistral-nemo-9330-v36/tokenizer.json from bucket guanaco-mkml-models
Deleting key mistralai-mistral-nemo-9330-v36/tokenizer_config.json from bucket guanaco-mkml-models
Deleting key mistralai-mistral-nemo-9330-v36_reward/config.json from bucket guanaco-reward-models
Deleting key mistralai-mistral-nemo-9330-v36_reward/merges.txt from bucket guanaco-reward-models
Deleting key mistralai-mistral-nemo-9330-v36_reward/reward.tensors from bucket guanaco-reward-models
Deleting key mistralai-mistral-nemo-9330-v36_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key mistralai-mistral-nemo-9330-v36_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key mistralai-mistral-nemo-9330-v36_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key mistralai-mistral-nemo-9330-v36_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 6.15s
mistralai-mistral-nemo-_9330_v36 status is now torndown due to DeploymentManager action

Usage Metrics

Latency Metrics