submission_id: mistralai-mistral-nemo-_9330_v40
developer_uid: chai_backend_admin
best_of: 2
celo_rating: 1227.23
display_name: mistralai-mistral-nemo-_9330_v40
family_friendly_score: 0.0
formatter: {'memory_template': '### Instruction:\n{memory}\n', 'prompt_template': '### Input:\n{prompt}\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '### Response:\n{bot_name}:', 'truncate_by_message': True}
generation_params: {'temperature': 0.9, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '</s>', '###', 'Bot:', 'User:', 'You:', '<|im_end|>'], 'max_input_tokens': 1500, 'best_of': 2, 'max_output_tokens': 128, 'reward_max_token_input': 256}
ineligible_reason: max_output_tokens!=64
is_internal_developer: True
language_model: mistralai/Mistral-Nemo-Instruct-2407
max_input_tokens: 1500
max_output_tokens: 128
model_architecture: MistralForCausalLM
model_group: mistralai/Mistral-Nemo-I
model_name: mistralai-mistral-nemo-_9330_v40
model_num_parameters: 12772070400.0
model_repo: mistralai/Mistral-Nemo-Instruct-2407
model_size: 13B
num_battles: 12502
num_wins: 6383
ranking_group: single
reward_repo: ChaiML/gpt2_xl_pairwise_89m_step_347634
status: torndown
submission_type: basic
timestamp: 2024-08-08T01:00:33+00:00
us_pacific_date: 2024-08-07
win_ratio: 0.5105583106702928
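The formatter above defines how the bot's memory, the scenario prompt, the chat history, and the response prefix are stitched into a single prompt string. Below is a minimal sketch of applying those templates; the build_prompt helper and the sample conversation are illustrative only, and the production formatter (including its truncate_by_message handling) is not shown in this log.

# Illustrative sketch of the formatter templates listed in the metadata above.
FORMATTER = {
    "memory_template": "### Instruction:\n{memory}\n",
    "prompt_template": "### Input:\n{prompt}\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "### Response:\n{bot_name}:",
}

def build_prompt(memory, prompt, turns, bot_name, user_name):
    """Assemble: memory, scenario prompt, chat turns, then the response prefix."""
    parts = [
        FORMATTER["memory_template"].format(memory=memory),
        FORMATTER["prompt_template"].format(prompt=prompt),
    ]
    for speaker, message in turns:
        if speaker == "bot":
            parts.append(FORMATTER["bot_template"].format(bot_name=bot_name, message=message))
        else:
            parts.append(FORMATTER["user_template"].format(user_name=user_name, message=message))
    parts.append(FORMATTER["response_template"].format(bot_name=bot_name))
    return "".join(parts)

print(build_prompt(
    memory="You are a friendly companion.",
    prompt="A casual chat.",
    turns=[("user", "Hi there!"), ("bot", "Hello!"), ("user", "How are you?")],
    bot_name="Bot",
    user_name="User",
))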
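The generation_params control token sampling: logits are scaled by temperature, and candidate tokens are filtered by top_k, min_p, and top_p (nucleus) before one is drawn. The rough numpy sketch below shows what those filters do; the serving engine's actual implementation and filter order are not shown in this log.

import numpy as np

# Values mirror generation_params above; the real engine's filter order is an assumption.
TEMPERATURE, TOP_K, TOP_P, MIN_P = 0.9, 80, 1.0, 0.05

def sample_token(logits, rng):
    # Temperature scaling followed by a numerically stable softmax.
    scaled = logits / TEMPERATURE
    probs = np.exp(scaled - np.max(scaled))
    probs /= probs.sum()

    # top_k: keep only the k most probable tokens.
    if 0 < TOP_K < len(probs):
        probs[probs < np.sort(probs)[-TOP_K]] = 0.0

    # min_p: drop tokens whose probability is below min_p * (max probability).
    probs[probs < MIN_P * probs.max()] = 0.0

    # top_p (nucleus): keep the smallest set of tokens covering top_p of the remaining mass.
    order = np.argsort(probs)[::-1]
    cumulative = np.cumsum(probs[order])
    cut = np.searchsorted(cumulative, TOP_P * probs.sum()) + 1
    probs[order[cut:]] = 0.0

    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

rng = np.random.default_rng(0)
print(sample_token(np.array([2.0, 1.0, 0.5, -1.0]), rng))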
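The win_ratio field is simply num_wins divided by num_battles, as a quick check confirms:

num_wins, num_battles = 6383, 12502
print(num_wins / num_battles)  # 0.5105583106702928, matching win_ratio above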
Running pipeline stage MKMLizer
Starting job with name mistralai-mistral-nemo-9330-v40-mkmlizer
Waiting for job on mistralai-mistral-nemo-9330-v40-mkmlizer to finish
mistralai-mistral-nemo-9330-v40-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
mistralai-mistral-nemo-9330-v40-mkmlizer: ║ [flywheel ASCII-art logo] ║
mistralai-mistral-nemo-9330-v40-mkmlizer: ║ ║
mistralai-mistral-nemo-9330-v40-mkmlizer: ║ Version: 0.9.9 ║
mistralai-mistral-nemo-9330-v40-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
mistralai-mistral-nemo-9330-v40-mkmlizer: ║ https://mk1.ai ║
mistralai-mistral-nemo-9330-v40-mkmlizer: ║ ║
mistralai-mistral-nemo-9330-v40-mkmlizer: ║ The license key for the current software has been verified as ║
mistralai-mistral-nemo-9330-v40-mkmlizer: ║ belonging to: ║
mistralai-mistral-nemo-9330-v40-mkmlizer: ║ ║
mistralai-mistral-nemo-9330-v40-mkmlizer: ║ Chai Research Corp. ║
mistralai-mistral-nemo-9330-v40-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
mistralai-mistral-nemo-9330-v40-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
mistralai-mistral-nemo-9330-v40-mkmlizer: ║ ║
mistralai-mistral-nemo-9330-v40-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
mistralai-mistral-nemo-9330-v40-mkmlizer: Downloaded to shared memory in 53.370s
mistralai-mistral-nemo-9330-v40-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp13h3olh8, device:0
mistralai-mistral-nemo-9330-v40-mkmlizer: Saving flywheel model at /dev/shm/model_cache
mistralai-mistral-nemo-9330-v40-mkmlizer: quantized model in 36.672s
mistralai-mistral-nemo-9330-v40-mkmlizer: Processed model mistralai/Mistral-Nemo-Instruct-2407 in 90.042s
mistralai-mistral-nemo-9330-v40-mkmlizer: creating bucket guanaco-mkml-models
mistralai-mistral-nemo-9330-v40-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
mistralai-mistral-nemo-9330-v40-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v40
mistralai-mistral-nemo-9330-v40-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v40/special_tokens_map.json
mistralai-mistral-nemo-9330-v40-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v40/config.json
mistralai-mistral-nemo-9330-v40-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v40/tokenizer_config.json
mistralai-mistral-nemo-9330-v40-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v40/tokenizer.json
mistralai-mistral-nemo-9330-v40-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v40/flywheel_model.0.safetensors
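The lines above copy each serialized model file from /dev/shm/model_cache into the destination bucket. A hedged boto3 sketch of the same per-file upload pattern follows; the upload_directory helper is illustrative and the mkmlizer's actual upload tooling is not shown in this log.

import os
import boto3

# Illustrative only: mirrors the per-file `cp ... s3://` pattern in the log above.
def upload_directory(local_dir, bucket, prefix):
    s3 = boto3.client("s3")
    for root, _dirs, files in os.walk(local_dir):
        for name in files:
            local_path = os.path.join(root, name)
            key = os.path.join(prefix, os.path.relpath(local_path, local_dir))
            s3.upload_file(local_path, bucket, key)

upload_directory("/dev/shm/model_cache", "guanaco-mkml-models", "mistralai-mistral-nemo-9330-v40")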
mistralai-mistral-nemo-9330-v40-mkmlizer: loading reward model from ChaiML/gpt2_xl_pairwise_89m_step_347634
mistralai-mistral-nemo-9330-v40-mkmlizer: Loading 0: 100%|█████████▉| 362/363 [00:15<00:00, 32.22it/s]
mistralai-mistral-nemo-9330-v40-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
mistralai-mistral-nemo-9330-v40-mkmlizer: warnings.warn(
mistralai-mistral-nemo-9330-v40-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
mistralai-mistral-nemo-9330-v40-mkmlizer: warnings.warn(
mistralai-mistral-nemo-9330-v40-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
mistralai-mistral-nemo-9330-v40-mkmlizer: warnings.warn(
mistralai-mistral-nemo-9330-v40-mkmlizer: Downloading shards: 100%|██████████| 2/2 [00:09<00:00, 4.55s/it]
mistralai-mistral-nemo-9330-v40-mkmlizer: Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 3.54it/s]
mistralai-mistral-nemo-9330-v40-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
mistralai-mistral-nemo-9330-v40-mkmlizer: Saving duration: 1.335s
mistralai-mistral-nemo-9330-v40-mkmlizer: Processed model ChaiML/gpt2_xl_pairwise_89m_step_347634 in 14.116s
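The reward model above is used together with best_of=2: two candidate completions are generated per turn and the higher-scoring one is returned. The transformers sketch below shows that reranking step under stated assumptions (a single-logit sequence-classification head and this scoring convention); the log does not show how the reward model is actually invoked in production.

import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumption: the reward model loads as a sequence-classification model with one scalar logit.
REWARD_REPO = "ChaiML/gpt2_xl_pairwise_89m_step_347634"
tokenizer = AutoTokenizer.from_pretrained(REWARD_REPO)
reward_model = AutoModelForSequenceClassification.from_pretrained(REWARD_REPO)
reward_model.eval()

def score(conversation: str) -> float:
    # reward_max_token_input is 256 in the generation_params above.
    inputs = tokenizer(conversation, truncation=True, max_length=256, return_tensors="pt")
    with torch.no_grad():
        return reward_model(**inputs).logits.squeeze().item()

# best_of=2: generate two candidates, keep the one the reward model prefers.
candidates = ["Bot: Hi! How was your day?", "Bot: Hello."]
print(max(candidates, key=score))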
mistralai-mistral-nemo-9330-v40-mkmlizer: creating bucket guanaco-reward-models
mistralai-mistral-nemo-9330-v40-mkmlizer: Bucket 's3://guanaco-reward-models/' created
mistralai-mistral-nemo-9330-v40-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/mistralai-mistral-nemo-9330-v40_reward
mistralai-mistral-nemo-9330-v40-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/mistralai-mistral-nemo-9330-v40_reward/config.json
mistralai-mistral-nemo-9330-v40-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/mistralai-mistral-nemo-9330-v40_reward/tokenizer_config.json
mistralai-mistral-nemo-9330-v40-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/mistralai-mistral-nemo-9330-v40_reward/special_tokens_map.json
mistralai-mistral-nemo-9330-v40-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/mistralai-mistral-nemo-9330-v40_reward/merges.txt
mistralai-mistral-nemo-9330-v40-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/mistralai-mistral-nemo-9330-v40_reward/vocab.json
mistralai-mistral-nemo-9330-v40-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/mistralai-mistral-nemo-9330-v40_reward/tokenizer.json
mistralai-mistral-nemo-9330-v40-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/mistralai-mistral-nemo-9330-v40_reward/reward.tensors
Job mistralai-mistral-nemo-9330-v40-mkmlizer completed after 146.2s with status: succeeded
Stopping job with name mistralai-mistral-nemo-9330-v40-mkmlizer
Pipeline stage MKMLizer completed in 147.47s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.09s
Running pipeline stage ISVCDeployer
Creating inference service mistralai-mistral-nemo-9330-v40
Waiting for inference service mistralai-mistral-nemo-9330-v40 to be ready
Failed to get response for submission chaiml-elo-alignment-run-3_v17: ('http://chaiml-elo-alignment-run-3-v17-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:45034->127.0.0.1:8080: read: connection reset by peer\n')
Connection pool is full, discarding connection: %s. Connection pool size: %s
Inference service mistralai-mistral-nemo-9330-v40 ready after 191.15431571006775s
Pipeline stage ISVCDeployer completed in 192.80s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.2895562648773193s
Received healthy response to inference request in 2.060448169708252s
Received healthy response to inference request in 1.2911851406097412s
Received healthy response to inference request in 1.3700611591339111s
Received healthy response to inference request in 2.264594078063965s
5 requests
0 failed requests
5th percentile: 1.3069603443145752
10th percentile: 1.3227355480194092
20th percentile: 1.3542859554290771
30th percentile: 1.5081385612487792
40th percentile: 1.7842933654785158
50th percentile: 2.060448169708252
60th percentile: 2.142106533050537
70th percentile: 2.2237648963928223
80th percentile: 2.269586515426636
90th percentile: 2.2795713901519776
95th percentile: 2.2845638275146483
99th percentile: 2.288557777404785
mean time: 1.8551689624786376
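The percentile and mean figures above follow directly from the five per-request latencies, assuming numpy's default linear interpolation; a quick reproduction:

import numpy as np

# The five per-request latencies (seconds) reported by the stress check above.
latencies = [2.2895562648773193, 2.060448169708252, 1.2911851406097412,
             1.3700611591339111, 2.264594078063965]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print("mean time:", np.mean(latencies))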
Pipeline stage StressChecker completed in 10.06s
mistralai-mistral-nemo-_9330_v40 status is now deployed due to DeploymentManager action
mistralai-mistral-nemo-_9330_v40 status is now inactive due to auto deactivation (removal of underperforming models)
mistralai-mistral-nemo-_9330_v40 status is now torndown due to DeploymentManager action