submission_id: axolotl-ai-co-romulus-mi_7539_v2
developer_uid: Azazelle
best_of: 4
celo_rating: 1176.23
display_name: romulus-mistral-nemo-12b-simpo
family_friendly_score: 0.0
formatter: {'memory_template': "<s>{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt} [/INST]', 'bot_template': '{bot_name}: {message}</s>', 'user_template': '[INST] {user_name}: {message} [/INST]', 'response_template': '{bot_name}:', 'truncate_by_message': False}
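The formatter above is a set of Mistral-instruct style templates ([INST] ... [/INST]) used to assemble the persona, chat history, and next-turn cue into a single prompt string. As a hedged illustration only (the helper below and the exact turn ordering and truncation behaviour are assumptions; the production MKML assembly code is not shown in this log), the fields could be applied like this:

```python
# Illustrative sketch of applying the formatter templates above.
# Assumption: turns are concatenated oldest-first and truncation to
# max_input_tokens (1024) happens elsewhere in the serving stack.
formatter = {
    "memory_template": "<s>{bot_name}'s Persona: {memory}\n\n",
    "prompt_template": "{prompt} [/INST]",
    "bot_template": "{bot_name}: {message}</s>",
    "user_template": "[INST] {user_name}: {message} [/INST]",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, memory, prompt, turns, user_name="You"):
    """turns: list of ("user" | "bot", message) tuples, oldest first. Hypothetical helper."""
    parts = [formatter["memory_template"].format(bot_name=bot_name, memory=memory),
             formatter["prompt_template"].format(prompt=prompt)]
    for speaker, message in turns:
        if speaker == "bot":
            parts.append(formatter["bot_template"].format(bot_name=bot_name, message=message))
        else:
            parts.append(formatter["user_template"].format(user_name=user_name, message=message))
    # End with the response template so the model continues as the bot.
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)
```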
generation_params: {'temperature': 0.7, 'top_p': 1.0, 'min_p': 0.07, 'top_k': 1024, 'presence_penalty': 0.03, 'frequency_penalty': 0.01, 'stopping_words': ['\n', '\n\n', 'You:', '[INST]', '<\\s>'], 'max_input_tokens': 1024, 'best_of': 4, 'max_output_tokens': 64, 'reward_max_token_input': 1024}
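The generation_params are standard temperature/top-p/min-p sampling settings with four candidate completions per request. A minimal sketch of what they correspond to, expressed as vLLM SamplingParams purely for illustration (assumption: the production MKML engine is not vLLM, and reranking of the best_of candidates is done separately by the reward model listed below):

```python
# Illustrative only: the same sampling settings expressed with vLLM.
from vllm import LLM, SamplingParams

sampling = SamplingParams(
    temperature=0.7,
    top_p=1.0,
    min_p=0.07,
    top_k=1024,
    presence_penalty=0.03,
    frequency_penalty=0.01,
    stop=["\n", "\n\n", "You:", "[INST]", "<\\s>"],
    max_tokens=64,  # max_output_tokens
    n=4,            # best_of=4: four candidates are sampled per request
)

# llm = LLM(model="axolotl-ai-co/romulus-mistral-nemo-12b-simpo")
# outputs = llm.generate([prompt], sampling)  # prompt truncated to 1024 input tokens upstream
```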
is_internal_developer: False
language_model: axolotl-ai-co/romulus-mistral-nemo-12b-simpo
max_input_tokens: 1024
max_output_tokens: 64
model_architecture: MistralForCausalLM
model_group: axolotl-ai-co/romulus-mi
model_name: romulus-mistral-nemo-12b-simpo
model_num_parameters: 12772070400.0
model_repo: axolotl-ai-co/romulus-mistral-nemo-12b-simpo
model_size: 13B
num_battles: 12214
num_wins: 5815
ranking_group: single
reward_formatter: {'bot_template': '<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n', 'memory_template': '<|im_start|>system\n{memory}<|im_end|>\n', 'prompt_template': '<|im_start|>user\n{prompt}<|im_end|>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '<|im_start|>user\n{user_name}: {message}<|im_end|>\n'}
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
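The reward_formatter wraps the same conversation in ChatML markup for the reward model, which ranks the best_of candidates. A heavily hedged sketch of that reranking step, assuming the checkpoint loads as a GPT-2 sequence-classification model with a single preference logit (the actual scoring head, pooling, and serving code are not visible in this log):

```python
# Hedged sketch: score best_of candidates with the reward model and keep the top one.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

reward_repo = "ChaiML/reward_gpt2_medium_preference_24m_e2"
tok = AutoTokenizer.from_pretrained(reward_repo)
reward_model = AutoModelForSequenceClassification.from_pretrained(reward_repo).eval()

def pick_best(context_chatml: str, bot_name: str, candidates: list[str]) -> str:
    """context_chatml: conversation already rendered with the reward_formatter templates."""
    best_reply, best_score = None, float("-inf")
    for reply in candidates:
        text = context_chatml + f"<|im_start|>assistant\n{bot_name}: {reply}<|im_end|>\n"
        inputs = tok(text, return_tensors="pt", truncation=True, max_length=1024)
        with torch.no_grad():
            score = reward_model(**inputs).logits[0, 0].item()  # assumes one preference logit
        if score > best_score:
            best_reply, best_score = reply, score
    return best_reply
```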
status: torndown
submission_type: basic
timestamp: 2024-07-29T20:35:05+00:00
us_pacific_date: 2024-07-29
win_ratio: 0.4760930080235795
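The win_ratio is simply num_wins divided by num_battles:

```python
num_wins, num_battles = 5815, 12214
print(num_wins / num_battles)  # 0.4760930080235795
```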
Running pipeline stage MKMLizer
Starting job with name axolotl-ai-co-romulus-mi-7539-v2-mkmlizer
Waiting for job on axolotl-ai-co-romulus-mi-7539-v2-mkmlizer to finish
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: ══════════════════ flywheel ══════════════════
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: Version: 0.9.7
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: Copyright 2023 MK ONE TECHNOLOGIES Inc.
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: https://mk1.ai
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: The license key for the current software has been verified as belonging to:
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: Chai Research Corp.
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: Expiration: 2024-10-15 23:59:59
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: ══════════════════════════════════════════════
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: quantized model in 35.446s
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: Processed model axolotl-ai-co/romulus-mistral-nemo-12b-simpo in 70.645s
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: creating bucket guanaco-mkml-models
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/axolotl-ai-co-romulus-mi-7539-v2
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/axolotl-ai-co-romulus-mi-7539-v2/special_tokens_map.json
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/axolotl-ai-co-romulus-mi-7539-v2/config.json
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/axolotl-ai-co-romulus-mi-7539-v2/tokenizer_config.json
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/axolotl-ai-co-romulus-mi-7539-v2/tokenizer.json
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/axolotl-ai-co-romulus-mi-7539-v2/flywheel_model.0.safetensors
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] … Loading 0: 100%|█████████▉| 362/363 [00:15<00:00, 28.46it/s]
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: warnings.warn(
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: warnings.warn(
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: warnings.warn(
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: Saving duration: 0.336s
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 6.997s
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: creating bucket guanaco-reward-models
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: Bucket 's3://guanaco-reward-models/' created
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/axolotl-ai-co-romulus-mi-7539-v2_reward
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/axolotl-ai-co-romulus-mi-7539-v2_reward/config.json
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/axolotl-ai-co-romulus-mi-7539-v2_reward/special_tokens_map.json
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/axolotl-ai-co-romulus-mi-7539-v2_reward/tokenizer_config.json
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/axolotl-ai-co-romulus-mi-7539-v2_reward/merges.txt
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/axolotl-ai-co-romulus-mi-7539-v2_reward/vocab.json
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/axolotl-ai-co-romulus-mi-7539-v2_reward/tokenizer.json
axolotl-ai-co-romulus-mi-7539-v2-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/axolotl-ai-co-romulus-mi-7539-v2_reward/reward.tensors
Job axolotl-ai-co-romulus-mi-7539-v2-mkmlizer completed after 104.86s with status: succeeded
Stopping job with name axolotl-ai-co-romulus-mi-7539-v2-mkmlizer
Pipeline stage MKMLizer completed in 106.38s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.11s
Running pipeline stage ISVCDeployer
Creating inference service axolotl-ai-co-romulus-mi-7539-v2
Waiting for inference service axolotl-ai-co-romulus-mi-7539-v2 to be ready
Inference service axolotl-ai-co-romulus-mi-7539-v2 ready after 111.01765036582947s
Pipeline stage ISVCDeployer completed in 112.76s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.467298746109009s
Received healthy response to inference request in 1.2100672721862793s
Received healthy response to inference request in 1.6160774230957031s
Received healthy response to inference request in 1.2232816219329834s
Received healthy response to inference request in 1.2720263004302979s
5 requests
0 failed requests
5th percentile: 1.2127101421356201
10th percentile: 1.215353012084961
20th percentile: 1.2206387519836426
30th percentile: 1.2330305576324463
40th percentile: 1.252528429031372
50th percentile: 1.2720263004302979
60th percentile: 1.40964674949646
70th percentile: 1.547267198562622
80th percentile: 1.7863216876983645
90th percentile: 2.1268102169036864
95th percentile: 2.2970544815063474
99th percentile: 2.4332498931884765
mean time: 1.5577502727508545
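The percentile summary above can be reproduced from the five healthy response times, assuming linear interpolation between order statistics (numpy's default):

```python
# Recomputing the StressChecker summary from the five latencies logged above.
import numpy as np

latencies = [2.467298746109009, 1.2100672721862793, 1.6160774230957031,
             1.2232816219329834, 1.2720263004302979]
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print("mean time:", np.mean(latencies))  # 1.5577502727508545
```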
Pipeline stage StressChecker completed in 8.49s
axolotl-ai-co-romulus-mi_7539_v2 status is now deployed due to DeploymentManager action
axolotl-ai-co-romulus-mi_7539_v2 status is now inactive due to auto deactivation of underperforming models
admin requested tearing down of axolotl-ai-co-romulus-mi_7539_v2
Running pipeline stage ISVCDeleter
Checking if service axolotl-ai-co-romulus-mi-7539-v2 is running
Tearing down inference service axolotl-ai-co-romulus-mi-7539-v2
Service axolotl-ai-co-romulus-mi-7539-v2 has been torndown
Pipeline stage ISVCDeleter completed in 4.81s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key axolotl-ai-co-romulus-mi-7539-v2/config.json from bucket guanaco-mkml-models
Deleting key axolotl-ai-co-romulus-mi-7539-v2/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key axolotl-ai-co-romulus-mi-7539-v2/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key axolotl-ai-co-romulus-mi-7539-v2/tokenizer.json from bucket guanaco-mkml-models
Deleting key axolotl-ai-co-romulus-mi-7539-v2/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key axolotl-ai-co-romulus-mi-7539-v2_reward/config.json from bucket guanaco-reward-models
Deleting key axolotl-ai-co-romulus-mi-7539-v2_reward/merges.txt from bucket guanaco-reward-models
Deleting key axolotl-ai-co-romulus-mi-7539-v2_reward/reward.tensors from bucket guanaco-reward-models
Deleting key axolotl-ai-co-romulus-mi-7539-v2_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key axolotl-ai-co-romulus-mi-7539-v2_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key axolotl-ai-co-romulus-mi-7539-v2_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key axolotl-ai-co-romulus-mi-7539-v2_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 6.61s
axolotl-ai-co-romulus-mi_7539_v2 status is now torndown due to DeploymentManager action