submission_id: zonemercy-acute-nemo-v1-_4488_v3
developer_uid: zonemercy
alignment_samples: 9499
alignment_score: 6.09211067246004
best_of: 16
celo_rating: 1253.1
display_name: zonemercy-acute-nemo-v1-_4488_v3
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.95, 'top_p': 1.0, 'min_p': 0.08, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '</s>', '###', 'Bot:', 'User:', 'You:', '<|im_end|>'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64, 'reward_max_token_input': 256}
is_internal_developer: True
language_model: zonemercy/Acute-Nemo-v1-1e5ep1
max_input_tokens: 512
max_output_tokens: 64
model_architecture: MistralForCausalLM
model_group: zonemercy/Acute-Nemo-v1-
model_name: zonemercy-acute-nemo-v1-_4488_v3
model_num_parameters: 12772070400.0
model_repo: zonemercy/Acute-Nemo-v1-1e5ep1
model_size: 13B
num_battles: 9499
num_wins: 5118
propriety_score: 0.7129714811407544
propriety_total_count: 1087.0
ranking_group: single
reward_formatter: {'bot_template': 'Bot: {message}\n', 'memory_template': '', 'prompt_template': '', 'response_template': 'Bot:', 'truncate_by_message': False, 'user_template': 'User: {message}\n'}
reward_repo: ChaiML/gpt2_xl_pairwise_89m_step_347634
status: torndown
submission_type: basic
timestamp: 2024-08-15T06:45:47+00:00
us_pacific_date: 2024-08-14
win_ratio: 0.5387935572165491
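Two of the fields above can be checked or illustrated directly: win_ratio is simply num_wins / num_battles, and the formatter entry describes how a conversation is assembled into a prompt. A minimal Python sketch follows (the example conversation is hypothetical; only the templates and counts come from this page, and the real serving code may differ in details such as truncation):

    formatter = {
        "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
        "prompt_template": "{prompt}\n<START>\n",
        "bot_template": "{bot_name}: {message}\n",
        "user_template": "{user_name}: {message}\n",
        "response_template": "{bot_name}:",
    }

    def build_prompt(bot_name, memory, prompt, turns, user_name="User"):
        # Assemble persona, scenario prompt, prior turns, and the response cue
        # in the order the templates suggest (an assumption about ordering).
        text = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
        text += formatter["prompt_template"].format(prompt=prompt)
        for speaker, message in turns:
            if speaker == "bot":
                text += formatter["bot_template"].format(bot_name=bot_name, message=message)
            else:
                text += formatter["user_template"].format(user_name=user_name, message=message)
        text += formatter["response_template"].format(bot_name=bot_name)
        return text

    # Hypothetical conversation, just to show the assembled prompt shape.
    print(build_prompt("Nemo", "A curious explorer.", "A chat aboard a ship.",
                       [("user", "Hello!"), ("bot", "Ahoy!"), ("user", "Where to next?")]))

    # Sanity check of the reported win ratio: num_wins / num_battles.
    assert abs(5118 / 9499 - 0.5387935572165491) < 1e-12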
Running pipeline stage MKMLizer
Starting job with name zonemercy-acute-nemo-v1-4488-v3-mkmlizer
Waiting for job on zonemercy-acute-nemo-v1-4488-v3-mkmlizer to finish
Stopping job with name zonemercy-acute-nemo-v1-4488-v3-mkmlizer
%s, retrying in %s seconds...
Starting job with name zonemercy-acute-nemo-v1-4488-v3-mkmlizer
Waiting for job on zonemercy-acute-nemo-v1-4488-v3-mkmlizer to finish
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ║ _____ __ __ ║
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ║ /___/ ║
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ║ ║
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ║ Version: 0.9.9 ║
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ║ https://mk1.ai ║
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ║ ║
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ║ The license key for the current software has been verified as ║
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ║ belonging to: ║
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ║ ║
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ║ Chai Research Corp. ║
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ║ ║
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
Failed to get response for submission chaiml-llama-8b-pairwise_8189_v2:
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: Downloaded to shared memory in 66.467s
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp0ygedlo_, device:0
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Failed to get response for submission chaiml-llama-8b-pairwise_8189_v2:
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: quantized model in 40.333s
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: Processed model zonemercy/Acute-Nemo-v1-1e5ep1 in 106.800s
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: creating bucket guanaco-mkml-models
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/zonemercy-acute-nemo-v1-4488-v3
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/zonemercy-acute-nemo-v1-4488-v3/config.json
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/zonemercy-acute-nemo-v1-4488-v3/special_tokens_map.json
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/zonemercy-acute-nemo-v1-4488-v3/tokenizer_config.json
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/zonemercy-acute-nemo-v1-4488-v3/tokenizer.json
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/zonemercy-acute-nemo-v1-4488-v3/flywheel_model.0.safetensors
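The cp lines above stage each artifact from /dev/shm/model_cache into the guanaco-mkml-models bucket. The log does not show which tool performs the copy; a rough boto3 equivalent, assuming AWS-compatible credentials are already configured, would be:

    import os
    import boto3

    def upload_dir(local_dir, bucket, prefix):
        # Mirror every file under local_dir to s3://bucket/prefix/, preserving
        # relative paths (config.json, tokenizer.json, flywheel_model.0.safetensors, ...).
        s3 = boto3.client("s3")
        for root, _, files in os.walk(local_dir):
            for name in files:
                path = os.path.join(root, name)
                key = os.path.join(prefix, os.path.relpath(path, local_dir))
                s3.upload_file(path, bucket, key)

    upload_dir("/dev/shm/model_cache", "guanaco-mkml-models", "zonemercy-acute-nemo-v1-4488-v3")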
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: loading reward model from ChaiML/gpt2_xl_pairwise_89m_step_347634
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... Loading 0: 98%|█████████▊| 357/363 [00:19<00:01, 5.04it/s]
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: warnings.warn(
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: warnings.warn(
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: warnings.warn(
Failed to get response for submission chaiml-llama-8b-pairwise_8189_v2:
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: Downloading shards: 100%|██████████| 2/2 [00:08<00:00, 4.03s/it]
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 3.56it/s]
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: Saving duration: 1.308s
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: Processed model ChaiML/gpt2_xl_pairwise_89m_step_347634 in 12.781s
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: creating bucket guanaco-reward-models
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: Bucket 's3://guanaco-reward-models/' created
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/zonemercy-acute-nemo-v1-4488-v3_reward
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/zonemercy-acute-nemo-v1-4488-v3_reward/config.json
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/zonemercy-acute-nemo-v1-4488-v3_reward/special_tokens_map.json
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/zonemercy-acute-nemo-v1-4488-v3_reward/tokenizer_config.json
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/zonemercy-acute-nemo-v1-4488-v3_reward/merges.txt
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/zonemercy-acute-nemo-v1-4488-v3_reward/vocab.json
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/zonemercy-acute-nemo-v1-4488-v3_reward/tokenizer.json
zonemercy-acute-nemo-v1-4488-v3-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/zonemercy-acute-nemo-v1-4488-v3_reward/reward.tensors
Job zonemercy-acute-nemo-v1-4488-v3-mkmlizer completed after 156.0s with status: succeeded
Stopping job with name zonemercy-acute-nemo-v1-4488-v3-mkmlizer
Pipeline stage MKMLizer completed in 157.20s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.09s
Running pipeline stage ISVCDeployer
Creating inference service zonemercy-acute-nemo-v1-4488-v3
Waiting for inference service zonemercy-acute-nemo-v1-4488-v3 to be ready
Failed to get response for submission chaiml-llama-8b-pairwise_8189_v2:
Failed to get response for submission zonemercy-acute-nemo-v1-_7579_v1: ('http://zonemercy-acute-nemo-v1-7579-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'activator request timeout')
Inference service zonemercy-acute-nemo-v1-4488-v3 ready after 221.41666674613953s
Pipeline stage ISVCDeployer completed in 222.20s
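Once the inference service is ready it is reachable through a KServe-style v1 :predict endpoint; the URL pattern is visible in the failure messages above for another submission. A hedged sketch of such a request against this submission's predictor (the payload schema is an assumption; only the URL pattern comes from the log):

    import requests

    # URL pattern taken from the log above, with this submission's name substituted.
    url = ("http://zonemercy-acute-nemo-v1-4488-v3-predictor"
           ".tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict")
    # Illustrative body only; the stress checker's real payload is not shown in the log.
    payload = {"text": "Nemo's Persona: ...\n####\n<START>\nUser: Hi\nNemo:"}

    resp = requests.post(url, json=payload, timeout=30)
    resp.raise_for_status()
    print(resp.elapsed.total_seconds(), "s")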
Running pipeline stage StressChecker
Received healthy response to inference request in 2.697028160095215s
Received healthy response to inference request in 1.8225069046020508s
Received healthy response to inference request in 1.767345905303955s
Failed to get response for submission zonemercy-acute-nemo-v1-_7579_v1: ('http://zonemercy-acute-nemo-v1-7579-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'EOF\n')
Received healthy response to inference request in 1.764761209487915s
Received healthy response to inference request in 1.7829594612121582s
5 requests
0 failed requests
5th percentile: 1.765278148651123
10th percentile: 1.7657950878143311
20th percentile: 1.766828966140747
30th percentile: 1.7704686164855956
40th percentile: 1.776714038848877
50th percentile: 1.7829594612121582
60th percentile: 1.7987784385681151
70th percentile: 1.8145974159240723
80th percentile: 1.9974111557006837
90th percentile: 2.3472196578979494
95th percentile: 2.5221239089965817
99th percentile: 2.662047309875488
mean time: 1.9669203281402587
Pipeline stage StressChecker completed in 10.65s
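The latency summary above can be reproduced from the five healthy response times; NumPy's default linear-interpolation percentiles give the same values (a sanity-check sketch, not the stress checker's own code):

    import numpy as np

    latencies = [2.697028160095215, 1.8225069046020508, 1.767345905303955,
                 1.764761209487915, 1.7829594612121582]

    # Matches the reported mean and percentiles up to floating-point rounding.
    print("mean time:", np.mean(latencies))
    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f"{p}th percentile:", np.percentile(latencies, p))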
zonemercy-acute-nemo-v1-_4488_v3 status is now deployed due to DeploymentManager action
Failed to get response for submission blend_goner_2024-08-03: ('http://chaiml-elo-alignment-run-3-v19-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:52018->127.0.0.1:8080: read: connection reset by peer\n')
zonemercy-acute-nemo-v1-_4488_v3 status is now inactive due to auto deactivation (removed underperforming models)
zonemercy-acute-nemo-v1-_4488_v3 status is now torndown due to DeploymentManager action

Usage Metrics

Latency Metrics