submission_id: zonemercy-lexical-nemo-v_1518_v6
developer_uid: zonemercy
alignment_samples: 0
best_of: 4
celo_rating: 1240.98
display_name: zonemercy-lexical-nemo-v_1518_v6
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.9, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '</s>', '###', 'Bot:', 'User:', 'You:', '<|im_end|>'], 'max_input_tokens': 1024, 'best_of': 4, 'max_output_tokens': 64, 'reward_max_token_input': 256}
is_internal_developer: True
language_model: zonemercy/Lexical-Nemo-v4-1k1e5
max_input_tokens: 1024
max_output_tokens: 64
model_architecture: MistralForCausalLM
model_group: zonemercy/Lexical-Nemo-v
model_name: zonemercy-lexical-nemo-v_1518_v6
model_num_parameters: 12772070400.0
model_repo: zonemercy/Lexical-Nemo-v4-1k1e5
model_size: 13B
num_battles: 16106
num_wins: 8528
propriety_score: 0.703862660944206
propriety_total_count: 1398.0
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': '', 'prompt_template': '', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: ChaiML/gpt2_xl_pairwise_89m_step_347634
status: torndown
submission_type: basic
timestamp: 2024-08-04T05:43:38+00:00
us_pacific_date: 2024-08-03
win_ratio: 0.5294921147398485
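
The formatter and generation_params fields above are the full serving configuration for this submission: the templates control how a character's persona memory, the scenario prompt, and the chat history are flattened into one prompt string, and the sampling settings (temperature 0.9, top_p 1.0, min_p 0.05, top_k 80, up to 1024 input and 64 output tokens, stopping on newline and the other listed words) control decoding. best_of: 4 suggests four candidate replies are sampled and then ranked, presumably by the reward model listed under reward_repo using the reward_formatter templates. win_ratio is simply num_wins / num_battles (8528 / 16106 ≈ 0.5295). The sketch below shows how templates of this shape could be assembled; build_prompt and its argument names are illustrative assumptions, not the actual serving code.

# Illustrative sketch only: applies the formatter templates recorded above.
# build_prompt and its argument names are assumptions for illustration,
# not the production serving code.

formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, scenario, turns):
    """Flatten persona memory, scenario and chat history into one prompt.

    turns is a list of (speaker, message) pairs with speaker in {"bot", "user"}.
    """
    parts = [
        formatter["memory_template"].format(bot_name=bot_name, memory=memory),
        formatter["prompt_template"].format(prompt=scenario),
    ]
    for speaker, message in turns:
        template = formatter["bot_template"] if speaker == "bot" else formatter["user_template"]
        parts.append(template.format(bot_name=bot_name, user_name=user_name, message=message))
    # The model continues from "{bot_name}:"; decoding stops at any stopping
    # word (e.g. "\n"), so each completion is a single reply line of at most
    # max_output_tokens (64) tokens.
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)

if __name__ == "__main__":
    print(build_prompt(
        bot_name="Nemo",
        user_name="You",
        memory="A terse starship AI.",
        scenario="You wake the ship's AI from standby.",
        turns=[("user", "Status report?"), ("bot", "All systems nominal.")],
    ))
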
Running pipeline stage MKMLizer
Starting job with name zonemercy-lexical-nemo-v-1518-v6-mkmlizer
Waiting for job on zonemercy-lexical-nemo-v-1518-v6-mkmlizer to finish
Stopping job with name zonemercy-lexical-nemo-v-1518-v6-mkmlizer
%s, retrying in %s seconds...
Starting job with name zonemercy-lexical-nemo-v-1518-v6-mkmlizer
Waiting for job on zonemercy-lexical-nemo-v-1518-v6-mkmlizer to finish
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: ║ [flywheel ASCII-art wordmark] ║
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: ║ ║
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: ║ Version: 0.9.9 ║
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: ║ https://mk1.ai ║
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: ║ ║
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: ║ The license key for the current software has been verified as ║
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: ║ belonging to: ║
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: ║ ║
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: ║ Chai Research Corp. ║
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: ║ ║
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: Downloaded to shared memory in 59.506s
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp11_of_bx, device:0
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: Saving flywheel model at /dev/shm/model_cache
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: quantized model in 41.506s
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: Processed model zonemercy/Lexical-Nemo-v4-1k1e5 in 101.013s
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: creating bucket guanaco-mkml-models
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/zonemercy-lexical-nemo-v-1518-v6
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/zonemercy-lexical-nemo-v-1518-v6/config.json
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/zonemercy-lexical-nemo-v-1518-v6/special_tokens_map.json
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/zonemercy-lexical-nemo-v-1518-v6/tokenizer_config.json
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/zonemercy-lexical-nemo-v-1518-v6/tokenizer.json
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/zonemercy-lexical-nemo-v-1518-v6/flywheel_model.0.safetensors
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: loading reward model from ChaiML/gpt2_xl_pairwise_89m_step_347634
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... 98%|█████████▊| 357/363 [00:20<00:01, 4.95it/s]
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: warnings.warn(
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: warnings.warn(
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: warnings.warn(
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: Downloading shards: 0%| | 0/2 [00:00<?, ?it/s] ... 100%|██████████| 2/2 [00:09<00:00, 4.61s/it]
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: Loading checkpoint shards: 0%| | 0/2 [00:00<?, ?it/s] ... 100%|██████████| 2/2 [00:00<00:00, 3.57it/s]
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: Saving duration: 1.379s
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: Processed model ChaiML/gpt2_xl_pairwise_89m_step_347634 in 14.109s
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: creating bucket guanaco-reward-models
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: Bucket 's3://guanaco-reward-models/' created
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/zonemercy-lexical-nemo-v-1518-v6_reward
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/zonemercy-lexical-nemo-v-1518-v6_reward/config.json
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/zonemercy-lexical-nemo-v-1518-v6_reward/special_tokens_map.json
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/zonemercy-lexical-nemo-v-1518-v6_reward/tokenizer_config.json
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/zonemercy-lexical-nemo-v-1518-v6_reward/merges.txt
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/zonemercy-lexical-nemo-v-1518-v6_reward/vocab.json
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/zonemercy-lexical-nemo-v-1518-v6_reward/tokenizer.json
zonemercy-lexical-nemo-v-1518-v6-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/zonemercy-lexical-nemo-v-1518-v6_reward/reward.tensors
Job zonemercy-lexical-nemo-v-1518-v6-mkmlizer completed after 156.53s with status: succeeded
Stopping job with name zonemercy-lexical-nemo-v-1518-v6-mkmlizer
Pipeline stage MKMLizer completed in 158.16s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.12s
Running pipeline stage ISVCDeployer
Creating inference service zonemercy-lexical-nemo-v-1518-v6
Waiting for inference service zonemercy-lexical-nemo-v-1518-v6 to be ready
Inference service zonemercy-lexical-nemo-v-1518-v6 ready after 160.97004199028015s
Pipeline stage ISVCDeployer completed in 163.45s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.376934766769409s
Received healthy response to inference request in 1.560267686843872s
Received healthy response to inference request in 1.5799560546875s
Received healthy response to inference request in 1.5737237930297852s
Received healthy response to inference request in 1.6029043197631836s
5 requests
0 failed requests
5th percentile: 1.5629589080810546
10th percentile: 1.5656501293182372
20th percentile: 1.5710325717926026
30th percentile: 1.5749702453613281
40th percentile: 1.577463150024414
50th percentile: 1.5799560546875
60th percentile: 1.5891353607177734
70th percentile: 1.5983146667480468
80th percentile: 1.7577104091644289
90th percentile: 2.067322587966919
95th percentile: 2.222128677368164
99th percentile: 2.34597354888916
mean time: 1.73875732421875
Pipeline stage StressChecker completed in 9.35s
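
The percentiles reported by the StressChecker above are consistent with linear interpolation over the five sorted response times (the default behaviour of numpy.percentile), and the mean matches the arithmetic mean of the same five values. Below is a minimal sketch that reproduces the reported figures from the latencies in the log; it is a reconstruction under that assumption, not the checker's own code.

# Reproduce the StressChecker summary statistics from the five response
# times logged above, assuming linear-interpolation percentiles
# (numpy.percentile's default method).
import numpy as np

latencies = [
    2.376934766769409,
    1.560267686843872,
    1.5799560546875,
    1.5737237930297852,
    1.6029043197631836,
]

for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {np.percentile(latencies, q)}")
print(f"mean time: {np.mean(latencies)}")

# e.g. the 90th percentile comes out to ~2.0673s and the mean to ~1.7388s,
# matching the log lines above.
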
zonemercy-lexical-nemo-v_1518_v6 status is now deployed due to DeploymentManager action
zonemercy-lexical-nemo-v_1518_v6 status is now inactive due to auto deactivation (removal of underperforming models)
zonemercy-lexical-nemo-v_1518_v6 status is now torndown due to DeploymentManager action
