submission_id: zonemercy-random-user-l3-ep2_v2
developer_uid: zonemercy
alignment_samples: 0
best_of: 1
celo_rating: 1123.31
display_name: zonemercy-random-user-l3-ep2_v2
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
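The formatter above defines Llama-3 chat-style templates for the persona memory, the scenario prompt, and each conversation turn. A minimal sketch of how such templates might be assembled into a full model input (the `build_prompt` helper and the assembly order are assumptions for illustration, not the serving stack's actual code):

```python
formatter = {
    'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n",
    'prompt_template': '{prompt}<|eot_id|>',
    'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>',
    'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>',
    'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:',
}

def build_prompt(memory, prompt, turns, bot_name="Bot", user_name="User"):
    # Hypothetical assembly: persona + prompt header, then alternating turns,
    # ending with the response template that cues the assistant to speak.
    out = formatter['memory_template'].format(bot_name=bot_name, memory=memory)
    out += formatter['prompt_template'].format(prompt=prompt)
    for speaker, message in turns:
        tpl = formatter['user_template'] if speaker == 'user' else formatter['bot_template']
        out += tpl.format(user_name=user_name, bot_name=bot_name, message=message)
    return out + formatter['response_template'].format(bot_name=bot_name)
```

The trailing `response_template` leaves the prompt open-ended after `{bot_name}:`, so the model's continuation is the bot's next message.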
generation_params: {'temperature': 0.95, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|eot_id|>'], 'max_input_tokens': 512, 'best_of': 1, 'max_output_tokens': 64, 'reward_max_token_input': 256}
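These generation params combine temperature scaling with top_k, top_p, and min_p truncation before sampling. A rough pure-Python sketch of how those filters could interact on a toy distribution (illustrative only; the real sampler operates on GPU logits, and with `top_p: 1.0` nucleus filtering is effectively disabled):

```python
import math

def sample_filter(logits, temperature=0.95, top_k=80, min_p=0.05):
    # Softmax at the configured temperature
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # top_k: keep only the k most probable tokens
    kth = sorted(probs, reverse=True)[min(top_k, len(probs)) - 1]
    # min_p: drop tokens below min_p * (max probability)
    cutoff = min_p * max(probs)
    kept = [p if p >= max(kth, cutoff) else 0.0 for p in probs]
    # Renormalize the surviving probability mass
    z = sum(kept)
    return [p / z for p in kept]
```

With `min_p: 0.05`, a token survives only if it is at least 5% as likely as the single most likely token, which prunes the long tail more adaptively than a fixed top_k.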
is_internal_developer: True
language_model: zonemercy/Random-User-l3-ep2
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: zonemercy/Random-User-l3
model_name: zonemercy-random-user-l3-ep2_v2
model_num_parameters: 8030261248.0
model_repo: zonemercy/Random-User-l3-ep2
model_size: 8B
num_battles: 12713
num_wins: 5332
propriety_score: 0.7646528403967539
propriety_total_count: 1109.0
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: ChaiML/gpt2_xl_pairwise_89m_step_347634
status: torndown
submission_type: basic
timestamp: 2024-07-29T19:44:42+00:00
us_pacific_date: 2024-07-29
win_ratio: 0.4194131990875482
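The reported win_ratio follows directly from the battle counts above:

```python
num_battles = 12713
num_wins = 5332
print(num_wins / num_battles)  # 0.4194131990875482
```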
Running pipeline stage MKMLizer
Starting job with name zonemercy-random-user-l3-ep2-v2-mkmlizer
Waiting for job on zonemercy-random-user-l3-ep2-v2-mkmlizer to finish
zonemercy-random-user-l3-ep2-v2-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
zonemercy-random-user-l3-ep2-v2-mkmlizer: ║ _____ __ __ ║
zonemercy-random-user-l3-ep2-v2-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
zonemercy-random-user-l3-ep2-v2-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
zonemercy-random-user-l3-ep2-v2-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
zonemercy-random-user-l3-ep2-v2-mkmlizer: ║ /___/ ║
zonemercy-random-user-l3-ep2-v2-mkmlizer: ║ ║
zonemercy-random-user-l3-ep2-v2-mkmlizer: ║ Version: 0.9.7 ║
zonemercy-random-user-l3-ep2-v2-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
zonemercy-random-user-l3-ep2-v2-mkmlizer: ║ https://mk1.ai ║
zonemercy-random-user-l3-ep2-v2-mkmlizer: ║ ║
zonemercy-random-user-l3-ep2-v2-mkmlizer: ║ The license key for the current software has been verified as ║
zonemercy-random-user-l3-ep2-v2-mkmlizer: ║ belonging to: ║
zonemercy-random-user-l3-ep2-v2-mkmlizer: ║ ║
zonemercy-random-user-l3-ep2-v2-mkmlizer: ║ Chai Research Corp. ║
zonemercy-random-user-l3-ep2-v2-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
zonemercy-random-user-l3-ep2-v2-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
zonemercy-random-user-l3-ep2-v2-mkmlizer: ║ ║
zonemercy-random-user-l3-ep2-v2-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
zonemercy-random-user-l3-ep2-v2-mkmlizer: Downloaded to shared memory in 41.779s
zonemercy-random-user-l3-ep2-v2-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpjz42guhg, device:0
zonemercy-random-user-l3-ep2-v2-mkmlizer: Saving flywheel model at /dev/shm/model_cache
zonemercy-random-user-l3-ep2-v2-mkmlizer: quantized model in 28.507s
zonemercy-random-user-l3-ep2-v2-mkmlizer: Processed model zonemercy/Random-User-l3-ep2 in 70.286s
zonemercy-random-user-l3-ep2-v2-mkmlizer: creating bucket guanaco-mkml-models
zonemercy-random-user-l3-ep2-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
zonemercy-random-user-l3-ep2-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/zonemercy-random-user-l3-ep2-v2
zonemercy-random-user-l3-ep2-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/zonemercy-random-user-l3-ep2-v2/config.json
zonemercy-random-user-l3-ep2-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/zonemercy-random-user-l3-ep2-v2/special_tokens_map.json
zonemercy-random-user-l3-ep2-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/zonemercy-random-user-l3-ep2-v2/tokenizer_config.json
zonemercy-random-user-l3-ep2-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/zonemercy-random-user-l3-ep2-v2/tokenizer.json
zonemercy-random-user-l3-ep2-v2-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/zonemercy-random-user-l3-ep2-v2/flywheel_model.0.safetensors
zonemercy-random-user-l3-ep2-v2-mkmlizer: loading reward model from ChaiML/gpt2_xl_pairwise_89m_step_347634
zonemercy-random-user-l3-ep2-v2-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s] ... Loading 0: 99%|█████████▉| 289/291 [00:14<00:00, 3.24it/s]
zonemercy-random-user-l3-ep2-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
zonemercy-random-user-l3-ep2-v2-mkmlizer: warnings.warn(
zonemercy-random-user-l3-ep2-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
zonemercy-random-user-l3-ep2-v2-mkmlizer: warnings.warn(
zonemercy-random-user-l3-ep2-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
zonemercy-random-user-l3-ep2-v2-mkmlizer: warnings.warn(
zonemercy-random-user-l3-ep2-v2-mkmlizer: Downloading shards: 0%| | 0/2 [00:00<?, ?it/s] Downloading shards: 50%|█████ | 1/2 [00:05<00:05, 5.46s/it] Downloading shards: 100%|██████████| 2/2 [00:08<00:00, 4.04s/it] Downloading shards: 100%|██████████| 2/2 [00:08<00:00, 4.25s/it]
zonemercy-random-user-l3-ep2-v2-mkmlizer: Loading checkpoint shards: 0%| | 0/2 [00:00<?, ?it/s] Loading checkpoint shards: 50%|█████ | 1/2 [00:00<00:00, 2.46it/s] Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 4.06it/s] Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 3.69it/s]
zonemercy-random-user-l3-ep2-v2-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
zonemercy-random-user-l3-ep2-v2-mkmlizer: Saving duration: 1.351s
zonemercy-random-user-l3-ep2-v2-mkmlizer: Processed model ChaiML/gpt2_xl_pairwise_89m_step_347634 in 13.438s
zonemercy-random-user-l3-ep2-v2-mkmlizer: creating bucket guanaco-reward-models
zonemercy-random-user-l3-ep2-v2-mkmlizer: Bucket 's3://guanaco-reward-models/' created
zonemercy-random-user-l3-ep2-v2-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/zonemercy-random-user-l3-ep2-v2_reward
zonemercy-random-user-l3-ep2-v2-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/zonemercy-random-user-l3-ep2-v2_reward/special_tokens_map.json
zonemercy-random-user-l3-ep2-v2-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/zonemercy-random-user-l3-ep2-v2_reward/config.json
zonemercy-random-user-l3-ep2-v2-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/zonemercy-random-user-l3-ep2-v2_reward/tokenizer_config.json
zonemercy-random-user-l3-ep2-v2-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/zonemercy-random-user-l3-ep2-v2_reward/merges.txt
zonemercy-random-user-l3-ep2-v2-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/zonemercy-random-user-l3-ep2-v2_reward/vocab.json
zonemercy-random-user-l3-ep2-v2-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/zonemercy-random-user-l3-ep2-v2_reward/tokenizer.json
zonemercy-random-user-l3-ep2-v2-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/zonemercy-random-user-l3-ep2-v2_reward/reward.tensors
Job zonemercy-random-user-l3-ep2-v2-mkmlizer completed after 115.74s with status: succeeded
Stopping job with name zonemercy-random-user-l3-ep2-v2-mkmlizer
Pipeline stage MKMLizer completed in 116.74s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.09s
Running pipeline stage ISVCDeployer
Creating inference service zonemercy-random-user-l3-ep2-v2
Waiting for inference service zonemercy-random-user-l3-ep2-v2 to be ready
Inference service zonemercy-random-user-l3-ep2-v2 ready after 110.9816517829895s
Pipeline stage ISVCDeployer completed in 112.48s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.8534307479858398s
Received healthy response to inference request in 1.0430197715759277s
Received healthy response to inference request in 0.9550237655639648s
Received healthy response to inference request in 0.9215152263641357s
Received healthy response to inference request in 0.811079740524292s
5 requests
0 failed requests
5th percentile: 0.8331668376922607
10th percentile: 0.8552539348602295
20th percentile: 0.899428129196167
30th percentile: 0.9282169342041016
40th percentile: 0.9416203498840332
50th percentile: 0.9550237655639648
60th percentile: 0.99022216796875
70th percentile: 1.025420570373535
80th percentile: 1.2051019668579104
90th percentile: 1.529266357421875
95th percentile: 1.6913485527038572
99th percentile: 1.8210143089294433
mean time: 1.116813850402832
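The percentile figures above are consistent with linear interpolation between closest ranks over the five sorted response times (NumPy-style "linear" method). A quick check in pure Python:

```python
def percentile(sorted_vals, p):
    # Linear interpolation between closest ranks, matching NumPy's default method
    k = (len(sorted_vals) - 1) * p / 100
    i = int(k)
    frac = k - i
    if i + 1 < len(sorted_vals):
        return sorted_vals[i] + frac * (sorted_vals[i + 1] - sorted_vals[i])
    return sorted_vals[i]

times = sorted([1.8534307479858398, 1.0430197715759277, 0.9550237655639648,
                0.9215152263641357, 0.811079740524292])
print(percentile(times, 5))     # 0.8331668376922607
print(percentile(times, 50))    # 0.9550237655639648
print(sum(times) / len(times))  # mean time: 1.116813850402832
```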
Pipeline stage StressChecker completed in 6.29s
zonemercy-random-user-l3-ep2_v2 status is now deployed due to DeploymentManager action
zonemercy-random-user-l3-ep2_v2 status is now inactive due to auto deactivation of underperforming models
admin requested tearing down of zonemercy-random-user-l3-ep2_v2
Running pipeline stage ISVCDeleter
Checking if service zonemercy-random-user-l3-ep2-v2 is running
Tearing down inference service zonemercy-random-user-l3-ep2-v2
Service zonemercy-random-user-l3-ep2-v2 has been torn down
Pipeline stage ISVCDeleter completed in 4.60s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key zonemercy-random-user-l3-ep2-v2/config.json from bucket guanaco-mkml-models
Deleting key zonemercy-random-user-l3-ep2-v2/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key zonemercy-random-user-l3-ep2-v2/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key zonemercy-random-user-l3-ep2-v2/tokenizer.json from bucket guanaco-mkml-models
Deleting key zonemercy-random-user-l3-ep2-v2/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key zonemercy-random-user-l3-ep2-v2_reward/config.json from bucket guanaco-reward-models
Deleting key zonemercy-random-user-l3-ep2-v2_reward/merges.txt from bucket guanaco-reward-models
Deleting key zonemercy-random-user-l3-ep2-v2_reward/reward.tensors from bucket guanaco-reward-models
Deleting key zonemercy-random-user-l3-ep2-v2_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key zonemercy-random-user-l3-ep2-v2_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key zonemercy-random-user-l3-ep2-v2_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key zonemercy-random-user-l3-ep2-v2_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 5.60s
zonemercy-random-user-l3-ep2_v2 status is now torndown due to DeploymentManager action

Usage Metrics

Latency Metrics