submission_id: zonemercy-random-user-l3-ep2_v1
developer_uid: zonemercy
alignment_samples: 0
best_of: 1
celo_rating: 1128.26
display_name: zonemercy-random-user-l3-ep2_v1
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.95, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '</s>', '###', '<|user|>'], 'max_input_tokens': 512, 'best_of': 1, 'max_output_tokens': 64, 'reward_max_token_input': 256}
is_internal_developer: True
language_model: zonemercy/Random-User-l3-ep2
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: zonemercy/Random-User-l3
model_name: zonemercy-random-user-l3-ep2_v1
model_num_parameters: 8030261248.0
model_repo: zonemercy/Random-User-l3-ep2
model_size: 8B
num_battles: 11688
num_wins: 5000
propriety_score: 0.7407035175879397
propriety_total_count: 995.0
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: ChaiML/gpt2_xl_pairwise_89m_step_347634
status: torndown
submission_type: basic
timestamp: 2024-07-29T18:27:19+00:00
us_pacific_date: 2024-07-29
win_ratio: 0.4277891854893908
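The `formatter` and win-rate fields above can be sanity-checked in code. The sketch below renders a model input from the submission's own templates and re-derives `win_ratio` as `num_wins / num_battles`; the persona, bot name, and messages are hypothetical placeholders, not data from this submission.

```python
# Templates copied verbatim from the `formatter` field of this submission.
FORMATTER = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def render(bot_name, memory, prompt, turns):
    """Assemble the model input: persona block, scenario prompt,
    alternating chat turns, then the response stub the model completes."""
    out = FORMATTER["memory_template"].format(bot_name=bot_name, memory=memory)
    out += FORMATTER["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        if speaker == bot_name:
            out += FORMATTER["bot_template"].format(bot_name=bot_name, message=message)
        else:
            out += FORMATTER["user_template"].format(user_name=speaker, message=message)
    return out + FORMATTER["response_template"].format(bot_name=bot_name)

# Hypothetical conversation, for illustration only.
text = render("Mia", "A cheerful barista.", "Mia greets a regular.",
              [("User", "Morning!"), ("Mia", "The usual?")])
print(text)

# win_ratio in the metadata is simply num_wins / num_battles.
assert abs(5000 / 11688 - 0.4277891854893908) < 1e-12
```

The rendered text ends with the `response_template` stub (`Mia:` here), which is what the language model is asked to continue; the stopping words in `generation_params` (`\n`, `</s>`, etc.) then cut the completion off at the end of the bot's turn.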
Running pipeline stage MKMLizer
Starting job with name zonemercy-random-user-l3-ep2-v1-mkmlizer
Waiting for job on zonemercy-random-user-l3-ep2-v1-mkmlizer to finish
zonemercy-random-user-l3-ep2-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
zonemercy-random-user-l3-ep2-v1-mkmlizer: ║ flywheel (ASCII logo) ║
zonemercy-random-user-l3-ep2-v1-mkmlizer: ║ Version: 0.9.7 ║
zonemercy-random-user-l3-ep2-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
zonemercy-random-user-l3-ep2-v1-mkmlizer: ║ https://mk1.ai ║
zonemercy-random-user-l3-ep2-v1-mkmlizer: ║ ║
zonemercy-random-user-l3-ep2-v1-mkmlizer: ║ The license key for the current software has been verified as ║
zonemercy-random-user-l3-ep2-v1-mkmlizer: ║ belonging to: ║
zonemercy-random-user-l3-ep2-v1-mkmlizer: ║ ║
zonemercy-random-user-l3-ep2-v1-mkmlizer: ║ Chai Research Corp. ║
zonemercy-random-user-l3-ep2-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
zonemercy-random-user-l3-ep2-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
zonemercy-random-user-l3-ep2-v1-mkmlizer: ║ ║
zonemercy-random-user-l3-ep2-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
Connection pool is full, discarding connection: %s. Connection pool size: %s
zonemercy-random-user-l3-ep2-v1-mkmlizer: Downloaded to shared memory in 63.376s
zonemercy-random-user-l3-ep2-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpmocidapx, device:0
zonemercy-random-user-l3-ep2-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
zonemercy-random-user-l3-ep2-v1-mkmlizer: quantized model in 28.948s
zonemercy-random-user-l3-ep2-v1-mkmlizer: Processed model zonemercy/Random-User-l3-ep2 in 92.324s
zonemercy-random-user-l3-ep2-v1-mkmlizer: creating bucket guanaco-mkml-models
zonemercy-random-user-l3-ep2-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
zonemercy-random-user-l3-ep2-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/zonemercy-random-user-l3-ep2-v1
zonemercy-random-user-l3-ep2-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/zonemercy-random-user-l3-ep2-v1/config.json
zonemercy-random-user-l3-ep2-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/zonemercy-random-user-l3-ep2-v1/special_tokens_map.json
zonemercy-random-user-l3-ep2-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/zonemercy-random-user-l3-ep2-v1/tokenizer_config.json
zonemercy-random-user-l3-ep2-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/zonemercy-random-user-l3-ep2-v1/tokenizer.json
zonemercy-random-user-l3-ep2-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/zonemercy-random-user-l3-ep2-v1/flywheel_model.0.safetensors
zonemercy-random-user-l3-ep2-v1-mkmlizer: loading reward model from ChaiML/gpt2_xl_pairwise_89m_step_347634
zonemercy-random-user-l3-ep2-v1-mkmlizer: Loading 0: 99%|█████████▉| 289/291 [00:14<00:00, 3.22it/s]
zonemercy-random-user-l3-ep2-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
zonemercy-random-user-l3-ep2-v1-mkmlizer: warnings.warn(
zonemercy-random-user-l3-ep2-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
zonemercy-random-user-l3-ep2-v1-mkmlizer: warnings.warn(
zonemercy-random-user-l3-ep2-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
zonemercy-random-user-l3-ep2-v1-mkmlizer: warnings.warn(
zonemercy-random-user-l3-ep2-v1-mkmlizer: Downloading shards: 100%|██████████| 2/2 [00:08<00:00, 4.30s/it]
zonemercy-random-user-l3-ep2-v1-mkmlizer: Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 3.52it/s]
zonemercy-random-user-l3-ep2-v1-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
zonemercy-random-user-l3-ep2-v1-mkmlizer: Saving duration: 1.333s
zonemercy-random-user-l3-ep2-v1-mkmlizer: Processed model ChaiML/gpt2_xl_pairwise_89m_step_347634 in 13.497s
zonemercy-random-user-l3-ep2-v1-mkmlizer: creating bucket guanaco-reward-models
zonemercy-random-user-l3-ep2-v1-mkmlizer: Bucket 's3://guanaco-reward-models/' created
zonemercy-random-user-l3-ep2-v1-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/zonemercy-random-user-l3-ep2-v1_reward
zonemercy-random-user-l3-ep2-v1-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/zonemercy-random-user-l3-ep2-v1_reward/special_tokens_map.json
zonemercy-random-user-l3-ep2-v1-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/zonemercy-random-user-l3-ep2-v1_reward/config.json
zonemercy-random-user-l3-ep2-v1-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/zonemercy-random-user-l3-ep2-v1_reward/tokenizer_config.json
zonemercy-random-user-l3-ep2-v1-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/zonemercy-random-user-l3-ep2-v1_reward/merges.txt
zonemercy-random-user-l3-ep2-v1-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/zonemercy-random-user-l3-ep2-v1_reward/vocab.json
zonemercy-random-user-l3-ep2-v1-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/zonemercy-random-user-l3-ep2-v1_reward/tokenizer.json
zonemercy-random-user-l3-ep2-v1-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/zonemercy-random-user-l3-ep2-v1_reward/reward.tensors
Job zonemercy-random-user-l3-ep2-v1-mkmlizer completed after 136.36s with status: succeeded
Stopping job with name zonemercy-random-user-l3-ep2-v1-mkmlizer
Pipeline stage MKMLizer completed in 137.40s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service zonemercy-random-user-l3-ep2-v1
Waiting for inference service zonemercy-random-user-l3-ep2-v1 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Inference service zonemercy-random-user-l3-ep2-v1 ready after 110.72176694869995s
Pipeline stage ISVCDeployer completed in 113.05s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.88460373878479s
Received healthy response to inference request in 1.0694019794464111s
Received healthy response to inference request in 1.0531904697418213s
Received healthy response to inference request in 0.48939037322998047s
Received healthy response to inference request in 1.049889087677002s
5 requests
0 failed requests
5th percentile: 0.6014901161193847
10th percentile: 0.713589859008789
20th percentile: 0.9377893447875977
30th percentile: 1.050549364089966
40th percentile: 1.0518699169158936
50th percentile: 1.0531904697418213
60th percentile: 1.0596750736236573
70th percentile: 1.0661596775054931
80th percentile: 1.232442331314087
90th percentile: 1.5585230350494386
95th percentile: 1.721563386917114
99th percentile: 1.8519956684112548
mean time: 1.109295129776001
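The percentile and mean figures above are consistent with linear interpolation over the five response times. A minimal re-derivation (pure Python; the helper name `percentile` is ours, mirroring numpy's default "linear" method):

```python
def percentile(samples, p):
    """Linearly interpolated p-th percentile (numpy's default 'linear' method)."""
    xs = sorted(samples)
    k = (len(xs) - 1) * p / 100.0
    lo, hi = int(k), min(int(k) + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

# The five healthy response times reported above, in seconds.
times = [1.88460373878479, 1.0694019794464111, 1.0531904697418213,
         0.48939037322998047, 1.049889087677002]

print(percentile(times, 5))     # matches the reported 5th percentile
print(percentile(times, 50))    # the median, i.e. the middle sample
print(sum(times) / len(times))  # matches the reported mean time
```

Note that with only 5 samples the 50th percentile is exactly the middle sample, and the tails (95th, 99th) are interpolations toward the single slowest request.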
Pipeline stage StressChecker completed in 6.26s
zonemercy-random-user-l3-ep2_v1 status is now deployed due to DeploymentManager action
zonemercy-random-user-l3-ep2_v1 status is now inactive due to auto deactivation of underperforming models
admin requested tearing down of zonemercy-random-user-l3-ep2_v1
Running pipeline stage ISVCDeleter
Checking if service zonemercy-random-user-l3-ep2-v1 is running
Tearing down inference service zonemercy-random-user-l3-ep2-v1
Service zonemercy-random-user-l3-ep2-v1 has been torn down
Pipeline stage ISVCDeleter completed in 4.71s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key zonemercy-random-user-l3-ep2-v1/config.json from bucket guanaco-mkml-models
Deleting key zonemercy-random-user-l3-ep2-v1/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key zonemercy-random-user-l3-ep2-v1/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key zonemercy-random-user-l3-ep2-v1/tokenizer.json from bucket guanaco-mkml-models
Deleting key zonemercy-random-user-l3-ep2-v1/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key zonemercy-random-user-l3-ep2-v1_reward/config.json from bucket guanaco-reward-models
Deleting key zonemercy-random-user-l3-ep2-v1_reward/merges.txt from bucket guanaco-reward-models
Deleting key zonemercy-random-user-l3-ep2-v1_reward/reward.tensors from bucket guanaco-reward-models
Deleting key zonemercy-random-user-l3-ep2-v1_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key zonemercy-random-user-l3-ep2-v1_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key zonemercy-random-user-l3-ep2-v1_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key zonemercy-random-user-l3-ep2-v1_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 5.72s
zonemercy-random-user-l3-ep2_v1 status is now torndown due to DeploymentManager action

Usage Metrics (interactive chart not captured in this log)

Latency Metrics (interactive chart not captured in this log)