submission_id: trace2333-fd-llama3-v2-n16_v1
developer_uid: Trace2333
alignment_samples: 0
best_of: 16
celo_rating: 1204.0
display_name: trace2333-fd-llama3-v2-n16_v1
formatter: {'memory_template': "<|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\nYou: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
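For readers decoding the template fields above: a minimal sketch of how they could be stitched into one Llama-3-style prompt. The build_prompt helper is hypothetical; the actual Chai serving code is not shown on this page.

# Hypothetical illustration of how the formatter templates compose a prompt.
formatter = {
    "memory_template": "<|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n",
    "prompt_template": "{prompt}<|eot_id|>",
    "bot_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>",
    "user_template": "<|start_header_id|>user<|end_header_id|>\n\nYou: {message}<|eot_id|>",
    "response_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:",
}

def build_prompt(memory, scenario, turns, bot_name):
    # Persona block, then scenario, then alternating chat turns, then the open
    # assistant header that the model completes (stopped by "\n" / <|eot_id|>).
    parts = [formatter["memory_template"].format(bot_name=bot_name, memory=memory),
             formatter["prompt_template"].format(prompt=scenario)]
    for role, message in turns:  # role: "user" or "bot"
        tpl = formatter["user_template"] if role == "user" else formatter["bot_template"]
        parts.append(tpl.format(bot_name=bot_name, message=message))
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)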
generation_params: {'temperature': 1.05, 'top_p': 1.0, 'min_p': 0.06, 'top_k': 250, 'presence_penalty': 0.0, 'frequency_penalty': 0.1, 'stopping_words': ['\n', '<|eot_id|>'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64, 'reward_max_token_input': 256}
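These generation_params map one-to-one onto standard sampler settings. Purely as an illustration (the engine actually used here is MK1 flywheel, whose API does not appear in this log), the same configuration expressed with vLLM's SamplingParams:

from vllm import SamplingParams

sampling = SamplingParams(
    best_of=16,            # draw 16 candidates per turn (reranked downstream)
    temperature=1.05,
    top_p=1.0,
    top_k=250,
    min_p=0.06,            # drop tokens below 6% of the top token's probability
    presence_penalty=0.0,
    frequency_penalty=0.1,
    stop=["\n"],           # "\n" and <|eot_id|> terminate a reply
    max_tokens=64,
)

Note that vLLM's built-in best_of ranks candidates by cumulative log-probability; in this pipeline the 16 candidates are instead scored by a separate reward model (see reward_repo below).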
is_internal_developer: False
language_model: Trace2333/fd_llama3_v2_N16
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: Trace2333/fd_llama3_v2_N
model_name: trace2333-fd-llama3-v2-n16_v1
model_num_parameters: 8030261248.0
model_repo: Trace2333/fd_llama3_v2_N16
model_size: 8B
num_battles: 14532
num_wins: 7233
propriety_score: 0.7474860335195531
propriety_total_count: 895.0
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: Jellywibble/gpt2_xl_pairwise_89m_step_347634
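best_of=16 plus a separate reward model implies sample-and-rerank: the base model proposes 16 replies, each is formatted with reward_formatter (truncated to reward_max_token_input=256 tokens), and the GPT-2-XL pairwise reward model picks the winner. A minimal sketch, assuming the checkpoint loads as a sequence-classification head; the actual scoring code is not part of this page:

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

REWARD = "Jellywibble/gpt2_xl_pairwise_89m_step_347634"
tokenizer = AutoTokenizer.from_pretrained(REWARD)
reward_model = AutoModelForSequenceClassification.from_pretrained(REWARD)

def pick_best(conversation, candidates):
    # Score each candidate continuation against the formatted conversation
    # and return the reply with the highest reward logit.
    scores = []
    for reply in candidates:
        inputs = tokenizer(conversation + reply, truncation=True,
                           max_length=256, return_tensors="pt")
        with torch.no_grad():
            scores.append(reward_model(**inputs).logits[0, 0].item())
    return candidates[scores.index(max(scores))]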
status: torndown
submission_type: basic
timestamp: 2024-08-01T14:43:44+00:00
us_pacific_date: 2024-08-01
win_ratio: 0.49772914946325353
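A quick sanity check confirms the derived fields are consistent with the raw counts:

# win_ratio and model_size follow from the raw fields above.
num_wins, num_battles = 7233, 14532
print(num_wins / num_battles)  # 0.49772914946325353 == win_ratio
print(8030261248 / 1e9)        # ~8.03 -> reported model_size "8B"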
Running pipeline stage MKMLizer
Starting job with name trace2333-fd-llama3-v2-n16-v1-mkmlizer
Waiting for job on trace2333-fd-llama3-v2-n16-v1-mkmlizer to finish
trace2333-fd-llama3-v2-n16-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
trace2333-fd-llama3-v2-n16-v1-mkmlizer: ║          [ flywheel ASCII-art logo ]                                ║
trace2333-fd-llama3-v2-n16-v1-mkmlizer: ║ ║
trace2333-fd-llama3-v2-n16-v1-mkmlizer: ║ Version: 0.9.7 ║
trace2333-fd-llama3-v2-n16-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
trace2333-fd-llama3-v2-n16-v1-mkmlizer: ║ https://mk1.ai ║
trace2333-fd-llama3-v2-n16-v1-mkmlizer: ║ ║
trace2333-fd-llama3-v2-n16-v1-mkmlizer: ║ The license key for the current software has been verified as ║
trace2333-fd-llama3-v2-n16-v1-mkmlizer: ║ belonging to: ║
trace2333-fd-llama3-v2-n16-v1-mkmlizer: ║ ║
trace2333-fd-llama3-v2-n16-v1-mkmlizer: ║ Chai Research Corp. ║
trace2333-fd-llama3-v2-n16-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
trace2333-fd-llama3-v2-n16-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
trace2333-fd-llama3-v2-n16-v1-mkmlizer: ║ ║
trace2333-fd-llama3-v2-n16-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
trace2333-fd-llama3-v2-n16-v1-mkmlizer: Downloaded to shared memory in 58.910s
trace2333-fd-llama3-v2-n16-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpjvsr3g32, device:0
trace2333-fd-llama3-v2-n16-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
trace2333-fd-llama3-v2-n16-v1-mkmlizer: quantized model in 28.957s
trace2333-fd-llama3-v2-n16-v1-mkmlizer: Processed model Trace2333/fd_llama3_v2_N16 in 87.867s
trace2333-fd-llama3-v2-n16-v1-mkmlizer: creating bucket guanaco-mkml-models
trace2333-fd-llama3-v2-n16-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
trace2333-fd-llama3-v2-n16-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/trace2333-fd-llama3-v2-n16-v1
trace2333-fd-llama3-v2-n16-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/trace2333-fd-llama3-v2-n16-v1/config.json
trace2333-fd-llama3-v2-n16-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/trace2333-fd-llama3-v2-n16-v1/special_tokens_map.json
trace2333-fd-llama3-v2-n16-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/trace2333-fd-llama3-v2-n16-v1/tokenizer_config.json
trace2333-fd-llama3-v2-n16-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/trace2333-fd-llama3-v2-n16-v1/tokenizer.json
trace2333-fd-llama3-v2-n16-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/trace2333-fd-llama3-v2-n16-v1/flywheel_model.0.safetensors
trace2333-fd-llama3-v2-n16-v1-mkmlizer: loading reward model from Jellywibble/gpt2_xl_pairwise_89m_step_347634
trace2333-fd-llama3-v2-n16-v1-mkmlizer: Loading 0: 99%|█████████▉| 289/291 [00:14<00:00, 3.23it/s]
trace2333-fd-llama3-v2-n16-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
trace2333-fd-llama3-v2-n16-v1-mkmlizer: warnings.warn(
trace2333-fd-llama3-v2-n16-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
trace2333-fd-llama3-v2-n16-v1-mkmlizer: warnings.warn(
trace2333-fd-llama3-v2-n16-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
trace2333-fd-llama3-v2-n16-v1-mkmlizer: warnings.warn(
Connection pool is full, discarding connection: %s. Connection pool size: %s (×4)
trace2333-fd-llama3-v2-n16-v1-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
trace2333-fd-llama3-v2-n16-v1-mkmlizer: Saving duration: 1.378s
trace2333-fd-llama3-v2-n16-v1-mkmlizer: Processed model Jellywibble/gpt2_xl_pairwise_89m_step_347634 in 11.286s
trace2333-fd-llama3-v2-n16-v1-mkmlizer: creating bucket guanaco-reward-models
trace2333-fd-llama3-v2-n16-v1-mkmlizer: Bucket 's3://guanaco-reward-models/' created
trace2333-fd-llama3-v2-n16-v1-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/trace2333-fd-llama3-v2-n16-v1_reward
trace2333-fd-llama3-v2-n16-v1-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/trace2333-fd-llama3-v2-n16-v1_reward/config.json
trace2333-fd-llama3-v2-n16-v1-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/trace2333-fd-llama3-v2-n16-v1_reward/special_tokens_map.json
trace2333-fd-llama3-v2-n16-v1-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/trace2333-fd-llama3-v2-n16-v1_reward/tokenizer_config.json
trace2333-fd-llama3-v2-n16-v1-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/trace2333-fd-llama3-v2-n16-v1_reward/merges.txt
trace2333-fd-llama3-v2-n16-v1-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/trace2333-fd-llama3-v2-n16-v1_reward/vocab.json
trace2333-fd-llama3-v2-n16-v1-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/trace2333-fd-llama3-v2-n16-v1_reward/tokenizer.json
trace2333-fd-llama3-v2-n16-v1-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/trace2333-fd-llama3-v2-n16-v1_reward/reward.tensors
Job trace2333-fd-llama3-v2-n16-v1-mkmlizer completed after 136.18s with status: succeeded
Stopping job with name trace2333-fd-llama3-v2-n16-v1-mkmlizer
Pipeline stage MKMLizer completed in 137.79s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.12s
Running pipeline stage ISVCDeployer
Creating inference service trace2333-fd-llama3-v2-n16-v1
Waiting for inference service trace2333-fd-llama3-v2-n16-v1 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s (×7)
Inference service trace2333-fd-llama3-v2-n16-v1 ready after 130.89332556724548s
Pipeline stage ISVCDeployer completed in 132.61s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.237797498703003s
Received healthy response to inference request in 1.5090198516845703s
Received healthy response to inference request in 1.4760305881500244s
Received healthy response to inference request in 1.440880537033081s
Received healthy response to inference request in 1.3709869384765625s
5 requests
0 failed requests
5th percentile: 1.3849656581878662
10th percentile: 1.39894437789917
20th percentile: 1.4269018173217773
30th percentile: 1.4479105472564697
40th percentile: 1.461970567703247
50th percentile: 1.4760305881500244
60th percentile: 1.4892262935638427
70th percentile: 1.502421998977661
80th percentile: 1.654775381088257
90th percentile: 1.9462864398956299
95th percentile: 2.0920419692993164
99th percentile: 2.2086463928222657
mean time: 1.6069430828094482
Pipeline stage StressChecker completed in 8.84s
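The StressChecker statistics above are reproducible from the five recorded latencies; they match numpy's default linear-interpolation percentiles:

import numpy as np

latencies = [2.237797498703003, 1.5090198516845703, 1.4760305881500244,
             1.440880537033081, 1.3709869384765625]
for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {np.percentile(latencies, q)}")
print("mean time:", np.mean(latencies))  # 1.6069430828094482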
trace2333-fd-llama3-v2-n16_v1 status is now deployed due to DeploymentManager action
trace2333-fd-llama3-v2-n16_v1 status is now inactive due to auto-deactivation of underperforming models
admin requested tearing down of trace2333-fd-llama3-v2-n16_v1
Running pipeline stage ISVCDeleter
Checking if service trace2333-fd-llama3-v2-n16-v1 is running
Tearing down inference service trace2333-fd-llama3-v2-n16-v1
Service trace2333-fd-llama3-v2-n16-v1 has been torn down
Pipeline stage ISVCDeleter completed in 4.64s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key trace2333-fd-llama3-v2-n16-v1/config.json from bucket guanaco-mkml-models
Deleting key trace2333-fd-llama3-v2-n16-v1/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key trace2333-fd-llama3-v2-n16-v1/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key trace2333-fd-llama3-v2-n16-v1/tokenizer.json from bucket guanaco-mkml-models
Deleting key trace2333-fd-llama3-v2-n16-v1/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key trace2333-fd-llama3-v2-n16-v1_reward/config.json from bucket guanaco-reward-models
Deleting key trace2333-fd-llama3-v2-n16-v1_reward/merges.txt from bucket guanaco-reward-models
Deleting key trace2333-fd-llama3-v2-n16-v1_reward/reward.tensors from bucket guanaco-reward-models
Deleting key trace2333-fd-llama3-v2-n16-v1_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key trace2333-fd-llama3-v2-n16-v1_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key trace2333-fd-llama3-v2-n16-v1_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key trace2333-fd-llama3-v2-n16-v1_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 5.54s
trace2333-fd-llama3-v2-n16_v1 status is now torndown due to DeploymentManager action

Usage Metrics: [chart not captured]

Latency Metrics: [chart not captured]