developer_uid: Trace2333
submission_id: trace2333-fd-llama3-v4-n16_v2
model_name: trace2333-fd-llama3-v4-n16_v2
model_group: Trace2333/fd_llama3_v4_N
status: torndown
timestamp: 2024-08-02T12:50:28+00:00
num_battles: 14190
num_wins: 6742
celo_rating: 1198.37
family_friendly_score: 0.0
submission_type: basic
model_repo: Trace2333/fd_llama3_v4_N16
model_architecture: LlamaForCausalLM
reward_repo: Jellywibble/gpt2_xl_pairwise_89m_step_347634
model_num_parameters: 8030261248.0
best_of: 16
max_input_tokens: 512
max_output_tokens: 64
display_name: trace2333-fd-llama3-v4-n16_v2
is_internal_developer: False
language_model: Trace2333/fd_llama3_v4_N16
model_size: 8B
ranking_group: single
us_pacific_date: 2024-08-02
win_ratio: 0.475123326286117
generation_params: {'temperature': 1.05, 'top_p': 0.95, 'min_p': 0.06, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|eot_id|>'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64, 'reward_max_token_input': 256}
formatter: {'memory_template': "<|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
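Taken together, generation_params, formatter, reward_formatter, and best_of: 16 describe the serving path: each turn is rendered into a Llama-3 chat prompt, 16 candidate replies are sampled from the 8B model, each candidate is re-rendered in the reward model's plain-text format, and the highest-scoring candidate is returned. The sketch below illustrates that flow; it is a minimal reconstruction under stated assumptions, not the production code, and generate_candidates / reward_score are hypothetical helpers standing in for the proprietary MKML engine and the GPT-2 XL pairwise reward model.

```python
# Illustrative sketch only, assuming a generic Python serving loop:
# generate_candidates() and reward_score() are hypothetical stand-ins for
# the MKML inference engine and the pairwise reward model.

FORMATTER = {  # copied from the `formatter` field above
    "memory_template": "<|start_header_id|>system<|end_header_id|>\n\n"
                       "{bot_name}'s Persona: {memory}\n\n",
    "prompt_template": "{prompt}<|eot_id|>",
    "user_template": "<|start_header_id|>user<|end_header_id|>\n\n"
                     "{user_name}: {message}<|eot_id|>",
    "bot_template": "<|start_header_id|>assistant<|end_header_id|>\n\n"
                    "{bot_name}: {message}<|eot_id|>",
    "response_template": "<|start_header_id|>assistant<|end_header_id|>\n\n"
                         "{bot_name}:",
}

REWARD_FORMATTER = {  # copied from the `reward_formatter` field above
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "user_template": "{user_name}: {message}\n",
    "bot_template": "{bot_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def render(tpl, memory, prompt, turns, bot_name):
    """Assemble a full context string from one template set.

    `turns` is a list of (speaker_name, message, is_bot) tuples."""
    parts = [tpl["memory_template"].format(bot_name=bot_name, memory=memory),
             tpl["prompt_template"].format(prompt=prompt)]
    for name, message, is_bot in turns:
        t = tpl["bot_template"] if is_bot else tpl["user_template"]
        parts.append(t.format(bot_name=bot_name, user_name=name,
                              message=message))
    parts.append(tpl["response_template"].format(bot_name=bot_name))
    return "".join(parts)

def best_of_16_reply(memory, prompt, turns, bot_name,
                     generate_candidates, reward_score):
    """Sample 16 completions, keep the one the reward model scores highest."""
    llm_prompt = render(FORMATTER, memory, prompt, turns, bot_name)
    candidates = generate_candidates(           # hypothetical engine call;
        llm_prompt, n=16, temperature=1.05,     # parameters taken from
        top_p=0.95, min_p=0.06, top_k=80,       # generation_params above
        max_tokens=64, stop=["\n", "<|eot_id|>"])
    ctx = render(REWARD_FORMATTER, memory, prompt, turns, bot_name)
    # Score "ctx + candidate" with the reward model; keep the argmax.
    return max(candidates, key=lambda reply: reward_score(ctx + " " + reply))
```

Note the "\n" stop word: candidate replies are effectively single lines, with max_output_tokens: 64 capping their length, and reward_max_token_input: 256 means the reward model sees a harder-truncated context than the 512 input tokens given to the language model.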
Resubmit model
Running pipeline stage MKMLizer
Starting job with name trace2333-fd-llama3-v4-n16-v2-mkmlizer
Waiting for job on trace2333-fd-llama3-v4-n16-v2-mkmlizer to finish
trace2333-fd-llama3-v4-n16-v2-mkmlizer: ╔═══════════════════════════════════════════════════════════════════════╗
trace2333-fd-llama3-v4-n16-v2-mkmlizer: ║                                flywheel                                 ║
trace2333-fd-llama3-v4-n16-v2-mkmlizer: ║                                                                         ║
trace2333-fd-llama3-v4-n16-v2-mkmlizer: ║  Version: 0.9.9                                                         ║
trace2333-fd-llama3-v4-n16-v2-mkmlizer: ║  Copyright 2023 MK ONE TECHNOLOGIES Inc.                                ║
trace2333-fd-llama3-v4-n16-v2-mkmlizer: ║  https://mk1.ai                                                         ║
trace2333-fd-llama3-v4-n16-v2-mkmlizer: ║                                                                         ║
trace2333-fd-llama3-v4-n16-v2-mkmlizer: ║  The license key for the current software has been verified as          ║
trace2333-fd-llama3-v4-n16-v2-mkmlizer: ║  belonging to:                                                          ║
trace2333-fd-llama3-v4-n16-v2-mkmlizer: ║                                                                         ║
trace2333-fd-llama3-v4-n16-v2-mkmlizer: ║  Chai Research Corp.                                                    ║
trace2333-fd-llama3-v4-n16-v2-mkmlizer: ║  Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f                       ║
trace2333-fd-llama3-v4-n16-v2-mkmlizer: ║  Expiration: 2024-10-15 23:59:59                                        ║
trace2333-fd-llama3-v4-n16-v2-mkmlizer: ║                                                                         ║
trace2333-fd-llama3-v4-n16-v2-mkmlizer: ╚═══════════════════════════════════════════════════════════════════════╝
trace2333-fd-llama3-v4-n16-v2-mkmlizer: Downloaded to shared memory in 38.597s
trace2333-fd-llama3-v4-n16-v2-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpogh37r15, device:0
trace2333-fd-llama3-v4-n16-v2-mkmlizer: Saving flywheel model at /dev/shm/model_cache
trace2333-fd-llama3-v4-n16-v2-mkmlizer: creating bucket guanaco-mkml-models
trace2333-fd-llama3-v4-n16-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
trace2333-fd-llama3-v4-n16-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/trace2333-fd-llama3-v4-n16-v2
trace2333-fd-llama3-v4-n16-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/trace2333-fd-llama3-v4-n16-v2/config.json
trace2333-fd-llama3-v4-n16-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/trace2333-fd-llama3-v4-n16-v2/special_tokens_map.json
trace2333-fd-llama3-v4-n16-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/trace2333-fd-llama3-v4-n16-v2/tokenizer_config.json
trace2333-fd-llama3-v4-n16-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/trace2333-fd-llama3-v4-n16-v2/tokenizer.json
trace2333-fd-llama3-v4-n16-v2-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/trace2333-fd-llama3-v4-n16-v2/flywheel_model.0.safetensors
trace2333-fd-llama3-v4-n16-v2-mkmlizer: loading reward model from Jellywibble/gpt2_xl_pairwise_89m_step_347634
trace2333-fd-llama3-v4-n16-v2-mkmlizer: Loading 0: 99%|█████████▉| 289/291 [00:14<00:00, 3.20it/s]
trace2333-fd-llama3-v4-n16-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
trace2333-fd-llama3-v4-n16-v2-mkmlizer: warnings.warn(
trace2333-fd-llama3-v4-n16-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
trace2333-fd-llama3-v4-n16-v2-mkmlizer: warnings.warn(
trace2333-fd-llama3-v4-n16-v2-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
trace2333-fd-llama3-v4-n16-v2-mkmlizer: Saving duration: 1.414s
trace2333-fd-llama3-v4-n16-v2-mkmlizer: Processed model Jellywibble/gpt2_xl_pairwise_89m_step_347634 in 10.976s
trace2333-fd-llama3-v4-n16-v2-mkmlizer: Bucket 's3://guanaco-reward-models/' created
trace2333-fd-llama3-v4-n16-v2-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/trace2333-fd-llama3-v4-n16-v2_reward
trace2333-fd-llama3-v4-n16-v2-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/trace2333-fd-llama3-v4-n16-v2_reward/config.json
trace2333-fd-llama3-v4-n16-v2-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/trace2333-fd-llama3-v4-n16-v2_reward/special_tokens_map.json
trace2333-fd-llama3-v4-n16-v2-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/trace2333-fd-llama3-v4-n16-v2_reward/tokenizer_config.json
trace2333-fd-llama3-v4-n16-v2-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/trace2333-fd-llama3-v4-n16-v2_reward/merges.txt
trace2333-fd-llama3-v4-n16-v2-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/trace2333-fd-llama3-v4-n16-v2_reward/vocab.json
trace2333-fd-llama3-v4-n16-v2-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/trace2333-fd-llama3-v4-n16-v2_reward/tokenizer.json
trace2333-fd-llama3-v4-n16-v2-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/trace2333-fd-llama3-v4-n16-v2_reward/reward.tensors
Job trace2333-fd-llama3-v4-n16-v2-mkmlizer completed after 157.37s with status: succeeded
Stopping job with name trace2333-fd-llama3-v4-n16-v2-mkmlizer
Pipeline stage MKMLizer completed in 158.61s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service trace2333-fd-llama3-v4-n16-v2
Waiting for inference service trace2333-fd-llama3-v4-n16-v2 to be ready
Inference service trace2333-fd-llama3-v4-n16-v2 ready after 141.07454299926758s
Pipeline stage ISVCDeployer completed in 143.53s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.2975738048553467s
Received healthy response to inference request in 1.4973883628845215s
Received healthy response to inference request in 1.44696044921875s
Received healthy response to inference request in 1.4340758323669434s
Received healthy response to inference request in 1.3863773345947266s
5 requests
0 failed requests
5th percentile: 1.39591703414917
10th percentile: 1.4054567337036132
20th percentile: 1.4245361328125
30th percentile: 1.4366527557373048
40th percentile: 1.4418066024780274
50th percentile: 1.44696044921875
60th percentile: 1.4671316146850586
70th percentile: 1.4873027801513672
80th percentile: 1.6574254512786868
90th percentile: 1.9774996280670167
95th percentile: 2.1375367164611814
99th percentile: 2.2655663871765137
mean time: 1.6124751567840576
Pipeline stage StressChecker completed in 8.99s
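For reference, the percentile and mean figures above follow directly from the five raw latencies using linear interpolation between sorted samples (numpy's default percentile method): the 90th percentile, for instance, lies 60% of the way between the 4th and 5th sorted values. A minimal check:

```python
import numpy as np

# The five raw StressChecker latencies (seconds), as logged above.
latencies = [2.2975738048553467, 1.4973883628845215, 1.44696044921875,
             1.4340758323669434, 1.3863773345947266]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print(f"mean time: {np.mean(latencies)}")
# e.g. 90th percentile -> 1.9774996280670167 and mean -> 1.6124751567840576,
# matching the logged values to within floating-point rounding.
```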
trace2333-fd-llama3-v4-n16_v2 status is now deployed due to DeploymentManager action
trace2333-fd-llama3-v4-n16_v2 status is now inactive due to auto deactivation of underperforming models
admin requested tearing down of trace2333-fd-llama3-v4-n16_v2
Running pipeline stage ISVCDeleter
Checking if service trace2333-fd-llama3-v4-n16-v2 is running
Tearing down inference service trace2333-fd-llama3-v4-n16-v2
Service trace2333-fd-llama3-v4-n16-v2 has been torn down
Pipeline stage ISVCDeleter completed in 4.55s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key trace2333-fd-llama3-v4-n16-v2/config.json from bucket guanaco-mkml-models
Deleting key trace2333-fd-llama3-v4-n16-v2/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key trace2333-fd-llama3-v4-n16-v2/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key trace2333-fd-llama3-v4-n16-v2/tokenizer.json from bucket guanaco-mkml-models
Deleting key trace2333-fd-llama3-v4-n16-v2/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key trace2333-fd-llama3-v4-n16-v2_reward/config.json from bucket guanaco-reward-models
Deleting key trace2333-fd-llama3-v4-n16-v2_reward/merges.txt from bucket guanaco-reward-models
Deleting key trace2333-fd-llama3-v4-n16-v2_reward/reward.tensors from bucket guanaco-reward-models
Deleting key trace2333-fd-llama3-v4-n16-v2_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key trace2333-fd-llama3-v4-n16-v2_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key trace2333-fd-llama3-v4-n16-v2_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key trace2333-fd-llama3-v4-n16-v2_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 6.21s
trace2333-fd-llama3-v4-n16_v2 status is now torndown due to DeploymentManager action