submission_id: pjmixers-llama-3-1-instr_3684_v1
developer_uid: xzuyn
alignment_samples: 0
best_of: 16
celo_rating: 1181.94
display_name: pjmixers-llama-3-1-instr_3684_v1
formatter: {'memory_template': "<|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '---\n\n{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['<|eot_id|>', '<|end_of_text|>', '<|start_header_id|>', '<|end_header_id|>'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64, 'reward_max_token_input': 512}
is_internal_developer: False
language_model: PJMixers/LLaMa-3.1-Instruct-CVLB-v0.1-ORPO-8B-r7s2450
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: PJMixers/LLaMa-3.1-Instr
model_name: pjmixers-llama-3-1-instr_3684_v1
model_num_parameters: 8030261248.0
model_repo: PJMixers/LLaMa-3.1-Instruct-CVLB-v0.1-ORPO-8B-r7s2450
model_size: 8B
num_battles: 12235
num_wins: 5861
propriety_score: 0.7317298797409806
propriety_total_count: 1081.0
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
status: torndown
submission_type: basic
timestamp: 2024-07-30T07:53:43+00:00
us_pacific_date: 2024-07-30
win_ratio: 0.4790355537392726
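
The formatter and reward_formatter fields above are plain string templates. Below is a minimal, hypothetical sketch (assuming the fields are filled with str.format in memory -> prompt -> chat-turns -> response-cue order; the persona, names, and messages are made up) of how they might assemble the Llama-3.1 prompt sent to the model and the Pygmalion-style string passed to the reward model. Note that both response templates end with "{bot_name}:", so generation continues directly as the character's next turn.

    # Illustrative only: how the formatter templates above could be composed.
    # The conversation content here is invented for the example.
    formatter = {
        "memory_template": "<|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n",
        "prompt_template": "---\n\n{prompt}<|eot_id|>",
        "user_template": "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>",
        "bot_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>",
        "response_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:",
    }
    reward_formatter = {
        "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
        "prompt_template": "{prompt}\n<START>\n",
        "user_template": "{user_name}: {message}\n",
        "bot_template": "{bot_name}: {message}\n",
        "response_template": "{bot_name}:",
    }

    def build(templates, bot_name, user_name, memory, prompt, turns):
        """Concatenate memory, scenario prompt, chat turns, and the response cue."""
        parts = [
            templates["memory_template"].format(bot_name=bot_name, memory=memory),
            templates["prompt_template"].format(prompt=prompt),
        ]
        for role, message in turns:
            key = "user_template" if role == "user" else "bot_template"
            parts.append(templates[key].format(bot_name=bot_name, user_name=user_name, message=message))
        parts.append(templates["response_template"].format(bot_name=bot_name))
        return "".join(parts)

    turns = [("user", "Hello?"), ("bot", "Welcome aboard."), ("user", "Who are you?")]
    llm_prompt = build(formatter, "Aria", "Sam", "A friendly ship AI.", "Sam boards the ship.", turns)
    reward_input = build(reward_formatter, "Aria", "Sam", "A friendly ship AI.", "Sam boards the ship.", turns)
    print(llm_prompt)
    print(reward_input)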
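
For context, here is a rough sketch of how the temperature, top_k, and min_p settings in generation_params are typically applied to next-token logits before sampling (PyTorch, with made-up logits; the serving stack's actual implementation may differ). best_of: 16 means sixteen candidate completions are drawn and one is presumably selected via the reward model configured above; max_output_tokens: 64 caps each candidate's length.

    import torch

    def filter_logits(logits, top_k=80, min_p=0.05, temperature=1.0):
        """Apply temperature, top-k, and min-p filtering to a 1-D logits tensor."""
        logits = logits / temperature
        # Top-k: keep only the k highest-scoring tokens.
        if top_k > 0:
            kth = torch.topk(logits, min(top_k, logits.numel())).values[-1]
            logits = logits.masked_fill(logits < kth, float("-inf"))
        # Min-p: drop tokens whose probability is below min_p * max probability.
        probs = torch.softmax(logits, dim=-1)
        logits = logits.masked_fill(probs < min_p * probs.max(), float("-inf"))
        return logits

    torch.manual_seed(0)
    logits = torch.randn(128_256)            # fake logits, roughly Llama-3 vocabulary size
    filtered = filter_logits(logits)
    next_token = torch.multinomial(torch.softmax(filtered, dim=-1), num_samples=1)
    print(next_token.item())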
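
The win_ratio field is simply num_wins divided by num_battles; a one-line check using the values reported above:

    num_wins, num_battles = 5861, 12235
    print(num_wins / num_battles)  # 0.4790355537392726, matching win_ratio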
Running pipeline stage MKMLizer
Starting job with name pjmixers-llama-3-1-instr-3684-v1-mkmlizer
Waiting for job on pjmixers-llama-3-1-instr-3684-v1-mkmlizer to finish
Stopping job with name pjmixers-llama-3-1-instr-3684-v1-mkmlizer
%s, retrying in %s seconds...
Starting job with name pjmixers-llama-3-1-instr-3684-v1-mkmlizer
Waiting for job on pjmixers-llama-3-1-instr-3684-v1-mkmlizer to finish
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ║ _____ __ __ ║
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ║ /___/ ║
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ║ ║
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ║ Version: 0.9.7 ║
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ║ https://mk1.ai ║
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ║ ║
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ║ The license key for the current software has been verified as ║
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ║ belonging to: ║
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ║ ║
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ║ Chai Research Corp. ║
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ║ ║
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: Downloaded to shared memory in 33.292s
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpl_4tfb_r, device:0
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: quantized model in 25.839s
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: Processed model PJMixers/LLaMa-3.1-Instruct-CVLB-v0.1-ORPO-8B-r7s2450 in 59.132s
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: creating bucket guanaco-mkml-models
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/pjmixers-llama-3-1-instr-3684-v1
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/pjmixers-llama-3-1-instr-3684-v1/config.json
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/pjmixers-llama-3-1-instr-3684-v1/special_tokens_map.json
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/pjmixers-llama-3-1-instr-3684-v1/tokenizer_config.json
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/pjmixers-llama-3-1-instr-3684-v1/tokenizer.json
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/pjmixers-llama-3-1-instr-3684-v1/flywheel_model.0.safetensors
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s] ... Loading 0: 99%|█████████▊| 287/291 [00:11<00:01, 3.25it/s]
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: warnings.warn(
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: warnings.warn(
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: warnings.warn(
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: Saving duration: 0.314s
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 6.636s
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: creating bucket guanaco-reward-models
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: Bucket 's3://guanaco-reward-models/' created
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/pjmixers-llama-3-1-instr-3684-v1_reward
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/pjmixers-llama-3-1-instr-3684-v1_reward/tokenizer.json
pjmixers-llama-3-1-instr-3684-v1-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/pjmixers-llama-3-1-instr-3684-v1_reward/reward.tensors
Job pjmixers-llama-3-1-instr-3684-v1-mkmlizer completed after 94.04s with status: succeeded
Stopping job with name pjmixers-llama-3-1-instr-3684-v1-mkmlizer
Pipeline stage MKMLizer completed in 95.92s
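
The cp ... s3://guanaco-mkml-models/... lines in the MKMLizer log above copy the quantized model artifacts and tokenizer files into S3. A minimal sketch of an equivalent upload (assuming boto3 and AWS credentials available in the environment; illustrative only, not the pipeline's actual code):

    import os
    import boto3

    # Mirror the cp commands above: upload every file in the model cache
    # to the per-submission prefix in the guanaco-mkml-models bucket.
    s3 = boto3.client("s3")
    cache_dir = "/dev/shm/model_cache"
    bucket = "guanaco-mkml-models"
    prefix = "pjmixers-llama-3-1-instr-3684-v1"

    for name in ["config.json", "special_tokens_map.json", "tokenizer_config.json",
                 "tokenizer.json", "flywheel_model.0.safetensors"]:
        s3.upload_file(os.path.join(cache_dir, name), bucket, f"{prefix}/{name}")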
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.11s
Running pipeline stage ISVCDeployer
Creating inference service pjmixers-llama-3-1-instr-3684-v1
Waiting for inference service pjmixers-llama-3-1-instr-3684-v1 to be ready
Inference service pjmixers-llama-3-1-instr-3684-v1 ready after 120.73215508460999s
Pipeline stage ISVCDeployer completed in 122.82s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.1295013427734375s
Received healthy response to inference request in 1.3578901290893555s
Received healthy response to inference request in 1.3062448501586914s
Received healthy response to inference request in 1.4347684383392334s
Received healthy response to inference request in 1.3770489692687988s
5 requests
0 failed requests
5th percentile: 1.3165739059448243
10th percentile: 1.3269029617309571
20th percentile: 1.3475610733032226
30th percentile: 1.361721897125244
40th percentile: 1.3693854331970214
50th percentile: 1.3770489692687988
60th percentile: 1.4001367568969727
70th percentile: 1.4232245445251466
80th percentile: 1.5737150192260743
90th percentile: 1.851608180999756
95th percentile: 1.9905547618865966
99th percentile: 2.1017120265960694
mean time: 1.5210907459259033
Pipeline stage StressChecker completed in 8.28s
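
The stress-check summary above follows directly from the five healthy response times; a short check (assuming NumPy's default linear-interpolation percentiles) reproduces the reported mean and percentiles:

    import numpy as np

    # The five healthy response times reported above, in seconds.
    times = [2.1295013427734375, 1.3578901290893555, 1.3062448501586914,
             1.4347684383392334, 1.3770489692687988]

    print("mean time:", np.mean(times))          # ~1.5211 s
    for p in [5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99]:
        print(f"{p}th percentile:", np.percentile(times, p))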
pjmixers-llama-3-1-instr_3684_v1 status is now deployed due to DeploymentManager action
pjmixers-llama-3-1-instr_3684_v1 status is now inactive due to auto deactivation of underperforming models
admin requested tearing down of pjmixers-llama-3-1-instr_3684_v1
Running pipeline stage ISVCDeleter
Checking if service pjmixers-llama-3-1-instr-3684-v1 is running
Tearing down inference service pjmixers-llama-3-1-instr-3684-v1
Service pjmixers-llama-3-1-instr-3684-v1 has been torn down
Pipeline stage ISVCDeleter completed in 4.99s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key pjmixers-llama-3-1-instr-3684-v1/config.json from bucket guanaco-mkml-models
Deleting key pjmixers-llama-3-1-instr-3684-v1/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key pjmixers-llama-3-1-instr-3684-v1/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key pjmixers-llama-3-1-instr-3684-v1/tokenizer.json from bucket guanaco-mkml-models
Deleting key pjmixers-llama-3-1-instr-3684-v1/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key pjmixers-llama-3-1-instr-3684-v1_reward/config.json from bucket guanaco-reward-models
Deleting key pjmixers-llama-3-1-instr-3684-v1_reward/merges.txt from bucket guanaco-reward-models
Deleting key pjmixers-llama-3-1-instr-3684-v1_reward/reward.tensors from bucket guanaco-reward-models
Deleting key pjmixers-llama-3-1-instr-3684-v1_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key pjmixers-llama-3-1-instr-3684-v1_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key pjmixers-llama-3-1-instr-3684-v1_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key pjmixers-llama-3-1-instr-3684-v1_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 5.37s
pjmixers-llama-3-1-instr_3684_v1 status is now torndown due to DeploymentManager action
