submission_id: chaiml-sao10k-l3-rp-v3-3_v13
developer_uid: robert_irvine
status: inactive
model_repo: ChaiML/sao10k-l3-rp-v3-3
reward_repo: ChaiML/gpt2_xl_pairwise_89m_step_347634
generation_params: {'temperature': 0.95, 'top_p': 0.95, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|end_header_id|>,', '<|eot_id|>,', '\n\n{user_name}'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>system<|end_header_id|>\n\nrespond with excitement<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'memory_template': '""', 'prompt_template': '""', 'bot_template': 'Bot: {message}\n', 'user_template': 'User: {message}\n', 'response_template': 'Bot:', 'truncate_by_message': False}
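The formatter templates above can be exercised with plain `str.format`; a minimal sketch of how a prompt might be assembled from them (the bot name, user name, persona, and messages are made up for illustration, and the separate `prompt_template` field is omitted):

```python
# Templates copied from the submission metadata above; sample data is illustrative.
memory_template = "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n"
user_template = "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>"
bot_template = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>"
response_template = "<|start_header_id|>system<|end_header_id|>\n\nrespond with excitement<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:"

def build_prompt(bot_name, user_name, memory, turns):
    """turns: list of (role, message) pairs with role in {'user', 'bot'}."""
    parts = [memory_template.format(bot_name=bot_name, memory=memory)]
    for role, message in turns:
        if role == "user":
            parts.append(user_template.format(user_name=user_name, message=message))
        else:
            parts.append(bot_template.format(bot_name=bot_name, message=message))
    # The response template leaves the assistant turn open for generation.
    parts.append(response_template.format(bot_name=bot_name))
    return "".join(parts)

prompt = build_prompt("Aria", "Sam", "A cheerful guide.", [("user", "Hi!")])
```

The resulting string ends with the open assistant header (`...{bot_name}:`), which is where generation continues until one of the configured stopping words is hit.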
timestamp: 2024-07-11T17:28:50+00:00
model_name: chaiml-sao10k-l3-rp-v3-3_v13
model_group: ChaiML/sao10k-l3-rp-v3-3
num_battles: 36836
num_wins: 19953
celo_rating: 1220.43
propriety_score: 0.7190213101815311
propriety_total_count: 5068.0
submission_type: basic
model_architecture: LlamaForCausalLM
model_num_parameters: 8030261248.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
display_name: chaiml-sao10k-l3-rp-v3-3_v13
ineligible_reason: None
language_model: ChaiML/sao10k-l3-rp-v3-3
model_size: 8B
reward_model: ChaiML/gpt2_xl_pairwise_89m_step_347634
us_pacific_date: 2024-07-11
win_ratio: 0.5416711912259746
preference_data_url: None
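The reported win_ratio is simply num_wins divided by num_battles; a quick check:

```python
num_battles = 36836
num_wins = 19953
win_ratio = num_wins / num_battles  # matches the reported 0.5416711912259746
```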
Running pipeline stage MKMLizer
Starting job with name chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer
Waiting for job on chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer to finish
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ║ _____ __ __ ║
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ║ /___/ ║
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ║ ║
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ║ Version: 0.8.14 ║
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ║ https://mk1.ai ║
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ║ ║
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ║ The license key for the current software has been verified as ║
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ║ belonging to: ║
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ║ ║
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ║ Chai Research Corp. ║
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ║ ║
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: Downloaded to shared memory in 23.369s
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: quantizing model to /dev/shm/model_cache
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: Saving flywheel model at /dev/shm/model_cache
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: Loading 0: 97%|█████████▋| 283/291 [00:08<00:00, 75.89it/s]
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: quantized model in 28.928s
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: Processed model ChaiML/sao10k-l3-rp-v3-3 in 52.298s
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: creating bucket guanaco-mkml-models
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/chaiml-sao10k-l3-rp-v3-3-v13
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/chaiml-sao10k-l3-rp-v3-3-v13/config.json
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/chaiml-sao10k-l3-rp-v3-3-v13/special_tokens_map.json
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/chaiml-sao10k-l3-rp-v3-3-v13/tokenizer_config.json
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/chaiml-sao10k-l3-rp-v3-3-v13/tokenizer.json
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/chaiml-sao10k-l3-rp-v3-3-v13/flywheel_model.0.safetensors
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: loading reward model from ChaiML/gpt2_xl_pairwise_89m_step_347634
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:919: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: warnings.warn(
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: warnings.warn(
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:769: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: warnings.warn(
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: warnings.warn(
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: Downloading shards: 100%|██████████| 2/2 [00:08<00:00, 4.16s/it]
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 2.47it/s]
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: Saving duration: 2.170s
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: Processed model ChaiML/gpt2_xl_pairwise_89m_step_347634 in 13.068s
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: creating bucket guanaco-reward-models
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: Bucket 's3://guanaco-reward-models/' created
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/chaiml-sao10k-l3-rp-v3-3-v13_reward
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/chaiml-sao10k-l3-rp-v3-3-v13_reward/config.json
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/chaiml-sao10k-l3-rp-v3-3-v13_reward/special_tokens_map.json
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/chaiml-sao10k-l3-rp-v3-3-v13_reward/tokenizer_config.json
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/chaiml-sao10k-l3-rp-v3-3-v13_reward/vocab.json
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/chaiml-sao10k-l3-rp-v3-3-v13_reward/merges.txt
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/chaiml-sao10k-l3-rp-v3-3-v13_reward/tokenizer.json
chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/chaiml-sao10k-l3-rp-v3-3-v13_reward/reward.tensors
Job chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer completed after 106.25s with status: succeeded
Stopping job with name chaiml-sao10k-l3-rp-v3-3-v13-mkmlizer
Pipeline stage MKMLizer completed in 107.48s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.14s
Running pipeline stage ISVCDeployer
Creating inference service chaiml-sao10k-l3-rp-v3-3-v13
Waiting for inference service chaiml-sao10k-l3-rp-v3-3-v13 to be ready
Inference service chaiml-sao10k-l3-rp-v3-3-v13 ready after 40.225454568862915s
Pipeline stage ISVCDeployer completed in 47.12s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.0708348751068115s
Received healthy response to inference request in 1.271669864654541s
Received healthy response to inference request in 1.291741132736206s
Received healthy response to inference request in 1.2456088066101074s
Received healthy response to inference request in 1.2687592506408691s
5 requests
0 failed requests
5th percentile: 1.2502388954162598
10th percentile: 1.254868984222412
20th percentile: 1.2641291618347168
30th percentile: 1.2693413734436034
40th percentile: 1.2705056190490722
50th percentile: 1.271669864654541
60th percentile: 1.279698371887207
70th percentile: 1.287726879119873
80th percentile: 1.4475598812103272
90th percentile: 1.7591973781585695
95th percentile: 1.9150161266326904
99th percentile: 2.0396711254119873
mean time: 1.429722785949707
Pipeline stage StressChecker completed in 7.94s
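The StressChecker percentiles above are consistent with linear interpolation over the five sorted latencies (the method NumPy calls "linear"); a pure-Python sketch that reproduces them:

```python
def percentile(values, p):
    """Percentile via linear interpolation between closest ranks."""
    xs = sorted(values)
    k = (p / 100) * (len(xs) - 1)
    lo = int(k)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (k - lo) * (xs[hi] - xs[lo])

# The five healthy-response latencies logged above, in seconds.
latencies = [
    2.0708348751068115,
    1.271669864654541,
    1.291741132736206,
    1.2456088066101074,
    1.2687592506408691,
]
p50 = percentile(latencies, 50)          # 1.271669864654541, the logged median
mean_time = sum(latencies) / len(latencies)  # ~1.4297, the logged mean time
```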
chaiml-sao10k-l3-rp-v3-3_v13 status is now deployed due to DeploymentManager action
chaiml-sao10k-l3-rp-v3-3_v13 status is now inactive due to auto-deactivation of underperforming models
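The submission pairs best_of=8 sampling with the pairwise reward model: several candidate completions are drawn per prompt and the reward model's top-scoring one is returned. A minimal sketch of that selection step, where `generate_candidates` and `reward_score` are hypothetical stand-ins for the actual language-model and reward-model calls:

```python
def pick_best_response(prompt, generate_candidates, reward_score, best_of=8):
    """Draw `best_of` candidates and keep the one the reward model scores highest.

    `generate_candidates(prompt, n)` and `reward_score(prompt, completion)` are
    hypothetical stand-ins, not APIs from this pipeline.
    """
    candidates = generate_candidates(prompt, best_of)
    return max(candidates, key=lambda c: reward_score(prompt, c))

# Toy usage with stub functions:
best = pick_best_response(
    "User: Hi!\nBot:",
    generate_candidates=lambda p, n: [f"reply-{i}" for i in range(n)],
    reward_score=lambda p, c: int(c.split("-")[1]),  # stub: score by index
)
# best == "reply-7"
```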