submission_id: sao10k-l3-1-8b-stheno-3-_8136_v1
developer_uid: sao10k
alignment_samples: 0
best_of: 16
celo_rating: 1215.1
display_name: Stheno-beta-percentagedataset
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
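For illustration, a minimal sketch of how these templates could be assembled into a single Llama 3 prompt string (the build_prompt helper and the example persona and turns are hypothetical, not the production serving code):

```python
# Minimal sketch using the formatter dict shown above; illustrative only.
FORMATTER = {
    "memory_template": ("<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
                        "{bot_name}'s Persona: {memory}\n\n"),
    "prompt_template": "{prompt}<|eot_id|>",
    "user_template": "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>",
    "bot_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>",
    "response_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    # System persona + scenario prompt, then alternating chat turns, then an
    # open assistant header that the model is asked to complete.
    parts = [
        FORMATTER["memory_template"].format(bot_name=bot_name, memory=memory),
        FORMATTER["prompt_template"].format(prompt=prompt),
    ]
    for role, message in turns:
        template = FORMATTER["user_template"] if role == "user" else FORMATTER["bot_template"]
        parts.append(template.format(bot_name=bot_name, user_name=user_name, message=message))
    parts.append(FORMATTER["response_template"].format(bot_name=bot_name))
    return "".join(parts)

# Hypothetical example inputs:
print(build_prompt("Stheno", "User", "A friendly companion.", "A quiet cafe.",
                   [("user", "Hi there!")]))
```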
generation_params: {'temperature': 1.2, 'top_p': 1.0, 'min_p': 0.1, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|end_header_id|>,', '<|eot_id|>,', '\n\n{user_name}', 'You:', '\n\n'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64, 'reward_max_token_input': 256}
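These settings govern decoding: temperature 1.2 with top_k 40 and min_p 0.1 (top_p 1.0 is a no-op here), stopping at any listed stop string or after 64 output tokens. A minimal sketch of one decoding step under common top-k and min-p semantics (sample_token is a hypothetical helper, not the serving stack):

```python
import numpy as np

def sample_token(logits, temperature=1.2, top_k=40, min_p=0.1,
                 rng=np.random.default_rng(0)):
    # Temperature-scaled softmax.
    z = logits / temperature
    probs = np.exp(z - z.max())
    probs /= probs.sum()
    # top_k: keep only the 40 highest-probability tokens.
    if len(probs) > top_k:
        cutoff = np.sort(probs)[-top_k]
        probs[probs < cutoff] = 0.0
    # min_p: additionally drop tokens below 10% of the top token's probability.
    probs[probs < min_p * probs.max()] = 0.0
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))
```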
is_internal_developer: False
language_model: Sao10K/L3.1-8B-Stheno-3.4-Beta
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: Sao10K/L3.1-8B-Stheno-3.
model_name: Stheno-beta-percentagedataset
model_num_parameters: 8030261248.0
model_repo: Sao10K/L3.1-8B-Stheno-3.4-Beta
model_size: 8B
num_battles: 13804
num_wins: 6847
propriety_score: 0.7178257394084733
propriety_total_count: 1251.0
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: Jellywibble/gpt2_xl_pairwise_89m_step_347634
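With best_of: 16 above, 16 candidate completions are drawn per turn and the reward model reranks them; the reward_formatter renders the conversation in the plain format the reward model scores. A hedged sketch of best-of-N selection, assuming the repo loads as a Hugging Face sequence-classification model with a scalar head (the actual scoring interface is not shown on this page):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

REWARD_REPO = "Jellywibble/gpt2_xl_pairwise_89m_step_347634"
tokenizer = AutoTokenizer.from_pretrained(REWARD_REPO)
reward_model = AutoModelForSequenceClassification.from_pretrained(REWARD_REPO).eval()

def best_of_n(context, candidates):
    # Score each candidate continuation and return the highest-scoring one.
    # max_length mirrors reward_max_token_input: 256 from the params above.
    scores = []
    for candidate in candidates:
        enc = tokenizer(context + candidate, return_tensors="pt",
                        truncation=True, max_length=256)
        with torch.no_grad():
            scores.append(reward_model(**enc).logits.squeeze().item())
    return max(zip(scores, candidates))[1]
```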
status: torndown
submission_type: basic
timestamp: 2024-08-04T10:10:01+00:00
us_pacific_date: 2024-08-04
win_ratio: 0.4960156476383657
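The derived fields are consistent with the raw counts; a quick sanity check using values copied from this page:

```python
num_wins, num_battles = 6847, 13804
print(num_wins / num_battles)   # 0.4960156476383657 -> matches win_ratio
print(8030261248 / 1e9)         # 8.03 -> matches model_size "8B"
```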
Running pipeline stage MKMLizer
Starting job with name sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer
Waiting for job on sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer to finish
Stopping job with name sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer
%s, retrying in %s seconds...
Starting job with name sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer
Waiting for job on sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer to finish
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: ║ [flywheel ASCII art banner] ║
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: ║ ║
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: ║ Version: 0.9.9 ║
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: ║ https://mk1.ai ║
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: ║ ║
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: ║ The license key for the current software has been verified as ║
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: ║ belonging to: ║
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: ║ ║
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: ║ Chai Research Corp. ║
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: ║ ║
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: Downloaded to shared memory in 30.597s
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpmdrtfcf_, device:0
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: quantized model in 26.038s
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: Processed model Sao10K/L3.1-8B-Stheno-3.4-Beta in 56.635s
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: creating bucket guanaco-mkml-models
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/sao10k-l3-1-8b-stheno-3-8136-v1
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/sao10k-l3-1-8b-stheno-3-8136-v1/config.json
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/sao10k-l3-1-8b-stheno-3-8136-v1/special_tokens_map.json
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/sao10k-l3-1-8b-stheno-3-8136-v1/tokenizer_config.json
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/sao10k-l3-1-8b-stheno-3-8136-v1/tokenizer.json
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/sao10k-l3-1-8b-stheno-3-8136-v1/flywheel_model.0.safetensors
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: loading reward model from Jellywibble/gpt2_xl_pairwise_89m_step_347634
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s]
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: Loading 0: 97%|█████████▋| 283/291 [00:05<00:00, 84.98it/s]
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: warnings.warn(
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: warnings.warn(
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: warnings.warn(
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: Saving duration: 1.437s
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: Processed model Jellywibble/gpt2_xl_pairwise_89m_step_347634 in 11.362s
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: Bucket 's3://guanaco-reward-models/' created
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/sao10k-l3-1-8b-stheno-3-8136-v1_reward
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/sao10k-l3-1-8b-stheno-3-8136-v1_reward/special_tokens_map.json
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/sao10k-l3-1-8b-stheno-3-8136-v1_reward/tokenizer_config.json
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/sao10k-l3-1-8b-stheno-3-8136-v1_reward/config.json
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/sao10k-l3-1-8b-stheno-3-8136-v1_reward/merges.txt
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/sao10k-l3-1-8b-stheno-3-8136-v1_reward/vocab.json
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/sao10k-l3-1-8b-stheno-3-8136-v1_reward/tokenizer.json
sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/sao10k-l3-1-8b-stheno-3-8136-v1_reward/reward.tensors
Job sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer completed after 95.15s with status: succeeded
Stopping job with name sao10k-l3-1-8b-stheno-3-8136-v1-mkmlizer
Pipeline stage MKMLizer completed in 96.85s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.11s
Running pipeline stage ISVCDeployer
Creating inference service sao10k-l3-1-8b-stheno-3-8136-v1
Waiting for inference service sao10k-l3-1-8b-stheno-3-8136-v1 to be ready
Inference service sao10k-l3-1-8b-stheno-3-8136-v1 ready after 150.90935230255127s
Pipeline stage ISVCDeployer completed in 153.01s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.1741721630096436s
Received healthy response to inference request in 1.4037690162658691s
Received healthy response to inference request in 1.2180516719818115s
Received healthy response to inference request in 1.3940279483795166s
Received healthy response to inference request in 1.45383620262146s
5 requests
0 failed requests
5th percentile: 1.2532469272613525
10th percentile: 1.2884421825408936
20th percentile: 1.3588326930999757
30th percentile: 1.3959761619567872
40th percentile: 1.3998725891113282
50th percentile: 1.4037690162658691
60th percentile: 1.4237958908081054
70th percentile: 1.4438227653503417
80th percentile: 1.5979033946990968
90th percentile: 1.8860377788543703
95th percentile: 2.0301049709320065
99th percentile: 2.145358724594116
mean time: 1.5287714004516602
Pipeline stage StressChecker completed in 8.29s
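The reported percentiles match NumPy's default linear interpolation over the five response times; a check using values copied from the log above:

```python
import numpy as np

times = [2.1741721630096436, 1.4037690162658691, 1.2180516719818115,
         1.3940279483795166, 1.45383620262146]
print(np.percentile(times, 5))   # 1.2532469272613525, matching the 5th percentile
print(np.percentile(times, 50))  # 1.4037690162658691, the median
print(np.mean(times))            # 1.5287714004516602, matching the mean time
```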
sao10k-l3-1-8b-stheno-3-_8136_v1 status is now deployed due to DeploymentManager action
sao10k-l3-1-8b-stheno-3-_8136_v1 status is now inactive due to auto deactivation of underperforming models
sao10k-l3-1-8b-stheno-3-_8136_v1 status is now torndown due to DeploymentManager action
