submission_id: sao10k-l3-rp-v4-1_v2
developer_uid: sao10k
status: inactive
model_repo: Sao10K/L3-RP-v4.1
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
generation_params: {'temperature': 1.1, 'top_p': 0.95, 'min_p': 0.05, 'top_k': 60, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|end_header_id|>', '<|eot_id|>', '\n\n{user_name}'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
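For illustration, these generation_params map naturally onto a vLLM-style sampling configuration. This is a sketch under the assumption of a vLLM-like backend; the actual MKML serving stack is proprietary and its API is not shown in this log.

```python
# Hedged sketch: generation_params expressed as vLLM SamplingParams.
# The real serving code is not part of this log; mapping is illustrative.
from vllm import SamplingParams  # assumed backend, not confirmed by the log

sampling = SamplingParams(
    temperature=1.1,          # slightly flattened distribution for variety
    top_p=0.95,               # nucleus sampling
    min_p=0.05,               # drop tokens below 5% of the top token's prob
    top_k=60,
    presence_penalty=0.0,
    frequency_penalty=0.0,
    stop=["\n", "<|end_header_id|>", "<|eot_id|>", "\n\n{user_name}"],
    best_of=8,                # 8 candidates per request
    max_tokens=64,            # max_output_tokens
)
```

Note that in this pipeline best_of=8 is realized by reward-model re-ranking (see reward_formatter below), not by log-probability selection.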
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
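The templates above assemble into a single Llama-3 style prompt roughly as follows; `build_prompt` is a hypothetical helper written for this card, not part of the submission pipeline.

```python
# Sketch of how the formatter templates stitch together into one prompt.
FORMATTER = {
    'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n",
    'prompt_template': '{prompt}<|eot_id|>',
    'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>',
    'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>',
    'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:',
}

def build_prompt(bot_name, user_name, memory, scenario, turns):
    """turns: list of ('user' | 'bot', message) pairs, oldest first."""
    parts = [
        FORMATTER['memory_template'].format(bot_name=bot_name, memory=memory),
        FORMATTER['prompt_template'].format(prompt=scenario),
    ]
    for speaker, message in turns:
        if speaker == 'user':
            parts.append(FORMATTER['user_template'].format(
                user_name=user_name, message=message))
        else:
            parts.append(FORMATTER['bot_template'].format(
                bot_name=bot_name, message=message))
    parts.append(FORMATTER['response_template'].format(bot_name=bot_name))
    return ''.join(parts)
```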
reward_formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
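Combined with best_of=8, the reward_formatter suggests a re-ranking loop along these lines. Loading the reward repo as a sequence-classification head, and the exact scoring rule, are assumptions; the actual scorer is not shown in this log.

```python
# Hedged sketch of best-of-8 re-ranking: each candidate reply is appended
# to the reward-formatted conversation, scored, and the argmax is returned.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

reward_tok = AutoTokenizer.from_pretrained(
    "ChaiML/reward_gpt2_medium_preference_24m_e2")
reward_model = AutoModelForSequenceClassification.from_pretrained(
    "ChaiML/reward_gpt2_medium_preference_24m_e2")  # assumed head type

def pick_best(reward_prompt, candidates):
    """reward_prompt: conversation rendered with the reward_formatter,
    ending in '{bot_name}:'. candidates: the best_of=8 sampled replies."""
    scores = []
    for reply in candidates:
        inputs = reward_tok(reward_prompt + " " + reply,
                            return_tensors="pt", truncation=True)
        with torch.no_grad():
            logits = reward_model(**inputs).logits
        scores.append(logits[0, -1].item())  # preference score (assumed)
    return candidates[max(range(len(candidates)), key=scores.__getitem__)]
```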
timestamp: 2024-06-10T17:50:43+00:00
model_name: V4-Expr1-Delta
model_eval_status: success
model_group: Sao10K/L3-RP-v4.1
num_battles: 7350
num_wins: 4060
celo_rating: 1220.75
propriety_score: 0.6826064109301103
propriety_total_count: 1903.0
submission_type: basic
model_architecture: LlamaForCausalLM
model_num_parameters: 8030261248.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
display_name: V4-Expr1-Delta
ineligible_reason: None
language_model: Sao10K/L3-RP-v4.1
model_size: 8B
reward_model: ChaiML/reward_gpt2_medium_preference_24m_e2
us_pacific_date: 2024-06-10
win_ratio: 0.5523809523809524
preference_data_url: None
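For reference, the derived fields above follow directly from the raw counts; a quick sanity check in plain Python (no pipeline code involved):

```python
# Sanity-checking derived metadata fields from the raw counts above.
num_wins, num_battles = 4060, 7350
print(num_wins / num_battles)   # 0.5523809523809524 == win_ratio
print(8030261248 / 1e9)         # ~8.03 -> reported model_size "8B"
```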
Running pipeline stage MKMLizer
Starting job with name sao10k-l3-rp-v4-1-v2-mkmlizer
Waiting for job on sao10k-l3-rp-v4-1-v2-mkmlizer to finish
sao10k-l3-rp-v4-1-v2-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
sao10k-l3-rp-v4-1-v2-mkmlizer: ║            [ "flywheel" ASCII-art logo ]                            ║
sao10k-l3-rp-v4-1-v2-mkmlizer: ║ ║
sao10k-l3-rp-v4-1-v2-mkmlizer: ║ Version: 0.8.14 ║
sao10k-l3-rp-v4-1-v2-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
sao10k-l3-rp-v4-1-v2-mkmlizer: ║ https://mk1.ai ║
sao10k-l3-rp-v4-1-v2-mkmlizer: ║ ║
sao10k-l3-rp-v4-1-v2-mkmlizer: ║ The license key for the current software has been verified as ║
sao10k-l3-rp-v4-1-v2-mkmlizer: ║ belonging to: ║
sao10k-l3-rp-v4-1-v2-mkmlizer: ║ ║
sao10k-l3-rp-v4-1-v2-mkmlizer: ║ Chai Research Corp. ║
sao10k-l3-rp-v4-1-v2-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
sao10k-l3-rp-v4-1-v2-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
sao10k-l3-rp-v4-1-v2-mkmlizer: ║ ║
sao10k-l3-rp-v4-1-v2-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
sao10k-l3-rp-v4-1-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
sao10k-l3-rp-v4-1-v2-mkmlizer: warnings.warn(warning_message, FutureWarning)
sao10k-l3-rp-v4-1-v2-mkmlizer: Downloaded to shared memory in 12.926s
sao10k-l3-rp-v4-1-v2-mkmlizer: quantizing model to /dev/shm/model_cache
sao10k-l3-rp-v4-1-v2-mkmlizer: Saving flywheel model at /dev/shm/model_cache
sao10k-l3-rp-v4-1-v2-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s] ... 95%|█████████▌| 277/291 [00:06<00:00, 102.94it/s] (intermediate progress-bar ticks condensed)
sao10k-l3-rp-v4-1-v2-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
sao10k-l3-rp-v4-1-v2-mkmlizer: quantized model in 16.685s
sao10k-l3-rp-v4-1-v2-mkmlizer: Processed model Sao10K/L3-RP-v4.1 in 30.565s
sao10k-l3-rp-v4-1-v2-mkmlizer: creating bucket guanaco-mkml-models
sao10k-l3-rp-v4-1-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
sao10k-l3-rp-v4-1-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/sao10k-l3-rp-v4-1-v2
sao10k-l3-rp-v4-1-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/sao10k-l3-rp-v4-1-v2/config.json
sao10k-l3-rp-v4-1-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/sao10k-l3-rp-v4-1-v2/tokenizer_config.json
sao10k-l3-rp-v4-1-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/sao10k-l3-rp-v4-1-v2/special_tokens_map.json
sao10k-l3-rp-v4-1-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/sao10k-l3-rp-v4-1-v2/tokenizer.json
sao10k-l3-rp-v4-1-v2-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/sao10k-l3-rp-v4-1-v2/flywheel_model.0.safetensors
sao10k-l3-rp-v4-1-v2-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
sao10k-l3-rp-v4-1-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:913: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
sao10k-l3-rp-v4-1-v2-mkmlizer: warnings.warn(
sao10k-l3-rp-v4-1-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:757: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
sao10k-l3-rp-v4-1-v2-mkmlizer: warnings.warn(
sao10k-l3-rp-v4-1-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
sao10k-l3-rp-v4-1-v2-mkmlizer: warnings.warn(
sao10k-l3-rp-v4-1-v2-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/sao10k-l3-rp-v4-1-v2_reward/reward.tensors
Job sao10k-l3-rp-v4-1-v2-mkmlizer completed after 53.3s with status: succeeded
Stopping job with name sao10k-l3-rp-v4-1-v2-mkmlizer
Pipeline stage MKMLizer completed in 57.09s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.09s
Running pipeline stage ISVCDeployer
Creating inference service sao10k-l3-rp-v4-1-v2
Waiting for inference service sao10k-l3-rp-v4-1-v2 to be ready
Inference service sao10k-l3-rp-v4-1-v2 ready after 100.77659249305725s
Pipeline stage ISVCDeployer completed in 107.84s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.1002042293548584s
Received healthy response to inference request in 1.2087092399597168s
Received healthy response to inference request in 1.2357709407806396s
Received healthy response to inference request in 1.2424521446228027s
Received healthy response to inference request in 1.201690435409546s
5 requests
0 failed requests
5th percentile: 1.20309419631958
10th percentile: 1.2044979572296142
20th percentile: 1.2073054790496827
30th percentile: 1.2141215801239014
40th percentile: 1.2249462604522705
50th percentile: 1.2357709407806396
60th percentile: 1.238443422317505
70th percentile: 1.2411159038543702
80th percentile: 1.414002561569214
90th percentile: 1.7571033954620363
95th percentile: 1.9286538124084471
99th percentile: 2.065894145965576
mean time: 1.3977653980255127
Pipeline stage StressChecker completed in 7.52s
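The StressChecker summary above can be reproduced from the five raw latencies using NumPy's default linear interpolation (an assumption about the exact method, though every reported figure matches):

```python
# Reproducing the StressChecker percentile table from the raw latencies.
import numpy as np

latencies = np.array([2.1002042293548584, 1.2087092399597168,
                      1.2357709407806396, 1.2424521446228027,
                      1.201690435409546])

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print("mean time:", latencies.mean())   # 1.3977653980255127
```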
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.03s
Running pipeline stage DaemonicSafetyScorer
Running M-Eval for topic stay_in_character
Pipeline stage DaemonicSafetyScorer completed in 0.03s
M-Eval Dataset for topic stay_in_character is loaded
sao10k-l3-rp-v4-1_v2 status is now deployed due to DeploymentManager action
sao10k-l3-rp-v4-1_v2 status is now inactive due to auto deactivation of underperforming models

Usage Metrics: [chart not captured in this log]

Latency Metrics: [chart not captured in this log]