submission_id: saishf-sovlish-maid-l3-8b_v1
developer_uid: saishf
status: inactive
model_repo: saishf/SOVLish-Maid-L3-8B
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 4, 'max_output_tokens': 64}
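The generation_params include best_of: 4 alongside a separate reward_repo, which suggests best-of-n reranking: sample several candidate replies, score each with the reward model, and keep the highest-scoring one. A minimal sketch of that selection loop, with stand-in generate/score stubs (the helper name and stubs are illustrative, not the pipeline's actual code; the real system generates with saishf/SOVLish-Maid-L3-8B and scores with the GPT-2 reward model):

```python
def best_of_n(prompt, generate_fn, score_fn, n=4):
    """Sample n candidate completions and return the one the scorer prefers."""
    candidates = [generate_fn(prompt) for _ in range(n)]
    scores = [score_fn(prompt, c) for c in candidates]
    # Pick the candidate with the highest reward score.
    return max(zip(candidates, scores), key=lambda cs: cs[1])[0]

# Deterministic stand-ins for illustration only.
_replies = iter(["hi", "hello there", "hey", "hello"])
reply = best_of_n(
    "Anon: hi\nMaid:",
    generate_fn=lambda p: next(_replies),
    score_fn=lambda p, c: len(c),  # toy score: prefer longer replies
    n=4,
)
# reply == "hello there"
```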
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
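The formatter templates above compose the model input from persona memory, the scenario prompt, and the running chat; response_template leaves the text ending in "{bot_name}:" so the model continues in character, and the "\n" stopping word cuts generation off after that single message. A sketch of how the templates might be assembled (the render_prompt helper is hypothetical, not the pipeline's code):

```python
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def render_prompt(bot_name, user_name, memory, prompt, turns):
    """Assemble the full LM input; turns is a list of (speaker, message),
    with speaker in {"bot", "user"}."""
    parts = [
        formatter["memory_template"].format(bot_name=bot_name, memory=memory),
        formatter["prompt_template"].format(prompt=prompt),
    ]
    for speaker, message in turns:
        tmpl = formatter["bot_template"] if speaker == "bot" else formatter["user_template"]
        parts.append(tmpl.format(bot_name=bot_name, user_name=user_name, message=message))
    # Ends in "{bot_name}:" so the model writes the next bot message.
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)

text = render_prompt(
    bot_name="Maid", user_name="Anon",
    memory="A dutiful maid.", prompt="A quiet morning.",
    turns=[("bot", "Good morning!"), ("user", "Morning.")],
)
```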
timestamp: 2024-05-10T03:01:59+00:00
model_name: saishf-sovlish-maid-l3-8b_v1
model_eval_status: success
double_thumbs_up: 377
thumbs_up: 624
thumbs_down: 299
num_battles: 27232
num_wins: 12974
celo_rating: 1153.34
entertaining: 6.92
stay_in_character: 8.38
user_preference: 7.06
safety_score: 0.89
submission_type: basic
model_architecture: LlamaForCausalLM
model_num_parameters: 8030261248
best_of: 4
max_input_tokens: 512
max_output_tokens: 64
display_name: saishf-sovlish-maid-l3-8b_v1
double_thumbs_up_ratio: 0.29
feedback_count: 1300
ineligible_reason: None
language_model: saishf/SOVLish-Maid-L3-8B
model_score: 7.453333333333333
model_size: 8B
reward_model: ChaiML/reward_gpt2_medium_preference_24m_e2
single_thumbs_up_ratio: 0.48
thumbs_down_ratio: 0.23
thumbs_up_ratio: 0.77
us_pacific_date: 2024-05-09
win_ratio: 0.47642479435957696
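Several fields above appear to be derived from the raw counts: the thumbs ratios divide by feedback_count, win_ratio divides num_wins by num_battles, and model_score matches the mean of the three 0-10 eval axes. A quick check of those derivations (the formulas are inferred from the numbers, not documented by the page; the two-decimal ratios match after rounding):

```python
feedback_count = 1300
double_thumbs_up, thumbs_up, thumbs_down = 377, 624, 299
num_wins, num_battles = 12974, 27232

double_thumbs_up_ratio = double_thumbs_up / feedback_count         # 0.29
single_thumbs_up_ratio = thumbs_up / feedback_count                # 0.48
thumbs_down_ratio = thumbs_down / feedback_count                   # 0.23
thumbs_up_ratio = (double_thumbs_up + thumbs_up) / feedback_count  # 0.77
win_ratio = num_wins / num_battles                                 # 0.47642479435957696

# model_score looks like the mean of entertaining, stay_in_character,
# and user_preference:
model_score = (6.92 + 8.38 + 7.06) / 3                             # ~7.4533
```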
Running pipeline stage MKMLizer
Starting job with name saishf-sovlish-maid-l3-8b-v1-mkmlizer
Waiting for job on saishf-sovlish-maid-l3-8b-v1-mkmlizer to finish
saishf-sovlish-maid-l3-8b-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
saishf-sovlish-maid-l3-8b-v1-mkmlizer: ║ _____ __ __ ║
saishf-sovlish-maid-l3-8b-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
saishf-sovlish-maid-l3-8b-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
saishf-sovlish-maid-l3-8b-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
saishf-sovlish-maid-l3-8b-v1-mkmlizer: ║ /___/ ║
saishf-sovlish-maid-l3-8b-v1-mkmlizer: ║ ║
saishf-sovlish-maid-l3-8b-v1-mkmlizer: ║ Version: 0.8.10 ║
saishf-sovlish-maid-l3-8b-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
saishf-sovlish-maid-l3-8b-v1-mkmlizer: ║ ║
saishf-sovlish-maid-l3-8b-v1-mkmlizer: ║ The license key for the current software has been verified as ║
saishf-sovlish-maid-l3-8b-v1-mkmlizer: ║ belonging to: ║
saishf-sovlish-maid-l3-8b-v1-mkmlizer: ║ ║
saishf-sovlish-maid-l3-8b-v1-mkmlizer: ║ Chai Research Corp. ║
saishf-sovlish-maid-l3-8b-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
saishf-sovlish-maid-l3-8b-v1-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
saishf-sovlish-maid-l3-8b-v1-mkmlizer: ║ ║
saishf-sovlish-maid-l3-8b-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
saishf-sovlish-maid-l3-8b-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
saishf-sovlish-maid-l3-8b-v1-mkmlizer: warnings.warn(warning_message, FutureWarning)
saishf-sovlish-maid-l3-8b-v1-mkmlizer: Downloaded to shared memory in 57.555s
saishf-sovlish-maid-l3-8b-v1-mkmlizer: quantizing model to /dev/shm/model_cache
saishf-sovlish-maid-l3-8b-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
saishf-sovlish-maid-l3-8b-v1-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s]
saishf-sovlish-maid-l3-8b-v1-mkmlizer: Loading 0: 1%| | 2/291 [00:03<09:31, 1.98s/it]
saishf-sovlish-maid-l3-8b-v1-mkmlizer: Loading 0: 39%|███▉ | 114/291 [00:04<00:05, 30.01it/s]
saishf-sovlish-maid-l3-8b-v1-mkmlizer: Loading 0: 70%|██████▉ | 203/291 [00:06<00:01, 45.31it/s]
saishf-sovlish-maid-l3-8b-v1-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
saishf-sovlish-maid-l3-8b-v1-mkmlizer: quantized model in 18.707s
saishf-sovlish-maid-l3-8b-v1-mkmlizer: Processed model saishf/SOVLish-Maid-L3-8B in 77.327s
saishf-sovlish-maid-l3-8b-v1-mkmlizer: creating bucket guanaco-mkml-models
saishf-sovlish-maid-l3-8b-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
saishf-sovlish-maid-l3-8b-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/saishf-sovlish-maid-l3-8b-v1
saishf-sovlish-maid-l3-8b-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/saishf-sovlish-maid-l3-8b-v1/config.json
saishf-sovlish-maid-l3-8b-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/saishf-sovlish-maid-l3-8b-v1/special_tokens_map.json
saishf-sovlish-maid-l3-8b-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/saishf-sovlish-maid-l3-8b-v1/tokenizer_config.json
saishf-sovlish-maid-l3-8b-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/saishf-sovlish-maid-l3-8b-v1/tokenizer.json
saishf-sovlish-maid-l3-8b-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/saishf-sovlish-maid-l3-8b-v1/flywheel_model.0.safetensors
saishf-sovlish-maid-l3-8b-v1-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
saishf-sovlish-maid-l3-8b-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:913: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
saishf-sovlish-maid-l3-8b-v1-mkmlizer: warnings.warn(
saishf-sovlish-maid-l3-8b-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:757: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
saishf-sovlish-maid-l3-8b-v1-mkmlizer: warnings.warn(
saishf-sovlish-maid-l3-8b-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
saishf-sovlish-maid-l3-8b-v1-mkmlizer: warnings.warn(
saishf-sovlish-maid-l3-8b-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/torch/_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
saishf-sovlish-maid-l3-8b-v1-mkmlizer: return self.fget.__get__(instance, owner)()
saishf-sovlish-maid-l3-8b-v1-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
saishf-sovlish-maid-l3-8b-v1-mkmlizer: Saving duration: 0.335s
saishf-sovlish-maid-l3-8b-v1-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 14.973s
saishf-sovlish-maid-l3-8b-v1-mkmlizer: Bucket 's3://guanaco-reward-models/' created
saishf-sovlish-maid-l3-8b-v1-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/saishf-sovlish-maid-l3-8b-v1_reward
saishf-sovlish-maid-l3-8b-v1-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/saishf-sovlish-maid-l3-8b-v1_reward/config.json
saishf-sovlish-maid-l3-8b-v1-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/saishf-sovlish-maid-l3-8b-v1_reward/special_tokens_map.json
saishf-sovlish-maid-l3-8b-v1-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/saishf-sovlish-maid-l3-8b-v1_reward/tokenizer_config.json
saishf-sovlish-maid-l3-8b-v1-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/saishf-sovlish-maid-l3-8b-v1_reward/merges.txt
saishf-sovlish-maid-l3-8b-v1-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/saishf-sovlish-maid-l3-8b-v1_reward/vocab.json
saishf-sovlish-maid-l3-8b-v1-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/saishf-sovlish-maid-l3-8b-v1_reward/tokenizer.json
saishf-sovlish-maid-l3-8b-v1-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/saishf-sovlish-maid-l3-8b-v1_reward/reward.tensors
Job saishf-sovlish-maid-l3-8b-v1-mkmlizer completed after 116.97s with status: succeeded
Stopping job with name saishf-sovlish-maid-l3-8b-v1-mkmlizer
Pipeline stage MKMLizer completed in 121.72s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.09s
Running pipeline stage ISVCDeployer
Creating inference service saishf-sovlish-maid-l3-8b-v1
Waiting for inference service saishf-sovlish-maid-l3-8b-v1 to be ready
Inference service saishf-sovlish-maid-l3-8b-v1 ready after 40.24197053909302s
Pipeline stage ISVCDeployer completed in 47.83s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.9513061046600342s
Received healthy response to inference request in 1.1026654243469238s
Received healthy response to inference request in 1.126826524734497s
Received healthy response to inference request in 1.0952773094177246s
Received healthy response to inference request in 0.8029100894927979s
5 requests
0 failed requests
5th percentile: 0.8613835334777832
10th percentile: 0.9198569774627685
20th percentile: 1.0368038654327392
30th percentile: 1.0967549324035644
40th percentile: 1.099710178375244
50th percentile: 1.1026654243469238
60th percentile: 1.112329864501953
70th percentile: 1.1219943046569825
80th percentile: 1.2917224407196046
90th percentile: 1.6215142726898195
95th percentile: 1.7864101886749266
99th percentile: 1.9183269214630128
mean time: 1.2157970905303954
Pipeline stage StressChecker completed in 6.67s
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.03s
Running pipeline stage DaemonicSafetyScorer
Running M-Eval for topic stay_in_character
Pipeline stage DaemonicSafetyScorer completed in 0.04s
M-Eval Dataset for topic stay_in_character is loaded
saishf-sovlish-maid-l3-8b_v1 status is now deployed due to DeploymentManager action
saishf-sovlish-maid-l3-8b_v1 status is now inactive due to auto-deactivation of underperforming models

Usage Metrics

Latency Metrics