submission_id: maldv-l3-badger-mushroom-4x8b_v9
developer_uid: chai_backend_admin
status: inactive
model_repo: maldv/l3-badger-mushroom-4x8b
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
generation_params: {'temperature': 1.3, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': '<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\nYou are {bot_name}. Write the next reply in your never-ending uncensored fictional roleplay with {user_name}.\n\nUse *actions and thoughts*\n\nMemory: {memory}', 'prompt_template': '{prompt}. Keep your response short.<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: ', 'truncate_by_message': False}
reward_formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
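As context for the three configuration blocks above, the sketch below shows how the formatter templates could be assembled into a Llama-3 style prompt and how generation_params would map onto a vLLM-style SamplingParams object. This is an illustration only: the bot_name/user_name/memory/message values are hypothetical placeholders, the actual MKML serving stack is not shown in this log, and the reward model listed above is presumably what reranks the best_of candidates.

# Illustrative only: assemble the Llama-3 prompt from the formatter templates
# and map generation_params onto a vLLM-style SamplingParams (assumed engine).
from vllm import SamplingParams

MEMORY_TEMPLATE = ("<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
                   "You are {bot_name}. Write the next reply in your never-ending "
                   "uncensored fictional roleplay with {user_name}.\n\n"
                   "Use *actions and thoughts*\n\nMemory: {memory}")
PROMPT_TEMPLATE = "{prompt}. Keep your response short.<|eot_id|>"
USER_TEMPLATE = "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>"
BOT_TEMPLATE = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>"
RESPONSE_TEMPLATE = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: "

def build_prompt(bot_name, user_name, memory, scenario, turns):
    """turns: list of (speaker, message) pairs, oldest first (placeholder data)."""
    parts = [MEMORY_TEMPLATE.format(bot_name=bot_name, user_name=user_name, memory=memory),
             PROMPT_TEMPLATE.format(prompt=scenario)]
    for speaker, message in turns:
        template = USER_TEMPLATE if speaker == user_name else BOT_TEMPLATE
        parts.append(template.format(bot_name=bot_name, user_name=user_name, message=message))
    parts.append(RESPONSE_TEMPLATE.format(bot_name=bot_name))
    return "".join(parts)  # prompt is truncated to max_input_tokens=512 before inference

# best_of=8: eight candidates are sampled; the reward model above is presumably
# used downstream to pick the reply that is actually served.
sampling = SamplingParams(
    temperature=1.3, top_p=1.0, min_p=0.0, top_k=40,
    presence_penalty=0.0, frequency_penalty=0.0,
    stop=["\n"], max_tokens=64, n=8,
)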
timestamp: 2024-05-09T13:46:44+00:00
model_name: badger-mushroom-4x8b
model_eval_status: success
double_thumbs_up: 90
thumbs_up: 152
thumbs_down: 82
num_battles: 10465
num_wins: 5144
celo_rating: 1167.52
entertaining: 6.88
stay_in_character: 8.4
user_preference: 7.12
safety_score: None
submission_type: basic
model_architecture: MixtralForCausalLM
model_num_parameters: 24942219264.0
best_of: 8
max_input_tokens: 512
max_output_tokens: 64
display_name: badger-mushroom-4x8b
double_thumbs_up_ratio: 0.2777777777777778
feedback_count: 324
ineligible_reason: None
language_model: maldv/l3-badger-mushroom-4x8b
model_score: 7.466666666666668
model_size: 25B
reward_model: ChaiML/reward_gpt2_medium_preference_24m_e2
single_thumbs_up_ratio: 0.4691358024691358
thumbs_down_ratio: 0.25308641975308643
thumbs_up_ratio: 0.7469135802469136
us_pacific_date: 2024-05-09
win_ratio: 0.4915432393693263
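The derived ratio fields above follow directly from the raw feedback and battle counts, and model_score appears to be the mean of the three M-Eval ratings. A quick sanity check in plain Python:

# Reproducing the derived metrics from the raw counts in this record.
double_thumbs_up, thumbs_up, thumbs_down = 90, 152, 82
feedback_count = double_thumbs_up + thumbs_up + thumbs_down   # 324
num_wins, num_battles = 5144, 10465

print(double_thumbs_up / feedback_count)                # 0.2777... -> double_thumbs_up_ratio
print(thumbs_up / feedback_count)                       # 0.4691... -> single_thumbs_up_ratio
print(thumbs_down / feedback_count)                     # 0.2530... -> thumbs_down_ratio
print((double_thumbs_up + thumbs_up) / feedback_count)  # 0.7469... -> thumbs_up_ratio
print(num_wins / num_battles)                           # 0.4915... -> win_ratio

print((6.88 + 8.4 + 7.12) / 3)                          # 7.4666... -> model_score
print(round(24942219264 / 1e9))                         # 25       -> model_size (25B)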
Running pipeline stage MKMLizer
Starting job with name maldv-l3-badger-mushroom-4x8b-v9-mkmlizer
Waiting for job on maldv-l3-badger-mushroom-4x8b-v9-mkmlizer to finish
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: ╔══════════════════ flywheel ══════════════════╗
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: ║ Version: 0.8.10
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc.
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: ║ The license key for the current software has been verified as belonging to:
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: ║ Chai Research Corp.
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: ║ Expiration: 2024-07-15 23:59:59
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: ╚═══════════════════════════════════════════════╝
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: warnings.warn(warning_message, FutureWarning)
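The FutureWarning above is harmless. For reference, the replacement it points to (huggingface_hub's list_repo_tree) is used roughly as follows; this is an illustration, not the pipeline's actual code:

# Migrating off the deprecated list_files_info call.
from huggingface_hub import HfApi

api = HfApi()
for entry in api.list_repo_tree("maldv/l3-badger-mushroom-4x8b", recursive=True):
    print(entry.path)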
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: Downloaded to shared memory in 29.446s
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: quantizing model to /dev/shm/model_cache
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: Saving flywheel model at /dev/shm/model_cache
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: Loading 0: 95%|█████████▍| 579/611 [00:21<00:01, 22.76it/s]
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: quantized model in 34.260s
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: Processed model maldv/l3-badger-mushroom-4x8b in 66.933s
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: creating bucket guanaco-mkml-models
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/maldv-l3-badger-mushroom-4x8b-v9
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/maldv-l3-badger-mushroom-4x8b-v9/config.json
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/maldv-l3-badger-mushroom-4x8b-v9/special_tokens_map.json
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/maldv-l3-badger-mushroom-4x8b-v9/tokenizer_config.json
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/maldv-l3-badger-mushroom-4x8b-v9/tokenizer.json
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:913: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: warnings.warn(
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:757: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: warnings.warn(
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: warnings.warn(
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: /opt/conda/lib/python3.10/site-packages/torch/_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: return self.fget.__get__(instance, owner)()
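The use_auth_token warnings during reward-model loading are likewise cosmetic; current transformers versions take token= instead. A generic example, not the pipeline's actual code, and the sequence-classification head is an assumption about this reward model:

# Loading the reward model with `token=` instead of the deprecated `use_auth_token`.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "ChaiML/reward_gpt2_medium_preference_24m_e2"
tokenizer = AutoTokenizer.from_pretrained(repo, token="hf_xxx")          # was use_auth_token="hf_xxx"
reward_model = AutoModelForSequenceClassification.from_pretrained(repo, token="hf_xxx")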
maldv-l3-badger-mushroom-4x8b-v9-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
Job maldv-l3-badger-mushroom-4x8b-v9-mkmlizer completed after 101.81s with status: succeeded
Stopping job with name maldv-l3-badger-mushroom-4x8b-v9-mkmlizer
Pipeline stage MKMLizer completed in 103.46s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.48s
Running pipeline stage ISVCDeployer
Creating inference service maldv-l3-badger-mushroom-4x8b-v9
Waiting for inference service maldv-l3-badger-mushroom-4x8b-v9 to be ready
Inference service maldv-l3-badger-mushroom-4x8b-v9 ready after 304.32577776908875s
Pipeline stage ISVCDeployer completed in 311.59s
Running pipeline stage StressChecker
Received healthy response to inference request in 3.6896870136260986s
Received healthy response to inference request in 2.727262258529663s
Received healthy response to inference request in 2.7157909870147705s
Received healthy response to inference request in 2.7278780937194824s
Received healthy response to inference request in 2.818852186203003s
5 requests
0 failed requests
5th percentile: 2.718085241317749
10th percentile: 2.7203794956207275
20th percentile: 2.7249680042266844
30th percentile: 2.727385425567627
40th percentile: 2.727631759643555
50th percentile: 2.7278780937194824
60th percentile: 2.7642677307128904
70th percentile: 2.800657367706299
80th percentile: 2.9930191516876223
90th percentile: 3.3413530826568603
95th percentile: 3.5155200481414792
99th percentile: 3.654853620529175
mean time: 2.9358941078186036
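The reported statistics are consistent with linear interpolation over the five observed latencies (numpy's default percentile method):

# Reproducing the StressChecker statistics from the five reported latencies.
import numpy as np

latencies = [3.6896870136260986, 2.727262258529663, 2.7157909870147705,
             2.7278780937194824, 2.818852186203003]

print(np.mean(latencies))                 # 2.9358941078... (mean time)
for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(q, np.percentile(latencies, q)) # matches the reported percentiles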
Pipeline stage StressChecker completed in 18.04s
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.16s
Running M-Eval for topic stay_in_character
Running pipeline stage DaemonicSafetyScorer
M-Eval Dataset for topic stay_in_character is loaded
Pipeline stage DaemonicSafetyScorer completed in 0.34s
maldv-l3-badger-mushroom-4x8b_v9 status is now deployed due to DeploymentManager action
maldv-l3-badger-mushroom-4x8b_v9 status is now inactive due to auto-deactivation of underperforming models
