submission_id: hastagaras-jamet-8b-l3-mk-i_v6
developer_uid: Hastagaras
status: inactive
model_repo: Hastagaras/Jamet-8B-L3-MK.I
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
generation_params: {'temperature': 1.05, 'top_p': 1.0, 'min_p': 0.085, 'top_k': 100, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
timestamp: 2024-06-11T03:48:42+00:00
model_name: p
model_eval_status: success
model_group: Hastagaras/Jamet-8B-L3-M
num_battles: 11072
num_wins: 6045
celo_rating: 1211.14
safety_score: 0.97
propriety_score: 0.6694852026858991
propriety_total_count: 4021.0
submission_type: basic
model_architecture: LlamaForCausalLM
model_num_parameters: 8030261248.0
best_of: 16
max_input_tokens: 512
max_output_tokens: 64
display_name: p
ineligible_reason: propriety_total_count < 5000
language_model: Hastagaras/Jamet-8B-L3-MK.I
model_size: 8B
reward_model: ChaiML/reward_gpt2_medium_preference_24m_e2
us_pacific_date: 2024-06-10
win_ratio: 0.5459718208092486
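Note that the win_ratio above is simply num_wins / num_battles = 6045 / 11072 ≈ 0.546. The sketch below is a hypothetical illustration (not the Chai serving code) of how the formatter templates listed above could be assembled into the Llama-3 prompt, and how best_of=16 reranking against the reward model might work; generate_candidates and reward_score are placeholder callables standing in for the deployed Hastagaras/Jamet-8B-L3-MK.I model and ChaiML/reward_gpt2_medium_preference_24m_e2.

# Hypothetical sketch only; assumes the formatter dict shown above.
from typing import Callable, List, Tuple

FORMATTER = {
    "memory_template": "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n",
    "prompt_template": "{prompt}<|eot_id|>",
    "bot_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>",
    "user_template": "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>",
    "response_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:",
}

def build_prompt(memory: str, prompt: str, turns: List[Tuple[str, str]],
                 bot_name: str, user_name: str) -> str:
    # turns is a list of (speaker, message) pairs; speaker is either bot_name or user_name.
    text = FORMATTER["memory_template"].format(bot_name=bot_name, memory=memory)
    text += FORMATTER["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        if speaker == bot_name:
            text += FORMATTER["bot_template"].format(bot_name=bot_name, message=message)
        else:
            text += FORMATTER["user_template"].format(user_name=user_name, message=message)
    return text + FORMATTER["response_template"].format(bot_name=bot_name)

def best_of_n(prompt: str,
              generate_candidates: Callable[[str, int], List[str]],
              reward_score: Callable[[str], float],
              n: int = 16) -> str:
    # Sample n completions (with the generation_params above) and keep the one
    # the reward model scores highest.
    candidates = generate_candidates(prompt, n)
    return max(candidates, key=reward_score)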
Resubmit model
Running pipeline stage MKMLizer
Starting job with name hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer
Waiting for job on hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer to finish
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: ║ [flywheel ASCII-art wordmark] ║
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: ║ ║
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: ║ Version: 0.8.14 ║
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: ║ https://mk1.ai ║
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: ║ ║
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: ║ The license key for the current software has been verified as ║
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: ║ belonging to: ║
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: ║ ║
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: ║ Chai Research Corp. ║
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: ║ ║
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: warnings.warn(warning_message, FutureWarning)
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: Downloaded to shared memory in 31.409s
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: quantizing model to /dev/shm/model_cache
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: Saving flywheel model at /dev/shm/model_cache
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: Loading 0: 98%|█████████▊| 284/291 [00:02<00:00, 144.42it/s]
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: quantized model in 23.076s
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: Processed model Hastagaras/Jamet-8B-L3-MK.I in 57.074s
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: creating bucket guanaco-mkml-models
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/hastagaras-jamet-8b-l3-mk-i-v6
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/hastagaras-jamet-8b-l3-mk-i-v6/config.json
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/hastagaras-jamet-8b-l3-mk-i-v6/special_tokens_map.json
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/hastagaras-jamet-8b-l3-mk-i-v6/tokenizer.json
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/hastagaras-jamet-8b-l3-mk-i-v6/tokenizer_config.json
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/hastagaras-jamet-8b-l3-mk-i-v6/flywheel_model.0.safetensors
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:757: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: warnings.warn(
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: warnings.warn(
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: /opt/conda/lib/python3.10/site-packages/torch/_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: return self.fget.__get__(instance, owner)()
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: Saving duration: 0.421s
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 4.170s
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: creating bucket guanaco-reward-models
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: Bucket 's3://guanaco-reward-models/' created
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/hastagaras-jamet-8b-l3-mk-i-v6_reward
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/hastagaras-jamet-8b-l3-mk-i-v6_reward/config.json
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/hastagaras-jamet-8b-l3-mk-i-v6_reward/special_tokens_map.json
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/hastagaras-jamet-8b-l3-mk-i-v6_reward/merges.txt
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/hastagaras-jamet-8b-l3-mk-i-v6_reward/tokenizer_config.json
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/hastagaras-jamet-8b-l3-mk-i-v6_reward/vocab.json
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/hastagaras-jamet-8b-l3-mk-i-v6_reward/tokenizer.json
hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/hastagaras-jamet-8b-l3-mk-i-v6_reward/reward.tensors
Job hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer completed after 83.56s with status: succeeded
Stopping job with name hastagaras-jamet-8b-l3-mk-i-v6-mkmlizer
Pipeline stage MKMLizer completed in 86.40s
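The MKMLizer timings reported above fit together roughly as follows; the snippet only re-derives the breakdown from numbers already present in the log.

# Rough breakdown of the MKMLizer stage, using only figures from the log above.
download_s   = 31.409   # "Downloaded to shared memory in 31.409s"
quantize_s   = 23.076   # "quantized model in 23.076s"
main_model_s = 57.074   # "Processed model Hastagaras/Jamet-8B-L3-MK.I in 57.074s"
reward_s     = 4.170    # "Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 4.170s"
job_s        = 83.56    # "Job ... completed after 83.56s"

save_overhead_s = main_model_s - download_s - quantize_s   # ~2.6s saving the flywheel model
upload_etc_s    = job_s - main_model_s - reward_s          # ~22.3s for S3 uploads and job overhead
print(save_overhead_s, upload_etc_s)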
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service hastagaras-jamet-8b-l3-mk-i-v6
Waiting for inference service hastagaras-jamet-8b-l3-mk-i-v6 to be ready
Inference service hastagaras-jamet-8b-l3-mk-i-v6 ready after 40.20259380340576s
Pipeline stage ISVCDeployer completed in 46.65s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.1116273403167725s
Received healthy response to inference request in 1.3153982162475586s
Received healthy response to inference request in 1.2688744068145752s
Received healthy response to inference request in 1.2708532810211182s
Received healthy response to inference request in 1.347930669784546s
5 requests
0 failed requests
5th percentile: 1.2692701816558838
10th percentile: 1.2696659564971924
20th percentile: 1.2704575061798096
30th percentile: 1.2797622680664062
40th percentile: 1.2975802421569824
50th percentile: 1.3153982162475586
60th percentile: 1.3284111976623536
70th percentile: 1.3414241790771484
80th percentile: 1.5006700038909913
90th percentile: 1.806148672103882
95th percentile: 1.958888006210327
99th percentile: 2.081079473495483
mean time: 1.462936782836914
Pipeline stage StressChecker completed in 7.95s
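The StressChecker summary above can be reproduced from the five response times with standard linear-interpolation percentiles; a minimal check, assuming NumPy's default percentile method:

import numpy as np

# The five healthy response times reported by StressChecker, in seconds.
latencies = [2.1116273403167725, 1.3153982162475586, 1.2688744068145752,
             1.2708532810211182, 1.347930669784546]

print(np.percentile(latencies, 50))  # ~1.3154, matches the 50th percentile above
print(np.percentile(latencies, 90))  # ~1.8061, matches the 90th percentile above
print(np.mean(latencies))            # ~1.4629, matches the reported mean time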
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.03s
Running pipeline stage DaemonicSafetyScorer
Pipeline stage DaemonicSafetyScorer completed in 0.04s
Running M-Eval for topic stay_in_character
hastagaras-jamet-8b-l3-mk-i_v6 status is now deployed due to DeploymentManager action
M-Eval Dataset for topic stay_in_character is loaded
hastagaras-jamet-8b-l3-mk-i_v6 status is now inactive due to auto-deactivation of underperforming models

Usage Metrics

Latency Metrics