submission_id: hastagaras-jamet-8b-l3-m_8630_v6
developer_uid: Hastagaras
status: inactive
model_repo: Hastagaras/Jamet-8B-L3-MK.V-Blackroot
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
generation_params: {'temperature': 1.05, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
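These generation_params map directly onto a standard sampler. A minimal sketch, assuming a vLLM-style API (the pipeline's actual engine is MK1's flywheel, so this is an illustration, not the deployed code); best_of: 16 means 16 candidates are drawn and the reward model picks one downstream:

from vllm import SamplingParams

# Mirrors generation_params above. max_input_tokens=512 is enforced by prompt
# truncation before sampling, so it does not appear here.
params = SamplingParams(
    temperature=1.05,
    top_p=1.0,
    min_p=0.05,
    top_k=80,
    presence_penalty=0.0,
    frequency_penalty=0.0,
    stop=["\n"],      # stopping_words
    max_tokens=64,    # max_output_tokens
    n=16,             # best_of: candidates reranked by the reward model
)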
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
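Both formatter blocks are plain Python format strings. A minimal sketch of how the Llama 3 formatter might be assembled into a single prompt; only the templates are taken from the config, while the helper function and the sample conversation are hypothetical:

MEMORY = "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n"
PROMPT = "{prompt}<|eot_id|>"
USER = "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>"
BOT = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>"
RESPONSE = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:"

def build_prompt(bot_name, memory, scenario, turns):
    """turns: list of ("user" | "bot", speaker_name, message) tuples."""
    parts = [MEMORY.format(bot_name=bot_name, memory=memory),
             PROMPT.format(prompt=scenario)]
    for role, name, message in turns:
        if role == "user":
            parts.append(USER.format(user_name=name, message=message))
        else:
            parts.append(BOT.format(bot_name=name, message=message))
    # Generation continues from the open-ended "{bot_name}:" header.
    parts.append(RESPONSE.format(bot_name=bot_name))
    return "".join(parts)

print(build_prompt("turu", "A friendly companion.", "A chat at a crossroads.",
                   [("user", "Anon", "Hello!")]))

The reward_formatter works the same way with its simpler plain-text templates; the reward model scores candidate completions appended after its "{bot_name}:" response_template.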
timestamp: 2024-06-10T19:16:44+00:00
model_name: turu
model_eval_status: success
model_group: Hastagaras/Jamet-8B-L3-M
num_battles: 18587
num_wins: 10300
celo_rating: 1214.38
propriety_score: 0.6936083868785932
propriety_total_count: 5914.0
submission_type: basic
model_architecture: LlamaForCausalLM
model_num_parameters: 8030261248.0
best_of: 16
max_input_tokens: 512
max_output_tokens: 64
display_name: turu
ineligible_reason: None
language_model: Hastagaras/Jamet-8B-L3-MK.V-Blackroot
model_size: 8B
reward_model: ChaiML/reward_gpt2_medium_preference_24m_e2
us_pacific_date: 2024-06-10
win_ratio: 0.5541507505245602
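The derived metrics are consistent with the raw counts: win_ratio is num_wins / num_battles, and propriety_score times its total count comes out to a whole number of judgements (the exact semantics of the propriety counts are an assumption):

num_battles, num_wins = 18587, 10300
print(num_wins / num_battles)             # 0.5541507505245602 == win_ratio
print(round(0.6936083868785932 * 5914))   # 4102, i.e. ~4102 of 5914 positive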
Resubmit model
Running pipeline stage MKMLizer
Starting job with name hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer
Waiting for job on hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer to finish
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: ║ [ASCII art banner: "flywheel"] ║
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: ║ ║
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: ║ Version: 0.8.14 ║
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: ║ https://mk1.ai ║
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: ║ ║
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: ║ The license key for the current software has been verified as ║
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: ║ belonging to: ║
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: ║ ║
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: ║ Chai Research Corp. ║
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: ║ ║
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: warnings.warn(warning_message, FutureWarning)
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: Downloaded to shared memory in 23.060s
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: quantizing model to /dev/shm/model_cache
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: Saving flywheel model at /dev/shm/model_cache
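The quantizer writes its output as a single safetensors shard (flywheel_model.0.safetensors, uploaded below). A generic sketch of producing such a shard with the safetensors library, using a placeholder tensor rather than MKML's proprietary quantized format:

import os
import torch
from safetensors.torch import save_file

os.makedirs("/dev/shm/model_cache", exist_ok=True)
# Placeholder weight; the real shard holds the quantized Llama weights.
save_file({"model.embed_tokens.weight": torch.zeros(8, 8)},
          "/dev/shm/model_cache/flywheel_model.0.safetensors")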
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s] ... 94%|█████████▍| 273/291 [00:01<00:00, 176.64it/s]
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: quantized model in 16.975s
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: Processed model Hastagaras/Jamet-8B-L3-MK.V-Blackroot in 40.989s
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: creating bucket guanaco-mkml-models
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/hastagaras-jamet-8b-l3-m-8630-v6
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/hastagaras-jamet-8b-l3-m-8630-v6/config.json
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/hastagaras-jamet-8b-l3-m-8630-v6/tokenizer_config.json
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/hastagaras-jamet-8b-l3-m-8630-v6/special_tokens_map.json
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/hastagaras-jamet-8b-l3-m-8630-v6/tokenizer.json
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/hastagaras-jamet-8b-l3-m-8630-v6/flywheel_model.0.safetensors
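The cp lines above amount to S3 uploads of each file in the model cache. A minimal sketch with boto3 (the upload tool the pipeline actually uses is not shown in the log; bucket and key names are taken from it):

import boto3

s3 = boto3.client("s3")
bucket, prefix = "guanaco-mkml-models", "hastagaras-jamet-8b-l3-m-8630-v6"
for fname in ("config.json", "tokenizer_config.json", "special_tokens_map.json",
              "tokenizer.json", "flywheel_model.0.safetensors"):
    s3.upload_file(f"/dev/shm/model_cache/{fname}", bucket, f"{prefix}/{fname}")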
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:913: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: warnings.warn(
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:757: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: warnings.warn(
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: warnings.warn(
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: Saving duration: 0.231s
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 3.397s
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: creating bucket guanaco-reward-models
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: Bucket 's3://guanaco-reward-models/' created
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/hastagaras-jamet-8b-l3-m-8630-v6_reward
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/hastagaras-jamet-8b-l3-m-8630-v6_reward/special_tokens_map.json
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/hastagaras-jamet-8b-l3-m-8630-v6_reward/config.json
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/hastagaras-jamet-8b-l3-m-8630-v6_reward/vocab.json
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/hastagaras-jamet-8b-l3-m-8630-v6_reward/tokenizer_config.json
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/hastagaras-jamet-8b-l3-m-8630-v6_reward/merges.txt
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/hastagaras-jamet-8b-l3-m-8630-v6_reward/tokenizer.json
hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/hastagaras-jamet-8b-l3-m-8630-v6_reward/reward.tensors
Job hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer completed after 62.71s with status: succeeded
Stopping job with name hastagaras-jamet-8b-l3-m-8630-v6-mkmlizer
Pipeline stage MKMLizer completed in 65.92s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service hastagaras-jamet-8b-l3-m-8630-v6
Waiting for inference service hastagaras-jamet-8b-l3-m-8630-v6 to be ready
Inference service hastagaras-jamet-8b-l3-m-8630-v6 ready after 50.30s
Pipeline stage ISVCDeployer completed in 57.46s
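The ISVCDeployer stage waits on a KServe-style InferenceService becoming Ready. A hedged sketch of such a readiness poll with the Kubernetes Python client (the namespace and the use of KServe's v1beta1 CRD are assumptions):

import time
from kubernetes import client, config

config.load_kube_config()
api = client.CustomObjectsApi()

def isvc_ready(name, namespace="default"):
    isvc = api.get_namespaced_custom_object(
        "serving.kserve.io", "v1beta1", namespace, "inferenceservices", name)
    conditions = isvc.get("status", {}).get("conditions", [])
    return any(c.get("type") == "Ready" and c.get("status") == "True"
               for c in conditions)

start = time.time()
while not isvc_ready("hastagaras-jamet-8b-l3-m-8630-v6"):
    time.sleep(5)
print(f"ready after {time.time() - start:.2f}s")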
Running pipeline stage StressChecker
Received healthy response to inference request in 3.7765109539031982s
Received healthy response to inference request in 1.3265998363494873s
Received healthy response to inference request in 1.309326171875s
Received healthy response to inference request in 1.2788825035095215s
Received healthy response to inference request in 1.3377737998962402s
5 requests
0 failed requests
5th percentile: 1.2849712371826172
10th percentile: 1.291059970855713
20th percentile: 1.3032374382019043
30th percentile: 1.3127809047698975
40th percentile: 1.3196903705596923
50th percentile: 1.3265998363494873
60th percentile: 1.3310694217681884
70th percentile: 1.3355390071868896
80th percentile: 1.8255212306976323
90th percentile: 2.801016092300415
95th percentile: 3.288763523101806
99th percentile: 3.6789614677429197
mean time: 1.8058186531066895
Pipeline stage StressChecker completed in 9.67s
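The StressChecker percentiles and mean above are reproducible from the five raw latencies with NumPy's default linear-interpolation percentile:

import numpy as np

latencies = [3.7765109539031982, 1.3265998363494873, 1.309326171875,
             1.2788825035095215, 1.3377737998962402]
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print(f"mean time: {np.mean(latencies)}")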
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.03s
Running pipeline stage DaemonicSafetyScorer
Pipeline stage DaemonicSafetyScorer completed in 0.03s
Running M-Eval for topic stay_in_character
hastagaras-jamet-8b-l3-m_8630_v6 status is now deployed due to DeploymentManager action
M-Eval Dataset for topic stay_in_character is loaded
hastagaras-jamet-8b-l3-m_8630_v6 status is now inactive due to auto deactivation (removal of underperforming models)
