submission_id: hastagaras-jamet-8b-l3-m_9627_v1
developer_uid: Hastagaras
status: inactive
model_repo: Hastagaras/Jamet-8B-L3-MK.V-BR2
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
generation_params: {'temperature': 0.95, 'top_p': 1.0, 'min_p': 0.08, 'top_k': 100, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
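For illustration only, these sampling settings map almost one-to-one onto vLLM's SamplingParams; the submission is actually served through the MKML stack, and max_input_tokens is handled by prompt truncation rather than a sampling argument. A minimal sketch:

    # Hypothetical mapping of generation_params onto vLLM's SamplingParams
    # (illustrates what each knob means, not the production code path).
    from vllm import SamplingParams

    params = SamplingParams(
        temperature=0.95,      # soften the distribution slightly
        top_p=1.0,             # nucleus sampling effectively disabled
        min_p=0.08,            # drop tokens below 8% of the top token's probability
        top_k=100,             # keep at most the 100 most likely tokens
        presence_penalty=0.0,
        frequency_penalty=0.0,
        stop=["\n"],           # stopping_words: cut the reply at the first newline
        best_of=16,            # draw 16 candidate completions per request
        n=1,
        max_tokens=64,         # max_output_tokens
    )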
formatter: {'memory_template': "<|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
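A minimal sketch of how the formatter templates above compose a chat into the Llama-3-style prompt the model completes from; the bot name, persona, and messages below are invented, and the real pipeline additionally enforces max_input_tokens=512 and the truncate_by_message flag:

    # Illustrative rendering of the formatter templates (template strings are
    # copied from the formatter field above; names and messages are made up).
    memory_template = "<|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n"
    prompt_template = "{prompt}<|eot_id|>"
    user_template = "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>"
    bot_template = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>"
    response_template = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:"

    def build_prompt(bot_name, memory, prompt, turns):
        # persona block, scenario prompt, alternating chat turns, then the
        # assistant header the model is asked to continue from
        text = memory_template.format(bot_name=bot_name, memory=memory)
        text += prompt_template.format(prompt=prompt)
        for role, name, message in turns:
            if role == "user":
                text += user_template.format(user_name=name, message=message)
            else:
                text += bot_template.format(bot_name=name, message=message)
        return text + response_template.format(bot_name=bot_name)

    print(build_prompt("Jamet", "A sardonic detective.", "A rainy night in the city.",
                       [("user", "Alex", "Any leads on the case?")]))

The reward_formatter templates are applied the same way, producing the plainer "{bot_name}'s Persona: ... ####" transcript that the reward model scores.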
timestamp: 2024-06-06T12:03:09+00:00
model_name: br2
model_eval_status: success
model_group: Hastagaras/Jamet-8B-L3-M
num_battles: 23918
num_wins: 13340
celo_rating: 1220.03
propriety_score: 0.6997742663656885
propriety_total_count: 1329.0
submission_type: basic
model_architecture: LlamaForCausalLM
model_num_parameters: 8030261248.0
best_of: 16
max_input_tokens: 512
max_output_tokens: 64
display_name: br2
ineligible_reason: propriety_total_count < 5000
language_model: Hastagaras/Jamet-8B-L3-MK.V-BR2
model_size: 8B
reward_model: ChaiML/reward_gpt2_medium_preference_24m_e2
us_pacific_date: 2024-06-06
win_ratio: 0.5577389413830588
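The win_ratio field follows directly from num_wins divided by num_battles:

    # Sanity check on the metadata above.
    num_wins, num_battles = 13340, 23918
    print(num_wins / num_battles)   # 0.5577389413830588, matching win_ratio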
Resubmit model
Running pipeline stage MKMLizer
Starting job with name hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer
Waiting for job on hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer to finish
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ║ _____ __ __ ║
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ║ /___/ ║
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ║ ║
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ║ Version: 0.8.14 ║
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ║ https://mk1.ai ║
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ║ ║
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ║ The license key for the current software has been verified as ║
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ║ belonging to: ║
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ║ ║
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ║ Chai Research Corp. ║
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ║ ║
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: warnings.warn(warning_message, FutureWarning)
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: Downloaded to shared memory in 32.915s
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: quantizing model to /dev/shm/model_cache
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: Loading 0: 97%|█████████▋| 281/291 [00:07<00:00, 89.93it/s]
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: quantized model in 22.987s
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: Processed model Hastagaras/Jamet-8B-L3-MK.V-BR2 in 58.375s
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: creating bucket guanaco-mkml-models
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/hastagaras-jamet-8b-l3-m-9627-v1
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/hastagaras-jamet-8b-l3-m-9627-v1/config.json
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/hastagaras-jamet-8b-l3-m-9627-v1/tokenizer.json
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/hastagaras-jamet-8b-l3-m-9627-v1/special_tokens_map.json
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/hastagaras-jamet-8b-l3-m-9627-v1/tokenizer_config.json
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/hastagaras-jamet-8b-l3-m-9627-v1/flywheel_model.0.safetensors
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:913: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: warnings.warn(
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:757: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: warnings.warn(
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: warnings.warn(
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/torch/_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: return self.fget.__get__(instance, owner)()
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: creating bucket guanaco-reward-models
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: Bucket 's3://guanaco-reward-models/' created
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/hastagaras-jamet-8b-l3-m-9627-v1_reward
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/hastagaras-jamet-8b-l3-m-9627-v1_reward/special_tokens_map.json
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/hastagaras-jamet-8b-l3-m-9627-v1_reward/config.json
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/hastagaras-jamet-8b-l3-m-9627-v1_reward/tokenizer_config.json
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/hastagaras-jamet-8b-l3-m-9627-v1_reward/merges.txt
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/hastagaras-jamet-8b-l3-m-9627-v1_reward/vocab.json
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/hastagaras-jamet-8b-l3-m-9627-v1_reward/tokenizer.json
hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/hastagaras-jamet-8b-l3-m-9627-v1_reward/reward.tensors
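The cp lines above stage the quantized model and the reward-model artifacts into their respective buckets; an equivalent upload could be written with boto3 (bucket and key names are taken from the log, credentials and client configuration are assumed, and this is not the tool the pipeline actually uses):

    # Illustrative boto3 equivalent of two of the cp commands above.
    import boto3

    s3 = boto3.client("s3")
    s3.upload_file("/dev/shm/model_cache/flywheel_model.0.safetensors",
                   "guanaco-mkml-models",
                   "hastagaras-jamet-8b-l3-m-9627-v1/flywheel_model.0.safetensors")
    s3.upload_file("/tmp/reward_cache/reward.tensors",
                   "guanaco-reward-models",
                   "hastagaras-jamet-8b-l3-m-9627-v1_reward/reward.tensors")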
Job hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer completed after 83.11s with status: succeeded
Stopping job with name hastagaras-jamet-8b-l3-m-9627-v1-mkmlizer
Pipeline stage MKMLizer completed in 84.01s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.09s
Running pipeline stage ISVCDeployer
Creating inference service hastagaras-jamet-8b-l3-m-9627-v1
Waiting for inference service hastagaras-jamet-8b-l3-m-9627-v1 to be ready
Inference service hastagaras-jamet-8b-l3-m-9627-v1 ready after 80.59800672531128s
Pipeline stage ISVCDeployer completed in 86.44s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.1640381813049316s
Received healthy response to inference request in 1.3422863483428955s
Received healthy response to inference request in 1.3038265705108643s
Received healthy response to inference request in 1.3120551109313965s
Received healthy response to inference request in 1.241269588470459s
5 requests
0 failed requests
5th percentile: 1.25378098487854
10th percentile: 1.2662923812866211
20th percentile: 1.2913151741027833
30th percentile: 1.3054722785949706
40th percentile: 1.3087636947631835
50th percentile: 1.3120551109313965
60th percentile: 1.324147605895996
70th percentile: 1.3362401008605957
80th percentile: 1.5066367149353028
90th percentile: 1.8353374481201172
95th percentile: 1.9996878147125243
99th percentile: 2.13116810798645
mean time: 1.4726951599121094
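The summary statistics above are consistent with linear interpolation over the five response times; a small sketch that reproduces them (numpy's default percentile interpolation is an assumption about how the checker computes them):

    # Reproducing the StressChecker summary from the five latencies logged above.
    import numpy as np

    latencies = [2.1640381813049316, 1.3422863483428955, 1.3038265705108643,
                 1.3120551109313965, 1.241269588470459]
    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f"{p}th percentile: {np.percentile(latencies, p)}")
    print("mean time:", np.mean(latencies))   # ~1.4727 s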
Pipeline stage StressChecker completed in 7.99s
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.03s
Running pipeline stage DaemonicSafetyScorer
Running M-Eval for topic stay_in_character
Pipeline stage DaemonicSafetyScorer completed in 0.04s
M-Eval Dataset for topic stay_in_character is loaded
hastagaras-jamet-8b-l3-m_9627_v1 status is now deployed due to DeploymentManager action
hastagaras-jamet-8b-l3-m_9627_v1 status is now inactive due to auto-deactivation of underperforming models
