submission_id: r136a1-maido-2x8b-l3_v1
developer_uid: R136a1
status: inactive
model_repo: R136a1/Maido-2x8B-L3
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
generation_params: {'temperature': 1.15, 'top_p': 1.0, 'min_p': 0.075, 'top_k': 70, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "<|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
timestamp: 2024-06-25T07:14:35+00:00
model_name: r136a1-maido-2x8b-l3_v1
model_group: R136a1/Maido-2x8B-L3
num_battles: 11269
num_wins: 5989
celo_rating: 1202.42
propriety_score: 0.7122686918386993
propriety_total_count: 9349.0
submission_type: basic
model_architecture: MixtralForCausalLM
model_num_parameters: 13667667968.0
best_of: 8
max_input_tokens: 512
max_output_tokens: 64
display_name: r136a1-maido-2x8b-l3_v1
ineligible_reason: None
language_model: R136a1/Maido-2x8B-L3
model_size: 14B
reward_model: ChaiML/reward_gpt2_medium_preference_24m_e2
us_pacific_date: 2024-06-25
win_ratio: 0.5314579820747183
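A quick consistency check on the battle statistics above: win_ratio is simply num_wins divided by num_battles, which reproduces the reported figure (Python):

num_wins, num_battles = 5989, 11269
print(num_wins / num_battles)  # 0.5314579820747183, matching win_ratio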
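The formatter block above defines how a conversation is serialized into a Llama-3 style prompt (reward_formatter does the same for the reward model's plain-text format). The sketch below is one plausible reading of how those templates compose; the assembly order (memory, prompt, alternating turns, then the open response header) and the build_prompt helper are illustrative assumptions, not the actual serving code:

# Hypothetical assembly of the formatter templates above.
FORMATTER = {
    "memory_template": "<|start_header_id|>system<|end_header_id|>\n\n"
                       "{bot_name}'s Persona: {memory}\n\n",
    "prompt_template": "{prompt}<|eot_id|>",
    "user_template": "<|start_header_id|>user<|end_header_id|>\n\n"
                     "{user_name}: {message}<|eot_id|>",
    "bot_template": "<|start_header_id|>assistant<|end_header_id|>\n\n"
                    "{bot_name}: {message}<|eot_id|>",
    "response_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """turns is a list of (speaker, message) pairs, speaker in {"user", "bot"}."""
    parts = [
        FORMATTER["memory_template"].format(bot_name=bot_name, memory=memory),
        FORMATTER["prompt_template"].format(prompt=prompt),
    ]
    for speaker, message in turns:
        template = FORMATTER["user_template" if speaker == "user" else "bot_template"]
        parts.append(template.format(user_name=user_name, bot_name=bot_name,
                                     message=message))
    # The response template leaves the assistant header open, so the model
    # generates the bot's next message as the completion.
    parts.append(FORMATTER["response_template"].format(bot_name=bot_name))
    return "".join(parts)

print(build_prompt("Maido", "User", "A friendly maid.", "Greet the user.",
                   [("user", "Hello!")]))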
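Together, generation_params, best_of: 8, and reward_repo imply best-of-N sampling: the language model draws eight candidate replies and the reward model keeps the highest-scoring one. Below is a minimal sketch of that loop using plain Hugging Face transformers; the actual serving stack is not shown in this log, best_of_n is a hypothetical helper, and the assumption that the reward checkpoint loads as a sequence classifier with a single preference logit should be checked against its config:

import torch
from transformers import (AutoModelForCausalLM,
                          AutoModelForSequenceClassification, AutoTokenizer)

lm_tok = AutoTokenizer.from_pretrained("R136a1/Maido-2x8B-L3")
lm = AutoModelForCausalLM.from_pretrained(
    "R136a1/Maido-2x8B-L3", torch_dtype=torch.float16, device_map="auto")
rm_tok = AutoTokenizer.from_pretrained("ChaiML/reward_gpt2_medium_preference_24m_e2")
rm = AutoModelForSequenceClassification.from_pretrained(
    "ChaiML/reward_gpt2_medium_preference_24m_e2")  # classifier head assumed

def best_of_n(prompt: str, n: int = 8) -> str:
    # max_input_tokens: 512 from generation_params
    inputs = lm_tok(prompt, return_tensors="pt",
                    truncation=True, max_length=512).to(lm.device)
    out = lm.generate(
        **inputs,
        do_sample=True,
        temperature=1.15,  # values taken from generation_params above
        top_p=1.0,
        top_k=70,
        min_p=0.075,       # requires a transformers version with min_p support
        max_new_tokens=64,
        num_return_sequences=n,
        pad_token_id=lm_tok.eos_token_id,
    )
    prompt_len = inputs["input_ids"].shape[1]
    # stopping_words: ['\n'] -- emulate by cutting at the first newline
    candidates = [
        lm_tok.decode(seq[prompt_len:], skip_special_tokens=True).split("\n")[0]
        for seq in out
    ]

    def reward(text: str) -> float:
        enc = rm_tok(text, return_tensors="pt", truncation=True)
        return rm(**enc).logits[0, -1].item()  # single preference score assumed

    # In the real pipeline the reward_formatter re-serializes the chat before
    # scoring; here we score prompt + candidate directly for brevity.
    return max(candidates, key=lambda c: reward(prompt + c))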
Running pipeline stage MKMLizer
Starting job with name r136a1-maido-2x8b-l3-v1-mkmlizer
Waiting for job on r136a1-maido-2x8b-l3-v1-mkmlizer to finish
r136a1-maido-2x8b-l3-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
r136a1-maido-2x8b-l3-v1-mkmlizer: ║                 _____ __ __                                         ║
r136a1-maido-2x8b-l3-v1-mkmlizer: ║                / _/ /_ ___ __/ / ___ ___ / /                        ║
r136a1-maido-2x8b-l3-v1-mkmlizer: ║               / _/ / // / |/|/ / _ \/ -_) -_) /                     ║
r136a1-maido-2x8b-l3-v1-mkmlizer: ║              /_//_/\_, /|__,__/_//_/\__/\__/_/                      ║
r136a1-maido-2x8b-l3-v1-mkmlizer: ║                  /___/                                              ║
r136a1-maido-2x8b-l3-v1-mkmlizer: ║                                                                     ║
r136a1-maido-2x8b-l3-v1-mkmlizer: ║    Version: 0.8.14                                                  ║
r136a1-maido-2x8b-l3-v1-mkmlizer: ║    Copyright 2023 MK ONE TECHNOLOGIES Inc.                          ║
r136a1-maido-2x8b-l3-v1-mkmlizer: ║    https://mk1.ai                                                   ║
r136a1-maido-2x8b-l3-v1-mkmlizer: ║                                                                     ║
r136a1-maido-2x8b-l3-v1-mkmlizer: ║    The license key for the current software has been verified as    ║
r136a1-maido-2x8b-l3-v1-mkmlizer: ║    belonging to:                                                    ║
r136a1-maido-2x8b-l3-v1-mkmlizer: ║                                                                     ║
r136a1-maido-2x8b-l3-v1-mkmlizer: ║    Chai Research Corp.                                              ║
r136a1-maido-2x8b-l3-v1-mkmlizer: ║    Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f                 ║
r136a1-maido-2x8b-l3-v1-mkmlizer: ║    Expiration: 2024-07-15 23:59:59                                  ║
r136a1-maido-2x8b-l3-v1-mkmlizer: ║                                                                     ║
r136a1-maido-2x8b-l3-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
r136a1-maido-2x8b-l3-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
r136a1-maido-2x8b-l3-v1-mkmlizer: warnings.warn(warning_message, FutureWarning)
r136a1-maido-2x8b-l3-v1-mkmlizer: Downloaded to shared memory in 46.532s
r136a1-maido-2x8b-l3-v1-mkmlizer: quantizing model to /dev/shm/model_cache
r136a1-maido-2x8b-l3-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
r136a1-maido-2x8b-l3-v1-mkmlizer: Loading 0: 0%| | 0/419 [00:00<?, ?it/s]
r136a1-maido-2x8b-l3-v1-mkmlizer: Loading 0: 98%|█████████▊| 412/419 [00:25<00:01, 5.10it/s]
r136a1-maido-2x8b-l3-v1-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
r136a1-maido-2x8b-l3-v1-mkmlizer: quantized model in 33.177s
r136a1-maido-2x8b-l3-v1-mkmlizer: Processed model R136a1/Maido-2x8B-L3 in 84.150s
r136a1-maido-2x8b-l3-v1-mkmlizer: creating bucket guanaco-mkml-models
r136a1-maido-2x8b-l3-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
r136a1-maido-2x8b-l3-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/r136a1-maido-2x8b-l3-v1
r136a1-maido-2x8b-l3-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/r136a1-maido-2x8b-l3-v1/tokenizer_config.json
r136a1-maido-2x8b-l3-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/r136a1-maido-2x8b-l3-v1/config.json
r136a1-maido-2x8b-l3-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/r136a1-maido-2x8b-l3-v1/special_tokens_map.json
r136a1-maido-2x8b-l3-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/r136a1-maido-2x8b-l3-v1/tokenizer.json
r136a1-maido-2x8b-l3-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/r136a1-maido-2x8b-l3-v1/flywheel_model.1.safetensors
r136a1-maido-2x8b-l3-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/r136a1-maido-2x8b-l3-v1/flywheel_model.0.safetensors
r136a1-maido-2x8b-l3-v1-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
r136a1-maido-2x8b-l3-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:913: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
r136a1-maido-2x8b-l3-v1-mkmlizer: warnings.warn(
r136a1-maido-2x8b-l3-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:757: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
r136a1-maido-2x8b-l3-v1-mkmlizer: warnings.warn(
r136a1-maido-2x8b-l3-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
r136a1-maido-2x8b-l3-v1-mkmlizer: warnings.warn(
r136a1-maido-2x8b-l3-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/torch/_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
r136a1-maido-2x8b-l3-v1-mkmlizer: return self.fget.__get__(instance, owner)()
r136a1-maido-2x8b-l3-v1-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
r136a1-maido-2x8b-l3-v1-mkmlizer: Saving duration: 0.434s
r136a1-maido-2x8b-l3-v1-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 4.647s
r136a1-maido-2x8b-l3-v1-mkmlizer: creating bucket guanaco-reward-models
r136a1-maido-2x8b-l3-v1-mkmlizer: Bucket 's3://guanaco-reward-models/' created
r136a1-maido-2x8b-l3-v1-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/r136a1-maido-2x8b-l3-v1_reward
r136a1-maido-2x8b-l3-v1-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/r136a1-maido-2x8b-l3-v1_reward/config.json
r136a1-maido-2x8b-l3-v1-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/r136a1-maido-2x8b-l3-v1_reward/special_tokens_map.json
r136a1-maido-2x8b-l3-v1-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/r136a1-maido-2x8b-l3-v1_reward/tokenizer_config.json
r136a1-maido-2x8b-l3-v1-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/r136a1-maido-2x8b-l3-v1_reward/merges.txt
r136a1-maido-2x8b-l3-v1-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/r136a1-maido-2x8b-l3-v1_reward/vocab.json
r136a1-maido-2x8b-l3-v1-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/r136a1-maido-2x8b-l3-v1_reward/tokenizer.json
Job r136a1-maido-2x8b-l3-v1-mkmlizer completed after 114.68s with status: succeeded
Stopping job with name r136a1-maido-2x8b-l3-v1-mkmlizer
Pipeline stage MKMLizer completed in 115.12s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.16s
Running pipeline stage ISVCDeployer
Creating inference service r136a1-maido-2x8b-l3-v1
Waiting for inference service r136a1-maido-2x8b-l3-v1 to be ready
Inference service r136a1-maido-2x8b-l3-v1 ready after 50.25626254081726s
Pipeline stage ISVCDeployer completed in 56.16s
Running pipeline stage StressChecker
Received healthy response to inference request in 3.0314810276031494s
Received healthy response to inference request in 2.1147665977478027s
Received healthy response to inference request in 2.1197092533111572s
Received healthy response to inference request in 2.1180057525634766s
Received healthy response to inference request in 2.0258073806762695s
5 requests
0 failed requests
5th percentile: 2.0435992240905763
10th percentile: 2.061391067504883
20th percentile: 2.096974754333496
30th percentile: 2.1154144287109373
40th percentile: 2.116710090637207
50th percentile: 2.1180057525634766
60th percentile: 2.118687152862549
70th percentile: 2.119368553161621
80th percentile: 2.302063608169556
90th percentile: 2.6667723178863527
95th percentile: 2.8491266727447506
99th percentile: 2.9950101566314697
mean time: 2.2819540023803713
Pipeline stage StressChecker completed in 12.25s
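For reference, the StressChecker percentiles above follow the standard linearly interpolated definition over the five request latencies and can be reproduced (up to floating-point rounding) with NumPy:

import numpy as np

times = [3.0314810276031494, 2.1147665977478027, 2.1197092533111572,
         2.1180057525634766, 2.0258073806762695]
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(times, p)}")
print("mean time:", np.mean(times))  # 2.2819540023803713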
Running pipeline stage DaemonicSafetyScorer
Pipeline stage DaemonicSafetyScorer completed in 0.04s
r136a1-maido-2x8b-l3_v1 status is now deployed due to DeploymentManager action
r136a1-maido-2x8b-l3_v1 status is now inactive due to auto deactivation (underperforming models are removed)
