submission_id: wespro-l3-limarp-oas-8b_v1
developer_uid: WesPro
best_of: 4
display_name: wespro-l3-limarp-oas-8b_v1
family_friendly_score: 0.0
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 4, 'max_output_tokens': 64}
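The `formatter` templates above define how the persona, chat history, and response cue are stitched into the model's context. A minimal sketch of how such templates might be applied — the template strings are copied from this record, but the `render` helper, the `Aria` persona, and the sample turn are illustrative assumptions, not part of the actual serving code:

```python
# Template strings copied from the submission's formatter config.
memory_template = "{bot_name}'s Persona: {memory}\n####\n"
prompt_template = "{prompt}\n<START>\n"
bot_template = "{bot_name}: {message}\n"
user_template = "{user_name}: {message}\n"
response_template = "{bot_name}:"

def render(bot_name, memory, prompt, turns, user_name="User"):
    """Assemble a context string from the templates (hypothetical helper).

    `turns` is a list of (speaker, message) pairs, speaker in {"user", "bot"}.
    """
    parts = [
        memory_template.format(bot_name=bot_name, memory=memory),
        prompt_template.format(prompt=prompt),
    ]
    for speaker, message in turns:
        if speaker == "user":
            parts.append(user_template.format(user_name=user_name, message=message))
        else:
            parts.append(bot_template.format(bot_name=bot_name, message=message))
    # The response template leaves the cursor right after "{bot_name}:",
    # so the model completes the bot's next line.
    parts.append(response_template.format(bot_name=bot_name))
    return "".join(parts)

context = render("Aria", "A friendly guide.", "A chat with Aria.",
                 [("user", "Hello!")])
print(context)
```

Note the `stopping_words: ['\n']` in `generation_params`: since each templated message ends with a newline, stopping on `\n` keeps the completion to a single bot line.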
ineligible_reason: num_battles<5000
is_internal_developer: False
language_model: WesPro/L3-LimaRP-OAS-8B
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_eval_status: success
model_group: WesPro/L3-LimaRP-OAS-8B
model_name: wespro-l3-limarp-oas-8b_v1
model_num_parameters: 8030261248.0
model_repo: WesPro/L3-LimaRP-OAS-8B
model_size: 8B
num_battles: 317
num_wins: 154
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
status: torndown
submission_type: basic
timestamp: 2024-06-02T15:56:18+00:00
us_pacific_date: 2024-06-02
win_ratio: 0.48580441640378547
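The reported `win_ratio` is simply wins over battles; a quick sanity check against the counts in this record:

```python
# Recompute win_ratio from the battle counts recorded above.
num_battles = 317
num_wins = 154
win_ratio = num_wins / num_battles
print(win_ratio)  # 0.48580441640378547, matching the recorded value
```

This also explains the `ineligible_reason: num_battles<5000` — 317 battles is well below the eligibility threshold.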
Resubmit model
Running pipeline stage MKMLizer
Starting job with name wespro-l3-limarp-oas-8b-v1-mkmlizer
Stopping job with name wespro-l3-limarp-oas-8b-v1-mkmlizer
%s, retrying in %s seconds...
Starting job with name wespro-l3-limarp-oas-8b-v1-mkmlizer
Waiting for job on wespro-l3-limarp-oas-8b-v1-mkmlizer to finish
wespro-l3-limarp-oas-8b-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
wespro-l3-limarp-oas-8b-v1-mkmlizer: ║ _____ __ __ ║
wespro-l3-limarp-oas-8b-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
wespro-l3-limarp-oas-8b-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
wespro-l3-limarp-oas-8b-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
wespro-l3-limarp-oas-8b-v1-mkmlizer: ║ /___/ ║
wespro-l3-limarp-oas-8b-v1-mkmlizer: ║ ║
wespro-l3-limarp-oas-8b-v1-mkmlizer: ║ Version: 0.8.14 ║
wespro-l3-limarp-oas-8b-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
wespro-l3-limarp-oas-8b-v1-mkmlizer: ║ https://mk1.ai ║
wespro-l3-limarp-oas-8b-v1-mkmlizer: ║ ║
wespro-l3-limarp-oas-8b-v1-mkmlizer: ║ The license key for the current software has been verified as ║
wespro-l3-limarp-oas-8b-v1-mkmlizer: ║ belonging to: ║
wespro-l3-limarp-oas-8b-v1-mkmlizer: ║ ║
wespro-l3-limarp-oas-8b-v1-mkmlizer: ║ Chai Research Corp. ║
wespro-l3-limarp-oas-8b-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
wespro-l3-limarp-oas-8b-v1-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
wespro-l3-limarp-oas-8b-v1-mkmlizer: ║ ║
wespro-l3-limarp-oas-8b-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
wespro-l3-limarp-oas-8b-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
wespro-l3-limarp-oas-8b-v1-mkmlizer: warnings.warn(warning_message, FutureWarning)
Failed to get response for submission blend_ripuf_2024-05-23: ('http://hastagaras-esekembrew-0-3-v7-predictor-default.tenant-chaiml-guanaco.knative.ord1.coreweave.cloud/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:53410->127.0.0.1:8080: read: connection reset by peer\n')
wespro-l3-limarp-oas-8b-v1-mkmlizer: Downloaded to shared memory in 178.097s
wespro-l3-limarp-oas-8b-v1-mkmlizer: quantizing model to /dev/shm/model_cache
wespro-l3-limarp-oas-8b-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
wespro-l3-limarp-oas-8b-v1-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s] ... Loading 0: 97%|█████████▋| 283/291 [00:07<00:00, 82.21it/s]
wespro-l3-limarp-oas-8b-v1-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
wespro-l3-limarp-oas-8b-v1-mkmlizer: quantized model in 23.979s
wespro-l3-limarp-oas-8b-v1-mkmlizer: Processed model WesPro/L3-LimaRP-OAS-8B in 204.625s
wespro-l3-limarp-oas-8b-v1-mkmlizer: creating bucket guanaco-mkml-models
wespro-l3-limarp-oas-8b-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
wespro-l3-limarp-oas-8b-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/wespro-l3-limarp-oas-8b-v1
wespro-l3-limarp-oas-8b-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/wespro-l3-limarp-oas-8b-v1/special_tokens_map.json
wespro-l3-limarp-oas-8b-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/wespro-l3-limarp-oas-8b-v1/config.json
wespro-l3-limarp-oas-8b-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/wespro-l3-limarp-oas-8b-v1/tokenizer_config.json
wespro-l3-limarp-oas-8b-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/wespro-l3-limarp-oas-8b-v1/tokenizer.json
wespro-l3-limarp-oas-8b-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/wespro-l3-limarp-oas-8b-v1/flywheel_model.0.safetensors
wespro-l3-limarp-oas-8b-v1-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
wespro-l3-limarp-oas-8b-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:913: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
wespro-l3-limarp-oas-8b-v1-mkmlizer: warnings.warn(
wespro-l3-limarp-oas-8b-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:757: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
wespro-l3-limarp-oas-8b-v1-mkmlizer: warnings.warn(
wespro-l3-limarp-oas-8b-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
wespro-l3-limarp-oas-8b-v1-mkmlizer: warnings.warn(
wespro-l3-limarp-oas-8b-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/torch/_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
wespro-l3-limarp-oas-8b-v1-mkmlizer: return self.fget.__get__(instance, owner)()
wespro-l3-limarp-oas-8b-v1-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
wespro-l3-limarp-oas-8b-v1-mkmlizer: Saving duration: 0.677s
wespro-l3-limarp-oas-8b-v1-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 14.809s
wespro-l3-limarp-oas-8b-v1-mkmlizer: creating bucket guanaco-reward-models
wespro-l3-limarp-oas-8b-v1-mkmlizer: Bucket 's3://guanaco-reward-models/' created
wespro-l3-limarp-oas-8b-v1-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/wespro-l3-limarp-oas-8b-v1_reward
wespro-l3-limarp-oas-8b-v1-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/wespro-l3-limarp-oas-8b-v1_reward/tokenizer_config.json
wespro-l3-limarp-oas-8b-v1-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/wespro-l3-limarp-oas-8b-v1_reward/tokenizer.json
wespro-l3-limarp-oas-8b-v1-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/wespro-l3-limarp-oas-8b-v1_reward/vocab.json
wespro-l3-limarp-oas-8b-v1-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/wespro-l3-limarp-oas-8b-v1_reward/merges.txt
wespro-l3-limarp-oas-8b-v1-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/wespro-l3-limarp-oas-8b-v1_reward/special_tokens_map.json
wespro-l3-limarp-oas-8b-v1-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/wespro-l3-limarp-oas-8b-v1_reward/config.json
wespro-l3-limarp-oas-8b-v1-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/wespro-l3-limarp-oas-8b-v1_reward/reward.tensors
Stopping job with name wespro-l3-limarp-oas-8b-v1-mkmlizer
%s, retrying in %s seconds...
Stopping job with name wespro-l3-limarp-oas-8b-v1-mkmlizer
%s, retrying in %s seconds...
Stopping job with name wespro-l3-limarp-oas-8b-v1-mkmlizer
%s, retrying in %s seconds...
Starting job with name wespro-l3-limarp-oas-8b-v1-mkmlizer
Waiting for job on wespro-l3-limarp-oas-8b-v1-mkmlizer to finish
wespro-l3-limarp-oas-8b-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
wespro-l3-limarp-oas-8b-v1-mkmlizer: warnings.warn(warning_message, FutureWarning)
wespro-l3-limarp-oas-8b-v1-mkmlizer: Downloaded to shared memory in 44.510s
wespro-l3-limarp-oas-8b-v1-mkmlizer: quantizing model to /dev/shm/model_cache
wespro-l3-limarp-oas-8b-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
wespro-l3-limarp-oas-8b-v1-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s] ... Loading 0: 97%|█████████▋| 282/291 [00:07<00:00, 91.50it/s]
wespro-l3-limarp-oas-8b-v1-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
wespro-l3-limarp-oas-8b-v1-mkmlizer: quantized model in 23.163s
wespro-l3-limarp-oas-8b-v1-mkmlizer: Processed model WesPro/L3-LimaRP-OAS-8B in 70.124s
wespro-l3-limarp-oas-8b-v1-mkmlizer: creating bucket guanaco-mkml-models
wespro-l3-limarp-oas-8b-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
wespro-l3-limarp-oas-8b-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/wespro-l3-limarp-oas-8b-v1
wespro-l3-limarp-oas-8b-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/wespro-l3-limarp-oas-8b-v1/tokenizer_config.json
wespro-l3-limarp-oas-8b-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/wespro-l3-limarp-oas-8b-v1/special_tokens_map.json
wespro-l3-limarp-oas-8b-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/wespro-l3-limarp-oas-8b-v1/config.json
wespro-l3-limarp-oas-8b-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/wespro-l3-limarp-oas-8b-v1/tokenizer.json
wespro-l3-limarp-oas-8b-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/wespro-l3-limarp-oas-8b-v1/flywheel_model.0.safetensors
wespro-l3-limarp-oas-8b-v1-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
wespro-l3-limarp-oas-8b-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:913: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
wespro-l3-limarp-oas-8b-v1-mkmlizer: warnings.warn(
wespro-l3-limarp-oas-8b-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:757: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
wespro-l3-limarp-oas-8b-v1-mkmlizer: warnings.warn(
wespro-l3-limarp-oas-8b-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
wespro-l3-limarp-oas-8b-v1-mkmlizer: warnings.warn(
wespro-l3-limarp-oas-8b-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/torch/_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
wespro-l3-limarp-oas-8b-v1-mkmlizer: return self.fget.__get__(instance, owner)()
wespro-l3-limarp-oas-8b-v1-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
wespro-l3-limarp-oas-8b-v1-mkmlizer: Saving duration: 0.401s
wespro-l3-limarp-oas-8b-v1-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 160.593s
wespro-l3-limarp-oas-8b-v1-mkmlizer: creating bucket guanaco-reward-models
wespro-l3-limarp-oas-8b-v1-mkmlizer: ERROR: [Errno -3] Temporary failure in name resolution
wespro-l3-limarp-oas-8b-v1-mkmlizer: ERROR: Connection Error: Error resolving a server hostname.
wespro-l3-limarp-oas-8b-v1-mkmlizer: Please check the servers address specified in 'host_base', 'host_bucket', 'cloudfront_host', 'website_endpoint'
Job wespro-l3-limarp-oas-8b-v1-mkmlizer completed after 274.63s with status: failed
Stopping job with name wespro-l3-limarp-oas-8b-v1-mkmlizer
MKMLizerError('')
wespro-l3-limarp-oas-8b_v1 status is now failed due to DeploymentManager action
admin requested tearing down of wespro-l3-limarp-oas-8b_v1
Running pipeline stage ISVCDeleter
Checking if service wespro-l3-limarp-oas-8b-v1 is running
Tearing down inference service wespro-l3-limarp-oas-8b-v1
Toredown service wespro-l3-limarp-oas-8b-v1
Pipeline stage ISVCDeleter completed in 7.60s
Running pipeline stage MKMLModelDeleter
Skipping deletion as no model was successfully uploaded
Pipeline stage MKMLModelDeleter completed in 0.26s
wespro-l3-limarp-oas-8b_v1 status is now torndown due to DeploymentManager action