developer_uid: sao10k
submission_id: sao10k-l3-rp-v5-1_v2
model_name: RP-v5-Expr1
model_group: Sao10K/L3-RP-v5.1
status: torndown
timestamp: 2024-07-07T16:10:18+00:00
num_battles: 34147
num_wins: 18700
celo_rating: 1222.53
family_friendly_score: 0.0
submission_type: basic
model_repo: Sao10K/L3-RP-v5.1
model_architecture: LlamaForCausalLM
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
model_num_parameters: 8030261248.0
best_of: 16
max_input_tokens: 512
max_output_tokens: 64
display_name: RP-v5-Expr1
is_internal_developer: False
language_model: Sao10K/L3-RP-v5.1
model_size: 8B
ranking_group: single
us_pacific_date: 2024-07-07
win_ratio: 0.5476322956628693
generation_params: {'temperature': 1.4, 'top_p': 1.0, 'min_p': 0.2, 'top_k': 50, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|end_header_id|>,', '<|eot_id|>,', '\n\n{user_name}'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
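The generation_params and formatter fields above fully determine how a conversation is rendered for the Llama 3 model: the persona memory becomes the system header, each turn gets a user or assistant header, and generation continues from the open assistant header until a stopping word or the 64-token output cap is hit. The following Python sketch is purely illustrative (the build_prompt helper and the example persona, names, and messages are invented, not taken from the pipeline; the templates themselves are the formatter values listed above):

```python
# Illustrative only: stitching the formatter templates above into a Llama 3 prompt.
# The persona, names, and messages below are invented example values.
FORMATTER = {
    "memory_template": "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n",
    "prompt_template": "{prompt}<|eot_id|>",
    "bot_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>",
    "user_template": "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>",
    "response_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """Concatenate persona memory, scenario prompt, chat turns, and the open response header."""
    text = FORMATTER["memory_template"].format(bot_name=bot_name, memory=memory)
    text += FORMATTER["prompt_template"].format(prompt=prompt)
    for role, message in turns:
        template = FORMATTER["user_template"] if role == "user" else FORMATTER["bot_template"]
        # str.format ignores unused keyword arguments, so both templates can share one call
        text += template.format(user_name=user_name, bot_name=bot_name, message=message)
    # The model completes from here; sampling (temperature=1.4, min_p=0.2, top_k=50)
    # stops at one of the configured stopping_words or after max_output_tokens=64.
    return text + FORMATTER["response_template"].format(bot_name=bot_name)

print(build_prompt(
    bot_name="Aria",
    user_name="Traveler",
    memory="A wandering bard who answers every question with a riddle.",
    prompt="Aria meets a stranger at the crossroads at dusk.",
    turns=[("user", "Which road leads to the city?")],
))
```

Because a newline is among the stopping words and max_output_tokens is 64, each sampled reply is a single short line; best_of: 16 draws sixteen such candidates per turn, presumably for reward-model reranking (see the best-of-N sketch after the MKMLizer stage below).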
Resubmit model
Running pipeline stage MKMLizer
Starting job with name sao10k-l3-rp-v5-1-v2-mkmlizer
Waiting for job on sao10k-l3-rp-v5-1-v2-mkmlizer to finish
Failed to get response for submission chaiml-sao10k-l3-rp-v3-3_v5: ('http://chaiml-sao10k-l3-rp-v3-3-v5-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:53802->127.0.0.1:8080: read: connection reset by peer\n')
sao10k-l3-rp-v5-1-v2-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ _____ __ __ ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ /___/ ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ Version: 0.8.14 ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ https://mk1.ai ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ The license key for the current software has been verified as ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ belonging to: ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ Chai Research Corp. ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
sao10k-l3-rp-v5-1-v2-mkmlizer: Downloaded to shared memory in 52.704s
sao10k-l3-rp-v5-1-v2-mkmlizer: quantizing model to /dev/shm/model_cache
sao10k-l3-rp-v5-1-v2-mkmlizer: Saving flywheel model at /dev/shm/model_cache
sao10k-l3-rp-v5-1-v2-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s] ... Loading 0: 98%|█████████▊| 284/291 [00:07<00:00, 78.08it/s]
sao10k-l3-rp-v5-1-v2-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
sao10k-l3-rp-v5-1-v2-mkmlizer: quantized model in 23.504s
sao10k-l3-rp-v5-1-v2-mkmlizer: Processed model Sao10K/L3-RP-v5.1 in 76.208s
sao10k-l3-rp-v5-1-v2-mkmlizer: creating bucket guanaco-mkml-models
sao10k-l3-rp-v5-1-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
sao10k-l3-rp-v5-1-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/sao10k-l3-rp-v5-1-v2
sao10k-l3-rp-v5-1-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/sao10k-l3-rp-v5-1-v2/config.json
sao10k-l3-rp-v5-1-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/sao10k-l3-rp-v5-1-v2/tokenizer_config.json
sao10k-l3-rp-v5-1-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/sao10k-l3-rp-v5-1-v2/special_tokens_map.json
sao10k-l3-rp-v5-1-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/sao10k-l3-rp-v5-1-v2/tokenizer.json
sao10k-l3-rp-v5-1-v2-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/sao10k-l3-rp-v5-1-v2/flywheel_model.0.safetensors
sao10k-l3-rp-v5-1-v2-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
sao10k-l3-rp-v5-1-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:919: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
sao10k-l3-rp-v5-1-v2-mkmlizer: warnings.warn(
sao10k-l3-rp-v5-1-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
sao10k-l3-rp-v5-1-v2-mkmlizer: warnings.warn(
sao10k-l3-rp-v5-1-v2-mkmlizer: Traceback (most recent call last):
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connection.py", line 174, in _new_conn
sao10k-l3-rp-v5-1-v2-mkmlizer: conn = connection.create_connection(
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/util/connection.py", line 72, in create_connection
sao10k-l3-rp-v5-1-v2-mkmlizer: for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/socket.py", line 955, in getaddrinfo
sao10k-l3-rp-v5-1-v2-mkmlizer: for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
sao10k-l3-rp-v5-1-v2-mkmlizer: socket.gaierror: [Errno -3] Temporary failure in name resolution
sao10k-l3-rp-v5-1-v2-mkmlizer: During handling of the above exception, another exception occurred:
sao10k-l3-rp-v5-1-v2-mkmlizer: Traceback (most recent call last):
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 714, in urlopen
sao10k-l3-rp-v5-1-v2-mkmlizer: httplib_response = self._make_request(
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 403, in _make_request
sao10k-l3-rp-v5-1-v2-mkmlizer: self._validate_conn(conn)
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 1053, in _validate_conn
sao10k-l3-rp-v5-1-v2-mkmlizer: conn.connect()
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connection.py", line 363, in connect
sao10k-l3-rp-v5-1-v2-mkmlizer: self.sock = conn = self._new_conn()
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connection.py", line 186, in _new_conn
sao10k-l3-rp-v5-1-v2-mkmlizer: raise NewConnectionError(
sao10k-l3-rp-v5-1-v2-mkmlizer: urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f5fefd11d50>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution
sao10k-l3-rp-v5-1-v2-mkmlizer: During handling of the above exception, another exception occurred:
sao10k-l3-rp-v5-1-v2-mkmlizer: Traceback (most recent call last):
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/adapters.py", line 486, in send
sao10k-l3-rp-v5-1-v2-mkmlizer: resp = conn.urlopen(
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 798, in urlopen
sao10k-l3-rp-v5-1-v2-mkmlizer: retries = retries.increment(
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/util/retry.py", line 592, in increment
sao10k-l3-rp-v5-1-v2-mkmlizer: raise MaxRetryError(_pool, url, error or ResponseError(cause))
sao10k-l3-rp-v5-1-v2-mkmlizer: urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /ChaiML/reward_gpt2_medium_preference_24m_e2/resolve/main/config.json (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f5fefd11d50>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))
sao10k-l3-rp-v5-1-v2-mkmlizer: During handling of the above exception, another exception occurred:
sao10k-l3-rp-v5-1-v2-mkmlizer: Traceback (most recent call last):
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1722, in _get_metadata_or_catch_error
sao10k-l3-rp-v5-1-v2-mkmlizer: metadata = get_hf_file_metadata(url=url, proxies=proxies, timeout=etag_timeout, headers=headers)
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
sao10k-l3-rp-v5-1-v2-mkmlizer: return fn(*args, **kwargs)
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1645, in get_hf_file_metadata
sao10k-l3-rp-v5-1-v2-mkmlizer: r = _request_wrapper(
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 372, in _request_wrapper
sao10k-l3-rp-v5-1-v2-mkmlizer: response = _request_wrapper(
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 395, in _request_wrapper
sao10k-l3-rp-v5-1-v2-mkmlizer: response = get_session().request(method=method, url=url, **params)
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/sessions.py", line 589, in request
sao10k-l3-rp-v5-1-v2-mkmlizer: resp = self.send(prep, **send_kwargs)
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/sessions.py", line 703, in send
sao10k-l3-rp-v5-1-v2-mkmlizer: r = adapter.send(request, **kwargs)
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_http.py", line 66, in send
sao10k-l3-rp-v5-1-v2-mkmlizer: return super().send(request, *args, **kwargs)
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/adapters.py", line 519, in send
sao10k-l3-rp-v5-1-v2-mkmlizer: raise ConnectionError(e, request=request)
sao10k-l3-rp-v5-1-v2-mkmlizer: requests.exceptions.ConnectionError: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /ChaiML/reward_gpt2_medium_preference_24m_e2/resolve/main/config.json (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f5fefd11d50>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))"), '(Request ID: 8a0be281-2111-4ca2-8cbf-d22fa812b712)')
sao10k-l3-rp-v5-1-v2-mkmlizer: The above exception was the direct cause of the following exception:
sao10k-l3-rp-v5-1-v2-mkmlizer: Traceback (most recent call last):
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/transformers/utils/hub.py", line 399, in cached_file
sao10k-l3-rp-v5-1-v2-mkmlizer: resolved_file = hf_hub_download(
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
sao10k-l3-rp-v5-1-v2-mkmlizer: return fn(*args, **kwargs)
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1221, in hf_hub_download
sao10k-l3-rp-v5-1-v2-mkmlizer: return _hf_hub_download_to_cache_dir(
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1325, in _hf_hub_download_to_cache_dir
sao10k-l3-rp-v5-1-v2-mkmlizer: _raise_on_head_call_error(head_call_error, force_download, local_files_only)
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1826, in _raise_on_head_call_error
sao10k-l3-rp-v5-1-v2-mkmlizer: raise LocalEntryNotFoundError(
sao10k-l3-rp-v5-1-v2-mkmlizer: huggingface_hub.utils._errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.
sao10k-l3-rp-v5-1-v2-mkmlizer: The above exception was the direct cause of the following exception:
sao10k-l3-rp-v5-1-v2-mkmlizer: Traceback (most recent call last):
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/code/uploading/reward.py", line 66, in <module>
sao10k-l3-rp-v5-1-v2-mkmlizer: cli()
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1128, in __call__
sao10k-l3-rp-v5-1-v2-mkmlizer: return self.main(*args, **kwargs)
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1053, in main
sao10k-l3-rp-v5-1-v2-mkmlizer: rv = self.invoke(ctx)
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1659, in invoke
sao10k-l3-rp-v5-1-v2-mkmlizer: return _process_result(sub_ctx.command.invoke(sub_ctx))
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1395, in invoke
sao10k-l3-rp-v5-1-v2-mkmlizer: return ctx.invoke(self.callback, **ctx.params)
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 754, in invoke
sao10k-l3-rp-v5-1-v2-mkmlizer: return __callback(*args, **kwargs)
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/code/uploading/reward.py", line 29, in tensorize_reward_model
sao10k-l3-rp-v5-1-v2-mkmlizer: model, config, tokenizer = download_from_huggingface(repo_id, revision, hf_auth_token)
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/code/uploading/reward.py", line 36, in download_from_huggingface
sao10k-l3-rp-v5-1-v2-mkmlizer: config = AutoConfig.from_pretrained(model, revision=revision, use_auth_token=token)
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 934, in from_pretrained
sao10k-l3-rp-v5-1-v2-mkmlizer: config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/transformers/configuration_utils.py", line 632, in get_config_dict
sao10k-l3-rp-v5-1-v2-mkmlizer: config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/transformers/configuration_utils.py", line 689, in _get_config_dict
sao10k-l3-rp-v5-1-v2-mkmlizer: resolved_config_file = cached_file(
sao10k-l3-rp-v5-1-v2-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/transformers/utils/hub.py", line 442, in cached_file
sao10k-l3-rp-v5-1-v2-mkmlizer: raise EnvironmentError(
sao10k-l3-rp-v5-1-v2-mkmlizer: OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like ChaiML/reward_gpt2_medium_preference_24m_e2 is not the path to a directory containing a file named config.json.
sao10k-l3-rp-v5-1-v2-mkmlizer: Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
Job sao10k-l3-rp-v5-1-v2-mkmlizer completed after 128.09s with status: failed
Stopping job with name sao10k-l3-rp-v5-1-v2-mkmlizer
%s, retrying in %s seconds...
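The first attempt failed before the reward model could be processed: name resolution for huggingface.co failed inside the job, so AutoConfig.from_pretrained could neither download config.json nor find it in the local cache. The retry below succeeds once connectivity is restored. For reference, a minimal sketch of the offline-mode fallback the error message points to, assuming the reward-model files already exist in the local Hugging Face cache (this is not the pipeline's code):

```python
# Hedged sketch: load the reward-model config/tokenizer without touching the network,
# assuming the files are already present in the local Hugging Face cache.
import os

os.environ["HF_HUB_OFFLINE"] = "1"  # must be set before huggingface_hub is imported

from transformers import AutoConfig, AutoTokenizer

repo_id = "ChaiML/reward_gpt2_medium_preference_24m_e2"  # repo id taken from the log above

# local_files_only resolves everything from the cache and fails fast with a clear
# error instead of retrying huggingface.co when DNS is unavailable.
config = AutoConfig.from_pretrained(repo_id, local_files_only=True)
tokenizer = AutoTokenizer.from_pretrained(repo_id, local_files_only=True)
```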
Starting job with name sao10k-l3-rp-v5-1-v2-mkmlizer
Waiting for job on sao10k-l3-rp-v5-1-v2-mkmlizer to finish
sao10k-l3-rp-v5-1-v2-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ _____ __ __ ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ /___/ ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ Version: 0.8.14 ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ https://mk1.ai ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ The license key for the current software has been verified as ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ belonging to: ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ Chai Research Corp. ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ║ ║
sao10k-l3-rp-v5-1-v2-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
sao10k-l3-rp-v5-1-v2-mkmlizer: Downloaded to shared memory in 23.370s
sao10k-l3-rp-v5-1-v2-mkmlizer: quantizing model to /dev/shm/model_cache
sao10k-l3-rp-v5-1-v2-mkmlizer: Saving flywheel model at /dev/shm/model_cache
sao10k-l3-rp-v5-1-v2-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s] ... Loading 0: 96%|█████████▌| 279/291 [00:08<00:00, 71.61it/s]
sao10k-l3-rp-v5-1-v2-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
sao10k-l3-rp-v5-1-v2-mkmlizer: quantized model in 29.271s
sao10k-l3-rp-v5-1-v2-mkmlizer: Processed model Sao10K/L3-RP-v5.1 in 52.642s
sao10k-l3-rp-v5-1-v2-mkmlizer: creating bucket guanaco-mkml-models
sao10k-l3-rp-v5-1-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
sao10k-l3-rp-v5-1-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/sao10k-l3-rp-v5-1-v2
sao10k-l3-rp-v5-1-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/sao10k-l3-rp-v5-1-v2/config.json
sao10k-l3-rp-v5-1-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/sao10k-l3-rp-v5-1-v2/special_tokens_map.json
sao10k-l3-rp-v5-1-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/sao10k-l3-rp-v5-1-v2/tokenizer_config.json
sao10k-l3-rp-v5-1-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/sao10k-l3-rp-v5-1-v2/tokenizer.json
sao10k-l3-rp-v5-1-v2-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/sao10k-l3-rp-v5-1-v2/flywheel_model.0.safetensors
sao10k-l3-rp-v5-1-v2-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
sao10k-l3-rp-v5-1-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:919: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
sao10k-l3-rp-v5-1-v2-mkmlizer: warnings.warn(
sao10k-l3-rp-v5-1-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
sao10k-l3-rp-v5-1-v2-mkmlizer: warnings.warn(
sao10k-l3-rp-v5-1-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:769: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
sao10k-l3-rp-v5-1-v2-mkmlizer: warnings.warn(
sao10k-l3-rp-v5-1-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
sao10k-l3-rp-v5-1-v2-mkmlizer: warnings.warn(
Failed to get response for submission blend_pefis_2024-07-04: ('http://mistralai-mixtral-8x7b-3473-v33-predictor-default.tenant-chaiml-guanaco.knative.ord1.coreweave.cloud/v1/models/GPT-J-6B-lit-v2:predict', '{"error":"TypeError : SamplingParameters.__init__() got an unexpected keyword argument \'min_p\'"}')
sao10k-l3-rp-v5-1-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/torch/_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
sao10k-l3-rp-v5-1-v2-mkmlizer: return self.fget.__get__(instance, owner)()
sao10k-l3-rp-v5-1-v2-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
sao10k-l3-rp-v5-1-v2-mkmlizer: Saving duration: 0.519s
sao10k-l3-rp-v5-1-v2-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 4.842s
sao10k-l3-rp-v5-1-v2-mkmlizer: creating bucket guanaco-reward-models
sao10k-l3-rp-v5-1-v2-mkmlizer: Bucket 's3://guanaco-reward-models/' created
sao10k-l3-rp-v5-1-v2-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/sao10k-l3-rp-v5-1-v2_reward
sao10k-l3-rp-v5-1-v2-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/sao10k-l3-rp-v5-1-v2_reward/tokenizer_config.json
sao10k-l3-rp-v5-1-v2-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/sao10k-l3-rp-v5-1-v2_reward/config.json
sao10k-l3-rp-v5-1-v2-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/sao10k-l3-rp-v5-1-v2_reward/special_tokens_map.json
sao10k-l3-rp-v5-1-v2-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/sao10k-l3-rp-v5-1-v2_reward/merges.txt
sao10k-l3-rp-v5-1-v2-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/sao10k-l3-rp-v5-1-v2_reward/vocab.json
sao10k-l3-rp-v5-1-v2-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/sao10k-l3-rp-v5-1-v2_reward/tokenizer.json
sao10k-l3-rp-v5-1-v2-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/sao10k-l3-rp-v5-1-v2_reward/reward.tensors
Job sao10k-l3-rp-v5-1-v2-mkmlizer completed after 84.92s with status: succeeded
Stopping job with name sao10k-l3-rp-v5-1-v2-mkmlizer
Pipeline stage MKMLizer completed in 214.51s
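The reward model packaged in this stage is what best_of: 16 in the generation params pairs with: at serving time several candidate replies are sampled and, presumably, the one the reward model scores highest is returned. A minimal sketch under that assumption (best_of_n, render_for_reward, and reward_score are hypothetical names, not the pipeline's code; the flat templates are the reward_formatter listed in the metadata above):

```python
# Hedged sketch of best-of-N reranking; an assumption about how best_of=16 and the
# reward model interact, not the pipeline's actual implementation.
from typing import Callable, List, Tuple

def render_for_reward(bot_name: str, memory: str, prompt: str,
                      history: List[Tuple[str, str]], candidate: str) -> str:
    """Flatten the conversation using the reward_formatter templates listed above."""
    text = f"{bot_name}'s Persona: {memory}\n####\n"   # memory_template
    text += f"{prompt}\n<START>\n"                     # prompt_template
    for speaker, message in history:                   # user_template / bot_template
        text += f"{speaker}: {message}\n"
    return text + f"{bot_name}:" + candidate           # response_template + candidate reply

def best_of_n(candidates: List[str], reward_score: Callable[[str], float],
              **context) -> str:
    """Return the candidate whose rendered conversation the reward model scores highest."""
    return max(candidates,
               key=lambda cand: reward_score(render_for_reward(candidate=cand, **context)))
```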
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.14s
Running pipeline stage ISVCDeployer
Creating inference service sao10k-l3-rp-v5-1-v2
Waiting for inference service sao10k-l3-rp-v5-1-v2 to be ready
Inference service sao10k-l3-rp-v5-1-v2 ready after 50.28175163269043s
Pipeline stage ISVCDeployer completed in 57.34s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.1026082038879395s
Received healthy response to inference request in 1.3801159858703613s
Received healthy response to inference request in 1.3446300029754639s
Received healthy response to inference request in 1.2955729961395264s
Received healthy response to inference request in 1.3521733283996582s
5 requests
0 failed requests
5th percentile: 1.3053843975067139
10th percentile: 1.3151957988739014
20th percentile: 1.3348186016082764
30th percentile: 1.3461386680603027
40th percentile: 1.3491559982299806
50th percentile: 1.3521733283996582
60th percentile: 1.3633503913879395
70th percentile: 1.3745274543762207
80th percentile: 1.524614429473877
90th percentile: 1.8136113166809082
95th percentile: 1.9581097602844237
99th percentile: 2.0737085151672363
mean time: 1.4950201034545898
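The StressChecker figures above are internally consistent: they are the linearly interpolated percentiles and the mean of the five request latencies logged just before them, which a few lines of NumPy reproduce (illustration only, not the StressChecker's own code):

```python
# Reproduce the StressChecker summary statistics from the five logged latencies.
import numpy as np

latencies = [2.1026082038879395, 1.3801159858703613, 1.3446300029754639,
             1.2955729961395264, 1.3521733283996582]

# np.percentile defaults to linear interpolation, which matches the logged values.
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print(f"mean time: {np.mean(latencies)}")
```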
Pipeline stage StressChecker completed in 8.70s
sao10k-l3-rp-v5-1_v2 status is now deployed due to DeploymentManager action
sao10k-l3-rp-v5-1_v2 status is now inactive due to auto deactivation of underperforming models
admin requested tearing down of sao10k-l3-rp-v5-1_v2
Running pipeline stage ISVCDeleter
Checking if service sao10k-l3-rp-v5-1-v2 is running
Skipping teardown as no inference service was found
Pipeline stage ISVCDeleter completed in 4.98s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key sao10k-l3-rp-v5-1-v2/config.json from bucket guanaco-mkml-models
Deleting key sao10k-l3-rp-v5-1-v2/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key sao10k-l3-rp-v5-1-v2/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key sao10k-l3-rp-v5-1-v2/tokenizer.json from bucket guanaco-mkml-models
Deleting key sao10k-l3-rp-v5-1-v2/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key sao10k-l3-rp-v5-1-v2_reward/config.json from bucket guanaco-reward-models
Deleting key sao10k-l3-rp-v5-1-v2_reward/merges.txt from bucket guanaco-reward-models
Deleting key sao10k-l3-rp-v5-1-v2_reward/reward.tensors from bucket guanaco-reward-models
Deleting key sao10k-l3-rp-v5-1-v2_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key sao10k-l3-rp-v5-1-v2_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key sao10k-l3-rp-v5-1-v2_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key sao10k-l3-rp-v5-1-v2_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 6.32s
sao10k-l3-rp-v5-1_v2 status is now torndown due to DeploymentManager action