submission_id: gryphe-mythomax-l2-13b_v49
developer_uid: Gryphe
status: torndown
model_repo: Gryphe/MythoMax-L2-13b
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\\####\\", 'prompt_template': '{prompt}\\<START>\\', 'bot_template': '{bot_name}: {message}\\', 'user_template': '{user_name}: {message}\\', 'response_template': '{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'memory_template': "{bot_name}'s Persona: {memory}\\####\\", 'prompt_template': '{prompt}\\<START>\\', 'bot_template': '{bot_name}: {message}\\', 'user_template': '{user_name}: {message}\\', 'response_template': '{bot_name}:', 'truncate_by_message': False}
timestamp: 2024-05-15T16:43:23+00:00
model_name: gryphe-mythomax-l2-13b_v48
model_eval_status: success
model_group: Gryphe/MythoMax-L2-13b
num_battles: 9082
num_wins: 4145
celo_rating: 1145.88
propriety_score: 0.0
propriety_total_count: 0.0
submission_type: basic
model_architecture: LlamaForCausalLM
model_num_parameters: 13015864320.0
best_of: 8
max_input_tokens: 512
max_output_tokens: 64
display_name: gryphe-mythomax-l2-13b_v48
ineligible_reason: propriety_total_count < 800
language_model: Gryphe/MythoMax-L2-13b
model_size: 13B
reward_model: ChaiML/reward_gpt2_medium_preference_24m_e2
us_pacific_date: 2024-05-15
win_ratio: 0.4563972693239375
preference_data_url: None
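For orientation, here is a minimal sketch of how the formatter templates and generation_params above could be applied with the Hugging Face transformers API. The build_prompt helper, the example persona and chat history, the reading of the escaped template separators as newlines, and the use of num_return_sequences to stand in for best_of are all assumptions; the production path quantizes the model and serves it through the MK1 flywheel engine rather than transformers.generate, and min_p / the '\n' stopping word are handled by that serving stack.

```python
# Illustrative sketch only: reconstructs a prompt from the templates above
# and samples candidate replies with the listed generation parameters.
from transformers import AutoModelForCausalLM, AutoTokenizer

def build_prompt(memory, prompt, history, bot_name):
    # memory_template / prompt_template / user_template / response_template
    parts = [f"{bot_name}'s Persona: {memory}\n####\n", f"{prompt}\n<START>\n"]
    parts += [f"{speaker}: {message}\n" for speaker, message in history]
    parts.append(f"{bot_name}:")  # response_template primes the bot's next turn
    return "".join(parts)

tokenizer = AutoTokenizer.from_pretrained("Gryphe/MythoMax-L2-13b")
model = AutoModelForCausalLM.from_pretrained("Gryphe/MythoMax-L2-13b", torch_dtype="auto")

text = build_prompt("A curious dragon.", "Roleplay as Ember.", [("User", "Hi!")], "Ember")
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)  # max_input_tokens
candidates = model.generate(**inputs, do_sample=True, temperature=1.0, top_p=1.0,
                            top_k=40, max_new_tokens=64,       # max_output_tokens
                            num_return_sequences=8)            # best_of=8 candidates
```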
Resubmit model
Running pipeline stage MKMLizer
Starting job with name gryphe-mythomax-l2-13b-v49-mkmlizer
Waiting for job on gryphe-mythomax-l2-13b-v49-mkmlizer to finish
gryphe-mythomax-l2-13b-v49-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ _____ __ __ ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ /___/ ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ Version: 0.8.14 ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ https://mk1.ai ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ The license key for the current software has been verified as ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ belonging to: ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ Chai Research Corp. ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
gryphe-mythomax-l2-13b-v49-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
gryphe-mythomax-l2-13b-v49-mkmlizer: warnings.warn(warning_message, FutureWarning)
gryphe-mythomax-l2-13b-v49-mkmlizer: Traceback (most recent call last):
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connection.py", line 174, in _new_conn
gryphe-mythomax-l2-13b-v49-mkmlizer: conn = connection.create_connection(
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/util/connection.py", line 72, in create_connection
gryphe-mythomax-l2-13b-v49-mkmlizer: for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/socket.py", line 955, in getaddrinfo
gryphe-mythomax-l2-13b-v49-mkmlizer: for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
gryphe-mythomax-l2-13b-v49-mkmlizer: socket.gaierror: [Errno -3] Temporary failure in name resolution
gryphe-mythomax-l2-13b-v49-mkmlizer: During handling of the above exception, another exception occurred:
gryphe-mythomax-l2-13b-v49-mkmlizer: Traceback (most recent call last):
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 714, in urlopen
gryphe-mythomax-l2-13b-v49-mkmlizer: httplib_response = self._make_request(
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 403, in _make_request
gryphe-mythomax-l2-13b-v49-mkmlizer: self._validate_conn(conn)
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 1053, in _validate_conn
gryphe-mythomax-l2-13b-v49-mkmlizer: conn.connect()
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connection.py", line 363, in connect
gryphe-mythomax-l2-13b-v49-mkmlizer: self.sock = conn = self._new_conn()
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connection.py", line 186, in _new_conn
gryphe-mythomax-l2-13b-v49-mkmlizer: raise NewConnectionError(
gryphe-mythomax-l2-13b-v49-mkmlizer: urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f96210d0d00>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution
gryphe-mythomax-l2-13b-v49-mkmlizer: During handling of the above exception, another exception occurred:
gryphe-mythomax-l2-13b-v49-mkmlizer: Traceback (most recent call last):
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/adapters.py", line 486, in send
gryphe-mythomax-l2-13b-v49-mkmlizer: resp = conn.urlopen(
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 798, in urlopen
gryphe-mythomax-l2-13b-v49-mkmlizer: retries = retries.increment(
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/util/retry.py", line 592, in increment
gryphe-mythomax-l2-13b-v49-mkmlizer: raise MaxRetryError(_pool, url, error or ResponseError(cause))
gryphe-mythomax-l2-13b-v49-mkmlizer: urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='cdn-lfs.huggingface.co', port=443): Max retries exceeded with url: /repos/55/04/26d0c556bfe653ed92965/84e1dac08e595d295a249bdf22f674663ceacc3d20553fa79475ba1d0cfd4ae8?response-content-disposition=attachment%3B+filename*%3DUTF-8%27%27pytorch_model-00013-of-00013.bin%3B+filename%3D%22pytorch_model-00013-of-00013.bin%22%3B&response-content-type=application%2Foctet-stream&Expires=1716049749&Policy=eyJTdGF0ZW1lbnQiOlt7IkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTcxNjA0OTc0OX19LCJSZXNvdXJjZSI6Imh0dHBzOi8vY2RuLWxmcy5odWdnaW5nZmFjZS5jby9yZXBvcy81NS8wNC81NTA0ZmM3MzgxNDQ5YzlhNDQxNjM4NDk3OTUzODIyYTZjY2EyYjVmZDgxMjZkMGM1NTZiZmU2NTNlZDkyOTY1Lzg0ZTFkYWMwOGU1OTVkMjk1YTI0OWJkZjIyZjY3NDY2M2NlYWNjM2QyMDU1M2ZhNzk0NzViYTFkMGNmZDRhZTg~cmVzcG9uc2UtY29udGVudC1kaXNwb3NpdGlvbj0qJnJlc3BvbnNlLWNvbnRlbnQtdHlwZT0qIn1dfQ__&Signature=pWQgqeE6UU7LvJX55kHWOdWYmRRhaFDfWIB0glTc12QAT7ExEcCi41YzfNUD5qP56e-JyRolY5I5tW0bE~faJU2t5p3JdgJhC9mjna4HuBTmMGWEOc3ZtImvUndV0HyS3SNW8iPkNmSw2xb2PX8WECg-GhYlSfJjJFiwlkCNg~NaXtW6fvJK60qElPgYQ~BaevATTwE7k3rNwNfzvagQhZ23NlzCu-HGxMvmQzJzlX6Iz37iCX~16dHQvWC85LERpopi~Ts~Zpnd-d9ATD4edqINNyRIIMTouxKfT-~VIiu72B~FMWVVbDzNpTkuO1Wvu7gdR3Rc-nuurkGHNppJjg__&Key-Pair-Id=KVTP0A1DKRTAX (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f96210d0d00>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))
gryphe-mythomax-l2-13b-v49-mkmlizer: During handling of the above exception, another exception occurred:
gryphe-mythomax-l2-13b-v49-mkmlizer: Traceback (most recent call last):
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/code/uploading/mkmlize.py", line 151, in <module>
gryphe-mythomax-l2-13b-v49-mkmlizer: cli()
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1128, in __call__
gryphe-mythomax-l2-13b-v49-mkmlizer: return self.main(*args, **kwargs)
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1053, in main
gryphe-mythomax-l2-13b-v49-mkmlizer: rv = self.invoke(ctx)
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1659, in invoke
gryphe-mythomax-l2-13b-v49-mkmlizer: return _process_result(sub_ctx.command.invoke(sub_ctx))
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1395, in invoke
gryphe-mythomax-l2-13b-v49-mkmlizer: return ctx.invoke(self.callback, **ctx.params)
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 754, in invoke
gryphe-mythomax-l2-13b-v49-mkmlizer: return __callback(*args, **kwargs)
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/code/uploading/mkmlize.py", line 38, in quantize
gryphe-mythomax-l2-13b-v49-mkmlizer: temp_folder = download_to_shared_memory(repo_id, revision, hf_auth_token)
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/code/uploading/mkmlize.py", line 65, in download_to_shared_memory
gryphe-mythomax-l2-13b-v49-mkmlizer: snapshot_download(
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 119, in _inner_fn
gryphe-mythomax-l2-13b-v49-mkmlizer: return fn(*args, **kwargs)
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/_snapshot_download.py", line 314, in snapshot_download
gryphe-mythomax-l2-13b-v49-mkmlizer: _inner_hf_hub_download(file)
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/_snapshot_download.py", line 290, in _inner_hf_hub_download
gryphe-mythomax-l2-13b-v49-mkmlizer: return hf_hub_download(
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 119, in _inner_fn
gryphe-mythomax-l2-13b-v49-mkmlizer: return fn(*args, **kwargs)
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1492, in hf_hub_download
gryphe-mythomax-l2-13b-v49-mkmlizer: http_get(
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 456, in http_get
gryphe-mythomax-l2-13b-v49-mkmlizer: r = _request_wrapper(
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 392, in _request_wrapper
gryphe-mythomax-l2-13b-v49-mkmlizer: response = get_session().request(method=method, url=url, **params)
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/sessions.py", line 589, in request
gryphe-mythomax-l2-13b-v49-mkmlizer: resp = self.send(prep, **send_kwargs)
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/sessions.py", line 703, in send
gryphe-mythomax-l2-13b-v49-mkmlizer: r = adapter.send(request, **kwargs)
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_http.py", line 68, in send
gryphe-mythomax-l2-13b-v49-mkmlizer: return super().send(request, *args, **kwargs)
gryphe-mythomax-l2-13b-v49-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/adapters.py", line 519, in send
gryphe-mythomax-l2-13b-v49-mkmlizer: raise ConnectionError(e, request=request)
gryphe-mythomax-l2-13b-v49-mkmlizer: requests.exceptions.ConnectionError: (MaxRetryError("HTTPSConnectionPool(host='cdn-lfs.huggingface.co', port=443): Max retries exceeded with url: /repos/55/04/26d0c556bfe653ed92965/84e1dac08e595d295a249bdf22f674663ceacc3d20553fa79475ba1d0cfd4ae8?response-content-disposition=attachment%3B+filename*%3DUTF-8%27%27pytorch_model-00013-of-00013.bin%3B+filename%3D%22pytorch_model-00013-of-00013.bin%22%3B&response-content-type=application%2Foctet-stream&Expires=1716049749&Policy=eyJTdGF0ZW1lbnQiOlt7IkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTcxNjA0OTc0OX19LCJSZXNvdXJjZSI6Imh0dHBzOi8vY2RuLWxmcy5odWdnaW5nZmFjZS5jby9yZXBvcy81NS8wNC81NTA0ZmM3MzgxNDQ5YzlhNDQxNjM4NDk3OTUzODIyYTZjY2EyYjVmZDgxMjZkMGM1NTZiZmU2NTNlZDkyOTY1Lzg0ZTFkYWMwOGU1OTVkMjk1YTI0OWJkZjIyZjY3NDY2M2NlYWNjM2QyMDU1M2ZhNzk0NzViYTFkMGNmZDRhZTg~cmVzcG9uc2UtY29udGVudC1kaXNwb3NpdGlvbj0qJnJlc3BvbnNlLWNvbnRlbnQtdHlwZT0qIn1dfQ__&Signature=pWQgqeE6UU7LvJX55kHWOdWYmRRhaFDfWIB0glTc12QAT7ExEcCi41YzfNUD5qP56e-JyRolY5I5tW0bE~faJU2t5p3JdgJhC9mjna4HuBTmMGWEOc3ZtImvUndV0HyS3SNW8iPkNmSw2xb2PX8WECg-GhYlSfJjJFiwlkCNg~NaXtW6fvJK60qElPgYQ~BaevATTwE7k3rNwNfzvagQhZ23NlzCu-HGxMvmQzJzlX6Iz37iCX~16dHQvWC85LERpopi~Ts~Zpnd-d9ATD4edqINNyRIIMTouxKfT-~VIiu72B~FMWVVbDzNpTkuO1Wvu7gdR3Rc-nuurkGHNppJjg__&Key-Pair-Id=KVTP0A1DKRTAX (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f96210d0d00>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))"), '(Request ID: d09a401e-cedd-4ef0-92d7-3c1977d910c9)')
Job gryphe-mythomax-l2-13b-v49-mkmlizer completed after 157.15s with status: failed
Stopping job with name gryphe-mythomax-l2-13b-v49-mkmlizer
%s, retrying in %s seconds...
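The first job died because DNS resolution failed mid-download (socket.gaierror propagating up through urllib3, requests, and huggingface_hub's snapshot_download), so the pipeline simply reruns the job. Below is a minimal sketch of the same retry-on-connection-error pattern; the attempt count, delay, and target directory are assumptions, not the pipeline's actual retry logic.

```python
# Illustrative sketch: retry a snapshot download on transient network errors
# such as 'Temporary failure in name resolution'.
import time
import requests
from huggingface_hub import snapshot_download

def download_with_retries(repo_id, revision=None, attempts=3, delay=30):
    for attempt in range(1, attempts + 1):
        try:
            return snapshot_download(repo_id=repo_id, revision=revision,
                                     local_dir="/dev/shm/model_cache")
        except requests.exceptions.ConnectionError as err:
            if attempt == attempts:
                raise
            print(f"{err}, retrying in {delay} seconds...")
            time.sleep(delay)

# download_with_retries("Gryphe/MythoMax-L2-13b")
```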
Starting job with name gryphe-mythomax-l2-13b-v49-mkmlizer
Waiting for job on gryphe-mythomax-l2-13b-v49-mkmlizer to finish
gryphe-mythomax-l2-13b-v49-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ _____ __ __ ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ /___/ ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ Version: 0.8.14 ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ https://mk1.ai ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ The license key for the current software has been verified as ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ belonging to: ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ Chai Research Corp. ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ║ ║
gryphe-mythomax-l2-13b-v49-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
gryphe-mythomax-l2-13b-v49-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
gryphe-mythomax-l2-13b-v49-mkmlizer: warnings.warn(warning_message, FutureWarning)
gryphe-mythomax-l2-13b-v49-mkmlizer: Downloaded to shared memory in 37.609s
gryphe-mythomax-l2-13b-v49-mkmlizer: quantizing model to /dev/shm/model_cache
gryphe-mythomax-l2-13b-v49-mkmlizer: Saving flywheel model at /dev/shm/model_cache
gryphe-mythomax-l2-13b-v49-mkmlizer: quantized model in 23.535s
gryphe-mythomax-l2-13b-v49-mkmlizer: Processed model Gryphe/MythoMax-L2-13b in 65.223s
gryphe-mythomax-l2-13b-v49-mkmlizer: creating bucket guanaco-mkml-models
gryphe-mythomax-l2-13b-v49-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
gryphe-mythomax-l2-13b-v49-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/gryphe-mythomax-l2-13b-v49
gryphe-mythomax-l2-13b-v49-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/gryphe-mythomax-l2-13b-v49/config.json
gryphe-mythomax-l2-13b-v49-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/gryphe-mythomax-l2-13b-v49/special_tokens_map.json
gryphe-mythomax-l2-13b-v49-mkmlizer: cp /dev/shm/model_cache/added_tokens.json s3://guanaco-mkml-models/gryphe-mythomax-l2-13b-v49/added_tokens.json
gryphe-mythomax-l2-13b-v49-mkmlizer: cp /dev/shm/model_cache/tokenizer.model s3://guanaco-mkml-models/gryphe-mythomax-l2-13b-v49/tokenizer.model
gryphe-mythomax-l2-13b-v49-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/gryphe-mythomax-l2-13b-v49/tokenizer.json
gryphe-mythomax-l2-13b-v49-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/gryphe-mythomax-l2-13b-v49/tokenizer_config.json
gryphe-mythomax-l2-13b-v49-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/gryphe-mythomax-l2-13b-v49/flywheel_model.0.safetensors
gryphe-mythomax-l2-13b-v49-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
gryphe-mythomax-l2-13b-v49-mkmlizer: Loading 0: 100%|██████████| 363/363 [00:06<00:00, 34.88it/s]
gryphe-mythomax-l2-13b-v49-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:913: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.

gryphe-mythomax-l2-13b-v49-mkmlizer: warnings.warn(
gryphe-mythomax-l2-13b-v49-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:757: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
gryphe-mythomax-l2-13b-v49-mkmlizer: warnings.warn(
gryphe-mythomax-l2-13b-v49-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
gryphe-mythomax-l2-13b-v49-mkmlizer: warnings.warn(
gryphe-mythomax-l2-13b-v49-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
gryphe-mythomax-l2-13b-v49-mkmlizer: Saving duration: 0.417s
gryphe-mythomax-l2-13b-v49-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 5.162s
gryphe-mythomax-l2-13b-v49-mkmlizer: creating bucket guanaco-reward-models
gryphe-mythomax-l2-13b-v49-mkmlizer: Bucket 's3://guanaco-reward-models/' created
gryphe-mythomax-l2-13b-v49-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/gryphe-mythomax-l2-13b-v49_reward
gryphe-mythomax-l2-13b-v49-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/gryphe-mythomax-l2-13b-v49_reward/special_tokens_map.json
gryphe-mythomax-l2-13b-v49-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/gryphe-mythomax-l2-13b-v49_reward/config.json
gryphe-mythomax-l2-13b-v49-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/gryphe-mythomax-l2-13b-v49_reward/tokenizer_config.json
gryphe-mythomax-l2-13b-v49-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/gryphe-mythomax-l2-13b-v49_reward/merges.txt
gryphe-mythomax-l2-13b-v49-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/gryphe-mythomax-l2-13b-v49_reward/vocab.json
gryphe-mythomax-l2-13b-v49-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/gryphe-mythomax-l2-13b-v49_reward/tokenizer.json
gryphe-mythomax-l2-13b-v49-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/gryphe-mythomax-l2-13b-v49_reward/reward.tensors
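With best_of set to 8, the serving stack samples several candidate completions per request and returns the one this reward model scores highest. Below is a minimal sketch of that reranking step; it assumes the reward repo loads as a standard sequence-classification head and that its final logit can be read as the preference score, and the prompt/candidate strings are made up.

```python
# Illustrative sketch: best_of-style reranking with the uploaded reward model.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

reward_repo = "ChaiML/reward_gpt2_medium_preference_24m_e2"
reward_tokenizer = AutoTokenizer.from_pretrained(reward_repo)
reward_model = AutoModelForSequenceClassification.from_pretrained(reward_repo).eval()

def pick_best(prompt, candidates):
    scores = []
    for candidate in candidates:
        inputs = reward_tokenizer(prompt + candidate, return_tensors="pt",
                                  truncation=True, max_length=1024)
        with torch.no_grad():
            # Assumption: the last logit of the classification head is the score.
            scores.append(reward_model(**inputs).logits[0, -1].item())
    return max(zip(scores, candidates))[1]

# pick_best("Ember:", [" Hello there!", " *snorts smoke* Who dares enter?"])
```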
Job gryphe-mythomax-l2-13b-v49-mkmlizer completed after 138.0s with status: succeeded
Stopping job with name gryphe-mythomax-l2-13b-v49-mkmlizer
Pipeline stage MKMLizer completed in 299.85s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.11s
Running pipeline stage ISVCDeployer
Creating inference service gryphe-mythomax-l2-13b-v49
Waiting for inference service gryphe-mythomax-l2-13b-v49 to be ready
Inference service gryphe-mythomax-l2-13b-v49 ready after 40.277883768081665s
Pipeline stage ISVCDeployer completed in 47.62s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.756518840789795s
Received healthy response to inference request in 1.737701416015625s
Received healthy response to inference request in 1.7478070259094238s
Received healthy response to inference request in 1.7563755512237549s
Received healthy response to inference request in 1.7430171966552734s
5 requests
0 failed requests
5th percentile: 1.7387645721435547
10th percentile: 1.7398277282714845
20th percentile: 1.7419540405273437
30th percentile: 1.7439751625061035
40th percentile: 1.7458910942077637
50th percentile: 1.7478070259094238
60th percentile: 1.7512344360351562
70th percentile: 1.7546618461608887
80th percentile: 1.9564042091369631
90th percentile: 2.356461524963379
95th percentile: 2.556490182876587
99th percentile: 2.7165131092071535
mean time: 1.9482840061187745
Pipeline stage StressChecker completed in 10.98s
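The percentile summary above is interpolated from just the five per-request latencies, which is why most values sit between the sorted middle samples. A short sketch reproducing it, assuming NumPy's default linear interpolation:

```python
# Reproduces the StressChecker summary from the five logged latencies (seconds).
import numpy as np

latencies = [2.756518840789795, 1.737701416015625, 1.7478070259094238,
             1.7563755512237549, 1.7430171966552734]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print(f"mean time: {np.mean(latencies)}")
```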
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.05s
Running pipeline stage DaemonicSafetyScorer
Running M-Eval for topic stay_in_character
Pipeline stage DaemonicSafetyScorer completed in 0.04s
M-Eval Dataset for topic stay_in_character is loaded
gryphe-mythomax-l2-13b_v49 status is now deployed due to DeploymentManager action
gryphe-mythomax-l2-13b_v49 status is now inactive due to auto-deactivation of underperforming models
admin requested tearing down of gryphe-mythomax-l2-13b_v49
Running pipeline stage ISVCDeleter
Checking if service gryphe-mythomax-l2-13b-v49 is running
Tearing down inference service gryphe-mythomax-l2-13b-v49
Tore down service gryphe-mythomax-l2-13b-v49
Pipeline stage ISVCDeleter completed in 4.49s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key gryphe-mythomax-l2-13b-v49/added_tokens.json from bucket guanaco-mkml-models
Deleting key gryphe-mythomax-l2-13b-v49/config.json from bucket guanaco-mkml-models
Deleting key gryphe-mythomax-l2-13b-v49/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key gryphe-mythomax-l2-13b-v49/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key gryphe-mythomax-l2-13b-v49/tokenizer.json from bucket guanaco-mkml-models
Deleting key gryphe-mythomax-l2-13b-v49/tokenizer.model from bucket guanaco-mkml-models
Deleting key gryphe-mythomax-l2-13b-v49/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key gryphe-mythomax-l2-13b-v49_reward/config.json from bucket guanaco-reward-models
Deleting key gryphe-mythomax-l2-13b-v49_reward/merges.txt from bucket guanaco-reward-models
Deleting key gryphe-mythomax-l2-13b-v49_reward/reward.tensors from bucket guanaco-reward-models
Deleting key gryphe-mythomax-l2-13b-v49_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key gryphe-mythomax-l2-13b-v49_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key gryphe-mythomax-l2-13b-v49_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key gryphe-mythomax-l2-13b-v49_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 3.63s
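For reference, a minimal sketch of the S3 cleanup step above using boto3; the client configuration and credential handling are assumptions, and only a couple of the keys are repeated here.

```python
# Illustrative sketch: delete the uploaded artifacts listed above from S3.
import boto3

s3 = boto3.client("s3")

def delete_keys(bucket, keys):
    for key in keys:
        print(f"Deleting key {key} from bucket {bucket}")
        s3.delete_object(Bucket=bucket, Key=key)

delete_keys("guanaco-mkml-models", [
    "gryphe-mythomax-l2-13b-v49/config.json",
    "gryphe-mythomax-l2-13b-v49/flywheel_model.0.safetensors",
    # ...remaining model and reward keys as listed above
])
```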
gryphe-mythomax-l2-13b_v49 status is now torndown due to DeploymentManager action
