submission_id: mylesfriedman30-llama-2-_1165_v3
developer_uid: DDDDDDDDFFFF
best_of: 4
celo_rating: 1078.44
display_name: flirtbot
family_friendly_score: 0.0
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 4, 'max_output_tokens': 64}
is_internal_developer: False
language_model: mylesfriedman30/llama-2-7b-flirtbot
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_eval_status: success
model_group: mylesfriedman30/llama-2-
model_name: flirtbot
model_num_parameters: 6738415616.0
model_repo: mylesfriedman30/llama-2-7b-flirtbot
model_size: 7B
num_battles: 11746
num_wins: 4346
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
status: torndown
submission_type: basic
timestamp: 2024-06-15T17:48:03+00:00
us_pacific_date: 2024-06-15
win_ratio: 0.3699982972926954
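Taken together, the formatter and generation_params fields above describe how a conversation is serialized into a single prompt and how completions are sampled; win_ratio is simply num_wins / num_battles = 4346 / 11746 ≈ 0.370. Below is a minimal sketch of that flow, assuming an off-the-shelf Hugging Face transformers setup rather than the proprietary MKML runtime; the bot/user names, persona text, and chat turns are placeholders, and interpreting best_of: 4 as four sampled candidates re-ranked by the reward model in reward_repo is an assumption about the serving pipeline.

# Minimal sketch (not the production MKML serving code) of how the formatter
# templates and generation_params above combine. All conversation content here
# is hypothetical.
from transformers import AutoModelForCausalLM, AutoTokenizer

formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    # Assemble the context string exactly as the templates describe.
    text = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    text += formatter["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        tmpl = formatter["bot_template"] if speaker == "bot" else formatter["user_template"]
        text += tmpl.format(bot_name=bot_name, user_name=user_name, message=message)
    return text + formatter["response_template"].format(bot_name=bot_name)

tokenizer = AutoTokenizer.from_pretrained("mylesfriedman30/llama-2-7b-flirtbot")
model = AutoModelForCausalLM.from_pretrained("mylesfriedman30/llama-2-7b-flirtbot")

context = build_prompt("FlirtBot", "User", "A playful conversationalist.",
                       "A casual chat.", [("user", "Hi there!")])
inputs = tokenizer(context, return_tensors="pt",
                   truncation=True, max_length=512)          # max_input_tokens: 512
outputs = model.generate(
    **inputs,
    do_sample=True,
    temperature=1.0, top_p=1.0, top_k=40,                    # generation_params above
    max_new_tokens=64,                                       # max_output_tokens: 64
    num_return_sequences=4,                                  # best_of: 4 candidates
)
candidates = [
    tokenizer.decode(o[inputs["input_ids"].shape[1]:], skip_special_tokens=True)
    .split("\n")[0]                                          # stopping_words: ['\n']
    for o in outputs
]
# The reward model (ChaiML/reward_gpt2_medium_preference_24m_e2) would then,
# presumably, score the four candidates and the highest-scoring one is returned.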
Resubmit model
Running pipeline stage MKMLizer
Starting job with name mylesfriedman30-llama-2-1165-v3-mkmlizer
Waiting for job on mylesfriedman30-llama-2-1165-v3-mkmlizer to finish
Stopping job with name mylesfriedman30-llama-2-1165-v3-mkmlizer
%s, retrying in %s seconds...
Starting job with name mylesfriedman30-llama-2-1165-v3-mkmlizer
Waiting for job on mylesfriedman30-llama-2-1165-v3-mkmlizer to finish
mylesfriedman30-llama-2-1165-v3-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ _____ __ __ ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ /___/ ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ Version: 0.8.14 ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ https://mk1.ai ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ The license key for the current software has been verified as ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ belonging to: ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ Chai Research Corp. ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
mylesfriedman30-llama-2-1165-v3-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
mylesfriedman30-llama-2-1165-v3-mkmlizer: warnings.warn(warning_message, FutureWarning)
mylesfriedman30-llama-2-1165-v3-mkmlizer: Traceback (most recent call last):
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connection.py", line 174, in _new_conn
mylesfriedman30-llama-2-1165-v3-mkmlizer: conn = connection.create_connection(
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/util/connection.py", line 72, in create_connection
mylesfriedman30-llama-2-1165-v3-mkmlizer: for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/socket.py", line 955, in getaddrinfo
mylesfriedman30-llama-2-1165-v3-mkmlizer: for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
mylesfriedman30-llama-2-1165-v3-mkmlizer: socket.gaierror: [Errno -3] Temporary failure in name resolution
mylesfriedman30-llama-2-1165-v3-mkmlizer: During handling of the above exception, another exception occurred:
mylesfriedman30-llama-2-1165-v3-mkmlizer: Traceback (most recent call last):
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 714, in urlopen
mylesfriedman30-llama-2-1165-v3-mkmlizer: httplib_response = self._make_request(
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 403, in _make_request
mylesfriedman30-llama-2-1165-v3-mkmlizer: self._validate_conn(conn)
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 1053, in _validate_conn
mylesfriedman30-llama-2-1165-v3-mkmlizer: conn.connect()
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connection.py", line 363, in connect
mylesfriedman30-llama-2-1165-v3-mkmlizer: self.sock = conn = self._new_conn()
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connection.py", line 186, in _new_conn
mylesfriedman30-llama-2-1165-v3-mkmlizer: raise NewConnectionError(
mylesfriedman30-llama-2-1165-v3-mkmlizer: urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f5c826a8d60>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution
mylesfriedman30-llama-2-1165-v3-mkmlizer: During handling of the above exception, another exception occurred:
mylesfriedman30-llama-2-1165-v3-mkmlizer: Traceback (most recent call last):
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/adapters.py", line 486, in send
mylesfriedman30-llama-2-1165-v3-mkmlizer: resp = conn.urlopen(
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 798, in urlopen
mylesfriedman30-llama-2-1165-v3-mkmlizer: retries = retries.increment(
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/util/retry.py", line 592, in increment
mylesfriedman30-llama-2-1165-v3-mkmlizer: raise MaxRetryError(_pool, url, error or ResponseError(cause))
mylesfriedman30-llama-2-1165-v3-mkmlizer: urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /api/models/mylesfriedman30/llama-2-7b-flirtbot/tree/main?recursive=True&expand=False (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f5c826a8d60>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))
mylesfriedman30-llama-2-1165-v3-mkmlizer: During handling of the above exception, another exception occurred:
mylesfriedman30-llama-2-1165-v3-mkmlizer: Traceback (most recent call last):
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/code/uploading/mkmlize.py", line 151, in <module>
mylesfriedman30-llama-2-1165-v3-mkmlizer: cli()
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1128, in __call__
mylesfriedman30-llama-2-1165-v3-mkmlizer: return self.main(*args, **kwargs)
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1053, in main
mylesfriedman30-llama-2-1165-v3-mkmlizer: rv = self.invoke(ctx)
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1659, in invoke
mylesfriedman30-llama-2-1165-v3-mkmlizer: return _process_result(sub_ctx.command.invoke(sub_ctx))
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1395, in invoke
mylesfriedman30-llama-2-1165-v3-mkmlizer: return ctx.invoke(self.callback, **ctx.params)
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 754, in invoke
mylesfriedman30-llama-2-1165-v3-mkmlizer: return __callback(*args, **kwargs)
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/code/uploading/mkmlize.py", line 38, in quantize
mylesfriedman30-llama-2-1165-v3-mkmlizer: temp_folder = download_to_shared_memory(repo_id, revision, hf_auth_token)
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/code/uploading/mkmlize.py", line 60, in download_to_shared_memory
mylesfriedman30-llama-2-1165-v3-mkmlizer: if repo_has_model_safetensors(repo_id, revision, token):
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/code/uploading/mkmlize.py", line 83, in repo_has_model_safetensors
mylesfriedman30-llama-2-1165-v3-mkmlizer: files = [f.path for f in list_files_info(repo_id, revision=revision, token=token)]
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/code/uploading/mkmlize.py", line 83, in <listcomp>
mylesfriedman30-llama-2-1165-v3-mkmlizer: files = [f.path for f in list_files_info(repo_id, revision=revision, token=token)]
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/hf_api.py", line 2755, in list_files_info
mylesfriedman30-llama-2-1165-v3-mkmlizer: for subpath_info in paginate(path=tree_url, headers=headers, params={"recursive": True, "expand": expand}):
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_pagination.py", line 36, in paginate
mylesfriedman30-llama-2-1165-v3-mkmlizer: r = session.get(path, params=params, headers=headers)
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/sessions.py", line 602, in get
mylesfriedman30-llama-2-1165-v3-mkmlizer: return self.request("GET", url, **kwargs)
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/sessions.py", line 589, in request
mylesfriedman30-llama-2-1165-v3-mkmlizer: resp = self.send(prep, **send_kwargs)
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/sessions.py", line 703, in send
mylesfriedman30-llama-2-1165-v3-mkmlizer: r = adapter.send(request, **kwargs)
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_http.py", line 68, in send
mylesfriedman30-llama-2-1165-v3-mkmlizer: return super().send(request, *args, **kwargs)
mylesfriedman30-llama-2-1165-v3-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/adapters.py", line 519, in send
mylesfriedman30-llama-2-1165-v3-mkmlizer: raise ConnectionError(e, request=request)
mylesfriedman30-llama-2-1165-v3-mkmlizer: requests.exceptions.ConnectionError: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /api/models/mylesfriedman30/llama-2-7b-flirtbot/tree/main?recursive=True&expand=False (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f5c826a8d60>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))"), '(Request ID: e2f77971-83bb-432d-bf91-19c1a43b724c)')
Job mylesfriedman30-llama-2-1165-v3-mkmlizer completed after 22.93s with status: failed
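The first attempt failed because the container could not resolve huggingface.co ([Errno -3] Temporary failure in name resolution) while listing the repository tree, so the job was stopped and retried. The FutureWarning above also notes that huggingface_hub's list_files_info is deprecated in favour of list_repo_tree and get_paths_info. A minimal sketch of the safetensors check written against the replacement API follows; the helper name mirrors repo_has_model_safetensors from mkmlize.py, whose real implementation is not shown in the log.

from huggingface_hub import HfApi

def repo_has_model_safetensors(repo_id, revision=None, token=None):
    # Return True if the repo contains any *.safetensors file, using
    # list_repo_tree, the replacement suggested by the FutureWarning.
    api = HfApi(token=token)
    entries = api.list_repo_tree(repo_id, revision=revision, recursive=True)
    return any(entry.path.endswith(".safetensors") for entry in entries)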
Stopping job with name mylesfriedman30-llama-2-1165-v3-mkmlizer
%s, retrying in %s seconds...
Starting job with name mylesfriedman30-llama-2-1165-v3-mkmlizer
Waiting for job on mylesfriedman30-llama-2-1165-v3-mkmlizer to finish
mylesfriedman30-llama-2-1165-v3-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ _____ __ __ ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ /___/ ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ Version: 0.8.14 ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ https://mk1.ai ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ The license key for the current software has been verified as ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ belonging to: ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ Chai Research Corp. ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ║ ║
mylesfriedman30-llama-2-1165-v3-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
mylesfriedman30-llama-2-1165-v3-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
mylesfriedman30-llama-2-1165-v3-mkmlizer: warnings.warn(warning_message, FutureWarning)
mylesfriedman30-llama-2-1165-v3-mkmlizer: Downloaded to shared memory in 26.689s
mylesfriedman30-llama-2-1165-v3-mkmlizer: quantizing model to /dev/shm/model_cache
mylesfriedman30-llama-2-1165-v3-mkmlizer: Saving flywheel model at /dev/shm/model_cache
mylesfriedman30-llama-2-1165-v3-mkmlizer: quantized model in 14.441s
mylesfriedman30-llama-2-1165-v3-mkmlizer: Processed model mylesfriedman30/llama-2-7b-flirtbot in 43.331s
mylesfriedman30-llama-2-1165-v3-mkmlizer: creating bucket guanaco-mkml-models
mylesfriedman30-llama-2-1165-v3-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
mylesfriedman30-llama-2-1165-v3-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/mylesfriedman30-llama-2-1165-v3
mylesfriedman30-llama-2-1165-v3-mkmlizer: cp /dev/shm/model_cache/added_tokens.json s3://guanaco-mkml-models/mylesfriedman30-llama-2-1165-v3/added_tokens.json
mylesfriedman30-llama-2-1165-v3-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/mylesfriedman30-llama-2-1165-v3/special_tokens_map.json
mylesfriedman30-llama-2-1165-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/mylesfriedman30-llama-2-1165-v3/tokenizer_config.json
mylesfriedman30-llama-2-1165-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/mylesfriedman30-llama-2-1165-v3/tokenizer.json
mylesfriedman30-llama-2-1165-v3-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/mylesfriedman30-llama-2-1165-v3/config.json
mylesfriedman30-llama-2-1165-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer.model s3://guanaco-mkml-models/mylesfriedman30-llama-2-1165-v3/tokenizer.model
mylesfriedman30-llama-2-1165-v3-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/mylesfriedman30-llama-2-1165-v3/flywheel_model.0.safetensors
mylesfriedman30-llama-2-1165-v3-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
mylesfriedman30-llama-2-1165-v3-mkmlizer: Loading 0:   0%|          | 0/323 [00:00<?, ?it/s]
mylesfriedman30-llama-2-1165-v3-mkmlizer: Loading 0:  75%|███████▍  | 242/323 [00:01<00:00, 181.13it/s]
mylesfriedman30-llama-2-1165-v3-mkmlizer: Loading 0: 100%|██████████| 323/323 [00:02<00:00, 106.49it/s]
mylesfriedman30-llama-2-1165-v3-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:913: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
mylesfriedman30-llama-2-1165-v3-mkmlizer: warnings.warn(
mylesfriedman30-llama-2-1165-v3-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:757: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
mylesfriedman30-llama-2-1165-v3-mkmlizer: warnings.warn(
mylesfriedman30-llama-2-1165-v3-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
mylesfriedman30-llama-2-1165-v3-mkmlizer: warnings.warn(
mylesfriedman30-llama-2-1165-v3-mkmlizer: /opt/conda/lib/python3.10/site-packages/torch/_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
mylesfriedman30-llama-2-1165-v3-mkmlizer: return self.fget.__get__(instance, owner)()
mylesfriedman30-llama-2-1165-v3-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
mylesfriedman30-llama-2-1165-v3-mkmlizer: Saving duration: 0.436s
mylesfriedman30-llama-2-1165-v3-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 12.253s
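The FutureWarnings emitted while loading the reward model indicate that the loading code still passes use_auth_token, which newer transformers releases replace with token. A minimal sketch of loading ChaiML/reward_gpt2_medium_preference_24m_e2 with the current argument name follows; the choice of AutoModelForSequenceClassification and the HF_TOKEN environment variable are assumptions, since the actual loading code is not shown in the log.

import os
from transformers import AutoTokenizer, AutoModelForSequenceClassification

hf_token = os.environ.get("HF_TOKEN")  # hypothetical way of supplying the token

# 'token' replaces the deprecated 'use_auth_token' argument flagged above.
reward_tokenizer = AutoTokenizer.from_pretrained(
    "ChaiML/reward_gpt2_medium_preference_24m_e2", token=hf_token
)
reward_model = AutoModelForSequenceClassification.from_pretrained(
    "ChaiML/reward_gpt2_medium_preference_24m_e2", token=hf_token
)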
mylesfriedman30-llama-2-1165-v3-mkmlizer: creating bucket guanaco-reward-models
mylesfriedman30-llama-2-1165-v3-mkmlizer: Bucket 's3://guanaco-reward-models/' created
mylesfriedman30-llama-2-1165-v3-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/mylesfriedman30-llama-2-1165-v3_reward
mylesfriedman30-llama-2-1165-v3-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/mylesfriedman30-llama-2-1165-v3_reward/special_tokens_map.json
mylesfriedman30-llama-2-1165-v3-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/mylesfriedman30-llama-2-1165-v3_reward/config.json
mylesfriedman30-llama-2-1165-v3-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/mylesfriedman30-llama-2-1165-v3_reward/tokenizer_config.json
mylesfriedman30-llama-2-1165-v3-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/mylesfriedman30-llama-2-1165-v3_reward/merges.txt
mylesfriedman30-llama-2-1165-v3-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/mylesfriedman30-llama-2-1165-v3_reward/vocab.json
mylesfriedman30-llama-2-1165-v3-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/mylesfriedman30-llama-2-1165-v3_reward/tokenizer.json
mylesfriedman30-llama-2-1165-v3-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/mylesfriedman30-llama-2-1165-v3_reward/reward.tensors
Job mylesfriedman30-llama-2-1165-v3-mkmlizer completed after 104.27s with status: succeeded
Stopping job with name mylesfriedman30-llama-2-1165-v3-mkmlizer
Pipeline stage MKMLizer completed in 131.16s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.12s
Running pipeline stage ISVCDeployer
Creating inference service mylesfriedman30-llama-2-1165-v3
Waiting for inference service mylesfriedman30-llama-2-1165-v3 to be ready
Inference service mylesfriedman30-llama-2-1165-v3 ready after 50.27666735649109s
Pipeline stage ISVCDeployer completed in 57.36s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.6435606479644775s
Received healthy response to inference request in 1.0242302417755127s
Received healthy response to inference request in 5.378211975097656s
Received healthy response to inference request in 1.0400505065917969s
Received healthy response to inference request in 0.5851528644561768s
5 requests
0 failed requests
5th percentile: 0.6729683399200439
10th percentile: 0.7607838153839112
20th percentile: 0.9364147663116456
30th percentile: 1.0273942947387695
40th percentile: 1.0337224006652832
50th percentile: 1.0400505065917969
60th percentile: 1.2814545631408691
70th percentile: 1.5228586196899414
80th percentile: 2.390490913391114
90th percentile: 3.884351444244385
95th percentile: 4.63128170967102
99th percentile: 5.228825922012329
mean time: 1.934241247177124
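The summary statistics above follow from the five response times via linear interpolation between order statistics; a short verification sketch using numpy (not part of the pipeline):

import numpy as np

# The five healthy response times reported above, in seconds.
times = [1.6435606479644775, 1.0242302417755127, 5.378211975097656,
         1.0400505065917969, 0.5851528644561768]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    # numpy's default linear interpolation reproduces the reported values,
    # e.g. 5th percentile ≈ 0.6730 s and 90th percentile ≈ 3.8844 s.
    print(f"{p}th percentile: {np.percentile(times, p)}")
print("mean time:", np.mean(times))  # ≈ 1.9342 s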
Pipeline stage StressChecker completed in 10.64s
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.05s
Running pipeline stage DaemonicSafetyScorer
Running M-Eval for topic stay_in_character
Pipeline stage DaemonicSafetyScorer completed in 0.04s
M-Eval Dataset for topic stay_in_character is loaded
mylesfriedman30-llama-2-_1165_v3 status is now deployed due to DeploymentManager action
mylesfriedman30-llama-2-_1165_v3 status is now inactive due to auto deactivation of underperforming models
admin requested tearing down of mylesfriedman30-llama-2-_1165_v3
Running pipeline stage ISVCDeleter
Checking if service mylesfriedman30-llama-2-1165-v3 is running
Skipping teardown as no inference service was found
Pipeline stage ISVCDeleter completed in 4.49s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key mylesfriedman30-llama-2-1165-v3/added_tokens.json from bucket guanaco-mkml-models
Deleting key mylesfriedman30-llama-2-1165-v3/config.json from bucket guanaco-mkml-models
Deleting key mylesfriedman30-llama-2-1165-v3/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key mylesfriedman30-llama-2-1165-v3/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key mylesfriedman30-llama-2-1165-v3/tokenizer.json from bucket guanaco-mkml-models
Deleting key mylesfriedman30-llama-2-1165-v3/tokenizer.model from bucket guanaco-mkml-models
Deleting key mylesfriedman30-llama-2-1165-v3/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key mylesfriedman30-llama-2-1165-v3_reward/config.json from bucket guanaco-reward-models
Deleting key mylesfriedman30-llama-2-1165-v3_reward/merges.txt from bucket guanaco-reward-models
Deleting key mylesfriedman30-llama-2-1165-v3_reward/reward.tensors from bucket guanaco-reward-models
Deleting key mylesfriedman30-llama-2-1165-v3_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key mylesfriedman30-llama-2-1165-v3_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key mylesfriedman30-llama-2-1165-v3_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key mylesfriedman30-llama-2-1165-v3_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 7.02s
mylesfriedman30-llama-2-_1165_v3 status is now torndown due to DeploymentManager action