developer_uid: rirv938
submission_id: rirv938-mistral-12b-oai_70376_v1
model_name: rirv938-mistral-12b-oai_70376_v1
model_group: rirv938/mistral_12b_oai_
status: torndown
timestamp: 2025-02-21T03:49:56+00:00
num_battles: 5900
num_wins: 2867
celo_rating: 1258.32
family_friendly_score: 0.5671999999999999
family_friendly_standard_error: 0.007006913157732155
submission_type: basic
model_repo: rirv938/mistral_12b_oai_prompt_oai_bon_mixed_outputs_900_v2
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
latencies: [{'batch_size': 1, 'throughput': 0.5990649241466275, 'latency_mean': 1.6692033040523528, 'latency_p50': 1.6475379467010498, 'latency_p90': 1.8546070337295533}, {'batch_size': 3, 'throughput': 1.078273723231989, 'latency_mean': 2.7689466679096224, 'latency_p50': 2.7709797620773315, 'latency_p90': 3.0384958028793334}, {'batch_size': 5, 'throughput': 1.2916855203463142, 'latency_mean': 3.8532646918296813, 'latency_p50': 3.8790106773376465, 'latency_p90': 4.311822938919067}, {'batch_size': 6, 'throughput': 1.3416850597588443, 'latency_mean': 4.444051986932754, 'latency_p50': 4.4835755825042725, 'latency_p90': 4.953636002540589}, {'batch_size': 8, 'throughput': 1.4167822328185318, 'latency_mean': 5.617311294078827, 'latency_p50': 5.619437098503113, 'latency_p90': 6.347938466072082}, {'batch_size': 10, 'throughput': 1.4440659451470337, 'latency_mean': 6.8766799688339235, 'latency_p50': 6.850102663040161, 'latency_p90': 7.80098237991333}]
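The throughput_3p7s field reported below can be approximated by linearly interpolating these (latency_mean, throughput) measurements at 3.7 s. A sketch of that estimate — the interpolation scheme is an assumption for illustration, not something the log confirms:

```python
# Piecewise-linear interpolation of throughput at a target mean latency,
# using the measured (latency_mean, throughput) pairs from the latencies above.
points = [
    (1.6692033040523528, 0.5990649241466275),   # batch_size 1
    (2.7689466679096224, 1.078273723231989),    # batch_size 3
    (3.8532646918296813, 1.2916855203463142),   # batch_size 5
    (4.444051986932754, 1.3416850597588443),    # batch_size 6
    (5.617311294078827, 1.4167822328185318),    # batch_size 8
    (6.8766799688339235, 1.4440659451470337),   # batch_size 10
]

def throughput_at(latency: float, points: list[tuple[float, float]]) -> float:
    """Estimate throughput between measured points by linear interpolation."""
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= latency <= x1:
            return y0 + (latency - x0) / (x1 - x0) * (y1 - y0)
    raise ValueError("latency outside the measured range")

print(round(throughput_at(3.7, points), 2))  # 1.26 (vs. reported throughput_3p7s: 1.27)
```

The small gap to the reported 1.27 suggests the pipeline may use a slightly different estimator; the measurements themselves are taken verbatim from the latencies field.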
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: rirv938-mistral-12b-oai_70376_v1
is_internal_developer: True
language_model: rirv938/mistral_12b_oai_prompt_oai_bon_mixed_outputs_900_v2
model_size: 13B
ranking_group: single
throughput_3p7s: 1.27
us_pacific_date: 2025-02-20
win_ratio: 0.4859322033898305
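As a quick consistency check, the win_ratio above is simply num_wins / num_battles:

```python
# win_ratio = num_wins / num_battles, matching the fields reported above
num_battles, num_wins = 5900, 2867
win_ratio = num_wins / num_battles
print(f"{win_ratio:.10f}")  # 0.4859322034
```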
generation_params: {'temperature': 0.9, 'top_p': 0.9, 'min_p': 0.6, 'top_k': 80, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['\n', '</s>', '###', 'You:'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
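Of these sampling settings, min_p is the least standard: min-p sampling keeps only tokens whose probability is at least min_p times the top token's probability, then renormalizes. A minimal sketch of that filter (illustrative only, not the serving implementation):

```python
def min_p_filter(probs: list[float], min_p: float) -> list[float]:
    """Zero out tokens below min_p * max(probs), then renormalize."""
    threshold = min_p * max(probs)
    kept = [p if p >= threshold else 0.0 for p in probs]
    total = sum(kept)
    return [p / total for p in kept]

# With min_p=0.6 (as in generation_params), only tokens at least 60% as
# likely as the argmax survive -- a fairly aggressive filter.
print(min_p_filter([0.5, 0.25, 0.15, 0.1], 0.6))  # [1.0, 0.0, 0.0, 0.0]
```

In practice min_p is applied alongside temperature, top_k, and top_p, and the order of application varies between inference engines.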
formatter: {'memory_template': '', 'prompt_template': '', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
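The formatter renders each turn as a "Name: message" line and primes the reply with "{bot_name}:", which is why generation_params stops on '\n' and 'You:'. A minimal sketch of applying these templates (the render helper and chat content are illustrative):

```python
# Templates copied from the formatter field above; memory/prompt templates are empty.
formatter = {
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def render(history, bot_name, user_name, formatter):
    """Concatenate templated turns, then prime the model's next reply."""
    parts = []
    for role, message in history:
        template = formatter["bot_template"] if role == "bot" else formatter["user_template"]
        parts.append(template.format(bot_name=bot_name, user_name=user_name, message=message))
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)

prompt = render(
    [("user", "Hi!"), ("bot", "Hello!"), ("user", "How are you?")],
    bot_name="Assistant", user_name="You", formatter=formatter,
)
print(prompt)
```

The rendered prompt ends with "Assistant:", so the model's continuation is exactly the next bot turn and generation halts at the first newline or "You:".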
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name rirv938-mistral-12b-oai-70376-v1-mkmlizer
Waiting for job on rirv938-mistral-12b-oai-70376-v1-mkmlizer to finish
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ║ _____ __ __ ║
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ║ /___/ ║
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ║ ║
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ║ Version: 0.12.8 ║
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ║ https://mk1.ai ║
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ║ ║
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ║ The license key for the current software has been verified as ║
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ║ belonging to: ║
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ║ ║
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ║ Chai Research Corp. ║
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ║ Expiration: 2025-04-15 23:59:59 ║
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ║ ║
rirv938-mistral-12b-oai-70376-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
rirv938-mistral-12b-oai-70376-v1-mkmlizer: Traceback (most recent call last):
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 536, in _make_request
rirv938-mistral-12b-oai-70376-v1-mkmlizer: response = conn.getresponse()
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connection.py", line 461, in getresponse
rirv938-mistral-12b-oai-70376-v1-mkmlizer: httplib_response = super().getresponse()
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/http/client.py", line 1375, in getresponse
rirv938-mistral-12b-oai-70376-v1-mkmlizer: response.begin()
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/http/client.py", line 318, in begin
rirv938-mistral-12b-oai-70376-v1-mkmlizer: version, status, reason = self._read_status()
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/http/client.py", line 279, in _read_status
rirv938-mistral-12b-oai-70376-v1-mkmlizer: line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/socket.py", line 705, in readinto
rirv938-mistral-12b-oai-70376-v1-mkmlizer: return self._sock.recv_into(b)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/ssl.py", line 1307, in recv_into
rirv938-mistral-12b-oai-70376-v1-mkmlizer: return self.read(nbytes, buffer)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/ssl.py", line 1163, in read
rirv938-mistral-12b-oai-70376-v1-mkmlizer: return self._sslobj.read(len, buffer)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: TimeoutError: The read operation timed out
rirv938-mistral-12b-oai-70376-v1-mkmlizer: The above exception was the direct cause of the following exception:
rirv938-mistral-12b-oai-70376-v1-mkmlizer: Traceback (most recent call last):
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/adapters.py", line 667, in send
rirv938-mistral-12b-oai-70376-v1-mkmlizer: resp = conn.urlopen(
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 844, in urlopen
rirv938-mistral-12b-oai-70376-v1-mkmlizer: retries = retries.increment(
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/util/retry.py", line 470, in increment
rirv938-mistral-12b-oai-70376-v1-mkmlizer: raise reraise(type(error), error, _stacktrace)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/util/util.py", line 39, in reraise
rirv938-mistral-12b-oai-70376-v1-mkmlizer: raise value
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 790, in urlopen
rirv938-mistral-12b-oai-70376-v1-mkmlizer: response = self._make_request(
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 538, in _make_request
rirv938-mistral-12b-oai-70376-v1-mkmlizer: self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 370, in _raise_timeout
rirv938-mistral-12b-oai-70376-v1-mkmlizer: raise ReadTimeoutError(
rirv938-mistral-12b-oai-70376-v1-mkmlizer: urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='huggingface.co', port=443): Read timed out. (read timeout=10)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: During handling of the above exception, another exception occurred:
rirv938-mistral-12b-oai-70376-v1-mkmlizer: Traceback (most recent call last):
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1374, in _get_metadata_or_catch_error
rirv938-mistral-12b-oai-70376-v1-mkmlizer: metadata = get_hf_file_metadata(
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
rirv938-mistral-12b-oai-70376-v1-mkmlizer: return fn(*args, **kwargs)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1294, in get_hf_file_metadata
rirv938-mistral-12b-oai-70376-v1-mkmlizer: r = _request_wrapper(
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 278, in _request_wrapper
rirv938-mistral-12b-oai-70376-v1-mkmlizer: response = _request_wrapper(
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 301, in _request_wrapper
rirv938-mistral-12b-oai-70376-v1-mkmlizer: response = get_session().request(method=method, url=url, **params)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/sessions.py", line 589, in request
rirv938-mistral-12b-oai-70376-v1-mkmlizer: resp = self.send(prep, **send_kwargs)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/sessions.py", line 703, in send
rirv938-mistral-12b-oai-70376-v1-mkmlizer: r = adapter.send(request, **kwargs)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_http.py", line 93, in send
rirv938-mistral-12b-oai-70376-v1-mkmlizer: return super().send(request, *args, **kwargs)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/adapters.py", line 713, in send
rirv938-mistral-12b-oai-70376-v1-mkmlizer: raise ReadTimeout(e, request=request)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: requests.exceptions.ReadTimeout: (ReadTimeoutError("HTTPSConnectionPool(host='huggingface.co', port=443): Read timed out. (read timeout=10)"), '(Request ID: 169b0f7a-5e9d-442a-a635-368ee82c1794)')
rirv938-mistral-12b-oai-70376-v1-mkmlizer: The above exception was the direct cause of the following exception:
rirv938-mistral-12b-oai-70376-v1-mkmlizer: Traceback (most recent call last):
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/code/uploading/mkmlize.py", line 193, in <module>
rirv938-mistral-12b-oai-70376-v1-mkmlizer: cli()
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
rirv938-mistral-12b-oai-70376-v1-mkmlizer: return self.main(*args, **kwargs)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1078, in main
rirv938-mistral-12b-oai-70376-v1-mkmlizer: rv = self.invoke(ctx)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
rirv938-mistral-12b-oai-70376-v1-mkmlizer: return _process_result(sub_ctx.command.invoke(sub_ctx))
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
rirv938-mistral-12b-oai-70376-v1-mkmlizer: return ctx.invoke(self.callback, **ctx.params)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 783, in invoke
rirv938-mistral-12b-oai-70376-v1-mkmlizer: return __callback(*args, **kwargs)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/code/uploading/mkmlize.py", line 39, in quantize
rirv938-mistral-12b-oai-70376-v1-mkmlizer: temp_folder = download_to_shared_memory(repo_id, revision, hf_auth_token)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/code/uploading/mkmlize.py", line 86, in download_to_shared_memory
rirv938-mistral-12b-oai-70376-v1-mkmlizer: snapshot_download(
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
rirv938-mistral-12b-oai-70376-v1-mkmlizer: return fn(*args, **kwargs)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/_snapshot_download.py", line 294, in snapshot_download
rirv938-mistral-12b-oai-70376-v1-mkmlizer: _inner_hf_hub_download(file)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/_snapshot_download.py", line 270, in _inner_hf_hub_download
rirv938-mistral-12b-oai-70376-v1-mkmlizer: return hf_hub_download(
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
rirv938-mistral-12b-oai-70376-v1-mkmlizer: return fn(*args, **kwargs)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 840, in hf_hub_download
rirv938-mistral-12b-oai-70376-v1-mkmlizer: return _hf_hub_download_to_local_dir(
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1089, in _hf_hub_download_to_local_dir
rirv938-mistral-12b-oai-70376-v1-mkmlizer: _raise_on_head_call_error(head_call_error, force_download, local_files_only)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1485, in _raise_on_head_call_error
rirv938-mistral-12b-oai-70376-v1-mkmlizer: raise LocalEntryNotFoundError(
rirv938-mistral-12b-oai-70376-v1-mkmlizer: huggingface_hub.errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.
Job rirv938-mistral-12b-oai-70376-v1-mkmlizer completed after 170.14s with status: failed
Stopping job with name rirv938-mistral-12b-oai-70376-v1-mkmlizer
%s, retrying in %s seconds...
Starting job with name rirv938-mistral-12b-oai-70376-v1-mkmlizer
Waiting for job on rirv938-mistral-12b-oai-70376-v1-mkmlizer to finish
rirv938-mistral-12b-oai-70376-v1-mkmlizer: (MK1 flywheel banner and license block repeated; identical to the first job above)
rirv938-mistral-12b-oai-70376-v1-mkmlizer: Downloaded to shared memory in 81.753s
rirv938-mistral-12b-oai-70376-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmps17i4g8m, device:0
rirv938-mistral-12b-oai-70376-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
rirv938-mistral-12b-oai-70376-v1-mkmlizer: quantized model in 42.589s
rirv938-mistral-12b-oai-70376-v1-mkmlizer: Processed model rirv938/mistral_12b_oai_prompt_oai_bon_mixed_outputs_900_v2 in 124.343s
rirv938-mistral-12b-oai-70376-v1-mkmlizer: creating bucket guanaco-mkml-models
rirv938-mistral-12b-oai-70376-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
rirv938-mistral-12b-oai-70376-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/rirv938-mistral-12b-oai-70376-v1
rirv938-mistral-12b-oai-70376-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/rirv938-mistral-12b-oai-70376-v1/config.json
rirv938-mistral-12b-oai-70376-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/rirv938-mistral-12b-oai-70376-v1/special_tokens_map.json
rirv938-mistral-12b-oai-70376-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/rirv938-mistral-12b-oai-70376-v1/tokenizer_config.json
rirv938-mistral-12b-oai-70376-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/rirv938-mistral-12b-oai-70376-v1/tokenizer.json
rirv938-mistral-12b-oai-70376-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/rirv938-mistral-12b-oai-70376-v1/flywheel_model.0.safetensors
rirv938-mistral-12b-oai-70376-v1-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... Loading 0: 98%|█████████▊| 357/363 [00:20<00:01, 4.41it/s] (tensor-loading progress bar; intermediate carriage-return updates elided)
Job rirv938-mistral-12b-oai-70376-v1-mkmlizer completed after 156.2s with status: succeeded
Stopping job with name rirv938-mistral-12b-oai-70376-v1-mkmlizer
Pipeline stage MKMLizer completed in 327.34s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.17s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service rirv938-mistral-12b-oai-70376-v1
Waiting for inference service rirv938-mistral-12b-oai-70376-v1 to be ready
Inference service rirv938-mistral-12b-oai-70376-v1 ready after 221.05626821517944s
Pipeline stage MKMLDeployer completed in 221.65s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.137831926345825s
Received healthy response to inference request in 1.5697269439697266s
Received healthy response to inference request in 1.6213347911834717s
Received healthy response to inference request in 1.636275291442871s
Received healthy response to inference request in 1.5423851013183594s
5 requests
0 failed requests
5th percentile: 1.5478534698486328
10th percentile: 1.5533218383789062
20th percentile: 1.5642585754394531
30th percentile: 1.5800485134124755
40th percentile: 1.6006916522979737
50th percentile: 1.6213347911834717
60th percentile: 1.6273109912872314
70th percentile: 1.6332871913909912
80th percentile: 1.736586618423462
90th percentile: 1.9372092723846437
95th percentile: 2.037520599365234
99th percentile: 2.117769660949707
mean time: 1.7015108108520507
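The figures above are consistent with linear-interpolation percentiles over the five response times. A pure-Python sketch that reproduces them (it mirrors NumPy's default 'linear' percentile method; the actual StressChecker implementation is not shown in the log):

```python
def percentile(samples: list[float], q: float) -> float:
    """Linear-interpolation percentile (NumPy's default 'linear' method)."""
    xs = sorted(samples)
    rank = q / 100 * (len(xs) - 1)
    lo = int(rank)
    frac = rank - lo
    if lo + 1 < len(xs):
        return xs[lo] + frac * (xs[lo + 1] - xs[lo])
    return xs[lo]

# The five healthy response times reported above, in seconds.
times = [2.137831926345825, 1.5697269439697266, 1.6213347911834717,
         1.636275291442871, 1.5423851013183594]
print(round(percentile(times, 5), 6))     # 1.547853
print(round(percentile(times, 90), 6))    # 1.937209
print(round(sum(times) / len(times), 6))  # 1.701511
```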
Pipeline stage StressChecker completed in 10.07s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.82s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.69s
Shutdown handler de-registered
rirv938-mistral-12b-oai_70376_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.16s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.12s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service rirv938-mistral-12b-oai-70376-v1-profiler
Waiting for inference service rirv938-mistral-12b-oai-70376-v1-profiler to be ready
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 2690.75s
Shutdown handler de-registered
rirv938-mistral-12b-oai_70376_v1 status is now inactive due to auto deactivation (removal of underperforming models)
rirv938-mistral-12b-oai_70376_v1 status is now torndown due to DeploymentManager action