developer_uid: rirv938
submission_id: rirv938-mistral-12b-oai_58888_v1
model_name: rirv938-mistral-12b-oai_58888_v1
model_group: rirv938/mistral_12b_oai_
status: torndown
timestamp: 2025-02-21T03:49:55+00:00
num_battles: 7015
num_wins: 3442
celo_rating: 1264.11
family_friendly_score: 0.5542
family_friendly_standard_error: 0.007029400543431851
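If the family-friendly score is a mean of independent binary ratings, its standard error would follow sqrt(p*(1-p)/n). The sample count n is not stated anywhere in this log; n = 5000 is an assumption that happens to reproduce the reported value almost exactly:

```python
import math

# Assumption: family_friendly_score is a proportion p estimated from n
# independent binary ratings. n = 5000 is a guess, not stated in the log;
# it is chosen because it reproduces the reported standard error.
p = 0.5542
n = 5000
standard_error = math.sqrt(p * (1 - p) / n)  # ~0.0070294
```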
submission_type: basic
model_repo: rirv938/mistral_12b_oai_prompt_oai_bon_mixed_outputs_600_v2
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
latencies: [{'batch_size': 1, 'throughput': 0.6009493397266882, 'latency_mean': 1.6639413607120515, 'latency_p50': 1.6629470586776733, 'latency_p90': 1.843718123435974}, {'batch_size': 3, 'throughput': 1.091259782011108, 'latency_mean': 2.74133580327034, 'latency_p50': 2.7423055171966553, 'latency_p90': 3.019801115989685}, {'batch_size': 5, 'throughput': 1.2965530991141332, 'latency_mean': 3.848733936548233, 'latency_p50': 3.839441418647766, 'latency_p90': 4.337932229042053}, {'batch_size': 6, 'throughput': 1.375002913760608, 'latency_mean': 4.327232747077942, 'latency_p50': 4.296808123588562, 'latency_p90': 4.8578572273254395}, {'batch_size': 8, 'throughput': 1.4401116041995177, 'latency_mean': 5.514962147474289, 'latency_p50': 5.554255723953247, 'latency_p90': 6.1889888048172}, {'batch_size': 10, 'throughput': 1.462418945076664, 'latency_mean': 6.7815291118621825, 'latency_p50': 6.839215993881226, 'latency_p90': 7.741189742088317}]
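The latencies field maps batch size to throughput and latency percentiles. The throughput_3p7s field further down (1.28) plausibly reads this curve at a 3.7 s latency; the exact method is not stated in the log, and a simple linear interpolation on latency_mean (an assumption) lands near, though not exactly at, the reported number:

```python
# Sketch: interpolate throughput at a 3.7 s mean latency from the measured
# (latency_mean, throughput) pairs in the latencies field above. Linear
# interpolation is an assumption about how throughput_3p7s is derived.
points = [  # (latency_mean, throughput), rounded from the log
    (1.664, 0.601), (2.741, 1.091), (3.849, 1.297),
    (4.327, 1.375), (5.515, 1.440), (6.782, 1.462),
]

def throughput_at(latency_target, pts):
    """Piecewise-linear interpolation of throughput at a target latency."""
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= latency_target <= x1:
            frac = (latency_target - x0) / (x1 - x0)
            return y0 + frac * (y1 - y0)
    raise ValueError("target latency outside measured range")

estimate = throughput_at(3.7, points)  # ~1.27, close to the reported 1.28
```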
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: rirv938-mistral-12b-oai_58888_v1
is_internal_developer: True
language_model: rirv938/mistral_12b_oai_prompt_oai_bon_mixed_outputs_600_v2
model_size: 13B
ranking_group: single
throughput_3p7s: 1.28
us_pacific_date: 2025-02-20
win_ratio: 0.49066286528866715
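The win_ratio is simply num_wins divided by num_battles from the fields earlier in this record, as a quick check confirms:

```python
# win_ratio is derived from the battle counts reported above in this log.
num_battles = 7015
num_wins = 3442
win_ratio = num_wins / num_battles  # ~0.4907
```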
generation_params: {'temperature': 0.9, 'top_p': 0.9, 'min_p': 0.6, 'top_k': 80, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['\n', '</s>', '###', 'You:'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
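Among these sampling settings, min_p: 0.6 is notably aggressive: min-p sampling discards every token whose probability is below 60% of the most likely token's probability, before the remaining candidates are sampled. A minimal sketch of the filter (illustrative only, not the serving stack's actual code):

```python
def min_p_filter(probs, min_p):
    """Return token indices whose probability is at least min_p times the
    probability of the most likely token (the min-p sampling filter)."""
    threshold = min_p * max(probs)
    return [i for i, p in enumerate(probs) if p >= threshold]

# With min_p = 0.6 and probabilities [0.50, 0.30, 0.15, 0.05], the
# threshold is 0.30, so only the first two tokens survive.
```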
formatter: {'memory_template': '', 'prompt_template': '', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
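The formatter's templates determine exactly what text the model sees: each turn is rendered with the bot or user template, and the response template is appended so the model continues as the bot. A sketch of that rendering (the helper function and the names "Alice"/"Bot" are made up for illustration):

```python
def render_prompt(messages, formatter, bot_name, user_name):
    """Render a chat history with per-role templates, then append the
    response template so the model continues as the bot. Illustrative
    sketch of the formatter above, not the production code."""
    out = []
    for role, text in messages:
        key = "bot_template" if role == "bot" else "user_template"
        out.append(formatter[key].format(
            bot_name=bot_name, user_name=user_name, message=text))
    out.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(out)

formatter = {
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}
prompt = render_prompt([("user", "hi"), ("bot", "hello")],
                       formatter, "Bot", "Alice")
# prompt == "Alice: hi\nBot: hello\nBot:"
```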
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name rirv938-mistral-12b-oai-58888-v1-mkmlizer
Waiting for job on rirv938-mistral-12b-oai-58888-v1-mkmlizer to finish
rirv938-mistral-12b-oai-58888-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
rirv938-mistral-12b-oai-58888-v1-mkmlizer: ║              [flywheel ASCII-art banner]                            ║
rirv938-mistral-12b-oai-58888-v1-mkmlizer: ║                                                                     ║
rirv938-mistral-12b-oai-58888-v1-mkmlizer: ║ Version: 0.12.8 ║
rirv938-mistral-12b-oai-58888-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
rirv938-mistral-12b-oai-58888-v1-mkmlizer: ║ https://mk1.ai ║
rirv938-mistral-12b-oai-58888-v1-mkmlizer: ║ ║
rirv938-mistral-12b-oai-58888-v1-mkmlizer: ║ The license key for the current software has been verified as ║
rirv938-mistral-12b-oai-58888-v1-mkmlizer: ║ belonging to: ║
rirv938-mistral-12b-oai-58888-v1-mkmlizer: ║ ║
rirv938-mistral-12b-oai-58888-v1-mkmlizer: ║ Chai Research Corp. ║
rirv938-mistral-12b-oai-58888-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
rirv938-mistral-12b-oai-58888-v1-mkmlizer: ║ Expiration: 2025-04-15 23:59:59 ║
rirv938-mistral-12b-oai-58888-v1-mkmlizer: ║ ║
rirv938-mistral-12b-oai-58888-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
rirv938-mistral-12b-oai-58888-v1-mkmlizer: Traceback (most recent call last):
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 536, in _make_request
rirv938-mistral-12b-oai-58888-v1-mkmlizer: response = conn.getresponse()
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connection.py", line 461, in getresponse
rirv938-mistral-12b-oai-58888-v1-mkmlizer: httplib_response = super().getresponse()
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/http/client.py", line 1375, in getresponse
rirv938-mistral-12b-oai-58888-v1-mkmlizer: response.begin()
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/http/client.py", line 318, in begin
rirv938-mistral-12b-oai-58888-v1-mkmlizer: version, status, reason = self._read_status()
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/http/client.py", line 279, in _read_status
rirv938-mistral-12b-oai-58888-v1-mkmlizer: line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/socket.py", line 705, in readinto
rirv938-mistral-12b-oai-58888-v1-mkmlizer: return self._sock.recv_into(b)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/ssl.py", line 1307, in recv_into
rirv938-mistral-12b-oai-58888-v1-mkmlizer: return self.read(nbytes, buffer)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/ssl.py", line 1163, in read
rirv938-mistral-12b-oai-58888-v1-mkmlizer: return self._sslobj.read(len, buffer)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: TimeoutError: The read operation timed out
rirv938-mistral-12b-oai-58888-v1-mkmlizer: The above exception was the direct cause of the following exception:
rirv938-mistral-12b-oai-58888-v1-mkmlizer: Traceback (most recent call last):
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/adapters.py", line 667, in send
rirv938-mistral-12b-oai-58888-v1-mkmlizer: resp = conn.urlopen(
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 844, in urlopen
rirv938-mistral-12b-oai-58888-v1-mkmlizer: retries = retries.increment(
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/util/retry.py", line 470, in increment
rirv938-mistral-12b-oai-58888-v1-mkmlizer: raise reraise(type(error), error, _stacktrace)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/util/util.py", line 39, in reraise
rirv938-mistral-12b-oai-58888-v1-mkmlizer: raise value
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 790, in urlopen
rirv938-mistral-12b-oai-58888-v1-mkmlizer: response = self._make_request(
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 538, in _make_request
rirv938-mistral-12b-oai-58888-v1-mkmlizer: self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/urllib3/connectionpool.py", line 370, in _raise_timeout
rirv938-mistral-12b-oai-58888-v1-mkmlizer: raise ReadTimeoutError(
rirv938-mistral-12b-oai-58888-v1-mkmlizer: urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='huggingface.co', port=443): Read timed out. (read timeout=10)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: During handling of the above exception, another exception occurred:
rirv938-mistral-12b-oai-58888-v1-mkmlizer: Traceback (most recent call last):
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1374, in _get_metadata_or_catch_error
rirv938-mistral-12b-oai-58888-v1-mkmlizer: metadata = get_hf_file_metadata(
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
rirv938-mistral-12b-oai-58888-v1-mkmlizer: return fn(*args, **kwargs)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1294, in get_hf_file_metadata
rirv938-mistral-12b-oai-58888-v1-mkmlizer: r = _request_wrapper(
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 278, in _request_wrapper
rirv938-mistral-12b-oai-58888-v1-mkmlizer: response = _request_wrapper(
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 301, in _request_wrapper
rirv938-mistral-12b-oai-58888-v1-mkmlizer: response = get_session().request(method=method, url=url, **params)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/sessions.py", line 589, in request
rirv938-mistral-12b-oai-58888-v1-mkmlizer: resp = self.send(prep, **send_kwargs)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/sessions.py", line 703, in send
rirv938-mistral-12b-oai-58888-v1-mkmlizer: r = adapter.send(request, **kwargs)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_http.py", line 93, in send
rirv938-mistral-12b-oai-58888-v1-mkmlizer: return super().send(request, *args, **kwargs)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/requests/adapters.py", line 713, in send
rirv938-mistral-12b-oai-58888-v1-mkmlizer: raise ReadTimeout(e, request=request)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: requests.exceptions.ReadTimeout: (ReadTimeoutError("HTTPSConnectionPool(host='huggingface.co', port=443): Read timed out. (read timeout=10)"), '(Request ID: 328e50c2-96d2-464c-aef5-31df1822bfaa)')
rirv938-mistral-12b-oai-58888-v1-mkmlizer: The above exception was the direct cause of the following exception:
rirv938-mistral-12b-oai-58888-v1-mkmlizer: Traceback (most recent call last):
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/code/uploading/mkmlize.py", line 193, in <module>
rirv938-mistral-12b-oai-58888-v1-mkmlizer: cli()
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
rirv938-mistral-12b-oai-58888-v1-mkmlizer: return self.main(*args, **kwargs)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1078, in main
rirv938-mistral-12b-oai-58888-v1-mkmlizer: rv = self.invoke(ctx)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
rirv938-mistral-12b-oai-58888-v1-mkmlizer: return _process_result(sub_ctx.command.invoke(sub_ctx))
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
rirv938-mistral-12b-oai-58888-v1-mkmlizer: return ctx.invoke(self.callback, **ctx.params)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 783, in invoke
rirv938-mistral-12b-oai-58888-v1-mkmlizer: return __callback(*args, **kwargs)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/code/uploading/mkmlize.py", line 39, in quantize
rirv938-mistral-12b-oai-58888-v1-mkmlizer: temp_folder = download_to_shared_memory(repo_id, revision, hf_auth_token)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/code/uploading/mkmlize.py", line 86, in download_to_shared_memory
rirv938-mistral-12b-oai-58888-v1-mkmlizer: snapshot_download(
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
rirv938-mistral-12b-oai-58888-v1-mkmlizer: return fn(*args, **kwargs)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/_snapshot_download.py", line 294, in snapshot_download
rirv938-mistral-12b-oai-58888-v1-mkmlizer: _inner_hf_hub_download(file)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/_snapshot_download.py", line 270, in _inner_hf_hub_download
rirv938-mistral-12b-oai-58888-v1-mkmlizer: return hf_hub_download(
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
rirv938-mistral-12b-oai-58888-v1-mkmlizer: return fn(*args, **kwargs)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 840, in hf_hub_download
rirv938-mistral-12b-oai-58888-v1-mkmlizer: return _hf_hub_download_to_local_dir(
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1089, in _hf_hub_download_to_local_dir
rirv938-mistral-12b-oai-58888-v1-mkmlizer: _raise_on_head_call_error(head_call_error, force_download, local_files_only)
rirv938-mistral-12b-oai-58888-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1485, in _raise_on_head_call_error
rirv938-mistral-12b-oai-58888-v1-mkmlizer: raise LocalEntryNotFoundError(
rirv938-mistral-12b-oai-58888-v1-mkmlizer: huggingface_hub.errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.
Job rirv938-mistral-12b-oai-58888-v1-mkmlizer completed after 169.78s with status: failed
Stopping job with name rirv938-mistral-12b-oai-58888-v1-mkmlizer
%s, retrying in %s seconds...
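The uninterpolated "%s, retrying in %s seconds..." line shows the pipeline retries a failed job before giving up, which is why a second MKMLizer job starts below. A generic retry wrapper along those lines (a sketch; the pipeline's actual attempt count and delay are not shown in the log):

```python
import time

def run_with_retries(fn, attempts=3, delay=5.0):
    """Call fn(), retrying up to `attempts` times with a fixed delay
    between attempts. A sketch of a retry policy; the real pipeline's
    parameters are unknown."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception as exc:
            if attempt == attempts:
                raise
            print(f"{exc}, retrying in {delay} seconds...")
            time.sleep(delay)
```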
Starting job with name rirv938-mistral-12b-oai-58888-v1-mkmlizer
Waiting for job on rirv938-mistral-12b-oai-58888-v1-mkmlizer to finish
rirv938-mistral-12b-oai-58888-v1-mkmlizer: Downloaded to shared memory in 83.331s
rirv938-mistral-12b-oai-58888-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpv4vskd96, device:0
rirv938-mistral-12b-oai-58888-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
rirv938-mistral-12b-oai-58888-v1-mkmlizer: quantized model in 42.012s
rirv938-mistral-12b-oai-58888-v1-mkmlizer: Processed model rirv938/mistral_12b_oai_prompt_oai_bon_mixed_outputs_600_v2 in 125.344s
rirv938-mistral-12b-oai-58888-v1-mkmlizer: creating bucket guanaco-mkml-models
rirv938-mistral-12b-oai-58888-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
rirv938-mistral-12b-oai-58888-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/rirv938-mistral-12b-oai-58888-v1
rirv938-mistral-12b-oai-58888-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/rirv938-mistral-12b-oai-58888-v1/config.json
rirv938-mistral-12b-oai-58888-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/rirv938-mistral-12b-oai-58888-v1/special_tokens_map.json
rirv938-mistral-12b-oai-58888-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/rirv938-mistral-12b-oai-58888-v1/tokenizer_config.json
rirv938-mistral-12b-oai-58888-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/rirv938-mistral-12b-oai-58888-v1/tokenizer.json
rirv938-mistral-12b-oai-58888-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/rirv938-mistral-12b-oai-58888-v1/flywheel_model.0.safetensors
rirv938-mistral-12b-oai-58888-v1-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... Loading 0: 98%|█████████▊| 357/363 [00:20<00:01, 4.93it/s]
Job rirv938-mistral-12b-oai-58888-v1-mkmlizer completed after 154.82s with status: succeeded
Stopping job with name rirv938-mistral-12b-oai-58888-v1-mkmlizer
Pipeline stage MKMLizer completed in 325.65s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.20s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service rirv938-mistral-12b-oai-58888-v1
Waiting for inference service rirv938-mistral-12b-oai-58888-v1 to be ready
Inference service rirv938-mistral-12b-oai-58888-v1 ready after 211.1746208667755s
Pipeline stage MKMLDeployer completed in 211.76s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.0517990589141846s
Received healthy response to inference request in 1.5973165035247803s
Received healthy response to inference request in 1.5624101161956787s
Received healthy response to inference request in 1.5733935832977295s
Received healthy response to inference request in 1.9747202396392822s
5 requests
0 failed requests
5th percentile: 1.5646068096160888
10th percentile: 1.566803503036499
20th percentile: 1.5711968898773194
30th percentile: 1.5781781673431396
40th percentile: 1.58774733543396
50th percentile: 1.5973165035247803
60th percentile: 1.7482779979705811
70th percentile: 1.8992394924163818
80th percentile: 1.9901360034942628
90th percentile: 2.0209675312042235
95th percentile: 2.036383295059204
99th percentile: 2.0487159061431885
mean time: 1.751927900314331
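The stress-checker percentiles above are reproducible from the five raw response times using linearly interpolated quantiles (numpy's default percentile method, or the standard library's statistics.quantiles with method='inclusive'):

```python
import statistics

# The five response times reported by the stress checker, in seconds.
times = [
    2.0517990589141846,
    1.5973165035247803,
    1.5624101161956787,
    1.5733935832977295,
    1.9747202396392822,
]

# 99 cut points; qs[k] is the (k + 1)-th percentile under linear
# interpolation, matching the values printed in the log above.
qs = statistics.quantiles(times, n=100, method="inclusive")
p5, p50, p95 = qs[4], qs[49], qs[94]  # 1.5646..., 1.5973..., 2.0363...
mean_time = statistics.mean(times)    # 1.7519...
```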
Pipeline stage StressChecker completed in 10.34s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.77s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.73s
Shutdown handler de-registered
rirv938-mistral-12b-oai_58888_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.16s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.11s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service rirv938-mistral-12b-oai-58888-v1-profiler
Waiting for inference service rirv938-mistral-12b-oai-58888-v1-profiler to be ready
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 4891.79s
Shutdown handler de-registered
rirv938-mistral-12b-oai_58888_v1 status is now inactive due to auto deactivation removed underperforming models
rirv938-mistral-12b-oai_58888_v1 status is now torndown due to DeploymentManager action