submission_id: function_dirit_2024-10-04
developer_uid: valentin
celo_rating: 1172.0
display_name: llama-405b-toether
family_friendly_score: 0.6279645191409897
family_friendly_standard_error: 0.008064121120700487
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
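The formatter above can be exercised with a short sketch. The template strings are copied verbatim from the record; the persona, prompt, and message values are made-up placeholders, and the `render` helper is hypothetical (the real pipeline's assembly code is not shown here):

```python
# Formatter templates, verbatim from the submission record.
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def render(formatter, bot_name, memory, prompt, turns):
    """Assemble a prompt; turns is a list of (speaker, message),
    with speaker either 'user' or 'bot'."""
    out = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    out += formatter["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        if speaker == "user":
            out += formatter["user_template"].format(user_name="User", message=message)
        else:
            out += formatter["bot_template"].format(bot_name=bot_name, message=message)
    # Trailing response template leaves the cursor after "Bot:" for generation.
    out += formatter["response_template"].format(bot_name=bot_name)
    return out

text = render(formatter, "Bot", "a friendly assistant", "Example scenario", [("user", "Hi")])
```

The resulting string ends with the bare `Bot:` prefix, so the model's completion continues the bot's turn directly.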
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|end▁of▁sentence|>'], 'max_input_tokens': 512, 'best_of': 8, 'max_output_tokens': 64}
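The `stopping_words` in the generation params cut a completion at the first newline or end-of-sentence token. A minimal post-hoc sketch of that truncation (the sampler itself, and parameters like `best_of` and `top_k`, are not modeled here):

```python
# Stop strings, verbatim from the submission's generation_params.
stopping_words = ["\n", "<|end▁of▁sentence|>"]

def truncate_at_stop(text, stopping_words):
    """Return text up to (not including) the earliest stop string."""
    cut = len(text)
    for stop in stopping_words:
        idx = text.find(stop)
        if idx != -1:
            cut = min(cut, idx)
    return text[:cut]

truncate_at_stop("Hello there\nBot: more", stopping_words)  # → "Hello there"
```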
is_internal_developer: True
model_group:
model_name: llama-405b-toether
num_battles: 3753
num_wins: 1453
ranking_group: single
status: torndown
submission_type: function
timestamp: 2024-10-04T12:45:13+00:00
us_pacific_date: 2024-10-04
win_ratio: 0.38715694111377563
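As a sanity check, the `win_ratio` above is simply `num_wins / num_battles` from the same record:

```python
num_battles = 3753  # from the record
num_wins = 1453     # from the record
win_ratio = num_wins / num_battles  # matches the recorded 0.38715694111377563
```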
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
admin requested tearing down of blend_rofur_2024-10-03
run pipeline stage %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Running pipeline stage ProductionBlendMKMLTemplater
Running pipeline stage StressChecker
run pipeline %s
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
Received healthy response to inference request in 3.09596848487854s
Tearing down inference service blend-rofur-2024-10-03
Pipeline stage ProductionBlendMKMLTemplater completed in 29.54s
Running pipeline stage ProductionBlendMKMLTemplater
Tearing down inference service blend-rofur-2024-10-03
%s, retrying in %s seconds...
Tearing down inference service blend-rofur-2024-10-03
run pipeline stage %s
Received healthy response to inference request in 5.995012283325195s
Pipeline stage %s skipped, reason=%s
Tearing down inference service blend-rofur-2024-10-03
%s, retrying in %s seconds...
Creating inference service blend-rofur-2024-10-03
%s, retrying in %s seconds...
Running pipeline stage MKMLDeployer
Pipeline stage ProductionBlendMKMLTemplater completed in 52.64s
Received healthy response to inference request in 7.5592615604400635s
%s, retrying in %s seconds...
Creating inference service blend-rofur-2024-10-03
Waiting for inference service blend-rofur-2024-10-03 to be ready
Creating inference service blend-rofur-2024-10-03
Creating inference service blend-rofur-2024-10-03
run pipeline stage %s
Creating inference service blend-rofur-2024-10-03
Received healthy response to inference request in 8.153065204620361s
Ignoring service blend-rofur-2024-10-03 already deployed
Ignoring service blend-rofur-2024-10-03 already deployed
Ignoring service blend-rofur-2024-10-03 already deployed
Running pipeline stage MKMLDeployer
Ignoring service blend-rofur-2024-10-03 already deployed
Waiting for inference service blend-rofur-2024-10-03 to be ready
Received healthy response to inference request in 5.801152467727661s
Waiting for inference service blend-rofur-2024-10-03 to be ready
Waiting for inference service blend-rofur-2024-10-03 to be ready
Waiting for inference service blend-rofur-2024-10-03 to be ready
Creating inference service blend-rofur-2024-10-03
5 requests
Ignoring service blend-rofur-2024-10-03 already deployed
0 failed requests
Waiting for inference service blend-rofur-2024-10-03 to be ready
5th percentile: 3.6370052814483644
10th percentile: 4.178042078018189
admin requested tearing down of blend_rofur_2024-10-03
20th percentile: 5.260115671157837
admin requested tearing down of blend_rofur_2024-10-03
Shutdown handler not registered because Python interpreter is not running in the main thread
30th percentile: 5.839924430847168
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
40th percentile: 5.9174683570861815
run pipeline %s
run pipeline stage %s
50th percentile: 5.995012283325195
run pipeline stage %s
Running pipeline stage ProductionBlendMKMLTemplater
60th percentile: 6.620711994171143
Running pipeline stage ProductionBlendMKMLTemplater
Pipeline stage %s skipped, reason=%s
70th percentile: 7.246411705017089
Pipeline stage %s skipped, reason=%s
Pipeline stage ProductionBlendMKMLTemplater completed in 43.36s
80th percentile: 7.678022289276123
Pipeline stage ProductionBlendMKMLTemplater completed in 40.82s
run pipeline stage %s
90th percentile: 7.915543746948242
run pipeline stage %s
Running pipeline stage MKMLDeployer
95th percentile: 8.034304475784301
Running pipeline stage MKMLDeployer
Creating inference service blend-rofur-2024-10-03
99th percentile: 8.12931305885315
Creating inference service blend-rofur-2024-10-03
Ignoring service blend-rofur-2024-10-03 already deployed
mean time: 6.120892000198364
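The latency summary above (for the first batch of 5 healthy responses) is reproducible with linear-interpolation percentiles over the five durations logged earlier; the exact statistics library used by the pipeline is an assumption, so here is a pure-Python sketch of that method:

```python
def percentile(values, p):
    """Percentile with linear interpolation between order statistics
    (the same convention as numpy.percentile's default)."""
    xs = sorted(values)
    rank = (p / 100) * (len(xs) - 1)
    lo = int(rank)
    hi = min(lo + 1, len(xs) - 1)
    frac = rank - lo
    return xs[lo] + frac * (xs[hi] - xs[lo])

# The five healthy-response times logged above, in seconds.
latencies = [
    3.09596848487854,
    5.995012283325195,
    7.5592615604400635,
    8.153065204620361,
    5.801152467727661,
]

p5 = percentile(latencies, 5)           # ≈ 3.637005..., as logged
p95 = percentile(latencies, 95)         # ≈ 8.034304..., as logged
mean = sum(latencies) / len(latencies)  # ≈ 6.120892..., as logged
```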
Ignoring service blend-rofur-2024-10-03 already deployed
Waiting for inference service blend-rofur-2024-10-03 to be ready
%s, retrying in %s seconds...
Waiting for inference service blend-rofur-2024-10-03 to be ready
Received healthy response to inference request in 2.6424314975738525s
Received healthy response to inference request in 1.9933724403381348s
Received healthy response to inference request in 2.1282413005828857s
admin requested tearing down of blend_rofur_2024-10-03
admin requested tearing down of blend_rofur_2024-10-03
Shutdown handler not registered because Python interpreter is not running in the main thread
Received healthy response to inference request in 3.761556625366211s
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
Received healthy response to inference request in 3.677781105041504s
run pipeline %s
run pipeline stage %s
5 requests
run pipeline stage %s
Running pipeline stage ProductionBlendMKMLTemplater
0 failed requests
Running pipeline stage ProductionBlendMKMLTemplater
Pipeline stage %s skipped, reason=%s
5th percentile: 2.020346212387085
Pipeline stage %s skipped, reason=%s
Pipeline stage ProductionBlendMKMLTemplater completed in 48.25s
10th percentile: 2.0473199844360352
Pipeline stage ProductionBlendMKMLTemplater completed in 76.08s
Tearing down inference service blend-rofur-2024-10-03
run pipeline stage %s
20th percentile: 2.1012675285339357
run pipeline stage %s
%s, retrying in %s seconds...
Tearing down inference service blend-rofur-2024-10-03
Tearing down inference service blend-rofur-2024-10-03
Tearing down inference service blend-rofur-2024-10-03
Running pipeline stage MKMLDeployer
Tearing down inference service blend-rofur-2024-10-03
Tearing down inference service blend-rofur-2024-10-03
Tearing down inference service blend-rofur-2024-10-03
Tearing down inference service blend-rofur-2024-10-03
30th percentile: 2.231079339981079
Running pipeline stage MKMLDeployer
Creating inference service blend-rofur-2024-10-03
%s, retrying in %s seconds...
%s, retrying in %s seconds...
%s, retrying in %s seconds...
Creating inference service blend-rofur-2024-10-03
clean up pipeline due to error=DeploymentError('404\nReason: Not Found\nHTTP response headers: HTTPHeaderDict({\'Audit-Id\': \'6cf6c2a7-7429-4a55-acf8-8cc959a00985, 4cb56220-4376-4a35-b67c-e643232b87d9\', \'Cache-Control\': \'no-cache, private, no-cache, private\', \'Content-Length\': \'284\', \'Content-Type\': \'application/json\', \'Date\': \'Fri, 04 Oct 2024 13:00:40 GMT\', \'X-Kubernetes-Pf-Flowschema-Uid\': \'514c121f-0f8a-452c-aa56-437270a02244\', \'X-Kubernetes-Pf-Prioritylevel-Uid\': \'48ad322a-4034-4c03-9ea4-7745e1e2c31a\'})\nHTTP response body: b\'{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"inferenceservices.serving.kserve.io \\\\"blend-rofur-2024-10-03\\\\" not found","reason":"NotFound","details":{"name":"blend-rofur-2024-10-03","group":"serving.kserve.io","kind":"inferenceservices"},"code":404}\\n\'\nOriginal traceback: \n File "/usr/local/lib/python3.12/site-packages/kubernetes/dynamic/client.py", line 55, in inner\n resp = func(self, *args, **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n File "/usr/local/lib/python3.12/site-packages/kubernetes/dynamic/client.py", line 273, in request\n api_response = self.client.call_api(\n ^^^^^^^^^^^^^^^^^^^^^\n\n File "/usr/local/lib/python3.12/site-packages/kubernetes/client/api_client.py", line 348, in call_api\n return self.__call_api(resource_path, method,\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n File "/usr/local/lib/python3.12/site-packages/kubernetes/client/api_client.py", line 180, in __call_api\n response_data = self.request(\n ^^^^^^^^^^^^^\n\n File "/usr/local/lib/python3.12/site-packages/kubernetes/client/api_client.py", line 415, in request\n return self.rest_client.DELETE(url,\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n File "/usr/local/lib/python3.12/site-packages/kubernetes/client/rest.py", line 270, in DELETE\n return self.request("DELETE", url,\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n File "/usr/local/lib/python3.12/site-packages/kubernetes/client/rest.py", line 238, in request\n raise ApiException(http_resp=r)\n')
%s, retrying in %s seconds...
clean up pipeline due to error=DeploymentError('404\nReason: Not Found\nHTTP response headers: HTTPHeaderDict({\'Audit-Id\': \'f17a15ff-3dc1-4959-bf85-8b90c88df266, 41f2d197-d35b-419c-ad38-4d9db95774a1\', \'Cache-Control\': \'no-cache, private, no-cache, private\', \'Content-Length\': \'284\', \'Content-Type\': \'application/json\', \'Date\': \'Fri, 04 Oct 2024 13:00:48 GMT\', \'X-Kubernetes-Pf-Flowschema-Uid\': \'514c121f-0f8a-452c-aa56-437270a02244\', \'X-Kubernetes-Pf-Prioritylevel-Uid\': \'48ad322a-4034-4c03-9ea4-7745e1e2c31a\'})\nHTTP response body: b\'{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"inferenceservices.serving.kserve.io \\\\"blend-rofur-2024-10-03\\\\" not found","reason":"NotFound","details":{"name":"blend-rofur-2024-10-03","group":"serving.kserve.io","kind":"inferenceservices"},"code":404}\\n\'\nOriginal traceback: \n File "/usr/local/lib/python3.12/site-packages/kubernetes/dynamic/client.py", line 55, in inner\n resp = func(self, *args, **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n File "/usr/local/lib/python3.12/site-packages/kubernetes/dynamic/client.py", line 273, in request\n api_response = self.client.call_api(\n ^^^^^^^^^^^^^^^^^^^^^\n\n File "/usr/local/lib/python3.12/site-packages/kubernetes/client/api_client.py", line 348, in call_api\n return self.__call_api(resource_path, method,\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n File "/usr/local/lib/python3.12/site-packages/kubernetes/client/api_client.py", line 180, in __call_api\n response_data = self.request(\n ^^^^^^^^^^^^^\n\n File "/usr/local/lib/python3.12/site-packages/kubernetes/client/api_client.py", line 415, in request\n return self.rest_client.DELETE(url,\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n File "/usr/local/lib/python3.12/site-packages/kubernetes/client/rest.py", line 270, in DELETE\n return self.request("DELETE", url,\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n File "/usr/local/lib/python3.12/site-packages/kubernetes/client/rest.py", line 238, in request\n raise ApiException(http_resp=r)\n')
%s, retrying in %s seconds...
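The unformatted "%s, retrying in %s seconds..." entries above come from a retry wrapper around the deployment calls. The pipeline's actual helper is not shown in these logs, so this is a hypothetical sketch of that pattern:

```python
import time

def with_retries(fn, attempts=3, delay=5.0, logger=print):
    """Hypothetical retry helper matching the 'retrying in N seconds' log lines:
    call fn(), and on failure log the error and sleep before the next attempt."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception as exc:
            if attempt == attempts - 1:
                raise  # out of attempts; propagate the last error
            logger(f"{exc}, retrying in {delay} seconds...")
            time.sleep(delay)
```

For example, `with_retries(lambda: delete_service(name))` would retry a transient deletion failure a few times before letting the pipeline's cleanup handler see the final error.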
40th percentile: 2.436755418777466
admin requested tearing down of blend_rofur_2024-10-03
Ignoring service blend-rofur-2024-10-03 already deployed
Creating inference service blend-rofur-2024-10-03
Creating inference service blend-rofur-2024-10-03
Creating inference service blend-rofur-2024-10-03
Creating inference service blend-rofur-2024-10-03
Ignoring service blend-rofur-2024-10-03 already deployed
Shutdown handler de-registered
Creating inference service blend-rofur-2024-10-03
Shutdown handler de-registered
Creating inference service blend-rofur-2024-10-03
Pipeline stage %s skipped, reason=%s
Creating inference service blend-rofur-2024-10-03
function_dirit_2024-10-04 status is now torndown due to DeploymentManager action
function_dirit_2024-10-04 status is now deployed due to DeploymentManager action
function_dirit_2024-10-04 status is now inactive due to auto deactivation removed underperforming models
function_dirit_2024-10-04 status is now torndown due to DeploymentManager action