submission_id: function_soson_2024-11-06
developer_uid: chai_backend_admin
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.9, 'top_p': 0.9, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['\n', '</s>'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
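The formatter above assembles the persona, scenario prompt, and chat history into a single prompt string ending in the response template. A minimal sketch of how those templates might be applied follows; only the template strings come from the submission record, while the helper name and sample conversation are illustrative:

```python
# Sketch of applying the submission's formatter templates to a conversation.
# The template strings are taken from the record above; the function name
# and the sample data are illustrative assumptions.

def build_prompt(memory, prompt, messages, bot_name, user_name):
    """Fill the formatter templates: persona block, prompt, then chat turns."""
    memory_template = "{bot_name}'s Persona: {memory}\n####\n"
    prompt_template = "{prompt}\n<START>\n"
    bot_template = "{bot_name}: {message}\n"
    user_template = "{user_name}: {message}\n"
    response_template = "{bot_name}:"

    parts = [
        memory_template.format(bot_name=bot_name, memory=memory),
        prompt_template.format(prompt=prompt),
    ]
    for sender, message in messages:
        template = bot_template if sender == "bot" else user_template
        # str.format ignores unused keyword arguments, so both names can
        # be passed to either template.
        parts.append(template.format(bot_name=bot_name, user_name=user_name,
                                     message=message))
    parts.append(response_template.format(bot_name=bot_name))
    return "".join(parts)

prompt_text = build_prompt(
    memory="A friendly guide.",
    prompt="A casual chat.",
    messages=[("user", "Hi!"), ("bot", "Hello there.")],
    bot_name="Soson",
    user_name="Anon",
)
```

The trailing `response_template` leaves the prompt ending in `Soson:`, which is where generation (with the sampling parameters above, e.g. `temperature=0.9`, `best_of=8`) would continue.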
model_name: retune_with_base
status: failed
timestamp: 2024-11-06T22:10:19+00:00
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 5.517536878585815s
Received healthy response to inference request in 3.0905303955078125s
Received healthy response to inference request in 3.446322441101074s
Received healthy response to inference request in 5.107716798782349s
Received healthy response to inference request in 4.116021394729614s
5 requests
0 failed requests
5th percentile: 3.161688804626465
10th percentile: 3.232847213745117
20th percentile: 3.375164031982422
30th percentile: 3.580262231826782
40th percentile: 3.848141813278198
50th percentile: 4.116021394729614
60th percentile: 4.512699556350708
70th percentile: 4.909377717971801
80th percentile: 5.189680814743042
90th percentile: 5.353608846664429
95th percentile: 5.435572862625122
99th percentile: 5.501144075393677
mean time: 4.255625581741333
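The summary figures above can be reproduced from the five logged latencies with linear-interpolation percentiles (NumPy's default method yields exactly these numbers). The script below is a reconstruction for verification, not the StressChecker's actual code:

```python
# Recompute the first batch's summary statistics from its five latencies.
# percentile() mirrors linear interpolation between sorted samples, which
# reproduces the logged figures; this is a reconstruction, not the
# StressChecker's actual implementation.

def percentile(values, q):
    """q-th percentile with linear interpolation between sorted samples."""
    ordered = sorted(values)
    rank = q / 100 * (len(ordered) - 1)
    lower = int(rank)
    frac = rank - lower
    if lower + 1 < len(ordered):
        return ordered[lower] + frac * (ordered[lower + 1] - ordered[lower])
    return ordered[lower]

latencies = [
    5.517536878585815,
    3.0905303955078125,
    3.446322441101074,
    5.107716798782349,
    4.116021394729614,
]

for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {percentile(latencies, q)}")
print(f"mean time: {sum(latencies) / len(latencies)}")
```

With five samples the 50th percentile is simply the median (4.116s here), and the extreme percentiles interpolate between the two slowest or two fastest requests.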
%s, retrying in %s seconds...
Received healthy response to inference request in 6.180832862854004s
Received healthy response to inference request in 9.182952880859375s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Received healthy response to inference request in 10.40718698501587s
Received healthy response to inference request in 10.492637395858765s
Received healthy response to inference request in 14.273050308227539s
5 requests
0 failed requests
5th percentile: 6.781256866455078
10th percentile: 7.381680870056153
20th percentile: 8.582528877258301
30th percentile: 9.427799701690674
40th percentile: 9.917493343353271
50th percentile: 10.40718698501587
60th percentile: 10.441367149353027
70th percentile: 10.475547313690186
80th percentile: 11.24871997833252
90th percentile: 12.76088514328003
95th percentile: 13.516967725753783
99th percentile: 14.121833791732788
mean time: 10.10733208656311
%s, retrying in %s seconds...
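The "retrying in %s seconds..." lines show the checker retries a batch after a delay, though the format arguments were never interpolated, so the actual error and delay values are unknown. A minimal retry-with-delay helper in that spirit might look like the following (the function name, attempt limit, and delay are assumptions, not values from the log):

```python
# Illustrative retry-with-delay helper in the spirit of the
# "%s, retrying in %s seconds..." log lines. The function name, the
# attempt limit, and the delay are assumptions, not values from the log.
import logging
import time

logger = logging.getLogger(__name__)

def run_with_retries(stage, max_attempts=3, delay_seconds=5, sleep=time.sleep):
    """Run stage(), retrying on failure with a fixed delay between attempts."""
    for attempt in range(1, max_attempts + 1):
        try:
            return stage()
        except Exception as error:
            if attempt == max_attempts:
                raise
            # Lazy %-formatting matches the raw log lines above.
            logger.warning("%s, retrying in %s seconds...", error, delay_seconds)
            sleep(delay_seconds)
```

Passing `sleep` as a parameter lets tests stub out the real delay.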
Received healthy response to inference request in 15.380198240280151s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Received healthy response to inference request in 12.538949012756348s
Received healthy response to inference request in 14.049153089523315s
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 2.906632900238037s
5 requests
1 failed request
5th percentile: 4.833096122741699
10th percentile: 6.7595593452453615
20th percentile: 10.612485790252686
30th percentile: 12.840989828109741
40th percentile: 13.445071458816528
50th percentile: 14.049153089523315
60th percentile: 14.58157114982605
70th percentile: 15.113989210128784
80th percentile: 16.334468269348147
90th percentile: 18.24300832748413
95th percentile: 19.197278356552122
99th percentile: 19.96069437980652
mean time: 13.005296325683593
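One failed request out of five in this batch is a 20% predict-error rate, which trips the deployment check below. A sketch of that kind of check follows; the exception class, threshold, and message wording are reconstructed from the log line, not taken from the pipeline's source:

```python
# Sketch of the predict-error check implied by the log's
# DeploymentChecksError. The class definition, the 20% threshold, and
# the message wording are reconstructed from the log line, not taken
# from the pipeline's source code.

class DeploymentChecksError(Exception):
    pass

def check_predict_errors(total_requests, failed_requests, max_error_rate=0.2):
    """Raise if the observed failure rate reaches the allowed error rate."""
    error_rate = failed_requests / total_requests
    if error_rate >= max_error_rate:
        raise DeploymentChecksError(
            f"Unacceptable number of predict errors: {error_rate * 100}%"
        )
```

With `total_requests=5` and `failed_requests=1` this raises with the same "20.0%" figure the log reports, after which the pipeline tears down and marks the submission failed.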
clean up pipeline due to error=DeploymentChecksError('Unacceptable number of predict errors: 20.0%')
Shutdown handler de-registered
function_soson_2024-11-06 status is now failed due to DeploymentManager action