submission_id: function_semeb_2024-09-25
developer_uid: chai_backend_admin
celo_rating: 1252.22
display_name: reward_blend_default_full_bon
family_friendly_score: 0.5744928500166279
family_friendly_standard_error: 0.008992047974716953
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.9, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 50, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['\n', '</s>', '<|user|>', '###'], 'max_input_tokens': 512, 'best_of': 4, 'max_output_tokens': 64}
is_internal_developer: True
model_group:
model_name: reward_blend_default_full_bon
num_battles: 3080
num_wins: 1533
ranking_group: single
status: torndown
submission_type: function
timestamp: 2024-09-25T04:25:17+00:00
us_pacific_date: 2024-09-24
win_ratio: 0.49772727272727274
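The formatter and generation_params fields above define how a conversation is assembled into a prompt (persona memory, scenario prompt, alternating user/bot turns, then the response template the model must complete) and how completions are sampled. The sketch below is a hypothetical illustration of that template substitution plus the win_ratio arithmetic from num_wins and num_battles; the persona, names, and messages are invented, and this is not the deployment pipeline's own code.

    # Hypothetical illustration of the formatter fields above; not pipeline code.
    formatter = {
        "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
        "prompt_template": "{prompt}\n<START>\n",
        "bot_template": "{bot_name}: {message}\n",
        "user_template": "{user_name}: {message}\n",
        "response_template": "{bot_name}:",
    }

    # Invented conversation state, used only to show the substitution.
    bot_name, user_name = "Aria", "Anonymous user"
    memory = "A friendly assistant."
    prompt = "Aria greets the user."
    turns = [("user", "Hi there!"), ("bot", "Hello! How are you today?")]

    text = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    text += formatter["prompt_template"].format(prompt=prompt)
    for role, message in turns:
        if role == "user":
            text += formatter["user_template"].format(user_name=user_name, message=message)
        else:
            text += formatter["bot_template"].format(bot_name=bot_name, message=message)
    text += formatter["response_template"].format(bot_name=bot_name)
    print(text)  # prompt string; presumably truncated upstream to max_input_tokens=512

    # win_ratio is simply num_wins / num_battles:
    assert abs(1533 / 3080 - 0.49772727272727274) < 1e-12

The generation_params are the sampling settings handed to the inference backend: temperature 0.9 with top_k 50, up to 64 output tokens, stopping at the first newline, and best_of=4 candidates per request, which is consistent with the best-of-n ("bon") selection the display name suggests, though the selection logic itself is not shown in this record.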
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 6.181146860122681s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Received healthy response to inference request in 4.960723400115967s
Received healthy response to inference request in 8.706024885177612s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Received healthy response to inference request in 5.344745635986328s
Received healthy response to inference request in 7.324533224105835s
5 requests
0 failed requests
5th percentile: 5.037527847290039
10th percentile: 5.114332294464111
20th percentile: 5.267941188812256
30th percentile: 5.5120258808135985
40th percentile: 5.84658637046814
50th percentile: 6.181146860122681
60th percentile: 6.638501405715942
70th percentile: 7.095855951309204
80th percentile: 7.600831556320191
90th percentile: 8.153428220748902
95th percentile: 8.429726552963256
99th percentile: 8.65076521873474
mean time: 6.503434801101685
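The StressChecker statistics above (and the two blocks that follow) can be reproduced from the five logged response times. Below is a minimal sketch assuming linear interpolation between order statistics; the reported values match numpy.percentile's default behaviour, though this need not be the checker's actual implementation.

    # Sketch: reproduce the percentile/mean figures from the five logged latencies.
    # Assumes linear interpolation (numpy's default); not the StressChecker's own code.
    import numpy as np

    latencies = [  # seconds, from the "Received healthy response" lines above
        6.181146860122681,
        4.960723400115967,
        8.706024885177612,
        5.344745635986328,
        7.324533224105835,
    ]

    for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f"{q}th percentile: {np.percentile(latencies, q)}")
    print(f"mean time: {np.mean(latencies)}")
    # e.g. 5th percentile: 5.037527847290039, mean time: 6.503434801101685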
%s, retrying in %s seconds...
Received healthy response to inference request in 6.403990983963013s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Received healthy response to inference request in 6.955063104629517s
Received healthy response to inference request in 7.9380927085876465s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Received healthy response to inference request in 8.717573165893555s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Received healthy response to inference request in 7.285416603088379s
5 requests
0 failed requests
5th percentile: 6.514205408096314
10th percentile: 6.624419832229615
20th percentile: 6.844848680496216
30th percentile: 7.021133804321289
40th percentile: 7.153275203704834
50th percentile: 7.285416603088379
60th percentile: 7.546487045288086
70th percentile: 7.807557487487792
80th percentile: 8.093988800048828
90th percentile: 8.40578098297119
95th percentile: 8.561677074432373
99th percentile: 8.686393947601319
mean time: 7.4600273132324215
%s, retrying in %s seconds...
Received healthy response to inference request in 6.638851165771484s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Received healthy response to inference request in 6.839630603790283s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Received healthy response to inference request in 6.143150329589844s
Received healthy response to inference request in 5.45689582824707s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Received healthy response to inference request in 5.142122983932495s
5 requests
0 failed requests
5th percentile: 5.2050775527954105
10th percentile: 5.268032121658325
20th percentile: 5.393941259384155
30th percentile: 5.594146728515625
40th percentile: 5.868648529052734
50th percentile: 6.143150329589844
60th percentile: 6.3414306640625
70th percentile: 6.539710998535156
80th percentile: 6.679007053375244
90th percentile: 6.759318828582764
95th percentile: 6.799474716186523
99th percentile: 6.831599426269531
mean time: 6.044130182266235
clean up pipeline due to error=%s
Shutdown handler de-registered
function_semeb_2024-09-25 status is now failed due to DeploymentManager action
function_semeb_2024-09-25 status is now torndown due to DeploymentManager action
function_semeb_2024-09-25 status is now inactive due to auto-deactivation of underperforming models
function_semeb_2024-09-25 status is now torndown due to DeploymentManager action