developer_uid: chai_backend_admin
submission_id: function_mitul_2024-11-27
model_name: retune_with_base
model_group:
status: inactive
timestamp: 2024-11-27T19:06:25+00:00
num_battles: 12194
num_wins: 6031
celo_rating: 1255.47
family_friendly_score: 0.5616
family_friendly_standard_error: 0.007017199441372605
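If the family-friendly score is the mean of per-conversation binary labels, the reported standard error is consistent with the usual binomial formula; a minimal sketch under that assumption (the label count is not in this log and is only implied by the arithmetic):

from math import sqrt

p = 0.5616                      # family_friendly_score
se = 0.007017199441372605       # family_friendly_standard_error
# binomial standard error: se = sqrt(p * (1 - p) / n)  =>  n = p * (1 - p) / se**2
implied_n = p * (1 - p) / se ** 2
print(round(implied_n))          # roughly 5000 labelled samples, under this assumption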
submission_type: function
display_name: retune_with_base
is_internal_developer: True
ranking_group: single
us_pacific_date: 2024-11-27
win_ratio: 0.49458750205018864
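The win ratio follows directly from the battle counts above; a quick check:

# win_ratio = num_wins / num_battles
print(6031 / 12194)   # 0.49458750205018864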
generation_params: {'temperature': 0.9, 'top_p': 0.9, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['\n', '</s>'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
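These generation parameters describe standard top-k/nucleus sampling with a min-p floor, repetition penalties, and best-of-8 reranking. A minimal sketch of passing them to a vLLM-style sampler; the log does not name the serving stack, so the engine, model path, and prompt below are assumptions:

from vllm import LLM, SamplingParams  # assumed serving stack, not confirmed by this log

sampling = SamplingParams(
    temperature=0.9,
    top_p=0.9,
    min_p=0.05,
    top_k=80,
    presence_penalty=0.5,
    frequency_penalty=0.5,
    stop=["\n", "</s>"],   # stopping_words
    max_tokens=64,         # max_output_tokens
    best_of=8,             # sample 8 completions, return the highest-scoring one
)
# max_input_tokens=1024 is a prompt-truncation limit applied before generation,
# not a SamplingParams field.
llm = LLM(model="path/to/retune_with_base")  # placeholder model path
outputs = llm.generate(["Bot: hello\nUser: hi there\nBot:"], sampling)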
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
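The formatter templates describe how a conversation is flattened into a single prompt: persona memory first, then the scenario prompt, then alternating user/bot turns, ending with the bare response prefix the model continues from. A minimal sketch of that assembly; the helper below is illustrative, not the pipeline's actual formatter code:

formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    # turns is a list of (speaker, message) pairs, speaker in {"user", "bot"}
    text = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    text += formatter["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        if speaker == "bot":
            text += formatter["bot_template"].format(bot_name=bot_name, message=message)
        else:
            text += formatter["user_template"].format(user_name=user_name, message=message)
    # the model completes the conversation from the bare "{bot_name}:" prefix
    return text + formatter["response_template"].format(bot_name=bot_name)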
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 4.761760473251343s
Received healthy response to inference request in 3.8544883728027344s
Received healthy response to inference request in 2.7835662364959717s
Received healthy response to inference request in 4.60717248916626s
Received healthy response to inference request in 2.8326070308685303s
5 requests
0 failed requests
5th percentile: 2.7933743953704835
10th percentile: 2.8031825542449953
20th percentile: 2.8227988719940185
30th percentile: 3.036983299255371
40th percentile: 3.4457358360290526
50th percentile: 3.8544883728027344
60th percentile: 4.155562019348144
70th percentile: 4.456635665893555
80th percentile: 4.6380900859832765
90th percentile: 4.699925279617309
95th percentile: 4.730842876434326
99th percentile: 4.75557695388794
mean time: 3.767918920516968
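The percentile figures are consistent with linear-interpolation percentiles (numpy's default) over the five response times; for example, reproducing this block's 5th percentile and mean:

import numpy as np

times = [4.761760473251343, 3.8544883728027344, 2.7835662364959717,
         4.60717248916626, 2.8326070308685303]
print(np.percentile(times, 5))   # 2.7933743953704835
print(np.mean(times))            # 3.767918920516968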
%s, retrying in %s seconds...
Received healthy response to inference request in 2.9246532917022705s
Received healthy response to inference request in 2.602668046951294s
Received healthy response to inference request in 1.2937300205230713s
Received healthy response to inference request in 2.963644027709961s
Received healthy response to inference request in 2.0978705883026123s
5 requests
0 failed requests
5th percentile: 1.4545581340789795
10th percentile: 1.6153862476348877
20th percentile: 1.937042474746704
30th percentile: 2.198830080032349
40th percentile: 2.4007490634918214
50th percentile: 2.602668046951294
60th percentile: 2.7314621448516845
70th percentile: 2.860256242752075
80th percentile: 2.9324514389038088
90th percentile: 2.948047733306885
95th percentile: 2.9558458805084227
99th percentile: 2.962084398269653
mean time: 2.376513195037842
Pipeline stage StressChecker completed in 33.03s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
run_pipeline:run_in_cloud %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 3.31s
Shutdown handler de-registered
function_mitul_2024-11-27 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3587.99s
Shutdown handler de-registered
function_mitul_2024-11-27 status is now inactive due to auto deactivation (removal of underperforming models)