submission_id: function_miduk_2024-11-14
developer_uid: chai_backend_admin
celo_rating: 1245.87
display_name: retune_with_base
family_friendly_score: 0.5786
family_friendly_standard_error: 0.006983151723971061
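The reported standard error is consistent with a binomial-proportion error sqrt(p*(1-p)/n), an assumption not stated anywhere in this log: with p = 0.5786, solving 0.006983 = sqrt(0.5786 * 0.4214 / n) gives n on the order of 5,000 scored samples.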
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
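A minimal sketch of how a formatter like this assembles the final model input (the function name and message representation below are hypothetical; the templates themselves are taken verbatim from the submission, and the assembly order memory, prompt, messages, response is an assumption):

    def build_model_input(memory, prompt, messages, bot_name, user_name):
        # messages: list of (speaker, text) pairs, where speaker is "bot" or "user".
        text = "{bot_name}'s Persona: {memory}\n####\n".format(bot_name=bot_name, memory=memory)
        text += "{prompt}\n<START>\n".format(prompt=prompt)
        for speaker, message in messages:
            if speaker == "bot":
                text += "{bot_name}: {message}\n".format(bot_name=bot_name, message=message)
            else:
                text += "{user_name}: {message}\n".format(user_name=user_name, message=message)
        # Generation continues right after the response template.
        return text + "{bot_name}:".format(bot_name=bot_name)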
generation_params: {'temperature': 0.9, 'top_p': 0.9, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['\n', '</s>'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
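These generation parameters map naturally onto a vLLM-style sampling configuration; a hedged sketch, where only the values come from this submission and the serving-side field names are an assumption:

    from vllm import SamplingParams

    sampling_params = SamplingParams(
        temperature=0.9,
        top_p=0.9,
        min_p=0.05,
        top_k=80,
        presence_penalty=0.5,
        frequency_penalty=0.5,
        stop=["\n", "</s>"],   # stopping_words
        max_tokens=64,         # max_output_tokens
        best_of=8,             # sample 8 completions, keep the best-scoring one
    )
    # max_input_tokens=1024 is a prompt-truncation limit and would be enforced
    # before sampling rather than inside SamplingParams.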
is_internal_developer: True
model_group:
model_name: retune_with_base
num_battles: 13196
num_wins: 6517
ranking_group: single
status: inactive
submission_type: function
timestamp: 2024-11-14T18:49:32+00:00
us_pacific_date: 2024-11-14
win_ratio: 0.4938617762958472
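Consistency check: win_ratio = num_wins / num_battles = 6517 / 13196 ≈ 0.4939, which matches the reported value.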
Resubmit model
Shutdown handler not registered because the Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 3.6844027042388916s
Received healthy response to inference request in 4.0500853061676025s
Received healthy response to inference request in 2.9325127601623535s
Received healthy response to inference request in 3.6337647438049316s
Received healthy response to inference request in 4.002493143081665s
5 requests
0 failed requests
5th percentile: 3.0727631568908693
10th percentile: 3.2130135536193847
20th percentile: 3.493514347076416
30th percentile: 3.6438923358917235
40th percentile: 3.6641475200653075
50th percentile: 3.6844027042388916
60th percentile: 3.811638879776001
70th percentile: 3.9388750553131104
80th percentile: 4.012011575698852
90th percentile: 4.031048440933228
95th percentile: 4.040566873550415
99th percentile: 4.048181619644165
mean time: 3.6606517314910887
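The percentile and mean lines above are consistent with numpy's default linearly interpolated percentiles over the five request latencies; a minimal sketch that reproduces them (the use of numpy is an assumption, the latencies are copied from the log):

    import numpy as np

    latencies = [3.6844027042388916, 4.0500853061676025, 2.9325127601623535,
                 3.6337647438049316, 4.002493143081665]

    for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f"{q}th percentile: {np.percentile(latencies, q)}")
    print("mean time:", np.mean(latencies))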
%s, retrying in %s seconds...
Received healthy response to inference request in 2.2711715698242188s
Received healthy response to inference request in 3.5353505611419678s
Received healthy response to inference request in 4.43052864074707s
Received healthy response to inference request in 3.390530586242676s
Received healthy response to inference request in 3.064406156539917s
5 requests
0 failed requests
5th percentile: 2.4298184871673585
10th percentile: 2.5884654045104982
20th percentile: 2.9057592391967773
30th percentile: 3.1296310424804688
40th percentile: 3.2600808143615723
50th percentile: 3.390530586242676
60th percentile: 3.4484585762023925
70th percentile: 3.506386566162109
80th percentile: 3.7143861770629885
90th percentile: 4.072457408905029
95th percentile: 4.25149302482605
99th percentile: 4.394721517562866
mean time: 3.33839750289917
Pipeline stage StressChecker completed in 37.79s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 6.44s
Shutdown handler de-registered
function_miduk_2024-11-14 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3296.82s
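For scale, 3296.82 s is roughly 55 minutes, by far the longest stage in this run.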
Shutdown handler de-registered
function_miduk_2024-11-14 status is now inactive due to auto-deactivation of underperforming models