submission_id: function_tenob_2024-11-12
developer_uid: chai_backend_admin
celo_rating: 1249.67
display_name: retune_with_base
family_friendly_score: 0.613
family_friendly_standard_error: 0.0068881202080103105
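The reported standard error is consistent with a plain binomial standard error sqrt(p(1-p)/n) at roughly n = 5000 scored samples. The sample size is an assumption inferred by inverting the formula; the record itself does not state n:

```python
import math

p = 0.613                              # family_friendly_score from the record
se_reported = 0.0068881202080103105    # family_friendly_standard_error

# ASSUMPTION: n = 5000 is inferred, not stated in the record.
n = 5000
se_binomial = math.sqrt(p * (1 - p) / n)

print(se_binomial)  # agrees with the reported value to ~1e-9
```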
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.9, 'top_p': 0.9, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['\n', '</s>'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
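A minimal sketch of how the formatter templates above might compose into a single inference prompt. The persona text, user/bot names, and message history here are illustrative assumptions; only the template strings come from the record:

```python
# Template strings copied from the submission's formatter field.
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, history):
    """Assemble a full prompt: persona block, then the scenario prompt,
    then the chat history, ending with the bot-response stub.

    `history` is a list of (role, message) pairs, role in {"user", "bot"}.
    This composition order is an assumption for illustration.
    """
    parts = [
        formatter["memory_template"].format(bot_name=bot_name, memory=memory),
        formatter["prompt_template"].format(prompt=prompt),
    ]
    for role, message in history:
        if role == "bot":
            parts.append(formatter["bot_template"].format(bot_name=bot_name, message=message))
        else:
            parts.append(formatter["user_template"].format(user_name=user_name, message=message))
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)

text = build_prompt("Tenob", "Alice", "a friendly robot", "Chat casually.",
                    [("user", "hi"), ("bot", "hello!")])
print(text)
```

The `max_input_tokens: 1024` generation parameter suggests the assembled prompt is truncated to fit, with `truncate_by_message: False` indicating mid-message truncation is allowed.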
is_internal_developer: True
model_group:
model_name: retune_with_base
num_battles: 14873
num_wins: 7556
ranking_group: single
status: inactive
submission_type: function
timestamp: 2024-11-12T20:00:08+00:00
us_pacific_date: 2024-11-12
win_ratio: 0.5080346937403348
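The recorded win_ratio is exactly num_wins / num_battles from the fields above, as a quick check confirms:

```python
num_battles = 14873
num_wins = 7556

win_ratio = num_wins / num_battles
print(win_ratio)  # matches the recorded win_ratio of 0.5080346937403348
```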
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 4.793699502944946s
Received healthy response to inference request in 3.4506711959838867s
Received healthy response to inference request in 3.320436954498291s
Received healthy response to inference request in 2.1281981468200684s
Received healthy response to inference request in 2.8754093647003174s
5 requests
0 failed requests
5th percentile: 2.277640390396118
10th percentile: 2.427082633972168
20th percentile: 2.7259671211242678
30th percentile: 2.964414882659912
40th percentile: 3.1424259185791015
50th percentile: 3.320436954498291
60th percentile: 3.3725306510925295
70th percentile: 3.4246243476867675
80th percentile: 3.719276857376099
90th percentile: 4.256488180160522
95th percentile: 4.525093841552734
99th percentile: 4.7399783706665035
mean time: 3.313683032989502
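With only five samples, the percentile figures above can be reproduced by linear interpolation between the sorted latencies (the "inclusive" quantile method, equivalent to numpy's default). A sketch using only the standard library:

```python
import statistics

# The five healthy-response latencies, in seconds, from the stress check above.
latencies = [4.793699502944946, 3.4506711959838867, 3.320436954498291,
             2.1281981468200684, 2.8754093647003174]

# quantiles(..., n=100) returns 99 cut points; index k-1 is the k-th percentile.
cuts = statistics.quantiles(latencies, n=100, method="inclusive")

print(cuts[4])                      # 5th percentile: 2.277640390396118
print(cuts[49])                     # 50th percentile (median): 3.320436954498291
print(cuts[94])                     # 95th percentile: 4.525093841552734
print(statistics.mean(latencies))   # mean time: ~3.313683
```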
Pipeline stage StressChecker completed in 17.88s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Connection pool is full, discarding connection: %s. Connection pool size: %s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 5.98s
Shutdown handler de-registered
function_tenob_2024-11-12 status is now deployed due to DeploymentManager action
Connection pool is full, discarding connection: %s. Connection pool size: %s
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3847.85s
Shutdown handler de-registered
function_tenob_2024-11-12 status is now inactive due to auto-deactivation of underperforming models