submission_id: function_luhir_2024-11-15
developer_uid: chai_backend_admin
celo_rating: 1249.92
display_name: retune_with_base
family_friendly_score: 0.5766
family_friendly_standard_error: 0.00698759529452014
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.9, 'top_p': 0.9, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['\n', '</s>'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
is_internal_developer: True
model_group:
model_name: retune_with_base
num_battles: 11148
num_wins: 5647
ranking_group: single
status: inactive
submission_type: function
timestamp: 2024-11-15T21:55:40+00:00
us_pacific_date: 2024-11-15
win_ratio: 0.5065482597775386
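
The reported win_ratio is simply num_wins / num_battles; a quick arithmetic check against the fields above (pure arithmetic, no assumptions):

    num_battles = 11148
    num_wins = 5647
    print(num_wins / num_battles)  # 0.5065482597775386, matching win_ratio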
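
A minimal sketch of how the formatter templates and generation_params above might be used at inference time. The render_prompt helper, the turn structure, and the vLLM-style SamplingParams mapping are assumptions made for illustration; the submission's actual serving code is not part of this record.

    # Assembling a prompt from the formatter templates (hypothetical helper).
    def render_prompt(formatter, bot_name, user_name, memory, prompt, turns):
        parts = [formatter["memory_template"].format(bot_name=bot_name, memory=memory),
                 formatter["prompt_template"].format(prompt=prompt)]
        for speaker, message in turns:  # turns: list of ("bot" | "user", text)
            if speaker == "bot":
                parts.append(formatter["bot_template"].format(bot_name=bot_name, message=message))
            else:
                parts.append(formatter["user_template"].format(user_name=user_name, message=message))
        parts.append(formatter["response_template"].format(bot_name=bot_name))
        return "".join(parts)

    # Mapping generation_params onto a vLLM-style SamplingParams (an assumption;
    # max_input_tokens=1024 would instead be enforced by truncating the prompt).
    from vllm import SamplingParams
    sampling_params = SamplingParams(
        temperature=0.9, top_p=0.9, min_p=0.05, top_k=80,
        presence_penalty=0.5, frequency_penalty=0.5,
        stop=["\n", "</s>"],   # stopping_words
        best_of=8,             # sample 8 candidates, keep the highest-scoring one
        max_tokens=64,         # max_output_tokens
    )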
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 3.3298442363739014s
Received healthy response to inference request in 3.361193895339966s
Received healthy response to inference request in 3.07641863822937s
Received healthy response to inference request in 3.1789464950561523s
Received healthy response to inference request in 3.0628411769866943s
5 requests
0 failed requests
5th percentile: 3.0655566692352294
10th percentile: 3.0682721614837645
20th percentile: 3.073703145980835
30th percentile: 3.0969242095947265
40th percentile: 3.1379353523254396
50th percentile: 3.1789464950561523
60th percentile: 3.239305591583252
70th percentile: 3.2996646881103517
80th percentile: 3.3361141681671143
90th percentile: 3.34865403175354
95th percentile: 3.354923963546753
99th percentile: 3.3599399089813233
mean time: 3.2018488883972167
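
The percentile summary above is consistent with the five response times; a sketch reproducing it, assuming numpy's default linear-interpolation percentiles (the StressChecker's actual aggregation code is not shown in this log, but the numbers agree):

    import numpy as np

    times = [3.3298442363739014, 3.361193895339966, 3.07641863822937,
             3.1789464950561523, 3.0628411769866943]
    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f"{p}th percentile: {np.percentile(times, p)}")
    print("mean time:", np.mean(times))  # 3.2018488883972167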
Pipeline stage StressChecker completed in 17.22s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Connection pool is full, discarding connection: %s. Connection pool size: %s (message repeated 11 times)
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 5.66s
Shutdown handler de-registered
function_luhir_2024-11-15 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3256.46s
Shutdown handler de-registered
function_luhir_2024-11-15 status is now inactive due to auto deactivation (removed underperforming models)