developer_uid: chai_backend_admin
submission_id: function_dijub_2024-11-15
model_name: retune_with_base
model_group:
status: inactive
timestamp: 2024-11-15T19:03:44+00:00
num_battles: 14060
num_wins: 7192
celo_rating: 1255.15
family_friendly_score: 0.5754
family_friendly_standard_error: 0.0069902051472041935
submission_type: function
display_name: retune_with_base
is_internal_developer: True
ranking_group: single
us_pacific_date: 2024-11-15
win_ratio: 0.5115220483641536
generation_params: {'temperature': 0.9, 'top_p': 0.9, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['\n', '</s>'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
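The formatter dict above fully determines how a persona, scenario prompt, and chat history are rendered into a single prompt string, and win_ratio is simply num_wins / num_battles. The sketch below is a minimal, hypothetical reconstruction of that assembly (the helper name build_prompt, the argument names, and the template ordering are assumptions; only the template strings themselves come from the formatter dict), plus a consistency check on the leaderboard numbers.

```python
# Minimal sketch, assuming the templates are applied in persona -> prompt ->
# history -> response-cue order; helper and variable names are invented.
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """Render persona, scenario prompt, chat history, and the response cue."""
    text = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    text += formatter["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        if speaker == "user":
            text += formatter["user_template"].format(user_name=user_name, message=message)
        else:
            text += formatter["bot_template"].format(bot_name=bot_name, message=message)
    return text + formatter["response_template"].format(bot_name=bot_name)

# Consistency check on the metadata above: win_ratio = num_wins / num_battles.
assert abs(7192 / 14060 - 0.5115220483641536) < 1e-12
```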
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
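The message above is expected CPython behaviour: signal handlers can only be installed from the main thread of the main interpreter (signal.signal raises ValueError elsewhere), so a registration helper typically checks the current thread first. A hedged sketch of that guard (the function name register_shutdown_handler and the choice of signals are assumptions, not the actual implementation):

```python
import logging
import signal
import threading

logger = logging.getLogger(__name__)

def register_shutdown_handler(handler):
    # signal.signal() raises ValueError outside the main thread, so guard first.
    if threading.current_thread() is not threading.main_thread():
        logger.info("Shutdown handler not registered because Python interpreter "
                    "is not running in the main thread")
        return
    signal.signal(signal.SIGTERM, handler)
    signal.signal(signal.SIGINT, handler)
    logger.info("Shutdown handler registered")
```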
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.588265895843506s
Received healthy response to inference request in 2.7092998027801514s
Received healthy response to inference request in 3.7468624114990234s
Received healthy response to inference request in 2.0507421493530273s
Received healthy response to inference request in 2.1014132499694824s
5 requests
0 failed requests
5th percentile: 2.0608763694763184
10th percentile: 2.0710105895996094
20th percentile: 2.0912790298461914
30th percentile: 2.198783779144287
40th percentile: 2.3935248374938967
50th percentile: 2.588265895843506
60th percentile: 2.636679458618164
70th percentile: 2.6850930213928224
80th percentile: 2.916812324523926
90th percentile: 3.331837368011475
95th percentile: 3.5393498897552487
99th percentile: 3.7053599071502683
mean time: 2.639316701889038
Pipeline stage StressChecker completed in 14.41s
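The percentile and mean figures above are consistent with linearly interpolated percentiles (NumPy's default) computed over the five logged latencies; the sketch below reproduces them (variable names are mine, not from the pipeline code).

```python
import numpy as np

# The five healthy-response latencies logged above, in seconds.
latencies = [2.588265895843506, 2.7092998027801514, 3.7468624114990234,
             2.0507421493530273, 2.1014132499694824]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print(f"mean time: {np.mean(latencies)}")
```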
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Connection pool is full, discarding connection: %s. Connection pool size: %s (repeated 11 times)
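This warning comes from urllib3 when more concurrent connections are opened to a host than the pool can hold, so surplus connections are discarded instead of being reused. If the cloud-trigger client is built on requests, a larger pool can be mounted on the session; a hedged sketch (the pool sizes are illustrative, not the values used here):

```python
import requests
from requests.adapters import HTTPAdapter

session = requests.Session()
# Allow more cached host pools and more connections per pool so that
# highly concurrent requests are reused rather than discarded.
adapter = HTTPAdapter(pool_connections=32, pool_maxsize=32)
session.mount("https://", adapter)
session.mount("http://", adapter)
```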
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 5.52s
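Each "Running pipeline stage ..." / "completed in ...s" pair suggests stages are timed by a simple wrapper. A minimal sketch of that pattern, assuming nothing about the actual pipeline framework (the function name and stage interface are invented):

```python
import logging
import time

logger = logging.getLogger(__name__)

def run_stage(stage):
    """Run a single pipeline stage and log its wall-clock duration."""
    name = type(stage).__name__
    logger.info("Running pipeline stage %s", name)
    start = time.time()
    stage.run()
    logger.info("Pipeline stage %s completed in %.2fs", name, time.time() - start)
```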
Shutdown handler de-registered
function_dijub_2024-11-15 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 2680.72s
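The scorer stage evaluates a batch of samples in parallel ("with %s threads") and produces the family_friendly_score and standard error reported in the metadata at the top. A hedged sketch of thread-parallel scoring (the function name, scorer, thread count, and standard-error formula are placeholders; the actual scoring model is not described here):

```python
from concurrent.futures import ThreadPoolExecutor
from math import sqrt
from statistics import mean, stdev

def score_samples(samples, score_fn, num_threads=8):
    """Score samples in parallel and return (mean score, standard error)."""
    with ThreadPoolExecutor(max_workers=num_threads) as pool:
        scores = list(pool.map(score_fn, samples))
    return mean(scores), stdev(scores) / sqrt(len(scores))
```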
Shutdown handler de-registered
function_dijub_2024-11-15 status is now inactive due to auto-deactivation of underperforming models