developer_uid: chai_backend_admin
submission_id: function_fopin_2024-11-17
model_name: retune_with_base
model_group:
status: inactive
timestamp: 2024-11-17T18:21:36+00:00
num_battles: 17398
num_wins: 9129
celo_rating: 1266.48
family_friendly_score: 0.598
family_friendly_standard_error: 0.0069339166421294686
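The family_friendly_standard_error is consistent with the usual standard error of a sample proportion. A minimal sketch, assuming the score is an average of roughly 5,000 binary ratings (the sample size is not stated in the log and is chosen here only to reproduce the reported figure):

```python
import math

p = 0.598   # family_friendly_score from the log
n = 5000    # assumed number of ratings; NOT stated in the log
se = math.sqrt(p * (1 - p) / n)
# se ≈ 0.006934, matching family_friendly_standard_error above
```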
submission_type: function
display_name: retune_with_base
is_internal_developer: True
ranking_group: single
us_pacific_date: 2024-11-17
win_ratio: 0.5247154845384527
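The win_ratio follows directly from the battle counts above:

```python
num_battles = 17398
num_wins = 9129
win_ratio = num_wins / num_battles
# 0.5247154845384527, exactly the win_ratio reported above
```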
generation_params: {'temperature': 0.9, 'top_p': 0.9, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['\n', '</s>'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
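A minimal sketch of how these generation_params could be passed to a vLLM-style sampler; the serving stack is not named in the log, so the mapping below is an illustrative assumption only:

```python
from vllm import SamplingParams  # assumed backend; not stated in the log

params = {
    'temperature': 0.9, 'top_p': 0.9, 'min_p': 0.05, 'top_k': 80,
    'presence_penalty': 0.5, 'frequency_penalty': 0.5,
    'stopping_words': ['\n', '</s>'],
    'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64,
}

sampling = SamplingParams(
    temperature=params['temperature'],
    top_p=params['top_p'],
    min_p=params['min_p'],
    top_k=params['top_k'],
    presence_penalty=params['presence_penalty'],
    frequency_penalty=params['frequency_penalty'],
    stop=params['stopping_words'],           # stopping_words -> stop
    best_of=params['best_of'],
    max_tokens=params['max_output_tokens'],  # max_output_tokens -> max_tokens
)
# max_input_tokens is a prompt-side limit: it would be applied when truncating
# the conversation history before the request, not inside SamplingParams.
```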
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
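A minimal sketch of how the formatter templates might be assembled into a single prompt; only the template strings come from the log, while the names, messages, and assembly order are illustrative assumptions:

```python
formatter = {
    'memory_template': "{bot_name}'s Persona: {memory}\n####\n",
    'prompt_template': '{prompt}\n<START>\n',
    'bot_template': '{bot_name}: {message}\n',
    'user_template': '{user_name}: {message}\n',
    'response_template': '{bot_name}:',
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """turns: list of (speaker, message) pairs, speaker in {'bot', 'user'}."""
    text = formatter['memory_template'].format(bot_name=bot_name, memory=memory)
    text += formatter['prompt_template'].format(prompt=prompt)
    for speaker, message in turns:
        if speaker == 'bot':
            text += formatter['bot_template'].format(bot_name=bot_name, message=message)
        else:
            text += formatter['user_template'].format(user_name=user_name, message=message)
    # The model continues from "{bot_name}:" and stops at "\n" (see stopping_words above).
    return text + formatter['response_template'].format(bot_name=bot_name)

example = build_prompt('Fopin', 'User', 'A friendly companion.', 'A casual chat.',
                       [('user', 'Hi there!'), ('bot', 'Hello!'), ('user', 'How are you?')])
```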
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 4.428696393966675s
Received healthy response to inference request in 3.843454599380493s
Received healthy response to inference request in 3.9111194610595703s
Received healthy response to inference request in 3.3758456707000732s
Received healthy response to inference request in 3.9924752712249756s
5 requests
0 failed requests
5th percentile: 3.4693674564361574
10th percentile: 3.562889242172241
20th percentile: 3.749932813644409
30th percentile: 3.8569875717163087
40th percentile: 3.8840535163879393
50th percentile: 3.9111194610595703
60th percentile: 3.9436617851257325
70th percentile: 3.9762041091918947
80th percentile: 4.079719495773316
90th percentile: 4.254207944869995
95th percentile: 4.341452169418335
99th percentile: 4.4112475490570064
mean time: 3.9103182792663573
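The StressChecker summary can be reproduced from the five response times above; numpy's default linear-interpolation percentiles match the reported values, and the second batch below follows the same computation:

```python
import numpy as np

latencies = [4.428696393966675, 3.843454599380493, 3.9111194610595703,
             3.3758456707000732, 3.9924752712249756]

for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {np.percentile(latencies, q)}")
print("mean time:", np.mean(latencies))
```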
%s, retrying in %s seconds...
Received healthy response to inference request in 3.5514633655548096s
Received healthy response to inference request in 3.6865055561065674s
Received healthy response to inference request in 4.42016077041626s
Received healthy response to inference request in 3.6462173461914062s
Received healthy response to inference request in 2.8083882331848145s
5 requests
0 failed requests
5th percentile: 2.9570032596588134
10th percentile: 3.1056182861328123
20th percentile: 3.4028483390808106
30th percentile: 3.570414161682129
40th percentile: 3.6083157539367674
50th percentile: 3.6462173461914062
60th percentile: 3.662332630157471
70th percentile: 3.6784479141235353
80th percentile: 3.833236598968506
90th percentile: 4.126698684692383
95th percentile: 4.273429727554321
99th percentile: 4.390814561843872
mean time: 3.6225470542907714
Pipeline stage StressChecker completed in 40.18s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 2.59s
Shutdown handler de-registered
function_fopin_2024-11-17 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3282.15s
Shutdown handler de-registered
function_fopin_2024-11-17 status is now inactive due to auto deactivation of underperforming models