submission_id: function_notom_2024-08-20
developer_uid: chai_backend_admin
alignment_samples: 226914
alignment_score: 1.9236176679285926
celo_rating: 1216.11
display_name: gpt4-tl
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.1, 'top_k': 100, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', 'You:'], 'max_input_tokens': 512, 'best_of': 8, 'max_output_tokens': 64}
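The formatter and generation_params above describe how each inference request is built and sampled. Below is a minimal sketch of how a dict-based formatter like this one could be applied; the assemble_prompt helper, the bot/user names, and the sample conversation are illustrative assumptions, not the platform's actual code.

```python
# Sketch only: assumes a simple dict-based formatter like the one above.
FORMATTER = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def assemble_prompt(bot_name, user_name, memory, prompt, turns):
    """Concatenate the template pieces into the text sent for inference."""
    text = FORMATTER["memory_template"].format(bot_name=bot_name, memory=memory)
    text += FORMATTER["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        if speaker == bot_name:
            text += FORMATTER["bot_template"].format(bot_name=speaker, message=message)
        else:
            text += FORMATTER["user_template"].format(user_name=speaker, message=message)
    # response_template leaves the bot's turn open for the model to complete.
    return text + FORMATTER["response_template"].format(bot_name=bot_name)

print(assemble_prompt(
    bot_name="Ava", user_name="You",
    memory="A friendly assistant.", prompt="A casual chat.",
    turns=[("You", "Hi!"), ("Ava", "Hello!"), ("You", "How are you?")],
))
```

The generation_params then govern sampling against this prompt: with stopping_words of '\n' and 'You:', generation halts at the end of the bot's line, and best_of: 8 presumably samples eight candidates and keeps the highest-scoring one.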
is_internal_developer: True
model_group:
model_name: gpt4-tl
num_battles: 226914
num_wins: 107437
propriety_score: 0.7920161019792016
propriety_total_count: 20867.0
ranking_group: single
status: torndown
submission_type: function
timestamp: 2024-08-20T23:33:36+00:00
us_pacific_date: 2024-08-20
win_ratio: 0.47347012524568777
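Two of these fields follow from the others: win_ratio is num_wins divided by num_battles, and us_pacific_date is the UTC timestamp converted to US Pacific time. A quick check (the snippet is illustrative only):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# win_ratio = num_wins / num_battles
assert abs(107437 / 226914 - 0.47347012524568777) < 1e-9

# us_pacific_date is the timestamp rendered in US Pacific time
ts = datetime.fromisoformat("2024-08-20T23:33:36+00:00")
assert str(ts.astimezone(ZoneInfo("America/Los_Angeles")).date()) == "2024-08-20"
```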

Running pipeline stage StressChecker
Received healthy response to inference request in 1.4289507865905762s
Received healthy response to inference request in 2.128384590148926s
Received healthy response to inference request in 2.201862096786499s
Received healthy response to inference request in 2.8264729976654053s
Received healthy response to inference request in 2.3618953227996826s
5 requests
0 failed requests
5th percentile: 1.568837547302246
10th percentile: 1.708724308013916
20th percentile: 1.988497829437256
30th percentile: 2.1430800914764405
40th percentile: 2.1724710941314695
50th percentile: 2.201862096786499
60th percentile: 2.2658753871917723
70th percentile: 2.329888677597046
80th percentile: 2.454810857772827
90th percentile: 2.640641927719116
95th percentile: 2.7335574626922607
99th percentile: 2.8078898906707765
mean time: 2.189513158798218
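For reference, the reported percentiles and mean are consistent with linear interpolation over the five response times above, which is the default behaviour of numpy.percentile. A sketch that reproduces the figures:

```python
import numpy as np

# Response times (seconds) from the five healthy inference requests above.
times = [
    1.4289507865905762,
    2.128384590148926,
    2.201862096786499,
    2.8264729976654053,
    2.3618953227996826,
]

# numpy's default linear interpolation reproduces the reported percentiles,
# e.g. the 90th percentile of ~2.6406 and the mean of ~2.1895.
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(times, p)}")
print("mean time:", np.mean(times))
```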
Pipeline stage StressChecker completed in 11.53s
function_notom_2024-08-20 status is now deployed due to DeploymentManager action
function_notom_2024-08-20 status is now inactive due to auto-deactivation of underperforming models
function_notom_2024-08-20 status is now torndown due to DeploymentManager action

Charts (not reproduced here): Usage Metrics, Latency Metrics