submission_id: function_lotel_2024-08-19
developer_uid: chai_backend_admin
alignment_samples: 11148
alignment_score: 2.982407739228581
celo_rating: 1161.99
display_name: gpt4-tl
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.1, 'top_k': 100, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', 'You:'], 'max_input_tokens': 512, 'best_of': 8, 'max_output_tokens': 64}
is_internal_developer: True
model_group:
model_name: gpt4-tl
num_battles: 11148
num_wins: 4496
propriety_score: 0.7958762886597938
propriety_total_count: 970.0
ranking_group: single
status: torndown
submission_type: function
timestamp: 2024-08-19T06:48:32+00:00
us_pacific_date: 2024-08-18
win_ratio: 0.4033010405453893
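The win fields above are internally consistent: win_ratio is simply num_wins / num_battles, and, under the assumption that propriety_score is the fraction of rated conversations judged proper, the score implies roughly 772 of the 970 rated conversations passed. A quick check in plain Python (values copied from the fields above; the propriety interpretation is an assumption, not documented here):

```python
num_wins = 4496
num_battles = 11148
print(num_wins / num_battles)  # 0.4033010405453893 -- matches the reported win_ratio

propriety_score = 0.7958762886597938
propriety_total_count = 970
# Assumption: propriety_score = (responses judged proper) / propriety_total_count
print(propriety_score * propriety_total_count)  # ~772.0 implied "proper" responses
```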
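The formatter field is a set of plain string templates that the serving layer fills in to assemble the text sent to the model. The sketch below shows one way such templates combine into a single prompt; build_prompt is a hypothetical illustration, not the actual Chai serving code, and truncation to max_input_tokens (512) is omitted.

```python
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """Hypothetical helper: assemble the prompt from the templates above.

    `turns` is a list of (speaker, message) pairs, speaker being "bot" or
    "user". Truncation to max_input_tokens is not shown.
    """
    text = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    text += formatter["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        if speaker == "bot":
            text += formatter["bot_template"].format(bot_name=bot_name, message=message)
        else:
            text += formatter["user_template"].format(user_name=user_name, message=message)
    # The model continues after the trailing "{bot_name}:" marker; generation
    # then stops at a newline or "You:" (see stopping_words above).
    return text + formatter["response_template"].format(bot_name=bot_name)

print(build_prompt("Aria", "You", "a friendly guide", "You meet Aria at the gate.",
                   [("bot", "Hello!"), ("user", "Hi there.")]))
```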
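generation_params layers three truncation rules on top of temperature 1.0 sampling: top_k=100, top_p=1.0 (effectively disabled), and min_p=0.1, with best_of=8 candidate completions. The numpy sketch below illustrates how such filters are conventionally applied to a token distribution; it describes the general technique, not the exact serving implementation.

```python
import numpy as np

def filter_probs(probs, top_k=100, top_p=1.0, min_p=0.1):
    """Illustrative sketch: apply top-k, top-p and min-p truncation to a
    token probability distribution, then renormalise."""
    order = np.argsort(probs)[::-1]            # token indices, most likely first
    keep = np.zeros_like(probs, dtype=bool)

    # top-k: only the k most likely tokens are eligible
    keep[order[:top_k]] = True

    # top-p (nucleus): keep the smallest prefix of sorted tokens whose
    # cumulative probability reaches top_p
    cum = np.cumsum(probs[order])
    n_nucleus = np.searchsorted(cum, top_p) + 1
    keep[order[n_nucleus:]] = False

    # min-p: drop tokens whose probability is below min_p * max(probs)
    keep &= probs >= min_p * probs.max()

    filtered = np.where(keep, probs, 0.0)
    return filtered / filtered.sum()

probs = np.array([0.50, 0.30, 0.15, 0.04, 0.01])
print(filter_probs(probs))  # min_p=0.1 removes the 0.04 and 0.01 tokens
```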
Running pipeline stage StressChecker
Received healthy response to inference request in 1.356203556060791s
Received healthy response to inference request in 2.3373427391052246s
Received healthy response to inference request in 1.081770896911621s
Received healthy response to inference request in 4.530778646469116s
Received healthy response to inference request in 3.274817705154419s
5 requests
0 failed requests
5th percentile: 1.136657428741455
10th percentile: 1.191543960571289
20th percentile: 1.301317024230957
30th percentile: 1.5524313926696778
40th percentile: 1.9448870658874513
50th percentile: 2.3373427391052246
60th percentile: 2.7123327255249023
70th percentile: 3.08732271194458
80th percentile: 3.5260098934173585
90th percentile: 4.028394269943237
95th percentile: 4.279586458206176
99th percentile: 4.480540208816528
mean time: 2.516182708740234
Pipeline stage StressChecker completed in 13.13s
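The percentile summary above is consistent with the five raw latencies: numpy's default linear-interpolation percentiles reproduce the reported figures to within floating-point rounding. A quick check (latencies copied from the log lines above):

```python
import numpy as np

# The five healthy-response latencies reported above, in seconds.
latencies = [1.356203556060791, 2.3373427391052246, 1.081770896911621,
             4.530778646469116, 3.274817705154419]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")

print("mean time:", np.mean(latencies))  # ~2.5162 s, the reported mean
```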
function_lotel_2024-08-19 status is now deployed due to DeploymentManager action
function_lotel_2024-08-19 status is now inactive due to auto deactivation of underperforming models
admin requested tearing down of function_lotel_2024-08-19
Deleting key cycy233-l3-e-v2-c1-v36/special_tokens_map.json from bucket guanaco-mkml-models
Shutdown handler not registered because Python interpreter is not running in the main thread
Deleting key cycy233-l3-e-v2-c1-v36/tokenizer.json from bucket guanaco-mkml-models
run pipeline %s
Shutdown handler de-registered
Deleting key cycy233-l3-e-v2-c1-v36/tokenizer_config.json from bucket guanaco-mkml-models
function_lotel_2024-08-19 status is now torndown due to DeploymentManager action
