developer_uid: rirv938
submission_id: function_bogan_2025-02-13
model_name: retune_with_base
model_group:
status: torndown
timestamp: 2025-02-13T21:39:14+00:00
num_battles: 7675
num_wins: 3964
celo_rating: 1278.2
family_friendly_score: 0.5609999999999999
family_friendly_standard_error: 0.007018247644533499
submission_type: function
display_name: retune_with_base
is_internal_developer: True
ranking_group: single
us_pacific_date: 2025-02-13
win_ratio: 0.5164820846905538
generation_params: {'temperature': 0.9, 'top_p': 0.9, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['\n', '</s>'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
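The logged generation parameters can be mirrored as a plain Python dict. This is a hedged sketch only: the values are copied from the log line above, while the comments describing each knob reflect common sampling semantics, not documentation of this specific serving stack.

```python
# Sampling parameters copied from the submission log; comments are general
# descriptions of what each knob typically controls in LLM sampling.
generation_params = {
    "temperature": 0.9,        # softmax temperature; <1.0 sharpens, >1.0 flattens
    "top_p": 0.9,              # nucleus sampling: keep smallest set with mass >= 0.9
    "min_p": 0.05,             # drop tokens below 5% of the top token's probability
    "top_k": 80,               # keep only the 80 most likely tokens
    "presence_penalty": 0.5,   # penalize tokens that have appeared at all
    "frequency_penalty": 0.5,  # penalize tokens proportionally to their count
    "stopping_words": ["\n", "</s>"],  # stop generation on newline or EOS
    "max_input_tokens": 1024,  # prompt truncation budget
    "best_of": 8,              # sample 8 candidates, return the best-scored one
    "max_output_tokens": 64,   # cap on generated tokens per response
}
```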
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
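The formatter templates above can be applied with ordinary `str.format` calls. The sketch below is an assumption about how the pieces compose (persona block, prompt, chat history, then the response stub); the bot name, user name, persona, and messages are hypothetical examples, not values from the log.

```python
# Templates copied verbatim from the log; the assembly order (memory, prompt,
# history, response stub) is an assumption for illustration.
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, history):
    """history: list of (speaker, message) pairs; speaker is 'bot' or 'user'."""
    parts = [
        formatter["memory_template"].format(bot_name=bot_name, memory=memory),
        formatter["prompt_template"].format(prompt=prompt),
    ]
    for speaker, message in history:
        if speaker == "bot":
            parts.append(formatter["bot_template"].format(
                bot_name=bot_name, message=message))
        else:
            parts.append(formatter["user_template"].format(
                user_name=user_name, message=message))
    # Response template has no trailing newline: the model completes this line.
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)

# Hypothetical example values:
text = build_prompt("Bot", "User", "a friendly assistant",
                    "Greet the user.", [("user", "Hi!")])
```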
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.4580044746398926s
Received healthy response to inference request in 2.8335652351379395s
Received healthy response to inference request in 2.375598430633545s
Received healthy response to inference request in 3.2520108222961426s
Received healthy response to inference request in 3.1130402088165283s
5 requests
0 failed requests
5th percentile: 2.3920796394348143
10th percentile: 2.408560848236084
20th percentile: 2.4415232658386232
30th percentile: 2.5331166267395018
40th percentile: 2.6833409309387206
50th percentile: 2.8335652351379395
60th percentile: 2.945355224609375
70th percentile: 3.0571452140808106
80th percentile: 3.140834331512451
90th percentile: 3.196422576904297
95th percentile: 3.2242166996002197
99th percentile: 3.246451997756958
mean time: 2.8064438343048095
Pipeline stage StressChecker completed in 15.28s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.65s
Shutdown handler de-registered
function_bogan_2025-02-13 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 4323.10s
Shutdown handler de-registered
function_bogan_2025-02-13 status is now inactive due to auto-deactivation of underperforming models
function_bogan_2025-02-13 status is now torndown due to DeploymentManager action
Unable to record family friendly update due to error: HTTPConnectionPool(host='chaiml-nemo-guard-merged-v2-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Max retries exceeded with url: /v1/models/GPT-J-6B-lit-v2:predict (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x7b0ec843e710>, 'Connection to chaiml-nemo-guard-merged-v2-predictor.tenant-chaiml-guanaco.k.chaiverse.com timed out. (connect timeout=12.0)'))
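The failure above is a connect timeout (12.0s) against the guard-model predictor endpoint, after which urllib3 exhausted its retries. A common mitigation is to wrap the update call in a bounded-retry loop with backoff. The sketch below shows only the retry logic; `with_retries` and its parameters are hypothetical, not the pipeline's actual error handling, and the real call would catch `requests.ConnectTimeout` rather than bare `Exception`.

```python
import time

def with_retries(fn, attempts=3, backoff=1.0, sleep=time.sleep):
    """Call fn(); on failure, retry up to `attempts` total tries with
    linearly increasing backoff (backoff, 2*backoff, ...)."""
    last_err = None
    for i in range(attempts):
        try:
            return fn()
        except Exception as err:  # in practice: requests.ConnectTimeout
            last_err = err
            if i < attempts - 1:
                sleep(backoff * (i + 1))
    raise last_err

# Demonstration with a stand-in call that fails twice, then succeeds.
calls = {"n": 0}

def flaky_update():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("connect timeout=12.0")
    return "recorded"

result = with_retries(flaky_update, attempts=3, sleep=lambda s: None)
```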
ChatRequest
Generation Params
Prompt Formatter
Chat History
ChatMessage 1