developer_uid: chai_backend_admin
submission_id: function_gurak_2025-12-16
model_name: kimid-v5a-noname-q235-lr1e4r64ep2
model_group:
status: torndown
timestamp: 2025-12-19T02:31:13+00:00
num_battles: 5769
num_wins: 3114
celo_rating: 1321.17
family_friendly_score: 0.0
family_friendly_standard_error: 0.0
submission_type: function
display_name: kimid-v5a-noname-q235-lr1e4r64ep2
is_internal_developer: True
ranking_group: single
us_pacific_date: 2025-12-18
win_ratio: 0.5397815912636506
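For reference, win_ratio is simply num_wins divided by num_battles; a quick check in Python using the values above:

    num_battles = 5769
    num_wins = 3114
    print(num_wins / num_battles)  # 0.5397815912636506, matching the reported win_ratio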
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
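These generation_params map onto a standard sampling configuration. A minimal sketch, assuming a vLLM-style SamplingParams object (the actual serving stack is not named in this record); max_input_tokens is a prompt-truncation limit handled outside the sampler:

    from vllm import SamplingParams

    # Values copied from generation_params above.
    # best_of=8 generates eight candidate completions and returns the best-scoring one;
    # stop=['\n'] ends generation at the first newline (stopping_words).
    params = SamplingParams(
        temperature=1.0,
        top_p=1.0,
        min_p=0.0,
        top_k=40,
        presence_penalty=0.0,
        frequency_penalty=0.0,
        stop=['\n'],
        best_of=8,
        max_tokens=64,  # max_output_tokens
    )
    # max_input_tokens=1024 would be enforced by truncating the prompt before generation.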
formatter: {'memory_template': '### Instruction:\n{memory}\n', 'prompt_template': '### Input:\n{prompt}\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '### Response:\n{bot_name}:', 'truncate_by_message': False}
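The formatter templates determine how the bot's memory, prompt, and chat history are rendered into a single prompt string. A minimal sketch of the assembly (the build_prompt helper and the sample conversation are illustrative, not the platform's actual formatter code; the templates are the ones above):

    FORMATTER = {
        'memory_template': '### Instruction:\n{memory}\n',
        'prompt_template': '### Input:\n{prompt}\n',
        'bot_template': '{bot_name}: {message}\n',
        'user_template': '{user_name}: {message}\n',
        'response_template': '### Response:\n{bot_name}:',
    }

    def build_prompt(memory, prompt, turns, bot_name):
        """Render memory, prompt, and (speaker, message, is_bot) turns into one prompt string."""
        text = FORMATTER['memory_template'].format(memory=memory)
        text += FORMATTER['prompt_template'].format(prompt=prompt)
        for speaker, message, is_bot in turns:
            if is_bot:
                text += FORMATTER['bot_template'].format(bot_name=speaker, message=message)
            else:
                text += FORMATTER['user_template'].format(user_name=speaker, message=message)
        return text + FORMATTER['response_template'].format(bot_name=bot_name)

    # Hypothetical example conversation:
    print(build_prompt('Stay in character.', 'A tavern at night.',
                       [('Anon', 'Hello there!', False)], bot_name='Bot'))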
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
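The unhealthy response above corresponds to the 20-second read timeout against the guanaco-submitter endpoint. A minimal sketch of such a timed health probe with requests (the endpoint path, payload, and probe function are assumptions; the log does not show the actual StressChecker code):

    import requests

    URL = 'http://guanaco-submitter.guanaco-backend.k2.chaiverse.com'  # host from the timeout message

    def probe(payload, read_timeout=20):
        """Return elapsed seconds for one inference request, or None if it times out or is unhealthy."""
        try:
            response = requests.post(URL, json=payload, timeout=read_timeout)
            response.raise_for_status()
            return response.elapsed.total_seconds()
        except requests.exceptions.RequestException:
            # Covers ReadTimeout as well as non-2xx (unhealthy) responses.
            return None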
Received healthy response to inference request in 3.328378438949585s
Received healthy response to inference request in 3.655064344406128s
Received healthy response to inference request in 3.134660243988037s
Received healthy response to inference request in 1.9389371871948242s
Received healthy response to inference request in 3.820186138153076s
Received healthy response to inference request in 2.088041305541992s
Received healthy response to inference request in 2.1421103477478027s
Received healthy response to inference request in 1.9440295696258545s
Received healthy response to inference request in 2.236009359359741s
10 requests
1 failed requests
5th percentile: 1.9412287592887878
10th percentile: 1.9435203313827514
20th percentile: 2.0592389583587645
30th percentile: 2.1258896350860597
40th percentile: 2.1984497547149657
50th percentile: 2.685334801673889
60th percentile: 3.212147521972656
70th percentile: 3.426384210586548
80th percentile: 3.6880887031555174
90th percentile: 5.448449110984797
95th percentile: 12.775632488727553
99th percentile: 18.637379190921784
mean time: 4.439023280143738
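The percentile and mean figures above are consistent with linear-interpolation percentiles over all ten request times, with the failed request counted at roughly 20.1 s. A sketch that reproduces them (healthy latencies copied from the log; the failed request's elapsed time is an assumption, back-solved from the reported mean):

    import numpy as np

    healthy = [3.328378438949585, 3.655064344406128, 3.134660243988037,
               1.9389371871948242, 3.820186138153076, 2.088041305541992,
               2.1421103477478027, 1.9440295696258545, 2.236009359359741]
    failed = 10 * 4.439023280143738 - sum(healthy)  # ~20.10 s, the timed-out request
    times = healthy + [failed]

    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f'{p}th percentile: {np.percentile(times, p)}')
    print('mean time:', np.mean(times))

The same computation applies to the second, fully healthy batch of ten requests below.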
%s, retrying in %s seconds...
Received healthy response to inference request in 2.1294238567352295s
Received healthy response to inference request in 1.7043633460998535s
Received healthy response to inference request in 1.9042811393737793s
Received healthy response to inference request in 2.5617237091064453s
Received healthy response to inference request in 2.2269630432128906s
Received healthy response to inference request in 2.6154708862304688s
Received healthy response to inference request in 1.9250004291534424s
Received healthy response to inference request in 2.7054011821746826s
Received healthy response to inference request in 2.8661067485809326s
Received healthy response to inference request in 2.457242488861084s
10 requests
0 failed requests
5th percentile: 1.7943263530731202
10th percentile: 1.8842893600463868
20th percentile: 1.9208565711975099
30th percentile: 2.068096828460693
40th percentile: 2.1879473686218263
50th percentile: 2.3421027660369873
60th percentile: 2.4990349769592286
70th percentile: 2.5778478622436523
80th percentile: 2.6334569454193115
90th percentile: 2.7214717388153074
95th percentile: 2.7937892436981198
99th percentile: 2.85164324760437
mean time: 2.309597682952881
Pipeline stage StressChecker completed in 70.40s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.56s
Shutdown handler de-registered
function_gurak_2025-12-16 status is now deployed due to DeploymentManager action
function_gurak_2025-12-16 status is now inactive due to auto deactivation removed underperforming models
function_gurak_2025-12-16 status is now torndown due to DeploymentManager action