Shutdown handler not registered because Python interpreter is not running in the main thread
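This first line indicates the stage was launched from a worker thread: Python's `signal.signal()` may only be called from the main thread of the main interpreter (it raises `ValueError` elsewhere), so handler registration is skipped there. A minimal sketch of such a guard, assuming a hypothetical helper name and handler body not taken from the pipeline code:

```python
import logging
import signal
import threading

logger = logging.getLogger(__name__)

def _shutdown(signum, frame):
    # Placeholder cleanup hook; the real handler body is not shown in the log.
    logger.info("Shutting down after signal %s", signum)

def register_shutdown_handler():
    # signal.signal() only works from the main thread of the main interpreter;
    # anywhere else it raises ValueError, hence the log line above.
    if threading.current_thread() is not threading.main_thread():
        logger.warning(
            "Shutdown handler not registered because Python interpreter "
            "is not running in the main thread"
        )
        return
    signal.signal(signal.SIGTERM, _shutdown)
    logger.info("Shutdown handler registered")
```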
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 3.3038649559020996s
Received healthy response to inference request in 2.6958322525024414s
Received healthy response to inference request in 3.336091995239258s
Received healthy response to inference request in 3.0999929904937744s
Received healthy response to inference request in 4.453721046447754s
Received healthy response to inference request in 4.9840099811553955s
Received healthy response to inference request in 3.84428071975708s
Received healthy response to inference request in 4.587963819503784s
Received healthy response to inference request in 2.6446409225463867s
10 requests
1 failed requests
5th percentile: 2.6676770210266114
10th percentile: 2.690713119506836
20th percentile: 3.019160842895508
30th percentile: 3.242703366279602
40th percentile: 3.3232011795043945
50th percentile: 3.590186357498169
60th percentile: 4.088056850433349
70th percentile: 4.493993878364563
80th percentile: 4.667173051834107
90th percentile: 6.495507359504694
95th percentile: 13.297245562076553
99th percentile: 18.738636124134064
mean time: 5.304938244819641
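The StressChecker summary above is consistent with collecting one elapsed time per request, including the timed-out one (roughly the 20 s read timeout), and computing linearly interpolated percentiles plus an arithmetic mean; that single failure is what pulls the 95th/99th percentiles and the mean so far above the healthy latencies. A rough sketch of that summary, assuming NumPy's default percentile interpolation (the function name is hypothetical):

```python
import numpy as np

def summarise_stress_check(elapsed_seconds):
    # elapsed_seconds holds one entry per request, failed or not; a request
    # that hit the 20s read timeout contributes ~20s and skews the tail.
    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f"{p}th percentile: {np.percentile(elapsed_seconds, p)}")
    print(f"mean time: {np.mean(elapsed_seconds)}")
```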
%s, retrying in %s seconds...
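The retry message is a standard log-and-sleep pattern around the inference request. A hedged sketch of what such a wrapper could look like; the helper name, retry count, and delay are assumptions, not the pipeline's actual values:

```python
import logging
import time

import requests

logger = logging.getLogger(__name__)

def post_with_retries(url, payload, retries=3, delay=5, timeout=20):
    # Hypothetical helper: on failure it logs the exception and the delay,
    # producing messages of the form "<error>, retrying in <delay> seconds...".
    last_exc = None
    for _ in range(retries):
        try:
            response = requests.post(url, json=payload, timeout=timeout)
            response.raise_for_status()
            return response
        except requests.RequestException as exc:
            last_exc = exc
            logger.warning("%s, retrying in %s seconds...", exc, delay)
            time.sleep(delay)
    raise last_exc
```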
Received healthy response to inference request in 3.2222070693969727s
Received healthy response to inference request in 2.4488840103149414s
Received healthy response to inference request in 2.680532455444336s
Received healthy response to inference request in 2.503709077835083s
Received healthy response to inference request in 6.585968494415283s
Received healthy response to inference request in 3.2372944355010986s
Received healthy response to inference request in 2.482318639755249s
Received healthy response to inference request in 4.91286563873291s
Received healthy response to inference request in 4.230975866317749s
Received healthy response to inference request in 6.066768407821655s
10 requests
0 failed requests
5th percentile: 2.46392959356308
10th percentile: 2.4789751768112183
20th percentile: 2.4994309902191163
30th percentile: 2.62748544216156
40th percentile: 3.005537223815918
50th percentile: 3.2297507524490356
60th percentile: 3.634767007827758
70th percentile: 4.435542798042297
80th percentile: 5.143646192550659
90th percentile: 6.118688416481018
95th percentile: 6.35232845544815
99th percentile: 6.539240486621857
mean time: 3.8371524095535277
Pipeline stage StressChecker completed in 94.16s
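Each stage is announced, executed, and timed, producing the paired "Running pipeline stage ..." and "... completed in ...s" lines seen throughout this log. A minimal sketch of such a runner, with class and method names assumed rather than taken from the real code:

```python
import logging
import time

logger = logging.getLogger(__name__)

def run_pipeline(stages):
    # Hypothetical runner: logs the stage name, executes it, and reports
    # the elapsed wall-clock time, mirroring the log lines above.
    for stage in stages:
        name = type(stage).__name__
        logger.info("run pipeline stage %s", name)
        logger.info("Running pipeline stage %s", name)
        start = time.time()
        stage.run()
        logger.info("Pipeline stage %s completed in %.2fs", name, time.time() - start)
```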
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.61s
Shutdown handler de-registered
function_mahul_2025-12-29 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Generating Leaderboard row for %s
Generated Leaderboard row for %s
Pipeline stage OfflineFamilyFriendlyScorer completed in 2865.25s
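The OfflineFamilyFriendlyScorer lines report evaluating the score with a fixed number of threads, which maps naturally onto a thread pool over the samples being scored. A rough sketch under that assumption; every name below (function, arguments, aggregation) is hypothetical:

```python
import logging
from concurrent.futures import ThreadPoolExecutor

logger = logging.getLogger(__name__)

def evaluate_family_friendly_score(submission_id, samples, score_fn, num_threads=8):
    # Hypothetical scorer: fans per-sample scoring out over a thread pool
    # and averages the results into a single leaderboard value.
    logger.info(
        "Evaluating %s Family Friendly Score with %s threads",
        submission_id, num_threads,
    )
    with ThreadPoolExecutor(max_workers=num_threads) as pool:
        scores = list(pool.map(score_fn, samples))
    return sum(scores) / len(scores)
```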
Shutdown handler de-registered
function_mahul_2025-12-29 status is now inactive due to auto deactivation removed underperforming models
function_mahul_2025-12-29 status is now torndown due to DeploymentManager action