Shutdown handler not registered because the Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 2.739283800125122s
Received healthy response to inference request in 1.7495520114898682s
Received healthy response to inference request in 3.199032783508301s
Received healthy response to inference request in 1.6850628852844238s
Received healthy response to inference request in 3.790717601776123s
Received healthy response to inference request in 2.752941131591797s
Received healthy response to inference request in 2.957923412322998s
Received healthy response to inference request in 2.161381483078003s
Received healthy response to inference request in 1.9070894718170166s
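The ten lines above are one stress-check pass: each inference request is timed, a 20-second read timeout marks a request unhealthy, and successful requests log their latency. Below is a minimal sketch of such a timed health check, assuming the requests library; the /chat path, payload shape, and function name are hypothetical, since the log only shows the host guanaco-submitter.guanaco-backend.k2.chaiverse.com and the timeout value.

```python
import time

import requests

# Hypothetical endpoint path; only the host appears in the timeout message above.
INFERENCE_URL = "http://guanaco-submitter.guanaco-backend.k2.chaiverse.com/chat"


def timed_inference_request(payload: dict, read_timeout: float = 20.0) -> float | None:
    """Send one inference request; return its latency in seconds, or None if it fails."""
    start = time.time()
    try:
        response = requests.post(INFERENCE_URL, json=payload, timeout=read_timeout)
        response.raise_for_status()
    except requests.exceptions.RequestException as error:
        # A ReadTimeout here produces a message like the HTTPConnectionPool line above.
        print(error)
        print("Received unhealthy response to inference request!")
        return None
    elapsed = time.time() - start
    print(f"Received healthy response to inference request in {elapsed}s")
    return elapsed
```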
10 requests
1 failed request
5th percentile: 1.7140829920768739
10th percentile: 1.7431030988693237
20th percentile: 1.875581979751587
30th percentile: 2.085093879699707
40th percentile: 2.5081228733062746
50th percentile: 2.7461124658584595
60th percentile: 2.834934043884277
70th percentile: 3.030256223678589
80th percentile: 3.3173697471618655
90th percentile: 5.423291897773737
95th percentile: 12.769876229763014
99th percentile: 18.647143695354465
mean time: 4.305944514274597
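The summary block above reduces the per-request latencies to percentiles and a mean. The printed values are consistent with numpy-style linear interpolation between order statistics, and the elevated 90th, 95th, and 99th percentiles in this first pass suggest the single timed-out request was counted at roughly its 20-second duration; both points are inferences from the numbers, not something the log states. A sketch of such a summary, with the function name as an assumption:

```python
import numpy as np


def summarise_latencies(latencies: list[float], num_failed: int) -> None:
    """Print request counts, latency percentiles, and the mean in the log's format."""
    print(f"{len(latencies)} requests")
    print(f"{num_failed} failed requests")
    for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        # np.percentile defaults to linear interpolation between order statistics.
        print(f"{q}th percentile: {np.percentile(latencies, q)}")
    print(f"mean time: {np.mean(latencies)}")
```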
%s, retrying in %s seconds...
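The un-interpolated "%s, retrying in %s seconds..." template points to a generic retry wrapper that logs the error and sleeps before trying again; since a fresh batch of ten requests follows, the whole stress check appears to be retried. A sketch of such a wrapper, with the attempt count and delay chosen as assumptions:

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")


def with_retries(func: Callable[[], T], max_attempts: int = 3, delay_seconds: float = 30.0) -> T:
    """Call func, logging each failure and sleeping before the next attempt."""
    for attempt in range(1, max_attempts + 1):
        try:
            return func()
        except Exception as error:
            if attempt == max_attempts:
                raise
            # Mirrors the retry message template seen in the log above.
            print("%s, retrying in %s seconds..." % (error, delay_seconds))
            time.sleep(delay_seconds)
    raise AssertionError("unreachable: max_attempts must be at least 1")
```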
Received healthy response to inference request in 2.365729331970215s
Received healthy response to inference request in 4.133402109146118s
Received healthy response to inference request in 3.9536075592041016s
Received healthy response to inference request in 1.7349326610565186s
Received healthy response to inference request in 2.5516767501831055s
Received healthy response to inference request in 1.7063567638397217s
Received healthy response to inference request in 2.261765718460083s
Received healthy response to inference request in 1.8063735961914062s
Received healthy response to inference request in 2.361905813217163s
Received healthy response to inference request in 1.6981592178344727s
10 requests
0 failed requests
5th percentile: 1.7018481135368346
10th percentile: 1.7055370092391968
20th percentile: 1.7292174816131591
30th percentile: 1.78494131565094
40th percentile: 2.0796088695526125
50th percentile: 2.311835765838623
60th percentile: 2.363435220718384
70th percentile: 2.421513557434082
80th percentile: 2.832062911987305
90th percentile: 3.971587014198303
95th percentile: 4.05249456167221
99th percentile: 4.117220599651337
mean time: 2.4573909521102903
Pipeline stage StressChecker completed in 70.83s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.64s
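Each stage follows the same logging pattern: a "run pipeline stage %s" template, a "Running pipeline stage ..." line, and a completion line with the wall-clock duration. A sketch of a stage runner that would emit these lines, assuming a stage object with a run() method; the real stage interface is not visible in the log:

```python
import time


def run_pipeline_stage(stage) -> None:
    """Run one pipeline stage, logging its name and wall-clock duration."""
    name = type(stage).__name__
    print("run pipeline stage %s" % name)
    print(f"Running pipeline stage {name}")
    start = time.time()
    stage.run()  # Assumed interface for illustration only.
    print(f"Pipeline stage {name} completed in {time.time() - start:.2f}s")
```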
Shutdown handler de-registered
function_rusen_2025-12-17 status is now deployed due to DeploymentManager action
function_rusen_2025-12-17 status is now inactive due to auto deactivation of underperforming models
function_rusen_2025-12-17 status is now torndown due to DeploymentManager action
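The last three lines trace the submission's deployment lifecycle: deployed, then deactivated as an underperforming model, then torn down. A small sketch of the statuses and status-line format implied by those messages; the enum and helper names are assumptions:

```python
from enum import Enum


class SubmissionStatus(str, Enum):
    # Status names taken from the three log lines above; any other states are unknown here.
    DEPLOYED = "deployed"
    INACTIVE = "inactive"
    TORNDOWN = "torndown"


def log_status_change(submission_id: str, status: SubmissionStatus, reason: str) -> None:
    """Emit a status line in the same shape as the messages above."""
    print(f"{submission_id} status is now {status.value} due to {reason}")


# Example producing a line like the first status message:
log_status_change("function_rusen_2025-12-17", SubmissionStatus.DEPLOYED, "DeploymentManager action")
```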