Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 3.049119234085083s
Received healthy response to inference request in 3.854022264480591s
Received healthy response to inference request in 2.962453603744507s
Received healthy response to inference request in 3.1398026943206787s
Received healthy response to inference request in 2.7665812969207764s
Received healthy response to inference request in 3.203869342803955s
Received healthy response to inference request in 3.5310168266296387s
Received healthy response to inference request in 2.7686824798583984s
Received healthy response to inference request in 3.400799036026001s
10 requests
1 failed request
5th percentile: 2.767526829242706
10th percentile: 2.768472361564636
20th percentile: 2.923699378967285
30th percentile: 3.0231195449829102
40th percentile: 3.1035293102264405
50th percentile: 3.171836018562317
60th percentile: 3.2826412200927733
70th percentile: 3.4398643732070924
80th percentile: 3.595617914199829
90th percentile: 5.479439353942865
95th percentile: 12.793816256523115
99th percentile: 18.645317778587344
mean time: 4.878453993797303
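The StressChecker stage evidently fires a batch of inference requests at the guanaco-submitter endpoint with a 20-second read timeout and logs each response as healthy or unhealthy before summarizing latencies. A minimal sketch of such a loop, assuming a plain requests POST; the URL path, payload shape, and function name are illustrative rather than the pipeline's actual API:

import time
import requests

# The host appears in the timeout message above; the path and payload are assumed.
SUBMITTER_URL = "http://guanaco-submitter.guanaco-backend.k2.chaiverse.com/chat"
READ_TIMEOUT = 20  # seconds, matching "read timeout=20" in the log

def stress_check(payload: dict, num_requests: int = 10) -> list[float]:
    """Send num_requests inference requests and return per-request latencies."""
    latencies = []
    failed = 0
    for _ in range(num_requests):
        start = time.time()
        try:
            response = requests.post(SUBMITTER_URL, json=payload, timeout=READ_TIMEOUT)
            response.raise_for_status()
            healthy = True
        except requests.RequestException:
            healthy = False
        elapsed = time.time() - start
        latencies.append(elapsed)  # a timed-out request still contributes a ~20s sample
        if healthy:
            print(f"Received healthy response to inference request in {elapsed}s")
        else:
            failed += 1
            print("Received unhealthy response to inference request!")
    print(f"{num_requests} requests")
    print(f"{failed} failed requests")
    return latencies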
%s, retrying in %s seconds...
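The un-interpolated "%s, retrying in %s seconds..." message indicates the stage retries a failed call after a delay. A generic retry-with-delay sketch; the retry count, delay, and function name are assumptions, not values taken from the pipeline:

import time

def call_with_retries(func, retries: int = 3, delay: float = 5.0):
    """Call func, retrying on failure with a fixed delay between attempts."""
    for attempt in range(retries):
        try:
            return func()
        except Exception as error:
            if attempt == retries - 1:
                raise
            print("%s, retrying in %s seconds..." % (error, delay))
            time.sleep(delay)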
Received healthy response to inference request in 2.821181058883667s
Received healthy response to inference request in 6.668602228164673s
Received healthy response to inference request in 1.9847311973571777s
Received healthy response to inference request in 2.871647596359253s
Received healthy response to inference request in 1.7203543186187744s
Received healthy response to inference request in 3.2538371086120605s
Received healthy response to inference request in 1.7728936672210693s
Received healthy response to inference request in 3.6980528831481934s
Received healthy response to inference request in 3.2944746017456055s
Received healthy response to inference request in 1.8548054695129395s
10 requests
0 failed requests
5th percentile: 1.7439970254898072
10th percentile: 1.76763973236084
20th percentile: 1.8384231090545655
30th percentile: 1.9457534790039062
40th percentile: 2.4866011142730713
50th percentile: 2.84641432762146
60th percentile: 3.024523401260376
70th percentile: 3.266028356552124
80th percentile: 3.375190258026123
90th percentile: 3.9951078176498402
95th percentile: 5.331855022907254
99th percentile: 6.40125278711319
mean time: 2.9940580129623413
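The reported statistics are consistent with numpy-style linear-interpolation percentiles over the per-request latencies: for the second batch, interpolating between the two fastest responses matches the printed 5th percentile of 1.7439970254898072, and the mean (2.9940580129623413) is the plain average of the ten times. The first batch's 95th/99th percentiles (12.8s, 18.6s) and 4.88s mean are likewise consistent with the ~20s timed-out request being counted alongside the nine healthy latencies. A sketch of how such a summary could be produced; the function name is illustrative:

import numpy as np

def summarize_latencies(latencies: list[float]) -> None:
    """Print a percentile/mean summary like the StressChecker output above."""
    for pct in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        # np.percentile defaults to linear interpolation, which reproduces
        # values such as "5th percentile: 1.7439970254898072" above.
        print(f"{pct}th percentile: {np.percentile(latencies, pct)}")
    print(f"mean time: {np.mean(latencies)}")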
Pipeline stage StressChecker completed in 82.95s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.60s
Shutdown handler de-registered
function_ludur_2025-12-16 status is now deployed due to DeploymentManager action
function_ludur_2025-12-16 status is now inactive due to auto deactivation: removed underperforming models
function_ludur_2025-12-16 status is now torndown due to DeploymentManager action
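The final three lines trace the deployment lifecycle observed for function_ludur_2025-12-16: deployed, then inactive (auto deactivation of underperforming models), then torn down. A minimal sketch of that progression as an enum; the class and the transition order are inferred from these log lines only, not from the DeploymentManager's actual code:

from enum import Enum

class DeploymentStatus(Enum):
    DEPLOYED = "deployed"    # set by DeploymentManager action
    INACTIVE = "inactive"    # set by auto deactivation of underperforming models
    TORNDOWN = "torndown"    # set by DeploymentManager action

# Observed order for function_ludur_2025-12-16:
LIFECYCLE = [DeploymentStatus.DEPLOYED, DeploymentStatus.INACTIVE, DeploymentStatus.TORNDOWN]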