Shutdown handler not registered because Python interpreter is not running in the main thread
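This first line is expected when the pipeline is started from a worker thread: CPython only allows signal handlers to be installed from the main thread of the main interpreter, so the shutdown hook is skipped. A minimal sketch of the kind of guard that would log this message (the function and handler names are assumptions, not the pipeline's actual code):

    import logging
    import signal
    import threading

    logger = logging.getLogger(__name__)

    def register_shutdown_handler(handler):
        # signal.signal() raises ValueError when called outside the main
        # thread, so only install the handler when we really are in it.
        if threading.current_thread() is threading.main_thread():
            signal.signal(signal.SIGINT, handler)
            signal.signal(signal.SIGTERM, handler)
        else:
            logger.warning("Shutdown handler not registered because Python "
                           "interpreter is not running in the main thread")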
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 3.5482261180877686s
Received healthy response to inference request in 1.8781800270080566s
Received healthy response to inference request in 2.8522145748138428s
Received healthy response to inference request in 2.9009554386138916s
Received healthy response to inference request in 2.682223320007324s
Received healthy response to inference request in 2.1267895698547363s
Received healthy response to inference request in 2.8299524784088135s
Received healthy response to inference request in 3.6174960136413574s
Received healthy response to inference request in 4.001025438308716s
10 requests
1 failed requests
5th percentile: 1.9900543212890625
10th percentile: 2.1019286155700683
20th percentile: 2.5711365699768067
30th percentile: 2.7856337308883665
40th percentile: 2.843309736251831
50th percentile: 2.876585006713867
60th percentile: 3.159863710403442
70th percentile: 3.569007086753845
80th percentile: 3.6942018985748293
90th percentile: 5.611923861503596
95th percentile: 12.860966765880567
99th percentile: 18.660201089382173
mean time: 4.654707264900208
%s, retrying in %s seconds...
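The retry notice above suggests that failed or timed-out inference requests are re-attempted after a fixed delay before the next batch is measured. A minimal sketch of such a retry wrapper (names and defaults are hypothetical, not taken from the pipeline):

    import logging
    import time

    logger = logging.getLogger(__name__)

    def call_with_retries(request_fn, max_attempts=3, delay_seconds=5):
        # On failure, log the error and the wait in the same
        # "<error>, retrying in <n> seconds..." form seen above, then retry.
        for attempt in range(1, max_attempts + 1):
            try:
                return request_fn()
            except Exception as exc:
                if attempt == max_attempts:
                    raise
                logger.warning("%s, retrying in %s seconds...", exc, delay_seconds)
                time.sleep(delay_seconds)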
Received healthy response to inference request in 6.467210292816162s
Received healthy response to inference request in 3.33941388130188s
Received healthy response to inference request in 3.8203766345977783s
Received healthy response to inference request in 1.7100160121917725s
Received healthy response to inference request in 3.2666990756988525s
Received healthy response to inference request in 1.8945870399475098s
Received healthy response to inference request in 2.067107677459717s
Received healthy response to inference request in 3.647458076477051s
Received healthy response to inference request in 3.3870086669921875s
Received healthy response to inference request in 4.344493389129639s
10 requests
0 failed requests
5th percentile: 1.7930729746818543
10th percentile: 1.876129937171936
20th percentile: 2.032603549957275
30th percentile: 2.9068216562271116
40th percentile: 3.310327959060669
50th percentile: 3.3632112741470337
60th percentile: 3.4911884307861327
70th percentile: 3.699333643913269
80th percentile: 3.9251999855041504
90th percentile: 4.55676507949829
95th percentile: 5.511987686157225
99th percentile: 6.276165771484376
mean time: 3.394437074661255
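The summary statistics are consistent with linearly interpolated percentiles over the per-request latencies; in the first batch the timed-out request appears to be counted at its elapsed time of roughly 20 s, which explains the much larger tail percentiles and mean there. The sketch below recomputes the second batch's figures with NumPy (an assumption about the method, not the StressChecker's actual code):

    import numpy as np

    # Latencies in seconds from the second batch of 10 healthy requests above.
    latencies = [
        6.467210292816162, 3.33941388130188, 3.8203766345977783,
        1.7100160121917725, 3.2666990756988525, 1.8945870399475098,
        2.067107677459717, 3.647458076477051, 3.3870086669921875,
        4.344493389129639,
    ]

    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        # np.percentile's default linear interpolation reproduces the logged
        # values, e.g. 5th percentile 1.7930..., 99th percentile 6.2761...
        print(f"{p}th percentile: {np.percentile(latencies, p)}")
    print(f"mean time: {np.mean(latencies)}")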
Pipeline stage StressChecker completed in 83.10s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.58s
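Each stage above follows the same shape: a "run pipeline stage %s" template line, a "Running pipeline stage <Name>" message, and a "completed in <seconds>s" timing on exit. A minimal sketch of a stage runner that would emit lines like these (class and method names are assumptions):

    import logging
    import time

    logger = logging.getLogger(__name__)

    def run_pipeline_stage(stage):
        # Log the stage name, run it, and report the wall-clock duration,
        # mirroring the "Pipeline stage ... completed in ...s" lines above.
        name = type(stage).__name__
        logger.info("run pipeline stage %s", name)
        logger.info("Running pipeline stage %s", name)
        start = time.monotonic()
        result = stage.run()
        logger.info("Pipeline stage %s completed in %.2fs",
                    name, time.monotonic() - start)
        return result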
Shutdown handler de-registered
function_dabab_2025-12-13 status is now deployed due to DeploymentManager action
function_dabab_2025-12-13 status is now inactive due to auto deactivation (removed underperforming models)