Shutdown handler not registered because the Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 2.680288791656494s
Received healthy response to inference request in 3.0811657905578613s
Received healthy response to inference request in 2.9382145404815674s
Received healthy response to inference request in 2.280606269836426s
Received healthy response to inference request in 3.6746909618377686s
Received healthy response to inference request in 2.192897319793701s
Received healthy response to inference request in 2.0960569381713867s
Received healthy response to inference request in 6.143524169921875s
Received healthy response to inference request in 2.3315861225128174s
10 requests
1 failed request
5th percentile: 2.139635109901428
10th percentile: 2.1832132816314695
20th percentile: 2.263064479827881
30th percentile: 2.3162921667099
40th percentile: 2.5408077239990234
50th percentile: 2.8092516660690308
60th percentile: 2.995395040512085
70th percentile: 3.2592233419418335
80th percentile: 4.16845760345459
90th percentile: 7.542162966728205
95th percentile: 13.836037552356705
99th percentile: 18.87113722085953
mean time: 4.7548943042755125
%s, retrying in %s seconds...
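The "%s, retrying in %s seconds..." message above implies a retry loop around the failing request. A minimal sketch of such a loop, assuming hypothetical names (`with_retries`, `max_attempts`, `backoff_seconds` are not from the log):

```python
import time

def with_retries(fn, max_attempts=3, backoff_seconds=5.0):
    """Call fn(), retrying on failure with a fixed delay between attempts.

    Hypothetical helper: the actual retry policy behind the
    '%s, retrying in %s seconds...' log line is not shown in the log.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception as exc:
            if attempt == max_attempts:
                raise  # out of attempts; propagate the last error
            # mirrors the log message format seen above
            print(f"{exc}, retrying in {backoff_seconds} seconds...")
            time.sleep(backoff_seconds)
```

In this log the stress check is rerun after the timeout, which is consistent with a wrapper of this shape around the whole request batch.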
Received healthy response to inference request in 3.391737699508667s
Received healthy response to inference request in 2.9261932373046875s
Received healthy response to inference request in 2.1064414978027344s
Received healthy response to inference request in 3.7268521785736084s
Received healthy response to inference request in 2.2062759399414062s
Received healthy response to inference request in 2.7913177013397217s
Received healthy response to inference request in 3.3614487648010254s
Received healthy response to inference request in 3.274064540863037s
Received healthy response to inference request in 3.5391674041748047s
Received healthy response to inference request in 2.9879791736602783s
10 requests
0 failed requests
5th percentile: 2.1513669967651365
10th percentile: 2.196292495727539
20th percentile: 2.6743093490600587
30th percentile: 2.885730576515198
40th percentile: 2.963264799118042
50th percentile: 3.1310218572616577
60th percentile: 3.3090182304382325
70th percentile: 3.370535445213318
80th percentile: 3.4212236404418945
90th percentile: 3.557935881614685
95th percentile: 3.6423940300941466
99th percentile: 3.7099605488777163
mean time: 3.031147813796997
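The percentile summaries above are consistent with linear interpolation between closest ranks (numpy's default percentile method). A small sketch that reproduces them from the per-request latencies; the helper names are hypothetical:

```python
def percentile(sorted_xs, p):
    """p-th percentile with linear interpolation between closest ranks."""
    k = (len(sorted_xs) - 1) * p / 100.0
    f = int(k)                               # lower rank
    c = min(f + 1, len(sorted_xs) - 1)       # upper rank
    return sorted_xs[f] + (sorted_xs[c] - sorted_xs[f]) * (k - f)

def summarize(latencies):
    """Emit a stats block in the same shape as the log above."""
    xs = sorted(latencies)
    lines = [f"{len(xs)} requests"]
    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        lines.append(f"{p}th percentile: {percentile(xs, p)}")
    lines.append(f"mean time: {sum(xs) / len(xs)}")
    return lines
```

Feeding it the ten healthy latencies of the second run reproduces the reported 50th percentile (3.131...) and mean (3.031...); the first run's high tail percentiles follow from the one timed-out request counting at roughly the 20 s read timeout.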
Pipeline stage StressChecker completed in 82.15s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 1.66s
Shutdown handler de-registered
function_luham_2025-12-16 status is now deployed due to DeploymentManager action
function_luham_2025-12-16 status is now inactive due to auto deactivation (removed underperforming models)
function_luham_2025-12-16 status is now torndown due to DeploymentManager action
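The three status lines above trace a deployed -> inactive -> torndown lifecycle. A hypothetical model of those transitions (the real DeploymentManager state machine is not shown in the log, and the allowed edges here are assumptions):

```python
# Assumed transition table; "torndown" is treated as terminal.
ALLOWED_TRANSITIONS = {
    "deployed": {"inactive", "torndown"},
    "inactive": {"deployed", "torndown"},
    "torndown": set(),
}

def transition(status, new_status):
    """Validate and apply one status change."""
    if new_status not in ALLOWED_TRANSITIONS[status]:
        raise ValueError(f"illegal transition {status} -> {new_status}")
    return new_status
```

Under this model the log's sequence (deployed, then inactive via auto deactivation, then torndown) is a valid path, and nothing can leave the torndown state.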