Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 4.782130241394043s
Received healthy response to inference request in 2.570821762084961s
Received healthy response to inference request in 2.2051854133605957s
Received healthy response to inference request in 7.318361759185791s
Received healthy response to inference request in 4.217477560043335s
Received healthy response to inference request in 3.747164487838745s
Received healthy response to inference request in 3.7472031116485596s
Received healthy response to inference request in 1.9856383800506592s
Received healthy response to inference request in 2.3671364784240723s
10 requests
1 failed request
5th percentile: 2.0844345450401307
10th percentile: 2.1832307100296022
20th percentile: 2.334746265411377
30th percentile: 2.5097161769866942
40th percentile: 3.2766273975372315
50th percentile: 3.7471837997436523
60th percentile: 3.9353128910064696
70th percentile: 4.386873364448547
80th percentile: 5.289376544952393
90th percentile: 8.912126016616815
95th percentile: 16.08406517505644
99th percentile: 21.82161650180817
mean time: 5.6197123527526855
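The percentile summary above can be reproduced from the per-request latencies with linear interpolation between sorted samples (the same default `numpy.percentile` uses). The StressChecker's actual implementation is not shown in the log, so this is a hedged sketch, not the pipeline's code:

```python
def percentile(samples: list[float], p: float) -> float:
    """p-th percentile of samples, linear interpolation between ranks."""
    xs = sorted(samples)
    k = (len(xs) - 1) * p / 100.0          # fractional rank of the p-th percentile
    lo, hi = int(k), min(int(k) + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

def summarize(samples: list[float]) -> dict[str, float]:
    """Build a summary shaped like the log block above."""
    pcts = [5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99]
    out = {f"{p}th percentile": percentile(samples, p) for p in pcts}
    out["mean time"] = sum(samples) / len(samples)
    return out
```

Note that the 95th and 99th percentiles in the first block exceed every logged healthy latency, which suggests the timed-out request's duration is included in the sample set alongside the healthy ones.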
%s, retrying in %s seconds...
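The unexpanded `%s, retrying in %s seconds...` template points to a retry-with-delay wrapper around the inference call. A minimal sketch, assuming a fixed delay and exception-driven retries (the function name and policy are assumptions, not taken from the pipeline):

```python
import time

def with_retries(fn, attempts: int = 3, delay: float = 1.0):
    """Call fn(), retrying on exception with a fixed delay between tries."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception as exc:
            if attempt == attempts - 1:
                raise  # out of attempts: propagate the last error
            # Corresponds to the "<error>, retrying in <delay> seconds..." log line
            print(f"{exc}, retrying in {delay} seconds...")
            time.sleep(delay)
```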
Received healthy response to inference request in 2.338163375854492s
Received healthy response to inference request in 2.412771701812744s
Received healthy response to inference request in 2.254366874694824s
Received healthy response to inference request in 2.4328269958496094s
Received healthy response to inference request in 2.5367515087127686s
Received healthy response to inference request in 1.9910345077514648s
Received healthy response to inference request in 4.394622564315796s
Received healthy response to inference request in 2.5298409461975098s
Received healthy response to inference request in 2.490708827972412s
Received healthy response to inference request in 2.966543674468994s
10 requests
0 failed requests
5th percentile: 2.1095340728759764
10th percentile: 2.2280336380004884
20th percentile: 2.3214040756225587
30th percentile: 2.3903892040252686
40th percentile: 2.4248048782348635
50th percentile: 2.4617679119110107
60th percentile: 2.5063616752624513
70th percentile: 2.5319141149520874
80th percentile: 2.622709941864014
90th percentile: 3.1093515634536737
95th percentile: 3.7519870638847337
99th percentile: 4.266095464229584
mean time: 2.6347630977630616
Pipeline stage StressChecker completed in 172.62s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 8.21s
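The paired `Running pipeline stage X` / `Pipeline stage X completed in Ys` lines suggest each stage runs inside a timing wrapper. A sketch of one plausible shape, as a context manager (the name `timed_stage` and the use of `print` rather than a logger are assumptions):

```python
import time
from contextlib import contextmanager

@contextmanager
def timed_stage(name: str):
    """Log stage start, run the body, then log elapsed wall-clock time."""
    print(f"Running pipeline stage {name}")
    start = time.time()
    yield
    print(f"Pipeline stage {name} completed in {time.time() - start:.2f}s")
```

Usage would look like `with timed_stage("StressChecker"): run_checks()`, producing the bracketing log lines seen above.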
Shutdown handler de-registered
function_seluk_2026-03-16 status is now deployed due to DeploymentManager action
function_seluk_2026-03-16 status is now inactive due to auto deactivation (removed underperforming models)