Shutdown handler not registered because Python interpreter is not running in the main thread
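A guard like the following typically produces the line above, since Python's signal.signal() may only be called from the main thread; this is a minimal sketch, with register_shutdown_handler and the shutdown callback as hypothetical names:

    import logging
    import signal
    import threading

    logger = logging.getLogger(__name__)

    def register_shutdown_handler(shutdown):
        # signal.signal() raises ValueError outside the main thread, so skip
        # registration there and log instead of crashing the worker.
        if threading.current_thread() is not threading.main_thread():
            logger.warning(
                "Shutdown handler not registered because Python interpreter "
                "is not running in the main thread"
            )
            return
        for sig in (signal.SIGTERM, signal.SIGINT):
            signal.signal(sig, lambda signum, frame: shutdown())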
run pipeline %s
run pipeline stage StressChecker
Running pipeline stage StressChecker
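The health probes below come from a request loop of roughly this shape; a minimal sketch, assuming the requests library, a hypothetical /v1/inference path on the 127.0.0.1:8080 target seen in the log, and a hypothetical send_probe helper:

    import logging
    import time

    import requests

    logger = logging.getLogger(__name__)

    INFERENCE_URL = "http://127.0.0.1:8080/v1/inference"  # hypothetical path on the target seen in the log

    def send_probe(payload, timeout=30.0):
        """Send one inference request; return (healthy, elapsed_seconds)."""
        start = time.time()
        try:
            response = requests.post(INFERENCE_URL, json=payload, timeout=timeout)
            healthy = response.status_code == 200
        except requests.RequestException:
            # Connection resets (as seen below) and timeouts count as unhealthy.
            healthy = False
        elapsed = time.time() - start
        if healthy:
            logger.info("Received healthy response to inference request in %ss", elapsed)
        else:
            logger.error("Received unhealthy response to inference request!")
        return healthy, elapsed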
Received healthy response to inference request in 3.386390447616577s
Received healthy response to inference request in 3.3171908855438232s
Received healthy response to inference request in 3.8593223094940186s
Received healthy response to inference request in 4.547959566116333s
read tcp 127.0.0.1:45934->127.0.0.1:8080: read: connection reset by peer
Received unhealthy response to inference request!
Received healthy response to inference request in 1.7073421478271484s
Received healthy response to inference request in 2.9578447341918945s
Received healthy response to inference request in 2.1184206008911133s
Received healthy response to inference request in 2.9654977321624756s
Received healthy response to inference request in 4.556678295135498s
10 requests
1 failed request
5th percentile: 0.8207186818122864
10th percentile: 1.5461378812789917
20th percentile: 2.0362049102783204
30th percentile: 2.70601749420166
40th percentile: 2.9624365329742433
50th percentile: 3.1413443088531494
60th percentile: 3.3448707103729247
70th percentile: 3.5282700061798096
80th percentile: 3.9970497608184816
90th percentile: 4.548831439018249
95th percentile: 4.552754867076874
99th percentile: 4.555893609523773
mean time: 2.9511946201324464
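The 5th and 10th percentiles sit below every healthy latency, which suggests the failed request's elapsed time is included in the sample, and the reported values are consistent with linear interpolation across all ten measurements; a sketch of such a summary, assuming numpy and a hypothetical summarize_latencies helper:

    import logging

    import numpy as np

    logger = logging.getLogger(__name__)

    PERCENTILES = (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99)

    def summarize_latencies(elapsed_times, failed_count):
        """Log request counts, latency percentiles, and the mean latency."""
        logger.info("%d requests", len(elapsed_times))
        unit = "failed request" if failed_count == 1 else "failed requests"
        logger.info("%d %s", failed_count, unit)
        for pct in PERCENTILES:
            # np.percentile interpolates linearly between samples by default.
            logger.info("%dth percentile: %s", pct, np.percentile(elapsed_times, pct))
        logger.info("mean time: %s", np.mean(elapsed_times))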
%s, retrying in %s seconds...
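The retry line reflects a catch, log, and sleep pattern; a minimal sketch, with run_with_retries, the check callable, and the attempt/delay defaults as assumptions:

    import logging
    import time

    logger = logging.getLogger(__name__)

    def run_with_retries(check, max_attempts=3, delay_seconds=30):
        """Run a check, logging and sleeping between failed attempts."""
        for attempt in range(1, max_attempts + 1):
            try:
                return check()
            except Exception as exc:
                if attempt == max_attempts:
                    raise
                logger.warning("%s, retrying in %s seconds...", exc, delay_seconds)
                time.sleep(delay_seconds)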
Received healthy response to inference request in 1.9894046783447266s
Received healthy response to inference request in 3.0373992919921875s
Received healthy response to inference request in 4.763426065444946s
Received healthy response to inference request in 1.9139554500579834s
Received healthy response to inference request in 2.717938184738159s
Received healthy response to inference request in 6.6732237339019775s
Received healthy response to inference request in 3.5987608432769775s
Received healthy response to inference request in 2.6930854320526123s
Received healthy response to inference request in 2.8646581172943115s
Received healthy response to inference request in 3.671624183654785s
10 requests
0 failed requests
5th percentile: 1.947907602787018
10th percentile: 1.9818597555160522
20th percentile: 2.5523492813110353
30th percentile: 2.710482358932495
40th percentile: 2.8059701442718508
50th percentile: 2.9510287046432495
60th percentile: 3.261943912506103
70th percentile: 3.62061984539032
80th percentile: 3.8899845600128176
90th percentile: 4.954405832290648
95th percentile: 5.813814783096311
99th percentile: 6.501341943740845
mean time: 3.3923475980758666
Pipeline stage StressChecker completed in 66.14s
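The per-stage timing ("completed in 66.14s") can be taken with a monotonic clock around the stage body; a sketch, with run_stage and stage_fn as hypothetical names:

    import logging
    import time

    logger = logging.getLogger(__name__)

    def run_stage(name, stage_fn):
        """Run one pipeline stage and log its wall-clock duration."""
        logger.info("run pipeline stage %s", name)
        logger.info("Running pipeline stage %s", name)
        start = time.monotonic()
        result = stage_fn()
        logger.info("Pipeline stage %s completed in %.2fs", name, time.monotonic() - start)
        return result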
run pipeline stage OfflineFamilyFriendlyTriggerPipeline
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.92s
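The trigger stage logs once before and once after handing off the downstream run; a sketch of that pattern, assuming a hypothetical run_in_cloud wrapper and that trigger_guanaco_pipeline is an ordinary callable taking keyword arguments:

    import logging

    logger = logging.getLogger(__name__)

    def run_in_cloud(trigger_fn, args):
        """Log before and after handing off a downstream pipeline trigger."""
        logger.info("starting %s args=%s", trigger_fn.__name__, args)
        trigger_fn(**args)
        logger.info("triggered %s args=%s", trigger_fn.__name__, args)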
Shutdown handler de-registered
function_mudeb_2025-12-15 status is now deployed due to DeploymentManager action
function_mudeb_2025-12-15 status is now inactive due to auto deactivation of underperforming models
function_mudeb_2025-12-15 status is now torndown due to DeploymentManager action
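The three status lines trace a deployed, inactive, torndown lifecycle for the function; a sketch of that lifecycle as a state machine, with the enum, the allowed-transition table, and the transition helper all assumed rather than taken from DeploymentManager:

    import logging
    from enum import Enum

    logger = logging.getLogger(__name__)

    class DeploymentStatus(Enum):
        DEPLOYED = "deployed"
        INACTIVE = "inactive"
        TORNDOWN = "torndown"

    # Transitions assumed from the lifecycle observed above; the real manager
    # may allow others (e.g. reactivation).
    ALLOWED = {
        DeploymentStatus.DEPLOYED: {DeploymentStatus.INACTIVE, DeploymentStatus.TORNDOWN},
        DeploymentStatus.INACTIVE: {DeploymentStatus.TORNDOWN},
        DeploymentStatus.TORNDOWN: set(),
    }

    def transition(function_name, current, new, reason):
        """Validate a status change and log it in the shape used above."""
        if new not in ALLOWED[current]:
            raise ValueError(f"illegal transition {current.value} -> {new.value}")
        logger.info("%s status is now %s due to %s", function_name, new.value, reason)
        return new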