Shutdown handler not registered because the Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 3.6408448219299316s
Received healthy response to inference request in 3.3581702709198s
Received healthy response to inference request in 2.8198537826538086s
Received healthy response to inference request in 3.1461598873138428s
Received healthy response to inference request in 3.3279659748077393s
Received healthy response to inference request in 2.9434101581573486s
Received healthy response to inference request in 2.8530161380767822s
read tcp 127.0.0.1:50868->127.0.0.1:8080: read: connection reset by peer
Received unhealthy response to inference request!
Received healthy response to inference request in 8.20602035522461s
Received healthy response to inference request in 2.654820203781128s
10 requests
1 failed requests
5th percentile: 1.4657444953918457
10th percentile: 2.438624620437622
20th percentile: 2.7868470668792726
30th percentile: 2.84306743144989
40th percentile: 2.907252550125122
50th percentile: 3.0447850227355957
60th percentile: 3.2188823223114014
70th percentile: 3.3370272636413576
80th percentile: 3.414705181121826
90th percentile: 4.097362375259398
95th percentile: 6.151691365242
99th percentile: 7.795154557228089
mean time: 3.344312596321106
%s, retrying in %s seconds...
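The retry log line above implies a retry loop that logs the error and the delay before the next attempt. A minimal sketch of such a loop follows; the function name, attempt count, and exponential backoff schedule are assumptions for illustration, not taken from the pipeline itself:

```python
import logging
import time

log = logging.getLogger(__name__)

def with_retries(fn, attempts=3, base_delay=1.0):
    """Call fn(), retrying on failure with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception as exc:
            if attempt == attempts - 1:
                raise  # out of retries; propagate the last error
            delay = base_delay * (2 ** attempt)
            # Matches the "%s, retrying in %s seconds..." template above.
            log.warning("%s, retrying in %s seconds...", exc, delay)
            time.sleep(delay)
```

A transient failure such as the `connection reset by peer` above would be logged and retried; only a failure on the final attempt propagates to the caller.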
Received healthy response to inference request in 4.6008460521698s
Received healthy response to inference request in 2.286140203475952s
Received healthy response to inference request in 2.6774938106536865s
Received healthy response to inference request in 2.1030049324035645s
Received healthy response to inference request in 2.4963696002960205s
Received healthy response to inference request in 2.1157567501068115s
Received healthy response to inference request in 3.8330390453338623s
Received healthy response to inference request in 2.6984169483184814s
Received healthy response to inference request in 3.4778366088867188s
Received healthy response to inference request in 3.4731345176696777s
10 requests
0 failed requests
5th percentile: 2.1087432503700256
10th percentile: 2.114481568336487
20th percentile: 2.252063512802124
30th percentile: 2.43330078125
40th percentile: 2.60504412651062
50th percentile: 2.687955379486084
60th percentile: 3.0083039760589596
70th percentile: 3.47454514503479
80th percentile: 3.5488770961761475
90th percentile: 3.909819746017456
95th percentile: 4.255332899093627
99th percentile: 4.531743421554565
mean time: 2.9762038469314573
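The percentile and mean figures in the summary above can be reproduced from the per-request latencies. A minimal sketch, assuming linear interpolation between ranked samples (NumPy's default percentile method; the actual StressChecker implementation is not shown in this log):

```python
def percentile(samples, pct):
    """Linear-interpolated percentile over ranked samples."""
    ranked = sorted(samples)
    # Fractional rank within [0, len-1] for the requested percentile.
    rank = (pct / 100) * (len(ranked) - 1)
    lo, frac = int(rank), rank - int(rank)
    if frac == 0:
        return ranked[lo]
    return ranked[lo] + frac * (ranked[lo + 1] - ranked[lo])

# Latencies (seconds, rounded) from the second batch of 10 requests above.
latencies = [4.6008, 2.2861, 2.6775, 2.1030, 2.4964,
             2.1158, 3.8330, 2.6984, 3.4778, 3.4731]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {percentile(latencies, p)}")
print(f"mean time: {sum(latencies) / len(latencies)}")
```

With these inputs the sketch reproduces the logged summary to rounding error, e.g. a 50th percentile near 2.688 and a mean near 2.976.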
Pipeline stage StressChecker completed in 66.58s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.81s
Shutdown handler de-registered
function_ligel_2025-12-15 status is now deployed due to DeploymentManager action
function_ligel_2025-12-15 status is now inactive due to auto deactivation (removed underperforming models)
function_ligel_2025-12-15 status is now torndown due to DeploymentManager action