Shutdown handler not registered because Python interpreter is not running in the main thread
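This warning is expected when shutdown handling is attempted from a worker thread: CPython only allows signal handlers to be installed from the main thread of the main interpreter, and signal.signal() raises ValueError elsewhere. A minimal sketch of such a guard, assuming a hypothetical register_shutdown_handler() and cleanup callback (not the pipeline's actual code):

import signal
import threading

def register_shutdown_handler(cleanup):
    # Install SIGTERM/SIGINT handlers, but only from the main thread;
    # signal.signal() raises ValueError in any other thread.
    if threading.current_thread() is not threading.main_thread():
        print("Shutdown handler not registered because Python interpreter "
              "is not running in the main thread")
        return

    def _handler(signum, frame):
        cleanup()

    signal.signal(signal.SIGTERM, _handler)
    signal.signal(signal.SIGINT, _handler)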
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.7828009128570557s
Received healthy response to inference request in 2.2314631938934326s
Received healthy response to inference request in 3.7740700244903564s
Received healthy response to inference request in 3.8075315952301025s
read tcp 127.0.0.1:45260->127.0.0.1:8080: read: connection reset by peer
Received unhealthy response to inference request!
Received healthy response to inference request in 2.730210304260254s
Received healthy response to inference request in 3.189642906188965s
Received healthy response to inference request in 4.235850095748901s
Received healthy response to inference request in 2.7659599781036377s
Received healthy response to inference request in 2.6947214603424072s
10 requests
1 failed request
5th percentile: 0.8527899980545044
10th percentile: 1.6137080192565918
20th percentile: 2.1417307376861574
30th percentile: 2.5557439804077147
40th percentile: 2.716014766693115
50th percentile: 2.748085141181946
60th percentile: 2.9354331493377686
70th percentile: 3.3649710416793823
80th percentile: 3.7807623386383056
90th percentile: 3.850363445281982
95th percentile: 4.0431067705154415
99th percentile: 4.197301430702209
mean time: 2.730412244796753
%s, retrying in %s seconds...
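The line above is a retry log template whose %s placeholders (the error and the delay) are unfilled in this capture. A generic retry-with-fixed-delay helper of roughly this shape, sketched here with hypothetical names, could emit such a message:

import logging
import time

logger = logging.getLogger(__name__)

def with_retries(fn, attempts=3, delay_s=5):
    # Call fn(), retrying on failure with a fixed delay between attempts.
    for attempt in range(attempts):
        try:
            return fn()
        except Exception as exc:
            if attempt == attempts - 1:
                raise
            # Lazy %-style logging fills in the placeholders seen in the line above.
            logger.warning("%s, retrying in %s seconds...", exc, delay_s)
            time.sleep(delay_s)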
Received healthy response to inference request in 2.723357915878296s
Received healthy response to inference request in 4.1635119915008545s
Received healthy response to inference request in 1.9869577884674072s
Received healthy response to inference request in 2.656982898712158s
Received healthy response to inference request in 3.9988720417022705s
Received healthy response to inference request in 1.7313814163208008s
Received healthy response to inference request in 2.3202226161956787s
Received healthy response to inference request in 2.571655750274658s
Received healthy response to inference request in 1.7959685325622559s
Received healthy response to inference request in 2.0311279296875s
10 requests
0 failed requests
5th percentile: 1.7604456186294555
10th percentile: 1.7895098209381104
20th percentile: 1.9487599372863769
30th percentile: 2.017876887321472
40th percentile: 2.204584741592407
50th percentile: 2.4459391832351685
60th percentile: 2.605786609649658
70th percentile: 2.6768954038619994
80th percentile: 2.978460741043091
90th percentile: 4.015336036682129
95th percentile: 4.089424014091492
99th percentile: 4.148694396018982
mean time: 2.598003888130188
Pipeline stage StressChecker completed in 56.02s
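Each StressChecker report above is descriptive statistics over the per-request durations; the first batch's 5th percentile (0.85s) sits below every healthy latency, so the failed request's short duration appears to be included in the stats. A minimal sketch that reproduces the second batch's summary, assuming numpy's default linear-interpolation percentiles (the helper name is hypothetical, not the pipeline's actual code):

import numpy as np

def summarise(durations_s, num_failed):
    # Print a stress-check style summary over per-request durations (seconds).
    print(f"{len(durations_s)} requests")
    print(f"{num_failed} failed requests")
    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f"{p}th percentile: {np.percentile(durations_s, p)}")
    print(f"mean time: {np.mean(durations_s)}")

# Per-request durations from the second batch logged above.
durations = [1.7313814163208008, 1.7959685325622559, 1.9869577884674072,
             2.0311279296875, 2.3202226161956787, 2.571655750274658,
             2.656982898712158, 2.723357915878296, 3.9988720417022705,
             4.1635119915008545]
summarise(durations, num_failed=0)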
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 1.62s
Shutdown handler de-registered
function_solom_2025-12-18 status is now deployed due to DeploymentManager action
function_solom_2025-12-18 status is now inactive due to auto deactivation (removing underperforming models)
function_solom_2025-12-18 status is now torndown due to DeploymentManager action