Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 6.181146860122681s
Connection pool is full, discarding connection: %s. Connection pool size: %s
(last message repeated 8 more times)
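The warning above is urllib3's message for a connection returned to an already-full pool: when more requests are in flight concurrently than the pool has slots, the surplus connections are closed on release rather than reused. A minimal stdlib sketch of that mechanism (the pool size of 10, urllib3's default `pool_maxsize`, is an assumption; `release` is a hypothetical stand-in for the real `_put_conn`):

```python
import queue

POOL_MAXSIZE = 10  # assumed; matches urllib3's default pool_maxsize

pool = queue.LifoQueue(maxsize=POOL_MAXSIZE)

def release(conn):
    """Return a connection to the pool; report False if it had to be discarded.

    Mirrors the shape of urllib3's put-back logic: a full pool means the
    connection is closed and the "Connection pool is full" warning is logged.
    """
    try:
        pool.put_nowait(conn)
        return True
    except queue.Full:
        # the real pool would call conn.close() here and log the warning
        return False

# 15 concurrent requests finishing against a pool with 10 slots:
kept = sum(release(f"conn-{i}") for i in range(15))
discarded = 15 - kept
print(kept, discarded)  # 10 kept, 5 discarded
```

Raising the pool size (e.g. via `HTTPAdapter(pool_maxsize=...)` on a `requests.Session`) makes the warning go away; it is harmless otherwise, since the discarded connections are simply re-established on demand.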
Received healthy response to inference request in 4.960723400115967s
Received healthy response to inference request in 8.706024885177612s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Received healthy response to inference request in 5.344745635986328s
Received healthy response to inference request in 7.324533224105835s
5 requests
0 failed requests
5th percentile: 5.037527847290039
10th percentile: 5.114332294464111
20th percentile: 5.267941188812256
30th percentile: 5.5120258808135985
40th percentile: 5.84658637046814
50th percentile: 6.181146860122681
60th percentile: 6.638501405715942
70th percentile: 7.095855951309204
80th percentile: 7.600831556320191
90th percentile: 8.153428220748902
95th percentile: 8.429726552963256
99th percentile: 8.65076521873474
mean time: 6.503434801101685
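The summary above is consistent with linear-interpolation percentiles (NumPy's default method) taken over the five logged response latencies. A pure-Python sketch that reproduces it, assuming that method:

```python
import statistics

def percentile(samples, p):
    """Percentile via linear interpolation between closest ranks."""
    xs = sorted(samples)
    k = (len(xs) - 1) * p / 100.0
    lo = int(k)                      # rank just below the target
    hi = min(lo + 1, len(xs) - 1)    # rank just above the target
    frac = k - lo                    # interpolation weight
    return xs[lo] + (xs[hi] - xs[lo]) * frac

# The five healthy-response latencies logged above, in seconds.
latencies = [
    6.181146860122681,
    4.960723400115967,
    8.706024885177612,
    5.344745635986328,
    7.324533224105835,
]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {percentile(latencies, p)}")
print(f"mean time: {statistics.mean(latencies)}")
```

With only 5 samples the 50th percentile is exactly the middle observation (6.181...s) and the tail percentiles are interpolations between the two slowest requests, so the high percentiles here say little beyond "the slowest request took ~8.7s".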
%s, retrying in %s seconds...
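The un-interpolated `"%s, retrying in %s seconds..."` line is a lazy-format retry notice: the first placeholder is the error, the second the delay. A sketch of the retry loop behind such a message (the attempt count and exponential backoff schedule are assumptions; only the message template appears in the log):

```python
import time
import logging

logger = logging.getLogger(__name__)

def retry(fn, attempts=3, base_delay=1.0):
    """Call fn, retrying failures with exponential backoff.

    Logs the same lazy-format message seen above before each retry.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except Exception as exc:
            if attempt == attempts - 1:
                raise  # out of attempts: propagate the last error
            delay = base_delay * (2 ** attempt)
            logger.warning("%s, retrying in %s seconds...", exc, delay)
            time.sleep(delay)
```

Note that the raw `%s` in the log means the arguments were never substituted, i.e. the message was likely emitted with the template as a plain string (`logger.warning(msg % ...)` forgotten, or the args dropped) rather than through lazy formatting as intended.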
Received healthy response to inference request in 6.403990983963013s
Connection pool is full, discarding connection: %s. Connection pool size: %s
(last message repeated 8 more times)
Received healthy response to inference request in 6.955063104629517s
Received healthy response to inference request in 7.9380927085876465s
Connection pool is full, discarding connection: %s. Connection pool size: %s
(last message repeated 12 more times)
Received healthy response to inference request in 8.717573165893555s
Connection pool is full, discarding connection: %s. Connection pool size: %s
(last message repeated 3 more times)
Received healthy response to inference request in 7.285416603088379s
5 requests
0 failed requests
5th percentile: 6.514205408096314
10th percentile: 6.624419832229615
20th percentile: 6.844848680496216
30th percentile: 7.021133804321289
40th percentile: 7.153275203704834
50th percentile: 7.285416603088379
60th percentile: 7.546487045288086
70th percentile: 7.807557487487792
80th percentile: 8.093988800048828
90th percentile: 8.40578098297119
95th percentile: 8.561677074432373
99th percentile: 8.686393947601319
mean time: 7.4600273132324215
%s, retrying in %s seconds...
Received healthy response to inference request in 6.638851165771484s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Received healthy response to inference request in 6.839630603790283s
Connection pool is full, discarding connection: %s. Connection pool size: %s
(last message repeated 2 more times)
Received healthy response to inference request in 6.143150329589844s
Received healthy response to inference request in 5.45689582824707s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Received healthy response to inference request in 5.142122983932495s
5 requests
0 failed requests
5th percentile: 5.2050775527954105
10th percentile: 5.268032121658325
20th percentile: 5.393941259384155
30th percentile: 5.594146728515625
40th percentile: 5.868648529052734
50th percentile: 6.143150329589844
60th percentile: 6.3414306640625
70th percentile: 6.539710998535156
80th percentile: 6.679007053375244
90th percentile: 6.759318828582764
95th percentile: 6.799474716186523
99th percentile: 6.831599426269531
mean time: 6.044130182266235
clean up pipeline due to error=%s
Shutdown handler de-registered
function_semeb_2024-09-25 status is now failed due to DeploymentManager action
function_semeb_2024-09-25 status is now torndown due to DeploymentManager action
function_semeb_2024-09-25 status is now inactive due to auto-deactivation of underperforming models
function_semeb_2024-09-25 status is now torndown due to DeploymentManager action