Shutdown handler not registered because Python interpreter is not running in the main thread
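The warning above is standard Python behaviour: signal handlers can only be installed from the main thread of the main interpreter, so a registration helper typically checks the current thread before calling signal.signal(). A minimal sketch of that guard, assuming a helper named register_shutdown_handler (the name and the cleanup callback are illustrative, not the actual code):

import logging
import signal
import threading

logger = logging.getLogger(__name__)

def register_shutdown_handler(cleanup):
    # signal.signal() raises ValueError outside the main thread, so skip registration there.
    if threading.current_thread() is not threading.main_thread():
        logger.warning(
            "Shutdown handler not registered because Python interpreter "
            "is not running in the main thread"
        )
        return

    def _handler(signum, frame):
        cleanup()

    signal.signal(signal.SIGTERM, _handler)
    signal.signal(signal.SIGINT, _handler)
    logger.info("Shutdown handler registered")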
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
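The timeout above comes from the requests/urllib3 stack with a 20-second read timeout against the guanaco-submitter host; a request that fails or times out is logged as unhealthy, otherwise its elapsed time is logged. A rough sketch of such a health-checked call, assuming a POST endpoint and payload that are not shown in the log:

import logging
import time

import requests

logger = logging.getLogger(__name__)

# Host taken from the log; the endpoint path and payload shape are assumptions.
BASE_URL = "http://guanaco-submitter.guanaco-backend.k2.chaiverse.com"

def timed_inference_request(path, payload, timeout=20):
    """Send one inference request and return (elapsed_seconds, ok)."""
    start = time.time()
    try:
        response = requests.post(BASE_URL + path, json=payload, timeout=timeout)
        response.raise_for_status()
        ok = True
    except requests.exceptions.RequestException as error:
        logger.warning(error)
        ok = False
    elapsed = time.time() - start
    if ok:
        logger.info("Received healthy response to inference request in %ss", elapsed)
    else:
        logger.error("Received unhealthy response to inference request!")
    return elapsed, ok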
Received healthy response to inference request in 1.4967710971832275s
Received healthy response to inference request in 2.6636178493499756s
Received healthy response to inference request in 1.9474070072174072s
Received healthy response to inference request in 2.1016900539398193s
Received healthy response to inference request in 1.9372313022613525s
Received healthy response to inference request in 1.5898797512054443s
Received healthy response to inference request in 2.937969923019409s
Received healthy response to inference request in 1.219423770904541s
Received healthy response to inference request in 2.694749116897583s
10 requests
1 failed request
5th percentile: 1.34423006772995
10th percentile: 1.4690363645553588
20th percentile: 1.571258020401001
30th percentile: 1.83302583694458
40th percentile: 1.9433367252349854
50th percentile: 2.0245485305786133
60th percentile: 2.3264611721038815
70th percentile: 2.6729572296142576
80th percentile: 2.7433932781219483
90th percentile: 4.699620580673211
95th percentile: 12.627048540115338
99th percentile: 18.96899090766907
mean time: 3.9143216371536256
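The summary figures are consistent with taking all ten elapsed times (including the timed-out request, which contributes roughly its full 20-second wait) and running them through numpy.percentile with its default linear interpolation, which is why the 95th/99th percentiles and the mean sit far above the healthy latencies. A sketch of that summary step, assuming the stage collects (elapsed, ok) pairs:

import logging

import numpy as np

logger = logging.getLogger(__name__)

def log_latency_summary(results):
    """results: list of (elapsed_seconds, ok) tuples from the stress check."""
    times = [elapsed for elapsed, _ in results]
    failed = sum(1 for _, ok in results if not ok)

    logger.info("%s requests", len(results))
    logger.info("%s failed requests", failed)
    for pct in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        logger.info("%sth percentile: %s", pct, np.percentile(times, pct))
    logger.info("mean time: %s", np.mean(times))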
%s, retrying in %s seconds...
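The "%s, retrying in %s seconds..." lines look like output from a generic retry-on-exception wrapper that logs the error and the backoff delay before trying again. A minimal sketch of that pattern; the decorator name, attempt count, and delay are assumptions:

import functools
import logging
import time

logger = logging.getLogger(__name__)

def retry(attempts=3, delay=5):
    """Retry the wrapped function on any exception, logging the error and the wait."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(attempts):
                try:
                    return func(*args, **kwargs)
                except Exception as error:
                    if attempt == attempts - 1:
                        raise
                    logger.warning("%s, retrying in %s seconds...", error, delay)
                    time.sleep(delay)
        return wrapper
    return decorator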
Received healthy response to inference request in 1.325622797012329s
Received healthy response to inference request in 1.2679767608642578s
Received healthy response to inference request in 1.8629157543182373s
Received healthy response to inference request in 1.58821439743042s
Received healthy response to inference request in 1.4180066585540771s
Received healthy response to inference request in 1.6652066707611084s
Received healthy response to inference request in 1.5005707740783691s
Received healthy response to inference request in 2.378481864929199s
Received healthy response to inference request in 2.1577866077423096s
Received healthy response to inference request in 5.156349182128906s
10 requests
0 failed requests
5th percentile: 1.29391747713089
10th percentile: 1.319858193397522
20th percentile: 1.3995298862457275
30th percentile: 1.4758015394210815
40th percentile: 1.5531569480895997
50th percentile: 1.6267105340957642
60th percentile: 1.74429030418396
70th percentile: 1.9513770103454589
80th percentile: 2.2019256591796874
90th percentile: 2.656268596649169
95th percentile: 3.9063088893890354
99th percentile: 4.906341123580933
mean time: 2.0321131467819216
Pipeline stage StressChecker completed in 75.97s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.74s
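Both stages follow the same run/complete pattern, with the completion line carrying the stage's wall-clock time to two decimal places. A sketch of a runner that would produce those lines; the class and method names are assumptions:

import logging
import time

logger = logging.getLogger(__name__)

def run_pipeline(stages):
    """Run each stage in order, logging its name and wall-clock duration."""
    for stage in stages:
        name = stage.__class__.__name__
        logger.info("Running pipeline stage %s", name)
        start = time.time()
        stage.run()
        logger.info("Pipeline stage %s completed in %.2fs", name, time.time() - start)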
Shutdown handler de-registered
function_pinuk_2025-12-06 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
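"Evaluating %s Family Friendly Score with %s threads" suggests the scorer fans the per-sample evaluation out over a thread pool. A sketch of that fan-out, assuming a per-sample scoring callable and a configurable thread count (neither is shown in the log):

import logging
from concurrent.futures import ThreadPoolExecutor

logger = logging.getLogger(__name__)

def evaluate_family_friendly_score(submission_id, samples, score_one, num_threads=8):
    """Score each sample concurrently and return the mean score."""
    logger.info(
        "Evaluating %s Family Friendly Score with %s threads",
        submission_id,
        num_threads,
    )
    with ThreadPoolExecutor(max_workers=num_threads) as executor:
        scores = list(executor.map(score_one, samples))
    return sum(scores) / len(scores)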
%s, retrying in %s seconds...
%s, retrying in %s seconds...
function_pinuk_2025-12-06 status is now inactive due to auto deactivation: removed underperforming models
function_pinuk_2025-12-06 status is now torndown due to DeploymentManager action
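The deployment lines trace a simple lifecycle for function_pinuk_2025-12-06 (deployed, then inactive, then torndown), each transition logged with a reason. A sketch of that bookkeeping; the status names come from the log, everything else is an assumption:

import logging
from enum import Enum

logger = logging.getLogger(__name__)

class DeploymentStatus(Enum):
    DEPLOYED = "deployed"
    INACTIVE = "inactive"
    TORNDOWN = "torndown"

def set_status(submission_id, status, reason):
    """Record a status transition in the same shape as the log lines above."""
    logger.info("%s status is now %s due to %s", submission_id, status.value, reason)

# Transitions matching the log:
# set_status("function_pinuk_2025-12-06", DeploymentStatus.DEPLOYED, "DeploymentManager action")
# set_status("function_pinuk_2025-12-06", DeploymentStatus.INACTIVE, "auto deactivation: removed underperforming models")
# set_status("function_pinuk_2025-12-06", DeploymentStatus.TORNDOWN, "DeploymentManager action")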