Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.925781488418579s
Received healthy response to inference request in 2.65675950050354s
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 4.609150171279907s
Received healthy response to inference request in 2.0394999980926514s
Received healthy response to inference request in 3.8215315341949463s
Received healthy response to inference request in 3.5126471519470215s
Received healthy response to inference request in 4.6556689739227295s
Received healthy response to inference request in 6.718809127807617s
Received healthy response to inference request in 3.207469940185547s
10 requests
1 failed request
5th percentile: 2.3172667741775514
10th percentile: 2.595033550262451
20th percentile: 2.8719770908355713
30th percentile: 3.1229634046554566
40th percentile: 3.3905762672424316
50th percentile: 3.667089343070984
60th percentile: 4.13657898902893
70th percentile: 4.623105812072754
80th percentile: 5.068297004699708
90th percentile: 8.05801277160644
95th percentile: 14.084429168701158
99th percentile: 18.905562286376956
mean time: 5.425816345214844
%s, retrying in %s seconds...
Failed to get response for submission chaiml-mistral-24b-2048_15988_v1: ('http://chaiml-mistral-24b-2048-15988-v1-predictor.tenant-chaiml-guanaco.kchai-coreweave-us-east-04a.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '')
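The timeout, retry, and failure lines above suggest the stress checker posts to each submission's KServe-style :predict endpoint with a 20-second read timeout and retries before giving up. A minimal sketch of that flow, assuming the requests library; the payload shape, retry count, and delay are illustrative assumptions, not values taken from the log:

```python
import time
import requests


def send_inference_request(predict_url, payload, read_timeout=20, retries=3, delay=5):
    """Post one inference request and time it, retrying on failure.

    Only the 20s read timeout and the :predict endpoint appear in the log;
    the payload shape, retry count, and delay here are assumptions.
    """
    for attempt in range(retries):
        start = time.time()
        try:
            response = requests.post(predict_url, json=payload, timeout=read_timeout)
            response.raise_for_status()
            print(f"Received healthy response to inference request in {time.time() - start}s")
            return response.json()
        except requests.RequestException as error:
            print("Received unhealthy response to inference request!")
            if attempt < retries - 1:
                print(f"{error}, retrying in {delay} seconds...")
                time.sleep(delay)
    # All retries exhausted: the caller records this as a failed request.
    return None
```

Judging by the first batch above (mean 5.43s with no healthy response over 6.72s, yet a 99th percentile near 18.9s), a request that times out still appears to enter the latency statistics at roughly its 20s timeout value.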
Received healthy response to inference request in 4.373692274093628s
Received healthy response to inference request in 2.2352774143218994s
Received healthy response to inference request in 2.993192195892334s
Received healthy response to inference request in 4.954463243484497s
Received healthy response to inference request in 5.318190574645996s
Received healthy response to inference request in 5.326735258102417s
Received healthy response to inference request in 4.982121706008911s
Received healthy response to inference request in 4.547181606292725s
Received healthy response to inference request in 3.913231611251831s
Received healthy response to inference request in 3.963456869125366s
10 requests
0 failed requests
5th percentile: 2.576339066028595
10th percentile: 2.9174007177352905
20th percentile: 3.7292237281799316
30th percentile: 3.9483892917633057
40th percentile: 4.209598112106323
50th percentile: 4.460436940193176
60th percentile: 4.710094261169433
70th percentile: 4.962760782241821
80th percentile: 5.049335479736328
90th percentile: 5.319045042991638
95th percentile: 5.322890150547027
99th percentile: 5.325966236591339
mean time: 4.26075427532196
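For reference, the percentile and mean lines appear consistent with NumPy's default linear interpolation over the ten measured response times. A small sketch, assuming numpy is used (the function name is hypothetical):

```python
import numpy as np


def summarise_response_times(times_s):
    """Print a latency summary in the same shape as the stress-check log lines."""
    times = np.asarray(times_s)
    print(f"{len(times)} requests")
    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        # np.percentile defaults to linear interpolation between order statistics.
        print(f"{p}th percentile: {np.percentile(times, p)}")
    print(f"mean time: {times.mean()}")


# The ten response times from the second batch above.
summarise_response_times([
    4.373692274093628, 2.2352774143218994, 2.993192195892334,
    4.954463243484497, 5.318190574645996, 5.326735258102417,
    4.982121706008911, 4.547181606292725, 3.913231611251831,
    3.963456869125366,
])
```

Run on the second batch, this reproduces the reported values (e.g. a 5th percentile of ~2.5763 and a mean of ~4.2608).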
Pipeline stage StressChecker completed in 103.87s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.53s
Shutdown handler de-registered
function_pateb_2026-02-15 status is now deployed due to DeploymentManager action
function_pateb_2026-02-15 status is now inactive due to auto deactivation of underperforming models