Shutdown handler not registered because Python interpreter is not running in the main thread
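This first line reflects a real CPython restriction: `signal.signal` raises `ValueError` when called from any thread other than the main thread, so signal-based shutdown handlers must be guarded. A minimal sketch of such a guard (the function name and message are illustrative, modeled on the log line above, not taken from the actual pipeline code):

```python
import signal
import threading

def try_register_shutdown(handler):
    """Install a SIGTERM handler, but only from the main thread.

    CPython raises ValueError if signal.signal is called from a
    non-main thread, so check first and fall back to logging --
    the same behaviour the log line above reports.
    """
    if threading.current_thread() is threading.main_thread():
        signal.signal(signal.SIGTERM, handler)
        return True
    print("Shutdown handler not registered because Python interpreter "
          "is not running in the main thread")
    return False
```

Calling this from a worker thread returns `False` without raising, which is why the log shows the message instead of a traceback.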
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
Inference service chaiml-kasey-mean-older-58615-v1 ready after 152.71314454078674s
Pipeline stage VLLMDeployer completed in 163.97s
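The "ready after ...s" lines typically come from a poll loop that times how long the inference service takes to pass a health probe. A hedged sketch of such a loop, assuming a generic zero-argument probe callable (the function name, interval, and timeout are illustrative, not from this log):

```python
import time

def wait_until_ready(is_ready, poll_interval=5.0, timeout=600.0):
    """Poll is_ready() until it returns True, reporting elapsed time.

    is_ready is any zero-argument callable (e.g. an HTTP health
    check); poll_interval and timeout are illustrative defaults.
    """
    start = time.monotonic()
    while time.monotonic() - start < timeout:
        if is_ready():
            elapsed = time.monotonic() - start
            print(f"Inference service ready after {elapsed}s")
            return elapsed
        time.sleep(poll_interval)
    raise TimeoutError(f"service not ready within {timeout}s")
```

Using `time.monotonic()` rather than `time.time()` keeps the measured wait immune to wall-clock adjustments during long deployments.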
run pipeline stage %s
Received healthy response to inference request in 6.006218194961548s
Running pipeline stage StressChecker
Inference service chaiml-kasey-mean-older-62091-v1 ready after 151.08075046539307s
Pipeline stage VLLMDeployer completed in 161.48s
Received healthy response to inference request in 2.807194471359253s
run pipeline stage %s
Received healthy response to inference request in 6.011072635650635s
Running pipeline stage StressChecker
Received healthy response to inference request in 3.4907240867614746s
Received healthy response to inference request in 5.237056255340576s
Received healthy response to inference request in 2.85123872756958s
Received healthy response to inference request in 3.1867244243621826s
Received healthy response to inference request in 2.6843771934509277s
Received healthy response to inference request in 2.871544361114502s
Received healthy response to inference request in 5.88506293296814s
Received healthy response to inference request in 2.901127815246582s
Received healthy response to inference request in 3.079287528991699s
Received healthy response to inference request in 3.6881937980651855s
Received healthy response to inference request in 3.1614110469818115s
Received healthy response to inference request in 2.945248603820801s
Received healthy response to inference request in 2.8206396102905273s
Received healthy response to inference request in 2.7812366485595703s
Received healthy response to inference request in 5.35004186630249s
Received healthy response to inference request in 2.9338271617889404s
Received healthy response to inference request in 2.795872926712036s
Received healthy response to inference request in 5.175680875778198s
Received healthy response to inference request in 2.7810959815979004s
Received healthy response to inference request in 3.10022234916687s
Received healthy response to inference request in 4.462849378585815s
Received healthy response to inference request in 2.7429769039154053s
Received healthy response to inference request in 2.791480779647827s
Received healthy response to inference request in 4.467803239822388s
Received healthy response to inference request in 2.9427905082702637s
Received healthy response to inference request in 7.88688588142395s
Received healthy response to inference request in 3.4108850955963135s
Received healthy response to inference request in 8.333678483963013s
Received healthy response to inference request in 3.192669630050659s
Received healthy response to inference request in 2.796412467956543s
10 requests
0 failed requests
5th percentile: 4.036788809299469
Received healthy response to inference request in 2.6708626747131348s
Received healthy response to inference request in 3.45797061920166s
10th percentile: 4.385383820533752
20th percentile: 4.466812467575073
Received healthy response to inference request in 2.9925436973571777s
Received healthy response to inference request in 4.039337635040283s
30th percentile: 4.963317584991455
40th percentile: 5.212506103515625
Received healthy response to inference request in 3.792365550994873s
50th percentile: 5.293549060821533
Received healthy response to inference request in 2.9665002822875977s
60th percentile: 5.56405029296875
70th percentile: 5.9214095115661625
Received healthy response to inference request in 2.8930561542510986s
Received healthy response to inference request in 3.691194534301758s
80th percentile: 6.007189083099365
90th percentile: 6.243333220481872
95th percentile: 7.28850585222244
Received healthy response to inference request in 3.05313777923584s
Received healthy response to inference request in 3.215024709701538s
99th percentile: 8.124643957614898
mean time: 5.461765766143799
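The percentile report above can be reproduced from the recorded request latencies. A sketch of the summary step, assuming latencies are collected in a list with `None` marking failed requests (the helper name is an assumption, and `statistics.quantiles` uses linear interpolation, so exact values may differ slightly from whatever the stress checker itself computes):

```python
import statistics

def summarize_latencies(latencies,
                        percentiles=(5, 10, 20, 30, 40, 50,
                                     60, 70, 80, 90, 95, 99)):
    """Print a report in the same style as the stress-check log lines."""
    failed = sum(1 for t in latencies if t is None)
    ok = [t for t in latencies if t is not None]
    print(f"{len(latencies)} requests")
    print(f"{failed} failed requests")
    # quantiles with n=100 yields the 1st..99th percentile cut points
    qs = statistics.quantiles(ok, n=100)
    for p in percentiles:
        print(f"{p}th percentile: {qs[p - 1]}")
    print(f"mean time: {statistics.mean(ok)}")
```

Note the interleaving in the log: response lines from a second concurrent stress run continue to arrive while this summary is being printed.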
Received healthy response to inference request in 3.285383701324463s
Received healthy response to inference request in 3.9219093322753906s
Pipeline stage StressChecker completed in 128.73s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
Received healthy response to inference request in 3.5350465774536133s
Received healthy response to inference request in 2.782888650894165s
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Received healthy response to inference request in 2.8870925903320312s
Received healthy response to inference request in 2.7473697662353516s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 11.49s
Shutdown handler de-registered
Received healthy response to inference request in 2.7976489067077637s
Received healthy response to inference request in 3.0377001762390137s
function_sebok_2026-03-12 status is now deployed due to DeploymentManager action
function_sebok_2026-03-12 status is now inactive due to auto deactivation (underperforming models removed)