Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 3.36751651763916s
Received healthy response to inference request in 3.792955160140991s
Received healthy response to inference request in 5.269430160522461s
{"detail":"('http://chaiml-elo-alignment-run-3-v44-predictor.tenant-chaiml-guanaco.k2.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:60920->127.0.0.1:8080: read: connection reset by peer\\n')"}
Received unhealthy response to inference request!
Received healthy response to inference request in 2.7741851806640625s
5 requests
1 failed requests
5th percentile: 1.1605798721313476
10th percentile: 1.5639811992645263
20th percentile: 2.370783853530884
30th percentile: 2.892851448059082
40th percentile: 3.130183982849121
50th percentile: 3.36751651763916
60th percentile: 3.5376919746398925
70th percentile: 3.707867431640625
80th percentile: 4.088250160217285
90th percentile: 4.678840160369873
95th percentile: 4.974135160446167
99th percentile: 5.2103711605072025
mean time: 3.1922531127929688
%s, retrying in %s seconds...
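The failed request above is retried after a short delay. A minimal sketch of such a retry loop, assuming a generic helper (the names with_retries, attempts, and delay are hypothetical, not taken from the pipeline code):

    import time

    def with_retries(fn, attempts=3, delay=5):
        # Hypothetical helper mirroring the "%s, retrying in %s seconds..."
        # lines: on failure, log the error and sleep before trying again.
        for attempt in range(attempts):
            try:
                return fn()
            except Exception as err:
                if attempt == attempts - 1:
                    raise
                print(f"{err}, retrying in {delay} seconds...")
                time.sleep(delay)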
Received healthy response to inference request in 4.259687900543213s
Received healthy response to inference request in 2.7255890369415283s
Received healthy response to inference request in 3.4360241889953613s
Received healthy response to inference request in 2.2393081188201904s
Received healthy response to inference request in 3.964984655380249s
5 requests
0 failed requests
5th percentile: 2.336564302444458
10th percentile: 2.4338204860687256
20th percentile: 2.6283328533172607
30th percentile: 2.8676760673522947
40th percentile: 3.151850128173828
50th percentile: 3.4360241889953613
60th percentile: 3.6476083755493165
70th percentile: 3.859192562103271
80th percentile: 4.023925304412842
90th percentile: 4.141806602478027
95th percentile: 4.20074725151062
99th percentile: 4.247899770736694
mean time: 3.3251187801361084
%s, retrying in %s seconds...
Received healthy response to inference request in 3.2127599716186523s
Received healthy response to inference request in 3.8545496463775635s
Received healthy response to inference request in 3.2963812351226807s
Received healthy response to inference request in 3.5700631141662598s
Received healthy response to inference request in 3.4068610668182373s
5 requests
0 failed requests
5th percentile: 3.229484224319458
10th percentile: 3.2462084770202635
20th percentile: 3.279656982421875
30th percentile: 3.318477201461792
40th percentile: 3.3626691341400146
50th percentile: 3.4068610668182373
60th percentile: 3.472141885757446
70th percentile: 3.537422704696655
80th percentile: 3.6269604206085204
90th percentile: 3.740755033493042
95th percentile: 3.797652339935303
99th percentile: 3.8431701850891113
mean time: 3.4681230068206785
Pipeline stage StressChecker completed in 53.85s
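The latency summaries logged above are consistent with linear-interpolation percentiles and an arithmetic mean over the five recorded request times. A minimal sketch (assuming numpy's default percentile behaviour; the actual StressChecker code is not shown here) reproduces the figures of the last batch:

    import numpy as np

    # Latencies (seconds) from the last batch of five healthy responses.
    latencies = [3.2127599716186523, 3.8545496463775635, 3.2963812351226807,
                 3.5700631141662598, 3.4068610668182373]

    # Linear-interpolation percentiles and the arithmetic mean match the
    # "Nth percentile" and "mean time" lines logged above.
    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f"{p}th percentile: {np.percentile(latencies, p)}")
    print(f"mean time: {np.mean(latencies)}")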
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 2.40s
Shutdown handler de-registered
function_gegaf_2024-12-06 status is now deployed due to DeploymentManager action
function_gegaf_2024-12-06 status is now inactive due to admin request
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Received signal 15, running shutdown handler
Shutdown handler de-registered
function_gegaf_2024-12-06 status is now inactive due to auto deactivation (removed underperforming models)
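The shutdown-handler messages (registered, de-registered, not registered outside the main thread, and the handler run on signal 15) are consistent with a SIGTERM handler installed via Python's signal module, which only allows signal.signal() to be called from the main thread. A minimal sketch with hypothetical function names:

    import signal
    import threading

    def _shutdown_handler(signum, frame):
        # Signal 15 is SIGTERM.
        print(f"Received signal {signum}, running shutdown handler")

    def register_shutdown_handler():
        # signal.signal() may only be called from the main thread, which
        # plausibly explains the "not registered" message at the top.
        if threading.current_thread() is not threading.main_thread():
            print("Shutdown handler not registered because Python interpreter "
                  "is not running in the main thread")
            return
        signal.signal(signal.SIGTERM, _shutdown_handler)
        print("Shutdown handler registered")

    def deregister_shutdown_handler():
        signal.signal(signal.SIGTERM, signal.SIG_DFL)
        print("Shutdown handler de-registered")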