Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name white-bird-7-dpo-step100-v1-mkmlizer
Waiting for job on white-bird-7-dpo-step100-v1-mkmlizer to finish
white-bird-7-dpo-step100-v1-mkmlizer: Downloaded to shared memory in 45.622s
white-bird-7-dpo-step100-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpl2cta0pl, device:0
white-bird-7-dpo-step100-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
white-bird-7-dpo-step100-v1-mkmlizer: quantized model in 37.077s
white-bird-7-dpo-step100-v1-mkmlizer: Processed model white-bird/7_dpo_step100 in 82.700s
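(For reference, the stage timings above are consistent: 45.622s download + 37.077s quantization ≈ 82.700s total processing time, with the flywheel save accounting for the small remainder.)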
white-bird-7-dpo-step100-v1-mkmlizer: creating bucket guanaco-mkml-models
white-bird-7-dpo-step100-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
white-bird-7-dpo-step100-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/white-bird-7-dpo-step100-v1
white-bird-7-dpo-step100-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/white-bird-7-dpo-step100-v1/config.json
white-bird-7-dpo-step100-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/white-bird-7-dpo-step100-v1/special_tokens_map.json
white-bird-7-dpo-step100-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/white-bird-7-dpo-step100-v1/tokenizer_config.json
white-bird-7-dpo-step100-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/white-bird-7-dpo-step100-v1/tokenizer.json
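The upload step above mirrors the quantized cache (weights plus the config/tokenizer JSON files) into the model bucket. A minimal sketch of what that sync might look like, assuming boto3; the bucket, prefix, and local path are taken from the log, everything else is illustrative:

    import os
    import boto3

    def upload_model_cache(local_dir="/dev/shm/model_cache",
                           bucket="guanaco-mkml-models",
                           prefix="white-bird-7-dpo-step100-v1"):
        # Walk the local cache and mirror every file under s3://bucket/prefix/
        s3 = boto3.client("s3")
        for root, _dirs, files in os.walk(local_dir):
            for name in files:
                local_path = os.path.join(root, name)
                rel_path = os.path.relpath(local_path, local_dir)
                s3.upload_file(local_path, bucket, f"{prefix}/{rel_path}")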
white-bird-7-dpo-step100-v1-mkmlizer:
Loading 0: 0%| | 0/363 [00:00<?, ?it/s]
Loading 0: 99%|█████████▉| 360/363 [00:15<00:00, 31.29it/s]
Job white-bird-7-dpo-step100-v1-mkmlizer completed after 113.01s with status: succeeded
Stopping job with name white-bird-7-dpo-step100-v1-mkmlizer
Pipeline stage MKMLizer completed in 113.45s
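The "Waiting for job ... to finish" step amounts to polling the job status until it reports a terminal state. A minimal sketch of that wait loop, assuming a caller-supplied status callable (the real job client is not shown in the log):

    import time

    def wait_for_job(name, get_status, poll_interval=5.0, timeout=3600.0):
        # get_status is an assumed callable returning "running", "succeeded", or "failed".
        start = time.time()
        while time.time() - start < timeout:
            status = get_status(name)
            if status in ("succeeded", "failed"):
                print(f"Job {name} completed after {time.time() - start:.2f}s with status: {status}")
                return status
            time.sleep(poll_interval)
        raise TimeoutError(f"Job {name} did not finish within {timeout}s")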
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.17s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service white-bird-7-dpo-step100-v1
Waiting for inference service white-bird-7-dpo-step100-v1 to be ready
Inference service white-bird-7-dpo-step100-v1 ready after 140.48565554618835s
Pipeline stage MKMLDeployer completed in 140.89s
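The readiness wait above ("ready after 140.49s") is, in effect, a poll-until-healthy loop. A minimal sketch assuming a hypothetical health endpoint; the actual deployer talks to the cluster API, so this only illustrates the waiting logic:

    import time
    import requests

    def wait_for_inference_service(health_url, timeout=600.0, poll_interval=2.0):
        # Poll the (assumed) health endpoint until it answers HTTP 200 or the timeout expires.
        start = time.time()
        while time.time() - start < timeout:
            try:
                if requests.get(health_url, timeout=5).status_code == 200:
                    return time.time() - start
            except requests.RequestException:
                pass
            time.sleep(poll_interval)
        raise TimeoutError(f"Service at {health_url} not ready after {timeout}s")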
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 2.2560551166534424s
Received healthy response to inference request in 1.636408805847168s
Received healthy response to inference request in 1.6049039363861084s
Received healthy response to inference request in 1.5297136306762695s
5 requests
1 failed requests
5th percentile: 1.5447516918182373
10th percentile: 1.5597897529602052
20th percentile: 1.5898658752441406
30th percentile: 1.6112049102783204
40th percentile: 1.623806858062744
50th percentile: 1.636408805847168
60th percentile: 1.8842673301696777
70th percentile: 2.1321258544921875
80th percentile: 5.826369142532352
90th percentile: 12.966997194290162
95th percentile: 16.537311220169066
99th percentile: 19.39356244087219
mean time: 5.426941347122193
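The percentiles above match linearly interpolated percentiles over the five request latencies, with the timed-out request counted at roughly 20.1s (that figure is inferred from the reported mean and tail percentiles, not logged directly). A small check, assuming numpy:

    import numpy as np

    # Four healthy latencies from the log plus the timed-out request (~20.1s, inferred).
    latencies = [2.2560551166534424, 1.636408805847168, 1.6049039363861084,
                 1.5297136306762695, 20.107]

    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f"{p}th percentile: {np.percentile(latencies, p)}")
    print("mean time:", np.mean(latencies))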
%s, retrying in %s seconds...
Received healthy response to inference request in 1.7387580871582031s
Received healthy response to inference request in 1.5193860530853271s
Received healthy response to inference request in 1.7792069911956787s
Received healthy response to inference request in 1.674245834350586s
Received healthy response to inference request in 1.6488275527954102s
5 requests
0 failed requests
5th percentile: 1.5452743530273438
10th percentile: 1.5711626529693603
20th percentile: 1.6229392528533935
30th percentile: 1.6539112091064454
40th percentile: 1.6640785217285157
50th percentile: 1.674245834350586
60th percentile: 1.7000507354736327
70th percentile: 1.7258556365966797
80th percentile: 1.7468478679656982
90th percentile: 1.7630274295806885
95th percentile: 1.7711172103881836
99th percentile: 1.7775890350341796
mean time: 1.672084903717041
Pipeline stage StressChecker completed in 37.59s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.79s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.62s
Shutdown handler de-registered
white-bird-7-dpo-step100_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.10s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.09s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service white-bird-7-dpo-step100-v1-profiler
Waiting for inference service white-bird-7-dpo-step100-v1-profiler to be ready
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 4087.31s
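The scorer fans the evaluation out across worker threads (the model name and thread count are not interpolated in the log lines above). A minimal sketch of that fan-out, assuming a hypothetical score_sample() callable and a list of samples to score:

    from concurrent.futures import ThreadPoolExecutor

    def evaluate_family_friendly(samples, score_sample, num_threads=8):
        # score_sample is an assumed per-sample scoring callable; the real
        # scorer's interface is not shown in the log.
        with ThreadPoolExecutor(max_workers=num_threads) as pool:
            scores = list(pool.map(score_sample, samples))
        return sum(scores) / len(scores)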
Shutdown handler de-registered
white-bird-7-dpo-step100_v1 status is now inactive due to auto deactivation of underperforming models
white-bird-7-dpo-step100_v1 status is now torndown due to DeploymentManager action