Shutdown handler not registered because the Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name jic062-dpo-v1-8-v1-mkmlizer
Waiting for job on jic062-dpo-v1-8-v1-mkmlizer to finish
jic062-dpo-v1-8-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
jic062-dpo-v1-8-v1-mkmlizer: ║ _____ __ __ ║
jic062-dpo-v1-8-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
jic062-dpo-v1-8-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
jic062-dpo-v1-8-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
jic062-dpo-v1-8-v1-mkmlizer: ║ /___/ ║
jic062-dpo-v1-8-v1-mkmlizer: ║ ║
jic062-dpo-v1-8-v1-mkmlizer: ║ Version: 0.10.1 ║
jic062-dpo-v1-8-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
jic062-dpo-v1-8-v1-mkmlizer: ║ https://mk1.ai ║
jic062-dpo-v1-8-v1-mkmlizer: ║ ║
jic062-dpo-v1-8-v1-mkmlizer: ║ The license key for the current software has been verified as ║
jic062-dpo-v1-8-v1-mkmlizer: ║ belonging to: ║
jic062-dpo-v1-8-v1-mkmlizer: ║ ║
jic062-dpo-v1-8-v1-mkmlizer: ║ Chai Research Corp. ║
jic062-dpo-v1-8-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
jic062-dpo-v1-8-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
jic062-dpo-v1-8-v1-mkmlizer: ║ ║
jic062-dpo-v1-8-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
Connection pool is full, discarding connection: %s. Connection pool size: %s
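The recurring "Connection pool is full, discarding connection" warning is emitted by urllib3 whenever more concurrent requests target one host than the pool's maxsize allows, so surplus sockets are closed instead of being kept alive. It is harmless but noisy; if the client is built on requests (an assumption, the actual HTTP client is not visible in this log), the pool could be enlarged roughly like this:

import requests
from requests.adapters import HTTPAdapter

# Enlarge the per-host connection pool so concurrent requests can reuse
# sockets instead of having them discarded (sizes here are illustrative).
session = requests.Session()
adapter = HTTPAdapter(pool_connections=10, pool_maxsize=50)
session.mount("http://", adapter)
session.mount("https://", adapter)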
Failed to get response for submission mistralai-mistral-nemo_9330_v109: ('http://mistralai-mistral-nemo-9330-v109-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'request timeout')
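The submission above failed against a KServe-style v1/models/...:predict endpoint with a request timeout. A minimal sketch of how such a call and its timeout surface on the client side; the payload shape and timeout value are assumptions, since neither appears in the log:

import requests

URL = ("http://mistralai-mistral-nemo-9330-v109-predictor"
       ".tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict")

# Hypothetical request body; the schema the predictor expects is not in the log.
payload = {"text": "Hello", "max_tokens": 64}

try:
    response = requests.post(URL, json=payload, timeout=5)
    response.raise_for_status()
    print(response.json())
except requests.exceptions.Timeout:
    # Surfaces as the ('...:predict', 'request timeout') tuple reported above.
    print("request timeout")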
jic062-dpo-v1-8-v1-mkmlizer: Downloaded to shared memory in 47.385s
jic062-dpo-v1-8-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmprcbk3iyq, device:0
jic062-dpo-v1-8-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Connection pool is full, discarding connection: %s. Connection pool size: %s
jic062-dpo-v1-8-v1-mkmlizer: quantized model in 34.563s
jic062-dpo-v1-8-v1-mkmlizer: Processed model jic062/dpo-v1.8 in 81.948s
jic062-dpo-v1-8-v1-mkmlizer: creating bucket guanaco-mkml-models
jic062-dpo-v1-8-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
jic062-dpo-v1-8-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/jic062-dpo-v1-8-v1
jic062-dpo-v1-8-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/jic062-dpo-v1-8-v1/config.json
jic062-dpo-v1-8-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/jic062-dpo-v1-8-v1/special_tokens_map.json
jic062-dpo-v1-8-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/jic062-dpo-v1-8-v1/tokenizer_config.json
jic062-dpo-v1-8-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/jic062-dpo-v1-8-v1/tokenizer.json
jic062-dpo-v1-8-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/jic062-dpo-v1-8-v1/flywheel_model.0.safetensors
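The cp lines above copy each file from /dev/shm/model_cache into the guanaco-mkml-models bucket under the submission's prefix. The upload tool itself is not identifiable from the log; a rough boto3 equivalent, purely as a sketch:

import os
import boto3

s3 = boto3.client("s3")
bucket = "guanaco-mkml-models"
prefix = "jic062-dpo-v1-8-v1"
src = "/dev/shm/model_cache"

# Copy every file in the quantized model cache under the submission prefix.
for name in os.listdir(src):
    path = os.path.join(src, name)
    if os.path.isfile(path):
        s3.upload_file(path, bucket, f"{prefix}/{name}")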
jic062-dpo-v1-8-v1-mkmlizer:
Loading 0: 0%| | 0/363 [00:00<?, ?it/s]
Loading 0: 99%|█████████▊| 358/363 [00:14<00:00, 31.85it/s]
Job jic062-dpo-v1-8-v1-mkmlizer completed after 104.9s with status: succeeded
Stopping job with name jic062-dpo-v1-8-v1-mkmlizer
Pipeline stage MKMLizer completed in 106.60s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.11s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service jic062-dpo-v1-8-v1
Waiting for inference service jic062-dpo-v1-8-v1 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s
Inference service jic062-dpo-v1-8-v1 ready after 213.04375839233398s
Pipeline stage MKMLDeployer completed in 214.09s
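"Waiting for inference service ... to be ready" amounts to polling the KServe InferenceService until its Ready condition reports True (here after roughly 213 s). The deployer's actual polling code is not shown; one way to express the same wait with the official Kubernetes Python client, as an illustration:

import time
from kubernetes import client, config

def wait_for_inference_service(name, namespace, timeout=600, interval=5):
    """Poll a KServe InferenceService until its Ready condition is True."""
    config.load_kube_config()  # or load_incluster_config() inside the cluster
    api = client.CustomObjectsApi()
    deadline = time.time() + timeout
    while time.time() < deadline:
        isvc = api.get_namespaced_custom_object(
            group="serving.kserve.io", version="v1beta1",
            namespace=namespace, plural="inferenceservices", name=name)
        conditions = isvc.get("status", {}).get("conditions", [])
        if any(c.get("type") == "Ready" and c.get("status") == "True"
               for c in conditions):
            return True
        time.sleep(interval)
    raise TimeoutError(f"{name} not ready after {timeout}s")

# wait_for_inference_service("jic062-dpo-v1-8-v1", "tenant-chaiml-guanaco")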
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.5195460319519043s
Received healthy response to inference request in 1.9130589962005615s
Received healthy response to inference request in 2.4514896869659424s
Received healthy response to inference request in 2.9873502254486084s
Received healthy response to inference request in 2.2687504291534424s
5 requests
0 failed requests
5th percentile: 1.9841972827911376
10th percentile: 2.0553355693817137
20th percentile: 2.1976121425628663
30th percentile: 2.3052982807159426
40th percentile: 2.3783939838409425
50th percentile: 2.4514896869659424
60th percentile: 2.478712224960327
70th percentile: 2.5059347629547117
80th percentile: 2.613106870651245
90th percentile: 2.800228548049927
95th percentile: 2.8937893867492677
99th percentile: 2.9686380577087403
mean time: 2.428039073944092
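The percentile and mean figures reported by the StressChecker follow directly from the five latencies above; numpy.percentile with its default linear interpolation reproduces them exactly:

import numpy as np

# The five healthy-response latencies (seconds) logged above.
latencies = [2.5195460319519043, 1.9130589962005615, 2.4514896869659424,
             2.9873502254486084, 2.2687504291534424]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print(f"mean time: {np.mean(latencies)}")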
Pipeline stage StressChecker completed in 14.20s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 4.47s
Shutdown handler de-registered
jic062-dpo-v1-8_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.15s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.14s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service jic062-dpo-v1-8-v1-profiler
Waiting for inference service jic062-dpo-v1-8-v1-profiler to be ready
Inference service jic062-dpo-v1-8-v1-profiler ready after 210.49563670158386s
Pipeline stage MKMLProfilerDeployer completed in 210.92s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/jic062-dpo-v1-8-v1-profiler-predictor-00001-deployment-6f4f74qs:/code/chaiverse_profiler_1727207243 --namespace tenant-chaiml-guanaco
kubectl exec -it jic062-dpo-v1-8-v1-profiler-predictor-00001-deployment-6f4f74qs --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1727207243 && python profiles.py profile --best_of_n 8 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 1024 --output_tokens 64 --summary /code/chaiverse_profiler_1727207243/summary.json'
kubectl exec -it jic062-dpo-v1-8-v1-profiler-predictor-00001-deployment-6f4f74qs --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1727207243/summary.json'
Pipeline stage MKMLProfilerRunner completed in 1183.03s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service jic062-dpo-v1-8-v1-profiler is running
Tearing down inference service jic062-dpo-v1-8-v1-profiler
Service jic062-dpo-v1-8-v1-profiler has been torn down
Pipeline stage MKMLProfilerDeleter completed in 2.24s
Shutdown handler de-registered
jic062-dpo-v1-8_v1 status is now inactive due to auto-deactivation of underperforming models
jic062-dpo-v1-8_v1 status is now torndown due to DeploymentManager action