Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name alexdaoud-trainer-bagir-7252-v1-mkmlizer
Waiting for job on alexdaoud-trainer-bagir-7252-v1-mkmlizer to finish
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ║ _____ __ __ ║
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ║ /___/ ║
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ║ ║
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ║ Version: 0.11.12 ║
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ║ https://mk1.ai ║
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ║ ║
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ║ The license key for the current software has been verified as ║
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ║ belonging to: ║
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ║ ║
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ║ Chai Research Corp. ║
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ║ Expiration: 2025-01-15 23:59:59 ║
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ║ ║
alexdaoud-trainer-bagir-7252-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
alexdaoud-trainer-bagir-7252-v1-mkmlizer: Downloaded to shared memory in 26.252s
alexdaoud-trainer-bagir-7252-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:t0, folder:/tmp/tmpchxae0d3, device:0
alexdaoud-trainer-bagir-7252-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
alexdaoud-trainer-bagir-7252-v1-mkmlizer: quantized model in 87.444s
alexdaoud-trainer-bagir-7252-v1-mkmlizer: Processed model alexdaoud/trainer_bagir_2024-12-11-checkpoint-8 in 113.696s
alexdaoud-trainer-bagir-7252-v1-mkmlizer: creating bucket guanaco-mkml-models
alexdaoud-trainer-bagir-7252-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
alexdaoud-trainer-bagir-7252-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/alexdaoud-trainer-bagir-7252-v1
alexdaoud-trainer-bagir-7252-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/alexdaoud-trainer-bagir-7252-v1/special_tokens_map.json
alexdaoud-trainer-bagir-7252-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/alexdaoud-trainer-bagir-7252-v1/config.json
alexdaoud-trainer-bagir-7252-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/alexdaoud-trainer-bagir-7252-v1/tokenizer_config.json
alexdaoud-trainer-bagir-7252-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/alexdaoud-trainer-bagir-7252-v1/tokenizer.json
alexdaoud-trainer-bagir-7252-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/alexdaoud-trainer-bagir-7252-v1/flywheel_model.0.safetensors
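The `cp` lines above copy each artifact from the shared-memory model cache into the model bucket, one file at a time. As a rough illustration only (a hypothetical helper, with the paths and filenames taken from the log, not the mkmlizer's actual upload code), the local-path-to-S3-URI mapping could be built like this:

```python
def s3_upload_plan(files, local_dir, bucket, prefix):
    """Map each artifact in the local model cache to its destination S3 URI."""
    return [(f"{local_dir}/{name}", f"s3://{bucket}/{prefix}/{name}")
            for name in files]

# Filenames, cache directory, bucket, and prefix as reported in the log above.
plan = s3_upload_plan(
    ["special_tokens_map.json", "config.json", "tokenizer_config.json",
     "tokenizer.json", "flywheel_model.0.safetensors"],
    local_dir="/dev/shm/model_cache",
    bucket="guanaco-mkml-models",
    prefix="alexdaoud-trainer-bagir-7252-v1",
)
```

Each `(local, remote)` pair in `plan` corresponds to one `cp` line in the log.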
Job alexdaoud-trainer-bagir-7252-v1-mkmlizer completed after 145.87s with status: succeeded
Stopping job with name alexdaoud-trainer-bagir-7252-v1-mkmlizer
Pipeline stage MKMLizer completed in 146.41s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.17s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service alexdaoud-trainer-bagir-7252-v1
Waiting for inference service alexdaoud-trainer-bagir-7252-v1 to be ready
Inference service alexdaoud-trainer-bagir-7252-v1 ready after 180.59589219093323s
Pipeline stage MKMLDeployer completed in 181.16s
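The "Waiting for inference service ... to be ready" step above is a readiness poll that reports the elapsed time once the service comes up. A minimal sketch of that pattern (hypothetical names and defaults, not the deployer's actual code):

```python
import time

def wait_for_ready(is_ready, timeout_s=600.0, interval_s=5.0,
                   clock=time.monotonic, sleep=time.sleep):
    """Poll is_ready() until it returns True; raise if timeout_s elapses first.

    Returns the elapsed time in seconds, matching log lines like
    'Inference service ... ready after 180.59s'.
    """
    start = clock()
    while not is_ready():
        if clock() - start > timeout_s:
            raise TimeoutError("inference service did not become ready in time")
        sleep(interval_s)
    return clock() - start
```

The injectable `clock` and `sleep` parameters are purely for testability of the sketch.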
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.3702566623687744s
Received healthy response to inference request in 5.511565685272217s
Received healthy response to inference request in 2.9879415035247803s
Received healthy response to inference request in 3.855393886566162s
Received healthy response to inference request in 3.2701311111450195s
5 requests
0 failed requests
5th percentile: 2.4937936305999755
10th percentile: 2.6173305988311766
20th percentile: 2.864404535293579
30th percentile: 3.0443794250488283
40th percentile: 3.157255268096924
50th percentile: 3.2701311111450195
60th percentile: 3.5042362213134766
70th percentile: 3.7383413314819336
80th percentile: 4.186628246307373
90th percentile: 4.849096965789795
95th percentile: 5.1803313255310055
99th percentile: 5.445318813323975
mean time: 3.5990577697753907
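The StressChecker summary above is consistent with linearly interpolated percentiles over the five logged response times (numpy's default percentile method). A self-contained sketch reproducing those figures, purely to show how the summary could be derived (not necessarily the checker's own code):

```python
def percentile(xs, q):
    """Linearly interpolated percentile of xs at q (0-100)."""
    s = sorted(xs)
    pos = (q / 100) * (len(s) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (pos - lo) * (s[hi] - s[lo])

# The five healthy response times reported above, in seconds.
latencies = [2.3702566623687744, 5.511565685272217, 2.9879415035247803,
             3.855393886566162, 3.2701311111450195]

p05 = percentile(latencies, 5)          # 5th percentile from the log
p50 = percentile(latencies, 50)         # median; equals the middle sample
mean = sum(latencies) / len(latencies)  # 'mean time' from the log
```

With five samples the 50th percentile is simply the middle value once sorted, which is why it equals one of the raw response times above.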
%s, retrying in %s seconds...
Received healthy response to inference request in 3.067051887512207s
Received healthy response to inference request in 3.29732084274292s
Received healthy response to inference request in 3.3939638137817383s
Received healthy response to inference request in 2.25795841217041s
Received healthy response to inference request in 3.199925422668457s
5 requests
0 failed requests
5th percentile: 2.4197771072387697
10th percentile: 2.581595802307129
20th percentile: 2.9052331924438475
30th percentile: 3.093626594543457
40th percentile: 3.146776008605957
50th percentile: 3.199925422668457
60th percentile: 3.2388835906982423
70th percentile: 3.2778417587280275
80th percentile: 3.3166494369506836
90th percentile: 3.355306625366211
95th percentile: 3.3746352195739746
99th percentile: 3.3900980949401855
mean time: 3.0432440757751467
Pipeline stage StressChecker completed in 36.05s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 2.22s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 2.05s
Shutdown handler de-registered
alexdaoud-trainer-bagir-_7252_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.10s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.09s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service alexdaoud-trainer-bagir-7252-v1-profiler
Waiting for inference service alexdaoud-trainer-bagir-7252-v1-profiler to be ready
Inference service alexdaoud-trainer-bagir-7252-v1-profiler ready after 190.42727208137512s
Pipeline stage MKMLProfilerDeployer completed in 190.74s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/alexdaoud-trainer-ba1e08d7d05112ebf68eca334b0d221f73-deplo4gkwx:/code/chaiverse_profiler_1733990297 --namespace tenant-chaiml-guanaco
kubectl exec -it alexdaoud-trainer-ba1e08d7d05112ebf68eca334b0d221f73-deplo4gkwx --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1733990297 && python profiles.py profile --best_of_n 1 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 256 --output_tokens 1 --summary /code/chaiverse_profiler_1733990297/summary.json'
kubectl exec -it alexdaoud-trainer-ba1e08d7d05112ebf68eca334b0d221f73-deplo4gkwx --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1733990297/summary.json'
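The long `--batches` argument passed to `profiles.py` above follows a simple pattern: 1, then every multiple of 5 up to 195. A one-liner reproducing it, purely as an illustration of how the argument is presumably generated:

```python
# Batch sizes to profile: a single-request baseline, then steps of 5 up to 195.
batches = [1] + list(range(5, 200, 5))
batches_arg = ",".join(map(str, batches))
```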
%s, retrying in %s seconds...
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/alexdaoud-trainer-ba1e08d7d05112ebf68eca334b0d221f73-deplo4gkwx:/code/chaiverse_profiler_1733992608 --namespace tenant-chaiml-guanaco
kubectl exec -it alexdaoud-trainer-ba1e08d7d05112ebf68eca334b0d221f73-deplo4gkwx --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1733992608 && python profiles.py profile --best_of_n 1 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 256 --output_tokens 1 --summary /code/chaiverse_profiler_1733992608/summary.json'
%s, retrying in %s seconds...
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/alexdaoud-trainer-ba1e08d7d05112ebf68eca334b0d221f73-deplo4gkwx:/code/chaiverse_profiler_1733992919 --namespace tenant-chaiml-guanaco
kubectl exec -it alexdaoud-trainer-ba1e08d7d05112ebf68eca334b0d221f73-deplo4gkwx --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1733992919 && python profiles.py profile --best_of_n 1 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 256 --output_tokens 1 --summary /code/chaiverse_profiler_1733992919/summary.json'
clean up pipeline due to error=ISVCScriptError('Command failed with error: Defaulted container "kserve-container" out of: kserve-container, queue-proxy\nUnable to use a TTY - input is not a terminal or the right kind of file\ncommand terminated with exit code 137\n, output: waiting for startup of TargetModel(endpoint=\'localhost\', route=\'GPT-J-6B-lit-v2\', namespace=\'tenant-chaiml-guanaco\', max_characters=9999, reward=False, url_format=\'{endpoint}-predictor-default.{namespace}.knative.ord1.coreweave.cloud\')\n')
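The un-interpolated "%s, retrying in %s seconds..." lines suggest the profiler commands are wrapped in a retry loop, with the pipeline cleaning up only after the final attempt fails (as happens with the ISVCScriptError above). A generic sketch of that pattern (hypothetical helper, not the actual pipeline code):

```python
import logging
import time

def run_with_retries(fn, attempts=3, delay_s=30.0, sleep=time.sleep):
    """Call fn(); on failure, log and retry, re-raising after the last attempt."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception as err:
            if attempt == attempts:
                raise  # caller cleans up the pipeline on final failure
            # Mirrors the log format string seen above.
            logging.warning("%s, retrying in %s seconds...", err, delay_s)
            sleep(delay_s)
```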
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service alexdaoud-trainer-bagir-7252-v1-profiler is running
Tearing down inference service alexdaoud-trainer-bagir-7252-v1-profiler
Service alexdaoud-trainer-bagir-7252-v1-profiler has been torn down
Pipeline stage MKMLProfilerDeleter completed in 1.97s
Shutdown handler de-registered
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service alexdaoud-trainer-bagir-7252-v1-profiler is running
Skipping teardown as no inference service was found
Pipeline stage MKMLProfilerDeleter completed in 2.18s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.12s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service alexdaoud-trainer-bagir-7252-v1-profiler
Waiting for inference service alexdaoud-trainer-bagir-7252-v1-profiler to be ready
Inference service alexdaoud-trainer-bagir-7252-v1-profiler ready after 40.10985565185547s
Pipeline stage MKMLProfilerDeployer completed in 40.46s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/alexdaoud-trainer-ba1e08d7d05112ebf68eca334b0d221f73-deplo7wsq7:/code/chaiverse_profiler_1733993119 --namespace tenant-chaiml-guanaco
kubectl exec -it alexdaoud-trainer-ba1e08d7d05112ebf68eca334b0d221f73-deplo7wsq7 --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1733993119 && python profiles.py profile --best_of_n 1 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 256 --output_tokens 1 --summary /code/chaiverse_profiler_1733993119/summary.json'
%s, retrying in %s seconds...
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/alexdaoud-trainer-ba1e08d7d05112ebf68eca334b0d221f73-deplo7wsq7:/code/chaiverse_profiler_1733995903 --namespace tenant-chaiml-guanaco
kubectl exec -it alexdaoud-trainer-ba1e08d7d05112ebf68eca334b0d221f73-deplo7wsq7 --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1733995903 && python profiles.py profile --best_of_n 1 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 256 --output_tokens 1 --summary /code/chaiverse_profiler_1733995903/summary.json'
Received signal 2, running shutdown handler
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service alexdaoud-trainer-bagir-7252-v1-profiler is running
Tearing down inference service alexdaoud-trainer-bagir-7252-v1-profiler
Service alexdaoud-trainer-bagir-7252-v1-profiler has been torn down
Pipeline stage MKMLProfilerDeleter completed in 2.14s
Shutdown handler de-registered
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service alexdaoud-trainer-bagir-7252-v1-profiler is running
Skipping teardown as no inference service was found
Pipeline stage MKMLProfilerDeleter completed in 2.10s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.13s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service alexdaoud-trainer-bagir-7252-v1-profiler
Waiting for inference service alexdaoud-trainer-bagir-7252-v1-profiler to be ready
Inference service alexdaoud-trainer-bagir-7252-v1-profiler ready after 110.33956599235535s
Pipeline stage MKMLProfilerDeployer completed in 110.69s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/alexdaoud-trainer-ba1e08d7d05112ebf68eca334b0d221f73-deplojfc8l:/code/chaiverse_profiler_1733996812 --namespace tenant-chaiml-guanaco
kubectl exec -it alexdaoud-trainer-ba1e08d7d05112ebf68eca334b0d221f73-deplojfc8l --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1733996812 && python profiles.py profile --best_of_n 1 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 256 --output_tokens 1 --summary /code/chaiverse_profiler_1733996812/summary.json'
%s, retrying in %s seconds...
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/alexdaoud-trainer-ba1e08d7d05112ebf68eca334b0d221f73-deplojfc8l:/code/chaiverse_profiler_1733999596 --namespace tenant-chaiml-guanaco
kubectl exec -it alexdaoud-trainer-ba1e08d7d05112ebf68eca334b0d221f73-deplojfc8l --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1733999596 && python profiles.py profile --best_of_n 1 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 256 --output_tokens 1 --summary /code/chaiverse_profiler_1733999596/summary.json'
Received signal 2, running shutdown handler
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service alexdaoud-trainer-bagir-7252-v1-profiler is running
Tearing down inference service alexdaoud-trainer-bagir-7252-v1-profiler
Service alexdaoud-trainer-bagir-7252-v1-profiler has been torn down
Pipeline stage MKMLProfilerDeleter completed in 1.87s
Shutdown handler de-registered
alexdaoud-trainer-bagir-_7252_v1 status is now inactive due to auto deactivation (removal of underperforming models)