Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-16-full-12-429-v3-mkmlizer
Waiting for job on junhua024-chai-16-full-12-429-v3-mkmlizer to finish
junhua024-chai-16-full-12-429-v3-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ belonging to: ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-16-full-12-429-v3-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
junhua024-chai-16-full-12-429-v3-mkmlizer: Downloaded to shared memory in 149.718s
junhua024-chai-16-full-12-429-v3-mkmlizer: Checking if junhua024/chai_16_full_12_o_ffn_1925 already exists in ChaiML
junhua024-chai-16-full-12-429-v3-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpf_13ykc9, device:0
junhua024-chai-16-full-12-429-v3-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Failed to get response for submission chaiml-nis-qwen32b-sim_98336_v34: HTTPConnectionPool(host='chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-16-full-12-429-v3-mkmlizer: quantized model in 32.241s
junhua024-chai-16-full-12-429-v3-mkmlizer: Processed model junhua024/chai_16_full_12_o_ffn_1925 in 182.049s
junhua024-chai-16-full-12-429-v3-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-16-full-12-429-v3-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-16-full-12-429-v3-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-16-full-12-429-v3/nvidia
junhua024-chai-16-full-12-429-v3-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-16-full-12-429-v3/nvidia/config.json
junhua024-chai-16-full-12-429-v3-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-16-full-12-429-v3/nvidia/special_tokens_map.json
junhua024-chai-16-full-12-429-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-16-full-12-429-v3/nvidia/tokenizer_config.json
junhua024-chai-16-full-12-429-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-16-full-12-429-v3/nvidia/tokenizer.json
junhua024-chai-16-full-12-429-v3-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-16-full-12-429-v3/nvidia/flywheel_model.0.safetensors
Job junhua024-chai-16-full-12-429-v3-mkmlizer completed after 211.21s with status: succeeded
Stopping job with name junhua024-chai-16-full-12-429-v3-mkmlizer
Pipeline stage MKMLizer completed in 211.76s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.15s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-16-full-12-429-v3
Waiting for inference service junhua024-chai-16-full-12-429-v3 to be ready
Retrying (%r) after connection broken by '%r': %s
Inference service junhua024-chai-16-full-12-429-v3 ready after 331.46095967292786s
Pipeline stage MKMLDeployer completed in 331.90s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.4987235069274902s
Received healthy response to inference request in 1.6631414890289307s
Received healthy response to inference request in 1.6433894634246826s
Received healthy response to inference request in 1.542820930480957s
Received healthy response to inference request in 2.034693956375122s
5 requests
0 failed requests
5th percentile: 1.5629346370697021
10th percentile: 1.5830483436584473
20th percentile: 1.6232757568359375
30th percentile: 1.6473398685455323
40th percentile: 1.6552406787872314
50th percentile: 1.6631414890289307
60th percentile: 1.8117624759674071
70th percentile: 1.9603834629058836
80th percentile: 2.127499866485596
90th percentile: 2.313111686706543
95th percentile: 2.4059175968170163
99th percentile: 2.4801623249053955
mean time: 1.8765538692474366
Pipeline stage StressChecker completed in 11.09s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.65s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.72s
Shutdown handler de-registered
junhua024-chai-16-full-12_429_v3 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 4892.01s
Shutdown handler de-registered
junhua024-chai-16-full-12_429_v3 status is now inactive due to auto deactivation removed underperforming models
junhua024-chai-16-full-12_429_v3 status is now torndown due to DeploymentManager action