Shutdown handler not registered because Python interpreter is not running in the main thread
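This first message is the standard guard: CPython only allows signal handlers to be installed from the main thread, so a worker thread has to skip registration. A minimal sketch of that guard, purely illustrative (the function name and handler wiring are assumptions, not the pipeline's actual code):

    import logging
    import signal
    import threading

    logger = logging.getLogger(__name__)

    def register_shutdown_handler(handler):
        # signal.signal() raises ValueError outside the main thread,
        # so only install the handler when we are actually in it.
        if threading.current_thread() is threading.main_thread():
            signal.signal(signal.SIGTERM, handler)
            signal.signal(signal.SIGINT, handler)
            logger.info("Shutdown handler registered")
        else:
            logger.info("Shutdown handler not registered because Python "
                        "interpreter is not running in the main thread")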
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name nitral-ai-captain-bmo-12b-v22-mkmlizer
Waiting for job on nitral-ai-captain-bmo-12b-v22-mkmlizer to finish
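The MKMLizer step runs as a named batch job that the pipeline starts and then blocks on. Assuming a Kubernetes Job and the official kubernetes Python client, the wait loop can be sketched roughly as follows (the namespace and polling interval are assumptions):

    import time
    from kubernetes import client, config

    def wait_for_job(name, namespace="default", poll_seconds=5):
        # Poll the Job status until it reports success or failure.
        config.load_incluster_config()   # assumes the pipeline runs inside the cluster
        batch = client.BatchV1Api()
        while True:
            status = batch.read_namespaced_job_status(name, namespace).status
            if status.succeeded:
                return "succeeded"
            if status.failed:
                return "failed"
            time.sleep(poll_seconds)

    wait_for_job("nitral-ai-captain-bmo-12b-v22-mkmlizer")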
nitral-ai-captain-bmo-12b-v22-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
nitral-ai-captain-bmo-12b-v22-mkmlizer: ║ _____ __ __ ║
nitral-ai-captain-bmo-12b-v22-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
nitral-ai-captain-bmo-12b-v22-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
nitral-ai-captain-bmo-12b-v22-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
nitral-ai-captain-bmo-12b-v22-mkmlizer: ║ /___/ ║
nitral-ai-captain-bmo-12b-v22-mkmlizer: ║ ║
nitral-ai-captain-bmo-12b-v22-mkmlizer: ║ Version: 0.11.12 ║
nitral-ai-captain-bmo-12b-v22-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
nitral-ai-captain-bmo-12b-v22-mkmlizer: ║ https://mk1.ai ║
nitral-ai-captain-bmo-12b-v22-mkmlizer: ║ ║
nitral-ai-captain-bmo-12b-v22-mkmlizer: ║ The license key for the current software has been verified as ║
nitral-ai-captain-bmo-12b-v22-mkmlizer: ║ belonging to: ║
nitral-ai-captain-bmo-12b-v22-mkmlizer: ║ ║
nitral-ai-captain-bmo-12b-v22-mkmlizer: ║ Chai Research Corp. ║
nitral-ai-captain-bmo-12b-v22-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
nitral-ai-captain-bmo-12b-v22-mkmlizer: ║ Expiration: 2025-04-15 23:59:59 ║
nitral-ai-captain-bmo-12b-v22-mkmlizer: ║ ║
nitral-ai-captain-bmo-12b-v22-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
nitral-ai-captain-bmo-12b-v22-mkmlizer: Downloaded to shared memory in 38.702s
nitral-ai-captain-bmo-12b-v22-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp09mgrwwe, device:0
nitral-ai-captain-bmo-12b-v22-mkmlizer: Saving flywheel model at /dev/shm/model_cache
nitral-ai-captain-bmo-12b-v22-mkmlizer: quantized model in 35.746s
nitral-ai-captain-bmo-12b-v22-mkmlizer: Processed model Nitral-AI/Captain_BMO-12B in 74.448s
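For reference, the two phases account for the reported total: 38.702 s (download) + 35.746 s (quantization) = 74.448 s of processing time.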
nitral-ai-captain-bmo-12b-v22-mkmlizer: creating bucket guanaco-mkml-models
nitral-ai-captain-bmo-12b-v22-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
nitral-ai-captain-bmo-12b-v22-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/nitral-ai-captain-bmo-12b-v22
nitral-ai-captain-bmo-12b-v22-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/nitral-ai-captain-bmo-12b-v22/config.json
nitral-ai-captain-bmo-12b-v22-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/nitral-ai-captain-bmo-12b-v22/special_tokens_map.json
nitral-ai-captain-bmo-12b-v22-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/nitral-ai-captain-bmo-12b-v22/tokenizer_config.json
nitral-ai-captain-bmo-12b-v22-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/nitral-ai-captain-bmo-12b-v22/tokenizer.json
nitral-ai-captain-bmo-12b-v22-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/nitral-ai-captain-bmo-12b-v22/flywheel_model.0.safetensors
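The upload step mirrors the shared-memory model cache into the model bucket, one object per file. A hedged boto3 sketch of the same copy (the helper name is made up; the pipeline may equally use the AWS CLI or another S3 client):

    import os
    import boto3

    def upload_model_cache(local_dir="/dev/shm/model_cache",
                           bucket="guanaco-mkml-models",
                           prefix="nitral-ai-captain-bmo-12b-v22"):
        s3 = boto3.client("s3")
        # Walk the cache directory and upload every file under the model prefix.
        for root, _dirs, files in os.walk(local_dir):
            for name in files:
                path = os.path.join(root, name)
                key = f"{prefix}/{os.path.relpath(path, local_dir)}"
                s3.upload_file(path, bucket, key)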
nitral-ai-captain-bmo-12b-v22-mkmlizer:
Loading 0: 0%| | 0/363 [00:00<?, ?it/s] … 99%|█████████▊| 358/363 [00:14<00:00, 46.45it/s]  (intermediate tqdm progress updates collapsed)
Job nitral-ai-captain-bmo-12b-v22-mkmlizer completed after 104.24s with status: succeeded
Stopping job with name nitral-ai-captain-bmo-12b-v22-mkmlizer
Pipeline stage MKMLizer completed in 104.74s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.16s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service nitral-ai-captain-bmo-12b-v22
Waiting for inference service nitral-ai-captain-bmo-12b-v22 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s  (warning repeated 8 times)
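The "Connection pool is full" lines are urllib3's standard warning that more concurrent requests were opened against a single host (here presumably the Kubernetes API) than the pool is sized to cache; connections are discarded rather than reused, so it is noisy but harmless. For a client built on requests, the pool could be widened like this (illustrative only, not the pipeline's code):

    import requests
    from requests.adapters import HTTPAdapter

    session = requests.Session()
    adapter = HTTPAdapter(pool_connections=10, pool_maxsize=50)  # cache up to 50 connections per host
    session.mount("https://", adapter)
    session.mount("http://", adapter)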
Inference service nitral-ai-captain-bmo-12b-v22 ready after 361.9682354927063s
Pipeline stage MKMLDeployer completed in 362.52s
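Almost all of the 362.52 s deploy stage is the readiness wait above. Assuming the service is a KServe InferenceService, that wait amounts to polling the Ready condition on the custom resource; a rough sketch (group/version, namespace, and timeout are assumptions):

    import time
    from kubernetes import client, config

    def wait_until_ready(name, namespace="default", timeout=1800, poll_seconds=10):
        config.load_incluster_config()
        api = client.CustomObjectsApi()
        start = time.time()
        while time.time() - start < timeout:
            isvc = api.get_namespaced_custom_object(
                group="serving.kserve.io", version="v1beta1",
                namespace=namespace, plural="inferenceservices", name=name)
            conditions = isvc.get("status", {}).get("conditions", [])
            if any(c.get("type") == "Ready" and c.get("status") == "True" for c in conditions):
                return time.time() - start
            time.sleep(poll_seconds)
        raise TimeoutError(f"{name} not ready after {timeout}s")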
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.028761386871338s
Received healthy response to inference request in 1.5125854015350342s
Received healthy response to inference request in 1.448056697845459s
Received healthy response to inference request in 1.3565161228179932s
Received healthy response to inference request in 1.4426305294036865s
5 requests
0 failed requests
5th percentile: 1.3737390041351318
10th percentile: 1.3909618854522705
20th percentile: 1.4254076480865479
30th percentile: 1.443715763092041
40th percentile: 1.44588623046875
50th percentile: 1.448056697845459
60th percentile: 1.473868179321289
70th percentile: 1.4996796607971192
80th percentile: 1.615820598602295
90th percentile: 1.8222909927368165
95th percentile: 1.925526189804077
99th percentile: 2.0081143474578855
mean time: 1.5577100276947022
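The reported statistics are consistent with numpy's default linear-interpolation percentiles over the five latencies; the snippet below reproduces the 5th and 50th percentiles and the mean exactly (a reconstruction, not the checker's actual code):

    import numpy as np

    latencies = [2.028761386871338, 1.5125854015350342, 1.448056697845459,
                 1.3565161228179932, 1.4426305294036865]

    print(np.percentile(latencies, 5))   # 1.3737390041351318
    print(np.percentile(latencies, 50))  # 1.448056697845459
    print(np.mean(latencies))            # 1.5577100276947022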
Pipeline stage StressChecker completed in 9.55s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.73s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.89s
Shutdown handler de-registered
nitral-ai-captain-bmo-12b_v22 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
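The scorer fans the evaluation out across a thread pool. A minimal sketch of that pattern, with a placeholder per-sample scorer (score_sample and the averaging are assumptions; the real scoring model is not shown in the log):

    from concurrent.futures import ThreadPoolExecutor

    def score_sample(sample: str) -> float:
        # Placeholder classifier call; the actual family-friendly model is not in the log.
        return 0.0 if "unsafe" in sample.lower() else 1.0

    def family_friendly_score(samples, num_threads=8):
        # Evaluate samples concurrently and aggregate to a single score.
        with ThreadPoolExecutor(max_workers=num_threads) as pool:
            scores = list(pool.map(score_sample, samples))
        return sum(scores) / len(scores)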
Pipeline stage OfflineFamilyFriendlyScorer completed in 2587.23s
Shutdown handler de-registered
nitral-ai-captain-bmo-12b_v22 status is now inactive due to auto deactivation (removed underperforming models)
nitral-ai-captain-bmo-12b_v22 status is now torndown due to DeploymentManager action