Shutdown handler not registered because Python interpreter is not running in the main thread
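This warning is standard CPython behaviour: signal handlers can only be installed from the main thread, and signal.signal() raises ValueError anywhere else, so pipeline runners guard the registration. A minimal sketch of such a guard, with a hypothetical helper name (this is not the actual pipeline code):

    import signal
    import threading

    def register_shutdown_handler(handler):
        # signal.signal() raises ValueError outside the main thread,
        # so skip registration there and log the fact instead.
        if threading.current_thread() is not threading.main_thread():
            print("Shutdown handler not registered because Python "
                  "interpreter is not running in the main thread")
            return
        signal.signal(signal.SIGTERM, handler)
        signal.signal(signal.SIGINT, handler)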
run pipeline %s
run pipeline stage %s
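The literal %s markers in these lines are unfilled logging templates: the runner logs with lazy %-style formatting, and when the argument is omitted (or dropped by the log capture) the template string is emitted verbatim. A sketch of the pattern, with hypothetical names:

    import logging

    logging.basicConfig(level=logging.INFO, format="%(message)s")
    logger = logging.getLogger("pipeline")

    def run_stage(name, fn):
        # With an argument, %s is substituted lazily by the logger;
        # without one, the literal "run pipeline stage %s" is printed,
        # which is what appears throughout this log.
        logger.info("run pipeline stage %s", name)
        return fn()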
Running pipeline stage MKMLizer
Starting job with name mistralai-ministral-8b-47735-v7-mkmlizer
Waiting for job on mistralai-ministral-8b-47735-v7-mkmlizer to finish
mistralai-ministral-8b-47735-v7-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
mistralai-ministral-8b-47735-v7-mkmlizer: ║ [flywheel ASCII-art wordmark] ║
mistralai-ministral-8b-47735-v7-mkmlizer: ║ ║
mistralai-ministral-8b-47735-v7-mkmlizer: ║ Version: 0.12.8 ║
mistralai-ministral-8b-47735-v7-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
mistralai-ministral-8b-47735-v7-mkmlizer: ║ https://mk1.ai ║
mistralai-ministral-8b-47735-v7-mkmlizer: ║ ║
mistralai-ministral-8b-47735-v7-mkmlizer: ║ The license key for the current software has been verified as ║
mistralai-ministral-8b-47735-v7-mkmlizer: ║ belonging to: ║
mistralai-ministral-8b-47735-v7-mkmlizer: ║ ║
mistralai-ministral-8b-47735-v7-mkmlizer: ║ Chai Research Corp. ║
mistralai-ministral-8b-47735-v7-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
mistralai-ministral-8b-47735-v7-mkmlizer: ║ Expiration: 2025-04-15 23:59:59 ║
mistralai-ministral-8b-47735-v7-mkmlizer: ║ ║
mistralai-ministral-8b-47735-v7-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
mistralai-ministral-8b-47735-v7-mkmlizer: Downloaded to shared memory in 38.097s
mistralai-ministral-8b-47735-v7-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp3txtb6zx, device:0
mistralai-ministral-8b-47735-v7-mkmlizer: Saving flywheel model at /dev/shm/model_cache
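"Shared memory" here is /dev/shm, a RAM-backed tmpfs: staging the checkpoint there lets the quantizer read weights at memory speed rather than disk speed. A sketch of that staging step using huggingface_hub, assuming a hypothetical target directory (the mkmlizer's actual internals are not shown in this log):

    from huggingface_hub import snapshot_download

    # /dev/shm is tmpfs, so the quantization pass that follows reads
    # the checkpoint from RAM rather than from disk.
    path = snapshot_download(
        repo_id="mistralai/Ministral-8B-Instruct-2410",
        local_dir="/dev/shm/model_in",  # hypothetical staging directory
    )
    print(f"Downloaded to shared memory: {path}")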
mistralai-ministral-8b-47735-v7-mkmlizer:
Loading 0: 0%| | 0/327 [00:00<?, ?it/s]
Loading 0: 99%|█████████▉| 323/327 [00:08<00:00, 42.01it/s]
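The "Loading 0" lines are a tqdm progress bar over the checkpoint's 327 weight tensors, flushed line by line into the captured log. A sketch that reproduces output in this format, with a stand-in for the real per-tensor work:

    import time
    from tqdm import tqdm

    def load_tensor(idx):
        # Stand-in for the real per-tensor load/quantize step.
        time.sleep(0.001)

    # desc="Loading 0" yields lines like "Loading 0:  2%|...| 5/327 ...".
    for idx in tqdm(range(327), desc="Loading 0"):
        load_tensor(idx)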
You are using the default legacy behaviour of the <class 'transformers.models.llama.tokenization_llama_fast.LlamaTokenizerFast'>. This is expected, and simply means that the `legacy` (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set `legacy=False`. This should only be set if you understand what it means, and thoroughly read the reason why this was added as explained in https://github.com/huggingface/transformers/pull/24565 - if you loaded a llama tokenizer from a GGUF file you can ignore this message.
mistralai-ministral-8b-47735-v7-mkmlizer: quantized model in 29.355s
mistralai-ministral-8b-47735-v7-mkmlizer: Processed model mistralai/Ministral-8B-Instruct-2410 in 67.453s
mistralai-ministral-8b-47735-v7-mkmlizer: creating bucket guanaco-mkml-models
mistralai-ministral-8b-47735-v7-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
mistralai-ministral-8b-47735-v7-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/mistralai-ministral-8b-47735-v7
mistralai-ministral-8b-47735-v7-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/mistralai-ministral-8b-47735-v7/config.json
mistralai-ministral-8b-47735-v7-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/mistralai-ministral-8b-47735-v7/special_tokens_map.json
mistralai-ministral-8b-47735-v7-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/mistralai-ministral-8b-47735-v7/tokenizer_config.json
mistralai-ministral-8b-47735-v7-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/mistralai-ministral-8b-47735-v7/tokenizer.json
Failed to get response for submission chaiml-20250219-c-4epoc_80755_v1: HTTPConnectionPool(host='chaiml-20250219-c-4epoc-80755-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
mistralai-ministral-8b-47735-v7-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/mistralai-ministral-8b-47735-v7/flywheel_model.0.safetensors
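The cp lines mirror /dev/shm/model_cache into the guanaco-mkml-models bucket. The exact upload tool is not identified in the log; a boto3 sketch of the equivalent transfer:

    import os
    import boto3

    s3 = boto3.client("s3")
    src = "/dev/shm/model_cache"
    bucket = "guanaco-mkml-models"
    prefix = "mistralai-ministral-8b-47735-v7"

    for name in os.listdir(src):
        # Covers config.json, the tokenizer files and the
        # flywheel_model.0.safetensors shard listed above.
        s3.upload_file(os.path.join(src, name), bucket, f"{prefix}/{name}")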
Job mistralai-ministral-8b-47735-v7-mkmlizer completed after 94.14s with status: succeeded
Stopping job with name mistralai-ministral-8b-47735-v7-mkmlizer
Pipeline stage MKMLizer completed in 94.59s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.14s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service mistralai-ministral-8b-47735-v7
Waiting for inference service mistralai-ministral-8b-47735-v7 to be ready
Inference service mistralai-ministral-8b-47735-v7 ready after 190.91955280303955s
Pipeline stage MKMLDeployer completed in 191.67s
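The deployer blocks until the inference service reports ready, which took about 191 s here. A hedged sketch of such a readiness poll; the URL and helper are hypothetical, not the platform's actual API:

    import time
    import requests

    def wait_for_ready(url, timeout=600.0, interval=5.0):
        # Poll a health endpoint until it returns 200 or time runs out.
        start = time.monotonic()
        while time.monotonic() - start < timeout:
            try:
                if requests.get(url, timeout=5).status_code == 200:
                    return time.monotonic() - start
            except requests.RequestException:
                pass
            time.sleep(interval)
        raise TimeoutError(f"{url} not ready after {timeout}s")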
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.8739869594573975s
Received healthy response to inference request in 1.1879801750183105s
Received healthy response to inference request in 2.6751668453216553s
Received healthy response to inference request in 1.0872218608856201s
Received healthy response to inference request in 1.1862781047821045s
5 requests
0 failed requests
5th percentile: 1.107033109664917
10th percentile: 1.1268443584442138
20th percentile: 1.1664668560028075
30th percentile: 1.1866185188293457
40th percentile: 1.1872993469238282
50th percentile: 1.1879801750183105
60th percentile: 1.4623828887939452
70th percentile: 1.73678560256958
80th percentile: 2.034222936630249
90th percentile: 2.354694890975952
95th percentile: 2.5149308681488036
99th percentile: 2.643119649887085
mean time: 1.6021267890930175
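These statistics are consistent with numpy's default linear-interpolation percentiles over the five response times; whether StressChecker uses numpy itself is an assumption. Recomputing reproduces the log's figures exactly:

    import numpy as np

    latencies = [1.8739869594573975, 1.1879801750183105,
                 2.6751668453216553, 1.0872218608856201,
                 1.1862781047821045]

    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        # e.g. p=5 prints 1.107033109664917, matching the log.
        print(f"{p}th percentile: {np.percentile(latencies, p)}")
    print(f"mean time: {np.mean(latencies)}")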
Pipeline stage StressChecker completed in 9.13s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.60s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.60s
Shutdown handler de-registered
mistralai-ministral-8b-_47735_v7 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 2276.79s
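The scorer fans the per-sample evaluation out over a thread pool (the %s placeholders hide the model name and thread count). A sketch of that pattern with a hypothetical per-sample classifier:

    from concurrent.futures import ThreadPoolExecutor

    def score_sample(text: str) -> float:
        # Hypothetical family-friendly classifier for one sample.
        return 1.0

    def evaluate(samples, num_threads=8):
        # Fan out scoring across threads, then average the results.
        with ThreadPoolExecutor(max_workers=num_threads) as pool:
            scores = list(pool.map(score_sample, samples))
        return sum(scores) / len(scores)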
Shutdown handler de-registered
mistralai-ministral-8b-_47735_v7 status is now inactive due to auto deactivation of underperforming models
mistralai-ministral-8b-_47735_v7 status is now torndown due to DeploymentManager action