Shutdown handler not registered because Python interpreter is not running in the main thread
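The message above comes from Python itself: signal handlers can only be installed from the main thread of the main interpreter, so registration is skipped when the pipeline runs in a worker thread. A minimal sketch of that guard (the pipeline's real handler code is not shown in this log, so the names below are illustrative):

    import signal
    import threading

    def register_shutdown_handler(handler):
        # signal.signal() raises ValueError outside the main thread of the
        # main interpreter, so skip registration instead of crashing.
        if threading.current_thread() is not threading.main_thread():
            print("Shutdown handler not registered because Python interpreter "
                  "is not running in the main thread")
            return
        signal.signal(signal.SIGTERM, handler)
        signal.signal(signal.SIGINT, handler)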
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name mistralai-mistral-nem-93303-v299-mkmlizer
Waiting for job on mistralai-mistral-nem-93303-v299-mkmlizer to finish
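The mkmlizer runs as a batch job that the pipeline polls until it reaches a terminal state (here it later reports "succeeded" after 124.41s). A hedged sketch of such a wait loop, assuming the job is a Kubernetes Job and using the official Kubernetes Python client; the namespace and poll interval are assumptions:

    import time
    from kubernetes import client, config

    def wait_for_job(name, namespace="tenant-chaiml-guanaco", poll_s=10):
        # Poll the Job status until Kubernetes reports success or failure.
        config.load_incluster_config()   # use load_kube_config() outside the cluster
        batch = client.BatchV1Api()
        while True:
            status = batch.read_namespaced_job(name, namespace).status
            if status.succeeded:
                return "succeeded"
            if status.failed:
                return "failed"
            time.sleep(poll_s)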
mistralai-mistral-nem-93303-v299-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
mistralai-mistral-nem-93303-v299-mkmlizer: ║ _____ __ __ ║
mistralai-mistral-nem-93303-v299-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
mistralai-mistral-nem-93303-v299-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
mistralai-mistral-nem-93303-v299-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
mistralai-mistral-nem-93303-v299-mkmlizer: ║ /___/ ║
mistralai-mistral-nem-93303-v299-mkmlizer: ║ ║
mistralai-mistral-nem-93303-v299-mkmlizer: ║ Version: 0.11.12 ║
mistralai-mistral-nem-93303-v299-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
mistralai-mistral-nem-93303-v299-mkmlizer: ║ https://mk1.ai ║
mistralai-mistral-nem-93303-v299-mkmlizer: ║ ║
mistralai-mistral-nem-93303-v299-mkmlizer: ║ The license key for the current software has been verified as ║
mistralai-mistral-nem-93303-v299-mkmlizer: ║ belonging to: ║
mistralai-mistral-nem-93303-v299-mkmlizer: ║ ║
mistralai-mistral-nem-93303-v299-mkmlizer: ║ Chai Research Corp. ║
mistralai-mistral-nem-93303-v299-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
mistralai-mistral-nem-93303-v299-mkmlizer: ║ Expiration: 2025-04-15 23:59:59 ║
mistralai-mistral-nem-93303-v299-mkmlizer: ║ ║
mistralai-mistral-nem-93303-v299-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
Failed to get response for submission mistralai-mistral-nem_93303_v290: HTTPConnectionPool(host='mistralai-mistral-nem-93303-v290-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
mistralai-mistral-nem-93303-v299-mkmlizer: Downloaded to shared memory in 55.020s
mistralai-mistral-nem-93303-v299-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpakfdlvf2, device:0
mistralai-mistral-nem-93303-v299-mkmlizer: Saving flywheel model at /dev/shm/model_cache
mistralai-mistral-nem-93303-v299-mkmlizer: quantized model in 38.334s
mistralai-mistral-nem-93303-v299-mkmlizer: Processed model mistralai/Mistral-Nemo-Instruct-2407 in 93.354s
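The steps above (download to shared memory, quantize, save the flywheel model) are performed by MK1's mkmlizer, whose internals are not public; only the download into tmpfs is sketched generically here with huggingface_hub, and the target path is illustrative:

    from huggingface_hub import snapshot_download

    # Pull the source checkpoint into tmpfs so later steps read from RAM rather
    # than disk; quantization to the flywheel format is MK1 tooling, not shown.
    snapshot_download(
        repo_id="mistralai/Mistral-Nemo-Instruct-2407",
        local_dir="/dev/shm/model_cache",
    )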
mistralai-mistral-nem-93303-v299-mkmlizer: creating bucket guanaco-mkml-models
mistralai-mistral-nem-93303-v299-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
mistralai-mistral-nem-93303-v299-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/mistralai-mistral-nem-93303-v299
mistralai-mistral-nem-93303-v299-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/mistralai-mistral-nem-93303-v299/config.json
mistralai-mistral-nem-93303-v299-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/mistralai-mistral-nem-93303-v299/special_tokens_map.json
mistralai-mistral-nem-93303-v299-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/mistralai-mistral-nem-93303-v299/tokenizer_config.json
mistralai-mistral-nem-93303-v299-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/mistralai-mistral-nem-93303-v299/tokenizer.json
mistralai-mistral-nem-93303-v299-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/mistralai-mistral-nem-93303-v299/flywheel_model.0.safetensors
mistralai-mistral-nem-93303-v299-mkmlizer:
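The per-file "cp ... s3://..." lines above mirror the quantized model cache into the guanaco-mkml-models bucket. The exact upload tool is not named in the log; a minimal boto3 sketch of the same mirroring, assuming a flat file layout and default credentials:

    import os
    import boto3

    def upload_model_dir(local_dir, bucket, prefix):
        # Copy each file in the cache to s3://bucket/prefix/<name>.
        s3 = boto3.client("s3")
        for name in sorted(os.listdir(local_dir)):
            s3.upload_file(os.path.join(local_dir, name), bucket, f"{prefix}/{name}")

    upload_model_dir("/dev/shm/model_cache",
                     "guanaco-mkml-models",
                     "mistralai-mistral-nem-93303-v299")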
Loading 0: 0%| | 0/363 [00:00<?, ?it/s]
Loading 0: 1%|▏ | 5/363 [00:00<00:12, 29.03it/s]
Loading 0: 4%|▎ | 13/363 [00:00<00:07, 48.21it/s]
Loading 0: 5%|▌ | 19/363 [00:00<00:08, 39.00it/s]
Loading 0: 7%|▋ | 24/363 [00:00<00:09, 35.97it/s]
Loading 0: 8%|▊ | 28/363 [00:00<00:09, 36.88it/s]
Loading 0: 9%|▉ | 32/363 [00:00<00:09, 34.25it/s]
Loading 0: 11%|█ | 39/363 [00:00<00:07, 42.57it/s]
Loading 0: 12%|█▏ | 44/363 [00:01<00:07, 42.72it/s]
Loading 0: 13%|█▎ | 49/363 [00:01<00:07, 43.61it/s]
Loading 0: 15%|█▍ | 54/363 [00:01<00:07, 44.02it/s]
Loading 0: 16%|█▋ | 59/363 [00:01<00:06, 43.71it/s]
Loading 0: 18%|█▊ | 64/363 [00:01<00:12, 23.35it/s]
Loading 0: 20%|█▉ | 71/363 [00:02<00:09, 30.51it/s]
Loading 0: 21%|██ | 76/363 [00:02<00:08, 32.44it/s]
Loading 0: 22%|██▏ | 81/363 [00:02<00:07, 35.60it/s]
Loading 0: 24%|██▎ | 86/363 [00:02<00:07, 36.72it/s]
Loading 0: 25%|██▌ | 91/363 [00:02<00:09, 27.75it/s]
Loading 0: 27%|██▋ | 98/363 [00:02<00:07, 35.28it/s]
Loading 0: 28%|██▊ | 103/363 [00:02<00:07, 36.55it/s]
Loading 0: 30%|██▉ | 108/363 [00:02<00:06, 38.57it/s]
Loading 0: 31%|███ | 113/363 [00:03<00:07, 31.95it/s]
Loading 0: 33%|███▎ | 118/363 [00:03<00:08, 30.61it/s]
Loading 0: 34%|███▍ | 123/363 [00:03<00:07, 34.04it/s]
Loading 0: 35%|███▍ | 127/363 [00:03<00:07, 31.28it/s]
Loading 0: 37%|███▋ | 134/363 [00:03<00:06, 37.55it/s]
Loading 0: 38%|███▊ | 139/363 [00:03<00:06, 36.36it/s]
Loading 0: 39%|███▉ | 143/363 [00:04<00:09, 24.40it/s]
Loading 0: 40%|████ | 147/363 [00:04<00:08, 25.81it/s]
Loading 0: 42%|████▏ | 151/363 [00:04<00:07, 27.42it/s]
Loading 0: 43%|████▎ | 156/363 [00:04<00:06, 30.96it/s]
Loading 0: 44%|████▍ | 160/363 [00:04<00:06, 32.68it/s]
Loading 0: 46%|████▌ | 166/363 [00:04<00:05, 38.24it/s]
Loading 0: 47%|████▋ | 172/363 [00:05<00:04, 38.66it/s]
Loading 0: 49%|████▉ | 177/363 [00:05<00:04, 39.14it/s]
Loading 0: 51%|█████ | 184/363 [00:05<00:03, 45.03it/s]
Loading 0: 52%|█████▏ | 190/363 [00:05<00:03, 43.80it/s]
Loading 0: 54%|█████▎ | 195/363 [00:05<00:03, 43.08it/s]
Loading 0: 55%|█████▌ | 201/363 [00:05<00:03, 46.52it/s]
Loading 0: 57%|█████▋ | 206/363 [00:05<00:03, 45.53it/s]
Loading 0: 58%|█████▊ | 211/363 [00:05<00:03, 46.02it/s]
Loading 0: 60%|█████▉ | 217/363 [00:05<00:03, 45.51it/s]
Loading 0: 61%|██████▏ | 223/363 [00:06<00:04, 33.60it/s]
Loading 0: 63%|██████▎ | 227/363 [00:06<00:03, 34.44it/s]
Loading 0: 64%|██████▎ | 231/363 [00:06<00:03, 34.02it/s]
Loading 0: 65%|██████▌ | 237/363 [00:06<00:03, 39.35it/s]
Loading 0: 67%|██████▋ | 242/363 [00:06<00:02, 40.96it/s]
Loading 0: 68%|██████▊ | 247/363 [00:06<00:02, 42.69it/s]
Loading 0: 70%|██████▉ | 253/363 [00:06<00:02, 41.72it/s]
Loading 0: 71%|███████ | 258/363 [00:07<00:02, 38.76it/s]
Loading 0: 72%|███████▏ | 263/363 [00:07<00:02, 39.64it/s]
Loading 0: 74%|███████▍ | 268/363 [00:07<00:02, 38.36it/s]
Loading 0: 75%|███████▌ | 274/363 [00:07<00:02, 41.68it/s]
Loading 0: 77%|███████▋ | 280/363 [00:07<00:02, 41.43it/s]
Loading 0: 79%|███████▊ | 285/363 [00:07<00:02, 38.85it/s]
Loading 0: 80%|████████ | 291/363 [00:07<00:01, 41.91it/s]
Loading 0: 82%|████████▏ | 296/363 [00:08<00:01, 40.05it/s]
Loading 0: 83%|████████▎ | 301/363 [00:08<00:01, 41.03it/s]
Loading 0: 84%|████████▍ | 306/363 [00:15<00:24, 2.34it/s]
Loading 0: 85%|████████▌ | 309/363 [00:15<00:18, 2.86it/s]
Loading 0: 86%|████████▌ | 312/363 [00:15<00:14, 3.54it/s]
Loading 0: 88%|████████▊ | 320/363 [00:15<00:06, 6.28it/s]
Loading 0: 90%|████████▉ | 326/363 [00:15<00:04, 8.77it/s]
Loading 0: 91%|█████████ | 331/363 [00:15<00:02, 11.33it/s]
Loading 0: 93%|█████████▎| 338/363 [00:15<00:01, 16.13it/s]
Loading 0: 95%|█████████▍| 344/363 [00:16<00:00, 19.65it/s]
Loading 0: 96%|█████████▌| 349/363 [00:16<00:00, 22.53it/s]
Loading 0: 98%|█████████▊| 356/363 [00:16<00:00, 29.15it/s]
Loading 0: 100%|█████████▉| 362/363 [00:16<00:00, 32.94it/s]
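The "Loading 0" progress bar tracks 363 items being loaded, most likely the tensors of the safetensors shard (the stall around item 306 suggests one very large weight). A hedged sketch of producing such a bar with safetensors and tqdm; the device and the meaning of the 363 items are assumptions:

    from safetensors import safe_open
    from tqdm import tqdm

    path = "/dev/shm/model_cache/flywheel_model.0.safetensors"
    # Materialize every tensor in the shard, reporting progress per tensor.
    with safe_open(path, framework="pt", device="cuda:0") as f:
        weights = {key: f.get_tensor(key) for key in tqdm(f.keys(), desc="Loading 0")}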
Job mistralai-mistral-nem-93303-v299-mkmlizer completed after 124.41s with status: succeeded
Stopping job with name mistralai-mistral-nem-93303-v299-mkmlizer
Pipeline stage MKMLizer completed in 124.84s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.18s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service mistralai-mistral-nem-93303-v299
Waiting for inference service mistralai-mistral-nem-93303-v299 to be ready
Failed to get response for submission mistralai-mistral-nem_93303_v290: HTTPConnectionPool(host='mistralai-mistral-nem-93303-v290-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission mistralai-mistral-nem_93303_v291: HTTPConnectionPool(host='mistralai-mistral-nem-93303-v291-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Failed to get response for submission mistralai-mistral-nem_93303_v291: HTTPConnectionPool(host='mistralai-mistral-nem-93303-v291-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Inference service mistralai-mistral-nem-93303-v299 ready after 190.65274953842163s
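The predictor hostnames in this log follow the KServe pattern, so the readiness wait above (about 190 s) is presumably a poll on an InferenceService's Ready condition. A hedged sketch with the Kubernetes Python client; the group/version, namespace, and poll interval are assumptions:

    import time
    from kubernetes import client, config

    def wait_for_inference_service(name, namespace="tenant-chaiml-guanaco", poll_s=5):
        # Poll the KServe InferenceService until its Ready condition is True.
        config.load_incluster_config()
        api = client.CustomObjectsApi()
        while True:
            isvc = api.get_namespaced_custom_object(
                group="serving.kserve.io", version="v1beta1",
                namespace=namespace, plural="inferenceservices", name=name)
            conditions = isvc.get("status", {}).get("conditions", [])
            if any(c.get("type") == "Ready" and c.get("status") == "True"
                   for c in conditions):
                return
            time.sleep(poll_s)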
Pipeline stage MKMLDeployer completed in 191.06s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
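An "unhealthy" result here corresponds to the 20 s read timeout logged just above. A hedged sketch of a single stress-check request; the endpoint, payload, and health criterion are assumptions, and only the timeout value comes from the log:

    import time
    import requests

    def check_once(url, payload, timeout_s=20):
        # Send one inference request; any exception (including ReadTimeout)
        # counts as an unhealthy response.
        start = time.time()
        try:
            resp = requests.post(url, json=payload, timeout=timeout_s)
            resp.raise_for_status()
            return True, time.time() - start
        except requests.exceptions.RequestException:
            return False, time.time() - start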
Received healthy response to inference request in 2.0325136184692383s
Received healthy response to inference request in 1.6641011238098145s
Received healthy response to inference request in 1.6069047451019287s
Received healthy response to inference request in 1.6511507034301758s
5 requests
1 failed requests
5th percentile: 1.6157539367675782
10th percentile: 1.6246031284332276
20th percentile: 1.6423015117645263
30th percentile: 1.6537407875061034
40th percentile: 1.658920955657959
50th percentile: 1.6641011238098145
60th percentile: 1.811466121673584
70th percentile: 1.9588311195373533
80th percentile: 5.65331897735596
90th percentile: 12.894929695129395
95th percentile: 16.515735054016112
99th percentile: 19.412379341125487
mean time: 5.418242120742798
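The percentile lines above are consistent with linear-interpolation percentiles taken over all five latencies, including the timed-out request. A reproduction with numpy; the ~20.14 s value for the failed request is inferred from the reported mean and the 80th-99th percentiles, not logged directly:

    import numpy as np

    # Four healthy latencies from the log plus the inferred failed-request time.
    latencies = [1.6069, 1.6512, 1.6641, 2.0325, 20.14]

    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f"{p}th percentile: {np.percentile(latencies, p):.4f}")
    print(f"mean time: {np.mean(latencies):.4f}")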
%s, retrying in %s seconds...
Received healthy response to inference request in 1.8117396831512451s
Received healthy response to inference request in 1.8110499382019043s
Received healthy response to inference request in 1.6544585227966309s
Received healthy response to inference request in 1.6542909145355225s
Received healthy response to inference request in 1.6098215579986572s
5 requests
0 failed requests
5th percentile: 1.6187154293060302
10th percentile: 1.6276093006134034
20th percentile: 1.6453970432281495
30th percentile: 1.654324436187744
40th percentile: 1.6543914794921875
50th percentile: 1.6544585227966309
60th percentile: 1.7170950889587402
70th percentile: 1.7797316551208495
80th percentile: 1.8111878871917724
90th percentile: 1.8114637851715087
95th percentile: 1.811601734161377
99th percentile: 1.8117120933532715
mean time: 1.708272123336792
Pipeline stage StressChecker completed in 38.49s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.80s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.63s
Shutdown handler de-registered
mistralai-mistral-nem_93303_v299 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.12s
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3909.29s
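The OfflineFamilyFriendlyScorer lines only log "%s" placeholders, so both the scoring function and the thread count are unknown; the sketch below merely illustrates fanning a scorer out over a thread pool, with score_fn and num_threads as placeholders:

    from concurrent.futures import ThreadPoolExecutor

    def evaluate_family_friendly(samples, score_fn, num_threads=8):
        # Score every sample concurrently and return the average score.
        with ThreadPoolExecutor(max_workers=num_threads) as pool:
            scores = list(pool.map(score_fn, samples))
        return sum(scores) / len(scores)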
Shutdown handler de-registered
mistralai-mistral-nem_93303_v299 status is now inactive due to auto deactivation of underperforming models
mistralai-mistral-nem_93303_v299 status is now torndown due to DeploymentManager action