Shutdown handler not registered because Python interpreter is not running in the main thread
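The warning above comes from trying to register a shutdown (signal) handler outside the main thread: CPython only allows `signal.signal()` to be called from the main thread and raises `ValueError` elsewhere. A minimal sketch of the kind of guard that produces this message (function name and wording are illustrative, not the pipeline's actual code):

```python
# Sketch: register shutdown handlers only when running in the main thread.
# CPython raises ValueError if signal.signal() is called from any other thread.
import signal
import threading

def register_shutdown_handler(handler):
    if threading.current_thread() is not threading.main_thread():
        print("Shutdown handler not registered because Python interpreter "
              "is not running in the main thread")
        return
    signal.signal(signal.SIGTERM, handler)
    signal.signal(signal.SIGINT, handler)
```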
Running pipeline stage MKMLizer
Starting job with name axolotl-ai-co-romulus-mi-7539-v4-mkmlizer
Waiting for job on axolotl-ai-co-romulus-mi-7539-v4-mkmlizer to finish
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: [flywheel ASCII banner]
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: Version: 0.11.12
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: Copyright 2023 MK ONE TECHNOLOGIES Inc.
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: https://mk1.ai
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: The license key for the current software has been verified as belonging to:
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: Chai Research Corp.
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: Expiration: 2024-10-15 23:59:59
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: Downloaded to shared memory in 56.276s
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpzix2uoqr, device:0
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: Saving flywheel model at /dev/shm/model_cache
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: quantized model in 36.110s
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: Processed model axolotl-ai-co/romulus-mistral-nemo-12b-simpo in 92.386s
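The timings above cover pulling axolotl-ai-co/romulus-mistral-nemo-12b-simpo into shared memory and then quantizing it with the proprietary MK1 flywheel tooling. The quantization step cannot be reproduced here, but the download step can be approximated with `huggingface_hub` (a sketch, assuming the weights come straight from the Hugging Face Hub; the staging folder is the one named in the log):

```python
# Sketch: stage the source weights locally before quantization.
# Assumes a plain Hub download; the actual mkmlizer may fetch differently.
import time
from huggingface_hub import snapshot_download

start = time.time()
local_path = snapshot_download(
    repo_id="axolotl-ai-co/romulus-mistral-nemo-12b-simpo",
    local_dir="/tmp/tmpzix2uoqr",  # staging folder seen in the log
)
print(f"Downloaded to shared memory in {time.time() - start:.3f}s")
```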
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: creating bucket guanaco-mkml-models
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/axolotl-ai-co-romulus-mi-7539-v4
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/axolotl-ai-co-romulus-mi-7539-v4/config.json
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/axolotl-ai-co-romulus-mi-7539-v4/special_tokens_map.json
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/axolotl-ai-co-romulus-mi-7539-v4/tokenizer_config.json
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/axolotl-ai-co-romulus-mi-7539-v4/tokenizer.json
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/axolotl-ai-co-romulus-mi-7539-v4/flywheel_model.0.safetensors
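The `cp` lines above copy each processed artifact from /dev/shm/model_cache into the guanaco-mkml-models bucket. A minimal boto3 equivalent (a sketch; the pipeline may use a different S3 client, and credential/endpoint configuration is omitted):

```python
# Sketch: upload the quantized model artifacts to S3 with boto3.
# File list, bucket, and prefix are taken from the log above.
import boto3

s3 = boto3.client("s3")
bucket = "guanaco-mkml-models"
prefix = "axolotl-ai-co-romulus-mi-7539-v4"
files = [
    "config.json",
    "special_tokens_map.json",
    "tokenizer_config.json",
    "tokenizer.json",
    "flywheel_model.0.safetensors",
]
for name in files:
    s3.upload_file(f"/dev/shm/model_cache/{name}", bucket, f"{prefix}/{name}")
```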
axolotl-ai-co-romulus-mi-7539-v4-mkmlizer:
Loading 0: 0%| | 0/363 [00:00<?, ?it/s]
Loading 0: 100%|█████████▉| 362/363 [00:15<00:00, 32.98it/s]
Job axolotl-ai-co-romulus-mi-7539-v4-mkmlizer completed after 124.79s with status: succeeded
Stopping job with name axolotl-ai-co-romulus-mi-7539-v4-mkmlizer
Pipeline stage MKMLizer completed in 125.33s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.16s
Running pipeline stage MKMLDeployer
Creating inference service axolotl-ai-co-romulus-mi-7539-v4
Waiting for inference service axolotl-ai-co-romulus-mi-7539-v4 to be ready
Failed to get response for submission zonemercy-lexical-nemo-_1518_v18: ('http://zonemercy-lexical-nemo-1518-v18-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:60656->127.0.0.1:8080: read: connection reset by peer\n')
Inference service axolotl-ai-co-romulus-mi-7539-v4 ready after 140.5286877155304s
Pipeline stage MKMLDeployer completed in 141.07s
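The "Waiting for inference service ... to be ready" step above took roughly 140 s. A minimal sketch of that kind of readiness poll (the URL follows the predictor-URL pattern visible in the failure message above but is illustrative, as are the timeout and poll interval; the real deployer most likely watches the Kubernetes InferenceService status instead):

```python
# Sketch: poll an inference endpoint until it responds or a timeout expires.
import time
import requests

def wait_until_ready(url, timeout_s=600.0, interval_s=5.0):
    start = time.time()
    while time.time() - start < timeout_s:
        try:
            if requests.get(url, timeout=5).ok:
                return time.time() - start
        except requests.RequestException:
            pass  # not reachable yet, keep polling
        time.sleep(interval_s)
    raise TimeoutError(f"{url} was not ready within {timeout_s}s")

elapsed = wait_until_ready(
    "http://axolotl-ai-co-romulus-mi-7539-v4-predictor"
    ".tenant-chaiml-guanaco.k.chaiverse.com/v1/models/model"  # illustrative path
)
print(f"Inference service axolotl-ai-co-romulus-mi-7539-v4 ready after {elapsed}s")
```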
Running pipeline stage StressChecker
Received healthy response to inference request in 2.3072519302368164s
Received healthy response to inference request in 1.485682725906372s
Received healthy response to inference request in 1.5449700355529785s
Received healthy response to inference request in 1.6624107360839844s
Received healthy response to inference request in 1.4381499290466309s
5 requests
0 failed requests
5th percentile: 1.4476564884185792
10th percentile: 1.4571630477905273
20th percentile: 1.4761761665344237
30th percentile: 1.4975401878356933
40th percentile: 1.521255111694336
50th percentile: 1.5449700355529785
60th percentile: 1.5919463157653808
70th percentile: 1.6389225959777831
80th percentile: 1.7913789749145508
90th percentile: 2.0493154525756836
95th percentile: 2.17828369140625
99th percentile: 2.281458282470703
mean time: 1.6876930713653564
Pipeline stage StressChecker completed in 10.02s
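The StressChecker summary above (percentiles and mean over the five healthy response times) matches NumPy's default linear-interpolation percentiles, so it can be reproduced directly from the five latencies in the log; a sketch:

```python
# Sketch: reproduce the StressChecker summary from the five response times
# logged above, using NumPy's default (linear interpolation) percentiles.
import numpy as np

times = np.array([
    2.3072519302368164,
    1.485682725906372,
    1.5449700355529785,
    1.6624107360839844,
    1.4381499290466309,
])

print(f"{len(times)} requests")
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(times, p)}")
print(f"mean time: {times.mean()}")
```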
Shutdown handler de-registered
axolotl-ai-co-romulus-mi_7539_v4 status is now deployed due to DeploymentManager action
axolotl-ai-co-romulus-mi_7539_v4 status is now inactive due to auto deactivation of underperforming models
axolotl-ai-co-romulus-mi_7539_v4 status is now torndown due to DeploymentManager action