Shutdown handler not registered because Python interpreter is not running in the main thread
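The line above reflects Python's signal-handling rule: signal handlers can only be installed from the main thread of the main interpreter, so a pipeline worker running on a background thread has to skip registration. A minimal sketch of that guard, assuming the pipeline uses the standard `signal` module (the helper name is hypothetical):

```python
import signal
import threading

def register_shutdown_handler(handler):
    # signal.signal() may only be called from the main thread of the main
    # interpreter; calling it from any other thread raises ValueError.
    if threading.current_thread() is not threading.main_thread():
        print("Shutdown handler not registered because Python interpreter "
              "is not running in the main thread")
        return False
    signal.signal(signal.SIGTERM, handler)
    signal.signal(signal.SIGINT, handler)
    return True
```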
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name chaiml-llama-8b-big-retu-8570-v2-mkmlizer
Waiting for job on chaiml-llama-8b-big-retu-8570-v2-mkmlizer to finish
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ║ _____ __ __ ║
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ║ /___/ ║
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ║ ║
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ║ Version: 0.11.12 ║
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ║ https://mk1.ai ║
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ║ ║
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ║ The license key for the current software has been verified as ║
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ║ belonging to: ║
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ║ ║
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ║ Chai Research Corp. ║
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ║ ║
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: Downloaded to shared memory in 19.766s
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: quantizing model to /dev/shm/model_cache, profile:t0, folder:/tmp/tmp7mtdkvmx, device:0
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: Saving flywheel model at /dev/shm/model_cache
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: quantized model in 84.029s
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: Processed model ChaiML/llama_8b_big_retune_6m_23424 in 103.796s
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: creating bucket guanaco-mkml-models
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/chaiml-llama-8b-big-retu-8570-v2
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/chaiml-llama-8b-big-retu-8570-v2/special_tokens_map.json
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/chaiml-llama-8b-big-retu-8570-v2/config.json
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/chaiml-llama-8b-big-retu-8570-v2/tokenizer_config.json
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/chaiml-llama-8b-big-retu-8570-v2/tokenizer.json
chaiml-llama-8b-big-retu-8570-v2-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/chaiml-llama-8b-big-retu-8570-v2/flywheel_model.0.safetensors
chaiml-llama-8b-big-retu-8570-v2-mkmlizer:
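The `cp` lines above suggest each artifact in /dev/shm/model_cache is mirrored to the model bucket one key at a time. A minimal sketch of that upload step using boto3 (the helper name is hypothetical; the bucket and prefix are taken from the log, and the real mkmlizer upload code is not shown here):

```python
import os
import boto3

def upload_model_dir(local_dir, bucket, prefix):
    # Mirror every file in the local model cache to
    # s3://<bucket>/<prefix>/<filename>, matching the "cp" log lines above.
    s3 = boto3.client("s3")
    for name in os.listdir(local_dir):
        path = os.path.join(local_dir, name)
        if os.path.isfile(path):
            s3.upload_file(path, bucket, f"{prefix}/{name}")

upload_model_dir("/dev/shm/model_cache",
                 "guanaco-mkml-models",
                 "chaiml-llama-8b-big-retu-8570-v2")
```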
Loading 0: 0%| | 0/291 [00:00<?, ?it/s]
[Loading 0 progress bar: 291 tensors loaded over roughly 72s at about 3-5 it/s; intermediate refreshes condensed]
Loading 0: 99%|█████████▉| 288/291 [01:12<00:00, 3.66it/s]
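The "Loading 0" stream has the shape of a tqdm progress bar over the 291 tensors in the quantized safetensors shard, with each refresh captured as its own log line because stdout is piped rather than attached to a terminal. A rough sketch of how such a bar could be produced, assuming the `safetensors` and `tqdm` packages (the loader actually used by the mkmlizer is not shown in the log):

```python
from safetensors import safe_open
from tqdm import tqdm

# Iterate over the tensors in the quantized shard with a "Loading 0" bar;
# when stdout is captured, each tqdm refresh becomes one log line.
path = "/dev/shm/model_cache/flywheel_model.0.safetensors"
with safe_open(path, framework="pt", device="cuda:0") as f:
    for name in tqdm(list(f.keys()), desc="Loading 0"):
        tensor = f.get_tensor(name)  # loads the tensor onto GPU 0
```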
Job chaiml-llama-8b-big-retu-8570-v2-mkmlizer completed after 128.11s with status: succeeded
Stopping job with name chaiml-llama-8b-big-retu-8570-v2-mkmlizer
Pipeline stage MKMLizer completed in 129.34s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.29s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service chaiml-llama-8b-big-retu-8570-v2
Waiting for inference service chaiml-llama-8b-big-retu-8570-v2 to be ready
Inference service chaiml-llama-8b-big-retu-8570-v2 ready after 141.50179052352905s
Pipeline stage MKMLDeployer completed in 142.80s
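The MKMLDeployer stage creates the inference service and then blocks until it reports ready (141.5s here). A rough sketch of such a readiness poll, assuming the service is a KServe InferenceService and that kubectl is available in the pipeline environment; the function name, namespace handling, and polling interval are hypothetical:

```python
import subprocess
import time

def wait_for_inference_service(name, namespace, timeout=600, interval=5):
    # Poll the InferenceService until its Ready condition reports True.
    start = time.time()
    jsonpath = '{.status.conditions[?(@.type=="Ready")].status}'
    while time.time() - start < timeout:
        out = subprocess.run(
            ["kubectl", "get", "inferenceservice", name,
             "-n", namespace, "-o", f"jsonpath={jsonpath}"],
            capture_output=True, text=True)
        if out.stdout.strip() == "True":
            return time.time() - start  # seconds until ready
        time.sleep(interval)
    raise TimeoutError(f"Inference service {name} not ready after {timeout}s")
```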
run pipeline stage %s
Running pipeline stage StressChecker
HTTP Request: %s %s "%s %d %s"
Received healthy response to inference request in 5.070516109466553s
HTTP Request: %s %s "%s %d %s"
Received healthy response to inference request in 3.6122803688049316s
HTTP Request: %s %s "%s %d %s"
Received healthy response to inference request in 3.221102476119995s
HTTP Request: %s %s "%s %d %s"
Received healthy response to inference request in 5.387114763259888s
HTTP Request: %s %s "%s %d %s"
Received healthy response to inference request in 5.228936672210693s
5 requests
0 failed requests
5th percentile: 3.2993380546569826
10th percentile: 3.3775736331939696
20th percentile: 3.534044790267944
30th percentile: 3.903927516937256
40th percentile: 4.487221813201904
50th percentile: 5.070516109466553
60th percentile: 5.133884334564209
70th percentile: 5.197252559661865
80th percentile: 5.260572290420532
90th percentile: 5.32384352684021
95th percentile: 5.355479145050049
99th percentile: 5.38078763961792
mean time: 4.5039900779724125
Pipeline stage StressChecker completed in 24.87s
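The StressChecker summary can be reproduced from the five reported latencies: numpy's default linear-interpolation percentile gives exactly the figures above (for example the 5th percentile 3.2993... and the mean 4.5040...). A sketch under the assumption that numpy-style percentiles are used; the actual StressChecker implementation is not shown in the log:

```python
import numpy as np

# Latencies (seconds) of the five healthy inference responses above.
latencies = [5.070516109466553, 3.6122803688049316, 3.221102476119995,
             5.387114763259888, 5.228936672210693]

# Default (linear interpolation) percentiles match the logged values.
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print(f"mean time: {np.mean(latencies)}")
```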
Shutdown handler de-registered
chaiml-llama-8b-big-retu_8570_v2 status is now deployed due to DeploymentManager action
chaiml-llama-8b-big-retu_8570_v2 status is now inactive due to auto deactivation (removal of underperforming models)
run pipeline %s
Deleting key arushimgupta-final-check-2833-v1/config.json from bucket guanaco-mkml-models
Tearing down inference service arushimgupta-lora-save-2-v1
Service arushimgupta-lora-save-1-v1 has been torn down
Deleting key anthracite-org-magnum-v2-6820-v1/special_tokens_map.json from bucket guanaco-mkml-models
Shutdown handler not registered because Python interpreter is not running in the main thread
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
admin requested tearing down of chaiml-llama-8b-big-retu_8570_v2
Checking if service arushimgupta-lora-save-6-v1 is running
Shutdown handler de-registered
Service arushimgupta-lora-save-2-v1 has been torn down
Pipeline stage MKMLDeleter completed in 7.36s
Deleting key anthracite-org-magnum-v2-6820-v1/tokenizer.json from bucket guanaco-mkml-models
run pipeline %s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from S3
Shutdown handler not registered because Python interpreter is not running in the main thread
blend_poful_2024-09-27 status is now torndown due to DeploymentManager action
Tearing down inference service arushimgupta-lora-save-3-v1
Pipeline stage MKMLDeleter completed in 7.50s
run pipeline stage %s
admin requested tearing down of chaiml-nemo-chai-4bio-me_9462_v2
Tearing down inference service arushimgupta-lora-save-6-v1
Deleting key arushimgupta-final-check-2833-v1/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key anthracite-org-magnum-v2-6820-v1/tokenizer_config.json from bucket guanaco-mkml-models
run pipeline stage %s
Cleaning model data from S3
Cleaning model data from model cache
Cleaning model data from model cache
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
Shutdown handler not registered because Python interpreter is not running in the main thread
Service arushimgupta-lora-save-3-v1 has been torn down
admin requested tearing down of chaiml-nemo-comm-2abio-m_6915_v1
Deleting key arushimgupta-final-check-2833-v1/tokenizer.json from bucket guanaco-mkml-models
Service arushimgupta-lora-save-6-v1 has been torn down
Pipeline stage MKMLModelDeleter completed in 13.05s
Running pipeline stage MKMLDeleter
Cleaning model data from model cache
Deleting key arushimgupta-final-check-3178-v2/config.json from bucket guanaco-mkml-models
run pipeline stage %s
Deleting key arushimgupta-final-check-3580-v1/config.json from bucket guanaco-mkml-models
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
run pipeline %s
Pipeline stage MKMLDeleter completed in 11.57s
Shutdown handler not registered because Python interpreter is not running in the main thread
Pipeline stage MKMLDeleter completed in 9.57s
Deleting key arushimgupta-final-check-2833-v1/tokenizer_config.json from bucket guanaco-mkml-models
Shutdown handler de-registered
Checking if service chaiml-lexical-nemov8-1k1e5-v11 is running
Deleting key arushimgupta-final-check-3580-v3/config.json from bucket guanaco-mkml-models
Running pipeline stage MKMLDeleter
Deleting key arushimgupta-final-check-3580-v1/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Cleaning model data from S3
Cleaning model data from model cache
run pipeline stage %s
run pipeline stage %s
run pipeline %s
run pipeline stage %s
Pipeline stage MKMLModelDeleter completed in 17.38s
anthracite-org-magnum-v2_6820_v1 status is now torndown due to DeploymentManager action
Deleting key arushimgupta-final-check-3580-v3/flywheel_model.0.safetensors from bucket guanaco-mkml-models
admin requested tearing down of chaiml-nemo-community-2a_v1
Tearing down inference service chaiml-lexical-nemov8-1k1e5-v11
Deleting key arushimgupta-final-check-3178-v2/special_tokens_map.json from bucket guanaco-mkml-models
Checking if service chaiml-llama-8b-big-retu-8570-v2 is running
Cleaning model data from model cache
Deleting key arushimgupta-lora-save-1-v1/config.json from bucket guanaco-mkml-models
Running pipeline stage MKMLDeleter
Deleting key arushimgupta-final-check-3580-v1/special_tokens_map.json from bucket guanaco-mkml-models
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
Shutdown handler de-registered
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of chaiml-nemo-community-2c_v1
Deleting key arushimgupta-final-check-3580-v3/special_tokens_map.json from bucket guanaco-mkml-models
Service chaiml-lexical-nemov8-1k1e5-v11 has been torn down
Deleting key arushimgupta-final-check-3178-v2/tokenizer.json from bucket guanaco-mkml-models
Deleting key arushimgupta-lora-save-2-v1/config.json from bucket guanaco-mkml-models
Deleting key arushimgupta-lora-save-1-v1/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key arushimgupta-final-check-3580-v1/tokenizer.json from bucket guanaco-mkml-models
Checking if service chaiml-nemo-chai-4bio-me-9462-v2 is running
Tearing down inference service chaiml-llama-8b-big-retu-8570-v2
Cleaning model data from S3
Running pipeline stage MKMLDeleter
Cleaning model data from S3
run pipeline %s
arushimgupta-final-check_2833_v1 status is now torndown due to DeploymentManager action
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of chaiml-nemo-community-5_v1
Deleting key arushimgupta-final-check-3580-v3/tokenizer.json from bucket guanaco-mkml-models
Pipeline stage MKMLDeleter completed in 17.03s
Deleting key arushimgupta-final-check-3178-v2/tokenizer_config.json from bucket guanaco-mkml-models
Deleting key arushimgupta-lora-save-2-v1/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key arushimgupta-final-check-3580-v1/tokenizer_config.json from bucket guanaco-mkml-models
Deleting key arushimgupta-lora-save-1-v1/special_tokens_map.json from bucket guanaco-mkml-models
Service chaiml-llama-8b-big-retu-8570-v2 has been torn down
Cleaning model data from model cache
Tearing down inference service chaiml-nemo-chai-4bio-me-9462-v2
Cleaning model data from model cache
Checking if service chaiml-nemo-comm-2abio-m-6915-v1 is running
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Deleting key arushimgupta-final-check-3580-v3/tokenizer_config.json from bucket guanaco-mkml-models
admin requested tearing down of chaiml-nemo-lyra-rica-2b_8403_v1
run pipeline stage %s
Pipeline stage MKMLModelDeleter completed in 29.48s
Pipeline stage MKMLModelDeleter completed in 29.85s
Deleting key arushimgupta-lora-save-1-v1/tokenizer.json from bucket guanaco-mkml-models
chaiml-llama-8b-big-retu_8570_v2 status is now torndown due to DeploymentManager action
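The teardown logs interleave several concurrent pipelines, but each follows the same pattern: check and tear down the inference service, then delete every key under the model's prefix from the guanaco-mkml-models bucket and clear the local model cache. A minimal sketch of the S3 cleanup step using boto3 (the helper name is hypothetical; the bucket and example prefix are taken from the log):

```python
import boto3

def clean_model_data_from_s3(bucket, prefix):
    # Remove every object stored under the model's prefix, mirroring the
    # "Deleting key ... from bucket ..." lines above.
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            print(f"Deleting key {obj['Key']} from bucket {bucket}")
            s3.delete_object(Bucket=bucket, Key=obj["Key"])

clean_model_data_from_s3("guanaco-mkml-models",
                         "anthracite-org-magnum-v2-6820-v1/")
```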