submission_id: mistralai-mistral-nemo_9330_v121
developer_uid: immaculate_possum_03470
best_of: 8
celo_rating: 1235.59
display_name: nemo_base_1
family_friendly_score: 0.5624935280107695
family_friendly_standard_error: 0.008724393949547217
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.95, 'top_p': 1.0, 'min_p': 0.075, 'top_k': 60, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|eot_id|>'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
ineligible_reason: num_battles<5000
is_internal_developer: False
language_model: mistralai/Mistral-Nemo-Instruct-2407
max_input_tokens: 1024
max_output_tokens: 64
model_architecture: MistralForCausalLM
model_group: mistralai/Mistral-Nemo-I
model_name: nemo_base_1
model_num_parameters: 12772070400.0
model_repo: mistralai/Mistral-Nemo-Instruct-2407
model_size: 13B
num_battles: 3325
num_wins: 1592
ranking_group: single
status: torndown
submission_type: basic
timestamp: 2024-09-26T23:39:46+00:00
us_pacific_date: 2024-09-26
win_ratio: 0.478796992481203
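The formatter templates and battle statistics above can be sanity-checked with a short sketch. The `build_prompt` helper below is hypothetical (the real assembly code is not shown in this log); it simply substitutes the template fields the way the formatter config describes, assuming plain `str.format`-style substitution, and then verifies that the reported win_ratio is num_wins / num_battles.

```python
# Hypothetical sketch: assemble a prompt from the formatter config above
# (assumes plain str.format substitution) and check the reported win ratio.

formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, memory, prompt, turns):
    """Concatenate memory, scenario prompt, chat turns, and the response stub."""
    parts = [
        formatter["memory_template"].format(bot_name=bot_name, memory=memory),
        formatter["prompt_template"].format(prompt=prompt),
    ]
    for speaker, message, is_bot in turns:
        template = formatter["bot_template"] if is_bot else formatter["user_template"]
        key = "bot_name" if is_bot else "user_name"
        parts.append(template.format(**{key: speaker, "message": message}))
    # The response template ends without a newline: the model completes it.
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)

text = build_prompt("Nemo", "A friendly bot.", "Chat casually.",
                    [("User", "Hi!", False), ("Nemo", "Hello!", True)])
print(text)

# Battle stats from the metadata: 1592 wins over 3325 battles.
win_ratio = 1592 / 3325
print(round(win_ratio, 6))  # → 0.478797, matching the reported win_ratio
```

Note that `truncate_by_message: False` and `max_input_tokens: 1024` imply the assembled string is truncated at the token level rather than by dropping whole messages.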
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name mistralai-mistral-nemo-9330-v121-mkmlizer
Waiting for job on mistralai-mistral-nemo-9330-v121-mkmlizer to finish
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ║ _____ __ __ ║
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ║ /___/ ║
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ║ ║
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ║ Version: 0.11.12 ║
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ║ https://mk1.ai ║
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ║ ║
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ║ The license key for the current software has been verified as ║
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ║ belonging to: ║
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ║ ║
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ║ Chai Research Corp. ║
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ║ ║
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
mistralai-mistral-nemo-9330-v121-mkmlizer: Downloaded to shared memory in 48.849s
mistralai-mistral-nemo-9330-v121-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpt_mjcyku, device:0
mistralai-mistral-nemo-9330-v121-mkmlizer: Saving flywheel model at /dev/shm/model_cache
mistralai-mistral-nemo-9330-v121-mkmlizer: quantized model in 37.829s
mistralai-mistral-nemo-9330-v121-mkmlizer: Processed model mistralai/Mistral-Nemo-Instruct-2407 in 86.678s
mistralai-mistral-nemo-9330-v121-mkmlizer: creating bucket guanaco-mkml-models
mistralai-mistral-nemo-9330-v121-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
mistralai-mistral-nemo-9330-v121-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v121
mistralai-mistral-nemo-9330-v121-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v121/config.json
mistralai-mistral-nemo-9330-v121-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v121/special_tokens_map.json
mistralai-mistral-nemo-9330-v121-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v121/tokenizer_config.json
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: Downloaded to shared memory in 86.887s
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpd7yxdy7f, device:0
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: Saving flywheel model at /dev/shm/model_cache
mistralai-mistral-nemo-9330-v121-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/mistralai-mistral-nemo-9330-v121/flywheel_model.0.safetensors
mistralai-mistral-nemo-9330-v121-mkmlizer: Loading 0: [tqdm progress bar: 363 tensors loaded in ~16s; repeated carriage-return updates omitted]
Job mistralai-mistral-nemo-9330-v121-mkmlizer completed after 123.34s with status: succeeded
Stopping job with name mistralai-mistral-nemo-9330-v121-mkmlizer
Pipeline stage MKMLizer completed in 123.68s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.11s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service mistralai-mistral-nemo-9330-v121
Waiting for inference service mistralai-mistral-nemo-9330-v121 to be ready
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: quantized model in 43.858s
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: Processed model ChaiML/0926-nemo-virgo-top-safe-bot-1edit in 130.745s
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: creating bucket guanaco-mkml-models
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/chaiml-0926-nemo-virgo-t-3956-v7
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/chaiml-0926-nemo-virgo-t-3956-v7/config.json
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/chaiml-0926-nemo-virgo-t-3956-v7/special_tokens_map.json
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/chaiml-0926-nemo-virgo-t-3956-v7/tokenizer_config.json
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/chaiml-0926-nemo-virgo-t-3956-v7/flywheel_model.0.safetensors
chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer: Loading 0: [tqdm progress bar: 363 tensors loaded in ~22s; repeated carriage-return updates omitted]
Job chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer completed after 227.39s with status: succeeded
Stopping job with name chaiml-0926-nemo-virgo-t-3956-v7-mkmlizer
Pipeline stage MKMLizer completed in 227.73s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.11s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service chaiml-0926-nemo-virgo-t-3956-v7
Waiting for inference service chaiml-0926-nemo-virgo-t-3956-v7 to be ready
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name arushimgupta-peft-save-1-v5-mkmlizer
Waiting for job on arushimgupta-peft-save-1-v5-mkmlizer to finish
arushimgupta-peft-save-1-v5-mkmlizer: Downloaded to shared memory in 15.327s
arushimgupta-peft-save-1-v5-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmph2e7k95s, device:0
arushimgupta-peft-save-1-v5-mkmlizer: Saving flywheel model at /dev/shm/model_cache
arushimgupta-peft-save-1-v5-mkmlizer: Loading 0: 0%| | 0/1203 [00:00<?, ?it/s]
arushimgupta-peft-save-1-v5-mkmlizer: Traceback (most recent call last):
arushimgupta-peft-save-1-v5-mkmlizer: File "/code/uploading/mkmlize.py", line 151, in <module>
arushimgupta-peft-save-1-v5-mkmlizer: cli()
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
arushimgupta-peft-save-1-v5-mkmlizer: return self.main(*args, **kwargs)
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1078, in main
arushimgupta-peft-save-1-v5-mkmlizer: rv = self.invoke(ctx)
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
arushimgupta-peft-save-1-v5-mkmlizer: return _process_result(sub_ctx.command.invoke(sub_ctx))
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
arushimgupta-peft-save-1-v5-mkmlizer: return ctx.invoke(self.callback, **ctx.params)
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 783, in invoke
arushimgupta-peft-save-1-v5-mkmlizer: return __callback(*args, **kwargs)
arushimgupta-peft-save-1-v5-mkmlizer: File "/code/uploading/mkmlize.py", line 42, in quantize
arushimgupta-peft-save-1-v5-mkmlizer: quantize_model(temp_folder, output_path, profile, device)
arushimgupta-peft-save-1-v5-mkmlizer: File "/code/uploading/mkmlize.py", line 135, in quantize_model
arushimgupta-peft-save-1-v5-mkmlizer: flywheel.instrument(
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/mk1/flywheel/instrument.py", line 93, in instrument
arushimgupta-peft-save-1-v5-mkmlizer: compiler.save_pretrained(input_model_path, output_model_path, storage_format)
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/mk1/flywheel/functional/compiler.py", line 23, in save_pretrained
arushimgupta-peft-save-1-v5-mkmlizer: self.save_st_pretrained(input_model_path, output_model_path)
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/mk1/flywheel/functional/compiler.py", line 38, in save_st_pretrained
arushimgupta-peft-save-1-v5-mkmlizer: for name, tensor in model_iterator:
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/mk1/flywheel/models/mistral.py", line 241, in tensor_merger
arushimgupta-peft-save-1-v5-mkmlizer: for name, tensor in tensor_iterator:
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/mk1/flywheel/functional/loader.py", line 217, in tensor_compiler
arushimgupta-peft-save-1-v5-mkmlizer: compiled_tensor = runtime.instrument(tensor, profile.value)
arushimgupta-peft-save-1-v5-mkmlizer: RuntimeError: CUDA error: invalid configuration argument
arushimgupta-peft-save-1-v5-mkmlizer: CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
arushimgupta-peft-save-1-v5-mkmlizer: For debugging consider passing CUDA_LAUNCH_BLOCKING=1
arushimgupta-peft-save-1-v5-mkmlizer: Compile with `TORCH_USE_CUDA_DSA` to enable device-side assertions.
arushimgupta-peft-save-1-v5-mkmlizer: Exception raised from c10_cuda_check_implementation at ../c10/cuda/CUDAException.cpp:43 (most recent call first):
arushimgupta-peft-save-1-v5-mkmlizer: frame #0: c10::Error::Error(c10::SourceLocation, std::string) + 0x96 (0x71a4490cbf86 in /opt/conda/lib/python3.10/site-packages/torch/lib/libc10.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #1: c10::detail::torchCheckFail(char const*, char const*, unsigned int, std::string const&) + 0x64 (0x71a44907ad10 in /opt/conda/lib/python3.10/site-packages/torch/lib/libc10.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #2: c10::cuda::c10_cuda_check_implementation(int, char const*, char const*, int, bool) + 0x118 (0x71a4491a6f08 in /opt/conda/lib/python3.10/site-packages/torch/lib/libc10_cuda.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #3: void at::native::gpu_kernel_impl<__nv_hdl_wrapper_t<false, true, false, __nv_dl_tag<void (*)(at::TensorIteratorBase&), &at::native::direct_copy_kernel_cuda, 18u>, c10::Half (c10::Half)> >(at::TensorIteratorBase&, __nv_hdl_wrapper_t<false, true, false, __nv_dl_tag<void (*)(at::TensorIteratorBase&), &at::native::direct_copy_kernel_cuda, 18u>, c10::Half (c10::Half)> const&) + 0x4de (0x71a3fbb0aa2e in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cuda.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #4: void at::native::gpu_kernel<__nv_hdl_wrapper_t<false, true, false, __nv_dl_tag<void (*)(at::TensorIteratorBase&), &at::native::direct_copy_kernel_cuda, 18u>, c10::Half (c10::Half)> >(at::TensorIteratorBase&, __nv_hdl_wrapper_t<false, true, false, __nv_dl_tag<void (*)(at::TensorIteratorBase&), &at::native::direct_copy_kernel_cuda, 18u>, c10::Half (c10::Half)> const&) + 0x34b (0x71a3fbb0b01b in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cuda.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #5: at::native::direct_copy_kernel_cuda(at::TensorIteratorBase&) + 0x38c (0x71a3fbac9a0c in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cuda.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #6: at::native::copy_device_to_device(at::TensorIterator&, bool, bool) + 0xb25 (0x71a3fbaca715 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cuda.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #7: <unknown function> + 0x1910312 (0x71a3fbacc312 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cuda.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #8: <unknown function> + 0x1cbebff (0x71a431099bff in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #9: at::native::copy_(at::Tensor&, at::Tensor const&, bool) + 0x62 (0x71a43109b5a2 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #10: at::_ops::copy_::call(at::Tensor&, at::Tensor const&, bool) + 0x15c (0x71a431e5635c in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #11: at::native::_to_copy(at::Tensor const&, std::optional<c10::ScalarType>, std::optional<c10::Layout>, std::optional<c10::Device>, std::optional<bool>, bool, std::optional<c10::MemoryFormat>) + 0x1e01 (0x71a4313b96b1 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #12: <unknown function> + 0x2e19f8b (0x71a4321f4f8b in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #13: at::_ops::_to_copy::redispatch(c10::DispatchKeySet, at::Tensor const&, std::optional<c10::ScalarType>, std::optional<c10::Layout>, std::optional<c10::Device>, std::optional<bool>, bool, std::optional<c10::MemoryFormat>) + 0xf5 (0x71a4318fdc25 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #14: <unknown function> + 0x2c58a33 (0x71a432033a33 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #15: at::_ops::_to_copy::redispatch(c10::DispatchKeySet, at::Tensor const&, std::optional<c10::ScalarType>, std::optional<c10::Layout>, std::optional<c10::Device>, std::optional<bool>, bool, std::optional<c10::MemoryFormat>) + 0xf5 (0x71a4318fdc25 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #16: <unknown function> + 0x470df1f (0x71a433ae8f1f in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #17: <unknown function> + 0x470e35e (0x71a433ae935e in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #18: at::_ops::_to_copy::call(at::Tensor const&, std::optional<c10::ScalarType>, std::optional<c10::Layout>, std::optional<c10::Device>, std::optional<bool>, bool, std::optional<c10::MemoryFormat>) + 0x1eb (0x71a43198d68b in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #19: at::native::to(at::Tensor const&, c10::ScalarType, bool, bool, std::optional<c10::MemoryFormat>) + 0xa2 (0x71a4313b6182 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #20: <unknown function> + 0x301e3b0 (0x71a4323f93b0 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #21: at::_ops::to_dtype::call(at::Tensor const&, c10::ScalarType, bool, bool, std::optional<c10::MemoryFormat>) + 0x178 (0x71a431b3d258 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #22: mkodec::instrument(at::Tensor, int) + 0x4b (0x71a36d9c81ab in /opt/conda/lib/python3.10/site-packages/mk1/flywheel/runtime.cpython-310-x86_64-linux-gnu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #23: <unknown function> + 0x981e2 (0x71a36d9b01e2 in /opt/conda/lib/python3.10/site-packages/mk1/flywheel/runtime.cpython-310-x86_64-linux-gnu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #24: <unknown function> + 0xa7b9b (0x71a36d9bfb9b in /opt/conda/lib/python3.10/site-packages/mk1/flywheel/runtime.cpython-310-x86_64-linux-gnu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #25: python3() [0x4fd907]
arushimgupta-peft-save-1-v5-mkmlizer: <omitting python frames>
arushimgupta-peft-save-1-v5-mkmlizer: frame #28: python3() [0x5112cf]
arushimgupta-peft-save-1-v5-mkmlizer: frame #30: python3() [0x5112cf]
arushimgupta-peft-save-1-v5-mkmlizer: frame #43: python3() [0x5095ce]
arushimgupta-peft-save-1-v5-mkmlizer: frame #50: python3() [0x509857]
arushimgupta-peft-save-1-v5-mkmlizer: frame #54: python3() [0x5cf913]
arushimgupta-peft-save-1-v5-mkmlizer: frame #57: python3() [0x5951c2]
arushimgupta-peft-save-1-v5-mkmlizer: frame #59: python3() [0x5c5ef7]
arushimgupta-peft-save-1-v5-mkmlizer: frame #60: python3() [0x5c1030]
arushimgupta-peft-save-1-v5-mkmlizer: frame #61: python3() [0x459781]
arushimgupta-peft-save-1-v5-mkmlizer:
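The traceback above notes that CUDA kernel errors can be reported asynchronously at a later API call, and suggests passing CUDA_LAUNCH_BLOCKING=1 so the failing launch is reported at its true call site. A minimal sketch of that suggestion, assuming the variable is set before CUDA is initialised (i.e. before the first CUDA call or `torch` import):

```python
# Sketch of the debugging hint from the traceback above: force synchronous
# kernel launches so the stack trace points at the kernel that actually failed.
# This must run before CUDA is initialised in the process.
import os

os.environ["CUDA_LAUNCH_BLOCKING"] = "1"
print(os.environ["CUDA_LAUNCH_BLOCKING"])

# (TORCH_USE_CUDA_DSA, also mentioned in the log, is a build-time option:
# device-side assertions only take effect if PyTorch was compiled with it.)
```

The same effect can be had by exporting the variable in the job's environment rather than in code.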
Job arushimgupta-peft-save-1-v5-mkmlizer completed after 52.27s with status: failed
Stopping job with name arushimgupta-peft-save-1-v5-mkmlizer
%s, retrying in %s seconds...
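After the failed job is stopped, the pipeline waits and restarts it, as the "retrying in %s seconds" line shows. A hypothetical sketch of that retry loop (the real pipeline code is not shown in this log; names and parameters below are illustrative):

```python
# Hypothetical sketch of the retry behaviour visible in the pipeline log:
# start the job, and on failure wait and start it again.
import time

def run_with_retry(job, max_attempts=2, delay_seconds=5):
    """Run `job`, retrying after a delay; re-raise on the final failure."""
    for attempt in range(1, max_attempts + 1):
        try:
            return job()
        except RuntimeError as exc:
            if attempt == max_attempts:
                raise
            print(f"{exc}, retrying in {delay_seconds} seconds...")
            time.sleep(delay_seconds)

# Demonstration with a job that fails once, then succeeds.
calls = []
def flaky_job():
    calls.append(1)
    if len(calls) < 2:
        raise RuntimeError("job failed")
    return "succeeded"

print(run_with_retry(flaky_job, delay_seconds=0))  # → succeeded
```

In the log the second attempt hits the same CUDA error, so a fixed-count retry like this would ultimately surface the failure to the submitter.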
Starting job with name arushimgupta-peft-save-1-v5-mkmlizer
Waiting for job on arushimgupta-peft-save-1-v5-mkmlizer to finish
arushimgupta-peft-save-1-v5-mkmlizer: Downloaded to shared memory in 16.231s
arushimgupta-peft-save-1-v5-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpjwz23o0t, device:0
arushimgupta-peft-save-1-v5-mkmlizer: Saving flywheel model at /dev/shm/model_cache
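Staging both the download and the quantized model cache under /dev/shm keeps the intermediate weights in RAM: on Linux, /dev/shm is a tmpfs mount, so the download, quantize, and save steps above avoid disk I/O. A minimal sketch of that choice (the `staging_dir` helper is hypothetical, not part of the mkmlizer):

```python
import os
import tempfile


def staging_dir():
    """Prefer the RAM-backed /dev/shm for scratch space, as the log does.

    Falls back to the system temp directory on platforms without tmpfs.
    """
    return "/dev/shm" if os.path.isdir("/dev/shm") else tempfile.gettempdir()
```

The trade-off is that /dev/shm capacity counts against RAM, so a 13B model cache there needs correspondingly large host memory.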
arushimgupta-peft-save-1-v5-mkmlizer: Loading 0: 0%| | 0/1203 [00:00<?, ?it/s]Traceback (most recent call last):
arushimgupta-peft-save-1-v5-mkmlizer: File "/code/uploading/mkmlize.py", line 151, in <module>
arushimgupta-peft-save-1-v5-mkmlizer: cli()
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
arushimgupta-peft-save-1-v5-mkmlizer: return self.main(*args, **kwargs)
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1078, in main
arushimgupta-peft-save-1-v5-mkmlizer: rv = self.invoke(ctx)
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
arushimgupta-peft-save-1-v5-mkmlizer: return _process_result(sub_ctx.command.invoke(sub_ctx))
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
arushimgupta-peft-save-1-v5-mkmlizer: return ctx.invoke(self.callback, **ctx.params)
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 783, in invoke
arushimgupta-peft-save-1-v5-mkmlizer: return __callback(*args, **kwargs)
arushimgupta-peft-save-1-v5-mkmlizer: File "/code/uploading/mkmlize.py", line 42, in quantize
arushimgupta-peft-save-1-v5-mkmlizer: quantize_model(temp_folder, output_path, profile, device)
arushimgupta-peft-save-1-v5-mkmlizer: File "/code/uploading/mkmlize.py", line 135, in quantize_model
arushimgupta-peft-save-1-v5-mkmlizer: flywheel.instrument(
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/mk1/flywheel/instrument.py", line 93, in instrument
arushimgupta-peft-save-1-v5-mkmlizer: compiler.save_pretrained(input_model_path, output_model_path, storage_format)
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/mk1/flywheel/functional/compiler.py", line 23, in save_pretrained
arushimgupta-peft-save-1-v5-mkmlizer: self.save_st_pretrained(input_model_path, output_model_path)
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/mk1/flywheel/functional/compiler.py", line 38, in save_st_pretrained
arushimgupta-peft-save-1-v5-mkmlizer: for name, tensor in model_iterator:
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/mk1/flywheel/models/mistral.py", line 241, in tensor_merger
arushimgupta-peft-save-1-v5-mkmlizer: for name, tensor in tensor_iterator:
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/mk1/flywheel/functional/loader.py", line 217, in tensor_compiler
arushimgupta-peft-save-1-v5-mkmlizer: compiled_tensor = runtime.instrument(tensor, profile.value)
arushimgupta-peft-save-1-v5-mkmlizer: RuntimeError: CUDA error: invalid configuration argument
arushimgupta-peft-save-1-v5-mkmlizer: CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
arushimgupta-peft-save-1-v5-mkmlizer: For debugging consider passing CUDA_LAUNCH_BLOCKING=1
arushimgupta-peft-save-1-v5-mkmlizer: Compile with `TORCH_USE_CUDA_DSA` to enable device-side assertions.
arushimgupta-peft-save-1-v5-mkmlizer: Exception raised from c10_cuda_check_implementation at ../c10/cuda/CUDAException.cpp:43 (most recent call first):
arushimgupta-peft-save-1-v5-mkmlizer: frame #0: c10::Error::Error(c10::SourceLocation, std::string) + 0x96 (0x7e235e377f86 in /opt/conda/lib/python3.10/site-packages/torch/lib/libc10.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #1: c10::detail::torchCheckFail(char const*, char const*, unsigned int, std::string const&) + 0x64 (0x7e235e326d10 in /opt/conda/lib/python3.10/site-packages/torch/lib/libc10.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #2: c10::cuda::c10_cuda_check_implementation(int, char const*, char const*, int, bool) + 0x118 (0x7e235e770f08 in /opt/conda/lib/python3.10/site-packages/torch/lib/libc10_cuda.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #3: void at::native::gpu_kernel_impl<__nv_hdl_wrapper_t<false, true, false, __nv_dl_tag<void (*)(at::TensorIteratorBase&), &at::native::direct_copy_kernel_cuda, 18u>, c10::Half (c10::Half)> >(at::TensorIteratorBase&, __nv_hdl_wrapper_t<false, true, false, __nv_dl_tag<void (*)(at::TensorIteratorBase&), &at::native::direct_copy_kernel_cuda, 18u>, c10::Half (c10::Half)> const&) + 0x4de (0x7e2310d0aa2e in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cuda.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #4: void at::native::gpu_kernel<__nv_hdl_wrapper_t<false, true, false, __nv_dl_tag<void (*)(at::TensorIteratorBase&), &at::native::direct_copy_kernel_cuda, 18u>, c10::Half (c10::Half)> >(at::TensorIteratorBase&, __nv_hdl_wrapper_t<false, true, false, __nv_dl_tag<void (*)(at::TensorIteratorBase&), &at::native::direct_copy_kernel_cuda, 18u>, c10::Half (c10::Half)> const&) + 0x34b (0x7e2310d0b01b in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cuda.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #5: at::native::direct_copy_kernel_cuda(at::TensorIteratorBase&) + 0x38c (0x7e2310cc9a0c in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cuda.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #6: at::native::copy_device_to_device(at::TensorIterator&, bool, bool) + 0xb25 (0x7e2310cca715 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cuda.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #7: <unknown function> + 0x1910312 (0x7e2310ccc312 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cuda.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #8: <unknown function> + 0x1cbebff (0x7e2346299bff in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #9: at::native::copy_(at::Tensor&, at::Tensor const&, bool) + 0x62 (0x7e234629b5a2 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #10: at::_ops::copy_::call(at::Tensor&, at::Tensor const&, bool) + 0x15c (0x7e234705635c in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #11: at::native::_to_copy(at::Tensor const&, std::optional<c10::ScalarType>, std::optional<c10::Layout>, std::optional<c10::Device>, std::optional<bool>, bool, std::optional<c10::MemoryFormat>) + 0x1e01 (0x7e23465b96b1 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #12: <unknown function> + 0x2e19f8b (0x7e23473f4f8b in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #13: at::_ops::_to_copy::redispatch(c10::DispatchKeySet, at::Tensor const&, std::optional<c10::ScalarType>, std::optional<c10::Layout>, std::optional<c10::Device>, std::optional<bool>, bool, std::optional<c10::MemoryFormat>) + 0xf5 (0x7e2346afdc25 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #14: <unknown function> + 0x2c58a33 (0x7e2347233a33 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #15: at::_ops::_to_copy::redispatch(c10::DispatchKeySet, at::Tensor const&, std::optional<c10::ScalarType>, std::optional<c10::Layout>, std::optional<c10::Device>, std::optional<bool>, bool, std::optional<c10::MemoryFormat>) + 0xf5 (0x7e2346afdc25 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #16: <unknown function> + 0x470df1f (0x7e2348ce8f1f in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #17: <unknown function> + 0x470e35e (0x7e2348ce935e in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #18: at::_ops::_to_copy::call(at::Tensor const&, std::optional<c10::ScalarType>, std::optional<c10::Layout>, std::optional<c10::Device>, std::optional<bool>, bool, std::optional<c10::MemoryFormat>) + 0x1eb (0x7e2346b8d68b in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #19: at::native::to(at::Tensor const&, c10::ScalarType, bool, bool, std::optional<c10::MemoryFormat>) + 0xa2 (0x7e23465b6182 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #20: <unknown function> + 0x301e3b0 (0x7e23475f93b0 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #21: at::_ops::to_dtype::call(at::Tensor const&, c10::ScalarType, bool, bool, std::optional<c10::MemoryFormat>) + 0x178 (0x7e2346d3d258 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #22: mkodec::instrument(at::Tensor, int) + 0x4b (0x7e2282bc81ab in /opt/conda/lib/python3.10/site-packages/mk1/flywheel/runtime.cpython-310-x86_64-linux-gnu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #23: <unknown function> + 0x981e2 (0x7e2282bb01e2 in /opt/conda/lib/python3.10/site-packages/mk1/flywheel/runtime.cpython-310-x86_64-linux-gnu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #24: <unknown function> + 0xa7b9b (0x7e2282bbfb9b in /opt/conda/lib/python3.10/site-packages/mk1/flywheel/runtime.cpython-310-x86_64-linux-gnu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #25: python3() [0x4fd907]
arushimgupta-peft-save-1-v5-mkmlizer: <omitting python frames>
arushimgupta-peft-save-1-v5-mkmlizer: frame #28: python3() [0x5112cf]
arushimgupta-peft-save-1-v5-mkmlizer: frame #30: python3() [0x5112cf]
arushimgupta-peft-save-1-v5-mkmlizer: frame #43: python3() [0x5095ce]
arushimgupta-peft-save-1-v5-mkmlizer: frame #50: python3() [0x509857]
arushimgupta-peft-save-1-v5-mkmlizer: frame #54: python3() [0x5cf913]
arushimgupta-peft-save-1-v5-mkmlizer: frame #57: python3() [0x5951c2]
arushimgupta-peft-save-1-v5-mkmlizer: frame #59: python3() [0x5c5ef7]
arushimgupta-peft-save-1-v5-mkmlizer: frame #60: python3() [0x5c1030]
arushimgupta-peft-save-1-v5-mkmlizer: frame #61: python3() [0x459781]
arushimgupta-peft-save-1-v5-mkmlizer:
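The traceback itself warns that CUDA kernel errors are reported asynchronously, so the Python frames above may not point at the launch that actually failed. The usual debugging step it suggests can be sketched as follows; this is a repro-harness sketch, not the mkmlize.py entry point:

```python
import os

# CUDA kernel launches are asynchronous by default, so an error such as
# "invalid configuration argument" can surface at a later API call.
# Setting CUDA_LAUNCH_BLOCKING=1 *before* torch initializes CUDA forces
# each launch to synchronize, making the stack trace land on the real
# failing call (at a significant speed cost, so debugging only).
os.environ["CUDA_LAUNCH_BLOCKING"] = "1"

# torch must be imported after the variable is set, e.g.:
# import torch
```

Rebuilding with `TORCH_USE_CUDA_DSA`, as the message also notes, additionally enables device-side assertions for pinpointing the faulting kernel.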
Job arushimgupta-peft-save-1-v5-mkmlizer completed after 64.89s with status: failed
Stopping job with name arushimgupta-peft-save-1-v5-mkmlizer
%s, retrying in %s seconds...
Starting job with name arushimgupta-peft-save-1-v5-mkmlizer
Waiting for job on arushimgupta-peft-save-1-v5-mkmlizer to finish
arushimgupta-peft-save-1-v5-mkmlizer: Downloaded to shared memory in 16.662s
arushimgupta-peft-save-1-v5-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpz0mzyzqk, device:0
arushimgupta-peft-save-1-v5-mkmlizer: Saving flywheel model at /dev/shm/model_cache
arushimgupta-peft-save-1-v5-mkmlizer: Downloaded to shared memory in 15.827s
arushimgupta-peft-save-1-v5-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp9c4qp7od, device:0
arushimgupta-peft-save-1-v5-mkmlizer: Saving flywheel model at /dev/shm/model_cache
arushimgupta-peft-save-1-v5-mkmlizer: Loading 0: 0%| | 0/1203 [00:00<?, ?it/s]Traceback (most recent call last):
arushimgupta-peft-save-1-v5-mkmlizer: File "/code/uploading/mkmlize.py", line 151, in <module>
arushimgupta-peft-save-1-v5-mkmlizer: cli()
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
arushimgupta-peft-save-1-v5-mkmlizer: return self.main(*args, **kwargs)
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1078, in main
arushimgupta-peft-save-1-v5-mkmlizer: rv = self.invoke(ctx)
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
arushimgupta-peft-save-1-v5-mkmlizer: return _process_result(sub_ctx.command.invoke(sub_ctx))
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
arushimgupta-peft-save-1-v5-mkmlizer: return ctx.invoke(self.callback, **ctx.params)
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 783, in invoke
arushimgupta-peft-save-1-v5-mkmlizer: return __callback(*args, **kwargs)
arushimgupta-peft-save-1-v5-mkmlizer: File "/code/uploading/mkmlize.py", line 42, in quantize
arushimgupta-peft-save-1-v5-mkmlizer: quantize_model(temp_folder, output_path, profile, device)
arushimgupta-peft-save-1-v5-mkmlizer: File "/code/uploading/mkmlize.py", line 135, in quantize_model
Inference service mistralai-mistral-nemo-9330-v121 ready after 220.55023670196533s
arushimgupta-peft-save-1-v5-mkmlizer: flywheel.instrument(
Pipeline stage MKMLDeployer completed in 221.10s
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/mk1/flywheel/instrument.py", line 93, in instrument
run pipeline stage %s
arushimgupta-peft-save-1-v5-mkmlizer: compiler.save_pretrained(input_model_path, output_model_path, storage_format)
Running pipeline stage StressChecker
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/mk1/flywheel/functional/compiler.py", line 23, in save_pretrained
arushimgupta-peft-save-1-v5-mkmlizer: self.save_st_pretrained(input_model_path, output_model_path)
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/mk1/flywheel/functional/compiler.py", line 38, in save_st_pretrained
arushimgupta-peft-save-1-v5-mkmlizer: for name, tensor in model_iterator:
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/mk1/flywheel/models/mistral.py", line 241, in tensor_merger
arushimgupta-peft-save-1-v5-mkmlizer: for name, tensor in tensor_iterator:
arushimgupta-peft-save-1-v5-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/mk1/flywheel/functional/loader.py", line 217, in tensor_compiler
arushimgupta-peft-save-1-v5-mkmlizer: compiled_tensor = runtime.instrument(tensor, profile.value)
arushimgupta-peft-save-1-v5-mkmlizer: RuntimeError: CUDA error: invalid configuration argument
arushimgupta-peft-save-1-v5-mkmlizer: CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
arushimgupta-peft-save-1-v5-mkmlizer: For debugging consider passing CUDA_LAUNCH_BLOCKING=1
arushimgupta-peft-save-1-v5-mkmlizer: Compile with `TORCH_USE_CUDA_DSA` to enable device-side assertions.
arushimgupta-peft-save-1-v5-mkmlizer: Exception raised from c10_cuda_check_implementation at ../c10/cuda/CUDAException.cpp:43 (most recent call first):
arushimgupta-peft-save-1-v5-mkmlizer: frame #0: c10::Error::Error(c10::SourceLocation, std::string) + 0x96 (0x751b9ef77f86 in /opt/conda/lib/python3.10/site-packages/torch/lib/libc10.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #1: c10::detail::torchCheckFail(char const*, char const*, unsigned int, std::string const&) + 0x64 (0x751b9ef26d10 in /opt/conda/lib/python3.10/site-packages/torch/lib/libc10.so)
Received healthy response to inference request in 2.363326072692871s
arushimgupta-peft-save-1-v5-mkmlizer: frame #2: c10::cuda::c10_cuda_check_implementation(int, char const*, char const*, int, bool) + 0x118 (0x751b9f3a2f08 in /opt/conda/lib/python3.10/site-packages/torch/lib/libc10_cuda.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #3: void at::native::gpu_kernel_impl<__nv_hdl_wrapper_t<false, true, false, __nv_dl_tag<void (*)(at::TensorIteratorBase&), &at::native::direct_copy_kernel_cuda, 18u>, c10::Half (c10::Half)> >(at::TensorIteratorBase&, __nv_hdl_wrapper_t<false, true, false, __nv_dl_tag<void (*)(at::TensorIteratorBase&), &at::native::direct_copy_kernel_cuda, 18u>, c10::Half (c10::Half)> const&) + 0x4de (0x751b5190aa2e in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cuda.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #4: void at::native::gpu_kernel<__nv_hdl_wrapper_t<false, true, false, __nv_dl_tag<void (*)(at::TensorIteratorBase&), &at::native::direct_copy_kernel_cuda, 18u>, c10::Half (c10::Half)> >(at::TensorIteratorBase&, __nv_hdl_wrapper_t<false, true, false, __nv_dl_tag<void (*)(at::TensorIteratorBase&), &at::native::direct_copy_kernel_cuda, 18u>, c10::Half (c10::Half)> const&) + 0x34b (0x751b5190b01b in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cuda.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #5: at::native::direct_copy_kernel_cuda(at::TensorIteratorBase&) + 0x38c (0x751b518c9a0c in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cuda.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #6: at::native::copy_device_to_device(at::TensorIterator&, bool, bool) + 0xb25 (0x751b518ca715 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cuda.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #7: <unknown function> + 0x1910312 (0x751b518cc312 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cuda.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #8: <unknown function> + 0x1cbebff (0x751b86e99bff in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #9: at::native::copy_(at::Tensor&, at::Tensor const&, bool) + 0x62 (0x751b86e9b5a2 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #10: at::_ops::copy_::call(at::Tensor&, at::Tensor const&, bool) + 0x15c (0x751b87c5635c in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #11: at::native::_to_copy(at::Tensor const&, std::optional<c10::ScalarType>, std::optional<c10::Layout>, std::optional<c10::Device>, std::optional<bool>, bool, std::optional<c10::MemoryFormat>) + 0x1e01 (0x751b871b96b1 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #12: <unknown function> + 0x2e19f8b (0x751b87ff4f8b in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
Received healthy response to inference request in 1.7707421779632568s
arushimgupta-peft-save-1-v5-mkmlizer: frame #13: at::_ops::_to_copy::redispatch(c10::DispatchKeySet, at::Tensor const&, std::optional<c10::ScalarType>, std::optional<c10::Layout>, std::optional<c10::Device>, std::optional<bool>, bool, std::optional<c10::MemoryFormat>) + 0xf5 (0x751b876fdc25 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #14: <unknown function> + 0x2c58a33 (0x751b87e33a33 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #15: at::_ops::_to_copy::redispatch(c10::DispatchKeySet, at::Tensor const&, std::optional<c10::ScalarType>, std::optional<c10::Layout>, std::optional<c10::Device>, std::optional<bool>, bool, std::optional<c10::MemoryFormat>) + 0xf5 (0x751b876fdc25 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #16: <unknown function> + 0x470df1f (0x751b898e8f1f in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #17: <unknown function> + 0x470e35e (0x751b898e935e in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #18: at::_ops::_to_copy::call(at::Tensor const&, std::optional<c10::ScalarType>, std::optional<c10::Layout>, std::optional<c10::Device>, std::optional<bool>, bool, std::optional<c10::MemoryFormat>) + 0x1eb (0x751b8778d68b in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #19: at::native::to(at::Tensor const&, c10::ScalarType, bool, bool, std::optional<c10::MemoryFormat>) + 0xa2 (0x751b871b6182 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #20: <unknown function> + 0x301e3b0 (0x751b881f93b0 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #21: at::_ops::to_dtype::call(at::Tensor const&, c10::ScalarType, bool, bool, std::optional<c10::MemoryFormat>) + 0x178 (0x751b8793d258 in /opt/conda/lib/python3.10/site-packages/torch/lib/libtorch_cpu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #22: mkodec::instrument(at::Tensor, int) + 0x4b (0x751ac37c81ab in /opt/conda/lib/python3.10/site-packages/mk1/flywheel/runtime.cpython-310-x86_64-linux-gnu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #23: <unknown function> + 0x981e2 (0x751ac37b01e2 in /opt/conda/lib/python3.10/site-packages/mk1/flywheel/runtime.cpython-310-x86_64-linux-gnu.so)
Received healthy response to inference request in 1.785046100616455s
arushimgupta-peft-save-1-v5-mkmlizer: frame #24: <unknown function> + 0xa7b9b (0x751ac37bfb9b in /opt/conda/lib/python3.10/site-packages/mk1/flywheel/runtime.cpython-310-x86_64-linux-gnu.so)
arushimgupta-peft-save-1-v5-mkmlizer: frame #25: python3() [0x4fd907]
arushimgupta-peft-save-1-v5-mkmlizer: <omitting python frames>
arushimgupta-peft-save-1-v5-mkmlizer: frame #28: python3() [0x5112cf]
arushimgupta-peft-save-1-v5-mkmlizer: frame #30: python3() [0x5112cf]
arushimgupta-peft-save-1-v5-mkmlizer: frame #43: python3() [0x5095ce]
arushimgupta-peft-save-1-v5-mkmlizer: frame #50: python3() [0x509857]
arushimgupta-peft-save-1-v5-mkmlizer: frame #54: python3() [0x5cf913]
arushimgupta-peft-save-1-v5-mkmlizer: frame #57: python3() [0x5951c2]
arushimgupta-peft-save-1-v5-mkmlizer: frame #59: python3() [0x5c5ef7]
arushimgupta-peft-save-1-v5-mkmlizer: frame #60: python3() [0x5c1030]
Received healthy response to inference request in 1.944331169128418s
arushimgupta-peft-save-1-v5-mkmlizer: frame #61: python3() [0x459781]
arushimgupta-peft-save-1-v5-mkmlizer:
Received healthy response to inference request in 1.7631661891937256s
5 requests
0 failed requests
5th percentile: 1.7646813869476319
10th percentile: 1.7661965847015382
20th percentile: 1.7692269802093505
30th percentile: 1.7736029624938965
40th percentile: 1.7793245315551758
50th percentile: 1.785046100616455
60th percentile: 1.8487601280212402
70th percentile: 1.9124741554260254
80th percentile: 2.0281301498413087
90th percentile: 2.1957281112670897
95th percentile: 2.2795270919799804
99th percentile: 2.346566276550293
mean time: 1.9253223419189454
Pipeline stage StressChecker completed in 13.86s
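The percentile block above is consistent with linear interpolation between closest ranks over the sorted latency samples (numpy's default percentile method). A minimal reconstruction under that assumption — note that only three of the five request latencies actually appear in the log, so the demo below uses just those three:

```python
import statistics

def summarize_latencies(latencies):
    """Reconstruct the StressChecker summary: percentiles via linear
    interpolation between closest ranks, plus the mean. This is a sketch
    inferred from the log output, not the actual checker code."""
    xs = sorted(latencies)
    n = len(xs)

    def percentile(p):
        # Fractional rank into the sorted samples, then interpolate.
        k = (n - 1) * p / 100.0
        lo = int(k)
        hi = min(lo + 1, n - 1)
        return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

    pct = {p: percentile(p) for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99)}
    return pct, statistics.fmean(xs)

# The three latencies visible in the log; the other two of the five
# requests are not shown above.
pct, mean = summarize_latencies(
    [1.785046100616455, 1.944331169128418, 1.7631661891937256]
)
print(pct[50])  # → 1.785046100616455 (matches the logged 50th percentile)
```

The logged 50th percentile equals one of the raw samples, which is exactly what closest-rank interpolation produces for an odd sample count.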
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Job arushimgupta-peft-save-1-v5-mkmlizer completed after 54.6s with status: failed
Stopping job with name arushimgupta-peft-save-1-v5-mkmlizer
clean up pipeline due to error=MKMLizerError('')
Shutdown handler de-registered
MKMLizerError('')
arushimgupta-peft-save-1_v5 status is now failed due to DeploymentManager action
Pipeline stage TriggerMKMLProfilingPipeline completed in 8.56s
Shutdown handler de-registered
mistralai-mistral-nemo_9330_v121 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.29s
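The recurring "run pipeline stage %s" / "Running pipeline stage X" / "completed in Ns" / "skipped, reason=%s" lines imply a small stage-runner wrapper. A hypothetical reconstruction (the real pipeline code is not shown; note the log prints some templates unformatted, whereas this sketch fills the name in, and it returns the lines rather than logging them so the flow is easy to inspect):

```python
import time

def run_stage(name, fn, skip_reason=None):
    """Hypothetical stage runner inferred from the recurring log lines.
    Announces the stage, either skips it with a reason or runs it and
    reports the elapsed time. Returns the emitted log lines."""
    lines = ["run pipeline stage %s" % name]
    if skip_reason is not None:
        lines.append("Pipeline stage %s skipped, reason=%s" % (name, skip_reason))
        return lines
    lines.append("Running pipeline stage %s" % name)
    start = time.time()
    fn()  # the stage body itself
    lines.append("Pipeline stage %s completed in %.2fs" % (name, time.time() - start))
    return lines
```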
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.21s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service mistralai-mistral-nemo-9330-v121-profiler
Waiting for inference service mistralai-mistral-nemo-9330-v121-profiler to be ready
Tearing down inference service mistralai-mistral-nemo-9330-v121-profiler
%s, retrying in %s seconds...
Creating inference service mistralai-mistral-nemo-9330-v121-profiler
Waiting for inference service mistralai-mistral-nemo-9330-v121-profiler to be ready
Tearing down inference service mistralai-mistral-nemo-9330-v121-profiler
%s, retrying in %s seconds...
Creating inference service mistralai-mistral-nemo-9330-v121-profiler
Waiting for inference service mistralai-mistral-nemo-9330-v121-profiler to be ready
Tearing down inference service mistralai-mistral-nemo-9330-v121-profiler
clean up pipeline due to error=%s
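The three create / wait / teardown cycles above, each followed by "retrying in %s seconds", suggest a bounded retry loop around service readiness. A sketch of that control flow only — `create`, `wait_ready`, and `teardown` are hypothetical stand-ins for the real inference-service client:

```python
import time

def deploy_with_retries(create, wait_ready, teardown, attempts=3, delay=30):
    """Sketch of the create/wait/teardown retry loop visible in the log.
    On each failed readiness wait the service is torn down; after the
    final attempt the last error is surfaced, mirroring the
    'clean up pipeline due to error=...' line."""
    last_err = None
    for attempt in range(attempts):
        create()
        try:
            wait_ready()
            return  # service became ready
        except Exception as err:
            last_err = err
            teardown()
            if attempt < attempts - 1:
                print("%s, retrying in %s seconds..." % (err, delay))
                time.sleep(delay)
    raise RuntimeError("clean up pipeline due to error=%s" % last_err)
```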
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.29s
Shutdown handler de-registered
mistralai-mistral-nemo_9330_v121 status is now inactive due to auto deactivation (removed underperforming models)
Shutdown handler de-registered
admin requested tearing down of blend_hanik_2024-09-27
Tearing down inference service mistralai-mistral-nemo-9330-v112
Service mistralai-mistral-nemo-9330-v111 has been torndown
Checking if service mistralai-mistral-nemo-9330-v114 is running
Pipeline stage MKMLDeleter completed in 131.35s
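The MKMLDeleter lines follow a consistent check-then-teardown pattern ("Checking if service X is running" → "Tearing down inference service X" → "Service X has been torndown", or "Skipping teardown as no inference service was found"). A hypothetical sketch of that flow; `get_service` and `delete_service` stand in for the real client, and the lines are returned rather than logged:

```python
def teardown_service(name, get_service, delete_service):
    """Reconstruction of the MKMLDeleter check-then-teardown flow
    inferred from the log; 'torndown' is kept as the log's own spelling."""
    lines = ["Checking if service %s is running" % name]
    if get_service(name) is None:
        lines.append("Skipping teardown as no inference service was found")
        return lines
    lines.append("Tearing down inference service %s" % name)
    delete_service(name)
    lines.append("Service %s has been torndown" % name)
    return lines
```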
run pipeline stage %s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
run pipeline %s
Shutdown handler de-registered
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
admin requested tearing down of mistralai-mistral-nemo_9330_v121
run pipeline stage %s
blend_fulat_2024-09-27 status is now torndown due to DeploymentManager action
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of blend_kufab_2024-09-27
Service mistralai-mistral-nemo-9330-v112 has been torndown
Pipeline stage MKMLDeleter completed in 129.81s
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
Tearing down inference service mistralai-mistral-nemo-9330-v114
Checking if service mistralai-mistral-nemo-9330-v115 is running
Running pipeline stage MKMLDeleter
admin requested tearing down of blend_rofur_2024-10-03
admin requested tearing down of blend_dones_2024-09-27
run pipeline stage %s
blend_fulat_2024-09-27 status is now torndown due to DeploymentManager action
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of blend_kufab_2024-09-27
Pipeline stage MKMLDeleter completed in 210.82s
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
run pipeline stage %s
Service mistralai-mistral-nemo-9330-v114 has been torndown
blend_gelom_2024-09-27 status is now torndown due to DeploymentManager action
Tearing down inference service mistralai-mistral-nemo-9330-v115
Shutdown handler de-registered
Checking if service mistralai-mistral-nemo-9330-v116 is running
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Running pipeline stage MKMLDeployer
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of blend_rofur_2024-10-03
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Cleaning model data from S3
Cleaning model data from model cache
Running pipeline stage MKMLDeleter
Tearing down inference service blend-rofur-2024-10-03
Pipeline stage MKMLDeleter completed in 243.15s
Service mistralai-mistral-nemo-9330-v115 has been torndown
blend_gujen_2024-09-27 status is now torndown due to DeploymentManager action
run pipeline %s
Tearing down inference service mistralai-mistral-nemo-9330-v116
run pipeline %s
run pipeline stage %s
run pipeline %s
Creating inference service blend-rofur-2024-10-03
Shutdown handler not registered because Python interpreter is not running in the main thread
Shutdown handler de-registered
run pipeline %s
Cleaning model data from S3
Running pipeline stage MKMLModelDeleter
Cleaning model data from model cache
Deleting key meta-llama-llama-3-1-8b-7331-v1/config.json from bucket guanaco-mkml-models
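Across the cleanup sections, each model directory has the same five keys deleted from the guanaco-mkml-models bucket. A small sketch enumerating that artifact set as observed in the log (the real deleter presumably lists the S3 prefix rather than hardcoding filenames):

```python
# Artifact filenames observed in the "Deleting key ..." lines of this log.
MODEL_ARTIFACTS = [
    "config.json",
    "flywheel_model.0.safetensors",
    "special_tokens_map.json",
    "tokenizer.json",
    "tokenizer_config.json",
]

def model_keys(model_dir, artifacts=MODEL_ARTIFACTS):
    """Return the bucket keys a model cleanup would delete for one
    model directory, based purely on the filenames seen in the log."""
    return ["%s/%s" % (model_dir, name) for name in artifacts]
```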
Checking if service mistralai-mistral-nemo-9330-v117 is running
%s, retrying in %s seconds...
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 310.64s
run pipeline stage %s
Service mistralai-mistral-nemo-9330-v116 has been torndown
run pipeline stage %s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Ignoring service blend-rofur-2024-10-03 already deployed
run pipeline %s
blend_hanik_2024-09-27 status is now torndown due to DeploymentManager action
Shutdown handler de-registered
Cleaning model data from S3
Deleting key mistralai-mistral-nemo-9330-v110/config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key meta-llama-llama-3-1-8b-7331-v1/flywheel_model.0.safetensors from bucket guanaco-mkml-models
admin requested tearing down of blend_dones_2024-09-27
Creating inference service blend-rofur-2024-10-03
admin requested tearing down of blend_rofur_2024-10-03
Tearing down inference service mistralai-mistral-nemo-9330-v117
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Running pipeline stage ProductionBlendMKMLTemplater
Pipeline stage MKMLDeleter completed in 400.79s
Running pipeline stage MKMLDeleter
Checking if service mistralai-mistral-nemo-9330-v119 is running
Running pipeline stage ProductionBlendMKMLTemplater
Waiting for inference service blend-rofur-2024-10-03 to be ready
admin requested tearing down of blend_rofur_2024-10-03
run pipeline stage %s
Deleting key mistralai-mistral-nemo-9330-v110/flywheel_model.0.safetensors from bucket guanaco-mkml-models
blend_kufab_2024-09-27 status is now torndown due to DeploymentManager action
Deleting key mistralai-mistral-nemo-9330-v111/config.json from bucket guanaco-mkml-models
blend_dones_2024-09-27 status is now torndown due to DeploymentManager action
Pipeline stage MKMLDeleter completed in 60.17s
Shutdown handler de-registered
blend_fulat_2024-09-27 status is now torndown due to DeploymentManager action
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of blend_kufab_2024-09-27
Running pipeline stage MKMLDeleter
run pipeline stage %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Cleaning model data from model cache
Running pipeline stage MKMLModelDeleter
Running pipeline stage MKMLDeleter
run pipeline stage %s
Cleaning model data from S3
run pipeline %s
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 4.49s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v121
Cleaning model data from S3
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
Cleaning model data from model cache
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v122
Cleaning model data from model cache
Pipeline stage MKMLDeleter completed in 6.64s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
Deleting key meta-llama-llama-3-1-8b-7331-v1/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key mistralai-mistral-nemo-9330-v110/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Cleaning model data from S3
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v123
Deleting key mistralai-mistral-nemo-9330-v111/config.json from bucket guanaco-mkml-models
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 8.10s
Deleting key meta-llama-llama-3-1-8b-7331-v1/tokenizer.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Running pipeline stage MKMLDeleter
Deleting key mistralai-mistral-nemo-9330-v110/special_tokens_map.json from bucket guanaco-mkml-models
Cleaning model data from S3
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Checking if service mistralai-mistral-nemo-9330-v117 is running
Deleting key mistralai-mistral-nemo-9330-v111/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Pipeline stage MKMLDeleter completed in 5.80s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Cleaning model data from model cache
run pipeline %s
Cleaning model data from S3
Shutdown handler not registered because Python interpreter is not running in the main thread
Running pipeline stage MKMLModelDeleter
admin requested tearing down of mistralai-mistral-nemo_9330_v121
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 7.47s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Cleaning model data from model cache
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Cleaning model data from S3
admin requested tearing down of mistralai-mistral-nemo_9330_v122
Running pipeline stage MKMLModelDeleter
Deleting key meta-llama-llama-3-1-8b-7331-v1/tokenizer_config.json from bucket guanaco-mkml-models
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 8.35s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Deleting key mistralai-mistral-nemo-9330-v110/tokenizer.json from bucket guanaco-mkml-models
run pipeline %s
Cleaning model data from model cache
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v123
Cleaning model data from S3
Running pipeline stage MKMLModelDeleter
Pipeline stage MKMLModelDeleter completed in 19.43s
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 9.91s
Running pipeline stage MKMLDeleter
Deleting key mistralai-mistral-nemo-9330-v110/tokenizer_config.json from bucket guanaco-mkml-models
run pipeline stage %s
Deleting key mistralai-mistral-nemo-9330-v111/flywheel_model.0.safetensors from bucket guanaco-mkml-models
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v124
Checking if service mistralai-mistral-nemo-9330-v117 is running
Cleaning model data from model cache
Cleaning model data from S3
Shutdown handler de-registered
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Checking if service mistralai-mistral-nemo-9330-v119 is running
Running pipeline stage MKMLDeleter
Pipeline stage MKMLModelDeleter completed in 6.91s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline stage %s
run pipeline %s
Running pipeline stage MKMLModelDeleter
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v121
Pipeline stage MKMLDeleter completed in 8.18s
Pipeline stage MKMLModelDeleter completed in 7.47s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLModelDeleter
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline %s
Cleaning model data from S3
Shutdown handler not registered because Python interpreter is not running in the main thread
meta-llama-llama-3-1-8b-_7331_v1 status is now torndown due to DeploymentManager action
admin requested tearing down of mistralai-mistral-nemo_9330_v122
Shutdown handler de-registered
Pipeline stage MKMLDeleter completed in 9.61s
Cleaning model data from S3
Running pipeline stage MKMLDeleter
Connection pool is full, discarding connection: %s. Connection pool size: %s
run pipeline stage %s
Cleaning model data from model cache
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Running pipeline stage MKMLModelDeleter
admin requested tearing down of mistralai-mistral-nemo_9330_v123
mistralai-mistral-nemo_9330_v110 status is now torndown due to DeploymentManager action
run pipeline stage %s
Cleaning model data from model cache
Connection pool is full, discarding connection: %s. Connection pool size: %s
Pipeline stage MKMLDeleter completed in 11.02s
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline %s
Cleaning model data from S3
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v124
Running pipeline stage MKMLModelDeleter
Checking if service mistralai-mistral-nemo-9330-v117 is running
Deleting key mistralai-mistral-nemo-9330-v112/config.json from bucket guanaco-mkml-models
Deleting key mistralai-mistral-nemo-9330-v111/special_tokens_map.json from bucket guanaco-mkml-models
run pipeline stage %s
Checking if service mistralai-mistral-nemo-9330-v119 is running
Running pipeline stage MKMLDeleter
run pipeline stage %s
Cleaning model data from model cache
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Cleaning model data from S3
admin requested tearing down of mistralai-mistral-nemo_9330_v125
Deleting key mistralai-mistral-nemo-9330-v112/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key mistralai-mistral-nemo-9330-v111/tokenizer.json from bucket guanaco-mkml-models
Running pipeline stage MKMLModelDeleter
Tearing down inference service mistralai-mistral-nemo-9330-v117
Running pipeline stage MKMLDeleter
Checking if service mistralai-mistral-nemo-9330-v121 is running
Deleting key mistralai-mistral-nemo-9330-v114/config.json from bucket guanaco-mkml-models
run pipeline stage %s
Tearing down inference service mistralai-mistral-nemo-9330-v119
run pipeline %s
Cleaning model data from model cache
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v126
Deleting key mistralai-mistral-nemo-9330-v112/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key mistralai-mistral-nemo-9330-v111/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from S3
Service mistralai-mistral-nemo-9330-v117 has been torndown
Checking if service mistralai-mistral-nemo-9330-v122 is running
Deleting key mistralai-mistral-nemo-9330-v114/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Tearing down inference service mistralai-mistral-nemo-9330-v121
Running pipeline stage MKMLDeleter
Service mistralai-mistral-nemo-9330-v119 has been torndown
run pipeline stage %s
Deleting key mistralai-mistral-nemo-9330-v115/config.json from bucket guanaco-mkml-models
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v127
Deleting key mistralai-mistral-nemo-9330-v112/tokenizer.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Pipeline stage MKMLModelDeleter completed in 40.75s
Pipeline stage MKMLDeleter completed in 31.29s
Service mistralai-mistral-nemo-9330-v121 has been torndown
Deleting key mistralai-mistral-nemo-9330-v114/special_tokens_map.json from bucket guanaco-mkml-models
Checking if service mistralai-mistral-nemo-9330-v123 is running
Pipeline stage MKMLDeleter completed in 25.17s
Tearing down inference service mistralai-mistral-nemo-9330-v122
Running pipeline stage MKMLDeleter
Deleting key mistralai-mistral-nemo-9330-v115/flywheel_model.0.safetensors from bucket guanaco-mkml-models
run pipeline stage %s
run pipeline %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Running pipeline stage MKMLModelDeleter
Deleting key cycy233-nemo-p-e-v4-c4-v1/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key cycy233-nemo-p-e-v4-c3-v1/config.json from bucket guanaco-mkml-models
Running pipeline stage MKMLDeleter
run pipeline stage %s
Deleting key cycy233-nemo-p-e-v4-c1-v1/flywheel_model.0.safetensors from bucket guanaco-mkml-models
run pipeline stage %s
run pipeline %s
Pipeline stage MKMLDeleter completed in 7.15s
Shutdown handler not registered because Python interpreter is not running in the main thread
Deleting key chaiml-nemo-chai-5merge-ties-v2/tokenizer_config.json from bucket guanaco-mkml-models
admin requested tearing down of mistralai-mistral-nemo_9330_v121
Deleting key cycy233-nemo-p-e-v4-c3-v1/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLModelDeleter
Deleting key cycy233-nemo-p-e-v4-c4-v1/special_tokens_map.json from bucket guanaco-mkml-models
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline stage %s
Deleting key cycy233-nemo-p-e-v4-c1-v1/special_tokens_map.json from bucket guanaco-mkml-models
run pipeline %s
Pipeline stage MKMLModelDeleter completed in 27.27s
Shutdown handler not registered because Python interpreter is not running in the main thread
Pipeline stage MKMLModelDeleter completed in 9.43s
admin requested tearing down of mistralai-mistral-nemo_9330_v122
Pipeline stage MKMLDeleter completed in 9.52s
Deleting key cycy233-nemo-p-e-v4-c3-v1/special_tokens_map.json from bucket guanaco-mkml-models
Cleaning model data from S3
Deleting key cycy233-nemo-p-e-v4-c4-v1/tokenizer.json from bucket guanaco-mkml-models
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
Running pipeline stage MKMLModelDeleter
Deleting key cycy233-nemo-p-e-v4-c1-v1/tokenizer.json from bucket guanaco-mkml-models
run pipeline stage %s
Shutdown handler de-registered
run pipeline %s
Shutdown handler de-registered
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v123
Deleting key cycy233-nemo-p-e-v4-c3-v1/tokenizer.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Pipeline stage MKMLDeleter completed in 12.10s
Pipeline stage %s skipped, reason=%s
Cleaning model data from S3
Running pipeline stage MKMLDeleter
Deleting key cycy233-nemo-p-e-v4-c1-v1/tokenizer_config.json from bucket guanaco-mkml-models
chaiml-nemo-chai-5merge-ties_v2 status is now torndown due to DeploymentManager action
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
mistralai-mistral-nemo_9330_v111 status is now torndown due to DeploymentManager action
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Deleting key cycy233-nemo-p-e-v4-c3-v1/tokenizer_config.json from bucket guanaco-mkml-models
admin requested tearing down of mistralai-mistral-nemo_9330_v124
Pipeline stage MKMLModelDeleter completed in 38.75s
Deleting key mistralai-mistral-nemo-9330-v112/tokenizer_config.json from bucket guanaco-mkml-models
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 15.19s
Cleaning model data from model cache
Pipeline stage %s skipped, reason=%s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
run pipeline stage %s
run pipeline %s
Deleting key cycy233-nemo-p-e-v4-c2-v1/config.json from bucket guanaco-mkml-models
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v121
Pipeline stage MKMLDeleter completed in 6.17s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLModelDeleter
Running pipeline stage MKMLDeleter
run pipeline stage %s
Deleting key cycy233-nemo-p-e-v4-c2-v1/flywheel_model.0.safetensors from bucket guanaco-mkml-models
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline stage %s
admin requested tearing down of mistralai-mistral-nemo_9330_v122
Pipeline stage MKMLDeleter completed in 8.01s
Pipeline stage MKMLModelDeleter completed in 7.75s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline %s
Running pipeline stage MKMLModelDeleter
Deleting key cycy233-nemo-p-e-v4-c2-v1/special_tokens_map.json from bucket guanaco-mkml-models
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v123
run pipeline stage %s
Shutdown handler de-registered
Pipeline stage MKMLModelDeleter completed in 9.33s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Cleaning model data from S3
run pipeline %s
Deleting key cycy233-nemo-p-e-v4-c2-v1/tokenizer.json from bucket guanaco-mkml-models
Running pipeline stage MKMLModelDeleter
Shutdown handler not registered because Python interpreter is not running in the main thread
mistralai-mistral-nemo_9330_v111 status is now torndown due to DeploymentManager action
admin requested tearing down of mistralai-mistral-nemo_9330_v124
Shutdown handler de-registered
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 11.40s
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
Deleting key cycy233-nemo-p-e-v4-c2-v1/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from S3
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v125
mistralai-mistral-nemo_9330_v112 status is now torndown due to DeploymentManager action
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 12.32s
Checking if service mistralai-mistral-nemo-9330-v121 is running
Deleting key mistralai-mistral-nemo-9330-v114/special_tokens_map.json from bucket guanaco-mkml-models
Running pipeline stage MKMLDeleter
Pipeline stage MKMLModelDeleter completed in 37.58s
Cleaning model data from model cache
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v126
Cleaning model data from S3
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Deleting key mistralai-mistral-nemo-9330-v114/tokenizer.json from bucket guanaco-mkml-models
Checking if service mistralai-mistral-nemo-9330-v122 is running
Shutdown handler de-registered
Deleting key mistralai-mistral-nemo-9330-v115/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline %s
Skipping teardown as no inference service was found
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v127
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Deleting key mistralai-mistral-nemo-9330-v114/tokenizer_config.json from bucket guanaco-mkml-models
cycy233-nemo-p-e-v4-c2_v1 status is now torndown due to DeploymentManager action
Tearing down inference service mistralai-mistral-nemo-9330-v122
Running pipeline stage MKMLDeleter
Checking if service mistralai-mistral-nemo-9330-v123 is running
Deleting key mistralai-mistral-nemo-9330-v115/special_tokens_map.json from bucket guanaco-mkml-models
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 21.86s
Deleting key mistralai-mistral-nemo-9330-v116/config.json from bucket guanaco-mkml-models
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v128
Cleaning model data from model cache
Cleaning model data from S3
Pipeline stage MKMLModelDeleter completed in 36.26s
Service mistralai-mistral-nemo-9330-v122 has been torndown
Checking if service mistralai-mistral-nemo-9330-v124 is running
Deleting key mistralai-mistral-nemo-9330-v115/tokenizer.json from bucket guanaco-mkml-models
Running pipeline stage MKMLDeleter
run pipeline stage %s
Deleting key mistralai-mistral-nemo-9330-v116/flywheel_model.0.safetensors from bucket guanaco-mkml-models
run pipeline stage %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
run pipeline stage %s
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
Pipeline stage MKMLDeleter completed in 5.14s
run pipeline %s
Pipeline stage %s skipped, reason=%s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v121
Running pipeline stage MKMLModelDeleter
Running pipeline stage MKMLDeleter
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 5.96s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v122
Pipeline stage %s skipped, reason=%s
Pipeline stage MKMLModelDeleter completed in 7.37s
Running pipeline stage MKMLModelDeleter
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v123
Pipeline stage MKMLModelDeleter completed in 8.40s
Pipeline stage MKMLDeleter completed in 8.43s
Shutdown handler de-registered
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLModelDeleter
run pipeline %s
Shutdown handler de-registered
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v124
run pipeline stage %s
Pipeline stage MKMLModelDeleter completed in 10.79s
Pipeline stage MKMLDeleter completed in 11.04s
Pipeline stage %s skipped, reason=%s
Cleaning model data from S3
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline %s
mistralai-mistral-nemo_9330_v112 status is now torndown due to DeploymentManager action
Shutdown handler not registered because Python interpreter is not running in the main thread
Running pipeline stage MKMLModelDeleter
admin requested tearing down of mistralai-mistral-nemo_9330_v125
run pipeline stage %s
Shutdown handler de-registered
Pipeline stage MKMLDeleter completed in 12.65s
Cleaning model data from model cache
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Cleaning model data from S3
Shutdown handler not registered because Python interpreter is not running in the main thread
Running pipeline stage MKMLModelDeleter
admin requested tearing down of mistralai-mistral-nemo_9330_v126
run pipeline stage %s
mistralai-mistral-nemo_9330_v114 status is now torndown due to DeploymentManager action
Pipeline stage MKMLDeleter completed in 12.17s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Cleaning model data from model cache
run pipeline %s
Cleaning model data from S3
Shutdown handler not registered because Python interpreter is not running in the main thread
Deleting key mistralai-mistral-nemo-9330-v115/tokenizer_config.json from bucket guanaco-mkml-models
admin requested tearing down of mistralai-mistral-nemo_9330_v127
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 10.89s
Running pipeline stage MKMLDeleter
Deleting key mistralai-mistral-nemo-9330-v116/special_tokens_map.json from bucket guanaco-mkml-models
run pipeline stage %s
Cleaning model data from model cache
run pipeline %s
Pipeline stage MKMLModelDeleter completed in 27.95s
Shutdown handler not registered because Python interpreter is not running in the main thread
Checking if service mistralai-mistral-nemo-9330-v123 is running
Cleaning model data from S3
admin requested tearing down of mistralai-mistral-nemo_9330_v128
Running pipeline stage MKMLModelDeleter
Connection pool is full, discarding connection: %s. Connection pool size: %s
Running pipeline stage MKMLDeleter
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
admin requested tearing down of mistralai-mistral-nemo_9330_v121
Pipeline stage %s skipped, reason=%s
Pipeline stage MKMLDeleter completed in 6.97s
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline %s
Running pipeline stage MKMLModelDeleter
Pipeline stage %s skipped, reason=%s
Shutdown handler not registered because Python interpreter is not running in the main thread
Pipeline stage MKMLDeleter completed in 8.03s
admin requested tearing down of mistralai-mistral-nemo_9330_v122
run pipeline stage %s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Pipeline stage %s skipped, reason=%s
Pipeline stage MKMLModelDeleter completed in 10.53s
run pipeline %s
run pipeline stage %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v123
Running pipeline stage MKMLModelDeleter
Pipeline stage MKMLDeleter completed in 11.58s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
Pipeline stage MKMLModelDeleter completed in 11.33s
Shutdown handler de-registered
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v124
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 12.63s
Pipeline stage %s skipped, reason=%s
Shutdown handler de-registered
Running pipeline stage MKMLDeleter
mistralai-mistral-nemo_9330_v111 status is now torndown due to DeploymentManager action
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
run pipeline %s
Pipeline stage MKMLModelDeleter completed in 13.11s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v125
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 15.62s
mistralai-mistral-nemo_9330_v112 status is now torndown due to DeploymentManager action
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
Pipeline stage MKMLModelDeleter completed in 15.64s
run pipeline stage %s
Shutdown handler de-registered
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
admin requested tearing down of mistralai-mistral-nemo_9330_v126
Cleaning model data from S3
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 14.30s
Pipeline stage %s skipped, reason=%s
Shutdown handler de-registered
mistralai-mistral-nemo_9330_v114 status is now torndown due to DeploymentManager action
Running pipeline stage MKMLDeleter
run pipeline %s
run pipeline stage %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v127
Cleaning model data from S3
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 13.86s
mistralai-mistral-nemo_9330_v115 status is now torndown due to DeploymentManager action
run pipeline stage %s
Running pipeline stage MKMLDeleter
run pipeline %s
Checking if service mistralai-mistral-nemo-9330-v123 is running
Cleaning model data from model cache
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v128
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
run pipeline stage %s
Running pipeline stage MKMLDeleter
Deleting key mistralai-mistral-nemo-9330-v116/tokenizer.json from bucket guanaco-mkml-models
Checking if service mistralai-mistral-nemo-9330-v124 is running
run pipeline stage %s
Deleting key mistralai-mistral-nemo-9330-v117/config.json from bucket guanaco-mkml-models
run pipeline %s
Tearing down inference service mistralai-mistral-nemo-9330-v123
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v129
Cleaning model data from model cache
Cleaning model data from S3
Running pipeline stage MKMLModelDeleter
Checking if service mistralai-mistral-nemo-9330-v125 is running
Deleting key mistralai-mistral-nemo-9330-v116/tokenizer_config.json from bucket guanaco-mkml-models
Running pipeline stage MKMLDeleter
Deleting key mistralai-mistral-nemo-9330-v117/flywheel_model.0.safetensors from bucket guanaco-mkml-models
run pipeline stage %s
Tearing down inference service mistralai-mistral-nemo-9330-v124
Service mistralai-mistral-nemo-9330-v123 has been torndown
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v130
Deleting key mistralai-mistral-nemo-9330-v119/config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Cleaning model data from S3
Pipeline stage MKMLModelDeleter completed in 40.21s
Checking if service mistralai-mistral-nemo-9330-v126 is running
Running pipeline stage MKMLDeleter
Tearing down inference service mistralai-mistral-nemo-9330-v125
Deleting key mistralai-mistral-nemo-9330-v117/special_tokens_map.json from bucket guanaco-mkml-models
Service mistralai-mistral-nemo-9330-v124 has been torndown
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 30.76s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Deleting key mistralai-mistral-nemo-9330-v119/flywheel_model.0.safetensors from bucket guanaco-mkml-models
admin requested tearing down of mistralai-mistral-nemo_9330_v131
Deleting key mistralai-mistral-nemo-9330-v121/config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Shutdown handler de-registered
Checking if service mistralai-mistral-nemo-9330-v127 is running
Service mistralai-mistral-nemo-9330-v125 has been torndown
Deleting key mistralai-mistral-nemo-9330-v117/tokenizer.json from bucket guanaco-mkml-models
Tearing down inference service mistralai-mistral-nemo-9330-v126
Pipeline stage MKMLDeleter completed in 28.46s
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v132
Deleting key mistralai-mistral-nemo-9330-v121/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key mistralai-mistral-nemo-9330-v122/config.json from bucket guanaco-mkml-models
mistralai-mistral-nemo_9330_v116 status is now torndown due to DeploymentManager action
Pipeline stage MKMLDeleter completed in 30.12s
Deleting key mistralai-mistral-nemo-9330-v117/tokenizer_config.json from bucket guanaco-mkml-models
Tearing down inference service mistralai-mistral-nemo-9330-v127
Service mistralai-mistral-nemo-9330-v126 has been torndown
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
Checking if service mistralai-mistral-nemo-9330-v128 is running
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline %s
Deleting key mistralai-mistral-nemo-9330-v119/tokenizer.json from bucket guanaco-mkml-models
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v133
Deleting key mistralai-mistral-nemo-9330-v121/special_tokens_map.json from bucket guanaco-mkml-models
Connection pool is full, discarding connection: %s. Connection pool size: %s
Deleting key mistralai-mistral-nemo-9330-v122/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Connection pool is full, discarding connection: %s. Connection pool size: %s
run pipeline stage %s
Service mistralai-mistral-nemo-9330-v127 has been torndown
Pipeline stage MKMLModelDeleter completed in 54.78s
Pipeline stage MKMLDeleter completed in 28.92s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
admin requested tearing down of cycy233-nemo-p-e-v4-c1_v1
run pipeline %s
Running pipeline stage MKMLDeleter
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline stage %s
admin requested tearing down of mistralai-mistral-nemo_9330_v121
run pipeline stage %s
Pipeline stage %s skipped, reason=%s
run pipeline %s
Running pipeline stage MKMLModelDeleter
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v122
Running pipeline stage MKMLDeleter
Pipeline stage MKMLDeleter completed in 5.51s
run pipeline stage %s
Pipeline stage %s skipped, reason=%s
run pipeline %s
Pipeline stage %s skipped, reason=%s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v123
run pipeline stage %s
Running pipeline stage MKMLDeleter
Pipeline stage MKMLModelDeleter completed in 6.16s
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 5.27s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v124
Running pipeline stage MKMLModelDeleter
Pipeline stage %s skipped, reason=%s
Shutdown handler de-registered
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline stage %s
admin requested tearing down of mistralai-mistral-nemo_9330_v125
Pipeline stage %s skipped, reason=%s
Pipeline stage MKMLDeleter completed in 8.23s
mistralai-mistral-nemo_9330_v115 status is now torndown due to DeploymentManager action
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLModelDeleter
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Pipeline stage MKMLModelDeleter completed in 9.71s
admin requested tearing down of mistralai-mistral-nemo_9330_v126
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 10.68s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
run pipeline %s
Shutdown handler de-registered
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v127
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Pipeline stage MKMLModelDeleter completed in 10.15s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline %s
mistralai-mistral-nemo_9330_v116 status is now torndown due to DeploymentManager action
Running pipeline stage MKMLModelDeleter
Shutdown handler not registered because Python interpreter is not running in the main thread
Cleaning model data from S3
admin requested tearing down of mistralai-mistral-nemo_9330_v128
Shutdown handler de-registered
Pipeline stage MKMLDeleter completed in 10.64s
run pipeline stage %s
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
Running pipeline stage MKMLDeleter
run pipeline %s
Cleaning model data from model cache
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v129
mistralai-mistral-nemo_9330_v117 status is now torndown due to DeploymentManager action
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
Pipeline stage MKMLDeleter completed in 11.99s
Running pipeline stage MKMLDeleter
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
Cleaning model data from model cache
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Running pipeline stage MKMLModelDeleter
admin requested tearing down of mistralai-mistral-nemo_9330_v130
run pipeline stage %s
Cleaning model data from S3
Deleting key mistralai-mistral-nemo-9330-v119/tokenizer.json from bucket guanaco-mkml-models
Pipeline stage %s skipped, reason=%s
Pipeline stage MKMLDeleter completed in 10.48s
Running pipeline stage MKMLDeleter
Deleting key mistralai-mistral-nemo-9330-v121/special_tokens_map.json from bucket guanaco-mkml-models
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Cleaning model data from S3
admin requested tearing down of mistralai-mistral-nemo_9330_v131
Running pipeline stage MKMLModelDeleter
Cleaning model data from model cache
Deleting key mistralai-mistral-nemo-9330-v119/tokenizer_config.json from bucket guanaco-mkml-models
Pipeline stage MKMLDeleter completed in 11.46s
run pipeline stage %s
Deleting key mistralai-mistral-nemo-9330-v121/tokenizer.json from bucket guanaco-mkml-models
Running pipeline stage MKMLDeleter
run pipeline stage %s
Checking if service mistralai-mistral-nemo-9330-v127 is running
run pipeline %s
Cleaning model data from model cache
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v132
Cleaning model data from S3
Deleting key mistralai-mistral-nemo-9330-v122/flywheel_model.0.safetensors from bucket guanaco-mkml-models
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
Pipeline stage MKMLModelDeleter completed in 34.03s
Deleting key mistralai-mistral-nemo-9330-v121/tokenizer_config.json from bucket guanaco-mkml-models
Running pipeline stage MKMLDeleter
Checking if service mistralai-mistral-nemo-9330-v128 is running
run pipeline stage %s
Deleting key mistralai-mistral-nemo-9330-v123/config.json from bucket guanaco-mkml-models
Skipping teardown as no inference service was found
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Cleaning model data from model cache
admin requested tearing down of mistralai-mistral-nemo_9330_v133
Running pipeline stage MKMLModelDeleter
Deleting key mistralai-mistral-nemo-9330-v122/special_tokens_map.json from bucket guanaco-mkml-models
Cleaning model data from S3
Shutdown handler de-registered
Pipeline stage MKMLModelDeleter completed in 35.85s
Checking if service mistralai-mistral-nemo-9330-v129 is running
Running pipeline stage MKMLDeleter
Deleting key mistralai-mistral-nemo-9330-v123/flywheel_model.0.safetensors from bucket guanaco-mkml-models
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 22.85s
Tearing down inference service mistralai-mistral-nemo-9330-v128
run pipeline %s
Deleting key mistralai-mistral-nemo-9330-v124/config.json from bucket guanaco-mkml-models
Shutdown handler not registered because Python interpreter is not running in the main thread
Cleaning model data from S3
admin requested tearing down of mistralai-mistral-nemo_9330_v134
Deleting key mistralai-mistral-nemo-9330-v122/tokenizer.json from bucket guanaco-mkml-models
mistralai-mistral-nemo_9330_v119 status is now torndown due to DeploymentManager action
Shutdown handler de-registered
Connection pool is full, discarding connection: %s. Connection pool size: %s
Tearing down inference service mistralai-mistral-nemo-9330-v129
Checking if service mistralai-mistral-nemo-9330-v130 is running
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
run pipeline %s
run pipeline stage %s
Cleaning model data from S3
Cleaning model data from model cache
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v131
Pipeline stage MKMLDeleter completed in 12.40s
run pipeline stage %s
Pipeline stage %s skipped, reason=%s
mistralai-mistral-nemo_9330_v121 status is now torndown due to DeploymentManager action
Cleaning model data from S3
run pipeline stage %s
Running pipeline stage MKMLDeleter
Cleaning model data from model cache
Pipeline stage %s skipped, reason=%s
mistralai-mistral-nemo_9330_v119 status is now torndown due to DeploymentManager action
Pipeline stage MKMLModelDeleter completed in 12.19s
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline %s
Running pipeline stage MKMLModelDeleter
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v131
mistralai-mistral-nemo_9330_v121 status is now torndown due to DeploymentManager action
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 12.96s
Shutdown handler de-registered
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
run pipeline %s
run pipeline stage %s
Pipeline stage %s skipped, reason=%s
Shutdown handler not registered because Python interpreter is not running in the main thread
Pipeline stage MKMLDeleter completed in 7.36s
admin requested tearing down of mistralai-mistral-nemo_9330_v121
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
Pipeline stage MKMLDeleter completed in 8.56s
run pipeline %s
run pipeline stage %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v122
Pipeline stage MKMLModelDeleter completed in 10.91s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Shutdown handler de-registered
admin requested tearing down of mistralai-mistral-nemo_9330_v123
Pipeline stage MKMLDeleter completed in 12.54s
Pipeline stage %s skipped, reason=%s
Pipeline stage MKMLModelDeleter completed in 13.62s
Running pipeline stage MKMLModelDeleter
Running pipeline stage MKMLDeleter
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
run pipeline %s
mistralai-mistral-nemo_9330_v111 status is now torndown due to DeploymentManager action
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline stage %s
admin requested tearing down of mistralai-mistral-nemo_9330_v124
Pipeline stage MKMLDeleter completed in 16.00s
Shutdown handler de-registered
Pipeline stage %s skipped, reason=%s
Pipeline stage %s skipped, reason=%s
Pipeline stage MKMLModelDeleter completed in 15.77s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v125
run pipeline stage %s
Pipeline stage MKMLModelDeleter completed in 17.03s
Pipeline stage MKMLDeleter completed in 17.01s
mistralai-mistral-nemo_9330_v112 status is now torndown due to DeploymentManager action
Shutdown handler de-registered
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Pipeline stage %s skipped, reason=%s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v126
Running pipeline stage MKMLModelDeleter
Shutdown handler de-registered
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 18.07s
mistralai-mistral-nemo_9330_v114 status is now torndown due to DeploymentManager action
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
Pipeline stage MKMLModelDeleter completed in 18.74s
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Pipeline stage %s skipped, reason=%s
admin requested tearing down of mistralai-mistral-nemo_9330_v127
Connection pool is full, discarding connection: %s. Connection pool size: %s
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
mistralai-mistral-nemo_9330_v115 status is now torndown due to DeploymentManager action
Pipeline stage MKMLDeleter completed in 19.37s
Pipeline stage %s skipped, reason=%s
Shutdown handler de-registered
run pipeline %s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Pipeline stage MKMLModelDeleter completed in 19.02s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v128
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 19.90s
mistralai-mistral-nemo_9330_v116 status is now torndown due to DeploymentManager action
run pipeline stage %s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
Shutdown handler de-registered
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v129
Pipeline stage MKMLModelDeleter completed in 17.80s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Running pipeline stage MKMLDeleter
mistralai-mistral-nemo_9330_v117 status is now torndown due to DeploymentManager action
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v130
Shutdown handler de-registered
Pipeline stage MKMLModelDeleter completed in 17.29s
Running pipeline stage MKMLModelDeleter
Pipeline stage %s skipped, reason=%s
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 16.31s
run pipeline stage %s
Running pipeline stage MKMLDeleter
admin requested tearing down of mistralai-mistral-nemo_9330_v131
mistralai-mistral-nemo_9330_v119 status is now torndown due to DeploymentManager action
Shutdown handler de-registered
Pipeline stage %s skipped, reason=%s
Pipeline stage MKMLModelDeleter completed in 17.34s
Pipeline stage MKMLDeleter completed in 15.99s
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Pipeline stage %s skipped, reason=%s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v132
Pipeline stage MKMLDeleter completed in 15.99s
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Pipeline stage %s skipped, reason=%s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v132
mistralai-mistral-nemo_9330_v121 status is now torndown due to DeploymentManager action
Pipeline stage MKMLModelDeleter completed in 15.50s
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Pipeline stage MKMLDeleter completed in 7.49s
Running pipeline stage MKMLDeleter
Pipeline stage %s skipped, reason=%s
admin requested tearing down of mistralai-mistral-nemo_9330_v121
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
run pipeline %s
run pipeline stage %s
Pipeline stage %s skipped, reason=%s
Pipeline stage MKMLDeleter completed in 9.69s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v122
Pipeline stage MKMLModelDeleter completed in 11.01s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
Pipeline stage MKMLDeleter completed in 11.69s
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v123
Shutdown handler de-registered
Pipeline stage MKMLModelDeleter completed in 14.83s
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v124
mistralai-mistral-nemo_9330_v111 status is now torndown due to DeploymentManager action
Shutdown handler de-registered
Pipeline stage MKMLDeleter completed in 16.65s
Pipeline stage %s skipped, reason=%s
Pipeline stage MKMLModelDeleter completed in 16.41s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline stage %s
admin requested tearing down of mistralai-mistral-nemo_9330_v125
mistralai-mistral-nemo_9330_v112 status is now torndown due to DeploymentManager action
Pipeline stage MKMLDeleter completed in 18.47s
Shutdown handler de-registered
Pipeline stage %s skipped, reason=%s
Pipeline stage MKMLModelDeleter completed in 18.40s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v126
Pipeline stage MKMLModelDeleter completed in 17.87s
mistralai-mistral-nemo_9330_v114 status is now torndown due to DeploymentManager action
Shutdown handler de-registered
Pipeline stage MKMLDeleter completed in 17.39s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLModelDeleter
Connection pool is full, discarding connection: %s. Connection pool size: %s
Shutdown handler de-registered
admin requested tearing down of mistralai-mistral-nemo_9330_v127
run pipeline stage %s
mistralai-mistral-nemo_9330_v115 status is now torndown due to DeploymentManager action
Pipeline stage MKMLDeleter completed in 19.08s
Pipeline stage %s skipped, reason=%s
Pipeline stage MKMLModelDeleter completed in 18.26s
Running pipeline stage MKMLDeleter
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
mistralai-mistral-nemo_9330_v116 status is now torndown due to DeploymentManager action
admin requested tearing down of mistralai-mistral-nemo_9330_v128
Running pipeline stage MKMLModelDeleter
Shutdown handler de-registered
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 21.86s
Pipeline stage MKMLModelDeleter completed in 20.01s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLModelDeleter
mistralai-mistral-nemo_9330_v117 status is now torndown due to DeploymentManager action
run pipeline stage %s
Shutdown handler de-registered
Pipeline stage MKMLDeleter completed in 21.32s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v130
Running pipeline stage MKMLModelDeleter
mistralai-mistral-nemo_9330_v119 status is now torndown due to DeploymentManager action
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 16.59s
Running pipeline stage MKMLDeleter
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
run pipeline %s
Shutdown handler de-registered
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v131
Pipeline stage MKMLModelDeleter completed in 18.51s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLModelDeleter
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 17.22s
Running pipeline stage MKMLDeleter
run pipeline %s
run pipeline stage %s
mistralai-mistral-nemo_9330_v121 status is now torndown due to DeploymentManager action
Shutdown handler not registered because Python interpreter is not running in the main thread
Shutdown handler de-registered
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline %s
Pipeline stage MKMLDeleter completed in 5.96s
run pipeline stage %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Pipeline stage %s skipped, reason=%s
admin requested tearing down of mistralai-mistral-nemo_9330_v121
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
run pipeline %s
Pipeline stage MKMLModelDeleter completed in 9.82s
Shutdown handler not registered because Python interpreter is not running in the main thread
Pipeline stage MKMLDeleter completed in 10.36s
admin requested tearing down of mistralai-mistral-nemo_9330_v122
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
Running pipeline stage MKMLModelDeleter
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
Shutdown handler de-registered
run pipeline %s
run pipeline stage %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Pipeline stage MKMLDeleter completed in 13.69s
Pipeline stage %s skipped, reason=%s
Pipeline stage %s skipped, reason=%s
Pipeline stage MKMLModelDeleter completed in 13.44s
Running pipeline stage MKMLDeleter
mistralai-mistral-nemo_9330_v111 status is now torndown due to DeploymentManager action
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
admin requested tearing down of mistralai-mistral-nemo_9330_v124
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 16.71s
Pipeline stage MKMLModelDeleter completed in 16.33s
Shutdown handler de-registered
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
Pipeline stage %s skipped, reason=%s
run pipeline %s
admin requested tearing down of mistralai-mistral-nemo_9330_v125
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Shutdown handler de-registered
mistralai-mistral-nemo_9330_v112 status is now torndown due to DeploymentManager action
Pipeline stage MKMLDeleter completed in 19.60s
Pipeline stage %s skipped, reason=%s
Pipeline stage MKMLModelDeleter completed in 18.67s
run pipeline stage %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Pipeline stage %s skipped, reason=%s
admin requested tearing down of mistralai-mistral-nemo_9330_v126
Running pipeline stage MKMLModelDeleter
mistralai-mistral-nemo_9330_v114 status is now torndown due to DeploymentManager action
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 21.46s
Shutdown handler de-registered
Running pipeline stage MKMLDeleter
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
run pipeline %s
Pipeline stage MKMLModelDeleter completed in 19.94s
Shutdown handler not registered because Python interpreter is not running in the main thread
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
Pipeline stage %s skipped, reason=%s
Pipeline stage MKMLDeleter completed in 20.25s
mistralai-mistral-nemo_9330_v115 status is now torndown due to DeploymentManager action
Running pipeline stage MKMLDeleter
run pipeline stage %s
Shutdown handler de-registered
run pipeline %s
Pipeline stage MKMLModelDeleter completed in 19.61s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v128
Running pipeline stage MKMLModelDeleter
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
mistralai-mistral-nemo_9330_v116 status is now torndown due to DeploymentManager action
Shutdown handler de-registered
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Pipeline stage %s skipped, reason=%s
admin requested tearing down of mistralai-mistral-nemo_9330_v129
Pipeline stage MKMLModelDeleter completed in 23.55s
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
Pipeline stage MKMLDeleter completed in 22.85s
Pipeline stage %s skipped, reason=%s
mistralai-mistral-nemo_9330_v117 status is now torndown due to DeploymentManager action
run pipeline stage %s
run pipeline %s
Pipeline stage MKMLModelDeleter completed in 21.18s
Shutdown handler not registered because Python interpreter is not running in the main thread
Shutdown handler de-registered
Connection pool is full, discarding connection: %s. Connection pool size: %s
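The "Connection pool is full, discarding connection" warning above is urllib3's behavior when a connection is returned to a bounded pool that already holds `maxsize` idle connections: the extra connection is dropped rather than queued. A simplified sketch of that mechanism, assuming a LIFO queue like urllib3 uses internally (`SimplePool`, `get_conn`, and `put_conn` are illustrative names, not urllib3's API):

```python
import queue

class SimplePool:
    """Toy bounded connection pool that discards returns when full."""

    def __init__(self, maxsize):
        self._pool = queue.LifoQueue(maxsize)
        self.discarded = 0

    def get_conn(self):
        try:
            return self._pool.get(block=False)
        except queue.Empty:
            return object()  # pool empty: "open" a new connection

    def put_conn(self, conn):
        try:
            self._pool.put(conn, block=False)
        except queue.Full:
            # Corresponds to:
            # "Connection pool is full, discarding connection: %s. Connection pool size: %s"
            self.discarded += 1

pool = SimplePool(maxsize=1)
conns = [pool.get_conn() for _ in range(3)]  # three concurrent checkouts
for c in conns:
    pool.put_conn(c)                         # only one fits back in
```

In a real client the fix is usually to raise the pool size (e.g. `pool_maxsize` on a `requests` `HTTPAdapter`) to match the concurrency level.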
admin requested tearing down of mistralai-mistral-nemo_9330_v130
Running pipeline stage MKMLModelDeleter
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Pipeline stage MKMLDeleter completed in 25.25s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
mistralai-mistral-nemo_9330_v119 status is now torndown due to DeploymentManager action
Shutdown handler not registered because Python interpreter is not running in the main thread
Pipeline stage %s skipped, reason=%s
admin requested tearing down of mistralai-mistral-nemo_9330_v131
Pipeline stage MKMLModelDeleter completed in 23.69s
Running pipeline stage MKMLDeleter
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 23.08s
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
mistralai-mistral-nemo_9330_v121 status is now torndown due to DeploymentManager action
run pipeline %s
Pipeline stage MKMLModelDeleter completed in 20.90s
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
run pipeline %s
Pipeline stage MKMLDeleter completed in 7.43s
Shutdown handler not registered because Python interpreter is not running in the main thread
Running pipeline stage MKMLModelDeleter
admin requested tearing down of mistralai-mistral-nemo_9330_v121
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
run pipeline stage %s
run pipeline %s
Pipeline stage %s skipped, reason=%s
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline stage %s
Pipeline stage MKMLModelDeleter completed in 10.56s
Pipeline stage %s skipped, reason=%s
admin requested tearing down of mistralai-mistral-nemo_9330_v122
Running pipeline stage MKMLDeleter
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Pipeline stage MKMLModelDeleter completed in 13.69s
run pipeline %s
Running pipeline stage MKMLModelDeleter
Shutdown handler de-registered
Pipeline stage MKMLDeleter completed in 14.17s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v123
Pipeline stage %s skipped, reason=%s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
Shutdown handler de-registered
run pipeline stage %s
Pipeline stage %s skipped, reason=%s
mistralai-mistral-nemo_9330_v111 status is now torndown due to DeploymentManager action
Pipeline stage MKMLDeleter completed in 18.66s
admin requested tearing down of mistralai-mistral-nemo_9330_v124
Pipeline stage MKMLModelDeleter completed in 20.00s
Pipeline stage %s skipped, reason=%s
mistralai-mistral-nemo_9330_v112 status is now torndown due to DeploymentManager action
Running pipeline stage MKMLDeleter
Pipeline stage MKMLModelDeleter completed in 19.22s
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
run pipeline stage %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v125
run pipeline %s
Shutdown handler de-registered
Pipeline stage MKMLDeleter completed in 18.91s
Shutdown handler de-registered
Pipeline stage %s skipped, reason=%s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
Running pipeline stage MKMLModelDeleter
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline stage %s
admin requested tearing down of mistralai-mistral-nemo_9330_v126
mistralai-mistral-nemo_9330_v114 status is now torndown due to DeploymentManager action
run pipeline stage %s
mistralai-mistral-nemo_9330_v115 status is now torndown due to DeploymentManager action
Pipeline stage MKMLDeleter completed in 19.45s
Pipeline stage MKMLModelDeleter completed in 18.49s
Pipeline stage %s skipped, reason=%s
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
run pipeline %s
Running pipeline stage MKMLDeleter
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v127
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Shutdown handler de-registered
Pipeline stage MKMLDeleter completed in 18.94s
Pipeline stage MKMLModelDeleter completed in 18.44s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Pipeline stage %s skipped, reason=%s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v128
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
mistralai-mistral-nemo_9330_v116 status is now torndown due to DeploymentManager action
Shutdown handler de-registered
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
Pipeline stage MKMLDeleter completed in 18.77s
run pipeline stage %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v129
run pipeline %s
Pipeline stage MKMLModelDeleter completed in 21.53s
Pipeline stage %s skipped, reason=%s
Pipeline stage MKMLDeleter completed in 21.07s
Pipeline stage %s skipped, reason=%s
mistralai-mistral-nemo_9330_v117 status is now torndown due to DeploymentManager action
run pipeline stage %s
Running pipeline stage MKMLDeleter
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v130
run pipeline stage %s
Shutdown handler de-registered
Pipeline stage MKMLModelDeleter completed in 22.46s
Pipeline stage %s skipped, reason=%s
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 20.65s
Running pipeline stage MKMLModelDeleter
Pipeline stage %s skipped, reason=%s
Connection pool is full, discarding connection: %s. Connection pool size: %s
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of mistralai-mistral-nemo_9330_v131
admin requested tearing down of cycy233-nemo-p-e-v4-c1_v1
admin requested tearing down of chaiml-nemo-chai-4-ties-_5371_v1
Pipeline stage %s skipped, reason=%s
Shutdown handler not registered because Python interpreter is not running in the main thread
Running pipeline stage MKMLModelDeleter
admin requested tearing down of mistralai-mistral-nemo_9330_v131
Pipeline stage MKMLDeleter completed in 22.08s
run pipeline stage %s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Shutdown handler de-registered
mistralai-mistral-nemo_9330_v121 status is now torndown due to DeploymentManager action
Pipeline stage MKMLModelDeleter completed in 17.37s
run pipeline %s
Pipeline stage %s skipped, reason=%s
Shutdown handler not registered because Python interpreter is not running in the main thread
Shutdown handler not registered because Python interpreter is not running in the main thread
Pipeline stage MKMLModelDeleter completed in 12.82s
Pipeline stage %s skipped, reason=%s
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
mistralai-mistral-nemo_9330_v121 status is now torndown due to DeploymentManager action
run pipeline stage %s