developer_uid: rica40325
submission_id: rica40325-mistral-3678_v5
model_name: rica40325-mistral-13b-2452_v1
model_group: rica40325/mistral-3678
status: torndown
timestamp: 2024-09-26T08:55:19+00:00
num_battles: 3336
num_wins: 1703
celo_rating: 1257.89
family_friendly_score: 0.561960542540074
family_friendly_standard_error: 0.008679876464129481
submission_type: basic
model_repo: rica40325/mistral-3678
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
latencies: [
  {'batch_size': 1, 'throughput': 0.6134811071489403, 'latency_mean': 1.6299810361862184, 'latency_p50': 1.6372666358947754, 'latency_p90': 1.7830361604690552},
  {'batch_size': 3, 'throughput': 1.0860508525898884, 'latency_mean': 2.7573498940467833, 'latency_p50': 2.7409937381744385, 'latency_p90': 3.0574384927749634},
  {'batch_size': 5, 'throughput': 1.2575118538847754, 'latency_mean': 3.9645818030834197, 'latency_p50': 3.960404396057129, 'latency_p90': 4.44483380317688},
  {'batch_size': 6, 'throughput': 1.254179591065032, 'latency_mean': 4.761644139289856, 'latency_p50': 4.761838793754578, 'latency_p90': 5.3102374792099},
  {'batch_size': 8, 'throughput': 1.2536985311855022, 'latency_mean': 6.343208496570587, 'latency_p50': 6.373468637466431, 'latency_p90': 7.175728511810303},
  {'batch_size': 10, 'throughput': 1.2120553027106677, 'latency_mean': 8.198763157129287, 'latency_p50': 8.255642771720886, 'latency_p90': 9.263514733314514}
]
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: rica40325-mistral-13b-2452_v1
ineligible_reason: num_battles<5000
is_internal_developer: False
language_model: rica40325/mistral-3678
model_size: 13B
ranking_group: single
throughput_3p7s: 1.23
us_pacific_date: 2024-09-26
win_ratio: 0.5104916067146283
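As a quick sanity check, the win_ratio above is simply num_wins divided by num_battles; a minimal Python verification using the figures reported in this listing:

    num_battles = 3336
    num_wins = 1703

    win_ratio = num_wins / num_battles
    print(win_ratio)  # 0.5104916067146283, matching the reported value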
generation_params: {'temperature': 0.95, 'top_p': 0.95, 'min_p': 0.1, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['</s>', 'Bot:', 'User:', 'You:', 'Me:', '####'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
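To illustrate how the formatter templates above are combined into model input, here is a minimal sketch in plain Python. The bot/user names, the conversation content, and the assembly order (memory, then prompt, then message history, then the response cue) are illustrative assumptions; the actual serving code (MK1 flywheel) is not part of this log.

    # Templates copied from the formatter entry above.
    memory_template = "{bot_name}'s Persona: {memory}\n####\n"
    prompt_template = "{prompt}\n<START>\n"
    bot_template = "{bot_name}: {message}\n"
    user_template = "{user_name}: {message}\n"
    response_template = "{bot_name}:"

    # Hypothetical conversation, used only for illustration.
    bot_name, user_name = "Bot", "User"
    memory = "A short example persona."
    prompt = "An example scenario."
    turns = [(user_template, user_name, "Hi there!"),
             (bot_template, bot_name, "Hello.")]

    text = memory_template.format(bot_name=bot_name, memory=memory)
    text += prompt_template.format(prompt=prompt)
    for template, name, message in turns:
        # str.format ignores unused keyword arguments, so both name fields can be passed.
        text += template.format(bot_name=name, user_name=name, message=message)
    text += response_template.format(bot_name=bot_name)
    print(text)

The generation_params above (temperature 0.95, top_p 0.95, min_p 0.1, top_k 80, best_of 8, max_output_tokens 64, and the stopping_words list) are then applied by the sampling backend when generating the bot's next turn after the trailing "{bot_name}:" cue.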
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name rica40325-mistral-3678-v5-mkmlizer
Waiting for job on rica40325-mistral-3678-v5-mkmlizer to finish
rica40325-mistral-3678-v5-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
rica40325-mistral-3678-v5-mkmlizer: ║ _____ __ __ ║
rica40325-mistral-3678-v5-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
rica40325-mistral-3678-v5-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
rica40325-mistral-3678-v5-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
rica40325-mistral-3678-v5-mkmlizer: ║ /___/ ║
rica40325-mistral-3678-v5-mkmlizer: ║ ║
rica40325-mistral-3678-v5-mkmlizer: ║ Version: 0.11.12 ║
rica40325-mistral-3678-v5-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
rica40325-mistral-3678-v5-mkmlizer: ║ https://mk1.ai ║
rica40325-mistral-3678-v5-mkmlizer: ║ ║
rica40325-mistral-3678-v5-mkmlizer: ║ The license key for the current software has been verified as ║
rica40325-mistral-3678-v5-mkmlizer: ║ belonging to: ║
rica40325-mistral-3678-v5-mkmlizer: ║ ║
rica40325-mistral-3678-v5-mkmlizer: ║ Chai Research Corp. ║
rica40325-mistral-3678-v5-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
rica40325-mistral-3678-v5-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
rica40325-mistral-3678-v5-mkmlizer: ║ ║
rica40325-mistral-3678-v5-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
rica40325-mistral-3678-v5-mkmlizer: Downloaded to shared memory in 43.737s
rica40325-mistral-3678-v5-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp4f8z9tua, device:0
rica40325-mistral-3678-v5-mkmlizer: Saving flywheel model at /dev/shm/model_cache
rica40325-mistral-3678-v5-mkmlizer: /opt/conda/lib/python3.10/site-packages/mk1/flywheel/functional/loader.py:55: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
rica40325-mistral-3678-v5-mkmlizer: tensors = torch.load(model_shard_filename, map_location=torch.device(self.device), mmap=True)
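The FutureWarning above concerns torch.load defaulting to weights_only=False, which unpickles arbitrary objects. A minimal sketch of the safer call the warning recommends, assuming the shard contains only tensors (the file name below is a placeholder, not the actual shard from this run):

    import torch

    # Placeholder shard path; mmap=True mirrors the loader call shown in the log.
    tensors = torch.load(
        "pytorch_model_shard.bin",
        map_location=torch.device("cpu"),
        mmap=True,
        weights_only=True,  # restrict unpickling to tensors and plain containers
    )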
rica40325-mistral-3678-v5-mkmlizer: quantized model in 36.877s
rica40325-mistral-3678-v5-mkmlizer: Processed model rica40325/mistral-3678 in 80.614s
rica40325-mistral-3678-v5-mkmlizer: creating bucket guanaco-mkml-models
rica40325-mistral-3678-v5-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
rica40325-mistral-3678-v5-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/rica40325-mistral-3678-v5
rica40325-mistral-3678-v5-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/rica40325-mistral-3678-v5/config.json
rica40325-mistral-3678-v5-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/rica40325-mistral-3678-v5/special_tokens_map.json
rica40325-mistral-3678-v5-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/rica40325-mistral-3678-v5/tokenizer_config.json
rica40325-mistral-3678-v5-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/rica40325-mistral-3678-v5/tokenizer.json
rica40325-mistral-3678-v5-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/rica40325-mistral-3678-v5/flywheel_model.0.safetensors
rica40325-mistral-3678-v5-mkmlizer: Loading 0: 100%|██████████| 363/363 [00:15<00:00, 4.14it/s]
Job rica40325-mistral-3678-v5-mkmlizer completed after 113.92s with status: succeeded
Stopping job with name rica40325-mistral-3678-v5-mkmlizer
Pipeline stage MKMLizer completed in 114.18s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.09s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service rica40325-mistral-3678-v5
Waiting for inference service rica40325-mistral-3678-v5 to be ready
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer
Waiting for job on zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer to finish
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: Downloaded to shared memory in 81.221s
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmphm6jyx6h, device:0
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: Loading 0: 99%|█████████▉| 504/507 [00:30<00:00, 35.69it/s]
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: Traceback (most recent call last):
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: File "/code/uploading/mkmlize.py", line 151, in <module>
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: cli()
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: return self.main(*args, **kwargs)
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1078, in main
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: rv = self.invoke(ctx)
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: return _process_result(sub_ctx.command.invoke(sub_ctx))
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: return ctx.invoke(self.callback, **ctx.params)
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 783, in invoke
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: return __callback(*args, **kwargs)
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: File "/code/uploading/mkmlize.py", line 42, in quantize
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: quantize_model(temp_folder, output_path, profile, device)
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: File "/code/uploading/mkmlize.py", line 135, in quantize_model
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: flywheel.instrument(
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/mk1/flywheel/instrument.py", line 96, in instrument
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: tokenizer = AutoTokenizer.from_pretrained(input_model_path, verbose=False)
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 897, in from_pretrained
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2271, in from_pretrained
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: return cls._from_pretrained(
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2505, in _from_pretrained
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: tokenizer = cls(*init_inputs, **init_kwargs)
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/transformers/models/llama/tokenization_llama_fast.py", line 157, in __init__
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: super().__init__(
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: File "/opt/conda/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 115, in __init__
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: fast_tokenizer = TokenizerFast.from_file(fast_tokenizer_file)
zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer: Exception: data did not match any variant of untagged enum ModelWrapper at line 275732 column 3
Job zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer completed after 155.0s with status: failed
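The exception above ("data did not match any variant of untagged enum ModelWrapper") is the tokenizers library failing to parse the repo's tokenizer.json, which usually means that file was written by a newer tokenizers release than the one installed in the mkmlizer image. Two commonly suggested remedies are upgrading the tokenizers package, or regenerating tokenizer.json from the slow SentencePiece tokenizer; the sketch below shows the latter, assuming the slow tokenizer files are present (the path is a placeholder, not taken from this log):

    from transformers import AutoTokenizer

    # from_slow=True rebuilds the fast tokenizer from the SentencePiece files,
    # bypassing the incompatible tokenizer.json that triggered the error above.
    tok = AutoTokenizer.from_pretrained("path/to/downloaded/model", from_slow=True)
    tok.save_pretrained("path/to/downloaded/model")  # writes a freshly generated tokenizer.json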
Stopping job with name zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer
%s, retrying in %s seconds...
Starting job with name zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer
Waiting for job on zonemercy-vingt-deux-v5-2e6v0-v1-mkmlizer to finish
Inference service rica40325-mistral-3678-v5 ready after 220.47120213508606s
Pipeline stage MKMLDeployer completed in 220.78s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.3580095767974854s
Received healthy response to inference request in 1.9114880561828613s
Received healthy response to inference request in 1.8306169509887695s
Received healthy response to inference request in 1.730583906173706s
Received healthy response to inference request in 1.751753330230713s
5 requests
0 failed requests
5th percentile: 1.7348177909851075
10th percentile: 1.7390516757965089
20th percentile: 1.7475194454193115
30th percentile: 1.7675260543823241
40th percentile: 1.7990715026855468
50th percentile: 1.8306169509887695
60th percentile: 1.8629653930664063
70th percentile: 1.895313835144043
80th percentile: 2.000792360305786
90th percentile: 2.1794009685516356
95th percentile: 2.2687052726745605
99th percentile: 2.3401487159729
mean time: 1.916490364074707
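For reference, the percentile and mean figures above can be reproduced from the five response times using linear-interpolation percentiles (NumPy's default); whether the StressChecker itself uses NumPy is not shown in this log:

    import numpy as np

    # The five healthy response times (seconds) reported above.
    times = [2.3580095767974854, 1.9114880561828613, 1.8306169509887695,
             1.730583906173706, 1.751753330230713]

    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f"{p}th percentile: {np.percentile(times, p)}")
    print("mean time:", np.mean(times))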
Pipeline stage StressChecker completed in 11.12s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 1.98s
Shutdown handler de-registered
rica40325-mistral-3678_v5 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.18s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.14s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service rica40325-mistral-3678-v5-profiler
Waiting for inference service rica40325-mistral-3678-v5-profiler to be ready
Inference service rica40325-mistral-3678-v5-profiler ready after 220.48814988136292s
Pipeline stage MKMLProfilerDeployer completed in 220.94s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/rica40325-mistral-3678-v5-profiler-predictor-00001-deploym57nk6:/code/chaiverse_profiler_1727341544 --namespace tenant-chaiml-guanaco
kubectl exec -it rica40325-mistral-3678-v5-profiler-predictor-00001-deploym57nk6 --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1727341544 && python profiles.py profile --best_of_n 8 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 1024 --output_tokens 64 --summary /code/chaiverse_profiler_1727341544/summary.json'
kubectl exec -it rica40325-mistral-3678-v5-profiler-predictor-00001-deploym57nk6 --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1727341544/summary.json'
Pipeline stage MKMLProfilerRunner completed in 1159.58s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service rica40325-mistral-3678-v5-profiler is running
Tearing down inference service rica40325-mistral-3678-v5-profiler
Service rica40325-mistral-3678-v5-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 3.00s
Shutdown handler de-registered
rica40325-mistral-3678_v5 status is now inactive due to auto deactivation (removed underperforming models)
Shutdown handler de-registered
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
admin requested tearing down of rica40325-mistral-3678_v5
admin requested tearing down of blend_rofur_2024-10-03
admin requested tearing down of zonemercy-vingt-deux-v1-1e5_v22
Running pipeline stage MKMLDeleter
Pipeline stage MKMLDeleter completed in 37.44s
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
Pipeline stage MKMLModelDeleter completed in 66.28s
run pipeline stage %s
Running pipeline stage ProductionBlendMKMLTemplater
Pipeline stage ProductionBlendMKMLTemplater completed in 76.56s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service blend-rofur-2024-10-03
Ignoring service blend-rofur-2024-10-03 already deployed
Waiting for inference service blend-rofur-2024-10-03 to be ready
%s, retrying in %s seconds...
Pipeline stage %s skipped, reason=%s
Tearing down inference service blend-rofur-2024-10-03
zonemercy-vingt-deux-v3-1e5v2_v6 status is now torndown due to DeploymentManager action
zonemercy-vingt-deux-v3-1e5v2_v7 status is now torndown due to DeploymentManager action
zonemercy-vingt-deux-v2-1e5_v23 status is now torndown due to DeploymentManager action
zonemercy-vingt-deux-v2-1e5_v24 status is now torndown due to DeploymentManager action
zonemercy-vingt-deux-v1-1e5_v22 status is now torndown due to DeploymentManager action
zonemercy-vingt-deux-v1-1e5_v23 status is now torndown due to DeploymentManager action
rica40325-mistral-3678_v5 status is now torndown due to DeploymentManager action
Shutdown handler de-registered