developer_uid: chai_backend_admin
submission_id: chaiml-2fe5-c13f-linear-w01_v32
model_name: chaiml-2fe5-c13f-linear-w01_v32
model_group: ChaiML/2fe5-c13f-linear-
status: torndown
timestamp: 2026-01-14T16:59:07+00:00
num_battles: 6209
num_wins: 3225
celo_rating: 1306.68
family_friendly_score: 0.0
family_friendly_standard_error: 0.0
submission_type: basic
model_repo: ChaiML/2fe5-c13f-linear-w01
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 10
max_input_tokens: 1024
max_output_tokens: 64
reward_model: chaiml-prm-kimi-v1-300k_92220_v2
display_name: chaiml-2fe5-c13f-linear-w01_v32
ineligible_reason: model is not deployable
is_internal_developer: True
language_model: ChaiML/2fe5-c13f-linear-w01
model_size: 13B
ranking_group: single
us_pacific_date: 2025-12-17
win_ratio: 0.5194073119665003
generation_params: {'temperature': 0.9, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['Bot:', '<|eot_id|>', '<|im_end|>', '\n', '</s>', 'User:', 'You:', '####'], 'max_input_tokens': 1024, 'best_of': 10, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': True}
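The formatter above determines how a conversation is flattened into the model's input, and generation then continues from the trailing "{bot_name}:" until one of the stopping_words in generation_params is hit. Below is a minimal sketch of that assembly in plain Python; it is not the production serving code, and the names `build_prompt`, `Nova`, and `Alex` are made-up placeholders. With truncate_by_message enabled, the oldest turns would presumably be dropped whole until the prompt fits within max_input_tokens (1024); that step is omitted here.

```python
# Minimal sketch of how the formatter templates above turn a conversation
# into a prompt string. Not the production serving code; bot/user names
# and messages are made-up placeholders.
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, turns):
    """turns: list of ("bot" | "user", message) pairs, oldest first."""
    prompt = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    for speaker, message in turns:
        if speaker == "bot":
            prompt += formatter["bot_template"].format(bot_name=bot_name, message=message)
        else:
            prompt += formatter["user_template"].format(user_name=user_name, message=message)
    # The model continues from "{bot_name}:"; generation halts on any of the
    # stopping_words in generation_params (e.g. "\n", "User:", "####").
    return prompt + formatter["response_template"].format(bot_name=bot_name)

print(build_prompt("Nova", "Alex", "A friendly ship AI.",
                   [("user", "Hello!"), ("bot", "Hi, Alex."), ("user", "Status report?")]))
```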
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name chaiml-2fe5-c13f-linear-w01-v32-mkmlizer
Waiting for job on chaiml-2fe5-c13f-linear-w01-v32-mkmlizer to finish
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: bash: cannot set terminal process group (-1): Inappropriate ioctl for device
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: bash: no job control in this shell
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: /root/miniconda3/envs/nvidia/lib/python3.11/site-packages/mk1/__init__.py:1: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: __import__('pkg_resources').declare_namespace(__name__)
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: [MK1 banner]
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: Version: 0.30.6+torch280
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: Features: FLYWHEEL, CUDA
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. (https://mk1.ai)
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: The license key for the current software has been verified as belonging to: Chai Research Corp.
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: Expiration: 2028-03-31 23:59:59
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: Downloaded to shared memory in 27.267s
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: Checking if ChaiML/2fe5-c13f-linear-w01 already exists in ChaiML
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: quantizing model to /dev/shm/model_cache, profile:q4, folder:/tmp/tmph55k29by, device:0
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: Saving flywheel model at /dev/shm/model_cache
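The MKMLizer's q4 profile and flywheel format are not documented in this log, so the sketch below is only a generic illustration of loading the same MistralForCausalLM checkpoint with 4-bit (NF4) weights via transformers and bitsandbytes; it is not the MK1 quantization pipeline.

```python
# Generic 4-bit quantization sketch (transformers + bitsandbytes NF4).
# This is NOT the MKMLizer q4/flywheel pipeline, which is proprietary;
# it only illustrates loading the same checkpoint with 4-bit weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

repo = "ChaiML/2fe5-c13f-linear-w01"  # model_repo from the metadata above

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    repo,
    quantization_config=bnb_config,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(repo)
```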
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: Loading 0: 100%|██████████| 363/363 [02:10<00:00, 2.79it/s]
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: The tokenizer you are loading from '/tmp/tmph55k29by' has an incorrect regex pattern: https://huggingface.co/mistralai/Mistral-Small-3.1-24B-Instruct-2503/discussions/84#69121093e8b480e709447d5e. This will lead to incorrect tokenization. You should set the `fix_mistral_regex=True` flag when loading this tokenizer to fix this issue.
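A sketch of the remedy the warning names, passing `fix_mistral_regex=True` when the tokenizer is loaded. Whether `AutoTokenizer.from_pretrained` accepts and forwards this kwarg depends on the installed transformers/tokenizers versions, so treat it as illustrative only.

```python
# Applies the fix suggested by the warning above; the flag name comes
# directly from the warning message. Illustrative only.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "/tmp/tmph55k29by",      # temporary folder referenced in the log
    fix_mistral_regex=True,  # flag named by the warning message
)
```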
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: quantized model in 141.634s
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: Processed model ChaiML/2fe5-c13f-linear-w01 in 168.901s
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/chaiml-2fe5-c13f-linear-w01-v32/nvidia
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/chaiml-2fe5-c13f-linear-w01-v32/nvidia/config.json
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/chaiml-2fe5-c13f-linear-w01-v32/nvidia/special_tokens_map.json
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: cp /dev/shm/model_cache/chat_template.jinja s3://guanaco-mkml-models/chaiml-2fe5-c13f-linear-w01-v32/nvidia/chat_template.jinja
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/chaiml-2fe5-c13f-linear-w01-v32/nvidia/tokenizer_config.json
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/chaiml-2fe5-c13f-linear-w01-v32/nvidia/tokenizer.json
chaiml-2fe5-c13f-linear-w01-v32-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/chaiml-2fe5-c13f-linear-w01-v32/nvidia/flywheel_model.0.safetensors
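The uploads above copy each artifact from /dev/shm/model_cache into the guanaco-mkml-models bucket. The upload tool itself is not shown in the log; an equivalent boto3 sketch would look like this.

```python
# Minimal boto3 equivalent of the uploads shown above (the pipeline's
# actual upload tool is not shown in the log).
import os
import boto3

SRC = "/dev/shm/model_cache"
BUCKET = "guanaco-mkml-models"
PREFIX = "chaiml-2fe5-c13f-linear-w01-v32/nvidia"

FILES = [
    "config.json",
    "special_tokens_map.json",
    "chat_template.jinja",
    "tokenizer_config.json",
    "tokenizer.json",
    "flywheel_model.0.safetensors",
]

s3 = boto3.client("s3")
for name in FILES:
    s3.upload_file(os.path.join(SRC, name), BUCKET, f"{PREFIX}/{name}")
```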
Job chaiml-2fe5-c13f-linear-w01-v32-mkmlizer completed after 257.26s with status: succeeded
Stopping job with name chaiml-2fe5-c13f-linear-w01-v32-mkmlizer
Pipeline stage MKMLizer completed in 257.76s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.15s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service chaiml-2fe5-c13f-linear-w01-v32
Waiting for inference service chaiml-2fe5-c13f-linear-w01-v32 to be ready
Inference service chaiml-2fe5-c13f-linear-w01-v32 ready after 160.91907262802124s
Pipeline stage MKMLDeployer completed in 161.60s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 1.4741241931915283s
Received healthy response to inference request in 1.6372201442718506s
Received healthy response to inference request in 1.496262550354004s
Received healthy response to inference request in 1.5602564811706543s
5 requests
1 failed requests
5th percentile: 1.4785518646240234
10th percentile: 1.4829795360565186
20th percentile: 1.4918348789215088
30th percentile: 1.509061336517334
40th percentile: 1.5346589088439941
50th percentile: 1.5602564811706543
60th percentile: 1.5910419464111327
70th percentile: 1.6218274116516114
80th percentile: 5.331175661087039
90th percentile: 12.719086694717408
95th percentile: 16.41304221153259
99th percentile: 19.36820662498474
mean time: 5.254972219467163
%s, retrying in %s seconds...
Received healthy response to inference request in 1.4460430145263672s
Received healthy response to inference request in 1.1381230354309082s
Received healthy response to inference request in 1.341003179550171s
Received healthy response to inference request in 1.523102045059204s
Received healthy response to inference request in 1.146012306213379s
5 requests
0 failed requests
5th percentile: 1.1397008895874023
10th percentile: 1.1412787437438965
20th percentile: 1.1444344520568848
30th percentile: 1.1850104808807373
40th percentile: 1.263006830215454
50th percentile: 1.341003179550171
60th percentile: 1.3830191135406493
70th percentile: 1.425035047531128
80th percentile: 1.4614548206329345
90th percentile: 1.4922784328460694
95th percentile: 1.5076902389526368
99th percentile: 1.5200196838378905
mean time: 1.3188567161560059
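The StressChecker summaries above can be reproduced from the individual healthy response times using NumPy's default linearly interpolated percentiles. The sketch below uses the five timings of the second (fully healthy) batch and matches the percentile and mean figures reported in the log.

```python
# Reproduces the second StressChecker summary above from its five
# response times using NumPy's default (linear-interpolation) percentiles.
import numpy as np

times = np.array([1.4460430145263672, 1.1381230354309082, 1.341003179550171,
                  1.523102045059204, 1.146012306213379])

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(times, p)}")
print("mean time:", times.mean())  # ~1.3189, matching the log
```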
Pipeline stage StressChecker completed in 35.29s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.62s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.74s
Shutdown handler de-registered
chaiml-2fe5-c13f-linear-w01_v32 status is now deployed due to DeploymentManager action
chaiml-2fe5-c13f-linear-w01_v32 status is now inactive due to auto deactivation of underperforming models
chaiml-2fe5-c13f-linear-w01_v32 status is now torndown due to DeploymentManager action