Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage VLLMUploader
Starting job with name chaiml-pony-v2-q235b-lr-18913-v1-uploader
Waiting for job on chaiml-pony-v2-q235b-lr-18913-v1-uploader to finish
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Using quantization_mode: w4a16
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Checking if ChaiML/pony-v2-q235b-lr1e4ep1r64g4-W4A16 already exists in ChaiML
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Downloading snapshot of ChaiML/pony-v2-q235b-lr1e4ep1r64g4...
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Downloaded in 155.687s
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Applying quantization...
chaiml-pony-v2-q235b-lr-18913-v1-uploader: 2026-02-25 23:55:04 WARNING modeling_utils.py L4670: `torch_dtype` is deprecated! Use `dtype` instead!
chaiml-pony-v2-q235b-lr-18913-v1-uploader: 2026-02-25 23:55:49 INFO base.py L366: using torch.bfloat16 for quantization tuning
chaiml-pony-v2-q235b-lr-18913-v1-uploader: 2026-02-25 23:55:54 INFO base.py L1145: start to compute imatrix
chaiml-pony-v2-q235b-lr-18913-v1-uploader: /usr/local/lib/python3.12/dist-packages/torch/backends/cuda/__init__.py:131: UserWarning: Please use the new API settings to control TF32 behavior, such as torch.backends.cudnn.conv.fp32_precision = 'tf32' or torch.backends.cuda.matmul.fp32_precision = 'ieee'. Old settings, e.g, torch.backends.cuda.matmul.allow_tf32 = True, torch.backends.cudnn.allow_tf32 = True, allowTF32CuDNN() and allowTF32CuBLAS() will be deprecated after Pytorch 2.9. Please see https://pytorch.org/docs/main/notes/cuda.html#tensorfloat-32-tf32-on-ampere-and-later-devices (Triggered internally at /pytorch/aten/src/ATen/Context.cpp:80.)
chaiml-pony-v2-q235b-lr-18913-v1-uploader: return torch._C._get_cublas_allow_tf32()
chaiml-pony-v2-q235b-lr-18913-v1-uploader: W0225 23:56:56.739000 7 torch/_dynamo/convert_frame.py:1358] [6/8] torch._dynamo hit config.recompile_limit (8)
chaiml-pony-v2-q235b-lr-18913-v1-uploader: W0225 23:56:56.739000 7 torch/_dynamo/convert_frame.py:1358] [6/8] function: 'forward' (/usr/local/lib/python3.12/dist-packages/transformers/models/qwen3_moe/modeling_qwen3_moe.py:208)
chaiml-pony-v2-q235b-lr-18913-v1-uploader: W0225 23:56:56.739000 7 torch/_dynamo/convert_frame.py:1358] [6/8] last reason: 6/7: self._modules['up_proj'].imatrix_cnt == 1346 # module.imatrix_cnt += input.shape[0] # auto_round/compressors/base.py:1179 in get_imatrix_hook (HINT: torch.compile considers integer attributes of the nn.Module to be static. If you are observing recompilation, you might want to make this integer dynamic using torch._dynamo.config.allow_unspec_int_on_nn_module = True, or convert this integer into a tensor.)
chaiml-pony-v2-q235b-lr-18913-v1-uploader: W0225 23:56:56.739000 7 torch/_dynamo/convert_frame.py:1358] [6/8] To log all recompilation reasons, use TORCH_LOGS="recompiles".
chaiml-pony-v2-q235b-lr-18913-v1-uploader: W0225 23:56:56.739000 7 torch/_dynamo/convert_frame.py:1358] [6/8] To diagnose recompilation issues, see https://pytorch.org/docs/main/torch.compiler_troubleshooting.html
chaiml-pony-v2-q235b-lr-18913-v1-uploader: W0225 23:56:59.742000 7 torch/_dynamo/convert_frame.py:1358] [3/8] torch._dynamo hit config.recompile_limit (8)
chaiml-pony-v2-q235b-lr-18913-v1-uploader: W0225 23:56:59.742000 7 torch/_dynamo/convert_frame.py:1358] [3/8] function: 'forward' (/usr/local/lib/python3.12/dist-packages/transformers/models/qwen3_moe/modeling_qwen3_moe.py:305)
chaiml-pony-v2-q235b-lr-18913-v1-uploader: W0225 23:56:59.742000 7 torch/_dynamo/convert_frame.py:1358] [3/8] last reason: 3/7: self._modules['self_attn']._modules['k_proj'].imatrix_cnt == 56 # module.imatrix_cnt += input.shape[0] # auto_round/compressors/base.py:1179 in get_imatrix_hook (HINT: torch.compile considers integer attributes of the nn.Module to be static. If you are observing recompilation, you might want to make this integer dynamic using torch._dynamo.config.allow_unspec_int_on_nn_module = True, or convert this integer into a tensor.)
chaiml-pony-v2-q235b-lr-18913-v1-uploader: W0225 23:56:59.742000 7 torch/_dynamo/convert_frame.py:1358] [3/8] To log all recompilation reasons, use TORCH_LOGS="recompiles".
chaiml-pony-v2-q235b-lr-18913-v1-uploader: W0225 23:56:59.742000 7 torch/_dynamo/convert_frame.py:1358] [3/8] To diagnose recompilation issues, see https://pytorch.org/docs/main/torch.compiler_troubleshooting.html
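[editor's note: the recompilation warnings above carry an explicit HINT; as a sketch based only on that hint (not verified against this pipeline's code), the recompile limit can be avoided by marking unspecialized integer attributes of nn.Modules as dynamic before compiling:]

```python
# Sketch per the HINT in the warnings above: counters such as imatrix_cnt are
# integer attributes on nn.Modules, which torch.compile treats as static, so
# every increment triggers a recompile. This config flag makes them dynamic.
import torch

torch._dynamo.config.allow_unspec_int_on_nn_module = True
```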
chaiml-pony-v2-q235b-lr-18913-v1-uploader: 2026-02-25 23:57:18 WARNING gguf.py L297: please use more data via setting `nsamples` to improve accuracy as calibration activations contain 0
chaiml-pony-v2-q235b-lr-18913-v1-uploader: ---------- 2026-02-26 01:22:22 (0:00:00) ----------
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Files: hashed 11/38 (26.1M/131.9G) | pre-uploaded: 0/1 (0.0/131.9G) (+27 unsure) | committed: 0/38 (0.0/131.9G) | ignored: 0
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Workers: hashing: 27 | get upload mode: 0 | pre-uploading: 1 | committing: 0 | waiting: 98
chaiml-pony-v2-q235b-lr-18913-v1-uploader: ---------------------------------------------------
chaiml-pony-v2-q235b-lr-18913-v1-uploader: ---------- 2026-02-26 01:23:22 (0:01:00) ----------
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Files: hashed 38/38 (131.9G/131.9G) | pre-uploaded: 2/28 (1.9G/131.9G) | committed: 0/38 (0.0/131.9G) | ignored: 0
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Workers: hashing: 0 | get upload mode: 0 | pre-uploading: 26 | committing: 0 | waiting: 100
chaiml-pony-v2-q235b-lr-18913-v1-uploader: ---------------------------------------------------
chaiml-pony-v2-q235b-lr-18913-v1-uploader: ---------- 2026-02-26 01:24:22 (0:02:00) ----------
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Files: hashed 38/38 (131.9G/131.9G) | pre-uploaded: 10/28 (41.9G/131.9G) | committed: 0/38 (0.0/131.9G) | ignored: 0
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Workers: hashing: 0 | get upload mode: 0 | pre-uploading: 18 | committing: 0 | waiting: 108
chaiml-pony-v2-q235b-lr-18913-v1-uploader: ---------------------------------------------------
chaiml-pony-v2-q235b-lr-18913-v1-uploader: ---------- 2026-02-26 01:25:22 (0:03:00) ----------
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Files: hashed 38/38 (131.9G/131.9G) | pre-uploaded: 18/28 (81.9G/131.9G) | committed: 0/38 (0.0/131.9G) | ignored: 0
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Workers: hashing: 0 | get upload mode: 0 | pre-uploading: 10 | committing: 0 | waiting: 116
chaiml-pony-v2-q235b-lr-18913-v1-uploader: ---------------------------------------------------
chaiml-pony-v2-q235b-lr-18913-v1-uploader: ---------- 2026-02-26 01:26:22 (0:04:00) ----------
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Files: hashed 38/38 (131.9G/131.9G) | pre-uploaded: 26/28 (121.9G/131.9G) | committed: 0/38 (0.0/131.9G) | ignored: 0
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Workers: hashing: 0 | get upload mode: 0 | pre-uploading: 2 | committing: 0 | waiting: 124
chaiml-pony-v2-q235b-lr-18913-v1-uploader: ---------------------------------------------------
chaiml-pony-v2-q235b-lr-18913-v1-uploader: ---------- 2026-02-26 01:27:22 (0:05:00) ----------
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Files: hashed 38/38 (131.9G/131.9G) | pre-uploaded: 28/28 (131.9G/131.9G) | committed: 0/38 (0.0/131.9G) | ignored: 0
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Workers: hashing: 0 | get upload mode: 0 | pre-uploading: 0 | committing: 1 | waiting: 125
chaiml-pony-v2-q235b-lr-18913-v1-uploader: ---------------------------------------------------
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Processed model ChaiML/pony-v2-q235b-lr1e4ep1r64g4 in 5716.746s
chaiml-pony-v2-q235b-lr-18913-v1-uploader: creating bucket guanaco-vllm-models
chaiml-pony-v2-q235b-lr-18913-v1-uploader: /usr/lib/python3/dist-packages/S3/BaseUtils.py:56: SyntaxWarning: invalid escape sequence '\.'
chaiml-pony-v2-q235b-lr-18913-v1-uploader: RE_S3_DATESTRING = re.compile('\.[0-9]*(?:[Z\\-\\+]*?)')
chaiml-pony-v2-q235b-lr-18913-v1-uploader: /usr/lib/python3/dist-packages/S3/BaseUtils.py:57: SyntaxWarning: invalid escape sequence '\s'
chaiml-pony-v2-q235b-lr-18913-v1-uploader: RE_XML_NAMESPACE = re.compile(b'^(<?[^>]+?>\s*|\s*)(<\w+) xmlns=[\'"](https?://[^\'"]+)[\'"]', re.MULTILINE)
chaiml-pony-v2-q235b-lr-18913-v1-uploader: /usr/lib/python3/dist-packages/S3/Utils.py:240: SyntaxWarning: invalid escape sequence '\.'
chaiml-pony-v2-q235b-lr-18913-v1-uploader: invalid = re.search("([^a-z0-9\.-])", bucket, re.UNICODE)
chaiml-pony-v2-q235b-lr-18913-v1-uploader: /usr/lib/python3/dist-packages/S3/Utils.py:244: SyntaxWarning: invalid escape sequence '\.'
chaiml-pony-v2-q235b-lr-18913-v1-uploader: invalid = re.search("([^A-Za-z0-9\._-])", bucket, re.UNICODE)
chaiml-pony-v2-q235b-lr-18913-v1-uploader: /usr/lib/python3/dist-packages/S3/Utils.py:255: SyntaxWarning: invalid escape sequence '\.'
chaiml-pony-v2-q235b-lr-18913-v1-uploader: if re.search("-\.", bucket, re.UNICODE):
chaiml-pony-v2-q235b-lr-18913-v1-uploader: /usr/lib/python3/dist-packages/S3/Utils.py:257: SyntaxWarning: invalid escape sequence '\.'
chaiml-pony-v2-q235b-lr-18913-v1-uploader: if re.search("\.\.", bucket, re.UNICODE):
chaiml-pony-v2-q235b-lr-18913-v1-uploader: /usr/lib/python3/dist-packages/S3/S3Uri.py:155: SyntaxWarning: invalid escape sequence '\w'
chaiml-pony-v2-q235b-lr-18913-v1-uploader: _re = re.compile("^(\w+://)?(.*)", re.UNICODE)
chaiml-pony-v2-q235b-lr-18913-v1-uploader: /usr/lib/python3/dist-packages/S3/FileLists.py:480: SyntaxWarning: invalid escape sequence '\*'
chaiml-pony-v2-q235b-lr-18913-v1-uploader: wildcard_split_result = re.split("\*|\?", uri_str, maxsplit=1)
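[editor's note: the SyntaxWarnings above come from s3cmd patterns that put backslash escapes like `\.` inside ordinary string literals, which Python 3.12 flags at compile time; the standard remedy, shown as an illustration rather than a patch to s3cmd, is a raw string literal:]

```python
import re

# Raw-string version of the first flagged pattern: r'...' keeps the backslash
# literal, so compiling the source emits no "invalid escape sequence" warning.
RE_S3_DATESTRING = re.compile(r'\.[0-9]*(?:[Z\-\+]*?)')

print(RE_S3_DATESTRING.search('.123Z') is not None)
```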
chaiml-pony-v2-q235b-lr-18913-v1-uploader: Bucket 's3://guanaco-vllm-models/' created
chaiml-pony-v2-q235b-lr-18913-v1-uploader: uploading /dev/shm/model_output to s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/chat_template.jinja s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/chat_template.jinja
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/tokenizer_config.json s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/tokenizer_config.json
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/quantization_config.json s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/quantization_config.json
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/config.json s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/config.json
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/added_tokens.json s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/added_tokens.json
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/vocab.json s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/vocab.json
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/generation_config.json s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/generation_config.json
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/merges.txt s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/merges.txt
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model.safetensors.index.json s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model.safetensors.index.json
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/tokenizer.json s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/tokenizer.json
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/special_tokens_map.json s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/special_tokens_map.json
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00027-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00027-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00011-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00011-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00025-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00025-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00015-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00015-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00010-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00010-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00009-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00009-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00018-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00018-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00008-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00008-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00001-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00001-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00013-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00013-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00007-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00007-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00021-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00021-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00019-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00019-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00022-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00022-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00004-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00004-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00026-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00026-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00005-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00005-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00023-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00023-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00017-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00017-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00014-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00014-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00024-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00024-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00016-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00016-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00006-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00006-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00020-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00020-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00002-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00002-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00012-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00012-of-00027.safetensors
chaiml-pony-v2-q235b-lr-18913-v1-uploader: cp /dev/shm/model_output/model-00003-of-00027.safetensors s3://guanaco-vllm-models/chaiml-pony-v2-q235b-lr-18913-v1/default/model-00003-of-00027.safetensors
Job chaiml-pony-v2-q235b-lr-18913-v1-uploader completed after 5836.14s with status: succeeded
Stopping job with name chaiml-pony-v2-q235b-lr-18913-v1-uploader
Pipeline stage VLLMUploader completed in 5836.64s
run pipeline stage %s
Running pipeline stage VLLMTemplater
Pipeline stage VLLMTemplater completed in 0.60s
run pipeline stage %s
Running pipeline stage VLLMDeployer
Creating inference service chaiml-pony-v2-q235b-lr-18913-v1
Waiting for inference service chaiml-pony-v2-q235b-lr-18913-v1 to be ready
Inference service chaiml-pony-v2-q235b-lr-18913-v1 ready after 370.3212220668793s
Pipeline stage VLLMDeployer completed in 370.75s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.0131771564483643s
Received healthy response to inference request in 1.9543778896331787s
Received healthy response to inference request in 1.995619297027588s
Received healthy response to inference request in 1.8246362209320068s
Received healthy response to inference request in 2.0829355716705322s
Received healthy response to inference request in 1.8723044395446777s
Received healthy response to inference request in 2.099777936935425s
Received healthy response to inference request in 1.921806812286377s
Received healthy response to inference request in 2.0010433197021484s
Received healthy response to inference request in 1.971395492553711s
Received healthy response to inference request in 2.197892189025879s
Received healthy response to inference request in 1.9960601329803467s
Received healthy response to inference request in 2.1232926845550537s
Received healthy response to inference request in 1.948730707168579s
Received healthy response to inference request in 2.3105626106262207s
Received healthy response to inference request in 1.9883747100830078s
Received healthy response to inference request in 1.9812545776367188s
Received healthy response to inference request in 1.9975600242614746s
Received healthy response to inference request in 1.981532335281372s
Received healthy response to inference request in 1.8802998065948486s
Received healthy response to inference request in 2.0038015842437744s
Received healthy response to inference request in 1.999208927154541s
Received healthy response to inference request in 2.0183987617492676s
Received healthy response to inference request in 1.9835636615753174s
Received healthy response to inference request in 2.187330961227417s
Received healthy response to inference request in 2.0921032428741455s
Received healthy response to inference request in 1.9123051166534424s
Received healthy response to inference request in 2.2024478912353516s
Received healthy response to inference request in 1.9148008823394775s
Received healthy response to inference request in 1.996347188949585s
30 requests
0 failed requests
5th percentile: 1.8759023547172546
10th percentile: 1.909104585647583
20th percentile: 1.9433459281921386
30th percentile: 1.9782968521118165
40th percentile: 1.9864502906799317
50th percentile: 1.9962036609649658
60th percentile: 1.999942684173584
70th percentile: 2.0147436380386354
80th percentile: 2.0936381816864014
90th percentile: 2.188387084007263
95th percentile: 2.200397825241089
99th percentile: 2.279209342002869
mean time: 2.0150980710983277
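[editor's note: the StressChecker summary above reports percentiles and a mean over the request latencies; purely as an illustration (this is not the pipeline's actual implementation, and the sample data below is hypothetical), such statistics can be computed like this:]

```python
import statistics

def percentile(samples, pct):
    """Percentile with linear interpolation (numpy.percentile's default method)."""
    xs = sorted(samples)
    k = (len(xs) - 1) * pct / 100.0   # fractional rank into the sorted samples
    lo = int(k)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

latencies = [1.8, 1.9, 2.0, 2.1, 2.2]  # hypothetical latencies in seconds
print(f"50th percentile: {percentile(latencies, 50)}")
print(f"90th percentile: {percentile(latencies, 90)}")
print(f"mean time: {statistics.mean(latencies)}")
```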
Pipeline stage StressChecker completed in 65.94s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.74s
Shutdown handler de-registered
chaiml-pony-v2-q235b-lr_18913_v1 status is now deployed due to DeploymentManager action
chaiml-pony-v2-q235b-lr_18913_v1 status is now inactive due to auto deactivation removed underperforming models
chaiml-pony-v2-q235b-lr_18913_v1 status is now torndown due to DeploymentManager action