developer_uid: junhua024
submission_id: junhua024-chai-1-full-066126_v34
model_name: junhua024-chai-1-full-066126_v34
model_group: junhua024/chai-1-full-06
status: torndown
timestamp: 2025-07-14T13:49:32+00:00
num_battles: 7004
num_wins: 3401
celo_rating: 1283.93
family_friendly_score: 0.0
family_friendly_standard_error: 0.0
submission_type: basic
model_repo: junhua024/chai-1-full-066126
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies:
  batch_size  throughput  latency_mean (s)  latency_p50 (s)  latency_p90 (s)
           1       0.594             1.683            1.673            1.863
           3       1.067             2.805            2.798            3.116
           5       1.277             3.906            3.910            4.486
           6       1.339             4.467            4.477            5.032
           8       1.389             5.728            5.748            6.376
          10       1.412             7.016            7.066            7.882
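The throughput_3p7s field below is not defined anywhere in this log; one reconstruction consistent with the numbers above is to linearly interpolate the batch size whose mean latency reaches 3.7 s, then divide by 3.7 s. A sketch (assumed derivation, not the profiler's documented formula):

```python
# Hypothetical reconstruction of throughput_3p7s: interpolate the batch size
# that yields a 3.7 s mean latency, then divide by 3.7 s.
points = [(1, 1.683), (3, 2.805), (5, 3.906), (6, 4.467), (8, 5.728), (10, 7.016)]

def batch_size_at(latency, pts):
    """Linearly interpolate batch size at the given mean latency (seconds)."""
    for (b0, l0), (b1, l1) in zip(pts, pts[1:]):
        if l0 <= latency <= l1:
            return b0 + (b1 - b0) * (latency - l0) / (l1 - l0)
    raise ValueError("latency outside the measured range")

print(round(batch_size_at(3.7, points) / 3.7, 2))  # 1.25, matching throughput_3p7s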
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-1-full-066126_v34
is_internal_developer: False
language_model: junhua024/chai-1-full-066126
model_size: 13B
ranking_group: single
throughput_3p7s: 1.25
us_pacific_date: 2025-07-14
win_ratio: 0.4855796687607082
generation_params: {'temperature': 0.96, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 10, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
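The formatter and generation_params fields describe, respectively, how each chat turn is rendered into a prompt and how completions are sampled (temperature 0.96, top_k 10, stop at newline, best-of-8). A minimal sketch of the template assembly, with a hypothetical helper and user name (this is illustrative, not the production formatter code):

```python
# Illustrative assembly of one prompt from the formatter templates above.
memory_template = "{bot_name}'s Persona: {memory}\n####\n"
prompt_template = "{prompt}\n<START>\n"
bot_template = "{bot_name}: {message}\n"
user_template = "{user_name}: {message}\n"
response_template = "{bot_name}:"

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """turns: list of (speaker, message) pairs, speaker is 'bot' or 'user'."""
    out = memory_template.format(bot_name=bot_name, memory=memory)
    out += prompt_template.format(prompt=prompt)
    for speaker, message in turns:
        tmpl = bot_template if speaker == "bot" else user_template
        out += tmpl.format(bot_name=bot_name, user_name=user_name, message=message)
    return out + response_template.format(bot_name=bot_name)

print(build_prompt("Eve", "You", "a friendly guide", "Eve greets you.",
                   [("bot", "Hello!"), ("user", "Hi Eve.")]))
```

At inference time the resulting string (truncated to max_input_tokens=1024) is completed for up to 64 tokens; best_of: 8 presumably means eight candidates are sampled per request and one is selected by the reward model.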
Shutdown handler not registered because Python interpreter is not running in the main thread
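This message appears because Python's signal.signal() raises ValueError when called outside the main thread, so registration must be guarded. A sketch of the usual pattern (hypothetical code, not the pipeline's actual implementation):

```python
# Why the shutdown handler is skipped: signal handlers can only be
# registered from the main thread in CPython.
import signal
import threading

def register_shutdown_handler(handler):
    if threading.current_thread() is not threading.main_thread():
        print("Shutdown handler not registered because Python interpreter "
              "is not running in the main thread")
        return False
    signal.signal(signal.SIGTERM, handler)
    return True

# Called from a worker thread, this logs the warning instead of registering:
threading.Thread(target=register_shutdown_handler, args=(lambda *_: None,)).start()
```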
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-1-full-066126-v34-mkmlizer
Waiting for job on junhua024-chai-1-full-066126-v34-mkmlizer to finish
junhua024-chai-1-full-066126-v34-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-1-full-066126-v34-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ belonging to: ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-1-full-066126-v34-mkmlizer: ║ ║
junhua024-chai-1-full-066126-v34-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-1-full-066126-v34-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
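The warning names its own fix: install the optional Xet backend. A quick check for whether the faster download path is available in the current environment:

```python
# Check for the optional Xet backend used by huggingface_hub.
try:
    import hf_xet  # noqa: F401  # provided by `pip install hf_xet`
    print("hf_xet installed: Xet-enabled repos use Xet storage downloads")
except ImportError:
    print("hf_xet missing: huggingface_hub falls back to plain HTTP")
```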
junhua024-chai-1-full-066126-v34-mkmlizer: Downloaded to shared memory in 83.235s
junhua024-chai-1-full-066126-v34-mkmlizer: Checking if junhua024/chai-1-full-066126 already exists in ChaiML
junhua024-chai-1-full-066126-v34-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpz_qi_40t, device:0
junhua024-chai-1-full-066126-v34-mkmlizer: Saving flywheel model at /dev/shm/model_cache
junhua024-chai-1-full-066126-v34-mkmlizer: quantized model in 39.149s
junhua024-chai-1-full-066126-v34-mkmlizer: Processed model junhua024/chai-1-full-066126 in 122.458s
junhua024-chai-1-full-066126-v34-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-1-full-066126-v34-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-1-full-066126-v34-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v34/nvidia
junhua024-chai-1-full-066126-v34-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v34/nvidia/config.json
junhua024-chai-1-full-066126-v34-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v34/nvidia/special_tokens_map.json
junhua024-chai-1-full-066126-v34-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v34/nvidia/tokenizer_config.json
junhua024-chai-1-full-066126-v34-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v34/nvidia/tokenizer.json
junhua024-chai-1-full-066126-v34-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-1-full-066126-v34/nvidia/flywheel_model.0.safetensors
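The exact tool behind these cp lines is not shown in the log; an equivalent sketch with boto3, assuming default AWS credentials are configured:

```python
# Hedged sketch of the upload step above; the pipeline may use awscli/s5cmd
# or another client instead of boto3.
import boto3

FILES = [
    "config.json",
    "special_tokens_map.json",
    "tokenizer_config.json",
    "tokenizer.json",
    "flywheel_model.0.safetensors",
]

s3 = boto3.client("s3")
for name in FILES:
    s3.upload_file(
        Filename=f"/dev/shm/model_cache/{name}",
        Bucket="guanaco-mkml-models",
        Key=f"junhua024-chai-1-full-066126-v34/nvidia/{name}",
    )
```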
junhua024-chai-1-full-066126-v34-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... 98%|█████████▊| 357/363 [00:12<00:00, 19.44it/s]
Job junhua024-chai-1-full-066126-v34-mkmlizer completed after 149.02s with status: succeeded
Stopping job with name junhua024-chai-1-full-066126-v34-mkmlizer
Pipeline stage MKMLizer completed in 149.58s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.16s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-1-full-066126-v34
Waiting for inference service junhua024-chai-1-full-066126-v34 to be ready
Failed to get response for submission junhua024-chai-1-full-066126_v30: HTTPConnectionPool(host='junhua024-chai-1-full-066126-v30-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Inference service junhua024-chai-1-full-066126-v34 ready after 231.53s
Pipeline stage MKMLDeployer completed in 233.56s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.4886322021484375s
Received healthy response to inference request in 1.6846976280212402s
Received healthy response to inference request in 1.7515373229980469s
Received healthy response to inference request in 1.705890417098999s
Received healthy response to inference request in 1.607133150100708s
5 requests
0 failed requests
5th percentile: 1.6226460456848144
10th percentile: 1.6381589412689208
20th percentile: 1.6691847324371338
30th percentile: 1.688936185836792
40th percentile: 1.6974133014678956
50th percentile: 1.705890417098999
60th percentile: 1.7241491794586181
70th percentile: 1.7424079418182372
80th percentile: 1.8989562988281252
90th percentile: 2.1937942504882812
95th percentile: 2.3412132263183594
99th percentile: 2.459148406982422
mean time: 1.8475781440734864
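The StressChecker statistics above are consistent with NumPy's default linear-interpolation percentile method applied to the five response times:

```python
# Reproduce the logged percentiles and mean from the five healthy responses.
import numpy as np

times = [2.4886322021484375, 1.6846976280212402, 1.7515373229980469,
         1.705890417098999, 1.607133150100708]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(times, p)}")
print("mean time:", np.mean(times))
```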
Pipeline stage StressChecker completed in 10.65s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.68s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.66s
Shutdown handler de-registered
junhua024-chai-1-full-066126_v34 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.16s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.11s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service junhua024-chai-1-full-066126-v34-profiler
Waiting for inference service junhua024-chai-1-full-066126-v34-profiler to be ready
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
clean up pipeline due to error=DeploymentChecksError('None: None')
Shutdown handler de-registered
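The literal "%s, retrying in %s seconds..." lines suggest lazy logger formatting (the capture records the format string rather than the rendered message) around a retry loop. A minimal sketch of that pattern; the function name, attempt count, and delay are hypothetical:

```python
# Retry-with-delay pattern implied by the scorer logs above.
import logging
import time

def with_retries(fn, attempts=3, delay=30):
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception as err:
            if attempt == attempts:
                raise  # exhausted: surface the error to the pipeline
            logging.warning("%s, retrying in %s seconds...", err, delay)
            time.sleep(delay)
```

Once the attempts are exhausted the stage fails, matching the DeploymentChecksError cleanup above and in the two repeated scorer runs below.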
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
clean up pipeline due to error=DeploymentChecksError('None: None')
Shutdown handler de-registered
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
clean up pipeline due to error=DeploymentChecksError('None: None')
Shutdown handler de-registered
junhua024-chai-1-full-066126_v34 status is now inactive due to auto-deactivation of underperforming models
junhua024-chai-1-full-066126_v34 status is now torndown due to DeploymentManager action