developer_uid: junhua024
submission_id: junhua024-chai-02-full-0111_v2
model_name: junhua024-chai-02-full-0111_v2
model_group: junhua024/chai_02_full_0
status: torndown
timestamp: 2025-07-16T18:01:25+00:00
num_battles: 6098
num_wins: 3264
celo_rating: 1273.12
family_friendly_score: 0.0
family_friendly_standard_error: 0.0
submission_type: basic
model_repo: junhua024/chai_02_full_0111
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies: [{'batch_size': 1, 'throughput': 0.5993871399167381, 'latency_mean': 1.6682062530517578, 'latency_p50': 1.6657034158706665, 'latency_p90': 1.8503597497940063}, {'batch_size': 3, 'throughput': 1.0861397621679885, 'latency_mean': 2.757166950702667, 'latency_p50': 2.7418967485427856, 'latency_p90': 3.0988407373428344}, {'batch_size': 5, 'throughput': 1.2989632379848832, 'latency_mean': 3.835206837654114, 'latency_p50': 3.844808578491211, 'latency_p90': 4.317627811431885}, {'batch_size': 6, 'throughput': 1.3651609417694854, 'latency_mean': 4.371828438043594, 'latency_p50': 4.411972641944885, 'latency_p90': 4.847292280197143}, {'batch_size': 8, 'throughput': 1.425266741646212, 'latency_mean': 5.577517313957214, 'latency_p50': 5.574975252151489, 'latency_p90': 6.321380257606506}, {'batch_size': 10, 'throughput': 1.4554196221944475, 'latency_mean': 6.81163791179657, 'latency_p50': 6.866379380226135, 'latency_p90': 7.792941784858703}]
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-02-full-0111_v2
is_internal_developer: False
language_model: junhua024/chai_02_full_0111
model_size: 13B
ranking_group: single
throughput_3p7s: 1.28
us_pacific_date: 2025-07-16
win_ratio: 0.5352574614627746
generation_params: {'temperature': 1.0, 'top_p': 0.88, 'min_p': 0.0, 'top_k': 10, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
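A quick consistency check on the summary fields above: the reported win_ratio is exactly num_wins / num_battles (plain Python, values copied from the metadata):

```python
# Values copied from the submission metadata above.
num_battles = 6098
num_wins = 3264

win_ratio = num_wins / num_battles
print(win_ratio)  # 0.5352574614627746, matching the reported win_ratio
```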
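The generation_params describe a standard sampling chain; the sketch below is not the serving stack's actual sampler (the helper name `sample_next_token` is mine), just one conventional way temperature=1.0, top_k=10 and top_p=0.88 combine, assuming temperature, then top-k, then nucleus filtering:

```python
import numpy as np

def sample_next_token(logits, temperature=1.0, top_k=10, top_p=0.88, rng=None):
    """Sample one token id from logits using this submission's generation_params."""
    rng = rng or np.random.default_rng()
    # Temperature scaling with a max-subtraction for numerical stability.
    scaled = logits / temperature
    probs = np.exp(scaled - np.max(scaled))
    probs /= probs.sum()
    # top-k: keep only the k most probable tokens.
    keep = np.argsort(probs)[-top_k:]
    mask = np.zeros_like(probs, dtype=bool)
    mask[keep] = True
    probs = np.where(mask, probs, 0.0)
    # top-p (nucleus): keep the smallest prefix of tokens, in descending
    # probability order, whose cumulative mass reaches top_p.
    order = np.argsort(probs)[::-1]
    cdf = np.cumsum(probs[order]) / probs.sum()
    cutoff = np.searchsorted(cdf, top_p) + 1
    probs[order[cutoff:]] = 0.0
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)
```

With min_p, presence_penalty and frequency_penalty all at 0.0, those terms drop out of the chain; best_of=8 then ranks eight such samples with the reward model.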
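The formatter fields are str.format templates; a minimal sketch of how a model input would be assembled from them (`build_prompt` and the sample conversation are illustrative, not the platform's actual code):

```python
# Templates copied from the submission's formatter above.
formatter = {
    'memory_template': "{bot_name}'s Persona: {memory}\n####\n",
    'prompt_template': '{prompt}\n<START>\n',
    'bot_template': '{bot_name}: {message}\n',
    'user_template': '{user_name}: {message}\n',
    'response_template': '{bot_name}:',
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """Assemble the model input; `turns` is a list of (speaker, message)
    pairs where speaker is 'bot' or 'user'."""
    parts = [
        formatter['memory_template'].format(bot_name=bot_name, memory=memory),
        formatter['prompt_template'].format(prompt=prompt),
    ]
    for speaker, message in turns:
        if speaker == 'bot':
            parts.append(formatter['bot_template'].format(bot_name=bot_name, message=message))
        else:
            parts.append(formatter['user_template'].format(user_name=user_name, message=message))
    # The response template leaves the cursor right after "{bot_name}:" so the
    # model completes the bot's next message; '\n' is a stopping word.
    parts.append(formatter['response_template'].format(bot_name=bot_name))
    return ''.join(parts)

text = build_prompt('Aria', 'User', 'a friendly guide', 'Greeting', [('user', 'Hi!')])
print(text)
# Aria's Persona: a friendly guide
# ####
# Greeting
# <START>
# User: Hi!
# Aria:
```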
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-02-full-0111-v2-mkmlizer
Waiting for job on junhua024-chai-02-full-0111-v2-mkmlizer to finish
junhua024-chai-02-full-0111-v2-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-02-full-0111-v2-mkmlizer: ║ ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ belonging to: ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-02-full-0111-v2-mkmlizer: ║ ║
junhua024-chai-02-full-0111-v2-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-02-full-0111-v2-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
Failed to get response for submission junhua024-chai-02-full-0121_v4: HTTPConnectionPool(host='junhua024-chai-02-full-0121-v4-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-02-full-0111-v2-mkmlizer: Downloaded to shared memory in 136.584s
junhua024-chai-02-full-0111-v2-mkmlizer: Checking if junhua024/chai_02_full_0111 already exists in ChaiML
junhua024-chai-02-full-0111-v2-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpk4wstiy0, device:0
junhua024-chai-02-full-0111-v2-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Retrying (%r) after connection broken by '%r': %s
Failed to get response for submission chaiml-gy-exp78-sft-gy-_94402_v1: HTTPConnectionPool(host='chaiml-gy-exp78-sft-gy-94402-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-02-full-0111-v2-mkmlizer: quantized model in 32.258s
junhua024-chai-02-full-0111-v2-mkmlizer: Processed model junhua024/chai_02_full_0111 in 168.925s
junhua024-chai-02-full-0111-v2-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-02-full-0111-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-02-full-0111-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-02-full-0111-v2/nvidia
junhua024-chai-02-full-0111-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-02-full-0111-v2/nvidia/special_tokens_map.json
junhua024-chai-02-full-0111-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-02-full-0111-v2/nvidia/config.json
junhua024-chai-02-full-0111-v2-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-02-full-0111-v2/nvidia/flywheel_model.0.safetensors
junhua024-chai-02-full-0111-v2-mkmlizer: Loading 0: 99%|█████████▉| 359/363 [00:11<00:00, 27.48it/s]
Job junhua024-chai-02-full-0111-v2-mkmlizer completed after 189.74s with status: succeeded
Stopping job with name junhua024-chai-02-full-0111-v2-mkmlizer
Pipeline stage MKMLizer completed in 190.37s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.20s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-02-full-0111-v2
Waiting for inference service junhua024-chai-02-full-0111-v2 to be ready
Failed to get response for submission junhua024-chai-02-full-0121_v4: HTTPConnectionPool(host='junhua024-chai-02-full-0121-v4-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Inference service junhua024-chai-02-full-0111-v2 ready after 261.3641724586487s
Pipeline stage MKMLDeployer completed in 261.90s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 2.4254515171051025s
Received healthy response to inference request in 1.9255292415618896s
Received healthy response to inference request in 1.5427682399749756s
Received healthy response to inference request in 1.751769781112671s
5 requests
1 failed requests
5th percentile: 1.5845685482025147
10th percentile: 1.6263688564300538
20th percentile: 1.7099694728851318
30th percentile: 1.7865216732025146
40th percentile: 1.8560254573822021
50th percentile: 1.9255292415618896
60th percentile: 2.125498151779175
70th percentile: 2.3254670619964597
80th percentile: 5.9769812107086215
90th percentile: 13.080040597915652
95th percentile: 16.63157029151916
99th percentile: 19.472794046401976
mean time: 5.5657237529754635
%s, retrying in %s seconds...
Received healthy response to inference request in 1.8215336799621582s
Received healthy response to inference request in 2.409975528717041s
Received healthy response to inference request in 2.5668962001800537s
Received healthy response to inference request in 2.0838253498077393s
Received healthy response to inference request in 2.50767183303833s
5 requests
0 failed requests
5th percentile: 1.8739920139312745
10th percentile: 1.9264503479003907
20th percentile: 2.0313670158386232
30th percentile: 2.1490553855895995
40th percentile: 2.2795154571533205
50th percentile: 2.409975528717041
60th percentile: 2.4490540504455565
70th percentile: 2.4881325721740724
80th percentile: 2.519516706466675
90th percentile: 2.5432064533233643
95th percentile: 2.555051326751709
99th percentile: 2.564527225494385
Failed to get response for submission junhua024-chai-02-full-0121_v4: HTTPConnectionPool(host='junhua024-chai-02-full-0121-v4-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
mean time: 2.2779805183410646
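The percentile figures above are consistent with linear interpolation over the five sorted response times; a short sketch reproducing the second run's numbers (the `percentile` helper is mine, mirroring numpy's default 'linear' method):

```python
def percentile(sorted_vals, p):
    """Linear-interpolated percentile over an ascending list (numpy's default method)."""
    idx = p / 100 * (len(sorted_vals) - 1)
    lo, frac = int(idx), idx - int(idx)
    if frac == 0:
        return sorted_vals[lo]
    return sorted_vals[lo] + frac * (sorted_vals[lo + 1] - sorted_vals[lo])

# The five healthy response times from the second stress-check run, sorted ascending.
times = [1.8215336799621582, 2.0838253498077393, 2.409975528717041,
         2.50767183303833, 2.5668962001800537]

print(percentile(times, 50))    # 2.409975528717041, as logged
print(percentile(times, 90))    # ~2.5432064533233643, the logged 90th percentile
print(sum(times) / len(times))  # ~2.2779805183410646, the logged mean time
```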
Pipeline stage StressChecker completed in 42.36s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.68s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.76s
Shutdown handler de-registered
junhua024-chai-02-full-0111_v2 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
clean up pipeline due to error=DeploymentChecksError('None: None')
Shutdown handler de-registered
junhua024-chai-02-full-0111_v2 status is now inactive due to auto deactivation removed underperforming models
junhua024-chai-02-full-0111_v2 status is now torndown due to DeploymentManager action