developer_uid: junhua024
submission_id: junhua024-chai-02-full_54657_v34
model_name: junhua024-chai-02-full_54657_v34
model_group: junhua024/chai_02_full_0
status: torndown
timestamp: 2025-07-25T07:07:34+00:00
num_battles: 6744
num_wins: 3444
celo_rating: 1294.24
family_friendly_score: 0.4186
family_friendly_standard_error: 0.006976733333014814
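The standard error above is consistent with treating the family-friendly score as a sample proportion. A quick sanity check (the sample size of roughly 5,000 judgements is inferred from these two numbers, not stated anywhere in this record):

```python
# If family_friendly_score is a proportion p over n judgements,
# its standard error is sqrt(p * (1 - p) / n); solve for n.
p = 0.4186                      # family_friendly_score
se = 0.006976733333014814       # family_friendly_standard_error

n_implied = p * (1 - p) / se ** 2
print(round(n_implied))  # ~5000
```

The implied n lands almost exactly on 5,000, which suggests the score was evaluated on a fixed 5,000-sample set, though that remains an inference.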
submission_type: basic
model_repo: junhua024/chai_02_full_02_019
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 2048
max_output_tokens: 64
reward_model: default
latencies: [{'batch_size': 1, 'throughput': 0.5153790593361816, 'latency_mean': 1.9402156054973603, 'latency_p50': 1.9355250597000122, 'latency_p90': 2.1042478561401365}, {'batch_size': 2, 'throughput': 0.6804693150679231, 'latency_mean': 2.932838158607483, 'latency_p50': 2.9211565256118774, 'latency_p90': 3.198399806022644}, {'batch_size': 3, 'throughput': 0.7583739829648515, 'latency_mean': 3.9483076882362367, 'latency_p50': 3.9824140071868896, 'latency_p90': 4.451128339767456}, {'batch_size': 4, 'throughput': 0.8128752163970777, 'latency_mean': 4.910735834836959, 'latency_p50': 4.927261114120483, 'latency_p90': 5.548508596420288}, {'batch_size': 5, 'throughput': 0.8323532393044497, 'latency_mean': 5.981708582639694, 'latency_p50': 5.994927644729614, 'latency_p90': 6.655341124534607}]
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-02-full_54657_v34
is_internal_developer: False
language_model: junhua024/chai_02_full_02_019
model_size: 13B
ranking_group: single
throughput_3p7s: 0.74
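The throughput_3p7s figure appears to be throughput linearly interpolated at a 3.7 s mean latency between the batch-size-2 and batch-size-3 rows of the latencies table above. This is an inference from the numbers, not documented behaviour; a sketch:

```python
# latency_mean (s) and throughput per batch size, copied from 'latencies'.
means = [1.9402156054973603, 2.932838158607483, 3.9483076882362367,
         4.910735834836959, 5.981708582639694]
tps = [0.5153790593361816, 0.6804693150679231, 0.7583739829648515,
       0.8128752163970777, 0.8323532393044497]

target = 3.7  # seconds
for i in range(len(means) - 1):
    if means[i] <= target <= means[i + 1]:
        # Linear interpolation between the two bracketing measurements.
        frac = (target - means[i]) / (means[i + 1] - means[i])
        tp_3p7s = tps[i] + frac * (tps[i + 1] - tps[i])
        break

print(round(tp_3p7s, 2))  # 0.74
```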
us_pacific_date: 2025-07-25
win_ratio: 0.5106761565836299
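win_ratio is simply num_wins / num_battles from the fields above:

```python
num_battles = 6744
num_wins = 3444

win_ratio = num_wins / num_battles
print(win_ratio)  # matches the win_ratio field (~0.51068)
```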
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 45, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 2048, 'best_of': 8, 'max_output_tokens': 64}
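generation_params includes best_of: 8 alongside a reward_model field: for each turn, several candidate completions are sampled and the reward model picks one. A minimal sketch of best-of-n selection (generate and score here are toy placeholders, not the platform's actual API):

```python
import itertools

def best_of_n(generate, score, prompt, n=8):
    """Sample n candidate completions and return the highest-scoring one."""
    candidates = [generate(prompt) for _ in range(n)]
    return max(candidates, key=score)

# Toy stand-ins: generate numbers its outputs, score prefers higher suffixes.
counter = itertools.count()
toy_generate = lambda prompt: f"{prompt} candidate-{next(counter)}"
toy_score = lambda text: int(text.rsplit("-", 1)[1])

result = best_of_n(toy_generate, toy_score, "Hi", n=3)
print(result)  # Hi candidate-2
```

With best_of: 8, each user-visible reply costs eight generations, which is worth keeping in mind when reading the latency table above.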
formatter: {'memory_template': "{memory} You are an empathetic conversational AI. You understand the user's feelings and respond with warmth, compassion, and sincerity. Each of your responses should be fewer than 64 tokens long.", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
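The formatter dict above reads as a prompt-assembly recipe: system memory, then a `<START>` marker, then the alternating conversation rendered through bot_template/user_template, closed by response_template so the model continues as the bot. A sketch of how such templates are typically applied (build_prompt is illustrative, not the platform's code):

```python
formatter = {
    'bot_template': '{bot_name}: {message}\n',
    'user_template': '{user_name}: {message}\n',
    'response_template': '{bot_name}:',
}

def build_prompt(memory_text, turns, bot_name):
    # turns: list of (speaker, message) pairs; the bot's own lines use
    # bot_template, every other speaker's use user_template.
    parts = [memory_text + '\n<START>\n']
    for speaker, message in turns:
        if speaker == bot_name:
            parts.append(formatter['bot_template'].format(bot_name=speaker, message=message))
        else:
            parts.append(formatter['user_template'].format(user_name=speaker, message=message))
    parts.append(formatter['response_template'].format(bot_name=bot_name))
    return ''.join(parts)

prompt_text = build_prompt("You are kind.", [("User", "Hi"), ("Bot", "Hello!")], "Bot")
print(prompt_text)
```

Ending the prompt with `Bot:` (and stopping generation at `\n`, per stopping_words above) is what keeps each reply a single in-character line.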
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-02-full-54657-v34-mkmlizer
Waiting for job on junhua024-chai-02-full-54657-v34-mkmlizer to finish
junhua024-chai-02-full-54657-v34-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-02-full-54657-v34-mkmlizer: ║ ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ belonging to: ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-02-full-54657-v34-mkmlizer: ║ ║
junhua024-chai-02-full-54657-v34-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-02-full-54657-v34-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
junhua024-chai-02-full-54657-v34-mkmlizer: Downloaded to shared memory in 74.369s
junhua024-chai-02-full-54657-v34-mkmlizer: Checking if junhua024/chai_02_full_02_019 already exists in ChaiML
junhua024-chai-02-full-54657-v34-mkmlizer: quantizing model to /dev/shm/model_cache, profile:q4, folder:/tmp/tmp6ybs0t25, device:0
junhua024-chai-02-full-54657-v34-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Failed to get response for submission chaiml-next-door-annoyi_15417_v1: ('http://chaiml-next-door-annoyi-15417-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '')
junhua024-chai-02-full-54657-v34-mkmlizer: quantized model in 162.682s
junhua024-chai-02-full-54657-v34-mkmlizer: Processed model junhua024/chai_02_full_02_019 in 237.135s
junhua024-chai-02-full-54657-v34-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-02-full-54657-v34-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-02-full-54657-v34-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-02-full-54657-v34/nvidia
junhua024-chai-02-full-54657-v34-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-02-full-54657-v34/nvidia/flywheel_model.0.safetensors
junhua024-chai-02-full-54657-v34-mkmlizer: Loading 0: 100%|█████████▉| 362/363 [02:29<00:00, 3.14it/s]
Job junhua024-chai-02-full-54657-v34-mkmlizer completed after 347.4s with status: succeeded
Stopping job with name junhua024-chai-02-full-54657-v34-mkmlizer
Pipeline stage MKMLizer completed in 347.90s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.29s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-02-full-54657-v34
Waiting for inference service junhua024-chai-02-full-54657-v34 to be ready
Failed to get response for submission chaiml-next-door-annoyi_15417_v1: ('http://chaiml-next-door-annoyi-15417-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '')
Failed to get response for submission chaiml-grpo-nis-12bsftqw_4593_v1: HTTPConnectionPool(host='chaiml-grpo-nis-12bsftqw-4593-v1-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Inference service junhua024-chai-02-full-54657-v34 ready after 191.09073066711426s
Pipeline stage MKMLDeployer completed in 191.63s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 2.625896453857422s
Received healthy response to inference request in 1.9315719604492188s
Received healthy response to inference request in 1.2661380767822266s
Received healthy response to inference request in 3.651374578475952s
5 requests
1 failed requests
5th percentile: 1.399224853515625
10th percentile: 1.5323116302490234
20th percentile: 1.7984851837158202
30th percentile: 2.0704368591308593
40th percentile: 2.348166656494141
50th percentile: 2.625896453857422
60th percentile: 3.0360877037048337
70th percentile: 3.446278953552246
80th percentile: 6.957064104080203
90th percentile: 13.568443155288698
95th percentile: 16.87413268089294
99th percentile: 19.518684301376343
mean time: 5.930960655212402
%s, retrying in %s seconds...
Received healthy response to inference request in 2.010425329208374s
Received healthy response to inference request in 2.654798984527588s
Received healthy response to inference request in 1.475830316543579s
Received healthy response to inference request in 2.2858200073242188s
Received healthy response to inference request in 1.2388746738433838s
5 requests
0 failed requests
5th percentile: 1.2862658023834228
10th percentile: 1.3336569309234618
20th percentile: 1.42843918800354
30th percentile: 1.582749319076538
40th percentile: 1.7965873241424561
50th percentile: 2.010425329208374
60th percentile: 2.120583200454712
70th percentile: 2.2307410717010496
80th percentile: 2.3596158027648926
90th percentile: 2.5072073936462402
95th percentile: 2.581003189086914
99th percentile: 2.640039825439453
mean time: 1.9331498622894288
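The StressChecker percentiles for this second batch (five healthy responses, zero failures) can be reproduced from the five logged response times with standard linear interpolation between order statistics, the same method numpy.percentile uses by default; a sketch:

```python
# The five healthy response times (s) logged for the second batch.
times = [2.010425329208374, 2.654798984527588, 1.475830316543579,
         2.2858200073242188, 1.2388746738433838]

def percentile(xs, q):
    """q-th percentile via linear interpolation (numpy.percentile default)."""
    xs = sorted(xs)
    pos = (len(xs) - 1) * q / 100.0
    lo = int(pos)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (pos - lo) * (xs[hi] - xs[lo])

print(percentile(times, 5))   # 1.2862658... as logged
print(percentile(times, 90))  # 2.5072073... as logged
print(sum(times) / len(times))  # 1.9331498... as logged
```

The first batch's tail percentiles (13.6 s at p90, 19.5 s at p99) are dominated by the one failed request, whose elapsed time near the 20 s read timeout was evidently included in the sample.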
Pipeline stage StressChecker completed in 42.47s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 1.36s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.83s
Shutdown handler de-registered
junhua024-chai-02-full_54657_v34 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 4654.02s
Shutdown handler de-registered
junhua024-chai-02-full_54657_v34 status is now inactive due to auto deactivation of underperforming models
junhua024-chai-02-full_54657_v34 status is now torndown due to DeploymentManager action