developer_uid: NischayDnk
submission_id: chaiml-llama3-8b-exp3-s_66199_v3
model_name: chaiml-llama3-8b-exp3-s_66199_v3
model_group: ChaiML/llama3-8b-exp3-sc
status: torndown
timestamp: 2025-03-25T18:21:43+00:00
num_battles: 8114
num_wins: 3714
celo_rating: 1250.43
family_friendly_score: 0.5874
family_friendly_standard_error: 0.006962201376001702
submission_type: basic
model_repo: ChaiML/llama3-8b-exp3-screenshot-seq512
model_architecture: LlamaForSequenceClassification
model_num_parameters: 8030261248
best_of: 8
max_input_tokens: 256
max_output_tokens: 64
reward_model: default
latencies (s, rounded):
  batch_size  throughput  latency_mean  latency_p50  latency_p90
  1           0.8825      1.1331        1.0931       1.2312
  2           0.8739      2.2851        2.3060       2.3412
  3           0.8723      3.4274        3.4180       3.5643
  4           0.8710      4.5670        4.5461       4.7084
  5           0.8635      5.7468        5.8352       5.8952
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: chaiml-llama3-8b-exp3-s_66199_v3
is_internal_developer: False
language_model: ChaiML/llama3-8b-exp3-screenshot-seq512
model_size: 8B
ranking_group: single
throughput_3p7s: 0.88
us_pacific_date: 2025-03-25
win_ratio: 0.45772738476706926
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 256, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': '<|im_start|>system\n{memory}<|im_end|>\n', 'prompt_template': '<|im_start|>user\n{prompt}<|im_end|>\n', 'bot_template': '<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n', 'user_template': '<|im_start|>user\n{user_name}: {message}<|im_end|>\n', 'response_template': '<|im_start|>assistant\n{bot_name}:', 'truncate_by_message': False}
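The formatter above is the submission's recipe for flattening a conversation into a single model input: memory first, then the alternating turns, then an open assistant turn for generation. A minimal sketch of that assembly, assuming a hypothetical `assemble_prompt` helper and a made-up sample conversation (only the templates themselves come from the config):

```python
# Templates copied from the submission's formatter config; the helper
# and the sample conversation below are hypothetical illustrations.
FORMATTER = {
    "memory_template": "<|im_start|>system\n{memory}<|im_end|>\n",
    "prompt_template": "<|im_start|>user\n{prompt}<|im_end|>\n",
    "bot_template": "<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n",
    "user_template": "<|im_start|>user\n{user_name}: {message}<|im_end|>\n",
    "response_template": "<|im_start|>assistant\n{bot_name}:",
}

def assemble_prompt(memory, turns, bot_name):
    """Flatten memory plus alternating turns into one prompt string."""
    parts = [FORMATTER["memory_template"].format(memory=memory)]
    for speaker, name, message in turns:
        if speaker == "user":
            parts.append(FORMATTER["user_template"].format(user_name=name, message=message))
        else:
            parts.append(FORMATTER["bot_template"].format(bot_name=name, message=message))
    # The response template leaves the assistant turn open; generation
    # continues from here and stops at '\n' per the stopping_words setting.
    parts.append(FORMATTER["response_template"].format(bot_name=bot_name))
    return "".join(parts)

prompt = assemble_prompt(
    memory="Luna is a friendly astronomer.",
    turns=[("user", "Sam", "Hi!"), ("bot", "Luna", "Hello, Sam!")],
    bot_name="Luna",
)
print(prompt)
```

Note that with `max_input_tokens: 256` the assembled prompt would additionally be truncated before generation; that step is not sketched here.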
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer
Waiting for job on chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer to finish
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ║ _____ __ __ ║
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ║ /___/ ║
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ║ ║
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ║ Version: 0.12.8 ║
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ║ https://mk1.ai ║
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ║ ║
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ║ The license key for the current software has been verified as ║
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ║ belonging to: ║
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ║ ║
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ║ Chai Research Corp. ║
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ║ Expiration: 2025-04-15 23:59:59 ║
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ║ ║
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name rirv938-mistral-24b-dpo-45557-v1-mkmlizer
Waiting for job on rirv938-mistral-24b-dpo-45557-v1-mkmlizer to finish
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: Downloaded to shared memory in 48.017s
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: quantizing model to /dev/shm/model_cache, profile:t0, folder:/tmp/tmp6hj75joc, device:0
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: Saving flywheel model at /dev/shm/model_cache
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: quantized model in 101.182s
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: Processed model ChaiML/llama3-8b-exp3-screenshot-seq512 in 149.200s
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: creating bucket guanaco-mkml-models
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/chaiml-llama3-8b-exp3-s-66199-v3
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/chaiml-llama3-8b-exp3-s-66199-v3/special_tokens_map.json
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/chaiml-llama3-8b-exp3-s-66199-v3/config.json
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/chaiml-llama3-8b-exp3-s-66199-v3/tokenizer_config.json
chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/chaiml-llama3-8b-exp3-s-66199-v3/tokenizer.json
rirv938-mistral-24b-dpo-45557-v1-mkmlizer: quantized model in 66.383s
rirv938-mistral-24b-dpo-45557-v1-mkmlizer: Processed model rirv938/mistral_24b_dpo_kl_10k_beta2_1248_v3 in 246.426s
rirv938-mistral-24b-dpo-45557-v1-mkmlizer: creating bucket guanaco-mkml-models
rirv938-mistral-24b-dpo-45557-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
rirv938-mistral-24b-dpo-45557-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/rirv938-mistral-24b-dpo-45557-v1
rirv938-mistral-24b-dpo-45557-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/rirv938-mistral-24b-dpo-45557-v1/config.json
rirv938-mistral-24b-dpo-45557-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/rirv938-mistral-24b-dpo-45557-v1/special_tokens_map.json
rirv938-mistral-24b-dpo-45557-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/rirv938-mistral-24b-dpo-45557-v1/tokenizer_config.json
rirv938-mistral-24b-dpo-45557-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/rirv938-mistral-24b-dpo-45557-v1/tokenizer.json
rirv938-mistral-24b-dpo-45557-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/rirv938-mistral-24b-dpo-45557-v1/flywheel_model.1.safetensors
rirv938-mistral-24b-dpo-45557-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/rirv938-mistral-24b-dpo-45557-v1/flywheel_model.0.safetensors
rirv938-mistral-24b-dpo-45557-v1-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... Loading 0: 99%|█████████▉| 361/363 [00:46<00:01, 1.71it/s]
Job rirv938-mistral-24b-dpo-45557-v1-mkmlizer completed after 289.83s with status: succeeded
Stopping job with name rirv938-mistral-24b-dpo-45557-v1-mkmlizer
Pipeline stage MKMLizer completed in 290.54s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.20s
Unable to record family friendly update due to error: Invalid JSON input: JSON must contain 'User Safety' and 'Response Safety' fields
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service rirv938-mistral-24b-dpo-45557-v1
Waiting for inference service rirv938-mistral-24b-dpo-45557-v1 to be ready
Job chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer completed after 318.25s with status: succeeded
Stopping job with name chaiml-llama3-8b-exp3-s-66199-v3-mkmlizer
Pipeline stage MKMLizer completed in 318.88s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.23s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service chaiml-llama3-8b-exp3-s-66199-v3
Waiting for inference service chaiml-llama3-8b-exp3-s-66199-v3 to be ready
Failed to get response for submission nitral-ai-captain-eris_45741_v27: HTTPConnectionPool(host='nitral-ai-captain-eris-45741-v27-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
Inference service rirv938-mistral-24b-dpo-45557-v1 ready after 100.39s
Pipeline stage MKMLDeployer completed in 101.17s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.9630608558654785s
Inference service chaiml-llama3-8b-exp3-s-66199-v3 ready after 100.38s
Pipeline stage MKMLDeployer completed in 101.19s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.4725728034973145s
Received healthy response to inference request in 2.718724489212036s
Received healthy response to inference request in 3.465040683746338s
Received healthy response to inference request in 2.4118425846099854s
Received healthy response to inference request in 3.95638108253479s
Received healthy response to inference request in 2.7215511798858643s
5 requests
0 failed requests
5th percentile: 2.424s
10th percentile: 2.436s
20th percentile: 2.460s
30th percentile: 2.522s
40th percentile: 2.620s
50th percentile: 2.719s
60th percentile: 2.720s
70th percentile: 2.721s
80th percentile: 2.770s
90th percentile: 2.866s
95th percentile: 2.915s
99th percentile: 2.953s
mean time: 2.658s
Pipeline stage StressChecker completed in 16.26s
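The StressChecker statistics above can be reproduced from the five response times logged for this submission (the log interleaves two jobs, so the five chaiml-llama3-8b times are picked out here). This is a sketch assuming numpy-style linear interpolation between closest ranks; the pipeline's actual implementation is not shown in the log:

```python
# The five response times (in seconds) for the chaiml-llama3-8b
# stress check, as logged above.
times = [
    2.9630608558654785,
    2.4725728034973145,
    2.718724489212036,
    2.4118425846099854,
    2.7215511798858643,
]

def percentile(sample, p):
    """p-th percentile via linear interpolation between closest ranks
    (the same convention as numpy.percentile's default method)."""
    s = sorted(sample)
    pos = (p / 100) * (len(s) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (pos - lo) * (s[hi] - s[lo])

mean_time = sum(times) / len(times)
p5 = percentile(times, 5)    # ~2.424, the logged 5th percentile
p50 = percentile(times, 50)  # ~2.719, the logged 50th percentile
p99 = percentile(times, 99)  # ~2.953, the logged 99th percentile
print(p5, p50, p99, mean_time)
```

The same convention also reproduces the other job's statistics, e.g. its 5th percentile falls one fifth of the way between its two fastest responses.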
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
Received healthy response to inference request in 3.4145803451538086s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.99s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 1.04s
Shutdown handler de-registered
rirv938-mistral-24b-dpo_45557_v1 status is now deployed due to DeploymentManager action
Received healthy response to inference request in 3.490630865097046s
Received healthy response to inference request in 2.0942342281341553s
5 requests
0 failed requests
5th percentile: 2.358s
10th percentile: 2.622s
20th percentile: 3.151s
30th percentile: 3.425s
40th percentile: 3.445s
50th percentile: 3.465s
60th percentile: 3.475s
70th percentile: 3.486s
80th percentile: 3.584s
90th percentile: 3.770s
95th percentile: 3.863s
99th percentile: 3.938s
mean time: 3.284s
Pipeline stage StressChecker completed in 18.58s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.71s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.70s
Shutdown handler de-registered
chaiml-llama3-8b-exp3-s_66199_v3 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.15s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.13s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service chaiml-llama3-8b-exp3-s-66199-v3-profiler
Waiting for inference service chaiml-llama3-8b-exp3-s-66199-v3-profiler to be ready
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3346.32s
Shutdown handler de-registered
chaiml-llama3-8b-exp3-s_66199_v3 status is now inactive due to auto-deactivation of underperforming models
chaiml-llama3-8b-exp3-s_66199_v3 status is now torndown due to DeploymentManager action