developer_uid: nguyenzzz
submission_id: thanhdaonguyen-thanhdaorp-281_v2
model_name: thanhdaonguyen-thanhdaorp-281_v2
model_group: thanhdaonguyen/thanhdaor
status: torndown
timestamp: 2023-12-28T18:24:03+00:00
num_battles: 67821
num_wins: 34039
celo_rating: 1158.88
family_friendly_score: 0.0
submission_type: basic
model_repo: thanhdaonguyen/thanhdaorp-2812
reward_repo: ChaiML/reward_models_100_170000000_cp_498032
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
display_name: thanhdaonguyen-thanhdaorp-281_v2
is_internal_developer: False
language_model: thanhdaonguyen/thanhdaorp-2812
model_size: NoneB
ranking_group: single
us_pacific_date: 2023-12-28
win_ratio: 0.5018946933840551
generation_params: {'temperature': 0.75, 'top_p': 0.8, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.2, 'frequency_penalty': 0.5, 'stopping_words': ['\n', '</s>', '<|im_end|>', '###'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': '### Instruction:\n\nEnter roleplay mode. You are {bot_name}.\n\n{memory}\n\n', 'prompt_template': 'Example session #1:\n```\n{prompt}\n```\n\n### Input:\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '\n### Response:\n{bot_name}:', 'truncate_by_message': False}
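The formatter templates and stopping_words above determine how the conversation is turned into a prompt and how raw completions are cut off. The sketch below is illustrative only: the template strings are copied from the formatter entry, but `build_prompt` and `truncate` are hypothetical helpers, not the actual serving code.

```python
# Template strings copied from the formatter entry above.
MEMORY = "### Instruction:\n\nEnter roleplay mode. You are {bot_name}.\n\n{memory}\n\n"
FENCE = "`" * 3  # the prompt_template wraps the example session in a code fence
PROMPT = "Example session #1:\n" + FENCE + "\n{prompt}\n" + FENCE + "\n\n### Input:\n"
BOT = "{bot_name}: {message}\n"
USER = "{user_name}: {message}\n"
RESPONSE = "\n### Response:\n{bot_name}:"
STOPPING_WORDS = ["\n", "</s>", "<|im_end|>", "###"]

def build_prompt(bot_name, memory, prompt, turns):
    """Assemble a full prompt from the templates (illustrative helper)."""
    parts = [MEMORY.format(bot_name=bot_name, memory=memory),
             PROMPT.format(prompt=prompt)]
    for speaker, message in turns:
        template = BOT if speaker == bot_name else USER
        parts.append(template.format(bot_name=bot_name, user_name=speaker,
                                     message=message))
    parts.append(RESPONSE.format(bot_name=bot_name))
    return "".join(parts)

def truncate(completion, stops=STOPPING_WORDS):
    """Cut a raw completion at the earliest stopping word, if any occurs."""
    cuts = [completion.index(s) for s in stops if s in completion]
    return completion[: min(cuts)] if cuts else completion
```

With these helpers, a completion such as `"Ahoy!\nUser: hi"` would be truncated to `"Ahoy!"` at the first newline, matching the `'\n'` entry in stopping_words.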
Resubmit model
Running pipeline stage MKMLizer
Starting job with name thanhdaonguyen-thanhdaorp-281-v2-mkmlizer
Waiting for job on thanhdaonguyen-thanhdaorp-281-v2-mkmlizer to finish
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: ║ _______ __ __ _______ _____ ║
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: ║ | | | |/ | | | |_ ║
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: ║ | | <| | | ║
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: ║ |__|_|__|__|\__|__|_|__|_______| ║
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: ║ ║
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: ║ ║
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: ║ The license key for the current software has been verified as ║
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: ║ belonging to: ║
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: ║ ║
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: ║ Chai Research Corp. ║
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: ║ Expiration: 2024-04-15 23:59:59 ║
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: ║ ║
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: loading model from thanhdaonguyen/thanhdaorp-2812
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:1067: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: warnings.warn(
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: config.json: 100%|██████████| 644/644 [00:00<00:00, 7.38MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:690: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: warnings.warn(
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: tokenizer_config.json: 100%|██████████| 916/916 [00:00<00:00, 11.9MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: tokenizer.model: 100%|██████████| 493k/493k [00:00<00:00, 12.7MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: tokenizer.json: 100%|██████████| 1.80M/1.80M [00:00<00:00, 35.2MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: special_tokens_map.json: 100%|██████████| 414/414 [00:00<00:00, 5.60MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:472: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: warnings.warn(
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model.safetensors.index.json: 100%|██████████| 23.9k/23.9k [00:00<00:00, 156MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: Downloading shards: 0%| | 0/3 [00:00<?, ?it/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 0%| | 0.00/4.94G [00:00<?, ?B/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 0%| | 10.5M/4.94G [00:00<01:39, 49.5MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 1%| | 52.4M/4.94G [00:00<00:26, 186MB/s] 
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 2%|▏ | 83.9M/4.94G [00:00<00:22, 216MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 3%|▎ | 168M/4.94G [00:00<00:11, 412MB/s] 
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 4%|▍ | 220M/4.94G [00:00<00:15, 310MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 5%|▌ | 262M/4.94G [00:01<00:24, 194MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 7%|▋ | 336M/4.94G [00:01<00:18, 254MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 11%|█▏ | 566M/4.94G [00:01<00:07, 603MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 23%|██▎ | 1.13G/4.94G [00:01<00:02, 1.59GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 28%|██▊ | 1.37G/4.94G [00:01<00:02, 1.41GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 32%|███▏ | 1.57G/4.94G [00:01<00:02, 1.32GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 35%|███▌ | 1.75G/4.94G [00:02<00:03, 882MB/s] 
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 38%|███▊ | 1.89G/4.94G [00:02<00:03, 838MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 43%|████▎ | 2.15G/4.94G [00:02<00:02, 1.11GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 51%|█████ | 2.52G/4.94G [00:02<00:01, 1.57GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 55%|█████▌ | 2.74G/4.94G [00:03<00:01, 1.27GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 59%|█████▉ | 2.92G/4.94G [00:03<00:01, 1.05GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 62%|██████▏ | 3.06G/4.94G [00:03<00:01, 1.09GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 65%|██████▍ | 3.21G/4.94G [00:03<00:01, 996MB/s] 
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 68%|██████▊ | 3.37G/4.94G [00:03<00:01, 1.10GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 72%|███████▏ | 3.58G/4.94G [00:03<00:01, 1.31GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 76%|███████▋ | 3.77G/4.94G [00:03<00:00, 1.45GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 80%|███████▉ | 3.94G/4.94G [00:04<00:00, 1.08GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 83%|████████▎ | 4.11G/4.94G [00:04<00:00, 1.19GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 86%|████████▌ | 4.26G/4.94G [00:04<00:00, 1.17GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00001-of-00003.safetensors: 100%|█████████▉| 4.94G/4.94G [00:04<00:00, 1.06GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: Downloading shards: 33%|███▎ | 1/3 [00:04<00:09, 4.93s/it]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00002-of-00003.safetensors: 0%| | 0.00/5.00G [00:00<?, ?B/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00002-of-00003.safetensors: 0%| | 10.5M/5.00G [00:00<03:48, 21.9MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00002-of-00003.safetensors: 0%| | 21.0M/5.00G [00:00<02:07, 39.2MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00002-of-00003.safetensors: 1%|▏ | 62.9M/5.00G [00:00<00:51, 95.4MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00002-of-00003.safetensors: 2%|▏ | 83.9M/5.00G [00:01<00:46, 106MB/s] 
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00002-of-00003.safetensors: 2%|▏ | 105M/5.00G [00:01<00:57, 84.8MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00002-of-00003.safetensors: 5%|▍ | 241M/5.00G [00:01<00:16, 293MB/s] 
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00002-of-00003.safetensors: 22%|██▏ | 1.12G/5.00G [00:01<00:02, 1.83GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: Downloading shards: 67%|██████▋ | 2/3 [00:09<00:04, 4.99s/it]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 0%| | 0.00/4.54G [00:00<?, ?B/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 0%| | 10.5M/4.54G [00:00<02:09, 35.1MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 1%| | 41.9M/4.54G [00:00<00:42, 105MB/s] 
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 2%|▏ | 94.4M/4.54G [00:01<00:53, 83.4MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 3%|▎ | 115M/4.54G [00:01<00:44, 98.4MB/s] 
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 6%|▋ | 294M/4.54G [00:01<00:11, 358MB/s] 
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 22%|██▏ | 1.01G/4.54G [00:01<00:02, 1.57GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 29%|██▊ | 1.30G/4.54G [00:01<00:02, 1.52GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 34%|███▍ | 1.54G/4.54G [00:01<00:01, 1.56GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 39%|███▉ | 1.76G/4.54G [00:02<00:03, 812MB/s] 
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 42%|████▏ | 1.93G/4.54G [00:02<00:03, 853MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 46%|████▋ | 2.11G/4.54G [00:02<00:02, 971MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 53%|█████▎ | 2.40G/4.54G [00:02<00:01, 1.29GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 59%|█████▊ | 2.66G/4.54G [00:02<00:01, 1.49GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 63%|██████▎ | 2.87G/4.54G [00:03<00:01, 1.07GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 67%|██████▋ | 3.04G/4.54G [00:03<00:01, 1.11GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 70%|███████ | 3.20G/4.54G [00:03<00:01, 1.08GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 74%|███████▍ | 3.36G/4.54G [00:03<00:01, 1.15GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 78%|███████▊ | 3.53G/4.54G [00:03<00:00, 1.27GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 82%|████████▏ | 3.74G/4.54G [00:03<00:00, 1.45GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 86%|████████▌ | 3.91G/4.54G [00:04<00:00, 1.29GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: model-00003-of-00003.safetensors: 100%|█████████▉| 4.54G/4.54G [00:04<00:00, 1.06GB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: Downloading shards: 100%|██████████| 3/3 [00:14<00:00, 4.83s/it]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: Loading checkpoint shards: 100%|██████████| 3/3 [00:02<00:00, 1.28it/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: generation_config.json: 100%|██████████| 111/111 [00:00<00:00, 1.40MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: loaded model in 18.340s
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: saved to disk in 21.183s
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: quantizing model to /tmp/model_cache
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: Saving mkml model at /tmp/model_cache
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: Reading /tmp/tmpxqi2bpw9/model.safetensors.index.json
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: Profiling: 100%|██████████| 291/291 [01:21<00:00, 3.57it/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: quantized model in 101.431s
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: Processed model thanhdaonguyen/thanhdaorp-2812 in 140.956s
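As a quick consistency check on the timings reported above, the per-stage durations (load, save, quantize) should roughly account for the total processing time. The values below are copied from the log; the check itself is a sketch, not part of the pipeline:

```python
# Stage timings reported by the mkmlizer job above (seconds).
load_s = 18.340       # "loaded model in 18.340s"
save_s = 21.183       # "saved to disk in 21.183s"
quantize_s = 101.431  # "quantized model in 101.431s"
total_s = 140.956     # "Processed model ... in 140.956s"

# The three stages account for the total to within a few milliseconds.
stage_gap = total_s - (load_s + save_s + quantize_s)
```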
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: creating bucket guanaco-mkml-models
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: uploading /tmp/model_cache to s3://guanaco-mkml-models/thanhdaonguyen-thanhdaorp-281-v2
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: cp /tmp/model_cache/tokenizer_config.json s3://guanaco-mkml-models/thanhdaonguyen-thanhdaorp-281-v2/tokenizer_config.json
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: cp /tmp/model_cache/special_tokens_map.json s3://guanaco-mkml-models/thanhdaonguyen-thanhdaorp-281-v2/special_tokens_map.json
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: cp /tmp/model_cache/config.json s3://guanaco-mkml-models/thanhdaonguyen-thanhdaorp-281-v2/config.json
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: cp /tmp/model_cache/tokenizer.model s3://guanaco-mkml-models/thanhdaonguyen-thanhdaorp-281-v2/tokenizer.model
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: cp /tmp/model_cache/tokenizer.json s3://guanaco-mkml-models/thanhdaonguyen-thanhdaorp-281-v2/tokenizer.json
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: loading reward model from ChaiML/reward_models_100_170000000_cp_498032
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:1067: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: warnings.warn(
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: config.json: 100%|██████████| 1.03k/1.03k [00:00<00:00, 9.22MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:690: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: warnings.warn(
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: tokenizer_config.json: 100%|██████████| 234/234 [00:00<00:00, 2.86MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: vocab.json: 100%|██████████| 798k/798k [00:00<00:00, 10.8MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: merges.txt: 100%|██████████| 456k/456k [00:00<00:00, 48.2MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: tokenizer.json: 100%|██████████| 2.11M/2.11M [00:00<00:00, 44.1MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: special_tokens_map.json: 100%|██████████| 99.0/99.0 [00:00<00:00, 1.36MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:472: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: warnings.warn(
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: pytorch_model.bin: 100%|█████████▉| 510M/510M [00:00<00:00, 750MB/s]
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: Saving duration: 0.122s
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: Processed model ChaiML/reward_models_100_170000000_cp_498032 in 3.187s
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: creating bucket guanaco-reward-models
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: Bucket 's3://guanaco-reward-models/' created
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/thanhdaonguyen-thanhdaorp-281-v2_reward
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/thanhdaonguyen-thanhdaorp-281-v2_reward/special_tokens_map.json
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/thanhdaonguyen-thanhdaorp-281-v2_reward/config.json
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/thanhdaonguyen-thanhdaorp-281-v2_reward/tokenizer_config.json
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/thanhdaonguyen-thanhdaorp-281-v2_reward/merges.txt
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/thanhdaonguyen-thanhdaorp-281-v2_reward/vocab.json
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/thanhdaonguyen-thanhdaorp-281-v2_reward/tokenizer.json
thanhdaonguyen-thanhdaorp-281-v2-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/thanhdaonguyen-thanhdaorp-281-v2_reward/reward.tensors
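The reward model uploaded above pairs with the `best_of: 8` setting: several candidate completions are sampled and the one the reward model scores highest is kept. A minimal sketch of that reranking step, where `score` is a stand-in for the real reward model (which this log does not reproduce):

```python
# Minimal best-of-n reranking sketch. `score` stands in for the reward
# model; with best_of=8 the pipeline would score 8 sampled completions.
def pick_best(candidates, score):
    """Return the candidate completion with the highest score."""
    return max(candidates, key=score)

# Toy usage with a hypothetical scoring rule (longer replies score higher).
replies = ["Hi.", "Hello there!", "Ahoy, matey! How fares the sea today?"]
best = pick_best(replies, score=len)
```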
Job thanhdaonguyen-thanhdaorp-281-v2-mkmlizer completed after 184.15s with status: succeeded
Stopping job with name thanhdaonguyen-thanhdaorp-281-v2-mkmlizer
Pipeline stage MKMLizer completed in 189.79s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.17s
Running pipeline stage ISVCDeployer
Creating inference service thanhdaonguyen-thanhdaorp-281-v2
Waiting for inference service thanhdaonguyen-thanhdaorp-281-v2 to be ready
Inference service thanhdaonguyen-thanhdaorp-281-v2 ready after 40.3695113658905s
Pipeline stage ISVCDeployer completed in 48.21s
Running pipeline stage StressChecker
Received healthy response to inference request with status code 200 in 2.3492188453674316s
Received healthy response to inference request with status code 200 in 1.3956687450408936s
Received healthy response to inference request with status code 200 in 1.440493106842041s
Received healthy response to inference request with status code 200 in 1.3822202682495117s
Received healthy response to inference request with status code 200 in 1.3424830436706543s
Received healthy response to inference request with status code 200 in 1.3707740306854248s
Received healthy response to inference request with status code 200 in 1.3910903930664062s
Received healthy response to inference request with status code 200 in 1.196009874343872s
Received healthy response to inference request with status code 200 in 1.3893134593963623s
Received healthy response to inference request with status code 200 in 1.514636516571045s
Received healthy response to inference request with status code 200 in 1.4169731140136719s
Received healthy response to inference request with status code 200 in 1.3809421062469482s
Received healthy response to inference request with status code 200 in 1.397322654724121s
Received healthy response to inference request with status code 200 in 1.4988493919372559s
Received healthy response to inference request with status code 200 in 1.3727595806121826s
Received healthy response to inference request with status code 200 in 1.3633031845092773s
Received healthy response to inference request with status code 200 in 2.125699758529663s
Received healthy response to inference request with status code 200 in 1.4922871589660645s
Received healthy response to inference request with status code 200 in 1.391878366470337s
Received healthy response to inference request with status code 200 in 1.4562056064605713s
Received healthy response to inference request with status code 200 in 1.4016849994659424s
Received healthy response to inference request with status code 200 in 1.4274630546569824s
Received healthy response to inference request with status code 200 in 1.421264886856079s
Received healthy response to inference request with status code 200 in 1.2927265167236328s
Received healthy response to inference request with status code 200 in 1.4298200607299805s
Received healthy response to inference request with status code 200 in 1.4522273540496826s
Received healthy response to inference request with status code 200 in 1.4409842491149902s
Received healthy response to inference request with status code 200 in 1.4382860660552979s
Received healthy response to inference request with status code 200 in 1.4835448265075684s
Received healthy response to inference request with status code 200 in 1.4554426670074463s
Received healthy response to inference request with status code 200 in 1.4421303272247314s
Received healthy response to inference request with status code 200 in 1.4955720901489258s
Received healthy response to inference request with status code 200 in 1.4948041439056396s
Received healthy response to inference request with status code 200 in 1.4178650379180908s
Received healthy response to inference request with status code 200 in 1.4129416942596436s
Received healthy response to inference request with status code 200 in 1.490776538848877s
Received healthy response to inference request with status code 200 in 1.4213640689849854s
Received healthy response to inference request with status code 200 in 1.453080654144287s
Received healthy response to inference request with status code 200 in 1.4902279376983643s
Received healthy response to inference request with status code 200 in 1.4192893505096436s
Received healthy response to inference request with status code 200 in 1.6769299507141113s
Received healthy response to inference request with status code 200 in 1.452077865600586s
Received healthy response to inference request with status code 200 in 1.3521416187286377s
Received healthy response to inference request with status code 200 in 1.480215311050415s
Received healthy response to inference request with status code 200 in 1.474172592163086s
Received healthy response to inference request with status code 200 in 1.502370834350586s
Received healthy response to inference request with status code 200 in 1.4880561828613281s
Received healthy response to inference request with status code 200 in 1.5373361110687256s
Received healthy response to inference request with status code 200 in 1.51991605758667s
Received healthy response to inference request with status code 200 in 1.4761643409729004s
Received healthy response to inference request with status code 200 in 1.4617893695831299s
Received healthy response to inference request with status code 200 in 1.1537187099456787s
Received healthy response to inference request with status code 200 in 1.3454687595367432s
Received healthy response to inference request with status code 200 in 1.07265043258667s
Received healthy response to inference request with status code 200 in 1.1414532661437988s
Received healthy response to inference request with status code 200 in 1.4795680046081543s
Received healthy response to inference request with status code 200 in 1.4632675647735596s
Received healthy response to inference request with status code 200 in 1.3356292247772217s
Received healthy response to inference request with status code 200 in 1.1221463680267334s
Received healthy response to inference request with status code 200 in 1.4369375705718994s
Received healthy response to inference request with status code 200 in 1.4587688446044922s
Received healthy response to inference request with status code 200 in 1.4475250244140625s
Received healthy response to inference request with status code 200 in 1.4660882949829102s
Received healthy response to inference request with status code 200 in 1.473196268081665s
Received healthy response to inference request with status code 200 in 1.479646921157837s
Received healthy response to inference request with status code 200 in 1.119788646697998s
Received healthy response to inference request with status code 200 in 1.480651617050171s
Received healthy response to inference request with status code 200 in 1.381201982498169s
Received healthy response to inference request with status code 200 in 1.5070526599884033s
Received healthy response to inference request with status code 200 in 1.2251081466674805s
Received healthy response to inference request with status code 200 in 1.1866824626922607s
Received healthy response to inference request with status code 200 in 1.4534611701965332s
Received healthy response to inference request with status code 200 in 1.097769021987915s
Received healthy response to inference request with status code 200 in 1.476773738861084s
Received healthy response to inference request with status code 200 in 1.4675898551940918s
Received healthy response to inference request with status code 200 in 1.403599739074707s
Received healthy response to inference request with status code 200 in 1.4782230854034424s
Received healthy response to inference request with status code 200 in 1.4850740432739258s
Received healthy response to inference request with status code 200 in 1.4722115993499756s
Received healthy response to inference request with status code 200 in 1.5012907981872559s
Received healthy response to inference request with status code 200 in 1.4858241081237793s
Received healthy response to inference request with status code 200 in 1.4603760242462158s
Received healthy response to inference request with status code 200 in 1.4455735683441162s
Received healthy response to inference request with status code 200 in 1.461296796798706s
Received healthy response to inference request with status code 200 in 1.4817006587982178s
Received healthy response to inference request with status code 200 in 1.494518518447876s
Received healthy response to inference request with status code 200 in 1.4793124198913574s
Received healthy response to inference request with status code 200 in 1.5057775974273682s
Received healthy response to inference request with status code 200 in 1.4494504928588867s
Received healthy response to inference request with status code 200 in 1.498295545578003s
Received healthy response to inference request with status code 200 in 1.5135431289672852s
Received healthy response to inference request with status code 200 in 1.5052340030670166s
Received healthy response to inference request with status code 200 in 1.0561375617980957s
Received healthy response to inference request with status code 200 in 1.2031288146972656s
Received healthy response to inference request with status code 200 in 1.2008659839630127s
Received healthy response to inference request with status code 200 in 1.2451624870300293s
Received healthy response to inference request with status code 200 in 1.346299648284912s
Received healthy response to inference request with status code 200 in 1.149526834487915s
Received healthy response to inference request with status code 200 in 0.9913637638092041s
Received healthy response to inference request with status code 200 in 1.312239170074463s
100 requests
0 failed requests
5th percentile: 1.1220284819602966
10th percentile: 1.195077133178711
20th percentile: 1.3461334705352783
30th percentile: 1.3916419744491577
40th percentile: 1.4213243961334228
50th percentile: 1.4484877586364746
60th percentile: 1.4614938259124757
70th percentile: 1.478549885749817
80th percentile: 1.4884905338287353
90th percentile: 1.5026571512222289
95th percentile: 1.5149004936218262
99th percentile: 2.127934949398042
mean time: 1.4176896691322327
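The percentile summary above can be reproduced from the raw per-request timings. A minimal sketch, assuming NumPy-style linear interpolation over the collected latencies (the `summarize_latencies` helper is illustrative, not the StressChecker's actual code, and the exact figures also depend on the requests logged before this excerpt):

```python
import numpy as np

def summarize_latencies(latencies):
    """Compute the same summary the StressChecker reports:
    selected percentiles plus the mean, over per-request seconds."""
    arr = np.asarray(latencies, dtype=float)
    pcts = [5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99]
    summary = {f"{p}th percentile": float(np.percentile(arr, p)) for p in pcts}
    summary["mean time"] = float(arr.mean())
    return summary

# Hypothetical example with a handful of timings (not the real 100-sample run):
stats = summarize_latencies([1.36, 2.13, 1.49, 1.39, 1.46, 1.40])
print(stats["mean time"])
```

Note that with 100 samples and linear interpolation, the 99th percentile sits between the two slowest requests, which is why it lands near the single ~2.13 s outlier while the mean stays around 1.42 s.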
Pipeline stage StressChecker completed in 149.69s
Running pipeline stage SafetyScorer
Pipeline stage SafetyScorer completed in 38.06s
Running pipeline stage MEvalScorer
Running M-Eval for topic stay_in_character
Pipeline stage MEvalScorer completed in 380.21s
thanhdaonguyen-thanhdaorp-281_v2 status is now inactive due to auto-deactivation of underperforming models

admin requested tearing down of thanhdaonguyen-thanhdaorp-281_v2
Running pipeline stage ISVCDeleter
Checking if service thanhdaonguyen-thanhdaorp-281-v2 is running
Tearing down inference service thanhdaonguyen-thanhdaorp-281-v2
Tore down service thanhdaonguyen-thanhdaorp-281-v2
Pipeline stage ISVCDeleter completed in 4.36s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key thanhdaonguyen-thanhdaorp-281-v2/config.json from bucket guanaco-mkml-models
Deleting key thanhdaonguyen-thanhdaorp-281-v2/mkml_model.tensors from bucket guanaco-mkml-models
Deleting key thanhdaonguyen-thanhdaorp-281-v2/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key thanhdaonguyen-thanhdaorp-281-v2/tokenizer.json from bucket guanaco-mkml-models
Deleting key thanhdaonguyen-thanhdaorp-281-v2/tokenizer.model from bucket guanaco-mkml-models
Deleting key thanhdaonguyen-thanhdaorp-281-v2/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key thanhdaonguyen-thanhdaorp-281-v2_reward/config.json from bucket guanaco-reward-models
Deleting key thanhdaonguyen-thanhdaorp-281-v2_reward/merges.txt from bucket guanaco-reward-models
Deleting key thanhdaonguyen-thanhdaorp-281-v2_reward/reward.tensors from bucket guanaco-reward-models
Deleting key thanhdaonguyen-thanhdaorp-281-v2_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key thanhdaonguyen-thanhdaorp-281-v2_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key thanhdaonguyen-thanhdaorp-281-v2_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key thanhdaonguyen-thanhdaorp-281-v2_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 2.65s
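The two cleanup passes above delete a fixed set of per-model artifacts from the model and reward buckets. A minimal sketch of how those key lists might be assembled from a service name (the helper and file lists are assumptions inferred from this log, not the pipeline's actual code; the real deleter would then remove each key via an S3 client):

```python
# Hypothetical helper: enumerate the S3 keys the MKMLModelDeleter removes.
MODEL_FILES = [
    "config.json", "mkml_model.tensors", "special_tokens_map.json",
    "tokenizer.json", "tokenizer.model", "tokenizer_config.json",
]
REWARD_FILES = [
    "config.json", "merges.txt", "reward.tensors",
    "special_tokens_map.json", "tokenizer.json",
    "tokenizer_config.json", "vocab.json",
]

def cleanup_keys(service_name):
    """Return (bucket, key) pairs for one service's model and reward artifacts."""
    model = [("guanaco-mkml-models", f"{service_name}/{f}") for f in MODEL_FILES]
    reward = [("guanaco-reward-models", f"{service_name}_reward/{f}")
              for f in REWARD_FILES]
    return model + reward

for bucket, key in cleanup_keys("thanhdaonguyen-thanhdaorp-281-v2"):
    print(f"Deleting key {key} from bucket {bucket}")
```

Keeping the file lists declarative like this makes the deletion idempotent: rerunning the stage just re-issues the same fixed set of deletes.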
thanhdaonguyen-thanhdaorp-281_v2 status is now torndown due to DeploymentManager action