submission_id: anhnv125-llama-op-v17-1_v35
developer_uid: robert_irvine
status: inactive
model_repo: anhnv125/llama-op-v17.1
reward_repo: rirv938/reward_gpt2_medium_preference_24m_e2
generation_params: {'temperature': 1.1, 'top_p': 1.0, 'top_k': 20, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '</s>', '<|im_end|>'], 'max_input_tokens': 1024, 'best_of': 4, 'max_output_tokens': 64}
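The `best_of: 4` setting above typically means the pipeline samples several candidate completions and keeps the one the reward model scores highest. A minimal sketch of that selection step, where `generate_candidates` and `reward_score` are hypothetical stand-ins and not the pipeline's real API:

```python
# Hedged sketch of best-of-N reranking: sample N candidates, then keep the
# candidate with the highest reward-model score. Helpers are stand-ins only.

def generate_candidates(prompt, n, params):
    # Stand-in for the real sampler; returns n candidate completions.
    return [f"candidate-{i}" for i in range(n)]

def reward_score(prompt, completion):
    # Stand-in for the reward model (e.g. a GPT-2 preference head).
    # Here the score is just the candidate's trailing index, for illustration.
    return float(completion.rsplit("-", 1)[-1])

def best_of_n(prompt, params):
    n = params.get("best_of", 1)
    candidates = generate_candidates(prompt, n, params)
    return max(candidates, key=lambda c: reward_score(prompt, c))

params = {"temperature": 1.1, "top_p": 1.0, "top_k": 20,
          "best_of": 4, "max_output_tokens": 64}
print(best_of_n("User: hi", params))  # → candidate-3
```

In the real pipeline the candidates would come from the deployed language model and the scores from the reward repo listed above; only the argmax-over-candidates shape is shown here.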
formatter: {'memory_template': "### Instruction:\n As the assistant, your task is to become the assigned character, weaving engaging stories that fully embrace their personality and background. Ensure your responses capture the essence of the character's traits accurately, immersing users in emotional, suspenseful, and anticipatory narratives. Craft more detailed and descriptive replies to enhance the vividness of the story. Foster an interactive environment by introducing new elements, providing choices, or posing questions to encourage active user participation. Think of the conversation as a continuous dance, always unfolding and evolving.\nYour character: {bot_name}.\nContext:{memory}\n", 'prompt_template': 'Example conversation:\n{prompt}\n', 'bot_template': '### Response:\n{bot_name}: {message}\n', 'user_template': '### Input:\n{user_name}: {message}\n', 'response_template': '### Response:\n{bot_name}:'}
reward_formatter: {'memory_template': 'Memory: {memory}\n', 'prompt_template': '{prompt}\n', 'bot_template': 'Bot: {message}\n', 'user_template': 'User: {message}\n', 'response_template': 'Bot:'}
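Both template sets above are plain Python format strings that get filled per message and concatenated. A minimal sketch of how the `reward_formatter` templates would assemble a scoring prompt (the `format_conversation` helper is illustrative, not the pipeline's actual code):

```python
# Sketch: assemble a reward-model prompt from the reward_formatter templates.
reward_formatter = {
    "memory_template": "Memory: {memory}\n",
    "prompt_template": "{prompt}\n",
    "bot_template": "Bot: {message}\n",
    "user_template": "User: {message}\n",
    "response_template": "Bot:",
}

def format_conversation(memory, prompt, turns):
    # turns: list of (speaker, message) pairs, speaker in {"user", "bot"}
    out = reward_formatter["memory_template"].format(memory=memory)
    out += reward_formatter["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        out += reward_formatter[f"{speaker}_template"].format(message=message)
    # The response template is the unfilled prefix the model completes.
    return out + reward_formatter["response_template"]

text = format_conversation("A pirate captain.", "Example chat.",
                           [("user", "Ahoy!"), ("bot", "Ahoy, matey!")])
print(text)
```

The `formatter` set works the same way, only with the `### Instruction:` / `### Input:` / `### Response:` wrappers shown above around each slot.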
timestamp: 2024-03-05T23:07:19+00:00
model_name: anhnv125-llama-op-v17-1_v35
model_eval_status: success
safety_score: 0.97
entertaining: 7.02
stay_in_character: 8.67
user_preference: 7.62
double_thumbs_up: 2843
thumbs_up: 4182
thumbs_down: 1763
num_battles: 176348
num_wins: 85743
win_ratio: 0.4862147571846576
celo_rating: 1144.51
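The aggregate statistics above are internally consistent: `win_ratio` is simply `num_wins / num_battles`. A quick sanity check:

```python
# Verify the reported win_ratio from the battle counts above.
num_wins, num_battles = 85743, 176348
win_ratio = num_wins / num_battles
assert abs(win_ratio - 0.4862147571846576) < 1e-9
print(round(win_ratio, 6))  # → 0.486215
```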
Running pipeline stage MKMLizer
Starting job with name anhnv125-llama-op-v17-1-v35-mkmlizer
Waiting for job on anhnv125-llama-op-v17-1-v35-mkmlizer to finish
anhnv125-llama-op-v17-1-v35-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
anhnv125-llama-op-v17-1-v35-mkmlizer: ║ _____ __ __ ║
anhnv125-llama-op-v17-1-v35-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
anhnv125-llama-op-v17-1-v35-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
anhnv125-llama-op-v17-1-v35-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
anhnv125-llama-op-v17-1-v35-mkmlizer: ║ /___/ ║
anhnv125-llama-op-v17-1-v35-mkmlizer: ║ ║
anhnv125-llama-op-v17-1-v35-mkmlizer: ║ Version: 0.6.11 ║
anhnv125-llama-op-v17-1-v35-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
anhnv125-llama-op-v17-1-v35-mkmlizer: ║ ║
anhnv125-llama-op-v17-1-v35-mkmlizer: ║ The license key for the current software has been verified as ║
anhnv125-llama-op-v17-1-v35-mkmlizer: ║ belonging to: ║
anhnv125-llama-op-v17-1-v35-mkmlizer: ║ ║
anhnv125-llama-op-v17-1-v35-mkmlizer: ║ Chai Research Corp. ║
anhnv125-llama-op-v17-1-v35-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
anhnv125-llama-op-v17-1-v35-mkmlizer: ║ Expiration: 2024-04-15 23:59:59 ║
anhnv125-llama-op-v17-1-v35-mkmlizer: ║ ║
anhnv125-llama-op-v17-1-v35-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
anhnv125-llama-op-v17-1-v35-mkmlizer: .gitattributes: 100%|██████████| 1.52k/1.52k [00:00<00:00, 11.0MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: added_tokens.json: 100%|██████████| 21.0/21.0 [00:00<00:00, 272kB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: config.json: 100%|██████████| 654/654 [00:00<00:00, 6.15MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: model-00001-of-00013.safetensors: 100%|█████████▉| 2.09G/2.09G [00:04<00:00, 439MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: model-00002-of-00013.safetensors: 100%|█████████▉| 2.04G/2.04G [00:04<00:00, 440MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: model-00003-of-00013.safetensors: 100%|█████████▉| 2.06G/2.06G [00:05<00:00, 396MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: model-00004-of-00013.safetensors: 100%|█████████▉| 1.96G/1.96G [00:04<00:00, 400MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: model-00005-of-00013.safetensors: 100%|█████████▉| 2.04G/2.04G [00:05<00:00, 404MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: model-00006-of-00013.safetensors: 100%|█████████▉| 2.04G/2.04G [00:05<00:00, 373MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: model-00007-of-00013.safetensors: 100%|█████████▉| 2.04G/2.04G [00:03<00:00, 520MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: model-00008-of-00013.safetensors: 100%|█████████▉| 2.06G/2.06G [00:05<00:00, 363MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: model-00009-of-00013.safetensors: 100%|█████████▉| 1.96G/1.96G [00:05<00:00, 371MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: model-00010-of-00013.safetensors: 100%|█████████▉| 2.04G/2.04G [00:04<00:00, 418MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: model-00011-of-00013.safetensors: 100%|█████████▉| 2.04G/2.04G [00:05<00:00, 397MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: model-00012-of-00013.safetensors: 100%|█████████▉| 2.04G/2.04G [00:05<00:00, 376MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: model-00013-of-00013.safetensors: 100%|█████████▉| 1.60G/1.60G [00:04<00:00, 393MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: model.safetensors.index.json: 100%|██████████| 29.9k/29.9k [00:00<00:00, 103MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: special_tokens_map.json: 100%|██████████| 438/438 [00:00<00:00, 3.53MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: tokenizer.model: 100%|██████████| 500k/500k [00:00<00:00, 50.1MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: tokenizer_config.json: 100%|██████████| 749/749 [00:00<00:00, 6.68MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: Downloaded to shared memory in 71.218s
anhnv125-llama-op-v17-1-v35-mkmlizer: quantizing model to /dev/shm/model_cache
anhnv125-llama-op-v17-1-v35-mkmlizer: Saving mkml model at /dev/shm/model_cache
anhnv125-llama-op-v17-1-v35-mkmlizer: Reading /tmp/tmpb70c4t60/model.safetensors.index.json
anhnv125-llama-op-v17-1-v35-mkmlizer: Profiling: 100%|██████████| 363/363 [00:11<00:00, 31.24it/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: quantized model in 30.114s
anhnv125-llama-op-v17-1-v35-mkmlizer: Processed model anhnv125/llama-op-v17.1 in 103.549s
anhnv125-llama-op-v17-1-v35-mkmlizer: creating bucket guanaco-mkml-models
anhnv125-llama-op-v17-1-v35-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
anhnv125-llama-op-v17-1-v35-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/anhnv125-llama-op-v17-1-v35
anhnv125-llama-op-v17-1-v35-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/anhnv125-llama-op-v17-1-v35/config.json
anhnv125-llama-op-v17-1-v35-mkmlizer: cp /dev/shm/model_cache/added_tokens.json s3://guanaco-mkml-models/anhnv125-llama-op-v17-1-v35/added_tokens.json
anhnv125-llama-op-v17-1-v35-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/anhnv125-llama-op-v17-1-v35/tokenizer_config.json
anhnv125-llama-op-v17-1-v35-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/anhnv125-llama-op-v17-1-v35/tokenizer.json
anhnv125-llama-op-v17-1-v35-mkmlizer: cp /dev/shm/model_cache/tokenizer.model s3://guanaco-mkml-models/anhnv125-llama-op-v17-1-v35/tokenizer.model
anhnv125-llama-op-v17-1-v35-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/anhnv125-llama-op-v17-1-v35/special_tokens_map.json
anhnv125-llama-op-v17-1-v35-mkmlizer: loading reward model from rirv938/reward_gpt2_medium_preference_24m_e2
anhnv125-llama-op-v17-1-v35-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:1067: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
anhnv125-llama-op-v17-1-v35-mkmlizer: warnings.warn(
anhnv125-llama-op-v17-1-v35-mkmlizer: config.json: 100%|██████████| 1.05k/1.05k [00:00<00:00, 11.4MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:690: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
anhnv125-llama-op-v17-1-v35-mkmlizer: warnings.warn(
anhnv125-llama-op-v17-1-v35-mkmlizer: tokenizer_config.json: 100%|██████████| 234/234 [00:00<00:00, 2.45MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: vocab.json: 100%|██████████| 1.04M/1.04M [00:00<00:00, 29.5MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: tokenizer.json: 100%|██████████| 2.11M/2.11M [00:00<00:00, 54.7MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:472: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
anhnv125-llama-op-v17-1-v35-mkmlizer: warnings.warn(
anhnv125-llama-op-v17-1-v35-mkmlizer: pytorch_model.bin: 100%|█████████▉| 1.44G/1.44G [00:01<00:00, 737MB/s]
anhnv125-llama-op-v17-1-v35-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
anhnv125-llama-op-v17-1-v35-mkmlizer: Saving duration: 0.290s
anhnv125-llama-op-v17-1-v35-mkmlizer: Processed model rirv938/reward_gpt2_medium_preference_24m_e2 in 6.611s
anhnv125-llama-op-v17-1-v35-mkmlizer: creating bucket guanaco-reward-models
Job anhnv125-llama-op-v17-1-v35-mkmlizer completed after 138.04s with status: succeeded
Stopping job with name anhnv125-llama-op-v17-1-v35-mkmlizer
Pipeline stage MKMLizer completed in 145.72s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.21s
Running pipeline stage ISVCDeployer
Creating inference service anhnv125-llama-op-v17-1-v35
Waiting for inference service anhnv125-llama-op-v17-1-v35 to be ready
Inference service anhnv125-llama-op-v17-1-v35 ready after 50.38667392730713s
Pipeline stage ISVCDeployer completed in 61.56s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.1930201053619385s
Received healthy response to inference request in 1.8787600994110107s
Received healthy response to inference request in 1.7163758277893066s
Received healthy response to inference request in 1.8083300590515137s
Received healthy response to inference request in 1.8714442253112793s
5 requests
0 failed requests
5th percentile: 1.734766674041748
10th percentile: 1.7531575202941894
20th percentile: 1.7899392127990723
30th percentile: 1.8209528923034668
40th percentile: 1.846198558807373
50th percentile: 1.8714442253112793
60th percentile: 1.8743705749511719
70th percentile: 1.8772969245910645
80th percentile: 1.9416121006011964
90th percentile: 2.067316102981567
95th percentile: 2.130168104171753
99th percentile: 2.1804497051239013
mean time: 1.8935860633850097
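The percentiles above are consistent with linear interpolation over the five sorted latencies, which matches NumPy's default `percentile` method. A sketch reproducing the reported 5th/50th percentiles and mean:

```python
import numpy as np

# The five stress-check latencies, in seconds, as logged above.
times = [2.1930201053619385, 1.8787600994110107, 1.7163758277893066,
         1.8083300590515137, 1.8714442253112793]

p5 = np.percentile(times, 5)    # linear interpolation between order statistics
p50 = np.percentile(times, 50)  # the median, i.e. the middle sorted value
mean = float(np.mean(times))
print(p5, p50, mean)
```

With n = 5 samples, the 5th percentile lands at fractional rank 0.05 × (5 − 1) = 0.2, i.e. 20% of the way between the two fastest responses, which gives the 1.7347... value in the log.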
Pipeline stage StressChecker completed in 10.70s
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.06s
Running pipeline stage DaemonicSafetyScorer
Running M-Eval for topic stay_in_character
Pipeline stage DaemonicSafetyScorer completed in 0.11s
M-Eval Dataset for topic stay_in_character is loaded
AUTO_DEACTIVATION: submission %s deactivated %s
anhnv125-llama-op-v17-1_v35 status is now inactive due to auto-deactivation (underperforming models are removed)
anhnv125-llama-op-v17-1_v35 status is now deployed due to admin request
anhnv125-llama-op-v17-1_v35 status is now inactive due to auto-deactivation (underperforming models are removed)

Usage Metrics

Latency Metrics