developer_uid: Fizzarolli
submission_id: fizzarolli-llama-3-lust-_8388_v1
model_name: fizzarolli-llama-3-lust-_8388_v1
model_group: Fizzarolli/llama-3-lust-
status: rejected
timestamp: 2024-04-19T01:17:58+00:00
num_battles: 116
num_wins: 44
family_friendly_score: 0.0
submission_type: basic
model_repo: Fizzarolli/llama-3-lust-8b-v0.1
model_architecture: LlamaForCausalLM
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
model_num_parameters: 8030261248.0
best_of: 8
max_input_tokens: 512
max_output_tokens: 64
display_name: fizzarolli-llama-3-lust-_8388_v1
ineligible_reason: model is not deployable
is_internal_developer: False
language_model: Fizzarolli/llama-3-lust-8b-v0.1
model_size: 8B
ranking_group: single
us_pacific_date: 2024-04-18
win_ratio: 0.3793103448275862
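For clarity, the win_ratio above is simply num_wins divided by num_battles; a one-line check (plain Python, not part of the platform's code):

print(44 / 116)  # num_wins / num_battles -> 0.3793103448275862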
generation_params: {'temperature': 1.0, 'top_p': 0.85, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.1, 'frequency_penalty': 0.1, 'stopping_words': ['\n', '</s>'], 'max_input_tokens': 512, 'best_of': 8, 'max_output_tokens': 64}
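A minimal sketch of how these generation_params could be mapped onto a sampling call, assuming a vLLM-style backend. The backend choice, the toy prompt_text, and the reward-model re-ranking step are assumptions; the platform's own MKML serving stack is not public.

from vllm import LLM, SamplingParams

# Toy prompt written in the formatter's syntax (illustrative only).
prompt_text = "<|message|>User\nhi there</s>\n<|message|>Bot\n"

# Mirror the generation_params recorded above.
sampling = SamplingParams(
    n=8,                      # best_of: request 8 candidate completions
    temperature=1.0,
    top_p=0.85,
    min_p=0.0,
    top_k=40,
    presence_penalty=0.1,
    frequency_penalty=0.1,
    stop=["\n", "</s>"],      # stopping_words
    max_tokens=64,            # max_output_tokens
)

llm = LLM(model="Fizzarolli/llama-3-lust-8b-v0.1")
candidates = llm.generate([prompt_text], sampling)[0].outputs
# In the Chai pipeline the separate reward model
# (ChaiML/reward_gpt2_medium_preference_24m_e2) reportedly re-ranks these
# candidates and the best-scoring one is served.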
formatter: {'memory_template': '<|description|>{bot_name}\n{memory}</s>\n', 'prompt_template': '<|message|>{user_name}\n{prompt}</s>\n<|message|>{bot_name}\n', 'bot_template': '<|message|>{bot_name}\n{message}</s>\n', 'user_template': '<|message|>{user_name}\n{message}</s>\n', 'response_template': '<|message|>{bot_name}\n', 'truncate_by_message': False}
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'user_template': '{user_name}: {message}\n'}
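A minimal sketch of how the formatter templates above could be applied to build the prompt string sent to the language model. The build_prompt helper is illustrative, not the platform's actual code; prompt_template handling (for the scenario's opening prompt) and truncation to max_input_tokens are omitted. The reward_formatter would be applied the same way with its own templates to build the text scored by the reward model.

def build_prompt(formatter, bot_name, user_name, memory, history):
    """history is a list of (speaker, message) tuples, oldest first."""
    parts = [formatter["memory_template"].format(bot_name=bot_name, memory=memory)]
    for speaker, message in history:
        template = formatter["bot_template"] if speaker == bot_name else formatter["user_template"]
        parts.append(template.format(bot_name=bot_name, user_name=user_name, message=message))
    # Leave the response open so the model completes the bot's next turn.
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)

formatter = {
    "memory_template": "<|description|>{bot_name}\n{memory}</s>\n",
    "user_template": "<|message|>{user_name}\n{message}</s>\n",
    "bot_template": "<|message|>{bot_name}\n{message}</s>\n",
    "response_template": "<|message|>{bot_name}\n",
}
print(build_prompt(formatter, "Bot", "User", "A playful assistant.",
                   [("User", "hi there"), ("Bot", "hello!"), ("User", "how are you?")]))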
model_eval_status: success
Resubmit model
Running pipeline stage MKMLizer
Starting job with name fizzarolli-llama-3-lust-8388-v1-mkmlizer
Waiting for job on fizzarolli-llama-3-lust-8388-v1-mkmlizer to finish
fizzarolli-llama-3-lust-8388-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
fizzarolli-llama-3-lust-8388-v1-mkmlizer: ║ _____ __ __ ║
fizzarolli-llama-3-lust-8388-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
fizzarolli-llama-3-lust-8388-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
fizzarolli-llama-3-lust-8388-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
fizzarolli-llama-3-lust-8388-v1-mkmlizer: ║ /___/ ║
fizzarolli-llama-3-lust-8388-v1-mkmlizer: ║ ║
fizzarolli-llama-3-lust-8388-v1-mkmlizer: ║ Version: 0.6.11 ║
fizzarolli-llama-3-lust-8388-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
fizzarolli-llama-3-lust-8388-v1-mkmlizer: ║ ║
fizzarolli-llama-3-lust-8388-v1-mkmlizer: ║ The license key for the current software has been verified as ║
fizzarolli-llama-3-lust-8388-v1-mkmlizer: ║ belonging to: ║
fizzarolli-llama-3-lust-8388-v1-mkmlizer: ║ ║
fizzarolli-llama-3-lust-8388-v1-mkmlizer: ║ Chai Research Corp. ║
fizzarolli-llama-3-lust-8388-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
fizzarolli-llama-3-lust-8388-v1-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
fizzarolli-llama-3-lust-8388-v1-mkmlizer: ║ ║
fizzarolli-llama-3-lust-8388-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
fizzarolli-llama-3-lust-8388-v1-mkmlizer: .gitattributes: 100%|██████████| 1.52k/1.52k [00:00<00:00, 15.8MB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: LICENSE: 100%|██████████| 7.80k/7.80k [00:00<00:00, 69.2MB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: NOTICE: 100%|██████████| 118/118 [00:00<00:00, 1.18MB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: config.json: 100%|██████████| 689/689 [00:00<00:00, 7.10MB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: generation_config.json: 100%|██████████| 121/121 [00:00<00:00, 866kB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: log.txt: 100%|██████████| 34.0/34.0 [00:00<00:00, 389kB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: model-00001-of-00007.safetensors: 100%|█████████▉| 4.89G/4.89G [00:08<00:00, 549MB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: model-00002-of-00007.safetensors: 100%|█████████▉| 4.83G/4.83G [00:07<00:00, 605MB/s]
Connection pool is full, discarding connection: %s
fizzarolli-llama-3-lust-8388-v1-mkmlizer: model-00003-of-00007.safetensors: 100%|█████████▉| 5.00G/5.00G [00:09<00:00, 517MB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: model-00004-of-00007.safetensors: 100%|█████████▉| 5.00G/5.00G [00:09<00:00, 555MB/s]
Connection pool is full, discarding connection: %s
fizzarolli-llama-3-lust-8388-v1-mkmlizer: model-00005-of-00007.safetensors: 100%|█████████▉| 4.83G/4.83G [00:09<00:00, 533MB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: model-00006-of-00007.safetensors: 100%|█████████▉| 5.00G/5.00G [00:10<00:00, 481MB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: model-00007-of-00007.safetensors: 100%|█████████▉| 2.57G/2.57G [00:08<00:00, 312MB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: model.safetensors.index.json: 100%|██████████| 23.9k/23.9k [00:00<00:00, 6.31MB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: special_tokens_map.json: 100%|██████████| 73.0/73.0 [00:00<00:00, 812kB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: tokenizer.json: 100%|██████████| 9.08M/9.08M [00:00<00:00, 24.4MB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: tokenizer_config.json: 100%|██████████| 50.9k/50.9k [00:00<00:00, 273MB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: Downloaded to shared memory in 70.915s
fizzarolli-llama-3-lust-8388-v1-mkmlizer: quantizing model to /dev/shm/model_cache
fizzarolli-llama-3-lust-8388-v1-mkmlizer: Saving mkml model at /dev/shm/model_cache
fizzarolli-llama-3-lust-8388-v1-mkmlizer: Reading /tmp/tmprms5yhz3/model.safetensors.index.json
fizzarolli-llama-3-lust-8388-v1-mkmlizer: Profiling: 100%|██████████| 291/291 [00:15<00:00, 19.34it/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
fizzarolli-llama-3-lust-8388-v1-mkmlizer: quantized model in 29.867s
fizzarolli-llama-3-lust-8388-v1-mkmlizer: Processed model Fizzarolli/llama-3-lust-8b-v0.1 in 102.922s
fizzarolli-llama-3-lust-8388-v1-mkmlizer: creating bucket guanaco-mkml-models
fizzarolli-llama-3-lust-8388-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
fizzarolli-llama-3-lust-8388-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/fizzarolli-llama-3-lust-8388-v1
fizzarolli-llama-3-lust-8388-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/fizzarolli-llama-3-lust-8388-v1/config.json
fizzarolli-llama-3-lust-8388-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/fizzarolli-llama-3-lust-8388-v1/special_tokens_map.json
fizzarolli-llama-3-lust-8388-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/fizzarolli-llama-3-lust-8388-v1/tokenizer_config.json
fizzarolli-llama-3-lust-8388-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/fizzarolli-llama-3-lust-8388-v1/tokenizer.json
fizzarolli-llama-3-lust-8388-v1-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
fizzarolli-llama-3-lust-8388-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:1067: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
fizzarolli-llama-3-lust-8388-v1-mkmlizer: warnings.warn(
fizzarolli-llama-3-lust-8388-v1-mkmlizer: config.json: 100%|██████████| 1.05k/1.05k [00:00<00:00, 9.97MB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:690: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
fizzarolli-llama-3-lust-8388-v1-mkmlizer: warnings.warn(
fizzarolli-llama-3-lust-8388-v1-mkmlizer: tokenizer_config.json: 100%|██████████| 234/234 [00:00<00:00, 1.56MB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: vocab.json: 100%|██████████| 1.04M/1.04M [00:00<00:00, 4.18MB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: tokenizer.json: 100%|██████████| 2.11M/2.11M [00:00<00:00, 23.7MB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:472: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
fizzarolli-llama-3-lust-8388-v1-mkmlizer: warnings.warn(
fizzarolli-llama-3-lust-8388-v1-mkmlizer: pytorch_model.bin: 100%|█████████▉| 1.44G/1.44G [00:02<00:00, 608MB/s]
fizzarolli-llama-3-lust-8388-v1-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
fizzarolli-llama-3-lust-8388-v1-mkmlizer: Bucket 's3://guanaco-reward-models/' created
fizzarolli-llama-3-lust-8388-v1-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/fizzarolli-llama-3-lust-8388-v1_reward
fizzarolli-llama-3-lust-8388-v1-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/fizzarolli-llama-3-lust-8388-v1_reward/special_tokens_map.json
fizzarolli-llama-3-lust-8388-v1-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/fizzarolli-llama-3-lust-8388-v1_reward/tokenizer_config.json
fizzarolli-llama-3-lust-8388-v1-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/fizzarolli-llama-3-lust-8388-v1_reward/config.json
fizzarolli-llama-3-lust-8388-v1-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/fizzarolli-llama-3-lust-8388-v1_reward/merges.txt
fizzarolli-llama-3-lust-8388-v1-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/fizzarolli-llama-3-lust-8388-v1_reward/vocab.json
fizzarolli-llama-3-lust-8388-v1-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/fizzarolli-llama-3-lust-8388-v1_reward/tokenizer.json
fizzarolli-llama-3-lust-8388-v1-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/fizzarolli-llama-3-lust-8388-v1_reward/reward.tensors
Job fizzarolli-llama-3-lust-8388-v1-mkmlizer completed after 137.8s with status: succeeded
Stopping job with name fizzarolli-llama-3-lust-8388-v1-mkmlizer
Pipeline stage MKMLizer completed in 141.31s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service fizzarolli-llama-3-lust-8388-v1
Waiting for inference service fizzarolli-llama-3-lust-8388-v1 to be ready
Inference service fizzarolli-llama-3-lust-8388-v1 ready after 30.293846130371094s
Pipeline stage ISVCDeployer completed in 37.62s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.0965499877929688s
Received healthy response to inference request in 0.9951739311218262s
Received healthy response to inference request in 0.5400865077972412s
Received healthy response to inference request in 0.8568799495697021s
Received healthy response to inference request in 0.7789185047149658s
5 requests
0 failed requests
5th percentile: 0.5878529071807861
10th percentile: 0.6356193065643311
20th percentile: 0.7311521053314209
30th percentile: 0.794510793685913
40th percentile: 0.8256953716278076
50th percentile: 0.8568799495697021
60th percentile: 0.9121975421905517
70th percentile: 0.9675151348114014
80th percentile: 1.0154491424560548
90th percentile: 1.0559995651245118
95th percentile: 1.0762747764587401
99th percentile: 1.092494945526123
mean time: 0.8535217761993408
Pipeline stage StressChecker completed in 5.20s
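The StressChecker percentile figures above are consistent with standard linear-interpolation percentiles over the five measured latencies; a reproduction sketch using numpy (not the platform's code):

import numpy as np

latencies = [1.0965499877929688, 0.9951739311218262, 0.5400865077972412,
             0.8568799495697021, 0.7789185047149658]
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")  # reproduces the values logged above
print("mean time:", np.mean(latencies))                        # ~0.8535 s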
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.06s
Running pipeline stage DaemonicSafetyScorer
Running M-Eval for topic stay_in_character
Pipeline stage DaemonicSafetyScorer completed in 0.05s
M-Eval Dataset for topic stay_in_character is loaded
fizzarolli-llama-3-lust-_8388_v1 status is now deployed due to DeploymentManager action
fizzarolli-llama-3-lust-_8388_v1 status is now rejected because its M-Eval score is below the acceptable minimum of 6.5 required to serve to users. Please consider iterating on your model's ability to adhere to prompts to improve this score.