submission_id: chaiml-phase2-winner-13b2_v224
developer_uid: robert_irvine
status: inactive
model_repo: ChaiML/phase2_winner_13b2
reward_repo: ChaiML/reward_models_100_170000000_cp_332032
generation_params: {'temperature': 1.0733671330918084, 'top_p': 0.6971846333941389, 'top_k': 50, 'presence_penalty': 0.0, 'frequency_penalty': 0.312882778758545, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:'}
timestamp: 2024-02-13T20:17:50+00:00
model_name: chaiml-phase2-winner-13b2_v224
model_eval_status: success
safety_score: 0.95
entertaining: 6.96
stay_in_character: 8.52
user_preference: 7.36
double_thumbs_up: 3343
thumbs_up: 5339
thumbs_down: 2314
num_battles: 123560
num_wins: 56456
win_ratio: 0.4569116218841049
celo_rating: 1126.03
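The formatter fields above are plain Python `str.format` templates. A minimal sketch of how they could be assembled into a single prompt string — the templates are copied from the formatter field, but the persona, names, and messages below are made-up illustration data, not from this submission:

```python
# Templates copied verbatim from the formatter field above.
memory_template = "{bot_name}'s Persona: {memory}\n####\n"
prompt_template = "{prompt}\n<START>\n"
bot_template = "{bot_name}: {message}\n"
user_template = "{user_name}: {message}\n"
response_template = "{bot_name}:"

def build_prompt(bot_name, memory, prompt, history, user_name):
    """history is a list of (speaker, message) pairs, speaker in {'bot', 'user'}."""
    parts = [memory_template.format(bot_name=bot_name, memory=memory),
             prompt_template.format(prompt=prompt)]
    for speaker, message in history:
        if speaker == "bot":
            parts.append(bot_template.format(bot_name=bot_name, message=message))
        else:
            parts.append(user_template.format(user_name=user_name, message=message))
    # The response template ends with "{bot_name}:" so the model continues
    # as the bot on the next generation step.
    parts.append(response_template.format(bot_name=bot_name))
    return "".join(parts)

# Toy usage with hypothetical data:
text = build_prompt("Aria", "a cheerful barista", "Coffee-shop roleplay",
                    [("bot", "Welcome in!"), ("user", "One espresso, please.")],
                    "Anon")
print(text)
```

Note that `max_input_tokens: 512` means this assembled string would be truncated to the most recent 512 tokens before generation.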
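With `best_of: 8` in the generation params and a separate `reward_repo`, serving presumably uses best-of-n sampling: draw several candidate replies and keep the one the reward model scores highest. A sketch under that assumption — `stub_generate` and `stub_score` are made-up stand-ins, not the real model or reward APIs:

```python
import random

def best_of_n(generate, score, prompt, n=8):
    """Sample n candidate completions and return the highest-scoring one.

    `generate` and `score` are hypothetical stand-ins for the base model's
    sampler and the reward model's scorer, respectively.
    """
    candidates = [generate(prompt) for _ in range(n)]
    return max(candidates, key=score)

# Toy usage with stub functions standing in for the real models:
random.seed(0)
stub_generate = lambda prompt: f"{prompt} -> reply {random.randint(0, 99)}"
stub_score = lambda text: len(text)  # stand-in reward: longer is "better"
choice = best_of_n(stub_generate, stub_score, "hi", n=8)
print(choice)
```

The trade-off is latency and compute: each user turn costs up to 8 generations plus 8 reward-model scoring passes.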
Running pipeline stage MKMLizer
Starting job with name chaiml-phase2-winner-13b2-v224-mkmlizer
Waiting for job on chaiml-phase2-winner-13b2-v224-mkmlizer to finish
chaiml-phase2-winner-13b2-v224-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
chaiml-phase2-winner-13b2-v224-mkmlizer: ║ [mkmlizer ASCII-art logo] ║
chaiml-phase2-winner-13b2-v224-mkmlizer: ║ ║
chaiml-phase2-winner-13b2-v224-mkmlizer: ║ Version: 0.6.11 ║
chaiml-phase2-winner-13b2-v224-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
chaiml-phase2-winner-13b2-v224-mkmlizer: ║ ║
chaiml-phase2-winner-13b2-v224-mkmlizer: ║ The license key for the current software has been verified as ║
chaiml-phase2-winner-13b2-v224-mkmlizer: ║ belonging to: ║
chaiml-phase2-winner-13b2-v224-mkmlizer: ║ ║
chaiml-phase2-winner-13b2-v224-mkmlizer: ║ Chai Research Corp. ║
chaiml-phase2-winner-13b2-v224-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
chaiml-phase2-winner-13b2-v224-mkmlizer: ║ Expiration: 2024-04-15 23:59:59 ║
chaiml-phase2-winner-13b2-v224-mkmlizer: ║ ║
chaiml-phase2-winner-13b2-v224-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
chaiml-phase2-winner-13b2-v224-mkmlizer: .gitattributes: 100%|██████████| 1.52k/1.52k [00:00<00:00, 14.3MB/s]
chaiml-phase2-winner-13b2-v224-mkmlizer: README.md: 100%|██████████| 1.53k/1.53k [00:00<00:00, 12.4MB/s]
chaiml-phase2-winner-13b2-v224-mkmlizer: config.json: 100%|██████████| 651/651 [00:00<00:00, 10.7MB/s]
chaiml-phase2-winner-13b2-v224-mkmlizer: generation_config.json: 100%|██████████| 170/170 [00:00<00:00, 2.06MB/s]
chaiml-phase2-winner-13b2-v224-mkmlizer: pytorch_model-00001-of-00003.bin: 100%|█████████▉| 9.95G/9.95G [00:18<00:00, 535MB/s]
chaiml-phase2-winner-13b2-v224-mkmlizer: pytorch_model-00002-of-00003.bin: 100%|█████████▉| 9.90G/9.90G [00:12<00:00, 779MB/s]
chaiml-phase2-winner-13b2-v224-mkmlizer: pytorch_model-00003-of-00003.bin: 100%|█████████▉| 6.18G/6.18G [00:11<00:00, 540MB/s]
chaiml-phase2-winner-13b2-v224-mkmlizer: tokenizer.model: 100%|██████████| 500k/500k [00:00<00:00, 5.90MB/s]
chaiml-phase2-winner-13b2-v224-mkmlizer: tokenizer_config.json: 100%|██████████| 725/725 [00:00<00:00, 9.91MB/s]
chaiml-phase2-winner-13b2-v224-mkmlizer: Downloaded to shared memory in 46.929s
chaiml-phase2-winner-13b2-v224-mkmlizer: quantizing model to /dev/shm/model_cache
chaiml-phase2-winner-13b2-v224-mkmlizer: Saving mkml model at /dev/shm/model_cache
chaiml-phase2-winner-13b2-v224-mkmlizer: Reading /tmp/tmpx2itl0j0/pytorch_model.bin.index.json
chaiml-phase2-winner-13b2-v224-mkmlizer: Profiling: 100%|██████████| 363/363 [00:08<00:00, 43.67it/s]
chaiml-phase2-winner-13b2-v224-mkmlizer: quantized model in 27.361s
chaiml-phase2-winner-13b2-v224-mkmlizer: Processed model ChaiML/phase2_winner_13b2 in 76.149s
chaiml-phase2-winner-13b2-v224-mkmlizer: creating bucket guanaco-mkml-models
chaiml-phase2-winner-13b2-v224-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
chaiml-phase2-winner-13b2-v224-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/chaiml-phase2-winner-13b2-v224
chaiml-phase2-winner-13b2-v224-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/chaiml-phase2-winner-13b2-v224/config.json
chaiml-phase2-winner-13b2-v224-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/chaiml-phase2-winner-13b2-v224/tokenizer_config.json
chaiml-phase2-winner-13b2-v224-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/chaiml-phase2-winner-13b2-v224/special_tokens_map.json
chaiml-phase2-winner-13b2-v224-mkmlizer: cp /dev/shm/model_cache/tokenizer.model s3://guanaco-mkml-models/chaiml-phase2-winner-13b2-v224/tokenizer.model
chaiml-phase2-winner-13b2-v224-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/chaiml-phase2-winner-13b2-v224/tokenizer.json
chaiml-phase2-winner-13b2-v224-mkmlizer: cp /dev/shm/model_cache/mkml_model.tensors s3://guanaco-mkml-models/chaiml-phase2-winner-13b2-v224/mkml_model.tensors
chaiml-phase2-winner-13b2-v224-mkmlizer: loading reward model from ChaiML/reward_models_100_170000000_cp_332032
chaiml-phase2-winner-13b2-v224-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:1067: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
chaiml-phase2-winner-13b2-v224-mkmlizer: warnings.warn(
chaiml-phase2-winner-13b2-v224-mkmlizer: config.json: 100%|██████████| 1.06k/1.06k [00:00<00:00, 13.2MB/s]
chaiml-phase2-winner-13b2-v224-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:690: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
chaiml-phase2-winner-13b2-v224-mkmlizer: warnings.warn(
chaiml-phase2-winner-13b2-v224-mkmlizer: tokenizer_config.json: 100%|██████████| 234/234 [00:00<00:00, 2.64MB/s]
chaiml-phase2-winner-13b2-v224-mkmlizer: vocab.json: 100%|██████████| 798k/798k [00:00<00:00, 8.09MB/s]
chaiml-phase2-winner-13b2-v224-mkmlizer: tokenizer.json: 100%|██████████| 2.11M/2.11M [00:00<00:00, 25.2MB/s]
chaiml-phase2-winner-13b2-v224-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:472: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
chaiml-phase2-winner-13b2-v224-mkmlizer: warnings.warn(
chaiml-phase2-winner-13b2-v224-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
chaiml-phase2-winner-13b2-v224-mkmlizer: Saving duration: 0.085s
chaiml-phase2-winner-13b2-v224-mkmlizer: Processed model ChaiML/reward_models_100_170000000_cp_332032 in 4.168s
chaiml-phase2-winner-13b2-v224-mkmlizer: creating bucket guanaco-reward-models
chaiml-phase2-winner-13b2-v224-mkmlizer: Bucket 's3://guanaco-reward-models/' created
chaiml-phase2-winner-13b2-v224-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/chaiml-phase2-winner-13b2-v224_reward
chaiml-phase2-winner-13b2-v224-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/chaiml-phase2-winner-13b2-v224_reward/config.json
chaiml-phase2-winner-13b2-v224-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/chaiml-phase2-winner-13b2-v224_reward/special_tokens_map.json
chaiml-phase2-winner-13b2-v224-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/chaiml-phase2-winner-13b2-v224_reward/merges.txt
chaiml-phase2-winner-13b2-v224-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/chaiml-phase2-winner-13b2-v224_reward/tokenizer_config.json
chaiml-phase2-winner-13b2-v224-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/chaiml-phase2-winner-13b2-v224_reward/vocab.json
chaiml-phase2-winner-13b2-v224-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/chaiml-phase2-winner-13b2-v224_reward/tokenizer.json
chaiml-phase2-winner-13b2-v224-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/chaiml-phase2-winner-13b2-v224_reward/reward.tensors
Job chaiml-phase2-winner-13b2-v224-mkmlizer completed after 105.88s with status: succeeded
Stopping job with name chaiml-phase2-winner-13b2-v224-mkmlizer
Pipeline stage MKMLizer completed in 111.42s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.12s
Running pipeline stage ISVCDeployer
Creating inference service chaiml-phase2-winner-13b2-v224
Waiting for inference service chaiml-phase2-winner-13b2-v224 to be ready
Inference service chaiml-phase2-winner-13b2-v224 ready after 50.38985538482666s
Pipeline stage ISVCDeployer completed in 58.35s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.822221517562866s
Received healthy response to inference request in 1.7520310878753662s
Received healthy response to inference request in 1.5771708488464355s
Received healthy response to inference request in 1.5142478942871094s
Received healthy response to inference request in 1.7716469764709473s
5 requests
0 failed requests
5th percentile: 1.5268324851989745
10th percentile: 1.5394170761108399
20th percentile: 1.5645862579345704
30th percentile: 1.6121428966522218
40th percentile: 1.682086992263794
50th percentile: 1.7520310878753662
60th percentile: 1.7598774433135986
70th percentile: 1.767723798751831
80th percentile: 1.9817618846893312
90th percentile: 2.4019917011260987
95th percentile: 2.6121066093444822
99th percentile: 2.7801985359191894
mean time: 1.887463665008545
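The StressChecker summary above can be reproduced from the five healthy response times. The logged figures match NumPy's default linear-interpolation `percentile`, so presumably that (or an equivalent) method was used; a sketch:

```python
import numpy as np

# The five healthy response times reported above, in seconds.
times = np.array([2.822221517562866, 1.7520310878753662, 1.5771708488464355,
                  1.5142478942871094, 1.7716469764709473])

# NumPy's default (linear interpolation between order statistics)
# reproduces the logged percentiles.
for p in (5, 50, 95, 99):
    print(f"{p}th percentile: {np.percentile(times, p)}")
print("mean time:", times.mean())
```

With only five samples, every percentile is an interpolation between just two observations, so the tail percentiles (95th, 99th) are dominated by the single 2.82 s outlier.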
Pipeline stage StressChecker completed in 10.97s
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.05s
Running pipeline stage DaemonicSafetyScorer
Running M-Eval for topic stay_in_character
Pipeline stage DaemonicSafetyScorer completed in 0.05s
M-Eval Dataset for topic stay_in_character is loaded
chaiml-phase2-winner-13b2_v224 status is now inactive due to auto-deactivation of underperforming models
chaiml-phase2-winner-13b2_v224 status is now deployed due to admin request
chaiml-phase2-winner-13b2_v224 status is now inactive due to auto-deactivation of underperforming models

Usage Metrics

Latency Metrics