submission_id: turboderp-cat-llama-3-70_8684_v4
developer_uid: kaltcit
status: inactive
model_repo: turboderp/Cat-Llama-3-70B-instruct
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
generation_params: {'temperature': 0.9, 'top_p': 0.8, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 4, 'max_output_tokens': 64}
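The generation_params above describe a standard filtered-sampling setup: temperature 0.9, top_k 40, top_p 0.8, min_p disabled, a newline stop word, and best_of 4 candidate completions (presumably reranked by the reward model listed above). Below is a minimal NumPy sketch, purely for illustration, of how these cutoffs combine when filtering next-token probabilities; it is not the serving code used by this pipeline.

import numpy as np

def sample_next_token(logits, temperature=0.9, top_k=40, top_p=0.8, min_p=0.0):
    # Temperature-scaled softmax over the vocabulary.
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()

    # top_k=40: keep only the 40 most likely tokens.
    if top_k > 0:
        probs[probs < np.sort(probs)[-top_k]] = 0.0
        probs /= probs.sum()

    # min_p=0.0: would drop tokens below min_p * max probability (disabled here).
    if min_p > 0.0:
        probs[probs < min_p * probs.max()] = 0.0

    # top_p=0.8: keep the smallest prefix of tokens whose cumulative mass reaches 0.8.
    order = np.argsort(probs)[::-1]
    keep = np.searchsorted(np.cumsum(probs[order]), top_p) + 1
    probs[order[keep:]] = 0.0

    probs /= probs.sum()
    return np.random.choice(len(probs), p=probs)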
formatter: {'memory_template': '<|im_start|>system\n{memory}<|im_end|>\n', 'prompt_template': '<|im_start|>system\n{prompt}<|im_end|>\n', 'bot_template': '<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n', 'user_template': '<|im_start|>user\n{user_name}: {message}<|im_end|>\n', 'response_template': '<|im_start|>assistant\n{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
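The formatter and reward_formatter entries are plain template strings filled in per message. The following illustrative rendering uses a made-up persona and two made-up turns; only the template strings come from this record, and the reward_formatter can be rendered the same way by swapping in its templates.

formatter = {
    "memory_template": "<|im_start|>system\n{memory}<|im_end|>\n",
    "prompt_template": "<|im_start|>system\n{prompt}<|im_end|>\n",
    "bot_template": "<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n",
    "user_template": "<|im_start|>user\n{user_name}: {message}<|im_end|>\n",
    "response_template": "<|im_start|>assistant\n{bot_name}:",
}

def render(fmt, memory, prompt, turns, bot_name, user_name):
    text = fmt["memory_template"].format(memory=memory)
    text += fmt["prompt_template"].format(prompt=prompt)
    for role, message in turns:
        if role == "bot":
            text += fmt["bot_template"].format(bot_name=bot_name, message=message)
        else:
            text += fmt["user_template"].format(user_name=user_name, message=message)
    # The model is prompted to continue from "<|im_start|>assistant\n{bot_name}:".
    return text + fmt["response_template"].format(bot_name=bot_name)

# Hypothetical conversation, for demonstration only.
print(render(formatter,
             memory="Nova is a dry-witted starship engineer.",
             prompt="Stay in character and reply in one or two sentences.",
             turns=[("user", "The warp core is making that noise again."),
                    ("bot", "That noise is the sound of my weekend evaporating.")],
             bot_name="Nova", user_name="Riley"))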
timestamp: 2024-06-26T18:50:54+00:00
model_name: turboderp-cat-llama-3-70_8684_v4
model_group: turboderp/Cat-Llama-3-70
num_battles: 13915
num_wins: 6879
celo_rating: 1187.35
propriety_score: 0.7282176704371752
propriety_total_count: 6542.0
submission_type: basic
model_architecture: LlamaForCausalLM
model_num_parameters: 70553739264.0
best_of: 4
max_input_tokens: 512
max_output_tokens: 64
display_name: turboderp-cat-llama-3-70_8684_v4
ineligible_reason: None
language_model: turboderp/Cat-Llama-3-70B-instruct
model_size: 71B
reward_model: ChaiML/reward_gpt2_medium_preference_24m_e2
us_pacific_date: 2024-06-26
win_ratio: 0.4943586058210564
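win_ratio is simply num_wins divided by num_battles from the fields above:

# 6879 wins out of 13915 battles
print(6879 / 13915)   # 0.4943586058210564, the win_ratio reported above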
Running pipeline stage MKMLizer
Starting job with name turboderp-cat-llama-3-70-8684-v4-mkmlizer
Waiting for job on turboderp-cat-llama-3-70-8684-v4-mkmlizer to finish
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ║ _____ __ __ ║
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ║ /___/ ║
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ║ ║
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ║ Version: 0.8.14 ║
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ║ https://mk1.ai ║
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ║ ║
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ║ The license key for the current software has been verified as ║
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ║ belonging to: ║
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ║ ║
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ║ Chai Research Corp. ║
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ║ ║
turboderp-cat-llama-3-70-8684-v4-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
turboderp-cat-llama-3-70-8684-v4-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
turboderp-cat-llama-3-70-8684-v4-mkmlizer: warnings.warn(warning_message, FutureWarning)
Connection pool is full, discarding connection: %s
Connection pool is full, discarding connection: %s
turboderp-cat-llama-3-70-8684-v4-mkmlizer: Downloaded to shared memory in 190.520s
turboderp-cat-llama-3-70-8684-v4-mkmlizer: quantizing model to /dev/shm/model_cache
turboderp-cat-llama-3-70-8684-v4-mkmlizer: Saving flywheel model at /dev/shm/model_cache
turboderp-cat-llama-3-70-8684-v4-mkmlizer: Loading 0: 0%| | 0/723 [00:00<?, ?it/s] ... Loading 0: 100%|██████████| 723/723 [01:40<00:00, 1.03it/s]
turboderp-cat-llama-3-70-8684-v4-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
turboderp-cat-llama-3-70-8684-v4-mkmlizer: quantized model in 115.868s
turboderp-cat-llama-3-70-8684-v4-mkmlizer: Processed model turboderp/Cat-Llama-3-70B-instruct in 319.040s
turboderp-cat-llama-3-70-8684-v4-mkmlizer: creating bucket guanaco-mkml-models
turboderp-cat-llama-3-70-8684-v4-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
turboderp-cat-llama-3-70-8684-v4-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/turboderp-cat-llama-3-70-8684-v4
turboderp-cat-llama-3-70-8684-v4-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/turboderp-cat-llama-3-70-8684-v4/special_tokens_map.json
turboderp-cat-llama-3-70-8684-v4-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/turboderp-cat-llama-3-70-8684-v4/config.json
turboderp-cat-llama-3-70-8684-v4-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/turboderp-cat-llama-3-70-8684-v4/tokenizer.json
turboderp-cat-llama-3-70-8684-v4-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/turboderp-cat-llama-3-70-8684-v4/tokenizer_config.json
turboderp-cat-llama-3-70-8684-v4-mkmlizer: cp /dev/shm/model_cache/flywheel_model.5.safetensors s3://guanaco-mkml-models/turboderp-cat-llama-3-70-8684-v4/flywheel_model.5.safetensors
turboderp-cat-llama-3-70-8684-v4-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/turboderp-cat-llama-3-70-8684-v4/flywheel_model.0.safetensors
turboderp-cat-llama-3-70-8684-v4-mkmlizer: cp /dev/shm/model_cache/flywheel_model.2.safetensors s3://guanaco-mkml-models/turboderp-cat-llama-3-70-8684-v4/flywheel_model.2.safetensors
turboderp-cat-llama-3-70-8684-v4-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/turboderp-cat-llama-3-70-8684-v4/flywheel_model.1.safetensors
turboderp-cat-llama-3-70-8684-v4-mkmlizer: cp /dev/shm/model_cache/flywheel_model.3.safetensors s3://guanaco-mkml-models/turboderp-cat-llama-3-70-8684-v4/flywheel_model.3.safetensors
turboderp-cat-llama-3-70-8684-v4-mkmlizer: cp /dev/shm/model_cache/flywheel_model.4.safetensors s3://guanaco-mkml-models/turboderp-cat-llama-3-70-8684-v4/flywheel_model.4.safetensors
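The cp lines above copy the quantized shards and tokenizer files from /dev/shm/model_cache into the guanaco-mkml-models bucket. A minimal boto3 sketch of the same upload pattern follows, assuming S3-compatible credentials and endpoint configuration are already in place; the actual tool invoked by the mkmlizer is not shown in this log.

import os
import boto3

s3 = boto3.client("s3")
src = "/dev/shm/model_cache"
bucket = "guanaco-mkml-models"
prefix = "turboderp-cat-llama-3-70-8684-v4"

# Upload every file in the model cache under the submission's prefix,
# mirroring the cp commands in the log above.
for name in os.listdir(src):
    path = os.path.join(src, name)
    if os.path.isfile(path):
        s3.upload_file(path, bucket, f"{prefix}/{name}")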
turboderp-cat-llama-3-70-8684-v4-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
turboderp-cat-llama-3-70-8684-v4-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:913: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
turboderp-cat-llama-3-70-8684-v4-mkmlizer: warnings.warn(
turboderp-cat-llama-3-70-8684-v4-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:757: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
turboderp-cat-llama-3-70-8684-v4-mkmlizer: warnings.warn(
turboderp-cat-llama-3-70-8684-v4-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
turboderp-cat-llama-3-70-8684-v4-mkmlizer: warnings.warn(
turboderp-cat-llama-3-70-8684-v4-mkmlizer: creating bucket guanaco-reward-models
turboderp-cat-llama-3-70-8684-v4-mkmlizer: Bucket 's3://guanaco-reward-models/' created
turboderp-cat-llama-3-70-8684-v4-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/turboderp-cat-llama-3-70-8684-v4_reward
turboderp-cat-llama-3-70-8684-v4-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/turboderp-cat-llama-3-70-8684-v4_reward/config.json
turboderp-cat-llama-3-70-8684-v4-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/turboderp-cat-llama-3-70-8684-v4_reward/special_tokens_map.json
turboderp-cat-llama-3-70-8684-v4-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/turboderp-cat-llama-3-70-8684-v4_reward/tokenizer_config.json
turboderp-cat-llama-3-70-8684-v4-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/turboderp-cat-llama-3-70-8684-v4_reward/merges.txt
turboderp-cat-llama-3-70-8684-v4-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/turboderp-cat-llama-3-70-8684-v4_reward/vocab.json
turboderp-cat-llama-3-70-8684-v4-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/turboderp-cat-llama-3-70-8684-v4_reward/tokenizer.json
turboderp-cat-llama-3-70-8684-v4-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/turboderp-cat-llama-3-70-8684-v4_reward/reward.tensors
Job turboderp-cat-llama-3-70-8684-v4-mkmlizer completed after 371.45s with status: succeeded
Stopping job with name turboderp-cat-llama-3-70-8684-v4-mkmlizer
Pipeline stage MKMLizer completed in 372.38s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.13s
Running pipeline stage ISVCDeployer
Creating inference service turboderp-cat-llama-3-70-8684-v4
Waiting for inference service turboderp-cat-llama-3-70-8684-v4 to be ready
Inference service turboderp-cat-llama-3-70-8684-v4 ready after 90.53577995300293s
Pipeline stage ISVCDeployer completed in 97.50s
Running pipeline stage StressChecker
Failed to get response for submission blend_lonib_2024-06-26: ('http://neversleep-noromaid-v0-8068-v36-predictor-default.tenant-chaiml-guanaco.knative.ord1.coreweave.cloud/v1/models/GPT-J-6B-lit-v2:predict', 'EOF\n')
%s, retrying in %s seconds...
Received healthy response to inference request in 4.759587049484253s
Received healthy response to inference request in 4.205268859863281s
Received healthy response to inference request in 4.614182472229004s
Received healthy response to inference request in 4.197971820831299s
Received healthy response to inference request in 4.203437328338623s
5 requests
0 failed requests
5th percentile: 4.199064922332764
10th percentile: 4.200158023834229
20th percentile: 4.202344226837158
30th percentile: 4.203803634643554
40th percentile: 4.204536247253418
50th percentile: 4.205268859863281
60th percentile: 4.36883430480957
70th percentile: 4.532399749755859
80th percentile: 4.6432633876800535
90th percentile: 4.701425218582154
95th percentile: 4.730506134033203
99th percentile: 4.753770866394043
mean time: 4.396089506149292
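The percentile and mean figures above appear to correspond to linear-interpolation percentiles over the five healthy response times; a quick NumPy check (times copied from the log lines above):

import numpy as np

# Response times in seconds for the five healthy requests logged above.
times = [4.759587049484253, 4.205268859863281, 4.614182472229004,
         4.197971820831299, 4.203437328338623]

for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {np.percentile(times, q)}")
print(f"mean time: {np.mean(times)}")   # 4.396089506149292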
Pipeline stage StressChecker completed in 43.21s
turboderp-cat-llama-3-70_8684_v4 status is now deployed due to DeploymentManager action
turboderp-cat-llama-3-70_8684_v4 status is now inactive due to auto deactivation of underperforming models

Usage Metrics

Latency Metrics