submission_id: sao10k-l3-70b-euryale-v2-1_v1
developer_uid: frankdu
status: inactive
model_repo: Sao10K/L3-70B-Euryale-v2.1
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 4, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
timestamp: 2024-06-24T16:14:51+00:00
model_name: sao10k-l3-70b-euryale-v2-1_v1
model_group: Sao10K/L3-70B-Euryale-v2
num_battles: 45793
num_wins: 25631
celo_rating: 1209.09
propriety_score: 0.7171726535341831
propriety_total_count: 21575.0
submission_type: basic
model_architecture: LlamaForCausalLM
model_num_parameters: 70553706496.0
best_of: 4
max_input_tokens: 512
max_output_tokens: 64
display_name: sao10k-l3-70b-euryale-v2-1_v1
ineligible_reason: None
language_model: Sao10K/L3-70B-Euryale-v2.1
model_size: 71B
reward_model: ChaiML/reward_gpt2_medium_preference_24m_e2
us_pacific_date: 2024-06-24
win_ratio: 0.5597143668246238
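The formatter entries above are plain Python format strings, and generation_params describes per-request sampling. A minimal sketch of how a conversation might be assembled into the model's input, assuming a hypothetical persona and messages (the deployed MKML serving code is not shown in this log, so this is illustrative only):

```python
# Sketch (not the deployed serving code): rendering a prompt with the
# formatter templates listed above. The persona/messages are hypothetical;
# the template strings and field names come from the formatter config.

formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def render_prompt(bot_name, user_name, memory, prompt, turns):
    """Assemble the full input string the language model completes from."""
    text = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    text += formatter["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        template = formatter["bot_template"] if speaker == "bot" else formatter["user_template"]
        name = bot_name if speaker == "bot" else user_name
        text += template.format(bot_name=name, user_name=name, message=message)
    # The model continues from the open response template; per generation_params,
    # decoding stops at '\n' (stopping_words) or after max_output_tokens=64.
    return text + formatter["response_template"].format(bot_name=bot_name)

print(render_prompt(
    bot_name="Euryale", user_name="Anon",
    memory="A sharp-tongued strategist.",
    prompt="A casual chat.",
    turns=[("user", "Hello!"), ("bot", "Well met."), ("user", "Any advice?")],
))
```

With best_of: 4, four candidate completions are sampled per turn; the separate reward_repo and reward_formatter entries suggest the candidates are re-ranked by the GPT-2 reward model, with inputs truncated to max_input_tokens=512. The derived metadata is also self-consistent: 25631 / 45793 ≈ 0.5597 matches win_ratio, and 70,553,706,496 parameters rounds to the displayed 71B model_size.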
Resubmit model
Running pipeline stage MKMLizer
Starting job with name sao10k-l3-70b-euryale-v2-1-v1-mkmlizer
Waiting for job on sao10k-l3-70b-euryale-v2-1-v1-mkmlizer to finish
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: ║ ["flywheel" ASCII-art banner] ║
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: ║ ║
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: ║ Version: 0.8.14 ║
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: ║ https://mk1.ai ║
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: ║ ║
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: ║ The license key for the current software has been verified as ║
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: ║ belonging to: ║
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: ║ ║
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: ║ Chai Research Corp. ║
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: ║ ║
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: warnings.warn(warning_message, FutureWarning)
Connection pool is full, discarding connection: %s
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: Downloaded to shared memory in 228.149s
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: quantizing model to /dev/shm/model_cache
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: Loading 0: 100%|██████████| 723/723 [02:04<00:00, 1.16it/s]
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: quantized model in 142.590s
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: Processed model Sao10K/L3-70B-Euryale-v2.1 in 380.608s
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: creating bucket guanaco-mkml-models
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/sao10k-l3-70b-euryale-v2-1-v1
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/sao10k-l3-70b-euryale-v2-1-v1/special_tokens_map.json
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/sao10k-l3-70b-euryale-v2-1-v1/tokenizer_config.json
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/sao10k-l3-70b-euryale-v2-1-v1/config.json
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/sao10k-l3-70b-euryale-v2-1-v1/tokenizer.json
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.5.safetensors s3://guanaco-mkml-models/sao10k-l3-70b-euryale-v2-1-v1/flywheel_model.5.safetensors
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.2.safetensors s3://guanaco-mkml-models/sao10k-l3-70b-euryale-v2-1-v1/flywheel_model.2.safetensors
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.4.safetensors s3://guanaco-mkml-models/sao10k-l3-70b-euryale-v2-1-v1/flywheel_model.4.safetensors
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/sao10k-l3-70b-euryale-v2-1-v1/flywheel_model.0.safetensors
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.3.safetensors s3://guanaco-mkml-models/sao10k-l3-70b-euryale-v2-1-v1/flywheel_model.3.safetensors
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/sao10k-l3-70b-euryale-v2-1-v1/flywheel_model.1.safetensors
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:913: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: warnings.warn(
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:757: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: warnings.warn(
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/torch/_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: return self.fget.__get__(instance, owner)()
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: Saving duration: 0.277s
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 3.726s
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: creating bucket guanaco-reward-models
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: Bucket 's3://guanaco-reward-models/' created
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/sao10k-l3-70b-euryale-v2-1-v1_reward
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/sao10k-l3-70b-euryale-v2-1-v1_reward/special_tokens_map.json
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/sao10k-l3-70b-euryale-v2-1-v1_reward/config.json
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/sao10k-l3-70b-euryale-v2-1-v1_reward/tokenizer_config.json
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/sao10k-l3-70b-euryale-v2-1-v1_reward/merges.txt
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/sao10k-l3-70b-euryale-v2-1-v1_reward/vocab.json
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/sao10k-l3-70b-euryale-v2-1-v1_reward/tokenizer.json
sao10k-l3-70b-euryale-v2-1-v1-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/sao10k-l3-70b-euryale-v2-1-v1_reward/reward.tensors
Job sao10k-l3-70b-euryale-v2-1-v1-mkmlizer completed after 430.61s with status: succeeded
Stopping job with name sao10k-l3-70b-euryale-v2-1-v1-mkmlizer
Pipeline stage MKMLizer completed in 431.08s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service sao10k-l3-70b-euryale-v2-1-v1
Waiting for inference service sao10k-l3-70b-euryale-v2-1-v1 to be ready
Inference service sao10k-l3-70b-euryale-v2-1-v1 ready after 90.52466940879822s
Pipeline stage ISVCDeployer completed in 96.28s
Running pipeline stage StressChecker
Received healthy response to inference request in 4.384335994720459s
Received healthy response to inference request in 4.177119493484497s
Received healthy response to inference request in 6.307408332824707s
Received healthy response to inference request in 4.173785209655762s
Received healthy response to inference request in 3.4552812576293945s
5 requests
0 failed requests
5th percentile: 3.598982048034668
10th percentile: 3.7426828384399413
20th percentile: 4.030084419250488
30th percentile: 4.174452066421509
40th percentile: 4.175785779953003
50th percentile: 4.177119493484497
60th percentile: 4.260006093978882
70th percentile: 4.342892694473266
80th percentile: 4.768950462341309
90th percentile: 5.538179397583008
95th percentile: 5.922793865203857
99th percentile: 6.230485439300537
mean time: 4.499586057662964
Pipeline stage StressChecker completed in 23.36s
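The StressChecker summary above can be reproduced from the five raw request latencies. A small sketch, assuming numpy's default linear-interpolation percentile, which matches every logged value exactly (the checker's actual implementation isn't shown in the log):

```python
# Reproduce the StressChecker statistics from the five logged latencies,
# assuming numpy's default linear-interpolation percentile method.
import numpy as np

latencies = [4.384335994720459, 4.177119493484497, 6.307408332824707,
             4.173785209655762, 3.4552812576293945]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print("mean time:", np.mean(latencies))
```

With only five samples, the 50th percentile is simply the middle latency (~4.18s) and the tails are interpolations toward the extremes, so the single 6.31s response dominates everything above the 80th percentile.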
Running pipeline stage DaemonicSafetyScorer
Pipeline stage DaemonicSafetyScorer completed in 0.04s
sao10k-l3-70b-euryale-v2-1_v1 status is now deployed due to DeploymentManager action
sao10k-l3-70b-euryale-v2-1_v1 status is now inactive due to auto-deactivation of underperforming models
