submission_id: sao10k-euryale-v2-1-alpha_v1
developer_uid: sao10k
best_of: 4
celo_rating: 1159.06
display_name: v2-3-l3-70b-euryale-test-alpha
family_friendly_score: 0.0
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.9, 'top_p': 0.95, 'min_p': 0.075, 'top_k': 60, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|end_header_id|>,', '<|eot_id|>,', '\n\n{user_name}'], 'max_input_tokens': 512, 'best_of': 4, 'max_output_tokens': 64}
ineligible_reason: model is not deployable
is_internal_developer: False
language_model: Sao10K/Euryale-v2.1-Alpha
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_eval_status: success
model_group: Sao10K/Euryale-v2.1-Alpha
model_name: v2-3-l3-70b-euryale-test-alpha
model_num_parameters: 70553706496.0
model_repo: Sao10K/Euryale-v2.1-Alpha
model_size: 71B
num_battles: 6485
num_wins: 3165
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
status: rejected
submission_type: basic
timestamp: 2024-06-12T15:47:34+00:00
us_pacific_date: 2024-06-12
win_ratio: 0.48804934464148036
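The formatter block above describes how a conversation is rendered into a Llama-3-style prompt before generation. The sketch below shows one plausible way those templates expand; the `assemble_prompt` helper and the example inputs are illustrative assumptions, not the platform's actual serving code.

```python
# Hypothetical illustration of how the submission's formatter templates could
# be expanded into a Llama-3-style prompt. Template strings are copied from the
# formatter field; the helper and example data are assumptions for illustration.
MEMORY_TEMPLATE = (
    "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
    "{bot_name}'s Persona: {memory}\n\n"
)
PROMPT_TEMPLATE = "{prompt}<|eot_id|>"
USER_TEMPLATE = "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>"
BOT_TEMPLATE = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>"
RESPONSE_TEMPLATE = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:"


def assemble_prompt(bot_name, user_name, memory, prompt, turns):
    """Render persona, scenario and chat turns into a single prompt string."""
    text = MEMORY_TEMPLATE.format(bot_name=bot_name, memory=memory)
    text += PROMPT_TEMPLATE.format(prompt=prompt)
    for role, message in turns:  # turns: list of ("user" | "bot", message) pairs
        if role == "user":
            text += USER_TEMPLATE.format(user_name=user_name, message=message)
        else:
            text += BOT_TEMPLATE.format(bot_name=bot_name, message=message)
    # The trailing response template cues the model to reply as the bot.
    return text + RESPONSE_TEMPLATE.format(bot_name=bot_name)


example = assemble_prompt(
    bot_name="Euryale",
    user_name="User",
    memory="A helpful roleplay companion.",
    prompt="The conversation takes place in a tavern.",
    turns=[("user", "Hello there!")],
)
print(example)
```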
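generation_params maps onto standard temperature/top-k/top-p/min-p sampling controls. Purely as a point of reference, here is how they might be expressed with vLLM's SamplingParams; whether the MKML inference engine is vLLM-based is an assumption, and only the parameter values are taken from the submission.

```python
# Reference sketch only: the submission's generation_params expressed with
# vLLM-style SamplingParams. Using vLLM here is an assumption; the actual
# MKML/flywheel serving engine is not described in this log.
from vllm import SamplingParams  # assumed backend, for illustration

sampling_params = SamplingParams(
    n=4,                            # best_of: 4 candidates, reranked downstream
    temperature=0.9,
    top_p=0.95,
    min_p=0.075,
    top_k=60,
    presence_penalty=0.0,
    frequency_penalty=0.0,
    stop=["\n", "<|end_header_id|>,", "<|eot_id|>,", "\n\n{user_name}"],
    max_tokens=64,                  # max_output_tokens
)
# max_input_tokens=512 caps the prompt length; the rendered prompt is assumed
# to be truncated (truncate_by_message is False) before generation.
```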
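best_of: 4 together with reward_repo and reward_formatter suggests best-of-N selection: four candidate replies are drawn and the GPT-2 reward model picks the one to serve. Below is a hedged reconstruction of that step, assuming the reward checkpoint loads as a sequence-classification head and that the reward_formatter text is simply concatenated; the `rerank` helper and scoring scheme are illustrative.

```python
# Plausible reconstruction of best-of-N reranking with the listed reward model.
# Assumptions: the checkpoint loads as a sequence-classification model and the
# last logit tracks preference; the helper below is not the platform's code.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

REWARD_REPO = "ChaiML/reward_gpt2_medium_preference_24m_e2"
reward_tokenizer = AutoTokenizer.from_pretrained(REWARD_REPO)
reward_model = AutoModelForSequenceClassification.from_pretrained(REWARD_REPO).eval()


def rerank(context: str, bot_name: str, candidates: list[str]) -> str:
    """Score each candidate reply under the reward model and return the best one."""
    scores = []
    for message in candidates:
        # reward_formatter's bot_template: "{bot_name}: {message}\n"
        text = context + f"{bot_name}: {message}\n"
        inputs = reward_tokenizer(text, return_tensors="pt", truncation=True)
        with torch.no_grad():
            logits = reward_model(**inputs).logits  # shape (1, num_labels)
        scores.append(logits[0, -1].item())  # assumption: last logit = preference
    return candidates[scores.index(max(scores))]
```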
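The win_ratio follows directly from the battle counts above:

```python
num_wins, num_battles = 3165, 6485
print(num_wins / num_battles)  # 0.48804934464148036, the reported win_ratio
```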
Resubmit model
Running pipeline stage MKMLizer
Starting job with name sao10k-euryale-v2-1-alpha-v1-mkmlizer
Waiting for job on sao10k-euryale-v2-1-alpha-v1-mkmlizer to finish
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ║ _____ __ __ ║
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ║ /___/ ║
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ║ ║
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ║ Version: 0.8.14 ║
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ║ https://mk1.ai ║
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ║ ║
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ║ The license key for the current software has been verified as ║
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ║ belonging to: ║
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ║ ║
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ║ Chai Research Corp. ║
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ║ ║
sao10k-euryale-v2-1-alpha-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
sao10k-euryale-v2-1-alpha-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
sao10k-euryale-v2-1-alpha-v1-mkmlizer: warnings.warn(warning_message, FutureWarning)
sao10k-euryale-v2-1-alpha-v1-mkmlizer: Downloaded to shared memory in 348.782s
sao10k-euryale-v2-1-alpha-v1-mkmlizer: quantizing model to /dev/shm/model_cache
sao10k-euryale-v2-1-alpha-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
sao10k-euryale-v2-1-alpha-v1-mkmlizer: Loading 0: 0%| | 0/723 [00:00<?, ?it/s] ... Loading 0: 99%|█████████▉| 716/723 [01:55<00:00, 15.64it/s]
sao10k-euryale-v2-1-alpha-v1-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
sao10k-euryale-v2-1-alpha-v1-mkmlizer: quantized model in 133.138s
Connection pool is full, discarding connection: %s
sao10k-euryale-v2-1-alpha-v1-mkmlizer: Processed model Sao10K/Euryale-v2.1-Alpha in 504.291s
sao10k-euryale-v2-1-alpha-v1-mkmlizer: creating bucket guanaco-mkml-models
sao10k-euryale-v2-1-alpha-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
sao10k-euryale-v2-1-alpha-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/sao10k-euryale-v2-1-alpha-v1
sao10k-euryale-v2-1-alpha-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/sao10k-euryale-v2-1-alpha-v1/special_tokens_map.json
sao10k-euryale-v2-1-alpha-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/sao10k-euryale-v2-1-alpha-v1/tokenizer.json
sao10k-euryale-v2-1-alpha-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/sao10k-euryale-v2-1-alpha-v1/tokenizer_config.json
sao10k-euryale-v2-1-alpha-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/sao10k-euryale-v2-1-alpha-v1/config.json
sao10k-euryale-v2-1-alpha-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.5.safetensors s3://guanaco-mkml-models/sao10k-euryale-v2-1-alpha-v1/flywheel_model.5.safetensors
sao10k-euryale-v2-1-alpha-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.4.safetensors s3://guanaco-mkml-models/sao10k-euryale-v2-1-alpha-v1/flywheel_model.4.safetensors
sao10k-euryale-v2-1-alpha-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/sao10k-euryale-v2-1-alpha-v1/flywheel_model.0.safetensors
sao10k-euryale-v2-1-alpha-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/sao10k-euryale-v2-1-alpha-v1/flywheel_model.1.safetensors
sao10k-euryale-v2-1-alpha-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.2.safetensors s3://guanaco-mkml-models/sao10k-euryale-v2-1-alpha-v1/flywheel_model.2.safetensors
sao10k-euryale-v2-1-alpha-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.3.safetensors s3://guanaco-mkml-models/sao10k-euryale-v2-1-alpha-v1/flywheel_model.3.safetensors
sao10k-euryale-v2-1-alpha-v1-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
sao10k-euryale-v2-1-alpha-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:913: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
sao10k-euryale-v2-1-alpha-v1-mkmlizer: warnings.warn(
sao10k-euryale-v2-1-alpha-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:757: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
sao10k-euryale-v2-1-alpha-v1-mkmlizer: warnings.warn(
sao10k-euryale-v2-1-alpha-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
sao10k-euryale-v2-1-alpha-v1-mkmlizer: warnings.warn(
sao10k-euryale-v2-1-alpha-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/torch/_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
sao10k-euryale-v2-1-alpha-v1-mkmlizer: return self.fget.__get__(instance, owner)()
sao10k-euryale-v2-1-alpha-v1-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
sao10k-euryale-v2-1-alpha-v1-mkmlizer: Saving duration: 0.434s
sao10k-euryale-v2-1-alpha-v1-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 7.537s
sao10k-euryale-v2-1-alpha-v1-mkmlizer: creating bucket guanaco-reward-models
sao10k-euryale-v2-1-alpha-v1-mkmlizer: Bucket 's3://guanaco-reward-models/' created
sao10k-euryale-v2-1-alpha-v1-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/sao10k-euryale-v2-1-alpha-v1_reward
sao10k-euryale-v2-1-alpha-v1-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/sao10k-euryale-v2-1-alpha-v1_reward/special_tokens_map.json
sao10k-euryale-v2-1-alpha-v1-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/sao10k-euryale-v2-1-alpha-v1_reward/tokenizer_config.json
sao10k-euryale-v2-1-alpha-v1-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/sao10k-euryale-v2-1-alpha-v1_reward/vocab.json
sao10k-euryale-v2-1-alpha-v1-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/sao10k-euryale-v2-1-alpha-v1_reward/config.json
sao10k-euryale-v2-1-alpha-v1-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/sao10k-euryale-v2-1-alpha-v1_reward/merges.txt
sao10k-euryale-v2-1-alpha-v1-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/sao10k-euryale-v2-1-alpha-v1_reward/tokenizer.json
sao10k-euryale-v2-1-alpha-v1-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/sao10k-euryale-v2-1-alpha-v1_reward/reward.tensors
Job sao10k-euryale-v2-1-alpha-v1-mkmlizer completed after 565.31s with status: succeeded
Stopping job with name sao10k-euryale-v2-1-alpha-v1-mkmlizer
Pipeline stage MKMLizer completed in 566.26s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service sao10k-euryale-v2-1-alpha-v1
Waiting for inference service sao10k-euryale-v2-1-alpha-v1 to be ready
Inference service sao10k-euryale-v2-1-alpha-v1 ready after 90.91029477119446s
Pipeline stage ISVCDeployer completed in 97.13s
Running pipeline stage StressChecker
Received healthy response to inference request in 5.298290967941284s
Received healthy response to inference request in 4.2394585609436035s
Received healthy response to inference request in 4.225162029266357s
Received healthy response to inference request in 4.165940523147583s
Received healthy response to inference request in 4.241365432739258s
5 requests
0 failed requests
5th percentile: 4.177784824371338
10th percentile: 4.189629125595093
20th percentile: 4.213317728042602
30th percentile: 4.228021335601807
40th percentile: 4.233739948272705
50th percentile: 4.2394585609436035
60th percentile: 4.240221309661865
70th percentile: 4.240984058380127
80th percentile: 4.452750539779664
90th percentile: 4.8755207538604735
95th percentile: 5.086905860900878
99th percentile: 5.256013946533203
mean time: 4.434043502807617
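The summary statistics above can be reproduced from the five raw latencies, assuming linear-interpolation percentiles (NumPy's default); the log does not state the exact method, so this is a reconstruction:

```python
# Reproducing the StressChecker summary statistics from the five raw latencies.
# Assumption: linear-interpolation percentiles (NumPy's default method).
import numpy as np

latencies = np.array([
    5.298290967941284,
    4.2394585609436035,
    4.225162029266357,
    4.165940523147583,
    4.241365432739258,
])
for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {np.percentile(latencies, q)}")
print("mean time:", latencies.mean())  # 4.434043502807617
```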
Pipeline stage StressChecker completed in 22.90s
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.04s
Running pipeline stage DaemonicSafetyScorer
Running M-Eval for topic stay_in_character
Pipeline stage DaemonicSafetyScorer completed in 0.03s
M-Eval Dataset for topic stay_in_character is loaded
sao10k-euryale-v2-1-alpha_v1 status is now deployed due to DeploymentManager action
sao10k-euryale-v2-1-alpha_v1 status is now rejected because its M-Eval score is below the acceptable minimum of 6.5 required to serve to users. Please consider iterating on your model's ability to adhere to prompts to improve this score.
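The final status change corresponds to a simple threshold on the stay_in_character M-Eval score; a minimal sketch of that gate, with illustrative names rather than the platform's actual code:

```python
# Minimal sketch of the rejection rule described above. Names are illustrative.
M_EVAL_MINIMUM = 6.5

def final_status(m_eval_score: float) -> str:
    """Reject submissions whose stay_in_character M-Eval score is below the minimum."""
    return "deployed" if m_eval_score >= M_EVAL_MINIMUM else "rejected"
```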