submission_id: cgato-thespis-balanced-7b-v1_v3
developer_uid: chai_backend_admin
status: inactive
model_repo: cgato/Thespis-Balanced-7b-v1
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
generation_params: {'temperature': 0.9, 'top_p': 1.0, 'top_k': 50, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['\n', '</s>', '<|user|>', '###'], 'max_input_tokens': 512, 'best_of': 1, 'max_output_tokens': 64}
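The generation parameters above cap replies at 64 output tokens and truncate at the earliest stopping word. A minimal sketch of that stop-word truncation step, with a hypothetical helper name (the actual serving code is not shown in this record):

```python
GENERATION_PARAMS = {
    "temperature": 0.9,
    "top_p": 1.0,
    "top_k": 50,
    "presence_penalty": 0.5,
    "frequency_penalty": 0.5,
    "stopping_words": ["\n", "</s>", "<|user|>", "###"],
    "max_input_tokens": 512,
    "best_of": 1,
    "max_output_tokens": 64,
}

def truncate_at_stop(text: str, stopping_words) -> str:
    """Cut generated text at the earliest occurrence of any stopping word."""
    cut = len(text)
    for stop in stopping_words:
        idx = text.find(stop)
        if idx != -1:
            cut = min(cut, idx)
    return text[:cut]
```

With the configured stopping words, a completion that runs onto a new line is clipped to its first line, e.g. `truncate_at_stop("Hello there\nAlice: hi", GENERATION_PARAMS["stopping_words"])` yields `"Hello there"`.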
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:'}
reward_formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:'}
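The formatter templates above compose the model input as persona block, scenario, chat history, then a response cue. A sketch of how they might be assembled (the function and argument names here are illustrative, not taken from the actual pipeline):

```python
FORMATTER = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """Assemble persona, scenario, alternating chat turns, and response cue."""
    parts = [
        FORMATTER["memory_template"].format(bot_name=bot_name, memory=memory),
        FORMATTER["prompt_template"].format(prompt=prompt),
    ]
    for speaker, message in turns:
        if speaker == "bot":
            parts.append(FORMATTER["bot_template"].format(bot_name=bot_name, message=message))
        else:
            parts.append(FORMATTER["user_template"].format(user_name=user_name, message=message))
    # The prompt ends with "{bot_name}:" so the model continues in character.
    parts.append(FORMATTER["response_template"].format(bot_name=bot_name))
    return "".join(parts)
```

Because the same templates are listed for `formatter` and `reward_formatter`, the reward model scores candidates against an identically formatted context.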
timestamp: 2024-04-02T17:01:42+00:00
model_name: auto_submit_ricum_lodonorowa
model_eval_status: success
safety_score: 0.68
entertaining: 6.76
stay_in_character: 8.54
user_preference: 7.06
double_thumbs_up: 242
thumbs_up: 417
thumbs_down: 190
num_battles: 23934
num_wins: 10365
win_ratio: 0.43306593131110555
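The reported `win_ratio` is simply wins over battles, which the figures above reproduce:

```python
num_battles = 23934
num_wins = 10365

# 10365 / 23934 ≈ 0.4331, i.e. slightly below even odds in head-to-head battles
win_ratio = num_wins / num_battles
```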
celo_rating: 1106.4
Running pipeline stage MKMLizer
Starting job with name cgato-thespis-balanced-7b-v1-v3-mkmlizer
Waiting for job on cgato-thespis-balanced-7b-v1-v3-mkmlizer to finish
cgato-thespis-balanced-7b-v1-v3-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
cgato-thespis-balanced-7b-v1-v3-mkmlizer: ║ _____ __ __ ║
cgato-thespis-balanced-7b-v1-v3-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
cgato-thespis-balanced-7b-v1-v3-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
cgato-thespis-balanced-7b-v1-v3-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
cgato-thespis-balanced-7b-v1-v3-mkmlizer: ║ /___/ ║
cgato-thespis-balanced-7b-v1-v3-mkmlizer: ║ ║
cgato-thespis-balanced-7b-v1-v3-mkmlizer: ║ Version: 0.6.11 ║
cgato-thespis-balanced-7b-v1-v3-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
cgato-thespis-balanced-7b-v1-v3-mkmlizer: ║ ║
cgato-thespis-balanced-7b-v1-v3-mkmlizer: ║ The license key for the current software has been verified as ║
cgato-thespis-balanced-7b-v1-v3-mkmlizer: ║ belonging to: ║
cgato-thespis-balanced-7b-v1-v3-mkmlizer: ║ ║
cgato-thespis-balanced-7b-v1-v3-mkmlizer: ║ Chai Research Corp. ║
cgato-thespis-balanced-7b-v1-v3-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
cgato-thespis-balanced-7b-v1-v3-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
cgato-thespis-balanced-7b-v1-v3-mkmlizer: ║ ║
cgato-thespis-balanced-7b-v1-v3-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
cgato-thespis-balanced-7b-v1-v3-mkmlizer: .gitattributes: 100%|██████████| 1.52k/1.52k [00:00<00:00, 18.6MB/s]
cgato-thespis-balanced-7b-v1-v3-mkmlizer: README.md: 100%|██████████| 187/187 [00:00<00:00, 1.70MB/s]
cgato-thespis-balanced-7b-v1-v3-mkmlizer: config.json: 100%|██████████| 643/643 [00:00<00:00, 5.09MB/s]
cgato-thespis-balanced-7b-v1-v3-mkmlizer: generation_config.json: 100%|██████████| 132/132 [00:00<00:00, 2.09MB/s]
cgato-thespis-balanced-7b-v1-v3-mkmlizer: pytorch_model-00001-of-00003.bin: 100%|█████████▉| 4.94G/4.94G [00:07<00:00, 620MB/s]
cgato-thespis-balanced-7b-v1-v3-mkmlizer: pytorch_model-00002-of-00003.bin: 100%|█████████▉| 5.00G/5.00G [00:07<00:00, 636MB/s]
cgato-thespis-balanced-7b-v1-v3-mkmlizer: pytorch_model-00003-of-00003.bin: 100%|█████████▉| 4.54G/4.54G [00:07<00:00, 593MB/s]
cgato-thespis-balanced-7b-v1-v3-mkmlizer: pytorch_model.bin.index.json: 100%|██████████| 23.9k/23.9k [00:00<00:00, 109MB/s]
cgato-thespis-balanced-7b-v1-v3-mkmlizer: special_tokens_map.json: 100%|██████████| 437/437 [00:00<00:00, 4.53MB/s]
cgato-thespis-balanced-7b-v1-v3-mkmlizer: tokenizer.model: 100%|██████████| 493k/493k [00:00<00:00, 25.0MB/s]
cgato-thespis-balanced-7b-v1-v3-mkmlizer: tokenizer_config.json: 100%|██████████| 1.02k/1.02k [00:00<00:00, 11.8MB/s]
cgato-thespis-balanced-7b-v1-v3-mkmlizer: Downloaded to shared memory in 28.884s
cgato-thespis-balanced-7b-v1-v3-mkmlizer: quantizing model to /dev/shm/model_cache
cgato-thespis-balanced-7b-v1-v3-mkmlizer: Saving mkml model at /dev/shm/model_cache
cgato-thespis-balanced-7b-v1-v3-mkmlizer: Reading /tmp/tmp6hil81ow/pytorch_model.bin.index.json
cgato-thespis-balanced-7b-v1-v3-mkmlizer: Profiling: 100%|██████████| 291/291 [00:05<00:00, 48.58it/s]
cgato-thespis-balanced-7b-v1-v3-mkmlizer: quantized model in 17.203s
cgato-thespis-balanced-7b-v1-v3-mkmlizer: Processed model cgato/Thespis-Balanced-7b-v1 in 46.985s
cgato-thespis-balanced-7b-v1-v3-mkmlizer: creating bucket guanaco-mkml-models
cgato-thespis-balanced-7b-v1-v3-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
cgato-thespis-balanced-7b-v1-v3-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/cgato-thespis-balanced-7b-v1-v3
cgato-thespis-balanced-7b-v1-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/cgato-thespis-balanced-7b-v1-v3/tokenizer_config.json
cgato-thespis-balanced-7b-v1-v3-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/cgato-thespis-balanced-7b-v1-v3/config.json
cgato-thespis-balanced-7b-v1-v3-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/cgato-thespis-balanced-7b-v1-v3/special_tokens_map.json
cgato-thespis-balanced-7b-v1-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer.model s3://guanaco-mkml-models/cgato-thespis-balanced-7b-v1-v3/tokenizer.model
cgato-thespis-balanced-7b-v1-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/cgato-thespis-balanced-7b-v1-v3/tokenizer.json
cgato-thespis-balanced-7b-v1-v3-mkmlizer: cp /dev/shm/model_cache/mkml_model.tensors s3://guanaco-mkml-models/cgato-thespis-balanced-7b-v1-v3/mkml_model.tensors
cgato-thespis-balanced-7b-v1-v3-mkmlizer: tokenizer.json: 100%|██████████| 2.11M/2.11M [00:00<00:00, 20.0MB/s]
cgato-thespis-balanced-7b-v1-v3-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:472: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
cgato-thespis-balanced-7b-v1-v3-mkmlizer: warnings.warn(
cgato-thespis-balanced-7b-v1-v3-mkmlizer: pytorch_model.bin: 100%|█████████▉| 1.44G/1.44G [00:01<00:00, 1.02GB/s]
cgato-thespis-balanced-7b-v1-v3-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
cgato-thespis-balanced-7b-v1-v3-mkmlizer: Saving duration: 0.260s
cgato-thespis-balanced-7b-v1-v3-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 4.891s
cgato-thespis-balanced-7b-v1-v3-mkmlizer: creating bucket guanaco-reward-models
cgato-thespis-balanced-7b-v1-v3-mkmlizer: Bucket 's3://guanaco-reward-models/' created
cgato-thespis-balanced-7b-v1-v3-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/cgato-thespis-balanced-7b-v1-v3_reward
cgato-thespis-balanced-7b-v1-v3-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/cgato-thespis-balanced-7b-v1-v3_reward/special_tokens_map.json
cgato-thespis-balanced-7b-v1-v3-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/cgato-thespis-balanced-7b-v1-v3_reward/tokenizer_config.json
cgato-thespis-balanced-7b-v1-v3-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/cgato-thespis-balanced-7b-v1-v3_reward/vocab.json
cgato-thespis-balanced-7b-v1-v3-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/cgato-thespis-balanced-7b-v1-v3_reward/merges.txt
cgato-thespis-balanced-7b-v1-v3-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/cgato-thespis-balanced-7b-v1-v3_reward/config.json
cgato-thespis-balanced-7b-v1-v3-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/cgato-thespis-balanced-7b-v1-v3_reward/tokenizer.json
cgato-thespis-balanced-7b-v1-v3-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/cgato-thespis-balanced-7b-v1-v3_reward/reward.tensors
Job cgato-thespis-balanced-7b-v1-v3-mkmlizer completed after 136.3s with status: succeeded
Stopping job with name cgato-thespis-balanced-7b-v1-v3-mkmlizer
Pipeline stage MKMLizer completed in 139.51s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.16s
Running pipeline stage ISVCDeployer
Creating inference service cgato-thespis-balanced-7b-v1-v3
Waiting for inference service cgato-thespis-balanced-7b-v1-v3 to be ready
Retrying (%r) after connection broken by '%r': %s
Inference service cgato-thespis-balanced-7b-v1-v3 ready after 40.277381896972656s
Pipeline stage ISVCDeployer completed in 47.31s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.3355119228363037s
Received healthy response to inference request in 0.9136238098144531s
Received healthy response to inference request in 0.967648983001709s
Received healthy response to inference request in 0.8121259212493896s
Received healthy response to inference request in 0.4730536937713623s
5 requests
0 failed requests
5th percentile: 0.5408681392669678
10th percentile: 0.6086825847625732
20th percentile: 0.7443114757537842
30th percentile: 0.8324254989624024
40th percentile: 0.8730246543884277
50th percentile: 0.9136238098144531
60th percentile: 0.9352338790893555
70th percentile: 0.9568439483642578
80th percentile: 1.041221570968628
90th percentile: 1.188366746902466
95th percentile: 1.2619393348693846
99th percentile: 1.3207974052429199
mean time: 0.9003928661346435
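The StressChecker percentiles above are consistent with linear interpolation over the five sorted request latencies (the same "linear" method NumPy uses by default). A small sketch that reproduces them:

```python
def percentile(sorted_vals, p):
    """Percentile via linear interpolation between closest ranks."""
    k = (p / 100) * (len(sorted_vals) - 1)
    lo = int(k)
    hi = min(lo + 1, len(sorted_vals) - 1)
    return sorted_vals[lo] + (k - lo) * (sorted_vals[hi] - sorted_vals[lo])

# The five healthy-response latencies from the log, sorted ascending.
latencies = sorted([
    1.3355119228363037,
    0.9136238098144531,
    0.967648983001709,
    0.8121259212493896,
    0.4730536937713623,
])

p50 = percentile(latencies, 50)          # median of the five requests
mean_time = sum(latencies) / len(latencies)
```

For example, the 5th percentile interpolates 20% of the way from the fastest to the second-fastest request, giving the 0.5409s reported above.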
Pipeline stage StressChecker completed in 5.47s
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.23s
Running pipeline stage DaemonicSafetyScorer
Running M-Eval for topic stay_in_character
Pipeline stage DaemonicSafetyScorer completed in 0.06s
M-Eval Dataset for topic stay_in_character is loaded
cgato-thespis-balanced-7b-v1_v3 status is now deployed due to DeploymentManager action
cgato-thespis-balanced-7b-v1_v3 status is now inactive due to auto-deactivation of underperforming models

Usage Metrics

(usage charts not included in this text export)

Latency Metrics

(latency charts not included in this text export)