submission_id: hflserdaniel-chai-s6-13_2318_v18
developer_uid: chai_backend_admin
status: torndown
model_repo: hflserdaniel/chai_s6_13b_slp3
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
generation_params: {'temperature': 0.9, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 50, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['\n', '</s>', '<|user|>', '###'], 'max_input_tokens': 512, 'best_of': 1, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
timestamp: 2024-04-02T07:35:51+00:00
model_name: auto_submit_ricum_wipucesega
model_eval_status: success
model_group: hflserdaniel/chai_s6_13b
num_battles: 5579
num_wins: 2441
celo_rating: 1118.37
propriety_score: 0.0
propriety_total_count: 0.0
submission_type: basic
model_architecture: LlamaForCausalLM
model_num_parameters: 13015864320.0
best_of: 1
max_input_tokens: 512
max_output_tokens: 64
display_name: auto_submit_ricum_wipucesega
ineligible_reason: propriety_total_count < 800
language_model: hflserdaniel/chai_s6_13b_slp3
model_size: 13B
reward_model: ChaiML/reward_gpt2_medium_preference_24m_e2
us_pacific_date: 2024-04-02
win_ratio: 0.4375336081735078
preference_data_url: None
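For reference, the generation_params above map directly onto the sampling options exposed by common open-source inference engines. The sketch below expresses them with vLLM's SamplingParams purely as an illustration; the log does not say which engine the MKML runtime actually uses, so this is not Chai's serving code.

# Illustration only: the submission's generation_params expressed as vLLM sampling
# parameters (assumes a recent vLLM release that supports min_p).
from vllm import SamplingParams

sampling_params = SamplingParams(
    temperature=0.9,
    top_p=1.0,
    min_p=0.0,
    top_k=50,
    presence_penalty=0.5,
    frequency_penalty=0.5,
    stop=["\n", "</s>", "<|user|>", "###"],  # stopping_words
    max_tokens=64,                           # max_output_tokens
    best_of=1,
)
# max_input_tokens=512 has no direct per-request equivalent here; it would
# presumably be enforced by truncating the prompt before generation.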
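The formatter and reward_formatter entries are plain string templates. Below is a minimal sketch of how they could be assembled into a prompt; the ordering (persona, then prompt, then chat history, then the response prefix) is an assumption based on the template names and the common Pygmalion-style layout, not something stated in this record.

# Hypothetical prompt assembly from the formatter templates above.
def build_prompt(memory, prompt, turns, bot_name, user_name):
    # turns: list of (speaker, message) pairs, where speaker is "bot" or "user"
    text = "{bot_name}'s Persona: {memory}\n####\n".format(bot_name=bot_name, memory=memory)
    text += "{prompt}\n<START>\n".format(prompt=prompt)
    for speaker, message in turns:
        if speaker == "bot":
            text += "{bot_name}: {message}\n".format(bot_name=bot_name, message=message)
        else:
            text += "{user_name}: {message}\n".format(user_name=user_name, message=message)
    text += "{bot_name}:".format(bot_name=bot_name)  # response_template: the model completes from here
    return text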
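Two of the derived fields can be reproduced from the raw counts; a small sketch (the 800-sample threshold is taken from the ineligible_reason field above):

# Reproducing win_ratio and the eligibility check from the values listed above.
num_battles = 5579
num_wins = 2441
win_ratio = num_wins / num_battles        # ~0.43753, matches the win_ratio field

propriety_total_count = 0
MIN_PROPRIETY_COUNT = 800                 # threshold quoted in ineligible_reason
eligible = propriety_total_count >= MIN_PROPRIETY_COUNT  # False -> ineligible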
Running pipeline stage MKMLizer
Starting job with name hflserdaniel-chai-s6-13-2318-v18-mkmlizer
Waiting for job on hflserdaniel-chai-s6-13-2318-v18-mkmlizer to finish
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: ║ _____ __ __ ║
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: ║ /___/ ║
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: ║ ║
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: ║ Version: 0.6.11 ║
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: ║ ║
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: ║ The license key for the current software has been verified as ║
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: ║ belonging to: ║
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: ║ ║
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: ║ Chai Research Corp. ║
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: ║ ║
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: .gitattributes: 100%|██████████| 1.52k/1.52k [00:00<00:00, 19.7MB/s]
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: README.md: 100%|██████████| 949/949 [00:00<00:00, 9.34MB/s]
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: added_tokens.json: 100%|██████████| 21.0/21.0 [00:00<00:00, 191kB/s]
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: config.json: 100%|██████████| 738/738 [00:00<00:00, 12.3MB/s]
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: mergekit_config.yml: 100%|██████████| 370/370 [00:00<00:00, 3.32MB/s]
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: model-00002-of-00003.safetensors: 100%|█████████▉| 9.90G/9.90G [00:04<00:00, 2.02GB/s]
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: model-00003-of-00003.safetensors: 100%|█████████▉| 6.18G/6.18G [00:03<00:00, 1.86GB/s]
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: model.safetensors.index.json: 100%|██████████| 28.4k/28.4k [00:00<00:00, 177MB/s]
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: special_tokens_map.json: 100%|██████████| 549/549 [00:00<00:00, 4.51MB/s]
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: tokenizer.json: 100%|██████████| 1.84M/1.84M [00:00<00:00, 38.1MB/s]
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: tokenizer.model: 100%|██████████| 500k/500k [00:00<00:00, 58.6MB/s]
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: tokenizer_config.json: 100%|██████████| 1.12k/1.12k [00:00<00:00, 13.3MB/s]
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: Downloaded to shared memory in 15.526s
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: quantizing model to /dev/shm/model_cache
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: Saving mkml model at /dev/shm/model_cache
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: Reading /tmp/tmpao9i6egl/model.safetensors.index.json
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: Profiling: 100%|██████████| 363/363 [00:06<00:00, 57.80it/s]
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: quantized model in 23.309s
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: Processed model hflserdaniel/chai_s6_13b_slp3 in 40.264s
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: creating bucket guanaco-mkml-models
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/hflserdaniel-chai-s6-13-2318-v18
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/hflserdaniel-chai-s6-13-2318-v18/config.json
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: cp /dev/shm/model_cache/tokenizer.model s3://guanaco-mkml-models/hflserdaniel-chai-s6-13-2318-v18/tokenizer.model
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/hflserdaniel-chai-s6-13-2318-v18/tokenizer.json
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/hflserdaniel-chai-s6-13-2318-v18/special_tokens_map.json
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: cp /dev/shm/model_cache/added_tokens.json s3://guanaco-mkml-models/hflserdaniel-chai-s6-13-2318-v18/added_tokens.json
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/hflserdaniel-chai-s6-13-2318-v18/tokenizer_config.json
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: cp /dev/shm/model_cache/mkml_model.tensors s3://guanaco-mkml-models/hflserdaniel-chai-s6-13-2318-v18/mkml_model.tensors
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:1067: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: warnings.warn(
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: config.json: 100%|██████████| 1.05k/1.05k [00:00<00:00, 12.6MB/s]
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:690: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: warnings.warn(
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: tokenizer_config.json: 100%|██████████| 234/234 [00:00<00:00, 2.81MB/s]
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: vocab.json: 100%|██████████| 1.04M/1.04M [00:00<00:00, 11.3MB/s]
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: tokenizer.json: 100%|██████████| 2.11M/2.11M [00:00<00:00, 9.00MB/s]
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:472: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: warnings.warn(
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: pytorch_model.bin: 100%|█████████▉| 1.44G/1.44G [00:00<00:00, 1.48GB/s]
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: Saving duration: 0.220s
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 4.165s
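The FutureWarnings above come from transformers deprecating the use_auth_token argument in favor of token. A minimal sketch of the non-deprecated call pattern follows; the Auto class and token value are placeholders, since the log does not show how the reward model is actually instantiated.

# Sketch only: loading the reward model without the deprecated use_auth_token argument.
from transformers import AutoModel, AutoTokenizer

model_id = "ChaiML/reward_gpt2_medium_preference_24m_e2"
hf_token = "hf_..."  # placeholder for a real Hugging Face access token

# Old (deprecated, triggers the FutureWarning): from_pretrained(model_id, use_auth_token=hf_token)
tokenizer = AutoTokenizer.from_pretrained(model_id, token=hf_token)
reward_model = AutoModel.from_pretrained(model_id, token=hf_token)  # concrete head class not shown in the log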
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: creating bucket guanaco-reward-models
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: Bucket 's3://guanaco-reward-models/' created
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/hflserdaniel-chai-s6-13-2318-v18_reward
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/hflserdaniel-chai-s6-13-2318-v18_reward/config.json
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/hflserdaniel-chai-s6-13-2318-v18_reward/special_tokens_map.json
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/hflserdaniel-chai-s6-13-2318-v18_reward/tokenizer_config.json
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/hflserdaniel-chai-s6-13-2318-v18_reward/merges.txt
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/hflserdaniel-chai-s6-13-2318-v18_reward/vocab.json
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/hflserdaniel-chai-s6-13-2318-v18_reward/tokenizer.json
hflserdaniel-chai-s6-13-2318-v18-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/hflserdaniel-chai-s6-13-2318-v18_reward/reward.tensors
Job hflserdaniel-chai-s6-13-2318-v18-mkmlizer completed after 74.45s with status: succeeded
Stopping job with name hflserdaniel-chai-s6-13-2318-v18-mkmlizer
Pipeline stage MKMLizer completed in 77.87s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.11s
Running pipeline stage ISVCDeployer
Creating inference service hflserdaniel-chai-s6-13-2318-v18
Waiting for inference service hflserdaniel-chai-s6-13-2318-v18 to be ready
Inference service hflserdaniel-chai-s6-13-2318-v18 ready after 251.250727891922s
Pipeline stage ISVCDeployer completed in 258.36s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.5637755393981934s
Received healthy response to inference request in 0.9876484870910645s
Received healthy response to inference request in 0.5874133110046387s
Received healthy response to inference request in 1.1142828464508057s
Received healthy response to inference request in 1.0934622287750244s
5 requests
0 failed requests
5th percentile: 0.6674603462219239
10th percentile: 0.747507381439209
20th percentile: 0.9076014518737793
30th percentile: 1.0088112354278564
40th percentile: 1.0511367321014404
50th percentile: 1.0934622287750244
60th percentile: 1.101790475845337
70th percentile: 1.1101187229156495
80th percentile: 1.2041813850402834
90th percentile: 1.3839784622192384
95th percentile: 1.4738770008087156
99th percentile: 1.5457958316802978
mean time: 1.0693164825439454
Pipeline stage StressChecker completed in 6.15s
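The StressChecker percentiles above are consistent with linear interpolation over the five sorted latencies (the default behavior of numpy.percentile); the small sketch below reproduces the summary statistics, without claiming this is the checker's actual implementation.

# Reproducing the StressChecker summary statistics from the five response times above.
import numpy as np

latencies = [
    1.5637755393981934,
    0.9876484870910645,
    0.5874133110046387,
    1.1142828464508057,
    1.0934622287750244,
]
for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {np.percentile(latencies, q)}")  # linear interpolation between sorted samples
print(f"mean time: {np.mean(latencies)}")                      # ~1.0693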
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.04s
Running pipeline stage DaemonicSafetyScorer
Running M-Eval for topic stay_in_character
Pipeline stage DaemonicSafetyScorer completed in 0.04s
M-Eval Dataset for topic stay_in_character is loaded
hflserdaniel-chai-s6-13_2318_v18 status is now deployed due to DeploymentManager action
hflserdaniel-chai-s6-13_2318_v18 status is now inactive due to auto deactivation (removal of underperforming models)
admin requested tearing down of hflserdaniel-chai-s6-13_2318_v18
Running pipeline stage ISVCDeleter
Checking if service hflserdaniel-chai-s6-13-2318-v18 is running
Tearing down inference service hflserdaniel-chai-s6-13-2318-v18
Tore down service hflserdaniel-chai-s6-13-2318-v18
Pipeline stage ISVCDeleter completed in 4.94s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key hflserdaniel-chai-s6-13-2318-v18/added_tokens.json from bucket guanaco-mkml-models
Deleting key hflserdaniel-chai-s6-13-2318-v18/config.json from bucket guanaco-mkml-models
Deleting key hflserdaniel-chai-s6-13-2318-v18/mkml_model.tensors from bucket guanaco-mkml-models
Deleting key hflserdaniel-chai-s6-13-2318-v18/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key hflserdaniel-chai-s6-13-2318-v18/tokenizer.json from bucket guanaco-mkml-models
Deleting key hflserdaniel-chai-s6-13-2318-v18/tokenizer.model from bucket guanaco-mkml-models
Deleting key hflserdaniel-chai-s6-13-2318-v18/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key hflserdaniel-chai-s6-13-2318-v18_reward/config.json from bucket guanaco-reward-models
Deleting key hflserdaniel-chai-s6-13-2318-v18_reward/merges.txt from bucket guanaco-reward-models
Deleting key hflserdaniel-chai-s6-13-2318-v18_reward/reward.tensors from bucket guanaco-reward-models
Deleting key hflserdaniel-chai-s6-13-2318-v18_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key hflserdaniel-chai-s6-13-2318-v18_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key hflserdaniel-chai-s6-13-2318-v18_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key hflserdaniel-chai-s6-13-2318-v18_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 5.24s
hflserdaniel-chai-s6-13_2318_v18 status is now torndown due to DeploymentManager action

Usage Metrics

Latency Metrics