Running pipeline stage MKMLizer
Starting job with name deverdever-heavenly-goat-v7-v2-mkmlizer
Waiting for job on deverdever-heavenly-goat-v7-v2-mkmlizer to finish
deverdever-heavenly-goat-v7-v2-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
deverdever-heavenly-goat-v7-v2-mkmlizer: ║ _____ __ __ ║
deverdever-heavenly-goat-v7-v2-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
deverdever-heavenly-goat-v7-v2-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
deverdever-heavenly-goat-v7-v2-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
deverdever-heavenly-goat-v7-v2-mkmlizer: ║ /___/ ║
deverdever-heavenly-goat-v7-v2-mkmlizer: ║ ║
deverdever-heavenly-goat-v7-v2-mkmlizer: ║ Version: 0.6.11 ║
deverdever-heavenly-goat-v7-v2-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
deverdever-heavenly-goat-v7-v2-mkmlizer: ║ ║
deverdever-heavenly-goat-v7-v2-mkmlizer: ║ The license key for the current software has been verified as ║
deverdever-heavenly-goat-v7-v2-mkmlizer: ║ belonging to: ║
deverdever-heavenly-goat-v7-v2-mkmlizer: ║ ║
deverdever-heavenly-goat-v7-v2-mkmlizer: ║ Chai Research Corp. ║
deverdever-heavenly-goat-v7-v2-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
deverdever-heavenly-goat-v7-v2-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
deverdever-heavenly-goat-v7-v2-mkmlizer: ║ ║
deverdever-heavenly-goat-v7-v2-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
deverdever-heavenly-goat-v7-v2-mkmlizer:
.gitattributes: 100%|██████████| 1.52k/1.52k [00:00<00:00, 17.5MB/s]
deverdever-heavenly-goat-v7-v2-mkmlizer:
README.md: 100%|██████████| 21.0/21.0 [00:00<00:00, 274kB/s]
deverdever-heavenly-goat-v7-v2-mkmlizer:
added_tokens.json: 100%|██████████| 42.0/42.0 [00:00<00:00, 471kB/s]
deverdever-heavenly-goat-v7-v2-mkmlizer:
config.json: 100%|██████████| 617/617 [00:00<00:00, 4.90MB/s]
deverdever-heavenly-goat-v7-v2-mkmlizer:
generation_config.json: 100%|██████████| 111/111 [00:00<00:00, 1.06MB/s]
deverdever-heavenly-goat-v7-v2-mkmlizer:
pytorch_model-00001-of-00002.bin: 100%|█████████▉| 9.94G/9.94G [00:11<00:00, 890MB/s]
deverdever-heavenly-goat-v7-v2-mkmlizer:
pytorch_model-00002-of-00002.bin: 100%|█████████▉| 4.54G/4.54G [00:07<00:00, 634MB/s]
deverdever-heavenly-goat-v7-v2-mkmlizer:
pytorch_model.bin.index.json: 100%|██████████| 23.9k/23.9k [00:00<00:00, 152MB/s]
deverdever-heavenly-goat-v7-v2-mkmlizer:
special_tokens_map.json: 100%|██████████| 145/145 [00:00<00:00, 1.54MB/s]
deverdever-heavenly-goat-v7-v2-mkmlizer:
tokenizer.json: 100%|██████████| 1.80M/1.80M [00:00<00:00, 36.1MB/s]
deverdever-heavenly-goat-v7-v2-mkmlizer:
tokenizer.model: 100%|██████████| 493k/493k [00:00<00:00, 44.8MB/s]
deverdever-heavenly-goat-v7-v2-mkmlizer:
tokenizer_config.json: 100%|██████████| 953/953 [00:00<00:00, 12.4MB/s]
deverdever-heavenly-goat-v7-v2-mkmlizer: Downloaded to shared memory in 19.900s
deverdever-heavenly-goat-v7-v2-mkmlizer: quantizing model to /dev/shm/model_cache
deverdever-heavenly-goat-v7-v2-mkmlizer: Saving mkml model at /dev/shm/model_cache
deverdever-heavenly-goat-v7-v2-mkmlizer: Reading /tmp/tmpu6jc2go5/pytorch_model.bin.index.json
deverdever-heavenly-goat-v7-v2-mkmlizer:
Profiling: 100%|██████████| 291/291 [00:06<00:00, 47.80it/s]
deverdever-heavenly-goat-v7-v2-mkmlizer: quantized model in 17.356s
deverdever-heavenly-goat-v7-v2-mkmlizer: creating bucket guanaco-mkml-models
deverdever-heavenly-goat-v7-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
deverdever-heavenly-goat-v7-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/deverdever-heavenly-goat-v7-v2
deverdever-heavenly-goat-v7-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/deverdever-heavenly-goat-v7-v2/tokenizer_config.json
deverdever-heavenly-goat-v7-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/deverdever-heavenly-goat-v7-v2/special_tokens_map.json
deverdever-heavenly-goat-v7-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/deverdever-heavenly-goat-v7-v2/tokenizer.json
deverdever-heavenly-goat-v7-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.model s3://guanaco-mkml-models/deverdever-heavenly-goat-v7-v2/tokenizer.model
deverdever-heavenly-goat-v7-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/deverdever-heavenly-goat-v7-v2/config.json
deverdever-heavenly-goat-v7-v2-mkmlizer: cp /dev/shm/model_cache/mkml_model.tensors s3://guanaco-mkml-models/deverdever-heavenly-goat-v7-v2/mkml_model.tensors
deverdever-heavenly-goat-v7-v2-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
deverdever-heavenly-goat-v7-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:1067: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
deverdever-heavenly-goat-v7-v2-mkmlizer: warnings.warn(
deverdever-heavenly-goat-v7-v2-mkmlizer:
config.json: 100%|██████████| 1.05k/1.05k [00:00<00:00, 8.89MB/s]
deverdever-heavenly-goat-v7-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:690: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
deverdever-heavenly-goat-v7-v2-mkmlizer: warnings.warn(
deverdever-heavenly-goat-v7-v2-mkmlizer:
tokenizer_config.json: 100%|██████████| 234/234 [00:00<00:00, 1.99MB/s]
deverdever-heavenly-goat-v7-v2-mkmlizer:
vocab.json: 100%|██████████| 1.04M/1.04M [00:00<00:00, 20.3MB/s]
deverdever-heavenly-goat-v7-v2-mkmlizer:
tokenizer.json: 100%|██████████| 2.11M/2.11M [00:00<00:00, 41.3MB/s]
deverdever-heavenly-goat-v7-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:472: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
deverdever-heavenly-goat-v7-v2-mkmlizer: warnings.warn(
deverdever-heavenly-goat-v7-v2-mkmlizer:
pytorch_model.bin: 100%|█████████▉| 1.44G/1.44G [00:02<00:00, 558MB/s]
deverdever-heavenly-goat-v7-v2-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
deverdever-heavenly-goat-v7-v2-mkmlizer: Saving duration: 0.271s
deverdever-heavenly-goat-v7-v2-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 6.515s
deverdever-heavenly-goat-v7-v2-mkmlizer: creating bucket guanaco-reward-models
deverdever-heavenly-goat-v7-v2-mkmlizer: Bucket 's3://guanaco-reward-models/' created
deverdever-heavenly-goat-v7-v2-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/deverdever-heavenly-goat-v7-v2_reward
deverdever-heavenly-goat-v7-v2-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/deverdever-heavenly-goat-v7-v2_reward/special_tokens_map.json
deverdever-heavenly-goat-v7-v2-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/deverdever-heavenly-goat-v7-v2_reward/tokenizer_config.json
deverdever-heavenly-goat-v7-v2-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/deverdever-heavenly-goat-v7-v2_reward/config.json
deverdever-heavenly-goat-v7-v2-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/deverdever-heavenly-goat-v7-v2_reward/vocab.json
deverdever-heavenly-goat-v7-v2-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/deverdever-heavenly-goat-v7-v2_reward/merges.txt
deverdever-heavenly-goat-v7-v2-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/deverdever-heavenly-goat-v7-v2_reward/tokenizer.json
deverdever-heavenly-goat-v7-v2-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/deverdever-heavenly-goat-v7-v2_reward/reward.tensors
Job deverdever-heavenly-goat-v7-v2-mkmlizer completed after 64.26s with status: succeeded
Stopping job with name deverdever-heavenly-goat-v7-v2-mkmlizer
Pipeline stage MKMLizer completed in 69.45s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service deverdever-heavenly-goat-v7-v2
Waiting for inference service deverdever-heavenly-goat-v7-v2 to be ready
Inference service deverdever-heavenly-goat-v7-v2 ready after 40.32s
Pipeline stage ISVCDeployer completed in 48.04s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.695s
Received healthy response to inference request in 1.168s
Received healthy response to inference request in 1.142s
Received healthy response to inference request in 1.179s
Received healthy response to inference request in 1.153s
5 requests
0 failed requests
5th percentile: 1.144s
10th percentile: 1.146s
20th percentile: 1.151s
30th percentile: 1.156s
40th percentile: 1.162s
50th percentile: 1.168s
60th percentile: 1.173s
70th percentile: 1.177s
80th percentile: 1.283s
90th percentile: 1.489s
95th percentile: 1.592s
99th percentile: 1.674s
mean time: 1.267s
Pipeline stage StressChecker completed in 7.23s
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.04s
Running pipeline stage DaemonicSafetyScorer
Running M-Eval for topic stay_in_character
Pipeline stage DaemonicSafetyScorer completed in 0.05s
M-Eval Dataset for topic stay_in_character is loaded
deverdever-heavenly-goat-v7_v2 status is now deployed due to DeploymentManager action
deverdever-heavenly-goat-v7_v2 status is now inactive due to auto-deactivation of underperforming models
admin requested tearing down of deverdever-heavenly-goat-v7_v2
Running pipeline stage ISVCDeleter
Checking if service deverdever-heavenly-goat-v7-v2 is running
admin requested tearing down of deverdever-heavenly-goat-v8_v1
Running pipeline stage ISVCDeleter
Checking if service deverdever-heavenly-goat-v8-v1 is running
Tearing down inference service deverdever-heavenly-goat-v7-v2
Tore down service deverdever-heavenly-goat-v7-v2
Pipeline stage ISVCDeleter completed in 3.81s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Tearing down inference service deverdever-heavenly-goat-v8-v1
Deleting key deverdever-heavenly-goat-v7-v2/config.json from bucket guanaco-mkml-models
Tore down service deverdever-heavenly-goat-v8-v1
Deleting key deverdever-heavenly-goat-v7-v2/mkml_model.tensors from bucket guanaco-mkml-models
Pipeline stage ISVCDeleter completed in 4.57s
Running pipeline stage MKMLModelDeleter
Deleting key deverdever-heavenly-goat-v7-v2/special_tokens_map.json from bucket guanaco-mkml-models
Cleaning model data from S3
Deleting key deverdever-heavenly-goat-v7-v2/tokenizer.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key deverdever-heavenly-goat-v7-v2/tokenizer.model from bucket guanaco-mkml-models
Deleting key deverdever-heavenly-goat-v7-v2/tokenizer_config.json from bucket guanaco-mkml-models
Deleting key deverdever-heavenly-goat-v8-v1/added_tokens.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key deverdever-heavenly-goat-v8-v1/config.json from bucket guanaco-mkml-models
Deleting key deverdever-heavenly-goat-v8-v1/mkml_model.tensors from bucket guanaco-mkml-models
Deleting key deverdever-heavenly-goat-v7-v2_reward/config.json from bucket guanaco-reward-models
Deleting key deverdever-heavenly-goat-v7-v2_reward/merges.txt from bucket guanaco-reward-models
Deleting key deverdever-heavenly-goat-v7-v2_reward/reward.tensors from bucket guanaco-reward-models
Deleting key deverdever-heavenly-goat-v8-v1/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key deverdever-heavenly-goat-v7-v2_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key deverdever-heavenly-goat-v8-v1/tokenizer.json from bucket guanaco-mkml-models
Deleting key deverdever-heavenly-goat-v7-v2_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key deverdever-heavenly-goat-v8-v1/tokenizer.model from bucket guanaco-mkml-models
Deleting key deverdever-heavenly-goat-v7-v2_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key deverdever-heavenly-goat-v8-v1/tokenizer_config.json from bucket guanaco-mkml-models
Deleting key deverdever-heavenly-goat-v7-v2_reward/vocab.json from bucket guanaco-reward-models
Cleaning model data from model cache
Pipeline stage MKMLModelDeleter completed in 3.50s
Deleting key deverdever-heavenly-goat-v8-v1_reward/config.json from bucket guanaco-reward-models
Deleting key deverdever-heavenly-goat-v8-v1_reward/merges.txt from bucket guanaco-reward-models
Deleting key deverdever-heavenly-goat-v8-v1_reward/reward.tensors from bucket guanaco-reward-models
deverdever-heavenly-goat-v7_v2 status is now torndown due to DeploymentManager action
Deleting key deverdever-heavenly-goat-v8-v1_reward/special_tokens_map.json from bucket guanaco-reward-models