submission_id: undi95-bigl-7b_v2
developer_uid: Undi95
status: inactive
model_repo: Undi95/BigL-7B
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '\n###', '</s>'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
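The sampling settings above (temperature 1.0, top_k 40, top_p 1.0) can be illustrated with a small, self-contained sketch of temperature/top-k/nucleus filtering. This is a hypothetical helper for intuition only, not the serving stack's actual implementation; `filter_logits` and its signature are assumptions.

```python
import math

def filter_logits(logits, temperature=1.0, top_k=40, top_p=1.0):
    """Apply temperature scaling, then top-k, then nucleus (top-p)
    filtering, and return a renormalised probability distribution.
    Illustrative sketch only -- not the production sampler."""
    # Temperature: divide logits before softmax (1.0 leaves them unchanged).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Top-k: keep only the k most probable tokens.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    keep = set(order[:top_k])

    # Top-p: within that set, keep the smallest prefix whose mass >= top_p.
    mass, nucleus = 0.0, set()
    for i in order:
        if i not in keep:
            continue
        nucleus.add(i)
        mass += probs[i]
        if mass >= top_p:
            break

    # Zero out everything else and renormalise.
    filtered = [p if i in nucleus else 0.0 for i, p in enumerate(probs)]
    z = sum(filtered)
    return [p / z for p in filtered]
```

With top_p=1.0 (as configured above) nucleus filtering is a no-op, so only the top-k cutoff constrains sampling; `best_of: 16` then means 16 such samples are drawn and ranked by the reward model.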
formatter: {'memory_template': "<s>### Instruction:\nContinue the following conversation in a natural way. Think before you reply, keep the conversation entertaining and imaginative. Engage in *roleplay* actions.\n{bot_name}'s Persona: {memory}\n</s>", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '###Response:\n{bot_name}: {message}\n', 'user_template': '### Input:\n{user_name}: {message}\n', 'response_template': '### Response:\n{bot_name}:'}
reward_formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:'}
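The reward_formatter templates above are plain placeholder strings; a minimal sketch of how they might be assembled into the reward model's input follows. The assembly order (memory, prompt, turns, response stub) and the `build_reward_input` helper are assumptions for illustration, not the pipeline's actual code.

```python
# Templates copied verbatim from the reward_formatter field above.
reward_formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_reward_input(bot_name, memory, prompt, turns):
    """Assemble one reward-model input string.

    turns: list of (role, speaker_name, message) tuples,
    where role is "user" or "bot". Hypothetical helper.
    """
    out = reward_formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    out += reward_formatter["prompt_template"].format(prompt=prompt)
    for role, name, message in turns:
        if role == "bot":
            out += reward_formatter["bot_template"].format(bot_name=name, message=message)
        else:
            out += reward_formatter["user_template"].format(user_name=name, message=message)
    # The response template ends with "{bot_name}:" so the reward model
    # scores candidate completions appended after this stub.
    return out + reward_formatter["response_template"].format(bot_name=bot_name)
```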
timestamp: 2024-03-25T09:40:00+00:00
model_name: undi95-bigl-7b_v2
model_eval_status: success
safety_score: 0.96
entertaining: 7.24
stay_in_character: 8.73
user_preference: 7.5
double_thumbs_up: 426
thumbs_up: 569
thumbs_down: 261
num_battles: 67858
num_wins: 35065
win_ratio: 0.5167408411683221
celo_rating: 1169.13
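The win_ratio field is derivable from the two counts above; a quick consistency check (values copied from the fields):

```python
num_battles = 67858
num_wins = 35065

# win_ratio is simply wins over total battles.
win_ratio = num_wins / num_battles
```

This reproduces the reported 0.5167408411683221, i.e. the model won just over half of its head-to-head battles.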
Running pipeline stage MKMLizer
Starting job with name undi95-bigl-7b-v2-mkmlizer
Waiting for job on undi95-bigl-7b-v2-mkmlizer to finish
undi95-bigl-7b-v2-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
undi95-bigl-7b-v2-mkmlizer: ║ _____ __ __ ║
undi95-bigl-7b-v2-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
undi95-bigl-7b-v2-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
undi95-bigl-7b-v2-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
undi95-bigl-7b-v2-mkmlizer: ║ /___/ ║
undi95-bigl-7b-v2-mkmlizer: ║ ║
undi95-bigl-7b-v2-mkmlizer: ║ Version: 0.6.11 ║
undi95-bigl-7b-v2-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
undi95-bigl-7b-v2-mkmlizer: ║ ║
undi95-bigl-7b-v2-mkmlizer: ║ The license key for the current software has been verified as ║
undi95-bigl-7b-v2-mkmlizer: ║ belonging to: ║
undi95-bigl-7b-v2-mkmlizer: ║ ║
undi95-bigl-7b-v2-mkmlizer: ║ Chai Research Corp. ║
undi95-bigl-7b-v2-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
undi95-bigl-7b-v2-mkmlizer: ║ Expiration: 2024-04-15 23:59:59 ║
undi95-bigl-7b-v2-mkmlizer: ║ ║
undi95-bigl-7b-v2-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
undi95-bigl-7b-v2-mkmlizer: .gitattributes: 100%|██████████| 1.52k/1.52k [00:00<00:00, 17.6MB/s]
undi95-bigl-7b-v2-mkmlizer: config.json: 100%|██████████| 653/653 [00:00<00:00, 7.89MB/s]
undi95-bigl-7b-v2-mkmlizer: model-00001-of-00002.safetensors: 100%|█████████▉| 9.94G/9.94G [00:10<00:00, 950MB/s]
undi95-bigl-7b-v2-mkmlizer: model-00002-of-00002.safetensors: 100%|█████████▉| 4.54G/4.54G [00:06<00:00, 743MB/s]
undi95-bigl-7b-v2-mkmlizer: model.safetensors.index.json: 100%|██████████| 22.8k/22.8k [00:00<00:00, 5.89MB/s]
undi95-bigl-7b-v2-mkmlizer: special_tokens_map.json: 100%|██████████| 414/414 [00:00<00:00, 3.21MB/s]
undi95-bigl-7b-v2-mkmlizer: tokenizer.json: 100%|██████████| 1.80M/1.80M [00:00<00:00, 7.13MB/s]
undi95-bigl-7b-v2-mkmlizer: tokenizer.model: 100%|██████████| 493k/493k [00:00<00:00, 2.32MB/s]
undi95-bigl-7b-v2-mkmlizer: tokenizer_config.json: 100%|██████████| 960/960 [00:00<00:00, 11.4MB/s]
undi95-bigl-7b-v2-mkmlizer: Downloaded to shared memory in 21.205s
undi95-bigl-7b-v2-mkmlizer: quantizing model to /dev/shm/model_cache
undi95-bigl-7b-v2-mkmlizer: Saving mkml model at /dev/shm/model_cache
undi95-bigl-7b-v2-mkmlizer: Reading /tmp/tmplgnk5cf7/model.safetensors.index.json
undi95-bigl-7b-v2-mkmlizer: Profiling: 100%|██████████| 291/291 [00:05<00:00, 56.53it/s]
undi95-bigl-7b-v2-mkmlizer: quantized model in 16.915s
undi95-bigl-7b-v2-mkmlizer: Processed model Undi95/BigL-7B in 39.083s
undi95-bigl-7b-v2-mkmlizer: creating bucket guanaco-mkml-models
undi95-bigl-7b-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
undi95-bigl-7b-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/undi95-bigl-7b-v2
undi95-bigl-7b-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/undi95-bigl-7b-v2/config.json
undi95-bigl-7b-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/undi95-bigl-7b-v2/special_tokens_map.json
undi95-bigl-7b-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/undi95-bigl-7b-v2/tokenizer_config.json
undi95-bigl-7b-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.model s3://guanaco-mkml-models/undi95-bigl-7b-v2/tokenizer.model
undi95-bigl-7b-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/undi95-bigl-7b-v2/tokenizer.json
undi95-bigl-7b-v2-mkmlizer: cp /dev/shm/model_cache/mkml_model.tensors s3://guanaco-mkml-models/undi95-bigl-7b-v2/mkml_model.tensors
undi95-bigl-7b-v2-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
undi95-bigl-7b-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:1067: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
undi95-bigl-7b-v2-mkmlizer: warnings.warn(
undi95-bigl-7b-v2-mkmlizer: config.json: 100%|██████████| 1.05k/1.05k [00:00<00:00, 12.6MB/s]
undi95-bigl-7b-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:690: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
undi95-bigl-7b-v2-mkmlizer: warnings.warn(
undi95-bigl-7b-v2-mkmlizer: tokenizer_config.json: 100%|██████████| 234/234 [00:00<00:00, 2.73MB/s]
undi95-bigl-7b-v2-mkmlizer: vocab.json: 100%|██████████| 1.04M/1.04M [00:00<00:00, 4.43MB/s]
undi95-bigl-7b-v2-mkmlizer: tokenizer.json: 100%|██████████| 2.11M/2.11M [00:00<00:00, 15.9MB/s]
undi95-bigl-7b-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:472: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
undi95-bigl-7b-v2-mkmlizer: warnings.warn(
undi95-bigl-7b-v2-mkmlizer: pytorch_model.bin: 100%|█████████▉| 1.44G/1.44G [00:04<00:00, 335MB/s]
undi95-bigl-7b-v2-mkmlizer: creating bucket guanaco-reward-models
undi95-bigl-7b-v2-mkmlizer: Bucket 's3://guanaco-reward-models/' created
undi95-bigl-7b-v2-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/undi95-bigl-7b-v2_reward
undi95-bigl-7b-v2-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/undi95-bigl-7b-v2_reward/config.json
undi95-bigl-7b-v2-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/undi95-bigl-7b-v2_reward/special_tokens_map.json
undi95-bigl-7b-v2-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/undi95-bigl-7b-v2_reward/tokenizer_config.json
undi95-bigl-7b-v2-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/undi95-bigl-7b-v2_reward/merges.txt
undi95-bigl-7b-v2-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/undi95-bigl-7b-v2_reward/vocab.json
undi95-bigl-7b-v2-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/undi95-bigl-7b-v2_reward/tokenizer.json
undi95-bigl-7b-v2-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/undi95-bigl-7b-v2_reward/reward.tensors
Job undi95-bigl-7b-v2-mkmlizer completed after 74.94s with status: succeeded
Stopping job with name undi95-bigl-7b-v2-mkmlizer
Pipeline stage MKMLizer completed in 78.81s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.12s
Running pipeline stage ISVCDeployer
Creating inference service undi95-bigl-7b-v2
Waiting for inference service undi95-bigl-7b-v2 to be ready
Inference service undi95-bigl-7b-v2 ready after 40.20610070228577s
Pipeline stage ISVCDeployer completed in 47.71s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.7006890773773193s
Received healthy response to inference request in 1.2066590785980225s
Received healthy response to inference request in 1.1894328594207764s
Received healthy response to inference request in 1.187743902206421s
Received healthy response to inference request in 1.198122501373291s
5 requests
0 failed requests
5th percentile: 1.188081693649292
10th percentile: 1.188419485092163
20th percentile: 1.1890950679779053
30th percentile: 1.1911707878112794
40th percentile: 1.1946466445922852
50th percentile: 1.198122501373291
60th percentile: 1.2015371322631836
70th percentile: 1.2049517631530762
80th percentile: 1.305465078353882
90th percentile: 1.5030770778656006
95th percentile: 1.6018830776214599
99th percentile: 1.6809278774261474
mean time: 1.296529483795166
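The percentile figures above are reproducible from the five raw latencies using linear-interpolation percentiles (numpy's default "linear" method); whether the pipeline actually uses numpy is an assumption. A pure-Python sketch:

```python
def percentile(samples, p):
    """Linear-interpolation percentile over a small sample.

    Equivalent to numpy.percentile(..., method="linear");
    hypothetical re-implementation of the stress-check maths.
    """
    xs = sorted(samples)
    rank = (len(xs) - 1) * p / 100.0     # fractional rank into sorted data
    lo = int(rank)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (rank - lo) * (xs[hi] - xs[lo])

# The five healthy-response latencies logged above, in seconds.
latencies = [
    1.7006890773773193,
    1.2066590785980225,
    1.1894328594207764,
    1.187743902206421,
    1.198122501373291,
]
```

With only five samples the 50th percentile is just the median (1.198s), and the tail percentiles interpolate between the two slowest requests, which is why the single 1.70s outlier dominates everything above the 80th percentile.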
Pipeline stage StressChecker completed in 7.40s
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.05s
Running pipeline stage DaemonicSafetyScorer
Running M-Eval for topic stay_in_character
Pipeline stage DaemonicSafetyScorer completed in 0.06s
M-Eval Dataset for topic stay_in_character is loaded
undi95-bigl-7b_v2 status is now inactive due to auto-deactivation of underperforming models

Usage Metrics

Latency Metrics