submission_id: saishf-fimbulvetr-kuro-l_8724_v2
developer_uid: saishf
status: inactive
model_repo: saishf/Fimbulvetr-Kuro-Lotus-10.7B
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 4, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:'}
reward_formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:'}
timestamp: 2024-03-26T12:40:47+00:00
model_name: saishf-fimbulvetr-kuro-l_8724_v2
model_eval_status: success
safety_score: 0.85
entertaining: 6.88
stay_in_character: 8.54
user_preference: 7.5
double_thumbs_up: 338
thumbs_up: 508
thumbs_down: 237
num_battles: 67630
num_wins: 31941
win_ratio: 0.4722904036670117
celo_rating: 1138.04
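The formatter templates above assemble the model's input from persona memory, scenario prompt, and chat turns. As an illustrative sketch (not the platform's actual code; the bot/user names and messages are made up), they might be applied like this:

```python
# Hypothetical rendering of the formatter templates from the submission metadata.
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def render(formatter, bot_name, memory, prompt, turns):
    """Concatenate the template pieces into one model input string."""
    text = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    text += formatter["prompt_template"].format(prompt=prompt)
    for speaker, name, message in turns:
        tpl = formatter["bot_template"] if speaker == "bot" else formatter["user_template"]
        # Extra keyword arguments to str.format are simply ignored by templates
        # that do not reference them.
        text += tpl.format(bot_name=name, user_name=name, message=message)
    # End with the response template so the model continues as the bot.
    text += formatter["response_template"].format(bot_name=bot_name)
    return text

# Hypothetical sample conversation, for illustration only.
example = render(
    formatter,
    bot_name="Fimbul",
    memory="A stoic winter spirit.",
    prompt="Snow falls over the valley.",
    turns=[("user", "Anon", "Hello there."), ("bot", "Fimbul", "Welcome, traveler.")],
)
print(example)
```

With `stopping_words: ['\n']` in the generation params, the model's reply is cut at the first newline, matching the one-line-per-turn structure these templates produce.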
Resubmit model
Running pipeline stage MKMLizer
Starting job with name saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer
Waiting for job on saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer to finish
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: model-00011-of-00011.safetensors: 100%|█████████▉| 1.89G/1.89G [00:06<00:00, 281MB/s]
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: model.safetensors.index.json: 100%|██████████| 34.1k/34.1k [00:00<00:00, 149MB/s]
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: special_tokens_map.json: 100%|██████████| 551/551 [00:00<00:00, 4.42MB/s]
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: tokenizer.json: 100%|██████████| 1.80M/1.80M [00:00<00:00, 30.5MB/s]
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: tokenizer.model: 100%|██████████| 493k/493k [00:00<00:00, 62.7MB/s]
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: tokenizer_config.json: 100%|██████████| 984/984 [00:00<00:00, 8.17MB/s]
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: Downloaded to shared memory in 53.210s
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: quantizing model to /dev/shm/model_cache
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: Saving mkml model at /dev/shm/model_cache
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: Reading /tmp/tmpg9pen49_/model.safetensors.index.json
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: quantized model in 23.838s
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: Processed model saishf/Fimbulvetr-Kuro-Lotus-10.7B in 78.545s
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: creating bucket guanaco-mkml-models
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/saishf-fimbulvetr-kuro-l-8724-v2
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/saishf-fimbulvetr-kuro-l-8724-v2/tokenizer_config.json
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/saishf-fimbulvetr-kuro-l-8724-v2/config.json
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/saishf-fimbulvetr-kuro-l-8724-v2/special_tokens_map.json
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.model s3://guanaco-mkml-models/saishf-fimbulvetr-kuro-l-8724-v2/tokenizer.model
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/saishf-fimbulvetr-kuro-l-8724-v2/tokenizer.json
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: cp /dev/shm/model_cache/mkml_model.tensors s3://guanaco-mkml-models/saishf-fimbulvetr-kuro-l-8724-v2/mkml_model.tensors
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:1067: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: warnings.warn(
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: config.json: 100%|██████████| 1.05k/1.05k [00:00<00:00, 11.5MB/s]
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:690: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: warnings.warn(
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: tokenizer_config.json: 100%|██████████| 234/234 [00:00<00:00, 1.60MB/s]
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: vocab.json: 100%|██████████| 1.04M/1.04M [00:00<00:00, 12.4MB/s]
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: tokenizer.json: 100%|██████████| 2.11M/2.11M [00:00<00:00, 55.1MB/s]
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:472: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: warnings.warn(
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: pytorch_model.bin: 100%|█████████▉| 1.44G/1.44G [00:05<00:00, 280MB/s]
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: Saving duration: 0.306s
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 9.289s
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: creating bucket guanaco-reward-models
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: Bucket 's3://guanaco-reward-models/' created
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/saishf-fimbulvetr-kuro-l-8724-v2_reward
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/saishf-fimbulvetr-kuro-l-8724-v2_reward/config.json
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/saishf-fimbulvetr-kuro-l-8724-v2_reward/vocab.json
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/saishf-fimbulvetr-kuro-l-8724-v2_reward/tokenizer_config.json
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/saishf-fimbulvetr-kuro-l-8724-v2_reward/merges.txt
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/saishf-fimbulvetr-kuro-l-8724-v2_reward/special_tokens_map.json
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/saishf-fimbulvetr-kuro-l-8724-v2_reward/tokenizer.json
saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/saishf-fimbulvetr-kuro-l-8724-v2_reward/reward.tensors
Job saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer completed after 187.08s with status: succeeded
Stopping job with name saishf-fimbulvetr-kuro-l-8724-v2-mkmlizer
Pipeline stage MKMLizer completed in 192.28s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.12s
Running pipeline stage ISVCDeployer
Creating inference service saishf-fimbulvetr-kuro-l-8724-v2
Waiting for inference service saishf-fimbulvetr-kuro-l-8724-v2 to be ready
Inference service saishf-fimbulvetr-kuro-l-8724-v2 ready after 40.25477480888367s
Pipeline stage ISVCDeployer completed in 48.27s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.332563877105713s
Received healthy response to inference request in 1.4683380126953125s
Received healthy response to inference request in 1.5161585807800293s
Received healthy response to inference request in 1.264127492904663s
Received healthy response to inference request in 1.1750903129577637s
5 requests
0 failed requests
5th percentile: 1.1928977489471435
10th percentile: 1.2107051849365233
20th percentile: 1.2463200569152832
30th percentile: 1.277814769744873
40th percentile: 1.305189323425293
50th percentile: 1.332563877105713
60th percentile: 1.3868735313415528
70th percentile: 1.4411831855773924
80th percentile: 1.4779021263122558
90th percentile: 1.4970303535461427
95th percentile: 1.506594467163086
99th percentile: 1.5142457580566406
mean time: 1.3512556552886963
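The summary statistics above can be reproduced from the five response times. As a sketch, assuming the percentiles were computed with linear interpolation (NumPy's default method):

```python
import numpy as np

# The five healthy response times reported by the StressChecker stage, in seconds.
latencies = [
    1.332563877105713,
    1.4683380126953125,
    1.5161585807800293,
    1.264127492904663,
    1.1750903129577637,
]

mean_time = float(np.mean(latencies))
p50 = float(np.percentile(latencies, 50))  # median of the five samples
p95 = float(np.percentile(latencies, 95))  # linear interpolation between the top two
print(mean_time, p50, p95)
```

These match the logged `mean time`, 50th, and 95th percentile values, which suggests the reported percentiles interpolate between sorted samples rather than picking nearest ranks.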
Pipeline stage StressChecker completed in 7.51s
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.04s
Running pipeline stage DaemonicSafetyScorer
Running M-Eval for topic stay_in_character
Pipeline stage DaemonicSafetyScorer completed in 0.06s
M-Eval Dataset for topic stay_in_character is loaded
saishf-fimbulvetr-kuro-l_8724_v2 status is now deployed due to DeploymentManager action
saishf-fimbulvetr-kuro-l_8724_v2 status is now inactive due to auto-deactivation of underperforming models