submission_id: jic062-instruct_v14
developer_uid: chace9580
best_of: 16
celo_rating: 1234.37
display_name: jic062-instruct_v14
family_friendly_score: 0.0
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|end_of_text|>', '|eot_id|'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64, 'reward_max_token_input': 256}
is_internal_developer: False
language_model: jic062/instruct
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: jic062/instruct
model_name: jic062-instruct_v14
model_num_parameters: 8030261248.0
model_repo: jic062/instruct
model_size: 8B
num_battles: 19610
num_wins: 10584
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: ChaiML/gpt2_xl_pairwise_89m_step_347634
status: torndown
submission_type: basic
timestamp: 2024-08-01T06:28:21+00:00
us_pacific_date: 2024-07-31
win_ratio: 0.539724630290668
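The win_ratio above is simply num_wins divided by num_battles; a quick arithmetic check against the metadata:

```python
# Values taken directly from the metadata block above.
num_battles = 19610
num_wins = 10584

win_ratio = num_wins / num_battles
print(win_ratio)  # ≈ 0.539724630290668, matching the reported win_ratio
```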
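A minimal sketch of how the formatter templates above could assemble a full prompt. Only the template strings come from the config; the assembly order (memory, prompt, chat turns, response prefix) and the build_prompt helper are assumptions for illustration:

```python
# Template strings copied from the formatter config above.
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, memory, prompt, turns):
    # Hypothetical assembly order: persona memory, scenario prompt,
    # alternating chat turns, then the response prefix the model completes.
    parts = [
        formatter["memory_template"].format(bot_name=bot_name, memory=memory),
        formatter["prompt_template"].format(prompt=prompt),
    ]
    for speaker, message in turns:
        if speaker == "bot":
            parts.append(formatter["bot_template"].format(bot_name=bot_name, message=message))
        else:
            parts.append(formatter["user_template"].format(user_name=speaker, message=message))
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)

example = build_prompt(
    "Luna", "A curious android.", "A chat on a starship.",
    [("Dave", "Hello."), ("bot", "Hi, Dave.")],
)
print(example)  # ends with "Luna:", the prefix the model continues from
```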
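best_of: 16 alongside the gpt2_xl pairwise reward model suggests best-of-n reranking: sample n candidate completions, score each with the reward model, and return the top one. A minimal sketch, where score is a hypothetical stand-in for the reward-model call (the pipeline's actual scoring code is not shown in this log):

```python
def best_of_n(candidates, score):
    # `score` stands in for scoring a candidate reply with the reward model
    # (ChaiML/gpt2_xl_pairwise_89m_step_347634); higher is better.
    return max(candidates, key=score)

# Toy usage: with length as a dummy score, the longest reply wins.
print(best_of_n(["ok", "sure thing", "absolutely, captain"], len))
```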
Resubmit model
Running pipeline stage MKMLizer
Starting job with name jic062-instruct-v14-mkmlizer
Waiting for job on jic062-instruct-v14-mkmlizer to finish
jic062-instruct-v14-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
jic062-instruct-v14-mkmlizer: ║ _____ __ __ ║
jic062-instruct-v14-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
jic062-instruct-v14-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
jic062-instruct-v14-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
jic062-instruct-v14-mkmlizer: ║ /___/ ║
jic062-instruct-v14-mkmlizer: ║ ║
jic062-instruct-v14-mkmlizer: ║ Version: 0.9.7 ║
jic062-instruct-v14-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
jic062-instruct-v14-mkmlizer: ║ https://mk1.ai ║
jic062-instruct-v14-mkmlizer: ║ ║
jic062-instruct-v14-mkmlizer: ║ The license key for the current software has been verified as ║
jic062-instruct-v14-mkmlizer: ║ belonging to: ║
jic062-instruct-v14-mkmlizer: ║ ║
jic062-instruct-v14-mkmlizer: ║ Chai Research Corp. ║
jic062-instruct-v14-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
jic062-instruct-v14-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
jic062-instruct-v14-mkmlizer: ║ ║
jic062-instruct-v14-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
jic062-instruct-v14-mkmlizer: Downloaded to shared memory in 29.206s
jic062-instruct-v14-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp0caqhexw, device:0
jic062-instruct-v14-mkmlizer: Saving flywheel model at /dev/shm/model_cache
jic062-instruct-v14-mkmlizer: quantized model in 26.757s
jic062-instruct-v14-mkmlizer: Processed model jic062/instruct in 55.964s
jic062-instruct-v14-mkmlizer: creating bucket guanaco-mkml-models
jic062-instruct-v14-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
jic062-instruct-v14-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/jic062-instruct-v14
jic062-instruct-v14-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/jic062-instruct-v14/config.json
jic062-instruct-v14-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/jic062-instruct-v14/special_tokens_map.json
jic062-instruct-v14-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/jic062-instruct-v14/tokenizer_config.json
jic062-instruct-v14-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/jic062-instruct-v14/tokenizer.json
jic062-instruct-v14-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/jic062-instruct-v14/flywheel_model.0.safetensors
jic062-instruct-v14-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
jic062-instruct-v14-mkmlizer: warnings.warn(
jic062-instruct-v14-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
jic062-instruct-v14-mkmlizer: warnings.warn(
jic062-instruct-v14-mkmlizer: Downloading shards: 100%|██████████| 2/2 [00:08<00:00, 4.11s/it]
jic062-instruct-v14-mkmlizer: Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 3.36it/s]
jic062-instruct-v14-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
jic062-instruct-v14-mkmlizer: Saving duration: 1.343s
jic062-instruct-v14-mkmlizer: Processed model ChaiML/gpt2_xl_pairwise_89m_step_347634 in 13.327s
jic062-instruct-v14-mkmlizer: creating bucket guanaco-reward-models
jic062-instruct-v14-mkmlizer: Bucket 's3://guanaco-reward-models/' created
jic062-instruct-v14-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/jic062-instruct-v14_reward
jic062-instruct-v14-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/jic062-instruct-v14_reward/config.json
jic062-instruct-v14-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/jic062-instruct-v14_reward/special_tokens_map.json
jic062-instruct-v14-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/jic062-instruct-v14_reward/tokenizer_config.json
jic062-instruct-v14-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/jic062-instruct-v14_reward/merges.txt
jic062-instruct-v14-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/jic062-instruct-v14_reward/vocab.json
jic062-instruct-v14-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/jic062-instruct-v14_reward/tokenizer.json
jic062-instruct-v14-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/jic062-instruct-v14_reward/reward.tensors
Job jic062-instruct-v14-mkmlizer completed after 104.72s with status: succeeded
Stopping job with name jic062-instruct-v14-mkmlizer
Pipeline stage MKMLizer completed in 106.01s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service jic062-instruct-v14
Waiting for inference service jic062-instruct-v14 to be ready
Inference service jic062-instruct-v14 ready after 141.4309320449829s
Pipeline stage ISVCDeployer completed in 142.95s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.1566741466522217s
Received healthy response to inference request in 1.3754217624664307s
Received healthy response to inference request in 1.38287353515625s
Received healthy response to inference request in 1.3748950958251953s
Received healthy response to inference request in 1.3982937335968018s
5 requests
0 failed requests
5th percentile: 1.3750004291534423
10th percentile: 1.3751057624816894
20th percentile: 1.3753164291381836
30th percentile: 1.3769121170043945
40th percentile: 1.3798928260803223
50th percentile: 1.38287353515625
60th percentile: 1.3890416145324707
70th percentile: 1.3952096939086913
80th percentile: 1.5499698162078859
90th percentile: 1.8533219814300539
95th percentile: 2.0049980640411373
99th percentile: 2.1263389301300046
mean time: 1.5376316547393798
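The StressChecker statistics above are consistent with linearly interpolated percentiles over the five response times. A sketch that reproduces them (an assumption about the method; the checker's actual implementation is not shown in the log):

```python
def percentile(samples, q):
    # Linear interpolation between closest ranks (numpy's default method).
    xs = sorted(samples)
    pos = (len(xs) - 1) * q / 100.0
    lo = int(pos)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (pos - lo)

# The five response times reported by the StressChecker stage.
times = [2.1566741466522217, 1.3754217624664307, 1.38287353515625,
         1.3748950958251953, 1.3982937335968018]

print(percentile(times, 50))    # 50th percentile: 1.38287353515625
print(percentile(times, 90))    # 90th percentile: ~1.8533219814300539
print(sum(times) / len(times))  # mean time: ~1.5376316547393798
```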
Pipeline stage StressChecker completed in 8.41s
jic062-instruct_v14 status is now deployed due to DeploymentManager action
jic062-instruct_v14 status is now inactive due to auto deactivation of underperforming models
admin requested tearing down of jic062-instruct_v14
Running pipeline stage ISVCDeleter
Checking if service jic062-instruct-v14 is running
Tearing down inference service jic062-instruct-v14
Service jic062-instruct-v14 has been torndown
Pipeline stage ISVCDeleter completed in 4.67s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key jic062-instruct-v14/config.json from bucket guanaco-mkml-models
Deleting key jic062-instruct-v14/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key jic062-instruct-v14/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key jic062-instruct-v14/tokenizer.json from bucket guanaco-mkml-models
Deleting key jic062-instruct-v14/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key jic062-instruct-v14_reward/config.json from bucket guanaco-reward-models
Deleting key jic062-instruct-v14_reward/merges.txt from bucket guanaco-reward-models
Deleting key jic062-instruct-v14_reward/reward.tensors from bucket guanaco-reward-models
Deleting key jic062-instruct-v14_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key jic062-instruct-v14_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key jic062-instruct-v14_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key jic062-instruct-v14_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 5.60s
jic062-instruct_v14 status is now torndown due to DeploymentManager action