developer_uid: PawanOsman
submission_id: pawankrd-cosmosrp_v22
model_name: pawankrd-cosmosrp_v22
model_group: PawanKrd/CosmosRP
status: torndown
timestamp: 2024-06-20T18:55:04+00:00
num_battles: 972946
num_wins: 520155
celo_rating: 1214.17
family_friendly_score: 0.0
submission_type: basic
model_repo: PawanKrd/CosmosRP
model_architecture: LlamaForCausalLM
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
model_num_parameters: 8030261248.0
best_of: 16
max_input_tokens: 512
max_output_tokens: 64
display_name: pawankrd-cosmosrp_v22
is_internal_developer: False
language_model: PawanKrd/CosmosRP
model_size: 8B
ranking_group: single
us_pacific_date: 2024-06-20
win_ratio: 0.5346185708148242
generation_params: {'temperature': 0.9, 'top_p': 0.95, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['<', '>'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
formatter: {'memory_template': "<|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
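The formatter above assembles Llama-3-style chat turns from the submission's templates. A minimal sketch of how these templates might be filled in and concatenated into a single model input (the helper function and the conversation content are invented for illustration; only the template strings come from the record above):

```python
# Templates copied from the submission's `formatter` field.
memory_template = "<|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n"
prompt_template = "{prompt}<|eot_id|>"
user_template = "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>"
bot_template = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>"
response_template = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:"

def build_prompt(bot_name, memory, prompt, turns, user_name="User"):
    """Concatenate the templates into one input string (hypothetical helper)."""
    parts = [
        memory_template.format(bot_name=bot_name, memory=memory),
        prompt_template.format(prompt=prompt),
    ]
    for role, message in turns:
        if role == "user":
            parts.append(user_template.format(user_name=user_name, message=message))
        else:
            parts.append(bot_template.format(bot_name=bot_name, message=message))
    # Trailing response template cues the model to speak as the bot.
    parts.append(response_template.format(bot_name=bot_name))
    return "".join(parts)
```

Per the generation params, the assembled string would then be truncated to `max_input_tokens: 512` before sampling.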
Resubmit model
Running pipeline stage MKMLizer
Starting job with name pawankrd-cosmosrp-v22-mkmlizer
Waiting for job on pawankrd-cosmosrp-v22-mkmlizer to finish
pawankrd-cosmosrp-v22-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
pawankrd-cosmosrp-v22-mkmlizer: warnings.warn(warning_message, FutureWarning)
pawankrd-cosmosrp-v22-mkmlizer: Downloaded to shared memory in 17.754s
pawankrd-cosmosrp-v22-mkmlizer: quantizing model to /dev/shm/model_cache
pawankrd-cosmosrp-v22-mkmlizer: Saving flywheel model at /dev/shm/model_cache
pawankrd-cosmosrp-v22-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s] ... Loading 0: 98%|█████████▊| 285/291 [00:07<00:00, 90.09it/s]
pawankrd-cosmosrp-v22-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
pawankrd-cosmosrp-v22-mkmlizer: quantized model in 22.924s
pawankrd-cosmosrp-v22-mkmlizer: Processed model PawanKrd/CosmosRP in 43.204s
pawankrd-cosmosrp-v22-mkmlizer: creating bucket guanaco-mkml-models
pawankrd-cosmosrp-v22-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
pawankrd-cosmosrp-v22-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/pawankrd-cosmosrp-v22
pawankrd-cosmosrp-v22-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/pawankrd-cosmosrp-v22/special_tokens_map.json
pawankrd-cosmosrp-v22-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/pawankrd-cosmosrp-v22/config.json
pawankrd-cosmosrp-v22-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/pawankrd-cosmosrp-v22/tokenizer_config.json
pawankrd-cosmosrp-v22-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/pawankrd-cosmosrp-v22/tokenizer.json
pawankrd-cosmosrp-v22-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/pawankrd-cosmosrp-v22/flywheel_model.0.safetensors
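The `cp` lines above copy each cached file into the model bucket under the submission name. A sketch of that key mapping, assuming the pattern shown in the log (the actual upload call, shown commented, would need boto3 and AWS credentials):

```python
import posixpath

BUCKET = "guanaco-mkml-models"
PREFIX = "pawankrd-cosmosrp-v22"

def s3_destination(local_path):
    # Each file in /dev/shm/model_cache maps to s3://<bucket>/<submission>/<basename>,
    # mirroring the `cp` lines in the log above.
    return f"s3://{BUCKET}/{PREFIX}/{posixpath.basename(local_path)}"

# A real upload might look like (requires boto3 and credentials):
# boto3.client("s3").upload_file(
#     local_path, BUCKET, f"{PREFIX}/{posixpath.basename(local_path)}")
```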
pawankrd-cosmosrp-v22-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
pawankrd-cosmosrp-v22-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:913: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
pawankrd-cosmosrp-v22-mkmlizer: warnings.warn(
pawankrd-cosmosrp-v22-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:757: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
pawankrd-cosmosrp-v22-mkmlizer: warnings.warn(
pawankrd-cosmosrp-v22-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
pawankrd-cosmosrp-v22-mkmlizer: warnings.warn(
pawankrd-cosmosrp-v22-mkmlizer: /opt/conda/lib/python3.10/site-packages/torch/_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
pawankrd-cosmosrp-v22-mkmlizer: return self.fget.__get__(instance, owner)()
pawankrd-cosmosrp-v22-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
pawankrd-cosmosrp-v22-mkmlizer: Saving duration: 0.399s
pawankrd-cosmosrp-v22-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 5.941s
pawankrd-cosmosrp-v22-mkmlizer: creating bucket guanaco-reward-models
pawankrd-cosmosrp-v22-mkmlizer: Bucket 's3://guanaco-reward-models/' created
pawankrd-cosmosrp-v22-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/pawankrd-cosmosrp-v22_reward
pawankrd-cosmosrp-v22-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/pawankrd-cosmosrp-v22_reward/config.json
pawankrd-cosmosrp-v22-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/pawankrd-cosmosrp-v22_reward/special_tokens_map.json
pawankrd-cosmosrp-v22-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/pawankrd-cosmosrp-v22_reward/tokenizer_config.json
pawankrd-cosmosrp-v22-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/pawankrd-cosmosrp-v22_reward/vocab.json
pawankrd-cosmosrp-v22-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/pawankrd-cosmosrp-v22_reward/merges.txt
pawankrd-cosmosrp-v22-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/pawankrd-cosmosrp-v22_reward/tokenizer.json
pawankrd-cosmosrp-v22-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/pawankrd-cosmosrp-v22_reward/reward.tensors
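With `best_of: 16` in the generation params, the reward model packaged above is typically used to rerank candidate completions at inference time. A minimal sketch of that selection step, with a toy scoring function standing in for the GPT-2 reward model (the stand-in is hypothetical; only the best-of-n pattern is implied by the record):

```python
def pick_best(candidates, score):
    """Return the candidate with the highest reward score (best-of-n sampling)."""
    return max(candidates, key=score)

# Hypothetical stand-in for ChaiML/reward_gpt2_medium_preference_24m_e2:
# score each candidate reply by length, just to make the example runnable.
toy_score = len
```

In production, `score` would be a forward pass of the reward model over the reply formatted with the `reward_formatter` templates.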
Job pawankrd-cosmosrp-v22-mkmlizer completed after 137.72s with status: succeeded
Stopping job with name pawankrd-cosmosrp-v22-mkmlizer
Pipeline stage MKMLizer completed in 138.11s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.11s
Running pipeline stage ISVCDeployer
Creating inference service pawankrd-cosmosrp-v22
Waiting for inference service pawankrd-cosmosrp-v22 to be ready
Inference service pawankrd-cosmosrp-v22 ready after 30.236672401428223s
Pipeline stage ISVCDeployer completed in 35.94s
Running pipeline stage StressChecker
Received healthy response to inference request in 6.927843332290649s
Received healthy response to inference request in 1.5424880981445312s
Received healthy response to inference request in 6.99535870552063s
Received healthy response to inference request in 5.975579023361206s
Received healthy response to inference request in 7.602850914001465s
5 requests
0 failed requests
5th percentile: 2.429106283187866
10th percentile: 3.3157244682312013
20th percentile: 5.088960838317871
30th percentile: 6.1660318851470945
40th percentile: 6.546937608718872
50th percentile: 6.927843332290649
60th percentile: 6.954849481582642
70th percentile: 6.981855630874634
80th percentile: 7.116857147216797
90th percentile: 7.359854030609131
95th percentile: 7.481352472305297
99th percentile: 7.578551225662231
mean time: 5.8088240146636965
%s, retrying in %s seconds...
Received healthy response to inference request in 3.6158924102783203s
Received healthy response to inference request in 1.5930161476135254s
Received healthy response to inference request in 6.606021881103516s
Received healthy response to inference request in 2.6939656734466553s
Received healthy response to inference request in 5.192718744277954s
5 requests
0 failed requests
5th percentile: 1.8132060527801515
10th percentile: 2.0333959579467775
20th percentile: 2.473775768280029
30th percentile: 2.878351020812988
40th percentile: 3.247121715545654
50th percentile: 3.6158924102783203
60th percentile: 4.246622943878174
70th percentile: 4.877353477478027
80th percentile: 5.475379371643067
90th percentile: 6.040700626373291
95th percentile: 6.323361253738403
99th percentile: 6.549489755630493
mean time: 3.9403229713439942
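The percentiles reported above are consistent with linear interpolation over the sorted response times (numpy's default method). A sketch that reproduces the second run's statistics from its five latencies:

```python
def percentile(xs, q):
    """Linear-interpolated percentile, matching numpy.percentile's default method."""
    xs = sorted(xs)
    pos = q / 100 * (len(xs) - 1)      # fractional rank into the sorted sample
    lo = int(pos)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (pos - lo) * (xs[hi] - xs[lo])

# Response times from the second StressChecker run above, in seconds.
times = [3.6158924102783203, 1.5930161476135254, 6.606021881103516,
         2.6939656734466553, 5.192718744277954]
```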
Pipeline stage StressChecker completed in 50.20s
Running pipeline stage DaemonicSafetyScorer
Pipeline stage DaemonicSafetyScorer completed in 0.04s
pawankrd-cosmosrp_v22 status is now deployed due to DeploymentManager action
pawankrd-cosmosrp_v22 status is now inactive due to auto-deactivation of underperforming models
admin requested tearing down of pawankrd-cosmosrp_v22
Running pipeline stage ISVCDeleter
Checking if service pawankrd-cosmosrp-v22 is running
Skipping teardown as no inference service was found
Pipeline stage ISVCDeleter completed in 4.59s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key pawankrd-cosmosrp-v22/config.json from bucket guanaco-mkml-models
Deleting key pawankrd-cosmosrp-v22/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key pawankrd-cosmosrp-v22/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key pawankrd-cosmosrp-v22/tokenizer.json from bucket guanaco-mkml-models
Deleting key pawankrd-cosmosrp-v22/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key pawankrd-cosmosrp-v22_reward/config.json from bucket guanaco-reward-models
Deleting key pawankrd-cosmosrp-v22_reward/merges.txt from bucket guanaco-reward-models
Deleting key pawankrd-cosmosrp-v22_reward/reward.tensors from bucket guanaco-reward-models
Deleting key pawankrd-cosmosrp-v22_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key pawankrd-cosmosrp-v22_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key pawankrd-cosmosrp-v22_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key pawankrd-cosmosrp-v22_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 6.20s
pawankrd-cosmosrp_v22 status is now torndown due to DeploymentManager action