submission_id: frank2030-hathor-stable-_7642_v2
developer_uid: frank2030
best_of: 16
celo_rating: 1212.88
display_name: frank2030-hathor-stable-_7642_v2
family_friendly_score: 0.0
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
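The formatter above is a set of plain string templates: persona, scenario, chat history, and a trailing bot cue that the model completes. A minimal sketch of how they plausibly compose into one prompt (the actual serving code is not part of this log; all persona/message values below are illustrative):

```python
# Sketch: render persona + scenario + chat history, ending with the bot cue.
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    out = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    out += formatter["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        template = formatter["bot_template"] if speaker == "bot" else formatter["user_template"]
        out += template.format(bot_name=bot_name, user_name=user_name, message=message)
    # response_template leaves "BotName:" open, so generation continues that line.
    return out + formatter["response_template"].format(bot_name=bot_name)

print(build_prompt("Hathor", "Anon", "A helpful companion.", "A quiet tavern.",
                   [("user", "Hello!"), ("bot", "Hi there."), ("user", "How are you?")]))
```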
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
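The engine that consumes these settings is MKML, which this log does not show. Purely as an assumption, the same generation_params map one-to-one onto vLLM's SamplingParams, which uses the same knob names:

```python
# Sketch (assumption): the generation_params above expressed as vLLM
# SamplingParams. This illustrates what each knob means, not the actual
# production code path. The prompt itself is truncated to
# max_input_tokens=512 upstream of sampling.
from vllm import SamplingParams

params = SamplingParams(
    best_of=16,            # sample 16 completions, keep the best-scoring one
    temperature=1.0,       # no sharpening or flattening of the distribution
    top_p=1.0,             # nucleus cutoff disabled
    min_p=0.0,             # no minimum-probability floor
    top_k=40,              # each step restricted to the 40 most likely tokens
    presence_penalty=0.0,
    frequency_penalty=0.0,
    stop=["\n"],           # stopping_words: end at the first newline
    max_tokens=64,         # max_output_tokens
)
```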
is_internal_developer: False
language_model: frank2030/Hathor_Stable-v0.2-L3-8B-DPO-3epoch-half-k
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: frank2030/Hathor_Stable-
model_name: frank2030-hathor-stable-_7642_v2
model_num_parameters: 8030261248.0
model_repo: frank2030/Hathor_Stable-v0.2-L3-8B-DPO-3epoch-half-k
model_size: 8B
num_battles: 31583
num_wins: 16775
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
status: torndown
submission_type: basic
timestamp: 2024-07-09T00:24:58+00:00
us_pacific_date: 2024-07-08
win_ratio: 0.5311401703448058
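A quick sanity check on the battle statistics above (pure arithmetic, no assumptions):

```python
# win_ratio is simply num_wins / num_battles from the fields above.
num_wins, num_battles = 16775, 31583
assert abs(num_wins / num_battles - 0.5311401703448058) < 1e-15
print(num_wins / num_battles)  # 0.5311401703448058
```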
Resubmit model
Running pipeline stage MKMLizer
Starting job with name frank2030-hathor-stable-7642-v2-mkmlizer
Waiting for job on frank2030-hathor-stable-7642-v2-mkmlizer to finish
frank2030-hathor-stable-7642-v2-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
frank2030-hathor-stable-7642-v2-mkmlizer: ║                        _____ __ __                                   ║
frank2030-hathor-stable-7642-v2-mkmlizer: ║                       / _/ /_  ___      __/ /  ___  ___ / /          ║
frank2030-hathor-stable-7642-v2-mkmlizer: ║                      / _/ / // / |/|/ / _ \/ -_) -_) /               ║
frank2030-hathor-stable-7642-v2-mkmlizer: ║                     /_//_/\_, /|__,__/_//_/\__/\__/_/                ║
frank2030-hathor-stable-7642-v2-mkmlizer: ║                          /___/                                       ║
frank2030-hathor-stable-7642-v2-mkmlizer: ║ ║
frank2030-hathor-stable-7642-v2-mkmlizer: ║ Version: 0.8.14 ║
frank2030-hathor-stable-7642-v2-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
frank2030-hathor-stable-7642-v2-mkmlizer: ║ https://mk1.ai ║
frank2030-hathor-stable-7642-v2-mkmlizer: ║ ║
frank2030-hathor-stable-7642-v2-mkmlizer: ║ The license key for the current software has been verified as ║
frank2030-hathor-stable-7642-v2-mkmlizer: ║ belonging to: ║
frank2030-hathor-stable-7642-v2-mkmlizer: ║ ║
frank2030-hathor-stable-7642-v2-mkmlizer: ║ Chai Research Corp. ║
frank2030-hathor-stable-7642-v2-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
frank2030-hathor-stable-7642-v2-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
frank2030-hathor-stable-7642-v2-mkmlizer: ║ ║
frank2030-hathor-stable-7642-v2-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
frank2030-hathor-stable-7642-v2-mkmlizer: Downloaded to shared memory in 22.212s
frank2030-hathor-stable-7642-v2-mkmlizer: quantizing model to /dev/shm/model_cache
frank2030-hathor-stable-7642-v2-mkmlizer: Saving flywheel model at /dev/shm/model_cache
frank2030-hathor-stable-7642-v2-mkmlizer: Loading 0:  99%|█████████▊| 287/291 [00:08<00:00,  8.09it/s]
frank2030-hathor-stable-7642-v2-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
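The special-tokens warning at the end of the load is standard transformers boilerplate: the tokenizer declares special tokens that the checkpoint's embedding matrix may not have been trained on. A sketch of the usual remedy, against the public checkpoint named above:

```python
# Sketch: resize the embedding matrix to cover any newly added special tokens.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "frank2030/Hathor_Stable-v0.2-L3-8B-DPO-3epoch-half-k"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo)
# Rows added here are freshly initialized, hence the "fine-tuned or trained" caveat.
model.resize_token_embeddings(len(tokenizer))
```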
frank2030-hathor-stable-7642-v2-mkmlizer: quantized model in 24.415s
frank2030-hathor-stable-7642-v2-mkmlizer: Processed model frank2030/Hathor_Stable-v0.2-L3-8B-DPO-3epoch-half-k in 46.628s
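The quantizer itself is MK ONE's closed-source tooling, so only its timings appear here. Its output, flywheel_model.0.safetensors, is an ordinary safetensors shard, and /dev/shm/model_cache sits on RAM-backed tmpfs, which is why these steps are fast. A minimal sketch of writing such a shard (the tensor below is a stand-in, not the real quantized weights):

```python
# Sketch: write a placeholder tensor dict to a safetensors shard in tmpfs.
import os
import torch
from safetensors.torch import save_file

os.makedirs("/dev/shm/model_cache", exist_ok=True)
tensors = {"example.weight": torch.zeros(8, 8, dtype=torch.float16)}  # placeholder
save_file(tensors, "/dev/shm/model_cache/flywheel_model.0.safetensors")
```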
frank2030-hathor-stable-7642-v2-mkmlizer: creating bucket guanaco-mkml-models
frank2030-hathor-stable-7642-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
frank2030-hathor-stable-7642-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/frank2030-hathor-stable-7642-v2
frank2030-hathor-stable-7642-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/frank2030-hathor-stable-7642-v2/special_tokens_map.json
frank2030-hathor-stable-7642-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/frank2030-hathor-stable-7642-v2/tokenizer_config.json
frank2030-hathor-stable-7642-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/frank2030-hathor-stable-7642-v2/config.json
frank2030-hathor-stable-7642-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/frank2030-hathor-stable-7642-v2/tokenizer.json
frank2030-hathor-stable-7642-v2-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/frank2030-hathor-stable-7642-v2/flywheel_model.0.safetensors
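Each cp line above is a single S3 upload. As an assumption about tooling (the pipeline's actual client is not shown in this log), the equivalent with boto3:

```python
# Sketch: mirror the cp lines above as boto3 uploads.
import os
import boto3

s3 = boto3.client("s3")
bucket, prefix = "guanaco-mkml-models", "frank2030-hathor-stable-7642-v2"
for name in ["special_tokens_map.json", "tokenizer_config.json", "config.json",
             "tokenizer.json", "flywheel_model.0.safetensors"]:
    s3.upload_file(os.path.join("/dev/shm/model_cache", name),
                   bucket, f"{prefix}/{name}")
```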
frank2030-hathor-stable-7642-v2-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
frank2030-hathor-stable-7642-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:919: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
frank2030-hathor-stable-7642-v2-mkmlizer: warnings.warn(
frank2030-hathor-stable-7642-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
frank2030-hathor-stable-7642-v2-mkmlizer: warnings.warn(
frank2030-hathor-stable-7642-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:769: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
frank2030-hathor-stable-7642-v2-mkmlizer: warnings.warn(
frank2030-hathor-stable-7642-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
frank2030-hathor-stable-7642-v2-mkmlizer: warnings.warn(
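The four FutureWarnings above boil down to two deprecations in transformers / huggingface_hub: use_auth_token gives way to token, and resume_download is retired because downloads now always resume. A sketch of the warning-free load of the reward model (the token value is a placeholder, and the sequence-classification head is an assumption about this checkpoint):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "ChaiML/reward_gpt2_medium_preference_24m_e2"
tokenizer = AutoTokenizer.from_pretrained(repo, token="hf_xxx")  # was: use_auth_token=
reward_model = AutoModelForSequenceClassification.from_pretrained(repo, token="hf_xxx")
```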
Connection pool is full, discarding connection: %s
frank2030-hathor-stable-7642-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/torch/_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
frank2030-hathor-stable-7642-v2-mkmlizer: return self.fget.__get__(instance, owner)()
frank2030-hathor-stable-7642-v2-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
frank2030-hathor-stable-7642-v2-mkmlizer: Saving duration: 0.422s
frank2030-hathor-stable-7642-v2-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 7.353s
frank2030-hathor-stable-7642-v2-mkmlizer: creating bucket guanaco-reward-models
frank2030-hathor-stable-7642-v2-mkmlizer: Bucket 's3://guanaco-reward-models/' created
frank2030-hathor-stable-7642-v2-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/frank2030-hathor-stable-7642-v2_reward
frank2030-hathor-stable-7642-v2-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/frank2030-hathor-stable-7642-v2_reward/config.json
frank2030-hathor-stable-7642-v2-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/frank2030-hathor-stable-7642-v2_reward/special_tokens_map.json
frank2030-hathor-stable-7642-v2-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/frank2030-hathor-stable-7642-v2_reward/tokenizer_config.json
frank2030-hathor-stable-7642-v2-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/frank2030-hathor-stable-7642-v2_reward/merges.txt
frank2030-hathor-stable-7642-v2-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/frank2030-hathor-stable-7642-v2_reward/vocab.json
frank2030-hathor-stable-7642-v2-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/frank2030-hathor-stable-7642-v2_reward/tokenizer.json
frank2030-hathor-stable-7642-v2-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/frank2030-hathor-stable-7642-v2_reward/reward.tensors
Job frank2030-hathor-stable-7642-v2-mkmlizer completed after 73.9s with status: succeeded
Stopping job with name frank2030-hathor-stable-7642-v2-mkmlizer
Pipeline stage MKMLizer completed in 74.75s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service frank2030-hathor-stable-7642-v2
Waiting for inference service frank2030-hathor-stable-7642-v2 to be ready
Inference service frank2030-hathor-stable-7642-v2 ready after 40.21519470214844s
Pipeline stage ISVCDeployer completed in 46.64s
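The ISVCDeployer stage reads like a KServe deployment: create an InferenceService, then poll until its Ready condition is True. A readiness-poll sketch under that assumption, using the official kubernetes client (the namespace is a placeholder):

```python
# Sketch (assumption): poll a KServe InferenceService until Ready.
import time
from kubernetes import client, config

config.load_kube_config()
api = client.CustomObjectsApi()
deadline = time.time() + 600
while time.time() < deadline:
    isvc = api.get_namespaced_custom_object(
        group="serving.kserve.io", version="v1beta1", namespace="default",
        plural="inferenceservices", name="frank2030-hathor-stable-7642-v2")
    conditions = isvc.get("status", {}).get("conditions", [])
    if any(c.get("type") == "Ready" and c.get("status") == "True" for c in conditions):
        print("inference service ready")
        break
    time.sleep(5)
```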
Running pipeline stage StressChecker
Received healthy response to inference request in 2.1601510047912598s
Received healthy response to inference request in 1.2623028755187988s
Received healthy response to inference request in 1.2643043994903564s
Received healthy response to inference request in 1.2403221130371094s
Received healthy response to inference request in 1.239264726638794s
5 requests
0 failed requests
5th percentile: 1.239476203918457
10th percentile: 1.2396876811981201
20th percentile: 1.2401106357574463
30th percentile: 1.2447182655334472
40th percentile: 1.2535105705261231
50th percentile: 1.2623028755187988
60th percentile: 1.2631034851074219
70th percentile: 1.263904094696045
80th percentile: 1.4434737205505372
90th percentile: 1.8018123626708986
95th percentile: 1.980981683731079
99th percentile: 2.1243171405792234
mean time: 1.4332690238952637
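The StressChecker statistics are reproducible from the five response times above; numpy's default linear percentile interpolation matches every reported value to the digit:

```python
# Reproduce the StressChecker percentiles and mean from the five latencies.
import numpy as np

latencies = [2.1601510047912598, 1.2623028755187988, 1.2643043994903564,
             1.2403221130371094, 1.239264726638794]
for q in [5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99]:
    print(f"{q}th percentile: {np.percentile(latencies, q)}")
print("mean time:", np.mean(latencies))  # 1.4332690238952637
```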
Pipeline stage StressChecker completed in 8.15s
frank2030-hathor-stable-_7642_v2 status is now deployed due to DeploymentManager action
frank2030-hathor-stable-_7642_v2 status is now inactive due to auto deactivation (removal of underperforming models)
admin requested tearing down of frank2030-hathor-stable-_7642_v2
Running pipeline stage ISVCDeleter
Checking if service frank2030-hathor-stable-7642-v2 is running
Skipping teardown as no inference service was found
Pipeline stage ISVCDeleter completed in 4.32s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key frank2030-hathor-stable-7642-v2/config.json from bucket guanaco-mkml-models
Deleting key frank2030-hathor-stable-7642-v2/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key frank2030-hathor-stable-7642-v2/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key frank2030-hathor-stable-7642-v2/tokenizer.json from bucket guanaco-mkml-models
Deleting key frank2030-hathor-stable-7642-v2/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key frank2030-hathor-stable-7642-v2_reward/config.json from bucket guanaco-reward-models
Deleting key frank2030-hathor-stable-7642-v2_reward/merges.txt from bucket guanaco-reward-models
Deleting key frank2030-hathor-stable-7642-v2_reward/reward.tensors from bucket guanaco-reward-models
Deleting key frank2030-hathor-stable-7642-v2_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key frank2030-hathor-stable-7642-v2_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key frank2030-hathor-stable-7642-v2_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key frank2030-hathor-stable-7642-v2_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 5.75s
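And the teardown's delete keys above, as the same assumed boto3 client would issue them (bucket and key names are taken from the log):

```python
import boto3

s3 = boto3.client("s3")
for key in ["config.json", "flywheel_model.0.safetensors", "special_tokens_map.json",
            "tokenizer.json", "tokenizer_config.json"]:
    s3.delete_object(Bucket="guanaco-mkml-models",
                     Key=f"frank2030-hathor-stable-7642-v2/{key}")
for key in ["config.json", "merges.txt", "reward.tensors", "special_tokens_map.json",
            "tokenizer.json", "tokenizer_config.json", "vocab.json"]:
    s3.delete_object(Bucket="guanaco-reward-models",
                     Key=f"frank2030-hathor-stable-7642-v2_reward/{key}")
```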
frank2030-hathor-stable-_7642_v2 status is now torndown due to DeploymentManager action