submission_id: cycy233-l3-bdpo-v5-c2_v2
developer_uid: shiroe40
status: inactive
model_repo: cycy233/L3-bdpo-v5-c2
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
generation_params: {'temperature': 0.6, 'top_p': 0.9, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
timestamp: 2024-07-12T16:07:29+00:00
model_name: auto
model_group: cycy233/L3-bdpo-v5-c2
num_battles: 12317
num_wins: 5265
celo_rating: 1129.46
alignment_score: None
alignment_samples: 0
propriety_score: 0.782940698619009
propriety_total_count: 6155
submission_type: basic
model_architecture: LlamaForCausalLM
model_num_parameters: 8030261248
best_of: 16
max_input_tokens: 512
max_output_tokens: 64
display_name: auto
ineligible_reason: None
language_model: cycy233/L3-bdpo-v5-c2
model_size: 8B
reward_model: ChaiML/reward_gpt2_medium_preference_24m_e2
us_pacific_date: 2024-07-12
win_ratio: 0.4274579848989202
preference_data_url: None
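
The serving stack behind this record is not shown in the log; purely as an illustration, the sketch below shows how the formatter templates and generation_params above could be applied. The build_prompt helper, the history structure, and the persona/message values are hypothetical, and mapping the parameters onto a vLLM-style SamplingParams is an assumption.

```python
# Minimal sketch, not the production pipeline: applying the formatter
# templates and generation_params recorded above. build_prompt and the
# history structure are invented for this example.
from vllm import SamplingParams

def build_prompt(memory, prompt, history, bot_name, user_name):
    """Assemble the model input using this submission's formatter templates."""
    parts = [
        f"{bot_name}'s Persona: {memory}\n####\n",  # memory_template
        f"{prompt}\n<START>\n",                     # prompt_template
    ]
    for speaker, message in history:
        if speaker == "bot":
            parts.append(f"{bot_name}: {message}\n")   # bot_template
        else:
            parts.append(f"{user_name}: {message}\n")  # user_template
    parts.append(f"{bot_name}:")                       # response_template
    return "".join(parts)

# generation_params, copied from the record above.
sampling = SamplingParams(
    temperature=0.6,
    top_p=0.9,
    min_p=0.0,
    top_k=40,
    presence_penalty=0.0,
    frequency_penalty=0.0,
    stop=["\n"],      # stopping_words
    max_tokens=64,    # max_output_tokens
    best_of=16,       # 16 candidates, ranked by the reward model
)
```

As a sanity check on the metadata, win_ratio is simply num_wins / num_battles: 5265 / 12317 ≈ 0.42746, matching the recorded value.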
Running pipeline stage MKMLizer
Starting job with name cycy233-l3-bdpo-v5-c2-v2-mkmlizer
Waiting for job on cycy233-l3-bdpo-v5-c2-v2-mkmlizer to finish
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ╔═══════════════════════════════════════════════════════════════════════╗
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ║             _____ __ __                                               ║
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ║            / _/ /_ ___ __/ / ___ ___ / /                              ║
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ║           / _/ / // / |/|/ / _ \/ -_) -_) /                           ║
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ║          /_//_/\_, /|__,__/_//_/\__/\__/_/                            ║
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ║               /___/                                                   ║
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ║                                                                       ║
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ║ Version: 0.8.14                                                       ║
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc.                               ║
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ║ https://mk1.ai                                                        ║
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ║                                                                       ║
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ║ The license key for the current software has been verified as         ║
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ║ belonging to:                                                         ║
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ║                                                                       ║
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ║ Chai Research Corp.                                                   ║
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f                      ║
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ║ Expiration: 2024-10-15 23:59:59                                       ║
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ║                                                                       ║
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: ╚═══════════════════════════════════════════════════════════════════════╝
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: Downloaded to shared memory in 23.064s
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: quantizing model to /dev/shm/model_cache
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: Saving flywheel model at /dev/shm/model_cache
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: Loading 0: 98%|█████████▊| 286/291 [00:02<00:00, 114.55it/s]
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: quantized model in 28.746s
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: Processed model cycy233/L3-bdpo-v5-c2 in 51.811s
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: creating bucket guanaco-mkml-models
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/cycy233-l3-bdpo-v5-c2-v2
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/cycy233-l3-bdpo-v5-c2-v2/special_tokens_map.json
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/cycy233-l3-bdpo-v5-c2-v2/tokenizer_config.json
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/cycy233-l3-bdpo-v5-c2-v2/config.json
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/cycy233-l3-bdpo-v5-c2-v2/tokenizer.json
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/cycy233-l3-bdpo-v5-c2-v2/flywheel_model.0.safetensors
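
The cp lines above are the mkmlizer's own log format for the S3 upload; the pipeline's actual upload tooling is not identified in this log. A minimal sketch of an equivalent upload using boto3, with the bucket, prefix, and file names taken from the log:

```python
# Sketch only: equivalent of the cp lines above, written with boto3.
import boto3

s3 = boto3.client("s3")
bucket = "guanaco-mkml-models"
prefix = "cycy233-l3-bdpo-v5-c2-v2"
files = [
    "special_tokens_map.json",
    "tokenizer_config.json",
    "config.json",
    "tokenizer.json",
    "flywheel_model.0.safetensors",
]
for name in files:
    # Mirrors /dev/shm/model_cache/<name> -> s3://<bucket>/<prefix>/<name>
    s3.upload_file(f"/dev/shm/model_cache/{name}", bucket, f"{prefix}/{name}")
```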
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:919: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: warnings.warn(
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: warnings.warn(
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:769: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: warnings.warn(
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: warnings.warn(
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/torch/_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: return self.fget.__get__(instance, owner)()
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: Saving duration: 0.512s
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 4.280s
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: creating bucket guanaco-reward-models
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: Bucket 's3://guanaco-reward-models/' created
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/cycy233-l3-bdpo-v5-c2-v2_reward
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/cycy233-l3-bdpo-v5-c2-v2_reward/config.json
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/cycy233-l3-bdpo-v5-c2-v2_reward/special_tokens_map.json
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/cycy233-l3-bdpo-v5-c2-v2_reward/tokenizer_config.json
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/cycy233-l3-bdpo-v5-c2-v2_reward/merges.txt
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/cycy233-l3-bdpo-v5-c2-v2_reward/vocab.json
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/cycy233-l3-bdpo-v5-c2-v2_reward/tokenizer.json
cycy233-l3-bdpo-v5-c2-v2-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/cycy233-l3-bdpo-v5-c2-v2_reward/reward.tensors
Job cycy233-l3-bdpo-v5-c2-v2-mkmlizer completed after 83.96s with status: succeeded
Stopping job with name cycy233-l3-bdpo-v5-c2-v2-mkmlizer
Pipeline stage MKMLizer completed in 84.90s
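
With best_of=16, sixteen candidate completions are scored by the reward model packaged above and the highest-scoring one is returned. A minimal sketch of that selection step, assuming the reward repo loads as a Hugging Face sequence-classification model; the production scorer's actual interface is not shown in this log:

```python
# Sketch of best-of-N selection with the reward model named in this record.
# Assumption: the repo exposes a sequence-classification head.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

REWARD_REPO = "ChaiML/reward_gpt2_medium_preference_24m_e2"
tokenizer = AutoTokenizer.from_pretrained(REWARD_REPO)
reward_model = AutoModelForSequenceClassification.from_pretrained(REWARD_REPO)
reward_model.eval()

def pick_best(context: str, candidates: list[str]) -> str:
    """Return the candidate whose (context + candidate) text scores highest."""
    scores = []
    for cand in candidates:
        inputs = tokenizer(context + cand, return_tensors="pt",
                           truncation=True, max_length=512)
        with torch.no_grad():
            scores.append(reward_model(**inputs).logits[0, 0].item())
    return candidates[scores.index(max(scores))]
```

Per the reward_formatter in the record above, the scored context would be assembled with the same template layout as the generation prompt.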
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.11s
Running pipeline stage ISVCDeployer
Creating inference service cycy233-l3-bdpo-v5-c2-v2
Waiting for inference service cycy233-l3-bdpo-v5-c2-v2 to be ready
Inference service cycy233-l3-bdpo-v5-c2-v2 ready after 50.24069809913635s
Pipeline stage ISVCDeployer completed in 57.48s
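
The StressChecker below times live inference requests against the new service. As a rough sketch of what one such request might look like: the ":predict" URL pattern is inferred from the predictor endpoints visible elsewhere in this log, and the host and payload schema shown here are assumptions, not values from this record.

```python
# Sketch of a single stress-check style request (KServe v1 predict protocol,
# inferred from the predictor URLs in this log). Host and payload schema
# are assumptions.
import requests

URL = ("http://cycy233-l3-bdpo-v5-c2-v2-predictor"
       ".tenant-chaiml-guanaco.k.chaiverse.com"
       "/v1/models/GPT-J-6B-lit-v2:predict")

payload = {"text": "Bot's Persona: ...\n####\n...\n<START>\nBot:"}  # hypothetical
response = requests.post(URL, json=payload, timeout=30)
response.raise_for_status()
print(f"healthy response in {response.elapsed.total_seconds():.3f}s")
```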
Running pipeline stage StressChecker
Received healthy response to inference request in 2.0839734077453613s
Received healthy response to inference request in 1.259383201599121s
Received healthy response to inference request in 1.2732253074645996s
Received healthy response to inference request in 1.2750446796417236s
Received healthy response to inference request in 1.2859222888946533s
5 requests
0 failed requests
5th percentile: 1.2621516227722167
10th percentile: 1.2649200439453125
20th percentile: 1.270456886291504
30th percentile: 1.2735891819000245
40th percentile: 1.274316930770874
50th percentile: 1.2750446796417236
60th percentile: 1.2793957233428954
70th percentile: 1.2837467670440674
80th percentile: 1.445532512664795
90th percentile: 1.7647529602050782
95th percentile: 1.9243631839752195
99th percentile: 2.052051362991333
mean time: 1.4355097770690919
Pipeline stage StressChecker completed in 7.87s
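
The percentile summary above is consistent with linear interpolation over the five logged latencies; the snippet below reproduces it with numpy, using values copied from the log:

```python
# Reproduces the StressChecker summary from the five latencies logged above;
# numpy's default linear interpolation matches the reported percentiles.
import numpy as np

latencies = np.array([
    2.0839734077453613,
    1.259383201599121,
    1.2732253074645996,
    1.2750446796417236,
    1.2859222888946533,
])
for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {np.percentile(latencies, q)}")
print("mean time:", latencies.mean())  # 1.4355097770690919
```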
cycy233-l3-bdpo-v5-c2_v2 status is now deployed due to DeploymentManager action
cycy233-l3-bdpo-v5-c2_v2 status is now inactive due to auto-deactivation of underperforming models

Usage Metrics

[usage charts from the original dashboard are not preserved in this export]

Latency Metrics

[latency charts from the original dashboard are not preserved in this export]