submission_id: cycy233-l3-e-v2-c1_v21
developer_uid: shiroe40
alignment_samples: 0
best_of: 4
celo_rating: 1222.91
display_name: auto
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
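The formatter above maps a persona, scenario prompt, and chat history into Llama-3 chat markup. A minimal sketch of how such templates might be assembled into a single prompt string (the `build_prompt` helper and the sample conversation are illustrative assumptions, not the pipeline's actual code; token-level truncation to `max_input_tokens` is ignored):

```python
# Templates copied from the formatter config above.
MEMORY = "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n"
PROMPT = "{prompt}<|eot_id|>"
BOT = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>"
USER = "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>"
RESPONSE = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:"

def build_prompt(bot_name, memory, prompt, turns, user_name="User"):
    """Concatenate memory, scenario prompt, chat turns, and the response stub."""
    out = MEMORY.format(bot_name=bot_name, memory=memory)
    out += PROMPT.format(prompt=prompt)
    for role, message in turns:
        tpl = USER if role == "user" else BOT
        out += tpl.format(bot_name=bot_name, user_name=user_name, message=message)
    # The response template leaves the assistant turn open for generation.
    return out + RESPONSE.format(bot_name=bot_name)

# Hypothetical bot and conversation, purely for illustration.
text = build_prompt("Aria", "A friendly guide.", "A chat begins.",
                    [("user", "Hi!"), ("bot", "Hello there.")])
print(text)
```

The model then continues generating from the trailing `{bot_name}:` stub until it emits one of the configured stopping words.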
generation_params: {'temperature': 1.0, 'top_p': 0.9, 'min_p': 0.05, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|end_header_id|>', '<|eot_id|>'], 'max_input_tokens': 512, 'best_of': 4, 'max_output_tokens': 64, 'reward_max_token_input': 256}
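The generation parameters combine temperature scaling with top-k, top-p, and min-p truncation of the next-token distribution. A rough sketch of one plausible ordering of these standard filters on a toy logit vector (this illustrates the techniques, not the serving engine's actual implementation):

```python
import math

def filter_logits(logits, temperature=1.0, top_k=40, top_p=0.9, min_p=0.05):
    """Return (token_index, prob) pairs surviving top-k/min-p/top-p truncation."""
    # Temperature scaling, then a numerically stable softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = sorted(((e / total, i) for i, e in enumerate(exps)), reverse=True)
    # top-k: keep at most the k most likely tokens.
    probs = probs[:top_k]
    # min-p: drop tokens below min_p times the top probability.
    probs = [(p, i) for p, i in probs if p >= min_p * probs[0][0]]
    # top-p (nucleus): keep the smallest prefix whose mass reaches top_p.
    kept, mass = [], 0.0
    for p, i in probs:
        kept.append((i, p))
        mass += p
        if mass >= top_p:
            break
    return kept

# Toy 4-token vocabulary: the lowest-probability token is filtered out.
kept = filter_logits([2.0, 1.0, 0.1, -3.0])
print([i for i, _ in kept])  # -> [0, 1, 2]
```

Sampling then draws from the renormalized surviving tokens; with `best_of: 4`, four such completions are drawn per request.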
is_internal_developer: False
language_model: cycy233/L3-e-v2-c1
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: cycy233/L3-e-v2-c1
model_name: auto
model_num_parameters: 8030261248.0
model_repo: cycy233/L3-e-v2-c1
model_size: 8B
num_battles: 10665
num_wins: 5604
propriety_score: 0.7077777777777777
propriety_total_count: 900.0
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
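With `best_of: 4`, the pipeline samples several candidate completions and uses the reward model above (formatted with the reward templates) to pick one. A minimal sketch of best-of-n reranking; the `score` function used here (candidate length) is only a stand-in for the real reward model:

```python
def best_of_n(candidates, score):
    """Return the candidate with the highest reward-model score."""
    return max(candidates, key=score)

# Illustrative candidates and stand-in scorer (longer reply wins).
candidates = ["Hi.", "Hello there!", "Hey, good to see you again."]
best = best_of_n(candidates, score=len)
print(best)
```

In production the scorer would tokenize each candidate with the reward formatter and return the reward head's output, but the argmax-over-candidates structure is the same.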
status: torndown
submission_type: basic
timestamp: 2024-07-26T08:46:30+00:00
us_pacific_date: 2024-07-26
win_ratio: 0.5254571026722925
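As a sanity check, the reported `win_ratio` follows directly from the counts above, and the `propriety_score` is consistent with 637 of the 900 judged samples (the 637 figure is inferred from the ratio, not reported directly):

```python
# Counts taken from the submission metadata above.
num_wins, num_battles = 5604, 10665
win_ratio = num_wins / num_battles
print(win_ratio)  # 0.5254571026722925, matching the reported value

propriety_score, propriety_total = 0.7077777777777777, 900
# Implied number of samples judged proper (inferred, not reported).
print(round(propriety_score * propriety_total))  # 637
```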
Running pipeline stage MKMLizer
Starting job with name cycy233-l3-e-v2-c1-v21-mkmlizer
Waiting for job on cycy233-l3-e-v2-c1-v21-mkmlizer to finish
cycy233-l3-e-v2-c1-v21-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
cycy233-l3-e-v2-c1-v21-mkmlizer: ║ ["flywheel" ASCII-art logo] ║
cycy233-l3-e-v2-c1-v21-mkmlizer: ║ ║
cycy233-l3-e-v2-c1-v21-mkmlizer: ║ Version: 0.9.7 ║
cycy233-l3-e-v2-c1-v21-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
cycy233-l3-e-v2-c1-v21-mkmlizer: ║ https://mk1.ai ║
cycy233-l3-e-v2-c1-v21-mkmlizer: ║ ║
cycy233-l3-e-v2-c1-v21-mkmlizer: ║ The license key for the current software has been verified as ║
cycy233-l3-e-v2-c1-v21-mkmlizer: ║ belonging to: ║
cycy233-l3-e-v2-c1-v21-mkmlizer: ║ ║
cycy233-l3-e-v2-c1-v21-mkmlizer: ║ Chai Research Corp. ║
cycy233-l3-e-v2-c1-v21-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
cycy233-l3-e-v2-c1-v21-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
cycy233-l3-e-v2-c1-v21-mkmlizer: ║ ║
cycy233-l3-e-v2-c1-v21-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
cycy233-l3-e-v2-c1-v21-mkmlizer: Downloaded to shared memory in 25.118s
cycy233-l3-e-v2-c1-v21-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpcva08d15, device:0
cycy233-l3-e-v2-c1-v21-mkmlizer: Saving flywheel model at /dev/shm/model_cache
cycy233-l3-e-v2-c1-v21-mkmlizer: quantized model in 26.405s
cycy233-l3-e-v2-c1-v21-mkmlizer: Processed model cycy233/L3-e-v2-c1 in 51.524s
cycy233-l3-e-v2-c1-v21-mkmlizer: creating bucket guanaco-mkml-models
cycy233-l3-e-v2-c1-v21-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
cycy233-l3-e-v2-c1-v21-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/cycy233-l3-e-v2-c1-v21
cycy233-l3-e-v2-c1-v21-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/cycy233-l3-e-v2-c1-v21/config.json
cycy233-l3-e-v2-c1-v21-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/cycy233-l3-e-v2-c1-v21/special_tokens_map.json
cycy233-l3-e-v2-c1-v21-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/cycy233-l3-e-v2-c1-v21/tokenizer_config.json
cycy233-l3-e-v2-c1-v21-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/cycy233-l3-e-v2-c1-v21/tokenizer.json
cycy233-l3-e-v2-c1-v21-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/cycy233-l3-e-v2-c1-v21/flywheel_model.0.safetensors
cycy233-l3-e-v2-c1-v21-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
cycy233-l3-e-v2-c1-v21-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s]
cycy233-l3-e-v2-c1-v21-mkmlizer: Loading 0: 100%|█████████▉| 290/291 [00:11<00:00, 3.22it/s]
cycy233-l3-e-v2-c1-v21-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
cycy233-l3-e-v2-c1-v21-mkmlizer: warnings.warn(
cycy233-l3-e-v2-c1-v21-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
cycy233-l3-e-v2-c1-v21-mkmlizer: warnings.warn(
cycy233-l3-e-v2-c1-v21-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
cycy233-l3-e-v2-c1-v21-mkmlizer: warnings.warn(
cycy233-l3-e-v2-c1-v21-mkmlizer: Saving duration: 0.351s
cycy233-l3-e-v2-c1-v21-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 6.715s
cycy233-l3-e-v2-c1-v21-mkmlizer: creating bucket guanaco-reward-models
cycy233-l3-e-v2-c1-v21-mkmlizer: Bucket 's3://guanaco-reward-models/' created
cycy233-l3-e-v2-c1-v21-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/cycy233-l3-e-v2-c1-v21_reward
cycy233-l3-e-v2-c1-v21-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/cycy233-l3-e-v2-c1-v21_reward/config.json
cycy233-l3-e-v2-c1-v21-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/cycy233-l3-e-v2-c1-v21_reward/tokenizer_config.json
cycy233-l3-e-v2-c1-v21-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/cycy233-l3-e-v2-c1-v21_reward/special_tokens_map.json
cycy233-l3-e-v2-c1-v21-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/cycy233-l3-e-v2-c1-v21_reward/merges.txt
cycy233-l3-e-v2-c1-v21-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/cycy233-l3-e-v2-c1-v21_reward/vocab.json
cycy233-l3-e-v2-c1-v21-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/cycy233-l3-e-v2-c1-v21_reward/tokenizer.json
cycy233-l3-e-v2-c1-v21-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/cycy233-l3-e-v2-c1-v21_reward/reward.tensors
Job cycy233-l3-e-v2-c1-v21-mkmlizer completed after 84.94s with status: succeeded
Stopping job with name cycy233-l3-e-v2-c1-v21-mkmlizer
Pipeline stage MKMLizer completed in 86.10s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.09s
Running pipeline stage ISVCDeployer
Creating inference service cycy233-l3-e-v2-c1-v21
Waiting for inference service cycy233-l3-e-v2-c1-v21 to be ready
Inference service cycy233-l3-e-v2-c1-v21 ready after 70.78991341590881s
Pipeline stage ISVCDeployer completed in 72.98s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.9442672729492188s
Received healthy response to inference request in 1.100451946258545s
Received healthy response to inference request in 1.0441269874572754s
Received healthy response to inference request in 1.0948290824890137s
Received healthy response to inference request in 1.3540103435516357s
5 requests
0 failed requests
5th percentile: 1.054267406463623
10th percentile: 1.0644078254699707
20th percentile: 1.084688663482666
30th percentile: 1.0959536552429199
40th percentile: 1.0982028007507325
50th percentile: 1.100451946258545
60th percentile: 1.2018753051757813
70th percentile: 1.3032986640930175
80th percentile: 1.4720617294311524
90th percentile: 1.7081645011901856
95th percentile: 1.826215887069702
99th percentile: 1.9206569957733155
mean time: 1.3075371265411377
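The percentile figures above are consistent with linear interpolation over the five sorted latencies (the same "linear" method numpy's `percentile` uses by default). A quick reproduction in pure Python:

```python
def percentile(samples, q):
    """Linearly interpolated percentile (numpy's default 'linear' method)."""
    xs = sorted(samples)
    pos = q / 100 * (len(xs) - 1)
    lo = int(pos)
    frac = pos - lo
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + frac * (xs[hi] - xs[lo])

# The five healthy-response latencies from the stress check above.
latencies = [1.9442672729492188, 1.100451946258545, 1.0441269874572754,
             1.0948290824890137, 1.3540103435516357]
print(percentile(latencies, 50))        # 1.100451946258545 (reported 50th)
print(percentile(latencies, 90))        # matches the reported 90th percentile
print(sum(latencies) / len(latencies))  # matches the reported mean time
```

With only five samples, the upper percentiles are dominated by interpolation toward the single slowest request (1.944s), so they should be read as rough indicators rather than stable tail estimates.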
Pipeline stage StressChecker completed in 7.28s
cycy233-l3-e-v2-c1_v21 status is now deployed due to DeploymentManager action
cycy233-l3-e-v2-c1_v21 status is now inactive due to auto-deactivation of underperforming models
admin requested tearing down of cycy233-l3-e-v2-c1_v21
Running pipeline stage ISVCDeleter
Checking if service cycy233-l3-e-v2-c1-v21 is running
Tearing down inference service cycy233-l3-e-v2-c1-v21
Service cycy233-l3-e-v2-c1-v21 has been torn down
Pipeline stage ISVCDeleter completed in 4.81s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key cycy233-l3-e-v2-c1-v21/config.json from bucket guanaco-mkml-models
Deleting key cycy233-l3-e-v2-c1-v21/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key cycy233-l3-e-v2-c1-v21/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key cycy233-l3-e-v2-c1-v21/tokenizer.json from bucket guanaco-mkml-models
Deleting key cycy233-l3-e-v2-c1-v21/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key cycy233-l3-e-v2-c1-v21_reward/config.json from bucket guanaco-reward-models
Deleting key cycy233-l3-e-v2-c1-v21_reward/merges.txt from bucket guanaco-reward-models
Deleting key cycy233-l3-e-v2-c1-v21_reward/reward.tensors from bucket guanaco-reward-models
Deleting key cycy233-l3-e-v2-c1-v21_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key cycy233-l3-e-v2-c1-v21_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key cycy233-l3-e-v2-c1-v21_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key cycy233-l3-e-v2-c1-v21_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 5.22s
cycy233-l3-e-v2-c1_v21 status is now torndown due to DeploymentManager action
Usage Metrics

[usage chart not captured in this export]

Latency Metrics

[latency chart not captured in this export]