submission_id: cycy233-l3-ss-e-v2-c1_v1
developer_uid: shiroe40
alignment_samples: 10847
alignment_score: -0.06829345172429473
best_of: 16
celo_rating: 1222.93
display_name: auto
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 0.9, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['<|end_header_id|>', '<|eot_id|>'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64, 'reward_max_token_input': 256}
is_internal_developer: False
language_model: cycy233/L3-ss-e-v2-c1
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: cycy233/L3-ss-e-v2-c1
model_name: auto
model_num_parameters: 8030261248.0
model_repo: cycy233/L3-ss-e-v2-c1
model_size: 8B
num_battles: 10847
num_wins: 5516
propriety_score: 0.7117021276595744
propriety_total_count: 940.0
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: Jellywibble/gpt2_xl_pairwise_89m_step_347634
status: torndown
submission_type: basic
timestamp: 2024-08-14T03:47:11+00:00
us_pacific_date: 2024-08-13
win_ratio: 0.5085277035124919
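The `formatter` templates above define how a conversation is flattened into a single Llama-3 prompt. A minimal sketch of that assembly (not the platform's actual code; the persona, names, and messages are hypothetical placeholders):

```python
# Templates copied from the `formatter` field above.
memory_template = ("<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
                   "{bot_name}'s Persona: {memory}\n\n")
prompt_template = "{prompt}<|eot_id|>"
user_template = ("<|start_header_id|>user<|end_header_id|>\n\n"
                 "{user_name}: {message}<|eot_id|>")
bot_template = ("<|start_header_id|>assistant<|end_header_id|>\n\n"
                "{bot_name}: {message}<|eot_id|>")
response_template = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:"

def build_prompt(bot_name, memory, prompt, turns):
    """Concatenate the templates; `turns` is a list of (role, name, message)."""
    parts = [memory_template.format(bot_name=bot_name, memory=memory),
             prompt_template.format(prompt=prompt)]
    for role, name, message in turns:
        if role == "user":
            parts.append(user_template.format(user_name=name, message=message))
        else:
            parts.append(bot_template.format(bot_name=name, message=message))
    # The prompt ends mid-turn so the model continues as the bot.
    parts.append(response_template.format(bot_name=bot_name))
    return "".join(parts)

# Hypothetical example conversation.
text = build_prompt("Aria", "A friendly guide.", "Scenario starts here.",
                    [("user", "Sam", "Hello!")])
```

Truncation to the 512-token `max_input_tokens` budget happens separately (`truncate_by_message: False` drops tokens rather than whole messages).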
Running pipeline stage MKMLizer
Starting job with name cycy233-l3-ss-e-v2-c1-v1-mkmlizer
Waiting for job on cycy233-l3-ss-e-v2-c1-v1-mkmlizer to finish
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ║ _____ __ __ ║
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ║ /___/ ║
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ║ ║
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ║ Version: 0.9.9 ║
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ║ https://mk1.ai ║
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ║ ║
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ║ The license key for the current software has been verified as ║
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ║ belonging to: ║
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ║ ║
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ║ Chai Research Corp. ║
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ║ ║
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: Downloaded to shared memory in 30.437s
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp0arkl14o, device:0
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: quantized model in 25.602s
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: Processed model cycy233/L3-ss-e-v2-c1 in 56.039s
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: creating bucket guanaco-mkml-models
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/cycy233-l3-ss-e-v2-c1-v1
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/cycy233-l3-ss-e-v2-c1-v1/config.json
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/cycy233-l3-ss-e-v2-c1-v1/special_tokens_map.json
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/cycy233-l3-ss-e-v2-c1-v1/tokenizer_config.json
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/cycy233-l3-ss-e-v2-c1-v1/tokenizer.json
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/cycy233-l3-ss-e-v2-c1-v1/flywheel_model.0.safetensors
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: loading reward model from Jellywibble/gpt2_xl_pairwise_89m_step_347634
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s] ... Loading 0: 98%|█████████▊| 286/291 [00:05<00:00, 75.53it/s]
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: warnings.warn(
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: warnings.warn(
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: warnings.warn(
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: Saving duration: 1.366s
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: Processed model Jellywibble/gpt2_xl_pairwise_89m_step_347634 in 9.581s
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: creating bucket guanaco-reward-models
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: Bucket 's3://guanaco-reward-models/' created
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/cycy233-l3-ss-e-v2-c1-v1_reward
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/cycy233-l3-ss-e-v2-c1-v1_reward/config.json
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/cycy233-l3-ss-e-v2-c1-v1_reward/special_tokens_map.json
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/cycy233-l3-ss-e-v2-c1-v1_reward/tokenizer_config.json
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/cycy233-l3-ss-e-v2-c1-v1_reward/merges.txt
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/cycy233-l3-ss-e-v2-c1-v1_reward/vocab.json
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/cycy233-l3-ss-e-v2-c1-v1_reward/tokenizer.json
cycy233-l3-ss-e-v2-c1-v1-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/cycy233-l3-ss-e-v2-c1-v1_reward/reward.tensors
Job cycy233-l3-ss-e-v2-c1-v1-mkmlizer completed after 102.99s with status: succeeded
Stopping job with name cycy233-l3-ss-e-v2-c1-v1-mkmlizer
Pipeline stage MKMLizer completed in 104.22s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.09s
Running pipeline stage ISVCDeployer
Creating inference service cycy233-l3-ss-e-v2-c1-v1
Waiting for inference service cycy233-l3-ss-e-v2-c1-v1 to be ready
Inference service cycy233-l3-ss-e-v2-c1-v1 ready after 302.95262455940247s
Pipeline stage ISVCDeployer completed in 304.42s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.5394999980926514s
Received healthy response to inference request in 1.6062655448913574s
Received healthy response to inference request in 1.5089867115020752s
Received healthy response to inference request in 1.5559439659118652s
Received healthy response to inference request in 1.5753557682037354s
5 requests
0 failed requests
5th percentile: 1.5183781623840331
10th percentile: 1.5277696132659913
20th percentile: 1.5465525150299073
30th percentile: 1.5598263263702392
40th percentile: 1.5675910472869874
50th percentile: 1.5753557682037354
60th percentile: 1.5877196788787842
70th percentile: 1.600083589553833
80th percentile: 1.7929124355316164
90th percentile: 2.166206216812134
95th percentile: 2.3528531074523924
99th percentile: 2.5021706199645997
mean time: 1.7572103977203368
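The summary statistics above follow from the five response times by linear interpolation between order statistics (a sketch, assuming the checker uses the common linear-interpolation percentile convention, the same as NumPy's default):

```python
# The five healthy-response latencies reported above, in seconds.
times = [2.5394999980926514, 1.6062655448913574, 1.5089867115020752,
         1.5559439659118652, 1.5753557682037354]

def percentile(data, p):
    """p-th percentile (0-100) via linear interpolation between sorted samples."""
    s = sorted(data)
    rank = (len(s) - 1) * p / 100.0
    lo = int(rank)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (rank - lo)

mean_time = sum(times) / len(times)   # matches the reported mean time
p5 = percentile(times, 5)             # matches the reported 5th percentile
p50 = percentile(times, 50)           # the median is the middle sample
```

With only five samples, every reported percentile is an interpolation between at most two observations, which is why the 80th-99th percentiles climb steeply toward the single 2.54 s outlier.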
Pipeline stage StressChecker completed in 9.63s
cycy233-l3-ss-e-v2-c1_v1 status is now deployed due to DeploymentManager action
cycy233-l3-ss-e-v2-c1_v1 status is now inactive due to auto deactivation removed underperforming models
cycy233-l3-ss-e-v2-c1_v1 status is now torndown due to DeploymentManager action
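The `best_of: 16` setting together with the separate reward model (`Jellywibble/gpt2_xl_pairwise_89m_step_347634`) implies candidate reranking at serving time: sample several completions, score each with the reward model, return the best. A minimal sketch of that selection step (the length-based scorer here is a stand-in, not the real reward model):

```python
def best_of_n(candidates, reward_fn):
    """Return the candidate the reward function scores highest."""
    return max(candidates, key=reward_fn)

# Hypothetical stand-ins for 16 sampled completions; a real deployment
# would score each with the pairwise reward model instead of len().
candidates = ["Hi.", "Hello there, nice to meet you!", "Hey"]
best = best_of_n(candidates, reward_fn=len)
```

Note the reward model sees its own, simpler prompt format (the `reward_formatter` field above) and at most `reward_max_token_input: 256` tokens per candidate.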

Usage Metrics

Latency Metrics