submission_id: jic062-dpo-v1-1_v1
developer_uid: chace9580
alignment_samples: 10933
alignment_score: 0.7934476912561704
best_of: 16
celo_rating: 1255.48
display_name: jic062-dpo-v1-1_v1
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '|eot_id|'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
is_internal_developer: False
language_model: jic062/dpo-v1.1
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: jic062/dpo-v1.1
model_name: jic062-dpo-v1-1_v1
model_num_parameters: 8030261248.0
model_repo: jic062/dpo-v1.1
model_size: 8B
num_battles: 10933
num_wins: 5773
propriety_score: 0.7133550488599348
propriety_total_count: 921.0
ranking_group: single
status: inactive
submission_type: basic
timestamp: 2024-08-24T01:40:34+00:00
us_pacific_date: 2024-08-23
win_ratio: 0.5280343912924175
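The formatter and generation_params fields above describe how each chat turn is rendered into a Llama-3-style prompt and how completions are sampled: up to 512 input tokens, 64 output tokens, best_of 16 candidates per request, and generation stops at a newline or |eot_id|. The sketch below assembles the same templates into a prompt string; the render_prompt helper, its argument names, and the assumption that pieces are concatenated in memory / prompt / history / response order are illustrative, not the platform's actual serving code.

```python
# Minimal sketch of how the formatter templates above could be assembled into a
# single prompt string. The render_prompt helper and the concatenation order
# (memory, then prompt, then chat history, then response header) are assumptions,
# not the platform's actual implementation.
FORMATTER = {
    "memory_template": "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n",
    "prompt_template": "{prompt}<|eot_id|>",
    "bot_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>",
    "user_template": "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>",
    "response_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:",
}

def render_prompt(bot_name, user_name, memory, prompt, history):
    """history is a list of (role, message) tuples with role in {'user', 'bot'}."""
    parts = [
        FORMATTER["memory_template"].format(bot_name=bot_name, memory=memory),
        FORMATTER["prompt_template"].format(prompt=prompt),
    ]
    for role, message in history:
        template = FORMATTER["user_template"] if role == "user" else FORMATTER["bot_template"]
        parts.append(template.format(bot_name=bot_name, user_name=user_name, message=message))
    # The response template primes the model to answer as the bot.
    parts.append(FORMATTER["response_template"].format(bot_name=bot_name))
    return "".join(parts)

# Sampling settings mirroring generation_params above; the key names follow
# common sampler conventions and may not match the serving stack exactly.
GENERATION_PARAMS = {
    "temperature": 1.0,
    "top_p": 1.0,
    "min_p": 0.0,
    "top_k": 40,
    "presence_penalty": 0.0,
    "frequency_penalty": 0.0,
    "stop": ["\n", "|eot_id|"],
    "max_input_tokens": 512,
    "max_output_tokens": 64,
    "best_of": 16,
}
```

For reference, the reported win_ratio matches num_wins / num_battles = 5773 / 10933 ≈ 0.5280, and best_of: 16 means sixteen candidate completions are sampled per request, with one returned.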
Running pipeline stage MKMLizer
Starting job with name jic062-dpo-v1-1-v1-mkmlizer
Waiting for job on jic062-dpo-v1-1-v1-mkmlizer to finish
jic062-dpo-v1-1-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
jic062-dpo-v1-1-v1-mkmlizer: ║ [flywheel ASCII art logo] ║
jic062-dpo-v1-1-v1-mkmlizer: ║ ║
jic062-dpo-v1-1-v1-mkmlizer: ║ Version: 0.10.1 ║
jic062-dpo-v1-1-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
jic062-dpo-v1-1-v1-mkmlizer: ║ https://mk1.ai ║
jic062-dpo-v1-1-v1-mkmlizer: ║ ║
jic062-dpo-v1-1-v1-mkmlizer: ║ The license key for the current software has been verified as ║
jic062-dpo-v1-1-v1-mkmlizer: ║ belonging to: ║
jic062-dpo-v1-1-v1-mkmlizer: ║ ║
jic062-dpo-v1-1-v1-mkmlizer: ║ Chai Research Corp. ║
jic062-dpo-v1-1-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
jic062-dpo-v1-1-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
jic062-dpo-v1-1-v1-mkmlizer: ║ ║
jic062-dpo-v1-1-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
jic062-dpo-v1-1-v1-mkmlizer: Downloaded to shared memory in 33.837s
jic062-dpo-v1-1-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp1rordghg, device:0
jic062-dpo-v1-1-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
jic062-dpo-v1-1-v1-mkmlizer: quantized model in 25.802s
jic062-dpo-v1-1-v1-mkmlizer: Processed model jic062/dpo-v1.1 in 59.639s
jic062-dpo-v1-1-v1-mkmlizer: creating bucket guanaco-mkml-models
jic062-dpo-v1-1-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
jic062-dpo-v1-1-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/jic062-dpo-v1-1-v1
jic062-dpo-v1-1-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/jic062-dpo-v1-1-v1/config.json
jic062-dpo-v1-1-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/jic062-dpo-v1-1-v1/special_tokens_map.json
jic062-dpo-v1-1-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/jic062-dpo-v1-1-v1/tokenizer_config.json
jic062-dpo-v1-1-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/jic062-dpo-v1-1-v1/tokenizer.json
jic062-dpo-v1-1-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/jic062-dpo-v1-1-v1/flywheel_model.0.safetensors
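The cp lines above copy the quantized artifacts from /dev/shm/model_cache into the guanaco-mkml-models bucket under the jic062-dpo-v1-1-v1 prefix. The snippet below is a minimal boto3 re-creation of those copies for illustration only; the log does not show which upload tooling the mkmlizer actually uses, and credentials are assumed to come from the environment.

```python
# Illustrative re-creation of the uploads logged above using boto3. Bucket,
# prefix, and file names are copied from the cp lines; the actual mkmlizer
# upload mechanism is not shown in the log.
import boto3

BUCKET = "guanaco-mkml-models"
PREFIX = "jic062-dpo-v1-1-v1"
FILES = [
    "config.json",
    "special_tokens_map.json",
    "tokenizer_config.json",
    "tokenizer.json",
    "flywheel_model.0.safetensors",
]

s3 = boto3.client("s3")
for name in FILES:
    # e.g. /dev/shm/model_cache/config.json -> s3://guanaco-mkml-models/jic062-dpo-v1-1-v1/config.json
    s3.upload_file(f"/dev/shm/model_cache/{name}", BUCKET, f"{PREFIX}/{name}")
```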
jic062-dpo-v1-1-v1-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s] ... Loading 0: 99%|█████████▉| 288/291 [00:11<00:00, 3.30it/s]
Job jic062-dpo-v1-1-v1-mkmlizer completed after 85.43s with status: succeeded
Stopping job with name jic062-dpo-v1-1-v1-mkmlizer
Pipeline stage MKMLizer completed in 86.48s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service jic062-dpo-v1-1-v1
Waiting for inference service jic062-dpo-v1-1-v1 to be ready
Inference service jic062-dpo-v1-1-v1 ready after 211.48000764846802s
Pipeline stage ISVCDeployer completed in 212.71s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.192013740539551s
Received healthy response to inference request in 2.0318024158477783s
Received healthy response to inference request in 1.709522008895874s
Received healthy response to inference request in 1.6391122341156006s
Received healthy response to inference request in 1.585284948348999s
5 requests
0 failed requests
5th percentile: 1.5960504055023192
10th percentile: 1.6068158626556397
20th percentile: 1.6283467769622804
30th percentile: 1.6531941890716553
40th percentile: 1.6813580989837646
50th percentile: 1.709522008895874
60th percentile: 1.8384341716766357
70th percentile: 1.9673463344573974
80th percentile: 2.063844680786133
90th percentile: 2.127929210662842
95th percentile: 2.1599714756011963
99th percentile: 2.18560528755188
mean time: 1.8315470695495606
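The summary statistics above can be reproduced from the five logged request latencies using linear interpolation between sorted samples, which is numpy.percentile's default behaviour; whether the StressChecker actually computes them this way is an assumption.

```python
# Reproduce the stress-check summary statistics from the five logged latencies.
# Linear interpolation between order statistics (numpy's default) matches the
# reported percentiles and mean.
import numpy as np

latencies = [
    2.192013740539551,
    2.0318024158477783,
    1.709522008895874,
    1.6391122341156006,
    1.585284948348999,
]

for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {np.percentile(latencies, q)}")
print(f"mean time: {np.mean(latencies)}")
```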
Pipeline stage StressChecker completed in 10.46s
jic062-dpo-v1-1_v1 status is now deployed due to DeploymentManager action
jic062-dpo-v1-1_v1 status is now inactive due to auto deactivation of underperforming models

Usage Metrics

Latency Metrics