submission_id: jic062-instruct-g32-lr25_5927_v1
developer_uid: chace9580
alignment_samples: 20217
alignment_score: 0.17530330511269676
best_of: 16
celo_rating: 1203.18
display_name: jic062-instruct-g32-lr25_5927_v1
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|end_of_text|>', '|eot_id|'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
is_internal_developer: False
language_model: jic062/instruct_g32_lr25_dp7_c1
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: jic062/instruct_g32_lr25
model_name: jic062-instruct-g32-lr25_5927_v1
model_num_parameters: 8030261248.0
model_repo: jic062/instruct_g32_lr25_dp7_c1
model_size: 8B
num_battles: 20217
num_wins: 8787
propriety_score: 0.7279521674140508
propriety_total_count: 2007.0
ranking_group: single
status: torndown
submission_type: basic
timestamp: 2024-08-15T20:34:04+00:00
us_pacific_date: 2024-08-15
win_ratio: 0.43463421872681407
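The formatter and generation_params fields above describe how a conversation is rendered into a Llama-3-style prompt and how replies are sampled (stopping on "\n", "<|end_of_text|>" or "|eot_id|", with 512 input and 64 output tokens), and win_ratio is simply num_wins / num_battles = 8787 / 20217 ≈ 0.4346. A minimal sketch of how those templates could be assembled into a prompt is below; the persona, names and messages are hypothetical placeholders, and the actual serving code is not shown on this page.

# Minimal sketch: renders a chat turn with this submission's formatter templates.
# The persona, names and messages below are hypothetical placeholders.
formatter = {
    "memory_template": "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n",
    "prompt_template": "{prompt}<|eot_id|>",
    "bot_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>",
    "user_template": "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>",
    "response_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """turns is a list of (role, message) pairs, role in {"user", "bot"}."""
    text = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    text += formatter["prompt_template"].format(prompt=prompt)
    for role, message in turns:
        if role == "user":
            text += formatter["user_template"].format(user_name=user_name, message=message)
        else:
            text += formatter["bot_template"].format(bot_name=bot_name, message=message)
    # Trailing assistant header cues the model to answer as the bot;
    # generation then stops at one of the configured stopping words.
    return text + formatter["response_template"].format(bot_name=bot_name)

print(build_prompt(
    bot_name="Bot", user_name="User",      # hypothetical names
    memory="A friendly assistant.",        # hypothetical persona
    prompt="Stay in character.",           # hypothetical prompt
    turns=[("user", "Hi there!")],
))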
Running pipeline stage MKMLizer
Starting job with name jic062-instruct-g32-lr25-5927-v1-mkmlizer
Waiting for job on jic062-instruct-g32-lr25-5927-v1-mkmlizer to finish
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ║ _____ __ __ ║
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ║ /___/ ║
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ║ ║
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ║ Version: 0.9.11 ║
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ║ https://mk1.ai ║
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ║ ║
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ║ The license key for the current software has been verified as ║
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ║ belonging to: ║
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ║ ║
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ║ Chai Research Corp. ║
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ║ ║
jic062-instruct-g32-lr25-5927-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
Failed to get response for submission function_fahor_2024-08-15: no entry with id "function_fahor_2024-08-15" found on database!
jic062-instruct-g32-lr25-5927-v1-mkmlizer: Downloaded to shared memory in 41.435s
jic062-instruct-g32-lr25-5927-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpka3d2zjw, device:0
jic062-instruct-g32-lr25-5927-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
jic062-instruct-g32-lr25-5927-v1-mkmlizer: quantized model in 26.815s
jic062-instruct-g32-lr25-5927-v1-mkmlizer: Processed model jic062/instruct_g32_lr25_dp7_c1 in 68.251s
jic062-instruct-g32-lr25-5927-v1-mkmlizer: creating bucket guanaco-mkml-models
jic062-instruct-g32-lr25-5927-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
jic062-instruct-g32-lr25-5927-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/jic062-instruct-g32-lr25-5927-v1
jic062-instruct-g32-lr25-5927-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/jic062-instruct-g32-lr25-5927-v1/special_tokens_map.json
jic062-instruct-g32-lr25-5927-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/jic062-instruct-g32-lr25-5927-v1/config.json
jic062-instruct-g32-lr25-5927-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/jic062-instruct-g32-lr25-5927-v1/tokenizer_config.json
jic062-instruct-g32-lr25-5927-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/jic062-instruct-g32-lr25-5927-v1/tokenizer.json
jic062-instruct-g32-lr25-5927-v1-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s] ... 98%|█████████▊| 286/291 [00:06<00:00, 75.64it/s]
Job jic062-instruct-g32-lr25-5927-v1-mkmlizer completed after 95.07s with status: succeeded
Stopping job with name jic062-instruct-g32-lr25-5927-v1-mkmlizer
Pipeline stage MKMLizer completed in 96.02s
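The MKMLizer stage above downloads the repo, quantizes it into /dev/shm/model_cache, and copies the result to S3. A rough, illustrative equivalent of the bucket-creation and cp steps using boto3 follows; the bucket name and file paths are taken from the log, but the actual MKMLizer implementation is not shown here and may differ.

# Illustrative equivalent of the upload steps above, not the actual MKMLizer code.
import os
import boto3
from botocore.exceptions import ClientError

BUCKET = "guanaco-mkml-models"
PREFIX = "jic062-instruct-g32-lr25-5927-v1"
MODEL_CACHE = "/dev/shm/model_cache"

s3 = boto3.client("s3")

# "creating bucket guanaco-mkml-models": create it only if it does not exist yet.
try:
    s3.head_bucket(Bucket=BUCKET)
except ClientError:
    s3.create_bucket(Bucket=BUCKET)

# "cp /dev/shm/model_cache/<file> s3://guanaco-mkml-models/<submission>/<file>"
for name in ("special_tokens_map.json", "config.json",
             "tokenizer_config.json", "tokenizer.json"):
    s3.upload_file(os.path.join(MODEL_CACHE, name), BUCKET, f"{PREFIX}/{name}")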
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service jic062-instruct-g32-lr25-5927-v1
Waiting for inference service jic062-instruct-g32-lr25-5927-v1 to be ready
Inference service jic062-instruct-g32-lr25-5927-v1 ready after 221.50933647155762s
Pipeline stage ISVCDeployer completed in 223.30s
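The ISVCDeployer stage creates the inference service and polls it until it reports ready (221.5s here). A minimal polling sketch is below, assuming a KServe InferenceService on Kubernetes ("isvc" is the usual KServe shorthand); the namespace, timeout and interval are assumptions, and this is not the actual ISVCDeployer implementation.

# Sketch of a readiness wait for the inference service, assuming a KServe
# InferenceService on Kubernetes; namespace and polling values are assumptions.
import time
from kubernetes import client, config

def wait_for_isvc(name, namespace="default", timeout=600, interval=5):
    config.load_kube_config()
    api = client.CustomObjectsApi()
    deadline = time.time() + timeout
    while time.time() < deadline:
        obj = api.get_namespaced_custom_object(
            group="serving.kserve.io", version="v1beta1",
            namespace=namespace, plural="inferenceservices", name=name)
        conditions = obj.get("status", {}).get("conditions", [])
        if any(c.get("type") == "Ready" and c.get("status") == "True"
               for c in conditions):
            return True
        time.sleep(interval)
    raise TimeoutError(f"inference service {name} not ready after {timeout}s")

# e.g. wait_for_isvc("jic062-instruct-g32-lr25-5927-v1")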
Running pipeline stage StressChecker
Received healthy response to inference request in 1.8786702156066895s
Received healthy response to inference request in 1.5036535263061523s
Received healthy response to inference request in 1.5431180000305176s
Received healthy response to inference request in 1.386185646057129s
Received healthy response to inference request in 1.7764766216278076s
5 requests
0 failed requests
5th percentile: 1.4096792221069336
10th percentile: 1.4331727981567384
20th percentile: 1.4801599502563476
30th percentile: 1.5115464210510254
40th percentile: 1.5273322105407714
50th percentile: 1.5431180000305176
60th percentile: 1.6364614486694335
70th percentile: 1.7298048973083495
80th percentile: 1.796915340423584
90th percentile: 1.8377927780151366
95th percentile: 1.858231496810913
99th percentile: 1.8745824718475341
mean time: 1.6176208019256593
Pipeline stage StressChecker completed in 8.79s
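The StressChecker statistics above can be reproduced from the five response times with linear-interpolation percentiles, e.g. numpy's default:

# Reproduces the StressChecker statistics from the five response times above.
import numpy as np

latencies = np.array([
    1.8786702156066895,
    1.5036535263061523,
    1.5431180000305176,
    1.386185646057129,
    1.7764766216278076,
])

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    # Linear interpolation matches the reported values,
    # e.g. the 50th percentile is the median, 1.5431180000305176.
    print(f"{p}th percentile: {np.percentile(latencies, p)}")

print("mean time:", latencies.mean())  # 1.6176208019256593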
jic062-instruct-g32-lr25_5927_v1 status is now deployed due to DeploymentManager action
jic062-instruct-g32-lr25_5927_v1 status is now inactive due to auto deactivation of underperforming models
jic062-instruct-g32-lr25_5927_v1 status is now torndown due to DeploymentManager action

Usage Metrics

Latency Metrics