submission_id: jic062-instruct-g32-lr25_4505_v1
developer_uid: chace9580
alignment_samples: 18756
alignment_score: -0.06313733123460985
best_of: 16
celo_rating: 1258.14
display_name: jic062-instruct-g32-lr25_4505_v1
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|end_of_text|>', '|eot_id|'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
is_internal_developer: False
language_model: jic062/instruct_g32_lr25_dp7_c3
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: jic062/instruct_g32_lr25
model_name: jic062-instruct-g32-lr25_4505_v1
model_num_parameters: 8030261248.0
model_repo: jic062/instruct_g32_lr25_dp7_c3
model_size: 8B
num_battles: 18756
num_wins: 9699
propriety_score: 0.7372881355932204
propriety_total_count: 1888.0
ranking_group: single
status: torndown
submission_type: basic
timestamp: 2024-08-15T20:35:27+00:00
us_pacific_date: 2024-08-15
win_ratio: 0.5171145233525272
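For readability, the formatter templates above can be exercised with a short sketch. The exact assembly order the serving stack uses is not shown in this log; the helper below assumes memory → prompt → alternating turns → open response header, and `build_prompt` is a hypothetical name, not part of the actual pipeline:

```python
# Sketch: assemble a Llama-3-style conversation string from the
# formatter templates recorded in this submission's metadata.
formatter = {
    'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n",
    'prompt_template': '{prompt}<|eot_id|>',
    'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>',
    'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>',
    'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:',
}

def build_prompt(memory, prompt, turns, bot_name, user_name):
    # System memory first, then the pinned prompt, then each turn,
    # ending with the open assistant header the model completes from.
    parts = [formatter['memory_template'].format(bot_name=bot_name, memory=memory)]
    parts.append(formatter['prompt_template'].format(prompt=prompt))
    for role, message in turns:
        if role == 'user':
            parts.append(formatter['user_template'].format(user_name=user_name, message=message))
        else:
            parts.append(formatter['bot_template'].format(bot_name=bot_name, message=message))
    parts.append(formatter['response_template'].format(bot_name=bot_name))
    return ''.join(parts)
```

Note that `max_input_tokens: 512` means the assembled string is truncated before inference; how truncation interacts with `truncate_by_message: False` is not detailed in this log.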
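The generation_params above describe sampling with temperature 1.0 and top_k 40 (min_p, presence, and frequency penalties all disabled), with best_of 16 candidates per request. A minimal pure-Python sketch of top-k temperature sampling; `sample_next_token` is a hypothetical helper for illustration, not the serving stack's implementation:

```python
import math
import random

def sample_next_token(logits, temperature=1.0, top_k=40, seed=0):
    # Keep only the top_k highest-scoring tokens, then sample from the
    # temperature-scaled softmax over that subset.
    rng = random.Random(seed)
    scaled = [l / temperature for l in logits]
    top = sorted(range(len(scaled)), key=lambda i: scaled[i])[-top_k:]
    m = max(scaled[i] for i in top)                    # for numerical stability
    weights = [math.exp(scaled[i] - m) for i in top]
    return top[rng.choices(range(len(top)), weights=weights)[0]]
```

With best_of 16, the engine would draw 16 such completions and keep the highest-ranked one; the ranking criterion is not recorded in this log.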
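As a sanity check, win_ratio is simply num_wins / num_battles, and the reported value reproduces exactly:

```python
num_wins, num_battles = 9699, 18756
win_ratio = num_wins / num_battles   # ≈ 0.5171, as reported above
```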
Running pipeline stage MKMLizer
Starting job with name jic062-instruct-g32-lr25-4505-v1-mkmlizer
Waiting for job on jic062-instruct-g32-lr25-4505-v1-mkmlizer to finish
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ║ _____ __ __ ║
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ║ /___/ ║
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ║ ║
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ║ Version: 0.9.11 ║
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ║ https://mk1.ai ║
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ║ ║
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ║ The license key for the current software has been verified as ║
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ║ belonging to: ║
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ║ ║
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ║ Chai Research Corp. ║
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ║ ║
jic062-instruct-g32-lr25-4505-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
jic062-instruct-g32-lr25-4505-v1-mkmlizer: Downloaded to shared memory in 43.805s
jic062-instruct-g32-lr25-4505-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpxmnk9pv1, device:0
jic062-instruct-g32-lr25-4505-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Failed to get response for submission function_fahor_2024-08-15: no entry with id "function_fahor_2024-08-15" found on database!
Failed to get response for submission jellywibble-lora-120k-pr_2801_v2: ('http://jellywibble-lora-120k-pr-2801-v2-predictor-default.tenant-chaiml-guanaco.knative.ord1.coreweave.cloud/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:51638->127.0.0.1:8080: read: connection reset by peer\n')
jic062-instruct-g32-lr25-4505-v1-mkmlizer: quantized model in 27.091s
jic062-instruct-g32-lr25-4505-v1-mkmlizer: Processed model jic062/instruct_g32_lr25_dp7_c3 in 70.896s
jic062-instruct-g32-lr25-4505-v1-mkmlizer: creating bucket guanaco-mkml-models
jic062-instruct-g32-lr25-4505-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
jic062-instruct-g32-lr25-4505-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/jic062-instruct-g32-lr25-4505-v1
jic062-instruct-g32-lr25-4505-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/jic062-instruct-g32-lr25-4505-v1/special_tokens_map.json
jic062-instruct-g32-lr25-4505-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/jic062-instruct-g32-lr25-4505-v1/config.json
jic062-instruct-g32-lr25-4505-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/jic062-instruct-g32-lr25-4505-v1/tokenizer_config.json
jic062-instruct-g32-lr25-4505-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/jic062-instruct-g32-lr25-4505-v1/tokenizer.json
jic062-instruct-g32-lr25-4505-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/jic062-instruct-g32-lr25-4505-v1/flywheel_model.0.safetensors
jic062-instruct-g32-lr25-4505-v1-mkmlizer: Loading 0: 0/291 → 285/291 shards in ~6 s (mostly 60–80 it/s, with brief dips to ~18–20 it/s around shards 86 and 193)
Job jic062-instruct-g32-lr25-4505-v1-mkmlizer completed after 94.47s with status: succeeded
Stopping job with name jic062-instruct-g32-lr25-4505-v1-mkmlizer
Pipeline stage MKMLizer completed in 95.25s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.19s
Running pipeline stage ISVCDeployer
Creating inference service jic062-instruct-g32-lr25-4505-v1
Waiting for inference service jic062-instruct-g32-lr25-4505-v1 to be ready
Failed to get response for submission function_fahor_2024-08-15: no entry with id "function_fahor_2024-08-15" found on database!
Inference service jic062-instruct-g32-lr25-4505-v1 ready after 231.5681586265564s
Pipeline stage ISVCDeployer completed in 233.61s
Running pipeline stage StressChecker
Received healthy response to inference request in 4.2586188316345215s
Received healthy response to inference request in 4.008186340332031s
Received healthy response to inference request in 1.5570876598358154s
Received healthy response to inference request in 3.2343366146087646s
Received healthy response to inference request in 1.5776307582855225s
5 requests
0 failed requests
5th percentile: 1.5611962795257568
10th percentile: 1.5653048992156982
20th percentile: 1.573522138595581
30th percentile: 1.9089719295501708
40th percentile: 2.571654272079468
50th percentile: 3.2343366146087646
60th percentile: 3.5438765048980714
70th percentile: 3.8534163951873777
80th percentile: 4.058272838592529
90th percentile: 4.158445835113525
95th percentile: 4.208532333374023
99th percentile: 4.2486015319824215
mean time: 2.927172040939331
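The percentile figures above are reproducible from the five raw latencies using linear interpolation between order statistics (the method numpy.percentile uses by default). The `percentile` helper below is illustrative, not the stress checker's actual code:

```python
def percentile(samples, p):
    # Linear interpolation between sorted order statistics
    # (numpy.percentile's default 'linear' method).
    xs = sorted(samples)
    rank = p / 100 * (len(xs) - 1)
    lo = int(rank)
    frac = rank - lo
    if lo + 1 < len(xs):
        return xs[lo] + frac * (xs[lo + 1] - xs[lo])
    return xs[lo]

# The five healthy-response latencies reported above, in seconds.
latencies = [4.2586188316345215, 4.008186340332031, 1.5570876598358154,
             3.2343366146087646, 1.5776307582855225]
```

For example, the 5th percentile interpolates 20% of the way between the two fastest responses, and the 50th percentile is the median, 3.2343 s.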
Pipeline stage StressChecker completed in 15.39s
jic062-instruct-g32-lr25_4505_v1 status is now deployed due to DeploymentManager action
jic062-instruct-g32-lr25_4505_v1 status is now inactive due to auto-deactivation of underperforming models
jic062-instruct-g32-lr25_4505_v1 status is now torndown due to DeploymentManager action

Usage Metrics

Latency Metrics