submission_id: jic062-instruct-v18-aggr_8300_v1
developer_uid: chace9580
alignment_samples: 11320
alignment_score: 0.6059921139688187
best_of: 16
celo_rating: 1246.55
display_name: jic062-instruct-v18-aggr_8300_v1
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|end_of_text|>', '|eot_id|'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
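The formatter and generation_params fields above together describe how a chat turn is serialized and sampled. The sketch below is illustrative only: the bot name, persona, and messages are made up, and the actual MKML serving stack is not shown. It assembles a prompt from the templates; the generation_params would then drive sampling of the completion (temperature 1.0, top_k 40, 16 candidates via best_of, up to 64 output tokens, stopping on newline or end-of-text).

    # Illustrative sketch of how the formatter templates above compose a prompt.
    # All names and messages here are hypothetical examples.
    formatter = {
        "memory_template": "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n",
        "prompt_template": "{prompt}<|eot_id|>",
        "bot_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>",
        "user_template": "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>",
        "response_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:",
    }

    def build_prompt(bot_name, user_name, memory, scenario, turns):
        # Order implied by the templates: system memory, scenario prompt,
        # alternating user/assistant turns, then an open assistant header
        # ("{bot_name}:") left for the model to complete.
        text = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
        text += formatter["prompt_template"].format(prompt=scenario)
        for role, message in turns:
            template = formatter["user_template"] if role == "user" else formatter["bot_template"]
            text += template.format(bot_name=bot_name, user_name=user_name, message=message)
        return text + formatter["response_template"].format(bot_name=bot_name)

    example = build_prompt("Aria", "Traveler", "A laconic starship pilot.",
                           "You meet at the docking bay.",
                           [("user", "Mind if I tag along?")])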
is_internal_developer: False
language_model: jic062/instruct_v18_aggresive_lr5_dp10
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: jic062/instruct_v18_aggr
model_name: jic062-instruct-v18-aggr_8300_v1
model_num_parameters: 8030261248
model_repo: jic062/instruct_v18_aggresive_lr5_dp10
model_size: 8B
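The model_num_parameters value is exactly what the stock Llama-3-8B geometry gives. The repo's config.json is not reproduced in this log, so the dimensions below are the publicly known Llama-3-8B ones, assumed to apply here given the LlamaForCausalLM architecture and the matching count.

    # Assumed standard Llama-3-8B dimensions (not read from the jic062 repo config).
    vocab, hidden, layers, intermediate = 128256, 4096, 32, 14336
    q_heads, kv_heads, head_dim = 32, 8, 128

    embeddings = vocab * hidden                          # input token embeddings
    attention = (hidden * q_heads * head_dim             # q projection
                 + 2 * hidden * kv_heads * head_dim      # k and v projections (GQA)
                 + q_heads * head_dim * hidden)          # o projection
    mlp = 3 * hidden * intermediate                      # gate, up and down projections
    norms = 2 * hidden                                   # two RMSNorm weights per layer

    total = embeddings + layers * (attention + mlp + norms) + hidden + vocab * hidden
    #       embeddings + per-layer blocks                + final norm + untied lm_head
    print(total)  # 8030261248 == model_num_parameters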
num_battles: 11320
num_wins: 6067
propriety_score: 0.6880927291886196
propriety_total_count: 949.0
ranking_group: single
status: torndown
submission_type: basic
timestamp: 2024-08-16T23:51:49+00:00
us_pacific_date: 2024-08-16
win_ratio: 0.5359540636042402
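As a quick consistency check, win_ratio above is simply num_wins divided by num_battles, and alignment_samples matches num_battles:

    # Consistency check on the reported battle statistics.
    num_battles, num_wins = 11320, 6067
    print(num_wins / num_battles)   # 0.5359540636042402 == win_ratio
    print(num_battles)              # 11320 == alignment_samples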
Running pipeline stage MKMLizer
Starting job with name jic062-instruct-v18-aggr-8300-v1-mkmlizer
Waiting for job on jic062-instruct-v18-aggr-8300-v1-mkmlizer to finish
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ║ _____ __ __ ║
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ║ /___/ ║
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ║ ║
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ║ Version: 0.9.11 ║
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ║ https://mk1.ai ║
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ║ ║
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ║ The license key for the current software has been verified as ║
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ║ belonging to: ║
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ║ ║
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ║ Chai Research Corp. ║
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ║ ║
jic062-instruct-v18-aggr-8300-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
jic062-instruct-v18-aggr-8300-v1-mkmlizer: Downloaded to shared memory in 39.715s
jic062-instruct-v18-aggr-8300-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpizxw6s3p, device:0
jic062-instruct-v18-aggr-8300-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
jic062-instruct-v18-aggr-8300-v1-mkmlizer: quantized model in 27.322s
jic062-instruct-v18-aggr-8300-v1-mkmlizer: Processed model jic062/instruct_v18_aggresive_lr5_dp10 in 67.038s
jic062-instruct-v18-aggr-8300-v1-mkmlizer: creating bucket guanaco-mkml-models
jic062-instruct-v18-aggr-8300-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
jic062-instruct-v18-aggr-8300-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/jic062-instruct-v18-aggr-8300-v1
jic062-instruct-v18-aggr-8300-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/jic062-instruct-v18-aggr-8300-v1/config.json
jic062-instruct-v18-aggr-8300-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/jic062-instruct-v18-aggr-8300-v1/special_tokens_map.json
jic062-instruct-v18-aggr-8300-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/jic062-instruct-v18-aggr-8300-v1/tokenizer_config.json
jic062-instruct-v18-aggr-8300-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/jic062-instruct-v18-aggr-8300-v1/tokenizer.json
jic062-instruct-v18-aggr-8300-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/jic062-instruct-v18-aggr-8300-v1/flywheel_model.0.safetensors
jic062-instruct-v18-aggr-8300-v1-mkmlizer: Loading 0: 283/291 tensors loaded after ~5.8s at roughly 50-80 it/s, with brief stalls near 30% and 66% (tqdm progress output condensed)
Job jic062-instruct-v18-aggr-8300-v1-mkmlizer completed after 94.8s with status: succeeded
Stopping job with name jic062-instruct-v18-aggr-8300-v1-mkmlizer
Pipeline stage MKMLizer completed in 95.78s
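The MKMLizer timings are self-consistent: the reported processing time is essentially the download plus the quantization, and the stage wraps the job with roughly a second of overhead. A small check on the numbers from the log above:

    # Timing consistency check (seconds, values taken from the log above).
    download, quantize, processed = 39.715, 27.322, 67.038
    assert abs((download + quantize) - processed) < 0.01
    job, stage = 94.8, 95.78
    print(stage - job)  # ~1 s of scheduling/cleanup overhead around the job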
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.09s
Running pipeline stage ISVCDeployer
Creating inference service jic062-instruct-v18-aggr-8300-v1
Waiting for inference service jic062-instruct-v18-aggr-8300-v1 to be ready
Failed to get response for submission blend_lirek_2024-08-16: ('http://zonemercy-lexical-nemo-1518-v18-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'request timeout')
Tearing down inference service jic062-instruct-v18-aggr-8300-v1
%s, retrying in %s seconds...
Creating inference service jic062-instruct-v18-aggr-8300-v1
Ignoring service jic062-instruct-v18-aggr-8300-v1 already deployed
Waiting for inference service jic062-instruct-v18-aggr-8300-v1 to be ready
Tearing down inference service jic062-instruct-v18-aggr-8300-v1
%s, retrying in %s seconds...
Creating inference service jic062-instruct-v18-aggr-8300-v1
Waiting for inference service jic062-instruct-v18-aggr-8300-v1 to be ready
Inference service jic062-instruct-v18-aggr-8300-v1 ready after 161.55892372131348s
Pipeline stage ISVCDeployer completed in 778.13s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.8966524600982666s
Received healthy response to inference request in 1.6069505214691162s
Received healthy response to inference request in 1.5557942390441895s
Received healthy response to inference request in 2.1665985584259033s
Received healthy response to inference request in 2.362623929977417s
5 requests
0 failed requests
5th percentile: 1.5660254955291748
10th percentile: 1.5762567520141602
20th percentile: 1.5967192649841309
30th percentile: 1.6648909091949462
40th percentile: 1.7807716846466064
50th percentile: 1.8966524600982666
60th percentile: 2.0046308994293214
70th percentile: 2.1126093387603757
80th percentile: 2.205803632736206
90th percentile: 2.2842137813568115
95th percentile: 2.3234188556671143
99th percentile: 2.3547829151153565
mean time: 1.9177239418029786
Pipeline stage StressChecker completed in 10.29s
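The StressChecker summary can be reproduced from the five response times above: numpy's default linear-interpolation percentiles give the reported values (e.g. 5th -> 1.5660, 50th -> 1.8967, 99th -> 2.3548), as does the mean.

    import numpy as np

    # The five healthy response times reported above, in seconds.
    latencies = [1.8966524600982666, 1.6069505214691162, 1.5557942390441895,
                 2.1665985584259033, 2.362623929977417]

    print(np.mean(latencies))  # 1.9177239418029786 (the reported mean time)
    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(p, np.percentile(latencies, p))  # matches the reported percentiles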
jic062-instruct-v18-aggr_8300_v1 status is now deployed due to DeploymentManager action
jic062-instruct-v18-aggr_8300_v1 status is now inactive due to auto deactivation (removal of underperforming models)
jic062-instruct-v18-aggr_8300_v1 status is now torndown due to DeploymentManager action

Usage Metrics

Latency Metrics