submission_id: jic062-instruct-v18-aggr_4397_v1
developer_uid: chace9580
alignment_samples: 11394
alignment_score: 2.7854308208324756
best_of: 16
celo_rating: 1210.35
display_name: jic062-instruct-v18-aggr_4397_v1
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|end_of_text|>', '|eot_id|'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
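The formatter and generation params above define how conversations are serialized for this Llama-3-instruct-style model. A minimal sketch of how such templates appear to compose into a single prompt string (plain Python `str.format`; `Bot`, `User`, and the message contents are hypothetical example values, not taken from this submission):

```python
# Templates copied from the submission's formatter; bot_template (assistant
# turns) would be applied analogously for multi-turn histories.
formatter = {
    'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n",
    'prompt_template': '{prompt}<|eot_id|>',
    'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>',
    'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:',
}

def build_prompt(bot_name, memory, prompt, user_name, user_message):
    """Assemble persona + scenario prompt + one user turn + assistant header."""
    return (
        formatter['memory_template'].format(bot_name=bot_name, memory=memory)
        + formatter['prompt_template'].format(prompt=prompt)
        + formatter['user_template'].format(user_name=user_name, message=user_message)
        + formatter['response_template'].format(bot_name=bot_name)
    )

text = build_prompt('Bot', 'a friendly assistant', 'A casual chat.', 'User', 'Hi')
```

Per the generation params, the model then samples best_of=16 candidates and generates until one of the stopping words ('\n', '<|end_of_text|>', '|eot_id|') or max_output_tokens=64 is reached.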
is_internal_developer: False
language_model: jic062/instruct_v18_aggresive_lr5_dp10_c2
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: jic062/instruct_v18_aggr
model_name: jic062-instruct-v18-aggr_4397_v1
model_num_parameters: 8030261248.0
model_repo: jic062/instruct_v18_aggresive_lr5_dp10_c2
model_size: 8B
num_battles: 11394
num_wins: 5492
propriety_score: 0.712742980561555
propriety_total_count: 926.0
ranking_group: single
status: torndown
submission_type: basic
timestamp: 2024-08-16T23:52:28+00:00
us_pacific_date: 2024-08-16
win_ratio: 0.48200807442513605
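The battle statistics above are internally consistent: win_ratio is num_wins divided by num_battles. A quick check:

```python
num_battles = 11394
num_wins = 5492
win_ratio = num_wins / num_battles  # ~0.482, matching the reported value
```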
Running pipeline stage MKMLizer
Starting job with name jic062-instruct-v18-aggr-4397-v1-mkmlizer
Waiting for job on jic062-instruct-v18-aggr-4397-v1-mkmlizer to finish
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ║ _____ __ __ ║
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ║ /___/ ║
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ║ ║
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ║ Version: 0.9.11 ║
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ║ https://mk1.ai ║
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ║ ║
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ║ The license key for the current software has been verified as ║
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ║ belonging to: ║
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ║ ║
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ║ Chai Research Corp. ║
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ║ ║
jic062-instruct-v18-aggr-4397-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
jic062-instruct-v18-aggr-4397-v1-mkmlizer: Downloaded to shared memory in 39.739s
jic062-instruct-v18-aggr-4397-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpo_ymmgxy, device:0
jic062-instruct-v18-aggr-4397-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
jic062-instruct-v18-aggr-4397-v1-mkmlizer: quantized model in 26.509s
jic062-instruct-v18-aggr-4397-v1-mkmlizer: Processed model jic062/instruct_v18_aggresive_lr5_dp10_c2 in 66.249s
jic062-instruct-v18-aggr-4397-v1-mkmlizer: creating bucket guanaco-mkml-models
jic062-instruct-v18-aggr-4397-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
jic062-instruct-v18-aggr-4397-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/jic062-instruct-v18-aggr-4397-v1
jic062-instruct-v18-aggr-4397-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/jic062-instruct-v18-aggr-4397-v1/config.json
jic062-instruct-v18-aggr-4397-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/jic062-instruct-v18-aggr-4397-v1/special_tokens_map.json
jic062-instruct-v18-aggr-4397-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/jic062-instruct-v18-aggr-4397-v1/tokenizer_config.json
jic062-instruct-v18-aggr-4397-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/jic062-instruct-v18-aggr-4397-v1/flywheel_model.0.safetensors
jic062-instruct-v18-aggr-4397-v1-mkmlizer: Loading 0: 0/291 [00:00<?, ?it/s] → 286/291 [00:05<00:00, 76.15it/s] (intermediate tqdm progress output elided)
Job jic062-instruct-v18-aggr-4397-v1-mkmlizer completed after 86.59s with status: succeeded
Stopping job with name jic062-instruct-v18-aggr-4397-v1-mkmlizer
Pipeline stage MKMLizer completed in 88.26s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.08s
Running pipeline stage ISVCDeployer
Creating inference service jic062-instruct-v18-aggr-4397-v1
Waiting for inference service jic062-instruct-v18-aggr-4397-v1 to be ready
Failed to get response for submission mistralai-mixtral-8x7b-_3473_v33: ('http://mistralai-mixtral-8x7b-3473-v33-predictor-default.tenant-chaiml-guanaco.knative.ord1.coreweave.cloud/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:53050->127.0.0.1:8080: read: connection reset by peer\n')
Tearing down inference service jic062-instruct-v18-aggr-4397-v1
%s, retrying in %s seconds...
Creating inference service jic062-instruct-v18-aggr-4397-v1
Ignoring service jic062-instruct-v18-aggr-4397-v1 already deployed
Waiting for inference service jic062-instruct-v18-aggr-4397-v1 to be ready
Tearing down inference service jic062-instruct-v18-aggr-4397-v1
%s, retrying in %s seconds...
Creating inference service jic062-instruct-v18-aggr-4397-v1
Waiting for inference service jic062-instruct-v18-aggr-4397-v1 to be ready
Inference service jic062-instruct-v18-aggr-4397-v1 ready after 140.89385223388672s
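The deploy sequence above (create, wait for readiness, tear down on failure, retry) follows a standard retry pattern. A hedged sketch in plain Python — the function names and callables are hypothetical, not the actual pipeline code; note also that the literal "%s, retrying in %s seconds..." lines above suggest the real logger was handed an unformatted template:

```python
import time

def deploy_with_retries(create, wait_ready, teardown, attempts=3, delay=30):
    """Create a service, wait for readiness; on failure tear down and retry."""
    for attempt in range(1, attempts + 1):
        create()
        try:
            wait_ready()
            return attempt          # service became ready
        except TimeoutError as err:
            teardown()
            if attempt == attempts:
                raise               # out of retries
            print("%s, retrying in %s seconds..." % (err, delay))
            time.sleep(delay)
```

In the log, three create/wait cycles are visible, with the third succeeding after ~141 s.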
Pipeline stage ISVCDeployer completed in 757.36s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.967078685760498s
Received healthy response to inference request in 1.6626155376434326s
Received healthy response to inference request in 1.8205738067626953s
Received healthy response to inference request in 1.4376800060272217s
Received healthy response to inference request in 1.547018051147461s
5 requests
0 failed requests
5th percentile: 1.4595476150512696
10th percentile: 1.4814152240753173
20th percentile: 1.525150442123413
30th percentile: 1.5701375484466553
40th percentile: 1.6163765430450439
50th percentile: 1.6626155376434326
60th percentile: 1.7257988452911377
70th percentile: 1.7889821529388428
80th percentile: 1.849874782562256
90th percentile: 1.9084767341613769
95th percentile: 1.9377777099609375
99th percentile: 1.9612184906005858
mean time: 1.6869932174682618
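The percentile and mean figures above can be reproduced from the five reported latencies using linearly interpolated percentiles (a sketch in plain Python; the interpolation convention is an assumption, though it matches the reported values):

```python
import math

# The five healthy-response latencies from the stress check, in seconds
latencies = [1.967078685760498, 1.6626155376434326, 1.8205738067626953,
             1.4376800060272217, 1.547018051147461]

def percentile(data, p):
    """Linearly interpolated percentile over sorted data."""
    s = sorted(data)
    k = (len(s) - 1) * p / 100.0
    lo = math.floor(k)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (k - lo) * (s[hi] - s[lo])

p50 = percentile(latencies, 50)         # median of the five requests
mean = sum(latencies) / len(latencies)  # reported mean time
```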
Pipeline stage StressChecker completed in 9.14s
jic062-instruct-v18-aggr_4397_v1 status is now deployed due to DeploymentManager action
jic062-instruct-v18-aggr_4397_v1 status is now inactive due to auto-deactivation of underperforming models
jic062-instruct-v18-aggr_4397_v1 status is now torndown due to DeploymentManager action
Usage Metrics (interactive charts not captured in this export)

Latency Metrics (interactive charts not captured in this export)