submission_id: rinen0721-llama8b-0821-cp3022_v1
developer_uid: rinen0721
alignment_samples: 9622
alignment_score: 0.06721597849775482
best_of: 16
celo_rating: 1217.71
display_name: rinen0721-llama8b-0821-cp3022_v1
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
is_internal_developer: False
language_model: rinen0721/llama8b-0821-cp3022
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: rinen0721/llama8b-0821-c
model_name: rinen0721-llama8b-0821-cp3022_v1
model_num_parameters: 8030261248.0
model_repo: rinen0721/llama8b-0821-cp3022
model_size: 8B
num_battles: 9622
num_wins: 4579
propriety_score: 0.7129300118623962
propriety_total_count: 843.0
ranking_group: single
status: torndown
submission_type: basic
timestamp: 2024-08-21T06:17:22+00:00
us_pacific_date: 2024-08-20
win_ratio: 0.4758885886510081
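The formatter field above defines how a conversation is rendered into a single prompt string. Below is a minimal sketch of that assembly in Python; the template strings come from this record, but the assembly order (persona, prompt, chat turns, response header) and the helper itself are assumptions, not the production formatter.

# Minimal sketch, assuming the templates are concatenated in the order
# persona -> prompt -> chat turns -> response header.
FORMATTER = {
    'memory_template': "{bot_name}'s Persona: {memory}\n####\n",
    'prompt_template': '{prompt}\n<START>\n',
    'bot_template': '{bot_name}: {message}\n',
    'user_template': '{user_name}: {message}\n',
    'response_template': '{bot_name}:',
}

def build_prompt(bot_name, memory, prompt, turns, user_name='User'):
    # turns: list of (speaker, message) pairs, where speaker is 'bot' or 'user'.
    text = FORMATTER['memory_template'].format(bot_name=bot_name, memory=memory)
    text += FORMATTER['prompt_template'].format(prompt=prompt)
    for speaker, message in turns:
        if speaker == 'bot':
            text += FORMATTER['bot_template'].format(bot_name=bot_name, message=message)
        else:
            text += FORMATTER['user_template'].format(user_name=user_name, message=message)
    # The model is asked to continue from "{bot_name}:".
    return text + FORMATTER['response_template'].format(bot_name=bot_name)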
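The generation_params are plain sampling settings. Purely as an illustration, they map onto a vLLM-style SamplingParams object as shown below; the actual engine behind the inference service is not identified in this log, and max_input_tokens=512 would be enforced by truncating the prompt rather than by the sampler.

# Illustrative only: the serving engine used here is an assumption.
from vllm import SamplingParams

sampling = SamplingParams(
    temperature=1.0,
    top_p=1.0,
    min_p=0.0,
    top_k=40,
    presence_penalty=0.0,
    frequency_penalty=0.0,
    stop=['\n'],    # stopping_words: cut the reply at the first newline
    max_tokens=64,  # max_output_tokens
    best_of=16,     # sample 16 candidate completions per request
)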
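win_ratio is simply num_wins divided by num_battles, which checks out against the fields above:

# 4579 wins out of 9622 battles
win_ratio = 4579 / 9622  # 0.4758885886..., matching the reported 0.4758885886510081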
Running pipeline stage MKMLizer
Starting job with name rinen0721-llama8b-0821-cp3022-v1-mkmlizer
Waiting for job on rinen0721-llama8b-0821-cp3022-v1-mkmlizer to finish
Failed to get response for submission function_notom_2024-08-20: ('http://chaiml-llama-8b-pairwis-8189-v17-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:52116->127.0.0.1:8080: read: connection reset by peer\n')
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: ║ flywheel ║
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: ║ ║
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: ║ Version: 0.9.11 ║
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: ║ https://mk1.ai ║
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: ║ ║
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: ║ The license key for the current software has been verified as ║
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: ║ belonging to: ║
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: ║ ║
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: ║ Chai Research Corp. ║
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: ║ ║
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: Downloaded to shared memory in 34.587s
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpq0qcrus6, device:0
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: quantized model in 25.922s
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: Processed model rinen0721/llama8b-0821-cp3022 in 60.510s
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: creating bucket guanaco-mkml-models
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/rinen0721-llama8b-0821-cp3022-v1
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/rinen0721-llama8b-0821-cp3022-v1/config.json
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/rinen0721-llama8b-0821-cp3022-v1/special_tokens_map.json
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/rinen0721-llama8b-0821-cp3022-v1/tokenizer_config.json
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/rinen0721-llama8b-0821-cp3022-v1/tokenizer.json
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/rinen0721-llama8b-0821-cp3022-v1/flywheel_model.0.safetensors
rinen0721-llama8b-0821-cp3022-v1-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s] ... Loading 0: 99%|█████████▊| 287/291 [00:05<00:00, 42.39it/s]
Job rinen0721-llama8b-0821-cp3022-v1-mkmlizer completed after 84.91s with status: succeeded
Stopping job with name rinen0721-llama8b-0821-cp3022-v1-mkmlizer
Pipeline stage MKMLizer completed in 85.85s
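The per-model timing in the MKMLizer log is internally consistent: download plus quantization accounts for essentially the whole reported processing time, with the remaining ~24 s of the 84.91 s job presumably going to the S3 upload and job overhead.

34.587 s (download) + 25.922 s (quantize) = 60.509 s ≈ 60.510 s (processed)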
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.09s
Running pipeline stage ISVCDeployer
Creating inference service rinen0721-llama8b-0821-cp3022-v1
Waiting for inference service rinen0721-llama8b-0821-cp3022-v1 to be ready
Failed to get response for submission blend_filor_2024-08-16: ('http://zonemercy-cogent-nemo-v2-5e6-v13-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:43164->127.0.0.1:8080: read: connection reset by peer\n')
Inference service rinen0721-llama8b-0821-cp3022-v1 ready after 252.31444334983826s
Pipeline stage ISVCDeployer completed in 252.96s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.2535440921783447s
Received healthy response to inference request in 2.716857433319092s
Received healthy response to inference request in 1.8040640354156494s
Received healthy response to inference request in 1.477410078048706s
Received healthy response to inference request in 1.9818902015686035s
5 requests
0 failed requests
5th percentile: 1.5427408695220948
10th percentile: 1.6080716609954835
20th percentile: 1.7387332439422607
30th percentile: 1.8396292686462403
40th percentile: 1.910759735107422
50th percentile: 1.9818902015686035
60th percentile: 2.0905517578125
70th percentile: 2.1992133140563963
80th percentile: 2.3462067604064942
90th percentile: 2.531532096862793
95th percentile: 2.6241947650909423
99th percentile: 2.698324899673462
mean time: 2.046753168106079
Pipeline stage StressChecker completed in 10.83s
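The StressChecker percentiles are consistent with linear interpolation over the five response times (numpy's default percentile method); a quick reproduction, for reference:

import numpy as np

# The five healthy response times, in seconds, from the log above.
latencies = [2.2535440921783447, 2.716857433319092, 1.8040640354156494,
             1.477410078048706, 1.9818902015686035]

print(np.mean(latencies))  # 2.046753168106079, the reported "mean time"
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    # Linear interpolation between the sorted samples reproduces the reported
    # values, e.g. 1.5427... at the 5th and 2.6983... at the 99th percentile.
    print(p, np.percentile(latencies, p))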
rinen0721-llama8b-0821-cp3022_v1 status is now deployed due to DeploymentManager action
rinen0721-llama8b-0821-cp3022_v1 status is now inactive due to auto deactivation (removal of underperforming models)
rinen0721-llama8b-0821-cp3022_v1 status is now torndown due to DeploymentManager action

Usage Metrics

Latency Metrics