developer_uid: rinen0721
submission_id: rinen0721-llama0827_v1
model_name: rinen0721-llama0827_v1
model_group: rinen0721/llama0827
status: torndown
timestamp: 2024-08-27T11:07:22+00:00
num_battles: 11250
num_wins: 5437
celo_rating: 1225.51
family_friendly_score: 0.0
submission_type: basic
model_repo: rinen0721/llama0827
model_architecture: LlamaForCausalLM
model_num_parameters: 8030261248.0
best_of: 16
max_input_tokens: 512
max_output_tokens: 64
display_name: rinen0721-llama0827_v1
is_internal_developer: False
language_model: rinen0721/llama0827
model_size: 8B
ranking_group: single
us_pacific_date: 2024-08-27
win_ratio: 0.4832888888888889
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
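The formatter above defines how a conversation is flattened into the model's 512-token prompt, and win_ratio follows directly from num_wins / num_battles. Below is a minimal, hypothetical Python sketch of that assembly; the helper name, the example persona and messages, and the exact ordering of memory, prompt, chat history, and response prefix are assumptions for illustration, not the platform's actual implementation.

```python
# Hypothetical illustration only: reconstructs the prompt layout implied by the
# formatter templates above. Names, ordering, and example data are assumptions.

FORMATTER = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def format_conversation(bot_name, user_name, memory, prompt, turns):
    """turns: list of (speaker, message) pairs, where speaker is "bot" or "user"."""
    text = FORMATTER["memory_template"].format(bot_name=bot_name, memory=memory)
    text += FORMATTER["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        if speaker == "bot":
            text += FORMATTER["bot_template"].format(bot_name=bot_name, message=message)
        else:
            text += FORMATTER["user_template"].format(user_name=user_name, message=message)
    # The model completes from this prefix; per generation_params it stops at
    # "\n" and emits at most 64 tokens, with input truncated to 512 tokens.
    return text + FORMATTER["response_template"].format(bot_name=bot_name)

if __name__ == "__main__":
    # Sanity check on the metadata above: win_ratio == num_wins / num_battles.
    assert abs(5437 / 11250 - 0.4832888888888889) < 1e-12

    print(format_conversation(
        bot_name="Bot",
        user_name="User",
        memory="An example persona.",
        prompt="An example scenario.",
        turns=[("bot", "Hello!"), ("user", "Hi there.")],
    ))
```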
Running pipeline stage MKMLizer
Starting job with name rinen0721-llama0827-v1-mkmlizer
Waiting for job on rinen0721-llama0827-v1-mkmlizer to finish
Stopping job with name rinen0721-llama0827-v1-mkmlizer
Starting job with name rinen0721-llama0827-v1-mkmlizer
Waiting for job on rinen0721-llama0827-v1-mkmlizer to finish
Failed to get response for submission zonemercy-lexical-nemo-_1518_v18: ('http://zonemercy-lexical-nemo-1518-v18-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:48076->127.0.0.1:8080: read: connection reset by peer\n')
rinen0721-llama0827-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
rinen0721-llama0827-v1-mkmlizer: ║ _____ __ __ ║
rinen0721-llama0827-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
rinen0721-llama0827-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
rinen0721-llama0827-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
rinen0721-llama0827-v1-mkmlizer: ║ /___/ ║
rinen0721-llama0827-v1-mkmlizer: ║ ║
rinen0721-llama0827-v1-mkmlizer: ║ Version: 0.10.1 ║
rinen0721-llama0827-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
rinen0721-llama0827-v1-mkmlizer: ║ https://mk1.ai ║
rinen0721-llama0827-v1-mkmlizer: ║ ║
rinen0721-llama0827-v1-mkmlizer: ║ The license key for the current software has been verified as ║
rinen0721-llama0827-v1-mkmlizer: ║ belonging to: ║
rinen0721-llama0827-v1-mkmlizer: ║ ║
rinen0721-llama0827-v1-mkmlizer: ║ Chai Research Corp. ║
rinen0721-llama0827-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
rinen0721-llama0827-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
rinen0721-llama0827-v1-mkmlizer: ║ ║
rinen0721-llama0827-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
rinen0721-llama0827-v1-mkmlizer: Downloaded to shared memory in 34.934s
rinen0721-llama0827-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpyh8fgs2z, device:0
rinen0721-llama0827-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
rinen0721-llama0827-v1-mkmlizer: quantized model in 26.812s
rinen0721-llama0827-v1-mkmlizer: Processed model rinen0721/llama0827 in 61.747s
rinen0721-llama0827-v1-mkmlizer: creating bucket guanaco-mkml-models
rinen0721-llama0827-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
rinen0721-llama0827-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/rinen0721-llama0827-v1
rinen0721-llama0827-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/rinen0721-llama0827-v1/config.json
rinen0721-llama0827-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/rinen0721-llama0827-v1/special_tokens_map.json
rinen0721-llama0827-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/rinen0721-llama0827-v1/tokenizer_config.json
rinen0721-llama0827-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/rinen0721-llama0827-v1/tokenizer.json
rinen0721-llama0827-v1-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s] ... Loading 0: 98%|█████████▊| 286/291 [00:06<00:00, 64.57it/s]
Job rinen0721-llama0827-v1-mkmlizer completed after 85.3s with status: succeeded
Stopping job with name rinen0721-llama0827-v1-mkmlizer
Pipeline stage MKMLizer completed in 86.99s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.38s
Running pipeline stage ISVCDeployer
Creating inference service rinen0721-llama0827-v1
Waiting for inference service rinen0721-llama0827-v1 to be ready
Inference service rinen0721-llama0827-v1 ready after 171.70040678977966s
Pipeline stage ISVCDeployer completed in 172.44s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.9029815196990967s
Received healthy response to inference request in 2.022343635559082s
Received healthy response to inference request in 1.5513877868652344s
Received healthy response to inference request in 1.8421730995178223s
Received healthy response to inference request in 1.4793391227722168s
5 requests
0 failed requests
5th percentile: 1.4937488555908203
10th percentile: 1.5081585884094237
20th percentile: 1.536978054046631
30th percentile: 1.609544849395752
40th percentile: 1.7258589744567872
50th percentile: 1.8421730995178223
60th percentile: 1.866496467590332
70th percentile: 1.8908198356628418
80th percentile: 1.9268539428710938
90th percentile: 1.9745987892150878
95th percentile: 1.998471212387085
99th percentile: 2.0175691509246825
mean time: 1.7596450328826905
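The summary statistics above appear to be plain linear-interpolation percentiles over the five logged latencies (numpy's default method reproduces them). The sketch below rederives them; the latency values are copied from the log, and the choice of interpolation method is an assumption.

```python
# Rederives the StressChecker summary above. Latencies are copied from the log;
# using numpy's default linear-interpolation percentile is an assumption that
# happens to reproduce the logged values.
import numpy as np

latencies = [
    1.9029815196990967,
    2.022343635559082,
    1.5513877868652344,
    1.8421730995178223,
    1.4793391227722168,
]

print(f"{len(latencies)} requests")
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print(f"mean time: {np.mean(latencies)}")
```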
Pipeline stage StressChecker completed in 10.53s
rinen0721-llama0827_v1 status is now deployed due to DeploymentManager action
rinen0721-llama0827_v1 status is now inactive due to auto deactivation of underperforming models
rinen0721-llama0827_v1 status is now torndown due to DeploymentManager action