developer_uid: Riverise
submission_id: riverise-my-second-model_v2
model_name: riverise-my-second-model_v1
model_group: Riverise/my-second-model
status: torndown
timestamp: 2024-08-29T09:37:15+00:00
num_battles: 10624
num_wins: 5408
celo_rating: 1239.91
family_friendly_score: 0.0
submission_type: basic
model_repo: Riverise/my-second-model
model_architecture: LlamaForCausalLM
model_num_parameters: 8030261248.0
best_of: 16
max_input_tokens: 512
max_output_tokens: 64
display_name: riverise-my-second-model_v1
is_internal_developer: False
language_model: Riverise/my-second-model
model_size: 8B
ranking_group: single
us_pacific_date: 2024-08-29
win_ratio: 0.5090361445783133
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
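The metadata above can be sanity-checked with a short sketch: the win ratio is simply num_wins / num_battles, and the formatter templates assemble the conversation string that is sent to the model. The templates below are copied from the log; the persona, prompt, and message values are hypothetical examples, not data from this submission.

```python
# Templates copied from the formatter entry above.
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

# win_ratio as reported: num_wins / num_battles.
num_battles, num_wins = 10624, 5408
win_ratio = num_wins / num_battles
print(win_ratio)  # 0.5090361445783133

# Assemble a prompt the way the templates suggest (hypothetical turns).
prompt = "".join([
    formatter["memory_template"].format(bot_name="Bot", memory="a friendly assistant"),
    formatter["prompt_template"].format(prompt="A chat between Bot and User."),
    formatter["user_template"].format(user_name="User", message="Hello!"),
    formatter["bot_template"].format(bot_name="Bot", message="Hi there."),
    formatter["response_template"].format(bot_name="Bot"),
])
print(prompt)
```

Note the prompt ends with the response template ("Bot:"), so generation continues the bot's turn; the stopping word '\n' in generation_params then cuts the reply at the end of that turn.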
Resubmit model
Running pipeline stage MKMLizer
Starting job with name riverise-my-second-model-v2-mkmlizer
Waiting for job on riverise-my-second-model-v2-mkmlizer to finish
Stopping job with name riverise-my-second-model-v2-mkmlizer
%s, retrying in %s seconds...
Starting job with name riverise-my-second-model-v2-mkmlizer
Waiting for job on riverise-my-second-model-v2-mkmlizer to finish
riverise-my-second-model-v2-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
riverise-my-second-model-v2-mkmlizer: ║ _____ __ __ ║
riverise-my-second-model-v2-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
riverise-my-second-model-v2-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
riverise-my-second-model-v2-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
riverise-my-second-model-v2-mkmlizer: ║ /___/ ║
riverise-my-second-model-v2-mkmlizer: ║ ║
riverise-my-second-model-v2-mkmlizer: ║ Version: 0.10.1 ║
riverise-my-second-model-v2-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
riverise-my-second-model-v2-mkmlizer: ║ https://mk1.ai ║
riverise-my-second-model-v2-mkmlizer: ║ ║
riverise-my-second-model-v2-mkmlizer: ║ The license key for the current software has been verified as ║
riverise-my-second-model-v2-mkmlizer: ║ belonging to: ║
riverise-my-second-model-v2-mkmlizer: ║ ║
riverise-my-second-model-v2-mkmlizer: ║ Chai Research Corp. ║
riverise-my-second-model-v2-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
riverise-my-second-model-v2-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
riverise-my-second-model-v2-mkmlizer: ║ ║
riverise-my-second-model-v2-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
riverise-my-second-model-v2-mkmlizer: Downloaded to shared memory in 23.598s
riverise-my-second-model-v2-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp5jqgtf24, device:0
riverise-my-second-model-v2-mkmlizer: Saving flywheel model at /dev/shm/model_cache
riverise-my-second-model-v2-mkmlizer: quantized model in 25.949s
riverise-my-second-model-v2-mkmlizer: Processed model Riverise/my-second-model in 49.547s
riverise-my-second-model-v2-mkmlizer: creating bucket guanaco-mkml-models
riverise-my-second-model-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
riverise-my-second-model-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/riverise-my-second-model-v2
riverise-my-second-model-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/riverise-my-second-model-v2/special_tokens_map.json
riverise-my-second-model-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/riverise-my-second-model-v2/config.json
riverise-my-second-model-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/riverise-my-second-model-v2/tokenizer_config.json
riverise-my-second-model-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/riverise-my-second-model-v2/tokenizer.json
riverise-my-second-model-v2-mkmlizer: Loading 0:   0%|          | 0/291 [00:00<?, ?it/s] ... 98%|█████████▊| 286/291 [00:05<00:00, 70.66it/s]
Job riverise-my-second-model-v2-mkmlizer completed after 74.92s with status: succeeded
Stopping job with name riverise-my-second-model-v2-mkmlizer
Pipeline stage MKMLizer completed in 77.03s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.08s
Running pipeline stage ISVCDeployer
Creating inference service riverise-my-second-model-v2
Waiting for inference service riverise-my-second-model-v2 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s
Inference service riverise-my-second-model-v2 ready after 181.24362325668335s
Pipeline stage ISVCDeployer completed in 182.28s
Running pipeline stage StressChecker
Received healthy response to inference request in 3.3058433532714844s
Received healthy response to inference request in 2.122457504272461s
Received healthy response to inference request in 4.5668768882751465s
Received healthy response to inference request in 2.1340737342834473s
Received healthy response to inference request in 1.385310173034668s
5 requests
0 failed requests
5th percentile: 1.5327396392822266
10th percentile: 1.6801691055297852
20th percentile: 1.9750280380249023
30th percentile: 2.1247807502746583
40th percentile: 2.1294272422790526
50th percentile: 2.1340737342834473
60th percentile: 2.602781581878662
70th percentile: 3.0714894294738766
80th percentile: 3.558050060272217
90th percentile: 4.062463474273682
95th percentile: 4.314670181274414
99th percentile: 4.516435546875
mean time: 2.7029123306274414
Pipeline stage StressChecker completed in 14.26s
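The StressChecker statistics above can be reproduced from the five logged response times. The percentile values match a linearly interpolated percentile over the sorted samples (an assumption inferred from the numbers, not stated in the log); a minimal pure-Python sketch:

```python
# The five healthy-response latencies logged by StressChecker, in seconds.
latencies = [
    3.3058433532714844,
    2.122457504272461,
    4.5668768882751465,
    2.1340737342834473,
    1.385310173034668,
]

def percentile(samples, p):
    """Linearly interpolated p-th percentile (p in [0, 100])."""
    xs = sorted(samples)
    k = (len(xs) - 1) * p / 100.0   # fractional rank into the sorted samples
    lo = int(k)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {percentile(latencies, p)}")
print(f"mean time: {sum(latencies) / len(latencies)}")
```

With these inputs the sketch reproduces the logged figures, e.g. a 50th percentile of 2.1340737342834473 and a mean of 2.7029123306274414.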
riverise-my-second-model_v2 status is now deployed due to DeploymentManager action
riverise-my-second-model_v2 status is now inactive due to auto-deactivation of underperforming models
riverise-my-second-model_v2 status is now torndown due to DeploymentManager action