developer_uid: Riverise
submission_id: riverise-re-feedback-4k_v1
model_name: riverise-re-feedback-4k_v1
model_group: Riverise/re-feedback-4k
status: torndown
timestamp: 2024-09-02T10:27:08+00:00
num_battles: 10863
num_wins: 5684
celo_rating: 1253.0
family_friendly_score: 0.0
submission_type: basic
model_repo: Riverise/re-feedback-4k
model_architecture: LlamaForCausalLM
model_num_parameters: 8030261248
best_of: 16
max_input_tokens: 512
max_output_tokens: 64
display_name: riverise-re-feedback-4k_v1
is_internal_developer: False
language_model: Riverise/re-feedback-4k
model_size: 8B
ranking_group: single
us_pacific_date: 2024-09-02
win_ratio: 0.5232440393997975
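The reported win_ratio is consistent with num_wins / num_battles; a quick check in plain Python, using the values from the fields above:

```python
num_battles = 10863  # from the num_battles field
num_wins = 5684      # from the num_wins field

win_ratio = num_wins / num_battles
print(win_ratio)  # ≈ 0.52324, matching the win_ratio field above
```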
generation_params: {'temperature': 1.15, 'top_p': 0.95, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
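The formatter fields above are Python-style format templates. As a rough sketch (the exact assembly order inside the serving stack is an assumption, and build_prompt is a hypothetical helper), a conversation is rendered by concatenating the memory and prompt templates, then the alternating turn templates, ending with the response template so the model completes the bot's next line:

```python
# Templates copied from the formatter field above.
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """Assemble the full prompt string. `turns` is a list of
    (speaker, message) pairs, where speaker is "user" or "bot"."""
    out = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    out += formatter["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        tpl = formatter["user_template"] if speaker == "user" else formatter["bot_template"]
        out += tpl.format(bot_name=bot_name, user_name=user_name, message=message)
    # End on "{bot_name}:" so generation continues as the bot's reply.
    out += formatter["response_template"].format(bot_name=bot_name)
    return out

print(build_prompt("Bot", "User", "A helpful bot.", "Chat begins.",
                   [("user", "Hi"), ("bot", "Hello")]))
```

Note that with stopping_words set to ['\n'] in generation_params, decoding stops at the end of the bot's single line, matching the one-line-per-message shape of the turn templates.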
Resubmit model
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name riverise-re-feedback-4k-v1-mkmlizer
Waiting for job on riverise-re-feedback-4k-v1-mkmlizer to finish
Stopping job with name riverise-re-feedback-4k-v1-mkmlizer
%s, retrying in %s seconds...
Starting job with name riverise-re-feedback-4k-v1-mkmlizer
Waiting for job on riverise-re-feedback-4k-v1-mkmlizer to finish
riverise-re-feedback-4k-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
riverise-re-feedback-4k-v1-mkmlizer: ║ _____ __ __ ║
riverise-re-feedback-4k-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
riverise-re-feedback-4k-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
riverise-re-feedback-4k-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
riverise-re-feedback-4k-v1-mkmlizer: ║ /___/ ║
riverise-re-feedback-4k-v1-mkmlizer: ║ ║
riverise-re-feedback-4k-v1-mkmlizer: ║ Version: 0.10.1 ║
riverise-re-feedback-4k-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
riverise-re-feedback-4k-v1-mkmlizer: ║ https://mk1.ai ║
riverise-re-feedback-4k-v1-mkmlizer: ║ ║
riverise-re-feedback-4k-v1-mkmlizer: ║ The license key for the current software has been verified as ║
riverise-re-feedback-4k-v1-mkmlizer: ║ belonging to: ║
riverise-re-feedback-4k-v1-mkmlizer: ║ ║
riverise-re-feedback-4k-v1-mkmlizer: ║ Chai Research Corp. ║
riverise-re-feedback-4k-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
riverise-re-feedback-4k-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
riverise-re-feedback-4k-v1-mkmlizer: ║ ║
riverise-re-feedback-4k-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
riverise-re-feedback-4k-v1-mkmlizer: Downloaded to shared memory in 36.128s
riverise-re-feedback-4k-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpon_q1mp_, device:0
riverise-re-feedback-4k-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Connection pool is full, discarding connection: %s. Connection pool size: %s
riverise-re-feedback-4k-v1-mkmlizer: quantized model in 26.087s
riverise-re-feedback-4k-v1-mkmlizer: Processed model Riverise/re-feedback-4k in 62.215s
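The per-stage timings above are internally consistent: the download and quantization times sum exactly to the reported total processing time (a quick check, values copied from the log lines above):

```python
download_s = 36.128  # "Downloaded to shared memory in 36.128s"
quantize_s = 26.087  # "quantized model in 26.087s"
total_s = 62.215     # "Processed model Riverise/re-feedback-4k in 62.215s"

assert round(download_s + quantize_s, 3) == total_s
```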
riverise-re-feedback-4k-v1-mkmlizer: creating bucket guanaco-mkml-models
riverise-re-feedback-4k-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
riverise-re-feedback-4k-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/riverise-re-feedback-4k-v1
riverise-re-feedback-4k-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/riverise-re-feedback-4k-v1/config.json
riverise-re-feedback-4k-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/riverise-re-feedback-4k-v1/special_tokens_map.json
riverise-re-feedback-4k-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/riverise-re-feedback-4k-v1/tokenizer_config.json
riverise-re-feedback-4k-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/riverise-re-feedback-4k-v1/tokenizer.json
riverise-re-feedback-4k-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/riverise-re-feedback-4k-v1/flywheel_model.0.safetensors
riverise-re-feedback-4k-v1-mkmlizer: Loading 0: 98%|█████████▊| 286/291 [00:05<00:00, 79.00it/s] (intermediate progress-bar updates condensed)
Job riverise-re-feedback-4k-v1-mkmlizer completed after 95.51s with status: succeeded
Stopping job with name riverise-re-feedback-4k-v1-mkmlizer
Pipeline stage MKMLizer completed in 97.32s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.11s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service riverise-re-feedback-4k-v1
Waiting for inference service riverise-re-feedback-4k-v1 to be ready
Failed to get response for submission blend_dedat_2024-08-16: ('http://zonemercy-graft-cogent-v-7573-v6-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:51520->127.0.0.1:8080: read: connection reset by peer\n')
Connection pool is full, discarding connection: %s. Connection pool size: %s
Inference service riverise-re-feedback-4k-v1 ready after 201.27478528022766s
Pipeline stage MKMLDeployer completed in 201.62s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.9684383869171143s
Received healthy response to inference request in 1.5430729389190674s
Received healthy response to inference request in 2.08890700340271s
Received healthy response to inference request in 1.5495085716247559s
Received healthy response to inference request in 1.4843480587005615s
5 requests
0 failed requests
5th percentile: 1.4960930347442627
10th percentile: 1.5078380107879639
20th percentile: 1.5313279628753662
30th percentile: 1.544360065460205
40th percentile: 1.5469343185424804
50th percentile: 1.5495085716247559
60th percentile: 1.7652679443359374
70th percentile: 1.981027317047119
80th percentile: 2.264813280105591
90th percentile: 2.616625833511353
95th percentile: 2.7925321102142333
99th percentile: 2.933257131576538
mean time: 1.9268549919128417
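The percentile figures above are consistent with linear interpolation over the five sorted latencies (the same scheme as numpy's default "linear" percentile method — an assumption about how the StressChecker computes them). A self-contained sketch that reproduces them, with the latencies copied from the log:

```python
def percentile(samples, p):
    """Percentile via linear interpolation between the two nearest ranks."""
    xs = sorted(samples)
    k = (len(xs) - 1) * p / 100.0          # fractional rank
    lo, hi = int(k), min(int(k) + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

# The five healthy response times reported above, in seconds.
latencies = [2.9684383869171143, 1.5430729389190674, 2.08890700340271,
             1.5495085716247559, 1.4843480587005615]

print(percentile(latencies, 5))         # ≈ 1.49609, matches "5th percentile"
print(percentile(latencies, 50))        # ≈ 1.54951, matches "50th percentile"
print(percentile(latencies, 90))        # ≈ 2.61663, matches "90th percentile"
print(sum(latencies) / len(latencies))  # ≈ 1.92685, matches "mean time"
```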
Pipeline stage StressChecker completed in 10.66s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
starting trigger_guanaco_pipeline %s
Pipeline stage TriggerMKMLProfilingPipeline completed in 8.39s
riverise-re-feedback-4k_v1 status is now deployed due to DeploymentManager action
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.13s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service riverise-re-feedback-4k-v1-profiler
Waiting for inference service riverise-re-feedback-4k-v1-profiler to be ready
Inference service riverise-re-feedback-4k-v1-profiler ready after 190.47137093544006s
Pipeline stage MKMLProfilerDeployer completed in 190.87s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
script pods %s
Pipeline stage MKMLProfilerRunner completed in 0.43s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service riverise-re-feedback-4k-v1-profiler is running
Tearing down inference service riverise-re-feedback-4k-v1-profiler
Service riverise-re-feedback-4k-v1-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 1.78s
riverise-re-feedback-4k_v1 status is now inactive due to auto deactivation (removal of underperforming models)
riverise-re-feedback-4k_v1 status is now torndown due to DeploymentManager action