submission_id: jellywibble-lora-120k-p_2801_v26
developer_uid: Jellywibble
best_of: 8
celo_rating: 1257.95
display_name: jellywibble-lora-120k-p_2801_v26
family_friendly_score: 0.5667433831990794
family_friendly_standard_error: 0.005277681772149539
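If the family_friendly_score is the mean of binary per-sample moderation labels, the reported standard error implies an approximate sample size via the binomial formula SE = sqrt(p(1-p)/n). This is an inference, not a value reported by the pipeline; a minimal sketch:

```python
# Assuming family_friendly_score is a mean of 0/1 labels (an assumption,
# not stated above), invert SE = sqrt(p * (1 - p) / n) to estimate n.
p = 0.5667433831990794
se = 0.005277681772149539
n = p * (1 - p) / se ** 2
print(round(n))  # on the order of 8.8k rated samples
```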
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
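The formatter templates above can be assembled into a single prompt string. The exact assembly order (memory, prompt, chat history, response header) is an assumption, and the `build_prompt` helper, "Pixie", and the sample messages are hypothetical:

```python
# Hypothetical sketch of rendering the formatter templates above into a
# prompt. Assembly order is assumed; names/messages are examples only.
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, memory, prompt, history):
    parts = [formatter["memory_template"].format(bot_name=bot_name, memory=memory),
             formatter["prompt_template"].format(prompt=prompt)]
    for speaker, message in history:
        if speaker == bot_name:
            parts.append(formatter["bot_template"].format(bot_name=bot_name, message=message))
        else:
            parts.append(formatter["user_template"].format(user_name=speaker, message=message))
    # The response template leaves the bot's turn open for the model to complete.
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)

text = build_prompt("Pixie", "A playful sprite.", "A chat in the forest.",
                    [("User", "Hello!"), ("Pixie", "Hi there!")])
print(text)
```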
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
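The generation_params imply top-k filtering at decode time (top_k=40, temperature=1.0, the other penalties disabled). A minimal pure-Python sketch of the top-k sampling step — real serving stacks do this on GPU logits, and `sample_top_k` is an illustrative name, not an API from this pipeline:

```python
# Illustrative top-k sampling over raw logits (top_k=40 as configured
# above). Keeps only the k largest logits, renormalizes, then samples.
import math
import random

def sample_top_k(logits, top_k=40, temperature=1.0):
    scaled = [l / temperature for l in logits]
    # Value of the k-th largest logit; everything below it is masked out.
    cutoff = sorted(scaled, reverse=True)[min(top_k, len(scaled)) - 1]
    exps = [math.exp(l) if l >= cutoff else 0.0 for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Inverse-CDF sampling over the filtered distribution.
    r = random.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r <= acc:
            return i
    return len(probs) - 1
```

With best_of=8, the server would draw eight such completions and keep the highest-scoring one.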
is_internal_developer: True
language_model: Jellywibble/lora_120k_pref_data_ep3_stacked_elo_alignment
max_input_tokens: 1024
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: Jellywibble/lora_120k_pr
model_name: jellywibble-lora-120k-p_2801_v26
model_num_parameters: 8030261248.0
model_repo: Jellywibble/lora_120k_pref_data_ep3_stacked_elo_alignment
model_size: 8B
num_battles: 9316
num_wins: 4709
ranking_group: single
status: inactive
submission_type: basic
timestamp: 2024-10-17T04:25:23+00:00
us_pacific_date: 2024-10-16
win_ratio: 0.5054744525547445
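The win_ratio follows directly from the battle counts above; a quick consistency check:

```python
# win_ratio is num_wins / num_battles from the fields above.
num_battles = 9316
num_wins = 4709
win_ratio = num_wins / num_battles
print(win_ratio)  # ≈ 0.5055
```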
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name jellywibble-lora-120k-p-2801-v26-mkmlizer
Waiting for job on jellywibble-lora-120k-p-2801-v26-mkmlizer to finish
jellywibble-lora-120k-p-2801-v26-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
jellywibble-lora-120k-p-2801-v26-mkmlizer: ║ _____ __ __ ║
jellywibble-lora-120k-p-2801-v26-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
jellywibble-lora-120k-p-2801-v26-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
jellywibble-lora-120k-p-2801-v26-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
jellywibble-lora-120k-p-2801-v26-mkmlizer: ║ /___/ ║
jellywibble-lora-120k-p-2801-v26-mkmlizer: ║ ║
jellywibble-lora-120k-p-2801-v26-mkmlizer: ║ Version: 0.11.12 ║
jellywibble-lora-120k-p-2801-v26-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
jellywibble-lora-120k-p-2801-v26-mkmlizer: ║ https://mk1.ai ║
jellywibble-lora-120k-p-2801-v26-mkmlizer: ║ ║
jellywibble-lora-120k-p-2801-v26-mkmlizer: ║ The license key for the current software has been verified as ║
jellywibble-lora-120k-p-2801-v26-mkmlizer: ║ belonging to: ║
jellywibble-lora-120k-p-2801-v26-mkmlizer: ║ ║
jellywibble-lora-120k-p-2801-v26-mkmlizer: ║ Chai Research Corp. ║
jellywibble-lora-120k-p-2801-v26-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
jellywibble-lora-120k-p-2801-v26-mkmlizer: ║ Expiration: 2025-01-15 23:59:59 ║
jellywibble-lora-120k-p-2801-v26-mkmlizer: ║ ║
jellywibble-lora-120k-p-2801-v26-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
jellywibble-lora-120k-p-2801-v26-mkmlizer: Downloaded to shared memory in 68.194s
jellywibble-lora-120k-p-2801-v26-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmppi0ot_3x, device:0
jellywibble-lora-120k-p-2801-v26-mkmlizer: Saving flywheel model at /dev/shm/model_cache
jellywibble-lora-120k-p-2801-v26-mkmlizer: quantized model in 28.850s
jellywibble-lora-120k-p-2801-v26-mkmlizer: Processed model Jellywibble/lora_120k_pref_data_ep3_stacked_elo_alignment in 97.044s
jellywibble-lora-120k-p-2801-v26-mkmlizer: creating bucket guanaco-mkml-models
jellywibble-lora-120k-p-2801-v26-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
jellywibble-lora-120k-p-2801-v26-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/jellywibble-lora-120k-p-2801-v26
jellywibble-lora-120k-p-2801-v26-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/jellywibble-lora-120k-p-2801-v26/config.json
jellywibble-lora-120k-p-2801-v26-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/jellywibble-lora-120k-p-2801-v26/special_tokens_map.json
jellywibble-lora-120k-p-2801-v26-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/jellywibble-lora-120k-p-2801-v26/tokenizer_config.json
jellywibble-lora-120k-p-2801-v26-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/jellywibble-lora-120k-p-2801-v26/tokenizer.json
jellywibble-lora-120k-p-2801-v26-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/jellywibble-lora-120k-p-2801-v26/flywheel_model.0.safetensors
jellywibble-lora-120k-p-2801-v26-mkmlizer: Loading 0: 99%|█████████▉| 289/291 [00:14<00:00, 3.17it/s]
Job jellywibble-lora-120k-p-2801-v26-mkmlizer completed after 124.88s with status: succeeded
Stopping job with name jellywibble-lora-120k-p-2801-v26-mkmlizer
Pipeline stage MKMLizer completed in 125.40s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.23s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service jellywibble-lora-120k-p-2801-v26
Waiting for inference service jellywibble-lora-120k-p-2801-v26 to be ready
Inference service jellywibble-lora-120k-p-2801-v26 ready after 170.98682761192322s
Pipeline stage MKMLDeployer completed in 171.63s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.6351444721221924s
Received healthy response to inference request in 1.2341234683990479s
Received healthy response to inference request in 1.3827872276306152s
Received healthy response to inference request in 1.2946341037750244s
Received healthy response to inference request in 1.4107208251953125s
5 requests
0 failed requests
5th percentile: 1.2462255954742432
10th percentile: 1.2583277225494385
20th percentile: 1.282531976699829
30th percentile: 1.3122647285461426
40th percentile: 1.3475259780883788
50th percentile: 1.3827872276306152
60th percentile: 1.393960666656494
70th percentile: 1.405134105682373
80th percentile: 1.4556055545806885
90th percentile: 1.5453750133514403
95th percentile: 1.5902597427368164
99th percentile: 1.6261675262451172
mean time: 1.3914820194244384
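The StressChecker percentiles above are consistent with linear interpolation between closest ranks over the five sorted response times (the same "linear" method numpy.percentile uses by default). A minimal sketch reproducing them:

```python
# Percentile via linear interpolation between closest ranks.
def percentile(xs, p):
    xs = sorted(xs)
    k = (len(xs) - 1) * p / 100.0   # fractional rank
    lo = int(k)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (k - lo) * (xs[hi] - xs[lo])

# The five healthy-response latencies logged above, in seconds.
times = [1.6351444721221924, 1.2341234683990479, 1.3827872276306152,
         1.2946341037750244, 1.4107208251953125]
```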
Pipeline stage StressChecker completed in 8.20s
Shutdown handler de-registered
jellywibble-lora-120k-p_2801_v26 status is now deployed due to DeploymentManager action
jellywibble-lora-120k-p_2801_v26 status is now inactive due to auto deactivation of underperforming models