submission_id: jellywibble-lora-120k-p_2801_v21
developer_uid: chai_backend_admin
alignment_samples: 0
best_of: 16
celo_rating: 1234.94
display_name: jellywibble-lora-120k-p_2801_v21
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>system<|end_header_id|>\n\nrespond with humor<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.95, 'top_p': 1.0, 'min_p': 0.08, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|eot_id|>'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
is_internal_developer: True
language_model: Jellywibble/lora_120k_pref_data_ep3_stacked_elo_alignment
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: Jellywibble/lora_120k_pr
model_name: jellywibble-lora-120k-p_2801_v21
model_num_parameters: 8030261248.0
model_repo: Jellywibble/lora_120k_pref_data_ep3_stacked_elo_alignment
model_size: 8B
num_battles: 216146
num_wins: 114694
propriety_score: 0.7085632900124622
propriety_total_count: 11234.0
ranking_group: single
reward_formatter: {'bot_template': 'Bot: {message}\n', 'memory_template': '', 'prompt_template': '', 'response_template': 'Bot:', 'truncate_by_message': False, 'user_template': 'User: {message}\n'}
reward_repo: ChaiML/gpt2_xl_pairwise_89m_step_347634
status: torndown
submission_type: basic
timestamp: 2024-07-15T18:05:45+00:00
us_pacific_date: 2024-07-15
win_ratio: 0.5306320727656306
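
As a quick consistency check of the battle statistics above (plain arithmetic, not part of the original log), win_ratio is simply num_wins divided by num_battles:

num_wins, num_battles = 114694, 216146
print(num_wins / num_battles)  # 0.5306320727656306 == win_ratio

For orientation, here is a minimal sketch of how the formatter templates above could be assembled into a single Llama-3-style prompt. The build_prompt helper and the sample conversation are illustrative assumptions; the production serving code is not part of this record.

# Sketch of prompt assembly from the formatter fields above (hypothetical helper).
formatter = {
    "memory_template": "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n",
    "prompt_template": "{prompt}<|eot_id|>",
    "bot_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>",
    "user_template": "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>",
    "response_template": "<|start_header_id|>system<|end_header_id|>\n\nrespond with humor<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """Assemble persona, scenario, and chat turns into one Llama-3-style prompt."""
    out = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    out += formatter["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        template = formatter["bot_template"] if speaker == "bot" else formatter["user_template"]
        out += template.format(bot_name=bot_name, user_name=user_name, message=message)
    out += formatter["response_template"].format(bot_name=bot_name)
    return out

print(build_prompt("Ava", "Sam", "A witty barista.", "Sam walks into the cafe.",
                   [("user", "One coffee, please."), ("bot", "Coming right up!")]))

The generation_params map onto standard sampling settings. The log does not say which inference engine served the model; as one illustration only, the same configuration expressed with vLLM's SamplingParams (field names below are vLLM's, not necessarily the production stack's):

from vllm import LLM, SamplingParams

params = SamplingParams(
    temperature=0.95,        # soften the distribution slightly
    top_p=1.0,               # nucleus filtering effectively disabled
    min_p=0.08,              # drop tokens below 8% of the top token's probability
    top_k=40,                # keep only the 40 most likely tokens
    presence_penalty=0.0,
    frequency_penalty=0.0,
    stop=["\n", "<|eot_id|>"],
    max_tokens=64,           # max_output_tokens above
    n=16,                    # best_of: 16 -- draw 16 candidates per request,
)                            # then rerank them externally with the reward model
llm = LLM(model="Jellywibble/lora_120k_pref_data_ep3_stacked_elo_alignment")
outputs = llm.generate(["<prompt built as sketched above>"], params)

best_of: 16 together with reward_formatter and reward_repo suggests best-of-N reranking by the GPT-2 XL pairwise reward model. A rough sketch under that assumption; the scoring head and convention are guesses, and the production scoring code is not in this record:

import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

reward_tok = AutoTokenizer.from_pretrained("ChaiML/gpt2_xl_pairwise_89m_step_347634")
reward_model = AutoModelForSequenceClassification.from_pretrained(
    "ChaiML/gpt2_xl_pairwise_89m_step_347634"
)
reward_model.eval()

def rerank(history, candidates):
    """Score each candidate under the reward_formatter layout; return the best one."""
    # reward_formatter above: user turns as "User: {message}\n", bot turns as
    # "Bot: {message}\n", and the trailing "Bot:" response template.
    context = "".join(
        ("User: " if who == "user" else "Bot: ") + msg + "\n" for who, msg in history
    )
    scores = []
    for cand in candidates:
        inputs = reward_tok(context + "Bot:" + cand, return_tensors="pt", truncation=True)
        with torch.no_grad():
            # Assumption: a higher classification logit means a better response.
            scores.append(reward_model(**inputs).logits[0, -1].item())
    return max(zip(scores, candidates))[1]
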
Resubmit model
Running pipeline stage MKMLizer
Starting job with name jellywibble-lora-120k-p-2801-v21-mkmlizer
Waiting for job on jellywibble-lora-120k-p-2801-v21-mkmlizer to finish
jellywibble-lora-120k-p-2801-v21-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
jellywibble-lora-120k-p-2801-v21-mkmlizer: ║ _____ __ __ ║
jellywibble-lora-120k-p-2801-v21-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
jellywibble-lora-120k-p-2801-v21-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
jellywibble-lora-120k-p-2801-v21-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
jellywibble-lora-120k-p-2801-v21-mkmlizer: ║ /___/ ║
jellywibble-lora-120k-p-2801-v21-mkmlizer: ║ ║
jellywibble-lora-120k-p-2801-v21-mkmlizer: ║ Version: 0.9.5.post2 ║
jellywibble-lora-120k-p-2801-v21-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
jellywibble-lora-120k-p-2801-v21-mkmlizer: ║ https://mk1.ai ║
jellywibble-lora-120k-p-2801-v21-mkmlizer: ║ ║
jellywibble-lora-120k-p-2801-v21-mkmlizer: ║ The license key for the current software has been verified as ║
jellywibble-lora-120k-p-2801-v21-mkmlizer: ║ belonging to: ║
jellywibble-lora-120k-p-2801-v21-mkmlizer: ║ ║
jellywibble-lora-120k-p-2801-v21-mkmlizer: ║ Chai Research Corp. ║
jellywibble-lora-120k-p-2801-v21-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
jellywibble-lora-120k-p-2801-v21-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
jellywibble-lora-120k-p-2801-v21-mkmlizer: ║ ║
jellywibble-lora-120k-p-2801-v21-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
Connection pool is full, discarding connection: %s. Connection pool size: %s (repeated 9 times)
jellywibble-lora-120k-p-2801-v21-mkmlizer: Downloaded to shared memory in 47.636s
jellywibble-lora-120k-p-2801-v21-mkmlizer: quantizing model to /dev/shm/model_cache
jellywibble-lora-120k-p-2801-v21-mkmlizer: Saving flywheel model at /dev/shm/model_cache
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.embed_tokens.weight torch.Size([139542528])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.0.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.0.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.0.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.0.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.0.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.0.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.1.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.1.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.1.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.1.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.1.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.1.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.2.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.2.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.2.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.2.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.2.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.2.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.3.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.3.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.3.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.3.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.3.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.3.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.4.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.4.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.4.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.4.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.4.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.4.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.5.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.5.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.5.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.5.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.5.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.5.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.6.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.6.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.6.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.6.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.6.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.6.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.7.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.7.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.7.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.7.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.7.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.7.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.8.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.8.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.8.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.10.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.10.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.10.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.10.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.10.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.10.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.11.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.11.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.11.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.11.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.11.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.11.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.12.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.12.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.12.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.12.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.12.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.12.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.13.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.13.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.13.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.13.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.13.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.13.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.14.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.14.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.8.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.8.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.8.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.9.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.9.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.9.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.9.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.9.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.9.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.28.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.28.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.28.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.28.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.28.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.29.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.29.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.29.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.29.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.29.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.29.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.30.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.30.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.30.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.30.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.30.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.30.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.31.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.31.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v21-mkmlizer: lm_head.weight torch.Size([139542528])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.31.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.31.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.31.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.layers.31.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: model.norm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v21-mkmlizer: Loading 0: 100%|█████████▉| 290/291 [00:13<00:00, 2.92it/s]
jellywibble-lora-120k-p-2801-v21-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
jellywibble-lora-120k-p-2801-v21-mkmlizer: quantized model in 32.261s
jellywibble-lora-120k-p-2801-v21-mkmlizer: Processed model Jellywibble/lora_120k_pref_data_ep3_stacked_elo_alignment in 79.897s
jellywibble-lora-120k-p-2801-v21-mkmlizer: creating bucket guanaco-mkml-models
jellywibble-lora-120k-p-2801-v21-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/jellywibble-lora-120k-p-2801-v21/flywheel_model.0.safetensors
jellywibble-lora-120k-p-2801-v21-mkmlizer: loading reward model from ChaiML/gpt2_xl_pairwise_89m_step_347634
jellywibble-lora-120k-p-2801-v21-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:950: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
jellywibble-lora-120k-p-2801-v21-mkmlizer: warnings.warn(
jellywibble-lora-120k-p-2801-v21-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:778: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
jellywibble-lora-120k-p-2801-v21-mkmlizer: warnings.warn(
jellywibble-lora-120k-p-2801-v21-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
jellywibble-lora-120k-p-2801-v21-mkmlizer: warnings.warn(
jellywibble-lora-120k-p-2801-v21-mkmlizer: Downloading shards: 100%|██████████| 2/2 [00:08<00:00, 4.41s/it]
jellywibble-lora-120k-p-2801-v21-mkmlizer: Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 2.32it/s]
jellywibble-lora-120k-p-2801-v21-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
jellywibble-lora-120k-p-2801-v21-mkmlizer: Saving duration: 2.158s
jellywibble-lora-120k-p-2801-v21-mkmlizer: Processed model ChaiML/gpt2_xl_pairwise_89m_step_347634 in 13.452s
jellywibble-lora-120k-p-2801-v21-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/jellywibble-lora-120k-p-2801-v21_reward/tokenizer_config.json
jellywibble-lora-120k-p-2801-v21-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/jellywibble-lora-120k-p-2801-v21_reward/special_tokens_map.json
jellywibble-lora-120k-p-2801-v21-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/jellywibble-lora-120k-p-2801-v21_reward/vocab.json
jellywibble-lora-120k-p-2801-v21-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/jellywibble-lora-120k-p-2801-v21_reward/config.json
jellywibble-lora-120k-p-2801-v21-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/jellywibble-lora-120k-p-2801-v21_reward/merges.txt
jellywibble-lora-120k-p-2801-v21-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/jellywibble-lora-120k-p-2801-v21_reward/tokenizer.json
jellywibble-lora-120k-p-2801-v21-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/jellywibble-lora-120k-p-2801-v21_reward/reward.tensors
Job jellywibble-lora-120k-p-2801-v21-mkmlizer completed after 118.33s with status: succeeded
Stopping job with name jellywibble-lora-120k-p-2801-v21-mkmlizer
Pipeline stage MKMLizer completed in 119.11s
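
The MKMLizer stage itself is proprietary (the "flywheel" quantizer from MK ONE TECHNOLOGIES), so only its surrounding flow can be sketched from this log: download the repo to shared memory, quantize, then copy the artifact to S3. A rough illustration under those assumptions; the quantizer call in step 2 is a placeholder, while the bucket and key names match the log:

import boto3
from huggingface_hub import snapshot_download

# 1. Download the HF repo into shared memory ("Downloaded to shared memory in 47.636s")
local = snapshot_download(
    "Jellywibble/lora_120k_pref_data_ep3_stacked_elo_alignment",
    local_dir="/dev/shm/model_cache",
)

# 2. Quantize to the flywheel safetensors format (proprietary; placeholder only)
# quantize_to_flywheel(local, "/dev/shm/model_cache/flywheel_model.0.safetensors")

# 3. Upload the result ("cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://...")
s3 = boto3.client("s3")
s3.upload_file(
    "/dev/shm/model_cache/flywheel_model.0.safetensors",
    "guanaco-mkml-models",
    "jellywibble-lora-120k-p-2801-v21/flywheel_model.0.safetensors",
)

The reward model (ChaiML/gpt2_xl_pairwise_89m_step_347634) is processed and uploaded to guanaco-reward-models in the same stage, as the log above shows.
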
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.12s
Running pipeline stage ISVCDeployer
Creating inference service jellywibble-lora-120k-p-2801-v21
Waiting for inference service jellywibble-lora-120k-p-2801-v21 to be ready
Inference service jellywibble-lora-120k-p-2801-v21 ready after 50.352261781692505s
Pipeline stage ISVCDeployer completed in 57.15s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.310333013534546s
Received healthy response to inference request in 1.3425955772399902s
Received healthy response to inference request in 1.2941091060638428s
Received healthy response to inference request in 1.3891403675079346s
Received healthy response to inference request in 1.3756628036499023s
5 requests
0 failed requests
5th percentile: 1.3038064002990724
10th percentile: 1.3135036945343017
20th percentile: 1.3328982830047607
30th percentile: 1.3492090225219726
40th percentile: 1.3624359130859376
50th percentile: 1.3756628036499023
60th percentile: 1.3810538291931151
70th percentile: 1.3864448547363282
80th percentile: 1.573378896713257
90th percentile: 1.9418559551239014
95th percentile: 2.1260944843292235
99th percentile: 2.2734853076934813
mean time: 1.5423681735992432
Pipeline stage StressChecker completed in 9.46s
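
The StressChecker statistics are reproducible from the five logged latencies using NumPy's default linear percentile interpolation. A verification sketch, not the checker's actual code; it reproduces the logged values exactly:

import numpy as np

latencies = np.array([
    2.310333013534546,
    1.3425955772399902,
    1.2941091060638428,
    1.3891403675079346,
    1.3756628036499023,
])
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    # e.g. 5th percentile -> 1.3038064002990724, 90th -> 1.9418559551239014
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print("mean time:", latencies.mean())  # 1.5423681735992432
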
jellywibble-lora-120k-p_2801_v21 status is now deployed due to DeploymentManager action
jellywibble-lora-120k-p_2801_v21 status is now inactive due to auto-deactivation of underperforming models
admin requested tearing down of jellywibble-lora-120k-p_2801_v21
Running pipeline stage ISVCDeleter
Checking if service jellywibble-lora-120k-p-2801-v21 is running
Skipping teardown as no inference service was found
Pipeline stage ISVCDeleter completed in 4.88s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key jellywibble-lora-120k-p-2801-v21/config.json from bucket guanaco-mkml-models
Deleting key jellywibble-lora-120k-p-2801-v21/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key jellywibble-lora-120k-p-2801-v21/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key jellywibble-lora-120k-p-2801-v21/tokenizer.json from bucket guanaco-mkml-models
Deleting key jellywibble-lora-120k-p-2801-v21/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key jellywibble-lora-120k-p-2801-v21_reward/config.json from bucket guanaco-reward-models
Deleting key jellywibble-lora-120k-p-2801-v21_reward/merges.txt from bucket guanaco-reward-models
Deleting key jellywibble-lora-120k-p-2801-v21_reward/reward.tensors from bucket guanaco-reward-models
Deleting key jellywibble-lora-120k-p-2801-v21_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key jellywibble-lora-120k-p-2801-v21_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key jellywibble-lora-120k-p-2801-v21_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key jellywibble-lora-120k-p-2801-v21_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 6.02s
jellywibble-lora-120k-p_2801_v21 status is now torndown due to DeploymentManager action

Usage Metrics

Latency Metrics