submission_id: cycy233-l3-bdpo-v5-c1_v8
developer_uid: shiroe40
alignment_samples: 0
best_of: 16
celo_rating: 1146.97
display_name: auto
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.6, 'top_p': 0.9, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
is_internal_developer: False
language_model: cycy233/L3-bdpo-v5-c1
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: cycy233/L3-bdpo-v5-c1
model_name: auto
model_num_parameters: 8030261248.0
model_repo: cycy233/L3-bdpo-v5-c1
model_size: 8B
num_battles: 64219
num_wins: 28064
propriety_score: 0.7721930934731513
propriety_total_count: 5531.0
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: Jellywibble/gpt2_xl_pairwise_89m_step_347634
status: torndown
submission_type: basic
timestamp: 2024-07-15T04:05:23+00:00
us_pacific_date: 2024-07-14
win_ratio: 0.4370046247995142
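The formatter templates and battle counts above can be sanity-checked with a short script. This is a minimal sketch: the bot name, persona, and messages are made-up placeholders; only the template strings and the wins/battles counts come from the metadata.

```python
# Sketch: assemble a chat prompt from the submission's formatter templates
# and check win_ratio against num_wins / num_battles.
# Persona, bot name, and turns below are hypothetical placeholders.

memory_template = "{bot_name}'s Persona: {memory}\n####\n"
prompt_template = "{prompt}\n<START>\n"
bot_template = "{bot_name}: {message}\n"
user_template = "{user_name}: {message}\n"
response_template = "{bot_name}:"

def build_prompt(bot_name, memory, prompt, turns):
    """Render a conversation the way the formatter fields describe."""
    parts = [memory_template.format(bot_name=bot_name, memory=memory),
             prompt_template.format(prompt=prompt)]
    for speaker, message in turns:
        if speaker == bot_name:
            parts.append(bot_template.format(bot_name=speaker, message=message))
        else:
            parts.append(user_template.format(user_name=speaker, message=message))
    parts.append(response_template.format(bot_name=bot_name))
    return "".join(parts)

text = build_prompt("Aria", "A helpful guide.", "Aria greets you.",
                    [("You", "Hi there!")])
assert text.endswith("Aria:")  # generation continues from "{bot_name}:"

# win_ratio is simply wins over battles:
num_wins, num_battles = 28064, 64219
assert abs(num_wins / num_battles - 0.4370046247995142) < 1e-12
```

The trailing `response_template` is what makes the model complete the bot's next turn; `stopping_words: ['\n']` in the generation params then cuts the reply at the end of that single line.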
Resubmit model
Running pipeline stage MKMLizer
Starting job with name cycy233-l3-bdpo-v5-c1-v8-mkmlizer
Waiting for job on cycy233-l3-bdpo-v5-c1-v8-mkmlizer to finish
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ║ _____ __ __ ║
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ║ /___/ ║
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ║ ║
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ║ Version: 0.9.5.post2 ║
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ║ https://mk1.ai ║
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ║ ║
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ║ The license key for the current software has been verified as ║
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ║ belonging to: ║
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ║ ║
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ║ Chai Research Corp. ║
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ║ ║
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: Downloaded to shared memory in 15.895s
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: quantizing model to /dev/shm/model_cache
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: Saving flywheel model at /dev/shm/model_cache
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.embed_tokens.weight torch.Size([139542528])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.0.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.0.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.0.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.0.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.0.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.0.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.1.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.1.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.1.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.1.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.1.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.1.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.2.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.2.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.2.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.2.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.2.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.2.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.3.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.3.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.3.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.3.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.3.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.3.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.4.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.4.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.4.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.4.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.4.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.4.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.5.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.5.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.5.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.5.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.5.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.5.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.6.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.6.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.6.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.6.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.6.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.6.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.7.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.7.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.7.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.7.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.7.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.7.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.8.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.8.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.8.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.8.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.8.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.8.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.10.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.10.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.10.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.10.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.10.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.10.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.11.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.11.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.11.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.11.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.11.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.11.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.12.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.12.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.12.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.12.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.12.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.12.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.13.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.13.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.13.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.13.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.13.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.13.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.14.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.14.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.14.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.14.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.14.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.14.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.15.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.15.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.15.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.15.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.15.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.15.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.16.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.16.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.16.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.16.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.16.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.16.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.17.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.17.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.17.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.17.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.17.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.17.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.18.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.18.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.18.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.18.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.18.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.18.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.19.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.19.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.19.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.19.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.19.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.19.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.20.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.20.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.9.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.9.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.9.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.9.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.9.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.9.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.20.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.20.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.20.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.20.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.21.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.21.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.21.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.21.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.21.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.21.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.22.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.22.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.22.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.22.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.22.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.22.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.23.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.23.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.23.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.23.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.23.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.23.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.24.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.24.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.24.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.24.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: Loading 0: 62%|██████▏ | 180/291 [00:01<00:00, 131.30it/s]
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.24.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.24.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.25.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.25.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.25.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.25.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.25.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.25.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.26.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.26.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.26.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.26.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.26.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.26.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.27.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.27.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.27.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.27.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.27.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.27.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.28.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.28.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.28.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.28.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.28.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.28.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.29.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.29.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.29.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.29.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.29.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.29.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.30.input_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.30.mlp.down_proj.weight torch.Size([11927552])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.30.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.30.post_attention_layernorm.weight torch.Size([4096])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.30.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.30.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.31.mlp.up_gate_proj.weight torch.Size([23855104])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.31.self_attn.o_proj.weight torch.Size([3407872])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: model.layers.31.self_attn.qkv_proj.weight torch.Size([5111808])
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: creating bucket guanaco-mkml-models
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/cycy233-l3-bdpo-v5-c1-v8
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/cycy233-l3-bdpo-v5-c1-v8/special_tokens_map.json
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/cycy233-l3-bdpo-v5-c1-v8/config.json
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/cycy233-l3-bdpo-v5-c1-v8/tokenizer_config.json
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/cycy233-l3-bdpo-v5-c1-v8/tokenizer.json
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/cycy233-l3-bdpo-v5-c1-v8/flywheel_model.0.safetensors
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: loading reward model from Jellywibble/gpt2_xl_pairwise_89m_step_347634
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:950: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: warnings.warn(
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:778: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: warnings.warn(
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
cycy233-l3-bdpo-v5-c1-v8-mkmlizer: warnings.warn(
Job cycy233-l3-bdpo-v5-c1-v8-mkmlizer completed after 92.16s with status: succeeded
Stopping job with name cycy233-l3-bdpo-v5-c1-v8-mkmlizer
Pipeline stage MKMLizer completed in 92.99s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.15s
Running pipeline stage ISVCDeployer
Creating inference service cycy233-l3-bdpo-v5-c1-v8
Waiting for inference service cycy233-l3-bdpo-v5-c1-v8 to be ready
Inference service cycy233-l3-bdpo-v5-c1-v8 ready after 50.227527379989624s
Pipeline stage ISVCDeployer completed in 57.14s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.404926061630249s
Received healthy response to inference request in 1.4038257598876953s
Received healthy response to inference request in 1.405078411102295s
Received healthy response to inference request in 1.6774146556854248s
Received healthy response to inference request in 1.4319593906402588s
5 requests
0 failed requests
5th percentile: 1.4040762901306152
10th percentile: 1.404326820373535
20th percentile: 1.404827880859375
30th percentile: 1.4104546070098878
40th percentile: 1.4212069988250733
50th percentile: 1.4319593906402588
60th percentile: 1.530141496658325
70th percentile: 1.6283236026763916
80th percentile: 1.8229169368743898
90th percentile: 2.1139214992523194
95th percentile: 2.259423780441284
99th percentile: 2.375825605392456
mean time: 1.6646408557891845
Pipeline stage StressChecker completed in 9.79s
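The StressChecker percentile lines are consistent with linear interpolation between closest ranks over the five sorted request latencies (the numpy-style "linear" method). A minimal reimplementation, assuming that method, reproduces the logged values:

```python
# Sketch: reproduce the StressChecker percentile and mean lines from the
# five logged latencies, assuming numpy-style linear interpolation.
import math

latencies = sorted([2.404926061630249, 1.4038257598876953,
                    1.405078411102295, 1.6774146556854248,
                    1.4319593906402588])

def percentile(xs, p):
    """Linear interpolation between closest ranks of a sorted sample."""
    k = (len(xs) - 1) * p / 100.0
    lo, hi = math.floor(k), math.ceil(k)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

assert abs(percentile(latencies, 5) - 1.4040762901306152) < 1e-9
assert abs(percentile(latencies, 50) - 1.4319593906402588) < 1e-9
assert abs(percentile(latencies, 90) - 2.1139214992523194) < 1e-9
assert abs(sum(latencies) / len(latencies) - 1.6646408557891845) < 1e-9
```

With only 5 samples, every percentile below the 50th is interpolated between the two fastest requests, which is why the 5th through 30th percentiles all sit within a few milliseconds of each other.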
cycy233-l3-bdpo-v5-c1_v8 status is now deployed due to DeploymentManager action
cycy233-l3-bdpo-v5-c1_v8 status is now inactive due to auto deactivation of underperforming models
admin requested tearing down of cycy233-l3-bdpo-v5-c1_v8
Running pipeline stage ISVCDeleter
Checking if service cycy233-l3-bdpo-v5-c1-v8 is running
Skipping teardown as no inference service was found
Pipeline stage ISVCDeleter completed in 4.30s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key cycy233-l3-bdpo-v5-c1-v8/config.json from bucket guanaco-mkml-models
Deleting key cycy233-l3-bdpo-v5-c1-v8/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key cycy233-l3-bdpo-v5-c1-v8/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key cycy233-l3-bdpo-v5-c1-v8/tokenizer.json from bucket guanaco-mkml-models
Deleting key cycy233-l3-bdpo-v5-c1-v8/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key cycy233-l3-bdpo-v5-c1-v8_reward/config.json from bucket guanaco-reward-models
Deleting key cycy233-l3-bdpo-v5-c1-v8_reward/merges.txt from bucket guanaco-reward-models
Deleting key cycy233-l3-bdpo-v5-c1-v8_reward/reward.tensors from bucket guanaco-reward-models
Deleting key cycy233-l3-bdpo-v5-c1-v8_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key cycy233-l3-bdpo-v5-c1-v8_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key cycy233-l3-bdpo-v5-c1-v8_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key cycy233-l3-bdpo-v5-c1-v8_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 7.29s
cycy233-l3-bdpo-v5-c1_v8 status is now torndown due to DeploymentManager action