submission_id: jellywibble-lora-120k-p_2801_v16
developer_uid: szzi-ye
best_of: 4
celo_rating: 1231.62
display_name: nitral-ai-hathor-l3-8b-v-01_v1
family_friendly_score: 0.0
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.95, 'top_p': 1.0, 'min_p': 0.08, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|eot_id|>'], 'max_input_tokens': 512, 'best_of': 4, 'max_output_tokens': 64}
is_internal_developer: False
language_model: Jellywibble/lora_120k_pref_data_ep3_stacked_elo_alignment
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: Jellywibble/lora_120k_pr
model_name: nitral-ai-hathor-l3-8b-v-01_v1
model_num_parameters: 8030261248.0
model_repo: Jellywibble/lora_120k_pref_data_ep3_stacked_elo_alignment
model_size: 8B
num_battles: 65537
num_wins: 35810
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: ChaiML/gpt2_xl_pairwise_89m_step_347634
status: torndown
submission_type: basic
timestamp: 2024-07-14T19:28:00+00:00
us_pacific_date: 2024-07-14
win_ratio: 0.5464088987899965
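A few notes on the header above. The formatter and generation_params fields fully determine how a conversation is serialized into a Llama-3 prompt: the memory template opens the system header, each user turn is wrapped in a user header, and the response template leaves the assistant header open for the model to complete; best_of: 4 indicates that four candidate completions are sampled per request (as the separate reward_formatter and reward_repo fields suggest, these are ranked by the reward model). Note also that win_ratio is simply num_wins / num_battles (35810 / 65537 ≈ 0.5464). Below is a minimal sketch of the prompt assembly, assuming plain str.format substitution; the names and messages ("Hathor", "User", ...) are illustrative, and the production formatter is not shown in this dump.

    # Minimal sketch of prompt assembly from the formatter above, assuming
    # plain str.format substitution; names and messages are illustrative.
    # Earlier assistant turns would be wrapped with bot_template the same way.
    formatter = {
        "memory_template": "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n",
        "prompt_template": "{prompt}<|eot_id|>",
        "user_template": "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>",
        "response_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:",
    }
    prompt = (
        formatter["memory_template"].format(bot_name="Hathor", memory="A warm, witty companion.")
        + formatter["prompt_template"].format(prompt="Stay in character.")
        + formatter["user_template"].format(user_name="User", message="Hi there!")
        + formatter["response_template"].format(bot_name="Hathor")
    )
    print(prompt)

    # Sanity check on the header metadata: win_ratio == num_wins / num_battles.
    assert abs(35810 / 65537 - 0.5464088987899965) < 1e-12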
Resubmit model
Running pipeline stage MKMLizer
Starting job with name jellywibble-lora-120k-p-2801-v16-mkmlizer
Waiting for job on jellywibble-lora-120k-p-2801-v16-mkmlizer to finish
jellywibble-lora-120k-p-2801-v16-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
jellywibble-lora-120k-p-2801-v16-mkmlizer: ║          [flywheel ASCII-art logo]                                  ║
jellywibble-lora-120k-p-2801-v16-mkmlizer: ║ ║
jellywibble-lora-120k-p-2801-v16-mkmlizer: ║ Version: 0.9.5.post2 ║
jellywibble-lora-120k-p-2801-v16-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
jellywibble-lora-120k-p-2801-v16-mkmlizer: ║ https://mk1.ai ║
jellywibble-lora-120k-p-2801-v16-mkmlizer: ║ ║
jellywibble-lora-120k-p-2801-v16-mkmlizer: ║ The license key for the current software has been verified as ║
jellywibble-lora-120k-p-2801-v16-mkmlizer: ║ belonging to: ║
jellywibble-lora-120k-p-2801-v16-mkmlizer: ║ ║
jellywibble-lora-120k-p-2801-v16-mkmlizer: ║ Chai Research Corp. ║
jellywibble-lora-120k-p-2801-v16-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
jellywibble-lora-120k-p-2801-v16-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
jellywibble-lora-120k-p-2801-v16-mkmlizer: ║ ║
jellywibble-lora-120k-p-2801-v16-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
jellywibble-lora-120k-p-2801-v16-mkmlizer: Downloaded to shared memory in 48.145s
jellywibble-lora-120k-p-2801-v16-mkmlizer: quantizing model to /dev/shm/model_cache
jellywibble-lora-120k-p-2801-v16-mkmlizer: Saving flywheel model at /dev/shm/model_cache
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.embed_tokens.weight torch.Size([139542528])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.0.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.0.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.0.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.0.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.0.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.0.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.1.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.1.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.1.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.1.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.1.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.1.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.2.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.2.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.2.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.2.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.2.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.2.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.3.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.3.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.3.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.3.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.3.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.3.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.4.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.4.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.4.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.4.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.4.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.4.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.5.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.5.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.5.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.5.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.5.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.5.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.6.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.6.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.6.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.6.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.6.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.6.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.7.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.7.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.7.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.7.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.7.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.7.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.8.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.8.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.8.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.10.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.10.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.10.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.10.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.10.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.10.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.11.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.11.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.11.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.11.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.11.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.11.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.12.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.12.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.12.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.12.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.12.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.12.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.13.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.13.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.13.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.13.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.13.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.13.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.14.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.14.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.8.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.8.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.8.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.9.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.9.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.26.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.27.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.27.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.27.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.27.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.27.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.27.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.28.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.28.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.28.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.28.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.28.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.28.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.29.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.29.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.29.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.29.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.29.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.29.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.30.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.30.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.30.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.30.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.30.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.30.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.31.self_attn.o_proj.weight torch.Size([3407872])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.31.self_attn.qkv_proj.weight torch.Size([5111808])
jellywibble-lora-120k-p-2801-v16-mkmlizer: lm_head.weight torch.Size([139542528])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.31.input_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.31.mlp.down_proj.weight torch.Size([11927552])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.31.mlp.up_gate_proj.weight torch.Size([23855104])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.layers.31.post_attention_layernorm.weight torch.Size([4096])
jellywibble-lora-120k-p-2801-v16-mkmlizer: model.norm.weight torch.Size([4096])
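The flattened torch.Size values above appear to be the serialized (quantized) element counts rather than the logical 2-D shapes. The tensor names, however, describe the standard Llama-3-8B layout, and its full-precision parameter count reproduces the model_num_parameters field from the header exactly. A worked check, assuming the published Llama-3-8B config values (vocab 128256, hidden 4096, intermediate 14336, 8 KV heads of head_dim 128, 32 layers):

    # Full-precision parameter count for the Llama-3-8B layout named above;
    # reproduces model_num_parameters (8,030,261,248) from the header.
    vocab, hidden, inter, layers = 128_256, 4_096, 14_336, 32
    kv_width = 8 * 128                          # 8 KV heads x head_dim 128
    qkv = hidden * (hidden + 2 * kv_width)      # fused self_attn.qkv_proj
    o_proj = hidden * hidden                    # self_attn.o_proj
    mlp = 2 * hidden * inter + inter * hidden   # fused mlp.up_gate_proj + mlp.down_proj
    norms = 2 * hidden                          # input + post_attention layernorms
    per_layer = qkv + o_proj + mlp + norms
    total = vocab * hidden                      # model.embed_tokens
    total += layers * per_layer
    total += hidden                             # final model.norm
    total += vocab * hidden                     # lm_head (listed separately above)
    assert total == 8_030_261_248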
Connection pool is full, discarding connection: %s. Connection pool size: %s
jellywibble-lora-120k-p-2801-v16-mkmlizer: creating bucket guanaco-mkml-models
jellywibble-lora-120k-p-2801-v16-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
jellywibble-lora-120k-p-2801-v16-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/jellywibble-lora-120k-p-2801-v16
jellywibble-lora-120k-p-2801-v16-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/jellywibble-lora-120k-p-2801-v16/config.json
jellywibble-lora-120k-p-2801-v16-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/jellywibble-lora-120k-p-2801-v16/tokenizer_config.json
jellywibble-lora-120k-p-2801-v16-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/jellywibble-lora-120k-p-2801-v16/special_tokens_map.json
jellywibble-lora-120k-p-2801-v16-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/jellywibble-lora-120k-p-2801-v16/tokenizer.json
jellywibble-lora-120k-p-2801-v16-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/jellywibble-lora-120k-p-2801-v16/flywheel_model.0.safetensors
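The cp lines above copy the quantized artifacts from shared memory into the guanaco-mkml-models bucket. A minimal sketch of the same copies using plain boto3 (an assumption for illustration; the pipeline's actual S3 tooling is not shown in the log), with the bucket, prefix, and file names taken from the log lines above:

    # Hypothetical re-creation of the artifact upload step with boto3.
    import boto3

    s3 = boto3.client("s3")
    bucket = "guanaco-mkml-models"
    prefix = "jellywibble-lora-120k-p-2801-v16"
    for name in (
        "config.json",
        "tokenizer_config.json",
        "special_tokens_map.json",
        "tokenizer.json",
        "flywheel_model.0.safetensors",
    ):
        s3.upload_file(f"/dev/shm/model_cache/{name}", bucket, f"{prefix}/{name}")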
jellywibble-lora-120k-p-2801-v16-mkmlizer: loading reward model from ChaiML/gpt2_xl_pairwise_89m_step_347634
jellywibble-lora-120k-p-2801-v16-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:950: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
jellywibble-lora-120k-p-2801-v16-mkmlizer: warnings.warn(
jellywibble-lora-120k-p-2801-v16-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:778: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
jellywibble-lora-120k-p-2801-v16-mkmlizer: warnings.warn(
jellywibble-lora-120k-p-2801-v16-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
jellywibble-lora-120k-p-2801-v16-mkmlizer: warnings.warn(
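The three FutureWarnings above come from the deprecated use_auth_token argument; as the warning text says, the replacement is the token keyword. A minimal sketch of the migrated reward-model load, assuming an Auto* sequence-classification entry point (the actual call site is not shown in the log):

    # Passing `token=` instead of the deprecated `use_auth_token=` silences
    # the FutureWarnings. The Auto* classes here are an assumption.
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    repo = "ChaiML/gpt2_xl_pairwise_89m_step_347634"
    tokenizer = AutoTokenizer.from_pretrained(repo, token=True)  # was use_auth_token=True
    reward_model = AutoModelForSequenceClassification.from_pretrained(repo, token=True)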
jellywibble-lora-120k-p-2801-v16-mkmlizer: Downloading shards: 100%|██████████| 2/2 [00:12<00:00, 6.16s/it]
jellywibble-lora-120k-p-2801-v16-mkmlizer: Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 2.17it/s]
jellywibble-lora-120k-p-2801-v16-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
jellywibble-lora-120k-p-2801-v16-mkmlizer: Saving duration: 2.402s
jellywibble-lora-120k-p-2801-v16-mkmlizer: Processed model ChaiML/gpt2_xl_pairwise_89m_step_347634 in 17.748s
jellywibble-lora-120k-p-2801-v16-mkmlizer: creating bucket guanaco-reward-models
jellywibble-lora-120k-p-2801-v16-mkmlizer: Bucket 's3://guanaco-reward-models/' created
jellywibble-lora-120k-p-2801-v16-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/jellywibble-lora-120k-p-2801-v16_reward
jellywibble-lora-120k-p-2801-v16-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/jellywibble-lora-120k-p-2801-v16_reward/config.json
jellywibble-lora-120k-p-2801-v16-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/jellywibble-lora-120k-p-2801-v16_reward/special_tokens_map.json
jellywibble-lora-120k-p-2801-v16-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/jellywibble-lora-120k-p-2801-v16_reward/merges.txt
jellywibble-lora-120k-p-2801-v16-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/jellywibble-lora-120k-p-2801-v16_reward/vocab.json
jellywibble-lora-120k-p-2801-v16-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/jellywibble-lora-120k-p-2801-v16_reward/tokenizer_config.json
jellywibble-lora-120k-p-2801-v16-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/jellywibble-lora-120k-p-2801-v16_reward/tokenizer.json
jellywibble-lora-120k-p-2801-v16-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/jellywibble-lora-120k-p-2801-v16_reward/reward.tensors
Job jellywibble-lora-120k-p-2801-v16-mkmlizer completed after 139.56s with status: succeeded
Stopping job with name jellywibble-lora-120k-p-2801-v16-mkmlizer
Pipeline stage MKMLizer completed in 140.62s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service jellywibble-lora-120k-p-2801-v16
Waiting for inference service jellywibble-lora-120k-p-2801-v16 to be ready
Inference service jellywibble-lora-120k-p-2801-v16 ready after 382.0600275993347s
Pipeline stage ISVCDeployer completed in 389.13s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.1202280521392822s
Received healthy response to inference request in 1.1462750434875488s
Received healthy response to inference request in 1.1501362323760986s
Received healthy response to inference request in 1.12502121925354s
Received healthy response to inference request in 1.1408252716064453s
5 requests
0 failed requests
5th percentile: 1.1281820297241212
10th percentile: 1.131342840194702
20th percentile: 1.1376644611358642
30th percentile: 1.141915225982666
40th percentile: 1.1440951347351074
50th percentile: 1.1462750434875488
60th percentile: 1.1478195190429688
70th percentile: 1.1493639945983887
80th percentile: 1.3441545963287356
90th percentile: 1.7321913242340088
95th percentile: 1.9262096881866453
99th percentile: 2.081424379348755
mean time: 1.336497163772583
Pipeline stage StressChecker completed in 7.73s
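The StressChecker summary above is consistent with linearly interpolated percentiles over the five response times (NumPy's default percentile method); the sketch below reproduces the reported figures exactly.

    # Recompute the StressChecker summary from the five response times
    # logged above; numpy's default linear interpolation matches the
    # reported percentiles and mean.
    import numpy as np

    times = np.array([
        2.1202280521392822,
        1.1462750434875488,
        1.1501362323760986,
        1.12502121925354,
        1.1408252716064453,
    ])
    assert np.isclose(np.percentile(times, 5), 1.1281820297241212)
    assert np.isclose(np.percentile(times, 50), 1.1462750434875488)
    assert np.isclose(np.percentile(times, 90), 1.7321913242340088)
    assert np.isclose(np.percentile(times, 95), 1.9262096881866453)
    assert np.isclose(times.mean(), 1.336497163772583)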
jellywibble-lora-120k-p_2801_v16 status is now deployed due to DeploymentManager action
jellywibble-lora-120k-p_2801_v16 status is now inactive due to auto deactivation of underperforming models
jellywibble-lora-120k-p_2801_v16 status is now deployed due to admin request
jellywibble-lora-120k-p_2801_v16 status is now inactive due to auto deactivation of underperforming models
admin requested tearing down of jellywibble-lora-120k-p_2801_v16
Running pipeline stage ISVCDeleter
Checking if service jellywibble-lora-120k-p-2801-v16 is running
Skipping teardown as no inference service was found
Pipeline stage ISVCDeleter completed in 4.20s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key jellywibble-lora-120k-p-2801-v16/config.json from bucket guanaco-mkml-models
Deleting key jellywibble-lora-120k-p-2801-v16/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key jellywibble-lora-120k-p-2801-v16/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key jellywibble-lora-120k-p-2801-v16/tokenizer.json from bucket guanaco-mkml-models
Deleting key jellywibble-lora-120k-p-2801-v16/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key jellywibble-lora-120k-p-2801-v16_reward/config.json from bucket guanaco-reward-models
Deleting key jellywibble-lora-120k-p-2801-v16_reward/merges.txt from bucket guanaco-reward-models
Deleting key jellywibble-lora-120k-p-2801-v16_reward/reward.tensors from bucket guanaco-reward-models
Deleting key jellywibble-lora-120k-p-2801-v16_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key jellywibble-lora-120k-p-2801-v16_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key jellywibble-lora-120k-p-2801-v16_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key jellywibble-lora-120k-p-2801-v16_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 6.94s
jellywibble-lora-120k-p_2801_v16 status is now torndown due to DeploymentManager action