developer_uid: shiroe40
submission_id: cycy233-l3-e-v2-c4_v9
model_name: auto
model_group: cycy233/L3-e-v2-c4
status: torndown
timestamp: 2024-07-29T02:26:37+00:00
num_battles: 12637
num_wins: 6689
celo_rating: 1236.03
family_friendly_score: 0.0
submission_type: basic
model_repo: cycy233/L3-e-v2-c4
model_architecture: LlamaForCausalLM
reward_repo: rirv938/reward_gpt2_medium_preference_24m_e2
model_num_parameters: 8030261248.0
best_of: 16
max_input_tokens: 512
max_output_tokens: 64
display_name: auto
is_internal_developer: False
language_model: cycy233/L3-e-v2-c4
model_size: 8B
ranking_group: single
us_pacific_date: 2024-07-28
win_ratio: 0.5293186674052386
generation_params: {'temperature': 0.95, 'top_p': 1.0, 'min_p': 0.08, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64, 'reward_max_token_input': 256}
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
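For reference, win_ratio is simply num_wins / num_battles (6689 / 12637 ≈ 0.5293). The formatter above defines how the persona, scenario prompt, and chat turns are assembled into a Llama 3 style prompt before sampling with generation_params. A minimal sketch of that assembly, using the exact templates above with a hypothetical persona and messages:

```python
# Minimal sketch: stitching the formatter templates above into one model input.
# The persona, user name, and messages are hypothetical illustrations.
formatter = {
    "memory_template": "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n",
    "prompt_template": "{prompt}<|eot_id|>",
    "user_template": "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>",
    "bot_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>",
    "response_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:",
}

def build_prompt(bot_name, memory, prompt, turns):
    """Assemble persona, scenario prompt, and chat turns into the generation input."""
    text = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    text += formatter["prompt_template"].format(prompt=prompt)
    for speaker, name, message in turns:
        if speaker == "user":
            text += formatter["user_template"].format(user_name=name, message=message)
        else:
            text += formatter["bot_template"].format(bot_name=name, message=message)
    # The response template leaves the assistant header open for the model to complete.
    text += formatter["response_template"].format(bot_name=bot_name)
    return text

print(build_prompt("Bot", "A friendly assistant.", "Greet the user.",
                   [("user", "Alice", "Hi there!")]))
```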
Running pipeline stage MKMLizer
Starting job with name cycy233-l3-e-v2-c4-v9-mkmlizer
Waiting for job on cycy233-l3-e-v2-c4-v9-mkmlizer to finish
cycy233-l3-e-v2-c4-v9-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
cycy233-l3-e-v2-c4-v9-mkmlizer: ║ [flywheel ASCII-art banner] ║
cycy233-l3-e-v2-c4-v9-mkmlizer: ║ ║
cycy233-l3-e-v2-c4-v9-mkmlizer: ║ Version: 0.9.7 ║
cycy233-l3-e-v2-c4-v9-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
cycy233-l3-e-v2-c4-v9-mkmlizer: ║ https://mk1.ai ║
cycy233-l3-e-v2-c4-v9-mkmlizer: ║ ║
cycy233-l3-e-v2-c4-v9-mkmlizer: ║ The license key for the current software has been verified as ║
cycy233-l3-e-v2-c4-v9-mkmlizer: ║ belonging to: ║
cycy233-l3-e-v2-c4-v9-mkmlizer: ║ ║
cycy233-l3-e-v2-c4-v9-mkmlizer: ║ Chai Research Corp. ║
cycy233-l3-e-v2-c4-v9-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
cycy233-l3-e-v2-c4-v9-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
cycy233-l3-e-v2-c4-v9-mkmlizer: ║ ║
cycy233-l3-e-v2-c4-v9-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
cycy233-l3-e-v2-c4-v9-mkmlizer: Downloaded to shared memory in 21.848s
cycy233-l3-e-v2-c4-v9-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpv8nwrkeg, device:0
cycy233-l3-e-v2-c4-v9-mkmlizer: Saving flywheel model at /dev/shm/model_cache
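The MKMLizer stage first pulls the Hugging Face repo into shared memory and then quantizes it into the flywheel format. The quantizer itself (MK1 tooling) is not public; as a hedged sketch, the download step is roughly what huggingface_hub's snapshot_download does, with the target directory taken from the log lines above:

```python
# Hedged sketch of the download step only: the MKML quantizer that produces the
# flywheel_model.*.safetensors shards is proprietary and not reproduced here.
# snapshot_download is a standard huggingface_hub call; the target directory is
# an assumption based on the "Downloaded to shared memory" line above.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="cycy233/L3-e-v2-c4",      # model_repo from the submission metadata
    local_dir="/dev/shm/model_cache",  # shared-memory location used by the pipeline
)
print("Downloaded to", local_path)
```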
cycy233-l3-e-v2-c4-v9-mkmlizer: quantized model in 26.154s
cycy233-l3-e-v2-c4-v9-mkmlizer: Processed model cycy233/L3-e-v2-c4 in 48.003s
cycy233-l3-e-v2-c4-v9-mkmlizer: creating bucket guanaco-mkml-models
cycy233-l3-e-v2-c4-v9-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
cycy233-l3-e-v2-c4-v9-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/cycy233-l3-e-v2-c4-v9
cycy233-l3-e-v2-c4-v9-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/cycy233-l3-e-v2-c4-v9/config.json
cycy233-l3-e-v2-c4-v9-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/cycy233-l3-e-v2-c4-v9/special_tokens_map.json
cycy233-l3-e-v2-c4-v9-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/cycy233-l3-e-v2-c4-v9/tokenizer_config.json
cycy233-l3-e-v2-c4-v9-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/cycy233-l3-e-v2-c4-v9/tokenizer.json
cycy233-l3-e-v2-c4-v9-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/cycy233-l3-e-v2-c4-v9/flywheel_model.0.safetensors
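The cp lines above mirror the quantized artifacts into the guanaco-mkml-models bucket. A hedged boto3 equivalent (bucket, prefix, and file names come from the log; credentials and any custom endpoint are assumed to be configured in the environment):

```python
# Hedged sketch: uploading the quantized artifacts to S3 with boto3.
import boto3

s3 = boto3.client("s3")
bucket = "guanaco-mkml-models"
prefix = "cycy233-l3-e-v2-c4-v9"
files = [
    "config.json",
    "special_tokens_map.json",
    "tokenizer_config.json",
    "tokenizer.json",
    "flywheel_model.0.safetensors",
]
for name in files:
    s3.upload_file(f"/dev/shm/model_cache/{name}", bucket, f"{prefix}/{name}")
```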
cycy233-l3-e-v2-c4-v9-mkmlizer: loading reward model from rirv938/reward_gpt2_medium_preference_24m_e2
cycy233-l3-e-v2-c4-v9-mkmlizer: Loading 0: 100%|█████████▉| 290/291 [00:11<00:00, 3.24it/s]
cycy233-l3-e-v2-c4-v9-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
cycy233-l3-e-v2-c4-v9-mkmlizer: warnings.warn(
cycy233-l3-e-v2-c4-v9-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
cycy233-l3-e-v2-c4-v9-mkmlizer: warnings.warn(
cycy233-l3-e-v2-c4-v9-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
cycy233-l3-e-v2-c4-v9-mkmlizer: warnings.warn(
cycy233-l3-e-v2-c4-v9-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
cycy233-l3-e-v2-c4-v9-mkmlizer: Saving duration: 0.326s
cycy233-l3-e-v2-c4-v9-mkmlizer: Processed model rirv938/reward_gpt2_medium_preference_24m_e2 in 6.124s
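The reward model (rirv938/reward_gpt2_medium_preference_24m_e2) is what makes best_of: 16 useful at serving time: candidate completions are formatted with the reward_formatter templates from the metadata and the highest-scoring one is returned. The serving stack's real scoring code is not part of this log; the sketch below scores candidates with standard transformers calls, assuming a sequence-classification style reward head:

```python
# Hedged sketch: best-of-N reranking with the reward model named above.
# Assumes a sequence-classification style head; uses only standard transformers calls.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "rirv938/reward_gpt2_medium_preference_24m_e2"
tokenizer = AutoTokenizer.from_pretrained(repo)
reward_model = AutoModelForSequenceClassification.from_pretrained(repo)

def score(conversation: str) -> float:
    # reward_max_token_input is 256 in generation_params above.
    inputs = tokenizer(conversation, truncation=True, max_length=256, return_tensors="pt")
    with torch.no_grad():
        return reward_model(**inputs).logits[0, 0].item()

# Hypothetical candidates produced by best_of: 16 sampling, formatted with the
# reward_formatter templates, then reranked by reward score.
candidates = ["Bot: Hello there!", "Bot: Leave me alone."]
best_reply = max(candidates, key=score)
```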
cycy233-l3-e-v2-c4-v9-mkmlizer: creating bucket guanaco-reward-models
cycy233-l3-e-v2-c4-v9-mkmlizer: Bucket 's3://guanaco-reward-models/' created
cycy233-l3-e-v2-c4-v9-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/cycy233-l3-e-v2-c4-v9_reward
cycy233-l3-e-v2-c4-v9-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/cycy233-l3-e-v2-c4-v9_reward/config.json
cycy233-l3-e-v2-c4-v9-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/cycy233-l3-e-v2-c4-v9_reward/special_tokens_map.json
cycy233-l3-e-v2-c4-v9-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/cycy233-l3-e-v2-c4-v9_reward/tokenizer_config.json
cycy233-l3-e-v2-c4-v9-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/cycy233-l3-e-v2-c4-v9_reward/merges.txt
cycy233-l3-e-v2-c4-v9-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/cycy233-l3-e-v2-c4-v9_reward/vocab.json
cycy233-l3-e-v2-c4-v9-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/cycy233-l3-e-v2-c4-v9_reward/tokenizer.json
cycy233-l3-e-v2-c4-v9-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/cycy233-l3-e-v2-c4-v9_reward/reward.tensors
Job cycy233-l3-e-v2-c4-v9-mkmlizer completed after 86.36s with status: succeeded
Stopping job with name cycy233-l3-e-v2-c4-v9-mkmlizer
Pipeline stage MKMLizer completed in 86.93s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service cycy233-l3-e-v2-c4-v9
Waiting for inference service cycy233-l3-e-v2-c4-v9 to be ready
Inference service cycy233-l3-e-v2-c4-v9 ready after 110.84471917152405s
Pipeline stage ISVCDeployer completed in 111.21s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.0465517044067383s
Received healthy response to inference request in 1.317847728729248s
Received healthy response to inference request in 1.2599751949310303s
Received healthy response to inference request in 1.2333195209503174s
Received healthy response to inference request in 1.3171677589416504s
5 requests
0 failed requests
5th percentile: 1.23865065574646
10th percentile: 1.2439817905426025
20th percentile: 1.2546440601348876
30th percentile: 1.2714137077331542
40th percentile: 1.2942907333374023
50th percentile: 1.3171677589416504
60th percentile: 1.3174397468566894
70th percentile: 1.3177117347717284
80th percentile: 1.4635885238647461
90th percentile: 1.7550701141357423
95th percentile: 1.9008109092712402
99th percentile: 2.0174035453796386
mean time: 1.434972381591797
Pipeline stage StressChecker completed in 7.94s
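The StressChecker statistics are simple order statistics over the five request latencies; the reported values match linear-interpolation percentiles (for example, the 5th percentile 1.23865 = 1.23332 + 0.2 × (1.25998 − 1.23332)). The snippet below reproduces them:

```python
# Reproducing the StressChecker statistics from the five latencies logged above.
# numpy's default (linear-interpolation) percentiles match the reported values.
import numpy as np

latencies = [
    2.0465517044067383,
    1.317847728729248,
    1.2599751949310303,
    1.2333195209503174,
    1.3171677589416504,
]
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print("mean time:", np.mean(latencies))
```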
cycy233-l3-e-v2-c4_v9 status is now deployed due to DeploymentManager action
cycy233-l3-e-v2-c4_v9 status is now inactive due to auto-deactivation of underperforming models
admin requested tearing down of cycy233-l3-e-v2-c4_v9
Running pipeline stage ISVCDeleter
Checking if service cycy233-l3-e-v2-c4-v9 is running
Tearing down inference service cycy233-l3-e-v2-c4-v9
Service cycy233-l3-e-v2-c4-v9 has been torn down
Pipeline stage ISVCDeleter completed in 5.09s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key cycy233-l3-e-v2-c4-v9/config.json from bucket guanaco-mkml-models
Deleting key cycy233-l3-e-v2-c4-v9/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key cycy233-l3-e-v2-c4-v9/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key cycy233-l3-e-v2-c4-v9/tokenizer.json from bucket guanaco-mkml-models
Deleting key cycy233-l3-e-v2-c4-v9/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key cycy233-l3-e-v2-c4-v9_reward/config.json from bucket guanaco-reward-models
Deleting key cycy233-l3-e-v2-c4-v9_reward/merges.txt from bucket guanaco-reward-models
Deleting key cycy233-l3-e-v2-c4-v9_reward/reward.tensors from bucket guanaco-reward-models
Deleting key cycy233-l3-e-v2-c4-v9_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key cycy233-l3-e-v2-c4-v9_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key cycy233-l3-e-v2-c4-v9_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key cycy233-l3-e-v2-c4-v9_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 5.52s
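The MKMLModelDeleter stage removes both the quantized-model keys and the reward-model keys listed above from their buckets. A hedged boto3 sketch of the same cleanup (the stage's real implementation is not shown in the log):

```python
# Hedged sketch: the S3 cleanup performed by the MKMLModelDeleter stage.
# Bucket names and key prefixes come from the deletion lines above.
import boto3

s3 = boto3.client("s3")
deletions = {
    "guanaco-mkml-models": "cycy233-l3-e-v2-c4-v9",
    "guanaco-reward-models": "cycy233-l3-e-v2-c4-v9_reward",
}
for bucket, prefix in deletions.items():
    listing = s3.list_objects_v2(Bucket=bucket, Prefix=prefix + "/")
    for obj in listing.get("Contents", []):
        s3.delete_object(Bucket=bucket, Key=obj["Key"])
```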
cycy233-l3-e-v2-c4_v9 status is now torndown due to DeploymentManager action