submission_id: google-gemma-2-27b-it_v11
developer_uid: zonemercy
best_of: 4
celo_rating: 1190.42
display_name: google-gemma-2-27b-it_v10
family_friendly_score: 0.0
formatter: {'memory_template': '<|im_start|>system\n{memory}<|im_end|>\n', 'prompt_template': '<|im_start|>user\n{prompt}<|im_end|>\n', 'bot_template': '<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n', 'user_template': '<|im_start|>user\n{user_name}: {message}<|im_end|>\n', 'response_template': '<|im_start|>assistant\n{bot_name}:', 'truncate_by_message': True}
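The formatter is a set of Python-style template strings that wrap the persona memory, the scenario prompt, and each chat turn in ChatML-style <|im_start|>/<|im_end|> markers. A minimal sketch of how they might be assembled into a single prompt; the build_prompt helper and the sample conversation are hypothetical, not the platform's actual code:

    # Sketch only: build_prompt and the example values are hypothetical.
    formatter = {
        'memory_template': '<|im_start|>system\n{memory}<|im_end|>\n',
        'prompt_template': '<|im_start|>user\n{prompt}<|im_end|>\n',
        'user_template': '<|im_start|>user\n{user_name}: {message}<|im_end|>\n',
        'bot_template': '<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n',
        'response_template': '<|im_start|>assistant\n{bot_name}:',
    }

    def build_prompt(memory, prompt, turns, bot_name):
        parts = [formatter['memory_template'].format(memory=memory),
                 formatter['prompt_template'].format(prompt=prompt)]
        for role, name, message in turns:  # role is 'user' or 'bot'
            parts.append(formatter[role + '_template']
                         .format(user_name=name, bot_name=name, message=message))
        # The response template leaves the assistant turn open for generation.
        parts.append(formatter['response_template'].format(bot_name=bot_name))
        return ''.join(parts)

    print(build_prompt('A helpful bot.', 'Greetings.',
                       [('user', 'User', 'Hi!')], 'Bot'))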
generation_params: {'temperature': 0.95, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<end_of_turn>', '<|eot_id|>'], 'max_input_tokens': 512, 'best_of': 4, 'max_output_tokens': 64}
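These are standard top-k/nucleus sampling settings, and best_of: 4 means four completions are drawn and reranked, presumably by the reward model listed further down. A hedged sketch of that control flow; generate() and reward_score() are placeholder stand-ins for the serving stack, and only the parameter values come from the submission above:

    import random

    # Placeholder stand-ins; the real serving calls are not shown in this page.
    def generate(prompt, **params):
        return prompt + ' <sampled completion>'   # placeholder for the LLM call

    def reward_score(text):
        return random.random()                    # placeholder for the reward model

    params = dict(temperature=0.95, top_p=1.0, min_p=0.05, top_k=80,
                  presence_penalty=0.0, frequency_penalty=0.0,
                  stopping_words=['\n', '<end_of_turn>', '<|eot_id|>'],
                  max_input_tokens=512, max_output_tokens=64)

    # best_of=4: draw four candidates, keep the one the reward model prefers.
    candidates = [generate('<formatted prompt>', **params) for _ in range(4)]
    best = max(candidates, key=reward_score)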
is_internal_developer: True
language_model: google/gemma-2-27b-it
max_input_tokens: 512
max_output_tokens: 64
model_architecture: Gemma2ForCausalLM
model_group: google/gemma-2-27b-it
model_name: google-gemma-2-27b-it_v10
model_num_parameters: 28731935232.0
model_repo: google/gemma-2-27b-it
model_size: 29B
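The reported size is just the parameter count rounded to the nearest billion:

    print(round(28731935232 / 1e9))  # -> 29, i.e. the "29B" model_size above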
num_battles: 49074
num_wins: 26531
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
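Unlike the ChatML formatter above, the reward_formatter renders the conversation as a flat "Name: message" transcript with a persona header. A short sketch with illustrative values:

    # Sketch only: the sample values are illustrative.
    reward_formatter = {
        'memory_template': "{bot_name}'s Persona: {memory}\n####\n",
        'prompt_template': '{prompt}\n<START>\n',
        'user_template': '{user_name}: {message}\n',
        'bot_template': '{bot_name}: {message}\n',
        'response_template': '{bot_name}:',
    }
    text = (reward_formatter['memory_template'].format(bot_name='Bot', memory='A helpful bot.')
            + reward_formatter['prompt_template'].format(prompt='Greetings.')
            + reward_formatter['user_template'].format(user_name='User', message='Hi!')
            + reward_formatter['response_template'].format(bot_name='Bot'))
    print(text)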
reward_repo: ChaiML/gpt2_xl_pairwise_89m_step_347634
status: torndown
submission_type: basic
timestamp: 2024-07-23T21:03:54+00:00
us_pacific_date: 2024-07-23
win_ratio: 0.5406325141622855
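The win_ratio is simply num_wins divided by num_battles:

    print(26531 / 49074)  # -> 0.5406325141622855, the win_ratio above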
Running pipeline stage MKMLizer
Starting job with name google-gemma-2-27b-it-v11-mkmlizer
Waiting for job on google-gemma-2-27b-it-v11-mkmlizer to finish
google-gemma-2-27b-it-v11-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
google-gemma-2-27b-it-v11-mkmlizer: ║ _____ __ __ ║
google-gemma-2-27b-it-v11-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
google-gemma-2-27b-it-v11-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
google-gemma-2-27b-it-v11-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
google-gemma-2-27b-it-v11-mkmlizer: ║ /___/ ║
google-gemma-2-27b-it-v11-mkmlizer: ║ ║
google-gemma-2-27b-it-v11-mkmlizer: ║ Version: 0.9.5.post3 ║
google-gemma-2-27b-it-v11-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
google-gemma-2-27b-it-v11-mkmlizer: ║ https://mk1.ai ║
google-gemma-2-27b-it-v11-mkmlizer: ║ ║
google-gemma-2-27b-it-v11-mkmlizer: ║ The license key for the current software has been verified as ║
google-gemma-2-27b-it-v11-mkmlizer: ║ belonging to: ║
google-gemma-2-27b-it-v11-mkmlizer: ║ ║
google-gemma-2-27b-it-v11-mkmlizer: ║ Chai Research Corp. ║
google-gemma-2-27b-it-v11-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
google-gemma-2-27b-it-v11-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
google-gemma-2-27b-it-v11-mkmlizer: ║ ║
google-gemma-2-27b-it-v11-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
google-gemma-2-27b-it-v11-mkmlizer: Downloaded to shared memory in 67.087s
google-gemma-2-27b-it-v11-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmplbpsguf5, device:0
google-gemma-2-27b-it-v11-mkmlizer: Saving flywheel model at /dev/shm/model_cache
google-gemma-2-27b-it-v11-mkmlizer: quantized model in 63.746s
google-gemma-2-27b-it-v11-mkmlizer: Processed model google/gemma-2-27b-it in 130.833s
google-gemma-2-27b-it-v11-mkmlizer: creating bucket guanaco-mkml-models
google-gemma-2-27b-it-v11-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
google-gemma-2-27b-it-v11-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/google-gemma-2-27b-it-v11
google-gemma-2-27b-it-v11-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/google-gemma-2-27b-it-v11/config.json
google-gemma-2-27b-it-v11-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/google-gemma-2-27b-it-v11/special_tokens_map.json
google-gemma-2-27b-it-v11-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/google-gemma-2-27b-it-v11/tokenizer_config.json
google-gemma-2-27b-it-v11-mkmlizer: cp /dev/shm/model_cache/tokenizer.model s3://guanaco-mkml-models/google-gemma-2-27b-it-v11/tokenizer.model
google-gemma-2-27b-it-v11-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/google-gemma-2-27b-it-v11/tokenizer.json
google-gemma-2-27b-it-v11-mkmlizer: cp /dev/shm/model_cache/flywheel_model.2.safetensors s3://guanaco-mkml-models/google-gemma-2-27b-it-v11/flywheel_model.2.safetensors
google-gemma-2-27b-it-v11-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/google-gemma-2-27b-it-v11/flywheel_model.0.safetensors
google-gemma-2-27b-it-v11-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/google-gemma-2-27b-it-v11/flywheel_model.1.safetensors
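The tooling behind the per-file cp lines above is not shown; a rough boto3 equivalent of the same upload, offered as an assumption rather than the pipeline's actual code:

    import os
    import boto3

    s3 = boto3.client('s3')
    src = '/dev/shm/model_cache'
    bucket, prefix = 'guanaco-mkml-models', 'google-gemma-2-27b-it-v11'
    for name in sorted(os.listdir(src)):
        path = os.path.join(src, name)
        if os.path.isfile(path):
            # One upload per `cp` line in the log above.
            s3.upload_file(path, bucket, f'{prefix}/{name}')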
google-gemma-2-27b-it-v11-mkmlizer: loading reward model from ChaiML/gpt2_xl_pairwise_89m_step_347634
google-gemma-2-27b-it-v11-mkmlizer: Loading 0: 100%|██████████| 508/508 [00:38<00:00, 18.12it/s]
google-gemma-2-27b-it-v11-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:950: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
google-gemma-2-27b-it-v11-mkmlizer: warnings.warn(
google-gemma-2-27b-it-v11-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:778: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
google-gemma-2-27b-it-v11-mkmlizer: warnings.warn(
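Both FutureWarnings come from passing the deprecated use_auth_token argument while loading the reward model. The fix the warnings suggest is to pass token instead; a sketch, assuming the pairwise reward head loads as a sequence-classification model and that the repo may need an HF access token:

    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    repo = 'ChaiML/gpt2_xl_pairwise_89m_step_347634'
    # use_auth_token=... is deprecated; pass token=... (an HF access token,
    # or None for public repos). Whether this repo needs auth is an assumption.
    tokenizer = AutoTokenizer.from_pretrained(repo, token=None)
    model = AutoModelForSequenceClassification.from_pretrained(repo, token=None)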
google-gemma-2-27b-it-v11-mkmlizer: Downloading shards: 100%|██████████| 2/2 [00:08<00:00, 4.20s/it]
google-gemma-2-27b-it-v11-mkmlizer: Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 3.56it/s]
google-gemma-2-27b-it-v11-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
google-gemma-2-27b-it-v11-mkmlizer: Saving duration: 1.360s
google-gemma-2-27b-it-v11-mkmlizer: Processed model ChaiML/gpt2_xl_pairwise_89m_step_347634 in 13.350s
google-gemma-2-27b-it-v11-mkmlizer: creating bucket guanaco-reward-models
google-gemma-2-27b-it-v11-mkmlizer: Bucket 's3://guanaco-reward-models/' created
google-gemma-2-27b-it-v11-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/google-gemma-2-27b-it-v11_reward
google-gemma-2-27b-it-v11-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/google-gemma-2-27b-it-v11_reward/tokenizer_config.json
google-gemma-2-27b-it-v11-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/google-gemma-2-27b-it-v11_reward/special_tokens_map.json
google-gemma-2-27b-it-v11-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/google-gemma-2-27b-it-v11_reward/config.json
google-gemma-2-27b-it-v11-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/google-gemma-2-27b-it-v11_reward/merges.txt
google-gemma-2-27b-it-v11-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/google-gemma-2-27b-it-v11_reward/vocab.json
google-gemma-2-27b-it-v11-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/google-gemma-2-27b-it-v11_reward/tokenizer.json
google-gemma-2-27b-it-v11-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/google-gemma-2-27b-it-v11_reward/reward.tensors
Job google-gemma-2-27b-it-v11-mkmlizer completed after 178.26s with status: succeeded
Stopping job with name google-gemma-2-27b-it-v11-mkmlizer
Pipeline stage MKMLizer completed in 179.34s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service google-gemma-2-27b-it-v11
Waiting for inference service google-gemma-2-27b-it-v11 to be ready
Failed to get response for submission blend_kupeb_2024-07-19: ('http://neversleep-noromaid-v0-8068-v36-predictor-default.tenant-chaiml-guanaco.knative.ord1.coreweave.cloud/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:34262->127.0.0.1:8080: read: connection reset by peer\n')
Failed to get response for submission undi95-meta-llama-3-70b_6209_v18: ('http://undi95-meta-llama-3-70b-6209-v18-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '{"error":"TypeError : SamplingParameters.__init__() got an unexpected keyword argument \'reward_max_tokens\'"}')
Failed to get response for submission blend_dunet_2024-07-19: ('http://undi95-meta-llama-3-70b-6209-v18-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '{"error":"TypeError : SamplingParameters.__init__() got an unexpected keyword argument \'reward_max_tokens\'"}')
Inference service google-gemma-2-27b-it-v11 ready after 181.50s
Pipeline stage ISVCDeployer completed in 183.59s
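Once ready, the service exposes a KServe-style :predict endpoint with the same shape as the URLs in the failure messages above. A hypothetical request against it; the URL, model name, and payload fields are assumptions, not the platform's documented API:

    import requests

    url = ('http://google-gemma-2-27b-it-v11-predictor'
           '.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict')
    payload = {'text': '<formatted prompt>',          # assumed field names
               'temperature': 0.95, 'max_output_tokens': 64}
    resp = requests.post(url, json=payload, timeout=30)
    print(resp.status_code, resp.text)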
Running pipeline stage StressChecker
Received healthy response to inference request in 2.6654412746429443s
Received healthy response to inference request in 1.737379550933838s
Received healthy response to inference request in 2.2695131301879883s
Received healthy response to inference request in 2.158775806427002s
Received healthy response to inference request in 1.5100932121276855s
5 requests
0 failed requests
5th percentile: 1.555550479888916
10th percentile: 1.6010077476501465
20th percentile: 1.6919222831726075
30th percentile: 1.8216588020324707
40th percentile: 1.9902173042297364
50th percentile: 2.158775806427002
60th percentile: 2.2030707359313966
70th percentile: 2.247365665435791
80th percentile: 2.3486987590789794
90th percentile: 2.507070016860962
95th percentile: 2.586255645751953
99th percentile: 2.649604148864746
mean time: 2.068240594863892
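The summary statistics follow directly from the five response times; numpy's default linear-interpolation percentiles reproduce them:

    import numpy as np

    times = [2.6654412746429443, 1.737379550933838, 2.2695131301879883,
             2.158775806427002, 1.5100932121276855]
    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f'{p}th percentile: {np.percentile(times, p)}')
    print('mean time:', np.mean(times))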
Pipeline stage StressChecker completed in 11.00s
google-gemma-2-27b-it_v11 status is now deployed due to DeploymentManager action
google-gemma-2-27b-it_v11 status is now inactive due to auto deactivation of underperforming models
admin requested tearing down of google-gemma-2-27b-it_v11
Running pipeline stage ISVCDeleter
Checking if service google-gemma-2-27b-it-v11 is running
Tearing down inference service google-gemma-2-27b-it-v11
Service google-gemma-2-27b-it-v11 has been torn down
Pipeline stage ISVCDeleter completed in 4.37s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key google-gemma-2-27b-it-v11/config.json from bucket guanaco-mkml-models
Deleting key google-gemma-2-27b-it-v11/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key google-gemma-2-27b-it-v11/flywheel_model.1.safetensors from bucket guanaco-mkml-models
Deleting key google-gemma-2-27b-it-v11/flywheel_model.2.safetensors from bucket guanaco-mkml-models
Deleting key google-gemma-2-27b-it-v11/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key google-gemma-2-27b-it-v11/tokenizer.json from bucket guanaco-mkml-models
Deleting key google-gemma-2-27b-it-v11/tokenizer.model from bucket guanaco-mkml-models
Deleting key google-gemma-2-27b-it-v11/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key google-gemma-2-27b-it-v11_reward/config.json from bucket guanaco-reward-models
Deleting key google-gemma-2-27b-it-v11_reward/merges.txt from bucket guanaco-reward-models
Deleting key google-gemma-2-27b-it-v11_reward/reward.tensors from bucket guanaco-reward-models
Deleting key google-gemma-2-27b-it-v11_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key google-gemma-2-27b-it-v11_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key google-gemma-2-27b-it-v11_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key google-gemma-2-27b-it-v11_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 8.13s
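The cleanup tooling behind the "Deleting key" lines is likewise not shown; a boto3 sketch of the same bucket cleanup, offered as an assumption:

    import boto3

    s3 = boto3.client('s3')
    for bucket, prefix in [('guanaco-mkml-models', 'google-gemma-2-27b-it-v11/'),
                           ('guanaco-reward-models', 'google-gemma-2-27b-it-v11_reward/')]:
        listing = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
        keys = [{'Key': obj['Key']} for obj in listing.get('Contents', [])]
        if keys:
            # One "Deleting key ..." log line per object above.
            s3.delete_objects(Bucket=bucket, Delete={'Objects': keys})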
google-gemma-2-27b-it_v11 status is now torndown due to DeploymentManager action