submission_id: jellywibble-vampirekingn_2997_v2
developer_uid: chai_backend_admin
alignment_samples: 10713
alignment_score: 0.10713159678789427
best_of: 16
celo_rating: 1210.07
display_name: jellywibble-vampirekingn_2997_v2
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': True}
generation_params: {'temperature': 0.9, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['</s>', '###', 'You:', '\n'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64, 'reward_max_token_input': 256}
is_internal_developer: True
language_model: Jellywibble/VampireKingNoDedupEP4
max_input_tokens: 512
max_output_tokens: 64
model_architecture: MistralForCausalLM
model_group: Jellywibble/VampireKingN
model_name: jellywibble-vampirekingn_2997_v2
model_num_parameters: 12772070400.0
model_repo: Jellywibble/VampireKingNoDedupEP4
model_size: 13B
num_battles: 10713
num_wins: 5139
propriety_score: 0.7150101419878296
propriety_total_count: 986.0
ranking_group: single
reward_repo: ChaiML/gpt2_xl_pairwise_89m_step_347634
status: torndown
submission_type: basic
timestamp: 2024-08-12T22:06:43+00:00
us_pacific_date: 2024-08-12
win_ratio: 0.47969756370764494
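
The formatter field above defines how a conversation is flattened into one prompt string before generation. Below is a minimal sketch of how those templates could be applied; the template strings are copied from the formatter field, but the assembly function itself is an illustrative assumption, not the production formatter.

# Illustrative sketch only: template strings come from the formatter field
# above; this assembly function is an assumption, not the deployed code.
FORMATTER = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """turns: list of (speaker, message) pairs, speaker in {'bot', 'user'}."""
    text = FORMATTER["memory_template"].format(bot_name=bot_name, memory=memory)
    text += FORMATTER["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        template = FORMATTER["bot_template"] if speaker == "bot" else FORMATTER["user_template"]
        text += template.format(bot_name=bot_name, user_name=user_name, message=message)
    # The model continues the text after this trailing "{bot_name}:" marker.
    return text + FORMATTER["response_template"].format(bot_name=bot_name)

With truncate_by_message set to True, whole early messages would presumably be dropped until the assembled prompt fits within max_input_tokens (512).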
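generation_params together with best_of and the reward_repo field describe a sample-and-rerank setup: 16 candidate replies are drawn with the sampling settings shown, the pairwise reward model scores them, and the highest-scoring reply is served. A rough sketch of that loop follows; generate_one and reward_score are hypothetical stand-ins for the serving engine and for ChaiML/gpt2_xl_pairwise_89m_step_347634, so treat this as an illustration of the technique rather than the deployed code.

# Sketch of best-of-N sampling with reward-model reranking.
# generate_one and reward_score are hypothetical placeholders; the deployed
# stack (the MKML/flywheel engine plus the GPT-2 XL pairwise reward model)
# is not reproduced here.
GENERATION_PARAMS = {
    "temperature": 0.9, "top_p": 1.0, "min_p": 0.05, "top_k": 80,
    "presence_penalty": 0.0, "frequency_penalty": 0.0,
    "stopping_words": ["</s>", "###", "You:", "\n"],
    "max_input_tokens": 512, "max_output_tokens": 64,
    "best_of": 16, "reward_max_token_input": 256,
}

def best_of_n(prompt, generate_one, reward_score):
    """Draw best_of candidates and keep the one the reward model prefers."""
    sampling = {k: GENERATION_PARAMS[k] for k in (
        "temperature", "top_p", "min_p", "top_k",
        "presence_penalty", "frequency_penalty", "max_output_tokens")}
    candidates = [generate_one(prompt, **sampling)
                  for _ in range(GENERATION_PARAMS["best_of"])]
    # Per reward_max_token_input, the reward model would see at most 256
    # tokens of context when scoring each candidate.
    return max(candidates, key=lambda c: reward_score(prompt, c))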
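The aggregate statistics above are simple ratios: win_ratio is exactly num_wins / num_battles, and propriety_score times propriety_total_count lands on a whole number (about 705), which suggests it is likewise a count over a total, though that derivation is an inference rather than something stated on this page. A quick check:

# Values copied from the fields above.
num_wins, num_battles = 5139, 10713
assert abs(num_wins / num_battles - 0.47969756370764494) < 1e-9  # win_ratio

# Inference only: propriety_score is consistent with ~705 of 986 rated samples.
print(round(0.7150101419878296 * 986))  # -> 705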
Running pipeline stage MKMLizer
Starting job with name jellywibble-vampirekingn-2997-v2-mkmlizer
Waiting for job on jellywibble-vampirekingn-2997-v2-mkmlizer to finish
Stopping job with name jellywibble-vampirekingn-2997-v2-mkmlizer
%s, retrying in %s seconds...
Starting job with name jellywibble-vampirekingn-2997-v2-mkmlizer
Waiting for job on jellywibble-vampirekingn-2997-v2-mkmlizer to finish
jellywibble-vampirekingn-2997-v2-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
jellywibble-vampirekingn-2997-v2-mkmlizer: ║ [flywheel ASCII-art logo] ║
jellywibble-vampirekingn-2997-v2-mkmlizer: ║ ║
jellywibble-vampirekingn-2997-v2-mkmlizer: ║ Version: 0.9.9 ║
jellywibble-vampirekingn-2997-v2-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
jellywibble-vampirekingn-2997-v2-mkmlizer: ║ https://mk1.ai ║
jellywibble-vampirekingn-2997-v2-mkmlizer: ║ ║
jellywibble-vampirekingn-2997-v2-mkmlizer: ║ The license key for the current software has been verified as ║
jellywibble-vampirekingn-2997-v2-mkmlizer: ║ belonging to: ║
jellywibble-vampirekingn-2997-v2-mkmlizer: ║ ║
jellywibble-vampirekingn-2997-v2-mkmlizer: ║ Chai Research Corp. ║
jellywibble-vampirekingn-2997-v2-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
jellywibble-vampirekingn-2997-v2-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
jellywibble-vampirekingn-2997-v2-mkmlizer: ║ ║
jellywibble-vampirekingn-2997-v2-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
jellywibble-vampirekingn-2997-v2-mkmlizer: Downloaded to shared memory in 42.982s
jellywibble-vampirekingn-2997-v2-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpn9ju89hq, device:0
jellywibble-vampirekingn-2997-v2-mkmlizer: Saving flywheel model at /dev/shm/model_cache
jellywibble-vampirekingn-2997-v2-mkmlizer: quantized model in 35.370s
jellywibble-vampirekingn-2997-v2-mkmlizer: Processed model Jellywibble/VampireKingNoDedupEP4 in 78.352s
jellywibble-vampirekingn-2997-v2-mkmlizer: creating bucket guanaco-mkml-models
jellywibble-vampirekingn-2997-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
jellywibble-vampirekingn-2997-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/jellywibble-vampirekingn-2997-v2
jellywibble-vampirekingn-2997-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/jellywibble-vampirekingn-2997-v2/config.json
jellywibble-vampirekingn-2997-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/jellywibble-vampirekingn-2997-v2/special_tokens_map.json
jellywibble-vampirekingn-2997-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/jellywibble-vampirekingn-2997-v2/tokenizer_config.json
jellywibble-vampirekingn-2997-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/jellywibble-vampirekingn-2997-v2/tokenizer.json
jellywibble-vampirekingn-2997-v2-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/jellywibble-vampirekingn-2997-v2/flywheel_model.0.safetensors
jellywibble-vampirekingn-2997-v2-mkmlizer: loading reward model from ChaiML/gpt2_xl_pairwise_89m_step_347634
jellywibble-vampirekingn-2997-v2-mkmlizer: Loading 0: 100%|█████████▉| 362/363 [00:14<00:00, 32.51it/s]
jellywibble-vampirekingn-2997-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
jellywibble-vampirekingn-2997-v2-mkmlizer: warnings.warn(
jellywibble-vampirekingn-2997-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
jellywibble-vampirekingn-2997-v2-mkmlizer: warnings.warn(
jellywibble-vampirekingn-2997-v2-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
jellywibble-vampirekingn-2997-v2-mkmlizer: warnings.warn(
jellywibble-vampirekingn-2997-v2-mkmlizer: Downloading shards: 100%|██████████| 2/2 [00:08<00:00, 4.08s/it]
jellywibble-vampirekingn-2997-v2-mkmlizer: Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 3.55it/s]
jellywibble-vampirekingn-2997-v2-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
jellywibble-vampirekingn-2997-v2-mkmlizer: Saving duration: 1.362s
jellywibble-vampirekingn-2997-v2-mkmlizer: Processed model ChaiML/gpt2_xl_pairwise_89m_step_347634 in 13.168s
jellywibble-vampirekingn-2997-v2-mkmlizer: creating bucket guanaco-reward-models
jellywibble-vampirekingn-2997-v2-mkmlizer: Bucket 's3://guanaco-reward-models/' created
jellywibble-vampirekingn-2997-v2-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/jellywibble-vampirekingn-2997-v2_reward
jellywibble-vampirekingn-2997-v2-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/jellywibble-vampirekingn-2997-v2_reward/config.json
jellywibble-vampirekingn-2997-v2-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/jellywibble-vampirekingn-2997-v2_reward/special_tokens_map.json
jellywibble-vampirekingn-2997-v2-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/jellywibble-vampirekingn-2997-v2_reward/tokenizer_config.json
jellywibble-vampirekingn-2997-v2-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/jellywibble-vampirekingn-2997-v2_reward/merges.txt
jellywibble-vampirekingn-2997-v2-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/jellywibble-vampirekingn-2997-v2_reward/vocab.json
jellywibble-vampirekingn-2997-v2-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/jellywibble-vampirekingn-2997-v2_reward/tokenizer.json
jellywibble-vampirekingn-2997-v2-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/jellywibble-vampirekingn-2997-v2_reward/reward.tensors
Job jellywibble-vampirekingn-2997-v2-mkmlizer completed after 127.45s with status: succeeded
Stopping job with name jellywibble-vampirekingn-2997-v2-mkmlizer
Pipeline stage MKMLizer completed in 128.77s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service jellywibble-vampirekingn-2997-v2
Waiting for inference service jellywibble-vampirekingn-2997-v2 to be ready
Failed to get response for submission v000000-l3-8b-poppy-moon_1166_v4: ('http://v000000-l3-8b-poppy-moon-1166-v4-predictor-default.tenant-chaiml-guanaco.knative.ord1.coreweave.cloud/v1/models/GPT-J-6B-lit-v2:predict', 'request timeout')
Inference service jellywibble-vampirekingn-2997-v2 ready after 201.19180345535278s
Pipeline stage ISVCDeployer completed in 202.76s
Running pipeline stage StressChecker
Connection pool is full, discarding connection: %s. Connection pool size: %s (repeated 6 times)
Received healthy response to inference request in 2.547001600265503s
Received healthy response to inference request in 1.6821491718292236s
Received healthy response to inference request in 2.2678370475769043s
Received healthy response to inference request in 1.6675419807434082s
Received healthy response to inference request in 1.6918346881866455s
5 requests
0 failed requests
5th percentile: 1.6704634189605714
10th percentile: 1.6733848571777343
20th percentile: 1.6792277336120605
30th percentile: 1.684086275100708
40th percentile: 1.6879604816436768
50th percentile: 1.6918346881866455
60th percentile: 1.922235631942749
70th percentile: 2.1526365756988524
80th percentile: 2.3236699581146243
90th percentile: 2.4353357791900634
95th percentile: 2.491168689727783
99th percentile: 2.5358350181579588
mean time: 1.9712728977203369
Pipeline stage StressChecker completed in 10.64s
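The percentile and mean figures above can be reproduced from the five healthy response times using NumPy's default linear-interpolation percentiles; the stress checker's own implementation is not shown in this log, but the numbers agree to floating-point precision.

import numpy as np

# The five healthy response times reported above, in seconds.
latencies = [2.547001600265503, 1.6821491718292236, 2.2678370475769043,
             1.6675419807434082, 1.6918346881866455]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print("mean time:", np.mean(latencies))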
jellywibble-vampirekingn_2997_v2 status is now deployed due to DeploymentManager action
jellywibble-vampirekingn_2997_v2 status is now inactive due to auto deactivation of underperforming models
jellywibble-vampirekingn_2997_v2 status is now torndown due to DeploymentManager action

Usage Metrics

Latency Metrics