submission_id: chaiml-sao10k-l3-rp-v3-3_v51
developer_uid: robert_irvine
best_of: 16
display_name: chaiml-sao10k-l3-rp-v3-3_v51
family_friendly_score: 0.0
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.95, 'top_p': 0.95, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|end_header_id|>,', '<|eot_id|>,', '\n\n{user_name}'], 'max_input_tokens': 1024, 'best_of': 16, 'max_output_tokens': 64, 'reward_max_token_input': 512}
ineligible_reason: num_battles<5000
is_internal_developer: True
language_model: ChaiML/sao10k-l3-rp-v3-3
max_input_tokens: 1024
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: ChaiML/sao10k-l3-rp-v3-3
model_name: chaiml-sao10k-l3-rp-v3-3_v51
model_num_parameters: 8030261248.0
model_repo: ChaiML/sao10k-l3-rp-v3-3
model_size: 8B
num_battles: 1
num_wins: 0
ranking_group: single
reward_formatter: {'bot_template': 'Bot: {message}\n', 'memory_template': '', 'prompt_template': '', 'response_template': 'Bot:', 'truncate_by_message': False, 'user_template': 'User: {message}\n'}
reward_repo: rirv938/gpt2_improves_the_story_512
status: torndown
submission_type: basic
timestamp: 2024-07-24T00:08:11+00:00
us_pacific_date: 2024-07-23
win_ratio: 0.0
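The formatter templates in the metadata above assemble the conversation into a Llama-3-style prompt. A minimal sketch of how that assembly could work (the `render_prompt` helper and the sample names are illustrative, not part of the submission code):

```python
# Templates copied from the submission's `formatter` field above.
formatter = {
    "memory_template": "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n",
    "prompt_template": "{prompt}<|eot_id|>",
    "bot_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>",
    "user_template": "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>",
    "response_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:",
}

def render_prompt(bot_name, user_name, memory, prompt, turns):
    """Concatenate: system memory, scenario prompt, alternating turns, then the
    open-ended assistant header that the model completes (hypothetical helper)."""
    parts = [
        formatter["memory_template"].format(bot_name=bot_name, memory=memory),
        formatter["prompt_template"].format(prompt=prompt),
    ]
    for role, message in turns:
        template = formatter["bot_template"] if role == "bot" else formatter["user_template"]
        parts.append(template.format(bot_name=bot_name, user_name=user_name, message=message))
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)

text = render_prompt(
    "Mila", "Alex",
    memory="A cheerful traveler.",
    prompt="A chance meeting at a station.",
    turns=[("user", "Hi there!"), ("bot", "Oh, hello!")],
)
```

The rendered string begins with `<|begin_of_text|>` and ends with the bare `{bot_name}:` response prefix, which is why generation stops on `\n` and `<|eot_id|>` per the `stopping_words` above.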
Resubmit model
Running pipeline stage MKMLizer
Starting job with name chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer
Waiting for job on chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer to finish
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ║ _____ __ __ ║
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ║ /___/ ║
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ║ ║
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ║ Version: 0.9.5.post3 ║
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ║ https://mk1.ai ║
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ║ ║
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ║ The license key for the current software has been verified as ║
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ║ belonging to: ║
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ║ ║
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ║ Chai Research Corp. ║
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ║ ║
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: Downloaded to shared memory in 21.954s
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmppuokw49q, device:0
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: Saving flywheel model at /dev/shm/model_cache
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: Loading 0: 97%|█████████▋| 283/291 [00:07<00:00, 100.25it/s]
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: quantized model in 22.287s
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: Processed model ChaiML/sao10k-l3-rp-v3-3 in 44.241s
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: creating bucket guanaco-mkml-models
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/chaiml-sao10k-l3-rp-v3-3-v51
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/chaiml-sao10k-l3-rp-v3-3-v51/config.json
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/chaiml-sao10k-l3-rp-v3-3-v51/special_tokens_map.json
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/chaiml-sao10k-l3-rp-v3-3-v51/tokenizer_config.json
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/chaiml-sao10k-l3-rp-v3-3-v51/tokenizer.json
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/chaiml-sao10k-l3-rp-v3-3-v51/flywheel_model.0.safetensors
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: loading reward model from rirv938/gpt2_improves_the_story_512
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:950: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: warnings.warn(
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:778: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: warnings.warn(
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: warnings.warn(
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: Downloading shards: 100%|██████████| 2/2 [00:08<00:00, 4.09s/it]
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 3.64it/s]
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: Saving duration: 1.311s
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: Processed model rirv938/gpt2_improves_the_story_512 in 12.877s
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: creating bucket guanaco-reward-models
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: Bucket 's3://guanaco-reward-models/' created
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/chaiml-sao10k-l3-rp-v3-3-v51_reward
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/chaiml-sao10k-l3-rp-v3-3-v51_reward/tokenizer.json
chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/chaiml-sao10k-l3-rp-v3-3-v51_reward/reward.tensors
Job chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer completed after 85.43s with status: succeeded
Stopping job with name chaiml-sao10k-l3-rp-v3-3-v51-mkmlizer
Pipeline stage MKMLizer completed in 86.40s
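The reward model packaged in this stage is used at serving time to rerank the `best_of: 16` sampled candidates, with each candidate rendered through the `reward_formatter` templates from the metadata. A sketch of that selection step, assuming a generic scoring callable (`score_fn` and both helper names are hypothetical stand-ins, not the real serving API):

```python
def format_for_reward(history, candidate):
    """Render the conversation with the reward_formatter templates:
    'User: ...' / 'Bot: ...' turns, ending with the bare 'Bot:' response
    prefix followed by the candidate completion."""
    lines = [
        ("Bot: " if role == "bot" else "User: ") + message + "\n"
        for role, message in history
    ]
    return "".join(lines) + "Bot:" + candidate

def pick_best(history, candidates, score_fn):
    """Best-of-N selection: return the candidate the reward model scores highest."""
    return max(candidates, key=lambda c: score_fn(format_for_reward(history, c)))

# Illustrative use with a dummy scorer that just prefers longer replies.
best = pick_best(
    [("user", "Hi there!")],
    [" Hey.", " Hey, fancy meeting you here!"],
    score_fn=len,
)
```

In production the scorer would be the GPT-2-based reward model loaded above (`rirv938/gpt2_improves_the_story_512`), truncated to `reward_max_token_input: 512` tokens.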
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.08s
Running pipeline stage ISVCDeployer
Creating inference service chaiml-sao10k-l3-rp-v3-3-v51
Waiting for inference service chaiml-sao10k-l3-rp-v3-3-v51 to be ready
Connection pool is full, discarding connection.
Inference service chaiml-sao10k-l3-rp-v3-3-v51 ready after 60.85094690322876s
Pipeline stage ISVCDeployer completed in 62.44s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.2713096141815186s
Received healthy response to inference request in 1.3454558849334717s
Received healthy response to inference request in 1.3729465007781982s
Received healthy response to inference request in 1.4198863506317139s
Received healthy response to inference request in 1.300541877746582s
5 requests
0 failed requests
5th percentile: 1.3095246791839599
10th percentile: 1.318507480621338
20th percentile: 1.3364730834960938
30th percentile: 1.350954008102417
40th percentile: 1.3619502544403077
50th percentile: 1.3729465007781982
60th percentile: 1.3917224407196045
70th percentile: 1.4104983806610107
80th percentile: 1.590171003341675
90th percentile: 1.9307403087615969
95th percentile: 2.1010249614715573
99th percentile: 2.237252683639526
mean time: 1.542028045654297
Pipeline stage StressChecker completed in 8.39s
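The percentile figures above are consistent with linear-interpolation percentiles over the five request latencies (the convention used by numpy's default method). A pure-Python sketch reproducing them (the `percentile` helper is illustrative):

```python
def percentile(values, pct):
    """Linear-interpolation percentile over a small sorted sample."""
    xs = sorted(values)
    pos = (pct / 100.0) * (len(xs) - 1)
    lo = int(pos)
    frac = pos - lo
    if lo + 1 < len(xs):
        return xs[lo] + frac * (xs[lo + 1] - xs[lo])
    return xs[lo]

# The five healthy-response latencies from the StressChecker log, in seconds.
latencies = [
    2.2713096141815186,
    1.3454558849334717,
    1.3729465007781982,
    1.4198863506317139,
    1.300541877746582,
]

p50 = percentile(latencies, 50)          # 1.3729465007781982 (the median)
p95 = percentile(latencies, 95)          # ≈ 2.1010, matching the logged 95th percentile
mean_time = sum(latencies) / len(latencies)
```

With only five samples, the upper percentiles interpolate between the two slowest requests, which is why the 99th percentile sits just under the 2.27 s maximum.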
chaiml-sao10k-l3-rp-v3-3_v51 status is now deployed due to DeploymentManager action
chaiml-sao10k-l3-rp-v3-3_v51 status is now inactive due to admin request
admin requested tearing down of chaiml-sao10k-l3-rp-v3-3_v51
Running pipeline stage ISVCDeleter
Checking if service chaiml-sao10k-l3-rp-v3-3-v51 is running
Tearing down inference service chaiml-sao10k-l3-rp-v3-3-v51
Service chaiml-sao10k-l3-rp-v3-3-v51 has been torndown
Pipeline stage ISVCDeleter completed in 4.71s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key chaiml-sao10k-l3-rp-v3-3-v51/config.json from bucket guanaco-mkml-models
Deleting key chaiml-sao10k-l3-rp-v3-3-v51/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key chaiml-sao10k-l3-rp-v3-3-v51/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key chaiml-sao10k-l3-rp-v3-3-v51/tokenizer.json from bucket guanaco-mkml-models
Deleting key chaiml-sao10k-l3-rp-v3-3-v51/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key chaiml-sao10k-l3-rp-v3-3-v51_reward/config.json from bucket guanaco-reward-models
Deleting key chaiml-sao10k-l3-rp-v3-3-v51_reward/merges.txt from bucket guanaco-reward-models
Deleting key chaiml-sao10k-l3-rp-v3-3-v51_reward/reward.tensors from bucket guanaco-reward-models
Deleting key chaiml-sao10k-l3-rp-v3-3-v51_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key chaiml-sao10k-l3-rp-v3-3-v51_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key chaiml-sao10k-l3-rp-v3-3-v51_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key chaiml-sao10k-l3-rp-v3-3-v51_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 5.25s
chaiml-sao10k-l3-rp-v3-3_v51 status is now torndown due to DeploymentManager action