submission_id: nitral-ai-hathor-tahsin_6217_v12
developer_uid: Nitral-AI
best_of: 16
celo_rating: 1245.82
display_name: nitral-ai-hathor-l3-8b-v-01_v12
family_friendly_score: 0.0
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.25, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 60, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|eot_id|>'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64, 'reward_max_token_input': 256}
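For reference, the formatter above wraps each conversation field in Llama-3 chat markup (<|start_header_id|> / <|eot_id|>), and generation_params holds the per-request sampling settings. Below is a minimal, hypothetical sketch of how such a prompt could be assembled; the persona, names, and messages are invented, and no particular inference library is implied.

formatter = {
    "memory_template": "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n",
    "prompt_template": "{prompt}<|eot_id|>",
    "user_template": "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>",
    "bot_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>",
    "response_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    # Persona and scenario first, then alternating turns, then the cue for the bot's reply.
    text = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    text += formatter["prompt_template"].format(prompt=prompt)
    for role, message in turns:  # role is "user" or "bot"
        template = formatter["user_template"] if role == "user" else formatter["bot_template"]
        text += template.format(user_name=user_name, bot_name=bot_name, message=message)
    return text + formatter["response_template"].format(bot_name=bot_name)

# Example with invented content:
prompt_text = build_prompt(
    bot_name="Hathor", user_name="User",
    memory="A warm, curious companion.",
    prompt="A casual chat on a rainy evening.",
    turns=[("user", "Hi there!"), ("bot", "Hello! How are you?"), ("user", "Doing well.")],
)

# Sampling settings copied from generation_params above, as a plain dict:
sampling = {"temperature": 1.25, "top_p": 1.0, "min_p": 0.05, "top_k": 60,
            "best_of": 16, "max_output_tokens": 64, "stop": ["\n", "<|eot_id|>"]}

With max_input_tokens set to 512, the assembled prompt would presumably be truncated to the most recent 512 tokens before generation.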
is_internal_developer: False
language_model: Nitral-AI/Hathor_Tahsin-L3-8B-v0.85
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: Nitral-AI/Hathor_Tahsin-
model_name: nitral-ai-hathor-l3-8b-v-01_v12
model_num_parameters: 8030261248.0
model_repo: Nitral-AI/Hathor_Tahsin-L3-8B-v0.85
model_size: 8B
num_battles: 15039
num_wins: 8659
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: ChaiML/gpt2_xl_pairwise_89m_step_347634
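The reward_formatter follows the same substitution scheme but produces the plainer persona / #### / <START> layout scored by the reward model in reward_repo. A small hypothetical example (all content invented):

reward_formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "user_template": "{user_name}: {message}\n",
    "bot_template": "{bot_name}: {message}\n",
    "response_template": "{bot_name}:",
}

reward_text = (
    reward_formatter["memory_template"].format(bot_name="Hathor", memory="A warm, curious companion.")
    + reward_formatter["prompt_template"].format(prompt="A casual chat on a rainy evening.")
    + reward_formatter["user_template"].format(user_name="User", message="Hi there!")
    + reward_formatter["bot_template"].format(bot_name="Hathor", message="Hello! How are you?")
    + reward_formatter["response_template"].format(bot_name="Hathor")
)
# reward_max_token_input: 256 in generation_params above suggests this text is capped
# at 256 tokens before scoring (an assumption, not confirmed by the log).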
status: torndown
submission_type: basic
timestamp: 2024-07-25T11:25:33+00:00
us_pacific_date: 2024-07-25
win_ratio: 0.5757696655362724
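win_ratio is simply num_wins divided by num_battles, which checks out against the figures above:

num_battles, num_wins = 15039, 8659
print(num_wins / num_battles)  # 0.5757696655362724, the reported win_ratio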
Running pipeline stage MKMLizer
Starting job with name nitral-ai-hathor-tahsin-6217-v12-mkmlizer
Waiting for job on nitral-ai-hathor-tahsin-6217-v12-mkmlizer to finish
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ║ _____ __ __ ║
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ║ /___/ ║
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ║ ║
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ║ Version: 0.9.7 ║
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ║ https://mk1.ai ║
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ║ ║
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ║ The license key for the current software has been verified as ║
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ║ belonging to: ║
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ║ ║
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ║ Chai Research Corp. ║
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ║ ║
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: Downloaded to shared memory in 24.056s
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp0wjz3bst, device:0
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: Saving flywheel model at /dev/shm/model_cache
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: quantized model in 27.055s
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: Processed model Nitral-AI/Hathor_Tahsin-L3-8B-v0.85 in 51.112s
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: creating bucket guanaco-mkml-models
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/nitral-ai-hathor-tahsin-6217-v12
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/nitral-ai-hathor-tahsin-6217-v12/config.json
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/nitral-ai-hathor-tahsin-6217-v12/special_tokens_map.json
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/nitral-ai-hathor-tahsin-6217-v12/tokenizer_config.json
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/nitral-ai-hathor-tahsin-6217-v12/flywheel_model.0.safetensors
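The cp lines above stage each file of the quantized model under the submission's prefix in the guanaco-mkml-models bucket. A rough equivalent with boto3 (an assumption; the log does not show which client the mkmlizer actually uses):

import os
import boto3

s3 = boto3.client("s3")
src = "/dev/shm/model_cache"
bucket = "guanaco-mkml-models"
prefix = "nitral-ai-hathor-tahsin-6217-v12"

# Upload every file in the model cache under the submission's prefix.
for name in os.listdir(src):
    path = os.path.join(src, name)
    if os.path.isfile(path):
        s3.upload_file(path, bucket, f"{prefix}/{name}")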
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: loading reward model from ChaiML/gpt2_xl_pairwise_89m_step_347634
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: Loading 0: 0%|          | 0/291 [00:00<?, ?it/s]
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: Loading 0: 98%|█████████▊| 285/291 [00:11<00:00, 34.33it/s]
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: warnings.warn(
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: warnings.warn(
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: warnings.warn(
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: Downloading shards: 100%|██████████| 2/2 [00:08<00:00, 4.03s/it]
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 3.54it/s]
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: Saving duration: 1.339s
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: Processed model ChaiML/gpt2_xl_pairwise_89m_step_347634 in 13.072s
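The FutureWarnings above are emitted because the reward model is loaded with the deprecated use_auth_token argument. A minimal sketch of the equivalent load with the newer token argument (using AutoModelForSequenceClassification is an assumption, since the log does not show which Auto class is used):

import os
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "ChaiML/gpt2_xl_pairwise_89m_step_347634"
hf_token = os.environ.get("HF_TOKEN")  # `token` replaces the deprecated `use_auth_token`

tokenizer = AutoTokenizer.from_pretrained(repo, token=hf_token)
reward_model = AutoModelForSequenceClassification.from_pretrained(repo, token=hf_token)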
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: creating bucket guanaco-reward-models
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/nitral-ai-hathor-tahsin-6217-v12_reward/config.json
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/nitral-ai-hathor-tahsin-6217-v12_reward/merges.txt
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/nitral-ai-hathor-tahsin-6217-v12_reward/vocab.json
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/nitral-ai-hathor-tahsin-6217-v12_reward/tokenizer.json
nitral-ai-hathor-tahsin-6217-v12-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/nitral-ai-hathor-tahsin-6217-v12_reward/reward.tensors
Job nitral-ai-hathor-tahsin-6217-v12-mkmlizer completed after 94.99s with status: succeeded
Stopping job with name nitral-ai-hathor-tahsin-6217-v12-mkmlizer
Pipeline stage MKMLizer completed in 95.98s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.09s
Running pipeline stage ISVCDeployer
Creating inference service nitral-ai-hathor-tahsin-6217-v12
Waiting for inference service nitral-ai-hathor-tahsin-6217-v12 to be ready
Inference service nitral-ai-hathor-tahsin-6217-v12 ready after 131.28806328773499s
Pipeline stage ISVCDeployer completed in 132.99s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.280836343765259s
Received healthy response to inference request in 1.484306812286377s
Received healthy response to inference request in 1.4293897151947021s
Received healthy response to inference request in 1.4424715042114258s
Received healthy response to inference request in 1.4802491664886475s
5 requests
0 failed requests
5th percentile: 1.4320060729980468
10th percentile: 1.4346224308013915
20th percentile: 1.439855146408081
30th percentile: 1.4500270366668702
40th percentile: 1.4651381015777587
50th percentile: 1.4802491664886475
60th percentile: 1.4818722248077392
70th percentile: 1.483495283126831
80th percentile: 1.6436127185821534
90th percentile: 1.962224531173706
95th percentile: 2.1215304374694823
99th percentile: 2.2489751625061034
mean time: 1.6234507083892822
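The percentiles and mean above can be reproduced from the five response times with linearly interpolated percentiles (numpy's default method):

import numpy as np

times = [2.280836343765259, 1.484306812286377, 1.4293897151947021,
         1.4424715042114258, 1.4802491664886475]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(times, p)}")
print("mean time:", np.mean(times))  # 1.6234507083892822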
Pipeline stage StressChecker completed in 8.72s
nitral-ai-hathor-tahsin_6217_v12 status is now deployed due to DeploymentManager action
nitral-ai-hathor-tahsin_6217_v12 status is now inactive due to auto deactivation of underperforming models
admin requested tearing down of nitral-ai-hathor-tahsin_6217_v12
Running pipeline stage ISVCDeleter
Checking if service nitral-ai-hathor-tahsin-6217-v12 is running
Tearing down inference service nitral-ai-hathor-tahsin-6217-v12
Service nitral-ai-hathor-tahsin-6217-v12 has been torn down
Pipeline stage ISVCDeleter completed in 4.95s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key nitral-ai-hathor-tahsin-6217-v12/config.json from bucket guanaco-mkml-models
Deleting key nitral-ai-hathor-tahsin-6217-v12/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key nitral-ai-hathor-tahsin-6217-v12/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key nitral-ai-hathor-tahsin-6217-v12/tokenizer.json from bucket guanaco-mkml-models
Deleting key nitral-ai-hathor-tahsin-6217-v12/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key nitral-ai-hathor-tahsin-6217-v12_reward/config.json from bucket guanaco-reward-models
Deleting key nitral-ai-hathor-tahsin-6217-v12_reward/merges.txt from bucket guanaco-reward-models
Deleting key nitral-ai-hathor-tahsin-6217-v12_reward/reward.tensors from bucket guanaco-reward-models
Deleting key nitral-ai-hathor-tahsin-6217-v12_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key nitral-ai-hathor-tahsin-6217-v12_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key nitral-ai-hathor-tahsin-6217-v12_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key nitral-ai-hathor-tahsin-6217-v12_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 5.60s
nitral-ai-hathor-tahsin_6217_v12 status is now torndown due to DeploymentManager action