submission_id: nitral-ai-hathor-tahsin_6217_v14
developer_uid: Nitral-AI
alignment_samples: 0
best_of: 16
celo_rating: 1248.72
display_name: nitral-ai-hathor-l3-8b-v-01_v12
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
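The formatter above stitches Llama-3-style templates into a single prompt string. A minimal sketch of that assembly, assuming the templates are applied in order (persona memory, scenario prompt, chat turns, then the open response header the model completes); the persona, names, and messages are invented examples:

```python
# Templates copied from the formatter field above.
memory_template = ("<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
                   "{bot_name}'s Persona: {memory}\n\n")
prompt_template = "{prompt}<|eot_id|>"
user_template = ("<|start_header_id|>user<|end_header_id|>\n\n"
                 "{user_name}: {message}<|eot_id|>")
bot_template = ("<|start_header_id|>assistant<|end_header_id|>\n\n"
                "{bot_name}: {message}<|eot_id|>")
response_template = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:"

def build_prompt(bot_name, memory, prompt, turns):
    """Concatenate the templates: memory, scenario prompt, chat turns,
    then the open-ended assistant header the model is asked to complete."""
    parts = [memory_template.format(bot_name=bot_name, memory=memory),
             prompt_template.format(prompt=prompt)]
    for speaker, name, message in turns:
        if speaker == "user":
            parts.append(user_template.format(user_name=name, message=message))
        else:
            parts.append(bot_template.format(bot_name=name, message=message))
    parts.append(response_template.format(bot_name=bot_name))
    return "".join(parts)
```

With `truncate_by_message: False`, truncation to the 512-token input budget would happen on the final string rather than per message.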
generation_params: {'temperature': 1.2, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|eot_id|>'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64, 'reward_max_token_input': 256}
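The sampling parameters above combine temperature, top-k, and min-p filtering. The serving stack's exact filter order is not shown in this log; the sketch below is one common interpretation (temperature scaling, then top-k, then min-p relative to the most likely surviving token), not a definitive implementation:

```python
import math
import random

def sample_token(logits, temperature=1.2, top_k=40, min_p=0.05, rng=None):
    """Sample a token index from raw logits using the generation_params above.

    Assumed pipeline: scale by temperature, softmax, keep the top_k most
    likely tokens, drop tokens below min_p * (max surviving probability),
    then draw from the renormalised distribution.
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]          # stable softmax
    total = sum(exps)
    probs = [e / total for e in exps]
    # top_k: keep only the k most likely token indices.
    keep = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:top_k]
    # min_p: drop tokens far less likely than the current best.
    pmax = max(probs[i] for i in keep)
    keep = [i for i in keep if probs[i] >= min_p * pmax]
    # Renormalise over survivors and sample.
    z = sum(probs[i] for i in keep)
    r = (rng or random).random() * z
    acc = 0.0
    for i in keep:
        acc += probs[i]
        if acc >= r:
            return i
    return keep[-1]
```

With a strongly peaked distribution, min-p prunes everything but the top token, so even at temperature 1.2 the sample is deterministic.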
is_internal_developer: False
language_model: Nitral-AI/Hathor_Tahsin-L3-8B-v0.85
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: Nitral-AI/Hathor_Tahsin-
model_name: nitral-ai-hathor-l3-8b-v-01_v12
model_num_parameters: 8030261248.0
model_repo: Nitral-AI/Hathor_Tahsin-L3-8B-v0.85
model_size: 8B
num_battles: 8691
num_wins: 4912
propriety_score: 0.7150964812712827
propriety_total_count: 881.0
ranking_group: single
reward_formatter: {'bot_template': '{bot_name}: {message}\n', 'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'response_template': '{bot_name}:', 'truncate_by_message': False, 'user_template': '{user_name}: {message}\n'}
reward_repo: ChaiML/gpt2_xl_pairwise_89m_step_347634
status: torndown
submission_type: basic
timestamp: 2024-07-27T01:21:28+00:00
us_pacific_date: 2024-07-26
win_ratio: 0.5651823725693246
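The win_ratio field appears to be num_wins / num_battles from the fields above; a quick arithmetic check under that assumption:

```python
num_battles = 8691
num_wins = 4912
win_ratio = num_wins / num_battles  # ~0.5652, matching the win_ratio field
```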
Running pipeline stage MKMLizer
Starting job with name nitral-ai-hathor-tahsin-6217-v14-mkmlizer
Waiting for job on nitral-ai-hathor-tahsin-6217-v14-mkmlizer to finish
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ║ _____ __ __ ║
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ║ /___/ ║
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ║ ║
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ║ Version: 0.9.7 ║
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ║ https://mk1.ai ║
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ║ ║
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ║ The license key for the current software has been verified as ║
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ║ belonging to: ║
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ║ ║
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ║ Chai Research Corp. ║
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ║ ║
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: Downloaded to shared memory in 21.510s
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpxfvorxs7, device:0
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Failed to get response for submission shuttleai-shuttle-2-5-1-_5730_v8: ('http://shuttleai-shuttle-2-5-1-5730-v8-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'request timeout')
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: quantized model in 26.102s
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: Processed model Nitral-AI/Hathor_Tahsin-L3-8B-v0.85 in 47.611s
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: creating bucket guanaco-mkml-models
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/nitral-ai-hathor-tahsin-6217-v14
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/nitral-ai-hathor-tahsin-6217-v14/config.json
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/nitral-ai-hathor-tahsin-6217-v14/special_tokens_map.json
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/nitral-ai-hathor-tahsin-6217-v14/tokenizer_config.json
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/nitral-ai-hathor-tahsin-6217-v14/tokenizer.json
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/nitral-ai-hathor-tahsin-6217-v14/flywheel_model.0.safetensors
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: warnings.warn(
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: warnings.warn(
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: Downloading shards: 100%|██████████| 2/2 [00:08<00:00, 4.18s/it]
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 3.52it/s]
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: Saving duration: 1.319s
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: Processed model ChaiML/gpt2_xl_pairwise_89m_step_347634 in 13.143s
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: creating bucket guanaco-reward-models
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: Bucket 's3://guanaco-reward-models/' created
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/nitral-ai-hathor-tahsin-6217-v14_reward
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/nitral-ai-hathor-tahsin-6217-v14_reward/config.json
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/nitral-ai-hathor-tahsin-6217-v14_reward/special_tokens_map.json
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/nitral-ai-hathor-tahsin-6217-v14_reward/tokenizer_config.json
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/nitral-ai-hathor-tahsin-6217-v14_reward/merges.txt
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/nitral-ai-hathor-tahsin-6217-v14_reward/vocab.json
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/nitral-ai-hathor-tahsin-6217-v14_reward/tokenizer.json
nitral-ai-hathor-tahsin-6217-v14-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/nitral-ai-hathor-tahsin-6217-v14_reward/reward.tensors
Job nitral-ai-hathor-tahsin-6217-v14-mkmlizer completed after 96.11s with status: succeeded
Stopping job with name nitral-ai-hathor-tahsin-6217-v14-mkmlizer
Pipeline stage MKMLizer completed in 97.28s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service nitral-ai-hathor-tahsin-6217-v14
Waiting for inference service nitral-ai-hathor-tahsin-6217-v14 to be ready
Inference service nitral-ai-hathor-tahsin-6217-v14 ready after 90.62319755554199s
Pipeline stage ISVCDeployer completed in 92.26s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.2377102375030518s
Received healthy response to inference request in 1.4843800067901611s
Received healthy response to inference request in 1.4829292297363281s
Received healthy response to inference request in 1.3877003192901611s
Received healthy response to inference request in 1.4812839031219482s
5 requests
0 failed requests
5th percentile: 1.4064170360565185
10th percentile: 1.425133752822876
20th percentile: 1.462567186355591
30th percentile: 1.4816129684448243
40th percentile: 1.482271099090576
50th percentile: 1.4829292297363281
60th percentile: 1.4835095405578613
70th percentile: 1.4840898513793945
80th percentile: 1.6350460529327393
90th percentile: 1.9363781452178956
95th percentile: 2.0870441913604734
99th percentile: 2.207577028274536
mean time: 1.6148007392883301
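The percentile figures above can be reproduced from the five request latencies using linear interpolation between sorted order statistics (the same convention as numpy.percentile's default); a small sketch:

```python
# Latencies copied from the five stress-check responses above.
latencies = [2.2377102375030518, 1.4843800067901611, 1.4829292297363281,
             1.3877003192901611, 1.4812839031219482]

def percentile(xs, p):
    """p-th percentile via linear interpolation between order statistics."""
    xs = sorted(xs)
    k = (len(xs) - 1) * p / 100.0        # fractional rank
    lo = int(k)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (k - lo) * (xs[hi] - xs[lo])

median = percentile(latencies, 50)       # equals the 50th percentile above
mean_time = sum(latencies) / len(latencies)
```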
Pipeline stage StressChecker completed in 8.82s
nitral-ai-hathor-tahsin_6217_v14 status is now deployed due to DeploymentManager action
nitral-ai-hathor-tahsin_6217_v14 status is now inactive due to auto-deactivation of underperforming models
admin requested tearing down of nitral-ai-hathor-tahsin_6217_v14
Running pipeline stage ISVCDeleter
Checking if service nitral-ai-hathor-tahsin-6217-v14 is running
Tearing down inference service nitral-ai-hathor-tahsin-6217-v14
Service nitral-ai-hathor-tahsin-6217-v14 has been torn down
Pipeline stage ISVCDeleter completed in 4.89s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key nitral-ai-hathor-tahsin-6217-v14/config.json from bucket guanaco-mkml-models
Deleting key nitral-ai-hathor-tahsin-6217-v14/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key nitral-ai-hathor-tahsin-6217-v14/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key nitral-ai-hathor-tahsin-6217-v14/tokenizer.json from bucket guanaco-mkml-models
Deleting key nitral-ai-hathor-tahsin-6217-v14/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key nitral-ai-hathor-tahsin-6217-v14_reward/config.json from bucket guanaco-reward-models
Deleting key nitral-ai-hathor-tahsin-6217-v14_reward/merges.txt from bucket guanaco-reward-models
Deleting key nitral-ai-hathor-tahsin-6217-v14_reward/reward.tensors from bucket guanaco-reward-models
Deleting key nitral-ai-hathor-tahsin-6217-v14_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key nitral-ai-hathor-tahsin-6217-v14_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key nitral-ai-hathor-tahsin-6217-v14_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key nitral-ai-hathor-tahsin-6217-v14_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 6.00s
nitral-ai-hathor-tahsin_6217_v14 status is now torndown due to DeploymentManager action

Usage Metrics

Latency Metrics