submission_id: zeuslabs-l3-aethora-15b-v2_v7
developer_uid: alexdaoud
status: inactive
model_repo: ZeusLabs/L3-Aethora-15B-V2
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
generation_params: {'temperature': 0.95, 'top_p': 1.0, 'min_p': 0.08, 'top_k': 40, 'presence_penalty': 0.25, 'frequency_penalty': 0.1, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 4, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
timestamp: 2024-06-29T00:39:43+00:00
model_name: zeuslabs-l3-aethora-15b-v2_v2
model_group: ZeusLabs/L3-Aethora-15B-
num_battles: 28552
num_wins: 13852
celo_rating: 1162.36
propriety_score: 0.7063392262741327
propriety_total_count: 13519.0
submission_type: basic
model_architecture: LlamaForCausalLM
model_num_parameters: 15009845248.0
best_of: 4
max_input_tokens: 512
max_output_tokens: 64
display_name: zeuslabs-l3-aethora-15b-v2_v2
ineligible_reason: None
language_model: ZeusLabs/L3-Aethora-15B-V2
model_size: 15B
reward_model: ChaiML/reward_gpt2_medium_preference_24m_e2
us_pacific_date: 2024-06-28
win_ratio: 0.48514990193331464
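The formatter and reward_formatter above flatten each conversation into a single prompt string, and win_ratio is simply num_wins / num_battles (13852 / 28552 ≈ 0.4851). Below is a minimal sketch of that prompt assembly; build_prompt and the turn structure are hypothetical illustrations, and only the template strings come from this submission.

# Sketch of prompt assembly from the formatter templates above.
# build_prompt and the (speaker, message) turn format are assumptions;
# the template strings are copied verbatim from the submission config.
FORMATTER = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """Concatenate persona memory, scenario prompt, and chat turns, then
    end with the response template so the model completes the bot's next line."""
    text = FORMATTER["memory_template"].format(bot_name=bot_name, memory=memory)
    text += FORMATTER["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        template = FORMATTER["bot_template" if speaker == "bot" else "user_template"]
        # str.format ignores unused keyword arguments, so both templates work here.
        text += template.format(bot_name=bot_name, user_name=user_name, message=message)
    return text + FORMATTER["response_template"].format(bot_name=bot_name)

# Sanity check against the leaderboard stats above.
assert abs(13852 / 28552 - 0.48514990193331464) < 1e-12

With stopping_words ['\n'] the model generates exactly one line per reply, and best_of 4 indicates four candidate completions are sampled per request and reranked, presumably by the reward model listed under reward_repo.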
Resubmit model
Running pipeline stage MKMLizer
Starting job with name zeuslabs-l3-aethora-15b-v2-v7-mkmlizer
Waiting for job on zeuslabs-l3-aethora-15b-v2-v7-mkmlizer to finish
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ║       _____ __ __                                                   ║
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ║      / _/ /_ ___ __/ /  ___ ___ / /                                 ║
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ║     / _/ / // / |/|/ / _ \/ -_) -_) /                               ║
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ║    /_//_/\_, /|__,__/_//_/\__/\__/_/                                ║
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ║         /___/                                                       ║
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ║ ║
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ║ Version: 0.8.14 ║
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ║ https://mk1.ai ║
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ║ ║
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ║ The license key for the current software has been verified as ║
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ║ belonging to: ║
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ║ ║
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ║ Chai Research Corp. ║
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ║ ║
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: warnings.warn(warning_message, FutureWarning)
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: Downloaded to shared memory in 30.686s
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: quantizing model to /dev/shm/model_cache
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: Saving flywheel model at /dev/shm/model_cache
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: Loading 0: 99%|█████████▊| 571/579 [00:21<00:00, 9.31it/s]
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: quantized model in 35.029s
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: Processed model ZeusLabs/L3-Aethora-15B-V2 in 70.656s
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: creating bucket guanaco-mkml-models
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/zeuslabs-l3-aethora-15b-v2-v7
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/zeuslabs-l3-aethora-15b-v2-v7/tokenizer_config.json
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/zeuslabs-l3-aethora-15b-v2-v7/special_tokens_map.json
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/zeuslabs-l3-aethora-15b-v2-v7/config.json
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/zeuslabs-l3-aethora-15b-v2-v7/tokenizer.json
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/zeuslabs-l3-aethora-15b-v2-v7/flywheel_model.1.safetensors
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/zeuslabs-l3-aethora-15b-v2-v7/flywheel_model.0.safetensors
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:913: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: warnings.warn(
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:757: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: warnings.warn(
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: warnings.warn(
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: /opt/conda/lib/python3.10/site-packages/torch/_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: return self.fget.__get__(instance, owner)()
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: Saving duration: 0.434s
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 4.965s
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: creating bucket guanaco-reward-models
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: Bucket 's3://guanaco-reward-models/' created
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/zeuslabs-l3-aethora-15b-v2-v7_reward
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/zeuslabs-l3-aethora-15b-v2-v7_reward/config.json
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/zeuslabs-l3-aethora-15b-v2-v7_reward/tokenizer_config.json
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/zeuslabs-l3-aethora-15b-v2-v7_reward/special_tokens_map.json
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/zeuslabs-l3-aethora-15b-v2-v7_reward/vocab.json
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/zeuslabs-l3-aethora-15b-v2-v7_reward/merges.txt
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/zeuslabs-l3-aethora-15b-v2-v7_reward/tokenizer.json
zeuslabs-l3-aethora-15b-v2-v7-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/zeuslabs-l3-aethora-15b-v2-v7_reward/reward.tensors
Job zeuslabs-l3-aethora-15b-v2-v7-mkmlizer completed after 104.56s with status: succeeded
Stopping job with name zeuslabs-l3-aethora-15b-v2-v7-mkmlizer
Pipeline stage MKMLizer completed in 105.43s
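The cp lines in the MKMLizer log copy each artifact from the shared-memory cache into S3. A rough boto3 equivalent of that upload step is sketched below; the bucket name, key prefix, and file list are taken from the log, while the client setup and credentials handling are assumptions.

# Sketch of the artifact upload shown as `cp ... s3://...` in the log above.
import os
import boto3

s3 = boto3.client("s3")  # assumes credentials are already configured
bucket = "guanaco-mkml-models"
prefix = "zeuslabs-l3-aethora-15b-v2-v7"
cache_dir = "/dev/shm/model_cache"

for filename in [
    "tokenizer_config.json",
    "special_tokens_map.json",
    "config.json",
    "tokenizer.json",
    "flywheel_model.1.safetensors",
    "flywheel_model.0.safetensors",
]:
    s3.upload_file(os.path.join(cache_dir, filename), bucket, f"{prefix}/{filename}")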
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.16s
Running pipeline stage ISVCDeployer
Creating inference service zeuslabs-l3-aethora-15b-v2-v7
Waiting for inference service zeuslabs-l3-aethora-15b-v2-v7 to be ready
Inference service zeuslabs-l3-aethora-15b-v2-v7 ready after 462.3692455291748s
Pipeline stage ISVCDeployer completed in 469.33s
Running pipeline stage StressChecker
%s, retrying in %s seconds...
%s, retrying in %s seconds...
Received healthy response to inference request in 2.874854803085327s
Received healthy response to inference request in 1.9954593181610107s
Received healthy response to inference request in 2.059329032897949s
Received healthy response to inference request in 2.0363457202911377s
Received healthy response to inference request in 1.7065339088439941s
5 requests
0 failed requests
5th percentile: 1.7643189907073975
10th percentile: 1.8221040725708009
20th percentile: 1.9376742362976074
30th percentile: 2.003636598587036
40th percentile: 2.019991159439087
50th percentile: 2.0363457202911377
60th percentile: 2.045539045333862
70th percentile: 2.0547323703765867
80th percentile: 2.222434186935425
90th percentile: 2.548644495010376
95th percentile: 2.711749649047851
99th percentile: 2.842233772277832
mean time: 2.134504556655884
Pipeline stage StressChecker completed in 22.53s
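The StressChecker summary above is consistent with numpy-style linear interpolation over the five logged latencies: for example, the 5th percentile 1.7643 lies 20% of the way from the fastest response (1.7065s) to the next (1.9955s). A short check, with the latencies copied from the log:

# Reproduce the StressChecker percentiles and mean from the logged latencies.
import numpy as np

latencies = np.array([
    2.874854803085327,
    1.9954593181610107,
    2.059329032897949,
    2.0363457202911377,
    1.7065339088439941,
])

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")  # default linear interpolation
print(f"mean time: {latencies.mean()}")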
zeuslabs-l3-aethora-15b-v2_v7 status is now deployed due to DeploymentManager action
zeuslabs-l3-aethora-15b-v2_v7 status is now inactive due to auto-deactivation of underperforming models
