submission_id: alkahestry-llmama3-monol_3361_v7
developer_uid: alkacchi
status: inactive
model_repo: alkahestry/Llmama3-Monologue-8B
reward_repo: Jellywibble/CHAI_alignment_reward_model
generation_params: {'temperature': 0.72, 'top_p': 0.73, 'min_p': 0.1, 'top_k': 1000, 'presence_penalty': 0.82, 'frequency_penalty': 0.2, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system\n{bot_name}'s persona: {memory}<|end_header_id|>", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
timestamp: 2024-06-30T01:16:02+00:00
model_name: alkacchi-monologue-8B
model_group: alkahestry/Llmama3-Monol
num_battles: 17619
num_wins: 7975
celo_rating: 1119.02
propriety_score: 0.7553204280389564
propriety_total_count: 8317.0
submission_type: basic
model_architecture: LlamaForCausalLM
model_num_parameters: 8030261248.0
best_of: 16
max_input_tokens: 512
max_output_tokens: 64
display_name: alkacchi-monologue-8B
ineligible_reason: None
language_model: alkahestry/Llmama3-Monologue-8B
model_size: 8B
reward_model: Jellywibble/CHAI_alignment_reward_model
us_pacific_date: 2024-06-29
win_ratio: 0.4526363584766445
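
The generation_params above describe a fairly conservative sampling setup: moderate temperature with nucleus (top_p) and min-p filtering, repetition penalties, responses cut at the first newline, and 16 candidates drawn per request. A minimal sketch of reproducing these settings with a vLLM-style API (the serving engine is an assumption; the log only records the raw parameters):

    # Hypothetical serving-side reproduction of generation_params; the engine
    # choice (vLLM) is an assumption, not taken from this log.
    from vllm import LLM, SamplingParams

    sampling = SamplingParams(
        temperature=0.72,
        top_p=0.73,
        min_p=0.1,
        top_k=1000,
        presence_penalty=0.82,
        frequency_penalty=0.2,
        stop=["\n"],    # stopping_words: a response ends at the first newline
        max_tokens=64,  # max_output_tokens
        n=16,           # best_of: 16 draws, reranked externally (see below)
    )

    llm = LLM(model="alkahestry/Llmama3-Monologue-8B")
    # max_input_tokens=512 is enforced on the prompt side by truncating the
    # conversation history before generation.
    outputs = llm.generate(["<formatted prompt>"], sampling)
    candidates = [c.text for c in outputs[0].outputs]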
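
The two formatter blocks render the same conversation in two registers: the generation model sees Llama-3 header tokens, while the reward model sees a plain Pygmalion-style transcript. A small assembly sketch (the template strings come from the log; the persona, names, and messages are invented placeholders):

    # Hypothetical prompt assembly from the formatter templates above.
    memory_template = ("<|begin_of_text|><|start_header_id|>system\n"
                       "{bot_name}'s persona: {memory}<|end_header_id|>")
    prompt_template = "{prompt}<|eot_id|>"
    user_template = ("<|start_header_id|>user<|end_header_id|>\n\n"
                     "{user_name}: {message}<|eot_id|>")
    response_template = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:"

    gen_prompt = (
        memory_template.format(bot_name="Nova", memory="A stoic android detective.")
        + prompt_template.format(prompt="Nova is questioning a witness.")
        + user_template.format(user_name="User", message="Did you see anything?")
        + response_template.format(bot_name="Nova")  # the model continues from here
    )

    # The reward_formatter renders the same turns as plain text:
    reward_prompt = (
        "Nova's Persona: A stoic android detective.\n####\n"
        + "Nova is questioning a witness.\n<START>\n"
        + "User: Did you see anything?\n"
        + "Nova:"
    )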
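
best_of: 16 together with a separate reward_repo implies best-of-n reranking: each of the 16 sampled candidates is scored by the reward model against the reward-formatted transcript, and the highest-scoring one is returned. A sketch of that mechanism, assuming the reward model loads as a sequence-classification head (not confirmed by the log; check the repo's config):

    # Hypothetical best-of-16 reranking with the CHAI alignment reward model.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    rm_repo = "Jellywibble/CHAI_alignment_reward_model"
    rm_tok = AutoTokenizer.from_pretrained(rm_repo)
    rm = AutoModelForSequenceClassification.from_pretrained(rm_repo)

    def pick_best(reward_prompt: str, candidates: list[str]) -> str:
        # Append each candidate to the reward-formatted prompt and keep the
        # one the reward model scores highest.
        scores = []
        for cand in candidates:
            enc = rm_tok(reward_prompt + cand, return_tensors="pt", truncation=True)
            with torch.no_grad():
                scores.append(rm(**enc).logits[0, 0].item())
        return candidates[scores.index(max(scores))]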
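
The headline numbers are internally consistent: win_ratio is simply num_wins / num_battles, i.e. 7975 / 17619 = 0.4526363584766445.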
Running pipeline stage MKMLizer
Starting job with name alkahestry-llmama3-monol-3361-v7-mkmlizer
Waiting for job on alkahestry-llmama3-monol-3361-v7-mkmlizer to finish
alkahestry-llmama3-monol-3361-v7-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
alkahestry-llmama3-monol-3361-v7-mkmlizer: ║ _____ __ __ ║
alkahestry-llmama3-monol-3361-v7-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
alkahestry-llmama3-monol-3361-v7-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
alkahestry-llmama3-monol-3361-v7-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
alkahestry-llmama3-monol-3361-v7-mkmlizer: ║ /___/ ║
alkahestry-llmama3-monol-3361-v7-mkmlizer: ║ ║
alkahestry-llmama3-monol-3361-v7-mkmlizer: ║ Version: 0.8.14 ║
alkahestry-llmama3-monol-3361-v7-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
alkahestry-llmama3-monol-3361-v7-mkmlizer: ║ https://mk1.ai ║
alkahestry-llmama3-monol-3361-v7-mkmlizer: ║ ║
alkahestry-llmama3-monol-3361-v7-mkmlizer: ║ The license key for the current software has been verified as ║
alkahestry-llmama3-monol-3361-v7-mkmlizer: ║ belonging to: ║
alkahestry-llmama3-monol-3361-v7-mkmlizer: ║ ║
alkahestry-llmama3-monol-3361-v7-mkmlizer: ║ Chai Research Corp. ║
alkahestry-llmama3-monol-3361-v7-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
alkahestry-llmama3-monol-3361-v7-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
alkahestry-llmama3-monol-3361-v7-mkmlizer: ║ ║
alkahestry-llmama3-monol-3361-v7-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
alkahestry-llmama3-monol-3361-v7-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
alkahestry-llmama3-monol-3361-v7-mkmlizer: warnings.warn(warning_message, FutureWarning)
alkahestry-llmama3-monol-3361-v7-mkmlizer: Downloaded to shared memory in 42.719s
alkahestry-llmama3-monol-3361-v7-mkmlizer: quantizing model to /dev/shm/model_cache
alkahestry-llmama3-monol-3361-v7-mkmlizer: Saving flywheel model at /dev/shm/model_cache
alkahestry-llmama3-monol-3361-v7-mkmlizer: Loading 0: 99%|█████████▊| 287/291 [00:08<00:00, 8.21it/s]
alkahestry-llmama3-monol-3361-v7-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
alkahestry-llmama3-monol-3361-v7-mkmlizer: quantized model in 23.119s
alkahestry-llmama3-monol-3361-v7-mkmlizer: Processed model alkahestry/Llmama3-Monologue-8B in 68.294s
alkahestry-llmama3-monol-3361-v7-mkmlizer: creating bucket guanaco-mkml-models
alkahestry-llmama3-monol-3361-v7-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
alkahestry-llmama3-monol-3361-v7-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/alkahestry-llmama3-monol-3361-v7
alkahestry-llmama3-monol-3361-v7-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/alkahestry-llmama3-monol-3361-v7/special_tokens_map.json
alkahestry-llmama3-monol-3361-v7-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/alkahestry-llmama3-monol-3361-v7/config.json
alkahestry-llmama3-monol-3361-v7-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/alkahestry-llmama3-monol-3361-v7/tokenizer_config.json
alkahestry-llmama3-monol-3361-v7-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/alkahestry-llmama3-monol-3361-v7/tokenizer.json
alkahestry-llmama3-monol-3361-v7-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/alkahestry-llmama3-monol-3361-v7/flywheel_model.0.safetensors
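
The cp lines above show each cache file being copied to a per-submission S3 prefix. A minimal boto3 equivalent of this step (the pipeline's actual tooling is not shown in the log, so this is only a sketch):

    # Hypothetical re-implementation of the model upload step with boto3.
    import os
    import boto3

    s3 = boto3.client("s3")
    bucket = "guanaco-mkml-models"
    prefix = "alkahestry-llmama3-monol-3361-v7"
    cache_dir = "/dev/shm/model_cache"

    for name in os.listdir(cache_dir):
        # Each file lands at s3://guanaco-mkml-models/<prefix>/<name>,
        # mirroring the cp lines in the log.
        s3.upload_file(os.path.join(cache_dir, name), bucket, f"{prefix}/{name}")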
alkahestry-llmama3-monol-3361-v7-mkmlizer: loading reward model from Jellywibble/CHAI_alignment_reward_model
alkahestry-llmama3-monol-3361-v7-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:913: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
alkahestry-llmama3-monol-3361-v7-mkmlizer: warnings.warn(
alkahestry-llmama3-monol-3361-v7-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:757: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
alkahestry-llmama3-monol-3361-v7-mkmlizer: warnings.warn(
alkahestry-llmama3-monol-3361-v7-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
alkahestry-llmama3-monol-3361-v7-mkmlizer: warnings.warn(
alkahestry-llmama3-monol-3361-v7-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
alkahestry-llmama3-monol-3361-v7-mkmlizer: Saving duration: 0.141s
alkahestry-llmama3-monol-3361-v7-mkmlizer: Processed model Jellywibble/CHAI_alignment_reward_model in 11.688s
alkahestry-llmama3-monol-3361-v7-mkmlizer: creating bucket guanaco-reward-models
alkahestry-llmama3-monol-3361-v7-mkmlizer: Bucket 's3://guanaco-reward-models/' created
alkahestry-llmama3-monol-3361-v7-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/alkahestry-llmama3-monol-3361-v7_reward
alkahestry-llmama3-monol-3361-v7-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/alkahestry-llmama3-monol-3361-v7_reward/config.json
alkahestry-llmama3-monol-3361-v7-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/alkahestry-llmama3-monol-3361-v7_reward/special_tokens_map.json
alkahestry-llmama3-monol-3361-v7-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/alkahestry-llmama3-monol-3361-v7_reward/tokenizer_config.json
alkahestry-llmama3-monol-3361-v7-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/alkahestry-llmama3-monol-3361-v7_reward/merges.txt
alkahestry-llmama3-monol-3361-v7-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/alkahestry-llmama3-monol-3361-v7_reward/vocab.json
alkahestry-llmama3-monol-3361-v7-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/alkahestry-llmama3-monol-3361-v7_reward/tokenizer.json
alkahestry-llmama3-monol-3361-v7-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/alkahestry-llmama3-monol-3361-v7_reward/reward.tensors
Job alkahestry-llmama3-monol-3361-v7-mkmlizer completed after 114.25s with status: succeeded
Stopping job with name alkahestry-llmama3-monol-3361-v7-mkmlizer
Pipeline stage MKMLizer completed in 115.22s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.12s
Running pipeline stage ISVCDeployer
Creating inference service alkahestry-llmama3-monol-3361-v7
Waiting for inference service alkahestry-llmama3-monol-3361-v7 to be ready
Inference service alkahestry-llmama3-monol-3361-v7 ready after 50.26294755935669s
Pipeline stage ISVCDeployer completed in 58.33s
Running pipeline stage StressChecker
Received healthy response to inference request in 3.5256245136260986s
Received healthy response to inference request in 1.278862714767456s
Received healthy response to inference request in 1.2735328674316406s
Received healthy response to inference request in 1.2281641960144043s
Received healthy response to inference request in 1.3027334213256836s
5 requests
0 failed requests
5th percentile: 1.2372379302978516
10th percentile: 1.2463116645812988
20th percentile: 1.2644591331481934
30th percentile: 1.2745988368988037
40th percentile: 1.2767307758331299
50th percentile: 1.278862714767456
60th percentile: 1.2884109973907472
70th percentile: 1.297959280014038
80th percentile: 1.7473116397857669
90th percentile: 2.636468076705933
95th percentile: 3.0810462951660154
99th percentile: 3.436708869934082
mean time: 1.7217835426330566
Pipeline stage StressChecker completed in 9.36s
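
The stress-check percentiles follow from linear interpolation over the five response times; numpy's default percentile method reproduces every figure above exactly (that numpy was used is an assumption, but the arithmetic checks out):

    import numpy as np

    # The five healthy response times reported above, in seconds.
    times = [3.5256245136260986, 1.278862714767456, 1.2735328674316406,
             1.2281641960144043, 1.3027334213256836]

    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f"{p}th percentile: {np.percentile(times, p)}")
    print("mean time:", np.mean(times))  # 1.7217835426330566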
alkahestry-llmama3-monol_3361_v7 status is now deployed due to DeploymentManager action
alkahestry-llmama3-monol_3361_v7 status is now inactive due to auto deactivation of underperforming models

Usage Metrics

Latency Metrics