submission_id: chaoticneutrals-poppy-p_6268_v17
developer_uid: robert_irvine
status: inactive
model_repo: ChaoticNeutrals/Poppy_Porpoise-v0.7-L3-8B
reward_repo: ChaiML/reward_gpt2_medium_preference_24m_e2
generation_params: {'temperature': 0.95, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|end_header_id|>', '<|start_header_id|>', '\n\n', '<|eot_id|>'], 'max_input_tokens': 512, 'best_of': 4, 'max_output_tokens': 64}
formatter: {'memory_template': '<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\nThis is an entertaining conversation. You are {bot_name} who has persona: {memory}. Engage in a chat with {user_name} while staying in character. Try to flirt with {user_name}. Engage in *roleplay* actions. Describe the scene dramatically<|eot_id|>', 'prompt_template': '<|start_header_id|>system<|end_header_id|>\n\nExample conversation:\n{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'memory_template': 'Memory: {memory}\n', 'prompt_template': '{prompt}\n', 'bot_template': 'Bot: {message}\n', 'user_template': 'User: {message}\n', 'response_template': 'Bot:', 'truncate_by_message': False}
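The formatter above is a set of `str.format`-style templates that assemble a Llama-3-style prompt: the system/memory block first, then one templated block per conversation turn, then the response header that the model completes. A minimal stand-alone sketch of that assembly (the `build_prompt` helper is illustrative, not the pipeline's actual code; the `prompt_template` example-conversation block is omitted for brevity):

```python
# Sketch only: how the submission's formatter templates could be applied.
# Templates are copied from the config above; `build_prompt` is a hypothetical helper.
FORMATTER = {
    "memory_template": (
        "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
        "This is an entertaining conversation. You are {bot_name} who has persona: {memory}. "
        "Engage in a chat with {user_name} while staying in character. Try to flirt with {user_name}. "
        "Engage in *roleplay* actions. Describe the scene dramatically<|eot_id|>"
    ),
    "bot_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>",
    "user_template": "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>",
    "response_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, turns):
    """Assemble: system memory block, alternating turns, then the open response header."""
    parts = [FORMATTER["memory_template"].format(
        bot_name=bot_name, user_name=user_name, memory=memory)]
    for role, message in turns:
        tpl = FORMATTER["bot_template"] if role == "bot" else FORMATTER["user_template"]
        parts.append(tpl.format(bot_name=bot_name, user_name=user_name, message=message))
    # The prompt ends mid-turn ("{bot_name}:") so generation continues in character.
    parts.append(FORMATTER["response_template"].format(bot_name=bot_name))
    return "".join(parts)
```

The reward_formatter works the same way, just with plain `Bot:`/`User:` prefixes instead of Llama-3 header tokens, since the gpt2-medium reward model was trained on that simpler transcript format.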
timestamp: 2024-05-15T22:20:26+00:00
model_name: chaoticneutrals-poppy-p_6268_v17
model_eval_status: pending
model_group: ChaoticNeutrals/Poppy_Po
num_battles: 28335
num_wins: 14467
celo_rating: 1175.35
safety_score: None
propriety_score: 0.0
propriety_total_count: 0.0
submission_type: basic
model_architecture: LlamaForCausalLM
model_num_parameters: 8030261248.0
best_of: 4
max_input_tokens: 512
max_output_tokens: 64
display_name: chaoticneutrals-poppy-p_6268_v17
ineligible_reason: propriety_total_count < 5000
language_model: ChaoticNeutrals/Poppy_Porpoise-v0.7-L3-8B
model_size: 8B
reward_model: ChaiML/reward_gpt2_medium_preference_24m_e2
us_pacific_date: 2024-05-15
win_ratio: 0.5105699664725605
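The win_ratio field is simply num_wins / num_battles from the fields above; a quick plain-Python check (not pipeline code):

```python
num_battles = 28335
num_wins = 14467

# 14467 / 28335 ≈ 0.51057, matching the reported win_ratio
win_ratio = num_wins / num_battles
print(win_ratio)
```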
Resubmit model
Running pipeline stage MKMLizer
Starting job with name chaoticneutrals-poppy-p-6268-v17-mkmlizer
Waiting for job on chaoticneutrals-poppy-p-6268-v17-mkmlizer to finish
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ║ _____ __ __ ║
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ║ /___/ ║
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ║ ║
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ║ Version: 0.8.14 ║
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ║ https://mk1.ai ║
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ║ ║
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ║ The license key for the current software has been verified as ║
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ║ belonging to: ║
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ║ ║
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ║ Chai Research Corp. ║
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ║ Expiration: 2024-07-15 23:59:59 ║
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ║ ║
chaoticneutrals-poppy-p-6268-v17-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
chaoticneutrals-poppy-p-6268-v17-mkmlizer: /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py:131: FutureWarning: 'list_files_info' (from 'huggingface_hub.hf_api') is deprecated and will be removed from version '0.23'. Use `list_repo_tree` and `get_paths_info` instead.
chaoticneutrals-poppy-p-6268-v17-mkmlizer: warnings.warn(warning_message, FutureWarning)
chaoticneutrals-poppy-p-6268-v17-mkmlizer: Downloaded to shared memory in 18.267s
chaoticneutrals-poppy-p-6268-v17-mkmlizer: quantizing model to /dev/shm/model_cache
chaoticneutrals-poppy-p-6268-v17-mkmlizer: Saving flywheel model at /dev/shm/model_cache
chaoticneutrals-poppy-p-6268-v17-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s]
chaoticneutrals-poppy-p-6268-v17-mkmlizer: Loading 0: 97%|█████████▋| 283/291 [00:06<00:00, 53.46it/s]
chaoticneutrals-poppy-p-6268-v17-mkmlizer: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.

chaoticneutrals-poppy-p-6268-v17-mkmlizer: quantized model in 18.171s
chaoticneutrals-poppy-p-6268-v17-mkmlizer: Processed model ChaoticNeutrals/Poppy_Porpoise-v0.7-L3-8B in 37.622s
chaoticneutrals-poppy-p-6268-v17-mkmlizer: creating bucket guanaco-mkml-models
chaoticneutrals-poppy-p-6268-v17-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
chaoticneutrals-poppy-p-6268-v17-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/chaoticneutrals-poppy-p-6268-v17
chaoticneutrals-poppy-p-6268-v17-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/chaoticneutrals-poppy-p-6268-v17/special_tokens_map.json
chaoticneutrals-poppy-p-6268-v17-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/chaoticneutrals-poppy-p-6268-v17/config.json
chaoticneutrals-poppy-p-6268-v17-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/chaoticneutrals-poppy-p-6268-v17/tokenizer_config.json
chaoticneutrals-poppy-p-6268-v17-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/chaoticneutrals-poppy-p-6268-v17/tokenizer.json
chaoticneutrals-poppy-p-6268-v17-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/chaoticneutrals-poppy-p-6268-v17/flywheel_model.0.safetensors
chaoticneutrals-poppy-p-6268-v17-mkmlizer: loading reward model from ChaiML/reward_gpt2_medium_preference_24m_e2
chaoticneutrals-poppy-p-6268-v17-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:913: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
chaoticneutrals-poppy-p-6268-v17-mkmlizer: warnings.warn(
chaoticneutrals-poppy-p-6268-v17-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:757: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
chaoticneutrals-poppy-p-6268-v17-mkmlizer: warnings.warn(
chaoticneutrals-poppy-p-6268-v17-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:468: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
chaoticneutrals-poppy-p-6268-v17-mkmlizer: warnings.warn(
chaoticneutrals-poppy-p-6268-v17-mkmlizer: /opt/conda/lib/python3.10/site-packages/torch/_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
chaoticneutrals-poppy-p-6268-v17-mkmlizer: return self.fget.__get__(instance, owner)()
chaoticneutrals-poppy-p-6268-v17-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
chaoticneutrals-poppy-p-6268-v17-mkmlizer: Saving duration: 0.238s
chaoticneutrals-poppy-p-6268-v17-mkmlizer: Processed model ChaiML/reward_gpt2_medium_preference_24m_e2 in 5.012s
chaoticneutrals-poppy-p-6268-v17-mkmlizer: creating bucket guanaco-reward-models
chaoticneutrals-poppy-p-6268-v17-mkmlizer: Bucket 's3://guanaco-reward-models/' created
chaoticneutrals-poppy-p-6268-v17-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/chaoticneutrals-poppy-p-6268-v17_reward
chaoticneutrals-poppy-p-6268-v17-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/chaoticneutrals-poppy-p-6268-v17_reward/config.json
chaoticneutrals-poppy-p-6268-v17-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/chaoticneutrals-poppy-p-6268-v17_reward/special_tokens_map.json
chaoticneutrals-poppy-p-6268-v17-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/chaoticneutrals-poppy-p-6268-v17_reward/tokenizer_config.json
chaoticneutrals-poppy-p-6268-v17-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/chaoticneutrals-poppy-p-6268-v17_reward/merges.txt
chaoticneutrals-poppy-p-6268-v17-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/chaoticneutrals-poppy-p-6268-v17_reward/vocab.json
chaoticneutrals-poppy-p-6268-v17-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/chaoticneutrals-poppy-p-6268-v17_reward/tokenizer.json
chaoticneutrals-poppy-p-6268-v17-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/chaoticneutrals-poppy-p-6268-v17_reward/reward.tensors
Job chaoticneutrals-poppy-p-6268-v17-mkmlizer completed after 69.98s with status: succeeded
Stopping job with name chaoticneutrals-poppy-p-6268-v17-mkmlizer
Pipeline stage MKMLizer completed in 71.48s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.38s
Running pipeline stage ISVCDeployer
Creating inference service chaoticneutrals-poppy-p-6268-v17
Waiting for inference service chaoticneutrals-poppy-p-6268-v17 to be ready
Inference service chaoticneutrals-poppy-p-6268-v17 ready after 30.515952587127686s
Pipeline stage ISVCDeployer completed in 37.44s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.0847737789154053s
Received healthy response to inference request in 1.2188687324523926s
Received healthy response to inference request in 1.2394535541534424s
Received healthy response to inference request in 1.1960437297821045s
Received healthy response to inference request in 1.2118418216705322s
5 requests
0 failed requests
5th percentile: 1.1992033481597901
10th percentile: 1.2023629665374755
20th percentile: 1.2086822032928466
30th percentile: 1.2132472038269042
40th percentile: 1.2160579681396484
50th percentile: 1.2188687324523926
60th percentile: 1.2271026611328124
70th percentile: 1.2353365898132325
80th percentile: 1.4085175991058352
90th percentile: 1.7466456890106201
95th percentile: 1.9157097339630125
99th percentile: 2.050960969924927
mean time: 1.3901963233947754
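The percentile figures above are consistent with linearly interpolated percentiles over the five response times (the same scheme NumPy uses by default). A stdlib-only sketch that reproduces them (the `percentile` helper name is illustrative):

```python
def percentile(samples, p):
    """Linearly interpolated percentile over sorted samples."""
    xs = sorted(samples)
    k = (len(xs) - 1) * p / 100.0          # fractional rank
    lo, hi = int(k), min(int(k) + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

# The five StressChecker latencies logged above, in seconds
latencies = [2.0847737789154053, 1.2188687324523926, 1.2394535541534424,
             1.1960437297821045, 1.2118418216705322]

print(percentile(latencies, 50))           # median, reported as the 50th percentile
print(percentile(latencies, 90))           # reported 90th percentile
print(sum(latencies) / len(latencies))     # reported mean time
```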
Pipeline stage StressChecker completed in 10.56s
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.42s
Running M-Eval for topic stay_in_character
Running pipeline stage DaemonicSafetyScorer
M-Eval Dataset for topic stay_in_character is loaded
Pipeline stage DaemonicSafetyScorer completed in 0.43s
%s, retrying in %s seconds...
chaoticneutrals-poppy-p_6268_v17 status is now deployed due to DeploymentManager action
%s, retrying in %s seconds...
Scoring model output for bot %s
Scoring model output for bot %s
Scoring model output for bot %s
Scoring model output for bot %s
chaoticneutrals-poppy-p_6268_v17 status is now inactive due to auto deactivation removed underperforming models

Usage Metrics

Latency Metrics