developer_uid: robert_irvine
submission_id: mistralai-mixtral-8x7b-_3473_v44
model_name: mistralai-mixtral-8x7b-_3473_v44
model_group: mistralai/Mixtral-8x7B-I
status: torndown
timestamp: 2024-06-06T18:41:59+00:00
num_battles: 9816
num_wins: 4826
celo_rating: 1163.82
family_friendly_score: 0.0
submission_type: basic
model_repo: mistralai/Mixtral-8x7B-Instruct-v0.1
model_architecture: MixtralForCausalLM
reward_repo: rirv938/reward_gpt2_medium_preference_24m_e2
model_num_parameters: 46702792704.0
best_of: 4
max_input_tokens: 512
max_output_tokens: 64
display_name: mistralai-mixtral-8x7b-_3473_v44
is_internal_developer: True
language_model: mistralai/Mixtral-8x7B-Instruct-v0.1
model_size: 47B
ranking_group: single
us_pacific_date: 2024-06-06
win_ratio: 0.49164629176854113
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 1000, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['</s>', '<|user|>', '###', '\n'], 'max_input_tokens': 512, 'best_of': 4, 'max_output_tokens': 64}
formatter: {'memory_template': '<s>[INST] This is an entertaining conversation. You are {bot_name} who has the persona: {memory}.\nPlay the role of {bot_name}. Engage in a chat with {user_name} while staying in character. You should create a fun dialogue which entertains {user_name}.\n', 'prompt_template': '{prompt}\n', 'bot_template': '{bot_name}: {message}</s>', 'user_template': '[INST] {user_name}: {message} [/INST]', 'response_template': '{bot_name}:', 'truncate_by_message': False}
reward_formatter: {'bot_template': 'Bot: {message}\n', 'memory_template': 'Memory: {memory}\n', 'prompt_template': '{prompt}\n', 'response_template': 'Bot:', 'truncate_by_message': False, 'user_template': 'User: {message}\n'}
model_eval_status: pending
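
For reference, the win_ratio field above is just num_wins divided by num_battles; a quick arithmetic check in Python, using the values from this log:

num_battles = 9816
num_wins = 4826

# 4826 / 9816 ≈ 0.49164629176854113, matching the reported win_ratio
print(num_wins / num_battles)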
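
The generation_params above are standard sampling settings. As a sketch only (this log does not name the serving engine), they could be expressed as vLLM SamplingParams; note that best_of=4 here is used to draw four candidates that are then reranked with the reward model listed in reward_repo, rather than relying on the engine's own log-probability reranking:

from vllm import SamplingParams

# Values copied from generation_params; the vLLM mapping is an assumption.
sampling_params = SamplingParams(
    n=4,                      # best_of: draw 4 candidates per turn for reward-model reranking
    temperature=1.0,
    top_p=1.0,
    min_p=0.0,
    top_k=1000,
    presence_penalty=0.5,
    frequency_penalty=0.5,
    stop=['</s>', '<|user|>', '###', '\n'],
    max_tokens=64,            # max_output_tokens; max_input_tokens=512 caps the prompt separately
)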
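
Both formatter blocks are plain Python template strings. The sketch below shows one plausible way they are applied: the same helper can render the conversation either with the [INST]-style formatter (the Mixtral prompt) or with the reward_formatter (the input scored by the GPT-2 reward model when choosing among the best_of candidates). The helper name and the example conversation are illustrative, not taken from the pipeline; the reward_formatter dict is copied from the field above.

# Copied from the reward_formatter field above.
reward_formatter = {
    'memory_template': 'Memory: {memory}\n',
    'prompt_template': '{prompt}\n',
    'bot_template': 'Bot: {message}\n',
    'user_template': 'User: {message}\n',
    'response_template': 'Bot:',
}

def render_chat(fmt, memory, prompt, turns, bot_name='Bot', user_name='User'):
    # Assemble the conversation from the template strings; placeholders a
    # template does not use are simply ignored by str.format.
    text = fmt['memory_template'].format(bot_name=bot_name, user_name=user_name, memory=memory)
    text += fmt['prompt_template'].format(prompt=prompt)
    for speaker, message in turns:
        if speaker == 'user':
            text += fmt['user_template'].format(user_name=user_name, message=message)
        else:
            text += fmt['bot_template'].format(bot_name=bot_name, message=message)
    # The response template cues the model to continue speaking as the bot.
    return text + fmt['response_template'].format(bot_name=bot_name)

turns = [('user', 'Hello there!'), ('bot', 'Well met, traveler.'), ('user', 'What brings you here?')]
print(render_chat(reward_formatter, memory='a wandering bard', prompt='You meet at an inn.', turns=turns))

Passing the formatter dict from above instead of reward_formatter yields the <s>[INST]-style Mixtral prompt, which is subject to the max_input_tokens=512 limit listed in the fields above.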
Resubmit model
Running pipeline stage ISVCDeployer
Pipeline stage ISVCDeployer completed in 0.32s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.4709670543670654s
Received healthy response to inference request in 1.0495257377624512s
Received healthy response to inference request in 1.5365641117095947s
Received healthy response to inference request in 1.2703731060028076s
Received healthy response to inference request in 1.0070762634277344s
5 requests
0 failed requests
5th percentile: 1.0155661582946778s
10th percentile: 1.0240560531616212s
20th percentile: 1.0410358428955078s
30th percentile: 1.0936952114105225s
40th percentile: 1.182034158706665s
50th percentile: 1.2703731060028076s
60th percentile: 1.3506106853485107s
70th percentile: 1.4308482646942138s
80th percentile: 1.4840864658355712s
90th percentile: 1.510325288772583s
95th percentile: 1.523444700241089s
99th percentile: 1.5339402294158935s
mean time: 1.2669012546539307s
Pipeline stage StressChecker completed in 8.28s
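
The StressChecker summary above is consistent with numpy-style percentiles (linear interpolation) and a plain mean over the five measured latencies; a quick reproduction, assuming that implementation:

import numpy as np

# The five healthy-response latencies from the StressChecker stage, in seconds.
latencies = [1.4709670543670654, 1.0495257377624512, 1.5365641117095947,
             1.2703731060028076, 1.0070762634277344]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}s")
print(f"mean time: {np.mean(latencies)}s")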
Running pipeline stage DaemonicModelEvalScorer
Pipeline stage DaemonicModelEvalScorer completed in 0.11s
Running pipeline stage DaemonicSafetyScorer
Pipeline stage DaemonicSafetyScorer completed in 0.10s
Running M-Eval for topic stay_in_character
mistralai-mixtral-8x7b-_3473_v44 status is now deployed due to DeploymentManager action
M-Eval Dataset for topic stay_in_character is loaded
mistralai-mixtral-8x7b-_3473_v44 status is now inactive due to admin request
admin requested teardown of mistralai-mixtral-8x7b-_3473_v44
Running pipeline stage ISVCDeleter
Checking if service amd-proxy is running
Skipping teardown as no inference service was found
Pipeline stage ISVCDeleter completed in 1.95s
Running pipeline stage MKMLModelDeleter
Skipping deletion as no model was successfully uploaded
Pipeline stage MKMLModelDeleter completed in 0.28s
mistralai-mixtral-8x7b-_3473_v44 status is now torndown due to DeploymentManager action