submission_id: nousresearch-hermes-2-pr_1418_v6
developer_uid: immaculate_possum_03470
best_of: 4
celo_rating: 1171.28
display_name: red_mistral
family_friendly_score: 0.6240902474526928
family_friendly_standard_error: 0.00922702004778595
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.99, 'top_p': 0.2, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.5, 'frequency_penalty': 0.5, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 4, 'max_output_tokens': 64}
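The formatter above stitches the persona, the scenario prompt, the conversation turns, and a response prefix into a single prompt string. A minimal sketch of how those templates might be combined (the field names mirror the `formatter` dict; the platform's actual assembly logic is not shown in this log, so the ordering and the `user_name` value are assumptions):

```python
# Sketch of rendering a prompt from the formatter templates above.
# ASSUMPTION: the platform concatenates memory -> prompt -> turns -> response
# prefix in this order; the real pipeline code is not shown in this log.
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def render_prompt(bot_name, memory, prompt, turns):
    """turns: list of (speaker, message), speaker being 'user' or 'bot'."""
    parts = [
        formatter["memory_template"].format(bot_name=bot_name, memory=memory),
        formatter["prompt_template"].format(prompt=prompt),
    ]
    for speaker, message in turns:
        if speaker == "user":
            parts.append(formatter["user_template"].format(user_name="User", message=message))
        else:
            parts.append(formatter["bot_template"].format(bot_name=bot_name, message=message))
    # The response template leaves the prompt ending in "{bot_name}:" so the
    # model continues as the bot; the '\n' stopping word ends the turn.
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)

print(render_prompt("red_mistral", "a terse assistant", "Chat begins.",
                    [("user", "hi"), ("bot", "hello")]))
```

Note how the prompt deliberately ends mid-line at `red_mistral:`, which pairs with the `'\n'` entry in `stopping_words` above to bound each generated turn.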
gpu_counts: {'NVIDIA RTX A5000': 1}
ineligible_reason: num_battles<5000
is_internal_developer: False
language_model: NousResearch/Hermes-2-Pro-Mistral-7B
latencies: [{'batch_size': 1, 'throughput': 0.9496389777426338, 'latency_mean': 1.0529608476161956, 'latency_p50': 1.055152177810669, 'latency_p90': 1.1612992525100707}, {'batch_size': 5, 'throughput': 2.4650071273594647, 'latency_mean': 2.0203439033031465, 'latency_p50': 2.023949146270752, 'latency_p90': 2.30756676197052}, {'batch_size': 10, 'throughput': 3.073101766254452, 'latency_mean': 3.2268009614944457, 'latency_p50': 3.2340402603149414, 'latency_p90': 3.5934436321258545}, {'batch_size': 15, 'throughput': 3.245011851229528, 'latency_mean': 4.576126334667205, 'latency_p50': 4.5395485162734985, 'latency_p90': 5.151118659973145}, {'batch_size': 20, 'throughput': 3.2403771607529968, 'latency_mean': 6.042606542110443, 'latency_p50': 6.055160880088806, 'latency_p90': 6.708936429023742}]
max_input_tokens: 1024
max_output_tokens: 64
model_architecture: MistralForCausalLM
model_group: NousResearch/Hermes-2-Pr
model_name: red_mistral
model_num_parameters: 7241994240.0
model_repo: NousResearch/Hermes-2-Pro-Mistral-7B
model_size: 7B
num_battles: 2829
num_wins: 1120
ranking_group: single
status: torndown
submission_type: basic
throughput_3p7s: 3.18
timestamp: 2024-09-26T17:27:56+00:00
us_pacific_date: 2024-09-26
win_ratio: 0.39589961117002476
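The win ratio above is consistent with the battle counts, assuming it is simply wins divided by battles:

```python
# Quick consistency check of win_ratio against the counts in the metadata.
num_battles = 2829
num_wins = 1120

win_ratio = num_wins / num_battles
print(win_ratio)  # 0.39589961117002476, matching the logged value
```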
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name nousresearch-hermes-2-pr-1418-v6-mkmlizer
Waiting for job on nousresearch-hermes-2-pr-1418-v6-mkmlizer to finish
nousresearch-hermes-2-pr-1418-v6-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
nousresearch-hermes-2-pr-1418-v6-mkmlizer: ║        [flywheel ASCII-art logo]        ║
nousresearch-hermes-2-pr-1418-v6-mkmlizer: ║ ║
nousresearch-hermes-2-pr-1418-v6-mkmlizer: ║ Version: 0.11.12 ║
nousresearch-hermes-2-pr-1418-v6-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
nousresearch-hermes-2-pr-1418-v6-mkmlizer: ║ https://mk1.ai ║
nousresearch-hermes-2-pr-1418-v6-mkmlizer: ║ ║
nousresearch-hermes-2-pr-1418-v6-mkmlizer: ║ The license key for the current software has been verified as ║
nousresearch-hermes-2-pr-1418-v6-mkmlizer: ║ belonging to: ║
nousresearch-hermes-2-pr-1418-v6-mkmlizer: ║ ║
nousresearch-hermes-2-pr-1418-v6-mkmlizer: ║ Chai Research Corp. ║
nousresearch-hermes-2-pr-1418-v6-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
nousresearch-hermes-2-pr-1418-v6-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
nousresearch-hermes-2-pr-1418-v6-mkmlizer: ║ ║
nousresearch-hermes-2-pr-1418-v6-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
nousresearch-hermes-2-pr-1418-v6-mkmlizer: Downloaded to shared memory in 19.964s
nousresearch-hermes-2-pr-1418-v6-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpu_mjn6zv, device:0
nousresearch-hermes-2-pr-1418-v6-mkmlizer: Saving flywheel model at /dev/shm/model_cache
nousresearch-hermes-2-pr-1418-v6-mkmlizer: quantized model in 17.796s
nousresearch-hermes-2-pr-1418-v6-mkmlizer: Processed model NousResearch/Hermes-2-Pro-Mistral-7B in 37.760s
nousresearch-hermes-2-pr-1418-v6-mkmlizer: creating bucket guanaco-mkml-models
nousresearch-hermes-2-pr-1418-v6-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
nousresearch-hermes-2-pr-1418-v6-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/nousresearch-hermes-2-pr-1418-v6
nousresearch-hermes-2-pr-1418-v6-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/nousresearch-hermes-2-pr-1418-v6/config.json
nousresearch-hermes-2-pr-1418-v6-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/nousresearch-hermes-2-pr-1418-v6/special_tokens_map.json
nousresearch-hermes-2-pr-1418-v6-mkmlizer: cp /dev/shm/model_cache/added_tokens.json s3://guanaco-mkml-models/nousresearch-hermes-2-pr-1418-v6/added_tokens.json
nousresearch-hermes-2-pr-1418-v6-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/nousresearch-hermes-2-pr-1418-v6/tokenizer_config.json
nousresearch-hermes-2-pr-1418-v6-mkmlizer: cp /dev/shm/model_cache/tokenizer.model s3://guanaco-mkml-models/nousresearch-hermes-2-pr-1418-v6/tokenizer.model
nousresearch-hermes-2-pr-1418-v6-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/nousresearch-hermes-2-pr-1418-v6/tokenizer.json
nousresearch-hermes-2-pr-1418-v6-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/nousresearch-hermes-2-pr-1418-v6/flywheel_model.0.safetensors
Job nousresearch-hermes-2-pr-1418-v6-mkmlizer completed after 62.53s with status: succeeded
Stopping job with name nousresearch-hermes-2-pr-1418-v6-mkmlizer
Pipeline stage MKMLizer completed in 62.79s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.07s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service nousresearch-hermes-2-pr-1418-v6
Waiting for inference service nousresearch-hermes-2-pr-1418-v6 to be ready
Inference service nousresearch-hermes-2-pr-1418-v6 ready after 220.4873206615448s
Pipeline stage MKMLDeployer completed in 220.71s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.7640471458435059s
Received healthy response to inference request in 1.0826504230499268s
Received healthy response to inference request in 0.6079413890838623s
Received healthy response to inference request in 1.4219329357147217s
Received healthy response to inference request in 0.9505007266998291s
5 requests
0 failed requests
5th percentile: 0.6764532566070557
10th percentile: 0.7449651241302491
20th percentile: 0.8819888591766357
30th percentile: 0.9769306659698487
40th percentile: 1.0297905445098876
50th percentile: 1.0826504230499268
60th percentile: 1.2183634281158446
70th percentile: 1.3540764331817625
80th percentile: 1.4903557777404786
90th percentile: 1.627201461791992
95th percentile: 1.695624303817749
99th percentile: 1.7503625774383544
mean time: 1.165414524078369
Pipeline stage StressChecker completed in 7.22s
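The percentile figures above can be reproduced from the five response times using linear interpolation between order statistics (the same scheme as numpy's default percentile method); a sketch:

```python
# Reproduce the StressChecker percentiles from the five response times above.
times = sorted([
    1.7640471458435059,
    1.0826504230499268,
    0.6079413890838623,
    1.4219329357147217,
    0.9505007266998291,
])

def percentile(xs, p):
    """Percentile by linear interpolation between sorted order statistics."""
    k = (len(xs) - 1) * p / 100.0   # fractional rank
    f = int(k)                      # lower neighbour index
    c = min(f + 1, len(xs) - 1)     # upper neighbour index
    return xs[f] + (xs[c] - xs[f]) * (k - f)

print(percentile(times, 50))    # 1.0826504230499268, matching the log
print(sum(times) / len(times))  # ~1.1654, matching the logged mean time
```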
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 3.51s
Shutdown handler de-registered
nousresearch-hermes-2-pr_1418_v6 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.12s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.11s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service nousresearch-hermes-2-pr-1418-v6-profiler
Waiting for inference service nousresearch-hermes-2-pr-1418-v6-profiler to be ready
Inference service nousresearch-hermes-2-pr-1418-v6-profiler ready after 220.53125762939453s
Pipeline stage MKMLProfilerDeployer completed in 220.87s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/nousresearch-hermes-3320e97553d0c85cb2e087061f1818db-deplozwrl9:/code/chaiverse_profiler_1727372241 --namespace tenant-chaiml-guanaco
kubectl exec -it nousresearch-hermes-3320e97553d0c85cb2e087061f1818db-deplozwrl9 --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1727372241 && python profiles.py profile --best_of_n 4 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 1024 --output_tokens 64 --summary /code/chaiverse_profiler_1727372241/summary.json'
kubectl exec -it nousresearch-hermes-3320e97553d0c85cb2e087061f1818db-deplozwrl9 --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1727372241/summary.json'
Pipeline stage MKMLProfilerRunner completed in 485.46s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service nousresearch-hermes-2-pr-1418-v6-profiler is running
Tearing down inference service nousresearch-hermes-2-pr-1418-v6-profiler
Service nousresearch-hermes-2-pr-1418-v6-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 2.37s
Shutdown handler de-registered
nousresearch-hermes-2-pr_1418_v6 status is now inactive due to auto deactivation removed underperforming models
Deleting key mistralai-mistral-nemo-9330-v124/tokenizer.json from bucket guanaco-mkml-models
Shutdown handler de-registered
mistralai-mistral-nemo_9330_v122 status is now torndown due to DeploymentManager action
Service mistralai-mistral-nemo-9330-v130 has been torndown
Pipeline stage MKMLDeleter completed in 24.12s
Running pipeline stage MKMLDeleter
Tearing down inference service mistralai-mistral-nemo-9330-v131
run pipeline stage %s
Deleting key mistralai-mistral-nemo-9330-v125/special_tokens_map.json from bucket guanaco-mkml-models
run pipeline %s
Deleting key mistralai-mistral-nemo-9330-v127/config.json from bucket guanaco-mkml-models
Shutdown handler not registered because Python interpreter is not running in the main thread
Running pipeline stage MKMLModelDeleter
admin requested tearing down of nousresearch-hermes-2-pr_1418_v6
Deleting key mistralai-mistral-nemo-9330-v124/tokenizer_config.json from bucket guanaco-mkml-models
mistralai-mistral-nemo_9330_v123 status is now torndown due to DeploymentManager action
Pipeline stage MKMLDeleter completed in 24.79s
run pipeline stage %s
Deleting key mistralai-mistral-nemo-9330-v126/special_tokens_map.json from bucket guanaco-mkml-models
Checking if service mistralai-mistral-nemo-9330-v133 is running
Running pipeline stage MKMLModelDeleter
Cleaning model data from model cache
Deleting key mistralai-mistral-nemo-9330-v125/tokenizer.json from bucket guanaco-mkml-models
Deleting key mistralai-mistral-nemo-9330-v126/tokenizer.json from bucket guanaco-mkml-models
Cleaning model data from S3
run pipeline stage %s
Running pipeline stage MKMLDeleter
Checking if service mistralai-mistral-nemo-9330-v132 is running
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of nousresearch-hermes-2-pr_1418_v6
Deleting key mistralai-mistral-nemo-9330-v127/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Cleaning model data from S3
Deleting key mistralai-mistral-nemo-9330-v125/tokenizer_config.json from bucket guanaco-mkml-models
Skipping teardown as no inference service was found
Deleting key mistralai-mistral-nemo-9330-v126/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Running pipeline stage MKMLModelDeleter
Checking if service mistralai-mistral-nemo-9330-v133 is running
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of rirv938-llama-8b-big-ret_2923_v1
Tearing down inference service mistralai-mistral-nemo-9330-v132
Cleaning model data from model cache
Deleting key mistralai-mistral-nemo-9330-v127/special_tokens_map.json from bucket guanaco-mkml-models
Pipeline stage MKMLModelDeleter completed in 34.22s
Pipeline stage MKMLDeleter completed in 19.25s
Pipeline stage MKMLModelDeleter completed in 28.69s
Deleting key mistralai-mistral-nemo-9330-v128/config.json from bucket guanaco-mkml-models
Cleaning model data from S3
Running pipeline stage MKMLDeleter
Checking if service mistralai-mistral-nemo-9330-v134 is running
run pipeline stage %s
Tearing down inference service mistralai-mistral-nemo-9330-v133
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of zonemercy-elite-edit-v1-1e5_v1
Service mistralai-mistral-nemo-9330-v132 has been torndown
Deleting key mistralai-mistral-nemo-9330-v129/config.json from bucket guanaco-mkml-models
Deleting key mistralai-mistral-nemo-9330-v127/tokenizer.json from bucket guanaco-mkml-models
Shutdown handler de-registered
Cleaning model data from S3
mistralai-mistral-nemo_9330_v125 status is now torndown due to DeploymentManager action
Shutdown handler de-registered
run pipeline stage %s
run pipeline stage %s
Running pipeline stage MKMLDeleter
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline stage %s
admin requested tearing down of nousresearch-hermes-2-pr_1418_v6
Deleting key mistralai-mistral-nemo-9330-v127/tokenizer.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
mistralai-mistral-nemo_9330_v126 status is now torndown due to DeploymentManager action
Pipeline stage MKMLDeleter completed in 11.49s
run pipeline %s
Running pipeline stage MKMLModelDeleter
Shutdown handler not registered because Python interpreter is not running in the main thread
Checking if service mistralai-mistral-nemo-9330-v133 is running
Deleting key mistralai-mistral-nemo-9330-v127/tokenizer_config.json from bucket guanaco-mkml-models
admin requested tearing down of rirv938-llama-8b-big-ret_2923_v1
mistralai-mistral-small_5341_v28 status is now torndown due to DeploymentManager action
nousresearch-hermes-2-pr_1418_v6 status is now torndown due to DeploymentManager action