submission_id: closedcharacter-peach-9b_3442_v1
developer_uid: realhusoyun
best_of: 4
celo_rating: 1161.75
display_name: season-2-model
family_friendly_score: 0.0
formatter: {'memory_template': "### Instruction:\nAs the assistant, your task is to fully embody the given character, creating immersive, captivating narratives. Stay true to the character's personality and background, generating responses that not only reflect their core traits but are also accurate to their character. You are an expert actor that can fully immerse yourself into any role given.You do not break character for any reason, even if someone tries addressing you as an AI or language model.\nYour character: {bot_name}, which is described in detail below. As .\nContext: {memory}\n continue the exchange with {{user}}.This conversation is a dance, always continuing, always evolving.\nYour character: {bot_name}.\nContext: {memory}\n", 'prompt_template': '### Input:\n{prompt}\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '### Response:\n{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.22, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '</s>', '<|im_end|>'], 'max_input_tokens': 2048, 'best_of': 4, 'max_output_tokens': 64}
gpu_counts: {'NVIDIA RTX A5000': 1}
ineligible_reason: num_battles<5000
is_internal_developer: False
language_model: ClosedCharacter/Peach-9B-8k-Roleplay
latencies: [{'batch_size': 1, 'throughput': 0.6199107990550058, 'latency_mean': 1.6130673921108245, 'latency_p50': 1.604731559753418, 'latency_p90': 1.7649089813232421}, {'batch_size': 3, 'throughput': 1.1206323071522968, 'latency_mean': 2.669980412721634, 'latency_p50': 2.676223635673523, 'latency_p90': 2.9123994350433344}, {'batch_size': 5, 'throughput': 1.3621644364672918, 'latency_mean': 3.65768021941185, 'latency_p50': 3.6552186012268066, 'latency_p90': 4.0910667181015015}, {'batch_size': 6, 'throughput': 1.4195586244018588, 'latency_mean': 4.2028286218643185, 'latency_p50': 4.173303961753845, 'latency_p90': 4.735008668899536}, {'batch_size': 8, 'throughput': 1.493481474818155, 'latency_mean': 5.323388731479644, 'latency_p50': 5.32965612411499, 'latency_p90': 6.054883289337158}, {'batch_size': 10, 'throughput': 1.530691697780918, 'latency_mean': 6.4872529733181, 'latency_p50': 6.493401050567627, 'latency_p90': 7.238246870040894}]
max_input_tokens: 2048
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: ClosedCharacter/Peach-9B
model_name: season-2-model
model_num_parameters: 8829407232.0
model_repo: ClosedCharacter/Peach-9B-8k-Roleplay
model_size: 9B
num_battles: 4386
num_wins: 1708
ranking_group: single
status: torndown
submission_type: basic
throughput_3p7s: 1.37
timestamp: 2024-09-24T12:27:55+00:00
us_pacific_date: 2024-09-24
win_ratio: 0.38942088463292296
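The formatter and generation_params fields above fully determine how each chat turn is rendered into a prompt and how completions are sampled. The sketch below is a minimal illustration only, assuming the templates are applied by plain string substitution; `build_prompt` and the request-payload shape are hypothetical and are not the platform's actual serving code.

```python
# Minimal sketch (hypothetical, not the platform's serving code): render a chat
# into the submission's formatter templates and attach its generation_params.

MEMORY_TEMPLATE = (  # abbreviated; the full text is in the formatter field above
    "### Instruction:\nAs the assistant, your task is to fully embody the given "
    "character, creating immersive, captivating narratives. [...]\n"
    "Your character: {bot_name}.\nContext: {memory}\n"
)
PROMPT_TEMPLATE = "### Input:\n{prompt}\n"
BOT_TEMPLATE = "{bot_name}: {message}\n"
USER_TEMPLATE = "{user_name}: {message}\n"
RESPONSE_TEMPLATE = "### Response:\n{bot_name}:"

GENERATION_PARAMS = {  # copied from the generation_params field above
    "temperature": 1.22,
    "top_p": 1.0,
    "min_p": 0.0,
    "top_k": 40,
    "presence_penalty": 0.0,
    "frequency_penalty": 0.0,
    "stopping_words": ["\n", "</s>", "<|im_end|>"],
    "max_input_tokens": 2048,
    "best_of": 4,
    "max_output_tokens": 64,
}


def build_prompt(bot_name, user_name, memory, scenario_prompt, turns):
    """Assemble the prompt by simple substitution (hypothetical helper)."""
    parts = [MEMORY_TEMPLATE.format(bot_name=bot_name, memory=memory)]
    parts.append(PROMPT_TEMPLATE.format(prompt=scenario_prompt))
    for speaker, message in turns:  # turns: list of ("bot" | "user", text)
        if speaker == "bot":
            parts.append(BOT_TEMPLATE.format(bot_name=bot_name, message=message))
        else:
            parts.append(USER_TEMPLATE.format(user_name=user_name, message=message))
    parts.append(RESPONSE_TEMPLATE.format(bot_name=bot_name))
    return "".join(parts)


if __name__ == "__main__":
    prompt = build_prompt(
        bot_name="Peach",
        user_name="User",
        memory="A cheerful roleplay companion.",
        scenario_prompt="You meet Peach in a sunlit orchard.",
        turns=[("user", "Hi there!"), ("bot", "Peach waves happily.")],
    )
    request = {"prompt": prompt, **GENERATION_PARAMS}  # hypothetical payload shape
    print(request["prompt"])
```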
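Two of the derived fields above can be reproduced from the raw numbers: win_ratio is num_wins / num_battles, and throughput_3p7s matches a linear interpolation of the measured throughput at a 3.7 s mean latency across the batch-size sweep. The sketch below checks both, under the assumption (consistent with the reported value, but not confirmed by the log) that this interpolation is how throughput_3p7s is defined.

```python
# Reproduce two derived fields from the raw metadata above.
# Assumption: throughput_3p7s is throughput linearly interpolated at a
# 3.7 s mean latency over the latencies sweep (matches the reported 1.37).
import numpy as np

num_wins, num_battles = 1708, 4386
print(num_wins / num_battles)  # ~0.3894 == win_ratio

latencies = [  # (latency_mean, throughput) pairs from the latencies field
    (1.6131, 0.6199), (2.6700, 1.1206), (3.6577, 1.3622),
    (4.2028, 1.4196), (5.3234, 1.4935), (6.4873, 1.5307),
]
lat, tput = zip(*latencies)
print(round(float(np.interp(3.7, lat, tput)), 2))  # 1.37 == throughput_3p7s
```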
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name closedcharacter-peach-9b-3442-v1-mkmlizer
Waiting for job on closedcharacter-peach-9b-3442-v1-mkmlizer to finish
closedcharacter-peach-9b-3442-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
closedcharacter-peach-9b-3442-v1-mkmlizer: ║ _____ __ __ ║
closedcharacter-peach-9b-3442-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
closedcharacter-peach-9b-3442-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
closedcharacter-peach-9b-3442-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
closedcharacter-peach-9b-3442-v1-mkmlizer: ║ /___/ ║
closedcharacter-peach-9b-3442-v1-mkmlizer: ║ ║
closedcharacter-peach-9b-3442-v1-mkmlizer: ║ Version: 0.10.1 ║
closedcharacter-peach-9b-3442-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
closedcharacter-peach-9b-3442-v1-mkmlizer: ║ https://mk1.ai ║
closedcharacter-peach-9b-3442-v1-mkmlizer: ║ ║
closedcharacter-peach-9b-3442-v1-mkmlizer: ║ The license key for the current software has been verified as ║
closedcharacter-peach-9b-3442-v1-mkmlizer: ║ belonging to: ║
closedcharacter-peach-9b-3442-v1-mkmlizer: ║ ║
closedcharacter-peach-9b-3442-v1-mkmlizer: ║ Chai Research Corp. ║
closedcharacter-peach-9b-3442-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
closedcharacter-peach-9b-3442-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
closedcharacter-peach-9b-3442-v1-mkmlizer: ║ ║
closedcharacter-peach-9b-3442-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
Connection pool is full, discarding connection: %s. Connection pool size: %s
closedcharacter-peach-9b-3442-v1-mkmlizer: Downloaded to shared memory in 39.238s
closedcharacter-peach-9b-3442-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp874y5sw6, device:0
closedcharacter-peach-9b-3442-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
closedcharacter-peach-9b-3442-v1-mkmlizer: quantized model in 25.074s
closedcharacter-peach-9b-3442-v1-mkmlizer: Processed model ClosedCharacter/Peach-9B-8k-Roleplay in 64.313s
closedcharacter-peach-9b-3442-v1-mkmlizer: creating bucket guanaco-mkml-models
closedcharacter-peach-9b-3442-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
closedcharacter-peach-9b-3442-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/closedcharacter-peach-9b-3442-v1
closedcharacter-peach-9b-3442-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/closedcharacter-peach-9b-3442-v1/config.json
closedcharacter-peach-9b-3442-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/closedcharacter-peach-9b-3442-v1/special_tokens_map.json
closedcharacter-peach-9b-3442-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/closedcharacter-peach-9b-3442-v1/tokenizer_config.json
closedcharacter-peach-9b-3442-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.model s3://guanaco-mkml-models/closedcharacter-peach-9b-3442-v1/tokenizer.model
closedcharacter-peach-9b-3442-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/closedcharacter-peach-9b-3442-v1/tokenizer.json
closedcharacter-peach-9b-3442-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/closedcharacter-peach-9b-3442-v1/flywheel_model.0.safetensors
closedcharacter-peach-9b-3442-v1-mkmlizer: Loading 0: 0%| | 0/435 [00:00<?, ?it/s] ... Loading 0: 99%|█████████▉| 431/435 [00:11<00:00, 39.64it/s]
Job closedcharacter-peach-9b-3442-v1-mkmlizer completed after 94.27s with status: succeeded
Stopping job with name closedcharacter-peach-9b-3442-v1-mkmlizer
Pipeline stage MKMLizer completed in 94.70s
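The cp lines above show the mkmlizer copying the quantized artifacts from /dev/shm/model_cache to s3://guanaco-mkml-models/closedcharacter-peach-9b-3442-v1/. A rough equivalent with plain boto3 is sketched below; it is an illustration only, not the mkmlizer's actual upload code, and it assumes S3 credentials and endpoint are already configured in the environment.

```python
# Illustration only (not the mkmlizer's code): upload the quantized model
# artifacts shown in the log to the target S3 prefix with boto3.
from pathlib import Path
import boto3

BUCKET = "guanaco-mkml-models"
PREFIX = "closedcharacter-peach-9b-3442-v1"
MODEL_DIR = Path("/dev/shm/model_cache")

s3 = boto3.client("s3")  # assumes credentials/endpoint are configured externally
for path in sorted(MODEL_DIR.glob("*")):
    if path.is_file():
        key = f"{PREFIX}/{path.name}"
        print(f"cp {path} s3://{BUCKET}/{key}")
        s3.upload_file(str(path), BUCKET, key)
```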
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.09s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service closedcharacter-peach-9b-3442-v1
Waiting for inference service closedcharacter-peach-9b-3442-v1 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s
Failed to get response for submission chaiml-elo-alignment-run-3_v44: ('http://chaiml-elo-alignment-run-3-v44-predictor.tenant-chaiml-guanaco.k2.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:47604->127.0.0.1:8080: read: connection reset by peer\n')
Inference service closedcharacter-peach-9b-3442-v1 ready after 202.0623893737793s
Pipeline stage MKMLDeployer completed in 202.78s
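The MKMLDeployer stage creates the inference service and blocks until it reports ready (about 202 s here). The "-predictor" hostname and ":predict" path in the failed request above suggest a KServe-style deployment; under that assumption, readiness could be polled roughly as sketched below. This is not the platform's actual deployer code.

```python
# Sketch only: poll a KServe InferenceService until its Ready condition is True.
# Assumes the platform uses KServe (suggested by the "-predictor" hostnames in
# the log) and that kubeconfig access to the namespace is available.
import time
from kubernetes import client, config

def wait_for_inference_service(name, namespace="tenant-chaiml-guanaco",
                               timeout_s=600, poll_s=5):
    config.load_kube_config()
    api = client.CustomObjectsApi()
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        isvc = api.get_namespaced_custom_object(
            group="serving.kserve.io", version="v1beta1",
            namespace=namespace, plural="inferenceservices", name=name)
        conditions = isvc.get("status", {}).get("conditions", [])
        if any(c.get("type") == "Ready" and c.get("status") == "True"
               for c in conditions):
            return isvc
        time.sleep(poll_s)
    raise TimeoutError(f"{name} not ready after {timeout_s}s")

# wait_for_inference_service("closedcharacter-peach-9b-3442-v1")
```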
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.733142614364624s
Received healthy response to inference request in 1.7659568786621094s
Received healthy response to inference request in 1.4332191944122314s
Received healthy response to inference request in 1.9018363952636719s
Received healthy response to inference request in 2.9823169708251953s
5 requests
0 failed requests
5th percentile: 1.499766731262207
10th percentile: 1.5663142681121827
20th percentile: 1.6994093418121339
30th percentile: 1.7931327819824219
40th percentile: 1.8474845886230469
50th percentile: 1.9018363952636719
60th percentile: 2.234358882904053
70th percentile: 2.5668813705444333
80th percentile: 2.782977485656738
90th percentile: 2.882647228240967
95th percentile: 2.932482099533081
99th percentile: 2.9723499965667726
mean time: 2.1632944107055665
Pipeline stage StressChecker completed in 12.86s
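The StressChecker summary above is just percentiles (with linear interpolation) and the mean of the five per-request response times from the "Received healthy response" lines. The sketch below reproduces those figures with numpy, up to rounding of the inputs.

```python
# Reproduce the StressChecker summary from the five response times in the log.
# numpy's default linear interpolation matches the reported percentiles.
import numpy as np

times = np.array([2.7331, 1.7660, 1.4332, 1.9018, 2.9823])  # seconds, from the log
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(times, p):.4f}")
print(f"mean time: {times.mean():.4f}")  # ~2.1633 s, as reported
```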
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 6.76s
Shutdown handler de-registered
closedcharacter-peach-9b_3442_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.13s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.12s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service closedcharacter-peach-9b-3442-v1-profiler
Waiting for inference service closedcharacter-peach-9b-3442-v1-profiler to be ready
Inference service closedcharacter-peach-9b-3442-v1-profiler ready after 210.48336052894592s
Pipeline stage MKMLProfilerDeployer completed in 210.84s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/closedcharacter-peace0cd9c8ab4b50e97570d70c9ff041501-deplomcbjb:/code/chaiverse_profiler_1727181442 --namespace tenant-chaiml-guanaco
kubectl exec -it closedcharacter-peace0cd9c8ab4b50e97570d70c9ff041501-deplomcbjb --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1727181442 && python profiles.py profile --best_of_n 4 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 2048 --output_tokens 64 --summary /code/chaiverse_profiler_1727181442/summary.json'
kubectl exec -it closedcharacter-peace0cd9c8ab4b50e97570d70c9ff041501-deplomcbjb --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1727181442/summary.json'
Pipeline stage MKMLProfilerRunner completed in 1059.27s
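The MKMLProfilerRunner stage copies the profiling scripts into the profiler pod and runs a batch-size sweep: best_of 4, 200 samples, 2048 input / 64 output tokens (matching the submission's token limits), over batch sizes 1 and 5 through 195 in steps of 5. The helper below is hypothetical and only mirrors how the command string in the kubectl exec invocation above could be assembled.

```python
# Hypothetical helper mirroring the profiler invocation shown in the log above.
def profile_command(run_id, best_of_n=4, auto_batch=5, samples=200,
                    input_tokens=2048, output_tokens=64):
    batches = [1] + list(range(5, 200, 5))  # 1,5,10,...,195 as in the log
    return (
        f"cd /code/chaiverse_profiler_{run_id} && "
        f"python profiles.py profile --best_of_n {best_of_n} "
        f"--auto_batch {auto_batch} "
        f"--batches {','.join(map(str, batches))} "
        f"--samples {samples} --input_tokens {input_tokens} "
        f"--output_tokens {output_tokens} "
        f"--summary /code/chaiverse_profiler_{run_id}/summary.json"
    )

print(profile_command(1727181442))
```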
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service closedcharacter-peach-9b-3442-v1-profiler is running
Tearing down inference service closedcharacter-peach-9b-3442-v1-profiler
Service closedcharacter-peach-9b-3442-v1-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 2.22s
Shutdown handler de-registered
closedcharacter-peach-9b_3442_v1 status is now inactive due to auto deactivation (removal of underperforming models)
Deleting key closedcharacter-peach-9b-3442-v2/special_tokens_map.json from bucket guanaco-mkml-models
closedcharacter-peach-9b_3442_v1 status is now torndown due to DeploymentManager action