submission_id: cycy233-l3-ba-p-v0-c2_v2
developer_uid: shiroe40
alignment_samples: 11139
alignment_score: -0.552584921602075
best_of: 16
celo_rating: 1247.02
display_name: auto
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 0.9, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['<|end_header_id|>', '<|eot_id|>'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
is_internal_developer: False
language_model: cycy233/L3-ba-p-v0-c2
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: cycy233/L3-ba-p-v0-c2
model_name: auto
model_num_parameters: 8030261248.0
model_repo: cycy233/L3-ba-p-v0-c2
model_size: 8B
num_battles: 11139
num_wins: 5659
propriety_score: 0.7104984093319194
propriety_total_count: 943.0
ranking_group: single
status: inactive
submission_type: basic
timestamp: 2024-09-04T03:37:27+00:00
us_pacific_date: 2024-09-03
win_ratio: 0.5080348325702487
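The formatter entry above defines how a conversation is flattened into a Llama-3-style prompt: the persona memory goes into a system header, each chat turn into a user or assistant header, and a trailing response header cues the bot's reply. The sketch below applies those templates with plain str.format; the assembly order and the sample persona/messages are assumptions for illustration, not taken from the serving code.

```python
# Hypothetical illustration: assembling a prompt from the formatter templates above.
formatter = {
    "memory_template": "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n",
    "prompt_template": "{prompt}<|eot_id|>",
    "bot_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>",
    "user_template": "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>",
    "response_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """Concatenate templates in order: memory, prompt, chat history, response header."""
    parts = [
        formatter["memory_template"].format(bot_name=bot_name, memory=memory),
        formatter["prompt_template"].format(prompt=prompt),
    ]
    for role, message in turns:  # role is "user" or "bot"
        template = formatter["user_template"] if role == "user" else formatter["bot_template"]
        parts.append(template.format(bot_name=bot_name, user_name=user_name, message=message))
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)

# Invented example conversation, purely for illustration.
print(build_prompt("Aria", "User", "A cheerful tour guide.", "Stay in character.",
                   [("user", "Hi!"), ("bot", "Welcome aboard!")]))
```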
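The generation_params entry pairs standard temperature/top-p/top-k sampling settings with best_of: 16, meaning sixteen candidate completions are drawn per request and one is kept. The log does not show how the winning candidate is chosen, so the sketch below uses placeholder generate_one and score callables (both hypothetical) purely to illustrate the best-of-N pattern with the parameter values reported above.

```python
# Hypothetical sketch of best-of-N sampling with the generation_params above.
# `generate_one` and `score` stand in for the real model call and ranking step,
# neither of which is shown in this log.
from typing import Callable, List

GENERATION_PARAMS = {
    "temperature": 1.0, "top_p": 0.9, "min_p": 0.05, "top_k": 80,
    "presence_penalty": 0.0, "frequency_penalty": 0.0,
    "stopping_words": ["<|end_header_id|>", "<|eot_id|>"],
    "max_input_tokens": 512, "max_output_tokens": 64, "best_of": 16,
}

def best_of_n(prompt: str,
              generate_one: Callable[[str, dict], str],
              score: Callable[[str, str], float]) -> str:
    """Sample best_of candidates independently and keep the highest-scoring one."""
    candidates: List[str] = [
        generate_one(prompt, GENERATION_PARAMS)
        for _ in range(GENERATION_PARAMS["best_of"])
    ]
    return max(candidates, key=lambda completion: score(prompt, completion))
```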
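The battle statistics are internally consistent: win_ratio is simply num_wins divided by num_battles, as the one-line check below confirms.

```python
# Sanity check of the derived statistic reported above.
num_wins, num_battles = 5659, 11139
print(num_wins / num_battles)  # 0.5080348325702487, matching win_ratio
```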
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name cycy233-l3-ba-p-v0-c2-v2-mkmlizer
Waiting for job on cycy233-l3-ba-p-v0-c2-v2-mkmlizer to finish
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ║ _____ __ __ ║
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ║ /___/ ║
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ║ ║
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ║ Version: 0.10.1 ║
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ║ https://mk1.ai ║
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ║ ║
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ║ The license key for the current software has been verified as ║
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ║ belonging to: ║
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ║ ║
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ║ Chai Research Corp. ║
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ║ ║
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: Downloaded to shared memory in 47.614s
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmptwokbocd, device:0
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: Saving flywheel model at /dev/shm/model_cache
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: quantized model in 26.081s
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: Processed model cycy233/L3-ba-p-v0-c2 in 73.695s
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: creating bucket guanaco-mkml-models
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/cycy233-l3-ba-p-v0-c2-v2
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/cycy233-l3-ba-p-v0-c2-v2/special_tokens_map.json
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/cycy233-l3-ba-p-v0-c2-v2/config.json
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/cycy233-l3-ba-p-v0-c2-v2/tokenizer_config.json
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/cycy233-l3-ba-p-v0-c2-v2/tokenizer.json
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/cycy233-l3-ba-p-v0-c2-v2/flywheel_model.0.safetensors
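The mkmlizer serializes the converted model under /dev/shm/model_cache and then copies each file into the guanaco-mkml-models bucket, as the cp lines above show. The sketch below mirrors that upload step with boto3; bucket, prefix, and file names are taken from the log, but the use of boto3 (rather than the tool's internal S3 client) is an assumption for illustration.

```python
# Hedged sketch: mirroring the "cp ... s3://..." lines above with boto3.
# The real mkmlizer may use a different S3 client; this is for illustration only.
import os
import boto3

BUCKET = "guanaco-mkml-models"
PREFIX = "cycy233-l3-ba-p-v0-c2-v2"
LOCAL_DIR = "/dev/shm/model_cache"

s3 = boto3.client("s3")
for name in ["special_tokens_map.json", "config.json", "tokenizer_config.json",
             "tokenizer.json", "flywheel_model.0.safetensors"]:
    s3.upload_file(os.path.join(LOCAL_DIR, name), BUCKET, f"{PREFIX}/{name}")
```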
cycy233-l3-ba-p-v0-c2-v2-mkmlizer: Loading 0: 100%|█████████▉| 290/291 [00:11<00:00, 3.32it/s]
Job cycy233-l3-ba-p-v0-c2-v2-mkmlizer completed after 94.1s with status: succeeded
Stopping job with name cycy233-l3-ba-p-v0-c2-v2-mkmlizer
Pipeline stage MKMLizer completed in 95.21s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.09s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service cycy233-l3-ba-p-v0-c2-v2
Waiting for inference service cycy233-l3-ba-p-v0-c2-v2 to be ready
Failed to get response for submission mistralai-mixtral-8x7b_3473_v131: ('http://mistralai-mixtral-8x7b-3473-v131-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:38608->127.0.0.1:8080: read: connection reset by peer\n')
Connection pool is full, discarding connection: %s. Connection pool size: %s
Inference service cycy233-l3-ba-p-v0-c2-v2 ready after 141.6647777557373s
Pipeline stage MKMLDeployer completed in 142.16s
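The "Waiting for inference service ... to be ready" step above took about 142 s. A readiness wait of this kind is typically a polling loop against the service's health endpoint with a timeout; the sketch below shows one way to implement it, with the endpoint URL, timeout, and interval invented for illustration rather than taken from the log.

```python
# Hypothetical readiness poll; endpoint URL and timing values are illustrative only.
import time
import requests

def wait_until_ready(health_url: str, timeout_s: float = 600.0, interval_s: float = 5.0) -> float:
    """Poll a health endpoint until it returns HTTP 200, and report how long that took."""
    start = time.monotonic()
    while time.monotonic() - start < timeout_s:
        try:
            if requests.get(health_url, timeout=5).status_code == 200:
                return time.monotonic() - start
        except requests.RequestException:
            pass  # service not up yet; keep polling
        time.sleep(interval_s)
    raise TimeoutError(f"{health_url} not ready after {timeout_s}s")
```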
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.3179774284362793s
Received healthy response to inference request in 1.6601426601409912s
Received healthy response to inference request in 1.7771258354187012s
Received healthy response to inference request in 1.7651481628417969s
Received healthy response to inference request in 1.9934055805206299s
5 requests
0 failed requests
5th percentile: 1.6811437606811523
10th percentile: 1.7021448612213135
20th percentile: 1.7441470623016357
30th percentile: 1.7675436973571776
40th percentile: 1.7723347663879394
50th percentile: 1.7771258354187012
60th percentile: 1.8636377334594727
70th percentile: 1.950149631500244
80th percentile: 2.0583199501037597
90th percentile: 2.1881486892700197
95th percentile: 2.2530630588531495
99th percentile: 2.3049945545196535
mean time: 1.9027599334716796
Pipeline stage StressChecker completed in 10.21s
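The StressChecker summary above is a plain percentile/mean calculation over the five healthy-request latencies. NumPy's default linear interpolation reproduces the reported figures, as the snippet below shows (NumPy here is an assumption for illustration; the checker's own implementation is not shown in the log).

```python
# Reproducing the StressChecker statistics from the five latencies above.
import numpy as np

latencies = np.array([2.3179774284362793, 1.6601426601409912, 1.7771258354187012,
                      1.7651481628417969, 1.9934055805206299])

print("mean time:", latencies.mean())  # 1.9027599334716797
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    # np.percentile uses linear interpolation by default, matching the log's values.
    print(f"{p}th percentile:", np.percentile(latencies, p))
```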
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
starting trigger_guanaco_pipeline %s
Pipeline stage TriggerMKMLProfilingPipeline completed in 4.19s
cycy233-l3-ba-p-v0-c2_v2 status is now deployed due to DeploymentManager action
cycy233-l3-ba-p-v0-c2_v2 status is now inactive due to auto deactivation of underperforming models

Usage Metrics

Latency Metrics