submission_id: sao10k-l3-8b-niitama-v1_v3
developer_uid: Bbbrun0
alignment_samples: 10406
alignment_score: -1.4295039907770835
best_of: 16
celo_rating: 1253.79
display_name: test
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.4, 'top_p': 1.0, 'min_p': 0.2, 'top_k': 50, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|end_header_id|>', '<|eot_id|>', '\n\n{user_name}'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
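The formatter above is a set of plain string templates. A minimal sketch of how they could be assembled into a single Llama-3-style prompt (the concatenation order and the `build_prompt` helper are illustrative assumptions, not the platform's actual serving code):

```python
# Templates copied from the submission's formatter field above.
formatter = {
    'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n",
    'prompt_template': '{prompt}<|eot_id|>',
    'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>',
    'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>',
    'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:',
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    """Assumed assembly: memory, then prompt, then chat turns, then the
    open-ended response header the model completes."""
    out = formatter['memory_template'].format(bot_name=bot_name, memory=memory)
    out += formatter['prompt_template'].format(prompt=prompt)
    for role, message in turns:  # turns: list of ('user' | 'bot', message)
        tpl = formatter['user_template'] if role == 'user' else formatter['bot_template']
        out += tpl.format(bot_name=bot_name, user_name=user_name, message=message)
    return out + formatter['response_template'].format(bot_name=bot_name)

print(build_prompt('Niitama', 'User', 'a terse test persona',
                   'Stay in character.', [('user', 'Hi there')]))
```

Note that the stopping_words list mirrors the template delimiters (`<|eot_id|>`, `<|end_header_id|>`, a bare newline), so generation halts as soon as the model tries to open the next turn.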
is_internal_developer: False
language_model: Sao10K/L3-8B-Niitama-v1
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: Sao10K/L3-8B-Niitama-v1
model_name: test
model_num_parameters: 8030261248.0
model_repo: Sao10K/L3-8B-Niitama-v1
model_size: 8B
num_battles: 10406
num_wins: 5483
propriety_score: 0.700445434298441
propriety_total_count: 898.0
ranking_group: single
status: inactive
submission_type: basic
timestamp: 2024-08-28T03:39:16+00:00
us_pacific_date: 2024-08-27
win_ratio: 0.5269075533346147
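Several of the summary fields above are simple ratios of the raw counts; a quick check, assuming (not stated in the record) that propriety_score is also a per-sample pass ratio:

```python
# Figures from the submission record above.
num_battles = 10406
num_wins = 5483

# win_ratio is exactly num_wins / num_battles.
win_ratio = num_wins / num_battles
print(win_ratio)  # 0.5269075533346147

# propriety_score is consistent with a count ratio over its 898 samples:
# 0.700445434298441 * 898 lands on an integer (~629 passing samples).
# That interpretation is an inference, not something the record states.
print(round(0.700445434298441 * 898))  # 629
```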
Running pipeline stage MKMLizer
Starting job with name sao10k-l3-8b-niitama-v1-v3-mkmlizer
Waiting for job on sao10k-l3-8b-niitama-v1-v3-mkmlizer to finish
Stopping job with name sao10k-l3-8b-niitama-v1-v3-mkmlizer
%s, retrying in %s seconds...
Starting job with name sao10k-l3-8b-niitama-v1-v3-mkmlizer
Waiting for job on sao10k-l3-8b-niitama-v1-v3-mkmlizer to finish
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ║ _____ __ __ ║
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ║ /___/ ║
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ║ ║
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ║ Version: 0.10.1 ║
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ║ https://mk1.ai ║
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ║ ║
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ║ The license key for the current software has been verified as ║
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ║ belonging to: ║
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ║ ║
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ║ Chai Research Corp. ║
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ║ ║
sao10k-l3-8b-niitama-v1-v3-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
sao10k-l3-8b-niitama-v1-v3-mkmlizer: Downloaded to shared memory in 40.923s
sao10k-l3-8b-niitama-v1-v3-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp020l1az0, device:0
sao10k-l3-8b-niitama-v1-v3-mkmlizer: Saving flywheel model at /dev/shm/model_cache
sao10k-l3-8b-niitama-v1-v3-mkmlizer: creating bucket guanaco-mkml-models
sao10k-l3-8b-niitama-v1-v3-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
sao10k-l3-8b-niitama-v1-v3-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/sao10k-l3-8b-niitama-v1-v3
sao10k-l3-8b-niitama-v1-v3-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/sao10k-l3-8b-niitama-v1-v3/config.json
sao10k-l3-8b-niitama-v1-v3-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/sao10k-l3-8b-niitama-v1-v3/special_tokens_map.json
sao10k-l3-8b-niitama-v1-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/sao10k-l3-8b-niitama-v1-v3/tokenizer_config.json
sao10k-l3-8b-niitama-v1-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/sao10k-l3-8b-niitama-v1-v3/tokenizer.json
sao10k-l3-8b-niitama-v1-v3-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/sao10k-l3-8b-niitama-v1-v3/flywheel_model.0.safetensors
sao10k-l3-8b-niitama-v1-v3-mkmlizer: Loading 0: 0%| | 0/291 [00:00<?, ?it/s] ... Loading 0: 98%|█████████▊| 285/291 [00:11<00:00, 37.67it/s]
Job sao10k-l3-8b-niitama-v1-v3-mkmlizer completed after 94.22s with status: succeeded
Stopping job with name sao10k-l3-8b-niitama-v1-v3-mkmlizer
Pipeline stage MKMLizer completed in 95.66s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.09s
Running pipeline stage ISVCDeployer
Creating inference service sao10k-l3-8b-niitama-v1-v3
Waiting for inference service sao10k-l3-8b-niitama-v1-v3 to be ready
Failed to get response for submission bbchicago-brt-v1-13-with_9716_v2: ('http://chaiml-llama-8b-pairwis-8189-v19-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '{"error":"ValueError : [TypeError(\\"\'numpy.int64\' object is not iterable\\"), TypeError(\'vars() argument must have __dict__ attribute\')]"}')
Inference service sao10k-l3-8b-niitama-v1-v3 ready after 171.14676523208618s
Pipeline stage ISVCDeployer completed in 171.80s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.3862555027008057s
Received healthy response to inference request in 2.118863105773926s
Received healthy response to inference request in 1.4683120250701904s
Received healthy response to inference request in 2.1928913593292236s
Received healthy response to inference request in 1.8331060409545898s
5 requests
0 failed requests
5th percentile: 1.5412708282470704
10th percentile: 1.6142296314239502
20th percentile: 1.7601472377777099
30th percentile: 1.8902574539184571
40th percentile: 2.0045602798461912
50th percentile: 2.118863105773926
60th percentile: 2.148474407196045
70th percentile: 2.178085708618164
80th percentile: 2.23156418800354
90th percentile: 2.308909845352173
95th percentile: 2.3475826740264893
99th percentile: 2.3785209369659426
mean time: 1.9998856067657471
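The logged percentiles are consistent with linear interpolation between closest ranks over the five response times. A sketch that reproduces them (the `percentile` helper is illustrative; the StressChecker's actual implementation is not shown in this log):

```python
def percentile(sorted_vals, p):
    # Linear interpolation between closest ranks: the fractional rank is
    # p/100 * (n - 1), interpolated between its two neighbours.
    pos = p / 100 * (len(sorted_vals) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(sorted_vals) - 1)
    return sorted_vals[lo] + (pos - lo) * (sorted_vals[hi] - sorted_vals[lo])

# The five healthy response times from the StressChecker log above.
times = sorted([2.3862555027008057, 2.118863105773926, 1.4683120250701904,
                2.1928913593292236, 1.8331060409545898])

print(round(percentile(times, 5), 6))    # 1.541271
print(round(percentile(times, 50), 6))   # 2.118863
print(round(percentile(times, 99), 6))   # 2.378521
print(round(sum(times) / len(times), 6)) # 1.999886
```

With only 5 samples, every percentile is an interpolation between adjacent observations, so the 50th percentile is simply the median response time.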
Pipeline stage StressChecker completed in 10.90s
sao10k-l3-8b-niitama-v1_v3 status is now deployed due to DeploymentManager action
sao10k-l3-8b-niitama-v1_v3 status is now inactive due to auto deactivation (removal of underperforming models)

Usage Metrics

Latency Metrics