submission_id: cycy233-l3-bp-v4-c4_v1
developer_uid: shiroe40
alignment_samples: 12104
alignment_score: -0.1633028912647231
best_of: 16
celo_rating: 1237.36
display_name: auto
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
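The formatter above assembles a conversation into a single Llama-3-style prompt by substituting placeholders in each template. As a minimal sketch of how these templates compose (the helper name `build_prompt` and the sample conversation are illustrative, not from the source):

```python
# Templates copied from the submission's formatter config.
MEMORY = ("<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
          "{bot_name}'s Persona: {memory}\n\n")
PROMPT = "{prompt}<|eot_id|>"
USER = "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>"
BOT = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>"
RESPONSE = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:"

def build_prompt(bot_name, memory, prompt, turns):
    """Assemble the full prompt; `turns` is a list of (role, name, message)."""
    out = MEMORY.format(bot_name=bot_name, memory=memory)
    out += PROMPT.format(prompt=prompt)
    for role, name, message in turns:
        if role == "user":
            out += USER.format(user_name=name, message=message)
        else:
            out += BOT.format(bot_name=name, message=message)
    # The response template leaves the assistant turn open for generation.
    return out + RESPONSE.format(bot_name=bot_name)

text = build_prompt("Ava", "A friendly guide.", "Scene: a library.",
                    [("user", "Sam", "Hi!"), ("bot", "Ava", "Hello!")])
print(text)
```

The model then continues the string after the final `{bot_name}:` marker, stopping on `<|eot_id|>` or `<|end_header_id|>` per the stopping words below.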
generation_params: {'temperature': 1.0, 'top_p': 0.9, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['<|end_header_id|>', '<|eot_id|>'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
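The generation parameters combine top-k, min-p, and top-p (nucleus) truncation of the token distribution. The order in which samplers apply these filters varies between implementations; as a minimal sketch assuming top-k first, then min-p, then top-p, over an explicit probability table (the toy distribution is illustrative):

```python
def truncate(probs, top_k=80, top_p=0.9, min_p=0.05):
    """Apply top-k, min-p, then top-p truncation and renormalize.

    probs maps token -> probability. The filter order here is an
    assumption; production samplers may differ."""
    # top-k: keep only the k most probable tokens
    items = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:top_k]
    # min-p: drop tokens below min_p * (probability of the top token)
    p_max = items[0][1]
    items = [(t, p) for t, p in items if p >= min_p * p_max]
    # top-p: keep the smallest prefix whose cumulative mass reaches top_p
    kept, cum = [], 0.0
    for t, p in items:
        kept.append((t, p))
        cum += p
        if cum >= top_p:
            break
    z = sum(p for _, p in kept)
    return {t: p / z for t, p in kept}

toy = {"a": 0.50, "b": 0.30, "c": 0.15, "d": 0.05}
filtered = truncate(toy)
print(filtered)  # "d" is dropped by the top-p cutoff
```

With `temperature: 1.0` the distribution itself is unscaled, so truncation is the only shaping applied; `best_of: 16` then samples 16 candidates and keeps the highest-scoring one.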
is_internal_developer: False
language_model: cycy233/L3-bp-v4-c4
max_input_tokens: 512
max_output_tokens: 64
model_architecture: LlamaForCausalLM
model_group: cycy233/L3-bp-v4-c4
model_name: auto
model_num_parameters: 8030261248.0
model_repo: cycy233/L3-bp-v4-c4
model_size: 8B
num_battles: 12104
num_wins: 6084
propriety_score: 0.7297297297297297
propriety_total_count: 999.0
ranking_group: single
status: inactive
submission_type: basic
timestamp: 2024-08-28T10:21:18+00:00
us_pacific_date: 2024-08-28
win_ratio: 0.5026437541308658
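The `win_ratio` field is simply `num_wins / num_battles` from the fields above, as a quick check confirms:

```python
num_wins, num_battles = 6084, 12104
win_ratio = num_wins / num_battles
print(win_ratio)  # 0.5026437541308658, matching the reported field
```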
Running pipeline stage MKMLizer
Starting job with name cycy233-l3-bp-v4-c4-v1-mkmlizer
Waiting for job on cycy233-l3-bp-v4-c4-v1-mkmlizer to finish
Stopping job with name cycy233-l3-bp-v4-c4-v1-mkmlizer
%s, retrying in %s seconds...
Starting job with name cycy233-l3-bp-v4-c4-v1-mkmlizer
Waiting for job on cycy233-l3-bp-v4-c4-v1-mkmlizer to finish
cycy233-l3-bp-v4-c4-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
cycy233-l3-bp-v4-c4-v1-mkmlizer: ║ _____ __ __ ║
cycy233-l3-bp-v4-c4-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
cycy233-l3-bp-v4-c4-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
cycy233-l3-bp-v4-c4-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
cycy233-l3-bp-v4-c4-v1-mkmlizer: ║ /___/ ║
cycy233-l3-bp-v4-c4-v1-mkmlizer: ║ ║
cycy233-l3-bp-v4-c4-v1-mkmlizer: ║ Version: 0.10.1 ║
cycy233-l3-bp-v4-c4-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
cycy233-l3-bp-v4-c4-v1-mkmlizer: ║ https://mk1.ai ║
cycy233-l3-bp-v4-c4-v1-mkmlizer: ║ ║
cycy233-l3-bp-v4-c4-v1-mkmlizer: ║ The license key for the current software has been verified as ║
cycy233-l3-bp-v4-c4-v1-mkmlizer: ║ belonging to: ║
cycy233-l3-bp-v4-c4-v1-mkmlizer: ║ ║
cycy233-l3-bp-v4-c4-v1-mkmlizer: ║ Chai Research Corp. ║
cycy233-l3-bp-v4-c4-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
cycy233-l3-bp-v4-c4-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
cycy233-l3-bp-v4-c4-v1-mkmlizer: ║ ║
cycy233-l3-bp-v4-c4-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
cycy233-l3-bp-v4-c4-v1-mkmlizer: Downloaded to shared memory in 34.264s
cycy233-l3-bp-v4-c4-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp9twg2733, device:0
cycy233-l3-bp-v4-c4-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
cycy233-l3-bp-v4-c4-v1-mkmlizer: quantized model in 26.134s
cycy233-l3-bp-v4-c4-v1-mkmlizer: Processed model cycy233/L3-bp-v4-c4 in 60.398s
cycy233-l3-bp-v4-c4-v1-mkmlizer: creating bucket guanaco-mkml-models
cycy233-l3-bp-v4-c4-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
cycy233-l3-bp-v4-c4-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/cycy233-l3-bp-v4-c4-v1
cycy233-l3-bp-v4-c4-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/cycy233-l3-bp-v4-c4-v1/config.json
cycy233-l3-bp-v4-c4-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/cycy233-l3-bp-v4-c4-v1/special_tokens_map.json
cycy233-l3-bp-v4-c4-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/cycy233-l3-bp-v4-c4-v1/tokenizer_config.json
cycy233-l3-bp-v4-c4-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/cycy233-l3-bp-v4-c4-v1/tokenizer.json
Job cycy233-l3-bp-v4-c4-v1-mkmlizer completed after 120.37s with status: succeeded
Stopping job with name cycy233-l3-bp-v4-c4-v1-mkmlizer
Pipeline stage MKMLizer completed in 121.65s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.09s
Running pipeline stage ISVCDeployer
Creating inference service cycy233-l3-bp-v4-c4-v1
Waiting for inference service cycy233-l3-bp-v4-c4-v1 to be ready
Failed to get response for submission blend_lobuf_2024-08-22: ('http://mistralai-mixtral-8x7b-3473-v130-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:35224->127.0.0.1:8080: read: connection reset by peer\n')
Inference service cycy233-l3-bp-v4-c4-v1 ready after 170.674560546875s
Pipeline stage ISVCDeployer completed in 171.24s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.695343255996704s
Received healthy response to inference request in 1.646442174911499s
Received healthy response to inference request in 1.5182020664215088s
Received healthy response to inference request in 2.040398359298706s
Received healthy response to inference request in 1.8771998882293701s
5 requests
0 failed requests
5th percentile: 1.5438500881195067
10th percentile: 1.569498109817505
20th percentile: 1.620794153213501
30th percentile: 1.6925937175750732
40th percentile: 1.7848968029022216
50th percentile: 1.8771998882293701
60th percentile: 1.9424792766571044
70th percentile: 2.0077586650848387
80th percentile: 2.1713873386383056
90th percentile: 2.433365297317505
95th percentile: 2.5643542766571046
99th percentile: 2.6691454601287843
mean time: 1.9555171489715577
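The percentile figures above are consistent with linear interpolation over the five sorted request times (the default method of `numpy.percentile`). A pure-Python sketch reproducing them (the function name is illustrative):

```python
import math

# The five healthy response times from the stress check, in seconds.
times = [2.695343255996704, 1.646442174911499, 1.5182020664215088,
         2.040398359298706, 1.8771998882293701]

def percentile(xs, p):
    """Linear-interpolation percentile, matching numpy.percentile's default."""
    xs = sorted(xs)
    k = (len(xs) - 1) * p / 100.0       # fractional index into sorted data
    lo, hi = math.floor(k), math.ceil(k)
    if lo == hi:
        return xs[int(k)]
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

print(percentile(times, 5))     # ~1.5439, cf. the 5th-percentile line above
print(percentile(times, 50))    # the median, cf. the 50th-percentile line
print(sum(times) / len(times))  # ~1.9555, cf. the mean time line
```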
Pipeline stage StressChecker completed in 10.74s
cycy233-l3-bp-v4-c4_v1 status is now deployed due to DeploymentManager action
cycy233-l3-bp-v4-c4_v1 status is now inactive due to auto-deactivation of underperforming models

Usage Metrics

Latency Metrics