submission_id: zonemercy-lexical-viral-_1979_v9
developer_uid: chai_backend_admin
best_of: 4
celo_rating: 1266.07
display_name: tempv1-9
family_friendly_score: 0.6032
family_friendly_standard_error: 0.006918811458624956
formatter: {'memory_template': '', 'prompt_template': '', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '</s>', '####', 'Bot:', 'User:', 'You:', '<|im_end|>', '<|eot_id|>'], 'max_input_tokens': 1024, 'best_of': 4, 'max_output_tokens': 64}
gpu_counts: {'NVIDIA RTX A6000': 1}
is_internal_developer: True
language_model: zonemercy/Lexical-Viral-v6ava-22b11e5r128
latencies: [{'batch_size': 1, 'throughput': 0.38566660891220533, 'latency_mean': 2.5928553307056426, 'latency_p50': 2.591832995414734, 'latency_p90': 2.8775590658187866}, {'batch_size': 3, 'throughput': 0.8125574539061565, 'latency_mean': 3.674682561159134, 'latency_p50': 3.6625020503997803, 'latency_p90': 4.09182710647583}, {'batch_size': 5, 'throughput': 1.0717879405225696, 'latency_mean': 4.644158564805984, 'latency_p50': 4.6373536586761475, 'latency_p90': 5.201008939743042}, {'batch_size': 6, 'throughput': 1.1557071888036348, 'latency_mean': 5.167476060390473, 'latency_p50': 5.185935735702515, 'latency_p90': 5.859230279922485}, {'batch_size': 10, 'throughput': 1.3586857444000011, 'latency_mean': 7.289627970457077, 'latency_p50': 7.260143280029297, 'latency_p90': 8.32216830253601}]
max_input_tokens: 1024
max_output_tokens: 64
model_architecture: MistralForCausalLM
model_group: zonemercy/Lexical-Viral-
model_name: tempv1-9
model_num_parameters: 22247282688.0
model_repo: zonemercy/Lexical-Viral-v6ava-22b11e5r128
model_size: 22B
num_battles: 11464
num_wins: 5978
ranking_group: single
status: inactive
submission_type: basic
throughput_3p7s: 0.82
timestamp: 2024-11-12T12:29:21+00:00
us_pacific_date: 2024-11-12
win_ratio: 0.5214584787159805
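For reference, win_ratio above is simply num_wins / num_battles (5978 / 11464 ≈ 0.5215). The formatter and generation_params fields describe how a conversation is rendered into a prompt and how sampling is configured; the sketch below is a minimal, hypothetical illustration of that templating and is not part of the actual serving stack (the MK1 flywheel engine, per the log below) — build_prompt and the surrounding names are illustrative only.

```python
# Minimal sketch, assuming the formatter templates are plain str.format strings
# applied per message, with response_template appended as the generation prefix.
# build_prompt is a hypothetical helper, not part of the actual serving stack.
FORMATTER = {
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(messages, bot_name="Bot", user_name="You"):
    prompt = ""
    for sender, message in messages:
        template = FORMATTER["bot_template"] if sender == "bot" else FORMATTER["user_template"]
        prompt += template.format(bot_name=bot_name, user_name=user_name, message=message)
    return prompt + FORMATTER["response_template"].format(bot_name=bot_name)

# Sampling configuration as reported in generation_params above; the stopping
# words cut generation at a newline or any turn / end-of-text marker.
GENERATION_PARAMS = {
    "temperature": 1.0, "top_p": 1.0, "min_p": 0.0, "top_k": 40,
    "presence_penalty": 0.0, "frequency_penalty": 0.0,
    "stopping_words": ["\n", "</s>", "####", "Bot:", "User:", "You:",
                       "<|im_end|>", "<|eot_id|>"],
    "max_input_tokens": 1024, "best_of": 4, "max_output_tokens": 64,
}

print(build_prompt([("user", "hi"), ("bot", "hey there"), ("user", "how are you?")]))
# You: hi
# Bot: hey there
# You: how are you?
# Bot:
```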
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name zonemercy-lexical-viral-1979-v9-mkmlizer
Waiting for job on zonemercy-lexical-viral-1979-v9-mkmlizer to finish
zonemercy-lexical-viral-1979-v9-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
zonemercy-lexical-viral-1979-v9-mkmlizer: ║ [flywheel ASCII art banner] ║
zonemercy-lexical-viral-1979-v9-mkmlizer: ║ ║
zonemercy-lexical-viral-1979-v9-mkmlizer: ║ Version: 0.11.33 ║
zonemercy-lexical-viral-1979-v9-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
zonemercy-lexical-viral-1979-v9-mkmlizer: ║ https://mk1.ai ║
zonemercy-lexical-viral-1979-v9-mkmlizer: ║ ║
zonemercy-lexical-viral-1979-v9-mkmlizer: ║ The license key for the current software has been verified as ║
zonemercy-lexical-viral-1979-v9-mkmlizer: ║ belonging to: ║
zonemercy-lexical-viral-1979-v9-mkmlizer: ║ ║
zonemercy-lexical-viral-1979-v9-mkmlizer: ║ Chai Research Corp. ║
zonemercy-lexical-viral-1979-v9-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
zonemercy-lexical-viral-1979-v9-mkmlizer: ║ Expiration: 2025-01-15 23:59:59 ║
zonemercy-lexical-viral-1979-v9-mkmlizer: ║ ║
zonemercy-lexical-viral-1979-v9-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
zonemercy-lexical-viral-1979-v9-mkmlizer: Downloaded to shared memory in 66.904s
zonemercy-lexical-viral-1979-v9-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmptp76aagz, device:0
zonemercy-lexical-viral-1979-v9-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Connection pool is full, discarding connection: %s. Connection pool size: %s
zonemercy-lexical-viral-1979-v9-mkmlizer: quantized model in 45.452s
zonemercy-lexical-viral-1979-v9-mkmlizer: Processed model zonemercy/Lexical-Viral-v6ava-22b11e5r128 in 112.356s
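The total processing time logged above is consistent with the sum of the two preceding steps; a quick check:

```python
# Download (66.904 s) + quantize (45.452 s) account for the reported total.
download_s, quantize_s = 66.904, 45.452
assert round(download_s + quantize_s, 3) == 112.356
```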
zonemercy-lexical-viral-1979-v9-mkmlizer: creating bucket guanaco-mkml-models
zonemercy-lexical-viral-1979-v9-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
zonemercy-lexical-viral-1979-v9-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/zonemercy-lexical-viral-1979-v9
zonemercy-lexical-viral-1979-v9-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/zonemercy-lexical-viral-1979-v9/config.json
zonemercy-lexical-viral-1979-v9-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/zonemercy-lexical-viral-1979-v9/special_tokens_map.json
zonemercy-lexical-viral-1979-v9-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/zonemercy-lexical-viral-1979-v9/tokenizer_config.json
zonemercy-lexical-viral-1979-v9-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/zonemercy-lexical-viral-1979-v9/tokenizer.json
zonemercy-lexical-viral-1979-v9-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/zonemercy-lexical-viral-1979-v9/flywheel_model.1.safetensors
zonemercy-lexical-viral-1979-v9-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/zonemercy-lexical-viral-1979-v9/flywheel_model.0.safetensors
Job zonemercy-lexical-viral-1979-v9-mkmlizer completed after 166.16s with status: succeeded
Stopping job with name zonemercy-lexical-viral-1979-v9-mkmlizer
Pipeline stage MKMLizer completed in 166.70s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.19s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service zonemercy-lexical-viral-1979-v9
Waiting for inference service zonemercy-lexical-viral-1979-v9 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Failed to get response for submission rica40325-10-14dpo_v2: ('http://rica40325-10-14dpo-v2-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '')
Inference service zonemercy-lexical-viral-1979-v9 ready after 200.90396213531494s
Pipeline stage MKMLDeployer completed in 201.50s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.6977903842926025s
Received healthy response to inference request in 2.234783172607422s
Received healthy response to inference request in 2.292320966720581s
Received healthy response to inference request in 1.9435656070709229s
Received healthy response to inference request in 2.0797033309936523s
5 requests
0 failed requests
5th percentile: 1.9707931518554687
10th percentile: 1.9980206966400147
20th percentile: 2.0524757862091065
30th percentile: 2.1107192993164063
40th percentile: 2.172751235961914
50th percentile: 2.234783172607422
60th percentile: 2.2577982902526856
70th percentile: 2.2808134078979494
80th percentile: 2.3734148502349854
90th percentile: 2.5356026172637938
95th percentile: 2.616696500778198
99th percentile: 2.6815716075897216
mean time: 2.249632692337036
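The percentile and mean figures above are consistent with linearly interpolated percentiles over the five healthy response times; a minimal sketch reproducing them (numpy assumed available here, not necessarily what StressChecker itself uses):

```python
import numpy as np

# The five healthy response times (seconds) reported above.
times = [2.6977903842926025, 2.234783172607422, 2.292320966720581,
         1.9435656070709229, 2.0797033309936523]

# Linear interpolation between order statistics reproduces the logged values,
# e.g. 5th percentile ~1.9708, 50th ~2.2348, 90th ~2.5356, mean ~2.2496.
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(times, p)}")
print("mean time:", np.mean(times))
```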
Pipeline stage StressChecker completed in 12.66s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 5.75s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 3.14s
Shutdown handler de-registered
zonemercy-lexical-viral-_1979_v9 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3116.95s
Shutdown handler de-registered
zonemercy-lexical-viral-_1979_v9 status is now inactive due to auto deactivation of underperforming models