developer_uid: chai_backend_admin
submission_id: zonemercy-lexical-viral-_2744_v4
model_name: tempv1-2
model_group: zonemercy/Lexical-Viral-
status: inactive
timestamp: 2024-11-20T10:35:44+00:00
num_battles: 10959
num_wins: 5684
celo_rating: 1266.58
family_friendly_score: 0.5948
family_friendly_standard_error: 0.006942808653563772
submission_type: basic
model_repo: zonemercy/Lexical-Viral-v6ava-22b11e5r256b005
model_architecture: MistralForCausalLM
model_num_parameters: 22247282688
best_of: 4
max_input_tokens: 1024
max_output_tokens: 64
latencies (seconds):
  batch_size   throughput   latency_mean   latency_p50   latency_p90
  1            0.3889       2.5713         2.5916        2.8356
  3            0.8275       3.6180         3.6341        3.9785
  5            1.0961       4.5218         4.5094        5.0858
  6            1.1869       5.0370         5.0172        5.7029
  10           1.4036       7.0792         7.0043        8.0521
gpu_counts: {'NVIDIA RTX A6000': 1}
display_name: tempv1-2
is_internal_developer: True
language_model: zonemercy/Lexical-Viral-v6ava-22b11e5r256b005
model_size: 22B
ranking_group: single
throughput_3p7s: 0.86
us_pacific_date: 2024-11-20
win_ratio: 0.5186604617209599
generation_params: {'temperature': 0.9, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '</s>', '####', 'Bot:', 'User:', 'You:', '<|im_end|>', '<|eot_id|>'], 'max_input_tokens': 1024, 'best_of': 4, 'max_output_tokens': 64}
formatter: {'memory_template': '', 'prompt_template': '', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
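Several of the derived fields above can be cross-checked against the raw values in the record. Below is a minimal sanity-check sketch; the binomial-error reading of family_friendly_standard_error, the linear-interpolation reading of throughput_3p7s, and the str.format-style treatment of the formatter templates are assumptions, not anything this log confirms:

```python
# Cross-check derived fields in the submission record above.
# All numeric constants are copied from the record itself.

num_battles, num_wins = 10959, 5684
win_ratio = num_wins / num_battles
assert abs(win_ratio - 0.5186604617209599) < 1e-12  # matches the record

# family_friendly_standard_error is consistent with a binomial
# standard error sqrt(p * (1 - p) / n) over n = 5000 samples
# (assumption: the score is a sample proportion).
p, se = 0.5948, 0.006942808653563772
n = p * (1 - p) / se**2
assert abs(n - 5000) < 1.0

# throughput_3p7s plausibly means "throughput at a 3.7 s mean
# latency"; a simple linear interpolation between the batch-size-3
# and batch-size-5 rows of the latencies table lands near the
# reported 0.86 (the pipeline's exact method is not shown here).
(lat_lo, tp_lo), (lat_hi, tp_hi) = (3.6180, 0.8275), (4.5218, 1.0961)
frac = (3.7 - lat_lo) / (lat_hi - lat_lo)
tp_3p7s = tp_lo + frac * (tp_hi - tp_lo)
assert abs(tp_3p7s - 0.86) < 0.02

# The formatter templates appear to be str.format-style; rendering
# one user turn plus the response prefix (names are illustrative):
user_template, response_template = '{user_name}: {message}\n', '{bot_name}:'
prompt = (user_template.format(user_name='User', message='hi')
          + response_template.format(bot_name='Bot'))
assert prompt == 'User: hi\nBot:'
```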
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name zonemercy-lexical-viral-2744-v4-mkmlizer
Waiting for job on zonemercy-lexical-viral-2744-v4-mkmlizer to finish
zonemercy-lexical-viral-2744-v4-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
zonemercy-lexical-viral-2744-v4-mkmlizer: ║ _____ __ __ ║
zonemercy-lexical-viral-2744-v4-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
zonemercy-lexical-viral-2744-v4-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
zonemercy-lexical-viral-2744-v4-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
zonemercy-lexical-viral-2744-v4-mkmlizer: ║ /___/ ║
zonemercy-lexical-viral-2744-v4-mkmlizer: ║ ║
zonemercy-lexical-viral-2744-v4-mkmlizer: ║ Version: 0.11.12 ║
zonemercy-lexical-viral-2744-v4-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
zonemercy-lexical-viral-2744-v4-mkmlizer: ║ https://mk1.ai ║
zonemercy-lexical-viral-2744-v4-mkmlizer: ║ ║
zonemercy-lexical-viral-2744-v4-mkmlizer: ║ The license key for the current software has been verified as ║
zonemercy-lexical-viral-2744-v4-mkmlizer: ║ belonging to: ║
zonemercy-lexical-viral-2744-v4-mkmlizer: ║ ║
zonemercy-lexical-viral-2744-v4-mkmlizer: ║ Chai Research Corp. ║
zonemercy-lexical-viral-2744-v4-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
zonemercy-lexical-viral-2744-v4-mkmlizer: ║ Expiration: 2025-01-15 23:59:59 ║
zonemercy-lexical-viral-2744-v4-mkmlizer: ║ ║
zonemercy-lexical-viral-2744-v4-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
zonemercy-lexical-viral-2744-v4-mkmlizer: Downloaded to shared memory in 79.268s
zonemercy-lexical-viral-2744-v4-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpb20k4w41, device:0
zonemercy-lexical-viral-2744-v4-mkmlizer: Saving flywheel model at /dev/shm/model_cache
zonemercy-lexical-viral-2744-v4-mkmlizer: quantized model in 48.890s
zonemercy-lexical-viral-2744-v4-mkmlizer: Processed model zonemercy/Lexical-Viral-v6ava-22b11e5r256b005 in 128.157s
zonemercy-lexical-viral-2744-v4-mkmlizer: creating bucket guanaco-mkml-models
zonemercy-lexical-viral-2744-v4-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
zonemercy-lexical-viral-2744-v4-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/zonemercy-lexical-viral-2744-v4
zonemercy-lexical-viral-2744-v4-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/zonemercy-lexical-viral-2744-v4/config.json
zonemercy-lexical-viral-2744-v4-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/zonemercy-lexical-viral-2744-v4/special_tokens_map.json
zonemercy-lexical-viral-2744-v4-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/zonemercy-lexical-viral-2744-v4/tokenizer_config.json
zonemercy-lexical-viral-2744-v4-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/zonemercy-lexical-viral-2744-v4/tokenizer.json
zonemercy-lexical-viral-2744-v4-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/zonemercy-lexical-viral-2744-v4/flywheel_model.1.safetensors
zonemercy-lexical-viral-2744-v4-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/zonemercy-lexical-viral-2744-v4/flywheel_model.0.safetensors
Job zonemercy-lexical-viral-2744-v4-mkmlizer completed after 166.75s with status: succeeded
Stopping job with name zonemercy-lexical-viral-2744-v4-mkmlizer
Pipeline stage MKMLizer completed in 167.35s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.21s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service zonemercy-lexical-viral-2744-v4
Waiting for inference service zonemercy-lexical-viral-2744-v4 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Failed to get response for submission rica40325-10-14dpo_v2: ('http://rica40325-10-14dpo-v2-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '')
Connection pool is full, discarding connection: %s. Connection pool size: %s
Inference service zonemercy-lexical-viral-2744-v4 ready after 230.86979794502258s
Pipeline stage MKMLDeployer completed in 231.36s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 2.7250912189483643s
Received healthy response to inference request in 2.586594581604004s
Received healthy response to inference request in 2.871612548828125s
Received healthy response to inference request in 2.335099697113037s
5 requests
1 failed request
5th percentile: 2.3853986740112303
10th percentile: 2.435697650909424
20th percentile: 2.5362956047058107
30th percentile: 2.614293909072876
40th percentile: 2.66969256401062
50th percentile: 2.7250912189483643
60th percentile: 2.7836997509002686
70th percentile: 2.842308282852173
80th percentile: 6.320736408233646
90th percentile: 13.21898412704468
95th percentile: 16.66810798645019
99th percentile: 19.42740707397461
mean time: 6.127125978469849
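The summary above folds the failed request into the percentiles even though its elapsed time is never printed. Assuming the failed request is the fifth sample, its duration can be recovered from the reported mean, and the reported 99th percentile then reproduces under linear interpolation. This is a sketch of that reasoning, not the pipeline's actual code:

```python
# Recover the unprinted failed-request time from the reported mean
# (assumption: the failed request is the fifth timed sample).
healthy = [2.7250912189483643, 2.586594581604004,
           2.871612548828125, 2.335099697113037]
mean = 6.127125978469849
failed = 5 * mean - sum(healthy)   # ~20.1 s, consistent with the 20 s read timeout

# With that sample, the reported 99th percentile reproduces under
# linearly interpolated percentiles (assumed method).
xs = sorted(healthy + [failed])
k = 0.99 * (len(xs) - 1)           # fractional rank for the 99th percentile
f = int(k)
p99 = xs[f] + (k - f) * (xs[f + 1] - xs[f])
assert abs(p99 - 19.42740707397461) < 1e-6
```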
%s, retrying in %s seconds...
Received healthy response to inference request in 2.3568150997161865s
Received healthy response to inference request in 2.3274905681610107s
Received healthy response to inference request in 2.3002307415008545s
Received healthy response to inference request in 2.4506516456604004s
Received healthy response to inference request in 2.2056918144226074s
5 requests
0 failed requests
5th percentile: 2.224599599838257
10th percentile: 2.2435073852539062
20th percentile: 2.281322956085205
30th percentile: 2.3056827068328856
40th percentile: 2.316586637496948
50th percentile: 2.3274905681610107
60th percentile: 2.339220380783081
70th percentile: 2.3509501934051515
80th percentile: 2.3755824089050295
90th percentile: 2.413117027282715
95th percentile: 2.4318843364715574
99th percentile: 2.446898183822632
mean time: 2.328175973892212
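The second run's summary statistics reproduce exactly from the five printed latencies using linearly interpolated percentiles (numpy's default method; that this is what the pipeline uses is an assumption). A minimal re-computation:

```python
import math

# The five healthy response times printed for the second run.
times = [2.3568150997161865, 2.3274905681610107, 2.3002307415008545,
         2.4506516456604004, 2.2056918144226074]

def percentile(samples, pct):
    """Linear-interpolated percentile, matching numpy's default."""
    xs = sorted(samples)
    k = (len(xs) - 1) * pct / 100.0
    f, c = math.floor(k), math.ceil(k)
    if f == c:
        return xs[int(k)]
    return xs[f] + (k - f) * (xs[c] - xs[f])

assert abs(percentile(times, 50) - 2.3274905681610107) < 1e-9  # median
assert abs(percentile(times, 90) - 2.413117027282715) < 1e-9
assert abs(percentile(times, 95) - 2.4318843364715574) < 1e-9
assert abs(sum(times) / len(times) - 2.328175973892212) < 1e-9  # mean time
```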
Pipeline stage StressChecker completed in 44.80s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 2.63s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 2.48s
Shutdown handler de-registered
zonemercy-lexical-viral-_2744_v4 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.12s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.13s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-lexical-viral-2744-v4-profiler
Waiting for inference service zonemercy-lexical-viral-2744-v4-profiler to be ready
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3193.52s
Shutdown handler de-registered
zonemercy-lexical-viral-_2744_v4 status is now inactive due to auto-deactivation of underperforming models