developer_uid: prt123
submission_id: cycy233-model2-1_v1
model_name: prt1
model_group: cycy233/model2-1
status: inactive
timestamp: 2024-12-05T09:44:50+00:00
num_battles: 6753
num_wins: 3040
celo_rating: 1227.7
family_friendly_score: 0.5698
family_friendly_standard_error: 0.007001827761377739
submission_type: basic
model_repo: cycy233/model2-1
model_architecture: LlamaForCausalLM
model_num_parameters: 8030261248
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
latencies: [{'batch_size': 1, 'throughput': 0.8680291658307999, 'latency_mean': 1.1519686734676362, 'latency_p50': 1.146751046180725, 'latency_p90': 1.2771525382995605}, {'batch_size': 4, 'throughput': 1.8741151281681305, 'latency_mean': 2.125123083591461, 'latency_p50': 2.117616295814514, 'latency_p90': 2.3389901638031003}, {'batch_size': 5, 'throughput': 2.0301740606258583, 'latency_mean': 2.451947114467621, 'latency_p50': 2.4516464471817017, 'latency_p90': 2.712179112434387}, {'batch_size': 8, 'throughput': 2.236004336403826, 'latency_mean': 3.545656532049179, 'latency_p50': 3.584526777267456, 'latency_p90': 3.9458033084869384}, {'batch_size': 10, 'throughput': 2.3201868521144524, 'latency_mean': 4.274450600147247, 'latency_p50': 4.338069438934326, 'latency_p90': 4.8508225440979}, {'batch_size': 12, 'throughput': 2.335267393860805, 'latency_mean': 5.087548093795776, 'latency_p50': 5.094678521156311, 'latency_p90': 5.830642533302307}, {'batch_size': 15, 'throughput': 2.358579246930314, 'latency_mean': 6.2801130223274235, 'latency_p50': 6.315806984901428, 'latency_p90': 7.055266642570496}]
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: prt1
is_internal_developer: False
language_model: cycy233/model2-1
model_size: 8B
ranking_group: single
throughput_3p7s: 2.28
us_pacific_date: 2024-12-05
win_ratio: 0.45017029468384423
generation_params: {'temperature': 0.95, 'top_p': 1.0, 'min_p': 0.08, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|eot_id|>'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
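The formatter and generation_params entries above determine how a request to this deployment is shaped. Below is a minimal sketch, in plain Python, of how the templates compose into a single Llama-3-style prompt and how the sampling settings travel with it; the bot name, user name, persona, and chat history are hypothetical examples, and the MKML serving stack may apply additional steps (for example, truncating history to respect max_input_tokens = 1024) that are not shown here.

# Minimal sketch: compose a prompt from the formatter templates above.
# Names, persona, and messages below are hypothetical; the real serving
# stack may truncate history to fit within max_input_tokens.

formatter = {
    "memory_template": ("<|begin_of_text|><|start_header_id|>system<|end_header_id|>"
                        "\n\n{bot_name}'s Persona: {memory}\n\n"),
    "prompt_template": "{prompt}<|eot_id|>",
    "bot_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>",
    "user_template": "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>",
    "response_template": "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:",
}

generation_params = {
    "temperature": 0.95, "top_p": 1.0, "min_p": 0.08, "top_k": 40,
    "presence_penalty": 0.0, "frequency_penalty": 0.0,
    "stopping_words": ["\n", "<|eot_id|>"],
    "max_input_tokens": 1024, "best_of": 8, "max_output_tokens": 64,
}

def build_prompt(bot_name, user_name, memory, prompt, history):
    """history: list of (speaker, message) pairs, oldest first; speaker is 'user' or 'bot'."""
    parts = [formatter["memory_template"].format(bot_name=bot_name, memory=memory),
             formatter["prompt_template"].format(prompt=prompt)]
    for speaker, message in history:
        template = formatter["bot_template"] if speaker == "bot" else formatter["user_template"]
        # str.format ignores the unused name, so one call covers both templates
        parts.append(template.format(bot_name=bot_name, user_name=user_name, message=message))
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)

if __name__ == "__main__":
    prompt_text = build_prompt(
        bot_name="Ava", user_name="Sam",                      # hypothetical names
        memory="A cheerful travel guide who loves trivia.",   # hypothetical persona
        prompt="Stay in character and keep replies short.",   # hypothetical system prompt
        history=[("user", "Hi!"), ("bot", "Hello! Where are we off to today?")],
    )
    print(prompt_text)         # ends with the open assistant header, ready for completion
    print(generation_params)   # sampling settings sent alongside the prompt

Note that the '\n' stopping word together with max_output_tokens = 64 caps each reply at a single short line, and best_of = 8 means up to eight candidates are sampled per request with one returned (the selection criterion is not shown in this log).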
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name cycy233-model2-1-v1-mkmlizer
Waiting for job on cycy233-model2-1-v1-mkmlizer to finish
cycy233-model2-1-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
cycy233-model2-1-v1-mkmlizer: ║ [flywheel ASCII-art wordmark] ║
cycy233-model2-1-v1-mkmlizer: ║ ║
cycy233-model2-1-v1-mkmlizer: ║ Version: 0.11.12 ║
cycy233-model2-1-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
cycy233-model2-1-v1-mkmlizer: ║ https://mk1.ai ║
cycy233-model2-1-v1-mkmlizer: ║ ║
cycy233-model2-1-v1-mkmlizer: ║ The license key for the current software has been verified as ║
cycy233-model2-1-v1-mkmlizer: ║ belonging to: ║
cycy233-model2-1-v1-mkmlizer: ║ ║
cycy233-model2-1-v1-mkmlizer: ║ Chai Research Corp. ║
cycy233-model2-1-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
cycy233-model2-1-v1-mkmlizer: ║ Expiration: 2025-01-15 23:59:59 ║
cycy233-model2-1-v1-mkmlizer: ║ ║
cycy233-model2-1-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
cycy233-model2-1-v1-mkmlizer: Downloaded to shared memory in 23.264s
cycy233-model2-1-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpbudb17m_, device:0
cycy233-model2-1-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
cycy233-model2-1-v1-mkmlizer: quantized model in 25.844s
cycy233-model2-1-v1-mkmlizer: Processed model cycy233/model2-1 in 49.109s
cycy233-model2-1-v1-mkmlizer: creating bucket guanaco-mkml-models
cycy233-model2-1-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
cycy233-model2-1-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/cycy233-model2-1-v1
cycy233-model2-1-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/cycy233-model2-1-v1/config.json
cycy233-model2-1-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/cycy233-model2-1-v1/special_tokens_map.json
cycy233-model2-1-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/cycy233-model2-1-v1/tokenizer_config.json
cycy233-model2-1-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/cycy233-model2-1-v1/tokenizer.json
cycy233-model2-1-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/cycy233-model2-1-v1/flywheel_model.0.safetensors
cycy233-model2-1-v1-mkmlizer: Loading 0: 100%|██████████| 291/291 [00:11<00:00, 3.42it/s]
Job cycy233-model2-1-v1-mkmlizer completed after 77.13s with status: succeeded
Stopping job with name cycy233-model2-1-v1-mkmlizer
Pipeline stage MKMLizer completed in 78.60s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.23s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service cycy233-model2-1-v1
Waiting for inference service cycy233-model2-1-v1 to be ready
Inference service cycy233-model2-1-v1 ready after 164.83288168907166s
Pipeline stage MKMLDeployer completed in 166.43s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.804978370666504s
Received healthy response to inference request in 1.5730276107788086s
Received healthy response to inference request in 1.2947771549224854s
Received healthy response to inference request in 1.9142193794250488s
Received healthy response to inference request in 1.5109221935272217s
5 requests
0 failed requests
5th percentile: 1.3380061626434325
10th percentile: 1.38123517036438
20th percentile: 1.4676931858062745
30th percentile: 1.523343276977539
40th percentile: 1.548185443878174
50th percentile: 1.5730276107788086
60th percentile: 1.6658079147338867
70th percentile: 1.7585882186889648
80th percentile: 1.826826572418213
90th percentile: 1.8705229759216309
95th percentile: 1.8923711776733398
99th percentile: 1.9098497390747071
mean time: 1.6195849418640136
Pipeline stage StressChecker completed in 9.51s
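The StressChecker summary above can be reproduced from the five healthy response times using standard linear-interpolation percentiles. A quick check, assuming only that NumPy is available:

# Reproduce the StressChecker percentiles and mean from the five response times above.
import numpy as np

response_times = [
    1.804978370666504, 1.5730276107788086, 1.2947771549224854,
    1.9142193794250488, 1.5109221935272217,
]

for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    # np.percentile defaults to linear interpolation, which matches the logged values
    print(f"{q}th percentile: {np.percentile(response_times, q)}")

print("mean time:", np.mean(response_times))  # ~1.6196 s, as logged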
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 2.21s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 2.06s
Shutdown handler de-registered
cycy233-model2-1_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 2115.05s
Shutdown handler de-registered
cycy233-model2-1_v1 status is now inactive due to auto deactivation of underperforming models