developer_uid: deeppin
submission_id: humanllms-human-like-ll_91531_v1
model_name: humanllms-human-like-ll_91531_v1
model_group: HumanLLMs/Human-Like-LLa
status: torndown
timestamp: 2025-01-24T09:54:52+00:00
num_battles: 7413
num_wins: 3101
celo_rating: 1212.18
family_friendly_score: 0.6084
family_friendly_standard_error: 0.006902889829629328
submission_type: basic
model_repo: HumanLLMs/Human-Like-LLama3-8B-Instruct
model_architecture: LlamaForCausalLM
model_num_parameters: 8030261248.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
display_name: humanllms-human-like-ll_91531_v1
is_internal_developer: False
language_model: HumanLLMs/Human-Like-LLama3-8B-Instruct
model_size: 8B
ranking_group: single
us_pacific_date: 2025-01-24
win_ratio: 0.4183191690273843
generation_params: {'temperature': 0.95, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|eot_id|>'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n{bot_name}'s Persona: {memory}\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n{bot_name}:', 'truncate_by_message': True}
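The formatter above is a set of Python format strings keyed by conversation role. Below is a minimal sketch of how they could be stitched into a single Llama-3 style prompt; the build_prompt helper, the assembly order, and the example conversation are illustrative assumptions, not the production serving code. On the inference side, the generation_params dict would then drive sampling (best_of: 8 candidates, rescored by the default reward model), and truncate_by_message: True suggests older turns are dropped whole, rather than cut mid-message, to fit max_input_tokens: 1024.

# Minimal sketch (assumed, not the production pipeline) of how the formatter
# templates above could be combined into one Llama-3 style prompt string.
formatter = {
    "memory_template": "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n{bot_name}'s Persona: {memory}\n",
    "prompt_template": "{prompt}<|eot_id|>",
    "bot_template": "<|start_header_id|>assistant<|end_header_id|>\n{bot_name}: {message}<|eot_id|>",
    "user_template": "<|start_header_id|>user<|end_header_id|>\n{user_name}: {message}<|eot_id|>",
    "response_template": "<|start_header_id|>assistant<|end_header_id|>\n{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    # turns: list of ("user" | "bot", message) tuples, oldest first
    parts = [
        formatter["memory_template"].format(bot_name=bot_name, memory=memory),
        formatter["prompt_template"].format(prompt=prompt),
    ]
    for role, message in turns:
        template = formatter["user_template"] if role == "user" else formatter["bot_template"]
        parts.append(template.format(bot_name=bot_name, user_name=user_name, message=message))
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)

print(build_prompt("Aria", "Alex", "A friendly tour guide in Rome.",
                   "Alex bumps into Aria near the Colosseum.",
                   [("user", "Hi! What should I see first?")]))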
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name humanllms-human-like-ll-91531-v1-mkmlizer
Waiting for job on humanllms-human-like-ll-91531-v1-mkmlizer to finish
humanllms-human-like-ll-91531-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
humanllms-human-like-ll-91531-v1-mkmlizer: ║ _____ __ __ ║
humanllms-human-like-ll-91531-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
humanllms-human-like-ll-91531-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
humanllms-human-like-ll-91531-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
humanllms-human-like-ll-91531-v1-mkmlizer: ║ /___/ ║
humanllms-human-like-ll-91531-v1-mkmlizer: ║ ║
humanllms-human-like-ll-91531-v1-mkmlizer: ║ Version: 0.11.12 ║
humanllms-human-like-ll-91531-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
humanllms-human-like-ll-91531-v1-mkmlizer: ║ https://mk1.ai ║
humanllms-human-like-ll-91531-v1-mkmlizer: ║ ║
humanllms-human-like-ll-91531-v1-mkmlizer: ║ The license key for the current software has been verified as ║
humanllms-human-like-ll-91531-v1-mkmlizer: ║ belonging to: ║
humanllms-human-like-ll-91531-v1-mkmlizer: ║ ║
humanllms-human-like-ll-91531-v1-mkmlizer: ║ Chai Research Corp. ║
humanllms-human-like-ll-91531-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
humanllms-human-like-ll-91531-v1-mkmlizer: ║ Expiration: 2025-04-15 23:59:59 ║
humanllms-human-like-ll-91531-v1-mkmlizer: ║ ║
humanllms-human-like-ll-91531-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
humanllms-human-like-ll-91531-v1-mkmlizer: Downloaded to shared memory in 27.875s
humanllms-human-like-ll-91531-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmppsdau42s, device:0
humanllms-human-like-ll-91531-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
humanllms-human-like-ll-91531-v1-mkmlizer: quantized model in 25.388s
humanllms-human-like-ll-91531-v1-mkmlizer: Processed model HumanLLMs/Human-Like-LLama3-8B-Instruct in 53.263s
humanllms-human-like-ll-91531-v1-mkmlizer: creating bucket guanaco-mkml-models
humanllms-human-like-ll-91531-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
humanllms-human-like-ll-91531-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/humanllms-human-like-ll-91531-v1
humanllms-human-like-ll-91531-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/humanllms-human-like-ll-91531-v1/config.json
humanllms-human-like-ll-91531-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/humanllms-human-like-ll-91531-v1/special_tokens_map.json
humanllms-human-like-ll-91531-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/humanllms-human-like-ll-91531-v1/tokenizer_config.json
humanllms-human-like-ll-91531-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/humanllms-human-like-ll-91531-v1/tokenizer.json
humanllms-human-like-ll-91531-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/humanllms-human-like-ll-91531-v1/flywheel_model.0.safetensors
humanllms-human-like-ll-91531-v1-mkmlizer: Loading 0: 100%|█████████▉| 290/291 [00:11<00:00, 3.36it/s]
Job humanllms-human-like-ll-91531-v1-mkmlizer completed after 83.9s with status: succeeded
Stopping job with name humanllms-human-like-ll-91531-v1-mkmlizer
Pipeline stage MKMLizer completed in 84.43s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.16s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service humanllms-human-like-ll-91531-v1
Waiting for inference service humanllms-human-like-ll-91531-v1 to be ready
Inference service humanllms-human-like-ll-91531-v1 ready after 170.7319314479828s
Pipeline stage MKMLDeployer completed in 171.21s
run pipeline stage %s
Running pipeline stage StressChecker
('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer'))
Received unhealthy response to inference request!
Received healthy response to inference request in 1.6323215961456299s
Received healthy response to inference request in 1.332425832748413s
Received healthy response to inference request in 1.5885181427001953s
Received healthy response to inference request in 1.1902852058410645s
5 requests
1 failed requests
5th percentile: 0.31630525588989256
10th percentile: 0.5348002433776855
20th percentile: 0.9717902183532715
30th percentile: 1.2187133312225342
40th percentile: 1.2755695819854735
50th percentile: 1.332425832748413
60th percentile: 1.4348627567291259
70th percentile: 1.537299680709839
80th percentile: 1.5972788333892822
90th percentile: 1.614800214767456
95th percentile: 1.623560905456543
99th percentile: 1.6305694580078125
mean time: 1.1682722091674804
%s, retrying in %s seconds...
Received healthy response to inference request in 1.1667397022247314s
Received healthy response to inference request in 1.3860244750976562s
Received healthy response to inference request in 1.3753042221069336s
Received healthy response to inference request in 1.2269632816314697s
Received healthy response to inference request in 1.242157220840454s
5 requests
0 failed requests
5th percentile: 1.1787844181060791
10th percentile: 1.1908291339874268
20th percentile: 1.214918565750122
30th percentile: 1.2300020694732665
40th percentile: 1.2360796451568603
50th percentile: 1.242157220840454
60th percentile: 1.2954160213470458
70th percentile: 1.3486748218536377
80th percentile: 1.377448272705078
90th percentile: 1.3817363739013673
95th percentile: 1.3838804244995118
99th percentile: 1.3855956649780274
mean time: 1.279437780380249
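For reference, the percentile and mean figures printed by the StressChecker in both blocks above follow directly from the logged response times. The sketch below reproduces the second block's numbers, assuming linear-interpolation percentiles (NumPy's default) over the five healthy latencies; it is not the StressChecker's actual code.

# Sketch (assumed, not StressChecker source): reproduce the second block's
# statistics from the five healthy response times logged above.
import numpy as np

latencies = [
    1.1667397022247314,
    1.3860244750976562,
    1.3753042221069336,
    1.2269632816314697,
    1.242157220840454,
]

for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {np.percentile(latencies, q)}")  # linear interpolation
print(f"mean time: {np.mean(latencies)}")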
Pipeline stage StressChecker completed in 14.75s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 1.09s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.73s
Shutdown handler de-registered
humanllms-human-like-ll_91531_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 2661.55s
Shutdown handler de-registered
humanllms-human-like-ll_91531_v1 status is now inactive due to auto deactivation (removal of underperforming models)
humanllms-human-like-ll_91531_v1 status is now torndown due to DeploymentManager action