developer_uid: huohuo12
submission_id: sometimesanotion-lamarck_6304_v8
model_name: sometimesanotion-lamarck_6304_v8
model_group: sometimesanotion/Lamarck
status: torndown
timestamp: 2025-02-18T09:59:14+00:00
num_battles: 6910
num_wins: 3322
celo_rating: 1237.69
family_friendly_score: 0.6202
family_friendly_standard_error: 0.006863701042440587
submission_type: basic
model_repo: sometimesanotion/Lamarck-14B-v0.7
model_architecture: Qwen2ForCausalLM
model_num_parameters: 14765603840
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
display_name: sometimesanotion-lamarck_6304_v8
is_internal_developer: False
language_model: sometimesanotion/Lamarck-14B-v0.7
model_size: 15B
ranking_group: single
us_pacific_date: 2025-02-18
win_ratio: 0.4807525325615051
generation_params: {'temperature': 1.1, 'top_p': 0.77, 'min_p': 0.025, 'top_k': 55, 'presence_penalty': 0.7, 'frequency_penalty': 0.25, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
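The metadata above is mechanical enough to sanity-check. A minimal sketch: the win ratio is `num_wins / num_battles`, the family-friendly standard error is consistent with a binomial standard error sqrt(p·(1−p)/n) at n = 5000 samples (that sample count is an inference from the logged value, not stated anywhere in this record), and the formatter entries are plain `str.format` templates assembled by concatenation. The `Bot`/`User` names and messages below are illustrative placeholders.

```python
import math

# win_ratio = num_wins / num_battles
num_battles, num_wins = 6910, 3322
win_ratio = num_wins / num_battles
print(win_ratio)  # 0.48075..., matching the logged win_ratio

# family_friendly_standard_error is consistent with a binomial standard
# error sqrt(p * (1 - p) / n) at n = 5000 (n is an inferred assumption).
p = 0.6202
se = math.sqrt(p * (1 - p) / 5000)
print(se)  # ~0.0068637..., matching the logged standard error

# The formatter templates assemble a prompt by simple substitution.
formatter = {
    'memory_template': "{bot_name}'s Persona: {memory}\n####\n",
    'prompt_template': '{prompt}\n<START>\n',
    'bot_template': '{bot_name}: {message}\n',
    'user_template': '{user_name}: {message}\n',
    'response_template': '{bot_name}:',
}
prompt = (
    formatter['memory_template'].format(bot_name='Bot', memory='A helpful assistant.')
    + formatter['prompt_template'].format(prompt='Scene setup.')
    + formatter['user_template'].format(user_name='User', message='Hi!')
    + formatter['response_template'].format(bot_name='Bot')
)
print(prompt)  # ends with "Bot:", the completion prefix sent to the model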
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name sometimesanotion-lamarck-6304-v8-mkmlizer
Waiting for job on sometimesanotion-lamarck-6304-v8-mkmlizer to finish
sometimesanotion-lamarck-6304-v8-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
sometimesanotion-lamarck-6304-v8-mkmlizer: ║ _____ __ __ ║
sometimesanotion-lamarck-6304-v8-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
sometimesanotion-lamarck-6304-v8-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
sometimesanotion-lamarck-6304-v8-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
sometimesanotion-lamarck-6304-v8-mkmlizer: ║ /___/ ║
sometimesanotion-lamarck-6304-v8-mkmlizer: ║ ║
sometimesanotion-lamarck-6304-v8-mkmlizer: ║ Version: 0.12.8 ║
sometimesanotion-lamarck-6304-v8-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
sometimesanotion-lamarck-6304-v8-mkmlizer: ║ https://mk1.ai ║
sometimesanotion-lamarck-6304-v8-mkmlizer: ║ ║
sometimesanotion-lamarck-6304-v8-mkmlizer: ║ The license key for the current software has been verified as ║
sometimesanotion-lamarck-6304-v8-mkmlizer: ║ belonging to: ║
sometimesanotion-lamarck-6304-v8-mkmlizer: ║ ║
sometimesanotion-lamarck-6304-v8-mkmlizer: ║ Chai Research Corp. ║
sometimesanotion-lamarck-6304-v8-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
sometimesanotion-lamarck-6304-v8-mkmlizer: ║ Expiration: 2025-04-15 23:59:59 ║
sometimesanotion-lamarck-6304-v8-mkmlizer: ║ ║
sometimesanotion-lamarck-6304-v8-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
sometimesanotion-lamarck-6304-v8-mkmlizer: Downloaded to shared memory in 53.278s
sometimesanotion-lamarck-6304-v8-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp76insfjs, device:0
sometimesanotion-lamarck-6304-v8-mkmlizer: Saving flywheel model at /dev/shm/model_cache
sometimesanotion-lamarck-6304-v8-mkmlizer: quantized model in 36.470s
sometimesanotion-lamarck-6304-v8-mkmlizer: Processed model sometimesanotion/Lamarck-14B-v0.7 in 89.749s
sometimesanotion-lamarck-6304-v8-mkmlizer: creating bucket guanaco-mkml-models
sometimesanotion-lamarck-6304-v8-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
sometimesanotion-lamarck-6304-v8-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/sometimesanotion-lamarck-6304-v8
sometimesanotion-lamarck-6304-v8-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/sometimesanotion-lamarck-6304-v8/config.json
sometimesanotion-lamarck-6304-v8-mkmlizer: cp /dev/shm/model_cache/added_tokens.json s3://guanaco-mkml-models/sometimesanotion-lamarck-6304-v8/added_tokens.json
sometimesanotion-lamarck-6304-v8-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/sometimesanotion-lamarck-6304-v8/special_tokens_map.json
sometimesanotion-lamarck-6304-v8-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/sometimesanotion-lamarck-6304-v8/tokenizer_config.json
sometimesanotion-lamarck-6304-v8-mkmlizer: cp /dev/shm/model_cache/merges.txt s3://guanaco-mkml-models/sometimesanotion-lamarck-6304-v8/merges.txt
sometimesanotion-lamarck-6304-v8-mkmlizer: cp /dev/shm/model_cache/vocab.json s3://guanaco-mkml-models/sometimesanotion-lamarck-6304-v8/vocab.json
sometimesanotion-lamarck-6304-v8-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/sometimesanotion-lamarck-6304-v8/tokenizer.json
sometimesanotion-lamarck-6304-v8-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/sometimesanotion-lamarck-6304-v8/flywheel_model.1.safetensors
sometimesanotion-lamarck-6304-v8-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/sometimesanotion-lamarck-6304-v8/flywheel_model.0.safetensors
sometimesanotion-lamarck-6304-v8-mkmlizer: Loading 0: 98%|█████████▊| 568/579 [00:26<00:00, 18.18it/s]
Job sometimesanotion-lamarck-6304-v8-mkmlizer completed after 115.43s with status: succeeded
Stopping job with name sometimesanotion-lamarck-6304-v8-mkmlizer
Pipeline stage MKMLizer completed in 115.94s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.15s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service sometimesanotion-lamarck-6304-v8
Waiting for inference service sometimesanotion-lamarck-6304-v8 to be ready
Inference service sometimesanotion-lamarck-6304-v8 ready after 230.8528344631195s
Pipeline stage MKMLDeployer completed in 231.27s
run pipeline stage %s
Running pipeline stage StressChecker
HTTPConnectionPool(host='guanaco-submitter.guanaco-backend.k2.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 2.975832462310791s
Received healthy response to inference request in 1.9806032180786133s
Received healthy response to inference request in 1.7088003158569336s
Received healthy response to inference request in 1.9088211059570312s
5 requests
1 failed requests
5th percentile: 1.748804473876953
10th percentile: 1.7888086318969727
20th percentile: 1.8688169479370118
30th percentile: 1.9231775283813477
40th percentile: 1.9518903732299804
50th percentile: 1.9806032180786133
60th percentile: 2.3786949157714843
70th percentile: 2.7767866134643553
80th percentile: 6.407825326919559
90th percentile: 13.271811056137086
95th percentile: 16.703803920745848
99th percentile: 19.44939821243286
mean time: 5.741970777511597
%s, retrying in %s seconds...
Received healthy response to inference request in 1.8494482040405273s
Received healthy response to inference request in 2.118283987045288s
Received healthy response to inference request in 2.0315113067626953s
Received healthy response to inference request in 2.0298585891723633s
Received healthy response to inference request in 1.8024005889892578s
5 requests
0 failed requests
5th percentile: 1.8118101119995118
10th percentile: 1.8212196350097656
20th percentile: 1.8400386810302733
30th percentile: 1.8855302810668946
40th percentile: 1.9576944351196288
50th percentile: 2.0298585891723633
60th percentile: 2.0305196762084963
70th percentile: 2.031180763244629
80th percentile: 2.0488658428192137
90th percentile: 2.083574914932251
95th percentile: 2.1009294509887697
99th percentile: 2.1148130798339846
mean time: 1.9663005352020264
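The StressChecker percentiles above can be reproduced from the five healthy response times using linear interpolation between order statistics (numpy's default `percentile` method — an inference from the logged values, since the checker's implementation is not shown):

```python
import math

# Five healthy response times from the second StressChecker pass, in seconds.
times = sorted([
    1.8494482040405273,
    2.118283987045288,
    2.0315113067626953,
    2.0298585891723633,
    1.8024005889892578,
])

def percentile(sorted_values, p):
    """Percentile by linear interpolation (numpy's default 'linear' method)."""
    rank = (p / 100) * (len(sorted_values) - 1)
    lo, hi = math.floor(rank), math.ceil(rank)
    frac = rank - lo
    return sorted_values[lo] + frac * (sorted_values[hi] - sorted_values[lo])

print(percentile(times, 50))       # 2.0298585891723633, matching the log
print(percentile(times, 90))       # 2.083574914932251, matching the log
print(sum(times) / len(times))     # 1.9663005352020264, matching the log
```

The same method also explains the first pass: with one request hitting the 20s read timeout, the upper percentiles (80th–99th) interpolate toward that outlier, which is why they jump from ~2.8s to ~19.4s.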
Pipeline stage StressChecker completed in 41.31s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.71s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.62s
Shutdown handler de-registered
sometimesanotion-lamarck_6304_v8 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Retrying (%r) after connection broken by '%r': %s
Retrying (%r) after connection broken by '%r': %s
%s, retrying in %s seconds... (line repeated 11×)
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service sometimesanotion-lamarck-6304-v8-profiler is running
Skipping teardown as no inference service was found
Pipeline stage MKMLProfilerDeleter completed in 2.30s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.11s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service sometimesanotion-lamarck-6304-v8-profiler
Waiting for inference service sometimesanotion-lamarck-6304-v8-profiler to be ready
Inference service sometimesanotion-lamarck-6304-v8-profiler ready after 220.87899088859558s
Pipeline stage MKMLProfilerDeployer completed in 221.21s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/sometimesanotion-lamaa060f2eeb244efebcef2b33b1c6f38b-deplo8jlg7:/code/chaiverse_profiler_1739875642 --namespace tenant-chaiml-guanaco
kubectl exec -it sometimesanotion-lamaa060f2eeb244efebcef2b33b1c6f38b-deplo8jlg7 --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1739875642 && python profiles.py profile --best_of_n 8 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 1024 --output_tokens 64 --summary /code/chaiverse_profiler_1739875642/summary.json'
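The `--batches` argument in the profiling command enumerates batch sizes 1, then 5 through 195 in steps of 5. A sketch generating the same schedule (how the profiler actually builds this list is not shown in the log; this only reproduces the argument value):

```python
# Batch-size schedule passed to profiles.py: 1, then 5..195 in steps of 5.
batches = [1] + list(range(5, 200, 5))
print(",".join(map(str, batches)))
# 1,5,10,15,20,...,190,195 — 40 batch sizes in total
```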
%s, retrying in %s seconds...
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/sometimesanotion-lamaa060f2eeb244efebcef2b33b1c6f38b-deplo8jlg7:/code/chaiverse_profiler_1739875955 --namespace tenant-chaiml-guanaco
kubectl exec -it sometimesanotion-lamaa060f2eeb244efebcef2b33b1c6f38b-deplo8jlg7 --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1739875955 && python profiles.py profile --best_of_n 8 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 1024 --output_tokens 64 --summary /code/chaiverse_profiler_1739875955/summary.json'
%s, retrying in %s seconds...
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/sometimesanotion-lamaa060f2eeb244efebcef2b33b1c6f38b-deplo8jlg7:/code/chaiverse_profiler_1739876267 --namespace tenant-chaiml-guanaco
kubectl exec -it sometimesanotion-lamaa060f2eeb244efebcef2b33b1c6f38b-deplo8jlg7 --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1739876267 && python profiles.py profile --best_of_n 8 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 1024 --output_tokens 64 --summary /code/chaiverse_profiler_1739876267/summary.json'
clean up pipeline due to error=ISVCScriptError('Command failed with error:
Defaulted container "kserve-container" out of: kserve-container, queue-proxy
Unable to use a TTY - input is not a terminal or the right kind of file
Traceback (most recent call last):
  File "/code/chaiverse_profiler_1739876267/profiles.py", line 602, in <module>
    cli()
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/code/chaiverse_profiler_1739876267/profiles.py", line 103, in profile_batches
    client.wait_for_server_startup(target, max_wait=300)
  File "/code/inference_analysis/client.py", line 136, in wait_for_server_startup
    raise RuntimeError(msg)
RuntimeError: Timed out after 300s waiting for startup
command terminated with exit code 1
, output: waiting for startup of TargetModel(endpoint='localhost', route='GPT-J-6B-lit-v2', namespace='tenant-chaiml-guanaco', max_characters=9999, reward=False, url_format='{endpoint}-predictor-default.{namespace}.knative.ord1.coreweave.cloud')')
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service sometimesanotion-lamarck-6304-v8-profiler is running
Tearing down inference service sometimesanotion-lamarck-6304-v8-profiler
Service sometimesanotion-lamarck-6304-v8-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 2.27s
Shutdown handler de-registered
sometimesanotion-lamarck_6304_v8 status is now inactive due to auto deactivation removed underperforming models
sometimesanotion-lamarck_6304_v8 status is now torndown due to DeploymentManager action