submission_id: blend_kajok_2024-03-21
developer_uid: chai_backend_admin
celo_rating: 1145.33
display_name: blend_kajok_2024-03-21
family_friendly_score: 0.0
is_internal_developer: True
language_model: chaiml-phase2-winner-13b2_v123:sfw,khanhnto-khanhnto_v37:sfw,anhnv125-llama-op-v17-1_v27:sfw,thanhdaonguyen-once-upon-a-t_v19:sfw,chaiml-phase2-winner-13b2_v238:nsfw,khanhnto-khanhnto_v54:nsfw,anhnv125-llama-op-v17-1_v32:nsfw,thanhdaonguyen-once-upon-a-t_v35:nsfw,hflserdaniel-chai-s6-13b-slp3_v6:sfw|nsfw,cgato-thespis-7b-v0-2-sfttest_v3:nsfw
model_group:
model_name: blend_kajok_2024-03-21
model_size: n/a
num_battles: 1359940
num_wins: 665037
ranking_group: blended
reward_model: sfw_router
router: sfw_router
status: torndown
submission_type: routed_blend
tagged_submissions: [{'submission_id': 'chaiml-phase2-winner-13b2_v123', 'tags': ['sfw']}, {'submission_id': 'khanhnto-khanhnto_v37', 'tags': ['sfw']}, {'submission_id': 'anhnv125-llama-op-v17-1_v27', 'tags': ['sfw']}, {'submission_id': 'thanhdaonguyen-once-upon-a-t_v19', 'tags': ['sfw']}, {'submission_id': 'chaiml-phase2-winner-13b2_v238', 'tags': ['nsfw']}, {'submission_id': 'khanhnto-khanhnto_v54', 'tags': ['nsfw']}, {'submission_id': 'anhnv125-llama-op-v17-1_v32', 'tags': ['nsfw']}, {'submission_id': 'thanhdaonguyen-once-upon-a-t_v35', 'tags': ['nsfw']}, {'submission_id': 'hflserdaniel-chai-s6-13b-slp3_v6', 'tags': ['sfw', 'nsfw']}, {'submission_id': 'cgato-thespis-7b-v0-2-sfttest_v3', 'tags': ['nsfw']}]
timestamp: 2024-03-21T02:02:14+00:00
us_pacific_date: 2024-03-20
win_ratio: 0.48901936850155153
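The fields above describe a routed blend: the sfw_router assigns each conversation a tag and the blend then draws from the tagged_submissions entries carrying that tag, while win_ratio is simply num_wins / num_battles. Below is a minimal sketch of that relationship, assuming nothing about the production router beyond the record itself; the select_candidates helper and the shortened record literal are illustrative only.

# Sketch only: reproduce the tag-based candidate selection implied by the
# record above. Field names mirror the record; select_candidates and the
# truncated submission list are hypothetical illustration, not the service.
record = {
    "submission_id": "blend_kajok_2024-03-21",
    "num_battles": 1359940,
    "num_wins": 665037,
    "tagged_submissions": [
        {"submission_id": "chaiml-phase2-winner-13b2_v123", "tags": ["sfw"]},
        {"submission_id": "chaiml-phase2-winner-13b2_v238", "tags": ["nsfw"]},
        {"submission_id": "hflserdaniel-chai-s6-13b-slp3_v6", "tags": ["sfw", "nsfw"]},
    ],
}

def select_candidates(record, tag):
    # Submissions tagged with both "sfw" and "nsfw" are eligible either way.
    return [s["submission_id"] for s in record["tagged_submissions"] if tag in s["tags"]]

print(select_candidates(record, "sfw"))
# ['chaiml-phase2-winner-13b2_v123', 'hflserdaniel-chai-s6-13b-slp3_v6']
print(record["num_wins"] / record["num_battles"])  # 0.48901936850155153 == win_ratio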
Resubmit model
Connection pool is full, discarding connection: %s
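This warning is urllib3's standard message when a connection is handed back to an already-full pool and discarded instead of being reused; it recurs throughout the log below. One common mitigation, shown here as a sketch only (the pool sizes are arbitrary examples, not the service's actual settings), is to mount a larger HTTPAdapter pool on the shared requests session.

# Sketch, assuming the client uses requests/urllib3: enlarge the per-host
# connection pool so returned connections are cached rather than discarded.
# The sizes below are arbitrary; requests defaults to 10/10.
import requests
from requests.adapters import HTTPAdapter

session = requests.Session()
adapter = HTTPAdapter(pool_connections=32, pool_maxsize=128)
session.mount("http://", adapter)
session.mount("https://", adapter)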
Exception raised while processing tagging_function
Traceback (most recent call last):
  File "/code/guanaco/guanaco_services/src/guanaco_model_service/chat_api.py", line 274, in resolve_chat_api
    conversation_tag = self.tagging_function(conversation_state)
  File "/home/zongyi/gitlab/zztools/zztools/llm/guanaco/submit_routing_model.py", line 176, in last_user_message_length
TypeError: 'ConversationMessage' object is not subscriptable
Exception raised while processing tagging_function
Traceback (most recent call last):
  File "/code/guanaco/guanaco_services/src/guanaco_model_service/chat_api.py", line 274, in resolve_chat_api
    conversation_tag = self.tagging_function(conversation_state)
  File "/home/zongyi/gitlab/zztools/zztools/llm/guanaco/submit_routing_model.py", line 176, in last_user_message_length
TypeError: 'ConversationState' object is not subscriptable
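Both tracebacks point at last_user_message_length in submit_routing_model.py subscripting the conversation objects, while ConversationState and ConversationMessage are evidently attribute-style classes, hence the TypeError. A hedged sketch of an attribute-based version follows; the attribute names .messages and .content are assumptions for illustration, not taken from the real classes.

# Hypothetical repair sketch for the TypeError above: read attributes instead
# of subscripting. The .messages / .content names are assumed; the actual
# ConversationState / ConversationMessage classes may expose different fields.
def last_user_message_length(conversation_state):
    messages = getattr(conversation_state, "messages", [])  # not conversation_state["messages"]
    if not messages:
        return 0
    last = messages[-1]
    return len(last.content)  # not last["content"]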
Failed to get response for submission anhnv125-mistral-v2_v2: ('http://anhnv125-mistral-v2-v2-predictor-default.tenant-chaiml-guanaco.knative.ord1.coreweave.cloud/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:51758->127.0.0.1:8080: read: connection reset by peer\n')
Retrying (%r) after connection broken by '%r': %s
cgato-thespice-7b-ft-ins-1130-v2-mkmlizer: model-00001-of-00003.safetensors: 100%|█████████▉| 4.94G/4.94G [00:03<00:00, 1.64GB/s]
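The progress output above is the mkmlizer job streaming the model's sharded safetensors weights. For reference, a sketch of fetching the same kind of sharded checkpoint with the standard huggingface_hub helper; the repository id is a placeholder and this is not necessarily how the job itself downloads weights.

# Sketch only: pull sharded *.safetensors weights (plus index/config JSON)
# with huggingface_hub. The repo id is a placeholder, not the model above.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="some-org/some-model",
    allow_patterns=["*.safetensors", "*.json"],
)
print(local_dir)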
Failed to get response for submission thanhdaonguyen-once-upon-a-t_v37: ('http://thanhdaonguyen-once-upon-a-t-v37-predictor-default.tenant-chaiml-guanaco.knative.ord1.coreweave.cloud/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:59358->127.0.0.1:8080: read: connection reset by peer\n')
Failed to get response for submission anhnv125-llama-op-v17-1_v27: HTTPConnectionPool(host='anhnv125-llama-op-v17-1-v27-predictor-default.tenant-chaiml-guanaco.knative.ord1.coreweave.cloud', port=80): Read timed out. (read timeout=5.5)
Failed to get response for submission thanhdaonguyen-once-upon-a-t_v35: HTTPConnectionPool(host='thanhdaonguyen-once-upon-a-t-v35-predictor-default.tenant-chaiml-guanaco.knative.ord1.coreweave.cloud', port=80): Read timed out. (read timeout=5.5)
Failed to get response for submission khanhnto-khanhnto_v54: HTTPConnectionPool(host='khanhnto-khanhnto-v54-predictor-default.tenant-chaiml-guanaco.knative.ord1.coreweave.cloud', port=80): Read timed out. (read timeout=5.5)
Failed to get response for submission anhnv125-llama-op-v17-1_v27: HTTPConnectionPool(host='anhnv125-llama-op-v17-1-v27-predictor-default.tenant-chaiml-guanaco.knative.ord1.coreweave.cloud', port=80): Read timed out. (read timeout=5.5)
Failed to get response for submission thanhdaonguyen-once-upon-a-t_v35: HTTPConnectionPool(host='thanhdaonguyen-once-upon-a-t-v35-predictor-default.tenant-chaiml-guanaco.knative.ord1.coreweave.cloud', port=80): Read timed out. (read timeout=5.5)
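The timeouts above come from a 5.5-second read timeout on the predictor calls, and the earlier "Retrying (%r) after connection broken by '%r': %s" lines are urllib3's retry machinery. Below is a sketch of both knobs, assuming a plain requests client; the host is a placeholder and the retry/backoff values are arbitrary.

# Sketch, assuming requests/urllib3: per-request (connect, read) timeout plus
# a mounted Retry policy. Host is a placeholder; 5.5 s matches the log's read timeout.
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
session.mount("http://", HTTPAdapter(max_retries=Retry(total=2, backoff_factor=0.5)))

response = session.post(
    "http://example-predictor.internal/v1/models/GPT-J-6B-lit-v2:predict",
    json={"instances": ["placeholder"]},
    timeout=(1.0, 5.5),  # (connect, read) timeouts in seconds
)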
Failed to get response for submission thanhdaonguyen-once-upon-a-t_v21: ('http://thanhdaonguyen-once-upon-a-t-v21-predictor-default.tenant-chaiml-guanaco.knative.ord1.coreweave.cloud/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:56270->127.0.0.1:8080: read: connection reset by peer\n')
Failed to get response for submission hflserdaniel-chai-s6-13b-slp3_v6: ('http://hflserdaniel-chai-s6-13b-slp3-v6-predictor-default.tenant-chaiml-guanaco.knative.ord1.coreweave.cloud/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:50320->127.0.0.1:8080: read: connection reset by peer\n')
Failed to get response for submission blend_tapun_2024-02-14: (<class 'guanaco_model_service.schemas.Submission'>, <class 'cachetools.keys._HashedTuple'>, 'submission_id', 'inv-konstanta-v4-alpha-7b_v5')
blend_kajok_2024-03-21 status is now inactive due to auto-deactivation of underperforming models
blend_kajok_2024-03-21 status is now torndown due to DeploymentManager action