Unable to create search index (Pro 10.0.4)

Running elasticsearch:8.5.3 with Docker.
Elasticsearch itself is running normally:

root@l:/opt/seafile/seafile-server-latest# curl 'http://127.0.0.1:9200/_cluster/health?pretty'
{
  "cluster_name" : "docker-cluster",
  "status" : "green",
  "timed_out" : false,
  "number_of_nodes" : 1,
  "number_of_data_nodes" : 1,
  "active_primary_shards" : 1,
  "active_shards" : 1,
  "relocating_shards" : 0,
  "initializing_shards" : 0,
  "unassigned_shards" : 0,
  "delayed_unassigned_shards" : 0,
  "number_of_pending_tasks" : 0,
  "number_of_in_flight_fetch" : 0,
  "task_max_waiting_in_queue_millis" : 0,
  "active_shards_percent_as_number" : 100.0
}

Trying to rebuild the search index fails:

seafile@l:/opt/vault/seafile-server-latest$ ./pro/pro.py search --clear
Delete seafile search index ([y]/n)? y

Delete search index, this may take a while...

05/22/2023 00:40:22 [INFO] seafes:210 main: storage: using multiple backends
05/22/2023 00:40:22 [INFO] seafes:212 main: index office pdf: False
05/22/2023 00:40:22 [INFO] elastic_transport.transport:336 perform_request: HEAD http://127.0.0.1:9200/repo_head [status:200 duration:0.058s]
05/22/2023 00:40:22 [WARNING] seafes:175 delete_indices: deleting index repo_head
05/22/2023 00:40:22 [INFO] elastic_transport.transport:336 perform_request: DELETE http://127.0.0.1:9200/repo_head [status:200 duration:0.166s]
05/22/2023 00:40:22 [INFO] elastic_transport.transport:336 perform_request: HEAD http://127.0.0.1:9200/repofiles [status:200 duration:0.011s]
05/22/2023 00:40:22 [WARNING] seafes:175 delete_indices: deleting index repofiles
05/22/2023 00:40:22 [INFO] elastic_transport.transport:336 perform_request: DELETE http://127.0.0.1:9200/repofiles [status:200 duration:0.161s]
seafile@l:/opt/vault/seafile-server-latest$ ./pro/pro.py search --update

Updating search index, this may take a while...

05/22/2023 00:40:25 [INFO] seafes:210 main: storage: using multiple backends
05/22/2023 00:40:25 [INFO] seafes:212 main: index office pdf: False
05/22/2023 00:40:25 [INFO] elastic_transport.transport:336 perform_request: HEAD http://127.0.0.1:9200/repo_head [status:404 duration:0.012s]
05/22/2023 00:40:26 [INFO] elastic_transport.transport:336 perform_request: PUT http://127.0.0.1:9200/repo_head [status:200 duration:0.171s]
05/22/2023 00:40:26 [INFO] elastic_transport.transport:336 perform_request: PUT http://127.0.0.1:9200/repo_head/_mapping [status:200 duration:0.063s]
05/22/2023 00:40:26 [INFO] elastic_transport.transport:336 perform_request: POST http://127.0.0.1:9200/repo_head/_refresh [status:200 duration:0.007s]
05/22/2023 00:40:26 [INFO] elastic_transport.transport:336 perform_request: HEAD http://127.0.0.1:9200/repofiles [status:404 duration:0.007s]
05/22/2023 00:40:26 [INFO] elastic_transport.transport:336 perform_request: PUT http://127.0.0.1:9200/repofiles [status:200 duration:0.457s]
05/22/2023 00:40:26 [INFO] elastic_transport.transport:336 perform_request: PUT http://127.0.0.1:9200/repofiles/_mapping [status:400 duration:0.017s]
05/22/2023 00:40:26 [ERROR] seafes:158 start_index_local: Index process init error: BadRequestError(400, 'mapper_parsing_exception', 'Failed to parse mapping: analyzer [ik_max_word] has not been configured in mappings').
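If I understand the error correctly, the ik_max_word analyzer is provided by the elasticsearch-analysis-ik plugin (seafes seems to request it when Chinese search is enabled in seafevents.conf), so the mapping PUT would fail if that plugin is not installed inside the Elasticsearch container. A quick way to check which plugins the node actually has loaded is the standard _cat API (?v just adds a header row):

curl 'http://127.0.0.1:9200/_cat/plugins?v'

If analysis-ik is not in the list, that would explain the mapper_parsing_exception; presumably the plugin version matching 8.5.3 would then need to be installed via bin/elasticsearch-plugin in the container.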

After running --update again, the following error appears:

05/22/2023 01:48:57 [ERROR] seafes:116 thread_task: Index Repo Error: 'SeafCommit' object has no attribute 'root_id', repo_id: ec981865-e2ab-423e-9666-0ffde0056ce0
Traceback (most recent call last):
  File "/opt/vault/seafile-pro-server-10.0.4/pro/python/seafes/index_local.py", line 105, in thread_task
    self.fileindexupdater.update_repo(repo_id, commit_id)
  File "/opt/vault/seafile-pro-server-10.0.4/pro/python/seafes/file_index_updater.py", line 89, in update_repo
    self.update_files_index(repo_id, old, new)
  File "/opt/vault/seafile-pro-server-10.0.4/pro/python/seafes/file_index_updater.py", line 45, in update_files_index
    new_root = new_commit.root_id
  File "/opt/vault/seafile-pro-server-10.0.4/seahub/thirdpart/seafobj/commits.py", line 16, in __getattr__
    return object.__getattribute__(self, key)
AttributeError: 'SeafCommit' object has no attribute 'root_id'
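Since the traceback shows seafobj returning a SeafCommit without root_id, I assume it is also worth ruling out a damaged head commit for that library. As far as I know, seaf-fsck can be pointed at a single library to verify its commit and fs objects (repo id taken from the error above):

./seaf-fsck.sh ec981865-e2ab-423e-9666-0ffde0056ce0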

Has this been solved? I'm running into the same issue.