There was a problem earlier: I deleted the ES bundled with the pro package and installed one via yum.
ES has started:
{
  "name" : "DKQLzEV",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "GNv1yxNxSZ2axemuHj6-2w",
  "version" : {
    "number" : "5.6.16",
    "build_hash" : "3a740d1",
    "build_date" : "2019-03-13T15:33:36.565Z",
    "build_snapshot" : false,
    "lucene_version" : "6.6.1"
  },
  "tagline" : "You Know, for Search"
}
The seafevents.conf file is configured with:
[INDEX FILES]
enabled = true
lang = chinese
external_es_server = true
es_host = 192.168.12.254
es_port = 9200
I created a new data directory, gave it 777 permissions, and pointed the config file at it.
But search still doesn't work.
Updating the index only prints:
Updating search index, this may take a while
and then nothing further.
Could you post the seafevents and elasticsearch logs?
[root@localhost elasticsearch]# tail -n 30 elasticsearch.log
[2024-12-09T01:36:44,947][INFO ][o.e.h.n.Netty4HttpServerTransport] [DKQLzEV] publish_address {192.168.12.253:9200}, bound_addresses {192.168.12.253:9200}
[2024-12-09T01:36:44,947][INFO ][o.e.n.Node ] [DKQLzEV] started
[2024-12-09T01:36:44,951][INFO ][o.e.g.GatewayService ] [DKQLzEV] recovered [0] indices into cluster_state
[2024-12-09T06:40:15,023][INFO ][o.e.n.Node ] initializing ...
[2024-12-09T06:40:15,121][INFO ][o.e.e.NodeEnvironment ] [DKQLzEV] using [1] data paths, mounts [[/opt (/dev/nvme0n2p1)]], net usable_space [199.5gb], net total_space [439.1gb], spins? [no], types [ext4]
[2024-12-09T06:40:15,122][INFO ][o.e.e.NodeEnvironment ] [DKQLzEV] heap size [989.8mb], compressed ordinary object pointers [true]
[2024-12-09T06:40:15,123][INFO ][o.e.n.Node ] node name [DKQLzEV] derived from node ID [DKQLzEVnTaaTyVLgWeLbLw]; set [node.name] to override
[2024-12-09T06:40:15,124][INFO ][o.e.n.Node ] version[5.6.16], pid[3540], build[3a740d1/2019-03-13T15:33:36.565Z], OS[Linux/4.18.0-193.el8.x86_64/amd64], JVM[Red Hat, Inc./OpenJDK 64-Bit Server VM/1.8.0_302/25.302-b08]
[2024-12-09T06:40:15,124][INFO ][o.e.n.Node ] JVM arguments [-Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -XX:+AlwaysPreTouch, -Xss1m, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djna.nosys=true, -Djdk.io.permissionsUseCanonicalPath=true, -Dio.netty.noUnsafe=true, -Dio.netty.noKeySetOptimization=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Dlog4j.shutdownHookEnabled=false, -Dlog4j2.disable.jmx=true, -Dlog4j.skipJansi=true, -XX:+HeapDumpOnOutOfMemoryError, -Des.path.home=/usr/share/elasticsearch]
[2024-12-09T06:40:15,987][INFO ][o.e.p.PluginsService ] [DKQLzEV] loaded module [aggs-matrix-stats]
[2024-12-09T06:40:15,987][INFO ][o.e.p.PluginsService ] [DKQLzEV] loaded module [ingest-common]
[2024-12-09T06:40:15,987][INFO ][o.e.p.PluginsService ] [DKQLzEV] loaded module [lang-expression]
[2024-12-09T06:40:15,987][INFO ][o.e.p.PluginsService ] [DKQLzEV] loaded module [lang-groovy]
[2024-12-09T06:40:15,987][INFO ][o.e.p.PluginsService ] [DKQLzEV] loaded module [lang-mustache]
[2024-12-09T06:40:15,987][INFO ][o.e.p.PluginsService ] [DKQLzEV] loaded module [lang-painless]
[2024-12-09T06:40:15,987][INFO ][o.e.p.PluginsService ] [DKQLzEV] loaded module [parent-join]
[2024-12-09T06:40:15,987][INFO ][o.e.p.PluginsService ] [DKQLzEV] loaded module [percolator]
[2024-12-09T06:40:15,988][INFO ][o.e.p.PluginsService ] [DKQLzEV] loaded module [reindex]
[2024-12-09T06:40:15,988][INFO ][o.e.p.PluginsService ] [DKQLzEV] loaded module [transport-netty3]
[2024-12-09T06:40:15,988][INFO ][o.e.p.PluginsService ] [DKQLzEV] loaded module [transport-netty4]
[2024-12-09T06:40:15,988][INFO ][o.e.p.PluginsService ] [DKQLzEV] no plugins loaded
[2024-12-09T06:40:17,348][INFO ][o.e.d.DiscoveryModule ] [DKQLzEV] using discovery type [zen]
[2024-12-09T06:40:17,801][INFO ][o.e.n.Node ] initialized
[2024-12-09T06:40:17,802][INFO ][o.e.n.Node ] [DKQLzEV] starting ...
[2024-12-09T06:40:17,963][INFO ][o.e.t.TransportService ] [DKQLzEV] publish_address {192.168.12.253:9300}, bound_addresses {192.168.12.253:9300}
[2024-12-09T06:40:17,972][INFO ][o.e.b.BootstrapChecks ] [DKQLzEV] bound or publishing to a non-loopback address, enforcing bootstrap checks
[2024-12-09T06:40:21,067][INFO ][o.e.c.s.ClusterService ] [DKQLzEV] new_master {DKQLzEV}{DKQLzEVnTaaTyVLgWeLbLw}{3opb6fJjT5CIKdOtbkRBYw}{192.168.12.253}{192.168.12.253:9300}, reason: zen-disco-elected-as-master ([0] nodes joined)
[2024-12-09T06:40:21,093][INFO ][o.e.h.n.Netty4HttpServerTransport] [DKQLzEV] publish_address {192.168.12.253:9200}, bound_addresses {192.168.12.253:9200}
[2024-12-09T06:40:21,094][INFO ][o.e.n.Node ] [DKQLzEV] started
[2024-12-09T06:40:21,096][INFO ][o.e.g.GatewayService ] [DKQLzEV] recovered [0] indices into cluster_state
[root@localhost logs]# tail -n 30 seafevents.log
[2024-12-09 04:46:33,942] [INFO] starts to send email
[2024-12-09 05:17:58,945] [INFO] starts to send email
[2024-12-09 05:49:26,944] [INFO] starts to send email
[2024-12-09 06:20:55,940] [INFO] starts to send email
[2024-12-09 11:55:45,614] [INFO] [seafevents] database: sqlite3, path: /opt/seafile/pro-data/seafevents.db
[2024-12-09 11:55:45,665] [INFO] audit is enabled
[2024-12-09 11:55:45,666] [INFO] [seafevents] database: sqlite3, path: /opt/seafile/pro-data/seafevents.db
[2024-12-09 11:55:45,668] [INFO] [seafevents] database: mysql, name: seafile
[2024-12-09 11:55:46,322] [INFO] The file with the following suffix will be recorded into the file history: md,txt,doc,docx,xls,xlsx,ppt,pptx
[2024-12-09 11:55:46,324] [INFO] [seafevents] database: sqlite3, path: /opt/seafile/pro-data/seafevents.db
[2024-12-09 11:55:46,337] [INFO] Events publish to redis is disabled.
[2024-12-09 11:55:46,338] [INFO] LDAP section is not set, disable ldap sync.
[2024-12-09 11:55:46,340] [INFO] [virus_scan] scan_command option is not found in /opt/seafile/conf/seafile.conf, disable virus scan.
[2024-12-09 11:55:46,418] [WARNING] insane workers value "12"
[2024-12-09 11:55:46,419] [INFO] Subscribe to channels: {'seaf_server.stats', 'seahub.draft', 'seaf_server.event', 'seahub.audit', 'seahub.stats'}
[2024-12-09 11:55:46,455] [INFO] Starting background tasks.
[2024-12-09 11:55:46,455] [INFO] Start file updates sender, interval = 300 sec
[2024-12-09 11:55:46,700] [INFO] work weixin notice sender is disabled
[2024-12-09 11:55:46,700] [INFO] search indexer is started, interval = 259200 sec
[2024-12-09 11:55:46,702] [INFO] seahub email sender is started, interval = 1800 sec
[2024-12-09 11:55:46,716] [INFO] ldap sync is disabled
[2024-12-09 11:55:46,716] [INFO] virus scan is disabled
[2024-12-09 11:55:46,716] [INFO] data statistics is disabled
[2024-12-09 11:55:46,716] [INFO] content scan is disabled
[2024-12-09 11:55:47,125] [INFO] http server process already start.
[2024-12-09 11:55:47,125] [INFO] office converter started
[2024-12-09 11:55:47,125] [INFO] repo old file auto del scanner disabled
[2024-12-09 11:55:47,125] [INFO] User login statistics is disabled.
[2024-12-09 11:55:47,125] [INFO] Traffic statistics is disabled.
[2024-12-09 12:27:04,941] [INFO] starts to send email
[root@localhost logs]#
Search in the web UI spins forever, and tapping search in the app shows a blank page.
The cloud file browser also just keeps spinning.
[root@localhost ~]# cd /opt/seafile/logs/
[root@localhost logs]# tail -n 30 seafevents.log
[2024-12-09 11:55:47,125] [INFO] repo old file auto del scanner disabled
[2024-12-09 11:55:47,125] [INFO] User login statistics is disabled.
[2024-12-09 11:55:47,125] [INFO] Traffic statistics is disabled.
[2024-12-09 12:27:04,941] [INFO] starts to send email
[2024-12-09 12:57:04,952] [INFO] starts to send email
[2024-12-09 13:05:41,243] [INFO] [seafevents] database: sqlite3, path: /opt/seafile/pro-data/seafevents.db
[2024-12-09 13:05:41,258] [INFO] audit is enabled
[2024-12-09 13:05:41,259] [INFO] [seafevents] database: sqlite3, path: /opt/seafile/pro-data/seafevents.db
[2024-12-09 13:05:41,260] [INFO] [seafevents] database: mysql, name: seafile
[2024-12-09 13:05:41,604] [INFO] The file with the following suffix will be recorded into the file history: md,txt,doc,docx,xls,xlsx,ppt,pptx
[2024-12-09 13:05:41,606] [INFO] [seafevents] database: sqlite3, path: /opt/seafile/pro-data/seafevents.db
[2024-12-09 13:05:41,608] [INFO] Events publish to redis is disabled.
[2024-12-09 13:05:41,611] [INFO] LDAP section is not set, disable ldap sync.
[2024-12-09 13:05:41,613] [INFO] [virus_scan] scan_command option is not found in /opt/seafile/conf/seafile.conf, disable virus scan.
[2024-12-09 13:05:41,642] [WARNING] insane workers value "12"
[2024-12-09 13:05:41,642] [INFO] Subscribe to channels: {'seahub.stats', 'seahub.audit', 'seahub.draft', 'seaf_server.event', 'seaf_server.stats'}
[2024-12-09 13:05:41,654] [INFO] Starting background tasks.
[2024-12-09 13:05:41,655] [INFO] Start file updates sender, interval = 300 sec
[2024-12-09 13:05:41,657] [INFO] work weixin notice sender is disabled
[2024-12-09 13:05:41,657] [INFO] search indexer is started, interval = 259200 sec
[2024-12-09 13:05:41,658] [INFO] seahub email sender is started, interval = 1800 sec
[2024-12-09 13:05:41,659] [INFO] ldap sync is disabled
[2024-12-09 13:05:41,660] [INFO] virus scan is disabled
[2024-12-09 13:05:41,660] [INFO] data statistics is disabled
[2024-12-09 13:05:41,660] [INFO] content scan is disabled
[2024-12-09 13:05:41,733] [INFO] http server process already start.
[2024-12-09 13:05:41,734] [INFO] office converter started
[2024-12-09 13:05:41,734] [INFO] repo old file auto del scanner disabled
[2024-12-09 13:05:41,735] [INFO] User login statistics is disabled.
[2024-12-09 13:05:41,735] [INFO] Traffic statistics is disabled.
[root@localhost logs]#
Could you also post the seafile and seahub logs?
[root@localhost logs]# tail -n50 seafile.log
2024-12-09 11:55:35 http-server.c(409): fileserver: enable_async_indexing = 0
2024-12-09 11:55:35 http-server.c(421): fileserver: async_indexing_threshold = 700
2024-12-09 11:55:35 http-server.c(433): fileserver: fs_id_list_request_timeout = 300
2024-12-09 11:55:35 http-server.c(446): fileserver: max_sync_file_count = 100000
2024-12-09 11:55:35 http-server.c(461): fileserver: put_head_commit_request_timeout = 10
2024-12-09 11:55:35 ../common/license.c(718): License file /opt/seafile/seafile-license.txt does not exist, allow at most 3 trial users
2024-12-09 11:55:35 socket file exists, delete it anyway
2024-12-09 11:55:35 filelock-mgr.c(975): Cleaning expired file locks.
2024-12-09 11:55:44 start to serve on pipe client
2024-12-09 11:55:46 start to serve on pipe client
2024-12-09 11:55:46 start to serve on pipe client
2024-12-09 11:55:46 start to serve on pipe client
2024-12-09 11:55:46 start to serve on pipe client
2024-12-09 11:55:46 start to serve on pipe client
2024-12-09 11:56:10 start to serve on pipe client
2024-12-09 11:56:22 start to serve on pipe client
2024-12-09 12:00:13 start to serve on pipe client
2024-12-09 12:12:18 filelock-mgr.c(975): Cleaning expired file locks.
2024-12-09 12:35:23 filelock-mgr.c(975): Cleaning expired file locks.
2024-12-09 12:55:56 filelock-mgr.c(975): Cleaning expired file locks.
2024-12-09 13:05:30 ../common/seaf-utils.c(409): Use database Mysql
2024-12-09 13:05:30 http-server.c(236): fileserver: worker_threads = 10
2024-12-09 13:05:30 http-server.c(249): fileserver: backlog = 32
2024-12-09 13:05:30 http-server.c(264): fileserver: fixed_block_size = 8388608
2024-12-09 13:05:30 http-server.c(279): fileserver: web_token_expire_time = 3600
2024-12-09 13:05:30 http-server.c(294): fileserver: max_indexing_threads = 1
2024-12-09 13:05:30 http-server.c(309): fileserver: max_index_processing_threads= 3
2024-12-09 13:05:30 http-server.c(331): fileserver: cluster_shared_temp_file_mode = 600
2024-12-09 13:05:30 http-server.c(409): fileserver: enable_async_indexing = 0
2024-12-09 13:05:30 http-server.c(421): fileserver: async_indexing_threshold = 700
2024-12-09 13:05:30 http-server.c(433): fileserver: fs_id_list_request_timeout = 300
2024-12-09 13:05:30 http-server.c(446): fileserver: max_sync_file_count = 100000
2024-12-09 13:05:30 http-server.c(461): fileserver: put_head_commit_request_timeout = 10
2024-12-09 13:05:30 ../common/license.c(718): License file /opt/seafile/seafile-license.txt does not exist, allow at most 3 trial users
2024-12-09 13:05:30 socket file exists, delete it anyway
2024-12-09 13:05:30 filelock-mgr.c(975): Cleaning expired file locks.
2024-12-09 13:05:34 start to serve on pipe client
2024-12-09 13:05:41 start to serve on pipe client
2024-12-09 13:05:41 start to serve on pipe client
2024-12-09 13:05:41 start to serve on pipe client
2024-12-09 13:06:45 start to serve on pipe client
2024-12-09 13:07:52 start to serve on pipe client
2024-12-09 13:09:01 start to serve on pipe client
2024-12-09 13:09:14 start to serve on pipe client
2024-12-09 13:10:11 start to serve on pipe client
2024-12-09 13:20:55 filelock-mgr.c(975): Cleaning expired file locks.
2024-12-09 13:42:53 filelock-mgr.c(975): Cleaning expired file locks.
2024-12-09 14:06:08 filelock-mgr.c(975): Cleaning expired file locks.
2024-12-09 14:29:20 ../common/license.c(718): License file /opt/seafile/seafile-license.txt does not exist, allow at most 3 trial users
2024-12-09 14:29:20 filelock-mgr.c(975): Cleaning expired file locks.
[root@localhost logs]# tail -n 50 seahub.log
2024-12-09 14:42:19,303 [ERROR] seahub.api2.views:594 get ConnectionError(('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',))) caused by: ProtocolError(('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',)))
2024-12-09 14:42:19,305 [ERROR] django.request:228 log_response Internal Server Error: /api2/search/
2024-12-09 14:42:19,483 [WARNING] elasticsearch:222 log_request_fail HEAD http://192.168.12.254:9200/repofiles [status:N/A request:0.003s]
Traceback (most recent call last):
File "/opt/seafile/seafile-pro-server-8.0.17/seahub/thirdpart/urllib3/connectionpool.py", line 706, in urlopen
chunked=chunked,
File "/opt/seafile/seafile-pro-server-8.0.17/seahub/thirdpart/urllib3/connectionpool.py", line 445, in _make_request
six.raise_from(e, None)
File "<string>", line 3, in raise_from
File "/opt/seafile/seafile-pro-server-8.0.17/seahub/thirdpart/urllib3/connectionpool.py", line 440, in _make_request
httplib_response = conn.getresponse()
File "/usr/lib64/python3.6/http/client.py", line 1346, in getresponse
response.begin()
File "/usr/lib64/python3.6/http/client.py", line 307, in begin
version, status, reason = self._read_status()
File "/usr/lib64/python3.6/http/client.py", line 276, in _read_status
raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/seafile/seafile-pro-server-8.0.17/pro/python/elasticsearch/connection/http_urllib3.py", line 233, in perform_request
method, url, body, retries=Retry(False), headers=request_headers, **kw
File "/opt/seafile/seafile-pro-server-8.0.17/seahub/thirdpart/urllib3/connectionpool.py", line 756, in urlopen
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
File "/opt/seafile/seafile-pro-server-8.0.17/seahub/thirdpart/urllib3/util/retry.py", line 507, in increment
raise six.reraise(type(error), error, _stacktrace)
File "/opt/seafile/seafile-pro-server-8.0.17/seahub/thirdpart/urllib3/packages/six.py", line 769, in reraise
raise value.with_traceback(tb)
File "/opt/seafile/seafile-pro-server-8.0.17/seahub/thirdpart/urllib3/connectionpool.py", line 706, in urlopen
chunked=chunked,
File "/opt/seafile/seafile-pro-server-8.0.17/seahub/thirdpart/urllib3/connectionpool.py", line 445, in _make_request
six.raise_from(e, None)
File "<string>", line 3, in raise_from
File "/opt/seafile/seafile-pro-server-8.0.17/seahub/thirdpart/urllib3/connectionpool.py", line 440, in _make_request
httplib_response = conn.getresponse()
File "/usr/lib64/python3.6/http/client.py", line 1346, in getresponse
response.begin()
File "/usr/lib64/python3.6/http/client.py", line 307, in begin
version, status, reason = self._read_status()
File "/usr/lib64/python3.6/http/client.py", line 276, in _read_status
raise RemoteDisconnected("Remote end closed connection without"
urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',))
2024-12-09 14:42:19,484 [ERROR] seahub.api2.views:594 get ConnectionError(('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',))) caused by: ProtocolError(('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',)))
2024-12-09 14:42:19,484 [ERROR] django.request:228 log_response Internal Server Error: /api2/search/
2024-12-09 14:55:30,091 [WARNING] django.request:228 log_response Not Found: /www.baidu.com:443
2024-12-09 15:50:54,427 [WARNING] django.request:228 log_response Not Found: /
2024-12-09 15:51:02,349 [WARNING] django.request:228 log_response Not Found: /favicon.ico
2024-12-09 15:51:03,794 [WARNING] django.request:228 log_response Not Found: /
From the seahub log, it looks like the ES service couldn't be reached. If ES is deployed on a separate machine, confirm that the Seafile server can actually reach the ES service.
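For example, a quick check from the Seafile server against the host/port configured in seafevents.conf (adjust the address to your setup):

curl -v http://192.168.12.254:9200/
curl http://192.168.12.254:9200/_cluster/health?pretty

If the first command returns the ES version JSON and the cluster health is green or yellow, the service is reachable; a connection error or timeout points at the address, the port, or a firewall.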
It's deployed on the same server, and other computers can open the site on port 9200. Ever since I installed ES, Seafile has been shutting itself down after some unknown amount of time.
Here is the seafevents configuration:
[INDEX FILES]
enabled = true
lang = chinese
external_es_server = true
es_host = 192.168.12.254
es_port = 9200
#username = elastic
#password = sWCTMZB6RYje0EDo8wAN
#scheme = https
#cafile = /opt/seafile/seafile-pro-server-8.0.17/es_ssl
highlight = fvh
index_office_pdf = false
Sorry, I looked at the logs again and found the ES IP address was wrong. I had tried an external deployment before and forgot to change it back. I've now set the config file to the correct ES IP, but updating the index produces an error:
[root@localhost pro]# ./pro.py search --u
Updating search index, this may take a while...
12/09/2024 19:00:25 [INFO] seafes:208 main: storage: using filesystem storage backend
12/09/2024 19:00:25 [INFO] seafes:210 main: index office pdf: False
12/09/2024 19:00:26 [ERROR] seafes:158 start_index_local: Index process init error: RequestError(400, 'mapper_parsing_exception', 'analyzer [ik_smart] not found for field [filename]').
[root@localhost logs]# tail -n 50 seahub.log
2024-12-09 19:09:09,208 [INFO] seafevents:118 is_audit_enabled audit is enabled
2024-12-09 19:09:09,210 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:09:09,439 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:09:09,440 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:09:09,441 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:12:38,202 [INFO] seafevents.db:58 create_engine_from_conf [seafevents] database: sqlite3, path: /opt/seafile/pro-data/seafevents.db
2024-12-09 19:12:38,211 [INFO] seafevents.db:74 create_engine_from_conf [seafevents] database: mysql, name: seafile
2024-12-09 19:12:38,380 [INFO] seafevents.app.config:118 load_file_history_config The file with the following suffix will be recorded into the file history: md,txt,doc,docx,xls,xlsx,ppt,pptx
2024-12-09 19:12:38,386 [INFO] seafevents.db:58 create_engine_from_conf [seafevents] database: sqlite3, path: /opt/seafile/pro-data/seafevents.db
2024-12-09 19:12:38,387 [INFO] seafevents:118 is_audit_enabled audit is enabled
2024-12-09 19:12:38,388 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:12:38,448 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:12:38,448 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:12:38,449 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:15:47,091 [INFO] seafevents.db:58 create_engine_from_conf [seafevents] database: sqlite3, path: /opt/seafile/pro-data/seafevents.db
2024-12-09 19:15:47,122 [INFO] seafevents.db:74 create_engine_from_conf [seafevents] database: mysql, name: seafile
2024-12-09 19:15:47,603 [INFO] seafevents.app.config:118 load_file_history_config The file with the following suffix will be recorded into the file history: md,txt,doc,docx,xls,xlsx,ppt,pptx
2024-12-09 19:15:47,620 [INFO] seafevents.db:58 create_engine_from_conf [seafevents] database: sqlite3, path: /opt/seafile/pro-data/seafevents.db
2024-12-09 19:15:47,623 [INFO] seafevents:118 is_audit_enabled audit is enabled
2024-12-09 19:15:47,624 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:15:47,784 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:15:47,784 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:15:47,785 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:18:19,166 [INFO] seafevents.db:58 create_engine_from_conf [seafevents] database: sqlite3, path: /opt/seafile/pro-data/seafevents.db
2024-12-09 19:18:19,190 [INFO] seafevents.db:74 create_engine_from_conf [seafevents] database: mysql, name: seafile
2024-12-09 19:18:19,589 [INFO] seafevents.app.config:118 load_file_history_config The file with the following suffix will be recorded into the file history: md,txt,doc,docx,xls,xlsx,ppt,pptx
2024-12-09 19:18:19,604 [INFO] seafevents.db:58 create_engine_from_conf [seafevents] database: sqlite3, path: /opt/seafile/pro-data/seafevents.db
2024-12-09 19:18:19,606 [INFO] seafevents:118 is_audit_enabled audit is enabled
2024-12-09 19:18:19,607 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:18:19,740 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:18:19,741 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:18:19,741 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:18:57,438 [INFO] seafevents.db:58 create_engine_from_conf [seafevents] database: sqlite3, path: /opt/seafile/pro-data/seafevents.db
2024-12-09 19:18:57,449 [INFO] seafevents.db:74 create_engine_from_conf [seafevents] database: mysql, name: seafile
2024-12-09 19:18:57,613 [INFO] seafevents.app.config:118 load_file_history_config The file with the following suffix will be recorded into the file history: md,txt,doc,docx,xls,xlsx,ppt,pptx
2024-12-09 19:18:57,620 [INFO] seafevents.db:58 create_engine_from_conf [seafevents] database: sqlite3, path: /opt/seafile/pro-data/seafevents.db
2024-12-09 19:18:57,621 [INFO] seafevents:118 is_audit_enabled audit is enabled
2024-12-09 19:18:57,621 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:18:57,686 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:18:57,686 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:18:57,686 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:25:14,143 [INFO] seafevents.db:58 create_engine_from_conf [seafevents] database: sqlite3, path: /opt/seafile/pro-data/seafevents.db
2024-12-09 19:25:14,175 [INFO] seafevents.db:74 create_engine_from_conf [seafevents] database: mysql, name: seafile
2024-12-09 19:25:14,719 [INFO] seafevents.app.config:118 load_file_history_config The file with the following suffix will be recorded into the file history: md,txt,doc,docx,xls,xlsx,ppt,pptx
2024-12-09 19:25:14,741 [INFO] seafevents.db:58 create_engine_from_conf [seafevents] database: sqlite3, path: /opt/seafile/pro-data/seafevents.db
2024-12-09 19:25:14,744 [INFO] seafevents:118 is_audit_enabled audit is enabled
2024-12-09 19:25:14,745 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:25:14,957 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:25:14,958 [WARNING] root:16 parse_workers insane workers value "12"
2024-12-09 19:25:14,959 [WARNING] root:16 parse_workers insane workers value "12"
Seahub won't start any more. What's going on?
I've fixed the ES IP in the config file, but it still fails with the same errors. Sometimes seahub won't start at all, and only rebooting the machine helps.
[root@localhost logs]# tail -n50 seahub.log
2024-12-09 19:53:16,144 [WARNING] django.request:228 log_response Method Not Allowed (GET): /office-convert/status/
2024-12-09 19:53:18,178 [WARNING] django.request:228 log_response Method Not Allowed (GET): /office-convert/status/
2024-12-09 19:53:20,202 [WARNING] django.request:228 log_response Method Not Allowed (GET): /office-convert/status/
2024-12-09 19:53:22,679 [WARNING] django.request:228 log_response Method Not Allowed (GET): /office-convert/status/
2024-12-09 19:53:25,680 [WARNING] django.request:228 log_response Method Not Allowed (GET): /office-convert/status/
2024-12-09 19:53:26,042 [WARNING] django.request:228 log_response Method Not Allowed (GET): /office-convert/static/cd920512-7be3-4b4e-85ec-a32ad89e20db/bc149a559035e0c0e03e87c7018fcb7e8fa30f23/1701项目实施文档1.docx/fake.pdf
2024-12-09 19:53:38,073 [WARNING] django.request:228 log_response Method Not Allowed (GET): /office-convert/status/
2024-12-09 19:53:38,167 [WARNING] django.request:228 log_response Method Not Allowed (GET): /office-convert/static/cd920512-7be3-4b4e-85ec-a32ad89e20db/bc149a559035e0c0e03e87c7018fcb7e8fa30f23/1701项目实施文档1.docx/fake.pdf
2024-12-09 19:53:53,188 [INFO] elasticsearch:194 log_request_success HEAD http://192.168.12.253:9200/repofiles [status:200 request:0.006s]
2024-12-09 19:53:53,196 [WARNING] elasticsearch:222 log_request_fail GET http://192.168.12.253:9200/repofiles/_search [status:400 request:0.005s]
2024-12-09 19:53:53,196 [ERROR] seahub.api2.views:594 get RequestError(400, 'search_phase_execution_exception', '[match] analyzer [ik_smart] not found')
2024-12-09 19:53:53,198 [ERROR] django.request:228 log_response Internal Server Error: /api2/search/
2024-12-09 19:53:53,381 [INFO] elasticsearch:194 log_request_success HEAD http://192.168.12.253:9200/repofiles [status:200 request:0.005s]
2024-12-09 19:53:53,387 [WARNING] elasticsearch:222 log_request_fail GET http://192.168.12.253:9200/repofiles/_search [status:400 request:0.005s]
2024-12-09 19:53:53,387 [ERROR] seahub.api2.views:594 get RequestError(400, 'search_phase_execution_exception', '[match] analyzer [ik_smart] not found')
2024-12-09 19:53:53,388 [ERROR] django.request:228 log_response Internal Server Error: /api2/search/
2024-12-09 20:02:23,136 [WARNING] django.request:228 log_response Not Found: /.env
2024-12-09 20:15:29,498 [WARNING] django.request:228 log_response Not Found: /YlSD
2024-12-09 20:17:31,757 [INFO] elasticsearch:194 log_request_success HEAD http://192.168.12.253:9200/repofiles [status:200 request:0.005s]
2024-12-09 20:17:31,764 [WARNING] elasticsearch:222 log_request_fail GET http://192.168.12.253:9200/repofiles/_search [status:400 request:0.007s]
2024-12-09 20:17:31,764 [ERROR] seahub.api2.views:594 get RequestError(400, 'search_phase_execution_exception', '[match] analyzer [ik_smart] not found')
2024-12-09 20:17:31,765 [ERROR] django.request:228 log_response Internal Server Error: /api2/search/
2024-12-09 20:17:32,019 [INFO] elasticsearch:194 log_request_success HEAD http://192.168.12.253:9200/repofiles [status:200 request:0.004s]
2024-12-09 20:17:32,025 [WARNING] elasticsearch:222 log_request_fail GET http://192.168.12.253:9200/repofiles/_search [status:400 request:0.005s]
2024-12-09 20:17:32,025 [ERROR] seahub.api2.views:594 get RequestError(400, 'search_phase_execution_exception', '[match] analyzer [ik_smart] not found')
2024-12-09 20:17:32,026 [ERROR] django.request:228 log_response Internal Server Error: /api2/search/
2024-12-09 20:32:12,842 [INFO] seafevents.db:58 create_engine_from_conf [seafevents] database: sqlite3, path: /opt/seafile/pro-data/seafevents.db
2024-12-09 20:32:12,858 [INFO] seafevents.db:74 create_engine_from_conf [seafevents] database: mysql, name: seafile
2024-12-09 20:32:13,103 [INFO] seafevents.app.config:118 load_file_history_config The file with the following suffix will be recorded into the file history: md,txt,doc,docx,xls,xlsx,ppt,pptx
2024-12-09 20:32:13,111 [INFO] seafevents.db:58 create_engine_from_conf [seafevents] database: sqlite3, path: /opt/seafile/pro-data/seafevents.db
2024-12-09 20:32:13,112 [INFO] seafevents:118 is_audit_enabled audit is enabled
2024-12-09 20:32:13,113 [WARNING] root:16 parse_workers insane workers value “12”
2024-12-09 20:32:13,218 [WARNING] root:16 parse_workers insane workers value “12”
2024-12-09 20:32:13,218 [WARNING] root:16 parse_workers insane workers value “12”
2024-12-09 20:32:13,218 [WARNING] root:16 parse_workers insane workers value “12”
2024-12-09 20:33:46,368 [INFO] seafes:135 load_seafevents_conf [seafes] use language chinese
2024-12-09 20:33:46,368 [INFO] seafes:162 load_seafevents_conf [seafes] use highlighter fvh
2024-12-09 20:33:49,325 [INFO] seafes:135 load_seafevents_conf [seafes] use language chinese
2024-12-09 20:33:49,325 [INFO] seafes:162 load_seafevents_conf [seafes] use highlighter fvh
2024-12-09 20:33:49,634 [ERROR] seahub.avatar.models:116 create_thumbnail [Errno 2] No such file or directory: '/opt/seafile/seafile-pro-server-8.0.17/seahub/media/avatars/c/c/4f4e301610918a1a453844268935f6/b9876e5d57a4d69693a939118e417582.png'
2024-12-09 20:33:52,186 [INFO] seafes:135 load_seafevents_conf [seafes] use language chinese
2024-12-09 20:33:52,186 [INFO] seafes:162 load_seafevents_conf [seafes] use highlighter fvh
2024-12-09 20:33:53,626 [INFO] elasticsearch:194 log_request_success HEAD http://192.168.12.253:9200/repofiles [status:200 request:0.075s]
2024-12-09 20:33:53,692 [WARNING] elasticsearch:222 log_request_fail GET http://192.168.12.253:9200/repofiles/_search [status:400 request:0.063s]
2024-12-09 20:33:53,693 [ERROR] seahub.api2.views:594 get RequestError(400, 'search_phase_execution_exception', '[match] analyzer [ik_smart] not found')
2024-12-09 20:33:53,696 [ERROR] django.request:228 log_response Internal Server Error: /api2/search/
2024-12-09 20:33:53,786 [INFO] elasticsearch:194 log_request_success HEAD http://192.168.12.253:9200/repofiles [status:200 request:0.014s]
2024-12-09 20:33:53,795 [WARNING] elasticsearch:222 log_request_fail GET http://192.168.12.253:9200/repofiles/_search [status:400 request:0.009s]
2024-12-09 20:33:53,795 [ERROR] seahub.api2.views:594 get RequestError(400, 'search_phase_execution_exception', '[match] analyzer [ik_smart] not found')
2024-12-09 20:33:53,796 [ERROR] django.request:228 log_response Internal Server Error: /api2/search/
[root@localhost logs]#
Judging from the error log, ES needs the IK analysis plugin installed to provide the ik_smart analyzer.
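A sketch of installing it, assuming the commonly used IK plugin (medcl/elasticsearch-analysis-ik); the plugin release has to match the ES version exactly (5.6.16 here), and the URL below follows the project's usual release naming, so verify it against the actual release page:

cd /usr/share/elasticsearch/bin
sudo ./elasticsearch-plugin install https://github.com/medcl/elasticsearch-analysis-ik/releases/download/v5.6.16/elasticsearch-analysis-ik-5.6.16.zip
sudo systemctl restart elasticsearch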
[seafer@localhost bin]$ sudo ./elasticsearch-plugin install file://usr/share/elasticsearch/plugins/ik
-> Downloading file://usr/share/elasticsearch/plugins/ik
Exception in thread "main" java.net.UnknownHostException: usr
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:184)
at java.net.Socket.connect(Socket.java:607)
at sun.net.ftp.impl.FtpClient.doConnect(FtpClient.java:1062)
at sun.net.ftp.impl.FtpClient.tryConnect(FtpClient.java:1024)
at sun.net.ftp.impl.FtpClient.connect(FtpClient.java:1119)
at sun.net.ftp.impl.FtpClient.connect(FtpClient.java:1105)
at sun.net.www.protocol.ftp.FtpURLConnection.connect(FtpURLConnection.java:311)
at sun.net.www.protocol.ftp.FtpURLConnection.getInputStream(FtpURLConnection.java:417)
at org.elasticsearch.plugins.InstallPluginCommand.downloadZip(InstallPluginCommand.java:328)
at org.elasticsearch.plugins.InstallPluginCommand.download(InstallPluginCommand.java:248)
at org.elasticsearch.plugins.InstallPluginCommand.execute(InstallPluginCommand.java:216)
at org.elasticsearch.plugins.InstallPluginCommand.execute(InstallPluginCommand.java:202)
at org.elasticsearch.cli.EnvironmentAwareCommand.execute(EnvironmentAwareCommand.java:70)
at org.elasticsearch.cli.Command.mainWithoutErrorHandling(Command.java:134)
at org.elasticsearch.cli.MultiCommand.execute(MultiCommand.java:69)
at org.elasticsearch.cli.Command.mainWithoutErrorHandling(Command.java:134)
at org.elasticsearch.cli.Command.main(Command.java:90)
at org.elasticsearch.plugins.PluginCli.main(PluginCli.java:47)
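The UnknownHostException above comes from the malformed URL: with only two slashes, file://usr/... makes Java treat "usr" as a remote host name (hence the FTP client in the stack trace). A file URL needs three slashes and should point at the plugin zip rather than an unpacked plugins directory, along these lines (zip path hypothetical):

sudo ./elasticsearch-plugin install file:///usr/share/elasticsearch/elasticsearch-analysis-ik-5.6.16.zip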
[seafer@localhost seafile-server-latest]$ ./pro/pro.py search --u
Updating search index, this may take a while...
12/09/2024 22:00:08 [INFO] seafes:208 main: storage: using filesystem storage backend
12/09/2024 22:00:08 [INFO] seafes:210 main: index office pdf: False
12/09/2024 22:00:08 [INFO] seafes:161 start_index_local: Index process initialized.
12/09/2024 22:00:08 [INFO] seafes:46 run: starting worker0 worker threads for indexing
12/09/2024 22:00:08 [INFO] seafes:46 run: starting worker1 worker threads for indexing
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 42a1e2ff-cb7d-4c46-a105-d01514eb6bf6 already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo cd920512-7be3-4b4e-85ec-a32ad89e20db already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo e8b28d37-9012-471f-90a3-67db8ea34c63 already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo e0dfcf9c-6a11-4245-82f3-5fd9cbee734d already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 15ac263d-a1d5-4cd2-83a4-0794ee3175db already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 77ada7bc-8d84-4590-9eb6-822e16f07888 already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 6afad949-0e2a-475c-bbe8-634ea001fe45 already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 853d92f6-2121-4f18-9d3b-d66ebd8ebd8b already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 346bcadd-19f6-4a66-b6be-bbccf1c4973e already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 124adc82-0645-41e7-9473-4a44c2751582 already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 65da2da8-7e8c-4bf4-ba83-f8264cc5cecf already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 9ddc0dc4-51a4-4f5d-af41-f4818d653acd already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo b212224c-110e-4da7-8c6d-6ae0597b462f already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 5a5080f3-2d4e-442c-87d7-01daf7507365 already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 42a044e6-79de-44fe-be4d-1f376cece2f0 already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 342e8f3e-00ad-46b4-bf25-648998d0b4ea already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 85af458f-23dc-4b87-ae92-62b32ed28539 already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 297e8b54-a0cd-45d3-89fd-d1abb22a562a already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 8d867c07-e5a8-4e5f-afd7-4464193ed233 already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 2802be7a-5f86-4048-a7c3-374ee7568a4e already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 6f9c9d01-aa08-4c9c-80af-200b7b31472f already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 490f9133-dc4a-4ff7-9e7f-d0d8949cbc01 already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo fa759a04-c3e2-416b-a0ed-316bd2688259 already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 7f8cdb43-8deb-4eb6-a2f3-83578978118b already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 96195839-eba3-4a2c-82cc-dae422b8ea68 already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 5376e267-7b69-4f55-8384-3feba2bcc4e5 already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo de9e16d3-2485-4a39-aa6a-ea053e9ed4f6 already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 324f13e5-84f3-46f5-ba52-442f75ef9dff already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo fe06cd23-af40-47f3-81ff-ba0fd497ecc0 already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:96 thread_task: Queue is empty, worker0 worker threads stop
12/09/2024 22:00:10 [INFO] seafes:122 thread_task: worker0 worker updated at 2024-12-09 22:00 time
12/09/2024 22:00:10 [INFO] seafes:127 thread_task: worker0 worker get 0 error
12/09/2024 22:00:10 [DEBUG] seafes:93 update_repo: Repo 6b3e9b50-fb12-4ba2-85d5-6e71f35b9977 already uptodate
12/09/2024 22:00:10 [DEBUG] seafes:96 thread_task: Queue is empty, worker1 worker threads stop
12/09/2024 22:00:10 [INFO] seafes:122 thread_task: worker1 worker updated at 2024-12-09 22:00 time
12/09/2024 22:00:10 [INFO] seafes:127 thread_task: worker1 worker get 0 error
12/09/2024 22:00:10 [INFO] seafes:38 clear_worker: All worker threads has stopped.
12/09/2024 22:00:10 [INFO] seafes:72 run: index updated, total time 2.164555549621582 seconds
12/09/2024 22:00:10 [INFO] seafes:131 clear_deleted_repo: start to clear deleted repo
12/09/2024 22:00:11 [INFO] seafes:135 clear_deleted_repo: 0 repos need to be deleted.
12/09/2024 22:00:11 [INFO] seafes:139 clear_deleted_repo: deleted repo has been cleared
12/09/2024 22:00:11 [INFO] seafes:164 start_index_local:
Index updated, statistic report:
12/09/2024 22:00:11 [INFO] seafes:165 start_index_local: [commit read] 0
12/09/2024 22:00:11 [INFO] seafes:166 start_index_local: [dir read] 0
12/09/2024 22:00:11 [INFO] seafes:167 start_index_local: [file read] 0
12/09/2024 22:00:11 [INFO] seafes:168 start_index_local: [block read] 0
[seafer@localhost seafile-server-latest]$
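Note that this run read 0 commits and 0 files, since every repo was already marked up to date. If files indexed before the IK plugin was in place still don't show up in search results, the index can be rebuilt from scratch (commands as documented for Seafile Pro; --clear deletes the whole search index first):

./pro/pro.py search --clear
./pro/pro.py search --update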