
problem with jdbc run #977

Open
kavehyzd opened this issue Oct 21, 2018 · 1 comment
Comments

@kavehyzd

Hi, I'm having problems with the JDBC importer version 2.3.4.0 and Elasticsearch 6.4.2.

This is my config in the .bat script file:

@echo off

set DIR=%~dp0
set LIB=%DIR%..\lib\*
set BIN=%DIR%..\bin

REM ???
echo {^
    "type" : "jdbc",^
    "jdbc" : {^
        "url" : "jdbc:mysql://localhost:3306/seotools",^
        "user" : "root",^
        "password" : "",^
        "sql" :  "select *  from category",^
        "treat_binary_as_string" : true,^
        "elasticsearch" : {^
             "cluster" : "elasticsearch",^
             "host" : "localhost",^
             "port" : 9200^
        },^
        "index" : "testing"^
      }^
}|"%JAVA_HOME%\bin\java" -cp "%LIB%" -Dlog4j.configurationFile="%BIN%\log4j2.xml" "org.xbib.tools.Runner" "org.xbib.tools.JDBCImporter"


And when I run the script, I get this error in the log file:

[15:37:10,920][INFO ][importer.jdbc            ][pool-3-thread-1] strategy standard: settings = {elasticsearch.cluster=elasticsearch, elasticsearch.host=localhost, elasticsearch.port=9300, index=testing, password=, sql=select *  from category, treat_binary_as_string=true, url=jdbc:mysql://localhost:3306/seotools, user=root}, context = org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext@1b7a157f
[15:37:10,927][INFO ][importer.jdbc.context.standard][pool-3-thread-1] found sink class org.xbib.elasticsearch.jdbc.strategy.standard.StandardSink@302e7e94
[15:37:10,935][INFO ][importer.jdbc.context.standard][pool-3-thread-1] found source class org.xbib.elasticsearch.jdbc.strategy.standard.StandardSource@33d74c20
[15:37:10,987][INFO ][org.xbib.elasticsearch.helper.client.BaseTransportClient][pool-3-thread-1] creating transport client on Windows 10 Java HotSpot(TM) 64-Bit Server VM Oracle Corporation 1.8.0_121-b13 25.121-b13 with effective settings {autodiscover=false, client.transport.ignore_cluster_name=false, client.transport.nodes_sampler_interval=5s, client.transport.ping_timeout=5s, cluster.name=elasticsearch, flush_interval=5s, host.0=localhost, max_actions_per_request=10000, max_concurrent_requests=8, max_volume_per_request=10mb, name=importer, port=9300, sniff=false}
[15:37:11,010][INFO ][org.elasticsearch.plugins][pool-3-thread-1] [importer] modules [], plugins [helper], sites []
[15:37:11,591][INFO ][org.xbib.elasticsearch.helper.client.BaseTransportClient][pool-3-thread-1] trying to connect to [localhost/127.0.0.1:9300]
[15:37:11,729][INFO ][org.elasticsearch.org.xbib.elasticsearch.helper.client.TransportClient][pool-3-thread-1] [importer] failed to get node info for {#transport#-1}{127.0.0.1}{localhost/127.0.0.1:9300}, disconnecting...
org.elasticsearch.transport.NodeDisconnectedException: [][localhost/127.0.0.1:9300][cluster:monitor/nodes/liveness] disconnected
[15:37:11,731][ERROR][importer.jdbc            ][pool-3-thread-1] error while processing request: no cluster nodes available, check settings {autodiscover=false, client.transport.ignore_cluster_name=false, client.transport.nodes_sampler_interval=5s, client.transport.ping_timeout=5s, cluster.name=elasticsearch, flush_interval=5s, host.0=localhost, max_actions_per_request=10000, max_concurrent_requests=8, max_volume_per_request=10mb, name=importer, port=9300, sniff=false}
org.elasticsearch.client.transport.NoNodeAvailableException: no cluster nodes available, check settings {autodiscover=false, client.transport.ignore_cluster_name=false, client.transport.nodes_sampler_interval=5s, client.transport.ping_timeout=5s, cluster.name=elasticsearch, flush_interval=5s, host.0=localhost, max_actions_per_request=10000, max_concurrent_requests=8, max_volume_per_request=10mb, name=importer, port=9300, sniff=false}
	at org.xbib.elasticsearch.helper.client.BulkTransportClient.init(BulkTransportClient.java:164) ~[elasticsearch-helper-2.3.4.0.jar:?]
	at org.xbib.elasticsearch.helper.client.ClientBuilder.toBulkTransportClient(ClientBuilder.java:113) ~[elasticsearch-helper-2.3.4.0.jar:?]
	at org.xbib.elasticsearch.jdbc.strategy.standard.StandardSink.createClient(StandardSink.java:348) ~[elasticsearch-jdbc-2.3.4.0.jar:?]
	at org.xbib.elasticsearch.jdbc.strategy.standard.StandardSink.beforeFetch(StandardSink.java:100) ~[elasticsearch-jdbc-2.3.4.0.jar:?]
	at org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext.beforeFetch(StandardContext.java:183) ~[elasticsearch-jdbc-2.3.4.0.jar:?]
	at org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext.execute(StandardContext.java:164) ~[elasticsearch-jdbc-2.3.4.0.jar:?]
	at org.xbib.tools.JDBCImporter.process(JDBCImporter.java:199) ~[elasticsearch-jdbc-2.3.4.0.jar:?]
	at org.xbib.tools.JDBCImporter.newRequest(JDBCImporter.java:185) [elasticsearch-jdbc-2.3.4.0.jar:?]
	at org.xbib.tools.JDBCImporter.newRequest(JDBCImporter.java:51) [elasticsearch-jdbc-2.3.4.0.jar:?]
	at org.xbib.pipeline.AbstractPipeline.call(AbstractPipeline.java:50) [elasticsearch-jdbc-2.3.4.0.jar:?]
	at org.xbib.pipeline.AbstractPipeline.call(AbstractPipeline.java:16) [elasticsearch-jdbc-2.3.4.0.jar:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_121]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_121]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_121]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_121]

Please help me to run JDBC with Elasticsearch.

Thanks.

@HappyToSummer

Make sure your ES node is alive. You can test it with the head plugin: http://xxxx:9200/_plugin/head/
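
As a minimal sketch of such a connectivity check, assuming curl is available on the machine and Elasticsearch is listening on the default ports: the HTTP endpoint on 9200 should answer, and the reported cluster_name should match the cluster value in the importer config (the importer itself connects over the transport port 9300). Note that on ES 5+ the head plugin is no longer a site plugin, so a plain HTTP check is simpler.

@echo off
REM Minimal connectivity sketch (assumes curl is on the PATH and default ports).
REM The root endpoint returns node info, including cluster_name and version.number.
curl -s http://localhost:9200
REM Cluster health should report status green or yellow.
curl -s http://localhost:9200/_cluster/health?pretty
REM Note: the JDBC importer connects to the transport port (9300), not the HTTP port (9200).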
