
dynomite crash #827

Open
ynsdll opened this issue Nov 13, 2024 · 0 comments

Dynomite comes up and runs cleanly, but when I try to set a key in Redis through redis-cli, Dynomite crashes.

Dynomite version: v0.8
Redis server v=7.0.15
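
For reference, a minimal way to trigger the crash (assuming redis-cli is run on the dc1 node against the client listener port 8379 from dc1.yaml below; the key name and value are arbitrary):

# connect to Dynomite's client port (listen: 0.0.0.0:8379) and set any key
redis-cli -h 127.0.0.1 -p 8379 set foo bar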

dc1.yaml


dyn_o_mite:
  datacenter: dc-a
  rack: rack1
  dyn_listen: 0.0.0.0:7379
  dyn_seed_provider: simple_provider
  preconnect: true
  dyn_seeds:
    - 172.31.43.150:7379:rack1:dc-b:1383429731
  listen: 0.0.0.0:8379
  servers:
    - 127.0.0.1:6379:1
  tokens: '12345678'
  secure_server_option: datacenter
  pem_key_file: /home/ubuntu/dynomite.pem
  data_store: 0
  timeout: 30000
  stats_listen: 127.0.0.1:2222

dc2.yaml

dyn_o_mite:
  datacenter: dc-b
  rack: rack1
  dyn_listen: 0.0.0.0:7379
  dyn_seed_provider: simple_provider
  preconnect: true
  dyn_seeds:
  - 172.31.40.87:7379:rack1:dc-a:12345678
  listen: 0.0.0.0:8379
  servers:
  - 127.0.0.1:6379:1
  tokens: '1383429731'
  secure_server_option: datacenter
  pem_key_file: /home/ubuntu/dynomite.pem
  data_store: 0
  timeout: 30000
  stats_listen: 127.0.0.1:22225

dc1 dynomite logs:

[2024-11-13 14:21:16.553] dn_print_run:205 dynomite-v0.6.21-rc2-23-g7bb74b0-dirty built for Linux 6.8.0-1016-aws x86_64 started on pid 42567
[2024-11-13 14:21:16.553] dn_print_run:209 run, rabbit run / dig that hole, forget the sun / and when at last the work is done / don't sit down / it's time to dig another one
[2024-11-13 14:21:16.553] dn_print_run:214
     #                                      m
  mmm#  m   m  mmmm    mmm   mmmmm  mmm    mm#mm   mmm
 #   #  \m m/  #   #  #   #  # # #    #      #    #   #
 #   #   #m#   #   #  #   #  # # #    #      #    #''''
 \#m##   \#    #   #   #m#   # # #  mm#mm    mm    #mm
         m/
        ##

[2024-11-13 14:21:16.553] conf_datastore_transform:168 Created <DATASTORE 0x5811330adf30 127.0.0.1:6379:1>
[2024-11-13 14:21:16.556] stats_listen:1342 m 4 listening on '127.0.0.1:22223'
[2024-11-13 14:21:16.556] conn_connect:283 <CONN_SERVER 0x5811330efb80 6 to '127.0.0.1:6379:1'> connecting.....
[2024-11-13 14:21:16.556] proxy_init:113 <CONN_PROXY 0x581133108f20 8 listening on '0.0.0.0:8379'> inited in redis <POOL 0x5811330ad4e0 'dyn_o_mite'>
[2024-11-13 14:21:16.556] dnode_proxy_init:101 <CONN_PEER_PROXY 0x5811331091d0 9 listening on '0.0.0.0:7379'> inited in redis <POOL 0x5811330ad4e0 'dyn_o_mite'>
[2024-11-13 14:21:16.556] dnode_peer_add_local:155 Initialized local peer: <NODE 0x5811330fd2e0 0.0.0.0 dc-a rack1 secured:0>
[2024-11-13 14:21:16.556] dnode_initialize_peer_each:233 added peer <NODE 0x5811330fb6a0 172.31.43.150 dc-b rack1 secured:1>
[2024-11-13 14:21:16.556] conn_connect:283 <CONN_REMOTE_PEER_SERVER 0x581133109480 10 to '172.31.43.150:7379:rack1:dc-b:1383429731'> connecting.....
[2024-11-13 14:21:16.556] preselect_remote_rack_for_replication:1411 my rack index 0
[2024-11-13 14:21:16.556] preselect_remote_rack_for_replication:1434 Selected rack rack1 for replication to remote region dc-b
[2024-11-13 14:21:16.556] core_start:325 mbuf_size not in YAML: using deprecated way  16384
[2024-11-13 14:21:16.556] core_start:333 max_msgs not in YAML: using deprecated way 200000
[2024-11-13 14:21:16.557] server_connected:297 <CONN_SERVER 0x5811330efb80 6 to '127.0.0.1:6379:1'> connected
[2024-11-13 14:21:16.557] core_close:412 close <CONN_REMOTE_PEER_SERVER 0x581133109480 10 to '172.31.43.150:7379:rack1:dc-b:1383429731'> on event FFFFFF eof 0 done 0 rb 0 sb 0: Connection refused
[2024-11-13 14:21:16.557] dnode_peer_close:420 <CONN_REMOTE_PEER_SERVER 0x581133109480 10 to '172.31.43.150:7379:rack1:dc-b:1383429731'> Closing, Dropped 0 outqueue & 0 inqueue requests
[2024-11-13 14:21:16.557] event_del_conn:211 epoll ctl on e 5 sd 10 failed: No such file or directory
[2024-11-13 14:21:16.557] dnode_peer_unref:55 Marking <NODE 0x5811330fb6a0 172.31.43.150 dc-b rack1 secured:1> as down
[2024-11-13 14:21:17.558] conn_connect:283 <CONN_REMOTE_PEER_SERVER 0x581133109480 10 to '172.31.43.150:7379:rack1:dc-b:1383429731'> connecting.....
[2024-11-13 14:21:17.559] core_close:412 close <CONN_REMOTE_PEER_SERVER 0x581133109480 10 to '172.31.43.150:7379:rack1:dc-b:1383429731'> on event FFFFFF eof 0 done 0 rb 0 sb 0: Connection refused
[2024-11-13 14:21:17.559] dnode_peer_close:420 <CONN_REMOTE_PEER_SERVER 0x581133109480 10 to '172.31.43.150:7379:rack1:dc-b:1383429731'> Closing, Dropped 0 outqueue & 0 inqueue requests
[2024-11-13 14:21:17.559] event_del_conn:211 epoll ctl on e 5 sd 10 failed: No such file or directory
[2024-11-13 14:21:17.559] dnode_peer_unref:55 Marking <NODE 0x5811330fb6a0 172.31.43.150 dc-b rack1 secured:1> as down
[2024-11-13 14:21:18.681] dnode_accept:156 Accepting client connection from 172.31.43.150:57456 on sd 10
[2024-11-13 14:21:18.682] dnode_accept:201 <CONN_PEER_PROXY 0x5811331091d0 9 listening on '0.0.0.0:7379'> accepted <CONN_LOCAL_PEER_CLIENT 0x581133109480 10 from '172.31.43.150:57456'>
[2024-11-13 14:21:19.560] conn_connect:283 <CONN_REMOTE_PEER_SERVER 0x5811331173a0 11 to '172.31.43.150:7379:rack1:dc-b:1383429731'> connecting.....
[2024-11-13 14:21:19.561] dnode_peer_connected:754 <CONN_REMOTE_PEER_SERVER 0x5811331173a0 11 to '172.31.43.150:7379:rack1:dc-b:1383429731'> connected
[2024-11-13 14:21:20.719] proxy_accept:203 <CONN_PROXY 0x581133108f20 8 listening on '0.0.0.0:8379'> accepted <CONN_CLIENT 0x581133117650 12 from '172.31.40.87:43270'>
[2024-11-13 14:21:20.719] redis_parse_req:1583 parsed unsupported command 'COMMAND'
[2024-11-13 14:21:20.719] redis_parse_req:2383 parsed bad req 1 res 1 type 0 state 5
00000000  2a 32 0d 0a 24 37 0d 0a  43 4f 4d 4d 41 4e 44 0d   |*2..$7..COMMAND.|
00000010  0a 24 34 0d 0a 44 4f 43  53 0d 0a                  |.$4..DOCS..|
[2024-11-13 14:21:20.719] core_close:412 close <CONN_CLIENT 0x581133117650 12 from '172.31.40.87:43270'> on event FFFF eof 0 done 0 rb 27 sb 0: Invalid argument
[2024-11-13 14:21:20.719] client_unref_internal_try_put:97 <CONN_CLIENT 0x581133117650 -1 from '172.31.40.87:43270'> unref owner <POOL 0x5811330ad4e0 'dyn_o_mite'>
[2024-11-13 14:21:24.141] proxy_accept:203 <CONN_PROXY 0x581133108f20 8 listening on '0.0.0.0:8379'> accepted <CONN_CLIENT 0x581133117650 12 from '172.31.40.87:43280'>
[2024-11-13 14:21:24.154] dn_stacktrace:286 [0] /lib/x86_64-linux-gnu/libc.so.6(+0x45320) [0x74986e045320]
??:0
[2024-11-13 14:21:24.157] dn_stacktrace:286 [1] ./dynomite(dmsg_get+0x2f) [0x581131a1dc2f]
??:0
[2024-11-13 14:21:24.159] dn_stacktrace:286 [2] ./dynomite(+0x1f48d) [0x581131a1e48d]
??:0
[2024-11-13 14:21:24.160] dn_stacktrace:286 [3] ./dynomite(dyn_parse_rsp+0x43) [0x581131a1e9c3]
??:0
[2024-11-13 14:21:24.162] dn_stacktrace:286 [4] ./dynomite(msg_recv+0x13b) [0x581131a2871b]
??:0
[2024-11-13 14:21:24.164] dn_stacktrace:286 [5] ./dynomite(core_core+0xdf) [0x581131a196af]
??:0
[2024-11-13 14:21:24.166] dn_stacktrace:286 [6] ./dynomite(event_wait+0xb9) [0x581131a4c4c9]
??:0
[2024-11-13 14:21:24.168] dn_stacktrace:286 [7] ./dynomite(core_loop+0x177) [0x581131a1aa07]
??:0
[2024-11-13 14:21:24.170] dn_stacktrace:286 [8] ./dynomite(main+0x728) [0x581131a0e988]
??:0
[2024-11-13 14:21:24.172] dn_stacktrace:286 [9] /lib/x86_64-linux-gnu/libc.so.6(+0x2a1ca) [0x74986e02a1ca]
??:0
[2024-11-13 14:21:24.174] dn_stacktrace:286 [10] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0x8b) [0x74986e02a28b]
??:0
[2024-11-13 14:21:24.177] dn_stacktrace:286 [11] ./dynomite(_start+0x25) [0x581131a0ed85]
??:0
[2024-11-13 14:21:24.178] signal_handler:123 signal 11 (SIGSEGV) received, core dumping
Segmentation fault (core dumped)

dc2 dynomite logs:

[2024-11-13 14:21:18.676] conf_datastore_transform:168 Created <DATASTORE 0x6424c7e7eef0 127.0.0.1:6379:1>
[2024-11-13 14:21:18.679] stats_listen:1342 m 4 listening on '127.0.0.1:22225'
[2024-11-13 14:21:18.680] conn_connect:283 <CONN_SERVER 0x6424c7ec0b80 6 to '127.0.0.1:6379:1'> connecting.....
[2024-11-13 14:21:18.680] proxy_init:113 <CONN_PROXY 0x6424c7ed00d0 8 listening on '0.0.0.0:8379'> inited in redis <POOL 0x6424c7e7e4e0 'dyn_o_mite'>
[2024-11-13 14:21:18.680] dnode_proxy_init:101 <CONN_PEER_PROXY 0x6424c7ed9e40 9 listening on '0.0.0.0:7379'> inited in redis <POOL 0x6424c7e7e4e0 'dyn_o_mite'>
[2024-11-13 14:21:18.680] dnode_peer_add_local:155 Initialized local peer: <NODE 0x6424c7ece200 0.0.0.0 dc-b rack1 secured:0>
[2024-11-13 14:21:18.680] dnode_initialize_peer_each:233 added peer <NODE 0x6424c7ecc5c0 172.31.40.87 dc-a rack1 secured:1>
[2024-11-13 14:21:18.680] conn_connect:283 <CONN_REMOTE_PEER_SERVER 0x6424c7eda1f0 10 to '172.31.40.87:7379:rack1:dc-a:12345678'> connecting.....
[2024-11-13 14:21:18.680] preselect_remote_rack_for_replication:1411 my rack index 0
[2024-11-13 14:21:18.680] preselect_remote_rack_for_replication:1434 Selected rack rack1 for replication to remote region dc-a
[2024-11-13 14:21:18.681] core_start:325 mbuf_size not in YAML: using deprecated way 16384
[2024-11-13 14:21:18.681] core_start:333 max_msgs not in YAML: using deprecated way 200000
[2024-11-13 14:21:18.681] server_connected:297 <CONN_SERVER 0x6424c7ec0b80 6 to '127.0.0.1:6379:1'> connected
[2024-11-13 14:21:18.681] dnode_peer_connected:754 <CONN_REMOTE_PEER_SERVER 0x6424c7eda1f0 10 to '172.31.40.87:7379:rack1:dc-a:12345678'> connected
[2024-11-13 14:21:19.561] dnode_accept:156 Accepting client connection from 172.31.40.87:42228 on sd 11
[2024-11-13 14:21:19.561] dnode_accept:201 <CONN_PEER_PROXY 0x6424c7ed9e40 9 listening on '0.0.0.0:7379'> accepted <CONN_LOCAL_PEER_CLIENT 0x6424c7ee8110 11 from '172.31.40.87:42228'>
[2024-11-13 14:21:24.153] dyn_parse_req:446 AES decryption key: oPpyze+dhuf7TOmNWYmEhUmz/P/wFVTq9kiQe4tX4K4=

[2024-11-13 14:21:25.345] conn_recv_data:365 <CONN_REMOTE_PEER_CLIENT 0x6424c7ee8110 11 from '172.31.40.87:42228'> recv eof rb 591 sb 46
[2024-11-13 14:21:25.345] core_close:412 close <CONN_REMOTE_PEER_CLIENT 0x6424c7ee8110 11 from '172.31.40.87:42228'> on event 00FF eof 1 done 1 rb 591 sb 46
[2024-11-13 14:21:25.345] dnode_client_unref_internal_try_put:51 unref <CONN_REMOTE_PEER_CLIENT 0x6424c7ee8110 -1 from '172.31.40.87:42228'> owner 0x6424c7e7e4e0 from pool 'dyn_o_mite'
[2024-11-13 14:21:25.345] conn_recv_data:365 <CONN_REMOTE_PEER_SERVER 0x6424c7eda1f0 10 to '172.31.40.87:7379:rack1:dc-a:12345678'> recv eof rb 0 sb 0
[2024-11-13 14:21:25.345] core_close:412 close <CONN_REMOTE_PEER_SERVER 0x6424c7eda1f0 10 to '172.31.40.87:7379:rack1:dc-a:12345678'> on event 00FF eof 1 done 1 rb 0 sb 0
[2024-11-13 14:21:25.345] dnode_peer_close:420 <CONN_REMOTE_PEER_SERVER 0x6424c7eda1f0 10 to '172.31.40.87:7379:rack1:dc-a:12345678'> Closing, Dropped 0 outqueue & 0 inqueue requests
[2024-11-13 14:21:25.345] event_del_conn:211 epoll ctl on e 5 sd 10 failed: No such file or directory
[2024-11-13 14:21:25.345] dnode_peer_unref:55 Marking <NODE 0x6424c7ecc5c0 172.31.40.87 dc-a rack1 secured:1> as down
[2024-11-13 14:21:26.346] conn_connect:283 <CONN_REMOTE_PEER_SERVER 0x6424c7eda1f0 10 to '172.31.40.87:7379:rack1:dc-a:12345678'> connecting.....
[2024-11-13 14:21:26.347] core_close:412 close <CONN_REMOTE_PEER_SERVER 0x6424c7eda1f0 10 to '172.31.40.87:7379:rack1:dc-a:12345678'> on event FFFFFF eof 0 done 0 rb 0 sb 0: Connection refused
[2024-11-13 14:21:26.347] dnode_peer_close:420 <CONN_REMOTE_PEER_SERVER 0x6424c7eda1f0 10 to '172.31.40.87:7379:rack1:dc-a:12345678'> Closing, Dropped 0 outqueue & 0 inqueue requests
[2024-11-13 14:21:26.347] event_del_conn:211 epoll ctl on e 5 sd 10 failed: No such file or directory
[2024-11-13 14:21:26.347] dnode_peer_unref:55 Marking <NODE 0x6424c7ecc5c0 172.31.40.87 dc-a rack1 secured:1> as down
^C[2024-11-13 14:21:27.216] signal_handler:123 signal 2 (SIGINT) received, exiting
