
Fix memory leak. (backport #828) #840

Merged 1 commit into galactic on Jan 4, 2022

Conversation

mergify[bot] (Contributor) commented Oct 25, 2021

This is an automatic backport of pull request #828 done by Mergify.


Mergify commands and options

More conditions and actions can be found in the documentation.

You can also trigger Mergify actions by commenting on this pull request:

  • @Mergifyio refresh will re-evaluate the rules
  • @Mergifyio rebase will rebase this PR on its base branch
  • @Mergifyio update will merge the base branch into this PR
  • @Mergifyio backport <destination> will backport this PR on <destination> branch

Additionally, on the Mergify dashboard you can:

  • look at your merge queues
  • generate the Mergify configuration with the config editor.

Finally, you can contact us at https://mergify.io/

* Fix memory leak.

Signed-off-by: Lei Liu <[email protected]>

Co-authored-by: Chen Lihui <[email protected]>
Co-authored-by: Abrar Rahman Protyasha <[email protected]>
(cherry picked from commit ac13665)
mergify[bot] mentioned this pull request on Oct 25, 2021
@fujitatomoya (Collaborator) commented

CC: @aprotyas

@cottsay (Member) left a comment

Thanks for backporting.

@aprotyas (Member) commented

CI (build/test: --packages-above-and-dependencies rclpy)

  • Linux Build Status
  • Linux-aarch64 Build Status
  • macOS Build Status
  • Windows Build Status

@aprotyas (Member) commented Oct 25, 2021

Apologies for the botched CI; I ran it against Rolling instead of Galactic. 🤦

CI (build: --packages-above-and-dependencies rclpy, test: --packages-above rclpy)

  • Linux Build Status
  • Linux-aarch64 Build Status
  • macOS Build Status
  • Windows Build Status

@clalancette (Contributor) commented

I think we are just waiting on the Galactic sync to go out; then we can merge this. Assigning to @cottsay to merge once the Galactic sync is done.

@aditya2592 commented

Hi, just checking if this can be merged? We have a ROS service that needs to run at a high rate, and memory fills up pretty fast on the client process. I wanted to check if this PR fixes that.
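A minimal sketch of the kind of high-rate client loop described above, assuming an example_interfaces/srv/AddTwoInts service named 'add_two_ints' (both are assumptions for illustration, not details from this thread):

```python
# Hypothetical reproducer: call a service in a tight loop and watch the
# client process's memory. Assumes an AddTwoInts server on 'add_two_ints'.
import rclpy
from rclpy.node import Node
from example_interfaces.srv import AddTwoInts


class HighRateClient(Node):
    def __init__(self):
        super().__init__('high_rate_client')
        self.cli = self.create_client(AddTwoInts, 'add_two_ints')
        while not self.cli.wait_for_service(timeout_sec=1.0):
            self.get_logger().info('waiting for service...')

    def call_once(self):
        request = AddTwoInts.Request()
        request.a = 1
        request.b = 2
        future = self.cli.call_async(request)
        rclpy.spin_until_future_complete(self, future)
        return future.result()


def main():
    rclpy.init()
    node = HighRateClient()
    try:
        while rclpy.ok():
            node.call_once()  # steady memory growth here would indicate a leak
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == '__main__':
    main()
```

Running something like this against a plain server while watching the process's resident memory should show whether memory grows per call.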

@cottsay merged commit 133bb07 into galactic on Jan 4, 2022
delete-merged-branch bot deleted the mergify/bp/galactic/pr-828 branch on January 4, 2022 19:13
@aditya2592 commented

After building from source with this PR merged, the memory leak still exists on Galactic. It's not present in a build from master. Are there any other PRs that need to be merged for this to work?

@fujitatomoya (Collaborator) commented

@aditya2592

the memory leak still exists on Galactic.

Do you mean that the memory leak is still present when using a source build of https://github.com/ros2/ros2/tree/galactic?

@aprotyas (Member) commented Jan 5, 2022

After building from source with this PR merged, the memory leak still exists on Galactic. It's not present in a build from master. Are there any other PRs that need to be merged for this to work?

@aditya2592 can you share a small example to check this?

I recorded a simple client's memory usage with this demo against rclpy built from source (galactic branch) in the latest ros:galactic Docker image and the trace in blue isn't indicative of a memory leak. Thoughts? For reference, here's a visual of the memory leak diagnosed in #822.

[test_client: plot of the client process's memory usage over time]
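A rough sketch, not the linked demo, of how such a memory trace might be recorded: issue service calls and periodically sample the process's resident set size. The helpers below are illustrative assumptions and Linux-specific (they read /proc/self/statm):

```python
# Illustrative, Linux-only memory sampler (an assumption, not the demo linked
# above): records the resident set size of this process between service calls.
import os
import time


def current_rss_mib():
    """Return this process's current resident set size in MiB."""
    page_size = os.sysconf('SC_PAGE_SIZE')
    with open('/proc/self/statm') as statm:
        rss_pages = int(statm.read().split()[1])
    return rss_pages * page_size / (1024 * 1024)


def record_memory(call_service, num_calls=10000, sample_every=100):
    """Invoke call_service() repeatedly, sampling RSS every sample_every calls."""
    samples = []
    for i in range(num_calls):
        call_service()
        if i % sample_every == 0:
            samples.append((time.monotonic(), current_rss_mib()))
    return samples  # plot these (time, MiB) pairs to get a trace like the one above
```

A flat trace suggests no leak; a steadily climbing one, like the plot referenced from #822, suggests a leak.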

@aditya2592 commented

@aprotyas I am trying to recreate what I am seeing as an example, but basically the leak now happens when multiple clients are requesting the same service and the QoS depth is > 1. The memory leak is not there when the client's QoS depth is 1. So this makes me think that it's related to multiple clients requesting a given service, and not necessarily the issue that was fixed in this PR.
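A hypothetical sketch of that scenario: several clients in one process calling the same service, each created with a QoS depth greater than 1. The service type and name (example_interfaces/srv/AddTwoInts on 'add_two_ints') and the client count are assumptions for illustration:

```python
# Hypothetical multi-client reproducer for the scenario described above.
# Each client uses a QoS profile with depth > 1.
import rclpy
from rclpy.node import Node
from rclpy.qos import QoSProfile
from example_interfaces.srv import AddTwoInts


def main():
    rclpy.init()
    node = Node('multi_client_test')
    qos = QoSProfile(depth=10)  # depth > 1, as in the report above
    clients = [
        node.create_client(AddTwoInts, 'add_two_ints', qos_profile=qos)
        for _ in range(4)  # arbitrary number of clients for illustration
    ]
    for client in clients:
        client.wait_for_service()
    try:
        while rclpy.ok():
            # Fire one request per client, then wait for all of them.
            futures = [client.call_async(AddTwoInts.Request()) for client in clients]
            for future in futures:
                rclpy.spin_until_future_complete(node, future)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == '__main__':
    main()
```

If memory grows steadily only in this configuration, and not with a single depth-1 client, that would support the observation above.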

@aprotyas (Member) commented

@aprotyas I am trying to recreate what I am seeing as an example, but basically the leak now happens when multiple clients are requesting the same service and the QoS depth is > 1. The memory leak is not there when the client's QoS depth is 1. So this makes me think that it's related to multiple clients requesting a given service, and not necessarily the issue that was fixed in this PR.

Do you mind sharing a simple example to reproduce?
