
Synchronize velocity for diagnostics #1751

Open · wants to merge 34 commits into base: development
Conversation

@dpgrote (Member) commented Mar 2, 2021

This PR allows the particle velocities and positions to be synchronized in time when generating diagnostics. Without this option, the particle velocities lag behind the positions by half a time step. The PR adds the boolean input parameter warpx.synchronize_velocity_for_diagnostics (defaulting to false) to turn this option on.

There are several pieces to this PR:

  • Changes to MultiDiagnostic and MultiReducedDiags, adding routines that check whether any diagnostics will be done
  • Adds a call to PushP just before the diagnostics are done (so that the updated fields from the electrostatic calculation are used)
  • Adds the appropriate documentation

When synchronization is requested, Evolve advances the velocity a half step just before the diagnostics and sets is_synchronized=true. Then, at the start of the next step, if is_synchronized is true, it pushes the velocities back a half step so they are ready for the full leapfrog advance.
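The scheme above can be sketched for a single particle in a constant field. This is an illustrative standalone demo, not WarpX code: `push_v_half` stands in for WarpX's PushP, and all values and names here are made up for the example.

```python
q_over_m = -1.0   # illustrative charge-to-mass ratio
E = 2.0           # illustrative constant electric field
dt = 0.1

def push_v_half(v, sign):
    """Half-step velocity push, a stand-in for WarpX's PushP."""
    return v + sign * q_over_m * E * dt / 2.0

def run(n_steps, diag_every):
    """Leapfrog advance; diag_every=0 means never synchronize for output."""
    x, v = 0.0, 0.0
    is_synchronized = True          # x and v start at the same time level
    for step in range(1, n_steps + 1):
        if is_synchronized:
            v = push_v_half(v, -1)  # undo the diagnostic half push
            is_synchronized = False
        v += q_over_m * E * dt      # full leapfrog velocity push
        x += v * dt                 # full leapfrog position push
        if diag_every and step % diag_every == 0:
            v = push_v_half(v, +1)  # synchronize v with x for output
            is_synchronized = True
    return x, v
```

Because the backward half push at the start of a step undoes the forward half push from the previous step, the trajectory with synchronization enabled matches the plain leapfrog trajectory to round-off, while the recorded velocity is at the same time level as the position.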

Comments:

  • Is the documentation in the correct place in parameters.rst?
  • The reduced diagnostics could perhaps use the new DoDiags method instead of accessing m_intervals in its ComputeDiags.
  • This PR leaves the original PushP unchanged, even though it is possibly buggy. That PushP fetches the fields but uses the particle positions before the particle boundary conditions have been applied, leading to a possible out-of-bounds reference. Also, that PushP may not be consistent with the backwards PushP since the fields may have changed. Comments are added to the code to note this potential problem. I avoided changing this since it breaks many CI tests.
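The DoDiags-style query mentioned in the bullets could look roughly like the sketch below. This is a minimal analogue using a plain output period; the real WarpX intervals syntax is richer, and these class shapes are illustrative, not the actual MultiDiagnostic/MultiReducedDiags API.

```python
class Diagnostic:
    """One diagnostic with a plain output period (illustrative only)."""
    def __init__(self, period):
        self.period = period

    def do_diags(self, step):
        # Output is due when the step is a multiple of the period.
        return self.period > 0 and step % self.period == 0

class MultiDiagnostic:
    """Answers whether *any* registered diagnostic runs at this step."""
    def __init__(self, diags):
        self.diags = diags

    def do_diags(self, step):
        return any(d.do_diags(step) for d in self.diags)

multi = MultiDiagnostic([Diagnostic(10), Diagnostic(25)])
```

A query like `multi.do_diags(step)` is what lets Evolve decide up front whether the synchronizing half push is needed before any output is written.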

@dpgrote added the "component: diagnostics" label Mar 2, 2021
@RemiLehe RemiLehe self-requested a review March 2, 2021 01:20
@RemiLehe RemiLehe self-assigned this Mar 2, 2021
(step == numsteps_max-1) ||
(synchronize_velocity_for_diagnostics &&
(multi_diags->DoComputeAndPack(step) ||
reduced_diags->DoDiags(step)))) {
Member
The above if condition looks like it should not modify the default behavior of the code (which is probably what we want). However, it seems that a lot of automated tests are failing because of checksum differences. @dpgrote Do you understand why?

Member

Also, could you explain why this block of code was moved here?

Member Author

Hi @RemiLehe - I moved the block of code to the end of the loop so that the state of the data when the half v push is done is the same as it will be at the start of the next step, when the half push is undone. Ideally, the two pushes should exactly cancel each other. This is particularly important when running with the electrostatic solver, where the push needs to be done after the new fields are computed. It seems that this block should be here anyway, since the field gather done in the PushP should happen after the particle boundary conditions are applied.

This would explain why some of the CI tests are failing: something changes between where this block was and where it is now. I'll look into it to understand what is happening.
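The cancellation argument above can be illustrated numerically. This is a standalone sketch, not WarpX code: `half_push` stands in for PushP, and the dyadic values (0.25, 0.5) are chosen only so the floating-point arithmetic is exact.

```python
q_over_m, dt = -1.0, 0.25   # dyadic values so the float arithmetic is exact

def half_push(v, E, sign):
    """Half-step velocity push using field E (a PushP stand-in)."""
    return v + sign * q_over_m * E * dt / 2.0

v0 = 0.5
E_old, E_new = 2.0, 3.0     # field before/after an electrostatic solve

# Same field for the forward and backward half pushes: exact cancellation.
v_cancelled = half_push(half_push(v0, E_old, +1), E_old, -1)

# Field updated in between: a spurious residual kick is left behind.
v_residual = half_push(half_push(v0, E_old, +1), E_new, -1)
```

When the field is refreshed between the forward push (end of one step) and the backward push (start of the next), the two half pushes no longer cancel, which is why the ordering relative to the field solve and the particle boundary conditions matters.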

@RemiLehe (Member) commented Mar 8, 2021

Thanks for answering the above question.
It looks like a few additional checksums need to be updated; after that, the PR should be ready to be merged.

@ax3l (Member) commented Mar 29, 2021

@dpgrote just a little ping since there is a slight merge conflict in this PR that needs to be resolved now.

Note that the parameters.rst file has moved due to #1821.

@dpgrote (Member Author) commented Mar 29, 2021

Thanks @ax3l . I need to get back to this one and finish it. There are some CI differences that I need to make sure I understand.

@RemiLehe (Member):

@dpgrote Should we change this to [WIP] while the CI issues are being investigated?

@dpgrote changed the title from "Synchronize velocity for diagnostics" to "[WIP] Synchronize velocity for diagnostics" May 24, 2021
@dpgrote force-pushed the synchronize_velocity_for_diagnostics branch from 0fe6cbe to a3a823a on January 18, 2023 18:41
@@ -217,8 +217,16 @@ WarpX::Evolve (int numsteps)
// B : guard cells are NOT up-to-date
}

-        if (cur_time + dt[0] >= stop_time - 1.e-3*dt[0] || step == numsteps_max-1) {
+        if ((cur_time + dt[0] >= stop_time - 1.e-3*dt[0] || step == numsteps_max-1) &&
Member

This is a bit long. Maybe a temporary bool could simplify the logic in the if?

Member Author

Thanks for the comments! Do you have a suggestion of how to simplify this?

// At the end of last step, push p by 0.5*dt to synchronize
// Note that this is potentially buggy since the PushP will do a field gather
Member

Do we need a couple of input option asserts to avoid potentially buggy behavior?

Member Author

The buggy behavior regarding particle bounds does not depend on input options; it will always happen. As for the other issue (Python with electrostatic), that should be fixed by removing this block of code and using the one below instead. I can do that in a separate PR, because it will break a number of CI tests.

@EZoni (Member) commented Jan 15, 2025

@dpgrote

Hi Dave, do you think this is a change that we would still like to merge? I think so; I see it was mentioned in another PR last year. If it is still relevant, do you think it is possible to merge development and fix the merge conflicts, or would it be easier to start from scratch with a new PR?

@dpgrote (Member Author) commented Jan 15, 2025

@EZoni I would still like to have this PR merged. I can merge in development and see how bad the conflicts are.

@EZoni EZoni self-requested a review January 15, 2025 20:05
@EZoni EZoni self-assigned this Jan 15, 2025
@EZoni (Member) commented Jan 29, 2025

@dpgrote

I took the liberty of merging development. Please feel free to double-check that I did not miss anything while fixing the merge conflicts, whenever you have time. Let me know if the new code changes look correct to you and consistent with the original implementation.

@dpgrote (Member Author) commented Jan 29, 2025


Thanks. The changes look correct, following the same pattern.

@EZoni (Member) commented Jan 30, 2025

Thanks for checking, @dpgrote. Note that HIP is failing independently of this PR (due to a new ROCm release) and is expected to resolve itself in a few days.

@EZoni (Member) commented Feb 5, 2025

@dpgrote

I saw that the HIP checks were finally fixed, but there was another minor merge conflict, so I merged development again and fixed it. Hopefully this is ready for review/merge now.

Comment on lines 247 to 248
if ((cur_time + dt[0] >= stop_time - 1.e-3*dt[0] || step == numsteps_max-1) &&
!synchronize_velocity_for_diagnostics) {
@ax3l (Member) commented Feb 11, 2025

Can you please add a helper function that returns true/false for the last-step query in the first part of this if, and then use it to simplify the if?

Also, you can reuse that logic for the if below.

That makes the ifs look like:

if (end_of_last_step && !synchronize_velocity_for_diagnostics)
  // existing if
...

if (end_of_last_step && (other_stuff || synchronize_velocity_for_diagnostics))
  // extra if below that was added
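A sketch of that shape with the compound conditions pulled out into named booleans follows. Python is used here purely for illustration (the real code is C++ inside WarpX::Evolve), and the names `plan_push`, `do_final_push`, and `do_sync_push` are made up for the example.

```python
def plan_push(step, cur_time, dt, stop_time, numsteps_max,
              synchronize_velocity_for_diagnostics, diags_done_now):
    """Decide which end-of-step velocity pushes are needed."""
    end_of_last_step = (cur_time + dt >= stop_time - 1.e-3 * dt
                        or step == numsteps_max - 1)
    # Existing if: final synchronizing PushP, skipped when the new
    # option handles synchronization instead.
    do_final_push = (end_of_last_step
                     and not synchronize_velocity_for_diagnostics)
    # Added if: synchronize before any diagnostics (or at the very end).
    do_sync_push = (synchronize_velocity_for_diagnostics
                    and (diags_done_now or end_of_last_step))
    return do_final_push, do_sync_push
```

Naming `end_of_last_step` once and reusing it in both branches keeps the two ifs short and makes it clear that they are mutually exclusive when the option is off.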

Member Author

Is this better?

@@ -309,6 +317,15 @@ WarpX::Evolve (int numsteps)
ExecutePythonCallback("afterEsolve");
}

if (synchronize_velocity_for_diagnostics &&
(multi_diags->DoComputeAndPack(step) || reduced_diags->DoDiags(step) ||
Member

Please see the suggestion above. Please also assign the logic behind multi_diags->DoComputeAndPack(step) || reduced_diags->DoDiags(step) to a named boolean, because otherwise this if gets a bit involved to read.

Member Author

Better?

uz -= -e / m_e * Ez * dt / 2.0

# Leap frog advance
for i in range(5):

Check notice (Code scanning / CodeQL): Unused global variable. The global variable 'i' is not used.
Member

To fix CodeQL's issue:

Suggested change:
- for i in range(5):
+ for _ in range(5):
