Synchronize velocity for diagnostics #1751

Open · wants to merge 35 commits into `ECP-WarpX:development` from `dpgrote:synchronize_velocity_for_diagnostics`

Commits (35)
c06bf14
Add routine to check if any diagnostics will be done
dpgrote Mar 1, 2021
5870994
Added synchronize_velocity_for_diagnostics input parameter
dpgrote Mar 1, 2021
136b056
Added documentation for synchronize_velocity_for_diagnostics
dpgrote Mar 1, 2021
a3ae45d
Merge remote-tracking branch 'ECPwarpx/development' into synchronize_…
dpgrote Apr 1, 2021
6a76009
Merge remote-tracking branch 'ECPwarpx/development' into synchronize_…
dpgrote May 1, 2021
c3e8f06
Fixed pp_warpx for synchronize_velocity_for_diagnostics
dpgrote May 3, 2021
459f074
Merge remote-tracking branch 'ECPwarpx/development' into synchronize_…
dpgrote Jun 11, 2021
07cf2f3
Bug fix, check cur_time (not cur_time + dt)
dpgrote Jun 11, 2021
0d2617a
Merge remote-tracking branch 'ECPwarpx/development' into synchronize_…
dpgrote Jun 21, 2021
3cf522f
Merge remote-tracking branch 'ECPwarpx/development' into synchronize_…
dpgrote Jun 28, 2021
c91b148
Merge remote-tracking branch 'ECPwarpx/development' into synchronize_…
dpgrote Jun 29, 2021
e17704b
Merge remote-tracking branch 'ECPwarpx/development' into synchronize_…
dpgrote Jul 16, 2021
a72aea1
Merge remote-tracking branch 'ECPwarpx/development' into synchronize_…
dpgrote Sep 8, 2021
89d5c21
For synchronize, only call fill boundary with EM
dpgrote Sep 8, 2021
2d15c04
Merge branch 'development' into synchronize_velocity_for_diagnostics
dpgrote Jan 17, 2023
2580217
Updates to match development
dpgrote Jan 17, 2023
61a8f77
Update comment
dpgrote Jan 17, 2023
e8a8142
Fix the check for the final step (to match the check for the final di…
dpgrote Jan 18, 2023
a3a823a
The synchronization is only done when the flag is true
dpgrote Jan 18, 2023
decd39d
Fix the step check in the original code
dpgrote Jan 18, 2023
be721cd
When the flag is true, do the synchronization at the end of the simul…
dpgrote Jan 19, 2023
48e6f92
Add warpx_synchronize_velocity flag to picmi interface
dpgrote Jan 19, 2023
ef50556
Merge branch 'development' into synchronize_velocity_for_diagnostics
dpgrote Mar 17, 2023
c4910b2
Merge branch 'development' into synchronize_velocity_for_diagnostics
dpgrote Mar 30, 2023
383ec52
Merge branch 'development' into synchronize_velocity_for_diagnostics
dpgrote Apr 3, 2023
54ce2d6
Merge `ECP-WarpX:development` into `dpgrote:synchronize_velocity_for_…
EZoni Jan 29, 2025
876f6b2
Fix clang-tidy error: method `DoDiags` can be made `const`
EZoni Jan 29, 2025
eba2e29
Fix clang-tidy error: function `DoDiags` should be marked `[[nodiscar…
EZoni Jan 29, 2025
5353a51
Merge `ECP-WarpX:development` into `dpgrote:synchronize_velocity_for_…
EZoni Feb 5, 2025
60df9dd
Fix double back quote in documentation
dpgrote Feb 12, 2025
7cd5327
Fix small error and clean up complicated if conditions
dpgrote Feb 12, 2025
97b3b1a
Add CI test
dpgrote Feb 12, 2025
7656bb4
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Feb 12, 2025
35221d6
Fix const
dpgrote Feb 12, 2025
35abff4
Clean up for the code scanning
dpgrote Feb 14, 2025
6 changes: 6 additions & 0 deletions Docs/source/usage/parameters.rst
@@ -2716,6 +2716,12 @@ WarpX has five types of diagnostics:
Similar to what is done for physical species, WarpX has a class Diagnostics that allows users to initialize different diagnostics, each of them with different fields, resolution and period.
This currently applies to standard diagnostics, but should be extended to back-transformed diagnostics and reduced diagnostics (and others) in the near future.

* ``warpx.synchronize_velocity_for_diagnostics`` (``0`` or ``1``, optional, default ``0``)
    Whether to synchronize the particle velocities with the particle positions in the diagnostics.
    In normal operation, WarpX uses the leap-frog algorithm to advance the particles, which leaves the positions and velocities unsynchronized at the end of each time step, with the velocities lagging a half step behind.
    When this option is turned on, whenever any diagnostics are to be calculated, the velocities are advanced a half step to
    synchronize with the positions before the diagnostics are generated.

.. _running-cpp-parameters-diagnostics-full:

Full Diagnostics
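To illustrate the staggering this parameter addresses, here is a minimal self-contained sketch (hypothetical demo code, not part of this PR; non-relativistic, with an assumed electron charge-to-mass ratio) of how a leap-frog velocity lags the position by a half step and how a final half kick synchronizes the two:

    q_over_m = -1.75882e11  # electron charge-to-mass ratio (C/kg), assumed value
    Ez = -1.0               # uniform electric field (V/m)
    dt = 1.0e-6             # time step (s)

    z, vz = 0.1, 0.0

    # Leap-frog setup: shift the velocity back a half step, so velocities live
    # at t = (n - 1/2)*dt while positions live at t = n*dt.
    vz -= q_over_m * Ez * dt / 2.0

    for _ in range(5):
        vz += q_over_m * Ez * dt  # kick: (n - 1/2) -> (n + 1/2)
        z += vz * dt              # drift: n -> n + 1

    # The velocity now lags the position by dt/2. The half kick below is what
    # warpx.synchronize_velocity_for_diagnostics performs before writing output.
    vz += q_over_m * Ez * dt / 2.0
    print(f"z = {z}, vz = {vz}")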
10 changes: 10 additions & 0 deletions Examples/Tests/single_particle/CMakeLists.txt
@@ -10,3 +10,13 @@ add_warpx_test(
"analysis_default_regression.py --path diags/diag1000001" # checksum
OFF # dependency
)

add_warpx_test(
test_1d_synchronize_velocity # name
1 # dims
1 # nprocs
inputs_test_1d_synchronize_velocity # inputs
"analysis_synchronize_velocity.py diags/diag1000005" # analysis
OFF # checksum
OFF # dependency
)
60 changes: 60 additions & 0 deletions Examples/Tests/single_particle/analysis_synchronize_velocity.py
@@ -0,0 +1,60 @@
#!/usr/bin/env python3

# Copyright 2025 David Grote
#
# This file is part of WarpX.
#
# License: BSD-3-Clause-LBNL

import sys

import numpy as np
import yt

# scipy.constants use CODATA2022
# from scipy.constants import e, m_e, c

# These are CODATA2018 values, as used in WarpX
e = 1.602176634e-19
m_e = 9.1093837015e-31
c = 299792458.0

# Integrate the test particle 5 timesteps, ending up with the position
# and velocity synchronized.
# In the simulation, with the synchronize_velocity_for_diagnostics flag set,
# the velocity will be synchronized at the end of step 5 when the diagnostics
# are written (even though that is not the final time step).

z = 0.1
uz = 0.0
Ez = -1.0
dt = 1.0e-6

# Half backward advance of velocity
uz -= -e / m_e * Ez * dt / 2.0

# Leap frog advance
for _ in range(5):
    uz += -e / m_e * Ez * dt
    g = np.sqrt((uz / c) ** 2 + 1.0)
    z += (uz / g) * dt

# Add half v advance to synchronize
uz += -e / m_e * Ez * dt / 2.0

filename = sys.argv[1]
ds = yt.load(filename)
ad = ds.all_data()
z_sim = ad["electron", "particle_position_x"]
uz_sim = ad["electron", "particle_momentum_z"] / m_e

print(f"Analysis Z = {z:18.16f}, Uz = {uz:18.10f}")
print(f"Simulation Z = {z_sim.v[0]:18.16f}, Uz = {uz_sim.v[0]:18.10f}")

tolerance_rel = 1.0e-15
error_rel = np.abs((uz - uz_sim.v[0]) / uz)

print("error_rel : " + str(error_rel))
print("tolerance_rel: " + str(tolerance_rel))

assert error_rel < tolerance_rel
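One detail worth noting in the arithmetic above: because the momentum kick is linear in the field, the initial backward half kick and the final synchronizing half kick cancel exactly, so the synchronized momentum after N steps equals N full kicks. A quick check of that identity (an illustration only, not part of the PR's script) would be:

    # Hypothetical extra assertion: with the uniform Ez above, the -dt/2 setup
    # kick and the +dt/2 synchronizing kick cancel, leaving exactly 5 full kicks.
    assert np.isclose(uz, 5 * (-e / m_e) * Ez * dt, rtol=1e-13)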
36 changes: 36 additions & 0 deletions Examples/Tests/single_particle/inputs_test_1d_synchronize_velocity
@@ -0,0 +1,36 @@
max_step = 8
amr.n_cell = 8
amr.max_level = 0
amr.blocking_factor = 8
amr.max_grid_size = 8
geometry.dims = 1
geometry.prob_lo = 0
geometry.prob_hi = 3

# Boundary condition
boundary.field_lo = pec
boundary.field_hi = pec

algo.maxwell_solver = none

warpx.const_dt = 1.e-6
warpx.synchronize_velocity_for_diagnostics = 1

# Order of particle shape factors
algo.particle_shape = 1

particles.species_names = electron
electron.species_type = electron
electron.injection_style = "SingleParticle"
electron.single_particle_pos = 0.0 0.0 0.1
electron.single_particle_u = 0.0 0.0 0.0
electron.single_particle_weight = 1.0

# Apply a uniform Ez
particles.E_ext_particle_init_style = constant
particles.E_external_particle = 0.0 0.0 -1.0

# Diagnostics
diagnostics.diags_names = diag1
diag1.intervals = 5
diag1.diag_type = Full
9 changes: 9 additions & 0 deletions Python/pywarpx/picmi.py
@@ -2816,6 +2816,11 @@ class Simulation(picmistandard.PICMI_Simulation):
warpx_checkpoint_signals: list of strings
    Signals on which to write out a checkpoint
warpx_synchronize_velocity: bool, default=False
    Whether the particle velocities are synchronized in time with
    the positions in the diagnostics. When False, the velocities are
    one half step behind the positions (except for the final diagnostic).
warpx_numprocs: list of ints (1 in 1D, 2 in 2D, 3 in 3D)
    Domain decomposition on the coarsest level.
    The domain will be chopped into the exact number of pieces in each dimension as specified by this parameter.
@@ -2935,6 +2940,8 @@ def init(self, kw):
        self.reduced_diags_separator = kw.pop("warpx_reduced_diags_separator", None)
        self.reduced_diags_precision = kw.pop("warpx_reduced_diags_precision", None)

        self.synchronize_velocity = kw.pop("warpx_synchronize_velocity", None)

        self.inputs_initialized = False
        self.warpx_initialized = False

@@ -2999,6 +3006,8 @@ def initialize_inputs(self):
        pywarpx.warpx.break_signals = self.break_signals
        pywarpx.warpx.checkpoint_signals = self.checkpoint_signals

        pywarpx.warpx.synchronize_velocity_for_diagnostics = self.synchronize_velocity

        pywarpx.warpx.numprocs = self.numprocs

        reduced_diags = pywarpx.warpx.get_bucket("reduced_diags")
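For context, a PICMI run script would opt in through the new keyword argument. A minimal hedged sketch (the grid, solver, and values here are illustrative assumptions, not taken from this PR):

    from pywarpx import picmi

    # Illustrative setup only; a real run would also add species and diagnostics.
    grid = picmi.Cartesian1DGrid(
        number_of_cells=[8],
        lower_bound=[0.0],
        upper_bound=[3.0],
        lower_boundary_conditions=["dirichlet"],
        upper_boundary_conditions=["dirichlet"],
    )

    sim = picmi.Simulation(
        solver=picmi.ElectromagneticSolver(grid=grid, cfl=0.9),
        max_steps=8,
        warpx_synchronize_velocity=True,  # the flag added by this PR
    )
    sim.step()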
2 changes: 2 additions & 0 deletions Source/Diagnostics/MultiDiagnostics.H
@@ -23,6 +23,8 @@ public:
    void ReadParameters ();
    /** \brief Loop over diags in alldiags and call their InitDiags */
    void InitData ();
    /** Check if any diagnostics will do compute and pack. */
    bool DoComputeAndPack (int step, bool force_flush=false);
    /** \brief Called at each iteration. Compute diags and flush. */
    void FilterComputePackFlush (int step, bool force_flush=false, bool BackTransform=false);
    /** \brief Called only at the last iteration. Loop over each diag and if m_dump_last_timestep
10 changes: 10 additions & 0 deletions Source/Diagnostics/MultiDiagnostics.cpp
@@ -77,6 +77,16 @@ MultiDiagnostics::ReadParameters ()
    }
}

bool
MultiDiagnostics::DoComputeAndPack (int step, bool force_flush)
{
    bool result = false;
    for( auto& diag : alldiags ){
        result = result || diag->DoComputeAndPack(step, force_flush);
    }
    return result;
}

void
MultiDiagnostics::FilterComputePackFlush (int step, bool force_flush, bool BackTransform)
{
3 changes: 3 additions & 0 deletions Source/Diagnostics/ReducedDiags/MultiReducedDiags.H
@@ -57,6 +57,9 @@ public:
     * @param[in] step current iteration time */
    void WriteToFile (int step);

    /** Check if any diagnostics will be done */
    bool DoDiags(int step);

    /** \brief Loop over all ReducedDiags and call their WriteCheckpointData
     * @param[in] dir checkpoint directory */
    void WriteCheckpointData (std::string const & dir);
12 changes: 12 additions & 0 deletions Source/Diagnostics/ReducedDiags/MultiReducedDiags.cpp
@@ -159,6 +159,18 @@ void MultiReducedDiags::WriteToFile (int step)
}
// end void MultiReducedDiags::WriteToFile

// Check if any diagnostics will be done
bool MultiReducedDiags::DoDiags(int step)
{
    bool result = false;
    for (int i_rd = 0; i_rd < static_cast<int>(m_rd_names.size()); ++i_rd)
    {
        result = result || m_multi_rd[i_rd]->DoDiags(step);
    }
    return result;
}
// end bool MultiReducedDiags::DoDiags

void MultiReducedDiags::WriteCheckpointData (std::string const & dir)
{
    // Only the I/O rank does
3 changes: 3 additions & 0 deletions Source/Diagnostics/ReducedDiags/ReducedDiags.H
@@ -97,6 +97,9 @@ public:
     */
    virtual void WriteToFile (int step) const;

    /** Check if the diagnostic should be done at this step
     * (step is the zero-based iteration index, hence the step+1 in the check) */
    [[nodiscard]] bool DoDiags(int step) const { return m_intervals.contains(step+1); }

    /**
     * \brief Write out checkpoint related data
     *
29 changes: 25 additions & 4 deletions Source/Evolve/WarpXEvolve.cpp
@@ -83,6 +83,13 @@ WarpX::Synchronize () {
    using ablastr::fields::Direction;
    using warpx::fields::FieldType;

+   // Note that this is potentially buggy since the PushP will do a field gather
+   // using particles that have been pushed but not yet checked at the boundaries.
+   // Also, this PushP may be inconsistent with the PushP backwards above since
+   // the fields may change between the two (mainly affecting the Python version
+   // when using electrostatics).
+   // When synchronize_velocity_for_diagnostics is true, the PushP at the end of the
+   // step is used instead, so that the correct behavior is obtained.
    FillBoundaryE(guard_cells.ng_FieldGather);
    FillBoundaryB(guard_cells.ng_FieldGather);
    if (fft_do_time_averaging)
@@ -235,9 +242,12 @@ WarpX::Evolve (int numsteps)
        }

        // TODO: move out
+       bool const end_of_step_loop = (step == numsteps_max - 1) || (cur_time + dt[0] >= stop_time - 1.e-3*dt[0]);
        if (evolve_scheme == EvolveScheme::Explicit) {
-           // At the end of last step, push p by 0.5*dt to synchronize
-           if (cur_time + dt[0] >= stop_time - 1.e-3*dt[0] || step == numsteps_max-1) {
+           // At the end of the step loop, push p by 0.5*dt to synchronize.
+           // This synchronization is not at the correct place, since it is done before the window
+           // is moved, before particles are scraped, and before the electrostatic field update.
+           if (end_of_step_loop && !synchronize_velocity_for_diagnostics) {
                Synchronize();
            }
        }
@@ -309,6 +319,15 @@
            ExecutePythonCallback("afterEsolve");
        }

+       bool const do_diagnostic = (multi_diags->DoComputeAndPack(step) || reduced_diags->DoDiags(step));
+       if (synchronize_velocity_for_diagnostics &&
+           (do_diagnostic || end_of_step_loop)) {
+           // When the diagnostics require synchronization, push p by 0.5*dt to synchronize.
+           // Note that this will be undone at the start of the next step by the
+           // half v-push backwards.
+           Synchronize();
+       }

        // afterstep callback runs with the updated global time. It is included
        // in the evolve timing.
        ExecutePythonCallback("afterstep");
@@ -353,8 +372,10 @@
    // This if statement is needed for PICMI, which allows the Evolve routine to be
    // called multiple times, otherwise diagnostics will be done at every call,
    // regardless of the diagnostic period parameter provided in the inputs.
-   if (istep[0] == max_step || (stop_time - 1.e-3*dt[0] <= cur_time && cur_time < stop_time + dt[0])
-       || m_exit_loop_due_to_interrupt_signal) {
+   bool const final_time_step = (istep[0] == max_step)
+       || (cur_time >= stop_time - 1.e-3*dt[0]
+           && cur_time < stop_time + dt[0]);
+   if (final_time_step || m_exit_loop_due_to_interrupt_signal) {
        multi_diags->FilterComputePackFlushLastTimestep( istep[0] );
        if (m_exit_loop_due_to_interrupt_signal) { ExecutePythonCallback("onbreaksignal"); }
    }
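Putting these hunks together, the intended ordering of the evolve loop can be mimicked with a self-contained toy model (uniform field, non-relativistic, assumed electron values; the names and structure are illustrative, not the actual C++). The backward half push at the start of each step undoes the previous step's synchronization, so the trajectory is unchanged; only the velocities reported at diagnostic steps differ:

    # Toy model of the loop with synchronize_velocity_for_diagnostics enabled.
    q_over_m, Ez, dt = -1.75882e11, -1.0, 1.0e-6  # assumed electron values
    diag_steps = {5}  # mimics diag1.intervals = 5 from the CI test
    max_step = 8

    z, v = 0.1, 0.0
    v -= q_over_m * Ez * dt / 2.0  # leap-frog setup: v starts at t = -dt/2
    synchronized = False

    for step in range(max_step):
        if synchronized:
            v -= q_over_m * Ez * dt / 2.0  # half v-push backwards, undoing the sync
            synchronized = False
        v += q_over_m * Ez * dt            # full kick: v -> (step + 1/2)
        z += v * dt                        # drift: z -> (step + 1)
        end_of_step_loop = (step == max_step - 1)
        do_diagnostic = (step + 1) in diag_steps
        if do_diagnostic or end_of_step_loop:
            v += q_over_m * Ez * dt / 2.0  # Synchronize(): v catches up to z
            synchronized = True
            print(f"diagnostic at step {step + 1}: z = {z:.6e}, v = {v:.6e}")

Because this toy push is non-relativistic, its positions differ slightly from the relativistic push in the CI test's analysis script, but the ordering of the half pushes is the point here.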
2 changes: 2 additions & 0 deletions Source/WarpX.H
@@ -1552,6 +1552,8 @@ private:
    amrex::VisMF::Header::Version plotfile_headerversion = amrex::VisMF::Header::Version_v1;
    amrex::VisMF::Header::Version slice_plotfile_headerversion = amrex::VisMF::Header::Version_v1;

    bool synchronize_velocity_for_diagnostics = false;

    bool use_single_read = true;
    bool use_single_write = true;
    int mffile_nstreams = 4;
Expand Down
2 changes: 2 additions & 0 deletions Source/WarpX.cpp
@@ -941,6 +941,8 @@ WarpX::ReadParameters ()
"J-damping can only be done when PML are inside simulation domain (do_pml_in_domain=1)"
);

pp_warpx.query("synchronize_velocity_for_diagnostics", synchronize_velocity_for_diagnostics);

{
// Parameters below control all plotfile diagnostics
bool plotfile_min_max = true;