
Hdf5 laser profile #9

Open: Flamefire wants to merge 21 commits into master
Conversation

@Flamefire (Contributor) commented Nov 2, 2016

This adds loading of the laser profile from openPMD HDF5 files, together with tests that check the implementation.
Interpolation in both the time and the space domain is supported, so the HDF5 cell sizes and timesteps do not need to match the simulation's.
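
As an illustration of what interpolation in the time domain means here, a minimal numpy sketch (the function name and signature are made up for illustration, this is not the PR's actual API):

import numpy as np

def interpolate_profile_in_time(profile_a, profile_b, t_a, t_b, t_sim):
    # Linearly blend two 2D profile slices recorded at t_a and t_b
    # to obtain the profile at simulation time t_sim (t_a <= t_sim <= t_b).
    w = (t_sim - t_a) / (t_b - t_a)  # interpolation weight in [0, 1]
    return (1.0 - w) * profile_a + w * profile_b

# Example: the HDF5 timestep is twice the simulation timestep
a = np.zeros((4, 4))
b = np.ones((4, 4))
mid = interpolate_profile_in_time(a, b, t_a=0.0, t_b=2.0, t_sim=1.0)
assert np.allclose(mid, 0.5)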

Current state:
Tests are failing (wrong values).
There is some confusion with the axes and the data layout. The problem: the laser profile is a 2D dataset, but the simulation is 3D. What I wanted to do is map the simulation z-axis to the laser x-axis and the simulation y-axis to the laser y-axis. But the HDF5 metadata is ordered the other way round from the simulation metadata, so the fastest-varying dimension comes last (axis labels), and I'm not completely sure about the order of the other data (sizes, offset, ...).
What is added is a switch that maps the HDF5 axes to the simulation axes, defaulting to X->Z, Y->Y, plus some logic to handle that. But currently this is buggy, because swapping the data means the metadata must not be swapped as well, since it already needs to be swapped anyway.

So, bottom line: someone with a good understanding of the data order needs to check all the reading of the data, and probably also the test-data creation (Python script).
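
For reference, a hedged sketch of how the dataset layout could be checked against the openPMD axisLabels attribute (the file name, record path, and desired mapping below are assumptions for illustration, not the PR's actual code):

import numpy as np
import h5py

with h5py.File("laserProfile_0.h5", "r") as f:  # assumed file name
    nph = f["data/0/meshes/Nph/x"]              # assumed record path
    # axisLabels is stored in the same (C) order as the dataset dimensions,
    # i.e. the fastest-varying axis comes last.
    labels = [l.decode() if isinstance(l, bytes) else l
              for l in nph.attrs["axisLabels"]]
    data = np.array(nph)

# If the file stores (laser y, laser x) but the reader expects dim0 = laser x,
# transpose the data once; the metadata (sizes, offsets) must then be taken
# in file order without a second swap.
if labels == ["y", "x"]:
    data = data.T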

Some changes were made to the HDF5 field interpolator and its tests, so test failures may occur.

Test: the field interpolator and the laser profile are tested by creating HDF5 datasets in Python in the pre-run commands. These are read by the simulation, and a post-run script checks the checkpoints to verify that the number of photons matches the expected one. To make the interpolation testable, the photon count is generated by a linear function in x, y, t with a different step size per variable, so that swapped axes etc. become detectable. There are also some variables (https://github.com/ComputationalRadiationPhysics/parataxis/pull/9/files#diff-617b9e17b3a7623178d7f249698fe73eR146) that can enlarge the HDF5 sizes (cells and timesteps) by a factor to actually exercise the interpolation. To avoid rounding errors one should use powers of 2; my initial example used xRatio=yRatio=4, tRatio=8.
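
For illustration, a minimal sketch of a test dataset of this shape (the coefficients, sizes, file name, and record path are made up; the PR's actual generator is the Python pre-run script):

import numpy as np
import h5py

nx, ny, nt = 32, 64, 8
x = np.arange(nx)[:, None]
y = np.arange(ny)[None, :]

with h5py.File("testProfile.h5", "w") as f:
    for t in range(nt):
        # Different step per variable: a swapped axis or a wrong timestep
        # changes the expected photon counts in an easily detectable way.
        nph = 1.0 * x + 100.0 * y + 10000.0 * t
        f.create_dataset("data/%d/meshes/Nph/x" % t, data=nph)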

Simple command to run the tests locally on 1 GPU (less than 1 minute runtime):

TBG_TPLFILE=submit/bash/bash_mpirun.tpl TBG_SUBMIT=bash buildSystem/runTests.py -e examples/hdf5Test/ -o ~/xrtTests/ -t HDF5FieldInterpolator

@Flamefire (Contributor, Author) commented Nov 7, 2016

Some useful Python code to test this visually:

import numpy as np
import h5py

import matplotlib.pyplot as plt

# Jupyter/IPython magic; remove when running as a plain script
%matplotlib inline

cellSize = np.array([4e-9, 4e-9, 4e-9])  # cell size in m (x, y, z)
simSize = np.array([32, 64, 96])         # simulation grid size in cells (x, y, z)

checkpointPath = "/home/grund59/xrtTests/output/HDF5Test_HDF5FieldInterpolator/simOutput" + "/checkpoints"
for timestep in [30]:
    hdf5Path = checkpointPath + "/hdf5_checkpoint_" + str(timestep) + ".h5"

    with h5py.File(hdf5Path, "r") as f:  # open read-only
        data = list(f["data"].values())[0]
        curTime = data.attrs["time"] * data.attrs["timeUnitSI"]
        #npt.assert_almost_equal(curTime, dt * timestep)

        hdf5Photons = data["particles/p"]

        photonWeighting = hdf5Photons["weighting"]
        constWeighting = photonWeighting.attrs.get("value")

        # Probably cell index
        photonPosOffset = hdf5Photons["positionOffset"]
        offsetUnit = np.array([
                        photonPosOffset["x"].attrs["unitSI"],
                        photonPosOffset["y"].attrs["unitSI"],
                        photonPosOffset["z"].attrs["unitSI"]
                    ])
        # Make it an array of positions (each row has x, y, z)
        photonPosOffset = np.transpose([
                            np.array(photonPosOffset["x"]),
                            np.array(photonPosOffset["y"]),
                            np.array(photonPosOffset["z"])
                          ])

        # Probably incell position
        photonPos = hdf5Photons["position"]
        posUnit = np.array([
                        photonPos["x"].attrs["unitSI"],
                        photonPos["y"].attrs["unitSI"],
                        photonPos["z"].attrs["unitSI"]
                  ])
        # Make it an array of positions (each row has x,y,z)
        photonPos = np.transpose([np.array(photonPos["x"]), np.array(photonPos["y"]), np.array(photonPos["z"])])

        # Combine to full positions in cells
        photonPos = photonPosOffset * (offsetUnit / cellSize) + photonPos * (posUnit / cellSize)

        # Use only photons just spawned and moved once
        photonMask = photonPos[:, 0] <= 1
        photonPosInFirstSlice = photonPos[photonMask][:,1:3].astype(int)
        print(photonPosInFirstSlice.max(), photonPosInFirstSlice[:, 1].max())  # overall max and max along z
        if constWeighting:
            weightings = np.full(len(photonPosInFirstSlice), constWeighting)
        else:
            weightings = np.array(photonWeighting)[photonMask]

        # Accumulate number of photons/cell
        numPhotonsPerCell, edgesX, edgesY = np.histogram2d(photonPosInFirstSlice[:,0], photonPosInFirstSlice[:,1],
                                                           weights=weightings, bins=simSize[1:3])
        numPhotonsPerCell = np.round(numPhotonsPerCell).astype(int)
        print(0, 2, numPhotonsPerCell[0, 2])
        plt.figure(figsize=(8,8))
        plt.imshow(numPhotonsPerCell.T, origin="lower", interpolation = "nearest", extent = [0, simSize[2], 0, simSize[1]])
        plt.colorbar()

@ax3l (Member) commented Nov 28, 2016

@n01r you take this over?

@n01r (Member) commented Nov 28, 2016

Well, yeah, I hope it won't take me too long to understand and fix it ^^;

@ax3l (Member) commented Jan 25, 2017

Cross-linking PaNOSC-ViNYL/SimEx#84 here.

@n01r (Member) commented Mar 28, 2017

I tried out @Flamefire's Python code and found that I do not see what I would expect from just having a look at the HDF5 output with HDFView. In the checkpoint files I can see that even though the grid is (x, y, z) = (32, 64, 96) cells, the photons get initialized in a (x=0, y=0..63, z=0..63) plane.

[figure: laser_init_checkpoint_alex]

As one can clearly see, there is now photon-free space for z > 63.
I chose to try out openPMD-viewer for this and compared the laserProfile_<iteration>.h5 files created by the Python script in the test to the photons from the hdf5_checkpoint_<time step>.h5 data created in the simulation output.

Test input:
[figure: laser_init_checkpoint]

Test output:
[figure: laser_profile]

The scaling is unfortunate, but we can see the switch of axes which cuts off part of the photon initialization.
The photon initialization files are also not completely openPMD-conformant, since the number of photons to be initialized, Nph, is saved like a vector field even though it is scalar data.

@ax3l (Member) commented Mar 31, 2017

The photon initialization files are also not completely openPMD-conformant, since the number of photons to be initialized, Nph, is saved like a vector field even though it is scalar data.

As discussed offline a few days ago: that is no problem. All openPMD records are in principle tensor records, without an implicit meaning that a component has to represent a spatial component. This specific record is the density component of each polarization direction, under the assumption that in a certain region of space, for a certain time, only one polarization exists. In reality, the XFEL data sets you get should all be linearly polarized, making the 2nd component of this record superfluous for now.
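
For illustration, a hedged h5py sketch of the two layouts under discussion (the file name, paths, and sizes are made up): a scalar openPMD record is a single dataset at the record path, while a record with components is a group with one dataset per component.

import numpy as np
import h5py

with h5py.File("layout_demo.h5", "w") as f:
    # Scalar record: the record itself is one dataset.
    f.create_dataset("data/0/meshes/Nph_scalar", data=np.zeros((64, 96)))

    # Record with components, as the test files currently store Nph:
    # the record is a group with one dataset per component
    # (here: one per polarization direction).
    g = f.create_group("data/0/meshes/Nph")
    g.create_dataset("x", data=np.zeros((64, 96)))
    g.create_dataset("y", data=np.zeros((64, 96)))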

@n01r (Member) commented Mar 31, 2017

@ax3l Yep, thanks. Forgot to add our offline discussion.
