feat: local detuning validation for ahs #244
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

@@            Coverage Diff             @@
##               main     #244    +/-  ##
=========================================
  Coverage    100.00%  100.00%
=========================================
  Files            48       48
  Lines          3665     3702     +37
  Branches        878      888     +10
=========================================
+ Hits           3665     3702     +37
# If there is local detuning, the net value of detuning for each atom
# should not exceed a certain value
@root_validator(pre=True, skip_on_failure=True)
def net_detuning_must_not_exceed_max_net_detuning(cls, values):
Should we split this function? Also, let's discuss this further offline.
A few questions we discussed offline; we might need confirmation from the science team on these:
- Where do we cross-verify those schema values? I'm trying to understand what these detuning patterns and magnitudes mean.
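For context, here is a minimal sketch of how a local detuning pattern and magnitude combine with a global detuning into a per-atom net detuning. All values and names here are illustrative assumptions, not the actual schema fields:

```python
# Hypothetical example values; the real program schema defines these fields.
detuning_pattern = [0.0, 0.5, 1.0]  # per-atom weights, one entry per atom
shift_magnitude = 2.0e7             # local detuning magnitude (rad/s) at one time point
global_detuning = -1.0e7            # global detuning (rad/s) at the same time point

# Net detuning seen by atom k = global detuning + pattern weight * local magnitude
net_detuning = [global_detuning + h * shift_magnitude for h in detuning_pattern]
```

Under these assumed numbers, atoms 0, 1, and 2 see -1.0e7, 0.0, and 1.0e7 rad/s respectively, which is the quantity the validator compares against the maximum.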
if not len(local_detuning):
    return values
- If there is no local detuning, why are we returning `values`? Are we supposed to return an error or warning instead, given that this validator is specific to local detuning?
# Get the contributions from local detuning at the time point
for detuning_pattern, shift_coef in zip(detuning_patterns, shift_coefs):
    detuning_to_check += shift_coef[time_ind] * float(detuning_pattern[atom_index])
- The method name says it checks that the net value doesn't exceed the max detuning value, but here it is retrieving the contributions; I need a bit more context to understand why this is part of this method. And similarly for `# Merge the time points for different shifting terms and detuning term`.
    f"[{-capabilities.MAX_NET_DETUNING}, {capabilities.MAX_NET_DETUNING}]. "
    f"Numerical instabilities may occur during simulation."
)
return values
Do we require this return?
Ok, I did some tests, and yes, this return is necessary. The idea is that we want to raise a warning immediately once we find an atom whose net detuning is larger than the allowed value, and then stop the net-detuning validator. If we don't have this return, the unit test will fail.
An alternative approach would be to break out of the for loop, like we did in this validator.
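To make the two stopping styles concrete, here is a hedged sketch (the loop structure and threshold are simplified stand-ins for the validator's real code):

```python
import warnings

MAX_NET_DETUNING = 1.0e8  # assumed threshold, rad/s

def check_with_return(net_detunings):
    # Style 1: return immediately after warning about the first offending atom.
    for atom_index, detuning in enumerate(net_detunings):
        if abs(detuning) > MAX_NET_DETUNING:
            warnings.warn(f"Atom {atom_index} exceeds the detuning limit.")
            return  # stops the whole check; nothing after the loop runs

def check_with_break(net_detunings):
    # Style 2: break out of the loop; any code after the loop still runs.
    for atom_index, detuning in enumerate(net_detunings):
        if abs(detuning) > MAX_NET_DETUNING:
            warnings.warn(f"Atom {atom_index} exceeds the detuning limit.")
            break
```

Both styles issue at most one warning; the difference only matters when there is additional work after the loop (here, returning `values` from the validator).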
for time_ind, time in enumerate(time_points):

    # Get the contributions from all the global detunings
    # (there could be multiple global driving fields) at the time point
    values_global_detuning = sum(
        [detuning_coef[time_ind] for detuning_coef in detuning_coefs]
    )

    for atom_index in range(len(detuning_patterns[0])):
        # Get the contributions from local detuning at the time point
        values_local_detuning = sum(
            [
                shift_coef[time_ind] * float(detuning_pattern[atom_index])
                for detuning_pattern, shift_coef in zip(detuning_patterns, shift_coefs)
            ]
        )

        # The net detuning is the sum of both the global and local detunings
        detuning_to_check = np.real(values_local_detuning + values_global_detuning)

        # Issue a warning if the absolute value of the net detuning is
        # beyond MAX_NET_DETUNING
        if abs(detuning_to_check) > capabilities.MAX_NET_DETUNING:
            warnings.warn(
                f"Atom {atom_index} has net detuning {detuning_to_check} rad/s "
                f"at time {time} seconds, which is outside the typical range "
                f"[{-capabilities.MAX_NET_DETUNING}, {capabilities.MAX_NET_DETUNING}]. "
                f"Numerical instabilities may occur during simulation."
            )
            return values
It still seems like this could be a function of its own, but it should be fine.
Do you mean we have a function to check the validity of atom `i` at time point `j`, and then call that function in the for loop `for atom_index ...`?
…gram.py Co-authored-by: Viraj Chaudhari <[email protected]>
def _check_threshold(
    values,
    time_points,
    global_detuning_coefs,
    local_detuning_patterns,
    local_detuning_coefs,
    capabilities,
):
Missing docstring and type info, similar to the function above. Also, if this belongs in the helper file, let's move it there.
…gram.py Co-authored-by: Viraj Chaudhari <[email protected]>
@@ -36,3 +44,63 @@ def validate_value_range_with_warning(
            f"[{min_value}, {max_value}]. The values should be specified in SI units."
        )
        break  # Only one warning message will be issued


def validate_net_detuning_with_warning(
nit: Do we need this to be public?
No, but given that the previous function `validate_value_range_with_warning` is public, I figured that we want to be consistent here. Should I change both to private functions, or can I keep them as they are?
Sure, let's keep them as they are; we can change them to private if necessary.
Issue #, if available:
Description of changes:
Testing done:
Merge Checklist
Put an `x` in the boxes that apply. You can also fill these out after creating the PR. If you're unsure about any of them, don't hesitate to ask. We're here to help! This is simply a reminder of what we are going to look for before merging your pull request.

General
Tests
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.