Add LogScaler transformer #932
base: main
Conversation
Task linked: CU-86b3gbr2k RDT - Add
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files

@@           Coverage Diff            @@
##              main      #932   +/-  ##
=========================================
  Coverage   100.00%   100.00%
=========================================
  Files           18        18
  Lines         2233      2274   +41
=========================================
+ Hits          2233      2274   +41

Flags with carried forward coverage won't be shown. ☔ View full report in Codecov by Sentry.
53d70fe to f1dbfa5
Looks great! I only left some minor comments.
            value was missing. Then use it to recreate missing values.
        * ``None``: Do nothing with the missing values on the reverse transform. Simply
            pass whatever data we get through.
    constant (float):
"Default to 0" -> "Defaults to 0".
Also, either add the `` quotation marks around the 0, False, True values here, or remove them from the other values in the docstring, so it's conistent.
|
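As a hedged illustration of the consistent form (the parameter descriptions below are placeholders, only the ``...`` markup around the default values is the point):

    constant (float):
        Placeholder description of the constant argument. Defaults to ``0``.
    invert (bool):
        Placeholder description of the invert argument. Defaults to ``False``.
    learn_rounding_scheme (bool):
        Placeholder description of the rounding behaviour. Defaults to ``False``.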
class TestLogScaler:
    def test___init__super_attrs(self):
        """super() arguments are properly passed and set as attributes."""
Missing "Test" at the start
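i.e. something along the lines of:

        """Test that super() arguments are properly passed and set as attributes."""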
        data = super()._transform(data)

        if data.ndim > 1:
            self._validate_data(data[:, 0])
I would create a helper function for lines 711-715 like you did for _fit, so we don't repeat the code.
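A minimal sketch of such a helper, assuming it lives on the transformer class (the name _validate_columns is made up here; _validate_data is the method already called in the diff above):

    def _validate_columns(self, data):
        # Hypothetical helper: validate the first column of 2-D output, or the
        # 1-D array directly, so _transform and _reverse_transform do not
        # repeat the ndim check.
        if data.ndim > 1:
            self._validate_data(data[:, 0])
        else:
            self._validate_data(data)

    def _transform(self, data):
        data = super()._transform(data)
        self._validate_columns(data)
        # ... continue with the log transform as in the PR ...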
        data = data.to_numpy()

        if data.ndim > 1:
            if self.invert:
Same here.
        invert: bool = False,
        learn_rounding_scheme: bool = False,
    ):
        self.constant = constant
Could we add validation to the constant/invert args?
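A hedged sketch of what that could look like; only the arguments visible in the diff are shown, and the error type, messages, and the way learn_rounding_scheme is forwarded to super().__init__() are assumptions, not the PR's actual code:

    def __init__(
        self,
        constant: float = 0.0,
        invert: bool = False,
        learn_rounding_scheme: bool = False,
    ):
        # Illustrative validation only; the real code may raise a different error type.
        if not isinstance(constant, (int, float)) or isinstance(constant, bool):
            raise ValueError('The constant parameter must be an int or a float.')
        if not isinstance(invert, bool):
            raise ValueError('The invert parameter must be a bool.')

        self.constant = constant
        self.invert = invert
        super().__init__(learn_rounding_scheme=learn_rounding_scheme)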
class TestLogScaler:
    def test_learn_rounding(self):
        # Setup
Add short docstrings to these tests.
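e.g. a one-line summary per test, inferred from the test name, such as:

    def test_learn_rounding(self):
        """Test that learn_rounding_scheme is applied on the reverse transform."""
        # Setup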
Implements #930
CU-86b3gbr2k
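For reviewers trying the branch out locally, a minimal usage sketch based only on the parameters visible in the diff; the import path and the fit / transform / reverse_transform calls assume the transformer follows the existing numerical-transformer API and are not confirmed by this PR:

    import pandas as pd

    from rdt.transformers.numerical import LogScaler  # assumed import path

    data = pd.DataFrame({'amount': [1.5, 2.5, 10.0, 250.0]})

    # Parameters taken from the diff fragments above; defaults assumed.
    transformer = LogScaler(constant=0.0, invert=False, learn_rounding_scheme=False)
    transformer.fit(data, 'amount')

    transformed = transformer.transform(data)               # log-scaled representation
    recovered = transformer.reverse_transform(transformed)  # back to the original scale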