[python-package] `verbose` does not suppress warnings with custom objectives #6014
Is this related to pycaret/pycaret#3660? I installed lightgbm via pip...
I've also encountered this issue, and after trying every solution out there I ended up working around it with a custom logger:

```python
import lightgbm as lgbm

class CustomLogger:
    ...
```
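Only the first lines of that snippet survive in this copy. A minimal sketch of what such a workaround could look like, assuming the intent is to register a no-op logger through `lightgbm.register_logger` (the class body and the registration call are a reconstruction, not the original code):

```python
class CustomLogger:
    """Logger-like object that swallows LightGBM's Python-side log output.

    NOTE: hypothetical reconstruction -- the original comment only preserves
    "import lightgbm as lgbm" and "class CustomLogger:", so everything
    below is an assumption about what the workaround did.
    """

    def info(self, msg: str) -> None:
        pass  # drop [Info] messages

    def warning(self, msg: str) -> None:
        pass  # drop [Warning] messages, including the ones verbose=-1 misses


# With LightGBM installed, you would route its logging through this object:
#
#   import lightgbm as lgbm
#   lgbm.register_logger(CustomLogger())
```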
Also encountered it when using ...
@GidonKR, thanks for the temporary solution!
From @GidonKR, here is some code that worked for me in Python 3.12.
Thanks for using LightGBM and for putting the effort into a minimal, reproducible example! I'm able to reproduce this with the latest version:

```python
import numpy as np
import pandas as pd
from lightgbm import LGBMRegressor

df = pd.DataFrame({"x": [0, 0, 1, 1, 1], "y": [0, 1, 0, 1, 1]})

def l2_obj(y_true, y_pred):
    grad = y_pred - y_true
    hess = np.ones_like(y_pred)
    return grad, hess

params = {
    "min_child_samples": 1,
    "n_estimators": 1,
    "n_jobs": 1,
    "verbose": -1
}

# custom objective: warnings are NOT suppressed
LGBMRegressor(
    **{**params, "objective": l2_obj}
).fit(df[["x"]], df["y"])
# [LightGBM] [Info] Using self-defined objective function
# [LightGBM] [Warning] No further splits with positive gain, best gain: -inf

# built-in objective: silent, as expected
LGBMRegressor(
    **{**params, "objective": "regression"}
).fit(df[["x"]], df["y"])
# (no logs)
```

This is definitely a bug. I'll investigate and hopefully put up a fix shortly.
And I'm very sorry for the delayed response. This project has a very small number of maintainers relative to its popularity, and 0 who work on LightGBM maintenance full-time. If anyone involved in this thread has feedback on what we could do to make it more likely that you'll investigate such issues and contribute fixes yourself in the future, we'd love to hear it here or over in #6350.
Looks like this behavior is not specific to the scikit-learn interface; the same thing happens with `lgb.train()`:

```python
import numpy as np
import pandas as pd
import lightgbm as lgb

df = pd.DataFrame({"x": [0, 0, 1, 1, 1], "y": [0, 1, 0, 1, 1]})

def l2_obj(y_pred, train_data):
    y_true = train_data.get_label()
    grad = y_pred - y_true
    hess = np.ones_like(y_pred)
    return grad, hess

params = {
    "min_child_samples": 1,
    "n_estimators": 1,
    "n_jobs": 1,
    "verbosity": -1
}

# custom objective: warnings leak through despite verbosity=-1
lgb.train(
    params={
        **params,
        "objective": l2_obj
    },
    train_set=lgb.Dataset(df[["x"]], label=df[["y"]])
)

# built-in objective: silent, as expected
lgb.train(
    params={
        **params,
        "objective": "regression"
    },
    train_set=lgb.Dataset(df[["x"]], label=df[["y"]])
)
```
I believe I've found the root cause. This call in `LightGBM/python-package/lightgbm/basic.py` (lines 4132 to 4133 at commit 1443548) calls the code at lines 2035 to 2040, which eventually calls the code at lines 42 to 55.
@jameslamb Did you forget to merge this before the 4.4.0 release?
Nope, #6428 wasn't ready in time for the 4.4.0 release. We will try to get it into the next one.
Description
In the Python package of LightGBM 4.0.0, setting the `verbose` parameter to `-1` does not suppress warnings if the objective is a user-defined function.

Reproducible example
Environment info
LightGBM version or commit hash: 4.0.0
Command(s) you used to install LightGBM