include parameters from reference dataset on subset (fixes #5402) #5416

Merged · 3 commits · Aug 28, 2022
Changes from 2 commits
5 changes: 5 additions & 0 deletions src/io/dataset.cpp
@@ -740,6 +740,11 @@ void Dataset::CopyFeatureMapperFrom(const Dataset* dataset) {
   group_feature_cnt_ = dataset->group_feature_cnt_;
   forced_bin_bounds_ = dataset->forced_bin_bounds_;
   feature_need_push_zeros_ = dataset->feature_need_push_zeros_;
+  max_bin_ = dataset->max_bin_;
+  min_data_in_bin_ = dataset->min_data_in_bin_;
+  bin_construct_sample_cnt_ = dataset->bin_construct_sample_cnt_;
+  use_missing_ = dataset->use_missing_;
+  zero_as_missing_ = dataset->zero_as_missing_;
 }

void Dataset::CreateValid(const Dataset* dataset) {
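The five assignments added in `Dataset::CopyFeatureMapperFrom` copy the dataset-construction parameters from the reference `Dataset` into the copy, so a subset no longer falls back to defaults. A minimal toy sketch of the behavior (plain Python; `ToyDataset` and its defaults are illustrative, not LightGBM's actual API):

```python
# Toy sketch (hypothetical class, not LightGBM's API) of what the new
# lines in Dataset::CopyFeatureMapperFrom accomplish: a subset copies
# binning parameters from its reference dataset instead of silently
# reverting to defaults.

class ToyDataset:
    def __init__(self, max_bin=255, min_data_in_bin=3,
                 bin_construct_sample_cnt=200000,
                 use_missing=True, zero_as_missing=False):
        self.max_bin = max_bin
        self.min_data_in_bin = min_data_in_bin
        self.bin_construct_sample_cnt = bin_construct_sample_cnt
        self.use_missing = use_missing
        self.zero_as_missing = zero_as_missing

    def copy_feature_mapper_from(self, other):
        # Mirrors the five assignments added in this PR.
        self.max_bin = other.max_bin
        self.min_data_in_bin = other.min_data_in_bin
        self.bin_construct_sample_cnt = other.bin_construct_sample_cnt
        self.use_missing = other.use_missing
        self.zero_as_missing = other.zero_as_missing

reference = ToyDataset(max_bin=7, min_data_in_bin=10)
subset = ToyDataset()  # would otherwise start from defaults
subset.copy_feature_mapper_from(reference)
print(subset.max_bin)  # 7, inherited from the reference dataset
```

Saving that subset to a binary file and reloading it then sees the reference's parameter values rather than the defaults, which is what issue #5402 reported as broken.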
7 changes: 7 additions & 0 deletions tests/python_package_test/test_basic.py
@@ -243,6 +243,13 @@ def test_chunked_dataset_linear():
     valid_data.construct()


def test_save_dataset_subset_and_load_from_file(tmp_path):
    data = np.random.rand(100, 2)
    ds = lgb.Dataset(data)
Collaborator commented:

Could you please try overriding the defaults (e.g. setting max_bin: 7 or something) and check that those non-default values survive the round trip to disk?

I think that would increase our confidence that this is working as expected. Otherwise, I think a bug of the form "all parameter values are lost when writing to binary Dataset file" could make it through this test.

jmoralez (Collaborator, Author) commented on Aug 16, 2022:
I think that will raise an error like the one in #4904, but I'll try it and confirm here.

jameslamb (Collaborator) commented on Aug 16, 2022:

Ok I see. I expected a PR called "include parameters" to test that parameters on either side of an operation had the same values.

I think maybe I got confused by the presence of writing and reading a binary file, and thought the issue was specific to storing a Dataset to disk. Is it like "when taking a subset, not all parameters are copied from the reference Dataset to the subset...and this can show up as an error loading the Dataset from file"?

If it is, then it would be great to be able to reach into the Dataset (on the C++ side, not the Python object) and check that attributes like max_bin_ match the reference and hold a non-default value before writing to a file. But I'm not sure how to do that without introducing a new c_api entrypoint.

So if adding the test I suggested does hit the error from #4904, then I think this test in its current state is ok. It is still an improvement that fixes a bug.

jmoralez (Collaborator, Author) commented:

I built the dataset with non-default parameters and checked that loading it with the same ones succeeds in 6a2fd1f.

jameslamb (Collaborator) commented on Aug 20, 2022:

Ok nice! Thank you for that and for the explanation.

I tried running this PR's test code on latest master, and can see based on the error message that those non-default parameter values are being respected.

```python
import lightgbm as lgb
import numpy as np

data = np.random.rand(100, 2)
params = {'max_bin': 50, 'min_data_in_bin': 10}
ds = lgb.Dataset(data, params=params).construct()
ds.subset([1, 2, 3, 5, 8]).save_binary('subset.bin')
lgb.Dataset('subset.bin', params=params).construct()
```

```text
[LightGBM] [Fatal] Dataset was constructed with parameter max_bin=32649. It cannot be changed to 50 when loading from binary file.
```

I'm not sure where that 32649 is coming from, but I think that's not an issue caused by this PR.

    ds.subset([1, 2, 3, 5, 8]).save_binary(tmp_path / 'subset.bin')
    lgb.Dataset(tmp_path / 'subset.bin').construct()


def test_subset_group():
    rank_example_dir = Path(__file__).absolute().parents[2] / 'examples' / 'lambdarank'
    X_train, y_train = load_svmlight_file(str(rank_example_dir / 'rank.train'))