New Source File and Lock Specification Approach #316

Status: Open · wants to merge 6 commits into main

Changes from 1 commit
84 changes: 2 additions & 82 deletions conda_lock/conda_lock.py
@@ -63,7 +63,6 @@
except ImportError:
PIP_SUPPORT = False
from conda_lock.lockfile import (
Dependency,
GitMeta,
InputMeta,
LockedDependency,
@@ -76,12 +75,8 @@
write_conda_lock_file,
)
from conda_lock.lookup import set_lookup_location
from conda_lock.src_parser import LockSpecification, aggregate_lock_specs
from conda_lock.src_parser.environment_yaml import parse_environment_file
from conda_lock.src_parser.meta_yaml import parse_meta_yaml_file
from conda_lock.src_parser.pyproject_toml import parse_pyproject_toml
from conda_lock.src_parser import LockSpecification, make_lock_spec
from conda_lock.virtual_package import (
FakeRepoData,
default_virtual_package_repodata,
virtual_package_repo_from_specification,
)
@@ -114,8 +109,6 @@
sys.exit(1)


DEFAULT_PLATFORMS = ["osx-64", "linux-64", "win-64"]

KIND_EXPLICIT: Literal["explicit"] = "explicit"
KIND_LOCK: Literal["lock"] = "lock"
KIND_ENV: Literal["env"] = "env"
@@ -243,44 +236,6 @@ def fn_to_dist_name(fn: str) -> str:
return fn


def make_lock_spec(
srilman (Contributor Author):

Both make_lock_spec and parse_source_files are specifically related to parsing source files, and don't have a big effect on the rest of the program. Thus, I moved them to conda_lock/src_parser/__init__.py to reduce the amount of code in this file (since it's about 1500 lines of code).

maresb (Contributor):

Thanks a lot for all this work!!! It would still really help for reviewing to have these refactor steps in separate commits so that I could view the substantive changes separately. (In general it's much easier to follow several smaller logical commits than one massive one.)

I'm sincerely very eager to see this through quickly, but my schedule looks difficult at the moment. I'll see what I can do, but apologies in advance if I'm slow to respond.

srilman (Contributor Author):

@maresb I tried to break this PR down into a couple of commits to make it a bit easier. I had some trouble breaking down the last couple of commits since the contents is very much tied together. But if it is still difficult to look through, let me know.

maresb (Contributor), Feb 5, 2023:

Thanks for the additional commits! This is much better for review.

It could still be even better... The best would be one single logical change per commit. Please don't change this particular commit now, but to explain what I mean, your first commit "Move Function Sub PR" could be further broken down into:

  • Add pip_support as argument to make_lock_spec and parse_source_files
  • Move parse_source_files to src_parser
  • Move make_lock_spec to src_parser

because this is the level to which I need to deconstruct the changes to see what's going on. (Currently I have to diff each function removed from conda_lock.py with each function added to __init__.py in order to see exactly what changed, so a verbatim cut from one file and paste into the other is easier to process as a single logical change.)

srilman (Contributor Author):

I see, good point. I'm used to writing large PRs in general. In the future, I can definitely break down my commits even further. Would you like me to modify the last large commit of this PR? My concern with modifying that commit is that I'm not sure how to break it down without having some commit be broken. I normally try to ensure that every commit exposed to master is a somewhat-working impl of the app or library.

maresb (Contributor):

Large PRs are fine, it's just large commits which are difficult to understand.

Your commits don't need to be perfect, and I realize I'm asking for a fairly high standard. I can explain how I try to write commits:

I'm not sure how to break it down without having some commit be broken.

This is a good rule in general. But in some situations I think it's fine to break something in one commit and fix it in a subsequent one. (For example, in one commit I might remove functionality X, and then in the next commit I add functionality Y which replaces X. This way the new details of Y aren't confused with the old details of X.)

What also helps is to stage partial changes. For example, in the process of implementing X, I may modify some type hints in another part of the code. In this case, I can stage and commit the type hints in a separate commit, so that my implementation of X remains focused.

The most complicated technique is to rebase code you've already committed, but that is really a lot of work.

A few concrete suggestions for how you could break up the main commit:

  • Switch to ruamel
  • Define new classes
  • Implement core logic using the new classes

I think I can handle the large commit as-is, but I would need to find an uninterrupted block of time where I could work through the whole thing at once. If you manage to break up the commit, then I can probably finish reviewing it sooner.

srilman (Contributor Author):

Have you spent a lot of time looking at the last 2 commits, Initial Version of SourceFile Approach and Adding Test Yaml Files? If not, I can try to split those further.

srilman (Contributor Author):

I think you already looked through the first 2 commits thoroughly, so I will leave them alone.

maresb (Contributor):

I think you already looked through the first 2 commits thoroughly, so I will leave them alone.

Yes, in fact, if you create a separate PR for those I think we can already merge them since they are minor refactoring changes.

For the big commit I was thinking: since this is a major change, Marius will have to review it after me. Thus it may be worthwhile to invest extra time to make it readable.

*,
src_files: List[pathlib.Path],
virtual_package_repo: FakeRepoData,
channel_overrides: Optional[Sequence[str]] = None,
platform_overrides: Optional[Sequence[str]] = None,
required_categories: Optional[AbstractSet[str]] = None,
) -> LockSpecification:
"""Generate the lockfile specs from a set of input src_files. If required_categories is set filter out specs that do not match those"""
lock_specs = parse_source_files(
src_files=src_files, platform_overrides=platform_overrides
)

lock_spec = aggregate_lock_specs(lock_specs)
lock_spec.virtual_package_repo = virtual_package_repo
lock_spec.channels = (
[Channel.from_string(co) for co in channel_overrides]
if channel_overrides
else lock_spec.channels
)
lock_spec.platforms = (
list(platform_overrides) if platform_overrides else lock_spec.platforms
) or list(DEFAULT_PLATFORMS)

if required_categories is not None:

def dep_has_category(d: Dependency, categories: AbstractSet[str]) -> bool:
return d.category in categories

lock_spec.dependencies = [
d
for d in lock_spec.dependencies
if dep_has_category(d, categories=required_categories)
]

return lock_spec


def make_lock_files(
*,
conda: PathLike,
@@ -358,6 +313,7 @@ def make_lock_files(
platform_overrides=platform_overrides,
virtual_package_repo=virtual_package_repo,
required_categories=required_categories if filter_categories else None,
pip_support=PIP_SUPPORT,
)
lock_content: Optional[Lockfile] = None

@@ -867,42 +823,6 @@ def create_lockfile_from_spec(
)


def parse_source_files(
src_files: List[pathlib.Path],
platform_overrides: Optional[Sequence[str]],
) -> List[LockSpecification]:
"""
Parse a sequence of dependency specifications from source files

Parameters
----------
src_files :
Files to parse for dependencies
platform_overrides :
Target platforms to render environment.yaml and meta.yaml files for
"""
desired_envs: List[LockSpecification] = []
for src_file in src_files:
if src_file.name == "meta.yaml":
desired_envs.append(
parse_meta_yaml_file(
src_file, list(platform_overrides or DEFAULT_PLATFORMS)
)
)
elif src_file.name == "pyproject.toml":
desired_envs.append(parse_pyproject_toml(src_file))
else:
desired_envs.append(
parse_environment_file(
src_file,
platform_overrides,
default_platforms=DEFAULT_PLATFORMS,
pip_support=PIP_SUPPORT,
)
)
return desired_envs


def _add_auth_to_line(line: str, auth: Dict[str, str]) -> str:
matching_auths = [a for a in auth if a in line]
if not matching_auths:
86 changes: 85 additions & 1 deletion conda_lock/src_parser/__init__.py
@@ -5,7 +5,7 @@
import typing

from itertools import chain
from typing import Dict, List, Optional, Tuple, Union
from typing import AbstractSet, Dict, List, Optional, Sequence, Tuple, Union

from pydantic import BaseModel, validator
from typing_extensions import Literal
@@ -17,6 +17,8 @@
from conda_lock.virtual_package import FakeRepoData


DEFAULT_PLATFORMS = ["osx-64", "linux-64", "win-64"]

logger = logging.getLogger(__name__)


@@ -136,3 +138,85 @@ def aggregate_lock_specs(
platforms=ordered_union(lock_spec.platforms or [] for lock_spec in lock_specs),
sources=ordered_union(lock_spec.sources or [] for lock_spec in lock_specs),
)


def parse_source_files(
src_files: List[pathlib.Path],
platform_overrides: Optional[Sequence[str]],
pip_support: bool = True,
) -> List[LockSpecification]:
"""
Parse a sequence of dependency specifications from source files

Parameters
----------
src_files :
Files to parse for dependencies
platform_overrides :
Target platforms to render environment.yaml and meta.yaml files for
"""
from conda_lock.src_parser.environment_yaml import parse_environment_file
from conda_lock.src_parser.meta_yaml import parse_meta_yaml_file
from conda_lock.src_parser.pyproject_toml import parse_pyproject_toml
Comment on lines +193 to +195
maresb (Contributor):

Is there any particular reason to put these imports inside the function? For more standardized code, I'd prefer to have imports at the top of the file unless there's a good reason.

srilman (Contributor Author), Feb 5, 2023:

Those modules import conda_lock/src_parser/__init__.py. When I included them at the top, I was getting a circular dependency error. Those files use the SourceDependency, SourceFile, VersionedDependency, and URLDependency classes. If I move these classes to a new file like conda_lock/src_parser/models.py, then I can get rid of the circular dependency and have these be top level imports. Thoughts?

maresb (Contributor):

Ah, yes, circular dependencies in Python are really annoying.

Those files use the SourceDependency, SourceFile, VersionedDependency, and URLDependency classes.

Are they using them just for type hints? If so, then they are not true import cycles. In that case, you can do

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from ... import SourceDependency, ...

and the types won't be imported at runtime, avoiding the cycle. (Very nice, I see that you already know this trick! 😄)

If it's not just for type hints, then there may be some genuinely circular logic occurring. For instance, if a imports B from c and c imports D from a, then you may need to make a new module e which contains B and D. (Then a imports B from e and c imports D from e, and all is well.)
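A minimal runnable sketch of this fix, using throwaway in-memory modules in place of real files (the module names a, c, e and classes B, D come from the comment above; the load_module helper is invented just for this demo):

```python
import sys
import types


def load_module(name: str, source: str) -> types.ModuleType:
    """Exec source text as a module and register it, standing in for a file on disk."""
    mod = types.ModuleType(name)
    sys.modules[name] = mod  # register first, as the real import system does
    exec(source, mod.__dict__)
    return mod


# Shared classes move into a new module `e`...
e = load_module("e", "class B: pass\nclass D: pass\n")
# ...so `a` and `c` now both depend on `e` instead of on each other, and the cycle is gone.
a = load_module("a", "from e import B\n")
c = load_module("c", "from e import D\n")
print(a.B is e.B and c.D is e.D)  # True
```

Both `a` and `c` import only downward into `e`, so either can be loaded first without ever touching a partially initialized module.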

srilman (Contributor Author):

It's not just for type hints. I will move them to a new module.

maresb (Contributor):

I forgot to mention in my previous comment that Python does permit certain types of circular imports.

Running `from a import B` is essentially equivalent to `import a; B = a.B`. Python will allow you to import c from a while also importing a from c, as long as you don't access the module attributes (i.e. a.B) before the module has been fully loaded. (This works when all a.B accesses occur within function definitions.)

So there is another strategy: rather than fix the cycles, try to import modules lazily, i.e. replace `from c import B` with `import c`, and `B` with `c.B`. But I have the impression that this leads to very fragile code. I think it's generally much more robust to remove all non-typing-related circular imports when possible.
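The load-time tolerance described here can be demonstrated end to end with a small import hook. Everything below is hypothetical and invented for the demo: the module sources, the DictFinder class, and the values returned. `mod_a` and `mod_c` import each other at the top, but attribute access happens only inside function bodies, so the cycle resolves at call time rather than at load time:

```python
import importlib.abc
import importlib.util
import sys

# In-memory stand-ins for two files that import each other.
SOURCES = {
    "mod_a": "import mod_c\ndef B():\n    return mod_c.D() + 1\n",
    "mod_c": "import mod_a\ndef D():\n    return 10\n",
}


class DictFinder(importlib.abc.MetaPathFinder, importlib.abc.Loader):
    """Import hook serving the in-memory sources above in place of files on disk."""

    def find_spec(self, name, path=None, target=None):
        if name in SOURCES:
            return importlib.util.spec_from_loader(name, self)
        return None

    def exec_module(self, module):
        # By this point the module is already in sys.modules (partially
        # initialized), which is exactly what lets the circular import succeed.
        exec(SOURCES[module.__name__], module.__dict__)


sys.meta_path.insert(0, DictFinder())
import mod_a  # loading mod_a triggers mod_c, which binds the partial mod_a

print(mod_a.B())  # 11
```

If `mod_c` instead did `from mod_a import B` at the top, the load would fail with an ImportError, because `B` does not yet exist on the partially initialized `mod_a` — which is the fragility the comment warns about.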


desired_envs: List[LockSpecification] = []
for src_file in src_files:
if src_file.name == "meta.yaml":
desired_envs.append(
parse_meta_yaml_file(
src_file, list(platform_overrides or DEFAULT_PLATFORMS)
)
)
elif src_file.name == "pyproject.toml":
desired_envs.append(parse_pyproject_toml(src_file))
else:
desired_envs.append(
parse_environment_file(
src_file,
platform_overrides,
default_platforms=DEFAULT_PLATFORMS,
pip_support=pip_support,
)
)
return desired_envs


def make_lock_spec(
*,
src_files: List[pathlib.Path],
virtual_package_repo: FakeRepoData,
channel_overrides: Optional[Sequence[str]] = None,
platform_overrides: Optional[Sequence[str]] = None,
required_categories: Optional[AbstractSet[str]] = None,
pip_support: bool = True,
) -> LockSpecification:
"""Generate the lockfile specs from a set of input src_files. If required_categories is set filter out specs that do not match those"""
lock_specs = parse_source_files(
src_files=src_files,
platform_overrides=platform_overrides,
pip_support=pip_support,
)

lock_spec = aggregate_lock_specs(lock_specs)
lock_spec.virtual_package_repo = virtual_package_repo
lock_spec.channels = (
[Channel.from_string(co) for co in channel_overrides]
if channel_overrides
else lock_spec.channels
)
lock_spec.platforms = (
list(platform_overrides) if platform_overrides else lock_spec.platforms
) or list(DEFAULT_PLATFORMS)

if required_categories is not None:

def dep_has_category(d: Dependency, categories: AbstractSet[str]) -> bool:
return d.category in categories

lock_spec.dependencies = [
d
for d in lock_spec.dependencies
if dep_has_category(d, categories=required_categories)
]

return lock_spec
12 changes: 8 additions & 4 deletions tests/test_conda_lock.py
@@ -30,20 +30,17 @@
from conda_lock.conda_lock import (
DEFAULT_FILES,
DEFAULT_LOCKFILE_NAME,
DEFAULT_PLATFORMS,
_add_auth_to_line,
_add_auth_to_lockfile,
_extract_domain,
_strip_auth_from_line,
_strip_auth_from_lockfile,
aggregate_lock_specs,
create_lockfile_from_spec,
default_virtual_package_repodata,
determine_conda_executable,
extract_input_hash,
main,
make_lock_spec,
parse_meta_yaml_file,
run_lock,
)
from conda_lock.conda_solver import extract_json_object, fake_conda_environment
@@ -66,8 +63,15 @@
)
from conda_lock.models.channel import Channel
from conda_lock.pypi_solver import parse_pip_requirement, solve_pypi
from conda_lock.src_parser import LockSpecification, Selectors, VersionedDependency
from conda_lock.src_parser import (
DEFAULT_PLATFORMS,
LockSpecification,
Selectors,
VersionedDependency,
aggregate_lock_specs,
)
from conda_lock.src_parser.environment_yaml import parse_environment_file
from conda_lock.src_parser.meta_yaml import parse_meta_yaml_file
from conda_lock.src_parser.pyproject_toml import (
parse_pyproject_toml,
poetry_version_to_conda_version,