diff --git a/docs/concepts/changelog_templates.rst b/docs/concepts/changelog_templates.rst index d42a210d7..baa895a32 100644 --- a/docs/concepts/changelog_templates.rst +++ b/docs/concepts/changelog_templates.rst @@ -1035,6 +1035,101 @@ the remote VCS for each commit. Both of these are injected into the template env by PSR. +.. _changelog-templates-remote: + +Remote Template Directories +--------------------------- + +*Introduced in v10.6.0* + +As of v10.6.0, PSR supports loading changelog templates from remote filesystems such as +GitHub repositories, S3 buckets, HTTP servers, and other locations supported by `fsspec`_. +This enables you to centralize your changelog templates in a shared location and reuse +them across multiple projects. + +.. _fsspec: https://filesystem-spec.readthedocs.io/ + +To use a remote template directory, set the :ref:`template_dir ` +setting to a URL supported by fsspec. + +Most remote protocols require additional packages. Install the appropriate +fsspec extra for your protocol (e.g., ``fsspec[s3]`` for S3). +Refer to the `fsspec extras documentation`_ for the full list of available protocols. + +.. _fsspec extras documentation: https://filesystem-spec.readthedocs.io/en/latest/#installation + + +GitHub Repository (Public) +^^^^^^^^^^^^^^^^^^^^^^^^^^ + +For a public GitHub repository, use the ``github://`` protocol: + +.. code-block:: toml + + [tool.semantic_release.changelog] + template_dir = "github://myorg:shared-templates@main/changelog-templates" + +The URL format is ``github://org:repo@ref/path`` where: + +* ``org`` is the GitHub organization or username +* ``repo`` is the repository name +* ``ref`` is the branch name, tag, or commit SHA +* ``path`` is the path to the template directory within the repository + + +GitHub Repository (Private) +^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +For private repositories, you need to provide authentication via the +:ref:`storage_options ` setting: + +.. code-block:: toml + + [tool.semantic_release.changelog] + template_dir = "github://myorg:private-templates@main/changelog-templates" + + [tool.semantic_release.changelog.storage_options] + username = { env = "GITHUB_TOKEN" } + token = { env = "GITHUB_TOKEN" } + +Refer to the `fsspec GithubFileSystem documentation`_ for details on supported +authentication methods. + +.. _fsspec GithubFileSystem documentation: https://filesystem-spec.readthedocs.io/en/latest/api.html#fsspec.implementations.github.GithubFileSystem + + +Other Supported Protocols +^^^^^^^^^^^^^^^^^^^^^^^^^ + +Through fsspec, PSR supports many other remote filesystems including: + +* **HTTP/HTTPS**: ``https://example.com/templates`` +* **S3**: ``s3://bucket-name/templates`` +* **Google Cloud Storage**: ``gcs://bucket-name/templates`` +* **Azure Blob Storage**: ``az://container-name/templates`` + +Refer to the `fsspec documentation`_ for the full list of supported filesystems and +their respective configuration options. + +.. _fsspec documentation: https://filesystem-spec.readthedocs.io/en/latest/api.html#built-in-implementations + + +Security Considerations +^^^^^^^^^^^^^^^^^^^^^^^ + +.. warning:: + Chained protocols (e.g., ``simplecache::s3://bucket/path``) are not supported for + security reasons. 
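+
+Before trusting a remote template source, you can audit exactly which files it
+exposes by listing it directly with ``universal-pathlib``, the library PSR uses
+under the hood for remote paths. The following is a minimal sketch; the
+repository name is a placeholder, and it assumes the relevant fsspec extra is
+installed and the location is reachable:
+
+.. code-block:: python
+
+    from upath import UPath
+
+    templates = UPath("github://myorg:shared-templates@main/changelog-templates")
+
+    # For private locations, pass the same keyword arguments you would place
+    # in ``storage_options`` directly to ``UPath``.
+    for entry in templates.iterdir():
+        if entry.is_file():
+            print(entry.name)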
+ +When using remote templates, be aware that: + +* Templates are fetched from the remote location each time the changelog is generated +* Local path traversal protections do not apply to remote URLs (remote templates are + by definition external to your repository) +* Ensure your remote template source is trusted, as templates have access to your + project's release history and are rendered using Jinja2 + + .. _changelog-templates-custom_release_notes: Custom Release Notes diff --git a/docs/configuration/configuration.rst b/docs/configuration/configuration.rst index 690a3e4c3..7e0a6d5a5 100644 --- a/docs/configuration/configuration.rst +++ b/docs/configuration/configuration.rst @@ -727,6 +727,8 @@ reStructuredText ``..\n version list`` ``template_dir`` **************** +*Remote filesystem support introduced in v10.6.0* + **Type:** ``str`` When files exist within the specified directory, they will be used as templates for @@ -738,12 +740,56 @@ No default changelog template or release notes template will be used when this d exists and the directory is not empty. If the directory is empty, the default changelog template will be used. +As of v10.6.0, ``template_dir`` supports remote filesystem URLs in addition to local +paths. This allows you to store your changelog templates in a central location such +as a GitHub repository, S3 bucket, or any other filesystem supported by `fsspec`_. +See :ref:`changelog-templates-remote` for usage examples. + +.. _fsspec: https://filesystem-spec.readthedocs.io/ + +.. warning:: + Chained protocols (e.g., ``simplecache::s3://...``) are not supported for security + reasons. + This option is discussed in more detail at :ref:`changelog-templates` **Default:** ``"templates"`` ---- +.. _config-changelog-storage_options: + +``storage_options`` +******************* + +*Introduced in v10.6.0* + +**Type:** ``dict[str, str | EnvConfigVar]`` + +Authentication and configuration options for remote template directories. This setting +is only relevant when using a remote URL for :ref:`template_dir `. + +Values can be plain strings or environment variable references using the ``{ env = "VAR_NAME" }`` +syntax. + +This setting is passed to the underlying `fsspec`_ filesystem. Refer to the fsspec documentation +for available options per protocol. + +**Example:** GitHub private repository authentication + +.. code-block:: toml + + [tool.semantic_release.changelog] + template_dir = "github://myorg:private-repo@main/templates" + + [tool.semantic_release.changelog.storage_options] + username = { env = "GITHUB_TOKEN" } + token = { env = "GITHUB_TOKEN" } + +**Default:** ``{}`` (empty) + +---- + .. 
_config-commit_author: ``commit_author`` diff --git a/pyproject.toml b/pyproject.toml index 19b636cb7..c459cc976 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -40,6 +40,7 @@ dependencies = [ "rich ~= 14.0", "shellingham ~= 1.5", "Deprecated ~= 1.2", # Backport of deprecated decorator for python 3.8 + "universal-pathlib ~= 0.2.0", ] [project.scripts] diff --git a/src/semantic_release/changelog/template.py b/src/semantic_release/changelog/template.py index d74295404..410d680e2 100644 --- a/src/semantic_release/changelog/template.py +++ b/src/semantic_release/changelog/template.py @@ -1,25 +1,59 @@ from __future__ import annotations -import os -import shutil from pathlib import Path, PurePosixPath from typing import TYPE_CHECKING -from jinja2 import FileSystemLoader +from jinja2 import BaseLoader, TemplateNotFound from jinja2.sandbox import SandboxedEnvironment +from upath import UPath from semantic_release.globals import logger from semantic_release.helpers import dynamic_import if TYPE_CHECKING: # pragma: no cover + import os from typing import Callable, Iterable, Literal from jinja2 import Environment +class UPathLoader(BaseLoader): + """ + Jinja2 loader using UPath for universal filesystem abstraction. + + This loader enables loading templates from any filesystem supported by fsspec/UPath, + including local files, git repositories, HTTP URLs, S3 buckets, etc. + """ + + def __init__( + self, + searchpath: UPath, + encoding: str = "utf-8", + ) -> None: + self.searchpath = searchpath + self.encoding = encoding + + def get_source( + self, _environment: Environment, template: str + ) -> tuple[str, str, Callable[[], bool]]: + path = self.searchpath / template + if not path.exists(): + raise TemplateNotFound(template) + source = path.read_text(encoding=self.encoding) + return source, str(path), lambda: True + + def list_templates(self) -> list[str]: + templates: list[str] = [] + for f in self.searchpath.rglob("*"): + if f.is_file(): + rel_path = PurePosixPath(f.path).relative_to(self.searchpath.path) + templates.append(str(rel_path)) + return templates + + # pylint: disable=too-many-arguments,too-many-locals def environment( - template_dir: Path | str = ".", + template_dir: Path | UPath | str = ".", block_start_string: str = "{%", block_end_string: str = "%}", variable_start_string: str = "{{", @@ -36,10 +70,10 @@ def environment( autoescape: bool | str = True, ) -> SandboxedEnvironment: """ - Create a jinja2.sandbox.SandboxedEnvironment with certain parameter resrictions. + Create a jinja2.sandbox.SandboxedEnvironment with certain parameter restrictions. - For example the Loader is fixed to FileSystemLoader, although the searchpath - is configurable. + Uses UPathLoader which supports both local and remote template directories + (git repositories, HTTP URLs, S3 buckets, etc.) via fsspec/UPath. ``autoescape`` can be a string in which case it should follow the convention ``module:attr``, in this instance it will be dynamically imported. 
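# --- Illustrative sketch (not applied by this patch): exercising UPathLoader by
# hand against an in-memory fsspec filesystem, mirroring how environment() wires
# the loader in. The "memory:///tpl" location and the "greeting.j2" template
# name are made up for the example.
from fsspec.implementations.memory import MemoryFileSystem
from jinja2.sandbox import SandboxedEnvironment
from upath import UPath

from semantic_release.changelog.template import UPathLoader

# Place a template on the in-memory filesystem
MemoryFileSystem().pipe("/tpl/greeting.j2", b"Hello, {{ name }}!")

loader = UPathLoader(UPath("memory:///tpl"), encoding="utf-8")
env = SandboxedEnvironment(loader=loader)

# get_template() resolves "greeting.j2" through UPathLoader.get_source()
print(env.get_template("greeting.j2").render(name="world"))  # Hello, world!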
@@ -52,6 +86,11 @@ def environment( else: autoescape_value = autoescape + if not isinstance(template_dir, UPath): + template_dir = UPath(template_dir) + + loader = UPathLoader(template_dir, encoding="utf-8") + return ComplexDirectorySandboxedEnvironment( block_start_string=block_start_string, block_end_string=block_end_string, @@ -67,7 +106,7 @@ def environment( keep_trailing_newline=keep_trailing_newline, extensions=extensions, autoescape=autoescape_value, - loader=FileSystemLoader(template_dir, encoding="utf-8"), + loader=loader, ) @@ -89,50 +128,51 @@ def join_path(self, template: str, parent: str) -> str: def recursive_render( - template_dir: Path, + template_dir: Path | UPath | str, environment: Environment, _root_dir: str | os.PathLike[str] = ".", ) -> list[str]: rendered_paths: list[str] = [] - for root, file in ( - (Path(root), file) - for root, _, files in os.walk(template_dir) - for file in files - if not any( - elem.startswith(".") for elem in Path(root).relative_to(template_dir).parts - ) - and not file.startswith(".") - ): - output_path = (_root_dir / root.relative_to(template_dir)).resolve() - logger.info("Rendering templates from %s to %s", root, output_path) + root_dir = Path(_root_dir) + + if not isinstance(template_dir, UPath): + template_dir = UPath(template_dir) + + for src_file in template_dir.rglob("*"): + if not src_file.is_file(): + continue + + # Convert to PurePosixPath for local path operations. + # PurePosixPath is correct because remote filesystems always use forward slashes + rel_path = PurePosixPath(src_file.path).relative_to(template_dir.path) + + if any(part.startswith(".") for part in rel_path.parts): + continue + + output_path = (root_dir / rel_path.parent).resolve() + logger.info("Rendering templates from %s to %s", src_file.parent, output_path) output_path.mkdir(parents=True, exist_ok=True) - if file.endswith(".j2"): - # We know the file ends with .j2 by the filter in the for-loop - output_filename = file[:-3] - # Strip off the template directory from the front of the root path - - # that's the output location relative to the repo root - src_file_path = str((root / file).relative_to(template_dir)) - output_file_path = str((output_path / output_filename).resolve()) + + if rel_path.suffix == ".j2": + src_file_rel = str(rel_path) + output_file_path = output_path / rel_path.stem # Although, file stream rendering is possible and preferred in most # situations, here it is not desired as you cannot read the previous # contents of a file during the rendering of the template. This mechanism # is used for inserting into a current changelog. 
When using stream rendering # of the same file, it always came back empty - logger.debug("rendering %s to %s", src_file_path, output_file_path) - rendered_file = environment.get_template(src_file_path).render().rstrip() - with open(output_file_path, "w", encoding="utf-8") as output_file: - output_file.write(f"{rendered_file}\n") + logger.debug("rendering %s to %s", src_file_rel, output_file_path) + rendered_file = environment.get_template(src_file_rel).render().rstrip() + output_file_path.write_text(f"{rendered_file}\n", encoding="utf-8") - rendered_paths.append(output_file_path) + rendered_paths.append(str(output_file_path)) else: - src_file = str((root / file).resolve()) - target_file = str((output_path / file).resolve()) - logger.debug( - "source file %s is not a template, copying to %s", src_file, target_file - ) - shutil.copyfile(src_file, target_file) - rendered_paths.append(target_file) + # Copy non-template file + target_file = output_path / rel_path.name + logger.debug("copying %s to %s", src_file, target_file) + target_file.write_bytes(src_file.read_bytes()) + rendered_paths.append(str(target_file)) return rendered_paths diff --git a/src/semantic_release/cli/changelog_writer.py b/src/semantic_release/cli/changelog_writer.py index 65e387896..832a23648 100644 --- a/src/semantic_release/cli/changelog_writer.py +++ b/src/semantic_release/cli/changelog_writer.py @@ -1,7 +1,6 @@ from __future__ import annotations import os -from contextlib import suppress from pathlib import Path from typing import TYPE_CHECKING @@ -29,6 +28,7 @@ if TYPE_CHECKING: # pragma: no cover from jinja2 import Environment + from upath import UPath from semantic_release.changelog.context import ChangelogContext from semantic_release.changelog.release_history import Release, ReleaseHistory @@ -107,7 +107,7 @@ def render_release_notes( def apply_user_changelog_template_directory( - template_dir: Path, + template_dir: Path | UPath | str, environment: Environment, destination_dir: Path, noop: bool = False, @@ -180,21 +180,19 @@ def write_changelog_files( mask_initial_release=runtime_ctx.changelog_mask_initial_release, ) - user_templates = [] + user_templates: list[str] = [] # Update known templates list if Directory exists and directory has actual files to render if template_dir.is_dir(): - user_templates.extend( - [ - f - for f in template_dir.rglob("*") - if f.is_file() and f.suffix == JINJA2_EXTENSION - ] - ) - - with suppress(ValueError): - # do not include a release notes override when considering number of changelog templates - user_templates.remove(template_dir / DEFAULT_RELEASE_NOTES_TPL_FILE) + user_templates = [ + str(f) + for f in template_dir.rglob("*") + if ( + f.is_file() + and f.suffix == JINJA2_EXTENSION + and f.name != DEFAULT_RELEASE_NOTES_TPL_FILE + ) + ] # Render user templates if found if len(user_templates) > 0: @@ -208,7 +206,7 @@ def write_changelog_files( ) logger.info( - "No contents found in %r, using default changelog template", template_dir + "No contents found in %r, using default changelog template", str(template_dir) ) return [ write_default_changelog( @@ -225,7 +223,7 @@ def write_changelog_files( def generate_release_notes( hvcs_client: HvcsBase, release: Release, - template_dir: Path, + template_dir: Path | UPath, history: ReleaseHistory, style: str, mask_initial_release: bool, diff --git a/src/semantic_release/cli/config.py b/src/semantic_release/cli/config.py index 514d76ef1..c27fe4dbe 100644 --- a/src/semantic_release/cli/config.py +++ b/src/semantic_release/cli/config.py @@ 
-29,6 +29,7 @@ model_validator, ) from typing_extensions import Annotated, Self +from upath import UPath from urllib3.util.url import parse_url import semantic_release.hvcs as hvcs @@ -149,6 +150,39 @@ def interpret_output_format(self) -> Self: return self +class StorageOptionsConfig(BaseModel): + """ + fsspec storage options with environment variable support. + + This configuration allows passing authentication and other options to fsspec + for remote template directories. Values can be either strings or EnvConfigVar + objects to resolve from environment variables. + + Example configuration: + [tool.semantic_release.changelog.storage_options] + key = { env = "AWS_ACCESS_KEY_ID" } + secret = { env = "AWS_SECRET_ACCESS_KEY" } + """ + + model_config = {"extra": "allow"} + + @model_validator(mode="before") + @classmethod + def resolve_env_vars(cls, values: Dict[str, Any]) -> Dict[str, Any]: + resolved: Dict[str, Any] = {} + for key, value in values.items(): + if isinstance(value, dict) and "env" in value: + # This is an EnvConfigVar-like dict + env_var = EnvConfigVar(**value) + resolved_value = env_var.getvalue() + # Only include if we got a value + if resolved_value is not None: + resolved[key] = resolved_value + elif value is not None: + resolved[key] = value + return resolved + + class ChangelogConfig(BaseModel): # TODO: BREAKING CHANGE v11, move to DefaultChangelogTemplatesConfig changelog_file: str = "" @@ -162,6 +196,20 @@ class ChangelogConfig(BaseModel): mode: ChangelogMode = ChangelogMode.UPDATE insertion_flag: str = "" template_dir: str = "templates" + storage_options: StorageOptionsConfig = Field(default_factory=StorageOptionsConfig) + + @field_validator("template_dir", mode="after") + @classmethod + def validate_template_dir(cls, val: str) -> str: + # Chained protocols (e.g., simplecache::s3://) are not supported. + # Attackers could use arbitrarily complex nested protocols to bypass + # local path security heuristics. For example, simplecache::file:///etc/passwd + # would have protocol "simplecache", not "file", bypassing the local path + # traversal check. Rather than trying to parse and validate nested chains, + # we disallow them entirely. + if "::" in val: + raise ValueError("Chained protocols are not supported for template_dir.") + return val @field_validator("exclude_commit_patterns", mode="after") @classmethod @@ -565,7 +613,7 @@ class RuntimeContext: changelog_output_format: ChangelogOutputFormat ignore_token_for_push: bool template_environment: Environment - template_dir: Path + template_dir: UPath build_command: Optional[str] build_command_env: dict[str, str] dist_glob_patterns: Tuple[str, ...] @@ -808,18 +856,23 @@ def from_raw_config( # noqa: C901 "Changelog file destination must be inside of the repository directory." ) - # Must use absolute after resolve because windows does not resolve if the path does not exist - # which means it returns a relative path. So we force absolute to ensure path is complete - # for the next check of path matching - template_dir = ( - Path(raw.changelog.template_dir).expanduser().resolve().absolute() - ) - - # Prevent path traversal attacks - if raw.repo_dir not in template_dir.parents: - raise InvalidConfiguration( - "Template directory must be inside of the repository directory." - ) + template_dir_str = raw.changelog.template_dir + storage_opts = raw.changelog.storage_options.model_dump(exclude_none=True) + template_dir = UPath(template_dir_str, **storage_opts) + + # Prevent path traversal attacks (local paths only). 
+ # Chained protocols are already rejected by ChangelogConfig.validate_template_dir, + # so we can safely check the protocol property here. + if template_dir.protocol in ("", "file", "local"): + # Must use absolute after resolve because windows does not resolve if the path does not exist + # which means it returns a relative path. So we force absolute to ensure path is complete + # for the next check of path matching + resolved = Path(template_dir_str).expanduser().resolve().absolute() + if raw.repo_dir not in resolved.parents and resolved != raw.repo_dir: + raise InvalidConfiguration( + "Template directory must be inside of the repository directory." + ) + template_dir = UPath(resolved) template_environment = environment( template_dir=template_dir, diff --git a/tests/unit/semantic_release/changelog/test_template.py b/tests/unit/semantic_release/changelog/test_template.py index 32c6ab4dc..f038f8bbf 100644 --- a/tests/unit/semantic_release/changelog/test_template.py +++ b/tests/unit/semantic_release/changelog/test_template.py @@ -4,14 +4,26 @@ # but not all of them. The testing can be expanded to cover all the options later. # It's not super essential as Jinja2 does most of the testing, we're just checking # that we can properly set the right strings in the template environment. +from re import compile as regexp from textwrap import dedent from typing import TYPE_CHECKING +from unittest.mock import MagicMock import pytest +from fsspec.implementations.memory import ( # type: ignore[import-untyped] + MemoryFileSystem, +) +from jinja2 import TemplateNotFound +from upath import UPath -from semantic_release.changelog.template import environment +from semantic_release.changelog.template import ( + UPathLoader, + environment, + recursive_render, +) if TYPE_CHECKING: + from pathlib import Path from typing import Any EXAMPLE_TEMPLATE_FORMAT_STR = """ @@ -65,3 +77,200 @@ def test_template_env_configurable(format_map: dict[str, Any], subjects: tuple[s actual_result = template.render(title="important", subjects=subjects) assert expected_result == actual_result + + +def test_upathloader_get_source_existing_template(tmp_path: Path): + template_dir = tmp_path / "templates" + template_dir.mkdir() + template_file = template_dir / "test.j2" + template_content = "Hello {{ name }}" + template_file.write_text(template_content) + loader = UPathLoader(UPath(template_dir)) + + source, path, uptodate = loader.get_source(MagicMock(), "test.j2") + + assert source == template_content + assert str(template_dir / "test.j2") in path + assert uptodate() is True + + +def test_upathloader_get_source_missing_template_raises(tmp_path: Path): + template_dir = tmp_path / "templates" + template_dir.mkdir() + loader = UPathLoader(UPath(template_dir)) + + with pytest.raises(TemplateNotFound, match=regexp(r"nonexistent\.j2")): + loader.get_source(MagicMock(), "nonexistent.j2") + + +def test_upathloader_get_source_nested_template(tmp_path: Path): + template_dir = tmp_path / "templates" + nested_dir = template_dir / "subdir" / "nested" + nested_dir.mkdir(parents=True) + template_file = nested_dir / "deep.j2" + template_file.write_text("Deep template") + loader = UPathLoader(UPath(template_dir)) + + source, _, _ = loader.get_source(MagicMock(), "subdir/nested/deep.j2") + + assert source == "Deep template" + + +def test_upathloader_list_templates_empty_directory(tmp_path: Path): + template_dir = tmp_path / "templates" + template_dir.mkdir() + loader = UPathLoader(UPath(template_dir)) + + templates = loader.list_templates() + + assert 
templates == [] + + +def test_upathloader_list_templates_flat_structure(tmp_path: Path): + template_dir = tmp_path / "templates" + template_dir.mkdir() + (template_dir / "a.j2").write_text("A") + (template_dir / "b.html").write_text("B") + (template_dir / "c.txt").write_text("C") + loader = UPathLoader(UPath(template_dir)) + + templates = loader.list_templates() + + assert sorted(templates) == ["a.j2", "b.html", "c.txt"] + + +def test_upathloader_list_templates_nested_structure(tmp_path: Path): + template_dir = tmp_path / "templates" + template_dir.mkdir() + (template_dir / "root.j2").write_text("Root") + subdir = template_dir / "subdir" + subdir.mkdir() + (subdir / "nested.j2").write_text("Nested") + loader = UPathLoader(UPath(template_dir)) + + templates = loader.list_templates() + + assert "root.j2" in templates + assert any("subdir" in t and "nested.j2" in t for t in templates) + + +def test_upathloader_list_templates_excludes_directories(tmp_path: Path): + template_dir = tmp_path / "templates" + template_dir.mkdir() + (template_dir / "file.j2").write_text("File") + (template_dir / "subdir").mkdir() + loader = UPathLoader(UPath(template_dir)) + + templates = loader.list_templates() + + assert templates == ["file.j2"] + + +def test_upathloader_custom_encoding(tmp_path: Path): + template_dir = tmp_path / "templates" + template_dir.mkdir() + template_file = template_dir / "unicode.j2" + content = "Unicode: \u00e9\u00e8\u00ea" + template_file.write_text(content, encoding="utf-8") + loader = UPathLoader(UPath(template_dir), encoding="utf-8") + + source, _, _ = loader.get_source(MagicMock(), "unicode.j2") + + assert source == content + + +@pytest.fixture +def clear_memory_fs(): + # MemoryFileSystem uses a shared store, so we need to clear it + MemoryFileSystem.store.clear() + MemoryFileSystem.pseudo_dirs.clear() + yield + MemoryFileSystem.store.clear() + MemoryFileSystem.pseudo_dirs.clear() + + +@pytest.fixture +def memory_fs(clear_memory_fs: None) -> MemoryFileSystem: + return MemoryFileSystem() + + +def test_upathloader_with_memory_filesystem(memory_fs: MemoryFileSystem): + memory_fs.mkdir("/templates") + memory_fs.pipe("/templates/test.j2", b"Hello {{ name }}") + upath = UPath("memory:///templates") + loader = UPathLoader(upath) + + source, path, _ = loader.get_source(MagicMock(), "test.j2") + + assert source == "Hello {{ name }}" + assert "test.j2" in path + + +def test_upathloader_list_templates_memory_filesystem(memory_fs: MemoryFileSystem): + memory_fs.mkdir("/templates") + memory_fs.pipe("/templates/a.j2", b"A") + memory_fs.pipe("/templates/b.j2", b"B") + upath = UPath("memory:///templates") + loader = UPathLoader(upath) + + templates = loader.list_templates() + + assert sorted(templates) == ["a.j2", "b.j2"] + + +def test_upathloader_nested_memory_filesystem(memory_fs: MemoryFileSystem): + memory_fs.mkdir("/templates") + memory_fs.mkdir("/templates/subdir") + memory_fs.pipe("/templates/subdir/nested.j2", b"Nested content") + upath = UPath("memory:///templates") + loader = UPathLoader(upath) + + source, _, _ = loader.get_source(MagicMock(), "subdir/nested.j2") + + assert source == "Nested content" + + +def test_environment_with_memory_filesystem(memory_fs: MemoryFileSystem): + memory_fs.mkdir("/templates") + memory_fs.pipe("/templates/greet.j2", b"Hello, {{ name }}!") + upath = UPath("memory:///templates") + env = environment(template_dir=upath) + + template = env.get_template("greet.j2") + result = template.render(name="Remote") + + assert result == "Hello, Remote!" 
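# Aside (hedged sketch): MemoryFileSystem keeps its contents in class-level
# state, so every instance sees the same files; this is why the clear_memory_fs
# fixture above wipes the shared store between tests.
from fsspec.implementations.memory import MemoryFileSystem

fs_a = MemoryFileSystem()
fs_b = MemoryFileSystem()
fs_a.pipe("/shared.txt", b"visible everywhere")
assert fs_b.cat("/shared.txt") == b"visible everywhere"  # same backing store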
+ + +def test_recursive_render_with_memory_filesystem( + memory_fs: MemoryFileSystem, tmp_path: Path +): + memory_fs.mkdir("/templates") + memory_fs.pipe("/templates/config.yaml.j2", b"key: {{ value }}") + memory_fs.pipe("/templates/readme.txt", b"Static file") + output_dir = tmp_path / "output" + output_dir.mkdir() + upath = UPath("memory:///templates") + env = environment(template_dir=upath) + env.globals["value"] = "test_value" + + rendered_paths = recursive_render( + template_dir=upath, + environment=env, + _root_dir=output_dir, + ) + + assert len(rendered_paths) == 2 + assert (output_dir / "config.yaml").exists() + assert "key: test_value" in (output_dir / "config.yaml").read_text() + assert (output_dir / "readme.txt").exists() + assert (output_dir / "readme.txt").read_text() == "Static file" + + +def test_upathloader_missing_template_memory_filesystem(memory_fs: MemoryFileSystem): + memory_fs.mkdir("/templates") + upath = UPath("memory:///templates") + loader = UPathLoader(upath) + + with pytest.raises(TemplateNotFound, match=regexp(r"missing\.j2")): + loader.get_source(MagicMock(), "missing.j2") diff --git a/tests/unit/semantic_release/cli/test_config.py b/tests/unit/semantic_release/cli/test_config.py index 343748187..6dd3efe12 100644 --- a/tests/unit/semantic_release/cli/test_config.py +++ b/tests/unit/semantic_release/cli/test_config.py @@ -22,6 +22,7 @@ HvcsClient, RawConfig, RuntimeContext, + StorageOptionsConfig, _known_hvcs, ) from semantic_release.cli.util import load_raw_config_file @@ -31,7 +32,7 @@ from semantic_release.commit_parser.tag import TagParserOptions from semantic_release.const import DEFAULT_COMMIT_AUTHOR from semantic_release.enums import LevelBump -from semantic_release.errors import ParserLoadError +from semantic_release.errors import InvalidConfiguration, ParserLoadError from tests.fixtures.repos import repo_w_no_tags_conventional_commits from tests.util import ( @@ -463,3 +464,267 @@ def test_git_remote_url_w_insteadof_alias( # Evaluate: the remote URL should be the full URL assert expected_url.url == actual_url + + +def test_storage_options_plain_string_values(): + config = StorageOptionsConfig.model_validate( + { + "key": "my-access-key", + "secret": "my-secret-key", + "bucket": "my-bucket", + } + ) + dumped = config.model_dump(exclude_none=True) + + assert dumped["key"] == "my-access-key" + assert dumped["secret"] == "my-secret-key" + assert dumped["bucket"] == "my-bucket" + + +def test_storage_options_env_var_resolution(): + with mock.patch.dict(os.environ, {"AWS_KEY": "resolved-key"}, clear=True): + config = StorageOptionsConfig.model_validate( + { + "key": {"env": "AWS_KEY"}, + } + ) + dumped = config.model_dump(exclude_none=True) + + assert dumped["key"] == "resolved-key" + + +def test_storage_options_missing_env_var_excluded(): + with mock.patch.dict(os.environ, {}, clear=True): + config = StorageOptionsConfig.model_validate( + { + "key": {"env": "NONEXISTENT_VAR"}, + } + ) + dumped = config.model_dump(exclude_none=True) + + assert "key" not in dumped + + +def test_storage_options_env_var_with_default(): + with mock.patch.dict(os.environ, {}, clear=True): + config = StorageOptionsConfig.model_validate( + { + "key": {"env": "MISSING_VAR", "default": "fallback-value"}, + } + ) + dumped = config.model_dump(exclude_none=True) + + assert dumped["key"] == "fallback-value" + + +def test_storage_options_env_var_with_default_env(): + with mock.patch.dict(os.environ, {"BACKUP_KEY": "backup-value"}, clear=True): + config = 
StorageOptionsConfig.model_validate( + { + "key": {"env": "PRIMARY_KEY", "default_env": "BACKUP_KEY"}, + } + ) + dumped = config.model_dump(exclude_none=True) + + assert dumped["key"] == "backup-value" + + +def test_storage_options_mixed_values(): + with mock.patch.dict(os.environ, {"SECRET_KEY": "secret123"}, clear=True): + config = StorageOptionsConfig.model_validate( + { + "bucket": "my-bucket", + "secret": {"env": "SECRET_KEY"}, + "missing": {"env": "MISSING"}, + } + ) + dumped = config.model_dump(exclude_none=True) + + assert dumped["bucket"] == "my-bucket" + assert dumped["secret"] == "secret123" + assert "missing" not in dumped + + +def test_storage_options_empty_config(): + config = StorageOptionsConfig.model_validate({}) + dumped = config.model_dump(exclude_none=True) + + assert dumped == {} + + +@pytest.mark.parametrize( + "chained_protocol", + [ + "simplecache::s3://bucket/templates", + "simplecache::http://example.com/templates", + "blockcache::file:///etc/templates", + "memory::s3://bucket/path", + "filecache::gcs://bucket/templates", + ], +) +def test_changelog_config_rejects_chained_protocols(chained_protocol: str): + with pytest.raises( + ValidationError, + match=regexp(r".*Chained protocols are not supported.*"), + ): + ChangelogConfig.model_validate( + { + "template_dir": chained_protocol, + } + ) + + +@pytest.mark.parametrize( + "valid_template_dir", + [ + "templates", + "./templates", + "../templates", + "/absolute/path/templates", + "s3://bucket/templates", + "http://example.com/templates", + "https://example.com/templates", + "gcs://bucket/templates", + "az://container/templates", + "file:///local/path", + ], +) +def test_changelog_config_accepts_valid_template_dirs(valid_template_dir: str): + config = ChangelogConfig.model_validate( + { + "template_dir": valid_template_dir, + } + ) + + assert config.template_dir == valid_template_dir + + +def test_changelog_config_with_storage_options(): + with mock.patch.dict(os.environ, {"S3_KEY": "test-key"}, clear=True): + config = ChangelogConfig.model_validate( + { + "template_dir": "s3://bucket/templates", + "storage_options": { + "key": {"env": "S3_KEY"}, + }, + } + ) + opts = config.storage_options.model_dump(exclude_none=True) + + assert config.template_dir == "s3://bucket/templates" + assert opts["key"] == "test-key" + + +@pytest.mark.usefixtures(repo_w_no_tags_conventional_commits.__name__) +def test_local_template_dir_outside_repo_rejected( + example_pyproject_toml: Path, + example_project_dir: Path, + update_pyproject_toml: UpdatePyprojectTomlFn, + change_to_ex_proj_dir: None, +): + outside_dir = example_project_dir.parent / "outside_repo_templates" + outside_dir.mkdir(parents=True, exist_ok=True) + update_pyproject_toml( + "tool.semantic_release.changelog.template_dir", + str(outside_dir), + ) + + with pytest.raises( + InvalidConfiguration, + match=regexp(r".*Template directory must be inside.*repository.*"), + ): + RuntimeContext.from_raw_config( + RawConfig.model_validate(load_raw_config_file(example_pyproject_toml)), + global_cli_options=GlobalCommandLineOptions(), + ) + + +@pytest.mark.usefixtures(repo_w_no_tags_conventional_commits.__name__) +def test_local_template_dir_inside_repo_accepted( + example_pyproject_toml: Path, + example_project_dir: Path, + update_pyproject_toml: UpdatePyprojectTomlFn, + change_to_ex_proj_dir: None, +): + template_dir = example_project_dir / "my-templates" + template_dir.mkdir(parents=True) + update_pyproject_toml( + "tool.semantic_release.changelog.template_dir", + str(template_dir), 
+ ) + + runtime_ctx = RuntimeContext.from_raw_config( + RawConfig.model_validate(load_raw_config_file(example_pyproject_toml)), + global_cli_options=GlobalCommandLineOptions(), + ) + + assert runtime_ctx is not None + + +@pytest.mark.usefixtures(repo_w_no_tags_conventional_commits.__name__) +def test_relative_template_dir_resolved_correctly( + example_pyproject_toml: Path, + example_project_dir: Path, + update_pyproject_toml: UpdatePyprojectTomlFn, + change_to_ex_proj_dir: None, +): + (example_project_dir / "rel-templates").mkdir() + update_pyproject_toml( + "tool.semantic_release.changelog.template_dir", + "rel-templates", + ) + + runtime_ctx = RuntimeContext.from_raw_config( + RawConfig.model_validate(load_raw_config_file(example_pyproject_toml)), + global_cli_options=GlobalCommandLineOptions(), + ) + + assert runtime_ctx is not None + + +@pytest.mark.parametrize( + "remote_protocol", + [ + "s3://bucket/templates", + "http://example.com/templates", + "https://example.com/templates", + "gcs://bucket/templates", + ], +) +@pytest.mark.usefixtures(repo_w_no_tags_conventional_commits.__name__) +def test_remote_protocol_skips_path_traversal_check( + remote_protocol: str, + example_pyproject_toml: Path, + update_pyproject_toml: UpdatePyprojectTomlFn, + change_to_ex_proj_dir: None, +): + update_pyproject_toml( + "tool.semantic_release.changelog.template_dir", + remote_protocol, + ) + # Remote protocols should not raise InvalidConfiguration for path traversal. + # They may fail for other reasons (network access), but path traversal check is skipped. + try: + RuntimeContext.from_raw_config( + RawConfig.model_validate(load_raw_config_file(example_pyproject_toml)), + global_cli_options=GlobalCommandLineOptions(), + ) + except InvalidConfiguration as err: + # If InvalidConfiguration is raised, it must NOT be about path traversal + assert "inside of the repository" not in str(err) # noqa: PT017 + + +@pytest.mark.usefixtures(repo_w_no_tags_conventional_commits.__name__) +def test_default_template_dir_accepted( + example_pyproject_toml: Path, + example_project_dir: Path, + change_to_ex_proj_dir: None, +): + (example_project_dir / "templates").mkdir() + + runtime_ctx = RuntimeContext.from_raw_config( + RawConfig.model_validate(load_raw_config_file(example_pyproject_toml)), + global_cli_options=GlobalCommandLineOptions(), + ) + + assert runtime_ctx is not None diff --git a/tests/unit/semantic_release/version/declarations/test_pattern_declaration.py b/tests/unit/semantic_release/version/declarations/test_pattern_declaration.py index ddca3dbf6..ae03575f0 100644 --- a/tests/unit/semantic_release/version/declarations/test_pattern_declaration.py +++ b/tests/unit/semantic_release/version/declarations/test_pattern_declaration.py @@ -435,10 +435,9 @@ def test_pattern_declaration_noop_warning_on_missing_file( # Evaluate assert file_to_modify is None - assert ( - "FILE NOT FOUND: cannot stamp version in non-existent file" - in capsys.readouterr().err - ) + # Normalize whitespace since rich may wrap text based on terminal width + stderr_output = " ".join(capsys.readouterr().err.split()) + assert "FILE NOT FOUND: cannot stamp version in non-existent file" in stderr_output def test_pattern_declaration_noop_warning_on_no_version_in_file( @@ -465,10 +464,9 @@ def test_pattern_declaration_noop_warning_on_no_version_in_file( # Evaluate assert file_to_modify is None - assert ( - "VERSION PATTERN NOT FOUND: no version to stamp in file" - in capsys.readouterr().err - ) + # Normalize whitespace since rich may wrap text based 
on terminal width + stderr_output = " ".join(capsys.readouterr().err.split()) + assert "VERSION PATTERN NOT FOUND: no version to stamp in file" in stderr_output @pytest.mark.parametrize( diff --git a/tests/unit/semantic_release/version/declarations/test_toml_declaration.py b/tests/unit/semantic_release/version/declarations/test_toml_declaration.py index a768b6cd3..5ee8174c0 100644 --- a/tests/unit/semantic_release/version/declarations/test_toml_declaration.py +++ b/tests/unit/semantic_release/version/declarations/test_toml_declaration.py @@ -279,10 +279,9 @@ def test_toml_declaration_noop_warning_on_missing_file( # Evaluate assert file_to_modify is None - assert ( - "FILE NOT FOUND: cannot stamp version in non-existent file" - in capsys.readouterr().err - ) + # Normalize whitespace since rich may wrap text based on terminal width + stderr_output = " ".join(capsys.readouterr().err.split()) + assert "FILE NOT FOUND: cannot stamp version in non-existent file" in stderr_output def test_toml_declaration_noop_warning_on_no_version_in_file( @@ -312,10 +311,9 @@ def test_toml_declaration_noop_warning_on_no_version_in_file( # Evaluate assert file_to_modify is None - assert ( - "VERSION PATTERN NOT FOUND: no version to stamp in file" - in capsys.readouterr().err - ) + # Normalize whitespace since rich may wrap text based on terminal width + stderr_output = " ".join(capsys.readouterr().err.split()) + assert "VERSION PATTERN NOT FOUND: no version to stamp in file" in stderr_output @pytest.mark.parametrize(