Compare commits


55 Commits
v0.2.1 ... main

Author     SHA1        Message  (Date)  [CI status]
IamTheFij  ddf509e9a4  Bump patch version to 2.2.1  (2023-10-31 20:42:28 -07:00)  [CI: push passing, tag passing]
IamTheFij  fbb38a9d7d  Make git-url optional again for pseudo_builder  (2023-10-31 20:41:40 -07:00)  [CI: push passing]
IamTheFij  f0ab45f0c6  Make sure hatch is installed when verifying tag  (2023-10-27 15:41:30 -07:00)  [CI: push killed, tag passing]
IamTheFij  3eb5fb3d75  Bump version to v2.2.0  (2023-10-27 15:32:43 -07:00)  [CI: push passing, tag failing]
IamTheFij  f1352658ae  Update pseudo_builder to be able to include extra files  (2023-10-27 15:32:43 -07:00)
IamTheFij  b8b81825f6  Skip upload pipeline entirely when not tagging for now  (2023-10-27 15:32:30 -07:00)
IamTheFij  daedacb35f  Really require python 3.7  (2023-10-27 15:27:04 -07:00)  [CI: push passing]
IamTheFij  09a7d38bc7  Skip upload to test pypi because we can't overwrite  (2023-10-27 15:26:46 -07:00)
IamTheFij  ff803dbc31  Use hatch dynamic version so that we can increment before test uploads  (2023-10-27 14:25:02 -07:00)  [CI: push failing]
IamTheFij  5ba06140dc  Add linting back in  (2023-10-27 14:04:46 -07:00)  [CI: push failing]
IamTheFij  302258ce6c  Fix test upload to use hatch publish  (2023-10-27 13:52:03 -07:00)  [CI: push failing]
IamTheFij  5423c04df6  Clean venv before trying to run hatch again  (2023-10-27 13:46:32 -07:00)  [CI: push failing]
IamTheFij  30801c5927  Switch from tox to hatch  (2023-10-27 13:41:46 -07:00)  [CI: push failing]
IamTheFij  8b9ff334a5  Switch to pyproject  (2023-10-26 17:32:28 -07:00)  [CI: push failing]
IamTheFij  08773d61b7  Bump version to v2.1.1  (2023-06-26 16:28:58 -07:00)  [CI: push passing, tag passing]
IamTheFij  6726931916  Run tests on python 3.11  (2023-06-26 16:26:39 -07:00)  [CI: push passing]
IamTheFij  b0e327e2cd  Remove walrus operator to fix for python3.7  (2023-06-26 16:14:34 -07:00)  [CI: push passing]
           Python 3.7 goes into end of support mode tomorrow, 2023-06-27, but will likely be in wide use for some time after.
IamTheFij  ab1f25304b  Correct some argument help strings  (2023-06-12 11:08:45 -07:00)  [CI: push failing]
IamTheFij  dfc12ed79e  Raise exception if trying to extract a member that doesn't exist  (2023-06-12 11:08:21 -07:00)
IamTheFij  de7fe72cec  Bump version to v2.1.0  (2023-06-05 11:45:49 -07:00)  [CI: push passing, tag passing]
IamTheFij  0f46808403  Add verbose flag to print version and asset downloaded  (2023-06-05 11:45:22 -07:00)  [CI: push passing]
IamTheFij  face8e9af0  Bump version to 2.0.0  (2023-05-22 17:11:38 -07:00)  [CI: push passing, tag passing]
           Changed behavior around pre-release versions.
IamTheFij  d555284a01  Avoid installing pre-release versions unless explicitly asked  (2023-05-22 17:09:55 -07:00)  [CI: push passing]
IamTheFij  869b0b25b4  Fix passing of version when using download_release  (2023-05-22 17:05:28 -07:00)
IamTheFij  d6c0673a1d  Use new type annotations introduced in Python 3.10  (2022-10-11 12:42:07 -07:00)  [CI: push passing]
IamTheFij  d48daaab10  Update dev requirements  (2022-10-11 12:42:07 -07:00)
           Make sure mypy and type stubs are installed in dev environment. They are already used for linting.
IamTheFij  e147fad63c  Bump version to 1.2.0  (2022-10-11 12:41:30 -07:00)  [CI: tag passing]
IamTheFij  ab0603d1b9  Improve content type detection  (2022-10-11 12:20:57 -07:00)  [CI: push passing]
           Cycle through detected content types and use the first supported one. Adds tests to cover cases of priority and exceptions.
IamTheFij  e6a269af3d  Add application/x-tar+xz as a known content type  (2022-10-11 12:20:08 -07:00)
IamTheFij  e92283b4e9  Skip existing files in test pypi and do twine check every time  (2022-08-31 13:20:23 -07:00)  [CI: push passing]
IamTheFij  10849adfb8  Don't worry about verifying tags when pushing to test pypi  (2022-08-31 12:45:36 -07:00)  [CI: push failing]
IamTheFij  8a4ac73c8d  Bump version to v1.1.3  (2022-08-31 12:28:00 -07:00)  [CI: push failing, tag passing]
IamTheFij  fe0d9059aa  Recognize new zip content type  (2022-08-31 12:26:41 -07:00)  [CI: push failing, tag killed]
IamTheFij  1b367f5ddb  Deploy to test pypi for pushes to main  (2022-07-08 12:52:24 -07:00)  [CI: push failing]
IamTheFij  7ff461fd89  Bump version to v1.1.2  (2022-06-30 19:49:43 -07:00)  [CI: push passing, tag passing]
IamTheFij  8585380eae  Fix missing removeprefix in pseudo_builder.py  (2022-06-30 19:48:35 -07:00)
IamTheFij  d876639c3e  Bump patch version to v1.1.1  (2022-06-30 15:43:08 -07:00)  [CI: push passing, tag passing]
IamTheFij  a27e09c77e  Add drone tests for older Python versions  (2022-06-30 15:41:52 -07:00)  [CI: push passing]
IamTheFij  3f23ddd3cc  Support for python3.7 and python3.8  (2022-06-30 15:37:24 -07:00)  [CI: push passing]
           Added tox targets as well.
IamTheFij  1b74126494  Use shutil.move so files can be installed across disks  (2022-06-30 13:55:31 -07:00)  [CI: push passing]
IamTheFij  c49c3ca345  Bump to v1.1.0  (2022-06-08 08:52:08 -07:00)  [CI: push passing, tag killed]
IamTheFij  e046c9a92a  Add the ability to format in asset name into post download command  (2022-06-08 08:50:20 -07:00)
IamTheFij  de1032cdbb  Update pre-commit hooks  (2022-04-04 20:14:57 -07:00)  [CI: push passing]
IamTheFij  199e53fe71  Bump version to v1.0.0  (2022-03-10 16:46:51 -08:00)  [CI: push passing, tag killed]
IamTheFij  61496f3b18  Refactor some method names and Python API  (2022-03-10 16:46:13 -08:00)
IamTheFij  c8607d0207  Add additional recognized content types for extract  (2022-03-10 16:44:57 -08:00)  [CI: push passing, tag killed]
IamTheFij  f1c0cb9c40  Bump version to 0.4.1  (2022-01-10 11:51:10 -08:00)  [CI: push passing, tag passing]
IamTheFij  3aa32347e6  Add printing of coverage report back  (2022-01-10 11:50:45 -08:00)
           Still not failing on coverage level since tests are being written.
IamTheFij  94b011799d  Add tests for version parsing  (2022-01-10 11:50:26 -08:00)
IamTheFij  f36c0b7ff7  Add additional tar/gzip headers  (2022-01-10 11:50:07 -08:00)
IamTheFij  83e76376d0  Bump version to 0.4.0  (2022-01-10 10:59:55 -08:00)  [CI: push passing, tag passing]
IamTheFij  648784b91c  Add path argument  (2022-01-09 13:27:06 -08:00)  [CI: push passing]
           This allows specifying a download directory. Note: this is the directory, not the name of the downloaded file(s).
IamTheFij  a6b0f46d7e  Bump version to v0.3.0  (2022-01-07 11:09:02 -08:00)  [CI: push passing, tag passing]
IamTheFij  020d9f442e  Add new fetching of git tags  (2022-01-07 11:08:37 -08:00)
IamTheFij  279b57c4ef  Add sample packaging implementation  (2022-01-06 15:10:17 -08:00)  [CI: push passing]
13 changed files with 567 additions and 223 deletions

.drone.star

@ -1,9 +1,11 @@
# Build pipelines
PYTHON_VERSIONS = [
# "3.7", doesnt support subscripted types, eg list[str]
# "3.8",
"3.7",
"3.8",
"3.9",
"3.10",
"3.11",
"latest",
]
@ -13,6 +15,19 @@ def main(ctx):
# Run tests
pipelines += tests()
pipelines += [{
"kind": "pipeline",
"name": "lint",
"workspace": get_workspace(),
"steps": [{
"name": "lint",
"image": "python:3",
"commands": [
"python -V",
"make lint",
]
}]
}]
# Add pypi push pipeline
pipelines += push_to_pypi()
@ -41,24 +56,20 @@ def tests():
"name": "tests",
"workspace": get_workspace(),
"steps": [
tox_step("python:"+version)
test_step("python:"+version)
for version in PYTHON_VERSIONS
],
}]
# Builds a single python test step
def tox_step(docker_tag, python_cmd="python", tox_env="py3"):
def test_step(docker_tag, python_cmd="python"):
return {
"name": "test {}".format(docker_tag.replace(":", "")),
"image": docker_tag,
"environment": {
"TOXENV": tox_env,
},
"commands": [
"{} -V".format(python_cmd),
"pip install tox",
"tox",
"make clean-all test"
],
}
@ -108,37 +119,36 @@ def push_to_pypi():
return [{
"kind": "pipeline",
"name": "deploy to pypi",
"depends_on": ["tests"],
"depends_on": ["tests", "lint"],
"workspace": get_workspace(),
"trigger": {
"event": ["tag"],
"ref": [
"refs/heads/main",
# "refs/heads/main",
"refs/tags/v*",
],
},
"steps": [
{
"name": "push to test pypi",
"image": "python:3",
"environment": {
"TWINE_USERNAME": {
"from_secret": "PYPI_USERNAME",
},
"TWINE_PASSWORD": {
"from_secret": "TEST_PYPI_PASSWORD",
},
},
"commands": ["make upload-test"],
},
# {
# "name": "push to test pypi",
# "image": "python:3",
# "environment": {
# "HATCH_INDEX_USER": {
# "from_secret": "PYPI_USERNAME",
# },
# "HATCH_INDEX_AUTH": {
# "from_secret": "TEST_PYPI_PASSWORD",
# },
# },
# "commands": ["make upload-test"],
# },
{
"name": "push to pypi",
"image": "python:3",
"environment": {
"TWINE_USERNAME": {
"HATCH_INDEX_USER": {
"from_secret": "PYPI_USERNAME",
},
"TWINE_PASSWORD": {
"HATCH_INDEX_AUTH": {
"from_secret": "PYPI_PASSWORD",
},
},

.pre-commit-config.yaml

@ -1,7 +1,7 @@
---
repos:
- repo: https://github.com/psf/black
rev: 21.12b0
rev: 22.3.0
hooks:
- id: black
- repo: https://github.com/pre-commit/pre-commit-hooks
@ -15,11 +15,11 @@ repos:
- id: name-tests-test
exclude: tests/(common.py|util.py|(helpers|integration/factories)/(.+).py)
- repo: https://github.com/asottile/reorder_python_imports
rev: v2.6.0
rev: v3.0.1
hooks:
- id: reorder-python-imports
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v0.930
rev: v0.942
hooks:
- id: mypy
exclude: docs/

Makefile

@ -5,7 +5,7 @@ ENV := venv
.PHONY: default
default: test
# Creates virtualenv
# Creates de virtualenv
$(ENV):
python3 -m venv $(ENV)
@ -13,86 +13,76 @@ $(ENV):
$(ENV)/bin/$(NAME): $(ENV)
$(ENV)/bin/pip install -r requirements-dev.txt
# Install tox into virtualenv for running tests
$(ENV)/bin/tox: $(ENV)
$(ENV)/bin/pip install tox
# Install wheel for building packages
$(ENV)/bin/wheel: $(ENV)
$(ENV)/bin/pip install wheel
# Install twine for uploading packages
$(ENV)/bin/twine: $(ENV)
$(ENV)/bin/pip install twine
# Install hatch into virtualenv for running tests
$(ENV)/bin/hatch: $(ENV)
$(ENV)/bin/pip install hatch
# Installs dev requirements to virtualenv
.PHONY: devenv
devenv: $(ENV)/bin/$(NAME)
# Generates a smaller env for running tox, which builds it's own env
.PHONY: test-env
test-env: $(ENV)/bin/tox
# Generates a small build env for building and uploading dists
.PHONY: build-env
build-env: $(ENV)/bin/twine $(ENV)/bin/wheel
# Runs package
.PHONY: run
run: $(ENV)/bin/$(NAME)
$(ENV)/bin/$(NAME)
# Runs tests with tox
# Runs tests for current python
.PHONY: test
test: $(ENV)/bin/tox
$(ENV)/bin/tox
test: $(ENV)/bin/hatch
$(ENV)/bin/hatch run +py=3 test:run
# Runs test matrix
.PHONY: test-matrix
test-matrix: $(ENV)/bin/hatch
$(ENV)/bin/hatch run test:run
# Builds wheel for package to upload
.PHONY: build
build: $(ENV)/bin/wheel
$(ENV)/bin/python setup.py sdist
$(ENV)/bin/python setup.py bdist_wheel
build: $(ENV)/bin/hatch
$(ENV)/bin/hatch build
# Verify that the python version matches the git tag so we don't push bad shas
.PHONY: verify-tag-version
verify-tag-version:
verify-tag-version: $(ENV)/bin/hatch
$(eval TAG_NAME = $(shell [ -n "$(DRONE_TAG)" ] && echo $(DRONE_TAG) || git describe --tags --exact-match))
test "v$(shell python setup.py -V)" = "$(TAG_NAME)"
test "v$(shell $(ENV)/bin/hatch version)" = "$(TAG_NAME)"
# Uses twine to upload to pypi
# Upload to pypi
.PHONY: upload
upload: verify-tag-version build $(ENV)/bin/twine
$(ENV)/bin/twine upload dist/*
upload: verify-tag-version build
$(ENV)/bin/hatch publish
# Uses twine to upload to test pypi
.PHONY: upload-test
upload-test: verify-tag-version build $(ENV)/bin/twine
$(ENV)/bin/twine upload --repository-url https://test.pypi.org/legacy/ dist/*
upload-test: build
# Bump version to a post version based on num of commits since last tag to prevent overwriting
$(ENV)/bin/hatch version $(shell git describe --tags | sed 's/-[0-9a-z]*$$//')
$(ENV)/bin/hatch publish --repo test
# Cleans all build, runtime, and test artifacts
.PHONY: clean
clean:
rm -fr ./build *.egg-info ./htmlcov ./.coverage ./.pytest_cache ./.tox
rm -fr ./build *.egg-info ./htmlcov ./.coverage ./.pytest_cache
find . -name '*.pyc' -delete
find . -name '__pycache__' -delete
# Cleans dist and env
.PHONY: dist-clean
dist-clean: clean
-$(ENV)/bin/hatch env prune
rm -fr ./dist $(ENV)
# Run linters
.PHONY: lint
lint: $(ENV)/bin/hatch
$(ENV)/bin/hatch run lint:all
# Install pre-commit hooks
.PHONY: install-hooks
install-hooks: devenv
$(ENV)/bin/pre-commit install -f --install-hooks
$(ENV)/bin/hatch run lint:install-hooks
# Generates test coverage
.coverage:
$(ENV)/bin/tox
.coverage: test
# Builds coverage html
htmlcov/index.html: .coverage
$(ENV)/bin/coverage html
$(ENV)/bin/hatch run coverage html
# Opens coverage html in browser (on macOS and some Linux systems)
.PHONY: open-coverage
@ -106,7 +96,7 @@ docs-clean:
# Builds docs
docs/build/html/index.html:
$(ENV)/bin/tox -e docs
$(ENV)/bin/hatch run docs:build
# Shorthand for building docs
.PHONY: docs

README.md

@ -28,7 +28,7 @@ In practice, it means that for a project like [StyLua](https://github.com/Johnny
--map-system Windows=win64 --map-system Darwin=macos --map-system=linux=Linux \
"stylua-{version}-{system}.zip"
And `release-gitter` will get the release version from the `Cargo.toml`, get the URL from the `git remote`, call the Github API and look for a release matching the templated file name, extract the `stylua` file from the archive, and then make it executable.
And `release-gitter` will get the release version from the `Cargo.toml`, get the URL from the `git remote`, call the Github API and look for a release matching the templated file name, extract the `stylua` file from the archive, and then make it executable. Alternatively, `--version-git-tag` can be used to pull your project's version from the latest git tag. This will automatically do a shallow fetch (depth = 1), but that can be suppressed with `--version-git-no-fetch`.
This allows a single command to be run from a checked out repo from pre-commit on any system to fetch the appropriate binary.
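For reference, the same flow is exposed through the Python API: the `download_release` convenience wrapper added in this comparison performs the fetch, match, download, and extract steps in one call. A minimal sketch, assuming the signatures shown in the `release_gitter.py` hunks below (the StyLua values are borrowed from the sample config in this diff; everything else is illustrative):

```python
# Minimal sketch using release_gitter's Python API; signatures are taken from the
# release_gitter.py hunks in this comparison, values are illustrative.
from pathlib import Path

import release_gitter as rg

files = rg.download_release(
    rg.GitRemoteInfo("github.com", "JohnnyMorganz", "StyLua"),  # hostname, owner, repo
    Path.cwd(),                          # destination directory for the downloaded asset
    "stylua-{version}-{system}.zip",     # asset name template to match
    system_mapping={"Darwin": "macos", "Windows": "win64", "Linux": "linux"},
    extract_files=["stylua"],            # pull just the stylua binary out of the archive
)
print("Downloaded", ", ".join(str(f) for f in files))
```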
@ -38,6 +38,7 @@ Full usage is as follows:
usage: release-gitter [-h] [--hostname HOSTNAME] [--owner OWNER] [--repo REPO]
[--git-url GIT_URL] [--version VERSION]
[--version-git-tag] [--version-git-no-fetch]
[--map-system MAP_SYSTEM] [--map-arch MAP_ARCH]
[--exec EXEC] [--extract-files EXTRACT_FILES]
[--extract-all] [--url-only]
@ -59,6 +60,10 @@ Full usage is as follows:
repo
--version VERSION Release version to download. If not provided, it will
look for project metadata
--version-git-tag, -t
Get the release version from a git tag
--version-git-no-fetch
Shallow fetch tags prior to checking versions
--map-system MAP_SYSTEM, -s MAP_SYSTEM
Map a platform.system() value to a custom value
--map-arch MAP_ARCH, -a MAP_ARCH
@ -68,3 +73,9 @@ Full usage is as follows:
A list of file name to extract from downloaded archive
--extract-all, -x Shell commands to execute after download or extraction
--url-only Only print the URL and do not download
### Pre-Commit usage
This can be used as a way to wrap a binary release from a Github or Gitea repo by adding a `pyproject.toml` file to your current project directory and adding a `.pre-commit-hooks.yaml` file.
Take a look at the `./sample_pseudo_bin` directory to see an example.
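Mechanically, that pre-commit flow relies on the pseudo build backend shown later in this diff: when pip builds the hook's wheel, the backend reads `[tool.release-gitter]` from `pyproject.toml` and downloads the release asset into the wheel's scripts directory. A rough sketch of that core step, assuming `pseudo_builder.py` is importable and the current directory holds such a `pyproject.toml`:

```python
# Rough sketch of what the pseudo builder does at wheel-build time, based on the
# pseudo_builder.py hunks below; the real backend also writes wheel metadata.
import pseudo_builder

config = pseudo_builder.read_metadata()   # parse [tool.release-gitter] into a Config
files = pseudo_builder.download(config)   # fetch and extract the release asset(s)
print("Fetched:", [f.name for f in files])
```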

pseudo_builder.py

@ -1,21 +1,46 @@
"""
This builder functions as a pseudo builder that instead downloads and installs a binary file using
release-gitter based on a pyproject.toml file
release-gitter based on a pyproject.toml file. It's a total hack...
"""
from __future__ import annotations
from dataclasses import dataclass
from pathlib import Path
from shutil import copy
from shutil import copytree
from shutil import move
import toml
from wheel.wheelfile import WheelFile
import release_gitter as rg
from release_gitter import removeprefix
PACKAGE_NAME = "pseudo"
def download(config) -> list[Path]:
release = rg.get_release(
@dataclass
class Config:
format: str
git_url: str
hostname: str
owner: str
repo: str
version: str | None = None
pre_release: bool = False
version_git_tag: bool = False
version_git_no_fetch: bool = False
map_system: dict[str, str] | None = None
map_arch: dict[str, str] | None = None
exec: str | None = None
extract_all: bool = False
extract_files: list[str] | None = None
include_extra_files: list[str] | None = None
def download(config: Config) -> list[Path]:
release = rg.fetch_release(
rg.GitRemoteInfo(config.hostname, config.owner, config.repo), config.version
)
asset = rg.match_asset(
@ -35,26 +60,35 @@ def download(config) -> list[Path]:
return files
def read_metadata():
def read_metadata() -> Config:
config = toml.load("pyproject.toml").get("tool", {}).get("release-gitter")
if not config:
raise ValueError("Must have configuration in [tool.release-gitter]")
args = []
for key, value in config.items():
key = "--" + key
if key == "--format":
args += [value]
elif isinstance(value, dict):
for sub_key, sub_value in value.items():
args = [key, f"{sub_key}={sub_value}"] + args
elif isinstance(value, list):
for sub_value in value:
args = [key, sub_value] + args
else:
args = [key, value] + args
git_url = config.pop("git-url", None)
remote_info = rg.parse_git_remote(git_url)
return rg.parse_args(args)
args = Config(
format=config.pop("format"),
git_url=git_url,
hostname=config.pop("hostname", remote_info.hostname),
owner=config.pop("owner", remote_info.owner),
repo=config.pop("repo", remote_info.repo),
)
for key, value in config.items():
setattr(args, str(key).replace("-", "_"), value)
if args.version is None:
args.version = rg.read_version(
args.version_git_tag,
not args.version_git_no_fetch,
)
if args.extract_all:
args.extract_files = []
return args
class _PseudoBuildBackend:
@ -64,11 +98,11 @@ class _PseudoBuildBackend:
def prepare_metadata_for_build_wheel(
self, metadata_directory, config_settings=None
):
# Createa .dist-info directory containing wheel metadata inside metadata_directory. Eg {metadata_directory}/{package}-{version}.dist-info/
# Create a .dist-info directory containing wheel metadata inside metadata_directory. Eg {metadata_directory}/{package}-{version}.dist-info/
print("Prepare meta", metadata_directory, config_settings)
metadata = read_metadata()
version = metadata.version.removeprefix("v")
version = removeprefix(metadata.version, "v") if metadata.version else "0.0.0"
# Returns distinfo dir?
dist_info = Path(metadata_directory) / f"{PACKAGE_NAME}-{version}.dist-info"
@ -115,7 +149,7 @@ class _PseudoBuildBackend:
metadata_directory = Path(metadata_directory)
metadata = read_metadata()
version = metadata.version.removeprefix("v")
version = removeprefix(metadata.version, "v") if metadata.version else "0.0.0"
wheel_directory = Path(wheel_directory)
wheel_directory.mkdir(exist_ok=True)
@ -123,15 +157,23 @@ class _PseudoBuildBackend:
wheel_scripts = wheel_directory / f"{PACKAGE_NAME}-{version}.data/scripts"
wheel_scripts.mkdir(parents=True, exist_ok=True)
# copytree(metadata_directory, wheel_directory / metadata_directory.name)
copytree(metadata_directory, wheel_directory / metadata_directory.name)
metadata = read_metadata()
files = download(metadata)
for file in files:
file.rename(wheel_scripts / file.name)
move(file, wheel_scripts / file.name)
print(f"ls {wheel_directory}: {list(wheel_directory.glob('*'))}")
for file_name in metadata.include_extra_files or []:
file = Path(file_name)
if Path.cwd() in file.absolute().parents:
copy(file_name, wheel_scripts / file)
else:
raise ValueError(
f"Cannot include any path that is not within the current directory: {file_name}"
)
print(f"ls {wheel_directory}: {list(wheel_directory.rglob('*'))}")
wheel_filename = f"{PACKAGE_NAME}-{version}-py2.py3-none-any.whl"
with WheelFile(wheel_directory / wheel_filename, "w") as wf:

pyproject.toml (new file, 64 lines)

@ -0,0 +1,64 @@
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[project]
name = "release-gitter"
dynamic = ["version"]
description = "Easily download releases from sites like Github and Gitea"
readme = "README.md"
license = "MIT"
classifiers = [
"Programming Language :: Python :: 3",
"Operating System :: OS Independent",
"License :: OSI Approved :: MIT License",
]
authors = [
{ name = "Ian Fijolek", email = "iamthefij@gmail.com" }
]
maintainers = [
{ name = "Ian Fijolek", email = "iamthefij@gmail.com" }
]
requires-python = ">=3.7"
dependencies = ["requests"]
[project.optional-dependencies]
builder = [
"toml",
"wheel",
]
[project.scripts]
release-gitter = "release_gitter:main"
[project.urls]
Homepage = "https://git.iamthefij.com/iamthefij/release-gitter"
[tool.hatch.version]
path = "release_gitter.py"
[tool.hatch.build]
include = ["release_gitter.py", "pseudo_builder.py"]
[tool.hatch.envs.test]
dependencies = [
"coverage",
]
[tool.hatch.envs.test.scripts]
run = [
"coverage erase",
"coverage run --source=release_gitter -m unittest discover . *_test.py",
"coverage report -m # --fail-under 70",
]
[[tool.hatch.envs.test.matrix]]
python = ["3", "3.7", "3.8", "3.9", "3.10", "3.11"]
[tool.hatch.envs.lint]
detached = true
dependencies = ["pre-commit"]
[tool.hatch.envs.lint.scripts]
all = "pre-commit run --all-files"
install-hooks = "pre-commit install --install-hooks"

release_gitter.py

@ -1,4 +1,6 @@
#! /usr/bin/env python3
from __future__ import annotations
import argparse
import platform
from collections.abc import Sequence
@ -11,23 +13,44 @@ from subprocess import check_output
from tarfile import TarFile
from tarfile import TarInfo
from typing import Any
from typing import Optional
from typing import Union
from urllib.parse import urlparse
from zipfile import ZipFile
import requests
__version__ = "2.2.1"
# Extract metadata from repo
class UnsupportedContentTypeError(ValueError):
pass
class InvalidRemoteError(ValueError):
pass
def removeprefix(s: str, pre: str) -> str:
# Duplicate str.removeprefix for py<3.9
try:
return s.removeprefix(pre) # type: ignore
except AttributeError:
# Py < 3.9
return s[len(pre) :] if s and s.startswith(pre) else s
def removesuffix(s: str, suf: str) -> str:
# Duplicate str.removesuffix for py<3.9
try:
return s.removesuffix(suf) # type: ignore
except AttributeError:
# Py < 3.9
return s[: -len(suf)] if s and s.endswith(suf) else s
@dataclass
class GitRemoteInfo:
"""Extracts information about a repository"""
hostname: str
owner: str
repo: str
@ -65,8 +88,8 @@ class GitRemoteInfo:
)
def get_git_remote(git_url: Optional[str] = None) -> GitRemoteInfo:
"""Extract Github repo info from git remote url"""
def parse_git_remote(git_url: str | None = None) -> GitRemoteInfo:
"""Extract Github repo info from a git remote url"""
if not git_url:
git_url = (
check_output(["git", "remote", "get-url", "origin"]).decode("UTF-8").strip()
@ -91,10 +114,10 @@ def get_git_remote(git_url: Optional[str] = None) -> GitRemoteInfo:
f"{path[1:3]} Could not parse owner and repo from URL {git_url}"
)
return GitRemoteInfo(u.hostname, path[1], path[2].removesuffix(".git"))
return GitRemoteInfo(u.hostname, path[1], removesuffix(path[2], ".git"))
def get_cargo_version(p: Path) -> str:
def parse_cargo_version(p: Path) -> str:
"""Extracts cargo version from a Cargo.toml file"""
with p.open() as f:
for line in f:
@ -104,9 +127,24 @@ def get_cargo_version(p: Path) -> str:
raise ValueError(f"No version found in {p}")
def read_version() -> Optional[str]:
def read_git_tag(fetch: bool = True) -> str | None:
"""Get local git tag for current repo
fetch: optionally fetch tags with depth of 1 from remote"""
if fetch:
check_call(["git", "fetch", "--tags", "--depth", "1"])
git_tag = check_output(["git", "describe", "--tags"]).decode("UTF-8").strip()
return git_tag or None
def read_version(from_tags: bool = False, fetch: bool = False) -> str | None:
"""Read version information from file or from git"""
if from_tags:
return read_git_tag(fetch)
matchers = {
"Cargo.toml": get_cargo_version,
"Cargo.toml": parse_cargo_version,
}
for name, extractor in matchers.items():
@ -119,13 +157,10 @@ def read_version() -> Optional[str]:
return None
# Fetch release and assets from Github
def get_release(
def fetch_release(
remote: GitRemoteInfo,
version: Optional[str] = None
# TODO: Accept an argument for pre-release
version: str | None = None,
pre_release=False,
) -> dict[Any, Any]:
"""Fetches a release object from a Github repo
@ -141,22 +176,29 @@ def get_release(
# Return the latest if requested
if version is None or version == "latest":
return result.json()[0]
for release in result.json():
if release["prerelease"] and not pre_release:
continue
return release
# Return matching version
for release in result.json():
if release["tag_name"].endswith(version):
return release
raise ValueError(f"Could not find release version ending in {version}")
raise ValueError(
f"Could not find release version ending in {version}."
f"{ ' Is it a pre-release?' if not pre_release else ''}"
)
def match_asset(
release: dict[Any, Any],
format: str,
version: Optional[str] = None,
system_mapping: Optional[dict[str, str]] = None,
arch_mapping: Optional[dict[str, str]] = None,
version: str | None = None,
system_mapping: dict[str, str] | None = None,
arch_mapping: dict[str, str] | None = None,
) -> dict[Any, Any]:
"""Accepts a release and searches for an appropriate asset attached using
a provided template and some alternative mappings for version, system, and machine info
@ -234,15 +276,25 @@ class PackageAdapter:
"""Adapts the names and extractall methods from ZipFile and TarFile classes"""
def __init__(self, content_type: str, response: requests.Response):
self._package: Union[TarFile, ZipFile]
if content_type == "application/zip":
self._package: TarFile | ZipFile
if content_type in (
"application/zip",
"application/x-zip-compressed",
):
self._package = ZipFile(BytesIO(response.content))
elif content_type == "application/x-tar":
self._package = TarFile(fileobj=response.raw)
elif content_type == "application/x-tar+gzip":
elif content_type in (
"application/gzip",
"application/x-tar+gzip",
"application/x-tar+xz",
"application/x-compressed-tar",
):
self._package = TarFile.open(fileobj=BytesIO(response.content), mode="r:*")
else:
raise ValueError(f"Unknown or unsupported content type {content_type}")
raise UnsupportedContentTypeError(
f"Unknown or unsupported content type {content_type}"
)
def get_names(self) -> list[str]:
"""Get list of all file names in package"""
@ -255,44 +307,83 @@ class PackageAdapter:
f"Unknown package type, cannot extract from {type(self._package)}"
)
def extractall(self, file_names: list[str]) -> list[str]:
def extractall(
self,
path: Path | None,
members: list[str] | None,
) -> list[str]:
"""Extract all or a subset of files from the package
If the `file_names` list is empty, all files will be extracted"""
if not file_names:
self._package.extractall()
if path is None:
path = Path.cwd()
if not members:
self._package.extractall(path=path)
return self.get_names()
if isinstance(self._package, ZipFile):
self._package.extractall(members=file_names)
if isinstance(self._package, TarFile):
self._package.extractall(members=(TarInfo(name) for name in file_names))
missing_members = set(members) - set(self.get_names())
if missing_members:
raise ValueError(f"Missing members: {missing_members}")
return file_names
if isinstance(self._package, ZipFile):
self._package.extractall(path=path, members=members)
if isinstance(self._package, TarFile):
self._package.extractall(
path=path, members=(TarInfo(name) for name in members)
)
return members
def get_asset_package(
asset: dict[str, Any], result: requests.Response
) -> PackageAdapter:
possible_content_types = (
asset.get("content_type"),
"+".join(t for t in guess_type(asset["name"]) if t is not None),
)
for content_type in possible_content_types:
if not content_type:
continue
try:
return PackageAdapter(content_type, result)
except UnsupportedContentTypeError:
continue
else:
raise UnsupportedContentTypeError(
"Cannot extract files from archive because we don't recognize the content type"
)
def download_asset(
asset: dict[Any, Any],
extract_files: Optional[list[str]] = None,
extract_files: list[str] | None = None,
destination: Path | None = None,
) -> list[Path]:
"""Download asset from entity passed in
Extracts files from archives if provided. Any empty list will extract all files
Args
`asset`: asset dictionary as returned from API
`extract_files`: optional list of file paths to extract. An empty list will extract all
`destination`: destination directory to put the downloaded assset
Returns
list of Path objects containing all extracted files
"""
if destination is None:
destination = Path.cwd()
result = requests.get(asset["browser_download_url"])
content_type = asset.get(
"content_type",
guess_type(asset["name"]),
)
if extract_files is not None:
if isinstance(content_type, tuple):
content_type = "+".join(t for t in content_type if t is not None)
if not content_type:
raise TypeError(
"Cannot extract files from archive because we don't recognize the content type"
)
package = PackageAdapter(content_type, result)
extract_files = package.extractall(extract_files)
return [Path.cwd() / name for name in extract_files]
package = get_asset_package(asset, result)
extract_files = package.extractall(path=destination, members=extract_files)
return [destination / name for name in extract_files]
file_name = Path.cwd() / asset["name"]
file_name = destination / asset["name"]
with open(file_name, "wb") as f:
f.write(result.content)
@ -304,8 +395,8 @@ class MapAddAction(argparse.Action):
self,
_: argparse.ArgumentParser,
namespace: argparse.Namespace,
values: Union[str, Sequence[Any], None],
option_string: Optional[str] = None,
values: str | Sequence[Any] | None,
option_string: str | None = None,
):
# Validate that required value has something
if self.required and not values:
@ -335,12 +426,21 @@ class MapAddAction(argparse.Action):
setattr(namespace, self.dest, dest)
def parse_args(args: Optional[list[str]] = None) -> argparse.Namespace:
def _parse_args(args: list[str] | None = None) -> argparse.Namespace:
parser = argparse.ArgumentParser()
parser.add_argument(
"format",
help="Format template to match assets. Eg `foo-{version}-{system}-{arch}.zip`",
help="Format template to match assets. Eg. `foo-{version}-{system}-{arch}.zip`",
)
parser.add_argument(
"destination",
metavar="DEST",
nargs="?",
type=Path,
default=Path.cwd(),
help="Destination directory. Defaults to current directory",
)
parser.add_argument("-v", action="store_true", help="verbose logging")
parser.add_argument(
"--hostname",
help="Git repository hostname",
@ -359,7 +459,23 @@ def parse_args(args: Optional[list[str]] = None) -> argparse.Namespace:
)
parser.add_argument(
"--version",
help="Release version to download. If not provied, it will look for project metadata",
help="Release version to download. If not provided, it will look for project metadata",
)
parser.add_argument(
"--prerelease",
action="store_true",
help="Include pre-release versions in search",
)
parser.add_argument(
"--version-git-tag",
"-t",
action="store_true",
help="Get the release version from a git tag",
)
parser.add_argument(
"--version-git-no-fetch",
action="store_true",
help="Shallow fetch tags prior to checking versions",
)
parser.add_argument(
"--map-system",
@ -374,19 +490,21 @@ def parse_args(args: Optional[list[str]] = None) -> argparse.Namespace:
help="Map a platform.machine() value to a custom value",
)
parser.add_argument(
"--exec", "-c", help="Shell commands to execute after download or extraction"
"--exec",
"-c",
help="Shell commands to execute after download or extraction. {} will be expanded to the downloaded asset name.",
)
parser.add_argument(
"--extract-files",
"-e",
action="append",
help="A list of file name to extract from downloaded archive",
help="A list of file names to extract from the downloaded archive",
)
parser.add_argument(
"--extract-all",
"-x",
action="store_true",
help="Shell commands to execute after download or extraction",
help="Extract all files from the downloaded archive",
)
parser.add_argument(
"--url-only",
@ -398,7 +516,7 @@ def parse_args(args: Optional[list[str]] = None) -> argparse.Namespace:
# Merge in fields from args and git remote
if not all((parsed_args.owner, parsed_args.repo, parsed_args.hostname)):
remote_info = get_git_remote(parsed_args.git_url)
remote_info = parse_git_remote(parsed_args.git_url)
def merge_field(a, b, field):
value = getattr(a, field)
@ -409,7 +527,10 @@ def parse_args(args: Optional[list[str]] = None) -> argparse.Namespace:
merge_field(parsed_args, remote_info, field)
if parsed_args.version is None:
parsed_args.version = read_version()
parsed_args.version = read_version(
parsed_args.version_git_tag,
not parsed_args.version_git_no_fetch,
)
if parsed_args.extract_all:
parsed_args.extract_files = []
@ -417,11 +538,45 @@ def parse_args(args: Optional[list[str]] = None) -> argparse.Namespace:
return parsed_args
def main():
args = parse_args()
def download_release(
remote_info: GitRemoteInfo,
destination: Path,
format: str,
version: str | None = None,
system_mapping: dict[str, str] | None = None,
arch_mapping: dict[str, str] | None = None,
extract_files: list[str] | None = None,
pre_release=False,
) -> list[Path]:
"""Convenience method for fetching, downloading and extracting a release"""
release = fetch_release(
remote_info,
version=version,
pre_release=pre_release,
)
asset = match_asset(
release,
format,
version=version,
system_mapping=system_mapping,
arch_mapping=arch_mapping,
)
files = download_asset(
asset,
extract_files=extract_files,
destination=destination,
)
release = get_release(
GitRemoteInfo(args.hostname, args.owner, args.repo), args.version
return files
def main():
args = _parse_args()
release = fetch_release(
GitRemoteInfo(args.hostname, args.owner, args.repo),
version=args.version,
pre_release=args.prerelease,
)
asset = match_asset(
release,
@ -431,17 +586,24 @@ def main():
arch_mapping=args.map_arch,
)
if args.v:
print(f"Downloading {asset['name']} from release {release['name']}")
if args.url_only:
print(asset["browser_download_url"])
return
files = download_asset(asset, extract_files=args.extract_files)
files = download_asset(
asset,
extract_files=args.extract_files,
destination=args.destination,
)
print(f"Downloaded {', '.join(str(f) for f in files)}")
# Optionally execute post command
if args.exec:
check_call(args.exec, shell=True)
check_call(args.exec.format(asset["name"]), shell=True)
if __name__ == "__main__":

release_gitter_test.py

@ -1,10 +1,15 @@
from __future__ import annotations
import unittest
from tarfile import TarFile
from typing import Any
from typing import Callable
from typing import NamedTuple
from typing import Optional
from unittest.mock import MagicMock
from unittest.mock import mock_open
from unittest.mock import patch
from zipfile import ZipFile
import requests
@ -16,7 +21,7 @@ class TestExpression(NamedTuple):
args: list[Any]
kwargs: dict[str, Any]
expected: Any
exception: Optional[type[Exception]]
exception: Optional[type[Exception]] = None
def run(self, f: Callable):
with self.t.subTest(f=f, args=self.args, kwargs=self.kwargs):
@ -34,6 +39,15 @@ class TestExpression(NamedTuple):
raise
class TestGeneral(unittest.TestCase):
def test_removesuffix(self):
for test_case in (
TestExpression(self, ["repo.git", ".git"], {}, "repo"),
TestExpression(self, ["repo", ".git"], {}, "repo"),
):
test_case.run(release_gitter.removesuffix)
class TestRemoteInfo(unittest.TestCase):
def test_parse_remote_info(self):
for test_case in (
@ -66,7 +80,7 @@ class TestRemoteInfo(unittest.TestCase):
release_gitter.InvalidRemoteError,
),
):
test_case.run(release_gitter.get_git_remote)
test_case.run(release_gitter.parse_git_remote)
def test_generate_release_url(self):
for subtest in (
@ -103,5 +117,85 @@ class TestRemoteInfo(unittest.TestCase):
subtest.run(release_gitter.GitRemoteInfo.get_releases_url)
class TestVersionInfo(unittest.TestCase):
def test_no_cargo_file(self):
with patch("pathlib.Path.exists", return_value=False):
version = release_gitter.read_version()
self.assertIsNone(version)
@patch("pathlib.Path.exists", return_value=True)
@patch(
"pathlib.Path.open",
mock_open(read_data="\n".join(["[package]", 'version = "1.0.0"'])),
)
def test_cargo_file_has_version(self, *_):
version = release_gitter.read_version()
self.assertEqual(version, "1.0.0")
@patch("pathlib.Path.exists", return_value=True)
@patch(
"pathlib.Path.open",
mock_open(read_data="\n".join(["[package]"])),
)
def test_cargo_file_missing_version(self, *_):
with self.assertRaises(ValueError):
release_gitter.read_version()
@patch("release_gitter.ZipFile", autospec=True)
@patch("release_gitter.BytesIO", autospec=True)
class TestContentTypeDetection(unittest.TestCase):
def test_asset_encoding_priority(self, *_):
package = release_gitter.get_asset_package(
{
"content_type": "application/x-tar",
"name": "test.zip",
},
MagicMock(spec=["raw", "content"]),
)
# Tar should take priority over the file name zip extension
self.assertIsInstance(package._package, TarFile)
def test_fallback_to_supported_encoding(self, *_):
package = release_gitter.get_asset_package(
{
"content_type": "application/octetstream",
"name": "test.zip",
},
MagicMock(spec=["raw", "content"]),
)
# Should fall back to zip extension
self.assertIsInstance(package._package, ZipFile)
def test_missing_only_name_content_type(self, *_):
package = release_gitter.get_asset_package(
{
"name": "test.zip",
},
MagicMock(spec=["raw", "content"]),
)
# Should fall back to zip extension
self.assertIsInstance(package._package, ZipFile)
def test_no_content_types(self, *_):
with self.assertRaises(release_gitter.UnsupportedContentTypeError):
release_gitter.get_asset_package(
{
"name": "test",
},
MagicMock(spec=["raw", "content"]),
)
def test_no_supported_content_types(self, *_):
with self.assertRaises(release_gitter.UnsupportedContentTypeError):
release_gitter.get_asset_package(
{
"content_type": "application/octetstream",
"name": "test",
},
MagicMock(spec=["raw", "content"]),
)
if __name__ == "__main__":
unittest.main()

requirements-dev.txt

@ -1,4 +1,6 @@
-e .
pytest
coverage
hatch
mypy
pre-commit
types-requests
types-toml

sample_pseudo_bin/.pre-commit-hooks.yaml (new file, 8 lines)

@ -0,0 +1,8 @@
---
- id: stylua
name: StyLua
description: An opinionated Lua code formatter
entry: stylua
language: python
types:
- lua

sample_pseudo_bin/pyproject.toml (new file, 18 lines)

@ -0,0 +1,18 @@
[build-system]
requires = ["release-gitter[builder]"]
build-backend = "pseudo_builder"
[tool.release-gitter]
# git-url is not needed if you're in the actual source repo
git-url = "https://github.com/JohnnyMorganz/StyLua"
# version is not needed if you have a Cargo.toml in the current directory
version = "0.11.3"
extract-files = [ "stylua" ]
format = "stylua-{version}-{system}.zip"
exec = "chmod +x stylua"
[tool.release-gitter.map-system]
Darwin = "macos"
Windows = "win64"
Linux = "linux"
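As the comments in this sample note, `version` may be omitted; in that case the tooling resolves it through release_gitter. A small sketch of the two resolution paths, matching the `read_version` and `read_git_tag` hunks earlier in this comparison (printed values are illustrative):

```python
# Sketch of version resolution, assuming the read_version/read_git_tag signatures
# shown in the release_gitter.py hunks above.
import release_gitter as rg

# Default path: read project metadata (currently a Cargo.toml in the current directory).
print(rg.read_version())                            # e.g. "0.11.3", or None if nothing matches

# version-git-tag path: shallow-fetch tags, then use `git describe --tags`.
print(rg.read_version(from_tags=True, fetch=True))
```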

setup.py (deleted, 40 lines)

@ -1,40 +0,0 @@
from codecs import open
from os import path
from setuptools import find_packages
from setuptools import setup
here = path.abspath(path.dirname(__file__))
# Get the long description from the README file
with open(path.join(here, "README.md"), encoding="utf-8") as f:
long_description = f.read()
setup(
name="release-gitter",
version="0.2.1",
description="Easily download releases from sites like Github and Gitea",
long_description=long_description,
long_description_content_type="text/markdown",
url="https://git.iamthefij.com/iamthefij/release-gitter.git",
download_url=(
"https://git.iamthefij.com/iamthefij/release-gitter.git/archive/master.tar.gz"
),
author="iamthefij",
author_email="",
classifiers=[
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.7",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
],
keywords="",
py_modules=["release_gitter", "pseudo_builder"],
install_requires=["requests"],
extras_require={"builder": ["toml", "wheel"]},
entry_points={
"console_scripts": [
"release-gitter=release_gitter:main",
],
},
)

tox.ini (deleted, 17 lines)

@ -1,17 +0,0 @@
[tox]
envlist = py3
[testenv]
deps =
-rrequirements-dev.txt
commands =
coverage erase
coverage run --source=release_gitter -m unittest discover . {posargs:"*_test.py"}
# coverage report -m --fail-under 70
pre-commit run --all-files
[testenv:pre-commit]
deps =
pre-commit
commands =
pre-commit {posargs}