Non-working draft of v2

- Major refactor of project structure and workflows
- Replaced legacy scripts and commands with new Go implementation
- Updated and added CI/CD workflows
- Added new configuration and linting files
- Removed deprecated files and test logs
- Updated documentation and action metadata

This is an initial, non-working draft of v2.0; further testing and fixes are required.
Author: awalsh128
Date: 2025-08-23 11:25:52 -07:00
Parent: cbdbab28e6
Commit: 1840a3c552
69 changed files with 3762 additions and 1395 deletions


@@ -1,10 +0,0 @@
---
name: Bug report
about: Basic template with warning about known issues.
title: ''
labels: ''
assignees: awalsh128
---
Please read about the limitation of [non-file dependencies](https://github.com/awalsh128/cache-apt-pkgs-action/blob/master/README.md#non-file-dependencies) before filing an issue.

.github/ISSUE_TEMPLATE/bug.md (new file, 63 lines)

@@ -0,0 +1,63 @@
---
name: Bug Report
about: Create a report to help us improve or fix the action
title: "[BUG] "
labels: bug
assignees: 'awalsh128'
---
> **Note**: Please read about the limitation of [non-file dependencies](https://github.com/awalsh128/cache-apt-pkgs-action/blob/master/README.md#non-file-dependencies) before filing an issue.
## Description
A clear and concise description of what the bug is.
## Steps to Reproduce
### 1. Workflow Configuration
```yaml
# Replace with your workflow
```
### 2. Package List
```plaintext
# List your packages here
```
### 3. Environment Details
- Runner OS: [e.g., Ubuntu 22.04]
- Action version: [e.g., v2.0.0]
## Expected Behavior
A clear and concise description of what you expected to happen.
## Actual Behavior
What actually happened? Please include:
- Error messages
- Action logs
- Cache status (hit/miss)
## Debug Information
If possible, please run the action with debug mode enabled:
```yaml
with:
debug: true
```
And provide the debug output.
## Additional Context
- Are you using any specific package versions?
- Are there any special package configurations?
- Does the issue happen consistently or intermittently?
- Have you tried clearing the cache and retrying?

.github/workflows/ci.yml (new file, 74 lines)

@@ -0,0 +1,74 @@
name: CI
on:
  push:
    branches: [dev-v2.0]
  pull_request:
    branches: [dev-v2.0]
  schedule:
    - cron: "0 0 * * *" # Run at 00:00 UTC every day
  workflow_dispatch:
    inputs:
      debug:
        description: Run in debug mode.
        type: boolean
        required: false
        default: false
env:
  DEBUG: ${{ github.event.inputs.debug || false }}
  SHELLOPTS: errexit:pipefail
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install Node.js
        uses: actions/setup-node@v4
        with:
          node-version: "20"
          cache: npm
      - name: Install golangci-lint
        uses: golangci/golangci-lint-action@v4
        with:
          version: latest
      - name: Install cspell
        run: npm install -g cspell
      - name: Check spelling
        run: cspell --config ./.vscode/cspell.json "**" --no-progress
      - name: Golang Lint
        run: golangci-lint run
      - name: trunk.io Lint
        uses: trunk-io/trunk-action@v1
        with:
          arguments: check
      - name: Install Go
        uses: actions/setup-go@v5
        with:
          go-version: "1.21"
          cache: true
      - name: Install Go module dependencies
        run: go mod download
      - name: Build
        run: go build -v ./...
      - name: Test with coverage
        run: go test -v -race -coverprofile=coverage.txt -covermode=atomic ./...
      - name: Upload coverage to Codecov
        uses: codecov/codecov-action@v4
        with:
          files: ./coverage.txt
          fail_ci_if_error: true


@@ -1,19 +0,0 @@
name: Publish Dev Push Event
on:
  workflow_dispatch:
  push:
    branches:
      - dev
jobs:
  publish_event:
    runs-on: ubuntu-latest
    name: Publish dev push
    steps:
      - run: |
          curl -i \
            -X POST \
            -H "Accept: application/vnd.github.v3+json" \
            -H "Authorization: token ${{ secrets.PUBLISH_PUSH_TOKEN }}" \
            https://api.github.com/repos/awalsh128/cache-apt-pkgs-action-ci/dispatches \
            -d '{"event_type":"dev_push"}'


@@ -1,19 +0,0 @@
name: Publish Master Push Event
on:
  workflow_dispatch:
  push:
    branches:
      - master
jobs:
  publish_event:
    runs-on: ubuntu-latest
    name: Publish master push
    steps:
      - run: |
          curl -i \
            -X POST \
            -H "Accept: application/vnd.github.v3+json" \
            -H "Authorization: token ${{ secrets.PUBLISH_PUSH_TOKEN }}" \
            https://api.github.com/repos/awalsh128/cache-apt-pkgs-action-ci/dispatches \
            -d '{"event_type":"master_push"}'


@@ -1,19 +0,0 @@
name: Publish Staging Push Event
on:
  workflow_dispatch:
  push:
    branches:
      - staging
jobs:
  publish_event:
    runs-on: ubuntu-latest
    name: Publish staging push
    steps:
      - run: |
          curl -i \
            -X POST \
            -H "Accept: application/vnd.github.v3+json" \
            -H "Authorization: token ${{ secrets.PUBLISH_PUSH_TOKEN }}" \
            https://api.github.com/repos/awalsh128/cache-apt-pkgs-action-ci/dispatches \
            -d '{"event_type":"staging_push"}'


@@ -1,29 +0,0 @@
name: Pull Request
on:
  pull_request:
    types: [opened, synchronize]
permissions:
  contents: read
jobs:
  integrate:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Setup Go
        uses: actions/setup-go@v4
        with:
          go-version-file: "go.mod"
      - name: Build and test
        run: |
          go build -v ./...
          go test -v ./...
      - name: Lint
        uses: golangci/golangci-lint-action@v3
        with:
          version: v1.52.2

.github/workflows/test-action.yml (new file, 497 lines)

@@ -0,0 +1,497 @@
name: Test Action
on:
  # Manual trigger with specific ref
  workflow_dispatch:
    inputs:
      ref:
        description: Branch, tag, SHA to test (for PRs use pull/ID/head)
        required: true
        type: string
      debug:
        description: Enable debug logging
        type: boolean
        required: false
        default: false
  # Automatic triggers
  push:
    branches: [dev-v2.0] # Test on pushes to dev branch
    paths:
      - src/** # Only when action code changes
      - action.yml
      - .github/workflows/test-action.yml
  pull_request:
    branches: [dev-v2.0] # Test on PRs to dev branch
    paths:
      - src/** # Only when action code changes
      - action.yml
      - .github/workflows/test-action.yml
# Environment configuration
env:
  DEBUG: ${{ github.event.inputs.debug || false }}
  # Test for overrides in built-in shell options (regression issue 98).
  SHELLOPTS: errexit:pipefail
  # Use PR's SHA when testing a PR, otherwise use the ref provided
  TEST_REF: ${{ github.event.pull_request.head.sha || github.event.inputs.ref || github.ref }}
jobs:
  list_all_versions:
    runs-on: ubuntu-latest
    name: List all package versions (including deps).
    steps:
      # Checkout the code we want to test
      - name: Checkout
        uses: actions/checkout@v4
        with:
          ref: ${{ env.TEST_REF }}
      # Run the action from the checked out code
      - name: Execute
        id: execute
        uses: ./
        with:
          packages: xdot=1.3-1
          version: ${{ github.run_id }}-${{ github.run_attempt }}-list_all_versions
          debug: ${{ env.DEBUG }}
      - name: Verify
        if: |
          steps.execute.outputs.cache-hit != 'false' ||
          steps.execute.outputs.all-package-version-list != 'fonts-liberation2=1:2.1.5-3,gir1.2-atk-1.0=2.52.0-1build1,gir1.2-freedesktop=1.80.1-1,gir1.2-gdkpixbuf-2.0=2.42.10+dfsg-3ubuntu3.2,gir1.2-gtk-3.0=3.24.41-4ubuntu1.3,gir1.2-harfbuzz-0.0=8.3.0-2build2,gir1.2-pango-1.0=1.52.1+ds-1build1,graphviz=2.42.2-9ubuntu0.1,libann0=1.1.2+doc-9build1,libblas3=3.12.0-3build1.1,libcdt5=2.42.2-9ubuntu0.1,libcgraph6=2.42.2-9ubuntu0.1,libgts-0.7-5t64=0.7.6+darcs121130-5.2build1,libgts-bin=0.7.6+darcs121130-5.2build1,libgvc6=2.42.2-9ubuntu0.1,libgvpr2=2.42.2-9ubuntu0.1,libharfbuzz-gobject0=8.3.0-2build2,liblab-gamut1=2.42.2-9ubuntu0.1,liblapack3=3.12.0-3build1.1,libpangoxft-1.0-0=1.52.1+ds-1build1,libpathplan4=2.42.2-9ubuntu0.1,python3-cairo=1.25.1-2build2,python3-gi-cairo=3.48.2-1,python3-numpy=1:1.26.4+ds-6ubuntu1,xdot=1.3-1'
        run: |
          echo "cache-hit = ${{ steps.execute.outputs.cache-hit }}"
          echo "package-version-list = ${{ steps.execute.outputs.package-version-list }}"
          echo "all-package-version-list = ${{ steps.execute.outputs.all-package-version-list }}"
          echo "diff all-package-version-list"
          diff <(echo "${{ steps.execute.outputs.all-package-version-list }}" ) <(echo "fonts-liberation2=1:2.1.5-3,gir1.2-atk-1.0=2.52.0-1build1,gir1.2-freedesktop=1.80.1-1,gir1.2-gdkpixbuf-2.0=2.42.10+dfsg-3ubuntu3.2,gir1.2-gtk-3.0=3.24.41-4ubuntu1.3,gir1.2-harfbuzz-0.0=8.3.0-2build2,gir1.2-pango-1.0=1.52.1+ds-1build1,graphviz=2.42.2-9ubuntu0.1,libann0=1.1.2+doc-9build1,libblas3=3.12.0-3build1.1,libcdt5=2.42.2-9ubuntu0.1,libcgraph6=2.42.2-9ubuntu0.1,libgts-0.7-5t64=0.7.6+darcs121130-5.2build1,libgts-bin=0.7.6+darcs121130-5.2build1,libgvc6=2.42.2-9ubuntu0.1,libgvpr2=2.42.2-9ubuntu0.1,libharfbuzz-gobject0=8.3.0-2build2,liblab-gamut1=2.42.2-9ubuntu0.1,liblapack3=3.12.0-3build1.1,libpangoxft-1.0-0=1.52.1+ds-1build1,libpathplan4=2.42.2-9ubuntu0.1,python3-cairo=1.25.1-2build2,python3-gi-cairo=3.48.2-1,python3-numpy=1:1.26.4+ds-6ubuntu1,xdot=1.3-1")
          exit 1
        shell: bash
  list_versions:
    runs-on: ubuntu-latest
    name: List package versions.
    steps:
      - uses: actions/checkout@v4
      - name: Execute
        id: execute
        uses: ./
        with:
          packages: xdot rolldice
          version: ${{ github.run_id }}-${{ github.run_attempt }}-list_versions
          debug: ${{ env.DEBUG }}
      - name: Verify
        if: steps.execute.outputs.cache-hit != 'false' || steps.execute.outputs.package-version-list != 'rolldice=1.16-1build3,xdot=1.3-1'
        run: |
          echo "cache-hit = ${{ steps.execute.outputs.cache-hit }}"
          echo "package-version-list = ${{ steps.execute.outputs.package-version-list }}"
          echo "diff package-version-list"
          diff <(echo "${{ steps.execute.outputs.package-version-list }}" ) <(echo "rolldice=1.16-1build3,xdot=1.3-1")
          exit 1
        shell: bash
  standard_workflow_install:
    runs-on: ubuntu-latest
    name: Standard workflow install package and cache.
    steps:
      - uses: actions/checkout@v4
      - name: Execute
        id: execute
        uses: ./
        with:
          packages: xdot rolldice
          version: ${{ github.run_id }}-${{ github.run_attempt }}-standard_workflow
          debug: ${{ env.DEBUG }}
      - name: Verify
        if: steps.execute.outputs.cache-hit != 'false'
        run: |
          echo "cache-hit = ${{ steps.execute.outputs.cache-hit }}"
          exit 1
        shell: bash
  standard_workflow_install_with_new_version:
    needs: standard_workflow_install
    runs-on: ubuntu-latest
    name: Standard workflow packages with new version.
    steps:
      - uses: actions/checkout@v4
      - name: Execute
        id: execute
        uses: ./
        with:
          packages: xdot rolldice
          version: ${{ github.run_id }}-${{ github.run_attempt }}-standard_workflow_install_with_new_version
          debug: ${{ env.DEBUG }}
      - name: Verify
        if: steps.execute.outputs.cache-hit != 'false'
        run: |
          echo "cache-hit = ${{ steps.execute.outputs.cache-hit }}"
          exit 1
        shell: bash
  standard_workflow_restore:
    needs: standard_workflow_install
    runs-on: ubuntu-latest
    name: Standard workflow restore cached packages.
    steps:
      - uses: actions/checkout@v4
      - name: Execute
        id: execute
        uses: ./
        with:
          packages: xdot rolldice
          version: ${{ github.run_id }}-${{ github.run_attempt }}-standard_workflow
          debug: ${{ env.DEBUG }}
      - name: Verify
        if: steps.execute.outputs.cache-hit != 'true'
        run: |
          echo "cache-hit = ${{ steps.execute.outputs.cache-hit }}"
          exit 1
        shell: bash
  standard_workflow_restore_with_packages_out_of_order:
    needs: standard_workflow_install
    runs-on: ubuntu-latest
    name: Standard workflow restore with packages out of order.
    steps:
      - uses: actions/checkout@v4
      - name: Execute
        id: execute
        uses: ./
        with:
          packages: rolldice xdot
          version: ${{ github.run_id }}-${{ github.run_attempt }}-standard_workflow
          debug: ${{ env.DEBUG }}
      - name: Verify
        if: steps.execute.outputs.cache-hit != 'true'
        run: |
          echo "cache-hit = ${{ steps.execute.outputs.cache-hit }}"
          exit 1
        shell: bash
  standard_workflow_add_package:
    needs: standard_workflow_install
    runs-on: ubuntu-latest
    name: Standard workflow add another package.
    steps:
      - uses: actions/checkout@v4
      - name: Execute
        id: execute
        uses: ./
        with:
          packages: xdot rolldice distro-info-data
          version: ${{ github.run_id }}-${{ github.run_attempt }}-standard_workflow
          debug: ${{ env.DEBUG }}
      - name: Verify
        if: steps.execute.outputs.cache-hit != 'false'
        run: |
          echo "cache-hit = ${{ steps.execute.outputs.cache-hit }}"
          exit 1
        shell: bash
  standard_workflow_restore_add_package:
    needs: standard_workflow_add_package
    runs-on: ubuntu-latest
    name: Standard workflow restore added package.
    steps:
      - uses: actions/checkout@v4
      - name: Execute
        id: execute
        uses: ./
        with:
          packages: xdot rolldice distro-info-data
          version: ${{ github.run_id }}-${{ github.run_attempt }}-standard_workflow
          debug: ${{ env.DEBUG }}
      - name: Verify
        if: steps.execute.outputs.cache-hit != 'true'
        run: |
          echo "cache-hit = ${{ steps.execute.outputs.cache-hit }}"
          exit 1
        shell: bash
  no_packages:
    runs-on: ubuntu-latest
    name: No packages passed.
    steps:
      - name: Execute
        id: execute
        uses: ./
        with:
          packages: ""
        continue-on-error: true
      - name: Verify
        if: steps.execute.outcome == 'failure'
        run: exit 0
        shell: bash
  package_not_found:
    runs-on: ubuntu-latest
    name: Package not found.
    steps:
      - name: Execute
        id: execute
        uses: ./
        with:
          packages: package_that_doesnt_exist
        continue-on-error: true
      - name: Verify
        if: steps.execute.outcome == 'failure'
        run: exit 0
        shell: bash
  version_contains_spaces:
    runs-on: ubuntu-latest
    name: Version contains spaces.
    steps:
      - name: Execute
        id: execute
        uses: ./
        with:
          packages: xdot
          version: 123 abc
          debug: ${{ env.DEBUG }}
        continue-on-error: true
      - name: Verify
        if: steps.execute.outcome == 'failure'
        run: exit 0
        shell: bash
  regression_36:
    runs-on: ubuntu-latest
    name: "Reinstall existing package (regression issue #36)."
    steps:
      - uses: actions/checkout@v4
      - uses: ./
        with:
          packages: libgtk-3-dev
          version: ${{ github.run_id }}-${{ github.run_attempt }}-regression_36
          debug: ${{ env.DEBUG }}
  regression_37:
    runs-on: ubuntu-latest
    name: "Install with reported package dependencies not installed (regression issue #37)."
    steps:
      - uses: actions/checkout@v4
      - uses: ./
        with:
          packages: libosmesa6-dev libgl1-mesa-dev python3-tk pandoc git-restore-mtime
          version: ${{ github.run_id }}-${{ github.run_attempt }}-regression_37
          debug: ${{ env.DEBUG }}
  debug_disabled:
    runs-on: ubuntu-latest
    name: Debug disabled.
    steps:
      - uses: actions/checkout@v4
      - uses: ./
        with:
          packages: xdot
          version: ${{ github.run_id }}-${{ github.run_attempt }}-list-all-package-versions
          debug: ${{ env.DEBUG }}
  regression_72_1:
    runs-on: ubuntu-latest
    name: "Cache Java CA certs package v1 (regression issue #72)."
    steps:
      - uses: actions/checkout@v4
      - uses: ./
        with:
          packages: openjdk-11-jre
          version: ${{ github.run_id }}-${{ github.run_attempt }}-regression_72
          debug: ${{ env.DEBUG }}
  regression_72_2:
    runs-on: ubuntu-latest
    name: "Cache Java CA certs package v2 (regression issue #72)."
    steps:
      - uses: actions/checkout@v4
      - uses: ./
        with:
          packages: default-jre
          version: ${{ github.run_id }}-${{ github.run_attempt }}-regression_72
          debug: ${{ env.DEBUG }}
  regression_76:
    runs-on: ubuntu-latest
    name: "Cache empty archive (regression issue #76)."
    steps:
      - uses: actions/checkout@v4
      - run: |
          sudo wget -O- https://apt.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB | gpg --dearmor | tee /usr/share/keyrings/oneapi-archive-keyring.gpg > /dev/null;
          echo "deb [signed-by=/usr/share/keyrings/oneapi-archive-keyring.gpg] https://apt.repos.intel.com/oneapi all main" | sudo tee /etc/apt/sources.list.d/oneAPI.list;
          sudo apt-get -qq update;
          sudo apt-get install -y intel-oneapi-runtime-libs intel-oneapi-runtime-opencl;
          sudo apt-get install -y opencl-headers ocl-icd-opencl-dev;
          sudo apt-get install -y libsundials-dev;
      - uses: ./
        with:
          packages: intel-oneapi-runtime-libs
          version: ${{ github.run_id }}-${{ github.run_attempt }}-regression_76
          debug: ${{ env.DEBUG }}
  regression_79:
    runs-on: ubuntu-latest
    name: "Tar error with libboost-dev (regression issue #79)."
    steps:
      - uses: actions/checkout@v4
      - uses: ./
        with:
          packages: libboost-dev
          version: ${{ github.run_id }}-${{ github.run_attempt }}-regression_79
          debug: ${{ env.DEBUG }}
  regression_81:
    runs-on: ubuntu-latest
    name: "Tar error with alsa-ucm-conf (regression issue #81)."
    steps:
      - uses: actions/checkout@v4
      - uses: ./
        with:
          packages: libasound2 libatk-bridge2.0-0 libatk1.0-0 libatspi2.0-0 libcups2 libdrm2 libgbm1 libnspr4 libnss3 libxcomposite1 libxdamage1 libxfixes3 libxkbcommon0 libxrandr2
          version: ${{ github.run_id }}-${{ github.run_attempt }}-regression_81
          debug: ${{ env.DEBUG }}
  regression_84_literal_block_install:
    runs-on: ubuntu-latest
    name: "Install multiline package listing using literal block style (regression issue #84)."
    steps:
      - uses: actions/checkout@v4
      - uses: ./
        with:
          packages: >
            xdot
            rolldice distro-info-data
          version: ${{ github.run_id }}-${{ github.run_attempt }}-regression_84_literal_block
          debug: ${{ env.DEBUG }}
  regression_84_literal_block_restore:
    needs: regression_84_literal_block_install
    runs-on: ubuntu-latest
    name: "Restore multiline package listing using literal block style (regression issue #84)."
    steps:
      - uses: actions/checkout@v4
      - name: Execute
        id: execute
        uses: ./
        with:
          packages: xdot rolldice distro-info-data
          version: ${{ github.run_id }}-${{ github.run_attempt }}-regression_84_literal_block
          debug: ${{ env.DEBUG }}
      - name: Verify
        if: steps.execute.outputs.cache-hit != 'true'
        run: |
          echo "cache-hit = ${{ steps.execute.outputs.cache-hit }}"
          exit 1
        shell: bash
  regression_84_folded_block_install:
    runs-on: ubuntu-latest
    name: "Install multiline package listing using folded block style (regression issue #84)."
    steps:
      - uses: actions/checkout@v4
      - uses: ./
        with:
          packages: |
            xdot \
            rolldice distro-info-data
          version: ${{ github.run_id }}-${{ github.run_attempt }}-regression_84_folded_block
          debug: ${{ env.DEBUG }}
  regression_84_folded_block_restore:
    needs: regression_84_folded_block_install
    runs-on: ubuntu-latest
    name: "Restore multiline package listing using folded block style (regression issue #84)."
    steps:
      - uses: actions/checkout@v4
      - name: Execute
        id: execute
        uses: ./
        with:
          packages: xdot rolldice distro-info-data
          version: ${{ github.run_id }}-${{ github.run_attempt }}-regression_84_folded_block
          debug: ${{ env.DEBUG }}
      - name: Verify
        if: steps.execute.outputs.cache-hit != 'true'
        run: |
          echo "cache-hit = ${{ steps.execute.outputs.cache-hit }}"
          exit 1
        shell: bash
  regression_89:
    runs-on: ubuntu-latest
    name: "Upload logs artifact name (regression issue #89)."
    steps:
      - uses: actions/checkout@v4
      - uses: ./
        with:
          packages: libgtk-3-dev:amd64
          version: ${{ github.run_id }}-${{ github.run_attempt }}-regression_89
          debug: ${{ env.DEBUG }}
  regression_98:
    runs-on: ubuntu-latest
    name: "Install error due to SHELLOPTS override (regression issue #98)."
    steps:
      - uses: actions/checkout@v4
      - uses: ./
        with:
          packages: git-restore-mtime libgl1-mesa-dev libosmesa6-dev pandoc
          version: ${{ github.run_id }}-${{ github.run_attempt }}-regression_98
          debug: ${{ env.DEBUG }}
  regression_106_install:
    runs-on: ubuntu-latest
    name: "Stale apt repo not finding package on restore, install phase (regression issue #106)."
    steps:
      - uses: actions/checkout@v4
      - uses: ./
        with:
          packages: libtk8.6
          version: ${{ github.run_id }}-${{ github.run_attempt }}-regression_106
          debug: ${{ env.DEBUG }}
  regression_106_restore:
    needs: regression_106_install
    runs-on: ubuntu-latest
    name: "Stale apt repo not finding package on restore, restore phase (regression issue #106)."
    steps:
      - uses: actions/checkout@v4
      - uses: ./
        with:
          packages: libtk8.6
          version: ${{ github.run_id }}-${{ github.run_attempt }}-regression_106
          debug: ${{ env.DEBUG }}
  regression_159_install:
    runs-on: ubuntu-latest
    name: "apt-show false positive parsing Package line (regression issue #159)."
    steps:
      - uses: actions/checkout@v4
      - uses: ./
        with:
          packages: texlive-latex-extra
          version: ${{ github.run_id }}-${{ github.run_attempt }}-regression_159
          debug: ${{ env.DEBUG }}
  multi_arch_cache_key:
    runs-on: ubuntu-latest
    name: Cache packages with multi-arch cache key.
    steps:
      - uses: actions/checkout@v4
      - uses: ./
        with:
          packages: libfuse2
          version: ${{ github.run_id }}-${{ github.run_attempt }}-multi_arch_cache_key
          debug: ${{ env.DEBUG }}
  virtual_package:
    runs-on: ubuntu-latest
    name: Cache virtual package.
    steps:
      - uses: actions/checkout@v4
      - uses: ./
        with:
          packages: libvips
          version: ${{ github.run_id }}-${{ github.run_attempt }}-virtual_package
          debug: ${{ env.DEBUG }}

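The Verify steps above compare comma-separated `name=version` lists with plain string equality, which is why the test file needs a separate out-of-order job. A small Go sketch of an order-insensitive comparison (hypothetical helper names, not code from this commit) shows the idea:

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// normalize splits a comma-separated "name=version" list, trims each
// entry, and sorts it so two lists compare equal regardless of order.
func normalize(list string) string {
	parts := strings.Split(list, ",")
	for i := range parts {
		parts[i] = strings.TrimSpace(parts[i])
	}
	sort.Strings(parts)
	return strings.Join(parts, ",")
}

// sameVersions reports whether two package-version lists contain the
// same entries, ignoring ordering.
func sameVersions(a, b string) bool {
	return normalize(a) == normalize(b)
}

func main() {
	got := "xdot=1.3-1,rolldice=1.16-1build3"
	want := "rolldice=1.16-1build3,xdot=1.3-1"
	fmt.Println(sameVersions(got, want)) // true
}
```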
.golangci.json (new file, 12 lines)

@@ -0,0 +1,12 @@
{
  "linters": {
    "enable": [
      "gofmt",
      "govet",
      "staticcheck",
      "errcheck",
      "ineffassign",
      "gocritic"
    ]
  }
}

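Among the linters enabled here, errcheck flags discarded error returns. As an illustration only (not code from this commit), the pattern it pushes toward is returning or handling every error rather than ignoring it:

```go
package main

import (
	"fmt"
	"strconv"
)

// parsePort propagates the conversion error instead of discarding it,
// which is the style errcheck enforces.
func parsePort(s string) (int, error) {
	n, err := strconv.Atoi(s)
	if err != nil {
		return 0, fmt.Errorf("invalid port %q: %w", s, err)
	}
	return n, nil
}

func main() {
	// n, _ := strconv.Atoi(s) would silently yield 0 here; errcheck
	// flags that kind of ignored error.
	if _, err := parsePort("not-a-number"); err != nil {
		fmt.Println("caught:", err)
	}
}
```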
.golangci.yml (new file, 457 lines)

@@ -0,0 +1,457 @@
# This file is licensed under the terms of the MIT license https://opensource.org/license/mit
# Copyright (c) 2021-2025 Marat Reymers
## Golden config for golangci-lint v2.4.0
#
# This is the best config for golangci-lint based on my experience and opinion.
# It is very strict, but not extremely strict.
# Feel free to adapt it to suit your needs.
# If this config helps you, please consider keeping a link to this file (see the next comment).
# Based on https://gist.github.com/maratori/47a4d00457a92aa426dbd48a18776322
version: "2"
issues:
  # Maximum count of issues with the same text.
  # Set to 0 to disable.
  # Default: 3
  max-same-issues: 50
formatters:
  enable:
    - goimports # checks if the code and import statements are formatted according to the 'goimports' command
    - golines # checks if code is formatted, and fixes long lines
    - gci # checks if code and import statements are formatted, with additional rules
    - gofmt # checks if the code is formatted according to 'gofmt' command
    - gofumpt # enforces a stricter format than 'gofmt', while being backwards compatible
    - swaggo # formats swaggo comments
  # All settings can be found here https://github.com/golangci/golangci-lint/blob/HEAD/.golangci.reference.yml
  settings:
    goimports:
      # A list of prefixes, which, if set, checks import paths
      # with the given prefixes are grouped after 3rd-party packages.
      # Default: []
      local-prefixes:
        - github.com/awalsh128/cache-apt-pkgs
    golines:
      # Target maximum line length.
      # Default: 100
      max-len: 100
linters:
  enable:
    - asasalint # checks for pass []any as any in variadic func(...any)
    - asciicheck # checks that your code does not contain non-ASCII identifiers
    - bidichk # checks for dangerous unicode character sequences
    - bodyclose # checks whether HTTP response body is closed successfully
    - canonicalheader # checks whether net/http.Header uses canonical header
    - copyloopvar # detects places where loop variables are copied (Go 1.22+)
    - cyclop # checks function and package cyclomatic complexity
    - depguard # checks if package imports are in a list of acceptable packages
    - dupl # tool for code clone detection
    - durationcheck # checks for two durations multiplied together
    - embeddedstructfieldcheck # checks embedded types in structs
    - errcheck # checking for unchecked errors, these unchecked errors can be critical bugs in some cases
    - errname # checks that sentinel errors are prefixed with the Err and error types are suffixed with the Error
    - errorlint # finds code that will cause problems with the error wrapping scheme introduced in Go 1.13
    - exhaustive # checks exhaustiveness of enum switch statements
    - exptostd # detects functions from golang.org/x/exp/ that can be replaced by std functions
    - fatcontext # detects nested contexts in loops
    - forbidigo # forbids identifiers
    - funcorder # checks the order of functions, methods, and constructors
    - funlen # tool for detection of long functions
    - gocheckcompilerdirectives # validates go compiler directive comments (//go:)
    - gochecknoglobals # checks that no global variables exist
    - gochecknoinits # checks that no init functions are present in Go code
    - gochecksumtype # checks exhaustiveness on Go "sum types"
    - gocognit # computes and checks the cognitive complexity of functions
    - goconst # finds repeated strings that could be replaced by a constant
    - gocritic # provides diagnostics that check for bugs, performance and style issues
    - gocyclo # computes and checks the cyclomatic complexity of functions
    - godot # checks if comments end in a period
    - gomoddirectives # manages the use of 'replace', 'retract', and 'excludes' directives in go.mod
    - goprintffuncname # checks that printf-like functions are named with f at the end
    - gosec # inspects source code for security problems
    - govet # reports suspicious constructs, such as Printf calls whose arguments do not align with the format string
    - iface # checks the incorrect use of interfaces, helping developers avoid interface pollution
    - ineffassign # detects when assignments to existing variables are not used
    - intrange # finds places where for loops could make use of an integer range
    - loggercheck # checks key value pairs for common logger libraries (kitlog,klog,logr,zap)
    - makezero # finds slice declarations with non-zero initial length
    - mirror # reports wrong mirror patterns of bytes/strings usage
    - mnd # detects magic numbers
    - musttag # enforces field tags in (un)marshaled structs
    - nakedret # finds naked returns in functions greater than a specified function length
    - nestif # reports deeply nested if statements
    - nilerr # finds the code that returns nil even if it checks that the error is not nil
    - nilnesserr # reports that it checks for err != nil, but it returns a different nil value error (powered by nilness and nilerr)
    - nilnil # checks that there is no simultaneous return of nil error and an invalid value
    - noctx # finds sending http request without context.Context
    - nolintlint # reports ill-formed or insufficient nolint directives
    - nonamedreturns # reports all named returns
    - nosprintfhostport # checks for misuse of Sprintf to construct a host with port in a URL
    - perfsprint # checks that fmt.Sprintf can be replaced with a faster alternative
    - predeclared # finds code that shadows one of Go's predeclared identifiers
    - promlinter # checks Prometheus metrics naming via promlint
    - protogetter # reports direct reads from proto message fields when getters should be used
    - reassign # checks that package variables are not reassigned
    - recvcheck # checks for receiver type consistency
    - revive # fast, configurable, extensible, flexible, and beautiful linter for Go, drop-in replacement of golint
    - rowserrcheck # checks whether Err of rows is checked successfully
    - sloglint # ensure consistent code style when using log/slog
    - spancheck # checks for mistakes with OpenTelemetry/Census spans
    - sqlclosecheck # checks that sql.Rows and sql.Stmt are closed
    - staticcheck # is a go vet on steroids, applying a ton of static analysis checks
    - testableexamples # checks if examples are testable (have an expected output)
    - testifylint # checks usage of github.com/stretchr/testify
    - testpackage # makes you use a separate _test package
    - tparallel # detects inappropriate usage of t.Parallel() method in your Go test codes
    - unconvert # removes unnecessary type conversions
    - unparam # reports unused function parameters
    - unused # checks for unused constants, variables, functions and types
    - usestdlibvars # detects the possibility to use variables/constants from the Go standard library
    - usetesting # reports uses of functions with replacement inside the testing package
    - wastedassign # finds wasted assignment statements
    - whitespace # detects leading and trailing whitespace
    ## you may want to enable
    #- arangolint # opinionated best practices for arangodb client
    #- decorder # checks declaration order and count of types, constants, variables and functions
    #- exhaustruct # [highly recommend to enable] checks if all structure fields are initialized
    #- ginkgolinter # [if you use ginkgo/gomega] enforces standards of using ginkgo and gomega
    #- godox # detects usage of FIXME, TODO and other keywords inside comments
    #- goheader # checks is file header matches to pattern
    #- inamedparam # [great idea, but too strict, need to ignore a lot of cases by default] reports interfaces with unnamed method parameters
    #- interfacebloat # checks the number of methods inside an interface
    #- ireturn # accept interfaces, return concrete types
    #- noinlineerr # disallows inline error handling `if err := ...; err != nil {`
    #- prealloc # [premature optimization, but can be used in some cases] finds slice declarations that could potentially be preallocated
    #- tagalign # checks that struct tags are well aligned
    #- varnamelen # [great idea, but too many false positives] checks that the length of a variable's name matches its scope
    #- wrapcheck # checks that errors returned from external packages are wrapped
    #- zerologlint # detects the wrong usage of zerolog that a user forgets to dispatch zerolog.Event
    ## disabled
    #- containedctx # detects struct contained context.Context field
    #- contextcheck # [too many false positives] checks the function whether use a non-inherited context
    #- dogsled # checks assignments with too many blank identifiers (e.g. x, _, _, _, := f())
    #- dupword # [useless without config] checks for duplicate words in the source code
    #- err113 # [too strict] checks the errors handling expressions
    #- errchkjson # [don't see profit + I'm against of omitting errors like in the first example https://github.com/breml/errchkjson] checks types passed to the json encoding functions. Reports unsupported types and optionally reports occasions, where the check for the returned error can be omitted
    #- forcetypeassert # [replaced by errcheck] finds forced type assertions
    #- gomodguard # [use more powerful depguard] allow and block lists linter for direct Go module dependencies
    #- gosmopolitan # reports certain i18n/l10n anti-patterns in your Go codebase
    #- grouper # analyzes expression groups
    #- importas # enforces consistent import aliases
    #- lll # [replaced by golines] reports long lines
    #- maintidx # measures the maintainability index of each function
    #- misspell # [useless] finds commonly misspelled English words in comments
    #- nlreturn # [too strict and mostly code is not more readable] checks for a new line before return and branch statements to increase code clarity
    #- paralleltest # [too many false positives] detects missing usage of t.Parallel() method in your Go test
    #- tagliatelle # checks the struct tags
    #- thelper # detects golang test helpers without t.Helper() call and checks the consistency of test helpers
    #- wsl # [too strict and mostly code is not more readable] whitespace linter forces you to use empty lines
    #- wsl_v5 # [too strict and mostly code is not more readable] add or remove empty lines
  # All settings can be found here https://github.com/golangci/golangci-lint/blob/HEAD/.golangci.reference.yml
  settings:
    cyclop:
      # The maximal code complexity to report.
      # Default: 10
      max-complexity: 30
      # The maximal average package complexity.
      # If it's higher than 0.0 (float) the check is enabled.
      # Default: 0.0
      package-average: 10.0
    depguard:
      # Rules to apply.
      #
      # Variables:
      # - File Variables
      #   Use an exclamation mark `!` to negate a variable.
      #   Example: `!$test` matches any file that is not a go test file.
      #
      #   `$all` - matches all go files
      #   `$test` - matches all go test files
      #
      # - Package Variables
      #
      #   `$gostd` - matches all of go's standard library (Pulled from `GOROOT`)
      #
      # Default (applies if no custom rules are defined): Only allow $gostd in all files.
      rules:
        "deprecated":
          # List of file globs that will match this list of settings to compare against.
          # By default, if a path is relative, it is relative to the directory where the golangci-lint command is executed.
          # The placeholder '${base-path}' is substituted with a path relative to the mode defined with `run.relative-path-mode`.
          # The placeholder '${config-path}' is substituted with a path relative to the configuration file.
          # Default: $all
          files:
            - $all
          # List of packages that are not allowed.
          # Entries can be a variable (starting with $), a string prefix, or an exact match (if ending with $).
          # Default: []
          deny:
            - pkg: github.com/golang/protobuf
              desc: Use google.golang.org/protobuf instead, see https://developers.google.com/protocol-buffers/docs/reference/go/faq#modules
            - pkg: github.com/satori/go.uuid
              desc: Use github.com/google/uuid instead, satori's package is not maintained
            - pkg: github.com/gofrs/uuid$
              desc: Use github.com/gofrs/uuid/v5 or later, it was not a go module before v5
        "non-test files":
          files:
            - "!$test"
          deny:
            - pkg: math/rand$
              desc: Use math/rand/v2 instead, see https://go.dev/blog/randv2
        "non-main files":
          files:
            - "!**/main.go"
          deny:
            - pkg: log$
              desc: Use log/slog instead, see https://go.dev/blog/slog
    embeddedstructfieldcheck:
      # Checks that sync.Mutex and sync.RWMutex are not used as embedded fields.
      # Default: false
      forbid-mutex: true
    errcheck:
      # Report about not checking of errors in type assertions: `a := b.(MyStruct)`.
      # Such cases aren't reported by default.
      # Default: false
      check-type-assertions: true
    exhaustive:
      # Program elements to check for exhaustiveness.
      # Default: [ switch ]
      check:
        - switch
        - map
    exhaustruct:
      # List of regular expressions to match type names that should be excluded from processing.
      # Anonymous structs can be matched by '<anonymous>' alias.
      # Has precedence over `include`.
      # Each regular expression must match the full type name, including package path.
      # For example, to match type `net/http.Cookie` regular expression should be `.*/http\.Cookie`,
      # but not `http\.Cookie`.
      # Default: []
      exclude:
        # std libs
        - ^net/http.Client$
        - ^net/http.Cookie$
        - ^net/http.Request$
        - ^net/http.Response$
        - ^net/http.Server$
        - ^net/http.Transport$
        - ^net/url.URL$
        - ^os/exec.Cmd$
        - ^reflect.StructField$
        # public libs
        - ^github.com/Shopify/sarama.Config$
        - ^github.com/Shopify/sarama.ProducerMessage$
        - ^github.com/mitchellh/mapstructure.DecoderConfig$
        - ^github.com/prometheus/client_golang/.+Opts$
        - ^github.com/spf13/cobra.Command$
        - ^github.com/spf13/cobra.CompletionOptions$
        - ^github.com/stretchr/testify/mock.Mock$
        - ^github.com/testcontainers/testcontainers-go.+Request$
        - ^github.com/testcontainers/testcontainers-go.FromDockerfile$
        - ^golang.org/x/tools/go/analysis.Analyzer$
        - ^google.golang.org/protobuf/.+Options$
        - ^gopkg.in/yaml.v3.Node$
      # Allows empty structures in return statements.
      # Default: false
      allow-empty-returns: true
    funcorder:
      # Checks if the exported methods of a structure are placed before the non-exported ones.
      # Default: true
      struct-method: false
    funlen:
      # Checks the number of lines in a function.
      # If lower than 0, disable the check.
      # Default: 60
      lines: 100
      # Checks the number of statements in a function.
      # If lower than 0, disable the check.
# Default: 40
statements: 50
gochecksumtype:
# Presence of `default` case in switch statements satisfies exhaustiveness, if all members are not listed.
# Default: true
default-signifies-exhaustive: false
gocognit:
# Minimal code complexity to report.
# Default: 30 (but we recommend 10-20)
min-complexity: 20
gocritic:
# Settings passed to gocritic.
# The settings key is the name of a supported gocritic checker.
# The list of supported checkers can be found at https://go-critic.com/overview.
settings:
captLocal:
# Whether to restrict checker to params only.
# Default: true
paramsOnly: false
underef:
# Whether to skip (*x).method() calls where x is a pointer receiver.
# Default: true
skipRecvDeref: false
govet:
# Enable all analyzers.
# Default: false
enable-all: true
# Disable analyzers by name.
# Run `GL_DEBUG=govet golangci-lint run --enable=govet` to see default, all available analyzers, and enabled analyzers.
# Default: []
disable:
- fieldalignment # too strict
# Settings per analyzer.
settings:
shadow:
# Whether to be strict about shadowing; can be noisy.
# Default: false
strict: true
inamedparam:
# Skips check for interface methods with only a single parameter.
# Default: false
skip-single-param: true
mnd:
# List of function patterns to exclude from analysis.
# Values always ignored: `time.Date`,
# `strconv.FormatInt`, `strconv.FormatUint`, `strconv.FormatFloat`,
# `strconv.ParseInt`, `strconv.ParseUint`, `strconv.ParseFloat`.
# Default: []
ignored-functions:
- args.Error
- flag.Arg
- flag.Duration.*
- flag.Float.*
- flag.Int.*
- flag.Uint.*
- os.Chmod
- os.Mkdir.*
- os.OpenFile
- os.WriteFile
- prometheus.ExponentialBuckets.*
- prometheus.LinearBuckets
nakedret:
# Make an issue if func has more lines of code than this setting, and it has naked returns.
# Default: 30
max-func-lines: 0
nolintlint:
# Exclude following linters from requiring an explanation.
# Default: []
allow-no-explanation: [funlen, gocognit, golines]
# Enable to require an explanation of nonzero length after each nolint directive.
# Default: false
require-explanation: true
# Enable to require nolint directives to mention the specific linter being suppressed.
# Default: false
require-specific: true
perfsprint:
# Optimizes into strings concatenation.
# Default: true
strconcat: false
reassign:
# Patterns for global variable names that are checked for reassignment.
# See https://github.com/curioswitch/go-reassign#usage
# Default: ["EOF", "Err.*"]
patterns:
- ".*"
rowserrcheck:
# database/sql is always checked.
# Default: []
packages:
- github.com/jmoiron/sqlx
sloglint:
# Enforce not using global loggers.
# Values:
# - "": disabled
# - "all": report all global loggers
# - "default": report only the default slog logger
# https://github.com/go-simpler/sloglint?tab=readme-ov-file#no-global
# Default: ""
no-global: all
# Enforce using methods that accept a context.
# Values:
# - "": disabled
# - "all": report all contextless calls
# - "scope": report only if a context exists in the scope of the outermost function
# https://github.com/go-simpler/sloglint?tab=readme-ov-file#context-only
# Default: ""
context: scope
staticcheck:
# SAxxxx checks in https://staticcheck.dev/docs/configuration/options/#checks
# Example (to disable some checks): [ "all", "-SA1000", "-SA1001"]
# Default: ["all", "-ST1000", "-ST1003", "-ST1016", "-ST1020", "-ST1021", "-ST1022"]
checks:
- all
# Incorrect or missing package comment.
# https://staticcheck.dev/docs/checks/#ST1000
- -ST1000
# Use consistent method receiver names.
# https://staticcheck.dev/docs/checks/#ST1016
- -ST1016
# Omit embedded fields from selector expression.
# https://staticcheck.dev/docs/checks/#QF1008
- -QF1008
usetesting:
# Enable/disable `os.TempDir()` detections.
# Default: false
os-temp-dir: true
exclusions:
# Log a warning if an exclusion rule is unused.
# Default: false
warn-unused: true
# Predefined exclusion rules.
# Default: []
presets:
- std-error-handling
- common-false-positives
# Excluding configuration per-path, per-linter, per-text and per-source.
rules:
- source: "TODO"
linters: [godot]
- text: "should have a package comment"
linters: [revive]
- text: 'exported \S+ \S+ should have comment( \(or a comment on this block\))? or be unexported'
linters: [revive]
- text: 'package comment should be of the form ".+"'
source: "// ?(nolint|TODO)"
linters: [revive]
- text: 'comment on exported \S+ \S+ should be of the form ".+"'
source: "// ?(nolint|TODO)"
linters: [revive, staticcheck]
- path: '_test\.go'
linters:
- bodyclose
- dupl
- errcheck
- funlen
- goconst
- gosec
- noctx
- wrapcheck

.trunk/.gitignore vendored Normal file

@ -0,0 +1,9 @@
*out
*logs
*actions
*notifications
*tools
plugins
user_trunk.yaml
user.yaml
tmp

.trunk/check-ignore Normal file

@ -0,0 +1,4 @@
vendor/**/*
dist/**/*
bin/**/*
*.pb.go


@ -0,0 +1,2 @@
# Prettier friendly markdownlint config (all formatting rules disabled)
extends: markdownlint/style/prettier


@ -0,0 +1,9 @@
shell=bash
enable=all
source-path=SCRIPTDIR
disable=SC2154
external-sources=true
# If you're having issues with shellcheck following source, disable the errors via:
# disable=SC1090
# disable=SC1091


@ -0,0 +1,7 @@
rules:
quoted-strings:
required: only-when-needed
extra-allowed: ["{|}"]
key-duplicates: {}
octal-values:
forbid-implicit-octal: true


@ -0,0 +1,5 @@
version: "0.2"
# Suggestions can sometimes take longer on CI machines,
# leading to inconsistent results.
suggestionsTimeout: 5000 # ms
enabled: false

.trunk/trunk.yaml Normal file

@ -0,0 +1,49 @@
# This file controls the behavior of Trunk: https://docs.trunk.io/cli
# To learn more about the format of this file, see https://docs.trunk.io/reference/trunk-yaml
version: 0.1
cli:
version: 1.25.0
plugins:
sources:
- id: trunk
ref: v1.7.1
uri: https://github.com/trunk-io/plugins
runtimes:
enabled:
- go@1.21.0
lint:
disabled:
enabled:
- cspell@9.2.0
- markdownlint@0.45.0
- actionlint@1.7.7
- gofmt@1.20.4
- golangci-lint@1.54.2
- shellcheck@0.10.0
- shfmt@3.6.0
- yamllint@1.37.1
definitions:
- name: golangci-lint
files: [go]
commands:
- name: golangci-lint
output: regex
parse_regex: "^(?P<path>.*?):(?P<line>\\d+):(?P<column>\\d+): (?P<message>.*)$"
run: golangci-lint run ${target}
success_codes: [0]
in_place: false
- name: shell
files: [shell]
commands:
- name: shellcheck
output: regex
parse_regex: "^(?P<path>.*?):(?P<line>\\d+):(?P<column>\\d+): (?P<level>.*?) (?P<message>.*)$"
run: shellcheck -f gcc ${target}
success_codes: [0]
actions:
enabled:
- trunk-upgrade-available
- trunk-fmt-pre-commit
disabled:
- trunk-announce
- trunk-check-pre-push

.vscode/cspell.json vendored Normal file

@ -0,0 +1,248 @@
{
"version": "0.2",
"language": "en",
"dictionaries": [
"go",
"softwareTerms"
],
"enableFiletypes": [
"go",
"md",
"yml",
],
"words": [
"arangodb",
"arangolint",
"asasalint",
"asciicheck",
"awalsh",
"Axxxx",
"bidichk",
"bodyclose",
"canonicalheader",
"cli",
"cmdflags",
"cmdtesting",
"codecov",
"containedctx",
"contextcheck",
"copyloopvar",
"covermode",
"coverprofile",
"createreplaylogs",
"cwd",
"cyclo",
"cyclop",
"davecgh",
"DCMAKE",
"decorder",
"depguard",
"difflib",
"dmbeddedstouctfieldcheck",
"dockerdesktop",
"dpkg",
"dupl",
"dupword",
"durationcheck",
"eamodio",
"embeddedstructfieldcheck",
"errcheck",
"errchkjson",
"errexit",
"errname",
"Errorf",
"errorlint",
"erxygen",
"esac",
"Etz",
"exhaustruct",
"exptostd",
"fatcontext",
"fieldalignment",
"fileset",
"finkgulinter",
"fmt",
"folded",
"forbidigo",
"forcetypeassert",
"forcorder",
"funcorder",
"funlen",
"Fyf",
"ginkgolinter",
"gocheckcompilerdirectives",
"gochecknoglobals",
"gochecknoinits",
"gochecksumtype",
"gocognit",
"goconst",
"gocritic",
"gocyclo",
"godot",
"godox",
"gofmt",
"gofrs",
"gofumpt",
"goheader",
"goimports",
"golangci",
"golines",
"golint",
"gomega",
"gomoddirectives",
"gomodguard",
"gonlen",
"gopkg",
"goprintffuncname",
"GOROOT",
"gosec",
"gosmopolitan",
"gostd",
"govet",
"graphviz",
"iface",
"importas",
"inamedparam",
"ineffassign",
"interfacebloat",
"intrange",
"ireturn",
"jmoiron",
"jre",
"kitlog",
"klog",
"libasound",
"libatk",
"libatspi",
"libboost",
"libcups",
"libdrm",
"libfuse",
"libgbm",
"libgl",
"libgtk",
"libnspr",
"libnss",
"libosmesa",
"libtk",
"libvips",
"libxcomposite",
"libxdamage",
"libxfixes",
"libxkbcommon",
"libxrandr",
"loggercheck",
"logr",
"lqez",
"maintidx",
"makezero",
"mapstructure",
"mitchellh",
"mscgen",
"musttag",
"myapp",
"mypackage",
"nakedret",
"nestif",
"nginx",
"nilerr",
"nilness",
"nilnesserr",
"nilnil",
"nlreturn",
"noctx",
"noinlineerr",
"nolint",
"nolintlint",
"nonamedreturns",
"nonexistentpackagename",
"nosprintfhostport",
"oipefail",
"oneapi",
"onndoc",
"paeapi",
"pandoc",
"paralleltest",
"perfsprint",
"pipefail",
"pkgs",
"pmealloc",
"postgresql",
"postinst",
"ppenapi",
"prealloc",
"predeclared",
"preinst",
"prezard",
"promlint",
"promlinter",
"proto",
"protogetter",
"reassign",
"recvcheck",
"redis",
"replayfile",
"replayfilename",
"Reymers",
"rolldice",
"rowserrcheck",
"rwa",
"sarama",
"SCRIPTDIR",
"shellcheck",
"SHELLOPTS",
"shfmt",
"sidechannel",
"sloglint",
"SLXVDP",
"softprops",
"spancheck",
"spew",
"Sprintf",
"sqlclosecheck",
"sqlx",
"staticcheck",
"stderr",
"stdout",
"strconcat",
"strconv",
"stretchr",
"stretchside",
"strs",
"Submatch",
"swaggo",
"syspkg",
"tagalign",
"tagliatelle",
"testableexamples",
"testcontainers",
"testdata",
"testifylint",
"testpackage",
"thelper",
"toolchain",
"tparallel",
"Typeflag",
"unconvert",
"underef",
"unparam",
"unparsedflags",
"untar",
"usestdlibvars",
"usetesting",
"usr",
"varnamelen",
"wastedassign",
"wayou",
"whitespace",
"wrapcheck",
"xdot",
"yamllint",
"zerolog",
"zerologlint"
],
"ignorePaths": [
"dist",
".git"
]
}

.vscode/extensions.json vendored Normal file

@ -0,0 +1,10 @@
{
"recommendations": [
"golang.go", // Official Go extension
"trunk.io", // trunk.io Linters
"wayou.vscode-todo-highlight", // Highlight TODOs
"streetsidesoftware.code-spell-checker", // Spell checking
"eamodio.gitlens", // Git integration
"github.vscode-github-actions" // GitHub Actions support
]
}

.vscode/settings.json vendored Normal file

@ -0,0 +1,28 @@
{
"[go]": {
"editor.defaultFormatter": "trunk.io",
"editor.formatOnSave": true
},
"[json]": {
"editor.defaultFormatter": "vscode.json-language-features",
"editor.formatOnSave": true
},
"[jsonc]": {
"editor.defaultFormatter": "vscode.json-language-features",
"editor.formatOnSave": true
},
"[shell]": {
"editor.defaultFormatter": "trunk.io",
"editor.formatOnSave": true
},
"[yaml]": {
"editor.defaultFormatter": "trunk.io",
"editor.formatOnSave": true
},
"cSpell.enabled": false,
"cSpell.caseSensitive": false,
"cSpell.import": [
".vscode/cspell.json"
],
"editor.formatOnSave": true
}

.vscode/tasks.json vendored Normal file

@ -0,0 +1,73 @@
{
"version": "2.0.0",
"tasks": [
{
"label": "go: build",
"type": "shell",
"command": "go",
"args": [
"build",
"-v",
"./..."
],
"group": {
"kind": "build",
"isDefault": true
},
"problemMatcher": [
"$go"
],
"presentation": {
"reveal": "always",
"panel": "shared",
"showReuseMessage": false,
"clear": true
}
},
{
"label": "go: test",
"type": "shell",
"command": "go",
"args": [
"test",
"-v",
"./..."
],
"group": {
"kind": "test",
"isDefault": true
},
"problemMatcher": [
"$go"
],
"presentation": {
"reveal": "always",
"panel": "shared",
"showReuseMessage": false,
"clear": true
}
},
{
"label": "go: test with coverage",
"type": "shell",
"command": "go",
"args": [
"test",
"-v",
"-race",
"-coverprofile=coverage.txt",
"-covermode=atomic",
"./..."
],
"problemMatcher": [
"$go"
],
"presentation": {
"reveal": "always",
"panel": "shared",
"showReuseMessage": false,
"clear": true
}
}
]
}


@ -1,8 +1,11 @@
# cache-apt-pkgs-action
[![License: Apache2](https://shields.io/badge/license-apache2-blue.svg)](https://github.com/awalsh128/fluentcpp/blob/master/LICENSE)
[![Master Test status](https://github.com/awalsh128/cache-apt-pkgs-action-ci/actions/workflows/master_test.yml/badge.svg)](https://github.com/awalsh128/cache-apt-pkgs-action-ci/actions/workflows/master_test.yml)
[![Dev Test status](https://github.com/awalsh128/cache-apt-pkgs-action-ci/actions/workflows/dev_test.yml/badge.svg)](https://github.com/awalsh128/cache-apt-pkgs-action-ci/actions/workflows/dev_test.yml)
[![CI](https://github.com/awalsh128/cache-apt-pkgs-action/actions/workflows/ci.yml/badge.svg?branch=master)](https://github.com/awalsh128/cache-apt-pkgs-action/actions/workflows/ci.yml)
[![CI (dev-v2.0)](https://github.com/awalsh128/cache-apt-pkgs-action/actions/workflows/ci.yml/badge.svg?branch=dev-v2.0)](https://github.com/awalsh128/cache-apt-pkgs-action/actions/workflows/ci.yml?query=branch%3Adev-v2.0)
[![Go Report Card](https://goreportcard.com/badge/github.com/awalsh128/cache-apt-pkgs-action)](https://goreportcard.com/report/github.com/awalsh128/cache-apt-pkgs-action)
[![Go Reference](https://pkg.go.dev/badge/github.com/awalsh128/cache-apt-pkgs-action.svg)](https://pkg.go.dev/github.com/awalsh128/cache-apt-pkgs-action)
[![License](https://img.shields.io/github/license/awalsh128/cache-apt-pkgs-action)](https://github.com/awalsh128/cache-apt-pkgs-action/blob/master/LICENSE)
[![Release](https://img.shields.io/github/v/release/awalsh128/cache-apt-pkgs-action)](https://github.com/awalsh128/cache-apt-pkgs-action/releases)
This action allows caching of Advanced Package Tool (APT) package dependencies to improve workflow execution time instead of installing the packages on every run.
@ -17,7 +20,7 @@ This action is a composition of [actions/cache](https://github.com/actions/cache
### Pre-requisites
Create a workflow `.yml` file in your repository's `.github/workflows` directory. An [example workflow](#example-workflow) is available below. For more information, reference the GitHub Help Documentation for [Creating a workflow file](https://help.github.com/en/articles/configuring-a-workflow#creating-a-workflow-file).
Create a workflow `.yml` file in your repository's `.github/workflows` directory. [Example workflows](#example-workflows) are available below. For more information, reference the GitHub Help Documentation for [Creating a workflow file](https://help.github.com/en/articles/configuring-a-workflow#creating-a-workflow-file).
### Versions
@ -46,9 +49,13 @@ There are three kinds of version labels you can use.
The cache is scoped to the packages given and the branch. The default branch cache is available to other branches.
### Example workflow
### Example workflows
This was a motivating use case for creating this action.
Below are some example workflows showing how to use this action.
#### Build and Deploy Doxygen Documentation
This example shows how to cache dependencies for building and deploying Doxygen documentation:
```yaml
name: Create Documentation
@ -66,7 +73,7 @@ jobs:
- name: Build
run: |
cmake -B ${{github.workspace}}/build -DCMAKE_BUILD_TYPE=${{env.BUILD_TYPE}}
cmake -B ${{github.workspace}}/build -DCMAKE_BUILD_TYPE=${{env.BUILD_TYPE}}
cmake --build ${{github.workspace}}/build --config ${{env.BUILD_TYPE}}
- name: Deploy
@ -76,17 +83,21 @@ jobs:
folder: ${{github.workspace}}/build/website
```
```yaml
#### Simple Package Installation
---
install_doxygen_deps:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: awalsh128/cache-apt-pkgs-action@latest
with:
packages: dia doxygen doxygen-doc doxygen-gui doxygen-latex graphviz mscgen
version: 1.0
This example shows the minimal configuration needed to cache and install packages:
```yaml
name: Install Dependencies
jobs:
install_doxygen_deps:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: awalsh128/cache-apt-pkgs-action@latest
with:
packages: dia doxygen doxygen-doc doxygen-gui doxygen-latex graphviz mscgen
version: 1.0
```
## Caveats


@ -1,103 +1,132 @@
name: 'Cache APT Packages'
description: 'Install APT based packages and cache them for future runs.'
author: awalsh128
branding:
icon: 'hard-drive'
color: 'green'
inputs:
packages:
description: 'Space delimited list of packages to install. Version can be specified optionally using APT command syntax of <name>=<version> (e.g. xdot=1.2-2).'
required: true
default: ''
version:
description: 'Version of cache to load. Each version will have its own cache. Note, all characters except spaces are allowed.'
required: false
default: ''
execute_install_scripts:
description: 'Execute Debian package pre and post install script upon restore. See README.md caveats for more information.'
required: false
default: 'false'
refresh:
description: 'OBSOLETE: Refresh is not used by the action, use version instead.'
deprecationMessage: 'Refresh is not used by the action, use version instead.'
debug:
description: 'Enable debugging when there are issues with action. Minor performance penalty.'
required: false
default: 'false'
outputs:
cache-hit:
description: 'A boolean value to indicate a cache was found for the packages requested.'
# This compound expression is needed because lhs can be empty.
# Need to output true and false instead of true and nothing.
value: ${{ steps.load-cache.outputs.cache-hit || false }}
package-version-list:
description: 'The main requested packages and versions that are installed. Represented as a comma delimited list with equals delimit on the package version (i.e. <package>:<version,<package>:<version>).'
value: ${{ steps.post-cache.outputs.package-version-list }}
all-package-version-list:
description: 'All the pulled in packages and versions, including dependencies, that are installed. Represented as a comma delimited list with equals delimit on the package version (i.e. <package>:<version,<package>:<version>).'
value: ${{ steps.post-cache.outputs.all-package-version-list }}
runs:
using: "composite"
steps:
- id: pre-cache
run: |
${GITHUB_ACTION_PATH}/pre_cache_action.sh \
~/cache-apt-pkgs \
"$VERSION" \
"$EXEC_INSTALL_SCRIPTS" \
"$DEBUG" \
"$PACKAGES"
echo "CACHE_KEY=$(cat ~/cache-apt-pkgs/cache_key.md5)" >> $GITHUB_ENV
shell: bash
env:
VERSION: "${{ inputs.version }}"
EXEC_INSTALL_SCRIPTS: "${{ inputs.execute_install_scripts }}"
DEBUG: "${{ inputs.debug }}"
PACKAGES: "${{ inputs.packages }}"
- id: load-cache
uses: actions/cache/restore@v4
with:
path: ~/cache-apt-pkgs
key: cache-apt-pkgs_${{ env.CACHE_KEY }}
- id: post-cache
run: |
${GITHUB_ACTION_PATH}/post_cache_action.sh \
~/cache-apt-pkgs \
/ \
"$CACHE_HIT" \
"$EXEC_INSTALL_SCRIPTS" \
"$DEBUG" \
"$PACKAGES"
function create_list { local list=$(cat ~/cache-apt-pkgs/manifest_${1}.log | tr '\n' ','); echo ${list:0:-1}; };
echo "package-version-list=$(create_list main)" >> $GITHUB_OUTPUT
echo "all-package-version-list=$(create_list all)" >> $GITHUB_OUTPUT
shell: bash
env:
CACHE_HIT: "${{ steps.load-cache.outputs.cache-hit }}"
EXEC_INSTALL_SCRIPTS: "${{ inputs.execute_install_scripts }}"
DEBUG: "${{ inputs.debug }}"
PACKAGES: "${{ inputs.packages }}"
- id: upload-logs
if: ${{ inputs.debug == 'true' }}
uses: actions/upload-artifact@v4
with:
name: cache-apt-pkgs-logs_${{ env.CACHE_KEY }}
path: ~/cache-apt-pkgs/*.log
- id: save-cache
if: ${{ ! steps.load-cache.outputs.cache-hit }}
uses: actions/cache/save@v4
with:
path: ~/cache-apt-pkgs
key: ${{ steps.load-cache.outputs.cache-primary-key }}
- id: clean-cache
run: |
rm -rf ~/cache-apt-pkgs
shell: bash
name: "Cache APT Packages"
description: "Install APT based packages and cache them for future runs."
author: awalsh128
branding:
icon: "hard-drive"
color: "green"
inputs:
packages:
description: "Space delimited list of packages to install. Version can be specified optionally using APT command syntax of <name>=<version> (e.g. xdot=1.2-2)."
required: true
default: ""
version:
description: "Version of cache to load. Each version will have its own cache. Note, all characters except spaces are allowed."
required: false
default: ""
execute_install_scripts:
description: "Execute Debian package pre and post install script upon restore. See README.md caveats for more information."
required: false
default: "false"
refresh:
description: "OBSOLETE: Refresh is not used by the action, use version instead."
deprecationMessage: "Refresh is not used by the action, use version instead."
debug:
description: "Enable debugging when there are issues with action. Minor performance penalty."
required: false
default: "false"
outputs:
cache-hit:
description: "A boolean value to indicate a cache was found for the packages requested."
# This compound expression is needed because lhs can be empty.
# Need to output true and false instead of true and nothing.
value: ${{ steps.load-cache.outputs.cache-hit || false }}
package-version-list:
description: "The main requested packages and versions that are installed. Represented as a comma delimited list of equals delimited package and version pairs (i.e. <package>=<version>,<package>=<version>)."
value: ${{ steps.post-cache.outputs.package-version-list }}
all-package-version-list:
description: "All the pulled in packages and versions, including dependencies, that are installed. Represented as a comma delimited list of equals delimited package and version pairs (i.e. <package>=<version>,<package>=<version>)."
value: ${{ steps.post-cache.outputs.all-package-version-list }}
runs:
using: "composite"
steps:
- id: setup-binary
shell: bash
run: |
# Map runner architecture to binary name
case "${{ runner.arch }}" in
X64)
BINARY_NAME="cache-apt-pkgs-linux-amd64"
;;
ARM64)
BINARY_NAME="cache-apt-pkgs-linux-arm64"
;;
ARM)
BINARY_NAME="cache-apt-pkgs-linux-arm"
;;
*)
echo "Unsupported architecture: ${{ runner.arch }}"
exit 1
;;
esac
# Use bundled binary from action's dist directory
BINARY_PATH="${{ github.action_path }}/dist/$BINARY_NAME"
if [ ! -f "$BINARY_PATH" ]; then
echo "Error: Binary not found at $BINARY_PATH"
echo "Please ensure the action has been properly built and binaries are included in the dist directory"
exit 1
fi
# Create symlink to standardize binary name for other steps
ln -sf "$BINARY_PATH" "${{ github.action_path }}/cache-apt-pkgs"
chmod +x "${{ github.action_path }}/cache-apt-pkgs"
- id: create-cache-key
shell: bash
run: |
CACHE_DIR="$HOME/cache-apt-pkgs"
mkdir -p "$CACHE_DIR"
"${{ github.action_path }}/cache-apt-pkgs" createkey \
-cache-dir "$CACHE_DIR" \
-version "${{ inputs.version }}" \
-global-version "4" \
-exec-install-scripts "${{ inputs.execute_install_scripts }}" \
${{ inputs.packages }}
echo "cache-key=$(cat "$CACHE_DIR/cache_key.md5")" >> "$GITHUB_OUTPUT"
- id: load-cache
uses: actions/cache/restore@v4
with:
path: ~/cache-apt-pkgs
key: cache-apt-pkgs_${{ steps.create-cache-key.outputs.cache-key }}
- id: post-load-cache
run: |
BINARY="${{ github.action_path }}/cache-apt-pkgs"
if [ "$CACHE_HIT" == "true" ]; then
"$BINARY" restore \
-cache-dir "$HOME/cache-apt-pkgs" \
-exec-install-scripts "$EXEC_INSTALL_SCRIPTS" \
-restore-root "/" \
"$PACKAGES"
else
"$BINARY" install -cache-dir "$HOME/cache-apt-pkgs"
fi
function create_list { local list=$(cat ~/cache-apt-pkgs/manifest_${1}.log | tr '\n' ','); echo ${list:0:-1}; };
echo "package-version-list=$(create_list main)" >> $GITHUB_OUTPUT
echo "all-package-version-list=$(create_list all)" >> $GITHUB_OUTPUT
shell: bash
env:
CACHE_HIT: "${{ steps.load-cache.outputs.cache-hit }}"
EXEC_INSTALL_SCRIPTS: "${{ inputs.execute_install_scripts }}"
DEBUG: "${{ inputs.debug }}"
PACKAGES: "${{ inputs.packages }}"
- id: upload-logs
if: ${{ inputs.debug == 'true' }}
uses: actions/upload-artifact@v4
with:
name: cache-apt-pkgs-logs_${{ steps.create-cache-key.outputs.cache-key }}
path: ~/cache-apt-pkgs/*.log
- id: save-cache
if: ${{ ! steps.load-cache.outputs.cache-hit }}
uses: actions/cache/save@v4
with:
path: ~/cache-apt-pkgs
key: ${{ steps.load-cache.outputs.cache-primary-key }}
- id: clean-cache
run: |
rm -rf ~/cache-apt-pkgs
shell: bash
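The inline `create_list` shell helper in the `post-load-cache` step above joins the newline-delimited manifest log into the comma-delimited output list; a minimal Go sketch of the same transformation (a hypothetical helper, not code from the shipped binary):

```go
package main

import (
	"fmt"
	"strings"
)

// joinManifest converts a newline-delimited manifest (one <package>=<version>
// entry per line) into the comma-delimited list the action emits as output.
func joinManifest(manifest string) string {
	lines := strings.Split(strings.TrimSpace(manifest), "\n")
	return strings.Join(lines, ",")
}

func main() {
	fmt.Println(joinManifest("doxygen=1.9.1-2\ngraphviz=2.42.2-6\n"))
	// prints "doxygen=1.9.1-2,graphviz=2.42.2-6"
}
```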

Binary file not shown.

Binary file not shown.

dist/cache-apt-pkgs-linux-386 vendored Executable file

Binary file not shown.

dist/cache-apt-pkgs-linux-amd64 vendored Executable file

Binary file not shown.

dist/cache-apt-pkgs-linux-arm vendored Executable file

Binary file not shown.

dist/cache-apt-pkgs-linux-arm64 vendored Executable file

Binary file not shown.

go.mod

@ -1,3 +1,19 @@
module awalsh128.com/cache-apt-pkgs-action
go 1.20
go 1.23
toolchain go1.23.4
require (
github.com/bluet/syspkg v0.1.5
github.com/stretchr/testify v1.10.0
)
require (
github.com/davecgh/go-spew v1.1.1 // indirect
github.com/pmezard/go-difflib v1.0.0 // indirect
gopkg.in/yaml.v3 v3.0.1 // indirect
)
// Replace the syspkg module with your local version
replace github.com/bluet/syspkg => /home/awalsh128/syspkg

go.sum Normal file

@ -0,0 +1,10 @@
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/stretchr/testify v1.10.0 h1:Xv5erBjTwe/5IxqUQTdXv5kgmIvbHo3QQyRwhJsOfJA=
github.com/stretchr/testify v1.10.0/go.mod h1:r2ic/lqez/lEtzL7wO/rwa5dbSLXVDPFyf8C91i36aY=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405 h1:yhCVgyC4o1eVCa2tZl7eS0r+SDo693bJlVdllGtEeKM=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=


@ -1,106 +0,0 @@
#!/bin/bash
# Fail on any error.
set -e
# Debug mode for diagnosing issues.
# Setup first before other operations.
debug="${2}"
test "${debug}" = "true" && set -x
# Include library.
script_dir="$(dirname -- "$(realpath -- "${0}")")"
source "${script_dir}/lib.sh"
# Directory that holds the cached packages.
cache_dir="${1}"
# List of the packages to use.
input_packages="${@:3}"
if ! apt-fast --version > /dev/null 2>&1; then
log "Installing apt-fast for optimized installs..."
# Install apt-fast for optimized installs.
/bin/bash -c "$(curl -sL https://raw.githubusercontent.com/ilikenwf/apt-fast/master/quick-install.sh)"
log "done"
log_empty_line
fi
log "Updating APT package list..."
if [[ -z "$(find -H /var/lib/apt/lists -maxdepth 0 -mmin -5)" ]]; then
sudo apt-fast update > /dev/null
log "done"
else
log "skipped (fresh within at least 5 minutes)"
fi
log_empty_line
packages="$(get_normalized_package_list "${input_packages}")"
package_count=$(wc -w <<< "${packages}")
log "Clean installing and caching ${package_count} package(s)."
log_empty_line
manifest_main=""
log "Package list:"
for package in ${packages}; do
manifest_main="${manifest_main}${package},"
log "- ${package}"
done
write_manifest "main" "${manifest_main}" "${cache_dir}/manifest_main.log"
log_empty_line
# Strictly contains the requested packages.
manifest_main=""
# Contains all packages including dependencies.
manifest_all=""
install_log_filepath="${cache_dir}/install.log"
log "Clean installing ${package_count} packages..."
# Zero interaction while installing or upgrading the system via apt.
sudo DEBIAN_FRONTEND=noninteractive apt-fast --yes install ${packages} > "${install_log_filepath}"
log "done"
log "Installation log written to ${install_log_filepath}"
log_empty_line
installed_packages=$(get_installed_packages "${install_log_filepath}")
log "Installed package list:"
for installed_package in ${installed_packages}; do
# Reformat for human friendly reading.
log "- $(echo ${installed_package} | awk -F\= '{print $1" ("$2")"}')"
done
log_empty_line
installed_packages_count=$(wc -w <<< "${installed_packages}")
log "Caching ${installed_packages_count} installed packages..."
for installed_package in ${installed_packages}; do
cache_filepath="${cache_dir}/${installed_package}.tar"
# Sanity test in case APT enumerates duplicates.
if test ! -f "${cache_filepath}"; then
read package_name package_ver < <(get_package_name_ver "${installed_package}")
log " * Caching ${package_name} to ${cache_filepath}..."
# Pipe all package files (no folders) and installation control data to Tar.
tar -cf "${cache_filepath}" -C / --verbatim-files-from --files-from <( { dpkg -L "${package_name}" &&
get_install_script_filepath "" "${package_name}" "preinst" &&
get_install_script_filepath "" "${package_name}" "postinst" ;} |
while IFS= read -r f; do test -f "${f}" -o -L "${f}" && get_tar_relpath "${f}"; done )
log " done (compressed size $(du -h "${cache_filepath}" | cut -f1))."
fi
# Comma delimited name:ver pairs in the all packages manifest.
manifest_all="${manifest_all}${package_name}=${package_ver},"
done
log "done (total cache size $(du -h ${cache_dir} | tail -1 | awk '{print $1}'))"
log_empty_line
write_manifest "all" "${manifest_all}" "${cache_dir}/manifest_all.log"

lib.sh

@ -1,178 +0,0 @@
#!/bin/bash
# Don't fail on error. We use the exit status as a conditional.
#
# This is the default behavior but can be overridden by the caller in the
# SHELLOPTS env var.
set +e
###############################################################################
# Execute the Debian install script.
# Arguments:
# Root directory to search from.
# File path to cached package archive.
# Installation script extension (preinst, postinst).
# Parameter to pass to the installation script.
# Returns:
# Filepath of the install script, otherwise an empty string.
###############################################################################
function execute_install_script {
local package_name=$(basename ${2} | awk -F\= '{print $1}')
local install_script_filepath=$(\
get_install_script_filepath "${1}" "${package_name}" "${3}")
if test ! -z "${install_script_filepath}"; then
log "- Executing ${install_script_filepath}..."
# Don't abort on errors; dpkg-trigger will error normally since it is
# outside its run environment.
sudo sh -x ${install_script_filepath} ${4} || true
log " done"
fi
}
###############################################################################
# Gets the Debian install script filepath.
# Arguments:
# Root directory to search from.
# Name of the unqualified package to search for.
#   Extension of the installation script (preinst, postinst).
# Returns:
# Filepath of the script file, otherwise an empty string.
###############################################################################
function get_install_script_filepath {
# Filename includes arch (e.g. amd64).
local filepath="$(\
ls -1 "${1}var/lib/dpkg/info/${2}"*."${3}" 2> /dev/null \
| grep -E "${2}(:.*)?\.${3}" | head -1 || true)"
test "${filepath}" && echo "${filepath}"
}
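As a runnable illustration of the lookup above, the sketch below recreates a dpkg info directory under a temp root and resolves an arch-qualified maintainer script; the package name and layout are made up for the example.

```shell
#!/bin/bash
set -e
# Hypothetical dpkg info layout under a temp root (the trailing slash mirrors
# the function's ${1} argument).
root="$(mktemp -d)/"
mkdir -p "${root}var/lib/dpkg/info"
touch "${root}var/lib/dpkg/info/rolldice:amd64.postinst"

# Same lookup as get_install_script_filepath: glob by package prefix, then
# filter on an optional ":<arch>" qualifier before the extension.
get_install_script_filepath() {
  local filepath="$(ls -1 "${1}var/lib/dpkg/info/${2}"*."${3}" 2> /dev/null \
    | grep -E "${2}(:.*)?\.${3}" | head -1 || true)"
  test "${filepath}" && echo "${filepath}"
}

get_install_script_filepath "${root}" "rolldice" "postinst"
```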
###############################################################################
# Gets a list of installed packages from a Debian package installation log.
# Arguments:
# The filepath of the Debian install log.
# Returns:
#   The list of space delimited action syntax pairs with each pair equals
#   delimited. <name>=<version> <name>=<version>...
###############################################################################
function get_installed_packages {
local install_log_filepath="${1}"
local regex="^Unpacking ([^ :]+)([^ ]+)? (\[[^ ]+\]\s)?\(([^ )]+)"
local dep_packages=""
while read -r line; do
# ${regex} should be unquoted since it isn't a literal.
if [[ "${line}" =~ ${regex} ]]; then
dep_packages="${dep_packages}${BASH_REMATCH[1]}=${BASH_REMATCH[4]} "
else
log_err "Unable to parse package name and version from \"${line}\""
exit 2
fi
done < <(grep "^Unpacking " "${install_log_filepath}")
if test -n "${dep_packages}"; then
echo "${dep_packages:0:-1}" # Removing trailing space.
else
echo ""
fi
}
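The regex above can be exercised standalone; the sample log line below is fabricated but follows dpkg's `Unpacking <name>:<arch> (<version>)` shape.

```shell
#!/bin/bash
# Sketch: extract a name=version pair from one dpkg "Unpacking" log line,
# mirroring get_installed_packages. The sample line is made up.
parse_unpack_line() {
  local regex="^Unpacking ([^ :]+)([^ ]+)? (\[[^ ]+\]\s)?\(([^ )]+)"
  # ${regex} is unquoted on purpose; quoting would make it a literal match.
  [[ "${1}" =~ ${regex} ]] && echo "${BASH_REMATCH[1]}=${BASH_REMATCH[4]}"
}
parse_unpack_line "Unpacking rolldice:amd64 (1.16-1build1) ..."
```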
###############################################################################
# Splits a fully qualified APT package in action syntax into its name and version.
# Arguments:
# The action syntax equals delimited package pair or just the package name.
# Returns:
# The package name and version pair.
###############################################################################
function get_package_name_ver {
local ORIG_IFS="${IFS}"
IFS='=' read -r name ver <<< "${1}"
IFS="${ORIG_IFS}"
# If version not found in the fully qualified package value.
if test -z "${ver}"; then
# This is a fallback and should no longer be used since it is slow.
log_err "Unexpected version resolution for package '${name}'"
ver="$(apt-cache show ${name} | grep '^Version:' | awk '{print $2}')"
fi
echo "${name}" "${ver}"
}
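For illustration, the happy path of the split above can be sketched without the apt-cache fallback (the package pairs are example values).

```shell
#!/bin/bash
# Sketch of get_package_name_ver's fast path: split "name=version" on the
# first equals sign. The slow apt-cache fallback is intentionally omitted.
get_name_ver() {
  local name ver
  IFS='=' read -r name ver <<< "${1}"
  echo "${name}" "${ver}"
}
get_name_ver "xdot=1.1-2"
```

Epoch-qualified versions such as `default-jre=2:1.11-72` survive intact because only the first `=` delimits the pair.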
###############################################################################
# Sorts the given packages by name, splitting on commas and/or spaces.
# Arguments:
# The comma and/or space delimited list of packages.
# Returns:
# Sorted list of space delimited package name=version pairs.
###############################################################################
function get_normalized_package_list {
# Remove commas and block scalar folded backslashes, collapse extraneous
# internal, leading and trailing spaces, then sort.
local packages=$(echo "${1}" \
| sed 's/[,\]/ /g; s/\s\+/ /g; s/^\s\+//g; s/\s\+$//g' \
| sort -t' ')
local script_dir="$(dirname -- "$(realpath -- "${0}")")"
local architecture=$(dpkg --print-architecture)
if [ "${architecture}" == "arm64" ]; then
${script_dir}/apt_query-arm64 normalized-list ${packages}
else
${script_dir}/apt_query-x86 normalized-list ${packages}
fi
}
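The sed normalization above can be tried in isolation (the input list is made up; the apt_query resolution step is skipped here).

```shell
#!/bin/bash
# Sketch: strip commas, folded-block backslashes and excess whitespace the
# same way get_normalized_package_list does before handing off to apt_query.
normalize() {
  echo "${1}" | sed 's/[,\]/ /g; s/\s\+/ /g; s/^\s\+//g; s/\s\+$//g'
}
normalize '  xdot,  rolldice \ default-jre  '
```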
###############################################################################
# Gets the relative filepath acceptable by Tar. Just removes the leading slash
# that Tar disallows.
# Arguments:
# Absolute filepath to archive.
# Returns:
# The relative filepath to archive.
###############################################################################
function get_tar_relpath {
local filepath=${1}
if test "${filepath:0:1}" = "/"; then
echo "${filepath:1}"
else
echo "${filepath}"
fi
}
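A quick usage sketch of the helper above (the path is an example):

```shell
#!/bin/bash
# Sketch: tar rejects absolute member names, so strip a single leading slash.
get_tar_relpath() {
  local filepath=${1}
  if test "${filepath:0:1}" = "/"; then
    echo "${filepath:1}"
  else
    echo "${filepath}"
  fi
}
get_tar_relpath "/usr/games/rolldice"   # -> usr/games/rolldice
```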
function log { echo "${@}"; }
function log_err { >&2 echo "${@}"; }
function log_empty_line { echo ""; }
###############################################################################
# Validates an argument to be of a boolean value.
# Arguments:
# Argument to validate.
# Variable name of the argument.
# Exit code if validation fails.
# Returns:
#   Nothing; exits with the given code if validation fails.
###############################################################################
function validate_bool {
if test "${1}" != "true" -a "${1}" != "false"; then
log "aborted"
log "${2} value '${1}' must be either true or false (case sensitive)."
exit ${3}
fi
}
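Because validate_bool exits the shell on failure, the failure path is easiest to probe in a subshell; a small sketch (the argument names are examples):

```shell
#!/bin/bash
# Sketch of validate_bool: accept only the literal strings true/false and
# exit with the supplied code otherwise.
validate_bool() {
  if test "${1}" != "true" -a "${1}" != "false"; then
    echo "${2} value '${1}' must be either true or false (case sensitive)."
    exit "${3}"
  fi
}
validate_bool "true" debug 1                            # passes silently
( validate_bool "yes" debug 1 ) || echo "rejected with exit code ${?}"
```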
###############################################################################
# Writes the manifest to a specified file.
# Arguments:
# Type of manifest being written.
# List of packages being written to the file.
# File path of the manifest being written.
# Returns:
# Log lines from write.
###############################################################################
function write_manifest {
if [ ${#2} -eq 0 ]; then
log "Skipped ${1} manifest write. No packages to install."
else
log "Writing ${1} packages manifest to ${3}..."
# 0:-1 to remove trailing comma, delimit by newline and sort.
echo "${2:0:-1}" | tr ',' '\n' | sort > "${3}"
log "done"
fi
}
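A self-contained sketch of the manifest write above, using a temp file and made-up package pairs:

```shell
#!/bin/bash
set -e
# Sketch of write_manifest: drop the trailing comma, one package per line, sorted.
write_manifest() {
  if [ ${#2} -eq 0 ]; then
    echo "Skipped ${1} manifest write. No packages to install."
  else
    echo "${2:0:-1}" | tr ',' '\n' | sort > "${3}"
  fi
}
manifest_file="$(mktemp)"
write_manifest "all" "xdot=1.1-2,rolldice=1.16-1build1," "${manifest_file}"
cat "${manifest_file}"
```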


@@ -1,37 +0,0 @@
#!/bin/bash
# Fail on any error.
set -e
# Include library.
script_dir="$(dirname -- "$(realpath -- "${0}")")"
source "${script_dir}/lib.sh"
# Directory that holds the cached packages.
cache_dir="${1}"
# Root directory to untar the cached packages to.
# Typically filesystem root '/' but can be changed for testing.
# WARNING: If non-root, this can cause errors during install script execution.
cache_restore_root="${2}"
# Indicates that the cache was found.
cache_hit="${3}"
# Cache and execute post install scripts on restore.
execute_install_scripts="${4}"
# Debug mode for diagnosing issues.
debug="${5}"
test "${debug}" = "true" && set -x
# List of the packages to use.
packages="${@:6}"
if test "${cache_hit}" = "true"; then
${script_dir}/restore_pkgs.sh "${cache_dir}" "${cache_restore_root}" "${execute_install_scripts}" "${debug}"
else
${script_dir}/install_and_cache_pkgs.sh "${cache_dir}" "${debug}" ${packages}
fi
log_empty_line


@@ -1,88 +0,0 @@
#!/bin/bash
set -e
# Include library.
script_dir="$(dirname -- "$(realpath -- "${0}")")"
source "${script_dir}/lib.sh"
# Debug mode for diagnosing issues.
# Setup first before other operations.
debug="${4}"
validate_bool "${debug}" debug 1
test "${debug}" = "true" && set -x
# Directory that holds the cached packages.
cache_dir="${1}"
# Version of the cache to create or load.
version="${2}"
# Execute post-installation script.
execute_install_scripts="${3}"
# List of the packages to use.
input_packages="${@:5}"
# Trim commas, excess spaces, and sort.
log "Normalizing package list..."
packages="$(get_normalized_package_list "${input_packages}")"
log "done"
# Create cache directory so artifacts can be saved.
mkdir -p ${cache_dir}
log "Validating action arguments (version='${version}', packages='${packages}')...";
if grep -q " " <<< "${version}"; then
log "aborted"
log "Version value '${version}' cannot contain spaces." >&2
exit 2
fi
# Is length of string zero?
if test -z "${packages}"; then
log "aborted"
log "Packages argument cannot be empty." >&2
exit 3
fi
validate_bool "${execute_install_scripts}" execute_install_scripts 4
log "done"
log_empty_line
# Abort on any failure at this point.
set -e
log "Creating cache key..."
# Forces an update in cases where an accidental breaking change was introduced
# and a global cache reset is required, or a change in the cache action requires a reload.
force_update_inc="3"
# Force a different cache key for different architectures (currently x86_64 and aarch64 are available on GitHub)
cpu_arch="$(arch)"
log "- CPU architecture is '${cpu_arch}'."
value="${packages} @ ${version} ${force_update_inc}"
# Don't invalidate existing caches for the standard Ubuntu runners
if [ "${cpu_arch}" != "x86_64" ]; then
value="${value} ${cpu_arch}"
log "- Architecture '${cpu_arch}' added to value."
fi
log "- Value to hash is '${value}'."
key="$(echo "${value}" | md5sum | cut -f1 -d' ')"
log "- Value hashed as '${key}'."
log "done"
key_filepath="${cache_dir}/cache_key.md5"
echo "${key}" > "${key_filepath}"
log "Hash value written to ${key_filepath}"
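The key derivation above reduces to a single md5 over a composed value; a sketch with made-up inputs (the x86_64 case, where the architecture is not appended):

```shell
#!/bin/bash
set -e
# Sketch: compose the hash input exactly as the script does, then md5 it.
packages="rolldice=1.16-1build1 xdot=1.1-2"   # normalized list (example)
version="v1"                                  # user-supplied cache version (example)
force_update_inc="3"                          # manual cache-busting counter
value="${packages} @ ${version} ${force_update_inc}"
key="$(echo "${value}" | md5sum | cut -f1 -d' ')"
echo "${key}"
```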


@@ -1,61 +0,0 @@
#!/bin/bash
# Fail on any error.
set -e
# Debug mode for diagnosing issues.
# Setup first before other operations.
debug="${4}"
test "${debug}" = "true" && set -x
# Include library.
script_dir="$(dirname -- "$(realpath -- "${0}")")"
source "${script_dir}/lib.sh"
# Directory that holds the cached packages.
cache_dir="${1}"
# Root directory to untar the cached packages to.
# Typically filesystem root '/' but can be changed for testing.
cache_restore_root="${2}"
test -d "${cache_restore_root}" || mkdir "${cache_restore_root}"
# Cache and execute post install scripts on restore.
execute_install_scripts="${3}"
cache_filepaths="$(ls -1 "${cache_dir}" | sort)"
log "Found $(echo ${cache_filepaths} | wc -w) files in the cache."
for cache_filepath in ${cache_filepaths}; do
log "- $(basename "${cache_filepath}")"
done
log_empty_line
log "Reading from main requested packages manifest..."
for logline in $(cat "${cache_dir}/manifest_main.log" | tr ',' '\n' ); do
log "- $(echo "${logline}" | tr ':' ' ')"
done
log "done"
log_empty_line
# Only search for archived results. Manifest and cache key also live here.
cached_filepaths=$(ls -1 "${cache_dir}"/*.tar | sort)
cached_filecount=$(echo ${cached_filepaths} | wc -w)
log "Restoring ${cached_filecount} packages from cache..."
for cached_filepath in ${cached_filepaths}; do
log "- $(basename "${cached_filepath}") restoring..."
sudo tar -xf "${cached_filepath}" -C "${cache_restore_root}" > /dev/null
log " done"
# Execute install scripts if available.
if test "${execute_install_scripts}" = "true"; then
# May have to add more handling for extracting pre-install script before extracting all files.
# Keeping it simple for now.
execute_install_script "${cache_restore_root}" "${cached_filepath}" preinst install
execute_install_script "${cache_restore_root}" "${cached_filepath}" postinst configure
fi
done
log "done"


@@ -1,53 +0,0 @@
package main
import (
"flag"
"fmt"
"os"
"awalsh128.com/cache-apt-pkgs-action/src/internal/common"
"awalsh128.com/cache-apt-pkgs-action/src/internal/exec"
"awalsh128.com/cache-apt-pkgs-action/src/internal/logging"
)
func getExecutor(replayFilename string) exec.Executor {
if len(replayFilename) == 0 {
return &exec.BinExecutor{}
}
return exec.NewReplayExecutor(replayFilename)
}
func main() {
debug := flag.Bool("debug", false, "Log diagnostic information to a file alongside the binary.")
replayFilename := flag.String("replayfile", "",
"Replay command output from a specified file rather than executing a binary."+
"The file should be in the same format as the log generated by the debug flag.")
flag.Parse()
unparsedFlags := flag.Args()
logging.Init(os.Args[0]+".log", *debug)
executor := getExecutor(*replayFilename)
if len(unparsedFlags) < 2 {
logging.Fatalf("Expected at least 2 non-flag arguments but found %d.", len(unparsedFlags))
return
}
command := unparsedFlags[0]
pkgNames := unparsedFlags[1:]
switch command {
case "normalized-list":
pkgs, err := common.GetAptPackages(executor, pkgNames)
if err != nil {
logging.Fatalf("Encountered error resolving some or all package names, see combined std[out,err] below.\n%s", err.Error())
}
fmt.Println(pkgs.Serialize())
default:
logging.Fatalf("Command '%s' not recognized.", command)
}
}


@@ -1,70 +0,0 @@
package main
import (
"flag"
"testing"
"awalsh128.com/cache-apt-pkgs-action/src/internal/cmdtesting"
)
var createReplayLogs bool = false
func init() {
flag.BoolVar(&createReplayLogs, "createreplaylogs", false, "Execute the test commands, save the command output for future replay and skip the tests themselves.")
}
func TestMain(m *testing.M) {
cmdtesting.TestMain(m)
}
func TestNormalizedList_MultiplePackagesExists_StdoutsAlphaSortedPackageNameVersionPairs(t *testing.T) {
result := cmdtesting.New(t, createReplayLogs).Run("normalized-list", "xdot", "rolldice")
result.ExpectSuccessfulOut("rolldice=1.16-1build1 xdot=1.1-2")
}
func TestNormalizedList_SamePackagesDifferentOrder_StdoutsMatch(t *testing.T) {
expected := "rolldice=1.16-1build1 xdot=1.1-2"
ct := cmdtesting.New(t, createReplayLogs)
result := ct.Run("normalized-list", "rolldice", "xdot")
result.ExpectSuccessfulOut(expected)
result = ct.Run("normalized-list", "xdot", "rolldice")
result.ExpectSuccessfulOut(expected)
}
func TestNormalizedList_MultiVersionWarning_StdoutSingleVersion(t *testing.T) {
var result = cmdtesting.New(t, createReplayLogs).Run("normalized-list", "libosmesa6-dev", "libgl1-mesa-dev")
result.ExpectSuccessfulOut("libgl1-mesa-dev=21.2.6-0ubuntu0.1~20.04.2 libosmesa6-dev=21.2.6-0ubuntu0.1~20.04.2")
}
func TestNormalizedList_SinglePackageExists_StdoutsSinglePackageNameVersionPair(t *testing.T) {
var result = cmdtesting.New(t, createReplayLogs).Run("normalized-list", "xdot")
result.ExpectSuccessfulOut("xdot=1.1-2")
}
func TestNormalizedList_VersionContainsColon_StdoutsEntireVersion(t *testing.T) {
var result = cmdtesting.New(t, createReplayLogs).Run("normalized-list", "default-jre")
result.ExpectSuccessfulOut("default-jre=2:1.11-72")
}
func TestNormalizedList_NonExistentPackageName_StderrsAptCacheErrors(t *testing.T) {
var result = cmdtesting.New(t, createReplayLogs).Run("normalized-list", "nonexistentpackagename")
result.ExpectError(
`Error encountered running apt-cache --quiet=0 --no-all-versions show nonexistentpackagename
Exited with status code 100; see combined std[out,err] below:
N: Unable to locate package nonexistentpackagename
N: Unable to locate package nonexistentpackagename
E: No packages found`)
}
func TestNormalizedList_NoPackagesGiven_StderrsArgMismatch(t *testing.T) {
var result = cmdtesting.New(t, createReplayLogs).Run("normalized-list")
result.ExpectError("Expected at least 2 non-flag arguments but found 1.")
}
func TestNormalizedList_VirtualPackagesExists_StdoutsConcretePackage(t *testing.T) {
result := cmdtesting.New(t, createReplayLogs).Run("normalized-list", "libvips")
result.ExpectSuccessfulOut("libvips42=8.9.1-2")
}


@@ -1,10 +0,0 @@
2025/03/15 22:29:08 Debug log created at /home/awalsh128/cache-apt-pkgs-action/src/cmd/apt_query/apt_query.log
2025/03/15 22:29:09 EXECUTION-OBJ-START
{
"Cmd": "apt-cache --quiet=0 --no-all-versions show xdot rolldice",
"Stdout": "Package: xdot\nArchitecture: all\nVersion: 1.1-2\nPriority: optional\nSection: universe/python\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Python Applications Packaging Team \u003cpython-apps-team@lists.alioth.debian.org\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 153\nDepends: gir1.2-gtk-3.0, graphviz, python3-gi, python3-gi-cairo, python3:any\nFilename: pool/universe/x/xdot/xdot_1.1-2_all.deb\nSize: 26708\nMD5sum: aab630b6e1f73a0e3ae85b208b8b6d00\nSHA1: 202155c123c7bd7628023b848e997f342d14359d\nSHA256: 4b7ecb2c4dc948a850024a9b7378d58195230659307bbde4018a1be17645e690\nHomepage: https://github.com/jrfonseca/xdot.py\nDescription-en: interactive viewer for Graphviz dot files\n xdot is an interactive viewer for graphs written in Graphviz's dot language.\n It uses internally the graphviz's xdot output format as an intermediate\n format, and PyGTK and Cairo for rendering. xdot can be used either as a\n standalone application from command line, or as a library embedded in your\n Python 3 application.\n .\n Features:\n * Since it doesn't use bitmaps it is fast and has a small memory footprint.\n * Arbitrary zoom.\n * Keyboard/mouse navigation.\n * Supports events on the nodes with URLs.\n * Animated jumping between nodes.\n * Highlights node/edge under mouse.\nDescription-md5: eb58f25a628b48a744f1b904af3b9282\n\nPackage: rolldice\nArchitecture: amd64\nVersion: 1.16-1build1\nPriority: optional\nSection: universe/games\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Thomas Ross \u003cthomasross@thomasross.io\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 31\nDepends: libc6 (\u003e= 2.7), libreadline8 (\u003e= 6.0)\nFilename: pool/universe/r/rolldice/rolldice_1.16-1build1_amd64.deb\nSize: 9628\nMD5sum: af6390bf2d5d5b4710d308ac06d4913a\nSHA1: 
1d87ccac5b20f4e2a217a0e058408f46cfe5caff\nSHA256: 2e076006200057da0be52060e3cc2f4fc7c51212867173e727590bd7603a0337\nHomepage: https://github.com/sstrickl/rolldice\nDescription-en: virtual dice roller\n rolldice is a virtual dice roller that takes a string on the command\n line in the format of some fantasy role playing games like Advanced\n Dungeons \u0026 Dragons [1] and returns the result of the dice rolls.\n .\n [1] Advanced Dungeons \u0026 Dragons is a registered trademark of TSR, Inc.\nDescription-md5: fc24e9e12c794a8f92ab0ca6e1058501\n\n",
"Stderr": "",
"CombinedOut": "Package: xdot\nArchitecture: all\nVersion: 1.1-2\nPriority: optional\nSection: universe/python\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Python Applications Packaging Team \u003cpython-apps-team@lists.alioth.debian.org\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 153\nDepends: gir1.2-gtk-3.0, graphviz, python3-gi, python3-gi-cairo, python3:any\nFilename: pool/universe/x/xdot/xdot_1.1-2_all.deb\nSize: 26708\nMD5sum: aab630b6e1f73a0e3ae85b208b8b6d00\nSHA1: 202155c123c7bd7628023b848e997f342d14359d\nSHA256: 4b7ecb2c4dc948a850024a9b7378d58195230659307bbde4018a1be17645e690\nHomepage: https://github.com/jrfonseca/xdot.py\nDescription-en: interactive viewer for Graphviz dot files\n xdot is an interactive viewer for graphs written in Graphviz's dot language.\n It uses internally the graphviz's xdot output format as an intermediate\n format, and PyGTK and Cairo for rendering. xdot can be used either as a\n standalone application from command line, or as a library embedded in your\n Python 3 application.\n .\n Features:\n * Since it doesn't use bitmaps it is fast and has a small memory footprint.\n * Arbitrary zoom.\n * Keyboard/mouse navigation.\n * Supports events on the nodes with URLs.\n * Animated jumping between nodes.\n * Highlights node/edge under mouse.\nDescription-md5: eb58f25a628b48a744f1b904af3b9282\n\nPackage: rolldice\nArchitecture: amd64\nVersion: 1.16-1build1\nPriority: optional\nSection: universe/games\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Thomas Ross \u003cthomasross@thomasross.io\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 31\nDepends: libc6 (\u003e= 2.7), libreadline8 (\u003e= 6.0)\nFilename: pool/universe/r/rolldice/rolldice_1.16-1build1_amd64.deb\nSize: 9628\nMD5sum: af6390bf2d5d5b4710d308ac06d4913a\nSHA1: 
1d87ccac5b20f4e2a217a0e058408f46cfe5caff\nSHA256: 2e076006200057da0be52060e3cc2f4fc7c51212867173e727590bd7603a0337\nHomepage: https://github.com/sstrickl/rolldice\nDescription-en: virtual dice roller\n rolldice is a virtual dice roller that takes a string on the command\n line in the format of some fantasy role playing games like Advanced\n Dungeons \u0026 Dragons [1] and returns the result of the dice rolls.\n .\n [1] Advanced Dungeons \u0026 Dragons is a registered trademark of TSR, Inc.\nDescription-md5: fc24e9e12c794a8f92ab0ca6e1058501\n\n",
"ExitCode": 0
}
EXECUTION-OBJ-END


@@ -1,10 +0,0 @@
2025/03/15 22:29:10 Debug log created at /home/awalsh128/cache-apt-pkgs-action/src/cmd/apt_query/apt_query.log
2025/03/15 22:29:12 EXECUTION-OBJ-START
{
"Cmd": "apt-cache --quiet=0 --no-all-versions show libosmesa6-dev libgl1-mesa-dev",
"Stdout": "Package: libosmesa6-dev\nArchitecture: amd64\nVersion: 21.2.6-0ubuntu0.1~20.04.2\nMulti-Arch: same\nPriority: extra\nSection: devel\nSource: mesa\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Debian X Strike Force \u003cdebian-x@lists.debian.org\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 88\nProvides: libosmesa-dev\nDepends: libosmesa6 (= 21.2.6-0ubuntu0.1~20.04.2), mesa-common-dev (= 21.2.6-0ubuntu0.1~20.04.2) | libgl-dev\nConflicts: libosmesa-dev\nReplaces: libosmesa-dev\nFilename: pool/main/m/mesa/libosmesa6-dev_21.2.6-0ubuntu0.1~20.04.2_amd64.deb\nSize: 8844\nMD5sum: b6c380d1b916bc6955aaf108a3be468e\nSHA1: 4b772c8127e60a342dabec4ff0939969d99038b4\nSHA256: bf003b66573d611877664e01659046d281b26698f3665345cb784ddded662c6a\nSHA512: 761557925874473e4408504772a0af4d29f6dc1dcbd53e772315dffb6da87d47960edca4de39deda7cae33a8730d87a19b40a7d29739ba7cff5b60ee4900a13a\nHomepage: https://mesa3d.org/\nDescription-en: Mesa Off-screen rendering extension -- development files\n This package provides the required environment for developing programs\n that use the off-screen rendering extension of Mesa.\n .\n For more information on OSmesa see the libosmesa6 package.\nDescription-md5: 9b1d7a0b3e6a2ea021f4443f42dcff4f\n\nPackage: libgl1-mesa-dev\nArchitecture: amd64\nVersion: 21.2.6-0ubuntu0.1~20.04.2\nMulti-Arch: same\nPriority: extra\nSection: libdevel\nSource: mesa\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Debian X Strike Force \u003cdebian-x@lists.debian.org\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 70\nDepends: libgl-dev, libglvnd-dev\nFilename: pool/main/m/mesa/libgl1-mesa-dev_21.2.6-0ubuntu0.1~20.04.2_amd64.deb\nSize: 6420\nMD5sum: 759a811dcb12adfcebfc1a6aa52e85b9\nSHA1: 3b9de17b1c67ee40603e1eebaefa978810a2f2d2\nSHA256: 
76846d96ae0706a7edcd514d452a1393bb8b8a8ac06518253dd5869441807052\nSHA512: 581e4b3752b4c98399f3519fce2c5ab033cea3cac66bde3c204af769ff0377400087f9c4c6aaebe06c19d05f8715b3346d249a86c3ae80a098ca476e76af01c3\nHomepage: https://mesa3d.org/\nDescription-en: transitional dummy package\n This is a transitional dummy package, it can be safely removed.\nDescription-md5: 635a93bcd1440d16621693fe064c2aa9\n\n",
"Stderr": "N: There are 2 additional records. Please use the '-a' switch to see them.\n",
"CombinedOut": "Package: libosmesa6-dev\nArchitecture: amd64\nVersion: 21.2.6-0ubuntu0.1~20.04.2\nMulti-Arch: same\nPriority: extra\nSection: devel\nSource: mesa\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Debian X Strike Force \u003cdebian-x@lists.debian.org\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 88\nProvides: libosmesa-dev\nDepends: libosmesa6 (= 21.2.6-0ubuntu0.1~20.04.2), mesa-common-dev (= 21.2.6-0ubuntu0.1~20.04.2) | libgl-dev\nConflicts: libosmesa-dev\nReplaces: libosmesa-dev\nFilename: pool/main/m/mesa/libosmesa6-dev_21.2.6-0ubuntu0.1~20.04.2_amd64.deb\nSize: 8844\nMD5sum: b6c380d1b916bc6955aaf108a3be468e\nSHA1: 4b772c8127e60a342dabec4ff0939969d99038b4\nSHA256: bf003b66573d611877664e01659046d281b26698f3665345cb784ddded662c6a\nSHA512: 761557925874473e4408504772a0af4d29f6dc1dcbd53e772315dffb6da87d47960edca4de39deda7cae33a8730d87a19b40a7d29739ba7cff5b60ee4900a13a\nHomepage: https://mesa3d.org/\nDescription-en: Mesa Off-screen rendering extension -- development files\n This package provides the required environment for developing programs\n that use the off-screen rendering extension of Mesa.\n .\n For more information on OSmesa see the libosmesa6 package.\nDescription-md5: 9b1d7a0b3e6a2ea021f4443f42dcff4f\n\nPackage: libgl1-mesa-dev\nArchitecture: amd64\nVersion: 21.2.6-0ubuntu0.1~20.04.2\nMulti-Arch: same\nPriority: extra\nSection: libdevel\nSource: mesa\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Debian X Strike Force \u003cdebian-x@lists.debian.org\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 70\nDepends: libgl-dev, libglvnd-dev\nFilename: pool/main/m/mesa/libgl1-mesa-dev_21.2.6-0ubuntu0.1~20.04.2_amd64.deb\nSize: 6420\nMD5sum: 759a811dcb12adfcebfc1a6aa52e85b9\nSHA1: 3b9de17b1c67ee40603e1eebaefa978810a2f2d2\nSHA256: 
76846d96ae0706a7edcd514d452a1393bb8b8a8ac06518253dd5869441807052\nSHA512: 581e4b3752b4c98399f3519fce2c5ab033cea3cac66bde3c204af769ff0377400087f9c4c6aaebe06c19d05f8715b3346d249a86c3ae80a098ca476e76af01c3\nHomepage: https://mesa3d.org/\nDescription-en: transitional dummy package\n This is a transitional dummy package, it can be safely removed.\nDescription-md5: 635a93bcd1440d16621693fe064c2aa9\n\nN: There are 2 additional records. Please use the '-a' switch to see them.\n",
"ExitCode": 0
}
EXECUTION-OBJ-END


@@ -1,15 +0,0 @@
2025/03/15 22:29:13 Debug log created at /home/awalsh128/cache-apt-pkgs-action/src/cmd/apt_query/apt_query.log
2025/03/15 22:29:14 EXECUTION-OBJ-START
{
"Cmd": "apt-cache --quiet=0 --no-all-versions show nonexistentpackagename",
"Stdout": "",
"Stderr": "N: Unable to locate package nonexistentpackagename\nN: Unable to locate package nonexistentpackagename\nE: No packages found\n",
"CombinedOut": "N: Unable to locate package nonexistentpackagename\nN: Unable to locate package nonexistentpackagename\nE: No packages found\n",
"ExitCode": 100
}
EXECUTION-OBJ-END
2025/03/15 22:29:14 Error encountered running apt-cache --quiet=0 --no-all-versions show nonexistentpackagename
Exited with status code 100; see combined std[out,err] below:
N: Unable to locate package nonexistentpackagename
N: Unable to locate package nonexistentpackagename
E: No packages found


@@ -1,2 +0,0 @@
2025/03/15 22:29:14 Debug log created at /home/awalsh128/cache-apt-pkgs-action/src/cmd/apt_query/apt_query.log
2025/03/15 22:29:14 Expected at least 2 non-flag arguments but found 1.


@@ -1,20 +0,0 @@
2025/03/15 22:29:09 Debug log created at /home/awalsh128/cache-apt-pkgs-action/src/cmd/apt_query/apt_query.log
2025/03/15 22:29:10 EXECUTION-OBJ-START
{
"Cmd": "apt-cache --quiet=0 --no-all-versions show rolldice xdot",
"Stdout": "Package: rolldice\nArchitecture: amd64\nVersion: 1.16-1build1\nPriority: optional\nSection: universe/games\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Thomas Ross \u003cthomasross@thomasross.io\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 31\nDepends: libc6 (\u003e= 2.7), libreadline8 (\u003e= 6.0)\nFilename: pool/universe/r/rolldice/rolldice_1.16-1build1_amd64.deb\nSize: 9628\nMD5sum: af6390bf2d5d5b4710d308ac06d4913a\nSHA1: 1d87ccac5b20f4e2a217a0e058408f46cfe5caff\nSHA256: 2e076006200057da0be52060e3cc2f4fc7c51212867173e727590bd7603a0337\nHomepage: https://github.com/sstrickl/rolldice\nDescription-en: virtual dice roller\n rolldice is a virtual dice roller that takes a string on the command\n line in the format of some fantasy role playing games like Advanced\n Dungeons \u0026 Dragons [1] and returns the result of the dice rolls.\n .\n [1] Advanced Dungeons \u0026 Dragons is a registered trademark of TSR, Inc.\nDescription-md5: fc24e9e12c794a8f92ab0ca6e1058501\n\nPackage: xdot\nArchitecture: all\nVersion: 1.1-2\nPriority: optional\nSection: universe/python\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Python Applications Packaging Team \u003cpython-apps-team@lists.alioth.debian.org\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 153\nDepends: gir1.2-gtk-3.0, graphviz, python3-gi, python3-gi-cairo, python3:any\nFilename: pool/universe/x/xdot/xdot_1.1-2_all.deb\nSize: 26708\nMD5sum: aab630b6e1f73a0e3ae85b208b8b6d00\nSHA1: 202155c123c7bd7628023b848e997f342d14359d\nSHA256: 4b7ecb2c4dc948a850024a9b7378d58195230659307bbde4018a1be17645e690\nHomepage: https://github.com/jrfonseca/xdot.py\nDescription-en: interactive viewer for Graphviz dot files\n xdot is an interactive viewer for graphs written in Graphviz's dot language.\n It uses internally the graphviz's xdot 
output format as an intermediate\n format, and PyGTK and Cairo for rendering. xdot can be used either as a\n standalone application from command line, or as a library embedded in your\n Python 3 application.\n .\n Features:\n * Since it doesn't use bitmaps it is fast and has a small memory footprint.\n * Arbitrary zoom.\n * Keyboard/mouse navigation.\n * Supports events on the nodes with URLs.\n * Animated jumping between nodes.\n * Highlights node/edge under mouse.\nDescription-md5: eb58f25a628b48a744f1b904af3b9282\n\n",
"Stderr": "",
"CombinedOut": "Package: rolldice\nArchitecture: amd64\nVersion: 1.16-1build1\nPriority: optional\nSection: universe/games\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Thomas Ross \u003cthomasross@thomasross.io\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 31\nDepends: libc6 (\u003e= 2.7), libreadline8 (\u003e= 6.0)\nFilename: pool/universe/r/rolldice/rolldice_1.16-1build1_amd64.deb\nSize: 9628\nMD5sum: af6390bf2d5d5b4710d308ac06d4913a\nSHA1: 1d87ccac5b20f4e2a217a0e058408f46cfe5caff\nSHA256: 2e076006200057da0be52060e3cc2f4fc7c51212867173e727590bd7603a0337\nHomepage: https://github.com/sstrickl/rolldice\nDescription-en: virtual dice roller\n rolldice is a virtual dice roller that takes a string on the command\n line in the format of some fantasy role playing games like Advanced\n Dungeons \u0026 Dragons [1] and returns the result of the dice rolls.\n .\n [1] Advanced Dungeons \u0026 Dragons is a registered trademark of TSR, Inc.\nDescription-md5: fc24e9e12c794a8f92ab0ca6e1058501\n\nPackage: xdot\nArchitecture: all\nVersion: 1.1-2\nPriority: optional\nSection: universe/python\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Python Applications Packaging Team \u003cpython-apps-team@lists.alioth.debian.org\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 153\nDepends: gir1.2-gtk-3.0, graphviz, python3-gi, python3-gi-cairo, python3:any\nFilename: pool/universe/x/xdot/xdot_1.1-2_all.deb\nSize: 26708\nMD5sum: aab630b6e1f73a0e3ae85b208b8b6d00\nSHA1: 202155c123c7bd7628023b848e997f342d14359d\nSHA256: 4b7ecb2c4dc948a850024a9b7378d58195230659307bbde4018a1be17645e690\nHomepage: https://github.com/jrfonseca/xdot.py\nDescription-en: interactive viewer for Graphviz dot files\n xdot is an interactive viewer for graphs written in Graphviz's dot language.\n It uses internally the graphviz's 
xdot output format as an intermediate\n format, and PyGTK and Cairo for rendering. xdot can be used either as a\n standalone application from command line, or as a library embedded in your\n Python 3 application.\n .\n Features:\n * Since it doesn't use bitmaps it is fast and has a small memory footprint.\n * Arbitrary zoom.\n * Keyboard/mouse navigation.\n * Supports events on the nodes with URLs.\n * Animated jumping between nodes.\n * Highlights node/edge under mouse.\nDescription-md5: eb58f25a628b48a744f1b904af3b9282\n\n",
"ExitCode": 0
}
EXECUTION-OBJ-END
2025/03/15 22:29:10 Debug log created at /home/awalsh128/cache-apt-pkgs-action/src/cmd/apt_query/apt_query.log
2025/03/15 22:29:10 EXECUTION-OBJ-START
{
"Cmd": "apt-cache --quiet=0 --no-all-versions show xdot rolldice",
"Stdout": "Package: xdot\nArchitecture: all\nVersion: 1.1-2\nPriority: optional\nSection: universe/python\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Python Applications Packaging Team \u003cpython-apps-team@lists.alioth.debian.org\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 153\nDepends: gir1.2-gtk-3.0, graphviz, python3-gi, python3-gi-cairo, python3:any\nFilename: pool/universe/x/xdot/xdot_1.1-2_all.deb\nSize: 26708\nMD5sum: aab630b6e1f73a0e3ae85b208b8b6d00\nSHA1: 202155c123c7bd7628023b848e997f342d14359d\nSHA256: 4b7ecb2c4dc948a850024a9b7378d58195230659307bbde4018a1be17645e690\nHomepage: https://github.com/jrfonseca/xdot.py\nDescription-en: interactive viewer for Graphviz dot files\n xdot is an interactive viewer for graphs written in Graphviz's dot language.\n It uses internally the graphviz's xdot output format as an intermediate\n format, and PyGTK and Cairo for rendering. xdot can be used either as a\n standalone application from command line, or as a library embedded in your\n Python 3 application.\n .\n Features:\n * Since it doesn't use bitmaps it is fast and has a small memory footprint.\n * Arbitrary zoom.\n * Keyboard/mouse navigation.\n * Supports events on the nodes with URLs.\n * Animated jumping between nodes.\n * Highlights node/edge under mouse.\nDescription-md5: eb58f25a628b48a744f1b904af3b9282\n\nPackage: rolldice\nArchitecture: amd64\nVersion: 1.16-1build1\nPriority: optional\nSection: universe/games\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Thomas Ross \u003cthomasross@thomasross.io\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 31\nDepends: libc6 (\u003e= 2.7), libreadline8 (\u003e= 6.0)\nFilename: pool/universe/r/rolldice/rolldice_1.16-1build1_amd64.deb\nSize: 9628\nMD5sum: af6390bf2d5d5b4710d308ac06d4913a\nSHA1: 
1d87ccac5b20f4e2a217a0e058408f46cfe5caff\nSHA256: 2e076006200057da0be52060e3cc2f4fc7c51212867173e727590bd7603a0337\nHomepage: https://github.com/sstrickl/rolldice\nDescription-en: virtual dice roller\n rolldice is a virtual dice roller that takes a string on the command\n line in the format of some fantasy role playing games like Advanced\n Dungeons \u0026 Dragons [1] and returns the result of the dice rolls.\n .\n [1] Advanced Dungeons \u0026 Dragons is a registered trademark of TSR, Inc.\nDescription-md5: fc24e9e12c794a8f92ab0ca6e1058501\n\n",
"Stderr": "",
"CombinedOut": "Package: xdot\nArchitecture: all\nVersion: 1.1-2\nPriority: optional\nSection: universe/python\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Python Applications Packaging Team \u003cpython-apps-team@lists.alioth.debian.org\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 153\nDepends: gir1.2-gtk-3.0, graphviz, python3-gi, python3-gi-cairo, python3:any\nFilename: pool/universe/x/xdot/xdot_1.1-2_all.deb\nSize: 26708\nMD5sum: aab630b6e1f73a0e3ae85b208b8b6d00\nSHA1: 202155c123c7bd7628023b848e997f342d14359d\nSHA256: 4b7ecb2c4dc948a850024a9b7378d58195230659307bbde4018a1be17645e690\nHomepage: https://github.com/jrfonseca/xdot.py\nDescription-en: interactive viewer for Graphviz dot files\n xdot is an interactive viewer for graphs written in Graphviz's dot language.\n It uses internally the graphviz's xdot output format as an intermediate\n format, and PyGTK and Cairo for rendering. xdot can be used either as a\n standalone application from command line, or as a library embedded in your\n Python 3 application.\n .\n Features:\n * Since it doesn't use bitmaps it is fast and has a small memory footprint.\n * Arbitrary zoom.\n * Keyboard/mouse navigation.\n * Supports events on the nodes with URLs.\n * Animated jumping between nodes.\n * Highlights node/edge under mouse.\nDescription-md5: eb58f25a628b48a744f1b904af3b9282\n\nPackage: rolldice\nArchitecture: amd64\nVersion: 1.16-1build1\nPriority: optional\nSection: universe/games\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Thomas Ross \u003cthomasross@thomasross.io\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 31\nDepends: libc6 (\u003e= 2.7), libreadline8 (\u003e= 6.0)\nFilename: pool/universe/r/rolldice/rolldice_1.16-1build1_amd64.deb\nSize: 9628\nMD5sum: af6390bf2d5d5b4710d308ac06d4913a\nSHA1: 
1d87ccac5b20f4e2a217a0e058408f46cfe5caff\nSHA256: 2e076006200057da0be52060e3cc2f4fc7c51212867173e727590bd7603a0337\nHomepage: https://github.com/sstrickl/rolldice\nDescription-en: virtual dice roller\n rolldice is a virtual dice roller that takes a string on the command\n line in the format of some fantasy role playing games like Advanced\n Dungeons \u0026 Dragons [1] and returns the result of the dice rolls.\n .\n [1] Advanced Dungeons \u0026 Dragons is a registered trademark of TSR, Inc.\nDescription-md5: fc24e9e12c794a8f92ab0ca6e1058501\n\n",
"ExitCode": 0
}
EXECUTION-OBJ-END


@@ -1,10 +0,0 @@
2025/03/15 22:29:12 Debug log created at /home/awalsh128/cache-apt-pkgs-action/src/cmd/apt_query/apt_query.log
2025/03/15 22:29:12 EXECUTION-OBJ-START
{
"Cmd": "apt-cache --quiet=0 --no-all-versions show xdot",
"Stdout": "Package: xdot\nArchitecture: all\nVersion: 1.1-2\nPriority: optional\nSection: universe/python\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Python Applications Packaging Team \u003cpython-apps-team@lists.alioth.debian.org\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 153\nDepends: gir1.2-gtk-3.0, graphviz, python3-gi, python3-gi-cairo, python3:any\nFilename: pool/universe/x/xdot/xdot_1.1-2_all.deb\nSize: 26708\nMD5sum: aab630b6e1f73a0e3ae85b208b8b6d00\nSHA1: 202155c123c7bd7628023b848e997f342d14359d\nSHA256: 4b7ecb2c4dc948a850024a9b7378d58195230659307bbde4018a1be17645e690\nHomepage: https://github.com/jrfonseca/xdot.py\nDescription-en: interactive viewer for Graphviz dot files\n xdot is an interactive viewer for graphs written in Graphviz's dot language.\n It uses internally the graphviz's xdot output format as an intermediate\n format, and PyGTK and Cairo for rendering. xdot can be used either as a\n standalone application from command line, or as a library embedded in your\n Python 3 application.\n .\n Features:\n * Since it doesn't use bitmaps it is fast and has a small memory footprint.\n * Arbitrary zoom.\n * Keyboard/mouse navigation.\n * Supports events on the nodes with URLs.\n * Animated jumping between nodes.\n * Highlights node/edge under mouse.\nDescription-md5: eb58f25a628b48a744f1b904af3b9282\n\n",
"Stderr": "",
"CombinedOut": "Package: xdot\nArchitecture: all\nVersion: 1.1-2\nPriority: optional\nSection: universe/python\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Python Applications Packaging Team \u003cpython-apps-team@lists.alioth.debian.org\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 153\nDepends: gir1.2-gtk-3.0, graphviz, python3-gi, python3-gi-cairo, python3:any\nFilename: pool/universe/x/xdot/xdot_1.1-2_all.deb\nSize: 26708\nMD5sum: aab630b6e1f73a0e3ae85b208b8b6d00\nSHA1: 202155c123c7bd7628023b848e997f342d14359d\nSHA256: 4b7ecb2c4dc948a850024a9b7378d58195230659307bbde4018a1be17645e690\nHomepage: https://github.com/jrfonseca/xdot.py\nDescription-en: interactive viewer for Graphviz dot files\n xdot is an interactive viewer for graphs written in Graphviz's dot language.\n It uses internally the graphviz's xdot output format as an intermediate\n format, and PyGTK and Cairo for rendering. xdot can be used either as a\n standalone application from command line, or as a library embedded in your\n Python 3 application.\n .\n Features:\n * Since it doesn't use bitmaps it is fast and has a small memory footprint.\n * Arbitrary zoom.\n * Keyboard/mouse navigation.\n * Supports events on the nodes with URLs.\n * Animated jumping between nodes.\n * Highlights node/edge under mouse.\nDescription-md5: eb58f25a628b48a744f1b904af3b9282\n\n",
"ExitCode": 0
}
EXECUTION-OBJ-END


@@ -1,10 +0,0 @@
2025/03/15 22:29:12 Debug log created at /home/awalsh128/cache-apt-pkgs-action/src/cmd/apt_query/apt_query.log
2025/03/15 22:29:13 EXECUTION-OBJ-START
{
"Cmd": "apt-cache --quiet=0 --no-all-versions show default-jre",
"Stdout": "Package: default-jre\nArchitecture: amd64\nVersion: 2:1.11-72\nPriority: optional\nSection: interpreters\nSource: java-common (0.72)\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Debian Java Maintainers \u003cpkg-java-maintainers@lists.alioth.debian.org\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 6\nProvides: java-runtime, java10-runtime, java11-runtime, java2-runtime, java5-runtime, java6-runtime, java7-runtime, java8-runtime, java9-runtime\nDepends: default-jre-headless (= 2:1.11-72), openjdk-11-jre\nFilename: pool/main/j/java-common/default-jre_1.11-72_amd64.deb\nSize: 1084\nMD5sum: 4f441bb884801f3a07806934e4519652\nSHA1: 9922edaa7bd91921a2beee6c60343ebf551957a9\nSHA256: 063bb2ca3b51309f6625033c336beffb0eb8286aaabcf3bf917eef498de29ea5\nHomepage: https://wiki.debian.org/Java/\nDescription-en: Standard Java or Java compatible Runtime\n This dependency package points to the Java runtime, or Java compatible\n runtime recommended for this architecture, which is\n openjdk-11-jre for amd64.\nDescription-md5: 071b7a2f9baf048d89c849c14bcafb9e\nCnf-Extra-Commands: java,jexec\n\n",
"Stderr": "",
"CombinedOut": "Package: default-jre\nArchitecture: amd64\nVersion: 2:1.11-72\nPriority: optional\nSection: interpreters\nSource: java-common (0.72)\nOrigin: Ubuntu\nMaintainer: Ubuntu Developers \u003cubuntu-devel-discuss@lists.ubuntu.com\u003e\nOriginal-Maintainer: Debian Java Maintainers \u003cpkg-java-maintainers@lists.alioth.debian.org\u003e\nBugs: https://bugs.launchpad.net/ubuntu/+filebug\nInstalled-Size: 6\nProvides: java-runtime, java10-runtime, java11-runtime, java2-runtime, java5-runtime, java6-runtime, java7-runtime, java8-runtime, java9-runtime\nDepends: default-jre-headless (= 2:1.11-72), openjdk-11-jre\nFilename: pool/main/j/java-common/default-jre_1.11-72_amd64.deb\nSize: 1084\nMD5sum: 4f441bb884801f3a07806934e4519652\nSHA1: 9922edaa7bd91921a2beee6c60343ebf551957a9\nSHA256: 063bb2ca3b51309f6625033c336beffb0eb8286aaabcf3bf917eef498de29ea5\nHomepage: https://wiki.debian.org/Java/\nDescription-en: Standard Java or Java compatible Runtime\n This dependency package points to the Java runtime, or Java compatible\n runtime recommended for this architecture, which is\n openjdk-11-jre for amd64.\nDescription-md5: 071b7a2f9baf048d89c849c14bcafb9e\nCnf-Extra-Commands: java,jexec\n\n",
"ExitCode": 0
}
EXECUTION-OBJ-END


@@ -1,19 +0,0 @@
2025/03/15 22:29:14 Debug log created at /home/awalsh128/cache-apt-pkgs-action/src/cmd/apt_query/apt_query.log
2025/03/15 22:29:14 EXECUTION-OBJ-START
{
"Cmd": "apt-cache --quiet=0 --no-all-versions show libvips",
"Stdout": "",
"Stderr": "N: Can't select candidate version from package libvips as it has no candidate\nN: Can't select versions from package 'libvips' as it is purely virtual\nN: No packages found\n",
"CombinedOut": "N: Can't select candidate version from package libvips as it has no candidate\nN: Can't select versions from package 'libvips' as it is purely virtual\nN: No packages found\n",
"ExitCode": 0
}
EXECUTION-OBJ-END
2025/03/15 22:29:14 EXECUTION-OBJ-START
{
"Cmd": "bash -c apt-cache showpkg libvips | grep -A 1 \"Reverse Provides\" | tail -1",
"Stdout": "libvips42 8.9.1-2 (= )\n",
"Stderr": "",
"CombinedOut": "libvips42 8.9.1-2 (= )\n",
"ExitCode": 0
}
EXECUTION-OBJ-END


@@ -0,0 +1,154 @@
package main
import (
"flag"
"fmt"
"os"
"path/filepath"
"awalsh128.com/cache-apt-pkgs-action/src/internal/pkgs"
)
type Cmd struct {
Name string
Description string
Flags *flag.FlagSet
Examples []string // usage examples shown in the command help output
ExamplePackages *pkgs.Packages
Run func(cmd *Cmd, pkgArgs *pkgs.Packages) error
}
// binaryName is the base name of the invoked command, without the path
var binaryName = filepath.Base(os.Args[0])
type Cmds map[string]*Cmd
func (c *Cmd) parseFlags() *pkgs.Packages {
if len(os.Args) < 3 {
fmt.Fprintf(os.Stderr, "error: command %q requires arguments\n", c.Name)
os.Exit(1)
}
// Parse the command line flags
if err := c.Flags.Parse(os.Args[2:]); err != nil {
fmt.Fprintf(os.Stderr, "error: unable to parse flags for command %q: %v\n", c.Name, err)
os.Exit(1)
}
// Check for missing required flags
missingFlagNames := []string{}
c.Flags.VisitAll(func(f *flag.Flag) {
// Only string flags with an empty default are treated as required;
// they are missing when no value was supplied on the command line
if f.Value.String() == "" && f.DefValue == "" && f.Name != "help" {
missingFlagNames = append(missingFlagNames, f.Name)
}
})
if len(missingFlagNames) > 0 {
fmt.Fprintf(os.Stderr, "error: missing required flags for command %q: %s\n", c.Name, missingFlagNames)
os.Exit(1)
}
// Parse the remaining arguments as package arguments
pkgArgs := pkgs.ParsePackageArgs(c.Flags.Args())
return pkgArgs
}
func (c *Cmds) Add(cmd *Cmd) error {
if _, exists := (*c)[cmd.Name]; exists {
return fmt.Errorf("command %q already exists", cmd.Name)
}
(*c)[cmd.Name] = cmd
return nil
}
func (c *Cmds) Get(name string) (*Cmd, bool) {
cmd, ok := (*c)[name]
return cmd, ok
}
func (c *Cmd) getFlagCount() int {
count := 0
c.Flags.VisitAll(func(f *flag.Flag) {
count++
})
return count
}
func (c *Cmd) help() {
if c.getFlagCount() == 0 {
fmt.Fprintf(os.Stderr, "usage: %s %s [packages]\n\n", binaryName, c.Name)
fmt.Fprintf(os.Stderr, "%s\n\n", c.Description)
} else {
fmt.Fprintf(os.Stderr, "usage: %s %s [flags] [packages]\n\n", binaryName, c.Name)
fmt.Fprintf(os.Stderr, "%s\n\n", c.Description)
fmt.Fprintf(os.Stderr, "Flags:\n")
c.Flags.PrintDefaults()
}
if c.ExamplePackages == nil && len(c.Examples) == 0 {
return
}
pkgsExample := ""
if c.ExamplePackages != nil {
pkgsExample = c.ExamplePackages.String()
}
fmt.Fprintf(os.Stderr, "\nExamples:\n")
if len(c.Examples) == 0 {
fmt.Fprintf(os.Stderr, " %s %s %s\n", binaryName, c.Name, pkgsExample)
return
}
for _, example := range c.Examples {
fmt.Fprintf(os.Stderr, " %s %s %s %s\n", binaryName, c.Name, example, pkgsExample)
}
}
func printUsage(cmds Cmds) {
fmt.Fprintf(os.Stderr, "usage: %s <command> [flags] [packages]\n\n", binaryName)
fmt.Fprintf(os.Stderr, "commands:\n")
// Get max length for alignment
maxLen := 0
for name := range cmds {
if len(name) > maxLen {
maxLen = len(name)
}
}
// Print aligned command descriptions
for name, cmd := range cmds {
fmt.Fprintf(os.Stderr, " %-*s %s\n", maxLen, name, cmd.Description)
}
fmt.Fprintf(os.Stderr, "\nUse \"%s <command> --help\" for more information about a command\n", binaryName)
}
func (c *Cmds) Parse() (*Cmd, *pkgs.Packages) {
if len(os.Args) < 2 {
fmt.Fprintf(os.Stderr, "error: no command specified\n\n")
printUsage(*c)
os.Exit(1)
}
cmdName := os.Args[1]
if cmdName == "--help" || cmdName == "-h" {
printUsage(*c)
os.Exit(0)
}
cmd, ok := c.Get(cmdName)
if !ok {
fmt.Fprintf(os.Stderr, "error: unknown command %q\n\n", cmdName)
printUsage(*c)
os.Exit(1)
}
// Handle command-specific help
for _, arg := range os.Args[2:] {
if arg == "--help" || arg == "-h" {
cmd.help()
os.Exit(0)
}
}
pkgArgs := cmd.parseFlags()
if pkgArgs == nil {
fmt.Fprintf(os.Stderr, "error: no package arguments specified for command %q\n\n", cmd.Name)
cmd.help()
os.Exit(1)
}
return cmd, pkgArgs
}


@@ -0,0 +1,210 @@
package main
import (
"flag"
"fmt"
"os"
"testing"
"awalsh128.com/cache-apt-pkgs-action/src/internal/pkgs"
"github.com/stretchr/testify/assert"
)
const (
// Packages.Name values
pkgNameNginx = "nginx"
pkgNameRedis = "redis"
pkgNamePostgres = "postgresql"
// Packages.Version values
pkgVersionNginx = "1.18.0"
pkgVersionPostgres = "13.2"
// Cmd field values
cmdName = "test"
cmdDescription = "test command"
// File path values
flagCacheDir = "/path/to/cache"
binaryFullPath = "/usr/local/bin/myapp"
binaryRelPath = "./bin/myapp"
binaryBaseName = "myapp"
)
// PkgArg tests
func TestPackagesEmpty(t *testing.T) {
assert := assert.New(t)
pkgArgs := pkgs.Packages{}
assert.Empty(pkgArgs.String(), "empty Packages should return empty string")
}
func TestPackagesSingleWithoutVersion(t *testing.T) {
assert := assert.New(t)
pkgArgs := pkgs.Packages{pkgs.Package{Name: pkgNameNginx}}
assert.Equal(pkgNameNginx, pkgArgs.String(), "Packages with single package without version")
}
func TestPackagesSingleWithVersion(t *testing.T) {
assert := assert.New(t)
pkgArgs := pkgs.Packages{pkgs.Package{Name: pkgNameNginx, Version: pkgVersionNginx}}
expected := pkgNameNginx + "=" + pkgVersionNginx
assert.Equal(expected, pkgArgs.String(), "Packages with single package with version")
}
func TestPackagesMultiple(t *testing.T) {
assert := assert.New(t)
pkgArgs := pkgs.Packages{
pkgs.Package{Name: pkgNameNginx, Version: pkgVersionNginx},
pkgs.Package{Name: pkgNameRedis},
pkgs.Package{Name: pkgNamePostgres, Version: pkgVersionPostgres},
}
expected := pkgNameNginx + "=" + pkgVersionNginx + " " + pkgNameRedis + " " + pkgNamePostgres + "=" + pkgVersionPostgres
assert.Equal(expected, pkgArgs.String(), "Packages with multiple packages")
}
// Cmd tests
func TestCmdName(t *testing.T) {
assert := assert.New(t)
cmd := &Cmd{Name: cmdName}
assert.Equal(cmdName, cmd.Name, "Cmd.Name should match set value")
}
func TestGetFlagCount(t *testing.T) {
assert := assert.New(t)
// Test with no flags
cmd := &Cmd{
Name: cmdName,
Flags: flag.NewFlagSet(cmdName, flag.ExitOnError),
}
assert.Equal(0, cmd.getFlagCount(), "Flag count should be 0 for empty FlagSet")
// Test with one flag
cmd.Flags.String("test1", "", "test flag 1")
assert.Equal(1, cmd.getFlagCount(), "Flag count should be 1 after adding one flag")
// Test with multiple flags
cmd.Flags.String("test2", "", "test flag 2")
cmd.Flags.Bool("test3", false, "test flag 3")
assert.Equal(3, cmd.getFlagCount(), "Flag count should match number of added flags")
// The flag package handles -h/--help specially but does not register a
// "help" flag on the FlagSet, so the count stays at the flags added above
assert.Equal(3, cmd.getFlagCount(), "Flag count should not include an implicit help flag")
}
func TestCmdDescription(t *testing.T) {
assert := assert.New(t)
cmd := &Cmd{Description: cmdDescription}
assert.Equal(cmdDescription, cmd.Description, "Cmd.Description should match set value")
}
func TestCmdExamples(t *testing.T) {
assert := assert.New(t)
examples := []string{"--cache-dir " + flagCacheDir}
cmd := &Cmd{Examples: examples}
assert.Equal(examples, cmd.Examples, "Cmd.Examples should match set value")
}
func TestCmdExamplePackages(t *testing.T) {
assert := assert.New(t)
examplePkgs := &pkgs.Packages{
pkgs.Package{Name: pkgNameNginx},
pkgs.Package{Name: pkgNameRedis},
}
cmd := &Cmd{ExamplePackages: examplePkgs}
assert.Equal(examplePkgs, cmd.ExamplePackages, "Cmd.ExamplePackages should match set value")
}
func TestCmdRun(t *testing.T) {
assert := assert.New(t)
runCalled := false
cmd := &Cmd{
Run: func(cmd *Cmd, pkgArgs *pkgs.Packages) error {
runCalled = true
return nil
},
}
err := cmd.Run(cmd, nil)
assert.True(runCalled, "Cmd.Run should be called")
assert.NoError(err, "Cmd.Run should not return error")
// Test error case
cmd = &Cmd{
Run: func(cmd *Cmd, pkgArgs *pkgs.Packages) error {
return fmt.Errorf("test error")
},
}
err = cmd.Run(cmd, nil)
assert.Error(err, "Cmd.Run should return error")
assert.Contains(err.Error(), "test error", "Error message should match expected")
}
// Cmds tests
func TestCmdsAdd(t *testing.T) {
assert := assert.New(t)
cmds := make(Cmds)
cmd := &Cmd{Name: cmdName}
err := cmds.Add(cmd)
assert.NoError(err, "Add should not return error for new command")
_, exists := cmds[cmdName]
assert.True(exists, "command should be added to Cmds map")
// Test duplicate add
err = cmds.Add(cmd)
assert.Error(err, "Add should return error for duplicate command")
assert.Contains(err.Error(), "already exists", "Error should mention duplicate")
}
func TestCmdsGetExisting(t *testing.T) {
assert := assert.New(t)
cmds := make(Cmds)
cmd := &Cmd{Name: cmdName}
cmds[cmdName] = cmd
got, ok := cmds.Get(cmdName)
assert.True(ok, "Get should return true for existing command")
assert.Equal(cmd, got, "Get should return correct command")
}
func TestCmdsGetNonExistent(t *testing.T) {
assert := assert.New(t)
cmds := make(Cmds)
_, ok := cmds.Get("nonexistent")
assert.False(ok, "Get should return false for non-existent command")
}
// Binary name tests
// binaryName is computed once at package init from os.Args[0], so mutating
// os.Args inside a test has no effect on it; assert properties of the
// derived value instead.
func TestBinaryNameHasNoPathSeparators(t *testing.T) {
if binaryName == "" {
t.Fatal("binaryName should not be empty")
}
for i := 0; i < len(binaryName); i++ {
if os.IsPathSeparator(binaryName[i]) {
t.Errorf("binaryName = %q should not contain path separators", binaryName)
}
}
}


@@ -0,0 +1,161 @@
package main
import (
"crypto/md5"
"flag"
"fmt"
"os"
"path/filepath"
"runtime"
"awalsh128.com/cache-apt-pkgs-action/src/internal/cache"
"awalsh128.com/cache-apt-pkgs-action/src/internal/logging"
"awalsh128.com/cache-apt-pkgs-action/src/internal/pkgs"
)
func createKey(cmd *Cmd, pkgArgs *pkgs.Packages) error {
key := cache.Key{
Packages: *pkgArgs,
Version: cmd.Flags.Lookup("version").Value.String(),
GlobalVersion: cmd.Flags.Lookup("global-version").Value.String(),
OsArch: cmd.Flags.Lookup("os-arch").Value.String(),
}
cacheDir := cmd.Flags.Lookup("cache-dir").Value.String()
keyText := key.Hash()
logging.Info("Created cache key text: %s", keyText)
keyTextFilepath := filepath.Join(cacheDir, "cache_key.txt")
logging.Info("Writing cache key text '%s' to '%s'", keyText, keyTextFilepath)
if err := os.WriteFile(keyTextFilepath, []byte(keyText), 0644); err != nil {
return fmt.Errorf("failed to write cache key text to %s: %w", keyTextFilepath, err)
}
logging.Info("Cache key text written")
logging.Info("Creating cache key MD5 hash from key text: %s", keyText)
hash := md5.Sum([]byte(keyText))
logging.Info("Created cache key with hash: %x", hash)
keyHashFilepath := filepath.Join(cacheDir, "cache_key.md5")
logging.Info("Writing cache key hash '%x' to '%s'", hash, keyHashFilepath)
if err := os.WriteFile(keyHashFilepath, hash[:], 0644); err != nil {
return fmt.Errorf("failed to write cache key hash to %s: %w", keyHashFilepath, err)
}
logging.Info("Cache key written")
return nil
}
func installPackages(cmd *Cmd, pkgArgs *pkgs.Packages) error {
return fmt.Errorf("installPackages not implemented")
}
func restorePackages(cmd *Cmd, pkgArgs *pkgs.Packages) error {
return fmt.Errorf("restorePackages not implemented")
}
func validatePackages(cmd *Cmd, pkgArgs *pkgs.Packages) error {
apt, err := pkgs.New()
if err != nil {
return fmt.Errorf("error initializing APT: %w", err)
}
for _, pkg := range *pkgArgs {
if _, err := apt.ValidatePackage(&pkg); err != nil {
logging.Info("invalid: %s - %v", pkg.String(), err)
} else {
logging.Info("valid: %s", pkg.String())
}
}
return nil
}
func createCmdFlags(
createKey func(cmd *Cmd, pkgArgs *pkgs.Packages) error,
install func(cmd *Cmd, pkgArgs *pkgs.Packages) error,
restore func(cmd *Cmd, pkgArgs *pkgs.Packages) error) *Cmds {
examplePackages := &pkgs.Packages{
pkgs.Package{Name: "rolldice"},
pkgs.Package{Name: "xdot", Version: "1.1-2"},
pkgs.Package{Name: "libgtk-3-dev"},
}
commands := &Cmds{}
createKeyCmd := &Cmd{
Name: "createkey",
Description: "Create a cache key based on the provided options",
Flags: flag.NewFlagSet("createkey", flag.ExitOnError),
Run: createKey,
}
createKeyCmd.Flags.String("os-arch", runtime.GOARCH,
"OS architecture to use in the cache key.\n"+
"Action may be called from different runners in a different OS. This ensures the right one is fetched")
createKeyCmd.Flags.String("cache-dir", "", "Directory that holds the cached packages")
createKeyCmd.Flags.String("version", "", "Version of the cache key to force cache invalidation")
createKeyCmd.Flags.Int(
"global-version",
0,
"Unique version to force cache invalidation globally across all action callers\n"+
"Used to fix corrupted caches or bugs from the action itself",
)
createKeyCmd.Examples = []string{
"--os-arch amd64 --cache-dir ~/cache_dir --version 1.0.0 --global-version 1",
"--os-arch x86_64 --cache-dir /tmp/cache_dir --version v2 --global-version 2",
}
createKeyCmd.ExamplePackages = examplePackages
commands.Add(createKeyCmd)
installCmd := &Cmd{
Name: "install",
Description: "Install packages and save them to the cache",
Flags: flag.NewFlagSet("install", flag.ExitOnError),
Run: install,
}
installCmd.Flags.String("cache-dir", "", "Directory that holds the cached packages")
installCmd.Examples = []string{
"--cache-dir ~/cache_dir",
"--cache-dir /tmp/cache_dir",
}
installCmd.ExamplePackages = examplePackages
commands.Add(installCmd)
restoreCmd := &Cmd{
Name: "restore",
Description: "Restore packages from the cache",
Flags: flag.NewFlagSet("restore", flag.ExitOnError),
Run: restore,
}
restoreCmd.Flags.String("cache-dir", "", "Directory that holds the cached packages")
restoreCmd.Flags.String("restore-root", "/", "Root directory to untar the cached packages to")
restoreCmd.Flags.Bool("execute-scripts", false, "Execute APT post-install scripts on restore")
restoreCmd.Examples = []string{
"--cache-dir ~/cache_dir --restore-root / --execute-scripts true",
"--cache-dir /tmp/cache_dir --restore-root /",
}
restoreCmd.ExamplePackages = examplePackages
commands.Add(restoreCmd)
validatePackagesCmd := &Cmd{
Name: "validate",
Description: "Validate package arguments",
Flags: flag.NewFlagSet("validate", flag.ExitOnError),
Run: validatePackages,
}
validatePackagesCmd.ExamplePackages = examplePackages
commands.Add(validatePackagesCmd)
return commands
}
func main() {
logging.Init("cache_apt_pkgs", true)
commands := createCmdFlags(createKey, installPackages, restorePackages)
cmd, pkgArgs := commands.Parse()
err := cmd.Run(cmd, pkgArgs)
if err != nil {
logging.Fatalf("error: %v\n", err)
}
}

src/internal/cache/io.go vendored Normal file

@@ -0,0 +1,210 @@
package cache
import (
"archive/tar"
"fmt"
"io"
"os"
"path/filepath"
)
func MkDir(dir string) error {
if dir == "" {
return fmt.Errorf("directory path cannot be empty")
}
if err := os.MkdirAll(dir, 0755); err != nil {
return fmt.Errorf("failed to create directory %s: %w", dir, err)
}
return nil
}
func WriteKey(path string, key Key) error {
// Write cache key to file; the parameter is named path to avoid
// shadowing the imported path/filepath package
if err := os.WriteFile(path, []byte(key.Hash()), 0644); err != nil {
return fmt.Errorf("failed to write cache key to %s: %w", path, err)
}
return nil
}
// validateTarInputs checks if the input parameters for tar creation are valid
func validateTarInputs(destPath string, files []string) error {
if destPath == "" {
return fmt.Errorf("destination path cannot be empty")
}
if len(files) == 0 {
return fmt.Errorf("no files provided")
}
return nil
}
// createTarWriter creates a new tar writer for the given destination path
func createTarWriter(destPath string) (*tar.Writer, *os.File, error) {
// Create parent directory for destination file if it doesn't exist
if err := os.MkdirAll(filepath.Dir(destPath), 0755); err != nil {
return nil, nil, fmt.Errorf("failed to create parent directory for %s: %w", destPath, err)
}
// Create the tar file
file, err := os.Create(destPath)
if err != nil {
return nil, nil, fmt.Errorf("failed to create destination file %s: %w", destPath, err)
}
// Create tar writer
tw := tar.NewWriter(file)
return tw, file, nil
}
// validateFileType checks if the file is a regular file or symlink
func validateFileType(info os.FileInfo, absPath string) error {
if !info.Mode().IsRegular() && info.Mode()&os.ModeSymlink == 0 {
return fmt.Errorf("file %s is not a regular file or symlink", absPath)
}
return nil
}
// createFileHeader creates a tar header for the given file info
func createFileHeader(info os.FileInfo, absPath string) (*tar.Header, error) {
header, err := tar.FileInfoHeader(info, "") // Empty link name for now
if err != nil {
return nil, fmt.Errorf("failed to create tar header for %s: %w", absPath, err)
}
// Use path relative to root for archive
header.Name = absPath[1:] // Remove leading slash
return header, nil
}
// writeRegularFile writes a regular file's contents to the tar archive
func writeRegularFile(tw *tar.Writer, absPath string) error {
srcFile, err := os.Open(absPath)
if err != nil {
return fmt.Errorf("failed to open %s: %w", absPath, err)
}
defer srcFile.Close()
if _, err := io.Copy(tw, srcFile); err != nil {
return fmt.Errorf("failed to write %s to archive: %w", absPath, err)
}
return nil
}
// getSymlinkTarget gets the absolute path of a symlink target
func getSymlinkTarget(linkTarget, absPath string) string {
if filepath.IsAbs(linkTarget) {
return linkTarget
}
return filepath.Join(filepath.Dir(absPath), linkTarget)
}
// handleSymlinkTarget handles the target file of a symlink
func handleSymlinkTarget(tw *tar.Writer, targetPath string, header *tar.Header, linkTarget string) error {
targetInfo, err := os.Stat(targetPath)
if err != nil || targetInfo.IsDir() {
return nil // Skip if target doesn't exist or is a directory
}
// Create header for target file
targetHeader, err := tar.FileInfoHeader(targetInfo, "")
if err != nil {
return fmt.Errorf("failed to create tar header for symlink target %s: %w", targetPath, err)
}
// Store with path relative to root
targetHeader.Name = targetPath[1:]
// For absolute symlinks, make the linkname relative to root too
if filepath.IsAbs(linkTarget) {
header.Linkname = linkTarget[1:]
}
// Write target header and contents
if err := tw.WriteHeader(targetHeader); err != nil {
return fmt.Errorf("failed to write tar header for symlink target %s: %w", targetPath, err)
}
return writeRegularFile(tw, targetPath)
}
// handleSymlink handles a symlink file and its target
func handleSymlink(tw *tar.Writer, absPath string, header *tar.Header) error {
// Read the target of the symlink
linkTarget, err := os.Readlink(absPath)
if err != nil {
return fmt.Errorf("failed to read symlink %s: %w", absPath, err)
}
header.Linkname = linkTarget
// Get absolute path of target and handle it
targetPath := getSymlinkTarget(linkTarget, absPath)
return handleSymlinkTarget(tw, targetPath, header, linkTarget)
}
// TarFiles creates a tar archive containing the specified files.
// Matches behavior of install_and_cache_pkgs.sh script.
//
// Parameters:
// - destPath: Path where the tar file should be created
// - files: List of absolute file paths to include in the archive
//
// The function will:
// - Archive files relative to root directory (like -C /)
// - Include only regular files and symlinks
// - Preserve file permissions and timestamps
// - Handle special characters in paths
// - Save symlinks as-is without following them
//
// Returns an error if:
// - destPath is empty or invalid
// - Any file in files list is not a regular file or symlink
// - Permission denied when reading files or writing archive
func TarFiles(destPath string, files []string) error {
if err := validateTarInputs(destPath, files); err != nil {
return err
}
tw, file, err := createTarWriter(destPath)
if err != nil {
return err
}
defer file.Close()
defer tw.Close()
// Process each file
for _, absPath := range files {
// Get file info and validate type
info, err := os.Lstat(absPath)
if err != nil {
return fmt.Errorf("failed to stat %s: %w", absPath, err)
}
if err := validateFileType(info, absPath); err != nil {
return err
}
// Create and initialize header
header, err := createFileHeader(info, absPath)
if err != nil {
return err
}
// Handle symlinks and their targets
if info.Mode()&os.ModeSymlink != 0 {
if err := handleSymlink(tw, absPath, header); err != nil {
return err
}
}
// Write the file's header
if err := tw.WriteHeader(header); err != nil {
return fmt.Errorf("failed to write tar header for %s: %w", absPath, err)
}
// Write the file's contents if it's a regular file
if info.Mode().IsRegular() {
if err := writeRegularFile(tw, absPath); err != nil {
return err
}
}
}
return nil
}

src/internal/cache/io_test.go vendored Normal file

@@ -0,0 +1,196 @@
package cache
import (
"archive/tar"
"io"
"os"
"path/filepath"
"testing"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
)
const (
content1 = "content 1"
content2 = "content 2"
)
func setupFiles(t *testing.T) (string, func()) {
assert := assert.New(t)
// Create a unique temporary directory so parallel test runs cannot collide
tempDir, err := os.MkdirTemp("", "tar_files")
assert.NoError(err, "Failed to create temp dir")
// Create test files
file1Path := filepath.Join(tempDir, "file1.txt")
err = os.WriteFile(file1Path, []byte(content1), 0644)
assert.NoError(err, "Failed to create file 1")
file2Path := filepath.Join(tempDir, "file2.txt")
err = os.WriteFile(file2Path, []byte(content2), 0644)
assert.NoError(err, "Failed to create file 2")
subDirPath := filepath.Join(tempDir, "subdir")
err = os.MkdirAll(subDirPath, 0755)
assert.NoError(err, "Failed to create subdir")
// Create a file in subdir
file3Path := filepath.Join(subDirPath, "file3.txt")
err = os.WriteFile(file3Path, []byte(content2), 0644) // Same content as file2
assert.NoError(err, "Failed to create file 3")
// Create symlinks - one relative, one absolute
symlinkPath := filepath.Join(tempDir, "link.txt")
err = os.Symlink("file1.txt", symlinkPath)
assert.NoError(err, "Failed to create relative symlink")
absSymlinkPath := filepath.Join(tempDir, "abs_link.txt")
err = os.Symlink(file3Path, absSymlinkPath)
assert.NoError(err, "Failed to create absolute symlink")
cleanup := func() {
os.RemoveAll(tempDir)
}
return tempDir, cleanup
}
func TestTarFiles(t *testing.T) {
assert := assert.New(t)
require := require.New(t)
// Setup files
sourceDir, cleanup := setupFiles(t)
defer cleanup()
// Create destination for the archive
destPath := filepath.Join(sourceDir, "test.tar")
// Files to archive (absolute paths)
files := []string{
filepath.Join(sourceDir, "file1.txt"),
filepath.Join(sourceDir, "file2.txt"),
filepath.Join(sourceDir, "link.txt"),
filepath.Join(sourceDir, "abs_link.txt"),
}
// Create the archive
err := TarFiles(destPath, files)
require.NoError(err, "TarFiles should succeed")
// Verify the archive exists
_, err = os.Stat(destPath)
assert.NoError(err, "Archive file should exist")
// Open and verify the archive contents
file, err := os.Open(destPath)
require.NoError(err, "Should be able to open archive")
defer file.Close()
// Create tar reader
tr := tar.NewReader(file)
// Map to track found files
foundFiles := make(map[string]bool)
foundContent := make(map[string]string)
foundLinks := make(map[string]string)
// Read all files from the archive
for {
header, err := tr.Next()
if err == io.EOF {
break
}
require.NoError(err, "Should be able to read tar header")
// When checking files, reconstruct the absolute path
absPath := "/" + header.Name
foundFiles[absPath] = true
if header.Typeflag == tar.TypeSymlink {
foundLinks[filepath.Base(header.Name)] = header.Linkname
continue
}
if header.Typeflag == tar.TypeReg {
content, err := io.ReadAll(tr)
require.NoError(err, "Should be able to read file content")
foundContent[filepath.Base(header.Name)] = string(content)
}
}
// Verify all files were archived
for _, f := range files {
assert.True(foundFiles[f], "Archive should contain %s", f)
}
// Verify symlink targets are present
file3AbsPath := filepath.Join(sourceDir, "subdir/file3.txt")
assert.True(foundFiles[file3AbsPath], "Archive should contain symlink target %s", file3AbsPath)
// Get base name of file3 for content check
file3Base := filepath.Base(file3AbsPath)
// Verify file contents
assert.Equal(content1, foundContent["file1.txt"], "file1.txt should have correct content")
assert.Equal(content2, foundContent["file2.txt"], "file2.txt should have correct content")
assert.Equal(content2, foundContent[file3Base], "file3.txt should have correct content")
// Verify symlinks
assert.Equal("file1.txt", foundLinks["link.txt"], "link.txt should point to file1.txt")
assert.Equal(file3AbsPath[1:], foundLinks["abs_link.txt"], "abs_link.txt should point to file3.txt with correct path")
}
func TestTarFilesErrors(t *testing.T) {
assert := assert.New(t)
// Setup files
sourceDir, cleanup := setupFiles(t)
defer cleanup()
file1 := filepath.Join(sourceDir, "file1.txt")
tests := []struct {
name string
destPath string
files []string
wantErr bool
}{
{
name: "Empty destination path",
destPath: "",
files: []string{file1},
wantErr: true,
},
{
name: "Empty files list",
destPath: "test.tar",
files: []string{},
wantErr: true,
},
{
name: "Non-existent file",
destPath: "test.tar",
files: []string{filepath.Join(sourceDir, "nonexistent.txt")},
wantErr: true,
},
{
name: "Directory in files list",
destPath: "test.tar",
files: []string{sourceDir},
wantErr: true,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
err := TarFiles(tt.destPath, tt.files)
if tt.wantErr {
assert.Error(err, "TarFiles should return error")
} else {
assert.NoError(err, "TarFiles should not return error")
}
})
}
}

src/internal/cache/key.go vendored Normal file

@ -0,0 +1,30 @@
package cache
import (
"crypto/md5"
"fmt"
"sort"
"awalsh128.com/cache-apt-pkgs-action/src/internal/pkgs"
)
// Key represents a cache key based on package list and version information
type Key struct {
Packages pkgs.Packages
Version string
GlobalVersion string
OsArch string
}
// Hash returns an MD5 hash of the key's contents, with packages sorted by name and version so that package order does not affect the result
func (k *Key) Hash() string {
// Sort packages in place by Name, then by Version
sort.Slice(k.Packages, func(i, j int) bool {
if k.Packages[i].Name != k.Packages[j].Name {
return k.Packages[i].Name < k.Packages[j].Name
}
return k.Packages[i].Version < k.Packages[j].Version
})
// Hash the canonical string form of the sorted packages and versions
payload := fmt.Sprintf("%s @ '%s' '%s' '%s'", k.Packages.String(), k.Version, k.GlobalVersion, k.OsArch)
return fmt.Sprintf("%x", md5.Sum([]byte(payload)))
}
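The sort-then-hash idea can be sketched standalone; this uses a simplified `pkg` type standing in for `pkgs.Package` and is illustration only, not the repo's implementation:

```go
package main

import (
	"crypto/md5"
	"fmt"
	"sort"
)

// pkg mirrors the shape of pkgs.Package for illustration only.
type pkg struct{ Name, Version string }

// keyHash sorts packages by name then version before hashing, so the same
// set of packages always yields the same key regardless of input order.
func keyHash(packages []pkg, version, globalVersion, osArch string) string {
	sort.Slice(packages, func(i, j int) bool {
		if packages[i].Name != packages[j].Name {
			return packages[i].Name < packages[j].Name
		}
		return packages[i].Version < packages[j].Version
	})
	parts := make([]string, len(packages))
	for i, p := range packages {
		parts[i] = p.Name + "=" + p.Version
	}
	payload := fmt.Sprintf("%v @ '%s' '%s' '%s'", parts, version, globalVersion, osArch)
	return fmt.Sprintf("%x", md5.Sum([]byte(payload)))
}

func main() {
	a := keyHash([]pkg{{"pkg2", "1"}, {"pkg1", "1"}}, "v1", "g1", "amd64")
	b := keyHash([]pkg{{"pkg1", "1"}, {"pkg2", "1"}}, "v1", "g1", "amd64")
	fmt.Println(a == b) // prints "true": order does not matter
}
```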

src/internal/cache/key_test.go vendored Normal file

@ -0,0 +1,149 @@
package cache
import (
"testing"
"awalsh128.com/cache-apt-pkgs-action/src/internal/pkgs"
"github.com/stretchr/testify/assert"
)
const (
// Package names
pkg1 = "pkg1"
pkg2 = "pkg2"
// Versions
version1 = "1.0.0"
version2 = "2.0.0"
version3 = "3.0.0"
// Architectures
archX86 = "amd64"
archArm = "arm64"
)
func TestKeyHashEmptyKey(t *testing.T) {
assert := assert.New(t)
key := &Key{}
hash := key.Hash()
assert.NotEmpty(hash, "Hash should not be empty even for empty key")
}
func TestKeyHashWithPackages(t *testing.T) {
assert := assert.New(t)
key := &Key{
Packages: pkgs.Packages{
pkgs.Package{Name: pkg1},
pkgs.Package{Name: pkg2},
},
}
hash1 := key.Hash()
// Same packages in different order should produce same hash
key2 := &Key{
Packages: pkgs.Packages{
pkgs.Package{Name: pkg2},
pkgs.Package{Name: pkg1},
},
}
hash2 := key2.Hash()
assert.Equal(hash1, hash2, "Hash should be same for same packages in different order")
// Test with versions
key3 := &Key{
Packages: pkgs.Packages{
pkgs.Package{Name: pkg1, Version: version2},
pkgs.Package{Name: pkg1, Version: version1},
pkgs.Package{Name: pkg2, Version: version1},
},
}
hash3 := key3.Hash()
key4 := &Key{
Packages: pkgs.Packages{
pkgs.Package{Name: pkg2, Version: version1},
pkgs.Package{Name: pkg1, Version: version1},
pkgs.Package{Name: pkg1, Version: version2},
},
}
hash4 := key4.Hash()
assert.Equal(hash3, hash4, "Hash should be same for same packages and versions in different order")
}
func TestKeyHashWithVersion(t *testing.T) {
assert := assert.New(t)
key := &Key{
Packages: pkgs.Packages{pkgs.Package{Name: pkg1}, pkgs.Package{Name: pkg2}},
Version: version1,
}
hash1 := key.Hash()
// Same package with different version should produce different hash
key2 := &Key{
Packages: pkgs.Packages{pkgs.Package{Name: pkg1}},
Version: version2,
}
hash2 := key2.Hash()
assert.NotEqual(hash1, hash2, "Hash should be different for different versions")
}
func TestKeyHashWithGlobalVersion(t *testing.T) {
assert := assert.New(t)
key := &Key{
Packages: pkgs.Packages{pkgs.Package{Name: pkg1}},
GlobalVersion: version1,
}
hash1 := key.Hash()
// Same package with different global version should produce different hash
key2 := &Key{
Packages: pkgs.Packages{{Name: pkg1}},
GlobalVersion: version2,
}
hash2 := key2.Hash()
assert.NotEqual(hash1, hash2, "Hash should be different for different global versions")
}
func TestKeyHashWithOsArch(t *testing.T) {
assert := assert.New(t)
key := &Key{
Packages: pkgs.Packages{{Name: pkg1}},
OsArch: archX86,
}
hash1 := key.Hash()
// Same package with different OS architecture should produce different hash
key2 := &Key{
Packages: pkgs.Packages{{Name: pkg1}},
OsArch: archArm,
}
hash2 := key2.Hash()
assert.NotEqual(hash1, hash2, "Hash should be different for different OS architectures")
}
func TestKeyHashWithAll(t *testing.T) {
assert := assert.New(t)
key := &Key{
Packages: pkgs.Packages{{Name: pkg1}, {Name: pkg2}},
Version: version1,
GlobalVersion: version2,
OsArch: archX86,
}
hash1 := key.Hash()
// Same values in different order should produce same hash
key2 := &Key{
Packages: pkgs.Packages{{Name: pkg2}, {Name: pkg1}},
Version: version1,
GlobalVersion: version2,
OsArch: archX86,
}
hash2 := key2.Hash()
assert.Equal(hash1, hash2, "Hash should be same for same values")
}

src/internal/cache/manifest.go vendored Normal file

@ -0,0 +1,66 @@
package cache
import (
"encoding/json"
"fmt"
"os"
"strings"
"time"
)
type Manifest struct {
CacheKey Key
FilePaths []string
LastModified time.Time
}
func (m *Manifest) FilePathsText() string {
var lines []string
lines = append(lines, m.FilePaths...)
return strings.Join(lines, "\n")
}
func (m *Manifest) Json() (string, error) {
content, err := json.MarshalIndent(m, "", " ")
if err != nil {
return "", fmt.Errorf("failed to marshal manifest to JSON: %w", err)
}
return string(content), nil
}
func ReadJson(filepath string) (*Manifest, error) {
content, err := os.ReadFile(filepath)
if err != nil {
return nil, fmt.Errorf("failed to read manifest at %s: %w", filepath, err)
}
var manifest Manifest
if err := json.Unmarshal(content, &manifest); err != nil {
return nil, fmt.Errorf("failed to parse manifest JSON: %w", err)
}
return &manifest, nil
}
func Write(filepath string, manifest *Manifest) error {
// Serialize first so a marshal failure does not leave an empty file behind.
content, err := manifest.Json()
if err != nil {
return fmt.Errorf("failed to serialize manifest to %s: %w", filepath, err)
}
if err := os.WriteFile(filepath, []byte(content), 0644); err != nil {
return fmt.Errorf("failed to write manifest to %s: %w", filepath, err)
}
fmt.Printf("Manifest written to %s\n", filepath)
return nil
}
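Since the manifest is plain JSON, a round trip through `encoding/json` preserves it. A standalone sketch with a simplified struct (illustration only, not the repo's `cache.Manifest`):

```go
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// manifest mirrors the shape of cache.Manifest for illustration only.
type manifest struct {
	FilePaths    []string
	LastModified time.Time
}

// roundTrip serializes with MarshalIndent (as the package does) and
// parses the result back into a fresh struct.
func roundTrip(m manifest) (manifest, error) {
	data, err := json.MarshalIndent(m, "", "  ")
	if err != nil {
		return manifest{}, err
	}
	var got manifest
	err = json.Unmarshal(data, &got)
	return got, err
}

func main() {
	m := manifest{FilePaths: []string{"/usr/bin/foo", "/usr/lib/libfoo.so"}}
	got, err := roundTrip(m)
	if err != nil {
		panic(err)
	}
	fmt.Println(got.FilePaths) // the file paths survive the round trip
}
```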

src/internal/cache/manifest_test.go vendored Normal file

@ -0,0 +1,65 @@
package cache
import (
"testing"
"github.com/stretchr/testify/assert"
)
func TestFilePathsText(t *testing.T) {
tests := []struct {
name string
manifest *Manifest
expected string
description string
}{
{
name: "empty_paths",
manifest: &Manifest{
FilePaths: []string{},
},
expected: "",
description: "Empty FilePaths should return empty string",
},
{
name: "single_path",
manifest: &Manifest{
FilePaths: []string{"/path/to/file1"},
},
expected: "/path/to/file1",
description: "Single file path should be returned as is",
},
{
name: "multiple_paths",
manifest: &Manifest{
FilePaths: []string{
"/path/to/file1",
"/path/to/file2",
"/path/to/file3",
},
},
expected: "/path/to/file1\n/path/to/file2\n/path/to/file3",
description: "Multiple file paths should be joined with newlines",
},
{
name: "paths_with_special_chars",
manifest: &Manifest{
FilePaths: []string{
"/path with spaces/file1",
"/path/with/tabs\t/file2",
"/path/with/newlines\n/file3",
},
},
expected: "/path with spaces/file1\n/path/with/tabs\t/file2\n/path/with/newlines\n/file3",
description: "Paths with special characters should be preserved",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
assert := assert.New(t)
result := tt.manifest.FilePathsText()
assert.Equal(tt.expected, result, tt.description)
})
}
}


@ -1,91 +0,0 @@
package cmdtesting
import (
"os"
"os/exec"
"strings"
"testing"
"awalsh128.com/cache-apt-pkgs-action/src/internal/common"
)
const binaryName = "apt_query"
type CmdTesting struct {
*testing.T
createReplayLogs bool
replayFilename string
}
func New(t *testing.T, createReplayLogs bool) *CmdTesting {
replayFilename := "testlogs/" + strings.ToLower(t.Name()) + ".log"
if createReplayLogs {
os.Remove(replayFilename)
os.Remove(binaryName + ".log")
}
return &CmdTesting{t, createReplayLogs, replayFilename}
}
type RunResult struct {
Testing *CmdTesting
CombinedOut string
Err error
}
func TestMain(m *testing.M) {
cmd := exec.Command("go", "build")
out, err := cmd.CombinedOutput()
if err != nil {
panic(string(out))
}
os.Exit(m.Run())
}
func (t *CmdTesting) Run(command string, pkgNames ...string) RunResult {
replayfile := "testlogs/" + strings.ToLower(t.Name()) + ".log"
flags := []string{"-debug=true"}
if !t.createReplayLogs {
flags = append(flags, "-replayfile="+replayfile)
}
cmd := exec.Command("./"+binaryName, append(append(flags, command), pkgNames...)...)
combinedOut, err := cmd.CombinedOutput()
if t.createReplayLogs {
err := common.AppendFile(binaryName+".log", t.replayFilename)
if err != nil {
t.T.Fatalf("Error encountered appending log file.\n%s", err.Error())
}
}
return RunResult{Testing: t, CombinedOut: string(combinedOut), Err: err}
}
func (r *RunResult) ExpectSuccessfulOut(expected string) {
if r.Testing.createReplayLogs {
r.Testing.Log("Skipping test while creating replay logs.")
return
}
if r.Err != nil {
r.Testing.Errorf("Error running command: %v\n%s", r.Err, r.CombinedOut)
return
}
fullExpected := expected + "\n" // Output will always have a end of output newline.
if r.CombinedOut != fullExpected {
r.Testing.Errorf("Unexpected combined std[err,out] found.\nExpected:\n'%s'\nActual:\n'%s'", fullExpected, r.CombinedOut)
}
}
func (r *RunResult) ExpectError(expectedCombinedOut string) {
if r.Testing.createReplayLogs {
r.Testing.Log("Skipping test while creating replay logs.")
return
}
fullExpectedCombinedOut := expectedCombinedOut + "\n" // Output will always have a end of output newline.
if r.CombinedOut != fullExpectedCombinedOut {
r.Testing.Errorf("Unexpected combined std[err,out] found.\nExpected:\n'%s'\nActual:\n'%s'", fullExpectedCombinedOut, r.CombinedOut)
}
}


@ -1,123 +0,0 @@
package common
import (
"errors"
"fmt"
"sort"
"strings"
"awalsh128.com/cache-apt-pkgs-action/src/internal/exec"
"awalsh128.com/cache-apt-pkgs-action/src/internal/logging"
)
// An APT package name and version representation.
type AptPackage struct {
Name string
Version string
}
type AptPackages []AptPackage
// Serialize the APT packages into lines of <name>=<version>.
func (ps AptPackages) Serialize() string {
tokens := []string{}
for _, p := range ps {
tokens = append(tokens, p.Name+"="+p.Version)
}
return strings.Join(tokens, " ")
}
func isErrLine(line string) bool {
return strings.HasPrefix(line, "E: ") || strings.HasPrefix(line, "N: ")
}
// Resolves virtual packages names to their concrete one.
func getNonVirtualPackage(executor exec.Executor, name string) (pkg *AptPackage, err error) {
execution := executor.Exec("bash", "-c", fmt.Sprintf("apt-cache showpkg %s | grep -A 1 \"Reverse Provides\" | tail -1", name))
err = execution.Error()
if err != nil {
logging.Fatal(err)
return pkg, err
}
if isErrLine(execution.CombinedOut) {
return pkg, execution.Error()
}
splitLine := GetSplitLine(execution.CombinedOut, " ", 3)
if len(splitLine.Words) < 2 {
return pkg, fmt.Errorf("unable to parse space delimited line's package name and version from apt-cache showpkg output below:\n%s", execution.CombinedOut)
}
return &AptPackage{Name: splitLine.Words[0], Version: splitLine.Words[1]}, nil
}
func getPackage(executor exec.Executor, paragraph string) (pkg *AptPackage, err error) {
errMsgs := []string{}
for _, splitLine := range GetSplitLines(paragraph, ":", 2) {
switch splitLine.Words[0] {
case "Package":
// Initialize since this will provide the first struct value if present.
pkg = &AptPackage{}
pkg.Name = splitLine.Words[1]
case "Version":
pkg.Version = splitLine.Words[1]
case "N":
// e.g. Can't select versions from package 'libvips' as it is purely virtual
if strings.Contains(splitLine.Words[1], "as it is purely virtual") {
return getNonVirtualPackage(executor, GetSplitLine(splitLine.Words[1], "'", 4).Words[2])
}
if strings.HasPrefix(splitLine.Words[1], "Unable to locate package") && !ArrContainsString(errMsgs, splitLine.Line) {
errMsgs = append(errMsgs, splitLine.Line)
}
case "E":
if !ArrContainsString(errMsgs, splitLine.Line) {
errMsgs = append(errMsgs, splitLine.Line)
}
}
}
if len(errMsgs) == 0 {
return pkg, nil
}
return pkg, errors.New(strings.Join(errMsgs, "\n"))
}
// Gets the APT based packages as a sorted by package name list (normalized).
func GetAptPackages(executor exec.Executor, names []string) (AptPackages, error) {
prefixArgs := []string{"--quiet=0", "--no-all-versions", "show"}
execution := executor.Exec("apt-cache", append(prefixArgs, names...)...)
pkgs := []AptPackage{}
err := execution.Error()
if err != nil {
logging.Fatal(err)
return pkgs, err
}
errMsgs := []string{}
for _, paragraph := range strings.Split(execution.CombinedOut, "\n\n") {
trimmed := strings.TrimSpace(paragraph)
if trimmed == "" {
continue
}
pkg, err := getPackage(executor, trimmed)
if err != nil {
errMsgs = append(errMsgs, err.Error())
} else if pkg != nil { // Ignore cases where no package parsed and no errors occurred.
pkgs = append(pkgs, *pkg)
}
}
if len(errMsgs) > 0 {
errMsgs = append(errMsgs, strings.Join(errMsgs, "\n"))
}
sort.Slice(pkgs, func(i, j int) bool {
return pkgs[i].Name < pkgs[j].Name
})
if len(errMsgs) > 0 {
return pkgs, errors.New(strings.Join(errMsgs, "\n"))
}
return pkgs, nil
}


@ -1,85 +0,0 @@
package common
import (
"io"
"os"
"path/filepath"
)
func AppendFile(source string, destination string) error {
err := createDirectoryIfNotPresent(filepath.Dir(destination))
if err != nil {
return err
}
in, err := os.Open(source)
if err != nil {
return err
}
defer in.Close()
out, err := os.OpenFile(destination, os.O_APPEND|os.O_WRONLY|os.O_CREATE, 0644)
if err != nil {
return err
}
defer out.Close()
data, err := io.ReadAll(in)
if err != nil {
return err
}
_, err = out.Write(data)
if err != nil {
return err
}
return nil
}
func CopyFile(source string, destination string) error {
err := createDirectoryIfNotPresent(filepath.Dir(destination))
if err != nil {
return err
}
in, err := os.Open(source)
if err != nil {
return err
}
defer in.Close()
out, err := os.Create(destination)
if err != nil {
return err
}
defer out.Close()
data, err := io.ReadAll(in)
if err != nil {
return err
}
_, err = out.Write(data)
if err != nil {
return err
}
return nil
}
func MoveFile(source string, destination string) error {
err := createDirectoryIfNotPresent(filepath.Dir(destination))
if err != nil {
return err
}
return os.Rename(source, destination)
}
func createDirectoryIfNotPresent(path string) error {
if _, err := os.Stat(path); os.IsNotExist(err) {
err := os.MkdirAll(path, 0755)
if err != nil {
return err
}
}
return nil
}


@ -1,44 +1,94 @@
package common
import (
"bytes"
"os/exec"
"sort"
"strings"
)
// Checks if an exact string is in an array of strings.
func ArrContainsString(arr []string, element string) bool {
for _, x := range arr {
if x == element {
// ExecCommand executes a command and returns its output
var ExecCommand = func(command string, args ...string) (string, error) {
cmd := exec.Command(command, args...)
var stdout bytes.Buffer
cmd.Stdout = &stdout
cmd.Stderr = &stdout
err := cmd.Run()
return strings.TrimSpace(stdout.String()), err
}
// SortAndJoin sorts a slice of strings and joins them with the specified separator.
func SortAndJoin(strs []string, sep string) string {
if len(strs) == 0 {
return ""
}
sorted := make([]string, len(strs))
copy(sorted, strs)
sort.Strings(sorted)
return strings.Join(sorted, sep)
}
// SplitAndTrim splits a string by the given separator and trims whitespace from each part.
// Empty strings are removed from the result.
func SplitAndTrim(s, sep string) []string {
if s == "" {
return nil
}
parts := strings.Split(s, sep)
result := make([]string, 0, len(parts))
for _, part := range parts {
part = strings.TrimSpace(part)
if part != "" {
result = append(result, part)
}
}
return result
}
// ParseKeyValue parses a string in the format "key=value" and returns the key and value.
// If no separator is found, returns the entire string as the key and an empty value.
func ParseKeyValue(s, sep string) (key, value string) {
parts := strings.SplitN(s, sep, 2)
key = strings.TrimSpace(parts[0])
if len(parts) > 1 {
value = strings.TrimSpace(parts[1])
}
return key, value
}
// ContainsAny returns true if any of the substrings are found in s.
func ContainsAny(s string, substrings ...string) bool {
for _, sub := range substrings {
if strings.Contains(s, sub) {
return true
}
}
return false
}
// A line that has been split into words.
type SplitLine struct {
Line string // The original line.
Words []string // The split words in the line.
// EqualFold reports whether s and t, interpreted as UTF-8 strings,
// are equal under simple Unicode case-folding, which is a more general
// form of case-insensitivity.
func EqualFold(s, t string) bool {
return strings.EqualFold(s, t)
}
// Splits a line into words by the delimiter and max number of delimitation.
func GetSplitLine(line string, delimiter string, numWords int) SplitLine {
words := strings.SplitN(line, delimiter, numWords)
trimmedWords := make([]string, len(words))
for i, word := range words {
trimmedWords[i] = strings.TrimSpace(word)
// TrimPrefixCaseInsensitive removes the provided prefix from s if it exists,
// ignoring case. If s doesn't start with prefix, s is returned unchanged.
func TrimPrefixCaseInsensitive(s, prefix string) string {
if strings.HasPrefix(strings.ToLower(s), strings.ToLower(prefix)) {
return s[len(prefix):]
}
return SplitLine{line, trimmedWords}
return s
}
// Splits a paragraph into lines by newline and then splits each line into words specified by the delimiter and max number of delimitation.
func GetSplitLines(paragraph string, delimiter string, numWords int) []SplitLine {
lines := []SplitLine{}
for _, line := range strings.Split(paragraph, "\n") {
trimmed := strings.TrimSpace(line)
if trimmed == "" {
continue
// RemoveEmpty removes empty strings from a slice of strings.
func RemoveEmpty(strs []string) []string {
result := make([]string, 0, len(strs))
for _, s := range strs {
if s != "" {
result = append(result, s)
}
lines = append(lines, GetSplitLine(trimmed, delimiter, numWords))
}
return lines
return result
}
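The new string helpers compose naturally for package-list parsing. Copied verbatim from the helpers above so the sketch runs standalone:

```go
package main

import (
	"fmt"
	"strings"
)

// SplitAndTrim splits a string by the given separator and trims whitespace
// from each part; empty strings are removed from the result.
func SplitAndTrim(s, sep string) []string {
	if s == "" {
		return nil
	}
	parts := strings.Split(s, sep)
	result := make([]string, 0, len(parts))
	for _, part := range parts {
		part = strings.TrimSpace(part)
		if part != "" {
			result = append(result, part)
		}
	}
	return result
}

// ParseKeyValue parses "key=value"; with no separator the whole string is
// the key and the value is empty.
func ParseKeyValue(s, sep string) (key, value string) {
	parts := strings.SplitN(s, sep, 2)
	key = strings.TrimSpace(parts[0])
	if len(parts) > 1 {
		value = strings.TrimSpace(parts[1])
	}
	return key, value
}

func main() {
	fmt.Println(SplitAndTrim(" gcc , g++ ,, make ", ",")) // prints "[gcc g++ make]"
	k, v := ParseKeyValue("gcc=4.8.5", "=")
	fmt.Println(k, v) // prints "gcc 4.8.5"
}
```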


@ -0,0 +1,297 @@
package common
import (
"testing"
"github.com/stretchr/testify/assert"
)
func TestExecCommand(t *testing.T) {
oldExecCommand := ExecCommand
defer func() {
ExecCommand = oldExecCommand
}()
tests := []struct {
name string
command string
args []string
output string
wantErr bool
mockFunc func(string, ...string) (string, error)
}{
{
name: "simple command",
command: "echo",
args: []string{"hello"},
output: "hello",
mockFunc: func(cmd string, args ...string) (string, error) {
assert.Equal(t, "echo", cmd)
assert.Equal(t, []string{"hello"}, args)
return "hello", nil
},
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
ExecCommand = tt.mockFunc
got, err := ExecCommand(tt.command, tt.args...)
if tt.wantErr {
assert.Error(t, err)
return
}
assert.NoError(t, err)
assert.Equal(t, tt.output, got)
})
}
}
func TestSortAndJoin(t *testing.T) {
tests := []struct {
name string
strs []string
sep string
expected string
}{
{
name: "empty slice",
strs: []string{},
sep: ",",
expected: "",
},
{
name: "single item",
strs: []string{"one"},
sep: ",",
expected: "one",
},
{
name: "multiple items",
strs: []string{"c", "a", "b"},
sep: ",",
expected: "a,b,c",
},
{
name: "custom separator",
strs: []string{"c", "a", "b"},
sep: "|",
expected: "a|b|c",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
assert := assert.New(t)
result := SortAndJoin(tt.strs, tt.sep)
assert.Equal(tt.expected, result)
})
}
}
func TestSplitAndTrim(t *testing.T) {
tests := []struct {
name string
input string
sep string
expected []string
}{
{
name: "empty string",
input: "",
sep: ",",
expected: nil,
},
{
name: "single item",
input: "one",
sep: ",",
expected: []string{"one"},
},
{
name: "multiple items with whitespace",
input: " a , b , c ",
sep: ",",
expected: []string{"a", "b", "c"},
},
{
name: "empty items removed",
input: "a,,b, ,c",
sep: ",",
expected: []string{"a", "b", "c"},
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
assert := assert.New(t)
result := SplitAndTrim(tt.input, tt.sep)
assert.Equal(tt.expected, result)
})
}
}
func TestParseKeyValue(t *testing.T) {
tests := []struct {
name string
input string
sep string
expectedKey string
expectedValue string
}{
{
name: "no separator",
input: "key",
sep: "=",
expectedKey: "key",
expectedValue: "",
},
{
name: "with separator",
input: "key=value",
sep: "=",
expectedKey: "key",
expectedValue: "value",
},
{
name: "with whitespace",
input: " key = value ",
sep: "=",
expectedKey: "key",
expectedValue: "value",
},
{
name: "custom separator",
input: "key:value",
sep: ":",
expectedKey: "key",
expectedValue: "value",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
assert := assert.New(t)
key, value := ParseKeyValue(tt.input, tt.sep)
assert.Equal(tt.expectedKey, key)
assert.Equal(tt.expectedValue, value)
})
}
}
func TestContainsAny(t *testing.T) {
tests := []struct {
name string
s string
substrings []string
expected bool
}{
{
name: "empty string and substrings",
s: "",
substrings: []string{},
expected: false,
},
{
name: "no matches",
s: "hello world",
substrings: []string{"foo", "bar"},
expected: false,
},
{
name: "single match",
s: "hello world",
substrings: []string{"hello", "foo"},
expected: true,
},
{
name: "multiple matches",
s: "hello world",
substrings: []string{"hello", "world"},
expected: true,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
assert := assert.New(t)
result := ContainsAny(tt.s, tt.substrings...)
assert.Equal(tt.expected, result)
})
}
}
func TestTrimPrefixCaseInsensitive(t *testing.T) {
tests := []struct {
name string
s string
prefix string
expected string
}{
{
name: "exact match",
s: "prefixText",
prefix: "prefix",
expected: "Text",
},
{
name: "case difference",
s: "PREFIXText",
prefix: "prefix",
expected: "Text",
},
{
name: "no match",
s: "Text",
prefix: "prefix",
expected: "Text",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
assert := assert.New(t)
result := TrimPrefixCaseInsensitive(tt.s, tt.prefix)
assert.Equal(tt.expected, result)
})
}
}
func TestRemoveEmpty(t *testing.T) {
tests := []struct {
name string
input []string
expected []string
}{
{
name: "empty slice",
input: []string{},
expected: []string{},
},
{
name: "no empty strings",
input: []string{"a", "b", "c"},
expected: []string{"a", "b", "c"},
},
{
name: "with empty strings",
input: []string{"a", "", "b", "", "c"},
expected: []string{"a", "b", "c"},
},
{
name: "all empty strings",
input: []string{"", "", ""},
expected: []string{},
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
assert := assert.New(t)
result := RemoveEmpty(tt.input)
assert.Equal(tt.expected, result)
})
}
}


@ -1,37 +0,0 @@
package exec
import (
"fmt"
"os/exec"
"strings"
"awalsh128.com/cache-apt-pkgs-action/src/internal/logging"
)
// An executor that proxies command executions from the OS.
//
// NOTE: Extra abstraction layer needed for testing and replay.
type BinExecutor struct{}
func (c *BinExecutor) Exec(name string, arg ...string) *Execution {
cmd := exec.Command(name, arg...)
out, err := cmd.CombinedOutput()
if err != nil {
logging.Fatal(err)
}
execution := &Execution{
Cmd: name + " " + strings.Join(arg, " "),
CombinedOut: string(out),
ExitCode: cmd.ProcessState.ExitCode(),
}
logging.DebugLazy(func() string {
return fmt.Sprintf("EXECUTION-OBJ-START\n%s\nEXECUTION-OBJ-END", execution.Serialize())
})
if err != nil {
logging.Fatal(execution.Error())
}
return execution
}


@ -1,49 +0,0 @@
package exec
import (
"encoding/json"
"fmt"
"awalsh128.com/cache-apt-pkgs-action/src/internal/logging"
)
type Executor interface {
// Executes a command and either returns the output or exits the programs and writes the output (including error) to STDERR.
Exec(name string, arg ...string) *Execution
}
type Execution struct {
Cmd string
CombinedOut string
ExitCode int
}
// Gets the error, if the command ran with a non-zero exit code.
func (e *Execution) Error() error {
if e.ExitCode == 0 {
return nil
}
return fmt.Errorf(
"Error encountered running %s\nExited with status code %d; see combined std[out,err] below:\n%s",
e.Cmd,
e.ExitCode,
e.CombinedOut,
)
}
func DeserializeExecution(payload string) *Execution {
var execution Execution
err := json.Unmarshal([]byte(payload), &execution)
if err != nil {
logging.Fatalf("Error encountered deserializing Execution object.\n%s", err)
}
return &execution
}
func (e *Execution) Serialize() string {
bytes, err := json.MarshalIndent(e, "", " ")
if err != nil {
logging.Fatalf("Error encountered serializing Execution object.\n%s", err)
}
return string(bytes)
}


@ -1,74 +0,0 @@
package exec
import (
"bufio"
"os"
"strings"
"awalsh128.com/cache-apt-pkgs-action/src/internal/logging"
)
// An executor that replays execution results from a recorded result.
type ReplayExecutor struct {
logFilepath string
cmdExecs map[string]*Execution
}
func NewReplayExecutor(logFilepath string) *ReplayExecutor {
file, err := os.Open(logFilepath)
if err != nil {
logging.Fatal(err)
}
defer file.Close()
scanner := bufio.NewScanner(file)
cmdExecs := make(map[string]*Execution)
for scanner.Scan() {
line := scanner.Text()
if strings.Contains(line, "EXECUTION-OBJ-START") {
payload := ""
for scanner.Scan() {
line = scanner.Text()
if strings.Contains(line, "EXECUTION-OBJ-END") {
execution := DeserializeExecution(payload)
cmdExecs[execution.Cmd] = execution
break
} else {
payload += line + "\n"
}
}
}
}
if err := scanner.Err(); err != nil {
logging.Fatal(err)
}
return &ReplayExecutor{logFilepath, cmdExecs}
}
func (e *ReplayExecutor) getCmds() []string {
cmds := []string{}
for cmd := range e.cmdExecs {
cmds = append(cmds, cmd)
}
return cmds
}
func (e *ReplayExecutor) Exec(name string, arg ...string) *Execution {
cmd := name + " " + strings.Join(arg, " ")
value, ok := e.cmdExecs[cmd]
if !ok {
var available string
if len(e.getCmds()) > 0 {
available = "\n" + strings.Join(e.getCmds(), "\n")
} else {
available = " NONE"
}
logging.Fatalf(
"Unable to replay command '%s'.\n"+
"No command found in the debug log; available commands:%s", cmd, available)
}
return value
}


@ -1,7 +1,7 @@
package logging
import (
"fmt"
"io"
"log"
"os"
"path/filepath"
@ -11,6 +11,7 @@ type Logger struct {
wrapped *log.Logger
Filename string
Debug bool
file *os.File
}
var logger *Logger
@ -22,13 +23,16 @@ func Init(filename string, debug bool) *Logger {
file, err := os.OpenFile(LogFilepath, os.O_CREATE|os.O_WRONLY, 0644)
if err != nil {
log.Fatal(err)
os.Exit(2)
}
cwd, _ := os.Getwd()
logger = &Logger{
wrapped: log.New(file, "", log.LstdFlags),
// Logs to both stderr and file.
// Stderr is used to act as a sidechannel of information and stay separate from the actual outputs of the program.
wrapped: log.New(io.MultiWriter(file, os.Stderr), "", log.LstdFlags),
Filename: filepath.Join(cwd, file.Name()),
Debug: debug,
file: file,
}
Debug("Debug log created at %s", logger.Filename)
return logger
@ -46,12 +50,14 @@ func Debug(format string, a ...any) {
}
}
func Info(format string, a ...any) {
logger.wrapped.Printf(format+"\n", a...)
}
func Fatal(err error) {
fmt.Fprintf(os.Stderr, "%s", err.Error())
logger.wrapped.Fatal(err)
}
func Fatalf(format string, a ...any) {
fmt.Fprintf(os.Stderr, format+"\n", a...)
logger.wrapped.Fatalf(format, a...)
logger.wrapped.Fatalf(format+"\n", a...)
}

src/internal/pkgs/pkgs.go Normal file

@ -0,0 +1,76 @@
package pkgs
import (
"fmt"
"strings"
"github.com/bluet/syspkg"
"github.com/bluet/syspkg/manager"
)
// Package represents a package with its version information
type Package struct {
Name string
Version string
}
// String returns a string representation of a package in the format "name" or "name=version"
func (p Package) String() string {
if p.Version != "" {
return fmt.Sprintf("%s=%s", p.Name, p.Version)
}
return p.Name
}
type Packages []Package
func ParsePackageArgs(value []string) *Packages {
var pkgs Packages
for _, val := range value {
parts := strings.SplitN(val, "=", 2)
if len(parts) == 1 {
pkgs = append(pkgs, Package{Name: parts[0]})
continue
}
pkgs = append(pkgs, Package{Name: parts[0], Version: parts[1]})
}
return &pkgs
}
// String returns a string representation of Packages
func (p *Packages) String() string {
var parts []string
for _, arg := range *p {
parts = append(parts, arg.String())
}
return strings.Join(parts, " ")
}
type Apt struct {
Manager syspkg.PackageManager
}
func New() (*Apt, error) {
registry, err := syspkg.New(syspkg.IncludeOptions{AptFast: true})
if err != nil {
return nil, fmt.Errorf("error initializing SysPkg: %w", err)
}
// Get the apt-fast package manager, erroring if it is not available
aptManager, err := registry.GetPackageManager("apt-fast")
if err != nil {
return nil, fmt.Errorf("apt-fast package manager not available: %w", err)
}
}
return &Apt{
Manager: aptManager,
}, nil
}
func (a *Apt) ValidatePackage(pkg *Package) (syspkg.PackageInfo, error) {
packageInfo, err := a.Manager.GetPackageInfo(pkg.String(), &manager.Options{AssumeYes: true})
if err != nil {
return syspkg.PackageInfo{}, fmt.Errorf("error getting package info: %v", err)
}
return packageInfo, nil
}


@ -0,0 +1,106 @@
package pkgs
import (
"testing"
)
func TestFindInstalledPackages(t *testing.T) {
tests := []struct {
name string
input []Package
wantValid []Package
wantInvalid []string
wantError bool
}{
{
name: "empty list",
input: []Package{},
wantValid: []Package{},
wantInvalid: []string{},
wantError: false,
},
// Add more test cases here once we have a way to mock syspkg
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
result, err := FindInstalledPackages(tt.input)
if (err != nil) != tt.wantError {
t.Errorf("FindInstalledPackages() error = %v, wantError %v", err, tt.wantError)
return
}
if err != nil {
return
}
if len(result.Valid) != len(tt.wantValid) {
t.Errorf("FindInstalledPackages() valid = %v, want %v", result.Valid, tt.wantValid)
}
if len(result.Invalid) != len(tt.wantInvalid) {
t.Errorf("FindInstalledPackages() invalid = %v, want %v", result.Invalid, tt.wantInvalid)
}
})
}
}
func TestValidatePackages(t *testing.T) {
tests := []struct {
name string
packageList string
wantError bool
}{
{
name: "empty string",
packageList: "",
wantError: false,
},
{
name: "single package",
packageList: "gcc",
wantError: false,
},
{
name: "multiple packages",
packageList: "gcc,g++,make",
wantError: false,
},
{
name: "package with version",
packageList: "gcc=4.8.5",
wantError: false,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
_, err := ValidatePackages(tt.packageList)
if (err != nil) != tt.wantError {
t.Errorf("ValidatePackages() error = %v, wantError %v", err, tt.wantError)
}
})
}
}
func TestGetInstalledPackages(t *testing.T) {
tests := []struct {
name string
installLog string
wantError bool
}{
{
name: "empty log",
installLog: "test_empty.log",
wantError: true,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
_, err := GetInstalledPackages(tt.installLog)
if (err != nil) != tt.wantError {
t.Errorf("GetInstalledPackages() error = %v, wantError %v", err, tt.wantError)
}
})
}
}

1
test_distribute.sh Normal file

@ -0,0 +1 @@

94
tools/distribute.sh Executable file

@ -0,0 +1,94 @@
#!/bin/bash
set -e
CMD="$1"
RUNNER_ARCH="$2"
BUILD_DIR="../dist"
# GitHub runner.arch values to GOARCH values
# https://github.com/github/docs/blob/main/data/reusables/actions/runner-arch-description.md
# https://github.com/golang/go/blob/master/src/internal/syslist/syslist.go
declare -A rarch_to_goarch=(
["X86"]="386"
["X64"]="amd64"
["ARM"]="arm"
["ARM64"]="arm64"
)
function usage() {
echo "error: $1" >&2
echo -e "
Usage: $0 <command>
Commands:
push - Build and push all architecture binaries to dist directory.
getbinpath [X86, X64, ARM, ARM64] - Get the binary path from dist directory." >&2
exit 1
}
function push() {
rm -fr "$BUILD_DIR"
mkdir -p "$BUILD_DIR"
# Package name
PACKAGE_NAME="cache-apt-pkgs"
# Print the build plan
echo "Building for these architectures:"
for arch in "${!rarch_to_goarch[@]}"; do
echo " - Linux/$arch (GOARCH=${rarch_to_goarch[$arch]})"
done
echo
# Build for each architecture
local binary_name
for runner_arch in "${!rarch_to_goarch[@]}"; do
go_arch="${rarch_to_goarch[$runner_arch]}"
binary_name="$BUILD_DIR/$PACKAGE_NAME-linux-$go_arch"
echo "Building $binary_name for Linux/$runner_arch (GOARCH=$go_arch)..."
# Build the binary
GOOS=linux GOARCH=$go_arch go build -v \
-o "$binary_name" \
../src/cmd/cache_apt_pkgs
echo "✓ Built $PACKAGE_NAME-linux-$go_arch"
done
echo "All builds completed!"
}
function getbinpath() {
local runner_arch=$1
if [[ -z $runner_arch ]]; then
usage "runner architecture not provided"
fi
local go_arch="${rarch_to_goarch[$runner_arch]}"
if [[ -z $go_arch ]]; then
usage "invalid runner architecture: $runner_arch"
fi
local binary_name="$BUILD_DIR/cache-apt-pkgs-linux-$go_arch"
if [[ ! -f $binary_name ]]; then
usage "binary not found: $binary_name (did you run 'push' first?)"
fi
echo "$binary_name"
}
case $CMD in
push)
push
;;
getbinpath)
getbinpath "$RUNNER_ARCH"
;;
"")
usage "command not provided"
;;
*)
usage "invalid command: $CMD"
;;
esac

127
tools/distribute_test.sh Executable file

@ -0,0 +1,127 @@
#!/bin/bash
# Colors for test output
GREEN='\033[0;32m'
RED='\033[0;31m'
NC='\033[0m' # No Color
DIST_DIR="../dist"
# Test counter
PASS=0
FAIL=0
function test_case() {
local name=$1
local cmd=$2
local expected_output=$3
local should_succeed=${4:-true}
echo -n "Testing $name... "
# Run the command and capture both stdout and stderr
local output
if [[ $should_succeed == "true" ]]; then
output=$($cmd 2>&1)
local status=$?
if [[ $status -eq 0 && $output == *"$expected_output"* ]]; then
echo -e "${GREEN}PASS${NC}"
((PASS++))
return 0
fi
else
output=$($cmd 2>&1) || true
if [[ $output == *"$expected_output"* ]]; then
echo -e "${GREEN}PASS${NC}"
((PASS++))
return 0
fi
fi
echo -e "${RED}FAIL${NC}"
echo " Expected output to contain: '$expected_output'"
echo " Got: '$output'"
((FAIL++))
return 0 # Don't fail the whole test suite on one failure
}
echo "Running distribute.sh tests..."
echo "----------------------------"
# Test command validation
test_case "no command" \
"./distribute.sh" \
"error: command not provided" \
false
test_case "invalid command" \
"./distribute.sh invalid_cmd" \
"error: invalid command" \
false
# Test getbinpath
test_case "getbinpath no arch" \
"./distribute.sh getbinpath" \
"error: runner architecture not provided" \
false
test_case "getbinpath invalid arch" \
"./distribute.sh getbinpath INVALID" \
"error: invalid runner architecture: INVALID" \
false
# Test push and binary creation
test_case "push command" \
"./distribute.sh push" \
"All builds completed!" \
true
# Test binary existence
for arch in "X86:386" "X64:amd64" "ARM:arm" "ARM64:arm64"; do
runner_arch=${arch%:*}
go_arch=${arch#*:}
test_case "binary exists for $runner_arch" \
"test -f ${DIST_DIR}/cache-apt-pkgs-linux-$go_arch" \
"" \
true
done
# Test getbinpath for each architecture
for arch in "X86:386" "X64:amd64" "ARM:arm" "ARM64:arm64"; do
runner_arch=${arch%:*}
go_arch=${arch#*:}
test_case "getbinpath for $runner_arch" \
"./distribute.sh getbinpath $runner_arch" \
"${DIST_DIR}/cache-apt-pkgs-linux-$go_arch" \
true
done
# Test cleanup and rebuild
test_case "cleanup" \
"rm -rf ${DIST_DIR}" \
"" \
true
test_case "getbinpath after cleanup" \
"./distribute.sh getbinpath X64" \
"error: binary not found" \
false
test_case "rebuild after cleanup" \
"./distribute.sh push" \
"All builds completed!" \
true
test_case "getbinpath after rebuild" \
"./distribute.sh getbinpath X64" \
"${DIST_DIR}/cache-apt-pkgs-linux-amd64" \
true
# Print test summary
echo -e "\nTest Summary"
echo "------------"
echo -e "Tests passed: ${GREEN}$PASS${NC}"
echo -e "Tests failed: ${RED}$FAIL${NC}"
# Exit with failure if any tests failed
[[ $FAIL -eq 0 ]] || exit 1