57 changes: 0 additions & 57 deletions .github/actions/apt_requirements/action.yml

This file was deleted.

29 changes: 29 additions & 0 deletions .github/actions/apt_requirements/restore_apt_cache/README.md
@@ -0,0 +1,29 @@
# Composite action restore APT cache

This action restores an APT cache from GitHub's cache.

Combined with [**save_apt_cache**](../save_apt_cache/README.md), it helps save time by avoiding repeated downloads of APT packages.

The action is composed of five steps:

1. **Compute APT requirements file SHA256 hash** - This step uses the [**misc/compute_files_hash**](../../misc/compute_files_hash/README.md) action to compute a single SHA256 hash of the APT requirements file given by the *apt_requirements_file_path* input variable. The computed SHA256 hash will be part of the cache key.
2. **Backup `/var/cache/apt/archives` permissions** - This step backs up the permissions of the `/var/cache/apt/archives` directory, so that after the APT cache is restored the original permissions can be reapplied.
3. **Add write permissions for all to `/var/cache/apt/archives`** - This step grants write permission on `/var/cache/apt/archives` to all users. This is crucial because GitHub's [**cache/restore**](https://github.com/actions/cache/blob/main/restore/README.md) action needs to be able to write to it; without the write permission, a permission error is raised.
4. **Restore APT cache** - This step restores the APT cache. It uses GitHub's [**cache/restore**](https://github.com/actions/cache/blob/main/restore/README.md) action with the following parameters:
* **path** - A list of files, directories, or paths to restore - set to `/var/cache/apt/archives/*.deb`.
* **key** - An explicit key for a cache entry - set to the combination of three strings:
* *git_reference*, provided as an input to the action.
* A static part, `-apt-`
* The previously computed SHA256 hash of the APT requirements file.
5. **Restore original permissions to `/var/cache/apt/archives` and delete backup** - This step restores the original permissions of the `/var/cache/apt/archives` directory. Finally, the permissions backup file is deleted.

## Documentation

### Inputs

* **apt_requirements_file_path** - Required - Path to the APT requirements file. It will be used to compute a SHA256 hash used in the cache key.
* **git_reference** - Optional - A git reference that will be used to build the cache key. It defaults to `github.ref_name` which is a context variable containing **the short ref name of the branch or tag that triggered the workflow run**. For example it may be `feature-branch-1` or, for pull requests, `<pr_number>/merge`.

### Outputs

* **cache-hit** - A boolean value which is true when the APT cache is found in GitHub's cache, false otherwise.
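
### Example

A minimal workflow sketch showing how the action might be called; the `apt-requirements.txt` file name is illustrative and may differ in your repository:

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Restore previously cached .deb packages (file name is illustrative).
      - name: Restore APT cache
        id: restore_apt_cache
        uses: ./.github/actions/apt_requirements/restore_apt_cache
        with:
          apt_requirements_file_path: apt-requirements.txt

      # The cache-hit output can be used to drive later steps.
      - name: Report APT cache status
        run: echo "APT cache hit: ${{ steps.restore_apt_cache.outputs.cache-hit }}"
```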
64 changes: 64 additions & 0 deletions .github/actions/apt_requirements/restore_apt_cache/action.yml
@@ -0,0 +1,64 @@
name: Composite action restore APT cache
description: Composite action to restore APT cache
inputs:
apt_requirements_file_path:
description: Path to the APT requirements file
required: true
git_reference:
description: A git reference (name of the branch, reference to the PR) that will be used to build the cache key.
required: false
default: ${{ github.ref_name }}

outputs:
cache-hit:
description: Whether the APT cache was found in the GitHub's cache or not.
value: ${{ steps.restore_apt_cache.outputs.cache-hit }}


runs:
using: "composite"
steps:
- name: Compute APT requirements file SHA256 hash
id: compute_apt_requirements_file_sha256_hash
uses: ./.github/actions/misc/compute_files_hash
with:
file_paths: ${{ inputs.apt_requirements_file_path }}

- name: Backup /var/cache/apt/archives permissions
id: backup_apt_cache_dir_permissions
run: |
PERMISSIONS_FILE_PATH="/tmp/apt_cache_dir_permissions.facl"
echo "apt_cache_dir_permissions_file=$PERMISSIONS_FILE_PATH" > $GITHUB_OUTPUT
sudo getfacl -p /var/cache/apt/archives > $PERMISSIONS_FILE_PATH
ARCHIVES_PERMISSIONS=$(ls -ld /var/cache/apt/archives)
echo "::debug::Original permissions given to /var/cache/apt/archives: $ARCHIVES_PERMISSIONS"
echo "::debug::Created /var/cache/apt/archives permissions backup to $PERMISSIONS_FILE_PATH"
shell: bash

# Vital to be able to restore cache
# If write permission is not set, a permissions error will be raised
- name: Add write permission for all to /var/cache/apt/archives
run: |
sudo chmod a+w /var/cache/apt/archives
ARCHIVES_NEW_PERMISSIONS=$(ls -ld /var/cache/apt/archives)
echo "::debug::New permissions given to /var/cache/apt/archives: $ARCHIVES_NEW_PERMISSIONS"
shell: bash

- name: Restore APT cache
uses: actions/cache/restore@v4
id: restore_apt_cache
with:
path: /var/cache/apt/archives/*.deb
key: ${{ inputs.git_reference }}-apt-${{ steps.compute_apt_requirements_file_sha256_hash.outputs.computed_hash }}

- name: Restore original permissions to /var/cache/apt/archives and delete backup
run: |
PERMISSIONS_FILE_PATH=${{ steps.backup_apt_cache_dir_permissions.outputs.apt_cache_dir_permissions_file }}
sudo setfacl --restore="$PERMISSIONS_FILE_PATH"
ARCHIVES_RESTORED_PERMISSIONS=$(ls -ld /var/cache/apt/archives)
echo "::debug::Restored original permissions to /var/cache/apt/archives: $ARCHIVES_RESTORED_PERMISSIONS"
if [[ -f "$PERMISSIONS_FILE_PATH" ]]; then
sudo rm "$PERMISSIONS_FILE_PATH"
echo "::debug::Correctly removed $PERMISSIONS_FILE_PATH permissions backup file"
fi
shell: bash
22 changes: 22 additions & 0 deletions .github/actions/apt_requirements/save_apt_cache/README.md
@@ -0,0 +1,22 @@
# Composite action save APT cache

This action saves the APT cache, i.e. the `*.deb` files located under `/var/cache/apt/archives`, to GitHub's cache.

Combined with [**restore_apt_cache**](../restore_apt_cache/README.md), it helps save time by avoiding repeated downloads of APT packages.

The action is composed of two steps:

1. **Compute APT requirements file SHA256 hash** - This step uses the [**misc/compute_files_hash**](../../misc/compute_files_hash/README.md) action to compute the SHA256 hash of the APT requirements file that will be part of the cache key.
2. **Save APT cache** - This step performs the actual caching on GitHub. GitHub's [**cache/save**](https://github.com/actions/cache/blob/main/save/README.md) action is used with the following parameters:
1. **path** - A list of files, directories, or paths to cache - set to `/var/cache/apt/archives/*.deb` to save all `*.deb` files in APT cache.
2. **key** - An explicit key for a cache entry - set to the combination of three strings:
1. *git_reference*, provided as an input to the action.
2. A static part, `-apt-`
3. The previously computed SHA256 hash of the APT requirements file.

## Documentation

### Inputs

* **apt_requirements_file_path** - Required - Path to the APT requirements file. It will be used to compute a SHA256 hash used in the cache key.
* **git_reference** - Optional - A git reference that will be used to build the cache key. It defaults to `github.ref_name` which is a context variable containing **the short ref name of the branch or tag that triggered the workflow run**. For example it may be `feature-branch-1` or, for pull requests, `<pr_number>/merge`.
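
### Example

A minimal sketch of the saving side, assuming the APT requirements file is a plain list of package names, one per line (the `apt-requirements.txt` file name is illustrative). Note that the same *apt_requirements_file_path* and *git_reference* must be used for both **restore_apt_cache** and **save_apt_cache**, otherwise the cache keys will not match:

```yaml
steps:
  - uses: actions/checkout@v4

  # Install the packages listed in the (illustrative) requirements file.
  - name: Install APT requirements
    run: |
      sudo apt-get update
      xargs --arg-file apt-requirements.txt sudo apt-get install --yes

  # Save the downloaded .deb files so that future runs can restore them.
  - name: Save APT cache
    uses: ./.github/actions/apt_requirements/save_apt_cache
    with:
      apt_requirements_file_path: apt-requirements.txt
```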
24 changes: 24 additions & 0 deletions .github/actions/apt_requirements/save_apt_cache/action.yml
@@ -0,0 +1,24 @@
name: Composite action save APT cache
description: Composite action to save APT cache
inputs:
apt_requirements_file_path:
description: Path to the APT requirements file
required: true
git_reference:
description: A git reference (name of the branch, reference to the PR) that will be used to build the cache key.
required: false
default: ${{ github.ref_name }}

runs:
using: "composite"
steps:
- name: Compute APT requirements file SHA256 hash
id: compute_apt_requirements_file_sha256_hash
uses: ./.github/actions/misc/compute_files_hash
with:
file_paths: ${{ inputs.apt_requirements_file_path }}
- name: Save APT cache
uses: actions/cache/save@v4
with:
path: /var/cache/apt/archives/*.deb
key: ${{ inputs.git_reference }}-apt-${{ steps.compute_apt_requirements_file_sha256_hash.outputs.computed_hash }}
18 changes: 18 additions & 0 deletions .github/actions/misc/compute_files_hash/README.md
@@ -0,0 +1,18 @@
# Composite action compute files hash

This action computes a single SHA256 hash of one or more files.
Given a **space separated list of file paths**, a new file is created by concatenating all those files together. Then the SHA256 hash of the newly created file is computed and returned as the output.

Before being joined together, each file is tested to ensure that it **exists** and that it is **a regular file**.

This action is useful when saving/restoring a cache, where a unique key is required: the hash is used as part of the cache key.

## Documentation

### Inputs

* `file_paths` - Required - Space separated list of file paths for which a single SHA256 hash will be computed.

### Outputs

* `computed_hash` - A SHA256 hash of the file obtained by joining (concatenating) all input files together.
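
### Example

A minimal usage sketch (the two requirements file names are illustrative):

```yaml
steps:
  # Compute a single SHA256 hash over two (illustrative) requirements files.
  - name: Compute files hash
    id: compute_files_hash
    uses: ./.github/actions/misc/compute_files_hash
    with:
      file_paths: apt-requirements.txt requirements.txt

  # The hash is typically embedded in a cache key.
  - name: Show computed hash
    run: echo "Computed hash: ${{ steps.compute_files_hash.outputs.computed_hash }}"
```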
12 changes: 7 additions & 5 deletions .github/actions/misc/compute_files_hash/action.yml
@@ -2,8 +2,8 @@ name: Composite action compute files hash
description: Composite action to compute a single hash of one or more files
inputs:
file_paths:
description: Comma separeted list of files.
required: false
description: Space separated list of files for which a single SHA256 hash will be computed.
required: true

outputs:
computed_hash:
@@ -16,11 +16,13 @@
- name: Compute files SHA256 hash
id: compute_files_sha256_hash
run: |
if [[ -z '${{ inputs.file_paths }}' ]]; then
echo "::error::file_paths cannot be empty!"
exit 1
fi
JOINED_FILES="cat "
# Create a bash array of file paths
IFS=',' read -r -a files <<< "${{ inputs.file_paths }}"
echo "::debug::File paths array is composed by: ${files[@]}"
for file in ${files[@]};
for file in ${{ inputs.file_paths }};
do
if [[ -f $file ]]; then
# Concat file path to cat command
@@ -0,0 +1,13 @@
# Composite action create Python dev requirements file

This action creates the `requirements-dev.txt` file which will contain all **development dependencies**.

As of today, the only development dependency supported is `coverage`.

## Documentation

### Inputs

* **install_from** - Optional - The path used as working directory when creating the `requirements-dev.txt` file. It defaults to the current directory (i.e. `.`).
* **project_dev_requirements_file** - Optional - The path of a project `requirements-dev.txt`. This was designed for cases where development requirements other than coverage are needed. If specified, the dependencies in the project `requirements-dev.txt` will be appended to the newly created `requirements-dev.txt`. **Be careful: if a relative path is used, it will be resolved relative to *install_from*.** Defaults to an empty string, and hence **no custom `requirements-dev.txt`**.
* **use_coverage** - Optional - Whether to use coverage or not. It defaults to false.
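
### Example

A minimal usage sketch (the action path shown is assumed from the repository layout and may differ):

```yaml
steps:
  # Create requirements-dev.txt containing coverage (action path is an assumption).
  - name: Create Python dev requirements file
    uses: ./.github/actions/python_requirements/create_dev_requirements
    with:
      install_from: .
      use_coverage: true
```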
@@ -0,0 +1,12 @@
# Composite action create Python docs requirements file

This action creates the `requirements-docs.txt` file. This is a Python requirements file that will contain all **dependencies required to build the documentation**.

## Documentation

### Inputs

* **install_from** - Optional - The path used as working directory when creating the `requirements-docs.txt` file. It defaults to the current directory (i.e. `.`).
* **project_docs_requirements_file** - Optional - The path of a project `requirements-docs.txt`. This was designed for cases where requirements to build the documentation other than rstcheck, sphinx, sphinx_rtd_theme, sphinxcontrib-spelling and sphinxcontrib-django2 are needed. If specified, the dependencies in the project `requirements-docs.txt` will be appended to the newly created `requirements-docs.txt`. **Be careful: if a relative path is used, it will be resolved relative to *install_from*.** Defaults to an empty string, and hence **no custom `requirements-docs.txt`**.
* **django_settings_module** - Optional - Path to the Django settings file. It's used to make the GitHub action aware of Django's presence; in that case, `sphinxcontrib-django2` is also added to the newly created requirements file. **Be careful: if a relative path is used, it will be resolved relative to *install_from*.** Defaults to an empty string, and hence **no Django settings file**.
* **check_docs_directory** - Optional - Path that will be used by rstcheck to check the documentation. **Be careful: if a relative path is used, it will be resolved relative to *install_from*.** Defaults to an empty string, and hence **documentation won't be checked**.
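
### Example

A minimal usage sketch for a Django project (the action path and the settings file path are assumptions):

```yaml
steps:
  # Create requirements-docs.txt, adding sphinxcontrib-django2 because a Django
  # settings file is provided (action path and settings path are assumptions).
  - name: Create Python docs requirements file
    uses: ./.github/actions/python_requirements/create_docs_requirements
    with:
      install_from: .
      django_settings_module: myproject/settings.py
      check_docs_directory: docs
```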
@@ -0,0 +1,27 @@
# Composite action create Python linter requirements file

This action creates the `requirements-linters.txt` file, which will contain all **linter dependencies** required by the CI.
The user can choose which linters the CI will run, and hence which are written to `requirements-linters.txt`, by setting flags such as *use_black* to true.

As of today, only the following linters are supported:

* `autoflake`
* `bandit`
* `black`
* `flake8`
* `flake8-django`
* `isort`
* `pylint`
* `pylint-django`

## Documentation

### Inputs

* **install_from** - Optional - The path used as working directory when creating the `requirements-linters.txt` file. It defaults to the current directory (i.e. `.`).
* **project_linter_requirements_file** - Optional - The path of a project `requirements-linters.txt`. This was designed for cases where requirements for linters other than `autoflake`, `bandit`, `black`, `flake8`, `flake8-django`, `isort`, `pylint` and `pylint-django` are needed. If specified, the dependencies in the project `requirements-linters.txt` will be appended to the newly created `requirements-linters.txt`. **Be careful: if a relative path is used, it will be resolved relative to *install_from*.** Defaults to an empty string, and hence **no custom `requirements-linters.txt`**.
* **django_settings_module** - Optional - Path to the Django settings file. It's used to make the GitHub action aware of Django's presence. For a Django project, `flake8-django` and `pylint-django` may be used and hence will be added to the newly created requirements file. **Be careful: if a relative path is used, it will be resolved relative to *install_from*.** Defaults to an empty string, and hence **no Django settings file**.
* **use_autoflake** - Optional - Flag stating whether or not to use the `autoflake` linter. It defaults to false.
* **use_bandit** - Optional - Flag stating whether or not to use the `bandit` linter. It defaults to false.
* **use_flake8** - Optional - Flag stating whether or not to use the `flake8` linter. It defaults to false.
* **use_pylint** - Optional - Flag stating whether or not to use the `pylint` linter. It defaults to false.
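
### Example

A minimal usage sketch enabling a subset of the supported linters (the action path is an assumption):

```yaml
steps:
  # Create requirements-linters.txt with bandit, flake8 and pylint enabled
  # (action path is an assumption).
  - name: Create Python linters requirements file
    uses: ./.github/actions/python_requirements/create_linters_requirements
    with:
      install_from: .
      use_bandit: true
      use_flake8: true
      use_pylint: true
```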
20 changes: 20 additions & 0 deletions .github/actions/python_requirements/create_virtualenv/README.md
@@ -0,0 +1,20 @@
# Composite action create Python virtual environment

This GitHub action creates a Python virtual environment using Python's `venv` module.

When the *activate_only* flag is set to true, the virtual environment at *virtualenv_path* will only be activated; **no creation will take place**.

NOTE:

To activate a Python virtual environment, the `activate` script is often used.
However, in a GitHub Actions environment this is not enough, because environment variables set in a step are "lost" when the step ends. To address this, two things need to be done:

1. Append the `VIRTUAL_ENV` environment variable to the `GITHUB_ENV` environment file. The [`GITHUB_ENV`](https://docs.github.com/en/enterprise-cloud@latest/actions/writing-workflows/choosing-what-your-workflow-does/workflow-commands-for-github-actions#setting-an-environment-variable) file makes environment variables available to any subsequent steps in a workflow job. Note that the `VIRTUAL_ENV` variable is created by the `activate` script and contains the path to the virtual environment.
2. Prepend the virtual environment's `bin` path to the system PATH. So that any subsequent steps in the workflow can also use it, [`GITHUB_PATH`](https://docs.github.com/en/enterprise-cloud@latest/actions/writing-workflows/choosing-what-your-workflow-does/workflow-commands-for-github-actions#adding-a-system-path) is employed.

## Documentation

### Inputs

* **virtualenv_path** - Optional - The path where the virtual environment will be created. It defaults to `.venv`.
* **activate_only** - Optional - Flag that states whether to only activate the virtual environment. If false, a new virtual environment will be created before being activated. It defaults to false.
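
### Example

A minimal usage sketch: the first step creates and activates the virtual environment, the second only re-activates it later in the job:

```yaml
steps:
  # Create a virtual environment at .venv and expose it to subsequent steps.
  - name: Create Python virtual environment
    uses: ./.github/actions/python_requirements/create_virtualenv
    with:
      virtualenv_path: .venv

  # Re-activate the already existing virtual environment without recreating it.
  - name: Activate Python virtual environment
    uses: ./.github/actions/python_requirements/create_virtualenv
    with:
      virtualenv_path: .venv
      activate_only: true
```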
@@ -22,5 +22,7 @@
run: |
source ${{ inputs.virtualenv_path }}/bin/activate
echo "VIRTUAL_ENV=$VIRTUAL_ENV" >> $GITHUB_ENV
echo "::debug::Virtual environment path is $VIRTUAL_ENV"
echo "$VIRTUAL_ENV/bin" >> $GITHUB_PATH
echo "::debug::PATH environment variable state after $VIRTUAL_ENV/bin path being added to it: $GITHUB_PATH"
shell: bash