While I initially thought figuring this out would be a time-consuming task, it turned out to be pretty straightforward thanks to the helpful documentation and examples available on the topic. My use case was to upload a gzipped tar of a SQLite database file (a few KBs) from a workflow scheduled to run once every day. The following workflow is geared toward that task.
```yaml
# This workflow saves the db.sqlite3 file as an artifact once every day
name: SQLite backup

# Controls when the workflow will run
on:
  # Triggers the workflow on schedule
  schedule:
    # * is a special character in YAML so you have to quote this string
    # Runs every day at 00:09 UTC
    - cron: '9 0 * * *'

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
  # This workflow contains a single job called "backup"
  backup:
    # The type of runner that the job will run on
    runs-on: ubuntu-latest

    # Steps represent a sequence of tasks that will be executed as part of the job
    steps:
      # Create db.sqlite3.bak.tar.gz within the current directory in the
      # runner's shell. How you obtain db.sqlite3 depends on your setup;
      # a plain gzipped tar of the file is shown here as an example.
      #
      # Runs a set of commands using the runner's shell
      - name: 'Create gzipped tar of the SQLite database'
        run: tar -czf db.sqlite3.bak.tar.gz db.sqlite3

      - name: 'Upload sqlite backup file as a build artifact'
        uses: actions/upload-artifact@v2
        with:
          name: db.sqlite3
          path: db.sqlite3.bak.tar.gz
          retention-days: 1
```
The above workflow saves `db.sqlite3.bak.tar.gz` (named `db.sqlite3`) as an artifact with a retention period of 1 day. Once the retention period expires, the artifact is removed and is no longer accessible. If `retention-days` is not specified, the default retention period applies, which is 90 days at the time of writing.
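For completeness, an uploaded artifact can be fetched back in a later workflow (for example, a restore or deploy job) with the companion `actions/download-artifact` action. This is a sketch under the assumption that the artifact is still within its retention period; the step name is hypothetical:

```yaml
# In a separate workflow job: fetch the artifact uploaded above by its name.
# The artifact must still be within its retention period, or this step fails.
- name: 'Download sqlite backup artifact'
  uses: actions/download-artifact@v2
  with:
    name: db.sqlite3
```

Outside of workflows, the GitHub CLI can also pull artifacts from a run onto your machine with `gh run download -n db.sqlite3`, which is handy for actually restoring the backup locally.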