mkrepo
is a repository generator with pluggable backends,
which allows you to maintain an RPM or DEB repository on various kinds of
storage, such as a local filesystem or S3, and periodically regenerate the metadata.
Use it in tandem with your favourite CI system to produce a better pipeline.
mkrepo
helps you to get rid of ad-hoc cron jobs.
As a bonus, mkrepo
supports on-premises S3 servers like Minio.
Works on Linux and OS X. Should also work on BSD and Windows, but I haven't checked.
Create an S3 bucket named, for example, builds
and put a sample package package.rpm
into s3://builds/rpmrepo/Packages.
Then do the following:
./mkrepo.py s3://builds/rpmrepo
After this, you will find the generated metadata in s3://builds/rpmrepo/repodata.
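For reference, the whole quick-start flow might look like the following sketch, assuming the AWS CLI is installed and configured (the bucket and package names are placeholders):
# upload a sample package
aws s3 mb s3://builds
aws s3 cp package.rpm s3://builds/rpmrepo/Packages/package.rpm
# generate the repository metadata
./mkrepo.py s3://builds/rpmrepo
# the generated metadata should now be listed under repodata/
aws s3 ls s3://builds/rpmrepo/repodata/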
To run the tests, use the following command:
make test
Python libraries:
- boto3
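If boto3 is not already available, it can be installed with pip (pinning a version is up to you):
pip install boto3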
mkrepo
parses your ~/.aws/config
and reads the secret key and region settings,
so you may omit them from the command-line invocation if you have an AWS config.
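For illustration, a minimal AWS configuration might look like the following sketch; the profile layout follows the standard AWS CLI convention, the values are placeholders, and whether the keys live in ~/.aws/config or ~/.aws/credentials depends on your setup:
# ~/.aws/config
[default]
region = us-east-1

# ~/.aws/credentials
[default]
aws_access_key_id = EXAMPLEKEYID
aws_secret_access_key = examplesecret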
mkrepo.py [-h]
[--temp-dir TEMP_DIR]
[--s3-access-key-id S3_ACCESS_KEY_ID]
[--s3-secret-access-key S3_SECRET_ACCESS_KEY]
[--s3-endpoint S3_ENDPOINT]
[--s3-region S3_REGION]
[--s3-public-read]
[--sign]
[--force]
path [path ...]
--temp-dir
- (optional) directory used to store temporary artifacts (default is .mkrepo)
--s3-access-key-id
- (optional) specify the S3 access key ID
--s3-secret-access-key
- (optional) specify the S3 secret key
--s3-endpoint
- (optional) specify the S3 server URI
--s3-region
- (optional) specify the S3 region (default is us-east-1)
--s3-public-read
- (optional) set read-only permission for anonymous users on files uploaded to S3
--sign
- (optional) sign package metadata
--force
- (optional) skip malformed packages when adding packages to the index; by default, a malformed package causes the utility to stop. A malformed_list.txt file will also be added to the repository
path
- specify a list of paths to scan for repositories
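As an example, a run against an on-premises S3 server such as Minio might combine several of these options; the endpoint, keys, and bucket below are placeholders:
./mkrepo.py \
    --s3-endpoint https://minio.example.com \
    --s3-access-key-id EXAMPLEKEYID \
    --s3-secret-access-key examplesecret \
    --s3-region us-east-1 \
    s3://builds/rpmrepo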
GPG_SIGN_KEY
- the name of the key that will be used to sign package metadata.
Tips for working with GPG keys
- Create a new key:
gpg --full-generate-key
- To view all your keys, you can use:
gpg --list-secret-keys --keyid-format LONG
- Scripts can use something like this to get the Key ID:
export GPG_SIGN_KEY="$(gpg --list-secret-keys --with-colons | grep ^sec: | cut -d: -f5)"
- Export the key in ASCII armored format:
gpg --armor --export-secret-keys MYKEYID > mykeys.asc
- Import the key:
cat mykeys.asc | gpg --batch --import
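Putting this together, a signing run might look like the following sketch; the key ID extraction mirrors the snippet above, and the repository path is a placeholder:
export GPG_SIGN_KEY="$(gpg --list-secret-keys --with-colons | grep ^sec: | cut -d: -f5)"
./mkrepo.py --sign s3://builds/rpmrepo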
MKREPO_DEB_ORIGIN
- the value of the "Origin" field of the "Release" file.
MKREPO_DEB_LABEL
- the value of the "Label" field of the "Release" file.
MKREPO_DEB_DESCRIPTION
- the value of the "Description" field of the "Release" file.
mkrepo
searches the supplied path for either a Packages or a pool subdirectory.
If it finds Packages, it assumes an RPM repo.
If it finds pool, it assumes a DEB repo.
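As an illustration, the two layouts look roughly like this (the rpm repodata/ location matches the quick-start example above; the dists/ location for the generated DEB metadata is an assumption based on the conventional Debian repository layout):
s3://builds/rpmrepo/
    Packages/    # your .rpm packages
    repodata/    # metadata generated by mkrepo

s3://builds/debrepo/
    pool/        # your .deb packages
    dists/       # metadata generated by mkrepo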
Then it parses the existing metadata files (if any) and compares the timestamps recorded there with the timestamps of all package files in the repo. Any packages that have different timestamps or that are missing from the metadata are parsed and added to it.
The new metadata is then uploaded to S3, replacing the previous version.
Thanks to Cyril Rohr and Ken Robertson, authors of the following awesome tools:
- rpm-s3
- deb-s3
Unfortunately, we needed a solution that is completely decoupled from the CI pipeline, and the tools mentioned above only support a package push mode, in which you have to use the tool itself to push packages to S3 instead of using native S3 clients.