
Commit

Merge pull request #18 from ggatward/content-automation
Content automation
ggatward authored Dec 10, 2017
2 parents 56f13c0 + af6e8ce commit b0724f3
Showing 13 changed files with 541 additions and 27 deletions.
13 changes: 13 additions & 0 deletions CHANGELOG.md
@@ -4,6 +4,19 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).

## [Unreleased]
### Added
- push_puppetforge now supports JFrog Artifactory repositories via HTTP POST
- sat_import now checks for exports that have not been imported (missed/skipped)
- sat_import --fixhistory option to force align import/export histories
- Email notification capability for use when automating content scripts
- Unattended option (-u) to allow scripts to be automated
- auto_content scripts to allow unattended import/publish/promote/clean activity

### Changed
- --notar export saved in /cdn_export dir rather than /export to prevent it being deleted

### Removed
- Skip GPG short option (-n)


## [1.1.1] - 2017-10-25
43 changes: 36 additions & 7 deletions README.md
@@ -28,7 +28,7 @@ hammer user create --login svc-api-user --firstname API --lastname User \
--organization-ids 1 --default-organization-id 1 --admin true
```

Foreman needs to be configured to export content to the location you require. By default the path is
/var/lib/pulp/katello-export - this will most likely fill your /var/lib/pulp partition!
The configs in these scripts assume that exports go to /var/sat-export - this should be a
dedicated partition, or better still a dedicated disk, reserved solely for export content.
@@ -68,6 +68,11 @@ logging:
  dir: /var/log/sat6-scripts  (Directory to use for logging)
  debug: [True|False]

email:
  mailout: True
  mailfrom: Satellite 6 <[email protected]>
  mailto: [email protected]

export:
  dir: /var/sat-export  (Directory to export content to - Connected Satellite)
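
The email settings above drive the new notification capability for automated runs. Below is a minimal sketch of how such a mailout might be implemented with the Python standard library; the `send_notification` helper and the shape of the config dict are illustrative assumptions rather than the project's actual API.

```
# Hypothetical helper - a sketch of a mailout driven by the email settings
# in config.yml; not the project's actual implementation.
import smtplib
from email.mime.text import MIMEText

def send_notification(email_cfg, subject, body):
    # Do nothing unless mailout is enabled in config.yml
    if not email_cfg.get('mailout'):
        return
    msg = MIMEText(body)
    msg['Subject'] = subject
    msg['From'] = email_cfg['mailfrom']
    msg['To'] = email_cfg['mailto']
    # Hand the message to the local MTA on the Satellite host
    smtp = smtplib.SMTP('localhost')
    smtp.sendmail(email_cfg['mailfrom'], [email_cfg['mailto']], msg.as_string())
    smtp.quit()
```
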
@@ -199,6 +204,7 @@ optional arguments:
-l, --last Display time of last export
-L, --list List all successfully completed exports
--nogpg Skip GPG checking
-u, --unattended Answer any prompts safely, allowing automated usage
-r, --repodata Include repodata for repos with no incremental content
-p, --puppetforge Include puppet-forge-server format Puppet Forge repo
--notar Do not archive the extracted content
@@ -223,11 +229,14 @@ This companion script to sat_export, running on the Disconnected Satellite
performs a sha256sum verification of each part of the specified archive prior
to extracting the transferred content to disk.
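
A rough sketch of this style of verification is shown below, assuming a sha256sum-style manifest (one "digest  filename" pair per line); the manifest layout and the `verify_archive` helper are illustrative assumptions, not code taken from sat_import.

```
# Sketch only: check each archive part against a sha256sum-style manifest.
import hashlib
import os

def verify_archive(manifest):
    basedir = os.path.dirname(manifest)
    with open(manifest) as f:
        for line in f:
            if not line.strip():
                continue
            expected, name = line.split()
            sha = hashlib.sha256()
            # Hash the part in chunks so large archive parts fit in memory
            with open(os.path.join(basedir, name), 'rb') as part:
                for chunk in iter(lambda: part.read(1024 * 1024), b''):
                    sha.update(chunk)
            if sha.hexdigest() != expected:
                raise ValueError("Checksum mismatch: %s" % name)
```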

Once the content has been extracted, a check is performed to see whether any previous
exports have not yet been imported; this helps maintain data integrity on the
disconnected Satellite system. Any missing imports are displayed, and the
option to continue or abort is presented. Upon continuing, a sync is triggered
of each repository in the import set. Note that repositories MUST be enabled on the
disconnected satellite prior to the sync working - for this reason a `nosync`
option (-n) exists so that the repos can be extracted to disk and then enabled
before the sync occurs. In order to not overload the Satellite during the sync, the
repositories will be synced in smaller batches, the number of repos in a batch
being defined in the config.yml file. (It has been observed on systems with a
large number of repos that triggering a sync on all repos at once pretty much
@@ -245,6 +254,13 @@ All previously imported datasets can be shown with the (-L) flag.
Note that a dataset can normally only be imported ONCE. To force an import of an
already completed dataset, use the (-f) flag.
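
The batched sync behaviour described above can be approximated as in the sketch below; `batch_size` stands for the value read from config.yml and `sync_repo` is a hypothetical stand-in for whatever call actually triggers a repository sync.

```
# Illustrative sketch of syncing repositories in batches rather than all at once.
def sync_in_batches(repo_ids, batch_size, sync_repo):
    for start in range(0, len(repo_ids), batch_size):
        batch = repo_ids[start:start + batch_size]
        for repo_id in batch:
            sync_repo(repo_id)  # hypothetical stand-in for the real sync call
        # In the real script each batch is allowed to finish before the next
        # one starts, so the Satellite is never syncing every repo at once.
```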

In the event that missing import datasets are detected, they should be imported to
ensure data integrity and consistency. There may, however, be cases where the
missing imports have been included by other means, or are no longer required at all.
In these cases, the --fixhistory flag can be used to 'reset' the import history
so that it matches the export history of the current import dataset, clearing
these warnings.

### Help Output
```
usage: sat_import.py [-h] [-o ORG] -d DATE [-n] [-r] [-l] [-L] [-f]
@@ -260,7 +276,9 @@ optional arguments:
-l, --last Show the last successfully completed import date
-L, --list List all successfully completed imports
-c, --count Display all package counts after import
-f, --force Force import of data if it has previously been done
-u, --unattended Answer any prompts safely, allowing automated usage
--fixhistory Force import history to match export history
```

### Examples
@@ -482,3 +500,14 @@ optional arguments:
./promote_content_view.py -e Production -a # Promote all views to Production
./promote_content_view.py -e Quality -d # See what would be done for Quality
```


# auto_content
Sample script that allows for the unattended automation of content management.
This script will find any import datasets present and import them (in order).
Successful import of the content then triggers a publish. On nominated days/weeks
content is promoted to various lifecycle stages, and content view cleanup is also
performed. Like the other scripts it calls, it supports a dry run (-d) option to
show what would be performed without actually doing it.

This script can be copied and extended to support custom automation requirements.
193 changes: 193 additions & 0 deletions auto_content.py
@@ -0,0 +1,193 @@
#!/usr/bin/env python

import sys, os, glob
import subprocess
import argparse
import datetime
import helpers


def dates():
    # What day is it? (weekday(): 0=Mon -> 6=Sun)
    dayofweek = datetime.datetime.today().weekday()

    # Figure out which week of the month we are in
    weekofmonth = (datetime.datetime.now().day - 1) / 7 + 1

    print "Day %s of week %s" % (dayofweek, weekofmonth)

    return (dayofweek, weekofmonth)


def run_imports(dryrun):
    print "Processing Imports..."

    # Find any sha256 files in the import dir
    infiles = glob.glob(helpers.IMPORTDIR + '/*.sha256')

    # Extract the dataset timestamp/name from the filename and add to a new list
    # Assumes naming standard sat6_export_YYYYMMDD-HHMM_NAME.sha256
    # 'sorted' function should result in imports being done in correct order by filename
    tslist = []
    good_imports = False
    for f in sorted(infiles):
        dstime = f.split('_')[-2]
        dsname = (f.split('_')[-1]).split('.')[-2]
        tslist.append(dstime + '_' + dsname)

    if tslist:
        msg = 'Found import datasets on disk...\n' + '\n'.join(tslist)
    else:
        msg = 'No import datasets to process'
    helpers.log_msg(msg, 'INFO')
    print msg

    # Now for each import file in the list, run the import script in unattended mode :-)
    if tslist:
        if not dryrun:
            for dataset in tslist:
                rc = subprocess.call(['/usr/local/bin/sat_import', '-u', '-r', '-d', dataset])

                # If the import is successful
                if rc == 0:
                    good_imports = True

        else:
            msg = "Dry run - not actually performing import"
            helpers.log_msg(msg, 'WARNING')

    return good_imports


def publish_cv(dryrun):
    print "Running Content View Publish..."

    # Set the initial state
    good_publish = False

    if not dryrun:
        rc = subprocess.call(['/usr/local/bin/publish_content_views', '-q', '-a'])
    else:
        msg = "Dry run - not actually performing publish"
        helpers.log_msg(msg, 'WARNING')
        rc = subprocess.call(['/usr/local/bin/publish_content_views', '-q', '-a', '-d'])

    if rc == 0:
        good_publish = True

    return good_publish


def promote_cv(dryrun, lifecycle):
    print "Running Content View Promotion to " + lifecycle + "..."

    # Set the initial state
    good_promote = False

    if not dryrun:
        rc = subprocess.call(['/usr/local/bin/promote_content_views', '-q', '-e', lifecycle])
    else:
        msg = "Dry run - not actually performing promotion"
        helpers.log_msg(msg, 'WARNING')
        rc = subprocess.call(['/usr/local/bin/promote_content_views', '-q', '-d', '-e', lifecycle])

    if rc == 0:
        good_promote = True

    return good_promote


def push_puppet(dryrun):
    print "Pushing puppet modules to puppet-forge server..."

    # Set the initial state
    good_puppetpush = False

    if not dryrun:
        rc = subprocess.call(['/usr/local/bin/push-puppetforge', '-r', 'puppet-forge'])

        # If the push is successful
        if rc == 0:
            good_puppetpush = True

    else:
        msg = "Dry run - not actually performing module push"
        helpers.log_msg(msg, 'WARNING')

    return good_puppetpush


def clean_cv(dryrun):
    print "Running Content View Cleanup..."

    if not dryrun:
        rc = subprocess.call(['/usr/local/bin/clean_content_views', '-a', '-c'])
    else:
        msg = "Dry run - not actually performing cleanup"
        helpers.log_msg(msg, 'WARNING')
        rc = subprocess.call(['/usr/local/bin/clean_content_views', '-a', '-c', '-d'])


def main(args):

    ### Run import/publish on scheduled day

    # Check for sane input
    parser = argparse.ArgumentParser(
        description='Imports, Publishes and Promotes content views.')
    parser.add_argument('-d', '--dryrun', help='Dry Run - Only show what will be done',
                        required=False, action="store_true")
    parser.add_argument('-p', '--puppet', help='Include puppet-forge module push',
                        required=False, action="store_true")

    args = parser.parse_args()

    # Set default flags and read in options given to us
    if args.dryrun:
        dryrun = True
    else:
        dryrun = False

    run_publish = False
    run_promote = True

    # Determine the day of week and week of month for use in our scheduling
    (dayofweek, weekofmonth) = dates()


    # Run promotion first - this ensures content consistency (QA->Prod, Library->QA)
    if dayofweek == 1:
        if weekofmonth == 4:
            run_promote = promote_cv(dryrun, 'Production')

        # Run QA promotion on 2nd and 4th Monday. Conditional on Prod promotion success
        if weekofmonth == 2 or weekofmonth == 4:
            if run_promote:
                run_promote = promote_cv(dryrun, 'Quality')


    # Every day, check if there are any imports in our input dir and import them.
    # run_publish will be returned as 'True' if any successful imports were performed.
    # If no imports are performed, or they fail, publish can't be triggered.
    run_publish = run_imports(dryrun)

    # If the imports succeeded, we can go ahead and publish the new content to Library
    if run_publish:
        publish_cv(dryrun)
        # Push any new puppet-forge modules if we have requested that
        if args.puppet:
            push_puppet(dryrun)

    # Run content view cleanup once a month, after we have done all promotions for the month.
    if dayofweek == 4:
        if weekofmonth == 4:
            clean_cv(dryrun)


if __name__ == "__main__":
    try:
        main(sys.argv[1:])
    except KeyboardInterrupt, e:
        print >> sys.stderr, ("\n\nExiting on user cancel.")
        sys.exit(1)
10 changes: 10 additions & 0 deletions bin/auto_content
@@ -0,0 +1,10 @@
#!/usr/bin/python
import sys

sys.path.insert(0, '/usr/share/sat6_scripts')
try:
    import auto_content
    auto_content.main(sys.argv[1:])
except KeyboardInterrupt, e:
    print >> sys.stderr, "\n\nExiting on user cancel."
    sys.exit(1)
5 changes: 5 additions & 0 deletions config/config.yml.example
@@ -11,6 +11,11 @@ logging:
  dir: /var/log/satellite
  debug: False

email:
  mailout: True
  mailfrom: Satellite 6 <[email protected]>
  mailto: [email protected]

export:
  dir: /var/sat-export
