
Multiple issues in 6.3 resolved (#45)
* Sat63 fixes (#38)
* New methods for Sat 6.3 yum export
* Fix to puppet exporter to handle backend_id
* Fix for 6.3 file exports
* Fix DoV export for 6.3
* Count DRPMs as well as RPMs
* Update README
* Version bump to 1.2.3
* Update CHANGELOG.md
* Patch applied for Issues 42 and 43
* Add split size option
* Remove diff file
ggatward authored Oct 14, 2018
1 parent 1de89ad commit 54e469e
Showing 5 changed files with 70 additions and 16 deletions.
15 changes: 15 additions & 0 deletions CHANGELOG.md
@@ -4,6 +4,21 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).

## [Unreleased]
### Fixed
- clean_content_views raised an exception if a CV version was included in a composite view.
- Default org view was assumed to be version 1.0. Correct version is now extracted (Issue #43)
- Org name and label do not always match. Issue with mixed case and spaces in org name (Issue #42)

### Added
- Option to define the tar split size (Issue #44)


## [1.2.3] - 2018-03-12
### Changed
- Export package count now counts DRPM packages exported by Sat 6.3

### Fixed
- sat_export did not handle new backend_identifier value generated by Sat 6.3


## [1.2.3] - 2018-03-12
6 changes: 5 additions & 1 deletion README.md
@@ -146,6 +146,9 @@ for import sync, however this behaviour can be overridden with the (-r) flag. This
will be useful to periodically ensure that the disconnected satellite repos are
consistent - the repodata will indicate mismatches with synced content.

The exported content will be archived in TAR format, with a chunk size specified
by the (-S) option. The default is 4200Mb.
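The (-S) chunking is a front-end to GNU `split`. A runnable sketch with dummy data (the file names and the 10 MB chunk size here are illustrative, not the tool's defaults):

```shell
# Create a 25 MB dummy archive and split it into 10 MB chunks,
# mirroring what sat_export does with the chunk size (in MB) from -S.
workdir=$(mktemp -d)
dd if=/dev/zero of="$workdir/sat6_export_demo.tar" bs=1M count=25 2>/dev/null
split -d -b 10M "$workdir/sat6_export_demo.tar" "$workdir/sat6_export_demo.tar_"
# sat_export also writes a sha256 manifest of the resulting chunks
sha256sum "$workdir"/sat6_export_demo.tar_* > "$workdir/sat6_export_demo.sha256"
nchunks=$(ls "$workdir"/sat6_export_demo.tar_* | wc -l)
echo "$nchunks chunks"   # two 10 MB chunks plus one 5 MB remainder = 3
```

Transferring the chunks and running `cat sat6_export_demo.tar_* > sat6_export_demo.tar` reassembles the archive on the far side.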

To export a selected repository set, the exports.yml config file must exist in the
config directory. The format of this file is shown below, and contains one or more
'env' stanzas, containing a list of repositories to export. The repository name is
@@ -195,7 +198,7 @@ directory being written to the import directory during the sat_import process.

### Help Output
```
usage: sat_export.py [-h] [-o ORG] [-e ENV] [-a | -i | -s SINCE] [-l] [-n]
usage: sat_export.py [-h] [-o ORG] [-e ENV] [-a | -i | -s SINCE] [-l] [-n] [-S SIZE]
Performs Export of Default Content View.
@@ -212,6 +215,7 @@ optional arguments:
--nogpg Skip GPG checking
-u, --unattended Answer any prompts safely, allowing automated usage
-r, --repodata Include repodata for repos with no incremental content
-S, --splitsize Size of split files in Megabytes, defaults to 4200
-p, --puppetforge Include puppet-forge-server format Puppet Forge repo
--notar Do not archive the extracted content
--forcexport Force export from an import-only (Disconnected) Satellite
19 changes: 19 additions & 0 deletions helpers.py
@@ -219,6 +219,25 @@ def get_org_id(org_name):

return org_id

def get_org_label(org_name):
    """
    Return the Organisation label for a given Org Name
    """
    # Check if our organization exists, and extract its label
    org = get_json(SAT_API + "organizations/" + org_name)
    # If the requested organization is not found, exit
    if org.get('error', None):
        msg = "Organization '%s' does not exist." % org_name
        log_msg(msg, 'ERROR')
        sys.exit(1)
    else:
        # Our organization exists, so let's grab the label and write some debug
        org_label = org['label']
        msg = "Organisation '" + org_name + "' found with label " + org['label']
        log_msg(msg, 'DEBUG')

    return org_label
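A minimal, self-contained sketch of the lookup logic above, with the API call stubbed out. The stub data is invented; it illustrates why the lookup is needed at all — a Satellite label typically differs from an org name containing spaces or mixed case (Issue #42):

```python
def get_json_stub(url):
    # Invented sample responses: a found org returns its record,
    # a missing one returns an 'error' key, as in the real API.
    orgs = {"organizations/ACME Corp": {"label": "ACME_Corp"}}
    return orgs.get(url, {"error": "404 Resource Not Found"})

def get_org_label(org_name):
    org = get_json_stub("organizations/" + org_name)
    if org.get('error', None):
        raise SystemExit("Organization '%s' does not exist." % org_name)
    return org['label']

print(get_org_label("ACME Corp"))  # ACME_Corp -- label differs from the display name
```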


class ProgressBar:
def __init__(self, duration):
11 changes: 9 additions & 2 deletions man/sat_export.8
@@ -49,9 +49,9 @@ can import directly from it.
.BR download_manifest (8).
.RE
.RS 3
- All content is archived into a chunked tar file, with each part being 4Gb to allow
- All content is archived into a chunked tar file, with each part being a default of 4Gb to allow
.RS 2
transfer via DVD if required.
transfer via DVD if required. This size can be changed if required.
.RE
.RE
.RS 3
@@ -148,6 +148,13 @@ will only synchronise repositories that contain new packages. This option forces
to synchronise ALL repositories even if no updates are present.
.RE
.PP
.BR "-S", " --splitsize"
.RS 3
Define the size of the tar chunks generated during export. By default the size will be
4200Mb (4.2Gb) to allow for transfer of segments via DVD. In some cases data diodes
require smaller chunk sizes for reliable transfer.
.RE
.PP
.BR "-p", " --puppetforge"
.RS 3
If exporting puppetforge modules from Satellite, also export them in a format compatible
35 changes: 22 additions & 13 deletions sat_export.py
@@ -45,11 +45,13 @@ def get_cv(org_id):
helpers.log_msg(msg, 'DEBUG')
msg = " Version: " + str(ver['version'])
helpers.log_msg(msg, 'DEBUG')
cv_ver = str(ver['version'])
msg = " Version ID: " + str(ver['id'])
helpers.log_msg(msg, 'DEBUG')

# Return the ID (should be '1') and the label (forms part of the export path name)
return cv_result['id'], cv_result['label']
return cv_result['id'], cv_ver, cv_result['label']


# Promote a content view version
def export_cv(dov_ver, last_export, export_type):
@@ -536,7 +538,7 @@ def do_gpg_check(export_dir):
print helpers.GREEN + "GPG Check - Pass" + helpers.ENDC


def create_tar(export_dir, name, export_history):
def create_tar(export_dir, name, export_history, splitsize):
"""
Create a TAR of the content we have exported
Creates a single tar, then splits into DVD size chunks and calculates
@@ -554,6 +556,7 @@ def create_tar(export_dir, name, export_history):
pickle.dump(export_history, open(export_dir + '/exporthistory_' + name + '.pkl', 'wb'))

os.chdir(export_dir)
print "export_dir is " + export_dir
full_tarfile = helpers.EXPORTDIR + '/sat6_export_' + today + '_' + name
short_tarfile = 'sat6_export_' + today + '_' + name
with tarfile.open(full_tarfile, 'w') as archive:
Expand Down Expand Up @@ -581,7 +584,7 @@ def create_tar(export_dir, name, export_history):
msg = "Splitting TAR file..."
helpers.log_msg(msg, 'INFO')
print msg
os.system("split -d -b 4200M " + full_tarfile + " " + full_tarfile + "_")
os.system("split -d -b " + str(splitsize) + "M " + full_tarfile + " " + full_tarfile + "_")
os.remove(full_tarfile)

# Temporary until pythonic method is done
@@ -591,7 +594,7 @@
os.system('sha256sum ' + short_tarfile + '_* > ' + short_tarfile + '.sha256')
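The "pythonic method" flagged as a TODO above could look something like the following sketch. The function name and manifest format are invented, not part of this commit; it reproduces the `split -d -b <size>M` numbering and computes the sha256 of each chunk as it is written:

```python
import hashlib


def split_file(path, chunk_mb, suffix_width=2):
    """Split 'path' into numbered chunks of chunk_mb megabytes and
    return a {chunk_name: sha256_hex} manifest (sketch only)."""
    manifest = {}
    chunk_bytes = chunk_mb * 1024 * 1024
    with open(path, 'rb') as src:
        part = 0
        while True:
            data = src.read(chunk_bytes)
            if not data:
                break
            # Matches split's -d numeric suffixes: path_00, path_01, ...
            chunk_name = "%s_%0*d" % (path, suffix_width, part)
            with open(chunk_name, 'wb') as dst:
                dst.write(data)
            manifest[chunk_name] = hashlib.sha256(data).hexdigest()
            part += 1
    return manifest
```

Hashing per chunk while writing avoids the second pass that the shelled-out `split` + `sha256sum` pair currently makes over the data.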


def prep_export_tree(org_name, basepaths):
def prep_export_tree(org_label, basepaths):
"""
Function to combine individual export directories into single export tree
Export top level contains /content and /custom directories with 'listing'
@@ -608,10 +611,10 @@
for basepath in basepaths:
msg = "Processing " + basepath
helpers.log_msg(msg, 'DEBUG')
subprocess.call("cp -rp " + basepath + "*/" + org_name + \
subprocess.call("cp -rp " + basepath + "*/" + org_label + \
"/Library/* " + helpers.EXPORTDIR + "/export", shell=True, stdout=devnull, stderr=devnull)

# Remove original directores
# Remove original directories
os.system("rm -rf " + basepath + "*/")

# We need to re-generate the 'listing' files as we will have overwritten some during the merge
@@ -722,6 +725,8 @@ def main(args):
required=False, action="store_true")
parser.add_argument('-p', '--puppetforge', help='Include puppet-forge-server format Puppet Forge repo',
required=False, action="store_true")
parser.add_argument('-S', '--splitsize', help='Size of split files in Megabytes, defaults to 4200',
required=False, type=int, default=4200)
args = parser.parse_args()

# If we are set as the 'DISCONNECTED' satellite, we will generally be IMPORTING content.
@@ -748,6 +753,7 @@

# Get the org_id (Validates our connection to the API)
org_id = helpers.get_org_id(org_name)
org_label = helpers.get_org_label(org_name)
exported_repos = []
export_history = []
basepaths = []
@@ -890,16 +896,16 @@
check_running_tasks(label, ename)

# Get the version of the CV (Default Org View) to export
dov_ver, dov_label = get_cv(org_id)
dov_id, dov_ver, dov_label = get_cv(org_id)

# Set the basepath of the export (needed due to Satellite 6.3 changes in other exports)
# 6.3 provides a 'latest_version' in the API that gives us '1.0' however this is not available
# in 6.2 so we must build the string manually for compatibility
basepath = helpers.EXPORTDIR + "/" + org_name + "-" + dov_label + "-v" + str(dov_ver) + ".0"
basepath = helpers.EXPORTDIR + "/" + org_label + "-" + dov_label + "-v" + str(dov_ver)
basepaths.append(basepath)

# Now we have a CV ID and a starting date, and no conflicting tasks, we can export
export_id = export_cv(dov_ver, last_export, export_type)
export_id = export_cv(dov_id, last_export, export_type)

# Now we need to wait for the export to complete
helpers.wait_for_task(export_id, 'export')
@@ -964,7 +970,6 @@ def main(args):

# Check if there are any currently running tasks that will conflict
ok_to_export = check_running_tasks(repo_result['label'], ename)

if ok_to_export:
# Count the number of packages
numpkg = count_packages(repo_result['id'])
@@ -1077,7 +1082,6 @@

# Check if there are any currently running tasks that will conflict
ok_to_export = check_running_tasks(repo_result['label'], ename)

if ok_to_export:
# Satellite 6.3 uses a different path for published file content
if 'backend_identifier' in repo_result:
Expand Down Expand Up @@ -1134,6 +1138,11 @@ def main(args):

# Check if there are any currently running tasks that will conflict
ok_to_export = check_running_tasks(repo_result['label'], ename)
# Satellite 6.3 uses a new backend_identifier key in the API result
if 'backend_identifier' in repo_result:
backend_id = repo_result['backend_identifier']
else:
backend_id = repo_result['label']
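The 6.3/6.2 fallback added above can also be written more compactly with `dict.get`; a sketch against an invented sample API result:

```python
# Equivalent fallback: prefer the 6.3 backend_identifier, else the 6.2 label.
repo_result = {"label": "ACME_repo"}  # sample 6.2-style API result (no backend_identifier)
backend_id = repo_result.get('backend_identifier', repo_result['label'])
print(backend_id)  # ACME_repo
```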

# Satellite 6.3 uses a new backend_identifier key in the API result
if 'backend_identifier' in repo_result:
@@ -1166,7 +1175,7 @@


# Combine resulting directory structures into a single repo format (top level = /content)
prep_export_tree(org_name, basepaths)
prep_export_tree(org_label, basepaths)

# Now we need to process the on-disk export data.
# Define the location of our exported data.
@@ -1186,7 +1195,7 @@

# Add our exported data to a tarfile
if not args.notar:
create_tar(export_dir, ename, export_history)
create_tar(export_dir, ename, export_history, args.splitsize)
else:
# We need to manually clean up a couple of working files from the export
if os.path.exists(helpers.EXPORTDIR + "/iso"):
