The thin `argparse` wrapper for quick, clear and easy declaration of (hierarchical) console command interfaces via Python.
`argcmdr`:

- handles the boilerplate of CLI
- while maintaining the clarity and extensibility of your code
- without requiring you to learn Yet Another argument-definition syntax
- without reinventing the wheel or sacrificing the flexibility of `argparse`
- enables invocation via
  - executable script (`__name__ == '__main__'`)
  - setuptools entrypoint
  - command-defining module (like the `Makefile` of `make`)
- determines command hierarchy flexibly and cleanly
  - command declarations are nested to indicate CLI hierarchy or
  - commands are decorated to indicate their hierarchy
- includes support for elegant interaction with the operating system, via `plumbum`
`argcmdr` is developed for Python version 3.6.3 and above.
Using Linux or OS X? A suitable version of Python is likely already installed on your system.
For example, check the version of the default Python 3 executable on your system, if any:

```shell
python3 --version
```

Or, list which Python executables are installed:

```shell
ls -1 /usr/bin/python3.*
```
If Python 3.6.3 – or greater – is not installed on your system, it is available from python.org.
However, depending on your system, you might prefer to install Python via a package manager, such as Homebrew on Mac OS X or APT on Debian-based Linux systems.
Alternatively, pyenv is highly recommended to manage arbitrary installations of Python, and may be most easily installed via the pyenv installer.
To install from PyPI via `pip`:

```shell
pip install argcmdr
```
To install from GitHub:

```shell
pip install git+https://github.com/dssg/argcmdr.git
```
To install from source:

```shell
python setup.py install
```
To install just the management file command, `manage`, via `pipx`:

```shell
pipx install argcmdr
```
To download `manage` as an all-in-one, pre-built executable:

```shell
curl -LO https://github.com/dssg/argcmdr/releases/download/1.1.0/manage-1.1.0.zip
```
`argcmdr` is built around the base class `Command`. Your console command extends `Command`, and optionally defines:

- `__init__(parser)`, which adds to the parser the arguments that your command requires, as supported by `argparse`
- `__call__([args, parser, ...])`, which is invoked when your console command is invoked, and which is expected to implement your command's functionality
For example, let's define the executable file `listdir`, a trivial script which prints the current directory's contents:
```python
#!/usr/bin/env python
import os

from argcmdr import Command, main


class Main(Command):
    """print the current directory's contents"""

    def __call__(self):
        print(*os.listdir())


if __name__ == '__main__':
    main(Main)
```
Should we execute this script, it will perform much the same as `ls -A`.
Let's say, however, that we would like to optionally print each item of the directory's contents on a separate line:
```python
class Main(Command):
    """print the current directory's contents"""

    def __init__(self, parser):
        parser.add_argument(
            '-1',
            action='store_const',
            const='\n',
            default=' ',
            dest='sep',
            help='list one file per line',
        )

    def __call__(self, args):
        print(*os.listdir(), sep=args.sep)
```
We now optionally support execution similar to `ls -A1`, via `listdir -1`.
Fittingly, this is reflected in the script's autogenerated usage text – `listdir -h` prints:
```
usage: listdir [-h] [--tb] [-1]

print the current directory's contents

optional arguments:
  -h, --help         show this help message and exit
  --tb, --traceback  print error tracebacks
  -1                 list one file per line
```
For particularly trivial commands, the class declaration syntax may be considered verbose and unnecessary. The `@cmd` decorator manufactures the appropriate `Command` from a decorated function or method.
The first command may be rewritten to produce an identical result:
```python
from argcmdr import cmd


@cmd
def main():
    """print the current directory's contents"""
    print(*os.listdir())
```
and, for the second, `cmd` optionally accepts an `argparse` argument definition:
```python
@cmd('-1', action='store_const', const='\n', default=' ', dest='sep',
     help='list one file per line')
def main(args):
    """print the current directory's contents"""
    print(*os.listdir(), sep=args.sep)
```
Further arguments may be added via additional decoration:
```python
@cmd('-a', ...)
@cmd('-1', ...)
def main(args):
    ...
```
As much as we gain from Python and its standard library, it's quite typical to need to spawn non-Python subprocesses – and, for that matter, for your script's entire purpose to be the orchestration of workflows built from operating system commands. Python's – and argcmdr's – benefit is to make this work easier to write, debug, test and scale.
In fact, our trivial example above could be accomplished easily with direct execution of `ls`:
```python
import argparse

from argcmdr import Local, main


class Main(Local):
    """list directory contents"""

    def __init__(self, parser):
        parser.add_argument(
            'remainder',
            metavar='arguments for ls',
            nargs=argparse.REMAINDER,
        )

    def __call__(self, args):
        print(self.local['ls'](args.remainder))
```
`local`, bound to the `Local` base class, is a dictionary which caches path look-ups for system executables.
This could, however, still be cleaner. For this reason, the `Local` command features a parallel invocation interface, `prepare([args, parser, local, ...])`:
```python
class Main(Local):
    """list directory contents"""

    def __init__(self, parser):
        parser.add_argument(
            'remainder',
            metavar='arguments for ls',
            nargs=argparse.REMAINDER,
        )

    def prepare(self, args):
        return self.local['ls'][args.remainder]
```
Via the `prepare` interface, standard output is printed by default, and your command logic may be tested in a "dry run," as reflected in the usage output of the above:
```
usage: listdir [-h] [--tb] [-q] [-d] [-s] [--no-show] ...

list directory contents

positional arguments:
  arguments for ls

optional arguments:
  -h, --help         show this help message and exit
  --tb, --traceback  print error tracebacks
  -q, --quiet        do not print command output
  -d, --dry-run      do not execute commands, but print what they are
                     (unless --no-show is provided)
  -s, --show         print command expressions (by default not printed
                     unless dry-run)
  --no-show          do not print command expressions (by default not
                     printed unless dry-run)
```
To execute multiple local subprocesses, `prepare` may either return an iterable (e.g. a `list`) of the above `plumbum` bound commands, or `prepare` may be defined as a generator function (i.e. make repeated use of `yield` – see below).
Subprocess commands emitted by `Local.prepare` are executed in order and, by default, failed execution is interrupted by a raised exception:
```python
class Release(Local):
    """release the package to pypi"""

    def __init__(self, parser):
        parser.add_argument(
            'part',
            choices=('major', 'minor', 'patch'),
            help="part of the version to be bumped",
        )

    def prepare(self, args):
        yield self.local['bumpversion'][args.part]
        yield self.local['python']['setup.py', 'sdist', 'bdist_wheel']
        yield self.local['twine']['upload', 'dist/*']
```
Should the `bumpversion` command fail, the subsequent release commands will not proceed.
In some cases, however, we might like to disable this functionality, and proceed regardless of a subprocess's exit code. We may pass arguments such as `retcode` to `plumbum` by setting this attribute on the `prepare` method:
```python
def prepare(self, args):
    yield self.local['bumpversion'][args.part]
    yield self.local['python']['setup.py', 'sdist', 'bdist_wheel']
    yield self.local['twine']['upload', 'dist/*']

prepare.retcode = None
```
Subprocess commands emitted by the above method will not raise execution exceptions, regardless of their exit code. (To allow only certain exit codes, set `retcode` as appropriate – see `plumbum`.)
Having disabled execution exceptions – and regardless – we might need to inspect a subprocess command's exit code, standard output or standard error. As such (whether we manipulate `retcode` or not), `argcmdr` communicates these command results to `prepare` generator methods:
```python
def prepare(self, args):
    (code, out, err) = yield self.local['bumpversion']['--list', args.part]

    yield self.local['python']['setup.py', 'sdist', 'bdist_wheel']

    if out is None:
        version = 'DRY-RUN'
    else:
        (version_match,) = re.finditer(
            r'^new_version=([\d.]+)$',
            out,
            re.M,
        )
        version = version_match.group(1)

    yield self.local['twine']['upload', f'dist/*{version}*']
```
In the above, `prepare` stores the results of the `bumpversion` execution, in order to determine from its standard output the version to be released.
Moreover, we might like to define special handling for execution errors; and, perhaps rather than manipulate `retcode` for all commands emitted by our method, we might like to handle them separately. As such, execution exceptions are also communicated back to `prepare` generators:
```python
def prepare(self, args):
    try:
        (_code, out, _err) = yield self.local['bumpversion']['--list', args.part]
    except self.local.ProcessExecutionError:
        print("execution failed but here's a joke ...")
    ...
```
Commands are run in the foreground by default, their outputs printed, as well as recorded for inspection, via the `plumbum` modifier `TEE`. To execute a command in the background (and continue), we may specify the `BG` modifier:
```python
def prepare(self, args):
    future = yield (self.local.BG, self.local['bumpversion']['--list', args.part])
```
Alternatively, we may wish to execute a command in the foreground only (and not record its output) – e.g. to best support processes which require a TTY:
```python
def prepare(self):
    return (self.local.FG, self.local['ipython']['-i', 'startup.py'])
```
`Local` is an alternate command base class, and a subclass of `Command`. Any base class may be substituted for `Command` when using the command decorator:
```python
@cmd(base=CustomCommand)
def main():
    ...
```
Moreover, `Local` functionality may be requested via the keyword flag `local`:
```python
@cmd(local=True)
def main(self):
    ...
```
And in support of the above, common case, the `@local` decorator is provided:
```python
from argcmdr import local


@local
def main(self):
    ...
```
Note that in the last two examples, our command function's call signature included `self`.
Decorated command functions are in fact replaced with manufactured subclasses of `Command`, and the function is invoked as this command's functionality – either `__call__` or `prepare`. It is assumed that, by default, this function should be treated as a `staticmethod`, and given no reference to the manufactured `Command` instance. In the case of `local` decoration, however, this is not the case; the binding is left up to the decorated object, which, according to Python descriptor rules, means that a decorated function is treated as a "method" and receives the instance. This way, `local` command functions may access the instance's `local` dictionary of operating system executables.
Binding may be explicitly controlled via the decorator keyword `binding`, e.g.:
```python
@cmd(binding=True, base=CustomCommand)
def main(self):
    ...
```
See Method commands for further examples of decorator-defined commands and alternative bindings.
Note that in our last trivial examples of listing directory contents, we made our script dependent upon the `ls` command in the operating environment. `argcmdr` will not, by default, print tracebacks, and it will colorize unhandled exceptions; however, we might prefer to print a far friendlier error message.
One easy way of printing friendly error messages is to make use of `argparse.ArgumentParser.error()`. As we've seen, `Command` invocation, via either `__call__` or `prepare`, may accept zero arguments, or it may require the parsed arguments `argparse.Namespace`. Moreover, it may require a second argument to receive the argument parser, and a third argument to receive the `local` dictionary:
```python
class Main(Local):
    """list directory contents"""

    def __init__(self, parser):
        parser.add_argument(
            'remainder',
            metavar='arguments for ls',
            nargs=argparse.REMAINDER,
        )

    def prepare(self, args, parser, local):
        try:
            local_exec = local['ls']
        except local.CommandNotFound:
            parser.error('command not available')

        yield local_exec[args.remainder]
```
If `ls` is not available, the user is presented the following message upon executing the above:
```
usage: listdir [-h] [--tb] [-q] [-d] [-s] [--no-show] ...
listdir: error: command not available
```
The command invocation's parsed arguments are most straightforwardly accessible as the first argument of the `Command` invocation signature, either `__call__` or `prepare`. However, in less-than-trivial implementations, wherein command methods are factored for reusability, passing the argument namespace from method to method may become tedious. To support such scenarios, this object is made additionally available via the `Command` property `args`.
Consider a class of commands which require a database password. We don't want to store this password anywhere in plain text; rather, we expect it to be input, either via (piped) standard input or the TTY:
```python
class DbSync(Command):
    """sync databases"""

    def __init__(self, parser):
        parser.add_argument(
            '-p', '--password',
            action='store_true',
            dest='stdin_password',
            default=False,
            help="read database password from standard input",
        )

    def __call__(self, args):
        engine = self.dbengine(args)
        ...

    def dbcreds(self, args):
        dbcreds = {
            'username': os.getenv('PGUSER'),
            'host': os.getenv('PGHOST'),
            'port': os.getenv('PGPORT'),
            'database': os.getenv('PGDATABASE'),
        }

        missing = [key for (key, value) in dbcreds.items() if not value]
        if missing:
            raise RuntimeError(
                "database connection information missing from "
                "environmental configuration: " + ', '.join(missing)
            )

        if args.stdin_password:
            dbcreds['password'] = sys.stdin.read().rstrip('\n\r')

            # we're done with the (pipe) stdin, so force it back to TTY for
            # any subsequent input()
            sys.stdin = open('/dev/tty')
        else:
            dbcreds['password'] = os.getenv('PGPASSWORD')

        if not dbcreds['password']:
            dbcreds['password'] = getpass.getpass(
                'enter password for ' +
                ('{username}@{host}:{port}'.format_map(dbcreds) | colors.bold) +
                ': ' | colors.yellow
            )

        return dbcreds

    def dburi(self, args):
        return sqlalchemy.engine.url.URL('postgres', **self.dbcreds(args))

    def dbengine(self, args):
        return sqlalchemy.create_engine(self.dburi(args))
```
Not only were we forced to verbosely daisy-chain the arguments namespace, `args`, from method to method; moreover, we were prevented from (trivially) caching the result of `dbcreds`, to ensure that the password isn't ever requested more than once.
Now, let's reimplement the above, making use of the property `args`:
```python
class DbSync(Command):
    """sync databases"""

    def __init__(self, parser):
        parser.add_argument(
            '-p', '--password',
            action='store_true',
            dest='stdin_password',
            default=False,
            help="read database password from standard input",
        )

    def __call__(self):
        engine = self.dbengine
        ...

    @cachedproperty
    def dbcreds(self):
        dbcreds = {
            'username': os.getenv('PGUSER'),
            'host': os.getenv('PGHOST'),
            'port': os.getenv('PGPORT'),
            'database': os.getenv('PGDATABASE'),
        }

        missing = [key for (key, value) in dbcreds.items() if not value]
        if missing:
            raise RuntimeError(
                "database connection information missing from "
                "environmental configuration: " + ', '.join(missing)
            )

        if self.args.stdin_password:
            dbcreds['password'] = sys.stdin.read().rstrip('\n\r')

            # we're done with the (pipe) stdin, so force it back to TTY for
            # any subsequent input()
            sys.stdin = open('/dev/tty')
        else:
            dbcreds['password'] = os.getenv('PGPASSWORD')

        if not dbcreds['password']:
            dbcreds['password'] = getpass.getpass(
                'enter password for ' +
                ('{username}@{host}:{port}'.format_map(dbcreds) | colors.bold) +
                ': ' | colors.yellow
            )

        return dbcreds

    @property
    def dburi(self):
        return sqlalchemy.engine.url.URL('postgres', **self.dbcreds)

    @property
    def dbengine(self):
        return sqlalchemy.create_engine(self.dburi)
```
In this form, `args` needn't be passed from method to method; in fact, methods of the `DbSync` command needn't worry at all about arguments which don't directly interest them. And, using `cachedproperty` from Dickens, the database credentials are trivially cached, ensuring they aren't needlessly re-requested.
Note that attempting to access the `args` property before invocation arguments have been parsed – e.g. within `__init__` – is not allowed, and will raise `RuntimeError`.
In addition to `args`, the parser associated with the command may alternatively be retrieved via its `parser` property.
Similar to `args`, the `parser` is not available until the command has been initialized; however, this property may be used within `__init__`, so long as the base `__init__` has been invoked (e.g. via `super().__init__`).
Our tools should be modular and composable, favoring atomicity over monolithism. Nevertheless, well-designed, -structured and -annotated code and application interfaces pay their users and developers tremendous dividends over time – no less in the case of more extensive interfaces, and particularly so for project management libraries (consider the `Makefile`).
`argcmdr` intends to facilitate the definition of `argparse`-based interfaces regardless of their structure. But it's in multi-level, or hierarchical, command argumentation that `argcmdr` shines.
Rather than procedurally defining subparsers, `Command` class declarations may simply be nested.
Let's define an executable file `manage` for managing a codebase:
```python
#!/usr/bin/env python
import os

from argcmdr import Local, main


class Management(Local):
    """manage deployment"""

    def __init__(self, parser):
        parser.add_argument(
            '-e', '--env',
            choices=('development', 'production'),
            default='development',
            help="target environment",
        )

    class Build(Local):
        """build app"""

        def prepare(self, args):
            req_path = os.path.join('requirements', f'{args.env}.txt')
            yield self.local['pip']['-r', req_path]

    class Deploy(Local):
        """deploy app"""

        def prepare(self, args):
            yield self.local['eb']['deploy', args.env]


if __name__ == '__main__':
    main(Management)
```
The `Local` command `Management`, above, defines no functionality of its own. As such, executing `manage` without arguments prints its autogenerated usage:
```
usage: manage [-h] [--tb] [-q] [-d] [-s] [--no-show]
              [-e {development,production}]
              {build,deploy} ...
```
Because `Management` extends `Local`, it inherits argumentation controlling whether standard output is printed and offering to run commands in "dry" mode. (Note, however, that it could have omitted these options by extending `Command`. Moreover, it may override the class method `base_parser()`.)
`Management` adds to the basic interface the optional argument `--env`. Most important, however, are the related, nested commands `Build` and `Deploy`, which define functionality via `prepare`. Neither nested command extends its subparser – though they could; rather, they depend upon the common argumentation defined by `Management`.
Exploring the interface via `--help` tells us a great deal; for example, `manage -h`:
```
usage: manage [-h] [--tb] [-q] [-d] [-s] [--no-show]
              [-e {development,production}]
              {build,deploy} ...

manage deployment

optional arguments:
  -h, --help            show this help message and exit
  --tb, --traceback     print error tracebacks
  -q, --quiet           do not print command output
  -d, --dry-run         do not execute commands, but print what they are
                        (unless --no-show is provided)
  -s, --show            print command expressions (by default not printed
                        unless dry-run)
  --no-show             do not print command expressions (by default not
                        printed unless dry-run)
  -e {development,production}, --env {development,production}
                        target environment

management commands:
  {build,deploy}        available commands
    build               build app
    deploy              deploy app
```
And `manage deploy -h`:
```
usage: manage deploy [-h]

deploy app

optional arguments:
  -h, --help  show this help message and exit
```
As such, a "dry run":

```shell
manage -de production deploy
```

prints the following:

```
> /home/user/.local/bin/eb deploy production
```

and without the dry-run flag the above operating system command is executed.
There is no artificial limit to the number of levels you may add to your command hierarchy. However, application interfaces are commonly "wider" than they are "deep". For this reason, as an alternative to class-nesting, the hierarchical relationship may be defined by a class decorator provided by the `RootCommand`.
Let's define the executable file `git` with no particular purpose whatsoever:
```python
#!/usr/bin/env python
from argcmdr import Command, RootCommand, main


class Git(RootCommand):
    """another stupid content tracker"""

    def __init__(self, parser):
        parser.add_argument(
            '-C',
            default='.',
            dest='path',
            help="run as if git was started in <path> instead of the current "
                 "working directory.",
        )


@Git.register
class Stash(Command):
    """stash the changes in a dirty working directory away"""

    def __call__(self, args):
        self['save'].delegate()

    class Save(Command):
        """save your local modifications to a new stash"""

        def __init__(self, parser):
            parser.add_argument(
                '-p', '--patch',
                dest='interactive',
                action='store_true',
                default=False,
                help="interactively select hunks from the diff between HEAD "
                     "and the working tree to be stashed",
            )

        def __call__(self, args):
            print("stash save", f"(interactive: {args.interactive})")

    class List(Command):
        """list the stashes that you currently have"""

        def __call__(self):
            print("stash list")


if __name__ == '__main__':
    main(Git)
```
We anticipate adding many subcommands to `git` beyond `stash`; and so, rather than nest all of these command classes under `Git`:

- we've defined `Git` as a `RootCommand`
- we've declared `Stash` at the module root
- we've decorated `Stash` with `Git.register`
The `RootCommand` functions identically to the `Command`; it only adds this ability to extend the listing of its subcommands by those registered via its decorator. (Notably, `LocalRoot` composes the functionality of `Local` and `RootCommand` via multiple inheritance.)
The `stash` command, on the other hand, has opted to contain the entirety of its hierarchical functionality, nesting its own subcommands `list` and `save`.
Nevertheless, you are not limited to a single `RootCommand`. Any command whose hierarchy you would like to extend via the `register` decorator may inherit it. Moreover, the `@cmd` decorator accepts the keyword flag `root`.
Decorator-manufactured commands are no less capable than those derived from class declaration syntax, except in that other commands cannot, syntactically, be nested beneath them. (For that reason the `@cmd` decorator's `root` flag is of note.) Decorator-manufactured commands can nonetheless themselves extend hierarchies, either by being further decorated by `register` or nested under command class declarations:
```python
@Git.register
class Stash(Command):
    """stash the changes in a dirty working directory away"""

    def __call__(self, args):
        self['save'].delegate()

    @cmd('-p', '--patch', dest='interactive', action='store_true',
         default=False,
         help="interactively select hunks from the diff between HEAD "
              "and the working tree to be stashed")
    def save(args):
        """save your local modifications to a new stash"""
        print("stash save", f"(interactive: {args.interactive})")

    @cmd
    def list():
        """list the stashes that you currently have"""
        print("stash list")
```
Above we've rewritten the trivial `stash` commands `save` and `list` as `@cmd`-decorated functions.
Say, however, that we needed to invert the factoring of `save` logic between that command and its parent:
```python
@Git.register
class Stash(Command):
    """stash the changes in a dirty working directory away"""

    def perform_save(self, interactive=False):
        print("stash save", f"(interactive: {interactive})")

    def __call__(self):
        self.perform_save()

    @cmd('-p', '--patch', dest='interactive', action='store_true',
         default=False,
         help="interactively select hunks from the diff between HEAD "
              "and the working tree to be stashed")
    @cmd(binding=True)
    def save(self, args):
        """save your local modifications to a new stash"""
        self[-1].perform_save(args.interactive)

    @cmd
    def list():
        """list the stashes that you currently have"""
        print("stash list")
```
(Note that `cmd` can accept both an `argparse` argument specification and command feature-defining arguments at once; however, this is of use mainly to the definition of helpers such as the `local` decorator, as this style is difficult to read and otherwise discouraged. Moreover, only the first – i.e. inner-most – `cmd` decorator's command features are respected.)
In this version, `save` functionality is shared as a method of `Stash`. `save` is able to access this method only by ascending the command hierarchy. This might make particular sense when multiple nested commands must share functionality, which is defined on the command class under which they are nested. (Note, however, that in such a case as this one, where the shared method could be defined as a `staticmethod`, it is no less advisable to do so, and for nested commands to access it directly as, e.g., `Stash.perform_save`.)
Our above reference to `self` in `save`, however, is at first glance misleading. This command looks like an instance method of `Stash`; yet, it's its own `Command`, and the `save` function receives as its first invocation argument an instance of the `Command` class `save`. Moreover, in this case, `save` gains nothing from this self-reference; its class defines no special attributes or functionality of its own beyond argument-parsing.
To improve on the above, we may instead decorate our command function with `cmdmethod`:
```python
@Git.register
class Stash(Command):
    """stash the changes in a dirty working directory away"""

    def perform_save(self, interactive=False):
        print("stash save", f"(interactive: {interactive})")

    def __call__(self):
        self.perform_save()

    @cmdmethod('-p', '--patch', dest='interactive', action='store_true',
               default=False,
               help="interactively select hunks from the diff between HEAD "
                    "and the working tree to be stashed")
    def save(self, args):
        """save your local modifications to a new stash"""
        self.perform_save(args.interactive)
```
The `cmdmethod` decorator – as well as the complementary `localmethod` decorator – alters the binding of the decorated function such that it receives the instance of its parent command – not itself – upon invocation. Much cleaner.
As with the `local` decorator, `cmdmethod` is merely a wrapper of `cmd`. Identical functionality can be achieved via the `binding` keyword, though far more verbosely:
```python
from argcmdr import CommandDecorator


@cmd(binding=CommandDecorator.Binding.parent)
def save(self, args):
    ...
```
Unlike the base command `git` in the example above, the command `git stash` – despite defining its own subcommands – also defines its own functionality, via `__call__`. This functionality, however, is merely a shortcut to the `stash` subcommand `save`. Rather than repeat the definition of this functionality, `Stash` "walks" its hierarchy to access the instantiation of `Save`, and invokes this command by reference.
Much of `argcmdr` is defined at the class level, and as such many `Command` methods are `classmethod`s. In the static or class context, we might walk the command hierarchy by reference, e.g. to `Stash.Save`; or, from a class method of `Stash`, as `cls.Save`. Moreover, `Command` defines the class-level "property" `subcommands`, which returns a list of the `Command` classes immediately "under" it in the hierarchy.
The hierarchy of executable command objects, however, is instantiated at runtime and cached within the `Command` instance. To facilitate navigation of this hierarchy, the `Command` object is itself subscriptable. Look-up keys may be:
- strings – descend the hierarchy to the named command
- negative integers – ascend the hierarchy this many levels
- a sequence combining the above – to combine "steps" into a single action
In the above example, `Stash` may have (redundantly) accessed `Save` with the look-up key:

```python
(-1, 'stash', 'save')
```

that is, with the full expression:

```python
self[-1, 'stash', 'save']
```

(The single key `'save'`, however, was far more to the point.)
Because command look-ups are relative to the current command, `Command` also offers the property `root`, which returns the base command. As such, our redundant expression could be rewritten:

```python
self.root['stash', 'save']
```
Finally, a command instance's immediate subcommands may be traversed by iterating over the command itself, e.g.:
```python
def __call__(self):
    for subcommand in self:
        subcommand.delegate()
```
As you've seen above, command instance subscription enables access to ancestor and descendant commands in the command hierarchy. And simple `Command` instances may be executed directly via `__call__`. However, above, we instead invoked the `delegate` method. Why?
- `__call__` must be invoked as defined – including its argument signature – which may or may not include `args` and/or `parser` (and which may change during development)
- the `args` and `parser` in the scope of the delegating command – (generally the command actually selected by user argumentation) – reflect the arguments defined for that command, not those of the delegated command
For `Local` command instances, the situation, without `delegate`, is worse:

- to generate system commands (rather than executing them immediately), we must know to target `prepare` rather than `__call__`
For example, our `Stash` command above might look like the following without `delegate`:
```python
class Stash(Command):
    """stash the changes in a dirty working directory away"""

    def __call__(self, args):
        self['save'](args)

    class Save(Command):
        """save your local modifications to a new stash"""

        def __init__(self, parser):
            parser.add_argument(
                '-p', '--patch',
                dest='interactive',
                action='store_true',
                default=False,
                help="interactively select hunks from the diff between HEAD "
                     "and the working tree to be stashed",
            )

        def __call__(self, args):
            interactive = getattr(args, 'interactive', False)
            print("stash save", f"(interactive: {interactive})")
```
Note, in `Stash.__call__`, the passing through of `args`; and, in `Stash.Save.__call__`, the use of `getattr`. With `delegate`, neither is required.
You'll also find that there's the command method `call` (without underscores)! This is a shortcut for `delegate('__call__', …)`: i.e. it will only delegate to the bound command by invoking its `__call__` method (even if it's a `Local` command defining `prepare`).
Whereas `delegate` is useful for switching between commands via their default invocation methods (either `__call__` or `prepare`), and for switching between execution methods of a single command, `call` is useful for ensuring that the bound command will be executed – i.e. that its `__call__` method will be invoked – regardless of its type. This is important to `argcmdr` itself (in `argcmdr.main`), and useful for command delegation across disparate base classes.
In addition to the interface of custom executables, `argcmdr` endeavors to improve the generation and maintainability of non-executable but standardized files, intended for management of code development projects and operations.
Similar to a project's `Makefile`, we might define our previous codebase-management file as the following Python module, `manage.py`:
```python
import os

from argcmdr import Local


class Management(Local):
    """manage deployment"""

    def __init__(self, parser):
        parser.add_argument(
            '-e', '--env',
            choices=('development', 'production'),
            default='development',
            help="target environment",
        )

    class Build(Local):
        """build app"""

        def prepare(self, args):
            req_path = os.path.join('requirements', f'{args.env}.txt')
            yield self.local['pip']['-r', req_path]

    class Deploy(Local):
        """deploy app"""

        def prepare(self, args):
            yield self.local['eb']['deploy', args.env]
```
Unlike our original script, `manage`, `manage.py` is not executable, and need define neither an initial shebang line nor a final `__name__ == '__main__'` block.
Rather, `argcmdr` supplies its own general-purpose `manage` executable command, which loads commands from any `manage.py` in the current directory, or as specified by the option `--manage-file PATH`. As such, the usage and functionality of our `manage.py`, as invoked via argcmdr's installed `manage` command, is identical to our original `manage`. We need only ensure that `argcmdr` is installed, in order to make use of it to manage any or all project tasks, in a standard way, with even less boilerplate.
In lieu of an explicitly defined execution path, `manage` infers the base command – and hence the entrypoint – of the `manage.py` management file module.
The entrypoint of a management file defining – at the module level – only one `Command`, or multiple commands but only one `RootCommand`, is assumed to be this one command. Otherwise, the intended entrypoint must be decorated with `@entrypoint`:
```python
from argcmdr import entrypoint, RootCommand


class GoodCommand(RootCommand):

    def good_function(self):
        ...


@entrypoint
class CommandEhh(GoodCommand):

    def __call__(self):
        self.good_function()
        ...


@CommandEhh.register
class CommandBeh(GoodCommand):

    def __call__(self):
        self.good_function()
        ...
```
We may infer from the above that `GoodCommand` is merely a base class extension, and that the module's CLI begins with the most "root" command, `CommandEhh`, which is extended by `CommandBeh`. However, rather than go out on a limb when presented with these three subclasses of `Command` and `RootCommand`, `argcmdr` requires that the intended entrypoint be explicitly marked. Note, however, that only commands declared at the module, or "top", level are considered potential entrypoints:
```python
class CommandEhh(Command):

    class CommandBeh(Command):
        ...
```
Presented with a module containing only the above commands, `argcmdr` would identify `CommandEhh` as the entrypoint; `CommandBeh` would never be considered, even if decorated with `@entrypoint`.
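The inference rule described above can be modeled in a few lines of plain Python. This is a simplified illustration, not argcmdr's actual implementation – in particular, the `_entrypoint` attribute and the `entrypoint` stand-in here are hypothetical bookkeeping, not argcmdr's internals:

```python
# Simplified model of entrypoint inference: an explicit @entrypoint mark wins;
# otherwise a sole Command, or a sole RootCommand among many, is assumed.
# Illustrative only -- not argcmdr's code.
class Command: ...
class RootCommand(Command): ...

def entrypoint(cls):
    """Stand-in for argcmdr's @entrypoint decorator (hypothetical bookkeeping)."""
    cls._entrypoint = True
    return cls

def infer_entrypoint(namespace):
    commands = [obj for obj in namespace.values()
                if isinstance(obj, type)
                and issubclass(obj, Command)
                and obj not in (Command, RootCommand)]
    # check __dict__ directly so subclasses don't inherit the mark
    marked = [cls for cls in commands if cls.__dict__.get('_entrypoint')]
    if marked:
        return marked[0]
    if len(commands) == 1:
        return commands[0]
    roots = [cls for cls in commands if issubclass(cls, RootCommand)]
    if len(roots) == 1:
        return roots[0]
    raise LookupError("ambiguous entrypoint: decorate one command with @entrypoint")
```

Under this model, a module with one `RootCommand` among several plain `Command`s resolves cleanly, while the `GoodCommand` example above – three `RootCommand` subclasses – would raise, absent the `@entrypoint` mark.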
Python packages, no less than stand-alone modules, may also be defined for use with the `manage` command, to aid in maintenance and development. Consider the following example directory layout:
```
manage/
├── __init__.py
├── cloud.py
├── db.py
├── main.py
├── morale.py
├── server.py
└── util.py
```
`argcmdr` will load the above top-level Python package, `manage`, just as well as it would the `manage` module defined by a `manage.py` file (whether or not these are available on the `PYTHONPATH`). Furthermore, detecting that this `manage` is in fact a package, `argcmdr` will automatically and recursively load all of the modules the package contains. This allows the developer to provide `argcmdr` the minimum that it requires at `manage/__init__.py` – access to an interface entrypoint, i.e. the base `Command` – and to organize the development of that interface in whatever maintainable way suits them.
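The recursive loading behavior can be sketched with the standard library alone. The following is an illustration in the spirit of `argcmdr.init_package()` – not its actual signature or implementation:

```python
# Sketch: recursively import every module in a package, as argcmdr does for
# a manage/ package. Illustrative only -- argcmdr.init_package()'s actual
# signature and behavior may differ.
import importlib
import pkgutil

def load_submodules(package):
    """Import all modules contained in `package`, recursively."""
    for info in pkgutil.walk_packages(package.__path__, package.__name__ + '.'):
        importlib.import_module(info.name)
```

Applied to a layout like the `manage/` package above, this pulls in `manage.cloud`, `manage.db`, and so on – such that each registration decorator runs as a side effect of import.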
To wit, the developer might simply write, in `manage/__init__.py`:

```python
from .main import Main  # noqa
```
(…And they will have no need of the `@entrypoint` decorator, as `argcmdr` will only see the one top-level command.)
Of course, that top-level command might have been defined in `__init__.py`, or, as you might prefer, in `manage/main.py`:

```python
from argcmdr import RootCommand


class Main(RootCommand):
    """your one-stop shop for devops"""
    ...
```
And each subcommand may be defined in its own submodule, such as `manage/cloud.py`:

```python
from argcmdr import Command

from .main import Main


@Main.register
class Cloud(Command):
    """manage cloud computing resources"""
    ...
```
Thanks to automatic loading, the `Cloud` subcommand (which will resolve to `manage cloud`) will be picked up, without additional boilerplate and without needing to consider circular imports.
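A registration decorator like `Main.register` can be modeled simply: it records the decorated class as a subcommand of its parent and returns the class unchanged. This is an illustrative model, not argcmdr's implementation:

```python
# Illustrative model (not argcmdr's code) of a class-registration decorator:
# the parent command records each registered class as a subcommand.
class Command:
    _subcommands = ()

    @classmethod
    def register(cls, subcommand):
        # Assign onto cls so registrations don't leak across unrelated parents.
        cls._subcommands = cls._subcommands + (subcommand,)
        return subcommand

class Main(Command):
    """your one-stop shop for devops"""

@Main.register
class Cloud(Command):
    """manage cloud computing resources"""
```

Because the decorator's side effect is what links `Cloud` to `Main`, merely importing `manage/cloud.py` – which automatic loading guarantees – is enough to wire up the subcommand.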
To disable automatic submodule loading, set the following in `manage/__init__.py`:

```python
__auto_init_package__ = False
```
And to make (custom) use of this feature, see `argcmdr.init_package()`.
To ensure that such a friendly – and relatively high-level – project requirement as `argcmdr` is satisfied, consider the expressly low-level utility install-cli, which can guide contributors through the process of provisioning your project's most basic requirements.
`argcmdr` supports shell-command argument completion via argcomplete.
As explained by its documentation, your user (perhaps as part of installing your command) may enable argument completion, either:
- specifically for your shell command
- or generally for any script containing the string `PYTHON_ARGCOMPLETE_OK` in its first 1024 bytes
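The second, global rule is easy to model. A simplified check of the documented 1024-byte rule might look like this – a sketch, not argcomplete's own (more involved) eligibility check:

```python
# Sketch: the documented rule for argcomplete's global completion -- a script
# is eligible if 'PYTHON_ARGCOMPLETE_OK' appears in its first 1024 bytes.
# Illustrative only; argcomplete's own check is more involved.
def may_autocomplete(path):
    with open(path, 'rb') as f:
        return b'PYTHON_ARGCOMPLETE_OK' in f.read(1024)
```

This is why the marker conventionally appears in a comment near the shebang line, well within the first kilobyte of the script.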
For flexibility (and, e.g., in support of installation into virtual environments, or wherever system- or user-global installation is undesirable or impossible), `argcmdr` does not currently insist on a particular scheme to enable argument completion. Rather, for example, to enable argument completion system-wide, specifically for the `manage` command (provisioned by `argcmdr`), you might execute the following from a Bash shell (as the root user):
```sh
register-python-argcomplete --shell bash manage > /usr/share/bash-completion/completions/python-argcomplete-manage.sh
```
(or, depending upon your system):

```sh
register-python-argcomplete --shell bash manage > /etc/bash_completion.d/python-argcomplete-manage.sh
```
Alternatively, the same argument completion may be enabled, but only for the current user:

```sh
mkdir -p ~/.local/share/bash-completion/completions/

register-python-argcomplete --shell bash manage > ~/.local/share/bash-completion/completions/python-argcomplete-manage.sh
```
(or, as preferred):

```sh
mkdir -p ~/.bash_completion.d

register-python-argcomplete --shell bash manage > ~/.bash_completion.d/python-argcomplete-manage.sh
```
In the latter case only, the user may also require the file `~/.bash_completion`, including contents of the following form:

```bash
if [ -d ~/.bash_completion.d/ ] && [ ! -z "$(ls ~/.bash_completion.d/)" ]; then
    for bcfile in ~/.bash_completion.d/*; do
        . "$bcfile"
    done
fi
```
(such that Bash will load the completion file automatically).
In the case that neither system-wide nor user-only installation is appropriate, the same argument completion may be enabled, but only for the current shell:

```sh
eval "$(register-python-argcomplete --shell bash manage)"
```
Regardless of the method, having so enabled argument completion for your command in your shell, `argcmdr` will handle the rest, generating completion suggestions based on your command definition.