Documentation edits to rephrase optional plurals (#6643)
Rick Larsen authored Oct 19, 2021
1 parent 8b164fa commit f8f35dc
Showing 26 changed files with 60 additions and 58 deletions.
@@ -105,7 +105,7 @@ The Endpoint bundle contains a Web service that exposes the interface to retriev
The Endpoint calls the `CatalogFramework` to execute the operations of its specification.
The `CatalogFramework` relies on the Sources to execute the actual resource retrieval.
Optional `PreResource` and `PostResource` Catalog Plugins may be invoked by the `CatalogFramework` to modify the resource retrieval request/response prior to the Catalog Provider processing the request and providing the response.
It is possible to retrieve resources from specific remote Sources by specifying the site name(s) in the request.
It is possible to retrieve resources from specific remote Sources by specifying the site names in the request.
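For illustration, a minimal Java sketch of such a site-directed retrieval, assuming the standard `ddf.catalog.CatalogFramework` OSGi service and the `ResourceRequestById` implementation class; the metacard ID and site name passed in are hypothetical.

.Retrieving a resource from a named remote Source (illustrative sketch)
[source,java]
----
import java.io.IOException;

import ddf.catalog.CatalogFramework;
import ddf.catalog.operation.ResourceRequest;
import ddf.catalog.operation.ResourceResponse;
import ddf.catalog.operation.impl.ResourceRequestById;
import ddf.catalog.resource.ResourceNotFoundException;
import ddf.catalog.resource.ResourceNotSupportedException;

public class RemoteResourceRetriever {

    private final CatalogFramework catalogFramework; // injected OSGi service

    public RemoteResourceRetriever(CatalogFramework catalogFramework) {
        this.catalogFramework = catalogFramework;
    }

    public ResourceResponse retrieveFromSite(String metacardId, String siteName)
            throws IOException, ResourceNotFoundException, ResourceNotSupportedException {
        // Request the product by its metacard ID; the framework resolves the product URI.
        ResourceRequest request = new ResourceRequestById(metacardId);
        // Supplying a site name directs the retrieval to that specific remote Source.
        return catalogFramework.getResource(request, siteName);
    }
}
----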

.Product Caching
Product Caching is enabled by default.
@@ -162,7 +162,8 @@ The Endpoint calls the `CatalogFramework` to execute the operations of its speci
The `CatalogFramework` relies on the `CatalogProvider` to execute the actual query.
Optional PreQuery and PostQuery Catalog Plugins may be invoked by the `CatalogFramework` to modify the query request/response prior to the Catalog Provider processing the query request and providing the query response.
If a `CatalogProvider` is not configured and no other remote Sources are configured, a fault will be returned.
It is possible to have only remote Sources configured and no local `CatalogProvider` configured and be able to execute queries to specific remote Sources by specifying the site name(s) in the query request.
It is possible to have only remote Sources configured and no local `CatalogProvider` configured and be able to execute queries to specific remote Sources by specifying the site names in the query request.
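A minimal Java sketch of a query scoped to named remote Sources, assuming the `ddf.catalog.CatalogFramework` service, the `QueryImpl`/`QueryRequestImpl` implementation classes, and an injected `FilterBuilder`; the filter criterion and source names are hypothetical.

.Querying specific remote Sources by site name (illustrative sketch)
[source,java]
----
import java.util.Arrays;

import org.opengis.filter.Filter;

import ddf.catalog.CatalogFramework;
import ddf.catalog.federation.FederationException;
import ddf.catalog.filter.FilterBuilder;
import ddf.catalog.operation.QueryRequest;
import ddf.catalog.operation.QueryResponse;
import ddf.catalog.operation.impl.QueryImpl;
import ddf.catalog.operation.impl.QueryRequestImpl;
import ddf.catalog.source.SourceUnavailableException;
import ddf.catalog.source.UnsupportedQueryException;

public class SiteScopedQuery {

    private final CatalogFramework catalogFramework; // injected OSGi services
    private final FilterBuilder filterBuilder;

    public SiteScopedQuery(CatalogFramework catalogFramework, FilterBuilder filterBuilder) {
        this.catalogFramework = catalogFramework;
        this.filterBuilder = filterBuilder;
    }

    public QueryResponse queryRemoteSites()
            throws UnsupportedQueryException, SourceUnavailableException, FederationException {
        // Hypothetical criterion: titles containing the word "sensor".
        Filter filter = filterBuilder.attribute("title").is().like().text("*sensor*");

        // isEnterprise=false plus an explicit list of site names scopes the query
        // to those remote Sources only; no local CatalogProvider is required.
        QueryRequest request = new QueryRequestImpl(
                new QueryImpl(filter), false, Arrays.asList("Remote-Site-1", "Remote-Site-2"), null);

        return catalogFramework.query(request);
    }
}
----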


==== Product Caching

@@ -18,12 +18,12 @@ Use this framework if:
* queries to specific sites are required.
* queries to only the local provider are required.
It is possible to have only remote Sources configured with no local `CatalogProvider` configured and be able to execute queries to specific remote sources by specifying the site name(s) in the query request.
It is possible to have only remote Sources configured with no local `CatalogProvider` configured and be able to execute queries to specific remote sources by specifying the site names in the query request.

The Standard Catalog Framework also maintains a list of `ResourceReaders` for resource retrieval operations.
A resource reader is matched to the scheme (the protocol, such as `file://`) in the URI of the resource specified in the request to be retrieved.
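The scheme is simply the leading component of the resource URI, as the short example below shows; the product location is hypothetical.

.Scheme portion of a resource URI (illustrative sketch)
[source,java]
----
import java.net.URI;

public class SchemeExample {
    public static void main(String[] args) {
        // Hypothetical product location; the scheme portion is "file".
        URI resourceUri = URI.create("file:///data/products/image-001.ntf");

        // A ResourceReader that advertises support for the "file" scheme
        // would be selected to retrieve this resource.
        System.out.println(resourceUri.getScheme()); // prints: file
    }
}
----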

Site information about the catalog provider and/or any federated source(s) can be retrieved using the Standard Catalog Framework.
Site information about the catalog provider and/or any federated sources can be retrieved using the Standard Catalog Framework.
Site information includes the source's name, version, availability, and the list of unique content types currently stored in the source (such as NITF).
If no local catalog provider is configured, the site information returned includes site info for the catalog framework with no content types included.
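A sketch of reading that site information programmatically, assuming the `SourceInfoRequestEnterprise` implementation class and the `SourceDescriptor` accessors shown here; verify the exact signatures against the installed version.

.Listing site information for all configured sources (illustrative sketch)
[source,java]
----
import ddf.catalog.CatalogFramework;
import ddf.catalog.operation.SourceInfoResponse;
import ddf.catalog.operation.impl.SourceInfoRequestEnterprise;
import ddf.catalog.source.SourceDescriptor;
import ddf.catalog.source.SourceUnavailableException;

public class SiteInfoLister {

    private final CatalogFramework catalogFramework; // injected OSGi service

    public SiteInfoLister(CatalogFramework catalogFramework) {
        this.catalogFramework = catalogFramework;
    }

    public void printSiteInfo() throws SourceUnavailableException {
        // "true" asks each source to report the content types it currently stores.
        SourceInfoResponse response =
                catalogFramework.getSourceInfo(new SourceInfoRequestEnterprise(true));

        for (SourceDescriptor descriptor : response.getSourceInfo()) {
            System.out.printf("%s %s available=%s contentTypes=%s%n",
                    descriptor.getSourceId(),
                    descriptor.getVersion(),
                    descriptor.isAvailable(),
                    descriptor.getContentTypes());
        }
    }
}
----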

@@ -87,7 +87,7 @@ The URL to the application's KAR file in this Maven repository should be the ins

== Including Data Files in a KAR File

The developer may need to include data or configuration file(s) in a KAR file.
The developer may need to include data or configuration files in a KAR file.
An example of this is a properties file for the JDBC connection properties of a catalog provider.

It is recommended that:
@@ -100,7 +100,7 @@ Sub-directories under `src/main/resources` can be used, for example, `etc/securi
== Installing a KAR File

When the user downloads an application by clicking on the *Installation* link, the application's KAR file is downloaded.
To install manually, the KAR file can be placed in the `${home_directory}/deploy` directory of the running ${branding} instance. ${branding} then detects that a file with a `.kar` file extension has been placed in this monitored directory, unzips the KAR file into the `${home_directory}/system` directory, and installs the bundle(s) listed in the KAR file's feature descriptor file.
To install manually, the KAR file can be placed in the `${home_directory}/deploy` directory of the running ${branding} instance. ${branding} then detects that a file with a `.kar` file extension has been placed in this monitored directory, unzips the KAR file into the `${home_directory}/system` directory, and installs the bundles listed in the KAR file's feature descriptor file.
To install via the ${admin-console}:
. Navigate to the *${admin-console}*.
. Select *Manage* button.
@@ -162,7 +162,7 @@ public class SourcesPlugin extends AbstractApplicationPlugin {

=== Including KAR Files

Sometimes a developer may need to include data or configuration file(s) in a KAR file.
Sometimes a developer may need to include data or configuration files in a KAR file.
An example of this would be a properties file for the JDBC connection properties of a catalog provider.

It is recommended that:
@@ -70,7 +70,7 @@ The `validator` key must have a value of one of the following:
- `enumeration`
* `arguments`: (unlimited) [list of strings: each argument is one case-sensitive, valid enumeration value]
- `relationship`
* `arguments`: (4+) [attribute value or null, one of mustHave|cannotHave|canOnlyHave, target attribute name, null or target attribute value(s) as additional arguments]
* `arguments`: (4+) [attribute value or null, one of mustHave|cannotHave|canOnlyHave, target attribute name, null or target attribute values as additional arguments]
- `match_any`
* `validators`: (unlimited) [list of previously defined validators: valid if any validator succeeds]
====
@@ -46,7 +46,7 @@ Many alternative tools exist, and OSGi manifest files can also be created by han

Avoid creating bundles by hand or editing a manifest file:: Many tools exist for creating bundles, notably the Maven Bundle plugin, which handles the details of OSGi configuration and automates the bundling process, including generation of the manifest file.
Always make a distinction on which imported packages are `optional` or `required`:: Requiring every package when not necessary can cause an unnecessary dependency ripple effect among bundles.
Embedding is an implementation detail:: Using the `Embed-Dependency` instruction provided by the `maven-bundle-plugin` will insert the specified jar(s) into the target archive and add them to the `Bundle-ClassPath`. These jars and their contained packages/classes are not for public consumption; they are for the internal implementation of this service implementation only.
Embedding is an implementation detail:: Using the `Embed-Dependency` instruction provided by the `maven-bundle-plugin` will insert the specified jars into the target archive and add them to the `Bundle-ClassPath`. These jars and their contained packages/classes are not for public consumption; they are for the internal implementation of this service implementation only.
Bundles should never be embedded:: Bundles expose service implementations; they do not provide arbitrary classes to be used by other bundles.
Bundles should expose service implementations:: This is the corollary to the previous rule. Bundles should not be created when arbitrary concrete classes are being extracted to a library. In that case, a library/jar is the appropriate module packaging type.
Bundles should generally _only_ export service packages:: If there are packages internal to a bundle that comprise its implementation but not its public manifestation of the API, they should be excluded from export and kept as private packages.
@@ -97,7 +97,7 @@ However, the distributions frequently upgrade CXF between releases to take advan
If building on these components, be aware of the version upgrades with each distribution release.

Instead, component developers should package and deliver their own dependencies to ensure future compatibility.
For example, if re-using a bundle, the specific bundle version that you are depending on should be included in your packaged release, and the proper versions should be referenced in your bundle(s).
For example, if re-using a bundle, the specific bundle version that you are depending on should be included in your packaged release, and the proper versions should be referenced in your bundles.

=== Deploying a Bundle

@@ -198,7 +198,7 @@ Optionally, set the `ElementSetName` to determine how much detail to return.

* Brief: the least possible detail.
* Summary: (Default)
* Full: All metadata elements for the record(s).
* Full: All metadata elements for the records.

Within the `Constraint` element, define the query as an OGC or CQL filter.

@@ -4,7 +4,7 @@
:operations: na
:order: 00

(((Federation))) with ${branding} is primarily accomplished through <<{integrating-prefix}endpoints,Endpoints>> accessible through http(s) requests and responses.
(((Federation))) with ${branding} is primarily accomplished through <<{integrating-prefix}endpoints,Endpoints>> accessible through https requests and responses.

[NOTE]
====
@@ -32,7 +32,8 @@ Notably, the Catalog allows event evaluation on both the previous value (if avai
Subscription durability is not provided by the Event Processor.
Thus, all subscriptions are transient and will not be recreated in the event of a system restart.
It is the responsibility of Endpoints using subscriptions to persist and re-establish the subscription on startup.
This decision was made for the sake of simplicity, flexibility, and the inability of the Event Processor to recreate a fully-configured Delivery Method without being overly restrictive.

This decision was made for the sake of simplicity, flexibility, and the inability of the Event Processor to recreate a fully configured Delivery Method without being overly restrictive.

[IMPORTANT]
====
@@ -31,4 +31,4 @@ Temporal searches can use the `created` or `modified` date attributes.
Datatype Search:: A datatype search is used to search for metadata based on the datatype of the resource.
Wildcards (*) can be used in both the datatype and version fields.
Metadata that matches any of the datatypes (and associated versions if specified) will be returned.
If a version is not specified, then all metadata records for the specified datatype(s) regardless of version will be returned.
If a version is not specified, then all metadata records for the specified datatypes regardless of version will be returned.
@@ -13,6 +13,6 @@ Prevent data with errors or warnings from being ingested at all.
. Select the *${ddf-catalog}* application.
. Select *Configuration*.
. Select *Metacard Validation Marker Plugin*.
. Enter *ID* of validator(s) to enforce.
. Enter *ID* of each validator to enforce.
. Select *Enforce errors* to prevent ingest for errors.
. Select *Enforce warnings* to prevent ingest for warnings.
@@ -10,7 +10,7 @@
To change HTTP or HTTPS ports from the default values, edit the `custom.system.properties` file.

. Open the file at `${home_directory}/etc/custom.system.properties`.
. Change the value after the `=` to the desired port number(s):
. Change the value after the `=` to the desired port numbers:
.. `org.codice.ddf.system.httpsPort=8993` to `org.codice.ddf.system.httpsPort=<PORT>`
.. `org.codice.ddf.system.httpPort=8181` to `org.codice.ddf.system.httpPort=<PORT>`
. Restart ${branding} for changes to take effect.
@@ -8,7 +8,7 @@
== {title}

The ((Landing Page)) is the first page presented to users of ${branding}.
It is customizable to allow adding organizationally-relevant content.
It is customizable to allow adding organizationally relevant content.

=== Installing the Landing Page

@@ -25,4 +25,4 @@ Attribute overrides are available for the following ingest methods:
.. *Confluence Connected Source*.
.. *Confluence Federated Source*.
. Select *Attribute Overrides*.
. Enter the key-value pair for the attribute to override and the value(s) to set.
. Enter the key-value pair for the attribute to override and the values to set.
@@ -11,7 +11,7 @@ The ${ddf-catalog} application contains a ((Content Directory Monitor)) feature
For more information about configuring a directory to be monitored, see <<{managing-prefix}configuring_the_content_directory_monitor,Configuring the Content Directory Monitor>>.

Files placed in the monitored directory will be ingested automatically.
If a file cannot be ingested, it will be moved to an automatically-created directory named `.errors`.
If a file cannot be ingested, it will be moved to an automatically created directory named `.errors`.
More information about the ingest operations can be found in the ingest log.
The default location of the log is `${home_directory}/data/log/ingest_error.log`.
Optionally, ingested files can be automatically moved to a directory called `.ingested`.
@@ -20,7 +20,7 @@ The syntax for the ingest command is

`ingest -t <transformer type> <file path>`

Select the `<transformer type>` based on the type of file(s) ingested.
Select the `<transformer type>` based on the type of files ingested.
Metadata will be extracted if it exists in a format compatible with the transformer.
The default transformer is the <<{developing-prefix}xml_input_transformer,XML input transformer>>, which supports the metadata schema `catalog:metacard`.
To see a list of all transformers currently installed, and the file types supported by each, run the `catalog:transformers` command.
@@ -6,7 +6,7 @@

== {title}

((Ingesting)) is the process of getting metacard(s) into the Catalog Framework.
((Ingesting)) is the process of getting metacards into the Catalog Framework.
Ingested files are "transformed" into a neutral format that can be searched against as well as migrated to other formats and systems.
There are multiple methods available for ingesting files into the ${branding}.
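For developers, the same operation can also be exercised programmatically; a minimal sketch, assuming the `MetacardImpl` and `CreateRequestImpl` implementation classes and an injected `CatalogFramework`, with hypothetical attribute values.

.Ingesting a metacard through the Catalog Framework (illustrative sketch)
[source,java]
----
import java.util.List;

import ddf.catalog.CatalogFramework;
import ddf.catalog.data.Metacard;
import ddf.catalog.data.impl.MetacardImpl;
import ddf.catalog.operation.CreateResponse;
import ddf.catalog.operation.impl.CreateRequestImpl;
import ddf.catalog.source.IngestException;
import ddf.catalog.source.SourceUnavailableException;

public class ProgrammaticIngest {

    private final CatalogFramework catalogFramework; // injected OSGi service

    public ProgrammaticIngest(CatalogFramework catalogFramework) {
        this.catalogFramework = catalogFramework;
    }

    public List<Metacard> ingestExample() throws IngestException, SourceUnavailableException {
        // Build a minimal metacard; file-based ingest paths normally derive
        // these attributes from an InputTransformer instead.
        MetacardImpl metacard = new MetacardImpl();
        metacard.setTitle("Example Record");
        metacard.setMetadata("<metadata>example</metadata>");

        CreateResponse response = catalogFramework.create(new CreateRequestImpl(metacard));
        return response.getCreatedMetacards();
    }
}
----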

@@ -21,10 +21,10 @@ The subscriptions commands are installed when the Catalog application is install
|Description

|subscriptions:delete
|Deletes the subscription(s) specified by the search phrase or LDAP filter.
|Deletes the subscriptions specified by the search phrase or LDAP filter.

|subscriptions:list
|Lists the subscription(s) specified by the search phrase or LDAP filter.
|Lists the subscriptions specified by the search phrase or LDAP filter.
|===

.subscriptions:list Command Usage Examples
@@ -26,7 +26,7 @@ https://{FQDN}:{PORT}/solr/metacard_cache/update?stream.body=<delete><query>*:*<
----
https://{FQDN}:{PORT}/solr/metacard_cache/update?stream.body=<delete><query>original_id_txt:50ffd32b21254c8a90c15fccfb98f139</query></delete>&commit=true
----
* To delete record(s) in the Solr cache using a query on a field in the record(s) - in this example, the `title_txt` field is being used with wildcards to search for any records with the word remote in the title:
* To delete records in the Solr cache using a query on a field in the records - in this example, the `title_txt` field is being used with wildcards to search for any records with the word remote in the title:

.Deletion of records in Solr query cache using search criteria
----
@@ -24,15 +24,15 @@ ingested is valid. Below is a list of the validators run against the data ingest

=== Validators run on ingest

* *((Size Validator))*: Validates the size of an attribute's value(s).
* *((Range Validator))*: Validates an attribute's value(s) against an *inclusive* numeric range.
* *((Enumeration Validator))*: Validates an attribute's value(s) against a set of acceptable values.
* *((Future Date Validator))*: Validates an attribute's value(s) against the current date and time,
* *((Size Validator))*: Validates the size of an attribute's values.
* *((Range Validator))*: Validates an attribute's values against an *inclusive* numeric range.
* *((Enumeration Validator))*: Validates an attribute's values against a set of acceptable values.
* *((Future Date Validator))*: Validates an attribute's values against the current date and time,
validating that they are in the future.
* *((Past Date Validator))*: Validates an attribute's value(s) against the current date and time,
* *((Past Date Validator))*: Validates an attribute's values against the current date and time,
validating that they are in the past.
* *((ISO3 Country Code Validator))*: Validates an attribute's value(s) against the ISO_3166-1 Alpha3 country codes.
* *((Pattern Validator))*: Validates an attribute's value(s) against a regular expression.
* *((ISO3 Country Code Validator))*: Validates an attribute's values against the ISO_3166-1 Alpha3 country codes.
* *((Pattern Validator))*: Validates an attribute's values against a regular expression.
* *((Required Attributes Metacard Validator))*: Validates that a metacard contains certain attributes.
- ID: `ddf.catalog.validation.impl.validator.RequiredAttributesMetacardValidator`
* *((Duplication Validator))*: Validates metacard against the local catalog for duplicates based on configurable attributes.
@@ -12,7 +12,7 @@
For hardening purposes, it is recommended to implement a way to check certificates against a ((Certificate Revocation List)) (CRL) at least daily, or to use an ((Online Certificate Status Protocol (OCSP) server)).

=== Managing a Certificate Revocation List (CRL)
A Certificate Revocation List is a collection of formerly-valid certificates that should explicitly _not_ be accepted.
A Certificate Revocation List is a collection of formerly valid certificates that should explicitly _not_ be accepted.

==== Creating a CRL
