#266 - Added Map File to file structure.
1. Added Map File to file structure.
1. Increased code coverage of `TransactionProcessor` to 100%.
1. Some code refactoring.
thehenrytsai authored Feb 3, 2020
1 parent b4f059b commit 170964a
Showing 17 changed files with 294 additions and 130 deletions.
22 changes: 16 additions & 6 deletions docs/protocol.md
@@ -102,13 +102,15 @@ At such time an ID is published/anchored, a user can provide either the paramete


## Sidetree Operation Batching
The Sidetree protocol increases operation throughput by batching multiple operations together, then anchoring a reference to this operation batch on the blockchain.
For every batch of Sidetree operations created, three files are created and stored in the CAS layer:

1. Batch file - The file containing the actual change data of all the operations batched together.
1. Map file - This file contains references to one or more _batch files_. Currently the map file references only one batch file, but this design allows operation data to be separated into multiple batch files for optimized on-demand resolution.
1. Anchor file - The hash of the _anchor file_ is written to the blockchain as a Sidetree transaction, hence the name _'anchor'_. This file contains the following:
   1. Hash of the _map file_.
   1. Array of DID suffixes (the unique portion of the DID string that differentiates one DID from another) for all DIDs that are declared to have operations within the associated _batch file_.

### Batch File Schema
The _batch file_ is a ZIP compressed JSON document of the following schema:
@@ -122,11 +124,19 @@
}
```

### Map File Schema
The _map file_ is a JSON document of the following schema:
```json
{
"batchFileHash": "Encoded multihash of the batch file."
}
```

### Anchor File Schema
The _anchor file_ is a JSON document of the following schema:
```json
{
"mapFileHash": "Encoded multihash of the map file.",
"didUniqueSuffixes": ["Unique suffix of DID of 1st operation", "Unique suffix of DID of 2nd operation", "..."]
}
```
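Taken together, the three schemas form a hash-linked chain: the blockchain transaction carries the anchor file hash, the anchor file carries the map file hash, and the map file carries the batch file hash. A minimal sketch of that resolution walk, using a hypothetical in-memory CAS keyed by SHA-256 for illustration (the real CAS layer is content addressable storage and the stored files are compressed):

```typescript
import * as crypto from 'crypto';

// Hypothetical in-memory content addressable store, for illustration only.
const cas = new Map<string, string>();
const write = (content: string): string => {
  const hash = crypto.createHash('sha256').update(content).digest('hex');
  cas.set(hash, content);
  return hash;
};

// Store the three files, each referencing the next by hash.
const batchFileHash = write(JSON.stringify({ operations: ['<encoded op 1>', '<encoded op 2>'] }));
const mapFileHash = write(JSON.stringify({ batchFileHash }));
const anchorFileHash = write(JSON.stringify({ mapFileHash, didUniqueSuffixes: ['EiA...', 'EiB...'] }));

// Resolution: the anchor file hash comes from the blockchain transaction,
// then each file is fetched from the CAS and points to the next.
const anchorFile = JSON.parse(cas.get(anchorFileHash)!);
const mapFile = JSON.parse(cas.get(anchorFile.mapFileHash)!);
const batchFile = JSON.parse(cas.get(mapFile.batchFileHash)!);
console.log(batchFile.operations.length); // 2
```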
18 changes: 9 additions & 9 deletions lib/core/versions/latest/AnchorFile.ts
@@ -15,7 +15,7 @@ export default class AnchorFile {
* Parses and validates the given anchor file buffer.
* @throws `SidetreeError` if failed parsing or validation.
*/
public static async parseAndValidate (anchorFileBuffer: Buffer): Promise<AnchorFileModel> {

let anchorFileDecompressedBuffer;
try {
@@ -36,30 +36,30 @@ export default class AnchorFile {
throw new SidetreeError(ErrorCode.AnchorFileHasUnknownProperty);
}

if (!anchorFile.hasOwnProperty('mapFileHash')) {
throw new SidetreeError(ErrorCode.AnchorFileMapFileHashMissing);
}

if (!anchorFile.hasOwnProperty('didUniqueSuffixes')) {
throw new SidetreeError(ErrorCode.AnchorFileDidUniqueSuffixesMissing);
}

// Map file hash validations.
if (typeof anchorFile.mapFileHash !== 'string') {
throw new SidetreeError(ErrorCode.AnchorFileMapFileHashNotString);
}

const didUniqueSuffixBuffer = Encoder.decodeAsBuffer(anchorFile.mapFileHash);
if (!Multihash.isComputedUsingHashAlgorithm(didUniqueSuffixBuffer, ProtocolParameters.hashAlgorithmInMultihashCode)) {
throw new SidetreeError(ErrorCode.AnchorFileMapFileHashUnsupported, `Map file hash '${anchorFile.mapFileHash}' is unsupported.`);
}

// DID Unique Suffixes validations.
if (!Array.isArray(anchorFile.didUniqueSuffixes)) {
throw new SidetreeError(ErrorCode.AnchorFileDidUniqueSuffixesNotArray);
}

if (anchorFile.didUniqueSuffixes.length > ProtocolParameters.maxOperationsPerBatch) {
throw new SidetreeError(ErrorCode.AnchorFileExceededMaxOperationCount);
}

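The `didUniqueSuffixes` checks this file performs, judging by its error codes (array type, batch-size cap, duplicate detection), can be sketched standalone. This is a sketch only: the snake_case strings stand in for the `ErrorCode` constants and plain `Error` for `SidetreeError`:

```typescript
// Sketch of the didUniqueSuffixes validations, with plain Errors
// standing in for SidetreeError and its error codes.
function validateDidUniqueSuffixes(didUniqueSuffixes: unknown, maxOperationsPerBatch: number): void {
  if (!Array.isArray(didUniqueSuffixes)) {
    throw new Error('anchor_file_did_unique_suffixes_not_array');
  }
  if (didUniqueSuffixes.length > maxOperationsPerBatch) {
    throw new Error('anchor_file_exceeded_max_operation_count');
  }
  // A batch writer must not anchor two operations for the same DID.
  if (new Set(didUniqueSuffixes).size !== didUniqueSuffixes.length) {
    throw new Error('anchor_file_did_unique_suffixes_has_duplicates');
  }
}
```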
6 changes: 3 additions & 3 deletions lib/core/versions/latest/BatchFile.ts
@@ -30,7 +30,7 @@ export default class BatchFile {
let endTimer = timeSpan();
const decompressedBatchFileBuffer = await Compressor.decompress(batchFileBuffer);
const batchFileObject = await JsonAsync.parse(decompressedBatchFileBuffer);
console.info(`Parsed batch file in ${endTimer.rounded()} ms.`);

// Ensure only properties specified by Sidetree protocol are given.
const allowedProperties = new Set(['operations']);
@@ -68,7 +68,7 @@
if (batchSize !== operationCountInAnchorFile) {
throw new SidetreeError(
ErrorCode.BatchFileOperationCountMismatch,
`Batch size of ${batchSize} in batch file does not match size of ${operationCountInAnchorFile} in anchor file.`
);
}

@@ -106,7 +106,7 @@

namedAnchoredOperationModels.push(operation);
}
console.info(`Decoded ${batchSize} operations in batch file. Time taken: ${endTimer.rounded()} ms.`);

return namedAnchoredOperationModels;
}
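The consistency rule behind the error message above can be restated in isolation: the operation count in the batch file must equal the count implied by the anchor file. A trivial sketch, with a plain `Error` in place of `SidetreeError`:

```typescript
// The number of operations in the batch file must match the
// operation count implied by the anchor file it is linked from.
function checkOperationCount(batchSize: number, operationCountInAnchorFile: number): void {
  if (batchSize !== operationCountInAnchorFile) {
    throw new Error(
      `Batch size of ${batchSize} in batch file does not match size of ${operationCountInAnchorFile} in anchor file.`
    );
  }
}
```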
10 changes: 8 additions & 2 deletions lib/core/versions/latest/BatchWriter.ts
@@ -8,6 +8,7 @@ import ICas from '../../interfaces/ICas';
import IBatchWriter from '../../interfaces/IBatchWriter';
import IBlockchain from '../../interfaces/IBlockchain';
import IOperationQueue from './interfaces/IOperationQueue';
import MapFile from './MapFile';
import Operation from './Operation';
import ProtocolParameters from './ProtocolParameters';

@@ -39,16 +40,21 @@ export default class BatchWriter implements IBatchWriter {
// Create the batch file buffer from the operation batch.
const batchFileBuffer = await BatchFile.fromOperationBuffers(operationBuffers);

// Write the batch file to content addressable store.
const batchFileHash = await this.cas.write(batchFileBuffer);
console.info(`Wrote batch file ${batchFileHash} to content addressable store.`);

// Write the map file to content addressable store.
const mapFileBuffer = await MapFile.createBuffer(batchFileHash);
const mapFileHash = await this.cas.write(mapFileBuffer);
console.info(`Wrote map file ${mapFileHash} to content addressable store.`);

// Construct the DID unique suffixes of each operation to be included in the anchor file.
const didUniqueSuffixes = batch.map(operation => operation.didUniqueSuffix);

// Construct the 'anchor file'.
const anchorFileModel: AnchorFileModel = {
mapFileHash,
didUniqueSuffixes
};

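The ordering in this writer matters: the batch file must be written first because its hash is a field of the map file, and the map file must be written before the anchor file for the same reason. A sketch of that ordering, with `Cas` reduced to a hypothetical one-method interface and compression omitted:

```typescript
// Sketch of the BatchWriter write ordering only; compression and the
// real CAS implementation are stubbed out.
interface Cas { write(content: Buffer): Promise<string>; }

async function writeBatch(cas: Cas, operationBuffers: Buffer[], didUniqueSuffixes: string[]) {
  // 1. Batch file first: its hash is needed to build the map file.
  const batchFileHash = await cas.write(Buffer.from(JSON.stringify({
    operations: operationBuffers.map(b => b.toString('base64')),
  })));

  // 2. Map file references the batch file.
  const mapFileHash = await cas.write(Buffer.from(JSON.stringify({ batchFileHash })));

  // 3. Anchor file references the map file; its hash is what gets
  //    written to the blockchain as the Sidetree transaction.
  const anchorFileModel = { mapFileHash, didUniqueSuffixes };
  return anchorFileModel;
}
```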
19 changes: 10 additions & 9 deletions lib/core/versions/latest/ErrorCode.ts
@@ -7,33 +7,30 @@ export default {
AnchoredDataNumberOfOperationsLessThanZero: 'anchored_data_number_of_operations_less_than_zero',
AnchoredDataNumberOfOperationsNotFourBytes: 'anchored_data_number_of_operations_not_four_bytes',
AnchoredDataNumberOfOperationsNotInteger: 'anchored_data_number_of_operations_not_integer',
AnchorFileMapFileHashMissing: 'anchor_file_map_file_hash_missing',
AnchorFileMapFileHashNotString: 'anchor_file_map_file_hash_not_string',
AnchorFileMapFileHashUnsupported: 'anchor_file_map_file_hash_unsupported',
AnchorFileDecompressionFailure: 'anchor_file_decompression_failed',
AnchorFileDidUniqueSuffixEntryNotString: 'anchor_file_did_unique_suffix_entry_not_string',
AnchorFileDidUniqueSuffixesHasDuplicates: 'anchor_file_did_unique_suffixes_has_duplicates',
AnchorFileDidUniqueSuffixesMissing: 'anchor_file_did_unique_suffixes_missing',
AnchorFileDidUniqueSuffixesNotArray: 'anchor_file_did_unique_suffixes_not_array',
AnchorFileDidUniqueSuffixTooLong: 'anchor_file_did_unique_suffix_too_long',
AnchorFileExceededMaxOperationCount: 'anchor_file_exceeded_max_operation_count',
AnchorFileHashNotValid: 'anchor_file_hash_not_valid',
AnchorFileHasUnknownProperty: 'anchor_file_has_unknown_property',
AnchorFileNotAFile: 'anchor_file_not_a_file',
AnchorFileNotJson: 'anchor_file_not_json',
AnchorFileTooLarge: 'anchor_file_too_large',
BatchFileHashNotValid: 'batch_file_hash_not_valid',
BatchFileNotAFile: 'batch_file_not_a_file',
BatchFileOperationCountExceedsLimit: 'batch_file_operation_count_exceeds_limit',
BatchFileOperationCountMismatch: 'batch_file_operation_count_mismatch',
BatchFileOperationsNotArrayOfStrings: 'batch_file_operations_not_array_of_string',
BatchFileOperationsPropertyNotArray: 'batch_file_operations_property_not_array',
BatchFileOperationMismatch: 'batch_file_operation_mismatch',
BatchFileOperationSizeExceedsLimit: 'batch_file_operation_size_exceeds_limit',
BatchFileTooLarge: 'batch_file_too_large',
BatchFileUnexpectedProperty: 'batch_file_unexpected_property',
BatchWriterAlreadyHasOperationForDid: 'batch_writer_already_has_operation_for_did',
CasFileHashNotValid: 'cas_file_hash_not_valid',
CasFileNotAFile: 'cas_file_not_a_file',
CasFileNotFound: 'cas_file_not_found',
CasFileTooLarge: 'cas_file_too_large',
CasNotReachable: 'cas_not_reachable',
DidEncodedDidDocumentHashMismatch: 'did_encoded_did_document_hash_mismatch',
DidIncorrectPrefix: 'did_incorrect_prefix',
@@ -42,6 +39,10 @@
DocumentIncorretEncodedFormat: 'document_incorrect_encoded_format',
DocumentNotJson: 'document_not_json',
DocumentNotValidOriginalDocument: 'document_not_valid_original_document',
MapFileBatchFileHashMissingOrIncorrectType: 'map_file_batch_file_hash_missing_or_incorrect_type',
MapFileDecompressionFailure: 'map_file_decompression_failure',
MapFileHasUnknownProperty: 'map_file_has_unknown_property',
MapFileNotJson: 'map_file_not_json',
MultihashNotLatestSupportedHashAlgorithm: 'multihash_not_latest_supported_hash_algorithm',
MultihashUnsupportedHashAlgorithm: 'multihash_unsupported_hash_algorithm',
OperationCreateInvalidDidDocument: 'operation_create_invalid_did_document',
54 changes: 54 additions & 0 deletions lib/core/versions/latest/MapFile.ts
@@ -0,0 +1,54 @@
import Compressor from './util/Compressor';
import ErrorCode from './ErrorCode';
import JsonAsync from './util/JsonAsync';
import MapFileModel from './models/MapFileModel';
import SidetreeError from '../../SidetreeError';

/**
* Class containing Map File related operations.
*/
export default class MapFile {
/**
* Parses and validates the given map file buffer.
* @throws `SidetreeError` if failed parsing or validation.
*/
public static async parseAndValidate (mapFileBuffer: Buffer): Promise<MapFileModel> {

let decompressedBuffer;
try {
decompressedBuffer = await Compressor.decompress(mapFileBuffer);
} catch (error) {
throw SidetreeError.createFromError(ErrorCode.MapFileDecompressionFailure, error);
}

let mapFile;
try {
mapFile = await JsonAsync.parse(decompressedBuffer);
} catch (error) {
throw SidetreeError.createFromError(ErrorCode.MapFileNotJson, error);
}

const mapFileProperties = Object.keys(mapFile);
if (mapFileProperties.length > 1) {
throw new SidetreeError(ErrorCode.MapFileHasUnknownProperty);
}

if (typeof mapFile.batchFileHash !== 'string') {
throw new SidetreeError(ErrorCode.MapFileBatchFileHashMissingOrIncorrectType);
}

return mapFile;
}

/**
* Creates the Map File buffer.
*/
public static async createBuffer (batchFileHash: string): Promise<Buffer> {
const mapFileModel = { batchFileHash };

const rawData = JSON.stringify(mapFileModel);
const compressedRawData = await Compressor.compress(Buffer.from(rawData));

return compressedRawData;
}
}
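The two halves of `MapFile` form a round trip: `createBuffer` compresses the JSON, and `parseAndValidate` decompresses, parses, and checks it. A self-contained sketch assuming the `Compressor` utility is gzip-based (an assumption), with snake_case strings standing in for the `ErrorCode` constants:

```typescript
import * as zlib from 'zlib';

// Standalone sketch of MapFile.createBuffer, assuming gzip compression.
function createMapFileBuffer(batchFileHash: string): Buffer {
  return zlib.gzipSync(Buffer.from(JSON.stringify({ batchFileHash })));
}

// Standalone sketch of MapFile.parseAndValidate.
function parseAndValidateMapFile(mapFileBuffer: Buffer): { batchFileHash: string } {
  let decompressed: Buffer;
  try {
    decompressed = zlib.gunzipSync(mapFileBuffer);
  } catch {
    throw new Error('map_file_decompression_failure');
  }

  let mapFile: any;
  try {
    mapFile = JSON.parse(decompressed.toString());
  } catch {
    throw new Error('map_file_not_json');
  }

  // Reject any property beyond the single allowed one.
  if (Object.keys(mapFile).length > 1) {
    throw new Error('map_file_has_unknown_property');
  }
  if (typeof mapFile.batchFileHash !== 'string') {
    throw new Error('map_file_batch_file_hash_missing_or_incorrect_type');
  }
  return mapFile;
}

const roundTripped = parseAndValidateMapFile(createMapFileBuffer('EiA-example-hash'));
console.log(roundTripped.batchFileHash); // 'EiA-example-hash'
```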
12 changes: 8 additions & 4 deletions lib/core/versions/latest/Multihash.ts
@@ -1,6 +1,7 @@
import * as crypto from 'crypto';
import Encoder from './Encoder';
import ErrorCode from './ErrorCode';
import ProtocolParameters from './ProtocolParameters';
import SidetreeError from '../../SidetreeError';
const multihashes = require('multihashes');

@@ -10,20 +11,23 @@ const multihashes = require('multihashes');
export default class Multihash {
/**
* Hashes the content using the hashing algorithm specified.
* @param hashAlgorithmInMultihashCode The hashing algorithm to use. If not given, latest supported hashing algorithm will be used.
*/
public static hash (content: Buffer, hashAlgorithmInMultihashCode?: number): Buffer {
if (hashAlgorithmInMultihashCode === undefined) {
hashAlgorithmInMultihashCode = ProtocolParameters.hashAlgorithmInMultihashCode;
}

let hash;
switch (hashAlgorithmInMultihashCode) {
case 18: // SHA256
hash = crypto.createHash('sha256').update(content).digest();
break;
default:
throw new SidetreeError(ErrorCode.MultihashUnsupportedHashAlgorithm);
}

const hashAlgorithmName = multihashes.codes[hashAlgorithmInMultihashCode];
const multihash = multihashes.encode(hash, hashAlgorithmName);

return multihash;
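The multihash encoding this class produces for code 18 (sha2-256) is simple enough to sketch without the `multihashes` package: a one-byte algorithm code `0x12`, a one-byte digest length `0x20`, then the 32-byte digest:

```typescript
import * as crypto from 'crypto';

// Sketch of sha2-256 multihash encoding without the multihashes package:
// [0x12 (sha2-256 code), 0x20 (32-byte length), ...digest].
function sha256Multihash(content: Buffer): Buffer {
  const digest = crypto.createHash('sha256').update(content).digest();
  return Buffer.concat([Buffer.from([0x12, digest.length]), digest]);
}

const multihash = sha256Multihash(Buffer.from('hello'));
console.log(multihash.length); // 34: 2-byte header + 32-byte digest
```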
4 changes: 1 addition & 3 deletions lib/core/versions/latest/Operation.ts
@@ -8,7 +8,6 @@ import JwsModel from './models/JwsModel';
import KeyUsage from './KeyUsage';
import Multihash from './Multihash';
import OperationType from '../../enums/OperationType';
import SidetreeError from '../../SidetreeError';

/**
@@ -134,9 +133,8 @@ export default class Operation {
* Computes the cryptographic multihash of the given string.
*/
private static computeHash (dataString: string): string {
const encodedOperationPayloadBuffer = Buffer.from(dataString);
const multihash = Multihash.hash(encodedOperationPayloadBuffer);
const encodedMultihash = Encoder.encode(multihash);
return encodedMultihash;
}
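`computeHash` combines the two previous pieces: multihash the data, then encode with `Encoder`. A self-contained sketch assuming `Encoder` is unpadded base64url (an assumption about the Sidetree `Encoder` utility):

```typescript
import * as crypto from 'crypto';

// Sketch of Operation.computeHash: sha2-256 multihash of the data,
// then base64url-encode without padding (assumed Encoder behavior).
function computeHash(dataString: string): string {
  const digest = crypto.createHash('sha256').update(Buffer.from(dataString)).digest();
  const multihash = Buffer.concat([Buffer.from([0x12, 0x20]), digest]);
  return multihash.toString('base64')
    .replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');
}

// Every sha2-256 multihash starts with bytes 0x12 0x20, so every
// encoded hash starts with "Ei" - the familiar Sidetree suffix prefix.
console.log(computeHash('example').startsWith('Ei')); // true
```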