Support payload argument separation similar to DltViewer #10

Open · wants to merge 1 commit into base: master
9 changes: 5 additions & 4 deletions README.md
@@ -10,7 +10,7 @@ This Visual Studio Code(tm) extension adds support to open DLT (diagnostic log a

**Note:** It works well with [![Version](https://vsmarketplacebadge.apphb.com/version/mbehr1.fishbone.svg)](https://marketplace.visualstudio.com/items?itemName=mbehr1.fishbone) **fishbone** extension and provides a rest query and filter API that can be used for badges and "apply filter". (todo picture/animation...)

A more detailed documentation is available here: [Docs](https://mbehr1.github.io/dlt-logs/).
A more detailed documentation is available here: [Docs](https://mbehr1.github.io/dlt-logs/).

## Features

@@ -95,6 +95,7 @@ This extension contributes the following settings:
* `dlt-logs.fileExtensions`: Specifies the file extensions to use for file open dialog. Defaults to .dlt|.DLT.
* `dlt-logs.maxNumberLogs`: Specifies the maximum number of DLT logs that get displayed on one page. If more logs exist - considering the active filters - a paging mechanism is in place that starts rendering a new page at 4/5th of the page boundary. Searching is limited to the visible page. Defaults to 0.4 million logs. Depending on your machine's performance/RAM you might reduce/increase this. Ideally, find a limit/config where all logs fit into that range (use filters!).
* `dlt-logs.reReadTimeout`: Specifies the timeout in ms after opening the file before starting to parse the DLT file. If the file doesn't open, increase this to e.g. 5s.
* `dlt-logs.separatePayloadArgs`: Separate payload arguments with a space (' ') character. Default is false to not break existing filters.
Owner commented:

How about reversing the logic? The target should be the same output as dltviewer.
The problem is existing users that have filters on the payload. We could introduce the flag as "legacyPayloadDecoding", defaulting to false, and at startup check whether there are existing filters with payload and ask/inform the user to check their filters or enable the option?

Author commented:

In light of your later comments regarding memory consumption etc., this is probably the better solution.

One question remains: how do you recognize "legacy" filters? It would be complete guesswork, as the payload filter in particular is in the end only a string regex. What would be the identifier?
Off the top of my head: add a completely redundant version field to either the filter section or each filter. If the field is not available, show a message stating that there may be incompatible filters?

Owner commented:

Yep, as soon as there is any filter with a payload/payloadregex I'd show the message and ask the user. Filters with ecu/apid/ctid,... are not impacted.
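The startup check sketched in this thread could look roughly like the following. This is a hypothetical helper, not part of the PR; the field names `payload`/`payloadRegex` are assumptions taken from the discussion and the real dlt-logs filter schema may differ:

```typescript
// Hypothetical sketch of the startup check discussed above -- not the PR's implementation.
// Field names `payload` and `payloadRegex` are assumed from the thread.
interface FilterConfigLike {
    ecu?: string;
    apid?: string;
    ctid?: string;
    payload?: string;
    payloadRegex?: string;
}

// True if any configured filter matches on payload text and could therefore
// break when the argument separator behaviour changes.
function hasPayloadFilters(filters: FilterConfigLike[]): boolean {
    return filters.some((f) => f.payload !== undefined || f.payloadRegex !== undefined);
}
```

The extension could call this once at activation and, if it returns true, show a warning (e.g. via `vscode.window.showWarningMessage`) asking the user to review their filters or enable the legacy option.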

* `dlt-logs.columns`: Specifies which columns are visible. See example config. Usually this doesn't need to be changed manually; use the "select columns" button instead.
* `dlt-logs.filters`: Configures the filters that are available.
There are four types of filters:
@@ -129,7 +130,7 @@ This extension contributes the following settings:
For report generation filter can contain:
* **reportOptions**: object that can contain:
* **conversionFunction**: can be used to modify the captured values for that event. Needs to be a JS function that gets the regex 'matches' as parameter and returns an array of objects { valueName: value }. An additional parameter is "params", which is an object with msg, localObj and reportObj. TODO create wiki with full example. E.g. "return {'limit':42};" for a static value, or "return {'timeStamp': params.msg.timeStamp/10000};". 'localObj' is initially an empty Object {} that can be used to store properties for that filter (e.g. interim data for calculations). 'reportObj' is an Object similar to localObj but shared between all filters. So take care of name clashes here!
* **valueMap**: object that can contain keys matching to the captured data names and the property is an array with objects { capturedName : newName }.
* **valueMap**: object that can contain keys matching to the captured data names and the property is an array with objects { capturedName : newName }.
E.g."reportOptions": { "valueMap": { "STATE_onOff": [ { "1": "on" }, { "0": "off" }, {"ff": "invalid" }]}}
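As an illustration of the mapping rules described above, a minimal sketch of how such a `valueMap` entry could be applied to one captured value (illustrative only, not the extension's actual implementation):

```typescript
// Each capture name maps to an ordered list of { rawValue: mappedName } objects;
// the first entry containing the raw value wins.
type ValueMap = Record<string, Array<Record<string, string>>>;

function mapCapturedValue(valueMap: ValueMap, name: string, value: string): string {
    const mappings = valueMap[name];
    if (!mappings) { return value; } // no mapping configured for this capture name
    for (const entry of mappings) {
        if (value in entry) { return entry[value]; }
    }
    return value; // keep the raw value if nothing matches
}

// The README example from above:
const stateMap: ValueMap = { "STATE_onOff": [{ "1": "on" }, { "0": "off" }, { "ff": "invalid" }] };
```

With this map, a captured `STATE_onOff` value of `"1"` would be reported as `"on"`, `"ff"` as `"invalid"`, and unknown values pass through unchanged.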

Filter configuration changes and menu items *add filter...*, *edit filter...*, *delete filter...* actions will be applied instantly to the configuration/view.
@@ -141,8 +142,8 @@ This extension contributes the following settings:
* **allowSave**: can be used to disable the saving capability, e.g. if you're not interested in the files but still want to see any transfers. Reduces memory consumption.
* **keepFLDA**: if enabled the FLDA messages are visible in the logs (if no other filter removes them). Default is to not show the FLDA messages.
* **apid**: restrict searching for file transfer messages to this APID. Can be empty (as by spec). If you know the APID providing this speeds up processing.
* **ctid**: restrict searching for file transfer message to this CTID. Can be empty (as by spec).
* **ctid**: restrict searching for file transfer message to this CTID. Can be empty (as by spec).

* `dlt-logs.decorations`: Definition of the decoration types supported for marker filters.
* `dlt-logs.configs`: Definition of **Configs**. A config consists of a:
* **name**: Name of that config
5 changes: 5 additions & 0 deletions package.json
@@ -44,6 +44,11 @@
],
"description": "Specifies the file extensions that can be opened."
},
"dlt-logs.separatePayloadArgs": {
"type": "boolean",
"default": false,
"description": "Separate payload arguments with a space (' ') character. Default is false to not break existing filters."
},
"dlt-logs.reReadTimeout": {
"type": "integer",
"default": 1000,
33 changes: 21 additions & 12 deletions src/dltDocument.ts
@@ -96,9 +96,9 @@ export class DltDocument {

textEditors: Array<vscode.TextEditor> = []; // don't use in here!

/* allDecorations contain a list of all decorations for the filteredMsgs.
/* allDecorations contain a list of all decorations for the filteredMsgs.
* the ranges dont contain line numbers but the filteredMsg number.
* during renderLines the visible decorations will be created and stored in
* during renderLines the visible decorations will be created and stored in
* decorations (with updated ranges)
*/
private _allDecorations?: Map<vscode.TextEditorDecorationType, vscode.DecorationOptions[]>;
@@ -112,7 +112,7 @@ export class DltDocument {
lastSelectedTimeEv: Date | undefined; // the last received time event that might have been used to reveal our line. used for adjustTime on last event feature.
gotTimeSyncEvents: boolean = false; // we've been at least once to sync time based on timeSync events

get timeAdjustMs(): number { return this._timeAdjustMs; } // read only. use adustTime to change
get timeAdjustMs(): number { return this._timeAdjustMs; } // read only. use adjustTime to change
ChrisRBe marked this conversation as resolved.

private _realStat: fs.Stats;

@@ -162,11 +162,11 @@ export class DltDocument {
this.treeNode.children.forEach((child) => { child.parent = this.treeNode; });
parentTreeNode.push(this.treeNode);

// load filters:
// load filters:
this.onDidChangeConfigFilters();


{ // load decorations:
{ // load decorations:
const decorationsObjs = vscode.workspace.getConfiguration().get<Array<object>>("dlt-logs.decorations");
this.parseDecorationsConfigs(decorationsObjs);
}
@@ -253,7 +253,7 @@ export class DltDocument {
private debouncedApplyFilterTimeout: NodeJS.Timeout | undefined;
/**
* Trigger applyFilter and show progress
* This is debounced/delayed a bit (500ms) to avoid too frequent
* This is debounced/delayed a bit (500ms) to avoid too frequent
* apply filter operation that is longlasting.
*/
triggerApplyFilter() {
@@ -568,7 +568,7 @@ export class DltDocument {

console.log(`autoEnableConfigs enabled ${enabled} configs.`);
if (enabled > 0) {
// we don't need this as applyFilter will be called anyhow (might better add a parameter)
// we don't need this as applyFilter will be called anyhow (might better add a parameter)
// this.onFilterChange(undefined);
}
this._didAutoEnableConfigs = true;
@@ -842,6 +842,14 @@ export class DltDocument {
return matchingMsgs;
}

/**
* Return a flag signalling if payload arguments should be separated by a space character ' ' or not.
* @returns flag for payload argument separator
*/
static getSepPayloadArgs() {
const separatePayloadArgsConf = vscode.workspace.getConfiguration().get<boolean>('dlt-logs.separatePayloadArgs');
return separatePayloadArgsConf ? separatePayloadArgsConf : false; // do not add ' ' between payload args as default
}
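Outside of vscode, the effect of this flag on the rendered payload can be sketched with a plain join. This is a simplified model under the assumption that the real code appends the separator per decoded argument; it is not the PR's actual code:

```typescript
// Simplified model of payload rendering: the new option only switches the
// separator between decoded arguments from "" to " " (mirroring `sepPayArgs ? " " : ""`).
function renderPayload(args: string[], separatePayloadArgs: boolean): string {
    const sep = separatePayloadArgs ? " " : "";
    return args.join(sep);
}
```

With the option off (the default), `["temp:", "42"]` renders as `temp:42`, keeping existing payload filters working; with it on, it renders as `temp: 42`, matching DLT-Viewer's output.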

private _applyFilterRunning: boolean = false;
async applyFilter(progress: vscode.Progress<{ increment?: number | undefined, message?: string | undefined, }> | undefined, applyEventFilter: boolean = false) {
@@ -1060,7 +1068,7 @@
}

lineCloseTo(index: number, ignoreSkip = false): number {
// provides the line number "close" to the index
// provides the line number "close" to the index
// todo this causes problems once we do sort msgs (e.g. by timestamp)
// that is the matching line or the next higher one
// todo use binary search
@@ -1075,7 +1083,7 @@
}
if (!ignoreSkip && i > this._skipMsgs + this._maxNrMsgs) {
console.log(`lineCloseTo(${index} not in range (>). todo needs to trigger reload.)`);
return this.staticLinesAbove.length + this._maxNrMsgs; // go to first line
return this.staticLinesAbove.length + this._maxNrMsgs; // go to first line
}
return i + (ignoreSkip ? 0 : this.staticLinesAbove.length);
}
@@ -1552,7 +1560,7 @@
// need to remove current text in the editor and insert new one.
// otherwise the editor tries to identify the changes. that
// lasts long on big files...
// tried using editor.edit(replace or remove/insert) but that leads to a
// tried using editor.edit(replace or remove/insert) but that leads to a
// doc marked with changes and then FileChange event gets ignored...
// so we add empty text interims wise:
this._text = "...revealing new range...";
@@ -1612,6 +1620,7 @@
async (progress) => {
// do we have any filters to apply at load time?
const [posFilters, negFilters, decFilters, eventFilters, negBeforePosFilters] = DltDocument.getFilter(this.allFilters, true, true);
const sepPayloadArgs = DltDocument.getSepPayloadArgs();
console.log(` have ${posFilters.length} pos. and ${negFilters.length} neg. filters at load time.`);

let data = Buffer.allocUnsafe(chunkSize);
@@ -1621,7 +1630,7 @@
if (read) {
const copiedBuf = Buffer.from(data.slice(0, read)); // have to create a copy of Buffer here!
// parse data:
const parseInfo = DltDocument.dltP.parseDltFromBuffer(copiedBuf, 0, this.msgs, posFilters, negFilters, negBeforePosFilters);
const parseInfo = DltDocument.dltP.parseDltFromBuffer(copiedBuf, 0, this.msgs, sepPayloadArgs, posFilters, negFilters, negBeforePosFilters);
if (parseInfo[0] > 0) {
console.log(`checkFileChanges skipped ${parseInfo[0]} bytes.`);
}
@@ -1773,7 +1782,7 @@
}

/**
*
*
* @param context ExtensionContext (needed for report generation -> access to settings,...)
* @param cmd get|patch|delete
* @param paths docs/<id>/filters[...]
8 changes: 5 additions & 3 deletions src/dltExport.ts
@@ -58,7 +58,8 @@ const getFirstMsg = (fileUri: vscode.Uri) => {
let data = Buffer.allocUnsafe(1 * 1024 * 1024); // we do only scan first MB
let read = fs.readSync(fd, data, 0, data.byteLength, 0);
const msgs: DltMsg[] = [];
const parseInfo = DltDocument.dltP.parseDltFromBuffer(data.slice(0, read), 0, msgs, [], [], []);
const sepPayloadArgs = DltDocument.getSepPayloadArgs();
const parseInfo = DltDocument.dltP.parseDltFromBuffer(data.slice(0, read), 0, msgs, sepPayloadArgs, [], [], []);
if (msgs.length) { return msgs[0]; } else { return undefined; };
};

@@ -423,7 +424,7 @@ async function doExport(exportOptions: ExportDltOptions) {
});
console.log(`sorted ${minMsgInfos.length} msgs`);
progress.report({ message: `sorted ${minMsgInfos.length} messages` });
await util.sleep(10); // 10ms each 100ms
await util.sleep(10); // 10ms each 100ms
}

// pass 2:
@@ -558,6 +559,7 @@ const pass1ReadUri = async (
let msgs: DltMsg[] = [];
// determine current filters:
const [posFilters, negFilters, decFilters, eventFilters, negBeforePosFilters] = DltDocument.getFilter(allFilters, true, true);
const sepPayloadArgs = DltDocument.getSepPayloadArgs();
progress?.report({ message: `pass 1: processing file '${basename(fileUri.fsPath)}'` });
let index = 0;
let lastIncrement = 0;
@@ -568,7 +570,7 @@
// parse data:
const msgOffsets: number[] = [];
const msgLengths: number[] = [];
const parseInfo = DltDocument.dltP.parseDltFromBuffer(data.slice(0, read), 0, msgs, posFilters, negFilters, negBeforePosFilters, msgOffsets, msgLengths);
const parseInfo = DltDocument.dltP.parseDltFromBuffer(data.slice(0, read), 0, msgs, sepPayloadArgs, posFilters, negFilters, negBeforePosFilters, msgOffsets, msgLengths);
if (parseInfo[0] > 0) {
}
if (parseInfo[1] > 0) {
3 changes: 2 additions & 1 deletion src/dltLoadTimeAssistant.ts
@@ -17,6 +17,7 @@ export async function loadTimeFilterAssistant(fileUri: vscode.Uri, allFilters: D

// determine current filters:
const [posFilters, negFilters, decFilters, eventFilters, negBeforePosFilters] = DltDocument.getFilter(allFilters, true, true);
const sepPayloadArgs = DltDocument.getSepPayloadArgs();

let apids: PickItem[] = [];
let apidCntMap: Map<string, number> = new Map(); // for each apid the number of messages as we do want to sort them
@@ -40,7 +41,7 @@
if (read) {
//const copiedBuf = Buffer.from(data.slice(0, read)); // have to create a copy of Buffer here! not necessary to access apid
// parse data:
const parseInfo = DltDocument.dltP.parseDltFromBuffer(data.slice(0, read), 0, msgs, posFilters, negFilters, negBeforePosFilters);
const parseInfo = DltDocument.dltP.parseDltFromBuffer(data.slice(0, read), 0, msgs, sepPayloadArgs, posFilters, negFilters, negBeforePosFilters);
if (parseInfo[0] > 0) {
}
if (parseInfo[1] > 0) {
16 changes: 10 additions & 6 deletions src/dltParser.ts
@@ -38,7 +38,7 @@ export const serviceIds: string[] = ["", "set_log_level", "set_trace_status", "g
// not covered:
/*
#define DLT_SERVICE_ID_UNREGISTER_CONTEXT 0xf01 < Service ID: Message unregister context
#define DLT_SERVICE_ID_CONNECTION_INFO 0xf02 < Service ID: Message connection info
#define DLT_SERVICE_ID_CONNECTION_INFO 0xf02 < Service ID: Message connection info
#define DLT_SERVICE_ID_TIMEZONE 0xf03 < Service ID: Timezone
#define DLT_SERVICE_ID_MARKER 0xf04 < Service ID: Timezone
#define DLT_SERVICE_ID_CALLSW_CINJECTION 0xFFF < Service ID: Message Injection (minimal ID)
@@ -93,6 +93,7 @@ export class DltMsg {
readonly timeAsNumber: number; // time in ms. Date uses more memory!
get timeAsDate(): Date { return new Date(this.timeAsNumber); }

readonly payloadArgSeparator: string;
ChrisRBe marked this conversation as resolved.
Author (@ChrisRBe) commented on Jan 9, 2021:

Suggested change:
-    readonly payloadArgSeparator: string;
+    static payloadArgSeparator = vscode.workspace.getConfiguration().get<boolean>('dlt-logs.legacyPayloadDecoding') ? "" : " "; // default behaviour of legacyPayloadDecoding is false, add a space character in that case.

Copy link
Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

could this work?

// parsed from data:
readonly mcnt: number;
private _htyp: number;
@@ -114,7 +115,7 @@
lifecycle: DltLifecycleInfo | undefined = undefined;
decorations: Array<[vscode.TextEditorDecorationType, Array<vscode.DecorationOptions>]> = [];

constructor(storageHeaderEcu: string, stdHdr: any, index: number, timeAsNumber: number, data: Buffer) {
constructor(storageHeaderEcu: string, stdHdr: any, index: number, timeAsNumber: number, data: Buffer, sepPayArgs?: boolean) {
this.index = index;
this.timeAsNumber = timeAsNumber;

@@ -173,6 +174,8 @@
}
this._eac = getEACFromIdx(getIdxFromEAC(eac)) || eac;

this.payloadArgSeparator = sepPayArgs ? " ": "";

const payloadOffset = DLT_STORAGE_HEADER_SIZE + stdHeaderSize + (useExtHeader ? DLT_EXT_HEADER_SIZE : 0);
this._payloadData = data.slice(payloadOffset);
assert.equal(this._payloadData.byteLength, stdHdr["len"] - (payloadOffset - DLT_STORAGE_HEADER_SIZE));
@@ -292,7 +295,7 @@
argOffset += 2;
const rawd = this._payloadData.slice(argOffset, argOffset + lenRaw);
this._payloadArgs.push({ type: Buffer, v: Buffer.from(rawd) }); // we make a copy here to avoid referencing the payloadData that we want to release/gc afterwards
this._payloadText += rawd.toString("hex");
this._payloadText += rawd.toString("hex") + this.payloadArgSeparator;
Owner commented:

I think it's not the only place where the space is missing?
And: does dltviewer add a space at the end?

Author commented:

Hard to tell.
Do you have an easy-to-use DLT example that uses all the different DLT_LOG functions to write messages? Plus, I'm not really that good at reading the dltviewer code to answer that clearly.

What I would say: the current implementation, i.e. simply adding a space at this position, did not seem to add additional spaces at the end in vscode.

Owner commented:

Sadly I lack good examples; I could write a small test app that creates those.
I might have some code to remove whitespace at the end ;-) (will take a look).
I can check the dltviewer code and try to understand the rules implemented there.

Author commented:

Suggested change:
-    this._payloadText += rawd.toString("hex") + this.payloadArgSeparator;
+    this._payloadText += rawd.toString("hex") + DltMsg.payloadArgSeparator;

Author commented:

as the consequence?
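The trailing-space concern raised in this thread (a per-argument `+= arg + separator` pattern leaves a separator after the last argument) could be addressed by trimming afterwards. A sketch of that variant, offered as an assumption rather than what the PR implements:

```typescript
// Hypothetical post-processing step: drop the single trailing separator that a
// per-argument "text += arg + separator" loop leaves at the end of the payload.
function trimTrailingSeparator(payloadText: string, separator: string): string {
    if (separator.length > 0 && payloadText.endsWith(separator)) {
        return payloadText.slice(0, payloadText.length - separator.length);
    }
    return payloadText;
}
```

Appending the separator only *between* arguments (e.g. via `join`) would avoid the trailing character in the first place; trimming keeps the existing per-argument append loop untouched.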

argOffset += lenRaw;
} else { break; } // todo VARI, FIXP, TRAI, STRU
}
@@ -340,7 +343,7 @@
case 4:
case 5:
case 6:
case 7: // info about reg. appids and ctids with log level and trace status info and all text. descr.
case 7: // info about reg. appids and ctids with log level and trace status info and all text. descr.
{ // todo could use few binaryparser here...
const hasLogLevel: boolean = (respCode === 4) || (respCode === 6) || (respCode === 7);
const hasTraceStatus: boolean = (respCode === 5) || (respCode === 6) || (respCode === 7);
@@ -571,11 +574,12 @@ function dltWriteExtHeader(buffer: Buffer, offset: number, extH: DltExtHeader) {

export class DltParser {

parseDltFromBuffer(buf: Buffer, startOffset: number, msgs: Array<DltMsg>, posFilters?: DltFilter[], negFilters?: DltFilter[], negBeforePosFilters?: DltFilter[], msgOffsets?: number[], msgLengths?: number[]) { // todo make async
parseDltFromBuffer(buf: Buffer, startOffset: number, msgs: Array<DltMsg>, sepPayArgs: boolean, posFilters?: DltFilter[], negFilters?: DltFilter[], negBeforePosFilters?: DltFilter[], msgOffsets?: number[], msgLengths?: number[]) { // todo make async
let skipped: number = 0;
let remaining: number = buf.byteLength - startOffset;
let nrMsgs: number = 0; let offset = startOffset;
const startIndex: number = msgs.length ? (msgs[msgs.length - 1].index + 1) : 0; // our first index to use is either prev one +1 or 0 as start value
const separatePayloadArgs: boolean = sepPayArgs ? sepPayArgs : false;
while (remaining >= MIN_DLT_MSG_SIZE) {
const storageHeader = dltParseStorageHeader(buf, offset);
if (storageHeader.pattern === DLT_STORAGE_HEADER_PATTERN) {
@@ -591,7 +595,7 @@

if (len >= MIN_STD_HEADER_SIZE) {
try {
const newMsg = new DltMsg(storageHeader.ecu, stdHeader, startIndex + nrMsgs, timeAsNumber, buf.slice(msgOffset, offset));
const newMsg = new DltMsg(storageHeader.ecu, stdHeader, startIndex + nrMsgs, timeAsNumber, buf.slice(msgOffset, offset), separatePayloadArgs);
// do we need to filter this one?
let keepAfterNegBeforePosFilters: boolean = true;
if (negBeforePosFilters?.length) {