Integrate pigweed tokenizer logging (#15039)
* Integrate pigweed tokenizer for logging

* add pw_tokenizer build changes.
* add build changes in nxp k32w lighting-app.
* update nxp k32w lighting-app README.
* change logging functions to use the tokenizer.
* guard tokenizer changes with define.
* split logs where too many arguments.
* create build test for k32w lighting-app with tokenizer.
* update .wordlist.txt and remove duplicates.

Signed-off-by: Andrei Menzopol <[email protected]>

* Fix misspell docs/VSCODE_DEVELOPMENT.md.

Signed-off-by: Andrei Menzopol <[email protected]>
andrei-menzopol authored and pull[bot] committed Apr 4, 2022
1 parent fa47d9d commit 3560708
Showing 19 changed files with 371 additions and 40 deletions.
25 changes: 9 additions & 16 deletions .github/.wordlist.txt
@@ -334,6 +334,8 @@ depottools
deps
desc
descheduled
detokenization
detokenizer
dev
devcontainer
devCtrl
@@ -696,6 +698,7 @@ LinkSoftwareAndDocumentationPack
LinuxOTAImageProcessor
LinuxOTARequestorDriver
LocalConfigDisabled
localedef
localhost
LocalizationConfiguration
localstatedir
@@ -802,6 +805,7 @@ MX
mydir
MyPASSWORD
MySSID
nameserver
namespacing
nano
natively
@@ -822,6 +826,7 @@ NitrogenDioxideConcentrationMeasurement
nl
NLUnitTest
NLUnitTests
nmcli
noc
NodeId
nongnu
@@ -1031,6 +1036,7 @@ RGB
riscv
rloc
rmw
rodata
Rollershade
rootfs
RPC
@@ -1174,6 +1180,7 @@ TestCluster
TestConstraints
TestEmptyString
TestGenExample
TestGroupDemoConfig
TestMultiRead
TESTPASSWD
TestPICS
@@ -1207,6 +1214,7 @@ tngvndl
TODO
toJson
tokenized
tokenizer
toolchain
toolchains
topologies
@@ -1268,6 +1276,7 @@ USERINTERFACE
UserLabel
usermod
usr
UTF
util
utils
UUID
@@ -1373,19 +1382,3 @@ zephyrproject
Zigbee
zigbeealliance
zigbeethread
libshell
TestGroupDemoConfig
ACLs
AddNOC
CHIPConfig
CHIPProjectAppConfig
CaseAdminNode
DataVersion
ProxyView
ReadAttribute
WriteAttribute
kAdminister
kManage
kOperate
kView
xFFFFFFFD
1 change: 1 addition & 0 deletions .github/workflows/examples-k32w.yaml
@@ -71,6 +71,7 @@ jobs:
scripts/run_in_build_env.sh "\
./scripts/build/build_examples.py \
--target k32w-light-release \
--target k32w-light-tokenizer-release \
--target k32w-lock-low-power-release \
--target k32w-shell-release \
build \
18 changes: 18 additions & 0 deletions examples/lighting-app/nxp/k32w/k32w0/BUILD.gn
@@ -20,8 +20,14 @@ import("${k32w0_sdk_build_root}/k32w0_executable.gni")
import("${k32w0_sdk_build_root}/k32w0_sdk.gni")

import("${chip_root}/src/crypto/crypto.gni")
import("${chip_root}/src/lib/core/core.gni")
import("${chip_root}/src/platform/device.gni")

if (chip_pw_tokenizer_logging) {
import("//build_overrides/pigweed.gni")
import("$dir_pw_tokenizer/database.gni")
}

assert(current_os == "freertos")

k32w0_platform_dir = "${chip_root}/examples/platform/nxp/k32w/k32w0"
@@ -105,8 +111,20 @@ k32w0_executable("light_app") {
output_dir = root_out_dir
}

if (chip_pw_tokenizer_logging) {
pw_tokenizer_database("light_app.database") {
database = "$root_build_dir/chip-k32w061-light-example-database.bin"
create = "binary"
deps = [ ":light_app" ]
optional_paths = [ "$root_build_dir/chip-k32w061-light-example" ]
}
}

group("k32w0") {
deps = [ ":light_app" ]
if (chip_pw_tokenizer_logging) {
deps += [ ":light_app.database" ]
}
}

group("default") {
71 changes: 71 additions & 0 deletions examples/lighting-app/nxp/k32w/k32w0/README.md
@@ -24,6 +24,10 @@ network.
- [Building](#building)
- [Flashing and debugging](#flashdebug)
- [Testing the example](#testing-the-example)
- [Pigweed Tokenizer](#tokenizer)
- [Detokenizer script](#detokenizer)
- [Notes](#detokenizer-notes)
- [Known issues](#detokenizer-known-issues)

<hr>

@@ -228,3 +232,70 @@ The app can be deployed against any generic OpenThread Border Router. See the
guide
[Commissioning NXP K32W using Android CHIPTool](../../../docs/guides/nxp_k32w_android_commissioning.md)
for step-by-step instructions.

<a name="tokenizer"></a>

## Pigweed tokenizer

The tokenizer is a Pigweed module that replaces log strings with hashed tokens.
This greatly reduces the flash space needed for logs. The module can be enabled
by building with the gn argument _chip_pw_tokenizer_logging=true_. The
detokenizer script is needed to parse the tokenized logs.
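To illustrate the idea, here is a toy sketch in Python (this is NOT
pw_tokenizer's real hash or wire format; it only shows why tokenizing saves
flash: the device emits a small fixed-size token instead of the full format
string):

```python
import zlib

# Toy token database. In the real flow the device never stores the strings;
# the database is generated from the ELF at build time.
TOKEN_DB = {}

def tokenize(fmt: str) -> int:
    token = zlib.crc32(fmt.encode("utf-8"))  # any stable 32-bit hash works here
    TOKEN_DB[token] = fmt
    return token

def emit_log(fmt: str, *args) -> str:
    # Device side: emit 8 hex characters plus raw arguments, not the whole string.
    encoded = f"{tokenize(fmt):08x}"
    if args:
        encoded += "," + ",".join(str(a) for a in args)
    return encoded

def detokenize(encoded: str) -> str:
    # Host side: look the token up and re-apply the format arguments.
    token_hex, _, argstr = encoded.partition(",")
    fmt = TOKEN_DB[int(token_hex, 16)]
    args = tuple(int(a) if a.lstrip("-").isdigit() else a
                 for a in argstr.split(",")) if argstr else ()
    return fmt % args
```

The long format string lives only in the host-side database; the device binary
and its log traffic carry only the 32-bit token.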

<a name="detokenizer"></a>

### Detokenizer script

The python3 script detokenizer.py decodes tokenized logs either from a file or
from a serial port. The script can be used in the following ways:

```
usage: detokenizer.py serial [-h] -i INPUT -d DATABASE [-o OUTPUT]
usage: detokenizer.py file [-h] -i INPUT -d DATABASE -o OUTPUT
```

The first parameter is either _serial_ or _file_ and selects between decoding
from a serial port or from a file.

The second parameter is _-i INPUT_ and it must be set to the path of the file
or the serial port to decode from.

The third parameter is _-d DATABASE_ and represents the path to the token
database to be used for decoding. The default path is
_out/debug/chip-k32w061-light-example-database.bin_ after a successful build.

The fourth parameter is _-o OUTPUT_ and it represents the path to the output
file where the decoded logs will be stored. This parameter is required for file
usage and optional for serial usage. If not provided when used with a serial
port, the decoded logs are only printed to stdout and not saved to a file.
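For reference, invocations might look like the following (the serial device
name _/dev/ttyUSB0_ and the file names are placeholders; the database path is
the default from a successful build):

```
python3 detokenizer.py serial -i /dev/ttyUSB0 -d out/debug/chip-k32w061-light-example-database.bin -o logs.txt
python3 detokenizer.py file -i capture.txt -d out/debug/chip-k32w061-light-example-database.bin -o decoded.txt
```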

<a name="detokenizer-notes"></a>

### Notes

The token database is created automatically after building the binary if the
argument _chip_pw_tokenizer_logging=true_ was used.

The detokenizer script must be run from the example's folder after a successful
run of the _scripts/activate.sh_ script, which makes the pw_tokenizer module
used by the script available in the environment.

<a name="detokenizer-known-issues"></a>

### Known issues

The build process will not update the token database if it already exists. If
new strings are added while the database already exists in the output folder,
delete the database so that it is recreated at the next build.

Not all tokens will be decoded. This is due to a gcc/pw_tokenizer issue:
pw_tokenizer uses attributes to place the tokens and strings in special ELF
sections, and these sections are read by the database creation script. For
templated C++ functions, however, gcc ignores these attributes and places the
strings in the default .rodata section, so the database creation script cannot
find them in the specially created sections.

If the script is run with the serial option, closed, and then rerun on the same
serial port, it will get stuck and show no logs. The workaround is to unplug
and replug the board and then rerun the script.
148 changes: 148 additions & 0 deletions examples/lighting-app/nxp/k32w/k32w0/detokenizer.py
@@ -0,0 +1,148 @@
import argparse
import os
import sys

import pw_tokenizer
import serial


def parse_args():
    """Parse input arguments.
    Return:
        parsed arguments structure
    """

    parser = argparse.ArgumentParser()
    subparsers = parser.add_subparsers(dest='type')
    subparsers.required = True

    parser_file = subparsers.add_parser('file')
    parser_file.add_argument(
        "-i", "--input", help="Input file name.", required=True)
    parser_file.add_argument(
        "-d", "--database", help="Token database.", required=True)
    parser_file.add_argument(
        "-o", "--output", help="Output file name.", required=True)

    parser_serial = subparsers.add_parser('serial')
    parser_serial.add_argument(
        "-i", "--input", help="Input serial port name.", required=True)
    parser_serial.add_argument(
        "-d", "--database", help="Token database.", required=True)
    parser_serial.add_argument(
        "-o", "--output", help="Output file name. Write to stdout and to file.")

    return parser.parse_args()


def decode_string(tstr, detok):
    """Decodes a single tokenized string.
    Args:
        tstr - encoded input string
        detok - detokenizer
    Return:
        decoded string or None
    """
    try:
        t = bytes.fromhex(tstr)
        s = str(detok.detokenize(t))

        # detokenize returns a '$...'-prefixed string for unknown tokens
        if s.find('$') == 0:
            return None
        return s
    except Exception:
        return None


def decode_serial(serialport, outfile, database):
    """Decodes logs from a serial port.
    Args:
        serialport - name of the serial port to read from
        outfile - path to output file, or None for stdout only
        database - path to token database
    """

    detokenizer = pw_tokenizer.Detokenizer(database)
    port = serial.Serial(serialport, 115200, timeout=None)

    output = None
    if outfile:
        output = open(outfile, 'w')

    if port:
        try:
            while True:
                if port.in_waiting > 0:
                    # read line from serial port and ascii decode
                    line = port.readline().decode('ascii').strip()
                    # find token start and detokenize
                    idx = line.rfind(']')
                    dstr = decode_string(line[idx + 1:], detokenizer)
                    if dstr:
                        line = line[:idx + 1] + dstr
                    print(line, file=sys.stdout)
                    if output:
                        print(line, file=output)
        except Exception:
            print("Serial error or program closed", file=sys.stderr)

        port.close()
        if output:
            output.close()
    else:
        print("Invalid or closed serial port.", file=sys.stderr)


def decode_file(infile, outfile, database):
    """Decodes logs from an input file.
    Args:
        infile - path to input file
        outfile - path to output file
        database - path to token database
    """

    if os.path.isfile(infile):
        detokenizer = pw_tokenizer.Detokenizer(database)
        output = open(outfile, 'w')

        with open(infile, 'rb') as file:
            for line in file:
                try:
                    # ascii decode line; serial terminals may
                    # include non-ascii characters
                    line = line.decode('ascii').strip()
                except UnicodeDecodeError:
                    continue
                # find token start and detokenize
                idx = line.rfind(']')
                dstr = decode_string(line[idx + 1:], detokenizer)
                if dstr:
                    line = line[:idx + 1] + dstr
                print(line, file=output)
        output.close()
    else:
        print("File does not exist or is not a file.", file=sys.stderr)


def detokenize_input():
    args = parse_args()

    if args.type == 'file':
        decode_file(args.input, args.output, args.database)
    elif args.type == 'serial':
        decode_serial(args.input, args.output, args.database)


if __name__ == '__main__':
    detokenize_input()
    sys.exit(0)
2 changes: 1 addition & 1 deletion examples/lighting-app/nxp/k32w/k32w0/main/AppTask.cpp
@@ -416,7 +416,7 @@ void AppTask::ResetActionEventHandler(AppEvent * aEvent)
return;
}

K32W_LOG("Factory Reset Triggered. Push the RESET button within %u ms to cancel!", resetTimeout);
K32W_LOG("Factory Reset Triggered. Push the RESET button within %lu ms to cancel!", resetTimeout);
sAppTask.mFunction = kFunction_FactoryReset;

/* LEDs will start blinking to signal that a Factory Reset was scheduled */
@@ -45,7 +45,11 @@
#define SWU_INTERVAl_WINDOW_MAX_MS (24 * 60 * 60 * 1000) // 24 hours

#if K32W_LOG_ENABLED
#if CHIP_PW_TOKENIZER_LOGGING
#define K32W_LOG(MSG, ...) ChipLogDetail(Echo, MSG, __VA_ARGS__);
#else
#define K32W_LOG(...) otPlatLog(OT_LOG_LEVEL_NONE, OT_LOG_REGION_API, ##__VA_ARGS__);
#endif
#else
#define K32W_LOG(...)
#endif
1 change: 1 addition & 0 deletions scripts/build/build/targets.py
@@ -368,6 +368,7 @@ def K32WTargets():
yield target.Extend('light', app=K32WApp.LIGHT).GlobBlacklist("Debug builds broken due to LWIP_DEBUG redefinition")

yield target.Extend('light-release', app=K32WApp.LIGHT, release=True)
yield target.Extend('light-tokenizer-release', app=K32WApp.LIGHT, tokenizer=True, release=True).GlobBlacklist("Only on demand build")
yield target.Extend('shell-release', app=K32WApp.SHELL, release=True)
yield target.Extend('lock-release', app=K32WApp.LOCK, release=True)
yield target.Extend('lock-low-power-release', app=K32WApp.LOCK, low_power=True, release=True).GlobBlacklist("Only on demand build")
