
RFC: Add static code analyzer integration #51212

Closed

Conversation

romkell
Contributor

@romkell romkell commented Oct 12, 2022

Brief

This is an unfinished draft implementation of a static code analyzer (SCA) integration into Zephyr's cmake build system. It is not a full integration but rather an interface:

  • sca_export.json
  • sca_import.json
  • sca configuration update script hooks

Mechanism and architecture

The idea is to replace the actual toolchain in the CMAKE_C_COMPILER etc. variables with the SCA executables at a late configuration stage of the build system (via the import json file). The actual toolchain information is exported (export json file) and a hook script is started, allowing the SCA to be configured accordingly.
Multiple SCAs can be added to the import json file. Their configuration is referenced via SCA_VARIANT=<sca-variant-name>.
The hook scripts are organized in a <sca-integration>/hooks/<sca-variant> subfolder.

By using json files for the import and export format, the mechanism can easily be extended.
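For illustration only (the json key names and variable handling below are an assumption for this sketch, not the final interface), the core of the replacement step could look roughly like this:

# Sketch only: key names and SCA_VARIANT handling are illustrative.
file(READ ${SCA_INTEGRATION_DIR}/sca_import.json sca_import)
string(JSON sca_c_compiler GET "${sca_import}" ${SCA_VARIANT} C_COMPILER)

# Export the originally selected toolchain so the hook script can
# update the SCA configuration ...
file(WRITE ${SCA_INTEGRATION_DIR}/sca_export.json
  "{ \"C_COMPILER\": \"${CMAKE_C_COMPILER}\" }")
execute_process(COMMAND ${SCA_UPDATE_SCRIPT} ${SCA_INTEGRATION_DIR}/sca_export.json)

# ... then let the SCA executable take the compiler's place.
set(CMAKE_C_COMPILER ${sca_c_compiler})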

Integration into build system

There is one main cmake script handling the SCA integration:

  • sca_integration.cmake.
    This script needs to be hooked into the build system at the right position.

Configuration folders

There are two equally structured folders, which are considered by the script:

  • In tree: $ZEPHYR_BASE/scripts/sca_integration
  • Out of tree: $SCA_INTEGRATION_DIR (e.g. $MY_REPO_BASE/scripts/sca_integration)
  • The out of tree configuration for the same SCA (-DSCA_VARIANT) will overrule the in tree configuration

Open points / questions

  • Where is the right spot to integrate such a script?
  • Why are most CMAKE_<xyz>_COMPILER variables empty (except CMAKE_C_COMPILER) at the location where the cmake file is currently called?
  • Write a doc

Call examples

Even though not really different, here are two examples, one in tree and one out of tree:

In tree

west build -p always -b native_posix ./tests/crypto/rand32/ -- -DSCA_VARIANT=cpptest

Out of tree

west build -p always -b native_posix ./tests/middleware/sdp/bin_selector/ -- -DSCA_VARIANT=Axivion

To make it even more convenient, a west option --sca / -s or similar could be implemented, avoiding the -- -DSCA_VARIANT=<sca-variant-name>.

Back porting

The implementation is intended to be back-ported to 2.7-auditable-branch.

Existing integrations

The suggestion would be to integrate any analyzer tool where the actual toolchain/compiler needs to be replaced by the analyzer in the build system, which then calls the toolchain/compiler before starting its own analysis.

The mechanism would also work for analyzers which intercept the call to the toolchain/compiler at an OS level (as virus scanners do), telling those analyzers which executable to intercept. In that case the import file section for such an analyzer remains basically empty (C_COMPILER, CXX_COMPILER etc. will be missing in the import json).

I would assume that sparse (#43776) could also be integrated with that mechanism.

Further information

Commit comment

Add a generic way to integrate any static code analyzer into the cmake build script.
- Writing the selected toolchain paths to the sca_export.json
- Reading the static code analyzer toolchain paths from a sca_import.json and replacing the CMAKE_C_COMPILER settings with the values read from the json
- Executing static code analyzer customized scripts updating the analyzer's configuration (with the information in sca_export.json)
- Support for an in tree and an out of tree location, allowing analyzers to be integrated as part of Zephyr itself and as part of a customer environment using Zephyr as its base.

Signed-off-by: Roman Kellner <[email protected]>

@carlescufi
Member

As mentioned in the TSC, is there a reason not to use:
https://cmake.org/cmake/help/latest/prop_tgt/LANG_CPPCHECK.html
?

@romkell
Contributor Author

romkell commented Oct 28, 2022

As mentioned in the TSC, is there a reason not to use: https://cmake.org/cmake/help/latest/prop_tgt/LANG_CPPCHECK.html ?

@carlescufi I feel there are several arguments against it:

  • https://cmake.org/cmake/help/latest/manual/cmake-properties.7.html (searched for <LANG>_CPPCHECK) is/was intended for the cppcheck tool (as the others there, such as cpplint, clang-tidy etc.)
    • It could possibly be (mis-)used to run another SCA tool via the CMAKE_<LANG>_CPPCHECK variable, but:
  • In my opinion the approach should be generic, not tool centric. There are lots of SCA tools out there: Bugseng eclair, Axivion axivion suite, parasoft cpptest, Perforce Klocwork, Vector (Gimpel) pclint plus etc.
  • The different SCA tools use different methods of integration and differ in the extent of the analysis they do (not only Misra or CERT; some do analysis on the binary after linking (do they need linker option information?)):
    • No build system integration required: C/C++ single file based checks - just use the compile_commands.json. Those typically do not take the linking step into account, i.e. do no analysis on the linked output (e.g. cppcheck, pc-lint plus, axivion cafecc). The cmake CMAKE_<LANG>_CPPCHECK option works for such tools.
    • Build system integration: the SCA tool is called instead of the compiler/linker and itself then calls the compiler/linker. Therefore the SCA tool needs to know the compiler/linker executable. Even cppcheck has a wrapper for that (https://github.com/csutils/cscppc), e.g. Axivion ircc, presumably Parasoft cpptest. The cmake CMAKE_<LANG>_CPPCHECK option will not work for such tools.
    • Compiler/linker call interception (no build system integration): the SCA tool needs to know the compiler/linker executable and intercepts the call to the executable on an OS level (I guess it is the same mechanism virus scanners use). The build system does not need to be modified (patching of CMAKE_<LANG>_COMPILER is not required), but the selected compiler/linker executable needs to be made available to the SCA tool (Bugseng eclair). The cmake CMAKE_<LANG>_CPPCHECK option will not work for OS intercepting SCA tools.
  • I consider cppcheck a good open source Misra C/C++ analyzer to get 90% of issues fixed, and it would be a good option as a free-of-charge SCA for the community (clang static analyzer being another one). One or both could be pre-configured as an in-tree solution with the Zephyr OS coding guidelines, to be able to work locally offline for Zephyr OS development. The last 10% can then be done using the CI-integrated Zephyr OS certified SCA tool (parasoft cpptest, or ...). Not everyone can afford or will buy a license for cpptest, Klocwork or whatever Zephyr OS chooses to use.
  • Neither cppcheck nor clang have a certification (IEC61508, etc.)
  • The project might use an in-tree configuration with their SCA tool(s) and its configuration. But users of Zephyr OS might already have bought another SCA tool than the certified one the Zephyr OS project wants to use. No company will buy a second 20 to 50k tool just to be able to work with Zephyr OS because it is the only option.
  • The user might even apply other coding guidelines to their proprietary out-of-tree code, and hence need another configuration.
  • I would rather prefer reading configurations and command line parameters from a file (json being supported by cmake) instead of from an increasing number of env variables.

From https://gitlab.inria.fr/sed-bso/heat/-/blob/master/CMakeLists.txt:

# static analysis during build
find_program(CPPCHECK "cppcheck")
if (CPPCHECK)
  set(CMAKE_C_CPPCHECK "${CPPCHECK}"
    "--language=c"
    "--platform=unix64"
    "--enable=all"
    "--force"
    "--inline-suppr"
    )
endif()
find_program(CLANGTIDY "clang-tidy")
if (CLANGTIDY)
  set(CMAKE_C_CLANG_TIDY "${CLANGTIDY}")
endif()

But..

Instead of implementing such an integration ourselves, it might make sense to raise a change request towards cmake to provide a generic solution.
The question is when we would get it and whether we will update to that cmake version soon.

I added a comment to: https://gitlab.kitware.com/cmake/cmake/-/issues/23945

@romkell romkell changed the title RFC: Draft: Add static code analyzer integration RFC: Add static code analyzer integration Oct 28, 2022
@tejlmand
Collaborator

@romkell thanks for this proposal, it surely would be beneficial to have better support for SCA tools in Zephyr.

There are a couple of questions from me.

First, why is it needed to start exporting CMake cache variables in a json format, when those settings are directly available inside CMake itself, but also in the CMakeCache for external tools to read ?
I see the export file is used here:

execute_process(COMMAND ${SCA_UPDATE_SCRIPT} ${SCA_INTEGRATION_DIR}/sca_export.json

to the update tool.
But a json format is really only very useful for the python script, not for bat and bash, which need very customized handling to understand json.
This again leads me to wonder, what exactly is the purpose of the update.[bat|sh|py] script ?
Can you give an example?
You do write:

The actual toolchain information is exported (export json file) and a hook script started, allowing to configure the SCA accordingly.

but why the complexity of going through an export json, isn't such configuration quite static meaning it can be prepared in advance ?

For the import file: https://github.com/zephyrproject-rtos/zephyr/blob/0d98433a73ac74b2c639a1a2d9e6a6c2e452035c/scripts/sca_integration/sca_import.json
why are such hardcoded paths needed ?
Everywhere else we use find_program / find_package which are well known parts of CMake.
Thus it seems a little strange that we should start specifying such tools in a json file instead.

Already today we have an established infrastructure for toolchain / compiler / linker:
https://github.com/zephyrproject-rtos/zephyr/tree/main/cmake/toolchain
https://github.com/zephyrproject-rtos/zephyr/tree/main/cmake/compiler
https://github.com/zephyrproject-rtos/zephyr/tree/main/cmake/linker

did you ever consider making use of that infrastructure ?
In fact one can add an out-of-tree toolchain called cppcheck and use that instead of the real compiler.
But a better approach will probably be to have an sca tree in place, similar to the toolchain tree.
That way a user might set: ZEPHYR_TOOLCHAIN_VARIANT=zephyr and ZEPHYR_SCA_VARIANT=cppcheck.
Then the SCA tool can decide in its cmake integration whether or not to use CMAKE_<lang>_CPPCHECK,
whether or not to also compile code as part of calling the compiler directly, like the sparse tool (when called through cgcc), or as a two step process, like cppcheck + compiler invocation.
If there is a need, the sca tool could then re-use the scheme of compiler flags in case there is a need for detailed option handling (i'm not sure there is such a need, but again i'm not familiar with all possible sca tools out there).

Such an approach has the benefit of being aligned with existing toolchain handling and not rely on new config / json files.
Zephyr already supports oot toolchains, so same pattern can be used for sca tools.
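As a rough sketch only (the file location, variant selection and cppcheck options are illustrative assumptions, not an existing implementation), such an sca.cmake for ZEPHYR_SCA_VARIANT=cppcheck could be as simple as:

# Illustrative sketch of an sca.cmake picked via ZEPHYR_SCA_VARIANT=cppcheck;
# the layout and options are assumptions, not part of this PR.
find_program(CPPCHECK_EXE cppcheck)
if(NOT CPPCHECK_EXE)
  message(FATAL_ERROR "cppcheck not found")
endif()

# Re-use CMake's built-in support: run cppcheck alongside each C compile.
set(CMAKE_C_CPPCHECK ${CPPCHECK_EXE} --enable=warning,style --inline-suppr)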

The main thing i'm unsure about is an actual example for your update.sh scripts, as to what exactly they are expected to do. We already have several ways to adjust the build system, and this one doesn't seem to be that clean in its integration.
But without more details I have a hard time evaluating it.

@marc-hb
Collaborator

marc-hb commented Nov 17, 2022

but also in the CMakeCache for external tools to read ?

@tejlmand what is the recommended way for external tools to read the CMakeCache? I searched the internet and only found deprecated ways.

To sparse zephyr code in production I ended up copying and pasting a bit of Zephyr Cmake code, which felt very awkward:
thesofproject/sof@77bb169

@tejlmand
Collaborator

@tejlmand what is the recommended way for external tools to read the CMakeCache? I searched the internet and only found deprecated ways.

I don't know if there exists any recommended ways, but the format of the cache follows the format of setting cache variables from the command line when using full signature, that is:
-D<var>:<type>=<value> on the command line and in the CMakeCache that is <var>:<type>=<value>.
So if your tool is capable of passing a CMake setting to the build system using -D then it basically already ought to have the knowledge of the format.
Setting of cache vars: https://cmake.org/cmake/help/latest/manual/cmake.1.html#cmdoption-cmake-D
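For example, a CMakeCache.txt entry such as CMAKE_C_COMPILER:FILEPATH=/usr/bin/gcc (an illustrative value) follows exactly that <var>:<type>=<value> pattern.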

One can of course read the CMakeCache directly as a txt file, but another alternative is to simply invoke: cmake -LA <build-folder> which will list all non-advanced settings from the cache in that exact format.
Ref: https://cmake.org/cmake/help/latest/manual/cmake.1.html#cmdoption-cmake-L-A-H

In Zephyr we have a dedicated python module which can read the CMake cache.
That is free to re-use for other python scripts:

class CMakeCacheEntry:
    '''Represents a CMake cache entry.

    This class understands the type system in a CMakeCache.txt, and
    converts the following cache types to Python types:

    Cache Type    Python type
    ----------    -------------------------------------------
    FILEPATH      str
    PATH          str
    STRING        str OR list of str (if ';' is in the value)
    BOOL          bool
    INTERNAL      str OR list of str (if ';' is in the value)
    STATIC        str OR list of str (if ';' is in the value)
    UNINITIALIZED str OR list of str (if ';' is in the value)
    ----------    -------------------------------------------
    '''

The cache types can be seen here:
https://cmake.org/cmake/help/latest/prop_cache/TYPE.html

@marc-hb
Collaborator

marc-hb commented Nov 17, 2022

not the bat and bash which needs very customized handling to understand json.

From bash, jq can do wonders actually. PowerShell seems to have built-in support for JSON!! Now if you're still stuck in the 80s with the utterly broken .BAT language, then ask someone else from the present to write a script for you. Seriously.

@marc-hb
Collaborator

marc-hb commented Nov 17, 2022

So if your tool is capable of passing a CMake setting to the build system using -D then it basically already ought to have the knowledge of the format.

sparse does not know or care about CMake but it (unfortunately) needs a value from Zephyr's CMakeCache.txt.

simply invoke: cmake -LA which will list all non-advanced settings from the cache

I looked at this option and then I gave up because I didn't want to parse the output (filter out all the noise, strip the type, check for the weird NOT-FOUND values and what not).

but another alternative is to simply invoke: cmake -LA which will list all non-advanced settings from the cache in that exact format.

Very interesting; if I had seen this maybe I would have used it. However it looks like this is only a library, meaning a new python script has to be created to import and use it. So this is still not at the level of jq command-line convenience.

PS: I forgot which other, "deprecated" method I found sorry. Maybe I got confused.

@romkell
Contributor Author

romkell commented Nov 17, 2022

Thanks for looking at it.
First of all, I am not a cmake expert, hence any know-how is welcome.

The notion here is to provide a generic interface for a Zephyr OS customer/user to

  1. execute a tool replacing the compiler/linker, where the replacing tool then calls the compiler/linker itself (some SCA types, where the SCA calls the intended compiler with all the params).
  2. provide the compiler/linker executable paths to a tool intercepting the compiler/linker on OS level (other SCA types; the SCA needs to know which executable to intercept)
  3. execute an additional tool along (pre/post) with the compiler/linker - not yet implemented

without having to change anything in the cmake build system itself.

but why the complexity of going through an export json, isn't such configuration quite static meaning it can be prepared in advance ?

Since Zephyr OS selects the compiler via the defined board, it is rather dynamic and not static. It is different whether I compile native_posix, nucleo_g071rb or sifive_... .

First, why is it needed to start exporting CMake cache variables in a json format, when those settings are directly available inside CMake itself, but also in the CMakeCache for external tools to read ?

I did not know about CMakeCache.txt and its format. Is it valid to read it directly (CMakeCache.txt being some kind of database, whose format might change at Kitware's disposal, even if they might not have changed it in years)? Should one not use the cmake API set(), unset() and $CACHE{varname}, which are cmake internals? How would an external tool access those?

For the import file: https://github.com/zephyrproject-rtos/zephyr/blob/0d98433a73ac74b2c639a1a2d9e6a6c2e452035c/scripts/sca_integration/sca_import.json
why are such hardcoded paths needed ?
Everywhere else we use find_program / find_package which are well known parts of CMake.

We could use find_program and the program name only. Providing the full path to the tool is unambiguous, though. I do not insist on absolute paths; both could be allowed. The more options, the more complexity.

Thus it seems a little strange that we should start specifying such tools in a json file instead.

I used json because it seemed the only format that cmake supports out of the box (no yml, no ini ...) and it is a standard.

But a json format is really only very useful for the python script, not the bat and bash which needs very customized handling to understand json.

There are so many programming languages that have excellent support for json; bat and bash probably do not. But I rather saw bat and bash as the most common script types on Windows and Unixes, which can then call any other "program" of the user's choice.

This again leads me to wonder, what exactly is the purpose of the update.[bat|sh|py] script ?
Can you make an example.

Since the compiler called by cmake changes with the selected board, the SCA tool needs to know that information before it executes. Eventually the configuration file for the SCA tool has to be adjusted accordingly. Every SCA tool will be different in that respect.
update.[xyz] is comparable to a Subversion or Git pre-commit hook or similar. Instead of getting the commit information passed, here we pass the selected compiler/linker toolchain paths. If the SCA tool does not need its configuration updated, the update.[xyz] scripts can be omitted.

Already today we have an established infrastructure for toolchain / compiler / linker:
https://github.com/zephyrproject-rtos/zephyr/tree/main/cmake/toolchain
https://github.com/zephyrproject-rtos/zephyr/tree/main/cmake/compiler
https://github.com/zephyrproject-rtos/zephyr/tree/main/cmake/linker

did you ever consider making use of that infrastructure ?

I saw all of that. From what I understood it is the toolchain / compiler / linker etc. that the Zephyr OS project maintains and even bundles in the SDK.
I did not want Zephyr OS to support all of https://en.wikipedia.org/wiki/List_of_tools_for_static_code_analysis#C,_C++ and maintain their integration as found in the folders mentioned above.
A Zephyr OS customer / user should just get the freedom and an easy way to integrate their favorite SCA tool, which they very likely bought for some money. The customer / user should also have the freedom to have their tool configured the way they require it. Therefore I first started with an oot solution and later thought, why not use the same interface in tree for Zephyr OS's preferred SCA tool and a configuration with the Zephyr OS rules.

Meanwhile I feel that even calling it an "SCA" integration interface is narrowing it down too much, since you can integrate any tool, not only SCAs, with the options mentioned at the top of this response.

I plan to attend the architecture WG meeting next week (22.11.22, unless cancelled) and would be happy to explain and discuss. Can we put that on the agenda?
I might not have seen all the options that are already in the existing build system. And I feel that I cannot explain it better in this chat with reasonable effort (it already feels lengthy).

@codecov-commenter

This comment was marked as off-topic.

@nashif
Member

nashif commented Nov 18, 2022

Do you have examples of how this works with tools like cppcheck, clang-tidy, iwyu?
See for example how we launch ccache; most checkers would follow the same approach, I think.

I have used built-in support for such tools utilizing C_COMPILER_LAUNCHER and RULE_LAUNCH_COMPILE, RULE_LAUNCH_LINK properties in the past, which seem to be the obvious choice for such integration.
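A minimal sketch of that launcher approach (sca-launcher.sh is a hypothetical wrapper that runs the checker and then forwards to the real compiler; not an existing script):

find_program(SCA_LAUNCHER sca-launcher.sh)  # hypothetical wrapper script
if(SCA_LAUNCHER)
  # Prefix every compile command with the launcher, as done for ccache.
  set(CMAKE_C_COMPILER_LAUNCHER ${SCA_LAUNCHER})
  # Optionally also wrap the link step.
  set_property(GLOBAL PROPERTY RULE_LAUNCH_LINK "${SCA_LAUNCHER}")
endif()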

Using the json part and scripting does seem to be awkward, and I agree with Torsten that there is already infrastructure in place to support different toolchains, linkers and so on, so adding a launcher configuration using the same model seems the right and obvious way to go.
In any case, without having an example, it is very difficult to see how this solution is generic.

@romkell
Contributor Author

romkell commented Nov 18, 2022

I have used built-in support for such tools utilizing C_COMPILER_LAUNCHER and RULE_LAUNCH_COMPILE, RULE_LAUNCH_LINK properties in the past, which seem to be the obvious choice for such integration.

I had a quick look at CMAKE_<LANG>_COMPILER_LAUNCHER resp. <LANG>_COMPILER_LAUNCHER.
The gist is clear, the details would have to be evaluated. It seems to be cmake's intended solution, which is nicer than patching CMAKE_<LANG>_COMPILER and _LINKER somewhere in the cmake scripts.

I still feel we should not only support the integration of one SCA but as many supportive tools as the user likes, with an option to select one when building (west build, cmake).

  • cppcheck (free, fixing 90% of the (Misra) issues, for the community without access to a commercial SCA)
  • clang-analyzer (free, as above, for cppcheck)
  • <my-company-bought-sca> (bought for some reasonable money, checking the company internal code, oot)
  • <zephyr-bought-commercial-sca> (I assume, not locally available to the community only on Zephyr OS CI)
  • <no-sca-tool-put-another-tool-doing-what-so-ever>
  • <going-straight-without-launching-any-extra-tool> (without unnecessarily slowing down my build)

Something like CMAKE_<LANG>_COMPILER_<TOOL_NAME> could be done for each option, and if set in the env, <LANG>_COMPILER_LAUNCHER can be set accordingly in one of the cmake script files.

Personally I do not like to clutter the env with all kinds of env variables. I prefer a config file (json, yml, ini, I do not care that much, as long as there is good support from script languages) which I can easily put under version control.

JSON seems to be cmake's choice: https://cmake.org/cmake/help/latest/manual/cmake-presets.7.html.

@marc-hb
Collaborator

marc-hb commented Nov 18, 2022

JSON seems to be cmake's choice: https://cmake.org/cmake/help/latest/manual/cmake-presets.7.html.

JSON is also IPLD's choice https://ipld.io/docs/codecs/ and many others'. It's the standard, I don't think it has any serious competition in that particular niche, has it?

Personally I do not like to clutter the env with all kind of env variables.

Environment variables are a "build code smell". Probably their biggest sin is to be nearly invisible and to make it very hard to reproduce issues because of unsuspected differences. Numerous tools clean the environment (sudo, docker, bitbake,...). CMake cannot define build-time environment variables, see for instance sparse static analysis commit 3ebb18b

EDIT: yet another issue with environment variables: zephyrproject-rtos/west#613

Also, no one can ever tell where environment variables came from and no one knows who's going to read them. They're "super-global" variables.

Environment variables are used when every other, proper solution is unavailable or failed.

@tejlmand
Collaborator

JSON seems to be cmake's choice: https://cmake.org/cmake/help/latest/manual/cmake-presets.7.html.

JSON is also IPLD's choice https://ipld.io/docs/codecs/ and many others'. It's the standard, I don't think it has any serious competition in that particular niche, has it?

note, i'm not against json, my problem with json in this particular case and the proposal in this PR is that it is not following any of our existing ways of configuring the build system or integrating extra tools.

For example we use variables for specifying toolchains; those can be set either on the command line, such as -DZEPHYR_TOOLCHAIN_VARIANT=<toolchain>, or as an environment variable.

We use yaml files in Zephyr modules to integrate Zephyr modules with the build system.

We have Kconfig to enable / disable features in the build and control compile options.

Therefore I don't believe we should also start to use Json for integrating SCA tools, especially not when I believe the existing build infrastructure can easily be extended to support such tools in a way that is aligned with what we already have in place.

Personally I do not like to clutter the env with all kind of env variables.

Environment variables are a "build code smell".

Did anyone request this to be done using environment variables ?

I saw all of that. From what I understood it is the toolchain / compiler / linker etc. that the Zephyr OS project maintains and even bundles in the SDK.

Only the Zephyr SDK; Zephyr supports other toolchains besides that one: oneAPI, espressif, gnuarmemb, arcmwdt, etc.
Not to mention there are downstream users with custom toolchains, which hook into the Zephyr build system using the TOOLCHAIN_ROOT setting.

So users can already today choose one of the officially supported toolchains or hook into the build system using CMake directly for any toolchain they want to use.

but why the complexity of going through an export json, isn't such configuration quite static meaning it can be prepared in advance ?

Since Zephyr OS selects the compiler via the defined board, it is rather dynamic and not static. It is different whether I compile native_posix, nucleo_g071rb or sifive_... .

but if you integrate support for the SCA tool using CMake, then you have all this information available without the need for exporting to json, and you can much more easily decide how to configure the SCA tool based on the board, arch, compiler, etc.
Even disable it completely under certain conditions if need be.

Eventually the configuration file for the SCA tool has to be adjusted accordingly. Every SCA tool will be different in that respect.

But CMake already has a lot of built-in functionality for generating files based on settings.
One very simple solution in a lot of cases is configure_file(), which can even replace CMake vars with their actual values in a template file when called.
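For example (sca_config.json.in is a hypothetical template file, only to illustrate configure_file()):

# Hypothetical example: occurrences of @CMAKE_C_COMPILER@ in
# sca_config.json.in are replaced with the compiler CMake selected.
configure_file(${CMAKE_CURRENT_SOURCE_DIR}/sca_config.json.in
               ${CMAKE_BINARY_DIR}/sca_config.json @ONLY)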
So this didn't answer my question of why the bat, sh, py scripts are needed.
If we allow integrating in similar fashion as the toolchain root, then we stay consistent, and if anyone downstream wants to call out to a bat / sh script from within CMake, they can do so.
But let's keep our integration point in CMake.

@romkell
Contributor Author

romkell commented Nov 23, 2022

@carlescufi, @tejlmand, @marc-hb @nashif : can we make that an agenda point in the architecture WG, possibly next week? I feel verbal communication is more productive than this chat conversation. Everyone has already written quite a lot.

I guess we need to discuss the base concept before the details how to do it.
What do you think?

@carlescufi
Member

@carlescufi, @tejlmand, @marc-hb @nashif : can we make that an agenda point in the architecture WG, possibly next week? I feel verbal communication is more productive than this chat conversation. Everyone has already written quite a lot.

Sure, let's discuss this in the Architecture WG.

@carlescufi
Member

carlescufi commented Nov 29, 2022

Architecture WG:

  • @tejlmand mentions we should have a different tree with a FindSCATool() that can then hook in at multiple places, allow for C compiler selection, etc.
  • @nashif and @tejlmand are in favor of using a generic framework that can then be extended out of tree, just like everything else we have here
  • @tejlmand mentions that we could take the SPARSE integration and rework it so that we set up the basics of the framework
  • @nashif asks for concrete examples in the PR so we understand better how the infrastructure would work
  • @tejlmand suggests a generic framework that can either reuse existing functionality (e.g. CMake's own CPPCHECK support) or make its own for more complex scenarios or tools
  • @tejlmand will provide a draft PR/branch with a rough outline of what the framework would look like

@tejlmand
Collaborator

Architecture WG:

As promised, see #52671 for an example of how this can be done.

@romkell please take a look and see how this fits your needs.
Notice the sca.cmake can set up any tool / script to be called as part of the build in any way you like.
It can also decide to simply use CMake's built-in support like CMAKE_C_CPPCHECK if that suits the needs of the tool.

@romkell
Contributor Author

romkell commented Dec 5, 2022

PR closed in favor of #52671.

@romkell romkell closed this Dec 5, 2022
@shubhamtotade

@romkell I wanted to integrate Klocwork in zephyr. Where do I need to make the changes?

@romkell
Contributor Author

romkell commented Jun 22, 2023

@shubhamtotade

@romkell I wanted to integrate Klocwork in zephyr. Where do I need to make the changes?

I withdrew my approach in favour of #52671

You can find https://docs.zephyrproject.org/latest/develop/sca/index.html in the docs for Zephyr OS > 3.3.0.

@romkell romkell deleted the cmake_sca_integration branch June 22, 2023 20:19
@marc-hb
Collaborator

marc-hb commented Jun 22, 2023

@shubhamtotade

You can find https://docs.zephyrproject.org/latest/develop/sca/index.html in the docs for Zephyr OS > 3.3.0.

How do I enable Klocwork SCA in Zephyr version 3.0.99?
