Custom static analysis tooling for CI/CD #546
Roughly speaking, yes! The AST produced by the Go parser does indeed maintain order of declaration, and tracking code within a single file is rather simple to do. Take the following (hacked together) program, which takes a path to a file (or code over stdin):

```go
package main

import (
	"fmt"
	"go/ast"
	"go/parser"
	"go/token"
	"io/ioutil"
	"os"
	"strings"
)

func main() {
	var src string
	var err error
	if len(os.Args) < 2 {
		// No file argument: read the source code from stdin.
		srcBytes, err := ioutil.ReadAll(os.Stdin)
		if err != nil {
			panic(err)
		}
		src = string(srcBytes)
	} else {
		srcBytes, err := ioutil.ReadFile(os.Args[1])
		if err != nil {
			panic(err)
		}
		src = string(srcBytes)
	}
	fset := token.NewFileSet()
	parsed, err := parser.ParseFile(fset, "", src, 0)
	if err != nil {
		panic(err)
	}
	// Keep only the function declarations, in source order.
	functions := filter(parsed.Decls, func(decl ast.Decl) bool {
		_, ok := decl.(*ast.FuncDecl)
		return ok
	})
	if len(functions) == 0 {
		panic("why there ain't no functions at all, what kinda computer program is this!?")
	}
	function := functions[0].(*ast.FuncDecl)
	isInit := function.Name.Name == "init"
	isNotMethod := function.Recv == nil
	noParameters := len(function.Type.Params.List) == 0
	noReturn := function.Type.Results == nil
	if isInit && isNotMethod && noParameters && noReturn {
		fmt.Println("Good job, contributor!")
	} else {
		chastise(function, fset, src)
	}
}

func filter(decls []ast.Decl, predicate func(decl ast.Decl) bool) (filtered []ast.Decl) {
	for _, decl := range decls {
		if predicate(decl) {
			filtered = append(filtered, decl)
		}
	}
	return
}

// chastise prints the offending declaration's source along with its line number.
func chastise(function *ast.FuncDecl, set *token.FileSet, src string) {
	file := set.File(function.Pos())
	start := file.Offset(function.Pos())
	end := file.Offset(function.End())
	srcCode := strings.NewReader(src)
	buf := make([]byte, end-start)
	_, err := srcCode.ReadAt(buf, int64(start))
	if err != nil {
		panic(err)
	}
	lineno := file.Line(function.Pos())
	fmt.Printf("File %s Line %d: got non-init function as the first function declaration\n", file.Name(), lineno)
	fmt.Println("----")
	fmt.Println(string(buf))
	fmt.Println("----")
	fmt.Println("^^^^ ZLint lints must have func init() {} as their first function declaration")
}
```

Given the following input, we should get:

```
$ ./main
package code
func add(a, b int) int {
	return a + b
}
func B() {}
const str = "asd"
var things = []byte{}
func init() {}
-----------------------------------------------------
Line 3: got non-init function as the first function declaration
----
func add(a, b int) int {
	return a + b
}
----
^^^^ ZLint lints must have func init() {} as their first function declaration
```

Conversely...

```
$ ./main
package code
func init() {}
func add(a, b int) int {
	return a + b
}
func B() {}
const str = "asd"
var things = []byte{}
Good job, contributor!
```

With a little extra work, one would think it perfectly possible to accomplish something similar to what
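The heart of the check above can be distilled into a reusable predicate. Here is a minimal sketch (the function name `firstFuncIsInit` is my own, not from the program above) that parses a source string and reports whether its first function declaration is a bare `func init() {}`:

```go
package main

import (
	"fmt"
	"go/ast"
	"go/parser"
	"go/token"
)

// firstFuncIsInit reports whether the first function declaration in src
// is `func init()` with no receiver, parameters, or results.
func firstFuncIsInit(src string) (bool, error) {
	fset := token.NewFileSet()
	parsed, err := parser.ParseFile(fset, "", src, 0)
	if err != nil {
		return false, err
	}
	for _, decl := range parsed.Decls {
		fn, ok := decl.(*ast.FuncDecl)
		if !ok {
			continue // skip imports, consts, vars, and type declarations
		}
		// The first function declaration decides the outcome.
		return fn.Name.Name == "init" &&
			fn.Recv == nil &&
			len(fn.Type.Params.List) == 0 &&
			fn.Type.Results == nil, nil
	}
	return false, fmt.Errorf("no function declarations found")
}

func main() {
	ok, err := firstFuncIsInit("package code\nfunc init() {}\nfunc B() {}\n")
	if err != nil {
		panic(err)
	}
	fmt.Println(ok) // prints: true
}
```

Because `parsed.Decls` preserves source order, the first `*ast.FuncDecl` encountered is exactly the first function declaration in the file, so no position bookkeeping is needed for the pass/fail decision itself.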
@christopher-henderson That's excellent, neat work! Do you have interest in operationalizing it into a CI check? If not I'd be happy to run with it.

@cpu yeah, I was gonna give it a quick whack this weekend alongside moving the version bump along.

Awesome, looking forward to it. Thanks!
So, @cpu, of course nothing is ever that easy. I saw that we use

First, let's take a gander at their new linters guidelines. Write a struct with a magic name and magic interface, update these handful of files, and open a PR. Pretty familiar, no? Well, that's if you want to merge a lint into the core tool itself; a so-called "public lint". If you want a "private lint" then you can again implement a magic interface and compile a plugin with

The catch with a "private lint", however, is that you cannot link a plugin into their release builds; you have to clone and build the project yourself to generate a build that can dynamically link in this way. We would have to revise the GitHub workflow that we use for this, as we're using their own GitHub Action to trigger it, and that action does not appear to support anything other than pulling the built releases.

Of course, we can throw our use of that action away and implement a small script to pull their repo, check out versions, and compile the right binary. But even then, getting these objects to successfully link at runtime has proven to be tricky. I got it to work once, but then changed something in my environment (no idea what, honestly) and started to get linker errors due to differing versions of transitive dependencies.

But if we're pulling the repo locally, why not just apply a patch that follows the easier path of building a first-class lint into it? But then what do we do? Fork and vendor the project.

I think you can see that it's been a hell of an afternoon, and I'm beginning to be wary of integrating with
@christopher-henderson 😆 Who lints the linters?
That's too bad. I appreciate you thinking about how this could look and describing the challenges. I haven't looked at extending
That's closer to what I was imagining. I was thinking we could leave

Roughly the plan would be:
I feel like that will give us the result we're looking for without dynamic linking shenanigans or having to change up the existing

WDYT? Sound reasonable?
@cpu totally, I think that'll be the plan of attack.
Inspired by #536 (comment)