Ensure request is chained before payload is logged #4301
Changes from 2 commits
@@ -100,6 +100,11 @@ func (p *PredictorProcess) transformInput(node *v1.PredictiveUnit, msg payload.S
 	modelName := p.getModelName(node)

+	if callModel || callTransformInput {
+		msg, err := p.Client.Chain(p.Ctx, modelName, msg)
+		if err != nil {
+			return nil, err
+		}

 	//Log Request
 	if node.Logger != nil && (node.Logger.Mode == v1.LogRequest || node.Logger.Mode == v1.LogAll) {
 		err := p.logPayload(node.Name, node.Logger, payloadLogger.InferenceRequest, msg, puid)

Review thread on the added p.Client.Chain call:

- This makes sense, also for the Tensorflow protocol. Thinking it may be worth also implementing this for the …
- Here, if the chain function fails and throws an error, the message will not be logged. Is that a concern?
- Good spot @axsaucedo, just added the same for … @SachinVarghese, that's a good point. Although if the chaining fails, the request will also fail. Should we log the request if we know it's going to fail?
- I had a look at this again; the current behavior seems to be good.
@@ -108,10 +113,6 @@ func (p *PredictorProcess) transformInput(node *v1.PredictiveUnit, msg payload.S
 		}
 	}

-	msg, err := p.Client.Chain(p.Ctx, modelName, msg)
-	if err != nil {
-		return nil, err
-	}
 	p.RoutingMutex.Lock()
 	p.Routing[node.Name] = -1
 	p.RoutingMutex.Unlock()
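For readers skimming the thread, here is a minimal, self-contained Go sketch of the control flow being discussed. It is not the actual seldon-core executor code: payload, chain, logPayload, and the simplified transformInput below are hypothetical stand-ins, and it assumes Chain's role is to put the previous node's response into the form the next model expects. With the new ordering, the request logger sees the already-chained payload, and if chaining fails the function returns before anything is logged, which the reviewers concluded is acceptable because such a request would fail anyway.

package main

import (
	"errors"
	"fmt"
)

// payload is a simplified stand-in for the executor's request/response payload type.
type payload struct{ data string }

// chain is a stand-in for p.Client.Chain: it reshapes the previous node's
// output into the request the next model expects (assumption for this sketch).
func chain(modelName string, msg payload) (payload, error) {
	if msg.data == "" {
		return payload{}, errors.New("empty payload cannot be chained")
	}
	return payload{data: "chained(" + msg.data + ") for " + modelName}, nil
}

// logPayload is a stand-in for the request logger.
func logPayload(nodeName string, msg payload) {
	fmt.Printf("logged request for %s: %s\n", nodeName, msg.data)
}

// transformInput mirrors the ordering introduced by this PR: chain first, then log.
func transformInput(modelName string, msg payload) (payload, error) {
	// Chain before logging, so the logged request matches what the model receives.
	msg, err := chain(modelName, msg)
	if err != nil {
		// If chaining fails we return before logging: the request would fail
		// anyway, which is the behaviour the reviewers settled on.
		return payload{}, err
	}

	// Log the (already chained) request payload.
	logPayload(modelName, msg)
	return msg, nil
}

func main() {
	if out, err := transformInput("classifier", payload{data: "tensor"}); err == nil {
		fmt.Println("model receives:", out.data)
	}

	// A failing chain call: nothing is logged and the error propagates.
	if _, err := transformInput("classifier", payload{}); err != nil {
		fmt.Println("chain failed:", err)
	}
}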
Review thread on the licenses file change:

- Is this from a missed license run from a different PR?
- It seems that way. Either that, or some sub-dependency got updated. The linter failed within the GitHub Action, and re-generating the licenses file locally fixed it.