Allow the timestamp field to be customized #33
@savaki I've never used Loggly before. From https://www.loggly.com/docs/automated-parsing/#json, it looks like we want the flexibility to (a) call the field …
Correct. I'm using my forked version now to publish logs to Loggly:

```go
func appendTimestamp(data []byte, t time.Time) []byte {
	data = append(data, `,"timestamp":"`...)
	data = t.UTC().AppendFormat(data, "2006-01-02T15:04:05.000Z")
	return append(data, `"`...)
}

func sample() {
	logger := zap.NewJSON(zap.Output(output), zap.Append(appendTimestamp))
}
```
Gotcha. I think we can actually provide a little more flexibility than that - it should be easy enough to allow users to configure nearly everything about the final message (time format, key names, etc.). I'll try to open a PR this weekend. When I do, I'll tag you - let me know what you think.
Awesome, thanks.
Note to self: this should also replace the …
+1 here, we require our 'ts' field to be in ISO-8601 format (i.e. "2006-01-02T15:04:05.000Z").
Addressed in #115. You can now customize your timestamps with something like this:

```go
logger := zap.New(
	zap.NewJSONEncoder(zap.RFC3339Formatter("@timestamp")),
	zap.ErrorLevel,
	zap.Output(someFile),
)
```

If your preferred time format isn't supported out of the box, it should be easy to plug in your own …
Currently, the timestamp is written as Unix nanoseconds in the "ts" field. I'd like to be able to pump logs directly into third-party services like Loggly, which have different naming and format requirements for the timestamp field.

I was thinking of something like this: https://github.com/savaki/zap
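To make the mismatch concrete, here is a hypothetical helper (not part of zap) that converts the default Unix-nanosecond `ts` value into the ISO-8601 form that services like Loggly expect:

```go
package main

import (
	"fmt"
	"time"
)

// nanosToISO8601 converts a Unix-nanosecond timestamp (zap's default
// "ts" representation) into an ISO-8601 string in UTC with millisecond
// precision. The helper name and signature are illustrative only.
func nanosToISO8601(ns int64) string {
	return time.Unix(0, ns).UTC().Format("2006-01-02T15:04:05.000Z")
}

func main() {
	ns := int64(1456835445123000000) // an arbitrary sample instant
	fmt.Println(nanosToISO8601(ns))  // 2016-03-01T12:30:45.123Z
}
```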