
Send GitLab host logs to CloudWatch #3894

Closed
amarjandu opened this issue Feb 24, 2022 · 6 comments
Assignees
achave11-ucsc

Labels
+ [priority] High
compliance [subject] Information and software security
demo [process] To be demonstrated at the end of the sprint
demoed [process] Successfully demonstrated to team
enh [type] New feature or request
gitlab [subject] One of the GitLab instances
orange [process] Done by the Azul team
spike:8 [process] Spike estimate of eight points

Comments

amarjandu commented Feb 24, 2022

/mnt/gitlab/logs/
/var/log/

[Attachment: Data Browser System Overview - Logging & Monitoring]

@amarjandu amarjandu added the orange [process] Done by the Azul team label Feb 24, 2022
@melainalegaspi melainalegaspi added gitlab [subject] One of the GitLab instances enh [type] New feature or request labels Feb 25, 2022
@achave11-ucsc achave11-ucsc changed the title Send GitLab syslog to CloudWatch Send GitLab host logs to CloudWatch Apr 29, 2022
achave11-ucsc commented Apr 29, 2022

@hannes-ucsc:

Perform the following spike (sketched below):

  1. Build a Docker image for the agent: https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/ContainerInsights-build-docker-image.html
  2. Start an EC2 instance from the Rancher AMI that's used by our GitLab instance (@amarjandu can help).
  3. Upload the image tarball to the EC2 instance.
  4. Import the tarball.
  5. Configure the agent.
  6. Start a container from the image.
  7. Verify that logs are sent to CloudWatch.

It's important to be aware of the distinction between the deprecated and the new agent. See the note at the top of https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/AgentReference.html.
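A rough shell sketch of the steps above, assuming the Dockerfile from the AWS docs linked in step 1; the image tag cwagent:latest, the rancher user, the instance address, the config mount path, and the log mounts are illustrative placeholders, not tested values:

    # 1. Build the agent image locally, per the linked AWS docs
    git clone https://github.com/aws-samples/amazon-cloudwatch-container-insights.git
    docker build -t cwagent:latest amazon-cloudwatch-container-insights/cloudwatch-agent-dockerfile

    # 3. Save the image and upload the tarball to the EC2 instance
    docker save cwagent:latest | gzip > cwagent.tar.gz
    scp cwagent.tar.gz rancher@<instance-address>:

    # 4. Import the tarball on the instance
    ssh rancher@<instance-address> 'gunzip -c cwagent.tar.gz | docker load'

    # 5./6. On the instance, start the agent container with a config file
    # and the host log directories mounted read-only
    docker run -d --name cloudwatch-agent \
        -v /path/to/agent-config.json:/etc/cwagentconfig/agent-config.json:ro \
        -v /mnt/gitlab/logs:/mnt/gitlab/logs:ro \
        -v /var/log:/var/log:ro \
        cwagent:latest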

@achave11-ucsc achave11-ucsc added the spike:8 [process] Spike estimate of eight points label Apr 29, 2022
@achave11-ucsc achave11-ucsc self-assigned this Apr 29, 2022
hannes-ucsc commented:

Change of plans: we'll be migrating the GitLab instance to start from the Amazon Linux AMI, on which the CloudWatch agent can be installed via yum.
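For reference, installing and starting the agent on Amazon Linux is short; this is a sketch using the agent's standard package name and default config path, not the exact commands used on the instance:

    # Install the CloudWatch agent from the Amazon Linux repositories
    sudo yum install -y amazon-cloudwatch-agent

    # Load the config and (re)start the agent
    sudo /opt/aws/amazon-cloudwatch-agent/bin/amazon-cloudwatch-agent-ctl \
        -a fetch-config -m ec2 \
        -c file:/opt/aws/amazon-cloudwatch-agent/etc/amazon-cloudwatch-agent.json -s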

hannes-ucsc commented Feb 28, 2023

Docker uses the syslog driver by default. On the Amazon Linux AMI, this causes all container output (stderr and stdout) to be written to /var/log/messages. The container running the gitlab image on that instance runs a wrapper which runs gitlab-ctl tail in the background, which in turn runs Unix tail on all log files written to /var/log/gitlab in the container (/mnt/gitlab/logs on the host). The tail command is invoked with --name, which causes every line in tail's output to be prefaced with a header reading ==> /var/log/gitlab/foo.log <==. This is relatively useless because that adjacency correlation is lost in /var/log/messages on the host, where lines from other processes are interspersed.
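To illustrate: the daemon's effective log driver can be checked with docker info, and the default is set in /etc/docker/daemon.json; the snippet below is a sketch, not the instance's actual configuration:

    # Confirm which log driver the daemon defaults to
    docker info --format '{{.LoggingDriver}}'   # e.g. "syslog"

    # The default comes from /etc/docker/daemon.json, e.g.:
    #   { "log-driver": "syslog" }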

We should disable the tail by setting GITLAB_SKIP_TAIL_LOGS for the gitlab container so that the verbose GitLab container logs don't pollute /var/log/messages. They are already available on the host under /mnt/gitlab/logs, from where the CloudWatch Logs agent can be configured to pick them up.
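A sketch of how the variable would be set on the container; the container name, image tag, and mount are assumptions about how the instance runs GitLab, not the actual invocation:

    # GITLAB_SKIP_TAIL_LOGS tells the image's wrapper not to run gitlab-ctl tail
    docker run -d --name gitlab \
        -e GITLAB_SKIP_TAIL_LOGS=true \
        -v /mnt/gitlab/logs:/var/log/gitlab \
        gitlab/gitlab-ce:latest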

Scan the entire file system for other log directories that we might want to forward to CloudWatch. The SSM agent's log directory comes to mind, for example.
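A sketch of the relevant logs section of the agent config, written via a heredoc; the log group names and the exact collect_list entries are illustrative assumptions, not the values that were actually deployed:

    sudo tee /opt/aws/amazon-cloudwatch-agent/etc/amazon-cloudwatch-agent.json <<'EOF'
    {
      "logs": {
        "logs_collected": {
          "files": {
            "collect_list": [
              {
                "file_path": "/var/log/messages",
                "log_group_name": "gitlab/var/log/messages",
                "log_stream_name": "{instance_id}"
              },
              {
                "file_path": "/mnt/gitlab/logs/reconfigure/*.log",
                "log_group_name": "gitlab/reconfigure",
                "log_stream_name": "{instance_id}"
              },
              {
                "file_path": "/var/log/amazon/ssm/amazon-ssm-agent.log",
                "log_group_name": "gitlab/ssm-agent",
                "log_stream_name": "{instance_id}"
              }
            ]
          }
        }
      }
    }
    EOF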

achave11-ucsc commented May 5, 2023

All the log entries agreed on were added and are active. Additionally, the experiment on …/reconfigure/{dated}.log was successful: not only is the log forwarded after starting and stopping the gitlab daemon (and its dependents), but forwarding also survives the re-creation of the instance (when the new log entries and related changes were pushed).

hannes-ucsc commented:

For the demo, show that the forwarding of the logs survives restarting the agent (via systemctl restart) as well as rebooting and recreating the instance. Show evidence that GITLAB_SKIP_TAIL_LOGS does indeed prevent GitLab log files from being tailed or sent to the host. Show that the agent picks up new log files matching /mnt/gitlab/logs/reconfigure/*.log without having to be restarted. I think restarting the gitlab container, or running gitlab-ctl reconfigure inside it, causes a new log file to be created.

https://docs.gitlab.com/ee/administration/restart_gitlab.html#omnibus-gitlab-reconfigure
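A sketch of those verification steps as shell commands; the container name gitlab and the log group name gitlab/reconfigure follow the config sketch above and are assumptions:

    # Forwarding should survive an agent restart
    sudo systemctl restart amazon-cloudwatch-agent

    # A reconfigure should create a new dated log file that the agent
    # then picks up without being restarted
    docker exec gitlab gitlab-ctl reconfigure
    ls -t /mnt/gitlab/logs/reconfigure/ | head -n 1

    # Confirm that the newest stream in CloudWatch Logs matches
    aws logs describe-log-streams --log-group-name gitlab/reconfigure \
        --order-by LastEventTime --descending --max-items 1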

@hannes-ucsc hannes-ucsc added the demo [process] To be demonstrated at the end of the sprint label May 27, 2023
@achave11-ucsc achave11-ucsc added the demoed [process] Successfully demonstrated to team label Jun 6, 2023

nolunwa-ucsc commented Jun 26, 2023

@dsotirho-ucsc Please update this spreadsheet to include the GitLab log audit events captured in CloudWatch: https://docs.google.com/spreadsheets/d/1XTl3tQmr9wU1o8z8uGNjOU55jYlmWWMBR33QuKfg_S4/edit#gid=311132757

@dsotirho-ucsc dsotirho-ucsc removed this from the AnVIL Public Release milestone Oct 10, 2023