
Streaming the audit log for your enterprise

Learn how to stream audit and Git events data from GitHub to an external data management system.

Who can use this feature?

Enterprise owners

Note

Webhooks might be a good alternative to the audit log or API polling for certain use cases. Webhooks are a way for GitHub to notify your server when specific events occur for a repository, organization, or enterprise. Compared to the API or searching the audit log, webhooks can be more efficient if you just want to learn and possibly log when certain events occur on your enterprise, organization, or repository. See Webhooks documentation.

About audit log streaming

You can help protect intellectual property and maintain compliance for your company by using streaming to keep copies of your audit log data. The audit log details events such as changes to settings and access, user membership, app permissions, and more. See Audit log events for your enterprise, Audit log events for your organization, and Security log events.

Streaming audit log data has these benefits:

  • Data exploration. Examine streamed events using your preferred tool for querying large quantities of data. The stream contains both audit events and Git events across the entire enterprise account.
  • Data continuity. If you pause a stream, it retains a buffer for seven days, so there is no data loss for the first week. If the stream remains paused for more than seven days, it will resume from a point one week prior to the current time. If paused for three weeks or more, the stream won't retain any data and will start anew from the current timestamp.
  • Data retention. Keep your exported audit logs and Git events data as long as you need to.

You can set up, pause, or delete a stream at any time. The stream exports audit and Git events data for all of the organizations in your enterprise, for activity from the time the stream is enabled onwards.

All streamed audit logs are sent as compressed JSON files. The filename format is YYYY/MM/HH/MM/<uuid>.json.gz.

Note

GitHub uses an at-least-once delivery method. Due to certain network or system issues, some events may be duplicated.
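
Because delivery is at-least-once, downstream consumers typically deduplicate events. The following Python sketch is not part of the official setup; it assumes you have already downloaded some of the compressed JSON files locally and that each event carries a _document_id field you can deduplicate on (swap in another stable identifier if your payloads differ).

    # Sketch: decompress streamed audit log files and drop duplicate events.
    # Assumes files were downloaded to ./audit-logs and each event has a
    # "_document_id" field to deduplicate on (verify against your own payloads).
    import glob
    import gzip
    import json

    seen = set()
    events = []

    for path in glob.glob("audit-logs/**/*.json.gz", recursive=True):
        with gzip.open(path, "rt", encoding="utf-8") as fh:
            payload = fh.read()
        try:
            batch = json.loads(payload)            # file contains a JSON array
        except json.JSONDecodeError:
            batch = [json.loads(line) for line in payload.splitlines() if line.strip()]
        if isinstance(batch, dict):
            batch = [batch]
        for event in batch:
            doc_id = event.get("_document_id")
            if doc_id is not None and doc_id in seen:
                continue  # duplicate from at-least-once delivery
            if doc_id is not None:
                seen.add(doc_id)
            events.append(event)

    print(f"{len(events)} unique events")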

Health checks for audit log streams

Every 24 hours, a health check runs for each stream. If a stream is set up incorrectly, an email will be sent to the enterprise owners. To avoid audit log events being dropped from the stream, a misconfigured stream must be fixed within six days.

To fix your streaming configuration, follow the steps in Setting up audit log streaming.

Setting up audit log streaming

To set up the audit log stream, follow the instructions for your provider:

Note

To get a list of IP address ranges that GitHub uses for connections to the streaming endpoint, use the REST API. The meta endpoint for GitHub Enterprise Cloud includes a hooks key with a list of the IP addresses. See REST API endpoints for meta data.
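
As a quick illustration (not an official setup step), the sketch below calls that endpoint with Python's standard library; the GITHUB_TOKEN environment variable is optional and only raises the rate limit.

    # Sketch: fetch the IP ranges GitHub uses for hook and streaming connections
    # from the REST meta endpoint. GITHUB_TOKEN is optional.
    import json
    import os
    import urllib.request

    req = urllib.request.Request("https://api.github.com/meta")
    req.add_header("Accept", "application/vnd.github+json")
    token = os.environ.get("GITHUB_TOKEN")
    if token:
        req.add_header("Authorization", f"Bearer {token}")

    with urllib.request.urlopen(req) as resp:
        meta = json.load(resp)

    for cidr in meta.get("hooks", []):
        print(cidr)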

Streaming to multiple endpoints

Note

This feature is currently in public preview and subject to change.

You can stream audit logs to multiple endpoints. For example, you can stream your audit log to two endpoints of the same type, or you can stream to two different providers. To set up multiple streams, follow the instructions for each provider.

Setting up streaming to Amazon S3

You can set up streaming to S3 with access keys or, to avoid storing long-lived secrets in GitHub Enterprise Cloud, with OpenID Connect (OIDC).

Setting up streaming to S3 with access keys

To set up audit log streaming from GitHub you will need:

  • Your AWS access key ID
  • Your AWS secret key

For information on creating or accessing your access key ID and secret key, see Understanding and getting your AWS credentials in the AWS documentation.

From AWS:

  1. Create a bucket, and block public access to the bucket. See Creating, configuring, and working with Amazon S3 buckets in the AWS documentation.

  2. Create a policy that allows GitHub to write to the bucket. Copy the following JSON and replace EXAMPLE-BUCKET with the name of your bucket. GitHub requires only the permissions in this JSON.

    {
       "Version": "2012-10-17",
       "Statement": [
          {
             "Sid": "VisualEditor0",
             "Effect": "Allow",
             "Action": [
                "s3:PutObject"
             ],
             "Resource": "arn:aws:s3:::EXAMPLE-BUCKET/*"
         }
       ]
    }
    

    See Creating IAM policies in the AWS documentation.
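
If you prefer to script the AWS side, the boto3 sketch below is one way to perform the two steps above: it creates the bucket, blocks public access, and creates the write-only policy. The bucket name, policy name, and region are placeholders, and suitable AWS credentials are assumed.

    # Sketch: create the bucket, block public access, and create the write-only
    # policy from the steps above. Bucket name, policy name, and region are
    # placeholders.
    import json
    import boto3

    BUCKET = "example-bucket"   # your bucket name (EXAMPLE-BUCKET above)
    REGION = "us-east-1"

    s3 = boto3.client("s3", region_name=REGION)
    s3.create_bucket(Bucket=BUCKET)  # outside us-east-1, also pass CreateBucketConfiguration
    s3.put_public_access_block(
        Bucket=BUCKET,
        PublicAccessBlockConfiguration={
            "BlockPublicAcls": True,
            "IgnorePublicAcls": True,
            "BlockPublicPolicy": True,
            "RestrictPublicBuckets": True,
        },
    )

    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "VisualEditor0",
                "Effect": "Allow",
                "Action": ["s3:PutObject"],
                "Resource": f"arn:aws:s3:::{BUCKET}/*",
            }
        ],
    }

    iam = boto3.client("iam")
    iam.create_policy(
        PolicyName="github-audit-log-streaming",
        PolicyDocument=json.dumps(policy),
    )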

From GitHub:

  1. In the top-right corner of GitHub, click your profile photo.

  2. Depending on your environment, click Your enterprise, or click Your enterprises then click the enterprise you want to view.

  3. On the left side of the page, in the enterprise account sidebar, click Settings.

  4. Under "Settings", click Audit log.

  5. Under "Audit log", click Log streaming.

  6. Select the Configure stream dropdown menu and click Amazon S3.

  7. Under "Authentication", click Access keys.

  8. Configure the stream settings.

    • Under "Region", select the bucket's region. For example, us-east-1.
    • Under "Bucket", type the name of the bucket you want to stream to. For example, auditlog-streaming-test.
    • Under "Access Key ID", type your access key ID. For example, ABCAIOSFODNN7EXAMPLE1.
    • Under "Secret Key", type your secret key. For example, aBcJalrXUtnWXYZ/A1MDENG/zPxRfiCYEXAMPLEKEY.
  9. To verify that GitHub can connect and write to the Amazon S3 endpoint, click Check endpoint.

  10. After you have successfully verified the endpoint, click Save.
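
After saving the stream, you can spot-check delivery with a short boto3 sketch that lists objects under the current year/month prefix. This assumes the date-based filename layout described earlier and credentials that allow s3:ListBucket (the streaming policy itself only grants s3:PutObject).

    # Sketch: confirm streamed audit log files are arriving in the bucket.
    # Requires credentials with s3:ListBucket; the bucket name is a placeholder.
    from datetime import datetime, timezone
    import boto3

    BUCKET = "auditlog-streaming-test"
    prefix = datetime.now(timezone.utc).strftime("%Y/%m/")

    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=prefix, MaxKeys=20)
    for obj in resp.get("Contents", []):
        print(obj["Key"], obj["Size"])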

Setting up streaming to S3 with OpenID Connect

From AWS:

  1. Add the GitHub OIDC provider to IAM. See Creating OpenID Connect (OIDC) identity providers in the AWS documentation.

    • For the provider URL, use https://oidc-configuration.audit-log.githubusercontent.com.
    • For "Audience", use sts.amazonaws.com.
  2. Create a bucket, and block public access to the bucket. See Creating, configuring, and working with Amazon S3 buckets in the AWS documentation.

  3. Create a policy that allows GitHub to write to the bucket. Copy the following JSON and replace EXAMPLE-BUCKET with the name of your bucket. GitHub requires only the permissions in this JSON.

    {
       "Version": "2012-10-17",
       "Statement": [
          {
             "Sid": "VisualEditor0",
             "Effect": "Allow",
             "Action": [
                "s3:PutObject"
             ],
             "Resource": "arn:aws:s3:::EXAMPLE-BUCKET/*"
         }
       ]
    }
    

    See Creating IAM policies in the AWS documentation.

  4. Configure the role and trust policy for the GitHub IdP. See Creating a role for web identity or OpenID Connect Federation (console) in the AWS documentation.

    • Add the permissions policy you created earlier to allow writes to the bucket.

    • Edit the trust relationship to add the sub field to the validation conditions, replacing ENTERPRISE with the name of your enterprise.

      "Condition": {
         "StringEquals": {
            "oidc-configuration.audit-log.githubusercontent.com:aud": "sts.amazonaws.com",
            "oidc-configuration.audit-log.githubusercontent.com:sub": "https://github.com/ENTERPRISE"
          }
       }
      
    • Make note of the Amazon Resource Name (ARN) of the created role.
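
The same provider, role, and trust policy can be created with a script. The boto3 sketch below mirrors steps 1 and 4; the enterprise name, role name, attached policy ARN, and certificate thumbprint are placeholders you must replace with your own values.

    # Sketch: create the GitHub OIDC provider and a role that trusts it.
    # ENTERPRISE, ROLE_NAME, POLICY_ARN, and the thumbprint are placeholders.
    import json
    import boto3

    ENTERPRISE = "ENTERPRISE"   # your enterprise slug
    ROLE_NAME = "github-audit-log-streaming-role"
    POLICY_ARN = "arn:aws:iam::123456789012:policy/github-audit-log-streaming"
    OIDC_URL = "https://oidc-configuration.audit-log.githubusercontent.com"

    iam = boto3.client("iam")

    provider = iam.create_open_id_connect_provider(
        Url=OIDC_URL,
        ClientIDList=["sts.amazonaws.com"],
        ThumbprintList=["<thumbprint of the provider's TLS certificate>"],
    )

    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {"Federated": provider["OpenIDConnectProviderArn"]},
                "Action": "sts:AssumeRoleWithWebIdentity",
                "Condition": {
                    "StringEquals": {
                        "oidc-configuration.audit-log.githubusercontent.com:aud": "sts.amazonaws.com",
                        "oidc-configuration.audit-log.githubusercontent.com:sub": f"https://github.com/{ENTERPRISE}",
                    }
                },
            }
        ],
    }

    role = iam.create_role(
        RoleName=ROLE_NAME,
        AssumeRolePolicyDocument=json.dumps(trust_policy),
    )
    iam.attach_role_policy(RoleName=ROLE_NAME, PolicyArn=POLICY_ARN)
    print(role["Role"]["Arn"])   # the ARN you enter in GitHub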

From GitHub:

  1. In the top-right corner of GitHub, click your profile photo.

  2. Depending on your environment, click Your enterprise, or click Your enterprises then click the enterprise you want to view.

  3. On the left side of the page, in the enterprise account sidebar, click Settings.

  4. Under "Settings", click Audit log.

  5. Under "Audit log", click Log streaming.

  6. Select the Configure stream dropdown menu and click Amazon S3.

  7. Under "Authentication", click OpenID Connect.

  8. Configure the stream settings.

    • Under "Region", select the bucket's region. For example, us-east-1; an option for Auto Discovery is also available.
    • Under "Bucket", type the name of the bucket you want to stream to. For example, auditlog-streaming-test.
    • Under "ARN Role" type the ARN role you noted earlier. For example, arn:aws::iam::1234567890:role/github-audit-log-streaming-role.
  9. To verify that GitHub can connect and write to the Amazon S3 endpoint, click Check endpoint.

  10. After you have successfully verified the endpoint, click Save.

Disabling streaming to S3 with OpenID Connect

To disable streaming to S3 with OIDC, delete the GitHub OIDC provider you created in AWS when you set up streaming. See Creating OpenID Connect (OIDC) identity providers in the AWS documentation.

If you disable streaming due to a security vulnerability in OIDC, after you delete the provider, set up streaming with access keys until the vulnerability is resolved. See Setting up streaming to S3 with access keys.

Integrating with AWS CloudTrail Lake

You can consolidate your audit logs by integrating streaming to S3 with AWS CloudTrail Lake. See the AWS CloudTrail Documentation or the GitHub Audit Log to CloudTrail Open Audit in the aws-samples/aws-cloudtrail-lake-github-audit-log repository.

Setting up streaming to Azure Blob Storage

Note

Audit log streaming to blob storage in Azure Government is not supported.

Before setting up a stream in GitHub, first create a storage account and a container in Microsoft Azure. See Introduction to Azure Blob Storage in the Microsoft documentation.

To configure the stream, you need the Blob SAS URL for the container.

From the Microsoft Azure portal:

  1. On the Home page, click Storage Accounts.
  2. Under "Name", click the name of the storage account you want to use.
  3. Under "Data storage", click Containers.
  4. Click the name of the container you want to use.
  5. In the left sidebar, under "Settings", click Shared access tokens.
  6. Select the Permissions dropdown menu, then select Create and Write and deselect all other options.
  7. Set an expiry date that complies with your secret rotation policy.
  8. Click Generate SAS token and URL.
  9. Copy the value of the Blob SAS URL field that's displayed. You will use this URL in GitHub.
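
Before pasting the URL into GitHub, you can sanity-check it locally. The Python sketch below only inspects the standard SAS query parameters (sp for permissions, se for expiry); it is not an official validation step, and the URL shown is a placeholder.

    # Sketch: locally sanity-check a Blob SAS URL before configuring the stream.
    # Only inspects the sp (permissions) and se (expiry) query parameters.
    from datetime import datetime, timezone
    from urllib.parse import urlparse, parse_qs

    sas_url = "https://<account>.blob.core.windows.net/<container>?sp=cw&se=2026-01-01T00%3A00%3A00Z&sig=<signature>"

    params = parse_qs(urlparse(sas_url).query)
    perms = params.get("sp", [""])[0]
    expiry = params.get("se", [""])[0]

    assert "c" in perms and "w" in perms, "SAS token needs Create and Write permissions"
    expires_at = datetime.fromisoformat(expiry.replace("Z", "+00:00"))
    assert expires_at > datetime.now(timezone.utc), "SAS token has already expired"
    print(f"SAS URL grants '{perms}' until {expires_at}")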

From GitHub:

  1. In the top-right corner of GitHub, click your profile photo.
  2. Depending on your environment, click Your enterprise, or click Your enterprises then click the enterprise you want to view.
  3. On the left side of the page, in the enterprise account sidebar, click Settings.
  4. Under "Settings", click Audit log.
  5. Under "Audit log", click Log streaming.
  6. Select the Configure stream dropdown menu and click Azure Blob Storage.
  7. On the configuration page, enter the blob SAS URL that you copied in Azure. The Container field is auto-filled based on the URL.
  8. Click Check endpoint to verify that GitHub can connect and write to the Azure Blob Storage endpoint.
  9. After you have successfully verified the endpoint, click Save.

Setting up streaming to Azure Event Hubs

Note

Event Hubs instances in Azure Government are not supported.

Before setting up a stream in GitHub, you need:

  • The name of the Azure Event Hubs instance you want to stream to.
  • A connection string for a shared access policy on that event hub.

From the Microsoft Azure portal:

  1. At the top of the page, use the search box to search for "Event Hubs".
  2. Select Event Hubs. The names of your event hubs are listed.
  3. Make a note of the name of the event hub to which you want to stream. Click the event hub.
  4. In the left menu, click Shared Access Policies.
  5. Select a shared access policy from the list of policies, or create a new policy.
  6. Copy the connection string from the Connection string-primary key field.

From GitHub:

  1. In the top-right corner of GitHub, click your profile photo.
  2. Depending on your environment, click Your enterprise, or click Your enterprises then click the enterprise you want to view.
  3. On the left side of the page, in the enterprise account sidebar, click Settings.
  4. Under "Settings", click Audit log.
  5. Under "Audit log", click Log streaming.
  6. Select the Configure stream dropdown and click Azure Event Hubs.
  7. On the configuration page, enter:
    • The name of the Azure Event Hubs instance.
    • The connection string.
  8. Click Check endpoint to verify that GitHub can connect and write to the Azure Event Hubs endpoint.
  9. After you have successfully verified the endpoint, click Save.
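
To confirm events are flowing into the event hub, one option is a small consumer built with the azure-eventhub Python package. The connection string, event hub name, and consumer group below are placeholders you would replace with your own values.

    # Sketch: read streamed audit log events from the event hub to confirm delivery.
    # Requires the azure-eventhub package; connection string and hub name are placeholders.
    from azure.eventhub import EventHubConsumerClient

    CONNECTION_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<policy>;SharedAccessKey=<key>"
    EVENTHUB_NAME = "<your event hub name>"

    def on_event(partition_context, event):
        print(partition_context.partition_id, event.body_as_str())

    client = EventHubConsumerClient.from_connection_string(
        CONNECTION_STR,
        consumer_group="$Default",
        eventhub_name=EVENTHUB_NAME,
    )
    with client:
        # Blocks and prints new events as they arrive; stop with Ctrl+C.
        client.receive(on_event=on_event, starting_position="@latest")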

Setting up streaming to Datadog

To set up streaming to Datadog, create a client token or an API key in Datadog, then configure audit log streaming in GitHub Enterprise Cloud using the token for authentication. You do not need to create a bucket or other storage container in Datadog.

After you set up streaming to Datadog, you can see your audit log data by filtering by "github.audit.streaming". See Log Management.

  1. If you don't already have a Datadog account, create one.

  2. In Datadog, generate a client token or an API key and then click Copy key. See API and Application Keys in Datadog Docs.

  3. In the top-right corner of GitHub, click your profile photo.

  4. Depending on your environment, click Your enterprise, or click Your enterprises then click the enterprise you want to view.

  5. On the left side of the page, in the enterprise account sidebar, click Settings.

  6. Under "Settings", click Audit log.

  7. Under "Audit log", click Log streaming.

  8. Select the Configure stream dropdown and click Datadog.

  9. In the Token field, paste the token you copied earlier.

  10. Select the Site dropdown and click your Datadog site. To determine your site, compare your Datadog URL to the table in Datadog sites in Datadog Docs.

  11. To verify that GitHub can connect and write to the Datadog endpoint, click Check endpoint.

  12. After you have successfully verified the endpoint, click Save.

  13. After a few minutes, confirm that audit log data appears on the Logs tab in Datadog. If it doesn't appear, confirm that your token and site are correct in GitHub.
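
The same confirmation can be scripted against the Datadog Logs Search API. The sketch below assumes the requests package, DD_API_KEY and DD_APP_KEY environment variables, the datadoghq.com site (adjust the host for other sites), and that streamed events match the github.audit.streaming source filter mentioned above.

    # Sketch: query Datadog's Logs Search API for recently streamed audit events.
    # Assumes DD_API_KEY / DD_APP_KEY environment variables and the datadoghq.com site.
    import os
    import requests

    resp = requests.post(
        "https://api.datadoghq.com/api/v2/logs/events/search",
        headers={
            "DD-API-KEY": os.environ["DD_API_KEY"],
            "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
        },
        json={
            "filter": {"query": "source:github.audit.streaming", "from": "now-15m", "to": "now"},
            "page": {"limit": 5},
        },
        timeout=30,
    )
    resp.raise_for_status()
    for log in resp.json().get("data", []):
        print(log["attributes"].get("message"))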

Setting up streaming to Google Cloud Storage

To set up streaming to Google Cloud Storage, create a service account in Google Cloud with the appropriate credentials and permissions, then configure audit log streaming in GitHub Enterprise Cloud using the service account's credentials for authentication.

  1. Create a service account for Google Cloud. You do not need to set access controls or IAM roles for this account. See Creating and managing service accounts in the Google Cloud documentation.

  2. Create a JSON key for the service account, and store the key securely. See Creating and managing service account keys in the Google Cloud documentation.

  3. If you haven't yet, create a bucket. See Creating storage buckets in the Google Cloud documentation.

  4. Give the service account the Storage Object Creator role for the bucket. See Using Cloud IAM permissions in the Google Cloud documentation.

  5. In the top-right corner of GitHub, click your profile photo.

  6. Depending on your environment, click Your enterprise, or click Your enterprises then click the enterprise you want to view.

  7. On the left side of the page, in the enterprise account sidebar, click Settings.

  8. Under "Settings", click Audit log.

  9. Under "Audit log", click Log streaming.

  10. Select the Configure stream dropdown and click Google Cloud Storage.

  11. Under "Bucket", type the name of your Google Cloud Storage bucket.

  12. Under "JSON Credentials", paste the entire contents of your service account's JSON key file.

  13. To verify that GitHub can connect and write to the Google Cloud Storage bucket, click Check endpoint.

  14. After you have successfully verified the endpoint, click Save.
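
Step 4 above, granting the Storage Object Creator role on the bucket, can also be done with the google-cloud-storage client. In this sketch the bucket name and service account email are placeholders, and it runs with your own (admin) credentials rather than the streaming service account's key.

    # Sketch: grant the streaming service account the Storage Object Creator role
    # on the bucket (step 4 above). Bucket name and service account are placeholders.
    from google.cloud import storage

    BUCKET = "auditlog-streaming-test"
    SERVICE_ACCOUNT = "github-audit-stream@my-project.iam.gserviceaccount.com"

    client = storage.Client()   # uses your own credentials, not the streaming key
    bucket = client.bucket(BUCKET)

    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append(
        {
            "role": "roles/storage.objectCreator",
            "members": {f"serviceAccount:{SERVICE_ACCOUNT}"},
        }
    )
    bucket.set_iam_policy(policy)
    print("Granted roles/storage.objectCreator on", BUCKET)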

Setting up streaming to Splunk

To stream audit logs to Splunk's HTTP Event Collector (HEC) endpoint, make sure that the endpoint is configured to accept HTTPS connections. See Set up and use HTTP Event Collector in Splunk Web in the Splunk documentation.

Note

GitHub validates the HEC endpoint via <Domain>:port/services/collector. If self-hosting the endpoint (such as with Splunk HEC Receiver via OpenTelemetry), make sure it's reachable at this destination.
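
If you want to check the collector yourself before configuring the stream, a hand-rolled test event against the standard HEC endpoint looks roughly like the sketch below; the domain, port, and token are placeholders, and the requests package is assumed.

    # Sketch: send a test event to the Splunk HTTP Event Collector endpoint that
    # GitHub will use. Domain, port, and token are placeholders.
    import requests

    DOMAIN = "http-inputs-mycompany.splunkcloud.com"
    PORT = 443
    HEC_TOKEN = "<your HEC token>"

    resp = requests.post(
        f"https://{DOMAIN}:{PORT}/services/collector",
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        json={"event": "GitHub audit log streaming connectivity test"},
        timeout=30,
        verify=True,   # mirrors the "Enable SSL verification" option in GitHub
    )
    resp.raise_for_status()
    print(resp.json())   # {"text": "Success", "code": 0} means the collector accepted it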

  1. In the top-right corner of GitHub, click your profile photo.

  2. Depending on your environment, click Your enterprise, or click Your enterprises then click the enterprise you want to view.

  3. On the left side of the page, in the enterprise account sidebar, click Settings.

  4. Under "Settings", click Audit log.

  5. Under "Audit log", click Log streaming.

  6. Select the Configure stream dropdown and click Splunk.

  7. On the configuration page, enter:

    • The domain where the application you want to stream to is hosted.

      If you're using Splunk Cloud, Domain should be http-inputs-<host>, where host is the domain you use in Splunk Cloud. For example, http-inputs-mycompany.splunkcloud.com.

      If you're using the free trial version of Splunk Cloud, Domain should be inputs.<host>, where host is the domain you use in Splunk Cloud. For example, inputs.mycompany.splunkcloud.com.

    • The port on which the application accepts data.

      If you're using Splunk Cloud, Port should be 443.

      If you're using the free trial version of Splunk Cloud, Port should be 8088.

    • A token that GitHub can use to authenticate to the third-party application.

  8. Leave the Enable SSL verification check box selected.

    Audit logs are always streamed as encrypted data. However, with this option selected, GitHub verifies the SSL certificate of your Splunk instance when delivering events. SSL verification helps ensure that events are delivered to your URL endpoint securely. Verification is optional, but we recommend that you leave it enabled.

  9. Click Check endpoint to verify that GitHub can connect and write to the Splunk endpoint.

  10. After you have successfully verified the endpoint, click Save.

Pausing audit log streaming

Pause the stream to perform maintenance on the receiving application without losing audit data. Audit logs are stored for up to seven days on GitHub and are then exported when you unpause the stream.

Datadog only accepts logs from up to 18 hours in the past. If you pause a stream to a Datadog endpoint for more than 18 hours, you risk losing logs that Datadog won't accept after you resume streaming.

  1. In the top-right corner of GitHub, click your profile photo.
  2. Depending on your environment, click Your enterprise, or click Your enterprises then click the enterprise you want to view.
  3. On the left side of the page, in the enterprise account sidebar, click Settings.
  4. Under "Settings", click Audit log.
  5. Under "Audit log", click Log streaming.
  6. To the right of your configured stream, click Pause stream.
  7. A confirmation message displays. Click Pause stream to confirm.

To restart streaming, click Resume stream.

Deleting the audit log stream

  1. In the top-right corner of GitHub, click your profile photo.
  2. Depending on your environment, click Your enterprise, or click Your enterprises then click the enterprise you want to view.
  3. On the left side of the page, in the enterprise account sidebar, click Settings.
  4. Under "Settings", click Audit log.
  5. Under "Audit log", click Log streaming.
  6. Under "Danger zone", click Delete stream.
  7. A confirmation message displays. Click Delete stream to confirm.

Enabling audit log streaming of API requests

Note

This feature is currently in public preview and subject to change.

  1. In the top-right corner of GitHub, click your profile photo.
  2. Depending on your environment, click Your enterprise, or click Your enterprises then click the enterprise you want to view.
  3. On the left side of the page, in the enterprise account sidebar, click Settings.
  4. Under "Settings", click Audit log.
  5. Under "Audit log", click Settings.
  6. Under "API Requests", select Enable API Request Events.
  7. Click Save.