Sending GCP Logs

Last updated on 25 August, 2024

The following describes how to send logs from Google Cloud Platform (GCP) to LM Logs for analysis.

Requirements

  • LogicMonitor API tokens (an access ID and access key) to authenticate all requests to the log ingestion API.

Supported GCP Logs and Resources

LM Logs supports the following resources and log types:

  • GCP audit logs
  • GCP Cloud Composer logs
  • GCP Cloud Function logs
  • GCP Cloud Run logs
  • GCP CloudSQL logs
  • Virtual Machine (VM) instance logs

Installation Instructions

1. On your Google Cloud account, select Activate Cloud Shell. This opens the Cloud Shell Terminal below the workspace.

2. In the Terminal, run the following command to set the active project, replacing [PROJECT_ID] with your project ID.

gcloud config set project [PROJECT_ID]

3. Run the following command to install the integration:

source <(curl -s https://raw.githubusercontent.com/logicmonitor/lm-logs-gcp/master/script/gcp.sh) && deploy_lm-logs

Installing the integration creates these resources:

  • A Pub/Sub topic named export-logs-to-logicmonitor and a pull subscription.
  • A Virtual Machine (VM) named lm-logs-forwarder.

Note: You will be prompted to confirm the region where the VM is deployed. This should already be configured within your project.
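If you want to confirm that the script created the expected resources, you can list them from Cloud Shell. The following is a minimal sketch using standard gcloud commands; it assumes the default resource names created by the installation script above.

# Confirm the Pub/Sub topic and its pull subscription exist
gcloud pubsub topics describe export-logs-to-logicmonitor
gcloud pubsub subscriptions list

# Confirm the forwarder VM was created
gcloud compute instances list --filter="name=lm-logs-forwarder"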

Configuring the Log Forwarder

1. After the installation script completes, navigate to Compute Engine > VM instances and select lm-logs-forwarder.

2. Under Remote access, select SSH.

3. SSH into the VM (lm-logs-forwarder) and run the following commands, filling in your values:

export GCP_PROJECT_ID="GCP_PROJECT_ID"
export LM_COMPANY_NAME="LM_COMPANY_NAME"
export LM_ACCESS_ID="LM_ACCESS_ID"
export LM_ACCESS_KEY="LM_ACCESS_KEY" 

source <(curl -s https://raw.githubusercontent.com/logicmonitor/lm-logs-gcp/master/script/vm.sh)
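To verify that the forwarder is running, you can check the log-forwarding service on the VM after the script finishes. The following is a minimal sketch that assumes the forwarder runs fluentd as the td-agent service and writes to the default td-agent log path; both may differ depending on the script version.

sudo systemctl status td-agent                  # confirm the fluentd (td-agent) service is active
sudo tail -n 50 /var/log/td-agent/td-agent.log  # look for recent forwarding errors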

Exporting Logs from Logging to Pub/Sub

You need to create a sink from Cloud Logging to the Pub/Sub topic export-logs-to-logicmonitor (created during installation).

1. On the Logging page, filter the logs that you want to export.

Recommendation: Use the filters to remove logs that contain sensitive information so that they are not sent to LogicMonitor.

2. Select Actions > Create sink and under Sink details, provide a name.

3. Under Sink destination, choose Cloud Pub/Sub as the destination and select export-logs-to-logicmonitor. The Pub/Sub topic can be located in a different project.

4. Select Create sink. If there are no issues, you should see the logs stream into the LM Logs page.
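Alternatively, the sink can be created from Cloud Shell instead of the console. The following is a minimal sketch using standard gcloud commands; the sink name lm-logs-sink and the log filter are placeholders, and the writer-identity grant is only needed if the sink's service account is not given publish access automatically.

# Create a sink that routes matching logs to the Pub/Sub topic
gcloud logging sinks create lm-logs-sink \
  pubsub.googleapis.com/projects/[PROJECT_ID]/topics/export-logs-to-logicmonitor \
  --log-filter='resource.type="gce_instance"'

# Grant the sink's writer identity permission to publish to the topic
WRITER_IDENTITY=$(gcloud logging sinks describe lm-logs-sink --format='value(writerIdentity)')
gcloud pubsub topics add-iam-policy-binding export-logs-to-logicmonitor \
  --member="$WRITER_IDENTITY" --role="roles/pubsub.publisher"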

Removing the Integration

Run the following command to delete the integration and all its resources:

source <(curl -s https://raw.githubusercontent.com/logicmonitor/lm-logs-gcp/master/script/gcp.sh) && delete_lm-logs

Default Metadata for Logs

The following metadata is added by default to logs along with the raw message string.

  • severity: Severity level for the event. Must be one of “Informational”, “Warning”, “Error”, or “Critical”.
  • logName: The resource name of the log to which this event belongs.
  • category: Log category for the event. Typical log categories are “Audit”, “Operational”, “Execution”, and “Request”.
  • _type: Service, application, or device/virtual machine responsible for creating the event.
  • labels: Labels for the event.
  • resource.labels: Labels associated with the resource to which the event belongs.
  • httpRequest: The HTTP request associated with the log entry, if any.

Additional Metadata

If you need additional metadata, use the following configuration for the fluent-plugin-lm-logs-gcp plugin.

Add the following block to the fluentd/td-agent configuration file.

<filter pubsub.publish>
    @type gcplm
    metadata_keys severity, logName, labels, resource.type, resource.labels, httpRequest, trace, spanId, custom_key1, custom_key2
    use_default_severity true
</filter>

To add static metadata, use the record_transformer filter by adding the following block to fluentd.conf.

<filter pubsub.publish>
  @type record_transformer
  <record>
    some_key some_value
    tag ${tag} # can add dynamic data as well
  </record>
</filter>

For more information about the plugin configuration, see lm-logs-fluentd-gcp-filter on GitHub.
