---
title: error-log-logger
keywords:
  - Apache APISIX
  - API Gateway
  - Plugin
  - Error log logger
description: The error-log-logger Plugin pushes APISIX's error logs to TCP, Apache SkyWalking, Apache Kafka, or ClickHouse servers in batches. You can specify the severity level of logs that the Plugin sends.
---

## Description

The error-log-logger Plugin pushes APISIX's error logs (error.log) to TCP, Apache SkyWalking, Apache Kafka, or ClickHouse servers, in batches. You can specify the severity level of logs that the Plugin sends.

The Plugin is disabled by default. Once enabled, it automatically starts pushing error logs to the configured remote servers. Configure the remote server details in the Plugin metadata only, not on other resources such as Routes.

## Plugin Metadata

There are no attributes to configure this Plugin on Routes or Services. All configuration is done through Plugin metadata.

| Name | Type | Required | Default | Valid values | Description |
|------|------|----------|---------|--------------|-------------|
| tcp | object | False | | | TCP server configurations. |
| tcp.host | string | True | | | IP address or the hostname of the TCP server. |
| tcp.port | integer | True | | [0,...] | Target upstream port. |
| tcp.tls | boolean | False | false | | When set to true, performs SSL verification. |
| tcp.tls_server_name | string | False | | | Server name for the new TLS extension SNI. |
| skywalking | object | False | | | SkyWalking server configurations. |
| skywalking.endpoint_addr | string | False | http://127.0.0.1:12900/v3/logs | | Address of the SkyWalking server. |
| skywalking.service_name | string | False | APISIX | | Service name for the SkyWalking reporter. |
| skywalking.service_instance_name | string | False | APISIX Service Instance | | Service instance name for the SkyWalking reporter. Set it to `$hostname` to directly get the local hostname. |
| clickhouse | object | False | | | ClickHouse server configurations. |
| clickhouse.endpoint_addr | string | True | http://127.0.0.1:8123 | | ClickHouse endpoint. Required if clickhouse is configured. |
| clickhouse.user | string | True | default | | ClickHouse username. Required if clickhouse is configured. |
| clickhouse.password | string | True | | | ClickHouse password. Required if clickhouse is configured. The password is encrypted with AES before being stored in etcd. See encrypted storage fields. |
| clickhouse.database | string | True | | | Name of the database to store the logs. Required if clickhouse is configured. |
| clickhouse.logtable | string | True | | | Table name to store the logs. Required if clickhouse is configured. The table should have a `data` column where the Plugin will push logs to. |
| kafka | object | False | | | Kafka server configurations. |
| kafka.brokers | array | True | | | List of Kafka broker nodes. |
| kafka.brokers[].host | string | True | | | The host of the Kafka broker. |
| kafka.brokers[].port | integer | True | | [0, 65535] | The port of the Kafka broker. |
| kafka.brokers[].sasl_config | object | False | | | The SASL configuration of the Kafka broker. |
| kafka.brokers[].sasl_config.mechanism | string | False | PLAIN | ["PLAIN"] | The mechanism of SASL configuration. |
| kafka.brokers[].sasl_config.user | string | True | | | The user of SASL configuration. Required if sasl_config is present. |
| kafka.brokers[].sasl_config.password | string | True | | | The password of SASL configuration. Required if sasl_config is present. |
| kafka.kafka_topic | string | True | | | Target topic to push the logs to for organization. |
| kafka.producer_type | string | False | async | ["async", "sync"] | Message sending mode of the producer. |
| kafka.required_acks | integer | False | 1 | [-1, 0, 1] | Number of acknowledgements the leader needs to receive for the producer to consider the request complete. See Apache Kafka documentation for more. |
| kafka.key | string | False | | | Key used for allocating partitions for messages. |
| kafka.cluster_name | integer | False | 1 | [1,...] | Name of the cluster. Used when there are two or more Kafka clusters. Only works if producer_type is set to async. |
| kafka.meta_refresh_interval | integer | False | 30 | [1,...] | Time interval in seconds to auto-refresh the metadata. Same as the refresh_interval parameter in lua-resty-kafka. |
| timeout | integer | False | 3 | [1,...] | Timeout in seconds for establishing the connection and sending data. |
| keepalive | integer | False | 30 | [1,...] | Time in seconds to keep the connection alive after sending data. |
| level | string | False | WARN | ["STDERR", "EMERG", "ALERT", "CRIT", "ERR", "ERROR", "WARN", "NOTICE", "INFO", "DEBUG"] | Severity level to filter the error logs. Note that ERR is the same as ERROR. |
| name | string | False | error-log-logger | | Unique identifier of the Plugin for the batch processor. |
| batch_max_size | integer | False | 1000 | [1,...] | Maximum number of log entries per batch. Once reached, the batch is sent to the configured logging service. Set to 1 for immediate processing. |
| inactive_timeout | integer | False | 3 | [1,...] | Maximum time in seconds to wait for new logs before sending the batch. The value should be smaller than buffer_duration. |
| buffer_duration | integer | False | 60 | [1,...] | Maximum time in seconds from the earliest entry allowed before sending the batch. |
| retry_delay | integer | False | 1 | [0,...] | Time interval in seconds to retry sending the batch if the previous attempt failed. |
| max_retry_count | integer | False | 0 | [0,...] | Maximum number of unsuccessful retries before dropping the log entries. |

This Plugin supports using batch processors to aggregate and process entries (logs/data) in batches, which avoids frequent data submissions. By default, the batch processor submits data every 3 seconds or when the queue reaches 1000 entries. See Batch Processor for more information on customizing the configuration.
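As a sketch, the batch processor attributes can be tuned in the same Plugin metadata object as the destination; the server address and values below are illustrative only:

```shell
curl "http://127.0.0.1:9180/apisix/admin/plugin_metadata/error-log-logger" -X PUT \
  -H "X-API-KEY: ${admin_key}" \
  -d '{
    "tcp": {
      "host": "192.168.2.103",
      "port": 19000
    },
    "batch_max_size": 100,
    "inactive_timeout": 5,
    "max_retry_count": 2
  }'
```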

## Example of default log format

["2024/01/06 16:04:30 [warn] 11786#9692271: *1 [lua] plugin.lua:205: load(): new plugins: {"error-log-logger":true}, context: init_worker_by_lua*","\n","2024/01/06 16:04:30 [warn] 11786#9692271: *1 [lua] plugin.lua:255: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"syslog":true,"mqtt-proxy":true}, context: init_worker_by_lua*","\n"]

## Enable Plugin

The error-log-logger Plugin is disabled by default. To enable the Plugin, add it to your configuration file (conf/config.yaml):

```yaml
plugins:
  - ...
  - error-log-logger
```

Reload APISIX for the change to take effect.
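One way to reload, assuming APISIX was installed with its CLI available on the PATH (otherwise restart APISIX through your deployment's usual mechanism):

```shell
apisix reload
```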

Once the Plugin is enabled, configure it through Plugin metadata as shown in the examples below.

## Examples

The following examples demonstrate how you can configure the error-log-logger Plugin for different scenarios.

:::note

You can fetch the admin_key from config.yaml and save to an environment variable with the following command:

```shell
admin_key=$(yq '.deployment.admin.admin_key[0].key' conf/config.yaml | sed 's/"//g')
```

:::

### Send Logs to TCP Server

The following example demonstrates how to configure the error-log-logger Plugin to send error logs to a TCP server.

Start a TCP server listening on port 19000:

```shell
nc -l 19000
```

Configure the Plugin metadata, setting the TCP server host and port, and the severity level to INFO so most logs will be sent for easier verification:

curl "http://127.0.0.1:9180/apisix/admin/plugin_metadata/error-log-logger" -X PUT \
  -H "X-API-KEY: ${admin_key}" \
  -d '{
    "tcp": {
      "host": "192.168.2.103",
      "port": 19000
    },
    "level": "INFO"
  }'

To verify, you can manually generate a log at warn level by reloading APISIX. In the terminal session where netcat is listening, you should see a log entry similar to the following:

```text
2025/01/26 20:15:29 [warn] 211#211: *35552 [lua] plugin.lua:205: load(): new plugins: {...}, context: init_worker_by_lua*
```

### Send Logs to SkyWalking

The following example demonstrates how to configure the error-log-logger Plugin to send error logs to SkyWalking.

Start SkyWalking storage, an OAP server, and the Booster UI with Docker Compose, following SkyWalking's documentation. Once set up, the OAP server should be listening on port 12800, and you should be able to access the UI at http://localhost:8080.

Configure the Plugin metadata, setting the SkyWalking endpoint address and the severity level to INFO so most logs will be sent for easier verification:

curl "http://127.0.0.1:9180/apisix/admin/plugin_metadata/error-log-logger" -X PUT \
  -H "X-API-KEY: ${admin_key}" \
  -d '{
    "skywalking": {
      "endpoint_addr": "http://192.168.2.103:12800/v3/logs"
    },
    "level": "INFO"
  }'

To verify, you can manually generate a log at warn level by reloading APISIX. In the SkyWalking UI, navigate to General Service > Services. You should see a service called APISIX with log entries.

### Send Logs to ClickHouse

The following example demonstrates how to configure the error-log-logger Plugin to send error logs to ClickHouse.

Start a sample ClickHouse server with user default and empty password:

```shell
docker run -d -p 8123:8123 -p 9000:9000 -p 9009:9009 --name clickhouse-server clickhouse/clickhouse-server
```

In the ClickHouse database default, create a table named default_logs with a `data` column, which the Plugin expects when pushing logs:

curl "http://127.0.0.1:8123" -X POST -d '
  CREATE TABLE default.default_logs (
    data String,
    PRIMARY KEY(`data`)
  )
  ENGINE = MergeTree()
  ORDER BY (`data`)
' --user default:

Configure the Plugin metadata with the ClickHouse server details. Set the severity level to INFO so most logs will be sent for easier verification:

curl "http://127.0.0.1:9180/apisix/admin/plugin_metadata/error-log-logger" -X PUT \
  -H "X-API-KEY: ${admin_key}" \
  -d '{
    "clickhouse": {
      "endpoint_addr": "http://192.168.2.103:8123",
      "user": "default",
      "password": "",
      "database": "default",
      "logtable": "default_logs"
    },
    "level": "INFO"
  }'

To verify, you can manually generate a log at warn level by reloading APISIX. Then send a request to ClickHouse to see the log entries:

```shell
echo 'SELECT * FROM default.default_logs FORMAT Pretty' | curl "http://127.0.0.1:8123/?" -d @-
```

### Send Logs to Kafka

The following example demonstrates how to configure the error-log-logger Plugin to send error logs to a Kafka server.

Configure the Plugin metadata with the Kafka broker details:

curl "http://127.0.0.1:9180/apisix/admin/plugin_metadata/error-log-logger" -X PUT \
  -H "X-API-KEY: ${admin_key}" \
  -d '{
    "kafka": {
      "brokers": [
        {
          "host": "127.0.0.1",
          "port": 9092
        }
      ],
      "kafka_topic": "apisix-error-logs"
    },
    "level": "ERROR",
    "inactive_timeout": 1
  }'
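To verify, you can generate a log at error level and consume the topic with Kafka's console consumer. The sketch below assumes Kafka runs in a Docker container named kafka with the Kafka scripts on its PATH; adjust the container name and script path to your deployment:

```shell
docker exec -it kafka \
  kafka-console-consumer.sh --bootstrap-server 127.0.0.1:9092 \
  --topic apisix-error-logs --from-beginning
```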