The td-agent user can run columnify -h, yet td-agent throws a config error that columnify is not in PATH
Describe the bug
The td-agent user can run columnify -h from a login shell, yet td-agent still raises a config error saying the 'columnify' utility must be in PATH; a sketch for comparing the two PATHs follows the usage output below.
su - td-agent -c "columnify -h"
Usage of columnify: columnify [-flags] [input files]
-output string
path to output file; default: stdout
-parquetCompressionCodec string
parquet compression codec, default: SNAPPY (default "SNAPPY")
-parquetPageSize int
parquet file page size, default: 8kB (default 8192)
-parquetRowGroupSize int
parquet file row group size, default: 128MB (default 134217728)
-recordType string
record data format type, [avro|csv|jsonl|ltsv|msgpack|tsv] (default "jsonl")
-schemaFile string
path to schema file
-schemaType string
schema type, [avro|bigquery]
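The usage output above was produced through a login shell (su - td-agent), which loads the td-agent user's profile. The systemd-managed td-agent process checks for columnify against its own, usually much smaller, PATH. A minimal sketch for comparing the two; the pgrep filter is an assumption about how the fluentd processes are named on this host:

# PATH as seen by a login shell for the td-agent user
# (this is what "su - td-agent -c ..." above exercises):
su - td-agent -c 'echo $PATH; command -v columnify'

# PATH as seen by the running td-agent/fluentd supervisor process,
# which is the environment the plugin's columnify check actually uses:
PID=$(pgrep -o -f fluentd)
tr '\0' '\n' < /proc/"$PID"/environ | grep '^PATH='

# PATH that systemd would hand to the service if it sets one explicitly
# (empty output means systemd's compiled-in default PATH is used):
systemctl show td-agent --property=Environment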
To Reproduce
Install td-agent, then configure a syslog input and an s3 output with store_as parquet.
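For context, a rough sketch of that setup on CentOS 7; the install-script URL and the columnify Go module path are assumptions drawn from the upstream docs, not details taken from this report:

# Install td-agent 4 (Treasure Data install script for RHEL/CentOS 7).
curl -fsSL https://toolbelt.treasuredata.com/sh/install-redhat-td-agent4.sh | sh

# Build columnify with Go; note where the binary lands (typically
# $HOME/go/bin), since that directory is usually NOT on the PATH of the
# systemd-managed td-agent service.
go install github.com/reproio/columnify/cmd/columnify@latest

# Add the syslog <source> and the s3 <match> shown under "Your
# Configuration" below, then restart the service.
systemctl restart td-agent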
Expected behavior
Syslog input should be written to MinIO (S3) as Parquet files.
Your Environment
- Fluentd version:
- TD Agent version: td-agent 4.4.2 fluentd 1.15.3 (e89092ce1132a933c12bb23fe8c9323c07ca81f5)
- fluent-plugin-s3 version: 1.7.2
- aws-sdk-s3 version:
- aws-sdk-sqs version:
- Operating system: CentOS Linux release 7.9.2009 (Core)
- Kernel version: 3.10.0-1160.53.1.el7
Your Configuration
<match **>
  @type s3
  aws_key_id xyz
  aws_sec_key abcd
  s3_bucket bucketname
  s3_endpoint https://xyz.com:9000/
  s3_region us-east-1
  force_path_style true
  path %Y-%m-%d/
  store_as parquet
  parquet_compression_codec snappy
  record_type msgpack
  schema_type avro
  schema_file /etc/td-agent/log.avsc
  <buffer tag,time>
    @type file
    path /var/log/td-agent/buffer/s3
    timekey 120 # 5 min partition
    timekey_wait 2m
    timekey_use_utc true # use utc
    chunk_limit_size 256m
  </buffer>
  <format>
    @type msgpack
  </format>
</match>
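With store_as parquet, fluent-plugin-s3 shells out to columnify for each flushed chunk, with flags that mirror the settings above. A hand-run approximation for sanity-checking the Avro schema; chunk.msgpack and the output path are placeholders, and the real invocation is built inside the plugin:

# Manually convert a msgpack chunk to Parquet using the same schema and
# codec the plugin is configured with:
columnify \
  -parquetCompressionCodec SNAPPY \
  -recordType msgpack \
  -schemaType avro \
  -schemaFile /etc/td-agent/log.avsc \
  -output /tmp/chunk.parquet \
  chunk.msgpack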
Your Error Log
2023-05-23 09:26:17 +0000 [info]: init supervisor logger path=nil rotate_age=nil rotate_size=nil
2023-05-23 09:26:17 +0000 [info]: parsing config file is succeeded path="/etc/td-agent/td-agent.conf"
2023-05-23 09:26:17 +0000 [info]: gem 'fluent-plugin-calyptia-monitoring' version '0.1.3'
2023-05-23 09:26:17 +0000 [info]: gem 'fluent-plugin-elasticsearch' version '5.2.4'
2023-05-23 09:26:17 +0000 [info]: gem 'fluent-plugin-flowcounter-simple' version '0.1.0'
2023-05-23 09:26:17 +0000 [info]: gem 'fluent-plugin-kafka' version '0.18.1'
2023-05-23 09:26:17 +0000 [info]: gem 'fluent-plugin-metrics-cmetrics' version '0.1.2'
2023-05-23 09:26:17 +0000 [info]: gem 'fluent-plugin-opensearch' version '1.0.8'
2023-05-23 09:26:17 +0000 [info]: gem 'fluent-plugin-prometheus' version '2.0.3'
2023-05-23 09:26:17 +0000 [info]: gem 'fluent-plugin-prometheus_pushgateway' version '0.1.0'
2023-05-23 09:26:17 +0000 [info]: gem 'fluent-plugin-record-modifier' version '2.1.1'
2023-05-23 09:26:17 +0000 [info]: gem 'fluent-plugin-rewrite-tag-filter' version '2.4.0'
2023-05-23 09:26:17 +0000 [info]: gem 'fluent-plugin-s3' version '1.7.2'
2023-05-23 09:26:17 +0000 [info]: gem 'fluent-plugin-sd-dns' version '0.1.0'
2023-05-23 09:26:17 +0000 [info]: gem 'fluent-plugin-systemd' version '1.0.5'
2023-05-23 09:26:17 +0000 [info]: gem 'fluent-plugin-td' version '1.2.0'
2023-05-23 09:26:17 +0000 [info]: gem 'fluent-plugin-utmpx' version '0.5.0'
2023-05-23 09:26:17 +0000 [info]: gem 'fluent-plugin-webhdfs' version '1.5.0'
2023-05-23 09:26:17 +0000 [info]: gem 'fluentd' version '1.15.3'
2023-05-23 09:26:17 +0000 [warn]: 'force_path_style' parameter is deprecated: S3 will drop path style API in 2020: See https://aws.amazon.com/blogs/aws/amazon-s3-path-deprecation-plan-the-rest-of-the-story/
2023-05-23 09:26:17 +0000 [error]: config error file="/etc/td-agent/td-agent.conf" error_class=Fluent::ConfigError error="'columnify' utility must be in PATH for -h compression"
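This mismatch is typical when columnify lives in a directory that only the login profile adds to PATH. A hedged workaround sketch, assuming the binary currently sits in /home/td-agent/go/bin (adjust to its real location):

# Option 1: copy the binary into a directory that is normally already on
# the service's PATH.
install -m 0755 /home/td-agent/go/bin/columnify /usr/local/bin/columnify

# Option 2: extend the service's PATH with a systemd drop-in.
mkdir -p /etc/systemd/system/td-agent.service.d
cat > /etc/systemd/system/td-agent.service.d/path.conf <<'EOF'
[Service]
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/home/td-agent/go/bin"
EOF
systemctl daemon-reload
systemctl restart td-agent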
Additional context
No response