Real-Time Log Processing with Streamdal and Go

Overview

Leverage the power of Streamdal to supercharge your existing logging solution.

Features

Real-Time Log Processing: Fast processing of incoming logs with minimal latency.

PII Redaction: Automated redaction of sensitive information from logs.

Centralized Rule Management: Streamdal provides a UI for central rule management. These rules are then pulled down by the log-processing agents across your organization.

Scalable Architecture: Designed to handle high volumes of log data efficiently. Pipeline functions are blazing fast, built as WASM, and pushed down to local log-processors to distribute load to the edge and allow for real-time processing.

Go Processing Agent: Custom-built agent for processing logs.
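To illustrate the kind of transform a pipeline step performs, here is a minimal Go sketch that masks IPv4 addresses in a log line with a regex. This is an illustration only: the actual redaction logic runs as WASM pipeline steps managed by the Streamdal server, not this code.

```go
package main

import (
	"fmt"
	"regexp"
)

// ipv4Pattern is a deliberately simple illustration; Streamdal's real
// PII detection is implemented as WASM pipeline steps, not this regex.
var ipv4Pattern = regexp.MustCompile(`\b(?:\d{1,3}\.){3}\d{1,3}\b`)

// maskIPs replaces every IPv4 address in a log line with a fixed mask.
func maskIPs(line string) string {
	return ipv4Pattern.ReplaceAllString(line, "0.0.0.0")
}

func main() {
	fmt.Println(maskIPs(`{"msg":"login from 203.0.113.42"}`))
	// prints {"msg":"login from 0.0.0.0"}
}
```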

Getting Started

The steps below will deploy the Streamdal stack along with a Logstash environment to demonstrate how the solution works.

  1. Deploy the Streamdal server: see https://github.com/streamdal/streamdal/tree/main/install/docker
  2. Clone this repo: `git clone git@github.com:streamdal/log-processor.git`
  3. Bring up the development environment: `cd log-processor/ && docker-compose up -d`
  4. Open the Streamdal UI at http://127.0.0.1:8080/
  5. Run the log generator to send sample data: `python3 log-generator.py`
  6. Create a pipeline in the Streamdal UI
  7. Attach the pipeline to the data stream
  8. Use Streamdal's Tail to confirm IPs are masked; you should see the masked IP in the live tail
  9. Go to Kibana at http://127.0.0.1:5601/
  10. Create a Kibana index pattern
  11. Confirm the IP data is masked in Kibana
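The log generator sends newline-delimited JSON events into Logstash. As a reference for the wire format, here is a Go sketch of one such event; the field names and values are assumptions for illustration, not taken from `log-generator.py`.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// sampleRecord mimics the shape of a log event the demo might emit;
// the field names here are assumptions, not taken from log-generator.py.
type sampleRecord struct {
	Timestamp string `json:"timestamp"`
	Level     string `json:"level"`
	Message   string `json:"message"`
	ClientIP  string `json:"client_ip"`
}

// encodeLogLine renders a record in the newline-delimited JSON framing
// that Logstash's json_lines codec expects.
func encodeLogLine(r sampleRecord) string {
	b, _ := json.Marshal(r)
	return string(b) + "\n"
}

func main() {
	fmt.Print(encodeLogLine(sampleRecord{
		Timestamp: "2024-01-01T00:00:00Z",
		Level:     "info",
		Message:   "user login",
		ClientIP:  "203.0.113.42",
	}))
}
```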

Deploying in production / existing Logstash

  1. Deploy Streamdal to Kubernetes using the Helm chart: https://github.com/streamdal/streamdal/tree/main/install/helm
  2. Deploy the log-processor to all of your Logstash agents, either with Docker or via the streamdal/log-processor binary
  3. Update the Logstash agents to send the JSON logs you want to process to the log-processor:
```
input {
  tcp {
    port => 5044
    codec => json_lines
  }
}

output {
  stdout { codec => rubydebug }
  tcp {
    host => "go-app"
    port => 6000
    codec => json_lines
  }
}
```
  4. Add a section to receive the processed data from the Streamdal log-processor and output it to the final destination:
```
input {
  tcp {
    port => 7002
    codec => json_lines
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"] # Assumes Elasticsearch service is named 'elasticsearch' in docker-compose
    index => "processed-logs-%{+YYYY.MM.dd}" # Customize the index name as needed
  }
}
```
  5. Access the Streamdal Console you deployed earlier to apply whatever pipelines / rules you need.

Community

We're building Streamdal in the open and we'd love for you to join us!

Join our Discord!

About

Process logs based on Streamdal pipelines
