Leverage the power of Streamdal to supercharge your existing logging solution.
Real-Time Log Processing: Fast processing of incoming logs with minimal latency.
PII Redaction: Automated redaction of sensitive information from logs.
Centralized Rule Management: Streamdal provides a UI for central rule management. These rules are then pulled down by the log-processing agents across your organization.
Scalable Architecture: Designed to handle high volumes of log data efficiently. Pipeline functions are built as WASM and pushed down to the local log-processors, distributing load to the edge and enabling real-time processing.
Go Processing Agent: Custom-built agent for processing logs.
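As a rough illustration of the kind of transform a PII-redaction pipeline step performs (the mask format here is an assumption, not Streamdal's actual output), masking IPv4 addresses in a log line could look like:

```python
import re

# Matches dotted-quad IPv4 addresses. A real pipeline step runs as a
# WASM function inside the log-processor, not as Python.
IPV4_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def mask_ips(line: str, mask: str = "xxx.xxx.xxx.xxx") -> str:
    """Replace every IPv4 address in a log line with a fixed mask."""
    return IPV4_RE.sub(mask, line)

print(mask_ips("client 10.1.2.3 connected to 192.168.0.1"))
# -> client xxx.xxx.xxx.xxx connected to xxx.xxx.xxx.xxx
```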
The steps below deploy the Streamdal stack along with a Logstash environment to demonstrate how the solution works.
- Deploy the Streamdal server: see https://github.com/streamdal/streamdal/tree/main/install/docker
- Clone this repo: `git clone [email protected]:streamdal/log-processor.git`
- Bring up the development environment
`cd log-processor/; docker-compose up -d`
- View the Streamdal UI at http://127.0.0.1:8080/
- Execute the log generator to send sample data
`python3 log-generator.py`
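If you are curious what a log generator like this might do, here is a minimal sketch (the field names and target port are assumptions; check `log-generator.py` in this repo for the real format). It emits newline-delimited JSON over TCP, matching the `json_lines` codec on the Logstash input shown further down:

```python
import json
import random
import socket
import time

def make_log_record(i: int) -> dict:
    """Build a sample log event containing an IP for the pipeline to mask."""
    return {
        "seq": i,
        "level": random.choice(["INFO", "WARN", "ERROR"]),
        "client_ip": f"10.0.{random.randint(0, 255)}.{random.randint(1, 254)}",
        "message": "user login attempt",
    }

def send_logs(host: str = "127.0.0.1", port: int = 5044, count: int = 10) -> None:
    """Ship newline-delimited JSON (json_lines codec) over TCP."""
    with socket.create_connection((host, port)) as sock:
        for i in range(count):
            sock.sendall((json.dumps(make_log_record(i)) + "\n").encode())
            time.sleep(0.1)

# send_logs()  # uncomment once Logstash is listening on port 5044
```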
- Create a pipeline
- Attach Pipeline to the Data Stream
- Use Streamdal's tail to confirm IPs are masked
- You should see the masked IP
- Go to Kibana at http://127.0.0.1:5601/
- Create a Kibana index pattern
- Confirm IP data is masked in Kibana
- Deploy Streamdal to Kubernetes using the Helm chart: https://github.com/streamdal/streamdal/tree/main/install/helm
- Deploy the log-processor to all your Logstash agents, either with Docker or via the streamdal/log-processor binary
- Update your Logstash agent configuration to send the JSON logs you want to process to the log-processor:
input {
  tcp {
    port => 5044
    codec => json_lines
  }
}
output {
  stdout { codec => rubydebug }
  tcp {
    host => "go-app"
    port => 6000
    codec => json_lines
  }
}
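Between this output and the input below sits the log-processor: it receives `json_lines` events on port 6000 and returns processed events to Logstash on port 7002. The real agent is written in Go and runs WASM pipeline steps; this Python sketch (all names illustrative) only shows the shape of the wire protocol:

```python
import json
import socket

def transform(event: dict) -> dict:
    """Placeholder for pipeline steps (the real steps are WASM functions
    pulled from the Streamdal server, e.g. PII redaction)."""
    return event

def relay_once(in_port: int = 6000, out_host: str = "logstash",
               out_port: int = 7002) -> None:
    """Accept one connection of newline-delimited JSON from Logstash,
    run each event through the pipeline, and forward it back."""
    srv = socket.create_server(("0.0.0.0", in_port))
    conn, _ = srv.accept()
    with conn, socket.create_connection((out_host, out_port)) as out:
        for line in conn.makefile():
            event = transform(json.loads(line))
            out.sendall((json.dumps(event) + "\n").encode())
```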
- Add a section to receive the processed data from the Streamdal log-processor and output it to the final destination:
input {
  tcp {
    port => 7002
    codec => json_lines
  }
}
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"] # Assumes the Elasticsearch service is named 'elasticsearch' in docker-compose
    index => "processed-logs-%{+YYYY.MM.dd}" # Customize the index name as needed
  }
}
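The `%{+YYYY.MM.dd}` in the index name makes Logstash write one index per day (based on the event's `@timestamp`, in UTC). If you want to inspect today's processed events from a script, you can compute the same index name; the host and port below assume the docker-compose defaults:

```python
from datetime import datetime, timezone

def todays_index(prefix: str = "processed-logs") -> str:
    """Mirror Logstash's %{+YYYY.MM.dd} date math for the index name."""
    return f"{prefix}-{datetime.now(timezone.utc).strftime('%Y.%m.%d')}"

# Example: GET http://127.0.0.1:9200/<todays_index()>/_search to check
# that client IPs arrive masked in the processed events.
print(todays_index())
```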
- Access the Streamdal Console you deployed earlier to apply whatever pipelines and rules you need.
We're building Streamdal in the open and we'd love for you to join us!
Join our Discord!