In this lab we'll simulate a transaction through multiple logs and review the logs and specific transactions, via transaction IDs, using Dynatrace DQL. The lab contains a prebuilt setup, deployable via GitHub Codespaces, which will spin up a Kubernetes cluster and deploy a container that generates 6 log files in 3 different formats. This is useful for anyone looking to find commonality of specific values across multiple log files.
- Deploy a k8s cluster in GitHub Codespaces
- Generate logs and ship them to the Dynatrace logs ingest API
- Review the generated log entries in Dynatrace
- Learn basic parsing syntax for Dynatrace DQL
- Learn advanced use cases for leveraging DQL to build analytics and reporting
- Dynatrace SaaS Tenant
- Z Shell
- Github Codespaces
- Kubernetes
- Docker
- JSON
- XML
- Git
- Dynatrace Query Language (DQL)
Some code or APIs take an input from an end user or another API (a user name, password, credit card transaction, UUID, etc.) and pass that input as a variable between other APIs and downstream services. Sometimes these values are also logged for debugging or reporting purposes. For the purposes of this lab, the script we deploy will use a set of templates to dynamically populate a set of "transaction" IDs across six separate log files or "devices" to simulate this type of transaction. We'll add some complexity to this transaction flow by having a different logging format for each "device" or "service" hop. We'll use the Dynatrace Query Language to quickly and easily access, review, and track specific transactions as they run through our environment:
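To make the idea concrete before deploying anything, here is a minimal sketch of what "one transaction ID across differently formatted logs" looks like. The file names, field names, and ID format below are illustrative only and are not the lab's actual templates:

```shell
#!/usr/bin/env sh
# Hypothetical example: propagate one transaction ID through two
# differently formatted log lines (JSON and XML), then locate it in
# both with grep -- the same task DQL will do at scale later.

TXN_ID="txn-$(date +%s)-0001"   # simple unique-ish transaction ID

# "Device" 1 writes JSON-formatted logs
printf '{"timestamp":"%s","level":"INFO","transaction_id":"%s","msg":"payment authorized"}\n' \
  "$(date -u +%Y-%m-%dT%H:%M:%SZ)" "$TXN_ID" > device1.log

# "Device" 2 writes XML-formatted logs
printf '<event time="%s" level="INFO"><txn>%s</txn><msg>payment settled</msg></event>\n' \
  "$(date -u +%Y-%m-%dT%H:%M:%SZ)" "$TXN_ID" > device2.log

# The same ID is now traceable across both formats:
grep -h "$TXN_ID" device1.log device2.log
```

The lab's container does the same thing across six files and three formats, which is exactly why a query language that can parse each format becomes useful.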
Within this GitHub repo we'll build a Java container from a Dockerfile and deploy it into a Kubernetes cluster in the log-generator namespace.

While the container runs, it will generate log files based on the templates in the /templates directory: it reads each template, appends the variables, generates a new log line in each of the 6 log files created, and POSTs them to your Dynatrace tenant's logs ingest API endpoint.
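For reference, a single iteration of that ship step can be sketched as a plain curl call, assuming the standard Dynatrace Logs API v2 endpoint (`/api/v2/logs/ingest`) and the same `DT_URL` and `DT_LOG_INGEST_TOKEN` variables you'll set when creating the codespace. The log line content below is a placeholder:

```shell
#!/usr/bin/env sh
# Sketch of one log-ingest POST. The payload is a JSON array of log
# events; content, log.source, and severity are placeholder values.

PAYLOAD='[{"content":"2024-01-01T00:00:00Z INFO txn-0001 payment authorized","log.source":"device1.log","severity":"INFO"}]'
echo "$PAYLOAD"   # inspect the batch before sending

# Only attempt the POST when a tenant is actually configured:
if [ -n "${DT_URL:-}" ] && [ -n "${DT_LOG_INGEST_TOKEN:-}" ]; then
  curl -sS -X POST "${DT_URL}/api/v2/logs/ingest" \
    -H "Authorization: Api-Token ${DT_LOG_INGEST_TOKEN}" \
    -H "Content-Type: application/json" \
    -d "$PAYLOAD"
fi
```

The lab's container batches many such events per request; the endpoint and header shape are the same.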
You can review the status of the log generation and shipping from the log-generator container by running:

```shell
kubectl logs -f -n log-generator -l app=log-generator
```
For the shell-script version of this lab, see the shell setup.
- A GitHub account
- An active Dynatrace SaaS tenant
  - If you don't have one, you can sign up for a free 15-day Dynatrace trial
- Two Dynatrace API access tokens
  - In your Dynatrace tenant, navigate to the "Access Tokens" app and generate two tokens:
    - A logs ingest token with the scope:
      - Ingest Logs
- Log in to your GitHub account.
- Create a fork of this repo.
- Once the repo has been forked, click on "Code" --> "Codespaces" --> "..." --> "New with options".
- Update the values:
  - DT_URL (use the .live URL and no trailing slash!)
  - DT_LOG_INGEST_TOKEN
- Click on "Create codespace".
- Upload the code-spaces-multitple-log-transaction-notebook.json notebook to your Dynatrace tenant; this file is in the "assets/codespaces" folder of this repository.
- Wait ~5 minutes and envision a world where software works perfectly while the codespaces environment provisions.
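A common setup mistake is pasting a DT_URL with a trailing slash, which produces a double slash when the ingest path is appended. A small sketch of normalizing the value before you paste it (the tenant URL below is a placeholder):

```shell
#!/usr/bin/env sh
# Normalize DT_URL before using it in the codespace options.
# "abc12345" is a placeholder tenant ID, not a real environment.

DT_URL="https://abc12345.live.dynatrace.com/"   # example with a trailing slash
DT_URL="${DT_URL%/}"                            # strip a trailing slash if present
echo "$DT_URL"
```

With the slash stripped, `${DT_URL}/api/v2/logs/ingest` expands to a clean endpoint URL.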