
Reviewing Transactions & Common Data Across Multiple Log Files

Overview

What You’ll Learn Today

In this lab we'll simulate a transaction that spans multiple logs, then review those logs and trace specific transactions, via their transaction IDs, with Dynatrace DQL. The lab contains a prebuilt setup deployable via GitHub Codespaces, which will spin up a Kubernetes cluster and deploy a container that generates 6 log files in 3 different formats. This is useful for anyone looking to find common values across multiple log files.

In this lab we will...

  1. Deploy a k8s cluster in GitHub Codespaces
  2. Generate logs and ship them to the Dynatrace log ingest API
  3. Review the generated log entries in Dynatrace
  4. Learn basic parsing syntax for Dynatrace DQL
  5. Learn advanced use cases for leveraging DQL to build analytics and reporting
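As a small taste of steps 4 and 5, here's a minimal DQL sketch of the kind of query we'll build. The transactionID= literal and the trans_id field name are illustrative placeholders, not the lab's actual log format:

fetch logs
| filter matchesValue(content, "*transactionID*")
| parse content, "LD 'transactionID=' LD:trans_id"   // LD:trans_id grabs the rest of the line in this sketch
| summarize occurrences = count(), by: {trans_id}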

such dql, much wow!

Technical Specification

Technologies We Will Work With Today

Simulated Transaction Flow

Some code or APIs take an input from an end user or another API, such as a user name, password, credit card transaction, or UUID, and pass that input as a variable between other APIs and downstream services. Sometimes these values are also logged for debugging or reporting purposes. For the purposes of this lab, the script we deploy uses a set of templates to dynamically populate a set of "transaction" IDs across six separate log files or "devices" to simulate this type of transaction. We'll add some complexity to the flow by giving each "device" or "service" hop a different logging format. We'll then use the Dynatrace Query Language (DQL) to quickly and easily access, review, and track specific transactions as they run through our environment:

[screenshot: transaction flow]
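To make that concrete, here's a hedged DQL sketch for following a single transaction across all six logs; the UUID below is a made-up placeholder:

fetch logs
| filter contains(content, "4e9c1a7b-0d2f-4c11-9a6e-3b8f5d27c901")  // hypothetical transaction ID
| sort timestamp asc
| fields timestamp, log.source, content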

Log Generation

Within this github repo we'll build a java container from a dockerfile and deploy it into a kubernetes cluster in the log-generator namespace:

[screenshot: transaction flow]

While the container runs, it reads the templates in the /templates directory, fills in the variables in those templates, generates a new log line in each of the 6 log files, and POSTs them to your Dynatrace tenant's log ingest API endpoint.
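For reference, shipping a single record to that endpoint looks roughly like the curl sketch below; the path is Dynatrace's logs ingest API v2, while the record content and log.source values are invented for illustration:

# ship one illustrative log record to the tenant's log ingest endpoint
curl -X POST "${DT_URL}/api/v2/logs/ingest" \
  -H "Authorization: Api-Token ${DT_LOG_INGEST_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '[{"content":"2025-01-01T12:00:00Z INFO transactionID=4e9c1a7b checkout complete","log.source":"device1.log"}]'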

You can review the status of the log generation and shipping from the log-generator container by running:

kubectl logs -f -n log-generator -l app=log-generator

[screenshot: tail logs]
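Once lines are flowing, you can sanity-check their arrival in Dynatrace with a quick DQL query; this sketch assumes the standard Kubernetes enrichment populates k8s.namespace.name on ingested logs:

fetch logs
| filter k8s.namespace.name == "log-generator"
| sort timestamp desc
| limit 10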

SETUP

For the shell script version of this lab, see the shell setup

Prerequisites

  1. A GitHub account
  2. An active Dynatrace SaaS tenant
  3. Two Dynatrace API access tokens

Access token requirements

  1. In your Dynatrace tenant, navigate to the "Access Tokens" app/page and generate two tokens:

    [screenshot: ingest token]

  2. Logs ingest token with the scope:

    • Ingest Logs

    [screenshot: ingest token]

Deploying the app via Codespaces:

  1. Log in to your GitHub account.

  2. Create a fork of this repo

    [screenshot: operator tokens]

  3. Once the repo has been forked, click "Code" --> "Codespaces" --> "..." --> "New with options"

    [screenshot: operator tokens]

  4. Update the values (illustrative examples follow at the end of this list):

    • DT_URL (use the .live URL and no trailing slash!)
    • DT_LOG_INGEST_TOKEN
  5. Click on "Create codespace"

    [screenshot: operator tokens]

  6. Upload the code-spaces-multitple-log-transaction-notebook.json notebook to your Dynatrace tenant

    • this file is in the "assets/codespaces" folder of this repository
    • Click here

    [screenshot: logs files]

  7. Wait ~5 minutes and envision a world where software works perfectly while the codespaces environment provisions
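For reference, the two values from step 4 look roughly like the placeholders below; neither is a working credential:

DT_URL=https://abc12345.live.dynatrace.com        # your tenant's .live URL, no trailing slash
DT_LOG_INGEST_TOKEN=dt0c01.SAMPLE.TOKEN1234       # token carrying the Ingest Logs scope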
