Commit 947191a

[Updated] App Platform Guides for end of beta (#7294)
1 parent 832326b commit 947191a

File tree

4 files changed: +5 -27 lines changed
  • docs/guides/kubernetes
    • deploy-llm-for-ai-inferencing-on-apl
    • deploy-rag-pipeline-and-chatbot-on-apl
    • inter-service-communication-with-rabbitmq-and-apl
    • use-app-platform-to-deploy-wordpress


docs/guides/kubernetes/deploy-llm-for-ai-inferencing-on-apl/index.md

Lines changed: 1 addition & 7 deletions

```diff
@@ -5,18 +5,14 @@ description: "This guide includes steps and guidance for deploying a large langu
 authors: ["Akamai"]
 contributors: ["Akamai"]
 published: 2025-03-25
-modified: 2025-06-04
+modified: 2025-06-26
 keywords: ['ai','ai inference','ai inferencing','llm','large language model','app platform','lke','linode kubernetes engine','llama 3','kserve','istio','knative']
 license: '[CC BY-ND 4.0](https://creativecommons.org/licenses/by-nd/4.0)'
 external_resources:
 - '[Akamai App Platform for LKE](https://techdocs.akamai.com/cloud-computing/docs/application-platform)'
 - '[Akamai App Platform Documentation](https://techdocs.akamai.com/app-platform/docs/welcome)'
 ---
 
-{{< note title="Beta Notice" type="warning" >}}
-The Akamai App Platform is now available as a limited beta. It is not recommended for production workloads. To register for the beta, visit the [Betas](https://cloud.linode.com/betas) page in the Cloud Manager and click the Sign Up button next to the Akamai App Platform Beta.
-{{< /note >}}
-
 LLMs (large language models) are deep-learning models that are pre-trained on vast amounts of information. AI inferencing is the method by which an AI model (such as an LLM) is trained to "infer", and subsequently deliver accurate information. The LLM used in this deployment, Meta AI's [Llama 3](https://www.llama.com/docs/overview/), is an open-source, pre-trained LLM often used for tasks like responding to questions in multiple languages, coding, and advanced reasoning.
 
 [KServe](https://kserve.github.io/website/latest/) is a standard Model Inference Platform for Kubernetes, built for highly-scalable use cases. KServe comes with multiple Model Serving Runtimes, including the [Hugging Face](https://huggingface.co/welcome) serving runtime. The Hugging Face runtime supports the following machine learning (ML) tasks: text generation, Text2Text generation, token classification, sequence and text classification, and fill mask.
@@ -65,8 +61,6 @@ If you prefer to manually install an LLM and RAG Pipeline on LKE rather than usi
 
 - Access granted to Meta AI's Llama 3 model is required. To request access, navigate to Hugging Face's [Llama 3-8B Instruct LLM link](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct), read and accept the license agreement, and submit your information.
 
-- Enrollment into the Akamai App Platform's [beta program](https://cloud.linode.com/betas).
-
 ## Set Up Infrastructure
 
 ### Provision an LKE Cluster
```
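The guide touched above serves Llama 3 through KServe's Hugging Face runtime, which exposes an OpenAI-compatible completions endpoint. As a hedged sketch only: the host name and model name below are placeholders (the real InferenceService URL comes from your cluster), and the exact path can vary by KServe version.

```python
import json
import urllib.request

# Sketch: build a completions request for a KServe-style endpoint.
# "llm.example.com" and "llama3" are illustrative placeholders, not
# values from the guide.
def build_request(host: str, model: str, prompt: str) -> urllib.request.Request:
    payload = {"model": model, "prompt": prompt, "max_tokens": 64}
    return urllib.request.Request(
        f"http://{host}/openai/v1/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_request("llm.example.com", "llama3", "What is AI inferencing?")
# urllib.request.urlopen(req) would send it once the endpoint is reachable.
```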

docs/guides/kubernetes/deploy-rag-pipeline-and-chatbot-on-apl/index.md

Lines changed: 1 addition & 5 deletions

```diff
@@ -5,18 +5,14 @@ description: "This guide expands on a previously built LLM and AI inferencing ar
 authors: ["Akamai"]
 contributors: ["Akamai"]
 published: 2025-03-25
-modified: 2025-06-04
+modified: 2025-06-26
 keywords: ['ai','ai inference','ai inferencing','llm','large language model','app platform','lke','linode kubernetes engine','rag pipeline','retrieval augmented generation','open webui','kubeflow']
 license: '[CC BY-ND 4.0](https://creativecommons.org/licenses/by-nd/4.0)'
 external_resources:
 - '[Akamai App Platform for LKE](https://techdocs.akamai.com/cloud-computing/docs/application-platform)'
 - '[Akamai App Platform Documentation](https://techdocs.akamai.com/app-platform/docs/welcome)'
 ---
 
-{{< note title="Beta Notice" type="warning" >}}
-The Akamai App Platform is now available as a limited beta. It is not recommended for production workloads. To register for the beta, visit the [Betas](https://cloud.linode.com/betas) page in the Cloud Manager and click the Sign Up button next to the Akamai App Platform Beta.
-{{< /note >}}
-
 This guide builds on the LLM (Large Language Model) architecture built in our [Deploy an LLM for AI Inferencing with App Platform for LKE](/docs/guides/deploy-llm-for-ai-inferencing-on-apl) guide by deploying a RAG (Retrieval-Augmented Generation) pipeline that indexes a custom data set. RAG is a particular method of context augmentation that attaches relevant data as context when users send queries to an LLM.
 
 Follow the steps in this tutorial to install Kubeflow Pipelines and deploy a RAG pipeline using Akamai App Platform for LKE. The deployment in this guide uses the previously deployed Open WebUI chatbot to respond to queries using a custom data set. The data set you use may vary depending on your use case. For example purposes, this guide uses a sample data set from Linode Docs in Markdown format.
```
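The RAG pattern this guide deploys can be sketched in a few lines: retrieve the document most relevant to a query and attach it as context before prompting the LLM. This is a toy illustration only; the real pipeline (Kubeflow Pipelines plus a vector store) scores documents with embeddings, not the word-overlap heuristic below.

```python
import re

# Toy retrieval: score each document by how many query words it shares.
def tokens(text: str) -> set[str]:
    return set(re.findall(r"\w+", text.lower()))

def retrieve(query: str, docs: list[str]) -> str:
    return max(docs, key=lambda d: len(tokens(query) & tokens(d)))

# Context augmentation: prepend the retrieved document to the prompt.
def augment(query: str, docs: list[str]) -> str:
    return f"Context: {retrieve(query, docs)}\n\nQuestion: {query}"

docs = [
    "LKE is Linode Kubernetes Engine, a managed Kubernetes service.",
    "RabbitMQ is an open source message broker.",
]
print(augment("What is LKE?", docs))
```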

docs/guides/kubernetes/inter-service-communication-with-rabbitmq-and-apl/index.md

Lines changed: 2 additions & 8 deletions

```diff
@@ -5,18 +5,14 @@ description: "This guide shows how to deploy a RabbitMQ message broker architect
 authors: ["Akamai"]
 contributors: ["Akamai"]
 published: 2025-03-20
-modified: 2025-06-04
+modified: 2025-06-26
 keywords: ['app platform','lke','linode kubernetes engine','rabbitmq','microservice','message broker']
 license: '[CC BY-ND 4.0](https://creativecommons.org/licenses/by-nd/4.0)'
 external_resources:
 - '[Akamai App Platform for LKE](https://techdocs.akamai.com/cloud-computing/docs/application-platform)'
 - '[Akamai App Platform Documentation](https://techdocs.akamai.com/app-platform/docs/welcome)'
 ---
 
-{{< note title="Beta Notice" type="warning" >}}
-The Akamai App Platform is now available as a limited beta. It is not recommended for production workloads. To register for the beta, visit the [Betas](https://cloud.linode.com/betas) page in the Cloud Manager and click the Sign Up button next to the Akamai App Platform Beta.
-{{< /note >}}
-
 ## Introduction
 
 Asynchronous messaging is a common microservice architecture pattern used to decouple inter-service communication. Akamai App Platform uses RabbitMQ to provide an integrated messaging and streaming broker. RabbitMQ is a widely-adopted, open source message broker that uses AMQP (Advanced Message Queuing Protocol) to communicate with producers (apps that send messages) and consumers (apps that receive messages).
@@ -71,8 +67,6 @@ To address this, RabbitMQ allows you to bind, or link, each service - email, SMS
 
 - A [Cloud Manager](https://cloud.linode.com/) account is required to use Akamai's cloud computing services, including LKE.
 
-- Enrollment into the Akamai App Platform's [beta program](https://cloud.linode.com/betas).
-
 - An provisioned and configured LKE cluster with App Platform enabled and [auto-scaling](https://techdocs.akamai.com/cloud-computing/docs/manage-nodes-and-node-pools#autoscale-automatically-resize-node-pools) turned on. An LKE cluster consisting of 3 Dedicated Compute Instances is sufficient for the deployment in this guide to run, but additional resources may be required during the configuration of your App Platform architecture.
 
 To ensure sufficient resources are available, it is recommended that node pool auto-scaling for your LKE cluster is enabled after deployment. Make sure to set the max number of nodes higher than your minimum. This may result in higher billing costs.
@@ -99,7 +93,7 @@ Once your LKE cluster with App Platform has been fully deployed, [sign in](https
 
 ### Create a New Team
 
-[Teams](https://techdocs.akamai.com/app-platform/docs/platform-teams) are isolated tenants on the platform to support Development and DevOps teams, projects, or even DTAP (Development, Testing, Acceptance, Production). A Team gets access to the Console, including access to self-service features and all shared apps available on the platform.
+[Teams](https://techdocs.akamai.com/app-platform/docs/platform-teams) are isolated tenants on the platform to support Development and DevOps teams, projects, or DTAP (Development, Testing, Acceptance, Production). A Team gets access to the Console, including access to self-service features and all shared apps available on the platform.
 
 When working in the context of an admin-level Team, users can create and access resources in any namespace. When working in the context of a non-admin Team, users can only create and access resources used in that Team's namespace.
```
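The second hunk above references the guide's fanout setup, where each service (email, SMS) is bound to an exchange and every bound queue receives a copy of each message. A minimal sketch of that pattern, simulated in plain Python rather than against a live RabbitMQ broker:

```python
# Simulated fanout exchange: binding a queue registers it, and publishing
# delivers a copy of the message to every bound queue. A real producer or
# consumer would use an AMQP client against the broker instead.
class FanoutExchange:
    def __init__(self) -> None:
        self.queues: dict[str, list[str]] = {}

    def bind(self, queue: str) -> None:
        self.queues.setdefault(queue, [])

    def publish(self, message: str) -> None:
        for messages in self.queues.values():
            messages.append(message)

ex = FanoutExchange()
ex.bind("email")
ex.bind("sms")
ex.publish("order-created")
print(ex.queues)  # each bound queue holds its own copy of the message
```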

docs/guides/kubernetes/use-app-platform-to-deploy-wordpress/index.md

Lines changed: 1 addition & 7 deletions

```diff
@@ -5,18 +5,14 @@ description: "Two to three sentences describing your guide."
 authors: ["Akamai"]
 contributors: ["Akamai"]
 published: 2025-05-06
-modified: 2025-06-04
+modified: 2025-06-26
 keywords: ['app platform','app platform for lke','lke','linode kubernetes engine','kubernetes','persistent volumes','mysql']
 license: '[CC BY-ND 4.0](https://creativecommons.org/licenses/by-nd/4.0)'
 external_resources:
 - '[Akamai App Platform for LKE](https://techdocs.akamai.com/cloud-computing/docs/application-platform)'
 - '[Akamai App Platform Documentation](https://techdocs.akamai.com/app-platform/docs/welcome)'
 ---
 
-{{< note title="Beta Notice" type="warning" >}}
-The Akamai App Platform is now available as a limited beta. It is not recommended for production workloads. To register for the beta, visit the [Betas](https://cloud.linode.com/betas) page in the Cloud Manager and click the Sign Up button next to the Akamai App Platform Beta.
-{{< /note >}}
-
 This guide includes steps for deploying a WordPress site and persistent MySQL database using [App Platform for Linode Kubernetes Engine](https://techdocs.akamai.com/cloud-computing/docs/application-platform) (LKE). In this architecture, both WordPress and MySQL use PersistentVolumes (PV) and PersistentVolumeClaims (PVC) to store data.
 
 To add the WordPress and MySQL Helm charts to the App Platform Catalog, the **Add Helm Chart** feature of Akamai App Platform for LKE is used.
@@ -25,8 +21,6 @@ To add the WordPress and MySQL Helm charts to the App Platform Catalog, the **Ad
 
 - A [Cloud Manager](https://cloud.linode.com/) account is required to use Akamai's cloud computing services, including LKE.
 
-- Enrollment into the Akamai App Platform's [beta program](https://cloud.linode.com/betas).
-
 - An provisioned and configured LKE cluster with App Platform enabled and [auto-scaling](https://techdocs.akamai.com/cloud-computing/docs/manage-nodes-and-node-pools#autoscale-automatically-resize-node-pools) turned on. A Kubernetes cluster consisting of 3 [Dedicated CPU Compute Instances](https://techdocs.akamai.com/cloud-computing/docs/dedicated-cpu-compute-instances) is sufficient for the deployment in this guide to run, but additional resources may be required during the configuration of your App Platform architecture.
 
 To ensure sufficient resources are available, it is recommended that node pool auto-scaling for your LKE cluster is enabled after deployment. Make sure to set the max number of nodes higher than your minimum. This may result in higher billing costs.
```
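The WordPress guide's storage model rests on PersistentVolumeClaims. As an illustration only (the claim name and storage size below are made up, not values from the guide or its Helm charts), a PVC manifest has this shape, shown here as a Python dict:

```python
import json

# Illustrative PVC shape for a database volume; "mysql-data" and "10Gi"
# are hypothetical examples.
mysql_pvc = {
    "apiVersion": "v1",
    "kind": "PersistentVolumeClaim",
    "metadata": {"name": "mysql-data"},
    "spec": {
        "accessModes": ["ReadWriteOnce"],  # mounted read-write by one node
        "resources": {"requests": {"storage": "10Gi"}},
    },
}
print(json.dumps(mysql_pvc, indent=2))
```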
