diff --git a/docs/guides/kubernetes/deploy-llm-for-ai-inferencing-on-apl/index.md b/docs/guides/kubernetes/deploy-llm-for-ai-inferencing-on-apl/index.md
index 4e2c003a224..c6aa25b5ee8 100644
--- a/docs/guides/kubernetes/deploy-llm-for-ai-inferencing-on-apl/index.md
+++ b/docs/guides/kubernetes/deploy-llm-for-ai-inferencing-on-apl/index.md
@@ -5,7 +5,7 @@ description: "This guide includes steps and guidance for deploying a large langu
 authors: ["Akamai"]
 contributors: ["Akamai"]
 published: 2025-03-25
-modified: 2025-06-04
+modified: 2025-06-26
 keywords: ['ai','ai inference','ai inferencing','llm','large language model','app platform','lke','linode kubernetes engine','llama 3','kserve','istio','knative']
 license: '[CC BY-ND 4.0](https://creativecommons.org/licenses/by-nd/4.0)'
 external_resources:
@@ -13,10 +13,6 @@ external_resources:
 - '[Akamai App Platform Documentation](https://techdocs.akamai.com/app-platform/docs/welcome)'
 ---
 
-{{< note title="Beta Notice" type="warning" >}}
-The Akamai App Platform is now available as a limited beta. It is not recommended for production workloads. To register for the beta, visit the [Betas](https://cloud.linode.com/betas) page in the Cloud Manager and click the Sign Up button next to the Akamai App Platform Beta.
-{{< /note >}}
-
 LLMs (large language models) are deep-learning models that are pre-trained on vast amounts of information. AI inferencing is the process by which a trained AI model (such as an LLM) "infers" from new input and subsequently delivers accurate information. The LLM used in this deployment, Meta AI's [Llama 3](https://www.llama.com/docs/overview/), is an open-source, pre-trained LLM often used for tasks like responding to questions in multiple languages, coding, and advanced reasoning.
 
 [KServe](https://kserve.github.io/website/latest/) is a standard Model Inference Platform for Kubernetes, built for highly scalable use cases. KServe comes with multiple Model Serving Runtimes, including the [Hugging Face](https://huggingface.co/welcome) serving runtime. The Hugging Face runtime supports the following machine learning (ML) tasks: text generation, Text2Text generation, token classification, sequence and text classification, and fill mask.
@@ -65,8 +61,6 @@ If you prefer to manually install an LLM and RAG Pipeline on LKE rather than usi
 
 - Access granted to Meta AI's Llama 3 model is required. To request access, navigate to Hugging Face's [Llama 3-8B Instruct LLM link](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct), read and accept the license agreement, and submit your information.
 
-- Enrollment into the Akamai App Platform's [beta program](https://cloud.linode.com/betas).
-
 ## Set Up Infrastructure
 
 ### Provision an LKE Cluster
diff --git a/docs/guides/kubernetes/deploy-rag-pipeline-and-chatbot-on-apl/index.md b/docs/guides/kubernetes/deploy-rag-pipeline-and-chatbot-on-apl/index.md
index d74334f1702..706fa13cb15 100644
--- a/docs/guides/kubernetes/deploy-rag-pipeline-and-chatbot-on-apl/index.md
+++ b/docs/guides/kubernetes/deploy-rag-pipeline-and-chatbot-on-apl/index.md
@@ -5,7 +5,7 @@ description: "This guide expands on a previously built LLM and AI inferencing ar
 authors: ["Akamai"]
 contributors: ["Akamai"]
 published: 2025-03-25
-modified: 2025-06-04
+modified: 2025-06-26
 keywords: ['ai','ai inference','ai inferencing','llm','large language model','app platform','lke','linode kubernetes engine','rag pipeline','retrieval augmented generation','open webui','kubeflow']
 license: '[CC BY-ND 4.0](https://creativecommons.org/licenses/by-nd/4.0)'
 external_resources:
@@ -13,10 +13,6 @@ external_resources:
 - '[Akamai App Platform Documentation](https://techdocs.akamai.com/app-platform/docs/welcome)'
 ---
 
-{{< note title="Beta Notice" type="warning" >}}
-The Akamai App Platform is now available as a limited beta. It is not recommended for production workloads. To register for the beta, visit the [Betas](https://cloud.linode.com/betas) page in the Cloud Manager and click the Sign Up button next to the Akamai App Platform Beta.
-{{< /note >}}
-
 This guide builds on the LLM (Large Language Model) architecture built in our [Deploy an LLM for AI Inferencing with App Platform for LKE](/docs/guides/deploy-llm-for-ai-inferencing-on-apl) guide by deploying a RAG (Retrieval-Augmented Generation) pipeline that indexes a custom data set. RAG is a particular method of context augmentation that attaches relevant data as context when users send queries to an LLM.
 
 Follow the steps in this tutorial to install Kubeflow Pipelines and deploy a RAG pipeline using Akamai App Platform for LKE. The deployment in this guide uses the previously deployed Open WebUI chatbot to respond to queries using a custom data set. The data set you use may vary depending on your use case. For example purposes, this guide uses a sample data set from Linode Docs in Markdown format.
diff --git a/docs/guides/kubernetes/inter-service-communication-with-rabbitmq-and-apl/index.md b/docs/guides/kubernetes/inter-service-communication-with-rabbitmq-and-apl/index.md
index bbd1febfa74..e91f9a8c3da 100644
--- a/docs/guides/kubernetes/inter-service-communication-with-rabbitmq-and-apl/index.md
+++ b/docs/guides/kubernetes/inter-service-communication-with-rabbitmq-and-apl/index.md
@@ -5,7 +5,7 @@ description: "This guide shows how to deploy a RabbitMQ message broker architect
 authors: ["Akamai"]
 contributors: ["Akamai"]
 published: 2025-03-20
-modified: 2025-06-04
+modified: 2025-06-26
 keywords: ['app platform','lke','linode kubernetes engine','rabbitmq','microservice','message broker']
 license: '[CC BY-ND 4.0](https://creativecommons.org/licenses/by-nd/4.0)'
 external_resources:
@@ -13,10 +13,6 @@ external_resources:
 - '[Akamai App Platform Documentation](https://techdocs.akamai.com/app-platform/docs/welcome)'
 ---
 
-{{< note title="Beta Notice" type="warning" >}}
-The Akamai App Platform is now available as a limited beta. It is not recommended for production workloads. To register for the beta, visit the [Betas](https://cloud.linode.com/betas) page in the Cloud Manager and click the Sign Up button next to the Akamai App Platform Beta.
-{{< /note >}}
-
 ## Introduction
 
 Asynchronous messaging is a common microservice architecture pattern used to decouple inter-service communication. Akamai App Platform uses RabbitMQ to provide an integrated messaging and streaming broker. RabbitMQ is a widely adopted, open source message broker that uses AMQP (Advanced Message Queuing Protocol) to communicate with producers (apps that send messages) and consumers (apps that receive messages).
@@ -71,8 +67,6 @@ To address this, RabbitMQ allows you to bind, or link, each service - email, SMS
 
 - A [Cloud Manager](https://cloud.linode.com/) account is required to use Akamai's cloud computing services, including LKE.
 
-- Enrollment into the Akamai App Platform's [beta program](https://cloud.linode.com/betas).
-
 - A provisioned and configured LKE cluster with App Platform enabled and [auto-scaling](https://techdocs.akamai.com/cloud-computing/docs/manage-nodes-and-node-pools#autoscale-automatically-resize-node-pools) turned on. An LKE cluster consisting of 3 Dedicated Compute Instances is sufficient to run the deployment in this guide, but additional resources may be required during the configuration of your App Platform architecture. To ensure sufficient resources are available, it is recommended that node pool auto-scaling for your LKE cluster is enabled after deployment. Make sure to set the maximum number of nodes higher than your minimum. This may result in higher billing costs.
@@ -99,7 +93,7 @@ Once your LKE cluster with App Platform has been fully deployed, [sign in](https
 
 ### Create a New Team
 
-[Teams](https://techdocs.akamai.com/app-platform/docs/platform-teams) are isolated tenants on the platform to support Development and DevOps teams, projects, or even DTAP (Development, Testing, Acceptance, Production). A Team gets access to the Console, including access to self-service features and all shared apps available on the platform.
+[Teams](https://techdocs.akamai.com/app-platform/docs/platform-teams) are isolated tenants on the platform to support Development and DevOps teams, projects, or DTAP (Development, Testing, Acceptance, Production). A Team gets access to the Console, including access to self-service features and all shared apps available on the platform.
 
 When working in the context of an admin-level Team, users can create and access resources in any namespace. When working in the context of a non-admin Team, users can only create and access resources used in that Team's namespace.
diff --git a/docs/guides/kubernetes/use-app-platform-to-deploy-wordpress/index.md b/docs/guides/kubernetes/use-app-platform-to-deploy-wordpress/index.md
index ae8889980d7..6f413ce6b7d 100644
--- a/docs/guides/kubernetes/use-app-platform-to-deploy-wordpress/index.md
+++ b/docs/guides/kubernetes/use-app-platform-to-deploy-wordpress/index.md
@@ -5,7 +5,7 @@ description: "Two to three sentences describing your guide."
 authors: ["Akamai"]
 contributors: ["Akamai"]
 published: 2025-05-06
-modified: 2025-06-04
+modified: 2025-06-26
 keywords: ['app platform','app platform for lke','lke','linode kubernetes engine','kubernetes','persistent volumes','mysql']
 license: '[CC BY-ND 4.0](https://creativecommons.org/licenses/by-nd/4.0)'
 external_resources:
@@ -13,10 +13,6 @@ external_resources:
 - '[Akamai App Platform Documentation](https://techdocs.akamai.com/app-platform/docs/welcome)'
 ---
 
-{{< note title="Beta Notice" type="warning" >}}
-The Akamai App Platform is now available as a limited beta. It is not recommended for production workloads. To register for the beta, visit the [Betas](https://cloud.linode.com/betas) page in the Cloud Manager and click the Sign Up button next to the Akamai App Platform Beta.
-{{< /note >}}
-
 This guide includes steps for deploying a WordPress site and persistent MySQL database using [App Platform for Linode Kubernetes Engine](https://techdocs.akamai.com/cloud-computing/docs/application-platform) (LKE). In this architecture, both WordPress and MySQL use PersistentVolumes (PV) and PersistentVolumeClaims (PVC) to store data.
 
 To add the WordPress and MySQL Helm charts to the App Platform Catalog, the **Add Helm Chart** feature of Akamai App Platform for LKE is used.
@@ -25,8 +21,6 @@ To add the WordPress and MySQL Helm charts to the App Platform Catalog, the **Ad
 
 - A [Cloud Manager](https://cloud.linode.com/) account is required to use Akamai's cloud computing services, including LKE.
 
-- Enrollment into the Akamai App Platform's [beta program](https://cloud.linode.com/betas).
-
 - A provisioned and configured LKE cluster with App Platform enabled and [auto-scaling](https://techdocs.akamai.com/cloud-computing/docs/manage-nodes-and-node-pools#autoscale-automatically-resize-node-pools) turned on. A Kubernetes cluster consisting of 3 [Dedicated CPU Compute Instances](https://techdocs.akamai.com/cloud-computing/docs/dedicated-cpu-compute-instances) is sufficient to run the deployment in this guide, but additional resources may be required during the configuration of your App Platform architecture. To ensure sufficient resources are available, it is recommended that node pool auto-scaling for your LKE cluster is enabled after deployment. Make sure to set the maximum number of nodes higher than your minimum. This may result in higher billing costs.
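
The RabbitMQ guide touched above describes producers publishing to an exchange and each service (email, SMS) binding its own queue to the routing keys it handles. As a minimal sketch of that binding pattern, assuming a reachable RabbitMQ broker and the `pika` Python client (the `notifications` exchange, queue names, and routing keys are illustrative, not values from the guide):

```python
import pika

# Connect to a local broker; in App Platform, RabbitMQ would instead be an
# in-cluster service, and connection details are deployment-specific.
connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()

# Producers publish to an exchange, never directly to a service's queue.
channel.exchange_declare(exchange="notifications", exchange_type="direct")

# Each consumer service owns a queue and binds it to the keys it cares about.
bindings = {
    "email": ["order.created", "order.shipped"],
    "sms": ["order.shipped"],
}
for queue, keys in bindings.items():
    channel.queue_declare(queue=queue, durable=True)
    for key in keys:
        channel.queue_bind(queue=queue, exchange="notifications", routing_key=key)

# One publish; RabbitMQ routes a copy to every queue bound with a matching key.
channel.basic_publish(
    exchange="notifications",
    routing_key="order.shipped",
    body=b"order 42 shipped",
)
connection.close()
```

A single publish with routing key `order.shipped` reaches both the `email` and `sms` queues, so each service receives its own copy while the producer remains unaware of either consumer. This is the decoupling the guide's introduction describes.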
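
Likewise, the WordPress guide's architecture stores WordPress and MySQL data in PersistentVolumes through PersistentVolumeClaims. As a hypothetical sketch of such a claim, created here with the official `kubernetes` Python client rather than the guide's Helm charts (the `mysql-data` name, `default` namespace, and `10Gi` request are assumptions for illustration):

```python
from kubernetes import client, config

# Authenticate using the kubeconfig downloaded for your LKE cluster.
config.load_kube_config()

# A plain-dict manifest mirroring the YAML a Helm chart would render.
pvc_manifest = {
    "apiVersion": "v1",
    "kind": "PersistentVolumeClaim",
    "metadata": {"name": "mysql-data"},
    "spec": {
        "accessModes": ["ReadWriteOnce"],  # one node mounts it read-write
        "resources": {"requests": {"storage": "10Gi"}},
    },
}

client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="default", body=pvc_manifest
)
```

Passing a dict manifest keeps the sketch independent of client-version model classes. Once the cluster's default storage class binds the claim, a pod's volume definition can reference it by name, which is how both the WordPress and MySQL deployments persist data across pod restarts.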