
Commit b9922f0

Merge branch 'main' into ollama-images
2 parents 056c557 + c607360

File tree

7 files changed: +5 −5 lines

+llms/+internal/callOllamaChatAPI.m

Lines changed: 1 addition & 1 deletion
@@ -1,7 +1,7 @@
 function [text, message, response] = callOllamaChatAPI(model, messages, nvp)
 % This function is undocumented and will change in a future release
 
-%callOllamaChatAPI Calls the Ollama® chat completions API.
+%callOllamaChatAPI Calls the Ollama chat completions API.
 %
 % MESSAGES and FUNCTIONS should be structs matching the json format
 % required by the Ollama Chat Completions API.

+llms/+utils/errorMessageCatalog.m

Lines changed: 1 addition & 1 deletion
@@ -41,7 +41,7 @@
 catalog("llms:mustBeAssistantWithIdAndFunction") = "Field 'tool_call' must be a struct with fields 'id' and 'function'.";
 catalog("llms:mustBeAssistantWithNameAndArguments") = "Field 'function' must be a struct with fields 'name' and 'arguments'.";
 catalog("llms:assistantMustHaveTextNameAndArguments") = "Fields 'name' and 'arguments' must be text with one or more characters.";
-catalog("llms:mustBeValidIndex") = "Value is larger than the number of elements in Messages ({1}).";
+catalog("llms:mustBeValidIndex") = "Index exceeds the number of array elements. Index must be less than or equal to ({1}).";
 catalog("llms:stopSequencesMustHaveMax4Elements") = "Number of elements must not be larger than 4.";
 catalog("llms:endpointMustBeSpecified") = "Unable to find endpoint. Either set environment variable AZURE_OPENAI_ENDPOINT or specify name-value argument ""Endpoint"".";
 catalog("llms:deploymentMustBeSpecified") = "Unable to find deployment name. Either set environment variable AZURE_OPENAI_DEPLOYMENT or specify name-value argument ""Deployment"".";
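
The catalog above maps error identifiers to message templates containing positional placeholders such as {1}, which are filled in when the error is raised. As a rough illustration of that lookup-and-substitute pattern (a hypothetical Python sketch, not the repository's MATLAB implementation):

```python
# Hypothetical sketch of an error-message catalog with positional
# placeholders, mirroring the {1}-style substitution seen above.
catalog = {
    "llms:mustBeValidIndex":
        "Index exceeds the number of array elements. "
        "Index must be less than or equal to ({1}).",
    "llms:stopSequencesMustHaveMax4Elements":
        "Number of elements must not be larger than 4.",
}

def get_message(error_id, *values):
    """Look up a template and substitute {1}, {2}, ... with values."""
    template = catalog[error_id]
    for i, value in enumerate(values, start=1):
        template = template.replace("{%d}" % i, str(value))
    return template

print(get_message("llms:mustBeValidIndex", 3))
# Prints the message with (3) substituted for the {1} placeholder.
```

Keeping all user-facing strings in one catalog, keyed by namespaced IDs like llms:mustBeValidIndex, makes wording changes such as the one in this commit a one-line edit with no changes at the call sites.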

README.md

Lines changed: 1 addition & 1 deletion
@@ -2,7 +2,7 @@
 
 [![Open in MATLAB Online](https://www.mathworks.com/images/responsive/global/open-in-matlab-online.svg)](https://matlab.mathworks.com/open/github/v1?repo=matlab-deep-learning/llms-with-matlab) [![View Large Language Models (LLMs) with MATLAB on File Exchange](https://www.mathworks.com/matlabcentral/images/matlab-file-exchange.svg)](https://www.mathworks.com/matlabcentral/fileexchange/163796-large-language-models-llms-with-matlab)
 
-This repository contains code to connect MATLAB to the [OpenAI™ Chat Completions API](https://platform.openai.com/docs/guides/text-generation/chat-completions-api) (which powers ChatGPT™), OpenAI Images API (which powers DALL·E™), [Azure® OpenAI Service](https://learn.microsoft.com/en-us/azure/ai-services/openai/), and both local and nonlocal [Ollama®](https://ollama.com/) models. This allows you to leverage the natural language processing capabilities of large language models directly within your MATLAB environment.
+This repository contains code to connect MATLAB to the [OpenAI™ Chat Completions API](https://platform.openai.com/docs/guides/text-generation/chat-completions-api) (which powers ChatGPT™), OpenAI Images API (which powers DALL·E™), [Azure® OpenAI Service](https://learn.microsoft.com/en-us/azure/ai-services/openai/), and both local and nonlocal [Ollama](https://ollama.com/) models. This allows you to leverage the natural language processing capabilities of large language models directly within your MATLAB environment.
 
 ## Requirements
 

doc/Ollama.md

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 # Ollama
 
-This repository contains code to connect MATLAB to an [Ollama®](https://ollama.com) server, running large language models (LLMs).
+This repository contains code to connect MATLAB to an [Ollama](https://ollama.com) server, running large language models (LLMs).
 
 To use local models with Ollama, you will need to install and start an Ollama server, and “pull” models into it. Please follow the Ollama documentation for details. You should be familiar with the limitations and risks associated with using this technology, and you agree that you shall be solely responsible for full compliance with any terms that may apply to your use of any specific model.
 

2 binary files changed (6 bytes; contents not shown)

ollamaChat.m

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 classdef (Sealed) ollamaChat < llms.internal.textGenerator
-%ollamaChat Chat completion API from Ollama®.
+%ollamaChat Chat completion API from Ollama.
 %
 % CHAT = ollamaChat(modelName) creates an ollamaChat object for the given model.
 %

0 commit comments