Commit 08160b7

Rename mustBeValidTopP to mustBeValidProbability
We're now using this validator for more than just `TopP`, and a new name is in order.
1 parent a7e6170

5 files changed (+7 −7 lines)

+llms/+internal/textGenerator.m

Lines changed: 1 addition & 1 deletion
@@ -8,7 +8,7 @@
     Temperature {llms.utils.mustBeValidTemperature} = 1

     %TopP Top probability mass to consider for generation.
-    TopP {llms.utils.mustBeValidTopP} = 1
+    TopP {llms.utils.mustBeValidProbability} = 1

     %StopSequences Sequences to stop the generation of tokens.
     StopSequences {llms.utils.mustBeValidStop} = {}

+llms/+utils/mustBeValidTopP.m renamed to +llms/+utils/mustBeValidProbability.m

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-function mustBeValidTopP(value)
+function mustBeValidProbability(value)
 % This function is undocumented and will change in a future release

 % Copyright 2024 The MathWorks, Inc.
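The hunk above shows only the renamed function line; the validator body is unchanged by this commit and not displayed in the diff. For context, a minimal sketch of what a probability validator of this shape typically looks like in MATLAB — the body below is an assumption for illustration, not the actual file contents:

```matlab
function mustBeValidProbability(value)
% Sketch only: validate that value is a probability, i.e. a real
% numeric value in the closed interval [0, 1]. Throws an error
% otherwise, as is conventional for MATLAB property validators.
    mustBeNumeric(value)
    mustBeReal(value)
    mustBeNonnegative(value)
    mustBeLessThanOrEqual(value, 1)
end
```

Because the check is simply "a value in [0, 1]", renaming it from mustBeValidTopP lets the same validator back other probability-valued options such as MinP without a misleading name.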

azureChat.m

Lines changed: 1 addition & 1 deletion
@@ -109,7 +109,7 @@
     nvp.Tools (1,:) {mustBeA(nvp.Tools, "openAIFunction")} = openAIFunction.empty
     nvp.APIVersion (1,1) string {mustBeAPIVersion} = "2024-02-01"
     nvp.Temperature {llms.utils.mustBeValidTemperature} = 1
-    nvp.TopP {llms.utils.mustBeValidTopP} = 1
+    nvp.TopP {llms.utils.mustBeValidProbability} = 1
     nvp.StopSequences {llms.utils.mustBeValidStop} = {}
     nvp.ResponseFormat (1,1) string {mustBeMember(nvp.ResponseFormat,["text","json"])} = "text"
     nvp.PresencePenalty {llms.utils.mustBeValidPenalty} = 0

ollamaChat.m

Lines changed: 3 additions & 3 deletions
@@ -73,7 +73,7 @@
     Model (1,1) string
     Endpoint (1,1) string
     TopK (1,1) {mustBeReal,mustBePositive} = Inf
-    MinP (1,1) {llms.utils.mustBeValidTopP} = 0
+    MinP (1,1) {llms.utils.mustBeValidProbability} = 0
     TailFreeSamplingZ (1,1) {mustBeReal} = 1
 end

@@ -83,8 +83,8 @@
     modelName {mustBeTextScalar}
     systemPrompt {llms.utils.mustBeTextOrEmpty} = []
     nvp.Temperature {llms.utils.mustBeValidTemperature} = 1
-    nvp.TopP {llms.utils.mustBeValidTopP} = 1
-    nvp.MinP {llms.utils.mustBeValidTopP} = 0
+    nvp.TopP {llms.utils.mustBeValidProbability} = 1
+    nvp.MinP {llms.utils.mustBeValidProbability} = 0
     nvp.TopK (1,1) {mustBeReal,mustBePositive} = Inf
     nvp.StopSequences {llms.utils.mustBeValidStop} = {}
     nvp.ResponseFormat (1,1) string {mustBeMember(nvp.ResponseFormat,["text","json"])} = "text"

openAIChat.m

Lines changed: 1 addition & 1 deletion
@@ -94,7 +94,7 @@
     nvp.Tools (1,:) {mustBeA(nvp.Tools, "openAIFunction")} = openAIFunction.empty
     nvp.ModelName (1,1) string {mustBeModel} = "gpt-4o-mini"
     nvp.Temperature {llms.utils.mustBeValidTemperature} = 1
-    nvp.TopP {llms.utils.mustBeValidTopP} = 1
+    nvp.TopP {llms.utils.mustBeValidProbability} = 1
     nvp.StopSequences {llms.utils.mustBeValidStop} = {}
     nvp.ResponseFormat (1,1) string {mustBeMember(nvp.ResponseFormat,["text","json"])} = "text"
     nvp.APIKey {mustBeNonzeroLengthTextScalar}
