
Commit b3d1e01

Add responseLogProbs and logProbs parameters to generateContentReq (#266)
* Add responseLogProbs and logProbs parameters to generateContentReq
* Update docs & tests
* Update docs and add avgLogprobs and logprobsResult as output
* Update variable names in responses.ts
* Move parameters to GenerationConfig
* Update test cases for new parameters to test generationConfig
* Updated GenerateContentResponse test case
* Update case of logprobs
* Put back parameters in test case
1 parent dda0b5c commit b3d1e01

30 files changed (+420, -68 lines)

.changeset/cyan-pants-move.md

Lines changed: 1 addition & 1 deletion

@@ -2,4 +2,4 @@
 "@google/generative-ai": minor
 ---
 
-Add `frequencyPenalty` and `presencePenalty` parameters support for `generateContent()`
+Add `frequencyPenalty`, `presencePenalty`, `responseLogprobs`, and `logprobs` parameters to `GenerationConfig`. Added `avgLogprobs` and `logprobsResult` to `GenerateContentCandidate`. Updated test cases.

common/api-review/generative-ai.api.md

Lines changed: 24 additions & 6 deletions

@@ -6,10 +6,8 @@
 
 // @public
 export interface BaseParams {
-  frequencyPenalty?: number;
   // (undocumented)
   generationConfig?: GenerationConfig;
-  presencePenalty?: number;
   // (undocumented)
   safetySettings?: SafetySetting[];
 }
@@ -371,6 +369,7 @@ export interface FunctionResponsePart {
 
 // @public
 export interface GenerateContentCandidate {
+  avgLogprobs?: number;
   // (undocumented)
   citationMetadata?: CitationMetadata;
   // (undocumented)
@@ -381,6 +380,7 @@ export interface GenerateContentCandidate {
   finishReason?: FinishReason;
   // (undocumented)
   index: number;
+  logprobsResult?: LogprobsResult;
   // (undocumented)
   safetyRatings?: SafetyRating[];
 }
@@ -429,8 +429,12 @@ export interface GenerateContentStreamResult {
 export interface GenerationConfig {
   // (undocumented)
   candidateCount?: number;
+  frequencyPenalty?: number;
+  logprobs?: number;
   // (undocumented)
   maxOutputTokens?: number;
+  presencePenalty?: number;
+  responseLogprobs?: boolean;
   responseMimeType?: string;
   responseSchema?: ResponseSchema;
   // (undocumented)
@@ -460,17 +464,13 @@ export class GenerativeModel {
   cachedContent: CachedContent;
   countTokens(request: CountTokensRequest | string | Array<string | Part>, requestOptions?: SingleRequestOptions): Promise<CountTokensResponse>;
   embedContent(request: EmbedContentRequest | string | Array<string | Part>, requestOptions?: SingleRequestOptions): Promise<EmbedContentResponse>;
-  // (undocumented)
-  frequencyPenalty?: number;
   generateContent(request: GenerateContentRequest | string | Array<string | Part>, requestOptions?: SingleRequestOptions): Promise<GenerateContentResult>;
   generateContentStream(request: GenerateContentRequest | string | Array<string | Part>, requestOptions?: SingleRequestOptions): Promise<GenerateContentStreamResult>;
   // (undocumented)
   generationConfig: GenerationConfig;
   // (undocumented)
   model: string;
   // (undocumented)
-  presencePenalty?: number;
-  // (undocumented)
   safetySettings: SafetySetting[];
   startChat(startChatParams?: StartChatParams): ChatSession;
   // (undocumented)
@@ -577,6 +577,19 @@ export interface InlineDataPart {
   text?: never;
 }
 
+// @public
+export interface LogprobsCandidate {
+  logProbability: number;
+  token: string;
+  tokenID: number;
+}
+
+// @public
+export interface LogprobsResult {
+  chosenCandidates: LogprobsCandidate[];
+  topCandidates: TopCandidates[];
+}
+
 // @public
 export interface ModelParams extends BaseParams {
   // (undocumented)
@@ -730,6 +743,11 @@ export interface ToolConfig {
   functionCallingConfig: FunctionCallingConfig;
 }
 
+// @public
+export interface TopCandidates {
+  candidates: LogprobsCandidate[];
+}
+
 // @public
 export interface UsageMetadata {
   cachedContentTokenCount?: number;
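
The new surface above can be sketched end to end. In the snippet below the interfaces are mirrored locally so the example is self-contained (the real types ship in `@google/generative-ai`), and the request/response values are invented sample data, not actual model output:

```typescript
// Local mirrors of the updated public interfaces (sketch only).
interface GenerationConfig {
  candidateCount?: number;
  frequencyPenalty?: number;
  logprobs?: number;          // top logprobs to return per decoding step
  maxOutputTokens?: number;
  presencePenalty?: number;
  responseLogprobs?: boolean; // opt in to logprobs in the response
}

interface LogprobsCandidate {
  logProbability: number;
  token: string;
  tokenID: number;
}

interface TopCandidates {
  candidates: LogprobsCandidate[];
}

interface LogprobsResult {
  chosenCandidates: LogprobsCandidate[];
  topCandidates: TopCandidates[];
}

// A request config opting in to logprobs output.
const generationConfig: GenerationConfig = {
  responseLogprobs: true,
  logprobs: 3, // ask for the top 3 alternatives at each step
};

// A logprobsResult shaped like the new candidate output (invented data).
const logprobsResult: LogprobsResult = {
  chosenCandidates: [
    { token: "Hello", tokenID: 17, logProbability: -0.12 },
    { token: "!", tokenID: 3, logProbability: -0.48 },
  ],
  topCandidates: [],
};

// avgLogprobs on a candidate is the mean over the chosen tokens.
const avgLogprobs =
  logprobsResult.chosenCandidates.reduce((s, c) => s + c.logProbability, 0) /
  logprobsResult.chosenCandidates.length;

console.log(avgLogprobs.toFixed(2)); // prints "-0.30"
```

In real use the `generationConfig` would be passed to `getGenerativeModel()` or a `generateContent()` request, and `avgLogprobs`/`logprobsResult` would arrive on each response candidate.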

docs/reference/main/generative-ai.baseparams.md

Lines changed: 0 additions & 2 deletions

@@ -16,8 +16,6 @@ export interface BaseParams
 
 | Property | Modifiers | Type | Description |
 | --- | --- | --- | --- |
-| [frequencyPenalty?](./generative-ai.baseparams.frequencypenalty.md) | | number | _(Optional)_ Frequency penalty applied to the next token's logprobs, multiplied by the number of times each token has been seen in the response so far. |
 | [generationConfig?](./generative-ai.baseparams.generationconfig.md) | | [GenerationConfig](./generative-ai.generationconfig.md) | _(Optional)_ |
-| [presencePenalty?](./generative-ai.baseparams.presencepenalty.md) | | number | _(Optional)_ Presence penalty applied to the next token's logprobs if the token has already been seen in the response. |
 | [safetySettings?](./generative-ai.baseparams.safetysettings.md) | | [SafetySetting](./generative-ai.safetysetting.md)<!-- -->\[\] | _(Optional)_ |

Lines changed: 13 additions & 0 deletions

@@ -0,0 +1,13 @@
+<!-- Do not edit this file. It is automatically generated by API Documenter. -->
+
+[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [GenerateContentCandidate](./generative-ai.generatecontentcandidate.md) &gt; [avgLogprobs](./generative-ai.generatecontentcandidate.avglogprobs.md)
+
+## GenerateContentCandidate.avgLogprobs property
+
+Average log probability score of the candidate.
+
+**Signature:**
+
+```typescript
+avgLogprobs?: number;
+```
Lines changed: 13 additions & 0 deletions

@@ -0,0 +1,13 @@
+<!-- Do not edit this file. It is automatically generated by API Documenter. -->
+
+[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [GenerateContentCandidate](./generative-ai.generatecontentcandidate.md) &gt; [logprobsResult](./generative-ai.generatecontentcandidate.logprobsresult.md)
+
+## GenerateContentCandidate.logprobsResult property
+
+Log-likelihood scores for the response tokens and top tokens.
+
+**Signature:**
+
+```typescript
+logprobsResult?: LogprobsResult;
+```

docs/reference/main/generative-ai.generatecontentcandidate.md

Lines changed: 2 additions & 0 deletions

@@ -16,10 +16,12 @@ export interface GenerateContentCandidate
 
 | Property | Modifiers | Type | Description |
 | --- | --- | --- | --- |
+| [avgLogprobs?](./generative-ai.generatecontentcandidate.avglogprobs.md) | | number | _(Optional)_ Average log probability score of the candidate. |
 | [citationMetadata?](./generative-ai.generatecontentcandidate.citationmetadata.md) | | [CitationMetadata](./generative-ai.citationmetadata.md) | _(Optional)_ |
 | [content](./generative-ai.generatecontentcandidate.content.md) | | [Content](./generative-ai.content.md) | |
 | [finishMessage?](./generative-ai.generatecontentcandidate.finishmessage.md) | | string | _(Optional)_ |
 | [finishReason?](./generative-ai.generatecontentcandidate.finishreason.md) | | [FinishReason](./generative-ai.finishreason.md) | _(Optional)_ |
 | [index](./generative-ai.generatecontentcandidate.index.md) | | number | |
+| [logprobsResult?](./generative-ai.generatecontentcandidate.logprobsresult.md) | | [LogprobsResult](./generative-ai.logprobsresult.md) | _(Optional)_ Log-likelihood scores for the response tokens and top tokens. |
 | [safetyRatings?](./generative-ai.generatecontentcandidate.safetyratings.md) | | [SafetyRating](./generative-ai.safetyrating.md)<!-- -->\[\] | _(Optional)_ |

docs/reference/main/generative-ai.baseparams.frequencypenalty.md renamed to docs/reference/main/generative-ai.generationconfig.frequencypenalty.md

Lines changed: 2 additions & 2 deletions

@@ -1,8 +1,8 @@
 <!-- Do not edit this file. It is automatically generated by API Documenter. -->
 
-[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [BaseParams](./generative-ai.baseparams.md) &gt; [frequencyPenalty](./generative-ai.baseparams.frequencypenalty.md)
+[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [GenerationConfig](./generative-ai.generationconfig.md) &gt; [frequencyPenalty](./generative-ai.generationconfig.frequencypenalty.md)
 
-## BaseParams.frequencyPenalty property
+## GenerationConfig.frequencyPenalty property
 
 Frequency penalty applied to the next token's logprobs, multiplied by the number of times each token has been seen in the response so far.

Lines changed: 13 additions & 0 deletions

@@ -0,0 +1,13 @@
+<!-- Do not edit this file. It is automatically generated by API Documenter. -->
+
+[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [GenerationConfig](./generative-ai.generationconfig.md) &gt; [logprobs](./generative-ai.generationconfig.logprobs.md)
+
+## GenerationConfig.logprobs property
+
+Valid only if responseLogprobs is set to true. Sets the number of top logprobs to return at each decoding step in logprobsResult.
+
+**Signature:**
+
+```typescript
+logprobs?: number;
+```

docs/reference/main/generative-ai.generationconfig.md

Lines changed: 4 additions & 0 deletions

@@ -17,7 +17,11 @@ export interface GenerationConfig
 | Property | Modifiers | Type | Description |
 | --- | --- | --- | --- |
 | [candidateCount?](./generative-ai.generationconfig.candidatecount.md) | | number | _(Optional)_ |
+| [frequencyPenalty?](./generative-ai.generationconfig.frequencypenalty.md) | | number | _(Optional)_ Frequency penalty applied to the next token's logprobs, multiplied by the number of times each token has been seen in the response so far. |
+| [logprobs?](./generative-ai.generationconfig.logprobs.md) | | number | _(Optional)_ Valid only if responseLogprobs is set to true. Sets the number of top logprobs to return at each decoding step in logprobsResult. |
 | [maxOutputTokens?](./generative-ai.generationconfig.maxoutputtokens.md) | | number | _(Optional)_ |
+| [presencePenalty?](./generative-ai.generationconfig.presencepenalty.md) | | number | _(Optional)_ Presence penalty applied to the next token's logprobs if the token has already been seen in the response. |
+| [responseLogprobs?](./generative-ai.generationconfig.responselogprobs.md) | | boolean | _(Optional)_ If true, export the logprobs results in the response. |
 | [responseMimeType?](./generative-ai.generationconfig.responsemimetype.md) | | string | _(Optional)_ Output response mimetype of the generated candidate text. Supported mimetype: <code>text/plain</code>: (default) Text output. <code>application/json</code>: JSON response in the candidates. |
 | [responseSchema?](./generative-ai.generationconfig.responseschema.md) | | [ResponseSchema](./generative-ai.responseschema.md) | _(Optional)_ Output response schema of the generated candidate text. Note: This only applies when the specified <code>responseMIMEType</code> supports a schema; currently this is limited to <code>application/json</code>. |
 | [stopSequences?](./generative-ai.generationconfig.stopsequences.md) | | string\[\] | _(Optional)_ |
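
One plausible reading of the two penalty descriptions in the table above can be put into arithmetic. This happens server-side and is not part of the SDK; the helper function below is purely illustrative:

```typescript
// Illustrative sketch: how a frequency penalty (scaled by how many times
// the token has been seen) and a presence penalty (flat, once the token
// has been seen at all) would lower a candidate token's logprob.
function penalizedLogprob(
  logprob: number,
  timesSeen: number,        // occurrences of this token in the response so far
  frequencyPenalty: number, // multiplied by timesSeen
  presencePenalty: number,  // applied once if timesSeen > 0
): number {
  const frequencyTerm = frequencyPenalty * timesSeen;
  const presenceTerm = timesSeen > 0 ? presencePenalty : 0;
  return logprob - frequencyTerm - presenceTerm;
}

console.log(penalizedLogprob(-0.5, 2, 0.25, 0.25)); // prints -1.25
console.log(penalizedLogprob(-0.5, 0, 0.25, 0.25)); // prints -0.5 (unseen: no penalty)
```

Positive penalties therefore push repeated tokens toward lower probability; an unseen token's logprob is untouched.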

docs/reference/main/generative-ai.baseparams.presencepenalty.md renamed to docs/reference/main/generative-ai.generationconfig.presencepenalty.md

Lines changed: 2 additions & 2 deletions

@@ -1,8 +1,8 @@
 <!-- Do not edit this file. It is automatically generated by API Documenter. -->
 
-[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [BaseParams](./generative-ai.baseparams.md) &gt; [presencePenalty](./generative-ai.baseparams.presencepenalty.md)
+[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [GenerationConfig](./generative-ai.generationconfig.md) &gt; [presencePenalty](./generative-ai.generationconfig.presencepenalty.md)
 
-## BaseParams.presencePenalty property
+## GenerationConfig.presencePenalty property
 
 Presence penalty applied to the next token's logprobs if the token has already been seen in the response.

Lines changed: 13 additions & 0 deletions

@@ -0,0 +1,13 @@
+<!-- Do not edit this file. It is automatically generated by API Documenter. -->
+
+[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [GenerationConfig](./generative-ai.generationconfig.md) &gt; [responseLogprobs](./generative-ai.generationconfig.responselogprobs.md)
+
+## GenerationConfig.responseLogprobs property
+
+If true, export the logprobs results in the response.
+
+**Signature:**
+
+```typescript
+responseLogprobs?: boolean;
+```

docs/reference/main/generative-ai.generativemodel.frequencypenalty.md

Lines changed: 0 additions & 11 deletions
This file was deleted.

docs/reference/main/generative-ai.generativemodel.md

Lines changed: 0 additions & 2 deletions

@@ -24,10 +24,8 @@ export declare class GenerativeModel
 | --- | --- | --- | --- |
 | [apiKey](./generative-ai.generativemodel.apikey.md) | | string | |
 | [cachedContent](./generative-ai.generativemodel.cachedcontent.md) | | [CachedContent](./generative-ai.cachedcontent.md) | |
-| [frequencyPenalty?](./generative-ai.generativemodel.frequencypenalty.md) | | number | _(Optional)_ |
 | [generationConfig](./generative-ai.generativemodel.generationconfig.md) | | [GenerationConfig](./generative-ai.generationconfig.md) | |
 | [model](./generative-ai.generativemodel.model.md) | | string | |
-| [presencePenalty?](./generative-ai.generativemodel.presencepenalty.md) | | number | _(Optional)_ |
 | [safetySettings](./generative-ai.generativemodel.safetysettings.md) | | [SafetySetting](./generative-ai.safetysetting.md)<!-- -->\[\] | |
 | [systemInstruction?](./generative-ai.generativemodel.systeminstruction.md) | | [Content](./generative-ai.content.md) | _(Optional)_ |
 | [toolConfig?](./generative-ai.generativemodel.toolconfig.md) | | [ToolConfig](./generative-ai.toolconfig.md) | _(Optional)_ |

docs/reference/main/generative-ai.generativemodel.presencepenalty.md

Lines changed: 0 additions & 11 deletions
This file was deleted.
Lines changed: 13 additions & 0 deletions

@@ -0,0 +1,13 @@
+<!-- Do not edit this file. It is automatically generated by API Documenter. -->
+
+[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [LogprobsCandidate](./generative-ai.logprobscandidate.md) &gt; [logProbability](./generative-ai.logprobscandidate.logprobability.md)
+
+## LogprobsCandidate.logProbability property
+
+The candidate's log probability.
+
+**Signature:**
+
+```typescript
+logProbability: number;
+```
Lines changed: 22 additions & 0 deletions

@@ -0,0 +1,22 @@
+<!-- Do not edit this file. It is automatically generated by API Documenter. -->
+
+[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [LogprobsCandidate](./generative-ai.logprobscandidate.md)
+
+## LogprobsCandidate interface
+
+Candidate for the logprobs token and score.
+
+**Signature:**
+
+```typescript
+export interface LogprobsCandidate
+```
+
+## Properties
+
+| Property | Modifiers | Type | Description |
+| --- | --- | --- | --- |
+| [logProbability](./generative-ai.logprobscandidate.logprobability.md) | | number | The candidate's log probability. |
+| [token](./generative-ai.logprobscandidate.token.md) | | string | The candidate's token string value. |
+| [tokenID](./generative-ai.logprobscandidate.tokenid.md) | | number | The candidate's token ID value. |
+
Lines changed: 13 additions & 0 deletions

@@ -0,0 +1,13 @@
+<!-- Do not edit this file. It is automatically generated by API Documenter. -->
+
+[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [LogprobsCandidate](./generative-ai.logprobscandidate.md) &gt; [token](./generative-ai.logprobscandidate.token.md)
+
+## LogprobsCandidate.token property
+
+The candidate's token string value.
+
+**Signature:**
+
+```typescript
+token: string;
+```
Lines changed: 13 additions & 0 deletions

@@ -0,0 +1,13 @@
+<!-- Do not edit this file. It is automatically generated by API Documenter. -->
+
+[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [LogprobsCandidate](./generative-ai.logprobscandidate.md) &gt; [tokenID](./generative-ai.logprobscandidate.tokenid.md)
+
+## LogprobsCandidate.tokenID property
+
+The candidate's token ID value.
+
+**Signature:**
+
+```typescript
+tokenID: number;
+```
Lines changed: 13 additions & 0 deletions

@@ -0,0 +1,13 @@
+<!-- Do not edit this file. It is automatically generated by API Documenter. -->
+
+[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [LogprobsResult](./generative-ai.logprobsresult.md) &gt; [chosenCandidates](./generative-ai.logprobsresult.chosencandidates.md)
+
+## LogprobsResult.chosenCandidates property
+
+Length = total number of decoding steps. The chosen candidates may or may not be in topCandidates.
+
+**Signature:**
+
+```typescript
+chosenCandidates: LogprobsCandidate[];
+```
Lines changed: 21 additions & 0 deletions

@@ -0,0 +1,21 @@
+<!-- Do not edit this file. It is automatically generated by API Documenter. -->
+
+[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [LogprobsResult](./generative-ai.logprobsresult.md)
+
+## LogprobsResult interface
+
+Logprobs Result
+
+**Signature:**
+
+```typescript
+export interface LogprobsResult
+```
+
+## Properties
+
+| Property | Modifiers | Type | Description |
+| --- | --- | --- | --- |
+| [chosenCandidates](./generative-ai.logprobsresult.chosencandidates.md) | | [LogprobsCandidate](./generative-ai.logprobscandidate.md)<!-- -->\[\] | Length = total number of decoding steps. The chosen candidates may or may not be in topCandidates. |
+| [topCandidates](./generative-ai.logprobsresult.topcandidates.md) | | [TopCandidates](./generative-ai.topcandidates.md)<!-- -->\[\] | Length = total number of decoding steps. |
+
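
Walking a `LogprobsResult` as documented above looks like the sketch below. The shapes are mirrored locally so the example runs on its own, and the sample data is invented:

```typescript
// Local mirrors of the documented shapes (sketch only).
interface LogprobsCandidate {
  logProbability: number;
  token: string;
  tokenID: number;
}
interface TopCandidates {
  candidates: LogprobsCandidate[]; // sorted descending by logProbability
}
interface LogprobsResult {
  chosenCandidates: LogprobsCandidate[]; // one entry per decoding step
  topCandidates: TopCandidates[];        // one entry per decoding step
}

// Invented single-step result: the model chose "cat", but its
// top-ranked alternative at that step was "dog".
const result: LogprobsResult = {
  chosenCandidates: [{ token: "cat", tokenID: 42, logProbability: -1.1 }],
  topCandidates: [
    {
      candidates: [
        { token: "dog", tokenID: 7, logProbability: -0.9 },
        { token: "cat", tokenID: 42, logProbability: -1.1 },
      ],
    },
  ],
};

// For each step, check whether the chosen token was also top-ranked.
const chosenWasTop = result.chosenCandidates.map(
  (chosen, step) =>
    result.topCandidates[step].candidates[0].tokenID === chosen.tokenID,
);

console.log(chosenWasTop); // prints [ false ]
```

Because chosen candidates "may or may not be in topCandidates", consumers should match steps by index rather than assume the chosen token appears in the top list.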
Lines changed: 13 additions & 0 deletions

@@ -0,0 +1,13 @@
+<!-- Do not edit this file. It is automatically generated by API Documenter. -->
+
+[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [LogprobsResult](./generative-ai.logprobsresult.md) &gt; [topCandidates](./generative-ai.logprobsresult.topcandidates.md)
+
+## LogprobsResult.topCandidates property
+
+Length = total number of decoding steps.
+
+**Signature:**
+
+```typescript
+topCandidates: TopCandidates[];
+```

docs/reference/main/generative-ai.md

Lines changed: 3 additions & 0 deletions

@@ -74,6 +74,8 @@
 | [GenerationConfig](./generative-ai.generationconfig.md) | Config options for content-related requests |
 | [GenerativeContentBlob](./generative-ai.generativecontentblob.md) | Interface for sending an image. |
 | [InlineDataPart](./generative-ai.inlinedatapart.md) | Content part interface if the part represents an image. |
+| [LogprobsCandidate](./generative-ai.logprobscandidate.md) | Candidate for the logprobs token and score. |
+| [LogprobsResult](./generative-ai.logprobsresult.md) | Logprobs Result |
 | [ModelParams](./generative-ai.modelparams.md) | Params passed to [GoogleGenerativeAI.getGenerativeModel()](./generative-ai.googlegenerativeai.getgenerativemodel.md)<!-- -->. |
 | [PromptFeedback](./generative-ai.promptfeedback.md) | If the prompt was blocked, this will be populated with <code>blockReason</code> and the relevant <code>safetyRatings</code>. |
 | [RequestOptions](./generative-ai.requestoptions.md) | Params passed to getGenerativeModel() or GoogleAIFileManager(). |
@@ -85,6 +87,7 @@
 | [StartChatParams](./generative-ai.startchatparams.md) | Params for [GenerativeModel.startChat()](./generative-ai.generativemodel.startchat.md)<!-- -->. |
 | [TextPart](./generative-ai.textpart.md) | Content part interface if the part represents a text string. |
 | [ToolConfig](./generative-ai.toolconfig.md) | Tool config. This config is shared for all tools provided in the request. |
+| [TopCandidates](./generative-ai.topcandidates.md) | Candidates with top log probabilities at each decoding step |
 | [UsageMetadata](./generative-ai.usagemetadata.md) | Metadata on the generation request's token usage. |
 
 ## Variables
Lines changed: 13 additions & 0 deletions

@@ -0,0 +1,13 @@
+<!-- Do not edit this file. It is automatically generated by API Documenter. -->
+
+[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [TopCandidates](./generative-ai.topcandidates.md) &gt; [candidates](./generative-ai.topcandidates.candidates.md)
+
+## TopCandidates.candidates property
+
+Sorted by log probability in descending order.
+
+**Signature:**
+
+```typescript
+candidates: LogprobsCandidate[];
+```
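
Since `candidates` is documented as sorted by log probability in descending order, a consumer can rely on index 0 being the model's top-ranked token at that step. A small sketch with locally mirrored types and invented data:

```typescript
// Local mirror of LogprobsCandidate (sketch only).
interface LogprobsCandidate {
  logProbability: number;
  token: string;
  tokenID: number;
}

// Invented TopCandidates entry for one decoding step.
const top: { candidates: LogprobsCandidate[] } = {
  candidates: [
    { token: "the", tokenID: 1, logProbability: -0.2 },
    { token: "a", tokenID: 2, logProbability: -1.7 },
    { token: "an", tokenID: 3, logProbability: -2.4 },
  ],
};

// Check the documented invariant: descending by logProbability.
const isDescending = top.candidates.every(
  (c, i) => i === 0 || top.candidates[i - 1].logProbability >= c.logProbability,
);

console.log(top.candidates[0].token, isDescending); // prints: the true
```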
