InferenceConfiguration

The configuration parameters that control how the foundation model behaves during evaluation, including response generation settings.

Types

class Builder
object Companion

Properties


The maximum number of tokens to generate in the model response during evaluation.


The list of sequences that cause the model to stop generating further tokens once one of them appears in the output.


The temperature value that controls randomness in the model's responses. Lower values produce more deterministic outputs.

val topP: Float?

The top-p sampling parameter that controls the diversity of the model's responses by limiting the cumulative probability of token choices.
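
The snippet below is a minimal sketch of constructing an instance through the nested Builder. Apart from topP, the property names used here (temperature, maxTokens, stopSequences), the public Builder() constructor, and the build() call are assumptions based on the property descriptions above and common Kotlin builder conventions; they are not confirmed by this page.

// Sketch only: names other than topP, Builder(), and build() are assumed.
val inferenceConfig = InferenceConfiguration.Builder().apply {
    temperature = 0.2f                    // lower values -> more deterministic responses (assumed name)
    topP = 0.9f                           // cumulative-probability cutoff for token sampling
    maxTokens = 512                       // cap on tokens generated per response (assumed name)
    stopSequences = listOf("END")         // stop generating when a sequence is produced (assumed name)
}.build()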

Functions

open operator override fun equals(other: Any?): Boolean
open override fun hashCode(): Int
open override fun toString(): String