Module spin_sdk.llm
Module for working with the Spin large language model API
Functions
def generate_embeddings(model: str, text: Sequence[str]) ‑> EmbeddingsResult

An Err(Error_ModelNotSupported) will be raised if the component does not have access to the specified model.
An Err(Error_RuntimeError(str)) will be raised if there are any runtime errors.
An Err(Error_InvalidInput(str)) will be raised if an invalid input is provided.
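
A minimal usage sketch. The model name "all-minilm-l6-v2", the ai_models manifest entry, and the result's embeddings field are assumptions based on the underlying llm interface, not guarantees of this module:

    from spin_sdk import llm

    # Assumes the component manifest grants access to this model,
    # e.g. ai_models = ["all-minilm-l6-v2"] in spin.toml.
    result = llm.generate_embeddings("all-minilm-l6-v2", ["first sentence", "second sentence"])
    # result.embeddings is assumed to hold one embedding vector per input string.
    print(len(result.embeddings))
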
def infer(model: str, prompt: str) ‑> InferencingResult

An Err(Error_ModelNotSupported) will be raised if the component does not have access to the specified model.
An Err(Error_RuntimeError(str)) will be raised if there are any runtime errors.
An Err(Error_InvalidInput(str)) will be raised if an invalid input is provided.
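
A sketch of calling infer from an HTTP component. The "llama2-chat" model name, the ai_models grant, and the result's text field are assumptions drawn from typical Spin LLM usage rather than from this page:

    from spin_sdk import http, llm
    from spin_sdk.http import Request, Response

    class IncomingHandler(http.IncomingHandler):
        def handle_request(self, request: Request) -> Response:
            # Assumes ai_models = ["llama2-chat"] in the component manifest.
            result = llm.infer("llama2-chat", "Write a haiku about WebAssembly")
            # result.text is assumed to hold the generated completion.
            return Response(200, {"content-type": "text/plain"}, bytes(result.text, "utf-8"))
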
def infer_with_options(model: str, prompt: str, options: InferencingParams | None) ‑> InferencingResult

An Err(Error_ModelNotSupported) will be raised if the component does not have access to the specified model.
An Err(Error_RuntimeError(str)) will be raised if there are any runtime errors.
An Err(Error_InvalidInput(str)) will be raised if an invalid input is provided.
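
A sketch of passing explicit options. The prompt and the "llama2-chat" model name are placeholders; passing None for options presumably falls back to the defaults shown on InferencingParams below:

    from spin_sdk import llm
    from spin_sdk.llm import InferencingParams

    # Assumes the llama2-chat model is listed in the component's ai_models.
    options = InferencingParams(max_tokens=250, temperature=0.4)
    result = llm.infer_with_options("llama2-chat", "Summarize the plot of Hamlet in two sentences.", options)
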
Classes
class InferencingParams(max_tokens: int = 100, repeat_penalty: float = 1.1, repeat_penalty_last_n_token_count: int = 64, temperature: float = 0.8, top_k: int = 40, top_p: float = 0.9)
Optional parameters, accepted by infer_with_options, that control inferencing behaviour.
Source code:

    @dataclass
    class InferencingParams:
        max_tokens: int = 100
        repeat_penalty: float = 1.1
        repeat_penalty_last_n_token_count: int = 64
        temperature: float = 0.8
        top_k: int = 40
        top_p: float = 0.9
Class variables
var max_tokens : int
var repeat_penalty : float
var repeat_penalty_last_n_token_count : int
var temperature : float
var top_k : int
var top_p : float
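
Because InferencingParams is a dataclass with defaults, only the fields you want to change need to be passed; a brief sketch:

    from spin_sdk.llm import InferencingParams

    params = InferencingParams(temperature=0.2, top_k=20)
    # Fields not passed keep their defaults, e.g. max_tokens stays 100 and top_p stays 0.9.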