

StrayCat

The stray cat wanders around tools and hooks, making trouble.

Constructors

new StrayCat()

new StrayCat(userId, ws?): StrayCat

Parameters

| Parameter | Type |
| ------ | ------ |
| `userId` | `string` |
| `ws?` | `ElysiaWS` |

Returns

StrayCat

Properties

| Property | Modifier | Type | Default value |
| ------ | ------ | ------ | ------ |
| `activeForm?` | `public` | `string` | `undefined` |
| `userId` | `public` | `string` | `undefined` |
| `workingMemory` | `public` | `WorkingMemory` | `undefined` |
| `wsQueue` | `public` | `WSMessage[]` | `[]` |

Accessors

agentManager

Get Signature

get agentManager(): AgentManager

Returns

AgentManager


currentEmbedder

Get Signature

get currentEmbedder(): Embeddings

Returns

Embeddings


currentLLM

Get Signature

get currentLLM(): BaseChatModel

Returns

BaseChatModel


lastUserMessage

Get Signature

get lastUserMessage(): Message

Returns

Message


plugins

Get Signature

get plugins(): object[]

Returns

object[]


rabbitHole

Get Signature

get rabbitHole(): RabbitHole

Returns

RabbitHole


vectorMemory

Get Signature

get vectorMemory(): VectorMemory

Returns

VectorMemory


whiteRabbit

Get Signature

get whiteRabbit(): WhiteRabbit

Returns

WhiteRabbit

Methods

addHistory()

addHistory(message): void

Adds messages to the chat history.

Parameters

| Parameter | Type | Description |
| ------ | ------ | ------ |
| `message` | `MemoryMessage[]` | The messages to add. |

Returns

void


addInteraction()

addInteraction(interaction): void

Adds an interaction to the working memory.

Parameters

| Parameter | Type | Description |
| ------ | ------ | ------ |
| `interaction` | `ModelInteraction` | The interaction to add. |

Returns

void


addWebSocket()

addWebSocket(value): void

Establishes a new WebSocket connection.

Parameters

| Parameter | Type | Description |
| ------ | ------ | ------ |
| `value` | `undefined \| ElysiaWS` | The WebSocket instance. |

Returns

void


classify()

classify<S, T>(sentence, labels, examples?): Promise<null | S>

Experimental

Classifies the given sentence into one of the provided labels.

Type Parameters

| Type Parameter |
| ------ |
| `S extends string` |
| `T extends [S, ...S[]]` |

Parameters

| Parameter | Type | Description |
| ------ | ------ | ------ |
| `sentence` | `string` | The sentence to classify. |
| `labels` | `T` | The labels to classify the sentence into. |
| `examples?` | `{ [key in string]: S[] }` | Optional examples to help the LLM classify the sentence. |

Returns

Promise<null | S>

The label of the sentence or null if it could not be classified.
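Since `classify` is experimental and its prompt is internal, its return contract can be illustrated with a self-contained sketch: the model's answer is accepted only if it matches one of the supplied labels, otherwise `null` comes back. `pickLabel` is a hypothetical helper, not part of the API.

```typescript
// Hypothetical helper mirroring classify's return contract:
// an answer counts only if it matches one of the given labels.
function pickLabel<S extends string>(answer: string, labels: readonly S[]): S | null {
  const cleaned = answer.trim().toLowerCase()
  // Return the matching label, or null when the model strayed off-list.
  return labels.find(l => l.toLowerCase() === cleaned) ?? null
}

// Mirrors classify('I love this!', ['positive', 'negative'] as const)
const labels = ['positive', 'negative'] as const
const verdict = pickLabel(' Positive ', labels) // → 'positive'
const unknown = pickLabel('maybe', labels)      // → null
```

The real method additionally builds a few-shot prompt from `examples` before asking the LLM; only the final label-matching step is sketched here.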


clearHistory()

clearHistory(): void

Clears the chat history.

Returns

void


getHistory()

getHistory(k?): MemoryMessage[]

If passed a number k, retrieves the last k messages in the chat history. Otherwise, retrieves all messages in the chat history.

Parameters

| Parameter | Type | Description |
| ------ | ------ | ------ |
| `k?` | `number` | The number of messages to retrieve. |

Returns

MemoryMessage[]

the messages present in the chat history
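The `k` handling above amounts to a slice from the end of the history. A minimal stand-alone sketch of that behavior, using a plain array in place of the real chat history (the message shape here is simplified, not the full `MemoryMessage`):

```typescript
// Simplified message shape; the real MemoryMessage carries more fields.
interface Msg { who: string; what: string }

// Sketch of getHistory's documented semantics: with k, the last k
// messages; without k, the whole history (as a copy).
function lastMessages(history: Msg[], k?: number): Msg[] {
  return k === undefined ? [...history] : history.slice(-k)
}

const history: Msg[] = [
  { who: 'user', what: 'hi' },
  { who: 'cat', what: 'meow' },
  { who: 'user', what: 'bye' },
]
lastMessages(history)    // all 3 messages
lastMessages(history, 2) // the 'meow' and 'bye' messages
```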


getInteraction()

getInteraction(k?): ModelInteraction[]

If passed a number k, retrieves the last k interactions in the working memory. Otherwise, retrieves all interactions in the working memory.

Parameters

| Parameter | Type | Description |
| ------ | ------ | ------ |
| `k?` | `number` | The number of interactions to retrieve. |

Returns

ModelInteraction[]

the interactions present in the working memory


getPluginInfo()

getPluginInfo(id): undefined | { active: boolean; manifest: { authorName: string; authorUrl: string; description: string; name: string; pluginUrl: string; tags: string[]; thumb: string; version: string; }; settings: {}; }

Retrieves information about a plugin.

Parameters

| Parameter | Type | Description |
| ------ | ------ | ------ |
| `id` | `string` | The ID of the plugin. |

Returns

undefined | { active: boolean; manifest: { authorName: string; authorUrl: string; description: string; name: string; pluginUrl: string; tags: string[]; thumb: string; version: string; }; settings: {}; }

An object containing the plugin's active status, manifest, and settings.

Returns undefined if the plugin is not found.


llm()

Call Signature

llm(prompt, stream?): Promise<AIMessageChunk>

Executes the LLM with the given prompt and returns the response.

Parameters
| Parameter | Type | Description |
| ------ | ------ | ------ |
| `prompt` | `BaseLanguageModelInput` | The prompt or messages to be passed to the LLM. |
| `stream?` | `false` | Optional parameter to enable streaming mode. |
Returns

Promise<AIMessageChunk>

Call Signature

llm(prompt, stream?): Promise<IterableReadableStream<AIMessageChunk>>

Executes the LLM with the given prompt and returns the response.

Parameters
| Parameter | Type | Description |
| ------ | ------ | ------ |
| `prompt` | `BaseLanguageModelInput` | The prompt or messages to be passed to the LLM. |
| `stream?` | `true` | Optional parameter to enable streaming mode. |
Returns

Promise<IterableReadableStream<AIMessageChunk>>
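The two call signatures form a TypeScript overload: the literal type of `stream` selects whether the method resolves to a single chunk or to a stream of chunks. A hedged, self-contained sketch of that pattern (`Chunk` stands in for `AIMessageChunk`, and the echo body is purely illustrative, not the real LLM call):

```typescript
// Stand-in for AIMessageChunk.
type Chunk = { content: string }

// Overloads: the literal type of `stream` picks the return type.
function llmSketch(prompt: string, stream?: false): Promise<Chunk>
function llmSketch(prompt: string, stream: true): Promise<AsyncIterable<Chunk>>
function llmSketch(prompt: string, stream?: boolean): Promise<Chunk | AsyncIterable<Chunk>> {
  if (!stream) return Promise.resolve({ content: `echo: ${prompt}` })
  // Streaming mode yields the answer piece by piece.
  async function* chunks() {
    yield { content: 'echo: ' }
    yield { content: prompt }
  }
  return Promise.resolve(chunks())
}
```

Callers then get the right type for free: `await llmSketch('hi')` is one chunk, while `for await (const c of await llmSketch('hi', true))` iterates the stream.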


queryDb()

queryDb<T>(question, type, source): Promise<string>

Experimental

Executes a SQL query based on a natural language question.

Type Parameters

| Type Parameter |
| ------ |
| `T extends "oracle" \| "postgres" \| "sqlite" \| "mysql" \| "mssql"` |

Parameters

| Parameter | Type | Description |
| ------ | ------ | ------ |
| `question` | `string` | The user question. |
| `type` | `T` | The SQL dialect to use. |
| `source` | `Omit<Extract<MysqlConnectionOptions, { type: T; }> \| Extract<PostgresConnectionOptions, { type: T; }> \| Extract<CockroachConnectionOptions, { type: T; }> \| Extract<SqliteConnectionOptions, { type: T; }> \| Extract<SqlServerConnectionOptions, { type: T; }> \| Extract<SapConnectionOptions, { type: T; }> \| Extract<OracleConnectionOptions, { type: T; }> \| Extract<CordovaConnectionOptions, { type: T; }> \| Extract<NativescriptConnectionOptions, { type: T; }> \| Extract<ReactNativeConnectionOptions, { type: T; }> \| Extract<SqljsConnectionOptions, { type: T; }> \| Extract<MongoConnectionOptions, { type: T; }> \| Extract<AuroraMysqlConnectionOptions, { type: T; }> \| Extract<AuroraPostgresConnectionOptions, { type: T; }> \| Extract<ExpoConnectionOptions, { type: T; }> \| Extract<BetterSqlite3ConnectionOptions, { type: T; }> \| Extract<CapacitorConnectionOptions, { type: T; }> \| Extract<SpannerConnectionOptions, { type: T; }>, "type">` | The data source to execute the query on. |

Returns

Promise<string>

The result of the SQL query in natural language.
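The `source` type looks intimidating, but it is a standard narrowing idiom: `Extract` keeps only the connection-options variant whose `type` field matches the dialect `T`, and `Omit` then removes that now-redundant field. A reduced sketch with two stand-in dialects (the option shapes are hypothetical, not TypeORM's real ones):

```typescript
// Two hypothetical dialect option shapes, discriminated by `type`.
type SqliteOptions = { type: 'sqlite'; database: string }
type PostgresOptions = { type: 'postgres'; host: string; port: number }
type AnyOptions = SqliteOptions | PostgresOptions

// Same idiom as queryDb's `source`: pick the variant for T, drop `type`.
function withDialect<T extends AnyOptions['type']>(
  type: T,
  source: Omit<Extract<AnyOptions, { type: T }>, 'type'>,
): string {
  // A real implementation would open a database connection here.
  return `${type}: ${JSON.stringify(source)}`
}

withDialect('sqlite', { database: ':memory:' })
// withDialect('sqlite', { host: 'x', port: 1 }) would be a compile-time error:
// the second argument must match the sqlite variant.
```

The payoff is that the compiler rejects, say, Postgres connection options when `type` is `"sqlite"`, without needing a separate overload per dialect.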


recallRelevantMemories()

recallRelevantMemories(query?): Promise<void>

Recalls relevant memories based on the given query. If no query is provided, the text of the last user message is used as the query.

Parameters

| Parameter | Type | Description |
| ------ | ------ | ------ |
| `query?` | `string` | The query string to search for relevant memories. |

Returns

Promise<void>


run()

run(msg, save, returnWhy): Promise<WSMessage>

Processes the user message and returns the response.

Parameters

| Parameter | Type | Default value | Description |
| ------ | ------ | ------ | ------ |
| `msg` | `Message` | `undefined` | The message to send. |
| `save` | `boolean` | `true` | Whether to save the message in the chat history. |
| `returnWhy` | `boolean` | `true` | Whether to return the `why` field in the response. |

Returns

Promise<WSMessage>

The response message.


send()

send(msg): void

Sends a message through the websocket connection.

If the websocket connection is not open, the message is queued.

If the message is of type 'chat', it is also stored in the chat history.

Parameters

| Parameter | Type | Description |
| ------ | ------ | ------ |
| `msg` | `WSMessage` | The message to send. |

Returns

void
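The queue-or-deliver behavior described above can be sketched in isolation; the socket and message shapes below are simplified stand-ins for `ElysiaWS` and `WSMessage`, and `sendSketch` is a hypothetical free function, not the real method:

```typescript
// Simplified stand-ins for the real types.
type WSMsg = { type: 'chat' | 'notification'; content: string }
interface Socket { open: boolean; sent: WSMsg[] }

// Sketch of send(): queue while the socket is closed or absent,
// deliver when open, and record 'chat' messages in the history.
function sendSketch(
  ws: Socket | undefined,
  queue: WSMsg[],
  history: WSMsg[],
  msg: WSMsg,
): void {
  if (!ws || !ws.open) queue.push(msg)
  else ws.sent.push(msg)
  if (msg.type === 'chat') history.push(msg)
}

const queue: WSMsg[] = []
const history: WSMsg[] = []
// No socket yet: the chat message is queued and recorded in the history.
sendSketch(undefined, queue, history, { type: 'chat', content: 'meow' })
```

In the real class the queue corresponds to the `wsQueue` property, which is flushed once a socket is attached via `addWebSocket`.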

Released under the GPL-3.0 License.