// Copyright 2024 The Chromium Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
// https://github.com/webmachinelearning/prompt-api
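
// Options accepted by LanguageModel.clone(). The signal, when provided,
// aborts an in-flight clone() request.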
dictionary LanguageModelCloneOptions {
  AbortSignal signal;
};
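
// Options accepted by prompt(), promptStreaming(), and measureInputUsage().
// responseConstraint can hold a constraint the response must satisfy (e.g. a
// JSON schema object; see the explainer linked above); signal aborts the
// request.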
dictionary LanguageModelPromptOptions {
  object responseConstraint;
  AbortSignal signal;
};
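
// Options accepted by append(); the signal aborts the append operation.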
dictionary LanguageModelAppendOptions {
  AbortSignal signal;
};
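
// A LanguageModel object represents a session with the browser-provided
// language model. Illustrative JS usage (not part of this IDL):
//
//   const session = await LanguageModel.create();
//   const reply = await session.prompt("Write a haiku about the sea.");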
[
  Exposed(Window AIPromptAPI, Worker AIPromptAPIForWorkers),
  RuntimeEnabled=AIPromptAPI,
  SecureContext
]
interface LanguageModel : EventTarget {
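  // Creates a new session, possibly after downloading the model; rejects if
  // the model cannot be made available for the requested options.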
  [
    MeasureAs=LanguageModel_Create,
    CallWith=ScriptState,
    RaisesException
  ]
  static Promise<LanguageModel> create(
      optional LanguageModelCreateOptions options = {});
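
  // Reports whether a model satisfying the given options is available,
  // downloadable, or currently downloading.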
  [
    MeasureAs=LanguageModel_Availability,
    CallWith=ScriptState,
    RaisesException
  ]
  static Promise<Availability> availability(
      optional LanguageModelCreateCoreOptions options = {});
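
  // Exposes the model's default and maximum sampling parameters; resolves
  // with null when no model is available.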
  [
    MeasureAs=LanguageModel_Params,
    CallWith=ScriptState,
    RaisesException
  ]
  static Promise<LanguageModelParams?> params();
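
  // Prompts the model and resolves with the complete response text.
  // Illustrative JS usage (not part of this IDL; `schema` stands for a JSON
  // schema object passed as a responseConstraint):
  //
  //   const json = await session.prompt(
  //       "List three colors as JSON.", { responseConstraint: schema });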
  [
    MeasureAs=LanguageModel_Prompt,
    CallWith=ScriptState,
    RaisesException
  ]
  Promise<DOMString> prompt(
      LanguageModelPrompt input,
      optional LanguageModelPromptOptions options = {});
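
  // Like prompt(), but returns a ReadableStream that yields the response as
  // it is generated. Illustrative JS usage (not part of this IDL):
  //
  //   for await (const chunk of session.promptStreaming("Tell me a story.")) {
  //     console.log(chunk);
  //   }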
  [
    MeasureAs=LanguageModel_PromptStreaming,
    CallWith=ScriptState,
    RaisesException
  ]
  ReadableStream promptStreaming(
      LanguageModelPrompt input,
      optional LanguageModelPromptOptions options = {});
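
  // Adds the given input to the session's context without requesting a
  // response from the model.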
  [
    MeasureAs=LanguageModel_Append,
    CallWith=ScriptState,
    RaisesException
  ]
  Promise<undefined> append(
      LanguageModelPrompt input,
      optional LanguageModelAppendOptions options = {});
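
  // Measures how much of the session's input quota the given input would
  // consume, without actually prompting the model.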
  [
    MeasureAs=LanguageModel_MeasureInputUsage,
    CallWith=ScriptState,
    RaisesException
  ]
  Promise<double> measureInputUsage(
      LanguageModelPrompt input,
      optional LanguageModelPromptOptions options = {});
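
  // inputUsage tracks how much of the session's input quota has been consumed
  // so far; inputQuota is the total allowed (unrestricted, so it may be
  // Infinity).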
  [MeasureAs=LanguageModel_InputUsage]
  readonly attribute double inputUsage;
  [MeasureAs=LanguageModel_InputQuota]
  readonly attribute unrestricted double inputQuota;
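
  // The sampling parameters in effect for this session.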
  [MeasureAs=LanguageModel_TopK]
  readonly attribute unsigned long topK;
  [MeasureAs=LanguageModel_Temperature]
  readonly attribute float temperature;
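
  // Handler for "quotaoverflow" events, fired when new input pushes the
  // session past inputQuota and older context has to be evicted (see the
  // explainer linked above).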
  attribute EventHandler onquotaoverflow;
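
  // Creates a new session derived from this one (see the explainer linked
  // above for exactly what state is carried over); the signal aborts the
  // operation.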
  [
    MeasureAs=LanguageModel_Clone,
    CallWith=ScriptState,
    RaisesException
  ]
  Promise<LanguageModel> clone(
      optional LanguageModelCloneOptions options = {});
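
  // Releases the resources held by this session; the session cannot be used
  // afterwards.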
  [
    MeasureAs=LanguageModel_Destroy,
    CallWith=ScriptState,
    RaisesException
  ]
  undefined destroy();
};