File: index.rst

Firefox AI Platform
===================

This component is an experimental machine learning inference engine that runs locally
in the browser, built on
`Transformers.js <https://huggingface.co/docs/transformers.js/index>`_ and
the `ONNX runtime <https://onnxruntime.ai/>`_. You can use the component to run
inference tasks directly in the context of the browser. To try out some inference tasks,
browse the `1000+ models <https://huggingface.co/models?library=transformers.js>`_
on the Hugging Face Hub that are compatible with this runtime.

To enable the engine, flip the ``browser.ml.enable`` preference to ``true`` in ``about:config``,
then visit **about:inference** (Nightly only) or add the following snippet
to your (privileged) JavaScript code in Firefox, or run it in the browser console:

.. code-block:: javascript

  // Import the engine factory from the ML engine process module.
  const { createEngine } = ChromeUtils.importESModule("chrome://global/content/ml/EngineProcess.sys.mjs");
  // Create an inference engine for the summarization task.
  const engine = await createEngine({ taskName: "summarization" });
  // Run the engine on the input text and print the generated summary.
  const request = { args: ["This is the text to summarize"] };
  const res = await engine.run(request);
  console.log(res[0]["summary_text"]);
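
The preference can also be flipped from privileged code instead of ``about:config``. Here is a
minimal sketch, assuming the standard ``Services.prefs`` API that is available to
chrome JavaScript (for example in the browser console):

.. code-block:: javascript

  // Sketch: enable the ML engine preference from privileged JavaScript.
  // Assumes the standard Services.prefs API available in chrome contexts.
  Services.prefs.setBoolPref("browser.ml.enable", true);
  // Read the preference back to confirm the engine is enabled.
  console.log(Services.prefs.getBoolPref("browser.ml.enable")); // true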


Learn more about the platform:

.. toctree::
   :maxdepth: 1

   architecture
   api
   notifications
   models
   perf