Detailed insights and metrics for @jitl/quickjs-wasmfile-release-asyncify
npm install @jitl/quickjs-wasmfile-release-asyncify
GitHub: 1,321 stars, 591 commits, 100 forks, 16 watching, 24 branches, 13 contributors. Updated on 28 Nov 2024.
Languages: TypeScript (77.83%), Makefile (16.51%), C (4.61%), JavaScript (0.92%), Shell (0.14%).
Downloads:
- Last day: 5,208 (16% compared to previous day)
- Last week: 28,716 (17% compared to previous week)
- Last month: 108,322 (16.9% compared to previous month)
- Last year: 456,992 (0% compared to previous year)
Javascript/Typescript bindings for QuickJS, a modern Javascript interpreter, compiled to WebAssembly.
Github | NPM | API Documentation | Variants | Examples
```typescript
import { getQuickJS } from "quickjs-emscripten"

async function main() {
  const QuickJS = await getQuickJS()
  const vm = QuickJS.newContext()

  const world = vm.newString("world")
  vm.setProp(vm.global, "NAME", world)
  world.dispose()

  const result = vm.evalCode(`"Hello " + NAME + "!"`)
  if (result.error) {
    console.log("Execution failed:", vm.dump(result.error))
    result.error.dispose()
  } else {
    console.log("Success:", vm.dump(result.value))
    result.value.dispose()
  }

  vm.dispose()
}

main()
```
Install from npm: npm install --save quickjs-emscripten or yarn add quickjs-emscripten.
The root entrypoint of this library is the getQuickJS
function, which returns
a promise that resolves to a QuickJSWASMModule when
the QuickJS WASM module is ready.
Once getQuickJS
has been awaited at least once, you also can use the getQuickJSSync
function to directly access the singleton in your synchronous code.
See QuickJSWASMModule.evalCode for the evaluation options used in the example below:
```typescript
import { getQuickJS, shouldInterruptAfterDeadline } from "quickjs-emscripten"

getQuickJS().then((QuickJS) => {
  const result = QuickJS.evalCode("1 + 1", {
    shouldInterrupt: shouldInterruptAfterDeadline(Date.now() + 1000),
    memoryLimitBytes: 1024 * 1024,
  })
  console.log(result)
})
```
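For instance, here's a minimal sketch (not from the original docs; the evalSync helper is hypothetical) of using getQuickJSSync once the module has already been loaded:

```typescript
import { getQuickJS, getQuickJSSync } from "quickjs-emscripten"

// Load the WebAssembly module once, up front.
await getQuickJS()

// Later, synchronous code can grab the already-loaded singleton.
function evalSync(code: string): unknown {
  const QuickJS = getQuickJSSync()
  const vm = QuickJS.newContext()
  try {
    // unwrapResult throws if evaluation failed; consume disposes the handle.
    return vm.unwrapResult(vm.evalCode(code)).consume(vm.dump)
  } finally {
    vm.dispose()
  }
}

console.log(evalSync("6 * 7")) // 42
```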
You can use QuickJSContext to build a scripting environment by modifying globals and exposing functions into the QuickJS interpreter.
Each QuickJSContext
instance has its own environment -- globals, built-in
classes -- and actions from one context won't leak into other contexts or
runtimes (with one exception, see Asyncify).
Every context is created inside a QuickJSRuntime. A runtime represents a Javascript heap, and you can even share values between contexts in the same runtime.
```typescript
const vm = QuickJS.newContext()
let state = 0

const fnHandle = vm.newFunction("nextId", () => {
  return vm.newNumber(++state)
})

vm.setProp(vm.global, "nextId", fnHandle)
fnHandle.dispose()

const nextId = vm.unwrapResult(vm.evalCode(`nextId(); nextId(); nextId()`))
console.log("vm result:", vm.getNumber(nextId), "native state:", state)

nextId.dispose()
vm.dispose()
```
When you create a context from a top-level API like in the example above,
instead of by calling runtime.newContext()
, a runtime is automatically created
for the lifetime of the context, and disposed of when you dispose the context.
The runtime has APIs for CPU and memory limits that apply to all contexts within the runtime in aggregate. You can also use the runtime to configure EcmaScript module loading.
```typescript
const runtime = QuickJS.newRuntime()
// "Should be enough for everyone" -- attributed to B. Gates
runtime.setMemoryLimit(1024 * 640)
// Limit stack size
runtime.setMaxStackSize(1024 * 320)
// Interrupt computation after 1024 calls to the interrupt handler
let interruptCycles = 0
runtime.setInterruptHandler(() => ++interruptCycles > 1024)
// Toy module system that always returns the module name
// as the default export
runtime.setModuleLoader((moduleName) => `export default '${moduleName}'`)
const context = runtime.newContext()
const ok = context.evalCode(`
import fooName from './foo.js'
globalThis.result = fooName
`)
context.unwrapResult(ok).dispose()
// logs "foo.js"
console.log(context.getProp(context.global, "result").consume(context.dump))
context.dispose()
runtime.dispose()
```
When you evaluate code as an ES Module, the result will be a handle to the module's exports, or a handle to a promise that resolves to the module's exports if the module depends on a top-level await.
```typescript
const context = QuickJS.newContext()
const result = context.evalCode(
  `
  export const name = 'Jake'
  export const favoriteBean = 'wax bean'
  export default 'potato'
`,
  "jake.js",
  { type: "module" },
)
const moduleExports = context.unwrapResult(result)
console.log(context.dump(moduleExports))
// -> { name: 'Jake', favoriteBean: 'wax bean', default: 'potato' }
moduleExports.dispose()
```
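For a module that uses top-level await, the result is a promise handle instead. Here's a minimal sketch (not from the original docs) that combines evalCode with executePendingJobs and getPromiseState, both described later in this document, to read the eventual exports:

```typescript
const promiseHandle = context.unwrapResult(
  context.evalCode(`await Promise.resolve(); export const ready = true`, "tla.js", {
    type: "module",
  }),
)
// Run the queued jobs so the module finishes evaluating and its promise settles.
context.runtime.executePendingJobs()
// Peek at the settled promise synchronously to get the exports handle.
const exportsHandle = context.unwrapResult(context.getPromiseState(promiseHandle))
console.log(context.dump(exportsHandle)) // -> { ready: true }
exportsHandle.dispose()
promiseHandle.dispose()
```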
Many methods in this library return handles to memory allocated inside the
WebAssembly heap. These types cannot be garbage-collected as usual in
Javascript. Instead, you must manually manage their memory by calling a
.dispose()
method to free the underlying resources. Once a handle has been
disposed, it cannot be used anymore. Note that in the example above, we call
.dispose()
on each handle once it is no longer needed.
Calling QuickJSContext.dispose()
will throw a RuntimeError if you've forgotten to
dispose any handles associated with that VM, so it's good practice to create a
new VM instance for each of your tests, and to call vm.dispose()
at the end
of every test.
```typescript
const vm = QuickJS.newContext()
const numberHandle = vm.newNumber(42)
// Note: numberHandle not disposed, so it leaks memory.
vm.dispose()
// throws RuntimeError: abort(Assertion failed: list_empty(&rt->gc_obj_list), at: quickjs/quickjs.c,1963,JS_FreeRuntime)
```
Here are some strategies to reduce the toil of calling .dispose() on each handle you create:

The using statement: the using statement is a Stage 3 (as of 2023-12-29) proposal for Javascript that declares a constant variable and automatically calls the [Symbol.dispose]() method of an object when it goes out of scope. Read more in this Typescript release announcement. Here's the "Interfacing with the interpreter" example re-written using using:
```typescript
using vm = QuickJS.newContext()
let state = 0

// The block here isn't needed for correctness, but it shows
// how to get a tighter bound on the lifetime of `fnHandle`.
{
  using fnHandle = vm.newFunction("nextId", () => {
    return vm.newNumber(++state)
  })

  vm.setProp(vm.global, "nextId", fnHandle)
  // fnHandle.dispose() is called automatically when the block exits
}

using nextId = vm.unwrapResult(vm.evalCode(`nextId(); nextId(); nextId()`))
console.log("vm result:", vm.getNumber(nextId), "native state:", state)
// nextId.dispose() is called automatically when the block exits
// vm.dispose() is called automatically when the block exits
```
A
Scope
instance manages a set of disposables and calls their .dispose()
method in the reverse order in which they're added to the scope. Here's the
"Interfacing with the interpreter" example re-written using Scope
:
```typescript
Scope.withScope((scope) => {
  const vm = scope.manage(QuickJS.newContext())
  let state = 0

  const fnHandle = scope.manage(
    vm.newFunction("nextId", () => {
      return vm.newNumber(++state)
    }),
  )

  vm.setProp(vm.global, "nextId", fnHandle)

  const nextId = scope.manage(vm.unwrapResult(vm.evalCode(`nextId(); nextId(); nextId()`)))
  console.log("vm result:", vm.getNumber(nextId), "native state:", state)

  // When the withScope block exits, it calls scope.dispose(), which in turn calls
  // the .dispose() methods of all the disposables managed by the scope.
})
```
You can also create Scope
instances with new Scope()
if you want to manage
calling scope.dispose()
yourself.
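Here's a minimal sketch (not from the original docs) of that manual pattern, pairing new Scope() with try/finally:

```typescript
const scope = new Scope()
try {
  const vm = scope.manage(QuickJS.newContext())
  const greeting = scope.manage(vm.newString("hello"))
  vm.setProp(vm.global, "greeting", greeting)

  const result = scope.manage(vm.unwrapResult(vm.evalCode(`greeting + " world"`)))
  console.log(vm.dump(result)) // "hello world"
} finally {
  // Disposes everything managed by the scope, in reverse order of creation.
  scope.dispose()
}
```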
Lifetime.consume(fn)
Lifetime.consume
is sugar for the common pattern of using a handle and then
immediately disposing of it. Lifetime.consume
takes a map
function that
produces a result of any type. The map
function is called with the handle,
then the handle is disposed, then the result is returned.
Here's the "Interfacing with interpreter" example re-written using .consume()
:
```typescript
const vm = QuickJS.newContext()
let state = 0

vm.newFunction("nextId", () => {
  return vm.newNumber(++state)
}).consume((fnHandle) => vm.setProp(vm.global, "nextId", fnHandle))

vm.unwrapResult(vm.evalCode(`nextId(); nextId(); nextId()`)).consume((nextId) =>
  console.log("vm result:", vm.getNumber(nextId), "native state:", state),
)

vm.dispose()
```
Generally working with Scope
leads to more straightforward code, but
Lifetime.consume
can be handy sugar as part of a method call chain.
To add APIs inside the QuickJS environment, you'll need to create objects to define the shape of your API, and add properties and functions to those objects to allow code inside QuickJS to call code on the host. The newFunction documentation covers writing functions in detail.
By default, no host functionality is exposed to code running inside QuickJS.
```typescript
const vm = QuickJS.newContext()
// `console.log`
const logHandle = vm.newFunction("log", (...args) => {
  const nativeArgs = args.map(vm.dump)
  console.log("QuickJS:", ...nativeArgs)
})
// Partially implement `console` object
const consoleHandle = vm.newObject()
vm.setProp(consoleHandle, "log", logHandle)
vm.setProp(vm.global, "console", consoleHandle)
consoleHandle.dispose()
logHandle.dispose()

vm.unwrapResult(vm.evalCode(`console.log("Hello from QuickJS!")`)).dispose()
```
To expose an asynchronous function that returns a promise to callers within
QuickJS, your function can return the handle of a QuickJSDeferredPromise
created via context.newPromise()
.
When you resolve a QuickJSDeferredPromise
-- and generally whenever async
behavior completes for the VM -- pending listeners inside QuickJS may not
execute immediately. Your code needs to explicitly call
runtime.executePendingJobs()
to resume execution inside QuickJS. This API
gives your code maximum control to schedule when QuickJS will block the host's
event loop by resuming execution.
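As a minimal sketch of that control flow (not from the original docs; it assumes the runtime's hasPendingJob() check alongside executePendingJobs()), here's one way to pump the job queue after the host settles a promise:

```typescript
const vm = QuickJS.newContext()

// Expose a host-controlled promise to the guest and register a guest callback.
const deferred = vm.newPromise()
vm.setProp(vm.global, "hostPromise", deferred.handle)
vm.unwrapResult(vm.evalCode(`hostPromise.then(() => { globalThis.done = true })`)).dispose()

// Settling the promise queues the guest callback, but does not run it...
deferred.resolve()

// ...until the host explicitly pumps the runtime's job queue.
while (vm.runtime.hasPendingJob()) {
  vm.runtime.executePendingJobs()
}

console.log(vm.getProp(vm.global, "done").consume(vm.dump)) // true
deferred.dispose()
vm.dispose()
```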
To work with QuickJS handles that contain a promise inside the environment, there are two options:
You can synchronously peek into a QuickJS promise handle and get its state without introducing asynchronous host code, described by the type JSPromiseState:
```typescript
type JSPromiseState =
  | { type: "pending"; error: Error }
  | { type: "fulfilled"; value: QuickJSHandle; notAPromise?: boolean }
  | { type: "rejected"; error: QuickJSHandle }
```
The result conforms to the SuccessOrFail
type returned by context.evalCode
, so you can use context.unwrapResult(context.getPromiseState(promiseHandle))
to assert a promise is settled successfully and retrieve its value. Calling context.unwrapResult
on a pending or rejected promise will throw an error.
```typescript
const promiseHandle = context.unwrapResult(context.evalCode(`Promise.resolve(42)`))
const resultHandle = context.unwrapResult(context.getPromiseState(promiseHandle))
context.getNumber(resultHandle) === 42 // true
resultHandle.dispose()
promiseHandle.dispose()
```
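If the promise might still be pending or rejected, here's a sketch (not from the original docs) that branches on the state instead of unwrapping, so nothing throws:

```typescript
const rejected = context.unwrapResult(context.evalCode(`Promise.reject(new Error("nope"))`))
const state = context.getPromiseState(rejected)
if (state.type === "fulfilled") {
  console.log("value:", context.dump(state.value))
  state.value.dispose()
} else if (state.type === "rejected") {
  console.log("rejected with:", context.dump(state.error))
  state.error.dispose()
} else {
  // "pending": run pending jobs and check the state again later.
  context.runtime.executePendingJobs()
}
rejected.dispose()
```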
You can convert the QuickJSHandle into a native promise using
context.resolvePromise()
. Take care with this API to avoid 'deadlocks' where
the host awaits a guest promise, but the guest cannot make progress until the
host calls runtime.executePendingJobs()
. The simplest way to avoid this kind
of deadlock is to always schedule executePendingJobs
after any promise is
settled.
```typescript
const vm = QuickJS.newContext()
const fakeFileSystem = new Map([["example.txt", "Example file content"]])

// Function that simulates reading data asynchronously
const readFileHandle = vm.newFunction("readFile", (pathHandle) => {
  const path = vm.getString(pathHandle)
  const promise = vm.newPromise()
  setTimeout(() => {
    const content = fakeFileSystem.get(path)
    promise.resolve(vm.newString(content || ""))
  }, 100)
  // IMPORTANT: Once you resolve an async action inside QuickJS,
  // call runtime.executePendingJobs() to run any code that was
  // waiting on the promise or callback.
  promise.settled.then(vm.runtime.executePendingJobs)
  return promise.handle
})
readFileHandle.consume((handle) => vm.setProp(vm.global, "readFile", handle))

// Evaluate code that uses `readFile`, which returns a promise
const result = vm.evalCode(`(async () => {
  const content = await readFile('example.txt')
  return content.toUpperCase()
})()`)
const promiseHandle = vm.unwrapResult(result)

// Convert the promise handle into a native promise and await it.
// If code like this deadlocks, make sure you are calling
// runtime.executePendingJobs appropriately.
const resolvedResult = await vm.resolvePromise(promiseHandle)
promiseHandle.dispose()
const resolvedHandle = vm.unwrapResult(resolvedResult)
console.log("Result:", vm.getString(resolvedHandle))
resolvedHandle.dispose()
```
Sometimes you want to create a function that's synchronous from the perspective of QuickJS, but that you'd prefer to implement asynchronously in your host code. The most obvious use-case is EcmaScript module loading. The underlying QuickJS C library expects the module loader function to return synchronously, but loading data synchronously in the browser or server is somewhere between "a bad idea" and "impossible". QuickJS also doesn't expose an API to "pause" the execution of a runtime, and adding such an API is tricky due to the VM's implementation.
As a work-around, we provide an alternate build of QuickJS processed by Emscripten/Binaryen's ASYNCIFY compiler transform. Here's how Emscripten's documentation describes Asyncify:
Asyncify lets synchronous C or C++ code interact with asynchronous [host] JavaScript. This allows things like:
- A synchronous call in C that yields to the event loop, which allows browser events to be handled.
- A synchronous call in C that waits for an asynchronous operation in [host] JS to complete.
Asyncify automatically transforms ... code into a form that can be paused and resumed ..., so that it is asynchronous (hence the name “Asyncify”) even though [it is written] in a normal synchronous way.
This means we can suspend an entire WebAssembly module (which could contain multiple runtimes and contexts) while our host Javascript loads data asynchronously, and then resume execution once the data load completes. This is a very handy superpower, but it comes with a couple of major limitations:
An asyncified WebAssembly module can only suspend to wait for a single asynchronous call at a time. You may call back into a suspended WebAssembly module, e.g. to create a QuickJS value to return a result, but the system will crash if this call tries to suspend again. Take a look at Emscripten's documentation on reentrancy.
Asyncified code is bigger and runs slower. The asyncified build of the quickjs-emscripten library is about 1MB, 2x larger than the 500KB default build. There may be room for further optimization of our build in the future.
To use Asyncify features, use the asynchronous module and context factories such as newQuickJSAsyncWASMModule and newAsyncContext, shown in the examples below.
These functions are asynchronous because they always create a new underlying WebAssembly module so that each instance can suspend and resume independently, and instantiating a WebAssembly module is an async operation. This also adds substantial overhead compared to creating a runtime or context inside an existing module; if you only need to wait for a single async action at a time, you can create a single top-level module and create runtimes or contexts inside of it.
Here's an example of evaluating a script that loads React asynchronously as an ES module. In our example, we're loading from the filesystem for reproducibility, but you can use this technique to load using fetch.
```typescript
const module = await newQuickJSAsyncWASMModule()
const runtime = module.newRuntime()
const path = await import("path")
const { promises: fs } = await import("fs")

const importsPath = path.join(__dirname, "../examples/imports") + "/"
// Module loaders can return promises.
// Execution will suspend until the promise resolves.
runtime.setModuleLoader((moduleName) => {
  const modulePath = path.join(importsPath, moduleName)
  if (!modulePath.startsWith(importsPath)) {
    throw new Error("out of bounds")
  }
  console.log("loading", moduleName, "from", modulePath)
  return fs.readFile(modulePath, "utf-8")
})

// evalCodeAsync is required when execution may suspend.
const context = runtime.newContext()
const result = await context.evalCodeAsync(`
import * as React from 'esm.sh/react@17'
import * as ReactDOMServer from 'esm.sh/react-dom@17/server'
const e = React.createElement
globalThis.html = ReactDOMServer.renderToStaticMarkup(
  e('div', null, e('strong', null, 'Hello world!'))
)
`)
context.unwrapResult(result).dispose()
const html = context.getProp(context.global, "html").consume(context.getString)
console.log(html) // <div><strong>Hello world!</strong></div>
```
Here's an example of turning an async function into a sync function inside the VM.
```typescript
const context = await newAsyncContext()
const path = await import("path")
const { promises: fs } = await import("fs")

const importsPath = path.join(__dirname, "../examples/imports") + "/"
const readFileHandle = context.newAsyncifiedFunction("readFile", async (pathHandle) => {
  const pathString = path.join(importsPath, context.getString(pathHandle))
  if (!pathString.startsWith(importsPath)) {
    throw new Error("out of bounds")
  }
  const data = await fs.readFile(pathString, "utf-8")
  return context.newString(data)
})
readFileHandle.consume((fn) => context.setProp(context.global, "readFile", fn))

// evalCodeAsync is required when execution may suspend.
const result = await context.evalCodeAsync(`
// Not a promise! Sync! vvvvvvvvvvvvvvvvvvvv
const data = JSON.parse(readFile('data.json'))
data.map(x => x.toUpperCase()).join(' ')
`)
const upperCaseData = context.unwrapResult(result).consume(context.getString)
console.log(upperCaseData) // 'VERY USEFUL DATA'
```
This library is complicated to use, so please consider automated testing of your implementation. We highly recommend writing your test suite to run against both the "release" build variant of quickjs-emscripten and the DEBUG_SYNC build variant. The DEBUG_SYNC build variant has extra instrumentation code for detecting memory leaks.
The class TestQuickJSWASMModule exposes the memory leak detection API, although this API is only accurate when using the DEBUG_SYNC variant. You can also enable debug logging to help diagnose failures.
```typescript
// Define your test suite in a function, so that you can test against
// different module loaders.
function myTests(moduleLoader: () => Promise<QuickJSWASMModule>) {
  let QuickJS: TestQuickJSWASMModule
  beforeEach(async () => {
    // Get a unique TestQuickJSWASMModule instance for each test.
    const wasmModule = await moduleLoader()
    QuickJS = new TestQuickJSWASMModule(wasmModule)
  })
  afterEach(() => {
    // Assert that the test disposed all handles. The DEBUG_SYNC build
    // variant will show detailed traces for each leak.
    QuickJS.assertNoMemoryAllocated()
  })

  it("works well", () => {
    // TODO: write a test using QuickJS
    const context = QuickJS.newContext()
    context.unwrapResult(context.evalCode("1 + 1")).dispose()
    context.dispose()
  })
}

// Run the test suite against a matrix of module loaders.
describe("Check for memory leaks with QuickJS DEBUG build", () => {
  const moduleLoader = memoizePromiseFactory(() => newQuickJSWASMModule(DEBUG_SYNC))
  myTests(moduleLoader)
})

describe("Realistic test with QuickJS RELEASE build", () => {
  myTests(getQuickJS)
})
```
For more testing examples, please explore the typescript source of quickjs-emscripten repository.
The main quickjs-emscripten package includes several build variants of the WebAssembly module:

RELEASE... build variants should be used in production. They offer better performance and smaller file size compared to DEBUG... build variants.

- RELEASE_SYNC: The default variant used when you don't explicitly provide one. It offers the fastest performance and smallest file size.
- RELEASE_ASYNC: The default variant if you need asyncify magic, which comes at a performance cost. See the asyncify docs for details.

DEBUG... build variants can be helpful during development and testing. They include source maps and assertions for catching bugs in your code. We recommend running your tests with both a debug build variant and the release build variant you'll use in production.

- DEBUG_SYNC: Instrumented to detect memory leaks, in addition to assertions and source maps.
- DEBUG_ASYNC: An asyncify variant with source maps.

To use a variant, call newQuickJSWASMModule or newQuickJSAsyncWASMModule with the variant object. These functions return a promise that resolves to a QuickJSWASMModule, the same as getQuickJS.
```typescript
import {
  newQuickJSWASMModule,
  newQuickJSAsyncWASMModule,
  RELEASE_SYNC,
  DEBUG_SYNC,
  RELEASE_ASYNC,
  DEBUG_ASYNC,
} from "quickjs-emscripten"

const QuickJSReleaseSync = await newQuickJSWASMModule(RELEASE_SYNC)
const QuickJSDebugSync = await newQuickJSWASMModule(DEBUG_SYNC)
const QuickJSReleaseAsync = await newQuickJSAsyncWASMModule(RELEASE_ASYNC)
const QuickJSDebugAsync = await newQuickJSAsyncWASMModule(DEBUG_ASYNC)

for (const quickjs of [
  QuickJSReleaseSync,
  QuickJSDebugSync,
  QuickJSReleaseAsync,
  QuickJSDebugAsync,
]) {
  const vm = quickjs.newContext()
  const result = vm.unwrapResult(vm.evalCode("1 + 1")).consume(vm.getNumber)
  console.log(result)
  vm.dispose()
  quickjs.dispose()
}
```
Including 4 different copies of the WebAssembly module in the main package gives it an install size of about 9.04mb. If you're building a CLI package or library of your own, or otherwise don't need to include 4 different variants in your node_modules
, you can switch to the quickjs-emscripten-core
package, which contains only the Javascript code for this library, and install one (or more) variants a-la-carte as separate packages.
The most minimal setup would be to install quickjs-emscripten-core
and @jitl/quickjs-wasmfile-release-sync
(1.3mb total):
```sh
yarn add quickjs-emscripten-core @jitl/quickjs-wasmfile-release-sync
du -h node_modules
# 640K node_modules/@jitl/quickjs-wasmfile-release-sync
# 80K  node_modules/@jitl/quickjs-ffi-types
# 588K node_modules/quickjs-emscripten-core
# 1.3M node_modules
```
Then, you can use quickjs-emscripten-core's newQuickJSWASMModuleFromVariant
to create a QuickJS module (see the minimal example):
```typescript
// src/quickjs.mjs
import { newQuickJSWASMModuleFromVariant } from "quickjs-emscripten-core"
import RELEASE_SYNC from "@jitl/quickjs-wasmfile-release-sync"
export const QuickJS = await newQuickJSWASMModuleFromVariant(RELEASE_SYNC)

// src/app.mjs
import { QuickJS } from "./quickjs.mjs"
console.log(QuickJS.evalCode("1 + 1"))
```
See the documentation of quickjs-emscripten-core for more details and the list of variant packages.
To run QuickJS, we need to load a WebAssembly module into the host Javascript runtime's memory (usually as an ArrayBuffer or TypedArray) and compile it to a WebAssembly.Module. This means we need to find the file path or URI of the WebAssembly module, and then read it using an API like fetch
(browser) or fs.readFile
(NodeJS). quickjs-emscripten
tries to handle this automatically using patterns like new URL('./local-path', import.meta.url)
that work in the browser or are handled automatically by bundlers, or __dirname
in NodeJS, but you may need to configure this manually if these don't work in your environment, or if you want more control over how the WebAssembly module is loaded.
To customize the loading of an existing variant, create a new variant with your loading settings using newVariant
, passing CustomizeVariantOptions. For example, you need to customize loading in Cloudflare Workers (see the full example).
```typescript
import { newQuickJSWASMModule, DEBUG_SYNC as baseVariant, newVariant } from "quickjs-emscripten"
import cloudflareWasmModule from "./DEBUG_SYNC.wasm"
import cloudflareWasmModuleSourceMap from "./DEBUG_SYNC.wasm.map.txt"

/**
 * We need to make a new variant that directly passes the imported WebAssembly.Module
 * to Emscripten. Normally we'd load the wasm file as bytes from a URL, but
 * that's forbidden in Cloudflare workers.
 */
const cloudflareVariant = newVariant(baseVariant, {
  wasmModule: cloudflareWasmModule,
  wasmSourceMapData: cloudflareWasmModuleSourceMap,
})
```
quickjs-ng/quickjs (aka quickjs-ng) is a fork of the original bellard/quickjs under active development. It implements more EcmaScript standards and removes some of quickjs's custom language features like BigFloat.
There are several variants of quickjs-ng available, and quickjs-emscripten may switch to using quickjs-ng by default in the future. See the list of variants.
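For example, loading a quickjs-ng build through quickjs-emscripten-core looks the same as loading any other variant. This is a hedged sketch: the variant package name @jitl/quickjs-ng-wasmfile-release-sync is an assumption here, so check the variants list for the packages that are actually published.

```typescript
import { newQuickJSWASMModuleFromVariant } from "quickjs-emscripten-core"
// Assumed package name for a quickjs-ng build variant; see the variants list.
import ngReleaseSync from "@jitl/quickjs-ng-wasmfile-release-sync"

const QuickJS = await newQuickJSWASMModuleFromVariant(ngReleaseSync)
console.log(QuickJS.evalCode("1 + 1")) // 2
```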
You can use quickjs-emscripten directly from an HTML file in two ways:
Import it in an ES Module script tag
```html
<!doctype html>
<!-- Import from a ES Module CDN -->
<script type="module">
  import { getQuickJS } from "https://esm.sh/quickjs-emscripten@0.25.0"
  const QuickJS = await getQuickJS()
  console.log(QuickJS.evalCode("1+1"))
</script>
```
In edge cases, you might want to use the IIFE build, which provides QuickJS as the global QJS. You should probably use the ES module build, though; any recent browser supports it.
```html
<!doctype html>
<!-- Add a script tag to load the library as the QJS global -->
<script
  src="https://cdn.jsdelivr.net/npm/quickjs-emscripten@0.25.0/dist/index.global.js"
  type="text/javascript"
></script>
<!-- Then use the QJS global in a script tag -->
<script type="text/javascript">
  QJS.getQuickJS().then((QuickJS) => {
    console.log(QuickJS.evalCode("1+1"))
  })
</script>
```
Debug logging can be enabled globally, or for specific runtimes. You need to use a DEBUG build variant of the WebAssembly module to see debug log messages from the C part of this library.
```typescript
import { newQuickJSWASMModule, DEBUG_SYNC } from "quickjs-emscripten"

const QuickJS = await newQuickJSWASMModule(DEBUG_SYNC)
```
With quickjs-emscripten-core:
```typescript
import { newQuickJSWASMModuleFromVariant } from "quickjs-emscripten-core"
import DEBUG_SYNC from "@jitl/quickjs-wasmfile-debug-sync"

const QuickJS = await newQuickJSWASMModuleFromVariant(DEBUG_SYNC)
```
To enable debug logging globally, call setDebugMode. This affects global Javascript parts of the library, like the module loader and asyncify internals, and is inherited by runtimes created after the call.
```typescript
import { setDebugMode } from "quickjs-emscripten"

setDebugMode(true)
```
With quickjs-emscripten-core:
```typescript
import { setDebugMode } from "quickjs-emscripten-core"

setDebugMode(true)
```
To enable debug logging for a specific runtime, call setDebugModeRt. This affects only the runtime and its associated contexts.
```typescript
const runtime = QuickJS.newRuntime()
runtime.setDebugMode(true)

const context = QuickJS.newContext()
context.runtime.setDebugMode(true)
```
quickjs-emscripten
and related packages should work in any environment that supports ES2020.
Github | NPM | API Documentation | Variants | Examples
This was inspired by seeing https://github.com/maple3142/duktape-eval on Hacker News and Figma's blogposts about building a Javascript plugin runtime:
Stability: Because the version number of this project is below 1.0.0, expect occasional breaking API changes.
Security: This project makes every effort to be secure, but has not been audited. Please use with care in production settings.
Roadmap: I work on this project in my free time, for fun. Here's what I'm thinking comes next. Last updated 2022-03-18.
- Further work on module loading APIs.
- Higher-level tools for reading QuickJS values: context.isArray(handle), context.isPromise(handle), etc., and context.getIterable(handle), context.iterateObjectEntries(handle). This better supports user-level code to deserialize complex handle objects.
- Higher-level tools for creating QuickJS values.
- SQLite integration.
This library is implemented in two languages: C (compiled to WASM with Emscripten), and Typescript.
You will need node
, yarn
, make
, and emscripten
to build this project.
The ./c directory contains C code that wraps the QuickJS C library (in ./quickjs).
Public functions (those starting with QTS_
) in ./c/interface.c are
automatically exported to native code (via a generated header) and to
Typescript (via a generated FFI class). See ./generate.ts for how this works.
The C code builds with emscripten
(using emcc
), to produce WebAssembly.
The version of Emscripten used by the project is defined in templates/Variant.mk.
Install emscripten on your machine; for example on macOS, brew install emscripten.

We produce multiple build variants of the C code compiled to WebAssembly using a template script that generates the ./packages directory. Each build variant uses its own copy of a Makefile to build the C code. The Makefile is generated from a template in ./templates/Variant.mk.
Related NPM scripts:

- yarn vendor:update updates vendor/quickjs and vendor/quickjs-ng to the latest versions on Github.
- yarn build:codegen updates the ./packages from the template script ./prepareVariants.ts and Variant.mk.
- yarn build:packages builds the variant packages in parallel.

The Javascript/Typescript code is also organized into several NPM packages in ./packages:
- quickjs-emscripten: the main package, exposing the getQuickJS function. This package combines quickjs-emscripten-core with platform-appropriate WebAssembly/C code.

Related NPM scripts:
- yarn check runs all available checks (build, format, tests, etc).
- yarn build builds all the packages and generates the docs.
- yarn test runs the tests for all packages. yarn test:fast runs the tests using only fast build variants.
- yarn doc generates the docs into ./doc. yarn doc:serve previews the current ./doc in a browser.
- yarn prettier formats the repo.

Just run yarn set version from sources to upgrade the Yarn release.
No vulnerabilities found.
OpenSSF Scorecard checks:

- no dangerous workflow patterns detected
- 10 commit(s) and 8 issue activity found in the last 90 days -- score normalized to 10
- no binaries found in the repo
- license file detected
- Found 0/10 approved changesets -- score normalized to 0
- detected GitHub workflow tokens with excessive permissions
- no effort to earn an OpenSSF best practices badge detected
- security policy file not detected
- project is not fuzzed
- dependency not pinned by hash detected -- score normalized to 0
- SAST tool is not run on all commits -- score normalized to 0
- 23 existing vulnerabilities detected

Last Scanned on 2024-11-25
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.