vm: add run-after-evaluate microtask mode

This allows timeouts to apply to `Promise`s and `async function`s in
code running inside of `vm.Context`s, by giving the context its own
microtask queue.
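In practice this looks roughly as follows (a minimal sketch mirroring the
documentation and tests below):

```js
const vm = require('vm');
const assert = require('assert');

// With `microtaskMode: 'afterEvaluate'`, the promise callback runs before
// vm.runInNewContext() returns, so the 5 ms timeout can interrupt it.
assert.throws(() => {
  vm.runInNewContext(
    'Promise.resolve().then(() => { while (true) {} });',
    {},
    { timeout: 5, microtaskMode: 'afterEvaluate' }
  );
}, { code: 'ERR_SCRIPT_EXECUTION_TIMEOUT' });
```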

Fixes: https://github.com/nodejs/node/issues/3020

PR-URL: https://github.com/nodejs/node/pull/34023
Reviewed-By: James M Snell <jasnell@gmail.com>
Reviewed-By: Denys Otrishko <shishugi@gmail.com>
Anna Henningsen 2020-06-23 00:33:04 +02:00 committed by James M Snell
parent e68563e31c
commit f63436d190
13 changed files with 342 additions and 39 deletions


@ -188,6 +188,9 @@ overhead.
<!-- YAML
added: v0.3.1
changes:
  - version: REPLACEME
    pr-url: https://github.com/nodejs/node/pull/34023
    description: The `microtaskMode` option is supported now.
  - version: v10.0.0
    pr-url: https://github.com/nodejs/node/pull/19016
    description: The `contextCodeGeneration` option is supported now.
@ -225,6 +228,10 @@ changes:
      `EvalError`. **Default:** `true`.
    * `wasm` {boolean} If set to false any attempt to compile a WebAssembly
      module will throw a `WebAssembly.CompileError`. **Default:** `true`.
  * `microtaskMode` {string} If set to `afterEvaluate`, microtasks (tasks
    scheduled through `Promise`s and `async function`s) will be run immediately
    after the script has run. They are included in the `timeout` and
    `breakOnSigint` scopes in that case.
* Returns: {any} the result of the very last statement executed in the script.

First contextifies the given `contextObject`, runs the compiled code contained
@ -846,6 +853,9 @@ function with the given `params`.
<!-- YAML
added: v0.3.1
changes:
  - version: REPLACEME
    pr-url: https://github.com/nodejs/node/pull/34023
    description: The `microtaskMode` option is supported now.
  - version: v10.0.0
    pr-url: https://github.com/nodejs/node/pull/19398
    description: The first argument can no longer be a function.
@ -871,6 +881,10 @@ changes:
      `EvalError`. **Default:** `true`.
    * `wasm` {boolean} If set to false any attempt to compile a WebAssembly
      module will throw a `WebAssembly.CompileError`. **Default:** `true`.
  * `microtaskMode` {string} If set to `afterEvaluate`, microtasks (tasks
    scheduled through `Promise`s and `async function`s) will be run immediately
    after a script has run through [`script.runInContext()`][].
    They are included in the `timeout` and `breakOnSigint` scopes in that case.
* Returns: {Object} contextified object.

If given a `contextObject`, the `vm.createContext()` method will [prepare
@ -1002,6 +1016,9 @@ console.log(contextObject);
<!-- YAML
added: v0.3.1
changes:
  - version: REPLACEME
    pr-url: https://github.com/nodejs/node/pull/34023
    description: The `microtaskMode` option is supported now.
  - version: v10.0.0
    pr-url: https://github.com/nodejs/node/pull/19016
    description: The `contextCodeGeneration` option is supported now.
@ -1068,6 +1085,10 @@ changes:
    * Returns: {Module Namespace Object|vm.Module} Returning a `vm.Module` is
      recommended in order to take advantage of error tracking, and to avoid
      issues with namespaces that contain `then` function exports.
  * `microtaskMode` {string} If set to `afterEvaluate`, microtasks (tasks
    scheduled through `Promise`s and `async function`s) will be run immediately
    after the script has run. They are included in the `timeout` and
    `breakOnSigint` scopes in that case.
* Returns: {any} the result of the very last statement executed in the script.

The `vm.runInNewContext()` first contextifies the given `contextObject` (or
@ -1224,13 +1245,13 @@ within which it can operate. The process of creating the V8 Context and
associating it with the `contextObject` is what this document refers to as
"contextifying" the object.

## Timeout interactions with asynchronous tasks and Promises

`Promise`s and `async function`s can schedule tasks run by the JavaScript
engine asynchronously. By default, these tasks are run after all JavaScript
functions on the current stack are done executing.
This allows escaping the functionality of the `timeout` and
`breakOnSigint` options.

For example, the following code executed by `vm.runInNewContext()` with a
timeout of 5 milliseconds schedules an infinite loop to run after a promise
@ -1240,21 +1261,52 @@ resolves. The scheduled loop is never interrupted by the timeout:
```js
const vm = require('vm');

function loop() {
  console.log('entering loop');
  while (1) console.log(Date.now());
}

vm.runInNewContext(
  'Promise.resolve().then(() => loop());',
  { loop, console },
  { timeout: 5 }
);
// This prints *before* 'entering loop' (!)
console.log('done executing');
```

This can be addressed by passing `microtaskMode: 'afterEvaluate'` to the code
that creates the `Context`:

```js
const vm = require('vm');

function loop() {
  while (1) console.log(Date.now());
}

vm.runInNewContext(
  'Promise.resolve().then(() => loop());',
  { loop, console },
  { timeout: 5, microtaskMode: 'afterEvaluate' }
);
```

In this case, the microtask scheduled through `promise.then()` will be run
before returning from `vm.runInNewContext()`, and will be interrupted
by the `timeout` functionality. This applies only to code running in a
`vm.Context`, so e.g. [`vm.runInThisContext()`][] does not take this option.

Promise callbacks are entered into the microtask queue of the context in which
they were created. For example, if `() => loop()` is replaced with just `loop`
in the above example, then `loop` will be pushed into the global microtask
queue, because it is a function from the outer (main) context, and thus will
also be able to escape the timeout.

If asynchronous scheduling functions such as `process.nextTick()`,
`queueMicrotask()`, `setTimeout()`, `setImmediate()`, etc. are made available
inside a `vm.Context`, functions passed to them will be added to global queues,
which are shared by all contexts. Therefore, callbacks passed to those functions
are not controllable through the timeout either.
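The following sketch illustrates that distinction (the `probe` helper is
illustrative; the expected ordering follows from the queue-ownership rules
described above):

```js
const vm = require('vm');

function probe() {
  console.log('ran on the global microtask queue, after the call returned');
}

vm.runInNewContext(
  // `probe` was created in the outer context, so passing it directly enqueues
  // the callback on the global microtask queue and it escapes the context's
  // own queue. Wrapping it as `() => probe()` instead would make it run
  // before vm.runInNewContext() returns.
  'Promise.resolve().then(probe);',
  { probe },
  { microtaskMode: 'afterEvaluate' }
);
console.log('vm.runInNewContext() returned');
```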
[`ERR_VM_DYNAMIC_IMPORT_CALLBACK_MISSING`]: errors.html#ERR_VM_DYNAMIC_IMPORT_CALLBACK_MISSING
[`ERR_VM_MODULE_STATUS`]: errors.html#ERR_VM_MODULE_STATUS


@ -29,6 +29,7 @@ const {
const {
  ContextifyScript,
  MicrotaskQueue,
  makeContext,
  isContext: _isContext,
  constants,
@ -186,6 +187,7 @@ function getContextOptions(options) {
      name: options.contextName,
      origin: options.contextOrigin,
      codeGeneration: undefined,
      microtaskMode: options.microtaskMode,
    };
    if (contextOptions.name !== undefined)
      validateString(contextOptions.name, 'options.contextName');
@ -201,6 +203,8 @@ function getContextOptions(options) {
        validateBoolean(wasm, 'options.contextCodeGeneration.wasm');
      contextOptions.codeGeneration = { strings, wasm };
    }
    if (options.microtaskMode !== undefined)
      validateString(options.microtaskMode, 'options.microtaskMode');
    return contextOptions;
  }
@ -222,7 +226,8 @@ function createContext(contextObject = {}, options = {}) {
  const {
    name = `VM Context ${defaultContextNameIndex++}`,
    origin,
    codeGeneration,
    microtaskMode
  } = options;
  validateString(name, 'options.name');
@ -239,7 +244,22 @@ function createContext(contextObject = {}, options = {}) {
      validateBoolean(wasm, 'options.codeGeneration.wasm');
  }

  let microtaskQueue = null;
  if (microtaskMode !== undefined) {
    validateString(microtaskMode, 'options.microtaskMode');
    if (microtaskMode === 'afterEvaluate') {
      microtaskQueue = new MicrotaskQueue();
    } else {
      throw new ERR_INVALID_ARG_VALUE(
        'options.microtaskMode',
        microtaskMode,
        'must be \'afterEvaluate\' or undefined'
      );
    }
  }

  makeContext(contextObject, name, origin, strings, wasm, microtaskQueue);
  return contextObject;
}
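A usage sketch of how this validation and the new argument surface to callers
(assuming the rest of this patch is in place):

```js
const vm = require('vm');
const assert = require('assert');

// 'afterEvaluate' is the only accepted value; the context gets its own
// microtask queue behind the scenes.
const ctx = vm.createContext({}, { microtaskMode: 'afterEvaluate' });
assert.ok(vm.isContext(ctx));

// Any other string is rejected by the branch above.
assert.throws(
  () => vm.createContext({}, { microtaskMode: 'afterTheFact' }),
  { code: 'ERR_INVALID_ARG_VALUE' }
);
```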


@ -433,6 +433,7 @@ constexpr size_t kFsStatsBufferLength =
  V(i18n_converter_template, v8::ObjectTemplate) \
  V(libuv_stream_wrap_ctor_template, v8::FunctionTemplate) \
  V(message_port_constructor_template, v8::FunctionTemplate) \
  V(microtask_queue_ctor_template, v8::FunctionTemplate) \
  V(pipe_constructor_template, v8::FunctionTemplate) \
  V(promise_wrap_template, v8::ObjectTemplate) \
  V(sab_lifetimepartner_constructor_template, v8::FunctionTemplate) \


@ -35,6 +35,7 @@ using v8::IntegrityLevel;
using v8::Isolate;
using v8::Local;
using v8::MaybeLocal;
using v8::MicrotaskQueue;
using v8::Module;
using v8::Number;
using v8::Object;
@ -106,15 +107,15 @@ void ModuleWrap::New(const FunctionCallbackInfo<Value>& args) {
  Local<String> url = args[0].As<String>();

  Local<Context> context;
  ContextifyContext* contextify_context = nullptr;
  if (args[1]->IsUndefined()) {
    context = that->CreationContext();
  } else {
    CHECK(args[1]->IsObject());
    contextify_context = ContextifyContext::ContextFromContextifiedSandbox(
        env, args[1].As<Object>());
    CHECK_NOT_NULL(contextify_context);
    context = contextify_context->context();
  }

  Local<Integer> line_offset;
@ -224,6 +225,7 @@ void ModuleWrap::New(const FunctionCallbackInfo<Value>& args) {
  }

  obj->context_.Reset(isolate, context);
  obj->contextify_context_ = contextify_context;

  env->hash_to_module_map.emplace(module->GetIdentityHash(), obj);
@ -319,6 +321,11 @@ void ModuleWrap::Evaluate(const FunctionCallbackInfo<Value>& args) {
  Local<Context> context = obj->context_.Get(isolate);
  Local<Module> module = obj->module_.Get(isolate);

  ContextifyContext* contextify_context = obj->contextify_context_;
  std::shared_ptr<MicrotaskQueue> microtask_queue;
  if (contextify_context != nullptr)
    microtask_queue = contextify_context->microtask_queue();

  // module.evaluate(timeout, breakOnSigint)
  CHECK_EQ(args.Length(), 2);
@ -334,18 +341,24 @@ void ModuleWrap::Evaluate(const FunctionCallbackInfo<Value>& args) {
  bool timed_out = false;
  bool received_signal = false;
  MaybeLocal<Value> result;
  auto run = [&]() {
    MaybeLocal<Value> result = module->Evaluate(context);
    if (!result.IsEmpty() && microtask_queue)
      microtask_queue->PerformCheckpoint(isolate);
    return result;
  };
  if (break_on_sigint && timeout != -1) {
    Watchdog wd(isolate, timeout, &timed_out);
    SigintWatchdog swd(isolate, &received_signal);
    result = run();
  } else if (break_on_sigint) {
    SigintWatchdog swd(isolate, &received_signal);
    result = run();
  } else if (timeout != -1) {
    Watchdog wd(isolate, timeout, &timed_out);
    result = run();
  } else {
    result = run();
  }

  if (result.IsEmpty()) {


@ -12,6 +12,10 @@ namespace node {
class Environment;

namespace contextify {
class ContextifyContext;
}

namespace loader {

enum ScriptType : int {
@ -82,12 +86,13 @@ class ModuleWrap : public BaseObject {
  static ModuleWrap* GetFromModule(node::Environment*, v8::Local<v8::Module>);

  v8::Global<v8::Function> synthetic_evaluation_steps_;
  v8::Global<v8::Module> module_;
  v8::Global<v8::String> url_;
  std::unordered_map<std::string, v8::Global<v8::Promise>> resolve_cache_;
  v8::Global<v8::Context> context_;
  contextify::ContextifyContext* contextify_context_ = nullptr;
  bool synthetic_ = false;
  bool linked_ = false;
  uint32_t id_;
};


@ -54,6 +54,8 @@ using v8::Maybe;
using v8::MaybeLocal;
using v8::MeasureMemoryExecution;
using v8::MeasureMemoryMode;
using v8::MicrotaskQueue;
using v8::MicrotasksPolicy;
using v8::Name;
using v8::NamedPropertyHandlerConfiguration;
using v8::Number;
@ -108,7 +110,10 @@ Local<Name> Uint32ToName(Local<Context> context, uint32_t index) {
ContextifyContext::ContextifyContext(
    Environment* env,
    Local<Object> sandbox_obj,
    const ContextOptions& options)
  : env_(env),
    microtask_queue_wrap_(options.microtask_queue_wrap) {
  MaybeLocal<Context> v8_context = CreateV8Context(env, sandbox_obj, options);

  // Allocation failure, maximum call stack size reached, termination, etc.
@ -188,7 +193,13 @@ MaybeLocal<Context> ContextifyContext::CreateV8Context(
  object_template->SetHandler(config);
  object_template->SetHandler(indexed_config);

  Local<Context> ctx = Context::New(
      env->isolate(),
      nullptr,  // extensions
      object_template,
      {},  // global object
      {},  // deserialization callback
      microtask_queue() ? microtask_queue().get() : nullptr);

  if (ctx.IsEmpty()) return MaybeLocal<Context>();
  // Only partially initialize the context - the primordials are left out
  // and only initialized when necessary.
@ -247,7 +258,7 @@ void ContextifyContext::Init(Environment* env, Local<Object> target) {
void ContextifyContext::MakeContext(const FunctionCallbackInfo<Value>& args) {
  Environment* env = Environment::GetCurrent(args);

  CHECK_EQ(args.Length(), 6);
  CHECK(args[0]->IsObject());
  Local<Object> sandbox = args[0].As<Object>();
@ -273,6 +284,13 @@ void ContextifyContext::MakeContext(const FunctionCallbackInfo<Value>& args) {
  CHECK(args[4]->IsBoolean());
  options.allow_code_gen_wasm = args[4].As<Boolean>();

  if (args[5]->IsObject() &&
      !env->microtask_queue_ctor_template().IsEmpty() &&
      env->microtask_queue_ctor_template()->HasInstance(args[5])) {
    options.microtask_queue_wrap.reset(
        Unwrap<MicrotaskQueueWrap>(args[5].As<Object>()));
  }

  TryCatchScope try_catch(env);
  auto context_ptr = std::make_unique<ContextifyContext>(env, sandbox, options);
@ -845,6 +863,7 @@ void ContextifyScript::RunInThisContext(
                            display_errors,
                            break_on_sigint,
                            break_on_first_line,
                            nullptr,  // microtask_queue
                            args);

  TRACE_EVENT_NESTABLE_ASYNC_END0(
@ -891,6 +910,7 @@ void ContextifyScript::RunInContext(const FunctionCallbackInfo<Value>& args) {
                            display_errors,
                            break_on_sigint,
                            break_on_first_line,
                            contextify_context->microtask_queue(),
                            args);

  TRACE_EVENT_NESTABLE_ASYNC_END0(
@ -902,6 +922,7 @@ bool ContextifyScript::EvalMachine(Environment* env,
                                   const bool display_errors,
                                   const bool break_on_sigint,
                                   const bool break_on_first_line,
                                   std::shared_ptr<MicrotaskQueue> mtask_queue,
                                   const FunctionCallbackInfo<Value>& args) {
  if (!env->can_call_into_js())
    return false;
@ -926,18 +947,24 @@ bool ContextifyScript::EvalMachine(Environment* env,
  MaybeLocal<Value> result;
  bool timed_out = false;
  bool received_signal = false;
  auto run = [&]() {
    MaybeLocal<Value> result = script->Run(env->context());
    if (!result.IsEmpty() && mtask_queue)
      mtask_queue->PerformCheckpoint(env->isolate());
    return result;
  };
  if (break_on_sigint && timeout != -1) {
    Watchdog wd(env->isolate(), timeout, &timed_out);
    SigintWatchdog swd(env->isolate(), &received_signal);
    result = run();
  } else if (break_on_sigint) {
    SigintWatchdog swd(env->isolate(), &received_signal);
    result = run();
  } else if (timeout != -1) {
    Watchdog wd(env->isolate(), timeout, &timed_out);
    result = run();
  } else {
    result = run();
  }

  // Convert the termination exception into a regular exception.
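The user-visible effect of the checkpoint performed in the `run` lambda above
can be sketched from JavaScript as follows (assuming a context created with
`microtaskMode: 'afterEvaluate'`; `record` is an illustrative helper):

```js
const vm = require('vm');

const order = [];
const record = (label) => order.push(label);

vm.runInNewContext(
  // The arrow function is created inside the context, so it lands on the
  // context's own microtask queue and is flushed before the call returns.
  'Promise.resolve().then(() => record("microtask"));',
  { record },
  { microtaskMode: 'afterEvaluate' }
);
order.push('returned');
console.log(order);  // expected: ['microtask', 'returned']
```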
@ -1232,6 +1259,43 @@ static void MeasureMemory(const FunctionCallbackInfo<Value>& args) {
  args.GetReturnValue().Set(promise);
}

MicrotaskQueueWrap::MicrotaskQueueWrap(Environment* env, Local<Object> obj)
  : BaseObject(env, obj),
    microtask_queue_(
        MicrotaskQueue::New(env->isolate(), MicrotasksPolicy::kExplicit)) {
  MakeWeak();
}

const std::shared_ptr<MicrotaskQueue>&
MicrotaskQueueWrap::microtask_queue() const {
  return microtask_queue_;
}

void MicrotaskQueueWrap::New(const FunctionCallbackInfo<Value>& args) {
  CHECK(args.IsConstructCall());
  new MicrotaskQueueWrap(Environment::GetCurrent(args), args.This());
}

void MicrotaskQueueWrap::Init(Environment* env, Local<Object> target) {
  HandleScope scope(env->isolate());
  Local<String> class_name =
      FIXED_ONE_BYTE_STRING(env->isolate(), "MicrotaskQueue");
  Local<FunctionTemplate> tmpl = env->NewFunctionTemplate(New);
  tmpl->InstanceTemplate()->SetInternalFieldCount(
      ContextifyScript::kInternalFieldCount);
  tmpl->SetClassName(class_name);
  if (target->Set(env->context(),
                  class_name,
                  tmpl->GetFunction(env->context()).ToLocalChecked())
          .IsNothing()) {
    return;
  }
  env->set_microtask_queue_ctor_template(tmpl);
}
void Initialize(Local<Object> target,
                Local<Value> unused,
                Local<Context> context,
@ -1240,6 +1304,7 @@ void Initialize(Local<Object> target,
  Isolate* isolate = env->isolate();

  ContextifyContext::Init(env, target);
  ContextifyScript::Init(env, target);
  MicrotaskQueueWrap::Init(env, target);

  env->SetMethod(target, "startSigintWatchdog", StartSigintWatchdog);
  env->SetMethod(target, "stopSigintWatchdog", StopSigintWatchdog);


@ -10,11 +10,32 @@
namespace node {
namespace contextify {

class MicrotaskQueueWrap : public BaseObject {
 public:
  MicrotaskQueueWrap(Environment* env, v8::Local<v8::Object> obj);

  const std::shared_ptr<v8::MicrotaskQueue>& microtask_queue() const;

  static void Init(Environment* env, v8::Local<v8::Object> target);
  static void New(const v8::FunctionCallbackInfo<v8::Value>& args);

  // This could have methods for running the microtask queue, if we ever decide
  // to make that fully customizable from userland.

  SET_NO_MEMORY_INFO()
  SET_MEMORY_INFO_NAME(MicrotaskQueueWrap)
  SET_SELF_SIZE(MicrotaskQueueWrap)

 private:
  std::shared_ptr<v8::MicrotaskQueue> microtask_queue_;
};

struct ContextOptions {
  v8::Local<v8::String> name;
  v8::Local<v8::String> origin;
  v8::Local<v8::Boolean> allow_code_gen_strings;
  v8::Local<v8::Boolean> allow_code_gen_wasm;
  BaseObjectPtr<MicrotaskQueueWrap> microtask_queue_wrap;
};

class ContextifyContext {
@ -53,6 +74,11 @@ class ContextifyContext {
        context()->GetEmbedderData(ContextEmbedderIndex::kSandboxObject));
  }

  inline std::shared_ptr<v8::MicrotaskQueue> microtask_queue() const {
    if (!microtask_queue_wrap_) return {};
    return microtask_queue_wrap_->microtask_queue();
  }

  template <typename T>
  static ContextifyContext* Get(const v8::PropertyCallbackInfo<T>& args);
@ -102,6 +128,7 @@ class ContextifyContext {
      const v8::PropertyCallbackInfo<v8::Boolean>& args);

  Environment* const env_;
  v8::Global<v8::Context> context_;
  BaseObjectPtr<MicrotaskQueueWrap> microtask_queue_wrap_;
};

class ContextifyScript : public BaseObject {
@ -125,6 +152,7 @@ class ContextifyScript : public BaseObject {
                          const bool display_errors,
                          const bool break_on_sigint,
                          const bool break_on_first_line,
                          std::shared_ptr<v8::MicrotaskQueue> microtask_queue,
                          const v8::FunctionCallbackInfo<v8::Value>& args);

  inline uint32_t id() { return id_; }


@ -14,9 +14,6 @@ test-vm-timeout-escape-queuemicrotask: SKIP
[$system==win32]

[$system==linux]
# https://github.com/nodejs/node/pull/23743
# https://github.com/nodejs/node/issues/3020
test-vm-timeout-escape-promise: PASS,FLAKY

[$system==macos]


@ -35,7 +35,7 @@ assert.throws(() => {
      queueMicrotask,
      loop
    },
    { timeout, microtaskMode: 'afterEvaluate' }
  );
}, {
  code: 'ERR_SCRIPT_EXECUTION_TIMEOUT',


@ -0,0 +1,38 @@
'use strict';

// https://github.com/nodejs/node/issues/3020
// Promises used to allow code to escape the timeout
// set for runInContext, runInNewContext, and runInThisContext.

require('../common');
const assert = require('assert');
const vm = require('vm');

const NS_PER_MS = 1000000n;

const hrtime = process.hrtime.bigint;

function loop() {
  const start = hrtime();
  while (1) {
    const current = hrtime();
    const span = (current - start) / NS_PER_MS;
    if (span >= 100n) {
      throw new Error(
        `escaped timeout at ${span} milliseconds!`);
    }
  }
}

assert.throws(() => {
  vm.runInNewContext(
    'Promise.resolve().then(() => loop());',
    {
      hrtime,
      loop
    },
    { timeout: 10, microtaskMode: 'afterEvaluate' }
  );
}, {
  code: 'ERR_SCRIPT_EXECUTION_TIMEOUT',
  message: 'Script execution timed out after 10ms'
});


@ -0,0 +1,42 @@
// Flags: --experimental-vm-modules
'use strict';

// https://github.com/nodejs/node/issues/3020
// Promises used to allow code to escape the timeout
// set for runInContext, runInNewContext, and runInThisContext.

const common = require('../common');
const assert = require('assert');
const vm = require('vm');

const NS_PER_MS = 1000000n;

const hrtime = process.hrtime.bigint;

function loop() {
  const start = hrtime();
  while (1) {
    const current = hrtime();
    const span = (current - start) / NS_PER_MS;
    if (span >= 100n) {
      throw new Error(
        `escaped timeout at ${span} milliseconds!`);
    }
  }
}

assert.rejects(async () => {
  const module = new vm.SourceTextModule(
    'Promise.resolve().then(() => loop());',
    {
      context: vm.createContext({
        hrtime,
        loop
      }, { microtaskMode: 'afterEvaluate' })
    });

  await module.link(common.mustNotCall());
  await module.evaluate({ timeout: 10 });
}, {
  code: 'ERR_SCRIPT_EXECUTION_TIMEOUT',
  message: 'Script execution timed out after 10ms'
});


@ -0,0 +1,42 @@
// Flags: --experimental-vm-modules
'use strict';

// https://github.com/nodejs/node/issues/3020
// Promises used to allow code to escape the timeout
// set for runInContext, runInNewContext, and runInThisContext.

const common = require('../common');
const assert = require('assert');
const vm = require('vm');

const NS_PER_MS = 1000000n;

const hrtime = process.hrtime.bigint;

function loop() {
  const start = hrtime();
  while (1) {
    const current = hrtime();
    const span = (current - start) / NS_PER_MS;
    if (span >= 100n) {
      throw new Error(
        `escaped timeout at ${span} milliseconds!`);
    }
  }
}

assert.rejects(async () => {
  const module = new vm.SourceTextModule(
    'Promise.resolve().then(() => loop()); loop();',
    {
      context: vm.createContext({
        hrtime,
        loop
      }, { microtaskMode: 'afterEvaluate' })
    });

  await module.link(common.mustNotCall());
  await module.evaluate({ timeout: 5 });
}, {
  code: 'ERR_SCRIPT_EXECUTION_TIMEOUT',
  message: 'Script execution timed out after 5ms'
});


@ -1,8 +1,8 @@
'use strict';

// https://github.com/nodejs/node/issues/3020
// Promises used to allow code to escape the timeout
// set for runInContext, runInNewContext, and runInThisContext.

require('../common');
const assert = require('assert');
@ -26,12 +26,12 @@ function loop() {
assert.throws(() => {
  vm.runInNewContext(
    'Promise.resolve().then(() => loop()); loop();',
    {
      hrtime,
      loop
    },
    { timeout: 5, microtaskMode: 'afterEvaluate' }
  );
}, {
  code: 'ERR_SCRIPT_EXECUTION_TIMEOUT',