Commit 68ae81bf authored by Milad Fa, committed by V8 LUCI CQ

PPC/s390: [wasm] Fix return value of lazy compile runtime function

Port 22a16bda

Original Commit Message:

    The Runtime_WasmCompileLazy function was returning a ptr-sized address,
    wrapped in an Object. This worked because no GC is triggered between the
    return from the runtime function and the point where we jump to the
    returned address.

    In a pointer-compressed world though, generated code assumes that all
    objects live in the same 4GB heap, so comparisons only compare the lower
    32 bits. On a 64-bit system, this can lead to collisions where a
    comparison determines that the returned address equals a heap object,
    even though the upper 32 bits differ.

    This happens occasionally in the wild, where the returned function entry
    pointer has the same lower half as the exception sentinel value. This
    leads to triggering stack unwinding (by the CEntry stub), which then
    fails (with a CHECK) because there is no pending exception.

    This CL fixes that by returning a Smi instead, which is the offset in the
    jump table to which the kWasmCompileLazy builtin should jump. The
    builtin then gets the jump table start address from the instance object,
    adds the offset that the runtime function returned, and performs the
    jump.

    We do not include a regression test because this failure is sporadic
    and hard to reproduce.
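
The collision described in the quoted message can be demonstrated in isolation. The following standalone C++ sketch is illustrative only and is not V8 code: the two 64-bit values and the "sentinel" naming are made up, and the snippet merely shows why comparing only the lower 32 bits can report a false match between a code address and a heap object.

    #include <cstdint>
    #include <cstdio>

    int main() {
      // Made-up values: a returned function entry address and a heap object
      // ("exception sentinel") that happen to share their lower 32 bits.
      uint64_t returned_entry = 0x00007f3adeadbeefULL;
      uint64_t heap_sentinel = 0x00000a10deadbeefULL;

      // Full 64-bit comparison: the values differ.
      bool equal64 = (returned_entry == heap_sentinel);

      // Comparing only the lower 32 bits, as generated code does under
      // pointer compression: the values collide and appear equal.
      bool equal32 = (static_cast<uint32_t>(returned_entry) ==
                      static_cast<uint32_t>(heap_sentinel));

      std::printf("64-bit equal: %d, 32-bit equal: %d\n", equal64, equal32);
      return 0;  // prints "64-bit equal: 0, 32-bit equal: 1"
    }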

R=clemensb@chromium.org, joransiu@ca.ibm.com, junyan@redhat.com, midawson@redhat.com
BUG=
LOG=N

Change-Id: I92907b97a9d44d8cf42bb356ef350a22f7c5d5e1
Reviewed-on: https://chromium-review.googlesource.com/c/v8/v8/+/3666249
Commit-Queue: Milad Farazmand <mfarazma@redhat.com>
Reviewed-by: Clemens Backes <clemensb@chromium.org>
Reviewed-by: Junliang Yan <junyan@redhat.com>
Cr-Commit-Position: refs/heads/main@{#80752}
parent fe44d706
PPC:

@@ -2904,8 +2904,8 @@ void Builtins::Generate_Construct(MacroAssembler* masm) {
 void Builtins::Generate_WasmCompileLazy(MacroAssembler* masm) {
   // The function index was put in a register by the jump table trampoline.
   // Convert to Smi for the runtime call.
-  __ SmiTag(kWasmCompileLazyFuncIndexRegister,
-            kWasmCompileLazyFuncIndexRegister);
+  __ SmiTag(kWasmCompileLazyFuncIndexRegister);
   {
     HardAbortScope hard_abort(masm);  // Avoid calls to Abort.
     FrameAndConstantPoolScope scope(masm, StackFrame::WASM_COMPILE_LAZY);
@@ -2939,21 +2939,37 @@ void Builtins::Generate_WasmCompileLazy(MacroAssembler* masm) {
     __ MultiPush(gp_regs);
     __ MultiPushF64AndV128(fp_regs, simd_regs);
 
-    // Pass instance and function index as explicit arguments to the runtime
+    // Push the Wasm instance for loading the jump table address after the
+    // runtime call.
+    __ Push(kWasmInstanceRegister);
+
+    // Push the Wasm instance again as an explicit argument to the runtime
     // function.
-    __ Push(kWasmInstanceRegister, kWasmCompileLazyFuncIndexRegister);
+    __ Push(kWasmInstanceRegister);
+    // Push the function index as second argument.
+    __ Push(kWasmCompileLazyFuncIndexRegister);
     // Initialize the JavaScript context with 0. CEntry will use it to
     // set the current context on the isolate.
     __ LoadSmiLiteral(cp, Smi::zero());
     __ CallRuntime(Runtime::kWasmCompileLazy, 2);
-    // The entrypoint address is the return value.
-    __ mr(r11, kReturnRegister0);
+
+    // The runtime function returns the jump table slot offset as a Smi. Use
+    // that to compute the jump target in r11.
+    __ Pop(kWasmInstanceRegister);
+    __ LoadU64(
+        r11,
+        MemOperand(kWasmInstanceRegister,
+                   WasmInstanceObject::kJumpTableStartOffset - kHeapObjectTag),
+        r0);
+    __ SmiUntag(kReturnRegister0);
+    __ AddS64(r11, r11, kReturnRegister0);
+    // r11 now holds the jump table slot where we want to jump to in the end.
 
     // Restore registers.
     __ MultiPopF64AndV128(fp_regs, simd_regs);
     __ MultiPop(gp_regs);
   }
 
-  // Finally, jump to the entrypoint.
+  // Finally, jump to the jump table slot for the function.
   __ Jump(r11);
 }
s390:

@@ -2910,8 +2910,8 @@ void Builtins::Generate_Construct(MacroAssembler* masm) {
 void Builtins::Generate_WasmCompileLazy(MacroAssembler* masm) {
   // The function index was put in a register by the jump table trampoline.
   // Convert to Smi for the runtime call.
-  __ SmiTag(kWasmCompileLazyFuncIndexRegister,
-            kWasmCompileLazyFuncIndexRegister);
+  __ SmiTag(kWasmCompileLazyFuncIndexRegister);
   {
     HardAbortScope hard_abort(masm);  // Avoid calls to Abort.
     FrameAndConstantPoolScope scope(masm, StackFrame::WASM_COMPILE_LAZY);
@@ -2939,21 +2939,35 @@ void Builtins::Generate_WasmCompileLazy(MacroAssembler* masm) {
     __ MultiPush(gp_regs);
     __ MultiPushF64OrV128(fp_regs, ip);
 
-    // Pass instance and function index as explicit arguments to the runtime
+    // Push the Wasm instance for loading the jump table address after the
+    // runtime call.
+    __ Push(kWasmInstanceRegister);
+
+    // Push the Wasm instance again as an explicit argument to the runtime
     // function.
-    __ Push(kWasmInstanceRegister, r7);
+    __ Push(kWasmInstanceRegister);
+    // Push the function index as second argument.
+    __ Push(kWasmCompileLazyFuncIndexRegister);
     // Initialize the JavaScript context with 0. CEntry will use it to
     // set the current context on the isolate.
     __ LoadSmiLiteral(cp, Smi::zero());
     __ CallRuntime(Runtime::kWasmCompileLazy, 2);
-    // The entrypoint address is the return value.
-    __ mov(ip, r2);
+
+    // The runtime function returns the jump table slot offset as a Smi. Use
+    // that to compute the jump target in ip.
+    __ Pop(kWasmInstanceRegister);
+    __ LoadU64(ip, MemOperand(kWasmInstanceRegister,
+                              WasmInstanceObject::kJumpTableStartOffset -
+                                  kHeapObjectTag));
+    __ SmiUntag(kReturnRegister0);
+    __ AddS64(ip, ip, kReturnRegister0);
+    // ip now holds the jump table slot where we want to jump to in the end.
 
     // Restore registers.
     __ MultiPopF64OrV128(fp_regs, ip);
     __ MultiPop(gp_regs);
   }
 
-  // Finally, jump to the entrypoint.
+  // Finally, jump to the jump table slot for the function.
   __ Jump(ip);
 }
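
In plain terms, both the PPC and the s390 builtins now compute the jump target as "jump table start, loaded from the instance object, plus the untagged Smi offset returned by the runtime function" instead of jumping to a raw address returned by the runtime. The C++ sketch below is only a paraphrase of that computation for readers less familiar with the assembly; the struct and the Smi-shift parameter are made-up stand-ins, not V8's real WasmInstanceObject or Smi API.

    #include <cstdint>

    // Made-up stand-ins for illustration; V8's real WasmInstanceObject and
    // Smi types look different.
    struct InstanceSketch {
      uintptr_t jump_table_start;  // the field read via kJumpTableStartOffset
    };

    // What Generate_WasmCompileLazy does after Runtime::kWasmCompileLazy
    // returns: pop the saved instance, load the jump table start from it,
    // untag the Smi return value to get a byte offset, and add the two to
    // form the jump target.
    uintptr_t LazyCompileJumpTarget(const InstanceSketch& instance,
                                    intptr_t smi_tagged_offset,
                                    int smi_shift) {
      // SmiUntag: shift out the tag bits; the shift amount depends on the
      // V8 build configuration (e.g. 1 with pointer compression, 32 without).
      intptr_t offset = smi_tagged_offset >> smi_shift;
      // AddS64: jump table start + offset is the slot the builtin jumps to.
      return instance.jump_table_start + offset;
    }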