gh-139109: A new tracing JIT compiler frontend for CPython (GH-140310)
This PR changes the current JIT model from trace projection to trace recording.

Benchmarking:
* Better pyperformance geomean (about 1.7% overall) versus the current JIT: https://raw.githubusercontent.com/facebookexperimental/free-threading-benchmarking/refs/heads/main/results/bm-20251108-3.15.0a1%2B-7e2bc1d-JIT/bm-20251108-vultr-x86_64-Fidget%252dSpinner-tracing_jit-3.15.0a1%2B-7e2bc1d-vs-base.svg
* 100% faster on Richards, the most improved benchmark, versus the current JIT. A slowdown of about 10-15% on the worst benchmark versus the current JIT.
* **Note: the fastest version isn't the one merged, as it relies on fixing bugs in the specializing interpreter, which is left to another PR.** The speedup in the merged version is about 1.1%: https://raw.githubusercontent.com/facebookexperimental/free-threading-benchmarking/refs/heads/main/results/bm-20251112-3.15.0a1%2B-f8a764a-JIT/bm-20251112-vultr-x86_64-Fidget%252dSpinner-tracing_jit-3.15.0a1%2B-f8a764a-vs-base.svg

Stats: 50% more uops executed and 30% more traces entered, as of the last pystats run. The stats also suggest our trace lengths are too short for a real trace-recording JIT, as there are a lot of trace-too-long aborts: https://github.com/facebookexperimental/free-threading-benchmarking/blob/main/results/bm-20251023-3.15.0a1%2B-eb73378-CLANG%2CJIT/bm-20251023-vultr-x86_64-Fidget%252dSpinner-tracing_jit-3.15.0a1%2B-eb73378-pystats-vs-base.md

This new JIT frontend is already able to record and execute significantly more instructions than the previous one. With this PR, we can now record through custom dunders, simple object creation, generators, etc. None of these were handled by the old JIT frontend. Some custom-dunder uops were discovered to be broken as part of this work (gh-140277). The optimizer's stack-space check is disabled, as it is no longer valid for dealing with underflow.

Pros:
* Ignoring the generated tracer code (it's automatically created), this is only about 1k additional lines of code. The maintenance burden is handled by the DSL and code generator.
* `optimizer.c` is now significantly simpler, as we don't have to do strange things to recover the bytecode from a trace.
* The new JIT frontend is able to handle a lot more control flow than the old one.
* Tracing is very low overhead. We use the tail-calling interpreter/computed-goto interpreter to switch between tracing mode and non-tracing mode. I call this mechanism dual dispatch, as we have two dispatch tables dispatching to each other (see the sketch after this description). Specialization is still enabled while tracing.
* Better handling of polymorphism. We leverage the specializing interpreter for this.

Cons:
* (For now) requires the tail-calling interpreter or computed gotos. This means no Windows JIT for now :(. Not to fret, tail calling is coming to Windows soon: https://github.com/python/cpython/pull/139962

Design:
* After each instruction, the `record_previous_inst` function/label is executed. This does as the name suggests.
* The tracing interpreter lowers bytecode to uops directly, so that it can obtain "fresh" values at the point of lowering.
* The tracing interpreter behaves nearly identically to the normal interpreter; in fact, it even specializes! This allows it to run without much of a slowdown while tracing. The actual cost of tracing is only a function call and some writes to memory.
* The tracing interpreter uses the specializing interpreter's deopts to naturally form the side-exit chains. This allows it to chain side exits effectively without repeating much code. We force re-specialization when tracing a deopt (see the backoff-counter sketch below).
* The tracing interpreter can even handle goto errors/exceptions, but I chose to disable them for now as they are not yet tested.
* Because we do not share interpreter dispatch, there should be no significant slowdown to the original specializing interpreter on the tail-calling and computed-goto builds with the JIT disabled. With the JIT enabled, there might be a slowdown in the form of the JIT trying to trace.
* Anything that could have dynamic instruction-pointer effects is guarded. The guard deopts to a new instruction: `_DYNAMIC_EXIT`.
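To make the dual-dispatch mechanism concrete, here is a minimal, self-contained sketch (illustrative only, not CPython's actual code; all names are made up): two dispatch tables cover the same opcodes, and entering or leaving tracing mode is just swapping which table is live. Every slot of the tracing table points at one recording handler, which logs the instruction and then forwards to the real handler through the plain table, so execution semantics are unchanged:

```c
#include <stdio.h>

enum { OP_INC, OP_PRINT, OP_HALT, NUM_OPS };

typedef struct { int pc; int acc; } VM;
typedef void (*handler)(VM *, const unsigned char *);

static handler plain_table[NUM_OPS];
static handler tracing_table[NUM_OPS];
static handler *dispatch_table;   /* the live table */

static void dispatch(VM *vm, const unsigned char *code) {
    dispatch_table[code[vm->pc]](vm, code);
}

static void op_inc(VM *vm, const unsigned char *code)   { vm->acc++; vm->pc++; dispatch(vm, code); }
static void op_print(VM *vm, const unsigned char *code) { printf("acc=%d\n", vm->acc); vm->pc++; dispatch(vm, code); }
static void op_halt(VM *vm, const unsigned char *code)  { (void)vm; (void)code; }

/* Analogue of record_previous_inst: record the instruction, then run
 * the real handler via the non-tracing table, so the next dispatch()
 * goes back through whichever table is currently live. */
static void record_and_forward(VM *vm, const unsigned char *code) {
    printf("trace: opcode %d at pc %d\n", code[vm->pc], vm->pc);
    plain_table[code[vm->pc]](vm, code);   /* DISPATCH_GOTO_NON_TRACING() */
}

int main(void) {
    plain_table[OP_INC] = op_inc;
    plain_table[OP_PRINT] = op_print;
    plain_table[OP_HALT] = op_halt;
    for (int i = 0; i < NUM_OPS; i++) {
        tracing_table[i] = record_and_forward;
    }
    const unsigned char code[] = { OP_INC, OP_INC, OP_PRINT, OP_HALT };
    VM vm = { 0, 0 };
    dispatch_table = tracing_table;   /* ENTER_TRACING() */
    dispatch(&vm, code);
    return 0;
}
```

In the real patch the two tables are `instruction_funcptr_handler_table`/`instruction_funcptr_tracing_table` (tail-calling build) or `opcode_targets_table`/`opcode_tracing_targets_table` (computed gotos), and the recording handler is the `record_previous_inst` label.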
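The backoff counters that gate both trace starts (`JUMP_BACKWARD_JIT`) and side-exit warmup (`_COLD_EXIT`) follow an exponential-backoff scheme. A simplified model (an assumed shape, not the real `_Py_BackoffCounter` encoding) shows the effect: each failed optimization attempt restarts the counter with a doubled wait, so cold code is retried less and less often:

```c
#include <stdio.h>

typedef struct { unsigned value; unsigned backoff; } Backoff;

static int triggers(Backoff c) { return c.value == 0; }
static Backoff advance(Backoff c) { c.value--; return c; }
static Backoff restart(Backoff c) {
    if (c.backoff < 12) {
        c.backoff++;              /* double the wait, with a cap */
    }
    c.value = 1u << c.backoff;
    return c;
}

int main(void) {
    Backoff temperature = { 4, 2 };   /* analogous to an exit's initial temperature */
    for (int visit = 1; visit <= 50; visit++) {
        if (!triggers(temperature)) {
            temperature = advance(temperature);   /* not hot yet */
            continue;
        }
        printf("visit %2d: hot, attempt optimization\n", visit);
        temperature = restart(temperature);       /* pretend the attempt failed */
    }
    return 0;
}
```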
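One further detail visible in the diff below: a side exit that successfully restarts tracing returns to tier 1 through `GOTO_TIER_ONE_CONTINUE_TRACING`, which smuggles a "keep tracing" flag in the low bit of the returned code-unit pointer. A sketch of that tagged-pointer convention (simplified, illustrative types):

```c
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

typedef struct { uint16_t opcode_and_arg; } CodeUnit;

/* Code units are at least 2-byte aligned, so bit 0 is free. */
static CodeUnit *tag_keep_tracing(CodeUnit *target) {
    return (CodeUnit *)((uintptr_t)target | 1);   /* set the flag bit */
}

int main(void) {
    static CodeUnit code[8];
    CodeUnit *returned = tag_keep_tracing(&code[3]);

    /* The tier-1 side reads and strips the bit before using the pointer. */
    int keep_tracing = (int)((uintptr_t)returned & 1);
    CodeUnit *next_instr = (CodeUnit *)((uintptr_t)returned & ~(uintptr_t)1);

    assert(keep_tracing && next_instr == &code[3]);
    printf("keep_tracing=%d, resumes at offset %td\n",
           keep_tracing, next_instr - code);
    return 0;
}
```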
@@ -2938,8 +2938,8 @@ dummy_func(
JUMP_BACKWARD_JIT,
};

tier1 op(_SPECIALIZE_JUMP_BACKWARD, (--)) {
#if ENABLE_SPECIALIZATION_FT
specializing tier1 op(_SPECIALIZE_JUMP_BACKWARD, (--)) {
#if ENABLE_SPECIALIZATION
if (this_instr->op.code == JUMP_BACKWARD) {
uint8_t desired = tstate->interp->jit ? JUMP_BACKWARD_JIT : JUMP_BACKWARD_NO_JIT;
FT_ATOMIC_STORE_UINT8_RELAXED(this_instr->op.code, desired);
@@ -2953,25 +2953,21 @@ dummy_func(
tier1 op(_JIT, (--)) {
#ifdef _Py_TIER2
_Py_BackoffCounter counter = this_instr[1].counter;
if (backoff_counter_triggers(counter) && this_instr->op.code == JUMP_BACKWARD_JIT) {
_Py_CODEUNIT *start = this_instr;
/* Back up over EXTENDED_ARGs so optimizer sees the whole instruction */
if (!IS_JIT_TRACING() && backoff_counter_triggers(counter) &&
this_instr->op.code == JUMP_BACKWARD_JIT &&
next_instr->op.code != ENTER_EXECUTOR) {
/* Back up over EXTENDED_ARGs so executor is inserted at the correct place */
_Py_CODEUNIT *insert_exec_at = this_instr;
while (oparg > 255) {
oparg >>= 8;
start--;
insert_exec_at--;
}
_PyExecutorObject *executor;
int optimized = _PyOptimizer_Optimize(frame, start, &executor, 0);
if (optimized <= 0) {
this_instr[1].counter = restart_backoff_counter(counter);
ERROR_IF(optimized < 0);
int succ = _PyJit_TryInitializeTracing(tstate, frame, this_instr, insert_exec_at, next_instr, STACK_LEVEL(), 0, NULL, oparg);
if (succ) {
ENTER_TRACING();
}
else {
this_instr[1].counter = initial_jump_backoff_counter();
assert(tstate->current_executor == NULL);
assert(executor != tstate->interp->cold_executor);
tstate->jit_exit = NULL;
TIER1_TO_TIER2(executor);
this_instr[1].counter = restart_backoff_counter(counter);
}
}
else {
@@ -3017,6 +3013,10 @@ dummy_func(

tier1 inst(ENTER_EXECUTOR, (--)) {
#ifdef _Py_TIER2
if (IS_JIT_TRACING()) {
next_instr = this_instr;
goto stop_tracing;
}
PyCodeObject *code = _PyFrame_GetCode(frame);
_PyExecutorObject *executor = code->co_executors->executors[oparg & 255];
assert(executor->vm_data.index == INSTR_OFFSET() - 1);
@@ -3078,7 +3078,7 @@ dummy_func(

macro(POP_JUMP_IF_NOT_NONE) = unused/1 + _IS_NONE + _POP_JUMP_IF_FALSE;

tier1 inst(JUMP_BACKWARD_NO_INTERRUPT, (--)) {
replaced inst(JUMP_BACKWARD_NO_INTERRUPT, (--)) {
/* This bytecode is used in the `yield from` or `await` loop.
* If there is an interrupt, we want it handled in the innermost
* generator or coroutine, so we deliberately do not check it here.
@@ -5245,19 +5245,40 @@ dummy_func(
tier2 op(_EXIT_TRACE, (exit_p/4 --)) {
_PyExitData *exit = (_PyExitData *)exit_p;
#if defined(Py_DEBUG) && !defined(_Py_JIT)
_Py_CODEUNIT *target = _PyFrame_GetBytecode(frame) + exit->target;
const _Py_CODEUNIT *target = ((frame->owner == FRAME_OWNED_BY_INTERPRETER)
? _Py_INTERPRETER_TRAMPOLINE_INSTRUCTIONS_PTR : _PyFrame_GetBytecode(frame))
+ exit->target;
OPT_HIST(trace_uop_execution_counter, trace_run_length_hist);
if (frame->lltrace >= 2) {
if (frame->lltrace >= 3) {
printf("SIDE EXIT: [UOp ");
_PyUOpPrint(&next_uop[-1]);
printf(", exit %tu, temp %d, target %d -> %s, is_control_flow %d]\n",
exit - current_executor->exits, exit->temperature.value_and_backoff,
(int)(target - _PyFrame_GetBytecode(frame)),
_PyOpcode_OpName[target->op.code], exit->is_control_flow);
}
#endif
tstate->jit_exit = exit;
TIER2_TO_TIER2(exit->executor);
}

tier2 op(_DYNAMIC_EXIT, (exit_p/4 --)) {
#if defined(Py_DEBUG) && !defined(_Py_JIT)
_PyExitData *exit = (_PyExitData *)exit_p;
_Py_CODEUNIT *target = frame->instr_ptr;
OPT_HIST(trace_uop_execution_counter, trace_run_length_hist);
if (frame->lltrace >= 3) {
printf("DYNAMIC EXIT: [UOp ");
_PyUOpPrint(&next_uop[-1]);
printf(", exit %tu, temp %d, target %d -> %s]\n",
exit - current_executor->exits, exit->temperature.value_and_backoff,
(int)(target - _PyFrame_GetBytecode(frame)),
_PyOpcode_OpName[target->op.code]);
}
#endif
tstate->jit_exit = exit;
TIER2_TO_TIER2(exit->executor);
#endif
// Disabled for now (gh-139109) as it slows down dynamic code tremendously.
// Compile and jump to the cold dynamic executors in the future.
GOTO_TIER_ONE(frame->instr_ptr);
}

tier2 op(_CHECK_VALIDITY, (--)) {
@@ -5369,7 +5390,8 @@ dummy_func(
}

tier2 op(_DEOPT, (--)) {
GOTO_TIER_ONE(_PyFrame_GetBytecode(frame) + CURRENT_TARGET());
GOTO_TIER_ONE((frame->owner == FRAME_OWNED_BY_INTERPRETER)
? _Py_INTERPRETER_TRAMPOLINE_INSTRUCTIONS_PTR : _PyFrame_GetBytecode(frame) + CURRENT_TARGET());
}

tier2 op(_HANDLE_PENDING_AND_DEOPT, (--)) {
@@ -5399,32 +5421,76 @@ dummy_func(
tier2 op(_COLD_EXIT, ( -- )) {
_PyExitData *exit = tstate->jit_exit;
assert(exit != NULL);
assert(frame->owner < FRAME_OWNED_BY_INTERPRETER);
_Py_CODEUNIT *target = _PyFrame_GetBytecode(frame) + exit->target;
_Py_BackoffCounter temperature = exit->temperature;
if (!backoff_counter_triggers(temperature)) {
exit->temperature = advance_backoff_counter(temperature);
GOTO_TIER_ONE(target);
}
_PyExecutorObject *executor;
if (target->op.code == ENTER_EXECUTOR) {
PyCodeObject *code = _PyFrame_GetCode(frame);
executor = code->co_executors->executors[target->op.arg];
Py_INCREF(executor);
assert(tstate->jit_exit == exit);
exit->executor = executor;
TIER2_TO_TIER2(exit->executor);
}
else {
if (!backoff_counter_triggers(temperature)) {
exit->temperature = advance_backoff_counter(temperature);
GOTO_TIER_ONE(target);
}
_PyExecutorObject *previous_executor = _PyExecutor_FromExit(exit);
assert(tstate->current_executor == (PyObject *)previous_executor);
int chain_depth = previous_executor->vm_data.chain_depth + 1;
int optimized = _PyOptimizer_Optimize(frame, target, &executor, chain_depth);
if (optimized <= 0) {
exit->temperature = restart_backoff_counter(temperature);
GOTO_TIER_ONE(optimized < 0 ? NULL : target);
// For control-flow guards, we don't want to increase the chain depth, as those don't actually
// represent deopts but rather just normal programs!
int chain_depth = previous_executor->vm_data.chain_depth + !exit->is_control_flow;
// Note: it's safe to use target->op.arg here instead of the oparg given by EXTENDED_ARG.
// The invariant in the optimizer is the deopt target always points back to the first EXTENDED_ARG.
// So setting it to anything else is wrong.
int succ = _PyJit_TryInitializeTracing(tstate, frame, target, target, target, STACK_LEVEL(), chain_depth, exit, target->op.arg);
exit->temperature = restart_backoff_counter(exit->temperature);
if (succ) {
GOTO_TIER_ONE_CONTINUE_TRACING(target);
}
exit->temperature = initial_temperature_backoff_counter();
GOTO_TIER_ONE(target);
}
}

tier2 op(_COLD_DYNAMIC_EXIT, ( -- )) {
// TODO (gh-139109): This should be similar to _COLD_EXIT in the future.
_Py_CODEUNIT *target = frame->instr_ptr;
GOTO_TIER_ONE(target);
}

tier2 op(_GUARD_IP__PUSH_FRAME, (ip/4 --)) {
_Py_CODEUNIT *target = frame->instr_ptr + IP_OFFSET_OF(_PUSH_FRAME);
if (target != (_Py_CODEUNIT *)ip) {
frame->instr_ptr += IP_OFFSET_OF(_PUSH_FRAME);
EXIT_IF(true);
}
}

tier2 op(_GUARD_IP_YIELD_VALUE, (ip/4 --)) {
_Py_CODEUNIT *target = frame->instr_ptr + IP_OFFSET_OF(YIELD_VALUE);
if (target != (_Py_CODEUNIT *)ip) {
frame->instr_ptr += IP_OFFSET_OF(YIELD_VALUE);
EXIT_IF(true);
}
}

tier2 op(_GUARD_IP_RETURN_VALUE, (ip/4 --)) {
_Py_CODEUNIT *target = frame->instr_ptr + IP_OFFSET_OF(RETURN_VALUE);
if (target != (_Py_CODEUNIT *)ip) {
frame->instr_ptr += IP_OFFSET_OF(RETURN_VALUE);
EXIT_IF(true);
}
}

tier2 op(_GUARD_IP_RETURN_GENERATOR, (ip/4 --)) {
_Py_CODEUNIT *target = frame->instr_ptr + IP_OFFSET_OF(RETURN_GENERATOR);
if (target != (_Py_CODEUNIT *)ip) {
frame->instr_ptr += IP_OFFSET_OF(RETURN_GENERATOR);
EXIT_IF(true);
}
assert(tstate->jit_exit == exit);
exit->executor = executor;
TIER2_TO_TIER2(exit->executor);
}

label(pop_2_error) {
@@ -5571,6 +5637,62 @@ dummy_func(
DISPATCH();
}

label(record_previous_inst) {
#if _Py_TIER2
assert(IS_JIT_TRACING());
int opcode = next_instr->op.code;
bool stop_tracing = (opcode == WITH_EXCEPT_START ||
opcode == RERAISE || opcode == CLEANUP_THROW ||
opcode == PUSH_EXC_INFO || opcode == INTERPRETER_EXIT);
int full = !_PyJit_translate_single_bytecode_to_trace(tstate, frame, next_instr, stop_tracing);
if (full) {
LEAVE_TRACING();
int err = stop_tracing_and_jit(tstate, frame);
ERROR_IF(err < 0);
DISPATCH_GOTO_NON_TRACING();
}
// Super instructions. Instruction deopted. There's a mismatch in what the stack expects
// in the optimizer. So we have to reflect in the trace correctly.
_PyThreadStateImpl *_tstate = (_PyThreadStateImpl *)tstate;
if ((_tstate->jit_tracer_state.prev_state.instr->op.code == CALL_LIST_APPEND &&
opcode == POP_TOP) ||
(_tstate->jit_tracer_state.prev_state.instr->op.code == BINARY_OP_INPLACE_ADD_UNICODE &&
opcode == STORE_FAST)) {
_tstate->jit_tracer_state.prev_state.instr_is_super = true;
}
else {
_tstate->jit_tracer_state.prev_state.instr = next_instr;
}
PyObject *prev_code = PyStackRef_AsPyObjectBorrow(frame->f_executable);
if (_tstate->jit_tracer_state.prev_state.instr_code != (PyCodeObject *)prev_code) {
Py_SETREF(_tstate->jit_tracer_state.prev_state.instr_code, (PyCodeObject*)Py_NewRef((prev_code)));
}

_tstate->jit_tracer_state.prev_state.instr_frame = frame;
_tstate->jit_tracer_state.prev_state.instr_oparg = oparg;
_tstate->jit_tracer_state.prev_state.instr_stacklevel = PyStackRef_IsNone(frame->f_executable) ? 2 : STACK_LEVEL();
if (_PyOpcode_Caches[_PyOpcode_Deopt[opcode]]) {
(&next_instr[1])->counter = trigger_backoff_counter();
}
DISPATCH_GOTO_NON_TRACING();
#else
Py_FatalError("JIT label executed in non-jit build.");
#endif
}

label(stop_tracing) {
#if _Py_TIER2
assert(IS_JIT_TRACING());
int opcode = next_instr->op.code;
_PyJit_translate_single_bytecode_to_trace(tstate, frame, NULL, true);
LEAVE_TRACING();
int err = stop_tracing_and_jit(tstate, frame);
ERROR_IF(err < 0);
DISPATCH_GOTO_NON_TRACING();
#else
Py_FatalError("JIT label executed in non-jit build.");
#endif
}


// END BYTECODES //

@@ -1004,6 +1004,8 @@ static const _Py_CODEUNIT _Py_INTERPRETER_TRAMPOLINE_INSTRUCTIONS[] = {
{ .op.code = RESUME, .op.arg = RESUME_OPARG_DEPTH1_MASK | RESUME_AT_FUNC_START }
};

const _Py_CODEUNIT *_Py_INTERPRETER_TRAMPOLINE_INSTRUCTIONS_PTR = (_Py_CODEUNIT*)&_Py_INTERPRETER_TRAMPOLINE_INSTRUCTIONS;

#ifdef Py_DEBUG
extern void _PyUOpPrint(const _PyUOpInstruction *uop);
#endif
@@ -1051,6 +1053,43 @@ _PyObjectArray_Free(PyObject **array, PyObject **scratch)
}
}

#if _Py_TIER2
// 0 for success, -1 for error.
static int
stop_tracing_and_jit(PyThreadState *tstate, _PyInterpreterFrame *frame)
{
int _is_sys_tracing = (tstate->c_tracefunc != NULL) || (tstate->c_profilefunc != NULL);
int err = 0;
if (!_PyErr_Occurred(tstate) && !_is_sys_tracing) {
err = _PyOptimizer_Optimize(frame, tstate);
}
_PyThreadStateImpl *_tstate = (_PyThreadStateImpl *)tstate;
// Deal with backoffs
_PyExitData *exit = _tstate->jit_tracer_state.initial_state.exit;
if (exit == NULL) {
// We hold a strong reference to the code object, so the instruction won't be freed.
if (err <= 0) {
_Py_BackoffCounter counter = _tstate->jit_tracer_state.initial_state.jump_backward_instr[1].counter;
_tstate->jit_tracer_state.initial_state.jump_backward_instr[1].counter = restart_backoff_counter(counter);
}
else {
_tstate->jit_tracer_state.initial_state.jump_backward_instr[1].counter = initial_jump_backoff_counter();
}
}
else {
// Likewise, we hold a strong reference to the executor containing this exit, so the exit is guaranteed
// to be valid to access.
if (err <= 0) {
exit->temperature = restart_backoff_counter(exit->temperature);
}
else {
exit->temperature = initial_temperature_backoff_counter();
}
}
_PyJit_FinalizeTracing(tstate);
return err;
}
#endif

/* _PyEval_EvalFrameDefault is too large to optimize for speed with PGO on MSVC.
*/
@@ -1180,9 +1219,9 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int
stack_pointer = _PyFrame_GetStackPointer(frame);
#if _Py_TAIL_CALL_INTERP
# if Py_STATS
return _TAIL_CALL_error(frame, stack_pointer, tstate, next_instr, instruction_funcptr_table, 0, lastopcode);
return _TAIL_CALL_error(frame, stack_pointer, tstate, next_instr, instruction_funcptr_handler_table, 0, lastopcode);
# else
return _TAIL_CALL_error(frame, stack_pointer, tstate, next_instr, instruction_funcptr_table, 0);
return _TAIL_CALL_error(frame, stack_pointer, tstate, next_instr, instruction_funcptr_handler_table, 0);
# endif
#else
goto error;
@@ -1191,9 +1230,9 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int

#if _Py_TAIL_CALL_INTERP
# if Py_STATS
return _TAIL_CALL_start_frame(frame, NULL, tstate, NULL, instruction_funcptr_table, 0, lastopcode);
return _TAIL_CALL_start_frame(frame, NULL, tstate, NULL, instruction_funcptr_handler_table, 0, lastopcode);
# else
return _TAIL_CALL_start_frame(frame, NULL, tstate, NULL, instruction_funcptr_table, 0);
return _TAIL_CALL_start_frame(frame, NULL, tstate, NULL, instruction_funcptr_handler_table, 0);
# endif
#else
goto start_frame;
@@ -1235,7 +1274,9 @@ _PyTier2Interpreter(
tier2_start:

next_uop = current_executor->trace;
assert(next_uop->opcode == _START_EXECUTOR || next_uop->opcode == _COLD_EXIT);
assert(next_uop->opcode == _START_EXECUTOR ||
next_uop->opcode == _COLD_EXIT ||
next_uop->opcode == _COLD_DYNAMIC_EXIT);

#undef LOAD_IP
#define LOAD_IP(UNUSED) (void)0
@@ -1259,7 +1300,9 @@ tier2_start:
uint64_t trace_uop_execution_counter = 0;
#endif

assert(next_uop->opcode == _START_EXECUTOR || next_uop->opcode == _COLD_EXIT);
assert(next_uop->opcode == _START_EXECUTOR ||
next_uop->opcode == _COLD_EXIT ||
next_uop->opcode == _COLD_DYNAMIC_EXIT);
tier2_dispatch:
for (;;) {
uopcode = next_uop->opcode;

@@ -93,11 +93,19 @@
# define Py_PRESERVE_NONE_CC __attribute__((preserve_none))
Py_PRESERVE_NONE_CC typedef PyObject* (*py_tail_call_funcptr)(TAIL_CALL_PARAMS);

# define DISPATCH_TABLE_VAR instruction_funcptr_table
# define DISPATCH_TABLE instruction_funcptr_handler_table
# define TRACING_DISPATCH_TABLE instruction_funcptr_tracing_table
# define TARGET(op) Py_PRESERVE_NONE_CC PyObject *_TAIL_CALL_##op(TAIL_CALL_PARAMS)

# define DISPATCH_GOTO() \
do { \
Py_MUSTTAIL return (((py_tail_call_funcptr *)instruction_funcptr_table)[opcode])(TAIL_CALL_ARGS); \
} while (0)
# define DISPATCH_GOTO_NON_TRACING() \
do { \
Py_MUSTTAIL return (((py_tail_call_funcptr *)DISPATCH_TABLE)[opcode])(TAIL_CALL_ARGS); \
} while (0)
# define JUMP_TO_LABEL(name) \
do { \
Py_MUSTTAIL return (_TAIL_CALL_##name)(TAIL_CALL_ARGS); \
@@ -115,19 +123,36 @@
# endif
# define LABEL(name) TARGET(name)
#elif USE_COMPUTED_GOTOS
# define DISPATCH_TABLE_VAR opcode_targets
# define DISPATCH_TABLE opcode_targets_table
# define TRACING_DISPATCH_TABLE opcode_tracing_targets_table
# define TARGET(op) TARGET_##op:
# define DISPATCH_GOTO() goto *opcode_targets[opcode]
# define DISPATCH_GOTO_NON_TRACING() goto *DISPATCH_TABLE[opcode];
# define JUMP_TO_LABEL(name) goto name;
# define JUMP_TO_PREDICTED(name) goto PREDICTED_##name;
# define LABEL(name) name:
#else
# define TARGET(op) case op: TARGET_##op:
# define DISPATCH_GOTO() goto dispatch_opcode
# define DISPATCH_GOTO_NON_TRACING() goto dispatch_opcode
# define JUMP_TO_LABEL(name) goto name;
# define JUMP_TO_PREDICTED(name) goto PREDICTED_##name;
# define LABEL(name) name:
#endif

#if (_Py_TAIL_CALL_INTERP || USE_COMPUTED_GOTOS) && _Py_TIER2
# define IS_JIT_TRACING() (DISPATCH_TABLE_VAR == TRACING_DISPATCH_TABLE)
# define ENTER_TRACING() \
DISPATCH_TABLE_VAR = TRACING_DISPATCH_TABLE;
# define LEAVE_TRACING() \
DISPATCH_TABLE_VAR = DISPATCH_TABLE;
#else
# define IS_JIT_TRACING() (0)
# define ENTER_TRACING()
# define LEAVE_TRACING()
#endif

/* PRE_DISPATCH_GOTO() does lltrace if enabled. Normally a no-op */
#ifdef Py_DEBUG
#define PRE_DISPATCH_GOTO() if (frame->lltrace >= 5) { \
@@ -164,11 +189,19 @@ do { \
DISPATCH_GOTO(); \
}

#define DISPATCH_NON_TRACING() \
{ \
assert(frame->stackpointer == NULL); \
NEXTOPARG(); \
PRE_DISPATCH_GOTO(); \
DISPATCH_GOTO_NON_TRACING(); \
}

#define DISPATCH_SAME_OPARG() \
{ \
opcode = next_instr->op.code; \
PRE_DISPATCH_GOTO(); \
DISPATCH_GOTO(); \
DISPATCH_GOTO_NON_TRACING(); \
}

#define DISPATCH_INLINED(NEW_FRAME) \
@@ -280,6 +313,7 @@ GETITEM(PyObject *v, Py_ssize_t i) {
/* This takes a uint16_t instead of a _Py_BackoffCounter,
* because it is used directly on the cache entry in generated code,
* which is always an integral type. */
// Force re-specialization when tracing a side exit to get good side exits.
#define ADAPTIVE_COUNTER_TRIGGERS(COUNTER) \
backoff_counter_triggers(forge_backoff_counter((COUNTER)))

@@ -366,12 +400,19 @@ do { \
next_instr = _Py_jit_entry((EXECUTOR), frame, stack_pointer, tstate); \
frame = tstate->current_frame; \
stack_pointer = _PyFrame_GetStackPointer(frame); \
int keep_tracing_bit = (uintptr_t)next_instr & 1; \
next_instr = (_Py_CODEUNIT *)(((uintptr_t)next_instr) & (~1)); \
if (next_instr == NULL) { \
/* gh-140104: The exception handler expects frame->instr_ptr
to after this_instr, not this_instr! */ \
next_instr = frame->instr_ptr + 1; \
JUMP_TO_LABEL(error); \
} \
if (keep_tracing_bit) { \
assert(((_PyThreadStateImpl *)tstate)->jit_tracer_state.prev_state.code_curr_size == 2); \
ENTER_TRACING(); \
DISPATCH_NON_TRACING(); \
} \
DISPATCH(); \
} while (0)

@@ -382,13 +423,23 @@ do { \
goto tier2_start; \
} while (0)

#define GOTO_TIER_ONE(TARGET) \
do \
{ \
tstate->current_executor = NULL; \
OPT_HIST(trace_uop_execution_counter, trace_run_length_hist); \
_PyFrame_SetStackPointer(frame, stack_pointer); \
return TARGET; \
#define GOTO_TIER_ONE_SETUP \
tstate->current_executor = NULL; \
OPT_HIST(trace_uop_execution_counter, trace_run_length_hist); \
_PyFrame_SetStackPointer(frame, stack_pointer);

#define GOTO_TIER_ONE(TARGET) \
do \
{ \
GOTO_TIER_ONE_SETUP \
return (_Py_CODEUNIT *)(TARGET); \
} while (0)

#define GOTO_TIER_ONE_CONTINUE_TRACING(TARGET) \
do \
{ \
GOTO_TIER_ONE_SETUP \
return (_Py_CODEUNIT *)(((uintptr_t)(TARGET))| 1); \
} while (0)

#define CURRENT_OPARG() (next_uop[-1].oparg)

Python/executor_cases.c.h (generated, 139 changes)
@@ -4189,6 +4189,8 @@
break;
}

/* _JUMP_BACKWARD_NO_INTERRUPT is not a viable micro-op for tier 2 because it is replaced */

case _GET_LEN: {
_PyStackRef obj;
_PyStackRef len;
@@ -7108,12 +7110,36 @@
PyObject *exit_p = (PyObject *)CURRENT_OPERAND0();
_PyExitData *exit = (_PyExitData *)exit_p;
#if defined(Py_DEBUG) && !defined(_Py_JIT)
_Py_CODEUNIT *target = _PyFrame_GetBytecode(frame) + exit->target;
const _Py_CODEUNIT *target = ((frame->owner == FRAME_OWNED_BY_INTERPRETER)
? _Py_INTERPRETER_TRAMPOLINE_INSTRUCTIONS_PTR : _PyFrame_GetBytecode(frame))
+ exit->target;
OPT_HIST(trace_uop_execution_counter, trace_run_length_hist);
if (frame->lltrace >= 2) {
if (frame->lltrace >= 3) {
_PyFrame_SetStackPointer(frame, stack_pointer);
printf("SIDE EXIT: [UOp ");
_PyUOpPrint(&next_uop[-1]);
printf(", exit %tu, temp %d, target %d -> %s, is_control_flow %d]\n",
exit - current_executor->exits, exit->temperature.value_and_backoff,
(int)(target - _PyFrame_GetBytecode(frame)),
_PyOpcode_OpName[target->op.code], exit->is_control_flow);
stack_pointer = _PyFrame_GetStackPointer(frame);
}
#endif
tstate->jit_exit = exit;
TIER2_TO_TIER2(exit->executor);
break;
}

case _DYNAMIC_EXIT: {
PyObject *exit_p = (PyObject *)CURRENT_OPERAND0();
#if defined(Py_DEBUG) && !defined(_Py_JIT)
_PyExitData *exit = (_PyExitData *)exit_p;
_Py_CODEUNIT *target = frame->instr_ptr;
OPT_HIST(trace_uop_execution_counter, trace_run_length_hist);
if (frame->lltrace >= 3) {
_PyFrame_SetStackPointer(frame, stack_pointer);
printf("DYNAMIC EXIT: [UOp ");
_PyUOpPrint(&next_uop[-1]);
printf(", exit %tu, temp %d, target %d -> %s]\n",
exit - current_executor->exits, exit->temperature.value_and_backoff,
(int)(target - _PyFrame_GetBytecode(frame)),
@@ -7121,8 +7147,8 @@
stack_pointer = _PyFrame_GetStackPointer(frame);
}
#endif
tstate->jit_exit = exit;
TIER2_TO_TIER2(exit->executor);

GOTO_TIER_ONE(frame->instr_ptr);
break;
}

@@ -7419,7 +7445,8 @@
}

case _DEOPT: {
GOTO_TIER_ONE(_PyFrame_GetBytecode(frame) + CURRENT_TARGET());
GOTO_TIER_ONE((frame->owner == FRAME_OWNED_BY_INTERPRETER)
? _Py_INTERPRETER_TRAMPOLINE_INSTRUCTIONS_PTR : _PyFrame_GetBytecode(frame) + CURRENT_TARGET());
break;
}

@@ -7460,37 +7487,101 @@
case _COLD_EXIT: {
_PyExitData *exit = tstate->jit_exit;
assert(exit != NULL);
assert(frame->owner < FRAME_OWNED_BY_INTERPRETER);
_Py_CODEUNIT *target = _PyFrame_GetBytecode(frame) + exit->target;
_Py_BackoffCounter temperature = exit->temperature;
if (!backoff_counter_triggers(temperature)) {
exit->temperature = advance_backoff_counter(temperature);
GOTO_TIER_ONE(target);
}
_PyExecutorObject *executor;
if (target->op.code == ENTER_EXECUTOR) {
PyCodeObject *code = _PyFrame_GetCode(frame);
executor = code->co_executors->executors[target->op.arg];
Py_INCREF(executor);
assert(tstate->jit_exit == exit);
exit->executor = executor;
TIER2_TO_TIER2(exit->executor);
}
else {
_PyFrame_SetStackPointer(frame, stack_pointer);
_PyExecutorObject *previous_executor = _PyExecutor_FromExit(exit);
stack_pointer = _PyFrame_GetStackPointer(frame);
assert(tstate->current_executor == (PyObject *)previous_executor);
int chain_depth = previous_executor->vm_data.chain_depth + 1;
_PyFrame_SetStackPointer(frame, stack_pointer);
int optimized = _PyOptimizer_Optimize(frame, target, &executor, chain_depth);
stack_pointer = _PyFrame_GetStackPointer(frame);
if (optimized <= 0) {
exit->temperature = restart_backoff_counter(temperature);
GOTO_TIER_ONE(optimized < 0 ? NULL : target);
if (!backoff_counter_triggers(temperature)) {
exit->temperature = advance_backoff_counter(temperature);
GOTO_TIER_ONE(target);
}
exit->temperature = initial_temperature_backoff_counter();
_PyExecutorObject *previous_executor = _PyExecutor_FromExit(exit);
assert(tstate->current_executor == (PyObject *)previous_executor);
int chain_depth = previous_executor->vm_data.chain_depth + !exit->is_control_flow;
int succ = _PyJit_TryInitializeTracing(tstate, frame, target, target, target, STACK_LEVEL(), chain_depth, exit, target->op.arg);
exit->temperature = restart_backoff_counter(exit->temperature);
if (succ) {
GOTO_TIER_ONE_CONTINUE_TRACING(target);
}
GOTO_TIER_ONE(target);
}
assert(tstate->jit_exit == exit);
exit->executor = executor;
TIER2_TO_TIER2(exit->executor);
break;
}

case _COLD_DYNAMIC_EXIT: {
_Py_CODEUNIT *target = frame->instr_ptr;
GOTO_TIER_ONE(target);
break;
}

case _GUARD_IP__PUSH_FRAME: {
#define OFFSET_OF__PUSH_FRAME ((0))
PyObject *ip = (PyObject *)CURRENT_OPERAND0();
_Py_CODEUNIT *target = frame->instr_ptr + OFFSET_OF__PUSH_FRAME;
if (target != (_Py_CODEUNIT *)ip) {
frame->instr_ptr += OFFSET_OF__PUSH_FRAME;
if (true) {
UOP_STAT_INC(uopcode, miss);
JUMP_TO_JUMP_TARGET();
}
}
#undef OFFSET_OF__PUSH_FRAME
break;
}

case _GUARD_IP_YIELD_VALUE: {
#define OFFSET_OF_YIELD_VALUE ((1+INLINE_CACHE_ENTRIES_SEND))
PyObject *ip = (PyObject *)CURRENT_OPERAND0();
_Py_CODEUNIT *target = frame->instr_ptr + OFFSET_OF_YIELD_VALUE;
if (target != (_Py_CODEUNIT *)ip) {
frame->instr_ptr += OFFSET_OF_YIELD_VALUE;
if (true) {
UOP_STAT_INC(uopcode, miss);
JUMP_TO_JUMP_TARGET();
}
}
#undef OFFSET_OF_YIELD_VALUE
break;
}

case _GUARD_IP_RETURN_VALUE: {
#define OFFSET_OF_RETURN_VALUE ((frame->return_offset))
PyObject *ip = (PyObject *)CURRENT_OPERAND0();
_Py_CODEUNIT *target = frame->instr_ptr + OFFSET_OF_RETURN_VALUE;
if (target != (_Py_CODEUNIT *)ip) {
frame->instr_ptr += OFFSET_OF_RETURN_VALUE;
if (true) {
UOP_STAT_INC(uopcode, miss);
JUMP_TO_JUMP_TARGET();
}
}
#undef OFFSET_OF_RETURN_VALUE
break;
}

case _GUARD_IP_RETURN_GENERATOR: {
#define OFFSET_OF_RETURN_GENERATOR ((frame->return_offset))
PyObject *ip = (PyObject *)CURRENT_OPERAND0();
_Py_CODEUNIT *target = frame->instr_ptr + OFFSET_OF_RETURN_GENERATOR;
if (target != (_Py_CODEUNIT *)ip) {
frame->instr_ptr += OFFSET_OF_RETURN_GENERATOR;
if (true) {
UOP_STAT_INC(uopcode, miss);
JUMP_TO_JUMP_TARGET();
}
}
#undef OFFSET_OF_RETURN_GENERATOR
break;
}


#undef TIER_TWO

Python/generated_cases.c.h (generated, 104 changes)
@@ -5476,6 +5476,10 @@
INSTRUCTION_STATS(ENTER_EXECUTOR);
opcode = ENTER_EXECUTOR;
#ifdef _Py_TIER2
if (IS_JIT_TRACING()) {
next_instr = this_instr;
JUMP_TO_LABEL(stop_tracing);
}
PyCodeObject *code = _PyFrame_GetCode(frame);
_PyExecutorObject *executor = code->co_executors->executors[oparg & 255];
assert(executor->vm_data.index == INSTR_OFFSET() - 1);
@@ -7589,7 +7593,7 @@
/* Skip 1 cache entry */
// _SPECIALIZE_JUMP_BACKWARD
{
#if ENABLE_SPECIALIZATION_FT
#if ENABLE_SPECIALIZATION
if (this_instr->op.code == JUMP_BACKWARD) {
uint8_t desired = tstate->interp->jit ? JUMP_BACKWARD_JIT : JUMP_BACKWARD_NO_JIT;
FT_ATOMIC_STORE_UINT8_RELAXED(this_instr->op.code, desired);
@@ -7645,30 +7649,20 @@
{
#ifdef _Py_TIER2
_Py_BackoffCounter counter = this_instr[1].counter;
if (backoff_counter_triggers(counter) && this_instr->op.code == JUMP_BACKWARD_JIT) {
_Py_CODEUNIT *start = this_instr;
if (!IS_JIT_TRACING() && backoff_counter_triggers(counter) &&
this_instr->op.code == JUMP_BACKWARD_JIT &&
next_instr->op.code != ENTER_EXECUTOR) {
_Py_CODEUNIT *insert_exec_at = this_instr;
while (oparg > 255) {
oparg >>= 8;
start--;
insert_exec_at--;
}
_PyExecutorObject *executor;
_PyFrame_SetStackPointer(frame, stack_pointer);
int optimized = _PyOptimizer_Optimize(frame, start, &executor, 0);
stack_pointer = _PyFrame_GetStackPointer(frame);
if (optimized <= 0) {
this_instr[1].counter = restart_backoff_counter(counter);
if (optimized < 0) {
JUMP_TO_LABEL(error);
}
int succ = _PyJit_TryInitializeTracing(tstate, frame, this_instr, insert_exec_at, next_instr, STACK_LEVEL(), 0, NULL, oparg);
if (succ) {
ENTER_TRACING();
}
else {
_PyFrame_SetStackPointer(frame, stack_pointer);
this_instr[1].counter = initial_jump_backoff_counter();
stack_pointer = _PyFrame_GetStackPointer(frame);
assert(tstate->current_executor == NULL);
assert(executor != tstate->interp->cold_executor);
tstate->jit_exit = NULL;
TIER1_TO_TIER2(executor);
this_instr[1].counter = restart_backoff_counter(counter);
}
}
else {
@@ -12265,5 +12259,75 @@ JUMP_TO_LABEL(error);
DISPATCH();
}

LABEL(record_previous_inst)
{
#if _Py_TIER2
assert(IS_JIT_TRACING());
int opcode = next_instr->op.code;
bool stop_tracing = (opcode == WITH_EXCEPT_START ||
opcode == RERAISE || opcode == CLEANUP_THROW ||
opcode == PUSH_EXC_INFO || opcode == INTERPRETER_EXIT);
_PyFrame_SetStackPointer(frame, stack_pointer);
int full = !_PyJit_translate_single_bytecode_to_trace(tstate, frame, next_instr, stop_tracing);
stack_pointer = _PyFrame_GetStackPointer(frame);
if (full) {
LEAVE_TRACING();
_PyFrame_SetStackPointer(frame, stack_pointer);
int err = stop_tracing_and_jit(tstate, frame);
stack_pointer = _PyFrame_GetStackPointer(frame);
if (err < 0) {
JUMP_TO_LABEL(error);
}
DISPATCH_GOTO_NON_TRACING();
}
_PyThreadStateImpl *_tstate = (_PyThreadStateImpl *)tstate;
if ((_tstate->jit_tracer_state.prev_state.instr->op.code == CALL_LIST_APPEND &&
opcode == POP_TOP) ||
(_tstate->jit_tracer_state.prev_state.instr->op.code == BINARY_OP_INPLACE_ADD_UNICODE &&
opcode == STORE_FAST)) {
_tstate->jit_tracer_state.prev_state.instr_is_super = true;
}
else {
_tstate->jit_tracer_state.prev_state.instr = next_instr;
}
PyObject *prev_code = PyStackRef_AsPyObjectBorrow(frame->f_executable);
if (_tstate->jit_tracer_state.prev_state.instr_code != (PyCodeObject *)prev_code) {
_PyFrame_SetStackPointer(frame, stack_pointer);
Py_SETREF(_tstate->jit_tracer_state.prev_state.instr_code, (PyCodeObject*)Py_NewRef((prev_code)));
stack_pointer = _PyFrame_GetStackPointer(frame);
}
_tstate->jit_tracer_state.prev_state.instr_frame = frame;
_tstate->jit_tracer_state.prev_state.instr_oparg = oparg;
_tstate->jit_tracer_state.prev_state.instr_stacklevel = PyStackRef_IsNone(frame->f_executable) ? 2 : STACK_LEVEL();
if (_PyOpcode_Caches[_PyOpcode_Deopt[opcode]]) {
(&next_instr[1])->counter = trigger_backoff_counter();
}
DISPATCH_GOTO_NON_TRACING();
#else
Py_FatalError("JIT label executed in non-jit build.");
#endif
}

LABEL(stop_tracing)
{
#if _Py_TIER2
assert(IS_JIT_TRACING());
int opcode = next_instr->op.code;
_PyFrame_SetStackPointer(frame, stack_pointer);
_PyJit_translate_single_bytecode_to_trace(tstate, frame, NULL, true);
stack_pointer = _PyFrame_GetStackPointer(frame);
LEAVE_TRACING();
_PyFrame_SetStackPointer(frame, stack_pointer);
int err = stop_tracing_and_jit(tstate, frame);
stack_pointer = _PyFrame_GetStackPointer(frame);
if (err < 0) {
JUMP_TO_LABEL(error);
}
DISPATCH_GOTO_NON_TRACING();
#else
Py_FatalError("JIT label executed in non-jit build.");
#endif
}

/* END LABELS */
#undef TIER_ONE

@@ -18,6 +18,7 @@
#include "pycore_tuple.h" // _PyTuple_FromArraySteal()

#include "opcode_ids.h"
#include "pycore_optimizer.h"


/* Uncomment this to dump debugging output when assertions fail */
@@ -1785,6 +1786,7 @@ force_instrument_lock_held(PyCodeObject *code, PyInterpreterState *interp)
_PyCode_Clear_Executors(code);
}
_Py_Executors_InvalidateDependency(interp, code, 1);
_PyJit_Tracer_InvalidateDependency(PyThreadState_GET(), code);
#endif
int code_len = (int)Py_SIZE(code);
/* Exit early to avoid creating instrumentation

@@ -604,7 +604,7 @@ _PyJIT_Compile(_PyExecutorObject *executor, const _PyUOpInstruction trace[], siz
unsigned char *code = memory;
state.trampolines.mem = memory + code_size;
unsigned char *data = memory + code_size + state.trampolines.size + code_padding;
assert(trace[0].opcode == _START_EXECUTOR || trace[0].opcode == _COLD_EXIT);
assert(trace[0].opcode == _START_EXECUTOR || trace[0].opcode == _COLD_EXIT || trace[0].opcode == _COLD_DYNAMIC_EXIT);
for (size_t i = 0; i < length; i++) {
const _PyUOpInstruction *instruction = &trace[i];
group = &stencil_groups[instruction->opcode];

Python/opcode_targets.h (generated, 526 changes)
@@ -257,8 +257,270 @@ static void *opcode_targets_table[256] = {
&&TARGET_INSTRUMENTED_LINE,
&&TARGET_ENTER_EXECUTOR,
};
#if _Py_TIER2
static void *opcode_tracing_targets_table[256] = {
&&record_previous_inst,   /* slots 0-120: 121 identical entries (elided here) */
&&_unknown_opcode,        /* slots 121-127: 7 identical entries (elided here) */
&&record_previous_inst,   /* slots 128-209: 82 identical entries (elided here) */
&&_unknown_opcode,        /* slots 210-233: 24 identical entries (elided here) */
&&record_previous_inst,   /* slots 234-255: 22 identical entries (elided here) */
};
#endif
#else /* _Py_TAIL_CALL_INTERP */
static py_tail_call_funcptr instruction_funcptr_table[256];
static py_tail_call_funcptr instruction_funcptr_handler_table[256];

static py_tail_call_funcptr instruction_funcptr_tracing_table[256];

Py_PRESERVE_NONE_CC static PyObject *_TAIL_CALL_pop_2_error(TAIL_CALL_PARAMS);
Py_PRESERVE_NONE_CC static PyObject *_TAIL_CALL_pop_1_error(TAIL_CALL_PARAMS);
@@ -266,6 +528,8 @@ Py_PRESERVE_NONE_CC static PyObject *_TAIL_CALL_error(TAIL_CALL_PARAMS);
Py_PRESERVE_NONE_CC static PyObject *_TAIL_CALL_exception_unwind(TAIL_CALL_PARAMS);
Py_PRESERVE_NONE_CC static PyObject *_TAIL_CALL_exit_unwind(TAIL_CALL_PARAMS);
Py_PRESERVE_NONE_CC static PyObject *_TAIL_CALL_start_frame(TAIL_CALL_PARAMS);
Py_PRESERVE_NONE_CC static PyObject *_TAIL_CALL_record_previous_inst(TAIL_CALL_PARAMS);
Py_PRESERVE_NONE_CC static PyObject *_TAIL_CALL_stop_tracing(TAIL_CALL_PARAMS);

Py_PRESERVE_NONE_CC static PyObject *_TAIL_CALL_BINARY_OP(TAIL_CALL_PARAMS);
Py_PRESERVE_NONE_CC static PyObject *_TAIL_CALL_BINARY_OP_ADD_FLOAT(TAIL_CALL_PARAMS);
@@ -503,7 +767,7 @@ Py_PRESERVE_NONE_CC static PyObject *_TAIL_CALL_UNKNOWN_OPCODE(TAIL_CALL_PARAMS)
JUMP_TO_LABEL(error);
}

static py_tail_call_funcptr instruction_funcptr_table[256] = {
static py_tail_call_funcptr instruction_funcptr_handler_table[256] = {
[BINARY_OP] = _TAIL_CALL_BINARY_OP,
[BINARY_OP_ADD_FLOAT] = _TAIL_CALL_BINARY_OP_ADD_FLOAT,
[BINARY_OP_ADD_INT] = _TAIL_CALL_BINARY_OP_ADD_INT,
@@ -761,4 +1025,262 @@ static py_tail_call_funcptr instruction_funcptr_table[256] = {
[232] = _TAIL_CALL_UNKNOWN_OPCODE,
[233] = _TAIL_CALL_UNKNOWN_OPCODE,
};
static py_tail_call_funcptr instruction_funcptr_tracing_table[256] = {
[BINARY_OP] = _TAIL_CALL_record_previous_inst,
[BINARY_OP_ADD_FLOAT] = _TAIL_CALL_record_previous_inst,
[BINARY_OP_ADD_INT] = _TAIL_CALL_record_previous_inst,
[BINARY_OP_ADD_UNICODE] = _TAIL_CALL_record_previous_inst,
[BINARY_OP_EXTEND] = _TAIL_CALL_record_previous_inst,
[BINARY_OP_INPLACE_ADD_UNICODE] = _TAIL_CALL_record_previous_inst,
[BINARY_OP_MULTIPLY_FLOAT] = _TAIL_CALL_record_previous_inst,
[BINARY_OP_MULTIPLY_INT] = _TAIL_CALL_record_previous_inst,
[BINARY_OP_SUBSCR_DICT] = _TAIL_CALL_record_previous_inst,
[BINARY_OP_SUBSCR_GETITEM] = _TAIL_CALL_record_previous_inst,
[BINARY_OP_SUBSCR_LIST_INT] = _TAIL_CALL_record_previous_inst,
[BINARY_OP_SUBSCR_LIST_SLICE] = _TAIL_CALL_record_previous_inst,
[BINARY_OP_SUBSCR_STR_INT] = _TAIL_CALL_record_previous_inst,
[BINARY_OP_SUBSCR_TUPLE_INT] = _TAIL_CALL_record_previous_inst,
[BINARY_OP_SUBTRACT_FLOAT] = _TAIL_CALL_record_previous_inst,
[BINARY_OP_SUBTRACT_INT] = _TAIL_CALL_record_previous_inst,
[BINARY_SLICE] = _TAIL_CALL_record_previous_inst,
[BUILD_INTERPOLATION] = _TAIL_CALL_record_previous_inst,
[BUILD_LIST] = _TAIL_CALL_record_previous_inst,
[BUILD_MAP] = _TAIL_CALL_record_previous_inst,
[BUILD_SET] = _TAIL_CALL_record_previous_inst,
[BUILD_SLICE] = _TAIL_CALL_record_previous_inst,
[BUILD_STRING] = _TAIL_CALL_record_previous_inst,
[BUILD_TEMPLATE] = _TAIL_CALL_record_previous_inst,
[BUILD_TUPLE] = _TAIL_CALL_record_previous_inst,
[CACHE] = _TAIL_CALL_record_previous_inst,
[CALL] = _TAIL_CALL_record_previous_inst,
[CALL_ALLOC_AND_ENTER_INIT] = _TAIL_CALL_record_previous_inst,
[CALL_BOUND_METHOD_EXACT_ARGS] = _TAIL_CALL_record_previous_inst,
[CALL_BOUND_METHOD_GENERAL] = _TAIL_CALL_record_previous_inst,
[CALL_BUILTIN_CLASS] = _TAIL_CALL_record_previous_inst,
[CALL_BUILTIN_FAST] = _TAIL_CALL_record_previous_inst,
[CALL_BUILTIN_FAST_WITH_KEYWORDS] = _TAIL_CALL_record_previous_inst,
[CALL_BUILTIN_O] = _TAIL_CALL_record_previous_inst,
[CALL_FUNCTION_EX] = _TAIL_CALL_record_previous_inst,
[CALL_INTRINSIC_1] = _TAIL_CALL_record_previous_inst,
[CALL_INTRINSIC_2] = _TAIL_CALL_record_previous_inst,
[CALL_ISINSTANCE] = _TAIL_CALL_record_previous_inst,
[CALL_KW] = _TAIL_CALL_record_previous_inst,
[CALL_KW_BOUND_METHOD] = _TAIL_CALL_record_previous_inst,
[CALL_KW_NON_PY] = _TAIL_CALL_record_previous_inst,
[CALL_KW_PY] = _TAIL_CALL_record_previous_inst,
[CALL_LEN] = _TAIL_CALL_record_previous_inst,
[CALL_LIST_APPEND] = _TAIL_CALL_record_previous_inst,
[CALL_METHOD_DESCRIPTOR_FAST] = _TAIL_CALL_record_previous_inst,
[CALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS] = _TAIL_CALL_record_previous_inst,
[CALL_METHOD_DESCRIPTOR_NOARGS] = _TAIL_CALL_record_previous_inst,
[CALL_METHOD_DESCRIPTOR_O] = _TAIL_CALL_record_previous_inst,
[CALL_NON_PY_GENERAL] = _TAIL_CALL_record_previous_inst,
[CALL_PY_EXACT_ARGS] = _TAIL_CALL_record_previous_inst,
[CALL_PY_GENERAL] = _TAIL_CALL_record_previous_inst,
[CALL_STR_1] = _TAIL_CALL_record_previous_inst,
[CALL_TUPLE_1] = _TAIL_CALL_record_previous_inst,
[CALL_TYPE_1] = _TAIL_CALL_record_previous_inst,
[CHECK_EG_MATCH] = _TAIL_CALL_record_previous_inst,
[CHECK_EXC_MATCH] = _TAIL_CALL_record_previous_inst,
[CLEANUP_THROW] = _TAIL_CALL_record_previous_inst,
[COMPARE_OP] = _TAIL_CALL_record_previous_inst,
[COMPARE_OP_FLOAT] = _TAIL_CALL_record_previous_inst,
[COMPARE_OP_INT] = _TAIL_CALL_record_previous_inst,
[COMPARE_OP_STR] = _TAIL_CALL_record_previous_inst,
[CONTAINS_OP] = _TAIL_CALL_record_previous_inst,
[CONTAINS_OP_DICT] = _TAIL_CALL_record_previous_inst,
[CONTAINS_OP_SET] = _TAIL_CALL_record_previous_inst,
[CONVERT_VALUE] = _TAIL_CALL_record_previous_inst,
[COPY] = _TAIL_CALL_record_previous_inst,
[COPY_FREE_VARS] = _TAIL_CALL_record_previous_inst,
[DELETE_ATTR] = _TAIL_CALL_record_previous_inst,
[DELETE_DEREF] = _TAIL_CALL_record_previous_inst,
[DELETE_FAST] = _TAIL_CALL_record_previous_inst,
[DELETE_GLOBAL] = _TAIL_CALL_record_previous_inst,
[DELETE_NAME] = _TAIL_CALL_record_previous_inst,
[DELETE_SUBSCR] = _TAIL_CALL_record_previous_inst,
[DICT_MERGE] = _TAIL_CALL_record_previous_inst,
[DICT_UPDATE] = _TAIL_CALL_record_previous_inst,
[END_ASYNC_FOR] = _TAIL_CALL_record_previous_inst,
[END_FOR] = _TAIL_CALL_record_previous_inst,
[END_SEND] = _TAIL_CALL_record_previous_inst,
[ENTER_EXECUTOR] = _TAIL_CALL_record_previous_inst,
[EXIT_INIT_CHECK] = _TAIL_CALL_record_previous_inst,
[EXTENDED_ARG] = _TAIL_CALL_record_previous_inst,
[FORMAT_SIMPLE] = _TAIL_CALL_record_previous_inst,
[FORMAT_WITH_SPEC] = _TAIL_CALL_record_previous_inst,
[FOR_ITER] = _TAIL_CALL_record_previous_inst,
[FOR_ITER_GEN] = _TAIL_CALL_record_previous_inst,
[FOR_ITER_LIST] = _TAIL_CALL_record_previous_inst,
[FOR_ITER_RANGE] = _TAIL_CALL_record_previous_inst,
[FOR_ITER_TUPLE] = _TAIL_CALL_record_previous_inst,
[GET_AITER] = _TAIL_CALL_record_previous_inst,
[GET_ANEXT] = _TAIL_CALL_record_previous_inst,
[GET_AWAITABLE] = _TAIL_CALL_record_previous_inst,
[GET_ITER] = _TAIL_CALL_record_previous_inst,
[GET_LEN] = _TAIL_CALL_record_previous_inst,
[GET_YIELD_FROM_ITER] = _TAIL_CALL_record_previous_inst,
[IMPORT_FROM] = _TAIL_CALL_record_previous_inst,
[IMPORT_NAME] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_CALL] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_CALL_FUNCTION_EX] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_CALL_KW] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_END_ASYNC_FOR] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_END_FOR] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_END_SEND] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_FOR_ITER] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_INSTRUCTION] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_JUMP_BACKWARD] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_JUMP_FORWARD] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_LINE] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_LOAD_SUPER_ATTR] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_NOT_TAKEN] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_POP_ITER] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_POP_JUMP_IF_FALSE] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_POP_JUMP_IF_NONE] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_POP_JUMP_IF_NOT_NONE] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_POP_JUMP_IF_TRUE] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_RESUME] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_RETURN_VALUE] = _TAIL_CALL_record_previous_inst,
[INSTRUMENTED_YIELD_VALUE] = _TAIL_CALL_record_previous_inst,
[INTERPRETER_EXIT] = _TAIL_CALL_record_previous_inst,
[IS_OP] = _TAIL_CALL_record_previous_inst,
[JUMP_BACKWARD] = _TAIL_CALL_record_previous_inst,
[JUMP_BACKWARD_JIT] = _TAIL_CALL_record_previous_inst,
[JUMP_BACKWARD_NO_INTERRUPT] = _TAIL_CALL_record_previous_inst,
[JUMP_BACKWARD_NO_JIT] = _TAIL_CALL_record_previous_inst,
[JUMP_FORWARD] = _TAIL_CALL_record_previous_inst,
[LIST_APPEND] = _TAIL_CALL_record_previous_inst,
[LIST_EXTEND] = _TAIL_CALL_record_previous_inst,
[LOAD_ATTR] = _TAIL_CALL_record_previous_inst,
[LOAD_ATTR_CLASS] = _TAIL_CALL_record_previous_inst,
[LOAD_ATTR_CLASS_WITH_METACLASS_CHECK] = _TAIL_CALL_record_previous_inst,
[LOAD_ATTR_GETATTRIBUTE_OVERRIDDEN] = _TAIL_CALL_record_previous_inst,
[LOAD_ATTR_INSTANCE_VALUE] = _TAIL_CALL_record_previous_inst,
[LOAD_ATTR_METHOD_LAZY_DICT] = _TAIL_CALL_record_previous_inst,
[LOAD_ATTR_METHOD_NO_DICT] = _TAIL_CALL_record_previous_inst,
[LOAD_ATTR_METHOD_WITH_VALUES] = _TAIL_CALL_record_previous_inst,
[LOAD_ATTR_MODULE] = _TAIL_CALL_record_previous_inst,
[LOAD_ATTR_NONDESCRIPTOR_NO_DICT] = _TAIL_CALL_record_previous_inst,
[LOAD_ATTR_NONDESCRIPTOR_WITH_VALUES] = _TAIL_CALL_record_previous_inst,
[LOAD_ATTR_PROPERTY] = _TAIL_CALL_record_previous_inst,
[LOAD_ATTR_SLOT] = _TAIL_CALL_record_previous_inst,
[LOAD_ATTR_WITH_HINT] = _TAIL_CALL_record_previous_inst,
[LOAD_BUILD_CLASS] = _TAIL_CALL_record_previous_inst,
[LOAD_COMMON_CONSTANT] = _TAIL_CALL_record_previous_inst,
[LOAD_CONST] = _TAIL_CALL_record_previous_inst,
[LOAD_DEREF] = _TAIL_CALL_record_previous_inst,
[LOAD_FAST] = _TAIL_CALL_record_previous_inst,
[LOAD_FAST_AND_CLEAR] = _TAIL_CALL_record_previous_inst,
[LOAD_FAST_BORROW] = _TAIL_CALL_record_previous_inst,
[LOAD_FAST_BORROW_LOAD_FAST_BORROW] = _TAIL_CALL_record_previous_inst,
[LOAD_FAST_CHECK] = _TAIL_CALL_record_previous_inst,
[LOAD_FAST_LOAD_FAST] = _TAIL_CALL_record_previous_inst,
[LOAD_FROM_DICT_OR_DEREF] = _TAIL_CALL_record_previous_inst,
[LOAD_FROM_DICT_OR_GLOBALS] = _TAIL_CALL_record_previous_inst,
[LOAD_GLOBAL] = _TAIL_CALL_record_previous_inst,
[LOAD_GLOBAL_BUILTIN] = _TAIL_CALL_record_previous_inst,
[LOAD_GLOBAL_MODULE] = _TAIL_CALL_record_previous_inst,
[LOAD_LOCALS] = _TAIL_CALL_record_previous_inst,
[LOAD_NAME] = _TAIL_CALL_record_previous_inst,
[LOAD_SMALL_INT] = _TAIL_CALL_record_previous_inst,
[LOAD_SPECIAL] = _TAIL_CALL_record_previous_inst,
[LOAD_SUPER_ATTR] = _TAIL_CALL_record_previous_inst,
[LOAD_SUPER_ATTR_ATTR] = _TAIL_CALL_record_previous_inst,
[LOAD_SUPER_ATTR_METHOD] = _TAIL_CALL_record_previous_inst,
[MAKE_CELL] = _TAIL_CALL_record_previous_inst,
[MAKE_FUNCTION] = _TAIL_CALL_record_previous_inst,
[MAP_ADD] = _TAIL_CALL_record_previous_inst,
[MATCH_CLASS] = _TAIL_CALL_record_previous_inst,
[MATCH_KEYS] = _TAIL_CALL_record_previous_inst,
[MATCH_MAPPING] = _TAIL_CALL_record_previous_inst,
[MATCH_SEQUENCE] = _TAIL_CALL_record_previous_inst,
[NOP] = _TAIL_CALL_record_previous_inst,
[NOT_TAKEN] = _TAIL_CALL_record_previous_inst,
[POP_EXCEPT] = _TAIL_CALL_record_previous_inst,
[POP_ITER] = _TAIL_CALL_record_previous_inst,
|
||||
[POP_JUMP_IF_FALSE] = _TAIL_CALL_record_previous_inst,
|
||||
[POP_JUMP_IF_NONE] = _TAIL_CALL_record_previous_inst,
|
||||
[POP_JUMP_IF_NOT_NONE] = _TAIL_CALL_record_previous_inst,
|
||||
[POP_JUMP_IF_TRUE] = _TAIL_CALL_record_previous_inst,
|
||||
[POP_TOP] = _TAIL_CALL_record_previous_inst,
|
||||
[PUSH_EXC_INFO] = _TAIL_CALL_record_previous_inst,
|
||||
[PUSH_NULL] = _TAIL_CALL_record_previous_inst,
|
||||
[RAISE_VARARGS] = _TAIL_CALL_record_previous_inst,
|
||||
[RERAISE] = _TAIL_CALL_record_previous_inst,
|
||||
[RESERVED] = _TAIL_CALL_record_previous_inst,
|
||||
[RESUME] = _TAIL_CALL_record_previous_inst,
|
||||
[RESUME_CHECK] = _TAIL_CALL_record_previous_inst,
|
||||
[RETURN_GENERATOR] = _TAIL_CALL_record_previous_inst,
|
||||
[RETURN_VALUE] = _TAIL_CALL_record_previous_inst,
|
||||
[SEND] = _TAIL_CALL_record_previous_inst,
|
||||
[SEND_GEN] = _TAIL_CALL_record_previous_inst,
|
||||
[SETUP_ANNOTATIONS] = _TAIL_CALL_record_previous_inst,
|
||||
[SET_ADD] = _TAIL_CALL_record_previous_inst,
|
||||
[SET_FUNCTION_ATTRIBUTE] = _TAIL_CALL_record_previous_inst,
|
||||
[SET_UPDATE] = _TAIL_CALL_record_previous_inst,
|
||||
[STORE_ATTR] = _TAIL_CALL_record_previous_inst,
|
||||
[STORE_ATTR_INSTANCE_VALUE] = _TAIL_CALL_record_previous_inst,
|
||||
[STORE_ATTR_SLOT] = _TAIL_CALL_record_previous_inst,
|
||||
[STORE_ATTR_WITH_HINT] = _TAIL_CALL_record_previous_inst,
|
||||
[STORE_DEREF] = _TAIL_CALL_record_previous_inst,
|
||||
[STORE_FAST] = _TAIL_CALL_record_previous_inst,
|
||||
[STORE_FAST_LOAD_FAST] = _TAIL_CALL_record_previous_inst,
|
||||
[STORE_FAST_STORE_FAST] = _TAIL_CALL_record_previous_inst,
|
||||
[STORE_GLOBAL] = _TAIL_CALL_record_previous_inst,
|
||||
[STORE_NAME] = _TAIL_CALL_record_previous_inst,
|
||||
[STORE_SLICE] = _TAIL_CALL_record_previous_inst,
|
||||
[STORE_SUBSCR] = _TAIL_CALL_record_previous_inst,
|
||||
[STORE_SUBSCR_DICT] = _TAIL_CALL_record_previous_inst,
|
||||
[STORE_SUBSCR_LIST_INT] = _TAIL_CALL_record_previous_inst,
|
||||
[SWAP] = _TAIL_CALL_record_previous_inst,
|
||||
[TO_BOOL] = _TAIL_CALL_record_previous_inst,
|
||||
[TO_BOOL_ALWAYS_TRUE] = _TAIL_CALL_record_previous_inst,
|
||||
[TO_BOOL_BOOL] = _TAIL_CALL_record_previous_inst,
|
||||
[TO_BOOL_INT] = _TAIL_CALL_record_previous_inst,
|
||||
[TO_BOOL_LIST] = _TAIL_CALL_record_previous_inst,
|
||||
[TO_BOOL_NONE] = _TAIL_CALL_record_previous_inst,
|
||||
[TO_BOOL_STR] = _TAIL_CALL_record_previous_inst,
|
||||
[UNARY_INVERT] = _TAIL_CALL_record_previous_inst,
|
||||
[UNARY_NEGATIVE] = _TAIL_CALL_record_previous_inst,
|
||||
[UNARY_NOT] = _TAIL_CALL_record_previous_inst,
|
||||
[UNPACK_EX] = _TAIL_CALL_record_previous_inst,
|
||||
[UNPACK_SEQUENCE] = _TAIL_CALL_record_previous_inst,
|
||||
[UNPACK_SEQUENCE_LIST] = _TAIL_CALL_record_previous_inst,
|
||||
[UNPACK_SEQUENCE_TUPLE] = _TAIL_CALL_record_previous_inst,
|
||||
[UNPACK_SEQUENCE_TWO_TUPLE] = _TAIL_CALL_record_previous_inst,
|
||||
[WITH_EXCEPT_START] = _TAIL_CALL_record_previous_inst,
|
||||
[YIELD_VALUE] = _TAIL_CALL_record_previous_inst,
|
||||
[121] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[122] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[123] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[124] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[125] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[126] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[127] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[210] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[211] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[212] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[213] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[214] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[215] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[216] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[217] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[218] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[219] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[220] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[221] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[222] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[223] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[224] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[225] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[226] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[227] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[228] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[229] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[230] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[231] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[232] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
[233] = _TAIL_CALL_UNKNOWN_OPCODE,
|
||||
};
|
||||
#endif /* _Py_TAIL_CALL_INTERP */
|
||||
|
||||
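Every real opcode in the generated table above routes to `_TAIL_CALL_record_previous_inst`, and unused opcode slots fall back to `_TAIL_CALL_UNKNOWN_OPCODE`. A minimal standalone sketch of this two-table shape follows; the names here (`plain_table`, `tracing_table`, `record_uop`, `plain_handler`) are hypothetical stand-ins for the generated handlers, not CPython's actual symbols:

```c
#include <stdio.h>

typedef struct Inst Inst;
typedef void (*Handler)(const Inst *);

struct Inst { int opcode; };

static Handler plain_table[256];
static Handler tracing_table[256];

/* Ordinary execution handler: stands in for a generated opcode body. */
static void plain_handler(const Inst *ip) {
    printf("execute opcode %d\n", ip->opcode);
}

/* Stand-in for appending the just-executed instruction to the trace buffer. */
static void record_uop(const Inst *ip) {
    printf("record opcode %d\n", ip->opcode);
}

/* The tracing table's counterpart of every handler: record the previous
 * instruction, then dispatch to the plain handler for the same opcode
 * (a tail call in the real interpreter, which keeps tracing cheap). */
static void record_previous_inst(const Inst *ip) {
    record_uop(ip);
    plain_table[ip->opcode](ip);
}

int main(void) {
    for (int i = 0; i < 256; i++) {
        plain_table[i] = plain_handler;
        tracing_table[i] = record_previous_inst;   /* every opcode records */
    }
    Inst op = { 42 };
    tracing_table[op.opcode](&op);   /* records, then executes */
    return 0;
}
```

Switching between recording and normal execution is then just a matter of which table the dispatch loop indexes; the real tables are generated and dispatch to each other rather than through ordinary calls.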
Python/optimizer.c (1063 changed lines)
@@ -142,8 +142,10 @@ incorrect_keys(PyObject *obj, uint32_t version)
 #define STACK_LEVEL() ((int)(stack_pointer - ctx->frame->stack))
 #define STACK_SIZE() ((int)(ctx->frame->stack_len))
 
+#define CURRENT_FRAME_IS_INIT_SHIM() (ctx->frame->code == ((PyCodeObject *)&_Py_InitCleanup))
+
 #define WITHIN_STACK_BOUNDS() \
-    (STACK_LEVEL() >= 0 && STACK_LEVEL() <= STACK_SIZE())
+    (CURRENT_FRAME_IS_INIT_SHIM() || (STACK_LEVEL() >= 0 && STACK_LEVEL() <= STACK_SIZE()))
 
 
 #define GETLOCAL(idx) ((ctx->frame->locals[idx]))
@@ -267,7 +269,7 @@ static
 PyCodeObject *
 get_current_code_object(JitOptContext *ctx)
 {
-    return (PyCodeObject *)ctx->frame->func->func_code;
+    return (PyCodeObject *)ctx->frame->code;
 }
 
 static PyObject *
@@ -298,10 +300,6 @@ optimize_uops(
     JitOptContext context;
     JitOptContext *ctx = &context;
     uint32_t opcode = UINT16_MAX;
-    int curr_space = 0;
-    int max_space = 0;
-    _PyUOpInstruction *first_valid_check_stack = NULL;
-    _PyUOpInstruction *corresponding_check_stack = NULL;
 
     // Make sure that watchers are set up
     PyInterpreterState *interp = _PyInterpreterState_GET();
@@ -320,13 +318,18 @@ optimize_uops(
     ctx->frame = frame;
 
     _PyUOpInstruction *this_instr = NULL;
-    JitOptRef *stack_pointer = ctx->frame->stack_pointer;
 
     for (int i = 0; !ctx->done; i++) {
         assert(i < trace_len);
         this_instr = &trace[i];
         int oparg = this_instr->oparg;
         opcode = this_instr->opcode;
+        JitOptRef *stack_pointer = ctx->frame->stack_pointer;
+
+        if (!CURRENT_FRAME_IS_INIT_SHIM()) {
+            stack_pointer = ctx->frame->stack_pointer;
+        }
+
 
 #ifdef Py_DEBUG
         if (get_lltrace() >= 3) {
@@ -345,9 +348,11 @@ optimize_uops(
             Py_UNREACHABLE();
         }
         assert(ctx->frame != NULL);
-        DPRINTF(3, " stack_level %d\n", STACK_LEVEL());
-        ctx->frame->stack_pointer = stack_pointer;
-        assert(STACK_LEVEL() >= 0);
+        if (!CURRENT_FRAME_IS_INIT_SHIM()) {
+            DPRINTF(3, " stack_level %d\n", STACK_LEVEL());
+            ctx->frame->stack_pointer = stack_pointer;
+            assert(STACK_LEVEL() >= 0);
+        }
     }
     if (ctx->out_of_space) {
         DPRINTF(3, "\n");
@@ -355,27 +360,21 @@ optimize_uops(
     }
     if (ctx->contradiction) {
         // Attempted to push a "bottom" (contradiction) symbol onto the stack.
-        // This means that the abstract interpreter has hit unreachable code.
+        // This means that the abstract interpreter has optimized the trace
+        // to an unreachable state.
         // We *could* generate an _EXIT_TRACE or _FATAL_ERROR here, but hitting
-        // bottom indicates type instability, so we are probably better off
+        // bottom usually indicates an optimizer bug, so we are probably better off
         // retrying later.
         DPRINTF(3, "\n");
         DPRINTF(1, "Hit bottom in abstract interpreter\n");
         _Py_uop_abstractcontext_fini(ctx);
+        OPT_STAT_INC(optimizer_contradiction);
         return 0;
     }
 
     /* Either reached the end or cannot optimize further, but there
      * would be no benefit in retrying later */
     _Py_uop_abstractcontext_fini(ctx);
-    if (first_valid_check_stack != NULL) {
-        assert(first_valid_check_stack->opcode == _CHECK_STACK_SPACE);
-        assert(max_space > 0);
-        assert(max_space <= INT_MAX);
-        assert(max_space <= INT32_MAX);
-        first_valid_check_stack->opcode = _CHECK_STACK_SPACE_OPERAND;
-        first_valid_check_stack->operand0 = max_space;
-    }
     return trace_len;
 
 error:
@@ -460,6 +459,7 @@ remove_unneeded_uops(_PyUOpInstruction *buffer, int buffer_size)
                 buffer[pc].opcode = _NOP;
             }
             break;
+        case _EXIT_TRACE:
         default:
         {
             // Cancel out pushes and pops, repeatedly. So:
@@ -493,7 +493,7 @@ remove_unneeded_uops(_PyUOpInstruction *buffer, int buffer_size)
             }
             /* _PUSH_FRAME doesn't escape or error, but it
              * does need the IP for the return address */
-            bool needs_ip = opcode == _PUSH_FRAME;
+            bool needs_ip = (opcode == _PUSH_FRAME || opcode == _YIELD_VALUE || opcode == _DYNAMIC_EXIT || opcode == _EXIT_TRACE);
            if (_PyUop_Flags[opcode] & HAS_ESCAPES_FLAG) {
                needs_ip = true;
                may_have_escaped = true;
@@ -503,10 +503,14 @@ remove_unneeded_uops(_PyUOpInstruction *buffer, int buffer_size)
                buffer[last_set_ip].opcode = _SET_IP;
                last_set_ip = -1;
            }
+            if (opcode == _EXIT_TRACE) {
+                return pc + 1;
+            }
            break;
        }
        case _JUMP_TO_TOP:
-        case _EXIT_TRACE:
+        case _DYNAMIC_EXIT:
+        case _DEOPT:
            return pc + 1;
        }
    }
@@ -518,7 +522,7 @@ remove_unneeded_uops(_PyUOpInstruction *buffer, int buffer_size)
 // > 0 - length of optimized trace
 int
 _Py_uop_analyze_and_optimize(
-    _PyInterpreterFrame *frame,
+    PyFunctionObject *func,
     _PyUOpInstruction *buffer,
     int length,
     int curr_stacklen,
@@ -528,8 +532,8 @@ _Py_uop_analyze_and_optimize(
     OPT_STAT_INC(optimizer_attempts);
 
     length = optimize_uops(
-        _PyFrame_GetFunction(frame), buffer,
-        length, curr_stacklen, dependencies);
+        func, buffer,
+        length, curr_stacklen, dependencies);
 
     if (length == 0) {
         return length;
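The `remove_unneeded_uops` hunks above widen `needs_ip` (frame pushes and every trace exit now require an up-to-date instruction pointer) and fold `_EXIT_TRACE` into the default scan. A simplified, self-contained sketch of the `_SET_IP` elimination idea under assumed enum names; the real pass keeps similar `last_set_ip` bookkeeping but NOPs eagerly and re-materializes on demand, so treat this as an illustration, not the actual implementation:

```c
#include <stdio.h>

enum { U_SET_IP, U_LOAD, U_PUSH_FRAME, U_EXIT_TRACE, U_NOP };

typedef struct { int opcode; } UOp;

/* After this PR, frame pushes and every trace exit need the IP. */
static int needs_ip(int opcode) {
    return opcode == U_PUSH_FRAME || opcode == U_EXIT_TRACE;
}

/* NOP out every _SET_IP that no later uop depends on. */
static void eliminate_set_ip(UOp *buf, int n) {
    int last_set_ip = -1;   /* index of a _SET_IP not yet proven necessary */
    for (int pc = 0; pc < n; pc++) {
        if (buf[pc].opcode == U_SET_IP) {
            if (last_set_ip >= 0) {
                buf[last_set_ip].opcode = U_NOP;   /* superseded, unused */
            }
            last_set_ip = pc;
        }
        else if (needs_ip(buf[pc].opcode)) {
            last_set_ip = -1;                      /* this _SET_IP must stay */
        }
    }
    if (last_set_ip >= 0) {
        buf[last_set_ip].opcode = U_NOP;           /* trailing, unused */
    }
}

int main(void) {
    UOp buf[] = { {U_SET_IP}, {U_LOAD}, {U_SET_IP}, {U_PUSH_FRAME}, {U_SET_IP} };
    eliminate_set_ip(buf, 5);
    for (int i = 0; i < 5; i++) {
        printf("%d ", buf[i].opcode);   /* first and last _SET_IP become NOPs */
    }
    printf("\n");
    return 0;
}
```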
Python/optimizer_bytecodes.c

@@ -342,7 +342,6 @@ dummy_func(void) {
             int already_bool = optimize_to_bool(this_instr, ctx, value, &value);
             if (!already_bool) {
                 sym_set_type(value, &PyBool_Type);
-                value = sym_new_truthiness(ctx, value, true);
             }
         }
 
@@ -752,8 +751,14 @@ dummy_func(void) {
     }
 
     op(_PY_FRAME_KW, (callable, self_or_null, args[oparg], kwnames -- new_frame)) {
-        new_frame = PyJitRef_NULL;
-        ctx->done = true;
+        assert((this_instr + 2)->opcode == _PUSH_FRAME);
+        PyCodeObject *co = get_code_with_logging((this_instr + 2));
+        if (co == NULL) {
+            ctx->done = true;
+            break;
+        }
+
+        new_frame = PyJitRef_Wrap((JitOptSymbol *)frame_new(ctx, co, 0, NULL, 0));
     }
 
     op(_CHECK_AND_ALLOCATE_OBJECT, (type_version/2, callable, self_or_null, args[oparg] -- callable, self_or_null, args[oparg])) {
@@ -764,8 +769,20 @@ dummy_func(void) {
     }
 
     op(_CREATE_INIT_FRAME, (init, self, args[oparg] -- init_frame)) {
-        init_frame = PyJitRef_NULL;
-        ctx->done = true;
+        ctx->frame->stack_pointer = stack_pointer - oparg - 2;
+        _Py_UOpsAbstractFrame *shim = frame_new(ctx, (PyCodeObject *)&_Py_InitCleanup, 0, NULL, 0);
+        if (shim == NULL) {
+            break;
+        }
+        /* Push self onto stack of shim */
+        shim->stack[0] = self;
+        shim->stack_pointer++;
+        assert((int)(shim->stack_pointer - shim->stack) == 1);
+        ctx->frame = shim;
+        ctx->curr_frame_depth++;
+        assert((this_instr + 1)->opcode == _PUSH_FRAME);
+        PyCodeObject *co = get_code_with_logging((this_instr + 1));
+        init_frame = PyJitRef_Wrap((JitOptSymbol *)frame_new(ctx, co, 0, args-1, oparg+1));
     }
 
     op(_RETURN_VALUE, (retval -- res)) {
@@ -773,42 +790,65 @@ dummy_func(void) {
         JitOptRef temp = PyJitRef_StripReferenceInfo(retval);
         DEAD(retval);
         SAVE_STACK();
-        PyCodeObject *co = get_current_code_object(ctx);
         ctx->frame->stack_pointer = stack_pointer;
-        frame_pop(ctx);
+        PyCodeObject *returning_code = get_code_with_logging(this_instr);
+        if (returning_code == NULL) {
+            ctx->done = true;
+            break;
+        }
+        int returning_stacklevel = this_instr->operand1;
+        if (ctx->curr_frame_depth >= 2) {
+            PyCodeObject *expected_code = ctx->frames[ctx->curr_frame_depth - 2].code;
+            if (expected_code == returning_code) {
+                assert((this_instr + 1)->opcode == _GUARD_IP_RETURN_VALUE);
+                REPLACE_OP((this_instr + 1), _NOP, 0, 0);
+            }
+        }
+        if (frame_pop(ctx, returning_code, returning_stacklevel)) {
+            break;
+        }
         stack_pointer = ctx->frame->stack_pointer;
 
-        /* Stack space handling */
-        assert(corresponding_check_stack == NULL);
-        assert(co != NULL);
-        int framesize = co->co_framesize;
-        assert(framesize > 0);
-        assert(framesize <= curr_space);
-        curr_space -= framesize;
-
         RELOAD_STACK();
         res = temp;
     }
 
     op(_RETURN_GENERATOR, ( -- res)) {
         SYNC_SP();
-        PyCodeObject *co = get_current_code_object(ctx);
         ctx->frame->stack_pointer = stack_pointer;
-        frame_pop(ctx);
+        PyCodeObject *returning_code = get_code_with_logging(this_instr);
+        if (returning_code == NULL) {
+            ctx->done = true;
+            break;
+        }
+        _Py_BloomFilter_Add(dependencies, returning_code);
+        int returning_stacklevel = this_instr->operand1;
+        if (frame_pop(ctx, returning_code, returning_stacklevel)) {
+            break;
+        }
         stack_pointer = ctx->frame->stack_pointer;
         res = sym_new_unknown(ctx);
-
-        /* Stack space handling */
-        assert(corresponding_check_stack == NULL);
-        assert(co != NULL);
-        int framesize = co->co_framesize;
-        assert(framesize > 0);
-        assert(framesize <= curr_space);
-        curr_space -= framesize;
     }
 
-    op(_YIELD_VALUE, (unused -- value)) {
-        value = sym_new_unknown(ctx);
+    op(_YIELD_VALUE, (retval -- value)) {
+        // Mimics PyStackRef_MakeHeapSafe in the interpreter.
+        JitOptRef temp = PyJitRef_StripReferenceInfo(retval);
+        DEAD(retval);
+        SAVE_STACK();
+        ctx->frame->stack_pointer = stack_pointer;
+        PyCodeObject *returning_code = get_code_with_logging(this_instr);
+        if (returning_code == NULL) {
+            ctx->done = true;
+            break;
+        }
+        _Py_BloomFilter_Add(dependencies, returning_code);
+        int returning_stacklevel = this_instr->operand1;
+        if (frame_pop(ctx, returning_code, returning_stacklevel)) {
+            break;
+        }
+        stack_pointer = ctx->frame->stack_pointer;
+        RELOAD_STACK();
+        value = temp;
     }
 
     op(_GET_ITER, (iterable -- iter, index_or_null)) {
@@ -835,8 +875,6 @@ dummy_func(void) {
     }
 
     op(_CHECK_STACK_SPACE, (unused, unused, unused[oparg] -- unused, unused, unused[oparg])) {
-        assert(corresponding_check_stack == NULL);
-        corresponding_check_stack = this_instr;
     }
 
     op (_CHECK_STACK_SPACE_OPERAND, (framesize/2 -- )) {
@@ -848,38 +886,29 @@ dummy_func(void) {
 
     op(_PUSH_FRAME, (new_frame -- )) {
         SYNC_SP();
-        ctx->frame->stack_pointer = stack_pointer;
+        if (!CURRENT_FRAME_IS_INIT_SHIM()) {
+            ctx->frame->stack_pointer = stack_pointer;
+        }
         ctx->frame = (_Py_UOpsAbstractFrame *)PyJitRef_Unwrap(new_frame);
         ctx->curr_frame_depth++;
         stack_pointer = ctx->frame->stack_pointer;
         uint64_t operand = this_instr->operand0;
-        if (operand == 0 || (operand & 1)) {
-            // It's either a code object or NULL
-            if (operand == 0) {
-                ctx->done = true;
-                break;
-            }
-            PyFunctionObject *func = (PyFunctionObject *)operand;
-            PyCodeObject *co = (PyCodeObject *)func->func_code;
-            assert(PyFunction_Check(func));
-            ctx->frame->func = func;
-            /* Stack space handling */
-            int framesize = co->co_framesize;
-            assert(framesize > 0);
-            curr_space += framesize;
-            if (curr_space < 0 || curr_space > INT32_MAX) {
-                // won't fit in signed 32-bit int
-                ctx->done = true;
-                break;
-            }
+        if (!(operand & 1)) {
+            PyFunctionObject *func = (PyFunctionObject *)operand;
+            // No need to re-add to dependencies here. Already
+            // handled by the tracer.
+            ctx->frame->func = func;
         }
-        max_space = curr_space > max_space ? curr_space : max_space;
-        if (first_valid_check_stack == NULL) {
-            first_valid_check_stack = corresponding_check_stack;
+        // Fixed calls don't need IP guards.
+        if ((this_instr-1)->opcode == _SAVE_RETURN_OFFSET ||
+            (this_instr-1)->opcode == _CREATE_INIT_FRAME) {
+            assert((this_instr+1)->opcode == _GUARD_IP__PUSH_FRAME);
+            REPLACE_OP(this_instr+1, _NOP, 0, 0);
        }
-        else if (corresponding_check_stack) {
-            // delete all but the first valid _CHECK_STACK_SPACE
-            corresponding_check_stack->opcode = _NOP;
-        }
-        corresponding_check_stack = NULL;
     }
 
     op(_UNPACK_SEQUENCE, (seq -- values[oparg], top[0])) {
@@ -1024,6 +1053,10 @@ dummy_func(void) {
         ctx->done = true;
     }
 
+    op(_DEOPT, (--)) {
+        ctx->done = true;
+    }
+
     op(_REPLACE_WITH_TRUE, (value -- res)) {
         REPLACE_OP(this_instr, _POP_TOP_LOAD_CONST_INLINE_BORROW, 0, (uintptr_t)Py_True);
         res = sym_new_const(ctx, Py_True);
Python/optimizer_cases.c.h (generated, 153 changed lines)
@@ -280,7 +280,6 @@
             int already_bool = optimize_to_bool(this_instr, ctx, value, &value);
             if (!already_bool) {
                 sym_set_type(value, &PyBool_Type);
-                value = sym_new_truthiness(ctx, value, true);
             }
             stack_pointer[-1] = value;
             break;
@@ -1116,16 +1115,24 @@
             JitOptRef temp = PyJitRef_StripReferenceInfo(retval);
             stack_pointer += -1;
             assert(WITHIN_STACK_BOUNDS());
-            PyCodeObject *co = get_current_code_object(ctx);
             ctx->frame->stack_pointer = stack_pointer;
-            frame_pop(ctx);
+            PyCodeObject *returning_code = get_code_with_logging(this_instr);
+            if (returning_code == NULL) {
+                ctx->done = true;
+                break;
+            }
+            int returning_stacklevel = this_instr->operand1;
+            if (ctx->curr_frame_depth >= 2) {
+                PyCodeObject *expected_code = ctx->frames[ctx->curr_frame_depth - 2].code;
+                if (expected_code == returning_code) {
+                    assert((this_instr + 1)->opcode == _GUARD_IP_RETURN_VALUE);
+                    REPLACE_OP((this_instr + 1), _NOP, 0, 0);
+                }
+            }
+            if (frame_pop(ctx, returning_code, returning_stacklevel)) {
+                break;
+            }
             stack_pointer = ctx->frame->stack_pointer;
-            assert(corresponding_check_stack == NULL);
-            assert(co != NULL);
-            int framesize = co->co_framesize;
-            assert(framesize > 0);
-            assert(framesize <= curr_space);
-            curr_space -= framesize;
             res = temp;
             stack_pointer[0] = res;
             stack_pointer += 1;
@@ -1167,9 +1174,28 @@
         }
 
         case _YIELD_VALUE: {
+            JitOptRef retval;
             JitOptRef value;
-            value = sym_new_unknown(ctx);
-            stack_pointer[-1] = value;
+            retval = stack_pointer[-1];
+            JitOptRef temp = PyJitRef_StripReferenceInfo(retval);
+            stack_pointer += -1;
+            assert(WITHIN_STACK_BOUNDS());
+            ctx->frame->stack_pointer = stack_pointer;
+            PyCodeObject *returning_code = get_code_with_logging(this_instr);
+            if (returning_code == NULL) {
+                ctx->done = true;
+                break;
+            }
+            _Py_BloomFilter_Add(dependencies, returning_code);
+            int returning_stacklevel = this_instr->operand1;
+            if (frame_pop(ctx, returning_code, returning_stacklevel)) {
+                break;
+            }
+            stack_pointer = ctx->frame->stack_pointer;
+            value = temp;
+            stack_pointer[0] = value;
+            stack_pointer += 1;
+            assert(WITHIN_STACK_BOUNDS());
             break;
         }
 
@@ -2103,6 +2129,8 @@
             break;
         }
 
+        /* _JUMP_BACKWARD_NO_INTERRUPT is not a viable micro-op for tier 2 */
+
         case _GET_LEN: {
             JitOptRef obj;
             JitOptRef len;
@@ -2557,8 +2585,6 @@
         }
 
         case _CHECK_STACK_SPACE: {
-            assert(corresponding_check_stack == NULL);
-            corresponding_check_stack = this_instr;
             break;
         }
 
@@ -2601,34 +2627,26 @@
             new_frame = stack_pointer[-1];
             stack_pointer += -1;
             assert(WITHIN_STACK_BOUNDS());
-            ctx->frame->stack_pointer = stack_pointer;
+            if (!CURRENT_FRAME_IS_INIT_SHIM()) {
+                ctx->frame->stack_pointer = stack_pointer;
+            }
             ctx->frame = (_Py_UOpsAbstractFrame *)PyJitRef_Unwrap(new_frame);
             ctx->curr_frame_depth++;
             stack_pointer = ctx->frame->stack_pointer;
             uint64_t operand = this_instr->operand0;
-            if (operand == 0 || (operand & 1)) {
-                if (operand == 0) {
-                    ctx->done = true;
-                    break;
-                }
-                PyFunctionObject *func = (PyFunctionObject *)operand;
-                PyCodeObject *co = (PyCodeObject *)func->func_code;
-                assert(PyFunction_Check(func));
-                ctx->frame->func = func;
-                int framesize = co->co_framesize;
-                assert(framesize > 0);
-                curr_space += framesize;
-                if (curr_space < 0 || curr_space > INT32_MAX) {
-                    ctx->done = true;
-                    break;
-                }
+            if (!(operand & 1)) {
+                PyFunctionObject *func = (PyFunctionObject *)operand;
+                ctx->frame->func = func;
             }
-            max_space = curr_space > max_space ? curr_space : max_space;
-            if (first_valid_check_stack == NULL) {
-                first_valid_check_stack = corresponding_check_stack;
+            if ((this_instr-1)->opcode == _SAVE_RETURN_OFFSET ||
+                (this_instr-1)->opcode == _CREATE_INIT_FRAME) {
+                assert((this_instr+1)->opcode == _GUARD_IP__PUSH_FRAME);
+                REPLACE_OP(this_instr+1, _NOP, 0, 0);
            }
-            else if (corresponding_check_stack) {
-                corresponding_check_stack->opcode = _NOP;
-            }
-            corresponding_check_stack = NULL;
             break;
         }
 
@@ -2761,9 +2779,24 @@
         }
 
         case _CREATE_INIT_FRAME: {
+            JitOptRef *args;
+            JitOptRef self;
             JitOptRef init_frame;
-            init_frame = PyJitRef_NULL;
-            ctx->done = true;
+            args = &stack_pointer[-oparg];
+            self = stack_pointer[-1 - oparg];
+            ctx->frame->stack_pointer = stack_pointer - oparg - 2;
+            _Py_UOpsAbstractFrame *shim = frame_new(ctx, (PyCodeObject *)&_Py_InitCleanup, 0, NULL, 0);
+            if (shim == NULL) {
+                break;
+            }
+            shim->stack[0] = self;
+            shim->stack_pointer++;
+            assert((int)(shim->stack_pointer - shim->stack) == 1);
+            ctx->frame = shim;
+            ctx->curr_frame_depth++;
+            assert((this_instr + 1)->opcode == _PUSH_FRAME);
+            PyCodeObject *co = get_code_with_logging((this_instr + 1));
+            init_frame = PyJitRef_Wrap((JitOptSymbol *)frame_new(ctx, co, 0, args-1, oparg+1));
             stack_pointer[-2 - oparg] = init_frame;
             stack_pointer += -1 - oparg;
             assert(WITHIN_STACK_BOUNDS());
@@ -2948,8 +2981,13 @@
 
         case _PY_FRAME_KW: {
             JitOptRef new_frame;
-            new_frame = PyJitRef_NULL;
-            ctx->done = true;
+            assert((this_instr + 2)->opcode == _PUSH_FRAME);
+            PyCodeObject *co = get_code_with_logging((this_instr + 2));
+            if (co == NULL) {
+                ctx->done = true;
+                break;
+            }
+            new_frame = PyJitRef_Wrap((JitOptSymbol *)frame_new(ctx, co, 0, NULL, 0));
             stack_pointer[-3 - oparg] = new_frame;
             stack_pointer += -2 - oparg;
             assert(WITHIN_STACK_BOUNDS());
@@ -3005,17 +3043,19 @@
 
         case _RETURN_GENERATOR: {
             JitOptRef res;
-            PyCodeObject *co = get_current_code_object(ctx);
             ctx->frame->stack_pointer = stack_pointer;
-            frame_pop(ctx);
+            PyCodeObject *returning_code = get_code_with_logging(this_instr);
+            if (returning_code == NULL) {
+                ctx->done = true;
+                break;
+            }
+            _Py_BloomFilter_Add(dependencies, returning_code);
+            int returning_stacklevel = this_instr->operand1;
+            if (frame_pop(ctx, returning_code, returning_stacklevel)) {
+                break;
+            }
             stack_pointer = ctx->frame->stack_pointer;
             res = sym_new_unknown(ctx);
-            assert(corresponding_check_stack == NULL);
-            assert(co != NULL);
-            int framesize = co->co_framesize;
-            assert(framesize > 0);
-            assert(framesize <= curr_space);
-            curr_space -= framesize;
             stack_pointer[0] = res;
             stack_pointer += 1;
             assert(WITHIN_STACK_BOUNDS());
@@ -3265,6 +3305,10 @@
             break;
         }
 
+        case _DYNAMIC_EXIT: {
+            break;
+        }
+
         case _CHECK_VALIDITY: {
             break;
         }
@@ -3399,6 +3443,7 @@
         }
 
         case _DEOPT: {
+            ctx->done = true;
             break;
         }
 
@@ -3418,3 +3463,23 @@
             break;
         }
 
+        case _COLD_DYNAMIC_EXIT: {
+            break;
+        }
+
+        case _GUARD_IP__PUSH_FRAME: {
+            break;
+        }
+
+        case _GUARD_IP_YIELD_VALUE: {
+            break;
+        }
+
+        case _GUARD_IP_RETURN_VALUE: {
+            break;
+        }
+
+        case _GUARD_IP_RETURN_GENERATOR: {
+            break;
+        }
Python/optimizer_symbols.c

@@ -817,9 +817,14 @@ _Py_uop_frame_new(
     JitOptRef *args,
     int arg_len)
 {
-    assert(ctx->curr_frame_depth < MAX_ABSTRACT_FRAME_DEPTH);
+    if (ctx->curr_frame_depth >= MAX_ABSTRACT_FRAME_DEPTH) {
+        ctx->done = true;
+        ctx->out_of_space = true;
+        OPT_STAT_INC(optimizer_frame_overflow);
+        return NULL;
+    }
     _Py_UOpsAbstractFrame *frame = &ctx->frames[ctx->curr_frame_depth];
 
+    frame->code = co;
     frame->stack_len = co->co_stacksize;
     frame->locals_len = co->co_nlocalsplus;
 
@@ -901,13 +906,42 @@ _Py_uop_abstractcontext_init(JitOptContext *ctx)
 }
 
 int
-_Py_uop_frame_pop(JitOptContext *ctx)
+_Py_uop_frame_pop(JitOptContext *ctx, PyCodeObject *co, int curr_stackentries)
 {
     _Py_UOpsAbstractFrame *frame = ctx->frame;
     ctx->n_consumed = frame->locals;
-
     ctx->curr_frame_depth--;
-    assert(ctx->curr_frame_depth >= 1);
-    ctx->frame = &ctx->frames[ctx->curr_frame_depth - 1];
 
+    if (ctx->curr_frame_depth >= 1) {
+        ctx->frame = &ctx->frames[ctx->curr_frame_depth - 1];
+
+        // We returned to the correct code. Nothing to do here.
+        if (co == ctx->frame->code) {
+            return 0;
+        }
+        // Else: the code we recorded doesn't match the code we *think* we're
+        // returning to. We could trace anything, we can't just return to the
+        // old frame. We have to restore what the tracer recorded
+        // as the traced next frame.
+        // Remove the current frame, and later swap it out with the right one.
+        else {
+            ctx->curr_frame_depth--;
+        }
+    }
+    // Else: trace stack underflow.
+
+    // This handles swapping out frames.
+    assert(curr_stackentries >= 1);
+    // -1 to stackentries as we push to the stack our return value after this.
+    _Py_UOpsAbstractFrame *new_frame = _Py_uop_frame_new(ctx, co, curr_stackentries - 1, NULL, 0);
+    if (new_frame == NULL) {
+        ctx->done = true;
+        return 1;
+    }
+
+    ctx->curr_frame_depth++;
+    ctx->frame = new_frame;
+
     return 0;
 }
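The reworked `_Py_uop_frame_pop` above no longer assumes the abstract frame stack matches reality: it compares the code object the tracer recorded at the return site against the frame it thinks it is returning to, and swaps in a fresh frame on mismatch or underflow. A toy model of that decision, with illustrative names only (strings stand in for code objects):

```c
#include <stdio.h>
#include <string.h>

#define MAX_DEPTH 8

typedef struct { const char *code; } Frame;

typedef struct {
    Frame frames[MAX_DEPTH];
    int depth;   /* number of live abstract frames */
} Ctx;

/* Returns 0 on success, 1 when the abstract stack cannot be rebuilt. */
static int frame_pop(Ctx *ctx, const char *recorded_code) {
    ctx->depth--;                              /* pop the returning frame */
    if (ctx->depth >= 1) {
        Frame *caller = &ctx->frames[ctx->depth - 1];
        if (strcmp(caller->code, recorded_code) == 0) {
            return 0;                          /* returned where we expected */
        }
        ctx->depth--;                          /* mismatch: drop the stale frame */
    }
    /* Underflow or mismatch: swap in a frame for the code the tracer saw. */
    if (ctx->depth >= MAX_DEPTH) {
        return 1;                              /* no room; caller aborts */
    }
    ctx->frames[ctx->depth++] = (Frame){ recorded_code };
    return 0;
}

int main(void) {
    Ctx ctx = { .frames = {{"outer"}, {"inner"}}, .depth = 2 };
    int err = frame_pop(&ctx, "outer");        /* matches: back to "outer" */
    printf("err=%d top=%s\n", err, ctx.frames[ctx.depth - 1].code);
    return 0;
}
```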
Python/pystate.c

@@ -552,10 +552,6 @@ init_interpreter(PyInterpreterState *interp,
     _Py_brc_init_state(interp);
 #endif
 
-#ifdef _Py_TIER2
-    // Ensure the buffer is to be set as NULL.
-    interp->jit_uop_buffer = NULL;
-#endif
     llist_init(&interp->mem_free_queue.head);
     llist_init(&interp->asyncio_tasks_head);
     interp->asyncio_tasks_lock = (PyMutex){0};
@@ -805,10 +801,6 @@ interpreter_clear(PyInterpreterState *interp, PyThreadState *tstate)
 
 #ifdef _Py_TIER2
     _Py_ClearExecutorDeletionList(interp);
-    if (interp->jit_uop_buffer != NULL) {
-        _PyObject_VirtualFree(interp->jit_uop_buffer, UOP_BUFFER_SIZE);
-        interp->jit_uop_buffer = NULL;
-    }
 #endif
     _PyAST_Fini(interp);
     _PyAtExit_Fini(interp);
@@ -831,6 +823,14 @@ interpreter_clear(PyInterpreterState *interp, PyThreadState *tstate)
         assert(cold->vm_data.warm);
         _PyExecutor_Free(cold);
     }
+
+    struct _PyExecutorObject *cold_dynamic = interp->cold_dynamic_executor;
+    if (cold_dynamic != NULL) {
+        interp->cold_dynamic_executor = NULL;
+        assert(cold_dynamic->vm_data.valid);
+        assert(cold_dynamic->vm_data.warm);
+        _PyExecutor_Free(cold_dynamic);
+    }
     /* We don't clear sysdict and builtins until the end of this function.
        Because clearing other attributes can execute arbitrary Python code
        which requires sysdict and builtins. */
@@ -1501,6 +1501,9 @@ init_threadstate(_PyThreadStateImpl *_tstate,
     _tstate->asyncio_running_loop = NULL;
     _tstate->asyncio_running_task = NULL;
 
+#ifdef _Py_TIER2
+    _tstate->jit_tracer_state.code_buffer = NULL;
+#endif
     tstate->delete_later = NULL;
 
     llist_init(&_tstate->mem_free_queue);
@@ -1807,6 +1810,14 @@ tstate_delete_common(PyThreadState *tstate, int release_gil)
     assert(tstate_impl->refcounts.values == NULL);
 #endif
 
+#if _Py_TIER2
+    _PyThreadStateImpl *_tstate = (_PyThreadStateImpl *)tstate;
+    if (_tstate->jit_tracer_state.code_buffer != NULL) {
+        _PyObject_VirtualFree(_tstate->jit_tracer_state.code_buffer, UOP_BUFFER_SIZE);
+        _tstate->jit_tracer_state.code_buffer = NULL;
+    }
+#endif
+
     HEAD_UNLOCK(runtime);
 
     // XXX Unbind in PyThreadState_Clear(), or earlier
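The pystate.c hunks above move the tracer's uop code buffer from the interpreter state to the thread state: NULLed in `init_threadstate`, freed in `tstate_delete_common`. A hedged sketch of that lifecycle; `malloc`/`free` stand in for CPython's virtual-memory allocator, and the lazy allocation in `tracer_get_buffer` is an assumption, since these hunks only show initialization and teardown:

```c
#include <stdlib.h>

/* Illustrative size only; the real UOP_BUFFER_SIZE lives in CPython. */
#define UOP_BUFFER_SIZE (1 << 16)

typedef struct {
    void *code_buffer;   /* mirrors _tstate->jit_tracer_state.code_buffer */
} TracerState;

/* init_threadstate: make teardown safe even if the thread never traces. */
static void tracer_init(TracerState *ts) {
    ts->code_buffer = NULL;
}

/* First use: allocate on demand (assumed; CPython may allocate elsewhere). */
static void *tracer_get_buffer(TracerState *ts) {
    if (ts->code_buffer == NULL) {
        ts->code_buffer = malloc(UOP_BUFFER_SIZE);
    }
    return ts->code_buffer;
}

/* tstate_delete_common: release and reset, as _PyObject_VirtualFree does. */
static void tracer_fini(TracerState *ts) {
    free(ts->code_buffer);
    ts->code_buffer = NULL;
}

int main(void) {
    TracerState ts;
    tracer_init(&ts);
    (void)tracer_get_buffer(&ts);
    tracer_fini(&ts);
    return 0;
}
```

Tying the buffer to the thread state rather than the interpreter keeps each tracing thread's recording independent, which matters under free-threading.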