Fix interpreter: pop/push/shift/unshift on @{...} expressions #284
Merged
Conversation
The interpreter compiler was not handling BlockNode operands for array
deref expressions like @{$h->{PATH}}. When compiling pop/push/shift/
unshift with such expressions, the BlockNode was visited without proper
context, causing lastResultReg to be -1.
Changes:
- CompileOperator.emitArrayOperandRegister: Accept BlockNode alongside
OperatorNode, and use compileNode() with SCALAR context instead of
raw accept()
- BytecodeCompiler.handlePushUnshift: Use compileNode() with SCALAR
context when evaluating array dereference operands
Generated with [Devin](https://cli.devin.ai/docs)
Co-Authored-By: Devin <noreply@cognition.ai>
Two interpreter parity fixes:
1. BytecodeInterpreter CALL_SUB: Resolve symbolic code references using
current package for STRING/BYTE_STRING types. This fixes cases like
\&$func() where $func contains a subroutine name string - the name
is now resolved in the current package instead of main.
2. CompileAssignment: Handle BlockNode operands for @{...} = ... just
like OperatorNode. This fixes assignment to array deref expressions
like @{$h->{list}} = (1, 2, 3).
- Add CODE_DEREF_NONSTRICT opcode (375) for &{$name} dynamic code refs
- Implement handler in SlowOpcodeHandler.executeCodeDerefNonStrict()
- Handle BlockNode for * (glob) dereference operator
- Fix tr/// to compile operands in SCALAR context
Several places in the bytecode compiler were using accept(this) instead of compileNode() with RuntimeContextType.SCALAR for hash keys and array indices. This caused a ClassCastException when constant functions or method calls were used as keys/indices, because they return RuntimeList while the opcodes expect RuntimeScalar.
Fixed locations:
- BytecodeCompiler: handleArrayElementAccess, handleGeneralArrayAccess, handleArrayKeyValueSlice, handleHashSlice, handleHashKeyValueSlice, handleArraySlice (added BlockNode support)
- CompileAssignment: hash key compilation in 3 places; added %$ref = ... hash dereference assignment support
- CompileExistsDelete: visitExistsArrow, visitDeleteArrow, visitDeleteHashSlice, compileHashKey, compileArrayIndex
- CompileOperator: compileArrayIndex
The issue: 'my $x = EXPR if COND' was transformed to 'COND && (my $x = EXPR)', which skipped the variable declaration entirely when the condition was false. This caused null pointer exceptions in the interpreter when later code tried to access $x.
The fix transforms 'my $x = EXPR if COND' to '(my $x, COND && ($x = EXPR))' using the comma operator. This ensures:
- The variable is always declared in the current scope (via 'my $x')
- The assignment only happens when the condition is true
This matches Perl behavior, where 'my $x = 1 if 0' declares $x as undef. The same fix applies to the 'unless' modifier.
The hash slice compiler was missing support for BlockNode operands,
which is needed for syntax like @{$$et{OPTIONS}}{qw(key1 key2)}.
This caused a 'Hash slice requires hash variable or reference'
compile error.
Added BlockNode handling to handleHashSlice() matching the existing
pattern in handleArraySlice().
Added BlockNode handling for $#{BLOCK} syntax in both:
- CompileOperator.java: for reading $#{...}
- CompileAssignment.java: for assigning $#{...} -= value
This fixes compile error for syntax like:
$#{$$dirInfo{VarFormatData}} -= 1 if $wasVar;
The commit 1d2dde2 incorrectly forced SCALAR context for hash slice keys like @hash{@Codes}, which prevented array keys from expanding. Revert hash slice key compilation to LIST context (matching the JVM backend), while keeping SCALAR context for single-element access. This fixes 4340 test regressions in pack.t.