[riscv] Restore temporarily modified PTE within 32-bit transition code

If the virtual address offset is precisely one page (i.e. each virtual
address maps to a physical address one page higher), and if the 32-bit
transition code happens to end up at the end of a page (which would
require an unrealistic 2MB of content in .prefix), then it would be
possible for the program counter to cross into the portion of the
virtual address space still borrowed for use as the temporary physical
map.

Avoid this remote possibility by moving the restoration of the
temporarily modified PTE within the transition code block (which is
guaranteed to remain within a single page since it is aligned on its
own size).
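
The single-page guarantee can be sanity-checked with a quick sketch (hypothetical helper and constants, not part of iPXE): a block whose alignment equals its power-of-two size can never straddle a page boundary, whereas a 32-byte block aligned to only 16 bytes can.

```python
PAGE_SIZE = 4096

def crosses_page(start: int, length: int) -> bool:
    """True if the byte range [start, start+length) spans a page boundary."""
    return (start // PAGE_SIZE) != ((start + length - 1) // PAGE_SIZE)

# A 32-byte block aligned on its own size never crosses a page boundary.
assert not any(crosses_page(s, 32) for s in range(0, 2 * PAGE_SIZE, 32))

# With only 16-byte alignment, a 32-byte block can cross a boundary
# (e.g. when it starts 16 bytes before the end of a page), which is why
# xalign must grow along with the transition code.
assert crosses_page(PAGE_SIZE - 16, 32)
```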

This unfortunately requires increasing the alignment of the transition
code (and hence the maximum number of NOPs inserted).  The assembler
syntax theoretically allows us to avoid inserting any NOPs via a
directive such as:

   .balign PAGE_SIZE, , enable_paging_32_max_len

(i.e. relying on the fact that if the transition code is already
sufficiently far away from the end of a page, then no padding needs to
be inserted).  However, alignment on RISC-V is implemented using the
R_RISCV_ALIGN relaxing relocation, which doesn't encode any concept of
a maximum padding length, and so the maximum padding length value is
effectively ignored.
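For reference, GNU as defines .balign as taking up to three absolute
expressions: the byte alignment, an optional fill value, and an optional
maximum number of bytes to skip.  A sketch of the intended (but, per the
above, ineffective on RISC-V) usage, with an illustrative max-skip value:

   /* .balign alignment[, fill[, max-skip]] */
   .balign PAGE_SIZE, , 64   /* align to a page boundary, but only if
                              * doing so inserts at most 64 padding
                              * bytes; the limit is lost when alignment
                              * is deferred to the R_RISCV_ALIGN
                              * relaxing relocation */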

Signed-off-by: Michael Brown <mcb30@ipxe.org>
Author: Michael Brown
Date:   2025-05-08 11:03:38 +01:00
Commit: 5e518c744e (parent 0279015d09)

@@ -626,7 +626,7 @@ enable_paging_64_done:
 	 * address, then restore the temporarily modified PTE.
 	 */
-	.equ	enable_paging_32_xalign, 16
+	.equ	enable_paging_32_xalign, 32
	.section ".prefix.enable_paging_32", "ax", @progbits
 enable_paging_32:
@@ -670,6 +670,9 @@ enable_paging_32:
 	ori	t0, t0, PTE_LEAF
 	STOREN	t0, (a3)
+	/* Adjust PTE pointer to a virtual address */
+	sub	a3, a3, a1
 	/* Attempt to enable paging, and read back active paging level */
 	la	t0, 1f
 	sub	t0, t0, a1
@@ -684,16 +687,12 @@ enable_paging_32_xstart:
 	csrrw	a0, satp, t1
 	beqz	a0, enable_paging_32_done
 	jr	t0
-1:	/* End of transition code */
-	.equ	enable_paging_32_xlen, . - enable_paging_32_xstart
-	li	a0, 1
-	/* Adjust PTE pointer to a virtual address */
-	sub	a3, a3, a1
-	/* Restore temporarily modified PTE */
+1:	/* Restore temporarily modified PTE */
 	STOREN	a4, (a3)
 	sfence.vma
+	/* End of transition code */
+	.equ	enable_paging_32_xlen, . - enable_paging_32_xstart
+	li	a0, 1
 	/* Adjust return address */
 	sub	ra, ra, a1