path: root/ArmPkg/Library/ArmLib/AArch64/AArch64Support.S
author    Ard Biesheuvel <ard.biesheuvel@linaro.org>    2015-11-09 13:27:15 +0000
committer abiesheuvel <abiesheuvel@Edk2>                2015-11-09 13:27:15 +0000
commit    c722289324223c472fcf920f860dc4b49314dedf (patch)
tree      7a59af5936ffd8ac8f6779f653562cb278b10131 /ArmPkg/Library/ArmLib/AArch64/AArch64Support.S
parent    fbf658ebc8e2e9340b036b16f2c94403696df1c0 (diff)
ArmPkg/ArmLib: move cache maintenance sync barriers out of loop
There is no need to issue a full data synchronization barrier and an
instruction synchronization barrier after each and every set/way or MVA
cache maintenance operation. For the set/way case, we can simply remove
them, since the set/way outer loop already issues the required barriers
after completing its traversal over all the cache levels. For the MVA
case, move the data synchronization barrier out of the loop, and add the
instruction synchronization barrier to the I-cache invalidation by MVA
routine.

Contributed-under: TianoCore Contribution Agreement 1.0
Signed-off-by: Ard Biesheuvel <ard.biesheuvel@linaro.org>
Reviewed-by: Mark Rutland <mark.rutland@arm.com>
Reviewed-by: Leif Lindholm <leif.lindholm@linaro.org>

git-svn-id: https://svn.code.sf.net/p/edk2/code/trunk/edk2@18755 6f19259b-4bc3-4df7-8a09-765794883524
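To make the MVA half of the change concrete, here is a minimal sketch of a
clean-by-MVA range loop with the data synchronization barrier hoisted out of
the loop body. This is not the actual edk2 caller; the label, the register
convention (x0 = start address, x1 = exclusive end address) and the 64-byte
line size are assumptions for illustration only. The instruction
synchronization barrier is not needed here; per the commit message it belongs
in the I-cache invalidation path, sketched after the diff below.

CleanDataCacheRangeSketch:                   // hypothetical label
    and   x0, x0, #0xFFFFFFFFFFFFFFC0        // align start down to a 64-byte cache line
0:  dc    cvac, x0                           // clean one data cache line by MVA
    add   x0, x0, #64                        // advance to the next line
    cmp   x0, x1                             // x1 holds the exclusive end address
    b.lo  0b                                 // repeat until the range is covered
    dsb   sy                                 // single data synchronization barrier after the loop
    ret

The same set of lines is cleaned either way; the difference is that one
barrier is executed per call rather than one per cache line.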
Diffstat (limited to 'ArmPkg/Library/ArmLib/AArch64/AArch64Support.S')
-rw-r--r--  ArmPkg/Library/ArmLib/AArch64/AArch64Support.S | 12
1 file changed, 0 insertions(+), 12 deletions(-)
diff --git a/ArmPkg/Library/ArmLib/AArch64/AArch64Support.S b/ArmPkg/Library/ArmLib/AArch64/AArch64Support.S
index f973a35c21..df2dc935c1 100644
--- a/ArmPkg/Library/ArmLib/AArch64/AArch64Support.S
+++ b/ArmPkg/Library/ArmLib/AArch64/AArch64Support.S
@@ -65,43 +65,31 @@ GCC_ASM_EXPORT (ArmReadCurrentEL)
ASM_PFX(ArmInvalidateDataCacheEntryByMVA):
dc ivac, x0 // Invalidate single data cache line
- dsb sy
- isb
ret
ASM_PFX(ArmCleanDataCacheEntryByMVA):
dc cvac, x0 // Clean single data cache line
- dsb sy
- isb
ret
ASM_PFX(ArmCleanInvalidateDataCacheEntryByMVA):
dc civac, x0 // Clean and invalidate single data cache line
- dsb sy
- isb
ret
ASM_PFX(ArmInvalidateDataCacheEntryBySetWay):
dc isw, x0 // Invalidate this line
- dsb sy
- isb
ret
ASM_PFX(ArmCleanInvalidateDataCacheEntryBySetWay):
dc cisw, x0 // Clean and Invalidate this line
- dsb sy
- isb
ret
ASM_PFX(ArmCleanDataCacheEntryBySetWay):
dc csw, x0 // Clean this line
- dsb sy
- isb
ret
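The counterpart mentioned in the commit message, adding the instruction
synchronization barrier to the I-cache invalidation by MVA routine, is not
part of this hunk. A minimal sketch of such a routine, with the label and
barrier domain chosen here as assumptions rather than taken from the edk2
sources, would be:

InvalidateInstructionCacheLineSketch:        // hypothetical label
    ic    ivau, x0                           // invalidate I-cache line by VA to PoU
    dsb   ish                                // wait for the maintenance operation to complete
    isb                                      // discard instructions already fetched by the pipeline
    ret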