Fix TLB invalidation

Add synchronization barriers and use the correct TLBI instructions when
invalidating single VA entries. TLBI instructions expect the virtual
address shifted right by 12 bits, which was missing from the previous
implementation.

Signed-off-by: Imre Kis <imre.kis@arm.com>
Change-Id: I413f986fffbdecb875a8ddc3356bae61b73e51d8
diff --git a/src/descriptor.rs b/src/descriptor.rs
index b5db7cc..c3ed68f 100644
--- a/src/descriptor.rs
+++ b/src/descriptor.rs
@@ -354,7 +354,7 @@
         unsafe {
             ptr::write_volatile(self.cell.get(), value);
             #[cfg(target_arch = "aarch64")]
-            core::arch::asm!("dsb nsh");
+            core::arch::asm!("dsb ishst");
         }
     }