author     Eric Biggers <ebiggers@google.com>                2019-05-30 10:53:08 -0700
committer  Greg Kroah-Hartman <gregkh@linuxfoundation.org>   2019-07-14 08:01:06 +0200
commit     eff1b6abd9a065180968175a5e93f1ef644b7ab5 (patch)
tree       a790721e1731d4ba05bfb30c95eac921df56e9de /crypto
parent     0ecfebd2b52404ae0c54a878c872bb93363ada36 (diff)
crypto: lrw - use correct alignmask
commit 20a0f9761343fba9b25ea46bd3a3e5e533d974f8 upstream.
Commit c778f96bf347 ("crypto: lrw - Optimize tweak computation")
incorrectly reduced the alignmask of LRW instances from
'__alignof__(u64) - 1' to '__alignof__(__be32) - 1'.
However, xor_tweak() and setkey() assume that the data and key,
respectively, are aligned to 'be128', which has u64 alignment.
Fix the alignmask to be at least '__alignof__(be128) - 1'.
Fixes: c778f96bf347 ("crypto: lrw - Optimize tweak computation")
Cc: <stable@vger.kernel.org> # v4.20+
Cc: Ondrej Mosnacek <omosnace@redhat.com>
Signed-off-by: Eric Biggers <ebiggers@google.com>
Signed-off-by: Herbert Xu <herbert@gondor.apana.org.au>
Signed-off-by: Greg Kroah-Hartman <gregkh@linuxfoundation.org>
Diffstat (limited to 'crypto')
-rw-r--r--  crypto/lrw.c | 2
1 file changed, 1 insertion, 1 deletion
diff --git a/crypto/lrw.c b/crypto/lrw.c
index 58009cf63a6e..be829f6afc8e 100644
--- a/crypto/lrw.c
+++ b/crypto/lrw.c
@@ -384,7 +384,7 @@ static int create(struct crypto_template *tmpl, struct rtattr **tb)
 	inst->alg.base.cra_priority = alg->base.cra_priority;
 	inst->alg.base.cra_blocksize = LRW_BLOCK_SIZE;
 	inst->alg.base.cra_alignmask = alg->base.cra_alignmask |
-				       (__alignof__(__be32) - 1);
+				       (__alignof__(be128) - 1);
 
 	inst->alg.ivsize = LRW_BLOCK_SIZE;
 	inst->alg.min_keysize = crypto_skcipher_alg_min_keysize(alg) +
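
Side note (not part of the patch): a minimal user-space sketch of the alignmask arithmetic the commit message describes. The be128/u64 typedefs are re-declared locally to mirror the kernel types, and the example only shows why a mask of '__alignof__(__be32) - 1' can accept a pointer that is misaligned for be128 access, while '__alignof__(be128) - 1' does not; it is not the kernel's actual cra_alignmask handling.

#include <stdio.h>
#include <stdint.h>

/* Local stand-ins for the kernel types (assumption: be128 is two u64
 * words, so it has u64/8-byte alignment). */
typedef uint64_t u64;
typedef struct { u64 a; u64 b; } be128;

int main(void)
{
	unsigned long addr   = 0x1004;                    /* 4-byte aligned, not 8-byte aligned */
	unsigned long weak   = __alignof__(uint32_t) - 1; /* 0x3, the old mask */
	unsigned long strong = __alignof__(be128) - 1;    /* 0x7, the fixed mask */

	/* A buffer counts as aligned when (addr & alignmask) == 0. */
	printf("old mask:   addr & 0x%lx = %lu -> %s\n", weak, addr & weak,
	       (addr & weak) ? "realigned" : "accepted, yet misaligned for be128");
	printf("fixed mask: addr & 0x%lx = %lu -> %s\n", strong, addr & strong,
	       (addr & strong) ? "realigned before use" : "accepted");
	return 0;
}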