| Message ID | 20211221150611.3692437-1-kuba@kernel.org |
|---|---|
| State | Accepted |
| Commit | d480a26bdf872529919e7c30e17f79d0d7b8c4da |
| Series | [crypto] x86/aesni: don't require alignment of data |
On Tue, 21 Dec 2021 at 16:06, Jakub Kicinski <kuba@kernel.org> wrote:
>
> x86 AES-NI routines can deal with unaligned data. Crypto context
> (key, iv etc.) have to be aligned but we take care of that separately
> by copying it onto the stack. We were feeding unaligned data into
> crypto routines up until commit 83c83e658863 ("crypto: aesni -
> refactor scatterlist processing") switched to use the full
> skcipher API which uses cra_alignmask to decide data alignment.
>
> This fixes 21% performance regression in kTLS.
>
> Tested by booting with CONFIG_CRYPTO_MANAGER_EXTRA_TESTS=y
> (and running thru various kTLS packets).
>
> CC: stable@vger.kernel.org # 5.15+
> Fixes: 83c83e658863 ("crypto: aesni - refactor scatterlist processing")
> Signed-off-by: Jakub Kicinski <kuba@kernel.org>

Acked-by: Ard Biesheuvel <ardb@kernel.org>

> ---
> CC: herbert@gondor.apana.org.au
> CC: x86@kernel.org
> CC: ardb@kernel.org
> CC: linux-crypto@vger.kernel.org
> ---
>  arch/x86/crypto/aesni-intel_glue.c | 4 ++--
>  1 file changed, 2 insertions(+), 2 deletions(-)
>
> diff --git a/arch/x86/crypto/aesni-intel_glue.c b/arch/x86/crypto/aesni-intel_glue.c
> index e09f4672dd38..41901ba9d3a2 100644
> --- a/arch/x86/crypto/aesni-intel_glue.c
> +++ b/arch/x86/crypto/aesni-intel_glue.c
> @@ -1107,7 +1107,7 @@ static struct aead_alg aesni_aeads[] = { {
>  		.cra_flags = CRYPTO_ALG_INTERNAL,
>  		.cra_blocksize = 1,
>  		.cra_ctxsize = sizeof(struct aesni_rfc4106_gcm_ctx),
> -		.cra_alignmask = AESNI_ALIGN - 1,
> +		.cra_alignmask = 0,
>  		.cra_module = THIS_MODULE,
>  	},
>  }, {
> @@ -1124,7 +1124,7 @@ static struct aead_alg aesni_aeads[] = { {
>  		.cra_flags = CRYPTO_ALG_INTERNAL,
>  		.cra_blocksize = 1,
>  		.cra_ctxsize = sizeof(struct generic_gcmaes_ctx),
> -		.cra_alignmask = AESNI_ALIGN - 1,
> +		.cra_alignmask = 0,
>  		.cra_module = THIS_MODULE,
>  	},
>  } };
> --
> 2.31.1
>
On Tue, Dec 21, 2021 at 07:06:11AM -0800, Jakub Kicinski wrote:
> x86 AES-NI routines can deal with unaligned data. Crypto context
> (key, iv etc.) have to be aligned but we take care of that separately
> by copying it onto the stack. We were feeding unaligned data into
> crypto routines up until commit 83c83e658863 ("crypto: aesni -
> refactor scatterlist processing") switched to use the full
> skcipher API which uses cra_alignmask to decide data alignment.
>
> This fixes 21% performance regression in kTLS.
>
> Tested by booting with CONFIG_CRYPTO_MANAGER_EXTRA_TESTS=y
> (and running thru various kTLS packets).
>
> CC: stable@vger.kernel.org # 5.15+
> Fixes: 83c83e658863 ("crypto: aesni - refactor scatterlist processing")
> Signed-off-by: Jakub Kicinski <kuba@kernel.org>
> ---
> CC: herbert@gondor.apana.org.au
> CC: x86@kernel.org
> CC: ardb@kernel.org
> CC: linux-crypto@vger.kernel.org
> ---
>  arch/x86/crypto/aesni-intel_glue.c | 4 ++--
>  1 file changed, 2 insertions(+), 2 deletions(-)

Patch applied. Thanks.
diff --git a/arch/x86/crypto/aesni-intel_glue.c b/arch/x86/crypto/aesni-intel_glue.c
index e09f4672dd38..41901ba9d3a2 100644
--- a/arch/x86/crypto/aesni-intel_glue.c
+++ b/arch/x86/crypto/aesni-intel_glue.c
@@ -1107,7 +1107,7 @@ static struct aead_alg aesni_aeads[] = { {
 		.cra_flags = CRYPTO_ALG_INTERNAL,
 		.cra_blocksize = 1,
 		.cra_ctxsize = sizeof(struct aesni_rfc4106_gcm_ctx),
-		.cra_alignmask = AESNI_ALIGN - 1,
+		.cra_alignmask = 0,
 		.cra_module = THIS_MODULE,
 	},
 }, {
@@ -1124,7 +1124,7 @@ static struct aead_alg aesni_aeads[] = { {
 		.cra_flags = CRYPTO_ALG_INTERNAL,
 		.cra_blocksize = 1,
 		.cra_ctxsize = sizeof(struct generic_gcmaes_ctx),
-		.cra_alignmask = AESNI_ALIGN - 1,
+		.cra_alignmask = 0,
 		.cra_module = THIS_MODULE,
 	},
 } };
x86 AES-NI routines can deal with unaligned data. Crypto context
(key, iv etc.) have to be aligned but we take care of that separately
by copying it onto the stack. We were feeding unaligned data into
crypto routines up until commit 83c83e658863 ("crypto: aesni -
refactor scatterlist processing") switched to use the full
skcipher API which uses cra_alignmask to decide data alignment.

This fixes 21% performance regression in kTLS.

Tested by booting with CONFIG_CRYPTO_MANAGER_EXTRA_TESTS=y
(and running thru various kTLS packets).

CC: stable@vger.kernel.org # 5.15+
Fixes: 83c83e658863 ("crypto: aesni - refactor scatterlist processing")
Signed-off-by: Jakub Kicinski <kuba@kernel.org>
---
CC: herbert@gondor.apana.org.au
CC: x86@kernel.org
CC: ardb@kernel.org
CC: linux-crypto@vger.kernel.org
---
 arch/x86/crypto/aesni-intel_glue.c | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)