author:    H.J. Lu <hjl.tools@gmail.com>  2016-03-04 08:37:40 -0800
committer: H.J. Lu <hjl.tools@gmail.com>  2016-03-04 08:39:07 -0800
commit:    14a1d7cc4c4fd5ee8e4e66b777221dd32a84efe8 (patch)
tree:      86611a9511bcc3cafb5de83890af6c0508e569a9 /ChangeLog
parent:    4b230f6a60f3bb9cae92306d016535f40578ff2e (diff)
x86-64: Fix memcpy IFUNC selection
Check Fast_Unaligned_Load, instead of Slow_BSF, and also check for
Fast_Copy_Backward to enable __memcpy_ssse3_back.  The existing
selection order is updated to the following:
1. __memcpy_avx_unaligned if AVX_Fast_Unaligned_Load bit is set.
2. __memcpy_sse2_unaligned if Fast_Unaligned_Load bit is set.
3. __memcpy_sse2 if SSSE3 isn't available.
4. __memcpy_ssse3_back if Fast_Copy_Backward bit is set.
5. __memcpy_ssse3
[BZ #18880]
* sysdeps/x86_64/multiarch/memcpy.S: Check Fast_Unaligned_Load,
instead of Slow_BSF, and also check for Fast_Copy_Backward to
enable __memcpy_ssse3_back.
Diffstat (limited to 'ChangeLog')
-rw-r--r-- | ChangeLog | 8 |
1 file changed, 8 insertions, 0 deletions
@@ -1,3 +1,11 @@
+2016-03-04  Amit Pawar  <Amit.Pawar@amd.com>
+	    H.J. Lu  <hongjiu.lu@intel.com>
+
+	[BZ #18880]
+	* sysdeps/x86_64/multiarch/memcpy.S: Check Fast_Unaligned_Load,
+	instead of Slow_BSF, and also check for Fast_Copy_Backward to
+	enable __memcpy_ssse3_back.
+
 2016-03-03  H.J. Lu  <hongjiu.lu@intel.com>
 
 	[BZ #19758]