author     Richard Sandiford <richard.sandiford@linaro.org>  2017-12-21 07:02:13 +0000
committer  Richard Sandiford <rsandifo@gcc.gnu.org>          2017-12-21 07:02:13 +0000
commit     aca52e6f8d29064f4712e5f3f4429a36f918f099
tree       45c2884952716e83f81ee8162585cbf16e06bf73  /gcc/tree-data-ref.c
parent     3fed2ce96f7ec8ee8603b33ba0426ac40acecf24
poly_int: MEM_REF offsets
This patch allows MEM_REF offsets to be polynomial, with mem_ref_offset
now returning a poly_offset_int instead of an offset_int. The
non-mechanical changes to callers of mem_ref_offset were handled by
previous patches.
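For illustration only, the shape of the change at a typical call site looks like the following; this restates the caller pattern from the gcc/tree-data-ref.c hunk below, reusing its local names, and is not additional code from the patch:

    /* Before this patch: the MEM_REF offset was a constant offset_int.  */
    offset_int moff = mem_ref_offset (base);
    base_misalignment -= moff.to_short_addr ();

    /* After this patch: mem_ref_offset returns a poly_offset_int, and the
       misalignment adjustment uses force_shwi () instead of
       to_short_addr ().  */
    poly_offset_int moff = mem_ref_offset (base);
    base_misalignment -= moff.force_shwi ();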
2017-12-21 Richard Sandiford <richard.sandiford@linaro.org>
Alan Hayward <alan.hayward@arm.com>
David Sherwood <david.sherwood@arm.com>
gcc/
* fold-const.h (mem_ref_offset): Return a poly_offset_int rather
than an offset_int.
* tree.c (mem_ref_offset): Likewise.
(build_simple_mem_ref_loc): Treat MEM_REF offsets as poly_ints.
* builtins.c (get_object_alignment_2): Likewise.
* expr.c (get_inner_reference, expand_expr_real_1): Likewise.
* gimple-fold.c (get_base_constructor): Likewise.
* gimple-ssa-strength-reduction.c (restructure_reference): Likewise.
* gimple-ssa-warn-restrict.c (builtin_memref::builtin_memref):
Likewise.
* ipa-polymorphic-call.c
(ipa_polymorphic_call_context::ipa_polymorphic_call_context): Likewise.
* ipa-prop.c (compute_complex_assign_jump_func): Likewise.
(get_ancestor_addr_info): Likewise.
(ipa_get_adjustment_candidate): Likewise.
* match.pd: Likewise.
* tree-data-ref.c (dr_analyze_innermost): Likewise.
* tree-dfa.c (get_addr_base_and_unit_offset_1): Likewise.
* tree-eh.c (tree_could_trap_p): Likewise.
* tree-object-size.c (addr_object_size): Likewise.
* tree-ssa-address.c (copy_ref_info): Likewise.
* tree-ssa-alias.c (indirect_ref_may_alias_decl_p): Likewise.
(indirect_refs_may_alias_p): Likewise.
* tree-ssa-sccvn.c (copy_reference_ops_from_ref): Likewise.
* tree-ssa.c (maybe_rewrite_mem_ref_base): Likewise.
(non_rewritable_mem_ref_base): Likewise.
* tree-vect-data-refs.c (vect_check_gather_scatter): Likewise.
* tree-vrp.c (vrp_prop::check_array_ref): Likewise.
* varasm.c (decode_addr_const): Likewise.
Co-Authored-By: Alan Hayward <alan.hayward@arm.com>
Co-Authored-By: David Sherwood <david.sherwood@arm.com>
From-SVN: r255930
Diffstat (limited to 'gcc/tree-data-ref.c')
-rw-r--r--   gcc/tree-data-ref.c   28
1 file changed, 17 insertions, 11 deletions
diff --git a/gcc/tree-data-ref.c b/gcc/tree-data-ref.c
index 86a587d..2707cf8 100644
--- a/gcc/tree-data-ref.c
+++ b/gcc/tree-data-ref.c
@@ -820,16 +820,16 @@ dr_analyze_innermost (innermost_loop_behavior *drb, tree ref,
     }
 
   /* Calculate the alignment and misalignment for the inner reference.  */
-  unsigned int HOST_WIDE_INT base_misalignment;
-  unsigned int base_alignment;
-  get_object_alignment_1 (base, &base_alignment, &base_misalignment);
+  unsigned int HOST_WIDE_INT bit_base_misalignment;
+  unsigned int bit_base_alignment;
+  get_object_alignment_1 (base, &bit_base_alignment, &bit_base_misalignment);
 
   /* There are no bitfield references remaining in BASE, so the values
      we got back must be whole bytes.  */
-  gcc_assert (base_alignment % BITS_PER_UNIT == 0
-              && base_misalignment % BITS_PER_UNIT == 0);
-  base_alignment /= BITS_PER_UNIT;
-  base_misalignment /= BITS_PER_UNIT;
+  gcc_assert (bit_base_alignment % BITS_PER_UNIT == 0
+              && bit_base_misalignment % BITS_PER_UNIT == 0);
+  unsigned int base_alignment = bit_base_alignment / BITS_PER_UNIT;
+  poly_int64 base_misalignment = bit_base_misalignment / BITS_PER_UNIT;
 
   if (TREE_CODE (base) == MEM_REF)
     {
@@ -837,8 +837,8 @@ dr_analyze_innermost (innermost_loop_behavior *drb, tree ref,
         {
           /* Subtract MOFF from the base and add it to POFFSET instead.
              Adjust the misalignment to reflect the amount we subtracted.  */
-          offset_int moff = mem_ref_offset (base);
-          base_misalignment -= moff.to_short_addr ();
+          poly_offset_int moff = mem_ref_offset (base);
+          base_misalignment -= moff.force_shwi ();
           tree mofft = wide_int_to_tree (sizetype, moff);
           if (!poffset)
             poffset = mofft;
@@ -925,8 +925,14 @@ dr_analyze_innermost (innermost_loop_behavior *drb, tree ref,
   drb->offset = fold_convert (ssizetype, offset_iv.base);
   drb->init = init;
   drb->step = step;
-  drb->base_alignment = base_alignment;
-  drb->base_misalignment = base_misalignment & (base_alignment - 1);
+  if (known_misalignment (base_misalignment, base_alignment,
+                          &drb->base_misalignment))
+    drb->base_alignment = base_alignment;
+  else
+    {
+      drb->base_alignment = known_alignment (base_misalignment);
+      drb->base_misalignment = 0;
+    }
   drb->offset_alignment = highest_pow2_factor (offset_iv.base);
   drb->step_alignment = highest_pow2_factor (step);