author     Richard Sandiford <richard.sandiford@linaro.org>  2017-12-20 12:51:50 +0000
committer  Richard Sandiford <rsandifo@gcc.gnu.org>  2017-12-20 12:51:50 +0000
commit     36fd64086542ed734aded849304723218fa4d6fd (patch)
tree       8ecb64d56e437c9912d8f6db4a70c9aec93c5019 /gcc/tree.c
parent     0c12fc9b2d605cf323cfdab28a972d86398e71a1 (diff)
poly_int: tree constants
This patch adds a tree representation for poly_ints. Unlike the
rtx version, the coefficients are INTEGER_CSTs rather than plain
integers, so that we can easily access them as poly_widest_ints
and poly_offset_ints.
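The shape of such a constant can be illustrated with a small standalone sketch (illustrative C++ only, not GCC code: the two-coefficient layout and all names here are assumptions modelled on poly_int, whose value is coeffs[0] + coeffs[1] * x for a runtime quantity x such as a scalable vector-length multiplier):

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical 2-coefficient polynomial integer, modelling a poly_int
// with NUM_POLY_INT_COEFFS == 2: value = coeffs[0] + coeffs[1] * x,
// where x is only known at runtime.
struct poly64
{
  int64_t coeffs[2];

  // A degenerate polynomial (no x term) behaves like an ordinary constant.
  bool is_constant () const { return coeffs[1] == 0; }

  // Evaluate the polynomial once the runtime value of x is known.
  int64_t eval (int64_t x) const { return coeffs[0] + coeffs[1] * x; }
};
```

In the tree representation added by this patch, each `coeffs[i]` slot holds an INTEGER_CST rather than a plain integer, which is what makes the poly_widest_int and poly_offset_int views cheap to construct.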
The patch also adjusts some places that previously relied on "constant"
meaning "INTEGER_CST", and makes sure that the TYPE_SIZE agrees with
the TYPE_SIZE_UNIT for vector booleans, given the existing:
  /* Several boolean vector elements may fit in a single unit.  */
  if (VECTOR_BOOLEAN_TYPE_P (type)
      && type->type_common.mode != BLKmode)
    TYPE_SIZE_UNIT (type)
      = size_int (GET_MODE_SIZE (type->type_common.mode));
  else
    TYPE_SIZE_UNIT (type) = int_const_binop (MULT_EXPR,
                                             TYPE_SIZE_UNIT (innertype),
                                             size_int (nunits));
2017-12-20 Richard Sandiford <richard.sandiford@linaro.org>
Alan Hayward <alan.hayward@arm.com>
David Sherwood <david.sherwood@arm.com>
gcc/
* doc/generic.texi (POLY_INT_CST): Document.
* tree.def (POLY_INT_CST): New tree code.
* treestruct.def (TS_POLY_INT_CST): New tree layout.
* tree-core.h (tree_poly_int_cst): New struct.
(tree_node): Add a poly_int_cst field.
* tree.h (POLY_INT_CST_P, POLY_INT_CST_COEFF): New macros.
(wide_int_to_tree, force_fit_type): Take a poly_wide_int_ref
instead of a wide_int_ref.
(build_int_cst, build_int_cst_type): Take a poly_int64 instead
of a HOST_WIDE_INT.
(build_int_cstu, build_array_type_nelts): Take a poly_uint64
instead of an unsigned HOST_WIDE_INT.
(build_poly_int_cst, tree_fits_poly_int64_p, tree_fits_poly_uint64_p)
(ptrdiff_tree_p): Declare.
(tree_to_poly_int64, tree_to_poly_uint64): Likewise. Provide
extern inline implementations if the target doesn't use POLY_INT_CST.
(poly_int_tree_p): New function.
(wi::unextended_tree): New class.
(wi::int_traits <unextended_tree>): New override.
(wi::extended_tree): Add a default constructor.
(wi::extended_tree::get_tree): New function.
(wi::widest_extended_tree, wi::offset_extended_tree): New typedefs.
(wi::tree_to_widest_ref, wi::tree_to_offset_ref): Use them.
(wi::tree_to_poly_widest_ref, wi::tree_to_poly_offset_ref)
(wi::tree_to_poly_wide_ref): New typedefs.
(wi::ints_for): Provide overloads for extended_tree and
unextended_tree.
(poly_int_cst_value, wi::to_poly_widest, wi::to_poly_offset)
(wi::to_wide): New functions.
(wi::fits_to_boolean_p, wi::fits_to_tree_p): Handle poly_ints.
* tree.c (poly_int_cst_hasher): New struct.
(poly_int_cst_hash_table): New variable.
(tree_node_structure_for_code, tree_code_size, simple_cst_equal)
(valid_constant_size_p, add_expr, drop_tree_overflow): Handle
POLY_INT_CST.
(initialize_tree_contains_struct): Handle TS_POLY_INT_CST.
(init_ttree): Initialize poly_int_cst_hash_table.
(build_int_cst, build_int_cst_type, build_invariant_address): Take
a poly_int64 instead of a HOST_WIDE_INT.
(build_int_cstu, build_array_type_nelts): Take a poly_uint64
instead of an unsigned HOST_WIDE_INT.
(wide_int_to_tree): Rename to...
(wide_int_to_tree_1): ...this.
(build_new_poly_int_cst, build_poly_int_cst): New functions.
(force_fit_type): Take a poly_wide_int_ref instead of a wide_int_ref.
(wide_int_to_tree): New function that takes a poly_wide_int_ref.
(ptrdiff_tree_p, tree_to_poly_int64, tree_to_poly_uint64)
(tree_fits_poly_int64_p, tree_fits_poly_uint64_p): New functions.
* lto-streamer-out.c (DFS::DFS_write_tree_body, hash_tree): Handle
TS_POLY_INT_CST.
* tree-streamer-in.c (lto_input_ts_poly_tree_pointers): Likewise.
(streamer_read_tree_body): Likewise.
* tree-streamer-out.c (write_ts_poly_tree_pointers): Likewise.
(streamer_write_tree_body): Likewise.
* tree-streamer.c (streamer_check_handled_ts_structures): Likewise.
* asan.c (asan_protect_global): Require the size to be an INTEGER_CST.
* cfgexpand.c (expand_debug_expr): Handle POLY_INT_CST.
* expr.c (expand_expr_real_1, const_vector_from_tree): Likewise.
* gimple-expr.h (is_gimple_constant): Likewise.
* gimplify.c (maybe_with_size_expr): Likewise.
* print-tree.c (print_node): Likewise.
* tree-data-ref.c (data_ref_compare_tree): Likewise.
* tree-pretty-print.c (dump_generic_node): Likewise.
* tree-ssa-address.c (addr_for_mem_ref): Likewise.
* tree-vect-data-refs.c (dr_group_sort_cmp): Likewise.
* tree-vrp.c (compare_values_warnv): Likewise.
* tree-ssa-loop-ivopts.c (determine_base_object, constant_multiple_of)
(get_loop_invariant_expr, add_candidate_1, get_computation_aff_1)
(force_expr_to_var_cost): Likewise.
* tree-ssa-loop.c (for_each_index): Likewise.
* fold-const.h (build_invariant_address, size_int_kind): Take a
poly_int64 instead of a HOST_WIDE_INT.
* fold-const.c (fold_negate_expr_1, const_binop, const_unop)
(fold_convert_const, multiple_of_p, fold_negate_const): Handle
POLY_INT_CST.
(size_binop_loc): Likewise. Allow int_const_binop_1 to fail.
(int_const_binop_2): New function, split out from...
(int_const_binop_1): ...here. Handle POLY_INT_CST.
(size_int_kind): Take a poly_int64 instead of a HOST_WIDE_INT.
* expmed.c (make_tree): Handle CONST_POLY_INT_P.
* gimple-ssa-strength-reduction.c (slsr_process_add)
(slsr_process_mul): Check for INTEGER_CSTs before using them
as candidates.
* stor-layout.c (bits_from_bytes): New function.
(bit_from_pos): Use it.
(layout_type): Likewise. For vectors, multiply the TYPE_SIZE_UNIT
by BITS_PER_UNIT to get the TYPE_SIZE.
* tree-cfg.c (verify_expr, verify_types_in_gimple_reference): Allow
MEM_REF and TARGET_MEM_REF offsets to be a POLY_INT_CST.
Co-Authored-By: Alan Hayward <alan.hayward@arm.com>
Co-Authored-By: David Sherwood <david.sherwood@arm.com>
From-SVN: r255863
Diffstat (limited to 'gcc/tree.c')
 -rw-r--r--  gcc/tree.c  249
 1 file changed, 233 insertions(+), 16 deletions(-)
@@ -204,6 +204,17 @@ struct int_cst_hasher : ggc_cache_ptr_hash<tree_node>
 
 static GTY ((cache)) hash_table<int_cst_hasher> *int_cst_hash_table;
 
+/* Class and variable for making sure that there is a single POLY_INT_CST
+   for a given value.  */
+struct poly_int_cst_hasher : ggc_cache_ptr_hash<tree_node>
+{
+  typedef std::pair<tree, const poly_wide_int *> compare_type;
+  static hashval_t hash (tree t);
+  static bool equal (tree x, const compare_type &y);
+};
+
+static GTY ((cache)) hash_table<poly_int_cst_hasher> *poly_int_cst_hash_table;
+
 /* Hash table for optimization flags and target option flags.  Use the same
    hash table for both sets of options.  Nodes for building the current
    optimization and target option nodes.  The assumption is most of the time
@@ -459,6 +470,7 @@ tree_node_structure_for_code (enum tree_code code)
       /* tcc_constant cases.  */
     case VOID_CST:		return TS_TYPED;
     case INTEGER_CST:		return TS_INT_CST;
+    case POLY_INT_CST:		return TS_POLY_INT_CST;
     case REAL_CST:		return TS_REAL_CST;
     case FIXED_CST:		return TS_FIXED_CST;
     case COMPLEX_CST:		return TS_COMPLEX;
@@ -516,6 +528,7 @@ initialize_tree_contains_struct (void)
	case TS_COMMON:
	case TS_INT_CST:
+	case TS_POLY_INT_CST:
	case TS_REAL_CST:
	case TS_FIXED_CST:
	case TS_VECTOR:
@@ -649,6 +662,8 @@ init_ttree (void)
 
   int_cst_hash_table = hash_table<int_cst_hasher>::create_ggc (1024);
 
+  poly_int_cst_hash_table = hash_table<poly_int_cst_hasher>::create_ggc (64);
+
   int_cst_node = make_int_cst (1, 1);
 
   cl_option_hash_table = hash_table<cl_option_hasher>::create_ggc (64);
@@ -835,6 +850,7 @@ tree_code_size (enum tree_code code)
	{
	case VOID_CST:		return sizeof (tree_typed);
	case INTEGER_CST:	gcc_unreachable ();
+	case POLY_INT_CST:	return sizeof (tree_poly_int_cst);
	case REAL_CST:		return sizeof (tree_real_cst);
	case FIXED_CST:		return sizeof (tree_fixed_cst);
	case COMPLEX_CST:	return sizeof (tree_complex);
@@ -1313,31 +1329,51 @@ build_new_int_cst (tree type, const wide_int &cst)
   return nt;
 }
 
-/* Create an INT_CST node with a LOW value sign extended to TYPE.  */
+/* Return a new POLY_INT_CST with coefficients COEFFS and type TYPE.  */
+
+static tree
+build_new_poly_int_cst (tree type, tree (&coeffs)[NUM_POLY_INT_COEFFS])
+{
+  size_t length = sizeof (struct tree_poly_int_cst);
+  record_node_allocation_statistics (POLY_INT_CST, length);
+
+  tree t = ggc_alloc_cleared_tree_node_stat (length PASS_MEM_STAT);
+
+  TREE_SET_CODE (t, POLY_INT_CST);
+  TREE_CONSTANT (t) = 1;
+  TREE_TYPE (t) = type;
+  for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+    POLY_INT_CST_COEFF (t, i) = coeffs[i];
+  return t;
+}
+
+/* Create a constant tree that contains CST sign-extended to TYPE.  */
 
 tree
-build_int_cst (tree type, HOST_WIDE_INT low)
+build_int_cst (tree type, poly_int64 cst)
 {
   /* Support legacy code.  */
   if (!type)
     type = integer_type_node;
 
-  return wide_int_to_tree (type, wi::shwi (low, TYPE_PRECISION (type)));
+  return wide_int_to_tree (type, wi::shwi (cst, TYPE_PRECISION (type)));
 }
 
+/* Create a constant tree that contains CST zero-extended to TYPE.  */
+
 tree
-build_int_cstu (tree type, unsigned HOST_WIDE_INT cst)
+build_int_cstu (tree type, poly_uint64 cst)
 {
   return wide_int_to_tree (type, wi::uhwi (cst, TYPE_PRECISION (type)));
 }
 
-/* Create an INT_CST node with a LOW value sign extended to TYPE.  */
+/* Create a constant tree that contains CST sign-extended to TYPE.  */
 
 tree
-build_int_cst_type (tree type, HOST_WIDE_INT low)
+build_int_cst_type (tree type, poly_int64 cst)
 {
   gcc_assert (type);
-  return wide_int_to_tree (type, wi::shwi (low, TYPE_PRECISION (type)));
+  return wide_int_to_tree (type, wi::shwi (cst, TYPE_PRECISION (type)));
 }
 
 /* Constructs tree in type TYPE from with value given by CST.  Signedness
@@ -1365,7 +1401,7 @@ double_int_to_tree (tree type, double_int cst)
 
 tree
-force_fit_type (tree type, const wide_int_ref &cst,
+force_fit_type (tree type, const poly_wide_int_ref &cst,
		int overflowable, bool overflowed)
 {
   signop sign = TYPE_SIGN (type);
@@ -1377,8 +1413,21 @@ force_fit_type (tree type, const wide_int_ref &cst,
      || overflowable < 0
      || (overflowable > 0 && sign == SIGNED))
    {
-      wide_int tmp = wide_int::from (cst, TYPE_PRECISION (type), sign);
-      tree t = build_new_int_cst (type, tmp);
+      poly_wide_int tmp = poly_wide_int::from (cst, TYPE_PRECISION (type),
+					       sign);
+      tree t;
+      if (tmp.is_constant ())
+	t = build_new_int_cst (type, tmp.coeffs[0]);
+      else
+	{
+	  tree coeffs[NUM_POLY_INT_COEFFS];
+	  for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+	    {
+	      coeffs[i] = build_new_int_cst (type, tmp.coeffs[i]);
+	      TREE_OVERFLOW (coeffs[i]) = 1;
+	    }
+	  t = build_new_poly_int_cst (type, coeffs);
+	}
      TREE_OVERFLOW (t) = 1;
      return t;
    }
@@ -1435,8 +1484,8 @@ int_cst_hasher::equal (tree x, tree y)
   the upper bits and ensures that hashing and value equality based
   upon the underlying HOST_WIDE_INTs works without masking.  */
 
-tree
-wide_int_to_tree (tree type, const wide_int_ref &pcst)
+static tree
+wide_int_to_tree_1 (tree type, const wide_int_ref &pcst)
 {
   tree t;
   int ix = -1;
@@ -1583,6 +1632,66 @@ wide_int_to_tree (tree type, const wide_int_ref &pcst)
   return t;
 }
 
+hashval_t
+poly_int_cst_hasher::hash (tree t)
+{
+  inchash::hash hstate;
+
+  hstate.add_int (TYPE_UID (TREE_TYPE (t)));
+  for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+    hstate.add_wide_int (wi::to_wide (POLY_INT_CST_COEFF (t, i)));
+
+  return hstate.end ();
+}
+
+bool
+poly_int_cst_hasher::equal (tree x, const compare_type &y)
+{
+  if (TREE_TYPE (x) != y.first)
+    return false;
+  for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+    if (wi::to_wide (POLY_INT_CST_COEFF (x, i)) != y.second->coeffs[i])
+      return false;
+  return true;
+}
+
+/* Build a POLY_INT_CST node with type TYPE and with the elements in VALUES.
+   The elements must also have type TYPE.  */
+
+tree
+build_poly_int_cst (tree type, const poly_wide_int_ref &values)
+{
+  unsigned int prec = TYPE_PRECISION (type);
+  gcc_assert (prec <= values.coeffs[0].get_precision ());
+  poly_wide_int c = poly_wide_int::from (values, prec, SIGNED);
+
+  inchash::hash h;
+  h.add_int (TYPE_UID (type));
+  for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+    h.add_wide_int (c.coeffs[i]);
+  poly_int_cst_hasher::compare_type comp (type, &c);
+  tree *slot = poly_int_cst_hash_table->find_slot_with_hash (comp, h.end (),
+							     INSERT);
+  if (*slot == NULL_TREE)
+    {
+      tree coeffs[NUM_POLY_INT_COEFFS];
+      for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+	coeffs[i] = wide_int_to_tree_1 (type, c.coeffs[i]);
+      *slot = build_new_poly_int_cst (type, coeffs);
+    }
+  return *slot;
+}
+
+/* Create a constant tree with value VALUE in type TYPE.  */
+
+tree
+wide_int_to_tree (tree type, const poly_wide_int_ref &value)
+{
+  if (value.is_constant ())
+    return wide_int_to_tree_1 (type, value.coeffs[0]);
+  return build_poly_int_cst (type, value);
+}
+
 void
 cache_integer_cst (tree t)
 {
@@ -2716,6 +2825,55 @@ really_constant_p (const_tree exp)
     exp = TREE_OPERAND (exp, 0);
   return TREE_CONSTANT (exp);
 }
+
+/* Return true if T holds a polynomial pointer difference, storing it in
+   *VALUE if so.  A true return means that T's precision is no greater
+   than 64 bits, which is the largest address space we support, so *VALUE
+   never loses precision.  However, the signedness of the result does
+   not necessarily match the signedness of T: sometimes an unsigned type
+   like sizetype is used to encode a value that is actually negative.  */
+
+bool
+ptrdiff_tree_p (const_tree t, poly_int64_pod *value)
+{
+  if (!t)
+    return false;
+  if (TREE_CODE (t) == INTEGER_CST)
+    {
+      if (!cst_and_fits_in_hwi (t))
+	return false;
+      *value = int_cst_value (t);
+      return true;
+    }
+  if (POLY_INT_CST_P (t))
+    {
+      for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+	if (!cst_and_fits_in_hwi (POLY_INT_CST_COEFF (t, i)))
+	  return false;
+      for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+	value->coeffs[i] = int_cst_value (POLY_INT_CST_COEFF (t, i));
+      return true;
+    }
+  return false;
+}
+
+poly_int64
+tree_to_poly_int64 (const_tree t)
+{
+  gcc_assert (tree_fits_poly_int64_p (t));
+  if (POLY_INT_CST_P (t))
+    return poly_int_cst_value (t).force_shwi ();
+  return TREE_INT_CST_LOW (t);
+}
+
+poly_uint64
+tree_to_poly_uint64 (const_tree t)
+{
+  gcc_assert (tree_fits_poly_uint64_p (t));
+  if (POLY_INT_CST_P (t))
+    return poly_int_cst_value (t).force_uhwi ();
+  return TREE_INT_CST_LOW (t);
+}
 
 /* Return first list element whose TREE_VALUE is ELEM.
    Return 0 if ELEM is not in LIST.  */
@@ -4707,7 +4865,7 @@ mem_ref_offset (const_tree t)
   offsetted by OFFSET units.  */
 
 tree
-build_invariant_address (tree type, tree base, HOST_WIDE_INT offset)
+build_invariant_address (tree type, tree base, poly_int64 offset)
 {
   tree ref = fold_build2 (MEM_REF, TREE_TYPE (type),
			  build_fold_addr_expr (base),
@@ -6603,6 +6761,25 @@ tree_fits_shwi_p (const_tree t)
	  && wi::fits_shwi_p (wi::to_widest (t)));
 }
 
+/* Return true if T is an INTEGER_CST or POLY_INT_CST whose numerical
+   value (extended according to TYPE_UNSIGNED) fits in a poly_int64.  */
+
+bool
+tree_fits_poly_int64_p (const_tree t)
+{
+  if (t == NULL_TREE)
+    return false;
+  if (POLY_INT_CST_P (t))
+    {
+      for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; i++)
+	if (!wi::fits_shwi_p (wi::to_wide (POLY_INT_CST_COEFF (t, i))))
+	  return false;
+      return true;
+    }
+  return (TREE_CODE (t) == INTEGER_CST
+	  && wi::fits_shwi_p (wi::to_widest (t)));
+}
+
 /* Return true if T is an INTEGER_CST whose numerical value (extended
    according to TYPE_UNSIGNED) fits in an unsigned HOST_WIDE_INT.  */
 
@@ -6614,6 +6791,25 @@ tree_fits_uhwi_p (const_tree t)
	  && wi::fits_uhwi_p (wi::to_widest (t)));
 }
 
+/* Return true if T is an INTEGER_CST or POLY_INT_CST whose numerical
+   value (extended according to TYPE_UNSIGNED) fits in a poly_uint64.  */
+
+bool
+tree_fits_poly_uint64_p (const_tree t)
+{
+  if (t == NULL_TREE)
+    return false;
+  if (POLY_INT_CST_P (t))
+    {
+      for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; i++)
+	if (!wi::fits_uhwi_p (wi::to_widest (POLY_INT_CST_COEFF (t, i))))
+	  return false;
+      return true;
+    }
+  return (TREE_CODE (t) == INTEGER_CST
+	  && wi::fits_uhwi_p (wi::to_widest (t)));
+}
+
 /* T is an INTEGER_CST whose numerical value (extended according to
    TYPE_UNSIGNED) fits in a signed HOST_WIDE_INT.  Return that
    HOST_WIDE_INT.  */
@@ -6822,6 +7018,12 @@ simple_cst_equal (const_tree t1, const_tree t2)
	return 0;
 
     default:
+      if (POLY_INT_CST_P (t1))
+	/* A false return means maybe_ne rather than known_ne.  */
+	return known_eq (poly_widest_int::from (poly_int_cst_value (t1),
+						TYPE_SIGN (TREE_TYPE (t1))),
+			 poly_widest_int::from (poly_int_cst_value (t2),
+						TYPE_SIGN (TREE_TYPE (t2))));
      break;
    }
 
@@ -6881,6 +7083,15 @@ compare_tree_int (const_tree t, unsigned HOST_WIDE_INT u)
 bool
 valid_constant_size_p (const_tree size)
 {
+  if (POLY_INT_CST_P (size))
+    {
+      if (TREE_OVERFLOW (size))
+	return false;
+      for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+	if (!valid_constant_size_p (POLY_INT_CST_COEFF (size, i)))
+	  return false;
+      return true;
+    }
   if (! tree_fits_uhwi_p (size)
      || TREE_OVERFLOW (size)
      || tree_int_cst_sign_bit (size) != 0)
@@ -7176,6 +7387,12 @@ add_expr (const_tree t, inchash::hash &hstate, unsigned int flags)
	}
      /* FALL THROUGH */
    default:
+      if (POLY_INT_CST_P (t))
+	{
+	  for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+	    hstate.add_wide_int (wi::to_wide (POLY_INT_CST_COEFF (t, i)));
+	  return;
+	}
      tclass = TREE_CODE_CLASS (code);
 
      if (tclass == tcc_declaration)
@@ -7715,7 +7932,7 @@ build_nonshared_array_type (tree elt_type, tree index_type)
   sizetype.  */
 
 tree
-build_array_type_nelts (tree elt_type, unsigned HOST_WIDE_INT nelts)
+build_array_type_nelts (tree elt_type, poly_uint64 nelts)
 {
   return build_array_type (elt_type, build_index_type (size_int (nelts - 1)));
 }
@@ -12487,8 +12704,8 @@ drop_tree_overflow (tree t)
   gcc_checking_assert (TREE_OVERFLOW (t));
 
   /* For tree codes with a sharing machinery re-build the result.  */
-  if (TREE_CODE (t) == INTEGER_CST)
-    return wide_int_to_tree (TREE_TYPE (t), wi::to_wide (t));
+  if (poly_int_tree_p (t))
+    return wide_int_to_tree (TREE_TYPE (t), wi::to_poly_wide (t));
 
   /* For VECTOR_CST, remove the overflow bits from the encoded
      elements and canonicalize the result.  */
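The uniquing scheme that poly_int_cst_hash_table implements above can be sketched outside GCC as follows (illustrative C++ only: std::map stands in for the garbage-collected hash_table, and every name in the sketch is invented). The point is that each (type, coefficient-vector) pair maps to one shared node, so pointer equality implies value equality, just as for cached INTEGER_CSTs:

```cpp
#include <array>
#include <cassert>
#include <cstdint>
#include <map>
#include <utility>

// Stand-ins for GCC's tree type and a 2-coefficient polynomial value.
using TypeId = int;
using Coeffs = std::array<int64_t, 2>;   // NUM_POLY_INT_COEFFS == 2

struct PolyNode { TypeId type; Coeffs coeffs; };

// One shared node per (type, coefficients) key; GCC hashes the
// TYPE_UID and each coefficient to get the same effect.
static std::map<std::pair<TypeId, Coeffs>, PolyNode *> cache;

// Return the unique node for (type, coeffs), creating it on first use.
PolyNode *build_poly_node (TypeId type, const Coeffs &coeffs)
{
  auto key = std::make_pair (type, coeffs);
  auto it = cache.find (key);
  if (it != cache.end ())
    return it->second;
  PolyNode *node = new PolyNode {type, coeffs};
  cache.emplace (key, node);
  return node;
}
```

The real build_poly_int_cst additionally canonicalizes the coefficients to the type's precision before lookup, so that equal values always hash and compare equal.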