Diffstat (limited to 'clang/docs')
-rw-r--r-- | clang/docs/InternalsManual.rst                         | 32
-rw-r--r-- | clang/docs/ReleaseNotes.rst                            |  5
-rw-r--r-- | clang/docs/ShadowCallStack.rst                         |  2
-rw-r--r-- | clang/docs/analyzer/user-docs/CrossTranslationUnit.rst |  4
4 files changed, 24 insertions, 19 deletions
diff --git a/clang/docs/InternalsManual.rst b/clang/docs/InternalsManual.rst
index 8a44db7..f33950b 100644
--- a/clang/docs/InternalsManual.rst
+++ b/clang/docs/InternalsManual.rst
@@ -312,7 +312,7 @@ Description:
 * number: A simple decimal number matches if the argument is the same as the
   number. Example: ``"%plural{1:mouse|:mice}0"``
 * range: A range in square brackets matches if the argument is within the
-  range. Then range is inclusive on both ends. Example:
+  range. The range is inclusive on both ends. Example:
   ``"%plural{0:none|1:one|[2,5]:some|:many}0"``
 * modulo: A modulo operator is followed by a number, and equals sign and
   either a number or a range. The tests are the same as for plain numbers
@@ -341,7 +341,7 @@ Example:
 Class:
   Integers
 Description:
-  This is a formatter which represents the argument number in a human readable
+  This is a formatter which represents the argument number in a human-readable
   format: the value ``123`` stays ``123``, ``12345`` becomes ``12.34k``,
   ``6666666` becomes ``6.67M``, and so on for 'G' and 'T'.
@@ -561,7 +561,7 @@ documentation for the ``-verify`` mode can be found at

 There are many other possible implementations of this interface, and this is
 why we prefer diagnostics to pass down rich structured information in
-arguments. For example, an HTML output might want declaration names be
+arguments. For example, an HTML output might want declaration names to be
 linkified to where they come from in the source. Another example is that a
 GUI might let you click on typedefs to expand them. This application would
 want to pass significantly more information about types through to the GUI than a
@@ -846,7 +846,7 @@ Option Marshalling Infrastructure
 The option marshalling infrastructure automates the parsing of the Clang
 ``-cc1`` frontend command line arguments into ``CompilerInvocation`` and their
 generation from ``CompilerInvocation``. The system replaces lots of repetitive
-C++ code with simple, declarative tablegen annotations and it's being used for
+C++ code with simple, declarative tablegen annotations and is being used for
 the majority of the ``-cc1`` command line interface. This section provides an
 overview of the system.
@@ -986,7 +986,7 @@ line.
     NegFlag<SetFalse, [], [], "Use the new pass manager in LLVM">,
     BothFlags<[], [ClangOption, CC1Option]>>;

-With most such pair of flags, the ``-cc1`` frontend accepts only the flag that
+With most such pairs of flags, the ``-cc1`` frontend accepts only the flag that
 changes the default key path value. The Clang driver is responsible for
 accepting both and either forwarding the changing flag or discarding the flag
 that would just set the key path to its default.
@@ -1042,8 +1042,8 @@ and the result is assigned to the key path on success.
 The key path defaults to the value specified in ``MarshallingInfoEnum`` prefixed
 by the contents of ``NormalizedValuesScope`` and ``::``. This ensures correct
 reference to an enum case is formed even if the enum resides in different
-namespace or is an enum class. If the value present on command line does not
-match any of the comma-separated values from ``Values``, an error diagnostics is
+namespace or is an enum class. If the value present on the command line does not
+match any of the comma-separated values from ``Values``, an error diagnostic is
 issued. Otherwise, the corresponding element from ``NormalizedValues`` at the
 same index is assigned to the key path (also correctly scoped).
 The number of comma-separated string values and elements of the array within
@@ -1111,7 +1111,7 @@ The Token class
 ---------------

 The ``Token`` class is used to represent a single lexed token. Tokens are
-intended to be used by the lexer/preprocess and parser libraries, but are not
+intended to be used by the lexer/preprocessor and parser libraries, but are not
 intended to live beyond them (for example, they should not live in the ASTs).

 Tokens most often live on the stack (or some other location that is efficient
@@ -1253,7 +1253,7 @@ In order to do this, whenever the parser expects a ``tok::identifier`` or
 ``tok::coloncolon``, it should call the ``TryAnnotateTypeOrScopeToken`` or
 ``TryAnnotateCXXScopeToken`` methods to form the annotation token. These
 methods will maximally form the specified annotation tokens and replace the
-current token with them, if applicable. If the current tokens is not valid for
+current token with them, if applicable. If the current token is not valid for
 an annotation token, it will remain an identifier or "``::``" token.

 .. _Lexer:
@@ -1276,7 +1276,7 @@ The lexer has a couple of interesting modal features:
   This mode is used for lexing within an "``#if 0``" block, for example.
 * The lexer can capture and return comments as tokens. This is required to
   support the ``-C`` preprocessor mode, which passes comments through, and is
-  used by the diagnostic checker to identifier expect-error annotations.
+  used by the diagnostic checker to identify expect-error annotations.
 * The lexer can be in ``ParsingFilename`` mode, which happens when
   preprocessing after reading a ``#include`` directive. This mode changes the
   parsing of "``<``" to return an "angled string" instead of a bunch of tokens
@@ -1308,7 +1308,7 @@ The ``TokenLexer`` class
 ------------------------

 The ``TokenLexer`` class is a token provider that returns tokens from a list of
-tokens that came from somewhere else. It typically used for two things: 1)
+tokens that came from somewhere else. It is typically used for two things: 1)
 returning tokens from a macro definition as it is being expanded 2) returning
 tokens from an arbitrary buffer of tokens. The later use is used by
 ``_Pragma`` and will most likely be used to handle unbounded look-ahead for the
@@ -1509,7 +1509,7 @@ type checker must verify that the operand has a pointer type. It would not be
 correct to check that with "``isa<PointerType>(SubExpr->getType())``", because
 this predicate would fail if the subexpression had a typedef type.

-The solution to this problem are a set of helper methods on ``Type``, used to
+The solution to this problem is a set of helper methods on ``Type``, used to
 check their properties. In this case, it would be correct to use
 "``SubExpr->getType()->isPointerType()``" to do the check. This predicate will
 return true if the *canonical type is a pointer*, which is true any time the
@@ -1632,7 +1632,7 @@ the names are inside the ``DeclarationName`` class).

 ``CXXLiteralOperatorName``

-  The name is a C++11 user defined literal operator. User defined
+  The name is a C++11 user-defined literal operator. User-defined
   Literal operators are named according to the suffix they define, e.g.,
   "``_foo``" for "``operator "" _foo``". Use
   ``N.getCXXLiteralIdentifier()`` to retrieve the corresponding
@@ -2215,7 +2215,7 @@ Consequently, we must either set the virtual flag for the definition (but then
 we create a malformed AST which the parser would never create), or we import
 the whole redeclaration chain of the function. The most recent version of the
 ``ASTImporter`` uses the latter mechanism. We do import all function
-declarations - regardless if they are definitions or prototypes - in the order
+declarations - regardless of whether they are definitions or prototypes - in the order
 as they appear in the "from" context.

 .. One definition
@@ -2338,7 +2338,7 @@ library receive an Error object, which they must check.
 During import of a specific declaration, it may happen that some AST nodes had
 already been created before we recognize an error. In this case, we signal back
 the error to the caller, but the "to" context remains polluted with those nodes
-which had been created. Ideally, those nodes should not had been created, but
+which had been created. Ideally, those nodes should not have been created, but
 that time we did not know about the error, the error happened later. Since the
 AST is immutable (most of the cases we can't remove existing nodes) we choose
 to mark these nodes as erroneous.
@@ -2579,7 +2579,7 @@ that there are global declarations which collide with declarations from other
 translation units, but they are not referenced outside from their translation
 unit. These declarations should be in an unnamed namespace ideally. If we treat
 these collisions liberally then CTU analysis can find more results. Note, the
-feature be able to choose between name conflict handling strategies is still an
+feature to be able to choose between name conflict handling strategies is still an
 ongoing work.

 .. _CFG:
diff --git a/clang/docs/ReleaseNotes.rst b/clang/docs/ReleaseNotes.rst
index 9d9a000..4a2edae 100644
--- a/clang/docs/ReleaseNotes.rst
+++ b/clang/docs/ReleaseNotes.rst
@@ -134,6 +134,11 @@ Bug Fixes in This Version
 -------------------------
 - Fix a crash when marco name is empty in ``#pragma push_macro("")`` or
   ``#pragma pop_macro("")``. (#GH149762).
+- `-Wunreachable-code`` now diagnoses tautological or contradictory
+  comparisons such as ``x != 0 || x != 1.0`` and ``x == 0 && x == 1.0`` on
+  targets that treat ``_Float16``/``__fp16`` as native scalar types. Previously
+  the warning was silently lost because the operands differed only by an implicit
+  cast chain. (#GH149967).

 Bug Fixes to Compiler Builtins
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
diff --git a/clang/docs/ShadowCallStack.rst b/clang/docs/ShadowCallStack.rst
index fc8bea8..9b104cc 100644
--- a/clang/docs/ShadowCallStack.rst
+++ b/clang/docs/ShadowCallStack.rst
@@ -61,7 +61,7 @@ The instrumentation makes use of the platform register ``x18`` on AArch64,
 ``x3`` (``gp``) on RISC-V with software shadow stack and ``ssp`` on RISC-V with
 hardware shadow stack, which needs `Zicfiss`_ and ``-fcf-protection=return``.
 Users can choose between the software and hardware based shadow stack
-implementation on RISC-V backend by passing ``-fsanitize=shadowcallstack``
+implementation on RISC-V backend by passing ``-fsanitize=shadow-call-stack``
 or ``Zicfiss`` with ``-fcf-protection=return``. For simplicity we will refer to
 this as the ``SCSReg``. On some platforms, ``SCSReg`` is reserved, and on
 others, it is designated as a scratch register.
diff --git a/clang/docs/analyzer/user-docs/CrossTranslationUnit.rst b/clang/docs/analyzer/user-docs/CrossTranslationUnit.rst
index 700dac0..a04b9f8 100644
--- a/clang/docs/analyzer/user-docs/CrossTranslationUnit.rst
+++ b/clang/docs/analyzer/user-docs/CrossTranslationUnit.rst
@@ -132,7 +132,7 @@ Once we have set up the `PATH` environment variable and we activated the python

 .. code-block:: bash

-  $ CodeChecker analyze --ctu compile_commands.json -o reports
+  $ CodeChecker analyze --ctu --ctu-ast-mode load-from-pch compile_commands.json -o reports
   $ ls -F
   compile_commands.json  foo.cpp  foo.cpp.ast  main.cpp  reports/
   $ tree reports
@@ -318,7 +318,7 @@ Once we have set up the `PATH` environment variable and we activated the python

 .. code-block:: bash

-  $ CodeChecker analyze --ctu --ctu-ast-loading-mode on-demand compile_commands.json -o reports
+  $ CodeChecker analyze --ctu compile_commands.json -o reports
   $ ls -F
   compile_commands.json  foo.cpp  main.cpp  reports/
   $ tree reports
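As a reading aid for the ReleaseNotes hunk above, here is a minimal, illustrative sketch of the kind of code the new -Wunreachable-code entry describes; the function name and constants are made up and are not taken from the patch. On a target that treats _Float16 as a native scalar type, compiling this with -Wunreachable-code is expected to report the guarded return as unreachable, since the contradictory comparison can never be true:

    /* Hypothetical example; not part of the commit. */
    int classify(_Float16 x) {
      if (x == 0 && x == 1.0)   /* contradictory: both cannot hold at once */
        return 1;               /* expected to be diagnosed as unreachable */
      return 0;
    }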