75 files changed, 1016 insertions, 329 deletions
diff --git a/docs/markdown/Dependencies.md b/docs/markdown/Dependencies.md index 17c9991..572a3d1 100644 --- a/docs/markdown/Dependencies.md +++ b/docs/markdown/Dependencies.md @@ -242,6 +242,9 @@ libgcrypt_dep = dependency('libgcrypt', version: '>= 1.8') gpgme_dep = dependency('gpgme', version: '>= 1.0') ``` +*Since 0.55.0* Meson won't search $PATH any more for a config tool binary when +cross compiling if the config tool did not have an entry in the cross file. + ## AppleFrameworks Use the `modules` keyword to list frameworks required, e.g. diff --git a/docs/markdown/Kconfig-module.md b/docs/markdown/Keyval-module.md index 5807f8d..643265e 100644 --- a/docs/markdown/Kconfig-module.md +++ b/docs/markdown/Keyval-module.md @@ -1,15 +1,15 @@ --- -short-description: Unstable kconfig module +short-description: Unstable keyval module authors: - name: Mark Schulte, Paolo Bonzini years: [2017, 2019] has-copyright: false ... -# Unstable kconfig module +# keyval module -This module parses Kconfig output files to allow use of kconfig -configurations in meson projects. +This module parses files consisting of a series of `key=value` lines. One use +of this module is to load kconfig configurations in meson projects. **Note**:Â this does not provide kconfig frontend tooling to generate a configuration. You still need something such as kconfig frontends (see @@ -23,20 +23,23 @@ chosen the configuration options), output a ".config" file. The module may be imported as follows: ``` meson -kconfig = import('unstable-kconfig') +keyval = import('unstable-keyval') ``` The following functions will then be available as methods on the object -with the name `kconfig`. You can, of course, replace the name -`kconfig` with anything else. +with the name `keyval`. You can, of course, replace the name +`keyval` with anything else. -### kconfig.load() +### keyval.load() -This function loads a kconfig output file and returns a dictionary object. +This function loads a file consisting of a series of `key=value` lines +and returns a dictionary object. -`kconfig.load()` makes no attempt at parsing the values in the -file. Therefore, true boolean values will be represented as the string "y" -and integer values will have to be converted with `.to_int()`. +`keyval.load()` makes no attempt at parsing the values in the file. +In particular boolean and integer values will be represented as strings, +and strings will keep any quoting that is present in the input file. It +can be useful to create a [`configuration_data()`](#configuration_data) +object from the dictionary and use methods such as `get_unquoted()`. Kconfig frontends usually have ".config" as the default name for the configuration file. However, placing the configuration file in the source diff --git a/docs/markdown/Precompiled-headers.md b/docs/markdown/Precompiled-headers.md index d9ac7a4..05b50bc 100644 --- a/docs/markdown/Precompiled-headers.md +++ b/docs/markdown/Precompiled-headers.md @@ -51,7 +51,7 @@ Using precompiled headers with GCC and derivatives -- Once you have a file to precompile, you can enable the use of pch for -a give target with a *pch* keyword argument. As an example, let's assume +a given target with a *pch* keyword argument. As an example, let's assume you want to build a small C binary with precompiled headers. Let's say the source files of the binary use the system headers `stdio.h` and `string.h`. 
Then you create a header file `pch/myexe_pch.h` with this diff --git a/docs/markdown/Reference-manual.md b/docs/markdown/Reference-manual.md index 963af9d..97d3e83 100644 --- a/docs/markdown/Reference-manual.md +++ b/docs/markdown/Reference-manual.md @@ -600,8 +600,12 @@ be passed to [shared and static libraries](#library). depends on such as a symbol visibility map. The purpose is to automatically trigger a re-link (but not a re-compile) of the target when this file changes. -- `link_language` since 0.51.0 makes the linker for this target - be for the specified language. This is helpful for multi-language targets. +- `link_language` since 0.51.0 (broken until 0.55.0) makes the linker for this + target be for the specified language. It is generally unnecessary to set + this, as meson will detect the right linker to use in most cases. There are + only two cases where this is needed: either the main function of an + executable is not in the language meson picked, or you want to force + a library to use only one ABI. - `link_whole` links all contents of the given static libraries whether they are used by not, equivalent to the `-Wl,--whole-archive` argument flag of GCC, available since 0.40.0. @@ -1662,11 +1666,14 @@ test(..., env: nomalloc, ...) before test is executed even if they have `build_by_default : false`. Since 0.46.0 -- `protocol` specifies how the test results are parsed and can be one - of `exitcode` (the executable's exit code is used by the test harness - to record the outcome of the test) or `tap` ([Test Anything - Protocol](https://www.testanything.org/)). For more on the Meson test - harness protocol read [Unit Tests](Unit-tests.md). Since 0.50.0 +- `protocol` *(Since 0.50.0)* specifies how the test results are parsed and can + be one of `exitcode`, `tap`, or `gtest`. For more information about the test + harness protocol, read [Unit Tests](Unit-tests.md). The following values are + accepted: + - `exitcode`: the executable's exit code is used by the test harness + to record the outcome of the test + - `tap` ([Test Anything Protocol](https://www.testanything.org/)) + - `gtest` *(Since 0.55.0)*: for Google Tests. - `priority` specifies the priority of a test. Tests with a higher priority are *started* before tests with a lower priority. @@ -1735,6 +1742,8 @@ the following methods. 0.49.0, the function only accepted a single argument. Since 0.54.0 the `MESON_SOURCE_ROOT` and `MESON_BUILD_ROOT` environment variables are set when dist scripts are run. + *(Since 0.55.0)* The output of `configure_file`, `files`, and `find_program` + as well as strings. - `add_install_script(script_name, arg1, arg2, ...)` causes the script given as an argument to be run during the install step, this script @@ -1742,6 +1751,9 @@ the following methods. `MESON_BUILD_ROOT`, `MESON_INSTALL_PREFIX`, `MESON_INSTALL_DESTDIR_PREFIX`, and `MESONINTROSPECT` set. All positional arguments are passed as parameters. + *(Since 0.55.0)* The output of `configure_file`, `files`, `find_program`, + `custom_target`, indexes of `custom_target`, `executable`, `library`, and + other built targets as well as strings. *(added 0.54)* If `meson install` is called with the `--quiet` option, the environment variable `MESON_INSTALL_QUIET` will be set. @@ -1772,6 +1784,8 @@ the following methods. executable given as an argument after all project files have been generated. This script will have the environment variables `MESON_SOURCE_ROOT` and `MESON_BUILD_ROOT` set.
+ *(Since 0.55.0)* The output of `configure_file`, `files`, and `find_program` + as well as strings. - `backend()` *(added 0.37.0)* returns a string representing the current backend: `ninja`, `vs2010`, `vs2015`, `vs2017`, `vs2019`, diff --git a/docs/markdown/Reference-tables.md b/docs/markdown/Reference-tables.md index dfae339..c42d608 100644 --- a/docs/markdown/Reference-tables.md +++ b/docs/markdown/Reference-tables.md @@ -81,6 +81,7 @@ set in the cross file. | alpha | DEC Alpha processor | | arc | 32 bit ARC processor | | arm | 32 bit ARM processor | +| avr | Atmel AVR processor | | e2k | MCST Elbrus processor | | c2000 | 32 bit C2000 processor | | ia64 | Itanium processor | diff --git a/docs/markdown/Syntax.md b/docs/markdown/Syntax.md index cf0516c..666d50e 100644 --- a/docs/markdown/Syntax.md +++ b/docs/markdown/Syntax.md @@ -588,3 +588,73 @@ FAQ](FAQ.md#why-is-meson-not-just-a-python-module-so-i-could-code-my-build-setup because of this limitation you find yourself copying and pasting code a lot you may be able to use a [`foreach` loop instead](#foreach-statements). + +Stability Promises +-- + +Meson is very actively developed and continuously improved. There is a +possibility that future enhancements to the Meson build system will require +changes to the syntax. Such changes might be the addition of new reserved +keywords, changing the meaning of existing keywords or additions around the +basic building blocks like statements and fundamental types. It is planned +to stabilize the syntax with the 1.0 release. + +Grammar +-- + +This is the full Meson grammar, as it is used to parse Meson build definition files: + +``` +additive_expression: multiplicative_expression | (additive_expression additive_operator multiplicative_expression) +additive_operator: "+" | "-" +argument_list: positional_arguments ["," keyword_arguments] | keyword_arguments +array_literal: "[" [expression_list] "]" +assignment_expression: conditional_expression | (logical_or_expression assignment_operator assignment_expression) +assignment_operator: "=" | "*=" | "/=" | "%=" | "+=" | "-=" +boolean_literal: "true" | "false" +build_definition: (NEWLINE | statement)* +condition: expression +conditional_expression: logical_or_expression | (logical_or_expression "?" 
expression ":" assignment_expression +decimal_literal: DECIMAL_NUMBER +DECIMAL_NUMBER: /[1-9][0-9]*/ +dictionary_literal: "{" [key_value_list] "}" +equality_expression: relational_expression | (equality_expression equality_operator relational_expression) +equality_operator: "==" | "!=" +expression: assignment_expression +expression_list: expression ("," expression)* +expression_statememt: expression +function_expression: id_expression "(" [argument_list] ")" +hex_literal: "0x" HEX_NUMBER +HEX_NUMBER: /[a-fA-F0-9]+/ +id_expression: IDENTIFIER +IDENTIFIER: /[a-zA-Z_][a-zA-Z_0-9]*/ +identifier_list: id_expression ("," id_expression)* +integer_literal: decimal_literal | octal_literal | hex_literal +iteration_statement: "foreach" identifier_list ":" id_expression NEWLINE (statement | jump_statement)* "endforeach" +jump_statement: ("break" | "continue") NEWLINE +key_value_item: expression ":" expression +key_value_list: key_value_item ("," key_value_item)* +keyword_item: id_expression ":" expression +keyword_arguments: keyword_item ("," keyword_item)* +literal: integer_literal | string_literal | boolean_literal | array_literal | dictionary_literal +logical_and_expression: equality_expression | (logical_and_expression "and" equality_expression) +logical_or_expression: logical_and_expression | (logical_or_expression "or" logical_and_expression) +method_expression: postfix_expression "." function_expression +multiplicative_expression: unary_expression | (multiplicative_expression multiplicative_operator unary_expression) +multiplicative_operator: "*" | "/" | "%" +octal_literal: "0o" OCTAL_NUMBER +OCTAL_NUMBER: /[0-7]+/ +positional_arguments: expression ("," expression)* +postfix_expression: primary_expression | subscript_expression | function_expression | method_expression +primary_expression: literal | ("(" expression ")") | id_expression +relational_expression: additive_expression | (relational_expression relational_operator additive_expression) +relational_operator: ">" | "<" | ">=" | "<=" | "in" | ("not" "in") +selection_statement: "if" condition NEWLINE (statement)* ("elif" condition NEWLINE (statement)*)* ["else" (statement)*] "endif" +statement: (expression_statement | selection_statement | iteration_statement) NEWLINE +string_literal: ("'" STRING_SIMPLE_VALUE "'") | ("'''" STRING_MULTILINE_VALUE "'''") +STRING_MULTILINE_VALUE: \.*?(''')\ +STRING_SIMPLE_VALUE: \.*?(?<!\\)(\\\\)*?'\ +subscript_expression: postfix_expression "[" expression "]" +unary_expression: postfix_expression | (unary_operator unary_expression) +unary_operator: "not" | "+" | "-" +``` diff --git a/docs/markdown/Unit-tests.md b/docs/markdown/Unit-tests.md index 0785549..bd91dbb 100644 --- a/docs/markdown/Unit-tests.md +++ b/docs/markdown/Unit-tests.md @@ -4,20 +4,24 @@ short-description: Meson's own unit-test system # Unit tests -Meson comes with a fully functional unit test system. To use it simply build an executable and then use it in a test. +Meson comes with a fully functional unit test system. To use it simply build +an executable and then use it in a test. ```meson e = executable('prog', 'testprog.c') test('name of test', e) ``` -You can add as many tests as you want. They are run with the command `ninja test`. +You can add as many tests as you want. They are run with the command `ninja +test`. -Meson captures the output of all tests and writes it in the log file `meson-logs/testlog.txt`. +Meson captures the output of all tests and writes it in the log file +`meson-logs/testlog.txt`. 
## Test parameters -Some tests require the use of command line arguments or environment variables. These are simple to define. +Some tests require the use of command line arguments or environment +variables. These are simple to define. ```meson test('command line test', exe, args : ['first', 'second']) @@ -29,38 +33,46 @@ Note how you need to specify multiple values as an array. ### MALLOC_PERTURB_ By default, environment variable -[`MALLOC_PERTURB_`](http://man7.org/linux/man-pages/man3/mallopt.3.html) -is set to a random value between 1..255. This can help find memory -leaks on configurations using glibc, including with non-GCC compilers. -This feature can be disabled as discussed in [test()](Reference-manual.md#test). +[`MALLOC_PERTURB_`](http://man7.org/linux/man-pages/man3/mallopt.3.html) is +set to a random value between 1..255. This can help find memory leaks on +configurations using glibc, including with non-GCC compilers. This feature +can be disabled as discussed in [test()](Reference-manual.md#test). ## Coverage If you enable coverage measurements by giving Meson the command line flag -`-Db_coverage=true`, you can generate coverage reports after running the tests -(running the tests is required to gather the list of functions that get -called). Meson will autodetect what coverage generator tools you have installed -and will generate the corresponding targets. These targets are `coverage-xml` -and `coverage-text` which are both provided by [Gcovr](http://gcovr.com) -(version 3.3 or higher) and `coverage-html`, which requires -[Lcov](https://ltp.sourceforge.io/coverage/lcov.php) and -[GenHTML](https://linux.die.net/man/1/genhtml) or -[Gcovr](http://gcovr.com). As a convenience, a high-level `coverage` target is -also generated which will produce all 3 coverage report types, if possible. - -The output of these commands is written to the log directory `meson-logs` in your build directory. +`-Db_coverage=true`, you can generate coverage reports after running the +tests (running the tests is required to gather the list of functions that get +called). Meson will autodetect what coverage generator tools you have +installed and will generate the corresponding targets. These targets are +`coverage-xml` and `coverage-text` which are both provided by +[Gcovr](http://gcovr.com) (version 3.3 or higher) and `coverage-html`, which +requires [Lcov](https://ltp.sourceforge.io/coverage/lcov.php) and +[GenHTML](https://linux.die.net/man/1/genhtml) or [Gcovr](http://gcovr.com). +As a convenience, a high-level `coverage` target is also generated which will +produce all 3 coverage report types, if possible. + +The output of these commands is written to the log directory `meson-logs` in +your build directory. ## Parallelism -To reduce test times, Meson will by default run multiple unit tests in parallel. It is common to have some tests which can not be run in parallel because they require unique hold on some resource such as a file or a D-Bus name. You have to specify these tests with a keyword argument. +To reduce test times, Meson will by default run multiple unit tests in +parallel. It is common to have some tests which can not be run in parallel +because they require unique hold on some resource such as a file or a D-Bus +name. You have to specify these tests with a keyword argument. ```meson test('unique test', t, is_parallel : false) ``` -Meson will then make sure that no other unit test is running at the same time. 
Non-parallel tests take longer to run so it is recommended that you write your unit tests to be parallel executable whenever possible. +Meson will then make sure that no other unit test is running at the same +time. Non-parallel tests take longer to run so it is recommended that you +write your unit tests to be parallel executable whenever possible. -By default Meson uses as many concurrent processes as there are cores on the test machine. You can override this with the environment variable `MESON_TESTTHREADS` like this. +By default Meson uses as many concurrent processes as there are cores on the +test machine. You can override this with the environment variable +`MESON_TESTTHREADS` like this. ```console $ MESON_TESTTHREADS=5 ninja test @@ -70,7 +82,10 @@ $ MESON_TESTTHREADS=5 ninja test *(added in version 0.52.0)* -Tests can be assigned a priority that determines when a test is *started*. Tests with higher priority are started first, tests with lower priority started later. The default priority is 0, meson makes no guarantee on the ordering of tests with identical priority. +Tests can be assigned a priority that determines when a test is *started*. +Tests with higher priority are started first, tests with lower priority +started later. The default priority is 0, meson makes no guarantee on the +ordering of tests with identical priority. ```meson test('started second', t, priority : 0) @@ -78,23 +93,37 @@ test('started third', t, priority : -50) test('started first', t, priority : 1000) ``` -Note that the test priority only affects the starting order of tests and subsequent tests are affected by how long it takes previous tests to complete. It is thus possible that a higher-priority test is still running when lower-priority tests with a shorter runtime have completed. +Note that the test priority only affects the starting order of tests and +subsequent tests are affected by how long it takes previous tests to +complete. It is thus possible that a higher-priority test is still running +when lower-priority tests with a shorter runtime have completed. ## Skipped tests and hard errors Sometimes a test can only determine at runtime that it can not be run. -For the default `exitcode` testing protocol, the GNU standard approach in this case is to exit the program with error code 77. Meson will detect this and report these tests as skipped rather than failed. This behavior was added in version 0.37.0. +For the default `exitcode` testing protocol, the GNU standard approach in +this case is to exit the program with error code 77. Meson will detect this +and report these tests as skipped rather than failed. This behavior was added +in version 0.37.0. -For TAP-based tests, skipped tests should print a single line starting with `1..0 # SKIP`. +For TAP-based tests, skipped tests should print a single line starting with +`1..0 # SKIP`. -In addition, sometimes a test fails set up so that it should fail even if it is marked as an expected failure. The GNU standard approach in this case is to exit the program with error code 99. Again, Meson will detect this and report these tests as `ERROR`, ignoring the setting of `should_fail`. This behavior was added in version 0.50.0. +In addition, sometimes a test fails set up so that it should fail even if it +is marked as an expected failure. The GNU standard approach in this case is +to exit the program with error code 99. Again, Meson will detect this and +report these tests as `ERROR`, ignoring the setting of `should_fail`. 
This +behavior was added in version 0.50.0. ## Testing tool -The goal of the meson test tool is to provide a simple way to run tests in a variety of different ways. The tool is designed to be run in the build directory. +The goal of the meson test tool is to provide a simple way to run tests in a +variety of different ways. The tool is designed to be run in the build +directory. -The simplest thing to do is just to run all tests, which is equivalent to running `ninja test`. +The simplest thing to do is just to run all tests, which is equivalent to +running `ninja test`. ```console $ meson test @@ -125,7 +154,8 @@ Tests belonging to a suite `suite` can be run as follows $ meson test --suite (sub)project_name:suite ``` -Since version *0.46*, `(sub)project_name` can be omitted if it is the top-level project. +Since version *0.46*, `(sub)project_name` can be omitted if it is the +top-level project. Multiple suites are specified like: @@ -145,7 +175,8 @@ Sometimes you need to run the tests multiple times, which is done like this: $ meson test --repeat=10 ``` -Invoking tests via a helper executable such as Valgrind can be done with the `--wrap` argument +Invoking tests via a helper executable such as Valgrind can be done with the +`--wrap` argument ```console $ meson test --wrap=valgrind testname @@ -163,17 +194,25 @@ Meson also supports running the tests under GDB. Just doing this: $ meson test --gdb testname ``` -Meson will launch `gdb` all set up to run the test. Just type `run` in the GDB command prompt to start the program. +Meson will launch `gdb` all set up to run the test. Just type `run` in the +GDB command prompt to start the program. -The second use case is a test that segfaults only rarely. In this case you can invoke the following command: +The second use case is a test that segfaults only rarely. In this case you +can invoke the following command: ```console $ meson test --gdb --repeat=10000 testname ``` -This runs the test up to 10 000 times under GDB automatically. If the program crashes, GDB will halt and the user can debug the application. Note that testing timeouts are disabled in this case so `meson test` will not kill `gdb` while the developer is still debugging it. The downside is that if the test binary freezes, the test runner will wait forever. +This runs the test up to 10 000 times under GDB automatically. If the program +crashes, GDB will halt and the user can debug the application. Note that +testing timeouts are disabled in this case so `meson test` will not kill +`gdb` while the developer is still debugging it. The downside is that if the +test binary freezes, the test runner will wait forever. -Sometimes, the GDB binary is not in the PATH variable or the user wants to use a GDB replacement. Therefore, the invoked GDB program can be specified *(added 0.52.0)*: +Sometimes, the GDB binary is not in the PATH variable or the user wants to +use a GDB replacement. Therefore, the invoked GDB program can be specified +*(added 0.52.0)*: ```console $ meson test --gdb --gdb-path /path/to/gdb testname @@ -183,12 +222,41 @@ $ meson test --gdb --gdb-path /path/to/gdb testname $ meson test --print-errorlogs ``` -Meson will report the output produced by the failing tests along with other useful information as the environmental variables. This is useful, for example, when you run the tests on Travis-CI, Jenkins and the like. +Meson will report the output produced by the failing tests along with other +useful information as the environmental variables. 
This is useful, for +example, when you run the tests on Travis-CI, Jenkins and the like. -For further information see the command line help of Meson by running `meson test -h`. +For further information see the command line help of Meson by running `meson +test -h`. ## Legacy notes -If `meson test` does not work for you, you likely have a old version of Meson. -In that case you should call `mesontest` instead. If `mesontest` doesn't work -either you have a very old version prior to 0.37.0 and should upgrade. +If `meson test` does not work for you, you likely have an old version of +Meson. In that case you should call `mesontest` instead. If `mesontest` +doesn't work either you have a very old version prior to 0.37.0 and should +upgrade. + +## Test outputs + +Meson will write several different files with detailed results of running +tests. These will be written into $builddir/meson-logs/. + +### testlog.json + +This is not a proper json file, but a file containing one valid json object +per line. This file is designed so each line is streamed out as each test +is run, so it can be read as a stream while the test harness is running. + +### testlog.junit.xml + +This is a valid JUnit XML description of all tests run. It is not streamed +out, and is written only once all tests complete running. + +When tests use the `tap` protocol, each test will be recorded as a testsuite +container, with each case named by the number of the result. + +When tests use the `gtest` protocol, meson will inject arguments to the test +to generate its own JUnit XML, which meson will include as part of this XML +file. + +*New in 0.55.0* diff --git a/docs/markdown/Users.md b/docs/markdown/Users.md index bfc8a7a..41d8dfa 100644 --- a/docs/markdown/Users.md +++ b/docs/markdown/Users.md @@ -124,6 +124,7 @@ format files - [Terminology](https://github.com/billiob/terminology), a terminal emulator based on the Enlightenment Foundation Libraries - [Tilix](https://github.com/gnunn1/tilix), a tiling terminal emulator for Linux using GTK+ 3 - [Tizonia](https://github.com/tizonia/tizonia-openmax-il), a command-line cloud music player for Linux with support for Spotify, Google Play Music, YouTube, SoundCloud, TuneIn, Plex servers and Chromecast devices + - [Vala Language Server](https://github.com/benwaffle/vala-language-server), code intelligence engine for the Vala and Genie programming languages - [Valum](https://github.com/valum-framework/valum), a micro web framework written in Vala - [Venom](https://github.com/naxuroqa/Venom), a modern Tox client for the GNU/Linux desktop - [VMAF](https://github.com/Netflix/vmaf) (by Netflix), a perceptual video quality assessment based on multi-method fusion diff --git a/docs/markdown/snippets/add_foo_script_type_additions.md b/docs/markdown/snippets/add_foo_script_type_additions.md new file mode 100644 index 0000000..88a88b2 --- /dev/null +++ b/docs/markdown/snippets/add_foo_script_type_additions.md @@ -0,0 +1,24 @@ +## meson.add_*_script methods accept new types + +All three (`add_install_script`, `add_dist_script`, and +`add_postconf_script`) now accept ExternalPrograms (as returned by +`find_program`), Files, and the output of `configure_file`. The dist and +postconf methods cannot accept other types because of when they are run. +While dist could, in theory, take other dependencies, it would require more +extensive changes, particularly to the backend.
+ +```meson +meson.add_install_script(find_program('foo'), files('bar')) +meson.add_dist_script(find_program('foo'), files('bar')) +meson.add_postconf_script(find_program('foo'), files('bar')) +``` + +The install script variant is also able to accept custom_targets, +custom_target indexes, and build targets (executables, libraries), and can +use built executables as the script to run. + +```meson +installer = executable('installer', ...) +meson.add_install_script(installer, ...) +meson.add_install_script('foo.py', installer) +``` diff --git a/docs/markdown/snippets/config_tool_no_cross_path.md b/docs/markdown/snippets/config_tool_no_cross_path.md new file mode 100644 index 0000000..cec22e4 --- /dev/null +++ b/docs/markdown/snippets/config_tool_no_cross_path.md @@ -0,0 +1,7 @@ +## Config tool based dependencies no longer search PATH for cross compiling + +Before 0.55.0 config tool based dependencies (llvm-config, cups-config, etc.) +would search the system $PATH if they weren't defined in the cross file. This has +been a source of bugs and has been deprecated. It is now removed; config tool +binaries must be specified in the cross file or the dependency will not +be found. diff --git a/docs/markdown/snippets/gtest_protocol.md b/docs/markdown/snippets/gtest_protocol.md new file mode 100644 index 0000000..14f3af9 --- /dev/null +++ b/docs/markdown/snippets/gtest_protocol.md @@ -0,0 +1,6 @@ +## Test protocol for gtest + +Due to the popularity of Gtest (Google Test) among C and C++ developers, meson +now supports a special protocol for gtest. With this protocol, meson injects +arguments to gtests to output JUnit, reads that JUnit, and adds the output to +the JUnit it generates. diff --git a/docs/markdown/snippets/keyval_kobject.md b/docs/markdown/snippets/keyval_kobject.md new file mode 100644 index 0000000..4add23c --- /dev/null +++ b/docs/markdown/snippets/keyval_kobject.md @@ -0,0 +1,6 @@ +## `unstable-kconfig` module renamed to `unstable-keyval` + +The `unstable-kconfig` module is now renamed to `unstable-keyval`. +We expect this module to become stable once it has some usage experience, +specifically in the next or the following release. + diff --git a/docs/markdown/snippets/link_language_all_targets.md b/docs/markdown/snippets/link_language_all_targets.md new file mode 100644 index 0000000..9019d50 --- /dev/null +++ b/docs/markdown/snippets/link_language_all_targets.md @@ -0,0 +1,8 @@ +## link_language argument added to all targets + +Previously the `link_language` argument was only supposed to be allowed in +executables, because the linker used needs to be the linker for the language +that implements the main function. Unfortunately it didn't work in that case, +and, even worse, if it had been implemented properly it would have worked for +*all* targets. In 0.55.0 this restriction has been removed, and the bug fixed. +It is now valid for `executable` and all derivatives of `library`.
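As a minimal sketch of the two use cases documented above, the following hypothetical `meson.build` fragment shows `link_language` on an executable and on a library; the project layout, target names, and source files are invented for illustration and are not part of this changeset.

```meson
# Hypothetical mixed-language targets; names and files are illustrative only.

# Case 1: the executable's main() is written in Fortran while most sources
# are C, so name the language whose compiler should drive the link step.
sim = executable('simulator', 'main.f90', 'grid.c', 'io.c',
  link_language : 'fortran')

# Case 2 (valid for libraries since 0.55.0): force the library to be linked
# by a single language's linker.
wrap = library('wrapper', 'api.c', 'impl.cpp',
  link_language : 'cpp')
```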
diff --git a/docs/sitemap.txt b/docs/sitemap.txt index 3ac138e..4029a60 100644 --- a/docs/sitemap.txt +++ b/docs/sitemap.txt @@ -48,7 +48,7 @@ index.md SourceSet-module.md Windows-module.md Cuda-module.md - Kconfig-module.md + Keyval-module.md Java.md Vala.md D.md diff --git a/docs/theme/extra/templates/navbar_links.html b/docs/theme/extra/templates/navbar_links.html index 6980f81..832bd2c 100644 --- a/docs/theme/extra/templates/navbar_links.html +++ b/docs/theme/extra/templates/navbar_links.html @@ -14,7 +14,7 @@ ("Hotdoc-module.html","Hotdoc"), \ ("i18n-module.html","i18n"), \ ("Icestorm-module.html","Icestorm"), \ - ("Kconfig-module.html","kconfig"), \ + ("Keyval-module.html","Keyval"), \ ("Pkgconfig-module.html","Pkgconfig"), \ ("Python-module.html","Python"), \ ("Python-3-module.html","Python 3"), \ diff --git a/mesonbuild/backend/backends.py b/mesonbuild/backend/backends.py index 31ddfb4..7f7c434 100644 --- a/mesonbuild/backend/backends.py +++ b/mesonbuild/backend/backends.py @@ -12,23 +12,54 @@ # See the License for the specific language governing permissions and # limitations under the License. -import os, pickle, re +from collections import OrderedDict +from functools import lru_cache +import enum +import json +import os +import pickle +import re +import shlex +import subprocess import textwrap +import typing as T + from .. import build from .. import dependencies from .. import mesonlib from .. import mlog -import json -import subprocess -from ..mesonlib import MachineChoice, MesonException, OrderedSet, OptionOverrideProxy -from ..mesonlib import classify_unity_sources, unholder -from ..mesonlib import File from ..compilers import CompilerArgs, VisualStudioLikeCompiler -from ..interpreter import Interpreter -from collections import OrderedDict -import shlex -from functools import lru_cache -import typing as T +from ..mesonlib import ( + File, MachineChoice, MesonException, OrderedSet, OptionOverrideProxy, + classify_unity_sources, unholder +) + +if T.TYPE_CHECKING: + from ..interpreter import Interpreter + + +class TestProtocol(enum.Enum): + + EXITCODE = 0 + TAP = 1 + GTEST = 2 + + @classmethod + def from_str(cls, string: str) -> 'TestProtocol': + if string == 'exitcode': + return cls.EXITCODE + elif string == 'tap': + return cls.TAP + elif string == 'gtest': + return cls.GTEST + raise MesonException('unknown test format {}'.format(string)) + + def __str__(self) -> str: + if self is self.EXITCODE: + return 'exitcode' + elif self is self.GTEST: + return 'gtest' + return 'tap' class CleanTrees: @@ -83,11 +114,11 @@ class ExecutableSerialisation: class TestSerialisation: def __init__(self, name: str, project: str, suite: str, fname: T.List[str], - is_cross_built: bool, exe_wrapper: T.Optional[build.Executable], + is_cross_built: bool, exe_wrapper: T.Optional[dependencies.ExternalProgram], needs_exe_wrapper: bool, is_parallel: bool, cmd_args: T.List[str], env: build.EnvironmentVariables, should_fail: bool, timeout: T.Optional[int], workdir: T.Optional[str], - extra_paths: T.List[str], protocol: str, priority: int): + extra_paths: T.List[str], protocol: TestProtocol, priority: int): self.name = name self.project_name = project self.suite = suite @@ -107,7 +138,7 @@ class TestSerialisation: self.priority = priority self.needs_exe_wrapper = needs_exe_wrapper -def get_backend_from_name(backend: str, build: T.Optional[build.Build] = None, interpreter: T.Optional[Interpreter] = None) -> T.Optional['Backend']: +def get_backend_from_name(backend: str, build: T.Optional[build.Build] = None, 
interpreter: T.Optional['Interpreter'] = None) -> T.Optional['Backend']: if backend == 'ninja': from . import ninjabackend return ninjabackend.NinjaBackend(build, interpreter) @@ -134,7 +165,7 @@ def get_backend_from_name(backend: str, build: T.Optional[build.Build] = None, i # This class contains the basic functionality that is needed by all backends. # Feel free to move stuff in and out of it as you see fit. class Backend: - def __init__(self, build: T.Optional[build.Build], interpreter: T.Optional[Interpreter]): + def __init__(self, build: T.Optional[build.Build], interpreter: T.Optional['Interpreter']): # Make it possible to construct a dummy backend # This is used for introspection without a build directory if build is None: @@ -196,7 +227,7 @@ class Backend: return os.path.join(self.get_target_dir(target), target.get_filename()) elif isinstance(target, (build.CustomTarget, build.CustomTargetIndex)): if not target.is_linkable_target(): - raise MesonException('Tried to link against custom target "%s", which is not linkable.' % target.name) + raise MesonException('Tried to link against custom target "{}", which is not linkable.'.format(target.name)) return os.path.join(self.get_target_dir(target), target.get_filename()) elif isinstance(target, build.Executable): if target.import_filename: @@ -282,7 +313,7 @@ class Backend: ofile = init_language_file(comp.get_default_suffix(), unity_file_number) unity_file_number += 1 files_in_current = 0 - ofile.write('#include<%s>\n' % src) + ofile.write('#include<{}>\n'.format(src)) files_in_current += 1 if ofile: ofile.close() @@ -537,14 +568,14 @@ class Backend: def create_msvc_pch_implementation(self, target, lang, pch_header): # We have to include the language in the file name, otherwise # pch.c and pch.cpp will both end up as pch.obj in VS backends. - impl_name = 'meson_pch-%s.%s' % (lang, lang) + impl_name = 'meson_pch-{}.{}'.format(lang, lang) pch_rel_to_build = os.path.join(self.get_target_private_dir(target), impl_name) # Make sure to prepend the build dir, since the working directory is # not defined. Otherwise, we might create the file in the wrong path. pch_file = os.path.join(self.build_dir, pch_rel_to_build) os.makedirs(os.path.dirname(pch_file), exist_ok=True) - content = '#include "%s"' % os.path.basename(pch_header) + content = '#include "{}"'.format(os.path.basename(pch_header)) pch_file_tmp = pch_file + '.tmp' with open(pch_file_tmp, 'w') as f: f.write(content) @@ -664,7 +695,7 @@ class Backend: args = [] for d in deps: if not (d.is_linkable_target()): - raise RuntimeError('Tried to link with a non-library target "%s".' % d.get_basename()) + raise RuntimeError('Tried to link with a non-library target "{}".'.format(d.get_basename())) arg = self.get_target_filename_for_linking(d) if not arg: continue @@ -737,7 +768,7 @@ class Backend: # E.g. an external verifier or simulator program run on a generated executable. # Can always be run without a wrapper. 
test_for_machine = MachineChoice.BUILD - is_cross = not self.environment.machines.matches_build_machine(test_for_machine) + is_cross = self.environment.is_cross_build(test_for_machine) if is_cross and self.environment.need_exe_wrapper(): exe_wrapper = self.environment.get_exe_wrapper() else: @@ -853,7 +884,7 @@ class Backend: m = regex.search(arg) while m is not None: index = int(m.group(1)) - src = '@OUTPUT%d@' % index + src = '@OUTPUT{}@'.format(index) arg = arg.replace(src, os.path.join(private_dir, output_list[index])) m = regex.search(arg) newargs.append(arg) diff --git a/mesonbuild/backend/ninjabackend.py b/mesonbuild/backend/ninjabackend.py index e765466..9b895c9 100644 --- a/mesonbuild/backend/ninjabackend.py +++ b/mesonbuild/backend/ninjabackend.py @@ -67,9 +67,9 @@ def ninja_quote(text, is_build_line=False): if '\n' in text: errmsg = '''Ninja does not support newlines in rules. The content was: -%s +{} -Please report this error with a test case to the Meson bug tracker.''' % text +Please report this error with a test case to the Meson bug tracker.'''.format(text) raise MesonException(errmsg) return text @@ -101,18 +101,18 @@ class NinjaRule: if not self.refcount: return - outfile.write('rule %s\n' % self.name) + outfile.write('rule {}\n'.format(self.name)) if self.rspable: - outfile.write(' command = %s @$out.rsp\n' % ' '.join(self.command)) + outfile.write(' command = {} @$out.rsp\n'.format(' '.join(self.command))) outfile.write(' rspfile = $out.rsp\n') - outfile.write(' rspfile_content = %s\n' % ' '.join(self.args)) + outfile.write(' rspfile_content = {}\n'.format(' '.join(self.args))) else: - outfile.write(' command = %s\n' % ' '.join(self.command + self.args)) + outfile.write(' command = {}\n'.format(' '.join(self.command + self.args))) if self.deps: - outfile.write(' deps = %s\n' % self.deps) + outfile.write(' deps = {}\n'.format(self.deps)) if self.depfile: - outfile.write(' depfile = %s\n' % self.depfile) - outfile.write(' description = %s\n' % self.description) + outfile.write(' depfile = {}\n'.format(self.depfile)) + outfile.write(' description = {}\n'.format(self.description)) if self.extra: for l in self.extra.split('\n'): outfile.write(' ') @@ -185,7 +185,7 @@ class NinjaBuildElement: for e in self.elems: (name, elems) = e should_quote = name not in raw_names - line = ' %s = ' % name + line = ' {} = '.format(name) newelems = [] for i in elems: if not should_quote or i == '&&': # Hackety hack hack @@ -204,7 +204,7 @@ class NinjaBuildElement: def check_outputs(self): for n in self.outfilenames: if n in self.all_outputs: - raise MesonException('Multiple producers for Ninja target "%s". Please rename your targets.' % n) + raise MesonException('Multiple producers for Ninja target "{}". 
Please rename your targets.'.format(n)) self.all_outputs[n] = True class NinjaBackend(backends.Backend): @@ -299,8 +299,7 @@ int dummy; outfilename = os.path.join(self.environment.get_build_dir(), self.ninja_filename) tempfilename = outfilename + '~' with open(tempfilename, 'w', encoding='utf-8') as outfile: - outfile.write('# This is the build file for project "%s"\n' % - self.build.get_project()) + outfile.write('# This is the build file for project "{}"\n'.format(self.build.get_project())) outfile.write('# It is autogenerated by the Meson build system.\n') outfile.write('# Do not edit by hand.\n\n') outfile.write('ninja_required_version = 1.7.1\n\n') @@ -308,9 +307,9 @@ int dummy; num_pools = self.environment.coredata.backend_options['backend_max_links'].value if num_pools > 0: outfile.write('''pool link_pool - depth = %d + depth = {} -''' % num_pools) +'''.format(num_pools)) with self.detect_vs_dep_prefix(tempfilename) as outfile: self.generate_rules() @@ -765,7 +764,7 @@ int dummy; target_name = 'meson-{}'.format(self.build_run_target_name(target)) elem = NinjaBuildElement(self.all_outputs, target_name, 'CUSTOM_COMMAND', []) elem.add_item('COMMAND', cmd) - elem.add_item('description', 'Running external command %s' % target.name) + elem.add_item('description', 'Running external command {}'.format(target.name)) elem.add_item('pool', 'console') # Alias that runs the target defined above with the name the user specified self.create_target_alias(target_name) @@ -980,12 +979,12 @@ int dummy; ofilename = os.path.join(self.get_target_private_dir(target), ofilebase) elem = NinjaBuildElement(self.all_outputs, ofilename, "CUSTOM_COMMAND", rel_sourcefile) elem.add_item('COMMAND', ['resgen', rel_sourcefile, ofilename]) - elem.add_item('DESC', 'Compiling resource %s' % rel_sourcefile) + elem.add_item('DESC', 'Compiling resource {}'.format(rel_sourcefile)) self.add_build(elem) deps.append(ofilename) a = '-resource:' + ofilename else: - raise InvalidArguments('Unknown resource file %s.' % r) + raise InvalidArguments('Unknown resource file {}.'.format(r)) args.append(a) return args, deps @@ -1278,7 +1277,7 @@ int dummy; main_rust_file = None for i in target.get_sources(): if not rustc.can_compile(i): - raise InvalidArguments('Rust target %s contains a non-rust source file.' % target.get_basename()) + raise InvalidArguments('Rust target {} contains a non-rust source file.'.format(target.get_basename())) if main_rust_file is None: main_rust_file = i.rel_to_builddir(self.build_to_src) if main_rust_file is None: @@ -1377,11 +1376,11 @@ int dummy; @classmethod def get_compiler_rule_name(cls, lang: str, for_machine: MachineChoice) -> str: - return '%s_COMPILER%s' % (lang, cls.get_rule_suffix(for_machine)) + return '{}_COMPILER{}'.format(lang, cls.get_rule_suffix(for_machine)) @classmethod def get_pch_rule_name(cls, lang: str, for_machine: MachineChoice) -> str: - return '%s_PCH%s' % (lang, cls.get_rule_suffix(for_machine)) + return '{}_PCH{}'.format(lang, cls.get_rule_suffix(for_machine)) @classmethod def compiler_to_rule_name(cls, compiler: Compiler) -> str: @@ -1453,7 +1452,7 @@ int dummy; abs_headers.append(absh) header_imports += swiftc.get_header_import_args(absh) else: - raise InvalidArguments('Swift target %s contains a non-swift source file.' 
% target.get_basename()) + raise InvalidArguments('Swift target {} contains a non-swift source file.'.format(target.get_basename())) os.makedirs(self.get_target_private_dir_abs(target), exist_ok=True) compile_args = swiftc.get_compile_only_args() compile_args += swiftc.get_optimization_args(self.get_option_for_target('optimization', target)) @@ -1540,7 +1539,7 @@ int dummy; static_linker = self.build.static_linker[for_machine] if static_linker is None: return - rule = 'STATIC_LINKER%s' % self.get_rule_suffix(for_machine) + rule = 'STATIC_LINKER{}'.format(self.get_rule_suffix(for_machine)) cmdlist = [] args = ['$in'] # FIXME: Must normalize file names with pathlib.Path before writing @@ -1574,7 +1573,7 @@ int dummy; or langname == 'rust' \ or langname == 'cs': continue - rule = '%s_LINKER%s' % (langname, self.get_rule_suffix(for_machine)) + rule = '{}_LINKER{}'.format(langname, self.get_rule_suffix(for_machine)) command = compiler.get_linker_exelist() args = ['$ARGS'] + compiler.get_linker_output_args('$out') + ['$in', '$LINK_ARGS'] description = 'Linking target $out' @@ -1645,7 +1644,7 @@ int dummy; self.add_rule(NinjaRule(rule, command, [], description)) def generate_fortran_dep_hack(self, crstr): - rule = 'FORTRAN_DEP_HACK%s' % (crstr) + rule = 'FORTRAN_DEP_HACK{}'.format(crstr) if mesonlib.is_windows(): cmd = ['cmd', '/C'] else: @@ -1698,7 +1697,7 @@ https://gcc.gnu.org/bugzilla/show_bug.cgi?id=47485''')) command = [ninja_quote(i) for i in compiler.get_exelist()] args = ['$ARGS'] + quoted_depargs + compiler.get_output_args('$out') + compiler.get_compile_only_args() + ['$in'] - description = 'Compiling %s object $out' % compiler.get_display_language() + description = 'Compiling {} object $out'.format(compiler.get_display_language()) if isinstance(compiler, VisualStudioLikeCompiler): deps = 'msvc' depfile = None @@ -1859,9 +1858,8 @@ https://gcc.gnu.org/bugzilla/show_bug.cgi?id=47485''')) modname = modmatch.group(1).lower() if modname in module_files: raise InvalidArguments( - 'Namespace collision: module %s defined in ' - 'two files %s and %s.' % - (modname, module_files[modname], s)) + 'Namespace collision: module {} defined in ' + 'two files {} and {}.'.format(modname, module_files[modname], s)) module_files[modname] = s else: submodmatch = submodre.match(line) @@ -1872,9 +1870,8 @@ https://gcc.gnu.org/bugzilla/show_bug.cgi?id=47485''')) if submodname in submodule_files: raise InvalidArguments( - 'Namespace collision: submodule %s defined in ' - 'two files %s and %s.' 
% - (submodname, submodule_files[submodname], s)) + 'Namespace collision: submodule {} defined in ' + 'two files {} and {}.'.format(submodname, submodule_files[submodname], s)) submodule_files[submodname] = s self.fortran_deps[target.get_basename()] = {**module_files, **submodule_files} diff --git a/mesonbuild/backend/vs2010backend.py b/mesonbuild/backend/vs2010backend.py index 80ff910..b5803bf 100644 --- a/mesonbuild/backend/vs2010backend.py +++ b/mesonbuild/backend/vs2010backend.py @@ -1192,7 +1192,8 @@ class Vs2010Backend(backends.Backend): # /nologo ET.SubElement(link, 'SuppressStartupBanner').text = 'true' # /release - ET.SubElement(link, 'SetChecksum').text = 'true' + if not self.environment.coredata.get_builtin_option('debug'): + ET.SubElement(link, 'SetChecksum').text = 'true' meson_file_group = ET.SubElement(root, 'ItemGroup') ET.SubElement(meson_file_group, 'None', Include=os.path.join(proj_to_src_dir, build_filename)) diff --git a/mesonbuild/build.py b/mesonbuild/build.py index 98930b3..fdfca73 100644 --- a/mesonbuild/build.py +++ b/mesonbuild/build.py @@ -12,12 +12,14 @@ # See the License for the specific language governing permissions and # limitations under the License. -import copy, os, re from collections import OrderedDict, defaultdict -import itertools, pathlib +from functools import lru_cache +import copy import hashlib +import itertools, pathlib +import os import pickle -from functools import lru_cache +import re import typing as T from . import environment @@ -82,6 +84,7 @@ buildtarget_kwargs = set([ 'override_options', 'sources', 'gnu_symbol_visibility', + 'link_language', ]) known_build_target_kwargs = ( @@ -92,7 +95,7 @@ known_build_target_kwargs = ( rust_kwargs | cs_kwargs) -known_exe_kwargs = known_build_target_kwargs | {'implib', 'export_dynamic', 'link_language', 'pie'} +known_exe_kwargs = known_build_target_kwargs | {'implib', 'export_dynamic', 'pie'} known_shlib_kwargs = known_build_target_kwargs | {'version', 'soversion', 'vs_module_defs', 'darwin_versions'} known_shmod_kwargs = known_build_target_kwargs | {'vs_module_defs'} known_stlib_kwargs = known_build_target_kwargs | {'pic'} @@ -494,6 +497,7 @@ class BuildTarget(Target): self.link_targets = [] self.link_whole_targets = [] self.link_depends = [] + self.added_deps = set() self.name_prefix_set = False self.name_suffix_set = False self.filename = 'no_name' @@ -729,7 +733,7 @@ class BuildTarget(Target): File.from_source_file(environment.source_dir, self.subdir, s)) elif hasattr(s, 'get_outputs'): self.link_depends.extend( - [File.from_built_file(s.subdir, p) for p in s.get_outputs()]) + [File.from_built_file(s.get_subdir(), p) for p in s.get_outputs()]) else: raise InvalidArguments( 'Link_depends arguments must be strings, Files, ' @@ -1053,6 +1057,8 @@ This will become a hard error in a future Meson release.''') def add_deps(self, deps): deps = listify(deps) for dep in unholder(deps): + if dep in self.added_deps: + continue if isinstance(dep, dependencies.InternalDependency): # Those parts that are internal. 
self.process_sourcelist(dep.sources) @@ -1091,6 +1097,7 @@ You probably should put it in link_with instead.''') 'either an external dependency (returned by find_library() or ' 'dependency()) or an internal dependency (returned by ' 'declare_dependency()).'.format(type(dep).__name__)) + self.added_deps.add(dep) def get_external_deps(self): return self.external_deps @@ -1220,11 +1227,7 @@ You probably should put it in link_with instead.''') See: https://github.com/mesonbuild/meson/issues/1653 ''' - langs = [] - - # User specified link_language of target (for multi-language targets) - if self.link_language: - return [self.link_language] + langs = [] # type: T.List[str] # Check if any of the external libraries were written in this language for dep in self.external_deps: @@ -1256,6 +1259,12 @@ You probably should put it in link_with instead.''') # Populate list of all compilers, not just those being used to compile # sources in this target all_compilers = self.environment.coredata.compilers[self.for_machine] + + # If the user set the link_language, just return that. + if self.link_language: + comp = all_compilers[self.link_language] + return comp, comp.language_stdlib_only_link_flags() + # Languages used by dependencies dep_langs = self.get_langs_used_by_deps() # Pick a compiler based on the language priority-order diff --git a/mesonbuild/cmake/executor.py b/mesonbuild/cmake/executor.py index 66713a1..adc028c 100644 --- a/mesonbuild/cmake/executor.py +++ b/mesonbuild/cmake/executor.py @@ -132,7 +132,7 @@ class CMakeExecutor: msg += '\n\nOn Unix-like systems this is often caused by scripts that are not executable.' mlog.warning(msg) return None - cmvers = re.sub(r'\s*(cmake|cmake3) version\s*', '', out.split('\n')[0]).strip() + cmvers = re.search(r'(cmake|cmake3)\s*version\s*([\d.]+)', out).group(2) return cmvers def set_exec_mode(self, print_cmout: T.Optional[bool] = None, always_capture_stderr: T.Optional[bool] = None) -> None: diff --git a/mesonbuild/cmake/interpreter.py b/mesonbuild/cmake/interpreter.py index 125f18b..0a452d1 100644 --- a/mesonbuild/cmake/interpreter.py +++ b/mesonbuild/cmake/interpreter.py @@ -289,7 +289,15 @@ class ConverterTarget: for j in self.compile_opts[i]: m = ConverterTarget.std_regex.match(j) if m: - self.override_options += ['{}_std={}'.format(i, m.group(2))] + std = m.group(2) + if std not in self._all_lang_stds(i): + mlog.warning( + 'Unknown {}_std "{}" -> Ingoring. 
Try setting the project' + 'level {}_std if build errors occur.'.format(i, std), + once=True + ) + continue + self.override_options += ['{}_std={}'.format(i, std)] elif j in ['-fPIC', '-fpic', '-fPIE', '-fpie']: self.pie = True elif j in blacklist_compiler_flags: @@ -539,6 +547,13 @@ class ConverterTarget: suffixes += [x for x in exts] return suffixes + @lru_cache(maxsize=None) + def _all_lang_stds(self, lang: str) -> T.List[str]: + lang_opts = self.env.coredata.compiler_options.build.get(lang, None) + if not lang_opts or 'std' not in lang_opts: + return [] + return lang_opts['std'].choices + def process_inter_target_dependencies(self): # Move the dependencies from all transfer_dependencies_from to the target to_process = list(self.depends) diff --git a/mesonbuild/compilers/compilers.py b/mesonbuild/compilers/compilers.py index 3d3a503..385ef5e 100644 --- a/mesonbuild/compilers/compilers.py +++ b/mesonbuild/compilers/compilers.py @@ -1136,7 +1136,7 @@ class Compiler: def remove_linkerlike_args(self, args): rm_exact = ('-headerpad_max_install_names',) rm_prefixes = ('-Wl,', '-L',) - rm_next = ('-L',) + rm_next = ('-L', '-framework',) ret = [] iargs = iter(args) for arg in iargs: diff --git a/mesonbuild/compilers/mixins/clike.py b/mesonbuild/compilers/mixins/clike.py index 124c49c..e7b0cd2 100644 --- a/mesonbuild/compilers/mixins/clike.py +++ b/mesonbuild/compilers/mixins/clike.py @@ -369,7 +369,8 @@ class CLikeCompiler: dependencies=dependencies, mode='link', disable_cache=disable_cache) def run(self, code: str, env, *, extra_args=None, dependencies=None): - if self.is_cross and self.exe_wrapper is None: + need_exe_wrapper = env.need_exe_wrapper(self.for_machine) + if need_exe_wrapper and self.exe_wrapper is None: raise compilers.CrossNoRunException('Can not run test applications in this cross environment.') with self._build_wrapper(code, env, extra_args, dependencies, mode='link', want_output=True) as p: if p.returncode != 0: @@ -377,7 +378,7 @@ class CLikeCompiler: p.input_name, p.returncode)) return compilers.RunResult(False) - if self.is_cross: + if need_exe_wrapper: cmdlist = self.exe_wrapper + [p.output_name] else: cmdlist = p.output_name @@ -658,7 +659,7 @@ class CLikeCompiler: # is not run so we don't care what the return value is. main = '''\nint main(void) {{ void *a = (void*) &{func}; - long b = (long) a; + long long b = (long long) a; return (int) b; }}''' return head, main @@ -727,24 +728,29 @@ class CLikeCompiler: # need to look for them differently. On nice compilers like clang, we # can just directly use the __has_builtin() macro. fargs['no_includes'] = '#include' not in prefix - fargs['__builtin_'] = '' if funcname.startswith('__builtin_') else '__builtin_' + is_builtin = funcname.startswith('__builtin_') + fargs['is_builtin'] = is_builtin + fargs['__builtin_'] = '' if is_builtin else '__builtin_' t = '''{prefix} int main(void) {{ + + /* With some toolchains (MSYS2/mingw for example) the compiler + * provides various builtins which are not really implemented and + * fall back to the stdlib where they aren't provided and fail at + * build/link time. In case the user provides a header, including + * the header didn't lead to the function being defined, and the + * function we are checking isn't a builtin itself we assume the + * builtin is not functional and we just error out. 
*/ + #if !{no_includes:d} && !defined({func}) && !{is_builtin:d} + #error "No definition for {__builtin_}{func} found in the prefix" + #endif + #ifdef __has_builtin #if !__has_builtin({__builtin_}{func}) #error "{__builtin_}{func} not found" #endif #elif ! defined({func}) - /* Check for {__builtin_}{func} only if no includes were added to the - * prefix above, which means no definition of {func} can be found. - * We would always check for this, but we get false positives on - * MSYS2 if we do. Their toolchain is broken, but we can at least - * give them a workaround. */ - #if {no_includes:d} - {__builtin_}{func}; - #else - #error "No definition for {__builtin_}{func} found in the prefix" - #endif + {__builtin_}{func}; #endif return 0; }}''' diff --git a/mesonbuild/coredata.py b/mesonbuild/coredata.py index c4fe8db..754be1d 100644 --- a/mesonbuild/coredata.py +++ b/mesonbuild/coredata.py @@ -99,16 +99,16 @@ class UserBooleanOption(UserOption[bool]): class UserIntegerOption(UserOption[int]): def __init__(self, description, value, yielding=None): min_value, max_value, default_value = value - super().__init__(description, [True, False], yielding) self.min_value = min_value self.max_value = max_value - self.set_value(default_value) c = [] if min_value is not None: c.append('>=' + str(min_value)) if max_value is not None: c.append('<=' + str(max_value)) - self.choices = ', '.join(c) + choices = ', '.join(c) + super().__init__(description, choices, yielding) + self.set_value(default_value) def validate_value(self, value) -> int: if isinstance(value, str): @@ -385,6 +385,7 @@ class CoreData: # Only to print a warning if it changes between Meson invocations. self.config_files = self.__load_config_files(options, scratch_dir, 'native') self.init_builtins('') + self.libdir_cross_fixup() @staticmethod def __load_config_files(options: argparse.Namespace, scratch_dir: str, ftype: str) -> T.List[str]: @@ -510,7 +511,6 @@ class CoreData: for for_machine in iter(MachineChoice): for key, opt in builtin_options_per_machine.items(): self.add_builtin_option(self.builtins_per_machine[for_machine], key, opt, subproject) - self.libdir_cross_fixup() def add_builtin_option(self, opts_map, key, opt, subproject): if subproject: @@ -682,7 +682,9 @@ class CoreData: if type(oldval) != type(value): self.user_options[name] = value - def is_cross_build(self) -> bool: + def is_cross_build(self, when_building_for: MachineChoice = MachineChoice.HOST) -> bool: + if when_building_for == MachineChoice.BUILD: + return False return len(self.cross_files) > 0 def strip_build_option_names(self, options): diff --git a/mesonbuild/dependencies/base.py b/mesonbuild/dependencies/base.py index c0ec089..bcb1531 100644 --- a/mesonbuild/dependencies/base.py +++ b/mesonbuild/dependencies/base.py @@ -437,28 +437,11 @@ class ConfigToolDependency(ExternalDependency): """ if not isinstance(versions, list) and versions is not None: versions = listify(versions) - - tool = self.env.lookup_binary_entry(self.for_machine, self.tool_name) - if tool is not None: - tools = [tool] - else: - if not self.env.machines.matches_build_machine(self.for_machine): - mlog.deprecation('No entry for {0} specified in your cross file. ' - 'Falling back to searching PATH. This may find a ' - 'native version of {0}! This will become a hard ' - 'error in a future version of meson'.format(self.tool_name)) - tools = [[t] for t in self.tools] - best_match = (None, None) - for tool in tools: - if len(tool) == 1: - # In some situations the command can't be directly executed. 
- # For example Shell scripts need to be called through sh on - # Windows (see issue #1423). - potential_bin = ExternalProgram(tool[0], silent=True) - if not potential_bin.found(): - continue - tool = potential_bin.get_command() + for potential_bin in self.search_tool(self.tool_name, self.tool_name, self.tools): + if not potential_bin.found(): + continue + tool = potential_bin.get_command() try: p, out = Popen_safe(tool + [self.version_arg])[:2] except (FileNotFoundError, PermissionError): @@ -1800,6 +1783,10 @@ class ExternalProgram: self.name = name if command is not None: self.command = listify(command) + if mesonlib.is_windows(): + cmd = self.command[0] + args = self.command[1:] + self.command = self._search_windows_special_cases(name, cmd) + args else: all_search_dirs = [search_dir] if extra_search_dirs: diff --git a/mesonbuild/dependencies/boost.py b/mesonbuild/dependencies/boost.py index 13054f5..3262d8b 100644 --- a/mesonbuild/dependencies/boost.py +++ b/mesonbuild/dependencies/boost.py @@ -344,6 +344,7 @@ class BoostDependency(ExternalDependency): self.multithreading = kwargs.get('threading', 'multi') == 'multi' self.boost_root = None + self.explicit_static = 'static' in kwargs # Extract and validate modules self.modules = mesonlib.extract_as_list(kwargs, 'modules') # type: T.List[str] @@ -522,7 +523,7 @@ class BoostDependency(ExternalDependency): except (KeyError, IndexError, AttributeError): pass - libs = [x for x in libs if x.static == self.static] + libs = [x for x in libs if x.static == self.static or not self.explicit_static] libs = [x for x in libs if x.mt == self.multithreading] libs = [x for x in libs if x.version_matches(lib_vers)] libs = [x for x in libs if x.arch_matches(self.arch)] @@ -637,11 +638,8 @@ class BoostDependency(ExternalDependency): return BoostIncludeDir(hfile.parents[1], int(m.group(1))) def _extra_compile_args(self) -> T.List[str]: - args = [] # type: T.List[str] - args += ['-DBOOST_ALL_NO_LIB'] # Disable automatic linking - if not self.static: - args += ['-DBOOST_ALL_DYN_LINK'] - return args + # BOOST_ALL_DYN_LINK should not be required with the known defines below + return ['-DBOOST_ALL_NO_LIB'] # Disable automatic linking # See https://www.boost.org/doc/libs/1_72_0/more/getting_started/unix-variants.html#library-naming @@ -665,9 +663,9 @@ boost_arch_map = { #### ---- BEGIN GENERATED ---- #### # # # Generated with tools/boost_names.py: -# - boost version: 1.72.0 -# - modules found: 158 -# - libraries found: 42 +# - boost version: 1.73.0 +# - modules found: 159 +# - libraries found: 43 # class BoostLibrary(): @@ -690,16 +688,16 @@ class BoostModule(): boost_libraries = { 'boost_atomic': BoostLibrary( name='boost_atomic', - shared=[], - static=[], + shared=['-DBOOST_ATOMIC_DYN_LINK=1'], + static=['-DBOOST_ATOMIC_STATIC_LINK=1'], single=[], multi=[], ), 'boost_chrono': BoostLibrary( name='boost_chrono', - shared=['-DBOOST_ALL_DYN_LINK=1'], - static=['-DBOOST_All_STATIC_LINK=1'], - single=[], + shared=['-DBOOST_CHRONO_DYN_LINK=1'], + static=['-DBOOST_CHRONO_STATIC_LINK=1'], + single=['-DBOOST_CHRONO_THREAD_DISABLED'], multi=[], ), 'boost_container': BoostLibrary( @@ -711,28 +709,28 @@ boost_libraries = { ), 'boost_context': BoostLibrary( name='boost_context', - shared=[], + shared=['-DBOOST_CONTEXT_DYN_LINK=1'], static=[], single=[], multi=[], ), 'boost_contract': BoostLibrary( name='boost_contract', - shared=[], - static=[], - single=[], + shared=['-DBOOST_CONTRACT_DYN_LINK'], + static=['-DBOOST_CONTRACT_STATIC_LINK'], + 
single=['-DBOOST_CONTRACT_DISABLE_THREADS'], multi=[], ), 'boost_coroutine': BoostLibrary( name='boost_coroutine', - shared=[], + shared=['-DBOOST_COROUTINES_DYN_LINK=1'], static=[], single=[], multi=[], ), 'boost_date_time': BoostLibrary( name='boost_date_time', - shared=[], + shared=['-DBOOST_DATE_TIME_DYN_LINK=1'], static=[], single=[], multi=[], @@ -746,14 +744,14 @@ boost_libraries = { ), 'boost_fiber': BoostLibrary( name='boost_fiber', - shared=[], + shared=['-DBOOST_FIBERS_DYN_LINK=1'], static=[], single=[], multi=[], ), 'boost_fiber_numa': BoostLibrary( name='boost_fiber_numa', - shared=[], + shared=['-DBOOST_FIBERS_DYN_LINK=1'], static=[], single=[], multi=[], @@ -767,84 +765,91 @@ boost_libraries = { ), 'boost_graph': BoostLibrary( name='boost_graph', - shared=['-DBOOST_GRAPH_DYN_LINK=1'], + shared=[], static=[], single=[], multi=[], ), 'boost_iostreams': BoostLibrary( name='boost_iostreams', - shared=['-DBOOST_IOSTREAMS_DYN_LINK=1', '-DBOOST_IOSTREAMS_DYN_LINK=1'], + shared=['-DBOOST_IOSTREAMS_DYN_LINK=1'], static=[], single=[], multi=[], ), 'boost_locale': BoostLibrary( name='boost_locale', - shared=['-DBOOST_LOCALE_DYN_LINK=1'], + shared=[], static=[], single=[], multi=[], ), 'boost_log': BoostLibrary( name='boost_log', - shared=['-DBOOST_LOG_DLL', '-DBOOST_LOG_DYN_LINK=1'], + shared=['-DBOOST_LOG_DYN_LINK=1'], static=[], - single=['BOOST_LOG_NO_THREADS'], + single=['-DBOOST_LOG_NO_THREADS'], multi=[], ), 'boost_log_setup': BoostLibrary( name='boost_log_setup', - shared=['-DBOOST_LOG_DYN_LINK=1', '-DBOOST_LOG_SETUP_DLL', '-DBOOST_LOG_SETUP_DYN_LINK=1'], + shared=['-DBOOST_LOG_SETUP_DYN_LINK=1'], static=[], - single=['BOOST_LOG_NO_THREADS'], + single=['-DBOOST_LOG_NO_THREADS'], multi=[], ), 'boost_math_c99': BoostLibrary( name='boost_math_c99', - shared=['-DBOOST_MATH_TR1_DYN_LINK=1'], + shared=[], static=[], single=[], multi=[], ), 'boost_math_c99f': BoostLibrary( name='boost_math_c99f', - shared=['-DBOOST_MATH_TR1_DYN_LINK=1'], + shared=[], static=[], single=[], multi=[], ), 'boost_math_c99l': BoostLibrary( name='boost_math_c99l', - shared=['-DBOOST_MATH_TR1_DYN_LINK=1'], + shared=[], static=[], single=[], multi=[], ), 'boost_math_tr1': BoostLibrary( name='boost_math_tr1', - shared=['-DBOOST_MATH_TR1_DYN_LINK=1'], + shared=[], static=[], single=[], multi=[], ), 'boost_math_tr1f': BoostLibrary( name='boost_math_tr1f', - shared=['-DBOOST_MATH_TR1_DYN_LINK=1'], + shared=[], static=[], single=[], multi=[], ), 'boost_math_tr1l': BoostLibrary( name='boost_math_tr1l', - shared=['-DBOOST_MATH_TR1_DYN_LINK=1'], + shared=[], static=[], single=[], multi=[], ), 'boost_mpi': BoostLibrary( name='boost_mpi', - shared=['-DBOOST_MPI_DYN_LINK=1'], + shared=[], + static=[], + single=[], + multi=[], + ), + 'boost_nowide': BoostLibrary( + name='boost_nowide', + shared=['-DBOOST_NOWIDE_DYN_LINK=1'], static=[], single=[], multi=[], @@ -865,63 +870,63 @@ boost_libraries = { ), 'boost_random': BoostLibrary( name='boost_random', - shared=[], + shared=['-DBOOST_RANDOM_DYN_LINK'], static=[], single=[], multi=[], ), 'boost_regex': BoostLibrary( name='boost_regex', - shared=['-DBOOST_REGEX_DYN_LINK=1'], + shared=[], static=[], single=[], multi=[], ), 'boost_serialization': BoostLibrary( name='boost_serialization', - shared=['-DBOOST_SERIALIZATION_DYN_LINK=1'], + shared=[], static=[], single=[], multi=[], ), 'boost_stacktrace_addr2line': BoostLibrary( name='boost_stacktrace_addr2line', - shared=['-DBOOST_STACKTRACE_DYN_LINK=1'], + shared=[], static=[], single=[], multi=[], ), 'boost_stacktrace_backtrace': 
BoostLibrary( name='boost_stacktrace_backtrace', - shared=['-DBOOST_STACKTRACE_DYN_LINK=1'], + shared=[], static=[], single=[], multi=[], ), 'boost_stacktrace_basic': BoostLibrary( name='boost_stacktrace_basic', - shared=['-DBOOST_STACKTRACE_DYN_LINK=1'], + shared=[], static=[], single=[], multi=[], ), 'boost_stacktrace_noop': BoostLibrary( name='boost_stacktrace_noop', - shared=['-DBOOST_STACKTRACE_DYN_LINK=1'], + shared=[], static=[], single=[], multi=[], ), 'boost_stacktrace_windbg': BoostLibrary( name='boost_stacktrace_windbg', - shared=['-DBOOST_STACKTRACE_DYN_LINK=1'], + shared=[], static=[], single=[], multi=[], ), 'boost_stacktrace_windbg_cached': BoostLibrary( name='boost_stacktrace_windbg_cached', - shared=['-DBOOST_STACKTRACE_DYN_LINK=1'], + shared=[], static=[], single=[], multi=[], @@ -942,8 +947,8 @@ boost_libraries = { ), 'boost_thread': BoostLibrary( name='boost_thread', - shared=['-DBOOST_THREAD_USE_DLL=1'], - static=['-DBOOST_THREAD_USE_LIB=1'], + shared=['-DBOOST_THREAD_BUILD_DLL=1', '-DBOOST_THREAD_USE_DLL=1'], + static=['-DBOOST_THREAD_BUILD_LIB=1', '-DBOOST_THREAD_USE_LIB=1'], single=[], multi=[], ), @@ -956,7 +961,7 @@ boost_libraries = { ), 'boost_type_erasure': BoostLibrary( name='boost_type_erasure', - shared=[], + shared=['-DBOOST_TYPE_ERASURE_DYN_LINK'], static=[], single=[], multi=[], @@ -977,7 +982,7 @@ boost_libraries = { ), 'boost_wserialization': BoostLibrary( name='boost_wserialization', - shared=['-DBOOST_SERIALIZATION_DYN_LINK=1'], + shared=[], static=[], single=[], multi=[], diff --git a/mesonbuild/envconfig.py b/mesonbuild/envconfig.py index 25b3c7f..b74be35 100644 --- a/mesonbuild/envconfig.py +++ b/mesonbuild/envconfig.py @@ -40,6 +40,7 @@ known_cpu_families = ( 'alpha', 'arc', 'arm', + 'avr', 'c2000', 'e2k', 'ia64', @@ -121,7 +122,7 @@ def get_env_var_pair(for_machine: MachineChoice, # ones. ([var_name + '_FOR_BUILD'] if is_cross else [var_name]), # Always just the unprefixed host verions - ([] if is_cross else [var_name]), + [var_name] )[for_machine] for var in candidates: value = os.environ.get(var) diff --git a/mesonbuild/environment.py b/mesonbuild/environment.py index 64efda6..8fad628 100644 --- a/mesonbuild/environment.py +++ b/mesonbuild/environment.py @@ -635,8 +635,8 @@ class Environment: self.coredata.meson_command = mesonlib.meson_command self.first_invocation = True - def is_cross_build(self) -> bool: - return self.coredata.is_cross_build() + def is_cross_build(self, when_building_for: MachineChoice = MachineChoice.HOST) -> bool: + return self.coredata.is_cross_build(when_building_for) def dump_coredata(self): return coredata.save(self.coredata, self.get_build_dir()) @@ -899,7 +899,7 @@ class Environment: def _detect_c_or_cpp_compiler(self, lang: str, for_machine: MachineChoice) -> Compiler: popen_exceptions = {} compilers, ccache, exe_wrap = self._get_compilers(lang, for_machine) - is_cross = not self.machines.matches_build_machine(for_machine) + is_cross = self.is_cross_build(for_machine) info = self.machines[for_machine] for compiler in compilers: @@ -985,12 +985,15 @@ class Environment: if 'Emscripten' in out: cls = EmscriptenCCompiler if lang == 'c' else EmscriptenCPPCompiler self.coredata.add_lang_args(cls.language, cls, for_machine, self) - # emcc cannot be queried to get the version out of it (it - # ignores -Wl,--version and doesn't have an alternative). - # Further, wasm-id *is* lld and will return `LLD X.Y.Z` if you - # call `wasm-ld --version`, but a special version of lld that - # takes different options. 
- p, o, _ = Popen_safe(['wasm-ld', '--version']) + + # emcc requires a file input in order to pass arguments to the + # linker. It'll exit with an error code, but still print the + # linker version. Old emcc versions ignore -Wl,--version completely, + # however. We'll report "unknown version" in that case. + with tempfile.NamedTemporaryFile(suffix='.c') as f: + cmd = compiler + [cls.LINKER_PREFIX + "--version", f.name] + _, o, _ = Popen_safe(cmd) + linker = WASMDynamicLinker( compiler, for_machine, cls.LINKER_PREFIX, [], version=search_version(o)) @@ -1149,7 +1152,7 @@ class Environment: def detect_cuda_compiler(self, for_machine): popen_exceptions = {} - is_cross = not self.machines.matches_build_machine(for_machine) + is_cross = self.is_cross_build(for_machine) compilers, ccache, exe_wrap = self._get_compilers('cuda', for_machine) info = self.machines[for_machine] for compiler in compilers: @@ -1189,7 +1192,7 @@ class Environment: def detect_fortran_compiler(self, for_machine: MachineChoice): popen_exceptions = {} compilers, ccache, exe_wrap = self._get_compilers('fortran', for_machine) - is_cross = not self.machines.matches_build_machine(for_machine) + is_cross = self.is_cross_build(for_machine) info = self.machines[for_machine] for compiler in compilers: if isinstance(compiler, str): @@ -1308,7 +1311,7 @@ class Environment: def _detect_objc_or_objcpp_compiler(self, for_machine: MachineInfo, objc: bool) -> 'Compiler': popen_exceptions = {} compilers, ccache, exe_wrap = self._get_compilers('objc' if objc else 'objcpp', for_machine) - is_cross = not self.machines.matches_build_machine(for_machine) + is_cross = self.is_cross_build(for_machine) info = self.machines[for_machine] for compiler in compilers: @@ -1399,7 +1402,7 @@ class Environment: def detect_vala_compiler(self, for_machine): exelist = self.lookup_binary_entry(for_machine, 'vala') - is_cross = not self.machines.matches_build_machine(for_machine) + is_cross = self.is_cross_build(for_machine) info = self.machines[for_machine] if exelist is None: # TODO support fallback @@ -1419,7 +1422,7 @@ class Environment: def detect_rust_compiler(self, for_machine): popen_exceptions = {} compilers, ccache, exe_wrap = self._get_compilers('rust', for_machine) - is_cross = not self.machines.matches_build_machine(for_machine) + is_cross = self.is_cross_build(for_machine) info = self.machines[for_machine] cc = self.detect_c_compiler(for_machine) @@ -1510,7 +1513,7 @@ class Environment: arch = 'x86_mscoff' popen_exceptions = {} - is_cross = not self.machines.matches_build_machine(for_machine) + is_cross = self.is_cross_build(for_machine) results, ccache, exe_wrap = self._get_compilers('d', for_machine) for exelist in results: # Search for a D compiler. 
@@ -1601,7 +1604,7 @@ class Environment: def detect_swift_compiler(self, for_machine): exelist = self.lookup_binary_entry(for_machine, 'swift') - is_cross = not self.machines.matches_build_machine(for_machine) + is_cross = self.is_cross_build(for_machine) info = self.machines[for_machine] if exelist is None: # TODO support fallback diff --git a/mesonbuild/interpreter.py b/mesonbuild/interpreter.py index 0dfb616..7901e5a 100644 --- a/mesonbuild/interpreter.py +++ b/mesonbuild/interpreter.py @@ -33,6 +33,7 @@ from .interpreterbase import FeatureNew, FeatureDeprecated, FeatureNewKwargs from .interpreterbase import ObjectHolder from .modules import ModuleReturnValue from .cmake import CMakeInterpreter +from .backend.backends import TestProtocol from pathlib import Path, PurePath import os @@ -979,7 +980,7 @@ class Test(InterpreterObject): self.should_fail = should_fail self.timeout = timeout self.workdir = workdir - self.protocol = protocol + self.protocol = TestProtocol.from_str(protocol) self.priority = priority def get_exe(self): @@ -1893,48 +1894,98 @@ class MesonMain(InterpreterObject): 'backend': self.backend_method, }) - def _find_source_script(self, name, args): + def _find_source_script(self, prog: T.Union[str, ExecutableHolder], args): + if isinstance(prog, ExecutableHolder): + prog_path = self.interpreter.backend.get_target_filename(prog.held_object) + return build.RunScript([prog_path], args) + elif isinstance(prog, ExternalProgramHolder): + return build.RunScript(prog.get_command(), args) + # Prefer scripts in the current source directory search_dir = os.path.join(self.interpreter.environment.source_dir, self.interpreter.subdir) - key = (name, search_dir) + key = (prog, search_dir) if key in self._found_source_scripts: found = self._found_source_scripts[key] else: - found = dependencies.ExternalProgram(name, search_dir=search_dir) + found = dependencies.ExternalProgram(prog, search_dir=search_dir) if found.found(): self._found_source_scripts[key] = found else: m = 'Script or command {!r} not found or not executable' - raise InterpreterException(m.format(name)) + raise InterpreterException(m.format(prog)) return build.RunScript(found.get_command(), args) - @permittedKwargs({}) - def add_install_script_method(self, args, kwargs): + def _process_script_args( + self, name: str, args: T.List[T.Union[ + str, mesonlib.File, CustomTargetHolder, + CustomTargetIndexHolder, ConfigureFileHolder, + ExternalProgramHolder, ExecutableHolder, + ]], allow_built: bool = False) -> T.List[str]: + script_args = [] # T.List[str] + new = False + for a in args: + a = unholder(a) + if isinstance(a, str): + script_args.append(a) + elif isinstance(a, mesonlib.File): + new = True + script_args.append(a.rel_to_builddir(self.interpreter.environment.source_dir)) + elif isinstance(a, (build.BuildTarget, build.CustomTarget, build.CustomTargetIndex)): + if not allow_built: + raise InterpreterException('Arguments to {} cannot be built'.format(name)) + new = True + script_args.extend([os.path.join(a.get_subdir(), o) for o in a.get_outputs()]) + + # This feels really hacky, but I'm not sure how else to fix + # this without completely rewriting install script handling. + # This is complicated by the fact that the install target + # depends on all. 
+ if isinstance(a, build.CustomTargetIndex): + a.target.build_by_default = True + else: + a.build_by_default = True + elif isinstance(a, build.ConfigureFile): + new = True + script_args.append(os.path.join(a.subdir, a.targetname)) + elif isinstance(a, dependencies.ExternalProgram): + script_args.extend(a.command) + new = True + else: + raise InterpreterException( + 'Arguments to {} must be strings, Files, CustomTargets, ' + 'Indexes of CustomTargets, or ConfigureFiles'.format(name)) + if new: + FeatureNew('Calling "{}" with File, CustomTaget, Index of CustomTarget, ConfigureFile, Executable, or ExternalProgram'.format(name), '0.55.0').use( + self.interpreter.subproject) + return script_args + + @permittedKwargs(set()) + def add_install_script_method(self, args: 'T.Tuple[T.Union[str, ExecutableHolder], T.Union[str, mesonlib.File, CustomTargetHolder, CustomTargetIndexHolder, ConfigureFileHolder], ...]', kwargs): if len(args) < 1: raise InterpreterException('add_install_script takes one or more arguments') - check_stringlist(args, 'add_install_script args must be strings') - script = self._find_source_script(args[0], args[1:]) + script_args = self._process_script_args('add_install_script', args[1:], allow_built=True) + script = self._find_source_script(args[0], script_args) self.build.install_scripts.append(script) - @permittedKwargs({}) + @permittedKwargs(set()) def add_postconf_script_method(self, args, kwargs): if len(args) < 1: raise InterpreterException('add_postconf_script takes one or more arguments') - check_stringlist(args, 'add_postconf_script arguments must be strings') - script = self._find_source_script(args[0], args[1:]) + script_args = self._process_script_args('add_postconf_script', args[1:], allow_built=True) + script = self._find_source_script(args[0], script_args) self.build.postconf_scripts.append(script) - @permittedKwargs({}) + @permittedKwargs(set()) def add_dist_script_method(self, args, kwargs): if len(args) < 1: raise InterpreterException('add_dist_script takes one or more arguments') if len(args) > 1: FeatureNew('Calling "add_dist_script" with multiple arguments', '0.49.0').use(self.interpreter.subproject) - check_stringlist(args, 'add_dist_script argument must be a string') if self.interpreter.subproject != '': raise InterpreterException('add_dist_script may not be used in a subproject.') - script = self._find_source_script(args[0], args[1:]) + script_args = self._process_script_args('add_dist_script', args[1:], allow_built=True) + script = self._find_source_script(args[0], script_args) self.build.dist_scripts.append(script) @noPosargs @@ -3772,6 +3823,8 @@ This will become a hard error in the future.''' % kwargs['input'], location=self @FeatureNewKwargs('test', '0.52.0', ['priority']) @permittedKwargs(permitted_kwargs['test']) def func_test(self, node, args, kwargs): + if kwargs.get('protocol') == 'gtest': + FeatureNew('"gtest" protocol for tests', '0.55.0').use(self.subproject) self.add_test(node, args, kwargs, True) def unpack_env_kwarg(self, kwargs) -> build.EnvironmentVariables: @@ -3823,8 +3876,8 @@ This will become a hard error in the future.''' % kwargs['input'], location=self if not isinstance(timeout, int): raise InterpreterException('Timeout must be an integer.') protocol = kwargs.get('protocol', 'exitcode') - if protocol not in ('exitcode', 'tap'): - raise InterpreterException('Protocol must be "exitcode" or "tap".') + if protocol not in {'exitcode', 'tap', 'gtest'}: + raise InterpreterException('Protocol must be "exitcode", "tap", or "gtest".') suite = 
[] prj = self.subproject if self.is_subproject() else self.build.project_name for s in mesonlib.stringlistify(kwargs.get('suite', '')): @@ -4663,6 +4716,8 @@ This will become a hard error in the future.''', location=self.current_node) if len(args) < 1 or len(args) > 2: raise InvalidCode('Get_variable takes one or two arguments.') varname = args[0] + if isinstance(varname, Disabler): + return varname if not isinstance(varname, str): raise InterpreterException('First argument must be a string.') try: diff --git a/mesonbuild/interpreterbase.py b/mesonbuild/interpreterbase.py index 1a7aa38..82d16f1 100644 --- a/mesonbuild/interpreterbase.py +++ b/mesonbuild/interpreterbase.py @@ -18,6 +18,7 @@ from . import mparser, mesonlib, mlog from . import environment, dependencies +import abc import os, copy, re import collections.abc from functools import wraps @@ -212,12 +213,11 @@ class permittedKwargs: return f(*wrapped_args, **wrapped_kwargs) return wrapped -class FeatureCheckBase: +class FeatureCheckBase(metaclass=abc.ABCMeta): "Base class for feature version checks" - # Class variable, shared across all instances - # - # Format: {subproject: {feature_version: set(feature_names)}} + # In python 3.6 we can just forward declare this, but in 3.5 we can't + # This will be overwritten by the subclasses by necessity feature_registry = {} # type: T.ClassVar[T.Dict[str, T.Dict[str, T.Set[str]]]] def __init__(self, feature_name: str, version: str): @@ -231,13 +231,18 @@ class FeatureCheckBase: return '' return mesonlib.project_meson_versions[subproject] + @staticmethod + @abc.abstractmethod + def check_version(target_version: str, feature_Version: str) -> bool: + pass + def use(self, subproject: str) -> None: tv = self.get_target_version(subproject) # No target version if tv == '': return # Target version is new enough - if mesonlib.version_compare_condition_with_min(tv, self.feature_version): + if self.check_version(tv, self.feature_version): return # Feature is too new for target version, register it if subproject not in self.feature_registry: @@ -283,6 +288,15 @@ class FeatureCheckBase: class FeatureNew(FeatureCheckBase): """Checks for new features""" + # Class variable, shared across all instances + # + # Format: {subproject: {feature_version: set(feature_names)}} + feature_registry = {} # type: T.ClassVar[T.Dict[str, T.Dict[str, T.Set[str]]]] + + @staticmethod + def check_version(target_version: str, feature_version: str) -> bool: + return mesonlib.version_compare_condition_with_min(target_version, feature_version) + @staticmethod def get_warning_str_prefix(tv: str) -> str: return 'Project specifies a minimum meson_version \'{}\' but uses features which were added in newer versions:'.format(tv) @@ -294,6 +308,16 @@ class FeatureNew(FeatureCheckBase): class FeatureDeprecated(FeatureCheckBase): """Checks for deprecated features""" + # Class variable, shared across all instances + # + # Format: {subproject: {feature_version: set(feature_names)}} + feature_registry = {} # type: T.ClassVar[T.Dict[str, T.Dict[str, T.Set[str]]]] + + @staticmethod + def check_version(target_version: str, feature_version: str) -> bool: + # For deprecatoin checks we need to return the inverse of FeatureNew checks + return not mesonlib.version_compare_condition_with_min(target_version, feature_version) + @staticmethod def get_warning_str_prefix(tv: str) -> str: return 'Deprecated features used:' @@ -819,7 +843,7 @@ The result of this is undefined and will become a hard error in a future Meson r def function_call(self, node: 
mparser.FunctionNode) -> T.Optional[TYPE_var]: func_name = node.func_name (posargs, kwargs) = self.reduce_arguments(node.args) - if is_disabled(posargs, kwargs) and func_name != 'set_variable' and func_name != 'is_disabler': + if is_disabled(posargs, kwargs) and func_name not in {'get_variable', 'set_variable', 'is_disabler'}: return Disabler() if func_name in self.funcs: func = self.funcs[func_name] diff --git a/mesonbuild/linkers.py b/mesonbuild/linkers.py index 44c720f..db735e7 100644 --- a/mesonbuild/linkers.py +++ b/mesonbuild/linkers.py @@ -761,6 +761,11 @@ class WASMDynamicLinker(GnuLikeDynamicLinkerMixin, PosixDynamicLinkerMixin, Dyna def get_asneeded_args(self) -> T.List[str]: return [] + def build_rpath_args(self, env: 'Environment', build_dir: str, from_dir: str, + rpath_paths: str, build_rpath: str, + install_rpath: str) -> T.List[str]: + return [] + class CcrxDynamicLinker(DynamicLinker): diff --git a/mesonbuild/mintro.py b/mesonbuild/mintro.py index d5516d4..54e302b 100644 --- a/mesonbuild/mintro.py +++ b/mesonbuild/mintro.py @@ -328,7 +328,7 @@ def get_test_list(testdata) -> T.List[T.Dict[str, T.Union[str, int, T.List[str], to['suite'] = t.suite to['is_parallel'] = t.is_parallel to['priority'] = t.priority - to['protocol'] = t.protocol + to['protocol'] = str(t.protocol) result.append(to) return result diff --git a/mesonbuild/modules/gnome.py b/mesonbuild/modules/gnome.py index 14cb4c4..a97fffa 100644 --- a/mesonbuild/modules/gnome.py +++ b/mesonbuild/modules/gnome.py @@ -406,11 +406,19 @@ class GnomeModule(ExtensionModule): kwargs = {'native': True, 'required': True} holder = self.interpreter.func_dependency(state.current_node, ['gobject-introspection-1.0'], kwargs) self.gir_dep = holder.held_object - if self.gir_dep.type_name == 'pkgconfig': + giscanner = state.environment.lookup_binary_entry(MachineChoice.HOST, 'g-ir-scanner') + if giscanner is not None: + self.giscanner = ExternalProgram.from_entry('g-ir-scanner', giscanner) + elif self.gir_dep.type_name == 'pkgconfig': self.giscanner = ExternalProgram('g_ir_scanner', self.gir_dep.get_pkgconfig_variable('g_ir_scanner', {})) - self.gicompiler = ExternalProgram('g_ir_compiler', self.gir_dep.get_pkgconfig_variable('g_ir_compiler', {})) else: self.giscanner = self.interpreter.find_program_impl('g-ir-scanner') + gicompiler = state.environment.lookup_binary_entry(MachineChoice.HOST, 'g-ir-compiler') + if gicompiler is not None: + self.gicompiler = ExternalProgram.from_entry('g-ir-compiler', gicompiler) + elif self.gir_dep.type_name == 'pkgconfig': + self.gicompiler = ExternalProgram('g_ir_compiler', self.gir_dep.get_pkgconfig_variable('g_ir_compiler', {})) + else: self.gicompiler = self.interpreter.find_program_impl('g-ir-compiler') return self.gir_dep, self.giscanner, self.gicompiler @@ -732,8 +740,12 @@ class GnomeModule(ExtensionModule): gir_dep, giscanner, gicompiler = self._get_gir_dep(state) - ns = kwargs.pop('namespace') - nsversion = kwargs.pop('nsversion') + ns = kwargs.get('namespace') + if not ns: + raise MesonException('Missing "namespace" keyword argument') + nsversion = kwargs.get('nsversion') + if not nsversion: + raise MesonException('Missing "nsversion" keyword argument') libsources = mesonlib.extract_as_list(kwargs, 'sources', pop=True) girfile = '%s-%s.gir' % (ns, nsversion) srcdir = os.path.join(state.environment.get_source_dir(), state.subdir) diff --git a/mesonbuild/modules/python.py b/mesonbuild/modules/python.py index 79e1824..ceabd76 100644 --- a/mesonbuild/modules/python.py +++ 
b/mesonbuild/modules/python.py @@ -361,7 +361,7 @@ class PythonInstallation(ExternalProgramHolder): @permittedKwargs(['pure', 'subdir']) def install_sources_method(self, args, kwargs): - pure = kwargs.pop('pure', False) + pure = kwargs.pop('pure', True) if not isinstance(pure, bool): raise InvalidArguments('"pure" argument must be a boolean.') diff --git a/mesonbuild/modules/unstable_kconfig.py b/mesonbuild/modules/unstable_keyval.py index 6685710..3da2992 100644 --- a/mesonbuild/modules/unstable_kconfig.py +++ b/mesonbuild/modules/unstable_keyval.py @@ -21,9 +21,9 @@ from ..interpreter import InvalidCode import os -class KconfigModule(ExtensionModule): +class KeyvalModule(ExtensionModule): - @FeatureNew('Kconfig Module', '0.51.0') + @FeatureNew('Keyval Module', '0.55.0') def __init__(self, *args, **kwargs): super().__init__(*args, **kwargs) self.snippets.add('load') @@ -56,9 +56,7 @@ class KconfigModule(ExtensionModule): s = sources[0] is_built = False if isinstance(s, mesonlib.File): - if s.is_built: - FeatureNew('kconfig.load() of built files', '0.52.0').use(state.subproject) - is_built = True + is_built = is_built or s.is_built s = s.absolute_path(interpreter.environment.source_dir, interpreter.environment.build_dir) else: s = os.path.join(interpreter.environment.source_dir, s) @@ -70,4 +68,4 @@ class KconfigModule(ExtensionModule): def initialize(*args, **kwargs): - return KconfigModule(*args, **kwargs) + return KeyvalModule(*args, **kwargs) diff --git a/mesonbuild/mtest.py b/mesonbuild/mtest.py index 3239736..4592c90 100644 --- a/mesonbuild/mtest.py +++ b/mesonbuild/mtest.py @@ -43,6 +43,7 @@ from . import environment from . import mlog from .dependencies import ExternalProgram from .mesonlib import MesonException, get_wine_shortpath, split_args +from .backend.backends import TestProtocol if T.TYPE_CHECKING: from .backend.backends import TestSerialisation @@ -94,6 +95,9 @@ def add_arguments(parser: argparse.ArgumentParser) -> None: parser.add_argument('--wrapper', default=None, dest='wrapper', type=split_args, help='wrapper to run tests with (e.g. Valgrind)') parser.add_argument('-C', default='.', dest='wd', + # https://github.com/python/typeshed/issues/3107 + # https://github.com/python/mypy/issues/7177 + type=os.path.abspath, # type: ignore help='directory to cd into before running') parser.add_argument('--suite', default=[], dest='include_suites', action='append', metavar='SUITE', help='Only run tests belonging to the given suite.') @@ -348,6 +352,19 @@ class JunitBuilder: def log(self, name: str, test: 'TestRun') -> None: """Log a single test case.""" + if test.junit is not None: + for suite in test.junit.findall('.//testsuite'): + # Assume that we don't need to merge anything here... + suite.attrib['name'] = '{}.{}.{}'.format(test.project, name, suite.attrib['name']) + + # GTest can inject invalid attributes + for case in suite.findall('.//testcase[@result]'): + del case.attrib['result'] + for case in suite.findall('.//testcase[@timestamp]'): + del case.attrib['timestamp'] + self.root.append(suite) + return + # In this case we have a test binary with multiple results. 
# We want to record this so that each result is recorded # separately @@ -429,10 +446,24 @@ class JunitBuilder: class TestRun: @classmethod + def make_gtest(cls, test: 'TestSerialisation', test_env: T.Dict[str, str], + returncode: int, starttime: float, duration: float, + stdo: T.Optional[str], stde: T.Optional[str], + cmd: T.Optional[T.List[str]]) -> 'TestRun': + filename = '{}.xml'.format(test.name) + if test.workdir: + filename = os.path.join(test.workdir, filename) + tree = et.parse(filename) + + return cls.make_exitcode( + test, test_env, returncode, starttime, duration, stdo, stde, cmd, + junit=tree) + + @classmethod def make_exitcode(cls, test: 'TestSerialisation', test_env: T.Dict[str, str], returncode: int, starttime: float, duration: float, stdo: T.Optional[str], stde: T.Optional[str], - cmd: T.Optional[T.List[str]]) -> 'TestRun': + cmd: T.Optional[T.List[str]], **kwargs) -> 'TestRun': if returncode == GNU_SKIP_RETURNCODE: res = TestResult.SKIP elif returncode == GNU_ERROR_RETURNCODE: @@ -441,15 +472,15 @@ class TestRun: res = TestResult.EXPECTEDFAIL if bool(returncode) else TestResult.UNEXPECTEDPASS else: res = TestResult.FAIL if bool(returncode) else TestResult.OK - return cls(test, test_env, res, [], returncode, starttime, duration, stdo, stde, cmd) + return cls(test, test_env, res, [], returncode, starttime, duration, stdo, stde, cmd, **kwargs) @classmethod def make_tap(cls, test: 'TestSerialisation', test_env: T.Dict[str, str], returncode: int, starttime: float, duration: float, stdo: str, stde: str, cmd: T.Optional[T.List[str]]) -> 'TestRun': - res = None # T.Optional[TestResult] - results = [] # T.List[TestResult] + res = None # type: T.Optional[TestResult] + results = [] # type: T.List[TestResult] failed = False for i in TAPParser(io.StringIO(stdo)).parse(): @@ -485,7 +516,7 @@ class TestRun: res: TestResult, results: T.List[TestResult], returncode: int, starttime: float, duration: float, stdo: T.Optional[str], stde: T.Optional[str], - cmd: T.Optional[T.List[str]]): + cmd: T.Optional[T.List[str]], *, junit: T.Optional[et.ElementTree] = None): assert isinstance(res, TestResult) self.res = res self.results = results # May be an empty list @@ -498,6 +529,7 @@ class TestRun: self.env = test_env self.should_fail = test.should_fail self.project = test.project_name + self.junit = junit def get_log(self) -> str: res = '--- command ---\n' @@ -544,9 +576,7 @@ def write_json_log(jsonlogfile: T.TextIO, test_name: str, result: TestRun) -> No jsonlogfile.write(json.dumps(jresult) + '\n') def run_with_mono(fname: str) -> bool: - if fname.endswith('.exe') and not (is_windows() or is_cygwin()): - return True - return False + return fname.endswith('.exe') and not (is_windows() or is_cygwin()) def load_benchmarks(build_dir: str) -> T.List['TestSerialisation']: datafile = Path(build_dir) / 'meson-private' / 'meson_benchmark_setup.dat' @@ -633,7 +663,7 @@ class SingleTestRunner: if not self.options.verbose: stdout = tempfile.TemporaryFile("wb+") stderr = tempfile.TemporaryFile("wb+") if self.options.split else stdout - if self.test.protocol == 'tap' and stderr is stdout: + if self.test.protocol is TestProtocol.TAP and stderr is stdout: stdout = tempfile.TemporaryFile("wb+") # Let gdb handle ^C instead of us @@ -653,7 +683,14 @@ class SingleTestRunner: # errors avoid not being able to use the terminal. 
os.setsid() # type: ignore - p = subprocess.Popen(cmd, + extra_cmd = [] # type: T.List[str] + if self.test.protocol is TestProtocol.GTEST: + gtestname = '{}.xml'.format(self.test.name) + if self.test.workdir: + gtestname = '{}:{}'.format(self.test.workdir, self.test.name) + extra_cmd.append('--gtest_output=xml:{}'.format(gtestname)) + + p = subprocess.Popen(cmd + extra_cmd, stdout=stdout, stderr=stderr, env=self.env, @@ -743,8 +780,10 @@ class SingleTestRunner: if timed_out: return TestRun(self.test, self.test_env, TestResult.TIMEOUT, [], p.returncode, starttime, duration, stdo, stde, cmd) else: - if self.test.protocol == 'exitcode': + if self.test.protocol is TestProtocol.EXITCODE: return TestRun.make_exitcode(self.test, self.test_env, p.returncode, starttime, duration, stdo, stde, cmd) + elif self.test.protocol is TestProtocol.GTEST: + return TestRun.make_gtest(self.test, self.test_env, p.returncode, starttime, duration, stdo, stde, cmd) else: if self.options.verbose: print(stdo, end='') @@ -1162,7 +1201,6 @@ def run(options: argparse.Namespace) -> int: if not exe.found(): print('Could not find requested program: {!r}'.format(check_bin)) return 1 - options.wd = os.path.abspath(options.wd) if not options.list and not options.no_rebuild: if not rebuild_all(options.wd): diff --git a/run_project_tests.py b/run_project_tests.py index 3abe88c..18731d6 100755 --- a/run_project_tests.py +++ b/run_project_tests.py @@ -50,7 +50,7 @@ from run_tests import ensure_backend_detects_changes from run_tests import guess_backend ALL_TESTS = ['cmake', 'common', 'warning-meson', 'failing-meson', 'failing-build', 'failing-test', - 'kconfig', 'platform-osx', 'platform-windows', 'platform-linux', + 'keyval', 'platform-osx', 'platform-windows', 'platform-linux', 'java', 'C#', 'vala', 'rust', 'd', 'objective c', 'objective c++', 'fortran', 'swift', 'cuda', 'python3', 'python', 'fpga', 'frameworks', 'nasm', 'wasm' ] @@ -904,7 +904,7 @@ def detect_tests_to_run(only: T.List[str], use_tmp: bool) -> T.List[T.Tuple[str, ('failing-meson', 'failing', False), ('failing-build', 'failing build', False), ('failing-test', 'failing test', False), - ('kconfig', 'kconfig', False), + ('keyval', 'keyval', False), ('platform-osx', 'osx', not mesonlib.is_osx()), ('platform-windows', 'windows', not mesonlib.is_windows() and not mesonlib.is_cygwin()), diff --git a/run_unittests.py b/run_unittests.py index da898a3..d6f7911 100755 --- a/run_unittests.py +++ b/run_unittests.py @@ -4625,8 +4625,7 @@ recommended as it is not supported on some platforms''') schema = et.XMLSchema(et.parse(str(Path(__file__).parent / 'data' / 'schema.xsd'))) - testdir = os.path.join(self.common_test_dir, case) - self.init(testdir) + self.init(case) self.run_tests() junit = et.parse(str(Path(self.builddir) / 'meson-logs' / 'testlog.junit.xml')) @@ -4636,10 +4635,29 @@ recommended as it is not supported on some platforms''') self.fail(e.error_log) def test_junit_valid_tap(self): - self._test_junit('213 tap tests') + self._test_junit(os.path.join(self.common_test_dir, '213 tap tests')) def test_junit_valid_exitcode(self): - self._test_junit('44 test args') + self._test_junit(os.path.join(self.common_test_dir, '44 test args')) + + def test_junit_valid_gtest(self): + self._test_junit(os.path.join(self.framework_test_dir, '2 gtest')) + + def test_link_language_linker(self): + # TODO: there should be some way to query how we're linking things + # without resorting to reading the ninja.build file + if self.backend is not Backend.ninja: + raise unittest.SkipTest('This 
test reads the ninja file') + + testdir = os.path.join(self.common_test_dir, '232 link language') + self.init(testdir) + + build_ninja = os.path.join(self.builddir, 'build.ninja') + with open(build_ninja, 'r', encoding='utf-8') as f: + contents = f.read() + + self.assertRegex(contents, r'build main(\.exe)?.*: c_LINKER') + self.assertRegex(contents, r'build (lib|cyg)?mylib.*: c_LINKER') class FailureTests(BasePlatformTests): @@ -5159,7 +5177,7 @@ class WindowsTests(BasePlatformTests): raise raise unittest.SkipTest('pefile module not found') testdir = os.path.join(self.common_test_dir, '6 linkshared') - self.init(testdir) + self.init(testdir, extra_args=['--buildtype=release']) self.build() # Test that binaries have a non-zero checksum env = get_fake_env() @@ -5307,7 +5325,7 @@ class DarwinTests(BasePlatformTests): def test_removing_unused_linker_args(self): testdir = os.path.join(self.common_test_dir, '108 has arg') - env = {'CFLAGS': '-L/tmp -L /var/tmp -headerpad_max_install_names -Wl,-export_dynamic'} + env = {'CFLAGS': '-L/tmp -L /var/tmp -headerpad_max_install_names -Wl,-export_dynamic -framework Foundation'} self.init(testdir, override_envvars=env) @@ -6667,6 +6685,17 @@ class LinuxCrossArmTests(BasePlatformTests): return self.assertTrue(False, 'Option libdir not in introspect data.') + def test_cross_libdir_subproject(self): + # Guard against a regression where calling "subproject" + # would reset the value of libdir to its default value. + testdir = os.path.join(self.unit_test_dir, '75 subdir libdir') + self.init(testdir, extra_args=['--libdir=fuf']) + for i in self.introspect('--buildoptions'): + if i['name'] == 'libdir': + self.assertEqual(i['value'], 'fuf') + return + self.assertTrue(False, 'Libdir specified on command line gets reset.') + def test_std_remains(self): # C_std defined in project options must be in effect also when cross compiling. 
testdir = os.path.join(self.unit_test_dir, '51 noncross options') diff --git a/test cases/common/104 postconf with args/meson.build b/test cases/common/104 postconf with args/meson.build index 8510c5b..a34502c 100644 --- a/test cases/common/104 postconf with args/meson.build +++ b/test cases/common/104 postconf with args/meson.build @@ -1,5 +1,10 @@ project('postconf script', 'c') -meson.add_postconf_script('postconf.py', '5', '33') +conf = configure_file( + configuration : configuration_data(), + output : 'out' +) + +meson.add_postconf_script(find_program('postconf.py'), '5', '33', conf) test('post', executable('prog', 'prog.c')) diff --git a/test cases/common/163 disabler/meson.build b/test cases/common/163 disabler/meson.build index 5554f14..d132e2b 100644 --- a/test cases/common/163 disabler/meson.build +++ b/test cases/common/163 disabler/meson.build @@ -9,6 +9,7 @@ d2 = dependency(d) d3 = (d == d2) d4 = d + 0 d5 = d2 or true +set_variable('d6', disabler()) has_not_changed = false if is_disabler(d) @@ -23,12 +24,14 @@ assert(is_disabler(d2), 'Function laundered disabler was not identified correctl assert(is_disabler(d3), 'Disabler comparison should yield disabler.') assert(is_disabler(d4), 'Disabler addition should yield disabler.') assert(is_disabler(d5), 'Disabler logic op should yield disabler.') +assert(is_disabler(d6), 'set_variable with a disabler should set variable to disabler.') assert(d, 'Disabler did not cause this to be skipped.') assert(d2, 'Function laundered disabler did not cause this to be skipped.') assert(d3, 'Disabler comparison should yield disabler and thus this would not be called.') assert(d4, 'Disabler addition should yield disabler and thus this would not be called.') assert(d5, 'Disabler logic op should yield disabler and thus this would not be called.') +assert(d6, 'set_variable with a disabler did not cause this to be skipped.') number = 0 @@ -80,6 +83,31 @@ else endif assert(has_not_changed, 'App has changed.') +assert(not is_disabler(is_variable('d6')), 'is_variable should not return a disabler') +assert(is_variable('d6'), 'is_variable for a disabler should return true') + +if_is_not_disabled = false +if is_variable('d6') + if_is_not_disabled = true +else + if_is_not_disabled = true +endif +assert(if_is_not_disabled, 'Disabler in is_variable should not skip blocks') + +get_d = get_variable('d6') +assert(is_disabler(get_d), 'get_variable should yield a disabler') + +get_fallback_d = get_variable('nonexistant', disabler()) +assert(is_disabler(get_fallback_d), 'get_variable fallback should yield a disabler') + +var_true = true +get_no_fallback_d = get_variable('var_true', disabler()) +assert(not is_disabler(get_no_fallback_d), 'get_variable should not fallback to disabler') +assert(get_no_fallback_d, 'get_variable should yield true') + +assert(is_disabler(get_variable(disabler())), 'get_variable should yield a disabler') +assert(is_disabler(get_variable(disabler(), var_true)), 'get_variable should yield a disabler') + if_is_disabled = true if disabler() if_is_disabled = false diff --git a/test cases/common/222 source set realistic example/meson.build b/test cases/common/222 source set realistic example/meson.build index 5b0e495..106b81d 100644 --- a/test cases/common/222 source set realistic example/meson.build +++ b/test cases/common/222 source set realistic example/meson.build @@ -1,4 +1,4 @@ -# a sort-of realistic example that combines the sourceset and kconfig +# a sort-of realistic example that combines the sourceset and keyval # modules, inspired by 
QEMU's build system project('sourceset-example', 'cpp', default_options: ['cpp_std=c++11']) @@ -9,7 +9,7 @@ if cppid == 'pgi' endif ss = import('sourceset') -kconfig = import('unstable-kconfig') +keyval = import('unstable-keyval') zlib = declare_dependency(compile_args: '-DZLIB=1') another = declare_dependency(compile_args: '-DANOTHER=1') @@ -39,7 +39,7 @@ targets = [ 'arm', 'aarch64', 'x86' ] target_dirs = { 'arm' : 'arm', 'aarch64' : 'arm', 'x86': 'x86' } foreach x : targets - config = kconfig.load('config' / x) + config = keyval.load('config' / x) target_specific = specific.apply(config, strict: false) target_common = common.apply(config, strict: false) target_deps = target_specific.dependencies() + target_common.dependencies() diff --git a/test cases/common/232 link language/c_linkage.cpp b/test cases/common/232 link language/c_linkage.cpp new file mode 100644 index 0000000..dc006b9 --- /dev/null +++ b/test cases/common/232 link language/c_linkage.cpp @@ -0,0 +1,5 @@ +extern "C" { + int makeInt(void) { + return 0; + } +} diff --git a/test cases/common/232 link language/c_linkage.h b/test cases/common/232 link language/c_linkage.h new file mode 100644 index 0000000..1609f47 --- /dev/null +++ b/test cases/common/232 link language/c_linkage.h @@ -0,0 +1,10 @@ + +#ifdef __cplusplus +extern "C" { +#endif + +int makeInt(void); + +#ifdef __cplusplus +} +#endif diff --git a/test cases/common/232 link language/lib.cpp b/test cases/common/232 link language/lib.cpp new file mode 100644 index 0000000..ab43828 --- /dev/null +++ b/test cases/common/232 link language/lib.cpp @@ -0,0 +1,5 @@ +extern "C" { + int makeInt(void) { + return 1; + } +} diff --git a/test cases/common/232 link language/main.c b/test cases/common/232 link language/main.c new file mode 100644 index 0000000..5a167e7 --- /dev/null +++ b/test cases/common/232 link language/main.c @@ -0,0 +1,5 @@ +#include "c_linkage.h" + +int main(void) { + return makeInt(); +} diff --git a/test cases/common/232 link language/meson.build b/test cases/common/232 link language/meson.build new file mode 100644 index 0000000..f9af6cd --- /dev/null +++ b/test cases/common/232 link language/meson.build @@ -0,0 +1,18 @@ +project( + 'link_language', + ['c', 'cpp'], +) + +exe = executable( + 'main', + ['main.c', 'c_linkage.cpp'], + link_language : 'c', +) + +lib = library( + 'mylib', + ['lib.cpp'], + link_language : 'c', +) + +test('main', exe) diff --git a/test cases/common/233 link depends indexed custom target/foo.c b/test cases/common/233 link depends indexed custom target/foo.c new file mode 100644 index 0000000..58c86a6 --- /dev/null +++ b/test cases/common/233 link depends indexed custom target/foo.c @@ -0,0 +1,15 @@ +#include <stdio.h> + +int main(void) { + const char *fn = DEPFILE; + FILE *f = fopen(fn, "r"); + if (!f) { + printf("could not open %s", fn); + return 1; + } + else { + printf("successfully opened %s", fn); + } + + return 0; +} diff --git a/test cases/common/233 link depends indexed custom target/make_file.py b/test cases/common/233 link depends indexed custom target/make_file.py new file mode 100644 index 0000000..6a43b7d --- /dev/null +++ b/test cases/common/233 link depends indexed custom target/make_file.py @@ -0,0 +1,8 @@ +#!/usr/bin/env python3 +import sys + +with open(sys.argv[1], 'w') as f: + print('# this file does nothing', file=f) + +with open(sys.argv[2], 'w') as f: + print('# this file does nothing', file=f) diff --git a/test cases/common/233 link depends indexed custom target/meson.build b/test cases/common/233 link 
depends indexed custom target/meson.build new file mode 100644 index 0000000..5c066e9 --- /dev/null +++ b/test cases/common/233 link depends indexed custom target/meson.build @@ -0,0 +1,19 @@ +project('link_depends_indexed_custom_target', 'c') + +if meson.backend().startswith('vs') + # FIXME: Broken on the VS backends + error('MESON_SKIP_TEST see https://github.com/mesonbuild/meson/issues/1799') +endif + +cmd = find_program('make_file.py') + +dep_files = custom_target('gen_dep', + command: [cmd, '@OUTPUT@'], + output: ['dep_file1', 'dep_file2']) + +exe = executable('foo', 'foo.c', + link_depends: dep_files[1], + c_args: ['-DDEPFILE="' + dep_files[0].full_path()+ '"']) + +# check that dep_file1 exists, which means that link_depends target ran +test('runtest', exe) diff --git a/test cases/common/56 install script/customtarget.py b/test cases/common/56 install script/customtarget.py new file mode 100755 index 0000000..e28373a --- /dev/null +++ b/test cases/common/56 install script/customtarget.py @@ -0,0 +1,19 @@ +#!/usr/bin/env python3 + +import argparse +import os + + +def main() -> None: + parser = argparse.ArgumentParser() + parser.add_argument('dirname') + args = parser.parse_args() + + with open(os.path.join(args.dirname, '1.txt'), 'w') as f: + f.write('') + with open(os.path.join(args.dirname, '2.txt'), 'w') as f: + f.write('') + + +if __name__ == "__main__": + main() diff --git a/test cases/common/56 install script/meson.build b/test cases/common/56 install script/meson.build index 6351518..e80e666 100644 --- a/test cases/common/56 install script/meson.build +++ b/test cases/common/56 install script/meson.build @@ -5,3 +5,29 @@ meson.add_install_script('myinstall.py', 'diiba/daaba', 'file.dat') meson.add_install_script('myinstall.py', 'this/should', 'also-work.dat') subdir('src') + +meson.add_install_script('myinstall.py', 'dir', afile, '--mode=copy') + +data = configuration_data() +data.set10('foo', true) +conf = configure_file( + configuration : data, + output : 'conf.txt' +) + +meson.add_install_script('myinstall.py', 'dir', conf, '--mode=copy') + +t = custom_target( + 'ct', + command : [find_program('customtarget.py'), '@OUTDIR@'], + output : ['1.txt', '2.txt'], +) + +meson.add_install_script('myinstall.py', 'customtarget', t, '--mode=copy') +meson.add_install_script('myinstall.py', 'customtargetindex', t[0], '--mode=copy') + +meson.add_install_script(exe, 'generated.txt') +wrap = find_program('wrap.py') +# Yes, these are getting silly +meson.add_install_script(wrap, exe, 'wrapped.txt') +meson.add_install_script(wrap, wrap, exe, 'wrapped2.txt') diff --git a/test cases/common/56 install script/myinstall.py b/test cases/common/56 install script/myinstall.py index 812561e..a573342 100644 --- a/test cases/common/56 install script/myinstall.py +++ b/test cases/common/56 install script/myinstall.py @@ -1,12 +1,31 @@ #!/usr/bin/env python3 +import argparse import os -import sys +import shutil prefix = os.environ['MESON_INSTALL_DESTDIR_PREFIX'] -dirname = os.path.join(prefix, sys.argv[1]) -os.makedirs(dirname) -with open(os.path.join(dirname, sys.argv[2]), 'w') as f: - f.write('') +def main() -> None: + parser = argparse.ArgumentParser() + parser.add_argument('dirname') + parser.add_argument('files', nargs='+') + parser.add_argument('--mode', action='store', default='create', choices=['create', 'copy']) + args = parser.parse_args() + + dirname = os.path.join(prefix, args.dirname) + if not os.path.exists(dirname): + os.makedirs(dirname) + + if args.mode == 'create': + for name in 
args.files:
+            with open(os.path.join(dirname, name), 'w') as f:
+                f.write('')
+    else:
+        for name in args.files:
+            shutil.copy(name, dirname)
+
+
+if __name__ == "__main__":
+    main()
diff --git a/test cases/common/56 install script/src/a file.txt b/test cases/common/56 install script/src/a file.txt
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/test cases/common/56 install script/src/a file.txt
diff --git a/test cases/common/56 install script/src/exe.c b/test cases/common/56 install script/src/exe.c
new file mode 100644
index 0000000..b573b91
--- /dev/null
+++ b/test cases/common/56 install script/src/exe.c
@@ -0,0 +1,24 @@
+#include <stdio.h>
+#include <stdlib.h>
+#include <string.h>
+
+int main(int argc, char * argv[]) {
+    if (argc != 2) {
+        fprintf(stderr, "Takes exactly 2 arguments\n");
+        return 1;
+    }
+
+    char * dirname = getenv("MESON_INSTALL_DESTDIR_PREFIX");
+    char * fullname = malloc(strlen(dirname) + 1 + strlen(argv[1]) + 1);
+    strcpy(fullname, dirname);
+    strcat(fullname, "/");
+    strcat(fullname, argv[1]);
+
+    FILE * fp = fopen(fullname, "w");
+    fputs("Some text\n", fp);
+    fclose(fp);
+
+    free(fullname);
+
+    return 0;
+}
diff --git a/test cases/common/56 install script/src/meson.build b/test cases/common/56 install script/src/meson.build
index b23574a..1db424f 100644
--- a/test cases/common/56 install script/src/meson.build
+++ b/test cases/common/56 install script/src/meson.build
@@ -1 +1,5 @@
 meson.add_install_script('myinstall.py', 'this/does', 'something-different.dat')
+
+afile = files('a file.txt')
+
+exe = executable('exe', 'exe.c', install : false, native : true)
diff --git a/test cases/common/56 install script/src/myinstall.py b/test cases/common/56 install script/src/myinstall.py
index 3b7ce37..3a9d89b 100644
--- a/test cases/common/56 install script/src/myinstall.py
+++ b/test cases/common/56 install script/src/myinstall.py
@@ -7,6 +7,8 @@ prefix = os.environ['MESON_INSTALL_DESTDIR_PREFIX']
 
 dirname = os.path.join(prefix, sys.argv[1])
 
-os.makedirs(dirname)
+if not os.path.exists(dirname):
+    os.makedirs(dirname)
+
 with open(os.path.join(dirname, sys.argv[2] + '.in'), 'w') as f:
     f.write('')
diff --git a/test cases/common/56 install script/test.json b/test cases/common/56 install script/test.json
index d17625f..b2a5971 100644
--- a/test cases/common/56 install script/test.json
+++ b/test cases/common/56 install script/test.json
@@ -4,6 +4,14 @@
     {"type": "pdb", "file": "usr/bin/prog"},
     {"type": "file", "file": "usr/diiba/daaba/file.dat"},
     {"type": "file", "file": "usr/this/should/also-work.dat"},
-    {"type": "file", "file": "usr/this/does/something-different.dat.in"}
+    {"type": "file", "file": "usr/this/does/something-different.dat.in"},
+    {"type": "file", "file": "usr/dir/a file.txt"},
+    {"type": "file", "file": "usr/dir/conf.txt"},
+    {"type": "file", "file": "usr/customtarget/1.txt"},
+    {"type": "file", "file": "usr/customtarget/2.txt"},
+    {"type": "file", "file": "usr/customtargetindex/1.txt"},
+    {"type": "file", "file": "usr/generated.txt"},
+    {"type": "file", "file": "usr/wrapped.txt"},
+    {"type": "file", "file": "usr/wrapped2.txt"}
   ]
 }
diff --git a/test cases/common/56 install script/wrap.py b/test cases/common/56 install script/wrap.py
new file mode 100755
index 0000000..87508e0
--- /dev/null
+++ b/test cases/common/56 install script/wrap.py
@@ -0,0 +1,6 @@
+#!/usr/bin/env python3
+
+import subprocess
+import sys
+
+subprocess.run(sys.argv[1:])
diff --git a/test cases/frameworks/1 boost/meson.build b/test cases/frameworks/1 boost/meson.build
index 501ed29..6c23360 100644
--- a/test cases/frameworks/1 boost/meson.build
+++ b/test cases/frameworks/1 boost/meson.build
@@ -13,7 +13,7 @@ endif
 # within one project. The need to be independent of each other.
 # Use one without a library dependency and one with it.
 
-linkdep = dependency('boost', static: s, modules : ['thread', 'system'])
+linkdep = dependency('boost', static: s, modules : ['thread', 'system', 'date_time'])
 testdep = dependency('boost', static: s, modules : ['unit_test_framework'])
 nomoddep = dependency('boost', static: s)
 extralibdep = dependency('boost', static: s, modules : ['thread', 'system', 'date_time', 'log_setup', 'log', 'filesystem', 'regex'])
diff --git a/test cases/frameworks/2 gtest/meson.build b/test cases/frameworks/2 gtest/meson.build
index 2d93b52..ea3ef48 100644
--- a/test cases/frameworks/2 gtest/meson.build
+++ b/test cases/frameworks/2 gtest/meson.build
@@ -8,7 +8,7 @@ endif
 gtest_nomain = dependency('gtest', main : false, method : 'system')
 
 e = executable('testprog', 'test.cc', dependencies : gtest)
-test('gtest test', e)
+test('gtest test', e, protocol : 'gtest')
 
 e = executable('testprog_nomain', 'test_nomain.cc', dependencies : gtest_nomain)
-test('gtest nomain test', e)
+test('gtest nomain test', e, protocol : 'gtest')
diff --git a/test cases/frameworks/21 libwmf/meson.build b/test cases/frameworks/21 libwmf/meson.build
index 6952bf7..9dbab6a 100644
--- a/test cases/frameworks/21 libwmf/meson.build
+++ b/test cases/frameworks/21 libwmf/meson.build
@@ -1,7 +1,7 @@
 project('libwmf test', 'c')
 
 wm = find_program('libwmf-config', required : false)
-if not wm.found()
+if not wm.found() or meson.is_cross_build()
   error('MESON_SKIP_TEST: libwmf-config not installed')
 endif
diff --git a/test cases/kconfig/1 basic/.config b/test cases/keyval/1 basic/.config
index 071d185..071d185 100644
--- a/test cases/kconfig/1 basic/.config
+++ b/test cases/keyval/1 basic/.config
diff --git a/test cases/kconfig/1 basic/meson.build b/test cases/keyval/1 basic/meson.build
index 5dc8d19..fc7ddb3 100644
--- a/test cases/kconfig/1 basic/meson.build
+++ b/test cases/keyval/1 basic/meson.build
@@ -1,6 +1,6 @@
-project('kconfig basic test')
+project('keyval basic test')
 
-k = import('unstable-kconfig')
+k = import('unstable-keyval')
 conf = k.load('.config')
 
 if not conf.has_key('CONFIG_VAL1')
diff --git a/test cases/kconfig/2 subdir/.config b/test cases/keyval/2 subdir/.config
index 0599d46..0599d46 100644
--- a/test cases/kconfig/2 subdir/.config
+++ b/test cases/keyval/2 subdir/.config
diff --git a/test cases/kconfig/2 subdir/dir/meson.build b/test cases/keyval/2 subdir/dir/meson.build
index 12f1502..dc1b478 100644
--- a/test cases/kconfig/2 subdir/dir/meson.build
+++ b/test cases/keyval/2 subdir/dir/meson.build
@@ -1,5 +1,5 @@
-k = import('unstable-kconfig')
+k = import('unstable-keyval')
 
 conf = k.load(meson.source_root() / '.config')
diff --git a/test cases/kconfig/3 load_config files/meson.build b/test cases/keyval/2 subdir/meson.build
index 1245b18..0651acf 100644
--- a/test cases/kconfig/3 load_config files/meson.build
+++ b/test cases/keyval/2 subdir/meson.build
@@ -1,4 +1,4 @@
-project('kconfig subdir test')
+project('keyval subdir test')
 
 # Test into sub directory
 subdir('dir')
diff --git a/test cases/kconfig/3 load_config files/dir/config b/test cases/keyval/3 load_config files/dir/config
index 0599d46..0599d46 100644
--- a/test cases/kconfig/3 load_config files/dir/config
+++ b/test cases/keyval/3 load_config files/dir/config
diff --git a/test cases/kconfig/3 load_config files/dir/meson.build b/test cases/keyval/3 load_config files/dir/meson.build
index d7b8d44..43fba13 100644
--- a/test cases/kconfig/3 load_config files/dir/meson.build
+++ b/test cases/keyval/3 load_config files/dir/meson.build
@@ -1,5 +1,5 @@
-k = import('unstable-kconfig')
+k = import('unstable-keyval')
 
 conf = k.load(files('config'))
diff --git a/test cases/kconfig/2 subdir/meson.build b/test cases/keyval/3 load_config files/meson.build
index 1245b18..0651acf 100644
--- a/test cases/kconfig/2 subdir/meson.build
+++ b/test cases/keyval/3 load_config files/meson.build
@@ -1,4 +1,4 @@
-project('kconfig subdir test')
+project('keyval subdir test')
 
 # Test into sub directory
 subdir('dir')
diff --git a/test cases/kconfig/4 load_config builddir/config b/test cases/keyval/4 load_config builddir/config
index 0599d46..0599d46 100644
--- a/test cases/kconfig/4 load_config builddir/config
+++ b/test cases/keyval/4 load_config builddir/config
diff --git a/test cases/kconfig/4 load_config builddir/meson.build b/test cases/keyval/4 load_config builddir/meson.build
index 1924d23..1bb0285 100644
--- a/test cases/kconfig/4 load_config builddir/meson.build
+++ b/test cases/keyval/4 load_config builddir/meson.build
@@ -1,6 +1,6 @@
-project('kconfig builddir test')
+project('keyval builddir test')
 
-k = import('unstable-kconfig')
+k = import('unstable-keyval')
 
 out_conf = configure_file(input: 'config', output: 'out-config', copy: true)
 conf = k.load(out_conf)
diff --git a/test cases/unit/35 dist script/meson.build b/test cases/unit/35 dist script/meson.build
index fd672a9..2ae9438 100644
--- a/test cases/unit/35 dist script/meson.build
+++ b/test cases/unit/35 dist script/meson.build
@@ -5,3 +5,4 @@ exe = executable('comparer', 'prog.c')
 test('compare', exe)
 
 meson.add_dist_script('replacer.py', '"incorrect"', '"correct"')
+meson.add_dist_script(find_program('replacer.py'), '"incorrect"', '"correct"')
diff --git a/test cases/unit/75 subdir libdir/meson.build b/test cases/unit/75 subdir libdir/meson.build
new file mode 100644
index 0000000..5099c91
--- /dev/null
+++ b/test cases/unit/75 subdir libdir/meson.build
@@ -0,0 +1,2 @@
+project('toplevel', 'c')
+subproject('flub')
diff --git a/test cases/unit/75 subdir libdir/subprojects/flub/meson.build b/test cases/unit/75 subdir libdir/subprojects/flub/meson.build
new file mode 100644
index 0000000..7bfd2c5
--- /dev/null
+++ b/test cases/unit/75 subdir libdir/subprojects/flub/meson.build
@@ -0,0 +1 @@
+project('subflub', 'c')
diff --git a/tools/boost_names.py b/tools/boost_names.py
index d26d34b..b66c6cc 100755
--- a/tools/boost_names.py
+++ b/tools/boost_names.py
@@ -43,10 +43,10 @@ export_modules = False
 class BoostLibrary():
     def __init__(self, name: str, shared: T.List[str], static: T.List[str], single: T.List[str], multi: T.List[str]):
         self.name = name
-        self.shared = shared
-        self.static = static
-        self.single = single
-        self.multi = multi
+        self.shared = sorted(set(shared))
+        self.static = sorted(set(static))
+        self.single = sorted(set(single))
+        self.multi = sorted(set(multi))
 
     def __lt__(self, other: T.Any) -> T.Union[bool, 'NotImplemented']:
         if isinstance(other, BoostLibrary):
@@ -99,15 +99,35 @@ def get_libraries(jamfile: Path) -> T.List[BoostLibrary]:
     cmds = raw.split(';')  # Commands always terminate with a ; (I hope)
     cmds = [x.strip() for x in cmds]  # Some cleanup
 
+    project_usage_requirements: T.List[str] = []
+
     # "Parse" the relevant sections
     for i in cmds:
         parts = i.split(' ')
-        parts = [x for x in parts if x not in ['', ':']]
+        parts = [x for x in parts if x not in ['']]
         if not parts:
             continue
 
-        # Parese libraries
-        if parts[0] in ['lib', 'boost-lib']:
+        # Parse project
+        if parts[0] in ['project']:
+            attributes: T.Dict[str, T.List[str]] = {}
+            curr: T.Optional[str] = None
+
+            for j in parts:
+                if j == ':':
+                    curr = None
+                elif curr is None:
+                    curr = j
+                else:
+                    if curr not in attributes:
+                        attributes[curr] = []
+                    attributes[curr] += [j]
+
+            if 'usage-requirements' in attributes:
+                project_usage_requirements = attributes['usage-requirements']
+
+        # Parse libraries
+        elif parts[0] in ['lib', 'boost-lib']:
             assert len(parts) >= 2
 
             # Get and check the library name
@@ -117,28 +137,36 @@ def get_libraries(jamfile: Path) -> T.List[BoostLibrary]:
             if not lname.startswith('boost_'):
                 continue
 
+            # Count `:` to only select the 'usage-requirements'
+            # See https://boostorg.github.io/build/manual/master/index.html#bbv2.main-target-rule-syntax
+            colon_counter = 0
+            usage_requirements: T.List[str] = []
+            for j in parts:
+                if j == ':':
+                    colon_counter += 1
+                elif colon_counter >= 4:
+                    usage_requirements += [j]
+
             # Get shared / static defines
             shared: T.List[str] = []
             static: T.List[str] = []
             single: T.List[str] = []
             multi: T.List[str] = []
-            for j in parts:
+            for j in usage_requirements + project_usage_requirements:
                 m1 = re.match(r'<link>shared:<define>(.*)', j)
                 m2 = re.match(r'<link>static:<define>(.*)', j)
                 m3 = re.match(r'<threading>single:<define>(.*)', j)
                 m4 = re.match(r'<threading>multi:<define>(.*)', j)
 
                 if m1:
-                    shared += [m1.group(1)]
+                    shared += [f'-D{m1.group(1)}']
                 if m2:
-                    static += [m2.group(1)]
+                    static += [f'-D{m2.group(1)}']
                 if m3:
-                    single += [m3.group(1)]
+                    single += [f'-D{m3.group(1)}']
                 if m4:
-                    multi += [m4.group(1)]
+                    multi += [f'-D{m4.group(1)}']
 
-            shared = [f'-D{x}' for x in shared]
-            static = [f'-D{x}' for x in static]
             libs += [BoostLibrary(lname, shared, static, single, multi)]
 
     return libs
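The install-script test above runs scripts and programs under `meson install` with `MESON_INSTALL_DESTDIR_PREFIX` set in the environment (see `myinstall.py` and `exe.c`). The sketch below only illustrates that pattern; it is not part of the test case, the script name and arguments are invented, and `os.makedirs(..., exist_ok=True)` is used as an equivalent of the `if not os.path.exists(...)` guard added in the patch.

```python
#!/usr/bin/env python3
# Illustrative sketch of an idempotent Meson install script (not the test's script).
import os
import sys

# Set by `meson install`; includes DESTDIR when one is given.
prefix = os.environ['MESON_INSTALL_DESTDIR_PREFIX']

dirname = os.path.join(prefix, sys.argv[1])
# Idempotent directory creation, so the script can be registered several
# times with different arguments without failing on the second run.
os.makedirs(dirname, exist_ok=True)

with open(os.path.join(dirname, sys.argv[2]), 'w') as f:
    f.write('installed by script\n')
```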
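The renamed keyval test cases load kconfig-style `.config` files with `k.load(...)` and then query keys such as `CONFIG_VAL1`. As a rough mental model only — this is not the module's actual implementation — loading such a file amounts to collecting `key=value` lines into a dictionary of strings:

```python
# Unofficial sketch of a key=value loader, similar in spirit to what the
# keyval test cases exercise with `k.load('.config')`.
from typing import Dict

def load_keyval(path: str) -> Dict[str, str]:
    result: Dict[str, str] = {}
    with open(path, encoding='utf-8') as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith('#'):
                continue  # skip blank lines and kconfig comments
            name, _, value = line.partition('=')
            # Values stay as strings: "y", numbers, and quoted strings
            # are not interpreted here.
            result[name.strip()] = value.strip()
    return result

if __name__ == '__main__':
    conf = load_keyval('.config')
    print('CONFIG_VAL1' in conf)
```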
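The `tools/boost_names.py` hunks restrict define extraction to the usage-requirements section of a Boost.Build `lib` rule — the tokens after the fourth `:` — and emit the matches directly as `-D` compiler arguments. The following is a self-contained toy rendering of that logic; the rule string is made up for demonstration.

```python
import re
from typing import List

# Made-up Boost.Build rule: name : sources : requirements : default-build : usage-requirements
rule = ('lib boost_log : : : : '
        '<link>shared:<define>BOOST_LOG_DYN_LINK <threading>multi:<define>BOOST_LOG_NO_THREADS')

parts = [p for p in rule.split(' ') if p]

# Count ':' so that only the usage-requirements section is scanned.
colon_counter = 0
usage_requirements: List[str] = []
for tok in parts:
    if tok == ':':
        colon_counter += 1
    elif colon_counter >= 4:
        usage_requirements.append(tok)

# Turn '<link>shared:<define>FOO' into '-DFOO', as the patch now does.
shared: List[str] = []
for tok in usage_requirements:
    m = re.match(r'<link>shared:<define>(.*)', tok)
    if m:
        shared.append(f'-D{m.group(1)}')

print(shared)  # ['-DBOOST_LOG_DYN_LINK']
```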