#!/usr/bin/env python3
# Test suites code generator.
#
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later

"""
This script is a key part of the Mbed TLS test suites framework. To
understand the script it is important to understand the framework.
This docstring contains a summary of the framework and explains the
function of this script.

Mbed TLS test suites:
=====================
Scope:
------
The test suites focus on unit testing the crypto primitives and also
include X.509 parser tests. Tests can be added to test any Mbed TLS
module. However, the framework is not capable of testing the SSL
protocol, since that requires full stack execution, which is best
tested as part of the system tests.

Test case definition:
---------------------
Tests are defined in a test_suite_<module>[.<optional sub module>].data
file. A test definition contains:
 test name
 optional build macro dependencies
 test function
 test parameters

Test dependencies are build macros that can be specified to indicate
the build config in which the test is valid. For example, if a test
depends on a feature that is only enabled by defining a macro, then
that macro should be specified as a dependency of the test.

The test function is the function that implements the test steps. The
same function can be specified for different tests that perform the
same steps with different parameters.

Test parameters are specified in string form, separated by ':'.
Parameters can be of type string, binary data specified as a hex
string, or integer constants specified as an integer, a macro or
an expression. The following is an example test definition:

 AES 128 GCM Encrypt and decrypt 8 bytes
 depends_on:MBEDTLS_AES_C:MBEDTLS_GCM_C
 enc_dec_buf:MBEDTLS_CIPHER_AES_128_GCM:"AES-128-GCM":128:8:-1

Test functions:
---------------
Test functions are coded in C in test_suite_<module>.function files.
A functions file is itself not compilable and contains special
format patterns to specify test suite dependencies, the start and end
of functions, and function dependencies. Check any existing functions
file for an example.

Execution:
----------
Tests are executed in 3 steps:
- Generating a test_suite_<module>[.<optional sub module>].c file
  for each corresponding .data file.
- Building each source file into an executable.
- Running each executable and printing a report.

Generating the C test source requires more than just the test functions.
The following extras are required:
- Process main()
- Reading the .data file and dispatching test cases.
- Platform specific test case execution
- Dependency checking
- Integer expression evaluation
- Test function dispatch

Build dependencies and integer expressions (in the test parameters)
are specified as strings in the .data file. Their run time value is
not known at the generation stage. Hence, they need to be translated
into run time evaluations. This script generates the run time checks
for dependencies and integer expressions.

Similarly, function names have to be translated into function calls.
This script also generates code for function dispatch.

The extra code mentioned here is either generated by this script
or comes from the input files: the helpers file, the platform file and
the template file.

Helper file:
------------
The helpers file contains common helper/utility functions and data.

Platform file:
--------------
The platform file contains platform specific setup code and test case
dispatch code. For example, host_test.function reads the test data
file from the host's file system and dispatches tests.

Template file:
--------------
The template file, for example main_test.function, is a template C file
in which generated code and code from the input files is substituted to
generate a compilable C file. It also contains skeleton functions for
dependency checks, expression evaluation and function dispatch. These
functions are populated with checks and return codes by this script.

The template file contains "replacement" fields that are placeholders
processed by the Python string.Template.substitute() method.

This script:
============
The core function of this script is to fill the template file with
code that is generated or read from the helpers and platform files.

This script replaces the following fields in the template and generates
the test source file:

__MBEDTLS_TEST_TEMPLATE__TEST_COMMON_HELPERS
        All common code from helpers.function
        is substituted here.
__MBEDTLS_TEST_TEMPLATE__FUNCTIONS_CODE
        Test functions are substituted here
        from the input test_suite_xyz.function
        file. C preprocessor checks are generated
        for the build dependencies specified
        in the input file. This script also
        generates wrappers for the test
        functions with code to expand the
        string parameters read from the data
        file.
__MBEDTLS_TEST_TEMPLATE__EXPRESSION_CODE
        This script enumerates the
        expressions in the .data file and
        generates code to handle enumerated
        expression Ids and return the values.
__MBEDTLS_TEST_TEMPLATE__DEP_CHECK_CODE
        This script enumerates all
        build dependencies and generates
        code to handle each enumerated build
        dependency Id and return a status:
        whether the dependency is defined or not.
__MBEDTLS_TEST_TEMPLATE__DISPATCH_CODE
        This script enumerates the functions
        specified in the input test data file
        and generates the initializer for the
        function table in the template
        file.
__MBEDTLS_TEST_TEMPLATE__PLATFORM_CODE
        Platform specific setup and test
        dispatch code.
"""


import os
import re
import sys
import string
import argparse


# Types recognized as signed integer arguments in test functions.
SIGNED_INTEGER_TYPES = frozenset([
    'char',
    'short',
    'short int',
    'int',
    'int8_t',
    'int16_t',
    'int32_t',
    'int64_t',
    'intmax_t',
    'long',
    'long int',
    'long long int',
    'mbedtls_mpi_sint',
    'psa_status_t',
])
# Types recognized as string arguments in test functions.
STRING_TYPES = frozenset(['char*', 'const char*', 'char const*'])
# Types recognized as hex data arguments in test functions.
DATA_TYPES = frozenset(['data_t*', 'const data_t*', 'data_t const*'])

BEGIN_HEADER_REGEX = r'/\*\s*BEGIN_HEADER\s*\*/'
END_HEADER_REGEX = r'/\*\s*END_HEADER\s*\*/'

BEGIN_SUITE_HELPERS_REGEX = r'/\*\s*BEGIN_SUITE_HELPERS\s*\*/'
END_SUITE_HELPERS_REGEX = r'/\*\s*END_SUITE_HELPERS\s*\*/'

BEGIN_DEP_REGEX = r'BEGIN_DEPENDENCIES'
END_DEP_REGEX = r'END_DEPENDENCIES'

BEGIN_CASE_REGEX = r'/\*\s*BEGIN_CASE\s*(?P<depends_on>.*?)\s*\*/'
END_CASE_REGEX = r'/\*\s*END_CASE\s*\*/'

DEPENDENCY_REGEX = r'depends_on:(?P<dependencies>.*)'
C_IDENTIFIER_REGEX = r'!?[a-z_][a-z0-9_]*'
CONDITION_OPERATOR_REGEX = r'[!=]=|[<>]=?'
# forbid 0ddd which might be accidentally octal or accidentally decimal
CONDITION_VALUE_REGEX = r'[-+]?(0x[0-9a-f]+|0|[1-9][0-9]*)'
CONDITION_REGEX = r'({})(?:\s*({})\s*({}))?$'.format(C_IDENTIFIER_REGEX,
                                                     CONDITION_OPERATOR_REGEX,
                                                     CONDITION_VALUE_REGEX)
TEST_FUNCTION_VALIDATION_REGEX = r'\s*void\s+(?P<func_name>\w+)\s*\('
FUNCTION_ARG_LIST_END_REGEX = r'.*\)'
EXIT_LABEL_REGEX = r'^exit:'


class GeneratorInputError(Exception):
    """
    Exception to indicate error in the input files to this script.
    This includes missing patterns, test function names and other
    parsing errors.
    """
    pass


class FileWrapper:
    """
    This class extends the file object with the attribute line_no,
    which indicates the line number of the line that is read.
    """

    def __init__(self, file_name) -> None:
        """
        Instantiate the file object and initialize the line number to 0.

        :param file_name: File path to open.
        """
        # private mix-in file object
        self._f = open(file_name, 'rb')
        self._line_no = 0

    def __iter__(self):
        return self

    def __next__(self):
        """
        This method makes FileWrapper iterable.
        It counts the line numbers as each line is read.

        :return: Line read from file.
        """
        line = self._f.__next__()
        self._line_no += 1
        # Convert byte array to string with correct encoding and
        # strip any whitespaces added in the decoding process.
        return line.decode(sys.getdefaultencoding()).rstrip() + '\n'

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self._f.__exit__(exc_type, exc_val, exc_tb)

    @property
    def line_no(self):
        """
        Property that indicates line number for the line that is read.
        """
        return self._line_no

    @property
    def name(self):
        """
        Property that indicates name of the file that is read.
        """
        return self._f.name


def split_dep(dep):
    """
    Split the NOT character '!' from a dependency. Used by gen_dependencies().

    :param dep: Dependency macro
    :return: string tuple. Ex: ('!', MACRO) for !MACRO and ('', MACRO) for
             MACRO.
    """
    return ('!', dep[1:]) if dep[0] == '!' else ('', dep)


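# Illustrative usage of split_dep() (a sketch, not executed by this script;
# the macro names are arbitrary examples):
#   split_dep('!MBEDTLS_AES_C') == ('!', 'MBEDTLS_AES_C')
#   split_dep('MBEDTLS_AES_C')  == ('', 'MBEDTLS_AES_C')
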
def gen_dependencies(dependencies):
    """
    Test suite data and functions specify compile time dependencies.
    This function generates C preprocessor code from the input
    dependency list. The caller uses the generated preprocessor code to
    wrap dependent code.
    A dependency in the input list can have a leading '!' character
    to negate a condition. '!' is separated from the dependency using
    function split_dep() and the proper preprocessor check is generated
    accordingly.

    :param dependencies: List of dependencies.
    :return: if defined and endif code with macro annotations for
             readability.
    """
    dep_start = ''.join(['#if %sdefined(%s)\n' % (x, y) for x, y in
                         map(split_dep, dependencies)])
    dep_end = ''.join(['#endif /* %s */\n' %
                       x for x in reversed(dependencies)])

    return dep_start, dep_end


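# Illustrative output of gen_dependencies() (a sketch, not executed), assuming
# the sample dependency list ['MBEDTLS_AES_C', '!MBEDTLS_GCM_C']:
#   dep_start == '#if defined(MBEDTLS_AES_C)\n#if !defined(MBEDTLS_GCM_C)\n'
#   dep_end   == '#endif /* !MBEDTLS_GCM_C */\n#endif /* MBEDTLS_AES_C */\n'
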
def gen_dependencies_one_line(dependencies):
    """
    Similar to gen_dependencies() but generates dependency checks in one line.
    Useful for generating code with #else block.

    :param dependencies: List of dependencies.
    :return: Preprocessor check code
    """
    defines = '#if ' if dependencies else ''
    defines += ' && '.join(['%sdefined(%s)' % (x, y) for x, y in map(
        split_dep, dependencies)])
    return defines


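# Illustrative output of gen_dependencies_one_line() (a sketch, not executed),
# for the same sample dependency list:
#   gen_dependencies_one_line(['MBEDTLS_AES_C', '!MBEDTLS_GCM_C'])
#       == '#if defined(MBEDTLS_AES_C) && !defined(MBEDTLS_GCM_C)'
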
def gen_function_wrapper(name, local_vars, args_dispatch):
    """
    Creates test function wrapper code. A wrapper has the code to
    unpack parameters from parameters[] array.

    :param name: Test function name
    :param local_vars: Local variables declaration code
    :param args_dispatch: List of dispatch arguments.
           Ex: ['(char *) params[0]', '*((int *) params[1])']
    :return: Test function wrapper.
    """
    # Then create the wrapper
    wrapper = '''
void {name}_wrapper( void ** params )
{{
{unused_params}{locals}
    {name}( {args} );
}}
'''.format(name=name,
           unused_params='' if args_dispatch else '    (void)params;\n',
           args=', '.join(args_dispatch),
           locals=local_vars)
    return wrapper


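# Illustrative wrapper generated by gen_function_wrapper() (a rough sketch,
# not executed), assuming name='test_foo', local_vars='' and
# args_dispatch=['(char *) params[0]'] (an empty locals line is left where
# local variable declarations would go):
#   void test_foo_wrapper( void ** params )
#   {
#       test_foo( (char *) params[0] );
#   }
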
def gen_dispatch(name, dependencies):
    """
    Test suite code template main_test.function defines a C function
    array to contain test case functions. This function generates an
    initializer entry for a function in that array. The entry is
    composed of a compile time check for the test function
    dependencies. At compile time the test function is assigned when
    dependencies are met, else NULL is assigned.

    :param name: Test function name
    :param dependencies: List of dependencies
    :return: Dispatch code.
    """
    if dependencies:
        preprocessor_check = gen_dependencies_one_line(dependencies)
        dispatch_code = '''
{preprocessor_check}
    {name}_wrapper,
#else
    NULL,
#endif
'''.format(preprocessor_check=preprocessor_check, name=name)
    else:
        dispatch_code = '''
    {name}_wrapper,
'''.format(name=name)

    return dispatch_code


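# Illustrative dispatch entry generated by gen_dispatch() (a rough sketch,
# not executed), assuming name='test_foo' and dependencies=['MBEDTLS_AES_C']:
#   #if defined(MBEDTLS_AES_C)
#       test_foo_wrapper,
#   #else
#       NULL,
#   #endif
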
def parse_until_pattern(funcs_f, end_regex):
    """
    Matches pattern end_regex to the lines read from the file object.
    Returns the lines read until end pattern is matched.

    :param funcs_f: file object for .function file
    :param end_regex: Pattern to stop parsing
    :return: Lines read before the end pattern
    """
    headers = '#line %d "%s"\n' % (funcs_f.line_no + 1, funcs_f.name)
    for line in funcs_f:
        if re.search(end_regex, line):
            break
        headers += line
    else:
        raise GeneratorInputError("file: %s - end pattern [%s] not found!" %
                                  (funcs_f.name, end_regex))

    return headers


def validate_dependency(dependency):
    """
    Validates a C macro and raises GeneratorInputError on invalid input.

    :param dependency: Input macro dependency
    :return: input dependency stripped of leading & trailing white spaces.
    """
    dependency = dependency.strip()
    if not re.match(CONDITION_REGEX, dependency, re.I):
        raise GeneratorInputError('Invalid dependency %s' % dependency)
    return dependency


def parse_dependencies(inp_str):
    """
    Parses dependencies out of inp_str, validates them and returns a
    list of macros.

    :param inp_str: Input string with macros delimited by ':'.
    :return: list of dependencies
    """
    dependencies = list(map(validate_dependency, inp_str.split(':')))
    return dependencies


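# Illustrative accepted forms for parse_dependencies() (a sketch, not
# executed; the macro names are arbitrary examples):
#   parse_dependencies('MBEDTLS_AES_C:!MBEDTLS_GCM_C')
#       == ['MBEDTLS_AES_C', '!MBEDTLS_GCM_C']
# A comparison against an integer constant is also accepted by
# CONDITION_REGEX, e.g. 'MBEDTLS_VERSION_NUMBER >= 0x03000000'.
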
def parse_suite_dependencies(funcs_f):
    """
    Parses test suite dependencies specified at the top of a
    .function file, in a block that starts with the pattern
    BEGIN_DEPENDENCIES and ends with END_DEPENDENCIES. Dependencies
    are specified after the pattern 'depends_on:' and are delimited
    by ':'.

    :param funcs_f: file object for .function file
    :return: List of test suite dependencies.
    """
    dependencies = []
    for line in funcs_f:
        match = re.search(DEPENDENCY_REGEX, line.strip())
        if match:
            try:
                dependencies = parse_dependencies(match.group('dependencies'))
            except GeneratorInputError as error:
                raise GeneratorInputError(
                    str(error) + " - %s:%d" % (funcs_f.name, funcs_f.line_no))
        if re.search(END_DEP_REGEX, line):
            break
    else:
        raise GeneratorInputError("file: %s - end dependency pattern [%s]"
                                  " not found!" % (funcs_f.name,
                                                   END_DEP_REGEX))

    return dependencies


def parse_function_dependencies(line):
    """
    Parses function dependencies that are on the same line as the
    BEGIN_CASE comment. Dependencies are specified after the pattern
    'depends_on:' and are delimited by ':'.

    :param line: Line from .function file that has dependencies.
    :return: List of dependencies.
    """
    dependencies = []
    match = re.search(BEGIN_CASE_REGEX, line)
    dep_str = match.group('depends_on')
    if dep_str:
        match = re.search(DEPENDENCY_REGEX, dep_str)
        if match:
            dependencies += parse_dependencies(match.group('dependencies'))

    return dependencies


ARGUMENT_DECLARATION_REGEX = re.compile(r'(.+?) ?(?:\bconst\b)? ?(\w+)\Z', re.S)
def parse_function_argument(arg, arg_idx, args, local_vars, args_dispatch):
    """
    Parses one test function's argument declaration.

    :param arg: argument declaration.
    :param arg_idx: current wrapper argument index.
    :param args: accumulator of arguments' internal types.
    :param local_vars: accumulator of internal variable declarations.
    :param args_dispatch: accumulator of argument usage expressions.
    :return: the number of new wrapper arguments,
             or None if the argument declaration is invalid.
    """
    # Normalize whitespace
    arg = arg.strip()
    arg = re.sub(r'\s*\*\s*', r'*', arg)
    arg = re.sub(r'\s+', r' ', arg)
    # Extract name and type
    m = ARGUMENT_DECLARATION_REGEX.search(arg)
    if not m:
        # E.g. "int x[42]"
        return None
    typ, _ = m.groups()
    if typ in SIGNED_INTEGER_TYPES:
        args.append('int')
        args_dispatch.append('((mbedtls_test_argument_t *) params[%d])->sint' % arg_idx)
        return 1
    if typ in STRING_TYPES:
        args.append('char*')
        args_dispatch.append('(char *) params[%d]' % arg_idx)
        return 1
    if typ in DATA_TYPES:
        args.append('hex')
        # create a structure
        pointer_initializer = '(uint8_t *) params[%d]' % arg_idx
        len_initializer = '((mbedtls_test_argument_t *) params[%d])->len' % (arg_idx+1)
        local_vars.append('    data_t data%d = {%s, %s};\n' %
                          (arg_idx, pointer_initializer, len_initializer))
        args_dispatch.append('&data%d' % arg_idx)
        return 2
    return None


ARGUMENT_LIST_REGEX = re.compile(r'\((.*?)\)', re.S)
def parse_function_arguments(line):
    """
    Parses test function signature for validation and generates
    a dispatch wrapper function that translates input test vectors
    read from the data file into test function arguments.

    :param line: Line from .function file that has a function
                 signature.
    :return: argument list, local variables for
             wrapper function and argument dispatch code.
    """
    # Process arguments, ex: <type> arg1, <type> arg2 )
    # This script assumes that the argument list is terminated by ')'
    # i.e. the test functions will not have a function pointer
    # argument.
    m = ARGUMENT_LIST_REGEX.search(line)
    arg_list = m.group(1).strip()
    if arg_list in ['', 'void']:
        return [], '', []
    args = []
    local_vars = []
    args_dispatch = []
    arg_idx = 0
    for arg in arg_list.split(','):
        indexes = parse_function_argument(arg, arg_idx,
                                          args, local_vars, args_dispatch)
        if indexes is None:
            raise ValueError("Test function arguments can only be 'int', "
                             "'char *' or 'data_t'\n%s" % line)
        arg_idx += indexes

    return args, ''.join(local_vars), args_dispatch


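# Illustrative mapping produced by parse_function_arguments() (a sketch, not
# executed), assuming the hypothetical signature 'void foo( int x, char *s,
# data_t *buf )':
#   args          == ['int', 'char*', 'hex']
#   args_dispatch == ['((mbedtls_test_argument_t *) params[0])->sint',
#                     '(char *) params[1]', '&data2']
#   local_vars declares 'data_t data2' from params[2] (pointer) and
#   params[3] (length), so a data_t argument consumes two wrapper slots.
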
def generate_function_code(name, code, local_vars, args_dispatch,
                           dependencies):
    """
    Generate function code with preprocessor checks and parameter dispatch
    wrapper.

    :param name: Function name
    :param code: Function code
    :param local_vars: Local variables for function wrapper
    :param args_dispatch: Argument dispatch code
    :param dependencies: Preprocessor dependencies list
    :return: Final function code
    """
    # Add exit label if not present
    if code.find('exit:') == -1:
        split_code = code.rsplit('}', 1)
        if len(split_code) == 2:
            code = """exit:
    ;
}""".join(split_code)

    code += gen_function_wrapper(name, local_vars, args_dispatch)
    preprocessor_check_start, preprocessor_check_end = \
        gen_dependencies(dependencies)
    return preprocessor_check_start + code + preprocessor_check_end


COMMENT_START_REGEX = re.compile(r'/[*/]')


def skip_comments(line, stream):
    """Remove comments in line.

    If the line contains an unfinished comment, read more lines from stream
    until the line that contains the comment.

    :return: The original line with inner comments replaced by spaces.
             Trailing comments and whitespace may be removed completely.
    """
    pos = 0
    while True:
        opening = COMMENT_START_REGEX.search(line, pos)
        if not opening:
            break
        if line[opening.start(0) + 1] == '/': # //...
            continuation = line
            # Count the number of line breaks, to keep line numbers aligned
            # in the output.
            line_count = 1
            while continuation.endswith('\\\n'):
                # This errors out if the file ends with an unfinished line
                # comment. That's acceptable to not complicate the code further.
                continuation = next(stream)
                line_count += 1
            return line[:opening.start(0)].rstrip() + '\n' * line_count
        # Parsing /*...*/, looking for the end
        closing = line.find('*/', opening.end(0))
        while closing == -1:
            # This errors out if the file ends with an unfinished block
            # comment. That's acceptable to not complicate the code further.
            line += next(stream)
            closing = line.find('*/', opening.end(0))
        pos = closing + 2
        # Replace inner comment by spaces. There needs to be at least one space
        # for things like 'int/*ihatespaces*/foo'. Go further and preserve the
        # width of the comment and line breaks, this way positions in error
        # messages remain correct.
        line = (line[:opening.start(0)] +
                re.sub(r'.', r' ', line[opening.start(0):pos]) +
                line[pos:])
    # Strip whitespace at the end of lines (it's irrelevant to error messages).
    return re.sub(r' +(\n|\Z)', r'\1', line)


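# Illustrative behaviour of skip_comments() (a sketch, not executed): inner
# block comments are replaced by an equal number of spaces, e.g.
# 'int/*x*/foo' becomes 'int     foo', so column positions reported in error
# messages stay correct; '//' comments are stripped to the end of the line.
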
def parse_function_code(funcs_f, dependencies, suite_dependencies):
    """
    Parses out a function from function file object and generates
    function and dispatch code.

    :param funcs_f: file object of the functions file.
    :param dependencies: List of dependencies
    :param suite_dependencies: List of test suite dependencies
    :return: Function name, arguments, function code and dispatch code.
    """
    line_directive = '#line %d "%s"\n' % (funcs_f.line_no + 1, funcs_f.name)
    code = ''
    has_exit_label = False
    for line in funcs_f:
        # Check function signature. Function signature may be split
        # across multiple lines. Here we try to find the start of
        # arguments list, then remove '\n's and apply the regex to
        # detect function start.
        line = skip_comments(line, funcs_f)
        up_to_arg_list_start = code + line[:line.find('(') + 1]
        match = re.match(TEST_FUNCTION_VALIDATION_REGEX,
                         up_to_arg_list_start.replace('\n', ' '), re.I)
        if match:
            # check if we have full signature i.e. split in more lines
            name = match.group('func_name')
            if not re.match(FUNCTION_ARG_LIST_END_REGEX, line):
                for lin in funcs_f:
                    line += skip_comments(lin, funcs_f)
                    if re.search(FUNCTION_ARG_LIST_END_REGEX, line):
                        break
            args, local_vars, args_dispatch = parse_function_arguments(
                line)
            code += line
            break
        code += line
    else:
        raise GeneratorInputError("file: %s - Test functions not found!" %
                                  funcs_f.name)

    # Prefix test function name with 'test_'
    code = code.replace(name, 'test_' + name, 1)
    name = 'test_' + name

    # If a test function has no arguments then add 'void' argument to
    # avoid "-Wstrict-prototypes" warnings from clang
    if len(args) == 0:
        code = code.replace('()', '(void)', 1)

    for line in funcs_f:
        if re.search(END_CASE_REGEX, line):
            break
        if not has_exit_label:
            has_exit_label = \
                re.search(EXIT_LABEL_REGEX, line.strip()) is not None
        code += line
    else:
        raise GeneratorInputError("file: %s - end case pattern [%s] not "
                                  "found!" % (funcs_f.name, END_CASE_REGEX))

    code = line_directive + code
    code = generate_function_code(name, code, local_vars, args_dispatch,
                                  dependencies)
    dispatch_code = gen_dispatch(name, suite_dependencies + dependencies)
    return (name, args, code, dispatch_code)


def parse_functions(funcs_f):
    """
    Parses a test_suite_xxx.function file and returns information
    for generating a C source file for the test suite.

    :param funcs_f: file object of the functions file.
    :return: List of test suite dependencies, test function dispatch
             code, function code and a dict with function identifiers
             and arguments info.
    """
    suite_helpers = ''
    suite_dependencies = []
    suite_functions = ''
    func_info = {}
    function_idx = 0
    dispatch_code = ''
    for line in funcs_f:
        if re.search(BEGIN_HEADER_REGEX, line):
            suite_helpers += parse_until_pattern(funcs_f, END_HEADER_REGEX)
        elif re.search(BEGIN_SUITE_HELPERS_REGEX, line):
            suite_helpers += parse_until_pattern(funcs_f,
                                                 END_SUITE_HELPERS_REGEX)
        elif re.search(BEGIN_DEP_REGEX, line):
            suite_dependencies += parse_suite_dependencies(funcs_f)
        elif re.search(BEGIN_CASE_REGEX, line):
            try:
                dependencies = parse_function_dependencies(line)
            except GeneratorInputError as error:
                raise GeneratorInputError(
                    "%s:%d: %s" % (funcs_f.name, funcs_f.line_no,
                                   str(error)))
            func_name, args, func_code, func_dispatch =\
                parse_function_code(funcs_f, dependencies, suite_dependencies)
            suite_functions += func_code
            # Generate dispatch code and enumeration info
            if func_name in func_info:
                raise GeneratorInputError(
                    "file: %s - function %s re-declared at line %d" %
                    (funcs_f.name, func_name, funcs_f.line_no))
            func_info[func_name] = (function_idx, args)
            dispatch_code += '/* Function Id: %d */\n' % function_idx
            dispatch_code += func_dispatch
            function_idx += 1

    func_code = (suite_helpers +
                 suite_functions).join(gen_dependencies(suite_dependencies))
    return suite_dependencies, dispatch_code, func_code, func_info


def escaped_split(inp_str, split_char):
    """
    Split inp_str on character split_char but ignore if escaped.
    Since the return value is used to write back to the intermediate
    data file, any escape characters in the input are retained in the
    output.

    :param inp_str: String to split
    :param split_char: Split character
    :return: List of splits
    """
    if len(split_char) > 1:
        raise ValueError('Expected split character. Found string!')
    out = re.sub(r'(\\.)|' + split_char,
                 lambda m: m.group(1) or '\n', inp_str,
                 len(inp_str)).split('\n')
    out = [x for x in out if x]
    return out


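# Illustrative behaviour of escaped_split() (a sketch, not executed):
#   escaped_split(r'name:"a\:b":123', ':') == ['name', r'"a\:b"', '123']
# i.e. a backslash-escaped ':' does not split and is kept verbatim.
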
def parse_test_data(data_f):
    """
    Parses .data file for each test case name, test function name,
    test dependencies and test arguments. This information is
    correlated with the test functions file for generating an
    intermediate data file replacing the strings for test function
    names, dependencies and integer constant expressions with
    identifiers. Mainly for optimising space for on-target
    execution.

    :param data_f: file object of the data file.
    :return: Generator that yields line number, test name, function name,
             dependency list and function argument list.
    """
    __state_read_name = 0
    __state_read_args = 1
    state = __state_read_name
    dependencies = []
    name = ''
    for line in data_f:
        line = line.strip()
        # Skip comments
        if line.startswith('#'):
            continue

        # Blank line indicates end of test
        if not line:
            if state == __state_read_args:
                raise GeneratorInputError("[%s:%d] Newline before arguments. "
                                          "Test function and arguments "
                                          "missing for %s" %
                                          (data_f.name, data_f.line_no, name))
            continue

        if state == __state_read_name:
            # Read test name
            name = line
            state = __state_read_args
        elif state == __state_read_args:
            # Check dependencies
            match = re.search(DEPENDENCY_REGEX, line)
            if match:
                try:
                    dependencies = parse_dependencies(
                        match.group('dependencies'))
                except GeneratorInputError as error:
                    raise GeneratorInputError(
                        str(error) + " - %s:%d" %
                        (data_f.name, data_f.line_no))
            else:
                # Read test vectors
                parts = escaped_split(line, ':')
                test_function = parts[0]
                args = parts[1:]
                yield data_f.line_no, name, test_function, dependencies, args
                dependencies = []
                state = __state_read_name
    if state == __state_read_args:
        raise GeneratorInputError("[%s:%d] Newline before arguments. "
                                  "Test function and arguments missing for "
                                  "%s" % (data_f.name, data_f.line_no, name))


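# Illustrative yield of parse_test_data() (a sketch, not executed), for the
# .data entry shown in the module docstring:
#   (line_no, 'AES 128 GCM Encrypt and decrypt 8 bytes', 'enc_dec_buf',
#    ['MBEDTLS_AES_C', 'MBEDTLS_GCM_C'],
#    ['MBEDTLS_CIPHER_AES_128_GCM', '"AES-128-GCM"', '128', '8', '-1'])
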
def gen_dep_check(dep_id, dep):
    """
    Generate code for checking dependency with the associated
    identifier.

    :param dep_id: Dependency identifier
    :param dep: Dependency macro
    :return: Dependency check code
    """
    if dep_id < 0:
        raise GeneratorInputError("Dependency Id should be a positive "
                                  "integer.")
    _not, dep = ('!', dep[1:]) if dep[0] == '!' else ('', dep)
    if not dep:
        raise GeneratorInputError("Dependency should not be an empty string.")

    dependency = re.match(CONDITION_REGEX, dep, re.I)
    if not dependency:
        raise GeneratorInputError('Invalid dependency %s' % dep)

    _defined = '' if dependency.group(2) else 'defined'
    _cond = dependency.group(2) if dependency.group(2) else ''
    _value = dependency.group(3) if dependency.group(3) else ''

    dep_check = '''
        case {id}:
            {{
#if {_not}{_defined}({macro}{_cond}{_value})
                ret = DEPENDENCY_SUPPORTED;
#else
                ret = DEPENDENCY_NOT_SUPPORTED;
#endif
            }}
            break;'''.format(_not=_not, _defined=_defined,
                             macro=dependency.group(1), id=dep_id,
                             _cond=_cond, _value=_value)
    return dep_check


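# Illustrative check generated by gen_dep_check() (a rough sketch, not
# executed), assuming dep_id=5 and dep='!MBEDTLS_AES_C':
#   case 5:
#       {
#   #if !defined(MBEDTLS_AES_C)
#       ret = DEPENDENCY_SUPPORTED;
#   #else
#       ret = DEPENDENCY_NOT_SUPPORTED;
#   #endif
#       }
#       break;
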
def gen_expression_check(exp_id, exp):
    """
    Generates code for evaluating an integer expression using
    associated expression Id.

    :param exp_id: Expression Identifier
    :param exp: Expression/Macro
    :return: Expression check code
    """
    if exp_id < 0:
        raise GeneratorInputError("Expression Id should be a positive "
                                  "integer.")
    if not exp:
        raise GeneratorInputError("Expression should not be an empty string.")
    exp_code = '''
        case {exp_id}:
            {{
                *out_value = {expression};
            }}
            break;'''.format(exp_id=exp_id, expression=exp)
    return exp_code


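# Illustrative case generated by gen_expression_check() (a rough sketch, not
# executed), assuming exp_id=2 and exp='MBEDTLS_CIPHER_AES_128_GCM':
#   case 2:
#       {
#           *out_value = MBEDTLS_CIPHER_AES_128_GCM;
#       }
#       break;
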
def write_dependencies(out_data_f, test_dependencies, unique_dependencies):
    """
    Write dependencies to intermediate test data file, replacing
    the string form with identifiers. Also, generates dependency
    check code.

    :param out_data_f: Output intermediate data file
    :param test_dependencies: Dependencies
    :param unique_dependencies: Mutable list to track unique dependencies
           that are global to this re-entrant function.
    :return: returns dependency check code.
    """
    dep_check_code = ''
    if test_dependencies:
        out_data_f.write('depends_on')
        for dep in test_dependencies:
            if dep not in unique_dependencies:
                unique_dependencies.append(dep)
                dep_id = unique_dependencies.index(dep)
                dep_check_code += gen_dep_check(dep_id, dep)
            else:
                dep_id = unique_dependencies.index(dep)
            out_data_f.write(':' + str(dep_id))
        out_data_f.write('\n')
    return dep_check_code


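# Illustrative effect of write_dependencies() (a sketch, not executed),
# assuming an initially empty unique_dependencies list and
# test_dependencies=['MBEDTLS_AES_C', 'MBEDTLS_GCM_C']: the line
# 'depends_on:0:1' is written to the intermediate data file and check code
# for dependency Ids 0 and 1 is returned.
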
INT_VAL_REGEX = re.compile(r'-?(\d+|0x[0-9a-f]+)$', re.I)
def val_is_int(val: str) -> bool:
    """Whether val is suitable as an 'int' parameter in the .datax file."""
    if not INT_VAL_REGEX.match(val):
        return False
    # Limit the range to what is guaranteed to get through strtol()
    return abs(int(val, 0)) <= 0x7fffffff


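# Illustrative results of val_is_int() (a sketch, not executed):
#   val_is_int('123') == True          # plain decimal constant
#   val_is_int('-0x7fffffff') == True  # fits in int32_t
#   val_is_int('0x80000000') == False  # out of range, becomes an 'exp'
#   val_is_int('MACRO+1') == False     # expression, becomes an 'exp'
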
def write_parameters(out_data_f, test_args, func_args, unique_expressions):
    """
    Writes test parameters to the intermediate data file, replacing
    the string form with identifiers. Also, generates expression
    check code.

    :param out_data_f: Output intermediate data file
    :param test_args: Test parameters
    :param func_args: Function arguments
    :param unique_expressions: Mutable list to track unique
           expressions that are global to this re-entrant function.
    :return: Returns expression check code.
    """
    expression_code = ''
    for i, _ in enumerate(test_args):
        typ = func_args[i]
        val = test_args[i]

        # Pass small integer constants literally. This reduces the size of
        # the C code. Register anything else as an expression.
        if typ == 'int' and not val_is_int(val):
            typ = 'exp'
            if val not in unique_expressions:
                unique_expressions.append(val)
                # exp_id can be derived from len(). But for
                # readability and consistency with the existing cases,
                # let's use index().
                exp_id = unique_expressions.index(val)
                expression_code += gen_expression_check(exp_id, val)
                val = exp_id
            else:
                val = unique_expressions.index(val)
        out_data_f.write(':' + typ + ':' + str(val))
    out_data_f.write('\n')
    return expression_code


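# Illustrative output line of write_parameters() (a sketch, not executed),
# assuming an initially empty unique_expressions list, func_args
# ['int', 'char*', 'int', 'int', 'int'] and the docstring's sample arguments
# ['MBEDTLS_CIPHER_AES_128_GCM', '"AES-128-GCM"', '128', '8', '-1']:
#   :exp:0:char*:"AES-128-GCM":int:128:int:8:int:-1
# i.e. the macro is registered as expression 0 while small integer constants
# are passed literally.
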
def gen_suite_dep_checks(suite_dependencies, dep_check_code, expression_code):
    """
    Generates preprocessor checks for test suite dependencies.

    :param suite_dependencies: Test suite dependencies read from the
           .function file.
    :param dep_check_code: Dependency check code
    :param expression_code: Expression check code
    :return: Dependency and expression code guarded by test suite
             dependencies.
    """
    if suite_dependencies:
        preprocessor_check = gen_dependencies_one_line(suite_dependencies)
        dep_check_code = '''
{preprocessor_check}
{code}
#endif
'''.format(preprocessor_check=preprocessor_check, code=dep_check_code)
        expression_code = '''
{preprocessor_check}
{code}
#endif
'''.format(preprocessor_check=preprocessor_check, code=expression_code)
    return dep_check_code, expression_code


def get_function_info(func_info, function_name, line_no):
    """Look up information about a test function by name.

    Raise an informative exception if function_name is not found.

    :param func_info: dictionary mapping function names to their information.
    :param function_name: the function name as written in the .function and
                          .data files.
    :param line_no: line number for error messages.
    :return: Function information (id, args).
    """
    test_function_name = 'test_' + function_name
    if test_function_name not in func_info:
        raise GeneratorInputError("%d: Function %s not found!" %
                                  (line_no, test_function_name))
    return func_info[test_function_name]


def gen_from_test_data(data_f, out_data_f, func_info, suite_dependencies):
    """
    This function reads test case name, dependencies and test vectors
    from the .data file. This information is correlated with the test
    functions file for generating an intermediate data file replacing
    the strings for test function names, dependencies and integer
    constant expressions with identifiers. Mainly for optimising
    space for on-target execution.
    It also generates test case dependency check code and expression
    evaluation code.

    :param data_f: Data file object
    :param out_data_f: Output intermediate data file
    :param func_info: Dict keyed by function and with function id
           and arguments info
    :param suite_dependencies: Test suite dependencies
    :return: Returns dependency and expression check code
    """
    unique_dependencies = []
    unique_expressions = []
    dep_check_code = ''
    expression_code = ''
    for line_no, test_name, function_name, test_dependencies, test_args in \
            parse_test_data(data_f):
        out_data_f.write(test_name + '\n')

        # Write dependencies
        dep_check_code += write_dependencies(out_data_f, test_dependencies,
                                             unique_dependencies)

        # Write test function name
        func_id, func_args = \
            get_function_info(func_info, function_name, line_no)
        out_data_f.write(str(func_id))

        # Write parameters
        if len(test_args) != len(func_args):
            raise GeneratorInputError("%d: Invalid number of arguments in test "
                                      "%s. See function %s signature." %
                                      (line_no, test_name, function_name))
        expression_code += write_parameters(out_data_f, test_args, func_args,
                                            unique_expressions)

        # Write a newline as test case separator
        out_data_f.write('\n')

    dep_check_code, expression_code = gen_suite_dep_checks(
        suite_dependencies, dep_check_code, expression_code)
    return dep_check_code, expression_code


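# Illustrative intermediate (.datax) block produced by gen_from_test_data()
# (a sketch, not executed), assuming the docstring's sample test case and
# that enc_dec_buf was assigned function Id 0 and its dependencies/macro
# expression got Ids 0, 1 and 0:
#   AES 128 GCM Encrypt and decrypt 8 bytes
#   depends_on:0:1
#   0:exp:0:char*:"AES-128-GCM":int:128:int:8:int:-1
#   (followed by a blank separator line)
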
def add_input_info(funcs_file, data_file, template_file,
                   c_file, snippets):
    """
    Add generator input info in snippets.

    :param funcs_file: Functions file object
    :param data_file: Data file object
    :param template_file: Template file object
    :param c_file: Output C file object
    :param snippets: Dictionary to contain code pieces to be
                     substituted in the template.
    :return:
    """
    snippets['test_file'] = c_file
    snippets['test_main_file'] = template_file
    snippets['test_case_file'] = funcs_file
    snippets['test_case_data_file'] = data_file


def read_code_from_input_files(platform_file, helpers_file,
                               out_data_file, snippets):
    """
    Read code from input files and create substitutions for replacement
    strings in the template file.

    :param platform_file: Platform file object
    :param helpers_file: Helper functions file object
    :param out_data_file: Output intermediate data file object
    :param snippets: Dictionary to contain code pieces to be
                     substituted in the template.
    :return:
    """
    # Read helpers
    with open(helpers_file, 'r') as help_f, open(platform_file, 'r') as \
            platform_f:
        snippets['test_common_helper_file'] = helpers_file
        snippets['test_common_helpers'] = help_f.read()
        snippets['test_platform_file'] = platform_file
        snippets['platform_code'] = platform_f.read().replace(
            'DATA_FILE', out_data_file.replace('\\', '\\\\'))  # escape '\'


def write_test_source_file(template_file, c_file, snippets):
|
|
|
|
"""
|
|
|
|
Write output source file with generated source code.
|
|
|
|
|
|
|
|
:param template_file: Template file name
|
|
|
|
:param c_file: Output source file
|
|
|
|
:param snippets: Generated and code snippets
|
|
|
|
:return:
|
|
|
|
"""
|
2022-11-03 18:49:29 +01:00
|
|
|
|
|
|
|
# Create a placeholder pattern with the correct named capture groups
|
|
|
|
# to override the default provided with Template.
|
|
|
|
# Match nothing (no way of escaping placeholders).
|
|
|
|
escaped = "(?P<escaped>(?!))"
|
|
|
|
# Match the "__MBEDTLS_TEST_TEMPLATE__PLACEHOLDER_NAME" pattern.
|
|
|
|
named = "__MBEDTLS_TEST_TEMPLATE__(?P<named>[A-Z][_A-Z0-9]*)"
|
|
|
|
# Match nothing (no braced placeholder syntax).
|
|
|
|
braced = "(?P<braced>(?!))"
|
|
|
|
# If not already matched, a "__MBEDTLS_TEST_TEMPLATE__" prefix is invalid.
|
|
|
|
invalid = "(?P<invalid>__MBEDTLS_TEST_TEMPLATE__)"
|
2022-11-09 18:27:33 +01:00
|
|
|
placeholder_pattern = re.compile("|".join([escaped, named, braced, invalid]))
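    # For illustration: with this pattern, a template line containing
    #     __MBEDTLS_TEST_TEMPLATE__FUNCTIONS_CODE
    # is replaced below with snippets['functions_code'] (keys are upper-cased
    # before substitution), while '$' characters appearing in the template's
    # C code pass through untouched, unlike with string.Template's default
    # placeholder syntax. FUNCTIONS_CODE is just one example snippet key.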

    with open(template_file, 'r') as template_f, open(c_file, 'w') as c_f:
        for line_no, line in enumerate(template_f.readlines(), 1):
            # Update line number. +1 as #line directive sets next line number
            snippets['line_no'] = line_no + 1
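            # (The template can reference this value as
            # __MBEDTLS_TEST_TEMPLATE__LINE_NO, typically from a C #line
            # directive, so that compiler diagnostics point back at the
            # template rather than at the generated .c file.)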
            template = string.Template(line)
            template.pattern = placeholder_pattern
            snippets = {k.upper(): v for (k, v) in snippets.items()}
            code = template.substitute(**snippets)
            c_f.write(code)


def parse_function_file(funcs_file, snippets):
    """
    Parse function file and generate function dispatch code.

    :param funcs_file: Functions file name
    :param snippets: Dictionary to contain code pieces to be
                     substituted in the template.
    :return:
    """
    with FileWrapper(funcs_file) as funcs_f:
        suite_dependencies, dispatch_code, func_code, func_info = \
            parse_functions(funcs_f)
        snippets['functions_code'] = func_code
        snippets['dispatch_code'] = dispatch_code
        return suite_dependencies, func_info


def generate_intermediate_data_file(data_file, out_data_file,
                                    suite_dependencies, func_info, snippets):
    """
    Generates the intermediate data file from the input data file and
    the information read from the functions file.

    :param data_file: Data file name
    :param out_data_file: Output/Intermediate data file
    :param suite_dependencies: List of suite dependencies.
    :param func_info: Function info parsed from functions file.
    :param snippets: Dictionary to contain code pieces to be
                     substituted in the template.
    :return:
    """
    with FileWrapper(data_file) as data_f, \
            open(out_data_file, 'w') as out_data_f:
        dep_check_code, expression_code = gen_from_test_data(
            data_f, out_data_f, func_info, suite_dependencies)
        snippets['dep_check_code'] = dep_check_code
        snippets['expression_code'] = expression_code


def generate_code(**input_info):
    """
    Generates C source code from test suite file, data file, common
    helpers file and platform file.

    input_info expands to the following parameters:
    funcs_file: Functions file object
    data_file: Data file object
    template_file: Template file object
    platform_file: Platform file object
    helpers_file: Helper functions file object
    suites_dir: Test suites dir
    c_file: Output C file object
    out_data_file: Output intermediate data file object
    :return:
    """
    funcs_file = input_info['funcs_file']
    data_file = input_info['data_file']
    template_file = input_info['template_file']
    platform_file = input_info['platform_file']
    helpers_file = input_info['helpers_file']
    suites_dir = input_info['suites_dir']
    c_file = input_info['c_file']
    out_data_file = input_info['out_data_file']
    for name, path in [('Functions file', funcs_file),
                       ('Data file', data_file),
                       ('Template file', template_file),
                       ('Platform file', platform_file),
                       ('Helpers code file', helpers_file),
                       ('Suites dir', suites_dir)]:
        if not os.path.exists(path):
            raise IOError("ERROR: %s [%s] not found!" % (name, path))

    snippets = {'generator_script': os.path.basename(__file__)}
    read_code_from_input_files(platform_file, helpers_file,
                               out_data_file, snippets)
    add_input_info(funcs_file, data_file, template_file,
                   c_file, snippets)
    suite_dependencies, func_info = parse_function_file(funcs_file, snippets)
    generate_intermediate_data_file(data_file, out_data_file,
                                    suite_dependencies, func_info, snippets)
    write_test_source_file(template_file, c_file, snippets)


def main():
    """
    Command line parser and generator entry point.

    :return:
    """
    parser = argparse.ArgumentParser(
        description='Dynamically generate test suite code.')

    parser.add_argument("-f", "--functions-file",
                        dest="funcs_file",
                        help="Functions file",
                        metavar="FUNCTIONS_FILE",
                        required=True)

    parser.add_argument("-d", "--data-file",
                        dest="data_file",
                        help="Data file",
                        metavar="DATA_FILE",
                        required=True)

    parser.add_argument("-t", "--template-file",
                        dest="template_file",
                        help="Template file",
                        metavar="TEMPLATE_FILE",
                        required=True)

    parser.add_argument("-s", "--suites-dir",
                        dest="suites_dir",
                        help="Suites dir",
                        metavar="SUITES_DIR",
                        required=True)

    parser.add_argument("--helpers-file",
                        dest="helpers_file",
                        help="Helpers file",
                        metavar="HELPERS_FILE",
                        required=True)

    parser.add_argument("-p", "--platform-file",
                        dest="platform_file",
                        help="Platform code file",
                        metavar="PLATFORM_FILE",
                        required=True)

    parser.add_argument("-o", "--out-dir",
                        dest="out_dir",
                        help="Dir where generated code and scripts are copied",
                        metavar="OUT_DIR",
                        required=True)
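    # A typical invocation looks roughly like the following; the paths are
    # illustrative only, as the build system normally drives this script:
    #   generate_test_code.py -f suites/test_suite_xyz.function \
    #       -d suites/test_suite_xyz.data -t suites/main_test.function \
    #       -p suites/host_test.function -s suites \
    #       --helpers-file suites/helpers.function -o <output dir>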

    args = parser.parse_args()

    data_file_name = os.path.basename(args.data_file)
    data_name = os.path.splitext(data_file_name)[0]

    out_c_file = os.path.join(args.out_dir, data_name + '.c')
    out_data_file = os.path.join(args.out_dir, data_name + '.datax')
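    # For example (illustrative name): a data file 'test_suite_xyz.data'
    # produces 'test_suite_xyz.c' and 'test_suite_xyz.datax' in the output
    # directory.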

    out_c_file_dir = os.path.dirname(out_c_file)
    out_data_file_dir = os.path.dirname(out_data_file)
    for directory in [out_c_file_dir, out_data_file_dir]:
        if not os.path.exists(directory):
            os.makedirs(directory)

    generate_code(funcs_file=args.funcs_file, data_file=args.data_file,
                  template_file=args.template_file,
                  platform_file=args.platform_file,
                  helpers_file=args.helpers_file, suites_dir=args.suites_dir,
                  c_file=out_c_file, out_data_file=out_data_file)


if __name__ == "__main__":
    try:
        main()
    except GeneratorInputError as err:
        sys.exit("%s: input error: %s" %
                 (os.path.basename(sys.argv[0]), str(err)))