Testing

Overview

The test-driven development mode is used during the development process: you can develop new test cases or modify existing ones to verify new or enhanced system features. This helps you produce high-quality code in the development phase.

Directory Structure

test/
├── developertest             # Developer test framework
│   ├── aw                   # Static library of the test framework
│   ├── config               # Test framework configuration
│   ├── examples             # Test case examples
│   ├── src                  # Source code of the test framework
│   ├── third_party          # Adaptation code for third-party modules on which the test framework depends
│   ├── start.bat            # Developer test entry for Windows
│   ├── start.sh             # Developer test entry for Linux
│   └── BUILD.gn             # Build entry of the test framework
├── xdevice                   # Basic component of the test framework
│   ├── config               # Framework configuration file
│   ├── extension            # Extension for the basic component
│   ├── resource             # Test resources of the basic component
│   └── src                  # Source code of the basic component
└── xts                       # X test suite

Constraints

The test tool environment must meet the following requirements:

  1. Python version: 3.7.5 or later
  2. Paramiko version: 2.7.1 or later
  3. Setuptools version: 40.8.0 or later
  4. RSA version: 4.0 or later
  5. NFS version: V4 or later (required when the device supports serial port connection but not hdc)
  6. pySerial version: 3.3 or later (required when the device supports serial port connection but not hdc)
  7. OS version: Windows 10 or later; Ubuntu 18.04

Installation

A Python environment is required. Install the following dependencies before using the test framework:

  1. Run the following command to install the Linux extension component Readline:

    sudo apt-get install libreadline-dev
    

    If the installation is successful, the following output is displayed:

    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    libreadline-dev is already the newest version (7.0-3).
    0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded.
    
  2. Run the following command to install the plug-in Setuptools:

    pip3 install setuptools
    

    If the installation is successful, the following output is displayed:

    Requirement already satisfied: setuptools in d:\programs\python37\lib\site-packages (41.2.0)
    
  3. Run the following command to install the plug-in Paramiko:

    pip3 install paramiko
    

    If the installation is successful, the following output is displayed:

    Installing collected packages: pycparser, cffi, pynacl, bcrypt, cryptography, paramiko
    Successfully installed bcrypt-3.2.0 cffi-1.14.4 cryptography-3.3.1 paramiko-2.7.2 pycparser-2.20 pynacl-1.4.0
    
  4. Run the following command to install the Python plug-in RSA:

    pip3 install rsa
    

    If the installation is successful, the following output is displayed:

    Installing collected packages: pyasn1, rsa
    Successfully installed pyasn1-0.4.8 rsa-4.7
    
  5. Run the following command to install the serial port plug-in pySerial for Python on the local PC:

    pip3 install pyserial
    

    If the installation is successful, the following output is displayed:

    Requirement already satisfied: pyserial in d:\programs\python37\lib\site-packages\pyserial-3.4-py3.7.egg (3.4)
    
  6. If the device outputs test results only through the serial port, install an NFS server.

    For example, to install haneWIN NFS Server 1.2.50 for Windows, download the installation package from https://www.hanewin.net/nfs-e.htm.

    For Linux, run the following command:

    sudo apt install nfs-kernel-server
    

    If the installation is successful, the following output is displayed:

    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    nfs-kernel-server is already the newest version (1:1.3.4-2.1ubuntu5.3).
    0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded.
    

Test Cases

  • Test case specifications

    • Naming rules

      The source file name of the test case must be consistent with the test suite content. A test suite can contain multiple test cases and has only one test source file that is globally unique. Source files are named in the [Feature]_[Function]_[Subfunction 1]_[Subfunction 1.1] format. Subfunctions can be further divided.

      A source file name consists of lowercase letters and underscores (_) and must end with test, for example, calculator_add_test.cpp in developertest/examples/calculator.

    • Test case coding specifications

      The test cases must comply with the feature code coding specifications. In addition, necessary case description information must be added. For details, see Test case template.

    • Test case compilation and configuration specifications

      The test cases are compiled in GN mode. The configuration must comply with the compilation guide of the open-source project.

  • Test case template

    For details, see the test case developertest/examples/calculator/test/unittest/common/calculator_add_test.cpp.

  • Directories planned for test cases

    subsystem                  # Subsystem and system module
    ├── parts                  # Components
    │   └── test               # Module test
    │       ├── unittest       # Unit test
    │       │   ├── common     # Common test cases
    │       │   ├── phone      # Test case of the smartphone form
    │       │   ├── ivi        # Test case of the head unit form
    │       │   └── liteos-a   # Test case of the IP camera form
    │       └── moduletest     # Module test
    │           ├── common
    │           ├── phone
    │           ├── ivi
    │           └── liteos-a
    └── test                   # Subsystem test
        ├── resource           # Test resources
        │   └── module
        │       ├── common
        │       ├── phone
        │       ├── ivi
        │       └── liteos-a
        └── systemtest         # System test
            ├── common
            ├── phone
            ├── ivi
            └── liteos-a
    

    NOTE: phone, ivi, and liteos-a are only examples of different device forms. If the test cases for the same feature are identical across development boards, store them in the common directory. If the test cases for the same feature differ by device form (for example, because of kernel or chip platform differences), store them in the corresponding device form directories.

  • Writing a test case

    1. Add comments to the test case header file.

    2. Include the gtest header file and use the testing::ext namespace.

    3. Add the header file to test.

    4. Define test suites (test classes).

    5. Implement specific test cases of the test suite, including test case comments and logic implementation.

    6. Set the test case compilation configuration.

      NOTE: The following examples are provided for reference:
      • For devices supporting the serial port only: developertest/examples/lite/cxx_demo/test/unittest/common/calc_subtraction_test.cpp
      • For devices supporting hdc: developertest/examples/calculator/test/unittest/common/calculator_add_test.cpp

      • SetUp and TearDown are the processing logic before and after each test case in the test suite is executed.
      • SetUpTestCase and TearDownTestCase are the processing logic before and after all cases in the test suite are executed.
      • HWTEST usage: This macro is applicable only to simple tests that do not depend on SetUp and TearDown. It is not suitable when multiple test scenarios require the same data configuration, because such test cases may affect each other and are not independent.
      • Use the printf function to print logs.
  • Writing a test case compilation file

    • Define test case compilation and building objectives.

      1. Add comments to the test case compilation header file.
      2. Import the test case compilation template file.
      3. Specify the output path of the test case file.
      4. Configure the directory contained in the test case compilation dependency.
      5. Specify the file name generated by the test case compilation target.
      6. Write a specific test case compilation script and add the source files, configurations, and dependencies involved in the compilation.
      7. Group the target test case files by condition. The group name is fixed to unittest/moduletest.
    • If there are multiple test suites, define the common compilation configuration.

    • Add test cases to the build system.

      NOTE: The following examples are provided for reference:

      • For devices supporting serial port connection only: test case compilation configuration developertest/examples/lite/cxx_demo/test/unittest/common/BUILD.gn; compilation entry configuration developertest/examples/lite/BUILD.gn
      • For devices supporting hdc connection: test case compilation configuration developertest/examples/calculator/test/unittest/common/BUILD.gn; compilation entry configuration developertest/examples/ohos.build
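    The seven steps above can be sketched as a BUILD.gn file. This is an illustrative sketch modeled on the calculator example referenced in this document; the //build/test.gni import, the ohos_unittest template, and all paths and target names are assumptions that may differ in your code base.

```gn
# 1. Comment header for the test case build file (illustrative).
import("//build/test.gni")                       # 2. Import the test case template.

module_output_path = "subsystem_examples/calculator"  # 3. Output path of the test case file.

config("module_private_config") {                # 4. Directories in the compilation dependency.
  visibility = [ ":*" ]
  include_dirs = [ "../../../include" ]          # illustrative path
}

ohos_unittest("CalculatorAddTest") {             # 5. File name generated by the target.
  module_out_path = module_output_path
  sources = [ "calculator_add_test.cpp" ]        # 6. Source files, configs, dependencies.
  configs = [ ":module_private_config" ]
}

group("unittest") {                              # 7. Group name fixed to unittest.
  testonly = true
  deps = [ ":CalculatorAddTest" ]
}
```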
  • Writing a test case resource file

    1. Create the resource directory in the test directory of a component or module.

    2. Create a directory for a device form, for example, phone, in the resource directory.

    3. Create a folder named after the module in the device form directory, for example, testmodule.

    4. Create the ohos_test.xml file in the folder named after the module. The file content is in the following format:

      <?xml version="1.0" encoding="UTF-8"?>
      <configuration ver="2.0">
          <target name="DetectorFileTest">
              <preparer>
                  <option name="push" value="test.txt -> /data/test/resource" src="res"/>
              </preparer>
          </target>
      </configuration>
      
    5. Define resource_config_file in the compilation configuration file of the test case to specify the resource file ohos_test.xml.

      NOTE: The resource file is used to push the test.txt file in the resource directory to the /data/test/resource directory of the device under test by running the hdc push command.

    6. Configure the ohos_test.xml file, which contains the following tags:

      NOTE: The ohos_test.xml file supports the following tags and attributes:
      • target_name: name of the test unit, usually defined in the BUILD.gn file in the test directory.
      • preparer: action to take before the test unit is executed.
      • cleaner: action to take after the test unit is executed.
      • src="res": test resources are stored in the resource directory under the root directory of the subsystem.
      • src="out": test resources are stored in the out/release/$subsystem name directory.
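    Step 5 can be expressed, under the same assumptions, as a single variable in the test target of the BUILD.gn file. The target name and path below are hypothetical and only illustrate where resource_config_file goes.

```gn
# Hypothetical target: point resource_config_file at the ohos_test.xml created above.
ohos_unittest("DetectorFileTest") {
  # ...sources, configs, and deps as in a normal test target...
  resource_config_file = "//subsystem/parts/test/resource/phone/testmodule/ohos_test.xml"
}
```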

  • Test case levels

    • Basic (level 1)
    • Major (level 2)
    • Minor (level 3)
    • Uncommon (level 4)

Test Framework Usage

  • (Optional) Install the XDevice module.

    1. Open the xdevice installation directory, for example, test/xdevice in Windows.

    2. Open the console and run the following command:

      python setup.py install
      

      The following output is displayed when the installation is complete.

      Installed d:\programs\python37\lib\site-packages\xdevice-0.0.0-py3.7.egg
      Processing dependencies for xdevice==0.0.0
      Finished processing dependencies for xdevice==0.0.0
      
  • Configure the developer test module.

    Configuration file: developertest/config/user_config.xml

    1. Modify basic configuration parameters.

      [build] # Set build parameters of the test case.

      <build>
          <example>false</example>
          <version>false</version>
          <testcase>true</testcase>
          ... ...
      </build>
      

      NOTE:
      • example: whether to build the test case example. Default value: false.
      • version: whether to build the test version. Default value: false.
      • testcase: whether to build the test case. Default value: true.

    2. For devices that support the Harmony device connector (hdc), modify the configuration file as follows:

      [device] # Configure the device information with the "usb-hdc" attribute, including the test device IP address and the matched hdc port.

      <device type="usb-hdc">
          <ip>192.168.1.1</ip>
          <port>9111</port>
          <sn></sn>
      </device>
      
    3. For devices that support serial port connection only, modify the configuration file as follows:

      [board_info] # Configure development board information.

      <board_info>
          <board_series>hispark</board_series>
          <board_type>taurus</board_type>
          <board_product>ipcamera</board_product>
          <build_command>hb build</build_command>
      </board_info>
      

      NOTE:
      • board_series: development board series. Default value: hispark.
      • board_type: development board type. Default value: taurus.
      • board_product: target product. Default value: ipcamera.
      • build_command: command used for building the test version and test cases. Default value: hb build.

      [device] # Configure the serial port information with the "ipcamera" attribute, including the COM port and baud rate. For example:

      <device type="com" label="ipcamera">
          <serial>
              <com>COM1</com>
              <type>cmd</type>
              <baud_rate>115200</baud_rate>
              <data_bits>8</data_bits>
              <stop_bits>1</stop_bits>
              <timeout>1</timeout>
          </serial>
      </device>
      
  • Modify the configuration of the developertest component.

    (Optional) If a test case has been compiled, specify the compilation output path of the test case. In this case, the test platform will not recompile the test case.

    Configuration file: config/user_config.xml

    1. Specify the output path of the test case and the compilation output directory. Example:

      <test_cases>
          <dir>/home/source_code/out/release/tests</dir>
      </test_cases>
      
    2. For devices that support serial port connection only, specify the NFS directory for the PC (host_dir) and the corresponding directory for the development board (board_dir) inside the <NFS> tags. For example:

      <NFS>
          <host_dir>D:\nfs</host_dir>
          <board_dir>user</board_dir>
      </NFS>
      
  • Prepare the test environment. Check that the test environment meets the following conditions if the tested device supports only serial ports:

    • The system image and file system have been burnt into a development board and are running properly on the development board. For example, in system mode, the device prompt OHOS# is displayed during shell login, indicating that the system is running properly.
    • The development host has been connected to the serial port of the development board and the network port.
    • The IP addresses of the development host and development board are in the same network segment and can ping each other.
    • An empty directory is created on the development host for mounting test cases through NFS, and the NFS service is started properly.
  • Run test suites.

    • Go to the test/developertest directory and start the test framework.

      1. Run the following command to start the test framework in Windows.

        start.bat
        
      2. Run the following command to start the test framework in Linux.

        ./start.sh
        
    • Select a device form.

      Configure device forms based on the actual development board, for example, developertest/config/framework_config.xml.

    • Run the test command.

      1. To query the subsystems, modules, product forms, and test types supported by test cases, run the show commands.

        usage:
            show productlist      Query the supported product forms
            show typelist         Query the supported test types
            show subsystemlist    Query the supported subsystems
            show modulelist       Query the supported modules
        
      2. Run the following command to execute the test (-t is mandatory, and -ss and -tm are optional):

        run -t ut -ss test -tm example
        
      3. Use the following parameters to run the test suite for a specific feature or module:

        usage: run [-h] [-p PRODUCTFORM] [-t [TESTTYPE [TESTTYPE ...]]]
            [-ss SUBSYSTEM] [-tm TESTMODULE] [-ts TESTSUIT]
            [-tc TESTCASE] [-tl TESTLEVEL] 
        
        optional arguments:
            -h, --help            show this help message and exit
            -p PRODUCTFORM, --productform PRODUCTFORM    Specified product form
            -t [TESTTYPE [TESTTYPE ...]], --testtype [TESTTYPE [TESTTYPE ...]]
                Specify test type(UT,MST,ST,PERF,ALL)
            -ss SUBSYSTEM, --subsystem SUBSYSTEM    Specify test subsystem
            -tm TESTMODULE, --testmodule TESTMODULE    Specified test module
            -ts TESTSUIT, --testsuite TESTSUIT    Specify test suite
            -tc TESTCASE, --testcase TESTCASE    Specify test case
            -tl TESTLEVEL, --testlevel TESTLEVEL    Specify test level
        
  • View test framework help if needed.

    Run the following command to query commands supported by the test platform:

    help
    
  • Run the following command to exit the self-test platform:

    quit
    

Test Result and Logs

  • Test logs and test reports are generated after you execute the test commands.

  • Test result

    • Reports are displayed on the console. The root directory of the test result is as follows:

      reports/xxxx-xx-xx-xx-xx-xx
      
    • Test case formatting result

      result/
      
    • Test case logs

      log/plan_log_xxxx-xx-xx-xx-xx-xx.log
      
    • Report summary

      summary_report.html
      
    • Report details

      details_report.html
      
  • Test framework logs

    reports/platform_log_xxxx-xx-xx-xx-xx-xx.log
    
  • Latest test reports

    reports/latest
    

Repositories Involved

Testing subsystem

test_developertest

test_xdevice

test_xdevice_extension