support release test
Description
Activity
Jan-Simon Moeller July 19, 2022 at 12:15 PM
du erp July 18, 2022 at 7:23 AM
Hi, Jan-Simon
I have understood the principle of "artifactorial", but I would still like to see some of the details.
So I want to see the file "pyartiproxy.py", to confirm my guesses below:
I guess the specific directory that files are uploaded to on "artifactorial" is defined in "pyartiproxy.py".
The upload directory is the /home/agl directory, rather than the /pub directory.
Moreover, "pyartiproxy.py" will generate subdirectories under the /home/agl directory according to the time the file is submitted.
For example, the following directory means the file was submitted at 15:02 on May 12, 2022:
/home/agl/2022/05/12/15/02
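A minimal sketch of how such a timestamped path could be built (this is only my guess, not the actual "pyartiproxy.py" code; the base directory and format are taken from the example above):

    from datetime import datetime
    from pathlib import PurePosixPath

    # Guessed base directory, taken from the example above.
    BASE_DIR = PurePosixPath("/home/agl")

    def upload_path(submitted_at: datetime) -> PurePosixPath:
        """Build a per-submission directory like /home/agl/2022/05/12/15/02."""
        return BASE_DIR / submitted_at.strftime("%Y/%m/%d/%H/%M")

    # Example: a file submitted at 15:02 on May 12, 2022.
    print(upload_path(datetime(2022, 5, 12, 15, 2)))  # /home/agl/2022/05/12/15/02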
du erp July 15, 2022 at 3:09 AM
Hi, Jan-Simon
Replies to your questions:
The "agl-test" framework encapsulates pytest and aims to provide a unified entry point for executing test sets. It can run various test sets, even ones that come from different test frameworks, process their logs uniformly, and generate a complete test report (see the sketch after the list below).
This makes it convenient to bring many targets under one test run, so the tests cover a wider range of objects and are more comprehensive.
At present, we plan to support the following test sets in "agl-test":
1. Port the test sets under Fuego and AGL-JTA
2. Retain the test sets under pyagl and agl-ptest (so "agl-test" will depend on "agl-ptest")
3. Migrate new test sets (with upstream)
4. Add new test sets (without upstream)
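As a rough illustration of the unified entry point (not the actual "agl-test" code; the test-set paths here are made up), the runner could call pytest.main() once per test set and build the first-level summary from the exit codes:

    import pytest

    # Hypothetical test-set directories; the real agl-test layout may differ.
    TEST_SETS = ["test-sets/fuego", "test-sets/pyagl"]

    def run_all() -> None:
        """Run every test set through one entry point and summarize per set."""
        results = {}
        for test_set in TEST_SETS:
            # pytest.main() returns an exit code; 0 means all tests passed.
            # The per-set (second-level) details go into a JUnit XML file.
            exit_code = pytest.main([test_set, f"--junitxml={test_set}/result.xml"])
            results[test_set] = "passed" if exit_code == 0 else "failed"
        # First-level summary across all test sets.
        for test_set, status in results.items():
            print(f"{test_set}: {status}")

    if __name__ == "__main__":
        run_all()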
The output of a test run is summarized on two levels: the first level is the summary across all test sets, and the second level is the summary of a single test set. For now these are displayed in HTML format; other formats can be considered later.
The console shows only brief results, i.e. whether each test case passed or failed; the detailed logs can be viewed by downloading the "agl-test-log-xxx.zip" file. This avoids printing so many logs on the console that it becomes hard to find the information we need.
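For instance, packing the detailed logs into such a zip could be a single call (the log directory name is an assumption, and "xxx" stands for a run-specific id):

    import shutil

    # "agl-test-log-xxx" is the archive base name; "xxx" is a placeholder for the run id.
    shutil.make_archive("agl-test-log-xxx", "zip", root_dir="logs")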
The results shown on the console look like this:
Use LAVA jobs to automate the release tests.
For example, we could add an image-build option that cross-compiles the tests and puts them into the rootfs.
This way we solve the problem that cross-compiling is inconvenient inside LAVA, and abandoning the Jenkins job will not cause any trouble.
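A hedged sketch of what that automation could look like, submitting a release-test job definition to a LAVA server over its XML-RPC API (the server URL, credentials, and job file name below are all placeholders):

    import xmlrpc.client

    # Placeholder credentials and server; use a real LAVA user token here.
    USER = "admin"
    TOKEN = "lava-api-token"
    SERVER = f"https://{USER}:{TOKEN}@lava.example.com/RPC2"

    def submit_release_job(definition_path: str) -> int:
        """Submit a LAVA job definition (YAML) and return the new job id."""
        with open(definition_path) as f:
            definition = f.read()
        proxy = xmlrpc.client.ServerProxy(SERVER)
        # scheduler.submit_job is part of LAVA's XML-RPC API.
        return proxy.scheduler.submit_job(definition)

    if __name__ == "__main__":
        print(submit_release_job("release-tests.yaml"))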