mirror of https://github.com/httprunner/httprunner.git
synced 2026-05-12 02:21:29 +08:00

Merge pull request #399 from HttpRunner/pipline

rename testset => testcase
@@ -20,7 +20,7 @@ Former name: `ApiTestEngine`.
 - Supports `function`/`variable`/`extract`/`validate` mechanisms to create full test scenarios.
 - Supports perfect hook mechanism.
 - With `debugtalk.py` plugin, module functions can be auto-discovered in recursive upward directories.
-- Testcases can be run in diverse ways, with single testset, multiple testsets, or entire project folder.
+- Testcases can be run in diverse ways, with single testcase, multiple testcases, or entire project folder.
 - Test report is concise and clear, with detailed log records.
 - With reuse of [`Locust`][Locust], you can run performance test without extra work.
 - CLI command supported, perfect combination with `CI/CD`.

@@ -47,25 +47,43 @@ To see available options, run:

 ```bash
 $ httprunner -h # same as: hrun -h
-usage: httprunner [-h] [-V] [--log-level LOG_LEVEL] [--report-name REPORT_NAME]
-                  [--failfast] [--startproject STARTPROJECT]
-                  [testset_paths [testset_paths ...]]
+usage: main-debug.py [-h] [-V] [--no-html-report]
+                     [--html-report-name HTML_REPORT_NAME]
+                     [--html-report-template HTML_REPORT_TEMPLATE]
+                     [--log-level LOG_LEVEL] [--log-file LOG_FILE]
+                     [--dot-env-path DOT_ENV_PATH] [--failfast]
+                     [--startproject STARTPROJECT]
+                     [--validate [VALIDATE [VALIDATE ...]]]
+                     [--prettify [PRETTIFY [PRETTIFY ...]]]
+                     [testcase_paths [testcase_paths ...]]

 HttpRunner.
 One-stop solution for HTTP(S) testing.

 positional arguments:
-  testset_paths         testset file path
+  testcase_paths        testcase file path

 optional arguments:
-  -h, --help            show this help message and exit
-  -V, --version         show version
-  --log-level LOG_LEVEL
+  -h, --help            show this help message and exit
+  -V, --version         show version
+  --no-html-report      do not generate html report.
+  --html-report-name HTML_REPORT_NAME
+                        specify html report name, only effective when
+                        generating html report.
+  --html-report-template HTML_REPORT_TEMPLATE
+                        specify html report template path.
+  --log-level LOG_LEVEL
+                        Specify logging level, default is INFO.
-  --report-name REPORT_NAME
-                        Specify report name, default is generated time.
-  --failfast            Stop the test run on the first error or failure.
-  --startproject STARTPROJECT
+  --log-file LOG_FILE   Write logs to specified file path.
+  --dot-env-path DOT_ENV_PATH
+                        Specify .env file path, which is useful for keeping
+                        sensitive data.
+  --failfast            Stop the test run on the first error or failure.
+  --startproject STARTPROJECT
+                        Specify new project name.
+  --validate [VALIDATE [VALIDATE ...]]
+                        Validate JSON testcase format.
+  --prettify [PRETTIFY [PRETTIFY ...]]
+                        Prettify JSON testcase format.
 ```

 ## Supported Python Versions

@@ -10,7 +10,7 @@ Take full reuse of Python's existing powerful libraries: [`Requests`][requests],
 - Define testcases in YAML or JSON format in concise and elegant manner.
 - Supports `function`/`variable`/`extract`/`validate` mechanisms to create full test scenarios.
 - With `debugtalk.py` plugin, module functions can be auto-discovered in recursive upward directories.
-- Testcases can be run in diverse ways, with single testset, multiple testsets, or entire project folder.
+- Testcases can be run in diverse ways, with single testcase, multiple testcases, or entire project folder.
 - Test report is concise and clear, with detailed log records. See [`PyUnitReport`][PyUnitReport].
 - With reuse of [`Locust`][Locust], you can run performance test without extra work.
 - CLI command supported, perfect combination with [Jenkins][Jenkins].

@@ -156,7 +156,7 @@ However, the test cases are only `YAML` documents, it is impossible to generate

 To achieve this goal, we can utilize `debugtalk.py` plugin and `variables` mechanisms.

-To be specific, we can create a Python file (`examples/debugtalk.py`) and implement the related algorithm in it. The `debugtalk.py` file can not only be located beside `YAML/JSON` testset file, but also can be in any upward recursive folder. Since we want `debugtalk.py` to be importable, we should put a `__init__.py` in its folder to make it as a Python module.
+To be specific, we can create a Python file (`examples/debugtalk.py`) and implement the related algorithm in it. The `debugtalk.py` file can not only be located beside `YAML/JSON` testcase file, but also can be in any upward recursive folder. Since we want `debugtalk.py` to be importable, we should put a `__init__.py` in its folder to make it as a Python module.

 ```python
 import hashlib

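For readers following along, a minimal `examples/debugtalk.py` might look like the sketch below. It is hypothetical: the hunk above truncates the real file after `import hashlib`, and `gen_random_string`/`gen_md5` are assumed here from the helper names referenced in this commit's YAML demos, not copied from the actual repository file.

```python
import hashlib
import random
import string


def gen_random_string(str_len):
    """Generate a random alphanumeric string, e.g. for a device_sn variable."""
    return ''.join(
        random.choice(string.ascii_letters + string.digits)
        for _ in range(int(str_len))
    )


def gen_md5(*args):
    """Concatenate the arguments and return the MD5 hex digest."""
    content = "".join(str(arg) for arg in args)
    return hashlib.md5(content.encode("utf-8")).hexdigest()
```

With this file in place (plus an `__init__.py` beside it), `${gen_random_string(15)}` in a YAML testcase resolves via auto-discovery.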
@@ -2,19 +2,19 @@

 `HttpRunner` can run testcases in diverse ways.

-You can run single testset by specifying testset file path.
+You can run single testcase by specifying testcase file path.

 ```text
 $ httprunner filepath/testcase.yml
 ```

-You can also run several testsets by specifying multiple testset file paths.
+You can also run several testcases by specifying multiple testcase file paths.

 ```text
 $ httprunner filepath1/testcase1.yml filepath2/testcase2.yml
 ```

-If you want to run testsets of a whole project, you can achieve this goal by specifying the project folder path.
+If you want to run testcases of a whole project, you can achieve this goal by specifying the project folder path.

 ```text
 $ httprunner testcases_folder_path

@@ -2,11 +2,11 @@ It is recommended to write testcases in `YAML` format.

 ## demo

-Here is a testset example of typical scenario: get `token` at the beginning, and each subsequent requests should take the `token` in the headers.
+Here is a testcase example of typical scenario: get `token` at the beginning, and each subsequent requests should take the `token` in the headers.

 ```yaml
 - config:
-    name: "create user testsets."
+    name: "create user testcases."
     variables:
         - user_agent: 'iOS/10.3'
         - device_sn: ${gen_random_string(15)}

@@ -22,8 +22,8 @@ def main_hrun():
         '-V', '--version', dest='version', action='store_true',
         help="show version")
     parser.add_argument(
-        'testset_paths', nargs='*',
-        help="testset file path")
+        'testcase_paths', nargs='*',
+        help="testcase file path")
     parser.add_argument(
         '--no-html-report', action='store_true', default=False,
         help="do not generate html report.")

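The renamed CLI surface can be exercised in isolation. The sketch below is a minimal, self-contained reproduction of the `add_argument` calls visible in this hunk (not the full `main_hrun`), illustrating how the `testcase_paths` positional with `nargs='*'` collects zero or more file paths:

```python
import argparse

# Minimal reproduction of the argument surface shown in the hunk above.
parser = argparse.ArgumentParser(description="HttpRunner.")
parser.add_argument(
    '-V', '--version', dest='version', action='store_true',
    help="show version")
parser.add_argument(
    'testcase_paths', nargs='*',
    help="testcase file path")
parser.add_argument(
    '--no-html-report', action='store_true', default=False,
    help="do not generate html report.")

# nargs='*' gathers all remaining positionals into a list.
args = parser.parse_args(['a.yml', 'b.yml'])
```

Because `nargs='*'` allows an empty list, running with no paths is also valid, which is why the real CLI can fall back to other subcommands like `--startproject`.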
@@ -50,10 +50,10 @@ def main_hrun():
         help="Specify new project name.")
     parser.add_argument(
         '--validate', nargs='*',
-        help="Validate JSON testset format.")
+        help="Validate JSON testcase format.")
     parser.add_argument(
         '--prettify', nargs='*',
-        help="Prettify JSON testset format.")
+        help="Prettify JSON testcase format.")

     args = parser.parse_args()
     logger.setup_logger(args.log_level, args.log_file)

@@ -82,7 +82,7 @@ def main_hrun():
             failfast=args.failfast
         )
         runner.run(
-            args.testset_paths,
+            args.testcase_paths,
             dot_env_path=args.dot_env_path
         )
     except Exception:

@@ -423,7 +423,7 @@ def gen_cartesian_product(*args):


 def validate_json_file(file_list):
-    """ validate JSON testset format
+    """ validate JSON testcase format
     """
     for json_file in set(file_list):
         if not json_file.endswith(".json"):

@@ -442,7 +442,7 @@ def validate_json_file(file_list):


 def prettify_json_file(file_list):
-    """ prettify JSON testset format
+    """ prettify JSON testcase format
     """
     for json_file in set(file_list):
         if not json_file.endswith(".json"):

@@ -13,7 +13,7 @@ else:
     color_print("Miss debugging type.", "RED")
     example = "\n".join([
         "e.g.",
-        "python main-debug.py hrun /path/to/testset_file",
-        "python main-debug.py locusts -f /path/to/testset_file"
+        "python main-debug.py hrun /path/to/testcase_file",
+        "python main-debug.py locusts -f /path/to/testcase_file"
     ])
     color_print(example, "yellow")

@@ -1,5 +1,5 @@
 - config:
-    name: "user management testset."
+    name: "user management testcase."
     parameters:
         - user_agent: ["iOS/10.1", "iOS/10.2", "iOS/10.3"]
         - username-password:

@@ -1,5 +1,5 @@
 - config:
-    name: "create user testsets."
+    name: "create user testcases."
     variables:
         - user_agent: 'iOS/10.3'
         - device_sn: ${gen_random_string(15)}

@@ -1,5 +1,5 @@
 - config:
-    name: "user management testset."
+    name: "user management testcase."
     variables:
         - user_agent: 'iOS/10.3'
         - device_sn: ${gen_random_string(15)}

@@ -1,5 +1,5 @@
 - config:
-    name: "create user testsets."
+    name: "create user testcases."
     variables:
         - device_sn: 'HZfFBh6tU59EdXJ'
     request:

@@ -11,12 +11,12 @@ from tests.base import ApiServerUnittest
 class TestHttpRunner(ApiServerUnittest):

     def setUp(self):
-        self.testcase_cli_path = "tests/data/demo_testset_cli.yml"
+        self.testcase_cli_path = "tests/data/demo_testcase_cli.yml"
         self.testcase_file_path_list = [
             os.path.join(
-                os.getcwd(), 'tests/data/demo_testset_hardcode.yml'),
+                os.getcwd(), 'tests/data/demo_testcase_hardcode.yml'),
             os.path.join(
-                os.getcwd(), 'tests/data/demo_testset_hardcode.json')
+                os.getcwd(), 'tests/data/demo_testcase_hardcode.json')
         ]
         self.testcases = [{
             'config': {

@@ -273,21 +273,21 @@ class TestHttpRunner(ApiServerUnittest):

     def test_run_testcase_template_variables(self):
         testcase_file_path = os.path.join(
-            os.getcwd(), 'tests/data/demo_testset_variables.yml')
+            os.getcwd(), 'tests/data/demo_testcase_variables.yml')
         runner = HttpRunner().run(testcase_file_path)
         summary = runner.summary
         self.assertTrue(summary["success"])

     def test_run_testcase_template_import_functions(self):
         testcase_file_path = os.path.join(
-            os.getcwd(), 'tests/data/demo_testset_functions.yml')
+            os.getcwd(), 'tests/data/demo_testcase_functions.yml')
         runner = HttpRunner().run(testcase_file_path)
         summary = runner.summary
         self.assertTrue(summary["success"])

     def test_run_testcase_layered(self):
         testcase_file_path = os.path.join(
-            os.getcwd(), 'tests/data/demo_testset_layer.yml')
+            os.getcwd(), 'tests/data/demo_testcase_layer.yml')
         runner = HttpRunner().run(testcase_file_path)
         summary = runner.summary
         self.assertTrue(summary["success"])

@@ -295,7 +295,7 @@ class TestHttpRunner(ApiServerUnittest):

     def test_run_testcase_output(self):
         testcase_file_path = os.path.join(
-            os.getcwd(), 'tests/data/demo_testset_layer.yml')
+            os.getcwd(), 'tests/data/demo_testcase_layer.yml')
         runner = HttpRunner(failfast=True).run(testcase_file_path)
         summary = runner.summary
         self.assertTrue(summary["success"])

@@ -304,7 +304,7 @@ class TestHttpRunner(ApiServerUnittest):

     def test_run_testcase_with_variables_mapping(self):
         testcase_file_path = os.path.join(
-            os.getcwd(), 'tests/data/demo_testset_layer.yml')
+            os.getcwd(), 'tests/data/demo_testcase_layer.yml')
         variables_mapping = {
             "app_version": '2.9.7'
         }

@@ -64,7 +64,7 @@ class TestFileLoader(unittest.TestCase):

     def test_load_json_testcases(self):
         testcase_file_path = os.path.join(
-            os.getcwd(), 'tests/data/demo_testset_hardcode.json')
+            os.getcwd(), 'tests/data/demo_testcase_hardcode.json')
         testcases = loader.load_file(testcase_file_path)
         self.assertEqual(len(testcases), 3)
         test = testcases[0]["test"]

@@ -75,7 +75,7 @@ class TestFileLoader(unittest.TestCase):

     def test_load_yaml_testcases(self):
         testcase_file_path = os.path.join(
-            os.getcwd(), 'tests/data/demo_testset_hardcode.yml')
+            os.getcwd(), 'tests/data/demo_testcase_hardcode.yml')
         testcases = loader.load_file(testcase_file_path)
         self.assertEqual(len(testcases), 3)
         test = testcases[0]["test"]

@@ -404,7 +404,7 @@ class TestSuiteLoader(unittest.TestCase):

         # absolute file path
         path = os.path.join(
-            os.getcwd(), 'tests/data/demo_testset_hardcode.json')
+            os.getcwd(), 'tests/data/demo_testcase_hardcode.json')
         testcases_list = loader.load_tests(path)
         self.assertEqual(len(testcases_list), 1)
         self.assertEqual(len(testcases_list[0]["teststeps"]), 3)

@@ -415,7 +415,7 @@ class TestSuiteLoader(unittest.TestCase):
         self.assertIn("get_sign", testcases_list[0]["config"]["refs"]["debugtalk"]["functions"])

         # relative file path
-        path = 'tests/data/demo_testset_hardcode.yml'
+        path = 'tests/data/demo_testcase_hardcode.yml'
         testcases_list = loader.load_tests(path)
         self.assertEqual(len(testcases_list), 1)
         self.assertEqual(len(testcases_list[0]["teststeps"]), 3)

@@ -427,8 +427,8 @@ class TestSuiteLoader(unittest.TestCase):

         # list/set container with file(s)
         path = [
-            os.path.join(os.getcwd(), 'tests/data/demo_testset_hardcode.json'),
-            'tests/data/demo_testset_hardcode.yml'
+            os.path.join(os.getcwd(), 'tests/data/demo_testcase_hardcode.json'),
+            'tests/data/demo_testcase_hardcode.yml'
         ]
         testcases_list = loader.load_tests(path)
         self.assertEqual(len(testcases_list), 2)

@@ -447,21 +447,21 @@ class TestSuiteLoader(unittest.TestCase):
     def test_load_testcases_by_path_folder(self):
         # absolute folder path
         path = os.path.join(os.getcwd(), 'tests/data')
-        testset_list_1 = loader.load_tests(path)
-        self.assertGreater(len(testset_list_1), 4)
+        testcase_list_1 = loader.load_tests(path)
+        self.assertGreater(len(testcase_list_1), 4)

         # relative folder path
         path = 'tests/data/'
-        testset_list_2 = loader.load_tests(path)
-        self.assertEqual(len(testset_list_1), len(testset_list_2))
+        testcase_list_2 = loader.load_tests(path)
+        self.assertEqual(len(testcase_list_1), len(testcase_list_2))

         # list/set container with file(s)
         path = [
             os.path.join(os.getcwd(), 'tests/data'),
             'tests/data/'
         ]
-        testset_list_3 = loader.load_tests(path)
-        self.assertEqual(len(testset_list_3), 2 * len(testset_list_1))
+        testcase_list_3 = loader.load_tests(path)
+        self.assertEqual(len(testcase_list_3), 2 * len(testcase_list_1))

     def test_load_testcases_by_path_not_exist(self):
         # absolute folder path

@@ -484,7 +484,7 @@ class TestSuiteLoader(unittest.TestCase):

     def test_load_testcases_by_path_layered(self):
         path = os.path.join(
-            os.getcwd(), 'tests/data/demo_testset_layer.yml')
+            os.getcwd(), 'tests/data/demo_testcase_layer.yml')
         testcases_list = loader.load_tests(path)
         self.assertIn("variables", testcases_list[0]["config"])
         self.assertIn("request", testcases_list[0]["config"])

@@ -380,7 +380,7 @@ class TestParser(unittest.TestCase):
             {"app_version": "${gen_app_version()}"},
             {"username-password": "${get_account()}"}
         ]
-        testset_path = os.path.join(
+        testcase_path = os.path.join(
             os.getcwd(),
             "tests/data/demo_parameters.yml"
         )

@@ -425,7 +425,7 @@ class TestParser(unittest.TestCase):
         ]
         variables_mapping = {}
         functions_mapping = project_mapping["debugtalk"]["functions"]
-        testset_path = os.path.join(
+        testcase_path = os.path.join(
             os.getcwd(),
             "tests/data/demo_parameters.yml"
         )

@@ -27,9 +27,9 @@ class TestRunner(ApiServerUnittest):
     def test_run_single_testcase(self):
         testcase_file_path_list = [
             os.path.join(
-                os.getcwd(), 'tests/data/demo_testset_hardcode.yml'),
+                os.getcwd(), 'tests/data/demo_testcase_hardcode.yml'),
             os.path.join(
-                os.getcwd(), 'tests/data/demo_testset_hardcode.json')
+                os.getcwd(), 'tests/data/demo_testcase_hardcode.json')
         ]

         for testcase_file_path in testcase_file_path_list:

@@ -76,7 +76,7 @@ class TestRunner(ApiServerUnittest):
         with self.assertRaises(exceptions.ValidationFailure):
             self.test_runner.run_test(test)

-    def test_run_testset_with_hooks(self):
+    def test_run_testcase_with_hooks(self):
         start_time = time.time()

         config_dict = {

@@ -117,17 +117,17 @@ class TestRunner(ApiServerUnittest):
         }
         test_runner = runner.Runner(config_dict)
         end_time = time.time()
-        # check if testset setup hook executed
+        # check if testcase setup hook executed
         self.assertGreater(end_time - start_time, 0.5)

         start_time = time.time()
         test_runner.run_test(test)
         test_runner.run_test(test)
         end_time = time.time()
-        # testset teardown hook has not been executed now
+        # testcase teardown hook has not been executed now
         self.assertLess(end_time - start_time, 1)

-    def test_run_testset_with_hooks_modify_request(self):
+    def test_run_testcase_with_hooks_modify_request(self):
         config_dict = {
             "name": "basic test with httpbin",
             "variables": self.debugtalk_module["variables"],

@@ -161,7 +161,7 @@ class TestRunner(ApiServerUnittest):
         test_runner = runner.Runner(config_dict)
         test_runner.run_test(test)

-    def test_run_testset_with_teardown_hooks_success(self):
+    def test_run_testcase_with_teardown_hooks_success(self):
         test = {
             "name": "get token",
             "request": {

@@ -192,7 +192,7 @@ class TestRunner(ApiServerUnittest):
         # check if teardown function executed
         self.assertLess(end_time - start_time, 0.5)

-    def test_run_testset_with_teardown_hooks_fail(self):
+    def test_run_testcase_with_teardown_hooks_fail(self):
         test = {
             "name": "get token",
             "request": {

@@ -226,10 +226,10 @@ class TestRunner(ApiServerUnittest):
     def test_run_testcase_with_empty_header(self):
         testcase_file_path = os.path.join(
             os.getcwd(), 'tests/data/test_bugfix.yml')
-        testsets = loader.load_tests(testcase_file_path)
-        testset = testsets[0]
-        config_dict_headers = testset["config"]["request"]["headers"]
-        test_dict_headers = testset["teststeps"][0]["request"]["headers"]
+        testcases = loader.load_tests(testcase_file_path)
+        testcase = testcases[0]
+        config_dict_headers = testcase["config"]["request"]["headers"]
+        test_dict_headers = testcase["teststeps"][0]["request"]["headers"]
         headers = deep_update_dict(
             config_dict_headers,
             test_dict_headers

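The `deep_update_dict` call exercised in this test merges a teststep's headers over the config-level headers. A sketch of that merge behavior is below; it is an assumption for illustration only, and HttpRunner's actual `deep_update_dict` may handle edge cases (such as `None` overrides) differently.

```python
def deep_update_dict(origin_dict, override_dict):
    """Recursively update origin_dict with override_dict, merging nested dicts
    instead of replacing them wholesale."""
    for key, value in override_dict.items():
        if isinstance(value, dict) and isinstance(origin_dict.get(key), dict):
            # both sides are dicts: recurse so sibling keys survive
            origin_dict[key] = deep_update_dict(origin_dict[key], value)
        else:
            # scalar or new key: the override wins
            origin_dict[key] = value
    return origin_dict
```

This is why, in the test above, a teststep with an empty or partial header mapping still inherits the config-level headers.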