...

The notes below were documented while reviewing the issue linked below. Please follow these guidelines when raising issues during reviews of Marvin code or any Test Features code under CS.

https://reviews.apache.org/r/14058/

1. Please add module-level docstrings. Use them to explain what a particular test module covers, giving specific examples where possible. While reviewing, raise an issue if proper code comments/docstrings are missing. Proper comments and docstrings make it easy to generate and use documentation for the code with standard documentation-generation tools.
'''
@Dates: Add dates, viz. module creation date and last modified date
@Version: Add a version variable
@Author: Add author variables. Mention NA if not to be used
@CopyRight: Add proper copyright strings. Mention NA if not to be used
@Dependencies: List dependencies etc.
@Desc: Describe what the module covers
@Example: Give a specific usage example
'''

2. Add docstrings to each test/setup/teardown describing its steps/workflow. EX: under the "setUpClass" class method, add docstrings as below (generic example). Currently, class-level setup does a lot of operations, and it is confusing to follow them without a workflow like the one below.
EX:
'''
Step 1: Creating DB object.
'''
'''
Step 2: Creating ssh object.
'''
'''
Step 3: Create workflow etc. Add these steps as part of the docstring for each step.
'''
The above will enhance readability as well as maintainability.

3. Separate data from test code. Remove the Services class from the test module altogether. Currently, code and data are mixed together, and this code is repeated in every available test module. Instead, move the data into a config file. Use self.api_client to retrieve this configuration information. Pass the config file path to the getClsTestClient call to get a config object dynamically; if needed, change the lib/api for getClsTestClient. This removes a lot of repetitive code across the various test modules. A sketch of the idea follows below.
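A minimal sketch of the separation, with illustrative file name, path, and keys (the full ConfigManager-based mechanism is shown in guideline 26):

import json

from marvin.cloudstackTestCase import cloudstackTestCase

# test_abc.cfg (illustrative) -- data that previously lived in a Services class:
# { "virtual_machine": { "displayname": "testvm", "username": "root" } }

class TestAbc(cloudstackTestCase):

    @classmethod
    def setUpClass(cls):
        cls.api_client = super(TestAbc, cls).getClsTestClient().getApiClient()
        # Load feature data from the external config file instead of
        # hard coding it in the test module.
        with open("/setup/config/test_abc.cfg") as f:
            cls.services = json.load(f)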

...

22. Run pylint, tabnanny, and other code-checker tools over the test code before commit. Follow PEP 8 and other known coding standards, both for naming conventions and otherwise:
http://www.python.org/dev/peps/pep-0008/
http://google-styleguide.googlecode.com/svn/trunk/pyguide.html

23. Raise all exceptions, including generic ones, through the CS-related exception module. This way all exceptions can be logged or tracked for debugging through one single module; don't write separate exception classes for each test. A sketch of the pattern follows below.
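A minimal sketch of the pattern, assuming a hypothetical shared exception module (the module name, class name, and helper below are illustrative, not existing files):

# cs_exceptions.py (illustrative name for the shared CS exception module)
class CSTestException(Exception):
    '''Single exception type for CS test failures, so all failures can be
    logged and tracked through one module.'''
    def __init__(self, message, cause=None):
        super(CSTestException, self).__init__(message)
        self.cause = cause

def deploy_vm():
    # illustrative stand-in for real deployment logic
    raise RuntimeError("simulated product failure")

# In test code, wrap generic exceptions instead of defining per-test classes:
try:
    deploy_vm()
except Exception as e:
    raise CSTestException("VM deploy failed", cause=e)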

24. Instead of adding multiple lines of asserts to the test code as below, please use the simple "verifyElementInList" utility function added to cs/tools/integration/lib/utils.py. See the usage below.

self.assertEqual(
    isinstance(list_network_offerings_response, list),
    True,
    "listNetworkOfferings returned invalid object in response."
)

self.assertNotEqual(
    len(list_network_offerings_response),
    0,
    "listNetworkOfferings returned empty list."
)

self.assertEqual(
    list_network_offerings_response[0].state,
    "Enabled",
    "The network offering state should get updated to Enabled."
)

If we want to verify that "a" is a list, check whether or not it is empty, and then verify that element 1 is present in the list and assert, then the line below (a generic example; change it to whichever assert we want) should suffice.

assert verifyElementInList(a, 1)[0] == PASS, "TC FAILED"

We just need to import verifyElementInList from utils before using it. A sketch of such a utility is below.
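For reference, a minimal sketch of what such a utility might look like; the actual implementation in cs/tools/integration/lib/utils.py may differ:

PASS = 1
FAIL = 0

def verifyElementInList(inp, toverify):
    '''Return [PASS, None] if inp is a non-empty list containing toverify,
    else [FAIL, reason].'''
    if not isinstance(inp, list):
        return [FAIL, "not a list"]
    if len(inp) == 0:
        return [FAIL, "empty list"]
    if toverify in inp:
        return [PASS, None]
    return [FAIL, "element not in list"]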

25. If you are a reviewer, please raise a review issue if there are not enough docstrings to understand the test feature code, or wherever relevant documentation is missing. Module-level and class-level docstrings with specific usages or examples are ideal.

26. Remove hard-coded string values, e.g. IP addresses, URLs, hostnames, and other configuration information such as the Services class, from test feature code. Instead, move it into a config file (if the test feature is named test_abc.py, follow the naming convention test_abc.cfg for the feature configuration) and place this configuration file in the config folder under cs/tools/marvin/marvin. Once checked in, the new configuration is available to the test feature as a dictionary through the ConfigManager interface. Please use the new ConfigManager class support added under marvin/marvin/configGenerator.py for this. See the example below.

'''
1. cloudstackTestClient will now have the configuration object exposed
   and available through its method getConfigParser()
2. The object will have all options exposed through ConfigManager
3. cls.configObj will provide the ConfigManager interface; the user can
   use it to parse and get the dictionary representation of the configuration
'''
@classmethod
def setUpClass(cls):
    cls.api_client = super(
        TestAddMultipleNetScaler,
        cls
    ).getClsTestClient().getApiClient()
    cls.configObj = super(
        TestAddMultipleNetScaler,
        cls
    ).getConfigParser()
    if cls.configObj is not None:
        '''
        Here, cls.configDict will have all the configuration parsed and
        available as a dictionary, accessible throughout the test feature code
        '''
        cls.configDict = cls.configObj.getConfig("/setup/config/myfeature.cfg", None)

Now, cls.configDict will have the dictionary representation of your configuration, and we can use it throughout the code.
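For example (the keys below are hypothetical and depend on how your .cfg file is laid out):

netscaler_ip = cls.configDict["netscaler"]["ipaddress"]
vm_services = cls.configDict["virtual_machine"]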

27. For better readability and maintainability, keep all widely and frequently used codes under codes.py (a new module added under cs/tools/marvin/marvin/codes.py). See codes.py for specific examples.

EX: Replace strings such as "Disabled", "Enabled", and "network_offering" in test code with readable, well-named constants, and place those under codes.py. Then use them uniformly across test feature code. With these, we need to make a change in only one place if required, and it becomes available to all features; we also avoid different naming notations across different test features. A sketch follows below.
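A minimal sketch of the kind of constants meant here; see the actual cs/tools/marvin/marvin/codes.py for the real definitions:

# codes.py -- shared, well-named constants used uniformly across test features
ENABLED = "Enabled"
DISABLED = "Disabled"
NETWORK_OFFERING = "network_offering"

# Test code then reads:
# self.assertEqual(offering.state, ENABLED, "offering should be Enabled")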

28. Use uniform naming and object creation wherever possible. See the current usage below for creating the API client (cls.api_client and self.apiclient): both are the same object, but they are created in different ways under different names. Instead, use a common way of creating and retrieving the API client; use the way api_client is created under setUpClass for the setUp method as well, as sketched after the example.

class TestPublicIP(cloudstackTestCase):

    def setUp(self):
        self.apiclient = self.testClient.getApiClient()
        self.services = Services().services

    @classmethod
    def setUpClass(cls):
        cls.api_client = super(TestPublicIP, cls).getClsTestClient().getApiClient()
        cls.services = Services().services
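One possible uniform pattern (a sketch: create the client once in setUpClass and reuse it, under one name, in setUp):

class TestPublicIP(cloudstackTestCase):

    @classmethod
    def setUpClass(cls):
        cls.api_client = super(TestPublicIP, cls).getClsTestClient().getApiClient()
        cls.services = Services().services

    def setUp(self):
        # Reuse the one client and one name everywhere instead of creating
        # a second, differently named client per test.
        self.api_client = self.__class__.api_client
        self.services = self.__class__.services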

29. Use assertions wherever possible for product failures; avoid using exceptions to signal them.

30. For any new method added, make sure to have proper return codes wherever possible. We are using exceptions to signal the fail case; exceptions have a different purpose, and exception handling is good to have, but please don't treat a raised exception as a function's or method's return value for noting failure. Instead, provide explicit return values for new code wherever possible; check the return codes and then raise an assertion or exception accordingly. A sketch follows below.
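A sketch of the return-code pattern, using PASS/FAIL as defined in marvin's codes.py (the helper name and attach call are illustrative):

from marvin.codes import PASS, FAIL

def attach_all_volumes(api_client, vm, volumes):
    '''Return PASS when every volume attaches, FAIL otherwise, rather than
    signalling failure only by raising.'''
    try:
        for volume in volumes:
            vm.attach_volume(api_client, volume)   # illustrative call
        return PASS
    except Exception:
        return FAIL

# The caller checks the code and then asserts or raises accordingly:
# self.assertEqual(attach_all_volumes(self.api_client, vm, vols), PASS,
#                  "attaching data volumes failed")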

31. Whenever we pick up a feature for automation, add a new field "Automation" to your test case document, wherever it lives (Excel, a TMS like Quality Center, etc.). This will help, during reviews and thereafter, to know which test cases are still pending automation for that feature even if the feature itself is automated.

32. Don't add asserts in library functions or non-test functions. Add asserts/failures only in your test cases.

33. Add docstrings rather than comments to a test case, as below. This makes it easy to extract the test case docstrings automatically for documentation or other purposes.

Instead of comments as below:

# Validate the following
# 1. Deploy a vm and create 5 data disk
# 2. Attach all the created Volume to the vm.
# 3. Reboot the VM. VM should be successfully rebooted
# 4. Stop the VM. Stop VM should be successful
# 5. Start The VM. Start VM should be successful

Use DocStrings:

'''
@Desc: test_abc_functionality
@steps:
1. Deploy a vm and create 5 data disk
2. Attach all the created Volume to the vm.
3. Reboot the VM. VM should be successfully rebooted
4. Stop the VM. Stop VM should be successful
5. Start The VM. Start VM should be successful
'''

34. Catch exceptions in setup as well, and in case of an exception in setup, don't run the tests. Setup failures will not invoke teardown in many cases, so the resources we created in setup won't be cleaned up, and we are seeing many cleanup failures. We need to do cleanup whether or not the tests ran and whether or not setup failed, provided setup ran at least once. A sketch follows below.
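A sketch of guarding setup so cleanup still runs even when setup fails partway (class, structure, and helper names illustrative):

from marvin.cloudstackTestCase import cloudstackTestCase

class TestFeature(cloudstackTestCase):

    @classmethod
    def setUpClass(cls):
        cls._cleanup = []
        try:
            cls.api_client = super(TestFeature, cls).getClsTestClient().getApiClient()
            cls.vm = deploy_vm(cls.api_client)     # illustrative helper
            cls._cleanup.append(cls.vm)
        except Exception as e:
            # tearDownClass may never run if setUpClass raises, so clean up
            # whatever was already created before re-raising.
            cls.tearDownClass()
            raise Exception("setUpClass failed: %s" % e)

    @classmethod
    def tearDownClass(cls):
        for resource in reversed(cls._cleanup):
            try:
                resource.delete(cls.api_client)
            except Exception:
                pass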

35. We added runCommand to sshClient.py; instead of using the current execute command, please use runCommand. The return of the current execute command contains output and error without differentiating between them. The new command provides the status of the executed command and the corresponding output or error accordingly. Illustrative usage is below.
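Illustrative usage; check sshClient.py in your Marvin version for the exact return format (assumed here to be a dict carrying the status plus separate output and error, with SUCCESS from marvin's codes.py):

from marvin.sshClient import SshClient
from marvin.codes import SUCCESS

ssh = SshClient("10.1.1.10", 22, "root", "password")   # illustrative host/credentials
res = ssh.runCommand("service iptables status")
if res["status"] == SUCCESS:
    output = res["stdout"]
else:
    error = res["stderr"]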