
Google Summer of Code

Google Summer of Code is one of the few initiatives that supports open-source contributions by undergraduate students across various fields. Students spend their summer working with an open source organisation and are paid by Google in return.


My role

Student developer (Intern)

Problem

Create a unit testing framework for automated testing of various modules of the Peace Corps Safety Web Application (PCSAWeb)

Time

4 months

My organisation was AnitaB.org, which plays an active role in uplifting the participation of women in technology. AnitaB.org (formerly the Anita Borg Institute for Women and Technology, and before that the Institute for Women and Technology) is a global nonprofit organisation based in Belmont, California. Founded by computer scientists Anita Borg, PhD, and Telle Whitney, PhD, the institute's primary aim is to recruit, retain, and advance women in technology.


What do I aim to accomplish?

This project aims to:

  • Extend the suite with additional test cases and cover new potential features that will be implemented during the program.

  • Improve and optimize the current test suite by adding relevant technologies to have increased coverage and reduced execution time for the Peace Corps Safety Web Application (PCSAWeb).

The tests that are already written can be expanded and optimized for faster execution and increased coverage, including the addition of new test cases and refactoring wherever needed. I also believe some new features will be incorporated into PCSAWeb this year, and old ones might be altered, so new tests will be written in collaboration with the fellow intern working on the project to keep the suite up to date.

I have set up my fork of the project with XAMPP, following the steps in the repo's README, and have been using it for quite some time. I have gone through the test framework that has already been written; it can be improved with the enhancements I propose below.



Areas of improvement / deliverables


1. Upgrading dependencies and browser versions


The tests currently use outdated dependencies. Not only is this not recommended, but upgrading also brings bug fixes and performance enhancements. These dependencies need to be upgraded to the latest versions, with code changes made where required for everything to keep working. Some of them:

  • selenium-java (latest is 3.3.1)

  • Selenium Firefox driver (latest is 3.3.1)

  • MySQL Connector/J (6.0.6)


Along with these, the compatible Firefox version also changes, and the latest Selenium releases require an explicit path to the geckodriver binary. If the code breaks after upgrading, the necessary changes must be made as well.
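For illustration, a minimal sketch of the explicit geckodriver setup the newer Selenium releases expect (the binary path below is an assumed location, not a prescribed one):

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class DriverFactory {
    public static WebDriver newFirefoxDriver() {
        // Selenium 3.x talks to Firefox through geckodriver and needs
        // its path set explicitly; the path here is an assumption.
        System.setProperty("webdriver.gecko.driver", "/usr/local/bin/geckodriver");
        return new FirefoxDriver();
    }
}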


2. Use of tabs to reduce execution times

Testing with Selenium through a real web browser is slow. While running the current tests, I saw that every new test case executes in a completely new Firefox window, which not only costs a lot of execution time and resources but also isn't how an end user browses. Tabs, however, use far fewer resources since they stay in the same window, and they allow testing multiple pages at the same time, which is usually the approach an actual user would take too. Selenium, as of now, has no concept of tabs; all it knows is window handles, which I will be using, but for us these windows will be tabs.
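As a rough sketch of that window-handle approach (the URLs here are placeholders, not the real PCSAWeb routes):

import java.util.ArrayList;

import org.openqa.selenium.JavascriptExecutor;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class TabSketch {
    public static void main(String[] args) {
        WebDriver driver = new FirefoxDriver();
        driver.get("http://localhost/pcsaweb"); // assumed local URL

        // Open a second page in a new tab instead of a new window.
        ((JavascriptExecutor) driver).executeScript("window.open()");

        // Selenium exposes tabs as window handles; switch to the new one.
        ArrayList<String> handles = new ArrayList<>(driver.getWindowHandles());
        driver.switchTo().window(handles.get(handles.size() - 1));
        driver.get("http://localhost/pcsaweb/login"); // assumed page

        driver.quit();
    }
}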

3. Use of configuration files

Currently, the test variables are stored in a class, and the data with which the tests are carried out is stored in a .xlsx file divided into different sheets. I'd like to propose the use of configuration files here. They are much cleaner, less complicated, easy to read, can be edited from within a simple terminal window, and don't need any third-party software to open and change them. Defining variables in classes and changing them to 'configure' the tests can be handled far more easily with config files (a .properties file); the current tests use what are essentially hard-coded configurations.

In addition, I want to put control of the testing in the testers' hands. Not every run needs every single possible test or every browser; for example, the switch to enable or disable testing with HTTP requests (next point) can be left to the tester. By changing a few key-value pairs the whole configuration can be changed, again without any third-party software. An exact listing of configurations can't be made at this point, as I have yet to decide the cases and write the code, but they will be something like the sketch below.
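As a purely illustrative sketch, with every key and value a placeholder I made up rather than a final name:

# config.properties (hypothetical sketch; all keys are placeholders)
base.url=http://localhost/pcsaweb
browser=firefox
geckodriver.path=/usr/local/bin/geckodriver

# switch whole groups of tests on or off from the terminal
run.http.checks=true
run.security.tests=true
run.db.tests=false

# database connection used by the database tests
db.url=jdbc:mysql://localhost:3306/pcsa
db.user=tester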

4. Use of HTTP requests for simple tasks / quick testing / load testing / checking connections

Sometimes just making sure a connection can be made to the website, without loading the whole webpage, is more convenient, and again it uses far fewer resources. The plan is to use this for quick connection checks and for load testing the website (using Maven Surefire, discussed below). It can be switched on or off through the configuration files and will be incorporated as functions that simply check whether connections to pages are established properly.
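A minimal sketch of such a check with plain HttpURLConnection (the URL is an assumption):

import java.net.HttpURLConnection;
import java.net.URL;

public class ConnectionCheck {
    // Returns true if the page answers with HTTP 200, without
    // downloading or rendering the page body.
    public static boolean isUp(String pageUrl) throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(pageUrl).openConnection();
        conn.setRequestMethod("HEAD"); // headers only, no body
        conn.setConnectTimeout(5000);
        return conn.getResponseCode() == HttpURLConnection.HTTP_OK;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(isUp("http://localhost/pcsaweb")); // assumed URL
    }
}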


5. Running parallel scripts and setting thread count

This powerful tool can be exploited to reduce execution time greatly; running parallel scripts on top of already fast HTTP requests will boost testing. I am a bit skeptical about using parallel scripts with Selenium and Firefox (or any other browser, for that matter), as it might break, so time will be devoted to studying this. If there are no issues with parallel scripts plus Selenium WebDriver, I will implement it in as many tests as possible, since it will reduce execution time significantly in combination with the proposed use of tabs, if not alone. Parallel scripts will in any case be used with the HTTP requests.
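For the thread-count side, a sketch of how the maven-surefire-plugin could be configured to run test methods in parallel (the plugin version and values are only examples):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.20</version>
  <configuration>
    <parallel>methods</parallel>
    <threadCount>4</threadCount>
  </configuration>
</plugin>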


6. Issues with database testing

Presently, dbChecks and connectToDB check the connection with the MySQL database and whether login data is present or not. I propose adding functions for checking comrades' data as well. Beyond that, whether the actions triggered in the browser create the appropriate data entries/tables/schemas in the database will also be checked; for example, if a comrade's number is changed, whether or not that change is reflected in the database.
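A sketch of such a check over JDBC; the table and column names here are assumptions, since the real ones would come from the PCSAWeb schema:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ComradeDbCheck {
    // Verifies that a comrade's number changed in the browser is
    // actually reflected in the database row.
    public static boolean numberMatches(int comradeId, String expected)
            throws Exception {
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:mysql://localhost:3306/pcsa", "tester", "secret");
             PreparedStatement st = conn.prepareStatement(
                     "SELECT phone FROM comrades WHERE id = ?")) {
            st.setInt(1, comradeId);
            try (ResultSet rs = st.executeQuery()) {
                return rs.next() && expected.equals(rs.getString("phone"));
            }
        }
    }
}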


7. Negative testing

In my opinion, adding negative tests to the project would be a good idea, as they keep a check on blank form submissions and on whether the appropriate pages/popups/error messages are shown in case of failures. For PCSA these would prove very effective, since the project depends on many external services, which should be kept in check as well. There are also many error boxes that indicate a specific functionality doesn't work; these should be listed down too.
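A sketch of one such negative test: submitting the login form blank and expecting an error message. The element locators are assumptions, not the real PCSAWeb ids:

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class BlankLoginTest {
    public static void main(String[] args) {
        WebDriver driver = new FirefoxDriver();
        try {
            driver.get("http://localhost/pcsaweb/login"); // assumed URL
            // Submit the form without filling any field.
            driver.findElement(By.id("login-submit")).click(); // assumed id
            // The negative test passes when an error message shows up.
            boolean errorShown =
                    !driver.findElements(By.className("error")).isEmpty();
            System.out.println("blank submission rejected: " + errorShown);
        } finally {
            driver.quit();
        }
    }
}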


8. Security testing

Adding security tests is important for the project, to check that a logged-in user can't see other users' profiles or any content that is not meant for that user. Also, a user who is not logged in should not be served any page but the login/signup page. These small tests should be added as well.
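One cheap way to express the logged-out rule, again with an assumed URL: fetching a protected page without a session should redirect to the login page rather than return the content directly:

import java.net.HttpURLConnection;
import java.net.URL;

public class AuthRedirectCheck {
    // With no session cookie, a protected page should answer with a
    // redirect (302) to the login page, not with 200 and the content.
    public static boolean redirectsToLogin(String protectedUrl)
            throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(protectedUrl).openConnection();
        conn.setInstanceFollowRedirects(false); // inspect the raw response
        int code = conn.getResponseCode();
        String location = conn.getHeaderField("Location");
        return code == 302 && location != null && location.contains("login");
    }
}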


9. Performance tests, load tests, etc. using Apache JMeter

I propose using Apache JMeter and building test plans in it to set up and configure this testing: creating load tests and adding a thread group. Setting up different thread groups and testing many, but realistic, scenarios will also be done.
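Once a test plan exists, it can be run headlessly from the terminal (the .jmx file name below is hypothetical):

# -n = non-GUI mode, -t = test plan, -l = results log
jmeter -n -t pcsaweb_load.jmx -l results.jtl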


10. Refactor code and extend the test framework

Features like Support Services, Sexual Assault Awareness and Policies, and Glossary were not developed, nor were pages like Settings and Profile. Chances are they will be developed this year; there are also chances of a major design change, so modifying the existing code accordingly is necessary too. Also, the website is currently not responsive. If it becomes so, that needs to be checked as well.


11. Integration with Travis CI

Continuous integration using Travis CI would be recommended. For this, I would have Travis run mvn test using a pom.xml file at the root level. Currently there are separate pom files for each framework and test directory; these would be merged into one.
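A sketch of what the root-level Travis configuration could look like (the JDK choice is an assumption):

# .travis.yml
language: java
jdk:
  - oraclejdk8
script: mvn test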


12. Documentation

Detailed documentation will be written for this work too, and the existing documentation will be updated and fixed wherever necessary. I would like to use readthedocs.org for this.





Timeline


Community bonding period (until May 29, 2017)

Milestone 1: Setting up

—Engage in discussions with mentors and the community in general about what more can be done apart from the solutions I have provided, and/or discuss and explore efficient methods to implement the deliverables.

—Set up the project with all dependencies, tools, plugins and frameworks. I also plan to do this on multiple operating systems (Windows and Mac, apart from my Ubuntu, on which I'll be developing most of my tests).

—Engage in discussions with the intern to whom the PCSAWeb project is assigned about the potential features, and reorient my plan accordingly.

—Start incorporating features such as the use of tabs, config files, etc.

Week 1 (May 30 → June 5, 2017)

—Sort the existing features/code snippets according to which can be used as-is, which need some optimization, and which can't be reused; write the optimizations and remove the unnecessary code.

—Set up the updated project structure.

—Create test suites and config files, and add new potential configurations.

Week 2 (June 6 → June 12, 2017)

Milestone 2: Basic functional tests (for potential features)

—Complete writing functional tests, if the new features are implemented.

—Discuss changes to be made and get them reviewed by mentors.

—Start documenting the code written.

—Start writing negative tests.

Week 3 (June 13 → June 19, 2017)

Milestone 3: Negative tests

<Buffer period / finish any incomplete tasks>

—Complete writing negative tests.

—Start with security testing.

Week 4 (June 20 → June 26, 2017)

Milestone 4: Security tests

—Prepare code for the Phase 1 submission.

—Discuss with mentors and the intern working on PCSAWeb the security issues mentioned in the deliverables section.

—Complete security testing.

Week 5 (June 27 → July 3, 2017)

Milestone 5: Phase 1 evaluation

—Get the code reviewed by the mentor.

—Submit code for review.

—Phase 1 evaluation deadline

—Update the documentation.

Week 6 (July 4 → July 10, 2017)

—Complete any tasks left unfinished.

—Start writing database tests.

Week 7 (July 11 → July 17, 2017)

—Dedicate this whole week to database testing.

Week 8 (July 18 → July 24, 2017)

Milestone 6: Database tests

—Finish database testing.

—Get the code reviewed by the mentor.

—Discuss the major changes in the project with the intern working on PCSAWeb.

—Catch up on the documentation.

—Start writing tests for responsive testing.

—Prepare code for the Phase 2 submission.

Week 9 (July 25 → July 31, 2017)

Milestone 7: Phase 2 evaluation and checking responsiveness

—Submit Phase 2 evaluations

—Phase 2 evaluation deadline

—Finish responsive testing.

Week 10 (August 1 → August 7, 2017)

<Buffer period>

—Write tests to check the responsiveness of the project.

—Integrate with Travis CI.

—Start load/performance testing.

Week 11 (August 8 → August 14, 2017)

Milestone 8: Load testing

—Finish writing tests for load/performance.

—Test using different operating systems. Make changes to code wherever necessary to incorporate OS changes.

—Run manual thread-based testing to check load at my end.

—Discuss with fellow intern regarding any further changes.

—Get the code reviewed by the mentor.

—Look into compatibility testing.

—Add support for BrowserStack local testing by setting up capabilities.

—Re-run all the tests individually and again as a whole with different configurations/combinations (these will be listed and documented as well).

Week 12 (August 15 → August 21, 2017)

Milestone 9: Documentation and regression testing

<Buffer period>

—Engage in discussion with mentor regarding any changes and make them.

—Final documentation of all the work that has been done during the program.

—Prepare for final code review and mentor evaluation submission.

Week 13 (August 22 → August 28, 2017)

Milestone 10: Final submission

—Submit final work and mentor evaluation.

Final evaluations

The plan was executed according to the above timeline. The automated tests and the list of commits can be seen here.

