I know absolutely nothing about automated testing, but it has been decided that my organisation is going to start testing one of our older services.

This service essentially takes database records from one table, performs a series of calculations on those records based on various database-related factors, and outputs the results into another table. This happens in near real time (i.e., if another record is added to Table A, the service processes it into Table B around a minute later).

The proposed solution from our technical director is to:

  • Take the previous version of the processing application and run that on any database deployment
  • Using a batch file, call a SQL script which selects certain data from the database and outputs the results to a text file
  • Run the newer build of the processing application
  • Re-run the batch file and output the results to a different text file
  • Using Notepad++, diff the two files; if they are not identical, the test failed. This requires a human to “eyeball” the differences…
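For what it’s worth, even if we went with this plan, the Notepad++ “eyeball” step could be automated. A minimal sketch using Python’s standard library (the file names are made up for illustration):

```python
import difflib
from pathlib import Path

def diff_outputs(old_path, new_path):
    """Return a unified diff of two text exports; an empty list means they match."""
    old = Path(old_path).read_text().splitlines()
    new = Path(new_path).read_text().splitlines()
    return list(difflib.unified_diff(old, new,
                                     fromfile=old_path, tofile=new_path,
                                     lineterm=""))

# Hypothetical usage with the two exported files:
# diffs = diff_outputs("old_build_output.txt", "new_build_output.txt")
# if diffs:
#     print("TEST FAILED:\n" + "\n".join(diffs))
```

This at least gives a pass/fail result and a readable diff without a human staring at two files.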

This to me seems like a terrible solution. My proposal would be:

  • We have a set of known good test data in the SQL database
  • We run the processing
  • We compare the output to a known good set of validation data

Whilst I’m confident this is a much better approach, I was wondering which automated testing tools would be most appropriate for a service like this. My research has mostly unearthed tools such as Selenium and Cucumber, which seem geared towards web development; they may be okay, but I’m not sure, hence the question.

submitted by /u/CrumpyYouLumpy