Developers at Adevinta Spain wanted to run unit, integration and end-to-end tests on Spark applications. Learn how they implemented this.
A useful read for big data developers: the team at Adevinta Spain explains in detail how they implement unit, integration and end-to-end tests on Spark applications as they build out their Data Platform. The full article covers a typical project layout, how to manage Spark sessions in tests, what the various test helpers enable, and how all of this improves code quality and the CI pipelines.
Highlights include:
- Layout of a Spark project
- How to isolate different Spark tests
- Details of the test implementation code
- The idea behind SharedSparkSessionHelper
- Links to real-life Adevinta Spain examples
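To give a flavour of the last two points: a shared-session helper typically creates one local `SparkSession` per test suite and tears it down afterwards, so individual tests stay fast and never touch a real cluster. The following Scala sketch shows the general shape of such a trait; the names and configuration are illustrative assumptions, not the article's actual code.

```scala
import org.apache.spark.sql.SparkSession
import org.scalatest.{BeforeAndAfterAll, Suite}

// Illustrative sketch only -- not Adevinta's actual implementation.
trait SharedSparkSessionHelper extends BeforeAndAfterAll { self: Suite =>

  // One local SparkSession, reused by every test in the suite.
  @transient lazy val spark: SparkSession = SparkSession
    .builder()
    .master("local[2]")                  // run inside the test JVM
    .appName(suiteName)
    .config("spark.ui.enabled", "false") // no web UI needed in tests
    .getOrCreate()

  override def afterAll(): Unit = {
    try spark.stop()                     // release resources when the suite ends
    finally super.afterAll()
  }
}
```

A test suite would then mix the trait in, e.g. `class MyJobSpec extends AnyFunSuite with SharedSparkSessionHelper`, and use `spark` directly in its test cases. For the details of how Adevinta isolates the different test levels, see the full article.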