GitHub Repo

Framework Design: Q&A

Q: How did a real interview experience inspire this framework?
A: My motivation began after an interview where I was asked to design test cases using Rest Assured and JUnit. At that time, I struggled with implementing the object mapper, which inspired me to revisit and master these concepts, leading to the creation of this framework.
Q: What led me to explore Amadeus APIs for this project?
A: I had an interview with a company using Airports and Airlines APIs. To gain confidence in how such systems behave, I explored Amadeus APIs, focusing on reference data and resources related to airports, airlines, and hotels. I created Java record classes like Location.java, Address.java, and AirportQueriesWrapper.java to handle API responses, and used JsonUtils.java for JSON parsing and object mapping.
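For illustration, here is a minimal sketch of what the record-based models and the JSON utility might look like, assuming Jackson for the object mapping; the field names are illustrative, not the exact Amadeus schema:

```java
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.databind.ObjectMapper;

// Records model only the fields the tests care about; unknown fields are ignored.
// (In the repo each type lives in its own file.)
@JsonIgnoreProperties(ignoreUnknown = true)
record Address(String cityName, String countryName, String countryCode) {}

@JsonIgnoreProperties(ignoreUnknown = true)
record Location(String type, String subType, String name, Address address) {}

// JsonUtils-style helper: a single shared ObjectMapper for parsing responses.
final class JsonUtils {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    static <T> T fromJson(String json, Class<T> type) {
        try {
            return MAPPER.readValue(json, type);
        } catch (Exception e) {
            throw new IllegalStateException("Failed to map JSON to " + type.getSimpleName(), e);
        }
    }
}
```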
Q: What core design values shaped my approach?
A: My approach was guided by a strong focus on modularity, reusability, and maintainability, alongside robust logging. I structured the framework so that each component (authentication, request building, response validation, reporting) can be extended or reused for new APIs and test scenarios. I used Java record classes for clear data modeling, utility classes for common operations, and a package-by-feature structure to keep the codebase organized and scalable. For logging, I leveraged Rest Assured's filter mechanism, implementing the Filter interface in RestAssuredLoggerFilter.java so that every request and response can be traced and debugged efficiently. This holistic approach keeps the framework both powerful and easy to maintain as it grows.
Q: How did I implement custom logging?
A: My custom filter in RestAssuredLoggerFilter.java logs the request method and URI at the info level, and headers and body at the debug level. The filter() method uses Log4j2's Logger to capture request/response details and stores the status code in ITestResult context for retry analysis. This keeps logs clean during normal runs, but allows for detailed inspection when debugging by adjusting the Log4j configuration.
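A minimal sketch of such a filter, assuming Log4j2 and TestNG's Reporter as the bridge to the current ITestResult (the exact log format in RestAssuredLoggerFilter.java may differ):

```java
import io.restassured.filter.Filter;
import io.restassured.filter.FilterContext;
import io.restassured.response.Response;
import io.restassured.specification.FilterableRequestSpecification;
import io.restassured.specification.FilterableResponseSpecification;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.testng.ITestResult;
import org.testng.Reporter;

public class RestAssuredLoggerFilter implements Filter {
    private static final Logger LOG = LogManager.getLogger(RestAssuredLoggerFilter.class);

    @Override
    public Response filter(FilterableRequestSpecification requestSpec,
                           FilterableResponseSpecification responseSpec,
                           FilterContext ctx) {
        // Method and URI at INFO: always visible in normal runs
        LOG.info("{} {}", requestSpec.getMethod(), requestSpec.getURI());
        // Headers and body at DEBUG: surfaced only when Log4j is dialed up
        LOG.debug("Request headers: {}", requestSpec.getHeaders());

        Response response = ctx.next(requestSpec, responseSpec);

        LOG.info("Status: {}", response.getStatusCode());
        LOG.debug("Response body: {}", response.asString());

        // Store the status code on the current test result for the retry analyzer
        ITestResult result = Reporter.getCurrentTestResult();
        if (result != null) {
            result.setAttribute("statusCode", response.getStatusCode());
        }
        return response;
    }
}
```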
Q: What challenges did I face with API rate limits, and how did I solve them?
A: When running tests in GitHub Actions, I encountered HTTP 429 (Too Many Requests) errors because the suite executed faster than the API's rate limit allowed. To handle this, I implemented a retry mechanism using TestNG's RetryAnalyzer.java and AnnotationTransformer.java. The RestAssuredLoggerFilter stores the response status code in the ITestResult context via result.setAttribute("statusCode", response.getStatusCode()), and the retry analyzer's retry() method reads that attribute to decide whether a test should be retried. This keeps error handling robust and consistent across the entire test suite.
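A sketch of how the two pieces fit together; the retry cap of three is an illustrative assumption:

```java
import java.lang.reflect.Constructor;
import java.lang.reflect.Method;
import org.testng.IAnnotationTransformer;
import org.testng.IRetryAnalyzer;
import org.testng.ITestResult;
import org.testng.annotations.ITestAnnotation;

public class RetryAnalyzer implements IRetryAnalyzer {
    private static final int MAX_RETRIES = 3; // illustrative cap
    private int attempt = 0;

    @Override
    public boolean retry(ITestResult result) {
        // Status code placed here by RestAssuredLoggerFilter
        Integer statusCode = (Integer) result.getAttribute("statusCode");
        if (statusCode != null && statusCode == 429 && attempt < MAX_RETRIES) {
            attempt++;
            return true; // rerun the rate-limited test
        }
        return false;
    }
}

// Attaches the analyzer to every @Test so no method needs it declared explicitly.
class AnnotationTransformer implements IAnnotationTransformer {
    @Override
    public void transform(ITestAnnotation annotation, Class testClass,
                          Constructor testConstructor, Method testMethod) {
        annotation.setRetryAnalyzer(RetryAnalyzer.class);
    }
}
```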
Q: How did I ensure API credentials and secrets are truly secure?
A: Since Amadeus APIs require OAuth 2.0 tokens, I needed to store the client ID and secret securely. Initially, they sat as plain text in config.properties; I later implemented AES 256-bit encryption in EncryptionUtils.java, whose encrypt() and decrypt() methods use the javax.crypto.Cipher class with the AES algorithm. The encrypted secrets are stored in the config file, while the encryption key lives in an environment variable, so no sensitive value appears in plain text in the repository. The ReadProperties.java utility handles decryption at runtime.
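A minimal sketch of the encrypt/decrypt pair, assuming the key is a Base64-encoded 256-bit value; the ENCRYPTION_KEY variable name is illustrative:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public final class EncryptionUtils {
    // Key never lives in the repo: read from the environment at runtime
    private static SecretKeySpec key() {
        byte[] keyBytes = Base64.getDecoder().decode(System.getenv("ENCRYPTION_KEY"));
        return new SecretKeySpec(keyBytes, "AES"); // 32 bytes -> AES-256
    }

    public static String encrypt(String plainText) throws Exception {
        // Note: plain "AES" defaults to ECB mode; AES/GCM/NoPadding is stronger in practice
        Cipher cipher = Cipher.getInstance("AES");
        cipher.init(Cipher.ENCRYPT_MODE, key());
        byte[] encrypted = cipher.doFinal(plainText.getBytes(StandardCharsets.UTF_8));
        return Base64.getEncoder().encodeToString(encrypted);
    }

    public static String decrypt(String cipherText) throws Exception {
        Cipher cipher = Cipher.getInstance("AES");
        cipher.init(Cipher.DECRYPT_MODE, key());
        byte[] decrypted = cipher.doFinal(Base64.getDecoder().decode(cipherText));
        return new String(decrypted, StandardCharsets.UTF_8);
    }
}
```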
Q: How did I make assertions and debugging more insightful?
A: Default assertions from TestNG or Rest Assured only provide pass/fail results without detailed logs. To address this, I created a wrapper utility class Assertion.java with methods like assertEquals() and assertTrue() that log assertion details using assertWithLog(), including actual and expected values, for both successful and failed assertions. I also created a LoggingMatcher.java that extends Hamcrest's TypeSafeMatcher with an overridden matchesSafely() method to log matcher-based assertions, making validation results transparent and easy to trace.
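A sketch of the wrapper idea, assuming Log4j2; the method names follow the description above, but the log format is illustrative:

```java
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.testng.Assert;

public final class Assertion {
    private static final Logger LOG = LogManager.getLogger(Assertion.class);

    public static void assertEquals(Object actual, Object expected, String message) {
        assertWithLog(() -> Assert.assertEquals(actual, expected, message),
                message, expected, actual);
    }

    public static void assertTrue(boolean condition, String message) {
        assertWithLog(() -> Assert.assertTrue(condition, message),
                message, true, condition);
    }

    // Runs the assertion and logs expected/actual on both pass and fail
    private static void assertWithLog(Runnable assertion, String message,
                                      Object expected, Object actual) {
        try {
            assertion.run();
            LOG.info("PASS: {} | expected={} | actual={}", message, expected, actual);
        } catch (AssertionError e) {
            LOG.error("FAIL: {} | expected={} | actual={}", message, expected, actual);
            throw e; // preserve TestNG's normal failure behavior
        }
    }
}
```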
Q: How did I make test execution logs truly useful for debugging?
A: To make logs more readable and distinguish between test cases, I implemented a TestNG ITestListener in TestResultLoggerListener.java with methods like onTestStart(), onTestSuccess(), and onTestFailure() that log separator lines and method parameters using result.getParameters() at the start, success, and failure of each test. This is especially useful for data-driven tests, as it logs the specific data used for each run.
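A sketch of the listener, assuming TestNG 7+ (where ITestListener provides default no-op implementations for the unused callbacks):

```java
import java.util.Arrays;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.testng.ITestListener;
import org.testng.ITestResult;

public class TestResultLoggerListener implements ITestListener {
    private static final Logger LOG = LogManager.getLogger(TestResultLoggerListener.class);
    private static final String SEPARATOR = "============================================";

    @Override
    public void onTestStart(ITestResult result) {
        LOG.info(SEPARATOR);
        // For data-driven tests this shows exactly which data row is running
        LOG.info("STARTING {} with parameters {}",
                result.getMethod().getMethodName(),
                Arrays.toString(result.getParameters()));
    }

    @Override
    public void onTestSuccess(ITestResult result) {
        LOG.info("PASSED {}", result.getMethod().getMethodName());
        LOG.info(SEPARATOR);
    }

    @Override
    public void onTestFailure(ITestResult result) {
        LOG.error("FAILED {} with parameters {}",
                result.getMethod().getMethodName(),
                Arrays.toString(result.getParameters()), result.getThrowable());
        LOG.info(SEPARATOR);
    }
}
```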
Q: Why did I choose Allure for reporting, and how do I keep results meaningful?
A: The default TestNG report was not detailed or user-friendly, so I implemented Allure reporting. I added the Allure dependency and a Rest Assured filter from the Allure library to capture request and response details in the report. To maintain report history, I manually copy the history folder from the previous report into the allure-results directory before generating a new report.
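Wiring the filter in is a one-liner; a sketch of how it might sit in the shared base class (the @BeforeSuite placement is an assumption):

```java
import io.qameta.allure.restassured.AllureRestAssured;
import io.restassured.RestAssured;
import org.testng.annotations.BeforeSuite;

public abstract class BaseTest {
    @BeforeSuite
    public void configureReporting() {
        // AllureRestAssured attaches every request/response pair to the report
        RestAssured.filters(new AllureRestAssured());
    }
}
```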
Q: How did I integrate the framework with CI/CD and GitHub Pages?
A: For CI/CD, I run the tests in GitHub Actions inside a containerized environment with Maven and Java. Tests are organized in packages like tests.airports, tests.flights, and tests.destinationExperiences, all extending BaseTest.java. After mvn clean test, the Allure CLI (installed in the container) generates the report with allure generate; the previous run's results are downloaded first and their history/ folder is copied into the new results so trends are preserved. The finished report is published to GitHub Pages, which also hosts the main Javadocs and test Javadocs, providing easy access to results and trends. The testng.xml configuration registers the listeners for retry analysis and logging, as sketched below.
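For reference, a testng.xml of roughly this shape wires in the listeners; the class and package names are assumptions based on the structure described below:

```xml
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Amadeus API Suite">
    <listeners>
        <!-- Attaches RetryAnalyzer to every @Test -->
        <listener class-name="testUtils.AnnotationTransformer"/>
        <!-- Logs separators and parameters around each test -->
        <listener class-name="testUtils.TestResultLoggerListener"/>
    </listeners>
    <test name="Airports">
        <packages>
            <package name="tests.airports"/>
        </packages>
    </test>
</suite>
```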
Q: How is the project structured for clarity and maintainability?
A: The project follows a clear separation of concerns:
  • Main Source: src/main/java/ contains utilities (utils/) and data models (records/)
  • Test Source: src/test/java/ contains test utilities (testUtils/) and organized test packages (tests/) by API domain
  • Configuration: pom.xml for Maven dependencies, testng.xml for test execution, config.properties for API credentials and endpoints
  • Reporting: allure-results/ for raw test data, github-pages/ for published reports and documentation
  • CI/CD: GitHub Actions workflows handle automated testing, report generation, and deployment