Uncovering Bugs and Enhancing Usability in Scrape Any Website (SAW)

Introduction

Exploratory testing is a critical phase in software development, aiming to uncover hidden bugs and usability issues that automated testing might miss. Recently, I conducted exploratory testing on version 1.1.19.0 of the Scrape Any Website (SAW) application using my HP EliteBook 1040 G4 running Windows 11. Here’s a detailed report of my findings and recommendations for enhancement.

Exploratory Testing Approach

My testing approach involved thoroughly exploring the functionalities of SAW, seeking to identify and document at least three significant bugs. I focused on severe and critical issues that could impact usability, performance, or security. This testing covered both expected workflows and edge cases to ensure a comprehensive assessment.

Findings

  1. Website with Expired SSL Certificate

    Description: SAW does not detect websites with expired SSL certificates and proceeds with scraping them, which can pose security risks.

    Environment: HP EliteBook 1040 G4, Windows 11, Chrome.

    Steps to Reproduce:

    • Launch the SAW application.

    • Enter the URL http://dv.ssl/ into Scrape Any Website.

    • Observe that SAW proceeds with scraping despite the expired SSL certificate.

Expected Result: SAW should detect websites with expired SSL certificates and report them as insecure, rather than proceeding with scraping.

Actual Result: Scrape Any Website (SAW) scraped the URL http://dv.ssl/, which has an expired SSL certificate.

Severity: Major.

  2. Scrape Any Website App Not Responsive When Minimized

    Description: When the SAW application is minimized, it stops responding, and some of its content becomes lost or inaccessible to the user.

    Steps to Reproduce:

    • Launch the SAW application on a Windows device.

    • Minimize the application.

    • Observe that the application becomes non-responsive and some content is no longer accessible.

Expected Result: The application should remain fully responsive, maintaining visibility and accessibility of all contents.

Actual Result: The application is not responsive when minimized, leading to loss of content visibility.

Severity: Major.

  3. Lack of User Guidance for New Users

    Description: The application lacks intuitive guidance for new users, making it difficult to navigate and use the features effectively.

    Steps to Reproduce:

    • Download, install, and launch SAW.

    • Attempt to use the features without prior experience or instructions.

Expected Result: An intuitive user interface or a tutorial guiding new users.

Actual Result: New users may struggle to navigate and use the application.

Severity: Minor.

  4. Empty Name Field

    Description: The Name field for a scraped URL, which should contain a meaningful identifier for each scrape task, is left empty.

    Steps to Reproduce:

    • Launch the Scrape Any Website (SAW) application.

    • Initiate a scrape task for the URL https://heroicons.com/.

    • Review the output data and observe the Name field.

Expected Result: The Name field should be populated with a meaningful identifier for the scraped URL (https://heroicons.com/), aiding in task identification and organization.

Actual Result: The Name field is empty, lacking any identifier or name associated with the scraped task.

Severity: Minor. While this issue does not impact functionality directly, it affects clarity and organization within the scraping process.
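A simple fallback would be to derive the Name from the scraped URL itself, so the field is never left blank. The sketch below illustrates one possible naming rule; the rule and function name are my assumptions, not SAW's actual behavior:

```python
from urllib.parse import urlparse


def default_task_name(url: str) -> str:
    """Derive a readable fallback task name from a URL,
    e.g. 'https://heroicons.com/' -> 'heroicons.com'."""
    parsed = urlparse(url)
    host = parsed.netloc or "unnamed-task"
    # Fold any path segments into the name for uniqueness.
    path = parsed.path.strip("/").replace("/", "-")
    return f"{host}-{path}" if path else host


print(default_task_name("https://heroicons.com/"))  # heroicons.com
```

With a rule like this, the heroicons.com task above would at least be labeled with its host name rather than left unidentified.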

Suggestions for Improvement

  1. SSL Certificate Detection: Implement a mechanism to detect expired SSL certificates and alert users, preventing the application from scraping insecure websites.

  2. Improve Responsiveness When Minimized: Ensure the application remains fully responsive even when minimized, preserving visibility and accessibility of all content.

  3. Enhance User Guidance: Introduce comprehensive onboarding materials or tutorials within the application to facilitate new users in navigating and utilizing SAW effectively.

  4. Populate Name Field: Ensure the Name field is automatically populated with a meaningful identifier for each scrape task, aiding in task identification and organization.

Conclusion

Exploratory testing of SAW revealed several key areas for improvement, particularly in SSL certificate handling, application responsiveness, user guidance, and task identification. Addressing these issues will significantly enhance the user experience and reliability of the application.

For more details, you can view the complete bug report here.