Hello everyone, I am Marx Chryz and I have been doing bug bounty hunting for about a year now. It’s also been two and a half years since I started doing web penetration testing.
The site I am hunting on is a private program, so I can’t disclose the URLs; I am just going to discuss the method.
I found an XSS once, but it was rated P5 and counted as Self-XSS because the payload was only visible to the attacker.
Because of this, I tried to improve my methods for finding URLs with potential XSS vulnerabilities. I started looking in hard-to-find places: hidden legacy pages.
Finding the vulnerability
For those who don’t know, legacy pages are old pages that are still available on the production site. Legacy pages exist, even when old, because they are still functional and are still used by other parts of the application.
During recon, I found a common subdomain, let’s call it sub.redacted.com. At the root of this URL (https://sub.redacted.com), I couldn’t find any XSS vulnerabilities, since I assume it is widely tested by fellow bug bounty hunters. So I tried finding folders inside sub.redacted.com. I was too lazy to fire up FFUF or Dirbuster because of my slow internet connection, so I started by checking whether the site had a robots.txt:
- Open https://sub.redacted.com/robots.txt
- Found a directory named “web-app”
- /web-app/ returned a blank page, so I tried guessing random files.
- /web-app/dashboard.php redirects to /web-app/logout.php
- Then I viewed the page source and found lots of .js files.
- The .js files contain URLs and lots of parameters 😎
- Manually checked all the URLs and parameters (many no longer work since they are legacy pages) to see if any of the parameter values get reflected in the page.
- Finally found 2 reflected XSS vulnerabilities (1 authenticated and 1 unauthenticated).
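The tedious part of the steps above is pulling endpoints and parameter names out of the .js files and checking whether a probe value comes back in the response. Here is a minimal sketch of how that could be scripted; the regex patterns, the `/web-app/` paths, and the parameter names in the example JS fragment are hypothetical placeholders, not the actual program's files:

```python
import re

def extract_endpoints(js_text):
    """Pull quoted .php paths (with optional query strings) out of JS source."""
    return re.findall(r"""["'](/[A-Za-z0-9_\-./]+\.php(?:\?[^"']*)?)["']""", js_text)

def extract_params(js_text):
    """Collect query-string parameter names referenced anywhere in the JS."""
    return sorted(set(re.findall(r"[?&]([A-Za-z0-9_]+)=", js_text)))

def is_reflected(response_body, marker):
    """A parameter is a reflection candidate if the probe marker comes back
    in the response body (follow up manually to check for encoding)."""
    return marker in response_body

# Made-up JS fragment standing in for a legacy script found via view-source:
js = '''
var u = "/web-app/search.php?q=test&page=1";
fetch("/web-app/profile.php?user=admin");
'''
print(extract_endpoints(js))  # both .php endpoints
print(extract_params(js))     # ['page', 'q', 'user']
```

In practice you would request each extracted endpoint with a unique marker string (e.g. `xss1337probe`) in every parameter and run `is_reflected` over the responses, then manually verify any hits for actual XSS.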
July 19 and 21, 2021 — Report Submitted
Sept 1, 2021 — Triaged as P3, each eligible for a $500 bounty