Ahmed.Attia

Offensive Security Certified Professional / OSCP | Module 5 - Ex2: Dega Challenge

Bash scripting - Exercise 2. Offensive Security Certified Professional (OSCP) video series by Ahmed: https://www.linkedin.com/in/limbo0x01/ https://twitter.c...
Thanks!
But how do I know that my sub_finder is working for all websites? I made one and I think it works for megacorpone.com and StackOverflow,
but when I tested it on google.com I got zero results. Checking the index.html of google.com, I found it is structured differently from megacorpone.com's.
So, any hints?
Thanks again.
 
@elshaikh Yeah, sure. In megacorpone.com each subdomain sits on its own line, like this:
<li><a href="http://intranet.megacorpone.com">LOG IN</a></li>
<!--<li><a href="https://cp.megacorpone.net/">LOG IN</a></li>-->
So when you grep for megacorpone.com, you get all of them.
But on something like uber.com, all the subdomains sit on a single line, so when you cut or grep them you will not get the correct result.

What do you need? Use the sed command to replace the spaces with newlines, and that should help.
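A minimal sketch of that trick, using a hypothetical sample file (example.com stands in for the real target) so you can see why the sed step matters when the whole page is on one line:

```shell
# Hypothetical uber-style HTML: every tag crammed onto one long line,
# so a plain grep returns the whole line instead of one subdomain each.
printf '<li><a href="http://one.example.com">A</a></li> <li><a href="http://two.example.com">B</a></li>\n' > page.html

# sed turns every space into a newline so each token lands on its own line;
# grep -o then keeps just the hostnames, and sort -u removes duplicates.
sed 's/ /\n/g' page.html \
  | grep -o '[a-z0-9.-]*\.example\.com' \
  | sort -u > subs.txt

cat subs.txt
```

On the sample above this leaves one subdomain per line in subs.txt (one.example.com and two.example.com), which is the shape a cut/grep-based sub_finder expects.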


Side note: this is not the video for sub_finder xD.
 


Caption

Dega file:

https://mega.nz/file/Yh4VECAL#2GvF-DCdzRikpZQnGH_ztnNIKf98GtrPI-VfTI1SRVQ

Stuff I used:
# read the capture, keep packets from 192.168.101.132 to port 3000,
# and pipe the raw pcap into tcpflow to reassemble the TCP streams
tcpdump -r data.pcap -l -w - src 192.168.101.132 and dst port 3000 | tcpflow -C -r -

cat out1 \
  | grep -v "POST / HTTP/1.1" \
  | grep -v "Host: 192.168.101.133:3000" \
  | grep -v "Connection: keep-alive" \
  | grep -v "Accept-Encoding: gzip, deflate" \
  | grep -v "Accept: */*" \
  | grep -v "User-Agent: python-requests/2.22.0" \
  | grep -v "Content-Type: application/x-www-form-urlencoded" \
  | grep -v "Content-Length: "
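The same filtering can be done in one grep pass with multiple -e patterns instead of a long chain of grep -v calls. A sketch, with a hypothetical out1 built inline so it is self-contained (header names are taken from the command above):

```shell
# Hypothetical sample of what one reassembled flow might look like.
printf 'POST / HTTP/1.1\nHost: 192.168.101.133:3000\nContent-Length: 9\n\ndata%%3D42\n' > out1

# One grep invocation: -v inverts the match, and each -e adds a header
# pattern to drop, leaving only the request bodies.
grep -v -e '^POST / HTTP/1.1' -e '^Host: ' -e '^Connection: ' \
        -e '^Accept-Encoding: ' -e '^Accept: ' -e '^User-Agent: ' \
        -e '^Content-Type: ' -e '^Content-Length: ' out1 > out2
```

On the sample input, out2 keeps the body line (data%3D42) and drops every header line.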



# strip literal newlines so each flow becomes one long line
tr -d '\n'
# drop the URL-encoded newlines (%0A) and decode %3D back to '='
strings out2 | sed 's/%0A//g' | sed 's/%3D/=/g' > out3
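The two sed calls above can also be written as a single sed with two substitutions. A small self-contained sketch (the sample line is hypothetical, standing in for a line of out2):

```shell
# Hypothetical URL-encoded line, mimicking what ends up in out2.
printf 'user%%3Dadmin%%0Apass%%3Ds3cret\n' > sample

# One sed, two substitutions: delete the encoded newlines (%0A),
# then decode %3D back to '='.
sed 's/%0A//g; s/%3D/=/g' sample > decoded

cat decoded
```

On the sample this yields user=adminpass=s3cret, matching what the sed pair above produces.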
