Tips & Tricks To Get Hall of Fame In NASA
Hi Ajak Amico's, welcome back to another blog. Today I will share tips and tricks to get a Hall of Fame from NASA. Before starting, if you haven't subscribed to our channel, do subscribe, guys: we post content related to cyber security, bug bounty, and digital forensics investigation.
Follow our Youtube Channel: @ajakcybersecurity (360 Videos)
Follow on Instagram: AjakCybersecurity
Introduction:
Hi guys, I recently got a Hall of Fame from NASA. If you haven't read that write-up yet, here is the link:
https://medium.com/@Ajakcybersecurity/how-i-got-hall-of-fame-in-nasa-4d098c413f9e
After hunting on NASA for 31 days, I submitted 7 bugs: 2 were marked N/A, 4 went Duplicate/Informational, and 1 got triaged. Here is the list of bugs I submitted:
1) Directory Listing Leads to Sensitive Info Disclosure -> P5
2) Bypassing Email Auth Leads to Pre-Account Takeover -> N/A
3) Email HTML Injection in Name Parameter -> P5/Duplicate
4) Bypassing Restricted Directory via Forced Browsing -> N/A
5) URL Misconfiguration Leads to Source Code Disclosure -> P4/Accepted
6) No Rate Limit on Password Reset Leads to Email Flooding -> P5/Duplicate
7) PII Information Leakage of NASA Users -> P5
As you can see, one got triaged. During this journey I encountered numerous subdomains and numerous parameters, which I will share below.
1) Subdomain Enumeration
When it comes to NASA, there is an enormous number of subdomains scattered out there. Trust me, the subdomain that got me the Hall of Fame was completely new, and I found it on day 30. You will find subdomains like "subdomain.nasa.gov" and "subdomain.subdomain.nasa.gov", and the tools I used for subdomain enumeration are:
- Subfinder
- https://subdomainfinder.c99.nl/
- Amass
- Google Dorks
My suggestion when hunting on NASA is: don't rely on HTTPX and Aquatone; instead use a bulk URL opener.
https://chromewebstore.google.com/detail/bulk-url-opener-extension/hgenngnjgfkdggambccohomebieocekm
The reason I say this is that you can't simply go through screenshots one by one to decide which target is worth hunting; as I said earlier, there is an enormous number of subdomains out there. The best way is to open all your sites via the Bulk URL Opener. So this will be your
Step-1:
- Enumerate mass subdomains with the above-mentioned tools.
- Combine all the enumerated subdomains into one file and save it as NASA_subdomains.txt.
- Use the "anew" tool to remove duplicate URLs from the file and save the result as final_NASA_subdomains.txt.
- Copy 20-25 sites at a time and open them via the Bulk URL Opener.
- Manually check each site and note the ones that look suspicious.
- Copy the suspicious sites into a new file, suspicious_domains.txt, so you can take a closer look at them later. (A small scripted version of this step is sketched below, after the tip.)
Tip: When checking for suspicious sites, check the last-updated date of the site, which you can find at the bottom corner of each NASA subdomain's page.
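If you prefer to script the merge-and-dedupe part instead of running anew by hand, here is a minimal Python sketch of Step 1. The input file names (subfinder.txt, amass.txt, c99.txt) are hypothetical placeholders for wherever you saved each tool's output; only final_NASA_subdomains.txt matches the step above.

```python
# combine_subdomains.py - minimal sketch of Step 1: merge tool output,
# dedupe (what anew does), and print batches of ~25 URLs for the bulk opener.
from pathlib import Path

INPUT_FILES = ["subfinder.txt", "amass.txt", "c99.txt"]  # hypothetical output files
BATCH_SIZE = 25

def load_unique(files):
    seen, ordered = set(), []
    for name in files:
        path = Path(name)
        if not path.exists():
            continue
        for line in path.read_text().splitlines():
            host = line.strip().lower()
            if host and host not in seen:
                seen.add(host)
                ordered.append(host)
    return ordered

subdomains = load_unique(INPUT_FILES)
Path("final_NASA_subdomains.txt").write_text("\n".join(subdomains) + "\n")

# Print batches ready to paste into the Bulk URL Opener extension.
for i in range(0, len(subdomains), BATCH_SIZE):
    print(f"--- batch {i // BATCH_SIZE + 1} ---")
    print("\n".join(f"https://{host}" for host in subdomains[i:i + BATCH_SIZE]))
```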
Fuzzing:
So the next part is fuzzing. I don't want to go too deep into this, since most of the fuzzing I did was manual fuzzing across all the sites. Anyway, when it comes to fuzzing you should aim for information-disclosure style findings; I will explain why later in the blog. These are the tools I used for fuzzing:
- Burp Suite Intruder to fuzz directories
- Dirsearch
- Shodan
- ParamSpider
- Nuclei
- Wappalyzer
- Waybackurls
- GAU
So these will be your next steps:
Step 2:
- Upon gathering all subdomains, check each and every subdomain manually.
- Next, collect all URLs from the subdomains via waybackurls and GAU (see the sketch after this list).
- Use https://raw.githubusercontent.com/Karanxa/Bug-Bounty-Wordlists/main/all_fuzz.txt for directory enumeration.
- Try to find sensitive directories when you fuzz on NASA.
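For the URL-gathering step, waybackurls and GAU pull from archive sources such as the Wayback Machine. If you want to see what that looks like without the tools, here is a rough Python sketch that queries the Wayback Machine CDX API directly; the domain in the example is just a placeholder, and the real tools cover more sources, so treat this as an illustration rather than a replacement.

```python
# wayback_urls.py - rough sketch of what waybackurls/GAU do under the hood:
# ask the Wayback Machine CDX API for archived URLs of one (sub)domain.
import json
import urllib.parse
import urllib.request

def wayback_urls(domain: str, limit: int = 500):
    params = urllib.parse.urlencode({
        "url": f"{domain}/*",     # every archived path under the domain
        "output": "json",
        "fl": "original",         # only return the original URL column
        "collapse": "urlkey",     # collapse near-duplicate captures
        "limit": str(limit),
    })
    api = f"https://web.archive.org/cdx/search/cdx?{params}"
    with urllib.request.urlopen(api, timeout=30) as resp:
        rows = json.load(resp)
    return [row[0] for row in rows[1:]]  # rows[0] is the header row

if __name__ == "__main__":
    for url in wayback_urls("www.nasa.gov"):  # placeholder target
        print(url)
```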
Manual Fuzzing:
**The parts below are very important, so don't skip them, guys.**
As I said earlier, most of the bugs I found in NASA came through manual fuzzing. What is manual fuzzing? Simple! Just click on each and every link in a website manually and notice what kind of parameters are passed through the URL. If you find any weird URL, try to disturb it and go for the injection classes of bugs such as XSS, SQLi, OS command injection, etc. When you click a link, also notice which site it redirects you to, and try to attain open redirection via this method.

The beauty of the NASA domain is that you go deeper and deeper into new links and parameters when you do manual fuzzing. Until the 17th day of hunting on NASA I was automating a lot of things to find bugs, but once I did manual fuzzing I realised that each and every link you click takes you deeper and gives you more to fuzz. By this method I found 3 self-XSS and 1 email HTML injection, so right after the 17th day I dropped my automation approach completely and started to hunt manually. My triaged bug in NASA came entirely from manual fuzzing, just by exploring each and every link carefully and very deeply.
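To make "disturbing the URL" a bit more concrete, here is a small Python sketch of the idea: take one parameterised URL and print a variant per query parameter with a harmless marker string, so you can quickly see which parameters reflect your input before digging in properly with Burp. The probe strings and the example URL are only illustrations, not a ready-made exploit.

```python
# param_probe.py - sketch of "disturbing the URL": emit one variant per query
# parameter with a marker payload so you can spot which parameters reflect input.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

PROBES = ['"><u>probe</u>', "'--probe", "../../probe"]  # reflection / syntax-break markers

def variants(url: str):
    parsed = urlparse(url)
    params = parse_qsl(parsed.query, keep_blank_values=True)
    for i, (key, _) in enumerate(params):
        for probe in PROBES:
            mutated = list(params)
            mutated[i] = (key, probe)
            yield urlunparse(parsed._replace(query=urlencode(mutated)))

if __name__ == "__main__":
    for u in variants("https://example.nasa.gov/search?q=mars&lang=en"):  # placeholder URL
        print(u)
```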
Forced Browsing
The next one is forced browsing. One of the most common types of bugs I found in NASA was forced browsing, which can lead you to information disclosure bugs. What is forced browsing?
When you click on a normal link in NASA:
https://www.dev.nasa.gov/User/Private/User1.json
Forced Browsing URL:
https://www.dev.nasa.gov/User/
So here you trim the URL from the specific resource (User1.json) back up to its parent directory (/User/). If the /User/ directory loads and you see PII leaking, it is vulnerable to forced browsing leading to information disclosure. Remember, you can't simply report every forced-browsing directory as a bug; you need to check whether the directories you reach actually expose sensitive information. A small sketch of automating this check follows, and after that comes your next step.
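Here is a minimal Python sketch of that check: walk a URL back one path segment at a time and report which parent directories answer with 200 (candidates for a listing or info disclosure). The target URL is the example from above; only run this against scope you are authorised to test, and always verify hits manually.

```python
# forced_browse.py - minimal sketch of the forced-browsing check: strip the URL
# back one path segment at a time and report each parent directory's HTTP status.
import urllib.error
import urllib.request
from urllib.parse import urlparse, urlunparse

def parent_urls(url: str):
    parsed = urlparse(url)
    parts = [p for p in parsed.path.split("/") if p]
    # /User/Private/User1.json -> /User/Private/ -> /User/ -> /
    for depth in range(len(parts) - 1, -1, -1):
        path = "/" + "/".join(parts[:depth]) + ("/" if depth else "")
        yield urlunparse(parsed._replace(path=path, query="", fragment=""))

def status(url: str) -> int:
    try:
        with urllib.request.urlopen(url, timeout=15) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code          # keep 403/404 etc. instead of raising
    except urllib.error.URLError:
        return 0                 # unreachable

if __name__ == "__main__":
    target = "https://www.dev.nasa.gov/User/Private/User1.json"  # example from this post
    for parent in parent_urls(target):
        print(status(parent), parent)
```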
Step 3:
- Enumerate mass subdomains.
- Fuzz via automation: use tools such as Shodan, ParamSpider, Dirsearch, and Burp Intruder to discover directories.
- Use waybackurls and GAU to gather the URLs of a subdomain, open them via the Bulk URL Opener, and check for sensitive parameters to inject payloads into.
- Do manual fuzzing: click each and every link manually and try to go deep, to reach more URLs and parameters.
- Use forced-browsing methods to pull sensitive info from directories.
Quick Tip: Try to read all the blogs posted on social media about how people got a Hall of Fame in NASA. I did that, and it was very, very useful for me.
What bugs can you generally find in NASA?
OK, after reading blogs on how people got the Hall of Fame in NASA and after 31 days of hunting, these are the common bugs you can expect to find in NASA.
1) Reflected XSS/Self XSS
When I read blogs and asked people how they got a Hall of Fame in NASA, most of the answers were XSS and info disclosure. To my luck, when I did manual fuzzing I found 3 self-XSS bugs, but sadly I couldn't escalate them.
2) Information Disclosure bugs.
This is the most common class of bugs you can find in NASA. As I mentioned earlier, using manual fuzzing and forced-browsing techniques you can easily find many information disclosure bugs.
3) Open Redirection
This is another common bug you can find in the NASA domain. Upon asking many security researchers who got a Hall of Fame in NASA, they mentioned this bug too, and you can find it by gathering URLs and through deep manual hunting.
4) Authentication Flaws:
I would give this the least priority since many people hunt for auth flaws, and even if you find one, unless it's highly severe they won't triage the bug; instead it will be marked as Informational or N/A. If you don't trust me, I wrote another blog about an email auth flaw I found in one of their main subdomains, which was marked as N/A.
Tip: Google Dorking is another efficient way to find info disclosure bugs.
Google Dorks to find out Sensitive Info:
Yes, using Google Dorks you can find a lot of sensitive information. Here is a list of Google Dorks I gathered manually from open sources.
site:.*.*.nasa.gov "Server Status" | confidential | "employee only" | proprietary | top secret | classified | trade secret | internal | private
intitle:Welcome to Firebase Hosting inurl:firebaseapp *Nasa.gov
inurl:conf | inurl:env | inurl:cgi | inurl:bin | inurl:etc | inurl:root | inurl:sql | inurl:backup | inurl:admin | inurl:php site:example[.]com
inurl:"error" | intitle:"exception" | intitle:"failure" | intitle:"server at" | inurl:exception | "database error" | "SQL syntax" | "undefined index" | "unhandled exception" | "stack trace" site:example[.]com
ext:txt | ext:pdf | ext:xml | ext:xls | ext:xlsx | ext:ppt | ext:pptx | ext:doc | ext:docx
intext:"confidential" | intext:"Not for Public Release" | intext:"internal use only" | intext:"passwords" | intext:"do not distribute"
intext:register site:*nasa.gov
ext:log | ext:txt | ext:conf | ext:cnf | ext:ini | ext:env | ext:sh | ext:bak | ext:backup | ext:swp | ext:old | ext:~ | ext:git | ext:svn | ext:htpasswd | ext:htaccess site:*nasa.gov
inurl:q= | inurl:s= | inurl:search= | inurl:query= | inurl:keyword= | inurl:lang= inurl:& site:nasa.gov
site:s3.amazonaws.com intext:"@nasa.gov"
Try the above-mentioned Google Dorks against each and every subdomain, and try to automate them with your own script (a minimal sketch is below); I swear you will find some sensitive info this way.
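If you want a starting point for that automation, here is a minimal Python sketch that pairs a few dorks with your subdomain list and prints ready-to-open Google search URLs, which you can paste into the Bulk URL Opener. The dork strings are trimmed examples from the list above, and final_NASA_subdomains.txt is the file from Step 1.

```python
# dork_links.py - sketch for automating the dorks: pair each dork with each
# subdomain and print Google search URLs ready for the Bulk URL Opener.
from pathlib import Path
from urllib.parse import quote_plus

DORKS = [                      # trimmed examples from the list above
    'intext:"confidential" | intext:"internal use only" | intext:"do not distribute"',
    'ext:log | ext:env | ext:bak | ext:old | ext:sql',
    'inurl:conf | inurl:admin | inurl:backup',
]

subdomains = Path("final_NASA_subdomains.txt").read_text().split()  # from Step 1

for host in subdomains:
    for dork in DORKS:
        query = f"site:{host} {dork}"
        print(f"https://www.google.com/search?q={quote_plus(query)}")
```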
Advantages and Disadvantages of the NASA VDP
- NASA VDP is managed by Bugcrowd.
- Email HTML injection bugs are marked as P5 unless you escalate to XSS
- Your points will not be deducted when you submit N/A bugs.
- You won't get any bounty or points when you find a bug in NASA.
- You will get a letter of appreciation from NASA when your bug is triaged between P1 and P4.
- The one real downside of the NASA VDP is duplicates. Many security researchers out there aim for that appreciation letter, and a lot of people hunt on NASA. One of the bugs I submitted was marked Informational/Duplicate because the original bug had been reported back in June 2023, and I reported it again in May 2024. So initially it will be tough to get a Hall of Fame unless luck favours you.
- Report whatever bug you find in NASA; it may even be a simple stack trace error. When my bug got triaged, I didn't have much faith that it would be accepted, but to my luck a source code leakage bug was accepted, which is quite similar to a stack trace error bug. So just give it a go; you won't lose any points upon submitting N/A bugs.
Conclusion:
The one thing you need to keep in mind when you hunt on NASA is consistency and patience; it took me 31 days to find one valid bug, so keep trying and you will find one too. Before concluding, special thanks to @kamilrahman32 (https://medium.com/@kamilrahman32) for his tips. I hope you enjoyed this blog. Cheers, we will meet in the next blog. Happy weekend, folks :)
PS: If you find a bug in NASA, don't forget to give credit :) And if you find this blog useful, buy me a coffee here: https://www.buymeacoffee.com/Ajak
----------------------------------------
Hope you learned some new information from this blog. If so, kindly press that follow button for further updates. Best wishes from Ajak Cybersecurity ❤️
"Learn and conquer" 🔥
Learn every day, happy hacking!
https://www.buymeacoffee.com/Ajak
----------------------------------------
Follow our Youtube Channel: @ajakcybersecurity
Follow on Instagram: @ajakcybersecurity