Recently, the Cyber Op Source team was hired to review a brand-new website that a third party had built for our client. Our client was ready to make the cutover to the new website the day we were called in. They were toward the end of the project and decided to have us review the website before going public. We were called in last so they could save some money. The project was over budget, and the organization wanted to get the product out early without putting the website through its paces. There is much to unpack in just those first few sentences, but I won't get into it. Besides, Cyber Op Source, like Planet Fitness, is a judgment-free zone. Let's just say that security controls should be built into the entire development process, not bolted on at the end. The client wanted to get the website published quickly, and the website did look great. From a business perspective, we understood the client's mindset and goal. From a cyber perspective, we must ensure we are protecting our client. We know we must strike the right balance and remember that, ultimately, our job is to aid, not hinder, the client's progress.
Before accessing the internal system, we scanned the public website for vulnerabilities. The scan did not turn up any major vulnerabilities, but it did reveal a number of information-leakage issues. The website, as beautifully designed as it was, had security holes that needed to be addressed, and the information exposed by these leaks could easily be used by an attacker to find and exploit vulnerabilities. With our client's permission, we will share with you the top three issues found during scanning.
[Image: phpinfo() output]
Issue 1 - The phpinfo() function was found in production
The phpinfo() function is a commonly used function that provides a comprehensive overview of a web server's PHP configuration and environment variables. The first thing any good hacker does before attacking a system is gather information, a process known as reconnaissance (recon). When the phpinfo() function is placed inside a PHP page and executed, the page displays information about the PHP version, installed extensions, server settings, environment variables, and more. This goes a long way in aiding an attacker during the recon phase.
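To make this concrete, here is a minimal sketch of the kind of leftover diagnostic page we are talking about; the filename info.php is an illustrative assumption, not the client's actual file.

```php
<?php
// info.php - a typical diagnostic page left over from development.
// Anyone who can reach this URL sees the full PHP configuration:
// PHP version, loaded extensions, ini settings, paths, and environment variables.
phpinfo();
```

If a file like this survives the move to production, every visitor gets the same recon report an attacker would otherwise have to work for.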
There are legitimate uses for the phpinfo() function. Its primary purpose is to help developers, system administrators, and operators diagnose and troubleshoot PHP-related issues. You can use it to verify the PHP version, identify enabled extensions, check server settings, and confirm the configuration of various PHP parameters. This information can also help debug and optimize performance-related issues and ensure compatibility with specific PHP functions and features.
The flip side is that phpinfo() can pose significant dangers if not handled carefully:
- Information Disclosure: The main danger of phpinfo() is that it can expose sensitive information about the server, its configuration, and the server environment to anyone who can access the PHP page, including attackers. This information can include the server's operating system, PHP version, database credentials, directory paths, and other sensitive environment details. Attackers can use this information to gain insight into the server's weaknesses and plan targeted attacks.
- Security Vulnerabilities: Now that attackers have this information, what's next? They can use it to identify misconfigured or vulnerable systems and to exploit known vulnerabilities, not just the ones associated with specific PHP versions or extensions.
- Exploiting the System: Finally, this can lead to unauthorized access, code execution, or other malicious activities. Trust me, you do not want these issues, especially for a pretty website.
Solutions for Issue 1
- Remove or Comment Out phpinfo() Calls: You can do manual checks during code review, automated scans for phpinfo() calls as part of your CI/CD process, or both. You can also disable the phpinfo() function in your php.ini file (a sketch follows this list).
- Restrict Access to Files: If, for whatever reason, the file is needed in the test environment but not in production, you can deny access via your .htaccess file in production (also sketched below). You can also create access controls, proxy filters, and/or special code that allows it to execute only in specific scenarios. You can combine these measures for a defense-in-depth approach.
- Educate the IT Team: As part of your overall cyber security training program, developers, website designers, system administrators, etc., should be educated on the dangers of information leakage in general.
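Here is a rough sketch of the first two suggestions, assuming an Apache server running PHP and a leftover diagnostic file named info.php (both assumptions for illustration only):

```ini
; php.ini - disable the function globally so any stray call outputs nothing
disable_functions = phpinfo
```

```apache
# .htaccess (production) - deny direct access to the diagnostic file
<Files "info.php">
    Require all denied
</Files>
```

Note that disable_functions can only be set in php.ini itself, not in .htaccess, and the `Require all denied` syntax assumes Apache 2.4 or later.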
Issue 2 - The robots.txt file contains sensitive information
The robots.txt file could potentially expose directories that would not have been explored otherwise. Robots.txt is a commonly used website file that tells search engine crawlers which parts of a site they may crawl; it helps manage search engine indexing and improves website visibility. A misconfigured robots.txt, however, can have unforeseen consequences, because listing sensitive paths in it hands that information to anyone who requests the file. This leads us to issue 3.
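Before we move on, here is a hypothetical robots.txt that illustrates the problem; the paths are made up, but entries like these hand an attacker a map of exactly where to look:

```txt
User-agent: *
Disallow: /admin/
Disallow: /uploads/
Disallow: /backup/
```

Well-behaved crawlers honor these lines, but attackers read them too. Anything you would not want a stranger browsing should be protected by access controls, not merely hidden from search engines.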
[Image: Browsable directory - taken from https://subscription.packtpub.com]
Issue 3 - Upload directory is browsable
A browsable directory means that the directory where uploaded files live on the server can be opened and viewed by anyone who visits the directory's URL. This can be very dangerous: it can expose private files and sensitive information, and the listing itself gives attackers additional material for their recon to further exploit the system.
Solutions for Issue 2 and Issue 3
I decided to bundle the solutions for issues 2 and 3 together because they are closely related. Here are some potential solutions:
Disable Directory Browsing: Depending on your server setup and permissions, you can disable directory browsing via your web server configuration file. Apache, Nginx, and IIS all allow you to disable directory browsing; two examples are sketched below. You can also add an index.html or index.php to the directory. Again, depending on your server configuration, this should be relatively straightforward.
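As a quick sketch, assuming a fairly standard Apache or Nginx setup (your paths and configuration layout may differ):

```apache
# Apache - in .htaccess or inside the relevant <Directory> block in httpd.conf
Options -Indexes
```

```nginx
# Nginx - autoindex is off by default, but stating it explicitly documents the intent
location /uploads/ {
    autoindex off;
}
```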
Store Uploaded Files Outside the Web Root: Along with my cyber security and networking degrees, I also have a computer science degree; in fact, I am a full-stack Java developer by trade. Since I started programming, I have been taught that you never store your upload directory inside the web root. For example, the web root could be /var/www/html on Ubuntu and many other Linux distributions. (Note: there is no guarantee that this is the default web root location, so check your web server configuration file.) Depending on how the web application is designed, you could find the upload directory at /var/www/html/uploads. You can move it somewhere outside the web root, like /srv/uploads, and serve the files through application code instead. In addition, depending on the complexity of your environment and your IT department's skill level, you can move this folder to a separate storage service such as Amazon S3 or Google Cloud Storage.
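Here is a minimal PHP sketch of serving uploads that live outside the web root; the /srv/uploads path, the download.php name, and the ?file= parameter are illustrative assumptions, not the client's actual design.

```php
<?php
// download.php - serve files stored outside the web root (e.g., /srv/uploads)
$baseDir   = realpath('/srv/uploads');
$requested = basename($_GET['file'] ?? '');          // strip any directory components
$fullPath  = $requested === '' ? false : realpath($baseDir . '/' . $requested);

// Only serve files that actually resolve to a location inside the upload directory
if ($fullPath === false || strpos($fullPath, $baseDir . DIRECTORY_SEPARATOR) !== 0) {
    http_response_code(404);
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $requested . '"');
readfile($fullPath);
```

Because the files are no longer under the web root, a browsable directory or a lucky URL guess can no longer expose them; every download has to go through this code path.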
Consistent and Accurate Configurations: Whether we are discussing your robots.txt file, environment variables, server configurations, or server folders, you must make sure the information contained within these environments is consistent and accurate. Again, manual reviews should be done, as well as automated scanning for inaccuracies and inconsistencies within files and folders across your development, test, and production environments.
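One way to automate part of this check, sketched here with a placeholder hostname and an assumed reference copy of the approved robots.txt kept in version control:

```bash
# Flag drift between the robots.txt in production and the approved version
curl -s https://www.example.com/robots.txt | diff - config/robots.txt.expected \
    && echo "robots.txt matches the approved version" \
    || echo "robots.txt drift detected - review before release"
```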
Regular Review and Updates: A periodic review and update of configuration files such as robots.txt is advised. I know there are automated tools for everything nowadays, but I recommend doing both manual and automated checks. Have someone else review the files as well to introduce a peer-review step.
Don't Practice Security by Obscurity: There are better security strategies than hiding things and hoping no one finds them. Be mindful of what you put online. Attackers look for any piece of information they can get, whether it is the contents of a robots.txt file or the output of a phpinfo() call. It is all a piece of the puzzle.
Along with the tips outlined above, please visit the OWASP Top Ten and the SANS Top 25. Remember to scan all public-facing web applications. You can use tools such as Burp Suite, OWASP ZAP, Nessus, Nikto2, or WPScan (for WordPress websites); a couple of example commands are shown below. These are all tools that can help you establish good cyber hygiene. Just remember, the suggestions above are just that: suggestions. You should always consult with your IT department before making these changes or decisions. In addition, consider your organization's policies, procedures, and guidelines before making any changes to existing processes. In some cases, these decisions take time.
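As a hedged example of what a baseline scan might look like (the hostname is a placeholder, and you should only scan systems you are authorized to test):

```bash
# Basic Nikto scan of a public-facing site
nikto -h https://www.example.com

# WPScan, if the site happens to run WordPress
wpscan --url https://www.example.com
```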