June 22, 2015

Web Application Security in a Large Organization - Knowing What You Don’t Know…

Written by Scott Nusbaum
Categories: Application Security Assessment, Security Testing & Analysis
With news of a new breach appearing almost daily, one has to wonder whether their organization may be next. Having had the opportunity to help organizations of all sizes and lines of business, I have seen the different struggles each may face. Some organizations are new to a security program: they know they need to do something but don't know where to start. Unfortunately, this is a common situation with mergers and acquisitions, where years of technology have often been layered on top of older solutions. Having been engaged by multiple clients to help in such scenarios from a web application security perspective, we had to come up with a game plan for how to proceed. Items such as vulnerability scanning/management, user account provisioning/auditing, and policies and procedures were being handled by other people; my effort was to focus on the external perimeter first, then internal applications, and then everything else later.

Since these clients were large (even international) companies that were finally doing something about security for a variety of reasons (merger and acquisition, breach, PCI compliance, third-party mandate, etc.), the first step was cataloging the current external applications. The clients did not have a list of the IP address blocks they owned, their domain names, their websites, and so on. One might think this is crazy, but if you've ever been privy to the inner workings of some non-profit organizations, many operate on minuscule budgets and antiquated technology, with limited resources and expertise. The same goes for many other organizations, not just non-profits. The most important part of the task was to know what we didn't know. What did we know? What didn't we know? Interviews with current IT employees, marketing, and others revealed the processes (or lack thereof) for creating new online content. Sometimes IT creates a website, sometimes marketing does, and sometimes a third party hired by another department does.
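As a rough sketch of what such an application catalog can look like in structured form, here is a minimal Python model. The field names are my own illustration, not an actual client schema, and in practice a spreadsheet works just as well:

```python
import csv
from dataclasses import dataclass, asdict, fields


@dataclass
class CatalogEntry:
    """One row of a hypothetical application catalog.

    Field names are illustrative only; the real columns
    will vary by organization.
    """
    ip_address: str
    domain: str
    port: int
    hosting_provider: str
    business_owner: str = "unknown"
    contact: str = "unknown"


def write_catalog(entries, path):
    """Dump the catalog to CSV so other teams (e.g., internal audit) can use it."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=[fld.name for fld in fields(CatalogEntry)]
        )
        writer.writeheader()
        for entry in entries:
            writer.writerow(asdict(entry))
```

The "unknown" defaults are deliberate: when the catalog is first built, most business owners and contacts have not been tracked down yet, and the blanks themselves are useful, since they mark exactly what you don't know.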
The answers were not encouraging, so we set off on our task. We had to research where the organizations operated, but we also looked elsewhere in case sites were up that they didn't know about. We first searched for the organizations' names, and any subsidiaries' names, in ARIN, LACNIC, RIPE, AFRINIC, and APNIC, attempting to find IP addresses that might belong to them. When IP addresses were found, we used BGP looking glass sites (such as https://stat.ripe.net/widget/looking-glass) to look for autonomous system numbers as well. Google hacking was used to find domain names and sites that might be owned by the organizations. After IP address ranges were discovered, reverse lookups were performed to attempt to identify more domain names. We also obtained a DNS zone transfer from IT to see what it contained. We slowly built a list of IP addresses owned; top-level domains; subdomains; websites present, including port, IP, and where they were hosted; and any contact information found on each resource. Each time we found a new site or domain in Google, we would see where it was hosted, investigate the IP block, and so on. After about a week, we had a pretty good list of resources thought to be owned by each organization.

The resulting spreadsheet was then used by internal audit to track down business owners, if they could be found. The history behind each resource was slowly uncovered, and many of the sites that had been outsourced by the marketing departments or others had their ownership transferred to IT. It was a slow, long process, but the most important part was understanding what we knew and didn't know. Once external sites and applications were indexed into an "application catalog," efforts could be made to assess and secure each one accordingly. After the cataloging was complete, automated scanning (Nessus, OpenVAS, etc.) was performed in an attempt to identify low-hanging fruit.
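The reverse-lookup step over a discovered IP block can be sketched with nothing but the Python standard library. This is a minimal illustration, not the tooling we actually used, and any real block you sweep should be one you are authorized to probe:

```python
import ipaddress
import socket


def reverse_lookup_block(cidr):
    """Attempt a PTR lookup for every host address in a CIDR block.

    Returns a dict mapping each IP that resolved to its hostname;
    addresses with no PTR record are simply skipped.
    """
    found = {}
    for ip in ipaddress.ip_network(cidr).hosts():
        try:
            hostname, _, _ = socket.gethostbyaddr(str(ip))
            found[str(ip)] = hostname
        except OSError:  # no PTR record, NXDOMAIN, or resolver timeout
            continue
    return found
```

For the zone transfer check, the classic `dig axfr example.com @ns1.example.com` against each authoritative name server does the same job from the command line; most properly configured servers will refuse the transfer, and a server that doesn't is itself a finding.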
From there, the vulnerabilities were handed off to the newly created vulnerability management team. After those efforts were complete, dynamic web application security scanners were employed to look for web application vulnerabilities that might exist. Next, manual web application testing was performed on the sites to identify issues the scanners hadn't caught. This entire process occurred over several months; however, we now had a much better handle on what was owned, what was present externally, and what type of technical risk existed. Cataloging the applications that were present, who each business owner was, and how updates were performed allowed the organizations to move from a reactive, defensive approach of firefighting to a more proactive, offensive approach to information security.

Once the Internet-facing applications were accounted for, assessed, and patched, and the identified risks were mitigated or accepted, the vulnerability management team aided in cataloging internal applications using similar steps. Port scanning, DHCP, DNS, WMI (Windows Management Instrumentation), Active Directory, and a variety of other tools, protocols, and technologies were used to identify internal systems and networked devices. The cataloging was the responsibility of another team; however, many of the tasks were the same as for the external identification of sites and web applications. The applications were further categorized into custom-coded and third-party to determine how they would be assessed and their risks remediated. After months of effort, regular perimeter scans were being done to find new sites and assets. Antivirus, HIPS, and other solutions and technologies were implemented. Logging was being performed, traffic was being monitored and anomalies detected 24/7, and going from knowing nothing to knowing what was going on became a reality.
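The port-scanning part of the internal cataloging can be illustrated with a bare-bones TCP connect scan. This is only a sketch of the underlying idea; the actual inventories relied on dedicated tools:

```python
import socket


def scan_ports(host, ports, timeout=0.5):
    """Bare-bones TCP connect scan: return the ports that accept a connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            try:
                # connect_ex returns 0 on success instead of raising
                if s.connect_ex((host, port)) == 0:
                    open_ports.append(port)
            except OSError:  # timeouts, unreachable hosts, etc.
                pass
    return open_ports
```

A dedicated scanner such as Nmap is far faster and richer (SYN scans, service and version detection), which is why purpose-built tools were used for the real cataloging; the point here is only how little is required to check whether a port answers.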
There were many folks working on multiple aspects of the security programs; however, regular communication and organization kept everyone on the same page and schedule. In situations like these, a merger and acquisition was turned from a very risky or even ugly investment into a well-functioning organization with some up-front investment. That time and monetary investment may seem like a lot, but the return on investment is quickly apparent when compared to the cost of a single breach, let alone the loss of reputation. The real secret to the transformation was being able to acknowledge ignorance and knowing what we didn't know…

This article was written by Scott White.