
What We Can Learn From Recent Major Data Breaches


Data breaches are on the rise, and regardless of what sector an organization is in, the risk of becoming the next victim is higher than ever before. Attackers continue to evolve their methods, and unfortunately, network defenders are spread dangerously thin. Hundreds if not thousands of security alerts a day pour into Security Operations Centers (SOCs), creating a never-ending fire drill for most security professionals.

The sophistication of these attack methods means that the growing volume of cyber attacks is harder to detect, and the evidence may be lurking among seemingly low-priority security alerts. Manual triage means that the time from compromise to detection and response, often referred to as dwell time, continues to rise. This gives attackers ample time to probe a network, infect its hosts, and pivot to reach its high-value data. In most cases, the damage has already been done by the time an attack is detected.

According to the 2017 Data Breach Year-End Review, released by the Identity Theft Resource Center (ITRC) and CyberScout, 2017 hit an all-time record of 1,579 breaches. That is up a startling 44.7% from 2016! With 2018 figures expected to surpass this record high, this is no longer a trend but an epidemic. From Anthem and Yahoo to Target and Equifax, it seems that no company is safe. So, what can be learned from the misfortune of these companies to ensure that your organization does not join this ever-growing list of victims?

Lesson 1: Third-Party Vendor Compromise

The Target breach is the poster child for this security misstep. The initial compromise came from a third-party vendor hired by Target to monitor and manage their HVAC system. Using the vendor's stolen credentials, attackers were able to gain access to and pivot through the Target network all the way up to their Point of Sale (POS) network. From there we all know how that story ended, so what could have been done to prevent this from happening?

Almost every company has a third-party vendor relationship. In some cases, these vendors may even have access to the corporate environment to perform the job they were contracted to do. Before granting access to a third-party vendor, organizations need to take a few precautions to prevent a breach of their network through their contracted vendor. Prerequisites should include:

  • Require the vendor to be audited for security gaps. Request validation that activities such as user security training are regularly taking place. This will ensure that your vendor takes the security of their systems as seriously as your organization does.

  • Perform a network audit of the systems and network segments these vendors will have access to. Ensure their exposure to sensitive areas of the network is prohibited.

  • In cases where the vendor must have access to systems or network segments that cannot be segregated from the rest of the network, strict access controls and monitoring rulesets MUST be applied to their user accounts. Any abnormality, such as an attempt to access a system outside their assigned job function, must be flagged immediately. Automation rules can also be set up to immediately revoke network access when this activity is observed.

  • Deploy a User and Entity Behavior Analytics (UEBA) solution. This allows network defenders to monitor user and other entity behavior, protecting the organization from threats that may have eluded its other defenses.

  • If possible, require vendors to connect from static IP addresses. Anomalies such as the same username appearing from different IPs or geolocations can then be flagged immediately and shut down via automation rules (see the sketch after this list).
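
As a rough illustration of the automation rules mentioned above, the following Python sketch checks a vendor login event against an allow-list of source networks and in-scope systems and triggers containment on a mismatch. The event format, the VENDOR_POLICY table, and the revoke_network_access() hook are hypothetical placeholders rather than any specific product's API.

```python
# Minimal sketch of an automation rule that flags and revokes anomalous vendor access.
from dataclasses import dataclass
from ipaddress import ip_address, ip_network

# Per-vendor policy: allowed source networks and the systems they may touch.
VENDOR_POLICY = {
    "hvac_vendor_svc": {
        "allowed_networks": [ip_network("203.0.113.0/28")],   # vendor's static IP range
        "allowed_systems": {"hvac-mgmt-01", "hvac-mgmt-02"},  # contracted scope only
    }
}

@dataclass
class LoginEvent:
    username: str
    source_ip: str
    target_system: str

def revoke_network_access(username: str) -> None:
    """Hypothetical containment hook; in practice this would call your
    directory service, NAC, or SOAR platform to disable the account."""
    print(f"[CONTAINMENT] Access revoked for {username}")

def evaluate(event: LoginEvent) -> bool:
    """Return True if the event violates policy (and trigger containment)."""
    policy = VENDOR_POLICY.get(event.username)
    if policy is None:
        return False  # not a tracked vendor account
    ip_ok = any(ip_address(event.source_ip) in net for net in policy["allowed_networks"])
    system_ok = event.target_system in policy["allowed_systems"]
    if not (ip_ok and system_ok):
        revoke_network_access(event.username)
        return True
    return False

# Example: a vendor credential used from an unexpected IP against a POS host.
evaluate(LoginEvent("hvac_vendor_svc", "198.51.100.7", "pos-segment-gw"))
```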

Lesson 2: Exposure of Vulnerable Systems to the Internet

If I had a dollar for every instance where a company was breached due to an unpatched vulnerability or the open exposure of sensitive systems to the Internet, I could retire early. This is without a doubt one of the most common threads among data breach victims. Vulnerability management and remediation are central to preventing this method of intrusion.

Another essential component to consider is asset management: a security operations center cannot protect what it does not know is there. This is especially important for organizations that have gone through a merger or acquisition. Merging two companies requires converging their networked environments, which can be an Achilles' heel if not approached with caution.

Asset identification and management can be a brutal task, but it must be done. If an organization is short-staffed and unable to perform this assessment, hire an outside security provider. Most vulnerability assessments include a discovery phase, which can help an organization begin to identify what assets it has in place, their vulnerabilities, and how to properly secure them. Vulnerability assessments and vulnerability scanning must be performed regularly, and automated processes can help prioritize which systems need attention first, as sketched below.
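
As an example of that kind of automated prioritization, the sketch below ranks scanned assets by CVSS score weighted by Internet exposure and business criticality. The asset fields and weights are illustrative assumptions, not a formal scoring standard.

```python
# Minimal sketch of remediation prioritization from vulnerability scan output.
from dataclasses import dataclass

@dataclass
class Asset:
    hostname: str
    max_cvss: float          # highest CVSS base score from the latest scan
    internet_facing: bool
    business_critical: bool

def priority(asset: Asset) -> float:
    # Simple weighted score: exposure and criticality amplify the raw CVSS score.
    score = asset.max_cvss
    if asset.internet_facing:
        score *= 1.5
    if asset.business_critical:
        score *= 1.3
    return score

assets = [
    Asset("legacy-web-01", 9.8, internet_facing=True,  business_critical=True),
    Asset("dev-sandbox-07", 7.5, internet_facing=False, business_critical=False),
    Asset("hr-db-02",       6.1, internet_facing=False, business_critical=True),
]

for a in sorted(assets, key=priority, reverse=True):
    print(f"{a.hostname:15} remediation priority {priority(a):.1f}")
```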

An additional weakness has begun to surface as more companies move their operations to the cloud. An incomplete understanding of cloud infrastructure and its security has left some of these organizations and their sensitive data fully exposed to the Internet. If an organization has moved, or is planning to move, some of its operations to the cloud, take the extra precaution of hiring a security provider who specializes in cloud infrastructure and security to assess and supervise the deployment. These professionals will help close any gaps and expose vulnerabilities that may have been inadvertently created during implementation.
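
For organizations running on AWS, a quick check like the following (a sketch using the boto3 SDK, assuming valid credentials are configured) can flag S3 buckets whose ACLs grant access to everyone. It only inspects ACLs; bucket policies and account-level public access blocks also need review.

```python
# Minimal sketch: flag S3 buckets whose ACLs grant access to public groups.
import boto3

PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

s3 = boto3.client("s3")  # requires valid AWS credentials in the environment

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    acl = s3.get_bucket_acl(Bucket=name)
    public_grants = [
        g["Permission"]
        for g in acl["Grants"]
        if g["Grantee"].get("Type") == "Group" and g["Grantee"].get("URI") in PUBLIC_GROUPS
    ]
    if public_grants:
        print(f"[EXPOSED] {name}: public grants {public_grants}")
```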

As these vulnerabilities are uncovered and remediation plans are created, ensure that strict monitoring rules are put in place around these vulnerable machines. By deploying these monitoring rules around an organization’s most vulnerable assets, security professionals will have greater visibility into their attack surface and can accelerate their response times to prevent an incident from turning into a breach.

Lesson 3: Weak Authentication and Access Controls

Another commonality these companies share is a lack of strong authentication enforcement and loose access controls. This security control is thought to be "número uno" when creating a secure networking environment, especially if the company has an Internet presence. Yet weak authentication continues to be the source of many intrusions.

When implementing authentication for your internal users or external customer base, these rules must apply:

  • Force users to create strong passwords, recommending long passphrases instead of easy-to-guess words that can be cracked through dictionary attacks.

  • On every system, change default user accounts and disable any unneeded accounts to prevent account hijacking.

  • Elevated user accounts should have strict monitoring rules applied. This ensures that if such an account is compromised, network defenders will be alerted immediately and can act to prevent further infiltration. Automation tasks can also aid security teams by taking containment actions such as disabling the account or changing its group membership until it can be remediated.

  • Utilize secondary mechanisms for forgotten passwords. Never send usernames and passwords in the same communication, and ensure credentials are encrypted both in transit and at rest.

  • Use Single Sign-On (SSO) authentication. This reduces the risk of end users choosing simple passwords and also provides a single access point to monitor and manage.

  • One of the most effective controls is two-factor authentication (2FA). This method reduces the risk of unauthorized access by requiring more than just a user's password to gain access to a system.

  • If storing customer or user passwords, network defenders must ensure passwords are hashed with a salt and not stored in the same database as the user accounts. The hashing algorithm should be stronger than SHA-1, with strict monitoring and segregation in place (see the sketch after this list).

  • User access must follow the principle of least privilege. All user and group accounts should be reviewed regularly to ensure proper restrictions are in place.
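
To make the password-storage point concrete, here is a minimal sketch of salted password hashing using PBKDF2-HMAC-SHA256 from Python's standard library, considerably stronger than unsalted SHA-1. Production systems often go further with dedicated schemes such as bcrypt or Argon2, and the iteration count below is only an illustrative starting point.

```python
# Minimal sketch: salted password hashing and constant-time verification.
import hashlib
import hmac
import os

ITERATIONS = 600_000  # illustrative work factor; tune to your hardware

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, derived_key); store both, never the plaintext password."""
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, key

def verify_password(password: str, salt: bytes, expected_key: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, expected_key)  # constant-time compare

salt, key = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, key))  # True
print(verify_password("password123", salt, key))                   # False
```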

Lesson 4: Human Error

Finally, and most importantly, the largest risk to an organization is its end users. Human error is extremely hard to predict and prevent. Whether their actions are careless, negligent, or simply uninformed, they can have devastating consequences.

Social engineering is by far the most effective method of exposing human vulnerabilities. The urge to click on links, open unknown attachments, and share personal and confidential information with seemingly familiar accounts or individuals continues to be the primary point of failure, even though we as a society know the risk.

The sophistication of these social engineering campaigns continues to evolve, which makes securing our users more important than ever before. Every organization knows that user awareness training must be performed regularly and built into the processes and procedures. However, the effort put forth in these trainings still seems to fall short. So, how can we protect users from themselves?

  • Most email services allow administrators to implement an email policy which will disable all hyperlinks within an email. By disabling these links, the risk of this intrusion technique is diminished (see the sketch after this list).

  • Utilize automation to detonate attachments in a sandbox before delivery. This, along with limiting what types of attachments can be sent and received by an organization, may help reduce the risk of receiving and executing malicious attachments.

  • Deploy application and website whitelisting. This is not a task for the faint of heart: it can be complicated, can take a long time to implement, and must have the buy-in of all departments that are served. However, application and website whitelisting can help prevent users from browsing to a hijacked or malicious site.

  • Implement cost-effective remote browser isolation in your organization. This technology runs the browsing session in an isolated environment separate from the user's local machine, creating a barrier that web-borne malware cannot easily circumvent. By utilizing this technology, organizations can shut down one of the most common attack vectors affecting their users.
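
As a simple illustration of the link-disabling policy from the first bullet, the sketch below defangs URLs in an inbound plain-text message before delivery. Real mail gateways do this at the filter layer and handle HTML and multipart messages; the regex and message handling here are deliberate simplifications.

```python
# Minimal sketch: neutralize hyperlinks in a plain-text email before delivery.
import re
from email import message_from_string

URL_PATTERN = re.compile(r"https?://\S+", re.IGNORECASE)

def defang(match: re.Match) -> str:
    # Rewrite http(s):// to hxxp(s):// and dots to [.] so the link is not clickable.
    return match.group(0).replace("http", "hxxp", 1).replace(".", "[.]")

def neutralize_links(raw_message: str) -> str:
    msg = message_from_string(raw_message)
    if not msg.is_multipart():
        body = msg.get_payload()
        msg.set_payload(URL_PATTERN.sub(defang, body))
    return msg.as_string()

sample = "Subject: Invoice\n\nPlease review: https://malicious.example.com/login"
print(neutralize_links(sample))
```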

There is no denying organizations have a lot to do to close these gaps. Whether it is budget constraints or a lack of staffing preventing an organization from adopting these simple principles, we must find a way to pull our heads out of the sand and recognize that we are one missed security control away from becoming the next victim.

These data breaches will not stop; in fact, they will only get worse. Attack sophistication will evolve and create even greater gaps than exist today. Begin taking steps, no matter how small, to implement some of these recommendations. Assess your vulnerabilities, put monitoring rules in place, know your assets, and educate your users, or the next organization we read about could be yours.
