Team Sentra
Read insightful articles by the Sentra team about topics such as preventing data breaches, securing sensitive data, and more.
8 Holiday Data Security Tips for Businesses
As the end of the year approaches and the holiday season brings a slight respite to many businesses, it's the perfect time to review and strengthen your data security practices. With fewer employees in the office and a natural dip in activity, the holidays present an opportunity to take proactive steps that can safeguard your organization in the new year. From revisiting access permissions to guarding sensitive data access during downtime, these tips will help you ensure that your data remains protected, even when things are quieter.
Here's how you can bolster your business’s security efforts before the year ends:
- Review Access and Permissions Before the New Year
Take advantage of the holiday downtime to review data access permissions in your systems. Ensure employees only have access to the data they need, and revoke permissions for users who no longer require them (or worse, are no longer employees). It's a proactive way to start the new year securely.
- Limit Access to Sensitive Data During Holiday Downtime
With many staff members out of the office, review who has access to sensitive data. Temporarily restrict access to critical systems and data for those not on active duty to minimize the risk of accidental or malicious data exposure during the holidays.
- Have a Data Usage Policy
With the holidays bringing a mix of time off and remote work, it's a good idea to revisit your data usage policy. A clear policy defines who can access what data, when, and how, which is especially important during the holiday season when staff availability may be lower. By setting clear rules, you can help prevent unauthorized access or misuse, keeping your data secure throughout the holidays and into 2025.
- Eliminate Unnecessary Data to Reduce Shadow Data Risks
Data security risks persist as long as data remains accessible. With the holiday season bringing potential distractions, it's a great time to review and delete any unnecessary sensitive data, such as PII or PHI, so shadow data doesn't pose a security risk as the year wraps up.
- Apply Proper Hygiene to Protect Sensitive Data
For sensitive data that must exist, apply proper hygiene such as masking/de-identification, encryption, and logging to ensure the data isn't improperly disclosed. With holiday sales, year-end reporting, and customer gift transactions in full swing, securing sensitive data is more important than ever. Many data stores have native tools that can assist (e.g., Snowflake DDM, Microsoft Purview MIP).
- Monitor Third-Party Data Access
Unchecked third-party access can lead to data breaches, financial loss, and reputational damage. The holidays often mean new partnerships or vendors handling seasonal activities like marketing campaigns or order fulfillment. Keep track of how vendors collect, use, and share your data. Create an inventory of vendors and map their data access to ensure proper oversight, especially during this busy time.
- Monitor Data Movement and Transformations
Data is dynamic and constantly on the move. Monitor whenever data is copied, moved from one environment to another, crosses regulated perimeters (e.g., GDPR), or is ETL-processed, as these activities may introduce new sensitive data vulnerabilities. The holiday rush often involves increased data activity for promotions, logistics, and end-of-year tasks, making it crucial to ensure new data locations are secure and configurations are correct.
- Continuously Monitor for New Data Threats
Despite our best protective measures, bad things happen. A user's credentials are compromised. A partner accesses sensitive information. An intruder gains access to the network. A disgruntled employee steals secrets. The holiday season's unique pressures and distractions increase the likelihood of such incidents. Watch for anomalies by continually monitoring data activity and alerting whenever suspicious activity occurs, so you can react swiftly to prevent damage or leakage, even amid the holiday bustle.
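The continuous-monitoring idea above can be sketched in a few lines. The snippet below flags users whose latest daily data-access count spikes far above their own historical baseline; the log format, field names, and threshold are illustrative assumptions, not any particular product's implementation.

```python
from statistics import mean, stdev

def flag_anomalies(access_log, sigma=3.0):
    """Flag users whose latest daily access count deviates far from
    their own historical baseline (a simple z-score check).
    access_log: {user: [daily_access_counts...]} -- hypothetical format.
    """
    alerts = []
    for user, counts in access_log.items():
        history, latest = counts[:-1], counts[-1]
        if len(history) < 2:
            continue  # not enough history to establish a baseline
        mu, sd = mean(history), stdev(history)
        if sd == 0:
            sd = 1.0  # avoid division by zero on a flat baseline
        if (latest - mu) / sd > sigma:
            alerts.append(user)
    return alerts

log = {
    "alice": [10, 12, 11, 9, 10, 250],   # sudden spike: suspicious
    "bob":   [40, 42, 38, 41, 39, 43],   # normal variation
}
print(flag_anomalies(log))  # -> ['alice']
```

A real pipeline would of course consume audit logs (e.g., CloudTrail events) and alert into a SIEM rather than print, but the core anomaly test is the same.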
Wrapping Up the Year with Stronger Data Security
By taking the time to review and update your data security practices before the year wraps up, you can start the new year with confidence, knowing that your systems are secure and your data is protected. Implementing these simple but effective measures will help mitigate risks and set a strong foundation for 2025. Don't let the holiday season be an excuse for lax security - use this time wisely to ensure your organization is prepared for any data security challenges the new year may bring.
Visit Sentra's demo page to learn more about how you can ensure your organization can stay ahead and start 2025 with a stronger data security posture.
Why DSPM Should Take A Slice of Your Cyber Security Budget
We find ourselves in interesting times. Enterprise cloud transformations have given rise to innovative cloud security technologies that are running at a pace even seasoned security leaders find head-spinning. As security professionals grapple with these evolving dynamics, they face a predicament of conflicting priorities that directly impact budget decisions.
So much innovation and possibilities, yet, the economic climate is demanding consolidation, simplification, and yes, budget cuts. So, how do you navigate this tricky balancing act? On one hand, you need to close those critical cybersecurity gaps, and on the other, you must embrace new technology to innovate and stay competitive. To add a touch more complexity, there’s the issue of CIOs suffering from "change fatigue." According to Gartner, this fatigue manifests as CIOs hesitate to invest in new projects and initiatives, pushing a portion of 2023's IT spending into 2024, a trend that is likely to continue into 2025. CIOs are prioritizing cost control, efficiencies, and automation, while scaling back those long IT projects that take ages to show returns.
Cloud Security: A Top Investment
PwC suggests that cloud security is one of the top investment areas for 2024. The cloud's complex landscape, often poorly managed, presents a significant challenge. Astoundingly, 97% of organizations have gaps in their cloud risk management plans. The cloud security arena is nothing short of a maze that is difficult to navigate, driving enterprises towards vendor consolidation in an effort to reduce complexity, drive greater predictability and achieve positive ROI quickly.
The cloud data security challenge is far from being solved, and this is precisely why the demand for Data Security Posture Management (DSPM) solutions is on the rise. DSPM shines a light on the entire multi-cloud estate by bringing in the data context. With easy integrations, DSPM enriches the entire cloud security stack, driving more operational efficiencies as a result of accurate data risk quantification and prioritization. By proactively reducing the data attack surface on an ongoing basis, DSPM plays a role in reducing the overall risk profile of the organization.
DSPM's Role in Supporting C-Suite Challenges
Amid economic uncertainty and regulatory complexity, taking a comprehensive and granular approach to prioritizing data risks can greatly enhance your 2024 cybersecurity investments.
DSPM plays a vital role in addressing the intricate challenges faced by CISOs and their teams. By ensuring the correct security posture for sensitive data, DSPM brings a new level of clarity and control to data security, making it an indispensable tool for navigating the complex data risk landscape. DSPM enables CISOs to make informed decisions and stay one step ahead of evolving threats, even in the face of uncertainty.
Let's break it down and bottom line why DSPM should have a spot in your budget:
- DSPM isn't just a technology; it's a proactive and strategic approach that empowers you to harness the full potential of your cloud data while having a clear prioritized view of your most critical data risks that will impact remediation efficiency and accurate assessment of your organization’s overall risk profile.
- Reduce Cloud Storage Costs by detecting and eliminating unused data, and drive up operational efficiency with targeted, prioritized remediation efforts that focus on the critical data risks that matter.
- Cloud Data Visibility: DSPM provides security leaders with a crystal-clear view of their organization's most critical data risks. It offers unmatched visibility into sensitive data across multi-cloud environments, ensuring that no sensitive data remains undiscovered. The depth and breadth of data classification gives enterprises a solid foundation for multiple use cases spanning DLP, data access governance, data privacy and compliance, and cloud security enrichment.
- Manage & Monitor Risk Proactively: Thanks to its ability to understand data context, DSPM offers accurate and prioritized data risk scores. It's about embracing the intricate details within larger multi-cloud environments that enable security professionals to make well-informed decisions. Adding the layer of data sensitivity, with its nuanced scoring, enriches this context even further. DSPM tools excel at recognizing vulnerabilities, misconfigurations, and policy violations. This empowers organizations to address these issues before they escalate into incidents.
- Regulatory Compliance becomes simpler with DSPM, helping organizations steer clear of hefty penalties. Security teams can align their data security practices with industry-specific data regulations and standards. Sentra assesses how your data security posture stacks up against the compliance and security frameworks your organization needs to comply with.
- Sentra's agentless DSPM platform offers quick setup, rapid ROI, and seamless integration with your existing cloud security tools. It deploys effortlessly in your multi-cloud environment within minutes, providing valuable insights from day one. DSPM enhances your security stack, collaborating with CSPMs, CNAPPs, and CWPPs to prioritize data risks based on data sensitivity and security posture. It ensures data catalog accuracy and completeness, supports data backup, and aids SIEMs and Security Lakes in threat detection. DSPM also empowers Identity Providers for precise access control and bolsters detection and access workflows by tagging data-based cloud workloads, optimizing data management, compliance, and efficiency.
The Path Forward
2024 is approaching fast, and DSPM is an investment in long-term resilience against the ever-evolving data risk landscape. In planning 2024's cybersecurity budget, it's essential to strike a balance between simplification, innovation, and cost reduction. DSPM stands ready to play its part in this intricate budgeting dance.
Why Data is the New Center of Gravity in a Connected Cloud Security Ecosystem
As many forward-thinking organizations embrace the transformational potential of innovative cloud architectures, new dimensions of risk are emerging, centered around data privacy, compliance, and the protection of sensitive data. This shift has catapulted cloud data security to the top of the Chief Information Security Officer's (CISO) agenda.
At the Gartner Security and Risk Management summit, Gartner cited some of the pressing priorities for CISOs as safeguarding data across its various forms, adopting a simplified approach, optimizing resource utilization, and achieving low-risk, high-value outcomes. While these may seem like a tall order, they provide a clear roadmap for the future of cloud security.
In light of these priorities, Gartner also highlighted the pivotal trend of integrated security systems. Imagine a holistic ecosystem where proactive and predictive controls harmonize with preventative measures and detection mechanisms. Such an environment empowers security professionals to continuously monitor, assess, detect, and respond to multifaceted risks. This integrated approach catalyzes the move from reaction to anticipation and resolution to prevention.
In this transformative ecosystem, we at Sentra believe that data is the gravitational center of connected cloud security systems and an essential element of the risk equation. Let's unpack this some more.
It's All About the Data
Given the undeniable impact of major data breaches that have shaken organizations like Discord, Northern Ireland Police, and Docker Hub, we all know that the most potent risks often lead back to sensitive data.
Security teams have many cloud security tools at their disposal, from Cloud Security Posture Management (CSPM) and Cloud Native Application Protection Platform (CNAPP) to Cloud Access Security Broker (CASB). These are all valuable tools for identifying and prioritizing risks and threats in the cloud infrastructure, network, and applications, but what really matters is the data.
Let's look at an example of a configuration issue detected in an S3 bucket. The next logical question will be what kind of data resides inside that datastore, how sensitive the data is, and how much of a risk it poses to the organization when aligned with specific security policies that have been set up. These are the critical factors that determine the real risk. Can you imagine assessing risk without understanding the data? Such an assessment would inevitably fall short, lacking the contextual depth necessary to gauge the true extent of risk.
Why is this important? Because sensitive data will raise the severity of the alert. By factoring data sensitivity into risk assessments, prioritizing data-related risks becomes more accurate. This is where Sentra's innovative technology comes into play. By automatically assigning risk scores to the most vital data risks within an organization, Sentra empowers security teams and executives with a comprehensive view of sensitive data at risk. This overview extends the option to delve deep into the root causes of vulnerabilities, even down to the code level.
Prioritized Data Risk Scoring: The Sentra Advantage
Sentra's automated risk scoring is built from a rich data security context. This context originates from a thorough understanding of various layers:
- Data Access: Who has access to the data, and how is it governed?
- User Activity: What are the users doing with the data?
- Data Movement: How does data move within a complex multi-cloud environment?
- Data Sensitivity: How sensitive is the data?
- Misconfigurations: Are there any errors that could expose data?
This creates a holistic picture of data risk, laying a firm and comprehensive foundation for Sentra's unique approach to data risk assessment and prioritized risk scoring.
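To make the idea concrete, the sketch below combines the five context layers above into a single prioritized score per data store. The weights, signal names, and store names are illustrative assumptions for demonstration, not Sentra's actual model.

```python
# Illustrative risk-scoring sketch: combine normalized context signals
# into one prioritized score per data store. Weights are assumptions.
WEIGHTS = {
    "sensitivity": 0.35,   # how sensitive the data is (0-1)
    "access": 0.20,        # breadth of access to the data (0-1)
    "activity": 0.15,      # unusual user activity observed (0-1)
    "movement": 0.15,      # data recently copied across environments (0-1)
    "misconfig": 0.15,     # severity of misconfigurations (0-1)
}

def risk_score(signals: dict) -> float:
    """Weighted sum of normalized risk signals, scaled to 0-100."""
    return round(100 * sum(WEIGHTS[k] * signals.get(k, 0.0) for k in WEIGHTS), 1)

stores = {
    "s3://prod-customer-exports": {"sensitivity": 1.0, "access": 0.8,
                                   "activity": 0.3, "movement": 0.6, "misconfig": 0.9},
    "s3://public-assets":         {"sensitivity": 0.1, "access": 1.0,
                                   "activity": 0.1, "movement": 0.0, "misconfig": 0.2},
}

# Rank data stores so remediation starts with the riskiest one.
for name, sig in sorted(stores.items(), key=lambda kv: -risk_score(kv[1])):
    print(name, risk_score(sig))
```

Note how a highly sensitive store with a serious misconfiguration outranks a publicly accessible store holding low-sensitivity data: the data context, not just the exposure, drives the priority.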
Contextualizing Data Risk
Context is everything when it comes to accurate risk prioritization and scoring. Adding the layer of data sensitivity – with its nuanced scoring – further enriches this context, providing a more detailed perspective of the risk landscape. This is the essence of an integrated security system designed to empower security leaders with a clear view of their exposure while offering actionable steps for risk reduction.
The value of this approach becomes evident when security professionals are empowered to manage and monitor risk proactively. The CISO is armed with insights into the organization's vulnerabilities and the means to address them. Data security platforms, such as Sentra's, should seamlessly integrate with the workflows of risk owners. This facilitates timely action, eliminating the need for bottlenecks and unnecessary back-and-forth with security teams.
Moving Forward
The connection between cloud security and data is profound, shaping the future of cybersecurity practices. A data-centric approach to cloud security will empower organizations to harness the full potential of the cloud while safeguarding the most valuable asset: their data.
5 Key Findings for Cloud Data Security Professionals from ESG's Survey
Securing sensitive cloud data is a key challenge and priority for 2023 and there's increasing evidence that traditional data security approaches are not sufficient. Recently, Enterprise Strategy Group surveyed hundreds of IT, Cloud Security, and DevOps professionals who are responsible for securing sensitive cloud data. The survey had 4 main objectives:
- Determine how public cloud adoption was changing data security priorities
- Explore data loss, particularly of sensitive data, from public cloud environments
- Learn the different approaches organizations are adopting to secure their sensitive cloud data
- Examine data security spending trends
The 26-page report is full of insights on each of these topics. In this blog, we'll dive into 5 of the most compelling findings and explore what each of them means for cloud data security leaders.
More Data is Migrating to the Cloud - Even Though Security Teams Aren't Confident They Can Keep It Secure
ESG's findings show that 26% of organizations currently have more than 40% of their company's data in the cloud. But within 24 months, far more organizations (58%) expect to have that much of their data in the cloud.
On the one hand, this isn’t surprising. The report notes that digital transformation initiatives combined with the growth of remote/hybrid work environments are pushing this migration. The challenge is that the report also shows that sensitive data is being stored in more than one cloud platform and when it comes to IaaS and PaaS data, more than half admit that a large amount of that data is insufficiently secured. In other words - security isn’t keeping pace with this push to store more and more data in the public cloud.
Cloud Data Loss Affects Nearly 60% of Respondents - Yet They're Confident They Know Where Their Data Is
59% of surveyed respondents know they’ve lost sensitive data or suspect they have (with the vast majority saying they lost it more than once). There are naturally many reasons for this, including misconfigurations, misclassifications, and malicious insiders. But at the same time, over 90% said they’re confident in their data discovery and classification abilities. Something doesn’t add up. This gives us a clear indication that existing/defensive security controls are insufficient to deal with cloud data security challenges.
The problem here is likely shadow data. Of course security leaders would secure the sensitive data that they know about. But you can’t secure what you’re unaware of. And with data being constantly moved and duplicated, sensitive assets can be abandoned and forgotten. Solving the data loss problem requires a richer data discovery to provide a meaningful security context. Otherwise, this false sense of security will continue to contribute to sensitive data loss.
Almost All Data Warehouses Have Sensitive Data
Where is this sensitive data being stored? 86% of survey respondents say that they have sensitive data in data lakes or data warehouses. A third of this data is business critical, with almost all the remaining data considered ‘important’ for the business.
Data lakes and warehouses allow data scientists and engineers to leverage their business and customer data to use analytics and machine learning to generate business insights, and have a clear impact on the enterprise. Keeping this growing amount of business critical sensitive data secure is leading to increasing adoption of cloud data security tools.
The Ability to Secure Structured and Unstructured Data is the Most Important Attribute for Data Security Platforms
With 45% of organizations facing a cybersecurity skills shortage, there’s a clear movement towards automation and security platforms to pick up some of the work securing cloud data. With data being stored across different cloud platforms and environments, two thirds of respondents mentioned preferring a single tool for cloud data security.
When choosing a data security platform, the 3 most important attributes were:
- Data type coverage (structured and unstructured data)
- Data location coverage
- Integration with security tools
It’s clear that as organizations plan for a future with increasing amounts of data in the public cloud, we will see a widespread adoption of cloud data security tools that can find and secure data across different environments.
Cloud Data Security has an Address in the Organization - The Cloud Security Architect
Cloud data security has long been a responsibility shared among any number of different teams. DevOps, legal, security, and compliance all have a role to play. But increasingly, we're seeing data security become chiefly the responsibility of the cloud security architect.
86% of organizations surveyed now have a cloud security architect role, and 11% more are hiring for this role in the next 12-24 months - and for good reason. Of course, the other teams, including infrastructure and development continue to play a major role. But there is finally some agreement that sensitive data requires its own focus and is best secured by the cloud security architect.
Top 8 AWS Cloud Security Tools and Features
AWS – like other major cloud providers – has a ‘shared responsibility’ security model for its customers. This means that AWS takes full responsibility for the security of its platform – but customers are ultimately responsible for the security of the applications and datasets they host on the platform.
This doesn’t mean, however, that AWS washes its hands of customer security concerns. Far from it. To support customers in meeting their mission critical cloud security requirements, AWS has developed a portfolio of cloud security tools and features that help keep AWS applications and accounts secure. Some are offered free, some on a subscription basis. Below, we’ve compiled some key points about the top eight of these tools and features:
1. Amazon GuardDuty
Amazon's GuardDuty threat detection service analyzes your network activity, API calls, workloads, and data access patterns across all your AWS accounts. It uses AI to check and analyze multiple sources, from Amazon CloudTrail event logs to DNS logs, Amazon VPC Flow Logs, and more. GuardDuty looks for anomalies that could indicate infiltration, credential theft, API calls from malicious IPs, unauthorized data access, cryptocurrency mining, and other serious cyberthreats. The subscription-based tool also draws updated threat intel from feeds like Proofpoint and CrowdStrike to ensure workloads are fully protected from emerging threats.
2. AWS CloudTrail
Identity is an increasingly serious attack surface in the cloud. And this makes visibility over AWS user account activity crucial to maintaining uptime and even business continuity. AWS CloudTrail enables you to monitor and record account activity - fully controlling storage, analysis and remediation - across all your AWS accounts. In addition to improving overall security posture through recording user activity and events, CloudTrail offers important audit functionality for proof of compliance with emerging and existing regulatory regimes like HIPAA, SOC and PCI. CloudTrail is an invaluable addition to any AWS security war chest, empowering admins to capture and monitor API usage and user activity across all AWS regions and accounts.
3. AWS Web Application Firewall
Web applications are attractive targets for threat actors, who can easily exploit known web layer vulnerabilities to gain entry to your network. AWS Web Application Firewall (WAF) guards web applications and APIs from bots and web exploits that can compromise security and availability, or unnecessarily consume valuable processing resources. AWS WAF addresses these threats by enabling control over which traffic reaches applications, and how it reaches them. The tool lets you create fully-customizable security rules to block known attack patterns like cross-site scripting and SQL injection. It also helps you control traffic from automated bots, which can cause downtime or throw off metrics owing to excessive resource consumption.
4. AWS Shield
Distributed Denial of Service (DDoS) attacks continue to plague companies, organizations, governments, and even individuals. AWS Shield is the platform’s built-in DDoS protection service. Shield ensures the safety of AWS-based web applications – minimizing both downtime and latency. Happily, the standard tier of this particular AWS service is free of charge and protects against most common transport and network layer DDoS attacks. The advanced version of AWS Shield, which does carry an additional cost, adds resource-specific detection and mitigation techniques to the mix - protecting against large-scale DDoS attacks that target Amazon ELB instances, AWS Global Accelerator, Amazon CloudFront, Amazon Route 53, and EC2 instances.
5. Amazon Inspector
With the rise in adoption of cloud hosting for storage and computing, it’s crucial for organizations to protect themselves from attacks exploiting cloud vulnerabilities. A recent study found that the average cost of recovery from a breach caused by cloud security vulnerabilities was nearly $5 million. Amazon Inspector enables automated vulnerability management for AWS workloads. It automatically scans for software vulnerabilities, as well as network vulnerabilities like remote root login access, exposed EC2 instances, and unsecured ports – all of which could be exploited by threat actors. What’s more, Inspector’s integral rules package is kept up to date with both compliance standards and AWS best practices.
6. Amazon Macie
Supporting Amazon Simple Storage Service (S3), Amazon’s Macie data privacy and security service leverages pattern matching and machine learning to discover and protect sensitive data. Recognizing PII or PHI (Protected Health Information) in S3 buckets, Macie is also able to monitor the access and security of the buckets themselves. Macie makes compliance with regulations like HIPAA and GDPR simpler, since it clarifies what data there is in S3 buckets and exactly how that data is shared and stored publicly and privately.
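To give a feel for the pattern-matching half of what a service like Macie does, here is a toy sketch of regex-based sensitive-data discovery. The patterns are deliberately simplified illustrations; real classifiers add machine learning, keyword context, and validation such as checksums.

```python
import re

# Toy illustration of pattern-based sensitive-data discovery, the kind
# of technique data-classification services build on. Simplified patterns.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_object(text: str) -> dict:
    """Return {pattern_name: match_count} for findings in one object."""
    return {name: len(rx.findall(text)) for name, rx in PATTERNS.items()
            if rx.search(text)}

sample = "Contact jane.doe@example.com, SSN 123-45-6789."
print(scan_object(sample))  # -> {'email': 1, 'ssn': 1}
```

In practice you would run such a scan over every object in a bucket and aggregate the findings per data store, which is essentially the inventory Macie surfaces for compliance reporting.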
7. AWS Identity and Access Management
AWS Identity and Access Management (IAM) enables secure management of identities and access to AWS services and resources. IAM works on the principle of least privilege, meaning that each user should only be able to access the information and resources necessary for their role. But achieving least privilege is a constantly evolving process, which is why IAM works continuously to ensure that fine-grained permissions change as your needs change. IAM also allows AWS customers to manage identities per account or offer multi-account access and application assignments across AWS accounts. Essentially, IAM streamlines AWS permissions management, helping you set, verify, and refine policies toward achieving least privilege.
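As a concrete illustration of least privilege, the fragment below is a minimal identity-based policy granting read-only access to a single bucket. The bucket name and statement ID are placeholders for illustration, not a recommended production policy.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadOnlyReportsBucket",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-reports-bucket",
        "arn:aws:s3:::example-reports-bucket/*"
      ]
    }
  ]
}
```

Because IAM denies by default, a user attached only to this policy can list and read that one bucket and nothing else; broadening access means consciously adding statements rather than trimming down from a wildcard.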
8. AWS Secrets Manager
AWS aptly calls their secrets management service Secrets Manager. It’s designed to help protect access to IT resources, services and applications – enabling simpler rotation, management and retrieval of API keys, database credentials and other secrets at any point in the secret lifecycle. And Secrets Manager allows access control based on AWS Identity and Access Management (IAM) and resource-based policies. This means you can leverage the least privilege policies you defined in IAM to help control access to secrets, too. Finally, Secrets Manager handles replication of secrets – facilitating both disaster recovery and work across multiple regions.
There are many more important utilities we couldn't cover in this blog, such as AWS Audit Manager, that are equally important in their own right. Yet the key takeaway is this: even though AWS customers are responsible for their own data security, AWS makes a real effort to help them meet and exceed security standards and expectations.
Access Controls that Move - The Power of Data Security Posture Management
Controlling access to data has always been one of the basics of cybersecurity hygiene. Managing this access has evolved from basic access control lists to an entire Identity and Access Management industry. IAM controls are great at managing access to applications, infrastructure, and on-prem data. But cloud data is a trickier issue: data in the cloud changes environments and is frequently copied, moved, and edited.
This is where data access tools share the same weakness: what happens when the data moves? (Spoiler: the policy doesn't follow.)
The Different Access Management Models
There are 3 basic types of access controls enterprises use to control who can read and edit their data.
Access Control Lists: Basic lists of which users have read/write access.
Role Based Access Control (RBAC): The administrator defines access by what roles the user has - for example, anyone with the role ‘administrator’ is granted access.
Attribute Based Access Control (ABAC): The administrator defines which attributes a user must have to access an object - for example, only users with the job title 'engineer', and only when accessing the data from a certain location, will be granted access. These policies are usually defined in XACML, which stands for "eXtensible Access Control Markup Language".
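The ABAC model above boils down to evaluating request attributes against a policy. A minimal sketch, with illustrative attribute names and values not tied to any product or to XACML syntax:

```python
# Minimal ABAC sketch: a policy is a set of required attribute values,
# and access is granted only when the request satisfies all of them.
POLICY = {
    "job_title": "engineer",
    "location": "office",
}

def is_allowed(request_attrs: dict, policy: dict = POLICY) -> bool:
    """Grant access only if every required attribute matches the policy."""
    return all(request_attrs.get(k) == v for k, v in policy.items())

print(is_allowed({"job_title": "engineer", "location": "office"}))  # True
print(is_allowed({"job_title": "engineer", "location": "home"}))    # False
```

Real ABAC engines support richer conditions (time windows, set membership, hierarchies), but the core evaluation is this conjunction of attribute checks.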
How Access Controls are Managed in the Cloud
The major public cloud providers include a number of access control features.
AWS, for example, has long included clear instructions on managing access to consoles and S3 buckets. In RDS, users can tag and categorize resources and then build access policies based on those tags.
Similar controls exist in Azure: Azure RBAC allows owners and administrators to create RBAC roles, and Azure ABAC, currently in preview, will allow for fine-grained access control in Azure environments.
Another aspect of access management in the cloud is ‘assumed roles’ in which a user is given access to a resource they aren’t usually permitted to access via a temporary key. This permission is meant to be temporary and permit cross account access as needed. Learn more about Azure security in our comprehensive guide.
The Problem: Access Controls Don't Follow the Data
So what’s missing? When data access controls are put in place in the cloud, they’re tied to the data store or database that the controls were created for. Imagine the following scenario. An administrator knows that a specific S3 bucket has sensitive data in it. Being a responsible cloud admin, they set up RBAC or ABAC policies and ensure only the right users have permissions at the right times. So far so good.
But now someone comes along and needs some of the data in that bucket. Maybe just a few details from a CSV file. They copy/paste the data somewhere else in your AWS environment.
Now what happens to that RBAC or ABAC policy? It doesn’t apply to the copied data - not only does the data not have the proper access controls set, but even if you’re able to find the exposed sensitive data, it’s not clear where it came from, or how it’s meant to be protected.
How Sentra’s DSPM Ensures that Data Always Has the Proper Access Controls
What we need is a way for the access control policy to travel with the data throughout the public cloud. This is one of the most difficult problems that Data Security Posture Management (DSPM) was created to tackle.
DSPM is an approach to cloud security that focuses on finding and securing sensitive data, as opposed to the cloud infrastructure or applications. It accomplishes this by first discovering sensitive data (including shadow or abandoned data). DSPM classifies the data types using AI models and then determines whether the data has the proper security posture and how best to remediate if it doesn’t.
While data discovery and classification are important, they’re not actionable without understanding:
- Where the data came from
- Who originally had access to the data
- Who has access to the data now
The divide between what a user currently has access to versus what they should have access to is referred to as the ‘authorization gap’.
Sentra’s DSPM solution is able to understand who has access to the data and close this gap through the following processes:
- Detecting unused privileges and adjusting for least privileged access based on user behavior: For example, if a user has access to 10 data stores but only accesses 2 of them, Sentra will notice and suggest removing access from the other 8.
- Detecting user groups with excessive access to data. For example, if a user in the finance team has access to the developer environment, Sentra will raise a flag to remove the over privileged user.
- Detecting overprivileged similar data: For example, if sensitive data in production is only accessible by 2 users, but 85% of the data exists somewhere where more people have access, Sentra will alert the data owners to remediate.
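The first detection above, flagging unused grants for least-privilege cleanup, can be sketched as a simple set difference between what a user is granted and what they actually access. User and store names are invented for illustration; a real system derives the "accessed" set from audit logs over a time window.

```python
# Least-privilege sketch: compare granted data stores against the stores
# a user actually touched, and flag the rest for revocation.

def unused_grants(granted: set, accessed: set) -> set:
    """Stores the user can reach but never used - revocation candidates."""
    return granted - accessed

granted = {f"store-{i}" for i in range(1, 11)}   # user has access to 10 stores
accessed = {"store-1", "store-2"}                # but only ever uses 2 of them

to_revoke = unused_grants(granted, accessed)
print(len(to_revoke))  # 8 stores flagged for least-privilege cleanup
```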
Access control and authorization remains one of the most important ways of securing sensitive cloud data. A data centric security solution can help ensure that the right access controls always follow your cloud data.
Types of Sensitive Data: What Cloud Security Teams Should Know
Types of Sensitive Data: What Cloud Security Teams Should Know
Not all data is created equal. If there’s a breach of your public cloud, but all the hackers access is company photos from your last happy hour… well, no one really cares. It’s not making headlines. On the other hand, if they leak a file which contains the payment and personal details of your customers, that’s (rightfully) a bigger deal.
This distinction means that it’s critical for data security teams to understand the types of data that they should be securing first. This blog will explain the most common types of sensitive data organizations maintain, and why they need to be secured and monitored as they move throughout your cloud environment.
Types of Sensitive Cloud Data
Personally Identifiable Information (PII): The National Institute of Standards and Technology (NIST) defines PII as:
(1) any information that can be used to distinguish or trace an individual‘s identity, such as name, social security number, date and place of birth, mother‘s maiden name, or biometric records; and (2) any other information that is linked or linkable to an individual, such as medical, educational, financial, and employment information.
User and customer data has become an increasingly valuable asset for businesses, and the amount of PII - especially in the cloud - has increased dramatically in only the past few years.
The value and amount of PII means that it is frequently the type of data that is exposed in the most famous data leaks. This includes the 2013 Yahoo! breach, which affected 3 billion records, and the 2017 Equifax breach.
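As a rough illustration of how PII discovery starts, the sketch below pattern-matches two of the identifiers NIST lists (SSN-shaped numbers and email addresses). This is deliberately simplistic: production classifiers combine many more patterns with validation, context, and ML models to reach usable accuracy.

```python
# Illustrative PII scanner using regular expressions. The two patterns
# shown are a small subset of what real classifiers check.
import re

PII_PATTERNS = {
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
}

def find_pii(text: str) -> dict:
    """Map each PII type to the matches found in the text."""
    return {name: pattern.findall(text) for name, pattern in PII_PATTERNS.items()}

sample = "Contact jane.doe@example.com, SSN 123-45-6789."
print(find_pii(sample))
```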
Payment Card Industry (PCI): PCI data includes credit card information and payment details. The Payment Card Industry Security Standards Council created PCI-DSS (Data Security Standard) as a way to standardize how credit cards can be securely processed. To become PCI-DSS compliant, an organization must follow certain security practices with the aim of achieving 6 goals:
- Build and maintain a secure network
- Protect cardholder data
- Maintain a vulnerability management program
- Implement strong access control measures
- Regularly monitor and test networks
- Maintain an information security policy
Protected Health Information (PHI): In the United States, PHI regulations are defined by the Health Insurance Portability and Accountability Act (HIPAA). This data includes any past and future data about an identifiable individual’s health, treatment, and insurance information. The guidelines for protecting PHI are periodically updated by the US Department of Health and Human Services (HHS) but on a technological level, there is no one ‘magic bullet’ that can guarantee compliance. Compliant companies and healthcare providers will layer different defenses to ensure patient data remains secure. By law, HHS maintains a portal where breaches affecting 500 or more patient records are listed and updated.
Intellectual Property: While every company should consider user and employee data sensitive, what qualifies as sensitive IP varies from organization to organization. For SaaS companies, this could be the source code of customer-facing services or customer-base trends. Identifying the most valuable data to your enterprise, securing it, and maintaining that security posture should be a priority for all security teams, regardless of the size of the company or where the data is stored.
Developer Secrets: For software companies, developer secrets such as passwords and API keys can be accidentally left in source code or in the wild. Often these developer secrets are unintentionally copied and stored in lower environments, data lakes, or unused block storage volumes.
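Scanning for leaked developer secrets often starts with pattern matching, as in the sketch below. The AWS access key prefix (`AKIA` followed by 16 uppercase alphanumerics) is a widely used heuristic; the password pattern and sample code are invented, and real scanners check many more credential formats and use entropy checks to cut false positives.

```python
# Minimal secret scanner sketch: flag credential-shaped strings in source.
import re

SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "generic_password":  re.compile(r"password\s*=\s*['\"][^'\"]+['\"]", re.IGNORECASE),
}

def scan_for_secrets(source: str) -> list:
    """Return the names of every secret pattern found in the source text."""
    return [name for name, pattern in SECRET_PATTERNS.items() if pattern.search(source)]

code = 'aws_key = "AKIAIOSFODNN7EXAMPLE"\nPASSWORD = "hunter2"'
print(scan_for_secrets(code))  # ['aws_access_key_id', 'generic_password']
```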
The Challenge of Protecting Sensitive Cloud Data
When all sensitive data was stored on-prem, data security basically meant preventing unauthorized access to the company’s data center. Access could be requested, but the data wasn’t actually going anywhere. Of course, the adoption of cloud apps and infrastructures means this is no longer the case. Engineers and data teams need access to data to do their jobs, which often leads to moving, duplicating, or changing sensitive data assets. This growth of the ‘data attack surface’ leads to more sensitive data being exposed/leaked, which leads to more breaches. Breaking this cycle will require a new method of protecting these sensitive data classes.
Cloud Data Security with Data Security Posture Management
Data Security Posture Management (DSPM) was created for this new challenge. Recently recognized by Gartner® as an ‘On the Rise’ category, DSPMs find all cloud data, classify it by sensitivity, and then offer actionable remediation plans for data security teams. By taking a data centric approach to security, DSPM platforms are able to secure what matters to the business first - their data.
To learn more about Sentra’s DSPM solution, you can request a demo here.
Data Context is the Missing Ingredient for Security Teams
Data Context is the Missing Ingredient for Security Teams
Why are we still struggling with remediation and alert fatigue? In every cybersecurity domain, as we get better at identifying vulnerabilities, and add new automation tools, security teams still face the same challenge - what do we remediate first? What poses the greatest risk to the business?
Of course, the capabilities of cyber solutions have grown. We have more information about breaches and potential risk than ever. If in the past, an EDR could tell you which endpoint has been compromised, today an XDR can tell you which servers and applications have been compromised. It’s a deeper level of analysis. But prioritizing what to focus on first is still a challenge. You might have more information, but it’s not always clear what the biggest risk to the business is.
The same can be said for SIEMs and SOAR solutions. If in the past we received alerts and made decisions based on log and event data from the SIEM, now we can factor in threat intelligence and third party sources to better understand compromises and vulnerabilities. But again, when it comes to what to remediate to best protect your specific business these tools aren’t able to prioritize.
The deeper level of analysis we’ve been conducting for the last 5-10 years is still missing what’s needed to make effective remediation recommendations - context about the data at risk. We get all these alerts, and while we might know which endpoints and applications are affected, we’re blind when it comes to the data. That ‘severe’ endpoint vulnerability your team is frantically patching? It might not contain any sensitive data that could affect the business. Meanwhile, the reverse might be true - that less severe vulnerability at the bottom of your to-do list might affect data stores with customer info or source code.
AWS CISO Stephen Schmidt, showing data as the core layer of defense at this year’s AWS re:Inforce
This is the answer to the question ‘why is prioritization still a problem?’ - the data. We can’t really prioritize anything properly until we know what data we’re defending. After all, the whole point of exploiting a vulnerability is usually to get to the data.
Now let’s imagine a different scenario. Instead of getting your usual alerts and then trying to prioritize, you get messages that read like this:
‘Severe Data Vulnerability: Company source code has been found in the following unsecured data store:____. This vulnerability can be remediated by taking the following steps: ___’.
You get the context of what’s at-risk, why it’s important, and how to remediate it. That’s data centric security.
Why Data Centric Security is Crucial for Cloud First Companies
Data centric security wasn’t always critical. When everything was stored in the corporate data center, it was enough to just defend the perimeter, and you knew the data was protected. You also knew where all your data was - literally in the room next door. Sure, there were risks around information kept on local devices, but there wasn’t a concern that someone would accidentally save 100 GB of information to their device.
The cloud and data democratization changed all that. Now, besides not having a traditional perimeter, there’s the added issue of data sprawl. Data is moved, duplicated, and changed at previously unimaginable scales. And even when data is secured properly, that security posture doesn’t travel with the data when it is moved. Legacy security tools built for the on-prem era can’t provide the level of security context needed by organizations with petabytes of cloud data.
Data Security Posture Management
This data context is the promise of data security posture management solutions. Recently recognized in Gartner’s Hype Cycle for Data Security Report as an ‘On the Rise’ category, DSPM gets to the core of the context issue. DSPM solutions attack the problem by first identifying all data an organization has in the cloud. This step often leads to the discovery of data stores that security teams didn’t even know existed. Following this, the next stage is classification, where the types of data are labeled - this could be PII, PCI, company secrets, source code, etc. Any sensitive data found to have an insufficient security posture is passed to the relevant teams for remediation. Finally, the cloud environment must be continuously assessed for future data vulnerabilities, which are again forwarded to the relevant teams with remediation suggestions in real time.
In a clear example of the benefits offered by DSPM, Sentra has identified source code in open S3 buckets of a major ecommerce company. By leveraging machine learning and smart metadata scanning, Sentra quickly identified the valuable nature of the exposed asset and ensured it was quickly remediated.
If you’re interested in learning more about DSPM or Sentra specifically, request a demo here.
Cloud Data Security Means Shrinking the “Data Attack Surface”
Cloud Data Security Means Shrinking the “Data Attack Surface”
Traditionally, the attack surface was just the sum of the different attack vectors that your IT was exposed to. The idea was that as you removed vectors through patching and internal audits, the attack surface shrank.
With the adoption of cloud technologies, the way we managed the attack surface changed. As the attack vectors changed, new tools were developed to find security vulnerabilities and misconfigurations. However, the principle remained similar - prevent attackers from accessing your cloud infrastructure and platform by eliminating and remediating attack vectors. But attackers will find their way - it’s only a matter of "when".
The data attack surface is a new concept. When data was all in one location (the enterprise’s on-premises data center), this wasn’t something we needed to consider. Everything was in the data center, so defending the perimeter was the same thing as protecting the data. But what makes cloud data vulnerable isn’t primarily technical vulnerabilities in the cloud environment. It’s the fact that there’s so much data to defend and it's not clear where all that data is, who is responsible for it, and what its security posture is supposed to be. The sum of the total vulnerable, sensitive, and shadow data assets is the data attack surface.
Reducing the Data Attack Surface
Traditional attack surface reduction is accomplished by visualizing your enterprise’s architecture and finding unmanaged devices. The first step in data attack surface reduction is similar - except it's about mapping your cloud data. Only following a successful cloud data discovery program can you understand the scope of the project.
The second step of traditional attack surface reduction is finding vulnerabilities and indicators of exposures. This is similarly adaptable to the data attack surface. By classifying the data both by sensitivity (company secrets, compliance) and by security posture (how should this data be secured) cloud security teams can identify their level of exposure.
The final step in shrinking the attack surface involves remediating the data vulnerability. This can involve deleting an exposed, unused data store or ensuring that sensitive data has the right level of encryption. The idea should always be not to have more sensitive data than you need, and that data should always have the proper security posture.
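The three steps above - map the data, classify its exposure, and remediate - can be sketched as a toy triage pipeline. The store names, fields, and thresholds below are all invented for illustration; a real DSPM derives them from discovery and classification scans.

```python
# Toy data-attack-surface triage: flag sensitive stores that are unused
# or improperly secured, and propose a remediation for each.

inventory = [
    {"store": "s3://crm-exports", "sensitive": True,  "encrypted": False, "last_access_days": 400},
    {"store": "s3://app-logs",    "sensitive": False, "encrypted": False, "last_access_days": 1},
    {"store": "rds://customers",  "sensitive": True,  "encrypted": True,  "last_access_days": 2},
]

def remediation(store):
    """Return a suggested fix, or None if the store needs no action."""
    if not store["sensitive"]:
        return None  # not part of the data attack surface
    if store["last_access_days"] > 365:
        return "delete unused sensitive store"  # shrink the surface outright
    if not store["encrypted"]:
        return "enable encryption"  # fix the security posture
    return None  # already has the proper posture

plan = {s["store"]: remediation(s) for s in inventory if remediation(s)}
print(plan)  # {'s3://crm-exports': 'delete unused sensitive store'}
```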
3 Ways Reducing the Data Attack Surface Matters to the Business
- Reduce the likelihood of data breaches. Just like shrinking your traditional attack surfaces reduces the risk of a vulnerability being exploited, shrinking the data attack surface reduces the risk of a data breach. This is achieved by eliminating sensitive shadow data, which has the dual benefit of reducing both the overall amount of company data, and the amount of exposed sensitive data. With less data to defend, it’s easier to prioritize securing the most critical data to your business, reducing the risk of a breach.
- Stay compliant with data localization regulations. We’re seeing an increase in the number of data localization rules - data generated and stored in one country or region isn’t allowed to be transferred outside of that area. This can cause some obvious problems for cloud-first enterprises, as they may be using data centers all over the world. Orphaned or shadow data can be non-compliant with local laws, but because no one knows they exist, they pose a real compliance risk if discovered.
- Reducing overall security costs. The smaller your data attack surface is, the less time and money you need to spend defending it. There are cloud costs for scanning and monitoring your environment and there’s of course the time and human resources you need to dedicate to scanning, remediating, and managing the security posture of your cloud data. Shrinking the data attack surface means less to manage.
How Data Security Posture Management (DSPM) Tools Help Reduce the Data Attack Surface
Data Security Posture Management (DSPM) shrinks the attack surface by discovering all of your cloud data, classifying it by sensitivity to the business, and then offering plans to remediate all data vulnerabilities found. Often, shadow cloud data can be eliminated entirely, allowing security teams to focus on a smaller amount of data stores and databases. Additionally, by classifying data according to business impact, a DSPM tool ensures cloud security teams are focused on the most critical data - whether that’s company secrets, customer data, or sensitive employee data. Finally, remediation plans ensure that security teams aren’t just monitoring another dashboard, but are actually given actionable plans for remediating the data vulnerabilities, and shrinking the data attack surface.
The data attack surface might be a relatively new concept, but it’s based on principles security teams are well aware of: limiting exposure by eliminating attack vectors. When it comes to the cloud, the most important way to accomplish this is by focusing on the data first. The smaller the data attack surface, the less likely it is that valuable company data will be compromised.
Why It’s Time to Adopt a Data Centric Approach to Security
Why It’s Time to Adopt a Data Centric Approach to Security
Here’s the typical aftermath of a major data leak: there’s a breach at a large company, and the response from the security community is usually to invest more resources in preventing all possible data breaches. This might entail new DLP tools or infrastructure vulnerability management solutions.
But there’s something missing in this response.
The reason the breach was so catastrophic is that the data that leaked was valuable. It’s not the “network” that’s being leaked.
So that’s not where data centric security should start.
Here’s what the future of data breaches could look like: There’s a breach. But the breach doesn’t affect critical company or customer data because that data all has the proper security posture. There’s no press. And everyone goes home calmly at the end of the day.
This is going to be the future of most data breaches.
It’s just more attainable to secure specific data stores and files than it is to throw up defenses “around” the infrastructure. The truth is that most data stores do not contain sensitive information. So if we can just keep sensitive data in a small number of secured data stores, enterprises will be much more secure. Focusing on the data is a better way to prepare for a compromised environment.
Practical Steps for Achieving Data Centric Security
What does it take to make this a reality? Organizations need a way to find, classify, and remediate all data vulnerabilities. Here are the 5 steps to adopting a data centric security approach:
- Discover shadow data and build a data asset inventory.
You can’t protect what you don’t know you have. This is true of all organizations, but especially cloud first organizations. Cloud architectures make it easy to replicate or move data from one environment to another. It could be something as simple as a developer moving a data table to a staging environment, or a data analyst copying a file to use elsewhere. Regardless of how the shadow data is created, finding it needs to be priority number one.
- Classify the most sensitive and critical data
Many organizations already use data tagging to classify their data. While this often works well for structured data like credit card numbers, it’s important to remember that ‘sensitive data’ includes unstructured data as well. This includes company secrets like source code and intellectual property, which can cause as much damage as customer data in the event of a breach.
- Prioritize data security according to business impact
The reason we’re investing time in finding and classifying all of this data is for the simple reason that some types of data matter more than others. We can’t afford to be data agnostic - we should be remediating vulnerabilities based on the severity of the data at risk, not the technical severity of the alert. Differentiating between the signal and the noise is critical for data security. Ignore the severity rating of the infrastructure vulnerabilities if there’s no sensitive data at risk.
- Continuously monitor data access and user activity, and make all employees accountable for their data – this is not only the security team’s problem.
Data is extremely valuable company property. When you give employees physical company property - like a laptop or even a car- they know they’re responsible for it. But when it comes to data, too many employees see themselves as mere users of the data. This attitude needs to change. Data isn’t the security team’s sole responsibility.
- Shrink the data attack surface - take action to reduce the organization’s data sprawl.
Beyond remediating according to business impact, organizations should reduce the number of sensitive data stores by removing sensitive data from stores that don't need to have it. This can be via redaction, anonymization, encryption, etc. By limiting the number of sensitive data stores, security teams effectively shrink the attack surface by reducing the number of assets worth attacking in the first place.
The most important aspect is understanding that data travels and its security posture must travel with it. If a sensitive data asset has a strict security posture in one location in the public cloud, it must always maintain that posture. A Social Security number is always valuable to a threat actor. It doesn’t matter whether it leaks from a secured production environment or a forgotten data store that no one has accessed for two years. Only by appreciating this context will organizations be able to ensure that their sensitive data is always secured properly.
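One of the data-sprawl reduction techniques mentioned above, redaction, can be sketched as follows: mask SSN-shaped values before data is copied into stores that don't need the full identifier. The pattern and masking scheme are illustrative; production redaction handles many formats and is applied as data moves between environments.

```python
# Redaction sketch: replace SSNs with a masked form, keeping only the
# last 4 digits so the record stays usable for support workflows.
import re

SSN = re.compile(r"\b\d{3}-\d{2}-(\d{4})\b")

def redact_ssns(text: str) -> str:
    """Mask every SSN-shaped value in the text."""
    return SSN.sub(r"***-**-\1", text)

record = "Customer 4412, SSN 123-45-6789, plan: premium"
print(redact_ssns(record))  # Customer 4412, SSN ***-**-6789, plan: premium
```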
How Data Security Posture Management Helps
The biggest technological obstacles that had to be overcome to make data centric security possible were proper classification of unstructured data and prioritization based on business impact. Advances in Machine Learning have made highly accurate classification and prioritization possible, and created a new type of security solution: Data Security Posture Management (DSPM).
DSPM allows organizations to accurately find and classify their cloud data while offering remediation plans for severe vulnerabilities. By finally giving enterprises a full view of their cloud data, data centric security is finally able to offer a deeper, more effective layer of cloud security than ever before.
Want to see what data centric security looks like with Sentra’s DSPM? Request a demo here
Create an Effective RFP for a Data Security Platform & DSPM
Create an Effective RFP for a Data Security Platform & DSPM
This RFP Guide is designed to help organizations create their own RFP for selection of Cloud-native Data Security Platform (DSP) & Data Security Posture Management (DSPM) solutions. The purpose is to identify key essential requirements that will enable effective discovery, classification, and protection of sensitive data across complex environments, including in public cloud infrastructures and in on-premises environments.
Instructions for Vendors
Each section provides essential and recommended requirements to achieve a best practice capability. These have been accumulated over dozens of customer implementations. Customers may also wish to include their own unique requirements specific to their industry or data environment.
1. Data Discovery & Classification
Requirement | Details |
---|---|
Shadow Data Detection | Can the solution discover and identify shadow data across any data environment (IaaS, PaaS, SaaS, OnPrem)? |
Sensitive Data Classification | Can the solution accurately classify sensitive data, including PII, financial data, and healthcare data? |
Efficient Scanning | Does the solution support smart sampling of large file shares and data lakes to reduce and optimize the cost of scanning, yet provide full scan coverage in less time and lower cloud compute costs? |
AI-based Classification | Does the solution leverage AI/ML to classify data in unstructured documents and stores (Google Drive, OneDrive, SharePoint, etc.) and achieve more than 95% accuracy? |
Data Context | Can the solution discern and ‘learn’ the business purpose (employee data, customer data, identifiable data subjects, legal data, synthetic data, etc.) of data elements and tag them accordingly? |
Data Store Compatibility | Which data stores (e.g., AWS S3, Google Cloud Storage, Azure SQL, Snowflake data warehouse, on-premises file shares, etc.) does the solution support for discovery? |
Autonomous Discovery | Can the solution discover sensitive data automatically and continuously, ensuring up-to-date awareness of data presence? |
Data Perimeters Monitoring | Can the solution track data movement between storage solutions and detect risky and non-compliant data transfers and data sprawl? |
2. Data Access Governance
Requirement | Details |
---|---|
Access Controls | Does the solution map access of users and non-human identities to data based on sensitivity and sensitive information types? |
Location Independent Control | Does the solution help organizations apply least privilege access regardless of data location or movement? |
Identity Activity Monitoring | Does the solution identify over-provisioned, unused or abandoned identities (users, keys, secrets) that create unnecessary exposures? |
Data Access Catalog | Does the solution provide an intuitive map of identities, their access entitlements (read/write permissions), and the sensitive data they can access? |
Integration with IAM Providers | Does the solution integrate with existing Identity and Access Management (IAM) systems? |
3. Posture, Risk Assessment & Threat Monitoring
Requirement | Details |
---|---|
Risk Assessment | Can the solution assess data security risks and assign risk scores based on data exposure and data sensitivity? |
Compliance Frameworks | Does the solution support compliance with regulatory requirements such as GDPR, CCPA, and HIPAA? |
Similar Data Detection | Does the solution identify data that has been copied, moved, transformed or otherwise modified that may disguise its sensitivity or lessen its security posture? |
Automated Alerts | Does the solution provide automated alerts for policy violations and potential data breaches? |
Data Loss Prevention (DLP) | Does the solution include DLP features to prevent unauthorized data exfiltration? |
3rd Party Data Loss Prevention (DLP) | Does the solution integrate with 3rd party DLP solutions? |
User Behavior Monitoring | Does the solution track and analyze user behaviors to identify potential insider threats or malicious activity? |
Anomaly Detection | Does the solution establish a baseline and use machine learning or AI to detect anomalies in data access or movement? |
4. Incident Response & Remediation
Requirement | Details |
---|---|
Incident Management | Can the solution provide detailed reports, alert details, and activity/change history logs for incident investigation? |
Automated Response | Does the solution support automated incident response, such as blocking malicious users or stopping unauthorized data flows (via API integration to native cloud tools or other)? |
Forensic Capabilities | Can the solution facilitate forensic investigation, such as data access trails and root cause analysis? |
Integration with SIEM | Can the solution integrate with existing Security Information and Event Management (SIEM) or other analysis systems? |
5. Infrastructure & Deployment
Requirement | Details |
---|---|
Deployment Models | Does the solution support flexible deployment models (on-premise, cloud, hybrid)? Is the solution agentless? |
Cloud Native | Does the solution keep all data in the customer’s environment, performing classification via serverless functions? (i.e., no data is ever removed from the customer environment - only metadata) |
Scalability | Can the solution scale to meet the demands of large enterprises with multi-petabyte data volumes? |
Performance Impact | Does the solution work asynchronously without performance impact on the data production environment? |
Multi-Cloud Support | Does the solution provide unified visibility and management across multiple cloud providers and hybrid environments? |
6. Operations & Support
Requirement | Details |
---|---|
Onboarding | Does the solution vendor assist customers with onboarding? Does this include assistance with customization of policies, classifiers, or other settings? |
24/7 Support | Does the vendor provide 24/7 support for addressing urgent security issues? |
Training & Documentation | Does the vendor provide training and detailed documentation for implementation and operation? |
Managed Services | Does the vendor (or its partners) offer managed services for organizations without dedicated security teams? |
Integration with Security Tools | Can the solution integrate with existing security tools, such as firewalls, DLP systems, and endpoint protection systems? |
7. Pricing & Licensing
Requirement | Details |
---|---|
Pricing Model | What is the pricing structure (e.g., per user, per GB, per endpoint)? |
Licensing | What licensing options are available (e.g., subscription, perpetual)? |
Additional Costs | Are there additional costs for support, maintenance, or feature upgrades? |
Conclusion
This RFP template is designed to facilitate a structured and efficient evaluation of DSP and DSPM solutions. Vendors are encouraged to provide comprehensive and transparent responses to ensure an accurate assessment of their solution’s capabilities.
Sentra’s cloud-native design combines powerful Data Discovery and Classification, DSPM, DAG, and DDR capabilities into a complete Data Security Platform (DSP). With this, Sentra customers achieve enterprise-scale data protection and do so very efficiently - without creating undue burdens on the personnel who must manage it.
To learn more about Sentra’s DSP, request a demo here and choose a time for a meeting with our data security experts. You can also choose to download the RFP as a pdf.