
Cloud Data Breaches: Cloud vs On Premise Security

January 24, 2023 · 3 Min Read · Data Security

"The cloud is more secure than on prem.” This has been taken for granted for years, and is one of the many reasons companies are adopting a ‘cloud first mentality’. But when it comes to data breaches this isn’t always the case.

That’s why you still can’t find a good answer to the question “Is the cloud more secure than on-premise?”

Because like everything else in security, the answer is always ‘it depends’. While having certain security aspects managed by the cloud provider is nice, it’s hardly comprehensive. The cloud presents its own set of data security concerns that need to be addressed.

In this blog, we’ll be looking at data breaches in the cloud vs on premises. What are the unique data security risks associated with both use cases, and can we definitively say one is better at mitigating the risks of data breaches? 

On Premises Data Security

An on-premise architecture is the traditional way organizations manage their networks and data. The company's servers, hardware, software, and network are all managed directly by the IT department, which assumes full control over uptime, security, and data. While more labor intensive than cloud infrastructures, on-premise architectures have the advantage of having a perimeter to defend. Unlike the cloud, IT and security teams also know exactly where all of their data is - and where it's supposed to be. Even if data is duplicated without authorization, it's duplicated on the on-prem server, with existing perimeter protections in place. These advantages can't be overstated: IT has decades of experience managing on-premise servers, and there are hundreds of tested products on the market that do an excellent job of securing an on-prem perimeter.

Despite these advantages, around half of data breaches still originate from on-premise architectures rather than the cloud. Several factors contribute to this. Most importantly, cloud providers like Amazon Web Services, Azure, and GCP take responsibility for some aspects of security. Additionally, while securing a perimeter might be more straightforward than the defense-in-depth approach the cloud requires, it's also easier for attackers to find and exploit on-premise vulnerabilities: they can search public exploit databases and then look for organizations that haven't patched the relevant flaw.

Data Security in the Cloud 

Infrastructure as a Service (IaaS) cloud computing runs on a 'shared responsibility model'. The cloud provider is responsible for the hardware, so it provides the physical security, but protecting the software, applications, and data is still the enterprise's responsibility. And while some data leaks are the result of poor physical security, many of the major leaks today are the result of misconfigurations and vulnerabilities, not someone physically accessing a hard drive.

So when people claim the cloud is better for data security than on premises, what exactly do they mean?

Essentially, they're saying that data in the cloud is more secure when the cloud is correctly set up. And no, this is not as obvious as it sounds. Because the cloud is by definition accessed over the internet, it's also shockingly easy to accidentally expose data to the entire internet.

For example, improperly configured S3 buckets have been responsible for some of the most well known cloud data breaches, including Booz Allen Hamilton, Accenture, and Prestige Software. This simply isn't a concern for on-prem organizations. There's also the matter of the quantity of data being created in the cloud. Because the cloud is provisioned on demand, developers and engineers can easily duplicate databases and applications, and accidentally expose the duplicates to the internet.

[Image: Amazon's warning against leaving buckets exposed to the internet]
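
Misconfigurations like this are usually detectable with a handful of API calls. As a rough illustration (not a full audit, and assuming boto3 with AWS credentials already configured), the sketch below flags buckets whose Block Public Access settings are incomplete or whose bucket policy currently makes them public:

```python
# Illustrative only: list buckets whose public-access protections look incomplete.
# Assumes boto3 is installed and AWS credentials are already configured; it reads
# configuration but changes nothing.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]

    # Is Block Public Access fully enabled on this bucket?
    try:
        config = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
        fully_blocked = all(config.values())
    except ClientError as err:
        if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
            fully_blocked = False  # nothing configured at the bucket level
        else:
            raise

    # Does the bucket policy currently make the bucket public?
    try:
        is_public = s3.get_bucket_policy_status(Bucket=name)["PolicyStatus"]["IsPublic"]
    except ClientError:
        is_public = False  # most commonly: no bucket policy attached

    if is_public or not fully_blocked:
        print(f"REVIEW {name}: public policy={is_public}, block-public-access={fully_blocked}")
```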

Securing your cloud against data breaches is also complicated by the lack of a definable perimeter. When everything is accessible via the internet with the right credentials, guarding a 'perimeter' isn't possible. Instead, cloud security teams manage a range of security solutions designed to protect different elements of their cloud - the applications, the networking, the data, etc. And they have to do all of this without slowing down business processes. The whole advantage of moving to the cloud is speed and scalability. If security prevents scalability, the benefits of the cloud vanish.

So with the cloud, there's a baseline of security features you need to enable. The good news is that once those features are enabled, the cloud is much harder for an attacker to navigate. Monitoring is built in, which makes breaches more difficult. It's also a lot harder to understand a cloud architecture than an on-premise one, which means attackers either have to be more sophisticated or they go for the low-hanging fruit (exposed S3 buckets being a good example).

However, even once you have that monitoring in place, there's still one challenge facing cloud-first organizations: the data. No matter how many cloud security experts you have, data is constantly being created in the cloud that security may not even know exists. There's no visibility problem on premises - we know where the data is; it's on the server we're managing. In the cloud, there's nothing stopping developers from duplicating data, moving it between environments, and forgetting about it completely (also known as shadow data). Even if you do discover that data, it's no longer clear where it came from or what security posture it's supposed to have. Data sprawl, and the loss of visibility and context that comes with it, is the primary cloud security challenge.
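
There is no single query that finds shadow data, but even crude inventory heuristics help. The sketch below is an illustrative example only (it assumes boto3 credentials and a hypothetical owner/environment tagging convention): it flags S3 buckets that nobody has claimed, a common starting point for tracking down forgotten copies:

```python
# A rough heuristic for spotting possible shadow data: S3 buckets that carry no
# ownership tags. The required tag keys are a hypothetical convention, and this
# assumes boto3 with AWS credentials configured.
import boto3
from botocore.exceptions import ClientError

REQUIRED_TAGS = {"owner", "environment"}  # hypothetical tagging convention

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        tag_set = s3.get_bucket_tagging(Bucket=name)["TagSet"]
        tag_keys = {tag["Key"].lower() for tag in tag_set}
    except ClientError as err:
        if err.response["Error"]["Code"] == "NoSuchTagSet":
            tag_keys = set()  # bucket has no tags at all
        else:
            raise

    missing = REQUIRED_TAGS - tag_keys
    if missing:
        print(f"Possible shadow data: {name} is missing tags {sorted(missing)}")
```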

So what’s the verdict on data breaches in the cloud vs data breaches on premises? Which is riskier or more likely? 

Is the Cloud More Secure Than On Premise?

As we warned at the beginning, the answer is an unsatisfying "it depends". If your organization properly manages the cloud, configures the basic security features, limits data sprawl, and has cloud experts managing your environment, the cloud can be a fortress. Ultimately though, this may not be a conversation most enterprises are having in the coming years. With the advantages of scalability and speed, many new enterprises are cloud-first, and the question won't be 'is the cloud secure?' but 'is our cloud's data secure?'


Discover Ron’s expertise, shaped by over 20 years of hands-on tech and leadership experience in cybersecurity, cloud, big data, and machine learning. As a serial entrepreneur and seed investor, Ron has contributed to the success of several startups, including Axonius, Firefly, Guardio, Talon Cyber Security, and Lightricks, after founding a company acquired by Oracle.


Latest Blog Posts

Ward Balcerzak · October 20, 2025 · 3 Min Read · Data Security

2026 Cybersecurity Budget Planning: Make Data Visibility a Priority

Why Data Visibility Belongs in Your 2026 Cybersecurity Budget

As the fiscal year winds down and security leaders tackle cybersecurity budget planning for 2026, you need to decide how to use every remaining 2025 dollar wisely and how to plan smarter for next year. The question isn't just what to cut or keep - it's what creates measurable impact. Across programs, data visibility and DSPM deliver provable risk reduction, faster audits, and clearer ROI, making them priority line items whether you're spending down this year or shaping next year's plan.

Some teams discover unspent funds after project delays, postponed renewals, or slower-than-expected hiring. Others are already deep in planning mode, mapping next year's security priorities across people, tools, and processes. Either way, one question looms large: where can a limited security budget make the biggest impact - right now and next year?

Across the industry, one theme is clear: data visibility is no longer a “nice-to-have” line item, it’s a foundational control. Whether you’re allocating leftover funds before year-end or shaping your 2026 strategy, investing in Data Security Posture Management (DSPM) should be part of the plan.

As Bitsight notes, many organizations look for smart ways to use remaining funds that don’t roll over. The goal isn’t simply to spend, it’s to invest in initiatives that improve posture and provide measurable, lasting value. And according to Applied Tech, “using remaining IT funds strategically can strengthen your position for the next budget cycle.”

That same principle applies in cybersecurity. Whether you’re closing out this year or planning for 2026, the focus should be on spending that improves security maturity and tells a story leadership understands. Few areas achieve that more effectively than data-centric visibility.

(For additional background, see Sentra’s article on why DSPM should take a slice of your cybersecurity budget.)

Where to Allocate Remaining Year-End Funds (Without Hurting Next Year’s Budget)

It’s important to utilize all of your 2025 budget allocations because finance departments frequently view underspending as a sign of overfunding, leading to smaller allocations next year. Instead, strategic security teams look for ways to convert every remaining dollar into evidence of progress.

That means focusing on investments that:

  • Produce measurable results you can show to leadership.
  • Strengthen core program foundations: people, visibility, and process.
  • Avoid new recurring costs that stretch future budgets.

Top Investments That Pay Off

1. Invest in Your People

One of the strongest points echoed by security professionals across industry communities: the best investment is almost always your people. Security programs are built on human capability. Certifications, practical training, and professional growth not only expand your team’s skills but also build morale and retention, two things that can’t be bought with tooling alone.

High-impact options include:

  • Hands-on training platforms like Hack The Box, INE Skill Dive, or Security Blue Team, which develop real-world skills through simulated environments.
  • Professional certifications (SANS GIAC, OSCP, or cloud security credentials) that validate expertise and strengthen your team’s credibility.
  • Conference attendance for exposure to new threat perspectives and networking with peers.
  • Cross-functional training between SOC, GRC, and AppSec to create operational cohesion.

In practitioner discussions, one common sentiment stood out: training isn’t just an expense, it’s proof of leadership maturity.

As one manager put it, “If you want your analysts to go the extra mile during an incident, show you’ll go the extra mile for them when things are calm.”

2. Invest in Data Visibility (DSPM)

While team capability drives execution, data visibility drives confidence. In recent conversations among mid-market and enterprise security teams, Data Security Posture Management (DSPM) repeatedly surfaced as one of the most valuable investments made in the past year, especially for hybrid-cloud environments.

One security leader described it this way:

“After implementing DSPM, we finally had a clear picture of where sensitive data actually lived. It saved our team hours of manual chasing and made the audit season much easier.”

That feedback reflects a growing consensus: without visibility into where sensitive data resides, who can access it, and how it’s secured, every other layer of defense operates partly in the dark.

Tip: If your remaining 2025 budget won't cover a full DSPM deployment, scope an initial implementation now, then expand to full coverage in 2026.

DSPM solutions provide that clarity by helping teams:

  • Map and classify sensitive data across multi-cloud and SaaS environments.
  • Identify access misconfigurations or risky sharing patterns.
  • Detect policy violations or overexposure before they become incidents.
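
To make the first bullet above a bit more concrete, here is a deliberately simplified, pattern-based classification sketch. Production DSPM classifiers rely on far richer context (validators, ML models, proximity rules); this only shows the basic idea of scanning text for sensitive-data patterns:

```python
# Deliberately simplified: a few regex patterns applied to a text sample. Real
# classifiers use validators, ML models, and proximity context; this only shows
# the basic idea of pattern-based discovery.
import re

PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]*?){13,16}\b"),
}

def classify(text):
    """Return a count of matches per sensitive-data category in the text sample."""
    return {label: len(rx.findall(text)) for label, rx in PATTERNS.items()}

sample = "Contact jane.doe@example.com, SSN 123-45-6789, card 4111 1111 1111 1111"
print(classify(sample))  # {'email': 1, 'us_ssn': 1, 'credit_card': 1}
```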

Beyond security operations, DSPM delivers something finance and leadership appreciate: measurable proof. Dashboards and reports make risk tangible, allowing CISOs to demonstrate progress in data protection and compliance.

The takeaway: DSPM isn’t just a good way to use remaining funds, it’s a baseline investment every forward-looking security program should plan for in 2026 and beyond.

3. Invest in Testing

Training builds capability. Visibility builds understanding. Testing builds credibility.

External red team, purple team, or security posture assessments continue to be among the most effective ways to validate your defenses and generate actionable findings.

Security practitioners often point out that testing engagements create outcomes leadership understands:

“Training is great, but it’s hard to quantify. An external assessment gives you findings, metrics, and a roadmap you can point to when defending next year’s budget.”

Well-scoped assessments do more than uncover vulnerabilities—they benchmark performance, expose process gaps, and generate data-backed justification for continued investment.

4. Preserve Flexibility with a Retainer

If your team can’t launch a new project before year-end, a retainer with a trusted partner is an efficient way to preserve funds without waste. Retainers can cover services like penetration testing, incident response, or advisory hours, providing flexibility when unpredictable needs arise. This approach, often recommended by veteran CISOs, allows teams to close their books responsibly while keeping agility for the next fiscal year.

5. Strengthen Your Foundations

Not every valuable investment requires new tools. Several practitioners emphasized the long-term returns from process improvements and collaboration-focused initiatives:

  • Threat modeling workshops that align development and security priorities.
  • Framework assessments (like NIST CSF or ISO 27001) that provide measurable baselines.
  • Automation pilots to eliminate repetitive manual work.
  • Internal tabletop exercises that enhance cross-team coordination.

These lower-cost efforts improve resilience and efficiency, two metrics that always matter in budget conversations.

How to Decide: A Simple, Measurable Framework

When evaluating where to allocate remaining or future funds, apply a simple framework:

  1. Identify what's lagging. Which pillar (people, visibility, or process) most limits your current effectiveness?
  2. Choose something measurable. Prioritize initiatives that produce clear, demonstrable outputs: reports, dashboards, certifications.
  3. Aim for dual impact. Every investment should strengthen both your operations and your ability to justify next year’s funding.

Final Thoughts

A strong security budget isn’t just about defense, it’s about direction. Every spend tells a story about how your organization prioritizes resilience, efficiency, and visibility.

Whether you’re closing out this year’s funds or preparing your 2026 plan, focus on investments that create both operational value and executive clarity. Because while technologies evolve and threats shift, understanding where your data is, who can access it, and how it’s protected remains the cornerstone of a mature security program.

Or, as one practitioner summed it up: “Spend on the things that make next year’s budget conversation easier.”

DSPM fits that description perfectly.


Meni Besso · October 15, 2025 · 3 Min Read · Compliance

Hybrid Environments: Expand DSPM with On-Premises Scanners

Data Security Posture Management (DSPM) has quickly become a must-have for organizations moving to the cloud. By discovering, classifying, and protecting sensitive data across SaaS apps and cloud services, DSPM gave security teams visibility into data risks they never knew they had before.

But here’s the reality: most enterprises aren’t 100% cloud. Legacy file shares, private databases, and hybrid workloads still hold massive amounts of sensitive data. Without visibility into these environments, even the most advanced DSPM platforms leave critical blind spots.

That’s why DSPM platform support is evolving - from cloud-only to truly hybrid.

The Evolution of DSPM

DSPM emerged as a response to the visibility problem created by rapid cloud adoption. As organizations moved to cloud services, SaaS applications, and collaboration platforms, sensitive data began to sprawl across environments at a pace traditional security tools couldn’t keep up with. Security teams suddenly faced oversharing, inconsistent access controls, and little clarity on where critical information actually lived.

DSPM helped fill this gap by delivering a new level of insight into cloud data. It allowed organizations to map sensitive information across their environments, highlight risky exposures, and begin enforcing least-privilege principles at scale. For cloud-native companies, this represented a huge leap forward - finally, there was a way to keep up with constant data changes and movements, helping customers safely adopt the cloud while maintaining data security best practices and compliance, all without slowing innovation.

But for large enterprises, the model was incomplete. Decades of IT infrastructure meant that vast amounts of sensitive information still lived in legacy databases, file shares, and private cloud environments. While DSPM gave them visibility in the cloud, it left everything else in the dark.

The Blind Spot of On-Prem & Private Data

Despite rapid cloud adoption and digital transformation progress, large organizations still rely heavily on hybrid and on-prem environments, since moving data to the cloud can be a years-long process. On-premises file shares such as NetApp ONTAP, SMB, and NTFS, alongside enterprise databases like Oracle, SQL Server, and MySQL, remain central to operations. Private cloud applications are especially common in regulated industries like healthcare, finance, and government, where compliance demands keep critical data on-premises.

To scan on-premises data, many DSPM providers offer partial solutions: they take ephemeral 'snapshots' of that data and temporarily move it to the cloud (either within the customer's environment, as Sentra does, or to the vendor's cloud, as some others do) for classification analysis. This can satisfy some requirements, but it is often seen as a compliance risk for very sensitive or private data that must remain on-premises. What's left are two untenable alternatives: ignoring the data, which leaves serious visibility gaps, or relying on manual techniques that don't scale.

These approaches were clearly not built for today's security or operational requirements. Sensitive data is created and proliferates rapidly, which means it may be unclassified, unmonitored, and overexposed - but how would you even know? From a compliance and risk standpoint, DSPM without on-prem visibility is like watching only half the field while leaving the other half open to attackers or accidental exposure.

Expanding with On-Prem Scanners

Sentra is changing the equation. With the launch of its on-premise scanners, the platform now extends beyond the cloud to hybrid and private environments, giving organizations a single pane of glass for all their data security.

With Sentra, organizations can:

  • Discover and classify sensitive data across traditional file shares (SMB, NFS, CIFS, NTFS) and enterprise databases (Oracle, SQL Server, MySQL, PostgreSQL, MongoDB, MariaDB, IBM DB2, Teradata).
  • Detect and protect critical data as it moves between on-prem and cloud environments.
  • Apply AI-powered classification and enforce Microsoft Purview labeling consistently across environments.
  • Strengthen compliance with frameworks that demand full visibility across hybrid estates.
  • Have a choice of deployment models that best fits their security, compliance, and operational requirements.

Crucially, Sentra’s architecture allows customers to ensure private data always remains in their own environment. They need not move data outside their premises and nothing is ever copied into Sentra’s cloud, making it a trusted choice for enterprises that require secure, private data processing.
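
For intuition only, the sketch below shows the simplest possible form of in-place scanning of a mounted file share: files are read locally and only findings (paths and category labels) are reported, never the contents. This is not Sentra's implementation; the mount path and patterns are illustrative assumptions:

```python
# A toy illustration of in-place scanning: walk a mounted share, sample each file
# locally, and report only paths and finding labels - file contents never leave the
# host. The mount point and patterns are assumptions, not Sentra's implementation.
import os
import re

MOUNT_POINT = "/mnt/finance-share"  # hypothetical SMB/NFS mount
PII_PATTERNS = {
    "email": re.compile(rb"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "us_ssn": re.compile(rb"\b\d{3}-\d{2}-\d{4}\b"),
}

for root, _dirs, files in os.walk(MOUNT_POINT):
    for fname in files:
        path = os.path.join(root, fname)
        try:
            with open(path, "rb") as fh:
                sample = fh.read(1_000_000)  # sample the first ~1 MB of each file
        except OSError:
            continue  # skip unreadable files (permissions, broken links, etc.)
        hits = sorted(label for label, rx in PII_PATTERNS.items() if rx.search(sample))
        if hits:
            print(f"{path}: {', '.join(hits)}")  # report findings, never the data itself
```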

Extending the Hybrid Vision

This milestone builds on Sentra’s proven track record as the only cloud-native data security platform that guarantees data always remains within the customer’s cloud environments - never copied or stored in Sentra’s cloud.

Now, Sentra’s AI-powered classification and governance engine can also be deployed in organizations that require onsite data processing, giving them the flexibility to protect both structured and unstructured data across cloud and on-premises systems.

By unifying visibility and governance across all environments while maintaining complete data sovereignty, Sentra continues to lead the next phase of DSPM, one built for modern, hybrid enterprises.

Real-World Impact

Picture a global bank with modern customer-facing websites and mobile applications hosted in the public cloud, providing agility and scalability for digital services. At the same time, the bank continues to rely on decades-old operational databases running in its private cloud - systems that power core banking functions such as transactions and account management. Without visibility into both, security teams can't fully understand the risks these stores may pose, enforce least privilege, prevent oversharing, or ensure compliance.

With hybrid DSPM powered by on-prem scanners, that same bank can unify classification and governance across every environment - cloud or on-prem - and close the gaps that attackers or AI systems could otherwise exploit.

Conclusion

DSPM solved the cloud problem. But enterprises aren’t just in the cloud, they’re hybrid. Legacy systems and private environments still hold critical data, and leaving them out of your security posture is no longer an option.

Sentra’s on-premise scanners mark the next stage of DSPM evolution: one unified platform for cloud, on-prem, and private environments. With full visibility, accurate classification, and consistent governance, enterprises finally have the end-to-end data security they need for the AI era. Because protecting half your data is no longer enough.


Shiri Nossel · September 28, 2025 · 4 Min Read · Compliance

The Hidden Risks Metadata Catalogs Can't See

In today’s data-driven world, organizations are dealing with more information than ever before. Data pours in from countless production systems and applications, and data analysts are tasked with making sense of it all - fast. To extract valuable insights, teams rely on powerful analytics platforms like Snowflake, Databricks, BigQuery, and Redshift. These tools make it easier to store, process, and analyze data at scale.

But while these platforms are excellent at managing raw data, they don't solve one of the most critical challenges organizations face: understanding and securing that data.

That’s where metadata catalogs come in.

Metadata Catalogs Are Essential But They’re Not Enough

Metadata catalogs such as AWS Glue, Hive Metastore, and Apache Iceberg are designed to bring order to large-scale data ecosystems. They offer a clear inventory of datasets, making it easier for teams to understand what data exists, where it’s stored, and who is responsible for it.

This organizational visibility is essential. With a good catalog in place, teams can collaborate more efficiently, minimize redundancy, and boost productivity by making data discoverable and accessible.

But while these tools are great for discovery, they fall short in one key area: security. They aren’t built to detect risky permissions, identify regulated data, or prevent unintended exposure. And in an era of growing privacy regulations and data breach threats, that’s a serious limitation.

Different Data Tools, Different Gaps

It’s also important to recognize that not all tools in the data stack work the same way. For example, platforms like Snowflake and BigQuery come with fully managed infrastructure, offering seamless integration between storage, compute, and analytics. Others, like Databricks or Redshift, are often layered on top of external cloud storage services like S3 or ADLS, providing more flexibility but also more complexity.

Metadata tools have similar divides. AWS Glue is tightly integrated into the AWS ecosystem, while tools like Apache Iceberg and Hive Metastore are open and cloud-agnostic, making them suitable for diverse lakehouse architectures.

This variety introduces fragmentation, and with fragmentation comes risk. Inconsistent access policies, blind spots in data discovery, and siloed oversight can all contribute to security vulnerabilities.

The Blind Spots Metadata Can’t See

Even with a well-maintained catalog, organizations can still find themselves exposed. Metadata tells you what data exists, but it doesn’t reveal when sensitive information slips into the wrong place or becomes overexposed.
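
To see that gap concretely, consider what a catalog query actually returns. The sketch below uses boto3 against the AWS Glue Data Catalog (the database name is a hypothetical example, and credentials are assumed to be configured): it lists table names, columns, and storage locations - useful inventory, but nothing about whether those columns hold PII or who can query them:

```python
# What a metadata catalog gives you: names, schemas, and locations - not content.
# Uses boto3 against the AWS Glue Data Catalog; the database name is a hypothetical
# example and credentials are assumed to be configured.
import boto3

glue = boto3.client("glue")

paginator = glue.get_paginator("get_tables")
for page in paginator.paginate(DatabaseName="analytics_lake"):  # hypothetical database
    for table in page["TableList"]:
        sd = table.get("StorageDescriptor", {})
        columns = [col["Name"] for col in sd.get("Columns", [])]
        location = sd.get("Location", "n/a")
        # Nothing here says whether any of these columns actually hold PII,
        # or who is able to query them - that requires content-level scanning.
        print(f"{table['Name']}: {len(columns)} columns at {location}")
```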

This problem is particularly severe in analytics environments. Unlike production environments, where permissions are strictly controlled, or SaaS applications, which have clear ownership and structured access models, data lakes and warehouses function differently. They are designed to collect as much information as possible, allowing analysts to freely explore and query it.

In practice, this means data often flows in without a clear owner and frequently without strict permissions. Anyone with warehouse access, whether users or automated processes, can add information, and analysts typically have broad query rights across all data. This results in a permissive, loosely governed environment where sensitive data such as PII, financial records, or confidential business information can silently accumulate. Once present, it can be accessed by far more individuals than appropriate.

The good news is that remediation doesn't require a heavy-handed approach. Often, it's not about managing complex permission models or building elaborate remediation workflows. The crucial step is the ability to continuously identify sensitive data, understand where it lives, and then take the correct action, whether that involves removal, masking, or locking it down.
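
As a minimal illustration of that last step (and only an illustration - real remediation such as tokenization, column masking, or access revocation is platform-specific), a simple masking pass can redact detected values before they spread further:

```python
# A minimal masking pass - illustration only. Real remediation (tokenization,
# column masking, access revocation) is platform-specific; this just shows the
# "detect, then redact" step on a string.
import re

EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b")
US_SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def mask(text):
    """Replace detected emails and SSNs with placeholders."""
    text = EMAIL.sub("[EMAIL REDACTED]", text)
    return US_SSN.sub("[SSN REDACTED]", text)

print(mask("Ticket from jane.doe@example.com about SSN 123-45-6789"))
# -> Ticket from [EMAIL REDACTED] about SSN [SSN REDACTED]
```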

How Sentra Bridges the Gap Between Data Visibility & Security

This is where Sentra comes in.

Sentra's Data Security Posture Management (DSPM) platform is designed to complement and extend the capabilities of metadata catalogs - not just to address their limitations, but to elevate your entire data security strategy. Instead of replacing your metadata layer, Sentra works alongside it, enhancing your visibility with real-time insights and powerful security controls.

Sentra scans across modern data platforms like Snowflake, S3, BigQuery, and more. It automatically classifies and tags sensitive data, identifies potential exposure risks, and detects compliance violations as they happen.

With Sentra, your metadata becomes actionable.

[Screenshot: Sentra dashboard - datasets view]

From Static Maps to Live GPS

Think of your metadata catalog as a map. It shows you what’s out there and how things are connected. But a map is static. It doesn’t tell you when there’s a roadblock, a detour, or a collision. Sentra transforms that map into a live GPS. It alerts you in real time, enforces the rules of the road, and helps you navigate safely no matter how fast your data environment is moving.

Conclusion: Visibility Without Security Is a Risk You Can’t Afford

Metadata catalogs are indispensable for organizing data at scale. But visibility alone doesn’t stop a breach. It doesn’t prevent sensitive data from slipping into the wrong place, or from being accessed by the wrong people.

To truly safeguard your business, you need more than a map of your data—you need a system that continuously detects, classifies, and secures it in real time. Without this, you’re leaving blind spots wide open for attackers, compliance violations, and costly exposure.

Sentra turns static visibility into active defense. With real-time discovery, context-rich classification, and automated protection, it gives you the confidence to not only see your data, but to secure it.

See clearly. Understand fully. Protect confidently with Sentra.

