Better than a fix: Tightening backup and restore helps financial services companies innovate

We all know the risks out there. Ransomware is a huge threat, and critical transactional data is constantly under attack. Meanwhile, financial services organizations are being squeezed on all sides as regulation tightens, from SOX to CCPA, GDPR, and global data privacy laws like PIPL. In this firestorm, it’s never been more important for financial services organizations to level up their data protection and risk mitigation strategies.


What makes financial services data so complex?

There are four main reasons:

  • Sophisticated data models
  • Strict security requirements
  • Extremely large data volumes
  • Risk introduced by testing environments

Financial services under attack

Threats to data come from many places, from human error to malicious activity, and one of the biggest is ransomware. Sophos’ The State of Ransomware in Financial Services 2022 report found that ransomware attacks on financial services are on an upward trajectory: 55% of organizations were hit in 2021, up from 34% in 2020.

Yet, according to the report, financial services saw the second-lowest rate of attackers successfully encrypting data in an attack, at 54%, compared with a global average of 65%.

Among the financial services organizations that were hit, 52% paid the ransom to restore data, higher than the global average of 46%. The survey also found that the average remediation cost in financial services was US$1.59M, above the global average of US$1.4M.

Response rates are too slow

It follows that securing this data is a huge challenge that requires ever-shifting innovations. Technology solutions are out there, but implementing them isn’t always straightforward. This leads to the cringeworthy statistic that 277 days is the average time to identify and contain a data breach, according to IBM Security’s Cost of a Data Breach Report 2022. That’s more than three-quarters of a year spent identifying and containing a single breach.

Complex data challenges

Financial services data is high-frequency and spans complex relationships, which makes restoring it challenging. Protecting it requires the most rigorous security measures: encryption key management, data residency controls, data risk assessment of vendors, and compliance oversight of offshore development. Without these strict measures, the data is simply not secure enough.
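To make one of those measures concrete, here is a minimal sketch of envelope encryption in which the organization, not its backup vendor, holds the master key. It uses the Python cryptography package; the key handling and record contents are hypothetical, illustrative only, and not a description of any specific vendor’s implementation.

```python
# Envelope-encryption sketch: the customer holds the master key, so a
# backup vendor only ever sees ciphertext plus a wrapped data key.
# Assumes the `cryptography` package; key storage here is illustrative.
from cryptography.fernet import Fernet

# Master key stays in the customer's own key management system (hypothetical).
customer_master_key = Fernet.generate_key()
master = Fernet(customer_master_key)

def encrypt_record(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt one record with a fresh data key, then wrap that key."""
    data_key = Fernet.generate_key()
    ciphertext = Fernet(data_key).encrypt(plaintext)
    wrapped_key = master.encrypt(data_key)  # only the customer can unwrap this
    return ciphertext, wrapped_key

def decrypt_record(ciphertext: bytes, wrapped_key: bytes) -> bytes:
    """Unwrap the data key with the customer-held master key, then decrypt."""
    data_key = master.decrypt(wrapped_key)
    return Fernet(data_key).decrypt(ciphertext)

ct, wk = encrypt_record(b"ACCT-001 balance=1204.55")
assert decrypt_record(ct, wk) == b"ACCT-001 balance=1204.55"
```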

Because the volumes of data are so massive, there are unique challenges. Queries can become slow and clunky, making records harder for agents to access and leading to customer service issues. With big objects and external data, it’s easy for data to become skewed. Perhaps most importantly, it can be difficult to move large volumes of data out of SaaS applications, so the problem snowballs.

Another challenge is creating and maintaining appropriate testing environments. The data needs to be complete enough to support robust performance testing, data masking requirements have to be fulfilled, and sandboxes can become congested when full-copy data is used.

So, what must financial services organizations do better?

Here’s a four-step plan:

1. Develop a strategy to back up and restore your most critical data. Recovering your data from any point in time is a necessity for financial organizations. And just because you can back data up doesn’t mean you can restore it within your Recovery Time Objectives (RTOs), or at all.
2. Have a clear archiving strategy that applies coherent rules controlling which data stays in the platform, which moves off-platform, and what gets deleted, all in keeping with your business needs and industry regulation. You also need to ensure you can retrieve archived data in the future if necessary (e.g., in case of an audit).
3. Choose third-party service vendors that introduce minimal risk, such as “no view” providers: because they cannot see your data, a breach or malicious actor on their side doesn’t expose it. Minimize the number of service providers and technologies that have access to your data. In addition, ensure data is encrypted at rest, in transit, and in use, and own the encryption keys yourself instead of relying on your backup and restore vendor.
4. Anonymize data in sandbox environments to limit the exposure of sensitive data during development and testing (see the sketch after this list).
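To make step 4 concrete, here is a minimal sketch of field-level anonymization applied before records reach a sandbox. The field names and masking rules are hypothetical; a real implementation would follow your own data classification.

```python
# Field-level masking sketch for sandbox refreshes.
# Field names and rules are hypothetical; adapt to your own data model.
import hashlib
import random

def mask_record(record: dict) -> dict:
    """Return a copy of the record with sensitive fields anonymized."""
    masked = dict(record)
    # Deterministic pseudonym so related records still join up in the sandbox.
    masked["Name"] = "Customer-" + hashlib.sha256(record["Name"].encode()).hexdigest()[:8]
    masked["Email"] = masked["Name"].lower() + "@example.invalid"
    # Preserve rough magnitude for performance testing without the real value.
    masked["AnnualRevenue"] = round(record["AnnualRevenue"] * random.uniform(0.8, 1.2), 2)
    masked["SSN"] = "***-**-" + record["SSN"][-4:]
    return masked

production_row = {"Name": "Jane Doe", "Email": "jane@bank.com",
                  "AnnualRevenue": 125000.0, "SSN": "123-45-6789"}
print(mask_record(production_row))
```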

The good news? By following this four-step plan, organizations can innovate in multiple ways:

1. Protect business continuity. Build a backup and restore process for your data that will stand up to even the most catastrophic data loss scenarios. In the case of Salesforce, the data model can be highly complex and heavily customized, so choosing a tool that understands those extremes is critical. Selecting the most flexible toolkits to work around these issues will pay dividends, especially when it comes to recovering from data loss.

2. Operationalize regulatory compliance. Companies are using Salesforce for more than just managing customer relationships. Salesforce data is more important than ever, as many financial services organizations use it for material use cases, managing the transactions and interactions that sit at the very heart of the business. But archiving this data requires the right tools. You may be required by law to retain archived data for a certain period, which raises the question of where in your enterprise you store it. You must re-evaluate the security of that location: which country the data sits in, for instance, so the right data privacy laws are followed, and which vendor is hosting the information. This adds an extra layer of complexity, and many enterprises are looking for the right archive solution to mitigate these issues.
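One way to operationalize this is to express the archiving strategy as explicit rules: what stays in the platform, what moves to an archive and where, and when deletion becomes allowed. The sketch below is a hedged illustration; the object names, retention periods, and regions are hypothetical examples, not regulatory guidance.

```python
# Illustrative archiving-policy sketch. Object names, ages, and retention
# periods are hypothetical; real values come from your legal/compliance team.
from dataclasses import dataclass
from datetime import date

@dataclass
class ArchiveRule:
    object_name: str
    keep_in_platform_days: int   # stays in the live SaaS org until this age
    retain_in_archive_days: int  # must remain retrievable (e.g., for audit) until this age
    archive_region: str          # where the archive must physically reside

RULES = [
    ArchiveRule("ClosedCase", keep_in_platform_days=365,
                retain_in_archive_days=365 * 7, archive_region="eu-central"),
    ArchiveRule("MarketingTouch", keep_in_platform_days=180,
                retain_in_archive_days=365 * 2, archive_region="eu-central"),
]

def disposition(rule: ArchiveRule, record_date: date, today: date) -> str:
    """Decide what should happen to a record of this age under the rule."""
    age = (today - record_date).days
    if age <= rule.keep_in_platform_days:
        return "keep in platform"
    if age <= rule.retain_in_archive_days:
        return f"move to archive in {rule.archive_region}"
    return "eligible for deletion"

print(disposition(RULES[0], date(2017, 3, 1), date.today()))
```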

3. Improve performance. If data can be archived and removed from the central SaaS environment, performance will improve automatically. When millions of records are sitting there, the whole system slows down, agents notice delays in accessing records, and customer service suffers. To provide excellent service to customers, teams must have speedy and efficient systems. A continual archiving strategy, based on the whole data lifecycle, is necessary to avoid this scenario and the snowball effect of accumulating data that can never be deleted.

4. Gain data agility. The key lies in transforming data into fuel for innovation by moving it between production and non-production environments. Financial services organizations want to safely test new configurations of their Salesforce or CRM data, and a way to do that at scale is to test against Salesforce using real-world data. Enterprises are asking: how can we do that safely? Techniques like sandbox anonymization and sandbox seeding let them innovate faster, because they can test against near real-world scenarios without the risk of working directly with production data.
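As a minimal sketch of sandbox seeding, assuming a hypothetical extract of production records: take a small, representative sample per record type, run it through the same masking step shown earlier, and load only that subset into the sandbox. The mask_record function and load_into_sandbox loader are stand-ins for whatever you actually use.

```python
# Sandbox-seeding sketch: sample a representative subset per record type,
# mask it, then load only that subset into the sandbox.
import random
from collections import defaultdict

def seed_sample(records: list[dict], mask, per_type: int = 100, seed: int = 42) -> list[dict]:
    """Pick up to `per_type` records of each type so every record shape is represented."""
    random.seed(seed)                      # repeatable sample between refreshes
    by_type = defaultdict(list)
    for rec in records:
        by_type[rec["RecordType"]].append(rec)
    sample = []
    for recs in by_type.values():
        sample.extend(random.sample(recs, min(per_type, len(recs))))
    return [mask(rec) for rec in sample]   # anonymize before data leaves production

# Hypothetical usage, with mask_record from the earlier sketch and a loader of your choice:
# sandbox_rows = seed_sample(production_extract, mask_record)
# load_into_sandbox(sandbox_rows)
```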
