Tokenization that eliminates data collisions

Protegrity announced a tokenization enhancement to the Protegrity Data Protection System (DPS) 5.2, the newest version of its data protection platform.

Tokenization is the process of substituting sensitive data with replacement values (tokens) that retain the essential characteristics of the original data without exposing it. Businesses are increasingly turning to tokenization to secure high-risk data such as credit card and Social Security numbers.
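To illustrate the basic idea, here is a minimal sketch in Python of how a card number might be swapped for a token that preserves the length and last four digits, so downstream systems can keep operating on it. The function name tokenize_pan is hypothetical and does not reflect Protegrity's product; a real system would also record the token-to-value mapping so authorized applications can recover the original.

```python
import secrets

def tokenize_pan(pan: str) -> str:
    """Replace a card number with a random token that keeps the same
    length and the last four digits, so the token stays usable in
    systems that expect a card-number-shaped value."""
    random_digits = "".join(
        secrets.choice("0123456789") for _ in range(len(pan) - 4)
    )
    return random_digits + pan[-4:]

print(tokenize_pan("4111111111111111"))  # e.g. '5028417396521111'
```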

Particularly in high-volume operations, the usual way of generating tokens is prone to issues that degrade the availability and performance of the data. From a security standpoint, it is also critical to address collisions, which occur when a tokenization solution assigns the same token to two separate pieces of data.
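A common way centralized token servers guard against collisions is to keep a reverse index and retry whenever a freshly generated token is already mapped to different data, as in the hypothetical sketch below; the extra lookups against a single authoritative vault are part of the performance and availability cost described above. This is a generic illustration, not Protegrity's patent-pending approach.

```python
import secrets

token_to_value = {}  # reverse index used to detect collisions
value_to_token = {}  # ensures the same input always receives the same token

def tokenize(value: str) -> str:
    """Return a numeric token of the same length, retrying until the
    candidate token is not already assigned to other data."""
    if value in value_to_token:
        return value_to_token[value]
    while True:
        candidate = "".join(
            secrets.choice("0123456789") for _ in range(len(value))
        )
        # Collision check: reject a token already mapped to different data.
        if candidate not in token_to_value:
            token_to_value[candidate] = value
            value_to_token[value] = candidate
            return candidate

print(tokenize("378282246310005"))
print(tokenize("378282246310005"))  # same token returned, no duplicate entry
```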

After two years of research and development, Protegrity has developed a new solution by rethinking the traditional back-end processes. The result is a patent-pending way to tokenize data that eliminates the challenges associated with standard centralized tokenization.

Protegrity’s solution addresses these critical issues: system performance, availability, and scalability are improved; both numeric and alphanumeric tokens can be generated to protect a wide range of high-risk data; key management is simplified; and collisions are eliminated.
