
Tokenization

Data Protection

Definition

Replacing sensitive data with non-sensitive tokens.

Technical Details

Tokenization is a process that converts sensitive data, such as credit card numbers or personal identification information, into non-sensitive equivalents known as tokens. Each token is a unique identifier with no extrinsic or exploitable value outside the specific context in which it is used. The token is mapped to the original data through a secure tokenization system, which maintains a protected database (often called a token vault) to store the sensitive information. Because the sensitive data itself is never exposed during transactions or data processing, tokenization reduces the risk of data breaches and supports compliance with regulations such as PCI DSS.
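The vault-based mapping described above can be sketched in a few lines of Python. This is a minimal illustration only; the class name and in-memory dictionary are assumptions for demonstration, and a production system would use a hardened, access-controlled datastore with auditing.

```python
import secrets

class TokenVault:
    """Illustrative token vault: maps random tokens to sensitive values."""

    def __init__(self):
        # In a real deployment this mapping lives in a secured database,
        # not in process memory.
        self._store = {}

    def tokenize(self, sensitive_value: str) -> str:
        # The token is cryptographically random, so it carries no
        # information about the original value.
        token = secrets.token_urlsafe(16)
        self._store[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can recover the original value; systems that
        # hold just the token never see the sensitive data.
        return self._store[token]
```

Downstream systems store and transmit only the token; detokenization happens solely inside the vault's trust boundary.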

Practical Usage

Tokenization is widely used in payment processing systems to protect credit card information during transactions. For instance, when a customer makes a purchase online, their credit card number is replaced with a token before it is transmitted, and the payment is processed using the token without ever exposing the actual card number. Tokenization is also used in healthcare systems to protect patient information, allowing providers to reference tokens in billing or insurance claims while keeping the underlying data secured in line with HIPAA requirements.

Examples
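As a concrete example, payment tokens are often format-preserving: the token looks like a card number and keeps the last four digits visible for customer receipts, while the rest is random. The function below is a hedged sketch of that idea (the name `card_token` is an assumption, and the token-to-card mapping would still have to be stored in a secure vault).

```python
import secrets

def card_token(pan: str) -> str:
    """Illustrative format-preserving token: random digits plus the
    real last four, so receipts can show e.g. '**** 1111'."""
    # Replace all but the last four digits with random digits.
    # (A real system would also guarantee uniqueness and record the
    # token -> PAN mapping in its vault.)
    random_digits = "".join(str(secrets.randbelow(10)) for _ in range(len(pan) - 4))
    return random_digits + pan[-4:]

# A token for "4111111111111111" ends in "1111"; the first twelve
# digits are random and reveal nothing about the original number.
```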

Related Terms

Encryption, Data Masking, Data Breach, Payment Card Industry Data Security Standard (PCI DSS), Privacy by Design