Data Tokenization Definition: Data tokenization is the process of substituting sensitive data with unique identification symbols, or tokens, that preserve the data's essential characteristics, such as format and length, without exposing its actual content, thereby minimizing the risk of storing or transmitting the original data.
Data tokenization replaces sensitive data with tokens that have no exploitable meaning or value on their own. This protects sensitive information because an unauthorized party who obtains the tokens cannot recover the underlying data from them. Tokens are typically generated by a tokenization system, either as random values recorded in a secure mapping (a token vault) or via a keyed cryptographic algorithm, and can be reversed to retrieve the original data only through that system or key, as sketched below.
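A minimal sketch of vault-based tokenization in Python may help make this concrete. The in-memory dictionary standing in for the token vault, the digit-only token format, and the function names here are illustrative assumptions, not a reference implementation; a production system keeps the vault in a hardened, access-controlled service.

```python
import secrets
import string

# Illustrative token vault: maps token -> original value.
# In practice this mapping lives in a secured tokenization service,
# not in the application's own memory.
_vault: dict[str, str] = {}


def _random_digits(length: int) -> str:
    """Generate a cryptographically random string of digits."""
    return "".join(secrets.choice(string.digits) for _ in range(length))


def tokenize(sensitive_value: str) -> str:
    """Replace a sensitive value with a random token of the same length.

    The token carries no information about the original value; the
    only link between them is the entry stored in the vault.
    """
    token = _random_digits(len(sensitive_value))
    while token in _vault:  # regenerate on the (rare) collision
        token = _random_digits(len(sensitive_value))
    _vault[token] = sensitive_value
    return token


def detokenize(token: str) -> str:
    """Retrieve the original value; only the vault can reverse a token."""
    return _vault[token]


if __name__ == "__main__":
    card = "4111111111111111"
    tok = tokenize(card)
    print(tok)                      # e.g. "8350294617283945": no exploitable meaning
    assert detokenize(tok) == card  # reversal requires access to the vault
```

Preserving the length and digit-only format of the original value is a common design choice: it lets downstream systems that expect, say, a 16-digit card number handle the token without modification, while the sensitive value itself never leaves the vault.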