Tokenization
Tokenization should be considered when sensitive data is stored on multiple systems throughout an organization. Tokenization is the process of replacing sensitive data with unique identifiers (i.e., tokens) and storing the original data on a central server, typically in encrypted form.
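To make the process concrete, here is a minimal sketch of a token vault in Python. The `TokenVault` class and its methods are hypothetical names for illustration; a real deployment would encrypt the stored values and keep the vault on a hardened central server rather than in memory.

```python
import secrets

class TokenVault:
    """Minimal in-memory token vault (illustrative sketch only).

    A production vault would encrypt stored values and persist them
    on a protected central server; this just shows the mapping idea.
    """

    def __init__(self):
        self._vault = {}    # token -> original sensitive value
        self._reverse = {}  # sensitive value -> previously issued token

    def tokenize(self, value: str) -> str:
        # Reuse an existing token so the same value always maps to one token.
        if value in self._reverse:
            return self._reverse[value]
        # A random token carries no information about the original value.
        token = secrets.token_hex(16)
        self._vault[token] = value
        self._reverse[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only the central vault can map a token back to the sensitive value.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
assert token != "4111-1111-1111-1111"          # downstream systems see only the token
assert vault.detokenize(token) == "4111-1111-1111-1111"
```

Downstream systems store and pass around only the token; a compromise of those systems reveals nothing about the original data, since the mapping lives solely in the central vault.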
By centralizing sensitive data onto a single system, tokenization can help thwart hackers and minimi...