What is tokenization?
Tokenization, in its simplest form, is another way of saying ‘data substitution’. It is the act of using a substitute value, or ‘token’, which has no inherent value of its own, in place of data that does. That way, if a system using tokens is compromised, the attacker obtains only the tokens, not the actual valuable data.

Tokenization works by taking the original data value and generating a substitute for it, usually with a random number generator, so the token has no mathematical relationship to the original. The mapping between the original data and the token is maintained in a secure database. It is therefore imperative to protect that mapping database: anyone who gains access to it can reverse every token.
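A minimal sketch of this idea in Python. Here a plain in-memory dictionary stands in for the secure mapping database, and the `TokenVault` name is purely illustrative; a real deployment would use a hardened, access-controlled data store.

```python
import secrets


class TokenVault:
    """Illustrative token vault: maps random tokens to original values.

    In production the mapping lives in a secured, audited database,
    not an in-memory dict like this sketch.
    """

    def __init__(self):
        self._token_to_value = {}  # the sensitive mapping that must be protected
        self._value_to_token = {}  # so a repeated value reuses its existing token

    def tokenize(self, value: str) -> str:
        if value in self._value_to_token:
            return self._value_to_token[value]
        # Random token: no inherent value, no relationship to the original data
        token = secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only systems with access to the vault can recover the original
        return self._token_to_value[token]


vault = TokenVault()
card = "4111 1111 1111 1111"  # sample card number
token = vault.tokenize(card)
# Downstream systems store and pass around `token`; a breach of those
# systems exposes only the meaningless token, not the card number.
```

Note that because the token is randomly generated rather than derived from the data, compromising the token alone reveals nothing; the vault is the single point that must be defended.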