
Tokenization is the process of replacing a piece of sensitive data with a value that is not considered sensitive in the context of the environment that consumes the token in place of the original sensitive data. Tokenization technology can be used with sensitive data of all kinds, including bank transactions, medical records, criminal records, vehicle driver information, loan applications, stock trading and voter registration.[1] In the payments industry, tokenization is a process by which the Primary Account Number (PAN) is replaced with a surrogate value called a "token." De-tokenization is the reverse process of redeeming a token for its associated PAN value. The security of an individual token relies predominantly on the infeasibility of determining the original PAN from the surrogate value alone.[2]

Impact on Payment Card Industry Data Security Standard (PCI DSS) Compliance

In the payments industry, tokenization has become a popular means of bolstering the security of electronic transactions while minimizing the complexity of compliance with industry standards and government regulations.[3] The Payment Card Industry Data Security Standard, an industry-wide standard that must be met by any organization that stores, processes, or transmits cardholder data, mandates that credit card data be protected when stored.[4] Tokenization, as applied to payment card data, is often implemented to meet this mandate.[5]

Tokens can be formatted in a variety of ways. Some token service providers or applications generate these stand-in values so as to match the format of the original sensitive data. In the case of payment card data, a token might be the same length as a PAN and contain elements of the original data, such as the last four digits of the card number. When an authorization request is made to verify the legitimacy of a transaction, a token might be returned to the merchant instead of the card number, along with the authorization code for the transaction. The token is stored in the receiving system, while the actual cardholder data is stored in a secure token storage system. Storage of tokens and payment card data must comply with current PCI standards.[6]

The PCI Security Standards Council released an information supplement titled PCI DSS Tokenization Guidelines in August 2011. The document provides guidance on how tokenization may affect PCI DSS scope for merchants that store, process, or transmit cardholder data.[2] Tokenization may reduce the scope of PCI compliance.
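The token formatting described above can be sketched in Python. This is an illustrative toy, not a production token service: the `TokenVault` class, its in-memory dictionary, and the digit-by-digit generation are assumptions for demonstration only. A real provider would add access control, encryption at rest, and durable storage.

```python
import secrets

class TokenVault:
    """Toy token vault mapping random surrogate tokens to PANs.

    Illustrative sketch only -- real token service providers add
    access control, encryption at rest, and audit logging.
    """

    def __init__(self):
        self._vault = {}  # token -> PAN

    def tokenize(self, pan: str) -> str:
        # Format-preserving: same length as the PAN, last four digits kept.
        while True:
            prefix = "".join(str(secrets.randbelow(10))
                             for _ in range(len(pan) - 4))
            token = prefix + pan[-4:]
            # Reject collisions with the PAN itself or an existing token.
            if token != pan and token not in self._vault:
                break
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Redeeming a token requires access to the vault; the PAN is not
        # computable from the token value alone.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
```

Because the leading digits are drawn at random rather than derived from the PAN, knowledge of the token (or many tokens) gives no computational path back to the card number, which is the property the PCI guidance relies on.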
To be considered out of scope for PCI DSS, tokens, and the system components that store, process, and/or transmit only tokens, would also need to meet the following requirements:

•	Recovery of the PAN value associated with a token must not be computationally feasible through knowledge of only the token, multiple tokens, or other token-to-PAN combinations.[2]
•	The PAN cannot be retrieved even if the token and the systems it resides on are compromised.[2]
•	System components are segmented (isolated) from any application, system, process, or user with the ability to submit a de-tokenization request for that token and retrieve the PAN; or with access to the tokenization system, data vault, or cryptographic keys for that token; or with access to token input data that can be used to de-tokenize or derive the PAN value.[2]
•	System components are not connected to any portion of the tokenization system.[2]
•	System components are not located within the cardholder data environment (CDE), nor do they have the ability to communicate with any part of the CDE.[2]
•	System components do not store, process, or transmit cardholder data or sensitive authentication data through any other channel.[2]
•	System components that stored, processed, or transmitted cardholder data prior to implementation of the tokenization solution have been sanitized.[2]

Tokenization Service Providers

There are several companies that provide tokenization services, including:

•	TokenEx [7]
•	TransArmor (First Data) [8]
•	TrueTokenization (Shift4) [9]
•	Protegrity Vaultless Tokenization [10]
•	Paymetric XiSecure for Cardholder Data [11]
•	SafeNet Tokenization Manager [12]

Tokenization vs. Encryption

The biggest difference between tokenization and encryption is that tokenization removes PAN data from the enterprise's PCI scope, while encryption does not. What can be encrypted can be decrypted, so encrypted data are still considered in scope for PCI. The one exception is when PCI-compliant, strong encryption is managed by an outside entity and the enterprise has no ability to decrypt the data and recover cleartext PANs.[13]
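The distinction can be made concrete with a short Python sketch. The XOR stream below is a toy stand-in for real encryption (production systems use vetted ciphers such as AES), chosen only to show the reversibility property:

```python
import secrets

pan = b"4111111111111111"

# Encryption: reversible by anyone holding the key. Toy one-time-pad
# XOR is used here purely for illustration of reversibility.
key = secrets.token_bytes(len(pan))
ciphertext = bytes(p ^ k for p, k in zip(pan, key))
recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
# The key holder can always recover the PAN from the ciphertext.

# Tokenization: the surrogate is random, so no computation on the
# token alone yields the PAN; only the vault mapping links the two.
token = "".join(str(secrets.randbelow(10)) for _ in range(16))
vault = {token: pan.decode()}
```

Because the ciphertext is a deterministic function of the PAN and the key, systems holding it (and potentially the key) remain in PCI scope; the random token carries no such mathematical relationship, which is what allows scope reduction.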

Tokenization Security Concerns

Tokenization does not address security concerns in the path from where customer card information is captured to where the data is tokenized. To reduce these concerns, an effective, secure tokenization solution should be combined with point-to-point encryption (P2PE), such that the only PANs in the environment are contained within a secure, PIN Transaction Security (PTS) approved point-of-interaction (POI) device.[2]

Tokenization History

Tokenization was first applied to payment card data by Shift4 Corporation [14] and released to the public during an industry Security Summit in Las Vegas, Nevada in 2005.[15]

References

1.	What is Tokenization?
2.	Tokenization Guidelines Information Supplement
3.	"Compliance Benefits of Tokenization"
4.	Can Tokenization of Credit Card Numbers Satisfy PCI Requirements?
5.	Data Security: Counterpoint – "The Best Way to Secure Data is Not to Store Data"
6.	"Securing Data: What Tokenization Does"
7.	TokenEx
8.	TransArmor
9.	TrueTokenization
10.	Protegrity
11.	Paymetric
12.	SafeNet
13.	Tokenization Buyers Guide
14.	Shift4 Launches Security Tool That Lets Merchants Re-Use Credit Card Data. Internet Retailer
15.	"Shift4 Corporation Releases Tokenization in Depth White Paper"