nuBridges Calls for Tokenization Standards
March 2, 2010 Alex Woodie
Security software vendor nuBridges yesterday called for the formation of an industry group to create official standards for tokenization. The company says standards are needed to ensure the security of data, to reduce vendor lock-in, and to ensure the long-term viability of this relatively new form of data security. nuBridges, which is exhibiting and presenting at the RSA Security conference this week, also unveiled a new release of its tokenization product, called Protect Token Manager.
Tokenization is an advanced form of encryption that is gaining traction among retailers, payment gateways, and banks as a result of the PCI security mandate. The technology works by replacing sensitive data, such as a credit card number, with randomly generated index keys, or “tokens,” that point to the actual credit card number stored in a central database. Organizations that adopt tokenization reduce the risk of unintended information disclosure by storing sensitive data in fewer places, and also lower their storage requirements (because tokens are smaller than encrypted data).
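The vault-based scheme described above can be sketched in a few lines. This is a minimal illustration, not nuBridges' actual API; the class and method names are invented for the example, and a production system would protect the vault itself with encryption and access controls.

```python
import secrets

class TokenVault:
    """Illustrative token vault: tokens are random and carry no
    information about the card number; the mapping lives only here."""

    def __init__(self):
        self._vault = {}  # token -> sensitive value

    def tokenize(self, pan: str) -> str:
        token = secrets.token_hex(8)  # cryptographically random token
        self._vault[token] = pan      # mapping stored centrally
        return token

    def detokenize(self, token: str) -> str:
        # In practice this lookup would be restricted to authorized systems.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
# The token reveals nothing about the card number; only the vault
# can map it back to the original value.
```

Downstream systems store and pass around only the token, so a breach of any of those systems exposes nothing useful.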
While the concept behind tokenization is well accepted, the actual implementations of tokenization vary from customer to customer, and vendor to vendor, according to Gary Palgon, vice president of product management at nuBridges.
“There are different models developing out there for tokenization, and it’s causing the beginning of difficulties for companies that actually are implementing it,” Palgon tells IT Jungle.
Palgon has two main concerns about the course that tokenization is taking. For starters, the lack of interoperability among tokenization providers decreases a customer’s ability to adapt its systems in the future, and increases vendor lock-in. Palgon’s second big concern is that the way some vendors are implementing tokenization is not secure.
The fact that tokenization could cause data to be less secure should set off alarm bells for anybody considering this technology. According to Palgon, companies that use algorithms to generate tokens en masse may be defeating the whole purpose of tokenization.
“Let’s suppose that you’re generating tokens, and let’s suppose the algorithm that you use to generate the tokens would add 1. So the first token was 1, the second token was 2,” Palgon says. “That may seem well and good. But what happens if I’m a company that’s generating credit card numbers? As I tokenize those credit card numbers, I’m getting a pattern, 1-2-3-4, and a pattern defeats the whole purpose of a token. The whole concept behind tokenization is to make information worthless. If there’s a pattern behind it, it’s worth something.”
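Palgon's point can be made concrete with a hypothetical sketch (the function names here are invented for illustration): a counter-based generator produces tokens that leak the order in which values were tokenized, while a random generator produces tokens with no relationship to each other or to the data.

```python
import secrets
from itertools import count

_counter = count(1)

def sequential_token() -> str:
    """Insecure: tokens follow a 1, 2, 3, ... pattern that an
    attacker can use to infer ordering and predict future tokens."""
    return str(next(_counter))

def random_token() -> str:
    """Secure: each token is independent and unpredictable."""
    return secrets.token_hex(8)

print([sequential_token() for _ in range(3)])  # ['1', '2', '3'] -- a pattern
print([random_token() for _ in range(3)])      # three unrelated values
```

A token generated this way is only "worthless" to a thief, in Palgon's terms, when knowing one token tells you nothing about any other.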
It’s somewhat rare for a company that is at the forefront of an industry, as nuBridges is with tokenization, to call for open standards. After all, the company is doing a decent business writing one-off connections between customers’ business applications and Protect Token Manager. Changing from a black-box, proprietary connection model to an open standards model could jeopardize nuBridges’ foothold and allow customers to leave for another provider.
But as Palgon sees it, without standards in place for the breadth of tools in this category (encryption, key management, and tokenization), customers will not be happy with the results, and the overall health of this segment of the security business will falter.
“What we’re trying to do, effectively, is get together with our competitors and say, ‘For the success of our joint customers, certain things over time need to be interoperable,'” Palgon says. “Then we can differentiate on different features and functionality outside of that.”
While PCI is driving the adoption of tokenization today, the data security technology is expected to be much more widely adopted in the future, as organizations realize they must protect all personally identifiable information (PII), not to mention personal health information (PHI).
“From a long term strategic standpoint, we need to iron this out here in the next two years before the massive adoptions,” Palgon says. “Credit card data only represents about 6 percent of the breached data out there. We’re putting all this money and effort into protecting credit card numbers, but the bigger pot of gold of information out there is all this other data. We need standards in place to go after the bigger problems out there, which is the overall PII and PHI.”
The working name nuBridges has given to this group is the Tokenization Standards Organization. So far, nuBridges has invited about 15 vendors in the business to join the group, which the company envisions being hosted by one of the popular standards bodies, such as IEEE or OASIS. Palgon will be busy meeting with prospective members this week at the RSA conference, and hopefully the formal group and its founding charter members will be announced sometime this spring.
So, what does Palgon expect to come out of a standards body? For starters, a solid definition of tokenization would be nice. “Even the basic definitions aren’t out there. There are multiple definitions” of what constitutes a token, he says. “None of us will have the exact answer. We’ll have to work it through together.”
A new tokenization protocol, per se, is not in the mix at this point, as existing protocols such as Web services and message queuing technologies will likely suffice for interoperability and integration needs, Palgon says.
nuBridges also announced Protect Token Manager release 1.3, which added more granular control over the encryption key lifecycle; consistency with Key Management Interoperability Protocol (KMIP) standards; pre-configured templates for UK National Insurance Numbers and Canadian Social Insurance Numbers; enhanced surveillance of client, user, and administrative activities; and better LDAP integration.
Protect Token Manager runs natively on i/OS as well as other platforms, and starts at around $50,000. For more information on the Tokenization Standards Organization or Protect Token Manager, contact nuBridges through its Web site at www.nubridges.com.