October 6, 2015

Fingerprinting the future of payments

Background & status quo

Coins, silver and jewelry were the only forms of money until 476 AD. Following this era, other forms of payment developed, such as silver bars for large payments and paper bills for transferring money. The transition to banknotes (paper money) began during the 11th century and eventually spread to the rest of the world as we know it today. Due to the problem of money theft in those days, several companies decided to tie money to a name by introducing the concept of the charge card, known today as the credit card.
Consumption habits have changed many times over the years as well, most recently with the introduction of online commerce and mobile payments.
As various purchasing channels were introduced to consumers, cyber criminals began targeting more retail chains, gas stations and even mom-and-pop shops.
The main problem today is that many consumers go with the flow, exposing their credit cards to untrusted or insecure parties. As a result, their credit cards can easily be replicated and used by fraudsters to purchase personal merchandise, or even sold in underground markets (a.k.a. the deep web).

A paradigm shift

Many retailers have been breached since 2013, which caused millions of credit card holders to feel uncomfortable returning to the breached retail stores. From that period on, the security of consumers' personal data and credit cards became a top priority for some retailers.
The solutions to the payments problems started to evolve. Looking back at the past two years, the questions we may ask ourselves are: who ever thought that consumers would pay using their biometric data? Who imagined that the wallet might become an optional component in the pocket? Today it is possible, and the innovation around it grows exponentially.

Fingerprinting the future

Biometric payments are only the beginning of improving our purchasing experience in terms of security and privacy, but most people are not security experts or security-minded, and thus they don't care. On the other hand, there are other incentives that will encourage consumers to use their biometric data to improve their lifestyle: for instance, no more waiting in checkout lanes, paperless traveling, or even integration with the Internet of Things (IoT).
If improvements to our security, privacy and lifestyle are not enough, our biometric data will also be able to assist in emergency situations, such as identifying a person and retrieving his medical record in a hospital.
I foresee that in the future we will forget about wallets, just as we forgot about calling from pay phones.

January 7, 2015

My Defcon 22 Talks

Bug bounty programs evolution:

A journey to protect points of sale:

November 17, 2014

API Keys Explained

In recent years, many cloud and enterprise applications have adopted an approach of authenticating using API keys. There are many advantages to this approach in terms of security, and some of them can improve performance as well.

Security Benefits
  1. Software-to-software authentication management - when software needs to communicate with an API-key-enabled application, there is no need to manage sessions. If sessions were used, the consuming software would have to store the session and handle re-authentication upon expiration. Another approach is to authenticate again before consuming each service, but that causes performance issues. Thus, a session-less approach is needed.
  2. Entropy - NIST defines entropy as "an estimate of the average amount of work required to guess the password of a selected user". Since regular users don't use long passwords, hackers can execute a rainbow-table attack to guess them. An API key generation policy, on the other hand, can enforce the creation of long random keys, e.g. 40 characters consisting of upper case, lower case, numbers and special characters. For more information about the need for entropy, check Avi Douglen's deck from OWASP IL 2014 - "Passwords, Rehashed All Over Again".
  3. Password policy enforcement - many systems allow only one password policy for all users. Such a policy may include password expiration, which requires changing the password in every application consuming services from the developed application. This can cause failures in production, which are definitely bad for business.
  4. A single identity with distinct credentials - when managing large-scale applications that serve other applications, the best-practice recommendation of managing a single user per consuming application becomes hard to follow. It becomes even harder when permission changes are needed across the consuming applications. By managing API keys for a single identity, several applications representing the same user can use distinct API keys; if a permission change is needed, only a single user's permissions have to be edited.
  5. Minimal exposure - API keys are generated once and returned to the consuming client via a simple response or a file download (more common). This approach ensures the API keys are exposed only once, at generation time.
  6. Accountability - since generating keys requires permissions (usually held by a non-applicative user), it is easy to trace the user who generated the key. This user is accountable for the security of that key.
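As a sketch of such a generation policy (the 40-character length and the exact alphabet below are illustrative assumptions, not taken from any particular product), Python's `secrets` module can produce keys with high entropy:

```python
import secrets
import string

# Character classes from the policy above: upper case, lower case,
# numbers and special characters.
ALPHABET = (string.ascii_uppercase + string.ascii_lowercase
            + string.digits + "!@#$%^&*-_")

def generate_api_key(length: int = 40) -> str:
    """Generate a random API key using a cryptographically secure RNG."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

key = generate_api_key()
print(len(key))  # 40
```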
Performance Benefits
  1. Fast calculation - API keys take advantage of digest authentication, which performs message-based authentication using an HMAC function to verify the user's identity.
  2. Distributed caching - although caching is not a feature specific to API key management, in scalable systems a single database call can be cached and reused to serve multiple requests.

I decided to draw sequence diagrams for the two main steps of the API authentication mechanism.
The first sequence is the registration of consuming software by a logged-in non-applicative user. This step is required in order to generate the API keys. The sequence diagram of this process is illustrated below:

In the process above, the API keys are generated for a specific identity name. In order to store the keys securely without compromising the Secret Key, a Key Encryption Key (KEK) encrypts the Secret Key before it is stored in the database. When the software client receives the key pair, the Secret Key must be stored securely, since all requests are authenticated with this key.

Following the registration, the software client can consume any service by computing an HMAC over the payload of the message with the Secret Key. The sequence diagram is illustrated below:

Although this sequence diagram looks more complex, the process is considered secure and fast, since the Secret Key is never exposed in transport at any stage, while the database call for fetching the Encrypted Secret Key can be cached.
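The signing flow described above can be sketched as follows. The key and payload are made up for illustration, but the mechanism is the one described: the client signs the message body with the Secret Key, and the server recomputes the HMAC and compares it in constant time.

```python
import hashlib
import hmac

def sign(secret_key: bytes, payload: bytes) -> str:
    """Client side: compute an HMAC-SHA256 signature over the payload."""
    return hmac.new(secret_key, payload, hashlib.sha256).hexdigest()

def verify(secret_key: bytes, payload: bytes, signature: str) -> bool:
    """Server side: recompute the HMAC and compare in constant time."""
    expected = sign(secret_key, payload)
    return hmac.compare_digest(expected, signature)

secret = b"the-shared-secret-key"
payload = b'{"action": "charge", "amount": 100}'

sig = sign(secret, payload)
print(verify(secret, payload, sig))             # True
print(verify(secret, b'{"amount": 999}', sig))  # False: tampered payload
```

Note that only the signature travels over the wire; the Secret Key itself is never transmitted after registration, which is what makes the scheme safe to cache and fast to verify.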


I think that API keys are a must for any application serving software clients, due to their many advantages in both the security and performance domains.

September 18, 2014

SSL Termination Proxy for Windows

There are many open source products developed mainly for Linux that also work on Windows, e.g. RabbitMQ, OpenSSL and other enterprise solutions.
When developing enterprise software based on Windows, storing encryption keys in the Windows Certificate Store becomes an important issue.
The vulnerability of not working with the Windows Certificate Store is that encryption keys are stored on the file system rather than in a secure location as defined by Microsoft. The risk is not that high, since both the NTFS file system and the Windows Certificate Store can be protected by an ACL; however, this is a standard for the customers I work with.


The main challenges were:
  1. Each framework implements the SSL handshake differently, e.g. the .NET framework implementation is not the same as OpenSSL's or Erlang's.
  2. The solution should not be coupled to a specific protocol, e.g. HTTP, AMQP or any proprietary protocol.


The requirements were:
  1. The enterprise software should not be recompiled to overcome the challenge.
  2. Performance should be minimally affected (which is very hard).
  3. The solution should support both SSL and client-side authentication.
  4. Private keys must be stored in the Windows Certificate Store and marked as non-exportable. In fact, it is preferable to store the private key on an HSM, but that is just a matter of changing the Windows CSP (Cryptographic Service Provider).


Since the solution should support both access to the Windows Certificate Store and transparent protocols, OpenSSL is not good enough, due to its inability to work with the Windows Certificate Store. In addition, reverse proxies with the ability to terminate SSL channels, e.g. NGINX and HAProxy, are also unable to comply with the requirements.
There is also the option of using WCF transport, which works pretty fast, is secure and complies with the certificate storage requirements. However, the main problem with this solution is that code changes are required to replace the system's tunnel, especially if the transport is based on 3rd-party solutions.

A walk-through the solution

The solution is pretty simple and even scalable. The main idea is to develop an SSL termination socket proxy, which makes it transparent to any application.
The architecture diagram below illustrates the solution for RabbitMQ, but it can work for any software:

Since sockets sit below the application layer in the OSI model, it is much easier to control the flow at this level. Hence, the client does not need to be changed, except for the port on the target server. On the server side, I would configure the RabbitMQ server to refuse any connections from non-localhost addresses. As for the SSL termination socket proxy - it is responsible for all the security requirements, e.g. certificate revocation verification, mutual authentication, using the Windows Certificate Store to bind the TCP listener, etc.
As for the performance... If you develop in lower-level languages, it may be fast enough. On the other hand, if you're an expert in writing non-blocking, fully asynchronous code, it can work fast enough even in .NET and Java.
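To make the idea concrete, here is a minimal Python sketch of the byte-forwarding core of such a proxy. It is only an illustration of the concept: in the full solution the client-facing socket would be an SSL-wrapped listener whose private key is bound via the Windows Certificate Store, and a second relay would run in the opposite direction.

```python
import socket
import threading

def relay(src: socket.socket, dst: socket.socket) -> None:
    """Forward raw bytes from src to dst until src reaches EOF.

    Because the proxy works at the socket level, it is transparent to
    whatever application protocol (AMQP, HTTP, proprietary) runs on top.
    """
    while True:
        data = src.recv(4096)
        if not data:
            break
        dst.sendall(data)
    dst.close()

# Demonstration with in-process socket pairs instead of real listeners.
client, proxy_in = socket.socketpair()
proxy_out, backend = socket.socketpair()

threading.Thread(target=relay, args=(proxy_in, proxy_out), daemon=True).start()

client.sendall(b"AMQP frame bytes")
print(backend.recv(4096))  # b'AMQP frame bytes'
```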


I worked with two of my colleagues - Guy Baron and Nir Rotshtein - both very experienced in software architecture and development. Folks - it was my pleasure working with you!

August 28, 2014

A new memory scraping tool

A Rising Trend

I've been searching for the term "POS Malware" on Google Trends, and I found the following result:
In my humble opinion, it looks like many people are getting interested in POS malware because they understand that POS systems can bring them a lot of $$$. On the other hand, these are only the results on Google; in the Darknet forums you'd find more aggressive results.

The Result

A few students from the College of Management Academic Studies decided to develop an open source memory scraping tool that allows organizations to test whether their process memory is vulnerable to scraping for specific patterns, e.g. credit card numbers, URLs or any regular expression.
This is an open source tool, and I highly recommend that you download it, compile it and run it on the systems you need to analyze.
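As a rough illustration of what such a tool does (this sketch is mine, not the students' code), scraping a buffer of process memory for card numbers typically means running a digit-run pattern over the bytes and filtering candidates with the Luhn checksum:

```python
import re

def luhn_valid(number: str) -> bool:
    """Luhn checksum, used to filter out random digit runs."""
    checksum = 0
    for i, d in enumerate(int(c) for c in reversed(number)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

# Runs of 13-16 digits, the typical length range of card numbers.
CARD_PATTERN = re.compile(rb"\b\d{13,16}\b")

def scan_buffer(buf: bytes) -> list[str]:
    """Scan a memory buffer for Luhn-valid card-number candidates."""
    return [m.group().decode()
            for m in CARD_PATTERN.finditer(buf)
            if luhn_valid(m.group().decode())]

# "4111111111111111" is a well-known Luhn-valid test number;
# the 13-digit run next to it fails the checksum and is discarded.
sample = b"track data: 4111111111111111 ; noise 1234567890123 end"
print(scan_buffer(sample))  # ['4111111111111111']
```

A real memory scraper reads the buffer from another process's address space (e.g. via `ReadProcessMemory` on Windows); the pattern-matching stage, however, is essentially the above.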