Avoiding XML signature attacks

The other day the security folks over at Duo Security posted about a class of bugs in several popular SAML implementations: https://duo.com/blog/duo-finds-saml-vulnerabilities-affecting-multiple-implementations

This is an excellent piece of work that shows how hard it has turned out to be for implementors to get XML digital signatures and encryption right. Writing security-sensitive code is often hard because it is not enough to make stuff that “works” – your code has to be “secure” as well.

A lot of people have asked me and my colleagues in the Swedish eIDAS project how to avoid being affected by this bug, and the Duo blog post clearly says that the software listed in the post is not the only software affected – it is just the software the Duo team happened to look at.

The further you get from the R&E federation community (where my $dayjob is), the more common it is to find custom SAML implementations, and there are probably a lot of such implementations that the Duo team never looked at.

I decided to do a write-up of my understanding of how to avoid this bug and a class of similar bugs that arise from not doing c14n (XML canonicalization) correctly.

For the rest of this post I assume that you have a c14n and xmldsig library that “does the right thing” – the question facing you as a coder is this:

How do I safely apply my libraries to validate an XML signature and do something with the result?

The key to avoiding this sort of bug lies in doing XML signature validation the proper way.

Let's start by showing (in pseudocode) how NOT to do it:

    xml = parse(some xml text)
    if (is_valid(xml)) {
        do_something_with(xml)    // DANGER: operates on the whole, unvalidated tree
    }

This is BAD and dangerous. Numerous wrapping attacks have been published over the last few years – the latest one from Duo is just the most recent in a long line of similar attacks. The right way to do XML signature processing goes something like this:

    xml = parse(some xml text)
    valid_xml = validate(xml)
    if (valid_xml != NULL) {
        do_something_with(valid_xml)    // operate ONLY on what validate() returned
    }

The point is that the “validate” function should return the valid references (there could be more than one!) after c14n processing. Specifically, the “validate” function should do reference processing and c14n, and return the resulting valid XML trees.
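To make this concrete, here is a minimal sketch in Python using the signxml library – an illustration of the pattern, not the one true implementation. The names some_xml_text and cert_pem are placeholders for your input document and the signer's certificate:

    from lxml import etree
    from signxml import XMLVerifier

    def validate(xml_bytes, cert_pem):
        # verify() does reference processing and c14n internally and returns
        # a result whose signed_xml member is the tree the signature actually
        # covers. signxml raises on any validation failure, so we map that to
        # NULL/None to match the pseudocode above.
        try:
            return XMLVerifier().verify(xml_bytes, x509_cert=cert_pem).signed_xml
        except Exception:
            return None

    valid_xml = validate(some_xml_text, cert_pem)
    if valid_xml is not None:
        # Operate only on the returned tree - never on the originally parsed input.
        print(etree.tostring(valid_xml, pretty_print=True))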

This approach protects against a whole class of attacks, including this latest one, provided c14n is done correctly.

We strongly encourage all implementors to review their code and make sure it follows the above pattern (and avoids the anti-pattern).

This advice applies to more than SAML – any time you do XML signature validation, this is how you should do it. If you are running SAML specifically, however, there is an additional measure you can take to avoid several attacks: encryption.

There is currently no known practical attack that can be launched against SAML implementations that use encrypted assertions. Using encrypted assertions therefore provides an extra layer of defense.

Implementers should not rely on encrypted assertions alone to avoid these problems; they should follow the pattern described above. But for those who are running third-party software that may still be vulnerable, encrypting the assertion (or enforcing encrypted assertions in responses) will at least block every currently known attack that leads to identity spoofing via SAML.
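To sketch what “enforcing encrypted assertions in responses” might look like, an SP could refuse any response that carries a plaintext assertion. The hypothetical pre-check below uses lxml; note that naive inspection of an unvalidated tree is no defense against wrapping on its own, so this complements – and does not replace – the validation pattern above:

    from lxml import etree

    SAML_NS = "urn:oasis:names:tc:SAML:2.0:assertion"

    def assertions_are_encrypted(response_bytes):
        # Accept only responses where no plaintext <saml:Assertion> appears
        # anywhere and at least one <saml:EncryptedAssertion> is present.
        tree = etree.fromstring(response_bytes)
        plaintext = tree.findall(".//{%s}Assertion" % SAML_NS)
        encrypted = tree.findall(".//{%s}EncryptedAssertion" % SAML_NS)
        return len(plaintext) == 0 and len(encrypted) > 0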

Thanks to Stefan Santesson <stefan at aaa-sec.com>.


We need an eIDAS IAF profile

The eIDAS regulation was published the other day. Now follows the work of getting it implemented. To this end I propose that the EU develop an eIDAS trust framework as a profile of the Kantara Initiative Identity Assurance Framework.

What is a trust framework?

A trust framework is a set of requirements on the components of an identity system (eg an identity provider or a relying party). The requirements in a trust framework typically cover aspects of subject authentication, operational security, subject identity verification, credential-to-name binding, attribute management, service processes, organizational maturity etc.

Trust frameworks tend to be more detailed when crossing jurisdictions or verticals, since within a single jurisdiction or vertical there are often unstated rules that help to build trust.

For instance, inside the R&E sector trust frameworks are often heavily elided because of that sector's inherent focus on collaboration, which results in implicit trust.

However when the R&E sector interfaces with the Health sector, considerations of patient safety and security often result in cross-sector trust frameworks that are more detailed due to a need to “spell out the details”.

As trust frameworks are built to cover more jurisdictions and verticals, they typically grow in complexity, and there is a need to develop and maintain standardized trust frameworks to enable interoperability.

Standard trust frameworks

Over the last few years several industry-standard trust frameworks have emerged that try to be independent of jurisdiction. The two most prominent ones are ISO 29115 and the Kantara Identity Assurance Framework (KI-IAF). The former owes much of its genes to the latter, which in turn owes genes to NIST SP 800-63. There was also a significant amount of mutual influence between ISO 27k and KI-IAF – partly because of a set of shared primary authors.

These three frameworks – ISO 29115, KI-IAF and NIST SP 800-63 – all rely on the notion of four levels of assurance (LoA1–LoA4, roughly from little or no confidence to very high confidence in the asserted identity), which has since become a de facto industry standard. In practice the lowest level (LoA1) is rarely referenced today because no organizations provide accreditation at that level.

Why are trust frameworks useful?

In an identity system, relying parties (RPs) have to make decisions about whether to trust the assertions of identity providers and attribute authorities. Such decisions are typically made on the basis of knowledge about the operational, technical and policy properties of the identity providers.

The relying party will typically have a set of business requirements against which the properties of the identity provider are evaluated. For instance, the RP may require that all subjects have been identified to a certain degree of certainty.

As the number of business relationships grows, and the number of identity providers grows accordingly, the RP ends up spending more and more resources evaluating IdPs for compliance with its business requirements.

A standardized trust framework reduces this complexity: instead of evaluating every IdP against its own bespoke requirements, the RP can rely on a single evaluation of each IdP against the common framework.

The role of independent audits

The requirements that make up a trust framework are often intentionally written with a certain amount of leeway to allow for technical innovation. For instance, instead of specifying that (say) subject authentication must be done using a specific technology, a requirement might specify security properties that can be fulfilled by a number of equally acceptable technology alternatives.

Less detailed requirements (or criteria) typically make for trust frameworks that are easier to understand and read, but they place a much higher emphasis on the independence of the function that evaluates services against the framework. To this end it is common to employ specialist auditors who have no ties to the services under evaluation.

The need for an eIDAS trust framework

The eIDAS regulation is based on interoperability between identity systems already deployed in the member states. In many cases a member state (MS) has a trust framework that describes the requirements on identity providers approved for use in that jurisdiction. Interoperability under eIDAS implies (as a worst case) that each MS trust framework has to be evaluated and compared against every other MS trust framework – an exercise that grows quadratically with the number of member states.

Since several trust frameworks in use in the EU (eg the Swedish e-ID trust framework, the UK tScheme or the Austrian government federation trust framework) are based on KI-IAF, a great deal of commonality is already in place.

By creating an eIDAS trust framework as a profile of KI-IAF we achieve the following goals:

1. A set of common requirements based on an industry standard baseline
2. An easy path to interoperability with Canada and the US (whose frameworks are also based on KI-IAF)
3. Reduced complexity for MS-MS interoperability

Why now?

The Kantara Initiative is actively working on a new version of the IAF in which the US Federal Government framework (FICAM ATOS) will be a profile alongside the Canadian government profile. It makes a lot of sense for this work to happen in parallel with creating an eIDAS profile.

This is the right time to get this done!

Edit: eIDAS is Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC


Offense is the best defense

Yesterday, Computer Sweden wrote that three government agencies question the security of the new e-identification system and have asked MSB to conduct a review. Independent reviews are a very good thing and should happen more often when Swedish government agencies adopt new technology – that would probably have spared us a number of catastrophic mistakes over the past few years, including the current BankID-controlled e-identification system in Sweden.

For those of you who can't be bothered to read the rest, here is a summary: this move by Försäkringskassan, CSN and Arbetsförmedlingen is a put-up job commissioned by BankID, which risks losing its monopoly with the introduction of a new, more open e-identification system. The review MSB should really be doing is of the Mobilt BankID system, which is currently being rolled out at large scale in Sweden without any serious, independent security analysis having been performed.

Federation grandpa grumbles

The new e-identification system in Sweden is in many respects a copy of SWAMID, the identity federation of Sweden's higher-education institutions. As many of you know, SWAMID is one of the things I have worked on the most in recent years, and I am very proud of how well SWAMID works and of the fact that the solutions my colleagues and I developed have already been copied more than once in Sweden. SWAMID is part of a large ecosystem of identity federations that, within the research sector alone, covers tens of millions of users. Outside that sector, the same technology is used to build solutions like the e-identification system in, among other places, the UK, Finland, Denmark, Austria and the US.

Federations are the exact opposite of centralization and monopoly, and a technical guarantee of freedom of choice. Those are core values most people in Sweden should be able to sign up to – including Försäkringskassan, CSN and Arbetsförmedlingen.

But in Sweden we have BankID, which has invented its own thing … obviously much better.


There is plenty of other nonsense in the criticism, including the claim that the new technology is not “mobile friendly”. Others will no doubt respond to this and to much else that is merely moderately silly, but I want to single out one of the central points in the criticism of the new e-identification system, a point that is silly on an exceptionally flagrant level:

The new federation technology is supposedly extra vulnerable to so-called DDoS attacks! The new system is a distributed system, based on competition, that lets multiple actors share the responsibility of identifying citizens in Sweden. Today almost a single actor provides this critical public function: the banks. The current BankID solution is not only a commercial monopoly – the technology is also highly centralized and depends on critical functions at BankGiro-Centralen (BGC) in order to work.

A centralized solution is obviously much better than a distributed one … or was it the other way around?

More competition, please

We should have more independent review of technology in Sweden. I sincerely hope that MSB puts in the resources required to look at the entire identity ecosystem in Sweden. If they do, I am convinced they will conclude that what is missing is more competition.

Today we have no competition at all. Arguments against competition in this area are sometimes raised (guess where from): with more actors, citizens would have yet another thing to choose on top of electricity supplier, daycare, school and so on. That argument has to be taken seriously, but the current situation is also extremely dangerous from a vulnerability perspective. A serious problem at BGC or BankID would today knock out almost every citizen's access to critical public services.

And this is the big pink elephant in the room: there is still only one organization doing identification of citizens in Sweden: the banks.

There are several other verticals that could take up the competition with the banks for identifying citizens in Sweden, for example …

  • Grocery retail – we have fairly few actors, and several of them already run some form of banking operations
  • Svenska Spel – already a technology-intensive business with a focus on secure identification
  • Mobile operators – Sweden is one of the most mobile-dense countries in the world, and things are already starting to move here.
  • Advocacy organizations – why not DFRI, for instance? I cannot think of a better guarantor of individual rights and freedoms.

It was recently announced that the next generation of federation technology, OpenID Connect, has now been published. OpenID Connect is sometimes called “SAML3” because it is so close to the SAML2 we use in the Swedish e-identification system. OpenID Connect is easy to bridge to SAML and is already integrated into all the mobile platforms. It is also the technology GSMA has chosen for its Mobile Connect initiative. A collaboration with the operator industry could break up the BankID monopoly in Sweden.

Go e-legnämnden!

I will end my rant by tipping my hat to e-legnämnden. They often have to put up with quite a lot from both proponents and opponents of a fresh start for e-identification in Sweden, but they do a very good job under very difficult conditions.

It is time for Swedish government agencies to get off the sidelines and into the game.



It's umbrellas all the way down

The NREN world is changing. It used to be that you could get away with running a network and a decent ftp-server and that would be good enough.

Not so much anymore. NRENs are turning into service portfolios with a network. For some the transition is relatively painless and quick, for others less so.

Travelling at different speeds causes tensile stress between NRENs. There is a natural tendency from the governance layer to try to address this top-down. Often by adding more governance.

Recently both eduGAIN and eduroam have been the focus of this special form of Loving Care.

Beyond creating work for process consultants, adding governance layers seldom adds real value.

Let me explain…

A federation exchange point – like eduGAIN or Kalmar2 – is a lot like an Internet eXchange point (aka an IX). An IX operating on a free market is mostly controlled by two forces: the value of the connected services pulls customers to the IX, and the cost of getting and staying connected pushes customers away from it.

Any clueful IX – like Netnod, LINX or AMS-IX – is operated by an entity that understands that the value of the IX lies in the connected customers. Some even go as far as to call their customers members and allow them significant control over policy and direction.

In other words: governance follows the money!

The current governance model for eduGAIN and eduroam is based on the same idea: funding members control the service, which to date has implied a central role for GEANT. Some see this as a problem as eduGAIN and eduroam expand beyond the EU.

Fair enough!

Adding governance to address this largely imaginary problem is hardly the answer though. Adding layers just adds wasteful complexity and ignores the following two facts:

  1. Building a Federation eXchange point no longer involves black magic. The knowledge is widely available. There are at least ten groups in the world today that could build another eduGAIN.
  2. Unlike an IX, a federation-IX doesn't depend on geography, economy or scale: almost anyone can build one. If you can attract services, you win. Kalmar2 is proof that this is practical, cheap and easy.

To those who are hesitant about the governance of eduGAIN I say this: join up and demand representation. Your voice will be heard – or I, and a lot of others, will help you build a new eduGAIN down the road from the old one.

We’ve done it once, we can do it again. It will be done in a New York minute.


The bitter taste of good intentions

In a recent blog post Eran explains why he withdrew from the OAuth WG. Having observed the workings of that particular WG since its inception, I thought I'd provide some perspective.

To put it briefly: Eran is in part right and completely, totally off base.

Let me first say that I admire Eran for sticking with it for so long. Being document editor for something that needs 30 versions to get “done” is not easy.

Eran is completely right in saying that OAuth 2.0 has grown into a much larger beast than 1.0 and that there are now ways to put 2.0 together that will be unsafe, non-interoperable and probably fattening too. Eran is also right in thinking that the WG has taken way too much time to reach this point.

However, Eran is missing an important reason why things developed the way they did. He touches on it when he talks about enterprise vs web.

In fact where Eran talks about enterprise it should really say “Microsoft”.

Early on, and for several meetings, the WG was totally devoid of traditional software vendors. It did (to some extent) attract the big web companies with a stated interest in OAuth – Facebook, Google, Yahoo – along with a few of the mobile operators. The mobile operators stayed on and have made important contributions, but the web companies were a completely different story.

Personally I was surprised at the level of “ego-waving” going on at some of the early meetings and when WRAP appeared. I especially recall one WG meeting where a representative of one large stakeholder disrupted a session by walking out in the middle of an active round-table discussion, citing boredom as the reason.

In its formative months, when a WG depends on committed and active participation from invested vendors and operators, the OAuth WG had too little of this and too much casting about.

When MSFT turned up (and people who know me know that I seldom sing their praise), their presence stabilized the WG and it started to make progress – but important time had been lost.

Is OAuth 2.0 a failure?

Time will tell. I do not think the fact that FB is still operating on version 20 (or something) is a measure of the success or failure of the protocol. Having implemented OAuth 2.0 myself, I don't agree with Eran that 2.0 is more complicated than 1.0 – quite the contrary. I do agree with Eran that an important piece of OAuth 2.0 was lost when signatures were made an optional part of the spec. Ironically, the proponents of that change cited more or less the same reason that the opponents of “WS-*” cite: simplicity.

If there is a lesson to be had, perhaps it is this: make it as simple as possible, but no simpler. Unfortunately many standards development organizations (SDOs) routinely fail to remember this.

The challenge going forward is how to measure interoperability for something like OAuth, where there are no reference implementations and few traditional software vendors (and those that exist add lots of secret sauce to the mix).

Will OAuth 2.0 move beyond single-vendor ecosystems, where if you want to talk to Facebook you'd better use the Facebook reference code if you expect anything to work?

I sure hope so.





#rlbob

Next I'll pick up the shovel and keep digging.

pyFF – another metadata aggregator

In the world of large scale identity federations the problem-du-jour is how federation operators can connect their federations and share services.

The eduGAIN program, led by my good friends Valter Nordh and Brook Schofield, being a concrete instantiation of interfederation, is starting to reveal operational issues in a number of national R&E federations, specifically with respect to how SAML metadata is managed and made available to connected relying parties and identity providers.

A couple of years ago Ian Young wrote a blog post on an operational model for metadata, and Andreas Solberg started work on a basic metadata aggregation profile based in part on those ideas. At the recent tf-emc2 OpenSpace in Zurich, Brook ran a session on this topic. These efforts will need to converge in the near future to produce a Standard Model for Interfederation.

In order to support such a model the world needs working code.

Ian and the Shibboleth team have been working on MA1 for a while. I've had code in this space too – for instance my saml-md-aggregator.

Recently (last Monday) the SWAMID operations team and I realized we needed to modernize the way we manage and publish our metadata, so I took the opportunity to roll up my sleeves and write some code.

The result is pyFF – Federation Feeder.

pyFF is based on a simple execution model – metadata goes in one end and out the other, and in between, processing happens in a pipeline of basic operations described by a simple DSL (domain-specific language) using YAML syntax. Right now the code is under rapid development and I expect it to be in production for SWAMID very soon.
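To give a flavor of the DSL, here is a sketch of what a minimal pipeline might look like: fetch remote metadata, select entities, sign the result and publish it. The URL, key and file names are placeholders, and the exact operation names should be checked against the pyFF documentation:

    # fetch remote metadata, select all entities, sign and publish the result
    - load:
      - http://md.example.org/upstream-metadata.xml
    - select
    - sign:
        key: sign.key
        cert: sign.crt
    - publish: /var/metadata/feed-signed.xml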

Check it out and send me comments: leifj at sunet.se


Why it is (sometimes) ok to shoot yourself in the foot

I got this link on a list earlier today: Facebook (2-step authentication) fail!

I totally disagree with almost all the assumptions and conclusions of that post. The only bit I can sort of agree with is that maybe, just maybe, it is not a good idea to let you opt out of security without proving your identity at a higher level of assurance – but I can also totally grok why FB is doing it this way. The reason is spelled “support costs”.

The fundamental mistake of the post is this: the author assumes that strong(er) authentication (eg 2-factor) should be at the discretion of the site owner.

As content owner (my Facebook page, my crap) in this case, I carry most of the risk associated with protecting my data. It is therefore totally fine to let me bypass security if I want to – up to a point.

At some point FB assumes some basic level of risk and responsibility which is why they won’t let me create an account without a password.

If this were a bank the border between personal risk and site-owner risk would shift – in part because the law mandates a higher level of responsibility on the part of the bank than in the case of FB.

Higher level-of-assurance/protection is successfully introduced for one of two reasons:

  • the user values his/her data (cf Blizzard tokens)
  • “the man” (eg the government) tells you how it must be

Luckily FB isn’t “the man” – at least not yet – and isn’t in a position to force users into valuing their data above a level that is minimally accepted by most users.

This is the reason strong authentication almost always fails when faced with reality: most of us security nerds don't share the same gut reaction with respect to data value as most “normal” users, and we are therefore willing to accept a higher degree of hassle when it comes to protecting that data.

This brings me back to the fundamental point: the cost of introducing strong authentication is not in tokens, provisioning or identity proofing. Most of the cost is in support. The simple truth is that most ways we have devised to improve the security of the authentication step in any protocol suck from a UI perspective. Fundamentally, all such measures (be they SMS codes, OTP tokens or so-called “smart” cards) introduce extra steps in the login process. This means they are seen by the user as an obstacle to overcome before they can get at whatever content they were going for.

Incidentally this is related to click-through terms-of-use dialogs, but that is another story and another blog post.

It is worth noting (as I usually try to do when this topic comes up in conversation) that some of the most successful deployments of 2-factor tokens are in the gaming industry, and I firmly believe that in those cases the users value their data enough to accept the additional obstacles imposed by stronger authentication.

I also firmly believe that anyone who can design a truly user-friendly strong authentication mechanism would get rich pretty fast and would do a great service to the Internet.



Why you should care about the CABforum

The CA/Browser Forum (aka CABforum) announced a couple of days ago that it would form a WG on “organizational reform”.

Why is this important, I hear you ask?

The CABforum has quite a lot of power. This group makes decisions that affect which CAs are chosen for inclusion in default browser trust stores. Currently the group is composed of browser and CA vendors. Notably absent are any relying parties.

Here is how to participate (quoted from the cabforum.org announcement):

In support of this process, the special working group is soliciting short (no more than 750 words, please) position papers and statements of interest from organizations and individuals on these topics. We encourage stakeholders to submit their comments to questions@cabforum.org now through March 30, 2012. All submissions will be posted publicly on the CA/Browser Forum website. (www.cabforum.org)


convergence & federations?

Convergence is one of several proposed solutions to the problem of lying and poorly managed CAs. DANE is of course another. I like fighting on multiple fronts, so when rlbob sent me an inspirational email today after listening to Moxie talk about Convergence at #RSAC, I just could not resist.

To make a long story short, I went and set up a Convergence notary. If you feel like trusting it, feel free to visit https://etc.mnt.se/mnt.notary – but make sure you visit convergence.io and install their Firefox plugin first.

Here then is the rlbob challenge:

The Chrome guy says they can’t use convergence because the traffic load would be too high for anyone but them to support, and they can’t be the ones to validate pubkeys for their own browser. In steps a worldwide network of registrars run by R&HE using our spare computing power and bandwidth. Let’s do it!

Let's see what happens next!
