Category Archives: Identity

We need an eIDAS IAF profile

The eIDAS regulation was published the other day. Now follows the work of getting it implemented. To this end I propose that the EU develop an eIDAS trust framework as a profile of the Kantara Initiative Identity Assurance Framework.

What is a trust framework?

A trust framework is a set of requirements on a component of an identity system (e.g. an identity provider or a relying party). The requirements in a trust framework typically cover aspects of subject authentication, operational security, subject identity verification, credential-to-name binding, attribute management, service processes, organizational maturity, etc.

Trust frameworks tend to be more detailed when crossing jurisdictions or verticals since there are often unstated rules that help to build trust within a jurisdiction or vertical.

For instance, inside the R&E sector trust frameworks are often heavily elided, because the sector's inherent focus on collaboration results in implicit trust.

However when the R&E sector interfaces with the Health sector, considerations of patient safety and security often result in cross-sector trust frameworks that are more detailed due to a need to “spell out the details”.

As trust frameworks are built to cover more jurisdictions and verticals they typically grow in complexity, and there is a need to develop and maintain standardized trust frameworks to enable interoperability.

Standard trust frameworks

Over the last few years several industry-standard trust frameworks have emerged that try to be independent of jurisdiction. The two most prominent ones are ISO 29115 and the Kantara Identity Assurance Framework (KI-IAF). The former owes much of its genes to the latter, which in turn owes genes to NIST SP 800-63. There was also a significant amount of influence between ISO 27k and KI-IAF – partly because of a set of shared primary authors.

These three frameworks – ISO 29115, KI-IAF and NIST SP 800-63 – all rely on the notion of four levels of assurance, which has since become a de facto standard in the industry. In practice the lowest level (LoA1) is less referenced today because there are no organizations that provide accreditation at this level.

Why are trust frameworks useful?

In an identity system, relying parties (RPs) have to make decisions about whether to trust the assertions of identity providers and attribute authorities. Such decisions are typically made on the basis of knowledge about the operational, technical and policy properties of the identity providers.

The relying party will typically have a set of business requirements against which the properties of the identity provider are evaluated. For instance, the RP may require that all subjects have been identified to a certain degree of certainty.

As the number of business relationships grows, and the number of identity providers grows accordingly, the RP spends more and more resources evaluating IdPs for compliance with its business requirements.

A standardized trust framework reduces this complexity: instead of evaluating every IdP bilaterally against its own requirements, the RP can rely on each IdP's certification against the common framework.
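To make this concrete, here is a minimal sketch (with entirely hypothetical entity IDs and certification data) of what the RP's trust decision collapses to once IdPs are certified under a common framework at a stated level of assurance:

    REQUIRED_LOA = 3  # the RP's business requirement, e.g. identity proofing at LoA 3 or better

    # Hypothetical registry of certifications published by a trust-framework operator.
    CERTIFIED_IDPS = {
        "https://idp.example.org/saml": {"framework": "KI-IAF", "loa": 3},
        "https://idp.example.net/saml": {"framework": "KI-IAF", "loa": 2},
    }

    def acceptable(entity_id):
        """Accept an IdP certified under the framework at or above the RP's required LoA."""
        cert = CERTIFIED_IDPS.get(entity_id)
        return cert is not None and cert["loa"] >= REQUIRED_LOA

    print(acceptable("https://idp.example.org/saml"))  # True
    print(acceptable("https://idp.example.net/saml"))  # False: certified, but only at LoA 2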

The role of independent audits

The requirements that make up a trust framework are often intentionally written with a certain amount of leeway to allow for technical innovation. For instance, instead of specifying that (say) subject authentication must be done using a specific technology, a requirement might specify security properties that can be fulfilled by a number of equally acceptable technology alternatives.

Having requirements (or criteria) that are less detailed typically makes for trust frameworks that are easier to understand and read, but it places a much higher emphasis on the independence of the function that evaluates compliance with them. To this end it is common to employ specialist auditors that do not have ties to the services under evaluation.

The need for an eIDAS trust framework

The eIDAS regulation is based on interoperability of identity systems already deployed in the member states. In many cases the MS has a trust framework that describes the requirements on identity providers approved for use in that jurisdiction. Interoperability under eIDAS implies (in the worst case) that each MS trust framework will have to be evaluated and compared against every other MS trust framework.

Since several trust frameworks in use in the EU (e.g. the Swedish e-ID trust framework, the UK tScheme or the Austrian government federation trust framework) are based on KI-IAF, a great deal of commonality is already in place.

By creating an eIDAS trust framework as a profile of KI-IAF we achieve the following goals:

1. A set of common requirements based on an industry standard baseline
2. An easy path to interoperability with Canada and the US (whose frameworks are also based on KI-IAF)
3. Reduced complexity for MS-MS interoperability

Why now?

The Kantara Initiative is actively working on a new version of the IAF, in which the US federal government framework (FICAM ATOS) will be a profile alongside the Canadian government profile. It makes a lot of sense for this work to happen in parallel with creating an eIDAS profile.

This is the right time to get this done!

Edit: eIDAS is Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC


Attack is the best defense

Yesterday Computer Sweden wrote about three government agencies questioning the security of the new e-identification system and asking MSB to carry out a review. Independent reviews are a very good thing and should happen more often when Swedish agencies adopt new technology – then we would probably have been spared a number of catastrophic mistakes in recent years, including the current BankID-driven e-identification system in Sweden.

For those of you who can't be bothered to read the rest, here is a summary: the move by Försäkringskassan, CSN and Arbetsförmedlingen is a job commissioned by BankID, which stands to lose its monopoly with the introduction of a new, more open e-identification system. The review MSB ought to do is of the Mobilt BankID system, which is currently being rolled out at scale in Sweden without any independent, serious security analysis having been carried out.

The federation grandpa grumbles

The new Swedish e-identification system is in large part a copy of SWAMID, the identity federation of the Swedish higher-education institutions. As many of you probably know, SWAMID is one of the things I have worked on the most over the past few years, and I am very proud of how well SWAMID works and of the fact that the solutions my colleagues and I developed have already been copied more than once in Sweden. SWAMID is part of a large ecosystem of identity federations which, in the research sector alone, covers tens of millions of users. Outside that sector the same technology is used to build the same kind of solution as the e-identification system in, among other places, the UK, Finland, Denmark, Austria and the US.

Federations are the exact opposite of centralization and monopoly, and they are a technical guarantee of freedom of choice. These are core values most people in Sweden ought to be able to sign up to. Including Försäkringskassan, CSN and Arbetsförmedlingen.

But in Sweden we have BankID, which has invented something of its own … obviously much better.

DDoS

There is quite a lot of other nonsense in the criticism, for example that the new technology supposedly isn't "mobile-friendly". Others will no doubt respond to that and to much else that is merely moderately silly, but I want to highlight one of the key points in the criticism of the new e-identification system, one that is silly on an altogether flagrantly idiotic level:

The new federation technology is supposedly extra vulnerable to so-called DDoS attacks! The new system is a distributed system, based on competition, that lets more actors share the responsibility for identifying citizens in Sweden. Today almost a single actor provides this critical societal function: the banks. The current BankID solution is not only a commercial monopoly; the technology is also highly centralized and depends on critical functions at BankGiro-Centralen (BGC) in order to work.

A centralized solution is obviously much better than a distributed one… or was it the other way around?

More competition, please

We should have more independent review of technology in Sweden. I really hope MSB puts in the resources needed to look at the entire identity ecosystem in Sweden. If they do, I am convinced they will conclude that what is missing is more competition.

Today we have no competition at all. Arguments against competition in this area are sometimes raised (guess by whom), on the grounds that with more actors citizens get yet another thing to choose, on top of electricity supplier, daycare, school and so on. That argument has to be taken seriously, but the current situation is also extremely dangerous from a vulnerability perspective. A serious problem at BGC or BankID today knocks out almost every citizen's access to critical societal functions.

So this is the big pink elephant in the room: there is still only one organization doing identification of citizens in Sweden: the banks.

There are several other verticals that could take up the competition with the banks over identifying citizens in Sweden, for example …

  • Grocery retail – we have fairly few actors and several of them already run some form of banking operation
  • Svenska Spel – already a technology-intensive business with a focus on secure identification
  • Mobile operators – Sweden is one of the most mobile-dense countries in the world, and things are already starting to move here.
  • Interest organizations – why not DFRI, for example? I cannot think of a better guarantor of individual rights and freedoms.

It was recently announced that the next generation of federation technology, OpenID Connect, has now been published. OpenID Connect is sometimes called SAML3 because it is so close to the SAML2 we use in the Swedish e-identification system. OpenID Connect is easy to connect with SAML and is already integrated into all the mobile platforms. It is also the technology GSMA has chosen for its Mobile Connect initiative. A collaboration with the operator industry could break up the BankID monopoly in Sweden.
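To illustrate how lightweight the OpenID Connect side of such a bridge is, a relying party can bootstrap against a provider from a single discovery document. A hedged sketch in Python; the issuer URL is a placeholder:

    import requests

    ISSUER = "https://op.example.org"  # placeholder OpenID Provider

    # OpenID Connect discovery: one JSON document tells the relying party everything
    # it needs to start talking to the provider.
    conf = requests.get(ISSUER + "/.well-known/openid-configuration", timeout=10).json()
    print(conf["authorization_endpoint"], conf["token_endpoint"], conf["jwks_uri"])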

Keep it up, e-legnämnden!

I'll end my "rant" by tipping my hat to e-legnämnden. They often have to put up with quite a lot from both proponents and opponents of a fresh start for e-identification in Sweden, but they do a very good job under very difficult conditions.

It is time for the Swedish government agencies to get off the boards and into the game.


It's umbrellas all the way down

The NREN world is changing. It used to be that you could get away with running a network and a decent ftp-server and that would be good enough.

Not so much anymore. NRENs are turning into service portfolios with a network. For some the transition is relatively painless and quick, for others less so.

Travelling at different speeds causes tensile stress between NRENs. There is a natural tendency from the governance layer to try to address this top-down. Often by adding more governance.

Recently both eduGAIN and eduroam have been the focus of this special form of Loving Care.

Beyond creating work for process consultants, adding governance layers seldom adds real value.

Let me explain…

A federation exchange point – like eduGAIN or Kalmar2 – is a lot like an Internet eXchange point (aka an IX). An IX that operates on a free market is mostly controlled by two forces: the value of the connected services pulls customers to the IX, and the cost of getting and staying connected to the IX pushes customers away from the IX.

Any clueful IX – like netnod, LINX or AMS-IX – is operated by an entity that understands that the value of the IX lies in the connected customers. Some even go as far as to call their customers members and allow them significant control over policy and direction.

In other words: governance follows the money!

The current governance model for eduGAIN and eduroam is based on the same idea: funding members control the service, and to date this has implied a central role for GEANT. Some see this as a problem as eduGAIN and eduroam expand beyond the EU.

Fair enough!

Adding governance to address this largely imaginary problem is hardly the answer though. Adding layers just adds wasteful complexity and ignores the following two facts:

  1. Building a Federation eXchange point no longer involves black magic. The knowledge is widely available. There are at least 10 groups in the world today that could build another eduGAIN.
  2. Unlike an IX, a Federation-IX doesn't depend on geography, economy or scale: almost anyone can build one. If you can attract services you win. Kalmar2 is proof that this is practical, cheap and easy (a sketch of the metadata-aggregation core of such an exchange follows below).
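To make point 1 concrete: the technical core of a federation exchange point is little more than collecting, validating and re-publishing signed SAML metadata. A minimal Python sketch of the aggregation step is shown below; the feed URLs are placeholders, and the signing and validation of the result, which is where the real operational work lies, is left out:

    import requests
    from lxml import etree

    MD_NS = "urn:oasis:names:tc:SAML:2.0:metadata"
    FEEDS = [
        "https://md.federation-a.example/metadata.xml",  # placeholder upstream federations
        "https://md.federation-b.example/metadata.xml",
    ]

    def aggregate(feeds):
        """Merge the EntityDescriptors of several federations into one EntitiesDescriptor."""
        out = etree.Element("{%s}EntitiesDescriptor" % MD_NS, nsmap={"md": MD_NS})
        for url in feeds:
            doc = etree.fromstring(requests.get(url, timeout=30).content)
            for ed in list(doc.iter("{%s}EntityDescriptor" % MD_NS)):
                out.append(ed)
        return etree.tostring(out, pretty_print=True, xml_declaration=True, encoding="UTF-8")

    # The result would then be signed with the exchange's key and published for members to fetch.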

To those who are hesitant about the governance of eduGAIN I say this: join up and demand representation. Your voice will be heard, or I and a lot of others will help you build a new eduGAIN down the road from the old one.

We’ve done it once, we can do it again. It will be done in a New York minute.


The bitter taste of good intentions

In a recent blog post Eran explains why he withdrew from the OAuth WG. Having observed the workings of that particular WG since its inception, I thought I'd provide some perspective.

To put it briefly: Eran is in part right and completely, totally off base.

Let me first say that I admire Eran for sticking with it for so long. Being a document editor for something that needs 30 versions to get "done" is not easy.

Eran is completely right in saying that OAuth 2.0 has grown into a much larger beast than 1.0 and that there are now ways in which you can put 2.0 together that will be unsafe, non-interoperable and probably fattening too. Eran is also right in thinking that the WG has taken way too much time to reach this point.

However Eran is missing an important reason why things developed the way they did. Eran touches on this when he talks about enterprise vs web.

In fact where Eran talks about enterprise it should really say “Microsoft”.

Early on, and for several meetings, the WG was totally devoid of traditional software vendors. It did (to some extent) attract the big web companies with a stated interest in OAuth: Facebook, Google and Yahoo, along with a few of the mobile operators. The mobile operators stayed on and have made important contributions, but the web companies were a completely different story.

Personally I was surprised at the level of "ego-waving" going on at some of the early meetings and when WRAP appeared. I especially recall one WG meeting where a representative of one large stakeholder disrupted a session by walking out in the middle of an active round-table discussion, stating boredom as the reason.

In its formative months, when a WG depends on committed and active participation from invested vendors and operators, the OAuth WG had too little of this and too much casting about.

When MSFT turned up (and people who know me know that I seldom sing their praise), their presence stabilized the WG and it started to make progress, but important time had been lost.

Is OAuth 2.0 a failure?

The future will tell. I do not think the fact that FB is still operating on version 20 (or something) is a measure of the success or failure of the protocol. Having implemented OAuth 2.0 myself, I don't agree with Eran that 2.0 is more complicated than 1.0 – quite the contrary. I do agree with Eran that an important piece of OAuth 2.0 has been lost by making signatures an optional part of the spec. Ironically, the proponents of that change cited more or less the same reason that the opponents of "WS-*" cite: simplicity.
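For comparison, this is roughly what a protected-resource call looks like in the 2.0 bearer-token style (a hedged sketch; the URL and token are placeholders). The whole 1.0 exercise of normalizing the request and computing an HMAC signature over it disappears, which is exactly the simplicity, and the loss, being argued about:

    import requests

    ACCESS_TOKEN = "..."  # obtained from the authorization server out of band (placeholder)

    # OAuth 2.0 bearer usage: the token simply rides in the Authorization header,
    # and TLS provides the message protection that 1.0 signatures used to provide.
    resp = requests.get(
        "https://api.example.com/v1/me",  # hypothetical protected resource
        headers={"Authorization": "Bearer " + ACCESS_TOKEN},
        timeout=10,
    )
    print(resp.status_code, resp.json())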

If there is a lesson to be had, perhaps it is this: make it as simple as possible but no simpler. Unfortunately many standards development organizations (SDOs) routinely fail to remember this.

The challenge going forward is how we measure interoperability for something like OAuth, where there are no reference implementations and few traditional software vendors (and those that exist add lots of secret sauce to the mix).

Will OAuth 2.0 move beyond single-vendor ecosystems, where if you want to talk to Facebook you'd better use the Facebook reference code if you expect anything to work?

I sure hope so.


#rlbob

https://spaces.internet2.edu/display/rlbob/Home

Next I’ll pick up the shovel and keep digging.


Swedish national SAML federation?

The long-awaited (at least if you're Swedish and interested in public-sector IT, which does rather limit the audience a bit) e-delegationen report was released today. The section on national identity solutions says "SAML" and "federations" over and over.

On the whole the report promises a significant improvement over today's proprietary solutions. There is still lots of work left to do in order to realize these ideas. Those of us who have worked in identity space for a while know that there are plenty of opportunities to shoot oneself in the foot even if you have the right shoes on.

For reasons that escape me, Sweden has a bit of a track record of trying to "roll its own" in areas where there are plenty of existing standards and clear market direction, but this time I do believe e-delegationen is betting on the right horse. Good work!


Stork & InfoCard (and maybe U-Prove)

Paul Madsen twittered this Network World article about what I guess must be one of the first public appearances of the EU STORK project.

Kim Cameron and MSFT seem to be shopping InfoCard and Geneva all over the place these days, so their comments about STORK shouldn't surprise anyone. The article claims that InfoCard has seen solid industry uptake, which may be true, but according to the recent Concordia Survey on Federated Identity InfoCard has a very small deployed base.

Nevertheless I think it reasonable to expect that InfoCard will get deployed more, even in the R&E community where federated identity is already a Big Thing (TM).

InfoCard shares an important piece of infrastructure with SAML, namely SAML metadata, which makes it fairly easy to deploy alongside SAML (even though the semantics and user experience of SAML WebSSO and InfoCard differ quite a bit). SAML metadata, when deployed "the right way", becomes the primary trust fabric of an identity federation. Microsoft's Geneva was apparently designed around the same principles for how SAML metadata should be used that are fast becoming best practice among R&E identity federations.
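As an illustration of the metadata-as-trust-fabric idea, here is a small Python sketch of how a relying party might pull a peer's published signing certificates out of an aggregate of federation metadata. The file name and entityID are placeholders, and verifying the aggregate's own signature against the federation operator's key, which is what actually anchors the trust, is deliberately left out:

    from lxml import etree

    MD_NS = "urn:oasis:names:tc:SAML:2.0:metadata"
    DS_NS = "http://www.w3.org/2000/09/xmldsig#"

    def signing_certs(metadata_file, entity_id):
        """Return the base64 X509 certificates published for one entity in an aggregate."""
        tree = etree.parse(metadata_file)
        certs = []
        for ed in tree.iter("{%s}EntityDescriptor" % MD_NS):
            if ed.get("entityID") != entity_id:
                continue
            for cert in ed.iter("{%s}X509Certificate" % DS_NS):
                certs.append(cert.text.strip())
        return certs

    # Placeholders: a locally cached, signature-checked copy of the federation aggregate.
    print(signing_certs("federation-metadata.xml", "https://idp.example.org/saml"))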

So we learn that STORK will consider SAML 2.0 and holder-of-key as the primary way to interface with the national eID solutions in the European countries. I really hope they understand that the devil is in the details, and that they design metadata management and trust-fabric management in a sensible way.

One can only wonder what lies behind Microsoft pushing Geneva all over the place. Typically Microsoft isn't happy just following where others lead. Perhaps the idea is to include in Geneva the U-Prove technology they bought with Credentica last year, and embrace and extend the identity federation framework…

Then again, once you can see the threat it is suddenly less of a threat. The famous embrace-and-extend tactic is precisely that: famous. People who are interested in open standards and open implementations should recognize where the ball is being played and start thinking about how to implement U-Prove.


Metadata license becomes metadata terms-of-use

Andrew Cormack of ja.net talked at the REFEDS meeting today about recent work they have done on standardizing interfederation agreements. One interesting announcement was that they’ve picked up my old idea of associating a license with federation metadata. They ran this by a set of lawyers who basically said: “don’t call it a license, call it terms-of-use and you’re fine”.

This has the potential to simplify federation operations (including federation peering) significantly, since service providers don't have to be tied to federations by legal agreements. For multi-federation service providers like Microsoft DreamSpark or Elsevier this is good news, since they may in time get away from having to sign agreements with every federation on the planet.

While this may seem like a bad idea for federations whose business model has been driven by being able to charge SPs for inclusion in metadata, in the long run everyone benefits from the identity business growing once a major obstacle is removed.


Certificate enrollment in confusa using OAuth

I’ll admit that X.509 certs aren’t the most hot topic in the world these days but they do rear their ugly little heads now and again. Most recently I’ve been involved with the people working on deploying the new Terena Certificate Service (TCS). The TCS is a follow-up of SCS – a pan-European flat-rate certificate service negotiated by Terena. The second round of procurement got us a sweet deal with Comodo which includes unlimited flat-rate user, code and server certificates (!)

Reading Andreas' excellent post on adding support for OAuth to simpleSAMLphp, and talking to Thomas Zangerl at NDGF, who is helping Henrik Austad of UNINETT work on the confusa CA server we'll use for the email/GRID certificate part of TCS, we realized that OAuth could also be used in conjunction with Java WebStart to provide secure key generation and enrollment for confusa. Here is a rough outline:

The CA web interface is a federated application – in our case using the SAML 2 Web Browser SSO profile as implemented in simpleSAMLphp. Today confusa allows the user to log in via one of the trusted IdPs and then upload a PKCS#10 certification request in a form. This CSR is combined with attributes provided by the IdP to provision the certificate.

This works but doesn’t provide a very nice user experience. Instead we could launch a Java WebStart application or applet which does key generation on the client and submits the CSR to the CA server. This approach has been implemented by others. The problem is how to authenticate the CSR and tie it to the authenticated user attributes. A session identifier could be used but would typically need lots of tweaking to be sufficiently time-limited and secure.
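As a rough illustration of the client-side step, here is a sketch in Python using the cryptography package rather than the actual Java WebStart code; the subject name is a placeholder, and in the real flow the certificate contents would come from the IdP-provided attributes, not from anything the client asserts:

    from cryptography import x509
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import rsa
    from cryptography.x509.oid import NameOID

    # Generate the keypair locally so the private key never leaves the client.
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # Build a PKCS#10 certification request; the CA would fill in the real subject
    # from the IdP-provided attributes rather than trusting what the client asserts.
    csr = (
        x509.CertificateSigningRequestBuilder()
        .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, u"placeholder user")]))
        .sign(key, hashes.SHA256())
    )

    csr_pem = csr.public_bytes(serialization.Encoding.PEM)
    key_pem = key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.PKCS8,
        serialization.NoEncryption(),
    )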

If we try to apply OAuth to this situation, viewing the established session as a protected resource that the user grants access to for the purpose of binding a public key to it, we get the following translation of OAuth concepts:

  • Consumer: The Java WebStart application
  • Service Provider: The CA application (confusa in our case)
  • User: The user requesting a certificate.
  • Protected Resource: The established session at the web application containing the user attributes.

Since the user has already logged in and authorized the request, the JWS application is provisioned with a consumer key and a pre-authorized request token as part of the launch JNLP file. At this point the JWS application can obtain an access token and use it to associate the CSR with the established session using a PUT request.
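A hedged sketch of those last two steps, in Python with requests_oauthlib standing in for the Java WebStart client. All endpoints, keys and tokens are placeholders for values that would be provisioned via the JNLP file, and the verifier is an assumption made to satisfy the library in this pre-authorized corner case:

    from requests_oauthlib import OAuth1Session

    # Values provisioned via the launch JNLP file (all placeholders).
    CONSUMER_KEY = "jws-client"
    CONSUMER_SECRET = "..."
    REQUEST_TOKEN = "pre-authorized-request-token"
    REQUEST_TOKEN_SECRET = "..."
    VERIFIER = "..."  # assumption: the verifier is also handed out at launch

    csr_pem = open("request.csr", "rb").read()  # the PKCS#10 request generated on the client

    session = OAuth1Session(
        CONSUMER_KEY,
        client_secret=CONSUMER_SECRET,
        resource_owner_key=REQUEST_TOKEN,
        resource_owner_secret=REQUEST_TOKEN_SECRET,
        verifier=VERIFIER,
    )

    # Exchange the pre-authorized request token for an access token ...
    session.fetch_access_token("https://ca.example.org/oauth/access_token")  # hypothetical endpoint

    # ... then bind the CSR to the user's established session at the CA with a signed PUT.
    resp = session.put(
        "https://ca.example.org/enroll/csr",  # hypothetical resource
        data=csr_pem,
        headers={"Content-Type": "application/pkcs10"},
    )
    resp.raise_for_status()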

I’ll be the first to admit that this is a corner-case – the request-token is authorized before any OAuth protocol flows are initiated but nevertheless it shows that OAuth is a very nice idea adaptable to many situations.

We will look deeper into the security implications of this, and the process is expected to get a lot of scrutiny from the GridPMA when we submit the TCS Grid CPS to the EuroGRID PMA for review, so the jury is still out on whether this gets deployed or not! Stay tuned.
