Attack is the best defense

Computer Sweden reported yesterday that three government agencies are questioning the security of the new e-ID system and have asked MSB to conduct a review. Independent reviews are a very good thing and should happen more often when Swedish government agencies adopt new technology – then we would probably have been spared a few catastrophic mistakes in recent years, including the current BankID-driven e-identification system in Sweden.

For those of you who can't be bothered to read the rest, here is a summary: the move by Försäkringskassan, CSN and Arbetsförmedlingen is a job commissioned by BankID, which stands to lose its monopoly with the introduction of a new, more open e-ID system. The review MSB really ought to perform is of the Mobile BankID system, which is currently being rolled out at scale in Sweden without any independent, serious security analysis ever having been carried out.

Federation grandpa grumbles

The new Swedish e-ID system is in many respects a copy of SWAMID, the identity federation of Sweden's higher-education institutions. As many of you probably know, SWAMID is one of the things I have worked on the most in recent years, and I am very proud of how well SWAMID works and of the fact that the solutions my colleagues and I developed have already been copied more than once in Sweden alone [1, 2]. SWAMID is part of a large ecosystem of identity federations which, within the research sector alone, covers tens of millions of users [1]. Outside that sector, the same technology is used to build the same kind of solution as the e-ID system in, among other places, the United Kingdom, Finland, Denmark, Austria and the US.

Federations are the direct opposite of centralization and monopoly, and a technical guarantee of freedom of choice. These are core values most people in Sweden should be able to endorse. Including Försäkringskassan, CSN and Arbetsförmedlingen.

But in Sweden we have BankID, which has invented its own thing instead … obviously much better.

DDoS

There is quite a bit of other nonsense in the criticism, for instance the claim that the new technology is not “mobile-friendly”. Others will no doubt respond to that and to much else that is merely “moderately dumb”, but I choose to highlight one of the central points in the criticism of the new e-ID system, a point that is dumb at a particularly flagrant level:

The new federation technology is supposedly extra vulnerable to so-called DDoS attacks! The new system is a distributed system, built on competition, which lets several actors share the responsibility of identifying citizens in Sweden. Today, almost a single actor provides this critical public function: the Bank. The current BankID solution is not only a commercial monopoly; the technology is also highly centralized and depends on critical functions at Bankgirocentralen (BGC) in order to work at all.

A centralized solution is obviously much better than a distributed one… Or was it the other way around?

More competition, please

We should have more independent review of technology in Sweden. I sincerely hope MSB puts in the resources required to look at the entire identity ecosystem in Sweden. If they do, I am convinced they will conclude that what is missing is more competition.

Today we have no competition whatsoever. Arguments against competition in this area are sometimes raised (guess by whom): with more actors, citizens get yet another thing to choose, on top of electricity supplier, daycare, school and so on. That argument deserves to be taken seriously, but the current situation is also extremely dangerous from a vulnerability perspective: a serious problem at BGC or BankID today knocks out almost every citizen's access to critical public services.

So this is the big pink elephant in the room: there is still only one organization doing identification of citizens in Sweden: the Bank.

There are several other verticals that could take up the competition with the Bank for identifying citizens in Sweden, for example …

  • Grocery retail – we have fairly few actors, and several of them already run some form of banking operation
  • Svenska Spel – already a technology-intensive business with a focus on secure identification
  • Mobile operators – Sweden is one of the most mobile-dense countries in the world, and things are already starting to move here
  • Advocacy organizations – why not DFRI, for example? I cannot think of a better guarantor of individual rights and freedoms.

It was recently announced that the next generation of federation technology, OpenID Connect, has now been published. OpenID Connect is sometimes called “SAML3” because it is so close to SAML2, the technology we use in the Swedish e-ID system. OpenID Connect is easy to connect to SAML and is already integrated into all the mobile platforms. It is also the technology GSMA has chosen for its Mobile Connect initiative. A collaboration with the operator industry could break up the BankID monopoly in Sweden.
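Concretely, the mobile-friendliness of OpenID Connect comes from how simple it is on the wire: a login starts with an ordinary HTTPS request to the provider's authorization endpoint. A minimal sketch in Python – the endpoint and client values here are made up purely for illustration:

```python
from urllib.parse import urlencode

# Hypothetical provider endpoint and client values, for illustration only.
AUTHZ_ENDPOINT = "https://op.example.org/authorize"

def authn_request_url(client_id: str, redirect_uri: str, state: str, nonce: str) -> str:
    """Build a basic OpenID Connect authorization-code request URL."""
    params = {
        "response_type": "code",   # authorization code flow
        "scope": "openid",         # the scope that makes this an OIDC request
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "state": state,            # protects the client against CSRF
        "nonce": nonce,            # binds the resulting ID token to this request
    }
    return AUTHZ_ENDPOINT + "?" + urlencode(params)

url = authn_request_url("client-123", "https://sp.example.org/callback", "abc", "xyz")
```

Any platform that can open a URL in a browser can start this flow, which is a large part of why it maps so well onto mobile.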

Three cheers for E-legitimationsnämnden!

I will end my rant by tipping my hat to E-legitimationsnämnden (the Swedish e-identification board). They often have to put up with quite a lot from both proponents and opponents of a fresh start for e-identification in Sweden, but they do a very good job under very difficult conditions.

It is time for Swedish government agencies to step away from the boards and get into the game.


Filed under Identity

It's umbrellas all the way down

The NREN world is changing. It used to be that you could get away with running a network and a decent ftp-server and that would be good enough.

Not so much anymore. NRENs are turning into service portfolios with a network. For some the transition is relatively painless and quick, for others less so.

Travelling at different speeds causes tensile stress between NRENs. There is a natural tendency from the governance layer to try to address this top-down. Often by adding more governance.

Recently both eduGAIN and eduroam have been the focus of this special form of Loving Care.

Beyond creating work for process consultants, adding governance layers seldom adds real value.

Let me explain…

A federation exchange point – like eduGAIN or Kalmar2 – is a lot like an Internet eXchange point (aka an IX). An IX that operates on a free market is mostly controlled by two forces: the value of the connected services pulls customers to the IX, and the cost of getting and staying connected pushes customers away from it.

Any clueful IX – like Netnod, LINX or AMS-IX – is operated by an entity that understands that the value of the IX lies in the connected customers. Some even go as far as to call their customers members and allow them significant control over policy and direction.

In other words: governance follows the money!

The current governance model for eduGAIN and eduroam is based on the same idea: funding members control the service and this has to date implied a central role for GEANT. Some see this as a problem as eduGAIN and eduroam expand beyond the EU.

Fair enough!

Adding governance to address this largely imaginary problem is hardly the answer though. Adding layers just adds wasteful complexity and ignores the following two facts:

  1. Building a Federation eXchange point no longer involves black magic. The knowledge is widely available. There are at least 10 groups in the world today that could build another eduGAIN.
  2. Unlike an IX, a Federation-IX doesn’t depend on geography, economy or scale: almost anyone can build one. If you can attract services you win. Kalmar2 is proof that this is practical, cheap and easy.

To those who are hesitant about the governance of eduGAIN I say this: join up and demand representation. Your voice will be heard, or I and a lot of others will help you build a new eduGAIN down the road from the old one.

We’ve done it once, we can do it again. It will be done in a New York minute.


Filed under Identity

The bitter taste of good intentions

In a recent blog post, Eran explains why he withdrew from the OAUTH WG. Having observed the workings of that particular WG since its inception, I thought I'd provide some perspective.

To put it briefly: Eran is in part right and completely, totally off base.

Let me first say that I admire Eran for sticking with it for so long. Being a document editor for something that needs 30 versions to get “done” is not easy.

Eran is completely right in saying that OAUTH 2.0 has grown into a much larger beast than 1.0 and that there are now ways in which you can put 2.0 together that will be unsafe, non-interoperable and probably fattening too. Eran is also right in thinking that the WG has taken way too much time to reach this point.

However, Eran is missing an important reason why things developed the way they did. Eran touches on this when he talks about enterprise vs web.

In fact where Eran talks about enterprise it should really say “Microsoft”.

Early on, and for several meetings, the WG was totally devoid of traditional software vendors. It did (to some extent) attract the big web companies with a stated interest in OAUTH – Facebook, Google and Yahoo – along with a few of the mobile operators. The mobile operators stayed on and have made important contributions, but the web companies were a completely different story.

Personally I was surprised at the level of “ego-waving” going on at some of the early meetings and when WRAP appeared. I especially recall one WG meeting where a representative from one large stakeholder disrupted a session by walking out in the middle of an active round-table discussion stating boredom as a reason.

In its formative months, when a WG depends on committed and active participation from invested vendors and operators, the OAUTH WG had too little of this and too much casting about.

When MSFT turned up (and people who know me know that I seldom sing their praise) their presence stabilized the WG and it started to make progress but important time had been lost.

Is OAUTH 2.0 a failure?

Time will tell. I do not think the fact that FB is still operating on version 20 (or something) is a measure of the success or failure of the protocol. Having implemented OAUTH 2.0 myself, I don't agree with Eran that 2.0 is more complicated than 1.0 – quite the contrary. I do agree with Eran that an important piece of OAUTH 2.0 was lost by making signatures an optional part of the spec. Ironically, the proponents of that change cited more or less the same reasons that the opponents of “WS-*” cite: simplicity.

If there is a lesson to be had, perhaps it is this: make it as simple as possible but no simpler. Unfortunately, many standards development organizations (SDOs) routinely fail to remember this.

The challenge going forward is how we measure interoperability for something like OAUTH, where there are no reference implementations and few traditional software vendors (and those that exist add lots of secret sauce to the mix).

Will OAUTH 2.0 move beyond single-vendor ecosystems where if you want to talk to Facebook you’d better use the Facebook reference code if you expect anything to work?

I sure hope so.


Filed under Identity, Internet

#rlbob

https://spaces.internet2.edu/display/rlbob/Home

Next I’ll pick up the shovel and keep digging.


Filed under Identity, Internet, Uncategorized

pyFF – another metadata aggregator

In the world of large scale identity federations the problem-du-jour is how federation operators can connect their federations and share services.

The eduGAIN program, led by my good friends Valter Nordh and Brook Schofield, is a concrete instantiation of interfederation, and it is starting to reveal operational issues in a number of national R&E federations – specifically in how SAML metadata is managed and made available to connected relying parties and identity providers.

A couple of years ago Ian Young wrote a blog post on an operational model for metadata, and Andreas Solberg started work on a basic metadata aggregation profile, in part based on those ideas. At the recent TF-EMC2 OpenSpace in Zurich, Brook ran a session on this topic. These efforts will need to converge in the near future to produce a Standard Model for Interfederation.

In order to support such a model the world needs working code.

Ian and the Shibboleth team have been working on MA1 for a while. I've had code in this space too – for instance my saml-md-aggregator.

Recently (last Monday) the SWAMID operations team and I realized we needed to modernize the way we manage and publish our metadata, so I took the opportunity to roll up my sleeves and write some code.

The result is pyFF – Federation Feeder.

pyFF is based on a simple execution model: metadata goes in one end and comes out the other, and in between, processing happens in a pipeline of basic operations described by a simple DSL (domain-specific language) using YAML syntax. Right now the code is in rapid development and I expect it to be in production for SWAMID very soon.
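To make the execution model concrete, here is a toy sketch in Python of the pipeline idea. This illustrates the concept only – it is not pyFF's actual API, and the operation names and file names are made up:

```python
# Toy sketch of a metadata pipeline: state (the metadata) flows through a
# sequence of named operations, each taking the current state plus an
# argument and returning the new state.

def run(pipeline, ops, state=None):
    """Thread the metadata state through each (name, argument) step in order."""
    for name, arg in pipeline:
        state = ops[name](state, arg)
    return state

# Stand-in operations; a real aggregator would load, filter, sign and publish XML.
ops = {
    "load": lambda state, arg: {"source": arg, "entities": ["idp1", "sp1"]},
    "select": lambda state, arg: state,
    "sign": lambda state, arg: {**state, "signed_with": arg["key"]},
}

pipeline = [
    ("load", "swamid-metadata.xml"),   # hypothetical input file
    ("select", None),
    ("sign", {"key": "sign.key"}),     # hypothetical signing key
]
result = run(pipeline, ops)
```

The point of the design is that each step is small and composable, so an operator can describe an entire publication flow declaratively rather than writing code.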

Check it out and send me comments: leifj at sunet.se


Filed under Uncategorized

Why it is (sometimes) ok to shoot yourself in the foot

I got this link on a list earlier today: Facebook (2 step authentication) fail !

I totally disagree with almost all the assumptions and conclusions of that post. The only bit I can sort of agree with is that maybe, just maybe, it is not a good idea to allow users to opt out of security without proving their identity at a higher level of assurance – but I can also totally grok why FB is doing it this way. The reason is spelled “support costs”.

The fundamental mistake of the post is this: The author assumes that strong(er) authentication (eg 2-factor) should be at the discretion of the site owner.

As the content owner (my Facebook page, my crap) in this case, I carry most of the risk associated with protecting my data. It is therefore totally fine to let me bypass security if I want to – up to a point.

At some point FB assumes some basic level of risk and responsibility which is why they won’t let me create an account without a password.

If this were a bank the border between personal risk and site-owner risk would shift – in part because the law mandates a higher level of responsibility on the part of the bank than in the case of FB.

Higher level-of-assurance/protection is successfully introduced for one of two reasons:

  • the user values his/her data (cf. Blizzard tokens)
  • “the man” (e.g. the government) tells you how it must be

Luckily FB isn’t “the man” – at least not yet – and isn’t in a position to force users into valuing their data above a level that is minimally accepted by most users.

This is the reason strong authentication almost always fails when faced with reality: most of us security nerds don't share the gut reaction of most “normal” users with respect to data value, and we are therefore willing to accept a higher degree of hassle when it comes to protecting that data.

This brings me back to the fundamental point: the cost of introducing strong authentication is not in tokens, provisioning or identity proofing. Most of the cost is in support. The simple truth is that most ways we have devised to improve the security of the authentication step in any protocol suck from a UI perspective. Fundamentally, all such measures (be they SMS codes, OTP tokens or so-called “smart” cards) introduce extra steps in the login process. This means they are seen by the user as an obstacle to overcome before they can get at whatever content they were going for.
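To make the “extra step” concrete: the OTP tokens mentioned above typically compute a short-lived code along the lines of RFC 6238 (TOTP). A minimal sketch, using the RFC's published test secret rather than anything from a real deployment:

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, for_time: int, step: int = 30, digits: int = 6) -> str:
    """Compute an RFC 6238 TOTP code (HMAC-SHA1 variant)."""
    counter = for_time // step                    # number of time steps since epoch
    msg = struct.pack(">Q", counter)              # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# RFC 6238 Appendix B test secret; a real token has its own provisioned secret.
code = totp(b"12345678901234567890", 59, digits=8)
```

The crypto is trivial; the user-visible cost is the extra step of reading a code off a device and typing it in – which is exactly where the support burden comes from.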

Incidentally, this is related to click-through terms-of-use dialogs, but that is another story and another blog post.

It is worth noting (as I usually try to do when this topic comes up in conversation) that some of the most successful deployments of 2-factor tokens are in the gaming industry, and I firmly believe that in these cases the user values their data enough to accept the additional obstacles imposed by stronger authentication.

I also firmly believe that anyone who can design a truly user-friendly strong authentication mechanism would get rich pretty fast and would do a great service to the Internet.


Filed under Uncategorized

Why you should care about the CABforum

The CA/Browser Forum (aka CABforum) announced a couple of days ago that it would form a WG on “organizational reform”.

Why is this important I hear you say?

The CABforum has quite a lot of power: this group makes decisions that affect which CAs are chosen for inclusion in default browser trust stores. Currently the group comprises browser and CA vendors. Notably absent are any relying parties.

Here is how to participate (quoted from the cabforum.org announcement):

In support of this process, the special working group is soliciting short (no more than 750 words, please) position papers and statements of interest from organizations and individuals on these topics. We encourage stakeholders to submit their comments to questions@cabforum.org now through March 30, 2012. All submissions will be posted publicly on the CA/Browser Forum website. (www.cabforum.org)


Filed under Trust

convergence & federations?

Convergence is one of several proposed solutions to the problem of lying and poorly managed CAs; DANE is of course another. I like fighting on multiple fronts, so when rlbob sent me an inspirational email today after listening to Moxie talk about Convergence at #RSAC, I just could not resist it.

To make a long story short, I went and set up a Convergence notary. If you feel like trusting it, feel free to visit https://etc.mnt.se/mnt.notary – but make sure you visit convergence.io and install their Firefox plugin first.

Here then is the rlbob challenge:

The Chrome guy says they can’t use convergence because the traffic load would be too high for anyone but them to support, and they can’t be the ones to validate pubkeys for their own browser. In steps a worldwide network of registrars run by R&HE using our spare computing power and bandwidth. Let’s do it!

Let's see what happens next!


Filed under Trust

Not posting enough

Clearly the blog has been, if not dead, then asleep for quite some time. I have no idea if people are even reading this, but I'll start posting again presently. My lack of updates has not been due to lack of activity!


Filed under Uncategorized

Gaps to Map

Right before the IETF in Anaheim I’m off to the ISOC Identity event: Mapping the Gaps in DC. This post is a set of possible discussion points for that event. The event will focus on the gaps between the technological and policy/legal view of the identity metasystem.

Standardized Federation Policy and Practice Statements

Building an identity federation involves establishing policy documents and practice statements analogous to the CP and CPS of a PKI. In the world of public-key infrastructure there are templates to start from – RFC 3647, ETSI TS 102 042, ANSI X9.79, etc. In the world of federations there is no such help. We need those templates, and we need them to be simpler (if possible) than their PKI cousins.

Simplified/Standardized Federation Contracts

Joining a federation (as an SP or IdP) often involves signing some form of contract. For an SP joining multiple federations the fact that no two contracts look alike soon becomes a problem. There are at least two ways around this:

  • Make the contracts easily comparable – i.e. standardize!
  • Do away with the contracts altogether.

In many situations having a contract will probably be inevitable, but in certain cases it might be perfectly reasonable not to have a contractual relationship between (say) an SP and a federation. I've blogged about this, and there has been some work in this area.

Separate technical trust from federation metadata

Technical trust for identity federation is often (at least in many R&E federations) represented as signatures on SAML metadata documents which contain keys for the member entities. This works (often better than using a traditional PKI) but it does tie technical trust management in with a particular identity technology. We need a way to represent technical trust which is easier to setup and maintain than PKI and which can be applied to all identity technologies in use today.
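The appeal of the signed-aggregate model can be shown with a toy sketch (no real XML or crypto here; the entity IDs and keys are made up): one signature verification covers the whole aggregate, after which every per-entity trust decision is a simple lookup rather than a chain validation.

```python
# Toy illustration of trust-via-signed-metadata: the federation signs the
# aggregate once; consumers verify that one signature and then treat the
# embedded per-entity keys as trusted.

def load_aggregate(entities: dict, signature_ok: bool) -> dict:
    """Accept a metadata aggregate only if the federation signature verified."""
    if not signature_ok:
        raise ValueError("refusing metadata with a bad federation signature")
    return dict(entities)  # entity_id -> public key material

def key_for(trusted: dict, entity_id: str):
    """A trust decision: do we hold a key for this entity in verified metadata?"""
    return trusted.get(entity_id)

trusted = load_aggregate({"https://idp.example.org/idp": "IDP-KEY"}, signature_ok=True)
```

Nothing in the lookup step is SAML-specific – which is exactly the argument for separating this trust layer from any one identity technology.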


Filed under Uncategorized