Freedom Can Be Designed by Rejo Zenger
(Rejo Zenger) Thank you.
Thank you, and thanks for having me.
And especially in this beautiful venue.
I'm Rejo Zenger, and I work for Bits of Freedom, which is the leading digital civil rights organization in the Netherlands.
We fight for human rights within the context of digital communications.
We are best known, I think, for our work on net neutrality. The Netherlands was the first country in Europe to have net neutrality enshrined in law.
Unfortunately, those protections seem to be being weakened in Brussels right now.
And you may also have heard from us because, in the Netherlands, we have a data retention law which has been invalidated.
But the government wants a new one.
And, also, the government wants to expand the surveillance powers of the intelligence services a lot.
We also organize the Big Brother Awards.
And the next ones are coming up in October, at the end of October.
And it's the awards ceremony where we put the spotlight on the worst privacy violators.
The previous Minister of Justice has been a four-time winner.
Thank God the previous minister is now a previous minister, so this time it will probably be a bit more of a surprise who the winner will be.
Last year, we had Edward Snowden as a special guest.
I'll get back to him later in this talk.
And this year's special guest is Max Schrems, and I also will get back to him.
So in this talk, I would like to talk about something other than what you've heard over the last one and a half days.
I presume most of the talks in the last one and a half days were about very technical topics.
This one is about you, and the power you have.
And I want to show you that you can be instrumental in getting freedom on the internet, or even freedom in society.
And although I'm a privacy advocate now, we do share some background.
Because, in the past, I was educated as an industrial product designer.
That education taught me a lot, but not much of it was really profound.
It taught me how to calculate the maximum load a chair can take before it breaks.
It taught me how to explain uncomplicated things to people who have trouble understanding even uncomplicated things.
And maybe the most beautiful thing I've learned during my education is the color theory of Johannes Itten.
I think most of you, or I presume most of you know this work as well.
But the most profound lesson I learned, I learned during my last internship, just months before finishing my education.
I was working at Demotech, which is an NGO whose mission is to design for self-reliance.
The best known product is a water pump.
Not this one, of course, because if this one breaks, the user can't fix it.
They need to call in the supplier of the pump, who has to come over to fix it.
So if you're in a rural area in a less developed country, that is, of course, a problem.
The water pump, this one, doesn't explain to the user how it works.
It's not built from materials which can be easily replaced.
And even if it were, people often don't have the tools to actually repair the pump.
There are lots of other issues.
And, basically, this water pump is not very transparent to the user.
However, these were the pumps that were delivered by default, say, back in the '70s.
Demotech's water pump was a different one.
It looked like this.
And here you see lots of differences.
The design is open.
It's built from materials which are available to locals at all times.
So there's a discarded car tire.
The rubber pieces on the rope are from broken flip-flops.
Also, the design is very open, so it explains how it works.
You can easily get to all the details, you can repair it easily.
And one of the most beautiful things to illustrate this, I thought, is what happened when Demotech went to one of those villages to build such a pump.
When they were finished explaining how to build it, they moved on to the next village.
But the locals from that first village had already copied the design, because it's so easy, so self-explanatory.
And that's where I learned that, as a designer, you can have a huge impact on the freedom of people.
With your design, you can change the world around you.
That's not something new, by the way.
Back in the day, Playboy magazine didn't just have images, it also had long interviews with interesting people.
In the '60s and '70s, those interviews were spread over many, many pages.
One of the people who were interviewed was Marshall McLuhan.
He was a Canadian philosopher of communication theory.
He had many interesting views, I think, on media and on technology, which are still valid today.
And I'll mention three of those.
First of all, he said that technology is the extension of man.
What he meant by this is: think of a knife, which is an extension of the butcher.
And think of a gas pedal in a car, which is an extension of the motorist.
Of course, sometimes, I think it's the other way around, and technology is not an extension but a diminishment of man.
But Marshall McLuhan liked to simplify stuff.
He also said the medium is the message.
By this he meant that the medium itself already affects the society in which it is placed.
It's not only content which is delivered over the medium, but it's the medium itself, as well.
So, think of a light bulb that creates an environment in an otherwise dark room.
So the light bulb creates an environment just by its mere presence.
Another example would be a television.
If you ask people how television changes their lives then they will probably tell you, well, I saw these horrible images from Syria.
And that has affected me, and it's changed me and what I'm doing.
But I think Marshall McLuhan was right when he pointed out that, if you have a television, you can watch it during your dinner in the evening; you can watch those images with a plate of pasta or a pizza slice in front of you.
That is the case.
That is possible.
That does change a lot, as well.
So, the medium is the message, according to Marshall McLuhan.
And the third one is figure and ground.
So, when thinking about new technologies, we are mostly focusing on the positive and short-term effects, while we tend to ignore the long-term and, sometimes, more negative effects.
Think of a car.
When introduced, it was the promise of freedom.
Everyone would be mobile.
There were no distances too big to cover.
But now we know better, because if you drive to work again on Monday, there's a big chance you will be stuck in a traffic jam.
And that, I think, is because once we have a car, we tend to put more distance between where we live and where we work.
So these more negative effects of the car weren't thought of when the car was first created.
More recently, other thinkers have had the same, or more or less, the same thoughts.
So, this is Lawrence Lessig.
He is an American academic.
He's an attorney and a political activist.
And, actually, he is running as a candidate for the Democratic nomination in next year's presidential election.
I don't think he will make it, but it would be cool to have him as president.
So, he wrote this book in 1999, titled "Code and Other Laws of Cyberspace".
And it's an influential book, because it says a lot about the structure and the nature of the regulation on the internet.
And the primary idea of the book, as expressed in the title, is the notion that computer code, or West Coast code as he called it, the Silicon Valley code, regulates conduct in much the same way as legal code, the East Coast code from Washington, DC, does.
More in general, he says in his book there are four major regulators.
You have laws, norms, market, and architecture.
And to give you an example, think of smoking.
You are not allowed to smoke in certain places.
You are not allowed to buy cigarettes if you are underage, at least in the Netherlands.
That would be law as a regulator.
Norm as a regulator forbids you to blow out the smoke in someone's face.
That is not acceptable.
The market can regulate by differentiating in price.
The higher the price, the harder it is to get those cigarettes.
And architecture is how the cigarette itself is built.
The nicotine in a cigarette makes it addictive, and that, too, is a constraint on smoking.
But let me focus on the last one.
This is, of course, Paris.
And some of the power of the French Revolution derived from the architecture of Paris.
Back then you had, mostly, very small and winding streets, which could easily be barricaded, making it possible for revolutionaries to take control of the city with very little power.
Napoleon III understood this problem.
And, so, in 1853, he resolved this issue by tearing down many parts of the city and reorganizing it around very broad roads.
All those wide boulevards made it very difficult for insurgents to take control of the city.
He also, by the way, while doing this rebuilding, laid down miles of pipes for gas, so you could have light in the city.
Think of the reference to McLuhan earlier on.
Another example are speed bumps.
If you are building a parking garage or street where kids are playing a lot, then the designers of such a place, they sometimes lay down these speed bumps in order to force drivers to slow down.
So, these structures have the same purpose as a speed limit, legal code, or a norm against driving too fast, but they operate by modifying the architecture.
One more example: most of you have probably waited for your luggage at an airport before.
That often takes very long, and people get anxious about it.
They get angry about it.
They start to complain.
So, some of the airports have solved this problem for them very easily.
They just made the distance between the gate where the plane arrives and the baggage belt a little bigger, because that makes people walk a lot longer and wait a little less.
And then they get fewer complaints about it.
That's solving something with architecture as well.
It works the other way around, as well, I think.
If you have a 10-lane highway with no traffic, then it's very difficult for motorists to drive slowly.
In that case, I think the biggest effect of a legal regulator saying you cannot drive faster than 30 miles per hour
is that the road becomes a cash cow for the government.
So why is all of this relevant to freedom? Why am I explaining all of this to you? Well, maybe you have been outside in the last one and a half days.
And then you've noticed you're now in Amsterdam.
There are lots of canals, which are very beautiful to look at.
And it's especially beautiful, I think, to be explored by bike.
However, due to the layout of Amsterdam, it's very difficult to navigate.
And that's where, for me anyway, Google very often comes in.
Google Maps, in particular.
So when you try to go by bike from the Westerpark, which is at the top,
to the Oosterpark, Google routes you around the canals for one reason or another.
I don't know why that is.
I really don't have a clue.
Maybe it's a hiccup in the algorithm.
Maybe it has been a very deliberate choice, because routing the user through the canals would give you an enormously long list of directions.
Maybe that is not very useful.
Or maybe it's just because of complaints from the people living along those canals, because they get a lot of tourists and a lot of traffic anyway.
So, I don't know what it is, but it's definitely not transparent.
One other example.
All of you probably know this is Michael Brown, a young black man who was shot by a white police officer.
It's a horrific story, and, unfortunately, one of many.
This photo was taken by one of the people living right there; he made it from his garden, I think, actually.
So he put it up on Twitter.
And, I think, because of the availability of this photo, the topic became trending very, very quickly.
However, it took 24 hours before there was a considerable number of messages about the same story in news feeds on Facebook.
And I don't know why it is, but it's probably because of an algorithm, or maybe something else.
I just simply don't know.
In any case, the impact is huge.
Because if you don't have Twitter, but you look on Facebook only, you have a completely different reality.
Would you have heard about the protests of the black community following the death of Michael Brown? Would you have seen the images of people standing up with posters saying, hands up, don't shoot? And, if so, would you have heard about the response of the government to those protests? Would you have seen the images of heavily armed, militarized police forces which are definitely not signaling any friendliness? Maybe you would not have, and I think that would be a problem.
Something closer to home.
This man doesn't need any introduction, I hope.
He told us a lot about the NSA, and how the NSA works, and how the GCHQ, the UK intelligence services, how they work.
In a way, I think, the way they work, their methods are very impressive.
Because there simply seems to be no government institution that does its work as thoroughly as the NSA or the GCHQ.
Their motto is: collect it all, process it all, exploit it all.
And what struck me most about all of those revelations of Edward Snowden, is that in the past, we had been thinking about all of those attacks individually.
So we recognized it would be possible to break into the data center of Google and wiretap there.
We thought: if communications go through a satellite, those are easy to wiretap.
We thought: maybe the NSA is weakening the encryption standards.
All of it passed our minds.
But we never really thought that the NSA would do all of it at once.
Let's look at one effort more closely.
This is an NSA slide explaining the weaknesses in Google's infrastructure.
So, on the left-hand side you have the public internet.
So the part of Google which you see on a daily basis.
You can login on a Google service, and you do so by an encrypted connection.
On the other side, on the right-hand side, you see the Google Cloud.
So the infrastructure of Google itself.
And the encryption is stripped off in between.
So all the data within Google's infrastructure was being sent unencrypted.
And that is something the NSA noticed, and so it started wiretapping the cables, the fibers, between Google's data centers, without Google knowing it.
So these kind of slides, I think, serve as a lesson for you and what your companies can do.
And I'll get back to that in a second.
I don't think that lesson has been learned by Barack Obama.
I think that Obama's response has been very, very weak following up on those revelations of Edward Snowden.
I don't think a lot has changed in the way the NSA is working now, more than two years after the first revelations of Edward Snowden.
No considerable reform has taken place.
But when speaking about those revelations, he did say some wise things, I think.
One of the things he said is, "There are fewer and fewer technical constraints on what we can do.
That places a special obligation on us to ask tough questions about what we should do."
I don't think he has learned that lesson, but I think it's something worth remembering.
So that begs the question, what can you do? Well, I'm focusing on very practical examples.
I want to show you a couple of things you can do, which you can take away and you can start on working.
Well, maybe not tomorrow because now there's a party, but then the day after.
First of all, a very simple one.
Don't ask for details you don't need.
You bear the responsibility for all the data you get from a customer.
And the easiest way out, I think, is simply not to have that data if you really don't need it.
What you don't have, you can't be forced to hand over to another party, like a government or some other company.
What you don't have, you can't lose control over.
It can't be hacked, and it can't be taken away.
One example is Spotify; I'm not sure whether you can see it.
This is the Spotify settings page, where I can edit my profile.
Some of the things it asks for are my gender, my postal code, and some other details.
When I created a Spotify account, some of these details were required.
So I couldn't use Spotify without entering those details.
Although Spotify doesn't need them, and I know that for sure,
because I entered fake details, and it still works.
So that begs the question: why does Spotify ask for these details? Maybe it's because of their business plan.
Maybe it's something else.
I just simply don't know.
But I think you have responsibility when you are designing a service like this, and you should not ask for more details than you really, really need.
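To make the data-minimization idea concrete, here is a minimal sketch in Python. The field names and the `minimize_signup` helper are hypothetical, not any real service's API; the point is only that fields you never store can never leak.

```python
# Sketch of data minimization at signup: keep only an allow-list of
# fields the service genuinely needs, and silently drop the rest.
# All field names here are hypothetical.

REQUIRED_FIELDS = {"email"}                   # needed to deliver the service
ACCEPTED_FIELDS = {"email", "display_name"}   # optional but harmless

def minimize_signup(form_data):
    """Keep only fields we actually need; reject if required ones are missing."""
    missing = REQUIRED_FIELDS - form_data.keys()
    if missing:
        raise ValueError(f"missing required fields: {sorted(missing)}")
    # Gender, postal code, birthday, etc. are discarded before storage:
    return {k: v for k, v in form_data.items() if k in ACCEPTED_FIELDS}

profile = minimize_signup({
    "email": "user@example.org",
    "gender": "x",            # not needed -> never stored
    "postal_code": "1234AB",  # not needed -> never stored
})
print(profile)  # {'email': 'user@example.org'}
```

What you filter out at this point can't be handed over, hacked, or lost later.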
Some other thing is decentralize whenever possible.
The problem with Facebook, of course, is that you have a social network with all the data in one single, central place.
If the government, the US government, the Dutch government, wants a copy of a profile, they need to go to only this one single place, and they can get it.
If, however, Facebook were decentralized, it would have become a lot more difficult.
If they wanted my profile, maybe they would have to ask me, instead of asking Facebook.
That's a lot more transparent to the user.
One other example is this one.
This one, it's from Eneco, which is a Dutch energy supplier.
This is a smart energy meter.
And what they do is keep all the data it measures inside the home on this device.
So all the sensitive data is just kept on the device itself.
The energy supplier only needs some aggregated data over longer periods to be able to deliver the electricity.
They don't need all the individual measurements that are made.
So keeping this data inside the user's home is a very safe way to go.
And if it's done right, it's very powerful.
Because if they do it right, then you, as a user, you can hook up your laptop and you can get the data yourself, and you can play with it.
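The principle behind the meter can be sketched in a few lines. This is not Eneco's actual interface; the class and method names are hypothetical, and the idea is simply that fine-grained readings stay local while only a coarse aggregate ever leaves the home.

```python
# Sketch of the smart-meter idea: detailed measurements stay on the
# device in the home; only a periodic total is shared with the supplier.
# Class and method names are hypothetical.

class LocalMeter:
    def __init__(self):
        self._readings = []  # fine-grained kWh samples, never transmitted

    def record(self, kwh):
        self._readings.append(kwh)

    def report_aggregate(self):
        """The only value shared with the supplier: a period total."""
        return round(sum(self._readings), 3)

meter = LocalMeter()
for sample in [0.02, 0.05, 0.01, 0.04]:  # the usage pattern stays local
    meter.record(sample)

sent_to_supplier = meter.report_aggregate()
print(sent_to_supplier)  # 0.12
```

Because the raw samples never leave the device, the user can also hook up a laptop and analyze their own data, exactly as described above.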
One other important thing is, don't have others spy on your users.
An example would be this one.
This is Piwik, an alternative to Google Analytics.
Google Analytics' market penetration is, I believe, somewhere around 50%.
That's probably misleading, because it's most likely used above average on the larger sites.
And this means that while you get some data, Google gets lots of data.
And the NSA is known for tracking Google Analytics cookies to identify users on several sites.
And there is an alternative, which maybe doesn't give you all of the data you're used to from Google Analytics, but is definitely a good runner-up.
The same goes for Facebook with its social media buttons.
I don't know Facebook's market penetration, but it's probably pretty high.
The regular buttons probably get you a few more likes, but they violate the privacy of all of your visitors.
Even the ones that deliberately don't use Facebook and don't have an account.
And there's many alternatives, I think, to solve this.
So you could just leave them out altogether.
And I've heard reports of people saying that after they took away those Like buttons, they got a higher ranking.
That's probably because when people share the page now, they do so manually, and that gets a higher ranking than people just clicking the Like button.
Another way, which is how we do it at Bits of Freedom, is to link to locally hosted images.
So Facebook will not see any calls from the user when someone visits the Bits of Freedom website.
And there's also this example.
There's readily available code which only loads the code from Facebook after the user clicks a button,
including a counter of how many likes the page already has.
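The locally hosted variant can be sketched server-side: your own server fetches the like count once in a while and caches it, so every visitor's browser only ever talks to you, never to Facebook. This is an illustrative sketch, not Bits of Freedom's actual code; the fetch function is injected so the example needs no network access.

```python
# Sketch of the "local button" idea: the site's own server periodically
# fetches the like/share count and caches it; visitors are served the
# cached value and never contact Facebook directly.

import time

class CachedLikeCount:
    def __init__(self, fetch_count, ttl_seconds=3600):
        self._fetch = fetch_count    # stand-in for a server-side API call
        self._ttl = ttl_seconds
        self._cached = None
        self._fetched_at = 0.0

    def get(self):
        # Refresh at most once per TTL; only our server makes the call.
        if self._cached is None or time.time() - self._fetched_at > self._ttl:
            self._cached = self._fetch()
            self._fetched_at = time.time()
        return self._cached

counter = CachedLikeCount(fetch_count=lambda: 42)  # hypothetical count source
print(counter.get())  # 42, served from our own server to every visitor
```

Every visitor sees the count, but only your server ever appears in Facebook's logs.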
Then, make sure to encrypt everything.
The first thing to do is encrypt your communications.
So this is the Fronteers website's SSL certificate and how it is configured.
That could be improved a little, by the way.
It makes sure that even if someone is wiretapping, they don't get all the details of the contents.
Of course, some data will still leak, like the metadata of who is visiting which website.
And maybe you've been through that process and thought it's a little too much work, configuring the server or requesting a certificate.
Help is on the way.
This is Let's Encrypt, which is a project of, amongst others, The EFF, The Electronic Frontier Foundation.
And they are creating a tool which you can install with a package manager, and with only two or three commands
you automatically get the configuration and a certificate on your server.
I think it will go live in about a month's time.
But that's only about the communication, so the user actually visiting your website.
It's also about the data you store, on your own laptop, for example.
So make sure you have encrypted your hard disk.
And if you process a web form where people can enter their details, it's obvious to use SSL to encrypt the communication, but it is not that obvious to encrypt the data on your server as well.
And I think that's important as well.
Then, another example is host your own code.
Many of us have an account on GitHub.
It's American, with servers in the US, and everything is simply brought together in one place.
Very helpful, very useful, very user friendly.
I like it a lot as well.
But there is an alternative.
You can host a GitHub-like environment on your own servers.
In that case, it's decentralized, and you own your own code.
You have control over it; you have the power over your own code.
It's not in the hands of a US company, where you can do some things but where you can't see who else is getting access to the code.
Or even better, maybe, just host everything you do yourself.
There's a tool which is becoming really useful: ownCloud.
It's a tool which is still in heavy development right now.
But you can use it for your address book, for example.
You can use it as a Dropbox replacement.
And they've started to add functionality as an alternative to Google Drive, so you can work together on the same documents in your web browser.
And I think you should be transparent to the user.
This is Max Schrems.
I mentioned him earlier already.
And this is from a couple of years ago.
And the stack of papers he's holding in his hands is a printout of his profile from Facebook.
Every one of us has the legal right to send a request to an organization which is processing your data, and ask for a copy of that data.
In the case of Facebook, they will reply with a letter from a lawyer and will make it difficult for you.
But in case of Max Schrems, they seem to have forgotten about it once, and then just sent out a printout of all his data.
And that's very revealing, because when you browse through it, you see that stuff that was deleted by Max Schrems isn't actually deleted.
So you can see here, it gets an additional flag saying, this is deleted, do not show to the user, but it's still kept on record at Facebook.
And that's not very transparent to the user.
If I click a Delete button, I would expect the message is gone.
But, in fact, it's kept by Facebook.
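The difference the printout revealed can be sketched in a few lines. The record layout and helper names here are hypothetical; the contrast is between a "soft delete" that merely flags a row and an honest delete that actually removes it.

```python
# Sketch of soft delete (what Schrems' printout revealed) versus an
# honest delete (what the Delete button leads the user to expect).
# The record layout is hypothetical.

messages = [
    {"id": 1, "text": "hello", "deleted": False},
    {"id": 2, "text": "secret", "deleted": False},
]

def soft_delete(msgs, msg_id):
    # Flagged and hidden from the user, but still stored.
    for m in msgs:
        if m["id"] == msg_id:
            m["deleted"] = True

def honest_delete(msgs, msg_id):
    # Actually gone: the record no longer exists anywhere.
    return [m for m in msgs if m["id"] != msg_id]

soft_delete(messages, 2)
assert any(m["id"] == 2 for m in messages)        # still on record

messages = honest_delete(messages, 2)
assert not any(m["id"] == 2 for m in messages)    # actually gone
```

Being transparent to the user means the second behavior, not the first.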
And the same goes for name changes.
You can change your name on Facebook; well, there's a real-name policy today, so you can only have one name at a time.
But when you change your name, I would expect the old name to be gone.
Not in the case of Facebook.
They'll just keep it on record with every name change you have.
So I think you should create your site, your product, whatever you make, in a way that is very transparent to the user.
Which is honest to the user.
And it's only usable when your user thinks it is.
And I have two examples.
One of which I really, really like.
It's this one.
PGP is being used as a way to end-to-end encrypt email.
So if I'm sending you an email message, the message is encrypted on my computer.
I can send it to you, and nobody in between can read it.
And it's decrypted only on your computer.
This is the interface, or part of the interface, that was used when PGP was developed, back in the '90s.
And it's not very attractive, not very appealing, not very usable, I think.
The same conclusion was reached by researchers who published a paper in 1999.
It's a well-known study, and it looks into the issues with the user interface of PGP.
And they found that most users make many mistakes and fail to encrypt the messages they intended to encrypt.
Sometimes they even unintentionally shared their secret key, which, of course, should be kept secret and definitely should not be shared.
So the researchers asked themselves: is it simply because of a failure to apply standard user interface design techniques to security? And their answer was no.
On the contrary, effective security requires a different usability standard, and it will not be achieved through user interface design techniques appropriate to other types of consumer software.
So that was a study from 1999.
Think of the interface we were discussing.
Now, let's go forward to today.
How does this interface look today? Well, it looks like this.
What is different? Well, it has got color, that's one thing.
And some buttons have been added at the top, and that's it.
Obviously, the confusion of users didn't decrease with this new design.
And PGP is still hard for anyone to use.
I think, well, we are using it in our office, but the team at Bits of Freedom are hardcore privacy advocates, and I can't expect other users to be the same.
So it's really weird that this didn't really evolve into something better.
And the lesson here is, I think, that there's no such thing as secure software if the software doesn't have any users.
Security only counts if users are actually using it.
And even with modern, recently developed tools, and maybe that's tunnel vision or something, it's exactly the same.
There's Signal, from Open Whisper Systems, which is meant to be a replacement for SMS or WhatsApp, or tools like that.
A messaging app.
And it is designed to be secure.
So, this tool allows you to do end-to-end encryption on your messaging.
If you look closely, you see the two buttons at the bottom of the application.
And, of course, that's really confusing for users.
Because people say: it's totally unclear what these buttons exactly do. And I had the same problem.
I couldn't tell what they were doing.
It turns out they are something like an inbox archive, or an active archive, or whatever.
So that's a very difficult user interface for people to use.
So you can't expect users to have secure communications with an interface like this.
Well, these were a number of very practical things you could start applying, I think, tomorrow already.
If you still feel you can't do anything with this, then make sure you help your local digital rights organization.
There are many.
To mention a few, Open Rights Group would be one, which is UK-based, just like Privacy International.
Access Now is a more global organization.
In the US, you have the Electronic Frontier Foundation.
And in Brussels you have EDRI.
And all of these organizations can use your help.
Because, speaking for Bits of Freedom, we try to be independent, which puts a lot of constraints on the money we can accept.
We don't take any government funding, for example, or funding from companies which play a large role in our battles.
So just donating is really very helpful.
But if that feels too easy, then helping us build campaign websites is very helpful.
We need them.
We need campaigning websites from time to time.
And very often, on a very short term.
And we don't really have the experience in building that, so you could help a lot out there.
Then finally, maybe you would ask: me? Why? Can I make a change? Well, I think you can.
And I thought about how to explain that.
Maybe you are feeling like the whitespace in your code.
You are there, but you're not really able to leverage anything.
Then let me counter that with one quote from a great thinker again.
This is Marshall, sorry, this is not Marshall McLuhan.
This is "Bucky" Buckminster Fuller.
And he was asked, more or less, the same question.
And in his answer he used the trimtab as a metaphor for personal empowerment.
And, again, in Playboy, in 1972, again, there were those beautiful interviews back then, he said, and I'll quote him here.
So he is saying, "Something hit me very hard once, thinking about what one little man can do.
Think of the Queen Mary-- the whole ship goes by and then comes the rudder.
And then there's a tiny thing at the edge of the rudder, called a trimtab.
It's a miniature rudder.
Just moving the little trimtab builds low pressure that pulls the rudder around.
It takes almost no effort at all.
So I said that the little individual can be a trimtab.
Society thinks it's going right by you, that it's left you altogether.
But if you're doing dynamic things mentally, the fact is that you can just put your foot out like that and the whole big ship of state is going to go.
So I said, call me a trimtab."
And to clarify: trimtabs are these small parts at the end of the rudder and the elevator.
They are small surfaces connected to the trailing edge of a larger control surface on a boat or an aircraft.
And with them, you can adjust the larger control surface with very little force.
And I think they could be you.
Even Bucky took it to his grave.
He has a gravestone which says exactly this.
So that's it for me.
And I will say, question everything.
(Bruce) Thank you very much, Rejo.
With your permission, I'm going to invite Sally Jenkinson up as well, because many of the questions asked of each of you are equally applicable to both.
So, if you could both come to the comfy chairs; an extra comfy chair is being brought out.
Sally, if you sit in the middle so I can pass the mic to you.
I was very taken with your visual image, Rejo, of the Napoleonic streets.
Because it's two things.
There's a good and a bad.
Yes, they were designed to prevent insurgency, but having big, wide streets in the early 19th century meant that he could put in gas, electricity, and civic amenities; you know, refuse collection and ambulances can get down there, et cetera.
And when I introduced Sally, I said, open data can help us hold our masters to account.
And Andy Walpole said, you're stupid and naive, Bruce, and actually, it's just a way for our masters to enslave us.
It seems to me that openness is good when we can see what our masters are doing.
But, too much openness from us is dangerous, because people can advertise at us, sell our data or control us.
And is there a way to square that circle? Is there a way to protect ourselves better through open data? I'll go to you, Rejo, because you've got a head mic on.
Well, that's a tough question to ask, to start with.
I'm not sure.
So, the reason I showed Paris is that it shows how architecture influences our society, and how society reacts to it.
How those two things come together.
I think it shows that we need to think through, very carefully, how we apply technology.
So, it's not just the one or it's not just the other.
I'd love to see more openness in how the government works, for example, but at the same time, I would like to have my communications private.
And even with the openness of the government, it's very difficult.
Because I think that if you have lots of open data, chances are you will be leaking personal data, as well.
There's this famous example with the license plates from taxis in New York.
They were hashed, so they took some precautions to prevent leaking, but you can easily reverse the process.
So I think it's very difficult.
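The reversal Rejo describes can be sketched in a few lines. The taxi release hashed plate and medallion numbers with MD5; because the space of possible values is so small, an attacker can hash every candidate in advance and look the "anonymized" hashes up in reverse. A minimal sketch, assuming a simplified, hypothetical plate format of digit-letter-digit-digit (not the actual NYC medallion format):

```python
import hashlib
import itertools
import string

# Because the space of possible plates is tiny, every candidate can be
# hashed up front and the "anonymized" values looked up in reverse.
# Hypothetical plate format assumed here: digit-letter-digit-digit.
candidates = [
    f"{d1}{letter}{d2}{d3}"
    for d1, letter, d2, d3 in itertools.product(
        string.digits, string.ascii_uppercase, string.digits, string.digits
    )
]  # only 26,000 possibilities

# Reverse lookup table: MD5 hash -> original plate.
table = {hashlib.md5(p.encode()).hexdigest(): p for p in candidates}

anonymized = hashlib.md5(b"9J21").hexdigest()  # what the dataset published
print(table[anonymized])  # recovers the original plate: 9J21
```

With only 26,000 candidates, the whole table is built in a fraction of a second, which is why hashing a small, structured identifier provides essentially no anonymity.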
Yeah, I think that it's a huge area.
And, I mean, this is deliberately an introduction talk, because it could've gone in so many different directions.
And this whole thing about the individual obviously goes into loads of different areas about freedom, in terms of the price we're actually paying by signing up for free services and things like that.
But as I think somebody commented, I've had the bonus of being able to get heads up of questions, which is great.
Somebody made a point about what happens if you get enough data about individuals, and this is a really important thing, because the services you sign up for might release your data openly.
If it's one individual thing, you might go, yeah, sure.
That particular data is fine.
But the point is, obviously, that in isolation, a single data source may not be identifiable in terms of individual people.
But then, actually, when you start to combine enough data about people, that's when you can get into real issues: it might be possible to identify people, especially certain people who may be in the public eye, from where they've been and when.
You know, things like that.
It's quite easy to track.
So I think that it's a tricky one.
I'm not an expert in this subject matter, as much as I'm standing up here, so I don't, kind of, have all the answers.
But I think that it's something that we need to be very careful about in terms of, as much as it's easy for me to kind of stand up here and go, yeah, put all your data out there.
You need to be considered about how you do it, and about the kind of, the impact you are having on other people as well, with the choices you make.
Yeah, I mean, Andy didn't actually call me stupid.
And he was right to call me naive, because he made me think about what I was saying.
And it's certainly true.
I read somewhere that 40% of American citizens are potentially identifiable with date of birth and zip code, just by cross-referencing it with other legitimately open data that's out there.
And that, I was about to say, hey everybody, you know, let's publish all of our data.
But the thing is you never know how people will cross-reference it and mash it up.
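Bruce's point about cross-referencing is, in the research literature, a question of k-anonymity: a record is protected only if at least k records share its combination of quasi-identifiers (like zip code plus date of birth), and a group of size one is uniquely re-identifiable once another dataset is mashed up against it. A minimal sketch with made-up toy records, counting the uniquely identifiable rows:

```python
from collections import Counter

# Toy records of (zip code, date of birth). A record is re-identifiable
# when it is the only one with its combination of quasi-identifiers;
# in k-anonymity terms, it sits in a group of size k = 1.
records = [
    ("10001", "1980-03-14"),
    ("10001", "1980-03-14"),
    ("10002", "1975-07-01"),
    ("90210", "1990-12-25"),
]

counts = Counter(records)
unique = [r for r, n in counts.items() if n == 1]
print(f"{len(unique)} of {len(records)} records are unique on (zip, dob)")
# -> 2 of 4 records are unique on (zip, dob)
```

The values here are invented for illustration; the point is only that the fewer records sharing a quasi-identifier combination, the easier cross-referencing becomes.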
And it's also complexified and, in English, complicated, by the fact that a lot of government data is actually data about us as well.
So if we somehow legislate the government to open that data, we could be shitting on our own privacy.
Which brings me to the second point.
Sally, you raised the interesting point that in Scandinavia, particularly Norway, where my head office is, tax records are open.
And when tax records are open, salaries are deducible.
And from an English perspective, because we're both English, I don't know what it's like in the Netherlands, that would be an intolerable invasion of privacy.
But in Scandinavia that is entirely expected and encouraged.
And it has benefits, like you being able to see whether, as a female, you're being paid less than a male.
So it seems to me that privacy is not an absolute thing.
It's a culturally-defined expectation.
And to what extent can open data on a worldwide web coexist happily with cultural expectations of privacy? Go on, Rejo.
Maybe you can go first, if you have an idea, and I'll think about it.
[INTERPOSING VOICES] --so I'm hoping this is on. I think that, again, it's one of those things that has loads of different facets.
Because you've got the new borders, or whatever, in terms of your traditional kind of country-based expectations and the rules that we've all grown up with.
And then how that crosses over into the web is a particularly interesting one, because some of these concepts kind of contradict the expectations we've typically been brought up with.
But I think that is not just kind of a country-based identity thing, or anything like that.
I think the really interesting thing comes when you put that into perspective: fundamentally, when you try to do anything, it very often comes down to the attitudes you have around it.
So even a company thinking about whether it wants to release some data related to that organization, even if the data doesn't come from the individuals in that organization, is likely not going to succeed unless it actually embraces this kind of concept of openness.
Because I did some work at the start of the year for an energy company.
And, naturally, again, there's certain information that energy companies have to be seen as putting out there.
But then also, because they want, or have to, try and help you cut down on your consumption, they kind of have to do certain things.
But then, really, do they want you to stop using energy? No.
Because then their profits are going to fall.
So it's this really mixed-up relationship, not only in terms of what we're comfortable with as individuals, but also in terms of the bigger perspective as well.
So, I think that you were asking us about privacy.
When I talk about privacy, I always think about freedom in more general terms, because I think those are very related.
And then, I think, this is a problem which is more generic, where we don't have a good solution working yet.
You see the same thing when Twitter removes tweets because they violate legislation in some country: what does that do to the same tweet in other countries? And as a citizen of one country, you could use Tor or a proxy or something to access the tweet.
So you can still access it.
So that's a bigger problem with these more global services: how to respond to the old borders which still exist.
No solutions arrived, but lots of food for thought.
Talking of food, it's lunch time.
Put your hands together for Sally Jenkinson and, of course, Rejo Zenger.