Don’t blame engineers for the shortcomings of Privacy by Design: privacy advocates are equally guilty

By Simon Davies

On 10th June 2000, amidst great ceremony, Her Majesty Queen Elizabeth cut the tape to open the first new pedestrian bridge across the Thames in central London in more than a century. The Millennium Bridge had won acclaim for its sleek shape and elegant design – and London was buzzing with excitement about its new landmark.

Then… unexpected drama. The bridge lasted less than 48 hours before a serious design flaw forced its closure for almost two years.

The problem came down to an unanticipated interaction between the structure and the people using it. As soon as pedestrians began to cross, the bridge started to sway unnervingly. Everyone was at a complete loss to explain why such a beautiful and efficient design had gone haywire. The authorities, however, showed no hesitation in shutting it down.

It turned out that the newly named “Wobbly Bridge” was the victim of a positive feedback phenomenon known as synchronous lateral excitation. The natural sway motion of people walking caused small sideways oscillations in the bridge, which in turn caused people on the bridge to sway in step, increasing the amplitude of the bridge oscillations and continually magnifying the effect.
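
The dynamics are simple enough to sketch in a few lines of code. Below is a minimal, illustrative simulation of the “negative damping” model used to explain the effect: each pedestrian pushes in phase with the deck’s sideways velocity, so a large enough crowd cancels out the structural damping and a tiny disturbance grows instead of dying away. Every parameter value here is an invented placeholder, not a measured property of the Millennium Bridge.

```python
# Illustrative sketch of synchronous lateral excitation as "negative
# damping": pedestrians push in phase with the deck's lateral velocity.
# All parameter values are invented for illustration.
import math

def peak_sway(n_pedestrians: int, duration: float = 60.0, dt: float = 0.001) -> float:
    """Integrate one lateral bridge mode and return the peak amplitude (m)."""
    m = 1.1e5                          # modal mass, kg (illustrative)
    f = 1.0                            # lateral natural frequency, Hz
    k = m * (2 * math.pi * f) ** 2     # modal stiffness, N/m
    c = 2 * 0.007 * math.sqrt(k * m)   # structural damping (0.7% ratio)
    kp = 300.0                         # pedestrian force per unit deck
                                       # velocity, N*s/m per person
    x, v = 0.001, 0.0                  # 1 mm initial disturbance
    peak = abs(x)
    for _ in range(int(duration / dt)):
        # Crowd forcing is proportional to velocity, so it acts as
        # negative damping: net damping is (c - n_pedestrians * kp).
        a = (n_pedestrians * kp * v - c * v - k * x) / m
        v += a * dt
        x += v * dt
        peak = max(peak, abs(x))
    return peak

# Below the critical crowd size the sway decays; above it, it grows.
for n in (10, 30, 60):
    print(f"{n:3d} pedestrians -> peak sway {peak_sway(n):.4f} m")
```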

This resonance had been known to engineers for decades, but the Millennium Bridge had been conceived for a prestigious design competition, so form and elegance were the primary drivers. The interface between human behaviour and engineering was never addressed.

The same phenomenon is all too common in the world of information and communication technologies. Those who design the machines that enable the invasion of privacy are often oblivious to such outcomes, while privacy advocates and data protection regulators are a million miles from understanding the dynamics and priorities of engineers.

While “Human-Computer Interaction” and “Security Usability” are taught in many security and information systems courses, the reality is that the interface between users and machines is still a niche interest. Engineers will design the most ingenious systems, but it is usually only in the later stages of development that someone may ask the difficult question: “how will people interact with this device?”

How people behave is of course crucial to privacy. Will users generate vast amounts of sensitive data that machines will unlawfully process? Will they understand the risks associated with information technologies? Will the design attract privacy-crunching apps that are allowed to exploit personal information?

These are of course critically important considerations for concepts such as Privacy by Design (PbD), which seek to embed privacy protection at every level from conception to deployment.

PbD is one of the main pillars of future privacy protection, but it currently exists mostly in the theoretical realm. As a concept, PbD was known to the architecture and building sectors from as early as the 1960s. However – within the information arena at least – the phrase appears to have emerged only in the late 1990s, on the heels of another expression, “Surveillance by Design”, coined during the debates over the US Communications Assistance for Law Enforcement Act (CALEA) in 1994. This and related legislation around the world were intended to embed surveillance capability into communications design, by mandating that systems be built in such a way that law enforcement agencies could access whatever data they wanted.

In an effort to counter this trend, researchers and regulators started to develop countermeasures that might provide a higher standard of privacy protection built from the core rather than as bolt-on measures. PbD is amongst the most important of these. This emerging approach is intended to ensure that privacy is maximised by embedding protection seamlessly across every strand of design and deployment of a product or service. As one prominent contributor to the field observed:

“How we get there is through Privacy by Design. Where PETs [Privacy Enhancing Technologies] focused us on the positive potential of technology, Privacy by Design prescribes that we build privacy directly into the design and operation, not only of technology, but also of operational systems, work processes, management structures, physical spaces and networked infrastructure. In this sense, Privacy by Design is the next step in the evolution of the privacy dialogue”.
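
At the level of code, “building privacy directly into the design” can be as concrete as deciding, at the point of collection, what never gets stored. The fragment below is a hypothetical sketch – all names and fields are invented – in which a whitelist schema admits only fields with a declared purpose, and a keyed one-way pseudonym replaces the direct identifier before anything is persisted.

```python
# A hypothetical sketch of data minimisation at the point of collection.
# All names and fields are invented for illustration.
import hashlib
import hmac

# The schema doubles as documentation: a field is admitted only if we
# can state why we need it. Anything not listed is dropped, not stored.
ALLOWED_FIELDS = {
    "postcode_district": "coarse location, for service planning",
    "age_band": "eligibility check",
}

PSEUDONYM_KEY = b"rotate me, and keep me out of the datastore"

def pseudonymise(user_id: str) -> str:
    """Swap a direct identifier for a keyed one-way pseudonym, so records
    can be linked to each other without recording who they belong to."""
    return hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256).hexdigest()

def ingest(raw: dict) -> dict:
    """Admit only whitelisted fields; everything else is discarded before
    it can be processed, logged or leaked."""
    minimal = {k: v for k, v in raw.items() if k in ALLOWED_FIELDS}
    minimal["subject"] = pseudonymise(raw["user_id"])
    return minimal

print(ingest({
    "user_id": "alice@example.com",
    "postcode_district": "SE1",
    "age_band": "30-39",
    "browsing_history": ["..."],   # never stored: no declared purpose
}))
```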

Some organisations now appear to be more open to the argument that data minimisation is a sensible approach to risk mitigation, and that giving users a degree of data autonomy is central to nurturing trust. In both respects PbD can be invaluable in developing practical alternatives – but is the world of engineering ready for it yet? Yes, there are celebrated examples of privacy awareness in engineering, but the key question is whether this awareness has permeated the mainstream of IT development.

The answer appears to be a resounding “no”.

The reality check for me occurred earlier this week. I was visiting a city that has a very good university with a large and strong Computer Science department – one of the top-rated in Britain.

I had been in touch with the department to let them know I planned to visit the area, and to ask whether a small gathering over coffee could be organised to discuss emerging data protection and privacy issues. Amazingly, there appeared to be no interest in privacy in this department. The meeting never took place.

This got me thinking that there may be a real disconnect in the academic world between engineering and data protection. The interface between human behaviour, personal data and privacy rules seems to exist mainly in the theoretical realm (Information Systems is the closest we get, but even that field is largely theoretical).

Is it that pure engineering, design and coding are still a world removed from the discussions my colleagues have about legal rights?

The advisers in my alma mater Privacy International certainly believe this is the case. One experienced IT professional observed: “There is generally some ethical red tape associated with new projects but once that fence is cleared then anything goes. In my experience, legal issues are obstacles to be overcome after a novel IT solution has been built and it is to be rolled out.”

A small number of companies are talking up PbD as a selling point for their commitment to privacy. Microsoft, for example, routinely associates PbD with its product development, but it is still unclear how much further the company can evolve beyond the Trustworthy Computing initiative of a decade ago. While Microsoft appears to have built a credible privacy framework at an organisational level, it still has a way to go before PbD is fully integrated into its business model.

There is a vast gulf to traverse. In a paper titled “What do IT Professionals Think About Surveillance”, noted privacy expert Ivan Szekely observed: “It can be concluded that the attitudes of the IT professionals only marginally influence their actual behaviour …. Those who care more about privacy do not appear to be using more PETs in their own online activities or in the products they have helped develop.”

This parlous situation needs urgent attention. The challenges are, however, not insurmountable. The theory and practice behind PbD are commonplace, and thus should not be seen as controversial. The concept of embedding protection through sensitive, seamless design has been embraced over the years in numerous environments. In the field of forensics, investigators have known for many decades that the forensic chain of events (collection of material, recording, processing, analysis, reporting and so on) is only as reliable as its weakest link, and that a “total design” approach should be taken across the entire chain to reduce the risk of failure.
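
The weakest-link logic has a direct analogue in code. The sketch below – hypothetical and deliberately simplified, not a real forensic tool – chains each step of an evidence log to a hash of the step before it, so that tampering with any single link invalidates everything downstream: the protection only holds if it holds across the whole chain.

```python
# Simplified illustration of "total design" across a chain of custody:
# each log entry commits to the hash of the previous entry, so the chain
# verifies only if every link is intact. Hypothetical example.
import hashlib
import json

def entry_hash(entry: dict) -> str:
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def append(chain: list, step: str, detail: str) -> None:
    prev = entry_hash(chain[-1]) if chain else "genesis"
    chain.append({"step": step, "detail": detail, "prev": prev})

def verify(chain: list) -> bool:
    """Altering any one entry breaks the 'prev' commitment of its
    successor, so the whole chain fails, not just one record."""
    return all(chain[i]["prev"] == entry_hash(chain[i - 1])
               for i in range(1, len(chain)))

log: list = []
for step, detail in [("collection", "drive imaged on site"),
                     ("recording", "image checksummed"),
                     ("processing", "files carved"),
                     ("analysis", "timeline built"),
                     ("reporting", "findings written up")]:
    append(log, step, detail)

print(verify(log))             # True: every link intact
log[2]["detail"] = "tampered"  # weaken a single link...
print(verify(log))             # False: the whole chain fails
```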

According to this rationale, the “spaces” between events and processes pose as much risk of failure as the component parts themselves. With this threat model in mind, a system or infrastructure can be designed from the ground up to ensure a seamless approach to risk reduction.

The same approach has been pursued to a varying extent for environmental protection, workplace safety, urban planning, product quality assurance, child protection, national security, health planning, infection control and information security.

This approach is rooted in a belief that reliable protection in a complex ecosystem can only be achieved through an integrated design approach. It is reasoned that unless a system is developed from the ground up with protection at its core, failure will emerge through unexpected weaknesses.

The three key perspectives in PbD – regulatory, engineering and managerial – intersect significantly. However, while the PbD concept continues to run along divergent paths, there is a substantial risk that the technique will be characterised by divergence rather than convergence. More interaction and dialogue are required involving regulators, business managers and engineers.

Currently, the evolution of PbD is sporadic, as is true of the early development of all such techniques. If proponents of PbD are arguing for an integrated and seamless adoption of systems, then they must argue with equal vigour for an integrated approach to developing PbD as a practical framework. Without such an approach, investors will remain uneducated and unmotivated, and PbD will remain a largely theoretical construct adopted by a small number of the “good” privacy actors.