Limitations of Standards Bodies
How a technology bias creates avoidable unintended consequences
I’m annoyed with the W3C. For the past 2 years, I’ve participated in a W3C community group with three objectives: to learn about the standards process, to support the development of standards around a particular new technology, and to influence those standards by providing insights that are often missing in technical communities. My insights concern the impact of technology from a broad perspective: the long- and longer-term effects of technology on our world, including second- and third-order impacts, as well as influences from areas technologists tend to miss, like the political or demographic sphere. These insights are the result of 20+ years of working in the tech industry, and of understanding and overcoming my own tech biases as I became a degree-carrying futurist (M.S. in Foresight, UH).
I heard that the “standards folks” wanted input from diverse views in order to make the standards better. My own experience has been disappointing: I have not been able to participate in and contribute to the standards successfully. This disconnect concerns me, and perhaps others have had similar experiences. More importantly, technologists are surprised when technology that was meant to solve one problem leads to unintended consequences.
This is a delicate topic for me. I have an emotional attachment to the content and care deeply about learning from technology’s failures to make the future better. I’ve chosen to focus on what we lose because of my lack of participation, rather than enumerate my grievances. I’ve watched a missed opportunity unfold, and I hope we can learn from it.
Unintended Consequences
Unintended consequences are unforeseen issues that arise when a solution is designed too narrowly for the problem, without considering the solution’s broader impact on the system. They happen because of narrow bias and an inability to see or understand the effects of your work as it ripples out into the world.
Technologists tend to believe that (their) technology will save the world, so they go about solving all kinds of problems with (their) technology, including problems that could be better solved with, or in conjunction with, non-technical solutions, like addressing social or political situations. My experience has been that technologists do not consider non-technical solutions, in whole or in part, when developing their technology.
We can’t identify all of the potential unintended consequences, but we can anticipate some of them. The way to do this is to include a diverse set of perspectives; futurists and subject matter experts can spot potential unintended consequences before many others. Technologists must listen to the feedback (and if it’s good feedback, it will feel foreign), fully understand its implications (or recognize their own limits in understanding it), and work out how the technology, along with other tools, can address the problem.
Part of my desire to participate in the W3C standards process was to test a theory: could we identify and potentially reduce unintended consequences if non-technical people participated?
Technical Bias
There are risks in having only technical people develop standards. Technical people have clear unconscious biases toward technology as the solution, and it can be difficult for them to see otherwise. Among the more gifted engineers there may be a belief that their technology is superior, and that because of this they are the ones to solve these problems. Creators of technology don’t often see the flaws in their own creation.
The value of a non-technical, yet tech-savvy perspective
Here’s a story from the early days of my tech career. In 2000, I was part of a small team that had been acquired by a large company for its technology. In one meeting, it was me and five engineers digging deep into how <DIV>s could be used and how we wanted to use them. I sat in that room asking the team questions and drawing diagrams on the whiteboard, and the devs drew diagrams of their own. Right there in that room we were designing the technology. I came out with my head spinning, wondering if I had understood it all. When I mentioned this to one of the devs, he replied that he definitely hadn’t understood everything.
That’s when I realized that when you build a new technology, something still being formed, explored, and figured out, a non-technical person asking certain kinds of questions is useful, and perhaps even necessary, to its robust development. It’s the answers to the questions the non-technical person poses that cause the technology to be more deeply vetted and formed.
The Ideal
So what should we strive for? I ran a session at the Internet Identity Workshop (IIW) to explore engaging non-technical people as part of the standards process. We identified the following best practices for involving non-technical people in standards work.
- Define business requirements: If your goal with standards is to solve real business problems, develop a business requirements document that the technical working group must meet. That development should involve business and marketing people, who can then drive adoption of the standards in the marketplace.
- Do research: Rather than take whatever use cases and examples are volunteered, do real research and outreach. Sometimes the problem we think we’re tackling isn’t the one we’re actually facing.
- Involve users: Building on the bullet above, deliberately partner with and include users in the process to truly understand their needs, understandings, and expectations.
- Accessible documentation: For a reader of standards, the barrier to entry is really high. Documentation is not accessible; it’s written in technical jargon that is difficult even for many technologists to understand. The most skilled communicators are the ones who move fluidly between understanding the technology and communicating it clearly and simply to smart professionals. Standards must be accessible beyond the tech bubble.
- Productize standards: Package the standards development process for different stakeholders at different levels.
- Developer friendly: A friend describes standards as “code that an engineer executes.” It’s a challenge to take non-technical input and turn it into actionable code.
- Beyond the technical: Some standards have failed because of poor human usability. Many problems we face are not technology problems; in fact, technology may have created or exacerbated the problem as the result of an unintended consequence. It would be wise to consider solutions that are not tech-focused, or that rely on technology only in a limited way.
Standards should be developed with many people in mind. All of us have biases, and the only way we can overcome them is by making room at the table for those unlike us, with different perspectives and experiences. We need to change systems to empower those with diverse views to contribute their concerns and to incorporate those concerns into the process.
My Experience
I haven’t seen much of this happen, and I’ve become disillusioned with the W3C standards process because of it. (I have primarily participated in the W3C process.) While there is always talk about including a non-technical perspective, talk is talk, and I have learned the reality is different. Usability for non-technical people is poor. There’s gatekeeping. In the working group I participated in, all non-technical projects went through one person, removing the possibility of other leaders stepping up (my own volunteering was ignored).
Limitations of the W3C system
The W3C’s model for participation is pay to play. If you are a W3C member, you can join any group regardless of experience; if you aren’t able to pay, however, it feels like your experience, however deep, doesn’t matter.
There is supposedly an “invited expert” path into the pay-to-play part of W3C standards, but the criteria for what counts as an invited expert are not clear. I’ve been denied twice, despite my involvement in the community group, writing a book on the topic, a $300k government research award, and 20+ years of experience; yet companies that pay the W3C fee but don’t have any direct experience can participate. That feels like a recipe for bad standards.
I know things are much worse in other places, and the W3C is not the only venue for standards work. The Decentralized Identity Foundation (DIF) was announced specifically as a community effort that would *not* do standards work, but it has evolved into a place where standards will be developed.
Companies don’t even have to use a standard; I have advised companies who develop proprietary software. Another strategy is to transition proprietary code to an open source entity, much as IBM and Sovrin did to create Hyperledger and Hyperledger Indy at the Linux Foundation.
I’ve come to terms with not influencing the remaining development of the DID standard. I’m confident that the tech bias (not to mention the goals of the corporate membership) will create unintended consequences. I simply have no power to influence this aspect of the future, despite having the knowledge to potentially help avoid those consequences and improve the impact on the world. That makes me sad, more than anything. Sad because we have the knowledge to make the world better, but those in power can’t or won’t listen. And I don’t have the energy to fight this system.