[Wikimedia-l] Reflecting on my listening tour

Hi everyone,

I joined the Wikimedia Foundation on August 1 of last year in a newly
created role as the Chief Product and Technology Officer (CPTO). (For the
first few weeks, some of the staff called me C3PO as they got used to the
new title :) The role was created to bring both the Product and Technology
departments back under a single accountable leader for the first time since
about 2015. Like Maryana
<https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Chief_Executive_Officer/Maryana%E2%80%99s_Listening_Tour>,
I decided to spend the first few months of my time at Wikimedia listening
and learning. Although I come from the open source technology field, and
have worked with volunteers and communities in prior jobs, it felt
important to start here with curiosity and openness about what’s working
well and what needs to change.

Since then, I have met one on one and in small groups with more than 360
people from 38 different countries. I also attended 22 large and small
convenings and events that included about 3,150 people.
This includes members of the Foundation’s product and technology teams,
other Foundation staff, editors, functionaries, affiliates, movement
organizers and open internet partners. I tried to approach every
conversation with curiosity, openness, and eagerness, letting go of any
preconceptions I may have had (intentionally embracing beginner’s mind
<https://en.wikipedia.org/wiki/Shoshin>) about the Foundation, the
Wikimedia projects, and communities worldwide that contribute to creating
and sharing free knowledge. I can confirm that I quickly found myself awash
in details, experiencing a firehose of information from all sides! My
husband and two young children have also learned a lot more about this
movement in the last six months than you might expect.

To provide myself with some structure, I asked everyone the same kind of
questions about: (1) the impact our product and technology organizations
have had on the movement and/or the world in the last five years, and what
people were most proud of; (2) the current vision and strategy, and whether
they will take us where we need to go; and (3) the most promising opportunities
that people see in our work, and what is needed to realize that potential.

I want to thank everyone who took the time to share with me, and I’ve
included some direct, anonymized quotes in this letter from the
conversations I had. And I want to confirm that the listening continues — I
will create more spaces in the year ahead for dedicated conversations about
some of the important topics I have highlighted below. I will also be
posting this letter to Meta.

Pulling in the same direction: More visible and shared metrics

On a page of the first notebook I had for my onboarding, I quoted a person
who said they just wanted "meaningful common goals." This was a theme
repeated over and over — a clear desire from everyone to do work linked by
common purpose, together with all the volunteers who have created the
Wikimedia projects. I got to hear so many different voices, and
I heard the details from every side — what’s working, what hasn’t been
working for a long time — some of the problems we face are over ten years
old. People shared what’s missing, what’s extra, who’s fighting to be heard
and who’s feeling lost at sea.

"I think there are lots of promising opportunities to incentivise people to
pay off technical debt and make our existing stack more sustainable. Right
now there are no incentives for engineers in this regard."

"Are we really having impact?"

How can we unite behind meaningful common goals? And which metrics matter
the most? We have so much data, but we really need lodestar
<https://en.wiktionary.org/wiki/lodestar> (or some refer to this as north
star) metrics across the whole Foundation, a system for reviewing and
reflecting on what we learn from them, and then a way to connect those
metrics with the day to day work everyone is doing.

To get at that, we’re doing two main things — one is deepening our
understanding of volunteer activities and the health of the volunteer
communities. This will be through working closely with volunteers using
existing processes and sharing what we’re learning, as well as qualitative
and quantitative research workstreams, including reviewing existing
research of volunteer activities and typical work profiles. The other is
working to establish a set of Foundation-wide lodestar metrics. Shared
metrics help everyone understand how we’re measuring success across the
Foundation, and we’re sharing these publicly as part of our Annual Plan.
Over time, we plan to bring our measures of success for important
initiatives to communities for conversations and debate to help everyone
align on what success might look like. Shared metrics and data, combined
with collaboration between those working on changes and those who may be
directly affected by them, will empower us to make better decisions.

What does our open source strategy look like for today’s world?

"I strongly believe that Wikipedia will be obsolete by 2030 if we don’t fix
MediaWiki now."

What is our open source strategy?

We have to make some harder choices about what it means to be an
open-source organization, and shift the conversation to resolve historic
debates that prevent us from making important, strategic choices.

Two big areas to resolve are:

- What is our strategy for MediaWiki support? Today there is a tug of war
about whether we should support MediaWiki for third-party users, even though
their use cases have diverged significantly from those of Wikimedia projects.
I'm planning a MediaWiki convening in late 2023 to begin tackling this issue.
- What is our strategy for third-party re-use of Wikimedia content? There are
a lot of nuances around rate limiting and updating the existing API policies
in line with our values around open access. How can we coordinate more across
the Foundation and technical volunteers to build greater understanding and
alignment? Wikimedia project content also has become a cornerstone of
artificial intelligence (AI) products. Wikipedias have long used machine
learning (ML) to improve content and detect vandalism. How can we help
support the use of ML and AI that is a public good? We have started some
conversations
<https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Annual_Plan/2023-2024/Draft/External_Trends/Community_call_notes>
about this but need to go further.


What will it take to have impact at scale?

"Before we can think about strategy, we need to answer ‘do we want to
change this culture to work with a unified strategy, or do we want to
change the strategy methodology to work with a decentralized culture? Or
some combination thereof?’"

What is our strategy for scaling that will allow us to have the most impact
with limited resources?

Today we support more than 750 distinct Wikimedia projects, over 300 of
which are language versions of Wikipedia, alongside Wikisource, multiple
language versions of Wiktionary, and many other free knowledge projects.

What is an efficient and responsible way to steward the limited resources
we have towards Wikipedia and/or the sister projects? And, as with the
earlier conversation about Foundation metrics, we must do this in a way
that can have an impact on our mission of bringing free knowledge to the
whole world.

Some of the big questions that came up included consolidating projects, and
the technology underpinning them, where it makes sense. And, from a prompt
given to me by the Commons community: how can we think even bigger and
question the elephants in the room, such as the long-standing and seemingly
unquestioned assumption that MediaWiki is the best software to solve every
problem we face? And if we do solve big
problems in different ways, what does that look like? What can we learn
from projects like WikiLearn, which uses free software not made by the
Foundation, as well as people and organizations outside our movement? This
is definitely a multi-year, rich problem space to explore.

Everyone’s relationship with English Wikipedia, including the Wikimedia
Foundation’s

"For various reasons, the Foundation and some parts of the communities are
stuck in an uneasy relationship where the Foundation admires but fears the
communities’ power, like a beautiful but dangerous animal – the tiger might
attack you – and the communities, not least English Wikipedia, distrust the
Foundation."

My experience so far has been that we have a very contentious relationship
with English Wikipedia. The Foundation raises most of the revenue to
support a global movement from English Wikipedia, and it’s often where
volunteers raise most of the concerns and objections to the Foundation’s
work.

This painfully affects volunteers and staff who are trying to maintain
content and code and make important improvements to all the websites, as
with the launch of Vector 2022 this year. It has made product and
engineering teams very conservative in their approach to rolling out
features, so that it can take 12 to 18 months, or even years, to get
valuable features to users. And it impacts our ability to collaborate with
communities on and off English Wikipedia on big goals like knowledge equity
and the movement strategy recommendations. As Yoda noted
<https://en.wikiquote.org/wiki/Fear#L>, fear is the path to the dark side.
This is a bummer, and I’d like it to change.

So how do we break this cycle? What I’m doing now is directly engaging.
Today, for example, I participated in an office hours session to talk about
Vector 2022. In the recent past, some of the senior product leadership
specifically avoided talking directly with people on English Wikipedia; that
approach is ending. Engaging human to human is the
best way I know to help resolve some of the mystery, fear and anger that
are present. However, that will absolutely not fix what’s wrong here. We
need systemic solutions. Today, there’s no way to make lasting and mutually
binding agreements with volunteers, and that isn’t a sustainable way to
create and maintain infrastructure software. My hope is that, with a more
open and direct approach to engage and also through the work of the
Movement Charter Drafting Committee, we will chart out a path for more
lasting, productive collaboration.

Being more intentional, and clearer, in our technical support for
volunteers

"We lack clear governance and communication for most of our tech
components, squandering a lot of the opportunities we have for more and
better participation from long-time and new volunteers."

How can the Foundation be more intentional about our relationship with all
volunteers?

Today we have few and incomplete policies about what volunteers can do in
technical spaces. We need to chart clearer boundaries, and let rational and
practical policy, rather than precedent, guide our work.

Similarly, the choice of technical spaces where the Foundation "stays out"
has felt ad hoc, leaving volunteers to step in to do important work. The
Foundation needs to show greater accountability for maintaining essential
services (e.g. 2-factor authentication), and to be explicit about the
technical tasks that it is definitely leaving for volunteers to own.

Finally, we really need to embrace a product development model that's more
collaborative and efficient. This means questioning feedback tools like
RfCs and considering movement "technology council" proposals. What will
really make us better together? I'm very interested in finding an answer to
this question.

Three Priorities for the Coming Year

What I have identified above are complex issues that cannot be solved in a
single year. We all need to take a multi-year view, especially in order to
more carefully define the precise issues that need to be solved.

For now, you have seen the draft annual plan priorities for the
Foundation's Product/Tech teams; they include:

- *Volunteers*: We need closer connections, with a focus on making all
time spent volunteering fulfilling and productive. I will continue to talk
directly to volunteers, on-wiki and in person. I am making a shift in our
Annual Plan to support the work and improve the experience of "editors with
extended rights" (inclusive of admins, stewards, patrollers, and moderators
of all kinds, who are also known as functionaries). The work done by this
group on mis- and dis-information and on enforcing our Universal Code of
Conduct is crucial to the functioning of all Wikimedia projects. Success
requires that we have metrics to guide our progress, identify ways of
measuring the health of communities, and do this work hand in hand with
volunteers.
- *Maintenance*: Staff and volunteers have both identified that we have
far too many unfinished technical migrations. This means that we continue
to support both old and new tools and ways of doing our technical work.
This increases the workload of everyone, without necessarily adding
features or improving our technical systems overall. Challenges include how
Foundation staff and volunteer communities make decisions, accountability for
those decisions and for choosing the best projects to pursue, and, on the
Foundation's side, a desire not to cause upset among volunteers. As a result,
we have many abandoned or poorly maintained tools. We must be able to choose
maintenance and technical migration areas to prioritize, and then accept not
doing work on others in order to complete some of these big projects. For
example, we have big work to do on
our data infrastructure, which is aging and made up of more than 40
distinct and fragile systems supported by a tiny team. We also have big
work to do on MediaWiki to ensure it can support our projects for the next
20 years.
- *Decision making*: From the very start of my time with the Foundation,
a common theme that kept coming up was the confusion that internal teams
had around decision making structures and accountability. I heard stories
about teams being indefinitely stuck, unclear decisions from the past, and
an inability to make and keep a decision. I view decision making like
lifting weights: you get good at it by doing it, incrementally and
consistently, over a long period of time. To start, I am making decisions
around the structure and organization of the Product and Technology teams
within the Foundation in order to make decision ownership clearer, more
direct, and more transparent. We're collaborating better internally, and
raising long-standing unresolved issues between teams in order to resolve
them, one by one. As I look ahead, clarity of decision making and how we
align our work towards our three objectives will be a core part of how I
organize teams.

In addition, I believe that decision making and achieving lasting positive
results need to be rooted in data. We will identify essential metrics to
evaluate progress and assess impact on the three objectives of our work.
This allows us to stay focused on our most important goals, make
adjustments as needed, and track our progress over time.

I am committed to promoting transparent and accountable decision making at
all levels of management and individual contributor leadership. As I wrote
earlier in this letter, I also welcome ideas on how to build well-defined
processes for engaging with communities and making decisions that endure.
These changes to how we make decisions will allow us to move more quickly,
be more responsive, and create a larger impact for our goals over time.

What’s Next

During my listening tour, some staff asked me an "elephant in the room"
question: why should they trust me? Given the number of different
executives who have come to the Foundation and left within a year or two,
skepticism about yet another new leader is high. My answer was: I believe
the problems we face, as a Foundation and volunteers striving to bring free
knowledge to the world, are complex puzzles that cannot be solved by one
person, and I’m committed to a multi-year approach to solving them
together.


Success requires more than a product roadmap. We need deep and effective
collaboration between the Foundation and all volunteers and communities,
shared ways to learn and be successful together, and constant adaptation to
changes in the internet and world, so we can solve the big puzzles we face
together.

Trust is built over time and through consistency, so I don’t ask for trust
as I begin my work. I ask that people be open to working closely together,
that we learn as much as we can about the important problems we face, and
that we regularly review our work in a data-informed way.

I want to be direct about how difficult I know some of these topics are,
even to discuss. But it is our job to tackle the most difficult questions,
especially where inaction due to fear has led to stagnation and demotivation
amongst both our staff and communities. None of these issues will be a quick
or easy fix.
Building and improving systems will be a lot of work, and will take a lot
of patience. But the payoff for solving each of these puzzles will be that
we’re able to engage more fully, and maybe even more joyfully, in our work.

My listening tour was an invaluable opportunity to get candid information
about what exactly is working, what isn’t, and what ideas everyone has for
creating something great together. We have a lot of work ahead of us, but
I’m encouraged by the energy and enthusiasm and I know we’ll be able to
tackle this together.

The next time you’ll hear from me will be in August, when I’ll share the
outcomes of community discussions related to annual planning
<https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Annual_Plan/2023-2024>,
and where I think we’re going to have impact in the coming year. In the
meantime, I want to share a few questions that I’ll be returning to
regularly: Are there examples of big issues that we've tackled well as a
movement? Where would you suggest I draw inspiration? What's worked well?
These are the complex issues that will guide my priorities over the coming
years. What elephants am I missing?

As I shared when I joined
<https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/XO27SCB2UKZ6H6YRKFZLBR4URFW2VPGW/>,
I came to work for the Wikimedia Foundation because free access to
knowledge is the most important thing I can be doing right now. Our work
empowers the people who have knowledge to share. By inviting youth, women,
and underrepresented identities to contribute their unique knowledge, we
will continue our journey to share the sum of all human knowledge. And this
kind of mission cannot be accomplished by any one person alone; we are
called to – and I feel strongly committed to – collaborate and truly be in
this mission together.

-selena