# Open source contributor funding experiment setup

- **Team Name:** Web3 Association
- **Payment Address:** 14MvsY86yCauNJYp5AfjRYesQ32uXvaBzXEqoBJqj4MQpajR (DOT / USDC)
- **[Level](https://github.com/w3f/Grants-Program/tree/master#level_slider-levels):** 1

## Project Overview :page_facing_up:

### Overview

**Brief description**

This open source contributor funding experiment looks to fund one or multiple developers in the Polkadot ecosystem who will help with developing open source initiatives. Contributors will be funded for a fixed period of time, such as 4 or 6 months. Contributors will follow the funding process guidelines that are outlined in the experiment's documentation. As some examples, contributors are expected to submit monthly contribution logs, share public contribution tasks and only work on open source initiatives. The goal of this experiment is to learn as much as possible about contributor funding. The experiment will help establish how effective our suggested approaches are; these approaches were identified through analysing how a funding process can be structured (https://funding.treasuries.co). Full details about this contributor funding experiment can be found in our documentation (https://funding.contributors.org).

The purpose of this proposal is to do the work required to set up this experiment. This involves gathering priority suggestions from across the ecosystem and finding developer candidates who would be interested in participating.

**Relevance to Polkadot**

Polkadot currently uses OpenGov and the technical fellowship as two different funding processes. This experiment looks to explore alternative approaches to determine whether they could provide an improvement over the existing funding processes.

**Team interest**

The Web3 Association has now been analysing treasuries for over a year. George has previously completed analysis of Cardano's funding process, Project Catalyst (https://docs.catalystcontributors.org). He has been heavily interested in understanding treasury systems and funding processes, and his recent efforts have been focussed on identifying the most effective funding approaches that could be adopted to scale and accommodate a growing number of voters and contributors. This funding experiment looks to trial the most promising suggestions that have come out of our treasury and funding analysis. We believe many of these approaches could become long term solutions for Polkadot if the experiment proves to be successful.

**Academic research**

The data generated from this experiment is currently not intended to be used in our own academic research. Our experiment will generate data outputs and a report that will highlight some of the trends and findings from the experiment.

### Project Details

**Problems**

The full list of problems with idea funding is highlighted in our open source contributor funding proposal (https://docs.contributors.org/proposal/open-source-contributors). Some of those problems include:
- Voter complexities & high participation time required - It takes a meaningful amount of time for community members to review OpenGov idea proposals and vote on each one due to the amount of written content involved. It is difficult for voters to understand the context and feasibility of an idea without having sufficient relevant skills and experience in the topic area themselves. This makes it difficult for voters to have enough capacity to vote on every proposal and to feel confident in the decisions they are making.
- Contributor complexities & high participation time required - Apart from the technical fellowship, a contributor must create an idea or join an existing idea and submit a proposal to receive funding from the ecosystem. A contributor is unable to easily indicate that they are interested in working in the ecosystem under an idea funding process model. Contributors also face budgeting complexities as they are required to estimate how long an idea will take to execute, which is difficult in an environment that changes very quickly.
- Low funding process efficiency - OpenGov funding could result in the under or over funding of ideas, as it is difficult to know at the point of submission whether an idea will be fully executed and how much it will change during execution. If the ecosystem's requirements or environment change, or someone else executes a similar idea in that time frame, there is a risk that funds are wasted or misallocated.

**Research questions**

- How does the data compare with other funding approaches? - Experimenting with a different funding process means we need to generate enough data to assess whether this alternative approach is effective and whether there is an opportunity to pursue future experiments.
- How effective are the suggested approaches? - A number of approaches that have come out of the funding process analysis (https://funding.treasuries.co) have been adopted in this experiment. The experiment will help with gathering data about these approaches and about voters' and contributors' preferences and sentiments towards them.
- What preferences and sentiments do people have about the funding process? - We want to find out whether a contributor funding process is preferred by voters and contributors. If it is preferred, this provides some supporting evidence for why this approach could be a positive direction for more experimentation.
- What collaborations occur due to the incentive structure? - Contributor funding can be highly effective for giving more autonomy to contributors. Contributors are able to identify impactful areas of contribution and more easily respond to what actually happens in the ecosystem whilst they are contributing. Contributors are tasked with directing their efforts towards the initiatives where they are able to generate the most impact for the ecosystem.

**Hypothesis**

- The funding process will be quicker and easier for voters - This experiment should be able to prove that it is quicker and easier for voters to participate in the funding process. Voters won't need to review as much information and contributor proposals will be much easier to understand than idea proposals.
- Voters will have higher conviction in their voting decisions - Voters should not need as much expertise to have conviction and confidence in their voting decisions. This hypothesis should become easier to prove over repeated experiments, as voters can then more easily spot the top performers and reselect them with more confidence.
- The funding process will be quicker and easier for contributors - Any developer in the community will be able to indicate their interest in working for the ecosystem full time without making an idea proposal in OpenGov. They will be able to participate more quickly as they only need to fill in some personal and professional information.
- Voters will prefer this funding process - A simpler voting process should make it more feasible for voters to participate due to the lower time requirement. If it is also easier to understand, this could lead to a preference for this funding process over more complicated approaches. Another reason this might be preferred is that voters have more confidence in their decisions and feel they can more easily contribute towards important governance decisions.
- Contributors will prefer this funding process - Contributors should save time when participating in this funding process as the information required is just their personal and professional information. Contributors can also reuse this information in future applications which saves further time. Selected contributors could enjoy the freedom and simplicity of being able to focus on the execution of ideas instead of repeatedly writing and promoting idea proposals each time they want to receive funding. The contributor can focus their time and efforts towards executing one or multiple ideas and wouldn't be tied to any particular idea. If an idea wasn't getting the right results the contributor could redirect their efforts immediately elsewhere when they felt the need to do so.
- This incentive model will lead to higher amounts of collaboration - Contributors are incentivised to generate impact and are not tied to a single idea. This experiment could help to provide some evidence that a wider breadth of ideas get executed or that more collaboration starts happening across the ecosystem due to these incentives. It could be difficult to prove this hypothesis in a smaller experiment that has a short duration. This hypothesis will likely require more rounds of experimentation and more data to find out whether this hypothesis is correct.

**Methodology**

- 1. Identify the most promising funding process approaches - The funding process can be broken down into its different parts, and the different approaches for each part can be identified and compared. The most promising approaches can then be identified and suggested. These are the outcomes from our ongoing funding analysis (https://funding.contributors.org).
- 2. Experiment with the suggested approaches - Experimentation will help to test the most promising approaches that have been identified through analysis. The experiment will help to gather more data, preferences and sentiments about the approaches to identify how effective they were in practice.
- 3. Analyse the experiment data, preferences and sentiments and then compare this with other funding processes - The data, preferences and sentiments that come out of the experiment can be analysed to better understand the effectiveness of the experiment overall and of each of the suggestions that have been trialled. This data and analysis can then be compared with other funding processes to determine how effective the experiment has been overall.
- 4. Iterate and improve the experiment - The lessons learnt from running the experiment can be applied to any future experiment. Ideas that emerged out of the experiment can be further analysed to improve the funding analysis resources.

**Data collection**

The following parts of the funding process will generate data which will be open source and publicly available:
- Contributor proposals
- Voting results
- Contributors' open source contributions
- Contribution logs
- Feedback forms

**Analysis procedures**

- Collect and publish data - All data recorded during the experiment will be aggregated together and released in a single resource (a minimal aggregation sketch is shown after this list).
- Identify any trends, weaknesses, strengths and opportunities - The data will be analysed to generate a report that highlights any findings.
- Compare with other industry funding processes - The results from the experiment will be compared with other funding processes to justify any future experimentation and to understand the overall effectiveness of the funding process.
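
As a rough illustration only, the following sketch shows how the data outputs listed under data collection could be aggregated into one publishable resource. The file names and JSON layout are assumptions made for the example; the real experiment will rely on existing Web2 tools and publish the aggregated data openly, e.g. via GitBook.

```python
# Minimal sketch of the "collect and publish data" step, assuming the
# experiment's outputs are exported as JSON files (hypothetical file names
# and layout; the actual experiment will use existing Web2 tools).
import json
from pathlib import Path

SOURCES = {
    "contributor_proposals": "proposals.json",
    "voting_results": "votes.json",
    "open_source_contributions": "contributions.json",
    "contribution_logs": "logs.json",
    "feedback_forms": "feedback.json",
}

def aggregate(data_dir: str = "experiment-data") -> dict:
    """Combine every recorded output into a single publishable dataset."""
    combined = {}
    for name, filename in SOURCES.items():
        path = Path(data_dir) / filename
        combined[name] = json.loads(path.read_text()) if path.exists() else []
    return combined

if __name__ == "__main__":
    dataset = aggregate()
    # The aggregated dataset would then be released as a single open resource.
    Path("aggregated-dataset.json").write_text(json.dumps(dataset, indent=2))
```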

**Expected results & reproducibility**

All outcomes, data, feedback and contributions will be open source and publicly available. The grants team will be able to verify everything required about the experiment from start to finish.

**Relevant related work**

Other ecosystems might decide to experiment with our suggested approach or a variant of our suggestion. If this happens we would look to add any relevant data to our own documentation so that anyone can review it.

**Results publication**

We currently intend to publish all of the results and data related to the experiment in a GitBook repository under the domain data.contributors.org.

**Managing expectations**

The operation of this experiment will not involve any software development for creating new solutions. Only existing tools and services will be used to run the experiment. The services and tools being used to operate the experiment are existing Web2 solutions. These cover all our requirements. The lessons learnt from this experiment will be considered if we decide to help with developing any treasury and funding systems in the future.

### Ecosystem Fit

**Ecosystem fit**

This experiment looks to explore alternative funding approaches so this is highly relevant when thinking about how OpenGov and the technical fellowship could be improved in the future.

**Target audience**

The target participants are developers in the ecosystem who are interested in being funded to work on open source initiatives. The target beneficiaries are the projects building in the ecosystem that will benefit from the open source initiatives that get developed and improved.

**Ecosystem needs**

The ecosystem highly benefits from identifying the most effective approaches that can be adopted to handle and disburse treasury assets. Even small improvements in how funding is allocated could lead to highly impactful outcomes for Polkadot.

**Similar projects**

The technical fellowship is the most similar funding process to this experiment. This suggested experiment focuses on open source initiatives more generally in the ecosystem rather than the core infrastructure. Open source contributors can be seen as a layer on top of the founding entity teams and the technical fellowship.
In other ecosystems, the only contributor funding process we've found so far is the Stacks residence program (https://github.com/stacksgov/residence-program/blob/main/README.md).

## Team :busts_in_silhouette:

### Team members

- George Lovegrove (https://docs.web3association.co/contributors/george-lovegrove)

### Contact

- **Contact Name:** George Lovegrove
- **Contact Email:** [email protected]
- **Website:** https://docs.web3association.co

### Legal Structure

- **Registered Address:** Office A, RAK DAO business Centre, RAK BANK ROC Office, Ground Floor, Al Rifaa, Sheikh Mohammed Bin Zayed Road, Ras Al Khaimah, United Arab Emirates
- **Registered Legal Entity:** Web3 Association Ltd

### Team's experience

George has over six years of software development experience, mostly working with frontend technologies such as React and React Native.

George started participating and contributing towards improving the Project Catalyst funding process in Cardano after seeing the potential of Web3 treasury systems. Previous contributions focussed on funding categorisation (https://docs.catalystcontributors.org/funding-categorisation-analysis). This period of analysis was also when contributor funding was first identified as a potentially highly promising funding process. These resources led to the Catalyst team running workshops and changing how they approached funding categorisation from fund 11 onward.

The Web3 Association is the evolution of these historical contributions. It is focused on Web3 treasuries, and our analysis is relevant and applicable to all Web3 ecosystems. A number of educational resources have already been released about treasuries, funding and income.

### Team Code Repos

- https://github.com/web3association - All Web3 Association repositories


- https://github.com/lovegrovegeorge

### Team LinkedIn Profiles

- https://www.linkedin.com/in/georgelovegrove

### Google Scholar Profiles

Not applicable

## Development Status :open_book:

Experiment details & example
- https://funding.contributors.org
- https://example.contributors.org

Educational resources
- https://docs.treasuries.co
- https://funding.treasuries.co
- https://income.treasuries.co
- https://suggestions.treasuries.co
- https://docs.contributors.org

## Development Roadmap :nut_and_bolt:

### Overview

- **Total Estimated Duration:** 1 month
- **Full-Time Equivalent (FTE):** 1 FTE
- **Total Costs:** $3,000
- **DOT %:** 50%

### Milestone 1 — Experiment setup

- **Estimated duration:** 1 month
- **FTE:** 1 FTE
- **Costs:** 3,000 USD

| Number | Deliverable | Specification |
| -----: | ----------- | ------------- |
| **0a.** | Copyright and Licenses | MIT |
| **0b.** | Documentation | polkadot.contributors.org |
| **0c.** | Methodology | All documentation, data and proposals will be open source |
| **0d.** | Infrastructure | https://funding.contributors.org/contributor-funding-experiment/experiment-setup/tech-stack |
| **0e.** | Article | The priority suggestions and contributor proposals will be publicly available from the Polkadot experiment documentation. |
| 1. | Priority suggestions | A priority suggestions board will be set up and the community will be invited to post their suggestions. |
| 2. | Contributor proposals | The contributor proposal submission process will be set up and community members will be invited to submit their contributor proposals. |
| 3. | Funding experiment proposal | The funding operator will present the priority suggestions and contributor candidate proposals in a new proposal and suggest some parameters that could be suitable for the funding experiment. |
| 4. | Finalising proposal parameters | The funding operator will collaborate with the Web3 Foundation to confirm and finalise the experiment parameters before the committee decides whether the funding experiment will be funded. The parameters include the number of contributors that can be funded, the maximum dollar amount a contributor can receive per month and the number of months the experiment will run. Together these parameters determine the maximum total requested experiment budget (see the sketch below the table). |
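
As a hedged illustration of how these parameters determine the requested budget, the sketch below uses placeholder figures that are not proposed or agreed values.

```python
# Sketch of how the maximum total requested experiment budget follows from the
# parameters in deliverable 4. All figures below are placeholders, not agreed
# experiment parameters.
def max_experiment_budget(num_contributors: int,
                          max_usd_per_contributor_per_month: int,
                          num_months: int) -> int:
    """Maximum total requested budget in USD."""
    return num_contributors * max_usd_per_contributor_per_month * num_months

# Example: 3 contributors, capped at 6,000 USD per contributor per month, for 6 months.
print(max_experiment_budget(3, 6_000, 6))  # 108000
```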

## Future Plans

The intention of this experiment is to operate a cost efficient funding process that can explore the opportunity of contributor funding and a number of funding approaches in more depth. The goal is to learn as much as possible about contributor funding and whether our suggested approaches are effective or not. If the experiment proves to be successful, the next step for us would be to look at what improvements can be made to the funding process. These potential improvements will be documented in a final report. We would then be interested in executing a second experiment that adopts the lessons learnt from the previous experiment and iteratively improves our understanding of how to operate an effective contributor funding process.

Over the long term this funding process has the opportunity to become the most effective and most widely adopted way for Polkadot to disburse its treasury assets and more consistently generate impact across the ecosystem.

## Additional Information :heavy_plus_sign:

**How did you hear about the Grants Program?** Web3 Foundation Website

We previously applied for sponsorship to support our Web3 analysis efforts - https://github.com/w3f/Grants-Program/pull/2119/files

This previous grant was not funded; however, we still continued with our open source analysis.