- Conduct stakeholder interviews & surveys with the broad TE ecosystem to understand token engineering needs and challenges.
- Develop a reproducible methodology that future studies can build on.
- Deliver an analysis report with the findings from this team.
- Open source data and taxonomy to engage future studies.
It’s time we understood in depth the challenges and needs of token engineering stakeholders. The discipline has matured since its name was coined in 2018. The economies designed since then have had time to play out their failures and successes, the Token Engineering Academy has been providing consistent education and training professionals, tools like cadCAD, TokenSPICE, Machinations, and others are being used by many projects, and the Token Engineering Commons is working to sustain the economic ground on which TE (Token Engineering) public goods are funded.
The next step is to create a solid, well-informed roadmap for advancing the TE discipline based on the industry's needs, desires, and ideas — which first requires understanding exactly who makes up token engineering as a field.
This proposal addresses that gap. It covers the development of a methodology, interviews with TE stakeholders who will be grouped during the study, a survey, and a final analysis report on the results.
The interviews will be scheduled one-on-one with a maximum duration of 1 hour. Fireflies AI will be used for transcription, and a designated note taker will join each interview to highlight important points. With the consent of the participants, the data will be open-sourced to the community and will remain available for engaging junior TEs and researchers in future studies.
The information collected in the interviews will be synthesized into a report and used to create stakeholder-specific surveys, allowing both qualitative and quantitative conclusions to be drawn.
This work will generate the following artifacts by the end of the research arc:
- A synthesis of the token engineering ecosystem, grouped by stakeholder profiles
- A methodological report on how the synthesis was generated
- All unstructured and structured data generated by this research, kept in persistent storage
This is the first study of its kind in the token engineering space. The interviews will surface the needs, challenges, and desires of token engineers, and the study and its research processes will lay the groundwork for future studies to happen more easily. The information drawn from it will allow the TEC to build a roadmap to serve the community and will likely open room for collaboration between TE projects and professionals.
How much in wxDAI are you requesting?
The amount requested generated some debate with the team:
Danilo is a Token Engineer and researcher at BlockScience, experienced in estimating cost and project bandwidth. He suggested higher rates than the TEC can pay; however, they reflect the market and the quality of the work being proposed.
The cost of the project was estimated at ~69k xDAI, of which about 50% will be requested from the TEC. The initially suggested rates are in line with average Data Analyst / Social Researcher wages.
In discussion with the team, it was suggested that we reduce the proposed rates and recognize the reductions as pro bono contributions. Rates in the initial proposal were therefore cut by 50% or more to reflect these pro bono contributions and the financial reality of the TEC.
This internal debate showed us that the cost of research and token engineering work is also unclear in our ecosystem. Questions about it should be included in our interviews so that this study can also capture expected and actual rates for token engineering work and research.
The values in this spreadsheet are estimates of cost and bandwidth. Everyone on the team will track their hours and commitments. The funds will be distributed according to these roles:
Science SME: The Science SME (Subject Matter Expert) is an expert in qualitative and quantitative social science research who will own the methodology of this project, ensuring the results meet quality standards for relevance, inclusiveness, and reproducibility.
Infra SME: The Infra SME is an expert in the tooling required for reproducible science, with knowledge of persistent data storage as well as tools for communicating transient data.
TE SME: The TE SMEs own the synthesis narrative (the final analysis report). Active participation in all steps of this project is expected; required skills include a good understanding of the TE technical fields, projects, and groups, along with strong communication, writing, and synthesis skills. Although we have one main TE SME, interviewers will also share this role.
Communications: Comms will be responsible for the content distribution and writing support on the synthesis / summaries. Part of this budget will also be allocated for design.
Social Researchers: They will be responsible for interviewing people with heterogeneous backgrounds in the TE field and producing summaries of the interviews. They will be supported by the Science and Token Engineering SMEs to streamline their work, and are expected to provide regular feedback on the project methodology and the artifacts being produced. The social researchers are familiar with token engineering and can therefore understand their subjects' language.
Transcriber: All interviews will be recorded and transcribed by AI. The role of the human transcribers is to be present in all interviews as a support point for the interviewer and help highlight important points and patterns within the interviews.
The roles above are described roughly to delineate the work to be done; one person may play multiple roles, and some roles may require multiple people.
- The methodology is successfully followed
- An informative analysis report is shared with the community and stakeholders find it relevant
- A reproducible research pattern is shared with the community to incentivize future studies
A weekly report will be submitted to the TEC forum until the project is complete.
Bear: Comms, social researcher
Member of the TEC Coordination Team
Danilo Lessa: Science SME
Engineer at BlockScience
Gideon Rosenblatt: Financial support, social researcher
Member of the TEC Coordination Team
Lisa Wocken: Social researcher
TalentDAO and Bolster Leadership. PhD in Organizational Leadership, qualitative researcher and adjunct faculty at the University of Minnesota
Livia Deschermayer: PM, social researcher
Cultural Build at Commons Stack, TEC former steward
Malik Tag: Transcriber, comms
Electronic engineering background and TE Academy student
Nathalia Scherer: Interviewer, social researcher
Researcher at the International Institute of Psychoanalysis and DAOstack. Former TE Academy student, with a background in industrial & systems engineering focused on distributed tech and governance.
Shawn Anderson: TE SME, social researcher
Founder & Data Scientist at Longtail Financial, former TEC Steward
Timeline as proposed in the meta methodology
The timeline is subject to change as the methodology evolves; it is a guiding structure with estimated durations.
- Week 1: Initial set-up
- Week 2: Data keeping & provenance is defined
- Week 3: Groups to be interviewed are defined
- Week 4: Sampling strategy and Interview questions are defined
- Week 5: Interviews are ready to be started!
- Weeks 6 to 8: Interviews
- Week 9: Interviews are synthesized and summarized
- Week 10: Follow-up survey is defined and distributed
- Week 11: Follow-up survey analysis is done
- Week 12: Final report is written and distributed along with the research data