Case Study: EA Denmark LCAN Research Volunteer Project

Summary

EA Denmark created an 80/20 version of cause prioritisation research with a team of collaborators in Fall 2020. If you are working alone, you could do a similar project but choose only 1 cause area to research.

Project Manager: 1, Collaborators: 4 (all volunteers)

Timespan of MVP project: ~2 months

Hours per person: 0.5-3 hours per week, depending on the collaborator. The project manager spent roughly 2-3 hours per week.

You can use their Career Research Template to do something similar.

Contact: Viktor Petukhov (Project Manager) for more questions. You can request to join the LCAN Slack channel to connect to other collaborators working on local career advice research.

Goals

  • (Object-level) Increasing individual impact of members and the number of impactful members
    • Create a knowledge base that can be continuously updated, where newcomers can read through the sections and find people they need to talk to
    • They aspire for the document to be easily updated over time
  • (Meta-level) Establish 1-1s with groups & engage members in group projects

Process

  • In Feb/March 2020 Viktor did initial cause prioritisation research in Denmark & identified top causes
  • Viktor wrote up a quick description to help the volunteers, which you can see at the top of the Cause Research Results.
  • Viktor found other volunteers who each chose 1 cause to research
  • They first had ~3 weekly meetings discussing prioritisation and making sure everyone was on the same page.
  • Then they met every other week, exchanging results and making plans for the next two weeks.
  • They created a template for each cause writeup that each contributor could follow
  • Everyone conducted their research, which included domain expert interviews and initial research to find existing organisations in Denmark where participants could build skills or work. The type of research varied depending on the cause.
  • Some members used a checklist of things to keep in mind when writing up the results of their research
  • For each cause, they defined an 80/20 version and set a hard deadline of July 1, 2020

Results

  • Each volunteer picked a promising cause and did research for that cause
  • The causes researched were: Technical AI Safety, AI Policy, Climate Change, Mental Health

Next Steps

  • 1) Have a system to find people who would benefit from the research
    • As an MVP, think about existing members who would benefit from each of the existing cause areas
      • V: Reach more people rather than fewer people
      • Reach out to the broader community and find people to do career 1-1s with and walk people through the guide
    • Then: targeted outreach to university programs etc.
  • 2) Have people who will drive the project forward, to achieve 1)
    • The original group or other people
    • Hold coworking sessions, inviting people over to work on this project
    • Viktor is happy to continue leading the project
  • How to use this research more widely?
    • How to engage other EA communities to create this guide for someone else?
      • Connect with Norway/the Nordics on this research - Vaidehi to check with EA Norway (Eiren Evjen)
    • Presenting their research at various EA events, such as:
      • Unconference in Germany
      • Denmark Retreat
      • EA Denmark Council Meeting

Project Evaluation

Positives

  • The templates let them build on something that had already been tried out, e.g. starting from local prioritisation per cause as well as starting from Denmark as a whole. Doing it both ways created new perspectives. They liked the protocol
  • Defining next steps after each meeting was useful for knowing what each individual would do
  • Having a defined 80/20 version for each promising cause was very good. Having a check-list for writing each section was a big improvement
  • The structure of the document, and thinking about how to make it clearer for newcomers, was helpful
  • Having feedback from others was valuable

Improvements

  • Project Management Improvements
    • Group work sometimes lacked a framing. It would be useful to have a project management overview of what needs to be done for each meeting; it felt like going meeting to meeting.
    • A general progress bar for the project & deadlines (e.g. a project status) would help. It was not always clear who was doing what (perhaps less important)
      • Would help with personal planning week-on-week
    • This was challenging for a very uncertain project (they created the process as they went)
    • Feedback: would have benefitted from getting feedback earlier in the process
  • Expectations: it was not clear what was expected from her (Carmen) or from anyone. She didn't know whether she contributed enough; without comparing to others, it's not clear whether you are doing alright or not doing enough work. (Seconded by Viktor)
    • Viktor's perception: having expectations like a number of pages may not be realistic, and comparing to others was not helpful. What was produced seemed good.
    • Good to have a hard deadline for when people should add their sections to the document. Some people were still working last minute, which caused some confusion.
    • Check-list of deliverables from the very beginning would be beneficial
    • Ways to improve expectation fulfillment?
      • S: State exactly what’s expected from someone
        • E.g. para by para and with specific deadlines
        • However, this may not be possible because each section was so different, so they probably can't set one standard
        • Or, at the end of each call, take 5 minutes to get a specific commitment from each person on what to do next and what bottlenecks they expect, and group-brainstorm the possible bottlenecks
      • S: Format of the project - e.g. weekly coworking session
      • Va: Project Manager to do weekly minute check-ins & understand volunteers' needs
      • C: differentiate between expected goals and superhero/stretch goals
  • Writing Guides
    • Need to keep the audience in mind from the beginning.

Future of the Project

  • Structure of intermediate report can be improved: align all headings and have them in the same order
  • Will it be consistently updated?
    • It would be nice to update it every 6 months, but there is a lot of uncertainty
    • Could add a "last updated" date for each section so the reader knows how current it is
  • Revision process should be easy to do
    • What are the next steps?
    • The revision should include checking whether the organisations mentioned are still active. For example, AI Hub was active in 2016 but has since changed and now mostly has a reading group left. Links to organisations should be included, and a label like "active"/"inactive" could be added

How was the LCAN Guide helpful?

  • The structure of the guide was useful
  • The local cause prioritisation material was useful
  • How to do cause prioritisation: the part he added to LCAN was the bit they used
  • Interviewing experts - taking from the main LCAN guide
    • Maris & Viktor interviewed 3 people
    • Carmen did interviews for her own sake, with no formal structure in mind, although the information was useful since the interviews were semi-formal
  • Rating organisations - didn’t reach that point yet
  • S: used it for ways to do local cause prioritisation. Other than that, Viktor was mostly the one in the group relying on LCAN. An 80/20 version of the guide would be useful, along with a writeup of the guide, since the project runs on a volunteer basis