Metrics for Meta Career Advice Activities

Key Metrics

This is our initial approach to measuring our outcome. I expect that one of the first tasks for LCAN will be to establish robust metrics, erring on the side of collecting more data rather than less so we can verify we are having the impact we expect to have. I don’t currently feel confident making explicit suggestions, but the following is my ideal outcome-measurement strategy.

  • Challenges of measuring our outcome
    • Because we are a meta-level network, our primary “clients” will be organisers. Thus, getting their feedback on our impact will be really important. However, there are some biases that may arise here, especially if organisers overestimate the usefulness of our services. Here are useful things we could find out from organisers:
      • How much did their approach to career advising change after they interacted with us?
        • What kind of events did they run?
      • How valuable did they find our various services?
      • How many people did they forward our resources or services to?
      • What other resources did they use? How would they relatively rank different resources?
    • To mitigate these biases, we will need to figure out strategies for getting data on the actions organisers take after they interact with LCAN, especially:
      • Was there a difference between career events run before and after coming across LCAN? (This will be difficult if they interact with us before they start running such events, but we can use our career advice bottlenecks survey as a rough benchmark for where organisers started.)
        • If we rely on organisers’ evaluations of the events, we need to ensure some level of standardization across events. Thus, one of our first tasks would be to work with group organisers and CEA to come up with an appropriate methodology for measuring the impact of career events and activities, especially over the long term (e.g. doing 1-3 month follow-ups).
      • If they forwarded any of our resources to their members, we could directly ask those people, perhaps by adding a feedback form to the bottom of our resources and webpages.
  • Types of metrics we will use
    • Lead Metrics
      • The things we are doing that we expect will cause the outcome we want; we measure the activity itself (views etc.).
      • Examples of things we will collect:
        • # of resources, page views, time spent on different resources
        • Mailing list subscribers
        • Feedback forms
        • Requests for feedback via Slack/mailing list
    • Lag Metrics
      • The actual outcome we want to measure
      • People who’ve changed their career because of LCAN
        • This assumes groups are measuring career changes with the same metrics (establishing these is one of LCAN’s first tasks)
    • Example: 1-1s with group organisers (a concrete record sketch follows this list)
      • Lead Metrics: number of calls with the group organiser and topics of discussion about career events
        • How many new resources could I share?
        • How many connections made?
      • Lag Metrics: how did participants rate the career event? Measured on:
        • Motivation
        • Connections
        • Information
        • Overall quality of feedback
        • Completion of plans
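
To make the lead/lag split concrete, below is a minimal sketch (in Python) of what a standardised record for one career event could look like, including the 1-3 month follow-ups mentioned above. The field names, rating scales and structure are purely illustrative assumptions, not an agreed LCAN schema.

```python
# Illustrative sketch only: field names and structure are assumptions,
# not an agreed LCAN schema.
from dataclasses import dataclass, field
from datetime import date, timedelta
from typing import List, Optional


@dataclass
class CareerEventRecord:
    group: str
    event_date: date
    # Lead metrics: things we do that we expect will cause the outcome we want.
    organiser_calls: int = 0        # 1-1 calls with the organiser about this event
    resources_shared: int = 0       # new LCAN resources shared with the organiser
    connections_made: int = 0       # introductions made through LCAN
    # Lag metrics: participant-reported outcomes (e.g. averages of 1-5 ratings).
    motivation_rating: Optional[float] = None
    connections_rating: Optional[float] = None
    information_rating: Optional[float] = None
    overall_rating: Optional[float] = None
    plans_completed: int = 0        # participants who completed their stated plans
    follow_up_dates: List[date] = field(default_factory=list)

    def schedule_follow_ups(self) -> None:
        """Schedule the 1- and 3-month follow-up surveys after the event."""
        self.follow_up_dates = [
            self.event_date + timedelta(days=30),
            self.event_date + timedelta(days=90),
        ]


# Example usage with made-up numbers.
record = CareerEventRecord(group="Example group", event_date=date(2021, 3, 1),
                           organiser_calls=2, resources_shared=3)
record.schedule_follow_ups()
print(record.follow_up_dates)  # two follow-up dates: 2021-03-31 and 2021-05-30
```

Whatever format we settle on, the important point is that every group records the same lead and lag fields, so that events can be compared and aggregated across groups.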

To Do

  • Re-read 80,000 Hours’ DIPY metrics
    • Value of a full plan change vs a speed-up?
    • They claim credit for 20% of total lifetime impact
      • How should we estimate this across local groups? (A rough sketch follows this list.)
  • We need reliable multi-year tracking of participants across groups
  • External impact-tracking surveys over time that organisers have access to but are not responsible for sending out
  • Challenges
    • Feedback is hard to interpret if you don’t know who it came from. We also need to be careful with how we frame feedback questions (e.g. maybe ask what their goals were first, and then whether they found the session valuable; they may have had different goals to what we were optimizing for)
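
As a starting point for the “how to estimate across local groups” question, here is a rough back-of-envelope sketch. Purely for illustration, it assumes an 80K-style convention of crediting 20% of a person’s discounted lifetime impact to the career advice behind a plan change, and then splits that credit between the local group and other resources using a made-up weight; this is not 80K’s actual DIPY methodology, and all numbers are placeholders to be replaced once we settle on our own approach.

```python
# Back-of-envelope sketch only: the attribution convention, the applicability
# of the 20% figure to groups, and all example numbers are assumptions.

# Reported significant plan changes per local group (made-up data).
plan_changes_per_group = {"Group A": 3, "Group B": 1, "Group C": 5}

IMPACT_PER_CHANGE = 1.0  # discounted lifetime impact of one plan change, arbitrary units
ADVICE_CREDIT = 0.20     # fraction credited to career advice overall (assumed convention)
GROUP_SHARE = 0.5        # assumed share of that credit going to the local group


def attributed_impact(changes: int) -> float:
    """Impact units credited to one group under the assumed convention."""
    return changes * IMPACT_PER_CHANGE * ADVICE_CREDIT * GROUP_SHARE


total = sum(attributed_impact(n) for n in plan_changes_per_group.values())
for group, n in plan_changes_per_group.items():
    print(f"{group}: {attributed_impact(n):.2f} impact units")
print(f"Total across groups: {total:.2f} impact units")
```

The point of a sketch like this is to make the attribution assumptions explicit; the multi-year participant tracking above is what would eventually replace the made-up inputs.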

Other thoughts (to be moved)

Older/non-core community members

  • Need to figure out what they can do, i.e. what their comparative advantage is
  • May not be very high impact themselves, but they can:
    • Provide critical mass and group stability
    • Speed up members with high-impact potential
      • Improve the quality of discussions and vet their ideas (encourage them to move away from pet projects more quickly)
    • Help us learn more about people’s thinking and whether there are people who didn’t seem promising at first but are

Shift in thinking for local groups

  • Shift from 80K priority paths to high-impact career paths more broadly, including short-termist paths (per the survey)
  • How do 80K’s new metrics change past group analyses of their impact? Was that ever the appropriate methodology?
  • Current status quo: a random job plus donating; members could at least optimize for earning to give (ETG), if not a bigger change
  • Personal uncertainty, because members need to do a lot of thinking and research
  • Ways to engage with our survey research
    • Discussion thread
    • Have been asking organisers to use the list of potential bottlenecks to identify their groups’ own bottlenecks
    • A form to continue gathering data (but in a simpler way; could this be tied in with the Hub?)