the-turing-way
Proposal: Method for determining whether to switch tool
Summary
Following on from #2811 and https://github.com/alan-turing-institute/the-turing-way/issues/3039#issuecomment-1617851889, I'd like to share some initial notes on a proposed process for comparing tools and determining whether or not to switch tools:
- Ratings based on:
  - (Mis)alignment with project values
    - ✨Synergy✨
    - Potential for upstream contributions
      - Technical
      - Financial
      - Other?
    - Potential to showcase projects we want to hold up
    - license
    - perception of community (are they good open-source citizens)
    - backing organisations
    - relation to open science
  - Technical cost/benefit of moving
    - How much of a pain in the *** is moving from a technical perspective?
      - Difficult to set up?
      - Would/can we import data from previous system?
      - Long-term maintenance - easier/harder
      - Culture of support for tool - if we have issues with the current/new tool, what are the support systems to help us work it out?
    - Costs of not moving
  - Cultural cost/benefit of moving
    - Asking folks to learn a new tool
    - Moving folks away from tools that they use for multiple projects (lost efficiency)
    - Financial cost?
    - Costs of not moving
  - Accessibility
    - How does the current tool compare to a potential new tool in terms of accessibility?
      - linguistic
      - disability / access needs
      - geographic
What needs to be done?
- Feedback please!
- What's missing?
- What's most/least important?
- How do we "make it official"? (Link to governance discussions)
- Who gets to determine whether this is the rubric?
- To what extent, and how are assessments made collaboratively/collectively
- What should the process and time course of a proposed switch be?
- Meta question: just copy/pasting my notes above is fine for one-way communication (overhand throws thoughts into your face at point blank range) but it's not great for working this out together...
- Should I start a PR so it can be collectively edited better?
- A HackMD/Etherpad for quicker flow and reduced barriers to entry (for folks who are less familiar with GH/PRs)?
Who can help?
- Anyone
Updates
- 2023-07-30 Updated to add suggestions from @JimMadge, @gedankenstuecke, and @aleesteele ❤️
This looks great :rocket:!
I think the three main points (alignment, cost of moving, benefit of moving) are exactly right.
I'm sure we could think of more items for (Mis)alignment with project values. (Everything is a subcategory of synergy, is that redundant?). For example, license, perception of community (are they good open-source citizens), backing organisations, relation to open science.
How would you see this being used? Is it a set of prompts to draw out the important information? Is it a more rigid form with compulsory sections? Will it be used to calculate a 'moving score' to compare against a threshold?
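Purely to illustrate that last option (not a proposal in itself), a "moving score" could look something like the Python sketch below. All of the criterion names, weights, and the threshold are hypothetical placeholders that would need to come out of the community discussion:

```python
# Hypothetical sketch of a weighted "moving score". Each criterion is rated
# from -2 (strongly favours staying) to +2 (strongly favours moving); the
# weights reflect how much the community cares about each criterion.
WEIGHTS = {
    "values_alignment": 3,  # (Mis)alignment with project values
    "technical_cost": 2,    # Technical cost/benefit of moving
    "cultural_cost": 2,     # Cultural cost/benefit of moving
    "accessibility": 3,     # Accessibility comparison
}

def moving_score(ratings):
    """Weighted average of the ratings, staying in the -2..+2 range."""
    total_weight = sum(WEIGHTS.values())
    return sum(WEIGHTS[k] * ratings[k] for k in WEIGHTS) / total_weight

def recommend_switch(ratings, threshold=0.5):
    """Recommend a switch only when the score clears the threshold."""
    return moving_score(ratings) >= threshold

# Example: a tool that aligns well with our values and is more accessible,
# but would be moderately painful to migrate to.
ratings = {
    "values_alignment": 2,
    "technical_cost": -1,
    "cultural_cost": 0,
    "accessibility": 2,
}
```

A single number like this obviously flattens a lot of nuance, so it would probably work best alongside the prompt/form approaches rather than instead of them.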
Regarding making it official. I'm sympathetic to @sgibson91's suggestion that there should be someone responsible for this, to make sure there is space, and mandate, to investigate and action any changes. That doesn't necessarily mean that person would do everything themselves. I think it would be much better to source opinions and co-create the recommendations.
I'm happy to be involved with this. Maybe it would be a good collaboration cafe task?
I guess it's largely a matter of perspective but I think it might also be useful to include the "costs of not moving" in this calculus.
Arguably one could frame it as "benefit of moving: avoiding cost X that would happen if we didn't move", but (at least to me) that's a bit more convoluted and thus more likely to be forgotten.
Hi @da5nsy - this looks great! Thanks for compiling these important questions - this topic is much needed within TTW, in line with discussions that I know you folks have already had within the infrastructure team. Developing a process for evaluating tools will be really valuable, but I agree with @gedankenstuecke that it is very difficult to do so in a way that isn't completely relative or dependent on a single person's (or a small group's) opinions. A great challenge for being community-led.
In terms of format for such an evaluation process, I can think of a few examples that can help with this brainstorm (in line with @JimMadge's question of 'How would you see this being used? Is it a set of prompts to draw out the important information? Is it a more rigid form with compulsory sections? Will it be used to calculate a 'moving score' to compare against a threshold?'):
- Simply Secure/Superbloom has a quick 'user testing' cheatsheet template (pdf version - briefly sampled here) that frames desired feedback around sets of questions. I used this approach to inform the basis of documentation around our working groups (linked here - still very much a work in progress!). In the context of TTW, I can imagine that each of the titles & sections might have probing questions that can be asked of any suggested tool that we use within the community.
- Last year, I compiled an audit of our Slack platform, which you can see here. There's a wider collection of audits that I would be very happy to hand over, adjust, or do whatever might be needed with, if you see it fit for purpose. In general, this might apply more so to evaluating our current technical toolkit, as opposed to adopting future tools.
- Last year, the lovely Bernard Tyers (@ei8fdb) from Open Source Design made a visual matrix diagram on Miro of software that designers use, which may also be of interest as an evaluation mechanism. Again, this diagram might not allow for the kind of evaluation process we are after... however, I think it's a great landscape study, and could be helpful here.
A couple of comments on my end in terms of content, as I diverge slightly from Bastian and Jim's comments above.
- (Mis)alignment with project values: What are these values? Maybe we can start with that first. Interoperability, reproducibility, maintainability, accessibility, upstream-ability? Healthy ecosystem? Language-agnostic? Geographically diverse? This would be a great conversation to host more broadly in the Collaboration Cafe's main room – and leave room for async conversation, debate, and discussion. Attributing values to our infrastructure is a difficult but necessary conversation to have. Happy to support this.
- Technical cost/benefit of moving & Cultural cost/benefit of moving: In either/both of these sections, another point I would add is an accessibility evaluation: how easy is it for folks to use/interact with the tool, from both linguistic and disability perspectives? A few examples I can think of are whether or not there is support for alt text (in the case of a platform), or an all-English policy (as is the case with the Mastodon instance we are currently on, for example).
In terms of tools for drafting, how about hosting/facilitating a public discussion (we have a couple of slots open in the upcoming months), using a padlet to ensure wider accessibility after in-person conversation, leaving the padlet open for async notes & discussion, then structuring a draft PR from that?
Regarding @JimMadge's comment about fielding support/funds/personnel to do this work, this is absolutely something that we are working on in the delivery team, but cannot act on immediately. @JimMadge - it would be great to ask about what can be asked from Turing REG folks that are working with open source.
@aleesteele I don't think I disagree with any of your points above 🙂.
I like the ideas about how to make sure we do a good job of capturing 'what is important' in terms of values, culture, accessibility. There is a risk of a relatively small section of the community, who are most engaged with this, to not capture the thoughts and feelings of the whole.
Happy to talk about and facilitate REG involvement :+1:.
@JimMadge
I'm sure we could think of more items for (Mis)alignment with project values. (Everything is a subcategory of synergy, is that redundant?). For example, license, perception of community (are they good open-source citizens), backing organisations, relation to open science.
Great suggestions.
How would you see this being used? Is it a set of prompts to draw out the important information? Is it a more rigid form with compulsory sections? Will it be used to calculate a 'moving score' to compare against a threshold?
Potentially - all of the above.
Regarding making it official. I'm sympathetic to @sgibson91's suggestion that there should be someone responsible for this, to make sure there is space, and mandate, to investigate and action any changes. That doesn't necessarily mean that person would do everything themselves. I think it would be much better to source opinions and co-create the recommendations.
Strong agree. Part of my motivation in starting this is to move the conversations around defining that role forward.
I'm happy to be involved with this. Maybe it would be a good collaboration cafe task?
Sounds great. I suspect we might quickly reach a stage where we would want wider input and/or input from those with more decision-making power.
@gedankenstuecke
useful to include the "costs of not moving" in this calculus
Very good point!
@aleesteele
difficult to do so in a way that isn't completely relative or dependent on a single person's (or a small group's) opinions
Totally agree! I hope that within the project, we collectively have the skills and experience to work out how to involve as many people as ~~possible~~ sensible in this type of decision-making process. I do think that that is an incredible amount of work to do though (presumably it would involve coordinating with the different working groups/users/stakeholders) and that that would ideally be the remit of a paid member of staff (see @JimMadge/@sgibson91's points above).
I can think of a few examples that can help with this brainstorm
Ooh yay, thank you so much for sharing! I'll look forward to digging into them when I get a chance!
What are these values, maybe we can start with that first
OOF! That's a big question, but you're right! - everything else depends on that! I think we should probably take TTW values as a starting point (For future ref, see doc linked to at the top of #3215, in particular, "Vision, Mission and Scope" and "Guiding Principles and Priorities"), and I love your specific suggestions for what could be infrastructure-specific values! I agree that this could be a good topic for a collab cafe.
another point I would add here is an accessibility evaluation
Yep. That is probably worth being explicit about and giving it its own section.
In terms of tools for drafting, how about hosting/facilitating a public discussion (we have a couple of slots open in the upcoming months), using a padlet to ensure wider accessibility after in-person conversation, leaving the padlet open for async notes & discussion, then structuring a draft PR from that?
Sounds great to me!
Just saw this in @HeidiSeibold's newsletter and thought folks involved in this conversation might find it interesting
Text:
Heidi's rules for deciding on tools (WIP):
- Change a running system only when there is a strong pain.
- If there is a tool that is far better than anything else, use it.
- If there are several good options: Open over closed. Non-commercial over commercial.
- Exceptions apply, when the entity offering the tool has intentions we don't like (e.g. never use tools connected to Elsevier).
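Just for fun, Heidi's rules read almost like a decision procedure, so here's a minimal Python sketch of them. The function and field names are mine, not Heidi's, and the "far better" margin is an arbitrary placeholder:

```python
def choose_tool(current_pain, candidates):
    """Sketch of Heidi's rules. Each candidate is a dict with (hypothetical)
    keys: name, quality (0-10), open (bool), commercial (bool), and
    blocked (bool, e.g. connected to an entity we don't want to support)."""
    # Rule 1: change a running system only when there is strong pain.
    if current_pain != "strong":
        return None
    # Rule 4 (the exception): never use tools from entities we don't like.
    options = [c for c in candidates if not c["blocked"]]
    if not options:
        return None
    # Rule 2: if one tool is far better than anything else, use it.
    options.sort(key=lambda c: c["quality"], reverse=True)
    if len(options) == 1 or options[0]["quality"] - options[1]["quality"] >= 3:
        return options[0]
    # Rule 3: among several good options, prefer open over closed,
    # then non-commercial over commercial, then higher quality.
    options.sort(key=lambda c: (not c["open"], c["commercial"], -c["quality"]))
    return options[0]
```

The nice thing about the rules is that, unlike a weighted score, they make the tie-breaking order explicit.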
2i2c's Right to Replicate could be a source of inspiration here
Hi folks, just adding a flag here with some research & methods that I've been doing for our urgent switch away from Eventbrite and Tinyletter, as the former changed its pricing structure and the latter is being closed in 2024. Here are the research graphs that I used, which could be templated for future platform comparisons:
Having 'principles' in making platform switches is of course a much larger question, and these resources are a bit more in the weeds. This could be a collaboration cafe topic this year, for sure!
EDIT: Moving to #3594 after discussion with @malvikasharan
Adding in some problems we've had here with the built-in Zoom platform over the past few months of using it for registrations after moving away from Eventbrite:
- Collaborative editing: Only the event host/owner is able to edit details. Event co-hosts are not able to see registrations, edit event information, or change anything about the event itself.
- Zoom registration loop: At the February 2024 Community Call/Forum, participants were asked to sign up, and then received a link that sent them back to the registration form. This is a documented problem according to the Zoom community forums.
- Zoom registration error: The Zoom form remained open for registrations after the event had started, instead of automatically changing into a splash page where people could join the meeting, even though the "close registration after meeting date" option was selected.
- Renaming participants: During the Onboarding call in March 2024, the Zoom room automatically renamed all participants in the call to one person. The link from the registration form
- Follow-up emails to participants: While there is an option to resend the confirmation of registration, there is no way to send custom emails to participants as confirmation. The confirmation email offers limited options for what can be included, and custom message text can get lost in the GUI of the automatic Zoom message.
- Customising form for registration: There are limited options for what can be added to a Zoom registration form (e.g. no multiple-choice answers) or for adding custom messages.
- Keeping track of registrations: A few features we relied on with Eventbrite are missing. The sign-up list cannot be easily exported for others to view or edit, and co-hosts cannot see the list of people who have signed up in order to review registrations or answers (especially important if the person who created the Zoom link, who may be hosting, needs support during the call).
- Archiving event information more broadly: As of 2 April, the Zoom archive of the zoom link and other registration information has disappeared for Community Call (#3510), but remains for Onboarding call (#3586).
- No calendar invite
My suggestion here would be to switch to another platform for registrations – so we can avoid all of these in-platform issues with Zoom – but I will check back with the delivery team!