Research & Design Ops Application Landscape
I'm a firm believer in 'individuals and interactions over processes and tools'. However, that's not to say tooling isn't a great enabler of teams: it underpins their day-to-day operations, supports collaboration with interfacing colleagues and teams, and provides the foundations to accelerate and scale.
This article covers end-to-end considerations of tooling your experience design and research teams for software development at scale. Your tooling decisions should be treated as a total, integrated landscape; there's a lot more to consider than subscribing to the next hot SaaS offering on the market. I'll break it down into five areas of consideration:
Research Tooling Landscape
The tooling to help your team conduct research activities, facilitating attitudinal and behavioural analysis and providing qualitative or quantitative measures. Validity lies in the standard of the method you perform, not in the results (e.g. if you run an invalid experiment, your results are meaningless).
Interface Design Tooling Landscape
After prototyping and navigating the early ambiguity of a problem space by validating the direction of your solutions, it's time to design and build.
New tooling onboarding barriers and enablers
Your tools are only as valuable as the team leveraging them. Don't underestimate the effort required to incite change, unlearn, nurture adoption and integrate workflows.
What to expect for procurement
Depending on your organisation, procurement can be very involved, incorporating legal, cyber, privacy and finance. I'll provide some advice on navigating procurement as efficiently as possible.
Measuring your tooling success
Software options are in abundance today. You don't want a circus of every hot new tool; yet similarly, you don't want stagnant tooling that cripples the optimisation of your ways of working. Consider what success criteria can be regularly measured.
Research Tooling Landscape
Before we jump into what software to subscribe to, we need to look at the requirements and workflows of the research design landscape.
Design thinkers all conduct similar activities, whether you are a design researcher, service designer, strategist, etc. You want to understand why the problem space exists and how you can solve it: developing deep understanding through exploratory and generative analysis, providing formative and summative insights, and facilitating attitudinal and behavioural analysis with qualitative or quantitative measures.
Validity is in the method, not the result. We want to empower teams to recruit their required cohort, conduct the appropriate method to validate their hypotheses, and work in an efficient, effective, collaborative manner that amplifies the value of the research performed: strong foundations that enable your team to scale with agility and speed.
I found the market was providing 'one-stop shops': platforms trying to solve all our requirements in one, with pricing models that only benefit themselves. So I prefer a decentralised approach, ensuring we use the best tool for the use case in front of us whilst never being locked into a single vendor that hinders our ability to optimise our ways of working while we wait for them to catch up.
Participant Recruitment - Askable
A particular favourite of mine is Askable, a SaaS offering that allows you to take control of participant recruitment and handpick the exact cohort you need. No need to deal with those recruitment agencies anymore… Askable provides team and credit management for a team of teams, and regularly takes on feedback from clients, translating it into fast-followed, iterative feature development of its maturing platform. One of their new offerings is Askable+. Suppose you're lacking funding endorsement or short on capacity. In that case, Askable+ provides a marketplace of Australian design researchers to augment your team and facilitate sharp, point-in-time moderation, analysis and synthesis of your problem space. It's kind of like Uber for design research, which is saying something…
Recording Moderated & Unmoderated - Lookback
Whether it's a diary study or a moderated prototype user-test, Lookback provides an excellent platform for all things recording and note-taking. Lookback has two callouts: a) the Lookback app provides the ability to test mobile (iOS/Android) native app prototypes whilst capturing the camera view of the participant; b) the open invitation for note-takers provides a very inclusive experience for testing and note-taking, even in remote environments.
Facilitating Prototypes & Testing - Maze & Optimal Workshop
Optimal Workshop provides second-to-none card sorting and tree-testing reports if you're trying to validate information architecture. However, Maze has been a welcome new addition to the industry. Their 'Discovery' suite of card sorting and tree testing isn't yet rich enough to substitute for Optimal Workshop, but they are developing new features excitingly fast! Their 'User-testing' offering, though, is impressive. Test anything from product concepts and UX copy to hi-fidelity prototypes; the integration with Figma is a great workflow efficiency gainer. And the reporting summaries are great facilitators of playback to the team, sponsors and stakeholders.
Ohh, InVision & Sketch, you had so many years' head start… shame on you.
Research Repository - Dovetail
Typically, research data, insights, and feedback are siloed across multiple tools, software, and teams, inaccessible to the broader people and stakeholders in your organisation. The result is teams repeating the same research activities and producing the same insights and findings, which steadily lose the context of the original research data. Teams across the broader organisation aren't empowered to use, or given access to, research data to make better decisions. The Research Repository software category helps resolve this.
Beyond the apparent problem space, additional benefits I've found include:
IP loss and staff attrition risk reduced by an indexed centralised repository of accessible information.
Codified workflows and templates realise team efficiencies and effectiveness.
Speed up your entire workflow with advanced transcription and tagging capabilities and say goodbye to complex video editing software.
Identify key trends with confident thematic insight, compounding on existing insights.
Turn qualitative data into quantitative data—chart, filter, and segment themes across interview notes, transcripts, survey responses, and more.
Enable stakeholders and teams in your organisation to consume research and customer data efficiently, and read the findings from your research team.
Users speak for themselves with embedded video highlights in your research reports. Engage your team with beautiful typography, visual layouts, comments, and sharing.
It's worth noting that my team also uses a few Mac Automator and CLI scripts to streamline the dependencies between the platforms. If you face PI data considerations, I recommend deface, a simple command-line tool for automatically anonymising faces in media.
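As an illustrative sketch only (the folder names and batch workflow are my assumptions, not a prescription), a small Python helper can plan one deface invocation per recording before the clips enter your repository:

```python
# Sketch: plan one `deface` command per recording clip, ready to execute.
# Folder names are hypothetical; deface is the open-source face-anonymiser
# CLI (install with `pip install deface`), and `-o` is its output flag.
from pathlib import Path


def plan_anonymisation(source_dir: str, out_dir: str) -> list[list[str]]:
    """Return one deface command (as an argv list) per .mp4 clip."""
    commands = []
    for clip in sorted(Path(source_dir).glob("*.mp4")):
        out = Path(out_dir) / clip.name
        commands.append(["deface", str(clip), "-o", str(out)])
    return commands


# Each planned command can then be run, e.g.:
#   import subprocess
#   for cmd in plan_anonymisation("recordings", "anonymised"):
#       subprocess.run(cmd, check=True)
```

Keeping the planning separate from execution makes it easy to dry-run the batch and review exactly which files will be processed before any PI data is touched.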
Interface Design Tooling Landscape
So by now, leveraging the research tools, you will have navigated through the early ambiguity of a problem space, validating the direction of your solutions. Below you can see an example of an interface design application landscape that supports cross-functional teams and design systems.
Figma - Truly understanding digital makers and empowering them to collaborate, architect and design at scale. Worth noting: I would have listed Miro until recently, but the addition of Figjam changed that. Figjam provides (even in Beta, circa 2021) a cleaner, simpler, more human, and more integrated platform to facilitate digital whiteboard workshops. I could write many articles dedicated to aspects of Figma workflows because of how rich the platform is. But rather than compare the micro features against the likes of Sketch, AdobeXD, Axure, etc., I'll say that no similar platform comes close to the collaboration and supporting community of tools, plugins, and resulting workflows; this is what makes Figma special.
Zeplin - It's important to delineate between your design sandpit, enabling new ways to solve problems, and the frozen states of your design system versions. If Figma is your sandpit, then Zeplin is your frozen state, providing a collaborative environment for BAs, engineers and testers to elaborate.
Storybook - Component development in isolation, providing a complementary 1:1 relationship between design files and engineering assets. Similarly to Figma, it too is supported by a community of powerful plugins.
Zeroheight - Many build their own bespoke interface to render their design system. Zeroheight instead integrates with Figma, Zeplin & Storybook, whilst providing a codeless CMS experience to ensure all can contribute.
If you want to learn more about supporting design system development, you can read my article series detailing a design system's journey from inception through to funding and continued growth, starting with 'Part 01 - The Design System Problem Space.'
New tooling onboarding barriers and enablers
Your tools are only as valuable as the team leveraging them. Don't underestimate the efforts required to incite change, unlearn, nurture adoption and integrate new workflows.
Change will always present resistance and friction. Accounting for the barriers ahead, with a plan to resolve them and enable adoption, is critical to succeeding in new ways of working. Some barriers I've experienced, and tactics to resolve them, include:
Misunderstanding the need for change, or when the reason for the change is unclear
Ensure you are articulating the 'why': that this isn't a decision on a whim, but one backed by identified workflow issues or pain-point feedback. It's part of a long-tail investment for your whole application landscape, not a piecemeal decision in isolation. Or maybe it's an experiment to inform. If your intended adoptees believe this is a fad, or the decision lacks evidence, you will lose trust and adoption. Whatever the reason, ensure the 'why' is communicated clearly, frequently and openly, with feedback loops.
Accepting that the old mental model is no longer relevant or practical - People and teams will have built skills and reputations on mastering the incumbent tooling. Don't underestimate the mental-model shift required to let go and start over, losing the investment they've earned and facing a daunting learning curve. Resolve the lack of competence by ensuring you provide onboarding and supporting training materials to expedite the learning curve as much as possible. Consider incentivising adoption with rewarded brown-bags and train-the-trainer mentalities in your learning and adoption culture. Providing comparative examples of the old world versus the new world can reduce the fear of the unknown and leverage adjacent knowledge and skills for your new adoption.
Sceptics in the ranks - Harvard identified that 'It's Better to Avoid a Toxic Employee than Hire a Superstar', noting a toxic employee has a detrimental effect on 10 adjacent employees… I'm sure you have all met them. Change and adoption of the new will bring them out of the woodwork! Identify your key challengers and enablers. Convert your sceptics by making them feel heard and involved, and amplify your change agents to bolster adoption.
Although adoption of new tooling or ways of working often isn't a 'choice' for your team, treating adoption like any new product or service coming to market will expedite the benefits of your new ways of working. Luckily, the clever sociology professor Everett Rogers published 'Diffusion of Innovations' in 1962, modelling a five-stage process for enabling adoption.
Knowledge - The first step in the diffusion of innovation. The point at which the adoptees are exposed to the innovation itself. How can you drip the introduction of your proposed adoption? (e.g. Vendor showcases, industry articles)
Persuasion - Prospective adopters are open to the decision. They are actively seeking information that will inform their upcoming choice. What information does your team need to nudge and persuade? (e.g. Trial periods, experimentation, industry case studies)
Decision - As the procurer, you will inevitably decide for the team. The catalyst for change should very clearly articulate the 'why' to the team here: sceptics turned, change champions identified, and onboarding, training and support considered and ready for implementation.
Implementation - The tipping point, which you want to resolve quickly and efficiently. Work through the duality of onboarding your new adoption whilst decommissioning and migrating the legacy with haste. A persistent legacy platform will always provide a safety zone, inhibiting your adoption curve from tipping into your late majority.
Confirmation - This is only a point in time; future change and new adoptions will be in front of you. Ensure you are constantly evaluating your success criteria for the adoptions you've decided on.
Appreciate the relationship between types of adopters and their adoption curve.
Use your innovators to experiment with new tooling, forever optimising your ways of working and staying ahead of market trends.
Provide your early adopters and the early majority all the supporting communications, onboarding, training and support materials, and an accessible, community-oriented approach.
Resolve your decommissioning and migration of legacy platforms to nudge your late majority into adoption sooner. And identify your laggards, targeting them specifically with interpersonal connections and supporting change champions.
What to expect for procurement
Now just a heads up, this next part is not sexy… but it's necessary if you're looking to procure an enterprise licence with your proposed vendor. If you're a small team or business, move on to the next section, 'Measuring your tooling success'. If you're buying for an enterprise and are new to procurement, I hope this next section helps you work through your procurement efficiently.
Every organisation has different procurement policies incorporating legal, cyber, privacy, finance, etc. Having experienced and led procurement in many enterprises, I have identified requirements common to all future procurements:
Requirements of the provider
Develop a spreadsheet of all the requirements you're expecting from the provider. It doesn't have to be an exhaustive list of features, but enough to assess comparative options in the market. An example of an analysis requirements template for a design system manager is seen below.
Engage your procurement team
Your procurement team will have preparatory checks and balances before you can proceed far with anything.
Options Assessment & Selection
Review the market and schedule a demonstration of each platform. Ensure your evaluating audiences are cross-functional where required. (e.g. Will you need architecture present in demonstrations for SSO considerations?)
Legal counsel and MSA redlining
Much is on the table for negotiation when you're dealing with enterprise accounts, depending on the contractual value. Whether it's the pricing models, insurance liability caps, or renewal terms, consider it all on the table. Work with your legal counsel from your most advantageous position and find a middle ground both yourself and the vendor are comfortable with.
Cyber Security, Privacy and Risk Profiling
Sometimes conducted by internal teams, sometimes outsourced for third-party tiering. Depending on the SaaS requirements, it will align to a risk-profile tier (e.g. SaaS storing customer PI data will be considered higher risk than others).
Finance & Procurement Approvals
Accounting for the OPEX spend, and for any renewal agreement over 12 months, needs to be coordinated with your finance partner. Chances are you will have a few layers of approvals through the hierarchy of your organisation.
Depending on your organisation, this may be a supported workflow or self-service, and the process can take weeks or months. Stay persistent and don't do it alone: align yourself with change champions for support.
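To make the options assessment and selection step concrete, the requirements spreadsheet can be reduced to a simple weighted scoring matrix. A minimal sketch follows; the vendors, criteria and weights are hypothetical examples, not recommendations:

```python
# Sketch: weighted decision matrix for the options-assessment step.
# Criteria weights sum to 1.0; per-criterion ratings use a 0-5 scale.
def score_options(weights: dict[str, float],
                  ratings: dict[str, dict[str, float]]) -> dict[str, float]:
    """Return each vendor's weighted sum across all criteria."""
    return {
        vendor: sum(weights[c] * r.get(c, 0.0) for c in weights)
        for vendor, r in ratings.items()
    }


# Hypothetical inputs for illustration only.
weights = {"SSO support": 0.3, "Figma integration": 0.4, "Pricing": 0.3}
ratings = {
    "Vendor A": {"SSO support": 5, "Figma integration": 3, "Pricing": 4},
    "Vendor B": {"SSO support": 2, "Figma integration": 5, "Pricing": 3},
}
```

The weights force the cross-functional evaluators to agree up front on what actually matters (architecture may weight SSO highest; designers, the Figma integration), so the demonstration sessions produce comparable numbers rather than impressions.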
Measuring your tooling success
You don't want stagnant tooling that cripples the optimisation of your ways of working. Consider what tooling-operations success criteria can be regularly measured.
I won't go into too much detail here, as I've written a separate article on setting up a quarterly survey to measure the sentiment of your team across Culture, People, Tooling, Governance and Delivery; if interested, see 'Sentiment Insight of Research & Design Ops'. However, in the context of this article, there are some short notes worth mentioning. Your tooling should be an enabler for all your research and design operations. Consider how you can measure the below (e.g. Jira cycle times, survey sentiment, 360 reviews, etc.)
Your new tooling should be an enabler, not a detractor, to delivery, whether it's quality of output or speed.
Your team should improve their competencies by leveraging your tooling.
Onboarded new employees should be able to integrate themselves quickly and efficiently.
Your tooling should integrate with that of partners and colleagues. Is the value exchange efficient and pain-free?
Determine your measures, develop a baseline, set yourself desired thresholds, and analyse over time. You will then correlate insights with milestones in delivery, organisational change, or seasonal trends.
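As an example of turning those measures into a regularly checked threshold, here is a minimal sketch; the quarterly scores, the 0-5 scale, and the tolerance are illustrative assumptions, not real survey data:

```python
# Sketch: flag when the latest quarterly sentiment score drops more than
# `tolerance` below a rolling baseline of the preceding quarters.
from statistics import mean


def flag_regression(history: list[float], window: int = 4,
                    tolerance: float = 0.5) -> bool:
    """True if the newest score sits more than `tolerance` below the
    mean of the `window` scores immediately preceding it."""
    if len(history) <= window:
        return False  # not enough history to form a baseline yet
    baseline = mean(history[-window - 1:-1])
    return history[-1] < baseline - tolerance


# Five quarters of hypothetical tooling-sentiment scores (0-5 scale).
tooling_sentiment = [3.8, 4.0, 4.1, 3.9, 3.2]
```

Run against each survey cycle, a flagged quarter becomes the prompt to dig into the correlating milestone, organisational change, or seasonal trend rather than waiting for the pain to surface anecdotally.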