You asked, we answered: Introducing Ask twig+fish

I have had the pleasure of being mentored by some of the finest UX professionals over the past 25 years. I love that so many of them have become family to me. “Pay it forward” has always been a big philosophy of life that my parents shared with me and my siblings. Over my 16-year tenure at Bentley University, I have enjoyed the great fortune of numerous fabulous students coming through my class and the Human Factors and Information Design Program. I even get to work with a fine product of that program on many fun projects, and am so glad our paths crossed (Zarla!). There have always been some students who stay in touch – and ask questions that challenge my viewpoints on research, our field, and my philosophies in general. Sharing information has been one of the best by-products of being a Bentley faculty member.

Just recently, I had a Bentley alum write me about an interesting quandary he had found himself in – and he started his email to me wishing there were a “Dear Meena” column. A “Dear Meena” column? I had to smile, but then, at Zarla’s push, I decided “um, sure.” I am happy to share whatever I can – but be careful – you might not like the response!

Given that my illustrious colleague Zarla has many fine answers to tough questions too, and sometimes much more eloquently presented than mine, we decided on a “Dear twig+fish” column.

So – here it is!

Here are the rules.

This is a sounding board.

This is not legal advice. We will never know every detail about you, your communication style, your knowledge base, your workplace, your domain, your colleagues, your counterparts. And, you all know how twig+fish have opinions. We don’t expect you to conduct yourselves in the exact same manner we do. And that can make a difference to the outcome!

Let’s make this a conversation. Perhaps we can throw in how we might approach a situation when it comes to research, or handling a client that doesn’t see value. Or, what resources we tap into when we ourselves are unsure of a situation. We are always facing something new, and sometimes, it does help to just get a thumbs up or sideways on your proposed approach to handling that something.

What is fair game?

Well, really – anything. I have had students and professionals talk to me about research, client relationships, career paths, work-life balance, new ideas, electronic devices and their kids, and recipes for new food or mixed drinks. All of the above apply! Why not!

And lastly – everyone remains anonymous. Just to be fair and respectful to people and their individual situations. Respect first, always. If you want to send us your questions on Twitter, then just keep in mind it will not be so anonymous (@meena_ko #asktwigandfish).

So – let me start with the first question that kicked this off, and we shall see where it goes!

 

ASKING PARTICIPANTS TO SPECULATE.

“My coworker likes to ask participants how they think a particular friend might like a product, and what their friend’s first reaction to a product might be. She says it's a good way to find out what they *really* think but are too polite to say. I've always heard that asking participants what they might like in the future is not reliable, and that asking for secondhand reports of what they think OTHER PEOPLE might feel does not generate very reliable or valid data.

So, is there a value to the question I am missing here?

Sincerely,

Confused in C(place)”

RESPONSE FROM TWIG+FISH.

Dear Confused,

I don’t use this approach for a few reasons. Let's paint the scenario. You ask A if A's friend B likes a product. Does A know enough about B? What assumptions is A casting on B about their behavior? What part of B has A studied to truly understand their motivations in liking/disliking something? At the root - we are not there to get A's perceptions of other people. A can report on A's realities, and possibly (possibly) - at some point - can cast some generalizations on other people's behaviors. But that is it. We also have no evidence from A that B really is what A perceives them to be - as we cannot task A to give us evidence and proof of B's behavior. To me, there is an issue with the approach.

Regarding your point that asking participants what they might like in the future is unreliable – I partially disagree. We cannot directly ask people what they want. That puts the onus on them to design, and really, A is not a designer. But, we can get an understanding of A’s realities, and then see what they tend toward. Is A the kind of person who creates his/her own solutions? Or does A rely on someone else to create them? We can use A’s analogous experiences to study behaviors and see how the past and future play out in those analogous situations.

At the core, though, we want people to report on themselves and provide us evidence of those reports. It is on us as researchers to craft protocols that allow people to easily share details about themselves. Maybe that means we have to work a little harder on crafting our protocols – but it is absolutely possible.

Hope this helps – and perhaps, I’ll see another question come this way shortly!

Santé!

 

Have any burning UX or qualitative research questions? Send them to us on our website or tweet them @meena_ko #asktwigandfish. We will happily share responses as we get questions!

Part 4 - The Robust Proposal (Qualitative Research Proposal Series)

This is Part 4 of a four-part series on proposal writing for qualitative research. Please read Parts 1, 2, and 3 of this series. They cover how to craft an overview as a starting point of conversation with the client, best practices for developing a budget, and how to justify proposal details.

You have made it this far with the client! You’ve crafted your overview, which piqued their interest and gave them a beacon. Your budget calculation has helped you understand the amount of actual work that will take place to complete the project. Ensuing discussions forged the way for further clarity on the executable details.

Now is the time to iron out the legal language, and document the specifics. In addition to having all the details about the study objectives, methodology, sample, and cost (which are all in the overview), your proposal is going to need:

1. Phase by phase breakdown

2. Terms and conditions

3. Signature page

 

1. Phase by phase breakdown

Accompanying a phase overview (a snapshot of the entire program), we separate each phase onto its own page. We spoke of the table in Part 1 – this takes each “column” from the table and presents all the excruciating details. For each phase, we put in specifically what we will do (the tasks), the deliverables, and the assumptions. Tasks are pretty self-explanatory, but it is very important to be pointed about the actual work that will transpire. Mention all of the actual work that goes into the task, such as making a document, holding a meeting (and its length), creating an agenda, etc.

We almost always indicate the form factor of the deliverables at each phase. Creating a PowerPoint presentation for a check-in is very different from drafting a set of engaging posters. If there is any ambiguity about the quantity of certain deliverables, we usually indicate an “up to” point – such as “up to two screeners.”

Assumptions are dependencies on the client. We usually think of them in terms of how much effort the client must put in, and what constraints were discussed prior to the development of the proposal. We have included assumptions that keep the client accountable if there’s a particularly tight timeline (e.g., the client must respond within two days with feedback on a draft). We have also included assumptions about field involvement. For instance, to cut costs, sometimes we rely on the client to be a note taker, so we put this directly into the assumptions so it is clear that the scope is based on that expectation.

The phase breakdown can be as granular as you feel comfortable, but keep in mind the overall arc of the program and your relationship with the client. Some will appreciate the attention to detail, while others (or their legal departments) will want more clarification on an assumption.
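If you keep a reusable proposal template, it can help to treat each phase page as a simple record of tasks, deliverables, and assumptions. Below is a minimal, hypothetical sketch in Python of what such a record might look like; the field names and example values are ours, for illustration only, not a standard of any kind.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Phase:
    """One page of the phase-by-phase breakdown."""
    name: str
    tasks: List[str] = field(default_factory=list)         # the actual work that will transpire
    deliverables: List[str] = field(default_factory=list)  # include form factor and "up to" limits
    assumptions: List[str] = field(default_factory=list)   # client dependencies and constraints

# Hypothetical example values, for illustration only.
fieldwork = Phase(
    name="Fieldwork",
    tasks=["Conduct up to 12 in-home visits", "Hold a 1-hour debrief after each field day"],
    deliverables=["Up to two screeners", "Daily field summary (email)"],
    assumptions=["Client provides a note taker for each session",
                 "Client responds to draft materials within two days"],
)
```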

 

2. Terms and conditions

There are many online resources researchers can use as terms and conditions. Always include them in the robust proposal. Sometimes the client legal team will push back on particular language, or will want to include their own (such as intellectual property ownership). It’s always prudent to have a legal expert to consult when drafting your own terms, and in cases in which clients want to make revisions.

 

3. Signature page

This is the dotted line page! Sometimes it gets forgotten, and then the client has nowhere to put their signature. Always have the client sign first, in case they come back with revisions along with their signature.

 

Proposals are so important for qualitative researchers. With so many moving parts, ambiguity in the outcomes, and reliance on the findings to drive other processes, it is crucial for researchers to develop strong practices for sound proposal writing.

This concludes our first foray into a blog series. We hope you enjoyed reading it and look forward to answering any other tactical questions you might have. Meena will soon be sharing a blog post that collects the many questions we have received from colleagues and practitioners. Stay tuned for that!

If you have any final comments on proposals, tweet them to Meena @meena_ko, #betterproposals. Maybe in your tweet, indicate why I should get into the Twittersphere!

Part 3 - The Conversation/Talking Points (Qualitative Research Proposal Series)

This is Part 3 of a four-part series on proposal writing for qualitative research. Please read the introduction, and Parts 1 and 2 of this series. They cover how to craft an overview as a starting point of conversation with the client and best practices for developing a budget.

With a proposal overview, supported by a budget calculation and rough field schedule, we can now have a detailed conversation with the client about the project. It is healthy for a bit of back and forth to ensue, typically involving methodology questions and opportunities to cut cost. This demonstrates to the client your willingness to be flexible (while maintaining the integrity of the research study), as well as your knowledge on how to modify the details to suit constraints.

In this section, we will share the common discussion points that then inform the final proposal.

1. Cost

2. Partners

3. Field realities

 

1. Cost

Unless a client has shared a specific number with you, it is inevitable that they will ask questions about cost. Clients sometimes experience sticker shock, but once they are aware of the resources and intellectual effort involved in research, they become more comfortable with it. We never adjust the cost of the work unless actual tasks are being impacted. In these discussions about cost, we more often than not are giving our client the language to share with others who might be the final purchase decision-maker.

The biggest influencer of the total cost of a qualitative research project is the sample size. The larger your sample, the more time the researcher will be in the field. If a client is asking to remove ancillary tasks beyond the actual data gathering, then it is likely the total cost of the project will not go down much. For example, if you include an alignment work session, or a bit of ideation coupled with the findings presentation – they will wonder if removing those tasks will affect the bottom line. Most of the time, removing these small tasks will not impact the total cost as much as sample reduction will.

Researchers are also well aware that cost will be impacted by the methodology. The data gathering techniques (in-person versus remote, for example) will impact the cost greatly. It is your responsibility, as a researcher, to help the client understand the benefits and challenges of method contexts, dynamics, and engagements (we will cover these in another blog post!).

The reality with cost is that the client will always want to find ways to bring it down. You have two options at that point – remove tasks, and/or reduce sample. Researchers need to equip themselves with the knowledge to justify either. Shameless plug: Using the twig+fish NCredible Framework gives researchers the talking points on cost justification around different types of research.

 

2. Partners

Offloading some of the non-intellectual parts of research can be one of the smartest moves a researcher can make. From a proposal perspective, working with partners means less in professional fees and more in expenses. We always share with our clients the rationale for working with partners during this stage, so that they understand how they are gaining.

We work with various trusted recruiting specialists, and also with the client (who may be the best recruiter for certain populations). Meena and I have determined that we are not experts in finding participants, but are experts in describing identifying criteria. And, we simply do not like the tasks involved with recruiting. Our recruiter typically works on the screener (based on criteria we gather), and schedules participants. Our partner is so good at this that it would be almost insulting for us to take it on – so we do not.

We also partner with data gathering services. Sometimes we propose data gathering engagements that require automated fielding and data capture. There are so many tools out there, so it is important to stay vigilant about their evolving reputations.

Another common partnership for us is design and ideation. We have a very short list of trusted visual and interaction designers we work with. We also have a network that can tap into industrial, service, and space design (among so many others). Designing is not part of our wheelhouse, nor do we desire to do it – so we volley that to the experts. We suggest baking in time for the designers early in the project (say, at kickoff), so that they can be a part of the team from the start. Ensure you communicate openly with these partners in particular, as they are directly tied to the lasting, final deliverable for your client.

An undisputed source of burnout is doing unwanted work tasks. We do not want to burn out, because we want to deliver great process and output results to our clients. Outsource the tasks you do not want to do, but justify it in plain language to your client. It is important to be transparent, so that the client has a full understanding of the “what” and the “why.”

 

3. Field realities

Being in the field is a lot of work. There is no other way to state that reality. I will repeat: being in the field is a lot of work. Not only are the days long, they often consist of active listening, supporting client questions, managing logistics, and handwriting (ouch, the handwriting!). Meena and I have learned how to make field research more efficient (cost/time saving), and also how to ensure our needs are met as researchers. Take breaks, and advocate for yourself so that you are always at your best when researching!

We have learned that efficiency is subjective, and varies based on the client – it is something that needs to be discussed at this stage. One client’s expectation of efficiency may be constrained by time, while another’s is constrained by cost. Get a sense of where the constraints are and how this will impact the field schedule.

A big part of research is travel. Making sound travel arrangements that also adhere to the field schedule requires an adept attention to detail (we will write another blog post on travel). Meena and I always like to have full control of our travel arrangements. This goes back to the comment about burnout – we are less prone to this challenge if we have the agency to manage our arrangements. This does not mean we are ostentatious in our travel choices; often, it just accounts for our own biorhythmic patterns and priorities (are you more of a morning person, or a night person? When do you like to shut down? What are personal priorities that must be attended to that never take second place?).

When traveling, we opt for direct flights (which are not easy to come by in Boston), and a safe, reliable car. Beyond that, we believe that if we are away from our families and daily lives, our lodging accommodations should be comfortable, but most importantly predictable. Because of this, we usually stay in reputable hotel brands.

Sometimes, we do not have control of our travel. It’s completely fine to adapt to a client’s travel constraints, but talk about the logistics and the importance of them (and impact upon the research) upfront. When a third party books our travel, sometimes, the additional time spent on travel does not necessarily reduce the overall expense.

Make sure that your client understands what is required to field an efficient study. Use this time to discuss what will result in successful data capture. Have a strong sense of what it means and feels like to be out in the field. A no-win situation is one in which the research team arrives exhausted and, because of that exhaustion, misses valuable opportunities to learn from participants.

 

Your discussions with the client should be productive and tactical. Focus on the pros and cons of various solutions, and be honest and transparent. All of the talking points you share with your client should lead back to one core objective: research is executed credibly. If at any point the research falls into jeopardy because of cost cutting or time cutting, it is the researcher who must provide the rationale for the best path forward.

Now it's your turn to share some great and not-so-great field stories that may have impacted or influenced the way you scope out projects today - share them with Meena @meena_ko #betterproposals. I'm still not on the Twitter bandwagon!

Part 2 - The Budget Calculation (Qualitative Research Proposal Series)

This is Part 2 of a four-part series on proposal writing for qualitative research. Please read Part 1 of this series, which covers crafting an overview as a starting point of conversation with the client.

Budget calculations at the most basic level include estimated costs in professional fees and expenses associated with your proposal. These costs are in fact estimates because sometimes project scopes can adjust total fees, and expenses are never easy to predict.

Regardless of whether you apply a fixed fee or variable fee approach – creating a budget calculation is always advisable. Meena and I have always advocated a fixed fee approach, believing it brings predictability to the process and does not rest on charging for every small piece of work conducted. The key in a fixed fee approach is the confidence you have in your process.

A good budget calculation requires some previous experience and repeatable processes. Because our projects typically involve research and some strategy, we developed a five-phased research approach that can be applied to pretty much any program. Within each of these general phases, there is a lot of consistency to the tasks included. Often, each is tweaked to meet the demands of the research study.

The five-phased approach is a result of both Meena's and my experience. Based on our experiences, we know exactly what it takes to execute a well thought-out and credible research program. We encourage you to tap into your personal experience to determine the best process for you, one that can make crafting a budget calculation more predictable.

On a side note, when consulting, it’s easy to spend just as much time writing proposals as you do conducting project work. Set structures in place so that you are not reinventing the wheel every time you create a proposal – this will save you time and money. This could mean a reusable budget calculator in a spreadsheet, or using a program that you have had good success with.

We have both used project programs, spreadsheets, and other tools to calculate budgets. I used to be a big fan of Microsoft Project because it not only provides tasks associated with real dollars and hours, but it also links up to actual working time (dates). This is great, but MS Project is a costly tool. We use a simple Excel spreadsheet that does just the trick, but does not include actual dates.

The sections below describe how to determine and present:

1. Professional fees

2. Expenses

3. Field schedule

 

1. Professional Fees

There are a number of calculators online that help freelancers and independent consultants determine an annual living wage. If you have not already determined what this is, we suggest exploring these calculators to determine your hourly wage. It is ALSO very important to look at market data on how much freelancers/independents are charging in your region. We strongly advocate doing your research prior to determining your hourly wage so as not to artificially inflate or deflate the value of qualitative research.

Once you have determined your hourly wage, you can enter this as a header column in a spreadsheet. From there, list all the possible tasks that would be associated with the program (one task per line). Do not forget little administrative tasks like the time it takes you to book travel (unless you outsource it, in which case it becomes an expense). Indicate the number of hours (considering an 8-hour work day) for each task, and multiply these hours by your wage. This will yield your professional fee total, ta da!
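As a minimal sketch of that arithmetic (the hourly wage and task list below are hypothetical example numbers of ours, not a recommendation), the spreadsheet logic looks something like this in Python:

```python
# Hypothetical hourly wage and task estimates, for illustration only.
HOURLY_WAGE = 125  # determined from living-wage calculators and regional market data

# Task name -> estimated hours (think in 8-hour work days).
tasks = {
    "Kickoff and alignment session": 8,
    "Draft research protocol": 16,
    "Fieldwork (4 days x 8 hours)": 32,
    "Analysis and synthesis": 40,
    "Findings presentation": 16,
    "Booking travel and other admin": 4,
}

# Multiply each task's hours by the wage and sum for the professional fee total.
professional_fees = sum(hours * HOURLY_WAGE for hours in tasks.values())
print(f"Professional fee total: ${professional_fees:,}")  # $14,500 with these example numbers
```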

Having a solid, well-thought-out description of tasks in professional fees is crucial for the discussion with the client. While we do not share the spreadsheet with our client, it gives us specific line items to adjust. Therefore, when the inevitable discussion about cost-cutting comes into play, we can associate any cost reduction with specific tasks. For instance, “we removed an orientation session with client observers” is a specific task and associated deliverable that is tied to actual dollars. As well, the client can understand where some variability can be applied. Creating a research protocol is a task regardless of whether we meet with 5, 10, or 15 people, so its creation is not affected by the sample or recruitment. The amount of time spent on analysis, however, is affected – and therefore the client can appreciate which professional fees can and cannot change with sample size.

 

2. Expenses

Qualitative research requires plenty of purchases, partnerships, and getting around. If a methodology and sample size are set, then the cost of recruiting is typically the same whether a client goes with vendor x or vendor y. As such, we like to focus the conversation on the professional fees and our value-add as opposed to the number of flights and cost of group session catering.

Expenses typically fall into three categories: travel, recruiting, and supplies.

For travel, we always do research into flight options as well as hotel and ground transportation. Having specific numbers for each travel component (such as hotel, number of Uber rides, etc.) will give a better sense of what it will take to be in the field. We always indicate two multipliers in our spreadsheet – the number of researchers or participants, and the number of days in the field. These help us determine total cost of travel. As well, consider times of year, and availability of flights and hotels: doing this legwork upfront can help manage the client perception of research expenses.
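To make the two multipliers concrete, here is a small, hypothetical sketch (the per-unit costs are our own invented example numbers, not real quotes) of how per-researcher and per-day travel components roll up to a total:

```python
# Hypothetical per-unit travel costs, for illustration only.
flight_round_trip = 450         # per researcher
hotel_per_night = 200           # per researcher, per day in the field
ground_and_meals_per_day = 135  # per researcher, per day (ride shares, meals)

researchers = 2     # multiplier 1: number of researchers (or participants) traveling
days_in_field = 4   # multiplier 2: number of days in the field

# Flights are paid once per researcher; lodging and daily costs scale with field days.
travel_total = researchers * (
    flight_round_trip + days_in_field * (hotel_per_night + ground_and_meals_per_day)
)
print(f"Estimated travel expenses: ${travel_total:,}")  # $3,580 with these example numbers
```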

For recruiting, our strategy is to outsource. We will discuss this at more length in Part 3 (The Conversation/Talking Points). We work with a trusted partner to give us an estimate on recruiting fees, incentives, floater fees, facility rentals, catering (for behind and in front of the glass), and video recording. We typically ask for these as line items so that we know the per-head cost. To bring down costs, video recording is often the first thing to go. Meena and I typically argue that video is expensive and does not yield as much actual value to the client. How often does the client team go back and review the recordings? What will the recordings be used for, and more importantly, how will the recordings be stored? Given that recordings almost always contain personal information on specific individuals recruited for a study, we want to make sure they are given due respect.

Lastly, supplies are always a big part of our research programs. We believe people have an easier time articulating themselves when they do not have to simply sit in conversation. We calculate how much printing, shipping, purchasing, and renting we have to do and make this one lump sum. It is essentially the cost of executing the protocol and conducting analysis. This can also include deliverable printing, which we tend to outsource because our deliverables often involve some printed material.

 

3. Field Schedule

In qualitative research, it is important to be as realistic as possible, as early as possible. In the same spreadsheet in which we have our budget calculation, we also include a field schedule. This helps us visually see how many actual days we are in the field, and when actual travel will occur. A flight out on a Sunday evening is going to be a different price than a Monday morning – for example. As well, from a work-life balance standpoint, there will be times you will have to work on Sunday evenings, or late into a weeknight, but, making it a regular occurrence often sets an unhealthy standard.

We indicate the days of the week (often with actual dates), and list the number of participants that can be engaged on any given day. We always give ourselves enough travel time between participant locations, and enough time to debrief with our client (who is often in the field with us).
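The rough field schedule can live right next to the budget tab. As a minimal, hypothetical sketch (the dates, counts, and notes below are invented for illustration), it might look like:

```python
from datetime import date

# Hypothetical field schedule: date -> (participants that day, notes).
field_schedule = {
    date(2016, 9, 12): (0, "Travel day (Sunday evening vs. Monday morning changes the fare)"),
    date(2016, 9, 13): (3, "In-home visits; 1-hour travel buffer between locations"),
    date(2016, 9, 14): (3, "In-home visits; end-of-day debrief with client"),
    date(2016, 9, 15): (2, "In-home visits; return flight in the evening"),
}

# Tally participants and the days actually spent collecting data.
total_participants = sum(count for count, _ in field_schedule.values())
field_days = sum(1 for count, _ in field_schedule.values() if count > 0)
print(f"{total_participants} participants over {field_days} field days")
```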

Once we have a rough field schedule, and the proposal is moving to signature, we then populate an actual Google calendar with these dates. This is a calendar shared between Meena and me, and we use it to see if there are any conflicts. It is usually a separate project calendar that can be deleted or hidden once we are finished with the program.

 

Your budget calculation is your hard-and-fast numbers. It is crucial to give it the diligence and time it deserves. All the details count here, even if they are not ported into your actual proposal. Budgeting is a skill – and not something that comes to everyone right away. We feel comfortable with our process because we have done it so many times. Be sure to set aside time to reflect on the budget, and the fees and expenses applied, so that you can continuously make the process more accurate as you evolve in your profession. Note details about the client interaction as well – and where your time might be eaten more so than on other projects. This permits you to allow more time for the “client that needs extra coaching,” so that it can be explained in those very terms when costs are brought up.

What are some costs you consider? Tweet Meena (I ain't on the Twitter!) @meena_ko for #betterproposals.

Part 1 – The Overview (Qualitative Research Proposal Series)

This is Part 1 of a four-part series on proposal writing for qualitative research. Read the introduction for context on why we started this series.

Qualitative research is full of nuanced details. Qualitative research by nature embraces the unquantifiable parts of being human. In a proposal, we are trying to identify and account for those “hard to quantify” elements. The nuance is often to the researcher’s (and client’s) benefit, allowing us to tailor each study to particular needs and constraints. However, for the person reviewing the initial pass at what the approach might look like, these subtle details can be overwhelming.

This first discussion glances at a high-level breakdown of the proposal.

Prior to drafting a full proposal with assumptions and terms, we always create an overview document that walks through five sections. Each section is usually one page and consists of:

1. Research objectives

2. A suggested approach

3. Research phases

4. About us

5. Case studies

 

1. Research objectives

The objectives are a repeat-back to the client of the program intent, usually in paragraph form and very brief. We reference strategy context to root the research at a scale meaningful to the business. We always include initial questions that team members have raised, and describe the outcome or the final deliverable.

 

2. A suggested approach

We always consider sample and study design in the overview, and begin with a potential methodology and recruiting rationale. Without being too prescriptive, we use this page to educate the client on all variables needed to craft the study.

The sample outlines who it is we intend to recruit for the study. Given that “who” is very open-ended, we further reduce the sample to identifiable behaviors, demographics, psychographics, and aptitudes that we seek to recruit.

The study design does not lead with methodology: instead, we reduce methodology to the most basic common denominators to avoid any confusion, by describing potential contexts, dynamics, and engagements needed for the study. These topics will be explored in further detail in subsequent blog posts.

Given that the suggested approach has an associated sample and study design, the proposal can then serve as a conversation-starter to have with clients, so that rough cost and timing can be evaluated. It also serves to educate what factors within the sample and study design can affect cost and timing, giving the client a better sense of what they can, and cannot control.

 

3. Research phases

We cover the phase intent, tasks, cost, and timing in a visual table to provide insight into the study arc. From our perspective, using this table format keeps the process predictable and relatively easy to replicate across research studies. It also allows clients to walk away with a “1-pager”, which can be shared and internalized easily within their organization. We focus our efforts on making the methodology and recruiting rationale tailored to each client, but in general the key steps in the process form a standard for all qualitative research.

 

4. About us

We always include a paragraph about twig+fish, our philosophy, and how we like to work with clients. Every freelancer, organization, team (whatever you call yourself!) needs to have a research perspective. Share it! We also include individual bios of ourselves, with a nice headshot of course!

 

5. Case studies (optional)

Every so often, we will get a lead for a project in which the potential client does not know us at all. In a scenario like this, we will provide a two-paragraph anonymized narrative to further demonstrate our credibility in the space. Making it short and readable is key – at this point it is simply about demonstrating know-how, not getting into the weeds of details.

 

The purpose of the overview is to help the client understand what information we need to define the optimal study design. Using it as a starting point for discussion, the client can then consider what they already know and have that may further inform the study design.

As mentioned earlier, a budget calculation is always included in this overview. In Part 2, we will discuss considerations for the budget calculation, and how we create it. Stay tuned for Part 2 coming tomorrow!

We want to hear your thoughts! Tweet Meena (@meena_ko) with what you always consider in crafting a proposal - #betterproposals.

 

Writing Qualitative Research Proposals - A Four-Part Series

Musical performances with complex stage setups can end in one of two ways: a beautiful choreography or a tragic mess. In crafting contracts with venues, the musician’s representatives sometimes insert random requests in the instructions – e.g., remove all blue M&Ms from the singer’s candy bowl. These random requests can be interpreted as diva-ing, but they reveal far more. They are a means to vet the venue and ensure they have read the entire contract. Are they really paying attention to all the details? Knowing that the contract has been thoroughly read, and all details given due diligence, typically implies that the show will go off without a hitch.

In qualitative research, much like in complex stage setups, there are a lot of moving parts. These moving parts are a reality and a point of vulnerability for the researchers doing the work.

Meena and I are sometimes asked about our proposals: how do we write them, and what do we include? In the spirit of transparency, we wanted to share the key components of our proposal process.

In this four-part series, we will share how we craft qualitative research proposals. Going into the writing process, we have some information but not all. We use the proposal process as a means of beginning a conversation: one that not only focuses on the study design, but also touts our knowledge and flexibility in crafting a research study that meets the needs of the client while preserving the integrity of a credible approach. As we gather this information, we keep the conversation going by providing more project facts. We always begin at a high level, and then end with more detail.

Part 1 will cover our initial proposal overview. Rather than investing a lot of time in a detailed write-up, we give our clients a sense of the project trajectory.

Part 2 will go into considerations for the budget calculation. We include the budget calculation in our proposal overview, but in this part we will dive deeper into the minutiae of fees and expenses.

Part 3 will address the conversations and back and forth that ensue as a result of sharing the proposal overviews with the client.

Part 4 will address the robust proposal itself, which includes all the logistics and expectations.

We encourage others in our domain to weigh in. We would love to hear other processes that have worked for you. Last year, for instance, Meena attended a conference in which a freelance researcher described the reason why he very transparently shares his budget calculations and hourly rate with clients. While it is not something we practice, it was an interesting perspective to consider in some situations.

Stay tuned for part 1 (coming tomorrow).

In the meantime, tweet Meena (I’m not active on Twitter) with some of your proposal best practices - @meena_ko #betterproposals.

The Spaces Between Things

When I was in elementary school, I learned about using negative space in my art work. The idea of focusing on the spaces between things, rather than focusing on the subject of the piece was completely novel to me, shifting my mindset of how art should be done. Rather than taking a piece of paper and turning it into a picture of the sun, I imagined the sun and made it fit into the piece of paper.

Today, we had a client in town for a dedicated period of time to develop an in-depth protocol – that was the subject of her visit. For a multi-method ethnographic approach, we needed to wade through the complexities of social dynamics in the commercial environments of our potential study participants. We also needed to consider the almost endless possibilities of sample permutations. Rather than being able to select our participants, the “recruiting” approach centered on work sites, not necessarily individuals, adding a layer of challenge to identifying who we would speak with.

Beyond the recruiting challenges, we had to design other uncertainties into our study. Not everyone would be able to devote the desired time frame to speak with us, for instance. After two days of brainstorming, distilling, and documenting the protocol, we all decided we needed to step away from the process and take a break.

We decided we needed to stop focusing so intently on the subject of her visit, and start to look at the space around the subject of her visit. We decided to shift our attention to mental replenishment, just for a moment, so that we could re-center on the topics at hand. So we had a brainstorm while we got pedicures.

Sitting in our chairs, we selected our massage settings and briefly discussed our color selections, and any odd reflections on our feet. Feet are funny, personal parts of our bodies! In our mini-break we also took the time to observe our research selves as we developed the protocol. We spent some time discussing the need to step back, take a moment, and examine the in-between spaces that appear between the doing of work. In that discussion, we shared thoughts on what it means to be a studier of people, and how much mental energy it requires.

We reflected on a few areas that require looking at the in-between for others, so that they can understand the actual subject of the piece. They are:

1. Describing how we will learn about people. To be good at your job, you need to care about what you do. To be the kind of person that cares about what you do requires quite a bit of mental and emotional energy. This energy, which I hope to talk about in another post, is channeled into helping others understand what needs to be incorporated into a study to gather necessary data points. Oftentimes in research, we go through the motions of gathering data, but don’t spend nearly as much time as we should in helping the consumers of research understand its power and limitations.

2. Taking time to observe people, generally. It’s a common “hobby” to people watch, especially if you find yourself sitting in a café alone with no laptop, but it’s our job to people watch. As we dried our nails, we suggested to our client that while she’s at the airport she take notes on the people she sees around her. As we discussed what she could do in taking notes, we added, “but it’s really hard to turn this people watching lens off.” Meaning, simply, that once you start to observe people with a purposeful eye, it’s much harder to stop; once you become an observer of people, you struggle not to be one.

3. Being honest about not liking the same music. We heard a song by Grimes come on in our café, and while I quite enjoyed it, my two colleagues were annoyed. We spent some time talking about her music, and then just music in general in coffee shops (where we were). For some people it’s a distraction and for others it’s a way to get the mind working. We examined our own relationships with music in public places and how we manage it. At that point, one of us put on headphones to drown out the noise and work, one of us turned on our own music, and the third simply enjoyed the atmosphere.

During this week of protocol development, we reflected on the spaces in between the work we were doing. This helped us establish confidence in our protocol and feel more deeply connected to the work we were doing. Rather than being laser focused on the task at hand, we looked at the in-between moments during our week as a point of inspiration to get our work done.

The Importance of a Human-Centered Research Philosophy

A couple of weeks ago we presented at UXPA Boston 2016 - an awesome conference with over 1,000 attendees! We had a great time with all the speakers and catching up with old colleagues. We presented our thoughts on the weaknesses of forming study designs around method, sharing several alternatives to shift the dialog of research in organizations. One of our alternatives is to talk about a personal research philosophy. With a sound, communicated philosophy, one that's public and known to the business, we can begin to move away from crafting research approaches by simply focusing on the data collection method.

As a result of this presentation, some folks have asked "what is a research philosophy and how do I begin to articulate my own?" This prompted us to reexamine our own philosophy and evolve it to a state that we presented for the first time on Monday. While it's still fresh in our minds, we were able to gauge the reaction and questions from the audience so that we can improve our message.

Because we are human-centered researchers, our philosophy must rely on a truth that applies to everyone: people exhibit observable behaviors. It's not simply our study participants that exhibit behaviors, but also the internal teams that we work with. Because we know we can rely, to some extent, on these observable behaviors to help us understand people, we use this as a basis for how we form our studies.

Our philosophy is simple. Our job is to support the people we want to observe in being able to exhibit their behaviors. As such, in crafting the instruments of research, we look at the context of inquiry, the activities that will elicit candidness, and the dynamics that feel socially sound. More specifically, in each research study we aim to:

Represent the Context. It's all too often that researchers are tasked with pulling a research process into a lab setting or into automated/remote data capture tools. While there is a time and place for these tactics, the richness of true life is lost in the contextual compromise. There has always been dialog in our field about the importance of context, but in practice contextual inquiry and ethnographic research are seen as outlier approaches, reserved for special situations that warrant them. We argue that context is everything and must be central for us to truly understand people. Context-rich studies should be the norm, not the exception.

When we do research with participants, we want to be in their place of operation (whatever that may be). The same applies to teams. We do in-person meetings, we love getting tours of office spaces, after a client meeting we hang out at the cafe down the street that the design team might go to. 

Use Expressive Activities. People are emotional beings, but not always articulate beings. We focus on helping people pull out their emotional selves in conversations so that they may feel more comfortable in sharing their wants and needs. We borrow techniques from psychology and therapy (but of course with severe caution) that are often more creative and engaging. Rather than centering the conversation around the client's desired topic at hand (say, their offering), we urge teams to abstract the conversation to a level that is related but not direct. People might have a more challenging time discussing how they make investing decisions with our client's online tools, but they may have an easier time talking about decision making for their household, generally.

Working with teams, we also try to use activities that keep the teams engaged and ready to share. We assign homework before most workshops, and use game mechanics, storytelling, and other interactive techniques to help teams not simply articulate how and what is important for them to get their work done, but also to share deeply with their colleagues.

Reconstruct the Social Dynamics. Individual, dyad, group - these are the common social constructs considered in crafting research studies, but in focusing on the construct based on sheer sample quota, we may lose out on the true social dynamic needs of a study. Some studies require a reality check (maybe a good friend to keep the study participant honest). Some studies require the input of both decision-makers and doers; other studies might be so personal that even the presence of a researcher is too fraught with bias. We make it clear that dynamics are fundamental to a study design and should not be overlooked.

When working with teams, we use a number of approaches to disarm and encourage a more socially equitable environment. One of our favorites is "shoeless analysis." As a team returns from the field with their data and stories to share, we encourage everyone to remove their shoes. It shifts the dynamics of the team, disarms, and empowers at the same time. 

The philosophy you craft needs to mean something to you. Ours is based on a fundamental belief that our job helps people share their experiences. If we are not able to do this, we haven't done our job well. When you have crafted your own research philosophy, you cannot simply stop at communicating it. You have to see how it turns into the deliverable mechanics of a research study, and how it maps to the tasked work at hand.

What a philosophy provides is a beacon for your work. There are moments when you will not be able to execute a study design as you'd ideally like; however, you will have something to point to so that others may understand how best to use your skills for the business.

We would LOVE to hear about your research philosophy and how you communicate and execute on it. Please send us your own! 

User Experience Research is Better for You than Your Offerings

Standards and definitions are often unifiers for a field of practice. Common approaches, philosophical roots, and success metrics operate to pull practitioners together so that they may understand how they fit into a broader business context.

User Experience (UX) research is uniquely not unified in this way. We have varied definitions of who we are, how we do our work, and what it means to have done the job well. Personally, I love that our field operates like this – it gives us even more of a qualitative edge. We are skilled at managing and communicating the ambiguities of people’s lived realities – and we can’t always put a number on how we come to know and leverage those realities.

Without such concrete boundaries holding our field together, the reaction can sometimes be to establish our value with ROI or other quantifiable metrics. Mapping the outcomes of research directly to a design decision that then affected some kind of change in user/customer behavior is sought after, but damn hard to pinpoint. In parallel to the charge for these quantifiable metrics is the demand that UX practitioners operate more like arbiters of design decisions, ushering cross-functional teams through the mess of creative energy.

The dialog on our value needs to shift toward a more human-centered approach. Right now, we measure the distance between old offering and new, as observed through behavior change. We are measuring the entire value of the User Experience team through a single factor of the team’s potential impact. The offering is touched by so many hands, and often gets to the UX team late, so what are we measuring when we use behavior change as the metric?

This is not to say that we should not look at behavior change as a success metric. But it would be more meaningful if it were internal teams’ behavior changes, not users’. As such, User Experience’s impact might best be unified and measured through an HR lens. User Experience teams can impact company culture more than they can impact offering outcomes.

As an industry, we should set the expectation that our work should be considered across various cultural factors, such as communication, wellness, and performance. In more detail:

1. Communication. Well-executed research is able to pull together cross-functional teams, resolve debates, and establish internal folklore about users. It also can do this for the internal team. The work it takes to align teams on research questions, discuss identifying criteria for participants, and share research findings relies heavily on communication skills. Researchers are fabulous communication facilitators between and among teams, and their value is probably best measured by their ability to make the collaboration points meaningful.

2. Wellness. The time and space it takes to develop a credible research study allows teams time to reflect on the work they are doing. Rather than operating on autopilot, User Experience research studies give everyone a moment of pause to think more empathically about the user. Introducing more meaning and context to the work that is being done has profound effects on employee emotional wellbeing.

3. Performance. When done well, research has actually been demonstrated to pull creative teams together more effectively than heavily process-based approaches. The abstract, more qualitative nature of User Experience research tends to encourage teams to seek more points of convergence along the nebulous creative path. Research activities become important milestones along a project, and the scope of that research can reveal whether a project is doing well, falling behind, or perhaps becoming asynchronous with other parallel processes.

The way we value User Experience research shouldn’t rest solely on its expected outcome; that will set our industry up for failure. If we look at our success more through an HR lens, we might have more room to express our value to the business, and it also gives us the freedom to use our skills for internal betterment, not simply offering betterment.

 

"Coming Soon..." - practicing accountability

Accountability is a fitting topic for two freelancers working together on logistically complex projects with often ambiguous interpretive processes. As we embark on BLOGGING, I thought it would be best to avoid a "coming soon" post that leaves a too-predictable cliffhanger - no, it won't be here soon.

I wanted to open this blog with a post on accountability. Arguably, this blog will be the most back-burner task we have on our list. Keeping ourselves accountable for the content we put out into the world will be important.

No accountability secret sauce exists for us other than having a mindset to look beyond the many solutions out there that hold us responsible for the work we do. Sure, we make checklists, stay up-to-date with calendars, ask for feedback and set deadlines. However, there's a lot that we don't have evidence for that keeps us on top of our game. 

Accountability does not simply involve the doing of things, the completion of things, or the proof that something is being taken care of. It involves having the mindset to offer something to others and ourselves. Here’s what we strive to do:

1. Imagine the narrative. Rather than understanding our tasks by their completeness and quality, we think about them in relation to other parts of our lives. By contextualizing tasks to a broader set of priorities, it's easier to picture driving them to completion. We like to think of work in terms of "Fun and Gains." If we’re getting neither enjoyment nor tangible benefit from the experience, it's likely that we won't feel the need to be accountable. 

2. Be present for others. When we work with clients, partners, and peers, we are placing some kind of burden on them. Whether it's because we’ve asked for their collaborative energy, told them a story to solicit support, or cracked a joke expecting them to laugh, we do our best to be mindful of what it is we are requesting from them in that interaction. If we don't fill the world with noise (the sound of our own voices), we might gain something in the end.

3. Be curious, not judgmental. While this principle can be considered in all contexts, in reference to accountability its focus is on our treatment of ourselves. If we are having trouble driving a task to completion, we don’t dive deep into feelings of guilt or resentment. Our goal is to wonder why the trouble exists, and work on that first. Sometimes when we are hitting a wall with a deliverable, it’s because we are overworked, bored, or need some inspiration. Taking some time to tap into fulfilling those needs usually gets us to the point of moving forward.

With all that said, stay tuned for our next post. In the meantime, we would love to hear your thoughts on how you stay accountable to yourself and others.