Procedural Knowledge Search by Intelligence Analysts
Survey Study
Discover how to support searchers as they learn to complete work tasks
Project
Client: A Department of Defense organization focused on helping intelligence analysts use an internal knowledge base to find information about work-related tasks.
Timeframe: 26 weeks
My Role: UX Researcher
Team: Sarah Casteel, Bogeum Choi, Rob Capra, Jaime Arguello (UNC Interactive Information Systems Lab members)
Methods: Survey, Thematic Analysis
Tools: Microsoft Word, Excel, PowerPoint, and OneDrive
Project Overview
Research Questions
What kinds of work-related objectives motivate analysts to search the knowledge base?
What kinds of information do they look for when they search the knowledge base?
What relevance criteria do they use to judge the usefulness of information?
What challenges do they face when searching the knowledge base?
Client Engagement
The UNC study team held a kickoff meeting and bi-weekly remote meetings with our research partners and stakeholders.
Approach
Our survey asked intelligence analysts to recall and describe specific instances in which they searched their knowledge base.
We asked participants to describe their potentially classified work-related tasks using unclassified analogies such as investigative journalism.
Data Collection: Survey
Survey Milestones
Survey Design
Partner Pilot
Revisions
Participant Recruitment
Survey Receipt & Analysis
Reporting & Outcomes
Survey Design
Survey Design Considerations
Restrictions on In-Person Research
Participants’ Comfort
Participants’ Access to Data Collection Tools
Restrictions on In-Person Research
Our survey study started as a semi-structured, in-person interview study.
In-person research activities were suspended due to COVID-19.
To keep our research moving forward, the UNC study team converted our moderator guide into a participant guide and our semi-structured interview questions into survey questions.
Participants’ Comfort
We chose a survey over remote interviews to prioritize participants’ comfort.
As study planning progressed, our client advised that confidentiality would encourage more detailed survey responses.
The UNC study team chose a format that would allow us to distribute study materials through our research partner so that we would not learn participants’ identities.
Participants’ Access to Data Collection Tools
We used Microsoft Word because:
It was readily available and accessible to participants.
Participants could save their responses and return to finish them when needed.
Partner Pilot and Revisions
The UNC study team piloted the survey questions with representative users.
We then prepared an instructional video to address the specific areas where the pilot showed participants would struggle.
Pilot Feedback
We piloted our survey questions with our research partners who had relevant analyst experience.
Our key takeaway was that analysts would likely struggle to describe their classified work in unclassified terms.
Revisions
To address feedback from analysts, we prepared an instructional video with an example of how to use an analogy in a different work domain (journalism) to talk about classified work tasks.
We explained that using an analogy like this would help us understand that their task involved:
Finding alternative solutions to a problem.
Comparing the alternatives.
Selecting the best one based on specific criteria.
Survey Questions
Our survey asked three “general” questions about the knowledge base.
The survey also asked participants to recall and describe one negative and one positive experience they had using the knowledge base and respond to the same ten questions about each instance.
General Questions
What do you like about the knowledge base? Why?
What do you dislike about the knowledge base? Why?
What challenges do you encounter when using the knowledge base?
Specific Instance Questions
What were you looking for?
Why were you looking for this information?
What did you already know about the topic?
What knowledge did you use to support your search process?
Did you find what you were looking for? Please describe.
How much did you already know about the knowledge base?
What features of the knowledge base did you use?
What steps did you take?
Did you encounter any difficulties? If so, please describe.
What affected your search (e.g., helped, hindered)?
Participant Recruitment
Our solid research partnership ensured that recruitment ran smoothly while protecting participants’ confidentiality.
Participant Confidentiality
To maintain participants’ confidentiality, the UNC study team relied on our research partners with security clearance to recruit participants.
UNC study team members prepared and provided recruitment materials.
Research partners advertised the survey study on internal mailing lists and forums.
Client-affiliated reviewers double-checked survey responses to ensure that the UNC study team did not receive any classified information.
Data Analysis: Thematic Analysis
The UNC study team reviewed a subset of the data and, based on our research questions, selected four dimensions to code (a hypothetical tallying sketch follows the list below).
Coding Dimensions
Work task objectives
What were analysts trying to do?
Information types
What type of information (blog, forum post, other) helped or hindered searches?
Relevance criteria
What made analysts decide that the information was helpful?
Challenges
What difficulties or barriers did analysts encounter?
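The coding itself was done by hand in Word and Excel. As a rough illustration only, the sketch below shows one hypothetical way coded excerpts could be structured and tallied per dimension; the CodedExcerpt record, its field names, and the tally_by_dimension helper are assumptions for this example, not our actual workflow.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical record for one coded survey excerpt. Field names are
# illustrative; the team's actual coding was done manually in Word and Excel.
@dataclass
class CodedExcerpt:
    participant: str
    instance: str   # "positive" or "negative" experience
    dimension: str  # one of the four coding dimensions
    code: str       # the specific code applied to the excerpt

def tally_by_dimension(excerpts: list[CodedExcerpt]) -> dict[str, Counter]:
    """Count how often each code appears within each coding dimension."""
    tallies: dict[str, Counter] = {}
    for e in excerpts:
        tallies.setdefault(e.dimension, Counter())[e.code] += 1
    return tallies

# Illustrative data only, not real survey responses.
sample = [
    CodedExcerpt("P01", "positive", "Work task objectives", "understand"),
    CodedExcerpt("P01", "negative", "Challenges", "vocabulary problems"),
    CodedExcerpt("P02", "negative", "Challenges", "vocabulary problems"),
]
for dimension, counts in tally_by_dimension(sample).items():
    print(dimension, dict(counts))
# Work task objectives {'understand': 1}
# Challenges {'vocabulary problems': 2}
```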
Reporting and Outcomes
The UNC study team summarized our findings and reported them to our research partners and in a published paper.
Research partners communicated actionable next steps to the right internal audience.
Reporting
Findings
Users reported a wide range of work task objectives.
Most objectives aligned with the understand (e.g., understand the purpose of a tool) and evaluate (e.g., determine the root cause of a problem) cognitive processes.
Users sought five main information types: background information, definitions, procedure applicability, detailed steps, and advice.
Users judged information as relevant based on: intended audience, level of detail, specificity vs. generalizability, task requirements, and author information.
Challenges included: wading through information, vocabulary problems, information quality and redundancy, and gaps and category mismatches.
Desired system features included identifying related concepts and providing explanations for search results.
Recommendations
The UNC study team recommended socio-technical (e.g., behavioral) and technical (e.g., algorithmic) approaches based on our findings.
We prioritized “quick wins” that could be implemented rapidly to improve knowledge base users’ experience.
Outcomes
Research partners provided positive feedback, specifically for insights that aligned with quick and easy implementation options.
They adapted the UNC study team’s actionable next steps for the appropriate internal audience.
The UNC study team’s paper, Procedural Knowledge Search by Intelligence Analysts, was published at the 2022 ACM SIGIR Conference on Human Information Interaction and Retrieval (CHIIR ’22).
Example Deliverable
Participants described a wide range of work task objectives (Figure 1).
For most work tasks, searchers strove to understand and evaluate.
Given the range of specific objectives, knowledge base systems need to support a wide range of uncommon tasks and scenarios.
Reflection
If I could repeat this project, I would:
Pivot quickly. We initially waited to start the study with the goal of conducting in-person interviews, which delayed data collection. In a post-pandemic world, I’d pivot quickly to a remote or asynchronous option.
Choose a different asynchronous format. Sending follow-up questions through our research partners took valuable analysis time. I’d work with our research partners to find an asynchronous format that would allow the UNC study team to send follow-up questions immediately after getting participants’ initial responses.
Keep communicating. Communication and collaboration throughout this project contributed to success. I continue to practice clear and consistent team, partner, and stakeholder communication.