The Culture of the 5 Whys: Or, How Not to End Up Building a Mass Transit System in a Small Town with a Centralized Population

Ahhh, Lyle Lanley! A project manager’s worst fear. A shiny, charismatic confidence man who deftly convinces a group of mainly heuristic-based decision makers to approve and prioritize a project that nearly obliterates a town. Meanwhile, the voice of more algorithmic decision logic, Marge Simpson, who wanted to repair the potholes up and down Main Street, is abruptly silenced with the reply, “Sorry Marge, the mob has spoken!”

Over the course of more than 625 episodes, we have learned that Marge is usually right, and is a consistent purveyor of good advice and solid decision making. We have also learned that Homer has his own unique way of thinking through the outcomes of his choices.

However, despite Marge vs. the Monorail being one of the classic “What did Homer do now” episodes, it is fair to suggest that both styles of decision making voiced in the Springfield Town Hall have value. Heuristic decision making is based on quick and simple choices: mental shortcuts that can feel like the best decision in the moment but may not always be right for the longer term, or free from personal bias. Algorithmic decision making attempts to derive the “right” answer and eliminate anything irrational from the outcome, but algorithms take time and have difficulty factoring in the nuances of human dynamics, like how regal that sash makes Homer look!

So, why did I want to blog about the 5 Whys? Well, much like the study of failure offers insight into how humans think, so does the examination of how we ask questions and make decisions. At its core, asking questions is about problem solving. We ask questions to acquire and assess information about a topic, and use this as criteria to make decisions. It sounds straightforward, doesn’t it? We must all be experts at decision making, given that the average person makes hundreds of decisions a day, from what to wear, to a personalized Starbucks order, to how close you think you are to the signal as the light turns from green to yellow. But decision making can be a deeply challenging endeavor. Philosophers from Plato to Hume have wondered whether there is room in decision making for anything other than pure logic, or whether reason should follow passion. Revisiting Plato’s ship of state metaphor, we know he believed that decision making should reflect both our knowledge and our values. Socrates spoke about asking questions and decision making in terms of developing citizens to “produce a flourishing city-state” through reasoning and logical debate. One could conclude that the objective of questioning and reasoning your way through issues is to cultivate a culture of appreciative inquiry, with the outcome of improving the quality of work and life. This would apply to individuals as well as teams and organizations. I mean, we all want to know why, right?

Everyone has heard at least once that the quality of our lives is directly based on the decisions we make. Such adages as “you are what you eat,” “you are what you read,” or “your face is a ship of state!” So, it would follow that good decisions are predicated on a robust process of inquiry for making them. Two notable techniques for asking questions to arrive at a decision or conclusion are the 5 Whys and the Socratic method. The 5 Whys is an attempt to identify the root cause of a problem by asking a more probing and clarifying question with each new piece of information received. Ideally, you work through the perceived symptoms of an issue until you reach the true problem. The Socratic method is a process of hypothesis elimination, wherein you search for general truths and analyze them to determine their consistency with other related beliefs. The Socratic method also utilizes a sequence of questions: opening questions to generate a discussion, several guiding questions to elaborate, and finally closing questions to summarize the feedback provided and bring the discussion back to the original problem statement. The 5 Whys and the Socratic method are only two of the options available for approaching decision making. In general, a good first step is developing a habit of asking questions and advocating for information. Just as we would tell our patrons to do. Just as a democracy should function. Advocate for information!
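The 5 Whys drill-down can even be captured in a few lines of code. Below is a minimal, hypothetical sketch in Python; the problem statement and the chain of answers are invented for illustration, not taken from a real analysis.

```python
# A minimal sketch of recording a 5 Whys chain. Everything specific here
# (the problem and the answers) is an invented example.

def five_whys(problem, answers, max_depth=5):
    """Walk a problem statement through successive 'why' answers.

    Returns a list of (question, answer) pairs; the last answer is
    treated as the *candidate* root cause, to be verified in person.
    """
    chain = []
    current = problem
    for answer in answers[:max_depth]:
        chain.append((f"Why? ({current})", answer))
        current = answer  # each answer becomes the subject of the next why
    return chain

chain = five_whys(
    "Patrons can't find the new finding aids",
    [
        "The records aren't indexed",
        "The export job failed",
        "A schema change broke the exporter",
        "The change wasn't announced to integrators",
        "There is no release-notes process",
    ],
)
root_cause = chain[-1][1]
```

The value is less in the code than in the discipline it encodes: each answer becomes the subject of the next “why,” and the final answer is only a candidate root cause until someone goes and sees it for themselves.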

With any decision-making body, the goal should be to follow a consistent methodology that helps accurately identify the problem, business need, or functional requirement, so a valid and sustainable decision can be developed. If no investment is made in asking questions about the root problem or need, any investment made in a solution will be out of alignment. Other approaches that help gather information and ask questions include peer review and benchmarking. Inevitably, someone out there has attempted what you are considering, with varying degrees of success. Asking your peers questions about their decision-making process and experience can produce invaluable insights. How did they evaluate the information for a decision? What were their resources and constraints? With hindsight, would they do the same again? Did it put their town on the map?

Evaluating vendors and making site visits are other helpful information gathering steps to take before making a decision. Throughout our implementation of the ArchivesSpace public user interface at Yale, we made multiple “site visits” talking with the good people of Harvard, University of Louisville, NC State, and Georgia Tech. They shared with us what they decided, how they did it, and how they were going to continue. In turn, we shared, and will continue to share, the artifacts of our project team, and outcomes of implementation. (In fact, I think we still owe GT our request de-duplication process). Part of the original intent of the 5 Whys was to physically go and see the problem being described, or the solution in place, and not ask questions from too far a distance. And following up with a vendor to validate they can deliver what they promised is essential. Don’t sign anything until you do!

There is an artistry to question asking. As a facilitator or business analyst, you want to be mindful not to lead a person too much, or to ask only binary questions. Question asking should be designed to help us validate what we think or know, and help us determine what is unknown. It can be much harder to identify what is unknown; however, even acknowledging that there are unknowns is an important step in decision making. One approach to try is called Humble Inquiry. Try asking each person to say one thing they don’t know about the proposal in front of them, and offer to go first to show there is no stigma in not having all the answers. The questions asked by a decision-making body should be constructed in such a way that they can challenge any assumptions inherent in the problem statement or project proposal. For example, if a member of the team states we should make decision A “because it’s the best and others are doing it already,” then there should be follow-up questions that help document why it is the best (what are the points of comparison), and questions that document how your organization is similar to and different from these other places. Advocate for that information. Another important question is “does the solution meet the stated business need?” And again, document how and to what extent it resolves the need. Questions should strive to broaden our view of any problem or statement being presented. Questions should prompt us to consider alternatives, including low-tech or no-tech solutions. They should also help an organization determine if the timing is right for a decision. Questions such as “What if we waited 6 months? Or a year?” are good examples to generate discussion about the issue and describe the near-term environment. Asking exploratory or adjoining questions can be useful to take a proposed solution through several variations and perspectives.
These could be “what ifs?” and “how does this idea relate to that idea?” And be sure to ask clarifying questions. Even in a one-on-one conversation, something can be misunderstood.

I believe revisiting former decisions is also a worthwhile exercise in an organization. Folks shouldn’t rush to say, “but we already tried that,” or “it didn’t work before.” The data points may have changed, other variables may have become obsolete over time, or new ones may have been introduced. The conditions of the markets we work in are always evolving. Developing a culture of question asking, or appreciative inquiry, requires investment from management. They need to set the example and incorporate question asking and information validation into the organization’s decision, prioritization, and planning procedures. As children, we ask questions to learn and process information about the world around us. Continuing through university, we are encouraged to learn and evaluate by asking questions, and by engaging in intellectual debate and dissent. But after university a shift begins to occur, and greater power and value is placed on stating the answer. Being certain about your choice. Being right. Acting quickly. Then asking questions may be viewed as disruptive, or as an attempt to slow progress. So the person who asks, “Is this the best choice for us?” or “Is this sustainable over the next four years?” may not be viewed as a team player, because they aren’t tacitly agreeing.

Using the 5 Whys is also useful for identifying user stories, or functional requirements in a project. “As a processing archivist, I want to unpublish selected components in a series, so that I may prevent sensitive data from appearing in a public search result in the brand-new fancy awesome Archives at Yale discovery tool.” You have to logically walk through not just what you want, but why you need it, and what value it generates. A good 5 Whys interview could result in not only process improvements, but better product design sure to please even the most intractable of stakeholders.

Another important time to ask questions in a project lifecycle is during the After Action Review. I have previously blogged about Before Action Reviews, but the AAR is another phase in the project lifecycle to spend time in reflection. These complementary, question-driven periods in a project help send the message that what we work on is important, and so is how we work on it. The AAR typically asks three questions: “What went well?” “What could have been refined?” and “What would you change next time?” The first two questions reflect on what has happened, and the third gathers feedback on how the organization could change its process for future projects. Ideally, the information gained should evolve into knowledge applied. By conducting an AAR and asking these questions, you may be able to determine why a problem was not detected during a project implementation, or, if it was detected, why it was not prevented (or mitigated).

The 5 Whys will not produce a solution for every type of problem or proposal in an organization, but I mention it as a starting place for business analysts and project managers. Developing this skill as part of your workplace culture should lead to more open conversation and better overall problem identification. A questions-positive culture is also likely to be more creative and innovative, by honoring ideation, iterative design, and exploration. And remember, sometimes it’s OK to just let go, knowing the cosmic ballet will go on, and with the help of science (and doughnuts) everything will resolve to a happy ending.

 

What Could Possiblye Go Wrong? More on Project Management Themes as we Implement the ArchivesSpace PUI

Failure. What don’t I love about failure? Yes, it’s true, my Netflix queue is full of disaster-recovery documentaries, and for *fun* I read case studies about large-scale failures (Note to self: Get dictionary for Christmas). And I love it not for the reasons you might think. I don’t like wreckage, rubbernecking, or even commotion. I don’t ever want people to be in pain. I mean, I cover my eyes while watching Law & Order! I like examining failure because it’s an opportunity to learn how people think, and the assumptions they make. It’s human to fail, but from a young age we are socially conditioned to avoid it at all costs, which, ironically, often leads to failure.

I have a very high comfort level with failure. Always have. Hello, old friend. I’m naturally a very serious person. Not that I don’t have fun, or at least my version of it, but in general I can think too much, worry a bit too much, and when I’m all up in it, I can sometimes forget the standard social norms and graces, like polite nodding and small talk. Over time I’ve learned to provide folks a heads up about it. I might tell them not to look directly at my face when I’m thinking, because you’ll think I’m trying to kill you, but I’m not! I just have a terrible thinking face, because I might let everything physical go slack, so I can direct all my internal resources to my sequence of thought. Some of this comes with the job of being a PM. You are trained to look for potential sources of risk and failure, the one thing that everyone else might miss. So you wander a little further into the rain sometimes. What you’re doing though, is trying to protect everyone. That’s how I feel about it. It sounds strange, but by thinking about failure you are trying to bolster their success. I might issue a preface such as, “I’m going to be me for a moment here, and do six months of thinking in the next 3 minutes, so buckle up.” Sometimes I feel like I’m sticking out in the day-to-day, and making folks a bit uncomfortable with my serious outlook. So inevitably when a failure comes, or an emergency, or crisis, I don’t mind as much, because suddenly I become the most normal person in the room. When everything is going up in flames, I’m just like, “Dude, bring it!”

It’s interesting to try and pinpoint where failure begins. Failure can be cumulative or discrete. One of the most referenced examples of failure is the Challenger explosion. I have discussed this case in a group I participate in called #fail4lib, a small subset of the #code4lib group. It’s sort of a support group for people like me! Most folks are familiar with the events of that morning; however, what may be less known is that the failure began more than a year before that fateful morning. More than a year before the scheduled launch, engineering tests and launch simulations clearly demonstrated the O-ring materials did not consistently perform well. More than a year before that clear, blue, cold morning, a person with experience, integrity, and valid concerns spoke up and said there is a problem, but he was told by management to back down. Most people view the Challenger explosion as the result of something that went wrong that day, that moment. It’s understandable, the need to see something so horrific as a total accident. And it isn’t something anyone ever wanted to happen. Not ever. But this wasn’t accidental. This was a cumulative failure, not a discrete one. And the man who spoke up was certainly not the only point of failure/course correction, but I suppose for various reasons his story stands out to me. So what did we learn from this? What did NASA? To this day their procedures reflect the lessons learned from this event. I think an important lesson is this: if someone raises a concern in their area of expertise or exposure, follow it through. Walk their beat with them, see it as close to their perspective as possible. Have a risk registry for the project and add the raised concern. The risk registry documents the stated issue, the likelihood of it occurring, and a value for the impact it would have. The risk registry also contains your organization’s strategy for dealing with each risk: Avoid, Control, Transfer, or Accept.
No one individual or event can control failure, but there are methods to mitigate it, or to better prepare for it and reduce its impact. The playwright Arthur Miller was also very reflective about the inevitability of failure in some form, and how, as individuals or organizations, we can better understand the value it can hold.
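The risk registry described above maps naturally onto a small data structure. Here is a hedged sketch in Python; the specific risks, scores, and the 1–5 scales are invented examples, and your organization’s template may well differ.

```python
# A sketch of a project risk registry: stated issue, likelihood, impact,
# and the organization's strategy (Avoid, Control, Transfer, or Accept).
from dataclasses import dataclass
from enum import Enum

class Strategy(Enum):
    AVOID = "avoid"
    CONTROL = "control"
    TRANSFER = "transfer"
    ACCEPT = "accept"

@dataclass
class Risk:
    issue: str
    likelihood: int   # invented scale: 1 (rare) .. 5 (near certain)
    impact: int       # invented scale: 1 (minor) .. 5 (severe)
    strategy: Strategy

    @property
    def exposure(self):
        # A common shorthand: exposure = likelihood x impact
        return self.likelihood * self.impact

registry = [
    Risk("O-ring performance inconsistent in cold weather", 4, 5, Strategy.AVOID),
    Risk("Key staff unavailable during migration week", 2, 3, Strategy.CONTROL),
]

# Review the highest-exposure risks first at each status meeting.
registry.sort(key=lambda r: r.exposure, reverse=True)
```

Reviewing the registry sorted by exposure keeps the highest-stakes concerns, like those O-ring warnings, from slipping off the agenda.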

There is another event from a few years back that is a more discrete failure: the crash and sinking of the cruise ship Costa Concordia. First things first, 33 souls were lost: 32 during the sinking, and 1 during recovery. You can never take that lightly. The captain (who went to trial for manslaughter and abandoning ship) strayed off the charted course at the last minute. Then came the recovery effort, led by a project manager from Italy. I think for a PM, getting the call to manage a project like this is like going to the show. You’ve been called up to the majors. As PM on such an effort, you have a real countdown clock, each day of the plan is costing hundreds of thousands of dollars, you have an unprecedented magnitude of risk, environmental variables you cannot control, and the whole world waiting as a stakeholder. You have to be on your game. You have to think everything through to the greatest extent. You have to imagine multiple outcomes, and prepare for all of them. Managing a recovery effort, or the result of a failed project, means even less tolerance for any additional errors. Costa Concordia was two times the size of the Titanic! It carried 100 thousand gallons of fuel, and it carried supplies and chemicals for 4,200 people. It sank in a fragile ecosystem where a rare breed of mussel lived. Each diver working on the recovery could only stay down for 45 minutes; talk about a project sprint! This was a half-billion-dollar cruise ship, and the recovery effort more than tripled that price tag. The force required to right the ship was one and a half times that required for a space shuttle takeoff! It took over 7 hours to right it 1/6th of the way up, and once they started, they couldn’t stop, so they worked even in the dark of night. Can you imagine getting to be the project manager on this one? I’m so nerdy about it, I think he should write a book and go on tour, and I’d be standing first in line to get his autograph.

In my experience, most projects fail at the beginning or at the end. Why? This is when the thinking and planning require more precision and thoroughness. You must begin and end with input from more than just the project team. People may under-plan, or misunderstand the project’s purpose. People may also assume the project is an isolated event in an organization, or they may not think through what post-rollout requires. A good PM knows the finish line is never where you think it is going to be. Post-rollout is such an important phase of the project lifecycle. Whether you scheduled a soft launch or a straight cut-over to the new implementation, post-rollout still requires a plan. Ideally a smaller “release team” is prepped to monitor production for a two-to-four-week period, and to resolve and archive the project artifacts. Always remember the project doesn’t end the day you go into production. And don’t wait for failure to happen before talking about it. Make it part of your Before Action Review environmental scan. Ask straight up, “How might we fail?” “What could go wrong?” Don’t make it a taboo subject. Don’t assume you won’t be able to know how you might fail before you get started on a project. One of my earliest lessons in enterprise systems administration was to always test for failure. A colleague from IBM taught me that. When you are testing something, don’t limit scenarios to the “happy path,” the regular use of the system or application. Create tests that you know will fail/not pass go, and see what the outcome is. Did you receive an error message? Did it go into the void? Did that test item get processed anyway, even though you thought it would fail? Failure should be part of your design thinking and project scenarios.
Just as professional sports teams hold practice games in manufactured conditions to simulate potential game-day variables (practicing with blaring music, heckling, artificial rain/snow/cold), organizations could include failure or chaos simulations during their own testing/pre-production efforts. Quite simply, before implementation try to break stuff. You’ll gain more insight into your product than you may anticipate.
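Testing for failure, as described above, can be as simple as asserting that bad input is rejected loudly rather than processed anyway. The sketch below uses a hypothetical `parse_accession_id` function invented for illustration; the point is the shape of the test, not the function itself.

```python
# A sketch of testing beyond the happy path. `parse_accession_id` is a
# made-up example function, not from any real system.

def parse_accession_id(raw):
    """Parse an accession ID like '2017-M-042' into (year, series, number)."""
    parts = raw.split("-")
    if len(parts) != 3 or not parts[0].isdigit() or not parts[2].isdigit():
        raise ValueError(f"malformed accession ID: {raw!r}")
    return int(parts[0]), parts[1], int(parts[2])

# Happy path: the regular use of the function.
assert parse_accession_id("2017-M-042") == (2017, "M", 42)

# Failure paths: inputs we *expect* to be rejected. If no error is raised,
# the item was "processed anyway," and the test should fail loudly.
for bad in ["2017M042", "year-M-042", ""]:
    try:
        parse_accession_id(bad)
    except ValueError:
        pass  # rejected with a clear error message, as designed
    else:
        raise AssertionError(f"{bad!r} was accepted but should have failed")
```

The unhappy-path loop answers exactly the questions above: did you receive an error message, or did the bad item slide through into the void?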

It is important not to see success and failure as binaries. An organization’s culture, or philosophy about failure, contributes to a project’s overall success. I’d love to encourage more open conversation about failure, and help organizations define their relationship to it and tolerance for it. There are degrees of failure. Some failure is unavoidable due to complexity; other failure is entirely preventable. There is new thinking on what is called “Intelligent Failure.” These failures are largely experimental in nature. They occur as an organization or a team moves through the ideation/development stage of a product or project design. Some businesses encourage their development staff to “fail faster” or to “fail often.” Some of it depends on the type of industry you work in, and how risk-tolerant you are as an organization. For example, failing often isn’t going to work in hospitals or healthcare. There is a spectrum, though. I believe every organization could, and should, have several published methods and tools in place for how to locate failure, discuss it, and ideally, learn from it. There are two interesting areas that often govern attitudes toward failure. One is the “sunk-cost fallacy.” This is when someone, or an organization, decides to keep pursuing a course of action or strategy because they have already invested time and money in it. Even if signs, or instincts, or plain facts suggest otherwise, psychologically a decision is made to stick with it and hope for the best outcome. Another is an assumption about consent or control. Sometimes only the most vocal person in the room is listened to. Others may be sitting there, not speaking up, even if they hold strong opinions about whether or not to continue down a stated path. I mentioned that every organization should have tools and methods in place for helping manage failure.
In this scenario, one helpful method is to agree on a set of decision-making criteria in advance of the project. Setting decision criteria in advance makes it less emotional if/when the time comes to make the hard choice of evaluating whether a project or decision is still working. I also think it helps to document decision making, so it can be referred to uniformly and objectively throughout the project lifecycle.

As we implement the new search and discovery platform for special collections and archives, I have several areas of potential failure in my mind. One of the most common is not having enough time. Everyone working on the project has full time commitments within their regular job. In estimating how much time anything will take, I try to be deeply thorough in my environmental scan, making notes and considerations for other major projects and events within the same time span, including a move back-in after a major renovation in Manuscripts and Archives. Not making preferred deadlines, or the slow shifting of the milestones is always a big concern for me. As a PM, your role is to keep everything moving forward. The work, the teams, the ideas, and conversations. Sometimes you are also managing external parties and vendors, who have their own objectives and deadlines in addition to yours. This project has several technical dependencies within other areas of our infrastructure, and making sure they are also completing successfully is another area of risk. Everyone involved with the project will perceive failure in their own way, just as they’ll perceive project success in their own way. Expectation is another major area where there will be degrees of success and failure. As the PM, you must do your best to be honest about potential points of risk and failure, but also not become paralyzed by them. There is no such state as perfect. There will always be someone in the room who isn’t happy, and seated directly across from them will be someone who is absolutely delighted over your hard work. You can’t take too much of any of it to heart. Just always stick to the philosophy of bringing as much logic and kindness to every situation as possible. In closing, I should note that as much as I am lunch buddies with failure, I spend a good amount of time on the science and art of happiness too! 
Whatever level you are at fighting the good fight, here is some general advice from my favorite good-hearted failure:

And always remember to put cameras in the back!

 

Meeting User Needs via Improvements to the ArchivesSpace Public User Interface

Hello, everyone! This is Alison Clemens, archivist at Manuscripts & Archives, member of the Yale Archival Management Systems Committee, and team leader of the ArchivesSpace Public User Interface (PUI) Settings & Enhancements Workgroup. Our workgroup is charged with reviewing and documenting any default changes we might want to make to the public user interface, and collecting and maintaining a list of possible future interface changes and enhancements. I’m pleased to give you an overview of some of our workgroup’s initial planning as we prepare to implement the ArchivesSpace PUI here at Yale.

Before I dive into our workgroup’s goals and progress, I’d like to emphasize that lots of behind-the-scenes data cleanup and enhancement work has been and will be instrumental in making the project successful. For example, we did a big project to clean up our people, organization, and subject records in ArchivesSpace, and we literally exorcised some ghosts in the process (no, really — did you know that the Library of Congress Name Authority File includes spirits?). But our ongoing data work will be the subject of a future blog post.

This post will focus on our shared raison d’être: our users, and ensuring that we are providing the best possible services and platforms to meet their needs. I’ll note here that as we consider how to serve our users, we’re thinking about both external users (i.e. patrons) and internal users (i.e. library staff).

Yale’s special collections comprise a fairly large data universe, and figuring out how best to serve the diverse user constituencies of our 10 campus repositories is a challenge. We’re lucky, though, that we’ve got a good team on the case. Our workgroup members include: Stephanie Bredbenner, Processing Archivist at Beinecke Library; Anna Franz, Assistant Head of Access Services at Beinecke Library; and Jonathan Manton, Music Librarian for Access Services at the Music Library. Each of us brings specific skills and perspectives to our work, and we share a focus on taking a user-centered approach to figuring out how to determine and prioritize settings and enhancements for the new PUI.

I’ll pause here to talk a little bit about the logistics of how we’re accomplishing our work. We’re using four platforms to communicate and coordinate: Google Drive, Asana, Slack, and in-person and virtual meetings. These tools are invaluable to us as a workgroup, and especially to us as a larger team — there are 30 team members, working from a variety of physical locations, involved in the six workgroups that comprise the PUI project, and communicating about dispersed goals and tasks with a group of that size is a challenge. Fortunately, our virtual communication platforms have made this process easier to manage. We’re also providing ample opportunity for in-person communication via regular workgroup meetings, biweekly workgroup leaders meetings, and monthly open forums.

Now, back to the workgroup’s goals and tasks. Given our desire to put the user first, we naturally concluded that much of our work depends on the data gathered by the Usability & Accessibility (U&A) Workgroup. Although our workgroup members certainly have a sense of user needs, we want to be sure that we take every opportunity to actually hear directly from users about their needs and preferences. Working closely with the U&A Workgroup is instrumental in assuring that we’re successful in doing so.

The activities of the U&A Workgroup will be the focus of a future blog post, but in the meantime, I’ll mention that the U&A Workgroup has already conducted about a dozen user interviews with user constituencies and is in the process of designing and conducting user testing.

This brings us to the PUI Settings & Enhancements Workgroup’s starting point.

Essentially, our job is to take the out-of-the-box PUI (see screenshot below) and turn it into something that’s as responsive as possible to user needs. In order to do this, we revisited the Before Action Review (discussed by Melissa in our last blog post) and considered what we were trying to accomplish, what we thought would change, and how we might know we were successful. In this discussion, we agreed that the two most essential goals for our workgroup are to improve user experience (as demonstrated through metrics provided by the U&A Workgroup) and ensure broad and thoughtful staff input and buy-in.

Screenshot of the Yale ArchivesSpace PUI, titled “ArchivesSpace at Yale DEV INSTANCE,” with a search box and an upper navigation menu with "Repositories," "Collections," "Digital Objects," "Unprocessed Material," "Subjects," "Names," "Classifications," and a small magnifying glass listed as navigation options.
Screenshot of the out-of-the-box Yale ArchivesSpace PUI

To accomplish our goals, we’ve done a few things (so far, and more to come).

Our first task was to address some of the most obvious issues with the current PUI. We’ve discussed a lot of potential improvements, but two general categories of changes have risen to the top. First, we’ve identified some basic improvements we might like to make to the front page. Most of these things are small improvements to make the page clearer and more intuitive; we’re waiting for the U&A Workgroup user testing feedback before we ask for more major changes. We have, though, been examining some other examples of finding aid databases to see what features and framings we like (shoutout to the New York Public Library!), and we’re looking forward to digging more into how to make the PUI’s front page more compelling and user-friendly.

Our second task was to figure out how to address issues with search relevance. Relevance ranking in search results is a known issue in the ASpace PUI, and we wanted to get a sense of exactly how the search should be operating. To assess this, we solicited search use cases from all of the repositories via an online web form. The web form asked staff members to identify any two of their repository’s significant collections and answer questions about how they might search for known items from those collections. Specifically, we asked respondents to indicate a) what search terms they would use to find materials in their collections and b) what search results they would expect to see at the top of the search results list, based on their identified terms. We’re now in the process of analyzing this data. Hearing directly from our colleagues about their search cases and expected outcomes will help us determine a set of use cases to send out for development work.
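One way to make those staff-supplied use cases durable is to turn them into repeatable relevance checks. The sketch below is hypothetical: `run_search` is a stub with canned results standing in for a call to the PUI’s search, and the queries and collection titles are invented for illustration.

```python
# A sketch of turning collected search use cases into relevance checks.
# All queries, titles, and results below are invented examples.

use_cases = [
    # (search terms supplied by staff, title they expect near the top)
    ("kingman brewster papers", "Kingman Brewster, Jr., personal papers"),
    ("war protest photographs", "War protest photograph collection"),
]

def run_search(terms):
    # Stand-in: a real check would query the search index and return
    # the ranked result titles instead of this canned dictionary.
    canned = {
        "kingman brewster papers": [
            "Kingman Brewster, Jr., personal papers",
            "Provost's Office records",
        ],
        "war protest photographs": [
            "Campus events photographs",
            "War protest photograph collection",
        ],
    }
    return canned.get(terms, [])

def top_n_hits(cases, n=3):
    """Count how many expected titles appear in the top-n results."""
    return sum(1 for terms, expected in cases if expected in run_search(terms)[:n])

hits = top_n_hits(use_cases, n=3)
```

Re-running a check like this after each relevance tweak shows whether the expected titles still surface near the top, turning staff anecdotes into something like a regression test.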

Although the PUI Settings & Enhancements Workgroup has already done a lot of work, we still have a couple of months of the PUI project left, and our work has really just begun. We look forward to continuing our work and are particularly excited to review the Usability & Accessibility Workgroup’s user testing results and accessibility audit and determine how to prioritize additional changes based on that key data. Stay tuned — we look forward to keeping you updated!

Implementing the ArchivesSpace PUI: A Before Action Review

Plato’s Ship of State metaphor postulates, “A true pilot must of necessity pay attention to the seasons, the heavens, the stars, the winds, and everything proper to the craft.” Accounting for all possible variables, risks, pluses, and minuses is also the mainstay of project management. And, if Plato were alive today, you can bet he’d make you put all of it into the project charter!

Before any project begins, the Project Manager (PM) should initiate a Before Action Review to understand as much of the current environment as possible. This includes talking to the people who have expressed the business need, talking to stakeholders, looking at proposed and active concurrent projects, and remembering to reflect not just on what is presented, but on what isn’t (more on that in an upcoming post). Often a project is initially presented either as a loosely defined idea or as a shoot-the-works scenario wherein everyone is promised a pony. The PM should then figure out the ideal, sustainable middle ground within the organization’s current capacity, and manage the expectations sure to follow that recommendation.

Next the PM is going to look at the calendar. (How I do love calendars! I think they are a fascinating combination of math, language, and social construct.) The calendar is either the one she carries in her head, the one drawn with dry-erase markers on a whiteboard, or the one on her mobile. (Another nerdy confession: I love smartphones because they mean I always have a calendar within reach.) Time is everything in project management, and calendars are how we identify units of time at work: Mondays, how many business days, a work week, the month of January. The PM asks other time-centered questions: Do you have enough? Do you have work estimates? Have you adequately expressed time as a direct cost? What about time as momentum? And time not just at the party, but the setup and breakdown of the party? Time is always on my mind. That look Steelsen gives me when I know March 27, 2018 is a Tuesday without having to check. During the ArchivesSpace crisis, my dad gave me a replica of the time turner necklace Professor McGonagall gave Hermione, which enabled her to be in two places at once and get all her work done. That last week in January, I never took it off, and I still cry every time they go back to save Buckbeak.

As Yale begins the work of implementing the new public user interface for ArchivesSpace, we took time to conduct a Before Action Review. A BAR is less common than an After Action Review (AAR), which is conducted when the work is completed and you reflect on what went well and what could have gone better. But I like the beginnings and endings of projects best, because they can be the phases that receive the least amount of planning and time. They are often underestimated for their influence on project success, on stakeholder satisfaction, and on ensuring your work and choices directly address the business need.

I typically ask three questions during a BAR: What are we trying to accomplish? What do we think will change? How will we know if we are successful? These three simple questions generate invaluable insight into what people are thinking, what assumptions exist, what folks are scared or excited about, and how we characterize success. The PM should then continue to use the feedback to shape elements of the project. For example, here are some of the responses to “What are we trying to accomplish?” in the PUI project:

  • Roll out a holistic and focused service, fully integrated with production
  • An improved user experience for discovery and access of archival materials

These are important objectives. They acknowledge the aging technical environment for archives and special collections at Yale, and they express our professional and organizational values. They are aspirational, but still achievable, given the right amount of time. As a project team, including stakeholders and sponsors, we need to keep defining and refining what we mean when we say holistic and focused. Those words capture ideas and feelings about what we want, and a PM should help break them out into S.M.A.R.T.-style goals. Revisiting the point about an improved user experience leads to a need to examine current metrics on what the user experience is for discovery and access of archival materials. As we roll into production, we can’t qualitatively or quantitatively measure improvements if we don’t know where we started, and if we don’t define improved. If the project team states this as something we want to accomplish, the PM should revisit the project plan to include specific tasks and resources to address it.

Taking that a step further, the PM also must determine what’s in scope and outline recommendations for which improvements can be made now and which need to wait for post-rollout. The PM might use the project scorecard method to manage any objectives. This involves stating an objective, something like improved user experience for discovery, and identifying measures, targets, and either tasks or program initiatives, depending on the scale of the objective. A measure in this context is just as it sounds: an observable parameter, such as the number of monthly service desk questions concerning user difficulty in locating materials. Then you set several targets reasonable to the scope and time frame of the project. Maybe the phase 1 target is a 5% reduction in this category of question, and a longer-term target is a 15% reduction. (These numbers are only examples.) Finally, the project plan would include tasks to initiate this improvement, or, if the effort is of a greater scale, perhaps a program initiative is set up to increase end-user training and conduct more frequent usability reviews.
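For readers who like to see a structure written out, the scorecard described above can be sketched as a small data model. This is a minimal illustration, not a real tool: the objective, measure, baseline count, and target values are all hypothetical, drawn from the 5%/15% reduction example in the text.

```python
# A minimal sketch of the project scorecard method described above.
# All names and numbers here are hypothetical illustrations.
from dataclasses import dataclass, field

@dataclass
class Measure:
    name: str            # an observable parameter
    baseline: float      # where we started (needed to claim "improved")
    targets: dict        # label -> target value for that phase

@dataclass
class Objective:
    statement: str
    measures: list = field(default_factory=list)
    tasks: list = field(default_factory=list)  # or program initiatives

scorecard = Objective(
    statement="Improved user experience for discovery",
    measures=[
        Measure(
            name="Monthly service desk questions about locating materials",
            baseline=100.0,  # hypothetical starting count
            # 5% and 15% reductions from the hypothetical baseline
            targets={"phase 1": 95.0, "long term": 85.0},
        )
    ],
    tasks=[
        "Increase end-user training",
        "Conduct more frequent usability reviews",
    ],
)

def target_met(measure: Measure, label: str, observed: float) -> bool:
    """Has the target for a given phase been met by a new observation?"""
    return observed <= measure.targets[label]
```

The point of writing it down this way is the same point made in the paragraph above: without a recorded baseline and explicit targets, “improved” has no measurable meaning.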

The “what do we think will change” question is particularly important to me. Sometimes I observe organizations discussing change on a meta-scale, such as what will libraries be like in 50 years? But this question in a BAR has a more immediate time frame of around 6 months, and the process and outcomes of change are likely to be felt more acutely by the people here now. Earlier, I mentioned that the BAR could help suss out what folks might be scared of, anxious about, or unclear on in a pending project. That may not be the best business language to use, but it gets to the heart of our work and our feelings about our place in it. There’s lots I feel anxious about at work. Will we make our preferred upgrade deadline? Am I smart enough to be of service? What if I never fully understand EAD? And don’t even get me started on how CAS works! I know there are tokens involved, but not the kind you can use to buy pizza. If people are given an environment to state their concerns, and be heard, then the fear of change immediately begins to dissipate. If a colleague offers that they are concerned about having to learn a whole new way to process collections, recognize that staff training is an important step, and build time into the project for it. Make conversation about change a part of the project, the stakeholder registry, and the communications plan. This also applies to assumptions: asking folks what they think will change is a bullet train to uncovering assumptions. Find out what they are early in the process, while there is time for relevant course correction.

An interesting outcome of the “what do we think will change” question for the PUI is how much feedback there was on anticipated changes to staff workflows. When implementing a public user interface, it would be easy to minimize the focus on staff workflows and think primarily about the user’s experience; after all, two-thirds of the name, “public user,” is external facing, and for legitimate reason: it is a public user interface. The “what are we trying to accomplish” question generated more user-based objectives, like the improved experience for discovery and access. I think it will be critical to balance the needs in both areas within our preliminary project time frame. As Melissa Barton noted, “We can’t underestimate the effort required to teach users about what finding aids are.” I’m eager to continue this discussion with stakeholders, to see where there is alignment between these two aspects of the project.

As for the last question, how will we know if we are successful? Well, that’s easy!