Brad Rose Consulting

Logic Modeling: Contributing to Strategic Planning

  • Posted on July 30, 2013
  • In Models/Tools

What is Strategic Planning?

“Simply put, strategic planning determines where an organization is going over the next year or more, how it’s going to get there and how it’ll know if it got there or not. The focus of a strategic plan is usually on the entire organization…” (See the Free Management Library at http://managementhelp.org/strategicplanning/index.htm#anchor1234 .) Balanced Scorecard says, “Strategic planning is an organizational management activity that is used to set priorities, focus energy and resources, strengthen operations, ensure that employees and other stakeholders are working toward common goals, establish agreement around intended outcomes/results, and assess and adjust the organization’s direction in response to a changing environment. It is a disciplined effort that produces fundamental decisions and actions that shape and guide what an organization is, who it serves, what it does, and why it does it, with a focus on the future.”

Features Common to Both Logic Models and Strategic Plans

My experience working with clients has shown me that logic models raise many of the same questions that strategic plans do: What are our assumptions about how a program works? What is the environment (context) in which a program operates? What are we trying to achieve (goals and objectives)? What investments (inputs) do we make? What activities (outputs) do we engage in? What are the results, changes, impacts (outcomes) that we want to, and in fact, DO, produce? How do we measure our effects and achievements (measures/metrics)?

Although logic modeling can’t do all of the things a strategic plan can, it can become – especially when it includes an organization’s many stakeholders – an important contributor to the process through which an organization reflects upon where it is and where it wants to go. The collective learning that accompanies the process of building a logic model for a specific program can also inform the organization’s efforts to develop a broader strategic plan.

Read Brad’s current whitepaper “Logic Modeling”

Resources on Strategic Planning

  • What is a Strategic Plan?
  • What is the Balanced Scorecard?
  • The Basics of Strategic Planning and Strategic Management
  • What a Strategic Plan Is and Isn’t
  • Ten Keys to Successful Strategic Planning for Nonprofit and Foundation Leaders
  • Types of Strategic Planning
  • Understanding Strategic Planning
  • Steps to a Strategic Plan
  • Five Steps to a Strategic Plan
  • Strategic Planning for Non-Profits
  • What is the best way to do strategic planning for a nonprofit?

Videos About Strategic Planning

  • University of Arizona: Introduction to Strategic Planning

Understanding How Programs Work: Using Logic Models to “Map” Cause and Effect

Developing a Logic Model or Theory of Change


This section addresses the following questions:

  • What is a logic model?
  • When can a logic model be used?
  • How do you create a logic model?
  • What makes a logic model effective?
  • What are the benefits and limitations of logic modeling?

What is a logic model?

A logic model presents a picture of how your effort or initiative is supposed to work. It explains why your strategy is a good solution to the problem at hand. Effective logic models make an explicit, often visual, statement of the activities that will bring about change and the results you expect to see for the community and its people. A logic model keeps participants in the effort moving in the same direction by providing a common language and point of reference.

More than an observer's tool, logic models become part of the work itself. They energize and rally support for an initiative by declaring precisely what you're trying to accomplish and how.

In this section, the term logic model is used as a generic label for the many ways of displaying how change unfolds.

Some other names include:

  • road map, conceptual map, or pathways map
  • mental model
  • blueprint for change
  • framework for action or program framework
  • program theory or program hypothesis
  • theoretical underpinning or rationale
  • causal chain or chain of causation
  • theory of change or model of change

Each mapping or modeling technique uses a slightly different approach, but they all rest on a foundation of logic - specifically, the logic of how change happens. By whatever name you call it, a logic model supports the work of health promotion and community development by charting the course of community transformation as it evolves.

A word about logic

The word "logic" has many definitions. As a branch of philosophy, scholars devote entire careers to its practice. As a structured method of reasoning, mathematicians depend on it for proofs. In the world of machines, the only language a computer understands is the logic of its programmer.

There is, however, another meaning that lies closer to the heart of community change: the logic of how things work. Consider, for example, the logic of the motion of rush-hour traffic. No one plans it. No one controls it. Yet, through experience and awareness of recurrent patterns, we comprehend it, and, in many cases, can successfully avoid its problems (by carpooling, taking alternative routes, etc.).

Logic in this sense refers to "the relationship between elements and between an element and the whole." All of us have a great capacity to see patterns in complex phenomena. We see systems at work and find within them an inner logic, a set of rules or relationships that govern behavior. Working alone, we can usually discern the logic of a simple system. And by working in teams, persistently over time if necessary, there is hardly any system past or present whose logic we can't decipher.

On the flip side, we can also project logic into the future. With an understanding of context and knowledge about cause and effect, we can construct logical theories of change, hypotheses about how things will unfold either on their own or under the influence of planned interventions. Like all predictions, these hypotheses are only as good as their underlying logic. Magical assumptions, poor reasoning, and fuzzy thinking increase the chances that despite our efforts, the future will turn out differently than we expect or hope. On the other hand, some events that seem unexpected to the uninitiated will not be a surprise to long-time residents and careful observers.

The challenge for a logic modeler is to find and accurately represent the wisdom of those who know best how community change happens.

The logic in logic modeling

Like a road map, a logic model shows the route traveled (or steps taken) to reach a certain destination. A detailed model indicates precisely how each activity will lead to desired changes. Alternatively, a broader plan sketches out the chosen routes and how far you will go. This road map aspect of a logic model reveals what causes what, and in what order. At various points on the map, you may need to stop and review your progress and make any necessary adjustments.

A logic model also expresses the thinking behind an initiative's plan. It explains why the program ought to work, why it can succeed where other attempts have failed. This is the "program theory" or "rationale" aspect of a logic model. By defining the problem or opportunity and showing how intervention activities will respond to it, a logic model makes the program planners' assumptions explicit.

The form that a logic model takes is flexible and does not have to be linear (unless your program's logic is itself linear). Flow charts, maps, or tables are the most common formats. It is also possible to use a network, concept map, or web to describe the relationships among more complex program components. Models can even be built around cultural symbols that describe transformation, such as the Native American medicine wheel, if the stakeholders feel it is appropriate. See the "Generic Model for Disease/Injury Control and Prevention" in the Examples section for an illustration of how the same information can be presented in a linear or nonlinear format.

Whatever form you choose, a logic model ought to provide direction and clarity by presenting the big picture of change along with certain important details. Let's illustrate the typical components of a logic model, using as an example a mentoring program in a community where the high-school dropout rate is very high. We'll call this program "On Track."

  • Purpose, or mission. What motivates the need for change? This can also be expressed as the problems or opportunities that the program is addressing. (For On Track, community advocates focused on the mission of enhancing healthy youth development to reduce the high-school dropout rate.)
  • Context, or conditions. What is the climate in which change will take place? (How will new policies and programs for On Track be aligned with existing ones? What trends compete with the effort to engage youth in positive activities? What is the political and economic climate for investing in youth development?)
  • Inputs, or resources or infrastructure. What raw materials will be used to conduct the effort or initiative? (In On Track, these materials are the coordinator and volunteers in the mentoring program, agreements with participating school districts, and the endorsement of parent groups and community agencies.) Inputs can also include constraints on the program, such as regulations or funding gaps, which are barriers to your objectives.
  • Activities, or interventions. What will the initiative do with its resources to direct the course of change? (In our example, the program will train volunteer mentors and refer young people who might benefit from a mentor.) Your intervention, and thus your logic model, should be guided by a clear analysis of risk and protective factors.
  • Outputs. What evidence is there that the activities were performed as planned? (Indicators might include the number of mentors trained and youth referred, and the frequency, type, duration, and intensity of mentoring contacts.)
  • Effects, or results, consequences, outcomes, or impacts. What kinds of changes came about as a direct or indirect effect of the activities? (Two examples are bonding between adult mentors and youth and increased self-esteem among youth.)

Putting these elements together graphically gives the following basic structure for a logic model. The arrows between the boxes indicate that review and adjustment are an ongoing process - both in enacting the initiative and developing the model.
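To make the same components concrete in a different medium, here is a minimal sketch, purely illustrative and not part of the original Tool Box material, that records the On Track example as a simple data structure in Python. The field names follow the list above; the specific entries are paraphrased assumptions rather than quotations.

```python
# Illustrative only: field names mirror the components described above, and the
# On Track entries are invented/paraphrased for the sake of the example.
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    purpose: str
    context: List[str] = field(default_factory=list)     # climate in which change takes place
    inputs: List[str] = field(default_factory=list)      # raw materials / resources
    activities: List[str] = field(default_factory=list)  # what the initiative does
    outputs: List[str] = field(default_factory=list)     # evidence the activities happened
    effects: List[str] = field(default_factory=list)     # short-, mid-, and long-term changes

on_track = LogicModel(
    purpose="Enhance healthy youth development to reduce the high-school dropout rate",
    context=["competing trends for youth time", "political and economic climate for youth programs"],
    inputs=["program coordinator", "volunteer mentors", "agreements with school districts"],
    activities=["train volunteer mentors", "refer young people who might benefit from a mentor"],
    outputs=["number of mentors trained", "number of youth referred", "frequency and duration of contacts"],
    effects=["mentor-youth bonding", "increased self-esteem", "higher graduation rates"],
)

# Printing the fields in order mirrors reading a boxes-and-arrows diagram left to right.
for name in ("purpose", "context", "inputs", "activities", "outputs", "effects"):
    print(f"{name}: {getattr(on_track, name)}")
```

A structured listing like this is no substitute for the diagram itself, but it can be a handy starting point when the model later feeds a documentation or evaluation system.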

Using this generic model as a template, let's fill in the details with another example of a logic model, one that describes a community health effort to prevent tuberculosis.

Remember, although this example uses boxes and arrows, you and your partners in change can use any format or imagery that communicates more effectively with your stakeholders. As mentioned earlier, the generic model for Disease/Injury Control and Prevention in Examples depicts the same relationship of activities and effects in a linear and a nonlinear format. The two formats helped communicate with different groups of stakeholders and made different points. The linear model better guided discussions of cause and effect and how far down the chain of effects a particular program was successful. The circular model more effectively depicted the interdependence of the components to produce the intended effects.

When exploring the results of an intervention, remember that there can be long delays between actions and their effects. Also, certain system changes can trigger feedback loops, which further complicate and delay our ability to see all the effects. (A definition from the System Dynamics Society might help here: "Feedback refers to the situation of X affecting Y and Y in turn affecting X perhaps through a chain of causes and effects. One cannot study the link between X and Y and, independently, the link between Y and X and predict how the system will behave. Only the study of the whole system as a feedback system will lead to correct results.")

For these reasons, logic models indicate when to expect certain changes. Many planners like to use the following three categories of effects (illustrated in the models above), although you may choose to have more or fewer depending on your situation.

  • Short-term or immediate effects. (In the On Track example, this would be that young people who participate in mentoring improve their self-confidence and understand the importance of staying in school.)
  • Mid-term or intermediate effects. (Mentored students improve their grades and remain in school.)
  • Longer-term or ultimate effects. (High school graduation rates rise, thus giving graduates more employment opportunities, greater financial stability, and improved health status.)

Here are two important notes about constructing and refining logic models.

Outcome or Impact? Clarify your language. In a collaborative project, it is wise to anticipate confusion over language. If you understand the basic elements of a logic model, any labels can be meaningful provided stakeholders agree to them. In the generic and TB models above, we called the effects short-, mid-, and long-term. It is also common to hear people talk about effects that are "upstream" or "proximal" (near to the activities) versus "downstream" or "distal" (distant from the activities). Because disciplines have their own jargon, stakeholders from two different fields might define the same word in different ways. Some people are trained to call the earliest effects "outcomes" and the later ones "impacts." Other people are taught the reverse: "impacts" come first, followed by "outcomes." The idea of sequence is the same regardless of which terms you and your partners use. The main point is to clearly show connections between activities and effects over time, thus making explicit your initiative's assumptions about what kinds of change to expect and when. Try to define essential concepts at the design stage and then be consistent in your use of terms. The process of developing a logic model supports this important dialogue and will bring potential misunderstandings into the open.

For good or for ill? Understand effects. While the starting point for logic modeling is to identify the effects that correspond to stated goals, your intended effects are not the only effects to watch for. Any intervention capable of changing problem behaviors or altering conditions in communities can also generate unintended effects. These are changes that no one plans and that might somehow make the problem worse. Many times our efforts to solve a problem lead to surprising, counterintuitive results. There is always a risk that our "cure" could be worse than the "disease" if we're not careful. Part of the added value of logic modeling is that the process creates a forum for scrutinizing big leaps of faith and a way of searching for unintended effects. (See the discussion of simulation in "What makes a logic model effective?" for some thoughts on how to do this in a disciplined manner.) One of the greatest rewards for the extra effort is the ability to spot potential problems and redesign an initiative (and its logic model) before the unintended effects get out of hand, so that the model truly depicts activities that will plausibly produce the intended effects.

Choosing the right level of detail: the importance of utility and simplicity

It may help at this point to consider what a logic model is not. Although it captures the big picture, it is not an exact representation of everything that's going on. All models simplify reality; if they didn't, they wouldn't be of much use.

Even though it leaves out information, a good model represents those aspects of an initiative that, in the view of your stakeholders, are most important for understanding how the effort works. In most cases, the developers will go through several drafts before arriving at a version that the stakeholders agree accurately reflects their story.

Should the information become overly complex, it is possible to create a family of related models, or nested models, each capturing a different level of detail. One model could sketch out the broad pathways of change, whereas others could elaborate on separate components, revealing detailed information about how the program operates on a deeper level. Individually, each model conveys only essential information, and together they provide a more complete overview of how the program or initiative functions. (See "How do you create a logic model?" for further details.)

Imagine "zooming-in" on the inner workings of a specific component and creating another, more detailed model just for that part. For a complex initiative, you may choose to develop an entire family of such related models that display how each part of the effort works, as well as how all the parts fit together. In the end, you may have some or all of the following family of models, each one differing in scope:

  • View from Outer Space. This overall road map shows the major pathways of change and the full spectrum of effects. This view answers questions such as: Do the activities follow a single pathway, or are there separate pathways that converge down the line? How far does the chain of effects go? How do our program activities align with those of other organizations? What other forces might influence the effects that we hope to see? Where can we anticipate feedback loops and in what direction will they travel? Are there significant time delays between any of the connections?
  • View from the Mountaintop. This closer view focuses on a specific component or set of components, yet it is still broad enough to describe the infrastructure, activities, and full sequence of effects. This view answers the same questions as the view from outer space, but with respect to just the selected component(s).
  • You Are Here. This view expands on a particular part of the sequence, such as the roles of different stakeholders, staff, or agencies in a coalition, and functions like a flow chart for someone's work plan. It is a specific model that outlines routine processes and anticipated effects. This is the view that you might need to understand quality control within the initiative.
Families, Nesting, and Zooming-In

In the Examples section, the idea of nested models is illustrated in the Tobacco Control family of models. It includes a global model that encompasses three intermediate outcomes in tobacco control - environments without tobacco smoke, reduced smoking initiation among youth, and increased cessation among youth and adults. Then a zoom-in model is elaborated for each one of these intermediate outcomes. The Comprehensive Cancer model illustrates a generic logic model accompanied by a zoom-in on the activities to give program staff the specific details they need. Notably, the intended effects on the zoom-in are identical to those on the global model and all major categories of activities are also apparent. But the zoom-in unpacks these activities into their detailed components and, more important, indicates that the activities achieve their effects by influencing intermediaries who then move gatekeepers to take action. This level of detail is necessary for program staff, but it may be too much for discussions with funders and stakeholders.

The Diabetes Control model is another good example of a family of models. In this case, the zoom-in models are very similar to the global model in level of detail. They add value by translating the global model into a plan for specific actors (in this case a state diabetes control program) or for specific objectives (e.g., increasing timely foot exams).

When can a logic model be used?

Logic models are useful for both new and existing programs and initiatives. If your effort is being planned, a logic model can help get it off to a good start. Alternatively, if your program is already under way, a model can help you describe, modify, or enhance it.

Planners, program managers, trainers, evaluators, advocates and other stakeholders can use a logic model in several ways throughout an initiative. One model may serve more than one purpose, or it may be necessary to create different versions tailored for different aims. Here are examples of the various times that a logic model could be used.

During planning to:

  • clarify program strategy
  • identify appropriate outcome targets (and avoid over-promising)
  • align your efforts with those of other organizations
  • write a grant proposal or a request for proposals
  • assess the potential effectiveness of an approach
  • set priorities for allocating resources
  • estimate timelines
  • identify necessary partnerships
  • negotiate roles and responsibilities
  • focus discussions and make planning time more efficient

During implementation to:

  • provide an inventory of what you have and what you need to operate the program or initiative
  • develop a management plan
  • incorporate findings from research and demonstration projects
  • make mid-course adjustments
  • reduce or avoid unintended effects

During staff and stakeholder orientation to:

  • explain how the overall program works
  • show how different people can work together
  • define what each person is expected to do
  • indicate how one would know if the program is working

During evaluation to:

  • document accomplishments
  • organize evidence about the program
  • identify differences between the ideal program and its real operation
  • determine which concepts will (and will not) be measured
  • frame questions about attribution (of cause and effect) and contribution (of initiative components to the outcomes)
  • specify the nature of questions being asked
  • prepare reports and other media
  • tell the story of the program or initiative

During advocacy to:

  • justify why the program will work
  • explain how resource investments will be used

How do you create a logic model?

There is no single way to create a logic model. Think of it as something to be used, its form and content governed by the users' needs.

Who creates the model? This depends on your situation. The same people who will use the model - planners, program managers, trainers, evaluators, advocates and other stakeholders - can help create it. For practical reasons, though, you will probably start with a core group, and then take the working draft to others for continued refinement.

Remember that your logic model is a living document, one that tells the story of your efforts in the community. As your strategy changes, so should the model. On the other hand, while developing the model you might see new pathways that are worth exploring in real life.

Two main development strategies are usually combined when constructing a logic model.

  • Moving forward from the activities (also known as forward logic ). This approach explores the rationale for activities that are proposed or currently under way. It is driven by But why? questions or If-then thinking: But why should we focus on briefing Senate staffers? But why do we need them to better understand the issues affecting kids? But why would they create policies and programs to support mentoring? But why would new policies make a difference?... and so on. That same line of reasoning could also be uncovered using if-then statements: If we focus on briefing legislators, then they will better understand the issues affecting kids. If legislators understand, then they will enact new policies...
  • Moving backward from the effects (also known as reverse logic ). This approach begins with the end in mind. It starts with a clearly identified value, a change that you and your colleagues would definitely like to see occur, and asks a series of "But how?" questions: But how do we overcome fear and stigma? But how can we ensure our services are culturally competent? But how can we admit that we don't already know what we're doing?

At first, you may not agree with the answers that certain stakeholders give for these questions. Their logic may not seem convincing or even logical. But therein lies the power of logic modeling. By making each stakeholder's thinking visible on paper, you can decide as a group whether the logic driving your initiative seems reasonable. You can talk about it, clarify misinterpretations, ask for other opinions, check the assumptions, compare them with research findings, and in the end develop a solid system of program logic. This product then becomes a powerful tool for planning, implementation, orientation, evaluation, and advocacy, as described above.
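As a rough, hypothetical illustration (not part of the Tool Box text), forward and reverse logic can be pictured as walking the same causal chain in opposite directions. The short Python sketch below assumes an invented five-step chain loosely based on the mentoring example; walking forward it prints if-then statements, walking backward it asks "But how?".

```python
# Hypothetical causal chain, invented for illustration only.
chain = [
    "we brief legislators on issues affecting kids",
    "legislators better understand those issues",
    "new policies and programs support mentoring",
    "more young people are matched with mentors",
    "high-school graduation rates rise",
]

# Forward logic: if-then reasoning from activities toward effects.
for cause, effect in zip(chain, chain[1:]):
    print(f"If {cause}, then {effect}.")

print()

# Reverse logic: begin with the end in mind and ask "But how?" at each step.
reversed_chain = list(reversed(chain))
for effect, cause in zip(reversed_chain, reversed_chain[1:]):
    print(f"We want: {effect}. But how? By making sure that {cause}.")
```

The same chain read in either direction should tell the same story; if the backward questions surface steps that the forward statements skipped over, that is exactly the kind of leap of faith worth discussing with stakeholders.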

By now you have probably guessed that there is not a rigid step-by-step process for developing a logic model. Like the rest of community work, logic modeling is an ongoing process. Nevertheless, there are a few tasks you should be sure to accomplish.

To illustrate these in action, we'll use another example for an initiative called "HOME: Home Ownership Mobilization Effort." HOME aims to increase home ownership in order to give neighborhood control to the people who live there, rather than to outside landlords with no stake in the community. It does this through a combination of educating community residents, organizing the neighborhood, and building relationships with partners such as businesses.

Steps for drafting a logic model

  • Available written materials often contain more than enough information to get started. Collect narrative descriptions, justifications, grant applications, or overview documents that explain the basic idea behind the intervention effort. If your venture involves a coalition of several organizations, be sure to get descriptions from each agency's point of view. For the HOME campaign, we collected documents from planners who proposed the idea, as well as mortgage companies, homeowner associations, and other neighborhood organizations.
  • Your job as a logic modeler is to decode these documents. Keep a piece of paper by your side and sketch out the logical links as you find them. (This work can be done in a group to save time and engage more people if you prefer.)
  • Read each document with an eye for the logical structure of the program. Sometimes that logic will be clearly spelled out (e.g., The information, counseling, and support services we provide to community residents will help them improve their credit rating, qualify for home loans, purchase homes in the community; over time, this program will change the proportion of owner-occupied housing in the neighborhood).
  • Other times the logic will be buried in vague language, with big leaps from actions to downstream effects (e.g., Ours is a comprehensive community-based program that will transform neighborhoods, making them controlled by people who live there and not outsiders with no stake in the community).
  • As you read each document, ask yourself the But why? and But how? questions. See if the writing provides an answer. Pay close attention to parts of speech. Verbs such as teach, inform, support, or refer are often connected to descriptions of program activities. Adjectives like reduced, improved, higher, or better are often used when describing expected effects.
  • The HOME initiative , for instance, created different models to address the unique needs of their financial partners, program managers, and community educators. Mortgage companies, grant makers, and other decision makers who decided whether to allocate resources for the effort found the global view from space most helpful for setting context. Program managers wanted the closer, yet still broad view from the mountaintop. And community educators benefited most from the you are here version. The important thing to remember is that these are not three different programs, but different ways of understanding how the same program works.
  • Logic models convey the story of community change. Working with the stakeholders, it's your responsibility to ensure that the story you've told in your draft makes sense (i.e., is logical) and is complete (has no loose ends). As you iteratively refine the model, ask yourself and others if it captures the full story.
  • Short-term - Potential home owners attain greater understanding of how credit ratings are calculated and more accurate information about the steps to improve a credit rating; mortgage companies create new policies and procedures allowing renters to buy their own homes; local businesses start incentive programs; and anti-discrimination lawsuits are filed against illegal lending practices.
  • Mid-term - The community's average credit rating improves; applications rise for home loans along with the approval rate; support services are established for first-time home buyers; neighborhood organizing gets stronger, and alliances expand to include businesses, health agencies, and elected officials.
  • Longer-term - The proportion of owner-occupied housing rises; economic revitalization takes off as businesses invest in the community; residents work together to create walking trails, crime patrols, and fire safety screenings; rates of obesity, crime, and injury fall dramatically.
  • An advantage of the graphic model is that it can display both the sequence and the interactions of effects. For example, in the HOME model, credit counseling leads to better understanding of credit ratings, while loan assistance leads to more loan submissions, but the two together (plus other activities such as more new buyer programs) are needed for increased home ownership.
  • Drama (activities, interventions). How will obstacles be overcome? Who is doing what? What kinds of conflict and cooperation are evident? What's being done to re-arrange the forces of change? What new services or conditions are being introduced? Your activities, based on a clear analysis of risk and protective factors, are the answers to these kinds of questions. Your interventions reveal the drama in your story of directed social change.

Dramatic actions in the HOME initiative include offering educational sessions and forming business alliances, homeowner support groups, and a neighborhood organizing council. At evaluation time, each of these actions is closely connected to output indicators that document whether the program is on track and how fast it is moving. These outputs could be the number of educational sessions held, their average attendance, the size of the business alliance, etc. (These outputs are not depicted in the global model, but that could be done if valuable for users.)

  • Raw Materials (inputs, resources, or infrastructure). The energy to create change can't come from nothing. Real resources must come into the system. Those resources may be financial, but they may also include people, space, information, technology, equipment, and other assets. The HOME campaign runs because of the input from volunteer educators, support from schools and faith institutions in the neighborhood, discounts provided by lenders and local businesses, revenue from neighborhood revitalization, and increasing social capital among community residents.
  • Stakeholders working on the HOME campaign understood that they were challenging a history of racial discrimination and economic injustice. They saw gentrification occurring in nearby neighborhoods. They were aware of backlash from outside property owners who benefit from the status quo. None of these facts are included in the model per se, but a shaded box labeled History and Context was added to serve as a visual reminder that these things are in the background.
  • Draft the logic model using both sides of your brain and all the talents of your stakeholders. Use your artistic and your analytic abilities.
  • Arrange activities and intended effects in the expected time sequence. And don't forget to include important feedback loops - after all, most actions provoke a reaction.
  • Link components by drawing arrows or using other visual methods that communicate the order of activities and effects. (Remember - the model does not have to be linear or read from left to right, top to bottom. A circle may better express a repeating cycle.)
  • Allow yourself plenty of space to develop the model. Freely revise the picture to show the relationships better or to add components.
  • Neatness counts, so avoid overlapping lines and unnecessary clutter.
  • Color code regions of the model to help convey the main storyline.
  • Try to keep everything on one page. When the model gets too crowded, either adjust its scope or build nested models.
  • Make sure it passes the "laugh test." That is, be sure that the image you create isn't so complex that it provokes an immediate laugh from stakeholders. Of course, different stakeholders will have different laugh thresholds.
  • Use PowerPoint or other computer software to animate the model, building it step-by-step so that when you present it to people in an audience, they can follow the logic behind every connection.
  • Don't let your model become a tedious exercise that you did just to satisfy someone else. Don't let it sit in a drawer. Once you've gone through the effort of creating a model, the rewards are in its use. Revisit it often and be prepared to make changes. All programs evolve and change through time, if only to keep up with changing conditions in the community. Like a roadmap, a good model will help you to recognize new or reinterpret old territory.
  • Also, when things are changing rapidly, it's easy for collaborators to lose sight of their common goals. Having a well-developed logic model can keep stakeholders focused on achieving outcomes while remaining open to finding the best means for accomplishing the work. If you need to take a detour or make a longer stop, the model serves as a framework for incorporating the change.
Revisiting the model in this way may prompt you to:

  • Clarify the path of activities to effects and outcomes
  • Elaborate links
  • Expand activities to reach your goals
  • Establish or revise mile markers
  • Redefine the boundary of your initiative or program
  • Reframe goals or desired outcomes

What makes a logic model effective?

You will know a model's effectiveness mainly by its usefulness to intended users. A good logic model usually:

  • Logically links activities and effects
  • Is visually engaging (simple, parsimonious) yet contains the appropriate degree of detail for the purpose (not too simple or too confusing)
  • Provokes thought, triggers questions
  • Includes forces known to influence the desired outcomes

The more complete your model, the better your chances of reaching "the promised land" of the story. In order to tell a complete story or present a complete picture in your model, make sure to consider all forces of change (root causes, trends, and system dynamics). Does your model reveal assumptions and hypotheses about the root causes and feedback loops that contribute to problems and their solutions?

In the HOME model, for instance, low home ownership persists when there is a vicious cycle of discrimination, bad credit, and hopelessness preventing neighborhood-wide organizing and social change. Three pathways of change were proposed to break that cycle: education; business reform; and neighborhood organizing. Building a model on one pathway to address only one force would limit the program's effectiveness.

You can discover forces of change in your situation using multiple assessment strategies, including forward logic and reverse logic as described above. When exploring forces of change, be sure to search for personal factors (knowledge, belief, skills) as well as environmental factors (barriers, opportunities, support, incentives) that keep the situation the same as well as ones that push for it to change.

Take time to simulate

After you've mapped out the structure of a program strategy, there is still another crucial step to take before taking action: some kind of simulation. As logical as the story you are telling seems to you, as a plan for intervention it runs the risk of failure if you haven't explored how things might turn out in the real world of feedback and resistance. Simulation is one of the most practical ways to find out if a seemingly sensible plan will actually play out as you hope. Simulation is not the same as testing a model with stakeholders to see if it makes logical sense. The point of a simulation is to see how things will change - how the system will behave - through time and under different conditions. Though simulation is a powerful tool, it can be conducted in ways ranging from the simple to the sophisticated. Simulation can be as straightforward as an unstructured role-playing game, in which you talk the model through to its logical conclusions. In a more structured simulation, you could develop a tabletop exercise in which you proceed step by step through a given scenario with pre-defined roles and responsibilities for the participants. Ultimately, you could create a computer-based mathematical simulation by using any number of available software tools. The key point to remember is that creating logical models and simulating how those models will behave involve two different sets of skills, both of which are essential for discovering which change strategies will be effective in your community.
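As one small example of the lightweight end of that spectrum, the sketch below (in Python, with numbers that are pure assumptions rather than data from the HOME initiative) simulates a single reinforcing feedback loop: program activities raise the owner-occupied share a little each year, and existing owners strengthen the organizing that recruits still more buyers.

```python
# Toy behavior-over-time simulation; every value here is an invented assumption.
years = 10
owner_share = 0.30          # starting proportion of owner-occupied homes
program_effect = 0.01       # annual gain from counseling, loan assistance, etc.
feedback_strength = 0.05    # reinforcing loop: owners -> stronger organizing -> more new buyers

history = [owner_share]
for _ in range(years):
    gain = program_effect + feedback_strength * owner_share
    owner_share = min(1.0, owner_share + gain)   # a share can never exceed 100%
    history.append(owner_share)

for year, share in enumerate(history):
    print(f"year {year}: owner-occupied share = {share:.2f}")
```

Even a toy run like this makes the shape of the feedback visible (growth speeds up as the loop gains strength), which is precisely the behavior-over-time question that a simulation, simple or sophisticated, is meant to answer.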

What are the benefits and limitations of logic modeling?

You can probably envision a variety of ways in which you might use the logic model you've developed, or ways in which logic modeling more generally could benefit your work.

Here are a few advantages that experienced modelers have discovered.

  • Logic models integrate planning, implementation, and evaluation. As a detailed description of your initiative, from resources to results, the logic model is equally important for planning, implementing, and evaluating the project. If you are a planner, the modeling process challenges you to think more like an evaluator. If your purpose is evaluation, the modeling prompts discussion of planning. And for those who implement, the modeling answers practical questions about how the work will be organized and managed.
  • Logic models prevent mismatches between activities and effects. Planners often summarize an effort by listing its vision, mission, objectives, strategies and action plans . Even with this information, it can be hard to tell how all the pieces fit together. By connecting activities and effects, a logic model helps avoid proposing activities with no intended effect, or anticipating effects with no supporting activities. The ability to spot such mismatches easily is perhaps the main reason why so many logic models use a flow chart format.
  • Logic models leverage the power of partnerships. As the W.K. Kellogg Foundation notes (see Internet Resources below), refining a logic model is an iterative or repeating process that allows participants to "make changes based on consensus-building and a logical process rather than on personalities, politics, or ideology. The clarity of thinking that occurs from the process of building the model becomes an important part of the overall success of the program." With a well-specified logic model, it is possible to note where the baton should be passed from one person or agency to another. This enhances collaboration and guards against things falling through the cracks.
  • Logic models enhance accountability by keeping stakeholders focused on outcomes. As Connie Schmitz and Beverly Parsons point out (see Internet Resources), a list of action steps usually functions as a manager's guide for running a project, showing what staff or others need to do--for example, "Hire an outreach worker for a TB clinic." With a logic model, however, it is also possible to illustrate the effects of those tasks--for example, "Hiring an outreach worker will result in a greater proportion of clients coming into the clinic for treatment." This short-term effect then connects to mid- and longer-term effects, such as "Satisfied clients refer others to the clinic" and "Improved screening and treatment coverage results in fewer deaths due to TB."

In a coalition or collaborative partnership, the logic model makes it clear which effects each partner creates and how all those effects converge to a common goal. The family or nesting approach works well in a collaborative partnership because a model can be developed for each objective along a sequence of effects, thereby showing layers of contributions and points of intersection.

  • Logic models help planners to set priorities for allocating resources . A comprehensive model will reveal where physical, financial, human, and other resources are needed. When planners are discussing options and setting priorities, a logic model can help them make resource-related decisions in light of how the program's activities and outcomes will be affected.
  • Logic models reveal data needs and provide a framework for interpreting results. It is possible to design a documentation system that includes only beginning and end measurements. This is a risky strategy with a good chance of yielding disappointing results. An alternative approach calls for tracking changes at each step along the planned sequence of effects. With a logic model, program planners can identify intermediate effects and define measurable indicators for them.
  • Logic models enhance learning by integrating research findings and practice wisdom . Most initiatives are founded on assumptions about the behaviors and conditions that need to change, and how they are subject to intervention. Frequently, there are different degrees of certainty about those assumptions. For example, some of the links in a logic model may have been tested and proved to be sound through previous research. Other linkages, by contrast, may never have been researched, indeed may never have been tried or thought of before. The explicit form of a logic model means that you can combine evidence-based practices from prior research with innovative ideas that veteran practitioners believe will make a difference. If you are armed with a logic model, it won't be easy for critics to claim that your work is not evidence-based.
  • Logic models define a shared language and shared vision for community change . The terms used in a model help to standardize the way people think and how they speak about community change. It gets everyone rowing in the same direction, and enhances communication with external audiences, such as the media or potential funders. Even stakeholders who are skeptical or antagonistic toward your work can be drawn into the discussion and development of a logic model. Once you've got them talking about the logical connections between activities and effects, they're no longer criticizing from the sidelines. They'll be engaged in problem-solving and they'll be doing so in an open forum, where everyone can see their resistance to change or lack of logic if that's the case.

Limitations

Any tool this powerful must not be approached lightly. When you undertake the task of developing a logic model, be aware of the following challenges and limitations.

First, no matter how logical your model seems, there is always a danger that it will not be correct. The world sometimes works in surprising, counter-intuitive ways, which means we may not comprehend the logic of change until after the fact. With this in mind, modelers will appreciate the fact that the real effects of intervention actions could differ from the intended effects. Certain actions might even make problems worse, so it's important to keep one eye on the plan and another focused on the real-life experiences of community members.

If nothing else, a logic model ought to be logical. Therein lies its strength and its weakness. Those who are trying to follow your logic will magnify any inconsistency or inaccuracy. This places a high burden on modelers to pay attention to detail and refine their own thinking to a great degree. Of course, no model can be perfect. You'll have to decide, based on how stakeholders will use the model, what level of precision is required.

Establishing the appropriate boundaries of a logic model can be a difficult challenge. In most cases, there is a tension between focusing on a specific program and situating that effort within its broader context. Many models seem to suggest that the only forces of change come from within the program in question, as if there is only one child in the sandbox.

At the other extreme, it would be ridiculous and unproductive to map all the simultaneous forces of change that affect health and community development. A modeler's challenge is to include enough depth so the organizational context is clear, without losing sight of the reasons for developing a logic model in the first place.

On a purely practical level, logic modeling can also be time consuming, requiring much energy in the beginning and continued attention throughout the life of an initiative. The process can demand a high degree of specificity; it risks oversimplifying complex relationships and relies on the skills of graphic artists to convey complex thought processes.

Indeed, logic models can be very difficult to create, but the process of creating them, as well as the product, will yield many benefits over the course of an initiative.

A logic model is a story or picture of how an effort or initiative is supposed to work. The process of developing the model brings together stakeholders to articulate the goals of the program and the values that support it, and to identify strategies and desired outcomes of the initiative.

As a means to communicate a program visually, within your coalition or work group and to external audiences, a logic model provides a common language and reference point for everyone involved in the initiative.

A logic model is useful for planning, implementing and evaluating an initiative. It helps stakeholders agree on short-term as well as long-term objectives during the planning process, outline activities and actors, and establish clear criteria for evaluation during the effort. When the initiative ends, it provides a framework for assessing overall effectiveness of the initiative, as well as the activities, resources, and external factors that played a role in the outcome.

To develop a model, you will probably use both forward and reverse logic. Working backwards, you begin with the desired outcomes and then identify the strategies and resources that will accomplish them. Combining this with forward logic, you will choose certain steps to produce the desired effects.

You will probably revise the model periodically, and that is precisely one advantage to using a logic model. Because it relates program activities to their effect, it helps keep stakeholders focused on achieving outcomes, while it remains flexible and open to finding the best means to enact a unique story of change.

Online Resources

The Community Builder’s Approach to Theory of Change: A Practical Guide to Theory Development, from The Aspen Institute’s Roundtable on Community Change.

A concise definition by Connie C. Schmitz and Beverly A. Parsons.

The CDC Evaluation Working Group provides a linked section on logic models in its resources for project evaluation.

The Evaluation Guidebook for Projects Funded by S.T.O.P. Formula Grants under the Violence Against Women Act includes a chapter on developing and using a logic model (Chapter 2), and additional examples of models in the "Introduction to the Resource Chapters."

A logic model from Harvard that uses a family/school partnership program.

Excerpts from United Way's publication on Measuring Program Outcomes. See especially "Program Outcome Model."

Logic Model Magic Tutorial from the CDC - this tutorial will provide you with information and resources to assist you as you plan and develop a logic model to describe your program and help guide program evaluation. You will have opportunities to interact with the material, and you can proceed at your own pace, reviewing where you need to or skipping to sections of your choice.

Tara Gregory on Using Storytelling to Help Organizations Develop Logic Models discusses techniques to facilitate creative discussion while still attending to the elements in a traditional logic model. These processes encourage participation by multiple staff, administrators and stakeholders and can use the organization’s vision or impact statement as the “happily ever after.”

Theory of Change: A Practical Tool for Action, Results and Learning , prepared by Organizational Research Services.

Theories of Change and Logic Models: Telling Them Apart  is a helpful PowerPoint presentation saved as a PDF. It’s from the Aspen Institute Roundtable on Community Change.

University of Wisconsin’s Program Development and Evaluation  provides a comprehensive template for a logic model and elaborates on creating and developing logic models.

The U.S. Centers for Disease Control Evaluation Group  provides links to a variety of logic model resources.

The W.K. Kellogg Foundation Logic Model Development Guide  is a comprehensive source for background information, examples and templates (Adobe Acrobat format).

Print Resources

American Cancer Society (1998). Stating outcomes for American Cancer Society programs: a handbook for volunteers and staff. Atlanta, GA: American Cancer Society.

Julian, D. (1997). The utilization of the logic model as a system level planning and evaluation device. Evaluation and Program Planning 20(3): 251-257.

McEwan, K., & Bigelow, A. (1997). Using a logic model to focus health services on population health goals. Canadian Journal of Program Evaluation 12(1): 167-174.

McLaughlin, J., & Jordan, B. (1999). Logic models: a tool for telling your program's performance story. Evaluation and Program Planning 22(1): 65-72.

Moyer, A., Verhovsek, et al. (1997). Facilitating the shift to population-based public health programs: innovation through the use of framework and logic model tools. Canadian Journal of Public Health 88(2): 95-98.

Rush, B., & Ogbourne, A. (1991). Program logic models: expanding their role and structure for program planning and evaluation. Canadian Journal of Program Evaluation 6: 95-106.

Taylor-Powell, E., Rossing, B., et al. (1998). Evaluating collaboratives: reaching the potential. Madison, WI: University of Wisconsin Cooperative Extension.

United Way of America (1996). Measuring program outcomes: a practical approach. Alexandria, VA: United Way of America.

Western Center for the Application of Prevention Technologies (1999). Building a Successful Prevention Program. Reno, NV: Western Center for the Application of Prevention Technologies.

Wong-Reiger, D., & David, L. (1995). Using program logic models to plan and evaluate education and prevention programs. In Love, A. (Ed.), Evaluation Methods Sourcebook II. Ottawa, Ontario: Canadian Evaluation Society.


Logic Model Guide



What is the Logic Model?

A logic model is useful for planning, implementing, and evaluating programs more effectively. Sopact Sense lets you create logic models and goes further by aligning them with a data strategy and dashboard plan, turning them into actionable strategies for success and unlocking their full potential.

A logic model is a visual representation of the relationships between program inputs, activities, outputs, and outcomes. It can help you design, check, and communicate the effectiveness of your program.

The logic model can help you improve program efficiency and accountability to enhance stakeholder engagement and funding opportunities. However, creating effective logic models can be challenging. It requires a deep understanding of program theory, data collection, and analysis of stakeholder feedback. Sopact Sense offers comprehensive tools and resources to guide you through the process and help you succeed.

Watch our video to see how it works, explore strategies for inspiration, and sign up. With Sopact's actionable approach to impact strategy, you can unlock your full program potential and make a real difference.

Logic Model Components

An effective logic model helps program staff connect resources and activities to longer-term goals. Understanding each component in detail is crucial for developing a model that effectively guides your project or program. The elements of a logic model include:

  • Inputs: These are the foundational resources required for your project. Inputs include financial investments, human resources, materials, equipment, and technology. This section of the logic model answers the question: What resources do we need to carry out our plan?
  • Activities: Program activities are the actions undertaken using the inputs. They are the strategies, techniques, and processes you deploy to meet your objectives. These can range from conducting research and hosting workshops to implementing specific interventions. Activities are the ‘doing’ part of your model.
  • Outputs: Outputs are the direct, tangible products of your activities. You can measure them, and they may include the number of training sessions, articles, workshops, or people involved. Outputs represent the immediate results of your efforts.
  • Outcomes: Outcomes are the changes or benefits that occur because of your activities and outputs. Expected outcomes are typically categorized as short-term (immediate), medium-term (intermediate), or long-term (final). Changes can happen in what people know, think, do, or can do, or in their situation.
  • Impact: This is your project's ultimate goal or long-term effect. Impact reflects the broader changes or improvements in the community, system, or organization resulting from your program. It answers the question: What difference did our project make in the long run?

A logic model ensures that your project is more than just a collection of activities. It helps create a well-thought-out strategy to achieve significant outcomes.

Understanding the Logic Model Framework

Are you involved in program evaluation and planning? Want to create effective strategies for your organization's success? Look no further! The logic model framework is here to help.

This guide will explain logic models and how they can improve your program.

If you are a nonprofit, a business, or just someone who wants to make a difference, it is important to grasp the logic model framework.

From logic model assumptions to outcome measurement, and from strategic planning to theory of change, we will cover it all. Prepare to harness the potential of the logic model framework and take your program evaluation to the next level.

Let's dive in and discover the logic model framework together!

What is a Logic Model?

A logic model is a visual representation of how a program is expected to work. It outlines the resources, activities, outputs, outcomes, and impacts of a program in a logical sequence. The logic model framework is based on the idea that if you do certain activities, you will produce certain outputs, which will lead to specific outcomes and ultimately, impact. It is a useful tool for program planning, implementation, and evaluation.

The Components of a Logic Model

A logic model typically consists of four main components: inputs, activities, outputs, and outcomes. Some models also include a fifth component, impact. Let's take a closer look at each of these components below.

Assumptions

Logic model assumptions are an important aspect of a logic model. They are the beliefs or expectations about how the program will work and the conditions necessary for success. Assumptions are often based on previous experience, research, or expert knowledge.

Inputs are the resources needed to implement a program. These can include funding, staff, volunteers, equipment, and other resources. Inputs are essential for the success of a program, as they provide the necessary support for activities to take place.

Activities are the actions taken to achieve the desired outcomes of a program. These can include workshops, training sessions, events, and other interventions. Activities are the most visible part of a program and are directly linked to the inputs.

Outputs are the direct results of the activities. These can include the number of people trained, the number of workshops held, or the number of resources distributed. Outputs are tangible and measurable and provide evidence of the program's progress.

Outcomes are the changes that occur as a result of the program. These can be short-term, intermediate, or long-term. Short-term outcomes are immediate changes that occur as a result of the program, while intermediate and long-term outcomes are changes that occur over time. Outcomes are often categorized into three types: knowledge, attitude, and behavior.

Impact is the ultimate goal of a program. It is the long-term change that the program aims to achieve. Impact is often measured by looking at the overall effect of the program on the target population.

How can a program logic model help?

Here are several ways in which a logic model can be helpful:

Strategic Planning: A logic model can be useful for planning and designing a program or intervention. It helps planners figure out what they need, what they will do, and what they hope to achieve.

Communication: A logic model can communicate the logic behind a program to stakeholders, such as funders, partners, and community members. It can help clarify the program's goals and objectives and demonstrate how it intends to achieve its desired outcomes.

Evaluation: A logic model can guide the assessment of a program. It can help evaluators identify appropriate indicators and data sources and track program performance over time.

Continuous improvement: The logic model can be a valuable tool for continuous improvement. Staff can learn which programs are working well and which require modification or improvement. It can also help identify any unintended consequences of the program.

Why use a logic model?

A logic model helps you understand and improve program effectiveness. With Sopact's approach, the main difference is that you start with a dashboard design when creating the logic model and then connect your data strategy to it to achieve the best outcomes.

The power of logic model program evaluation

Logic model program evaluation is a powerful tool for assessing the effectiveness of programs and interventions. It provides a framework for rigorously evaluating a program's inputs, activities, outputs, and outcomes. This evaluation helps organizations identify strengths and areas for improvement. It enables better decision-making and leads to improved outcomes.

Let's take the example of a non-profit organization focused on reducing "food insecurity" in a particular community. The organization has implemented a program that provides free meals to children in low-income families. The program has been running for a few years, and the organization wants to assess its effectiveness.

Logic model for a nonprofit

The organization would first develop a logic model that outlines the program's inputs, activities, outputs, and outcomes. For example,

  • The inputs would include the resources and funding that the organization has invested in the program.
  • The activities would include the processes and strategies that the organization uses to provide free meals to children.
  • The outputs would include the number of meals served, the number of children reached, and other relevant metrics.
  • The outcomes would include the program's short- and long-term impacts, such as the children's improved health and academic performance.

Once the organization develops the logic model, it can collect data to assess the program's effectiveness. This could involve surveys of program participants and their families, analyzing program data, and using other relevant sources. The organization can then use this data to make evidence-based decisions about the program's future.
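As a purely illustrative sketch of that example, the free-meals program could be written out as a small logic model record and checked against a target. Every name and number below is invented for illustration, not drawn from a real program.

```python
# Hypothetical data for the free-meals example above; all values are invented.
meal_program = {
    "inputs": ["grant funding", "kitchen staff", "volunteer drivers"],
    "activities": ["prepare meals", "distribute meals", "coordinate with schools"],
    "outputs": {"meals_served": 12_500, "children_reached": 430},
    "outcomes": {
        "short_term": ["children report fewer missed meals"],
        "long_term": ["improved health and academic performance"],
    },
}

# A toy effectiveness check: compare one output metric against a target.
target_children = 400
reached = meal_program["outputs"]["children_reached"]
status = "on track" if reached >= target_children else "below target"
print(f"Children reached: {reached} (target {target_children}) -> {status}")
```

In practice, the outcome data would come from surveys and program records rather than hard-coded values, but the same structure of outputs compared against targets underlies most logic model evaluations.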

Logic Model Framework

Creating a logic model framework involves the following steps. You can start with Sopact's Impact Strategy, which comes with more than 200 logic model templates.

1. Identify the purpose of your program or intervention: Determine the specific goals and objectives you want to achieve. For example, if your program aims to reduce food insecurity, the goal may be to increase access to nutritious meals for low-income families.

2.  Define the inputs: Identify the resources and funding needed to implement the program. This includes staff, volunteers, equipment, facilities, and financial support.

3. Outline the activities: Describe the processes and strategies that will be used to deliver the program. This includes the steps involved in providing free meals to children, such as meal preparation, distribution, and coordination with community partners.

4. Determine the outputs: Specify the tangible and measurable results of your program. This includes the number of meals served, the number of children reached, and any other relevant metrics that demonstrate program delivery.

5. Identify the outcomes: Determine the short-term, intermediate, and long-term impacts of your program. This includes the changes or improvements you hope to see in the target population, such as improved health and academic performance in children.

6. Establish the assumptions: Identify any assumptions or beliefs that underlie your logic model. These are the key factors you believe will contribute to the success of your program.

7. Create a logic diagram: Create a visual diagram, such as a flowchart or sketch, to portray the logical links among your program's inputs, activities, outputs, and outcomes. It helps you picture the relationships and dependencies among the various elements.

8. Test and refine the logic model: Collect data and evidence to assess the effectiveness of your program. This may involve conducting surveys, interviews, or analyzing program data. Use this information to refine and improve your logic model as needed.

Remember, the logic model framework provides a structured approach to outcome measurement  and helps organizations understand the theory of change behind their programs. It enables better decision-making and supports evidence-based practices.


Real world logic model example

Let's start with a simple example of a logic model from Upaya Social Ventures, an impact fund based in Seattle. Their mission focuses on creating dignified jobs to eliminate extreme poverty, aligning with SDG 1 and SDG 8.

Explore their model and the metrics they use to assess fair pay, economic growth, and the eradication of extreme poverty.

An insightful video by Sachi Shenoy, a co-founder of Upaya Social Ventures, addresses a crucial question: "How do you create jobs?"

Upaya supports entrepreneurs in India's underprivileged communities to generate jobs for the extremely poor. They make early seed investments in promising businesses.

Upaya provides patient equity capital to entrepreneurs, helping them expand their businesses. They offer guidance in operations and financial management and facilitate connections for future investments, ensuring growth beyond their involvement.

While Upaya has influence over these initial steps, the subsequent job creation and poverty alleviation, though hoped for, are not directly under their control. The aim is that new jobs will provide stable, higher incomes, lifting people out of poverty. Upaya tracks this progress across various dimensions.

By defining a logic model, Upaya then sets specific metrics to ensure they adhere to their mission.

Review Upaya's logic model to understand in depth how they track progress toward these goals.

Logic model planning

Logic model planning uses the logic model to design and implement a program or intervention. It involves identifying the resources needed, carrying out the planned activities, and achieving the program's goals.

The process of logic model planning typically involves several steps, including:

  • First, identify the problem or need that the program will address. This may involve gathering data or conducting a needs assessment to determine the nature and extent of the problem.
  • Next, determine the program's goals and objectives. These should be specific, measurable, attainable, relevant, and time-bound goals that the program hopes to achieve.
  • Then, identify the resources needed for the program, such as funding, staff, and supplies.
  • Design the activities that will use those resources. These activities should be structured to achieve the program's goals and objectives.
  • Finally, identify the outcomes the program aims to achieve. These outcomes should be specific, measurable, achievable, and aligned with the program's goals and objectives.

Logic model planning is about creating a plan to implement a program or intervention. It ensures the program aligns with the resources and activities needed to achieve the desired outcomes.

Logic Model Evaluation

This involves collecting and analyzing data to determine whether the program meets its objectives. It also involves identifying any areas that may require modifications or improvements.

Logic model evaluation typically involves several steps, including:

  • Identifying the outcomes the program intends to achieve: The logic model should describe the program's goals, such as improving participants' knowledge and skills.
  • Identifying indicators and data sources: Evaluators need to find indicators and data sources to measure the progress and impact of each outcome.
  • Collecting and analyzing data: Using those indicators and data sources, evaluators collect and analyze data to see whether the program is achieving its goals.
  • Reporting findings: Evaluators should report the program's strengths and identify areas that require change or improvement.

The logic model evaluation helps program planners and stakeholders understand the program's effectiveness and identify areas for improvement. This evaluation is crucial for achieving the desired results.


A logic model is valuable for program planning, implementation, and evaluation. Breaking a program or intervention down into smaller parts makes it easier to plan, understand, and communicate clearly. A good logic model helps organizations understand program theory, make evidence-based decisions, and improve outcomes.

Explore more on Theory of Change.

Dive deeper into Sopact University.

Frequently asked questions

  • When should a logic model be used?
  • How does a logic model differ from a theory of change?
  • How detailed should a logic model be?

Related articles


Affordable Housing Dashboard: Visualize Impactful Data


Social Impact Definitions: Key Social Impact Terms


Grant Reporting Best Practices

Useful links.

How to Use Strategic Planning Frameworks and Models

By Joe Weller | April 12, 2019 (updated July 26, 2021)


Strategic planning models and frameworks can help guide you through the strategic planning process. In this article, seasoned industry experts explain the models and frameworks to help you identify which is best for you.

Included on this page, you'll find different types of models and frameworks, tools to help you decide which models and frameworks to use, and details on how to use strategic planning models.

Strategic Planning Basics

Strategic planning is a team process that sets up how your company will accomplish its goals. When you deploy it correctly, strategic planning highlights problems, helps find solutions, and monitors progress. To learn more about the basics of strategic planning, read this article.

A strategic plan includes many sections. When done well, a strategic plan can help you prioritize your company’s functions and stay in line with your mission and vision.

There are different ways to present a strategic plan — for example, it can be a written document, a spreadsheet, or an animated presentation. To learn how to write a strategic plan, read this article.

Strategic Planning Frameworks and Models

Just as there are many approaches to presenting a strategic plan, you have several ways to frame or model your plan.

The terms strategic planning framework and strategic planning model are often used interchangeably, but some say they are different.


“Think of models as a way of ideating strategy. [A model is] a template: You use it at the beginning of the planning process. The idea behind a model is to tease out the ideas,” says Tom Wright, CEO and Co-Founder of Cascade Strategy, a software company based in Sydney, Australia, with offices all over the world. “Frameworks are like a lens to help you see different perspectives, whereas the model is a process you would use from the beginning. You add a framework on top of the [strategic plan] to slice and dice the model.”

There are many strategic planning models and frameworks — some are tried and true, others are newer and more adaptive, and planners and managers may be more familiar with some methods than others. There is no one right or wrong way to create a strategic plan, and you can modify models and frameworks based on your company culture, your current situation, and the purpose behind your planning.

“The major driver [for picking a model or framework] is what type of business you are and what you want to accomplish,” Wright explains.

Strategic planners often utilize different frameworks or customize particular models as they move through the planning process. But be careful; customizing models or frameworks too much might confuse people who are familiar with a particular planning process.

Below is a list of some of the most common frameworks and models:

Alignment Model: This model helps align your mission statement with available resources. It is particularly effective for businesses facing internal struggles.

Balanced Scorecard (BSC): The balanced scorecard system strives to connect big-picture elements with operational elements. BSC is well-known and works for companies of varying sizes.

The Basic Model: Sometimes called a simple strategic planning model, the basic model involves creating a mission statement, goals, and strategies.

Blue Ocean Strategy: This framework emphasizes new markets and uncontested space.

Gap Planning: A strategy gap is the distance between how a company is currently performing and its desired goal. Gap planning is the analysis and evaluation of that difference.

Inspirational Model: This is a somewhat quick method of strategic planning that begins by coming up with a highly inspirational vision for the organization and the goals to match.

Issue-Based or Goal-Based Model: A step up from the basic model, this model is better for more established businesses. It incorporates SWOT or other types of assessments to determine goals, mission statements, action plans, and other steps.

Organic Model: As the name implies, this model does not necessarily follow a set plan, instead evolving and changing as conditions warrant.

PEST Model: The PEST (political, economic, social, and technological) approach looks at elements of the external environment, including the forces in its name.

Porter’s Five Forces: This model looks at five competitive forces that are present in every industry and helps to determine strengths and weaknesses: competition in the industry, the potential of new entrants into the industry, the power of suppliers, the power of customers, and the threat of substitute products.

Real-Time Model: This is a fluid process that works best for companies that operate in a rapidly changing environment.

Scenario Model: When used in conjunction with other models, the scenario model can help you identify goals, as well as issues around them, by using scenarios that might arise. Some experts say this is more of a technique than a model.

Strategy Mapping: This approach helps organizations design and communicate their strategies. Strategy mapping often falls under the umbrella of a BSC, but strategy maps can also stand alone.

SWOT Analysis or SWOT Matrix: SWOT (strengths, weaknesses, opportunities, and threats) offers a way to examine both internal and external forces impacting your company.

VRIO Framework: This approach looks at the questions of value, rarity, imitability, and organization concerning the competitive potential of a company.

Using Strategic Planning Models

In this section, you’ll learn the specifics of the different strategic planning models and frameworks. Sometimes, models can serve as a visual guide. In contrast, frameworks function as an overlay to a model that helps clarify particular items, such as goals.

The Basic Model

The basic model of strategic planning is the most common and simplistic approach. The basic model works well for companies that are small, do not have much time to plan, don’t need to address many serious issues, or operate in stable external environments. It also works for companies that are new to strategic planning.

The basic model is not meant for organizations with significant resources to pursue ambitious visions and goals.

The basic model centers on the mission and vision statements. The vision statement identifies your company’s purpose on a higher level, and the mission statement outlines what happens within the organization to achieve that vision. It makes sense to build the rest of your plan from these statements.

The next step is to come up with goals you must achieve to live up to your mission and make it a reality, then outline what must happen to achieve those goals. Next, list the specific activities you must implement and who will participate in those activities. Lastly, create a simple monitoring plan to make sure your organization stays on track.

The Issue-Based or Goal-Based Model

The issue- or goal-based model evolves from the basic model and results in a more comprehensive plan. The steps vary.

This approach is dynamic and fluid, and it works well for businesses that want to go deeper into strategic planning but have the following concerns:

Limited resources for planning

Several issues to address

Limited past success reaching ambitious goals

No buy-in for the strategic planning process

The issue-based model requires organizations to identify their most important current issues, suggest action plans to address those issues, and include that information in the strategic plan.

The goal-based model often includes the following:

A way to monitor and amend the plan

Action plans, including objectives, resources, and implementation roles

Core values

Major issues and goals, along with ways to address them

Mission statement

SWOT analysis

Vision statement

Yearly operating plan, including a budget

The Alignment Model

The alignment model focuses on making sure an organization’s actions align with its vision. In the plan, you outline the mission, resources, programs, and support your organization needs to ensure it fulfills its vision.

The alignment model works well for organizations that are trying to figure out what is and isn’t working well, along with what needs adjusting. The process can help identify issues, such as internal inefficiencies and productivity problems. However, some critics of this model say it functions more like an internal development plan than a strategic plan.

The Scenario Model

The scenario model looks at what is happening outside of an organization, including regulatory, demographic, or political forces, to determine how they can impact what is happening inside of a company.

The scenario model works best when used in combination with other models and is more of a technique than a model.

For each change in an external force, discuss how it could impact the future of your organization in the following three ways:

A best-case scenario

A worst-case scenario

A reasonable-case scenario

After looking at the three potential impacts, figure out how to best respond to each. Then pick the most likely scenario and discuss strategies to address it.

The scenario model works well for businesses that need help planning for several potential situations.

The Organic Model

If your company wants to stay away from strict and formal strategic planning, the organic model might be a good fit. As the name implies, the organic model of strategic planning is more of a free-spirited conversation, rather than a set process. It emphasizes the journey over the destination.

The organic model relies on everyone having a shared vision and being willing to openly discuss how to get there using common values. This less systematic model requires patience since it involves constant dialogue and is never really finished.

The organic model works well for organizations where traditional methods feel static and obsolete. If you are looking for a set plan outlining steps to follow, the organic model is not for you.

Storyboarding techniques and open dialogue are often a part of the organic model, and everyone is encouraged to participate openly. The focus is more on learning and less about the method of strategic planning.

Real-Time Strategic Planning

The real-time method of strategic planning is even more fluid than the organic model. It helps articulate an organization’s mission and, sometimes, its vision and values. Real-time strategic planning often involves presenting lists to board members or management for further discussion.

Like the organic model, real-time strategic planning is a continuous process and works best for rapidly changing organizations that might not have the need for set, detailed, or traditional strategic planning.

Inspirational Model

As the name implies, the inspirational model can be energizing to participants, but it may also have less of a strategic impact on an organization than other, more formal models.

The process works by gathering people to talk about a highly inspirational vision for the company. Leaders encourage participants to brainstorm exciting and far-reaching goals, then capture the details using powerful and poignant wording.

The inspirational model works well for organizations looking to lift the spirits of a staff or to quickly produce a plan.

Strategic Planning Frameworks

Like models, strategic planning frameworks help an organization through the strategic planning process. Most frameworks cover the basics of strategic planning (mission, vision, goals), but include additional sections and have more specific focus areas.

Balanced Scorecard Framework

One of the more popular strategic planning frameworks is the balanced scorecard. It functions as both a strategic planning and management system, and it helps connect a company’s plan to the operational elements that make it happen. The balanced scorecard takes more than financial profits into account when measuring success.

Companies use the balanced scorecard to do the following:

Align the daily work to the longer-term strategy.

Communicate what they are doing and why.

Set priorities.

Monitor progress and measure success.

When Drs. Robert Kaplan and David Norton created the balanced scorecard in the 1990s, it changed the way many companies do their strategic planning because it focused on more than one performance metric.

Balanced Scorecard Quadrants

Companies that use the balanced scorecard try to look at themselves using four unique perspectives to get a better understanding of their planning:

Financial performance

Stakeholders and customers the company is serving

An internal review of how the company is operating

Learning and growth (including capacity, infrastructure, technology, and culture)

The key to the balanced scorecard is that a business should maintain a balance across the four quadrants.

Cascade’s Wright says the balanced scorecard works well for medium and large companies that don’t change very quickly or don’t need to make radical changes.

To learn more about the balanced scorecard and find free templates and examples, read this article.

Strategy Mapping

Strategy mapping can help an organization reach its goals by providing a visual tool to communicate a strategic plan. Strategy mapping is often part of (but is not exclusive to) the balanced scorecard framework.

Because the graphic is visual and simple, it is an easy way to show how one objective impacts others.

Strategy mapping helps you identify key goals and unify those goals into strategies. People can refer back to it in order to stick to the overall plan when working on tasks or making decisions.

The map shows how different items interact with each other in various ways, including a cause-and-effect relationship.

Strategy maps are often set up in the following manner:

List the four perspectives (financial, customer, internal process, and learning and growth) horizontally.

Place objectives within those perspectives.

Write sets of linked objectives across different perspectives (these are called strategic themes).

Show cause-and-effect impacts between objectives and across perspectives.

Strategy map example (image credit: ClearPoint Strategy)

Use this template to help you organize your thoughts visually. By thinking of how different perspectives relate to each other, you can come up with your objectives.

Download Strategy Map Template (Excel or Smartsheet)

Porter’s Five Forces

Porter’s Five Forces approach helps companies assess the competitiveness of the market. Introduced in 1979, it is one of the oldest strategic planning frameworks.

This approach focuses on the five forces that can impact the profitability of an organization:

The Threat of Entry: Can new companies enter the market?

The Threat of Other Substitute Products or Services: Is there a competitor on the market that your customers could use instead of your product or service?

Customers’ Bargaining Power: Can customers pressure you to react to their demands?

Suppliers’ Bargaining Power: Can suppliers apply pressure to your company?

Competitive Rivalry Among Companies: If a rival company changes its strategy, will it impact yours?

The key is to look at the amount of pressure each force applies to a company in order to determine that company’s future.

Download Five Forces Model Template (Excel or PDF)

SWOT Analysis

Most strategic planning processes include a SWOT analysis. Many companies perform a SWOT analysis at the beginning of the strategic planning process, as it offers them a look at what they are doing well and where they can improve.

A SWOT analysis examines the following:

Strengths: What the business does well to achieve its objectives

Weaknesses: What activities could keep a business from achieving its objectives

Opportunities: The external factors that could help achieve its objectives

Threats: Possible external factors that could keep the company from achieving its objectives

Strengths and weaknesses are internal characteristics, while opportunities and threats are external.
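As an optional illustration (not part of any official SWOT tooling), the four quadrants can be kept as a simple record, with the internal versus external split made explicit. The example entries are invented.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SwotAnalysis:
    # Internal characteristics
    strengths: List[str] = field(default_factory=list)
    weaknesses: List[str] = field(default_factory=list)
    # External factors
    opportunities: List[str] = field(default_factory=list)
    threats: List[str] = field(default_factory=list)

# Invented example entries, for illustration only.
swot = SwotAnalysis(
    strengths=["experienced staff"],
    weaknesses=["limited marketing budget"],
    opportunities=["growing demand in adjacent markets"],
    threats=["new low-cost competitor"],
)
```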

You've seen how the four quadrants of a SWOT analysis work. Use this template to write down each factor, so you can view your strengths, weaknesses, opportunities, and threats.

Download SWOT Analysis Template (Excel, Word, PDF, or Smartsheet)

PEST/PESTEL Planning

PEST stands for political, economic, sociocultural, and technological factors. There are several variations based on the idea, including PESTEL or PESTLE (when you also consider environmental and legal factors) or STEEPLED, where you consider sociocultural, technological, economic, environmental, political, legal, education, and demographic information.

These frameworks look at an industry or business environment and see what factors could impact an organization’s overall health and well-being. These do not stand alone and often go along with a SWOT analysis and other frameworks.

Below are some possible examples of these factors:

Political: Changes in tax laws, trading relationships, grant changes

Economic: Interest rate changes, inflation, consumer demand

Social: Changing lifestyle trends, demographic shifts

Technological: Competing technologies, productivity changes

Legal: Changes in regulations, employment laws

Environmental: Changes in customer expectations or regulations

Download PEST Analysis Template

Gap Planning

Gap planning allows you to compare an organization’s current position to its goal, then identify ways to bridge that gap. Gap planning can also help you identify internal deficiencies. Gap planning is sometimes known as gap analysis, needs assessment, or a strategic planning gap.

For a more detailed look at gap planning, read this article.

Blue Ocean Strategy

Created by professors W. Chan Kim and Renee Mauborgne in 2005, the blue ocean strategy is a relatively new planning framework. The idea of a blue ocean is to create an uncontested market space for your company. By contrast, a red ocean is a market space that is already developed and saturated.

A blue ocean is the unknown. A company creates demand for a product or service instead of fighting over it, so there is plenty of opportunity for everyone. The idea is to pursue differentiation, thereby creating market share instead of trying to beat competitors.

A red ocean is the known market space. Industries in that space define and accept the boundaries that exist, and they play by the rules. The only way to get ahead is to outperform rivals to claim a bigger share of the market. The competition can be bloody, which leads to the term red ocean .

An example of an organization that found a blue ocean is Cirque du Soleil. Instead of operating as a typical circus, it found and expanded on a niche. The key to the blue ocean strategy is to make the competition irrelevant because you are doing something the others are not.

VRIO Framework

VRIO (value, rarity, imitability, and organization) is a framework that deals primarily with the vision statement, rather than the entire strategy for a company. By answering four main questions, an organization should be able to create a vision statement to take it through the rest of the planning process. This results in a competitive advantage in your marketplace.

Below are the four main questions:

Value: Using a particular resource, can you exploit an opportunity or get rid of a threat?

Rarity: Is there a lot of competition in your market, or do a few entities control most of the market?

Imitability: Can anyone else do what you do?

Organization: Are you organized enough as a company to adequately exploit your product or service?

Companies can use the VRIO framework to evaluate their resources and capabilities as part of the overall strategic planning process. VRIO comes into play after a company creates a vision statement, but before the rest of the planning process. The advantages you identify help determine what you need to do in order to achieve them.

McKinsey’s Strategic Horizons

McKinsey’s Strategic Horizons framework focuses on growth and innovation by categorizing goals into three categories: the core business, emerging opportunities, and new business.

McKinsey’s Three Horizons model (image credit: Cascade)

“McKinsey’s is one of my favorites because it applies to businesses small to large and generates excitement,” says Wright. He adds that it is an easy model because it does not involve much jargon and focuses on the future.

The first horizon deals mostly with core activities in which a company is already engaged. Existing revenue is placed here, so goals mostly deal with improving margins and processes, as well as maintaining incoming cash flow. The second horizon involves taking what is already happening and expanding it into new areas. The third horizon involves new directions, possibly including research and new programs. Wright recommends a 70/20/10 split between the three horizons.

Fast-growing and startup organizations might find McKinsey’s framework helpful.

The Ansoff Matrix

Sometimes called the product-market matrix, the Ansoff matrix looks at market penetration and potential future growth. It helps companies that want to grow sales volumes or that have growth as a major focus area.

In this matrix, market development concerns selling more of an existing product or service to a new group of people. Market penetration focuses on selling even more of a current product or service to the same people. Product development focuses on developing new products for current customers. Diversification is all about new products and services and new markets; this carries the most risk, but potentially offers large gains.

Wright says this framework helps companies think deeply about how they will achieve growth instead of merely saying they want to grow.

The Bryson Model or Strategy Change Cycle

John M. Bryson, McKnight Presidential Professor of Planning and Public Affairs at the Hubert H. Humphrey School of Public Affairs, University of Minnesota, and author of Strategic Planning for Public and Nonprofit Organizations: A Guide to Strengthening and Sustaining Organizational Achievement, created the Bryson model. Some people, himself included, call it the Strategy Change Cycle.


“It’s a framework, not a recipe. It’s a reference point, the logic does not go step by step from one to 10,” Bryson says. “You start with purposes in mind and then figure out how to get there.”

There are 10 standard steps in the cycle, but Bryson stresses they are not sequential and often happen simultaneously.

Initiate and agree on a strategic planning process

Identify organizational mandates

Clarify organizational mission and values

Assess the external and internal environment to identify strengths, weaknesses, opportunities, and threats (SWOT)

Identify the issues facing the organization

Formulate strategies to manage the issues

Review and adopt the strategies or strategic plan

Establish an effective organizational vision

Develop an effective implementation process

Reassess the strategies and the strategic planning process

Using this cycle, changes to the norm often happen. “You might think you know what your mission and goals are, and after you go through the process, you might need to change your mission and goals,” Bryson explains. “We try to let the mission and goals emerge from the conversations rather than starting there.”

Other Planning Models and Frameworks

In addition to the models and frameworks listed above, there are several other types, including the following:

The Stakeholder Theory: This approach focuses on adding value to specific groups of people, including employees, customers, the community, shareholders, and society. Organizations can add groups as necessary since the model is very flexible.

Kaufman Model: Also called mega planning, the Kaufman Model relies on a needs assessment. This model focuses on the impact an organization can have on society and clients.

Global Model: As the name implies, global strategic planning includes what is necessary to compete in an international marketplace. It involves looking at both the internal and external environments of multinational organizations.

Maturity Model: The maturity model assesses how strategic management is working within an organization and how it stands up to other organizations.

Diamond-E Framework: The Diamond-E framework helps identify possible gaps in an organization to decide whether or not to pursue an opportunity.

Value Migration: This model helps companies plan ahead of the competition. Its creator, Adrian Slywotzky, defines value migration as the shifting of forces that create value, and that shift goes from an outdated business model to a better-designed model that satisfies customers.

Value Disciplines: This flexible framework focuses on what an organization is already good at and builds on it. Three areas of focus are operational excellence, customer intimacy, and product leadership.

Agile Strategic Planning Model: The flexibility of Agile planning allows for growth and change in strategic planning. The cornerstone of Agile is being able to respond quickly to change, which seems like the antithesis of strategic planning. The Agile approach to strategic planning involves reviewing and adapting your strategic plan at regular intervals and whenever conditions warrant it.

General Electric Model: Also known as the McKinsey Matrix, this model weighs the external attractiveness of an industry against a firm’s internal strengths. Because it captures both, the grid can help evaluate market share and identify areas for development.

How to Decide Which Strategic Planning Model or Framework to Use

Though strategic planning has changed over the years, the need remains for organizations to have some kind of vision and mission, as well as an outline about how to achieve them.

There is no right or wrong way to decide which model or framework to use for your strategic planning process. The key is to figure out which one best applies to your company and its needs — for example, VRIO can help you create a vision statement, and BSC can help keep plans on track. Additionally, some methods work well together.

“The perfect plan is the one that actually gets done,” says Wright. “A poor plan well executed is worth more than a great plan that never gets off the ground. Most people know what they need to do; it’s getting the traction and about democratizing the process. Constantly, people undervalue the role of buy-in with strategic planning. People need to be involved.”


“The framework you choose would have to deal with the sophistication of your business,” says Ted Jackson, founder and managing partner of ClearPoint Strategy. He recommends adapting a model or framework to meet your needs, rather than attempting to stick to hard and fast rules that might come from a book or a similar source. “I think if you read a book and try to implement it exactly [as the book outlines it] to your organization, you will fail,” he says.

Jackson advises simplifying some frameworks and adapting them, but he has some cautionary advice about trying to combine parts of different frameworks. “One mistake is not picking one framework. You can’t be so flexible that you’re implementing multiple frameworks together. People within an organization get really confused,” Jackson says, adding that people who have some knowledge of specific models or frameworks will not understand different terms and ideas, and they’ll probably be afraid to ask.

Some organizations might not get to choose the framework they use. For example, governmental organizations or companies that receive grant funding might need to produce a strategic plan that fits into a formula the government dictates.

Even though you should not use a strategic plan solely because a similar company does, it might help to look at their preferred framework to pick the one that is right for you.

Below are other criteria to help you decide:

Check the size of your organization and the resources you can devote to planning.

If your organization is in trouble, you might want to focus on a framework or model that addresses immediate issues rather than tackles the longer term.

Look at the health of your organization and its developmental stage.

See who is excited about the planning process.

“If you have a cultural challenge in your organization about getting excited about planning, the model you pick is important. Some models are sexier than others,” Wright explains.

Wright does not recommend changing models during the planning process. “[The model] is a template you use to get your ideas on paper. The model is just a vehicle. If you’re struggling with the model, it might be you.”

It isn’t the same for frameworks, according to Wright. “There is a ton of value in changing frameworks and using multiple frameworks at the same time [to view things differently],” he says. Though Wright encourages using different frameworks, he echoes Jackson’s warning to not use different models at the same time.

In certain cases, strategic planning is not an immediate need — for instance, when a company is failing financially or is autocratic, or when a major upheaval is occurring.

Strategic Planning for Specific Areas

Strategic planning for specific departments is a bit different than planning for a company as a whole. In this section, we’ll explain the basics of how some departments typically approach strategic planning.

IT Strategic Planning

A strategic plan for the IT department details how technology will help a company succeed in reaching its goals and objectives. You can think of it as a technology roadmap that outlines where IT can do its part to implement a company’s strategies.

The IT plan must align with the company’s overall mission and vision statements, but it has a secondary mission statement that states how the IT strategy relates to the overall plans for the organization.

The IT plan should also include a SWOT analysis, goals, and objectives. The plan will help make sure you purchase the right assets and work with existing technology effectively. Use the template below to draft a strong, comprehensive IT strategic plan.

Download IT Strategic Planning Template (Excel)

Strategic Human Capital Planning

Strategic human capital planning looks at how people, and how you manage them, fit with the organization’s strategic goals. The end result is a plan to help attract and maintain the talent necessary to achieve the company’s mission and vision.

You can use the following HR strategic plan to list, assess, and plan for future program strategies.

Download HR Strategic Planning Template (Excel)

Succession Planning

At its core, succession planning relies on developing and identifying new leaders. Because employees move on or retire, a company needs to have a plan in place to assume new and important roles.

Often, succession planning happens as a part of the overarching strategic planning process — for example, when you look at the resources available to a company and their productivity.

Note that available human resources can be both strengths and weaknesses. The planning process can help companies identify specific hiring needs.

For more about human resources management, this article can help. Additionally, you can find templates for succession planning here.

Healthcare Strategic Planning

The world of healthcare is changing, and healthcare organizations have to adapt. Still, the following general ideas persist:

There will be a continued need to provide quality patient care.

Operating costs and government regulations will impact the bottom line.

The volume and demographics of patients will change.

There will be a change in the labor supply, especially in the number of primary care physicians available.

Wellness and prevention will gain importance.

New technologies will continue to emerge.

Even with the ever-changing healthcare industry, strategic plans will continue to help organizations stay focused on their goals and objectives. By having a structured planning process, rather than following models that are more organic and reactionary, healthcare entities can survive and succeed.

But be careful with metrics that only consider financial success — there is much more to healthcare than profit.


7 strategic planning models, plus 8 frameworks to help you get started


Strategic planning is vital in defining where your business is going in the next three to five years. With the right strategic planning models and frameworks, you can uncover opportunities, identify risks, and create a strategic plan to fuel your organization’s success. We list the most popular models and frameworks and explain how you can combine them to create a strategic plan that fits your business.

A strategic plan is a great tool to help you hit your business goals. But sometimes, this tool needs to be updated to reflect new business priorities or changing market conditions. If you decide to use a model that already exists, you can benefit from a roadmap that’s already created. The model you choose can improve your knowledge of what works best in your organization, uncover unknown strengths and weaknesses, or help you find out how you can outpace your competitors.

In this article, we cover the most common strategic planning models and frameworks and explain when to use which one. Plus, get tips on how to apply them and which models and frameworks work well together. 

Strategic planning models vs. frameworks

First off: This is not an all-or-nothing scenario. You can use as many or as few strategic planning models and frameworks as you like. 

When your organization undergoes a strategic planning phase, you should first pick a model or two that you want to apply. This will provide you with a basic outline of the steps to take during the strategic planning process.

[Inline illustration] Strategic planning models vs. frameworks (Infographic)

During that process, think of strategic planning frameworks as the tools in your toolbox. Many models suggest starting with a SWOT analysis or defining your vision and mission statements first. Depending on your goals, though, you may want to apply several different frameworks throughout the strategic planning process.

For example, if you’re applying a scenario-based strategic plan, you could start with a SWOT and PEST(LE) analysis to get a better overview of your current standing. If one of the weaknesses you identify has to do with your manufacturing process, you could apply the theory of constraints to improve bottlenecks and mitigate risks. 

Now that you know the difference between the two, learn more about the seven strategic planning models, as well as the eight most commonly used frameworks that go along with them.

[Inline illustration] The seven strategic planning models (Infographic)

1. Basic model

The basic strategic planning model is ideal for establishing your company’s vision, mission, business objectives, and values. This model helps you outline the specific steps you need to take to reach your goals, monitor progress to keep everyone on target, and address issues as they arise.

If it’s your first strategic planning session, the basic model is the way to go. Later on, you can embellish it with other models to adjust or rewrite your business strategy as needed. Let’s take a look at what kinds of businesses can benefit from this strategic planning model and how to apply it.

The basic model works best for:

Small businesses or organizations

Companies with little to no strategic planning experience

Organizations with few resources 

To apply the basic model:

Write your mission statement. Gather your planning team and have a brainstorming session. The more ideas you can collect early in this step, the more fun and rewarding the analysis phase will feel.

Identify your organization’s goals. Setting clear business goals will increase your team’s performance and positively impact their motivation.

Outline strategies that will help you reach your goals. Ask yourself what steps you have to take in order to reach these goals and break them down into long-term, mid-term, and short-term goals.

Create action plans to implement each of the strategies above. Action plans will keep teams motivated and your organization on target.

Monitor and revise the plan as you go. As with any strategic plan, it’s important to closely monitor if your company is implementing it successfully and how you can adjust it for a better outcome.

2. Issue-based model

Also called the goal-based planning model, this is essentially an extension of the basic strategic planning model. It’s a bit more dynamic and very popular for companies that want to create a more comprehensive plan.

The issue-based model works best for:

Organizations with basic strategic planning experience

Businesses that are looking for a more comprehensive plan

To apply the issue-based model:

Conduct a SWOT analysis. Assess your organization’s strengths, weaknesses, opportunities, and threats with a SWOT analysis to get a better overview of what your strategic plan should focus on. We’ll get into how to conduct a SWOT analysis when we cover the strategic planning frameworks below.

Identify and prioritize major issues and/or goals. Based on your SWOT analysis, identify and prioritize what your strategic plan should focus on this time around.

Develop your main strategies that address these issues and/or goals. Aim to develop one overarching strategy that addresses your highest-priority goal and/or issue to keep this process as simple as possible.

Update or create a mission and vision statement. Make sure that your business’s statements align with your new or updated strategy. If you haven’t already, this is also a chance for you to define your organization’s values.

Create action plans. These will help you address your organization’s goals, resource needs, roles, and responsibilities. 

Develop a yearly operational plan document. This model works best if your business repeats the strategic plan implementation process on an annual basis, so use a yearly operational plan to capture your goals, progress, and opportunities for next time.

Allocate resources for your year-one operational plan. Whether you need funding or dedicated team members to implement your first strategic plan, now is the time to allocate all the resources you’ll need.

Monitor and revise the strategic plan. Record your lessons learned in the operational plan so you can revisit and improve it for the next strategic planning phase.

The issue-based plan can repeat on an annual basis (or less often once you resolve the issues). It’s important to update the plan every time it’s in action to ensure it’s still doing the best it can for your organization.

You don’t have to repeat the full process every year—rather, focus on what’s a priority during this run.

3. Alignment model

This model is also called the strategic alignment model (SAM) and is one of the most popular strategic planning models. It helps you align your business and IT strategies with your organization’s strategic goals. 

You’ll have to consider four equally important, yet different perspectives when applying the alignment strategic planning model:

Strategy execution: The business strategy driving the model

Technology potential: The IT strategy supporting the business strategy

Competitive potential: Emerging IT capabilities that can create new products and services

Service level: Team members dedicated to creating the best IT system in the organization

Ideally, your strategy will check off all the criteria above—however, it’s more likely you’ll have to find a compromise. 

Here’s how to create a strategic plan using the alignment model and what kinds of companies can benefit from it.

The alignment model works best for:

Organizations that need to fine-tune their strategies

Businesses that want to uncover issues that prevent them from aligning with their mission

Companies that want to reassess objectives or correct problem areas that prevent them from growing

To apply the alignment model:

Outline your organization’s mission, programs, resources, and where support is needed. Before you can improve your statements and approaches, you need to define what exactly they are.

Identify what internal processes are working and which ones aren’t. Pinpoint which processes are causing problems, creating bottlenecks, or could otherwise use improving. Then prioritize which internal processes will have the biggest positive impact on your business.

Identify solutions. Work with the respective teams when you’re creating a new strategy to benefit from their experience and perspective on the current situation.

Update your strategic plan with the solutions. Update your strategic plan and monitor if implementing it is setting your business up for improvement or growth. If not, you may have to return to the drawing board and update your strategic plan with new solutions.

4. Scenario model

The scenario model works great if you combine it with other models like the basic or issue-based model. This model is particularly helpful if you need to consider external factors as well. These can be government regulations, technical, or demographic changes that may impact your business.

The scenario model works best for:

Organizations trying to identify strategic issues and goals caused by external factors

To apply the scenario model:

Identify external factors that influence your organization. For example, you should consider demographic, regulation, or environmental factors.

Review the worst-case scenario these factors could create for your organization. If you know what the worst-case scenario for your business looks like, it’ll be much easier to prepare for it. Besides, it’ll take some of the pressure and surprise out of the mix, should a similar scenario actually occur.

Identify and discuss two additional hypothetical organizational scenarios. On top of your worst case scenario, you’ll also want to define the best case and average case scenarios. Keep in mind that the worst case scenario from the previous step can often provoke strong motivation to change your organization for the better. However, discussing the other two will allow you to focus on the positive—the opportunities your business may have ahead.

Identify and suggest potential strategies or solutions. Everyone on the team should now brainstorm different ways your business could potentially respond to each of the three scenarios. Discuss the proposed strategies as a team afterward.

Uncover common considerations or strategies for your organization. There’s a good chance that your teammates come up with similar solutions. Decide which ones you like best as a team or create a new one together.

Identify the most likely scenario and the most reasonable strategy. Finally, examine which of the three scenarios is most likely to occur in the next three to five years and how your business should respond to potential changes.

5. Self-organizing model

Also called the organic planning model, the self-organizing model is a bit different from the linear approaches of the other models. You’ll have to be very patient with this method. 

This strategic planning model is all about focusing on the learning and growing process rather than achieving a specific goal. Since the organic model concentrates on continuous improvement , the process is never really over.

Large organizations that can afford to take their time

Businesses that prefer a more naturalistic, organic planning approach that revolves around common values, communication, and shared reflection

Companies that have a clear understanding of their vision

Define and communicate your organization’s cultural values . Your team can only think clearly and with solutions in mind when they have a clear understanding of your organization's values.

Communicate the planning group’s vision for the organization. Define and communicate the vision with everyone involved in the strategic planning process. This will align everyone’s ideas with your company’s vision.

Discuss what processes will help realize the organization’s vision on a regular basis. Meet every quarter to discuss strategies or tactics that will move your organization closer to realizing your vision.

6. Real-time model

This fluid model can help organizations that deal with rapid changes to their work environment. There are three levels of success in the real-time model: 

Organizational: At the organizational level, you’re forming strategies in response to opportunities or trends.

Programmatic: At the programmatic level, you have to decide how to respond to specific outcomes or environmental changes.

Operational: On the operational level, you will study internal systems, policies, and people to develop a strategy for your company.

Figuring out your competitive advantage can be difficult, but it is crucial to success. Whether it's a unique asset or strength your organization has, or outstanding execution of services or programs, you need to be able to set yourself apart from others in the industry to succeed.

Companies that need to react quickly to changing environments

Businesses that are seeking new tools to help them align with their organizational strategy

Define your mission and vision statement. If you ever feel stuck formulating your company’s mission or vision statement, take a look at those of others. Maybe Asana’s vision statement sparks some inspiration.

Research, understand, and learn from competitor strategy and market trends. Pick a handful of competitors in your industry and find out how they’ve created success for themselves. How did they handle setbacks or challenges? What kinds of challenges did they even encounter? Are these common scenarios in the market? Learn from your competitors by finding out as much as you can about them.

Study external environments. At this point, you can combine the real-time model with the scenario model to find solutions to threats and opportunities outside of your control.

Conduct a SWOT analysis of your internal processes, systems, and resources. Besides the external factors your team has to consider, it’s also important to look at your company’s internal environment and how well you’re prepared for different scenarios.

Develop a strategy. Discuss the results of your SWOT analysis to develop a business strategy that builds toward organizational, programmatic, and operational success.

Rinse and repeat. Monitor how well the new strategy is working for your organization and repeat the planning process as needed to ensure you’re on top or, perhaps, ahead of the game. 

7. Inspirational model

This last strategic planning model is perfect to inspire and energize your team as they work toward your organization’s goals. It’s also a great way to introduce or reconnect your employees to your business strategy after a merger or acquisition.

Businesses with a dynamic and inspired start-up culture

Organizations looking for inspiration to reinvigorate the creative process

Companies looking for quick solutions and strategy shifts

Gather your team to discuss an inspirational vision for your organization. The more people you can gather for this process, the more input you will receive.

Brainstorm big, hairy, audacious goals and ideas. Encouraging your team not to hold back with ideas that may seem ridiculous does two things: it mitigates the fear of contributing bad ideas, and, more importantly, it may surface a brilliant idea your team wouldn't have reached if they felt they had to think inside the box.

Assess your organization’s resources. Find out whether your company has the resources to implement your new ideas. If it doesn't, you'll have to either adjust your strategy or allocate more resources.

Develop a strategy balancing your resources and brainstormed ideas. Far-fetched ideas can grow into amazing opportunities, but they can also carry great risk. Make sure to balance ideas with your strategic direction.

Now, let’s dive into the most commonly used strategic frameworks.

8. SWOT analysis framework

One of the most popular strategic planning frameworks is the SWOT analysis. A SWOT analysis is a great first step in identifying areas of opportunity and risk—which can help you create a strategic plan that accounts for growth and prepares for threats.

SWOT stands for strengths, weaknesses, opportunities, and threats. Here’s an example:

[Inline illustration] SWOT analysis (Example)
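If your team wants to capture a SWOT analysis somewhere more durable than a whiteboard, a plain data structure works fine. The sketch below is a hypothetical illustration; the class name and example entries are invented, not part of any particular tool:

```python
# A minimal sketch of capturing a SWOT analysis as plain data so it can be
# versioned, shared, and revisited. All example entries are hypothetical.
from dataclasses import dataclass, field


@dataclass
class SwotAnalysis:
    strengths: list[str] = field(default_factory=list)
    weaknesses: list[str] = field(default_factory=list)
    opportunities: list[str] = field(default_factory=list)
    threats: list[str] = field(default_factory=list)

    def summary(self) -> str:
        """Return a short text summary, e.g., for pasting into a planning doc."""
        sections = [
            ("Strengths", self.strengths),
            ("Weaknesses", self.weaknesses),
            ("Opportunities", self.opportunities),
            ("Threats", self.threats),
        ]
        return "\n".join(
            f"{name}: {', '.join(items) or 'none recorded'}" for name, items in sections
        )


swot = SwotAnalysis(
    strengths=["experienced support team"],
    weaknesses=["single-region supply chain"],
    opportunities=["adjacent market segment"],
    threats=["new low-cost competitor"],
)
print(swot.summary())
```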

9. OKRs framework

A big part of strategic planning is setting goals for your company. That’s where OKRs come into play. 

OKR stands for objectives and key results. This goal-setting framework helps your organization set and achieve goals. It provides a holistic approach you can use to connect your team's work to your organization's big-picture goals. When team members understand how their individual work contributes to the organization's success, they tend to be more motivated and produce better results.
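As a rough illustration of the structure, here is a minimal, hypothetical sketch of one objective with measurable key results and a simple progress roll-up. The example goal, targets, and averaging convention are assumptions for illustration, not a prescribed OKR scoring method:

```python
# A hypothetical sketch of the OKR structure: one objective, several
# measurable key results, and a simple average-based progress roll-up.
from dataclasses import dataclass


@dataclass
class KeyResult:
    description: str
    target: float
    current: float = 0.0

    @property
    def progress(self) -> float:
        # Fraction of the target achieved, capped at 100%.
        return min(self.current / self.target, 1.0) if self.target else 0.0


@dataclass
class Objective:
    title: str
    key_results: list[KeyResult]

    @property
    def progress(self) -> float:
        # Average progress across key results (one common, simple convention).
        if not self.key_results:
            return 0.0
        return sum(kr.progress for kr in self.key_results) / len(self.key_results)


objective = Objective(
    title="Improve customer onboarding",  # invented example
    key_results=[
        KeyResult("New customers onboarded within 7 days (%)", target=90, current=60),
        KeyResult("Onboarding CSAT score", target=4.5, current=4.1),
    ],
)
print(f"{objective.title}: {objective.progress:.0%} of target")
```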

10. Balanced scorecard (BSC) framework

The balanced scorecard is a popular strategic framework for businesses that want to take a more holistic approach rather than focus only on financial performance. Designed by David Norton and Robert Kaplan in the 1990s, it's used by companies around the globe to:

Communicate goals

Align their team’s daily work with their company’s strategy

Prioritize products, services, and projects

Monitor their progress toward their strategic goals

Your balanced scorecard will outline four main business perspectives:

Customers or clients, meaning their value, satisfaction, and/or retention

Financial, meaning your effectiveness in using resources and your financial performance

Internal process, meaning your business’s quality and efficiency

Organizational capacity, meaning your organizational culture, infrastructure and technology, and human resources

With the help of a strategy map, you can visualize and communicate how your company is creating value. A strategy map is a simple graphic that shows cause-and-effect connections between strategic objectives. 
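One lightweight way to think about a strategy map is as a small directed graph of objectives, each tagged with a perspective and connected by cause-and-effect links. The sketch below is hypothetical; the objectives, perspective labels, and links are invented for illustration:

```python
# A hypothetical sketch of a strategy map as a small directed graph:
# each node is a strategic objective tagged with one of the four balanced
# scorecard perspectives, and each edge is a cause-and-effect link.
from collections import defaultdict

# objective -> perspective (all example content is invented)
objectives = {
    "Train support staff": "Organizational capacity",
    "Shorten ticket resolution time": "Internal process",
    "Increase customer retention": "Customers",
    "Grow recurring revenue": "Financial",
}

# cause -> effect links, read bottom-up: capacity -> process -> customers -> financial
links = [
    ("Train support staff", "Shorten ticket resolution time"),
    ("Shorten ticket resolution time", "Increase customer retention"),
    ("Increase customer retention", "Grow recurring revenue"),
]

effects = defaultdict(list)
for cause, effect in links:
    effects[cause].append(effect)


def trace(objective: str, depth: int = 0) -> None:
    """Print the downstream chain of effects for one objective."""
    print("  " * depth + f"{objective} [{objectives[objective]}]")
    for downstream in effects[objective]:
        trace(downstream, depth + 1)


trace("Train support staff")
```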

The balanced scorecard framework is an amazing tool to use from outlining your mission, vision, and values all the way to implementing your strategic plan.

You can use an integration like Lucidchart to create strategy maps for your business in Asana.

11. Porter’s Five Forces framework

If you’re using the real-time strategic planning model, Porter’s Five Forces are a great framework to apply. You can use it to find out what your product’s or service’s competitive advantage is before entering the market.

Developed by Michael E. Porter, the framework outlines five forces you have to be aware of and monitor:

[Inline illustration] Porter’s Five Forces framework (Infographic)

Threat of new industry entrants: Any new entry into the market results in increased pressure on prices and costs. 

Competition in the industry: The more competitors that exist, the more difficult it will be for you to create value in the market with your product or service.

Bargaining power of suppliers: Suppliers can wield more power if buyers have fewer alternatives or if it's expensive, time-consuming, or difficult to switch to a different supplier.

Bargaining power of buyers: Buyers can wield more power if the same product or service is available elsewhere with little to no difference in quality.

Threat of substitutes: If another company already covers the market's needs, you'll have to offer a better product or service, or the same quality at a lower price, in order to compete.

Remember, industry structures aren’t static. The more dynamic your strategic plan is, the better you’ll be able to compete in a market.
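If you want to turn the five forces into a quick worksheet, one option is to rate the pressure from each force and average the ratings. This is a hypothetical sketch, not part of Porter's framework itself; the 1-5 scale and the example ratings are assumptions for illustration:

```python
# A hypothetical Five Forces worksheet: rate the pressure from each force on a
# 1 (weak) to 5 (strong) scale, then average them into a rough indicator of
# how hostile the market is. The ratings below are invented.
FORCES = [
    "Threat of new industry entrants",
    "Competition in the industry",
    "Bargaining power of suppliers",
    "Bargaining power of buyers",
    "Threat of substitutes",
]


def industry_pressure(ratings: dict[str, int]) -> float:
    """Average the 1-5 ratings across all five forces."""
    missing = [force for force in FORCES if force not in ratings]
    if missing:
        raise ValueError(f"Missing ratings for: {missing}")
    return sum(ratings[force] for force in FORCES) / len(FORCES)


ratings = {
    "Threat of new industry entrants": 2,
    "Competition in the industry": 4,
    "Bargaining power of suppliers": 3,
    "Bargaining power of buyers": 4,
    "Threat of substitutes": 2,
}
print(f"Average competitive pressure: {industry_pressure(ratings):.1f} / 5")
```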

12. VRIO framework

The VRIO framework is another strategic planning tool designed to help you evaluate your competitive advantage. VRIO stands for value, rarity, imitability, and organization.

It’s a resource-based theory developed by Jay Barney. With this framework, you can examine your firm's resources and find out whether your company can transform them into sustained competitive advantages.

Firm resources can be tangible (e.g., cash, tools, inventory) or intangible (e.g., copyrights, trademarks, organizational culture). Whether these resources will actually help your business once you enter the market depends on four qualities:

Valuable: Will this resource either increase your revenue or decrease your costs and thereby create value for your business?

Rare: Are the resources you’re using rare, or can others use them as well and therefore easily provide the same product or service?

Inimitable: Are your resources either inimitable or non-substitutable? In other words, how unique and complex are your resources?

Organizational: Are you organized enough to use your resources in a way that captures their value, rarity, and inimitability?

It’s important that your resources check all the boxes above so you can ensure a sustained competitive advantage over others in the industry.
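The four VRIO questions lend themselves to a simple checklist. The sketch below is a hypothetical illustration of the commonly cited decision ladder (parity, temporary advantage, unused advantage, sustained advantage); the resource names are invented:

```python
# A hypothetical sketch of the VRIO screening questions as a checklist.
# Each additional "yes" moves a resource closer to a sustained advantage.
from dataclasses import dataclass


@dataclass
class Resource:
    name: str
    valuable: bool
    rare: bool
    inimitable: bool
    organized: bool  # is the firm organized to capture the resource's value?

    def implication(self) -> str:
        if not self.valuable:
            return "competitive disadvantage"
        if not self.rare:
            return "competitive parity"
        if not self.inimitable:
            return "temporary competitive advantage"
        if not self.organized:
            return "unused competitive advantage"
        return "sustained competitive advantage"


# Invented example resources for illustration only.
for resource in [
    Resource("Proprietary routing algorithm", True, True, True, True),
    Resource("Leased warehouse space", True, False, False, True),
]:
    print(f"{resource.name}: {resource.implication()}")
```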

13. Theory of Constraints (TOC) framework

If you're in a strategic planning process because you're trying to mitigate risks or uncover issues that could hurt your business, this framework should be in your toolkit.

The theory of constraints (TOC) is a problem-solving framework that can help you identify limiting factors or bottlenecks preventing your organization from hitting OKRs or KPIs.

Whether it’s a policy, market, or resource constraint, you can apply the theory of constraints to solve potential problems, respond to issues, and empower your team to improve their work with the resources they have.

14. PEST/PESTLE analysis framework

The idea of the PEST analysis is similar to that of the SWOT analysis except that you’re focusing on external factors and solutions. It’s a great framework to combine with the scenario-based strategic planning model as it helps you define external factors connected to your business’s success.

PEST stands for political, economic, sociological, and technological factors. Depending on your business model, you may want to expand this framework to include legal and environmental factors as well (PESTLE). These are the most common factors you can include in a PESTLE analysis:

Political: Taxes, trade tariffs, conflicts

Economic: Interest and inflation rate, economic growth patterns, unemployment rate

Social: Demographics, education, media, health

Technological: Communication, information technology, research and development, patents

Legal: Regulatory bodies, environmental regulations, consumer protection

Environmental: Climate, geographical location, environmental offsets

15. Hoshin Kanri framework

Hoshin Kanri is a great tool for communicating and implementing strategic goals. It’s a planning system that involves the entire organization in the strategic planning process. The term is Japanese for “compass management,” and the approach is also known as policy management.

This strategic planning framework is a top-down approach that starts with your leadership team defining long-term goals which are then aligned and communicated with every team member in the company. 

You should hold regular meetings to monitor progress and update the timeline to ensure that every teammate’s contributions are aligned with the overarching company goals.

Stick to your strategic goals

Whether you’re a small business just starting out or a nonprofit organization with decades of experience, strategic planning is a crucial step in your journey to success. 

If you’re looking for a tool that can help you and your team define, organize, and implement your strategic goals, Asana is here to help. Our goal-setting software allows you to connect all of your team members in one place, visualize progress, and stay on target.


A logic model framework for evaluation and planning in a primary care practice-based research network (PBRN)

Holly Hayes

1 Department of Family and Community Medicine, University of Texas Health Science Center San Antonio

Michael L. Parchman

2 VERDICT Health Services Research Program, South Texas Veterans Health Care System

3 Academic Center for Excellence in Teaching

Evaluating effective growth and development of a Practice-Based Research Network (PBRN) can be challenging. The purpose of this article is to describe the development of a logic model and how the framework has been used for planning and evaluation in a primary care PBRN.

An evaluation team was formed consisting of the PBRN's directors, staff, and board members. After the mission and the target audience were determined, facilitated meetings and discussions were held with stakeholders to identify the assumptions, inputs, activities, outputs, outcomes, and outcome indicators.

The long-term outcomes outlined in the final logic model are two-fold: 1.) Improved health outcomes of patients served by PBRN community clinicians; and 2.) Community clinicians are recognized leaders of quality research projects. The Logic Model proved useful in identifying stakeholder interests and dissemination activities as an area that required more attention in the PBRN. The logic model approach is a useful planning tool and project management resource that increases the probability that the PBRN mission will be successfully implemented.

Introduction

With the heightened emphasis on translational and comparative effectiveness research to improve patient outcomes, Practice-Based Research Networks (PBRNs) have an unprecedented opportunity to become effective laboratories to address high-priority research questions. As PBRNs engage in more funded research, these research dollars come with increased accountability to demonstrate the effectiveness of the work conducted in PBRNs. Despite significant growth in the number of PBRNs over the past 15 years, little is known about effective and useful methods of evaluating PBRNs (1). One method with significant potential for PBRN evaluation and planning is a logic model.

What is a logic model?

The logic model has proven to be a successful tool for program planning as well as implementation and performance management in numerous fields, including primary care (2–14). A logic model (see Figure 1) is defined as a graphical/textual representation of how a program is intended to work and links outcomes with processes and the theoretical assumptions of the program (6). It is a depiction of a program or project showing what the program or project will do and what it is to accomplish. It is a series of "if-then" relationships that, if implemented as intended, lead to the desired outcomes. Stated another way, it is a framework for describing the relationships between resources, activities, and results as they relate to a specific program or project goal. The logic model also helps to make underlying assumptions about the program or project explicit. It provides a common approach to integrating planning, implementation, and evaluation. Figure 1 below defines the key components of a logic model and what variables are included for each section.

Figure 1. Program/Project Logic Model Framework

Why use a logic model?

A logic model is an efficient tool that requires few resources other than personnel time. Since evaluation dollars are not usually set aside in PBRN budgets, the cost-efficiency of this framework is attractive. In addition, the process of developing the logic model requires PBRN team members to work together in a manner that has a side benefit of improving team relationships and focus. A logic model can also provide much-needed detail about how resources and activities can be connected with the desired results, which helps with project management, resource allocation, and strategic planning (2–14). The process of developing the logic model also facilitates critical thinking through the process of planning and communicating network objectives and outcomes. According to the Kellogg Foundation, the development of a logic model is a "conscious process that creates an explicit understanding of the challenges ahead, the resources available, and the timetable in which to hit the target" (6). For more detailed information regarding logic models, refer to the W.K. Kellogg Foundation Logic Model Development Guide (6).

To date, there are no publications demonstrating how a logic model framework can be used for evaluation and program planning in a primary care PBRN. The purpose of this article is to describe the development of a logic model and how the framework has been used in a primary care PBRN, the South Texas Ambulatory Research Network (STARNet).

Setting and Context

STARNet was founded in 1992 "to conduct & disseminate practice-based research that results in new knowledge and improves the health of patients in South Texas." STARNet has 165 practitioners in 108 primary care practices. These are primarily small group practices or solo practitioners located throughout south Texas, spanning a territory from the southernmost Mexico/Texas border to north central Austin, Texas. Over the years, STARNet has published over 20 peer-reviewed manuscripts of research findings from studies conducted in member primary care practice settings (15–34).

Development of a Logic Model

Step One: Agree on the mission and target audience

The STARNet Board of Directors had previously agreed that the primary goal of all STARNet projects is to improve the health of primary care patients in South Texas. The Board believed that to achieve this goal, STARNet clinicians and academic investigators (Target Audiences) were both equally critical for the success of the network. Investigators facilitate the research process and pursue grant opportunities for the overall sustainability of the network and STARNet clinicians are needed to frame and define the research questions that are relevant to their daily practice and assist in the interpretation of results.

Step Two: Identify and describe assumptions, inputs and activities

After defining the mission and the target audience, the STARNet coordinator and evaluation specialist facilitated ten meetings and discussions with key stakeholders over a six-month period. Stakeholders at the meetings included the STARNet Board of Directors, who are full-time primary care clinicians in family and internal medicine; practice facilitators who visit clinics regularly and assist with change processes; two STARNet directors with over 10 years of experience with the Network; and STARNet partners, including the School of Public Health and the South Texas Area Health Education Center. This group was tasked with identifying the assumptions, inputs, and activities for the STARNet logic model. Assumptions are elements that you assume are in place and are necessary for carrying out your strategies. For example, one assumption for PBRN research is that clinicians have time to participate in PBRN research and that investigators have funded grants that will contribute to network support. Once assumptions are identified, inputs are defined. Inputs include a list of identified resources (e.g., Network directors with clinical expertise and connections with the community) as well as constraints (e.g., lack of discretionary funds for relationship building – food, small gifts).

After assumptions and inputs are defined, the activities that meet the needs of the target audience are described. Since the network has existed for over 18 years, it took a concerted effort on the part of all members to think beyond current and past activities and initiatives. The coordinator encouraged the team to give equal attention to STARNet's past and current activities and to the activities that need to take place in order to fulfill its mission. Well-designed activities are an essential element of logic model development. For STARNet, if activities could not be linked directly or deemed relevant to the two long-term outcomes (improved health outcomes of patients and clinician-led research projects), they would not be included in the logic model.

Step Three: Identify Outputs, Outcomes, and Outcome Indicators

To demonstrate STARNet's growth and development, it was necessary to identify the specific outputs and outcomes necessary to fulfill its mission. Outputs are the actual deliverables or units of service specific to STARNet: what occurs as a result of the planned activities. For example, the specific output for recruiting STARNet clinicians to the network is the number of new network members. An outcome is the actual impact and change associated with each output and is typically broken down into short-term (1–3 years), intermediate (3–5 years), and long-term (5–10 years) outcomes. For example, in an outcome chain that would apply to most PBRNs, developing the research and resource capacity of STARNet clinicians (short-term) would lead to an increase in the number and quality of research projects in which STARNet clinicians participate (intermediate), which would in turn result in STARNet clinicians becoming recognized leaders of quality research projects (long-term). Once the outcomes were identified, we created the outcome indicators.

The outcome indicators are the milestones that can be observed and measured toward meeting the program’s mission. These measures are an indicator of how successful your program is in making progress towards the identified goals.

The most time-consuming component of the logic model process was identifying the activities, outputs, and outcomes, especially ensuring that linkages existed between these three components. Developing meaningful outcomes that would be useful for grants, reports, and publications and that would inform members was the most difficult exercise during the logic model development process. The evaluation specialist was extremely helpful in assisting the logic model team in determining which outcomes were important enough to measure. The initial model was circulated to the group several times through e-mail and monthly meetings and further refined in an iterative process.

Final Logic Model

As a result of the above activities, the logic model in Figure 2 was agreed upon by all members. The logic model begins with the target population and underlying assumptions and leads into the inputs, activities, outputs, and outcomes (short-term, intermediate, and long-term). The long-term outcomes of STARNet are two-fold: 1.) Improved health outcomes of patients served by STARNet clinicians; and 2.) STARNet clinicians are recognized leaders of quality research projects. Every input, activity, and outcome in STARNet’s logic model can be linked back to these two long-term outcomes – our mission’s “bull’s eye”.

Figure 2. Program goal: To establish a collaborative planning and implementation model for evaluating STARNet

Application of the Logic Model to PBRN Activities

Development of the logic model was considered only an initial phase in the process of evaluating, planning and developing the network. It remained clear throughout the process that an ongoing review and refinement of the logic model would be necessary to ensure that the PBRN implementation activities remained consistent with established outcomes. The group agreed that the first step in using our logic model would be to track the key indicators outlined in the outcomes.

Collecting Outcome Data

Based on the logic model, the group created detailed "to-do" lists, quarterly reports, and updated Board member job descriptions. STARNet staff made a concerted effort to collect data on all of the outputs in an Excel spreadsheet. Thus, the logic model informed and focused staff on what specific data needed to be recorded. The STARNet coordinator is charged with collecting all of the quantitative process and qualitative data each year. Detailed minutes and recordings are now being kept for the following meetings: Network staff, all membership, Board of Directors, and one-on-one site visits with STARNet clinicians. Qualitative data have proven to be very important in documenting the extent of involvement of members in network activities (output 9), not just the number involved, and network contextual changes.

Assessing PBRN Progress

The team meets monthly to assess progress and perform an internal evaluation based on logic model activities, outputs, and outcomes. One example of this use of the logic model occurred when we discussed our progress towards conducting the activities outlined in the logic model framework during one of our monthly meetings. It was obvious that no efforts had been made to "disseminate research findings" to the network members and the broader community (Activity 9 and Output 6). The Board of Directors and STARNet leadership considered this a major process gap if the ultimate outcome is to improve patient health. As a result, STARNet is currently working with the University of Texas School of Public Health and the South Central Area Health Education Center (AHEC) to create two comprehensive social marketing plans regarding research findings of studies conducted in STARNet, one for clinicians and their staff and one for patients. STARNet Directors and members will participate in focus groups in the summer of 2011 to develop a strategic communication and dissemination plan. This exemplifies how the logic model can also be used for problem identification and reallocation of resources in order to meet a pre-determined outcome.

Another example of the value of our logic model came when the STARNet Board of Directors decided to take a more proactive role in the financial status of the network. Board agendas now include a financial report at every meeting. STARNet recently became an incorporated non-profit and has updated its by-laws and elected officers to its Executive Committee. The Board of Directors considers these crucial steps in meeting the mission of the network and is now developing a business plan to assist with future planning.

Subsequent to initiating our work on a logic model, Bleeker ( 35 ) and colleagues from the Netherlands identified only two existing PBRN evaluation tools. These tools were developed by Clement ( 36 ) and Fenton ( 37 ) to evaluate the overall effectiveness of PBRNs. Clement ( 36 ) proposed a conceptual framework to evaluate primary care PBRNs based on seven primary objectives with specific process and outcome indicators. The objectives could be categorized into network infrastructure, activity and dissemination efforts. Based on our review of the evaluation framework proposed by Clement ( 36 ), it appeared to be a very usable and feasible tool for implementation. However, Bleeker ( 35 ) questioned the validity of these indicators and the feasibility of Clement’s framework for conducting an overall evaluation.

Fenton (37) and colleagues developed the second identified evaluation tool, a Primary Care Research Network Toolkit, which includes a contextualized case study of five networks in the United Kingdom. This toolkit described eight primary dimensions of networks, each with associated sub-dimensions. Networks could score themselves over time and even conduct comparisons across networks. Although the Primary Care Research Network Toolkit may be useful in conducting formal evaluations, it lacked sufficient information regarding the resources and time needed to successfully replicate the process in the United States.

Considering the relatively limited resources of PBRNs, it is not surprising that a majority of PBRNs have not conducted a thorough evaluation of their efforts. Although evaluating a network takes time and requires the involvement of various individuals throughout the process, outcome evaluation efforts are a worthwhile investment. Unfortunately, we realized early on that our budget would not allow us to complete all of the activities outlined in the logic model. It became important to prioritize activities within the logic model due to budget constraints. The logic model should be modified regularly based on the changing capacity and resources of the network. It remains to be seen whether our logic model framework will meet the planning and evaluation needs of STARNet.

In addition, logic models can be a tremendous tool for determining what is working well and what is not. The Board of Directors continually reminded the staff that all of the activities need to be centered on the mission: improving patient care. As a result, all of the activities, planned and unplanned, are viewed critically from that perspective. It is important to note, however, that not every activity can be linked directly to long-term outcomes. Based on the logic model framework, the Co-Directors turned away investigators wanting to initiate projects in the network that did not meet the current priorities of the members. This was one of the first times in the history of the Network that it appropriately said "no" to an incompatible research interest. The logic model, in essence, united and empowered the efforts of members in advancing the STARNet mission.

Finally, the logic model reminded the PBRN team that a balance has to be maintained between hard, traditional measures, such as the number of studies and publications, and more subjective measures, such as easy access to PBRN member offices by PBRN coordinators and researchers. In addition, the core tenet of successful PBRNs is developing and maintaining respectful and trusting long-term relationships that continue beyond research studies (38). The complexity of the relationships and communication within a network is difficult to capture in evaluation efforts. The logic model helped us realize that it's not just about the quantitative outcomes. In order to share a comprehensive story of STARNet, we also began to collect qualitative data (e.g., rich stories from the members). It also helped us realize that, in the future, we need to collect these data more systematically from members and patients following the completion of research studies.

In conclusion, we found the logic model to be an effective planning and evaluation tool and a useful project management resource that greatly increases the probability that PBRN goals will be reached consistent with its mission. The logic model framework not only helped facilitate the Network evaluation process, but equally important, it engaged the leadership and members in a meaningful way. As a result, the Board of Directors, community clinician members, academic investigators and staff all have taken a more proactive role working together to advance the STARNet mission.

Acknowledgments

Funding for this study was provided by Clinical Translational Science Award # UL1RR025767 from NCRR/NIH to the University of Texas Health Sciences Center at San Antonio. The authors would like to thank the members of the South Texas Ambulatory Research Network for their support and contribution to this study.

None of the authors have a conflict of interest.

Logic models

Quick summary.

A logic model is a visual representation of an organization or program that shows what its inputs, activities, outputs and intended outcomes are. By putting those details down on paper, logic models are useful for helping leadership and staff clarify what the organization does, what its theory of change is, and how successful it is at achieving its mission.


STRATEGY DETAILS

Q1. What is a logic model?

A logic model is a detailed visual representation of an organization or office or program or initiative (we'll just say organization) that conveys its theory of change. It communicates how an organization works by depicting the intended relationships among its components:

  • Inputs: The resources the organization uses, such as people, materials, and funding.
  • Activities: What the organization does in its day to day work.
  • Outputs: What the organization produces because of its activities.
  • Outcomes: What the organization aims to achieve.
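To make these four components concrete, here is a minimal, hypothetical sketch of a logic model captured as a plain data structure that a team could fill in during a planning session; the program and all entries are invented examples:

```python
# A hypothetical sketch of the four logic model components as plain data.
from dataclasses import dataclass, field


@dataclass
class LogicModel:
    program: str
    inputs: list[str] = field(default_factory=list)
    activities: list[str] = field(default_factory=list)
    outputs: list[str] = field(default_factory=list)
    outcomes: list[str] = field(default_factory=list)

    def render(self) -> str:
        """Render the model left to right, the way it is usually read."""
        columns = [
            ("Inputs", self.inputs),
            ("Activities", self.activities),
            ("Outputs", self.outputs),
            ("Outcomes", self.outcomes),
        ]
        lines = [self.program]
        for name, items in columns:
            lines.append(f"  {name}: " + ("; ".join(items) or "(to be defined)"))
        return "\n".join(lines)


model = LogicModel(
    program="After-school tutoring program",  # invented example
    inputs=["volunteer tutors", "classroom space", "grant funding"],
    activities=["weekly tutoring sessions", "tutor training"],
    outputs=["number of sessions delivered", "students reached"],
    outcomes=["improved homework completion", "higher course grades"],
)
print(model.render())
```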


Q2. Why are logic models useful?

Logic models are important and useful for several reasons, including:

  • Generating a clear and shared understanding of how a program works. In other words, if program staff work together to create a detailed logic model, it helps to clarify what the program does and why.
  • Improvement work. Once a program has created a logic model -- say, during a leadership offsite -- it's a useful entrée into a discussion of where bottlenecks or other problems are occurring and how those could be addressed.
  • Strategic planning work. Logic models help clarify what the program is trying to achieve, which can be valuable for strategic planning work such as goal setting.
  • Serving as foundation for program evaluation. Logic models are also a useful basis for a discussion around the question, "Is what we're doing as a program (our activities and outputs) actually producing the results (outcomes) we were hoping and expecting?" Answering that question is, in fact, a basic form of program evaluation.

Q3. What's an example logic model?

Figure 3 shows an example, from a report published by the U.S. Department of Education, of a logic model for a hypothetical computer-assisted math program called Summing Up, designed to help students practice their math skills. The inputs include the software, teachers' time, associated fees, classroom space, and other factors. Activities, meanwhile, focus on what goes on within the program, including software use by students and teachers' related work. Next, outputs are what the program produces and include students achieving the target level (90 hours per year) of engagement with the program. Lastly, outcomes are divided in this case into two categories -- short term and long term -- and relate to the intended improvements in students' math abilities.

[Figure 3: Logic model for the hypothetical Summing Up program]

Q4. In which direction do you read or create a logic model?

You generally read a logic model from left to right, starting with inputs and reading how they translate into activities, outputs and outcomes.

When you develop a logic model, however, you can choose the direction that makes the most sense for your situation (see Figure 4):

  • Left to right logic model development works best for programs that are up and running. It focuses on the "value chain" (how value is created by the organization) by putting on paper the inputs, then the activities created by those inputs, then the outputs produced, and finally the desired outcomes.
  • Right to left logic model development uses what might be called reverse logic and is especially useful if a program is being designed. The idea here is to write down the desired long-term outcomes to be achieved by the program and then work backwards: What nearer-term outcomes would be needed? What outputs would we need to produce to achieve those outcomes? And so on, all the way back to inputs.

[Figure 4: Reading and developing a logic model, left to right and right to left]
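As a small, hypothetical illustration of the right-to-left (reverse logic) approach, the sketch below records answers to the planning questions in the order they are asked, starting from the desired outcomes and working back to inputs; the prompts and example answers are invented:

```python
# A hypothetical sketch of "reverse logic" development: start from the desired
# long-term outcome and work backwards, recording what each stage requires.
QUESTIONS = [
    ("outcomes", "What long-term outcomes do we want to achieve?"),
    ("outputs", "What outputs would we need to produce to achieve those outcomes?"),
    ("activities", "What activities would generate those outputs?"),
    ("inputs", "What inputs (resources) do those activities require?"),
]


def plan_backwards(answers: dict[str, list[str]]) -> list[tuple[str, list[str]]]:
    """Return the plan in the order it was developed: outcomes first, inputs last."""
    return [(stage, answers.get(stage, [])) for stage, _question in QUESTIONS]


answers = {
    "outcomes": ["students' math scores improve"],
    "outputs": ["90 hours of practice per student per year"],
    "activities": ["students use the software weekly", "teachers review progress"],
    "inputs": ["software licenses", "teacher time", "computer lab"],
}
for stage, items in plan_backwards(answers):
    print(f"{stage}: {', '.join(items)}")
```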

HANDOUTS / TEMPLATES

  • Warm-up exercise: [ Template here ] For a team that is developing a logic model for their organization, it can be useful to start with a light-hearted practice version: creating a logic model for a summer vacation. This roughly 20-minute exercise can be done in breakout groups of three to five people, followed by presentations of their chosen destinations and logic models. Facilitators can provide quick feedback on the logic models. Note that some groups may define activities as preparation for a vacation (e.g., buy a guidebook or book tickets) while others may define them as vacation activities (e.g., go snorkeling or relax on the beach), providing an opportunity for the facilitator to underscore that logic model categories can be interpreted differently depending on the purpose.
  • Organizational logic model development exercise: [ Template here ] This template can be used for a facilitated group exercise to enable staff to develop a logic model for their organization -- for example, a 60-minute exercise, whether in breakout groups or as a full group.

ADDITIONAL RESOURCES

  • Article: “One Key to Making a Leadership Offsite a Success: Logic Models,” by Andrew Feldman, Government Executive.
  • Chapter: Chapter 1, "Introducing Logic Models," of the book The Logic Model Guidebook by Lisa Wyatt Knowlton and Cynthia Phillips, is available for free online.
  • Five questions public agencies should ask to put their logic models to work : Cynthia Phillips, formerly National Science Foundation
  • Using logic models, a key building block of results-focused programs : Tom Chapel, Centers for Disease Control and Prevention

  CUSTOMIZED ASSISTANCE

Please contact us if your organization needs assistance in developing logic models or facilitating workshops to do so.



The Implementation Research Logic Model: a method for planning, executing, reporting, and synthesizing implementation projects

  • Justin D. Smith   ORCID: orcid.org/0000-0003-3264-8082 1 , 2 ,
  • Dennis H. Li 3 &
  • Miriam R. Rafferty 4  

Implementation Science, volume 15, Article number 84 (2020)


Numerous models, frameworks, and theories exist for specific aspects of implementation research, including for determinants, strategies, and outcomes. However, implementation research projects often fail to provide a coherent rationale or justification for how these aspects are selected and tested in relation to one another. Despite this need to better specify the conceptual linkages between the core elements involved in projects, few tools or methods have been developed to aid in this task. The Implementation Research Logic Model (IRLM) was created for this purpose and to enhance the rigor and transparency of describing the often-complex processes of improving the adoption of evidence-based interventions in healthcare delivery systems.

The IRLM structure and guiding principles were developed through a series of preliminary activities with multiple investigators representing diverse implementation research projects in terms of contexts, research designs, and implementation strategies being evaluated. The utility of the IRLM was evaluated in the course of a 2-day training of over 130 implementation researchers and healthcare delivery system partners.

Preliminary work with the IRLM produced a core structure and multiple variations for common implementation research designs and situations, as well as guiding principles and suggestions for use. Results of the survey indicated a high utility of the IRLM for multiple purposes, such as improving rigor and reproducibility of projects; serving as a “roadmap” for how the project is to be carried out; clearly reporting and specifying how the project is to be conducted; and understanding the connections between determinants, strategies, mechanisms, and outcomes for their project.

Conclusions

The IRLM is a semi-structured, principle-guided tool designed to improve the specification, rigor, reproducibility, and testable causal pathways involved in implementation research projects. The IRLM can also aid implementation researchers and implementation partners in the planning and execution of practice change initiatives. Adaptation and refinement of the IRLM are ongoing, as is the development of resources for use and applications to diverse projects, to address the challenges of this complex scientific field.


Contributions to the literature

Drawing from and integrating existing frameworks, models, and theories, the IRLM advances the traditional logic model for the requirements of implementation research and practice.

The IRLM provides a means of describing the complex relationships between critical elements of implementation research and practice in a way that can be used to improve the rigor and reproducibility of research and implementation practice, and the testing of theory.

The IRLM offers researchers and partners a useful tool for the purposes of planning, executing, reporting, and synthesizing processes and findings across the stages of implementation projects.

In response to a call for addressing noted problems with transparency, rigor, openness, and reproducibility in biomedical research [ 1 ], the National Institutes of Health issued guidance in 2014 pertaining to the research it funds ( https://www.nih.gov/research-training/rigor-reproducibility ). The field of implementation science has similarly recognized a need for better specification with similar intent [ 2 ]. However, integrating the necessary conceptual elements of implementation research, which often involves multiple models, frameworks, and theories, is an ongoing challenge. A conceptually grounded organizational tool could improve rigor and reproducibility of implementation research while offering additional utility for the field.

This article describes the development and application of the Implementation Research Logic Model (IRLM). The IRLM can be used with various types of implementation studies and at various stages of research, from planning and executing to reporting and synthesizing implementation studies. Example IRLMs are provided for various common study designs and scenarios, including hybrid designs and studies involving multiple service delivery systems [ 3 , 4 ]. Last, we describe the preliminary use of the IRLM and provide results from a post-training evaluation. An earlier version of this work was presented at the 2018 AcademyHealth/NIH Conference on the Science of Dissemination and Implementation in Health, and the abstract appeared in Implementation Science [ 5 ].

Specification challenges in implementation research

Having an imprecise understanding of what was done and why during the implementation of a new innovation obfuscates identifying the factors responsible for successful implementation and prevents learning from what contributed to failed implementation. Thus, improving the specification of phenomena in implementation research is necessary to inform our understanding of how implementation strategies work, for whom, under what determinant conditions, and on what implementation and clinical outcomes. One challenge is that implementation science uses numerous models and frameworks (hereafter, “frameworks”) to describe, organize, and aid in understanding the complexity of changing practice patterns and integrating evidence-based health interventions across systems [ 6 ]. These frameworks typically address implementation determinants, implementation process, or implementation evaluation [ 7 ]. Although many frameworks incorporate two or more of these broad purposes, researchers often find it necessary to use more than one to describe the various aspects of an implementation research study. The conceptual connections and relationships between multiple frameworks are often difficult to describe and to link to theory [ 8 ].

Similarly, reporting guidelines exist for some of these implementation research components, such as strategies [ 9 ] and outcomes [ 10 ], as well as for entire studies (i.e., Standards for Reporting Implementation Studies [ 11 ]); however, they generally help describe the individual components and not their interactions. To facilitate causal modeling [ 12 ], which can be used to elucidate mechanisms of change and the processes involved in both successful and unsuccessful implementation research projects, investigators must clearly define the relations among variables in ways that are testable with research studies [ 13 ]. Only then can we open the “black box” of how specific implementation strategies operate to predict outcomes.

Logic models

Logic models, graphic depictions that present the shared relationships among various elements of a program or study, have been used for decades in program development and evaluation [ 14 ] and are often required by funding agencies when proposing studies involving implementation [ 15 ]. Used to develop agreement among diverse stakeholders of the “what” and the “how” of proposed and ongoing projects, logic models have been shown to improve planning by highlighting theoretical and practical gaps, support the development of meaningful process indicators for tracking, and aid in both reproducing successful studies and identifying failures of unsuccessful studies [ 16 ]. They are also useful at other stages of research and for program implementation, such as organizing a project/grant application/study protocol, presenting findings from a completed project, and synthesizing the findings of multiple projects [ 17 ].

Logic models can also be used in the context of program theory, an explicit statement of how a project/strategy/intervention/program/policy is understood to contribute to a chain of intermediate results that eventually produce the intended/observed impacts [ 18 ]. Program theory specifies both a Theory of Change (i.e., the central processes or drivers by which change comes about following a formal theory or tacit understanding) and a Theory of Action (i.e., how program components are constructed to activate the Theory of Change) [ 16 ]. Inherent within program theory is causal chain modeling. In implementation research, Fernandez et al. [ 19 ] applied mapping methods to implementation strategies to postulate the ways in which changes to the system affect downstream implementation and clinical outcomes. Their work presents an implementation mapping logic model based on Proctor et al. [ 20 , 21 ], which is focused primarily on the selection of implementation strategy(s) rather than a complete depiction of the conceptual model linking all implementation research elements (i.e., determinants, strategies, mechanisms of action, implementation outcomes, clinical outcomes) in the detailed manner we describe in this article.

Development of the IRLM

The IRLM began out of a recognition that implementation research presents some unique challenges due to the field’s distinct and still codifying terminology [ 22 ] and its use of implementation-specific and non-specific (borrowed from other fields) theories, models, and frameworks [ 7 ]. The development of the IRLM occurred through a series of case applications. This began with a collaboration between investigators at Northwestern University and the Shirley Ryan AbilityLab in which the IRLM was used to study the implementation of a new model of patient care in a new hospital and in other related projects [ 23 ]. Next, the IRLM was used with three already-funded implementation research projects to plan for and describe the prospective aspects of the trials, as well as with an ongoing randomized roll-out implementation trial of the Collaborative Care Model for depression management [Smith JD, Fu E, Carroll AJ, Rado J, Rosenthal LJ, Atlas JA, Burnett-Zeigler I, Carlo, A, Jordan N, Brown CH, Csernansky J: Collaborative care for depression management in primary care: a randomized rollout trial using a type 2 hybrid effectiveness-implementation design submitted for publication]. It was also applied in the later stages of a nearly completed implementation research project of a family-based obesity management intervention in pediatric primary care to describe what had occurred over the course of the 3-year trial [ 24 ]. Last, the IRLM was used as a training tool in a 2-day training with 63 grantees of NIH-funded planning project grants funded as part of the Ending the HIV Epidemic initiative [ 25 ]. Results from a survey of the participants in the training are reported in the “Results” section. From these preliminary activities, we identified a number of ways that the IRLM could be used, described in the section on “Using the IRLM for different purposes and stages of research.”

The Implementation Research Logic Model

In developing the IRLM, we began with the common “pipeline” logic model format used by AHRQ, CDC, NIH, PCORI, and others [ 16 ]. This structure was chosen due to its familiarity with funders, investigators, readers, and reviewers. Although a number of characteristics of the pipeline logic model can be applied to implementation research studies, there is an overall misfit due to implementation research’s focusing on the systems that support adoption and delivery of health practices; involving multiple levels within one or more systems; and having its own unique terminology and frameworks [ 3 , 22 , 26 ]. We adapted the typical evaluation logic model to integrate existing implementation science frameworks as its core elements while keeping to the same aim of facilitating causal modeling.

The most common IRLM format is depicted in Fig. 1 . Additional File A1 is a Fillable PDF version of Fig. 1 . In certain situations, it might be preferable to include the evidence-based intervention (EBI; defined as a clinical, preventive, or educational protocol or a policy, principle, or practice whose effects are supported by research [ 27 ]) (Fig. 2 ) to demonstrate alignment of contextual factors (determinants) and strategies with the components and characteristics of the clinical intervention/policy/program and to disentangle it from the implementation strategies. Foremost in these indications are “home-grown” interventions, whose components and theory of change may not have been previously described, and novel interventions that are early in the translational pipeline, which may require greater detail for the reader/reviewer. Variant formats are provided as Additional Files A 2 to A 4 for use with situations and study designs commonly encountered in implementation research, including comparative implementation studies (A 2 ), studies involving multiple service contexts (A 3 ), and implementation optimization designs (A 4 ). Further, three illustrative IRLMs are provided, with brief descriptions of the projects and the utility of the IRLM (A 5 , A 6 and A 7 ).

Figure 1. Implementation Research Logic Model (IRLM) Standard Form. Notes: Domain names in the determinants section were drawn from the Consolidated Framework for Implementation Research. The format of the outcomes column is from Proctor et al. 2011.

Figure 2. Implementation Research Logic Model (IRLM) Standard Form with Intervention. Notes: Domain names in the determinants section were drawn from the Consolidated Framework for Implementation Research. The format of the outcomes column is from Proctor et al. 2011.

Core elements and theory

The IRLM specifies the relationships between determinants of implementation, implementation strategies, the mechanisms of action resulting from the strategies, and the implementation and clinical outcomes affected. These core elements are germane to every implementation research project in some way. Accordingly, the generalized theory of the IRLM posits that (1) implementation strategies selected for a given EBI are related to implementation determinants (context-specific barriers and facilitators), (2) strategies work through specific mechanisms of action to change the context or the behaviors of those within the context, and (3) implementation outcomes are the proximal impacts of the strategy and its mechanisms, which then relate to the clinical outcomes of the EBI. Articulated in part by others [ 9 , 12 , 21 , 28 , 29 ], this causal pathway theory is largely explanatory and details the Theory of Change and the Theory of Action of the implementation strategies in a single model. The EBI Theory of Action can also be displayed within a modified IRLM (see Additional File A 4 ). We now briefly describe the core elements and discuss conceptual challenges in how they relate to one another and to the overall goals of implementation research.
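As a purely illustrative sketch (not a format prescribed by the IRLM itself), one row of the model can be written as a typed record that makes the posited causal chain explicit and easy to review; the example entries echo the training-and-fidelity example discussed under "Mechanisms of action" below:

```python
# A hypothetical sketch of one IRLM "row" as a typed record, spelling out the
# chain: determinant -> strategy -> mechanism -> implementation outcome ->
# clinical outcome.
from dataclasses import dataclass


@dataclass
class IRLMRow:
    determinant: str              # context-specific barrier or facilitator
    strategy: str                 # implementation strategy addressing it
    mechanism: str                # how the strategy is expected to work
    implementation_outcome: str   # proximal effect of the strategy
    clinical_outcome: str         # distal effect of the EBI

    def as_sentence(self) -> str:
        return (
            f"Because of '{self.determinant}', we use '{self.strategy}', "
            f"which works through '{self.mechanism}' to improve "
            f"'{self.implementation_outcome}' and, ultimately, "
            f"'{self.clinical_outcome}'."
        )


row = IRLMRow(
    determinant="clinicians' limited knowledge of the EBI",
    strategy="training plus fidelity monitoring",
    mechanism="increased knowledge and self-efficacy",
    implementation_outcome="adoption and fidelity of delivery",
    clinical_outcome="improved patient health outcomes",
)
print(row.as_sentence())
```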

Determinants

Determinants of implementation are factors that might prevent or enable implementation (i.e., barriers and facilitators). Determinants may act as moderators, “effect modifiers,” or mediators, thus indicating that they are links in a chain of causal mechanisms [ 12 ]. Common determinant frameworks are the Consolidated Framework for Implementation Research (CFIR) [ 30 ] and the Theoretical Domains Framework [ 31 ].

Implementation strategies

Implementation strategies are supports, changes to, and interventions on the system to increase adoption of EBIs into usual care [ 32 ]. Consideration of determinants is commonly used when selecting and tailoring implementation strategies [ 28 , 29 , 33 ]. Providing the theoretical or conceptual reasoning for strategy selection is recommended [ 9 ]. The IRLM can be used to specify the proposed relationships between strategies and the other elements (determinants, mechanisms, and outcomes), and it assists with considering, planning, and reporting all strategies in place during an implementation research project that could contribute to the outcomes and resulting changes.

Because implementation research occurs within dynamic delivery systems with multiple factors that determine success or failure, the field has experienced challenges identifying consistent links between individual barriers and specific strategies to overcome them. For example, the Expert Recommendations for Implementing Change (ERIC) compilation of strategies [ 32 ] was used to determine which strategies would best address contextual barriers identified by CFIR [ 29 ]. An online CFIR–ERIC matching process completed by implementation researchers and practitioners resulted in a large degree of heterogeneity and few consistent relationships between barrier and strategy, meaning the relationship is rarely one-to-one (e.g., a single strategy is often linked to multiple barriers, and more than one strategy is often needed to address a single barrier). Moreover, when implementation outcomes are considered, researchers often find that to improve one outcome, more than one contextual barrier needs to be addressed, which might in turn require one or more strategies.

Frequently, the reporting of implementation research studies focuses on the strategy or strategies that were introduced for the research study, without due attention to other strategies already used in the system or additional supporting strategies that might be needed to implement the target strategy. The IRLM allows for the comprehensive specification of all introduced and present strategies, as well as their changes (adaptations, additions, discontinuations) during the project.

Mechanisms of action

Mechanisms of action are processes or events through which an implementation strategy operates to affect desired implementation outcomes [ 12 ]. The mechanism can be a change in a determinant, a proximal implementation outcome, an aspect of the implementation strategy itself, or a combination of these in a multiple-intervening-effect model. An example of a causal process might be using training and fidelity monitoring strategies to improve delivery agents' knowledge and self-efficacy about the EBI in response to knowledge-related barriers in the service delivery system. This could raise their acceptability of the EBI, increase the likelihood of adoption, improve the fidelity of delivery, and lead to sustainment. Relatively few implementation studies formally test mechanisms of action, but this area of investigation has received significant attention more recently as the necessity to understand how strategies operate grows in the field [ 33 , 34 , 35 ].

Implementation outcomes

Implementation outcomes are the effects of deliberate and purposive actions to implement new treatments, practices, and services [21]. They can be indicators of implementation processes or key intermediate outcomes in relation to service or target clinical outcomes. Glasgow et al. [36, 37, 38] describe the interrelated nature of implementation outcomes as occurring in a logical, but not necessarily linear, sequence of adoption by a delivery agent, delivery of the innovation with fidelity, reach of the innovation to the intended population, and sustainment of the innovation over time. The combined impact of these nested outcomes, coupled with the size of the effect of the EBI, determines the population or public health impact of implementation [36]. Outcomes earlier in the sequence can be conceptualized as mediators and mechanisms of strategies on later implementation outcomes. Specifying which strategies are theoretically intended to affect which outcomes, through which mechanisms of action, is crucial for improving the rigor and reproducibility of implementation research and for testing theory.

Using the Implementation Research Logic Model

Guiding principles

One of the critical insights from our preliminary work was that use of the IRLM should be guided by a set of principles rather than governed by rules. These principles are intended to be flexible, both to allow for adaptation to the various types of implementation studies and for evolution of the IRLM over time, and to address concerns in the field of implementation science regarding specification, rigor, reproducibility, and transparency of design and process [5]. Given this flexibility, the IRLM will invariably require accompanying text and other supporting documents. These are described in the section "Use of supporting text and documents."

Principle 1: Strive for comprehensiveness

Comprehensiveness increases transparency, can improve rigor, and allows for a better understanding of alternative explanations for the conclusions drawn, particularly in the presence of null findings in an experimental design. Thus, all relevant determinants, implementation strategies, and outcomes should be included in the IRLM.

Concerning determinants, the valence should be noted as being either a barrier, a facilitator, neutral, or variable by study unit. This can be achieved by simply adding plus (+) or minus (−) signs for facilitators and barriers, respectively, or by using a coding system such as that developed by Damschroder et al. [39], which indicates the relative strength of the determinant on a scale: −2 (strong negative impact), −1 (weak negative impact), 0 (neutral or mixed influence), +1 (weak positive impact), and +2 (strong positive impact). The use of such a coding system could yield better specification than using study-specific adjectives or changing the name of the determinant (e.g., greater relative priority, addresses patient needs, good climate for implementation). It is critical to include all relevant determinants and not simply limit reporting to those that are hypothesized to be related to the strategies and outcomes, as there are complex interrelationships among determinants.
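
To illustrate this coding scheme, the following is a minimal sketch of how determinant valence might be recorded when populating an IRLM; the determinant names, levels, and ratings are hypothetical and are not drawn from any specific study.

```python
from dataclasses import dataclass

# Hypothetical determinants rated on the -2 to +2 valence scale described above.
@dataclass
class Determinant:
    name: str      # construct name (e.g., a CFIR construct)
    level: str     # e.g., inner setting, outer setting, individuals
    valence: int   # -2 (strong barrier) through +2 (strong facilitator)

determinants = [
    Determinant("Relative priority", "inner setting", 2),
    Determinant("Available resources", "inner setting", -1),
    Determinant("Patient needs and resources", "outer setting", 1),
]

# Simple partition into barriers and facilitators for reporting.
barriers = [d.name for d in determinants if d.valence < 0]
facilitators = [d.name for d in determinants if d.valence > 0]
print("Barriers:", barriers)
print("Facilitators:", facilitators)
```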

Implementation strategies should be reported in their entirety. When using the IRLM to plan a study, first list all strategies in the system, including those already in use and those to be initiated for the purposes of the study (often in the experimental condition of the design). Second, label strategies to indicate whether they were (a) in place in the system prior to the study, (b) initiated prospectively for the purposes of the study (particularly for experimental study designs), (c) removed as a result of being ineffective or onerous, or (d) introduced during the study to address an emergent barrier or to supplement other strategies because of low initial impact. This is relevant when using the IRLM for planning, as an ongoing tracking system, for retrospective application to a completed study, and in the final reporting of a study. A number of processes have been proposed for tracking the use of and adaptations to implementation strategies over time [40, 41]. Each of these is more detailed than is necessary for the IRLM, but the processes described provide a method for accurately tracking the temporal aspects of strategy use that fulfills the comprehensiveness principle.
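
As a purely illustrative sketch (the strategy names, statuses, dates, and reasons below are hypothetical and are not taken from the tracking methods cited above), a lightweight record of strategy status over time might look like this:

```python
from datetime import date

# Each record labels a strategy with one of the four statuses listed above:
# "pre-existing", "introduced", "removed", or "added" (added mid-study).
strategy_log = [
    {"strategy": "Audit and feedback",     "status": "pre-existing", "date": date(2020, 1, 1)},
    {"strategy": "Clinician training",     "status": "introduced",   "date": date(2020, 3, 15)},
    {"strategy": "Learning collaborative", "status": "added",        "date": date(2020, 9, 1),
     "reason": "low initial adoption"},
    {"strategy": "External facilitation",  "status": "removed",      "date": date(2021, 2, 1),
     "reason": "too burdensome for sites"},
]

# Strategies still in place at the end of the project.
removed = {r["strategy"] for r in strategy_log if r["status"] == "removed"}
active = sorted({r["strategy"] for r in strategy_log if r["strategy"] not in removed})
print("Active at study end:", active)
```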

Although most studies will indicate a primary implementation outcome, other outcomes will almost assuredly be measured and thus ought to be included in the IRLM. This guidance is given in large part because of the interdependence of implementation outcomes, such that adoption relates to delivery with fidelity, reach of the intervention, and potential for sustainment [36]. Similarly, the overall public health impact (defined as reach multiplied by the effect size of the intervention [38]) is inextricably tied to adoption, fidelity, acceptability, cost, and so on. Although a study might justifiably focus on only one or two implementation outcomes, the others are nonetheless relevant and should be specified and reported. For example, it is important to capture potential unintended consequences and indicators of adverse effects that could result from the implementation of an EBI.
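
To make the reach-by-effect-size logic concrete, here is a minimal sketch with hypothetical numbers; it is an illustration of the idea rather than a prescribed calculation:

```python
# Public health impact approximated as reach multiplied by effect size [38].
def population_impact(reach: float, effect_size: float) -> float:
    """Proportion of the eligible population reached times the per-person effect."""
    return reach * effect_size

# Hypothetical comparison: a stronger EBI with narrow reach vs. a weaker EBI
# with broad reach. Broader reach can outweigh a larger per-person effect.
print(population_impact(reach=0.25, effect_size=0.40))  # 0.10
print(population_impact(reach=0.80, effect_size=0.25))  # 0.20
```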

Principle 2: Indicate key conceptual relationships

Although the IRLM has a generalized theory (described earlier), there is a need to indicate the relationships between elements in a manner that aligns with the specific theory of change for the study. Researchers ought to provide some form of notation to indicate these conceptual relationships, using color coding, superscripts, arrows, or a combination of the three. Such notations in the IRLM facilitate reference in the text to the study hypotheses, tests of effects, causal chain modeling, and other forms of elaboration (see "Use of supporting text and documents"). For practical purposes, we prefer superscripts to colors or arrows in grant proposals and articles, as colors can be difficult to distinguish and arrows can obscure text and add visual clutter. When presenting the IRLM using presentation programs (e.g., PowerPoint, Keynote), colors and arrows can be helpful, and animations can make these connections dynamic and sequential without adding to visual complexity. This principle could also prove useful in synthesizing across similar studies to build the science of tailored implementation, in which strategies are selected based on the presence of specific combinations of determinants. As previously indicated [29], there is much work to be done in this area.

Principle 3: Specify critical study design elements

Critical design elements will vary by the study design (e.g., hybrid effectiveness-implementation trial, observational study, which subsystems or units are assigned to the strategies). Applying this principle involves not only researchers but also the service systems and communities whose consent is necessary to carry out any implementation design [3, 42, 43].

Primary outcome(s)

Indicate the primary outcome(s) at each level of the study design (e.g., clinician, clinic, organization, county, state, nation). The levels should align with the specific aims of a grant application or the stated objectives of a research report. In the case of a process evaluation or an observational study using the RE-AIM evaluation components [38] or the Proctor et al. [21] taxonomy of implementation outcomes, the primary outcome may be a product of the conceptual or theoretical model when a priori outcomes are not clearly indicated. We also suggest including downstream health services and clinical outcomes even if they are not measured, as these are important for understanding the logic of the study and its ultimate health-related targets.

For quasi/experimental designs

When quasi/experimental designs [3, 4] are used, the independent variable(s) (i.e., the strategies that are introduced or manipulated or that otherwise differentiate study conditions) should be clearly labeled. This is important for internal validity and for differentiating conditions in multi-arm studies.

For comparative implementation trials

In comparative implementation trials [3, 4], two or more competing implementation strategies are introduced for the purposes of the study (i.e., the comparison condition is not implementation-as-usual), and there is a need to indicate the determinants, strategies, mechanisms, and potentially the outcomes that differentiate the arms (see Additional File A2). Because comparative implementation can involve multiple service delivery systems, the determinants, mechanisms, and outcomes might also differ, though there must be at least one comparable implementation outcome. In our preliminary work applying the IRLM to a large-scale comparative implementation trial, a single IRLM proved unworkable, so we used one IRLM for each arm of the trial, because the strategies being tested occurred across two delivery systems and, by design, differed substantially. This is an example of the flexible use of the IRLM.

For implementation optimization designs

A number of designs are now available that aim to test processes of optimizing implementation. These include factorial, Sequential Multiple Assignment Randomized Trial (SMART) [44], adaptive [45], and roll-out implementation optimization designs [46]. These designs allow for (a) building time-varying adaptive implementation strategies based on the order in which components are presented [44], (b) evaluating the additive and combined effects of multiple strategies [44, 47], and (c) incorporating data-driven iterative changes to improve implementation in successive units [45, 46]. The IRLM in Additional File A4 can be used for such designs.

Additional specification options

Users of the IRLM may specify any number of additional elements that are important to their study. For example, one could notate which elements of the IRLM have been or will be measured versus which were based on the researchers' prior studies or inferred from findings reported in the literature. Users can also indicate when implementation strategies differ by level or unit within the study. In large multisite studies, strategies might not be uniform across all units, particularly those strategies that already exist within the system. Similarly, there might be a need to increase the dose of certain strategies to address the relative strengths of different determinants within units.

Using the IRLM for different purposes and stages of research

Commensurate with logic models more generally, the IRLM can be used for planning and organizing a project, carrying out a project (as a roadmap), reporting and presenting the findings of a completed project, and synthesizing the findings of multiple projects or of a specific area of implementation research, such as what is known about how learning collaboratives are effective within clinical care settings.

When the IRLM is used for planning, the process of populating each of the elements often begins with the known parameter(s) of the study. For example, if the problem is improving the adoption and reach of a specific EBI within a particular clinical setting, the implementation outcomes and context, as well as the EBI, are clearly known. The downstream clinical outcomes of the EBI are likely also known. Working from the two “bookends” of the IRLM, the researchers and community partners and/or organization stakeholders can begin to fill in the implementation strategies that are likely to be feasible and effective and then posit conceptually derived mechanisms of action. In another example, only the EBI and primary clinical outcomes were known. The IRLM was useful in considering different scenarios for what strategies might be needed and appropriate to test the implementation of the EBI in different service delivery contexts. The IRLM was a tool for the researchers and stakeholders to work through these multiple options.

When we used the IRLM to plan for the execution of funded implementation studies, the majority of the parameters had already been proposed in the grant application. However, in completing the IRLM prior to the start of the study, we found that a number of important contextual factors had not been considered, that additional implementation strategies were needed to complement the primary ones proposed in the grant, and that mechanisms needed to be added and measured. At the time of award, mechanisms were not an expected component of implementation research projects, though they likely will become one in the future.

For another project, the IRLM was applied retrospectively to report on the findings and overall logic of the study. Because nearly all elements of the IRLM were known, we approached completion of the model as a means of showing what happened during the study and of accurately reporting the hypothesized relationships that we observed. These relationships could then be formally tested using causal pathway modeling [12] or other path analysis approaches with one or more intervening variables [48].
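
As an illustration of the kind of path analysis referenced above, the sketch below estimates a single mediated (indirect) effect using a product-of-coefficients approach on simulated data; the variable names (strategy, mechanism, adoption) are hypothetical, and this is not the analysis used in the project.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical unit-level data: 'strategy' = exposure to an implementation
# strategy, 'mechanism' = proposed mediator (e.g., self-efficacy),
# 'adoption' = implementation outcome.
rng = np.random.default_rng(42)
n = 200
strategy = rng.integers(0, 2, n)
mechanism = 0.5 * strategy + rng.normal(size=n)
adoption = 0.4 * mechanism + 0.1 * strategy + rng.normal(size=n)
df = pd.DataFrame({"strategy": strategy, "mechanism": mechanism, "adoption": adoption})

# Path a: strategy -> mechanism; path b: mechanism -> adoption (adjusting for strategy).
a = smf.ols("mechanism ~ strategy", data=df).fit().params["strategy"]
b = smf.ols("adoption ~ mechanism + strategy", data=df).fit().params["mechanism"]

print("Indirect (mediated) effect a*b:", round(a * b, 3))
```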

Synthesizing

In our preliminary work with the IRLM, we used it in each of the first three ways; the fourth (synthesizing) is ongoing within the National Cancer Institute's Improving the Management of symPtoms during And Following Cancer Treatment (IMPACT) research consortium. The purpose is to draw shared, generalizable conclusions about the implementation of an EBI in a particular context (or across contexts) to guide future research and implementation.

Use of supporting text and documents

While the IRLM provides a good deal of information about a project in a single visual, researchers will need to convey additional details about an implementation research study through supporting text, tables, and figures in grant applications, reports, and articles. Elements that require elaboration include (a) preliminary data on the assessment and valence of implementation determinants; (b) operationalization and detailing of the implementation strategies being used or observed, following established reporting guidelines [9] and labeling conventions [32] from the literature; (c) hypothesized or tested causal pathways [12]; (d) process, service, and clinical outcome measures, including their psychometric properties, method and timing of administration, respondents, and so on; (e) study procedures, including subject selection, assignment to (or observation of naturally occurring) study conditions, and assessment throughout the conduct of the study [4]; and (f) the implementation plan or process for following established implementation frameworks [49, 50, 51]. By using superscripts, subscripts, and other notations within the IRLM, as previously suggested, it is easy to refer to (a) hypothesized causal paths in theoretical overviews and analytic plan sections, (b) planned measures for determinants and outcomes, and (c) specific implementation strategies in text, tables, and figures.

Evidence of IRLM utility and acceptability

The IRLM was used as the foundation for a training in implementation research methods provided to a group of 65 planning projects awarded under the national Ending the HIV Epidemic initiative. One investigator (project director or co-investigator) and one implementation partner (i.e., a collaborator from a community service delivery system) from each project were invited to attend a 2-day in-person summit in Chicago, IL, in October 2019. One hundred thirty-two participants attended, representing 63 of the 65 projects. A survey, which included demographics and questions pertaining to the Ending the HIV Epidemic initiative, was sent to potential attendees prior to the summit; 129 individuals responded, including all 65 project directors, 13 co-investigators, and 51 implementation partners (62% female). Those who indicated an investigator role (n = 78) received additional questions about prior implementation research training (e.g., formal coursework, workshop, self-taught), related experiences (e.g., involvement in a funded implementation project, program implementation, program evaluation, quality improvement), and the stage of their project (i.e., exploration, preparation, implementation, sustainment [50]).

Approximately 6 weeks after the summit, 89 attendees (69%) completed a post-training survey comprising more than 40 questions about their overall experience. Although the invitation to complete the survey made no mention of the IRLM, the survey included 10 items related to the IRLM and one more general item about the logic of implementation research, each rated on a 4-point scale (1 = not at all, 2 = a little, 3 = moderately, 4 = very much; see Table 1). Forty-two investigators (65% of projects) and 24 implementation partners indicated attending the training and began and completed the survey (68.2% female). Of the 66 respondents who attended the training, 100% completed all 11 IRLM items, suggesting little potential response bias.

Table 1 provides the means, standard deviations, and percentage of respondents endorsing either the "moderately" or "very much" response option. Results were promising for the utility of the IRLM on the majority of the dimensions assessed. More than 50% of respondents indicated that the IRLM was moderately or very helpful on all questions. Overall, 77.6% (M = 3.18, SD = .827) of respondents indicated that their knowledge of the logic of implementation research had increased either moderately or very much after the 2-day training. At the time of the survey, when respondents were about 2.5 months into their 1-year planning projects, 44.6% indicated that they had already been able to complete a full draft of the IRLM.

Additional analyses using one-way analysis of variance indicated no statistically significant differences between investigators and implementation partners in responses to the IRLM questions. However, three items approached significance: planning the project (F = 2.460, p = .055), clearly reporting and specifying how the project is to be conducted (F = 2.327, p = .066), and knowledge of the logic of implementation research (F = 2.107, p = .091). In each case, scores were higher for investigators than for implementation partners, suggesting that the knowledge gap in implementation research may lie more in the academic realm than among community partners, whose day-to-day roles may not center on research but do include implementing EBPs in the real world. Lastly, analyses using ordinal logistic regression did not yield any significant relationships between responses to the IRLM survey items and prior training (n = 42 investigators who attended the training and completed the post-training survey), prior related research experience (n = 42), or project stage of implementation (n = 66). This suggests that the IRLM is a useful tool for both investigators and implementers with varying levels of prior exposure to implementation research concepts and across all stages of implementation research. As a result of this training, the IRLM is now a required element in the FY2020 Ending the HIV Epidemic Centers for AIDS Research/AIDS Research Centers Supplement Announcement released in March 2020 [15].
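
For readers unfamiliar with these analyses, the following minimal sketch shows how a one-way ANOVA of item ratings by respondent role might be computed; the ratings and group labels are hypothetical, and this is not the authors' analysis code.

```python
import pandas as pd
from scipy import stats

# Hypothetical 4-point ratings (1-4) of a single IRLM item by respondent role.
df = pd.DataFrame({
    "role":   ["investigator"] * 6 + ["partner"] * 6,
    "rating": [4, 3, 4, 3, 4, 2,     3, 2, 3, 3, 2, 3],
})

# One-way ANOVA comparing mean ratings between the two roles.
groups = [g["rating"].to_numpy() for _, g in df.groupby("role")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
```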

Resources for using the IRLM

As the use of the IRLM for different study designs and purposes continues to expand and evolve, we envision supporting researchers and other program implementers in applying the IRLM to their own contexts. Our team at Northwestern University hosts web resources on the IRLM that include completed examples and tools to assist users in completing their models, including templates in various formats (Figs. 1 and 2, Additional Files A1, A2, A3, A4, and others), a Quick Reference Guide (Additional File A8), and a series of worksheets that provide guidance on populating the IRLM (Additional File A9). These will be available at https://cepim.northwestern.edu/implementationresearchlogicmodel/ .

The IRLM provides a compact visual depiction of an implementation project and is a useful tool for academic–practice collaboration and partnership development. Used in conjunction with supporting text, tables, and figures that detail each of the primary elements, the IRLM has the potential to improve a number of aspects of implementation research identified in the results of the post-training survey. The usability of the IRLM is high for seasoned and novice implementation researchers alike, as evidenced by our survey results and preliminary work. Its use in planning, executing, reporting, and synthesizing implementation research could increase the rigor and transparency of complex studies and ultimately improve reproducibility, a known challenge in the field, by offering a common structure that increases consistency and a method for more clearly specifying the links and pathways through which theories are tested.

Implementation occurs across the gamut of contexts and settings. The IRLM can be used when large organizational change is being considered, such as a new strategic plan with multifaceted strategies and outcomes. Within a narrower scope of a single EBI in a specific setting, the larger organizational context still ought to be included as inner setting determinants (i.e., the impact of the organizational initiative on the specific EBI implementation project) and as implementation strategies (i.e., the specific actions being done to make the organizational change a reality that could be leveraged to implement the EBI or could affect the success of implementation). The IRLM has been used by our team to plan for large systemic changes and to initiate capacity building strategies to address readiness to change (structures, processes, individuals) through strategic planning and leadership engagement at multiple levels in the organization. This aspect of the IRLM continues to evolve.

Among the drawbacks of the IRLM is that it might be viewed as a somewhat simplified format. This represents the challenges of balancing depth and detail with parsimony, ease of comprehension, and ease of use. The structure of the IRLM may inhibit creative thinking if applied too rigidly, which is among the reasons we provide numerous examples of different ways to tailor the model to the specific needs of different project designs and parameters. Relatedly, we encourage users to iterate on the design of the IRLM to increase its utility.

The promise of implementation science lies in the ability to conduct rigorous and reproducible research, to clearly understand the findings, and to synthesize findings from which generalizable conclusions can be drawn and actionable recommendations for practice change emerge. As scientists and implementers have worked to better define the core methods of the field, the need for theory-driven, testable integration of the foundational elements involved in impactful implementation research has become more apparent. The IRLM is a tool that can aid the field in addressing this need and moving toward the ultimate promise of implementation research to improve the provision and quality of healthcare services for all people.

Availability of data and materials

Not applicable.

Abbreviations

CFIR: Consolidated Framework for Implementation Research

EBI: Evidence-based intervention

ERIC: Expert Recommendations for Implementing Change

IRLM: Implementation Research Logic Model

Nosek BA, Alter G, Banks GC, Borsboom D, Bowman SD, Breckler SJ, Buck S, Chambers CD, Chin G, Christensen G, et al. Promoting an open research culture. Science. 2015;348:1422–5.

Slaughter SE, Hill JN, Snelgrove-Clarke E. What is the extent and quality of documentation and reporting of fidelity to implementation strategies: a scoping review. Implement Sci. 2015;10:1–12.

Brown CH, Curran G, Palinkas LA, Aarons GA, Wells KB, Jones L, Collins LM, Duan N, Mittman BS, Wallace A, et al. An overview of research and evaluation designs for dissemination and implementation. Annu Rev Public Health. 2017;38:1–22.

Hwang S, Birken SA, Melvin CL, Rohweder CL, Smith JD. Designs and methods for implementation research: advancing the mission of the CTSA program. J Clin Transl Sci. 2020; available online.

Smith JD. An Implementation Research Logic Model: a step toward improving scientific rigor, transparency, reproducibility, and specification. Implement Sci. 2018;14:S39.

Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43:337–50.

Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10:53.

Damschroder LJ. Clarity out of chaos: use of theory in implementation research. Psychiatry Res. 2019.

Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8.

Kessler RS, Purcell EP, Glasgow RE, Klesges LM, Benkeser RM, Peek CJ. What does it mean to “employ” the RE-AIM model? Evaluation & the Health Professions. 2013;36:44–66.

Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, Rycroft-Malone J, Meissner P, Murray E, Patel A, et al. Standards for Reporting Implementation Studies (StaRI): explanation and elaboration document. BMJ Open. 2017;7:e013318.

Lewis CC, Klasnja P, Powell BJ, Lyon AR, Tuzzio L, Jones S, Walsh-Bailey C, Weiner B. From classification to causality: advancing understanding of mechanisms of change in implementation science. Front Public Health. 2018;6.

Glanz K, Bishop DB. The role of behavioral science theory in development and implementation of public health interventions. Annu Rev Public Health. 2010;31:399–418.

WK Kellogg Foundation: Logic model development guide. Battle Creek, Michigan: WK Kellogg Foundation; 2004.

CFAR/ARC Ending the HIV Epidemic Supplement Awards [ https://www.niaid.nih.gov/research/cfar-arc-ending-hiv-epidemic-supplement-awards ].

Funnell SC, Rogers PJ. Purposeful program theory: effective use of theories of change and logic models. San Francisco, CA: John Wiley & Sons; 2011.

Petersen D, Taylor EF, Peikes D. The logic model: the foundation to implement, study, and refine patient-centered medical home models (issue brief). Mathematica Policy Research: Mathematica Policy Research Reports; 2013.

Davidoff F, Dixon-Woods M, Leviton L, Michie S. Demystifying theory and its use in improvement. BMJ Quality & Safety. 2015;24:228–38.

Fernandez ME, ten Hoor GA, van Lieshout S, Rodriguez SA, Beidas RS, Parcel G, Ruiter RAC, Markham CM, Kok G. Implementation mapping: using intervention mapping to develop implementation strategies. Front Public Health. 2019;7.

Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Admin Pol Ment Health. 2009;36.

Proctor EK, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health Ment Health Serv Res. 2011;38.

Rabin BA, Brownson RC. Terminology for dissemination and implementation research. In: Brownson RC, Colditz G, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. 2nd ed. New York, NY: Oxford University Press; 2017. p. 19–45.

Smith JD, Rafferty MR, Heinemann AW, Meachum MK, Villamar JA, Lieber RL, Brown CH: Evaluation of the factor structure of implementation research measures adapted for a novel context and multiple professional roles. BMC Health Serv Res 2020.

Smith JD, Berkel C, Jordan N, Atkins DC, Narayanan SS, Gallo C, Grimm KJ, Dishion TJ, Mauricio AM, Rudo-Stern J, et al. An individually tailored family-centered intervention for pediatric obesity in primary care: study protocol of a randomized type II hybrid implementation-effectiveness trial (Raising Healthy Children study). Implement Sci. 2018;13:1–15.

Fauci AS, Redfield RR, Sigounas G, Weahkee MD, Giroir BP. Ending the HIV epidemic: a plan for the United States: Editorial. JAMA. 2019;321:844–5.

Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci. 2012;7:50.

Brown CH, Curran G, Palinkas LA, Aarons GA, Wells KB, Jones L, Collins LM, Duan N, Mittman BS, Wallace A, et al. An overview of research and evaluation designs for dissemination and implementation. Annu Rev Public Health. 2017;38:1–22.

Krause J, Van Lieshout J, Klomp R, Huntink E, Aakhus E, Flottorp S, Jaeger C, Steinhaeuser J, Godycki-Cwirko M, Kowalczyk A, et al. Identifying determinants of care for tailoring implementation in chronic diseases: an evaluation of different methods. Implement Sci. 2014;9:102.

Waltz TJ, Powell BJ, Fernández ME, Abadie B, Damschroder LJ. Choosing implementation strategies to address contextual barriers: diversity in recommendations and future directions. Implement Sci. 2019;14:42.

Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4.

Atkins L, Francis J, Islam R, O’Connor D, Patey A, Ivers N, Foy R, Duncan EM, Colquhoun H, Grimshaw JM, et al. A guide to using the Theoretical Domains Framework of behaviour change to investigate implementation problems. Implement Sci. 2017;12:77.

Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, Proctor EK, Kirchner JE. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10.

Powell BJ, Fernandez ME, Williams NJ, Aarons GA, Beidas RS, Lewis CC, McHugh SM, Weiner BJ. Enhancing the impact of implementation strategies in healthcare: a research agenda. Front Public Health. 2019;7.

PAR-19-274: Dissemination and implementation research in health (R01 Clinical Trial Optional) [ https://grants.nih.gov/grants/guide/pa-files/PAR-19-274.html ].

Edmondson D, Falzon L, Sundquist KJ, Julian J, Meli L, Sumner JA, Kronish IM. A systematic review of the inclusion of mechanisms of action in NIH-funded intervention trials to improve medication adherence. Behav Res Ther. 2018;101:12–9.

Gaglio B, Shoup JA, Glasgow RE. The RE-AIM framework: a systematic review of use over time. Am J Public Health. 2013;103:e38–46.

Glasgow RE, Harden SM, Gaglio B, Rabin B, Smith ML, Porter GC, Ory MG, Estabrooks PA. RE-AIM planning and evaluation framework: adapting to new science and practice with a 20-year review. Front Public Health. 2019;7.

Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89:1322–7.

Damschroder LJ, Reardon CM, Sperber N, Robinson CH, Fickel JJ, Oddone EZ. Implementation evaluation of the Telephone Lifestyle Coaching (TLC) program: organizational factors associated with successful implementation. Transl Behav Med. 2016;7:233–41.

Bunger AC, Powell BJ, Robertson HA, MacDowell H, Birken SA, Shea C. Tracking implementation strategies: a description of a practical approach and early findings. Health Research Policy and Systems. 2017;15:15.

Boyd MR, Powell BJ, Endicott D, Lewis CC. A method for tracking implementation strategies: an exemplar implementing measurement-based care in community behavioral health clinics. Behav Ther. 2018;49:525–37.

Brown CH, Kellam S, Kaupert S, Muthén B, Wang W, Muthén L, Chamberlain P, PoVey C, Cady R, Valente T, et al. Partnerships for the design, conduct, and analysis of effectiveness, and implementation research: experiences of the Prevention Science and Methodology Group. Adm Policy Ment Health Ment Health Serv Res. 2012;39:301–16.

McNulty M, Smith JD, Villamar J, Burnett-Zeigler I, Vermeer W, Benbow N, Gallo C, Wilensky U, Hjorth A, Mustanski B, et al. Implementation research methodologies for achieving scientific equity and health equity. Ethn Dis. 2019;29:83–92.

Collins LM, Murphy SA, Strecher V. The multiphase optimization strategy (MOST) and the sequential multiple assignment randomized trial (SMART): new methods for more potent eHealth interventions. Am J Prev Med. 2007;32:S112–8.

Brown CH, Ten Have TR, Jo B, Dagne G, Wyman PA, Muthén B, Gibbons RD. Adaptive designs for randomized trials in public health. Annu Rev Public Health. 2009;30:1–25.

Smith JD: The roll-out implementation optimization design: integrating aims of quality improvement and implementation sciences. Submitted for publication 2020.

Dziak JJ, Nahum-Shani I, Collins LM. Multilevel factorial experiments for developing behavioral interventions: power, sample size, and resource considerations. Psychol Methods. 2012;17:153–75.

MacKinnon DP, Lockwood CM, Hoffman JM, West SG, Sheets V. A comparison of methods to test mediation and other intervening variable effects. Psychol Methods. 2002;7:83–104.

Graham ID, Tetroe J. Planned action theories. In: Straus S, Tetroe J, Graham ID, editors. Knowledge translation in health care: Moving from evidence to practice. Wiley-Blackwell: Hoboken, NJ; 2009.

Moullin JC, Dickson KS, Stadnick NA, Rabin B, Aarons GA. Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implement Sci. 2019;14:1.

Rycroft-Malone J. The PARIHS framework—a framework for guiding the implementation of evidence-based practice. J Nurs Care Qual. 2004;19:297–304.

Acknowledgements

The authors wish to thank our colleagues who provided input at different stages of developing this article and the Implementation Research Logic Model, and for providing the examples included in this article: Hendricks Brown, Brian Mustanski, Kathryn Macapagal, Nanette Benbow, Lisa Hirschhorn, Richard Lieber, Piper Hansen, Leslie O’Donnell, Allen Heinemann, Enola Proctor, Courtney Wolk-Benjamin, Sandra Naoom, Emily Fu, Jeffrey Rado, Lisa Rosenthal, Patrick Sullivan, Aaron Siegler, Cady Berkel, Carrie Dooyema, Lauren Fiechtner, Jeanne Lindros, Vinny Biggs, Gerri Cannon-Smith, Jeremiah Salmon, Sujata Ghosh, Alison Baker, Jillian MacDonald, Hector Torres and the Center on Halsted in Chicago, Michelle Smith, Thomas Dobbs, and the pastors who work tirelessly to serve their communities in Mississippi and Arkansas.

Funding

This study was supported by grant P30 DA027828 from the National Institute on Drug Abuse, awarded to C. Hendricks Brown; grant U18 DP006255 to Justin Smith and Cady Berkel; grant R56 HL148192 to Justin Smith; grant UL1 TR001422 from the National Center for Advancing Translational Sciences to Donald Lloyd-Jones; grant R01 MH118213 to Brian Mustanski; grant P30 AI117943 from the National Institute of Allergy and Infectious Diseases to Richard D'Aquila; grant UM1 CA233035 from the National Cancer Institute to David Cella; a grant from the Woman's Board of Northwestern Memorial Hospital to John Csernansky; grant F32 HS025077 from the Agency for Healthcare Research and Quality; grant NIFTI 2016-20178 from the Foundation for Physical Therapy; the Shirley Ryan AbilityLab; and by the Implementation Research Institute (IRI) at the George Warren Brown School of Social Work, Washington University in St. Louis, through grant R25 MH080916 from the National Institute of Mental Health and the Department of Veterans Affairs, Health Services Research & Development Service, and Quality Enhancement Research Initiative (QUERI) to Enola Proctor. The opinions expressed herein are the views of the authors and do not necessarily reflect the official policy or position of the National Institutes of Health, the Centers for Disease Control and Prevention, the Agency for Healthcare Research and Quality, the Department of Veterans Affairs, or any other part of the US Department of Health and Human Services.

Author information

Authors and affiliations.

Department of Population Health Sciences, University of Utah School of Medicine, Salt Lake City, Utah, USA

Justin D. Smith

Center for Prevention Implementation Methodology for Drug Abuse and HIV, Department of Psychiatry and Behavioral Sciences, Department of Preventive Medicine, Department of Medical Social Sciences, and Department of Pediatrics, Northwestern University Feinberg School of Medicine, Chicago, Illinois, USA

Center for Prevention Implementation Methodology for Drug Abuse and HIV, Department of Psychiatry and Behavioral Sciences, Feinberg School of Medicine; Institute for Sexual and Gender Minority Health and Wellbeing, Northwestern University Chicago, Chicago, Illinois, USA

Dennis H. Li

Shirley Ryan AbilityLab and Center for Prevention Implementation Methodology for Drug Abuse and HIV, Department of Psychiatry and Behavioral Sciences and Department of Physical Medicine and Rehabilitation, Northwestern University Feinberg School of Medicine, Chicago, Illinois, USA

Miriam R. Rafferty

Contributions

JDS conceived of the Implementation Research Logic Model. JDS, MR, and DL collaborated in developing the Implementation Research Logic Model as presented and in writing the manuscript. All authors approved the final version.

Corresponding author

Correspondence to Justin D. Smith.

Ethics declarations

Ethics approval and consent to participate.

Not applicable. This study did not involve human subjects.

Consent for publication

Competing interests.

None declared.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1.

IRLM Fillable PDF form

Additional file 2.

IRLM for Comparative Implementation

Additional file 3.

IRLM for Implementation of an Intervention Across or Linking Two Contexts

Additional file 4.

IRLM for an Implementation Optimization Study

Additional file 5.

IRLM example 1: Faith in Action: Clergy and Community Health Center Communication Strategies for Ending the Epidemic in Mississippi and Arkansas

Additional file 6.

IRLM example 2: Hybrid Type II Effectiveness–Implementation Evaluation of a City-Wide HIV System Navigation Intervention in Chicago, IL

Additional file 7.

IRLM example 3: Implementation, spread, and sustainment of Physical Therapy for Mild Parkinson’s Disease through a Regional System of Care

Additional file 8.

IRLM Quick Reference Guide

Additional file 9.

IRLM Worksheets

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article.

Smith, J.D., Li, D.H. & Rafferty, M.R. The Implementation Research Logic Model: a method for planning, executing, reporting, and synthesizing implementation projects. Implementation Sci 15 , 84 (2020). https://doi.org/10.1186/s13012-020-01041-8

Received : 03 April 2020

Accepted : 03 September 2020

Published : 25 September 2020

DOI : https://doi.org/10.1186/s13012-020-01041-8

Keywords

  • Program theory
  • Integration
  • Study specification
