
Identifying and selecting implementation theories, models and frameworks: a qualitative study to inform the development of a decision support tool

Abstract

Background

Implementation theories, models and frameworks offer guidance when implementing and sustaining healthcare evidence-based interventions. However, selection can be challenging given the myriad of potential options. We propose to develop a decision support tool that facilitates the appropriate selection of an implementation theory, model or framework in practice. To inform tool development, this study aimed to explore barriers and facilitators to identifying and selecting implementation theories, models and frameworks in research and practice, as well as end-user preferences for features and functions of the proposed tool.

Methods

We used an interpretive descriptive approach to conduct semi-structured interviews with implementation researchers and practitioners in Canada, the United States and Australia. Audio recordings were transcribed verbatim. Data were inductively coded by a single investigator, with a subset of 20% coded independently by a second investigator, and analyzed using thematic analysis.

Results

Twenty-four individuals participated in the study. Categories of barriers/facilitators, to inform tool development, included characteristics of the individual or team conducting implementation and characteristics of the implementation theory, model or framework. Major barriers to selection included inconsistent terminology, poor fit with the implementation context and limited knowledge about and training in existing theories, models and frameworks. Major facilitators to selection included clear and concise language and evidence that the theory, model or framework had been applied in a relevant health setting or context. Participants were enthusiastic about the development of a decision support tool that is user-friendly, accessible and practical. Preferences for tool features included key questions about the implementation intervention or project (e.g., purpose, stage of implementation, intended target for change) and a comprehensive list of relevant theories, models and frameworks to choose from along with a glossary of terms and the contexts in which they were applied.

Conclusions

An easy-to-use decision support tool that addresses key barriers to selecting an implementation theory, model or framework in practice may be beneficial to individuals who facilitate implementation practice activities. Findings on end-user preferences for tool features and functions will inform tool development and design through a user-centered approach.


Background

Over 100 different theories, models and frameworks exist to guide effective implementation and sustainability of evidence-based interventions or programs [1, 2]. These implementation theories, models and frameworks differ in complexity as well as in their aim, scope and intended target for change. For example, they may describe the different stages of implementation (e.g., process models); identify barriers and facilitators that influence implementation (e.g., determinant frameworks); or predict or explain implementation success by offering an underlying mechanism or theory of change (e.g., implementation theories) [3]. Further, some theories, models and frameworks are broad and address the entire implementation process, while others focus on a particular implementation aspect such as intervention sustainability. Implementation theories, models and frameworks also operate at one or more levels of change, from a health system to an individual. In many cases, using multiple theories, models and frameworks is useful to inform or address the scope and aims of an implementation project and to guide intervention development and testing at multiple levels [4,5,6].

Despite a growing interest in the appropriate selection and use of implementation theories, models and frameworks [7,8,9,10,11], it can be difficult to sift through and make sense of the various options available – especially when most are used in practice only once or with limited justification [2, 12]. For instance, participants in an implementation practice training course [13] reported that they struggled to identify and select suitable theories, models or frameworks to guide their work. Studies also suggest that implementation theories, models and frameworks may not be used appropriately [8, 14].

Implementation researchers and practitioners looking to identify a theory, model or framework to inform their work can access existing tools and publicly available resources such as guidance documents (e.g., [15,16,17]). For example, drawing on their personal experience working with novice implementation practitioners, Lynch and colleagues [10] suggested five questions to consider when selecting a theory, model or framework: who are you working with, when in the process are you going to use theory, why are you applying theory, how will you collect data and what resources are available. Birken and colleagues [9] developed a checklist of 16 criteria (organized within four categories: usability, validity, applicability, acceptability) for implementation researchers or practitioners to consult when selecting a theory, model or framework. A major limitation identified by the tool developers is the prerequisite of a candidate list of suitable theories, models or frameworks to draw from and compare [9]. Rabin and colleagues developed a database of models and frameworks, www.dissemination-implementation.org; however, the content is based on the findings of a narrative review of theories, models and frameworks [18] and is not comprehensive.

To address this problem, we propose to use the findings from a rigorous scoping review of over 300 implementation theories, models and frameworks [2] to develop a decision support tool, with input from implementation researchers and practitioners using qualitative research methods. A decision support tool provides structured guidance to help users make an explicit decision [19]. In this case, a decision support tool may facilitate appropriate selection of one or more implementation theories, models or frameworks by engaging the user to answer key questions, resulting in relevant options to consider. The decision support tool will be developed using rigorous methods guided by theory and evidence on user-centered design and implementation science. The overarching approach will be informed by the Knowledge-to-Action Cycle [20] and the United Kingdom Medical Research Council Framework for Development and Evaluation of Complex Interventions [21]. These methods have been used in the creation of other decision support tools [22]. As tool development is not the focus of this paper, details on the methods will be described in a subsequent development and evaluation paper.

To inform tool development, we sought the perspectives of implementation researchers and practitioners working in healthcare. Specifically, this study aimed to identify 1) barriers and facilitators to identifying and selecting implementation theories, models and frameworks in research and practice, and 2) preferences for features (i.e., content items) and functions of the proposed decision support tool.

Methods

Thorne’s interpretive descriptive approach [23] guided all aspects of this research, including the design and analysis. Interpretive description is grounded in traditional qualitative methodologies (e.g., phenomenology) that are derived from the social sciences; yet, it is oriented toward applied health disciplines such as implementation practice and designed to address real-world knowledge gaps [23].

Study design

We used Thorne’s interpretive descriptive approach to elicit the perspectives of implementation researchers and practitioners through individual interviews. We chose to conduct individual, semi-structured interviews to understand individual perspectives, including challenges and successes related to identifying and selecting implementation theories, models and frameworks in research and practice. While focus groups would have allowed for group interactions and may have helped participants generate and share their ideas [24], we were most interested in individual opinions and decision processes [23]. Therefore, we felt that interviews would be more informative for tool development. Feasibility was also a factor, as our participants were from a wide geographic area. We followed the Consolidated Criteria for Reporting Qualitative Research checklist [25] (Additional file 1). We obtained research ethics board approval from Unity Health Toronto (REB #16–335) and the University of Toronto (REB #33907). Ethics approval covered recruitment at the conferences and workshops, thereby including the study participants in the United States (USA) and Australia. Verbal informed consent was approved by the ethics boards and obtained (and audio-recorded) from all participants using a predetermined script prior to the phone interview.

Participant selection

Eligible study participants included implementation researchers and practitioners (e.g., administrators, clinicians, knowledge brokers) working in healthcare environments such as hospitals, academic research centers or universities, or broader community settings (e.g., public health or regulatory organizations). We defined implementation researchers as individuals who conducted implementation science, and implementation practitioners as individuals who facilitated implementation practice activities (including those who provided support through training and capacity building or knowledge brokering activities).

Study recruitment followed three approaches. First, we recruited in person at two international implementation conferences, one held in the USA in 2016 and one in Canada in 2017. At both conferences, we presented a poster on our scoping review of implementation theories, models and frameworks [2], distributed study information sheets to attendees who stopped to read the poster, and collected contact information from individuals who were interested in participating in our study. We then sent a personalized email to each individual to verify their interest and eligibility and schedule a phone interview. Second, we sent a personalized email to past participants of an implementation practice training course developed by the Knowledge Translation Program (St. Michael’s Hospital, Unity Health Toronto, Canada) [13] and delivered in Canada and Australia between 2015 and 2017. Third, we asked study participants to share the study information sheet with colleagues who might be interested in participating. We sent a personalized email to individuals referred to us by study participants. Up to two more emails were sent to non-responders.

These different recruitment approaches were selected because they targeted diverse implementation researchers and practitioners who were interested in, and had experience with, implementation. The sample was expected to reflect the perspectives of our target end-users of the proposed decision support tool. A sample size of 20–30 participants was expected to provide sufficient information to answer the research question through semi-structured interviews and was considered a feasible range given the available resources [23, 26].

Data collection

Interviews were conducted over the phone by one investigator (LS) between September 2017 and January 2018. A semi-structured interview guide (Additional file 1) was prepared and revised as needed throughout data collection. Part 1 of the interview explored the barriers and facilitators to identifying, selecting and using implementation theories, models or frameworks in research and practice. It included participants’ views and understanding of theories, models and frameworks and the processes used for considering one or more to inform their implementation activities. The interview guide questions were loosely informed by the Theoretical Domains Framework [27] as a starting point, to allow for inductive analysis. Direct questions inquiring about perceived barriers and facilitators were also included, while still allowing for free-flowing discussion. The Theoretical Domains Framework is a validated determinant framework [28] that has been applied in numerous implementation studies to uncover the underlying barriers to and facilitators of behaviour change. Further, the framework includes a comprehensive set of barriers at the individual or person level, along with the organizational level (e.g., groups of individuals), which we felt were most important to understand when developing a decision support tool to meet the needs of our targeted end-user. Part 2 of the interview explored the features and functions of a hypothetical decision support tool that would be important to participants as target end-users of the tool. The interview guide was reviewed by and pilot tested with three individuals, all experienced in qualitative research and implementation science and practice, and one of whom was also a clinician. Each interview lasted 30–60 min and was audio-recorded and transcribed verbatim.

Data analysis

Following an interpretive descriptive approach, we conducted a thematic analysis of the data to synthesize meanings across codes and generate a narrative of the key themes to inform subsequent tool development [23, 29]. Data analysis occurred concurrently with data collection. We used NVivo 12 qualitative data analysis software (QSR International, Cambridge, MA) to organize and code the transcripts. Once the audio-recorded interviews were transcribed and verified for accuracy, they were de-identified using a master linking log, prior to being imported into NVivo. After reading through the first few transcripts to become familiar with the data, we used open coding to create codes from the text and drafted a coding framework. This coding framework was revised iteratively throughout data collection and analysis. All data were coded inductively by a single investigator (LS), with a subset of 20% (i.e., 5 transcripts in total) coded by a second investigator (JB) with high concordance achieved. This duplicate coding process was done at the start and end of data collection to ensure consistency of themes. Representative quotes from participants were selected to support the themes and study findings. The final manuscript was shared with participants for feedback on the research findings.

Results

Participant characteristics

Twenty-four individuals consented to participate: 16 were from Canada, seven from the USA and one from Australia (Table 1). One eligible participant declined consent due to a confidentiality agreement with their current employer. Of the eligible workshop participants contacted, two were not reached due to undeliverable email addresses and 33 did not respond to our email invitation. Participants were recruited until no new themes were identified; therefore, not all workshop participants were sent a study invitation. Participants worked in a variety of healthcare environments including hospitals, academic research centers, universities, government organizations, and regulatory organizations. Participants had a range of experience supporting implementation activities in healthcare environments and reported working in implementation for 1.5 to over 20 years. Of the 24 participants, 11 (46%) had completed a “Practicing Knowledge Translation” course developed by the Knowledge Translation Program at St. Michael’s Hospital, Unity Health Toronto, Canada [13]. In terms of knowledge, 14 (58%) participants rated themselves as very or extremely knowledgeable about or familiar with implementation theories, models and frameworks, and 13 (54%) as very or extremely confident in selecting and applying them to their work. Sixteen (67%) participants reported frequently or always selecting an implementation theory, model or framework and applying it to their work.

Table 1 Participant characteristics

Barriers and facilitators to identifying and selecting implementation theories, models or frameworks

Four broad categories and 10 factors, generated from the data, influenced identification and selection of implementation theories, models and frameworks and were relevant to tool development (Fig. 1). Illustrative interview excerpts are presented in Tables 2 and 3.

Fig. 1

Categories and factors influencing the identification and selection of an implementation theory, model or framework

Table 2 Interview excerpts supporting key factors related to category 1 ‘characteristics of individual or team conducting implementation’
Table 3 Interview excerpts supporting key factors related to category 2 ‘characteristics of implementation theory, model or framework’

Category 1: characteristics of the individual or team conducting implementation

Factor 1: attitudes about the importance of selecting theories, models and frameworks

Participants reported having a general understanding of theories, models and frameworks and described several uses in implementation research and practice. For example, many participants found that Nilsen’s 2015 taxonomy [3] was useful for defining a theory versus a model versus a framework and referred to the taxonomy when describing their similarities and differences. Some participants said their understanding was grounded in their learnings from the “Practicing Knowledge Translation” course. Others described their understanding of implementation theories, models and frameworks in terms of their clinical or health discipline, such as the Iowa Model for Evidence-based Practice to Promote Quality Care [30], which originates in the nursing field. In general, frameworks and models were described as being descriptive and useful for clarifying aspects of a complicated process. Theories were viewed as being more explicit about how certain phenomena are operating and how change might be occurring.

Participants mentioned using 28 different implementation theories, models and frameworks to inform their work (Table 4). Participants described the important role that theories, models and frameworks play in advancing implementation understanding, especially regarding planning, developing and sustaining effective interventions and implementation strategies. Some of the described uses of theories, models and frameworks included: informing the research question; justifying and organizing an implementation project; guiding the selection and tailoring of implementation strategies; helping to achieve intended outcomes; and analyzing, interpreting, generalizing, or applying the findings of an implementation project. Other benefits to their use included providing a good starting point for implementation, providing a systematic or pragmatic approach for implementation, avoiding overlooking key categories or processes of implementation, and increasing methodological rigor. Participants commented on the importance of engaging in practices that are informed by theories, models and frameworks and evidence.

Table 4 Implementation theories, models and frameworks used by participants

While all participants agreed on the utility of frameworks and models, such as the Knowledge-to-Action Cycle [20], a few were skeptical of the value of using theory to enhance knowledge of the complexity of implementation; they preferred to avoid selecting a formal theoretical approach. Others lacked experience with theory-driven implementation. A few believed that implementation practitioners may not feel the same level of “pressure” to use a theory, model or framework in their role compared to an implementation researcher.

Factor 2: knowledge of existing implementation theories, models and frameworks

Knowledge of existing implementation theories, models and frameworks and where to find them were perceived to be important. Some participants struggled to identify new theories, models or frameworks to inform their work, and identified their lack of knowledge of the breadth of options as an important barrier. Most participants favoured one or more implementation theories, models or frameworks and used them repeatedly, stating that it was easy to use what was familiar. Many did not follow an explicit process for identifying a new theory, model or framework. Access to a comprehensive repository or database of existing implementation theories, models and frameworks was perceived as helpful. Participants also suggested having at least one implementation team member with up-to-date knowledge of what implementation theories, models and frameworks exist, where to find them and their uses.

Factor 3: training related to implementation theories, models and frameworks

Participants talked about the relationship between selecting implementation theories, models and frameworks in research or practice and their training experience. For example, most participants selected theories, models and frameworks for which they received specific training. Major barriers to selection included inadequate background or research training in implementation theories, models and frameworks, and lack of training or expertise in implementation research methods or practice. Some participants spoke about the challenge of getting others (e.g., senior administrators, healthcare providers) to buy into the use of a certain theory, model or framework, especially if they were not familiar with the application of theory. Facilitators to selection included gaining appropriate training through participation in capacity building activities, such as accessing implementation workshops, conferences, coaching, mentoring, train-the-trainer approaches or communities of practice. Examples included working with someone who was formally trained on the theory, model or framework, or receiving feedback from implementation experts who used it to inform their work.

Category 2: characteristics of the implementation theory, model or framework

Factor 4: language and terminology used to describe the theory, model or framework

Language and terminology were key factors for identification and selection. Participants described the language used in implementation theories, models and frameworks as “complex”, “abstract”, “complicated” and “confusing”. In particular, the use of jargon and lack of clear construct definitions were identified as major barriers. Further, several participants struggled with overlapping constructs, and the inconsistent terms used to describe them across theories, models and frameworks. For example, the same term or definition may be used for different constructs, or different terms or definitions may be used for the same constructs. A few participants commented on the inaccurate and inconsistent use of the term theory versus model versus framework, both in research and in practice settings. This appeared to be common with theories versus frameworks (e.g., calling something a theory but referring to a framework). Facilitators included clear and concise language, and clearly defined constructs to help differentiate among the various theories, models and frameworks.

Factor 5: fit of the theory, model or framework to the implementation project

Another key factor for identification and selection was the level of fit or appropriateness of the theory, model or framework to the implementation project. Specifically, a poor fit between the context in which the theory, model or framework was developed or had been applied, and the context of the implementation project was identified as a major barrier. For example, many theories, models and frameworks were developed for a specific condition or health behaviour and had not yet been applied in different contexts. Important aspects of the context included the research question, purpose or goal; health problem; setting; population; and level of behaviour change. Evidence that the theory, model or framework had been applied in practice in a similar context (such as relevant examples of applications in the literature) facilitated appropriate selection. Participants stated that seeing a description of the contexts in which the theory, model or framework was previously used was helpful when determining fit. Being aware of a theory, model or framework’s underlying assumptions and its limitations also informed appropriateness and applicability. Other related challenges included the interchangeability, compatibility and adaptability of implementation theories, models and frameworks. For example, some participants struggled with the trade-offs of selecting one theory, model or framework over another. Participants perceived that guidance on comparing different options would facilitate appropriate selection. Some noted that theories, models and frameworks often overlap or are highly derivative of each other, which adds to the complexity of combining more than one within an implementation project. It was deemed helpful to highlight theories, models or frameworks that fit well together, such as the research by Michie and colleagues linking Capabilities Opportunities Motivation Behaviour with the Theoretical Domains Framework [31]. For others, implementation theories, models or frameworks that allowed for some modification were appealing, but participants struggled with how to modify or change aspects to improve fit while maintaining fidelity to key elements.

Factor 6: ease of use of the implementation theory, model or framework

Ease of use in practice was perceived to influence selection of a theory, model or framework. Some participants described implementation theories, models and frameworks as “not intuitive to use” and difficult to operationalize in the context of their own implementation project, even when the theory, model or framework was viewed as a relevant option. Facilitators to selection and use included existing online tools and publicly available resources, such as websites dedicated to specific theories, models or frameworks (e.g., the Consolidated Framework for Implementation Research). In terms of measurement challenges, a few participants cited a lack of relevant measures for key variables across theories, models and frameworks, as well as variability in the extent to which measures were developed to assess constructs. Participants preferred theories, models or frameworks that were “highly actionable”, “pragmatic” and “easy to operationalize” in practice, with detailed processes for the measures themselves that were compatible with their setting.

Factor 7: evidence supporting the implementation theory, model or framework

Empirical evidence of effectiveness, including strength of evidence supporting the theory, model or framework, influenced selection. Implementation theories were described as “fairly loose” and “without solid evidence” compared to theories in other scientific fields (e.g., physical sciences). Further, within a theory, model or framework, the level of evidence was perceived to be uneven across domains or specific processes. A summary of the evidence supporting a theory, model or framework, including the evidence used to create it and evidence of its effectiveness, was deemed to be an important facilitator. Participants also felt it was important that the theory, model or framework constructs and concepts had face validity and made sense in terms of the implementation research question or goal.

Categories 3 and 4

Other important barriers and facilitators mentioned by participants were related to characteristics of the healthcare environment (Category 4) and, to a lesser extent, characteristics of the implementation intervention or project (Category 3).

Availability of resources (Factor 10) within complex healthcare environments (Category 4), such as time, staffing and capacity, funding and access to data, was identified as both a barrier and a facilitator to selection. Many participants also described a “tension” between time and robustness of implementation. For example, a lack of time to invest in the understanding and use of a theory, model or framework (e.g., competing demands or pressure to fix the problem right away) was a major barrier, while taking the time to create an implementation plan that included consideration of theories, models or frameworks at implementation onset was a facilitator. Theory, model or framework selection was also influenced by staff and stakeholder support, such as having an inadequate number of project staff available or being the sole implementation practitioner within an organization. It was deemed important to “assemble the right people at the right table” to avoid siloed practice and redundancy.

Finally, a few participants mentioned factors related to the implementation project (Category 3), such as consideration of the purpose, problem or goal and intended outcome (Factor 8). For instance, it may be inappropriate to select a theory when part of the research question or outcome of an implementation project was to further develop theory. Another relevant factor that presented a challenge to selection was the level of intervention complexity (Factor 9), including the type of intended behaviour change (e.g., individual, program, practice, policy), and the implementation stage (e.g., planning, evaluation, sustainability) for the project.

Features and functions of a decision support tool

Participants were enthusiastic and receptive to the idea of a decision support tool targeted to implementation practitioners. The following key features and functions were suggested to inform tool development. Illustrative interview excerpts are presented in Table 5.

Table 5 Interview excerpts supporting key tool features and functions

Features or content items

Most importantly, the tool should include a comprehensive list of existing implementation theories, models and frameworks to choose from. Suggested content items included characteristics of the theories, models and frameworks matched with characteristics of the end-user’s implementation project (e.g., aim, scope and level of change). Participants suggested organizing the theories, models and frameworks according to their purpose (including their intended aim, scope and level of change) to align them with end-users’ needs. Alternatively, one participant (ID1) suggested starting with the project end goal or outcome, and reviewing theories, models and frameworks that include that outcome as a relevant construct. Many participants also suggested including the context in which the theories, models and frameworks have been applied, along with links to seminal articles and examples of real-world use. Linking the tool with seminal articles would allow end-users to see examples of what has been done, and perhaps gauge ease of use, as well as where the literature may or may not be saturated. Some participants suggested summarizing the evidence supporting each theory, model and framework to highlight those that have been validated. A few participants suggested content items related to the availability of implementation resources, such as the project timelines, number of stakeholders, guidance and team expertise, and financial support.

Functions

Participants suggested that the tool be simple and easy to use by the target end-user (i.e., implementation practitioners). They identified that it should provide the user with a modest set of key questions or prompts that start off broad and become more specific. For example, the tool could respond to the user’s input by guiding them toward more specific theories, models and/or frameworks. The tool should also be practical in that the level of content detail fits the intended tool audience and purpose. Being highly accessible through an open access web-based platform was also important. Further, accommodating a team-based approach (e.g., permitting access and use of the tool by an entire multi-disciplinary implementation team) would foster collaboration. Other suggested features included: interactive viewing or search capabilities (e.g., clicking on an interactive theory, model or framework diagram or figure for more information, or searching by key word or construct name); webinars or instructional videos led by experts on when (and how) to use the theory, model or framework; the use of “storytelling” (e.g., case studies) to increase personal connection; and built-in chat room capabilities to connect or collaborate with and receive feedback from others in the field who have experience selecting and using the implementation theory, model or framework. Finally, a few participants suggested an embedded evaluation component whereby users may consent to complete a survey to provide feedback on the tool.

Discussion

Our findings revealed that factors related to the theory, model or framework, the individual or team conducting implementation and the implementation project are critical to consider when developing a decision support tool. Key barriers to selection related to characteristics of the theory, model or framework included inaccurate and inconsistent language, poor fit with the implementation context, lack of appropriate measures and limited empirical evidence of effectiveness. These findings are supported by a recent international survey of over 200 implementation researchers and practitioners who rated ‘empirical support’ and ‘application to a specific population or setting’ as the most important criteria for selection; nevertheless, survey respondents also reported selecting a theory, model or framework based on convenience or familiarity [1]. Similarly, we found that a lack of knowledge of and familiarity with existing implementation theories, models and frameworks, along with a lack of proper training in their use, were key individual/team-level barriers to selection. These knowledge and skills barriers were not surprising given the abundance of implementation theories, models and frameworks coupled with low citation rates in the literature, indicating they are not commonly used [2, 12]. Our study reaffirmed this finding: a group of implementation researchers and practitioners with high self-rated knowledge and experience generated a list of 28 theories, models and frameworks, representing less than 20% of those identified in a scoping review.
While there may be benefits to selecting a highly cited theory, model or framework (such as comparability of results across populations or health behaviours [32], or greater availability of resources for operationalization and measurement [12]), a systematic and comprehensive approach to identifying and selecting theories, models and frameworks is necessary to advance implementation science and practice.

There are numerous determinant frameworks that we could have chosen to inform our interview guide. For example, our team recently mapped over 300 implementation theories, models and frameworks to Nilsen’s taxonomy [3] and identified over 50 determinant frameworks targeting at least individual-level change; however, many did not include a comprehensive set of barriers and facilitators (unpublished data). Our findings on the barriers and facilitators to selecting a theory, model or framework, in the context of informing a decision support tool, are supported by the Theoretical Domains Framework. For example, the ‘knowledge’ domain considers having the knowledge to locate and understand existing theories, models and frameworks. The ‘skills’ and ‘beliefs about capabilities’ domains focus on having the skills required to select a theory, model or framework in practice and consider how easy or difficult this task is for an individual or team. The ‘social/professional role and identity’ and ‘optimism’ domains consider attitudes about the importance of using theories, models and frameworks, specifically whether an individual believes that selecting and using them is part of their role as an implementation researcher or practitioner and that doing so will benefit their implementation work. The ‘goals’ and ‘intentions’ domains focus on wanting to select and use theories, models and frameworks and then making a conscious decision to include them in implementation work, for example, by using a decision support tool. Finally, the ‘environmental resources’ domain considers having the time and funds to invest in the selection process.

A decision support tool addressing our findings on barriers and facilitators to selection might include a comprehensive list of theories, models and frameworks, a glossary of key terms, the contexts in which the theories, models and frameworks have been developed and applied (including examples of application), and any available evidence supporting their validity. Other suggested features for consideration during tool development included the purpose, goal or intended outcome of the implementation project, as well as the target population and the intended target for change. It would be quite challenging, as tool developers, to systematically categorize existing theories, models and frameworks according to factors such as the amount of time or funding required for use; it may be more beneficial for end-users to reflect on these environment-level factors as key considerations when selecting a particular theory, model or framework from the options provided by the tool. Findings on end-user preferences for tool features and functions will inform tool development and design through a user-centered approach [33].

Limitations

The following study limitations should be considered. First, we used a convenience sample of implementation conference and course attendees; as a result, close to half of our participant sample had completed a “Practicing Knowledge Translation” course. We were therefore mindful during recruitment to ensure representation from different types of healthcare environments, roles and levels of experience. Although we did not intend to reach saturation within each of these fields given our sample size, we did obtain saturation of themes and had an adequate sample size for qualitative interviews [23]. Second, we chose to interview implementation researchers and practitioners with some implementation practice experience (i.e., the target end-users of our tool) because we felt this experience would be necessary to identify the underlying barriers and facilitators. As such, all study participants described having a baseline understanding of at least a few implementation theories, models and frameworks. While many participants rated their knowledge and confidence with identifying, selecting and using implementation theories, models and frameworks as fairly high, for many this rating reflected their knowledge of and confidence with the theories, models or frameworks that they were most familiar with and used repeatedly to guide their work.

Conclusion

Individuals who are doing implementation work face many challenges, including how to identify and select appropriate implementation theories, models and frameworks to inform their projects. Key barriers to selection identified in this study included inconsistent language, poor fit and limited knowledge about and training in theories, models and frameworks. These barriers, together with the findings of our scoping review on existing theories, models and frameworks, will inform and tailor the features and functions of a proposed decision support tool for use by implementation practitioners. Our findings from this interview-based study suggest the tool should be easy to use, accessible and feature questions about the implementation project’s purpose, scope and intended target for change, in addition to presenting a comprehensive list of relevant theories, models and frameworks and the contexts in which they were applied.

Availability of data and materials

Not applicable.

References

1. Birken SA, Powell BJ, Shea CM, Haines ER, Alexis Kirk M, Leeman J, et al. Criteria for selecting implementation science theories and frameworks: results from an international survey. Implement Sci. 2017;12(1):124.
2. Strifler L, Cardoso R, McGowan J, Cogo E, Nincic V, Khan PA, et al. Scoping review identifies significant number of knowledge translation theories, models, and frameworks with limited use. J Clin Epidemiol. 2018;100:92–102.
3. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10(1):53.
4. Grol R, Grimshaw J. From best evidence to best practice: effective implementation of change in patients' care. Lancet. 2003;362(9391):1225–30.
5. Estabrooks CA, Thompson DS, Lovely JJE, Hofmeyer A. A guide to knowledge translation theory. J Contin Educ Health Prof. 2006;26(1):25–36.
6. Glanz K, Rimer BK, Viswanath K. Health behavior and health education: theory, research, and practice. 4th ed. San Francisco: Wiley; 2008.
7. The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG). Designing theoretically-informed implementation interventions. Implement Sci. 2006;1(1):4.
8. Davidoff F, Dixon-Woods M, Leviton L, Michie S. Demystifying theory and its use in improvement. BMJ Qual Saf. 2015;24(3):228.
9. Birken SA, Rohweder CL, Powell BJ, Shea CM, Scott J, Leeman J, et al. T-CaST: an implementation theory comparison and selection tool. Implement Sci. 2018;13(1):143.
10. Lynch EA, Mudge A, Knowles S, Kitson AL, Hunter SC, Harvey G. “There is nothing so practical as a good theory”: a pragmatic guide for selecting theoretical approaches for implementation projects. BMC Health Serv Res. 2018;18:857.
11. May CR, Cummings A, Girling M, Bracher M, Mair FS, May CM, et al. Using normalization process theory in feasibility studies and process evaluations of complex healthcare interventions: a systematic review. Implement Sci. 2018;13(1):80.
12. Skolarus TA, Lehmann T, Tabak RG, Harris J, Lecy J, Sales AE. Assessing citation networks for dissemination and implementation research frameworks. Implement Sci. 2017;12(1):97.
13. Moore JE, Rashid S, Park JS, Khan S, Straus SE. Longitudinal evaluation of a course to build core competencies in implementation practice. Implement Sci. 2018;13(1):106.
14. Field B, Booth A, Ilott I, Gerrish K. Using the knowledge to action framework in practice: a citation analysis and systematic review. Implement Sci. 2014;9:172.
15. Department of Veterans Health Administration, Health Services Research & Development, Quality Enhancement Research Initiative. Implementation guide. 2013. Available at: https://www.queri.research.va.gov/implementation/. Accessed 8 Jan 2020.
16. Center for Research in Implementation Science and Prevention, University of Colorado Anschutz Medical Campus. Dissemination and implementation in health training guide and workbook. 2013. Available at: http://www.crispebooks.org/. Accessed 8 Jan 2020.
17. Straus SE, Tetroe J, Graham ID. Knowledge translation in health care: moving from evidence to practice. 2nd ed. Oxford: Wiley Blackwell; 2013.
18. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43(3):337–50.
19. Timmings C, Khan S, Moore JE, Marquez C, Pykal K, Straus SE. Ready, set, change! Development and usability testing of an online readiness for change decision support tool for healthcare organizations. BMC Med Inform Decis Mak. 2016;16:24.
20. Graham ID, Logan J, Harrison MB, Straus SE, Tetroe J, Caswell W, et al. Lost in knowledge translation: time for a map? J Contin Educ Health Prof. 2006;26(1):13–24.
21. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008;337:a1655.
22. Kastner M, Straus SE. Application of the knowledge-to-action and Medical Research Council frameworks in the development of an osteoporosis clinical decision support tool. J Clin Epidemiol. 2012;65(11):1163–70.
23. Thorne S. Interpretive description: qualitative research for applied practice. 2nd ed. New York: Routledge; 2016.
24. Parsons MGJ. A guide to the use of focus groups in health care research: part 1. Contemp Nurse. 2000;9(2):169–80.
25. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19(6):349–57.
26. Nastasi B. Qualitative research: sampling & sample size considerations. 2010. Available at: https://my.laureate.net/Faculty/docs/Faculty%20Documents/Forms/AllItems.aspx. Accessed 1 Oct 2016.
27. Michie S, Johnston M, Abraham C, Lawton R, Parker D, Walker A, et al. Making psychological theory useful for implementing evidence based practice: a consensus approach. Qual Saf Health Care. 2005;14(1):26–33.
28. Cane J, O’Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. 2012;7(1):37.
29. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101.
30. Iowa Model Collaborative, Buckwalter KC, Cullen L, Hanrahan K, Kleiber C, McCarthy AM, et al. Iowa model of evidence-based practice: revisions and validation. Worldviews Evid Based Nurs. 2017;14(3):175–82.
31. Michie S, van Stralen MM, West R. The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implement Sci. 2011;6:42.
32. Bastani R, Glenn BA, Taylor VM, Chen MS, Nguyen TT, Stewart SL, et al. Integrating theory into community interventions to reduce liver cancer disparities: the health behavior framework. Prev Med. 2010;50(1–2):63–7.
33. Mao J, Vredenburg K, Smith PW, Carey T. The state of user-centered design practice. Commun ACM. 2005;48(3):105–9.

Acknowledgements

We would like to thank Melissa Courvoisier, Dr. Julia Moore and Dr. Rae Thomas for their assistance with recruitment from the Practicing Knowledge Translation course; Christine Marquez for her qualitative research expertise; and all the individuals who participated in the interviews, for their support and contribution to this work.

Funding

Lisa Strifler is funded by a Canadian Institutes of Health Research Banting Doctoral Research Award (#146261). Sharon E. Straus is funded by a Tier 1 Canada Research Chair in Knowledge Translation and the Mary Trimmer Chair in Geriatric Medicine. The funders had no role in the design of the study, the collection, analysis or interpretation of data, or the writing of the manuscript.

Author information

Contributions

LS and SES conceptualized and designed the study. LS, JMB and SES collected, analysed and/or interpreted the data. LS drafted the manuscript and JMB, MH and SES provided input and revised the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Sharon E. Straus.

Ethics declarations

Ethics approval and consent to participate

Research ethics board approval was obtained from Unity Health Toronto (REB #16–335) and the University of Toronto (REB #33907). Ethics approval covered recruitment at the conferences and workshops, which covered the study participants in the USA and Australia. Verbal informed consent was approved by the ethics boards and obtained and recorded at the start of the phone interview using a predetermined script.

Consent for publication

Not applicable.

Competing interests

All authors declare no potential (or perceived) conflicts of interest.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Strifler, L., Barnsley, J.M., Hillmer, M. et al. Identifying and selecting implementation theories, models and frameworks: a qualitative study to inform the development of a decision support tool. BMC Med Inform Decis Mak 20, 91 (2020). https://0-doi-org.brum.beds.ac.uk/10.1186/s12911-020-01128-8

Keywords

  • Implementation
  • Theory
  • Model
  • Framework
  • Interviews
  • Decision support