
      Do dentists practice what they know? A cross-sectional study on the agreement between dentists' knowledge and practice in restoring endodontically treated teeth


          Abstract

          Background

          Few studies have compared dentists' knowledge with their clinical approach, despite the possible gap between what they know and what they do.

          Aim

          To measure the agreement between knowledge and practice methods related to a selected clinical scenario involving the placement of an indirect post in endodontically treated teeth (ETT) among different types of practitioners.

          Methods

          An electronic questionnaire was emailed to members of the Saudi Dental Society. It presented a clinical scenario of restoring a posterior ETT with an indirect post, core unit, and crown, followed by questions on ten treatment aspects, such as who prepares the post space, the preparation technique, isolation, timing, the gap between gutta-percha and post, and the time to crown cementation. Each aspect was asked about twice: once for the respondent's own practice method and once for what they believed to be the correct practice (knowledge). The relationship between responses and specialty was analyzed with Pearson's chi-square test, and the agreement between each participant's knowledge and practice responses was measured with Cohen's kappa.
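The kappa part of this analysis can be sketched in a few lines. The following is an illustrative Python re-implementation of Cohen's kappa for paired knowledge/practice responses, not the study's actual analysis code; the response data and category labels are invented for the example.

```python
from collections import Counter

def cohen_kappa(practice, knowledge):
    """Cohen's kappa for two paired lists of categorical responses."""
    n = len(practice)
    # Observed agreement: fraction of participants whose practice
    # answer matches their knowledge answer.
    po = sum(p == k for p, k in zip(practice, knowledge)) / n
    # Expected chance agreement from the marginal category frequencies.
    cp, ck = Counter(practice), Counter(knowledge)
    pe = sum(cp[c] / n * ck[c] / n for c in cp.keys() & ck.keys())
    return (po - pe) / (1 - pe)

# Invented responses for one aspect (e.g. isolation method) from five
# participants -- not data from the study.
practice  = ["rubber dam", "cotton rolls", "rubber dam", "cotton rolls", "rubber dam"]
knowledge = ["rubber dam", "rubber dam",  "rubber dam", "cotton rolls", "rubber dam"]
print(round(cohen_kappa(practice, knowledge), 3))  # → 0.545
```

Kappa ranges from −1 to +1: 1 indicates perfect knowledge–practice agreement, and 0 indicates agreement no better than chance.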

          Results

          A total of 203 completed questionnaires were analyzed. Most participants were 30 years old or younger (62.6%) and general dental practitioners (59%). When each participant's knowledge was compared with their practice methods, nine of the ten aspects showed "weak" agreement or worse (kappa < 0.59, p < 0.001). Only one aspect demonstrated "strong" agreement (kappa = 0.804): the duration of time between obturation and post space preparation in the presence of a periapical lesion. However, even this strongly agreed-upon response was not aligned with current evidence. Responses also differed significantly among endodontists, restorative dentists, and general practitioners for most aspects.

          Conclusion

          Overall, agreement between what practitioners know and what they do was weak in most aspects of a selected clinical scenario involving the placement of an indirect post in a posterior ETT. Moreover, the participants' specialty influenced their responses regarding both knowledge and clinical practice.

          Supplementary Information

          The online version contains supplementary material available at 10.1186/s12903-021-01479-2.


                Author and article information

                Contributors
                rbabaier@ksu.edu.sa

                Journal
                BMC Oral Health, BioMed Central (London); ISSN 1472-6831
                Published 10 March 2021; Volume 21, Article 110

                Affiliations
                [1] Department of Prosthetic Dental Sciences, College of Dentistry, King Saud University, Riyadh 12372, Saudi Arabia
                [2] Department of Restorative Dental Sciences, College of Dentistry, King Saud University, Riyadh, Saudi Arabia
                [3] Present address: Division of Dentistry, Faculty of Biology, Medicine and Health, School of Medical Sciences, University of Manchester, Manchester, UK

                Author information
                ORCID: http://orcid.org/0000-0001-9565-8275

                Article
                DOI: 10.1186/s12903-021-01479-2
                PMC: 7945671
                PMID: 33691705
                © The Author(s) 2021

                Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

                History
                Received: 25 June 2020
                Accepted: 3 March 2021

                Categories
                Research Article

                Subject: Dentistry
                Keywords: patient care, practice gap, evidence-based dentistry, post and core, root canal therapy, post space
