

Health information systems evaluation criteria: Overview of systematic reviews

Ali Sharifi Kia1; Mohammad Beheshti2; Leila Shahmoradi*, 3

1. MSc Student of Medical Informatics, Department of Health Information Management, School of Health Management and Information Sciences, Iran University of Medical Sciences, Tehran, Iran, 2. MSc in Health Informatics, Department of Health Management and Informatics, School of Medicine, University of Missouri, Columbia MO, USA, 3. Associate Professor, Department of Health Information Management, School of Allied Medical Sciences, Tehran University of Medical Sciences, Tehran, Iran

Correspondence: *Corresponding author: Leila Shahmoradi, Associate Professor, Department of Health Information Management, School of Allied Medical Sciences, Tehran University of Medical Sciences, Tehran, Iran


Abstract

Introduction: Health information systems (HIS) play an important role in improving the quality of patient care and patient safety. To ensure their effectiveness and efficiency, they need to be evaluated. Although HIS evaluation has been investigated in many studies, there is no consensus on which aspects of a HIS to evaluate. The aim of this study is to identify indicators for the evaluation of health information systems and to provide an overview of the criteria devised and the studies conducted.

Material and Methods: An umbrella review was performed, searching the PubMed, ScienceDirect, Web of Science, Science, and IEEE databases and following the PRISMA protocol. Articles were reviewed by two authors independently, using the Covidence tool, to check the inclusion criteria and to extract the data items. Risk of bias was assessed using ROBIS and AMSTAR.

Results: All included studies showed a high risk of bias according to the ROBIS criteria. The extracted evaluation criteria were classified into 13 categories. Most of the studies concluded that a more reliable and standardized tool is needed for the evaluation of health information systems. Two studies mentioned that surveys and questionnaires were the most commonly used method for evaluating the systems. Summative evaluation was the most used method in two studies and the least used method in another.

Conclusion: All the included studies had a high risk of bias; accordingly, further research and evidence are needed in this field. Most of the studies highlighted the need for more reliable and standardized tools for the evaluation of health information systems.

Received: 2022 April 23; Accepted: 2022 July 11

FHI. 2022 Jul 20; 11(1): 120
doi: 10.30699/fhi.v11i1.376

Keywords: Health information systems, Evaluation, Criteria, Systematic review.

INTRODUCTION

Management is essential for the efficiency and effectiveness of health services. Because of increasing demands, health care needs to achieve more while spending fewer resources [1]. One way to achieve this goal is to use health information systems (HIS) to improve the effectiveness of management and services [2].

Health information systems are of vital importance for estimating the health needs of populations and for planning the implementation of healthcare interventions [3]. Implementing these systems leads to improved quality of patient care, medical error prevention, cost reduction, improved satisfaction, improved data quality, and improved accessibility for patients [4, 5]. They range from simple systems, e.g., transaction systems, to more complex systems such as decision support systems (DSS).

A crucial point in the implementation of these systems is their evaluation, to ensure effective delivery and to manage unwanted outcomes [6, 7]. HIS evaluation is “the act of measuring or exploring attributes of a HIS (in planning, development, implementation, or operation), the result of which informs a decision to be made concerning that system in a specific context” [8]. Alongside their benefits, HISs also carry hazards, for instance unreliability, poor user-friendliness, functional errors, and an environment unprepared for implementation. Evaluation can help root out these imperfections and hazards [9].

Accurately evaluating HISs depends on choosing the right criteria [10]. Despite the abundant literature on HIS evaluation, evaluations are prone to many flaws, including complexity, lack of clarity, changing evaluation goals during the study, and the influence of users’ expectations [11]. There is no agreement in evaluations on “what to evaluate” [12]. This problem originates from the complex nature of HIS evaluations [13]. Evaluation of these systems can be performed at any of the three stages of the system development life cycle: pre-implementation, during implementation, and post-implementation [14, 15].

To provide an overview of the criteria devised and the studies conducted regarding evaluation criteria, a systematic review of the systematic reviews published in this area (an umbrella review) was performed. The objective of this research is to identify indicators for the evaluation of health information systems.

MATERIAL AND METHODS

Search strategy

This systematic review was reported following the PRISMA protocol [16]. The PubMed, ScienceDirect, Web of Science, IEEE, and Science databases were searched using the keywords “Evaluation” AND “Health information systems” AND “systematic review”, together with their MeSH headings and synonyms. The last search was performed on April 26, 2020. Table 1 depicts the search strategy and the number of articles retrieved per database.

Study selection and eligibility criteria

The search yielded a total of 994 records, whose titles and abstracts were reviewed against the eligibility criteria by two authors independently. In case of conflicting opinions, the article was referred to a specialist in the field. The Covidence tool was used for screening the articles [17]. Covidence is software that facilitates screening and data extraction while conducting systematic reviews.

The inclusion criteria were: (1) published between 2015 and 2020; (2) being a systematic review; (3) published in English; and (4) presenting criteria for the evaluation of health information systems. Irrelevant papers were excluded.

Table 1. Database search strategies
Database Search query Number of articles
PubMed ((“Assessment”[Title/Abstract] OR “Evaluation”[Title/Abstract] OR “Evaluation Methodologies”[Title/Abstract] OR “Evaluation Research”[Title/Abstract] OR “Qualitative Evaluation”[Title/Abstract] OR “Evaluation Indexes”[Title/Abstract] OR “Quantitative Evaluation”[Title/Abstract] OR “Measurement”[Title/Abstract] OR “Analysis”[Title/Abstract] OR “Appraisement”[Title/Abstract] OR “Validation”[Title/Abstract] OR “Use Effectiveness”[Title/Abstract] OR “Pre-Post Tests”[Title/Abstract]) AND (“Health information system”[Title/Abstract] OR “Health Information Systems”[Title/Abstract] OR “Information System”[Title/Abstract] OR “Information Systems”[Title/Abstract] OR “Hospital Information Systems”[Title/Abstract] OR “Hospital Information System”[Title/Abstract] OR “Health Information”[Title/Abstract])) AND ("Systematic review"[Title/Abstract]) 378
ScienceDirect Title, abstract, keywords: (“Health Information System” OR “Information System” OR “Hospital Information System” OR “Health Information”) AND (“Evaluation” OR “Assessment”) AND (“systematic review”) 52
Web of Science TOPIC: ((“Assessment” OR “Evaluation” OR “Evaluation Methodologies” OR “Evaluation Research” OR “Qualitative Evaluation” OR “Evaluation Indexes” OR “Quantitative Evaluation” OR “Measurement” OR “Analysis” OR “Appraisement” OR “Validation” OR “Use Effectiveness” OR “Pre-Post Tests”)) AND TOPIC: ((“Health information system” OR “Health Information Systems” OR “Information System” OR “Information Systems” OR “Hospital Information Systems” OR “Hospital Information System” OR “Health Information”)) AND TOPIC: (("Systematic review")) 319
IEEE ((("Abstract":“Health Information System” OR “Information System” OR “Hospital Information System” OR “Health Information”) AND "Abstract":“Evaluation” OR “Assessment”) AND "Abstract":“systematic review”) 156
Science "(“Health Information System” OR “Information System” OR “Hospital Information System” OR “Health Information”) AND (“Evaluation” OR “Assessment”) AND (“systematic review”)" 89

Data items, extraction, and synthesis

Two reviewers independently extracted the items from the articles. The data items included: journal quality, number of papers included, technology focus, main outcome, and evaluation criteria for the assessment of health information systems. Subsequently, the evaluation criteria extracted from the included studies were categorized.
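As a purely illustrative sketch (not tooling used in the review), the extracted data items can be thought of as one record per included study; the field names below mirror the items listed above, and the example values are taken from the first row of Table 2.

# Illustrative record structure for the extracted data items; example values come from Table 2, row [4].
from dataclasses import dataclass, field
from typing import List

@dataclass
class ExtractedStudy:
    reference: str                          # citation number of the included review
    journal_quality: str                    # journal quartile (Q1-Q4)
    papers_included: int                    # number of primary studies in the review
    technology_focus: str                   # e.g. HIS, EHR, PHR, e-health
    main_outcome: str                       # short summary of the review's conclusion
    evaluation_criteria: List[str] = field(default_factory=list)

study_4 = ExtractedStudy(
    reference="[4]",
    journal_quality="Q1",
    papers_included=53,
    technology_focus="HIS",
    main_outcome="Combined quantitative and qualitative evaluation is recommended",
    evaluation_criteria=["Quality of care", "Financial", "Satisfaction"],
)

# The criteria collected across all records were then pooled and grouped
# into the 13 categories reported in Table 5.
print(study_4.technology_focus, len(study_4.evaluation_criteria))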

Risk of bias assessment

Two tools were used to assess the risk of bias in the included studies. The first tool was Risk of Bias in Systematic Reviews (ROBIS), which uses three phases to assess the relevance of and risk of bias in systematic reviews [18]. The second tool was Assessment of Multiple Systematic Reviews (AMSTAR), which assesses risk using 16 questions. AMSTAR is not designed to produce a final estimate of the risk in studies; hence, a final decision could not be made using this tool [19].

RESULTS

The database search retrieved a total of 994 papers. After screening for duplicates, 228 articles were excluded, which left 766 studies. The remaining papers were screened against the inclusion criteria, and 758 articles were excluded. Finally, 8 studies were included in the review (Fig 1).


Fig 1. Flow diagram

Table 2 presents some of the characteristics of the studies that were included in our investigation.

Risk of bias assessment using AMSTAR

Two reviewers independently evaluated the studies using the AMSTAR 2.0 criteria. This tool is not designed to reach a final decision regarding the overall risk of bias in a study; hence, no such decision is presented for this method of evaluation. Question 1 of the criteria was not applicable given the design of the included studies (all systematic reviews), so it was answered as “No”. The results of the evaluation using this tool are presented in Table 3.

Risk of bias assessment using ROBIS

Two reviewers independently assessed the articles using the ROBIS tool. This tool evaluates systematic reviews in three phases: phase 1 examines the relevance of the article to the study question, phase 2 evaluates the study under four domains, and phase 3 reaches a decision concerning the risk of bias in the study. Phase 1 was omitted from our study because of the prior screening process (Table 4).

Evaluation criteria

The evaluation criteria were first extracted from each article by two reviewers and then categorized into 13 major categories: quality of care, financial, security, data quality, workflow, satisfaction, management, motivation, support, technical and architectural, guidelines, intelligent factors, and demographics (Table 5).

Table 2. Characteristics of the included studies
Ref. Journal quality Papers included Focused technology Outcome
[4] Q1 53 HIS It is recommended to use a combination of quantitative and qualitative methods for evaluation; the most used method was quantitative. Health information systems that are evaluated can lead to satisfaction and improved quality of care.
[13] Q1 20 HIS There is a need to include contextual factors, to have a better method for stakeholder identification, and to have a framework that clearly answers the five main questions of an evaluation.
[20] Q1 65 E-health There is a lack of reliable measuring tools for even the most used constructs; thus, reliable measuring tools for readiness assessment factors are needed.
[21] Q1 120 Electronic health records (EHR) There is a lack of quality studies regarding valid and reproducible usability evaluations conducted at various EHR development stages and an effort is needed to address this knowledge gap.
[22] Q2 25 Personal health records (PHR) Despite the growing number of PHRs, there is an inability to support functional and technological characteristics and design flaws that impede their maintenance and usability.
[23] Q2 14 Hospital based cancer registries (HBCR) The design and performance of the HBCRs are very heterogeneous in different countries. There is no international coordination for the design and implementation of the standards for HBCRs. Investment on HBCRs would improve quality of care and thereby would decrease the mortality and burden of cancer.
[24] Q3 35 E-health Usability is often misunderstood, misapplied, and partially assessed, and many researchers have used usability and utility as interchangeable terms.
[25] Q2 15 Clinical information systems (CIS) There was a lack of consideration of the entire clinical workflow in the selected articles. Also, in many cases, guidelines were developed through the synthesis of existing knowledge rather than through user testing or heuristic evaluations.

Table 3. Risk of bias assessment using AMSTAR
Study Question
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16
[4] Evaluator 1 N N Y P-Y Y Y N N N N N-MA N-MA N Y N-MA Y
Evaluator 2 N N N N Y N N N N N N-MA N-MA N Y N-MA Y
[13] Evaluator 1 N N Y P-Y N N N N N N N-MA N-MA N Y N-MA Y
Evaluator 2 N N N N N N N N N N N-MA N-MA N Y N-MA Y
[20] Evaluator 1 N N N N N N N N N N N-MA N-MA N Y N-MA Y
Evaluator 2 N N N N N N N N N N N-MA N-MA N Y N-MA Y
[21] Evaluator 1 N N N P-Y Y N N N N N N-MA N-MA N Y N-MA Y
Evaluator 2 N N N N Y N N P-Y N N N-MA N-MA N Y N-MA Y
[22] Evaluator 1 N N Y N N N N N N N N-MA N-MA N Y N-MA Y
Evaluator 2 N N N N N N N N N N N-MA N-MA N Y N-MA Y
[23] Evaluator 1 N N N N Y N N N N N N-MA N-MA N Y N-MA N
Evaluator 2 N N N N Y Y N N N N N-MA N-MA N Y N-MA N
[24] Evaluator 1 N N N N Y Y N N N N N-MA N-MA N Y N-MA Y
Evaluator 2 N N N N Y Y N N N N N-MA N-MA N Y N-MA Y
[25] Evaluator 1 N N N N Y N N P-Y N N N-MA N-MA N Y N-MA Y
Evaluator 2 N N N N Y N N P-Y N N N-MA N-MA N Y N-MA Y

Abbreviations: N = No; Y = Yes; P-Y = Partial yes; N-MA = No meta-analysis conducted


Table 4. Risk of bias assessment using ROBIS

Table 5. Evaluation criteria
Categories Subcategories/Description References
Quality of care Decreasing administrative and medication errors [4, 20, 24]
Improving the quality & efficiency of services and medical care [4, 13, 20]
Reducing care risks [4, 22]
Increasing correct diagnoses [4, 20]
Reducing delays in care at the time of shift change [4]
Increasing bed occupancy rate [4]
Appropriateness of patient care [13]
Conducts a verification process to ensure that all medications comply with recommended dosing based on current evidence-based literature [25]
Financial Cost: Reducing working and human resources costs [4]
Increasing the costs imposed to hospitals [4]
Raising or no change in the costs imposed to patients or hospitals [4]
Patient costs [13]
Profit: Increasing hospital revenue [4]
Security increasing security of information of patients & documents (data security) [4, 20, 22]
Improving accreditation and audit [4, 22]
Patient privacy [13, 20]
Authentication [22]
Authorization [22]
Delegation [22]
The security of ICT national networks for commercial transactions [20]
Data quality Increasing accuracy of information [4, 13, 20]
Improving collection of discharge data [4]
Improving documentation [4, 20]
Data Validity [13, 23]
Data Completeness [13, 20, 22, 23, 24]
Information quality [13, 24]
Usefulness [13]
Workflow Reducing task time & increasing working speed [4, 13]
Reducing staff movement between wards [4]
Improving communication between wards & staffs [4, 20]
Reducing paper work & use of paper [4]
Facilitating transmission of lab test orders [4]
Preventing repeated tests and activities (Reducing redundancy) [4, 24]
Improving exchange of information (Communication) [4, 13, 20]
Facilitating admission process [4]
Prolonging task time [4]
Increasing work load [4]
Changes in tasks, roles and responsibilities [13]
Satisfaction Employee: Increasing employee (user) satisfaction [4, 13, 20, 22, 24]
Patients: Reducing length of stay [4]
Patient satisfaction [13, 20, 22]
Management Administration: Improving decision-making efficiency [4]
Improving reviewing patients records and planning their care [4, 20]
Improving management performance [4]
Improving disease-based demographic analysis & comparison between private and public sectors [4]
serious supervision of the registry processes [23]
Business Plan [20]
Maintenance: Consistency checks by software performing surveys [23]
Communication: Appointment scheduling [20, 22]
Appointment reminders [22]
Communication messaging service to healthcare professionals [20, 22]
Motivation Increasing tendency to use system [4, 20, 21]
Reducing workload [4]
Improving risk-taking [4]
Participation of clinicians and feedback [25]
Overall reaction to the software [24]
barriers or facilitators to adoption [13]
anxiety during use [13]
Corporate IT/S Philosophy [20]
Trust with e-Health technology [20]
Increasing awareness [20]
Encouragement (Positive reward) [20]
Physicians’ concern about high investment and low reimbursement [20]
Resistance to change [20]
Ability/willingness of senior administration to consider benefits outside standard business case/cost [20]
Willingness to consider long timelines for implementation [20]
Movement from short-term funding; short-term accountability deadlines [20]
Cost-benefit analysis [20]
Potential negative impacts [20]
Organizational/Institutional readiness [20]
Support Training [13, 20, 23, 24]
creativity, collaboration & team working [4, 20]
organizational support [13]
IT support [13, 20]
hardware Infrastructure [13, 20]
Feedback [4, 24, 25]
Executive Champions for IT/S Projects [20]
IT/S Budget and Finance Patterns [20]
Integration of Organizational and IT/S Strategies [20]
IT/S Strategic Planning Patterns [20]
Financial support [20]
Access to the software and hardware in the country [20]
Access to the employees who are familiar with the IT concepts and skills [20]
Awareness and support for ICT among politicians [20]
Awareness and support for ICT among institutional policymakers [20]
Hospital’s rate of investment in ICT [20]
Technical and architectural Design of systems: Clearly legible font [25]
Content should be limited to 1–2 lines, with a justification separated by white space [25]
Omit items for which the information is not available to the user [25]
Cluster related information on the same screen [25]
Avoid too much information on the screen at one time [25]
consideration of clinical workflow from the user’s point of view [21, 25]
Screen recognition [24, 25]
Customizable Data view & entry [25]
choose the least, but necessary information in Data sets [21, 23]
Interface quality [24]
time to task completion [21]
mouse clicks [21]
free and open-source software [22]
Web based [22]
based on edge computer technologies and sophisticated architectural models [22]
implementation should be based on state-of-the-art frame-works [22]
need assessment [20]
cognitive workload [21]
Sharing of locally relevant content between healthcare institutions [20]
Treatment recording [22]
Diagnosis recording [22]
Self-health monitoring services [22]
Problem (Symptom) recording [22]
complexity of reporting systems [13]
System quality: Usability [13, 22, 24, 25]
Learnability [13, 20, 24]
Efficiency [13, 22, 24]
Memorability [13, 24]
Flexibility [13, 22, 24]
Interactivity [24]
Aesthetics [24]
task completion accuracy [21]
Terminology and system information [24]
in compliance with high quality standards [22]
compatibility [22]
Reliability [13, 22]
Maintainability [22]
expendability [22]
interoperability [20, 22]
perform speed [13, 20]
Customizability [13]
Ease of use: Increasing data accessibility [4, 13, 20, 22, 24]
Ease of use [13, 21, 24]
Lowering or no effect on accessibility of information [4]
error prevention [13]
Speed and quality of ICT/Internet at the institution [20]
Guidelines Guidelines for Data identification & selection [25]
Guidelines for Document entry [23, 25]
Guidelines for prescription entry [25]
Guidelines to reduce errors through appropriate notifications or warnings [25]
Guidelines for management of systems [25]
Existence of national information policy in the country [20]
ICT related regulations [20]
Policies regarding licensure, liability and reimbursement [20]
Intelligent factors Alert systems: Alert priority [25]
reduction of alert fatigue [25]
Additional educational resources [22]
Intelligent data presentation [22]
Intelligent data export [22]
System rule-based recommendations that enhance decision support [22]
Interaction with other EHR systems and health applications [22]
Demographics Gender [20]
Age [20]
Level of education [20]
Designation [20]
Working experience [20]

DISCUSSION

The studies were evaluated using both the ROBIS and AMSTAR tools. AMSTAR is not designed to judge bias overall, so it was not possible to assign a level of bias to the studies with it. Using the ROBIS tool, all the studies were judged to have a high risk of bias. The ROBIS tool was found to be more effective for evaluating systematic reviews, because its criteria are more comprehensive and better suited to systematic reviews than those of AMSTAR.

The criteria from the articles were categorized into 13 categories including: quality of care, financial, security, data quality, workflow, satisfaction, management, motivation, support, technical and architectural, guidelines, intelligent factors, and demographics.

Most of the studies (5 out of 8) indicated a need for a standardized and more reliable tool for the assessment of health information systems. Two studies highlighted that surveys and questionnaires were the most commonly used method for the evaluation of systems [4, 25]. In two studies the most used evaluation method was summative evaluation [4, 25], while in another, contrary to these two, summative evaluation was the least used method [21]. The reason for this might lie in the fact that the latter focused on EHRs rather than on information systems in general.

Data completeness, data accessibility, and employee satisfaction were the most frequently reported criteria for evaluating health information systems (each reported in 5 out of 8 studies). This might indicate that these three criteria have the most impact on the quality of information systems. The results of this study are concordant with previous studies on HIS evaluation criteria. As investigated by Mohammed et al., human and organizational factors, including employee satisfaction, have a significant role in determining the information quality of a HIS [26].

In a study by Lau et al., data completeness was the most reported criterion under the HIS quality category, which is consistent with the overall findings of the current study [27]. Nikabadi et al. designed a compound set of criteria for the evaluation of health information systems similar to the current study [28]. However, they categorized the criteria into only four categories and, contrary to this study, concluded that the human aspect had the most impact on the quality of health information systems. Chen et al. reported data completeness as the most evaluated criterion in data quality assessment of public health information systems, which is consistent with the results of the current study [29].

Sligo et al. stated that the evaluation of HISs has historically been inadequate, with simple and varied approaches that make it difficult to generalize the results [30]. This is in accordance with the results of this study, in which most of the papers indicated a need for a standard and reliable tool for the evaluation of health information systems.

Considering the complex nature of health information systems, it is very challenging to address all aspects of a HIS when designing an evaluation framework. Hence, the compound set of criteria presented in this study can help researchers in their future evaluations.

Conclusion

Health information systems have an undeniable impact on health organizations, patient safety, and quality of care. However, the positive effects might be neutralized or even turn into challenges when these systems are not designed or implemented properly. Through this study, a new compound set of criteria was devised for the evaluation of health information systems. All the articles included in our study had a high risk of bias, and a stronger body of evidence and further research are needed in this field. The majority of the studies indicated that there is currently a lack of standard and reliable tools for the evaluation of health information systems.

This study has a number of limitations. Only studies published between 2015 and 2020 and in English were included, because only recent evidence was sought; hence, earlier works on the issue were left out. Moreover, all the included studies indicated a high risk of bias.

AUTHOR’S CONTRIBUTION

LS prepared the study conception and design, reviewed the final manuscript, critically revised the paper, and approved the final version of manuscript.

AS wrote the manuscript, assisted in data acquisition and also contributed to analysis and interpretation of data and reviewed the final paper.

MB wrote the manuscript and assisted in data acquisition and also contributed to analysis and interpretation of data.

All authors read and approved the final manuscript.

CONFLICTS OF INTEREST

The authors declare no conflicts of interest regarding the publication of this study.

FINANCIAL DISCLOSURE

This study was conducted as part of a research project (research code: 96_03_31_35339) supported by Tehran University of Medical Sciences (TUMS).


ACKNOWLEDGMENTS

This study follows the Tehran University of Medical Sciences ethical code of practice (IR.TUMS.SPH.REC.1397.281).

References
1. Lippeveld, T. Sauerborn, R. Bodart, C. Design and implementation of health information systems. World Health Organization 2000
2. Haux, R. Health information systems: Past, present, future. Int J Med Inform. 2006 75(3-4):268–81.
3. Azubuike, MC. Ehiri, JE. Health information systems in developing countries: Benefits, problems, and prospects. J R Soc Promot Health. 1999 119(3):180–4.
4. Ahmadian, L. Salehi, NS. Khajouei, R. Evaluation methods used on health information systems (HISs) in Iran and the effects of HISs on Iranian healthcare: A systematic review. Int J Med Inform. 2015 84(6):444–53.
5. Shahmoradi, L. Habibi-Koolaee, M. Integration of health information systems to promote health. Iran J Public Health. 2016 45(8):1096–7.
6. Yusof, MM. Paul, RJ. Stergioulas, LK. Towards a framework for health information systems evaluation. International Conference on System Sciences. IEEE 2006
7. Ahmadi, M. Shahmoradi, L. Barabadi, M. Hoseini, A. A survey of usability of hospital information systems from the perspective of nurses, department secretaries, and paraclinic users in selected hospitals: 2009. Journal of Health Administration. 2011 14(44):11–20.
8. Ammenwerth, E. Brender, J. Nykänen, P. Prokosch, H. Rigby, M. Talmon, J. Visions and strategies to improve evaluation of health information systems: Reflections and lessons based on the HIS-EVAL workshop in Innsbruck. Int J Med Inform. 2004 73(6):479–91.
9. Ammenwerth, E. Shaw, N. Bad health informatics can kill–is evaluation the answer?. Methods Inf Med. 2005 44(1):1–3.
10. Shahmoradi, L. Ahmadi, M. Haghani, H. Determining the most important evaluation indicators of healthcare information systems (HCIS) in Iran. Health Inf Manag. 2007 36(1):13–22.
11. Ammenwerth, E. Gräber, S. Herrmann, G. Bürkle, T. König, J. Evaluation of health information systems—problems and challenges. Int J Med Inform. 2003 71(2-3):125–35.
12. Shahmoradi, L. Ahmadi, M. Haghani, H. Defining evaluation indicators of health information systems and a model presentation. Journal of Health Administration. 2007 10(28):15–24.
13. Andargoli, AE. Scheepers, H. Rajendran, D. Sohal, A. Health information systems evaluation frameworks: A systematic review. Int J Med Inform. 2017 97:195–209.
14. Odhiambo-Otieno, GW. Evaluation criteria for district health management information systems: Lessons from the ministry of health, Kenya. Int J Med Inform. 2005 74(1):31–8.
15. Palvia, P. Jacks, T. Brown, W. Critical issues in EHR implementation: Provider and vendor perspectives. Communications of the Association for Information Systems. 2015 36(1):36.
16. Moher, D. Liberati, A. Tetzlaff, J. Altman, DG. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Ann Intern Med. 2009 151(4):264–9.
17. Babineau, J. Product review: Covidence (systematic review software). Journal of the Canadian Health Libraries Association. 2014 35(2):68–71.
18. Whiting, P. Savović, J. Higgins, JP. Caldwell, DM. Reeves, BC. Shea, B. ROBIS: A new tool to assess risk of bias in systematic reviews was developed. J Clin Epidemiol. 2016 69:225–34.
19. Shea, BJ. Grimshaw, JM. Wells, GA. Boers, M. Andersson, N. Hamel, C. Development of AMSTAR: A measurement tool to assess the methodological quality of systematic reviews. BMC Med Res Methodol. 2007 7:10.
20. Yusif, S. Hafeez-Baig, A. Soar, J. e-Health readiness assessment factors and measuring tools: A systematic review. Int J Med Inform. 2017 107:56–64.
21. Ellsworth, MA. Dziadzko, M. O'Horo, JC. Farrell, AM. Zhang, J. Herasevich, V. An appraisal of published usability evaluations of electronic health records via systematic review. J Am Med Inform Assoc. 2017 24(1):218–26.
22. Genitsaridi, I. Kondylakis, H. Koumakis, L. Marias, K. Tsiknakis, M. Evaluation of personal health record systems through the lenses of EC research projects. Comput Biol Med. 2015 59:175–85.
23. Mohammadzadeh, Z. Ghazisaeedi, M. Nahvijou, A. Kalhori, S. Davoodi, S. Zendehdel, K. Systematic review of hospital based cancer registries (HBCRs): Necessary tool to improve quality of care in cancer patients. Asian Pac J Cancer Prev. 2017 18(8):2027–33.
24. Sousa, VE. Lopez, KD. Towards usable e-health: A systematic review of usability questionnaires. Appl Clin Inform. 2017 8(2):470–90.
25. Lee, Y. Jung, MY. Shin, GW. Bahn, S. Park, T. Cho, I. Safety and usability guidelines of clinical information systems integrating clinical workflow: A systematic review. Healthc Inform Res. 2018 24(3):157–69.
26. Mohammed, SA. Yusof, MM. Towards an evaluation framework for information quality management (IQM) practices for health information systems: Evaluation criteria for effective IQM practices. J Eval Clin Pract. 2013 19(2):379–87.
27. Lau, F. Kuziemsky, C. Price, M. Gardner, J. A review on systematic reviews of health information system studies. J Am Med Inform Assoc. 2010 17(6):637–45.
28. Shafiei, NM. Naghipour, N. A model for assessing hospital information systems. Journal of Health Administration. 2015 18(60):50–66.
29. Chen, H. Hailey, D. Wang, N. Yu, P. A review of data quality assessment methods for public health information systems. Int J Environ Res Public Health. 2014 11(5):5170–207.
30. Sligo, J. Gauld, R. Roberts, V. Villa, L. A literature review for large-scale health information system project planning, implementation and evaluation. Int J Med Inform. 2017 97:86–97.
