Sharma, Prinja, and Aggarwal: Measurement of health system performance at district level: A study protocol

Measurement of health system performance at district level: A study protocol

Abstract

Background

Limited efforts have been observed in low- and middle-income countries to undertake health system performance assessment at the district level. The absence of a comprehensive data collection tool and the lack of a standardised single summary measure defining overall performance are among the main problems. The present study has been undertaken to develop a summary composite health system performance index at the district level.

Methods

A broad range of indicators covering all six domains of the building block framework was finalized by an expert panel. The domains were classified into twenty sub-domains, with 70 input and process indicators to measure performance. Seven sub-domains for assessing health system outputs and outcomes were identified, with a total of 28 indicators. Districts in Haryana state in north India were selected for the study. Primary and secondary data will be collected from 378 health facilities and from district and state health directorate headquarters. Indicators will be normalized and aggregated to generate a composite performance index at the district level. Domain-specific scores will present the quality of individual building block domains in the public health system. The robustness of the results will be checked using sensitivity analysis.

Expected impact for public health:

The study presents a methodology for comprehensive assessment of all health system domains on the basis of input, process, output and outcome indicators, which has never been reported from India. Generation of this index will help identify policy and implementation areas of concern and point towards potential solutions. The results may also help in understanding the relationships between individual building blocks and their sub-components.

Significance for public health

Measuring the performance of a health system is important to understand progress and challenges, and to create systems that are efficient, equitable and patient-focused. However, very few assessments of this nature have been observed in low- and middle-income countries, especially at the district level, mainly because of methodological challenges. This study presents a methodology for comprehensive assessment of all domains of the health system and generation of a composite Health System Performance Index on the basis of input, process, output and outcome indicators. It will help identify policy and implementation problems worthy of attention and point towards potential solutions to the health system bottlenecks resulting in poor performance. The results may also help in better understanding the relationships between individual building blocks, their sub-components and the overall performance of the health system.




Competing interest statement

Conflict of interest: The authors declare no potential conflict of interest.

Background

Health systems are central to the healthy development of individuals, families and societies and, along with a number of other factors, influence the lives of vast numbers of people.1 They are responsible for improving the health of populations at an optimal quality and at a low per capita cost.2 The World Health Organisation (WHO), in its Framework for Action, listed six building blocks of the health system based on its expected functions.3 Not only do these blocks allow defining the desirable attributes of the system, they also provide a mechanism to identify gaps in structure and performance.

Measuring the performance of a health system is an essential requirement to understand progress and challenges and to propose the way forward. This helps in creating systems that not only generate results in terms of enhanced coverage of services, but are also efficient, equitable, patient-focused, accessible and sustainable.4 Performance measurement also helps in understanding the relationships between the performance of health system building blocks and the outcome indicators, which in turn helps in aligning performance with the specific objectives pursued by organizations and/or systems.5 Not only should this measurement include assessment of all three of structure, process and outcomes for a comprehensive measurement of quality in healthcare,6 it should also encompass assessment of efficiency and equity in the healthcare delivery system.7

Commendable efforts have been made in high-income countries over the last few decades to develop and utilize performance assessment frameworks and methodologies. These include specific frameworks developed by countries taking the peculiarities of their health systems into consideration, as well as generic frameworks developed by the WHO and OECD.8-10 Frameworks exploring specific building blocks of the healthcare system, such as governance, and related performance evaluation systems have also been developed and compared for use at the regional level.11,12 However, limited efforts have been seen in this direction in low- and middle-income countries (LMIC). The Ministry of Health, Republic of Uganda has produced an annual health system performance report since 2001 containing information on health sector performance (inputs and processes), health service coverage levels and local governance performance using league table analysis.13 The Ministry of Health, Ghana published a Holistic Assessment of the Health Programme in 2014, which reviewed the performance of the country's health sector on the basis of 54 indicators under 6 stated objectives and evaluated annual trends since 2010.14

Usage of health system performance frameworks for conducting assessments in LMIC has thus primarily been observed at the national level, or in a few cases at sub-national or regional levels.15 However, it is equally important to understand the performance of the health system at state or district levels, as these are the seats of implementation of policies and programmes in most countries with a federal structure.16 District-level assessment becomes even more relevant with the greater impetus on decentralization of administrative and financial powers for local-level planning and implementation in most LMICs. For example, the National Rural Health Mission (NRHM) in India, instituted in 2005 to improve the availability of and access to quality health care, laid strong emphasis on decentralization and district management of health programmes for improved governance.17

However, measuring performance at the district level presents various methodological challenges. The high degree of geographical variation in various measures of performance demonstrated in earlier studies from the state indicates that general quality/volume standards are not equally achievable across all districts.18 This makes it important to define and set benchmarks for effective comparison of the indicators in selected areas against the state average, or against the rates in other areas.19,20 Another challenge is the absence of any single data collection tool capable of providing comprehensive information on all building blocks of the health system.21 Most existing tools deal with health facility assessment, gathering information on resource management and to some extent on service delivery at health facilities, while skipping other building blocks such as governance, the health information system and financial management at the local level.

A deficiency also exists in terms of a standardised summary measure capable of condensing all the parameters defining health system performance into a single value that presents the crux of the story to policy makers in an easily comprehensible manner.22 Though the disaggregated findings generated by individual assessment of all building blocks and their sub-components are essential to highlight the relevant areas of concern and undertake corrective measures, a single summary index may be effective in presenting an overall comparison among different regions. Such a measure may summarise complex, multi-dimensional realities by reducing the visible size of a set of indicators without dropping the underlying information base.23,24 In this context, the present study was planned as part of doctoral research with the objective of developing a summary composite index, the Health System Performance Index (HSPI), for assessment of the overall performance of the health system at the district level.

Materials and Methods

Settings

Districts in Haryana state in north India were selected for the study because of the availability of more decentralized data sources and the presence of a favourable administrative environment encouraging research for evidence-based policy making in the state. Haryana has a population of 25 million and is divided into 21 administrative districts (geographical areas demarcated by the Government for the provision of administrative, judicial and revenue functions). Each district in the state has an average of 320 villages and a population of 1.2 million. The public health system in each district is a vertical three-tier machinery, with a Health Sub-Centre (SC) for every 5,000 population at the grass-root level. Five to six SCs are monitored by a Primary Health Centre (PHC) covering a population of 30,000. Four to five PHCs are monitored by a Community Health Centre (CHC), with the overall district health administration run by a Civil Surgeon (CS) who presides over all public health facilities.25 The state has 40 sub-divisional and district hospitals (DH), 109 CHCs, 454 PHCs and 2542 SCs.26 The SCs and PHCs primarily provide primary care services, while CHCs and DHs provide both primary and secondary care services. The CS office is responsible for local operational planning, implementation of the policy guidelines for the various health programs framed at the national and state levels, and sustaining the quality of service delivery at public health facilities in the district.

Selection of framework, indicators and tool development

The idea for this study was conceived and developed in late 2013. For the purpose of this research, the building block framework proposed by the World Health Organisation in 2007 was used as the guiding framework. This framework proposes practical ways to organize health systems into six operational building blocks. This approach was found to be suitable for locating, describing and classifying health system constraints, as well as for providing scores to individual building blocks with ease. Since it describes health system functions in a simple manner, it helped in framing the research questions and data collection tools. It is currently the most commonly used framework to describe health systems in the international literature.27

A literature review was performed from January to June 2014 to identify existing study tools and their indicators. Search terms were developed and both methodological and topical literature was searched. This involved a review of research papers, published reports, policy documents and methodology documents in order to collect information on tools measuring the health system in totality or its separate building blocks.

PubMed and Google Scholar databases were used to search for published literature on health system performance measurement. The search combined various terms for health system performance and measurement/evaluation, using both free-text words and controlled vocabulary terms (Supplementary File 1). No restrictions on publication date were applied; however, the language was restricted to English. Reference lists of the included articles and recent related reviews were hand-searched to identify additional relevant articles and documents. The PubMed search yielded 3109 articles, out of which 82 peer-reviewed articles were selected for full-text reading after title and abstract screening. Additionally, 99 policy documents, books/book chapters and published reports were reviewed. We also searched the websites of governmental entities and international agencies such as the World Health Organization (WHO), the World Bank and USAID.

Each document was reviewed and summarized in a data collection sheet that included the title of the document, its type, and the component (and elements) of the framework it addressed.

A broad range of indicators and tools covering all the building blocks proposed by the WHO framework was enlisted. A set of 8 selection criteria was then used as guiding principles to assess the value and practicalities of potential indicators: relevance, accuracy, usefulness, importance, feasibility, credibility, validity and distinctiveness. The indicators were scored by the authors on a scale of 1-5 on these criteria and the cumulative results were compared to generate a final list of 85 input and process indicators scoring more than 50% of the median score. Inputs on the importance and usefulness of indicators for programme strategies and health system priorities, and on the feasibility of data availability and collection in local settings, were solicited from 9 programme managers at the state level and 5 at the district and sub-district levels in the department of health in Haryana. While the state-level programme managers worked in the domains of maternal health, child health, HMIS monitoring and evaluation, referral transport, accounts and finance, human resource management, health system trainings and the civil registration system in the state's public health sector, the district officials ranged from district-level administrators to medical officers in charge of public health facilities. The comprehensive list developed included indicators on all six major health system functions: governance, financing, human and material resource management, health management information system (HMIS) and service delivery.
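The screening rule described above (scores of 1-5 on eight criteria, with cumulative totals compared against 50% of the median cumulative score) can be sketched as follows. Indicator names and ratings are hypothetical illustrations, not the study's actual data.

```python
# Sketch of the indicator screening step: each candidate indicator is
# scored 1-5 on eight criteria, and indicators whose cumulative score
# exceeds 50% of the median cumulative score are retained.
from statistics import median

CRITERIA = ["relevance", "accuracy", "usefulness", "importance",
            "feasibility", "credibility", "validity", "distinctiveness"]

def screen_indicators(scores):
    """scores: dict mapping indicator name -> list of eight 1-5 ratings.
    Returns the indicators passing the 50%-of-median cutoff."""
    totals = {name: sum(ratings) for name, ratings in scores.items()}
    cutoff = 0.5 * median(totals.values())
    return {name: total for name, total in totals.items() if total > cutoff}

# Hypothetical candidates:
candidates = {
    "per_capita_public_health_expenditure": [5, 4, 5, 5, 4, 5, 4, 4],
    "doctor_presence_rate": [4, 3, 4, 5, 3, 4, 4, 3],
    "weakly_scoring_indicator": [1, 1, 2, 1, 2, 1, 1, 1],
}
print(screen_indicators(candidates))  # the weak indicator is dropped
```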

This list of indicators and tools was then discussed with an expert panel in October 2014. One of the authors developed a list of 13 health policy and systems research experts with a wide variety of publications and professional experience in the fields of health economics, health management, health promotion, human resource management, national health programmes, communicable and non-communicable disease epidemiology and Health Management Information Systems in the context of developing country settings. Publication of at least 2 relevant articles in the last five years and current activity in the field were used as inclusion criteria. Each of the 13 experts was invited to participate in a focus group discussion. Of the total invitations sent, 7 individuals (primarily from local settings) agreed to participate; 6 declined or did not reply.

The focus groups were scheduled as two 2-hour sessions at the authors' affiliated institute. The first author acted as the moderator for the exercise. Participants were informed of the purpose of the group and their consent was obtained. Participants were asked to rate each indicator on a Likert scale of 1-3 (3 being highest priority) considering 4 criteria: face validity, content validity, importance and feasibility of data collection. They could also assign a score of zero if they believed that an indicator should not remain on the list. The members were allowed to engage in extensive discussions to resolve any ambiguity related to the selection and content of the indicators. A mean score for each indicator was calculated from all the ratings reported for that item, and items were subsequently listed in descending order of priority. The panel also assessed the questions in the tools to confirm that they fulfilled the requirements of the selected indicators. The process was brought to completion with the selection of the final indicators by consensus of all members of the expert panel. Details of the indicators and data collection tools selected for the study are provided in the next section.
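The panel's prioritisation step (mean of the 0-3 ratings per indicator, listed in descending order) can be sketched as below; the indicator names and ratings are hypothetical.

```python
# Sketch of the expert-panel prioritisation: each panellist rates an
# indicator 0-3; the mean rating is computed and indicators are listed
# in descending order of priority.
def rank_indicators(panel_ratings):
    """panel_ratings: dict of indicator -> list of 0-3 ratings.
    Returns (indicator, mean score) pairs sorted by priority."""
    means = {ind: sum(r) / len(r) for ind, r in panel_ratings.items()}
    return sorted(means.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical ratings from seven panellists:
ratings = {
    "institutional_delivery_rate": [3, 3, 2, 3, 3, 2, 3],
    "referral_transport_timeliness": [2, 2, 3, 2, 1, 2, 2],
    "candidate_to_drop": [0, 1, 0, 0, 1, 0, 0],
}
for indicator, score in rank_indicators(ratings):
    print(f"{indicator}: {score:.2f}")
```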

Input and process indicators

The six health system building blocks were classified into twenty sub-domains. A total of 70 input and process indicators were selected to measure performance (Table 1).

For the health financing building block, two sub-domains were identified: resource pooling and purchase of services, and extent of public health expenditure. The first sub-domain offers very narrow dimensions from which to identify indicators, as most of the discretionary power on how to pool financial resources and where they should be utilized is reserved with the state authorities. An indicator on the absorptive capacity of the district, which is largely determined by the efforts and actions of the district authorities, was hence selected for this sub-domain. For the second sub-domain, an indicator on per capita public health expenditure by each district was selected. Details of all indicators, including their numerators and denominators, computation methods and sources of information, are appended as Supplementary File 2.

Three sub-domains were identified for the second building block, human resources for health: availability and distribution, capacity and productivity, and motivation and job satisfaction. A total of 4, 5 and 6 indicators respectively were chosen for their measurement. The indicators under the availability and distribution sub-domain dealt with the density of the core healthcare workforce, the position occupancy of this workforce and its equitable distribution (rural-urban, primary-secondary health facilities). The second sub-domain included an indicator on the average training status of the core workforce with respect to six selected trainings mandatory in the state for public health employees; the other four indicators referred to the average work productivity score of staff at SCs, PHCs, CHCs and DHs respectively in the districts. The third sub-domain under human resources employed six indicators assessing the motivation (3 indicators) and job satisfaction (3 indicators) of the clinical, public health and administrative cadres. The public health cadre covered responses from ANMs, MPHW (m), LHVs and HIs. Details of all these cadres can be found in Supplementary File 2.

Four sub-domains, one for each of the four levels of public health facilities, i.e. SC, PHC, CHC and DH, were selected for assessment of the facility readiness (material resources) of the health system. Each of these sub-domains included four indicators, one each indicating the average availability and functionality of equipment and instruments, furniture, drugs and vaccines, and support services at health facilities.

The HMIS domain was divided into five sub-domains (10 indicators) for assessment. The first sub-domain, HMIS resource availability, included two indicators to assess the availability of resources at the selected facilities and at district headquarters. Three indicators were decided for the second sub-domain, resource capacity: HMIS training levels, familiarity with data reporting tools and conceptual understanding. The third sub-domain under HMIS had one indicator each for the regularity and timeliness of data transmission from health facilities to state headquarters. While the fourth sub-domain, data quality, had indicators for MIS records completeness and accuracy, the fifth sub-domain, data usage, will be evaluated based on indicators for local data availability and feedback received by the facilities in the districts. Details of all these indicators can be seen in Supplementary File 2.

Two sub-domains, service availability and service quality, were shortlisted for assessment of the service delivery building block. Seven indicators were selected for the service availability sub-domain: the average population covered by each PHC in the district, the percentage of CHCs and higher facilities with radiography (X-ray) services, the average effective doctor presence rate at PHCs and higher facilities, the number of Revised National TB Control Programme (RNTCP) microscopy centres per 100,000 population, the proportion of First Referral Units (FRUs) with functional blood bank facilities, the proportion of PHCs with at least 1 female doctor, and the proportion of delivery points providing 24x7 maternity services in the district. The service quality sub-domain was considered to have five indicators: timeliness of referral transport in the district, timeliness of OPD consultation, quality of basic amenities, a score for respect for the dignity of beneficiaries and a score for confidentiality of information.

The governance building block was divided into four sub-domains on the basis of the expert panel discussions: participation and responsiveness, transparency and fairness, efficiency and effectiveness, and accountability. The first sub-domain will be measured using 3 indicators: community participation in the administration of public health facilities, regularity of interaction between district health authorities and facility in-charges, and approachability of district health authorities for subordinates. The second sub-domain will assess transparency and fairness in the health system using 4 indicators: information on rights provided to the community through a citizen charter, fairness in financial dealings, equality/non-discrimination among subordinates, and the upholding of ethics in decision making by district authorities. Four indicators were selected for assessing the third sub-domain (efficiency and effectiveness) of the governance building block: the presence of essential administrative skills among district authorities, display of a capacity for problem resolution at the district level, maintenance of continuous monitoring and supervision, and a score for the overall quality of health administration in the district. The final sub-domain of governance refers to the accountability of employees within the system and towards the community, to be measured using 3 indicators.

Output and outcome indicators

A set of four sub-domains for assessing outputs and three sub-domains for assessing outcomes of the health system at the district level was identified (Table 1).

The health system outputs will be measured under four sub-domains. A total of 5 indicators were shortlisted for assessment of the first output sub-domain, coverage of primary care services: institutional delivery rate, full ANC rate, full immunization rate, contraceptive prevalence rate and oral rehydration solution (ORS) usage rate for childhood diarrhoea cases. The indicators selected for assessment of the second sub-domain (utilization of curative care services) included bed occupancy rates for both rural and urban areas, OPD consultation rates, indoor admission rates for rural and urban health facilities, and haemoglobin estimation tests per patient conducted at public health facilities for anaemia detection in the district (for details, see Supplementary File 3).

The third sub-domain of health system outputs (equity in health financing) will be assessed using two indicators: a concentration index for nil maternity expenses and a concentration index for catastrophic hospitalization expenses. The final sub-domain, efficiency and equity in service delivery, has two dimensions: efficiency and equity. The efficiency dimension will be measured using 4 indicators: the high-risk pregnancy detection rate during ANC by outreach workers, the referral rate of sick neonates during PNC visits, the proportion of public sector deliveries during night hours, and the proportion of CHCs in the district performing at least a minimum number of deliveries as per the performance benchmarking standards defined by NHM, Haryana. The equity dimension will be assessed in three areas: social equity, gender equity and geographic equity.
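One common way to compute a concentration index such as those named above is the convenient-covariance formula, C = 2·cov(h, r)/mean(h), where h is the variable of interest and r is the household's fractional socioeconomic rank; the protocol does not specify its exact computation, so this formula and the sample data are assumptions for illustration. C near zero suggests equality; negative values indicate concentration among the poor.

```python
# Minimal concentration-index sketch (convenient-covariance formula).
def concentration_index(values_by_ses_rank):
    """values_by_ses_rank: variable of interest for households ordered
    from poorest to richest. Returns the concentration index."""
    n = len(values_by_ses_rank)
    mean_h = sum(values_by_ses_rank) / n
    ranks = [(i + 0.5) / n for i in range(n)]   # fractional SES ranks
    mean_r = sum(ranks) / n                     # equals 0.5
    cov = sum((h - mean_h) * (r - mean_r)
              for h, r in zip(values_by_ses_rank, ranks)) / n
    return 2 * cov / mean_h

# Hypothetical data: a catastrophic-expenditure flag concentrated among
# the poorest households yields a negative index.
flags = [1, 1, 1, 0, 0, 0, 0, 0]   # ordered poorest -> richest
print(round(concentration_index(flags), 3))
```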

The health system outcomes will be measured under three sub-domains: morbidity rates, mortality rates and financial protection. Two indicators, the acute illness rate and the hospitalization rate in the community, will be used for the assessment of morbidity. The infant mortality rate (IMR) and maternal mortality rate (MMR) for the districts will be used for the assessment of mortality. The financial risk protection sub-domain will be assessed using four indicators: maternity cases at public health facilities reporting nil delivery expenditure, hospitalization cases in the district not reporting catastrophic expenditure, hospitalization cases in the district not reporting impoverishment due to out-of-pocket (OOP) expenditure, and the population covered under health insurance in the district.

Study tools

A total of 13 study tools were drafted and are being employed for the collection of data on the selected indicators. The questions in the tools were aggregated as per their desired source of information and then re-aggregated on the basis of the level of healthcare facility at which they will be answered. Tools 1 and 2 will collect information from the state-level public health headquarters, while Tool 3 will do so from the district-level public health system headquarters. Tools 4 to 10 will be used at facilities to collect primary data from health system employees and facility surveys. Client satisfaction with the quality of service delivery at public health facilities will be assessed using Tool 11 in the districts. Details of the study tools, the type of data to be collected and other pertinent information are provided in Table 2. The study tools are attached as Supplementary File 4.

Study tools pre-test

The pre-test of the study tools was carried out in nine purposively selected primary and secondary public health facilities in Panchkula district of Haryana state from February to April 2015. The tools were pre-tested using in-depth interviews with SC, PHC and CHC facility in-charges and data handlers. Feedback on the health facility survey instruments and self-administered questionnaires, regarding the ease of understanding the questions and the feasibility of data availability, was also obtained from facility staff members. A total of 9 facility in-charges and 40 clinical and support staff were interviewed, while 23 patient satisfaction interviews were conducted to collect quantitative data using both questionnaires and interview schedules during the pre-test. Analysis of the pre-test results helped in the identification of contentious issues and questions which posed difficulty during data collection.

Data collection

Phase-wise data collection started in October 2015. It is planned to cover all 21 districts in Haryana state; in each district, data will be collected from 18 facilities, including 10 SCs, 5 PHCs, 2 CHCs and a DH. In total, data will be collected from 378 health facilities across Haryana over a period of 2 years. Additionally, information will be collected from the state and district health directorate headquarters and from the results of a community-based Concurrent Evaluation of NHM survey (CENHM) carried out in all districts of Haryana from 2012-15.28 Standard definitions will be used for defining these indicators (Supplementary Files 2 and 3).

Strict measures will be undertaken to ensure the quality of data collection in the field. Field investigators will be routinely supervised by supervisors, who will fill in a supervision tool to record the data collection and soft skills of the investigators. Field supervisors will undertake repeat data collection at 10% of the health facilities covered by the field investigators. A statistical comparison of the indicators compiled from the data collected by field investigators and by supervisors will be made to estimate the quality of data collection.
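The protocol does not name the statistic used for the investigator-supervisor comparison; a simple per-item agreement rate is one plausible way such a check could be summarised. The function and sample values below are hypothetical.

```python
# Sketch of a quality check on the 10% repeat survey: proportion of
# re-collected items that match the original data within a tolerance.
def agreement_rate(investigator, supervisor, tolerance=0):
    """Fraction of paired observations matching within a tolerance."""
    matches = sum(abs(a - b) <= tolerance
                  for a, b in zip(investigator, supervisor))
    return matches / len(investigator)

# Hypothetical counts (e.g. functional equipment items) recorded by the
# field investigator and re-collected by the supervisor:
inv = [12, 30, 4, 7, 19]
sup = [12, 29, 4, 7, 19]
print(agreement_rate(inv, sup))      # exact-match agreement
print(agreement_rate(inv, sup, 1))   # allowing a difference of 1
```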

Data entry and analysis

Data will be entered in data entry tools developed in Microsoft Excel. Range and consistency checks will be employed to minimize data entry errors. Five percent of the tools will be randomly selected and matched for cross-verification to identify errors in data entry; entry will be repeated for tools with more than a 3% error rate, and identified incorrect values will be corrected. Data will also be subjected to cleaning and normality checks (skewness and kurtosis) using criteria defined by the Composite Indicators Research Group.29 Non-normal indicators will either be suitably transformed or excluded from the analysis.
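The skewness/kurtosis screen mentioned above can be sketched with standard-library code as follows. The flagging thresholds used here (|skewness| > 2 together with excess kurtosis > 3.5) follow a commonly cited composite-indicator rule of thumb and should be treated as an assumption, not the study's stated criteria.

```python
# Sketch of a normality screen on indicator distributions.
import math

def moments(x):
    """Return (skewness, excess kurtosis) using population moments."""
    n = len(x)
    mean = sum(x) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in x) / n)
    skew = sum(((v - mean) / sd) ** 3 for v in x) / n
    kurt = sum(((v - mean) / sd) ** 4 for v in x) / n - 3
    return skew, kurt

def needs_treatment(x, skew_limit=2.0, kurt_limit=3.5):
    """Flag an indicator for transformation/exclusion (assumed rule)."""
    skew, kurt = moments(x)
    return abs(skew) > skew_limit and kurt > kurt_limit

symmetric = [1, 2, 3, 4, 5, 6, 7, 8, 9]
print(needs_treatment(symmetric))            # symmetric data: not flagged
print(needs_treatment([1] * 9 + [100]))      # extreme outlier: flagged
```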

Indicators will be normalized to make them suitable for aggregation, using the min-max approach.23 For benchmarking, a combination of absolute and relative goalposts will be used at the normalization stage. Absolute goalposts, depicting the highest and lowest achievable values, will be selected as benchmarks wherever possible considering the nature of the indicators; for the others, the benchmarks will be derived from the range of values attained by the districts in the dataset. The min-max technique will yield normalized scores for the indicators between 0 and 1, making it feasible to aggregate different types of indicators. The normalized scores will then be subjected to a preference-weighted approach, assuming each indicator, sub-domain and domain of the health system should be treated equally.30 The aggregation will be done using the geometric mean approach.31 The robustness of the results will be checked using sensitivity analysis employing different approaches to normalization and aggregation, which will also verify the effect of the different benchmarking approaches on the study results. The computed measure will represent the performance of the health system at the district level as desired, and not at specific health facility levels. Along with this summary score, domain-specific scores will be generated to present the quality and contribution of individual building block domains and sub-domains in the public health system of an area.
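The normalization and aggregation pipeline described above can be sketched as follows, with equal weights within and across domains as stated. All goalpost values, domain names and district data below are illustrative assumptions, not the study's actual benchmarks.

```python
# Sketch of the HSPI construction: min-max normalization against
# goalposts, then geometric-mean aggregation within and across domains.
import math

def min_max(value, low, high):
    """Normalize to [0, 1] against (absolute or relative) goalposts."""
    return (value - low) / (high - low)

def geometric_mean(scores):
    return math.exp(sum(math.log(s) for s in scores) / len(scores))

# Hypothetical district data: domain -> list of
# (observed value, lower goalpost, upper goalpost)
district = {
    "human_resources":  [(62, 0, 100), (48, 0, 100)],
    "service_delivery": [(0.81, 0.0, 1.0), (0.64, 0.0, 1.0)],
    "governance":       [(2.1, 1.0, 3.0)],
}

domain_scores = {d: geometric_mean([min_max(v, lo, hi) for v, lo, hi in inds])
                 for d, inds in district.items()}
hspi = geometric_mean(list(domain_scores.values()))
print({d: round(s, 3) for d, s in domain_scores.items()})
print(round(hspi, 3))
```

The geometric mean penalizes unbalanced performance across domains more than an arithmetic mean would, which is one reason it is often preferred for composite indices.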

All the quantitative data collected during the research will be analysed using Microsoft Excel and IBM SPSS (Statistical Package for the Social Sciences).32

Discussion

In this paper, we present a methodology for comprehensive assessment of all domains of the health system. The study will also generate a composite Health System Performance Index for comparison of the relative performance of different districts on the basis of input, process, output and outcome indicators, which has seldom been reported from India. An earlier review revealed that only 23% of articles published from India had some policy implications in the field of health policy and systems research, while the majority were directed towards clinical practice.33 Less than 2% of these provided recommendations for policy making.33 Another review, conducted in 2004, of 4876 health research and policy articles published from India showed health policy/systems to be a neglected field, accounting for only 1.9% of the total articles.34 Since the index from our study will be based on indices generated for the different building blocks, it will also be helpful in identifying core areas of the system needing urgent attention. Generation of such an index thus has significant policy implications. It can help policymakers and researchers identify unwanted variation in health system performance across districts and policy and implementation problems worthy of attention, and provide a direction towards potential solutions to the health system bottlenecks resulting in poor performance.

The results may also help public health system administrators to better understand the relationships between individual building blocks, their sub-components and the overall performance of the health system. This will help in understanding the relative contribution of all aspects and hence their individual importance. Researchers may utilize these tools and the results generated to explore associations between individual health system inputs and processes and health system outputs and outcomes. These associations need not be restricted to primary care coverage and utilization of services, but may also include morbidity and mortality rates in the community. This will help in attributing the wide variation in health outcomes among districts and states to differences in health system performance. Though there is an acknowledgement in the literature that health systems have a role to play in the overall health of the community,35 there is a relative lack of objective assessments of this contribution.36 The results of the present study may also pave the way towards filling this gap in the literature.

Strengths in the design of this study include a combined approach for inclusion of indicators, based both on the desirable attributes of a health system and on whether information for the indicators is readily and reliably available.35 The study will also address the lack of locally adapted tools capable of studying the dynamics of this complex social structure at the district level, which is the key seat of implementation of state health policies. The tools will be developed in local settings to understand the factors modulating performance of the system at the intermediary district level, rather than at a regional or national level, though information will be collected through both primary and secondary means, from health facilities and from district as well as state headquarters. The influence of quality of administration and governance at this level has not been documented earlier at this scale. Meaningful information on health system performance can be extremely useful for strengthening the foundation of health policies framed for the state and lower levels, especially if the mechanics of administration and its influence on implementation of policies can be understood. Additionally, the study attempts to provide an objective score for the contribution of health system performance to the overall health of the community, attempting to delineate the role of the system as opposed to that of the social determinants of health.

Published literature presents different approaches that can be employed for benchmarking while conducting evaluation studies. Benchmarks can be absolute or relative, depending upon the nature of the indicators involved and whether it is possible to identify maximum and minimum achievable values. Where absolute benchmarks cannot be established on the basis of international, national or regional standards, they may have to be identified using the mean or median of the range of regional values in the dataset.19 The present study proposes to utilize both absolute and relative benchmarks, after selecting indicators for each approach in consultation with the expert panel members.
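
The distinction between the two benchmarking approaches can be sketched as follows; the district values and the absolute standard are hypothetical, and the actual benchmarks will be fixed in consultation with the expert panel.

```python
# Illustrative sketch: scoring one indicator (e.g., % coverage) against an
# absolute benchmark versus a relative, median-derived benchmark.
# All values below are hypothetical.
from statistics import median

districts = {"A": 62.0, "B": 48.5, "C": 71.0, "D": 55.0}

def score_absolute(value, benchmark):
    """Score as a proportion of an externally set (e.g., national) standard, capped at 1."""
    return min(value / benchmark, 1.0)

def score_relative(value, values):
    """Score against the median of the observed district values."""
    return min(value / median(values), 1.0)

abs_scores = {d: score_absolute(v, benchmark=80.0) for d, v in districts.items()}
rel_scores = {d: score_relative(v, list(districts.values())) for d, v in districts.items()}
```

Under the relative approach, a district's score shifts whenever the distribution of district values changes, which is why the choice between absolute and relative benchmarks matters for inter-district comparison.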

Different approaches to normalization of indicators have also been proposed in the literature, the most common being standardization, the min-max approach and distance to a reference.23 These methods may produce different results for the composite indicator, so it is important to identify the most suitable procedure considering the properties of the measurement units in which the indicators are expressed. The standardization approach restricts the analysis to parameters obtained from the dataset itself, unlike the other two approaches. In order to ascertain the impact of the normalization approach on the results, the present study proposes to conduct a sensitivity analysis employing the different methods.
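
The three normalization approaches can be compared on the same data; the sketch below uses hypothetical district values and is only meant to show how the choice of method changes the resulting scores.

```python
# Illustrative comparison of three normalization approaches:
# z-score standardization, min-max scaling, and distance to a reference.
# District values are hypothetical.
from statistics import mean, pstdev

values = [62.0, 48.5, 71.0, 55.0]

def standardize(xs):
    """Z-scores: mean 0, unit (population) standard deviation."""
    mu, sigma = mean(xs), pstdev(xs)
    return [(x - mu) / sigma for x in xs]

def min_max(xs):
    """Rescale to the [0, 1] interval using the observed extremes."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

def distance_to_reference(xs, ref):
    """Ratio to a reference value, e.g., a benchmark or the best performer."""
    return [x / ref for x in xs]

z = standardize(values)
mm = min_max(values)
dref = distance_to_reference(values, ref=max(values))
```

Note that standardization depends only on parameters estimated from the dataset itself, whereas min-max and distance-to-reference can also take externally defined extremes or reference points, which is exactly the property the sensitivity analysis will probe.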

The design of this study has limitations as well. The study begins with and limits itself to the WHO definitions of Health Action and Primary Intent to define the boundaries of the health system.37 While we have attempted to define the scope of our work in terms of health system building blocks, measuring these through the lenses of equity, efficiency, timeliness and patient-centeredness, others have also emphasized assessment of quality of care as a criterion for measuring health system performance.6 Still others have suggested performance measurement of healthcare delivery systems in terms of value for patients, with value defined in terms of cost-effectiveness.38,39 There is also a general move within the country to set up health systems in accordance with evidence on the cost-effectiveness of various health interventions under national health programmes.40 Though this study attempts to incorporate parameters of patient satisfaction as important indicators of the quality of service delivery by the system, performing a cost-effectiveness analysis was beyond the scope of this work because of time constraints.

Previous literature has defined equity in health financing and service delivery as a broad concept, measured both as an output and an outcome of the health system. Under the present study, while we have considered use of indicators for financial protection as health system outcomes, the indicators assessing equity in service delivery (social, gender, geographic equity in service utilization) and equity in health financing (socio-economic inequity in nil maternity expenses, socio-economic inequity in catastrophic hospitalization expenses) have been used as health system outputs.

Like morbidity and mortality rates, financial protection of the community is a complex phenomenon influenced by a number of socio-demographic and economic factors, and hence is relatively distant from the health system. However, equity in service delivery and the associated equity in health financing are directly dependent upon the processes followed at the health facilities. Which wealth quintile of the community benefits more from cashless delivery, or suffers more catastrophic expenditure, when the necessary system-level interventions to benefit the poor are in place, reflects upon the behaviour and practices of the staff at the health facilities from which services are sought. While this behaviour is undoubtedly a result of deep-rooted social biases inherent in the community, of which health system personnel are also members, strict regulation and better governance of the system may still prevent biased behaviour towards certain sections of society. We have hence considered it more proximal to the health system than the overall paradigm of financial protection. Keeping this in mind, equity has been considered an output of the system in the present study.

In this study, other social systems with secondary influences on the health system have also not been included in the study of health system performance, though they have been included to explain the overall health of the community. Cross-system goals of the health system have been excluded, as the authors believe that the current health system in the Indian context should not be held accountable for changes in all health determinants. Another limitation is the inability of the study tools to attribute concurrent micro-structural changes in the system to changes in performance. The study attempts to measure the components determining health system performance, and hence assesses performance over a broader time frame of one year, which was considered immune to such micro-changes because of their generalised omnipresence.

Conclusions

Generation of a composite Health System Performance Index on the basis of input, process, output and outcome indicators is an urgent need. Not only will the results of the study develop a comprehensive, comparable account of how well the health system is working in different districts of the state, they will also provide a basis to explore the why component of the equation: why certain health system domains in certain areas perform better than others. The results can provide initial directions and establish pathways for further qualitative research, which may help generate a richer understanding of the underlying mechanisms at play.

Ethical considerations

Ethical clearance for the doctoral research, as part of which this study is being conducted, was obtained from the Institute Ethics Committee of PGIMER, Chandigarh, India in June 2014. Two administrative approvals were obtained from the Health Department, Haryana in November 2014 and August 2015 to undertake evaluation of their health facilities and collection of secondary data. Written informed consent is being obtained from respondents wherever they participate in data collection; the consent form covers the purpose of the study, benefits of participation, the opportunity to withdraw, confidentiality of the data, and contact persons in case individuals need further clarification. The privacy of selected respondents and the confidentiality of information received from them will be ensured by anonymizing the data retrieved.

References

1. 

WHO. The World Health Report 2000. Health Systems: Improving Performance. Geneva: World Health Organisation; 2000.

2. 

DM Berwick, TW Nolan, J Whittington. The Triple Aim: care, health, and cost. Health Affairs 2008;27:759-69.

3. 

World Health Organisation. Everybody’s business: strengthening health systems to improve health outcomes : WHO’s framework for action. Geneva, Switzerland: WHO; 2007.

4. 

AR Tawfik-Shukor, NS Klazinga, OA Arah. Comparing health system performance assessment and management approaches in the Netherlands and Ontario, Canada. BMC Health Serv Res 2007;7.

5. 

Canadian Institute for Health Information. A Performance Measurement Framework for the Canadian Health System. Ottawa: Canadian Institute for Health Information; November 2013.

6. 

A Donabedian. The quality of care. How can it be assessed? Arch Pathol Lab Med 1997;121:1145-50.

7. 

OA Arah, GP Westert, J Hurst, NS Klazinga. A conceptual framework for the OECD Health Care Quality Indicators Project. Int J Qual Health Care 2006;18:5-13.

8. 

CK Tashobya, VC da Silveira, F Ssengooba. Health systems performance assessment in low-income countries: learning from international experiences. Global Health 2014;10:5.

9. 

G Shakarishvili, R Atun, P Berman. Converging Health systems frameworks: towards a concepts-to-actions roadmap for health systems strengthening in low and middle income countries. Global Health Gov 2010;3:1-17.

10. 

G Shakarishvili. Building on health systems frameworks for developing a common approach to health systems strengthening. Prepared for the World Bank, the Global Fund and the GAVI Alliance technical workshop on health systems strengthening. Washington, DC, June 25-27, 2009.

11. 

S Nuti, F Vola, A Bonini, M Vainieri. Making governance work in the health care sector: evidence from a natural experiment in Italy. Health Econ Policy Law 2016;11:17-38.

12. 

A Brown, GR Baker, T Closson, T Sullivan. The journey toward high performance and excellent quality. Healthcare Quarterly 2012;15:6-9.

13. 

Ministry of Health. Annual Health Sector Performance Report Financial Year 2012-13. Uganda: Ministry of Health, The Republic of Uganda; 2013.

14. 

Holistic Assessment of the Health Sector Programme of Work 2014. Ghana: Ministry of Health; 2015. Available from: http://www.moh.gov.gh/wp-content/uploads/2016/02/Holistic-Assessment-2015.pdf .

15. 

WHO. Pathways to health system performance assessment: a manual to conducting health system performance assessment at national or sub-national level. World Health Organization; 2012.

16. 

S Prinja, M Kaur, R Kumar. Universal health insurance in India: ensuring equity, efficiency, and quality. Indian J Commun Med 2012;37:142-9.

17. 

NRHM. National Rural Health Mission 2005-2012 Mission Document. New Delhi: Ministry of Health & Family Welfare, Govt. of India; 2005.

18. 

S Prinja, P Bahuguna, R Gupta. Coverage and Financial Risk Protection for Institutional Delivery: How Universal Is Provision of Maternal Health Care in India? PLOS One 2015;10:e0137315.

19. 

S Barsanti, S Nuti. The equity lens in the health care performance evaluation system. Int J Health Plan Manag 2014;29:e233-46.

20. 

S Nuti, C Seghieri. Is variation management included in regional healthcare governance systems? Some proposals from Italy. Health Policy 2014;114:71-8.

21. 

WHO. Monitoring the building blocks of health systems: a handbook of indicators and their measurement strategies. Geneva, Switzerland: World Health Organisation; 2010.

22. 

V Navarro. Assessment of the World Health Report 2000. Lancet 2000;356:1598-601.

23. 

OECD. Handbook on Constructing Composite Indicators: Methodology and User Guide. France: Organisation for Economic Cooperation and Development; 2008.

24. 

R Foa, J Tanner. Methodology of the Indices of Social Development. ISD Working Paper 04. The Hague: International Institute of Social Studies of Erasmus University Rotterdam (ISS); 2012.

25. 

National Health Mission. Framework for implementation of National Health Mission 2012-2017. New Delhi: Ministry of Health and Family Welfare, Government of India; 2012.

26. 

NRHM. Report on rural health statistics upto 31st March 2014. Panchkula: NRHM, Haryana; 2014.

27. 

S Mounier-Jack, UK Griffiths, S Closser. Measuring the health systems impact of disease control programmes: a critical reflection on the WHO building blocks framework. BMC Public Health 2014;14.

28. 

S Prinja, R Gupta, P Bahuguna. A Composite Indicator to Measure Universal Health Care Coverage in India: Way Forward for Post-2015 Health System Performance Monitoring Framework. Health Policy Planning 2016;7.

29. 

M Saisana. COIN tool: A do-it-yourself guide in Excel for constructing and assessing composite indicators. Italy: European Commission, Joint Research Centre; 2015.

30. 

GBD 2015 SDG Collaborators. Measuring the health-related Sustainable Development Goals in 188 countries: a baseline analysis from the Global Burden of Disease Study 2015. Lancet 2016;388:1813-50.

31. 

U Ebert, H Welsch. Meaningful environmental indices: a social choice approach. J Environ Econ Manage 2004;47:270-83.

32. 

IBM SPSS Statistics for Windows, Version 21.0. Armonk, NY: IBM Corp.; 2012.

33. 

L Jackson, N Lee, L Samet. Frequency of policy recommendations in epidemiologic publications. Am J Public Health 1999:1206-11.

34. 

L Dandona, Y Siwan, M Jyothi. The lack of public health research output from India. BMC Public Health 2004;4:55.

35. 

CJL Murray, J Frenk. A framework for assessing the performance of health systems. Bull World Health Organ 2000;78:717-31.

36. 

E Nolte, C Bain, M McKee. Population health. In: PC Smith, E Mossialos, I Papanicolas, S Leatherman, editors. Performance Measurement for Health System Improvement: Experiences, Challenges and Prospects; 2008.

37. 

I Papanicolas, PC Smith, editors. Health system performance comparison: an agenda for policy, information and research. World Health Organization; 2013.

38. 

ME Porter. What is value in health Care? New Engl J Med 2010;363:2477-81.

39. 

M Gray, A El Turabi. Optimising the value of interventions for populations. BMJ 2012;345:e6192.

40. 

LE Downey, A Mehndiratta, A Grover. Institutionalising health technology assessment: establishing the Medical Technology Assessment Board in India. BMJ Global Health 2017;2:e000259.

Table 1.

Sub-domains and number of indicators used for measurement of inputs, processes, outputs and outcomes of the health system.

Sub-domains Number of indicators
Health system building blocks
Health Financing 1) Resource pooling and purchase of services 1
2) Extent of public health expenditure 1
Human Resources 1) Availability and distribution of human resources for health 4
2) Capacity and productivity of human resources for health 5
3) Motivation and job satisfaction of human resources for health 6
Material Resources 1) Sub-Centre facility readiness 4
2) Primary Health Centre facility readiness 4
3) Community Health Centre facility readiness 4
4) District Hospital facility readiness 4
Health Management Information System 1) Health Management Information System resources availability 2
2) Resource capacity 3
3) Data transmission 2
4) Data quality 2
5) Data usage 2
Service Provision 1) Service availability 7
2) Service quality 5
Governance 1) Participation and responsiveness 3
2) Transparency and fairness 4
3) Effectiveness and efficiency 4
4) Accountability 3
Total 20 70
Health system outputs and outcomes
Health System outputs 1) Primary care coverage 5
2) Curative care utilization 6
3) Equity in health financing 2
4) Efficiency and equity in service delivery 7
Health system outcomes 1) Morbidity rates 2
2) Mortality rates 2
3) Financial risk protection 4
Total 7 28
Table 2.

Study tools developed for data collection under study.

No. Tool Will collect data on Source of information Mode of administration Documents consulted during tool preparation
1. State level A HMIS data transmission; HR availability information; HR training; Financial reports of programmes under NHM; Out-patient, in-patient records State level public health headquarters: NHM state headquarters; Director General Health Services, Haryana; Civil Registration Department, Haryana; State Institute of Health and family Welfare, Haryana Record review: Secondary data collection WHO. SARA service availability indicators: 2012. WHO. Toolkit on monitoring HSS: Human resources for health: 2009. WHO. A guide to rapid assessment of human resources for health: 2004. PRISM Tools: RHIS performance diagnostic tool: Quality of data assessment
2. State level B Public health facilities in districts; Outputs at public health facilities; District demographic characteristics State level public health headquarters: NHM state headquarters; Director General Health Services, Haryana Record review: Secondary data collection WHO. SARA service availability indicators: 2012. WHO. A guide to rapid assessment of human resources for health: 2004.
3. District level Performance of public health facilities in district; Governance related parameters Audits and Meetings District level public health headquarters: Civil Surgeon office Record review: Secondary data collection WHO. SAM district questionnaire
4. Facility in-charge interview Feedback on governance parameters; Facility performance parameters Facilities surveyed in district Facility in-charge interview: Primary data collection. Record Review: Secondary data collection WHO. SARA core questionnaire: Infrastructure. WHO. Assessment of HRH: Healthcare providers: 2002. PAIMAN health facility assessment survey.
5. HMIS familiarity assessment, Conceptual understanding assessment tool Data handlers’ and facility in-charges’: Familiarity with HMIS; Conceptual understanding regarding primary care indicators target computation Facilities surveyed in district Data handlers’ and facility in-charges’ interview: Primary data collection NRHM, Haryana. Single line reporting format
6. DH facility assessment tool HR at facility; Facility readiness in terms of material resources; Facility performance DH surveyed in district Facility survey: Primary data collection. Record Review: Secondary data collection WHO. SARA core questionnaire: Infrastructure. MoHFW. Performa for IPHS facility survey of Cat. I (101-200 beds) DH. State essential drug list (EDL)
7. CHC facility assessment tool HR at facility; Facility readiness in terms of material resources; Facility performance CHCs surveyed in district Facility survey: Primary data collection. Record Review: Secondary data collection WHO. SARA core questionnaire: Infrastructure. MoHFW. Performa for IPHS facility survey of CHCs. State essential drug list (EDL).
8. PHC facility assessment tool HR at facility; Facility readiness in terms of material resources; Facility performance PHCs surveyed in district Facility survey: Primary data collection. Record Review: Secondary data collection WHO. SARA core questionnaire: Infrastructure. MoHFW. Performa for IPHS facility survey of PHCs. State essential drug list (EDL).
9. SC facility assessment tool HR at facility; Facility readiness in terms of material resources; Facility performance SCs surveyed in district Facility survey: Primary data collection. Record Review: Secondary data collection WHO. SARA core questionnaire: Infrastructure. MoHFW. Performa for IPHS facility survey of SCs. State essential drug list (EDL).
10. Motivation and job-satisfaction assessment tool Motivation of HR; Job-satisfaction of HR Facilities surveyed in district Employee interview: Primary data collection Tsai Y. Relationship between Organizational Culture, Leadership Behaviour and Job Satisfaction. Sharma M. Determinants of Indian physicians’ satisfaction & dissatisfaction from their job. Mbindyo PM. Developing a tool to measure health worker motivation in district hospitals in Kenya.
11. Client satisfaction tool Client satisfaction Facilities surveyed in district Client interview: Primary data collection Measure DHS. SPA survey: ANC client exit interview
12. CENHM survey information extraction tool Primary care coverage parameters; Health financing parameters CENHM survey data Data analysis
13. Social determinants information collection tool Social determinants of health Literature review


Copyright (c) 2017 Atul Sharma, Shankar Prinja, Arun Kumar Aggarwal

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
 