Позитивные изменения. Том 2, № 2 (2022). Positive changes. Volume 2, Issue 2 (2022)

Note: Significance level: ** = 1 percent.

Source: Paul J. Gertler, Sebastian Martinez, Patrick Premand, Laura B. Rawlings, and Christel M. J. Vermeersch. Impact Evaluation in Practice, Second Edition. International Bank for Reconstruction and Development / The World Bank, 2016.

In conclusion, rigorous evaluation methods, combined with regular collection and monitoring of project or program data, form the core set of tools that stakeholders can use to verify and improve the effectiveness and efficiency of social projects and programs at various stages of implementation.

REFERENCES:

1. Gertler, Paul J., Sebastian Martinez, Patrick Premand, Laura B. Rawlings, and Christel M. J. Vermeersch. Impact Evaluation in Practice, Second Edition. International Bank for Reconstruction and Development / The World Bank, 2016.

2. Imbens, Guido W., and Donald B. Rubin. «Rubin Causal Model.» In The New Palgrave Dictionary of Economics, Second Edition, edited by Steven N. Durlauf and Lawrence E. Blume. Palgrave, 2008.

3. Rubin, Donald B. «Estimating Causal Effects of Treatments in Randomized and Nonrandomized Studies.» Journal of Educational Psychology 66 (5): 688–701, 1974.

4. Bertrand, Marianne, Bruno Crépon, Alicia Marguerie, and Patrick Premand. Impacts à Court et Moyen Terme sur les Jeunes des Travaux à Haute Intensité de Main d’oeuvre (THIMO): Résultats de l’évaluation d’impact de la composante THIMO du Projet Emploi Jeunes et Développement des compétences (PEJEDEC) en Côte d’Ivoire. Washington, DC: Banque Mondiale et Abidjan, BCP-Emploi, 2016.

5. Blattman, Christopher, Nathan Fiala, and Sebastian Martinez. «Generating Skilled Self-Employment in Developing Countries: Experimental Evidence from Uganda.» Quarterly Journal of Economics 129 (2): 697–752, 2014. doi: 10.1093/qje/qjt057.

6. Bruhn, Miriam, and David McKenzie. «In Pursuit of Balance: Randomization in Practice in Development Field Experiments.» American Economic Journal: Applied Economics 1 (4): 200–232, 2009.

7. Dupas, Pascaline. «Do Teenagers Respond to HIV Risk Information? Evidence from a Field Experiment in Kenya.» American Economic Journal: Applied Economics 3 (1): 1–34, 2011.

8. Glennerster, Rachel, and Kudzai Takavarasha. Running Randomized Evaluations: A Practical Guide. Princeton, NJ: Princeton University Press, 2013.

9. Kremer, Michael, Jessica Leino, Edward Miguel, and Alix Peterson Zwane. «Spring Cleaning: Rural Water Impacts, Valuation, and Property Rights Institutions.» Quarterly Journal of Economics 126 (1): 145–205, 2011.

10. Miguel, Edward, and Michael Kremer. «Worms: Identifying Impacts on Education and Health in the Presence of Treatment Externalities.» Econometrica 72 (1): 159–217, 2004.

11. Premand, Patrick, Oumar Barry, and Marc Smitz. «Transferts monétaires, valeur ajoutée de mesures d’accompagnement comportemental, et développement de la petite enfance au Niger. Rapport descriptif de l’évaluation d’impact à court terme du Projet Filets Sociaux.» Washington, DC: Banque Mondiale, 2016.

12. Schultz, Paul. «School Subsidies for the Poor: Evaluating the Mexican Progresa Poverty Program.» Journal of Development Economics 74 (1): 199–250, 2004.

Evaluating Impact with All Rigor Possible. Applicability of Mathematical Methods in Measuring Social Impact (Exemplified by the Health Insurance Subsidy Program)

Increasing incomes, improving education systems, and reducing morbidity are important areas of impact investment. Whether these changes will actually be achieved is the key question for an investor deciding which social technology or project to invest in. However, the leaders of social projects and programs often focus on measuring immediate outputs rather than on assessing whether the projects and programs have had the expected impact. In this article, we highlight the experience of evaluating the impact of the Health Insurance Subsidy Program (HISP) and describe approaches that can be used to address this and other similar problems.

Elena Avramenko

Expert of the project "Development of a Social and Economic Impact Assessment Model for NGOs" by the GLADWAY Foundation, Lean 6 Sigma Green Belt master

HEALTH INSURANCE FOR THE POOR

The Health Insurance Subsidy Program (HISP) is a program implemented in Kenya that finances the purchase of health insurance for low-income households in rural areas. The insurance covers the costs associated with medical care and medications. The objective of HISP is to reduce out-of-pocket health expenditures of poor families and ultimately to improve health outcomes.

HISP was originally launched in pilot mode. The plans for gradual expansion of the program depended on the results of the pilot stage. As part of the pilot run, the plan was to reduce the average yearly per-capita health expenditures of poor rural households by at least USD 10 below what they would have spent in the absence of the program, and this target was to be reached within two years.

During the initial pilot phase, HISP was introduced in 100 rural areas. Of the 4,959 households in the baseline sample, a total of 2,907 were enrolled in HISP, and the program operated successfully through its pilot stage over the next two years. All health clinics and pharmacies serving the 100 villages accepted patients under the insurance program, and surveys showed that most enrolled households were satisfied with the program. Data was collected before the start of the pilot run and at the end of the two-year period, using the same sample of 4,959 households.
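For readers who want to see what checking the pilot target might look like in practice, below is a minimal Python sketch. The file name and column names (hisp_panel.csv, round, enrolled, health_exp) are hypothetical placeholders, not the actual HISP dataset.

```python
# A minimal sketch of the raw before-and-after check on the pilot target,
# assuming a hypothetical panel file with one row per household per round.
import pandas as pd

# Hypothetical columns: household_id; round ("baseline" / "followup");
# enrolled (1 if the household joined HISP); health_exp (yearly per-capita
# out-of-pocket health expenditures, in USD).
df = pd.read_csv("hisp_panel.csv")

enrolled = df[df["enrolled"] == 1]
baseline = enrolled.loc[enrolled["round"] == "baseline", "health_exp"].mean()
followup = enrolled.loc[enrolled["round"] == "followup", "health_exp"].mean()

print(f"Mean expenditures: baseline {baseline:.2f}, follow-up {followup:.2f}")
print(f"Raw change: {followup - baseline:.2f} USD (target: -10 USD or lower)")
# Note: this raw change is not yet the program's impact -- it ignores the
# counterfactual, which is exactly the problem discussed below.
```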

PROOF OF IMPACT

Has HISP affected out-of-pocket health expenditures of poor rural households? Yes, it has, and this has been demonstrated mathematically. The impact evaluation approach used for HISP was to select the most rigorous method possible, given the specifics of the project.

The HISP implementation case study provides us with the following «menu» of options for impact evaluation methods:

• randomized assignment;

• instrumental variables;

• regression discontinuity design;

• difference-in-differences;

• matching.

All of these approaches aim at identifying valid comparison groups so that the true impact of the program on out-of-pocket health care expenditures of poor households can be evaluated.
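To make one item from this menu concrete, here is a minimal difference-in-differences sketch in Python, reusing the hypothetical file and columns from the previous example; it illustrates the general technique, not the evaluation actually run for HISP.

```python
# A minimal difference-in-differences sketch: compare the change over time
# among enrolled households with the change among non-enrolled households.
import pandas as pd

df = pd.read_csv("hisp_panel.csv")  # hypothetical panel from the sketch above

def change_over_time(group: pd.DataFrame) -> float:
    """Mean follow-up expenditure minus mean baseline expenditure."""
    means = group.groupby("round")["health_exp"].mean()
    return means["followup"] - means["baseline"]

change_enrolled = change_over_time(df[df["enrolled"] == 1])
change_comparison = change_over_time(df[df["enrolled"] == 0])

# Subtracting the comparison group's change nets out trends that affect
# both groups, leaving an estimate of the program's impact.
did_estimate = change_enrolled - change_comparison
print(f"Difference-in-differences estimate: {did_estimate:.2f} USD")
```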

So, we pick up at the stage where the evaluation indicators have been selected and elaborated in detail, the data collection plan is ready, and the data has been properly collected.

We will review the evaluation methodology selected for this case by introducing the concept of the counterfactual (an estimate of what the outcome would have been for participants in the absence of the program). Then, within the framework of this article, we will give an overview of the most rigorous evaluation method proposed for HISP and tested on this program.

There are two concepts that are integral to the process of making accurate and reliable impact evaluations – the concept of causation and that of the counterfactual.

First of all, issues of social impact are issues of causation – for example, the search for answers to questions such as:

Does teacher training improve student test scores? Do additional funding programs for health facilities result in better health outcomes for children? Do vocational training programs increase a trainee's income?

Finding answers to these questions can be difficult. For example, in the context of a vocational training program, simply observing how a trainee’s income increases after completing such a program is not sufficient to establish a causal relationship. A trainee’s income could have increased even if he or she had not been trained – all through the trainee’s own efforts, due to changing conditions in the labor market, or due to many other factors that could affect income.

The challenge is to find a method that will allow us to establish a causal relationship. We must empirically determine to what extent a particular program – and that program alone – contributed to the change in outcome. The methodology must exclude all external factors.

The basic question of impact evaluation – what is the impact, or causal effect, of a program (P) on an outcome of interest (Y)? – is answered by the basic impact evaluation formula:

Δ = (Y | P = 1) − (Y | P = 0).

This formula states that the causal effect (Δ) of the program (P) on the outcome (Y) is the difference between the outcome (Y) with the program (in other words, when P = 1) and the same outcome (Y) without the program (i.e. when P = 0).

For instance, should P denote a training program and Y denote income, then the causal effect of the training program (Δ) is the difference between a trainee’s income (Y) after participating in the training program (in other words, when P = 1) and the income of the same person at the same point in time had he or she not participated in the program (in other words, when P = 0).
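As a toy numeric illustration (the numbers are made up): if the trainee earns 1,100 with the program and the same person would have earned 1,000 without it, the causal effect is 100.

```python
# Toy illustration of the basic impact evaluation formula, with made-up values.
y_with_program = 1_100     # Y | P = 1: income after participating (hypothetical)
y_without_program = 1_000  # Y | P = 0: same person, same moment, no program
                           # (in practice this term is unobservable)

delta = y_with_program - y_without_program  # Δ = (Y | P = 1) − (Y | P = 0)
print(f"Causal effect of the program: {delta}")  # -> 100
```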

If this were possible, we would observe how much income the same person would have at the same point in time both with and without the program, so that the program would be the only possible explanation for any difference in that person’s income. By comparing the same person with himself or herself at the same time, we would be able to exclude any external factors that could also explain the difference in outcomes.

But unfortunately, measuring two versions of the same unit at the same time is impossible: at a particular point in time, the person either participated or did not participate in the program.

This phenomenon is called the counterfactual problem: how can we measure what would happen if other circumstances prevailed?

COMPARISON AND TREATMENT GROUPS

In practice, the task of impact evaluation is to identify a treatment group and a comparison group that are similar in their characteristics, except that one participates in the program and the other does not. That way, any difference in outcomes must be due to the program.
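The logic of randomized assignment can be checked in a small simulation. The sketch below uses made-up expenditure numbers and a true effect fixed at −10 USD, so it illustrates the idea rather than reproducing HISP results.

```python
# A minimal simulation sketch: with randomized assignment, the difference in
# mean outcomes between treatment and comparison groups recovers the true
# effect (here fixed at -10 USD), up to sampling noise.
import numpy as np

rng = np.random.default_rng(0)
n = 4_959              # sample size borrowed from the HISP pilot
true_effect = -10.0    # assumed true impact on yearly expenditures, USD

treated = rng.random(n) < 0.5              # coin-flip assignment to the program
baseline_exp = rng.normal(30.0, 8.0, n)    # hypothetical expenditure levels
outcome = baseline_exp + true_effect * treated

estimate = outcome[treated].mean() - outcome[~treated].mean()
print(f"Estimated impact: {estimate:.2f} USD (true effect: {true_effect})")
```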
