
Download Misleading Evidence and Evidence-Led Policy: Making Social Science More Experimental (The ANNALS of the American Academy of Political and Social Science Series) eBook

by Lawrence W. Sherman

ISBN: 0761928588
Author: Lawrence W. Sherman
Category: Social Sciences
Language: English
Publisher: SAGE Publications, Inc; 1st edition (September 1, 2003)
Pages: 236


Subjects: Business & Economics, Economics, Political Science, Social Sciences, Sociology. Collections: Arts & Sciences VII Collection, JSTOR Archival Journal & Primary Source Collection, JSTOR Essential Collection.


The American Academy of Political and Social Science was founded in 1889 to promote progress in the social sciences. Sparked by Professor Edmund J. James and drawing from members of the faculty of the University of Pennsylvania, Swarthmore College, and Bryn Mawr College, the Academy sought to establish communication between scientific thought and practical effort.

Sherman, Lawrence W. (2003). Misleading evidence and evidence-led policy: Making social science more experimental. The Annals of the American Academy of Political and Social Science, 589, 6–19.

Misleading Evidence and Evidence-Led Policy: Making Social Science More Experimental

THE ANNALS of the American Academy of Political and Social Science is the bimonthly publication of The Academy.


Research evidence can and should have an important role in shaping public policy. Just as much of the medical community has embraced the concept of "evidence-based medicine," increasing numbers of social scientists and government agencies are calling for an evidence-based approach to determine which social programs work and which ones don't. It is an irony not lost on the social scientists writing for the September volume of The Annals that the first use of experimental methods in medicine (to test the effects of streptomycin on tuberculosis in the late 1940s) was actually conducted by an economist. But while more than one million clinical trials in medicine have been conducted since that time, only about 10,000 have been conducted to evaluate whether social programs achieve their intended effects.

Authors of the September volume argue that this level of investment in the "gold standard" of research designs is insufficient for a wide range of reasons. Randomized controlled trials, for example, are far better at controlling selection biases and chance effects than are observational methods, while econometric and statistical techniques that seek to correct for bias fall short of their promise. The volume dramatically demonstrates that alternative methods generate different (and often substantially wrong) estimates of program effects. Some research based on nonexperimental designs actually misleads policy makers and practitioners into supporting programs that don't work, while ignoring others that do.

Authors of this volume also directly address critiques of experimental designs, which range from questions about their practicality to their ethics. Some of these arguments are well taken, but addressable. The authors, however, reject other arguments against controlled tests as unfounded and damaging to social science.

Policymakers will find these articles invaluable in better understanding how alternative research methods can mislead as much as enlighten. Students and researchers will be confronted with powerful arguments that question the use of nonexperimental techniques to estimate program effects.

This volume throws the gauntlet down. We challenge you to pick it up.