Tuesday, October 3, 2006
Resource for Public Policy Teaching: UT's "American Politics" Site
The University of Texas at Austin's American Politics site has a wealth of charts and graphs that illustrate key concepts useful for teaching a public policy course, such as Money Spent on Lobbying of Congress by Industry (2000) and Congress Reelection Rates 1966-2004. The resources are organized into 12 categories (plus a separate site for Texas politics, natch), and each category has a glossary of terms.
Monday, October 2, 2006
Watch Those Page Counts in Lecture Notes!
So I'm clacking away on my lecture notes for class and just writing and writing and writing and the page count just isn't going up; I target five pages for an hour and 15-minute lecture, and I've still got just two pages.
I do a Print Preview in Word to see why my seemingly bulbous lecture is so slim and it reports I'm at four pages. That many, and I haven't even covered one of the two chapters I want to talk about.
I switch back to the standard view, and the page count is now miraculously correct. Word must have been so overwhelmed by my genius prose that it got distracted.
Friday, September 22, 2006
Resource for Public Policy Teaching: Think Tank Town
The Washington Post has a feature/blog/something called Think Tank Town that solicits weekly columns from D.C. policy research shops, aka think tanks. I'm going to point my public policy students toward this as a source for ideas when they write policy proposal papers.
(Via Daniel W. Drezner)
Saturday, July 15, 2006
Behavioral Experiment Packages Rated
Behavioral Experiment Software Survey Results
Here's a one-paragraph summary of the survey results; details below: E-Prime is the most popular package of those surveyed, but the majority of folks are using either E-Prime, DMDX, or some flavor of PsyScope. E-Prime, PsyScope, SuperLab, and ERTS are all rated as easy to build experiments with, and about equally so. DMDX and NESU are seen as slightly harder. Presentation and MATLAB are notably the hardest of the commonly used packages.
Friday, July 14, 2006
A Nice Font for Working Papers
The attractive font authors like Andrew Gelman use for working papers like this one (pdf) is American Typewriter, available in Office for Macintosh. But not in Office for Windows.
Krosnick's Forthcoming Survey Design Handbook
There's a nice preview of Jon Krosnick's forthcoming survey design handbook at the link below. Krosnick, whom I was fortunate enough to study with at the Summer Institute in Political Psychology, has done the research to give what I think will be definitive answers on design conundrums such as providing (or not) a "don't know" option. I've often been frustrated by the lack of a one-stop how-to on rigorous survey design, so I can hardly wait for his book.
Harvard University Program on Survey Research at the Institute for Quantitative Social Science
The conference centered around the research Krosnick has done into a century's worth of survey methodology studies for his forthcoming book, The Handbook of Questionnaire Design: Insights from Social and Cognitive Psychology (Oxford).
Surveys have been an investigative staple since the earliest years of the social sciences. But surprisingly, though most research-methods textbooks include informal discussions of questionnaire design, they tend to treat it as an intuitive art rather than a scientific skill governed by formal rules for optimizing data quality.
...
Another surprising result involves "don't know" responses. Many researchers have presumed for decades that it is wise to offer a "don't know" option to respondents, because many people genuinely lack the information necessary to answer some survey questions. But Krosnick found instead that offering a "don't know" option mostly lures people who have real opinions to decline to answer, as a way to minimize the effort they devote to the process. By omitting the "don't know" option, researchers can measure the real opinions held by more people.