Alfredo Covaleda,
Bogota, Colombia
Stephen Guerin,
Santa Fe, New Mexico, USA
James A. Trostle,
Trinity College, Hartford, Connecticut, USA
MapLight, a 501(c)(3) foundation, recently announced its “extensive mapping project examining the geographic origin of contributions to legislators by state; contributions from companies to legislators by state; and roll call votes by state and district on key bills in Congress.”
Today’s news peg points to “Who in Your State Has Contributed Money to Majority Leader Candidate Kevin McCarthy (R-CA)?”
MapLight looks to be a good addition to our GIS toolbox.
Thanks to Margo Williams for passing this interview along. It’s filled with important tips and insights gained from Myers’ years of experience. Read the full interview with Myers at http://www.icij.org/blog/2014/06/try-and-find-narnia-wardrobe-inside-work-research-specialist
“Paul Myers is an internet research specialist working in the U.K. media. He joined the BBC in 1995 as a news information researcher. This followed an earlier career in computers and internet experience dating back to the 1970s.
“These days, his role sees him organise and deliver training courses related to internet investigation, digital security, social media research, data journalism, freedom of information and reporting statistics. His techniques have helped his colleagues develop creative approaches to research, conduct their investigations securely and have led many journalists to information they would never have otherwise been able to find. He has worked with leading British T.V. & radio news, current affairs, documentaries and consumer programmes.”
Published by Don Begley at 10:09 pm under Complex News, event
It’s human nature: Elections and disinformation go hand-in-hand. We idealize the competition of ideas and the process of debate while we listen to the whisper campaigns telling us of the skeletons in the other candidate’s closet. Or, we can learn from serious journalism to tap into the growing number of digital tools at hand and see what is really going on in this fall’s campaigns. Join journalist Tom Johnson for a three-part workshop at Santa Fe Complex to learn how you can be your own investigative reporter and get ready for that special Tuesday in November.
Over the course of three Tuesdays, beginning September 30, Johnson will show workshop participants how to do the online research needed to understand what’s happening in the fall political campaign. There will be homework assignments and participants will contribute to the Three Tuesdays wiki so their discoveries will be available to the general public.
Everyone is welcome but space will be limited. A suggested donation of $45 covers all three events or $20 will help produce each session. Click here to sign up.
This workshop is NOT a sit-and-take-it-in event. We’re looking for folks who want to do some beginning hands-on (“online hands-on,” that is) investigation of New Mexico politics. And that means homework assignments and contributing to our Three Tuesdays wiki. Participants are also encouraged to bring a laptop if they can. Click here to sign up.
Tom Johnson’s 30-year career path in journalism is one that regularly moved from the classroom to the newsroom and back. He worked for TIME magazine in El Salvador in the mid-80s, was the founding editor of MacWEEK, and a deputy editor of the St. Louis Post-Dispatch. His areas of interest are analytic journalism, dynamic simulation models of publishing systems, complexity theory, the application of Geographic Information Systems in journalism and the impact of the digital revolution on journalism and journalism education. He is the founder and co-director of the Institute for Analytic Journalism and a member of the Advisory Board of Santa Fe Complex.
Deep Web Research 2008
http://www.llrx.com/features/deepweb2008.htm
By Marcus P. Zillman, Published on November 24, 2007
Bots, Blogs and News Aggregators is a keynote presentation that I have been delivering over the last several years, and much of my information comes from the extensive research that I have completed over the years into the “invisible” or what I like to call the “deep” web. The Deep Web covers somewhere in the vicinity of 900 billion pages of information located throughout the World Wide Web in various files and formats that the current search engines on the Internet either cannot find or have difficulty accessing. Search engines currently locate approximately 20 billion pages.
In the last several years, some of the more comprehensive search engines have written algorithms to search the deeper portions of the world wide web by attempting to find files such as .pdf, .doc, .xls, .ppt, .ps, and others. These files are predominately used by businesses to communicate their information within their organization or to disseminate information to the external world from their organization. Searching for this information using deeper search techniques and the latest algorithms allows researchers to obtain a vast amount of corporate information that was previously unavailable or inaccessible. Research has also shown that even deeper information can be obtained from these files by searching and accessing the “properties” information on these files.
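One practical way to apply the file-type searching described above is to build format-restricted queries using the `filetype:` and `site:` operators that major search engines support. A minimal sketch, assuming a Google-style operator syntax and a hypothetical target domain:

```python
# Sketch: generating file-type-restricted search queries for finding
# "deep web" business documents. The filetype:/site: operator syntax
# follows common search-engine conventions; example.com is hypothetical.

def deep_web_queries(topic, domain=None,
                     filetypes=(".pdf", ".doc", ".xls", ".ppt", ".ps")):
    """Return one filetype-restricted query string per document format."""
    queries = []
    for ext in filetypes:
        q = f'{topic} filetype:{ext.lstrip(".")}'
        if domain:
            q += f" site:{domain}"
        queries.append(q)
    return queries

# One query per format, ready to paste into a search engine:
for q in deep_web_queries("annual report", domain="example.com"):
    print(q)
```

Each query restricts results to a single document format, which is how researchers surface the corporate files that ordinary keyword searches miss.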
This article and guide is designed to give you the resources you need to better understand the history of deep web research, as well as various classified resources that allow you to search through the currently available web to find those key sources of information nuggets only found by understanding how to search the “deep web”.
This Deep Web Research 2008 article is divided into the following sections:
Another unique investigation by The New York Times gets A1 play in this Sunday's edition (1 Oct. 2006) under the hed “Campaign Cash Mirrors a High Court's Rulings.” Adam Liptak and Janet Roberts (who probably did the heavy lifting on the data analysis) took a long-term look at who contributed to the campaigns of Ohio's Supreme Court justices. It ain't a pretty picture if one believes the justices should be above lining their own pockets, whether it's a campaign fund or otherwise.
In any event, there seems to be a clear correlation between contributions — and their sources — and the outcomes of too many cases. A sidebar, “Case Studies: West Virginia and Illinois,” would suggest there is much to be harvested by reporters in other states. There is, thankfully, a fine description of how the data for the study was collected and analyzed. See “How Information Was Collected.”
There are two accompanying infographics; one (“Ruling on Contributors' Cases”) is much more informative than the other (“While the Case Is Being Heard, Money Rolls In”), a worthy but confusing attempt to illustrate difficult concepts and relationships.
At the end of the day, though, we are grateful for the investigation, data crunching and stories.
We're pulling together the final pieces following the Ver 1.0 workshop in Santa Fe last week. Twenty journalists, social scientists, computer scientists, educators, public administrators and GIS specialists met in Santa Fe April 9-12 to consider the question, “How can we verify data in public records databases?”
The papers, PowerPoint slides and some initial results of three breakout groups are now posted for the public on the Ver1point0 group site at Yahoo. Check it out.
Friend-of-IAJ Griff Palmer alerts us to an impressive series this week that examines the conduct of the DA's office in Santa Clara County, California. If nothing else, the series illustrates why good, vital-to-the-community journalism takes time and is expensive. Rick Tulsky, Griff and other colleagues spent three years — not three days, but YEARS — on the story. Griff writes:
We don't know whether any empirical research has yet been done on how interested media consumers are in online crime mapping — or on how good the coverage is — but there is a body of literature debating readers' interest in crime per se. It would seem to be a pretty good bet, though, that if people are interested in crime AND if more and more are going online via broadband, then some dynamic crime maps would get some hits.
Remember that crime mapping is not just about pushing digital push-pins on a map, GoogleMap or otherwise. “Journey to Crime” maps or maps showing where a car was stolen and when it was recovered can provide interesting insights.
Here are some links recently posted to the CrimeMapping listserv that could be of value to journalists:
Journey-after-crime: How Far and to Which Direction Do They Go? http://www.ojp.usdoj.gov/nij/maps/boston2004/papers/Lu.ppt
Linking Offender Residence Probability Surfaces to a Specific Incident Location http://www.ojp.usdoj.gov/nij/maps/dallas2001/Gore.doc
Journey to Crime Estimation http://www.icpsr.umich.edu/CRIMESTAT/files/CrimeStatChapter.10.pdf
Applications for Examining the Journey-to-Crime Using Incident-Based Offender Residence Probability Surfaces http://pqx.sagepub.com/cgi/content/refs/7/4/457
The Geography of Transit Crime: http://www.uctc.net/papers/550.pdf
See, too: Paulsen, Derek J. “Wrong Side of the Tracks: Exploring the Role of Newspaper Coverage of Homicide in Socially Constructing Dangerous Places.” Journal of Criminal Justice and Popular Culture 9(3) (2002): 113-127.
A piece on calling the elections in Detroit:
By Chris Christoff, Free Press Lansing Bureau Chief
November 10, 2005
What was a viewer to believe?
As polls closed Tuesday, WDIV-TV (Channel 4) declared Freman Hendrix winner of Detroit's mayoral race by 10 percentage points.
WXYZ-TV (Channel 7) showed Hendrix ahead by 4 percentage points, statistically too close to call.
But WJBK-TV (Channel 2) got it right, declaring just after 9 p.m. that Mayor Kwame Kilpatrick was ahead, 52% to 48%, which turned out to be almost exactly the final 53%-47% outcome declared many hours later.
And it was vote analyst Tim Kiska who nailed it for WJBK, and for WWJ-AM radio, using counts from 28 of 620 Detroit precincts.
Kiska did it with help from Detroit City Clerk Jackie Currie. She allowed a crew that Kiska assembled to collect the precinct tallies shortly after the polls closed at 8 p.m.
Using what he calls a secret formula, Kiska calculated how those 28 precincts would predict the result citywide.
His formula also assumed that absentee voters chose Hendrix over Kilpatrick by a 2-1 ratio.
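The article keeps Kiska's formula secret, but the general technique it describes — scaling a sample of early-reporting precincts up to the citywide total, then folding in an assumed absentee split — can be sketched. Everything below is a hypothetical reconstruction with invented numbers, not Kiska's actual method:

```python
# Hypothetical sketch of a precinct-sample election projection.
# The real "secret formula" is not disclosed; all figures are invented.

def project_result(sample, turnout_weight, absentee_votes, absentee_split):
    """
    sample:          list of (candidate_a, candidate_b) vote pairs from
                     the precincts counted so far
    turnout_weight:  estimated ratio of citywide election-day vote to
                     the sample's vote (e.g. 620 precincts / 28 sampled)
    absentee_votes:  expected number of absentee ballots
    absentee_split:  assumed fraction of absentee ballots for candidate A
                     (a 2-1 split AGAINST A means 1/3)
    Returns (share_a, share_b) as fractions of the projected total.
    """
    a = sum(p[0] for p in sample) * turnout_weight
    b = sum(p[1] for p in sample) * turnout_weight
    a += absentee_votes * absentee_split
    b += absentee_votes * (1 - absentee_split)
    total = a + b
    return a / total, b / total

# 28 sample precincts leaning 52-48 toward A, with absentee
# ballots assumed to break 2-1 the other way:
a_share, b_share = project_result([(520, 480)] * 28, 620 / 28, 5000, 1 / 3)
print(f"{a_share:.1%} vs {b_share:.1%}")
```

The key design point is that the absentee assumption enters as a separate additive term, so a wrong guess about absentees shifts the projection without corrupting the precinct-sample signal.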
That's different from the methods of pollsters who got it wrong Tuesday, Steve Mitchell for WDIV and EPIC/MRA's Ed Sarpolus for WXYZ and the Free Press. Both men used telephone polls, calling people at home during the day and evening and asking how they voted.
It's a more standard method of election-day polling, but Tuesday proved treacherous.
Kiska, a former reporter for the Free Press and Detroit News, has done such election-day predictions since 1974, but said he was nervous Tuesday.
“Every time I go into one of these, my nightmare is I might get it wrong,” said Kiska, a WWJ producer. “I had a bad feeling about this going in. I thought there was going to be a Titanic hitting an iceberg and hoping it wouldn't be me.”
Kiska said he especially felt sorry for his friend Mitchell.
Mitchell said he's been one of the state's most accurate political pollsters over 20 years, but said his Tuesday survey of 800 voters turned out to be a bad sample.
He said polling is inherently risky, and that even well-conducted polls can be wrong one out of 20 times. “I hit number 20 this time.”
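Mitchell's “wrong one out of 20 times” refers to the 95% confidence level standard in survey work. For a simple random sample of n voters, the worst-case margin of error at 95% confidence is roughly 1.96 × sqrt(0.25 / n) — a back-of-the-envelope check, not a description of either pollster's actual methodology:

```python
# Sketch: worst-case 95% margin of error for a simple random sample.
# "One out of 20" = the 5% of polls expected to fall outside this band.
import math

def margin_of_error(n, z=1.96):
    """Worst-case (p = 0.5) margin of error for a sample of size n."""
    return z * math.sqrt(0.25 / n)

# Mitchell's survey of 800 voters:
print(f"n=800 -> +/- {margin_of_error(800):.1%}")
```

For n = 800 this works out to about ±3.5 percentage points, which shows why a race inside that band is genuinely hard to call from a phone poll.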
For Sarpolus, it's the second Detroit mayoral race that confounded his polls. He was the only major pollster in 2001 who indicated Gil Hill would defeat Kilpatrick.
Sarpolus said the pressure to get poll results on the air quickly made it impossible to adjust his results as real vote totals were made public during the late evening.
Of Kiska, Sarpolus said: “You have to give him credit. … But you have to assume all city clerks are willing to cooperate.”
Contact Chris Christoff at 517-372-8660 or christoff@freepress.com.