Scientists track money to help predict disease
Jan 26th, 2006 by JTJ

Yet another fine example of creative thinking wherein a good idea in one discipline is morphed into an unintended application in another.  (Something all-too-rare in the practice of journalism.)  The journal Nature reports:

Another day another dollar


The website wheresgeorge.com invites its users to enter the serial numbers of their US dollar bills and track them across America and beyond. Why? “For fun and because it had not been done yet”, they say. But the dataset accumulated since December 1998 has provided the ideal raw material to test the mathematical laws underlying human travel, and that has important implications for the epidemiology of infectious diseases. Analysis of the trajectories of over half a million dollar bills shows that human dispersal is described by a 'two-parameter continuous-time random walk' model: our travel habits conform to a type of random proliferation known as 'superdiffusion'. And with that much established, it should soon be possible to develop a new class of models to account for the spread of human disease.

Letter: The scaling laws of human travel, by D. Brockmann, L. Hufnagel and T. Geisel
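To get a feel for what "superdiffusion" means here, a minimal Python sketch (ours, not the authors' model) helps: draw step lengths from a heavy-tailed power-law distribution and watch the mean-square displacement grow faster than it would for an ordinary random walk. The exponent below is an illustrative assumption, not a value taken from the Nature letter.

# Minimal sketch of a superdiffusive, Levy-flight-style random walk.
# The power-law exponent is an illustrative assumption, not the value
# estimated by Brockmann, Hufnagel and Geisel.
import numpy as np

rng = np.random.default_rng(0)

def simulate_walk(n_steps, alpha=1.6):
    """Return the positions of a 2-D walk with power-law step lengths."""
    lengths = rng.pareto(alpha, n_steps) + 1.0          # heavy-tailed steps
    angles = rng.uniform(0.0, 2.0 * np.pi, n_steps)
    steps = np.column_stack((lengths * np.cos(angles), lengths * np.sin(angles)))
    return np.cumsum(steps, axis=0)

# Average the squared displacement over many walks.
n_walks, n_steps = 200, 1000
msd = np.zeros(n_steps)
for _ in range(n_walks):
    pos = simulate_walk(n_steps)
    msd += (pos ** 2).sum(axis=1)
msd /= n_walks

# Ordinary diffusion gives MSD ~ t (exponent about 1); heavy tails push it higher.
t = np.arange(1, n_steps + 1)
exponent = np.polyfit(np.log(t[10:]), np.log(msd[10:]), 1)[0]
print(f"MSD scaling exponent: {exponent:.2f} (a plain random walk would be about 1)")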



Tailor-Made Cartography with Google Maps
Jan 26th, 2006 by JTJ

We missed this one when it was originally posted on the National Public Radio site, but the story offers such interesting info and links that we wanted to get it into the IAJ archive. The NPR links at the top are of value, but be sure to scroll down to see other fascinating mashups from around the world.

Jan 12, 2006

Tailor-Made Cartography with Google Maps


 
A Google Maps mashup of brew pubs and breweries around Milwaukee, Wis. (Beer Mapping Project)

NPR's Top 10 Markets

See Google Maps “mashups” of public radio stations and their coverage areas in the top 10 markets:

Public radio stations in the San Francisco market. NPR

 

 


All Things Considered, January 12, 2006 · Google's popular mapping service has inspired people to add their own information to maps. The resulting “mashups” are maps overlaid with clickable icons that provide a unique look at fast-food restaurant locations, crime statistics and other data sets.

Robert Siegel talks to Mike Pegg, whose Google Maps Mania Web log tracks the latest mashups, by category.

Topics include transit (Boston subway stations), current events (BBC world news), and weather and Earth (meteor impact sites).

Some are clearly designed to be useful for everyday life: New York pizza places, Washington, D.C., home prices, and Chicago crime locations. Others are more for fun: find the nearest pub or brewery, peek in on Webcams, or look for a convenient jogging route.

“One of my favorites is a mashup in Dublin, Ireland, which takes the real-time locations of a commuter train and plots it onto the map, and it actually shows that train moving,” Pegg says.

Another popular mashup lets users see where they would end up if they drilled through the Earth to the other side. For example, click on Wichita, Kan., and you come out in the middle of the Indian Ocean.

“I think we're destined to see big things from this, both as the maps improve and as people's imaginations just continue to go wild with this,” Pegg says.




Encouraging signs of analytic journalism at UC-Berkeley J-School
Jan 25th, 2006 by JTJ

From the NiemanWatchdog.org posting…


How not to cover the economy
SHOWCASE | January 23, 2006

A fed-up Berkeley economics professor joins up with the J-school to teach journalists and would-be journalists how to cover – and even more emphatically, how not to cover – economic news.

By Dan Froomkin
froomkin@niemanwatchdog.org

Brad DeLong – the Berkeley economics professor whose popular blog includes more than a bit of media criticism – launched a fascinating experiment last week: He joined forces with Journalism School Professor Susan Rasky to teach a class for would-be journalists called “Covering the Economy.”

In DeLong’s hands, the class might be better titled “How Not to Cover the Economy.” As DeLong writes in the syllabus, he took up the challenge “because he is being gradually driven insane by stories in major newspapers and other outlets.”

DeLong, who was a senior Treasury Department official in the Clinton administration, has long been the scourge of sloppy economic reporting in his blog. He’s also been a contributor to NiemanWatchdog.org.

And although his official audience is made up of Berkeley students, anyone who covers the economy would be well advised to follow along online. In fact, Rasky says several established Bay Area journalists have already asked for – and received – permission to sit in.

The syllabus, reading list and what DeLong calls “after-action reports” will be available on the class’s Web site. And DeLong will be providing regular updates to NiemanWatchdog.org as well.

DeLong and Rasky write in their syllabus:

We both start with this premise: Nobody goes into journalism to write bad stories that mislead their readers and omit or downplay the important news of the events that they are covering. Journalists, especially daily journalists, have a very difficult job. They are under ferocious deadline pressure. They are beat reporters–which means that they cannot afford to alienate their sources too far, for they have to go back to them again and again. They are dealing with complicated and subtle issues. And at least half the people they talk to are telling them subtle (and sometimes not so subtle) lies.

So what has gone wrong? And how can journalists–and those among their sources who are interested in public education and in raising the level of the debate–make things go right?

Among DeLong’s horror stories: this November 4, 2005, New York Times story, “Senate Passes Budget With Benefit Cuts and Oil Drilling,” by Robert Pear with Carl Hulse.

The first paragraph:

The Senate on Thursday narrowly approved a sweeping five-year plan to trim a variety of federal benefit programs and to allow drilling for oil and natural gas in a wilderness area of Alaska, increasing the chances that the energy industry and Alaska officials will achieve a long-sought goal. The budget bill, the most ambitious effort to curb federal spending in eight years, was approved by a vote of 52 to 47. Five Republicans opposed the measure; two Democrats voted for it. Senator Judd Gregg, Republican of New Hampshire, the chairman of the Senate Budget Committee, said, “This bill is a reflection of the Republican Congress's commitment to pursue a path of fiscal responsibility.” It will, Mr. Gregg said, reduce the deficit and save roughly $35 billion over the next five years…

DeLong explains why it’s a horror:

The Federal government currently spends money at the rate of $2.6 trillion a year. Total incomes in the entire American economy are about $12 trillion a year. Saving $35 billion over five years means that you are saving $7 billion a year–0.3% of federal spending; 0.06% of GDP. Out of a federal budget that spends $9,000 per person per year, Judd Gregg is saving $27 a year.

Thus reading a lead like that makes Brad DeLong, at least, foam at the mouth: phrases like “sweeping,” “ambitious,” “commitment,” and “fiscal responsibility” simply have no place here–especially since [the author] does not give his readers any of the numbers needed as reference points to assess the magnitude of the Senate's action.
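DeLong's arithmetic is easy to check for yourself. A few lines of Python, with the figures rounded the way he rounds them, make the orders of magnitude explicit:

# Back-of-the-envelope check of DeLong's numbers (rounded as in his example).
federal_spending = 2.6e12      # dollars per year
gdp = 12e12                    # dollars per year
savings_per_year = 35e9 / 5    # $35 billion spread over five years

share_of_spending = savings_per_year / federal_spending
print(f"Savings per year: ${savings_per_year / 1e9:.0f} billion")
print(f"Share of federal spending: {share_of_spending:.1%}")      # ~0.3%
print(f"Share of GDP: {savings_per_year / gdp:.2%}")              # ~0.06%

# DeLong's per-person figure applies the rounded 0.3% share to roughly
# $9,000 of federal spending per person per year.
print(f"Savings per person per year: ${round(share_of_spending, 3) * 9000:.0f}")  # ~$27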

DeLong argues that there were three responsible ways to report this story: By writing that Judd Gregg had labored mightily and brought forth a mouse; by noting that the Republicans were so eager to be associated with “deficit reduction” that they were only announcing the spending side of their budget proposal – and delaying the announcement of the tax side (which, it turned out, would more than offset their spending cuts); or by writing about the institutional factors that make it so hard to cut the budget these days.

Rasky says she is thrilled that DeLong is joining her. And she credits a grant from the Carnegie-Knight Initiative on the Future of Journalism Education (described here by Katharine Q. Seelye in the May 26, 2005, New York Times), which calls in part for improving subject-matter education for journalists by having journalism school classes team-taught by experts from throughout the university.

The first six weeks of the class will be spent “looking at how the bread-and-butter economic news is covered and how it should be covered… During the next six weeks, we will focus more closely on four or five big economic trends,” the syllabus promises.

Following that, if there’s time, DeLong hopes to “examine the work done by some extremely good and skilled practitioners of journalism” such as William Greider, John M. Berry, Greg Ip, Paul Blustein, Julie Rovner, and Rebecca Smith.



Getting that tabled data from there to here
Jan 23rd, 2006 by JTJ

Another reason to use Firefox….

Copying and pasting data from online tables into a spreadsheet is often fraught with frustration, typically because of invisible characters or custom formatting in web tables. And then there's the problem of getting data from non-adjacent cells. Some fine fellow — actually, it is Davide Ficano — has written a slick extension for Firefox to minimize these, um, challenges. See:

Table2Clipboard – Firefox Extension

Table2Clipboard 0.0.1, by Davide Ficano, released on January 13, 2006


Table2Clipboard preview: you can select non-adjacent cells

Quick Description


Mozilla applications let you select rows and columns from a table simply by pressing the Control key and picking rows/columns with the left mouse button.

The selection can be copied to the clipboard, but the original table layout is lost, producing ugly results when you paste the text into a spreadsheet application (e.g. Excel).

If you want to paste data into Microsoft Excel or OpenOffice Calc with the correct layout, simply use Table2Clipboard.

Pasting into plain text editors is also supported as CSV (you can change the row and column separators from the options dialog).
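For those who would rather script the job than work in the browser, the same table-to-spreadsheet problem can be handled in a few lines of Python with pandas. This is only a sketch: the URL is a placeholder, and pandas.read_html needs an HTML parser such as lxml installed.

# Minimal sketch: pull an HTML table straight into a CSV file.
# The URL is a placeholder; read_html requires lxml (or html5lib) to be installed.
import pandas as pd

url = "http://example.com/page-with-a-table.html"   # hypothetical page
tables = pd.read_html(url)       # one DataFrame per <table> element found
df = tables[0]                   # pick the table you want (here, the first)

# Optional cleanup: strip stray whitespace from text cells
df = df.applymap(lambda v: v.strip() if isinstance(v, str) else v)

df.to_csv("table.csv", index=False)
print(df.head())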


SJ Mercury-News Series: "Tainted Trials, Stolen Justice."
Jan 23rd, 2006 by JTJ

Friend-of-IAJ Griff Palmer alerts us to an impressive series this week that examines the conduct of the DA's office in Santa Clara County, California. If nothing else, the series illustrates why good, vital-to-the-community journalism takes time and is expensive. Rick Tulsky, Griff and other colleagues spent three years — not three days, but YEARS — on the story. Griff writes:

I invite you all to take a look at “Tainted Trials, Stolen Justice.” This five-day series was three years in the making. It starts in today's Mercury News:


http://www.mercurynews.com/mld/mercurynews/news/special_packages/stolenjustice/

Free registration is required to view the Merc's content. I'm not sure yet if this URL will be cumulative or will only point to each day's part. If the latter, I'll work to get the entire package pulled together under one URL.


The Merc's online package includes a multimedia presentation, with Flash graphics, streaming audio and streaming video.


The project's backbone is reporter Rick Tulsky's review of every criminal appeal originating out of Santa Clara County Superior Court for five years. Rick was aided in his review by staff writers Julie Patel and Mike Zapler.


Rick has a law degree, and he used his legal training to analyze these cases for prosecutorial error, defense error and judicial error. He went over the cases with the Santa Clara County District Attorney's Office, defense attorneys and judges. He recruited seasoned criminal justice scholars and former judges and prosecutors to review his findings.



Rick's findings: Santa Clara County's criminal justice system, while far from broken, is systemically troubled by serious flaws that bias the system in prosecutors' favor and, in the worst cases, lead to outright miscarriages of justice. Rick found that more than a third of the 727 cases he analyzed were marred by some form of questionable conduct on the part of prosecutors, defense attorneys or judges. He found that California's Sixth Appellate District routinely found prosecutorial and judicial error to be harmless to criminal defendants — in dozens of instances, resorting to factual distortions and flawed reasoning to reach their conclusions.


This analysis has at least one serious limitation: It doesn't compare Rick's Santa Clara County findings with similar data from any other jurisdiction. It would frankly have been impossible, at least within three years, to conduct a similar case review on a broader scale.



To help us examine how Santa Clara County's criminal justice system differs from those of other counties, I captured 10 years' worth of felony arrest disposition data from the Criminal Justice Statistics Center, maintained by the California Attorney General's Office (http://ag.ca.gov/cjsc/datatabs.htm). I hand-keyed another four years' worth of CJSC data that were available only on paper. (I did a rough estimate at one point and determined that I'd keyed in somewhere in the neighborhood of 10,000 cells of data.)



This analysis showed us that, within the accuracy limitations of the CJSC data, Santa Clara County stood out for having one of the highest conviction rates and one of the lowest judicial dismissal rates among all counties with populations of 100,000 or more.
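Once disposition counts are in a flat file, the county comparison Griff describes is a short pandas exercise. The column names below are hypothetical stand-ins, not the actual CJSC field names:

# Hypothetical sketch of the county comparison; column names are stand-ins,
# not the real CJSC field names.
import pandas as pd

df = pd.read_csv("cjsc_felony_dispositions.csv")     # hypothetical extract

# Keep counties of 100,000 or more residents, as in the Mercury News analysis.
big = df[df["population"] >= 100_000].copy()

big["conviction_rate"] = big["convictions"] / big["dispositions"]
big["dismissal_rate"] = big["court_dismissals"] / big["dispositions"]

ranked = big.sort_values("conviction_rate", ascending=False)
print(ranked[["county", "conviction_rate", "dismissal_rate"]].head(10))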


As Rick's attention turned to the appellate system, my attention was drawn to an interactive database system maintained by the California Administrative Office of the Courts: http://appellatecases.courtinfo.ca.gov/.


I requested a copy of the underlying database from the AOC, only to be stonewalled. Months of effort on our attorneys' part yielded only one summary spreadsheet from the AOC.


Thanks to discussions on this list and at NICAR conferences, I knew it should be possible to programmatically retrieve the contents of the AOC database. With Aron Pilhofer's and John Perry's Perl scripting tutorials, and with lots of generous coaching from John, I put together scripts that harvested the criminal appeals data from the AOC system and parsed it from HTML into delimited files.”


That data retrieval underlies the numbers that appear in the final day of this series.
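Griff's scripts were written in Perl; for readers who want to see the general harvest-and-parse pattern he describes, here is a rough Python sketch. The URL pattern, the record-id range and the regular expression are hypothetical placeholders; the real AOC pages would need their own parsing rules.

# Rough Python sketch of the harvest-and-parse pattern (the original scripts were Perl).
# URL pattern, id range and regex are hypothetical placeholders, not the real AOC layout.
import csv
import re
import time
import urllib.request

BASE = "http://appellatecases.courtinfo.ca.gov/case.cfm?doc_id={}"   # placeholder

def fetch(case_id):
    """Download one case page as text."""
    with urllib.request.urlopen(BASE.format(case_id), timeout=30) as resp:
        return resp.read().decode("utf-8", errors="replace")

# Hypothetical pattern: grab labeled fields such as "Case Number: H012345".
FIELD_RE = re.compile(r"Case Number:\s*([A-Z]\d+).*?Trial Court:\s*([^<]+)", re.S)

with open("appeals.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["case_number", "trial_court"])
    for case_id in range(1000, 1100):          # placeholder range of record ids
        html = fetch(case_id)
        match = FIELD_RE.search(html)
        if match:
            writer.writerow([match.group(1), match.group(2).strip()])
        time.sleep(1)                           # be polite to the server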


Charlotte Mortgage Foreclosures
Jan 21st, 2006 by JTJ

Three-Day Series

http://www.charlotte.com/mld/charlotte/news/special_packages/foreclosure/

summary stolen from (http://www.thescoop.org/)

Charlotte Mortgage Foreclosures
Posted by Derek on January 18th, 2006. Filed under Fed Data, Mapping.

Lisa Hammersly Munn, Binyamin Appelbaum and Ted Mellnik of the Charlotte Observer have a three-part series on mortgage foreclosures, finding that home loan failures have more than quadrupled in Mecklenburg County since 1999. More foreclosures are filed here, per person, than in any other county in the state. On average, 11 Mecklenburg houses are sold in foreclosure auctions every business day. The owners are evicted, their credit ruined, and they face thousands in court fees and moving expenses.

Included with the series is an interactive map of Mecklenburg County foreclosures and a sidebar reporting that local loans from the Federal Housing Administration are failing at almost twice the national rate.



The Numbers Guy – WSJ
Jan 21st, 2006 by JTJ

Wall Street Journal

Richard Holden says journalists at his seminars often don't find the problems in his examples of poorly presented numbers from newspapers. “I'm surprised with professional newspaper people, how frequently it goes right over their head,” he tells Carl Bialik. “Many times, I'm greeted by 30 blank stares.” Part of the problem is embedded in the culture of the profession, he says. “Journalists always prided themselves on knowing so little about math.”



More lies, loudly spoken, from the Bush Administration?
Jan 20th, 2006 by JTJ

Posted on Thu, Jan. 19, 2006

Feds dispute mine safety report

By SETH BORENSTEIN and LINDA J. JOHNSON


Knight Ridder Newspapers


http://www.montereyherald.com/mld/montereyherald/news/nation/13661497.htm



WASHINGTON – Federal mine safety officials on Wednesday disputed a Knight Ridder analysis showing a dramatic reduction in the dollar amount of large fines for mine safety violations during the Bush administration, saying in an Internet posting that those fines are actually up.



Mine Safety and Health Administration spokesman Dirk Fillpot said that Knight Ridder made “assumptions that were incorrect” in its Jan. 6 analysis. But when Knight Ridder conducted a new analysis in the manner suggested by Fillpot using MSHA's newest database, it showed the same dramatic drop.



The newest data show a 43 percent reduction in proposed median major fines from the last five years of the Clinton administration when compared with the first five years of the Bush administration. That's the same percentage reduction found in Knight Ridder's original analysis, using a smaller, online database of MSHA violations.



When asked about that drop and the analysis, Fillpot refused Wednesday to answer 11 specific questions about MSHA's fines, its analysis or the posting of its critique. Instead Fillpot repeated a prepared statement that said “it is unfortunate that Knight Ridder's analysis of MSHA's penalties was inaccurate.”



But four statistical experts who looked at the databases and analyses said Knight Ridder's findings were accurate and that MSHA's assessment didn't contradict the newspaper's findings of smaller fines during the Bush administration.



“It's really wrong for them (MSHA) to say you're incorrect,” said John Grego, a professor of statistics at the University of South Carolina in Columbia. “There's no question that the average/median proposed penalty has gone down.”



MSHA's response “is looking at two different things and making a statement as if they are looking at the same thing,” said Jeff Porter, a database library director for Investigative Reporters and Editors Inc., an association of journalists. Porter also teaches data analysis at the University of Missouri School of Journalism.



On its Web site, MSHA said the size of the final assessments — which are lower after bargaining and appeals — is up by “nearly 38 percent.”



Knight Ridder looked only at proposed fines because some of the actual fines are determined not by MSHA, but by administrative judges when mining companies appeal those penalties. Further, Knight Ridder found that the fines finally assessed and paid are still lower on average in the Bush administration.



Fillpot wouldn't explain how his agency came up with the 38 percent figure.



The statistical experts said they couldn't understand how MSHA figured that out. Fillpot said “that information is taken from actual MSHA enforcement records and is accurate.” He refused to elaborate.



In an unusual posting on the Internet on the Martin Luther King Jr. Day holiday on Monday, MSHA said, “Knight-Ridder's numbers are inaccurate, obscuring the reality that penalties issued by MSHA have gone up during this Administration — not down.”



After Knight Ridder questioned the posting, it was taken down Tuesday afternoon. It went back up Wednesday morning. Among fines of $10,000 or more, the median penalty levied in the past five years was $27,139. During the last five years of the Clinton administration, the comparable fine was $47,913, according to Knight Ridder's analysis of the newest data from MSHA.
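Those two medians are consistent with the 43 percent figure reported above; the check is a one-liner:

# Percent drop in the median large fine, Clinton era vs. Bush era (figures from the story).
clinton_median, bush_median = 47913, 27139
print(f"Reduction: {1 - bush_median / clinton_median:.0%}")   # about 43%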



That data, which included 221 large fines that weren't in the publicly available database used by Knight Ridder for its initial analysis, show that the total number of large fines increased to 527 in the Bush administration from 461 during the last five Clinton years.



Fillpot declined to say where those extra fines came from or why they weren't in the online database.

(Johnson reports for Lexington Herald-Leader.)


 

Mr. Google, may I introduce you to Ms. Associated Press
Jan 19th, 2006 by JTJ

From the Librarians' Internet Index



“AP News and Google Maps Mashup

This mashup plots selected current Associated Press (AP) news stories superimposed on a Google map or satellite image of the United States. It includes national news, sports, business, technology, and “strange” stories. Clicking on a marker provides a synopsis with a link to the full story as hosted on the site for the San Francisco Chronicle. From a software developer with a degree in computer science.


URL: http://81nassau.com/apnews/

LII Item: http://lii.org/cs/lii/view/item/20125”





Geospatial data on human interactions with the environment
Jan 16th, 2006 by JTJ

From CCA:

Socioeconomic Data and Applications Center, or SEDAC, is a branch of NASA that offers geospatial data on human interactions with the environment. World datasets that are available for download include population, urban development and wilderness areas. Other data focus on a specific area of the world. Most of the datasets seem to be in some sort of grid or e00 format. Some of the sites also offer maps of the data.

http://sedac.ciesin.columbia.edu/

“SEDAC projects are designed to help users synthesize and apply earth science and socioeconomic data and information in their research, educational activities, analysis and decision making. These projects include data products and applications that address various types of interdisciplinary data integration.”


