Resources related to Crime Mapping
Dec 7th, 2005 by Tom Johnson

We don't know if there has yet been any empirical research on how
interested media consumers are in online crime mapping — or how good the coverage is — but there is a body of
literature debating readers' interest in crime per se.  It would
seem to be a pretty good bet, though, that if people are interested in
crime AND if more and more are going online via broadband, then
some dynamic crime maps would get some hits.

Remember, though, that crime mapping is not just about pushing digital push-pins onto a
map, GoogleMap or otherwise.  “Journey to Crime” maps, or maps
showing where a car was stolen and when and where it was recovered, can provide
interesting insights.
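At its simplest, a “Journey to Crime” analysis starts with the distance and direction between two points, say, where a car was stolen and where it was recovered. A minimal sketch (the coordinates below are invented for illustration):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing (0 = north) from point 1 to point 2."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlmb = math.radians(lon2 - lon1)
    y = math.sin(dlmb) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlmb)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

# Hypothetical theft/recovery pair: stolen downtown, recovered to the northeast.
stolen = (42.331, -83.046)
found = (42.420, -82.910)
dist = haversine_km(*stolen, *found)
direction = bearing_deg(*stolen, *found)
print(f"{dist:.1f} km, bearing {direction:.0f} degrees")
```

Aggregated over many incidents, those distances and bearings are exactly what the journey-to-crime literature linked below analyzes.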

Here are some links recently posted to the CrimeMapping listserv that could be of value to journalists:

Journey-after-crime: How Far and to Which Direction DO They Go?

Linking Offender Residence Probability Surfaces to a Specific Incident Location

Journey to Crime Estimation

Applications for Examining the Journey-to-Crime Using Incident-Based Offender Residence Probability Surfaces

The Geography of Transit Crime:


Indirect indicators. Or maybe not.
Dec 5th, 2005 by Tom Johnson

We journalists have a tendency to be too literal.  We want to ask a
question and we want the response to be a quote that is without
ambiguity, one that fills in some of the space between our
anecdotes.  But other times we need tools that work like a
periscope, a device that allows us to look at an object not directly
but through a helpful lens.  Such periscopes for analyzing the
economy are indirect indicators.

The NYTimes' Business Section (5 Dec. 2005) was loaded with references to
such indicators that journos could keep in mind when looking for
devices to show and explain what's happening.  Check out
“What's Ahead: Blue Skies, or More Forecasts of Them?”  Be sure to click on the link “Graphic: Indicators From Everyday Life.”

Another indirect indicator was mentioned Sunday on National Public Radio in “Economic Signs Remain Strong.”
There, an economist said he tracks changes in the “titanium dioxide” data: the compound is used in all white paint, so its sales reflect manufacturing production.
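An indirect indicator is only worth watching if it actually tracks the series you care about, and that is easy to sanity-check. A toy sketch, with invented numbers standing in for pigment shipments and a manufacturing index:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

proxy = [9.8, 10.1, 10.6, 10.4, 11.0, 11.3]   # e.g. pigment shipments (made up)
production = [97, 101, 105, 104, 109, 112]    # manufacturing index (made up)
print(f"r = {pearson_r(proxy, production):.2f}")
```

A correlation near 1.0 would suggest the proxy is a usable periscope; near zero, it is just an anecdote.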

Tilling the soil makes for fertile crops, Congressionally speaking.
Dec 5th, 2005 by Tom Johnson

Kudos to Derek Willis and Adrian Holovaty of The Washington Post for the site “U.S. Congress Votes Database.”  One element we find of recent and special interest is the “late night votes”
variables for both the House and Senate.  With a little more
probing and data slicing and dicing, it would make an interesting bit
of visual
statistics/infographics to do a longitudinal comparison of the time of
votes in various congresses.
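Such a longitudinal comparison could start from nothing more than timestamped roll-call records. A minimal sketch, with invented records and an assumed definition of “late night” (10 p.m. to 6 a.m.):

```python
from collections import Counter
from datetime import datetime

# Hypothetical vote records: (congress number, timestamp of the roll call).
votes = [
    (108, "2004-11-20 23:55"),
    (108, "2004-11-21 00:12"),
    (109, "2005-12-19 01:43"),
    (109, "2005-06-15 14:30"),
]

def is_late_night(ts, start=22, end=6):
    """Count a vote as 'late night' if it falls between 10 p.m. and 6 a.m."""
    hour = datetime.strptime(ts, "%Y-%m-%d %H:%M").hour
    return hour >= start or hour < end

late_by_congress = Counter(c for c, ts in votes if is_late_night(ts))
for congress, n in sorted(late_by_congress.items()):
    print(f"{congress}th Congress: {n} late-night votes")
```

Run against the full database, the resulting counts per congress are the raw material for the longitudinal infographic suggested above.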

This site/searchable database is a fine example of how investing in some basic data preparation
can create the potential for a ton of stories.  Why, for example, do
Democrats have such a preponderance (18 out of 20) of Representatives on the House's “missed
votes” list, but only 9 out of 20 on the similar list for the Senate?

This is
also a fine example of how a newspaper can do good things for itself
while doing good things for the community and readers.  This
database gives the WP reporters and editors a quick look-up of
Congressional activity, the kind of fact and detail that can enrich a
story.  At the same time, citizens can turn to this value-added
form of the public record to answer their own questions.

Derek Willis wrote to the news librarians listserv:

“It's not part of a story or series, but the Post today launched a site
that may prove useful to your newsrooms or even as an inspiration to
learn Python: a congressional votes database that covers the
102nd-109th congresses (1991-present). Currently browsable, we're
working on adding a search engine and other features to it. Adrian
Holovaty, who works for washingtonpost.com, and I assembled the data
and he built the web framework to display it. All of the data is
gathered using Python, the database backend is PostgreSQL and the web
framework is Django.”

Decentralized, complex adaptive systems meet realpolitik and journalism. Finally.
Dec 3rd, 2005 by Tom Johnson

A couple of articles have passed across our desk in recent days that
illustrate the impact of — and the importance of understanding —
decentralized (or “distributed”) systems and
complex adaptive systems.

For starters, take a look at “Reinventing 911:
How a swarm of networked citizens is building a better emergency broadcast system.”

Author Gary Wolf writes:

“I've been talking with security experts about one of the thorniest
problems they face: How can we protect our complex society from massive
but unpredictable catastrophes? The homeland security establishment has
spent an immeasurable fortune vainly seeking an answer, distributing
useless, highly specialized equipment, and toggling its multicolored
Homeland Security Advisory System back and forth between yellow, for
elevated, and orange, for high. Now I've come [to Portland, Oregon] to take a look at a
different set of tools, constructed outside the control of the federal
government and based on the notion that the easier it is for me to find
out about a loose dog tying up traffic, the safer I am from a terrorist attack.

“To understand the true nature of warnings, it helps to see them not
as single events, like an air-raid siren, but rather as swarms of
messages racing through overlapping social networks, like the buzz of
gossip. Residents of New Orleans didn't just need to know a hurricane
was coming. They also needed to be informed that floodwaters were
threatening to breach the levees, that not all neighborhoods would be
inundated, that certain roads would become impassable while alternative
evacuation routes would remain open, that buses were available for
transport, and that the Superdome was full.

“No central authority possessed this information. Knowledge was
fragmentary, parceled out among tens of thousands of people on the
ground. There was no way to gather all these observations and deliver
them to where they were needed. During Hurricane Katrina, public
officials from top to bottom found themselves locked within
conventional channels, unable to receive, analyze, or redistribute news
from outside. In the most egregious example, Homeland Security
secretary Michael Chertoff said in a radio interview that he had not
heard that people at the New Orleans convention center were without
food or water. At that point they'd been stranded two days.

“By contrast, in the system Botterell created for California,
warnings are sucked up from an array of sources and sent automatically
to users throughout the state. Messages are squeezed into a standard
format called the Common Alerting Protocol, designed by Botterell in
discussion with scores of other disaster experts. CAP gives precise
definitions to concepts like proximity, urgency, and certainty.
Using CAP, anyone who might respond to an emergency can choose to get
warnings for their own neighborhood, for instance, or only the most
urgent messages. Alerts can be received by machines, filtered, and
passed along. The model is simple and elegant, and because warnings can
be tagged with geographical coordinates, users can customize their cell
phones, pagers, BlackBerries, or other devices to get only those
relevant to their precise locale.”
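Real CAP is an XML format with a richer schema than shown here, but the filter-and-route idea Wolf describes can be sketched with a small invented subset of its fields (headline, urgency, coordinates):

```python
from dataclasses import dataclass

# CAP defines urgency values including "Immediate", "Expected" and "Future".
URGENCY_RANK = {"Immediate": 3, "Expected": 2, "Future": 1}

@dataclass
class Alert:
    headline: str
    urgency: str
    lat: float
    lon: float

def nearby_urgent(alerts, my_lat, my_lon, min_urgency="Expected", box_deg=0.5):
    """Keep alerts at or above a given urgency within a crude lat/lon box."""
    floor = URGENCY_RANK[min_urgency]
    return [a for a in alerts
            if URGENCY_RANK[a.urgency] >= floor
            and abs(a.lat - my_lat) <= box_deg
            and abs(a.lon - my_lon) <= box_deg]

feed = [
    Alert("Levee breach reported", "Immediate", 29.97, -90.05),
    Alert("Shelter at capacity", "Expected", 29.95, -90.08),
    Alert("Road work next week", "Future", 30.40, -91.10),
]
for a in nearby_urgent(feed, 29.95, -90.07):
    print(a.headline)
```

The point is the decentralization: any device that can evaluate a filter like this can subscribe to the swarm of messages and pass the relevant ones along.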

Second item of interest:
I'm sure many of you noted Dexter Filkins' Page One lead story in the NYT on
Friday, 2 Dec. 2005.  The online version headline is
“Profusion of Rebel Groups Helps Them Survive in Iraq.”
That, unfortunately, lacks the truth and insight of the print version headline:
“Loose Structure of Rebels Helps Them Survive in Iraq — While Al Qaeda Gains Attention, Many Small Groups Attack on Their Own.”

It seems that finally someone in the journalism community has figured out
that what's happening in Iraq — and around the world — is a
decentralized, complex adaptive system.  Too bad journalists — journalism educators, students and professionals — haven't been exposed to the
concepts and vocabulary needed to really present the problem in all its, ahem, complexity.

Creative analytic techniques
Nov 25th, 2005 by JTJ

A recent edition of MIT's Technology Review
tells a tale with direct parallels to analytic journalism.  That
is, investigators bringing well-known and established analytic tools to
new applications.  In this case, using computer scans to conduct
“visual autopsies.”  See:

“Dead Men Do Tell Tales: Virtual autopsies reveal clues that forensic pathologists might miss,” by John Gartner.

Taking games seriously
Nov 23rd, 2005 by Tom Johnson

Serious Games Initiative

The Serious Games Initiative is focused on uses for games
in exploring management and leadership challenges
facing the public sector. Part of its overall charter
is to help forge productive links between the
electronic game industry and projects involving the use of
games in education, training, health, and public policy.

Says information specialist Marylaine Block:

“As one who believes nobody should be allowed to run for office until they have played
Sim City for at least six months, I think such games have enormous
potential for helping people explore complex social problems and possible solutions.”


Growth opportunity (of the intellectual sort) for journalists
Nov 18th, 2005 by Tom Johnson

With newspapers — and news magazines — cutting staff on
an almost weekly basis, some of us in journalism are going to have to
reinvent ourselves.  One of our tenets of Analytic Journalism is
simulation modeling, a methodology and analytic tool we believe will be
to the social sciences in the 21st century (and journalism IS a social
science) what quantum physics was to the hard sciences in the
20th.  So here's an interesting opportunity for someone.

“The Department of Mathematics at the University of California, Los
Angeles is soliciting applications for a postdoctoral fellowship
position in Mathematical and Computational Social Science.  The
qualified applicant will work in the UC Mathematical and Simulation
Modeling of Crime Group (UCMaSC), a collaboration between the UCLA
Department of Mathematics, UCLA Department of Anthropology, UC
Irvine Department of Criminology, Law and Society and the Los
Angeles Police Department to study the dynamics of crime hot spot
formation.  The research will center on (1) development of formal
models applicable to the study of interacting particle systems, or
multi-agent systems, (2) simulation of these systems and (3)
directed empirical testing of models using contemporary crime data
from Los Angeles and other Southern Californian cities.

The initial appointment is for one year, with possible renewal for
up to three years.  For information regarding the UCMaSC Group visit:

DUTIES: Work closely with an interdisciplinary team of
mathematicians, social scientists and law enforcement officials to
develop new mathematical and computational methodologies for
understanding crime hot spot formation, diffusion and dissipation.
Responsibilities include teaching one course in the Department of
Mathematics per year, publication and presentation of research
results.

REQUIRED: A recent Ph.D. in Mathematics, Physics or a related
field.  The qualified applicant is expected to have research
experience in one or more areas that would be relevant to the study
of interacting particle/multi-agent systems including, but not
limited to, mathematical and statistical physics, complex systems,
and partial differential equations modeling.  The applicant is also
required to have advanced competency in one or more programming
languages/environments (e.g., C++, Java, Matlab).

Qualified candidates should e-mail a cover letter, CV and the phone
numbers, e-mail addresses, and postal addresses of three
individuals who can provide recommendations to:

Dr. P. Jeffrey Brantingham
Department of Anthropology
341 Haines Hall
University of California, Los Angeles
Los Angeles, CA 90095”
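To give a feel for what “interacting agents and hot spot formation” means in practice, here is a toy model in the same spirit (not the UCMaSC group's actual mathematics, and every parameter below is invented): each crime event raises the local attractiveness of its cell, attractiveness decays over time, and offenders preferentially strike attractive cells, so clusters reinforce themselves.

```python
import random

random.seed(1)
CELLS, STEPS, OFFENDERS = 30, 200, 5
DECAY, BOOST, BASE = 0.95, 1.0, 0.1   # illustrative parameters only

attract = [BASE] * CELLS   # current attractiveness of each cell
hits = [0] * CELLS         # cumulative crime events per cell

for _ in range(STEPS):
    total = sum(attract)
    for _ in range(OFFENDERS):
        # Choose a target cell with probability proportional to attractiveness.
        r = random.uniform(0, total)
        acc = 0.0
        for i, a in enumerate(attract):
            acc += a
            if r <= acc:
                attract[i] += BOOST   # a crime makes the cell more attractive
                hits[i] += 1
                break
    # Attractiveness decays back toward the baseline each step.
    attract = [BASE + (a - BASE) * DECAY for a in attract]

top = sorted(range(CELLS), key=hits.__getitem__, reverse=True)[:3]
print("hot spots (cells with most events):", top)
```

Even this crude positive-feedback loop produces a few dominant cells out of an initially uniform street, which is the qualitative phenomenon the posting's models study rigorously.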

Yes, Virginia, methodology DOES matter
Nov 10th, 2005 by JTJ

A piece on calling the elections in Detroit:

MAKING A FORECAST: A secret formula helps producer call the election right



November 10, 2005

What was a viewer to believe?

As polls closed Tuesday, WDIV-TV (Channel 4) declared Freman Hendrix winner of Detroit's mayoral race by 10 percentage points.

WXYZ-TV (Channel 7) showed Hendrix ahead by 4 percentage points, statistically too close to call.

But WJBK-TV (Channel 2) got it right, declaring just after 9 p.m. that
Mayor Kwame Kilpatrick was ahead, 52% to 48%, which turned out to be
almost exactly the final 53%-47% outcome declared many hours later.

And it was vote analyst Tim Kiska who nailed it for WJBK, and for WWJ-AM radio, using counts from 28 of 620 Detroit precincts.

Kiska did it with help from Detroit City Clerk Jackie Currie. She
allowed a crew that Kiska assembled to collect the precinct tallies
shortly after the polls closed at 8 p.m.

Using what he calls a secret formula, Kiska calculated how those 28 precincts would predict the result citywide.

His formula also assumed that absentee voters chose Hendrix over Kilpatrick by a 2-1 ratio.
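Kiska's actual formula is secret, but the general shape of such a projection is straightforward: take real counts from a handful of early-reporting sample precincts, correct for how those precincts historically lean relative to the city, then fold in an assumed absentee split. A sketch with entirely invented numbers:

```python
# (kilpatrick_votes, hendrix_votes) from hypothetical sample precincts.
sample = [(512, 388), (450, 290), (505, 420), (398, 342)]

# Suppose past elections show the sample runs ~2 points more pro-Kilpatrick
# than the city as a whole; subtract that lean from his sample share.
historical_lean = 0.02

k = sum(a for a, b in sample)
h = sum(b for a, b in sample)
k_share = k / (k + h) - historical_lean

# Fold in absentees at the assumed 2-1 split for the opponent.
absentee_frac = 0.15        # assumed absentee share of the total vote
absentee_k_share = 1 / 3    # 2-1 for Hendrix

projected = (1 - absentee_frac) * k_share + absentee_frac * absentee_k_share
print(f"projected Kilpatrick share: {projected:.1%}")
```

The two assumptions, the historical lean of the sample precincts and the absentee split, carry all the risk; Kiska's edge was presumably in calibrating them well.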

That's different from the methods of pollsters who got it wrong
Tuesday, Steve Mitchell for WDIV and EPIC/MRA's Ed Sarpolus for WXYZ
and the Free Press. Both men used telephone polls, calling people at
home during the day and evening and asking how they voted.

It's a more standard method of election-day polling, but Tuesday proved treacherous.

Kiska, a former reporter for the Free Press and Detroit News, has done
such election-day predictions since 1974, but said he was nervous.

“Every time I go into one of these, my nightmare is I might get it
wrong,” said Kiska, a WWJ producer. “I had a bad feeling about this
going in. I thought there was going to be a Titanic hitting an iceberg
and hoping it wouldn't be me.”

Kiska said he especially felt sorry for his friend Mitchell.

Mitchell said he's been one of the state's most accurate political
pollsters over 20 years, but said his Tuesday survey of 800 voters
turned out to be a bad sample.

He said polling is inherently risky, and that even well-conducted polls
can be wrong one out of 20 times. “I hit number 20 this time.”

For Sarpolus, it's the second Detroit mayoral race that confounded his
polls. He was the only major pollster in 2001 who indicated Gil Hill
would defeat Kilpatrick.

Sarpolus said the pressure to get poll results on the air quickly made
it impossible to adjust his results as real vote totals were made
public during the late evening.

Of Kiska, Sarpolus said: “You have to give him credit. … But you have to assume all city clerks are willing to cooperate.”

Contact CHRIS CHRISTOFF at 517-372-8660 or

Digital detectives
Nov 3rd, 2005 by JTJ

For those interested in the forensic process — and in this case, computer
forensics — be sure to check out this fine, fine piece of digital
detective work by Mark Russinovich, a computer security expert with
Sysinternals.  He
discovered evidence of a “rootkit” on his Windows PC.

We don't think journalists need to know how to DO this kind of
deep-diving probing, but we should be aware that it is possible
and, broadly speaking, of its methods, if only to know the appropriate
search terms.

Through heroic forensic work,
he traced the code to First 4 Internet, a British provider of
copy-restriction technology that has a deal with Sony to put digital
rights management on its CDs. It turns out Russinovich was infected
with the software when he played the Sony BMG CD
Get Right With the Man by the Van Zant brothers.

Here's WIRED Magazine's take on the story, “The Cover-Up Is the Crime.”

And here's what Dan Gillmor had to say about it, with additional links.

In the tradition of William Playfair and Charles Joseph Minard….
Oct 26th, 2005 by Tom Johnson

Matthew Ericson of the NYTimes has delivered yet again a piece of superb
infographics.  This one, sadly, illustrates the 2,000+ U.S. deaths
in Iraq.  (See “Deaths in Iraq by Month” in the 26 Oct. 2005 story
“2,000 Dead: As Iraq Tours Stretch On, a Grim Mark”)

William Playfair
(1759-1823) was the
Scottish engineer and political economist who did the ground-breaking work in visual statistics.  Charles Joseph Minard, in the mid-nineteenth century, produced the classic infographic of
Napoleon's March to (and retreat from) Moscow.  Minard's great
work is notable for displaying multiple data sets on a timeline as
well as their geographical relationships.

Ericson has done something similar by showing the combat deaths in Iraq
from the March 2003 invasion until mid-Oct. 2005 as the occupation
continues.  Ericson shows not just the numbers, but the branch of
service, the locations of the deaths and the causes of death (i.e.
explosive devices, vehicle or plane crashes, etc.).

It's a brilliant piece of work that also demonstrates the added value
that very good journalists and their editors can bring to what should
be public discussion.  But this kind of work doesn't happen
overnight, nor is it cheap to do.  (Are you listening
Knight-Ridder, Gannett, et al.?)

We would only hope that someone at the Times would work to develop a
Flash program/presentation that would, in a relatively automatic
manner, constantly update this important informational display.
