AJ Tool-of-the-Week: Furl – Online bookmarking tool
May 25th, 2005 by JTJ

We've been using a variety of web-based bookmarking tools for the past four or five years, tools like the now-departed Blink and Backflip. They were all OK (so long as they remained financially viable), but never quite seemed to meet all our needs. Recently, though, we learned about Furl (www.furl.net), and we like what we see. Furl is in beta, so we don't know what the ultimate price will be, but journalists will like the ease with which you can pull URLs off a web page, mark up those saved pages with keywords, copy and paste webpage annotations, and then save the citation in a folder of your own making. Oh yeah, you can also save and e-mail the link(s) to anyone. In fact, we like Furl so much that we will be demo-ing it next week at the IRE conference in Denver.

As the Furl gang says:
“Furl will archive any page, allowing you to recall, share, and discover useful information on the Web. Browse your personal archive of Web pages, and subscribe to other archives via RSS.”

Check it out.


 

So why can't this sourcing thing be fixed?
May 23rd, 2005 by JTJ

It can. 

The NYT this morning tells us that “Big News Media Join in Push to Limit Use of Unidentified Sources.”  Readers are told:

“Concerned that they may have become too free in granting anonymity to sources, news organizations including USA Today, The Washington Post, The Los Angeles Times, NBC News and The New York Times are trying to throttle back their use.

“But some journalists worry that these efforts could hamper them from doing their jobs – coming in a hothouse atmosphere where mistrust of the news media is rampant, hordes of newly minted media critics attack every misstep on the Web, and legal cases jeopardize their ability to keep unnamed news sources confidential….

“Last year, The New York Times adopted a more stringent approach to its treatment of confidential sources, including a provision that the identity of every unidentified source must be known to at least one editor. A committee of the paper's journalists recently recommended that the top editors put in place new editing mechanisms to ensure that current policies are enforced more fully and energetically.”

We look forward to these “new editing mechanisms.”

Yes, policies on unnamed sources should be made, those policies should be clear, and everyone in the newsroom should know what they are. But more often (as in “every day”), editors must know who the sources — indeed, all the sources — are for a story, how to reach those sources and how to verify what the reporter wrote, even if the reporter is out-of-pocket.

This is not difficult if journalists recognize that a PC-based word processing application already has the tools to assist in this “Who Are The Sources” mission. (If the publication is still using something like the old Coyote terminals, sorry, we probably can't help you.)

The tool is the “comment” function in the word processor. While the newsroom is making policies about sourcing, add this one: “Every paragraph of every story will end with an embedded comment. That comment will show editors exactly how the reporter knows what he or she just wrote.” The comment might include a source's name, phone number and date-time-place of interview. The comment might include a URL or a bibliographic citation. It might include a reference to the specific reporter's notebook. But in the end, the comments should be sufficient that an editor can “walk the cat backward” to determine exactly how the reporter knows what he or she just wrote. Doing so helps prevent unwarranted assumptions and errors of fact, if not interpretation.
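
For newsrooms working in modern Word files, here is a minimal sketch (not anyone's production tooling) of how an editor might audit such a story automatically. It assumes a hypothetical file named story.docx and relies only on the fact that a .docx file is a zip archive containing word/document.xml (the paragraphs) and word/comments.xml (the comments):

```python
# Minimal sketch: list each paragraph of a .docx story and flag any paragraph
# that carries no embedded sourcing comment. "story.docx" is a hypothetical name.
import zipfile
import xml.etree.ElementTree as ET

W = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

def audit_sourcing(path):
    with zipfile.ZipFile(path) as docx:
        doc = ET.fromstring(docx.read("word/document.xml"))
        try:
            comments = ET.fromstring(docx.read("word/comments.xml"))
        except KeyError:
            comments = None  # the file has no comments part at all

    # Map comment id -> comment text, so an editor can read the sourcing notes.
    notes = {}
    if comments is not None:
        for c in comments.findall(f"{W}comment"):
            text = "".join(t.text or "" for t in c.iter(f"{W}t"))
            notes[c.get(f"{W}id")] = text

    # Walk the story paragraph by paragraph; a w:commentReference in the
    # paragraph means a comment is anchored there.
    for i, p in enumerate(doc.iter(f"{W}p"), start=1):
        para_text = "".join(t.text or "" for t in p.iter(f"{W}t")).strip()
        if not para_text:
            continue  # skip empty paragraphs
        refs = [r.get(f"{W}id") for r in p.iter(f"{W}commentReference")]
        if refs:
            for cid in refs:
                print(f"Par. {i}: sourced -> {notes.get(cid, '')}")
        else:
            print(f"Par. {i}: NO SOURCING COMMENT: {para_text[:60]}")

if __name__ == "__main__":
    audit_sourcing("story.docx")  # hypothetical file name
```

The same walk-the-paragraphs check could just as easily be run by a copy desk script before a story moves, so the policy is enforced mechanically rather than from memory.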

There will be those of the Burn-Your-Notes School of libel defense who will contend this comment thing is suicidal. We would suggest, first, that very few stories ever become court cases. Second, remember that truth is the first defense in libel actions, and it is our responsibility to deliver that truth.



Figuring the odds
May 20th, 2005 by JTJ

Last week, NOAA predicted a serious hurricane season a'comin' in the Atlantic, which has implications for the entire U.S. East Coast. That's last week's news, but if one lives in California, Mexico, Central America or Japan, then today there's always the possibility of a major shaker. And those are just the risks imposed by nature. Modeling these and other hazards of life is the mission of RMS, a fascinating California company demonstrating innovative thinking and analytic tools.

“RMS brings together a unique, multidisciplinary team of experts to create solutions for its clients’ natural hazard and financial risk management challenges. We are the technical leader in our market, with over 100 engineers and scientists devoted to the development of risk models. Of this number, approximately fifty percent hold advanced degrees in their field of expertise.

“Our specialists track research among leading experts and academic institutions worldwide, and supplement this knowledge with internal R&D to ensure that our models provide the most complete and accurate quantification of risk.”

Yup — our kind of guys.  Examples of the output of these “risk models” can be found here.  Of special interest to U.S. journalists are the Catastrophe Risk maps.  (They are a bit too small to read in detail, but big enough to get the gist of some of the RMS product.)

We hope to report more next week about RMS, how it does what it does and how there might be some synergy there for analytic journalists.



Doing well by doing good
May 19th, 2005 by JTJ

Here at the IAJ we believe one of the reasons people come to newspapers or broadcast stations is to get the data which, upon analysis, they can turn into information that helps them make decisions. Ergo, the more meaningful data a journalistic institution can provide, the greater value that institution has for a community.




A good example arrived today thanks to Tara Calishain, creator of ResearchBuzz.  She writes:

** Getcher Cheap Gas Prices on Google Maps

<http://www.researchbuzz.org/getcher_cheap_gas_prices_on_google_maps.shtml>



“Remember when I was saying that I would love a Gasbuddy / Google Maps mashup that showed cheap gas prices along a trip route? Turns out somebody has already done it — well, sorta. You can specify a state, city (only selected cities are available) and whether you're looking for regular or diesel fuel. Check it out at http://www.ahding.com/cheapgas/”

The data driving the map is ginned up by GasBuddy.com. It's not clear how or why GasBuddy gets its data, but it offers some story potential for journalists and data for news researchers. It has an interesting link to dynamic graphs of gas prices over time.

Surely the promotion department of some news organization could grab onto this tool, tweak it a bit, promote the hell out of it, and drive some traffic to, and build loyalty for, the organization's web site.

That's the obvious angle, but what if some enterprising journo started to ask some questions of the data underlying the map? What's the range in gas prices in our town/state? (In Albuquerque today, the range was from $2.04 to $2.28.) Are there any demographic or traffic-flow match-ups to that price range? How 'bout the variance by brand?

Would readers appreciate this sort of data? We think so, especially if there were an online sign-up and the news provider delivered the changing price info via e-mail or IM, much like Travelocity tells us when airline ticket prices change by TK dollars.
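
As a back-of-the-envelope illustration of those questions (the citywide range, the spread by brand), here is a short Python sketch. It assumes a hypothetical CSV named gas_prices.csv with station, brand and price columns, collected by hand or scraped from a source such as GasBuddy; it is not an official GasBuddy feed or API:

```python
# Sketch: summarize local gas prices citywide and by brand from a hypothetical CSV.
import csv
from collections import defaultdict
from statistics import mean, pstdev

def summarize(path="gas_prices.csv"):  # hypothetical file name
    by_brand = defaultdict(list)
    prices = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            p = float(row["price"])
            prices.append(p)
            by_brand[row["brand"]].append(p)

    # Citywide range, e.g. the $2.04-to-$2.28 spread mentioned above.
    print(f"Citywide: low ${min(prices):.2f}, high ${max(prices):.2f}, "
          f"range ${max(prices) - min(prices):.2f}")

    # Average and spread by brand, one line per brand.
    for brand, ps in sorted(by_brand.items()):
        spread = pstdev(ps) if len(ps) > 1 else 0.0
        print(f"{brand:12s} avg ${mean(ps):.2f}  spread ${spread:.2f}  n={len(ps)}")

if __name__ == "__main__":
    summarize()
```

A news researcher could rerun something like this daily and watch how the brand averages move, which is exactly the kind of recurring, local, data-driven item readers could sign up for.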






FYI: Economic Models and Base Closings Teleconference
May 17th, 2005 by JTJ

Regional Economic Models Inc. cordially invites you to join us on June 7th for a teleconference regarding Base Realignment and Closure (BRAC). On Friday, May 13th, the Department of Defense released its recommendations to the BRAC commission. We feel that a discussion of BRAC studies and analysis methods would be helpful to a number of communities.




Topics to be discussed include:

– Demographic effects of active military, reservists, & dependents.

– Migration effects of re-alignment or closures.

– Dynamic effects of government spending over time.

– The impacts of lost or reduced civilian contracts.

– Previous BRAC studies using the REMI model.

– Other topics by REMI Guest Speakers.



A presentation will be sent out before the call in order to direct and facilitate discussion. There will be two teleconferences taking place on the 7th, one at 10 a.m. and one at 4 p.m. EST, hosted by Frederick Treyz and Jonathan Lee.




There is no fee for participation, but space is limited. If you are planning on joining us or would like to participate in the discussion, please respond to this e-mail, register online at www.remi.com, or contact us by phone at (413) 549-1169.




We look forward to speaking with you in June!



Yours truly,

Frederick Treyz, Ph.D.

Chief Executive Officer

Regional Economic Models, Inc.

306 Lincoln Ave.

Amherst, MA 01002

T. 413-549-1169

F. 413-549-1038

Fredtreyz@remi.com

www.remi.com




"Flashing" the human body
May 16th, 2005 by JTJ

The power of good infographics is that they can greatly aid in the upstream aspects of journalism — providing insight for journalists to understand what's happening with a particular phenomenon — and then downstream, to help journalists tell the story and the audience to understand it.




The Digital Revolution has upped the ante far beyond what good ol' Leonardo was using and envisioning. One of the innovators in today's datasphere is Alexander Tsiaras. A recent story in Digital Journal has this to say about Tsiaras's company, Anatomical Travelogue:



“Digital Journal — At ideaCity04, one presenter was so overflowing with information that host Moses Znaimer had to enter stage right and patiently sit beside him, a silent reminder to wrap it up. But you couldn’t ask Alexander Tsiaras to gloss over the wonders of the human body, from blood flow to cell mutation.

“During his presentation, he showed images from his visualization software company Anatomical Travelogue, whose clients include Nike, Pfizer and Time Inc. Tsiaras and his 25 employees take data from MRI scans, spiral CT scans and other medical imaging technologies, and use them to create scientifically accurate 3D pictures and animations.

“In 2003, his book of images of fetal development, From Conception to Birth, sold 150,000 copies, and his latest work is Part Two of this fantastic voyage, The Architecture and Design of Man and Woman. For a chapter on sex, Tsiaras even scanned an employee doing the deed with his girlfriend — all in the name of science.”



Online course in epidemiology
May 16th, 2005 by JTJ

Jump into the study of epidemiology with Prof. David Kleinbaum and Prof. Nancy Barker in the online course “Fundamentals of Epidemiology” at statistics.com, June 10 – July 15. Using their electronic textbook “ActivEpi”, this introductory course emphasizes the underlying concepts and methods of epidemiology. Topics covered include study designs (clinical trials, cohort studies, case-control studies, and cross-sectional studies) and measures of disease frequency and effect.



Dr. Kleinbaum, professor at Emory University, is internationally known for his textbooks in statistical and epidemiologic methods and also as an outstanding teacher. He is the author of “Epidemiologic Research: Principles and Quantitative Methods”, “Logistic Regression: A Self-Learning Text”, and “Survival Analysis: A Self-Learning Text”. Prof. Barker is a consulting biostatistician and a co-author of the “ActivEpi Companion Text”, and has over 10 years of experience teaching short courses in epidemiology and biostatistics at Emory and at the Centers for Disease Control and Prevention.



The course takes place online at statistics.com in a series of 5 weekly lessons and assignments. Course participants work directly with both instructors via a private discussion board. Participate in the course at your own convenience; there are no set times when you are required to be online.



For registration and information:

http://www.statistics.com/content/courses/epi1/index.html



Peter Bruce

courses@statistics.com



P.S. Coming up June 3 at statistics.com: “Toxicological Risk Assessment” and “Using the Census's new 'American Community Survey'” and, on June 10, “Categorical Data Analysis.”

New link to Chance
May 14th, 2005 by JTJ

We have long admired and appreciated the work of Dartmouth Professor J. Laurie Snell and his colleagues at the CHANCE project. (There are some terrific online lectures on all phases of statistics and probability at the Chance Lectures.)



We received the following recently:

In order to give Chance News the chance for a longer life we have changed it to a ChanceWiki. The new URL is:

http://chance.dartmouth.edu/chancewiki/



For the ChanceWiki we are using the software developed for the very successful free encyclopedia Wikipedia:

http://en.wikipedia.org/wiki/



The wiki software makes it easy for anyone to add an item or to make changes in an existing article (hopefully an improvement) in the current Chance News.




On the Main Page of the ChanceWiki you will find links to the current Chance News and “How to submit a new article or edit an existing article”.




We hope you will try making a contribution. If you have any questions I will be happy to try to answer them.




J. Laurie Snell



jlsnell@dartmouth.edu

The NYT: Do as I say (sorta), not as I do
May 8th, 2005 by JTJ

Today's NYT “Week in Review” carries Daniel Okrent's column, “The Public Editor.” This week's solid piece — “Briefers and Leakers and the Newspapers Who Enable Them” — takes another deserved shot at the use of unattributed and/or anonymous sourcing. But both Okrent and the NYT fall short in providing adequate transparency and in leveraging the digital environment to the benefit of both readers and the newspaper.

Okrent reports on some analytic work regarding the NYT's sourcing practices, work carried out by a grad student at NYU, Jason B. Williams. Okrent gives appropriate attribution to Williams and his data and, let's assume, reported it correctly. But he only reported the data. At the end of the essay, Okrent quotes NYT editor Bill Keller: “'We need to get our policies [regarding sourcing] hard-wired into the brains of our reporters and editors that we are obliged to tell readers how we know what we know,' Bill Keller told me the other day.” [The IAJ's emphasis added.]



Here Keller and Okrent disappoint us by ignoring one of the fundamental admonitions to novice journalists: don't TELL the reader, SHOW the reader what you know.



The way to build reader confidence and improve the relevance of journalism would have been to provide an online link to Williams' raw data so readers could explore it for even richer insights and draw their own conclusions.


And even if you didn't create the "archives"….
May 6th, 2005 by JTJ

The current issue of WIRED (or is it only the online WIRED News? I'm not always sure which is which.) carries a piece on what Amazon is doing with its search engines to tease data out of the PDF books it carries. “Judging a Book by its Contents” includes the following from Amazon exec Bill Carr. Oh, that news organizations could bring the same type of thinking to their archives.





Bill Carr, Amazon's executive vice president of digital media, confirms that this is a serious attempt to sell more books.


“We've been spending a lot of time thinking, 'We have this rich digital content, how can we pull info out and expose it to customers that makes discovery even better?'” Carr said. “What you are seeing here are the fruits of a lot of experimenting and brainstorming.”


Carr points to the “adaptive unconscious” SIP [statistically improbable phrase] from Malcolm Gladwell's best seller, Blink, as an example of how improbable data mining can get a curious reader into the long tail of Amazon's catalog.


Benjamin Vershbow, a researcher at the Institute for the Future of the Book, “…sees Amazon's data mining as part of a trend on the web where sites are learning to weave data sources together to create a new web experience.”

Someone, and it won't be a newspaper or magazine publisher, will see an opportunity to do the same thing with our archives. No, Lexis-Nexis is just a warehouse. Valuable, but not much added value.
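
For the curious, here is a rough sketch of the “statistically improbable phrase” idea applied to a news archive. Amazon has not published its exact method, so this simply scores two-word phrases by how much more often they show up in one story than in the archive as a whole, with a little smoothing; real work would want a proper significance test and a much larger corpus:

```python
# Sketch: rank two-word phrases that are unusually frequent in one story
# relative to the archive. Not Amazon's actual algorithm.
import re
from collections import Counter

def bigrams(text):
    words = re.findall(r"[a-z']+", text.lower())
    return [" ".join(pair) for pair in zip(words, words[1:])]

def improbable_phrases(story, archive_texts, top=10):
    story_counts = Counter(bigrams(story))
    archive_counts = Counter()
    for doc in archive_texts:
        archive_counts.update(bigrams(doc))

    story_total = sum(story_counts.values()) or 1
    archive_total = sum(archive_counts.values()) or 1

    def score(phrase):
        # Ratio of in-story frequency to archive frequency, add-one smoothed.
        p_story = story_counts[phrase] / story_total
        p_archive = (archive_counts[phrase] + 1) / (archive_total + 1)
        return p_story / p_archive

    return sorted(story_counts, key=score, reverse=True)[:top]

# Toy usage with made-up strings; a real run would pull stories from the archive.
if __name__ == "__main__":
    archive = ["the council met to discuss the budget",
               "the mayor spoke about the budget"]
    story = "the water authority bond deal raised questions about the bond deal fees"
    print(improbable_phrases(story, archive, top=5))
```

Run against years of clips, a list like that is a cheap way to surface what a story is distinctively about, which is exactly the added value a warehouse like Lexis-Nexis does not provide.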
