Google in the 3D modeling business?
May 2nd, 2006 by JTJ

Interesting new tool from the folks at Google.  If SketchUp follows the evolutionary line of Google Maps, we can expect to see some interesting mash-ups in coming weeks.  We are looking forward to some flowchart models that can be annotated with URLs and comments.  But until then….

The modeling tool SketchUp has long been a favorite of designers, architects, and hobbyists who have used the powerful program to render 3D images of their ideas. In March, search-engine giant and emerging software powerhouse Google acquired SketchUp developer @Last Software. Last week, Google SketchUp was quietly released to the public. The program has been made completely free for personal use, and it includes tools for integrating your creations with Google Earth or uploading them to Google's 3D Warehouse gallery.

Google is establishing a pattern of acquiring software companies and releasing free versions of their programs. As with Keyhole (now Google Earth) and Picasa, Google hopes to make SketchUp popular with its massive Web audience. We get very cool free software, and Google gains new users, loyal customers, and a potential avalanche of third-party content added to Google Earth.


It might appear at first that the free version of SketchUp has been watered down, but you'll find most of the same functionality in an easier-to-use interface. The creative possibilities are endless, and included video tutorials will get you up and modeling in no time. Not only can Google SketchUp create detailed structural models, it can also be used as a more general conceptual visualization tool for everything from games and art projects to workflows and engineering.


Take Google SketchUp for a spin, and let us know what you think. Then see what others have to say about Google's latest software or add a review of your own.

Finally, if you're a fan of CNET Download.com and are willing to back it up with an Internet vote, please help support us by voting for Download.com in the Webby's People's Voice competition. Voting ends this week.


Peter Butler
Senior Editor, CNET Download.com



Ver 1.0 kicks off. Statistician George Duncan opening speaker.
Apr 9th, 2006 by JTJ

Late this afternoon, the 20 participants in Ver 1.0 will be gathering at the Inn of the Governors in Santa Fe, NM for the first session of the workshop.  The first, set-the-tone speaker is George Duncan, professor of statistics at Carnegie Mellon University.  George will be speaking on “Statistical Confidentiality: What Does It Mean for Journalists’ Use of Public Databases?”

We will post George's address as soon as possible, along with those of other participants in coming days.

We are very pleased with the high-powered thinkers who are in, or coming to, Santa Fe to address the major problem of how we verify the data in public records databases.  The proceedings of the workshop will, we hope, be published by the end of the month and also be available online.






For our readers in the UK….
Mar 23rd, 2006 by JTJ




Bridging quantitative and qualitative methods for social sciences using text mining techniques

Organiser: Dr Sophia Ananiadou
(Sophia.Ananiadou@manchester.ac.uk or (0161)3063092),
School of Informatics, University of Manchester and National Centre for Text Mining (http://www.nactem.ac.uk/)

Date and location

28 April 2006, Weston Conference Centre, University of Manchester.

Registration

To register for this workshop please complete the registration form.

Summary

This workshop aims to bring together researchers from different subject areas (computer scientists, computational linguists, social scientists, psychologists, etc.) in order to explore how text mining techniques can revolutionise quantitative and qualitative research methods in the social sciences. New technologies from text mining (e.g. information extraction, summarisation, question-answering, text categorisation, sectioning, topic identification, etc.), which go beyond concordances, frequency counts and the like, can be used for quantitative and qualitative content analysis of different data types (e.g. transcripts of interviews, questionnaire analysis, archives, chatroom files, weblogs, etc.). The semantic analysis of new text types, e.g. weblogs, is important for sociologists and political scientists in inferring social trends. Reputation and sentiment analysis collects and identifies people’s opinions, attitudes and sentiments in text. Text mining techniques also aid metadata creation for qualitative data and facilitate their sharing.
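Purely as a toy illustration of going beyond concordances and frequency counts, here is a minimal Python sketch that scores short weblog-style snippets against a tiny hand-made sentiment lexicon. The word lists and sample texts are invented for the example; the text mining systems the workshop covers rely on far richer linguistic processing.

import re
from collections import Counter

# Tiny hand-made sentiment lexicon -- purely illustrative, not a real resource.
POSITIVE = {"good", "great", "support", "hope", "trust"}
NEGATIVE = {"bad", "poor", "fear", "angry", "distrust"}

def tokenize(text):
    """Lowercase and keep runs of letters -- a crude stand-in for real NLP."""
    return re.findall(r"[a-z]+", text.lower())

def sentiment(text):
    """Positive minus negative word count: a very rough opinion signal."""
    counts = Counter(tokenize(text))
    return sum(counts[w] for w in POSITIVE) - sum(counts[w] for w in NEGATIVE)

weblog_posts = [
    "Great turnout and real hope for the new policy.",
    "Angry voters say they distrust the poor handling of the budget.",
]
for post in weblog_posts:
    print(sentiment(post), post)   # prints 2 and -3 for these two snippets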



Summer workshop on IPUMS databases
Mar 20th, 2006 by JTJ

A good learning opportunity in the Land of Lakes this summer….

Dear IPUMS Users,

I am pleased to announce the first annual IPUMS Summer Workshop, to be held
in Minneapolis on July 19th-21st. This training session will cover four
major databases: IPUMS-USA, IPUMS-International, IPUMS-CPS, and the North
Atlantic Population Project (NAPP).

For more information, please visit
http://www.pop.umn.edu/training/summer.shtml.

I hope to see some of you in Minneapolis this summer.

Sincerely,

Steven Ruggles
Principal Investigator
IPUMS Projects


What about those polls, eh?
Mar 10th, 2006 by JTJ

Marylaine Block, at Ex Libris: an E-Zine for Librarians and Other Information Junkies (http://marylaine.com/exlibris/), tips us to another good blog for analytic journalists.  Click below to see what Charles Franklin has to say about presidential polls.

Political Arithmetik – Where Numbers and Politics Meet
http://politicalarithmetik.blogspot.com/
Blog by Charles Franklin, a professor at the University of Wisconsin who teaches statistical analysis of polls, public opinion and election results. He helps people understand issues like political bias in poll samples and questions, and provides historical context for current data.


Scientists track money to help predict disease
Jan 26th, 2006 by JTJ

Yet another fine example of creative thinking wherein a good idea in one discipline is morphed into an unintended application in another.  (Something all-too-rare in the practice of journalism.)  The journal Nature reports:

Another day another dollar


The
website wheresgeorge.com invites its users to enter the serial numbers
of their US dollar bills and track them across America and beyond. Why?
“For fun and because it had not been done yet”, they say. But the
dataset accumulated since December 1998 has provided the ideal raw
material to test the mathematical laws underlying human travel, and
that has important implications for the epidemiology of infectious
diseases. Analysis of the trajectories of over half a million dollar
bills shows that human dispersal is described by a 'two-parameter
continuous-time random walk' model: our travel habits conform to a type
of random proliferation known as 'superdiffusion'. And with that much
established, it should soon be possible to develop a new class of
models to account for the spread of human disease.

Letter: The scaling laws of human travel
D. Brockmann, L. Hufnagel and T. Geisel
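For readers curious what “superdiffusion” looks like in practice, here is a minimal, illustrative Python sketch (not the authors' model) of a continuous-time random walk with power-law distributed jump lengths and waiting times; the exponents and step count are arbitrary placeholders.

import random

def pareto_sample(exponent, minimum=1.0):
    """Inverse-transform sample from a power-law (Pareto) tail:
    P(X > x) = (minimum / x) ** exponent for x >= minimum."""
    return minimum * (1.0 - random.random()) ** (-1.0 / exponent)

def simulate_ctrw(n_steps=10000, jump_exponent=1.6, wait_exponent=1.8):
    """One-dimensional continuous-time random walk with heavy-tailed
    jumps and waiting times (exponents chosen only for illustration)."""
    time, position = 0.0, 0.0
    trajectory = [(time, position)]
    for _ in range(n_steps):
        time += pareto_sample(wait_exponent)      # heavy-tailed waiting time
        step = pareto_sample(jump_exponent)       # heavy-tailed jump length
        position += step if random.random() < 0.5 else -step
        trajectory.append((time, position))
    return trajectory

walk = simulate_ctrw()
final_time, final_position = walk[-1]
# With heavy-tailed jumps, displacement grows faster than ordinary diffusion
# would predict -- the signature of superdiffusion.
print(f"elapsed time {final_time:.1f}, displacement {final_position:.1f}")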



More lies, loudly spoken, from the Bush Administration?
Jan 20th, 2006 by JTJ

Posted on Thu, Jan. 19, 2006

Feds dispute mine safety report

By SETH BORENSTEIN and LINDA J. JOHNSON


Knight Ridder Newspapers


http://www.montereyherald.com/mld/montereyherald/news/nation/13661497.htm



WASHINGTON – Federal mine safety officials on Wednesday disputed a Knight Ridder analysis showing a dramatic reduction in the dollar amount of large fines for mine safety violations during the Bush administration, saying in an Internet posting that those fines are actually up.



Mine Safety and Health Administration spokesman Dirk Fillpot said that Knight Ridder made “assumptions that were incorrect” in its Jan. 6 analysis.  But when Knight Ridder conducted a new analysis in the manner suggested by Fillpot using MSHA's newest database, it showed the same dramatic drop.



The newest data show a 43 percent reduction in proposed median major fines from the last five years of the Clinton administration when compared with the first five years of the Bush administration. That's the same percentage reduction found in Knight Ridder's original analysis, using a smaller, online database of MSHA violations.



When asked about that drop and the analysis, Fillpot refused Wednesday to answer 11 specific questions about MSHA's fines, its analysis or the posting of its critique.  Instead Fillpot repeated a prepared statement that said “it is unfortunate that Knight Ridder's analysis of MSHA's penalties was inaccurate.”



But four statistical experts who looked at the databases and analyses said Knight Ridder's findings were accurate and that MSHA's assessment didn't contradict the newspaper's findings of smaller fines during the Bush administration.



“It's really wrong for them (MSHA) to say you're incorrect,” said John Grego, a professor of statistics at the University of South Carolina in Columbia. “There's no question that the average/median proposed penalty has gone down.”



MSHA's response “is looking at two different things and making a statement as if they are looking at the same thing,” said Jeff Porter, a database library director for Investigative Reporters and Editors Inc., an association of journalists. Porter also teaches data analysis at the University of Missouri School of Journalism.



On its Web site, MSHA said the size of the final assessments — which are lower after bargaining and appeals — is up by “nearly 38 percent.”



Knight Ridder looked only at proposed fines because some of the actual fines are determined not by MSHA, but by administrative judges when mining companies appeal those penalties. Further, Knight Ridder found that fines finally assessed and paid are still lower on average in the Bush administration.



Fillpot wouldn't explain how his agency came up with the 38 percent figure.



The statistical experts said they couldn't understand how MSHA figured that out. Fillpot said “that information is taken from actual MSHA enforcement records and is accurate.”  He refused to elaborate.



In an unusual posting on the Internet on the Martin Luther King Jr. Day holiday on Monday, MSHA said, “Knight-Ridder's numbers are inaccurate, obscuring the reality that penalties issued by MSHA have gone up during this Administration — not down.”



After Knight Ridder questioned the posting, it was taken down Tuesday afternoon. It went back up Wednesday morning.

Among fines of $10,000 or more, the median penalty levied in the past five years was $27,139. During the last five years of the Clinton administration, the comparable fine was $47,913, according to Knight Ridder's analysis of the newest data from MSHA.



That data, which included 221 large fines that weren't in the publicly available database used by Knight Ridder for its initial analysis, show that the total number of large fines increased to 527 in the Bush administration from 461 during the last five Clinton years.



Fillpot declined to say where those extra fines came from or why they weren't in the online database.

(Johnson reports for Lexington Herald-Leader.)
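A quick back-of-the-envelope check with the two medians reported in the article reproduces the 43 percent figure (a minimal Python sketch; the variable names are ours, not MSHA's):

# Median large proposed fines ($10,000 or more), as reported above.
clinton_median = 47913   # last five Clinton years
bush_median = 27139      # first five Bush years

reduction = (clinton_median - bush_median) / clinton_median
print(f"Median large proposed fine fell by {reduction:.0%}")   # about 43%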


 

Indirect indicators. Or maybe not.
Dec 5th, 2005 by Tom Johnson

Sometimes journalists have a tendency to be too literal.  We want to ask a question and we want the response to be a quote that is without ambiguity.  One that fills in some of the space between our anecdotes.  But other times, we need tools that work like a periscope, a device that allows us to not look at the object directly but through a helpful lens.  Such periscopes for analyzing the economy are indirect indicators.




Monday's (5 Dec. 2005) NYTimes' Business Section was loaded with references to such indicators that journos could keep in mind when looking for devices to show and explain what's happening.  Check out “What's Ahead: Blue Skies, or More Forecasts of Them?”  Be sure to click on the link Graphic: Indicators From Everyday Life.

Another indirect indicator was mentioned Sunday on National Public Radio in “Economic Signs Remain Strong.”  There, an economist said he tracks changes in the “titanium dioxide” data; the compound is used in all white paint and so reflects manufacturing production.








Decentralized, complex adaptive systems meet realpolitik and journalism. Finally.
Dec 3rd, 2005 by Tom Johnson

A couple of articles have passed across our desk in recent days that illustrate the impact — and importance of understanding — decentralized (or “distributed”) systems and complex adaptive systems.

For starters, take a look at “Reinventing 911: How a swarm of networked citizens is building a better emergency broadcast system.”

http://www.wired.com/wired/archive/13.12/warning.html


Author Gary Wolf writes:

“I've been talking with security experts about one of the thorniest problems they face: How can we protect our complex society from massive but unpredictable catastrophes? The homeland security establishment has spent an immeasurable fortune vainly seeking an answer, distributing useless, highly specialized equipment, and toggling its multicolored Homeland Security Advisory System back and forth between yellow, for elevated, and orange, for high. Now I've come [to Portland, Oregon] to take a look at a different set of tools, constructed outside the control of the federal government and based on the notion that the easier it is for me to find out about a loose dog tying up traffic, the safer I am from a terrorist attack.

“To understand the true nature of warnings, it helps to see them not
as single events, like an air-raid siren, but rather as swarms of
messages racing through overlapping social networks, like the buzz of
gossip. Residents of New Orleans didn't just need to know a hurricane
was coming. They also needed to be informed that floodwaters were
threatening to breach the levees, that not all neighborhoods would be
inundated, that certain roads would become impassible while alternative
evacuation routes would remain open, that buses were available for
transport, and that the Superdome was full.

“No central authority possessed this information. Knowledge was
fragmentary, parceled out among tens of thousands of people on the
ground. There was no way to gather all these observations and deliver
them to where they were needed. During Hurricane Katrina, public
officials from top to bottom found themselves locked within
conventional channels, unable to receive, analyze, or redistribute news
from outside. In the most egregious example, Homeland Security
secretary Michael Chertoff said in a radio interview that he had not
heard that people at the New Orleans convention center were without
food or water. At that point they'd been stranded two days.

“By contrast, in the system Botterell created for California,
warnings are sucked up from an array of sources and sent automatically
to users throughout the state. Messages are squeezed into a standard
format called the Common Alerting Protocol, designed by Botterell in
discussion with scores of other disaster experts. CAP gives precise
definitions to concepts like proximity, urgency, and certainty.
Using CAP, anyone who might respond to an emergency can choose to get
warnings for their own neighborhood, for instance, or only the most
urgent messages. Alerts can be received by machines, filtered, and
passed along. The model is simple and elegant, and because warnings can
be tagged with geographical coordinates, users can customize their cell
phones, pagers, BlackBerries, or other devices to get only those
relevant to their precise locale.”
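To make the filtering idea concrete, here is a minimal Python sketch of a CAP-style alert consumer. The fields echo the concepts named above (urgency, certainty, geographic area), but the data structure, field values and ZIP codes are invented for the example; they are not the actual Common Alerting Protocol schema, which is XML with many more elements.

from dataclasses import dataclass

@dataclass
class Alert:
    # A toy stand-in for a CAP message.
    headline: str
    urgency: str        # e.g. "Immediate", "Expected", "Future"
    certainty: str      # e.g. "Observed", "Likely", "Possible"
    zip_codes: set      # crude stand-in for CAP's geographic area block

URGENCY_RANK = {"Future": 0, "Expected": 1, "Immediate": 2}

def matches(alert, my_zip, min_urgency="Expected"):
    """Keep only alerts that touch my neighborhood and are urgent enough."""
    return (my_zip in alert.zip_codes
            and URGENCY_RANK.get(alert.urgency, 0) >= URGENCY_RANK[min_urgency])

feed = [
    Alert("Levee breach reported", "Immediate", "Observed", {"70112", "70116"}),
    Alert("Road closure possible", "Future", "Possible", {"70130"}),
]
for alert in feed:
    if matches(alert, my_zip="70112"):
        print(alert.headline)    # only the levee-breach alert is printed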



Second item of interest: I'm sure many of you noted Dexter Filkins' Page One lead story in the NYT on Friday, 2 Dec. 2005.  The online version headline is “Profusion of Rebel Groups Helps Them Survive in Iraq.”  That, unfortunately, lacks the truth and insight of the print version headline: “Loose Structure of Rebels Helps Them Survive in Iraq — While Al Qaeda Gains Attention, Many Small Groups Attack on Their Own.”

It seems that finally someone in the journalism community has figured out that what's happening in Iraq — and around the world — is a decentralized, complex adaptive system.  Too bad journalists — journalism educators, students and professionals — haven't been exposed to the concepts and vocabulary to really present the problem in all its, ahem, complexity.


Taking games seriously
Nov 23rd, 2005 by Tom Johnson

Serious Games Initiative

http://www.seriousgames.org/



The Serious Games Initiative is focused on uses for games
in exploring management and leadership challenges
facing the public sector. Part of its overall charter
is to help forge productive links between the
electronic game industry and projects involving the use of
games in education, training, health, and public policy.





Says information specialist Marylaine Block:

“As one who believes nobody should be allowed to run for office until they have played Sim City for at least six months, I think such games have enormous potential for helping people explore complex social problems and possible solutions.”


