New search engine in the making?
Sep 19th, 2005 by JTJ

Dwight Hines posts an interesting opportunity to the IRE listserv:

“I am going to participate as an internet journalist in IBM's Project Serrano Beta program. If you read the material below, you will see that the beauty, or the absolute brute-force ability, of the system being developed by IBM is the capacity to search lots of databases and integrate the information. It seems to me that this is ideal for those involved in investigative reporting at global or local levels, or criminal justice issues, who need lots of flexibility and crank power to draw information from all over.

If you are interested in participating in the Beta program, please contact me. You will be able to define the system that you need, working with the IBM folks and other journalists. Obviously, the more different people and different media organizations participating, the more power the system will have. I don't think antitrust issues or intellectual property rights will be an issue until the system is working, but those are just two areas that will become important, along with differences in laws in different countries.

This ain't gonna be your Gramma's Google.

Dwight Hines. I do not work for IBM, nor do I take goodies from them in any way.


Project Serrano Beta Programs: Enterprise search and data modeling and integration design

Project Serrano extends WebSphere(r) Information Integration with enhanced search and data modeling and integration design. It expands the source accessibility, functionality, performance, and localization of already robust information integration technologies — to help customers manage their growing information requirements in both structured and unstructured domains. Project Serrano Beta includes two programs:

Rational(r) Data Architect will combine traditional data modeling capabilities with metadata discovery, mapping, and analysis, all organized by a modular project-based structure.

WebSphere Information Integration (II) OmniFind Edition finds information stored across the enterprise in file systems, content archives, databases, collaboration systems, and applications.

http://www-306.ibm.com/software/data/integration/beta.html

==================

WebSphere Information Integrator OmniFind Edition
http://www-306.ibm.com/software/data/integration/db2ii/editions_womnifind.html

Features and benefits

Key search features include:

• Search results with sub-second response from enterprise content such as intranets, extranets, corporate public websites, relational database systems, file systems, and content repositories.
• Supported sources such as HTTP/HTTPS, news groups (NNTP), file systems, Domino(r) databases, Microsoft(r) Exchange public folders, DB2(r) Content Manager, DB2 Universal Database™ (DB2 UDB), DB2 UDB for z/OS(r), Informix(r), and Oracle databases. Documentum and FileNet support is provided through WebSphere(r) II Content Edition.
• State-of-the-art relevancy algorithms for corporate content.

The new OmniFind Edition provides numerous technology and business benefits. It:

• scales to millions of documents and thousands of users
• fits easily into enterprise Java™ applications with appropriate security so that confidential information is not exposed
• eases administration for quick setup
• utilizes background analysis to minimize administrator tasks required to get high-quality search results
• provides highly relevant search results and the framework for richer text analysis
• includes a seamless upgrade to WebSphere II OmniFind for WebSphere Portal customers, who can leverage existing taxonomies for navigation and categorization, migrate rules for rule-based classification, and surface the same user experience through the WebSphere Portal Search Center”
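For readers who want a feel for what "searching lots of databases and integrating the information" means mechanically, here is a minimal federated-search sketch: one query fanned out across a couple of local sources, with the results merged and ranked. The sources, table name, paths, and scoring below are hypothetical illustrations of the idea, not IBM's OmniFind API or anything like its real relevancy model.

```python
# Minimal federated-search sketch: one query, several sources, merged results.
# The sources and scoring are invented; real products use crawlers, full-text
# indexes, and far richer relevancy models.
import os
import sqlite3
from dataclasses import dataclass

@dataclass
class Hit:
    source: str
    location: str
    score: float
    snippet: str

def search_files(root: str, query: str) -> list[Hit]:
    """Naive scan of text files under `root`, scored by term frequency."""
    hits = []
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            try:
                text = open(path, encoding="utf-8", errors="ignore").read()
            except OSError:
                continue
            count = text.lower().count(query.lower())
            if count:
                hits.append(Hit("filesystem", path, float(count), text[:120]))
    return hits

def search_database(db_path: str, query: str) -> list[Hit]:
    """Hypothetical table `documents(id, body)` searched with a SQL LIKE filter."""
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT id, body FROM documents WHERE body LIKE ?", (f"%{query}%",)
        ).fetchall()
    except sqlite3.OperationalError:  # e.g. the table does not exist yet
        rows = []
    finally:
        conn.close()
    return [Hit("database", f"doc:{doc_id}", 1.0, body[:120]) for doc_id, body in rows]

def federated_search(query: str, root: str, db_path: str) -> list[Hit]:
    """Fan the query out to every source, then merge and rank the results."""
    hits = search_files(root, query) + search_database(db_path, query)
    return sorted(hits, key=lambda h: h.score, reverse=True)

if __name__ == "__main__":
    # "./notes" and "stories.db" are placeholder paths for the illustration.
    for hit in federated_search("levee", "./notes", "stories.db")[:10]:
        print(f"{hit.score:5.1f}  {hit.source:10s}  {hit.location}")
```

The point of the integration layer is the last function: every source answers the same query, and the user sees one ranked list rather than one list per repository.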





GISc Resources for Hurricane Katrina
Sep 16th, 2005 by JTJ

From the Librarians' Index to the Internet….

GISc Resources for Hurricane Katrina


  This website collects resources related to the use of geographic
information systems (GIS) in response to Hurricane Katrina and in
disaster recovery. Includes articles, maps, satellite images, GIS data,
and news about research opportunities related to the hurricane. From
the University Consortium for Geographic Information Science.

 http://ucgis.org/Katrina/
 http://lii.org?recs=027428
 Subjects:
    * Geographic information systems
    * Emergency management
    * Hurricane Katrina, 2005



Simulations of bad, bad times
Sep 9th, 2005 by JTJ

Friend Steve Guerin sends this from Santa Fe….

The Disaster Dynamics Project at UCAR looks timely: http://swiki.ucar.edu/dd/2

Check out the Hurricane Landfall game: http://swiki.ucar.edu/dd/71
The Hurricane Landfall Disaster Dynamics Game is a four-player virtual strategy game about the interaction between natural disasters and urban planning. The game is computerized; it plays like a traditional physical boardgame, but there are simulation components that require significant computation. The game's architecture is client-server, with each player having her own computer.

Individual machines allow moves to be made in parallel and enable players to access private representations of the game state in addition to the public representation. The server is typically run on the instructor's computer and will also provide facilitation tools.
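The client-server split described above, a shared public board plus per-player private state, is easy to sketch. A minimal illustration follows; the roles, fields, and values are invented for the example, not the actual Hurricane Landfall implementation.

```python
# Sketch of a public/private game-state split: the server holds everything,
# and each client only ever receives the shared board plus its own secrets.
from dataclasses import dataclass, field

@dataclass
class GameState:
    public: dict = field(default_factory=lambda: {"turn": 1, "storm_track": []})
    private: dict = field(default_factory=dict)  # player_id -> hidden info

    def view_for(self, player_id: str) -> dict:
        """Each client sees the shared board plus only its own private data."""
        return {"public": self.public, "private": self.private.get(player_id, {})}

state = GameState()
state.private["planner"] = {"budget": 500, "pending_zoning_votes": 2}
state.private["mayor"] = {"budget": 300, "evacuation_plan": "draft"}

# Moves can arrive from clients in parallel; the server applies them to the
# shared state, and each player still receives only their own view.
print(state.view_for("planner"))
print(state.view_for("mayor"))
```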


Katrina Missing Persons Meta-search Engine
Sep 9th, 2005 by JTJ

This seems to be the best tool we've seen to track individuals who may be unaccounted for following Katrina.

Lycos: Katrina Missing Persons Site http://www.lycos.com/katrina/
With multiple small databases of survivors, we desperately needed one search engine that would search through all of them, and Lycos created one.  The site lists all the databases it searches through. If you're aware of others, please fill out Lycos' form to add them.
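For readers curious what a meta-search like this involves, here is a minimal sketch of the idea: the same query fanned out to several survivor lists, each hit tagged with its source, and duplicate records collapsed. The sources and names below are invented stand-ins, not the actual Lycos implementation or the real databases it covers.

```python
# Meta-search sketch: one query across several survivor lists, results merged.
from itertools import chain

SOURCES = {
    "shelter_list": [{"name": "Ann Prejean", "last_seen": "Baton Rouge"}],
    "hospital_list": [{"name": "Ann Prejean", "last_seen": "Houston"},
                      {"name": "Earl Prejean", "last_seen": "Metairie"}],
}

def meta_search(query: str) -> list[dict]:
    """Run the same query against every source, tag each hit with its origin,
    and collapse records that agree on both name and location."""
    hits = chain.from_iterable(
        ({**rec, "source": src} for rec in records if query.lower() in rec["name"].lower())
        for src, records in SOURCES.items()
    )
    merged = {}
    for hit in hits:
        merged.setdefault((hit["name"], hit["last_seen"]), hit)
    return list(merged.values())

for record in meta_search("prejean"):
    print(record["name"], "-", record["last_seen"], f"({record['source']})")
```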


The basics of the basics: What is/are the definitions?
Aug 19th, 2005 by JTJ

Ford Fessenden, of the NYTimes, has yet another strong piece in Thursday's paper, “Health Mystery in New York: Heart Disease.” The lede lays out the perplexing problem in NYC: “Death rates from heart disease in New York City and its suburbs are among the highest recorded in the country, and no one quite knows why.”

But among the possible answers — and here especially is where the AJ kicks in — is “…speculation that doctors in the area may lump deaths with more subtle causes into the heart disease category, making that toll look worse than it actually is.” And “…the Centers for Disease Control and Prevention, at the health department's request, has sent specialists to determine whether doctors in New York City ascribe causes of death substantially differently.”

I know, I know, we're preaching here, but we don't think it can be pointed out too often: journalists and all social scientists cannot simply accept given numbers as true, valid, and honest. We always have to swim up the data-creation stream to determine where, why, and from whom the numbers came.
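One concrete way to start swimming up that stream would be to ask, before anything else, what share of all deaths each jurisdiction codes to heart disease and whether any area is far out of line with its peers. A rough sketch of that sanity check follows; the counties and figures are invented for illustration, not real mortality data.

```python
# Hypothetical first check before trusting raw rates: what share of all deaths
# does each area code to heart disease? An area far above its peers may reflect
# cause-of-death coding practice rather than cardiology. Figures are invented.
deaths = {
    # area: (heart-disease deaths, all deaths)
    "County A": (3200, 9000),
    "County B": (2100, 8800),
    "County C": (2500, 9400),
}

shares = {area: hd / total for area, (hd, total) in deaths.items()}
mean_share = sum(shares.values()) / len(shares)

for area, share in sorted(shares.items(), key=lambda kv: kv[1], reverse=True):
    flag = "  <-- check cause-of-death coding" if share > 1.2 * mean_share else ""
    print(f"{area}: {share:.1%} of deaths coded to heart disease{flag}")
```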




More government employees may be removed from public records
Aug 18th, 2005 by JTJ

Tamara Thompson reports on her blog PI News Link….

~ more government employees may be removed from public records ~

By Tamara Thompson Investigations

California SB 506 will add an additional group of public officials to the roster of those whose personal data is confidential. Keep this idea filed in the back of your hat. When subject to a potential threat, various government employees may apply to have their address and other identifiers removed from public records. In its current form, SB 506 deems the application for closure a public record. If the document exists, you'll know that the subject has convinced another public official that “a life threatening circumstance” exists that impels the request for confidentiality.

“This bill would require a local elections official to extend this
confidentiality of voter registration information to specified public
safety officials, upon application, as specified, for a period of no
more than two years, if the local elections official is authorized to
do so by his or her county board of supervisors. The application of a
public safety official would be a public record.”



U.S. paper using Google Maps online
Aug 18th, 2005 by JTJ

As Anna-Maria Mende reports from journalism.co.uk:

“US: News sites playing with Google Maps

By Anna-Maria Mende

As Journalism.co.uk reports, US local sites are beginning to experiment with Google Maps. New York State local newspaper Record Online, for example, began to put Google Maps on its articles. While reading an article, readers can see the location of the story on maps or satellite images. Newspapers are thereby taking advantage of Google, in contrast to the usual complaints that Google News and Google Ads threaten newspapers.

“Recently, technology firm Daden from Birmingham, UK, developed a tool that combines Google Earth with users' favorite RSS feeds (see previous posting). (Google Earth – unlike Google Maps – shows three-dimensional images.) With this tool readers can select news by location on an international, regional or local map on their computer. Newspapers experimenting with Google Maps work the other way round, showing readers the location of a news story while they are already reading it.”

Source: Journalism.co.uk
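The mechanics on either side of that comparison are simple: each story carries a coordinate, and the software either drops it on a map next to the article or lets readers browse stories grouped by area. Here is a rough sketch of the "news by location" grouping; the headlines and coordinates are invented, and a real site would pull them from a geotagged feed (e.g. GeoRSS) and hand them to a mapping API.

```python
# Group geotagged stories by coarse area so readers can pick a region.
stories = [
    {"title": "Bridge repairs begin", "lat": 41.50, "lon": -74.01},
    {"title": "School budget passes", "lat": 41.52, "lon": -74.05},
    {"title": "Harbor dredging planned", "lat": 40.70, "lon": -74.00},
]

def bucket(lat: float, lon: float, cell: float = 0.5) -> tuple[float, float]:
    """Snap a coordinate to a coarse grid cell so nearby stories group together."""
    return (round(lat / cell) * cell, round(lon / cell) * cell)

by_area: dict[tuple[float, float], list[str]] = {}
for story in stories:
    by_area.setdefault(bucket(story["lat"], story["lon"]), []).append(story["title"])

for (lat, lon), titles in sorted(by_area.items()):
    print(f"Area around ({lat:.1f}, {lon:.1f}):")
    for title in titles:
        print("  -", title)
```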

We wonder when Google will begin licensing its maps to I-o-P publications for inclusion in the hard copy edition.



You will want to link to Matt Waite's blog
Aug 18th, 2005 by JTJ

Matt Waite, a reporter at the St. Petersburg Times, is one of the bright lights in analytic journalism. (And “bright” has all the meanings you can apply.) He is one of a handful of the next generation, i.e. folks <40 years old, who are pushing some intellectual and methodological boulders up the institutional hill that is classic journalism.

Matt has created a non-rant blog describing his stories and projects in St. Pete. It's a learning resource. See www.mattwaite.com


Those beloved — and ever valuable — news researchers….
Aug 17th, 2005 by JTJ

Friend Barbara Semonche, queen of the news research kingdom (queendom?) at the Univ. of North Carolina School of Journalism and Mass Communications, posts these always-pertinent observations today on the NewsLib listserv:

“Journalism/mass comm students will be returning to colleges and universities within the next week or two. Time to get fresh examples for these emerging journalists about just what news researchers are capable of doing for and with them.

Here is what I'm seeking for our beginning and advanced reporting students.

Current (within the last couple of years or so) examples/strategies of the research methods and sources news librarians used for both investigative projects and breaking stories. Here is what I have now:

1. Kathy Hansen's and Nora Paul's recent book, “Behind the Message: Information Strategies for Communicators,” has a classic example of a 1994 Minneapolis Star-Tribune story by reporter Tony Kennedy which was enhanced not only by his investigative research but also by the efforts of the Star-Tribune's news research team. The case study in the book reprinted Kennedy's article on the privately held Schwan Company and then detailed each fact with what resources were discovered and used. Interesting note: the local public library and librarian proved to be a gold mine of information for Kennedy, as did local interviews with former Schwan employees.

2. Alison Head's (former head of research at The Press Democrat in Santa Rosa, California) handout on the news research involved with a breaking crime story. She took the text of reporter Tom Chorneau's 1995 article and then highlighted all the resources used to get the data for the story. A sort of “Anatomy of Crime Research.” [Note: please check this URL: http://parklibrary.jomc.unc.edu/head2.html ]

3. John Martin's (St. Pete Times' researcher) 1998 description of how he worked with a reporter on retrieving information on an alleged murderer's identity on deadline. [Note: please check this URL: http://parklibrary.jomc.unc.edu/stpete.html ]”








A year's worth of stories awaiting
Aug 17th, 2005 by JTJ

The mission of the Annie E. Casey Foundation is “to build better futures for disadvantaged children and their families in the United States.” One of the ways it does that is by packaging data about America's children in a form that reporters can easily access and use. Hence, the “Kids Count State-Level Data Online” site.


“This new database, launched in July 2005, contains more than 75 measures of child well-being, including the 10 measures used in our annual KIDS COUNT Data Book. It includes the most timely data available on Education, Employment and Income, Poverty, Health, Basic Demographics, and Youth Risk Factors for the U.S., all 50 states, and D.C. Depending on availability, three to five years of trend data is currently available for most indicators.

“This easy-to-use, powerful online database allows you to generate custom reports for a geographic area (Profiles) or to compare geographic areas on a topic (Ranking, Maps, and Line Graphs).”
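The Profiles-versus-Rankings distinction the Foundation describes maps onto two very simple queries over the same table: all indicators for one place, or all places on one indicator. A toy sketch follows; the indicator values are invented placeholders, not Casey Foundation data.

```python
# Two report styles over one table: a Profile (one place, all indicators)
# and a Ranking (one indicator, all places). Values are invented.
data = {
    "Louisiana":   {"child_poverty_pct": 27, "teen_birth_rate": 58},
    "Mississippi": {"child_poverty_pct": 29, "teen_birth_rate": 61},
    "Minnesota":   {"child_poverty_pct": 11, "teen_birth_rate": 27},
}

def profile(place: str) -> dict:
    """All indicators for one geographic area."""
    return data[place]

def ranking(indicator: str) -> list[tuple[str, float]]:
    """All areas compared on one indicator, highest value first."""
    return sorted(((p, v[indicator]) for p, v in data.items()),
                  key=lambda pair: pair[1], reverse=True)

print(profile("Louisiana"))
for place, value in ranking("child_poverty_pct"):
    print(f"{place:12s} {value}")
```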



 
