Alfredo Covaleda,
Bogota, Colombia
Stephen Guerin,
Santa Fe, New Mexico, USA
James A. Trostle,
Trinity College, Hartford, Connecticut, USA
From All Points Blog
University of Southern California students developed the online game for the Annenberg Center for Communication to teach about the challenges (and partisanship) of redistricting. Along the way, players learn that keeping their candidates in office may require confronting ethical issues. The game is Flash-based.
From the [original News 10] site: The Redistricting Game is designed to educate, engage, and empower citizens around the issue of political redistricting. Currently, the political system in most states allows the state legislators themselves to draw the lines. This system is subject to a wide range of abuses and manipulations that encourage incumbents to draw districts which protect their seats rather than risk an open contest.
From O'Reilly Radar:
ACM GIS 2007 CFP Extended
Posted: 12 Jun 2007 11:46 AM CDT
By Brady Forrest
The 2007 ACM International Symposium on Advances in GIS will be in Seattle from November 7th to the 9th. As they describe themselves:
The ACM International Symposium on Advances in Geographic Information Systems in 2007 (ACM GIS 2007) is the fifteenth event of a series of symposia and workshops that began in 1993 with the aim of bringing together researchers, developers, users, and practitioners carrying out research and development in novel systems based on geo-spatial data and knowledge, and fostering interdisciplinary discussions and research in all aspects of geographic information systems. The symposium provides a forum for original research contributions covering all conceptual, design, and implementation aspects of GIS and ranging from applications, user interface considerations, and visualization down to storage management and indexing issues. This year, a novelty is that ACM GIS has separated from its long-time host conference in order to become independent and more visible to the GIS community, further expand the spectrum of research topics covered by the symposium, and grow over the next years.
If this looks like something you would like to be a part of, note that the deadline for submitting papers has been extended. The program from last year looks very academic; I wonder whether reaching out through blogs (like this one) will bring in some more real-world talks. The topics (after the jump) are wide-ranging and quite fascinating (no wonder both Microsoft and Google are sponsoring). I'll be looking at the proceedings to get ideas and speakers for Where 2.0 2008.
Once again, O'Reilly's Radar tips us to a fine posting related to JAGIS (Journalism and GIS), this one regarding the challenge of generating change-over-time in urban areas.
Stamen's Map for Trulia
Posted: 12 Jun 2007 12:22 AM CDT
Trulia's new Hindsight Map is a beautiful, animated visualization of the development history of US cities and towns. With it, you can watch entire towns and cities grow. In Seattle, you can watch the city grow starting in year 1900. Trulia is a real estate search engine (much like Zillow). Stamen Design, known for their work on CabSpotting and in Digg Labs, built the map for Trulia using their new Flash mapping library, Modest Maps. Tom Carden and Shawn Allen of Stamen released and demoed Hindsight at Where 2.0.
Tom sent me the following notes on Hindsight and Modest Maps:
Time has been one of the missing dimensions in online maps, but recently it has become a common thing to add. Outside.in (Radar post) recently added the fourth dimension with their ability to track geographic stories over time. Google Earth (info) added the ability to “play” GPS traces. Hindsight really has me wondering about the applications of time-phased maps beyond analysis. In situations like Katrina (see Mikel Maron's post on the maps of Katrina) and the Maze Meltdown (see SF Chronicle article on the Maze), where there are rapid changes to roads, this would be especially helpful. To get your mind around changes, you need to be able to compare. I wonder if we can expect this to come from the major portals.
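The core mechanic of a time-phased map like Hindsight is simple: timestamp every feature, then render the cumulative set of features up to each frame's date. A minimal sketch of that idea in Python (the records and field layout here are hypothetical, not Trulia's actual data):

```python
# Hypothetical parcel records: (year_built, lon, lat).
parcels = [
    (1900, -122.33, 47.60),
    (1925, -122.35, 47.61),
    (1960, -122.30, 47.62),
]

def frames(parcels, start, end, step=10):
    """Yield (year, cumulative parcel list) pairs -- one per animation frame.

    Each frame shows everything built up to and including that year,
    which is what makes the city appear to 'grow' as the slider moves.
    """
    for year in range(start, end + 1, step):
        yield year, [p for p in parcels if p[0] <= year]

for year, visible in frames(parcels, 1900, 1960, step=30):
    print(year, len(visible))   # prints 1900 1, 1930 2, 1960 3
```

A real implementation (Hindsight uses Stamen's Modest Maps in Flash) would swap the print loop for a render pass, but the bucketing-by-date logic is the same.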
A recent post from the FreeGIS group at Google Groups. Looks to be a fine solution to a decade-old challenge. ————— A free toolbar is available from the TerraGo download link. MAP2PDF provides an easy-to-use and affordable solution for distributing GIS data to non-GIS users. By leveraging Adobe Acrobat and GeoPDF as a portable mapping format, it allows for the creation and publishing of layered, georegistered maps that can be accessed at no cost by non-GIS users. – Sat 9 Jun 2007 15:21 1 message, 1 author http://groups.google.com/group/freeGIS/t/9672fdc5d31e958b
Picking up some interesting Web 2.0 tools at the IRE's annual conference, this year in Phoenix.
Programmableweb.com — www.programmableweb.com/ — Good jumpstation for APIs, mashups, how-to info, etc.
CityCon — www.tetonpost.com/citycon/ — “CityCon allows you to find detailed information about any member of the current 110th U.S. Congress. Use the Input field above to query the CityCon database and the Internet for a U.S. City, State, Senator or Representative.”
Maplight.org — www.maplight.org/ — “MAPLight.org brings together campaign contributions and how legislators vote, providing an unprecedented window into the connections between money and politics. We currently cover the California Legislature and U.S.”
So the NYT did backtrack on the percent-of-change error described yesterday without assigning blame. That's fine. But the correction suggests another big story that we have only seen parts of. That is, of all the U.S. presence in Iraq — military and contractors — how many and what proportion are actually on the streets and how many and in what capacity are in support categories.
This weekend, friend-of-the-IAJ Joe Traub sent the following to the editor of the New York Times. Here's the story Joe is talking about: “White House….”
The headline error is bad enough (it's only in the hed, not in the story) and should be a huge embarrassment to the NYT. But the error gets compounded because, while the Times no longer sets the agenda for the national discussion, it is still thought of (by most?) as the paper of record. Consequently, as other colleagues have pointed out, the reduction percentage gets picked up by other journalists who don't bother to do the math (or who cannot do the math). See, for example:
* CBS News — “Troop Retreat In '08?” — (This video has a shot of the NYT story even though the percentage is not mentioned. Could it be that the TV folks don't think viewers can do the arithmetic?)
(NB: We could not yet find on the NPR site the transcript of the radio story that picked up the 50 percent error. But run a Google search with “cut in Troops by 50%” and note the huge number of bloggers who also went with the story without doing the math.)
Colleague Steve Doig has queried the reporter of the piece, David Sanger, asking if the mistake is that of the NYT or the White House. No answer yet received, but Doig later commented: “Sanger's story did talk about reducing brigades from 20 to 10. That's how they'll justify the '50% reduction' headline, I guess, despite the clear reference higher up to cutting 146,000 troops to 100,000.”
Either way, it is a serious blunder of a fundamental sort on an issue most grave. It should have been caught, but then most journalists are WORD people and only word people, we guess.
We would also point out the illogical construction that the NYT uses consistently in relaying statistical change over time. To wit: “… could lower troop levels by the midst of the 2008 presidential election to roughly 100,000, from about 146,000…” We wince.
English is read from left to right. Most English calendars and horizontal timelines are read from left to right. When writing about statistical change, the same convention should be followed: oldest dates and data precede newest or future dates and data. Therefore, this would be better written: “…could lower troop levels from about 146,000 to roughly 100,000 by the midst of the 2008 presidential election.”
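The arithmetic the headline writers skipped takes one line. Using the figures from the story itself (146,000 troops down to 100,000, and 20 brigades down to 10):

```python
def pct_reduction(old, new):
    """Percent reduction going from an old value to a new one."""
    return (old - new) / old * 100

troops = pct_reduction(146_000, 100_000)   # troop levels cited in the story
brigades = pct_reduction(20, 10)           # brigade count Doig pointed to

print(f"Troop reduction: {troops:.1f}%")      # 31.5% -- nowhere near 50%
print(f"Brigade reduction: {brigades:.0f}%")  # 50% -- the headline's likely source
```

A cut from 146,000 to 100,000 is roughly a 31.5 percent reduction; only the brigade count yields the 50 percent figure the hed claimed.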
Source: http://radar.oreilly.com/archives/2007/05/geocommons_shar.html
GeoCommons, Share Your GeoData
Posted: 23 May 2007 01:59 PM CDT
GeoCommons is a new mapping site that allows members to use a variety of datasets to create their own maps. It provides free geodata, a map-builder tool, the ability to create heat maps, and a map-hosting site. An API will be available shortly. GeoCommons comes from FortiusOne, a Washington, D.C. company. The public beta is going to be released Monday, May 28th, at Where 2.0's Launchpad.
When building a map you can use one of the 1500 data sets (with 2 billion data attributes) that they have made freely available. The data sets vary widely and include things like “Identity Theft 2006”, “Coral Reef Bleaching – Worldwide”, “Starbucks Locations – Worldwide”, and “HAZUS – Seattle, WA – Resident Demographics”. As you can see below, data can be viewed in a tabular format prior to loading it onto a map. Data sets can be combined so that you can see “The Prices of Living in NYC & SF” and “Barack vs. Clinton – Show Me the Money!” — it seems to me that Barack has more widespread support.
We are finding O'Reilly's Radar an increasingly valuable site/blog to keep up with interesting developments in Web 2.0, publishing and the general Digital Revolution. Brady Forrest's contribution below is an example.
See http://radar.oreilly.com/archives/2007/05/trends_of_onlin.html
Trends of Online Mapping Portals
Posted: 21 May 2007 04:34 PM CDT
Last week there were several announcements that show the direction of the online mapping portals. Satellite images and slippy maps are no longer differentiators for attracting users; everyone has them, and, as I noted last week, companies have now cropped up to service companies that want their own maps. Some of the new differentiators are immersive experiences, owning the stack, and data!
Immersive experience within the browser – A couple of weeks ago Google maps added building frames that are visible at street level in some cities. These 2.5D frames are very clean and useful when trying to place something on a street.
Now the Mercury News (warning: annoying reg required; found via TechCrunch) is reporting that these buildings will soon be fully fleshed out.
The Mercury News has learned that Google has quietly licensed the sensing technology developed by a team of Stanford University students that enabled Stanley, a Volkswagen Touareg R5, to win the 2005 DARPA Grand Challenge. In that race, the Stanford robotic car successfully drove more than 131 miles through the Mojave Desert in less than seven hours. The technology will enable Google to map out photo-realistic 3-D versions of cities around the world, and possibly regain ground it has lost to Microsoft's 3-D mapping application known as Virtual Earth.
The license will be exclusive, but don't think Google will be the only ones with 3-D in the browser. Microsoft has had 3-D for a while now (unfortunately, it requires the .NET framework; my assumption is that the team is busy converting it to Silverlight). 3-D is going to become a standard part of mapping applications. The trick will be making sure that the extra data doesn't get in the way of the user's quest to get information. Buildings are slow to render and can obscure directions.
This strategy is a nice complement to their current strategy of gathering and harnessing 3-D models from users. Currently these are only available in Google Earth. The primary place to get them is Google's 3D Warehouse. I suspect that we will start to see user-contributed models on Google Maps.
No word on how many cities Google will roll out their 3D models in or when the new data will be available via their API.
Data, Data, & More Data – Until recently, search engines did not provide neighborhoods as a way of searching cities. Neighborhoods are an incredibly useful, if hard-to-delineate, way of describing an area of a city.
Google has now added neighborhood data to their index, but they have not really done much with it. If you know the neighborhood name, then you can use it to supplement searching a city. However, if you are uncertain of the name, or unaware of the feature, then you are SOL. There is no indication that the feature exists, how widespread it is, or what the boundaries of the neighborhood are. I hope that they continue to expand on this feature.
Ask on the other hand has done a great job with this feature (see above). They surface nearby neighborhood names for easy follow-on searches (see below). They show you the bounds of the neighborhood quite clearly.
Ask is using data from SF startup Urban Mapping. Urban Mapping claims complete coverage of ~300 urban areas in the US and Canada (with Europe coming). This isn't an easy problem. Urban Mapping has been working at it for quite some time and is known for having a good data set. They have also been aggregating transit data. An interesting thing to note is that many of the same neighborhoods available on Ask are also available on Google Maps (examples: Tenderloin, SF: Google, Ask; Civic Center, SF: Google, Ask). No word yet on whether any of the other big engines are going to add neighborhood data, but my guess is that it will soon become a standard feature; it's too useful to not have.
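Under the hood, serving a query like "what neighborhood is this point in?" comes down to a point-in-polygon test against boundary data like Urban Mapping's. A minimal sketch of the standard ray-casting approach; the boundary coordinates below are hypothetical rectangles for illustration, not Urban Mapping's actual (far more detailed) polygons:

```python
def point_in_polygon(lon, lat, polygon):
    """Ray-casting test: is (lon, lat) inside polygon (a list of (lon, lat) vertices)?

    Casts a ray west from the point and counts boundary crossings:
    an odd count means the point is inside.
    """
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Does edge (i, j) straddle the point's latitude?
        if (yi > lat) != (yj > lat):
            x_cross = xi + (lat - yi) * (xj - xi) / (yj - yi)
            if lon < x_cross:
                inside = not inside
        j = i
    return inside

# Hypothetical, grossly simplified boundary for one SF neighborhood.
neighborhoods = {
    "Tenderloin": [(-122.419, 37.781), (-122.406, 37.781),
                   (-122.406, 37.787), (-122.419, 37.787)],
}

def find_neighborhoods(lon, lat):
    """All neighborhoods whose boundary contains the point."""
    return [name for name, poly in neighborhoods.items()
            if point_in_polygon(lon, lat, poly)]

print(find_neighborhoods(-122.410, 37.784))   # a point inside the sketch boundary
```

Real neighborhood boundaries are fuzzy and contested, which is part of why this data is hard to build; the lookup itself, though, is cheap.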
Own the Stack – Until recently, Yahoo! used deCarta to handle creating directions (or routing). They have announced that they have taken ownership of this part of the stack and have built their own routing engine. Ask and Google still use deCarta. Microsoft has always had their own. Yahoo! is hoping to make their new engine a differentiator. In some ways this is analogous to Microsoft's purchase of Vexcel, a 3D imagery provider. Microsoft did not want the same 3D data as Google Earth or any other search engine for its 3D world.
I think that any vendor servicing Google, Microsoft, Ask, Yahoo or MapQuest will have to keep an eye on their next source of revenue. Those contracts aren't going to necessarily last too long. The geostack is too valuable to outsource.
There is only one part of the stack that I think *might* be too expensive for any one of the engines to buy or build outright. That's the street data, a data source primarily supplied by two companies, NAVTEQ and Tele Atlas. NAVTEQ has a market cap of 3.5 billion dollars as of this writing; Tele Atlas has one of 1.4 billion pounds. These would be spendy purchases. Microsoft is currently working closely with Facet Technology Corporation to collect street data for cities to add a street-level 3D layer (see Facet's SightMap for a preview), but Facet is not collecting data to match the other players. It will be interesting to see if Yahoo! parlays its partnership with OpenStreetMap into a data play.