Direct to the Dashboard
Jan 28th, 2008 by JTJ

We've been fans of the dashboard approach for a long time because dashboard graphics can give readers a quick snapshot of multiple sets of dynamic data. Charley Kyd, who studied journalism some years back, has developed a nifty plug-and-play package, Dashboard Kit #1, to generate these. And below is a recent and relevant posting from Jorge Camoes that gives us some good tips on the topic.

Charts



10 tips to improve your Excel dashboard

Posted: 26 Jan 2008 06:42 PM CST

Posts in the series Excel Dashboard

  1. How to create a dashboard in Excel
  2. 10 tips to improve your Excel dashboard





Excel is a great (but underrated) BI tool. Several BI vendors have given up fighting it and now offer Excel add-ins as front ends for their BI solutions. So, if you want to create a dashboard, you should consider Excel: it really does offer better functionality than many other applications, for a fraction of the cost and development time. I know Excel is not a one-size-fits-all solution, but you should first make sure that Excel cannot meet your requirements before looking elsewhere. Let me share some tips from my experience with the Demographic Dashboard.

But shouldn’t I just ask my IT department to create the dashboard?

This is a fact: many IT departments hate Excel. IT spends millions on BI solutions, and users keep using Excel. Why? Because they know it, they like it, they feel in control, and they can do whatever they want with the data. Ask your BI manager to replicate the image above using an expensive BI solution and he’ll come back six months later with something you didn’t ask for, to answer a need you no longer have (I know, I’m oversimplifying…). Do you know Master Foo Defines Enterprise Data?

1. Get to the point: solve a business need

So, you have your idea for a dashboard, you’ve discussed the project with the users (right?), and you are ready. But where to start? Remember this: a graph, a table, the entire dashboard are merely instruments for solving a business need. It’s about insights, not about data, and not about design.

2. Don’t use formulas

Yes, I know, this is Excel, and it is supposed to have formulas. What I am telling you is that you should aim to minimize the number of independent formulas, and this should be a fundamental constraint in your overall strategy. Too often I see Excel used as a database application. It is not one; it is a spreadsheet (not everyone finds this obvious).

Over the years I have had my share of “spreadsheet hell”: a lookup formula in the middle of nowhere references the wrong range for no apparent reason; an update cycle adds a new column and suddenly there are errors all over the place; you leave the project for a week and when you come back you don’t know what all those formulas mean. Even if everything goes smoothly, the auditing department wants to trace every single result.

But how do you minimize the use of formulas? If your data table resides in an Excel sheet you’ll have to rely heavily on lookup formulas, and that’s one of the highways to spreadsheet hell. Instead, get the data from an external source (Access, an OLAP cube…) and bring it into Excel. Calculations should be performed at the source. After removing every formula you can, the remaining ones should be as clear as possible.
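The post stops at the principle, so here is a rough sketch, outside Excel, of what "calculations at the source" can look like. It uses Python with pandas and an SQLite database; the database path, table, and column names are invented for illustration.

```python
# A minimal sketch of "calculate at the source": the aggregation happens in SQL,
# and Excel only receives the finished summary table (no lookup formulas needed).
# The database path, table, and column names here are hypothetical.
import sqlite3
import pandas as pd

conn = sqlite3.connect("demographics.db")

# Let the source do the work: one aggregated result set instead of raw rows
# that would need lookup formulas inside the workbook.
summary = pd.read_sql(
    """
    SELECT region, year, SUM(population) AS population
    FROM census
    GROUP BY region, year
    """,
    conn,
)

# Hand the dashboard a clean, pre-computed block of data.
summary.to_excel("dashboard_data.xlsx", sheet_name="BaseData", index=False)
conn.close()
```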

3. Abuse Pivot Tables

Every object (graph, table) in the Demographic Dashboard is linked to a pivot table. Let me give you an example. One of the charts shows population growth over the years, using 1996 as the reference. Pivot tables can calculate that directly, so I don’t need to add a new layer of complexity with formulas (formulas to calculate the actual values and lookup formulas to retrieve them).

The population table has 200,000 records, so it couldn’t fit within Excel’s limit of 65,536 rows (yes, that changed in Excel 2007, but it is debatable whether a table with a million rows in a spreadsheet application can be considered good practice). By using a pivot table I can work around that limit.
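As an illustration of the same pivot logic outside Excel, here is a rough pandas equivalent that summarizes the raw records and indexes every year against the 1996 baseline; the file and column names are invented.

```python
# Rough pandas equivalent of the pivot-table approach: summarize 200,000+ raw
# records into a small table and index each year against the 1996 baseline.
# The file name and column names (region, year, population) are illustrative.
import pandas as pd

records = pd.read_csv("population_records.csv")   # the raw source table

pivot = pd.pivot_table(
    records, values="population", index="region", columns="year", aggfunc="sum"
)

# Growth relative to 1996: divide every column by the 1996 column, row by row.
growth = pivot.div(pivot[1996], axis=0) * 100      # 1996 == 100

print(growth.round(1))
```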

4. Use named ranges

Self-documenting formulas (“=sales-costs” is much easier to understand than “=$D$42-$F$55”) are just one of several uses of named ranges. Named ranges are also the building blocks of interaction with the user, and they make your Excel dashboard more robust.
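As a sketch of the same idea from code (the post itself only shows the formulas), here is how named ranges might be defined with the XlsxWriter library, which is not mentioned in the post; the sheet names and cell addresses are invented.

```python
# Minimal sketch with the XlsxWriter library: define workbook-level names so the
# dashboard formula reads "=sales-costs" instead of "=$D$42-$F$55".
# Sheet names, cell addresses, and values are invented for illustration.
import xlsxwriter

workbook = xlsxwriter.Workbook("dashboard.xlsx")
data = workbook.add_worksheet("Data")
dashboard = workbook.add_worksheet("Dashboard")

data.write("D42", 1200)   # sales
data.write("F55", 800)    # costs

# Named ranges pointing at the raw cells.
workbook.define_name("sales", "=Data!$D$42")
workbook.define_name("costs", "=Data!$F$55")

# The dashboard formula is now self-documenting.
dashboard.write_formula("B2", "=sales-costs")

workbook.close()
```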

5. Use as many sheets as you need, or more

You don’t have to pay for each additional sheet you use in a workbook, so use as many as you need. Each cell in your dashboard report sheet should point to some other sheet where you actually perform the calculations. You should have at least three groups of sheets: the dashboard report itself, sheets with the base data, and another group with supporting data, definitions, parameters, etc. Also add a glossary sheet and a help sheet.
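A possible skeleton of that sheet structure, again sketched with XlsxWriter and invented sheet names and cells, might look like this:

```python
# Sketch of the sheet separation described above, using XlsxWriter.
# Every dashboard cell points at a calculation sheet; nothing is computed in place.
import xlsxwriter

wb = xlsxwriter.Workbook("structured_dashboard.xlsx")

dashboard = wb.add_worksheet("Dashboard")   # the report itself
calc      = wb.add_worksheet("Calc")        # calculations
base      = wb.add_worksheet("BaseData")    # base data
params    = wb.add_worksheet("Parameters")  # definitions, parameters
wb.add_worksheet("Glossary")
wb.add_worksheet("Help")

base.write("A1", 42)                         # stand-in base figure
calc.write_formula("B2", "=BaseData!A1*2")   # the actual calculation lives here
dashboard.write_formula("C3", "=Calc!B2")    # the dashboard only displays it

wb.close()
```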

6. Use autoshapes as placeholders

Once you know what you need, start playing with the dashboard sheet. Use autoshapes to test alternative layouts or, better yet, use real objects (charts, tables…) linked to some dummy data.

7. Get rid of junk

There are two ways to wow your users: by designing a dashboard that actually answers their needs, or by planting gauges and pie charts all over the place (the latter can guarantee you a promotion in some dubious workplaces, but it will not help you in the long run). In the series on Xcelsius Dashboards you can see how difficult it is to create something beyond the most basic and irrelevant charts.

So, get rid of Excel defaults (take a look at this before/after example) and just try to make your dashboard as clean and clear as possible. You’ll find many tips around here to improve your charts, so I’ll not repeat myself.

8. Do you really need that extra-large chart?

Charts are usually larger than they should be. What really matters in a chart is the pattern, not the individual values, and a pattern can be seen even in a very small chart.

9. Implement some level of interaction

A dashboard is not an exploratory tool; it is something that should give you a clear picture of what is going on. But I believe that at least a basic level of interaction should be provided. Users like to play with their tools, and they can learn a lot more that way than by just looking at a static image.

10. Document your work

Please, please, structure and document your workbook. Excel is a very flexible environment, but with flexibility comes responsibility… I am not a very organized person myself, but from time to time I take the tourist’s point of view: I pretend I have never seen the file in my life and try to understand it. If I can’t, or it takes me too long, then either I must redesign it or write a document that explains its basic structure and flow.

Bonus tip: there is always something missing…

Once you have a prototype, users will come up with new ideas. Some of them can be implemented; others will ruin your project, and if you accept them you’ll have to restart from scratch. So make sure the specifications are understood and approved, and that the consequences of a radical change are clear.

This is far too incomplete, but I’ll try to improve it. Will you help? Do you have good tips specific to the design of Excel dashboards? Please share them in the comments.


 

Clever political mapping from the Brits
Jan 26th, 2008 by Tom Johnson

From Poynter's E-media Tidbits

Posted by Paul Bradshaw 10:48:59 AM
Flash, Database, Mapping: Telegraph Does It with Politics

The Telegraph's U.K. election map: Not just pretty, but potentially very useful.

The Telegraph is making a habit of combining Flash and databases to impressive effect. Their latest project also includes mapping to produce a political map of the U.K. with real depth behind its Flashy appearance.

The Telegraph achieves this with some nifty database connectivity. A click on a particular constituency brings up info about the last election results — not in itself very impressive. However, a tab to the right (see detail below: surely this should be the default?) offers a “public services profile” of how health, education and crime have changed — along with (currently empty) spaces for related articles and links.

If and when this works, it promises the sort of connectivity that has been lacking from so much online journalism. But will they be brave enough to link to reports on other sites?

Other features include the Swingometer (see how different swings affect the results), previous results, and lists of vulnerable seats — all of which are now expected, having been done before by the BBC among others (as I reported in the mists of 2005) — while the links to the latest polls add something extra.

Meanwhile, usability is smooth with postcode search, drop-down, and zoom feature, plus the ability to “mark” an area.

Telegraph.co.uk editor Marcus Warren says the tool was prepared for last autumn's “'General Election that wasn't.' It would have been ready for the closing weeks of the campaign, but in the end the Prime Minister thought better of going to the country. So we pursued the project at a slightly less breakneck speed and launched in the political 'new year'.

“It's also part of a more general drive, both by us and elsewhere, to drill down to the local level and exploit data relevant to our audience's lives. We also wanted the tool to be fun. (Originally, for example, the images of the party leaders were caricatures.)”

While acknowledging the influence of the likes of Start Swinging with Peter Snow, Warren says there has been no one model “that made us exclaim: 'We want one of them too.' Like everyone else, we've been keeping an eye on the digital election campaigns in Australia and the U.S., both Google's approach and that of others. World Archipelago has done a great job in building the thing, as have the people here who worked on it.”

The most frustrating thing at the moment about the map is simply the fact that there is no election on yet, which gives the Telegraph team plenty of time to respond to feedback, iron out problems, try new ideas, find out about others through the blogosphere (Warren admits to not being aware of Electoral Calculus until Simon Dickson's post), and be all mysterious about their plans.

As Warren says: “There are lots of clues in there which hint at what else we plan to do with it. And we have other surprises up our sleeve as well.”

For my part I'd like to see some individual RSS feeds and mobile alerts for constituencies, and some tapping into the power of tagging — perhaps automated grabs of delicious bookmarks with clusters of key words in them (or indeed which key words become popular), or getting Telegraph journalists to tag their sources with a particular phrase that is picked up by the engine.

But I'm being fussy. What do you think?


 

The type of service newspapers should be supplying, but are not.
Jan 24th, 2008 by Tom Johnson

This from O'Reilly Radar.  (Yes, whoever wrote the hed surely meant “travel-time maps”.) 

Dynamic Time-Travel Maps From MySociety and Stamen


[Image: London dynamic commute map]

UK-based non-profit MySociety teamed up with Stamen Design to develop some innovative travel-time maps. The snapshot of the map that you see above shows where you can live in London with a commute of between 30 and 60 minutes and a median house price over £230,000. As you adjust the sliders, the map changes in real time, letting you vary the commute time from 0 up to 90 minutes and the housing price from £0 to £990,000. The Department for Transport, which requested the work, is the map's center point (and the basis for the commute times).

You can try out the map after the jump. They also made dynamic maps with the Olympic Stadium and the BBC as the center.


[Image: London commute map]

These maps are an update of Chris Lightfoot's 2006 travel-time maps project. The focus of that project was how to present commute-time data (see the static commute map of London above, with contour lines at half-hour intervals). In this 2007 version the focus was on making the maps interactive. The mapping data comes from OpenStreetMap (converted from the Ordnance Survey data used in 2006).

MySociety is a tech-centric non-profit that focuses on building websites for the civic good and teaching others about the internet. Some of their previous projects have had a more political bent. TheyWorkForYou was their first project: a searchable site that lets British citizens find out what is happening in Parliament. Their most recent site, FixMyStreet, gives neighborhoods tools to discuss local problems.

To generate the maps, MySociety screen-scraped the Transport Direct website. At first they queried for the routes at each public transport stop; with this method, Cardiff took 4 hours and 15 minutes to generate. Next they tried parallel screen scraping and got the time to generate Cardiff down to 45 minutes. They estimate that with a better algorithm they can get it down to 15 minutes, which is still a long time to wait for a map.
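The parallel-scraping speed-up is a generic pattern; a minimal Python sketch of it might look like the following, where the URL template, stop identifiers, and response handling are placeholders rather than Transport Direct's actual interface.

```python
# Generic sketch of the parallel-scraping idea: fetch many journey-time pages
# concurrently instead of one public-transport stop at a time.
# The URL template and stop identifiers are placeholders, not the real
# Transport Direct interface.
from concurrent.futures import ThreadPoolExecutor, as_completed

import requests

STOP_IDS = ["stop-001", "stop-002", "stop-003"]   # hypothetical stop identifiers

def fetch_journey_times(stop_id: str) -> tuple[str, str]:
    """Fetch the journey-time page for one stop (placeholder URL)."""
    resp = requests.get(f"https://example.org/journey-times/{stop_id}", timeout=30)
    resp.raise_for_status()
    return stop_id, resp.text

results = {}
with ThreadPoolExecutor(max_workers=16) as pool:
    futures = [pool.submit(fetch_journey_times, s) for s in STOP_IDS]
    for future in as_completed(futures):
        stop_id, page = future.result()
        results[stop_id] = page          # parse travel times from `page` here

print(f"Fetched {len(results)} stops")
```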

In the future they want to generate these maps on the fly for users. This would require direct access to the data (MySociety estimates that Google Maps, with its lightning-fast routing servers and direct access to the data, could generate Cardiff in only 2 minutes). To achieve this, MySociety is considering building a client app or getting dedicated servers from Transport Direct (as this was a government-sponsored project, anything is possible).

Time is a difficult thing to represent on maps, but it will become more common in the future. Should it be a loop, like the Stamen project Trulia Hindsight, where 20th-century housing data is shown (Radar post)? Should it be sliders, like this project? Or more like a video? (Google Earth lets you “play” your GPS tracks.) We'll explore this more at Where 2.0.




 

So just how much are those guys making, anyway?
Jan 4th, 2008 by Tom Johnson

The editorial team at ResourceShelf and DocuTicker tipped us off to a potentially valuable link from the SEC.  While we find that the site is still short on data, we assume it will be filling up as various corporate reports are filed.  Check out:

New from the SEC: Internet Tool With Instant Comparisons of Executive Pay

Chairman Cox Unveils New Internet Tool With Instant Comparisons of Executive Pay

Securities and Exchange Commission Chairman Christopher Cox today launched the first-ever online tool that enables investors to easily and instantly compare what 500 of the largest American companies are paying their top executives. The new database highlights the power of interactive data to transform financial disclosure.

The Executive Compensation Reader – available today on the SEC’s Web site at http://www.sec.gov/xbrl – builds on the Commission’s new requirements that went into effect earlier this year to dramatically enhance clarity and completeness of executive compensation disclosure.



By tagging the executive compensation figures in XBRL, the computer language of interactive data, the SEC has created a new online tool to help investors more efficiently view Summary Compensation Tables and certain other data in the proxy statements of large companies. Investors can quickly glimpse the total annual pay as well as dollar amounts for salary, bonus, stock, options and company perks. They can instantly compare those executive compensation figures with other companies by sorting according to industry or size.

The SEC’s new Web tool includes information in XBRL for 500 large companies that have filed proxy statements with the Commission. The new tool includes direct links to companies’ proxy statements, including footnotes and the companies’ explanation of their compensation decisions.
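To make the idea of tagged, machine-readable compensation data concrete, here is a toy Python example that parses a simplified XBRL-like fragment and sorts companies by total compensation; the element names and figures are invented stand-ins, not the SEC's actual taxonomy or filings.

```python
# Toy illustration of why tagged (XBRL-style) data is easy to compare:
# parse a simplified XML fragment and sort companies by total compensation.
# The element names and numbers are invented, not the SEC's real taxonomy.
import xml.etree.ElementTree as ET

SAMPLE = """
<filings>
  <company name="Acme Corp"><totalCompensation>12500000</totalCompensation></company>
  <company name="Globex Inc"><totalCompensation>9800000</totalCompensation></company>
  <company name="Initech"><totalCompensation>15300000</totalCompensation></company>
</filings>
"""

root = ET.fromstring(SAMPLE)
rows = [
    (c.get("name"), int(c.findtext("totalCompensation")))
    for c in root.findall("company")
]

# Tagged data sorts and compares trivially, which is the point of interactive data.
for name, total in sorted(rows, key=lambda r: r[1], reverse=True):
    print(f"{name:12s} ${total:,}")
```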

Direct to Executive Pay Finder

Source: U.S. Securities and Exchange Commission


 

The DataWeb and the DataFerrett
Jan 3rd, 2008 by Tom Johnson

Marylaine Block's always informative “Neat New Stuff” [Neat New Stuff I Found This Week at http://marylaine.com/neatnew.html] tipped us to the DataWeb site and its interesting tool, the Data Ferrett (or “DataFerrett”).

“TheDataWeb is a network of online data libraries that the DataFerrett application accesses the data through. Data topics include, census data, economic data, health data, income and unemployment data, population data, labor data, cancer data, crime and transportation data, family dynamics, vital statistics data, . . . As a user, you have an easy access to all these kinds of data. As a participant in TheDataWeb, you can publish your data to TheDataWeb and, in turn, benefit as a provider to the consumer of data.”

What is the DataFerrett?
DataFerrett is a unique data mining and extraction tool. DataFerrett allows you to select a databasket full of variables and then recode those variables as you need. You can then develop and customize tables. From the results in your table you can create a chart or graph for visual presentation in an HTML page. Save your data in the databasket and save your table for continued reuse. DataFerrett helps you locate and retrieve the data you need across the Internet to your desktop or system, regardless of where the data resides. DataFerrett:
* lets you receive data in the form in which you need it (whether it is extracted to an ASCII, SAS, SPSS, or Excel/Access file);
* lets you move seamlessly between query, analysis, and visualization of data in one package; and
* lets data providers share their data more easily and manage their own online data.
DataFerrett runs from the application icon installed on your desktop.


Check it out at http://www.thedataweb.org/


 
