Earlier this year I was saddened to learn that Andreas Lipphardt, the co-founder and chief executive of the business intelligence software company BonaVista Systems, died tragically on January 27. With a team of never more than three or four people, BonaVista Systems made useful, affordable, effective, and desperately needed data visualization tools that functioned as add-ins to Excel. The company’s first and best-known product was MicroCharts, which added to Excel the ability to embed sparklines and bullet graphs into cells of a spreadsheet.
Andreas was the heart of BonaVista Systems. Following his death, the company has now ceased to exist. He was a good guy who untiringly dedicated his bright talent to the creation of tools that really worked. He was my friend. I will miss him greatly.
I first met Andreas when his submission to a data visualization competition that I judged in 2006 won the prize for best dashboard. He used an early version of MicroCharts, the first product to incorporate bullet graphs, to create his dashboard. After this, we became acquainted, initially via email, and eventually in person on several occasions while collaborating on a project that resulted in a product named Chart Tamer. I fondly remember spending a cold winter day together in his apartment in Darmstadt, Germany, drinking coffee while sitting in front of his computer in the early days of that project. We met on other occasions as well, once nearby when Andreas and his loving partner were vacationing in San Francisco, and once in London.
I was always impressed by Andreas’ ability to do so much in such a short period of time with limited resources. I was also impressed with his perseverance when faced with seemingly insurmountable problems, which almost always resulted in an innovative solution. The flipside of these great qualities was the fact that he worked too hard. Now that Andreas’ young life has been cut short, I’m reminded of how precious life is and how important it is that we live it fully, which involves setting the work aside at times.
Andreas managed to make the world a better place through the products that he created, and for that we owe him our thanks. His dedication to excellence was rare. I’ve lost a friend and the world at large has lost someone who understood the potential of technology for good and did all he could to make that potential tangible.
When you’re looking for reliable information—especially guidance—pay close attention to the sources. Experience has taught me to approach information with a skeptical eye and to always identify and scrutinize the source. This has become especially important since the advent of the Internet. The anonymity of the Web makes it easy for people to claim expertise or to feign objectivity that is lacking. Organizations often publish information that is tailored to serve their own interests, and their interests are often not ours. When you know the source and are aware of its motives and biases, you can take them into account. When sources are concealed, however, especially when they create the impression of independence and objectivity, you may trust them in error.
Did you know that several sources of information about business intelligence that appear independent and objective have hidden interests and affiliations? In most cases the objectivity of so-called independent organizations is compromised by the fact that they are funded through advertising and sponsorships from the very vendors that they are supposed to objectively evaluate. Sometimes, however, affiliations are more intentionally masked. For example, if you have an interest in dashboard design, you might have visited the website Dashboard Insight in search of advice and examples. Even if you didn’t expect the site to provide great expertise, you certainly expected it to be objective. Although many software vendors advertise on Dashboard Insight and the articles, papers, and examples that you’ll find there come from various sources (mostly vendors), if you tally the site’s content per vendor, I believe you’ll find that one in particular is more visible than you would expect based on its share of the market: Dundas Software. This isn’t an accident. Dashboard Insight is owned by Dundas Software. Unless you visit the site’s Privacy Statement page where this affiliation is mentioned, you would never know this.
I’ve visited Dashboard Insight myself many times over the years in search of dashboard examples. Until fairly recently, I did so without knowledge of Dundas’ involvement.
My first direct interaction with someone at Dashboard Insight occurred in 2010. On January 13 I received this introductory email from Steve Bogdon:
Good afternoon Stephen,
As you know, Dashboard Insight is a key resource for decision makers in the BI and data visualization industry. We feature innovative articles, expert interviews, the latest news and much more - from all over the world. There’s always room for a variety of views on any subject.
In February, Dashboard Insight will be taking a close look at “trends in BI and data visualization.” As a highly regarded dashboard expert, I believe an article written by you commenting on recent data visualization trends (as well as where you think the industry is heading over the next year) would be an excellent addition to our venerable article library.
Would you be interested in writing this article for our readers?
By the way, when I started in this industry, almost 4 years ago, the first book I came across was “Information Dashboard Design” written by yourself. This book was an excellent tool and provided a strong foundation for the next few years of continued learning. Thank you.
I appreciate the invitation to contribute to Dashboard Insight. Unfortunately, I won’t be able to write anything for February, because I’m booked solid for the next few months. My schedule tends to fill up well in advance.
I’m curious about what you’re doing. I’m aware of your site, of course, but haven’t examined what you do closely. Are you making an effort to improve dashboard technology and its use or merely to serve as a forum for dashboard related content and a venue for dashboard vendor advertising without making judgments about effectiveness? I ask, because I believe that sites such as yours have an opportunity, and even a responsibility, to nudge readers toward effective practices. This conflicts, however, with a model that accepts advertising from any vendor that’s willing to pay and accepts content regardless of merit. In the past, I’ve ceased working with so-called independent, vendor-agnostic BI publications and conferences because they were in fact influenced by advertisers to favor their products and to censor criticism of them. I hope you’re in a position to exercise leadership by selecting good content that doesn’t just seek to promote vendor interests and to turn away advertising from vendors whose products are ineffective. I think your readers would appreciate objective guidance that looks out for their interests.
(Note: In the interest of full disclosure, before Dashboard Insight originally launched its website I received an email from a colleague who mentioned it, including the fact that it was affiliated with Dundas Software. Much later, when I ran across the site on my own, my aging brain had lost track of this little fact and nothing that I found on the website reminded me.)
Bogdon’s reply contained the following information about Dashboard Insight:
As for what we are trying to do … we pride ourselves on being a completely free dashboard/BI resource destination for the community. Almost all of our articles are vendor neutral and are donated by people like yourself, the experts in this space. The dashboard examples shown on our site are generally not vendor neutral but are still a great resource for the community to get dashboard-development ideas. We encourage comments at the bottom of every article and dashboard page - this allows our audience to ask questions or perhaps make suggestions to the author. Dashboard Insight is visited regularly by students at business schools including Harvard, St. George and Pepperdine (I know one professor at Pepperdine talks about us in class and encourages his students to visit our site).
One project currently in the works that may be of interest to you is our new “getting started” section. We are launching a special menu system with the needs of the “beginner” user in mind. The goal of this project is to guide someone who may not even know what a dashboard is directly to the information they need. There will be no excessively technical articles found there, just basic getting started-type articles. For example, we will have sections like: “I have mountains of data, what should the next step be?” and “Will a dashboard help me?” A user can select one of these sections and a list of relevant vendor-neutral articles will then be available. This will be a very user-friendly interface. We are currently gathering a list of getting started topics and articles, perhaps you have some suggestions?
We generate our revenue by website advertising via banner ads and our business intelligence directory. This is a level playing field, as all advertisements are weighed equally and no special treatment has been offered to the vendors advertising with us. We even have a section in our directory for anyone in this space to list their company free of charge. All advertisements (both free and paid) are checked by myself to make sure they truly belong on Dashboard Insight. We have turned away many potential advertisers who did not fit in accordingly.
“Level playing field”? “No special treatment”? This email was a perfect opportunity for Bogdon to reveal Dashboard Insight’s affiliation with Dundas. It should have been obvious that this affiliation would concern me.
Later that year, still not aware of the affiliation, when Bogdon asked if he could republish one of my articles on Dashboard Insight, I responded as follows:
I’m sorry to disappoint you—I really am—but I just can’t have my work featured alongside stuff like this:
He was unhappy with my stand.
Steve I really do not understand your concern with this. Almost all of our content is vendor neutral and published as a resource (learning tools) for this community. The example you have given below is not vendor neutral, no dashboard is as someone created it and expects credit for it. It does however give an example of a dashboard that our readers can learn from. Between you and me this would likely be one for the “what not to do category” but it is still an example. We use these as examples to draw our audience into the site from the search engines. Between these examples and the small amount of advertising (less than 30k per year) that funds this publication we are able to bring the remaining 90% of the articles that are vendor neutral to our audience. Like any business we need to keep the lights on. It is a little confusing that someone like you who lists himself as a leading expert in data visualization that helps organizations learn about this technology is unable to see good in the service we are providing to this community. Being so respected in this community has driven my audience to request your expertise and views on issues from time to time, it would have been nice to bring this to them.
You’re reading something into my response that wasn’t intended. I did not say that you’re not providing a useful service. I said that I don’t want my work to be exhibited alongside examples of bad practices. To do otherwise would compromise the integrity of my work and cause confusion. People out there who rely on us for help deserve better. There’s enough confusion out there already.
About six months later I was contacted by Dashboard Insight again, this time by its new leader, Alexander (Sandy) Chiang.
There have been recent changes at Dashboard Insight and I thought this is an appropriate time to reconnect with you. I have been brought on as Research Director at DI and one of my mandates is to up the quality of the content. To accomplish this, I need to start being more critical in choosing the dashboards we feature. Going forward, the articles we post must be aligned with actual dashboard design and data visualization best practices. However, I still have to fulfill the rest of DI’s obligations on posting dashboards and articles for the month of June. After that, it’s a new Dashboard Insight.
I cannot remove content, but what I am going to do is make better content more visible. Suffice to say, there’s a lot that needs to be done and I think a great way to start this new direction is an interview with you. I think this interview will help you get your message to our audience and DI would benefit from information.
If you’re interested, I can send you a list of questions. You can take your time in responding, and once that’s done, we can post it on DI. I look forward to hearing from you.
I was encouraged by Sandy’s plans and agreed to do the interview, which was eventually published on the site. Just before publication, Sandy wrote the following:
Out of courtesy, I wanted to let you know that I will be reaching out to two vendors who you have felt in the past has done a good job at adhering to dashboard and data visualization best practices: TIBCO Spotfire and Tableau. I am NOT saying you are endorsing any of their products in this interview (as you don’t) nor have you suggested asking them for paid promotional activities. In addition, I do not plan on putting their ads anywhere in the actual interview text itself so it doesn’t take away from the intent of the interview. However, I will be pitching the idea of having their banner ads in the usual spots (top of the site and on the right side).
Even though Tableau and Spotfire are two of the relatively few vendors with data visualization products that I like, I discouraged sponsorship.
I would prefer it if no ads were visible in conjunction with the interview. If you can do this, I’d appreciate it.
You’re in a difficult position. By funding your site through vendor advertising, you open yourself to vendor influence. Even if you manage to resist this entirely, this possibility of vendor influence will always undermine the credibility of your site as long as you accept advertising from the very vendors whose work you are supposed to objectively critique. One of the reasons that I rarely write for websites other than my own is because I do not want advertising associated with my work.
Sandy wasn’t able to honor my request. Late in 2011, after he and I had a chance to meet when he attended my public workshop in San Francisco, Sandy asked if I would write an article for Dashboard Insight. Still wanting to support his efforts to improve the site, I initially agreed, but when Sandy raised the issue of vendor sponsorship again, I had second thoughts.
This matter of vendor sponsorship has prompted me to take a fresh look at your website, which has renewed old concerns of mine. Despite your good intentions, your site is still aligned with vendors to an uncomfortable degree. The home page alone made me cringe, with the Dundas ad featured so clearly at the top with its moving bubbles, which is an eyesore that conflicts with the principles of non-distraction that I teach. As I looked further, I found that most of your educational content was written by vendors and is designed to promote their products more than to teach useful principles and practices. Your site would be so much more useful to people if it were free of vendor content.
One of the reasons that I stopped writing for the B-Eye-Network several years ago was the fact that bad products were being promoted alongside my articles, which I couldn’t tolerate because it contributed to the confusion that was already rampant among people who were trying to implement dashboards and other forms of data visualization. I should have thought this through more carefully when you asked me to write a white paper for your site. Had you mentioned sponsorship when we first spoke about this, I would have never considered doing it. Now that this issue has been raised, I find myself in the uncomfortable position of needing to back away from our agreement. I’m sorry to disappoint you, but this decision feels right, given the circumstances. As long as your business model requires you to feature advertising and other content from the vendors, you cannot serve as an objective resource for your readers.
Sorry, my friend. I know that you want to do good work. Until you can find a way to distance yourself from the vendors whose products exhibit most of the ineffective dashboard practices that people struggle with, however, the usefulness of your work will be compromised.
A particular sentence in Sandy’s subsequent response caught my attention: “I do wish I could get away from vendor support but that’s the business model the powers that be have decided.” I wrote him back to ask: “Who are the ‘powers that be’? I assumed that you owned and ran Dashboard Insight independently.” It was in Sandy’s response to this question that I learned that Dashboard Insight was owned by Dundas Software and that he answered to them.
You can imagine my dismay and disappointment. I responded by encouraging Sandy to make this affiliation obvious on the website. He proposed this to Dundas’ management and afterwards told me that they would eventually follow my advice and make the affiliation known. That was in November of 2011. We corresponded about this several times in the months that followed, and I could tell that Sandy was doing what he could. Here’s an excerpt from the last email that I received from Sandy, earlier this month on March 8, 2012:
I will be leaving Dashboard Insight and tomorrow will be my final day. In the past, we spoke about your concern regarding Dashboard Insight’s non-disclosure of its affiliation with Dundas.
With that being said, Adam (who is CCed on this email) will be running Dashboard Insight going forward. He will address any PR related issues.
I’ll let Adam take over from here.
I haven’t heard from Adam. Perhaps, now that the cat is out of the bag, Dashboard Insight will itself clearly disclose on its website what I have revealed here. If so, visitors to the site who come in need of information will know the source and be able to take its interests into account.
We often speak of finding and following our passions. Bret Victor, a talented data visualizer and deep thinker on many topics, believes in finding and following great ideas as guiding principles. I met Bret several years ago and became reacquainted with him and his work again this year. One of his recent projects was the creation of the interactive graphics that accompany Al Gore’s newest book Our Choice. Unlike many interactive infographics, Bret’s are both beautiful and uncompromisingly useful, perfectly suited to the audience and task.
Bret and I share an interest in ideas that can serve as guiding principles. We both approach our work in the service of these principles with a keen sense of responsibility. One of the guiding principles of my work can be expressed as:
Individuals usually make better decisions when they learn to think critically (a.k.a., scientifically) and have broad access to information. These individuals can have a positive effect on others if they learn to present information clearly, accurately, and truthfully. Better thinking, based on good information, paired with effective communication, can produce a better world.
Bret talks about the power of guiding principles in a video recording that you can watch online. One of his own guiding principles, which he pursues in innovative ways, is the idea that the creative process works best when we have an immediate connection with the object of creation. Immediate feedback while creating works around the limitations of our brains, allowing us to see what we would otherwise have to imagine with great difficulty. I recommend that you watch this video. You’ll find Bret’s presentation both brilliant and inspiring. If your interest is piqued, you can learn more about Bret Victor and his work at www.worrydream.com.
When Business Intelligence software vendors that are to blame for keeping the information age advancing at a snail’s pace have the chutzpah to give advice about data science (the in-vogue term today for data sensemaking), I find it difficult to remain silent. The latest entry in the “Advice from the Clueless” category is an interview with Timo Elliott of SAP Business Objects, titled “What Is a Data Scientist? SAP’s Timo Elliott Says Leadership,” which appeared in Forbes on February 22, 2012. I’ll warn you now that my comments in this blog post are dripping with disdain. A less acerbic response would lack honesty.
Rather than writing a thorough review of Elliott’s comments, which isn’t warranted, I’ll just feature a few quotes from the interview, followed in each case by a short rejoinder.
Timo Elliott, Senior Director, Strategic Marketing, SAP Business Objects
To begin, let’s put Elliott’s comments into context by looking at his experience:
Elliott performed analytics for Royal Dutch Shell for about a year ending in 1988, when he joined BusinessObjects in Paris as the eighth employee. He has been with the company, now part of SAP, for more than 20 years.
“About a year” of analytics experience 24 years ago? Well, in that case, let’s hang on his every word.
We have now entered an era in which technology is no longer the primary bottleneck to extracting meaningful business value from data. The primary bottleneck is actually human leadership, according to Timo Elliott, Senior Director, Strategic Marketing at SAP BusinessObjects. In other words, to expand the impact of data on your business, it is time to balance the focus on the technology of analyzing data with development of leadership in order to make sure that technology is put to good use.
SAP would love for us to believe that technology is no longer a bottleneck. Human leadership is indeed part of the problem, but savvy leaders would realize that technologies are still in fact a big part of the problem that we face. A good leader would advise the organization to stop relying on vendors like SAP when attempting to make sense of data.
Historically, what we now call “data science” has been somewhat limited to Web companies, which have had wide access to their activity logs and have been able to devise excellent products from them. But now, through the release of the latest crop of visualization and business intelligence (BI) products, that kind of capability now exists for all kinds of companies.
This is indeed beginning to happen, no thanks to SAP. Those who are using the term “data science” meaningfully and with integrity are doing so, in part, to distance themselves from the likes of SAP Business Objects, which has so far provided nothing useful but production reporting systems. When marketers like Elliott use terms like “data science,” they’re trying to give the impression that they’re on to something new and better without any real substance to support the claim. This is the same organization that introduced the embarrassingly impoverished Business Objects Explorer as “revolutionary.”
The reason many BI projects ultimately fail is too much focus on technology.
I couldn’t agree more. This is also the reason why so many BI products fail. Focusing exclusively on technology without understanding the needs and abilities of those who will use it—a common pitfall in software development—produces products that only a software engineer could love. This is especially true of products that support thinking, which must interface seamlessly with human perception and cognition. SAP Business Objects knows how to build production reporting systems, but not how to build tools for interacting meaningfully and efficiently with data.
“You have Scotty down in the engine room,” Elliott says. “He’s the guy who understands the technology perfectly, but he’s not the one leading the ship. You need the whole crew. You’ve got Spock, who’s an analytics person; you’ve got Bones, who’s the human relations person who does the emotional side; you’ve got Uhuru [sic] on communications. But the key person is Captain Kirk. Captain Kirk doesn’t know how the fusion generator works. He is a decider. His job is to lead people into whatever the situation is and make those tough decisions.
Much like former President George W. Bush, the “deciders” in many organizations are not in touch with the data. Relying on deciders in leadership (the Captain Kirks of the organization) will only work if they actually do have some idea of how that fusion generator propels the ship.
In some ways, a data scientist is equal parts Captain Kirk and Mary Leakey, the best-known member of the team that discovered and interpreted the early human skeleton “Lucy” in Egypt. The data scientist is part ship’s captain, part anthropologist. The data scientist is aware of the complexity of the systems at hand, but is less a deep technology expert than a comprehensive evaluator of the modalities of data used in an organization.
Okay, I’ll start by admitting that I don’t know what “comprehensive evaluator of the modalities of data” means. Probably nothing, given that marketers like Elliott don’t have to make sense as long as their words sound impressive. Unfortunately, data scientists—those who understand what’s going on based on evidence derived from data—are rarely given leadership positions. Knowing what’s really going on is seldom given priority in organizations.
“What’s not data science is a business person with a business question going to an IT organization saying, ‘Give me this report,’ and the IT person coming back and saying, ‘Here’s your report,’” Elliott says. “That is not data science. Why? Because it’s not about the interface. That’s the business person basically trying to do their current job in the current way, using a little more data. That can be worthwhile, and I’m not saying IT organizations don’t provide value when they do that, but it’s not data science.
I agree that this scenario doesn’t reflect data science. But this scenario accurately represents the very model that SAP has fostered for years and continues to foster in its products today.
“Ultimately, the reason why a lot of the companies that have people called ‘data scientists’ are successful is not only because of the data scientists and their skills, but also because the people that run those companies are keenly aware how much of a difference data can make to their businesses.”
The suggestion that organizations with people called data scientists, rather than data analysts, business intelligence professionals, or decision support specialists, are more successful than others is pure nonsense. The term data scientist in most organizations is just the latest term that’s being used by people who do precisely the same work as those who use the other titles. Changing what you call these folks doesn’t magically improve their work.
Even as data-science technology is on the upswing—IT spending per head…may actually jump 60 percent in the coming years, according to Gartner—there is a growing realization among the most data-savvy companies that the culture is just as important as the technology.
And if Gartner says it, we know what that means, don’t we? After all, Gartner’s magic quadrant claims that SAP Business Objects is the second most visionary vendor in business intelligence, second only to IBM, neither of which has demonstrated any real vision in its products for years.
“Most of us are just really bad at analyzing information,” Elliott says.
Yep, this is absolutely true. Why? Because most people haven’t learned to think critically, haven’t learned basic analytical skills, and have grown less capable to the degree that they rely on dumb technologies such as SAP Business Objects to do their thinking for them. If SAP wants to provide leadership in data science or whatever you choose to call the work of data sensemaking, they themselves have some learnin’ to do. Until then, perhaps they should remain silent and concentrate on listening.
Jeff Heer of Stanford University and Ben Shneiderman of the University of Maryland have co-authored a wonderful new paper titled “Interactive Dynamics for Visual Analysis.” With so much emphasis today on the visualizations themselves, Jeff and Ben are encouraging us to also attend to the interactions with those visualizations that are required for effective analysis. Data exploration and sensemaking (a.k.a., exploratory data analysis) require constant and fluid movement from one view to the next, rapidly changing how we’re viewing the data to pursue each new question that arises. Interactions are required to alter the view, and it’s important that those interactions be quick and easy; otherwise, our minds will be distracted from the flow of analysis.
In this paper, a taxonomy of 12 interactions, organized into three categories, is proposed: data and view specification (visualize, filter, sort, and derive); view manipulation (select, navigate, coordinate, and organize); and process and provenance (record, annotate, share, and guide).
Jeff and Ben are two of the smartest, most articulate, and most productive researchers in the field of information visualization. This paper is well worth your time. Read it and then consider how well the data analysis tools that you currently use support these interactions.
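To make the first of the paper’s three categories concrete, here is a minimal sketch, not taken from the paper, of what its “data and view specification” interactions (filter, derive, and sort) look like when expressed as operations on a small set of records. The record fields and variable names are my own illustrative choices, not anything the paper defines; the point is only that each interaction transforms the analyst’s current view of the data.

```python
# A small, hypothetical record set standing in for whatever data an
# analyst is exploring. Field names are illustrative only.
records = [
    {"region": "East", "sales": 120, "cost": 90},
    {"region": "West", "sales": 200, "cost": 150},
    {"region": "East", "sales": 80,  "cost": 70},
]

# Filter: restrict the view to records that meet a condition.
view = [r for r in records if r["region"] == "East"]

# Derive: compute a new value from existing fields, extending the view.
for r in view:
    r["margin"] = r["sales"] - r["cost"]

# Sort: reorder the view so the most interesting records surface first.
view.sort(key=lambda r: r["margin"], reverse=True)

print([(r["sales"], r["margin"]) for r in view])  # → [(120, 30), (80, 10)]
```

Each step is trivial on its own; the paper’s argument, as I read it, is that a tool must let the analyst chain such steps quickly and fluidly enough that the mechanics never interrupt the train of thought.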
Data visualizations can be designed to look beautiful, if you possess the required visual design skills. The question is, “Should data visualizations be beautiful?” For years a battle has raged between infographic designers who emphasize the importance of aesthetics and data visualizers with a more practical bent who focus on the degree and quality of understanding that results. Those in the aesthetics camp argue that if an infographic is not eye-catching, no one will look at it, and that compromises in the quality of communication are justified as a means to capture the reader’s attention. Those in the optimal-understanding camp argue that the reader’s attention is wasted if the visualization does not clearly and accurately tell its story. In truth, most people have joined one camp or the other, not because of deep thinking on the topic, but because of preferences formed by their experience or lack of it. I’ve tried to occupy a middle ground, pointing out that visualizations can be both aesthetically pleasing and fully informative, without compromising either concern, but that this takes a high degree of visual design and communication skill. While the battle rages, however, fundamental questions are being ignored.
Should data visualizations be beautiful?
What qualifies as beautiful?
If you believe as I do that data visualizations, despite secondary variations in purpose, are always meant to inform, then their effectiveness is determined by the degree and quality of understanding that results. Therefore, a data visualization should only be beautiful when beauty can promote understanding in some way without undermining it in another. Is beauty sometimes useful? Certainly. Is beauty always useful? Certainly not.
What’s always required is that a visualization work for human eyes, which means that it should not be visually displeasing. A few basic principles of visual aesthetics can be followed—good color choices, legible fonts, proper placement and spacing, etc.—to achieve this result. Making a visualization beautiful is rarely required and it is usually not worth the effort unless your audience is huge and the information is really important. In addition, it can often work against the goal of informing. Making a data visualization beautiful in a way that compromises the integrity of the data always works against you. Even when the information is not compromised, however, beauty can work against you by drawing attention to the design of the visualization rather than the information that it seeks to communicate. Think back over your life and ask: “Were the people who influenced and taught me the most all physically beautiful? If they were wrapped in a different physical package, would that have affected their ability to influence me or my ability to listen to them? Did I ignore information that wasn’t delivered by stunningly attractive people?” Beauty is not the goal of visualization and it is usually not required to achieve the goal.
On those occasions when making a data visualization beautiful is truly useful, we must face the fact that beauty is indeed “in the eyes of the beholder.” What qualifies as beautiful for some is not beautiful to others, beyond the basic aesthetics that I referred to earlier, which are rooted in visual perception. Most of what we deem beautiful is a product of culture and experience. If you love wine, as I do, you probably no longer prefer the wines that you found pleasing in the beginning. The fruit-bomb California Zinfandels that I loved in the past are no longer palatable to me. I now prefer wines that were crafted in the European tradition to produce greater subtlety and depth of character and to pair well with food.
To further illustrate this point, I’ve found that, when arguing for the importance of beauty in data visualization, people often illustrate their position using works by infographic designers such as David McCandless. To my eyes, however, even when I ignore the fact that the information has been ravaged, I rarely find his work beautiful. Obviously, some people see his work differently than I do, but that’s precisely my point. Beauty is a moving target. What qualifies as beauty varies with the tastes of the audience.
If you’re a gifted graphic artist and communicator and have the skill that’s required to craft beautiful data visualizations when they’re needed, that’s wonderful, and I wish you well. Just don’t hinder the advance of data visualization by arguing that it must always be beautiful. Remember that the goal is to enlighten.
Boris Evelson of Forrester Research has been singing off-key about data visualization recently and he doesn’t seem to realize that he’s tone deaf on this topic. Have you ever noticed that when people become recognized as experts in a particular field, they sometimes think this magically grants them expertise in other fields as well? Expertise requires study and years of practice, practice, practice. I’m particularly sensitive to this tendency when BI generalists give opinions about data visualization without taking time to understand it. This bothers me because people put their trust in “experts” and make costly decisions based on their opinions. My ire was most recently raised when reading statements by Evelson about data visualization as quoted in an InfoWorld article by Chris Kanaracus.
Evelson and I exchanged strong words back in 2009 when he deigned to list the features of “advanced data visualization” in his blog. His list was nonsense and I said so. Long after the dust settled, Evelson contacted me to ask if I’d be willing to advise him on matters related to data visualization. He should have asked my advice before his interview with Kanaracus.
Here’s the section of the article “Tableau BI visualization tools with user-centric design” (InfoWorld, January 18, 2012) that cites Evelson’s opinion:
Until the in-memory addition, Tableau wasn’t necessarily something a company already invested in a BI platform from SAP or Oracle would need, according to Forrester Research vice president Boris Evelson. “These days all of the other vendors have perfectly fine data visualization capabilities,” he said. “Now they let you do this in-memory, which very often is what the business users want. They don’t want to be restricted to the underlying database structure.”
At the same time, Tableau and its competitors need to further differentiate themselves. Microsoft is pushing PowerPivot as an extension of Excel with not much of a learning curve, while Spotfire features integration with Tibco’s middleware stack and offers advanced analytic capabilities, he said.
However, “whatever [Tableau] is doing, they’re doing it right,” as Forrester client interest in the company has jumped significantly of late, Evelson added.
I can imagine the mixed feelings of Tableau’s leaders when reading Evelson’s words: grateful he said that “they’re doing it right” but cringing to have these words spoken by someone who doesn’t actually understand what they’re doing and what makes it right.
Tableau did not suddenly become relevant to organizations with big BI product stacks when they introduced in-memory data handling. All along, these organizations have needed what good data visualization vendors like Tableau and their kin have been providing—effective ways to explore and analyze data—because the big BI vendors haven’t provided it and still don’t, which brings us to Evelson’s most naïve and potentially harmful statement: “These days all of the other vendors have perfectly fine data visualization capabilities.” After I read this, my wife mistook my convulsions for a seizure. Evelson’s statement couldn’t be more wrong. To date, none of the big BI software companies support data visualization in a manner that is “perfectly fine” or even reasonably adequate. They allow you to view data in graphs, but do so in embarrassingly inadequate ways. This inadequacy is especially apparent when we narrow our focus to exploratory data analysis, which requires meaningful and rapid interaction with data. Neither PowerPivot from Microsoft, nor Business Objects Explorer, nor any of the other attempts that I’ve seen by big BI vendors to enable exploratory data analysis has advanced past kindergarten. To draw on my Biblical roots for a moment, good visual analysis products such as Tableau, Tibco Spotfire, and SAS JMP lead people who have previously stumbled around in the dark using clunky BI products to exclaim “I was blind but now I see.”
Finally, back to our friend Boris Evelson. The best experts in any field are the people who started out as and continue to be the best students. When we stop being students, our expertise ceases to grow. When people recognize you as an expert and begin to hang on your every word adoringly, it’s tempting to wear that mantle with pride, refusing to ever again assume the role of student. If Evelson wants to express useful opinions about data visualization, he’s got some learnin’ to do. This is true of many BI thought leaders. Until then, they should stick to what they know.
When I participated as a judge and speaker at last year’s Malofiej conference in Spain—the Oscars of journalistic infographics—I had a chance to become acquainted with Alberto Cairo, who has since become a trusted friend and colleague. I discovered while at the conference that I had been invited to attend at Alberto’s suggestion. Alberto is one of the few infographic designers that I know who has a mature understanding of the work. Beginning as a journalist in Spain, Alberto approached the work first as a communicator, and second as a graphic designer. He has worked in various parts of the world for news publications and at universities, where he has honed his craft and thought deeply about the principles and practices that make it effective. Now, after most recently working in Brazil, he is teaching in the United States for a second time: at the University of North Carolina at Chapel Hill in the Master of Arts in Technology and Communication program, and at the University of Miami, where he works with communication and journalism students.
Mostly in response to his new book, The Functional Art (so far available only in Spanish, but in the process of being translated into English), Alberto was recently interviewed for an article in the Velora Newsletter. If you’re interested in infographics, I encourage you to read it. Alberto’s thoughtful advice serves as a desperately needed voice of sanity in this time of infographic mania.
You have pointed me to another way of doing it. The same thing can also be achieved by placing your common script in a network share. When you use a share, you are also independent of the Windows file structure.
For instance, you can create the folder “C:\MyScript” and place “Common.qvs” there, then share the folder under the name “QlikView_Script”.
In your QlikView app you can include the script like this: $(Include=\\localhost\QlikView_Script\Common.qvs);
On another machine you can use a different folder, as long as you keep the same share name. For example, the next machine can store the common file in “C:\QlikView\Scripts” but still share it as “QlikView_Script”, which means that the same include statement will still work.
I see. Yeah we have a standard directory structure on all of our QV servers and developers machines. So then we just have a relative path ($(Include=..\..\shared includes\incDataSource.qvs)) to the include file. Many ways to “skin the cat” depending on how you need to do it. QlikView is nice that way.
That is also a very good option. There are multiple ways of solving the inclusion of a common script. I suspect that your include points to a static file path? That’s something I wanted to prevent. But of course in my method I point to a static registry location. Just a matter of preference I guess.
We use a slightly simpler method. We have a common include script that all apps include. The include script detects the name of the machine it is running on, and a case statement then sets a standard set of variables to the appropriate connection strings for that machine. Each app can then use whichever connection-string variables it needs in its script. We try to avoid any registry dependencies for portability’s sake.
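For anyone curious what that pattern looks like, here is a minimal sketch in QlikView script. The machine names and DSN names are hypothetical; ComputerName() is the built-in function that returns the name of the machine the script runs on.

// Common.qvs — included by every app.
// Set connection-string variables based on the machine running the script.
SWITCH ComputerName()
    CASE 'DEV-BOX'
        SET vConnDW = 'DW_Dev';    // hypothetical development DSN
    CASE 'QV-PROD'
        SET vConnDW = 'DW_Prod';   // hypothetical production DSN
    DEFAULT
        SET vConnDW = 'DW_Dev';    // safe fallback for unknown machines
END SWITCH

// An app that needs the data warehouse can then connect with:
// ODBC CONNECT TO [$(vConnDW)];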
You can rename a field with “as”, as in the CustomerCode example below:
CustomerCode as RenamedCustomerCode,
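In context, such a rename happens inside a LOAD statement. Here is a minimal, hypothetical example, assuming a QVD file named Customers.qvd that also contains a CustomerName field:

// Load customers, renaming CustomerCode as the data comes in.
Customers:
LOAD
    CustomerCode as RenamedCustomerCode,
    CustomerName
FROM Customers.qvd (qvd);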
I was listening to “Science Friday” on NPR last week and heard about the work of Ted Kaptchuk, Director of Harvard University’s Program in Placebo Studies and the Therapeutic Encounter. I was particularly interested in one of his studies that investigated placebo effects on asthma. This study tested physical effects of real medication vs. placebos as well as patients’ perceptions of the effects. Over the course of 12 sessions, subjects were given three different treatments and a non-treatment session, three times each: 1) an albuterol inhaler, 2) a placebo inhaler, 3) a fake acupuncture treatment, and 4) several minutes in the waiting room with no subsequent treatment. In each case physical effects were subsequently tested by measuring subjects’ lung capacity, and subjective effects were tested by asking subjects to rate their perception of improvement on a scale of 1 to 10. Based on actual lung function, the albuterol inhaler—the only real medical treatment—produced a 20% improvement, while the two placebo treatments and waiting without treatment each produced a 7% improvement. Apparently the mere act of sitting for a while without activity produced some improvement. What’s interesting is that, even though neither placebo produced an effect greater than no treatment at all, indicating the absence of an objective placebo effect, subjective perceptions were quite different. Subjects reported the following perceived levels of improvement: 50% for the albuterol inhaler, 45% for the placebo inhaler, 46% for the fake acupuncture treatment, and 21% for merely waiting without treatment. Both placebo treatments produced subjective perceptions of improvement that were almost as great as the medical treatment.
In an article about this study in the Wall Street Journal’s Health Blog titled “The Placebo Effect, This Time in Asthma” Katherine Hobson wrote:
Isn’t it enough to feel better? In the case of some conditions, yes, says senior author Ted Kaptchuk…
He tells the Health Blog that when it comes to things like asthma or cholesterol or diabetes, while patient reports are important, it’s key to keep tabs on objective measures, too. I may feel great, but if my cholesterol level isn’t budging, the statin isn’t working and my risk for another heart attack isn’t going down.
But in conditions such as depression, pain and insomnia, the subjective response is the main thing being treated. If I’m depressed, take a pill and no longer feel depressed, by definition the medicine is working. There’s no blood or imaging test used to confirm whether my condition is being fixed.
I share Kaptchuk’s opinion. In a case such as insomnia, if a placebo allows someone to sleep, it does the job, and that’s enough. If a patient’s health is at risk, however, making her think she’s getting better when she isn’t is potentially harmful.
As I was listening to this story on NPR, I began to think of similar issues involving data visualization. If people enjoy your infographic, isn’t that enough? Or in the realm of information dashboards, if the CEO has fun looking at the flashy gauges, isn’t that enough? No, it isn’t. Both are meant to inform. To understand the story of an infographic or an organization’s performance on a dashboard requires real information. Enjoying a pretty picture and feeling like you’ve been informed is not the same as the actual understanding that’s needed to make better decisions.
I like to feel good as much as the next guy. Data visualizations often give me great pleasure. I do not think, however, that enjoyment is the goal. It is not essential. In fact, enjoyment that distracts from the information rather than drawing people into it in meaningful and useful ways impedes the goal. When it comes to the real health of people’s minds and decisions, placebos are definitely not enough.