How governments are adopting modern business intelligence

Data analysis company Tableau calls on state and local government leaders to embrace modern business intelligence platforms to help scale services more efficiently.

Efficiency and scalable impact might not be the first things that come to mind when you think of government. The common, almost inherent assumption is that government is slow and bureaucratic. But that’s the old paradigm — the path without data.

In 1993, when former President Bill Clinton signed the Government Performance and Results Act (GPRA) into law, federal agencies began shifting their strategy to measure and report on program performance. Data, not gut feel, became the new foundation for improving citizen services, increasing accountability, and driving mission goals.

Twenty years later, the shift to data has been further amplified, in part due to additional legislation like the Digital Accountability and Transparency Act (DATA Act), signed into law by former President Barack Obama in 2014. Its mandated transparency, which requires all federal agencies to share data sources, critical insights, and reports on matters like budget spending with the public, is gaining traction. However, factors beyond legislation are also pushing governments to become more data-centric.

The newest catalyst driving governments to a data-driven approach is the emergence of modern business intelligence (BI), a methodology that empowers everyone within an organization to access and analyze the data they need. This modern, self-service approach to analytics enables an easier and faster way for both employees and leaders to measure performance metrics across every program in the agency.

Mark Russell, a contracted analyst and systems administrator at the Florida Department of Juvenile Justice (FDJJ), is one of the government leaders changing the way people think about data-led government efficacy. By turning to self-service analytics, Russell is working to empower the government workers in his agency with a 360-degree view of juvenile offenders who are at risk of falling deeper into the system. By giving everyone access to data and insights, workers are better equipped to take action faster.

“When [our data-driven program] works, people develop an expectation that we can get stuff done. That reputation contributes to a view of good government. It means lawmakers trust our input when developing bold juvenile justice reform policies, and have faith in us to carry out those policies. We’re delivering outcomes for the ‘business’ of government,” Russell said.

In the past, when governments relied on traditional BI, IT managed the reporting queue and struggled to keep up with business questions — making timely and trustworthy insights almost impossible.

For the FDJJ, compiling a detailed report about every juvenile offender and their current standing within the system used to take four to five months. When Russell’s team implemented a modern business intelligence platform, the time to insight dropped to two days.

The FDJJ also used self-service data visualization to create the Prolific Juvenile Offenders dashboard, which delivers a complete view of Florida’s most at-risk juveniles. This dashboard enables everyone in the FDJJ, from caseworkers to agency leaders, to drill down into the specifics and understand where an individual is physically located, what offenses they committed, and what treatments they are receiving. The dashboard was further enhanced by a data-sharing agreement with the Department of Children and Families (DCF). Adding this additional layer of transparency allows an even better view into at-risk youth and helps coordinate intervention strategies between the FDJJ and DCF.

Additionally, the FDJJ uses insights from this dashboard to directly influence policy by showing legislators and other stakeholders the impact policy changes have on the budget, and ultimately on the at-risk youth. For example, when juveniles are in the community, the visibility from the dashboard triggers more contact from caseworkers and parole officers. And because the dashboard is updated every six hours, workers in the field — like parole officers and even direct supervisors — have near-real-time information to manage caseloads quickly and effectively.

The Florida Department of Juvenile Justice isn’t alone in its success with modern BI. Governments around the world are using modern analytics platforms like Tableau to deliver more with less. With modern BI, anyone in a government agency can use data to see and understand exactly how programs are performing. And the results are amazing.

Source: statescoop.com

European Data and Computational Journalism Conference (Dublin, Ireland)

The European Data and Computational Journalism Conference aims to bring together industry, practitioners, and academics in the fields of journalism and news production, as well as the information, data, social, and computer sciences, facilitating a multidisciplinary discussion that advances research and practice in the broad area of data and computational journalism.

Held in Dublin, Ireland, the conference will present a mix of academic talks and keynotes from industry leaders. It will be followed by a half-day ‘Introduction to Data Journalism’ workshop and the ‘Computational and Data Journalism Unconference’.

Topics of interest include, but are not limited to:

  • Application of data and computational journalism within newsrooms
  • Data driven investigations
  • Data storytelling
  • Open data for journalism, storytelling, transparency and accountability
  • Algorithms, transparency and accountability
  • Automated, robot and chatbot journalism
  • Newsroom software and tools
  • ‘Post-fact’ journalism and the impact of data
  • User experience and interactivity
  • Data and Computational Journalism education
  • Post-desktop news provision/interaction
  • Data mining news sources
  • Visualization and presentation
  • Bias, ethics, transparency and truth in Data Journalism
  • Newsroom challenges with respect to data journalism, best practices, success and failure stories

Find out more here

What Is Data Visualization?

By Stephen Few

As with many fields that experience rapid growth, the meaning and practice of data visualization have become muddled. Everyone has their own idea of its purpose and how it should be done. For me, data visualization has remained fairly clear and consistent in meaning and purpose. Here’s a simple definition:

Data visualization is a collection of methods that use visual representations to explore, make sense of, and communicate quantitative data.

You might bristle at the fact that this definition narrows the scope of data visualization to quantitative data. It is certainly true that non-quantitative data may be visualized, but charts, diagrams, and illustrations of this type are not typically categorized as data visualizations. For example, neither a flow chart, nor an organization chart, nor an ER (entity relationship) diagram qualifies as a data visualization unless it includes quantitative information.

The immediate purpose of data visualization is to improve understanding. When data visualization is done in ways that do not improve understanding, it is done poorly. The ultimate purpose of data visualization, beyond understanding, is to enable better decisions and actions.
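The point that visualization exists to improve understanding is often illustrated with Anscombe’s quartet: four small datasets with nearly identical summary statistics that look wildly different the moment they are plotted. The sketch below draws two of those series with matplotlib; the data values are Anscombe’s published ones, while the plotting choices are just one reasonable rendering.

```python
# Two of Anscombe's four famous datasets: nearly identical means,
# variances, correlations, and regression lines, yet obviously different
# once drawn. A small demonstration of visualization improving
# understanding where summary numbers alone mislead.
import matplotlib.pyplot as plt

x = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
y1 = [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]
y2 = [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]

fig, (ax1, ax2) = plt.subplots(1, 2, sharex=True, sharey=True)
ax1.scatter(x, y1)
ax1.set_title("Anscombe I: roughly linear")
ax2.scatter(x, y2)
ax2.set_title("Anscombe II: clearly curved")
for ax in (ax1, ax2):
    ax.set_xlabel("x")
ax1.set_ylabel("y")
plt.show()
```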

Understanding the meaning and purpose of data visualization isn’t difficult, but doing the work well requires skill, augmented by good technologies. Data visualization is primarily enabled by skills—the human part of the equation—which technologies merely augment. The human component is primary, but sadly it receives much less attention than the technological component, and for this reason data visualization is usually done poorly. The path to effective data visualization begins with developing relevant skills through learning and a great deal of practice. Tools are used during this process; they do not drive it.

Data visualization technologies only work when they are designed by people who understand how humans interact with data to make sense of it. This requires an understanding of human perception and cognition. It also requires an understanding of what we humans need from data. Interacting with data is not useful unless it leads to an understanding of things that matter. Few data visualization technology vendors have provided tools that work effectively because their knowledge of the domain is superficial and often erroneous. You can only design good data visualization tools if you’ve engaged in the practice of data visualization yourself at an expert level. Poor tools exist, in part, because vendors care primarily about sales, and most consumers of data visualization products lack the skills that are needed to differentiate useful from useless tools, so they clamor for silly, dysfunctional features. Vendors justify the development of dumb tools by arguing that it is their job to give consumers what they want. I understand their responsibility differently. As parents, we don’t give our children what they want when it conflicts with what they need. Vendors should be good providers.

Data visualization can contribute a great deal to the world, but only if it is done well. We’ll get there eventually. We’ll get there faster if we have a clear understanding of what data visualization is and what it’s for.

How disinformation spreads in a network

OnTheGo

Disinformation is kind of a problem these days, yeah? Fatih Erikli uses a simulation that works like a disaster spread model applied to social networks to give an idea of how disinformation spreads.

“I tried to visualize how a disinformation becomes a post-truth by the people who subscribed in a network. We can think this network as a social media such as Facebook or Twitter. The nodes (points) in the map represent individuals and the edges (lines) shows the relationships between them in the community. The disinformation will be forwarded to their audience by the unconscious internet (community) members.”

Set the “consciousness” parameter and select a node to run.
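Since the interactive version isn’t embedded here, below is a rough Python sketch of the same idea: a cascade over a random network where each node’s “consciousness” is the probability it checks the story and refuses to forward it. The graph model and parameter values are invented stand-ins for Erikli’s demo, not his actual code. Raising the consciousness parameter shrinks the cascade, which is the intuition the interactive simulation lets you explore node by node.

```python
# A toy version of the described cascade: disinformation starts at one
# node and is forwarded along edges, except by "conscious" members who
# decline to pass it on. Network and parameters are invented stand-ins.
import random
from collections import deque

def simulate(n=200, avg_degree=6, consciousness=0.4, seed_node=0, rng_seed=1):
    rng = random.Random(rng_seed)
    # Erdos-Renyi-style random graph as a stand-in for a social network.
    p = avg_degree / (n - 1)
    neighbors = {i: [] for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                neighbors[i].append(j)
                neighbors[j].append(i)

    misled = {seed_node}          # nodes who believe and forward the story
    queue = deque([seed_node])
    while queue:
        for nbr in neighbors[queue.popleft()]:
            # A conscious member fact-checks and does not forward.
            if nbr not in misled and rng.random() > consciousness:
                misled.add(nbr)
                queue.append(nbr)
    return len(misled) / n

for c in (0.2, 0.5, 0.8):
    print(f"consciousness={c}: {simulate(consciousness=c):.0%} of network misled")
```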

Source: flowingdata.com

Tableau five years a leader in Gartner’s Magic Quadrant for Analytics

We’re proud to see that Tableau is a leader in the Gartner Magic Quadrant for Business Intelligence and Analytics Platforms for the fifth consecutive year. We believe Tableau is the gold standard for intuitive, interactive visual analytics and an established enterprise platform.

We wouldn’t be here without our customers’ input, support, and continuing encouragement to solve more of your data challenges. You are the inspiration for our work. Thank you.

Our leadership in the industry is a signal of the progressive changes that organizations around the world are pursuing with modern analytics platforms like Tableau. The difference is clear: Our analytics platform is a transformational product that changes organizations by providing self-service analytics at scale.


Companies like ExxonMobil and PepsiCo are seeing massive time savings with Tableau. Others like Skyscanner are using Tableau to leverage huge volumes of data in the cloud. In fact, over 54,000 customers have adopted Tableau to answer more questions of their data. And we’re now seeing our customers go even bigger with Tableau by enabling more people to see and understand their data, which we believe is reflected in this year’s Magic Quadrant.

Download the full Gartner report here.

Helping people see and understand their data is our only mission

For us, helping people see and understand their data has been our only mission all along. It’s what we do every single day. We work to empower people who know the data to ask their own questions of the data.

When we first started, we set out to revolutionize the way that people think about analytics. We had a lofty vision: that everyone, not just specialists, should be able to see and understand data, that analytics should be visual and intuitive. We disrupted the market when we introduced VizQL, our first innovation, and we redefined the way people interact with their data.

Fast-forward to today, and we are once again leading innovation, this time transforming the way entire organizations see and understand their data. Surveyed customers rated our analytics platform as “one of” (39%) or “the” (49%) enterprise standard, according to Gartner. And 41% of our reference customers reported deployments with more than 1,000 users. There is a reason Gartner says, “Tableau continues to be perceived as the modern BI market leader.”

Our continued leadership is a testament to the success our customers have had using Tableau. Companies like Honeywell, Deloitte, and JPMorgan Chase are using our modern analytics platform to empower people across the organization and drive business impact.

It’s customer stories like these that keep us energized and inspired. We continue to devote the largest industry percentage to R&D because we’re even more excited about what’s next. For us, analytics isn’t just a market; helping people see and understand their data is our mission. Every single dollar of R&D goes toward this mission, and we’re just getting started.

Here are five ways we are innovating our modern analytics platform to be even faster, easier, and more intuitive to broaden the use of data and analytics in organizations.

1. Built-in data governance that balances empowerment with control

Having a self-service environment where everyone can surface data is a great thing—as long as you can determine when to use what, and which data sources are trustworthy for the task at hand.

That’s why we’ll introduce certified content to help both IT and business users. It allows IT to define governed data sources, including the proper joins, security rules, and performance optimizations, and to create the standard calculations the rest of the organization relies on. Business users can then select a certified data source, confident that the data is accurate and trustworthy.

We are also enhancing our products to support agile data modeling so you can see how your users rely on your centralized data models. You’ll be able to perform visual impact analysis to understand how any change you make to a data source ripples through the content built on it.
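For a sense of the idea behind impact analysis (not Tableau’s implementation, which is not public), here is a minimal sketch: model content as a dependency graph and walk it downstream from the thing you plan to change. The asset names are hypothetical.

```python
# A minimal sketch of impact analysis: before changing a field in a
# shared data source, walk a dependency graph to find every downstream
# asset that would be affected. Catalog and names are hypothetical.
from collections import deque

# Hypothetical catalog: each asset lists the assets that depend on it.
dependents = {
    "sales_datasource.field:discount": ["calc:net_revenue"],
    "calc:net_revenue": ["workbook:Q3 Review", "workbook:Exec Dashboard"],
    "workbook:Q3 Review": [],
    "workbook:Exec Dashboard": ["subscription:monday_email"],
    "subscription:monday_email": [],
}

def impact_of(asset):
    """Breadth-first traversal: everything downstream of `asset`."""
    seen, queue, affected = {asset}, deque([asset]), []
    while queue:
        for dep in dependents.get(queue.popleft(), []):
            if dep not in seen:
                seen.add(dep)
                affected.append(dep)
                queue.append(dep)
    return affected

print(impact_of("sales_datasource.field:discount"))
# ['calc:net_revenue', 'workbook:Q3 Review', 'workbook:Exec Dashboard',
#  'subscription:monday_email']
```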

2. A Hyper-speed data engine to enable faster analysis on larger data volumes

To help address growing data needs, we are building a new in-memory data engine with Hyper, the fast database technology we acquired last year.

Hyper enables fast analysis on billions of records and near-real-time data updates. It’s designed to simultaneously process transactional and analytical queries without compromising performance. This means you’ll be able to scale to perform sophisticated analysis on large data with incredible performance.

Hyper will also enhance Tableau’s hybrid data model. You’ll still be able to connect live to over 60 different sources that Tableau supports. This means you can leverage the capabilities of databases like Amazon Redshift, Google BigQuery, Snowflake, and Microsoft SQL Server, or choose to bring some or all of your data into Tableau with Hyper.

3. Self-service data prep that lets you quickly transform data for analysis

We know that getting data ready for analysis is a time-consuming and difficult process. That’s why we’re working on Project Maestro. This new product will make it possible for more people, from IT to business users, to easily prep their data with a direct and visual approach. You will instantly see the impact of the joins, unions, and calculations you’ve made, ensuring that you have exactly what you need before jumping into analysis.
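As a rough illustration of the kind of shaping Project Maestro is meant to make visual and immediate, here is the equivalent prep done in code with pandas; the file names and columns are invented for the example, and this is an analogy rather than how Maestro works internally.

```python
# The join / union / calculation steps of typical data prep, done in
# pandas with invented file and column names, as an analogy for the
# visual prep that Project Maestro is described as providing.
import pandas as pd

# Union: stack two regional extracts with the same schema.
orders = pd.concat(
    [pd.read_csv("orders_east.csv"), pd.read_csv("orders_west.csv")],
    ignore_index=True,
)

# Join: enrich each order with customer attributes.
customers = pd.read_csv("customers.csv")
prepped = orders.merge(customers, on="customer_id", how="left")

# Calculation: derive a margin column before analysis.
prepped["margin"] = (prepped["revenue"] - prepped["cost"]) / prepped["revenue"]

prepped.to_csv("orders_prepped.csv", index=False)
```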



Project Maestro will also integrate with the rest of the Tableau platform, letting you centrally govern your data, automate data refreshes, and analyze it in Tableau Desktop, Tableau Server, and Tableau Online.

4. Advanced analytics for everyone

Visual analytics continues to be a central pillar of our R&D efforts as it puts the power of data into the hands of more people. This area is far from being commoditized and there are many innovations that we’re working on to help you think with your data.

We’re adding rich features like visualizations in tooltips, drill-down improvements, new chart types including step lines, and the ability to add images to headers, labels, and tooltips. We are giving users more flexibility with legends per measure and nested sorting.

We’re also investing in sophisticated geospatial analysis to help you answer more questions from geographic data. In Tableau 10.2, we are adding spatial file support, and that’s just the beginning. We will also add spatial operations like filters and calculations so you can ask questions like how many customers live within a mile of your store. And with layers, you’ll be able to map different data sets on a single view with just a few clicks.

Our advanced analytics features will help you get to the root of your question, no matter how complex it is. We want to bring the power of data science to more users without requiring any programming. You can already perform clustering, forecasting, and trending with a simple drag and drop. You’ll see more algorithms such as outlier detection and sentiment analysis coming in the future.
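To make the “customers within a mile” example concrete, here is a back-of-the-envelope version of that spatial filter in plain Python using the haversine great-circle formula. The coordinates are made up, and Tableau’s own spatial calculations may be implemented differently.

```python
# "How many customers live within a mile of the store?" answered with a
# haversine distance filter. Coordinates are invented for the example.
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_MILES = 3958.8

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

store = (47.6062, -122.3321)          # hypothetical store location
customers = [
    ("Ada", 47.6097, -122.3331),
    ("Ben", 47.6205, -122.3493),
    ("Cyd", 47.7511, -122.2015),
]

nearby = [
    name for name, lat, lon in customers
    if haversine_miles(store[0], store[1], lat, lon) <= 1.0
]
print(nearby)  # the customers within one mile of the store
```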

We also want to enable data scientists to bring rich models directly into Tableau. You can now embed R and Python models in Tableau for interactive analysis. In the future, you will be able to take advantage of cloud-based machine-learning platforms to bring even more scalable algorithms for interactive analysis.
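As a hedged sketch of what embedding a Python model can look like on the Python side: TabPy, Tableau’s Python server, lets you deploy a scoring function that Tableau calculated fields can then call. The model, field, and endpoint names below are invented, and the client usage follows the tabpy_client package as commonly shown at the time; treat the exact calls as an assumption and check the TabPy documentation for your version.

```python
# A sketch of serving a Python model to Tableau through TabPy. The model
# is a stand-in (in practice you would load a trained model), and the
# tabpy_client usage is an assumption based on era-typical examples.
from sklearn.linear_model import LogisticRegression
import tabpy_client

# Stand-in model trained on toy data.
model = LogisticRegression().fit([[0.0], [1.0], [2.0], [3.0]], [0, 0, 1, 1])

def churn_probability(spend):
    """Score a list of values passed in from a Tableau calculated field."""
    return model.predict_proba([[x] for x in spend])[:, 1].tolist()

client = tabpy_client.Client("http://localhost:9004/")
client.deploy("churn_probability", churn_probability,
              "Returns churn probability for each row", override=True)
```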

Tableau has made it easier and easier to answer richer and richer questions. But what if we could look at what you’re doing and be one step ahead of you, answering new questions for you automatically, helping you interpret what you’re seeing, or suggesting next steps? We’re adding powerful machine-learning algorithms directly within Tableau to recommend the appropriate views, fields, tables, and joins to help you answer questions more quickly.

And soon, we will enable new conversations with data through smart analytics. With natural language processing, you will be able to interact with your data in more natural ways through voice or text.


5. Flexible hybrid deployments

Deploying Tableau needs to be simple and flexible. This flexibility includes allowing you to deploy and connect to your data wherever it lives—in the cloud, on-premises, or both. That’s why we’re expanding the deployment options that you have for Tableau. We’re adding an enterprise-grade version of Tableau Server on Linux. For many organizations, Linux means lower costs, more customization, and a more secure way to run Tableau Server.

You can now deploy Tableau Server on public cloud platforms including AWS, Azure, and Google Cloud. And of course, you can deploy Tableau on-premises in VM and physical environments. No matter where you are on your journey or which platforms you choose, we will be there to support you.

You can also let Tableau run the infrastructure for you with Tableau Online, our managed SaaS offering. We’re adding full cloud authoring in Tableau Online, data-driven alerting, self-service schedules, collaborative discussions, and many more capabilities enabling a complete cloud-based analytics solution.

When discussing hybrid deployments, we also need to talk about data. Tableau supports hybrid data connectivity, which means you can query data live without first moving it, or you can bring some or all of the data into our fast in-memory engine. This approach is supported across all deployment environments.

However, when deploying in the cloud, connecting to data on-premises can be a challenge. You don’t always want to replicate the data in the cloud to use it. Soon, you will be able to analyze data behind the firewall in Tableau Online using the new live-query agent that acts as a secure tunnel to on-premises data.

We are also adding prebuilt dashboards for popular web applications like Salesforce and Marketo. Imagine being able to explore your data in seconds by using one of our prebuilt dashboards that connects directly to your Salesforce environment. This will make it easier and faster to see and understand your data.

Join us on this journey

These innovations are just a small sample of what we’re working on; there’s much more on the horizon. And we invite you to come along on this journey. You are at the core of everything we do here at Tableau. Your needs dictate our work. We listen to your feedback, and with each new release, we build features based on our conversations with you. Please join our pre-release program to test-drive these features when they become available and let us know how they solve your problems. You can also contribute new ideas and join the conversation on our Ideas Forum.

Data rockstars, join our conversation on social media. Tag a #DataLeader—it can be anyone!—and tell us why. And we’ll send the data leader a fun avatar as a token of recognition. Share on your platform of choice: Twitter, LinkedIn, or Facebook.

The above graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available upon request from Tableau. Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

Source: Tableau.com

Hans Rosling, Swedish Doctor and Pop-Star Statistician, Dies at 68

OnTheGo

Hans Rosling, a Swedish doctor who transformed himself into a pop-star statistician by converting dry numbers into dynamic graphics that challenged preconceptions about global health and gloomy prospects for population growth, died on Tuesday in Uppsala, Sweden. He was 68.

The cause was pancreatic cancer, according to Gapminder, a foundation he established to generate and disseminate demystified data using images.

Even before “post-truth” entered the lexicon, Dr. Rosling was echoing former Senator Daniel Patrick Moynihan’s maxim that everyone is entitled to his own opinions but not to his own facts.

“He challenged the whole world’s view of development with his amazing teaching skills,” Isabella Lövin, Sweden’s deputy prime minister, said in a statement.

A self-described “edutainer,” Dr. Rosling captivated vast audiences in TED Talks — beginning a decade ago in front of live audiences and later viewed online by millions — and on television documentaries like the BBC’s “The Joy of Stats” in 2010.

His inviting animated visualizations and prosaic props (like apples and colorful Lego plastic blocks) defined him as a funky philosopher rather than a geeky professor.

“I produce a road map for the modern world,” he told The Economist in 2010. “Where people want to drive is up to them. But I have the idea that if they have a proper road map and know what the global realities are, they’ll make better decisions.”

In Dr. Rosling’s version of those realities, the traditional divide between third-world and industrialized nations had become anachronistic, since so many countries were undergoing development, with some in Asia improving faster than some in Europe. He considered that five billion people continued to head toward healthier lives while one billion remained mired in poverty and disease; that progress toward health and wealth had contributed to climate change; and that the world was so poorly governed that possibilities to improve it abounded.

“I’m not an optimist,” Dr. Rosling once said. “I’m a very serious possibilist.”

He predicted that the United Nations’ goal of eradicating extreme poverty by 2030 was attainable because the tools to do so had been identified and the share of people living in that condition had already declined by more than half in 25 years.

He also argued vigorously that overpopulation would no longer be problematic as the world grew wealthier and fertility rates declined.

“There are so many who think that death keeps control of population growth,” he said in an interview with The Guardian in 2013. “That’s just wrong!”

He told The Economist: “The only way to reach sustainable population levels is to improve public health. Child survival is the new green.”

As a medical doctor, epidemiologist and academic, but with the flair of a seasoned performer (he once demonstrated his expertise as a sword swallower), he delivered counterintuitive factoids, accused advocates of tweaking statistics to advance their own causes, and debunked misapprehensions about the third world — although not every expert concurred.

He pointed out that Sweden had more children per woman than Iran, that Shanghai was just as wealthy and healthy as the Netherlands, and that the world’s average life expectancy of 71 years was now closer to the highest (84 in Japan) than to the lowest (49 in Swaziland).

“They just make it about us and them; the West and the rest,” Dr. Rosling told the journal Nature in December. “How could anyone hope to solve problems if they didn’t understand the different challenges faced, for example, by Congolese subsistence farmers far from paved roads and Brazilian street vendors in urban favelas?”

Hans Gösta Rosling was born in Uppsala on July 27, 1948. His father was a coffee roaster.

He studied statistics and medicine at Uppsala University and public health at St. John’s Medical College in Bangalore, India, where he received his medical degree in 1976.

In 1979, he and his wife, the former Agneta Thordeman, whom he met while she was studying to be a nurse, moved to Mozambique with their two young children.

He was delivering on a pledge he had made years earlier to Eduardo Mondlane, the founder of the Mozambican Liberation Front, to help provide health services when the country became independent. Mr. Mondlane was killed in 1969, six years before independence was granted by Portugal.

Dr. Rosling served as district medical officer in a northern province. He was the sole doctor for a population of 300,000.

His investigation of a paralytic disease called konzo in the Democratic Republic of Congo, which was determined to be caused by ingesting naturally occurring cyanide in cassava roots, earned him a doctorate from Uppsala University.

In addition to his wife, a pediatrician and researcher, he is survived by two sons, Ola and Magnus; a daughter, Anna; and a brother, Mats.

With his son Ola and his daughter-in-law, Anna Rosling Rönnlund, Dr. Rosling established Gapminder in 2006 while he was a professor of global health at the Karolinska Institute, the medical university outside Stockholm. The foundation aims to chart trends and fight what it calls “devastating ignorance with fact-based worldviews everyone can understand.”

It derived its name from the London Underground’s recorded warnings to passengers to “mind the gap” between a subway car and the platform. Gapminder’s data images are designed to evoke the divide between statistics and the misleading ways in which they are sometimes interpreted.

“It’s like the emperor’s new clothes, and I’m the little child saying: ‘He’s nude! He’s nude!’” Dr. Rosling told The Guardian.

Brandishing his bubble chart graphics during TED (Technology, Entertainment and Design) Talks, Dr. Rosling often capsulized the macroeconomics of energy and the environment in a favorite anecdote about the day a washing machine was delivered to his family’s cold-water flat.

“My mother explained the magic with this machine the very, very first day,” he recalled. “She said: ‘Now Hans, we have loaded the laundry. The machine will make the work. And now we can go to the library.’ Because this is the magic: You load the laundry, and what do you get out of the machine? You get books out of the machines, children’s books. And Mother got time to read to me.”

“Thank you, industrialization,” Dr. Rosling said. “Thank you, steel mill. And thank you, chemical processing industry that gave us time to read books.”

Source: NY Times