All grown up: The IoT market today

Using a data-driven analysis to understand IoT technology adoption.


Although we’ve been talking about it for years, it was in 2016 that the adoption of the Internet of Things (IoT)—among consumers and businesses—rose dramatically. At least in part, this was due to factors like the increased numbers of sensors and connected devices, a growing pool of skilled IoT developers, and real-time data and analytics support for IoT systems.

As consumers, we now live with and are helped by IoT devices in our homes, our cars, even our toys. In the business world, the impact is even more pervasive. One of the industries that comes to mind for many of us when we think about the IoT is manufacturing, where connected technologies can improve safety, maintenance, and efficiency. Globally, the impact of the IoT on manufacturing has been evidenced by national initiatives such as Germany’s Industrie 4.0 and China’s Made in China 2025. But that’s old news. In the past year, we’ve seen businesses use IoT technologies in other ways, and with wider applications. Innovations like advances in deep learning that can be applied to the IoT make it likely that the growth of the IoT will continue to accelerate. If last year the IoT “grew up,” 2017 will mark the year that the IoT starts to become essential to modern business.

A newly published report, The Internet of Things Market, by Aman Naimat, presents a current snapshot of the IoT business landscape, describing a data-driven analysis of the companies, industries, and workers using IoT technologies. Making use of web crawlers, natural language processing, and network analysis to process over 300 TB of data, Naimat uncovers some surprising results. First, the IoT landscape looks different from the market for big data tools in several ways: which companies are adopting IoT technologies, their sizes, and their locations. Second, the report finds that the use cases that currently incur the most spending on the IoT are not the ones that have been predicted to become the most valuable, such as health care and smart cities.

The report opens by looking at the number of companies using IoT technologies and the maturity level of their projects.

What do we mean by IoT project maturity? Level 1 projects are still under development, meaning products have not been deployed; strategy, design, scope, and infrastructure are all a work in progress. Level 2 projects have been deployed either within a specific department or for a single use case, such as inventory control. Level 3 maturity projects represent companies like Nest or Amazon that have made the IoT a strategic directive for their businesses and have deployed IoT tools or products to address multiple use cases.

While the numbers in the report may not seem large, they represent actual adoption of IoT technologies and are, in fact, similar to the current level of adoption of big data technologies. Given that the buzz around big data began as early as the 1990s, well before interest in the IoT took off, this points to a faster adoption curve for the IoT.

Ever increasing data sets and more robust compute power and scalability will doubtless lead to more IoT breakthroughs—and therefore business investments—in the future. But to know where we’re going, we need to understand where we currently are. The Internet of Things Market report can help you get your bearings.

Source: oreilly.com


Big Data: Why NASA Can Now Visualize Its Lessons Learned


NASA’s Lessons Learned database is a vast, constantly updated collection of knowledge and experience from past missions, which the agency relies on for planning future projects and expeditions into space.

It holds detailed information from every mission going back as far as the 1960s, and every record is reviewed and approved before inclusion. In addition to NASA staff, thousands of scientists, engineers, educators, and analysts from private-sector and government organizations access the database every month.

As the database swelled in size, the interface used internally to query it – a keyword-based search built on a PageRank-style algorithm – became unwieldy. Chief Knowledge Architect David Meza spoke to me recently and told me that the move to the graph-based, open source Neo4J management system has significantly cut down on the time engineers and mission planners spend combing through keyword-based search results.

Meza says, “This came to light when I had a young engineer come to me because he was trying to explore our Lessons Learned database – but sometimes it’s hard to find the information you want in that database.

“He had 23 key terms he was trying to search for across the database of nearly 10 million documents, and because it was based on a PageRank algorithm the records nearest the top of the results were there because they were most frequently accessed, not necessarily because they had the right information.”

The gist of the problem was that even after searching the database, the engineer was left with around 1,000 documents that he would have to read through individually to know whether they held the information he needed.

“I knew there had to be something better we could do,” Meza says. “I started looking at graph database technologies and came across Neo4J. What was really interesting was the way it made it easier to combine information and showcase it in a graph form.

“To me, that is more intuitive, and I know a lot of engineers feel that way. It makes it easier to see patterns and see how things connect.”

The engineer was trying to solve a problem involving corrosion of valves of the sort used in numerous systems at Johnson Space Center, Texas, including environmental systems and oxygen and fuel tanks.

Using graph visualization, it quickly became apparent that, for some reason, there was a high correlation between records involving this sort of corrosion and topics involving batteries.

“I couldn’t understand how these topics were related,” Meza says, “but when I started looking into the lessons within those topics I was quickly able to see that in some of the conditions we had issues with lithium batteries leaking and acid contaminating the tanks – we definitely had issues.

“So, if I’m concerned about the tanks and the valves within those tanks, I also have to be concerned about whether there are batteries close to them. Having this correlation built in allowed the engineer to find this out much faster.”

Correlating information graphically in this way makes it far quicker to spot links between potentially related information.
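To make the idea concrete, here is a minimal sketch of the kind of co-occurrence query that surfaces such links in Neo4J, written with the official Neo4j Python driver. The node labels, relationship type, connection details, and topic name are illustrative assumptions, not NASA's actual schema.

```python
# Hypothetical graph model: (:Lesson)-[:MENTIONS]->(:Topic {name}).
# The query asks which topics co-occur most often with "corrosion"
# across lessons, the kind of link that surfaced the battery connection.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

CYPHER = """
MATCH (l:Lesson)-[:MENTIONS]->(t:Topic {name: $topic}),
      (l)-[:MENTIONS]->(related:Topic)
WHERE related.name <> $topic
RETURN related.name AS topic, count(l) AS shared_lessons
ORDER BY shared_lessons DESC
LIMIT 10
"""

with driver.session() as session:
    for record in session.run(CYPHER, topic="corrosion"):
        print(record["topic"], record["shared_lessons"])

driver.close()
```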

“To me, it’s a validation,” Meza says. “There are many different ways to look and search for information rather than just a keyword search. And I think utilizing new types of graph databases and other types of NoSQL databases really showcases this – often there are better ways than a traditional relational database management system.”

Neo4J is one of the most commonly used open source graph database management systems. It hit the headlines in 2016 when it was used as a primary tool by journalists working with the ICIJ to analyze the leaked 2.6-terabyte Panama Papers for evidence of tax evasion, money laundering, and other criminal activity.

Obviously, to an organization as data-rich as NASA, there are clear benefits to thinking beyond keyword search and PageRank when it comes to accessing information. NASA’s experience serves as another reminder that in a data-driven enterprise, the volume of information isn’t always the deciding factor between success and failure; in fact, it can sometimes be a hindrance. Insights are often just as likely to emerge from developing more efficient and innovative ways to query data, and clearer ways to communicate it to those who need it to do their jobs.

Source: Forbes

Tableau five years a leader in Gartner’s Magic Quadrant for Analytics

We’re proud to see that Tableau is a leader in the Gartner Magic Quadrant for Business Intelligence and Analytics Platforms for the fifth consecutive year. We believe Tableau is the gold standard for intuitive, interactive visual analytics and an established enterprise platform.

We wouldn’t be here without our customers’ input, support, and continuing encouragement to solve more of your data challenges. You are the inspiration for our work. Thank you.

Our leadership in the industry is a signal of the progressive changes that organizations around the world are pursuing with modern analytics platforms like Tableau. The difference is clear: Our analytics platform is a transformational product that changes organizations by providing self-service analytics at scale.


Companies like ExxonMobil and PepsiCo are seeing massive time savings with Tableau. Others like Skyscanner are using Tableau to leverage huge volumes of data in the cloud. In fact, over 54,000 customers have adopted Tableau to answer more questions of their data. And we’re now seeing our customers go even bigger with Tableau by enabling more people to see and understand their data, which we believe is reflected in this year’s Magic Quadrant.

Download the full Gartner report here.

Helping people see and understand their data is our only mission

For us, helping people see and understand their data has been our only mission all along. It’s what we do every single day. We work to empower people who know the data to ask their own questions of the data.

When we first started, we set out to revolutionize the way that people think about analytics. We had a lofty vision: that everyone, not just specialists, should be able to see and understand data, that analytics should be visual and intuitive. We disrupted the market when we introduced VizQL, our first innovation, and we redefined the way people interact with their data.

Fast-forward to today, and we are once again leading innovation, this time transforming the way entire organizations see and understand their data. According to Gartner, surveyed customers rated our analytics platform as “the” enterprise standard (49%) or “one of” their enterprise standards (39%). And 41% of our reference customers reported deployments with more than 1,000 users. There is a reason Gartner says, “Tableau continues to be perceived as the modern BI market leader.”

Our continued leadership is a testament to the success our customers have had using Tableau. Companies like Honeywell, Deloitte, and JPMorgan Chase are using our modern analytics platform to empower people across the organization and drive business impact.

It’s customer stories like these that keep us energized and inspired. We continue to devote the largest percentage of revenue to R&D in our industry because we’re even more excited about what’s next. For us, analytics isn’t just a market; helping people see and understand their data is our mission. Every single dollar of R&D goes toward this mission, and we’re just getting started.

Here are five ways we are innovating our modern analytics platform to be even faster, easier, and more intuitive to broaden the use of data and analytics in organizations.

1. Built-in data governance that balances empowerment with control

Having a self-service environment where everyone can surface data is a great thing—as long as you can determine when to use what, and which data sources are trustworthy for the task at hand.

That’s why we’ll introduce certified content to help both IT and business users. It allows IT to define governed data sources, including the proper joins, security rules, and performance optimizations, as well as to create the standard calculations the rest of the organization relies on. Business users can then select a certified data source and be confident the data is accurate and trustworthy.

We are also enhancing our products to support agile data modeling so you can understand how your centralized data models are used by your users. You’ll be able to perform visual impact analysis to help you understand the impact of any changes you might make to the data source.

2. A Hyper-speed data engine to enable faster analysis on larger data volumes

To help address growing data needs, we are building a new in-memory data engine with Hyper, the fast database technology we acquired last year.

Hyper enables fast analysis on billions of records and near-real-time data updates. It’s designed to simultaneously process transactional and analytical queries without compromising performance. This means you’ll be able to scale to perform sophisticated analysis on large data with incredible performance.

Hyper will also enhance Tableau’s hybrid data model. You’ll still be able to connect live to over 60 different sources that Tableau supports. This means you can leverage the capabilities of databases like Amazon Redshift, Google BigQuery, Snowflake, and Microsoft SQL Server, or choose to bring some or all of your data into Tableau with Hyper.
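For readers curious what a Hyper-backed extract looks like programmatically, here is a minimal, hypothetical sketch using tableauhyperapi, the standalone Python library Tableau later released for Hyper. The file name, table, and columns are invented, and this is an illustration rather than the in-product workflow described above.

```python
# Sketch: create a small Hyper extract and run an analytical query against it.
from tableauhyperapi import (Connection, CreateMode, HyperProcess, Inserter,
                             SqlType, TableDefinition, TableName, Telemetry)

orders = TableDefinition(
    table_name=TableName("orders"),
    columns=[
        TableDefinition.Column("order_id", SqlType.int()),
        TableDefinition.Column("region", SqlType.text()),
        TableDefinition.Column("amount", SqlType.double()),
    ],
)

with HyperProcess(telemetry=Telemetry.DO_NOT_SEND_USAGE_DATA_TO_TABLEAU) as hyper:
    with Connection(endpoint=hyper.endpoint,
                    database="orders.hyper",
                    create_mode=CreateMode.CREATE_AND_REPLACE) as connection:
        connection.catalog.create_table(orders)
        with Inserter(connection, orders) as inserter:
            inserter.add_rows([(1, "West", 19.99), (2, "East", 5.25)])
            inserter.execute()
        # Hyper answers analytical SQL directly against the extract.
        total = connection.execute_scalar_query(
            f"SELECT SUM(amount) FROM {orders.table_name}")
        print(total)
```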

3. Self-service data prep that lets you quickly transform data for analysis

We know that getting data ready for analysis is a time-consuming and difficult process. That’s why we’re working on Project Maestro. This new product will make it possible for more people, from IT to business users, to easily prep their data with a direct and visual approach. You will instantly see the impact of the joins, unions, and calculations you’ve made, ensuring that you have exactly what you need before jumping into analysis.
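Project Maestro itself is a visual tool, but the operations it targets (unions, joins, and calculations) are the same ones analysts often hand-code today. Purely as a hypothetical illustration in pandas, with invented file names and columns:

```python
# Rough sketch of the prep steps described above: union two extracts,
# join in a lookup table, add a calculated column.
import pandas as pd

q1 = pd.read_csv("orders_q1.csv")                 # same columns as q2
q2 = pd.read_csv("orders_q2.csv")
orders = pd.concat([q1, q2], ignore_index=True)   # union

regions = pd.read_csv("region_lookup.csv")        # store_id -> region
prepped = orders.merge(regions, on="store_id", how="left")   # join

prepped["margin"] = prepped["revenue"] - prepped["cost"]     # calculation
prepped.to_csv("orders_prepped.csv", index=False)
```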



Project Maestro will also integrate with the rest of the Tableau platform, letting you centrally govern your data, automate data refreshes, and analyze it in Tableau Desktop, Tableau Server, and Tableau Online.

4. Advanced analytics for everyone

Visual analytics continues to be a central pillar of our R&D efforts as it puts the power of data into the hands of more people. This area is far from being commoditized and there are many innovations that we’re working on to help you think with your data.

We’re adding rich features like visualizations in tooltips, drill-down improvements, new chart types including step lines, and the ability to add images to headers, labels, and tooltips. We are giving users more flexibility with legends per measure and nested sorting.

We’re also investing in sophisticated geospatial analysis to help you answer more questions from geographic data. In Tableau 10.2, we are adding spatial file support, and that’s just the beginning. We will also add spatial operations like filters and calculations so you can ask questions like how many customers live within a mile of your store. And with layers, you’ll be able to map different data sets on a single view with just a few clicks.

Our advanced analytics features will help you get to the root of your question, no matter how complex it is. We want to bring the power of data science to more users without requiring any programming. You can already perform clustering, forecasting, and trending with a simple drag and drop. You’ll see more algorithms such as outlier detection and sentiment analysis coming in the future.
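The "customers within a mile of your store" question comes down to a distance filter. As a small, self-contained illustration (with invented coordinates, not a Tableau feature), the haversine formula gives the great-circle distance used for that kind of check:

```python
import math

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles (haversine formula)."""
    r = 3958.8  # Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

store = (47.6062, -122.3321)  # hypothetical store location
customers = [("A", 47.6097, -122.3331), ("B", 47.7000, -122.4000)]

nearby = [name for name, lat, lon in customers
          if miles_between(store[0], store[1], lat, lon) <= 1.0]
print(nearby)  # customers within one mile of the store
```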

We also want to enable data scientists to bring rich models directly into Tableau. You can now embed R and Python models in Tableau for interactive analysis. In the future, you will be able to take advantage of cloud-based machine-learning platforms to bring even more scalable algorithms for interactive analysis.
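One way Python models reach Tableau today is through TabPy, an external service that Tableau's SCRIPT_* calculated fields can call. The sketch below is a hypothetical illustration; the toy model, endpoint, and client import path are assumptions rather than specifics of this announcement.

```python
# Train a toy model and publish it to a TabPy server so Tableau can call it.
from sklearn.linear_model import LogisticRegression
# The client import path differs across TabPy releases; adjust to your install.
from tabpy_tools.client import Client

X = [[1.0], [2.0], [3.0], [4.0]]   # single invented usage metric
y = [0, 0, 1, 1]                   # invented churn labels
model = LogisticRegression().fit(X, y)

def churn_probability(values):
    # Tableau passes columns as lists; return one score per row.
    return model.predict_proba([[v] for v in values])[:, 1].tolist()

client = Client("http://localhost:9004/")
client.deploy("churn_probability", churn_probability,
              "Probability of churn from a single usage metric")

# In Tableau, a calculated field could then call it, for example:
# SCRIPT_REAL("return tabpy.query('churn_probability', _arg1)['response']", SUM([Usage]))
```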

Tableau has made it easier and easier to answer richer and richer questions. But what if we could look at what you’re doing and be one step ahead of you, answering new questions for you automatically, helping you interpret what you’re seeing, or suggesting next steps? We’re adding powerful machine-learning algorithms directly within Tableau to recommend the appropriate views, fields, tables, and joins to help you answer questions more quickly.

And soon, we will enable new conversations with data through smart analytics. With natural language processing, you will be able to interact with your data in more natural ways through voice or text.

We’re also adding machine learning directly to Tableau to make it easier to find the data and views to answer key questions. This will provide recommendations so you can perform better analysis faster.

5. Flexible hybrid deployments

Deploying Tableau needs to be simple and flexible. This flexibility includes allowing you to deploy and connect to your data wherever it lives—in the cloud, on-premises, or both. That’s why we’re expanding the deployment options that you have for Tableau. We’re adding an enterprise-grade version of Tableau Server on Linux. For many organizations, Linux means lower costs, more customization, and a more secure way to run Tableau Server.

You can now deploy Tableau Server on public cloud platforms including AWS, Azure, and Google Cloud. And of course, you can deploy Tableau on-premises in VM and physical environments. No matter where you are on your journey or which platforms you choose, we will be there to support you.

You can also let Tableau run the infrastructure for you with Tableau Online, our managed SaaS offering. We’re adding full cloud authoring in Tableau Online, data-driven alerting, self-service schedules, collaborative discussions, and many more capabilities enabling a complete cloud-based analytics solution.

When discussing hybrid deployments, we also need to talk about data. Tableau supports hybrid data connectivity, which means you can query data live without first requiring data movement, or you can move some or all of the data into our fast in-memory engine. This approach is supported across all deployment environments.

However, when deploying in the cloud, connecting to data on-premises can be a challenge. You don’t always want to replicate the data in the cloud to use it. Soon, you will be able to analyze data behind the firewall in Tableau Online using the new live-query agent that acts as a secure tunnel to on-premises data.

We are also adding prebuilt dashboards for popular web applications like Salesforce and Marketo. Imagine being able to explore your data in seconds by using one of our prebuilt dashboards populated directly from your Salesforce environment. This will make it easier and faster to help you see and understand your data.

Join us on this journey

These innovations are just a small sample of what we’re working on; there’s much more on the horizon. And we invite you to come along on this journey. You are at the core of everything we do here at Tableau. Your needs dictate our work. We listen to your feedback, and with each new release, we build features based on our conversations with you. Please join our pre-release program to test-drive these features when they become available and let us know how they solve your problems. You can also contribute new ideas and join the conversation on our Ideas Forum.

Data rockstars, join our conversation on social media. Tag a #DataLeader—it can be anyone!—and tell us why. And we’ll send the data leader a fun avatar as a token of recognition. Share on your platform of choice: Twitter, LinkedIn, or Facebook.

The above graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available upon request from Tableau. Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

Source: Tableau.com

Coke Let People Make Any Flavor They Want, The People Demanded Cherry Sprite

Thanks to new machines that let customers flavor their drinks however they want, Coca-Cola discovered that what people really wanted was Cherry Sprite.


When it launched its design-your-own-flavor soda dispensers, Coca-Cola handed over the keys to its customers, letting them add a shot of flavor — say raspberry or vanilla or lemon — to any drink. In return, the touchscreen machines started sending Coca-Cola some very useful data on what its customers really want.

Now, eight years after introducing the machines, which have made their way into movie theaters and fast food outlets around the country, Coca-Cola is unveiling its first product created using all that data. People have spent years dialling their own flavor combinations into the machine, and the lesson was simple: The people demand Sprite Cherry.

To the casual soda drinker, Sprite Cherry may seem kind of predictable — it’s not a huge leap from Cherry Coke — and even a little disappointing considering the other options people could add to drinks, like strawberry, grape, peach, raspberry, orange, and vanilla. Serious Eats described Sprite Cherry as “kind of meh.”

But the people have spoken.

“There’s proven data that people actually love it,” said Bobby Oliver, director of Sprite & Citrus Brands for Coca-Cola North America. “It’s not just a survey where people say yes or no.”

Asked if cherry’s victory was a letdown, Oliver said, “We’re not disappointed at all.” Combining “lemon lime, with a twist of cherry flavor” allows the brand to “stay true to what Sprite is about,” he said.

Sprite has been an outlier in a shrinking soda business, with dollar sales up about 3.4% in 2016, according to Coca-Cola, citing Nielsen data. Meanwhile, the company’s overall revenues fell by 5% last year.

Introducing new products — especially beverages that aren’t soda — is part of Coca-Cola’s strategy. “We brought to market more than 500 new products, nearly 400 of which were tea, juices, coffees, waters or other still beverages,” CEO Muhtar Kent said to investors last week.

Sprite Cherry and Sprite Cherry Zero are the first Freestyle products to make it to Coca-Cola’s permanent lineup (there are other limited-time products like Sprite Cranberry), and are also the first new Sprite flavor since Sprite Zero was launched more than a decade ago. Coca-Cola announced Sprite Cherry in late 2016. Whether Sprite Cherry fans will find the bottled version as satisfying as the fountain soda remains to be seen.

Source: buzzfeed.com

Best Self-Service Business Intelligence Software


Self-service business intelligence (BI) software empowers business users to investigate company data, and reveal patterns and insights. Self-service BI products are designed to be set up and used by average business users without the need for input by IT. Businesses require intelligence throughout their company structure to determine the health of metrics, and how different metrics and data points are related. This analysis can locate opportunities and areas of improvement, and is necessary to continue adapting and refining business strategies. Self-service BI products consume data in its many forms, from file uploads and direct connectors to databases and business applications.

To qualify as a self-service BI platform, a product must:
• Consume data from any source through file uploads, database querying, and application connectors.
• Transform data into a useful and relatable model.
• Support data modeling, blending, and discovery processes.
• Create reports and visualizations with business utility.
• Be configurable and usable by average business users with little IT involvement.

Products in our Self-Service BI category are designed to be primarily configured and used by non-technical business users like managers and analysts. For more customizable and comprehensive platforms that require IT involvement for designing and deploying internal analytics applications, look at our BI Platforms category. Our Data Visualization category houses products primarily designed to create charts, graphs, and benchmark visualizations.

Self-Service Business Intelligence Software Grid Overview
The best self-service business intelligence products are determined by customer satisfaction (based on user reviews) and scale (based on market share, vendor size, and social impact) and placed into four categories on the Grid:
• Leaders are rated highly by G2 Crowd users and have substantial scale, market share, and global support and service resources. Leaders include: Zoho Reports, Tableau Desktop, Microsoft Power BI, Qlik Sense, and SAP Crystal Reports
• High Performers are highly rated by their users but have not yet achieved the market share and scale of the Leaders. High Performers include: Izenda, Dundas BI, Alteryx, Chartio, and Datorama
• Contenders have significant Market Presence and resources but have received below-average user Satisfaction ratings or have not yet received a sufficient number of reviews to validate the solution. Contenders include: Salesforce Wave Analytics, JMP, and Jaspersoft
• Niche solutions do not have the Market Presence of the Leaders. They may have been rated positively on customer Satisfaction but have not yet received enough reviews to validate them. Niche products include: ReportServer, Analyzer, Panorama Necto, Looker, and DecisionPoint
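Conceptually, the Grid reduces each product to two scores and places it in one of those four quadrants. As a purely illustrative sketch (the 0-to-1 scale and thresholds are invented, not G2 Crowd's actual methodology):

```python
def grid_category(satisfaction: float, market_presence: float,
                  threshold: float = 0.5) -> str:
    """Place a product into a Grid quadrant from two normalized 0-1 scores.
    Thresholds are invented for illustration; the real scoring weighs review
    counts, market share, vendor size, and social impact."""
    if satisfaction >= threshold and market_presence >= threshold:
        return "Leader"
    if satisfaction >= threshold:
        return "High Performer"
    if market_presence >= threshold:
        return "Contender"
    return "Niche"

print(grid_category(satisfaction=0.9, market_presence=0.8))  # Leader
print(grid_category(satisfaction=0.9, market_presence=0.2))  # High Performer
```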

Source: g2crowd.com

Frozen phone? Cosmic rays could be to blame


Next time your smartphone freezes, think twice before cursing the shoddy workmanship of the phone manufacturer under your breath. The culprit might actually be the cosmic rays that are constantly raining down on us from outer space and can mess with the integrated circuits in electronic devices. A new study by Vanderbilt University has examined how modern consumer electronics are becoming more vulnerable to cosmic interference, and suggested ways for manufacturers to build better chips.

Thought to be produced by supernovae, cosmic rays are particles that travel through space at close to the speed of light, and they can be dangerous to humans and electronics alike. While the Earth’s electromagnetic field shields us from the worst of the damage, astronauts in orbit or, eventually, journeying to Mars, can soak up unhealthy amounts of radiation fairly quickly. Likewise, satellites and probes need to carry proper shielding to protect their delicate electronics from failure.

Here on Earth, oxygen and nitrogen in the atmosphere break these cosmic rays down into other secondary particles, like neutrons, pions, positrons and muons. We’re being showered in these lighter particles every second of the day, and although they’re harmless to living organisms, they can interfere with electronic systems. Granted, a quick reboot can usually fix the problem, but unfortunately, the more advanced a computer system is, the more susceptible it is to failure by cosmic rays.

“The semiconductor manufacturers are very concerned about this problem because it is getting more serious as the size of the transistors in computer chips shrink and the power and capacity of our digital systems increase,” says Bharat Bhuva, a professor and member of Vanderbilt University’s Radiation Effects Research Group. “In addition, microelectronic circuits are everywhere and our society is becoming increasingly dependent on them.”

Some of these particles have enough energy to alter individual bits of data in an electronic system, switching them from a zero to a one (or vice versa) in a process called a “bit flip.” While it might sound too small to be a problem, the effects can be catastrophic: Bhuva illustrated the point with the example of a Belgian voting machine in 2003, where a bit flip resulted in over 4,000 erroneous votes. In 2008, the autopilot system in a Qantas A330 failed, causing the plane to buck and dive, injuring 119 people on board. Although it’s hard to determine exactly what caused a given bit flip, cosmic rays were suspected in both incidents.

“When you have a single bit flip, it could have any number of causes,” says Bhuva. “It could be a software bug or a hardware flaw, for example. The only way you can determine that it is a single-event upset is by eliminating all the other possible causes.”

So the Vanderbilt team tested the rate at which several generations of transistors fail as a result of a single-event upset (SEU), or a bit flip caused by cosmic rays. By blasting samples of these chips with a neutron beam, the researchers measured how many failures occurred and found that, overall, that number is growing with each generation.

First, the “good” news: individual transistors are much less likely to experience an SEU now than ever. That’s probably because they’re shrinking with each generation, making them smaller physical targets for particles to strike. And since they’re now made in a three-dimensional architecture, the chips are also much hardier against SEUs.

But the problem is that modern devices contain billions of transistors, and each one also requires a smaller electrical charge to make up each bit of information. All factors considered, devices at the system level are increasingly vulnerable to cosmic ray-induced failures.

“Our study confirms that this is a serious and growing problem,” says Bhuva. “This did not come as a surprise. Through our research on radiation effects on electronic circuits developed for military and space applications, we have been anticipating such effects on electronic systems operating in the terrestrial environment.”

Bhuva pointed out that industries like aviation, IT, transportation, spacecraft, communications, power and medical technology are addressing the problem in their devices, but so far consumer electronics have been lagging behind. Shielding isn’t practical in everyday devices, but steps can be taken at the design level, by building in redundancy measures.

“The probability that SEUs will occur in two of the circuits at the same time is vanishingly small,” says Bhuva. “So if two circuits produce the same result it should be correct.”
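The duplicate-and-compare idea is easy to see in miniature. This small, self-contained sketch simulates a single-event upset on one of three redundant copies of a byte and recovers the correct value by majority vote (triple modular redundancy); the values are invented for illustration:

```python
import random

def flip_random_bit(value: int, width: int = 8) -> int:
    """Simulate a single-event upset: XOR one randomly chosen bit."""
    return value ^ (1 << random.randrange(width))

def vote(readings):
    """Majority vote across redundant copies of the same value."""
    return max(set(readings), key=readings.count)

true_value = 0b01100110
copies = [true_value, true_value, flip_random_bit(true_value)]  # one copy upset
print(vote(copies) == true_value)  # True: the two clean copies outvote the flip
```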

Bhuva presented the Vanderbilt team’s findings at the annual American Association for the Advancement of Science meeting last week.

Source: Vanderbilt University

MD Anderson Benches IBM Watson In Setback For Artificial Intelligence In Medicine

It was one of those amazing “we’re living in the future” moments. In an October 2013 press release, IBM declared that MD Anderson, the cancer center that is part of the University of Texas, “is using the IBM Watson cognitive computing system for its mission to eradicate cancer.”