Understanding Data Roles

With the rise of Big Data has come an accompanying explosion in roles that in some way involve data. Most people who work with enterprise technology are at least familiar with these roles by name, but it is sometimes helpful to look at them through a comprehensive lens that shows how they all fit together. In understanding how data roles mesh, think of them in terms of two pools: one responsible for making data ready for use, and another that puts that data to use. The latter includes the tightly woven roles of Data Analyst and Data Scientist; the former includes roles such as Database Administrator, Data Architect and Data Governance Manager.

Ensuring the Data Is Ready for Use

Making Sure the Engine Works.

A car is only as good as its engine, and according to PC Magazine the Database Administrator (DBA) is “responsible for the physical design and management of the database and for the evaluation, selection and implementation of the DBMS.” Techopedia defines the position as one that “directs or performs all activities related to maintaining a successful database environment.” A DBA’s responsibilities include security, optimization, monitoring and troubleshooting, and ensuring the capacity needed to support activities. This of course requires a high level of technical expertise–particularly in SQL, and increasingly in NoSQL. But while the role may be technical, TechTarget maintains that it may also require managerial functions, including “establishing policies and procedures pertaining to the management, security, maintenance, and use of the database management system.”
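
To make those responsibilities concrete, here is a minimal, hypothetical sketch using Python’s built-in sqlite3 module. The orders table, the index name and the data are all invented, and a working DBA would use the native tooling of their own DBMS, but the tasks shown (optimization, monitoring and maintenance) are the same in spirit.

```python
import sqlite3

# Hypothetical sketch of routine DBA-style tasks, shown with Python's
# built-in sqlite3 module purely for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stand-in for an existing application table.
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
cur.executemany("INSERT INTO orders (customer, total) VALUES (?, ?)",
                [("alice", 19.99), ("bob", 5.00), ("alice", 42.50)])

# Optimization: add an index to speed up frequent lookups by customer.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer)")

# Monitoring/troubleshooting: confirm the query planner uses the index.
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'alice'"
).fetchall()
uses_index = any("idx_orders_customer" in row[-1] for row in plan)

# Maintenance: run a basic integrity check on the database.
status = cur.execute("PRAGMA integrity_check").fetchone()[0]
print(uses_index, status)  # → True ok
```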

Directing the Vision. With the database engines in place, the task becomes one of creating an infrastructure for taking in, moving and accessing the data. If the DBA builds the car, then the Enterprise Data Architect (EDA) builds the freeway system, laying the framework for how data will be stored, shared and accessed by different departments, systems and applications, and aligning it to business strategy. Bob Lambert describes the skills as including an understanding of the system development life cycle; software project management approaches; data modeling, database design, and SQL development. The role is strategic, requiring an understanding of both existing and emerging technologies (NoSQL databases, analytics tools and visualization tools), and how those may support the organization’s objectives. The EDA’s role requires knowledge sufficient to direct the components of enterprise architecture, but not necessarily practical skills of implementation. With that said, Monster.com lists typical responsibilities as: determining database structural requirements, defining physical structure and functional capabilities, security, backup, and recovery specifications, as well as installing, maintaining and optimizing database performance.

Creating and Enforcing the Rules of Data Flow. A well-architected system requires order. A Data Governance Manager organizes and streamlines how data is collected, stored, shared/accessed, secured and put to use. But don’t think of the role as a traffic cop–the rules of the road exist not only to prevent ‘accidents’, but also to ensure efficiency and value. The governance manager’s responsibilities include enforcing compliance, setting policies and standards, managing the lifecycle of data assets, and ensuring that data is secure, organized and accessible by–and only by–appropriate users. By so doing, the data governance manager improves decision-making, eliminates redundancy, reduces the risk of fines and lawsuits, and ensures the security of proprietary and confidential information, so that the organization achieves maximum value at minimum risk. The position implies at least a functional knowledge of databases and associated technologies, and a thorough knowledge of industry regulations (FINRA, HIPAA, etc.).

Making Use of the Data

We create a system in which data is well organized and governed so that the business can make maximum use of it: informing day-to-day processes, and deriving insight through data analysts and data scientists to improve efficiency and drive innovation.

Understand the past to guide future decisions. A Data Analyst performs statistical analysis and problem solving, taking organizational data and using it to facilitate better decisions on items ranging from product pricing to customer churn. This requires statistical skills, and the critical thinking to draw supportable conclusions. An important part of the job is to make data palatable to the C-suite, so an effective analyst is also an effective communicator. MastersinScience.org refers to data analysts as “data scientists in training” and points out that the line between the roles is often blurred.

Modeling the Future. Data scientists combine advanced mathematical/statistical abilities with advanced programming abilities, including a knowledge of machine learning and the ability to code in SQL, R, Python or Scala. A key differentiator is that where the Data Analyst primarily analyzes batch/historical data to detect past trends, the Data Scientist builds programs that predict future outcomes. Furthermore, data scientists are building machine learning models that continue to learn and refine their predictive ability as more data is collected.
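
As an illustration of that difference, here is a minimal sketch using only the standard library and invented monthly sales figures: rather than simply reporting the historical numbers, it fits a least-squares trend line to them and uses the fitted model to predict the next, as-yet-unseen period.

```python
# A minimal predictive-modeling sketch. The monthly_sales figures are
# made up for illustration; real pipelines would use far richer models.
monthly_sales = [100, 108, 115, 125, 131, 140]  # hypothetical history
n = len(monthly_sales)
xs = list(range(n))

# Ordinary least-squares fit of y = intercept + slope * x.
x_mean = sum(xs) / n
y_mean = sum(monthly_sales) / n
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, monthly_sales))
         / sum((x - x_mean) ** 2 for x in xs))
intercept = y_mean - slope * x_mean

# Predict the next (unseen) month from the fitted trend.
forecast = intercept + slope * n
print(round(forecast, 1))  # → 147.7
```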

Of course, as data increasingly becomes the currency of business, as it is predicted to, we expect to see more roles develop, and the ones just described evolve significantly. In fact, we haven’t even discussed one role that is now mandated by the EU’s GDPR initiative: the Chief Data Officer, or ‘CDO’.

Source: datasciencecentral.com


The Ultimate Data Set


Until recently, using entire populations as data sets was impossible—or at least impractical—given limitations on data collection processes and analytical capabilities. But that is changing.

The emerging field of computational social science takes advantage of the proliferation of data being collected to access extremely large data sets for study. The patterns and trends in individual and group behavior that emerge from these studies provide “first facts,” or universal information derived from comprehensive data rather than samples.

“Computational social science is an irritant that can create new scientific pearls of wisdom, changing how science is done,” says Brian Uzzi, a professor of management and organizations at the Kellogg School. In the past, scientists have relied primarily on lab research and observational research to establish causality and create descriptions of relationships. “People who do lab studies are preoccupied with knowing causality,” Uzzi says. “Computational work says, ‘I know that when you see X, you see Y, and the reason why that happens may be less important than knowing that virtually every time you see X, you also see Y.’”

“Big data goes hand in hand with computational work that allows you to derive those first facts,” Uzzi says. “Instead of trying to figure out how scientists come up with great ideas by looking at 1,000 scientists, you look at 12,000,000 scientists—potentially everyone on the planet. When you find a relationship there, you know it’s universal. That universality is the new fact on which science is being built.”


Computation in the Social Sphere

Studying large data sets for first facts about human behavior has led to striking advances in recent years. Uzzi notes how one particular data set—mobile-phone data—“has taught us very distinctively about human mobility and its implications for economic and social stratification in society.” It has also shed light on how people behave during evacuations and emergency situations, including infectious-disease outbreaks. Knowing how behaviors affect the spread of diseases can help public health officials design programs to limit contagion.

The ability to track the social behavior of large groups has also shifted people’s understanding of human agency. “Until recently, we really believed that each of us made our decisions on our own,” Uzzi says. “Our friends may have influenced us here or there but not in a big way.” But troves of social-media data have shown that people are incredibly sensitive and responsive to what other people do. “That’s often the thing that drives our behavior, rather than our own individual interests or desires or preferences.”

This may change how you think about your consumer behavior, your exercise regimen, or what you Tweet about. Researchers like Uzzi are also deeply interested in how this responsiveness influences political behavior on larger issues like global climate change or investments in education systems. Think of it as a shift from envisioning yourself as a ruggedly individual, purely rational, economic person to a sociological person who encounters and engages and decides in concert with others.

One aspect of computational social science—brain science—has already discovered that those decisions are often being made before we even know it. “Brain science has taught us a lot about how the brain reacts to stimuli,” Uzzi says. With the visual part of your brain moving at roughly 8,000 times the speed of the rest of your brain, the visual cortex has already begun processing information—and leaping to certain conclusions—before the rest of your brain ever catches up. And with 40 percent of the brain’s function devoted strictly to visualization, “if you want to get in front of anything that’s going to lead to a decision, an act of persuasion, an in-depth engagement with an idea, it has got to be visual.”

“The really big things are understanding how something diffuses through a population and how opinions change,” Uzzi says. “If you put those two things together, you really have an understanding of mass behavior.”

This increased understanding of both mass and individual behavior presents huge opportunities for businesses, notably in the health sphere. “There is going to be an entirely new ecology of business that goes beyond how we think about health today,” Uzzi says. “For many people, there is no upper threshold on what they will pay for good health and beauty. With health increasingly decentralized to the individual, that’s going to spin off to companies that want to take advantage of this information to help people do things better.”

Scaling from One to Everyone

While gathering data on groups as large as the entire population is beneficial to scientists, marketers, and the like, computational social science has the scalability to allow for practical data generation on an individual level as well. This means that you can be the subject of your own data-rich computational study, without control groups or comparison testing. “You actually generate enough data on yourself, every day, that could be collected, that you can be the subject of a computational study,” Uzzi says.

Developments in the ability to collect and parse data on individuals are one area where computational social science has the potential to transform people’s lives—from providing more information about individuals’ own health to raising their awareness of unconscious biases to showing how their decision-making processes are influenced by others. “It’s going to allow people to personally use data that can help them improve their lives in a way that they never imagined before,” Uzzi says.

For example, using wearable technologies allows for sensor data collection that can include emotional activation and heart-rate monitoring in social interactions, caloric intake, biorhythms, and nervous energy. The crunching of that raw data into actionable information will happen through our machines. If you think you have a close connection to your smartphone and your tablet now, wait until you rely on it to tell you how much that last workout helped—or did not help—you shake off the tension of a long day at the office.

“Our closest partnership in the world is probably going to be our machine that helps us manage all this,” Uzzi says. This can be transformative by making us healthier.

It may make us less discriminatory, too. We all have cognitive biases that lead us to make irrational decisions. These are thought to be hard-wired things we can identify but not necessarily change on our own. Sensor data can provide a feedback loop of how we have acted in the past. This has the potential to improve future decision making. If your sensors pick up signals that show your body acting differently around certain groups, perhaps in ways that you suppress or to which you are oblivious, that may be harder to ignore.

“Our own sense of identity could be greatly shaken by this, or improved, or both.”

Source: Kellogg Insight

What Is Machine Learning?

Machine Learning for Dummies


Amazon uses it. Target uses it. Google uses it. “It” is machine learning, and it’s revolutionizing the way companies do business worldwide.

Machine learning is the ability for computer programs to analyze big data, extract information automatically, and learn from it. With 250 million active customers and tens of millions of products, Amazon’s machine learning makes accurate product recommendations based on the customer’s browsing and purchasing behavior almost instantly. No humans could do that.
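
One simple way to see how such recommendations can work is to count co-purchases: items frequently bought alongside the one a customer is viewing become candidates to recommend. This is a toy sketch with invented shopping baskets, not a description of Amazon’s actual system, which is far more sophisticated.

```python
from collections import Counter

# Invented order histories: each basket is the set of items in one order.
baskets = [
    {"laptop", "mouse", "usb_hub"},
    {"laptop", "mouse"},
    {"laptop", "keyboard"},
    {"mouse", "usb_hub"},
]

def recommend(item, baskets, k=2):
    """Rank the items most often bought alongside `item`."""
    co_counts = Counter()
    for basket in baskets:
        if item in basket:
            co_counts.update(basket - {item})
    return [other for other, _ in co_counts.most_common(k)]

# "Customers who bought a laptop also bought..."
print(recommend("laptop", baskets))
```

With these baskets, “mouse” ranks first because it co-occurs with “laptop” most often; real systems apply the same signal at the scale of hundreds of millions of customers.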

Target uses machine learning to predict the offline buying behaviors of shoppers. A memorable case study highlights how Target knew a high school girl was pregnant before her parents did.

Google’s driverless cars are using machine learning to make our roads safer, and IBM’s Watson is making waves in healthcare with its machine learning and cognitive computing power.

Is your business next? Can you think of any deep data analysis or predictions that your company can produce? What impact would it have on your business’s bottom line, or how could it give you a competitive edge?

Why Is Machine Learning Important?

Data is being generated faster than at any other time in history. We are now at a point where data analysis cannot be done manually due to the sheer amount of data. This has driven the rise of machine learning — the ability for computer programs to analyze big data and extract information automatically.

The purpose of machine learning is to produce more positive outcomes with increasingly precise predictions. These outcomes are defined by what matters most to you and your company, such as higher sales and increased efficiency.

Every time you search on Google for a local service, you are feeding valuable data into Google’s machine learning algorithms. This allows Google to produce increasingly relevant rankings for local businesses that provide that service.

Big Big Data

It’s important to remember that the data itself will not produce anything; the value lies in drawing accurate insights from it. The success of machine learning depends upon the right learning algorithm and accurate data sets. These allow a machine to obtain the most efficient insights possible from the information provided. And as with human data analysts, one model may catch an error that another misses.

Digital Transformation

Machine learning and digital technologies are disrupting every industry. According to Gartner, “Smart machines will enter mainstream adoption by 2021.” Adopting early may provide your organization with a major competitive edge. Personally, I’m extremely excited by the trend and recently spent time at Harvard attending its Competing on Business Analytics and Big Data program along with 60 senior global executives from various industries.

Interested In Bringing The Power Of Machine Learning To Your Company?

Here are my recommendations to get started with the help of the right tools and experts:

  1. Secure all of the past data you have collected (offline and online sales data, accounting, customer information, product inventory, etc.). If you think your company doesn’t generate enough data to require machine learning, I can assure you that there is more data out there than you think, starting with general industry data. Next, think about how you can gather even more data points from all silos of your organization and elsewhere, like chatter about your brand on social media.
  2. Identify the business insights that you would benefit from most. For example, some companies are using learning algorithms for sales lead scoring.
  3. Create a strategy with clear executables to produce the desired outcomes such as fraud protection, higher sales, increased profit margin and the ability to predict customer behavior. Evaluate and revisit this strategy regularly.
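
To make the lead-scoring example in step 2 concrete, here is a deliberately simple, hypothetical sketch. The signal names and weights are invented; in a real deployment a learning algorithm would derive the weights from historical conversion data, but the output is the same kind of thing: a single score for ranking prospects.

```python
# Hypothetical hand-set weights; a real system would learn these from
# historical data about which leads actually converted.
WEIGHTS = {
    "visited_pricing_page": 30,
    "opened_last_email": 10,
    "company_size_over_100": 25,
    "requested_demo": 35,
}

def score_lead(signals):
    """Sum the weights of the signals this lead exhibits (0-100)."""
    return sum(WEIGHTS[s] for s in signals if s in WEIGHTS)

hot_lead = ["visited_pricing_page", "requested_demo"]
cold_lead = ["opened_last_email"]
print(score_lead(hot_lead), score_lead(cold_lead))  # → 65 10
```

A sales team would then work the highest-scoring leads first, and revisit the weights as outcomes accumulate.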

Source: Forbes

5 Questions to Assess Digital Transformation at the Enterprise Level


Digital transformation is still one of the business buzzwords of the year. It is estimated that 89% of organizations have digital transformation as a business priority. But if you feel like you’ve come to a standstill in your digital transformation efforts, you are not alone. As many as 84% of digital transformation efforts fail to achieve desired results. And that statistic would likely be higher if we examined only the larger, enterprise level efforts.

What exactly is digital transformation? According to researchers at MIT Sloan, digital transformation occurs when businesses are focused on integrating digital technologies, such as social, mobile, analytics and cloud, in the service of transforming how their businesses work. The preoccupation with digital transformation makes sense given the pace of change. Richard Foster, at the Yale School of Management, found that the average lifespan of an S&P company dropped from 67 years in the 1920s to 15 years today.

Creating digital products receives a lot of press. For example, the 2017 Ford GT supercar has been advertised as having the dashboard of the future, featuring a state-of-the-art 10-inch digital instrument display that helps reduce driver distraction. Yet Ford’s share price is down nearly 30% over the past 3 years. On the other hand, the design of the Airbus A380 aircraft had some exciting digital innovations, but Airbus also leveraged big data to improve customer experience, with very positive results for the company’s share price over the past 3 years. GE is another example of a company that has pursued digital transformation to reinvent its own industrial operations through digital technology, and then leveraged those learnings to help its customers do likewise. While the product innovations are sometimes impressive, more than purely product-related innovations are needed for digital transformation at the enterprise level.

There’s no doubt that digital tools (social, mobile, analytics and cloud, sometimes referred to by the acronym “SMAC”) create value – but digital transformation at the enterprise level must go beyond just the tools.

Having a transformative purpose or vision and a process-based view is recognized as being important. In “Leading Digital,” the authors found that firms with a strong vision and mature processes for digital transformation were more profitable on average, had higher revenues, and achieved a bigger market valuation than competitors without a strong vision. Yet more reason to emphasize that while technology is integral to digital transformation – it can’t just be about technology. If we go back to the early days of the research on digital transformation, it was proposed that true digital transformation at the enterprise level needs to embrace fundamental change in three areas: customer experience, operational processes, and business models.

Focusing on customer experience is central to success. According to the Altimeter Group in 2014, around 88% of companies reported undergoing digital transformation – yet only 25% of respondents indicated that they had mapped the customer journey. The 2016 update to this research, based on survey data from 528 leaders, found that the number of companies that had mapped the customer journey had risen to 54% – indicating a positive trend, but still a way to go.

Improving the organization’s ability to manage end-to-end business processes is also needed for success with digital transformation. Where does your organization stand in terms of its process maturity? Are you just beginning the process improvement and management journey, or is the organization well on the way to modeling, improving, measuring and managing its key business processes to achieve business goals? If there is room to improve your people’s skills in areas such as BPM, customer experience and change management, then you may wish to explore the training programs offered on these topics at: http://www.bpminstitute.org/learning-paths.

Further, the answers to the following questions may provide you with additional insight on your organization’s situation on its enterprise digital transformation journey:

  1. To what extent is your company strategy driving the digital transformation program?
  2. To what extent are you actively challenging the elements of your business model (i.e. value proposition, delivery channels, etc.)?
  3. To what extent are you exploring new digital businesses and digitally modified businesses?
  4. To what extent do your leaders have a shared understanding of the entire customer journey?
  5. To what extent are you deploying digital to redesign end to end business processes?

Recall the power of the one-page principle. This involves having a high-level schematic – just one page for your customer journey map, one page for your business model, and one page for your process relationship map. That’s what drives discussion, collaboration and storytelling. Of course, some of these high-level schematics need to be developed at a more granular level of detail – but the one-page view is what captures attention and drives dialogue.

The vast majority of digital transformation efforts at the enterprise level are led from the top. Leading by example is part of the success formula as well as defining clear priorities and managing the cross-functional interdependencies that many digital solutions often involve. Chances for success are amplified when employees believe that their leaders have the skills to lead the digital strategy and understand the major digital trends – and that is augmented with stories.

How can you get started on the journey? The following were some of the tips presented by Gartner at the Program & Portfolio Management Summit (PPM) in Orlando:

• Assess your organization’s appetite for risk taking
• Be introspective
• Introduce innovation into every project
• Find a project that can be monetized with digital
• Engage in experiments and communicate lessons learned

One of the keynotes at the 2017 Gartner PPM also emphasized that digital business is an entirely new game, the rules of which are not yet written. Whatever road you choose for your digital transformation journey, it will be important to take into account the central role of customer experience, the power of process management, and the importance of having clear priorities.

Source: BPM Institute

Apple’s AR platform: These demos show what ARKit can do in iOS 11



Apple sees a lot of potential in augmented reality.

Ever since Pokemon Go exploded in popularity last summer and subsequently revived interest in both Apple’s App Store and mobile gaming, Apple has said several times that it is embracing the technology, which is commonly called AR, especially now that it offers the ARKit platform. Here’s everything you need to know about ARKit, including what it can do and examples of its power in action.

What is AR?

Augmented reality isn’t a new technology, but Apple is now jumping into AR, so everyone’s been talking about it. While virtual reality immerses you in a virtual space, essentially replacing everything you see in the physical world, AR takes the world around you and adds virtual objects to it. You can look through your phone, for instance, and see a Pokemon standing in your living room.

What is Apple ARKit?

With iOS 11, which debuted at WWDC 2017, Apple is officially acknowledging AR. It has introduced the ARKit development platform, allowing app developers to quickly and easily build AR experiences into their apps and games. It will launch alongside iOS 11 this autumn. When it’s finally live, it’ll use your iOS device’s camera, processors, and motion sensors to create some immersive interactions.

It also uses a technology called Visual Inertial Odometry in order to track the world around your iPad or iPhone. This functionality allows your iOS device to sense how it moves in a room. ARKit will use that data to not only analyse a room’s layout, but also detect horizontal planes like tables and floors and serve up virtual objects to be placed upon those surfaces in your physical room.

What’s the point of ARKit?

Developers are free to create all kinds of experiences using ARKit, some of which are already being shown off on Twitter. IKEA even announced it is developing a new AR app built on ARKit that will let customers preview IKEA products in their own homes before making a purchase. IKEA said that Apple’s new platform will allow AR to “play a key role” in new product lines.

That last bit is key. For Apple, ARKit opens up an entirely new category of apps that would run on every iPhone and iPad. It essentially wants to recreate and multiply the success of Pokemon Go. Plus, it opens up so many long-term possibilities. The company is rumoured to be working on an AR headset, for instance. Imagine wearing Apple AR glasses capable of augmenting your world every day.

Does ARKit face any competition?

Let’s also not forget that ARKit allows Apple to compete with Microsoft’s Hololens and Google’s Tango AR kit. But while Hololens and Tango are designed to be aware of multiple physical spaces and all of the shapes contained within, ARKit is more about detecting flat surfaces and drawing on those flat surfaces. In other words, it’s more limited, but we’re still in early-days territory right now.

We actually think ARKit’s capabilities, as of July 2017, remind us of the AR effects found inside Snapchat or even the Facebook Camera app. The potential of Apple’s AR platform will likely improve as we move closer to the launch of iOS 11, however.

Which iOS devices can handle ARKit apps?

Any iPhone or iPad capable of running iOS 11 will be able to install ARKit apps. However, we’re assuming newer devices will handle the apps better. For instance, the new 10.5-inch and 12.9-inch iPad Pro tablets that debuted during WWDC 2017 have bumped-up display refresh rates of 120Hz, which means what you see through the camera should look much more impressive on those devices.

How do you get started with ARKit?

If you’re interested in building ARKit apps for iOS 11, go to the Apple Developer site, which has forums for building AR apps and beta downloads. If you’re a consumer who is just excited to play, you can go get the new iPad Pro and install the iOS 11 public beta to try out some of the early demos for AR. Otherwise, wait for iOS 11 to officially release alongside new AR apps in the App Store.

Source: pocket-lint.com

3 Technologies You Need To Start Paying Attention To Right Now


At any given time, a technology or two captures the zeitgeist. A few years ago it was social media and mobile that everybody was talking about. These days it’s machine learning and blockchain. Everywhere you look, consulting firms are issuing reports, conferences are being held and new “experts” are being anointed.

In a sense, there’s nothing wrong with that. Social media and mobile computing really did change the world and, clearly, the impact of artificial intelligence and distributed database architectures will be substantial. Every enterprise needs to understand these technologies and how they will impact its business.

Still, we need to remember that we always get disrupted by what we can’t see. The truth is that the next big thing always starts out looking like nothing at all. That’s why it’s so disruptive. If we saw it coming, it wouldn’t be. So here are three technologies you may not have heard about, but that you should start paying attention to. The fate of your business may depend on it.

1. New Computing Architectures

In the April 19, 1965 issue of Electronics, Intel co-founder Gordon Moore published an article observing that the number of transistors on a silicon chip was doubling roughly every two years. Over the past half century, that consistent doubling of computing power, now known as Moore’s Law, has driven the digital revolution.
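
The arithmetic behind that claim is simple: doubling every two years compounds to a factor of 2^(years/2). A quick sketch:

```python
# Doubling every `doubling_period` years compounds to
# 2 ** (years / doubling_period).
def growth_factor(years, doubling_period=2):
    return 2 ** (years / doubling_period)

# Over the half century since Moore's 1965 article: 25 doublings.
print(growth_factor(50))  # → 33554432.0
```

That is a roughly 33-million-fold increase in transistor counts over fifty years, which is why even a modest slowdown in the doubling period matters so much.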

Today, however, that process has slowed, and it will soon come to a complete halt. There are only so many transistors you can cram onto a silicon wafer before subatomic effects come into play and make it impossible for the technology to function. Experts disagree on exactly when this will happen, but it’s pretty clear that it will be sometime within the next five years.

There are, of course, a number of ways to improve chip performance other than increasing the number of transistors, such as FPGA, ASIC and 3D stacking. Yet those are merely stopgaps and are unlikely to take us more than a decade or so into the future. To continue to advance technology over the next 50 years, we need fundamentally new architectures like quantum computing and neuromorphic chips.

The good news is that these architectures are very advanced in their development and we should start seeing a commercial impact within 5-10 years. The bad news is that, being fundamentally new architectures, nobody really knows how to use them yet. We are, in a sense, back to the early days of computing, with tons of potential but little idea how to actualize it.

2. Genetic Engineering

While computer scientists have been developing software languages over the past 50 years, biologists have been trying to understand a far more pervasive kind of code: the genetic code. For the most part, things have gone slowly. Although there has been significant scientific progress, the impact of that advancement has been relatively paltry.

That began to change in 2003 with the completion of the Human Genome Project. For the first time, we began to truly understand how DNA interacts with our biology, which led to other efforts, such as the Cancer Genome Atlas, as well as tangible advancements in agriculture. For the first time, genomics became more than mere scientific inquiry; it became a source of new applications.

Now a new technology called CRISPR is allowing scientists to edit genes at will. In fact, because the technology is simple enough for even amateur biologists to use, we can expect genetic engineering to become much more widespread across industries. Early applications include liquid fuels from sunshine and genomic vaccines.

“CRISPR is accelerating everything we do with genomics,” Megan Hochstrasser of the Innovative Genomics Initiative at Cal Berkeley told me, “from cancer research to engineering disease resistant crops and many other applications that haven’t yet come to the fore. Probably the most exciting aspect is that CRISPR is so cheap and easy to use, it will have a democratizing effect, where more can be done with less. We’re really just getting started.”

3. Materials Science

Traditionally, the way you improved a material to build a product has been a process of trial and error. You changed the ingredients or the process by which you made it and saw what happened. For example, at some point a medieval blacksmith figured out that annealing iron would make better swords.

Today, coming up with better materials is a multi-billion-dollar business. Consider the challenges that Boeing faced when designing its new Dreamliner. How do you significantly increase the performance of an airplane, a decades-old technology? Yet by discovering new composite materials, the company was able to reduce weight by 40,000 pounds and fuel use by 20%.

With this in mind, the Materials Genome Initiative is building databases of material properties such as strength and density, along with computer models that predict which processes will achieve the qualities a manufacturer is looking for. As a government program, it is also able to make the data widely available for anyone who wants to use it, not just billion-dollar companies like Boeing.

“Our goal is to speed up the development of new materials by making clear the relationship between materials, how they are processed and what properties are likely to result,” Jim Warren, Director of the Materials Genome program told me. “My hope is that the Materials Genome will accelerate innovation in just about every industry America competes in.”

It’s Better To Prepare Than Adapt

For the past few decades, great emphasis has been put on agility and adaptation. When a new technology, like social media, mobile computing or artificial intelligence, begins to disrupt the marketplace, firms rush to figure out what it means and adapt their strategies accordingly. If they could do that a bit faster than the competition, they would win.

Today, however, we’re entering a new era of innovation that will look much more like the 50s and 60s than it will the 90s and aughts. The central challenge will no longer be to dream up new applications based on improved versions of old technologies, but to understand fundamentally new paradigms.

That’s why over the next few decades, it will be more important to prepare than adapt. How will you work with new computing architectures? How will fast, cheap genetic engineering affect your industry? What should you be doing to explore new materials that can significantly increase performance and lower costs? These are just some of the questions we will grapple with.

Not all who wander are lost. The challenge is to wander with purpose.

Source: Digital Tonto

AI Models For Investing: Lessons From Petroleum (And Edgar Allan Poe)


A decade ago, at a NY conference, an analyst put up slides showing his model of the short-term oil price (variables like inventories, production and demand trends, and so forth). I turned to the colleague next to me and said, “I just want to ask him, ‘How old are you?’” I worked on a computer model of the world oil market starting in 1977, when the model was run from a remote terminal and the output had to be picked up on the other side of campus. (Yes, by dinosaurs.) Although I haven’t done formal modeling in recent years, my experiences might provide some insight into the current fashion for using computer models in investing (among other things).

About two centuries ago, Baron von Maelzel toured the U.S. with an amazing clockwork automaton (invented by Baron Kempelen), a chess-playing “Turk” in the form of a mannequin at a desk with a chess board. The mannequin was dressed as a Turk, a nod to the period’s perception of Turks’ superior wisdom. The automaton could not only play chess very well, but solve problems presented to it that experts found difficult. Viewers were amazed, given the complexity of chess, and its level of play was not matched by actual computers for nearly two centuries. None of the Turk’s observers could initially explain the mechanism by which such feats were performed.

This is reminiscent of the 1970s, when Uri Geller claimed to have paranormal abilities that physicists from SRI found they could not explain. The reason: he wasn’t performing feats of physics but sleight of hand, as demonstrated by the Amazing Randi, who was not a scientist but an expert in that craft. (Similarly, peak oil advocates are often amazed by techniques of scientists that are actually statistical in nature, and done wrong.)

Edgar Allan Poe considered the case and proved to be the Amazing Randi of his day. The chess-playing Turk was the result of “the wonderful mechanical genius of Baron Kempelen [that] could invent the necessary means for shutting a door or slipping aside a panel with a human agent too at his service…” in Poe’s words. He noted that the Baron would open one panel on the desk, show no one behind it, close it and open the other, again revealing no human agent; but this is just a standard magician’s trick, where the subject simply moves from one side to the other. Indeed, others claimed to have seen a chess player exit the desk after the audience had left.

Computer models often fall into this category. No matter how scientific and objective they appear, there is always a human agent behind them. In oil market modeling in the 1970s, this took the form of the price mechanism. NYU Professor Dermot Gately had suggested that prices moved according to capacity utilization in OPEC, as in the following figure (later used by the Energy Information Administration, among many others). If utilization was above 80%, prices would rise sharply, below 80% they would taper off.
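The threshold behavior described above can be sketched in a few lines. This is a minimal illustration with invented coefficients, not Gately’s actual equations:

```python
def price_change(utilization, threshold=0.80):
    """Toy percent change in the oil price implied by OPEC capacity
    utilization: sharp rise above the threshold, mild easing below it."""
    if utilization >= threshold:
        return 500 * (utilization - threshold)   # steep response above 80%
    return -25 * (threshold - utilization)       # gentle taper below 80%
```

The exact numbers don’t matter; the point is the asymmetry of the curve, with a steep response above the threshold and a mild one below it.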

This made sense, given that many industries use a similar conceptual model to predict inflation: high utilization in the steel industry results in higher steel prices, etc. And the model certainly seems to fit the existing data.

At least until 1986. After 1985, the data points no longer fit the curve; for the last two years the model was well off. The EIA stopped publishing the figure after 1987, although it continued to use the formula for some time.

What had become obscured by the supposed success of the formula was that it was intended to describe short-term price changes. High steel capacity utilization would mean higher steel prices, but it would also spur investment and more capacity, so that prices would stabilize and even drop.

But oil models couldn’t capture this, because much of the capacity was in OPEC and it was assumed that OPEC would not necessarily invest in response to higher prices. Instead, the programmer had to choose numbers for OPEC’s future capacity and input them into the machine, meaning the programmer had control over the price forecast by simply modifying the capacity numbers. Despite the ‘scientific’ appearance of the computer model, there really was a man in the machine making the moves.
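To see the man in the machine at work, consider a toy forecast loop (all figures invented) using a threshold rule of the shape described earlier. The same demand path and the same price rule produce very different forecasts depending solely on the OPEC capacity numbers the programmer types in:

```python
def forecast_prices(demand, capacity, p0=30.0, threshold=0.80):
    """Walk the price forward year by year; utilization above the
    threshold pushes the price up sharply, below it eases gently."""
    prices, p = [], p0
    for d, c in zip(demand, capacity):
        u = d / c
        pct = 500 * (u - threshold) if u >= threshold else -25 * (threshold - u)
        p *= 1 + pct / 100
        prices.append(round(p, 2))
    return prices

demand = [30, 31, 32, 33]                                        # mb/d, assumed
optimist  = forecast_prices(demand, capacity=[40, 41, 42, 43])   # OPEC invests
pessimist = forecast_prices(demand, capacity=[38, 38, 38, 38])   # OPEC stands pat
```

Where OPEC is assumed to stand pat, utilization crosses the threshold and the forecast price climbs; assume investment instead and the price drifts down. The “forecast” is really the capacity assumption in disguise.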

People have long sought to reduce the influence of fallible humans, whether by replacing workers with machines or by putting control of our nuclear weapons in the hands of Colossus, a giant computer that would avoid an accidental nuclear war (the 1970 movie Colossus: The Forbin Project, fourteen years before Terminator’s Skynet). This ignores the fact that there is always a human element, even if only in the design.

Although I have no expertise in the field of artificial intelligence, it seems to me that AI trading programs might learn, but won’t they learn what they are taught to do? Will this not simply be an extension of the algorithms already used in the financial world, at whose core is simply a comparison of current with historical data and trends?

And this, after all, is what led to the financial meltdown described so aptly in When Genius Failed, the story of Long Term Capital Management and the way it nearly crashed the world economy. Recognizing patterns of behavior preceding an OPEC meeting, such as the way prices move in response to comments by member country ministers, can be useful, but will novel cases such as the SARS epidemic or the 2008 financial crisis catch the programs flat-footed, possibly triggering massive losses?

The answer, as it often does, comes down to gearing. LTCM’s model failed, but the real problem was the huge amount of money at risk, far exceeding the firm’s capital. A few small traders using AI programs, or an investment bank risking a fraction of its commodity funds, would not be a concern. But if such programs become widespread and they all draw the same conclusions from historical data, could a huge amount of money end up making the same bet?
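A bit of arithmetic shows why gearing, not the model itself, is the danger. These are round numbers for illustration, not LTCM’s actual balance sheet:

```python
def equity_after(equity, leverage, asset_return):
    """Equity remaining after a return on a position of equity * leverage."""
    assets = equity * leverage
    return equity + assets * asset_return

# A 2% loss is survivable unlevered, fatal at 50x gearing.
print(equity_after(100, 1, -0.02))    # 98.0
print(equity_after(100, 50, -0.02))   # 0.0
```

The same 2% mistake that dents an unlevered portfolio erases the capital of one geared 50 to 1, which is why identical models pose very different risks at different scales.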

For individuals, of course, the answer is to diversify, one of the first lessons of investing. I wonder how many AI programs will do the same.

Source: Forbes