
An Analyst Toolkit: From Big Data to Big Knowledge


I recently read that big data to organizations is like sex to teenagers: They talk a lot about it but few are actually doing it. My experience, however, has taught me otherwise: Many public- and private-sector organizations implement big data tools in order to improve their value chains, increase engagement with clients and stakeholders, improve HR management, and enhance strategic decision-making processes.

What few organizations are actually doing, however, is utilizing “big knowledge”: the aggregation of insights generated by a constellation of analysts — and the next “big thing” after the mastery of big data.

Big knowledge sometimes starts with big data. There’s no doubt that big data tools sift through huge quantities of information that might not otherwise be relevant or interesting. But big data is merely one of many starting points for disparate groups or even masses of analysts to generate the insights which are later turned into big knowledge.

What separates big knowledge from big data is not just making sense of an influx of information, but a flood of insights coming from many different analytic sources. Organizations utilizing big knowledge are thus faced with the following questions: Are we fully utilizing the insights generated, or are some of them getting lost on the way? And can we even generate true knowledge out of this aggregation of insights?

As if these questions are not complicated enough, there’s another set of variables to take into consideration: the identity of those creating the insights and the context of their generation. Unlike “ordinary” data (which deals with objective facts), insights are interpretations of reality and therefore are subjective in nature. It is thus important to contextualize the aggregation of preliminary insights so as to build new layers of analysis thereupon. Indeed, ignorance of the basic conditions in which insights are initially generated may replicate biases, mental pathologies or just simple mistakes.

Understanding the context of insight generation is especially important when dealing with analysts who examine similar or interconnected issues but are dispersed by location or field. Organizations facing this challenge include intelligence agencies, large consulting firms, research firms, risk assessment departments of large multinational corporations and many more. Such entities need to consistently generate new and impactful insights which transcend the mundane. They must also constantly examine the holistic relevance of knowledge generated for a specific purpose. Doing this manually (via human analytic capacity) is simply too labor-intensive, not to mention constrained by cognitive limits.

So how does one effectively digest analytic products produced in multiple contexts? How does one identify blind spots, contradictions or patterns that may or may not emerge? In what structured way can one “connect the dots” between multitudes of analytic products to produce new sets of insights? And how can one assess whether the context in which an analysis was conducted (e.g., time, location, culture, professional background) influenced the assessment?

This is where big knowledge comes into play — an analytic toolkit that enhances cognitive capabilities rather than making them redundant. Even as artificial intelligence and big data tools grow more capable, there will be an extended (and perhaps indefinite) period in which human-in-the-loop analytics are needed to pick up where software falls short. The most interesting future use cases lie in the illumination of unknown or hidden connections, which become a kind of “force multiplier” for the analytic community. Such a toolkit will allow analytic communities to map knowledge, interconnect insights, uncover new relationships and ideas, and identify relevant experts for the task at hand.
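As a rough illustration of what “connecting the dots” between insights could look like, here is a minimal Python sketch that links insights sharing a topic and surfaces a bridge between two analysts who never referenced each other. The analysts, topics and data structures are my own illustrative assumptions, not a description of any actual toolkit:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical insights, each tagged with the context of its generation
# (who produced it) and the topics it touches. All data is invented.
insights = {
    "A1": {"analyst": "economist", "topics": {"currency", "inflation"}},
    "A2": {"analyst": "regional expert", "topics": {"protests", "currency"}},
    "A3": {"analyst": "data scientist", "topics": {"protests", "social media"}},
}

# "Connect the dots": link any two insights that share at least one topic.
graph = defaultdict(set)
for a, b in combinations(insights, 2):
    if insights[a]["topics"] & insights[b]["topics"]:
        graph[a].add(b)
        graph[b].add(a)

# A1 and A3 share no topic directly, but A2 bridges them -- the kind of
# hidden connection that acts as a "force multiplier" for analysts.
print(sorted(graph["A2"]))  # ['A1', 'A3']
```

Even this toy version shows the value of aggregation: the bridge between the economist's and the data scientist's insights only becomes visible once all three are mapped together.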

We need to move beyond mere data and bring forth the next revolution of knowledge management: the big knowledge revolution. We need to develop tools that enhance the ability of individuals and organizations to reach a deeper level of analysis, thus improving decision-making processes. This will enable us to transition from “known unknowns” to “unknown unknowns” — i.e., to acquire real-time understanding of previously obscured connections and relationships and to use artificial intelligence to point analysts in the right direction. Such tools will be the ultimate analytic resources for creating synergy between data, information and knowledge; software and hardware; and human experts to produce “big” knowledge.


Generation Risk: Assessing Political Instability in the Business World


Photo by Derek Gavey

We are Generation Risk. Just think of the number of man-made and natural catastrophes humankind has experienced in the last two decades: tremendous technological changes, global economic downturns, ecological disasters, wars and massive terror attacks. In this turbulent environment, organizations, firms and individuals are seeking the new crystal ball that will enable them not only to improve their situational awareness but also (and even more importantly) to create better early-warning mechanisms.

More concretely, risk management has become a crucial part of corporate activity, and best practice in risk management policy is constantly pursued.

Assessing political risk — specifically the assessment of regime stability — is not a theoretical issue, nor is it something that should concern only international relations wonks. In a globally hyper-connected world, assessing regime stability is a crucial practice: First, most multinational corporations have complex operations all over the world, including in underdeveloped and unstable countries. Second, in such a global world, even the flap of a butterfly’s wings in the southern hemisphere is likely to influence the northern one (and vice versa). So it is necessary to track changes in the strategic environment, in the broader sense of the term. Geopolitical stability is a crucial part of that.

Attempts to develop generic models to enable decision-makers to assess regime stability (and even more importantly, to preemptively identify political changes) can be traced back to as early as the 1950s. The events surrounding the Arab Spring – especially the inability of literally any Western intelligence organization to predict this political tsunami – gave these efforts a renewed push.

Most methodologies and services aimed at measuring political (in)stability suffer from what I call “risky dichotomies”: They are either quantitative or qualitative in nature, are either generic or context-heavy, are either machine-based or human-based, and so on. In short, they suffer from two main problems. First, the need to create a model as generic as possible (and therefore one which is scalable) contradicts the fact that every country (or at least every geographic region) has its own unique characteristics. The need for contextualization takes the edge off of any generic model. Second, the need to weight quantitative and qualitative indicators under one model creates measurement, validity and credibility problems — which in turn harms such models’ predictive effectiveness.

Given these methodological complexities, and the market’s desperate need for an efficient tool to enable decision-makers to manage political risks by being fully aware of related trends, Wikistrat has developed a tool that does exactly that. As the company’s Chief Strategy Officer, I have led a team of analysts, economists, statisticians and data scientists with a highly ambitious objective: to develop a model that (a) doesn’t fall into the methodological pitfalls we identify in existing methodologies and services, and (b) presents analysis, trends and early warning in a simple and intuitive manner, thus enabling decision-makers to quickly and simply consume information and apply it in the decision-making processes.

The Regime Stability Model (RSM) we have developed brings the power of subject-matter expert collaboration and infuses data collection and interpretation capabilities into a service offering. Used principally to perform real-time monitoring of country-level risk and instability, this capability combines qualitative crowdsourced analysis with quantitative data mining to provide a real-time, 24/7/365 system for monitoring and anticipating social unrest and destabilization potential in countries around the world.

The RSM has several main advantages: First, it combines quantitative and qualitative methods. Second, it utilizes a wide array of sources (e.g., open sources, social media, big data and expert opinions). Third, it is able to compare output across time. Fourth, it is able to differentiate between events and trends, and to analyze events within a broad strategic context. Finally, independent analysis generated by one method is used to validate results of another in a symbiotic relationship.
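To make the quantitative/qualitative blend concrete, here is a toy Python sketch of how such a combined index could be computed. The indicator names, weights and 0-to-1 scale are purely illustrative assumptions on my part — this is not the actual RSM:

```python
# Toy stability index: blend normalized quantitative indicators with
# crowdsourced expert ratings. All numbers and names are invented.

def stability_index(quant, qual, quant_weight=0.6):
    """Blend quantitative indicators (0-1, higher = more stable)
    with expert ratings (0-1) into a single score."""
    q = sum(quant.values()) / len(quant)  # average of quantitative signals
    e = sum(qual) / len(qual)             # mean expert rating
    return quant_weight * q + (1 - quant_weight) * e

quant = {"gdp_growth": 0.7, "currency_volatility": 0.4, "protest_events": 0.3}
experts = [0.5, 0.6, 0.4]  # independent analyst assessments

score = stability_index(quant, experts)

# A large gap between the two streams flags where one method should
# cross-check the other -- the "symbiotic" validation described above.
divergence = abs(sum(quant.values()) / len(quant) - sum(experts) / len(experts))
print(round(score, 3), round(divergence, 3))
```

The design choice worth noting is the divergence check: rather than simply averaging the streams, comparing them against each other is what lets independent analysis from one method validate the other.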

This altogether ensures a wider and more accurate perspective on the stability of a regime, as well as evolution and improvement of the model itself. Most importantly, it works. Among other things, the model was able to predict several occurrences regarding a third-world country — including the exact locations of riots in several cities, the judiciary as a new terror target, a period of growth in foreign investment, and a devaluation of the local currency several months before it was officially announced.

Want to hear more? Don’t hesitate to contact me.


The 3X3 Golden Rules of a Successful Scenarios-Based Simulation


Scenarios-based simulations are a powerful tool that supports strategic decision-making processes in both the public and private sectors. They provide a virtual laboratory in which participants can experiment and play with ideas without cost or damage in the real world. Such games help us examine scenarios from various perspectives and try multiple engagement strategies. They assist us in thinking the unthinkable, thus better preparing us for the future by letting us learn from future experience. Today’s simple and accessible technologies make it easy to create such exercises in a virtual environment, which in turn makes a simulation even more efficient by enabling a large number of participants to collaborate or compete at the same time.

When designing and executing a scenarios-based simulation, these are the top nine tips one should keep in mind (I call them “the 3X3 golden rules”):

During the planning phase:

    1. Preparation is key, so do your preliminary research.
    2. Ask the right questions and make sure the participants are able to answer them.
    3. Divide and conquer – don’t try to answer all the questions in the same game. Instead, focus on the most important ones.

During the execution phase:

    1. Know where you are heading but allow participants to explore new terrain.
    2. Give the group the feeling that they generate their own insights.
    3. Keep the process enjoyable.

During the post-production phase:

    1. Don’t forget to wrap up — a game without a report that presents clear insights is like a tree falling in the forest.
    2. Stay focused and provide actionable takeaways.
    3. Ask for feedback — you can always learn and improve.

My experience in managing hundreds of such exercises has taught me that, when done correctly, scenarios-based simulations are a game changer for decision-makers, generals and CEOs alike. Try it yourselves.


Imagining Strategy: Using Imagination in Strategic Planning


Games are a serious matter. Yes, they are fun. But more importantly, they enable the player (whether a toddler or an adult) to experience a story and to actively write it as they go. Games are a great way to experience a certain scenario (fiction or nonfiction) – even more so than reading a book or watching a movie. Games require active involvement. They don’t have a preset end. They force the player to take responsibility and determine the course of events and the end results. This responsibility creates a strong psychological element of real-time decision-making and the need to cope with the results of one’s actions.

What I really like about games is that they force you to use your imagination. And imagination is a powerful tool for decision-makers, strategists and analysts. Lack of imagination is in fact the inability to think creatively about the future, preparing for past challenges instead of for what’s in store.

But how can one define imagination? And how can it influence decision-making?

Literature identifies three types of imagination: descriptive, creative and challenging. Descriptive imagination helps us turn an abstract world into a tangible one. This is the type of imagination that identifies patterns, regularities and pathologies (or the lack thereof) out of a glut of information. It enables us to analyze and make educated judgment calls based on years of experience. For managers, having descriptive imagination means they can identify challenges and opportunities, make sense out of them, and thereby form a strategy. Think of SWOT analysis or the BCG growth-share matrix. These tools help us to describe what we imagine; they turn the abstract into something tangible and cognitive.

Then there’s creative imagination, or what is commonly referred to as “outside-the-box” thinking. While descriptive imagination enables us to see and explain “what’s out there” in a new way, creative imagination enables us to see and explain what’s not out there – i.e., to create something truly new, sometimes even completely different from what already exists. This is the type of imagination that is identified with business innovation. It needs to be applied when a certain organization comes to the conclusion that its business model, service or product is outdated and a paradigmatic change is required. In other words, the main driver behind this kind of imagination is a sense of discontent regarding the present.

Finally, there’s challenging imagination, which stands in contrast to the other two. With challenging imagination, we criticize, challenge and sometimes even destroy what was achieved by way of the previous two. This is the kind of imagination that undermines all previous rules and assumptions, and provides a clear cognitive playground to test the unthinkable. It doesn’t presuppose anything, and it doesn’t use previous knowledge as a given. It simply starts everything from scratch. It deconstructs existing knowledge, perception and language. It uses cynicism and sarcasm, and sees nothing as sacred.


“Sir, I Have a Revolutionary Idea”: Innovation in the Military and the IC


We think of military people as conservatives, realists — even pessimists. We think of them as obedient, lacking any imagination and vision. We especially enjoy blaming intelligence officers for not being innovative, for tending toward stifled group thinking, for possessing the undesired combination of arrogance and nearsightedness. We therefore blame them for not being able to predict events: Pearl Harbor, the Soviet deployment of nuclear missiles to Cuba, 9/11, the Arab Spring and countless others throughout history. Novelty and vision are, so we believe, the province of the private sector. Only there, in the promised land of innovation, can we find the next generation of any and every conceivable technology. But my experience has proven otherwise.

For the last several years, I have been working with government entities — especially militaries and intelligence agencies — suggesting sophisticated methodologies to help them to think creatively about the future. The wisdom of the crowd and big data analysis, two elements I believe can revolutionize strategic planning, have been joyfully adopted by governments around the world, in growing numbers.

In my experience, the willingness to push boundaries and experiment with new methodologies is one of the driving forces behind government-led research and strategic planning. I have found it easier to create interest and experimentation with new approaches to analysis and forecasting when dealing with governments than with private market corporations. The possibilities are endless: How will West Africa’s economy evolve in the next 20 years? What future threats will air forces around the world face? How will future population shifts affect Europe’s national borders? How will climate changes influence the global marketplace? This is just a partial list of the topics I was asked to address on behalf of government entities last year.

Why is the “military mind” so badly mischaracterized? How can one explain the shift from what Samuel P. Huntington portrayed in 1957 as “conservative realism… obedience… pessimistic, collectivist, historically inclined, power-oriented, nationalistic, [and] militaristic,” to the innovative, even intellectual entrepreneurship of today? The truth is that Huntington missed the mark. History is full of examples of sophisticated, daring senior officials. From the early days of wargames in the Prussian army to the revolutionary thinker General Donn A. Starry, the history of national security is overflowing with innovation and unique ideas.

This is especially true if we look at the organizations whose main purpose is to develop the knowledge that supports decision-making — primarily intelligence organizations, but also other strategic-planning apparatuses. In these organizations, the culture of debriefing, learning from failure, and developing new methodologies is well established.

Right, you say, but they sometimes (too often) fail to predict the future, or even help decision-makers be well prepared for the future. Trust me, they are well aware that they operate in a complex environment, and that their ability to comprehend this complexity is limited. Do a quick Google search and take a look at the impressive corpus of writings on geopolitical analysis methods and forecasting. See how many books and articles have been written on the issue of strategic surprise, on new approaches and methodologies for coping with decision-making under extreme uncertainty. This flood of publications stands in harsh contrast to the relatively limited literature on surprise, analysis and strategic planning in the business world. Sure, there are several great books and articles, some of them even offering sound advice. But between the two disciplines, the business sector is left trailing in government’s dust.


Ahead of the Information Curve: Crowdsourced Wargames


Governments, especially defense and intelligence agencies, routinely plan for both likely and improbable events. Though intelligence gathering and analysis form the core foundation of these efforts, all organizations are looking to include a wider range of inputs and methods to more accurately forecast responses to likely scenarios. Wargaming involves assembling groups of analysts into designated teams, each tasked with role-playing a particular actor and responding to the moves of the others in a dynamic simulation.

Exercises of this kind can greatly aid national organizations engaged in advanced contingency planning. The effective wargaming of credible scenarios helps organizations get ahead of the information curve, giving them a real-world advantage in crisis management situations.

There are four focus areas where wargames represent an indispensable tool:

    1. Strategic Planning: Improving strategic foresight and shortening response times by highlighting previously unidentified indicators of activity.
    2. Operations: Improving decision-making in planning and operations by identifying bureaucratic weaknesses, blind spots and capability shortfalls ahead of time.
    3. Critical Review and Analysis: Improving decision-making and problem-solving by testing assumptions, hypotheses and existing plans.
    4. Intelligence: Improving understanding of enemies/rivals/competitors by projecting likely strategies and actions of adversaries.

From my experience, working on crowdsourced wargames with Wikistrat, several elements are key to any successful such exercise:

  • Multiple role-playing teams composed of subject-matter experts who simulate the decision-making process of the actor they are tasked with representing. The number and size of teams may vary depending on the client’s preferences and project requirements.
  • A Control Team that oversees the progress of and decision-making within the wargame, ensuring the highest quality of analysis and coordination.
  • A framework scenario that is prepared in advance, and an information stream that is tailored individually to every team in the exercise. The scenario is based upon a testable assumption or identified problem requiring examination and resolution.
  • Wild cards/shocks that may be introduced in any cycle of the wargame to one or more participating teams.
  • An analysis and evaluation of the results conducted by the Control Team, which includes critical review and synthesis of all initiatives created in the exercise.
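The elements above can be sketched as a minimal data model. The class and field names here are my own illustrative assumptions, not an actual platform schema:

```python
from dataclasses import dataclass, field

@dataclass
class Team:
    actor: str                                      # the actor this team role-plays
    members: list[str] = field(default_factory=list)
    moves: list[str] = field(default_factory=list)

@dataclass
class Wargame:
    scenario: str                                   # framework scenario, prepared in advance
    teams: list[Team]
    control_log: list[str] = field(default_factory=list)

    def inject_wild_card(self, shock: str, targets: list[Team]) -> None:
        """Introduce a shock to one or more teams in any cycle of the game."""
        for team in targets:
            team.moves.append(f"responds to: {shock}")
        self.control_log.append(f"wild card injected: {shock}")

# Hypothetical setup loosely echoing the four-actor exercise described below.
game = Wargame(
    scenario="dredging operations around a disputed shoal",
    teams=[Team("U.S."), Team("China"), Team("Japan"), Team("Philippines")],
)
game.inject_wild_card("naval standoff", [game.teams[0], game.teams[1]])
print(len(game.control_log))  # 1
```

The point of the sketch is the separation of concerns: teams hold only their own moves, while the Control Team's oversight lives in a separate log it alone writes to.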

Wikistrat ran several wargames in 2015, among which was the “Scarborough Shoal Incident” exercise — a wargame built around a scenario in which China begins major dredging operations to build up the landmass in the Scarborough Shoal with the intention of building an airstrip. Four teams participated in the game, representing the governments of the U.S., China, Japan and the Philippines.


Analysis in the Digital Age: From Revealing Secrets to Solving Puzzles


Today’s strategic environment — whether in the realm of geopolitics or business — is characterized by great complexity and a propensity for rapid change. In order for one to make educated decisions, this complexity demands many sources of information from various fields of expertise. The propensity for rapid change requires decision-making based on the most updated information and analysis, as well as preparation for a spectrum of futures.

In the past, the major challenge of analysts and decision-makers was to acquire accurate data. In that bygone world, the decision-maker’s challenge revolved around secrets that needed to be revealed. However, the modern information age is one where information is increasingly moving to open sources and social media, with sensitive information frequently leaked — and where the perception of what is secret or private is changing. In such a world, the challenge moves from secrets to be revealed to puzzles to be solved.

In other words, the transition to the information age marks a parallel transition for analysts and decision-makers in the strategic environment: from collection of data to interpretation and sense-making of data, from secrets to puzzles.

Today’s exciting and novel technologies are altering the way in which decision-makers consume information and transform it into actionable insights. Advanced technology to collect, process and interpret data, advanced big data analytics, machine learning and sophisticated prediction algorithms all fundamentally alter the way we think about strategic planning. In today’s complicated world, businesses and governments need broad intellectual exposure to sift through increasingly complex issues and generate greater analytic insight.

This revolution is pioneered by inventive, digital-age companies — from the well-established Palantir to newcomer Epistema (check them out!) — all involved in facilitating smarter decision-making and analysis.

We founded Wikistrat six years ago with the belief that the old models of consulting and analysis are simply outdated. Crowdsourcing is everywhere, transforming industries and business models, yet the consulting industry still lags behind.

In this move from revealing secrets to solving puzzles, Wikistrat proposes a new approach — one that taps into hundreds of experts rapidly on an interactive platform, and enables them to collaboratively map future scenarios, propose innovative strategies and analyze risks. It is expert-led collaborative crowdsourcing, and it is transforming how corporations and intelligence agencies think about analysis and strategy. It is how puzzle-solving is done in the digital age.


Making Ourselves Uncomfortable: Red Team Methodology


A recent article by Micah Zenko in the latest issue of Foreign Policy looks at the experience of the CIA in challenging its own strategic predictions. According to Zenko, the “Red Cell” initiative began on September 12, 2001, when then–Director of Central Intelligence George Tenet formed a group of contrarian thinkers to challenge conventional wisdom in the intelligence community and mitigate the threat of additional surprises through “alternative analysis.” On that evening, his instructions were simple: “Tell me things others don’t, and make senior officials feel uncomfortable.”

National intelligence organizations throughout the world have long been struggling with the need to break their own analytic “glass ceiling” and to bring in new inputs that will help in creating better strategic analysis. Red teaming, alternative analysis and “playing devil’s advocate” are all synonyms for an analytical methodology aimed at sharpening intelligence thinking. Simply put, it is the practice of viewing a problem from the perspective of an adversary or competitor — including that of a competing thesis or analysis. The goal of most red teams is to enhance decision-making, either by specifying the adversary’s preferences and strategies or by simply acting as a devil’s advocate. The three main focus areas of red teaming are:

    1. Planning and Operations: Improve decision-making in planning and operations.
    2. Critical Review and Analysis: Improve decision-making and problem-solving.
    3. Intelligence: Improve understanding of enemies/rivals/competitors and develop better synchronization of intelligence and operations.

Based on my experience conducting such exercises over the years, especially at Wikistrat, I have learned to appreciate the following methodologies when running a red-team analysis:

    1. Discourse Analysis: A team analyzes written text (e.g., a five-year strategic plan) in order to address various characteristics (e.g., basic assumptions) of the paper, as well as text structure.
    2. Key Assumptions Checks: This ensures that an analytical judgment is not based on a flawed premise. This methodology allows a baseline of confidence to be established.
    3. Devil’s Advocacy: Given any argument, an opposing claim is made in order to test the quality of the original argument, identify weaknesses in its structure, and to use such information to either improve or abandon the original position.
    4. Team A/Team B: A competitive analysis exercise in which two (or more) teams compete to raise arguments and counter the claims of the other group (or a third party).
    5. Contingency “What-If” Analysis: This analysis employs various assumptions (associated with a probable event) to portray different possible outcomes.

I will conclude with a project Wikistrat led, and in which I was involved: In 2014, the Australian military published “The Future Land Warfare Report,” which outlines the major challenges facing Canberra’s military forces over the next two decades. The report drew on a collaborative red-team effort led by Wikistrat in late 2013, and the Army wished to recognize the contribution of that distributed decision-making and critiquing activity.


Intelligence Agencies’ Problem: Strategic Planning


It seems that most organizations that deal with strategic planning are still struggling to adjust themselves to today’s changing environment. Most government organizations, especially those dealing with knowledge development, are still characterized by the following:

  • A hierarchical and compartmentalized organizational structure.
  • A clear distinction between professional functions and fields of expertise.
  • Limited access to non-confidential (“civilian”) information and analysis, as well as a tendency to depend on classified information.
  • Over-reliance on the “gut feelings” of a small number of highly experienced analysts and deep familiarity with their research subjects.
  • Lack of collaborative tools — and sometimes even lack of collaborative culture.

These characteristics are problematic when dealing with complex issues requiring several fields of expertise and deep integration of these fields. Organizational separation often leads to a similar separation in analysis, even though reality is not subject to such limitations. Furthermore, the need to run collection efforts, analysis and recommendations through the chain of command creates a cumbersome analytic process that hinders the necessary ability to address issues quickly and maintain relevancy. Classified sources of information do bring unique value that is otherwise inaccessible, but in the digital age (where opinions, trends and events exist in open sources, and where civilian, non-classified institutions generate formidable analysis), over-reliance on classified sources might lead to a distorted perception of reality.

Lastly, when covering many issues on an ongoing basis, a structured yet flexible methodology is crucial for creating comparisons (e.g., between different time periods or between different regions), recognizing correlations and identifying early-warning signposts. The basic scientific demand of replicable research is of the essence when investigating complex social phenomena.


Innovative Consulting Model: Wikistrat’s Idea of “The Many”


Many organizations that conduct research and strategy predominantly utilize a “singular approach” — i.e., relying on limited types of information and working in silos with few researchers, thus providing a relatively narrow and limited perspective on complex and essentially multidisciplinary phenomena. Such models are only seldom able to situate issues in a larger context, and therefore risk omitting important and impactful variables. They are also at risk of generating single-perspective analysis that is more prone to biases.

Put differently, traditional models rely on a linear approach: collect information, analyze it, create assumptions regarding the future and provide the decision-maker with an analytic product that only seldom contains actionable recommendations. In today’s increasingly complex environment, this model is becoming ever more obsolete.

Wikistrat’s unique methodology is based on a different approach which better addresses the challenges of today and tomorrow. Where there is a need to analyze complex issues in depth — whether contemporary or futuristic in nature — Wikistrat has the ability to bring an array of experts to the analytic table, and has the technological platform and methodological experience to incentivize them to cooperate and collectively generate in-depth analysis and policy recommendations.

At the core of Wikistrat’s model stands the idea of “the plural” or “the many”: many sources of information, many experts and many fields of expertise working collaboratively in real time to create many angles of analysis, produce many predictions and illustrate many futures — all to allow decision-makers to address many concerns with many policy options.

Lastly — and most importantly — the provision of many futures (as well as the probability and magnitude of each) enables policymakers to better manage risks based on educated prioritization and understanding of the potential consequences of various future possibilities.

From an analytic perspective, bringing together many experts from various fields of specialization helps contextualize the problem-set and facilitates multidisciplinary analysis — all while generating constructive competition between ideas and theses to stress-test them and examine their resiliency. Many experts generate many points of view – i.e., many insights – even when trying to illuminate an issue from a single perspective. In other words, Wikistrat’s crowdsourced methodology facilitates a deeper and wider study: It allows an issue to be planted in a larger yet relevant context, as well as an analysis from several angles and an examination of the strength of related insights.
