By Cynthia Cheng Correia
This is an excerpt of an article that originally appeared in Competitive Intelligence Magazine.
Practitioners understand that intelligence, by definition and in practice, is of high value, and that specific needs and issues must be well understood for the intelligence we generate to possess value. Intelligence products are designed specifically to address management concerns and support decision-making. Off-the-shelf products, even those that aim to employ the “wisdom” and efforts of the crowd, don’t satisfy this requirement. While they may provide company information and basic data analysis – which, like Hoover’s or traditional syndicated reports, can be helpful in foundational competitive intelligence research – their lack of custom, targeted content falls short of providing actual competitive intelligence. Moreover, their very accessibility can diminish their intelligence value: by intelligence standards, information available to and known by competitors and the broader business audience can quickly become commoditized.
Another fundamental issue is the nature and reliability of content and sources. When intelligence – or information – is drawn from a crowd of not-necessarily-experts in the industries, functions, and issues in question, and the system lacks a robust and reliable means to identify and emphasize reliable sources, it’s easy to understand why experienced intelligence practitioners do not embrace the material or the method.
On top of the question of intelligence value, questions also remain concerning crowdsourcing itself. While crowdsourcing seems to offer efficiencies and cost savings in many applications, contributions from an external crowd generally tend to fall short on questions and issues that are complex, creative, and/or innovative. Large-scale, broad-based intelligence crowdsourcing typically doesn’t work well in situations or for questions that are specific to an organization. It also is less successful in contexts that are nuanced, dynamic, or that involve multiple factors. In these instances (and outside game show scenarios), crowdsourcing needs to involve topical and current knowledge and expertise as well as timely responses.
WORKING THE CROWD
In addition to these factors, we also need to understand the distinctions between key crowd-based concepts and how each concept may be applied. Collective intelligence (the other “CI”) and wisdom of the crowd can help gather information, identify trends, analyze or anticipate market changes, and address other information, facts, or opinion-based questions. While these two concepts seem synonymous, collective intelligence describes group intelligence that is formed through interactions like competition, coordination, or collaboration. Wisdom of the crowd describes the collective opinion towards answering a question or drawing a conclusion. Each of these two key methods must be applied appropriately to suit the intelligence need, with collective intelligence better suited for brainstorming, problem-solving, and other creative activities, and wisdom of the crowd better suited for simpler needs such as collecting information, gauging opinions, or discussing simple issues.
While these concepts are relatively recent, some of these methods are already known or well-established among competitive intelligence practitioners.
Developed by the RAND Corporation in the 1950s, the Delphi Method is a traditional and more selective technique that enlists groups of experts who anonymously provide feedback on questionnaires concerning the issues at hand. The results are compiled into a collective response that includes respondents’ reasoning, then shared with the participants, who reconsider their answers in the next iteration. This process may extend over several rounds before it produces a conclusion.
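The iterative loop described above can be sketched in code. This is a minimal illustration only: the numeric estimates, the convergence rule (a spread threshold), and all names below are hypothetical assumptions, not part of the Delphi Method’s formal definition.

```python
# A minimal sketch of the Delphi Method's iterative loop, assuming
# participants give numeric estimates; the data and stopping rule
# are hypothetical illustrations.
from statistics import median, pstdev

def delphi_round(responses):
    """Compile one round's anonymous answers into a collective summary
    that is fed back to participants before the next iteration."""
    return {"median": median(responses), "spread": pstdev(responses)}

def run_delphi(rounds, spread_threshold=1.0):
    """Iterate over pre-collected rounds until answers converge.

    `rounds` is a list of response lists, one per questionnaire round;
    in practice each round would be re-collected after feedback.
    """
    summary = None
    for i, responses in enumerate(rounds, start=1):
        summary = delphi_round(responses)
        if summary["spread"] <= spread_threshold:
            return i, summary  # consensus reached
    return len(rounds), summary  # stop after the final round

# Experts' estimates narrowing over three rounds (hypothetical data).
rounds = [
    [10, 25, 40, 15, 30],
    [18, 24, 28, 20, 26],
    [22, 24, 25, 23, 24],
]
stopped_at, summary = run_delphi(rounds, spread_threshold=1.5)
```

The feedback-then-reconsider cycle, not the particular statistics, is the essence of the method; a real exercise would also circulate respondents’ reasoning with each summary.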
Although some consider this a form of crowdsourcing, the criteria involved in sourcing experts distinguishes it from the solicitation of contributions from a broader base, even if the base collectively possesses relevant knowledge, skills, and abilities. Thus, it has some qualities of collective intelligence.
Garnering feedback from knowledgeable colleagues to test our information, intelligence, and conclusions – the familiar sanity check – seems routine, sensible, and intuitive to many people. This simple technique is also a form of tapping collective wisdom.
Scaled up to involve a broader group, the sanity check can become a due diligence mechanism that elicits a wider range of perspectives, opinions, information, and other feedback to help overcome biases and blind spots, fill gaps, provide validation, or challenge our assumptions and conclusions. Some organizations have begun to make this more robust by scaling sanity checks up to tap the wisdom of the organization’s crowd.
Prediction markets tap the wisdom of crowds in the external environment to identify macroenvironmental trends, gauge sentiment, anticipate developments, and gain additional perspectives. As corporations embrace prediction markets to sharpen their focus or help drive innovation, some enterprises have begun to employ internal prediction markets that tap the wisdom of their own crowd to give their decision makers better inputs, intelligence, and understanding.
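To make the mechanism concrete, here is a minimal sketch of how an internal prediction market can aggregate employees’ beliefs into a single probability. It uses Hanson’s logarithmic market scoring rule, a common market-maker design; the question, trades, and liquidity parameter are hypothetical illustrations, not drawn from the article.

```python
# A minimal sketch of a binary internal prediction market using the
# logarithmic market scoring rule (LMSR). All trades are hypothetical.
import math

class BinaryMarket:
    """Yes/no market whose price reflects the crowd's implied probability."""

    def __init__(self, liquidity=10.0):
        self.b = liquidity   # higher b = prices move more slowly per trade
        self.q_yes = 0.0     # outstanding YES shares
        self.q_no = 0.0      # outstanding NO shares

    def _cost(self, q_yes, q_no):
        # LMSR cost function: C(q) = b * ln(e^(q_yes/b) + e^(q_no/b))
        return self.b * math.log(math.exp(q_yes / self.b) + math.exp(q_no / self.b))

    def buy(self, outcome, shares):
        """Record a purchase and return what the trader pays for it."""
        old = self._cost(self.q_yes, self.q_no)
        if outcome == "yes":
            self.q_yes += shares
        else:
            self.q_no += shares
        return self._cost(self.q_yes, self.q_no) - old

    def price_yes(self):
        """Current YES price = the crowd's implied probability of YES."""
        e_yes = math.exp(self.q_yes / self.b)
        e_no = math.exp(self.q_no / self.b)
        return e_yes / (e_yes + e_no)

# Hypothetical question: "Will Competitor X launch Product Y this quarter?"
market = BinaryMarket(liquidity=10.0)
market.buy("yes", 5)   # an informed employee backs YES
market.buy("yes", 3)
market.buy("no", 2)
probability = market.price_yes()   # aggregated belief, between 0 and 1
```

The design choice that matters here is that each trade moves the price, so the final price summarizes what the crowd, weighted by conviction, collectively believes.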
In addition to these methods, and for routine project collection activities, competitive intelligence practitioners can further extend the use of their organization’s internal collective and tap its wisdom for these additional activities.
We all use published sources to support our early warning efforts. More sophisticated systems will also tap the knowledge of internal and external human experts. Advancing this further, competitive intelligence practitioners can enlist their organization’s internal collective to systematically collect internal and external information and intelligence as well as provide observations to cast a wider net for information and indicators. This expands the competitive intelligence program’s capability while broadening access and gaining perspectives that can enhance intelligence generation and minimize myopia.
In evolving or high stakes environments, it usually isn’t enough to end an active intelligence project with a hard stop. Issues often require monitoring and follow-up, which are part of good competitive intelligence and project due diligence.
As with tapping the organization’s collective for early warning applications, applying our organization’s crowd wisdom, time, and access to our intelligence program can make managing the intelligence effort more efficient. It also provides greater assurance that the intelligence team – and management – are staying on top of developments. People within the various functions of our organizations can help us monitor those developments, as well as gauge the accuracy and progress of our intelligence conclusions.
When we consider tapping our internal human sources and resources, we typically do so in the context of intelligence generation. However, collective input can also play a role in protecting our organization.
Key ways of collective engagement that can boost and scale up an organization’s defensive efforts include enlisting employees’ ideas and efforts to identify vulnerabilities, establish protections, and report possible collection activities aimed at the company. Engaging employees more broadly can also help provide defense by promoting shared awareness, values, and skills that help individuals make decisions and respond without the need for management intervention at each step.
For example, as part of a defense system, we can provide good awareness, training, policies, and guidelines to employees who receive probing phone calls, so they know how to screen callers, what to say to them, and how to contribute to the company’s awareness system by reporting these intrusions.
Broader engagement with employees can generate more dynamic collection and feedback as well as create more adaptive systems that can improve our collection efforts, intelligence creation, and problem solving abilities. Sophisticated forms of these systems also enable a wider range of employees to be self-organizing, as they find solutions and innovate.
While these efforts have largely been limited to functional competitive intelligence startups that have sprung up within marketing, information services, or other departments of many companies, there is potential for adaptive and self-organizing efforts to contribute to established enterprise-wide and strategic intelligence programs as well. Forms of this can include testing working assumptions with broader groups and forming collectives around competitive problems when individuals and sponsors see the need.
For issues that require problem-solving and managing divergent or conflicting opinions or stakeholder priorities, collaborative intelligence can be a good fit. This form of collective contribution involves individuals actively lending their skills and efforts toward solving specific problems or generating new products and ideas. Groups aim to maximize idea and knowledge sharing as well as to capitalize on the skills, experience, and expertise of individual contributors.
In intelligence applications, collaborative intelligence is typically found within the Intelligence Team. This intelligence program model, described in greater depth by Jan Herring and Clifford Kalb, draws on the skills of a collective of internal experts: ad hoc teams form to examine intelligence questions and issues, with members drawn for their skills, expertise, and perspectives. This method enables an organization to harness the wisdom of its experts, as appropriate to the intelligence issue in question.
In addition to the practical ways in which crowdsourcing has been applied in intelligence and in competitive intelligence, James Surowiecki and others have argued for and demonstrated the strengths of crowd wisdom and aggregate predictions in a number of applications. However, it’s important to note that specific elements are required for these techniques to achieve more reliable results. These include involving contributors who:
- hold diverse opinions.
- are specialized, well-informed, and independent-thinking.
- are exposed to a broad range of information sources.
The process must also be able to aggregate crowd inputs into usable form. The collection of data is meaningless if we can’t process it into something meaningful. In simple form, this may involve the use of spreadsheets; in more robust forms, databases or custom software.
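In its simplest form, that aggregation step amounts to reducing many independent inputs to a few collective figures. Below is a minimal sketch under that assumption; the question, the estimates, and the choice of summary statistics are hypothetical illustrations, not prescriptions from the article.

```python
# A minimal sketch of aggregating independent crowd estimates into
# usable collective figures. The data are hypothetical.
from statistics import mean, median

# Hypothetical responses to one forecasting question, e.g.
# "What will Competitor X's market share be next year (%)?"
estimates = [18.0, 22.5, 20.0, 35.0, 19.5, 21.0, 17.5]

def aggregate(values):
    """Combine individual inputs into collective summary figures.

    The median resists outliers (one overconfident respondent),
    while the mean weighs every contribution equally.
    """
    return {
        "n": len(values),
        "mean": round(mean(values), 2),
        "median": median(values),
        "spread": max(values) - min(values),  # rough disagreement signal
    }

result = aggregate(estimates)
```

Even a spreadsheet performs the same reduction; the point is that some deliberate aggregation design, however simple, must exist before crowd inputs become intelligence inputs.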
For competitive intelligence, the application of collective intelligence and wisdom of the crowd can identify ideas and perspectives, harness collective time and resources, and provide approaches that can address inherent and common gaps and vulnerabilities in intelligence generation – in short, optimize our practices. The intelligence expert remains the central player who designs and drives the process and methods, examines the collective input, generates the intelligence, and provides recommendations to decision makers.
Collective and crowd intelligence are not intended to truncate or circumvent systematic thinking and analysis, suppress or marginalize expertise or divergent thinking (which can provide safeguards against biases and offer valuable perspectives that are beyond conventional thinking) or promote groupthink. Eliciting contributions from a wider collective also doesn’t mean favoring consensus over leadership. Collective contributions are inputs to intelligence generation, not a replacement for good intelligence practices, responsible decision-making, and sound leadership.
THINGS PEOPLE SAY
Drawing on the collective to gather or generate intelligence may seem counter-intuitive to some people. After all, shouldn’t competitive intelligence activities be kept under wraps? Won’t too many contributors sacrifice efficiency while creating more opportunities for leaks?
While these concerns are legitimate, research in knowledge management and organizational behavior has revealed that adverse consequences are more complex than these questions suggest. Behaviors are often affected by organizational culture, morale, incentives, and other influences, many of which can be shaped or enacted by organizations and their leadership. As with most initiatives, crowdsourcing will likely enjoy greater success and lower risk in cohesive organizations whose members share the same vision and values and feel supported than in organizations that suffer poor morale and offer few incentives for positive behaviors.
We also understand that collaboration can provide benefits like creativity in problem-solving, shared memory (which can facilitate the formation of solutions), and reduced risk and lowered opportunity costs from tapping the internal collective. These, when combined with good organizational influences, can make tapping the collective an attractive proposition.
Just as important, participants must be trained to understand the importance of intelligence defense, organizational and individual consequences of breaches, and how to support and participate in defensive efforts. When people understand expectations and the potential effects of their actions, they are better prepared to behave accordingly. Clearly defined policies and guidelines and non-disclosure agreements – as well as periodic reminders – can define and reinforce expectations. It really goes back to the basics.
The degree to which we need to minimize the risk of disclosure should also factor into how we involve our internal collectives and manage intelligence defense. An elegant and useful model for achieving this is Helen Rothberg and G. Scott Erickson’s Strategic Protection Factor (SPF), which describes the optimal balance between knowledge sharing and intelligence defense in companies and industries. For example, organizations that require a high degree of knowledge sharing to compete but have lower intelligence defense requirements can enjoy greater benefits from collective intelligence efforts with lower risks. Conversely, companies that require less knowledge sharing to compete and more intelligence defense will tend to enjoy fewer returns from collective intelligence.
For concerns relating to leaks, the key is prevention. Enlisting the participation of internal colleagues doesn’t inherently result in intelligence breaches, which can be addressed by the techniques mentioned above. Moreover, not all participants need to be aware of the full scope and definition of the projects they are supporting. Just as the distribution of intelligence products is on a need-to-know basis, the nature of crowd participation should be, too. Selecting and involving participants who are the most suitable to the crowdsourcing effort – as well as designing the crowd collection effort to reveal only what is necessary and appropriate – can help prevent unauthorized information from flowing outward.
FOLLOWING THE CROWD?
In addition to the issues we’ve covered, the question of groupthink is a common concern for those considering harnessing collective intelligence. We discussed some of the safeguards previously; we can also prevent groupthink or herd bias from inserting itself into the collection process by gathering participants’ information and perspectives prior to any discussion of the issue in question. Sharing – which occurs as a part of iterative processes like the Delphi Method – is not part of a face-to-face group exchange and occurs only after the initial knowledge or opinion capture. Moreover, asking participants to comment on their own prognostications as well as those of other participants can bring self-reflection and critical thinking components into the process.
In addition to these techniques, we can help forestall groupthink within the crowd by scrutinizing our intelligence questions for signs of bias, and by involving and valuing the “china breakers” – contrarian thinkers – in our organization when examining crowd feedback. By both tapping crowd perspectives and giving opportunities for divergent thinking, we can spark discussions and debates that help uncover questions, issues, and ideas that our core intelligence members and decision makers have not recognized or explored.
THE GREATER GOOD
Other issues can also arise from the broader involvement of colleagues. When contributions to competitive intelligence extend beyond a core group of contributors, intelligence training (including ethics and legality) needs to extend to them as well. This applies more to activities involving collective intelligence in research and collection efforts than to eliciting wisdom-of-the-crowd inputs (i.e., opinions), since the former requires a deeper understanding of legitimate competitive intelligence collection techniques.
Initially, this approach can place a greater demand on the competitive intelligence team and on the organization’s resources. However, once participants are contributing, wider competitive awareness and an expansion of intelligence capabilities can allay these concerns and generate improved intelligence returns.
THE IN(TELLIGENCE) CROWD
Too often, our own human resources hold untapped information and knowledge that can help our organizations better understand our competitive environment and how to be more competitive. This is due in part to blinders, neglect, and the application of limited tools and techniques to tap these repositories. Collective and collaborative intelligence can offer means by which we can tap these repositories and put information and knowledge to use for us in more systematic and timely ways.
The key is to apply these techniques on top of a strong competitive intelligence foundation while balancing our need for good intelligence and good defense. Tapping the collective involves applying both established and new methods and tools, and innovation continues to support the evolution and power of collective and collaborative intelligence.
As with most things in intelligence, there is no silver bullet – just continued diligence, savvy, sense…and wisdom.
This article originally appeared in Competitive Intelligence Magazine, Vol. 16, No. 2, April-June 2013.