“Contextual Intelligence” for CHROs, the Meaning of “AI” and Planning for Innovation

Special: Human Resources Analytics Discussion with Eric Sandosham and
Marcia Tal

In this issue, we are pleased to feature expert commentary by Eric Sandosham, Founder and Partner at Red & White Consulting Partners LLP. Formerly Managing Director and Regional Head of Citibank’s Decision Management and Analytics function in 14 global markets, Eric is currently based in Singapore. Specializing in Human Capital Analytics, Red & White Consulting Partners has client engagements across Manufacturing, Banking and Non-Profit organizations in Indonesia, China, Malaysia, Singapore and Vietnam.

Eric and Marcia discuss the role of data analytics in Human Resources—and more.

FRAMEWORK : Human Resources Analytics

Finding Context for CHROs and Data Analytics

What exactly is the CHRO’s role today?

Neeraj Sanan’s People Matters article, “How contextual intelligence is empowering CHROs,” contends that today’s businesses demand more strategic direction from HR, which requires “an assertive, data-driven CHRO.”

Research shows that survey respondents expect Human Resources to focus on aligning its function with the overall business strategy rather than on its traditional role.

However, Sanan cites research showing a “large gap” between the requirements of this strategic role and the “analytical data skills” of many CHROs today, and this is where “contextual intelligence” enters the discussion. Yet the article never explains exactly what “contextual intelligence” is or how it is “empowering CHROs.”

This is where the disconnect occurs.

Confusing terminology aside, we all agree that businesses need to better understand and promote the use of data analytics in HR. For further perspective and insight, I asked my colleague Eric Sandosham to join me in a discussion about questions the article raises.

What follows are excerpts from our conversation and discussions with my Editor. Our reflections focus on the meaning of “contextual intelligence” and data analytics’ role in Human Resources.

Marcia Tal: The term “contextual intelligence” is often used by global organizations for redefining roles for both people and leadership when moving talent, services and products from one market to another.

In this case, using data and analytics helps us understand attitudes, belief systems, mental models, how people adjust to change, and the degree to which people are open to new ideas.

Eric Sandosham: “Contextual intelligence” is a very big, even bombastic term. In the case of HR, the way you use experiences, the nature of different cultures, different occasions, different personal contacts—all of that is contextual. No analysis has value until you contextualize it.

The real pressure point is that many organizations are feeling that HR may be the weakest link. And I mean weakest link in the sense that the quality of decision-making is not keeping pace with what’s required.

Marcia Tal: I think we all agree that data and analytics can provide the capabilities to better understand people. HR can use that understanding to determine that we have the right people. The right job. And the right skills and mindset. With that combination, we can design strategy and drive new business practices to deliver on business results.

Eric Sandosham: Commentators often talk of “contextual intelligence technology” and the idea of big data—in all the work that I’ve done in HR, there’s very little big data. What data we have in Human Capital Analytics is instructive. You have profiles, performance records, productivity measures, and there are some great examples of using digital applications to monitor employees. This data may be available, but it’s not being curated to allow CHROs to map an employee’s journey. That’s the challenge.

Marcia Tal: I think that CHROs don’t have to be the analysts themselves. What they need is access to analysts within their organizations or a budget to acquire external sources for data analysis. There’s such a big gap—between what organizations expect and the capabilities of CHROs—because this analytics discipline is very advanced. CHROs should be savvy enough to understand what they need and to find the right specialists, but they don’t have to be the analysts.

Eric Sandosham: In all the data work that we do, data only has meaning when it’s placed into context. It may be unique to a particular situation, to particular location attributes, or to the attributes of a specific problem. Of all the data that we use, we never use it generically. It’s never a universal set of data—because it’s people.

Often it’s the exact opposite. In our experience, HR business partners resist using data. Their decisions have largely been based on customized, contextual information. HR is so used to making decisions that are individual or departmental, they may see data as robbing them of that contextualization. Does the data homogenize the approach? Does HR lose what’s special about their case-by-case approach?

Marcia Tal: In the end, there needs to be an organization-wide culture that is open to data… and recognizes the individual… and embraces that people and organizations are driven by something larger than any single one of us.

Eric Sandosham: It goes without saying that it’s people who move the needle. Technology comes very much later in the process. Technology itself doesn’t lead to discussion or to transformation; technology comes in at the end to optimize and automate.

If you want to know how to fix things on a human capital level, look to people rather than technology. You can crack it with people—with good old-fashioned thinking and framing.

Intelligence, contextual or universal, comes from human mental strength.

Marcia Tal: Exactly, it’s the framing that provides context. The solution comes from human understanding.

FRAMEWORK : Data Analytics

What's So Smart About Artificial Intelligence?

Try asking Siri: “What does ‘AI’ mean?”

Ian Bogost more successfully tackles that question in his Atlantic article, “‘Artificial Intelligence’ Has Become Meaningless.” Distinguishing what he calls “supposed-AI” from real artificial intelligence, Bogost searches for a meaningful definition.

The author, of course, doesn’t suggest that artificial intelligence itself is meaningless, but that the term “AI” has escaped clear meaning through imprecise usage and sheer overuse.

From Google to Facebook…from Homeland Security to Coca-Cola…from earnings call transcripts to corporate strategy to press releases—examples of “supposed AI” abound. Bogost believes that most systems “making claims to artificial intelligence aren’t sentient, self-aware, volitional, or even surprising. They’re just software.”

To gain clarity, Georgia Tech artificial intelligence researcher Charles Isbell defines AI this way: “Making computers act like they do in the movies.”

Not to seem “too glib,” Isbell then identifies two features “necessary before a system deserves the name AI.”

First, the system “must learn over time in response to changes in its environment.”

Second, “what it learns to do must be interesting enough that it takes humans some effort to learn.”

Real AI is not just a repeatable, standardized process; rather, it learns things through the process itself. This learning includes interface, integration, conversation: all the things that human beings experience in learning and applying what they learn. It’s this capacity for learning that differentiates artificial intelligence from “mere computational automation.”
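
Isbell’s two criteria can be made concrete with a toy sketch. Everything below is our own illustration, not anyone’s production system: a fixed rule that never changes, beside a system whose output shifts as its environment does.

```python
# Contrast "mere computational automation" with a system that
# "learns over time in response to changes in its environment"
# (Isbell's first criterion). Classes, data, and the learning
# rate are invented for illustration.

class StaticRule:
    """Automation: the same answer forever, whatever happens."""
    def predict(self):
        return 10.0

class OnlineLearner:
    """Adjusts its estimate after every observation it sees."""
    def __init__(self, estimate=10.0, rate=0.5):
        self.estimate = estimate
        self.rate = rate

    def predict(self):
        return self.estimate

    def observe(self, actual):
        # Move the estimate toward what the environment produced.
        self.estimate += self.rate * (actual - self.estimate)

static, learner = StaticRule(), OnlineLearner()
for actual in [20.0, 22.0, 21.0, 23.0]:  # the environment drifts upward
    learner.observe(actual)

print(static.predict())   # still 10.0: nothing was learned
print(learner.predict())  # has moved toward the drifting environment
```

Only the second object would pass Isbell’s test; the first is simply software.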

The same imprecise terminology and lack of definitional clarity affect our understanding of “analytics.” Often used as an umbrella term, “analytics” encompasses many skill sets, techniques, specializations, and tools along its continuum.

Think of the commonly defined analytical processes:

  • Descriptive – what happened
  • Diagnostic – why it happened
  • Discovery – learning and insight
  • Predictive – what is likely to happen
  • Prescriptive – what action to recommend
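
As a rough illustration of where these processes sit, here is a minimal Python sketch over invented monthly attrition figures; the data, the naive trend forecast, and the intervention threshold are all hypothetical.

```python
# Toy monthly attrition counts, used only to place each analytical
# process. Figures and thresholds are invented for illustration.
attrition = [4, 5, 6, 8, 11]  # months 1..5

# Descriptive: what happened
total = sum(attrition)

# Diagnostic: why it happened -- which month drove the change
worst_month = max(range(len(attrition)), key=lambda m: attrition[m]) + 1

# Discovery -- learning and insight -- is the open-ended step:
# exploring the same data for patterns you did not set out to find.

# Predictive: what is likely to happen (naive linear trend)
slope = (attrition[-1] - attrition[0]) / (len(attrition) - 1)
forecast = attrition[-1] + slope

# Prescriptive: what action to recommend, given the forecast
action = "intervene" if forecast > 10 else "monitor"

print(total, worst_month, forecast, action)
```

Each stage uses the same data but answers a different business question, which is why one umbrella term hides so many distinct skill sets.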

We use all relevant techniques, tools and technologies available to perform these analytical processes. In turn, the applications of these analytic processes vary depending on the business need. Some examples of business needs that require different expertise and tools are:

  • Information solutions—data queries to advanced reporting
  • Business solutions—campaign tracking to predictive response modeling
  • Quantitative solutions—decision trees to game theory

Running the gamut from reporting to cognitive computing, analytics needs to be viewed and discussed in ways that appreciate its complexity, applications, skill sets, value, and human element.

What do AI and analytics have in common? Look to the core of analytics to find a meaningful definition for “AI”—“machinery that learns and then acts on that learning.” Just like in the movies.

Photo Credit: Gordon Tarpley from PS-Hoth Set

FRAMEWORK : Data Products Design

How a Classic Process Leads to Focus, Clarity and Empowerment

Amy Gallo’s “A Refresher on Discovery-Driven Planning” revisits a “classic methodology for planning innovation” for less predictable new ventures.

In 1995, this new approach was “better suited to high-potential projects whose prospects are uncertain at the start” and that require “substantial adjustments to the plan along the way.” Via Eric Ries, Discovery-Driven Planning became the foundation of the “lean startup” movement.

In developing data products today, we find the DDP process especially applicable.

  • Define success
  • Do benchmarking
  • Determine operational requirements
  • Document assumptions
  • Manage planning with key checkpoints 

Key to successfully using the DDP methodology is the “continual updating of your assumptions and checkpoints,” says Rita McGrath, one of the two original DDP developers.

Even more important is the disciplined process around planning. In our work, we find the process to be iterative—planning your checkpoints, for example, often leads to redefining your success metrics.
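
That iterative loop of documenting assumptions and revisiting them at checkpoints might be sketched in code like this; the structure, field names, and figures are our own invention, not McGrath and MacMillan’s notation.

```python
# A minimal sketch of DDP's "continual updating of your assumptions
# and checkpoints". All names and numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class Assumption:
    statement: str
    value: float        # current best estimate
    tested: bool = False

@dataclass
class Checkpoint:
    name: str
    revisits: list      # assumptions re-examined at this milestone

plan = {
    "success": "break even by month 18",   # define success
    "assumptions": [                       # document assumptions
        Assumption("monthly signups", 500.0),
        Assumption("monthly churn rate", 0.05),
    ],
}
checkpoints = [Checkpoint("first pilot", plan["assumptions"])]

def reach(checkpoint, observed):
    # At each checkpoint, replace guesses with what was observed.
    for assumption, value in zip(checkpoint.revisits, observed):
        assumption.value, assumption.tested = value, True

reach(checkpoints[0], [320.0, 0.08])
print([(a.statement, a.value) for a in plan["assumptions"]])
```

The point of the structure is that assumptions are first-class, dated artifacts rather than beliefs buried in a slide deck, so a checkpoint forces each one to be confirmed or revised.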

Even though DDP has remained “remarkably durable,” in 2014 McGrath and DDP co-creator Ian MacMillan made several enhancements.

I appreciate her reasoning for updating DDP: “The velocity today is so much faster than it was then—companies need to make decisions more quickly.”

Specifically, the 2014 and more recent updates add focus on competitive positioning and current/future risk:

  • Take a close look at future competition to better anticipate disruption
  • Create assumptions about when competitive attacks and profit erosion will begin, in order to plan the “next advantage stage” launch more effectively
  • More quickly stop pursuing ventures when they “turn out to be flawed”

We strongly agree with the author that Discovery-Driven Planning “may be more relevant now than it was 20 years ago.”

As you develop your products and push through rapid prototyping, you can now pause, take a deep breath, and thank DDP for helping clarify the uncertainty and empowering entrepreneurs to move forward.