IS2021
5 min read

Part 7 - Developing a Strategy for Integrating Big Data Analytics into the Enterprise


Deciding What, How, and When Big Data Technologies Are Right for You

Some very good technologies may not be completely aligned with the corporate strategy or the corporate culture because the organization is not equipped to make the best use of the technology.

Thus, enterprises need to allow experimentation to test-drive new technologies in ways that conform to proper program management and due diligence.

The Strategic Plan for Technology Adoption

The plan should strike a balance between the need for agility in adopting innovative analytics methods and the need for continued operations within the existing environment.

It should guide the evolution of data management architectures in ways that are aligned with corporate vision and governance.

It will incorporate aspects of exploration in selecting the techniques that best benefit the organization, as well as provide support for moving those techniques into the production environment.

It should incorporate the following key points:

  • Standardizing processes for soliciting input from the business users.
  • Specifying clear evaluation criteria for acceptability and adoption.
  • Preparing the data environment for massive scalability.
  • Promoting data reuse and repurposing.
  • Instituting oversight and governance for the innovation activity.
  • Streamlining the methods for mainstreaming accepted technologies.

Let’s take a closer look at each of these points.

Standardize Practices for Soliciting Business User Expectations

A common trap is that project plans focus on delivering a technical capability while neglecting to solve specific business problems.

Whether the big data activity is driven by the business users or by the technologists, it is critical to engage the business users early in the process to understand the types of business challenges they face, so that expectations can be gauged and success criteria established.

Directly interact with the business function leaders as partners. Enlist their active participation during the requirements gathering stage, and welcome their input and suggestions during design, testing, and implementation.

Acceptability for Adoption: Clarify Go/No-Go Criteria

At some point, a decision must be made to either embrace the technology that is being tested and move it into production, or to recognize that it may not meet the business’s needs and then move along to the next opportunity.

Before embarking on any design and development activity, collaborate with the business users, using their specific corporate value metrics, to define quantitative performance measures that will reflect the success of the technology.

State a specific expected improvement associated with a dimension of value, assert a minimal but measurable level of acceptable performance that must be achieved, and provide an explicit time frame within which that level of acceptable performance is to be reached.

This method solidifies the success criteria, which benefits both parties: the business users get clear guidelines for acceptability, and the technologists get an audit trail demonstrating that the technology has business value.

Making this decision on incomplete metrics or irrelevant measures may lead to one of several undesirable outcomes:

  • Committing to the methods even when doing so may not make sense.
  • Killing a project before it has been determined whether it adds value.
  • Deferring the decision, effectively continuing to commit resources without an actionable game plan for moving the technology into production.

An example might be: “Increase cross-sell volume by 20% as a result of improved recommendations within 10 days after each analysis”.

These discrete quantitative measures can then be used to make the go/no-go decision: move forward or pull the plug.
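
As a purely illustrative sketch (the metric name, baseline, threshold, and time frame below are hypothetical, not prescribed by this article), such agreed-upon criteria could be encoded as a simple automated go/no-go check:

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class SuccessCriterion:
    """One measurable expectation agreed with the business users."""
    metric: str                   # dimension of value, e.g. cross-sell volume
    expected_improvement: float   # minimal acceptable improvement (as a fraction)
    deadline: date                # explicit time frame for reaching it

def go_no_go(criterion: SuccessCriterion, baseline: float, observed: float, as_of: date) -> str:
    """Return 'go', 'no-go', or 'pending' based on the agreed criterion."""
    improvement = (observed - baseline) / baseline
    if improvement >= criterion.expected_improvement:
        return "go"        # acceptable performance reached
    if as_of > criterion.deadline:
        return "no-go"     # time frame elapsed without meeting the bar
    return "pending"       # keep evaluating, but with a fixed end date

# Hypothetical usage mirroring the cross-sell target above
criterion = SuccessCriterion("cross_sell_volume", 0.20, date.today() + timedelta(days=10))
print(go_no_go(criterion, baseline=1000.0, observed=1230.0, as_of=date.today()))  # -> "go"
```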

Prepare the Data Environment for Massive Scalability

Big data volumes may threaten to overwhelm an organization’s existing infrastructure for acquiring data for analytics, especially if the technical infrastructure is organized around a traditional data warehouse information flow.

Test-driving big data techniques can be done in a virtual sandbox environment that can be iteratively configured and reconfigured to suit the needs of the technology.

Developing an application “in the small” that uses a small subset of the anticipated data volumes can mask performance issues that may be hard to overcome without forethought.

Considerations to address include using high-speed networks; enabling high-performance data integration tools, such as data replication, change data capture, and compression, to rapidly load and access data; and enabling large-scale backup systems.
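
As a minimal, hedged sketch of one of these techniques, the example below shows timestamp-based change data capture against an in-memory SQLite table: only rows modified after the previous load’s high-water mark are pulled into the analytics environment. The table schema and timestamps are hypothetical, and a real deployment would normally rely on a dedicated replication or CDC tool rather than hand-written SQL.

```python
import sqlite3

# In-memory stand-in for a source system table (hypothetical schema)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source_events (id INTEGER, payload TEXT, updated_at TEXT)")
conn.executemany(
    "INSERT INTO source_events VALUES (?, ?, ?)",
    [(1, "old", "2021-01-01T08:00:00"), (2, "new", "2021-01-02T09:30:00")],
)

def incremental_load(conn, last_loaded_at):
    """Timestamp-based change data capture: fetch only rows changed since the last load."""
    cur = conn.execute(
        "SELECT id, payload, updated_at FROM source_events WHERE updated_at > ?",
        (last_loaded_at,),
    )
    return cur.fetchall()

# Only the row updated after the high-water mark is pulled into the analytics environment
print(incremental_load(conn, "2021-01-01T23:59:59"))  # -> [(2, 'new', '2021-01-02T09:30:00')]
```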

Promote Data Reuse/Repurpose

Big data analytics holds the promise of creating value through the collection, integration, and analysis of many large, disparate datasets.

Different analyses will employ a variety of data sources, implying the potential need to use the same datasets multiple times in different ways.
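
As a hedged sketch of what supporting that reuse might look like, the minimal registry below lets each dataset be cataloged once and then discovered by any analysis that needs it; all names, locations, and tags are illustrative assumptions rather than a prescribed design.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetEntry:
    """Metadata recorded once so many analyses can reuse the same dataset."""
    name: str
    location: str                 # e.g. a path or URI in the shared data environment
    owner: str
    tags: list[str] = field(default_factory=list)

class DatasetRegistry:
    """Minimal catalog: register a dataset once, look it up from any analysis."""
    def __init__(self) -> None:
        self._entries: dict[str, DatasetEntry] = {}

    def register(self, entry: DatasetEntry) -> None:
        self._entries[entry.name] = entry

    def find(self, tag: str) -> list[DatasetEntry]:
        return [e for e in self._entries.values() if tag in e.tags]

# Hypothetical usage: two different analyses can locate and reuse the same transactions data
registry = DatasetRegistry()
registry.register(DatasetEntry("transactions_2021", "s3://shared/transactions/2021/", "sales-ops", ["sales", "customer"]))
print([e.location for e in registry.find("customer")])
```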

Institute Proper Levels of Oversight and Governance

There are grey areas to navigate: the boundary between speculative application development and production, assessing pilot projects, and, when they prove successful, transitioning those pilots into the mainstream.

This cannot be done without some oversight and governance to properly align speculative development with achieving business goals in relation to the collected business requirements.