Big Data Analytics: When will the initiatives finally come of age?

This was the title of an article of mine that appeared in the IDG Expert Network last year. It dealt with the lack of profitability of analytics initiatives. After the article was published, I had many discussions with colleagues and clients on the same topic, all in the same tenor: big data has to become more professional. In the context of digitalization, the analysis of huge amounts of data is no longer a newcomer; it is already well established. The trial-and-error approach that still characterizes many projects is therefore, in my opinion, no longer acceptable.

Lately, a lot of empirical data has become available, especially regarding the success criteria. Yet despite all the ambitions and best practices, organizations are still failing to implement their big data plans. Why? In my view, the following four observations play a pivotal role:

  1. The project is not initiated by the business

Big data projects are too often initiated by IT. However, big data and analytics are two sides of the same coin. The respective business areas, with their strategic business goals, have to be the initiators; they represent the analytics part. IT's role is that of an enabler: it has to make the project possible. One of our clients recently set up a Hadoop platform, but then found out that people were not automatically using the new platform. Capgemini was therefore asked to identify and prioritize promising analytical use cases jointly with the business. We considered analytical use cases that deliver concrete insights, whether for client interaction, process flows, the competitive situation in the market, or other business areas. Project leaders should always start big data projects with this important step.

  2. Stop use cases that do not fly

Not every big data use case works; often the quality or spectrum of the data is insufficient. It is also possible that the planned analytics approach is simply not convincing. The data scientists then find that the resulting forecast model is not significant or accurate enough, and often this cannot be changed. The answer here is to fail fast. Don't get hung up on a single analytics use case; keep several candidates in consideration. Whether a use case is developed further depends on whether its underlying idea gets off the ground and flies, and that needs to be determined quickly, usually with the help of a proof of concept. Instead of being guided by the fear that a particular use case might fail, the credo of an initiative should be to determine as early as possible which use cases are viable and then to implement those appropriately.
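
To make the fail-fast decision concrete, here is a minimal sketch of such a go/no-go check in Python. The sample file, the column names, and the 0.75 threshold are illustrative assumptions, not a prescribed setup: a quick baseline model is trained on a data sample, and the use case is continued or stopped depending on whether it clears the bar.

    # Fail-fast check for a forecasting use case (illustrative sketch).
    # File name, column names and the 0.75 threshold are assumptions.
    import pandas as pd
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score

    df = pd.read_csv("use_case_sample.csv")          # a small sample is enough for a PoC
    X, y = df.drop(columns=["target"]), df["target"]
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

    model = GradientBoostingRegressor().fit(X_train, y_train)
    score = r2_score(y_test, model.predict(X_test))

    # Go/no-go decision: stop early if the baseline model is not accurate enough.
    THRESHOLD = 0.75
    decision = "continue" if score >= THRESHOLD else "stop and try the next use case"
    print(f"R^2 = {score:.2f} -> {decision}")

The point is not the particular model or metric but the explicit, early decision rule that frees the team to move on to the next candidate use case.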

  3. Available infrastructure is often not flexible enough

For a fail-fast approach, IT needs an infrastructure that can be deployed for a use case very quickly and at low investment cost. The IT infrastructure of traditional companies is not really cut out for that; the maxims that govern it are availability, quality, and reliability. It is therefore important to explore the option of working in the cloud for big data analytics use cases. This also eases the shortage of skills in working with new big data platforms, as much of that expertise can be supplied by the cloud providers themselves. Capgemini's Business Data Lake-as-a-Service, for example, is provided in the cloud, and our clients use it to jumpstart their big data projects without worrying about the impact on their own infrastructure.
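
As an illustration of how little setup a cloud-based prototype needs, the following sketch reads raw data straight from object storage into a Spark session on a managed sandbox. The bucket path, dataset layout, and column names are hypothetical; any managed Spark service would look much the same.

    # Prototyping a use case on a cloud-hosted Spark sandbox (illustrative sketch).
    # The bucket path and column names are assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("poc-customer-events").getOrCreate()

    # Read raw data directly from cloud object storage; no on-premises infrastructure is touched.
    events = spark.read.parquet("s3a://poc-sandbox/customer_events/")

    # A first, cheap aggregation to judge whether the data supports the use case at all.
    summary = (events
               .groupBy("customer_id")
               .agg(F.count("*").alias("events"),
                    F.max("event_date").alias("last_seen")))
    summary.show(10)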

  4. The operationalization of big data analytics is underestimated

Most teams succeed in making a proof of concept fruitful. However, the final implementation of the use case in the IT application landscape, and its sustainable embedding in business processes, is tough. Project leaders often underestimate the effort needed to fully operationalize a use case. At times, the tools and platform used in the proof of concept are not suitable for regular ongoing operation and for scaling up. Yet a use case usually pays off financially only once it runs in production. Companies should therefore create an overarching process for implementing analytical use cases, one that starts with the idea, continues with the proof of concept, and ends in operation. For the PoC phase, agile, mostly cloud-based platforms such as sandboxes are fine; ultimately, however, you need an operational platform that runs all these use cases and meets the operational requirements at enterprise level.
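
One small but typical operationalization step is packaging the validated PoC model as a single versioned artifact that a production scoring job can load unchanged. The following sketch, with assumed column names and file paths, shows that pattern using a scikit-learn pipeline persisted with joblib; it is one possible approach, not the only way to hand a model over to operations.

    # Packaging a PoC model for operationalization (illustrative sketch).
    # Column names and file paths are assumptions.
    import joblib
    import pandas as pd
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.linear_model import LogisticRegression

    train = pd.read_csv("poc_training_data.csv")
    X, y = train.drop(columns=["label"]), train["label"]

    # Preprocessing and model live in one pipeline, so production scoring
    # cannot drift away from what was validated in the proof of concept.
    pipeline = Pipeline([
        ("scale", StandardScaler()),
        ("model", LogisticRegression(max_iter=1000)),
    ]).fit(X, y)

    joblib.dump(pipeline, "model_v1.joblib")   # versioned artifact handed to operations

    # The scheduled production job then only loads and applies the artifact:
    # predictions = joblib.load("model_v1.joblib").predict(new_data)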

Have you encountered any of these four hurdles on the way to profitable big data projects in your organization? What are your experiences with the operationalization of analytical use cases? Do let me know whether you agree, or not.

