International gurus' view of BI 2016

Bert Brijs
Senior BI Consultant, Owner, Lingua Franca bvba & Lingua Franca B.V.

In your opinion, what was the most important advancement within BI/Analytics in 2015?
The breakthrough of Self-Service BI (SSBI). SSBI has become more than a promise as far as the tool vendors are concerned. Tableau and QlikView, to name the most prominent, have led the way, but Microsoft is catching up with its 2015 suite. More integration effort will be needed from the Redmond guys, but we have become used to their market response strategies… Oracle, IBM and SAP are following suit. What else do you expect from the gorillas in the market?

On the client side, there is still much work to be done. SSBI without a solid master data management and data quality strategy and process is dead in its tracks, and the old adage about BI will resurface as soon as users lose confidence in the presented facts:

“BI is a process of extracting, manipulating and publishing data for analysis and decision making, and then the data end up in Excel…”

In your opinion, what will be the most important advancement within BI/Analytics in 2016?
More than my opinion, this is my hope for 2016: that business and IT people will get together and focus on the common goal of integrating information management with the strategy process, instead of reducing the analytics demand to “analyse these data, build me a cube, deliver me reports”. This application development paradigm should be replaced by a common information and decision support strategy in which both parties share knowledge and vision to deliver optimum output, pushing the boundaries of the unknowns and uncertainties further away.

So yes, a clear business case built on a deep understanding of the decision making process and culture, the data architecture and the opportunities offered by technology should become the common ground for the analytics strategy. Unfortunately, in many organisations, the builders call in the architects when the building is already collapsing.

People have been talking about big data for a while now. Do you see big data projects becoming mainstream in 2016? Why/why not?
I have been involved in Big Data projects all over Europe, mostly on the business analysis and project management side. The “all over Europe” is no coincidence: resources are scarce, and so are business-driven projects. In other words, most Big Data project funding still comes from project budgets, IT budgets or innovation budgets. The result is obvious: short proofs of concept set up by lab rats driven by technology arguments. They have been developing all sorts of tests that, to say the least, are not always durable and certainly not documented, scalable or transferable to the business community. When Gartner tries to launch the concept of the citizen data scientist, I think this will offer nice material for conferences and reports, but before we see this species emerge in organisations we will see a lot more laboratory stuff and propeller heads performing their magic.

And then there is the turbulence in the technology market itself. Will Spark provide the all-encompassing roadmap in Big Data? What about Flink or Storm? Will lambda architectures really provide a useful and maintainable platform for integrating streaming with batch processing in Big Data? Will there ever be fewer than twenty NoSQL dialects? And who else is developing new architectures: Exasol, Elasticsearch, or some other vendor we have all overlooked?

I believe 2016 will slowly but surely become a turning point in this evolution. The three major packaged Hadoop distributions (MapR, Hortonworks, Cloudera) will convince the business that the dust is settling and that they needn’t worry about the next big thing anymore, as the opportunity cost of not scaling up the Big Data analytics infrastructure will become prohibitive. A few European e-commerce players, telcos and banks are already making the move. Whether these moves will be successful will, again, depend on the level of understanding created between all parties concerned.
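For readers unfamiliar with the pattern: a lambda architecture keeps a periodically recomputed batch view next to an incremental “speed layer” and merges the two at query time. The toy Python sketch below only illustrates that merge, with hypothetical names and data; in practice the batch layer would typically run on Hadoop or Spark and the speed layer on Storm or Flink.

```python
# Toy illustration of the lambda architecture's query-time merge.
# All names and data are hypothetical.

from collections import defaultdict

# Batch layer: recomputed periodically over the full, immutable master dataset.
master_dataset = [("page_a", 1), ("page_b", 1), ("page_a", 1)]

def compute_batch_view(events):
    view = defaultdict(int)
    for key, count in events:
        view[key] += count
    return dict(view)

batch_view = compute_batch_view(master_dataset)

# Speed layer: incremental counts for events that arrived after the last batch run.
realtime_view = defaultdict(int)

def on_new_event(key, count=1):
    realtime_view[key] += count

on_new_event("page_a")
on_new_event("page_c")

# Serving layer: a query merges both views to approximate the current answer.
def query(key):
    return batch_view.get(key, 0) + realtime_view.get(key, 0)

print(query("page_a"))  # 2 from the batch view + 1 from the speed layer = 3
```

The maintainability question above stems largely from this design: the same business logic has to be kept consistent across two separate processing paths.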

What new buzzword within BI/Analytics will we see in 2016?
I already mentioned the Citizen Data Scientist: a “light” version of the propeller head, enabled by more accessible analytics tools instead of the usual Python, R and Scala scripting and functional programming tools.

Three buzzwords that already existed will gain traction in 2016: data virtualization, linked open data and concept mining.

Data virtualization (DV) has been around for a while, with a very focused Denodo putting it on the CIO’s agenda and ETL tool vendors like Informatica positioning themselves in the market. But what will really develop the market is the awakening of the giant Cisco to the data virtualization call. Will DV deliver on the promise to bring order to the data warehouse chaos some first movers have created over time? Wait and see. I have a client with 56 data warehouses, the equivalent of 56 new information silos. They sure as hell can take a look at DV.

Linked open data will explode as key-value stores are easily explored with Big Data analytics tools. Science, management and politics will be the greatest beneficiaries of this evolution.
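To make “easily explored” a little more concrete: linked open data is typically published as RDF and queried through public SPARQL endpoints. The sketch below is only an illustration, using the public DBpedia endpoint and the SPARQLWrapper library; the endpoint, classes and query are example choices, not a recommendation.

```python
# Illustrative sketch: querying a public linked-open-data endpoint (DBpedia)
# with SPARQL via the SPARQLWrapper library. Endpoint, classes and properties
# are examples; any SPARQL-capable store could be substituted.

from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setReturnFormat(JSON)

# Ask for a handful of banks and the locations DBpedia links them to.
sparql.setQuery("""
    PREFIX dbo: <http://dbpedia.org/ontology/>
    SELECT ?bank ?location WHERE {
        ?bank a dbo:Bank ;
              dbo:location ?location .
    } LIMIT 5
""")

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["bank"]["value"], "->", row["location"]["value"])
```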

Concept mining is an emerging discipline that transcends the usual “term frequency–inverse document frequency” analysis (tf-idf) as well as natural language processing and natural language understanding. A new breed of (oops, it’s back again) artificial intelligence will help us understand human beliefs, values and attitudes as well as behavior via text analytics. There are huge business cases waiting to be explored and exploited in industries where security and trust are mission critical. Think of banks, trading platforms, hazardous production environments, armies, police…
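As a baseline for what concept mining tries to transcend, here is a minimal tf-idf sketch over a toy corpus; the corpus and the smoothed scoring variant are illustrative choices, not a reference implementation (real pipelines would typically use a library such as scikit-learn).

```python
# Minimal tf-idf over a toy corpus: the baseline that concept mining goes beyond.
import math
from collections import Counter

corpus = [
    "trader reports a suspicious transaction",
    "bank flags a suspicious transaction",
    "police investigate a trader",
]

docs = [doc.split() for doc in corpus]
n_docs = len(docs)

# Document frequency: in how many documents does each term occur?
df = Counter()
for tokens in docs:
    for term in set(tokens):
        df[term] += 1

def tf_idf(term, tokens):
    tf = tokens.count(term) / len(tokens)              # relative term frequency
    idf = math.log((1 + n_docs) / (1 + df[term])) + 1  # smoothed inverse document frequency
    return tf * idf

# Score every term of the second document.
tokens = docs[1]
for term in sorted(set(tokens)):
    print(term, round(tf_idf(term, tokens), 3))
# "bank" and "flags" score highest; "a", present in every document, scores lowest.
```

Scores like these rank terms, but they say nothing about the beliefs, values or attitudes behind them, which is exactly the gap concept mining is meant to fill.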

Best regards,

Bert
