Sep 8, 2020

All New Posts

For all new posts, please follow me on LinkedIn:

https://www.linkedin.com/in/nikhil-bhatia-analytics/

Introducing Data Analytics CoE Within Your Organization

Businesses are continuously seeking avenues to increase their efficiency, be it in cost savings, growth strategies, market expansion plans, or improved decision-making. Today, one way, or perhaps the ‘only’ way, to increase business efficiency is to make data-driven decisions. This need, along with the growing volume of data, is pushing organizations to set up an analytics and business intelligence Center of Excellence (CoE). A CoE helps enterprises access insights on customers, services, products, markets, and more, and proactively adjust their strategies to shifts in market forces.

But why do I need an Analytics Center of Excellence? Isn’t having an in-house analytics team enough?

Why do you need an Analytics Center of Excellence?

Let’s begin by understanding the purpose of a CoE. The CoE is a cross-organizational body that is responsible for a specific function (primarily information management). Its main purpose is to identify, develop, and establish cross-functional processes and harness the expertise and knowledge of resources to provide tangible business benefits.

A CoE is a channel through which project managers, customers, line managers, and others can fulfill and improve business initiatives and their outcomes. Most importantly, the CoE is also a living knowledge repository: it continuously generates and refreshes knowledge, skills, practices, and competencies, and serves to guide those working in that business domain.

So, when should you set up an Analytics CoE?

Some of the key reasons to establish an Analytics CoE could include:

·        Increase adoption of analytics at an organizational level

·        Accelerate go-to-market

·        Improve collaboration between business and IT

·        Make analytics accessible to all business departments

·        Standardize analytics tools across the organization for increased adoption

·        Discover ways to transform operations, products, and markets, and eliminate operational redundancies and obstacles

Strategies for setting up an Analytics CoE

Define the roles and responsibilities

The right set of people will make all the difference to your CoE. Given the global shortage of analytics talent, this is perhaps the hardest piece of the puzzle. You need data experts such as data engineers, data scientists, and data architects.

You also need to designate a data custodian: an in-house or third-party team with access to all the data sources of the organization. This team will support all data mining, data assimilation, and reporting needs.

Along with the data experts, your Analytics CoE should also include business experts to align the organizational business strategy and all aspects of the business to improve efficiencies and drive profitability.

Choose the right platform

Choosing the right platform for your CoE is critical if you want to drive usage and increase the role of data in decision-making.

Adopt an easy-to-use, intuitive, comprehensive platform that allows business users to explore data, examine patterns, and reveal hidden insights without necessarily needing the help of data scientists. The platform should automatically apply the right models to the right data to help business users gain the insights they need to achieve their desired business outcomes.

The platform should also give you access to rich visualizations and powerful interactions for an elevated user experience. It should also have a pre-built set of linguistic, statistical, NLP, and machine learning techniques to model and structure textual data for analysis, visualization, and collaboration.
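To make "modeling and structuring textual data" concrete, here is a minimal sketch of one such statistical technique, TF-IDF weighting, which turns free text into numeric features that analysis and visualization tools can consume. The mini-corpus is made up for illustration; real platforms would use far richer pipelines.

```python
import math
from collections import Counter

def tfidf(docs):
    """Turn raw text documents into TF-IDF weighted term vectors."""
    tokenized = [doc.lower().split() for doc in docs]
    n = len(tokenized)
    # Document frequency: in how many documents does each term appear?
    df = Counter(term for doc in tokenized for term in set(doc))
    vectors = []
    for doc in tokenized:
        tf = Counter(doc)
        vectors.append({
            term: (count / len(doc)) * math.log(n / df[term])
            for term, count in tf.items()
        })
    return vectors

docs = [
    "customer churn is rising in the west region",
    "customer satisfaction is rising after the new launch",
    "customer sales in the west region dipped last quarter",
]
vectors = tfidf(docs)
# Terms common to every document (e.g. "customer") score 0;
# distinctive terms such as "churn" carry the signal.
```

The useful property for business users is that distinctive vocabulary is automatically up-weighted, so patterns surface without anyone hand-labeling the text.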

Focus on governance

As the volume of data increases, so does the focus on data governance. Given that no CoE has unlimited funds and resources, it is imperative to set the right priorities and control costs. For this, a good governance framework is essential.

When establishing this framework, it is also crucial to ensure that the Analytics CoE fits in with the existing Business Intelligence and information governance framework.

Along with this, the CoE has to have big data governance policies in place, ensure the security of data, have robust access management strategies, and have the right privacy controls. When setting up a CoE, it is imperative to remember that data is a key asset. Thus, a focus on governance helps not only from a legislative standpoint but also from a security standpoint.

Deploy as a service

The Analytics CoE should also be able to deploy as a service. By enabling deployment as a shared service across business units, the Analytics CoE can democratize data use by making data-driven decisions accessible. This model also helps optimize infrastructure and rationalize resource utilization.

When setting up the Analytics CoE, it makes sense to educate the users and explain that the CoE is not just an algorithm factory. It is also not just a team of product specialists. It is a place where technical and business teams come together. Here, tools, technologies, methodologies, and techniques are applied to data to gain the efficiency that today’s business organizations need and help all employees become citizen data scientists.



How AI and IoT Can Transform the Future of the Workplace

By 2020, 36% of the workforce will comprise Millennials, the generation born after the baby boomer era. Millennials and Gen Z are digital natives: not only are they extremely comfortable with technology, but they also expect their workplace to be digitally advanced. With this new workforce comes the need for a modern workplace.

What fostered productivity earlier has been replaced by newer motivations and driving forces, coupled with new ways of collaborating and working. The digital workplace has become an inherent part of the organization. Legend has it that Steve Jobs was so fixated on collaboration between teams that he had a concourse built to enable them to meet and exchange ideas. A similar principle is being augmented by next-gen technologies, which are increasing interconnectedness in the workplace. Enterprises today are implementing next-gen solutions to optimize day-to-day operational costs.

This brings us to the first point about connected workplaces.

Interconnectedness: It is no longer just collaboration but intelligent interconnectedness. Whether it is a factory floor or a corporate office, connectedness is being redefined with AI and IoT. Smartphones have turned into mini offices and, at the click of a button, into virtual conference rooms. On the shop floor, a person sitting in the control room has a clear view of the health of the various equipment as well as the whereabouts of people working in various areas. This is fostering greater collaboration, transparency in operations, and workforce safety.

Smart Buildings: Another example is the conversion of office buildings into smart buildings. The operational cost of keeping an office running is fairly substantial for most enterprises. Star Trek fans would know the concept of running the starship in grey mode: when the starship has fuel constraints, it operates only those machines and components crucial to keeping it flying. IoT and AI can enable offices to run in the proverbial grey mode during off-peak hours, with lights and other powered equipment controlled by sensors. This contributes heavily to power saving (leading to cost savings), worker safety, and, of course, helping companies become more environmentally friendly.

Automation Leading to Productivity Enhancement: Communication, as stated earlier, is being aided by AI and IoT. For instance, if your employees need a software upgrade or want a damaged ID card replaced, the process can be automated with the help of AI/ML. Employees can raise self-service requests and get most of their issues closed. This allows the operations staff to work on higher-value tasks and communicate better with employees on the things that really matter.

On another note, have you ever noticed how the files we use and the names of the people we communicate with most pop up automatically? Smart text analytics flags most of our communication so that we do not miss those important emails.
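As a toy illustration of what such flagging might look like under the hood, here is a minimal keyword-based scorer that surfaces urgent mail first. The keyword list and subject lines are invented for the example; production systems use learned models rather than fixed lists.

```python
# Toy rule-based email flagger: score messages by urgency keywords
# so important mail surfaces first. Terms and subjects are illustrative.
URGENT_TERMS = {"urgent", "outage", "deadline", "escalation", "asap"}

def priority(subject: str) -> int:
    """Count urgency signals found in a subject line (substring match)."""
    lowered = subject.lower()
    return sum(term in lowered for term in URGENT_TERMS)

inbox = [
    "Weekly newsletter: new cafeteria menu",
    "URGENT: production outage in region west",
    "Reminder: timesheet deadline today",
]
# Sort the inbox so the highest-priority messages come first.
flagged = sorted(inbox, key=priority, reverse=True)
```

Even this crude heuristic reorders the inbox sensibly; the real gain of ML-based flagging is learning those signals from each user's behavior instead of a hand-written list.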

Workplace Safety: In industries such as oil and gas, worker and ecological safety is a primary concern. No one wants a rerun of the Deepwater Horizon episode. Employee ID cards, and even those of their families, can be IoT-enabled with a microchip whose sole purpose is to give real-time location updates for everyone on site. This helps companies optimize their safety effort in case of a disaster: with the help of the tracking device, the relevant stakeholders can know where their people are and in what priority they need to be rescued.

Employee Engagement: Now let us come to the most awaited piece: how AI can drive employee interaction and training. HR analytics is already a part of the interactions between HR and employees. Various data points are tracked to gauge employee engagement levels within the company, giving HR leaders a view into churn predictability. This input can help organizations retain their most prized employees. Similarly, AI can identify upskilling opportunities: AI/ML-based identification and formulation of training modules to reskill and upskill resources has become a critical use case, and companies will surely explore it extensively going forward.

Having said all that, one must realize that we are on the cusp of change, and these technologies are still maturing. The market has so far responded with a lukewarm approach to the adoption of AI and IoT in the workplace. But as adoption gathers traction, these technologies will start playing a pivotal role in enhancing employee productivity.


Hurdles you could face while launching a New Data Science Initiative...

Adopting new technology is never a cakewalk. Irrespective of the size of the company, every process transformation brings anxiety and struggle. With a relatively newer discipline such as data science, where huge investments are at stake, organizations are willing to go the extra mile to get everything right.

Implementing new initiatives in data science is a transformation in itself. However, it is not limited to understanding tech tools, hiring data scientists and analysts, and so on. It goes way beyond the obvious.

Read on for some of the not-so-obvious challenges; knowing them could help you avert huge losses while launching a new data science initiative!

What does a data science initiative entail?

Data science is all about giving your data a purpose. It helps you transform all the acquired data into value: improved revenue, enhanced customer experience, business model innovation, reduced costs, and agile business solutions, all to serve customers better.

A data science initiative based on clarity about how and where the data is going to be used inevitably brings better results. But it is also imperative that the goals of data science be embedded within the business teams and aligned with business objectives to produce outstanding results.

With a clear objective, any organization can optimize data in the right way. For any data science initiative, the company needs to have experts such as data analysts, data scientists, data evangelists who can extract maximum value out of data and also bring that data to actual use. But more than anything, it is the mindset of the people and the processes in the organization that determine the success or failure of any data science initiative.

The Not-So-Obvious Hurdles

While many companies might be doing well in data science, there are still several others that struggle with their first data science initiative.

The absence of a data science culture might be the real challenge that needs to be addressed in any data science initiative; addressing it is the essential precursor to a healthy ‘data culture’. The people and processes of the company must be prepared to embrace the data-driven transformation.

Experts say that this data culture should be instilled in all the teams involved in the initiative. It is a crucial step in building feasible frameworks and sustainable infrastructure that support data analytics. Companies need a culture that transforms the way people perceive data in their decision-making.

Hiring data scientists and analysts might be the easier part. However, integrating the new talent into a data-based culture or re-configuring the entire corporate culture to make it more data-driven is the real challenging part for any company that wants to succeed in its data science initiatives.

Stakeholders of the company play a pivotal role in setting the right tone and ground for an all-inclusive data culture for the organization’s success in the initiative. It is important for the company leadership to lead the data-driven culture and practice the same.

How the data team is defined in an organization is also very important for success. Is it a mixed bag of data handlers? Does it comprise a group of hand-picked data scientists or analysts? It is also important that the business folks, data scientists, and experts are marching to the same drumbeat and are all on the same page.

Inherently incorporating data-based knowledge and a better understanding of markets, customers, and their needs can pave the way to a successful data initiative. People in the organization should treat data as a high-value asset and data science as serious business. Clearly, a culture like that must grow from within; simply imposing a data-driven culture will almost always fail. Executives and major influencers should set an example by demonstrating a serious data mindset with clear objectives, placing data over intuition.

The Other Obvious Hurdles

Poorly defined needs

If your project needs are not well defined from the very beginning, they can become a huge challenge later. A project that falls short of clearly defined objectives is far more likely to fail in the long run. You need to clearly define where the data will come from, how it will be processed, who will use it for what kind of decision-making, and how you will measure success.

Unmatched objectives

Data science is useful only when the data can be used effectively to bring about change. If the data analytics and implementation don’t match the users’ workflow or needs, the data is of no value at all. You need to clearly define what you aim to achieve through the initiative; only then will you know how to measure its success.

Lack of a data analytics translator

Business folks are an integral part of a data science project. However, it is important to also include a data analytics translator to bring the team into sync and make sure that everyone involved understands the technical language of data science.

Conclusion

While launching any data science project, it is important to go back to the basics of data science and analytics, its needs, and its objectives. While one needs to keep an eye on the obvious challenges that are more technical and tangible, keeping the other, not-so-obvious challenges in mind can greatly improve the odds of success for the data science project.

Turning around the mindset of the people in any organization is not an easy task. While the stakeholders of a company are the torchbearers, it is also up to each individual to own and understand the true value of data!

Take a look at rubiscape.com and, if interested, drop me a line.

Data Science played a key role in Indian Elections 2019

Almost all political parties are now focusing on designing campaigns backed by data and analytics. After its bitter defeat in 2014, even the Congress joined the data bandwagon to get things right this time around. But the use of data analytics is not new in the political arena. The BJP used data analytics prominently in the 2014 election to design its campaign; though on a smaller scale, its agenda was to convert volunteers into voters and voters into volunteers. We also have the 2012 U.S. presidential election to look at: there, team Obama used Big Data and analytics to understand, analyze, and detail voter sentiment, profiles, behavioral patterns, and all the relevant factors that would impact voting.

Why does it make sense to leverage data analytics?

Let's begin with a simple observation: nothing about India is simple. Really. Take a look at our country’s geography: 7 lakh villages, 600 districts, and 36 states and union territories. In the 2014 General Elections, nearly a million polling booths were set up. It is in these booths that you get micro-level information on 600 million voters! These booths also hold the key to understanding how the population voted. Not only is that valuable information, it is a veritable goldmine for understanding the complex dynamics of our country.

The 2014 elections were the first time data analytics was used to change traditional electioneering techniques in India. The BJP, for example, used data to design, structure, target, implement, and communicate its campaigns to the masses. With data, it managed to strategically navigate the complexities of demographics, religion, caste, and politics as well.

So how did data analytics help?

Insight into voter stance

Data analytics in this election helped candidates evaluate and understand voter stance. It gave them insights that helped them tweak their election campaigns: they could micro-target their messages and use the right communication channel to reach their voters. The BJP, for example, targeted mobile voters using voice broadcasting. It also used GPS in campaign vans and cookies on its website to harvest information about users’ internet activity for customized advertising.

Navigate political campaigns

While the 2014 General Election relied on datasets obtained from the Election Commission and government websites, the 2019 election witnessed the more aggressive use of commissioned datasets from social media platforms and historical electoral records. Electoral data is also being used in conjunction with other data, such as GIS data and census data. Along with this public data, political parties are keen to use private data, with consent of course, to steer their campaigns.

Real-time data analytics is also becoming a darling of the political circuit. So far, our political parties have not had the right vehicle for understanding social intelligence; it was driven primarily by the proverbial ‘gut feel’. With data analytics, they got quantified insights into this social intelligence, and real-time analytics ensured that they got this access on time, proactively. So, whether it was to set the narrative or to change it, real-time analytics meant that political parties did not have to be reactive.

Convince and convert

Data analytics is also helping political parties navigate the curious case of the ‘floating voter’ – the elusive Indian who has not yet made up his/her mind on which candidate to vote for.

With data analytics becoming part of the main campaign toolkit, a political party can engage with its target group and have its volunteers focus on converting undecided voters. Along with this, micro-communication over the internet, mobile, and social media can be leveraged to swing these voters.

Targeted messaging

Data analytics also helps political parties understand the issues faced by the majority of the people. Unemployment, women empowerment, and female education or sanitation, etc. have been some resonating issues faced by the Indian population.

Political parties leveraging data analytics have been able to identify demographic patterns and then design their campaigns around these issues.

Calibrating Social Media wins

Social media is another medium that is becoming the new political battleground. India is estimated to have over 258.27 million social network users in 2019. India also has the second-highest number of internet users after China, at an estimated 462.12 million in 2019. Using data analytics to understand, communicate with, and convince this huge bank of voters through sentiment and behavioral analysis is simply common sense.
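To show the basic shape of such sentiment analysis, here is a minimal lexicon-based scorer. The word lists and sample posts are entirely invented; real campaign teams would use trained models over far larger corpora, but the principle of converting raw posts into a quantified sentiment signal is the same.

```python
# Toy lexicon-based sentiment scorer for social media posts.
# POSITIVE/NEGATIVE word lists and the posts below are illustrative.
POSITIVE = {"good", "great", "support", "win", "hope", "development"}
NEGATIVE = {"bad", "corrupt", "fail", "unemployment", "angry", "scam"}

def sentiment(post: str) -> float:
    """Score in [-1, 1]: +1 if all matched terms are positive, -1 if all negative."""
    words = post.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

posts = [
    "great rally today, real hope for development",
    "unemployment is up and the local office is corrupt",
    "polling booth opens at 7 am",
]
scores = [sentiment(p) for p in posts]
# Aggregated over millions of posts, scores like these become a
# real-time gauge of voter mood by region or issue.
```

The per-post score is crude, but aggregating it by constituency or issue is what turns social chatter into the "quantified social intelligence" described above.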

Along with all this, data analytics is helping this election season’s political contenders assess who should be given election tickets, whom the party should ally with, what its stand and positioning should be, and how it should roll out campaigns and convince voters.

We can consider 2014 a watershed moment in Indian politics, be it the rise of the Modi-led NDA or the fact that the BJP came to power with an absolute majority for the first time in 30 years. The indelible fact remains that the BJP campaign employed data analytics to micro-target voter segments, gather relevant demographic information to develop voter maps, analyze past voting patterns, manage booths, and craft messages that appealed to a broader electorate. Data analytics also reshaped advertisements and was leveraged to assess how best to achieve voter engagement.

I believe that data analytics is changing the electoral battlefield. It is introducing new rules, and most political parties are leveraging it for the insights they get. I agree that data in the Indian context can be complex, given the country’s diversity. But it is precisely because the Indian context is so complex and diverse that data can emerge as the ultimate leveler: something that tells the truth as it actually is.

What do you think?


AI on trial: There is a case for robots as judges.

Businesses have been using machine learning for a while to study and predict customer behaviour.

Prima facie, the case for using artificial intelligence (AI) as an aide to judges is on solid grounds. In the judiciary, save for landmark decisions that break new ground, a majority of litigation is decided on the basis of stare decisis, or the doctrine of legal precedent. In that sense, the use of AI should be a no-brainer, with machine learning algorithms obviously able to scan and sift through a lot more - potentially unlimited - historical data to arrive at a dispassionate view. Countries around the world have taken the first monumental steps by allowing judges to use the help of AI in sentencing - or acquitting - criminals.

Law enforcement authorities are investing resources to leverage AI to predict crime, traffic congestion, and accidents, besides using facial recognition and other tech solutions to counter threats and boost security. There is, therefore, every reason to believe that AI will prove to be a great enabler in jurisprudence; judicial departments are already showing how it can help reduce the time it takes to study the available evidence and arrive at a fair decision.

Interestingly, AI can prove to be a boon in deciding cases in countries like India and China, where a huge backlog of cases remains undecided for years, at times decades, owing to a lack of judges and infrastructure. It isn't just fact-finding that benefits from AI intervention; it can also help eliminate human bias in routine cases. Humans can be emotionally swayed, have a bad day at work, be tired, or miss a certain angle of the case for a variety of reasons; a robot won't. Therefore, when it comes to deciding routine cases, such as petty thefts and traffic accidents, AI can do a much faster job than humans.

What about landmark cases, though, where precedent needs to be set and you need the 'human' intelligence to do so? Will AI ever reach that level of intelligence? The jury is still out on that.


Feb 1, 2018

Big Data is more than a promise.

Over the past 20 years, the database has been the Enterprise IT standard for organizations attempting to gain business insight by consolidating their data in a central repository known as a data warehouse. As the technology matured, the field of business intelligence and data warehousing took off, with Gartner estimating total 2016 spend at $16.9B.
Until very recently, most of that spend was on traditional IT, where an operational reporting or analytics need would turn into a traditional data warehouse project, front-ended with a traditional business intelligence tool. In recent years, the monolithic data warehouse fell out of favor, replaced by a more iterative and agile approach, but with a similar design topology and data architecture: a star schema with an ETL layer for data transformation and a BI platform for reporting and analytics. More often than not, these systems were expensive to build and did not live up to their lofty goals. Their care and feeding requires 24/7 support, especially for the ETL component, with loads that update the warehouse incrementally or overnight. As a result, it is not unusual for an ETL process in a mature warehouse to run 6-10 hours, and in many cases much longer. By today’s standards, that is not exactly technological light speed, and users deserve better.
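The incremental loads mentioned above typically work off a "high-water mark": each run extracts only the source rows changed since the last load, transforms them, and advances the mark. Here is a minimal sketch of that pattern; the table, column names, and values are invented for illustration.

```python
from datetime import datetime

# Toy incremental ETL: load only source rows newer than the warehouse's
# high-water mark instead of reprocessing the full history each night.
source_rows = [
    {"order_id": 1, "amount": 120.0, "updated_at": datetime(2016, 9, 1)},
    {"order_id": 2, "amount": 75.5,  "updated_at": datetime(2016, 9, 3)},
    {"order_id": 3, "amount": 210.0, "updated_at": datetime(2016, 9, 5)},
]

def incremental_load(rows, watermark):
    """Extract rows changed since the last load, transform them,
    and return the new batch plus the advanced watermark."""
    batch = [r for r in rows if r["updated_at"] > watermark]
    transformed = [
        {"order_id": r["order_id"], "amount_cents": int(r["amount"] * 100)}
        for r in batch
    ]
    new_watermark = max((r["updated_at"] for r in batch), default=watermark)
    return transformed, new_watermark

batch, wm = incremental_load(source_rows, datetime(2016, 9, 2))
# Only orders 2 and 3 are picked up; the watermark advances to Sep 5.
```

Even in this toy form you can see why nightly batch windows balloon: every transformation, lookup, and conformance step runs inside that extract-and-load loop over ever-growing source tables.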
As the BI ecosystem has matured, Big Data and the promise of Hadoop have been hot topics in the technology space, but so far more hype than reality. Analysts have estimated that the Big Data market (including cloud, hardware, software, and services) will grow to over $50B by the early 2020s from roughly $1B today. Although some of the most successful organizations, including Facebook, Amazon, Netflix, and Google, have built their success on the back of Big Data, adoption has been a slow crawl. Considering how quickly newer technologies have been embraced in the consumer space (e.g., Snapchat, Instagram), why has Big Data been slow to catch on? Some would say that without a specific use case that screams for a Big Data application, the initiative will fail. Others would say the technology is still immature and unproven, although organizations like Cloudera, which provides the wrapper and admin consoles for Hadoop, have legitimized the technology for commercial-scale use.
That raises the question of whether we are taking the right approach to Big Data. Use cases are important but limiting in scale and scope, so the approach and vision for Big Data need foresight into unexplored possibilities.
Who knew that Big Data would allow an insurance company to truly understand customer behavior? Sensor data is allowing companies to be better equipped to handle parts failure in a product. Business analytics and Big Data strive for the same objective: to maximize the usage and analysis of all of an enterprise's data assets. However, the technology foundations and structures are very different, with the Java-based, open-source Hadoop on one side and the conventional relational models of traditional data warehousing on the other.
For more validation, Ralph Kimball, the father of modern data warehousing, has stated that there is a better way than the design and data architectures he created. He believes that the future is Hadoop, the foundation of Big Data. Ultimately, what matters most is that any organization with a successful business intelligence program, or one just starting out, should consider Hadoop as a core component of its modern data architecture and analytics program. The HDFS file system can work alongside, and in some cases replace, parts of a traditional environment, including the ETL component. At the very least, Hadoop can be tightly integrated with the conventional ecosystem and is likely to lower operating costs and improve performance. The relational database is not going away and will remain core to the modern data architecture, but Big Data is more than a promise and should be adopted as a key piece of the overall information architecture.
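To show what the Hadoop processing model looks like next to conventional SQL-style ETL, here is a toy map/shuffle/reduce aggregation in pure Python. The log records and field layout are invented; a real job would run the same three phases over HDFS files with MapReduce or Spark.

```python
from collections import defaultdict

# Toy log records standing in for raw files landed on HDFS:
# (date, department, event_count)
records = [
    ("2016-09-01", "claims", 2),
    ("2016-09-01", "quotes", 5),
    ("2016-09-02", "claims", 1),
]

# Map phase: emit (key, value) pairs from each record
mapped = [(dept, count) for _date, dept, count in records]

# Shuffle phase: group all values by key
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce phase: aggregate each group independently
totals = {key: sum(values) for key, values in groups.items()}
```

Because each phase works on independent chunks, the same logic scales out across a cluster, which is the property that lets Hadoop take over heavy transformation work from an overnight ETL window.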

Sep 14, 2016

Re-engage Business to BI

If you have observed users disengaging from your organization's business intelligence initiative, you need to take the right steps to regain their trust.
The following outlines why disengagement happens and how to rebuild business engagement.

Why is there disengagement?
·         Longer time to insights, slower performance, or loss of access to information
·         Changes in BI platforms, processes, team roles, and information access
·         Business anxiety about the unknown and unmet demand for analytical capabilities
·         Local BI teams may exist in business units and can be a major disruptive force to the corporate BI initiative

How to re-engage: key takeaways
BI and analytics leaders must evolve through five stages of re-engagement:
1.    Add BI with new data sources
2.    Layer BI with Analytics
3.    Enable access to self-service analytics
4.    Support decentralization by collaborating with the local BI teams
5.    Consolidate tools, skills and processes to achieve a unified corporate BI

1.   Add BI with new data sources.

Diagnostic:
·         Identify the data sources, in the data warehouse or reporting layers, that are missing or need customization to fulfill users' needs.

Actions:
·         Broader and easier access to data sources, including those not loaded into the data warehouse
·         Reduced need for highly optimized data models
·         Quicker and easier dashboard design
·         Faster turnaround for user requests through joint development workshops, where users sit next to the dashboard developer, providing requirements and feedback in real time as they see results.
·         Expand the range of data sources available on the BI platform. This is not easy with traditional development methods; the solution is to reduce the effort spent on data modeling and report design, which also increases user satisfaction through shorter time to insights. A data discovery tool helps achieve this goal: the data models required are simpler and less structured, with fewer performance optimization layers (such as aggregate tables or cubes), or can even be based on direct extracts from business applications and other sources.

Points to watch while adding new sources:
·         Less effort is spent on data integration.
·         Data must still be aggregated or properly optimized for queries and analytics processes.
·         The information management infrastructure will become more complex, with ad hoc or less structured processes sitting side by side with fully curated data integration processes.
·         A traditional ETL process is not the only option. Other approaches are robust enough for production, such as a logical data warehouse (LDW), a data lake, self-service data preparation, or loading data directly into the in-memory engine of a data discovery tool.
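As a sketch of that last option, a raw application extract can be loaded straight into an in-memory engine and queried with no ETL layer or star schema in between. Here pandas stands in for the discovery tool's engine, and the extract's columns and values are invented:

```python
import io

import pandas as pd

# Invented raw extract pulled straight from a business application;
# it lands directly in the in-memory engine, bypassing ETL entirely
raw = io.StringIO("""order_id,region,amount
1,EMEA,120.5
2,APAC,80.0
3,EMEA,45.25
""")

orders = pd.read_csv(raw)

# Ad hoc rollup computed on the fly, no aggregate tables or cubes needed
by_region = orders.groupby("region")["amount"].sum()
```

The trade-off is the one noted above: speed and flexibility now, at the cost of a less curated information management infrastructure.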

2.   Layer BI with Analytics


Myths in traditional BI teams

·         The business analytics platform must be a tightly integrated solution with as few components as possible — preferably from a single vendor — to deliver a single version of truth to the organization.
·         Information can only be trusted if stored in the corporate data warehouse and delivered to the information consumer using BI artifacts, such as reports or dashboards.
·         Information created or manipulated by business users will inevitably produce discrepancies across different analyses, leading to wrong decisions and generating chaos in the organization over time.
·         IT's responsibility for information management stops at the BI semantic layer and IT-driven content. Business-driven analytic processes are out of scope and not supported by IT.

New approach for success
A successful BI program needs to add an analytics platform and capabilities to evolve beyond the monolithic mindset.
The following tiered architecture layers business outputs over BI to meet analytical and advanced analytical needs. To realize the vision of the three tiers and maximize their strengths, the BI team needs to deploy new technical capabilities that provide missing analytic styles, improve the usage of existing tools through better overall integration, and provide common metadata and governance.
The platform tiers must work in conjunction. The table below shows the data feeds they share, which help create a coherent global BI, analytics and data science platform, and at the same time how the outputs differ to cater to different business needs, along with the various tools used to address them.

         BI Landscape            Analytics Landscape      Data Science Landscape
Input    Transaction Systems     Specific transactions    Detailed System Logs
         Ad Hoc Files            Social Media Sources     Audio
                                 3rd Party data           Image
                                                          Word, Text files
Output   Reports                 Data discovery           Predictive analytics
         Dashboards              Ad hoc queries           Simulations
         Mobile BI               Forecasting              Optimizations
                                 Location analytics       Big Data
Tools    SAP Business Objects    Tableau                  SAS Enterprise Miner
         IBM Cognos              Qlik                     IBM SPSS
         Oracle BI               SAP Lumira               R
         Microsoft SSRS          Microsoft SSAS           Cloudera
                                 Power BI                 Hortonworks

BI Landscape

The information portal is the workspace where business users can quickly and easily find the key trusted metrics with which the organization measures its performance. It is usually made of reporting and dashboard capabilities that provide content to information consumers.
Its outputs are the result of a formal development process in which a business user establishes requirements and a technical specialist implements them. This can take days, depending on complexity and workload. The information can be trusted and is used across the organization, but it offers low flexibility and limited interactivity.

Analytics Landscape
The analytics landscape is the workspace used to investigate trends on trusted metrics or to detect patterns in other datasets — from multiple sources — that may turn into opportunities or risks. It is an agile tier to explore information and has access to a broad range of data sources, with limited to no support from technical experts. Toolsets should include a data discovery tool and a number of other capabilities to help business users extract value from information autonomously.
In some cases — namely through the use of more analytics-focused data discovery tools — it can extend to a basic level of predictive analytics and will gain data modeling and more advanced analytic capabilities going forward.
Data Science Landscape
The data science laboratory is the workspace where advanced analytics takes place and is the ideal incubator for big data initiatives. It is a flexible environment where experimentation — with trial and error — is actually encouraged to generate impactful insights for the organization.
A broad set of technical capabilities is expected and often provided by specialized tools with minimal IT integration, meant to deliver agility and the ability to answer unforeseen questions. Users are skilled and experienced, often more than the technical experts in IT. Their toolsets include data mining capabilities, forecasting and other complex statistical and analysis tools.
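As a toy illustration of the forecasting capability in that toolset, here is a least-squares trend fit in plain Python. The series is invented, and real lab work would use R, SPSS or similar statistical tools:

```python
# Invented monthly metric; fit y = a + b*x by least squares
# and project one step ahead
history = [100, 104, 108, 113, 117]
n = len(history)
xs = list(range(n))

x_mean = sum(xs) / n
y_mean = sum(history) / n

# Ordinary least-squares slope and intercept
slope = sum((x - x_mean) * (y - y_mean)
            for x, y in zip(xs, history)) / sum((x - x_mean) ** 2 for x in xs)
intercept = y_mean - slope * x_mean

forecast = intercept + slope * n   # projected value for the next period
```

Even this tiny example shows why the lab needs freedom to experiment: the analyst chooses the model, tests it against history, and discards it when it fails, a loop that does not fit a formal report-development process.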

3.   Enable access to self-service analytics.

Diagnostic:
·         Identify users analyzing information in Excel and assess their skill levels in information exploration and analytics.
·         Rank the most widely exported datasets to define priorities in self-service analytics data models.

Actions:
·         Data discovery capabilities, including content authoring and ad hoc analysis, must be provided to business users. From the BI team perspective, there will be reductions in data modeling and report design workloads, which will free resources to properly support data discovery in the analytics workbench.
·         Connect the analytics workbench to the existing data warehouse as extensively as possible. If needed, customize and document datasets to make them data discovery-friendly.
·         Provide training and access to the data discovery tool for those requiring authoring or ad hoc information exploration capabilities.
·        Optimize data integration and transformation processes.
·         Start using self-service data preparation tools within the BI team; this will help accelerate the delivery of ad hoc datasets for user-led information exploration processes.
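In practice, a self-service preparation step often amounts to a short cleanup script run before exploration. Here is a pandas sketch over an invented messy extract (column names and values are hypothetical):

```python
import pandas as pd

# Invented messy extract a business user wants to explore
raw = pd.DataFrame({
    "customer": ["Acme", "acme ", "Beta", None],
    "revenue": ["1200", "1200", "850", "300"],
})

prepared = (
    raw.dropna(subset=["customer"])        # drop rows with no customer
       .assign(
           # normalize casing/whitespace so variants collapse together
           customer=lambda d: d["customer"].str.strip().str.title(),
           # fix the type: revenue arrived as strings
           revenue=lambda d: d["revenue"].astype(int),
       )
       .drop_duplicates()                  # collapse repeated records
)
```

Handing this kind of step to trained users is what frees the BI team from the data modeling workload described above.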


4.   Support decentralization by collaborating with the local BI teams.

Diagnostic:
·         Catalog local data repositories, their data sources, ETL processes and developers.
·         Identify the analytics processes and insights generated and automated by users.
·         Assess users' skill levels with the data warehouse data models and querying capabilities.

Actions:
·         Deploy members from the centralized BI team within business units to improve relations with the business and get a better understanding of their challenges and objectives.
·         Work with business units to identify BI and analytics experts across the organization. Determine available skills and skills gaps and arrange a training program for them. Such a program must include analytics at different levels of specialization and data preparation skills.
·         Start best-practice sharing forums through regular meetings with users that will help build a community effect around BI and analytics.
·         Make local data constructs such as user-built data marts or data extract scripts part of the global information management infrastructure. Consider data federation capabilities to blend user-built data artifacts with corporate data repositories.
·         If needed, include write access, querying and broader access to data. Bear in mind that if users feel constrained, the BI team will be discredited and users will find ways around it again.
·         Deploy self-service data preparation capabilities, data integration and transformation tools for more skilled business users. As with the data discovery tools, this will reduce the workload for the BI team — in the data modeling space — and allow it to focus on higher value-added tasks.
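The data federation idea above, blending a user-built artifact with a governed corporate repository, comes down to a keyed join across the two. A pandas sketch, with invented tables and columns:

```python
import pandas as pd

# Governed corporate warehouse dimension
corporate = pd.DataFrame({"customer_id": [1, 2, 3],
                          "segment": ["SMB", "Enterprise", "SMB"]})

# User-built local extract (e.g. a departmental data mart)
local = pd.DataFrame({"customer_id": [1, 2, 3],
                      "nps_score": [45, 60, 30]})

# Federated view: enrich the corporate data with the local artifact,
# keeping all corporate rows even where the local extract has no match
blended = corporate.merge(local, on="customer_id", how="left")
```

Folding local constructs in this way keeps them visible to the global information management infrastructure instead of leaving them as shadow IT.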

5.   Consolidate tools, skills and processes into a unified corporate BI initiative.


Diagnostic:
·         List the BI and analytics platforms deployed by users and the capabilities they offer.
·         Identify overlaps, gaps and integration areas between the corporate BI platform and the users' tools.

Actions:
·         The BI team must refrain from following the general rule of replacing users' tools with the one provided by the corporate BI vendor. Instead, it should run assessment processes with the help of business users, including proof-of-concept experimentation, to select the standard tools for the organization.
·         Evolve the information management infrastructure to a logical data warehouse.
·         Unify analytics governance and processes.
·         Further develop the community effect by cloning successful solutions in other business areas or applying them to different business problems. If possible, go as far as rotating experts among business units, with their and HR's support.
  
