
Covid-19 accelerated digital transformation for enterprises in all industries — and today, cloud initiatives are at the core of that transformation. Enterprises are likely to experience mixed success with their cloud initiatives in the medium term as they manage challenges such as severe skill shortages, evolving data architecture practices and myriad cloud vendors with similar offerings.

Drawing from our company’s experience of implementing more than 50 cloud initiatives across industries, here are my top five recommendations for business and IT leaders planning significant data and analytics (D&A) initiatives in the cloud.

1. Not everything that can be moved should be moved to the cloud.

Cloud migrations involve significant capital expenditures (CAPEX). In my experience, when you migrate old data applications to the cloud, you should not expect to see operating expense (OPEX) savings for as long as three years. In many cases, not all layers are migrated together due to interdependencies with other systems, leading to a hybrid approach that combines on-premises, private and public cloud hosting.

Carefully evaluate the suitability and need to migrate the following categories of applications to the cloud:

  • End-of-life legacy applications or tool platforms
  • Applications built on, or replaceable by, comparable SaaS (Software as a Service) tools
  • Applications accessing data that requires stringent data security and privacy
  • Specific hardware-dependent applications

2. Plan to embrace a multi-cloud future.

The three leading public cloud players, Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform (GCP), are adding capabilities, services and geographic locations at a rapid pace. Most of their comparable services match each other in cost and performance, and with no consolidation in sight, you can benefit from their competition.

Each of these cloud vendors also provides a few differentiating services. To enable the creation of cutting-edge data and analytics solutions, aim to leverage the best services available, regardless of which vendor provides them. For example, one of our clients — a leading media and entertainment company — uses a multi-cloud setup with AWS infrastructure and select AWS services for its data apps, Azure for email services and cloud-native PaaS platforms like Domo and Snowflake for analytics.

Within your organization:

  • Discourage investment in a single cloud vendor
  • Promote a culture of looking for the best services, comparing capabilities and costs across cloud vendors
  • Encourage technical teams to design data architectures that seamlessly use cross-cloud capabilities (a minimal sketch of such a cross-cloud data flow follows this list)
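
As an illustration of the last point, here is a minimal, hypothetical Python sketch of a cross-cloud data hop that reads an object from AWS S3 and lands it in Google Cloud Storage using each vendor’s official SDK. The bucket and object names are placeholders for illustration only, not a prescription for any specific architecture.

```python
# A minimal, hypothetical sketch of a cross-cloud data hop:
# read an object from AWS S3 and write it to Google Cloud Storage.
# Requires: pip install boto3 google-cloud-storage, plus credentials
# configured for both clouds (AWS profile / GCP service account).

import boto3
from google.cloud import storage


def copy_s3_object_to_gcs(s3_bucket: str, s3_key: str,
                          gcs_bucket: str, gcs_blob_name: str) -> None:
    """Copy a single object from S3 to GCS via an in-memory buffer."""
    s3 = boto3.client("s3")
    payload = s3.get_object(Bucket=s3_bucket, Key=s3_key)["Body"].read()

    gcs = storage.Client()
    blob = gcs.bucket(gcs_bucket).blob(gcs_blob_name)
    blob.upload_from_string(payload)


if __name__ == "__main__":
    # Placeholder bucket and object names.
    copy_s3_object_to_gcs("raw-events-aws", "2024/01/events.parquet",
                          "analytics-landing-gcp", "events/2024-01.parquet")
```

In practice an orchestration tool would schedule such transfers, but the point stands: nothing stops a well-designed pipeline from spanning vendors.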

3. Don’t let security be an afterthought.

According to the Verizon Data Breach Investigations Report (DBIR), most cybersecurity incidents now involve cloud infrastructure. We can expect the threat of data breaches to grow in the foreseeable future, and the responsibility for increasing security protections lies with enterprises.

In our work, we have seen that most cloud initiatives, especially enterprises’ early endeavors, try to address security requirements through native services. However, due to inadequate design, these solutions fall short of addressing all risks. Thankfully, a number of third-party solutions are available to address this critical gap. Use these tools to:

  • Carefully assess security requirements
  • Invest early in holistic security solutions
  • Conduct frequent vulnerability scans

A global bank that we work with has implemented a unified data-centric security model with sensitive data-flow discovery, real-time monitoring, behavior analytics and protection across all operational and analytical applications (both on-premises and on-cloud).
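
To make the “frequent vulnerability scans” recommendation concrete, below is a minimal, hypothetical Python sketch of one narrow, automatable check: flagging S3 buckets that lack a full public-access block. It illustrates automating a single control and is not a substitute for a holistic security solution.

```python
# A minimal, hypothetical sketch of one automated cloud security check:
# list S3 buckets and flag any without a full public-access block.
# Requires: pip install boto3, with AWS credentials configured.

import boto3
from botocore.exceptions import ClientError


def buckets_missing_public_access_block() -> list:
    s3 = boto3.client("s3")
    flagged = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            config = s3.get_public_access_block(Bucket=name)[
                "PublicAccessBlockConfiguration"]
            if not all(config.values()):
                flagged.append(name)
        except ClientError:
            # No public-access block configured at all.
            flagged.append(name)
    return flagged


if __name__ == "__main__":
    for name in buckets_missing_public_access_block():
        print(f"Review bucket configuration: {name}")
```

A script like this, run on a schedule alongside third-party scanning tools, is one small building block of the holistic approach described above.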

4. Monitor all D&A solutions through a unified platform.

Given the nature of cloud services, any data and analytics platform migrated to the cloud gets decomposed into many independent solutions. While this offers advantages, such as no single point of failure and scalable performance, managing multiple platforms can be complex. In case of service level failures, it can be difficult to ascertain the root cause, replay the sequence of events and recover from the failure.

DevOps staff supporting disparate platforms need to invest significant effort in scanning consoles of multiple services for any meaningful analysis — post mortem or change impact. It is highly likely that components of such systems will drift away from the initial architectural vision. To avoid this outcome, push for:

  • Holistic assessment of current and future monitoring requirements
  • Early investment in a comprehensive monitoring solution
  • Frequent “game day” drills to test responses across both processes and people

A global market research firm we work with uses a centralized monitoring platform to track its infrastructure, databases, analytical apps, workflows and security. This gives the firm a 360-degree, single-pane view of its data and analytics ecosystem and improves operational efficiency.
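
As a toy illustration of the single-pane idea, the hypothetical Python sketch below polls health endpoints of several independent services and rolls them up into one status view. Real monitoring platforms do far more, but the aggregation principle is the same; the service names and URLs are placeholders.

```python
# A toy, hypothetical sketch of "single-pane" monitoring: poll the health
# endpoints of several independent services and print one rolled-up view.
# Requires: pip install requests. Service names and URLs are placeholders.

import requests

SERVICES = {
    "ingestion-api": "https://ingest.example.com/health",
    "warehouse-loader": "https://loader.example.com/health",
    "dashboard-backend": "https://dash.example.com/health",
}


def collect_statuses(services: dict) -> dict:
    statuses = {}
    for name, url in services.items():
        try:
            response = requests.get(url, timeout=5)
            statuses[name] = "UP" if response.ok else f"DEGRADED ({response.status_code})"
        except requests.RequestException as exc:
            statuses[name] = f"DOWN ({type(exc).__name__})"
    return statuses


if __name__ == "__main__":
    results = collect_statuses(SERVICES)
    for name, status in results.items():
        print(f"{name:20s} {status}")
    overall = "HEALTHY" if all(s == "UP" for s in results.values()) else "ATTENTION NEEDED"
    print(f"\nOverall: {overall}")
```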

5. Aim for an accelerated pace of innovation through the cloud.

For most enterprises, the first set of cloud initiatives includes migrating existing data and analytics applications to a cloud platform. Whether as-is (lift and shift) or re-engineered, these types of migrations don’t change the status quo dramatically.

But there is a constantly expanding set of cloud offerings that covers capabilities like IoT, blockchain, data science, machine learning, media, quantum, robotics, satellite, VR and AR. Explore how your organization can use cloud initiatives to power innovation. How effectively you do this will prove to be a competitive advantage in the Industry 4.0 era.

There are also countless focused solutions available on cloud marketplaces that significantly reduce the cost of experimentation. Take advantage of these cost-effective tools and encourage:

  • A culture of innovation with cloud at the center
  • A risk appetite based on leveraging cloud offerings and marketplace solutions
  • Thinking “cloud first” before costly in-house development of new solutions

Your organization has probably moved its first set of data stores and front-end analytics apps to the cloud with varying degrees of success. Enterprises that don’t see measurable positive outcomes from their first cloud initiatives tend to delay the rest of their cloud adoption journey. Don’t fall into the same trap. Cloud initiatives will continue to be a critical ingredient for future business capabilities. By finding the right solutions and engaging the right partners, you can set your organization up to make well-informed choices, develop pragmatic roadmaps and avoid the pitfalls that lead to failure.


This article was first published on Forbes.com.

In the digital economy, data is our most valuable resource. “Data is the new oil” is now a popular saying, and it’s an apt metaphor. Companies that learn to extract and use data efficiently have the potential for huge rewards, driving profits, innovation and customer satisfaction.

Just like with oil, quality matters in data. Quality isn’t free; it requires time, money and attention, but it’s worth the investment.

What Is Good Data? 

Good data quality means the data is fit for use based on the business context and requirements. The business rules governing quality include both quantitative (e.g., “Is the date legitimate?”) and qualitative rules (e.g., “Is the date captured in the American notation?”). In addition, expectations of users regarding availability, reliability, usability, relevance and clarity may lead to perceptions of quality issues. Data quality initiatives need to address various aspects to ensure that data is trustworthy.
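
To make the distinction concrete, here is a minimal, hypothetical Python sketch of the two kinds of rule mentioned above: one checking that a date value is a legitimate calendar date, the other checking that it is captured in the expected American (MM/DD/YYYY) notation. The sample values are illustrative.

```python
# A minimal, hypothetical sketch of two data quality rules on a date field:
# (1) is the value a legitimate calendar date, and (2) is it captured in
# the expected American MM/DD/YYYY notation? Sample values are illustrative.

from datetime import datetime


def is_american_notation(value: str) -> bool:
    """Rule: the date string follows MM/DD/YYYY notation."""
    try:
        datetime.strptime(value, "%m/%d/%Y")
        return True
    except ValueError:
        return False


def is_legitimate_date(value: str) -> bool:
    """Rule: the value parses to a real, non-future date."""
    try:
        parsed = datetime.strptime(value, "%m/%d/%Y")
    except ValueError:
        return False
    return parsed <= datetime.now()


if __name__ == "__main__":
    for sample in ["02/29/2021", "12/01/2020", "2020-12-01"]:
        print(sample,
              "american-notation:", is_american_notation(sample),
              "legitimate:", is_legitimate_date(sample))
```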

Think about global data related to Covid-19 vaccinations. Reliable data must include a patient’s date of birth, the date of each dose, type of vaccine, number of required doses and location of each dose. Quality issues become complicated when you consider that some people received doses from different vaccine manufacturers, or that data may have been captured in different states or countries. Poor data quality can prevent various stakeholders — including public health experts, vaccine advocates and the general public — from making informed decisions. If people perceive vaccine data to be unreliable, they may become more hesitant to get vaccinated, which ultimately damages public health outcomes.

The Cost Of Bad Data 

In 2016, an IBM study estimated that bad data costs the U.S. economy $3.1 trillion a year, and in 2020, a Gartner survey found that organizations calculated the average cost of poor data quality at $12.8 million a year.

In my experience leading a global data and analytics (D&A) solutions provider, I’ve seen that while everyone cares about data quality in theory, when it comes to actually making funding decisions, many customers want to cut corners.

But this is where the rubber meets the road. If you don’t finance data quality initiatives, then you won’t get the result you want. Poor quality data can lead to flawed decision making, top- and bottom-line consequences and decreased employee and customer satisfaction. Incomplete data may result in ineffective marketing campaigns, and a data breach can cause reputational damage or leave you vulnerable to litigation under laws like GDPR or CCPA.

Six Common Challenges In Data Quality Improvements

Improving data quality in your company will have significant long-term benefits, but you must be proactive. Here are six of the most common challenges to be aware of when improving data quality:

1. Lack of awareness: Because data is an intangible asset, it’s often hard to assess quality and address problems. Your stakeholders may not fully appreciate the state of data quality in your systems until a major issue impacts your business.

2. Difficulty justifying investments: To get buy-in on improving data quality, you need to be able to make a solid business case for it, showing how poor-quality data has had negative consequences in the past. But frontline staff may not be willing to document quality issues to build a future business case for something like automation, preferring instead to resolve issues manually.

3. Confusing shared responsibility with IT responsibility: Enterprise data is often used across multiple business units, moving through line-of-business systems into reporting and analysis systems. Quality ownership is delegated to IT as data flows through various pipelines, yet IT is not fully responsible for the source systems. Data quality demands shared responsibility.

4. Resistance to change: Data quality programs are heavily focused on continuous improvement, which calls for users in your organization to adopt new behaviors and perform additional checks and balances with discipline. If employees are unwilling to adapt, you will run into obstacles.

5. Fear of transparency: Data quality assessments and real-time quality dashboards may make some leaders uncomfortable. Looking into past data decisions and results may cause some in your organization to feel embarrassed or concerned, creating yet another roadblock.

6. Lack of sponsorship: Data quality initiatives often compete with new technology and product management investments. It can be tempting to throw money at a shiny new object, spending $X on a cloud computing platform instead of the same amount for a data governance consultant’s annual fee. Data quality is less glamorous and often loses out to modernization initiatives.

Five Ways To Improve Data Quality 

Once you’ve identified the challenges you’re facing, here are five actions you can take to address them:

1. Sponsor beyond budgets: To achieve successful data quality initiatives, you must be willing to invest more than dollars. Articulate the importance of data quality for the organization, inspire cross-functional collaboration, prioritize progress and hold frontline managers accountable for long-term success.

2. Invest in data quality and transparency tools: As the volume, variety and velocity of data increases, you need more specialized tools for quality management. Invest in software that automates quality management for profiling, catalogs, metadata and lineage.

3. Adopt DataOps automation practices: DataOps applies DevOps practices from the agile software engineering domain to data analytics. Integrate these practices to promote collaboration, communication and automation, bringing business, applications, operations and data teams together to speed up feedback cycles for quality issues and reduce cycle times for delivering D&A products (a minimal sketch follows this list).

4. Embrace a stewardship culture: To create a data-driven culture, you need to incorporate stewardship in your organizational norms and habits. This is a shared responsibility among various stakeholders and includes identifying and articulating quality issues and following governance practices to resolve issues.

5. Build a data management team: Data quality management requires dedicated skills, roles and positions depending on the scope, size and complexity of your organization’s data. Hire the people who can evaluate, design and execute the solutions you need, including Chief Data Officers, data analysts, business and technical stewards, tool experts and data management consultants.
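
To ground points 2 and 3 above, here is a minimal, hypothetical Python sketch that combines a lightweight pandas profile (null rates and distinct counts) with pytest-style checks that a DataOps CI/CD pipeline could run after each data load, failing the build when basic expectations are violated. The file path, column names and thresholds are illustrative assumptions, not a specific product or standard.

```python
# A minimal, hypothetical DataOps-style quality gate: a lightweight pandas
# profile plus pytest checks that a CI/CD pipeline could run after each data
# load. File path, column names and thresholds are illustrative assumptions.

import pandas as pd
import pytest


@pytest.fixture(scope="module")
def orders() -> pd.DataFrame:
    # Placeholder extract produced by the pipeline under test.
    return pd.read_csv("orders_latest.csv")


def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Per-column null rate and distinct-value count."""
    return pd.DataFrame({
        "null_rate": df.isna().mean(),
        "distinct_values": df.nunique(),
    })


def test_null_rates_within_threshold(orders):
    report = profile(orders)
    assert (report["null_rate"] <= 0.01).all(), f"Null rates too high:\n{report}"


def test_order_ids_are_unique(orders):
    assert orders["order_id"].is_unique, "order_id must be unique"


def test_data_is_fresh(orders):
    latest = pd.to_datetime(orders["order_date"]).max()
    assert latest >= pd.Timestamp.now() - pd.Timedelta(days=1), \
        "extract should be no more than one day old"
```

Wiring checks like these into the delivery pipeline means bad data fails the build rather than reaching users, which is the essence of the DataOps feedback loop described above.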

Investing in data quality is a long but worthwhile journey. Evaluate your company’s needs and challenges, and invest in the practices, tools and leadership required to advance valuable data quality initiatives.


Data is the most important asset for any organization and is the key to its growth and success. Analytics, which is at the heart of any digital transformation initiative, empowers new-age organizations to leverage data effectively and make smart decisions.

Every organization today is investing significantly in Analytics, and it’s a core area of discussion in board meetings. But then why do so many Analytics projects fail to make an impact? While there are many reasons for failure, here are the top six that you should be aware of before kicking off any analytics project.

  1. Lack of Management Vision & Support:

    Top management sets the tone for any organizational initiative, and Analytics is no different. A visionary management team will support the need to build a data-driven culture within the organization.

    • Humans have a tendency to resist change. If analytics programs are kicked off just to follow the market trend, without an underlying business strategy, they are bound to fail. Business leaders should include analytics as part of their vision and handle the change management effectively through inclusivity and trust-building measures.
    • Analytics projects are traditionally hard to implement. They typically start with a Proof of Concept (PoC). Bringing analytics projects from conceptualization to production requires that everyone involved in the program clearly understands the business goals. We have seen a crazy race to launch Analytics programs, and many of these projects go down due to a lack of management vision and top-down support.
  2. Poor or Missing Data:

    Data-driven organizations have survived the most challenging conditions because they believed in their data and utilized it to the fullest to accomplish their strategic goals.

    • It is important to continuously check the quality of data and make necessary changes to the data cleansing routines as required. Make data quality routines flexible so that they can continuously adapt to new rules or sources of data.
    • Business processes are becoming more and more complex, and end users are continuously looking for new product lines and real-time assistance. Understanding the data required for any analytics model is the key to success.
    • The idea is to bring in the right data by effectively understanding the business requirements and baselining the processes. That data should power analytics engines effectively, resulting in actionable insights with greater impact.
    • Good data will aid the success of any analytics project; on the contrary, a poor choice or poor quality of data will make the task difficult and can ultimately sink the entire analytics investment.
  3. Missing Analytics Governance:

    Most Analytics projects are built as small PoCs/MVPs, and once these succeed, business/product owners demand more use cases on top of them without strengthening Analytics Governance processes.

    • Analytics governance should go hand in hand with the very first Analytics project and should not be set aside for future enhancements.
    • Security should be a day-zero priority and should not be ignored even when you are delivering a small PoC. Put effective role and access management in place so that trust is not compromised at any level.
    • Ignoring Analytics Governance can result in multiple points of failure, such as security breaches or degraded performance, which may damage the organization’s reputation and derail the entire analytics initiative.
  4. Slow Time to Market:

    The competition out there is furious, and if you cannot deliver on time, someone else will.

    • We have seen cases where Analytics programs were delivered but failed to catch on with consumers because a competitor was ahead in the game.
    • Time to market from ideation to finished analytics should be a short cycle, and the idea should be to deliver results to the business quickly through short MVPs.
    • From a technology point of view, use the power of the cloud to eliminate infrastructure bottlenecks. This will not only shorten the complete cycle, but also let you come back to the drawing board quickly in case of any inadequacies. Adaptation is the key to success, and the feedback loop should be kept open.
    • Create a lean project management schedule even if it is a PoC. This not only helps in tracking activities closely, but also lets any bottlenecks be managed immediately.
    • Have simple visualizations on top of your analytics output that can be easily understood by the business teams. Many times we have seen end users who consume analytics results fail to interpret the output, and ultimately the UAT cycle is stretched.
  5. Missing Appropriate Team members:

    While it is okay to have cross-trained team members on your analytics team, have at least one strong resource who understands the technology, is curious about what the data might reveal about the business question analytics is trying to answer, and can provide the required Data Science & Machine Learning leadership.

    • Working on an Analytics project without business stakeholders’ active engagement can lead to a weak hypothesis and ultimately stretch the time to market.
    • Keep business stakeholders on your team as product owners from day 1, even for a small PoC. They can be owners of the business process that you are trying to simplify. Having them on your team reduces the chance of failure, as they can continuously provide feedback to improve your model.
  6. Ignoring Regulatory compliance requirements:

    Businesses may have to pay heavily if regulatory compliance requirements are missed.

    • While analytics is rewarding for an organization, it may become a high-risk issue if it is not organized and compliant with regulatory laws.
    • Most of the time, Analytics projects are kicked off in a rush and delivered technically without considering the regulatory and compliance requirements for that region or business. For example, if you are working for a European client, GDPR is something you cannot ignore; otherwise, you will never be able to productionize your model, even if it is technically superb.



This article was first published on Medium.

 

On Groundhog Day, February 2, 2021, Punxsutawney Phil emerged from his snowy burrow to predict six more weeks of winter! “We have all passed through the darkness of night but now see hope in morning’s bright light. But now when I turn to see, there’s a perfect shadow cast of me. Six more weeks of winter there will be,” narrated Phil’s top-hatted handler.

If you are an executive responsible for Data & Analytics Modernization in your firm, you may well be living what Phil Connors, played by Bill Murray, lived through in the 1993 classic “Groundhog Day”–a time loop. Except you find yourself in what I call “perpetual modernization cycles”–a result of the exponential growth of different forms of technological progress over the past two decades–something suggested 20 years ago by American futurist Ray Kurzweil in his renowned essay, “The Law of Accelerating Returns.”

The Modern Groundhog Day – Start of Perpetual Modernization Cycles

As we embark on a new decade, I can’t help but think about what IT executives will have to deal with in the coming years. We already face relentless pressures to modernize. Business is creating more demand for speed, agility, and capabilities; while, IT is facing supply pressures due to changing technology, ever-growing data sources, and security threats. According to Gartner, by 2025, 90% of current applications will still be in use, and most will continue to receive insufficient modernization investment.

To address modernization, companies are using approaches that look something like Rehash, Rehost, Rebuild, and Replace using an effort and value trade-off. There is no one-size-fits-all approach and you are likely trying multiple options, like what Phil tried, using a combination of technology, functionality, and architecture adaptations to deal with your situation. But even before you finish your current project, you are already facing pressures either from the demand- or supply-side. Welcome to the Modern Groundhog Day!

During my 20-year experience supporting enterprise customers in the public and private sectors, I have lived through multiple cycles myself. For example, at the Department of Health and Human Services, I supported multiple versions of web-based systems used to collect, report, and analyze clinical, performance, and administrative data from over 13,000 health center service delivery sites providing primary care to over 25M patients nationwide. At Infocepts, we are modernizing data-driven capabilities for multiple retail, media, data syndication, financial, and health companies across the globe. Whether it is preparing customers for the Olympics, or liberating them from legacy monolithic BI platforms, or using AI-driven insights to connect retail businesses with consumers–we are constantly driving modernization initiatives.

Like Phil Connors, we evaluate each experience and apply our learnings to future endeavors, improving outcomes for our customers. Here I share three learnings and ideas for dealing with the Modern Groundhog Day.

Three Implications of the Perpetual Modernization Cycles

The scope of modernization programs is becoming broader. Companies are no longer looking at “lift and shift” approaches to upgrade technologies; instead, they are becoming more mindful of revisiting end-to-end business needs to deliver the intended benefits, such as improving customer experiences, reducing TCO, or improving operational effectiveness. But that means more decision points along the way, often beyond the IT organization, and thereby more time to execute. Agile techniques and modern technologies such as cloud and low-code platforms help reduce cycle times and time-to-value, but large-scale legacy modernization programs often span multiple calendar years, despite what vendors want you to believe.

  • 1. Longer cycles mean business keeps waiting for intended benefits

    Proponents of agile techniques–and I am one of them–often advise clients to focus on some variation of “minimum operating capability.” After the initial discussion on scoping and prioritization, the program management leadership focuses on the question, “What is the minimum capability necessary in production to see value?”, and allocates resources to get there. But that isn’t the finish line, and success shouldn’t be measured on that metric alone.

    An Infocepts data syndication customer operates large-scale data factories; they are in the process of modernizing them–a few processes at a time. But the consumers of these factories residing in distributed business units are facing massive operational pressures to cut their annual operating budgets. The challenge is that the processes that haven’t been modernized continue to drain resources and perhaps the program management leadership is oblivious to that.

    So, the key takeaway here is to measure executive success of modernization programs based on the initial time-to-value, incremental cycle times, and total time-to-value with corresponding TCO numbers.

  • 2. Organizations are accumulating redundant capabilities faster than they can retire their legacy ones

    To take advantage of technology evolution, companies are adopting new systems for engaging their customers and/or new systems for innovation while maintaining their systems of record. In a rapidly shifting market, vendors are offering niche capabilities to create differentiation to acquire customers. This is particularly true in the analytics landscape where platforms such as Tableau penetrated from the front-end visualization layer, closer to business users, and then organically matured toward established enterprise platforms such as MicroStrategy and Business Objects. The same is true now with products such as ThoughtSpot that are focused on search-based analytics, continuing to bridge the gap between the end-user and insights. This trend coupled with inorganic changes to customer data & analytics landscape due to internal restructuring or M&A results in multiple redundant capabilities.

    An Infocepts media customer ended up with six enterprise capabilities, including MicroStrategy, Tableau, Qlik, Power BI, Business Objects, and Looker! Clearly, they needed to eliminate redundancy and save costs. Solving such a problem requires a few months for rationalization, then migration, and ultimately timely decommissioning of redundant capabilities. It took our team 12 months to cut down the number of systems from six to three and save the customer $750K in annual costs.

    The key takeaway from this is to maintain up-to-date information on current tools to support future rationalization efforts and tie decommissioning to project success. Maintain basic information such as the core purpose of the tool, usage data, financial data, and intangibles such as customer expectations to speed up rationalization efforts and to inform critical cutover milestones based on software renewal dates.

  • 3. Leaders (and teams) are grappling with multiple responsibilities

    Regardless of how you are organized, IT executives typically deal with responsibilities including strategy, experimentation, development, operations, and adoption. For any modernization project, you must address these elements–how you choose to do so may vary. Factors that play into how leadership and teams are organized include the need for legacy knowledge, the aspirations and interests of leaders and team members, and long-term vendor agreements and relationships. But you can see how quickly the number of responsibilities multiplies, with the result that teams start to locally optimize for their available capacity and the overall velocity suffers.

    For example, data migration requires knowledge of business rules prevalent in the legacy system; teams want to work on newer technologies and not be boxed into maintaining legacy tools; a leader may be interested in demonstrating results for their growth; and companies may be locked into master service agreements with global system integrators to gain preferred rates, with penalty clauses for deviating! These dynamics hamper modernization success, and the longer the cycles last, the more complex they become. The result is that customers pay more for their modernization efforts than initially forecast.

    The key takeaway here is to ensure that companies take an end-to-end approach for modernization outcomes and establish cross-functional leadership teams to devise and execute the strategy to minimize cognitive biases from influencing enterprise roadmaps. Modernization is more than a technology or capability upgrade.

Three Ideas to Deal with Perpetual Modernization

As companies deal with multiple modernization cycles, I anticipate them codifying their holistic approach in an Enterprise Modernization Life Cycle (EMLC), much like the Software Development Life Cycle (SDLC). The key is to connect the EMLC to an enterprise roadmap that advances them along their pursued business strategy. If you do these two things, you will be able to find comfort in its predictability and take advantage of it, much like Phil made use of each of his new days once he found his purpose!

  • 1. Don’t reinvent the wheel

    With the proliferation of cloud-based “services”, it is becoming harder to quickly put them together to construct enterprise capabilities. This is in part due to the paradox of choice and our desire to make the right choice without negative repercussions. Some vendors are making it easier for businesses by offering “Anything-as-a-Service.” But if you are responsible for advancing your enterprise roadmap, you can look for building blocks in the form of “turnkey projects” or “reusable capabilities” and compose your bespoke solutions rapidly. In this case, you are counting on the experiences and learnings from other parts of your organization, or from the market, to reduce your time, costs, and risks.

    Consider a situation where you want to find an alternative to your legacy analytics enterprise platform and move to a modern one. This is likely not a very “common” project for your team. In such situations, you may look to experts that do turnkey migrations with precision, enabling you to focus on making critical choices, supporting current operations, and preparing for the cutover. As an example, Infocepts offers a BI migration solution that consists of a cross-functional team with a proprietary methodology and tool, refined over the successful execution of hundreds of such projects, to guarantee your success. Similarly, we have a solution to move workloads from legacy data warehouses such as Netezza to Snowflake within a few weeks.

    You may find such solutions within your own organization as well. For example, at one of our customers, teams from different business units are sharing foundational capabilities for a CI/CD DataOps pipeline so they can augment it, rather than starting from scratch.

  • 2. Get your teams out of the DIY mindset

    You should encourage your team to think outside the box when making progress on modernization initiatives. This is to overcome situations where the same team signs up for several capabilities and then prioritizes them locally. Teams with highly capable technical talent often think they can learn new technology quickly and then implement it at scale. Or, as mentioned earlier, some companies may be constrained by their master service agreements with penalty clauses. Regardless of your situation, you should evaluate your team’s current capacity and focus, and what you need to accomplish toward your modernization, and determine how to allocate responsibilities creatively.

    For example, one of our customers wanted to free up the capacity in their operational team toward modernization. They engaged Infocepts to do a turnkey RPA automation using UiPath to reduce both the time and effort to execute quality assurance lifecycles for their report factories. In just 3 months, we helped reduce the QA effort from 9 FTEs to 3 FTEs while increasing their coverage to 100%.

    At another customer, our team helps in experimentation while they keep the enterprise implementations closer to their core team. This could be a way forward to try out multiple options for newer capabilities such as Data Science adoption. The reason is that a single team or organization likely does not have the time or resources to exploit the breadth of scenarios during evaluation to help them make informed choices.

  • 3. Consider alternative solutions

    The last idea prompts leaders to challenge their teams to explore alternative solutions to costly modernization endeavors. Cohesive agile teams sustained for long durations may get very productive, but there is a risk of them succumbing to groupthink. Let me illustrate with two examples.

    Consider a need within your organization to provide an analytics hub to make analytics assets from different capabilities more accessible to your users. If your team is used to building custom solutions or is currently supporting a custom portal, their default thinking would be to add features to their portal or build a new one. In such situations, you must challenge them using the build vs. buy rubric. For example, solutions such as Metric Insights and ZenOptics can provide hubs in a few weeks rather than going through a costly and risky development endeavor.

    Or consider a situation where you are looking to optimize long-running ETL jobs designed several years ago. Again, if your team is used to traditional warehousing approaches, they may Rebuild the ETL packages with automation. But such an approach may miss out on the Replace option using emerging technologies such as Incorta that enable direct connections to data sources with real-time aggregations and in-memory transformations.

When Will Modernization End?

Kurzweil may well be on to something. We now know for a fact that technology is changing rapidly. But for that rate of change to translate into accelerated returns for our businesses, we all need to think differently going forward. Rather than looking to modernize quickly, we need to find ways to remain modern. This can be done by adopting evolutionary design thinking, a growth mindset, agile leadership, and reliance on proven reusable assets to chart our own course. Maybe, like Phil, we will wake up one fine day and find that we’ve transcended the modernization loops.


This article was first published on Forbes.com.

 

According to McKinsey’s 2020 global survey of 1,216 executives and managers across financial, retail, health, automotive, and telecom industries, 87% of respondents said their organizations are either already facing skill shortages or they anticipate skill shortages within the next five years. The impact of such talent gaps will vary by industry, with healthcare, pharmaceutical, and medical products industries least likely to be disrupted, and financial services, tech, and telecom industries most likely to undergo a serious step change.

Forty-three percent of the survey’s respondents said they expect data and analytics (D&A) to be the area with the greatest skills gap, making D&A the most prominent area of concern, followed by IT management and executive management. Clearly, D&A leaders are under substantial pressure to deliver transformative results in today’s economy. The demand created by skill shortages is only exacerbated by rapidly expanding data, technology, and digital economies.

In order for data & analytics leaders to deliver the transformation their organizations depend on, it’s crucial to connect the organization’s data & analytics roadmap to its corporate strategy in a way that’s both affordable and sustainable. If you are a leader responsible for data & analytics strategy and execution in your organization, this may mean you’re also responsible for getting buy-in from your CEO.

As the co-founder and CEO of a global top 40 D&A solutions provider, I’ve found success with three concrete strategies for accelerating the execution of an enterprise data & analytics roadmap by securing the right leadership, lowering TCO (total cost of ownership), and pursuing speed of execution to achieve intended outcomes.

Build sustainable teams for data & analytics outcomes.

Imagine you are responsible for enterprise D&A platforms at an organization. Chances are, you’re dealing with multiple complementary stacks of cloud and analytics technology, which you may have inherited as a result of inorganic growth. As your organization adapts to these changes and/or moves from on-premise to cloud infrastructure, hiring specialists may not be sustainable – not only is it hard to find and recruit these unicorns, but they’re extremely expensive at scale.

So how do you go about building a sustainable team? One approach is to build a team of generalist leaders supported by competent advisory teams with access to a pool of specialists who can offer best-in-class advice as needed. This arrangement affords you the best of both worlds.

Embrace simplicity, self-service, and automation.

Organizations commonly spend north of 20% in annual maintenance fees covering support, upgrades, and peace of mind, without asking if their fixed investments are actually worth the cost. On the flip side, simplicity, automation, and self-service are three pillars of a simple framework to find cost-savings opportunities and make room for growth investments.

If your stack is stable and innovations aren’t solving your organization’s problems, consider self-service and intelligent automation in order to save a lot of time, effort, and cost. Strategies such as automating your continuous integration and continuous delivery (CI/CD) pipeline for D&A combined with intelligent automation using robotic process automation (RPA) and data-driven intelligence can easily save organizations up to 75%.

Shift your teams away from a DIY mindset.

When modernizing data & analytics platforms, there may be a tendency for teams to operate with a DIY mindset. However, moving from one analytics platform to another requires an assessment of the current and target systems, as well as rationalization, extraction of the metadata, conversion of reports, and validation. This process can be complicated by gaps in technical skills and a lack of built-in automation tools–thereby requiring manual effort, as well as more time and money.

But if you leverage reusable assets from partners that can mitigate migration risk, you may be able to save time and money while driving transformation. Specialized tools can accelerate this process by up to 80% by automating manual activities and enabling businesses to focus on what matters most for their bottom line.

As leaders seek to accelerate D&A roadmaps with affordable investments, these three strategies–building sustainable teams; embracing simplicity, self-service and automation; shifting away from a DIY mindset–can help organizations become data-driven, modern and competitive while substantially cutting costs and accelerating speed of execution. In a world where up to 30% of hours worked globally could be automated by 2030, it’s vital for organizations to stay ahead of the curve on a sustainable trajectory.


This article was first published on Forbes.com.

The role of data and analytics is rapidly changing from simply acting as a business-supporting function to being a catalyst for digital transformation.

Whether you like it or not, we are all generating and consuming data at an unprecedented pace. For example, Google now aggregates anonymized datasets from willing mobile users based on their devices’ location history. It has built community mobility reports based on this geolocation data and shared them for public use. In the current Covid-19 climate, this data has proved to be immensely useful for any organization that is looking to reopen and reimagine its operations.

As a leader of a corporation, government agency, university or nonprofit, it is likely that you use data every day to make choices that affect your business. If you were not already using data to drive decisions, the Covid-19 pandemic must have been a wake-up call for your organization. The ensuing economic disruption has forced businesses to adopt or accelerate their digital transformation journey as they adapt to changing market demands, reinventing offerings, optimizing resources and, in some cases, even fighting for survival. Data and analytics have become strategic and central to digital transformation, but Covid-19 has complicated this journey, making it more challenging to collaborate.

Based on our firm’s 16-year history of supporting customers across all industries and recent conversations with customers and our global centers of excellence, I’ve highlighted five success factors that CEOs should consider critical in their quest to derive maximum value from available data assets:

1. Put data and analytics at the heart of business strategy.

To create a sustainable competitive advantage for your business, you should invest in analytics as a core capability within your organization. This means a top-down commitment from executive leadership, investment in people, trustworthy data and insights made accessible to business users. This also means that you focus on ensuring that executives act on what the data tells them.

Do not assume that you must build analytics systems yourself to create a competitive advantage. If IT is not core to your business, find the right partner to help you build the analytics systems, and you focus on consumption and decision-making.

2. Plan for perpetual modernization.

Given the pace of evolution in technology and data, IT leadership should reimagine how it designs data and analytics systems. The architecture itself should be evolutionary — including data systems, insight development and delivery systems. Without taking advantage of the cloud, modern architectures and automation, you will not be able to meet the expectations and analytic needs of the different users in the organization. Keep in mind that moving to the cloud should not take years of planning and execution. If needed, you can always keep your current systems working on-premise while you build the new systems in the cloud.

One of our customers moved a 2PB complex analytics system from an on-premise data center to the AWS cloud within nine months. You should adopt modular and technology-agnostic architectures with room to evolve and avoid lock-in to specific tools.
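
One practical expression of “modular and technology-agnostic” is to code against a thin interface rather than a specific vendor SDK, so that storage or compute can be swapped later. The hypothetical Python sketch below shows the idea with a small storage abstraction; the class, method and key names are illustrative assumptions rather than a reference design.

```python
# A minimal, hypothetical sketch of a technology-agnostic storage layer:
# application code depends on a small ObjectStore protocol, and concrete
# adapters (S3, GCS, local disk) can be swapped without touching callers.
# Class, method and key names are illustrative assumptions.

from typing import Protocol


class ObjectStore(Protocol):
    def put(self, key: str, data: bytes) -> None: ...
    def get(self, key: str) -> bytes: ...


class LocalStore:
    """Adapter for local development or testing; a cloud adapter would
    implement the same two methods with a vendor SDK."""

    def __init__(self, root: str) -> None:
        self.root = root

    def put(self, key: str, data: bytes) -> None:
        with open(f"{self.root}/{key}", "wb") as f:
            f.write(data)

    def get(self, key: str) -> bytes:
        with open(f"{self.root}/{key}", "rb") as f:
            return f.read()


def publish_daily_extract(store: ObjectStore, payload: bytes) -> None:
    """Caller depends only on the interface, not on any one vendor."""
    store.put("daily_extract.csv", payload)


if __name__ == "__main__":
    publish_daily_extract(LocalStore("/tmp"), b"id,amount\n1,100\n")
```

Swapping the LocalStore for a cloud-backed adapter changes one line of wiring, not the application logic, which is the essence of avoiding lock-in.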

3. Revisit the business/IT engagement model.

To ensure that the value of analytics reaches the strategy and operational units, organizations should focus on how the business and IT work together. Various models are available — centralized, decentralized, supportive, consultative — so you can pick the one that works best for your organization. A model that is not consistent with how your business is organized is likely to create longer-term challenges. The good thing is, you can start with one model and evolve into the next based on your organizational maturity.

When one of our customers implemented self-service analytics capability across their business units, they established clear roles; IT was made responsible for the enterprise platform, and business units were made responsible for building and maintaining the applications.

4. Make insights accessible to users.

Data-informed decision-making is no longer an executive privilege. Data should be accessible to all staff so they can use or create their own insights for their jobs. IT teams do not need to create every single report and insight. They should also move away from trying to standardize the consumption tools. They should consider making insights available in the tools that the users are already comfortable with.

Insights should be understandable and actionable. Consider the use of techniques such as data storytelling to enable users to comprehend information easily and move toward actions. One of our customers uses automated commentary on top of visuals to convey specific actions to front-line staff.
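
As a toy illustration of automated commentary, the hypothetical Python sketch below turns a week-over-week metric change into a plain-language note that could sit next to a chart for front-line staff. The metric name and thresholds are illustrative assumptions, not the approach our customer uses.

```python
# A toy, hypothetical sketch of automated commentary: turn a week-over-week
# metric change into a plain-language note for front-line staff.
# Metric name and thresholds are illustrative assumptions.

def commentary(metric: str, this_week: float, last_week: float) -> str:
    if last_week == 0:
        return f"{metric}: no prior-week baseline available."
    change = (this_week - last_week) / last_week * 100
    if change > 5:
        return f"{metric} is up {change:.1f}% week over week; consider increasing stock and staffing."
    if change < -5:
        return f"{metric} is down {abs(change):.1f}% week over week; review promotions and local factors."
    return f"{metric} is roughly flat ({change:+.1f}%) week over week; no action needed."


if __name__ == "__main__":
    print(commentary("Store 42 foot traffic", this_week=1180, last_week=1020))
```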

5. Reimagine return on investment.

We all know that the Covid-19 pandemic has caused a significant shift in consumer demand and that most organizations have had to reevaluate what is important today and what will be relevant tomorrow. Traditional portfolio rationalization models use metrics that show a connection to the revenue or profit for justification and prioritization. However, return on investment encompasses more than just the financial impact; one must consider other factors, such as the potential of results from empowered employees, better customer experiences and newer capabilities.

Leaders must also find ways to explain the intangibles that come with informed decisions. Not everything can be black and white. For example, how do you quantify the impact of saving lives due to better insights? If Google’s community mobility report helps you keep your employees safe, get health care to those in need or bring a sense of normalcy into our lives, what is the investment that we are willing to make to use it?

Navigating organizations safely and effectively in the coming months will be a big challenge for all leaders. However, in times of uncertainty, making informed actions based on data is better than making decisions simply based on instincts. You do not have to drive into an unknown city without maps; you can invest in a navigation aid that you can afford and that makes sense for your business — it could be a paper-based map, a GPS or maybe even a Tesla. You have to make a choice.

Recent Blogs

In today’s business climate, using data to make quick decisions is a common ask across organizations. To fulfill such asks, business users want more, faster, and better access to data and analytic tools. IT wants to balance this need for speed with the responsibility to protect the data assets from security, privacy, and quality risks. A common solution to this scenario is self-service analytics which has been around for at least two decades. Yet, today’s needs cannot be fulfilled by yesterday’s solutions.

Over the last decade, people have become more data savvy, and technology companies are constantly innovating and changing narratives in a highly competitive environment by introducing new tools. But to achieve self-service nirvana, customers must think beyond business intelligence tools. Through our recent webinar and a series of related blogs, we offer our point of view on the “New Self-Service Analytics” based upon our firm’s 15-year history and recent research and experimentation done within our Centers of Excellence.

Demystifying Self-Service Analytics

Self-service means different things to different people. To level set what “self-service” really means in the industry, we need to consider answers to three questions.

The first question is WHAT does self-service do? The goal of self-service is to make it easier for anyone to do any number of things across the data-to-decision lifecycle: from using tools for reporting, to exploration, to advanced use cases like mashing up and wrangling data sets. Another key element is to CO-OPT users to do more on their part in that cycle.

The second question is WHY are we doing self-service analytics? There are multiple value drivers across the industry, such as reducing the time-to-decision, transferring labor effort from IT to business, or empowering employees to play with the data and generate their own insights. Ultimately, self-service is about increasing the overall productivity of your ecosystem.

Finally, what is the CONTEXT of the self-service? Is it for the consumer of analytics or for its developer? Is it for an individual or for a team? Most practitioners naturally think in terms of “users”. But there is another concept about making business units in large organizations “self-sufficient”. That is where the scale of self-service analytics gets interesting and challenging.

Challenges with Self-Service Analytics

In a typical organization, operational business units deal with today’s pressures. For example, most customers are responding to the public health, financial, and competitive challenges in the market. On the other hand, IT organizations must not only support business units with today’s operational needs, but also balance legacy systems operations with the pace of technology and data growth. The quest for self-service analytic solutions in this situation results in some interesting challenges for both business and IT.

The Business Perspective

The first challenge is the desire to standardize around enterprise tools. As business users become more mobile across organizations, they often want to bring their own tools. If a user is comfortable taking data and using Excel to do the analysis, what’s wrong with that? Should IT allow the user to use Excel or not?

Another challenge is data comprehension. IT understands where data is stored and in what format, but end users often question whether all the data exists, and when they do see it, the way it is coded sometimes may not make sense to them. Since data is growing rapidly, it is very difficult for new users to get a handle on the universe of data sets that exist in their organization.

Lastly, more often than not, sufficient time is not given to discovery. So needs become wants, and soon business users start demanding features rather than expressing the problem they are facing. IT may or may not press them to dig deeper, and might just give in and provide a self-service tool.

The IT Perspective

But provisioning a tool is not enough. IT also needs more time from business for successful user adoption. None of the BI tools in the market, regardless of their claims, is 100% user friendly. Users need some level of support.

Another implication of legacy systems is organizational inertia. The new folks in an organization want to modernize systems, and the old guard wants to rationalize the choices made in the past. The reality is that we need the best of both worlds for any successful modernization. So the perpetual modernization cycle, tied to inertia, creates more challenges in terms of a lack of the necessary skills for today’s needs.

Moving Beyond Tools

Breaking through these challenges requires more soft skills than technical skills. We feel it is time to revisit how IT and business collaborate and how self-service analytics are designed to deliver sustainable results. Through our next blog, we will offer a new paradigm for achieving better self-service outcomes and provide recommendations for making progress in different situations.

 
