80% of executives believe that automation can be applied to any business decision, according to a recent Gartner survey. Businesses are automating a wide range of processes and operations, from simple, repetitive tasks to complex, mission-critical operations.
Companies have been working to become more data-driven for years, with mixed results – only 26.5% of companies report that their organization is data-driven. Automation tools directly impact brand success, and businesses frequently adopt and integrate them to stay competitive within their industry and to enable data-driven transformation. Data-driven automation helps businesses improve operational efficiency, make better decisions, and deliver an enhanced customer experience.
Automation projects can be a slippery slope – if not executed properly, they can adversely impact data processes, data usage, employee confidence, and the customer experience. To realize the value of automation, data and analytics leaders must champion data-driven automation as a strategic thread of business DNA, not a tactical one-off project.
As businesses look for opportunities to modernize their processes and optimize operations via data-driven automation tools, they must first develop a meaningful strategy. This requires a well-planned approach that includes clear objectives, appropriate technologies, and the right skills.
The first step in building a strategy for data-driven automation is to define clear objectives. These objectives should be aligned with the organization’s overall business strategy and should be specific and measurable. A scoring methodology can help businesses rate opportunities for automation according to business impact while sustaining an ongoing backlog for prioritization.
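As a minimal illustration of such a scoring methodology (not a prescribed model), the Python sketch below ranks hypothetical automation opportunities with a weighted score; the criteria, weights, and ratings are all assumptions chosen for demonstration.

```python
# Minimal sketch of a weighted scoring model for an automation backlog.
# Criteria, weights, and the example opportunities are illustrative only.

WEIGHTS = {
    "business_impact": 0.4,  # revenue, cost, or risk-reduction potential
    "feasibility": 0.3,      # data availability, process stability
    "effort": 0.3,           # rated so that 10 = least effort required
}

opportunities = [
    {"name": "Invoice matching",    "business_impact": 8, "feasibility": 9, "effort": 7},
    {"name": "Churn-risk alerts",   "business_impact": 9, "feasibility": 6, "effort": 4},
    {"name": "Report distribution", "business_impact": 5, "feasibility": 9, "effort": 9},
]

def score(opportunity: dict) -> float:
    """Weighted sum of 1-10 ratings; higher scores get automated sooner."""
    return sum(opportunity[criterion] * weight for criterion, weight in WEIGHTS.items())

# Keeping the backlog sorted by score sustains an ongoing prioritization list.
for opportunity in sorted(opportunities, key=score, reverse=True):
    print(f"{opportunity['name']}: {score(opportunity):.1f}")
```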
Organizations also need the necessary tools & skills in place to support their automation strategy. They must carefully consider which technologies are best suited for them – such as robotic process automation (RPA), artificial intelligence (AI), or machine learning (ML). Having experts such as data scientists, data engineers, and automation specialists on board accelerates results.
In short, data-driven automation is no longer a luxury but a necessity for today’s organizations that are looking to thrive in an ever-changing market.
Check out the full article for data-driven automation use cases and more steps for successful implementation.
I had the privilege of speaking and participating in thought-provoking discussions at the recently concluded Data Engineering Summit 2023. In this article, I share key insights from my own talk, as well as my takeaways from the keynotes and engaging conversations I had with fellow data enthusiasts at the summit.
- Smart Data Engineering is flipping traditional approaches – Intelligent systems, techniques, and methodologies are being employed to improve Data Engineering processes and provide clients with added value. Organizations are dedicating resources to implementing cutting-edge AI technologies that can enhance various Data Engineering tasks, from initial ingestion to end consumption. The emergence of Generative AI is transforming the way data is analyzed and utilized in organizations. While it is currently revolutionizing the consumption side of the industry, the pace of developments indicates that it will soon have a significant impact on Data Analytics workloads. This shift towards Generative AI will pave the way for new approaches to Data Engineering projects in the upcoming quarters, resulting in increased efficiency and effectiveness.
- FinOps will be a game changer – As companies move their Data and Analytics workloads to cloud-based platforms, they are discovering the potential for costs to spiral out of control without careful management. Though various solutions exist, few provide a sufficient return on investment, leaving customers in search of fresh methods to manage expenses across cloud infrastructure. FinOps provides monitoring teams with the tools they need for cloud cost screening and control while promoting a culture of cost optimization by increasing financial accountability throughout the organization. CFOs are especially pleased with this development and are keen on spreading this cost-conscious approach.
- Data Observability is not a buzzword anymore – Mature organizations are proactively utilizing observability to intelligently monitor their data pipelines. Unforeseen cloud charges can arise from occurrences such as repeated Lambda invocations or the execution of faulty SQL code, which can persist unnoticed for prolonged periods. Implementing observability equips operations teams to better comprehend a pipeline’s behavior and performance, resulting in effective management of the costs associated with cloud computing and data infrastructure.
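As a toy illustration of the idea (not any particular observability product), the sketch below flags pipeline components whose daily invocation counts spike far above their trailing baseline – the kind of signal that catches runaway Lambda retries before the bill arrives. The metrics and threshold are assumptions for demonstration.

```python
# Illustrative sketch: flag components whose latest daily invocation count
# is a statistical outlier versus their recent baseline.
from statistics import mean, stdev

# Hypothetical daily invocation counts per component (most recent last).
daily_invocations = {
    "ingest_lambda": [1020, 980, 1010, 995, 1005, 990, 8400],
    "transform_job": [240, 255, 250, 245, 260, 248, 252],
}

def is_anomalous(history: list[int], z_threshold: float = 3.0) -> bool:
    """True if the latest observation is a z-score outlier vs. the baseline."""
    baseline, latest = history[:-1], history[-1]
    mu, sigma = mean(baseline), stdev(baseline)
    return sigma > 0 and (latest - mu) / sigma > z_threshold

for component, history in daily_invocations.items():
    if is_anomalous(history):
        print(f"ALERT: {component} invocations spiked to {history[-1]}")
```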
- Consumption-based D&A chargeback is the way to go – Shared services teams are encountering challenges when it comes to accurately charging their internal clients for their utilization of D&A services. The root of this problem is the lack of transparent cost allocation mechanisms for data consumption, which makes it difficult to determine the genuine value of a D&A service. The solution lies in implementing consumption-based cost chargeback, which not only addresses the current challenges but also prompts businesses to adopt more intelligent FinOps models.
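In its simplest form, consumption-based chargeback allocates the shared platform bill in proportion to metered usage, as in the hypothetical sketch below; real models typically meter several dimensions (compute, storage, egress) against rate cards.

```python
# Illustrative sketch: allocate a shared D&A platform bill to internal teams
# in proportion to their metered consumption (all figures hypothetical).

monthly_platform_cost = 120_000.00  # total shared bill for the month

# Metered usage per team, e.g., warehouse credits or query-seconds consumed.
usage = {"marketing": 5_400, "finance": 2_100, "supply_chain": 4_500}

total_usage = sum(usage.values())
chargeback = {
    team: round(monthly_platform_cost * consumed / total_usage, 2)
    for team, consumed in usage.items()
}
print(chargeback)
# {'marketing': 54000.0, 'finance': 21000.0, 'supply_chain': 45000.0}
```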
In summary, the summit provided valuable insights into the latest trends, challenges, and opportunities in the field, highlighting the importance of collaboration, innovation, and upskilling. There are many exciting developments that promise to revolutionize the industry. As we move towards a data-driven world, it is clear that data engineers will play a crucial role in shaping our future, and it is essential that they stay informed, adaptable, and agile to keep up with the rapidly evolving landscape.
We live in a growing digital economy, and data fluency is a necessary digital skill that everyone needs to thrive in a modern data-driven workplace culture. Employees across all levels of the organization must cultivate this skill to contribute to business outcomes and promote data proficiency. Yet only 32% of the C-suite is considered data literate – a key precursor to data fluency.
However, building enterprise-wide data fluency and nurturing a data-driven culture is a complex process. Executives often encounter obstacles, such as a shortage of personnel with the necessary technical and analytical skills. Demand for data talent is high, which drives up costs, making this particularly difficult for smaller organizations without the resources to compete for top talent.
Limited availability of training and resources for employees to develop their data fluency skills is another challenge. Without comprehensive training programs, many employees are left to develop skills on their own, leading to inconsistencies in skill level.
Additionally, a lack of clarity on roles and responsibilities can leave employees uncertain about what is expected of them or how they can contribute to the organization’s data fluency goals.
Building data fluency requires a comprehensive, coordinated effort that involves leadership support, employee training and development, and a clear vision and strategy.
Define what data fluency means for your organization
Defining data fluency involves setting clear goals and expectations for the skills and knowledge employees need to develop to meet those objectives.
Assess current skills and identify gaps
Once your organization’s unique data fluency goals have been established, you must evaluate the data literacy of the employees who create and consume data, and identify the gaps that must be filled to align with those goals.
Provide training and resources
Organizations must provide their workforce with training and resources to develop their data analysis skills. This might include classroom training, online courses, webinars, workshops, and other educational assets that enable employees to learn how to work with data effectively.
Check out the full article for top business barriers to establishing data fluency, essential skills, and strategies to build a data fluent workforce.
I had an opportunity to participate in the 2023 Gartner Data & Analytics Summit in Orlando. It was a humbling experience networking with & learning from various analysts, speakers, exhibitors, and industry peers. Here are 10 learnings that stuck with me from the conference, mixed with my own reflections:
- The struggle to demonstrate tangible ROI from D&A initiatives continues
Gartner’s 2023 CDO survey shows 69% of D&A leaders are struggling to deliver measurable ROI. To change this, leaders must master the language of business, stop chasing “shiny new objects”, and use both financial & influencing factors to measure & articulate value.
- The divide between business & IT organizations in large enterprises is real & growing
I saw far too many examples, and had too many conversations, where Business & IT don’t seem to be operating from a shared understanding of what success looks like. Introducing new technologies or operating models won’t necessarily overcome this divide. What is needed is empathy, willingness at the top, and then specific interventions in your context. This topic deserves more attention in the industry.
- “Enterprise decision networks” is an emerging capability to watch out for
“The success of an organization is nothing but a sum of all of its decisions.” Gartner predicts that 75% of Global 500 companies will use decision networks as a competitive differentiator. Not many products exist in the market that systematically close the loop between analytics & decision making. We train & optimize AI models based on data, but for humans we rely on training & then just experience. Expect decision intelligence platforms to emerge that help you model, track & improve decisions.
- Low-code/no-code (LOCO/NOCO) platforms are driving the next wave of innovation in D&A
Companies like Nexla, Aible, Cinchy, and Kyligence are reducing the effort needed to organize, explore, collaborate, and trust data & insights. Nexla is automating data engineering; Aible is automating exploratory insights; Cinchy is eliminating the need for data integration; and Kyligence is automating the metrics layer. They are reducing the need for developers in the data-to-insights lifecycle. Expect shorter time to value and higher software & niche labor costs – all while hopefully lowering the TCO!
- Are we overestimating the short-term impact & underestimating the long-term impact of Generative AI?
ChatGPT is likely going to be the “word of the year.” Nearly every vendor had something to say about Generative AI. The possibilities in content creation, workforce productivity, software engineering and knowledge management are eye-opening. To take advantage, companies should develop a position paper outlining the opportunities & risks and establish an incubation team to explore use cases. To prioritize the right ones, apply the “willingness to pay” test to determine each use case’s worth.
- A consolidation wave is coming in the D&A technology landscape, potentially adding to your modernization debt
As I observed the bake-offs for data management, analytics & BI, and DS/ML, it was clear that many vendors have overlapping foundational capabilities & a few differentiating ones. Most try to differentiate based on knowledge of their own offerings. Customers need to be very systematic in evaluating these tools to identify what will be relevant in their data ecosystems, or be ready to see their modernization debts rise. While the total addressable market in D&A is very large, can the industry afford so many overlapping platforms?
- Executives need to become AI fluent to seize opportunities for & mitigate new risks to their businesses
AI is going to disrupt almost all businesses before the end of this decade. But are executives in different organizations comfortable with AI? Given the potential for business benefits & harm, executives must invest in AI fluency. They need to be able to ask the right questions to guide their teams towards relevant opportunities in customer experience, operational excellence, or new business. They should also take questions around harm – both malicious & commercial use – very seriously & establish governance frameworks to exit early when risks are high.
- FinOps practices are essential to avoid surprises in cloud bills
Gartner shares that a primary complaint from cloud buyers is the lack of predictable costs & transparency in their purchased cloud solutions. The factors that made cloud attractive – rapid provisioning, pay-for-what-you-use pricing, & elasticity – are also leading to overspend. One way to bring equity into cloud cost monitoring is to use a price-performance metric (total cost / total work capacity). Organizations must apply FinOps to cloud budgeting, governance, & optimization practices. FinOps requires technical & cultural shifts to ensure the CFO, CIO/CTO, and business leaders can connect cost to value.
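To make the price-performance metric concrete, here is a small hypothetical comparison; the costs and workloads are invented purely for illustration.

```python
# Price performance = total cost / total work capacity (lower is better).
# A platform with the smaller bill can still deliver less work per dollar.

def price_performance(total_cost: float, total_work_capacity: float) -> float:
    """Dollars per unit of work, e.g., per query processed."""
    return total_cost / total_work_capacity

option_a = price_performance(total_cost=40_000, total_work_capacity=2_000_000)
option_b = price_performance(total_cost=28_000, total_work_capacity=1_000_000)

print(f"A: ${option_a:.3f}/query, B: ${option_b:.3f}/query")
# A: $0.020/query, B: $0.028/query -> A wins despite the larger bill.
```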
- If your data literacy programs aren’t yielding results, shift to persona-based or employee-led models
Data fluency is an organizational capability that enables every employee to move away from “just reading the numbers” to “accessing data, understanding its context, interpreting insights, and using them” as fluently as one can speak in their language. A powerful success story from New York City Health & Hospitals confirms that, to get effective results, organizations should consider persona-based & adult-learning models in their data literacy programs.
- Organizational change leadership is critical to close the gap between insights & actions
And lastly, but perhaps most importantly: to get any meaningful data-driven results in your organization, you need to be able to effect change in yourself, your teams, & your organization. I enjoyed Shankar Vedantam’s talk on how the “hidden brain” shapes our capacity for change. Barriers to motivation include organizational inertia, the IKEA effect, and the sunk cost fallacy. Techniques to persuade include adopting swift, irreversible change actions, using positive reinforcement, and doing hard things first.
In today’s business landscape, data is the key that separates thriving companies from those falling by the wayside. Whether it is rapidly redesigning products & services for new realities, closing talent supply-demand gaps, or resetting operational costs based on zero-based budgets – businesses & executives need the ability to make the right decisions at the right time with the right stakeholders.
Despite increasing investments in D&A capabilities to support such needs, a 2023 D&A Leadership Executive survey highlights the gap between the worlds of business & analytics: the overly optimistic assessment from leaders executing D&A initiatives (98% say D&A is providing value) does not match the progress reported on longer-term initiatives such as driving business innovation with data (only 59.5% say yes) and creating a data-driven organization (only 23.9% say yes).
So, if you are responsible for D&A capabilities, it can be difficult to separate the real, transformational initiatives capable of revolutionizing your business from the ‘noise’ of so many new technologies and trends promising ‘faster and more accurate access to insights.’ Drawing on our experience supporting & guiding leaders at multiple levels, I will help you cut through the noise and focus your limited time, effort & resources on the areas that will be most impactful to the goal of using D&A to drive business growth in 2023.
- Embrace managed AI to realize the promise of AI for your business.
McKinsey – which expects a $10T+ return from AI – rightly identifies that success remains elusive for many businesses: 72% are not yet able to take advantage of AI, while an astonishing 85% presently fail to realize any business benefit from their attempts at integrating AI into existing operations. If the majority of AI projects fail, organizations must invest in failing fast. In other words, you must be able to identify the use cases that will achieve specific business outcomes & are feasible – and do it in days, not months. Rather than DIY, you should consider fully managed AI solutions for business such as Infocepts’ AI as a Service (AIaaS).
- Revisit resource allocation models for your D&A projects to get more done with less.
Companies are facing a financial crunch because of ongoing economic uncertainty, including the very real possibility of a recession in the next 12-18 months. Data shows that companies that come out strong after a recession – i.e., with twice the industry averages in revenue & profits – are proactive in preparing & executing against multiple scenarios backed by insights. They re-deploy employees against the highest-value activities in their strategy, weigh trade-offs & take deliberate risks, but without cutting corners on critical projects. You should a) stop paying for legacy technology maintenance, b) fix your broken managed services model, & c) choose on-demand talent.
- Help your teams ‘actually’ use your D&A capabilities to increase adoption & realize business value.
Many organizations have invested in specialized teams for delivering data. But the real value from data comes only when your employees use it. Data fluency is an organizational capability that enables every employee to understand, access, and apply data as fluently as one can speak in their language. With more fluent people in an organization, productivity increases, turnover drops, and innovation thrives without relying only on specialized teams. Top roadblocks, according to industry surveys, include lack of trust in organizational data and lack of data skills in employees. You should a) assess your organizational fluency, b) establish a data concierge, & c) introduce interventions to strengthen access, trust, and adoption.
- Invert your team’s thinking in 3 (counterintuitive) ways to be more responsive to your business needs.
Large enterprises have multi-year roadmaps to advance their transformative D&A capabilities by modernizing existing platforms, retiring legacy ones, or building new applications for the business. A proven approach is centered around building a business-driven roadmap, using a modular foundational architecture, aligning cross-functional stakeholders, and allocating the necessary talent for success. It sounds simple, but the execution is fraught with challenges.
Patterns such as agile, data mesh, data fabric, and delta lake guide IT leaders in dealing with these challenges, but there are 3 antipatterns in how work is managed that slow organizations down. You are likely experiencing them! If so, you should a) design work for outcomes, not functions, b) choose products over projects, & c) choose teams over staffing. These tactics will increase the flexibility of your roadmap execution, your responsiveness to business needs, and the effectiveness of your teams.
- Reimagine how you bring value to business – move fast with Solutions as a Service!
Salesforce and Amazon pioneered the commercially successful SaaS, IaaS, and PaaS models in cloud computing, gradually shifting significant portions of the responsibility for delivering value from the client to the service provider. The benefits of agility, elasticity, economies of scale, and reach are well known. Since then, countless products have emerged in these 3 categories. But they are limiting & constrain clients from going one step further towards productized services – what we call Solutions as a Service!
Solutions as a Service combines products, problem solving & expertise in one easy-to-use solution. This is inevitable given the sheer pace at which technology is evolving. This new category requires a shift in thinking and will give you a source of advantage much like the one early cloud adopters enjoyed during the last decade. Infocepts offers many domain-driven solutions in this category, including data products such as e360 for people analytics and AutoMate for business automations.
Moving forward from ideas to actions
D&A leadership will be challenged like never before in 2023. You can either keep doing more of the same, or take a step back, reflect, and lean in on new & better ways to move forward on your D&A initiatives. To learn more about the transformational D&A initiatives that you must get done, download our advisory note for the full analysis, examples, and recommendations.
58% of companies that have implemented AI report increased efficiency and improved decision-making across their teams. Yet only 12% of companies are actually benefiting from this technology.
AI is rapidly reshaping the business landscape and is now poised to revolutionize decision-making processes. AI-supported decisions are changing how businesses operate and enhancing the effectiveness, speed and accuracy of decision-making. Many organizations have made great strides towards implementation, but it’s a complicated, time-consuming process that often leads to failure.
Organizations need to know what challenges they’re up against when attempting to leverage AI-assisted decision-making and the best practices that help them drive widespread value from their data.
Top challenges which organizations face in implementing AI-assisted decisions
Though AI-assisted decision making is a quickly growing field, many data and analytics executives struggle to successfully implement and scale these solutions within their organization. Here are the top challenges you’ll need to tackle:
- Poor business case for AI
Without a clear strategy and business case, organizations might not have a good understanding of what they hope to achieve with AI-assisted decision-making, making it difficult to determine whether implementation is successful. It can also be difficult to integrate the solution into existing business processes, gain the support of key stakeholders, measure ROI, and identify opportunities for ongoing improvement.
- Data quality
One of the biggest challenges in implementing AI-assisted decision-making is ensuring that there are copious amounts of high-quality data. Poor data quality can negatively impact the accuracy of AI algorithms and limit their ability to provide meaningful insights, leading to greater inefficiencies.
Best practices for implementing AI-assisted decision-making
Implementation of AI-assisted decision-making requires careful consideration of these challenges and a strategic approach to ensure efforts are delivering measurable business value and growth – and placing AI-powered insights into the hands of the organization. Below are a few best practices data and analytics executives should embrace:
- Develop a comprehensive strategy
A clear strategy helps align AI-assisted decision-making with overall business goals and ensures that resources are allocated effectively. High-performing AI adopters tend to link their AI strategy to business outcomes. Executives need to define their business objectives and where AI can add the most value; assess their existing infrastructure to determine what must be in place to support the solution; conduct a feasibility study to understand the efficacy and cost of implementation; secure buy-in from key stakeholders; and develop a roadmap for implementation, including budget, milestones, and timeline.
- Foster collaboration and communication
Data and analytics teams, technology partners, and stakeholders from all levels of the organization should be involved in the design, development and implementation process to ensure all needs and concerns are taken into account. Establish regular communication channels and encourage cross-functional collaboration to facilitate open discourse about the status and progress of AI projects to increase buy-in and ensure alignment on the goals of AI-assisted decisions.
There are many other challenges organizations face – and best practices that will help you realize the full potential of AI-assisted decision-making. Read our ebook to learn more.
Check out Infocepts DiscoverYai, an end-to-end solution providing 360° support to take care of your AI woes and make embedding best practices in your implementation process easier than ever.
Covid-19 accelerated digital transformation for enterprises in all industries — and today, cloud initiatives are at the core of digital transformation. Enterprises are likely to experience mixed success with their cloud initiatives in the medium term as they manage challenges such as severe skill shortages, evolving data architecture practices and myriad cloud vendors with similar offerings.
Drawing from our company’s experience of implementing more than 50 cloud initiatives across industries, here are my top five recommendations for business and IT leaders planning significant data and analytics (D&A) initiatives in the cloud.
1. Not everything that can be moved should be moved to the cloud.
Cloud migrations involve significant capital expenditures (CAPEX). In my experience, when you migrate old data applications to the cloud, you should not expect to see operating expenses (OPEX) savings for up to three years. In many cases, all layers are not migrated together due to interdependencies on other systems, leading to a hybrid approach with a combination of on-premises, private and public cloud hosting.
Carefully evaluate the suitability and need to migrate the following categories of applications to the cloud:
- End-of-life legacy applications or tool platforms
- Applications built on comparable SaaS (Software as a Service) tools
- Applications accessing data that requires stringent data security and privacy
- Specific hardware-dependent applications
2. Plan to embrace a multi-cloud future.
Three leading public cloud players — Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform (GCP) — are adding capabilities, services and geographical locations at a rapid pace. Most of their comparable services match each other in terms of cost and performance, and with no consolidation in sight, you can benefit from their competition.
Each of these cloud vendors does provide a few differentiating services. To enable the creation of cutting-edge data and analytics solutions, aim to leverage the best services available, regardless of which vendor provides them. For example, one of our clients — a leading media and entertainment company — uses a multi-cloud setup with AWS infrastructure and select AWS services for its data apps, Azure for email services and cloud-native PaaS platforms like Domo and Snowflake for analytics.
Within your organization:
- Discourage investment in a single cloud vendor
- Promote a culture of looking for the best services, comparing capabilities and costs across cloud vendors
- Encourage technical teams to design data architectures that seamlessly use cross-cloud capabilities
3. Don’t let security be an afterthought.
According to the Verizon Data Breach Investigations Report (DBIR), most cybersecurity incidents now involve cloud infrastructure. We can expect the threat of data breaches to grow in the foreseeable future, and the responsibility for increasing security protections lies with enterprises.
In our work, we have seen that most cloud initiatives, especially enterprises’ early endeavors, try to address security requirements through native services. However, due to inadequate design, these solutions fall short of addressing all risks. Thankfully, there are a number of solutions from third-party vendors available that you can use to address this critical gap. Use these tools to:
- Carefully assess security requirements
- Invest early in holistic security solutions
- Conduct frequent vulnerability scans
A global bank that we work with has implemented a unified data-centric security model with sensitive data-flow discovery, real-time monitoring, behavior analytics and protection across all operational and analytical applications (both on-premises and on-cloud).
4. Monitor all D&A solutions through a unified platform.
Given the nature of cloud services, any data and analytics platform migrated to the cloud gets decomposed into many independent solutions. While this offers advantages, such as no single point of failure and scalable performance, managing multiple platforms can be complex. In case of service level failures, it can be difficult to ascertain the root cause, replay the sequence of events and recover from the failure.
DevOps staff supporting disparate platforms need to invest significant effort in scanning the consoles of multiple services for any meaningful analysis — post-mortem or change impact. It is highly likely that components of such systems will drift away from the initial architectural vision. To avoid this outcome, push for:
- Holistic assessment of current and future monitoring requirements
- Early investment in a comprehensive monitoring solution
- Frequent “game day” drills to test responses, in processes and people
A global market research firm we work with uses a centralized monitoring platform to track its infrastructure, databases, analytical apps, workflows and security. This gives the firm a 360-degree, single-pane view of its data and analytics ecosystem and greater operational efficiency.
5. Aim for an accelerated pace of innovation through the cloud.
For most enterprises, the first set of cloud initiatives includes migrating existing data and analytics applications to a cloud platform. Whether as-is (lift and shift) or re-engineered, these types of migrations don’t change the status quo dramatically.
But there is a constantly expanding set of cloud offerings that covers capabilities like IoT, blockchain, data science, machine learning, media, quantum, robotics, satellite, VR and AR. Explore how your organization can use cloud initiatives to power innovation. How effectively you do this will prove to be a competitive advantage in the Industry 4.0 era.
There are also countless focused solutions available on cloud marketplaces that significantly reduce the cost of experimentation. Take advantage of these cost-effective tools and encourage:
- A culture of innovation with cloud at the center
- A risk appetite based on leveraging cloud offerings and marketplace solutions
- Thinking “cloud first” before costly in-house development of new solutions
Your organization has probably moved the first set of data stores and front-end analytics apps to the cloud with varying degrees of success. Enterprises that don’t see measurable positive outcomes with their first cloud initiatives tend to delay the rest of their cloud adoption journey. Don’t fall into the same trap. Cloud initiatives will continue to be a critical ingredient for future business capabilities. By finding the right solutions and engaging the right partners, you can set your organization up to make well-informed choices, develop pragmatic roadmaps and avoid the pitfalls that lead to failure.
This article was first published on Forbes.com; read the article here.
In the digital economy, data is our most valuable resource. “Data is the new oil” is now a popular saying, and it’s an apt metaphor. Companies that learn to extract and use data efficiently have the potential for huge rewards, driving profits, innovation and customer satisfaction.
Just like with oil, quality matters in data. Quality isn’t free; it requires time, money and attention, but it’s worth the investment.
What Is Good Data?
Good data quality means the data is fit for use based on the business context and requirements. The business rules governing quality include both quantitative (e.g., “Is the date legitimate?”) and qualitative rules (e.g., “Is the date captured in the American notation?”). In addition, expectations of users regarding availability, reliability, usability, relevance and clarity may lead to perceptions of quality issues. Data quality initiatives need to address various aspects to ensure that data is trustworthy.
Think about global data related to Covid-19 vaccinations. Reliable data must include a patient’s date of birth, the date of each dose, the type of vaccine, the number of required doses and the location of each dose. Quality issues become complicated when you consider that some people received shots from different vaccine manufacturers, or that data may have been captured in different states or countries. Poor data quality can prevent various stakeholders — including public health experts, vaccine advocates, and the general public — from making informed decisions. And if people perceive vaccine data to be unreliable, they may become more hesitant to get vaccinated, ultimately damaging public health outcomes.
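To ground this in code, here is a minimal sketch of how such quantitative and qualitative rules might be checked against a vaccination record; the field names, reference data, and rules are illustrative assumptions, not a real schema.

```python
# Minimal sketch of rule-based quality checks on a hypothetical
# vaccination record (field names and reference data are illustrative).
from datetime import date

REQUIRED_DOSES = {"pfizer": 2, "moderna": 2, "janssen": 1}  # assumed reference data

def check_record(rec: dict) -> list[str]:
    issues = []
    # Quantitative rules: is the dose date legitimate?
    if rec["dose_date"] > date.today():
        issues.append("dose date is in the future")
    if rec["dose_date"] <= rec["date_of_birth"]:
        issues.append("dose date precedes date of birth")
    # Qualitative rules: known vaccine type, location captured.
    if rec["vaccine"] not in REQUIRED_DOSES:
        issues.append(f"unknown vaccine type: {rec['vaccine']}")
    if not rec.get("location"):
        issues.append("missing dose location")
    return issues

record = {
    "date_of_birth": date(1980, 5, 1),
    "dose_date": date(2021, 4, 12),
    "vaccine": "pfizer",
    "location": "NY, USA",
}
print(check_record(record) or "record passes all quality rules")
```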
The Cost Of Bad Data
In 2016, an IBM* study estimated that bad data costs the U.S. economy $3.1 trillion a year, and in 2020, a Gartner** survey found that organizations calculated that the average cost of poor data quality was $12.8 million a year.
In my experience leading a global data and analytics (D&A) solutions provider, I’ve seen that while everyone cares about data quality in theory, when it comes to actually making funding decisions, many customers want to cut corners.
But this is where the rubber meets the road. If you don’t finance data quality initiatives, then you won’t get the result you want. Poor quality data can lead to flawed decision making, top- and bottom-line consequences and decreased employee and customer satisfaction. Incomplete data may result in ineffective marketing campaigns, and a data breach can cause reputational damage or leave you vulnerable to litigation under laws like GDPR or CCPA.
Six Common Challenges In Data Quality Improvements
Improving data quality in your company will have significant long-term benefits, but you must be proactive. Here are six of the most common challenges to be aware of when improving data quality:
1. Lack of awareness: Because data is an intangible asset, it’s often hard to assess quality and address problems. Your stakeholders may not fully appreciate the state of data quality in your systems until a major issue impacts your business.
2. Difficulty justifying investments: To get buy-in on improving data quality, you need to be able to make a solid business case for it, showing how poor-quality data has had negative consequences in the past. But frontline staff may not be willing to document quality issues to build a future business case for something like automation, preferring instead to resolve issues manually.
3. Confusing shared responsibility with IT responsibility: Enterprise data is often used across multiple business units, moving through line-of-business systems into reporting and analysis systems. Quality ownership is delegated to IT as data flows through various pipelines, yet IT is not fully responsible for the source systems. Data quality demands shared responsibility.
4. Resistance to change: Data quality programs are heavily focused on continuous improvement, which calls for users in your organization to adopt new behaviors and perform additional checks and balances with discipline. If employees are unwilling to adapt, you will run into obstacles.
5. Fear of transparency: Data quality assessments and real-time quality dashboards may make some leaders uncomfortable. Looking into past data decisions and results may cause some in your organization to feel embarrassed or concerned, creating yet another roadblock.
6. Lack of sponsorship: Data quality initiatives often compete with new technology and product management investments. It can be tempting to throw money at a shiny new object, spending $X on a cloud computing platform instead of the same amount for a data governance consultant’s annual fee. Data quality is less glamorous and often loses out to modernization initiatives.
Five Ways To Improve Data Quality
Once you’ve identified the challenges you’re facing, here are five actions you can take to address them:
1. Sponsor beyond budgets: To achieve successful data quality initiatives, you must be willing to invest more than dollars. Articulate the importance of data quality for the organization, inspire cross-functional collaboration, prioritize progress and hold frontline managers accountable for long-term success.
2. Invest in data quality and transparency tools: As the volume, variety and velocity of data increases, you need more specialized tools for quality management. Invest in software that automates quality management for profiling, catalogs, metadata and lineage.
3. Adopt DataOps automation practices: DataOps brings DevOps practices from agile software engineering into data analytics. Integrate them to promote the collaboration, communication and automation that bring business, applications, operations and data teams together — speeding feedback cycles for quality issues and reducing cycle times for delivering D&A products (see the test sketch after this list).
4. Embrace a stewardship culture: To create a data-driven culture, you need to incorporate stewardship in your organizational norms and habits. This is a shared responsibility among various stakeholders and includes identifying and articulating quality issues and following governance practices to resolve issues.
5. Build a data management team: Data quality management requires dedicated skills, roles and positions depending on the scope, size and complexity of your organization’s data. Hire the people who can evaluate, design and execute the solutions you need, including Chief Data Officers, data analysts, business and technical stewards, tool experts and data management consultants.
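As a minimal illustration of the DataOps idea in point 3, quality expectations can be written as automated tests that run on every pipeline change; the dataset and checks below (pandas with a pytest-style layout) are hypothetical.

```python
# Minimal sketch: data quality expectations written as automated tests, so a
# CI pipeline fails fast when a change breaks them. Dataset is hypothetical.
import pandas as pd

def load_orders() -> pd.DataFrame:
    # Stand-in for reading the real dataset produced by the pipeline.
    return pd.DataFrame({
        "order_id": [1, 2, 3],
        "amount": [120.5, 89.0, 240.0],
        "country": ["US", "DE", "US"],
    })

def test_order_ids_unique():
    assert load_orders()["order_id"].is_unique

def test_amounts_positive_and_complete():
    amounts = load_orders()["amount"]
    assert amounts.notna().all()
    assert (amounts > 0).all()

def test_country_codes_known():
    known = {"US", "DE", "GB", "IN"}
    assert set(load_orders()["country"]) <= known
```

Run with a test runner such as pytest in the delivery pipeline; a failing expectation then blocks the release instead of surfacing downstream.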
Investing in data quality is a long but worthwhile journey. Evaluate your company’s needs and challenges, and invest in the practices, tools and leadership required to advance valuable data quality initiatives.
References:
- * Harvard Business Review – ‘Bad Data Costs the U.S. $3 Trillion Per Year’ by Thomas C. Redman, 22 September 2016.
- ** Gartner Research – ‘Cost Optimization Is Crucial for Modern Data Management Programs’ by Ankush Jain, Guido De Simoni, Eric Thoo, Adam Ronthal, Melody Chien, Donald Feinberg, Ehtisham Zaidi, Sally Parker, Simon Walker, Malcolm Hawker, 22 June 2020.
Data is the most important asset for any organization and the key to its growth and success. Analytics, which is at the heart of any digital transformation initiative, empowers new-age organizations to leverage data effectively and make smart decisions.
Every organization today is investing significantly in analytics, and it’s a core topic of discussion in board meetings. But then why do so many analytics projects fail to make an impact? While there are many reasons for failure, here are the top six to be aware of before kicking off any analytics project.
- Lack of Management Vision & Support:
Top management sets the tone for any organizational initiative, and analytics is no different. A visionary management team will support the need to build a data-driven culture within the organization.
- Humans tend to resist change. If analytics programs are kicked off just to follow the market trend, without an underlying business strategy, they are bound to fail. Business leaders should include analytics as part of their vision and handle change management effectively through inclusivity and trust-building measures.
- Analytics projects are traditionally hard to implement. They typically start with a Proof of Concept (PoC), and bringing them from conceptualization to production requires that everyone involved in the program clearly understands the business goals. We have seen a headlong race to launch analytics programs, and many of these projects go down due to lack of management vision and top-down support.
- Poor or Missing Data:
Data-driven organizations have survived the most challenging conditions because they believed in their data and utilized it to the fullest to accomplish their goals.
- It is important to continuously check the quality of data and adjust data cleansing routines as required. Make data quality routines flexible so that they can continuously adapt to new rules or sources of data (see the sketch after these points).
- Business processes are becoming more and more complex, and end users are continuously looking for new product lines and real-time assistance. Understanding the data required for any analytics model is the key to success.
- The idea is to bring in the right data by effectively understanding the business requirements and baselining the processes. The data should power analytics engines effectively, resulting in actionable insights with greater impact.
- Good data will aid the success of any analytics project; on the contrary, a bad choice of data or poor data quality will make the task difficult and ultimately sink the entire analytics investment.
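As a sketch of the flexible quality routines mentioned above, rules can be declared as configuration so that new checks or sources are added without rewriting pipeline code; the fields and rule types below are invented for illustration.

```python
# Minimal sketch: quality rules declared as data, so new rules can be added
# without changing pipeline code. Fields and rule types are illustrative.
import re

RULES = [
    {"field": "customer_id", "type": "not_null"},
    {"field": "email",       "type": "regex", "pattern": r"^[^@\s]+@[^@\s]+$"},
    {"field": "age",         "type": "range", "min": 0, "max": 120},
]

CHECKS = {
    "not_null": lambda value, rule: value is not None,
    "regex":    lambda value, rule: re.match(rule["pattern"], str(value)) is not None,
    "range":    lambda value, rule: rule["min"] <= value <= rule["max"],
}

def validate(row: dict) -> list[str]:
    """Return the rules this row violates; extend RULES to add new checks."""
    return [
        f"{rule['field']} fails {rule['type']}"
        for rule in RULES
        if not CHECKS[rule["type"]](row.get(rule["field"]), rule)
    ]

print(validate({"customer_id": 42, "email": "a@b.com", "age": 130}))
# ['age fails range']
```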
- Missing Analytics Governance:
Most analytics projects are built as small PoCs/MVPs, and once they succeed, business/product owners demand more use cases on top of them without strengthening analytics governance processes.
- Analytics governance should go hand in hand with the first-ever analytics project and should not be left aside as a future enhancement.
- Security should be a day-0 priority and should not be ignored even when you are delivering a small PoC. Put effective role and access management in place so that trust is not compromised at any level.
- Ignoring analytics governance can result in multiple points of failure, such as security breaches or degraded performance, which may eventually tarnish the organization’s name and put the entire analytics initiative at risk.
- Slow Time to Market:
Competition out there is furious, and if you cannot deliver on time, someone else will.
- We have seen cases where analytics programs were delivered but failed to catch on with consumers because a competitor was ahead in the game.
- The cycle from ideation to finished analytics should be short, and the aim should be to deliver results to the business quickly through short MVPs.
- From a technology point of view, use the power of the cloud to eliminate infrastructure bottlenecks. This will not only shorten the complete cycle, but also let you come back to the drawing board quickly in case of any inadequacies. Adaptation is the key to success, and the feedback loop should be kept open.
- Create a lean project management schedule even for a PoC. This not only helps in tracking activities closely, but also ensures any bottlenecks can be managed immediately.
- Build simple visualizations on top of your analytics output that business teams can easily understand. Many times we have seen end users who consume analytics results fail to interpret the output, which ultimately stretches the UAT cycle.
- Missing Appropriate Team Members:
While it is okay to have cross-trained team members in your analytics team, have at least one strong resource who understands the technology, is curious about what the data might reveal to solve the business question analytics is trying to answer, and can bring the required Data Science & Machine Learning leadership.
- Working on an analytics project without business stakeholders’ active engagement can lead to a weak hypothesis and ultimately stretch the time to market.
- Keep business stakeholders on your team as product owners from day 1, even for a small PoC. They can own the business process you are trying to simplify, and having them on the team reduces the chance of failure because they can continuously provide feedback to improve your model.
- Ignoring Regulatory Compliance Requirements:
Businesses may have to pay heavily if regulatory compliance requirements are missed.
- While analytics is rewarding for an organization, it can become a high-risk liability if it is not organized and compliant with regulatory laws.
- Too often, analytics projects are kicked off in a rush and delivered technically without considering the regulatory and compliance requirements for the relevant region or business. For example, if you are working for a European client, GDPR is something you cannot ignore – otherwise you will never be able to productionize your model, even if it is technically superb.
Get in touch to know more!
This article was first published on Forbes.com; read the article here.
According to McKinsey’s 2020 global survey of 1,216 executives and managers across financial, retail, health, automotive, and telecom industries, 87% of respondents said their organizations are either already facing skill shortages or they anticipate skill shortages within the next five years. The impact of such talent gaps will vary by industry, with healthcare, pharmaceutical, and medical products industries least likely to be disrupted, and financial services, tech, and telecom industries most likely to undergo a serious step change.
Forty-three percent of the survey’s respondents said they expect data and analytics (D&A) to be the area with the greatest skills gap, making D&A the most prominent area of concern, followed by IT management and executive management. Clearly, D&A leaders are under substantial pressure to deliver transformative results in today’s economy. The demand created by skill shortages is only exacerbated by rapidly expanding data, technology, and digital economies.
In order for data & analytics leaders to deliver the transformation their organizations depend on, it’s crucial to connect the organization’s data & analytics roadmap to its corporate strategy in a way that’s both affordable and sustainable. If you are a leader responsible for data & analytics strategy and execution in your organization, this may mean you’re also responsible for getting buy-in from your CEO.
As the co-founder and CEO of a global top 40 D&A solutions provider, I’ve found success with three concrete strategies for accelerating the execution of an enterprise data & analytics roadmap by securing the right leadership, lowering TCO (total cost of ownership), and pursuing speed of execution to achieve intended outcomes.
Build sustainable teams for data & analytics outcomes.
Imagine you are responsible for enterprise D&A platforms at an organization. Chances are, you’re dealing with multiple complementary stacks of cloud and analytics technology, which you may have inherited as a result of inorganic growth. As your organization adapts to these changes and/or moves from on-premises to cloud infrastructure, hiring specialists may not be sustainable – not only is it hard to find and recruit these unicorns, but they’re extremely expensive at scale.
So how do you go about building a sustainable team? One approach is to build a team of generalist leaders supported by competent advisory teams with access to a pool of specialists who can offer best-in-class advice as needed. This arrangement affords you the best of both worlds.
Embrace simplicity, self-service, and automation.
Organizations commonly spend north of 20% in annual maintenance fees covering support, upgrades, and peace of mind, without asking if their fixed investments are actually worth the cost. On the flip side, simplicity, automation, and self-service are three pillars of a simple framework to find cost-savings opportunities and make room for growth investments.
If your stack is stable and innovations aren’t solving your organization’s problems, consider self-service and intelligent automation in order to save a lot of time, effort, and cost. Strategies such as automating your continuous integration and continuous delivery (CI/CD) pipeline for D&A combined with intelligent automation using robotic process automation (RPA) and data-driven intelligence can easily save organizations up to 75%.
Shift your teams away from a DIY mindset.
When modernizing data & analytics platforms, there may be a tendency for teams to operate with a DIY mindset. However, moving from one analytics platform to another requires an assessment of the current and target systems, as well as rationalization, extraction of the metadata, conversion of reports, and validation. This process can be complicated by gaps in technical skills and a lack of built-in automation tools – thereby requiring manual effort, as well as more time and money.
But if you leverage reusable assets from partners that can mitigate migration risk, you may be able to save time and money while driving transformation. Specialized tools can accelerate this process by up to 80%, automating manual activities and enabling businesses to focus on what matters most for their bottom line.
As leaders seek to accelerate D&A roadmaps with affordable investments, these three strategies – building sustainable teams; embracing simplicity, self-service and automation; shifting away from a DIY mindset – can help organizations become data-driven, modern and competitive while substantially cutting costs and accelerating speed of execution. In a world where up to 30% of hours worked globally could be automated by 2030, it’s vital for organizations to stay ahead of the curve on a sustainable trajectory.