In the era of digital transformation, data has become the most valuable asset for companies in every sector. The ability to collect, process, analyze and, above all, govern this information is now at the core of any successful strategy. However, a recent study conducted by Denodo in collaboration with IKN Italy reveals a worrying picture of the Italian business landscape: a significant percentage of companies still lack structured data management and governance roles. This gap is not only an operational problem but a real barrier to realizing the full potential of a data-driven approach, confronting companies with complex challenges that range from poor data quality to the difficulty of generating timely, reliable insights. This article explores in depth the challenges emerging from this research, analyzing the impact of the absence of a Chief Data Officer, the disconnection between business and IT, and the implications of poor data quality. We will examine emerging solutions, from data virtualization to more advanced architectures such as Data Fabric and Data Mesh, and the transformative role of cloud technologies and artificial intelligence. The goal is to outline a clear path for Italian companies towards more agile, secure and value-oriented data management, which is fundamental to competing in an increasingly dynamic and data-centric market, where the speed of decisions and the relevance of information can determine corporate success or failure. Understanding these dynamics is the first step towards building a future in which data is not merely an accumulation, but a true strategic engine.
The Italian Landscape and the Data Governance Crisis: A Deep Analysis
The Denodo research highlights a reality in which 29% of Italian companies do not yet have a person who specifically deals with data governance. This is an alarming figure, reflecting a still immature perception of data as a fundamental strategic asset. Fewer than two out of ten companies (19%) can boast a Chief Data Officer (CDO) on their staff, while in most cases (26%) the governance function is delegated to the Chief Information Officer (CIO), a role with a primary focus on technological infrastructure rather than on the strategic enhancement of data. This delegation can lead to insufficient oversight, as the CIO is often overburdened with IT operations and may lack the specific vision or skills needed to address the complex challenges of data governance, which span legal, ethical, quality and corporate-strategy dimensions. The lack of a dedicated, well-defined role exposes enterprises to a number of significant risks, including operational inefficiencies, hidden costs due to unreliable data, difficulty in complying with increasingly stringent regulations such as the GDPR, and above all a substantial inability to extract maximum value from the immense amounts of data generated daily. In an increasingly competitive, data-driven global economy, lacking robust and proactive data governance means condemning oneself to a competitive disadvantage. Business decisions, from the definition of marketing strategies to supply-chain optimization, from the personalization of the customer experience to the mitigation of financial risks, depend intrinsically on the quality, accessibility and integrity of data. Without clear guidance and an organizational structure to support its management, data remains isolated in silos, its quality degrades, and its potential intelligence remains unexpressed, turning from an opportunity into a burden.
This situation highlights not only an operational gap, but a real strategic crisis that Italian companies must urgently address to remain relevant and innovative on the contemporary digital scene, where the volume, velocity, variety and veracity of data (the so-called '4 Vs' of Big Data) continue to grow exponentially, making governance more complex but also more critical than ever.
Business and IT Disconnection: Obstacles to Data Delivery
One of the most critical aspects highlighted by the Denodo research is the deep disconnection that persists between business needs and IT capabilities in data management and delivery. 23% of Italian companies complain about long waits before the business has the required data available, while 19% suffer from data dispersion and isolation across different business structures. These numbers are not mere statistics, but manifestations of a systemic problem that slows the operational and strategic agility of enterprises. The business, driven by the need to react quickly to market dynamics, to personalize the offer to customers and to optimize processes, requires easy, fast and autonomous access to reliable information. IT, on the other hand, often manages complex legacy infrastructures under limited resources and technological constraints that make the rapid delivery of clean, integrated data difficult. This friction generates a vicious circle: the business, frustrated by delays, resorts to 'shadow IT' solutions, creating its own datasets and analyses that often lack rigor, coherence and governance, increasing fragmentation and overall complexity. Data remains trapped in departmental silos – whether Excel sheets, local databases or specific application systems – making a single, holistic view of the customer or of business operations almost impossible. The lack of a single access point and a shared semantics turns the search for relevant information into a real treasure hunt, consuming precious time and undermining trust in the data itself. This inefficiency results not only in higher operating costs, but also in suboptimal decisions, lost market opportunities and a poor capacity to innovate.
The disconnection between those who generate data and those who must use it strategically is not a mere technical problem, but an organizational and cultural challenge that requires a realignment of objectives and a redefinition of roles, with IT evolving from infrastructure provider into facilitator and enabler of corporate intelligence, and with the business developing a greater awareness of the sources and quality of the data it uses for its own analyses.
The Strategic Role of the Chief Data Officer: Architect of Data-Driven Transformation
The scarce presence of a Chief Data Officer (CDO), found in only 19% of Italian companies, is one of the clearest signs of a still-immature approach to data management. While 26% of companies entrust this task to the CIO, it is essential to understand that, although there are overlaps, the role of the CDO is distinct and complementary, and above all indispensable for navigating the data-driven era. The CIO is traditionally focused on technology infrastructure, connectivity, system security and IT operational efficiency. The CDO, by contrast, has as a primary mandate the maximization of data's value as a corporate strategic asset. This means defining the data strategy, establishing governance policies (from quality to privacy, from security to ethics), promoting a data-driven culture, and enabling innovation through the intelligent analysis and use of information. The CDO is the architect who builds the bridge between technology and business objectives, ensuring that data is not only accessible but also reliable, compliant and ready to be transformed into actionable insights. The CDO's responsibilities range from creating a corporate data catalog to supervising integration projects, from defining KPIs for data quality to managing the complete life cycle of information. It is not a purely technical role: it requires a unique combination of strategic vision, deep understanding of the business, leadership skills and solid knowledge of the enabling technologies. An effective CDO works transversally across the organization, collaborating with all functions to identify new data-based opportunities and to solve the challenges related to their management. This presence is crucial to overcoming the disconnection between business and IT, acting as a catalyst for a holistic approach to data that permeates every decision-making level of the company.
Investing in a CDO means investing in the company's ability to make more informed decisions, to optimize operations, to innovate products and services and to build a lasting competitive advantage in a constantly evolving market, where the speed and accuracy of information are decisive. The CDO is not a luxury, but a strategic necessity for any company that aspires to be truly data-driven.
Data Quality as a Fundamental Pillar: Impact and Strategies
Denodo's research reveals that 84% of companies believe the variety of data sources adversely affects the quality of analysis. This figure is emblematic of a central challenge: poor data quality is not an isolated technical problem, but a strategic obstacle that undermines confidence, slows down operations and compromises decisions at all levels. Data quality is not limited to accuracy; it is a multidimensional concept that includes: completeness (is all the necessary information present?), consistency (is the data uniform across different sources?), validity (does the data respect the expected formats and values?), timeliness (is the data up to date?), uniqueness (are there no duplicates?) and integrity (are the relations between data correct?). When one or more of these dimensions fail, the consequences are felt in every business area. Think of a CRM with duplicate or wrong customer addresses: marketing campaigns become ineffective, communication fails, the customer experience suffers, and corporate reputation can be compromised. In operational processes, inaccurate inventory data can lead to excess stock or stock-outs, generating inefficiencies and losses. Strategically, sales forecasts based on unreliable historical data can lead to incorrect production or investment decisions, with significant financial repercussions. The areas most affected, according to the research, are those related to customers (25%), business operations (24%) and sales (20%), areas where data quality is directly tied to forecasting and to the definition of market strategies. To address this challenge, it is essential to take a proactive approach to data quality management (DQM). This includes profiling data to identify critical issues, defining validation rules, implementing cleansing and enrichment processes, and adopting Master Data Management (MDM) solutions to create a 'single source of truth' for critical entities (customers, products, suppliers).
Data quality is not a one-off project, but a continuous process that requires constant monitoring, governance and the commitment of the whole organization. Only then can data be transformed from a potential source of errors into a reliable pillar of corporate growth and innovation.
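To make the quality dimensions above concrete, here is a minimal, illustrative sketch of automated data profiling. The record layout, field names and validation rules are hypothetical, not taken from the research; real DQM platforms apply the same idea (completeness, uniqueness, validity checks) at far greater scale.

```python
import re

def profile_customers(records):
    """Return simple quality metrics: completeness, uniqueness, validity.

    Assumes a hypothetical customer schema with 'id' and 'email' fields.
    """
    total = len(records)
    email_re = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

    # Completeness: every required field is present and non-empty.
    complete = sum(1 for r in records if all(r.get(f) for f in ("id", "email")))
    # Uniqueness: how many records collide on the same id.
    unique_ids = len({r.get("id") for r in records})
    # Validity: the value respects the expected format.
    valid_emails = sum(1 for r in records if email_re.match(r.get("email") or ""))

    return {
        "completeness": complete / total,
        "duplicate_ids": total - unique_ids,
        "email_validity": valid_emails / total,
    }

customers = [
    {"id": 1, "email": "anna@example.com"},
    {"id": 1, "email": "anna@example.com"},   # duplicate id
    {"id": 2, "email": "not-an-email"},       # invalid e-mail format
    {"id": 3, "email": None},                 # incomplete record
]
metrics = profile_customers(customers)
```

Running such checks continuously, rather than once, is what turns profiling into the monitoring process the paragraph above describes.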
Data Virtualization: A Bridge to Agility and Democratization
Faced with the challenges of fragmentation, long waiting times and poor data quality, one technological solution is gaining significant ground: data virtualization. The Denodo research highlights that fully 61% of Italian companies are considering the adoption of these technologies to solve the challenges inherent in integrating and managing their information assets. But what exactly is data virtualization, and why is it considered such a promising solution? In short, data virtualization creates a unified, real-time logical view of all corporate data sources, regardless of their location (on-premise, cloud), their format (structured, unstructured) or their complexity. Instead of physically moving data into a data warehouse or data lake to integrate it (a slow and expensive process), virtualization leaves the data where it originally resides and creates an abstraction layer that makes it accessible as if it were in a single repository. The benefits are multiple and deeply impactful. First, agility: the business can access the required data in drastically reduced times, eliminating long waits. Second, democratization: virtualization facilitates a self-service approach, allowing business users to query and analyze data independently, without a constant dependence on IT for reporting. The virtualization platform acts as a single access point, where data can be modeled, enriched and made available in comprehensible, consistent formats for end users. This significantly improves the overall quality of the data presented, since transformation and governance rules can be applied centrally. Moreover, by reducing the need to replicate data, companies save on storage and infrastructure costs and minimize security and compliance risks, since the original data remains protected in its sources.
Data virtualization is not a substitute for data warehouses or data lakes, but rather a complement that makes them more effective. It acts as a powerful logical interface that orchestrates access to and integration of all sources, including those residing in complex environments such as cloud or legacy systems, enabling a truly flexible data-driven ecosystem that is responsive to constantly changing business needs, and ensuring a faster return on investment.
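The core idea, leaving data in place and answering questions through one logical layer, can be sketched in a few lines. This toy example (source names, schemas and the join key are invented for illustration) federates an 'on-premise' relational table and a 'cloud' customer list at query time, with no data copied into a central store:

```python
import sqlite3

# "On-premise" source: a relational table of orders (kept in its own system).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?)",
               [(1, 120.0), (2, 80.0), (1, 40.0)])

# "Cloud" source: customer master data, here a stand-in for an external API.
crm_api = [{"id": 1, "name": "Rossi SpA"}, {"id": 2, "name": "Bianchi Srl"}]

def customer_spend(name):
    """Answer a business question across both sources at query time,
    the way a virtualization layer would, without replicating the data."""
    customer = next(c for c in crm_api if c["name"] == name)   # cloud lookup
    row = db.execute(
        "SELECT SUM(amount) FROM orders WHERE customer_id = ?",
        (customer["id"],),
    ).fetchone()                                               # on-prem query
    return row[0] or 0.0

total = customer_spend("Rossi SpA")
```

A real platform adds caching, security rules and central governance on top of this abstraction, but the principle, one access point over many live sources, is the same.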
Beyond the Data Lake: Evolution Towards Data Fabric and Data Mesh
If data virtualization is a fundamental step towards agility, the data architecture landscape is constantly evolving, going far beyond the traditional Data Warehouse and Data Lake. The Denodo research notes that the adoption of Data Lakes is still not widespread in Italian companies, with more than one third of organizations (39%) not owning one. Although Data Lakes promised great flexibility in storing large volumes of raw data in native format, they often turned into 'data swamps': ungoverned repositories, difficult to discover and to use effectively. To overcome these limitations and respond to the increasing complexity of data sources and the diverse needs of users, more advanced concepts have emerged, such as the Data Fabric and the Data Mesh. The Data Fabric is a holistic data management approach, an integrated and intelligent architecture that aims to unify data management across heterogeneous environments (on-premise, multi-cloud, edge) through automation, AI and machine learning. Its goal is to provide an integrated view of the data, facilitating discovery, access and governance regardless of physical location. It is not a product but an architectural model that orchestrates different technologies – including data virtualization – to create a cohesive, self-managing data ecosystem, allowing companies to manage data more efficiently, scalably and securely. The Data Mesh, on the other hand, represents a more radical change, not only technological but also organizational and cultural. It is a decentralized architecture that treats data as a product, assigning data ownership to specific domain teams (e.g. customer teams, product teams), which are responsible for providing high-quality, documented and easily consumable data to other teams. The four fundamental principles of Data Mesh are: domain-based ownership, data as a product, a self-service data platform, and federated computational governance.
This approach aims to overcome the bottlenecks of centralized data teams and to give more autonomy to business teams, accelerating innovation. Both Data Fabric and Data Mesh try to solve the data isolation and slow-access issues highlighted by the Denodo research, offering more agile and scalable frameworks for managing information assets in a distributed and diversified data landscape. Data virtualization can act as a key component within both of these paradigms, providing the abstraction layer needed to integrate and access data in a coherent, governed way, and facilitating the transition from a monolithic approach to a more distributed, value-oriented data model.
Cloud, Artificial Intelligence and Their Impact on Data Management: New Opportunities and Challenges
The acceleration towards the cloud is a fact for Italian companies, with more than four out of five (84%) claiming to have a cloud initiative. However, only 29% report having more than half of their data in the cloud, confirming that migration is still in an early but growing stage. The cloud offers undeniable benefits for data management: near-unlimited scalability, flexibility, access to managed services and, above all, latest-generation Artificial Intelligence (AI) and Machine Learning (ML) platforms. These technologies, increasingly integrated into cloud offerings, promise to revolutionize data analysis, process automation and insight generation. AI, and Generative AI in particular, is capturing attention for its ability to create content, optimize search and even interact with data in new ways, as suggested by concepts such as 'zero click' and 'zero checkout' in marketing. However, the enthusiasm for AI must be balanced by a pragmatic approach, as indicated by a market that has moved past the hype phase. Companies face concrete risks, such as AI 'hallucinations' (plausible but false responses) and the fragility of the supply chain of data that feeds these models. Here the fundamental importance of data governance emerges: AI is effective only when fed with high-quality data that is governed, secure and reliable. Without a solid data foundation, AI becomes a source of risk instead of value, amplifying the errors and inefficiencies already present. The concept of 'Business AI', promoted by companies like SAP (quoted in the article), emphasizes the need for AI based on authoritative data, integrated into existing, controlled business processes through mechanisms such as RAG (Retrieval Augmented Generation), which combine the power of generative models with the accuracy of verified internal data sources.
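The RAG pattern just mentioned can be illustrated with a deliberately simplistic sketch: retrieve the most relevant verified internal document first, then constrain the generative model's prompt to that context. The document store, the word-overlap scoring and the prompt wording are placeholder assumptions, not any vendor's actual implementation; production systems use vector embeddings and a real LLM call in place of these stand-ins.

```python
# Hypothetical store of verified internal documents (the "authoritative data").
internal_docs = [
    "Q3 revenue grew 12% driven by the retail segment.",
    "The refund policy allows returns within 30 days of delivery.",
]

def retrieve(question, docs):
    """Naive retrieval: pick the document sharing the most words
    with the question (real systems use semantic embeddings)."""
    q_words = set(question.lower().split())
    return max(docs, key=lambda d: len(q_words & set(d.lower().split())))

def build_grounded_prompt(question):
    """Ground the model in retrieved context so answers come from
    verified sources rather than the model's imagination."""
    context = retrieve(question, internal_docs)
    return (f"Answer using only this context:\n{context}\n"
            f"Question: {question}")

prompt = build_grounded_prompt("What is the refund policy for returns?")
```

The point of the pattern is exactly the one the paragraph makes: the generative step only sees data that governance has already vetted, so hallucination risk is reduced at the source.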
The AWS service interruptions mentioned in similar contexts serve as a warning: cloud dependency requires robust resilience, backup strategies and, possibly, a multi-cloud approach to mitigate risks. Cloud and AI integration is not only a technological trend, but a strategic transformation that requires a complete rethink of data management, placing governance and quality at the centre to unlock the true innovative potential of these technologies and to transform data into a sustainable, secure growth engine.
Data Self-Service and Data-Driven Culture: Towards Autonomy and Innovation
The push towards greater agility and ease along the path that brings data to those who must analyze it is a clear need emerging from the Denodo research, with a strong demand to work independently and to develop a self-service approach to data analysis and consumption. Although data self-service is now a consolidated reality, the research reveals that in most cases (65%) IT still maintains an important supervisory role, and only 19% of companies adopt complete self-service in which the business operates independently. This caution is understandable: granting full autonomy without adequate guardrails can lead to chaos, inconsistencies and security risks. However, effective self-service is the key to democratizing access to data and speeding up decision-making. To achieve true data self-service, business users must be equipped not only with the right tools (intuitive Business Intelligence platforms, user-friendly virtualization interfaces, data catalogs for data discovery), but also with the skills and culture necessary to use them responsibly and meaningfully. Data literacy training is crucial: users need to understand data sources, their definitions, the metrics and the implications of their analyses. IT, in this scenario, evolves from 'gatekeeper' to 'enabler', providing the infrastructure, tools and governance that allow the business to explore data safely. This shift requires a cultural transformation that permeates the entire organization, promoting a data-driven culture. It is not only about having access to data, but about embedding data analysis into every decision-making process, from strategy to everyday activity. This involves commitment from leadership, creating a common language around data, promoting a curious, analytical mindset, and recognizing 'data champions' within business teams.
When self-service is well implemented, it results in greater speed in gaining insights, a reduction in the IT request backlog, a greater capacity for innovation and, ultimately, an empowerment of employees, who can actively contribute to business value on the basis of concrete evidence. Self-service is not the absence of governance, but governance that empowers autonomy, transforming the complexity of data into a competitive advantage available to all.
Strategies for a Sustainable Data-Driven Future: Lessons for Italian Companies
The challenges outlined by the Denodo research, from the lack of a Chief Data Officer to the disconnection between business and IT, from poor data quality to the complexity of architectures, describe a landscape in which Italian companies have a significant path to travel to reach full data-driven maturity. Yet within these challenges lie major opportunities for growth and innovation. To build a sustainable and resilient data-driven future, Italian companies must adopt a strategic, holistic approach that is not limited to one-off technological implementations, but involves people, processes and culture. Here are the key strategies:
1. Prioritize Data Leadership and Organization: it is imperative to invest in creating a Chief Data Officer role, or a function dedicated to data governance, ensuring that this figure has the executive support and the authority needed to guide change. The CDO must be the catalyst that unifies the vision of data between business and IT.
2. Adopt Enabling Technologies and Modern Architectures: data virtualization is an essential bridge towards agility of access and democratization. Looking further ahead, architectures such as Data Fabric and Data Mesh can offer long-term solutions for the scalability and management of complex, distributed data ecosystems, especially in multi-cloud contexts. Strategic adoption of the cloud, with attention to data security and sovereignty, is essential to access scalable resources and advanced AI/ML tools.
3. Invest in Data Quality and Governance: implementing robust data quality programs and defining a clear, shared governance framework is the foundation on which to build any data-driven initiative. Without reliable data, any analysis or AI model is destined to fail.
4. Promote a Data-Driven Culture and Data Literacy: empowering business users through self-service is crucial, but it must be accompanied by data literacy programs that develop the skills needed to interpret and use data critically. The company culture must value data as a decision-making resource.
5. Strengthen Security and Compliance: in a world of growing cyber threats and privacy regulations (the GDPR first among them), data security and compliance must be integrated at every stage of the data life cycle. Service interruptions at large cloud providers (such as AWS) remind us of the importance of robust contingency plans.
Italian companies have the potential to transform their current challenges into competitive advantages. The era of data-driven transformation requires that decisions be guided by insights, and this, as pointed out by Gabriele Obino of Denodo, requires a democratization of access to data while guaranteeing security and governance. Denodo's mission, and the hope for all companies, is to let organizations focus on their business goals and on value for customers, without the burdens intrinsic to data management. Only in this way, by accessing all the necessary data immediately and easily, regardless of its location or complexity, will companies be able to thrive in a future that is already here.



