California-based Vinay Samuel, CEO and Founder of Zetaris, highlights the rise of confidential computing to provide greater data security, and predicts that data virtualization software will increase in adoption, shifting the industry beyond the traditional extract, transform and load (ETL) processes that are time-consuming, costly and inefficient.
1. The emergence of confidential computing in 2024 will address heightened concerns around data security and privacy. Confidential computing isolates data within a secure processing enclave in a cloud computing environment. With Australian businesses fearing hefty fines, adoption of confidential computing will rapidly increase throughout 2024.
Confidential computing protects data using hardware-based trusted execution environments, providing assurance of data integrity, data confidentiality and code integrity. In cloud environments, it can protect data during processing, with exclusive control of encryption keys delivering stronger end-to-end data security by isolating sensitive data such as customer PII. Throughout 2024, confidential computing will be deployed to give businesses greater assurance that their data is protected and confidential.
2. Data virtualization software will enable the emergence of the open lakehouse, allowing businesses, governments and organizations to leverage rapid data insights without the cost, complexity and security issues associated with moving, cleaning and storing data using traditional ETL processes. The open lakehouse allows organizations to run structured and unstructured analytics at scale by connecting data in the lakehouse to data in data silos in real time.
This will usher in a transformational shift in the way governments, businesses and organizations use analytical data virtualization in their digital and data strategies. It will enable instant access to, and joining of, virtual data silos across the whole data ecosystem, while ensuring data governance and security by removing the need to move data and instead analysing it at its source. Whether data sits in the cloud, on-premise, or in a hybrid environment, those adopting these new approaches will gain a single, unified view of it, enabling faster, cheaper, more comprehensive and accurate data insights. Data virtualization approaches will continue to mature in 2024.
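The querying-in-place idea can be illustrated with a toy federated query. Here two SQLite databases stand in for separate data silos, and an in-memory session attaches and joins them without first copying the data into a central store (the database names and tables are illustrative, not part of any product described above).

```python
import os
import sqlite3
import tempfile

tmp = tempfile.mkdtemp()
sales_path = os.path.join(tmp, "sales.db")
crm_path = os.path.join(tmp, "crm.db")

# Populate each "silo" independently, as separate systems would.
with sqlite3.connect(sales_path) as db:
    db.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
    db.executemany("INSERT INTO orders VALUES (?, ?)",
                   [(1, 120.0), (2, 75.5), (1, 30.0)])

with sqlite3.connect(crm_path) as db:
    db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
    db.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(1, "Acme"), (2, "Globex")])

# A virtualization-style layer: attach both silos and join them in place,
# rather than ETL-ing everything into one central warehouse first.
hub = sqlite3.connect(":memory:")
hub.execute("ATTACH DATABASE ? AS sales", (sales_path,))
hub.execute("ATTACH DATABASE ? AS crm", (crm_path,))
rows = hub.execute(
    "SELECT c.name, SUM(o.amount) "
    "FROM crm.customers c JOIN sales.orders o ON o.customer_id = c.id "
    "GROUP BY c.name ORDER BY c.name"
).fetchall()
print(rows)  # [('Acme', 150.0), ('Globex', 75.5)]
```

Real data virtualization platforms do this across heterogeneous engines and clouds, but the principle is the same: the query travels to the data, not the other way around.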
3. Data processing now leverages the mixed capability of CPUs and GPUs, enabling data preparation, which has traditionally been handled by CPUs, and data analytics, which is best suited to GPUs, to occur on the same machine. This is giving rise to a new computing class – the Analytical Processing Unit (APU).
4. Natural language interfaces, large language models, and data analytics tools will come together and enable users to ask questions in plain language rather than through traditional analytics tools. This will improve explainability and fast-track insights from data analysis, turbocharging business innovation and boosting competitiveness.
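The shape of such an interface can be sketched in a few lines. In a real system a large language model would translate the plain-language question into SQL; in this hypothetical sketch a simple keyword-to-template map stands in for the model, and every phrase and table name is illustrative.

```python
# Toy natural-language-to-SQL interface. A production system would send the
# question to an LLM; here a template lookup plays that role (illustrative only).
TEMPLATES = {
    "revenue by region": "SELECT region, SUM(amount) FROM sales GROUP BY region",
    "top customers": "SELECT name FROM customers ORDER BY lifetime_value DESC LIMIT 10",
}

def question_to_sql(question: str) -> str:
    """Map a plain-language question onto a known SQL template."""
    q = question.lower()
    for phrase, sql in TEMPLATES.items():
        if phrase in q:
            return sql
    raise ValueError("No template matched; a real system would call an LLM here.")

print(question_to_sql("What was revenue by region last quarter?"))
# SELECT region, SUM(amount) FROM sales GROUP BY region
```

The point of the prediction is the interface, not the lookup: users express intent in business language and the system produces (and can explain) the underlying query.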
5. On-premise hardware data storage systems will become more intelligent, queryable and integrated with the cloud in 2024. We will also see multi-cloud requirements accelerate to span multiple data centers as well as multiple clouds, as customers seek greater speed and agility in joining data across their business for improved decision-making and analytics.
6. Unified semantic lakehouse – the concept of stitching together and transforming data into a business-readable language across various databases, files, streams and other data sources in real time for the purpose of SQL-based analytics (known as data virtualization) will extend to the idea of a single, queryable, multi-cloud virtual data lakehouse. This will negate the need for continuous movement of data between clouds, enabling customers to join datasets and discover new insights and hidden business intelligence.
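A minimal sketch of the semantic idea, assuming a hand-built mapping rather than any particular product: business-readable names are resolved to physical columns and expressions spread across silos, and a query against the business vocabulary compiles into SQL over the underlying sources (all identifiers here are hypothetical).

```python
# A toy semantic layer: business terms mapped to physical columns/expressions
# that live in different source systems (names are illustrative).
SEMANTIC_MODEL = {
    "Customer Name": "crm.customers.name",
    "Total Spend": "SUM(sales.orders.amount)",
}

def to_physical_sql(measures: list, dimensions: list) -> str:
    """Compile a business-vocabulary query into SQL over the physical sources."""
    cols = [SEMANTIC_MODEL[d] for d in dimensions] + [SEMANTIC_MODEL[m] for m in measures]
    group = ", ".join(SEMANTIC_MODEL[d] for d in dimensions)
    return (f"SELECT {', '.join(cols)} FROM crm.customers "
            f"JOIN sales.orders ON sales.orders.customer_id = crm.customers.id "
            f"GROUP BY {group}")

print(to_physical_sql(measures=["Total Spend"], dimensions=["Customer Name"]))
```

Analysts ask for "Total Spend by Customer Name"; where those fields physically live, and in which cloud, becomes an implementation detail of the semantic model.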
7. Organizations that are data-driven and heavily reliant on Big Data platforms, analytical databases, data warehouses and data lakehouses will begin to understand in 2024 the value of separating business logic from the storage of data within database engines. Keeping business logic, metadata and schemas in their own stores means the painful traditional model of data migration will be dramatically reduced.
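One way to picture that separation, as a hedged sketch: hold the schema as engine-neutral metadata and generate engine-specific DDL from it, so moving between databases becomes regeneration rather than hand rewriting (the type mappings and table names below are illustrative).

```python
# Engine-neutral schema metadata, kept outside any one database engine.
SCHEMA = {"orders": [("customer_id", "integer"), ("amount", "decimal")]}

# Per-engine type mappings (illustrative; real catalogs cover far more types).
TYPE_MAP = {
    "postgres": {"integer": "INTEGER", "decimal": "NUMERIC(12,2)"},
    "sqlite": {"integer": "INTEGER", "decimal": "REAL"},
}

def ddl(table: str, engine: str) -> str:
    """Generate CREATE TABLE DDL for a target engine from neutral metadata."""
    cols = ", ".join(f"{name} {TYPE_MAP[engine][t]}" for name, t in SCHEMA[table])
    return f"CREATE TABLE {table} ({cols})"

print(ddl("orders", "postgres"))  # CREATE TABLE orders (customer_id INTEGER, amount NUMERIC(12,2))
print(ddl("orders", "sqlite"))    # CREATE TABLE orders (customer_id INTEGER, amount REAL)
```

When the schema and business logic live in a store of their own, retargeting a new engine is a code-generation step rather than a migration project.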
8. Data observability – an organization's ability to monitor, analyse and understand the state of its data, supporting systems and associated risks – is an area within the data management and analytics world that will enhance the usage of data as an asset. Information management and analytics vendors will encourage organizations to implement point solutions or new features within vendor data platforms throughout 2024.
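As a rough illustration of what such monitoring involves, the sketch below implements two common data-observability checks, null rate and freshness, with illustrative thresholds and field names.

```python
from datetime import datetime, timedelta, timezone

# Toy observability checks: null rate and freshness of a dataset.
# Thresholds and field names are illustrative, not from any vendor tool.
def check_null_rate(rows, field, max_rate=0.05):
    """Return (passed, observed_rate) for the share of null values in a field."""
    nulls = sum(1 for r in rows if r.get(field) is None)
    rate = nulls / len(rows)
    return rate <= max_rate, rate

def check_freshness(last_loaded, max_age=timedelta(hours=24)):
    """Return True if the dataset was loaded within the allowed window."""
    return datetime.now(timezone.utc) - last_loaded <= max_age

rows = [{"amount": 10.0}, {"amount": None}, {"amount": 7.5}, {"amount": 3.2}]
ok, rate = check_null_rate(rows, "amount")
print(ok, rate)  # False 0.25 -- null rate above the 5% threshold

last_loaded = datetime.now(timezone.utc) - timedelta(hours=2)
print(check_freshness(last_loaded))  # True -- loaded within the last 24 hours
```

Platform-scale observability adds lineage, anomaly detection and alerting on top, but checks of this shape are where the practice starts.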