The Denodo Data Innovation Award recognizes client companies that have exemplified the best and most innovative data architectures. In this segment, three Denodo customers will present their use cases on topics such as data fabric, data mesh, and multi/hybrid cloud solutions.
Denodo has recently released support for SQL-based stored procedures. Together with new capabilities for data replication, incremental updates, and ELT execution, this creates a powerful tool for managing complex data integration tasks with ease. This workshop will teach you how to use these features through practical examples.
The data optimizer is the most complex piece of engineering in the Denodo Platform. Understanding how it works is critical to mastering performance tuning.
Discover how Denodo helps address multi-layer data architectures with hybrid cloud, multi-cloud, multi-data-center, inter-company, and data mesh deployments. In this video you will learn how Denodo works in a multi-layer architecture using one or more Denodo instances (Denodo clusters) as a data source.
Data management reimagined. In its eighth year of existence, Denodo DataFest brings you some of the most exciting news in the data management world. Let's celebrate data.
Proper data monitoring of an enterprise system is critical to understand its capacity and growth, to anticipate potential issues, and even to understand key ROI metrics.
In this video you will learn how to improve the performance of your advanced data analytics systems, from traditional reporting with tools like Tableau or Power BI to accelerating the MDX cube interface.
Denodo embeds a Presto cluster to efficiently process and manage content in your data lake. Learn how to configure and deploy it, and when to use it as part of a global Data Fabric strategy.
Join this session to learn why AI will be necessary for any modern data infrastructure, how AI will influence federated learning within any organization, and how prepared your organization is for this AI wave.
Learn the best practices on code migrations between different environments, how to implement a CI/CD strategy with success, and how to leverage Data Solution Manager and its APIs.
This session will review best practices for working with data APIs and how to leverage the different no-code options for enabling services via SOAP, REST, Denodo RESTful, GraphQL, OData, GeoJSON, and Geoservices.
Did you know that Denodo offers a container repository and Helm charts to simplify Kubernetes deployments? In this session, we will show you how easy it is to deploy a complete environment with these technologies.
This session will show how different development strategies can be implemented with Denodo, along with best practices. It will also review how to organize projects with Denodo, structure the metadata, and develop views.
Denodo's Data Catalog uniquely combines cataloging capabilities such as search, documentation, and lineage with the execution capabilities provided by the virtual layer. We will review how to leverage those capabilities to get the most out of a self-service strategy. We will also review the future evolution of the Data Catalog during 2023.
A conversation about the unique set of people, platform, and process ingredients that Mainline uses to help customers turn their data into a competitive advantage. It’s really a “think different” approach to unlocking the full value of Denodo over time.
This exclusive session will explore the collaborative synergy between CDW, AWS, and Denodo, showcasing how they work together to deliver tangible business value.
Presto, as a distributed compute engine, has been around for over a decade, yet it continues to evolve quickly to meet the changing data landscape. Join the Presto Foundation's Technical Steering Committee Chair, Tim Meehan, to learn more about Presto, how it works, and the innovative features in development that continue to make Presto a top engine choice for data lakes and lakehouses.
Join us for a fireside chat that explores how we can use Denodo's data security features to meet strict security requirements, and a flexible deployment model to augment infrastructure governance. This session will explore real-world case examples of how Denodo's unified security model enables organizations to deploy a secure data fabric layer to serve operational and analytical workloads and deliver business value.
Join us in this fireside chat with Denodo and Amplifi to understand how these two industry leaders are poised to solve some of the most pressing data management problems.
Join us for a dynamic 20-minute fireside chat featuring Steve Simpson, Chief Revenue Officer, Aligned Automation and Amir Assar, Regional Sales VP at Denodo. The discussion will illuminate how tailored client services can amplify business function value.
Every organization has some form of sensitive data that it needs to protect, both from unauthorized access and from misuse. In many cases, organizations do not know what sensitive data they have or where it is stored. Even when they know what sensitive data they own, they struggle to protect it from different users, each with different needs.
This talk will discuss the current trend of utilizing generative artificial intelligence for a variety of business applications, and the significant issues surrounding this technology, specifically with regard to explainability, hallucination, ethics, and security concerns.
Solve for today’s analytics demands and seamlessly scale and modernize your business by incorporating a logical data management architecture using Denodo Platform on GCP. Streamline your migration path to BigQuery and accelerate your time-to-insights in a hybrid and multi-cloud environment by bringing all your data (SaaS, On-prem, Cloud) together to get a clearer picture of the customer journey, predict business outcomes, and create more personalized experiences for your customers.
Join this fireside chat with Wemba Opota (Microsoft) and Mitesh Shah (Denodo), where they discuss and outline a strategy to maximize your investment in the Azure cloud using the Denodo Platform. Learn about the key integration aspects and how the Denodo Platform can help amplify your data integration and data management needs on Azure hybrid cloud, with minimal risk when migrating your workloads to the cloud.
Mint Mobile leveraged the Denodo Platform to build a Logical Data Fabric to accelerate and democratize data across the company. Over the years they utilized Denodo for many critical use cases including addressing Security Risk Management for subscribers porting out by social engineering. In this session, Christopher Gi, AVP of Data Engineering at Mint Mobile, explains how they invent and simplify to accelerate time to market in developing data products/solutions by leveraging Denodo.
In 2018, CITY Furniture was a promising, modest-sized business, but when the pandemic hit and many companies scaled back, CITY Furniture made the opposite decision and pushed forward by building out a modern data infrastructure that included a logical data fabric enabled by the Denodo Platform. That decision paid off, and by the end of 2021, CITY Furniture had grown its employee base and driven its revenue up to nearly a billion dollars.
This panel discusses the challenges Data Mesh addresses and how organizations can align its principles to their unique environments using data virtualization. Data Mesh revolutionizes data management, but implementing it can be complex. We explore the role of data virtualization as a strategic enabler, and panelists share insights on the challenges of traditional approaches and how Data Mesh promotes data democratization, scalability, and agility.
Strategizing and implementing a robust cloud data architecture is more relevant than ever, because so many user organizations are migrating data to the cloud, modernizing analytics and data warehouses, deploying data lakes and lakehouses, and adopting new practices for data management. A well-designed, cloud-based data architecture provides a home for highly diverse data and the numerous use cases it enables.
Empowered by modern data management and insights-driven decision making, many senior data and analytics executives are in the driver’s seat to steer their organizations toward high growth, maximizing operational efficiency and creating a data-driven culture. We are excited to bring you three such executives, who will join a Denodo subject matter expert to discuss how they are propelling their organizations toward a digital future.
After software, is it generative AI’s turn to eat the world? While it may be too early to say, many enterprises have started building generative-AI-based applications to serve their customers better. In this presentation, learn how the powerful combination of AWS and Denodo is poised to create some of the early but practical generative AI applications on AWS, as Denodo becomes the serving layer for one of the most critical components of generative AI: the data.
Confusion reigns, with companies sticking to traditional data management architectures of copying data into a physical repository. Is that a viable approach to the future of data architecture, where data and new technologies are exploding? How do logical architectures provide a better balance, increasing speed to data while lowering the cost and effort of enabling a single pane of data? This panel will explore these topics with top minds in data management.
Attend this session to discover some of the most interesting new features that Denodo has recently released, and explore how to take advantage of them in your own use cases.
Hear from Denodo's CTO and Product Management team about the next big things in data management. Also, listen to Tim Meehan, Chair of the Technical Steering Committee at the Presto Foundation, about how Denodo and other Presto Foundation members are shaping the future of Presto. Finally, hear from Denodo customer Estes Express about how they leverage the Denodo Platform to streamline freight transportation and provide customers with up-to-the-minute updates.
Traditional data architectures pose a challenge for AI workloads. They have led to data silos, data quality issues, and scalability problems. It takes too long to derive intelligence from current setups, and the unreliable pipelines are too brittle to be trusted.