data'distribution strengths: zero latency, functional diversity, low TCO
data'distribution reveals its value technologically, functionally and economically, benefiting:
- The data and data sources: zero latency, zero impact on applications
- The information and BI system: functional diversity, vast range of uses for a real-time IS
- The IT department: very reasonable total cost of ownership (TCO) and a positive impact on IT cost control.
Zero latency, no impact on applications
- Real-time loading: data'distribution's CDC technology reads the database logs to detect and distribute data events with a latency of under one second. This "zero latency" is a key factor for many businesses, where up-to-date information means a freshness of less than one second.
- Non-intrusive: data'distribution does not intrude on transactions. An asynchronous agent captures changes from the transaction log, so no changes to programs or databases are required.
- Incremental: data'distribution propagates only the changes, in real time (on the fly), rather than performing full reloads.
- High-performance, guaranteed delivery: the technology embedded in data'distribution makes massive use of parallelism to absorb huge workload increases and mitigate network delays. This allows it to be deployed on global WANs, even in countries with poor network infrastructure; network outages are handled in complete transparency.
- Guaranteed transaction integrity: data'distribution captures data transactions in the order in which they occur.
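The capture-and-deliver principle above can be sketched in a few lines. This is an illustrative model only, not data'distribution's actual implementation: a hypothetical agent reads change events from the transaction log and applies them strictly in commit order, preserving transaction integrity even when events arrive out of order.

```python
# Hypothetical sketch of log-based change data capture (CDC): an asynchronous
# agent reads change events from a transaction log and forwards them in commit
# order, without touching source programs or tables. All names (ChangeEvent,
# replicate, lsn) are illustrative, not the product's API.
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class ChangeEvent:
    lsn: int          # log sequence number: encodes commit order
    table: str
    op: str           # "insert", "update" or "delete"
    row: dict

def replicate(log: Iterable[ChangeEvent], target: List[ChangeEvent]) -> None:
    """Apply captured events to the target strictly in log order."""
    last_lsn = -1
    for event in sorted(log, key=lambda e: e.lsn):
        assert event.lsn > last_lsn   # transaction integrity: no reordering
        target.append(event)
        last_lsn = event.lsn

target: list = []
replicate(
    [ChangeEvent(2, "orders", "update", {"id": 7}),
     ChangeEvent(1, "orders", "insert", {"id": 7})],
    target,
)
# Events arrive out of order but are applied in commit (LSN) order.
```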
Functional diversity, a vast range of uses
- Polymorphic use: real-time BI, database cloning, inter-application dialog, migration of applications and databases, dialog in the industrial sector, e-business synchronization, MDM loading, complex event processing (CEP), etc.
- Heterogeneous/database-agnostic: data'distribution supports a large number of sources and targets, and the setup interface abstracts away the nature of each.
- Unlimited transformation on the fly: data'distribution's transformation capabilities are highly advanced: it incorporates an integrated scripting language that is extremely simple to use and fast to run, and external code can be called. As standard, it can enforce the target database's referential integrity with no programming (even if events occur in the wrong order on the source), automatically manage orphans, and update aggregates in real time. This technology explains why, for more than 10 years, data'distribution has been implemented in complex projects, from populating complex ODSes with data re-standardization, to asynchronous data exchanges with industrial robots, to real-time data exchange between heterogeneous applications and ERP solutions.
- Accommodates huge workload increases.
- Can operate upstream of an ETL to achieve very low refresh times.
- Data virtualization: data'distribution can create a virtual database, a very useful feature. Application infrastructures implement increasingly scattered and heterogeneous data and systems; data'distribution is a coordination tool that provides a near-instantaneous virtual view of the global data on each system, unaffected by delays or communication breaks.
- Replication in non-connected mode: companies with a multitude of remote sites that are not interconnected via a global network, and/or with mobile workers, can exchange and synchronize their data in complete security.
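The orphan management described under "Unlimited transformation on the fly" can be illustrated with a small sketch (all names are hypothetical, not the product's API): a child row that arrives before its parent is parked until the parent exists, so the target's referential integrity is never violated even when events reach the target in the wrong order.

```python
# Illustrative sketch of orphan management: child rows whose parent has not
# yet been applied are parked, then released when the parent arrives, so the
# target database's referential integrity always holds. Hypothetical names.
from collections import defaultdict

parents: set = set()             # parent keys already applied to the target
orphans = defaultdict(list)      # parent key -> parked child rows
applied: list = []               # rows written to the target, in order

def apply_parent(key: str) -> None:
    parents.add(key)
    applied.append(("parent", key))
    for child in orphans.pop(key, []):   # release any parked children
        applied.append(("child", child))

def apply_child(parent_key: str, row: str) -> None:
    if parent_key in parents:
        applied.append(("child", row))
    else:
        orphans[parent_key].append(row)  # park until the parent arrives

# A child event reaches the target before its parent:
apply_child("customer-42", "order-1")
apply_parent("customer-42")
# The target order respects referential integrity: parent first, then child.
```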
Low TCO (Total Cost of Ownership)
- Easy adoption and implementation: with extremely fast setup and development (just a few minutes per table), data'distribution coexists seamlessly with the legacy system and can be used for on-demand projects in the form of quick solutions. In turn, this increases IS responsiveness and reduces cost.
- Very fast learning curve: 4 days.
- Almost zero administration.
- Does not impact system performance: data'distribution has a very low CPU footprint on databases, systems and networks (0.02% for 500 Gbits/day). In certain sensitive configurations, such transparency represents an undisputed edge.
- High availability: data'distribution accommodates network outages, machine failures, etc. with automatic recovery and no data loss. Exchanges are synchronized transparently and intelligently. The target database is always available and remains consistent in terms of its content for the user.
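A minimal sketch can show how guaranteed delivery with automatic recovery works in principle (hypothetical names, not data'distribution's internals): delivery resumes from the last acknowledged checkpoint after an outage, so no event is lost and none is applied twice.

```python
# Hypothetical sketch of checkpointed delivery: after an outage, the sender
# resumes from the last acknowledged checkpoint. Events at or below the
# checkpoint are skipped on retry, so nothing is lost or duplicated.
def deliver(events, send, checkpoint=0):
    """Send events after `checkpoint`; return the new checkpoint."""
    for seq, payload in events:
        if seq <= checkpoint:
            continue              # already acknowledged: skip, don't duplicate
        send(payload)
        checkpoint = seq          # advance after each acknowledged send
    return checkpoint

received: list = []
events = [(1, "a"), (2, "b"), (3, "c")]

ckpt = deliver(events[:2], received.append)    # outage after event 2
ckpt = deliver(events, received.append, ckpt)  # retry resends the full batch
# Only the missing event is applied on the retry: no loss, no duplicates.
```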