Data Quality at Source
Consistent Data Quality applied across all Data Products, capturing 6 dimensions of Data Quality, with full transparency and controls. Deploy on-premise or in the cloud.
While every organisation is different in its own way, one challenge they all share is data quality. This is mainly due to fragmented Data Infrastructure, a lack of Data Integrity checks by Producers, deferred checks that identify problems too late, and multiple siloed data quality (DQ) solutions tied to specific implementations.
Poor quality data leads to reconciliation breaks and manual adjustments, which become the norm for IT and Business. Repeated short-term workarounds result in progressively longer lead times for IT deliveries.
ALFA Active-DQ guarantees that data is correct at the source, when it is being produced. This ensures that data sent downstream to Analytics and Regulatory Reporting systems is fully validated.
Reduces cost
Eliminates the need for multiple reconciliations and manual adjustments.
Evolutionary, not revolutionary
Non-invasive introduction of Data Quality checks into existing or new Producers, as sketched below.
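For intuition, one non-invasive shape this can take is a decorator around the Producer's existing publish path. The sketch below is hypothetical: the ValidatingPublisher type and its interfaces are illustrative stand-ins, not ALFA's actual API.

```java
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Function;

// Hypothetical sketch: the existing Producer already publishes records through
// a Consumer<T>; wrapping that sink adds DQ checks without modifying its code.
final class ValidatingPublisher<T> implements Consumer<T> {
    private final Consumer<T> delegate;              // the Producer's original sink
    private final Function<T, List<String>> rules;   // DQ rules for T
    private final Consumer<String> onViolation;      // e.g. route to a quarantine queue

    ValidatingPublisher(Consumer<T> delegate,
                        Function<T, List<String>> rules,
                        Consumer<String> onViolation) {
        this.delegate = delegate;
        this.rules = rules;
        this.onViolation = onViolation;
    }

    @Override
    public void accept(T record) {
        List<String> violations = rules.apply(record);
        if (violations.isEmpty()) {
            delegate.accept(record);                 // clean data flows on unchanged
        } else {
            violations.forEach(onViolation);         // bad data is caught at source
        }
    }
}
```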
Accelerate IT delivery
Data Quality Rules expressed in the model are generated directly into code libraries, so developers do not need to translate specifications into code.
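As an illustration, a set of rules declared once in the model could be generated into a plain code check like the hypothetical Java sketch below. The Trade record, its fields, and the TradeValidator name are stand-ins, not ALFA's actual generated output.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Pattern;

// Hypothetical sketch of what generated validation code might look like.
// The Trade record and its rules are illustrative, not ALFA's real output.
record Trade(String tradeId, String currency, double notional) {}

final class TradeValidator {
    private static final Pattern CCY = Pattern.compile("[A-Z]{3}");

    // Each check mirrors a rule declared once in the model, so developers
    // never hand-translate the specification into code.
    static List<String> validate(Trade t) {
        List<String> violations = new ArrayList<>();
        if (t.tradeId() == null || t.tradeId().isBlank())
            violations.add("tradeId must be present");
        if (t.currency() == null || !CCY.matcher(t.currency()).matches())
            violations.add("currency must be a 3-letter ISO code");
        if (t.notional() <= 0)
            violations.add("notional must be positive");
        return violations;
    }
}
```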
Purpose-built data quality solution to enforce data integrity checks at the source with minimal latency and no change to underlying applications.
Technology-agnostic Modelling
Active-DQ is implementation-agnostic: the Producer does not need to be hosted on any specific infrastructure, such as the Cloud.
DQ rules are generated into Java, Scala, and Python to execute at native speed; configurations for 3rd-party DQ tools can also be generated.
Comprehensive Data Quality
Constraints cover value/range/size, text pattern/format, inter-field relationships, and calculations/expressions/aggregations.
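To make these categories concrete, the hypothetical Java sketch below expresses one check of each kind over an illustrative Order record. It is not ALFA's modelling syntax, only an indication of what such constraints enforce.

```java
import java.util.List;
import java.util.regex.Pattern;

// Illustrative sketch of the constraint categories above, written as plain
// Java checks over a hypothetical Order record (not ALFA's modelling syntax).
record Order(String orderId, String isin, int quantity, double price,
             double grossAmount, List<Double> legAmounts) {}

final class OrderChecks {
    private static final Pattern ISIN = Pattern.compile("[A-Z]{2}[A-Z0-9]{9}[0-9]");

    static boolean valueRangeSize(Order o) {     // value / range / size
        return o.quantity() > 0 && o.price() > 0 && o.orderId().length() <= 32;
    }

    static boolean textPatternFormat(Order o) {  // text / pattern / format
        return ISIN.matcher(o.isin()).matches();
    }

    static boolean interField(Order o) {         // inter-field consistency
        return Math.abs(o.grossAmount() - o.quantity() * o.price()) < 1e-9;
    }

    static boolean aggregation(Order o) {        // calculations / aggregations
        double legTotal = o.legAmounts().stream().mapToDouble(Double::doubleValue).sum();
        return Math.abs(legTotal - o.grossAmount()) < 1e-9;
    }
}
```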
The ALFA runtime executes these DQ validations as objects are being deserialised, and can be used in stream or batch mode.
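Below is a minimal sketch of validation during deserialisation, under assumed interfaces: the ValidatingDecoder type and its parse and rules hooks are illustrative, not the ALFA runtime's actual API. The point is that the same checks run whether records arrive one at a time from a live stream or in bulk from a file.

```java
import java.util.List;
import java.util.function.Function;
import java.util.stream.Stream;

// Hypothetical sketch: validation runs inside the decode step, so every
// consumer receives checked objects in both stream and batch modes.
final class ValidatingDecoder<T> {
    private final Function<String, T> parse;         // raw record -> object
    private final Function<T, List<String>> rules;   // DQ rules for T

    ValidatingDecoder(Function<String, T> parse, Function<T, List<String>> rules) {
        this.parse = parse;
        this.rules = rules;
    }

    // Stream mode: validate a single record as it is deserialised.
    T decode(String raw) {
        T obj = parse.apply(raw);
        List<String> violations = rules.apply(obj);
        if (!violations.isEmpty()) {
            throw new IllegalArgumentException("DQ violations: " + violations);
        }
        return obj;
    }

    // Batch mode: apply the same checks across a whole stream of raw records.
    Stream<T> decodeAll(Stream<String> rawRecords) {
        return rawRecords.map(this::decode);
    }
}
```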