Data Quality Analyst
The Data Quality Analyst is your dedicated guardian of data integrity in investment operations. It catches bad data before it impacts NAV calculations, compliance, or decision-making, and excels at anomaly detection, data profiling, reconciliation, and quality trend analysis across your entire fund ecosystem.
The Data Quality Analyst is one of 12 specialized agents in the OpsHub multi-agent system. It works independently on data quality tasks and collaborates with other agents (Fund Accountant Assistant, Compliance Sentinel, Integration Specialist) to resolve issues.
Primary Capabilities
Data Profiling & Analysis
The Data Quality Analyst automatically profiles your holdings, transactions, and reference data to establish baselines and identify deviations:
- Distribution Analysis - Understand value, quantity, and market price distributions across holdings
- Completeness Checks - Identify missing required fields in positions, transactions, or corporate actions
- Field Validation - Verify data types, ranges, and format compliance (ISIN codes, dates, decimals)
- Trend Analysis - Monitor data quality metrics over time to detect emerging problems
- Data Lineage - Trace data sources and transformations to pinpoint quality issues at origin
Example prompts:
- “Profile the APAC holdings to identify outliers in valuation frequency”
- “Show me any positions with missing counterparty information”
- “Which securities have pricing gaps or stale data?”
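To make this concrete, here is a minimal sketch of a completeness check along the lines of the second prompt above, written in pandas. The schema (`isin`, `counterparty`, `market_value`) is an illustrative assumption, not the agent’s actual data model:

```python
import pandas as pd

# Hypothetical holdings extract; the schema and values are illustrative only.
holdings = pd.DataFrame({
    "isin": ["US0378331005", "GB0002634946", None],
    "counterparty": ["BNY Mellon", None, "HSBC"],
    "market_value": [1_200_000.0, 850_000.0, 430_000.0],
})

REQUIRED_FIELDS = ["isin", "counterparty"]

# Completeness: share of populated values per required field.
completeness = 1 - holdings[REQUIRED_FIELDS].isna().mean()
print(completeness)

# Positions with any missing required field, queued for exception review.
exceptions = holdings[holdings[REQUIRED_FIELDS].isna().any(axis=1)]
print(exceptions)
```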
Anomaly Detection & Exception Handling
Intelligent, configurable anomaly detection catches problems humans might miss:
- Outlier Detection - Identify holdings or transactions that deviate significantly from normal patterns
- Statistical Anomalies - Detect unusual price moves, volume spikes, or valuation changes
- Cross-Source Mismatches - Compare data from custodian, administrator, and external sources
- Threshold Monitoring - Configure custom alerts for values exceeding acceptable ranges
- Pattern Recognition - Learn normal operating patterns and alert on deviations
Example prompts:
- “Flag any holdings with price changes exceeding 10% since last valuation date”
- “Identify transactions that appear in the custodian feed but not in our system”
- “Show me securities with unusual trading volumes this week”
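The first prompt above reduces to a simple threshold rule. A minimal sketch, assuming a hypothetical price snapshot table:

```python
import pandas as pd

# Hypothetical price snapshots; column names are assumptions for illustration.
prices = pd.DataFrame({
    "isin": ["US0378331005", "JP3633400001", "HK0000069689"],
    "last_valuation_price": [182.50, 2450.00, 61.20],
    "current_price": [185.10, 2980.00, 60.90],
})

THRESHOLD = 0.10  # flag moves larger than 10% since the last valuation date

prices["pct_change"] = prices["current_price"] / prices["last_valuation_price"] - 1
flagged = prices[prices["pct_change"].abs() > THRESHOLD]
print(flagged[["isin", "pct_change"]])  # the ~21.6% move is flagged
```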
Reconciliation Assistance
The Data Quality Analyst streamlines the complex work of comparing multiple data sources:
- Three-Way Reconciliation - Compare custodian, administrator, and internal system data simultaneously
- Break Analysis - Identify which specific positions or fields are causing reconciliation breaks
- Root Cause Investigation - Suggest probable causes based on data patterns and timing
- Reconciliation Rules - Learn your firm’s reconciliation methodology and apply consistently
- Exception Tracking - Maintain history of known differences and explain why they’re expected
Example prompts:
- “Reconcile the NAV with the administrator report and explain any breaks”
- “Compare our cash position to the custodian statement - where’s the $2.3M difference?”
- “Which corporate actions might explain the quantity variance in this holding?”
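A rough sketch of the core of a three-way quantity reconciliation, assuming hypothetical per-source position tables:

```python
import pandas as pd

# Hypothetical position quantities from three sources; identifiers are illustrative.
custodian = pd.DataFrame({"isin": ["US0378331005", "GB0002634946", "JP3633400001"],
                          "qty_custodian": [100_000, 250_000, 80_000]})
administrator = pd.DataFrame({"isin": ["US0378331005", "GB0002634946", "JP3633400001"],
                              "qty_administrator": [100_000, 250_000, 75_000]})
internal = pd.DataFrame({"isin": ["US0378331005", "GB0002634946"],
                         "qty_internal": [100_000, 250_000]})

# Outer-join all three sources on the security identifier.
recon = (custodian
         .merge(administrator, on="isin", how="outer")
         .merge(internal, on="isin", how="outer"))

# A break is any position where the three quantities do not all agree;
# dropna=False makes a missing position count as a disagreement too.
qty_cols = ["qty_custodian", "qty_administrator", "qty_internal"]
recon["is_break"] = recon[qty_cols].nunique(axis=1, dropna=False) > 1
print(recon[recon["is_break"]])
```

Using an outer join means positions missing from any one source still surface as breaks rather than silently dropping out of the comparison.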
Quality Trend Analysis & Reporting
Transform raw data quality metrics into actionable insights:
- Trend Visualization - Track data quality metrics over weeks and months
- Root Cause Patterns - Identify systematic issues (specific custodian, asset class, data source)
- SLA Monitoring - Track completeness and accuracy against service level agreements
- Quality Dashboards - Create visual reports on data quality status by source and asset type
- Improvement Recommendations - Suggest process changes based on quality trends
Example prompts:
- “Show me the data quality trend for emerging market holdings over the last 3 months”
- “Which custodian has the highest error rate, and in which asset classes?”
- “Generate a quality report for the compliance team showing improvement trends”
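As an illustration of trend tracking, this sketch resamples a simulated daily completeness rate to weekly means and flags weeks below an SLA floor; the data and the 98% floor are invented for the example:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=7)

# Simulated daily completeness rate for one data source over a quarter.
daily = pd.Series(
    0.995 - np.linspace(0.0, 0.03, 90) + rng.normal(0.0, 0.004, 90),
    index=pd.date_range("2024-01-01", periods=90, freq="D"),
    name="completeness_rate",
)

# Weekly means smooth daily noise and expose the deteriorating trend.
weekly = daily.resample("W").mean()

# Flag weeks that fall below an SLA-style floor of 98% completeness.
SLA_FLOOR = 0.98
print(weekly[weekly < SLA_FLOOR])
```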
When to Use the Data Quality Analyst
The Data Quality Analyst is your expert for:
- Data Validation - Verify data completeness, accuracy, and format compliance across sources
- Break Investigation - Investigate reconciliation breaks between multiple sources
- Anomaly Alerts - Configure and monitor for unusual values or patterns
- Quality Trends - Track data quality metrics and identify systemic issues
How It Works
1. Data Quality Assessment Request
You ask the agent to evaluate data quality across specific dimensions.
2. Intelligent Data Profiling
The agent executes data profiling tools to understand baseline characteristics (a sketch follows the list):
- Queries the holdings, transactions, and reference data schemas
- Calculates distributions, quartiles, and statistical measures
- Identifies missing or invalid data
- Compares against expected formats and ranges
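For the distribution step, the baseline statistics might be as simple as a quantile profile; the values here are illustrative:

```python
import pandas as pd

# Illustrative market values; in practice these come from the holdings schema.
market_values = pd.Series(
    [1.2e6, 8.5e5, 4.3e5, 2.1e6, 9.9e5, 7.7e6, 6.4e5],
    name="market_value",
)

# Baseline profile: count, mean, spread, and quartiles for later deviation checks.
print(market_values.describe(percentiles=[0.25, 0.5, 0.75]))
```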
3. Anomaly Detection
Configurable algorithms identify deviations from normal patterns (a sketch follows the list):
- Statistical measures (z-scores, IQR) for numerical fields
- Pattern recognition for categorical data
- Time-series analysis for trend deviations
- Cross-source consistency checks
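A minimal sketch of the two statistical tests named above, on an invented series of daily price moves; the cutoffs shown are examples, since in practice they come from your configured thresholds:

```python
import pandas as pd

# Illustrative daily price moves (%); real inputs would come from the price history.
pct_moves = pd.Series([0.2, -0.4, 0.1, 0.3, 12.5, -0.2, 0.5])

# Z-score test: distance from the mean in standard deviations (cutoff configurable).
z = (pct_moves - pct_moves.mean()) / pct_moves.std()
z_outliers = pct_moves[z.abs() > 2.0]

# IQR test: values outside 1.5x the interquartile range are a common fence.
q1, q3 = pct_moves.quantile([0.25, 0.75])
iqr = q3 - q1
iqr_outliers = pct_moves[(pct_moves < q1 - 1.5 * iqr) | (pct_moves > q3 + 1.5 * iqr)]

print(z_outliers)    # flags the 12.5% move
print(iqr_outliers)  # flags the same move via the IQR fence
```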
4. Root Cause Analysis
For detected anomalies, the agent investigates probable causes:
- Timing correlation (did this coincide with a system change or data import?)
- Source analysis (which system introduced this data?)
- Context review (is this a legitimate exception like a corporate action?)
- Comparison to historical patterns
5. Recommended Actions
The agent proposes remediation steps (an example validation rule follows the list):
- Data Corrections - For confirmed data errors
- Process Changes - For systematic quality issues (e.g., “implement validation rule for ISIN format”)
- Alert Configuration - For patterns that should trigger future monitoring
- Escalation - To Integration Specialist for source system issues
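As an example of the ISIN validation rule mentioned above, here is a small sketch based on the public ISIN structure (a two-letter country code, nine alphanumeric characters, and a Luhn check digit); it is a generic implementation, not the agent’s built-in rule:

```python
import re

def is_valid_isin(isin: str) -> bool:
    """Format check plus Luhn check-digit validation for an ISIN."""
    if not re.fullmatch(r"[A-Z]{2}[A-Z0-9]{9}[0-9]", isin):
        return False
    # Expand letters to their numeric values (A=10 .. Z=35), keep digits as-is.
    expanded = [int(d) for d in "".join(str(int(ch, 36)) for ch in isin)]
    # Luhn: from the right, double every second digit; subtract 9 if it exceeds 9.
    total = 0
    for i, d in enumerate(reversed(expanded)):
        if i % 2 == 1:
            d = d * 2 - 9 if d * 2 > 9 else d * 2
        total += d
    return total % 10 == 0

print(is_valid_isin("US0378331005"))  # True  (valid check digit)
print(is_valid_isin("US0378331006"))  # False (corrupted check digit)
```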
6. Audit Trail
All quality assessments are logged for compliance (an example record follows the list):
- Analysis parameters and thresholds applied
- Findings and root causes documented
- Recommended actions with rationale
- Audit trail for regulatory review
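The logged record might take a shape like the following sketch; every field name here is a hypothetical illustration, not the product’s actual audit schema:

```python
import json
from datetime import datetime, timezone

# Hypothetical shape of a logged assessment; all field names are assumptions.
audit_record = {
    "assessment_id": "dq-20240614-001",
    "run_at": datetime.now(timezone.utc).isoformat(),
    "scope": {"dataset": "holdings", "region": "APAC"},
    "parameters": {"price_change_threshold_pct": 10.0, "z_score_cutoff": 2.0},
    "findings": [
        {"isin": "US0378331005", "issue": "stale_price", "age_days": 8},
    ],
    "root_cause": "custodian file import failed (timeout)",
    "recommended_actions": ["rerun custodian import", "add automated retry logic"],
}
print(json.dumps(audit_record, indent=2))
```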
Tools at Your Service
The Data Quality Analyst leverages these enterprise tools:
Data Profiling Tools
- Statistical analysis and distribution calculations
- Completeness and format validation
- Data lineage and source tracking
- Baseline establishment and deviation detection
Anomaly Detection Tools
- Real-time outlier detection algorithms
- Configurable sensitivity thresholds
- Pattern recognition and learning
- Cross-source mismatch identification
Reconciliation Tools
- Three-way reconciliation execution
- Break analysis and reporting
- Tolerance threshold configuration
- Exception tracking and history
Database Query Tools
- Direct access to holdings, transactions, and reference schemas
- Performance-optimized queries
- Multi-asset-class support
- Temporal analysis (compare across dates)
Reporting Tools
- Quality dashboard generation
- PDF report export
- Trend visualization
- Executive summary creation
Real-World Use Case: Detecting a Pricing Error
Scenario
A global fund manager’s data quality process typically runs weekly. This week, the Data Quality Analyst detects something unusual in the APAC equity holdings.
What Happens
1. Profiling Phase
- Agent profiles all APAC holdings by valuation frequency, pricing source, and price recency
- Identifies 47 positions with normal characteristics
- Flags 2 positions (Chinese tech stocks) with stale pricing (no update in 8 days)
2. Anomaly Detection
- Compares against historical norms: these securities typically update daily
- Calculates the potential NAV impact: ~$2.3M valuation difference if stale prices are used
- Checks custodian data: prices ARE available but weren’t imported
3. Root Cause Analysis
- Investigates the data pipeline: discovers the file import from the Asia custodian failed yesterday due to a timeout
- Checks the administrator feed: also missing these prices
- Identifies timing: system downtime correlates with the data import failure
4. Recommended Actions
- Data Correction: “Rerun the custodian import process to fetch current prices”
- System Alert: “Configure monitoring to alert if >24 hours elapse without a price update for these securities”
- Process Enhancement: “Add automated retry logic for failed custodian imports”
5. Audit Documentation
- Full analysis logged with findings, evidence, and root cause
- Recommended actions with implementation rationale
- Impact assessment: prevented a potential $2.3M pricing error
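For intuition, here is a rough sketch of the stale-price check and impact estimate at the heart of this scenario; the positions, prices, and resulting total are invented and do not reproduce the scenario’s actual figures:

```python
import pandas as pd

TODAY = pd.Timestamp("2024-06-14")

# Invented APAC positions; schema and figures are illustrative assumptions.
positions = pd.DataFrame({
    "isin": ["CNE100000296", "CNE1000002H1", "HK0000069689"],
    "last_price_date": pd.to_datetime(["2024-06-06", "2024-06-06", "2024-06-13"]),
    "stale_price": [42.10, 310.00, 61.20],
    "custodian_price": [44.95, 355.00, 61.20],
    "quantity": [250_000, 40_000, 100_000],
})

# Flag positions whose price is older than the expected daily update cadence.
positions["age_days"] = (TODAY - positions["last_price_date"]).dt.days
stale = positions[positions["age_days"] > 1].copy()

# Estimate the NAV impact of valuing the flagged positions at stale prices.
stale["impact"] = (stale["custodian_price"] - stale["stale_price"]) * stale["quantity"]
print(stale[["isin", "age_days", "impact"]])
print(f"Total potential impact: {stale['impact'].sum():,.0f}")
```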
Result
What might have become a NAV break investigation is caught proactively. The fund accountant can correct the issue before the morning NAV run, maintaining data integrity and regulatory compliance.
Integration with Other Agents
The Data Quality Analyst works as part of the broader agent ecosystem:
With Fund Accountant Assistant:
- Data Quality Analyst flags quality issues
- Fund Accountant confirms whether issues impact the NAV calculation
- Escalates breaks requiring investigation
With Compliance Sentinel:
- Data Quality Analyst detects anomalies
- Compliance Sentinel assesses regulatory impact
- Documents findings for the audit trail
With Integration Specialist:
- Data Quality Analyst identifies source system issues
- Integration Specialist investigates and fixes data pipelines
- Implements monitoring to prevent recurrence
With Workbook Engineer:
- Data Quality Analyst identifies data requiring correction
- Workbook Engineer creates spreadsheets for exception handling
- Fund Accountant reviews and approves corrections
Key Differentiators
Proactive vs. Reactive
Traditional data quality approaches catch problems during month-end reconciliation. The Data Quality Analyst identifies issues in real time, before they propagate.
Multi-Source Intelligence
Rather than checking one source at a time, the agent compares custodian, administrator, and internal data simultaneously to identify discrepancies and contradictions.
Learning from Your Patterns
The agent learns what’s normal for your fund, your custodians, and your asset classes. This means fewer false alarms and faster detection of real issues.
Explainable Findings
Every anomaly comes with documented reasoning: “This price is 12% above its historical range because of X event” rather than just “price is unusual”.
Getting Started with Data Quality Analysis
1. Ask for Data Profiling - Start with a broad profile: “Profile the portfolio holdings and identify any data quality issues”
2. Review Findings - The agent provides a summary of what’s normal, what’s unusual, and why
3. Set Up Monitoring - Configure ongoing monitoring for the data quality metrics that matter most to your operations
4. Integrate with Workflows - Add quality checks to your NAV validation and reconciliation workflows
5. Continuous Improvement - Use quality trend reports to identify and fix systemic issues at the source
Configuration & Customization
The Data Quality Analyst can be configured to match your firm’s standards (an example configuration follows the list):
- Tolerance Thresholds - Define what constitutes an “anomaly” for your assets (e.g., price variance %)
- Monitoring Schedules - Run quality checks daily, weekly, or in real time
- Alert Destinations - Route quality issues to specific teams (operations, compliance, IT)
- Data Sources - Configure which sources to compare and reconcile
- Custom Rules - Add firm-specific validation rules (e.g., “corporate action processing SLAs”)
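Put together, these options might be captured in a configuration object along the following lines; the keys and values are a hypothetical sketch, not the product’s actual configuration schema:

```python
# Hypothetical configuration shape; keys and values are illustrative assumptions.
DATA_QUALITY_CONFIG = {
    "tolerance_thresholds": {
        "equity_price_variance_pct": 10.0,
        "fx_rate_variance_pct": 2.0,
    },
    "monitoring_schedule": "daily",  # alternatives: "weekly", "realtime"
    "alert_destinations": {
        "pricing_issues": "operations",
        "reconciliation_breaks": "compliance",
        "pipeline_failures": "it",
    },
    "data_sources": ["custodian_feed", "administrator_feed", "internal_books"],
    "custom_rules": [
        {"name": "isin_format", "severity": "error"},
        {"name": "corporate_action_sla_hours", "value": 24},
    ],
}
```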
Best Practices
- Profile Regularly - Run baseline profiling weekly to establish normal patterns before anomalies occur
- Escalate Systematically - Establish clear protocols for how anomalies should be investigated and resolved
- Learn from History - Use trend analysis to identify root causes and prevent recurring issues
- Integrate with Workflows - Add quality checks to NAV validation, reconciliation, and reporting processes
FAQs
How does the agent decide what's an anomaly?
The agent uses statistical methods (z-scores, interquartile ranges) combined with your firm’s configured thresholds. For example, you might define “any price change >10% since last update” or “missing data in mandatory fields”. The agent learns from your feedback about what constitutes a real issue vs. expected variation.
Can it handle multiple custodians?
Absolutely. The agent can simultaneously reconcile data from multiple custodians, administrators, and internal systems. It identifies which source has the correct data and why others might differ (timing, data lag, FX conversion, etc.).
How does it know what's 'normal' for my fund?
The agent establishes baselines by analyzing historical data patterns. It learns that your APAC holdings typically have daily price updates but your private equity holdings might have monthly valuation. Over time, it becomes increasingly accurate at identifying true anomalies vs. expected variation.
What happens when it detects an error?
The agent proposes corrective actions (rerun import, correct data, configure alert) but requires human approval before executing. High-risk actions go through the draft system where you review and approve the proposed changes.
Can it reconcile with external reports?
Yes. If you provide an administrator report or custodian statement, the agent can reconcile it line-by-line with your system, identifying breaks and suggesting explanations (timing differences, corporate actions, FX impact, etc.).
How is quality analysis audited?
Every analysis is logged with full context: which data was examined, what parameters were used, what was found, and what actions were recommended or taken. This creates an audit trail suitable for compliance review.
Next Steps
Ready to improve your data quality?
Explore Other Agents
Learn how Data Quality Analyst works with other specialized agents
View All Tools
See the complete 62+ tool suite available to all agents
Questions? The Data Quality Analyst is continuously learning and improving. Provide feedback through the agent console or contact support@opshub.ai for questions about configuring data quality monitoring for your specific fund structures and requirements.