Data Team Lead
Who We Are Looking For
Our Data Team Lead position requires proficiency in operational excellence, technical leadership, and data enablement. The individual in this role will oversee, manage, and evolve our established data infrastructure, spanning OLTP systems, a centralized Operational Data Store (ODS), a Business Intelligence/Data Warehouse environment, and a growing portfolio of data migration initiatives. This is a high-impact leadership role responsible for ensuring data consistency, reliability, and accessibility across our platform, supporting both daily operations and strategic insights.
What It Takes (some or all)
- 10+ years of experience in data engineering, ETL development, or database administration
- 3+ years of experience in a data leadership or team management role
- Strong knowledge of SQL Server (or equivalent RDBMS) across OLTP and analytical workloads
- Deep experience with ETL/ELT tools (e.g., SSIS, Azure Data Factory, dbt, or custom pipelines)
- Proven success managing complex data migration projects involving legacy systems
- Proficiency in data modeling, schema management, and performance optimization
- Hands-on experience with reporting platforms (e.g., Power BI, SSRS, Tableau)
- Excellent communication, leadership, and cross-functional collaboration skills
- Experience with or strategic understanding of AI-powered tools for data classification, cleansing, forecasting, or migration assistance
- Background in industries with regulated reporting (e.g., insurance, finance)
- Familiarity with observability tools (e.g., Grafana, OpenTelemetry, Prometheus)
- Exposure to data cataloging and governance platforms
- Experience with cloud-based data solutions (e.g., Azure, AWS, GCP)
What YOU Will Be Doing
- Operational Leadership
- Lead day-to-day management of OLTP databases, ODS systems, and the Data Warehouse.
- Oversee and support production data pipelines — batch and near real-time — with built-in monitoring, auditing, and alerting.
- Ensure performance, uptime, and traceability across the data stack.
- Data Migration Oversight
- Lead efforts to automate data migration workflows from legacy systems, minimizing manual intervention and improving reliability.
- Define repeatable, scalable processes and tools for validating, transforming, and importing data.
- Coordinate across internal teams and customer stakeholders to ensure timely and accurate migration outcomes.
- Establish data quality checks and reconciliation methods post-migration.
- Optimize systems to handle high-throughput data loads.
- Evolve & Optimize
- Continuously improve pipeline efficiency, model flexibility, and reporting responsiveness.
- Identify architectural improvements to support new data use cases, KPIs, and regulatory changes.
- Own change management across interconnected systems with minimal disruption.
- Configure a natural language query engine to enhance accessibility and usability of enterprise data for both technical and non-technical users.
- Adopt and integrate AI-driven tools for improving data quality, forecasting trends, anomaly detection, and automating operational tasks across the data stack.
- Implement a real-time observability framework to monitor data pipeline health, system performance, and data quality across OLTP, ODS, and DW layers. Enable fast detection and response to issues using automated alerts and stakeholder-facing dashboards.
- Lead a High-Performing Data Team
- Manage and mentor a cross-functional team including DBAs, ETL developers, reporting specialists, and data quality engineers.
- Promote skill development and cross-training.
- Coordinate with engineering, implementation, QA, and product teams to support integrated initiatives.
- Business Intelligence & Reporting Support
- Maintain and extend the Data Warehouse and reporting layer to deliver high-quality dashboards and reports.
- Ensure consistency of dimensions, metrics, and business logic across all reporting tools.
- Collaborate with stakeholders to address evolving analytics needs.
- Data Governance & Compliance
- Implement policies for secure data access, retention, masking, and compliance.
- Maintain end-to-end data lineage and metadata documentation.
- Support internal and external audits with timely and accurate information.
Who WE Are
Finys is a leading producer of packaged software for the Property Casualty (P&C) insurance sector. With a deep understanding of industry intricacies and cutting-edge technology, we have crafted the Finys Suite—an adaptable enterprise platform for policy administration, claims, billing, business intelligence, and mobile access. Our solution serves dozens of carriers, reducing operational costs and accelerating time to market, while fostering seamless collaboration between carriers, their agents, vendors, and insureds. Located in Troy, Michigan, our talented, U.S.-based team is dedicated to your success.
What WE Offer YOU
At Finys, we offer an outstanding work environment, in which great people work with great technology. We also offer a competitive compensation package with generous benefits including health, vision, dental, life, paid vacation, paid holidays, matching retirement plan, flex time, and bonus opportunities.
Join our rapidly growing software company and be part of a team dedicated to transforming the P&C insurance industry through innovation and excellence.
Apply now and take the next step in your career!