We are a fast-growing, nationwide personal injury law firm seeking an experienced Data Engineer to design and build world-class data infrastructure.
Job Description
We operate on Salesforce, AWS, Amazon Connect, and various other platforms, generating vast amounts of valuable data. Our ideal candidate will architect, optimize, and scale our data ecosystem so our analysts and data scientists can focus on insights instead of wrangling data.
Key Responsibilities
* Data Pipeline Development: Build and optimize ETL/ELT pipelines that move Salesforce and other source data through BigQuery and AWS services (S3, Redshift, Glue, Lambda).
* Data Warehouse Management: Design and manage scalable data warehouses and lakes that support both real-time and batch processing.
* Collaboration and Communication: Work closely with analysts and stakeholders to ensure clean, structured, and accessible data.
* Data Governance and Security: Implement best practices in data governance, security, and performance optimization.
* Automation and Integration: Automate data workflows to minimize manual effort and enhance data reliability. Develop API integrations to centralize data from CRM, marketing, finance, and telephony systems.
Requirements
* At least 5 years of data engineering experience in a fast-paced environment.
* Bachelor's degree in Information Systems, Analytics, Mathematics, Computer Science, Engineering, or a related field.
* Expertise in BigQuery, AWS (S3, Redshift, Glue, Lambda), and SQL-based data pipelines.
* Strong experience working with Salesforce data architecture.
* Proficiency in Python and SQL for data transformation and automation.
* Ability to roadmap, design, and build systems from scratch, not just maintain existing ones.
* Experience working with BI tools (e.g., Looker Studio, Tableau, Power BI) is a plus.