Shulman Fleming & Partners

Integration Data Engineer

Shulman Fleming & Partners - Iselin, NJ, United States

MUST be local to Iselin, NJ; hybrid schedule with at least 3 days onsite

Salary Range: $77k to $135k

No Sponsorship Available

We are seeking an Integration Data Engineer with a background in SQL and data warehousing for enterprise-level systems. The ideal candidate is comfortable working directly with business users and brings business analyst expertise.

Major Responsibilities:

  • Design, develop, and deploy Databricks jobs to process and analyze large volumes of data.
  • Collaborate with data engineers and data scientists to understand data requirements and implement appropriate data processing pipelines.
  • Optimize Databricks jobs for performance and scalability to handle big data workloads.
  • Monitor and troubleshoot Databricks jobs, identify and resolve issues or bottlenecks.
  • Implement best practices for data management, security, and governance within the Databricks environment.
  • Design and develop Enterprise Data Warehouse solutions.
  • Demonstrated proficiency with data analytics and data insights.
  • Proficient in writing SQL queries and programming, including stored procedures and reverse engineering existing processes.
  • Leverage SQL, a programming language (Python or similar), and/or ETL tools (Azure Data Factory, Databricks, Talend, SnowSQL) to develop data pipeline solutions that ingest and exploit new and existing data sources.
  • Perform code reviews to ensure fit to requirements, optimal execution patterns and adherence to established standards.

Skills:

  • 5+ years - Enterprise Data Management
  • 5+ years - SQL Server-based development of large datasets
  • 5+ years - Data warehouse architecture, including hands-on experience with the Databricks platform and extensive PySpark coding; Snowflake experience is a plus
  • 3+ years of Python (NumPy, pandas) coding experience
  • 3+ years’ experience in the finance/banking industry, with some understanding of securities and banking products and their data footprints
  • Experience with Snowflake utilities such as SnowSQL and Snowpipe is a plus
  • Experience in data warehousing: OLTP, OLAP, dimensions, facts, and data modeling
  • Previous experience leading an enterprise-wide Cloud Data Platform migration with strong architectural and design skills
  • Capable of discussing enterprise level services independent of technology stack
  • Experience with Cloud based data architectures, messaging, and analytics
  • Superior communication skills
  • Cloud certification(s) preferred
  • Any experience with regulatory reporting is a plus

Education:

  • At minimum, a bachelor’s degree in an engineering or computer science discipline
  • Master’s degree strongly preferred


Posted On: Thursday, June 5, 2025


