Senior Python Developer

Job ID: 08684
Location: NJ [Hybrid]
Employment Type: Contract


Senior Python Developer with Databricks and Kafka (Capital Markets)

Iselin, NJ – 3 days/week onsite.

We are seeking a skilled Python Developer with hands-on experience in Databricks and Kafka to join our client's technology team. The ideal candidate will design, develop, and optimize large-scale data processing pipelines and real-time data streaming solutions to support trading, risk, and compliance functions. You will collaborate with business stakeholders and data teams to deliver high-performance data solutions in a fast-paced financial environment.

Responsibilities:

  • Develop, test, and maintain scalable ETL/ELT data pipelines using Python, PySpark, and Databricks on cloud platforms.
  • Build and manage real-time data streaming solutions with Kafka to support low-latency data feeds.
  • Collaborate with quantitative analysts, traders, and risk managers to understand data requirements and deliver effective solutions.
  • Optimize existing data workflows for performance, reliability, and efficiency.
  • Implement data quality checks and monitoring mechanisms.
  • Participate in code reviews, documentation, and knowledge sharing within the team.
  • Ensure compliance with financial data governance and security standards.
  • Stay updated with emerging technologies and propose innovative solutions for data processing challenges.

Required Skills & Qualifications:

  • 8+ years of experience in Python development.
  • Strong experience with Databricks platform and cloud-based data engineering.
  • Proven expertise in Kafka for building scalable, real-time streaming applications.
  • Knowledge of relational and NoSQL databases (e.g., SQL, Cassandra, MongoDB).
  • Familiarity with investment banking processes, trading systems, risk management, or financial data workflows.
  • Good understanding of distributed computing concepts and the big data ecosystem.
  • Experience with version control systems (e.g., Git) and Agile development methodologies.
  • Excellent problem-solving skills, attention to detail, and ability to work under tight deadlines.

Preferred Qualifications:

  • Experience with other big data tools such as Hadoop, Spark SQL, or Flink.
  • Knowledge of financial data standards and regulations.
  • Certification in Cloud platforms (AWS, Azure, GCP).
  • Previous experience working in a regulated financial environment.