Big Data Developer (Spark/Scala + Python) @ Synechron

Job Description

Job Summary

Synechron is seeking an experienced Big Data Developer with strong expertise in Spark, Scala, and Python to lead and contribute to large-scale data projects. The role involves designing, developing, and implementing robust data solutions that leverage emerging technologies to enhance business insights and operational efficiency. The successful candidate will play a key role in driving innovation, mentoring team members, and ensuring the delivery of high-quality data products aligned with organizational objectives.

Software Requirements
  • Required:
    • Apache Spark (latest stable version)
    • Scala (version 2.12 or higher)
    • Python (version 3.6 or higher)
    • Big Data tools and frameworks supporting Spark and Scala (see the minimal build sketch after this list)
  • Preferred:
    • Cloud platforms such as AWS, Azure, or GCP for data deployment
    • Data processing or orchestration tools like Kafka, Hadoop, or Airflow
    • Data visualization tools for presenting insights
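
As a rough illustration of the required stack above, a minimal sbt build for a Spark/Scala project might look like the sketch below; the Spark version and artifact list are assumptions made for the sketch, not values specified in this posting.

    // build.sbt -- minimal sketch; version numbers are illustrative assumptions
    ThisBuild / scalaVersion := "2.12.18"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-sql"       % "3.5.1" % Provided,  // Spark SQL / DataFrames
      "org.apache.spark" %% "spark-streaming" % "3.5.1" % Provided,  // Spark Streaming
      "org.apache.spark" %% "spark-mllib"     % "3.5.1" % Provided   // Spark MLlib
    )
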
Overall Responsibilities
  • Lead the development and implementation of data pipelines and solutions using Spark, Scala, and Python
  • Collaborate with business and technology teams to understand data requirements and translate them into scalable solutions
  • Mentor and guide junior team members on best practices in big data development
  • Evaluate and recommend new technologies and tools to improve data processing and quality
  • Stay informed about industry trends and emerging technologies relevant to big data and analytics
  • Ensure timely delivery of data projects with high standards of quality, performance, and security
  • Lead technical reviews, code reviews, and provide inputs to improve overall development standards and practices
  • Contribute to architecture design discussions and assist in establishing data governance standards
Technical Skills (By Category)

Programming Languages:

  • Essential: Spark (Scala), Python
  • Preferred: Knowledge of Java or other JVM languages

Data Management & Databases:

  • Experience with distributed data storage solutions (HDFS, S3, etc.)
  • Familiarity with NoSQL databases (e.g., Cassandra, HBase) and relational databases for data integration

Cloud Technologies:

  • Preferred: Cloud platforms (AWS, Azure, GCP) for data processing, storage, and deployment

Frameworks & Libraries:

  • Spark MLlib, Spark SQL, Spark Streaming
  • Data processing libraries in Python (pandas, PySpark)
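
As a small illustration of the Spark SQL work these libraries imply, the Scala sketch below reads a Parquet dataset and produces a daily aggregate; the dataset path, column names, and job name are hypothetical placeholders rather than details from this posting.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object DailyRevenueJob {
      def main(args: Array[String]): Unit = {
        // Hypothetical job name; a real pipeline would also configure cluster
        // settings such as shuffle partitions to suit its environment.
        val spark = SparkSession.builder().appName("daily-revenue").getOrCreate()

        // Hypothetical input: an orders dataset with order_date and amount columns.
        val orders = spark.read.parquet("s3a://example-bucket/orders/")

        // Aggregate revenue per day using Spark SQL functions.
        val daily = orders
          .groupBy(col("order_date"))
          .agg(sum(col("amount")).as("total_revenue"))

        // Hypothetical output location.
        daily.write.mode("overwrite").parquet("s3a://example-bucket/reports/daily_revenue/")

        spark.stop()
      }
    }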

Development Tools & Methodologies:

  • Version control (Git, Bitbucket)
  • Agile methodologies (Scrum, Kanban)
  • Data pipeline orchestration tools (Apache Airflow, NiFi)

Security & Compliance:

  • Understanding of data security best practices and data privacy regulations
Experience Requirements
  • 5 to 10 years of hands-on experience in big data development and architecture
  • Proven experience in designing and developing large-scale data pipelines using Spark, Scala, and Python
  • Demonstrated ability to lead technical projects and mentor team members
  • Experience working with cross-functional teams including data analysts, data scientists, and business stakeholders
  • Track record of delivering scalable, efficient, and secure data solutions in complex environments
Day-to-Day Activities
  • Develop, test, and optimize scalable data pipelines using Spark, Scala, and Python (a streaming pipeline sketch follows this list)
  • Collaborate with data engineers, analysts, and stakeholders to gather requirements and translate them into technical solutions
  • Lead code reviews, mentor junior team members, and enforce coding standards
  • Participate in architecture design and recommend best practices in big data development
  • Monitor data workflow performance and troubleshoot issues to ensure data quality and reliability
  • Stay updated with industry trends and evaluate new tools and frameworks for potential implementation
  • Document technical designs, data flows, and implementation procedures
  • Contribute to continuous improvement initiatives to optimize data processing workflows
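
One concrete shape the pipeline-development and monitoring activities above can take is a Spark Structured Streaming job. The Scala sketch below assumes a Kafka source (the broker address, topic, and window size are placeholders) and the spark-sql-kafka connector on the classpath; it is illustrative only, not a prescribed design.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object ClickstreamMonitor {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("clickstream-monitor").getOrCreate()

        // Hypothetical Kafka source; broker and topic names are placeholders.
        val events = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "clickstream")
          .load()

        // Count events per one-minute window using the Kafka record timestamp,
        // as a simple throughput and data-quality signal.
        val counts = events
          .groupBy(window(col("timestamp"), "1 minute"))
          .count()

        // Stream running counts to the console for monitoring; a production job
        // would typically write to a durable sink and alert on anomalies.
        val query = counts.writeStream
          .outputMode("complete")
          .format("console")
          .start()

        query.awaitTermination()
      }
    }
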
Qualifications
  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
  • Relevant certifications in cloud platforms, big data, or programming languages are advantageous
  • Commitment to continuous learning in innovative data technologies and frameworks
Professional Competencies
  • Strong analytical and problem-solving skills with a focus on scalable data solutions
  • Leadership qualities with the ability to guide and mentor team members
  • Excellent communication skills to articulate technical concepts to diverse audiences
  • Ability to work collaboratively in cross-functional teams and fast-paced environments
  • Adaptability to evolving technologies and industry trends
  • Strong organizational skills for managing multiple projects and priorities

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Big Data Engineer
Employment Type: Full time

Contact Details:

Company: Synechron
Location(s): Hyderabad

Keyskills: big data administration, scala, spark, hadoop, python, pyspark, bitbucket, spark streaming, java, git, apache, gcp, kanban, spark mllib, big data, hbase, jvm, airflow, microsoft azure, cloud platforms, nosql, pandas, apache nifi, cassandra, kafka, scrum, agile, aws

Salary: Not Disclosed

Synechron

Headquartered in New York and with 18 offices around the world, Synechron is helping global financial services and insurance companies embrace the most cutting-edge innovations to evolve their businesses. Synechron uniquely delivers these firms an end-to-end Digital, Consulting and Technology capabi...