Design and implement tailored data solutions to meet customer needs and use cases, spanning streaming, data lakes, analytics, and beyond, within a dynamically evolving technical stack.
Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
Demonstrate proficiency in coding skills, utilizing languages such as Python, Java, and Scala to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations.
Collaborate seamlessly across diverse technical stacks, including Cloudera, Databricks, Snowflake, and AWS.
Develop and deliver detailed presentations to effectively communicate complex technical concepts.
Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc.
Adhere to Agile practices throughout the solution development process.
Design, build, and deploy databases and data stores to support organizational requirements.
Undergraduate or graduate degree preferred.
Basic Qualifications (Minimum Skills Required):
2+ years of experience supporting Network Protocols Engineering, Data Engineering, or Data Analytics projects.
- Proficiency in configuring and using code conversion accelerators (tools or frameworks that translate code from one programming language to another) is crucial.
- Strong knowledge of and experience with the Python programming language is essential.
- A solid foundation in software development principles and methodologies is important, including the software development life cycle (SDLC), version control systems, testing, and debugging techniques.
- Strong problem-solving and analytical skills, with the ability to identify and resolve complex issues related to code conversion and application development.
- Demonstrate proficiency in coding, utilizing tools and languages such as LeapLogic, Java, and Scala to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations.
- Collaborate seamlessly across diverse technical stacks, including Azure, Databricks, Snowflake, and AWS.
Preferred Skills:
Demonstrate production experience in core data platforms such as Snowflake, Databricks, AWS, Azure, GCP, and Hadoop.
Possess hands-on knowledge of Cloud and Distributed Data Storage, including expertise in HDFS, S3, ADLS, GCS, Kudu, ElasticSearch/Solr, Cassandra, or other NoSQL storage systems.
Exhibit a strong understanding of data integration technologies, encompassing Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Database Migration Service, Azure Data Factory, and Google Dataproc.
Showcase professional written and verbal communication skills to effectively convey complex technical concepts.