
Spark Scala Engineer | Engineering Job in Leeds, W Yorks | 7313277618

This listing was posted on iSmartRecruit.

Spark Scala Engineer

Location:
Leeds, W Yorks
Description:

JOB ADVERT FOR IJP

Exciting long-term opportunity to work on cutting-edge technology in one of Wipro's top, fastest-growing accounts, which has over 1,900 associates working across India, the UK, China, Hong Kong and Mexico. HSBC is one of the biggest financial services organizations in the world, with operations in more than 38 countries. It has an IT infrastructure of 200,000+ servers, 20,000+ database instances, and over 150 PB of data.

As a Spark Scala Engineer, you will be responsible for refactoring legacy ETL code (for example, DataStage) into PySpark using Prophecy low-code/no-code tooling and the available converters, and for fixing converted code that is causing failures and performance issues. The HSBC account is looking for an enthusiastic Spark Scala Engineer who will be responsible for designing, building, and maintaining data pipelines using Apache Spark and Scala. This includes tasks like:
• Extracting data from various sources (databases, APIs, files)
• Transforming and cleaning the data
• Loading the data into data warehouses or data lakes (e.g., BigQuery, Amazon Redshift)
• Automating the data pipeline execution using scheduling tools (e.g., Airflow)
A minimal sketch of this kind of pipeline is shown at the end of this description.
• Work with Big Data technologies: you'll likely work with various Big Data technologies alongside Spark, including:
  o Hadoop Distributed File System (HDFS) for storing large datasets
  o Apache Kafka for real-time data streaming
  o Apache Hive for data warehousing on top of HDFS
  o Cloud platforms like AWS, Azure, or GCP for deploying and managing your data pipelines
• Data analysis and modeling: while the primary focus is on data engineering, the role may require basic data analysis skills, such as writing analytical queries using SQL or Spark SQL to analyze processed data, and building simple data models to understand data relationships.

Your benefits
As the Spark Scala Engineer, you will have the opportunity to work with one of the biggest IT landscapes in the world. You can also look forward to being mentored and guided in your career journey by some of the finest in the business.

Your responsibilities
As a Spark Scala Engineer working for the HSBC GDT (Global Data Technology) team, you will be responsible for:
• Designing, building, and maintaining data pipelines using Apache Spark and Scala
• Working on enterprise-scale cloud infrastructure and cloud services in one of the clouds (GCP)

Mandatory Skills
You need to have the below skills:
• 8+ years of IT experience with designing, building, and maintaining data pipelines
• 4+ years of experience with designing, building, and maintaining data pipelines using Apache Spark and Scala
• Programming languages: proficiency in Scala and Spark is essential; familiarity with Python and SQL is often a plus
• Big Data technologies: understanding of HDFS, Kafka, Hive, and cloud platforms is valuable
• Data engineering concepts: knowledge of data warehousing, data pipelines, data modeling, and data cleansing techniques is crucial
• Problem-solving and analytical skills: you should be able to analyze complex data problems, design efficient solutions, and troubleshoot issues
• Communication and collaboration: the ability to communicate effectively with data scientists, analysts, and business stakeholders is essential
• Ready to work at least three days from the HSBC Leeds (UK) office and to accept changes as per customer/Wipro policies
• Able to walk through and explain the system designs and file format usages you have been a part of, and why any tool or technology was used

Good to have skills
Ideally, you should be familiar with:
• Machine learning libraries: familiarity with Spark ML or other machine learning libraries in Scala can be advantageous
• Cloud computing experience: experience with cloud platforms like AWS, Azure, or GCP for data pipeline deployment is a plus
• DevOps tools: knowledge of DevOps tools like Git, CI/CD pipelines, and containerization tools (Docker, Kubernetes) can be beneficial
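For illustration, here is a minimal sketch of the kind of batch ETL pipeline and Spark SQL query described above, written in Scala against the Spark DataFrame API. The source system, table names, file paths, and credentials are hypothetical placeholders, not details from this advert.

// Minimal Spark/Scala ETL sketch: extract from a JDBC source, clean, load to a
// data lake, and run a simple analytical query. All names/paths are hypothetical.
import org.apache.spark.sql.{SparkSession, functions => F}

object TransactionsEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("transactions-etl")
      .getOrCreate()

    // Extract: read raw records from a source database over JDBC (hypothetical source).
    val raw = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://source-db:5432/payments")
      .option("dbtable", "public.transactions")
      .option("user", sys.env.getOrElse("DB_USER", "etl_user"))
      .option("password", sys.env.getOrElse("DB_PASSWORD", ""))
      .load()

    // Transform: basic cleansing - drop incomplete rows, normalize types,
    // and deduplicate on the business key.
    val cleaned = raw
      .na.drop(Seq("transaction_id", "amount", "booking_date"))
      .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
      .withColumn("booking_date", F.to_date(F.col("booking_date")))
      .dropDuplicates("transaction_id")

    // Load: write curated data to a data lake location, partitioned by date
    // (hypothetical GCS path; would typically be scheduled via Airflow).
    cleaned.write
      .mode("overwrite")
      .partitionBy("booking_date")
      .parquet("gs://example-curated-zone/transactions")

    // Basic analysis: a Spark SQL aggregate over the processed data.
    cleaned.createOrReplaceTempView("transactions")
    spark.sql(
      """SELECT booking_date, COUNT(*) AS txn_count, SUM(amount) AS total_amount
        |FROM transactions
        |GROUP BY booking_date
        |ORDER BY booking_date""".stripMargin
    ).show()

    spark.stop()
  }
}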
Posted:
June 25 on iSmartRecruit
