Hello,
We are hiring
Job Title: Big Data/Data Engineer (Hadoop)
Duration: 5 Months to hire
Location: Wilmington, DE
Job Description:
Responsibilities:
· Reporting to the Team Manager for Data Engineering, the Data Engineer will be part of the core group responsible for end-to-end data management on the Big Data platform.
· The Data Engineer will be responsible for data ingestion, data validation, code development on various big data technologies, data analysis, troubleshooting, and user interactions, working in an agile team.
· Effectively partner with the Data and Analytics team to understand their data requirements and work on data ingestion, analysis, validation, transformation, etc.
· Partner closely with various analytics projects to provide value from a data engineering perspective.
· Support development work on the ingestion framework, SQL, and anything else needed to generate validations.
· Support and monitor various data feeds as required for the project.
· Work within an Agile/Scrum framework, participating in all ceremonies and delivering on commitments.
· Apply quality assurance best practices to all work products.
Required Skills/Experiences:
· Bachelor's or advanced degree in Information Management, Computer Science, Mathematics, Statistics, or a related field desired.
· Financial services background or experience across different LOBs (Consumer, Commercial, Asset Management, and CIB) preferred.
· Proven proficiency with data analysis and ability to troubleshoot data issues.
· Proficiency across the full range of big data technologies and/or database and business intelligence tools; able to publish and present information in an engaging way.
· Intensive, recent experience in assessing and sourcing data needs.
· Detail-oriented with a commitment to innovation.
Preferred Skills/Experiences:
· Strong experience with the Big Data Hadoop stack – Hive, Impala, Spark, Spark SQL.
· Experience in Python, Java, and Spark development is highly desired.
· Experience with MPP databases like Teradata or Oracle is preferred.
· Experience with Big Data technologies such as Pig and Kafka is preferred.
· Exposure to machine learning is preferred.
· Exposure to ETL tools (Ab Initio, Informatica, DataStage, or others) is a plus.
· Good communication skills and the ability to interact with users, make presentations, and guide conversations.
· Experience with scheduling and data integration tools like Control-M and NiFi is highly desired.
· Strong exposure to Data Management, Governance, and Controls functions.
· Experience in data analysis, validation, and reporting is a plus.
· Ability to present complex information in an understandable and compelling manner.
If you are interested, click here.