Data Engineer urgently wanted

Johannesburg, Gauteng

Standard Bank is a firm believer in technical innovation, which helps us guarantee exceptional client service and leading-edge financial solutions. Our growing global success reflects our commitment to the latest solutions, the best people, and a uniquely flexible and vibrant working culture. To help us drive our success into the future, we are looking for an experienced Data Engineer, Global Markets Technology, to join our team at our Johannesburg offices. Standard Bank is a leading African banking group focused on emerging markets globally. It has been a mainstay of South Africa’s financial system for 150 years and now spans 16 countries across the African continent.

Job Purpose

Provide the infrastructure, solutions, tools and frameworks used to deliver end-to-end solutions for the Global Markets Data-Analytics and Regulatory business mandate. Build scalable infrastructure and solutions that support the delivery of clear business insights from raw data sources, with a focus on collecting, managing, analysing and visualising data, and on developing analytical solutions.

Responsible for expanding and optimising the Global Markets data and data pipeline architecture, whilst improving data flow and collection to ultimately support data initiatives.

Key Responsibilities/Accountabilities

In conjunction with business and subject matter experts, perform a diligent and disciplined analysis of regulatory documentation. Enumerate the precise regulatory requirements necessary to achieve regulatory compliance. Document these requirements to provide traceability and clearly identify the resultant business value. Identify potential challenges, risks and issues arising from these requirements, and bring them to the attention of the necessary parties. Develop an awareness of the problem space to proactively identify potential non-functional requirements.
Translate abstract regulatory requirements into a working solution architecture alongside the product owner, other stakeholders and relevant architects. Continually acquire a broad understanding of the business landscape with minimal guidance, and a deep understanding where regulatory requirements necessitate it. Identify opportunities to sweat assets or produce a future data asset for consumption by other parties. Present architectural solutions at the appropriate forums and boards for socialisation and approval.
Working with the product owner, decompose a solution architecture into actual features and dependencies. Identify the urgency and importance of these features, and which teams will implement which features. Ensure all features are delivered to the program boards of the impacted release trains, and that logical prioritisation takes place to bed down scope over future planning increments. Evaluate the solution design for security, robustness and reliability in the face of up- and downstream failure. Safeguard against design flaws by actively seeking peer review. Follow best practices and patterns, seeking consensus from peers when no patterns exist. Quantify and validate technical requirements against designed capacity to ensure adequate performance and scalability.
Create and maintain optimal data pipeline architecture: create databases optimised for performance, implement schema changes, and maintain data architecture standards across the required Standard Bank databases. Work alongside data scientists to help make use of the data they collect.
Assemble large, complex data sets that meet functional and non-functional business requirements, and align data architecture with business requirements. Process, cleanse, and verify the integrity of data used for analysis.
Build analytics tools that utilise the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics. Create data tools for the analytics and data science team members that assist them in building Standard Bank into an innovative industry leader.
Utilise data to discover tasks that can be automated, and identify, design and implement internal process improvements: automating manual processes, optimising data delivery, re-designing infrastructure for greater scalability, etc.
Design and develop scalable ETL packages from the business source systems, and develop ETL routines to populate databases from those sources and to create aggregates. Oversee large-scale Hadoop data platforms to support the fast-growing data within the business.
Responsible for enabling and running data migrations across different databases and servers, and for defining and implementing data stores based on system and consumer requirements.
Responsible for performing thorough testing and validation to ensure the accuracy of data transformations and of the data used in machine learning models.
Perform ad-hoc analyses of data stored in Standard Bank’s databases and write SQL scripts, stored procedures, functions, and views. Proactively analyse and evaluate Standard Bank’s databases in order to identify and recommend improvements and optimisations. Deploy sophisticated analytics programs, machine learning and statistical methods.
Analyse complex data elements and systems, data flows, dependencies, and relationships in order to contribute to conceptual, logical and physical data models.
Liaise and collaborate with the entire Global Markets Technology team, providing support to the whole department for its data-centric needs. Collaborate with subject matter experts to select the relevant sources of information and translate the business requirements into data mining/science outcomes. Present findings and observations to the team for the development of recommendations.
Act as a subject matter expert from a data perspective and provide input into all decisions relating to data engineering and its use. Educate the organisation on data engineering perspectives on new approaches, such as testing hypotheses and statistical validation of results. Maintain ongoing knowledge of industry standards and best practice, and identify gaps between these definitions/data elements and company data elements/definitions.
Convey a proactive, positive influence on the broader team in terms of architecture, design and implementation. Actively participate in the architecture forums to grow and maintain an awareness of the changing business and technical landscape, and to identify areas potentially impacting the regulatory services space. Act as a subject matter expert across the broader set of technology teams on all regulatory concerns. Maintain an up-to-date view of both domestic and international developments as they relate to existing or potential regulatory requirements.
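The extract, cleanse, verify and aggregate steps that recur in the responsibilities above can be sketched in miniature. The following is an illustrative Python sketch only, not Standard Bank code: the record shape, the field names (trade_id, desk, notional) and the per-desk aggregate are all invented for the example.

```python
# Hypothetical mini ETL routine: extract raw records, cleanse them,
# verify their integrity, then produce an aggregate for downstream use.

def extract(rows):
    """Pull raw trade records from a (mock) source system."""
    return list(rows)

def cleanse(rows):
    """Drop records with missing or non-positive notional amounts."""
    return [r for r in rows if r.get("notional") and r["notional"] > 0]

def verify(rows):
    """Integrity check: every record must carry a trade id and a desk."""
    for r in rows:
        assert r.get("trade_id") and r.get("desk"), f"bad record: {r}"
    return rows

def aggregate(rows):
    """Produce a per-desk notional total for downstream consumers."""
    totals = {}
    for r in rows:
        totals[r["desk"]] = totals.get(r["desk"], 0) + r["notional"]
    return totals

raw = [
    {"trade_id": "T1", "desk": "FX", "notional": 100.0},
    {"trade_id": "T2", "desk": "FX", "notional": 250.0},
    {"trade_id": "T3", "desk": "Rates", "notional": 0},  # cleansed out
    {"trade_id": "T4", "desk": "Rates", "notional": 75.0},
]

totals = aggregate(verify(cleanse(extract(raw))))
print(totals)  # {'FX': 350.0, 'Rates': 75.0}
```

In practice each stage would typically be a task in a workflow manager such as Airflow or Luigi, with integrity failures failing the run rather than raising inline.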

Preferred Qualification and Experience

Qualifications:

Degree / Honours Degree: Information Technology

Experience:

5–7 years’ experience in data monetisation. Experience with big data tools: Hadoop, Spark, Kafka, etc. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra. Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. Experience with AWS cloud services: EC2, EMR, RDS, Redshift. Experience with stream-processing systems: Storm, Spark Streaming, etc. Experience with object-oriented and functional scripting languages: Python, Java, C++, Scala, etc.
Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases. Experience building and optimising ‘big data’ data pipelines, architectures and data sets. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Strong analytic skills related to working with unstructured datasets. Ability to build processes supporting data transformation, data structures, metadata, dependency and workload management. A successful history of manipulating, processing and extracting value from large, disconnected datasets. Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
5–7 years’ software design and development. Experience building and maintaining full-stack enterprise .NET-based solutions: Windows Services, ASP.NET web applications, ADO.NET with Microsoft SQL Server, IBM MQ, etc. Automated unit testing (e.g. xUnit) skills, and experience with automated acceptance testing (e.g. FitNesse), are desirable. Must be familiar with modern service-oriented architecture (SOA) and the design principles of reliable, robust, and performant solutions. Microsoft .NET Core and/or Amazon/Azure cloud experience would be beneficial. Preference towards candidates with strong SQL skills.
5–7 years’ DevOps experience. Demonstrable experience with modern ‘DevOps’ enterprise software development toolchains and processes: lean and agile practices, software version control, formal change/release management processes, automated testing, continuous integration, continuous builds/packaging, continuous deployment, monitoring, etc. Experience with the Scaled Agile Framework (SAFe) and/or maintaining a DevOps automation platform, e.g. Bamboo or Jenkins, would be beneficial.
5–7 years’ software maintenance experience. Experience providing support to end-users of applications developed and maintained by the team. Experience unpacking new requirements, identifying the impact on existing solutions, and communicating these findings to business and IT. Strong interpersonal skills and analytical abilities that will ensure regulatory obligations are fulfilled accurately and timeously, even when under pressure.
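As a small illustration of the SQL-centric work the experience list describes (query authoring, aggregates, views, ad-hoc analysis), the sketch below uses Python’s standard-library sqlite3 module against an in-memory database. The trades table and its columns are hypothetical, chosen only to keep the example self-contained and runnable.

```python
# Hypothetical ad-hoc analysis: load sample rows, expose a pre-aggregated
# view, and query it, standing in for the SQL scripts and views the role
# describes. sqlite3 is used purely so the sketch needs no external server.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (desk TEXT, notional REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?)",
    [("FX", 100.0), ("FX", 250.0), ("Rates", 75.0)],
)

# A view gives downstream consumers a stable, pre-aggregated interface.
conn.execute(
    "CREATE VIEW desk_totals AS "
    "SELECT desk, SUM(notional) AS total FROM trades GROUP BY desk"
)

rows = conn.execute(
    "SELECT desk, total FROM desk_totals ORDER BY desk"
).fetchall()
print(rows)  # [('FX', 350.0), ('Rates', 75.0)]
conn.close()
```

In a production setting the equivalent view would live in the bank’s SQL Server or Postgres estate rather than SQLite, but the query-authoring pattern is the same.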

Knowledge/Technical Skills/Expertise

Architectural methodologies used in the design and development of IT systems.
The ability to ensure the accuracy and consistency of data for the duration that the data is stored, as well as to prevent unintentional alteration or loss of data.
Knowledge and understanding of IT applications and architecture.
Ability to analyse statistics and other data, interpret and evaluate results, and create reports and presentations for use by others.
The ability to apply metadata to information to make it easy for other people to find.
Refers to the knowledge and experience required to manage the installation, configuration, upgrade, administration, monitoring, and maintenance of physical databases.
Ability to build and maintain software solutions that meet user requirements, while following enterprise standards, patterns, and practices.
The ability to design software solutions that meet user and organisational requirements.
Refers to managing changes and deploying them in a manner that does not compromise solution or business integrity, and complies with policies and procedures.
Refers to the knowledge and experience required to manage the installation, configuration, upgrade, administration, monitoring, and maintenance of bespoke software.

Source: jobs365