Data Engineer

Applicants must have a minimum of two years' experience in analytics or a related area, in addition to meeting the following criteria:

Skills & Knowledge

Applicants must have confidence in at least one skillset under each heading:

Data Management


Informatica
Ab Initio
Talend
IBM Data Studio
SAS Data Integration
Dataflux

Visualisation


Tableau
Spotfire
Qlik
Power BI
SAS Visual Analytics

Techniques & Methods


Big Data Concepts
Data Warehousing

Languages


Python
Java
JavaScript (or similar)
Go
Ruby
C
C#
VBA

Databases


SQL Server
PostgreSQL
Teradata
Oracle
IBM DB2
MySQL
SAP HANA
MongoDB



Applicants must be able to demonstrate some capability in the following areas:

Develop and implement a data engineering strategy covering data collection, integration, quality, lineage, security, storage, preservation, and availability for further processing. Develop and implement a data strategy, in particular in the form of a data management policy and Data Management Plan, together with a path to executing the plan (tooling and steps).

Develop and implement relevant data models, and define metadata using common standards and practices, for different data sources in a variety of scientific and industry domains.
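
Purely for illustration, defining metadata "using common standards" can be sketched in a few lines of Python; the example below is loosely modelled on Dublin Core descriptive elements, and the field names and sample values are invented, not part of the framework:

```python
from dataclasses import dataclass, field, asdict

@dataclass
class DatasetMetadata:
    """Minimal descriptive metadata for a data source,
    loosely modelled on Dublin Core elements."""
    title: str
    creator: str
    description: str
    source: str
    fmt: str = "text/csv"             # media type of the underlying data
    keywords: list = field(default_factory=list)

    def to_record(self) -> dict:
        """Flatten to a plain dict, e.g. for loading into a data catalogue."""
        return asdict(self)

# Illustrative entry for a hypothetical daily sales extract.
meta = DatasetMetadata(
    title="Daily sales extract",
    creator="Finance team",
    description="Point-of-sale transactions, one row per sale",
    source="pos_system.sales",
    keywords=["sales", "retail"],
)
record = meta.to_record()
```

In practice the same record could be serialised to JSON or registered in a catalogue tool; the point is only that metadata is defined once, in a standard shape, per source.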

Integrate heterogeneous data from multiple sources and provide them for further analysis and use. Maintain historical information on data handling, including references to published data and the corresponding data sources (Data Lineage and Data Dictionary).
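
As a minimal, illustrative Python sketch (the sources and field names are invented): two heterogeneous feeds, one CSV and one JSON, are merged into a single schema, with each row tagged with its origin as a very simple form of data lineage:

```python
import csv
import io
import json

# Two heterogeneous sources: a CSV extract and a JSON feed.
csv_text = "id,amount\n1,10.5\n2,7.0\n"
json_text = '[{"id": 3, "amount": 12.25}]'

def integrate(csv_text: str, json_text: str) -> list:
    """Merge both sources into one list of dicts with a common schema,
    recording each row's source system in a 'lineage' field."""
    rows = []
    for r in csv.DictReader(io.StringIO(csv_text)):
        rows.append({"id": int(r["id"]),
                     "amount": float(r["amount"]),
                     "lineage": "csv_extract"})
    for r in json.loads(json_text):
        rows.append({"id": r["id"],
                     "amount": r["amount"],
                     "lineage": "json_feed"})
    return rows

unified = integrate(csv_text, json_text)
```

Real integrations would also normalise units, deduplicate keys, and persist the lineage to a catalogue, but the shape of the task is the same.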

Visualise the results of data analysis, design dashboards, and use storytelling methods.
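
Dashboards are normally built in the tools listed above; purely as a dependency-free illustration, even a few lines of Python can turn category totals into a readable text bar chart (the data here are invented):

```python
# Render category totals as a quick text bar chart (one '#' per unit).
totals = {"North": 12, "South": 7, "East": 9}

def bar_chart(data: dict) -> str:
    """Return one line per category, label left-padded, bar to the right."""
    width = max(len(k) for k in data)
    return "\n".join(f"{k.ljust(width)} | {'#' * v}" for k, v in data.items())

chart = bar_chart(totals)
print(chart)
```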

Ensure data quality, accessibility, interoperability, compliance with standards, and publication. Design, build, and operate appropriate, effective ETL (Extract, Transform, Load) solutions and processes for the data analysis being performed, such that they can be both implemented and scaled into target environments.
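
As a sketch only, not a prescribed solution: a complete ETL pipeline can be as small as the Python example below, which extracts rows from CSV text, transforms their types, and loads the result into an in-memory SQLite table (the data and table name are invented):

```python
import csv
import io
import sqlite3

raw = "name,qty\nwidget,3\ngadget,5\n"

def extract(text: str) -> list:
    """Extract: parse raw CSV text into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list) -> list:
    """Transform: normalise types and upper-case the key column."""
    return [(r["name"].upper(), int(r["qty"])) for r in rows]

def load(rows: list, conn: sqlite3.Connection) -> None:
    """Load: write the transformed rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS stock (name TEXT, qty INTEGER)")
    conn.executemany("INSERT INTO stock VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
total = conn.execute("SELECT SUM(qty) FROM stock").fetchone()[0]
```

Production pipelines in tools such as Informatica or Talend follow the same extract/transform/load decomposition, with scheduling, error handling, and scaling layered on top.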

Apply designated quantitative techniques, including statistics, time series analysis, optimisation, and simulation, to deploy appropriate models for analysis and prediction.
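
For illustration, the self-contained Python sketch below applies two of these techniques, descriptive statistics and a simple moving average for time-series smoothing, to an invented series:

```python
import statistics

# An invented weekly measurement series.
series = [12, 14, 13, 17, 20, 18, 22, 25]

def moving_average(xs: list, window: int) -> list:
    """Simple moving average: the mean of each consecutive window."""
    return [sum(xs[i:i + window]) / window
            for i in range(len(xs) - window + 1)]

mean = statistics.mean(series)      # central tendency of the series
stdev = statistics.stdev(series)    # sample standard deviation
smoothed = moving_average(series, 3)
```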

Identify, extract, and pull together available and pertinent heterogeneous data, including modern sources such as social media data, open data, and governmental data.


Apply for Certification

If your skills, knowledge and experience meet the minimum requirements as set out in the framework, you can apply today to be assessed for certification.

Our certification committee meets to assess applicants on a regular basis. You should expect to hear from us within six weeks of applying.

Click Register Now and complete an online application form, uploading the required evidence:

  • CV.
  • LinkedIn Profile.
  • Certificates of Education (Degrees/Diploma).
  • Any Project Work details you wish to include to support your application.


European Data Science Framework


Our certification is mapped to the EDISON Data Science Framework (EDSF), a Europe-wide project to establish the necessary skills and define Data Science as a professional practice across the continent.

Got a question?

Request a Callback


Not yet ready for Certification?

If you’ve decided you’re not ready for certification, we’ve outlined a programme of online learning with Coursera© to get you there. Mapped to our framework, these courses will give you the necessary skills to complete certification in your chosen area of expertise. Coursera© courses can be completed in your own time and typically cost €49.

Simply choose the courses that fill the gaps in your knowledge. Once completed, you can return and register for certification, uploading your Coursera© certificates to support your application.

Coursera

Big Data Integration and Processing
Commitment: 6 weeks, video lectures and reading each week.

Big Data Modeling and Management Systems
Commitment: 6 weeks of study, 2-3 hours/week.

Cloud Computing Applications, Part 1
Commitment: 5 weeks of study, 5-10 hours/week.

Basic Data Descriptors, Statistical Distributions, and Application to Business Decisions
Commitment: 4 weeks of study, 1-3 hours/week.

Foundations of strategic business analytics
Commitment: 4 weeks of study, 2-3 hours/week.

Big Data Analysis with Scala and Spark
Commitment: 4 weeks, on-demand video and reading.

Cloud Computing Applications, Part 2
Commitment: 5 weeks of study, 5-10 hours/week.

SAP S/4HANA Training – in Plain English
Commitment: 7 hours of video lectures and some reading.

The Ultimate Hands-On Hadoop – Tame your Big Data!
Commitment: 14.5 hours of on-demand video.

Become QlikView Designer from Scratch
Commitment: 5.5 hours of on-demand video.

Microsoft Power BI – A Complete Introduction
Commitment: 10 hours of on-demand video.

Industry Advisory Council: Certification Group

Keep up to date

Subscribe to our quarterly newsletter to get the latest events, insights, and opportunities from the Analytics Institute.


Contact Us:

Analytics Institute Limited
38/39 Fitzwilliam Square
Dublin 2
D02 RV08
Ireland

Company Registration Number: 479298.
VAT Number: IE9739127A

Managing Director: Lorcan Malone


© Analytics Institute of Ireland 2019 | All Rights Reserved | Read our Privacy Policy | Terms & Conditions