Data Architect in Topeka, KS at GDH

Date Posted: 8/9/2018

Job Description

General Accountability:

The day-to-day work of this position involves developing systems that ingest, sanitize, and normalize diverse datasets, shaping them into meaningful structured data for reporting and analytics. Accountabilities also include monitoring data movement, validating schedules/cycles, and resolving any problems that arise along the way.
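
To make the ingest, sanitize, and normalize steps concrete, a minimal T-SQL sketch follows; stg_customer_feed and customer_normalized are hypothetical staging and target tables invented for illustration, not objects from the actual environment:

    -- Illustrative only: move one raw feed into a cleaned, consistently typed table.
    INSERT INTO customer_normalized (customer_id, full_name, state_code, birth_date)
    SELECT
        TRY_CAST(customer_id AS INT),                   -- enforce a consistent key type
        UPPER(LTRIM(RTRIM(full_name))),                 -- trim padding, standardize case
        NULLIF(LTRIM(RTRIM(state_code)), ''),           -- turn empty strings into NULLs
        TRY_CONVERT(DATE, birth_date, 101)              -- MM/DD/YYYY text to DATE; bad values become NULL
    FROM stg_customer_feed
    WHERE TRY_CAST(customer_id AS INT) IS NOT NULL;     -- drop rows that cannot be keyed

Real pipelines would add auditing and rejected-row handling, but the shape of the work is the same: cast, trim, and standardize on the way from staging to the normalized layer.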

Duties and Responsibilities:

• Design, develop, and test data models for traditional data warehouse and data mart structures according to data standards

• Create fact and dimension structures and the cycles required to keep data refreshed (see the star-schema sketch after this list)

• Design, develop, and test data ingestion and integration between a broad set of internal/external data sources, including but not limited to enterprise data warehouses, operational data marts, SaaS, PaaS, iPaaS, flat files, web services, APIs, Salesforce, DST, SE2, AWS, Azure, and on-premises solutions

• Create source-to-target data mapping documents for all internal and external integrations

• Define and collaborate on normalized datasets that can be used by reporting and analytics analysts

• Provide consultation on dataset performance and quality monitoring

• Handle automation and refactoring of existing and future integrations

• Assist in creating standards, best practices, and architectural frameworks for data collection, integration, retention, and compliance

• Provide support for data collection and integration solutions, including resolution of incidents during the implementation and warranty periods

• Communicate, escalate, and/or resolve risks and issues related to assigned projects/initiatives

• Create and maintain change management requests and follow the change management process

• Effectively communicate with various levels of the organization

• Work with multiple vendors
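
For the fact and dimension work above, a minimal star-schema sketch in T-SQL might look like this; dim_product, fact_sales, and stg_product are hypothetical names chosen for illustration, and the MERGE shows one common way a refresh cycle can upsert a dimension from staging:

    -- Hypothetical star-schema fragment: one dimension table and one fact table.
    CREATE TABLE dim_product (
        product_key   INT IDENTITY(1,1) PRIMARY KEY,  -- surrogate key
        product_id    VARCHAR(20)  NOT NULL,          -- natural key from the source system
        product_name  VARCHAR(100) NOT NULL,
        category      VARCHAR(50)  NOT NULL
    );

    CREATE TABLE fact_sales (
        date_key     INT NOT NULL,                    -- FK to a date dimension
        product_key  INT NOT NULL
            REFERENCES dim_product (product_key),     -- FK to the dimension above
        quantity     INT NOT NULL,
        sale_amount  DECIMAL(12,2) NOT NULL
    );

    -- A scheduled refresh cycle can keep the dimension current from staging:
    MERGE dim_product AS tgt
    USING stg_product AS src
        ON tgt.product_id = src.product_id
    WHEN MATCHED THEN
        UPDATE SET product_name = src.product_name, category = src.category
    WHEN NOT MATCHED THEN
        INSERT (product_id, product_name, category)
        VALUES (src.product_id, src.product_name, src.category);

The surrogate key keeps the fact table stable even when source-system identifiers change, which is the usual reason dimensional models separate the two.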

Qualifications:

• Bachelor's degree in Computer Science, Business Information Systems, or a related field preferred

• Experience managing, configuring, and implementing designs for large-scale datasets

• Minimum of 3-5 years of data pipeline management, with a focus on data structures and normalization of disparate data sources

• Minimum of 3-5 years working with data warehouse and/or data mart creation

• Minimum of 3-5 years of data integration/ETL experience utilizing Microsoft SSIS, IBM InfoSphere, and/or Informatica Cloud

• Minimum of 1-2 years of experience with the UC4/Automic scheduling tool (or similar tools)

• 5+ years with SQL in a relational database environment (Microsoft SQL Server preferred), including the ability to code highly complex SQL (an illustrative query follows this list)

• Working knowledge of .NET (C# and/or VB.NET)

• 5+ years of professional experience in software analysis, development, engineering, and support in a medium-to-large corporate setting

• Experience working cross-functionally or across organizations

• Strong written and verbal communication skills, along with presentation skills

• Ability to handle multiple tasks of a highly varied nature in a timely manner

• Strong business acumen; existing knowledge of the financial services and/or insurance industry a plus
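
As an illustration of the kind of highly complex SQL the role calls for, the query below de-duplicates customer records pulled from two feeds using a window function; crm_customers and dwh_customers are hypothetical tables standing in for disparate sources:

    -- Keep only the most recent version of each customer across two source feeds.
    WITH combined AS (
        SELECT customer_id, full_name, updated_at, 'CRM' AS source_system FROM crm_customers
        UNION ALL
        SELECT customer_id, full_name, updated_at, 'DWH' AS source_system FROM dwh_customers
    ),
    ranked AS (
        SELECT *,
               ROW_NUMBER() OVER (PARTITION BY customer_id
                                  ORDER BY updated_at DESC) AS rn
        FROM combined
    )
    SELECT customer_id, full_name, source_system
    FROM ranked
    WHERE rn = 1;  -- row 1 is the newest record per customer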