Searching for a senior-level ETL specialist highly skilled in financial data warehousing and the SAS DI Studio tool set. The candidate will be required to lead the definition of the client's Extract/Transform/Load (ETL) process, including but not limited to development of conceptual, logical, and physical data models, source-to-target mapping, data quality analysis, data aggregation, data analysis, and reporting. The ideal candidate must have experience working within financial organizations such as Department of the Treasury financial agencies, Wall Street brokerage firms, hedge funds, etc.
SAS ETL/Data Integration Expert:
Experience designing, developing, automating, and testing source-to-target mappings using SAS DI Studio, plus Base SAS experience performing ETL. Skilled in building an ETL process that loads legacy data into a logical SPDS data model. The individual will establish and apply data quality standards so that maintained and purged data retains its meaning and definition.
The candidate will possess excellent writing and communication skills and be responsible for documenting and maintaining the ETL reference architecture, that is, the documentation describing all extraction, transformation, and loading activity. Must possess SAS administration skills spanning standard SAS logging, ARM logging, and LOG4SAS. Must also be able to enhance performance by reviewing and tuning processing, including knowledge of SAS methods for increasing performance by placing datasets in memory, such as the SASFILE statement or hash objects.
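The in-memory techniques mentioned above (SASFILE, hash objects) avoid the sort-merge overhead of joining large datasets by loading a lookup table into memory once and probing it per row. The following Python sketch is only an analogue of a SAS hash-object lookup, not SAS syntax; the table and field names are hypothetical:

```python
# Illustrative analogue of a SAS hash-object lookup: a small
# dimension table is held in memory once, then each transaction
# row is enriched with a constant-time key probe instead of a
# sorted merge. All names and values below are hypothetical.

# Hypothetical dimension table (e.g., account reference data)
accounts = [
    {"acct_id": 101, "branch": "NYC"},
    {"acct_id": 102, "branch": "DC"},
]

# Hypothetical transaction feed
transactions = [
    {"acct_id": 101, "amount": 250.0},
    {"acct_id": 103, "amount": 75.0},   # no matching account
]

# Build the in-memory lookup (analogous to declaring a hash object)
lookup = {row["acct_id"]: row["branch"] for row in accounts}

# Enrich each transaction; unmatched keys are flagged for review
enriched = [
    {**txn, "branch": lookup.get(txn["acct_id"], "UNMATCHED")}
    for txn in transactions
]
print(enriched)
```

In Base SAS the same pattern is typically written with a `declare hash` statement in a DATA step; the trade-off in both cases is memory consumption for the lookup table in exchange for avoiding a sort.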
Background in a financial institution, brokerage firm, Wall Street firm, or agency such as the SEC, IRS, or Department of the Treasury.
Candidates with this skill set will be able to provide:
• Knowledge of data sources: their locations, types, and accessibility requirements.
• Working knowledge of source and target data sizes; ability to estimate disk space requirements.
• Identification of inter-relationships and patterns among source data and targets.
• Maintenance of data integrity, covering column lengths, null keys, key referential integrity, missing values, improper or inconsistent values, and value types.
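The integrity issues in the last bullet can be expressed as simple per-record validation rules. A minimal Python sketch follows; the field names, reference sets, and length limit are hypothetical, and in practice such checks would be implemented as SAS DI Studio data-quality transforms:

```python
# Hypothetical data-quality checks mirroring the bullet above:
# null keys, key referential integrity, improper values, and
# column-length limits. Records and rules are illustrative only.

valid_currencies = {"USD", "EUR", "GBP"}
known_account_ids = {101, 102}      # referential-integrity target keys
MAX_DESC_LEN = 20                   # column-length limit

records = [
    {"acct_id": 101, "currency": "USD", "desc": "wire transfer"},
    {"acct_id": None, "currency": "USD", "desc": "deposit"},       # null key
    {"acct_id": 999, "currency": "XXX", "desc": "x" * 30},         # bad key, value, length
]

def validate(rec):
    """Return the list of data-quality violations for one record."""
    errors = []
    if rec["acct_id"] is None:
        errors.append("null key")
    elif rec["acct_id"] not in known_account_ids:
        errors.append("referential integrity violation")
    if rec["currency"] not in valid_currencies:
        errors.append("improper value")
    if len(rec["desc"]) > MAX_DESC_LEN:
        errors.append("column length exceeded")
    return errors

# Produce an exception report: (record index, violations)
report = [(i, validate(r)) for i, r in enumerate(records)]
print(report)
```

Records with a non-empty violation list would typically be routed to an exception table for review rather than loaded into the target model.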
ActioNet, Inc.