
Vamsi Singh

ETL Architect at Russell Investments


Work Experience

ETL Architect

Russell Investments - Seattle, WA

July 2010 to Present

Project: Investment Division Operational Data Store (Jul 2010 - Present)
Role: ETL Architect
 
Description: Russell Investments is a subsidiary of Northwestern Mutual. Russell clients include retirement plans, endowments and investment plans of all types. Investors have access to Russell services such as BICC (Business Intelligence Competency Center), Russell Indexes, Risk Management, EDM ODS and ISS PACE (Investment Shared Services). This project involved working with different lines of business, including Transition Management, Portfolio Services, Funds Derivatives, Overlay Services, Short Term Investment, FX (Foreign Exchange) and MAT. The project started as RIS ODS (Russell Investments Services), which was later renamed ID ODS (Investment Division). The prime goal of the project was to recommend and build the architecture for the ODS, warehouse and data marts, along with the infrastructure, messaging architecture, ETL engine recommendations and reporting requirements.
 
Responsibilities: 
• Served in multiple roles as Data Modeler/Architect, ETL Architect, and Onsite Coordinator. The project ran in two phases, and my role differed in each.
Phase-I: Data Modeler/Architect 
• Worked closely with Business Analysts to gather most of the business requirements for all the lines of business. 
• Extensively worked on data modeling while designing the Operational Data Store for the various lines of business and integrating them together. 
• Performed logical and physical database design, data modeling and analysis with the PowerDesigner tool; also experienced with the ERwin tool for data modeling. 
• Strong understanding of modeling techniques, with solid implementation of forward engineering and reverse engineering in various scenarios. 
• Led discussions with Business Analysts and SMEs for Requirement Gathering. 
• Translated all LOBs Business Requirements to Technical Specifications. 
• Delivered the process flows of the different lines of business to the various internal technical teams. 
Phase-II: ETL Architect/Sr. Informatica Developer 
• Experience in creating Business Requirement Document, Technical Architecture Document, Process Control Document, High Level Document, and Low Level Document. 
• Worked on different types of ETL/Data Load processes that pull data from flat files into SQL Server database. 
• Analyzed and prioritized the requirements according to the timelines and resources available. 
• Led an ETL team to implement the complete out-of-the-box solution for the Finance module, including TM, OS, SWAPS and FX. 
• Designed and developed required Informatica Mappings and Workflows. 
• Created and built ETL mappings to load custom Warehouse tables. 
• Used Informatica Designer to design mappings and built them from reusable mapplets using transformations such as Joiner, Aggregator, Expression, Filter, and Update Strategy. 
• Designed and developed Informatica mappings to load data from the operational data stores of the different lines of business, scrubbing, cleansing and transforming the data before it is utilized by the downstream systems. 
• Tuned the Informatica code for performance: creating aggregates and drill-downs, cache management, and tweaking business models. 
• Understood and re-implemented functional logic from DTS packages and C# code as Informatica code. 
• Customized Informatica mappings; configured, scheduled and executed full and incremental loads on the Operational Data Store. 
• Introduced techniques for handling the Informatica ETL code, such as a period control implementation and a file load master technique (see the SQL sketch after this list); also integrated the holiday table into the Investment Division ODS. 
• Assisted the reporting developers and data modelers to ensure the ETL addresses the scope of reporting while adhering to the required standards. 
• Designed, developed and enhanced various ETL load techniques from one line of business to another, integrating with the EAGLE PACE messaging bus. 
• Documented the tasks at the detailed level and presented the recommendations on the best practices for the migration of Informatica code and Database Code from Development to SIT, SIT to UAT and UAT to Production. 
• Involved in Performance Unit Testing, SIT and UAT for ETL process. 
• Built CSV files on accounting groups required for the Brown Brothers Harriman finance module. Created a new database for the BBH accounting process from their legacy database, which involved a full transition of their objects, new Informatica code, modifications to their logical and physical data models, and extending the warehouse tables with the required flex fields. 
• Worked on REMEDY Ticketing System to resolve issues that were opened and provided timely solutions. 
• Led the process load meetings with the MFT teams for the file transfer process. 
• Supported troubleshooting of data issues on the ETL side and documented the resolution history for all issues. 
• Drove time-sensitive deliverables and delivered them on schedule. 
• Conducted Knowledge Transfer and End user training for the customer and Tracked the progress of the engagement. 
• Documented Technical user guides and recorded training sessions for customer reference. 
• Discussed real-world problems at client sites and advised on their resolution. 
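
A minimal sketch of the period control idea behind the full and incremental loads above, assuming a hypothetical control table and staging schema (illustrative names, not the actual ODS objects):

    -- Hypothetical control table: one row per source, tracking the last load point.
    CREATE TABLE etl_load_control (
        source_name    VARCHAR(50) NOT NULL PRIMARY KEY,
        last_loaded_ts DATETIME NOT NULL
    );

    -- Incremental pass: pick up only rows changed since the last successful load.
    INSERT INTO ods_position_stg (account_id, position_dt, market_value)
    SELECT s.account_id, s.position_dt, s.market_value
    FROM   src_positions s
    WHERE  s.last_update_ts > (SELECT c.last_loaded_ts
                               FROM   etl_load_control c
                               WHERE  c.source_name = 'POSITIONS');

    -- Advance the control row only after the load completes cleanly.
    UPDATE etl_load_control
    SET    last_loaded_ts = GETDATE()
    WHERE  source_name = 'POSITIONS';

In the project this pattern was driven from Informatica mappings and workflows; the SQL is only a compact way to show the control flow.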
 
Environment: Informatica 8.6.1, SQL Server 2000/2005/2008, ERwin, Flat Files, SQL, PL/SQL, EAGLE PACE.

Sr. ETL Developer

Merkle Inc

December 2009 to July 2010

Client: DELL 
 
Description: Merkle Inc is one of the largest database marketing solution agencies and provides database marketing solutions for several clients. This project involved working with the marketing data warehouse fed by various source systems. The DWH maintains the customer, warranty, historic, campaign and other staging information. The project aimed to provide improved customer value through improved 90-day response rates and higher average order values, improve long-term customer value through improved retention rates, increase email opt-ins, improve consumer awareness of Dell products and services, and improve customer satisfaction. 
 
Responsibilities: 
• Extensively worked on ETL/data load processes that pull data from flat files into the Netezza database, along with production support. 
• Worked with the DMExpress tool to pull data from the relational databases Netezza and Oracle and load it at various levels. 
• Involved in extracting data from the Oracle database and loading it into Netezza and some of the Oracle tables using data management techniques. 
• Developed the code for the production runs and wrote many SQL and PL/SQL queries. Worked extensively in WINSQL to query the database and run some of the independent jobs associated with the daily and weekly processes. 
• Good experience in Logical and Physical Database Design, Data Modeling and Analysis work with Power Designer Tool. 
• Strong understanding of Performance tuning in Informatica and loading data into Data Warehouse/Data Marts using Informatica. 
• Created design documents for Informatica mappings based on business requirements and ran most of the production workload with Informatica. 
• Designed and developed Informatica mappings to load data from the source systems to the Operational Data Store and then to the data mart, scrubbing, cleansing and transforming the data before it is loaded into the DWH. 
• Used Informatica Designer to design mappings and built them from reusable mapplets using transformations such as Joiner, Aggregator, Expression, Filter, and Update Strategy. 
• Tuned performance of Informatica session for large data files by increasing block size, data cache size and target based commit interval. 
• Customized the ETL detailed design template to encompass Informatica mapping and session details. 
• Involved in migration of Informatica mapping from Development to Production environment. 
• Extensive documentation on the design, development, implementation, daily loads and process flow. 
• Extensively worked on Tidal for scheduling production runs daily and weekly. 
• Created complex SQL code and procedures that pull data from a Netezza database into the DWH (see the SQL sketch after this list). 
• Worked on JIRA tickets to resolve issues that were opened and provided timely solutions. 
• Wrote VBScripts around DMExpress to schedule jobs and simplify many of them. 
• Worked on multiple projects at the same time and was involved in maintenance and support of the application after production roll-out. 
• Worked on performance tuning of queries and improved response time and reduced IO statistics. 
• Actively participated in all business meetings, daily status meeting and provided status. 
• Held regular meetings with the BAs and clients to review modifications. 
• Maintained warehouse metadata, naming standards and warehouse standards for future application developments. 
• Encrypted files using PGP for security before transferring them. 
• Set up the FTP and source connections, using FileZilla as the FTP tool, to transfer files from the source to agencies such as Acxiom and WMSG. 
• Supported the team in testing during the UAT phase for Validations of Business scenarios and data corrections. 
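
A minimal sketch of the kind of set-based staging-to-DWH SQL described above, with hypothetical table and column names standing in for the actual marketing schema:

    -- Aggregate campaign responses per customer from staging into the warehouse.
    INSERT INTO dwh_campaign_summary (customer_id, campaign_id, response_cnt, last_response_dt)
    SELECT stg.customer_id,
           stg.campaign_id,
           COUNT(*),
           MAX(stg.response_dt)
    FROM   stg_campaign_response stg
    GROUP BY stg.customer_id, stg.campaign_id;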
 
Environment: Netezza 4.6.5, WINSQL 8.0, DMExpress 5.0, Informatica Power Center 8.5.1, Oracle 10g, Flat Files, SQL Server 2005, NZSQL, SQL, PL/SQL, Windows NT, Power Designer 15.0, Tidal, Filezilla 3.2.4.1, PGP 8.0.3.

Sr. ETL Developer

UBS Bank, NY

June 2008 to November 2009

Description: This project involved working with a financial data warehouse fed periodically by source systems. The scope of the project was maintenance of the CRPS DWH. The DWH maintains customer information, the financial balance details of customers, and details of the different bank branches. Reporting included city-wise customer reports, customer grade reports, floating balance reports, valuable customer reports, branch performance detail reports, etc. Some of these reports were mission critical, as they contained financial data and were critical for business planning. 
 
Responsibilities: 
• Involved in the extraction process to get the data from Oracle sources, DB2, XML files, Flat files. 
• Extensively used Informatica Power Center 7.x to extract the data from XML, flat files and other RDBMS databases like Oracle, DB2, and SQL Server into staging area and then populate the data warehouse. 
• Interacted with the business community and database administrators to identify the business requirements and data realities. 
• Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Union, Rank, Normalizer, Update strategy and Router transformations in an efficient manner. 
• Used Autosys to schedule and run the jobs on a daily and monthly basis. 
• Underwent a basic training session from Netezza representatives. 
• Assisted with scripting and monitoring interfaces between DB2 and Oracle. 
• Involved in Conversion and Migration of ETL Functionality from Oracle to DB2 UDB. 
• Used Informatica PowerConnect to extract data from the IBM mainframe/DB2 environment and evaluated Informatica PowerExchange for CDC (Change Data Capture). 
• Used Informatica PowerConnect for Netezza to pull data from the Netezza data warehouse. 
• Populated error tables as part of the ETL process to capture the records that failed the migration (see the SQL sketch after this list). 
• Created design documents for Informatica mappings based on business requirements. 
• Created Informatica mappings using various Transformations like Joiner, Aggregate, Expression, Filter, and Update Strategy. 
• Involved in the performance improvement project with production move and documentation. 
• Improved workflow performance by shifting filters as close as possible to the source and selecting the table with fewer rows as the master during joins. 
• Played key role in determining data feed extraction process, data analysis prior to ETL and optimization of Informatica ETL process. 
• Worked on Informatica developments and enhancements by creating new mappings and making changes to the existing mappings as per the business requirements. 
• Worked on Informatica Source Analyzer, Data warehousing designer, Mapping Designer, Mapplet designer and Transformations. 
• Responsible for validating the mappings against the pre-defined ETL design standards. 
• Used persistent caches whenever data from workflows were to be retained. 
• Used connected and unconnected lookups whenever appropriate, with appropriate caches. 
• Involved in designing the testing plan (Unit Testing and System Testing). 
• Tested scripts by running workflows and assisted in debugging the failed sessions. 
• Worked with the Release Management team and verified the production checklist documents necessary for the production activities. 
• Created tasks and workflows in the Workflow Manager, monitored the sessions in the Workflow Monitor, and set up project folders in all Informatica environments. 
• Set up developer access for the Development, QA and Production environments in Informatica. 
• Used the pmcmd command to run the workflows through UNIX shell scripts, with job automation using the Autosys scheduling tool. 
• Performed maintenance, including managing space, removing bad files, removing cache files and monitoring services. 
• Set up permissions for groups and users in all environments (Dev, UAT and Prod). 
• Migrated developed objects across the different environments. 
• Conducted and led team meetings and provided status reports to the project manager. 
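
A minimal sketch of the error-table pattern mentioned above for capturing records that fail the migration; the table layout and the branch-code check are hypothetical:

    -- Hypothetical reject table populated by the ETL for rows that fail validation.
    CREATE TABLE etl_error_log (
        run_id       INTEGER,
        src_table    VARCHAR(50),
        src_key      VARCHAR(100),
        error_reason VARCHAR(255),
        logged_ts    TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    );

    -- Example check: capture customer rows whose branch code has no match.
    INSERT INTO etl_error_log (run_id, src_table, src_key, error_reason)
    SELECT 1001, 'STG_CUSTOMER', s.customer_id, 'Unknown branch code'
    FROM   stg_customer s
    LEFT JOIN dim_branch b
           ON b.branch_code = s.branch_code
    WHERE  b.branch_code IS NULL;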
 
Environment: Informatica Power Center 8.5.1/7.1.2, Oracle 10g, Netezza, Flat Files, DB2, SQL Server 2005, Unix Shell Scripts, SQL, PL/SQL, Windows NT, Erwin 6.1, Autosys, Trillium, Clear Case, Business Objects 6.5.1.

Sr. Informatica Developer

GE Corporate - Milford, CT

November 2006 to May 2008

Description: The Global HR Data Warehouse, supported and maintained by GE, caters to the HR system reporting requirements of the organization. The project covered employee registration, employee retrenchment, vacation computation, Learning Management System change registration and salary changes, along with the necessary reports such as Attrition Reporting and Compensation Reporting, and supported various GE downstream systems. Custom OBIEE dashboards and reports were configured. 
Responsibilities: 
• Requirement analysis in support of Data Warehousing efforts with Oracle and DB2. 
• Monitored and developed downstream application data feeds. 
• Fine-tuned session performance using session partitioning for long-running sessions. 
• Used SQL overrides to customize queries as per the downstream requirements (see the SQL sketch after this list). 
• Worked extensively on Oracle query optimization and built complex queries. 
• Handled minor project enhancements such as adding new dimensions to the warehouse. 
• Used the pmcmd command in UNIX scripts to call Informatica sessions and workflows. 
• Built a reusable and easily scalable UNIX shell script based solution to perform bulk load into the target DB2 database. 
• Created Sessions and Workflows to load data from the IBM DB2 UDB 8 databases. 
• Implemented the Incremental loading of Dimension and Fact tables. 
• Worked with the various lookup caches: dynamic cache, static cache, persistent cache, re-cache from database, and shared cache. 
• Implemented the concept of swapping database instances in the project. 
• Customized the DAC by adding tasks and subject areas for the service, sales, order management, HR and marketing modules. 
• Executed the HR3 workflow redesign project successfully and smoothly. 
• Used Versioning, Labels and Deployment group in the production move process. 
• Created detailed test cases and objectives for the mappings. 
• Developed and tested the extraction, transformation, and load (ETL) processes for relational databases like Oracle and DB2 and flat file systems for Cognos Reporting. 
• Tuned performance of Informatica session for large data files by increasing block size, data cache size and target based commit interval. 
• Implemented OBIEE (Oracle Business Intelligence Enterprise Edition) reporting in the HR3 dashboard application for the development of dashboards. 
• Used Oracle Warehouse Builder to create process flows for some downstream systems. 
• Used Deployment Manager to deploy the OWB mappings to Production. 
• Developed various shell scripts to support the Informatica Mappings. 
• Involved with DBA Team in creating database triggers for data security. 
• Worked extensively with update strategy and lookup transformation for inserts and updates. 
• Involved in the validation of OLAP unit testing. 
• Created reports using Query Studio and Report Studio in Cognos. 
• Involved in the Production Release Activities. 
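
A minimal sketch of the SQL override idea above: a hand-tuned Source Qualifier query that restricts the extract for a downstream feed. The tables and the $$LAST_RUN_DATE mapping parameter are illustrative, not the actual GE objects:

    -- Hypothetical override: only active employees changed since the last run;
    -- $$LAST_RUN_DATE is expanded from the parameter file before execution.
    SELECT e.emp_id,
           e.dept_id,
           e.hire_dt,
           s.base_salary
    FROM   hr_employee e
    JOIN   hr_salary s
           ON s.emp_id = e.emp_id
    WHERE  e.status = 'ACTIVE'
    AND    e.last_update_dt > TO_DATE('$$LAST_RUN_DATE', 'YYYY-MM-DD')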
 
Environment: Informatica (Power Center 8.1), Oracle 9i, DB2, Flat Files, Teradata V2R5, SQL, PL/SQL, Cognos, UNIX, ERwin Data Modeling (Erwin 4.5), OBIEE 9.1, MS SQL Server 2000, RHEL (Red Hat Enterprise Linux) 4.0.

Informatica Developer / Analyst

Fannie Mae - Herndon, VA

November 2005 to October 2006

Description: This project created a centralized data mart that could support drill-down analysis, drill-through analysis, and management reporting for the securities portfolio. It also focused on identifying and defining the analytics required to validate overall financial results. As an ETL developer/analyst, I was involved in the ETL processes using Test Oracle development for the Restatement and Catch-up 2005, '06 and '07 projects. I was also involved in ETL analysis, control data population for the ST team, and data loading for the Analytics, Fin 46 and Impairments reports using Informatica. 
 
Responsibilities: 
• Involved in understanding the BRD and technical documentation, which include business objectives, scenario-based strategies and the test environment. 
• Worked with Data Modeler to understand the Business Requirements and Business Rules Specific to Business Subject. 
• Prepared ETL Specifications and design documents to help develop mappings. 
• Created mappings to read from Flat files, RDBMS and to load into RDBMS tables. 
• Created mappings/sessions/workflows and shared transformations. 
• Used version control to check in and checkout versions of objects. 
• Performance tuning of Mapping and Workflow objects. 
• Prepared and maintained mapping specification documentation. 
• Involved in End-to-end Execution using Autosys and in extensive analysis of business scenarios. 
• Created and tested Informatica mappings to perform data transformations, compares and loading as part of Test Oracle creation and compares. 
• Developed a number of Informatica mappings based on business requirements using various transformations: Dynamic Lookup, connected and unconnected Lookups, Filter, Stored Procedure, Update Strategy, Joiner, Aggregator, Expression, Router, Sequence Generator and Normalizer. Extensively tested the mappings by running queries against the source and target tables and by using breakpoints in the Debugger. 
• Prepared expected-result SQL queries to validate BOXI reports against the actual reports developed by the reporting team, as part of the cross-validation process (see the SQL sketch after this list). 
• Actively participated in status, pre-release and module initiation meetings. 
• Actively interacted with the DBA and IG4 environment teams to resolve issues and to come up with workarounds in case of failures. 
• Tracked defects using Rational ClearQuest and Quality Center. 
• Used Quality Center to maintain requirements and test cases; test cases were executed in Quality Center for each new release. 
• Assisted QA Team to fix and find solutions for the production issues. 
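
A minimal sketch of the cross-validation approach above: an expected-result query set against the report's actuals so that any rows returned flag a discrepancy. The table names are hypothetical:

    -- Expected portfolio totals recomputed from the detail data...
    SELECT portfolio_id, SUM(amortized_cost) AS total_cost
    FROM   sec_position_detail
    GROUP BY portfolio_id
    MINUS
    -- ...minus what the report actually shows; any output is a mismatch.
    SELECT portfolio_id, reported_cost
    FROM   rpt_portfolio_actual;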
 
Environment: Informatica 8.1.1 (PowerCenter/PowerMart), Ab Initio 1.13, Oracle 9i/10g, Autosys JIL scripting, Rational ClearQuest, Mercury Quality Center, Windows 2000/NT, UNIX servers.

Informatica ETL Consultant

VTECH, CA

September 2004 to October 2005

Description: VTech is a leading telephone instrument manufacturing company. This data warehouse was built for the Order Management System; it synchronizes the production data with the order and inventory data available in the USA. The data warehouse gives a clear picture of order status, shipped quantity, open order quantity, product-wise sales, inventory status, in-transit inventory, the production plan, and POS data for sales promotion schemes through discounts and offers, with multiple hierarchical dimensions as required by the client. 
 
Responsibilities: 
• Designed and developed ETL process using Informatica tool. 
• Developed mappings, sessions, workflows and workflow tasks based on the user requirements, scheduled the workflows in the Workflow Manager, and monitored the load status in the Workflow Monitor. 
• Performed performance tuning of sources, targets, mappings and the SQL queries in the transformations. 
• Based on the logic, used various transformations in the mappings: Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator, Joiner, and Input/Output transformations. 
• Created reusable transformations and Mapplet and used with various mappings. 
• Created connected, unconnected and dynamic Lookup transformations for better performance, and increased the cache file size based on the size of the lookup data. 
• Applied various optimization techniques in the Aggregator, Lookup, and Joiner transformations. 
• Developed Informatica parameter files to filter the daily data from the source system. 
• Created mappings in the Designer to implement Type 2 slowly changing dimensions (see the SQL sketch after this list). 
• Developed Workflow Tasks like reusable Email, Event wait, Timer, Command, and Decision. 
• Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors occurred while loading. 
• Created test cases for unit testing, system integration testing and UAT to check the data. 
• Created personalized versions of reports as well as statements for customers using Business Objects. Created a universe using the data from the Informatica metadata and then generated Business Objects reports using slice-and-dice capabilities. 
• Utilized mathematical functions, built-in workflows and data mining through Business Objects. 
• Created Oracle Stored Procedure to implement complex business logic for good performance and called from Informatica using Stored Procedure transformation. 
• Used various Oracle Index techniques like B*tree, bitmap index to improve the query performance and created scripts to update the table statistics for better explain plan. 
• Responsible for loading data into warehouse using Oracle Loader for history data. 
• Responsible for moving the mappings and sessions from development repository to production repository box. 
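
A minimal sketch of the Type 2 slowly-changing-dimension logic referenced above, written as plain SQL; the product dimension, staging table and tracked column are hypothetical:

    -- Step 1: close out the current row when a tracked attribute has changed.
    UPDATE dim_product d
    SET    d.effective_end_dt = TRUNC(SYSDATE) - 1,
           d.current_flag     = 'N'
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_product s
                   WHERE  s.product_cd = d.product_cd
                   AND    s.list_price <> d.list_price);

    -- Step 2: insert a fresh current row for new and changed products
    -- (changed products no longer have a current row after step 1).
    INSERT INTO dim_product (product_key, product_cd, list_price,
                             effective_start_dt, effective_end_dt, current_flag)
    SELECT dim_product_seq.NEXTVAL, s.product_cd, s.list_price,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM   stg_product s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_product d
                       WHERE  d.product_cd = s.product_cd
                       AND    d.current_flag = 'Y');

In the project itself this was built in the Informatica Designer with Lookup and Update Strategy transformations; the SQL is only a compact view of the two steps.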
 
Environment: Informatica Power Center 7.1.1, Business Objects 6.5, Oracle 9i, SQL Server 2000, Mainframe DB2, COBOL Files, AS/400, SQL, PL/SQL (Stored Procedure, Trigger, Packages), Erwin, MS Visio, Windows 2000, UNIX AIX 5.1

ETL Developer

Arthur Andersen India - Phoenix, AZ

March 2003 to August 2004

Description: Consumer Analysis Tool is a risk consulting project for AP Transco. It analyzes consumer payments and consumption and maintains the consumer database across the state. The project included collecting consumer data and formulating a generalized database structure for the Private Accounting Agencies (PAAs) to follow for the future maintenance of data at the DISCOM level. The tool centralizes the database and provides a front-end tool to generate reports at the DISCOMs. Power consumption and the generated revenue are evaluated, and the revenue loss/arrears for each DISCOM are generated. It finally analyzes possible ways to reduce the receivables, which in turn reduces the working capital. 
Responsibilities: 
• Developed and supported the extraction, transformation and load (ETL) process for a data warehouse from their OLTP systems using Informatica, and provided technical support and hands-on mentoring in the use of Informatica. 
• Converted data from one database to the data warehouse. 
• Modified the database structure as required by the application developers. 
• Prepared SQL queries for consumer analysis reports. 
• Wrote SQL queries for cross-verification of data. 
• Responsible for all pre-ETL tasks upon which the data warehouse depends, including the management and collection of various existing data sources. 
• Major involvement in performing ETL operations using Informatica. 
• Interacted with end users and business analysts to identify and develop business requirements and transform them into technical requirements. 
• Used various transformations such as Expression, Filter, Joiner and Lookup for data massaging, to migrate clean and consistent data. 
• Extensively worked on creating external tables and importing data from flat files into them using Informatica. 
• Worked closely and iteratively with technical staff and business owners to produce logical and normalized data models that reflect business processes. 
• Wrote complex SQL using joins, subqueries and correlated subqueries (see the SQL sketch after this list). 
• Generated consumer profiling and ageing reports for all months, with analysis classifying consumers into different classes based on their payment patterns. 
• Held significant responsibility on the data mining team and coordinated with the team for successful, results-oriented work. 
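
A minimal sketch of the correlated-subquery style used for the payment-pattern analysis above; the consumer and payment tables and the threshold are hypothetical:

    -- Flag consumers whose arrears exceed three times their own average payment.
    SELECT c.consumer_id,
           c.discom_cd,
           c.arrears_amt
    FROM   consumer c
    WHERE  c.arrears_amt > 3 * (SELECT AVG(p.paid_amt)
                                FROM   payment p
                                WHERE  p.consumer_id = c.consumer_id);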
 
Environment: Oracle 8.0, Informatica Power Center 7.1, SQL, PL/SQL, Power Mart 4.7, Windows NT.

Jr. Programmer

United Telecoms Ltd

July 2001 to January 2003

Description: United Telecoms Limited is the flagship company of the UTL group. UTL is the largest manufacturer of C-DOT technology based switching equipment and, in collaboration with ZTE, China, manufactures CDMA technology based infrastructure equipment. This data warehouse is a central location where data for sales and orders from the different regions is stored. It is built on top of the different application databases to create analysis and future projection reports. 
 
Responsibilities: 
• Designed and developed mappings and sessions in Informatica to load the target database. 
• Used Supervisor to create the repository, users and general supervisors, and to manage other users and their privileges. 
• Involved in designing and building universes, classes and objects using Business Objects 6.5. 
• Performed drill analysis, allowing users to view data at a more detailed level. 
• Generated Master/Detail, Master/Cross, Chart and Subtype reports as and when required. 
 
Environment: Informatica Power Center 5.1, Oracle 8i, SQL, PL/SQL, UNIX, Windows 2000, Business Objects 5.5

Education

Bachelors

Pondicherry University - Puducherry, Puducherry

University Rank in Bachelors

Additional Information

• Ten years of IT experience in the analysis, design, development, testing and implementation of business application systems for the health care, financial, insurance, banking, manufacturing and telecom sectors. 
• Seven years of strong data warehousing and ETL experience using Informatica PowerCenter 8.0/7.1.2/7.0/6.2/5.1 and Informatica PowerMart 6.2/5.1/4.7, with a good understanding of the Informatica architecture. 
• Dimensional data modeling experience using ERwin 4.5/4.0 and Oracle Designer: star-join schema/snowflake modeling, fact and dimension tables. 
• Physical and logical data modeling, with hands-on experience in star and snowflake schema design in relational, dimensional and multidimensional modeling. 
• Involved in all phases of the data warehouse project life cycle. Designed and developed ETL architecture to load data from various sources such as DB2 UDB, Oracle, flat files, XML files, Teradata, Sybase and MS SQL Server into Oracle, Teradata, XML and SQL Server targets. 
• Experience with data extract management techniques, target load data techniques and data management techniques with databases like Oracle, DB2, Netezza and SQL Server. 
• Experience working with various Heterogeneous Source Systems like Oracle, MS SQL Server, Teradata, DB2 UDB, Mainframe files, dBase, SAP tables and Flat files. 
• Experienced in developing Materialized Views, stored procedures, packages, functions, partitions and triggers. 
• Expertise in implementation of OBIEE Applications including Informatica, DAC and OBIEE Platform. 
• Extensive experience in scheduling real-time jobs using Tidal and in developing SQL and PL/SQL code, with a good knowledge of Netezza. 
• Good experience in UNIX shell scripting and ETL process automation using shell programming and Informatica, plus solid experience in performance tuning at both the database and Informatica levels. 
• Experience in developing test strategies, test plans and test cases, and expertise in performance tuning and debugging of existing ETL processes. 
• Involved in the complete life cycle of several data warehouse implementations, covering system architecture, functional/technical design, development, testing (unit testing, system integration testing and user acceptance testing), implementation and production support. 
• Proficient knowledge and experience in working with Relational and Non-Relational Source and Target systems. 
• Excellent understanding of client/server architecture; a valuable asset to any project, with excellent logical, analytical, communication and organizational skills. 
• Good team player who can work on both the development and maintenance phases of a project, with strong team-building and mentoring skills and good interaction with users. 
• Strong logical and analytical reasoning skills, and excellent communication with good listening, presentation and interpersonal skills. 
 
Technical Profile: 
 
ETL Tools: Informatica PowerCenter 8.6.1/7.x/6.x, DMExpress 
Databases: Netezza 4.0.1, Oracle 10g/9i/8i, MS SQL Server 2000/2005/2008, DB2 UDB 7.0/8.0, MS Access 
Data Modeling: Dimensional data modeling (star schema, snowflake, fact and dimension tables), physical and logical data modeling, entities, attributes, cardinality, ER diagrams, ERwin 4.5/4.0, PowerDesigner 15.0 
Tools: WINSQL, Tidal, FileZilla, PGP, Toad, Autosys, Erwin Macro, PowerDesigner, Ctrl-M, Crontab 
Business Intelligence: OBIEE (Oracle BI Enterprise Edition) 10.1.3, Siebel Analytics 7.8/7.7/7.5, Business Objects 6.5/6.0, Cognos 8.0 BI (Report Studio, Query Studio) 
Languages: SQL/PL-SQL, UNIX shell scripting, VBScript, C, C++ 
Operating Systems: Windows 2003/2000/XP/98, UNIX, Windows NT, Fedora Linux, MS-DOS 
System Design & Development: Requirements gathering and analysis, data analysis, ETL design, reporting environment design, development and testing, UAT, implementation