Mohammed

Phone: 214-295-6666 Extn. 105 / Email:

SUMMARY

Over 8 years of professional IT experience in data warehousing design, modeling, development, analysis, implementation and testing.

Expert knowledge of data warehousing (ETL) tools such as Informatica PowerCenter 9.x/8.x/7.x/6.x/5.x and PowerExchange.

Extensively used Informatica Warehouse Designer to create and manipulate Source and Target definitions, Mappings, Mapplets and Transformations.

Extensively worked on Informatica PowerCenter transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter and Sequence Generator.

Have clear understanding of Data warehousing and Business Intelligence concepts with emphasis on ETL and life cycle development using Power Center, Repository Manager, Designer, Workflow Manager and Workflow Monitor.

Expertise in data warehousing concepts, including OLTP/OLAP system study, analysis and E-R modeling, and development of database schemas such as Star and Snowflake used in relational, dimensional and multidimensional data modeling.

Work experience with the relational data modeling tool Erwin 7.0/4.2.

Experienced in handling SCDs (Slowly Changing Dimensions) using Informatica.

Over seven years of RDBMS experience using Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005/2000, MS Access 2000, SQL (DQL, DML, DDL, TCL), PL/SQL (stored procedures, functions, packages, triggers, bulk loads, autonomous transactions, exception handling), XML, SQL*Loader and SQL editors (SQL*Plus, TOAD).

Extensively worked on Data migration, Data cleansing and Data Staging of operational sources using ETL processes and providing data mining features for data warehouses.

Experience in implementing update strategies, incremental loads and Change Data Capture (CDC).

Knowledge of the complete Software Development Life Cycle (SDLC), including requirement analysis, requirement gathering, cost estimation, project management, design, development, implementation and testing.

Involved in Full Life Cycle Development (Waterfall & Agile) of building a Data Warehouse on Windows and Unix Platforms for Investment Banking, Financial, Retail, and Insurance Industries.

Experience working in UNIX environments, writing UNIX shell scripts for Informatica pre- and post-session operations.

Experience in generating reports using reporting tools COGNOS and Business Objects (BO).

Performed System Analysis and QA testing and involved in Production Support.

Excellent communication and interpersonal skills; able to work effectively both as a team member and individually.

Self-motivated individual with strong technical and analytical skills, keen to meet challenges with novel and innovative ideas.

TECHNICAL EXPERIENCE

ETL Tools : Informatica PowerCenter v9.x/8.x/7.x/6.x; Client tools – Designer, Repository Manager, Workflow Manager/Monitor, ILM tool; Server tools – Informatica Server, Repository Server Manager.

Programming & Scripting Languages : C, C++, SQL, PL/SQL

Web Technologies : HTML

Database : Oracle 7/8i/9i/10g/11g, MS SQL Server, IMS database and DB2.

Database Tools : SQL*Plus, TOAD, SQL Navigator, PL/SQL Developer, Query Analyzer

Modeling Tools : Microsoft Visio, Toad

Platforms : Windows NT/2000/XP/Vista, UNIX, Linux, Mainframe z/OS

Design Tools : Adobe Photoshop, Adobe Illustrator.

Other Tools : Microsoft Office (FrontPage, Excel, Groove, Communicator, OneNote, LiveMeeting), Beyond Compare, SmallTalk, HP Quality Center, PuTTY.

PROFESSIONAL EXPERIENCE:

CVS Caremark, Lincolnshire, IL Jul 2011 – Current

Sr. Informatica Developer

The PBM Data Masking Project's goal is to discover and mask sensitive production data in non-production environments to meet regulatory compliance standards (FTC, HIPAA, Gramm-Leach-Bliley Act, PCI) under US federal law. Sensitive data was extracted from legacy systems, masked in reference tables, and then updated back to all non-production environments, including Development, Testing and STP. Masking was done using Informatica PowerCenter, the ILM tool and PowerExchange.

Responsibilities:

  • Identified applications and sub-applications by interacting with SMEs (Subject Matter Experts) and application managers, and assisted in the creation of data discovery documents.
  • Scheduled review meetings with SMEs and the Compliance team to understand requirements, receive inputs and finalize the applications, tables and attributes in scope.
  • Prepared the high-level and detailed design documents for the masking process and received approvals from SMEs, DBAs and managers.
  • Assigned available masking business rules to the identified attributes in scope.
  • Analyzed entities and attributes for referential attributes, and analyzed application dependencies.
  • Prepared application-level data masking architectural diagrams of the data flow using MS Visio.
  • Designed seed mappings to extract data from source tables and load it into reference master tables.
  • Created ILM workflows to mask the master tables by creating the policies, rules and plans in the ILM interface for the entities and attributes.
  • Extracted data from the IMS database using PowerExchange and PowerCenter, loaded it into DB2 for masking, and updated the masked data back.
  • Developed mappings, sessions and workflows to mask data and replace the sensitive values. Used Source Qualifier, Expression, Aggregator, Lookup, Filter, Router and Update Strategy transformations to implement the masking process.
  • Implemented variables and parameters at the mapping and workflow levels.
  • Conducted unit testing (e.g. rules, groups, policies, plans and workflows) on tables and attributes with sample data and fixed errors.
  • Tested the model in the Development environment and successfully moved it into the Test and STP environments.
  • Worked closely with the testing team to analyze and fix defects, and deployed Informatica folders.
  • Scheduled and ran the extraction, masking and update processes and monitored sessions using Informatica Workflow Manager.
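The masking step above can be sketched as a flat-file analogue. This is an illustrative shell sketch only: the delimiter, column position, masking format and function name are assumptions, and the actual project masking was performed in PowerCenter/ILM, not in awk.

```shell
#!/bin/sh
# Illustrative sketch: mask a sensitive column (here, an SSN-like value
# in column 2 of a pipe-delimited extract) before the data reaches a
# non-production environment. Keeps only the last four digits so test
# teams can still correlate records.
mask_extract() {
    awk -F'|' 'BEGIN { OFS = FS }
        { $2 = "XXX-XX-" substr($2, length($2) - 3); print }' "$1"
}
```

For example, `mask_extract member_extract.txt` would turn a record like `A001|123-45-6789|Smith` into `A001|XXX-XX-6789|Smith`.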

Environment: Informatica PowerCenter v9.1/8.6, Oracle 11g, IMS Database, DB2, SQL, Flat Files, PuTTY, Windows XP (Client), Mainframe, UNIX, TOAD 9.0, MS Visio, ILM, Informatica servers on UNIX, and PowerExchange

Citi Bank, Freeport, NY Dec 2010 – Jun 2011
Sr. Informatica Developer

The Cargo On-Hand System (COS) Warehouse Management and Distribution System offers Operations a flexible, robust application that responds to the day-to-day challenges encountered in warehouse management. The application provides package-level tracking, management of multiple warehouses from a central location, and tight integration with the Freight Forwarding systems, which provide global visibility of shipment data.

Responsibilities:

  • Analyzed and understood business and customer requirements by interacting with Business Analysts, Data Modelers and Subject Matter Experts.
  • Wrote and reviewed documents such as Functional Specifications, Data Mapping Sheets and Technical Specification Documents.
  • Prepared deployment documents, estimation reports, development tracking reports and weekly status reports.
  • Established a standard code migration process.
  • Extracted data from a wide variety of sources such as flat files, XML files, relational databases (Oracle, SQL Server) and legacy mainframe systems using Informatica PowerExchange.
  • Created Informatica mappings to implement business rules for loading data, using Source Qualifier, Expression, Aggregator, Lookup, Filter, Router, Update Strategy, Normalizer, Stored Procedure, XML and Sequence Generator transformations.
  • Created user-defined functions to reuse logic across mappings.
  • Extensively used mapping parameters and mapping variables for flexibility, and parameterized the workflows for different system loads.
  • Created sessions and workflows according to the data loads into different systems.
  • Investigated incoming data from the various source systems, documented data anomalies and generated Data Quality reports.
  • Involved in different phases of testing: Unit, Functional, Integration and System testing.
  • Created review documents for the specification document and test cases.
  • Performed performance tuning of targets, sources, mappings and sessions.
  • Involved in database testing, writing complex SQL queries in SQL Developer to verify transactions and business logic, such as identifying duplicate rows.
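The duplicate-row check described in the last bullet can be sketched for flat-file extracts; in the database, the equivalent was a `GROUP BY ... HAVING COUNT(*) > 1` query. The function name here is illustrative.

```shell
#!/bin/sh
# Sketch: report every record that occurs more than once in an extract.
# sort groups identical lines together; uniq -d then prints one copy of
# each line that is repeated.
find_duplicates() {
    sort "$1" | uniq -d
}
```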

Environment: Informatica PowerCenter v8.6.1, Informatica Data Explorer (IDE) v8.6.1, PowerExchange, Oracle 11g, Autosys, Business Objects XI R3, Windows XP (Client), Informatica servers on AIX UNIX, SQL Developer, PL/SQL Developer.

Excellus BCBS, Rochester, NY Feb 2010 – Nov 2010

Informatica Developer

Operational Data Store (ODS): The ODS is built to store daily Membership and Group information from legacy systems. The ODS is used by various applications and projects, such as COBR (Coordination of Benefits – Recertification), and for reporting. It receives data from several legacy systems through a daily Kill & Fill process, and is also used by the EDW to store all historical information.

Responsibilities:

  • Involved in creating the technical architecture and design documents for small project change requests.
  • Worked directly with business users to gather and implement requirements, working closely with offshore developers.
  • Prepared user requirement documentation for mappings and additional functionality.
  • Interpreted logical and physical data models for business users to determine common data definitions and establish referential integrity of the system.
  • Analyzed the requirements and framed the business logic for the ETL process.
  • Assisted in the creation and implementation of business rules via stored procedures.
  • Extensively used PowerCenter to load data from source systems such as XML files, flat files and Excel files into staging tables, and from there into the target databases (Oracle, SQL Server).
  • Developed mappings to load data from various sources into the Data Warehouse using Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Joiner and Normalizer transformations.
  • Developed Slowly Changing Dimension mappings (Type 1 and Type 2).
  • Worked on populating the EDW / Data Marts, with multiple fact and dimension tables, using the star schema methodology.
  • Designed and developed reusable mapplets, and used mapping variables and mapping parameters in the ETL mappings.
  • Optimized mapping performance through various tests on sources, targets and transformations, and identified the different bottlenecks.
  • Involved in documenting the deployment plan, the unit test plan and other UAT-related documents.
  • Performed unit testing to identify opportunities to improve effectiveness and efficiency.
  • Performed User Acceptance Testing (UAT) to verify that the system under test met users' needs.
  • Used HP Quality Center to create and track defects.

Environment: Informatica PowerCenter v8.6, Oracle 11g/10g, Teradata 12/13, SQL Server 2005, XML Files, CSV Files, Business Objects XI R3, TWS (Tivoli Workload Scheduler), Windows XP (Client), UNIX (Solaris), SQL Developer, DB Visualizer 5.0, TOAD 9.0, SSH (Secure Shell), UltraEdit.

ETRADE, Palo Alto, CA Aug 2009 – Feb 2010

ETL Developer

ETRADE is a global financial leader, delivering value and innovation to millions of customers in more than 40 countries worldwide. The project was to enhance the Enterprise Data Warehouse to facilitate seamless access from any location. It also involved building centralized data marts for decision making and reporting in the following areas: Banking, Risk Analysis, Pricing, Retirement and Investments.

Responsibilities:

  • Worked with the Business Analyst during data analysis of the different sources.
  • Designed several technical design documents, including source-to-target mappings covering the top-level flow of data extraction and loading to the appropriate destinations.
  • Developed the data models for the Pricing and Investment data marts using Erwin.
  • Extracted data from heterogeneous sources such as Oracle, XML and flat files, performed data validation and cleansing in the staging area, then loaded the data into the Oracle 10g data warehouse.
  • Developed mappings, mapplets, sessions, workflows and shell scripts to extract, validate and transform data according to the business rules.
  • Developed complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier and Stored Procedure transformations.
  • Developed data conversion/quality/cleansing rules and executed data cleansing activities such as data consolidation, standardization, de-duplication, matching and merging using the Informatica Data Quality tool.
  • Involved in performance tuning of the database and the ETL mappings.
  • Used the Informatica Debugger to troubleshoot logical and runtime errors.
  • Performed Unit, System and Regression testing of the mappings.
  • Extensively used UNIX shell scripts to create the parameter files dynamically.
  • Used the pmcmd command to automate PowerCenter sessions and workflows from UNIX.
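The last two bullets can be sketched as follows. The folder, workflow, service and variable names below are hypothetical, and the pmcmd call is left commented out since it requires a live PowerCenter Integration Service.

```shell
#!/bin/sh
# Sketch: build a PowerCenter parameter file dynamically (here, injecting
# today's load date), then start the workflow with pmcmd. All names are
# illustrative, not the actual project objects.
PARAM_FILE=/tmp/wf_daily_load.param
cat > "$PARAM_FILE" <<EOF
[PRICING.WF:wf_daily_load]
\$\$LOAD_DATE=$(date +%Y-%m-%d)
EOF
# pmcmd startworkflow -sv INT_SVC -d DOMAIN -u "$PM_USER" -p "$PM_PASS" \
#     -f PRICING -paramfile "$PARAM_FILE" wf_daily_load
```

Writing the date at run time keeps the workflow definition static while each execution picks up the current load window from the parameter file.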

Environment: Informatica PowerCenter 8.3/8.1 (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations, Workflow Manager, Workflow Monitor), Informatica Applications, Oracle 10g, IDQ, Toad, SQL/PL-SQL, Windows NT 4.0, UNIX shell scripts, SQL Server and XML database.

MetLife Inc., Somerset, NJ Jun 2007 – Aug 2009

Informatica Developer

MetLife Insurance in the United States offers personal and institutional clients a broad array of insurance services. The project, Compliance Compensation Marketing – Life (CCM-Life), provides the CCM warehouse with feeds from the Life EDW in the form of delimited flat files such as Contract, Transaction, Benefit, Producer, Demographics, Loans and Rider. The CCM warehouse supports requirements for the USA PATRIOT Act, Know Your Customer (KYC) and Suspicious Activity Detection systems.

Responsibilities:

  • Interacted with the Business Analyst to understand and obtain application requirements.
  • Involved in preparing the software change request documents.
  • Involved in enhancing SQL queries and Java modules to fix SQL injection vulnerabilities.
  • Updated complex SQL queries and collaborated with other developers during design, development and testing.
  • Used design and development techniques that resulted in efficient, maintainable, high-quality ETL processes.
  • Involved in developing a new Java sub-module to calculate penalty interest for defaulters based on new requirements.
  • Responsible for source code and version control; involved in integration and deployment.
  • Gave demos and presentations to internal teams.
  • Involved in unit testing and system testing, and prepared test plan documents.
  • Coordinated with Quality Assurance to address bugs in the application.

Environment: Java, Eclipse, Oracle, Struts MVC, EJB, XML, WebLogic 9, Informatica PowerCenter v7.1, Toad, Windows XP, UNIX.

Life Insurance Corporation of India, Hyderabad, India Dec 2005 – Jun 2007

Informatica Developer

The Life Insurance Corporation of India (LIC) is the largest life insurance company in India. The purpose of the project was to create a customized data mart, AIP (Agency Information Program), consisting of Agency, Policy and Claims/Loss details. The data mart is in a star schema with an Agency fact table and several dimension tables; the Policy and Claims details were laid out in flat files. The information was transformed according to the business logic and loaded into the data mart.

Responsibilities:

  • Understood existing business and customer requirements.
  • Involved in designing the mappings between sources and targets and tuning them for better performance.
  • Created Informatica mappings to implement business rules for loading data, using Source Qualifier, Expression, Lookup, Filter, Router and Update Strategy transformations.
  • Involved in code migration from development to QA and production environments.
  • Extracted data from different types of flat files and relational databases (Oracle).
  • Implemented reusable transformations, mappings, user-defined functions and sessions.
  • Implemented SCD Type 1 and SCD Type 2 methodologies to keep historical data in the data warehouse.
  • Created Unit, Functional, Integration and System test cases based on Requirement Specification documents, Use Case documents, the PDM and User Interface Specifications.
  • Involved in unit testing and system testing, and prepared test plan documents.
  • Extensively used mapping parameters and mapping variables for flexibility, and parameterized the workflows for different system loads.
  • Created sessions and workflows according to the data loads into different systems.
  • Maintained coding standards and participated in code reviews.
  • Involved in performance tuning at the mapping and session levels.

Environment: Informatica PowerCenter v7.1, Oracle 7, Toad 7.6, Windows XP, UNIX.

GVK Bio Informatics, Hyderabad, India Jun 2003 – Nov 2005

Software Developer

GVK Bio Pvt. Limited is part of the GVK group and India's biggest CRO in pharma and healthcare. The project objective was to design and develop a single integrated data warehouse for reporting the organization's USFDA drug data, enabling the scientific team to access the drug data.

Responsibilities:

  • Involved in understanding requirements and analyzing new and current systems to quickly identify the required sources and targets.
  • Involved in preparing technical documentation, designed transformations consistent with the goals of the existing data warehouse, and worked with the development team to implement the solution.
  • Worked on legacy code to analyze problems, confirm requirements, and create designs, code and tests.
  • Designed, developed and tested Informatica extract/transform/load processes in a data warehouse.
  • Extensively worked on mapping parameters, workflow variables, mapping variables and session parameters.
  • Used the Workflow Manager to create workflows, worklets and tasks.
  • Wrote SQL override queries in the required transformations.
  • Used design and development techniques that resulted in efficient, maintainable, high-quality ETL processes.
  • Used problem-solving skills to quickly understand relationships between data with little documentation.

Environment: Informatica PowerCenter v6.2, PowerMart 6.2, Oracle 9i, PL/SQL, Toad 7.6, Windows XP, UNIX.