MEENA

Professional Summary:

·  7 years of IT experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, and Client/Server applications.

·  Experience using query tools for Oracle, Teradata, DB2, Sybase and MS SQL Server to validate reports and troubleshoot data quality issues.

·  Information technology professional with 7+ years of experience in requirements gathering, analysis, documentation, testing, implementation, and maintenance. Specialized in quality assurance and testing; worked as an ETL Tester, BI Tester, Data Tester, and QA Analyst.

·  Worked with Teradata stored procedures, standard tables and ETL processes.

·  Solid Back End Testing experience by writing and executing SQL Queries.

·  Experience in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.

·  Database/ETL/Reports testing: validated that the data derived in the target table adheres to standards and business rules. Played a major role in test data creation and maintenance. Verified whether there were any downstream impacts in the warehouse.

·  Expertise in test case design, test tool usage, test execution, and defect management. Experienced in UNIX shell scripting and configuring cron jobs to schedule Informatica sessions.

·  Experience in testing XML files and validating the data loaded to staging tables.

·  Experience in testing and writing SQL and PL/SQL statements.

·  Extensively used ETL methodology to support data extraction, transformation, and loading in a corporate-wide ETL solution using Informatica.

·  Expertise in testing complex business rules by creating mappings and various transformations.

·  Strong working experience on DSS (Decision Support Systems) applications, and Extraction, Transformation and Load (ETL) of data from Legacy systems using Informatica.

·  Extensively worked with reporting and Business Intelligence tools like Hyperion Financial Reports, Hyperion Interactive Reporting, Cognos Reports, and Business Objects reports.

·  Experience in writing Test Plan and Test Strategies for Data Warehousing Applications.

·  Extensive knowledge in preparing reports with pivots, charts, dashboards, and queries using Hyperion Interactive Reporting Studio.

·  Created and ran automated test scripts for functional, regression, and performance testing using QTP and LoadRunner, with Quality Center for defect tracking.

·  Professional experience in Financial, Banking & Brokerage applications and a strong understanding of Mortgage lending applications.
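
As an illustrative sketch of the back-end testing summarized above — reconciling a source table against its ETL target with SQL — the following uses SQLite in Python as a stand-in for the Oracle/Teradata databases; all table and column names (`src_orders`, `tgt_orders`) are hypothetical:

```python
import sqlite3

# Back-end ETL validation sketch: reconcile a source table against its target
# after a load. Table/column names are invented; SQLite stands in for the
# Oracle/Teradata databases referenced in the summary.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0);  -- row 3 not loaded
""")

# Row-count reconciliation between source and target
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]

# EXCEPT (MINUS on Oracle) surfaces rows present in source but missing
# from the target, i.e. records dropped by the load
missing = cur.execute("""
    SELECT order_id, amount FROM src_orders
    EXCEPT
    SELECT order_id, amount FROM tgt_orders
""").fetchall()

print(src_count, tgt_count, missing)  # 3 2 [(3, 30.0)]
```

Count reconciliation plus a minus-style query is a common first pass for catching rows silently dropped by a load.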

Technical Skills:

Testing Tools: Quality Center 10.0/9.2, Test Director 8.0/7.6/7.2, WinRunner 7.6/7.0/6.5, QTP 9.2/9.0/8.2, LoadRunner 7.6/7.2, Rational ClearQuest/ClearCase, Rational Robot

Languages: Java, Shell Scripting, SAS, XML, HTML, SQL, XSLT, TSL, JavaScript, VBScript

Hardware: HP-9000 Series, IBM Compatible PC Pentiums

Operating Systems: Windows NT/2000/XP, UNIX, Linux

Databases: MS Access 2000, Teradata V2R6, SQL Server, Oracle, Sybase, Informix, DB2

ETL Tools: Informatica PowerCenter 7.1.3/8.2, Ab Initio (GDE 1.14, Co>Op 2.14)

Browsers: Internet Explorer, Netscape Navigator, Firefox

Bug Tracking Tools: Rational ClearQuest, PVCS Tracker, Bugzilla

BI Tools: Hyperion Financial Reports (HFR), Hyperion Interactive Reporting, Cognos 7.3 Series, Business Objects 6.0/XIR2

Tools/GUI: Visual Basic 5.0/6.0, Crystal Reports 4.6/6.0, ERwin, ER Tools, Visio

App/Web Servers: Apache, Tomcat, WebLogic, WebSphere

Professional Experience:

WellPoint, Richmond, VA Feb/13 – Present

Sr. Quality Analyst /ETL

Responsibilities:

·  Reviewed the Business Requirement Documents and the Functional Specification.

·  Prepared Test Plan from the Business Requirements and Functional Specification.

·  Extensively worked with Metadata Team and managed Metadata documentation in compliance with Data standards

·  Extensively worked with ASG Rochade to capture, update, create, and review metadata

·  Extensively involved in writing test cases, test scripts, and test plans, executing test cases, and reporting and documenting test results using Mercury Quality Center

·  Involved in designing the Data Mart model with Erwin using Star Schema methodology.

·  Extensively involved in loading flat files, XML feeds, and data from IMS mainframes into databases such as Oracle 10g and SQL Server 2005

·  Worked as ETL Tester responsible for the requirements / ETL Analysis, ETL Testing and designing of the flow and the logic for the Data warehouse project.

·  Extensively used Ab Initio for extraction, transformation and loading process.

·  Preparation of System Test Results after Test case execution.

·  Performed testing based on Change Requests and Defect Requests.

·  Wrote several UNIX scripts for invoking data reconciliation.

·  Experienced in writing complex SQL queries for extracting data from multiple tables.

·  Extensively executed T-SQL queries to verify successful data transactions and to validate data in the SQL Server database.

·  Verified that all data remained synchronized after troubleshooting, and used SQL to verify/validate test cases.

·  Extensively wrote Teradata SQL queries and created tables and views following Teradata best practices.

·  Testing the ETL data movement from Oracle Data mart to Teradata Data mart on an Incremental and full load basis.

·  Tested the ETL Ab Initio graphs and other ETL Processes (Data Warehouse Testing)

·  Extracted data from Oracle and loaded it into Teradata tables using the Teradata FastLoad utility.

·  Involved in developing test cases to test Teradata scripts (BTEQ, MultiLoad, FastLoad). Updated the QA team on testing status and regularly reported accomplished tasks for the assigned work to the Project Management team.

·  Tested graphs for extracting, cleansing, transforming, integrating, and loading data using Ab Initio ETL Tool

·  Used TOAD to perform manual tests on a regular basis. Used UNIX and Oracle in this project to write shell scripts and SQL queries.

·  Wrote SQL queries to validate source data versus data in the data warehouse including identification of duplicate records.

·  Responsible for reporting on error exceptions and health checks (including performance and availability), reporting on production support issues, documenting the monitoring process, and reporting out to DBO Data Quality.

·  Involved in documenting the processes for monitoring error exceptions and for production support, and participated in weekly communications to users on production support issues.

·  Review any data issues reported and escalate to IT for resolution as needed.

·  Worked with IT to determine the correct course of action for any production support issue or alert; monitored alerts and reports sent to the mailbox and monitored all user tables.

·  Created and maintained the User Guide for the database; notified end users of downtime, issues, new projects, etc.

·  Worked with users and IT to ensure users had access to data, maintained the end-user distribution list, and consulted with end users on how to use the data.

·  Provided requirements around data quality as it pertains to ETL.

·  Reviewed and approved business and technical requirements for data quality.

·  Wrote test cases for ETL to compare source and target database systems.

·  Tested records with logical deletes using flags

·  Interacting with senior peers or subject matter experts to learn more about the data

·  Identifying duplicate records in the staging area before data gets processed

·  Prepared Test status reports for each stage and logged any unresolved issues into Issues log.

·  Used an ETL data transformation tool with Business Objects for data mining and front-end reporting

·  Responsible for testing Business Reports developed by Business Objects XIR2

·  Tested Business Objects reports for verifying and validating the data. Tested several complex reports generated by reporting tool including Dashboard, Summary Reports, Master Detailed, Drill Down and Score Cards

·  Performed Manual Functional testing with QC and regression testing with QTP.

·  Tracked defects in Quality Center
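
The staging-area duplicate check described above can be sketched as follows; `stg_member` and its columns are invented names, and SQLite in Python stands in for the project's actual databases:

```python
import sqlite3

# Sketch of identifying duplicate records in a staging table before the
# data gets processed downstream. Table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE stg_member (member_id INTEGER, name TEXT);
    INSERT INTO stg_member VALUES (101, 'A'), (102, 'B'), (102, 'B'), (103, 'C');
""")

# GROUP BY / HAVING flags any natural key that was loaded more than once
dupes = cur.execute("""
    SELECT member_id, COUNT(*) AS cnt
    FROM stg_member
    GROUP BY member_id
    HAVING COUNT(*) > 1
""").fetchall()

print(dupes)  # [(102, 2)]
```

Running this before the load proceeds lets duplicates be rejected or deduplicated instead of propagating into the warehouse.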

Environment: Ab Initio CO>OP 2.15, GDE 1.15, EME, ASG Rochade, Erwin 4.1, Oracle 10g, Teradata V2R6, Teradata SQL Assistant 12.0, Data Profiler (homegrown), SQL Server 2005, T-SQL, SQL, PL/SQL, MS Suite (MS Access, MS Excel, MS PowerPoint, MS Visio), TOAD, SAS, QTP, Quality Center 10.0, Business Objects XIR2, SharePoint, Windows XP

Broadridge Financials, Jersey City, NJ Sep/11 – Jan/13

ETL QA Tester

Responsibilities:

·  Analyzed business requirements and the Design Specification Document to determine the functionality of the ETL processes.

·  Involved in the entire software development Life Cycle of the projects from Initiation to Implementation.

·  Tested the ETL Informatica process for various data loading scenarios.

·  Involved in developing the UNIX scripts for Informatica Workflows using parameter files and monitoring the workflows on testing environment.

·  Involved in SQL Optimizations, Performance Analysis, and future growth analysis for OLTP and data warehouse applications

·  Involved in creating source to target mapping, edit rules and validation, transformations, and business rules.

·  Tested multiple Informatica mappings for data validation and data conditioning.

·  Created test plan and test cases from the business requirements to match the project’s initiatives in Quality Center.

·  Worked with source system personnel on source data delivery issues. Interacted with the DBA to set up the test environment.

·  Developed UNIX and AutoSys scripts to load extract files into staging tables using sqlldr.

·  Tested PL/SQL Packages to transform/load the staging data into MIS schema using business logic.

·  Involved in Regression, Functional, Integration, System Testing and User Acceptance Testing.

·  Performed database validation according to the business logic by comparing the source to the target in spreadsheets.

·  Interacted with developers & various members of team to discuss and resolve defects and their priorities.

·  Attended bi-weekly Defect Review and Defect Status meetings. Provided updates to management on testing status and other issues.

·  Worked with Teradata utilities including TPUMP, FLOAD, FEXPORT, and MLOAD, as well as Teradata SQL Assistant

·  Converted SQL query results into Perl variables.

·  Used Perl to automate multiple types of modules at once.

·  Worked with Users to develop Test cases for user acceptance testing.

·  Maintained the test logs, test reports, test issues, defect tracking using Quality Center.

·  Perform Data Quality integrity testing (completeness, conformity, consistency, accuracy, duplicates).

·  Did data validation on Teradata using SQL Assistant and BTEQ. Improved query performance.

·  Verified the data in DW tables after extracting the data from source tables by writing SQL queries using TOAD

·  Tested SQL queries, PL/SQL scripts to validate the data and performance of the database.

·  Understood the business/transformation rules involved in developing the source-to-target mappings, wrote complex SQL queries to reflect those rules, and validated the mappings by running these test scripts and checking that the data in the target tables populated correctly

·  Personalized Cognos Connection display options to test regional options and personal information.

·  Created Jobs to Schedule multiple reports in Cognos Connection.

·  Troubleshot data and Cognos 8.0 report formatting issues.

·  Wrote several complex SQL queries for validating Cognos reports.

·  Used Query Studio to test ad hoc reports

·  Set up the Automated Testing Environment for creating, and running automated tests using QTP.
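
As an illustration of validating one source-to-target mapping rule of the kind described above, the sketch below re-derives a target column from the source and diffs it against what was loaded; every name and the TRIM-and-concatenate rule itself are invented for the example, with SQLite standing in for the project's databases:

```python
import sqlite3

# Validate one hypothetical transformation rule from a mapping document:
# tgt.full_name should equal TRIM(src.first) || ' ' || TRIM(src.last).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_cust (id INTEGER, first TEXT, last TEXT);
    CREATE TABLE tgt_cust (id INTEGER, full_name TEXT);
    INSERT INTO src_cust VALUES (1, ' Ann ', 'Lee'), (2, 'Bob', ' Ray ');
    INSERT INTO tgt_cust VALUES (1, 'Ann Lee'), (2, 'Bob Ray!');  -- row 2 bad
""")

# Re-derive the expected value from source and report rows where the
# loaded target value disagrees with the mapping rule
mismatches = cur.execute("""
    SELECT s.id, TRIM(s.first) || ' ' || TRIM(s.last) AS expected, t.full_name
    FROM src_cust s
    JOIN tgt_cust t ON t.id = s.id
    WHERE t.full_name <> TRIM(s.first) || ' ' || TRIM(s.last)
""").fetchall()

print(mismatches)  # [(2, 'Bob Ray', 'Bob Ray!')]
```

The pattern generalizes: encode each documented transformation as an expression over the source, join to the target on the business key, and report disagreements.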

Environment: Informatica 8.1/7.5.2, Teradata V2R6, SQL Assistant, BTEQ, FLOAD, FEXPORT, MLOAD, SQL Assistant 6.0, Oracle 10g, Cognos BI 8 series, SQL SERVER 2008, HP Quality Center 9.0, QTP 9.0, PERL, SQL/PLSQL, ASP.Net, C#, TOAD, XML, XSLT, XML Spy 2008, Korn Shell Scripts, UNIX, Windows XP

Partners Health Care, Boston MA Mar/09 – Aug/11

DWH/ETL/SQL Tester

Responsibilities:

·  Analyzed business requirements and the Design Specification Document to determine the functionality of the ETL processes.

·  Involved in the entire software development Life Cycle of the projects from Initiation to Implementation.

·  Worked with various EDI transactions such as 834, 835, 837, 276, and 277 in accordance with HIPAA standards

·  Tested the ETL Informatica process for various data loading scenarios.

·  Involved in developing the UNIX scripts for Informatica Workflows using parameter files and monitoring the workflows on testing environment.

·  Involved in SQL Optimizations, Performance Analysis, and future growth analysis for OLTP and data warehouse applications

·  Involved in creating source to target mapping, edit rules and validation, transformations, and business rules.

·  Tested multiple Informatica mappings for data validation and data conditioning.

·  Created test plan and test cases from the business requirements to match the project’s initiatives in Quality Center.

·  Worked with source system personnel on source data delivery issues. Interacted with the DBA to set up the test environment.

·  Developed UNIX and AutoSys scripts to load extract files into staging tables using sqlldr.

·  Tested PL/SQL Packages to transform/load the staging data into MIS schema using business logic.

·  Involved in Regression, Functional, Integration, System Testing and User Acceptance Testing.

·  Performed database validation according to the business logic by comparing the source to the target in spreadsheets.

·  Interacted with developers & various members of team to discuss and resolve defects and their priorities.

·  Attended bi-weekly Defect Review and Defect Status meetings. Provided updates to management on testing status and other issues.

·  Worked with Users to develop Test cases for user acceptance testing.

·  Maintained the test logs, test reports, test issues, defect tracking using Quality Center.

·  Involved in preparing the Requirements Traceability Matrix (RTM), software metrics, defect reports, weekly status reports, and the SQA report using Quality Center

·  Perform Data Quality integrity testing (completeness, conformity, consistency, accuracy, duplicates).

·  Did data validation on Teradata using SQL Assistant and BTEQ. Improved query performance.

·  Verified the data in DW tables after extracting the data from source tables by writing SQL queries using TOAD

·  Tested SQL queries, PL/SQL scripts to validate the data and performance of the database.

·  Understood the business/transformation rules involved in developing the source-to-target mappings, wrote complex SQL queries to reflect those rules, and validated the mappings by running these test scripts and checking that the data in the target tables populated correctly.
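
The data-quality integrity dimensions listed above (completeness, conformity, consistency, accuracy, duplicates) can be illustrated with checks on two of them; the sample rows and field names below are invented for the example:

```python
import re

# Two sample data-quality checks on hypothetical member records:
# - completeness: every record must carry a member_id
# - conformity:   dates of birth must follow the YYYY-MM-DD format
rows = [
    {"member_id": "M001", "dob": "1980-05-12"},
    {"member_id": "",     "dob": "1975-01-30"},  # fails completeness
    {"member_id": "M003", "dob": "12/04/1990"},  # fails conformity
]

completeness_fails = [r for r in rows if not r["member_id"]]
conformity_fails = [r for r in rows
                    if not re.fullmatch(r"\d{4}-\d{2}-\d{2}", r["dob"])]

print(len(completeness_fails), len(conformity_fails))  # 1 1
```

In practice such rules run as SQL against staging tables, but the structure — one explicit predicate per quality dimension, reported per record — is the same.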