Chennai
Lead II - Software Testing (Python+Selenium)

Role Proficiency:

Create and organise the testing process based on project requirements, and manage test activities within the team

Outcomes:

Prepare test estimates and schedules
Ensure test coverage
Produce test results, defect reports, test logs and reports as evidence of testing
Publish RCA reports and preventive measures
Ensure quality of deliverables
Report project metrics and status
Ensure adherence to engineering practices, processes and standards
Understand and contribute to test automation/performance testing
Work with the DevOps team when required to understand the testing framework and QA process for implementing continuous testing
Manage team utilization

Measures of Outcomes:

Test script creation and execution productivity
Defect leakage metrics (% of defects leaked, % of UAT defects, % of production defects)
% of test case reuse
Test execution coverage
Defect acceptance ratio
Test review efficiency
On-time delivery
Effort variance
Test automation coverage
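
Purely as an illustration (this posting does not define the formulas, and conventions vary between organisations), the defect leakage metric above is often expressed as the share of defects that escape testing and surface later, for example in UAT or production:

# Hypothetical sketch of a defect leakage calculation; the field names and the
# exact formula are assumptions, not taken from this job description.
def defect_leakage_pct(found_in_testing: int, found_after_release: int) -> float:
    """% of all recorded defects that were found only after testing (UAT/production)."""
    total = found_in_testing + found_after_release
    return 100.0 * found_after_release / total if total else 0.0

print(defect_leakage_pct(95, 5))  # -> 5.0, i.e. 5% of defects leaked past testing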

Outputs Expected:

Supporting Organization:

Ensure utilization and quality of deliverables prepared by the team; co-ordinate Test Environment and Test Data provisioning


Test Design, Development & Execution:

Participate in reviews, walkthroughs and demos, and obtain sign-off from stakeholders; prepare Test Summary Reports for modules/features


Requirements Management:

Analyse, prioritize and identify gaps; create workflow diagrams based on Requirements/User stories


Manage Project:

Participate in test management: preparing, tracking and reporting test progress against the schedule


Domain relevance:

Identify business processes, conduct risk analysis and ensure test coverage


Estimate:

Prepare estimates and schedules; identify dependencies


Knowledge Management:

Consume, contribute and review knowledge assets (best practices, lessons learned, retrospectives)


Test Design & Execution:

Test Plan preparation, Test Case/Script Creation and Test Execution
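
Since the role title calls out Python+Selenium, the following is a minimal, hypothetical example of the kind of PyTest + Selenium script covered by Test Case/Script Creation; the URL, locator and assertions are placeholders, not taken from this posting:

# Hypothetical PyTest + Selenium smoke test; the target URL, link text and
# expected title are placeholders for illustration only.
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By

@pytest.fixture
def driver():
    options = webdriver.ChromeOptions()
    options.add_argument("--headless=new")   # run Chrome without a visible window
    drv = webdriver.Chrome(options=options)
    yield drv
    drv.quit()                               # always release the browser

def test_homepage_title(driver):
    driver.get("https://example.com")        # placeholder application URL
    assert "Example Domain" in driver.title  # placeholder expected title

def test_more_information_link_visible(driver):
    driver.get("https://example.com")
    link = driver.find_element(By.LINK_TEXT, "More information...")
    assert link.is_displayed()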


Risk Identification:

Identify risks/issues and prepare mitigation and contingency plans


Test & Defect Management:

Conduct root cause and trend analysis of the defects


Test Planning:

Identify test scenarios with an understanding of the systems, interfaces and application; identify end-to-end business-critical scenarios with minimal support; create and review test scenarios and prepare the RTM; prepare estimates (time/effort) based on requirements/user stories; identify the scope of testing


Client Management:

Define KPIs for the engagement and ensure adherence to these KPIs.


Stakeholder Connect:

Handle monthly/weekly governance calls and represent the team's issues

Skill Examples:

Ability to create, review and manage a test plan
Ability to prepare schedules based on estimates
Ability to track and report progress and take corrective measures as needed
Ability to identify test scenarios and prepare RTM
Ability to analyze requirements/user stories and prioritize testing
Ability to carry out RCA
Ability to capture and report metrics
Ability to identify Test Data and Test Environment specifications

Knowledge Examples:

Knowledge of estimation techniques
Knowledge of testing standards
Knowledge of identifying the scope of testing
Knowledge of RCA techniques
Knowledge of test design techniques
Knowledge of test methodologies
Knowledge of scope identification and planning
Knowledge of test automation tools and frameworks

Additional Comments:

Job Summary:

Responsible for the coordination and completion of software and system testing activities for projects of all sizes. Works closely with project teams to analyze requirements, provide estimates, build and execute test cases, report results, and create and maintain automated test cases.

As a Senior Data Quality Engineer, you will play a crucial role in ensuring the reliability and accuracy of our data platform and projects. Your primary responsibilities will involve developing and leading the product testing strategy, leveraging your technical expertise in AWS and big data technologies. You will also work closely with the team to implement shift-left testing using Behavior-Driven Development (BDD) methodologies integrated with AWS CodeBuild CI/CD.

Accountabilities:

Develop Product Testing Strategy: Collaborate with stakeholders to define the product testing strategy, identifying key platform and project responsibilities. Leverage your technical acumen to design a comprehensive and effective testing approach.
Lead Testing Strategy Implementation: Take charge of implementing the testing strategy and ensure its successful execution across the data platform and projects. Oversee and coordinate testing tasks to ensure thorough coverage and timely completion.
BDD and AWS Integration: Guide the team in using Behavior-Driven Development (BDD) practices to drive shift-left testing. Leverage your in-depth understanding of AWS services, including AWS DL/DW components (AWS Glue, Lambda, Airflow jobs, Athena, QuickSight, Amazon Redshift, DynamoDB, Parquet, Spark, etc.), to enhance testing effectiveness.
Reviews and approves deliverables and work products; reviews, monitors and summarizes the progress of project testing activities.
Works with the team to identify appropriate data for testing, and prepares or directs the preparation of data for test cases in the applicable test environments.
Executes test cases and documents results.
Assists application developers and technical support staff in the analysis and timely resolution of identified problems.
Creates automation engineering testing solutions.

Basic Qualifications:

Education and Experience: Bachelor's degree in Computer Science and 4 years of experience, or HS/GED and 8 years of experience.
Big Data Platform Expertise: At least 2 years of experience as a technical test lead on a big data platform, preferably with direct AWS experience. Your understanding of testing requirements at both the platform and individual project levels will be invaluable in delivering high-quality solutions.
AI/ML Experience: Familiarity with AI/ML concepts and practical experience working on AI/ML projects. Your expertise in this area will contribute to ensuring data quality in AI/ML-driven initiatives.
Synthetic Data and Test Data: Knowledge of synthetic data tooling, test data generation and best practices. Employ these skills to create realistic and meaningful test data sets.
Offshore Team Leadership: Ability to lead and collaborate with offshore teams, efficiently managing projects with limited access to real data.
Quality Focus: Advocate a quality-focused approach, emphasizing the importance of building quality into platforms and solutions, and strive for continuous improvement in data quality standards.
Strong Programming Skills: Strong object-oriented programming skills, with expertise in Python. Use this proficiency to enhance test automation and tooling.
Testing Frameworks and Tools: Familiarity with testing frameworks and tools such as PyTest, PyTest-BDD, AWS CodeBuild and Harness will help streamline the testing process.
Exceptional Communication: Communicate effectively with both technical and non-technical stakeholders, translating complex technical concepts into easily understandable terms.
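
To make the shift-left BDD and PyTest-BDD references above concrete, here is a minimal, hypothetical pytest-bdd sketch of a data quality check; the feature text, step wording and row-count check are invented for illustration and are not part of this posting:

# Hypothetical pytest-bdd example. The feature file (features/row_count.feature)
# would contain, for instance:
#   Feature: Data load validation
#     Scenario: Loaded table is not empty
#       Given a table loaded with 3 rows
#       When the row count is checked
#       Then the count should be 3
import pytest
from pytest_bdd import scenarios, given, when, then, parsers

scenarios("features/row_count.feature")   # bind all scenarios in the feature file

@pytest.fixture
def context():
    return {}                             # shared state passed between steps

@given(parsers.parse("a table loaded with {n:d} rows"))
def load_table(context, n):
    context["table"] = list(range(n))     # stand-in for rows loaded by a data pipeline

@when("the row count is checked")
def count_rows(context):
    context["count"] = len(context["table"])

@then(parsers.parse("the count should be {expected:d}"))
def verify_count(context, expected):
    assert context["count"] == expected

In a real engagement the "given" step would query the platform under test (for example Athena or Redshift) rather than building an in-memory list.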
