Milwaukee, WI, US
Cloud Integration Engineer

Komatsu is an indispensable partner to the construction, mining, forestry, forklift, and industrial machinery markets, maximizing value for customers through innovative solutions. With a diverse line of products supported by our advanced IoT technologies, regional distribution channels, and a global service network, we tap into the power of data and technology to enhance safety and productivity while optimizing performance. Komatsu supports a myriad of markets, including housing, infrastructure, water, pipeline, minerals, automobile, aerospace, electronics and medical, through its many brands and subsidiaries, including TimberPro, Joy, P&H, Montabert, Modular Mining Systems, Hensley Industries, NTC, and Gigaphoton.

Job Overview

The DevOps Engineer will work closely with our development and operations teams to design, implement, and manage the infrastructure and processes that support our software development lifecycle. The ideal candidate will have a strong background in software engineering, software architecture, and systems administration, along with experience in continuous integration and continuous deployment (CI/CD) methodologies. The role involves:

- Work within and enhance the analytics software development lifecycle and the services used for analyzing and exploring data.
- Champion development and operations support processes that leverage industry best practices and continually improve the robustness and supportability of the analytics platform used by global JoySmart teams, customers, and factory engineering and product support teams.
- Ensure that the analytics platform continues to meet performance needs as it scales with data volume and user base.
- Lead the day-to-day operation, maintenance, and monitoring of the analytics platform and of the services that collect and ingest data from various sources into the big data platform, including batch processing, API integration, database replication, real-time streaming, and data connectors.
- Ensure data quality, compliance, and security by setting access controls, monitoring data usage, managing metadata, and maintaining backup and disaster recovery strategies.
- Monitor system health and resource utilization, identify bottlenecks, and use alerts and logs to troubleshoot issues and ensure the reliability and availability of data systems.

Key Job Responsibilities

- Automate infrastructure deployment using the "Infrastructure as Code" methodology (e.g., Bicep templates).
- Implement and manage CI/CD pipelines in Azure DevOps for automated testing, building, and deployment.
- Design, deploy, and maintain secure, scalable, and reliable infrastructure solutions.
- Collaborate with development, analytics, and customer teams to streamline code releases and ensure smooth delivery.
- Monitor, troubleshoot, and optimize application performance and system health.
- Develop and maintain APIs for seamless integration and functionality.
- Design and implement ETL tasks in Spark to cleanse, process, and load data efficiently.
- Enhance security by integrating best practices into development and deployment processes.
- Leverage tools like Wiz to monitor and improve cloud security posture.
- Conduct root cause analysis for production issues and implement permanent fixes.
- Manage system configurations using Azure Policy and Azure Automation to ensure consistency and compliance.
- Configure and manage user and application lifecycle processes, including RBAC permissions, using Entra ID.
- Implement alerting and monitoring systems for rapid issue detection and resolution.
- Develop, maintain, and update complex architectural diagrams to support infrastructure planning.
- Document processes, configurations, and infrastructure changes comprehensively.
- Optimize SQL and Snowflake queries to improve performance and reduce costs while ensuring data governance policies are followed.
- Deploy and manage Dash Enterprise for advanced analytics and visualization.
- Develop and maintain automation scripts in Shell and Python to streamline processes and enhance productivity.

Qualifications/Requirements

- A minimum of a Master's degree in Computer Science, IT, Information Systems, Engineering, or equivalent, and 7+ years of progressive development experience with data systems, analytics, and big data.
- Proven experience as a DevOps Engineer or in a similar software engineering role.
- Proficiency with version control systems (e.g., Git).
- Experience with CI/CD tools, specifically Azure DevOps.
- Solid knowledge of scripting languages (e.g., Python, Bash, PowerShell), with a focus on Azure CLI and Azure PowerShell.
- Solid experience with Spark, Spark SQL, and Databricks.
- Experience with configuration management tools (e.g., Bicep, Ansible, Chef, Puppet).
- Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes), especially within the Azure Kubernetes Service (AKS) ecosystem.
- Solid administration experience with Dash Enterprise, InfluxDB, Grafana, Prometheus, Snowflake, Confluent Cloud, and Azure.
- Solid experience with NiFi for developing ETL tasks.

Preferred Qualifications

- Strong experience with data cataloging tools; Microsoft Purview preferred.

- Preference for experience with big data applications involving complex equipment and mining.
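Several of the responsibilities above center on ETL: cleansing, processing, and loading equipment data. As an illustrative sketch only, here is the cleanse/process/load pattern in plain Python. The posting calls for Spark/Databricks in production, and the field names (`machine_id`, `payload_tons`) are hypothetical, not taken from Komatsu's systems:

```python
import csv
import io

def cleanse(rows):
    """Drop records missing required fields and normalize whitespace."""
    for row in rows:
        if not row.get("machine_id") or row.get("payload_tons") in (None, ""):
            continue  # skip incomplete telemetry records
        yield {k: v.strip() if isinstance(v, str) else v for k, v in row.items()}

def process(rows):
    """Aggregate payload per machine (the 'transform' step)."""
    totals = {}
    for row in rows:
        machine = row["machine_id"]
        totals[machine] = totals.get(machine, 0.0) + float(row["payload_tons"])
    return totals

def run_etl(raw_csv):
    """Extract from CSV text, cleanse, and transform; a real 'load' step
    would write the result to the warehouse instead of returning it."""
    rows = csv.DictReader(io.StringIO(raw_csv))
    return process(cleanse(rows))

# Hypothetical sample feed: one record has no machine_id and is dropped.
sample = (
    "machine_id,payload_tons\n"
    "P&H-4100,210.5\n"
    ",99\n"
    "P&H-4100, 189.5\n"
    "KOM-930E,240.0\n"
)
print(run_etl(sample))  # {'P&H-4100': 400.0, 'KOM-930E': 240.0}
```

In a Spark job the same steps would typically map to `dropna`/filter transformations and a `groupBy().sum()` aggregation over a DataFrame rather than a Python dictionary.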

 

Komatsu is an Equal Opportunity Workplace and an Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status.
