Burke Porter, an Ascential Technologies Brand

ascentialtech.com

1 Job

228 Employees

About the Company

Building on its history of leadership in the automotive industry, Burke Porter is now an Ascential Technologies company. As the innovator and clear leader in end-of-line testing with thousands of installed manufacturing systems globally, this division now extends its automotive expertise to cover product design, development, laboratory services, EV propulsion, and aftermarket products and services. As a full lifecycle partner, Ascential Technologies collects and reports on data at every stage of development, optimizing product delivery. Impossible? Done.

Listed Jobs

Company Name
Burke Porter, an Ascential Technologies Brand
Job Title
Software Engineer
Job Description
Job Title: Data Engineer

Role Summary:
Develop and maintain industrial data analytics platforms, integrating on-premises and cloud-based data systems for aerospace, automotive, medical, and industrial automation clients. Focus on scalable data pipelines, visualization, and AI/ML-driven insights to optimize performance and compliance.

Expectations:
- 5-10 years in data engineering, analytics, or industrial data roles.
- Proven expertise in cloud/on-prem data systems and cross-industry data integration.
- Ability to deliver customer-centric solutions while mentoring teams and collaborating in agile workflows.

Key Responsibilities:
- Architect and optimize data pipelines for industrial, aerospace, medical, and automotive systems (CSV, SQL, PLC logs, text).
- Design and deploy robust data collectors, preprocessors, and ingestion frameworks for cloud platforms.
- Build dashboards, reports, and AI-driven summaries for performance monitoring, anomaly detection, and predictive analytics.
- Integrate new protocols (PLC) and formats, ensuring secure, scalable, and fault-tolerant pipeline operations.
- Collaborate with clients to refine requirements, resolve data system issues, and validate solution quality.
- Implement monitoring, alerts (webhooks/thresholds), and automated reporting.

Required Skills:
- Data engineering: ELK stack, OpenSearch, OpenTelemetry, pipeline design, data cleaning/transformation.
- Cloud platforms: AWS, Azure, GCP, Elastic Cloud.
- Scripting/programming: Python, Bash, PowerShell.
- Visualization tools: Kibana, Grafana, OpenSearch Dashboards.
- Core software development: Object-oriented design, modular architecture, code reviews.

Required Education & Certifications:
- Bachelor's in Computer Science, Data Science, or Engineering.
- Proven track record in large-scale industrial or mission-critical data projects.
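The threshold/webhook alerting responsibility above can be sketched as a minimal Python function; the function name, parameters, and webhook URL are illustrative assumptions, not part of the role's actual stack:

```python
import json
import urllib.request


def check_and_alert(metric_name, value, threshold, webhook_url):
    """Post a JSON alert to a webhook when a metric exceeds its threshold.

    All names and the URL here are hypothetical; a production pipeline
    would typically route this through an alerting layer (e.g. Grafana
    or OpenSearch alerting) rather than raw HTTP calls.
    """
    if value <= threshold:
        return False  # within limits, no alert sent
    payload = json.dumps({
        "metric": metric_name,
        "value": value,
        "threshold": threshold,
    }).encode("utf-8")
    req = urllib.request.Request(
        webhook_url,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return 200 <= resp.status < 300
```

In practice the threshold comparison would run inside the monitoring stack named in the posting (Kibana/Grafana alert rules or OpenSearch monitors), with the webhook as the notification channel.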
San Diego, United States
On site
Mid level
10-02-2026