- Company Name: LinkedIn
- Job Title: Staff AI Engineer, AI Privacy Specialist
- Job Description:
Role Summary:
Drive research, development, and deployment of privacy‑preserving techniques for large‑scale AI systems. Lead technical initiatives to embed differential privacy, federated learning, and secure computation into production pipelines, ensuring compliance with regulatory and internal privacy standards while maintaining model utility and robustness.
Expectations:
- Deliver production‑ready privacy methods and libraries.
- Provide technical leadership, mentorship, and cross‑functional collaboration.
- Translate regulatory requirements into effective safeguards.
- Publish findings to influence product strategy and policy.
Key Responsibilities:
- Conduct independent research on state‑of‑the‑art differential privacy, secure computation, and privacy‑preserving ML.
- Evaluate, adapt, and implement algorithms that balance data utility and privacy guarantees.
- Design and prototype privacy‑preserving ML models (DP, federated learning, secure aggregation) for enterprise scale.
- Develop evaluation frameworks to quantify fidelity, utility, and privacy of datasets and models.
- Build internal libraries, evaluation suites, and documentation for engineering adoption.
- Measure and mitigate model robustness against privacy attacks (membership inference, model inversion, memorization).
- Define privacy standards, threat models, and audit procedures across the ML lifecycle.
- Collaborate with Security, Policy, Product, and Legal teams to align safeguards with evolving regulations.
- Mentor and lead a small team of engineers, fostering a culture of rigorous, innovative privacy engineering.
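To illustrate the kind of production-ready privacy method this role delivers, here is a minimal sketch of the Laplace mechanism applied to a count query. It assumes a sensitivity-1 query (adding or removing one record changes the count by at most 1); the function names are illustrative, not part of any internal library.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sampling from the Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(values, threshold: float, epsilon: float) -> float:
    # Counts values above the threshold, then perturbs the result.
    # Sensitivity of a count query is 1, so Laplace noise with
    # scale 1/epsilon yields an epsilon-differentially-private answer.
    true_count = sum(1 for v in values if v > threshold)
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller epsilon means stronger privacy but noisier answers; choosing that trade-off per query is exactly the utility/privacy balancing the responsibilities above describe.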
Required Skills:
- Deep expertise in differential privacy, federated learning, secure aggregation, and related privacy techniques.
- Strong AI/ML engineering background with experience deploying large language models.
- Proficiency in Python, distributed systems, and cloud ML platforms.
- Ability to design and perform rigorous privacy and utility evaluations.
- Excellent communication for cross‑functional collaboration and technical mentorship.
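As a concrete example of the privacy evaluations mentioned above, a simple loss-threshold membership-inference test measures how well an attacker can tell training members from non-members by their loss values. This is a hedged sketch: it assumes per-example losses are already computed, and the function name is illustrative.

```python
def membership_attack_accuracy(member_losses, nonmember_losses, threshold: float) -> float:
    # Loss-threshold membership-inference attack: examples with loss
    # below the threshold are predicted to be training members.
    # Accuracy near 0.5 suggests low leakage; near 1.0, high leakage.
    correct = sum(1 for loss in member_losses if loss < threshold)
    correct += sum(1 for loss in nonmember_losses if loss >= threshold)
    return correct / (len(member_losses) + len(nonmember_losses))
```

Tracking this score before and after applying mitigations (e.g. differentially private training) gives a quantitative view of robustness against the membership-inference attacks listed in the responsibilities.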
Required Education & Certifications:
- Bachelor’s or Master’s degree in Computer Science, Electrical Engineering, Data Science, or a related technical discipline (equivalent practical experience acceptable).
- Ph.D. in Privacy, Security, Trust, or a closely related field preferred.
- Demonstrated experience in privacy‑preserving ML research and engineering.