About Us:
BigID is a pioneering tech startup specializing in cutting-edge solutions for data security, compliance, privacy, and AI data management. We are at the forefront of the data landscape, empowering our customers to mitigate risks, foster business innovation, achieve compliance, build trust, make informed decisions, and maximize the value of their data.
We are committed to building a global team united by a passion for innovation and advanced technology. BigID has received numerous accolades, including:
- Named a Hot Company in Artificial Intelligence and Machine Learning at the Global InfoSec Awards
- Listed in Citizens JMP Cyber 66 as one of the Hottest Privately Held Cybersecurity Companies
- Recognized on the CRN 100 list as one of the 20 Coolest Identity Access Management and Data Protection Companies for three consecutive years
- Ranked among the DUNS 100 Best Tech Companies to Work for
- Featured as a Top 3 Big Data and AI Vendor to Watch in the 2023 BigDATAwire Readers' and Editors' Choice Awards
- Included in the 2024 Inc. 5000 list for the fourth consecutive year!
- Shortlisted for the 2024 AI Awards in the Best Use of AI in Cybersecurity category
At BigID, our team is the cornerstone of our success. Join our dynamic, people-centric culture where you’ll have the opportunity to collaborate with some of the most talented professionals in the industry who prioritize innovation, diversity, integrity, and teamwork.
Who We Are Looking For:
We are looking for a Senior Data Platform Engineer to strengthen our Data Platform team. The ideal candidate will have substantial data engineering experience, particularly with Kafka and Elasticsearch, and will design and maintain our robust data platform. You will collaborate closely with cross-functional teams to ensure the scalability and reliability of our data solutions.
Role Overview:
As a Senior Data Platform Engineer, you will design, develop, maintain, and troubleshoot our big data architecture. Your proficiency in Elasticsearch, Kafka, and Node.js will play a vital role in ensuring the scalability and performance of our data systems.
Key Responsibilities:
- Develop data processing pipelines utilizing Kafka for real-time data streaming.
- Enhance and manage search functionality built on Elasticsearch.
- Work alongside product managers, data analysts, and stakeholders to gather requirements and translate them into technical specifications.
- Lead code reviews and promote best practices in coding and data handling.

