About the job
Join Our Dynamic Team!
- The Data Analytics Engineer (Data Engineer) at Toss Securities is an integral part of the Data Warehouse Team within the Data Division.
- Your focus will be on Data Platform and Data Mart, with opportunities to collaborate cross-functionally.
- Mart responsibilities include structuring and managing data from the Toss Securities domain, building data warehouse and aggregation tables to support analysis.
- Our current team of approximately 7 members brings diverse experiences ranging from 2 to 14 years, with backgrounds in various sectors such as portals, banking, gaming, and startups.
Curious About Our Data Division?
- The Data Division at Toss Securities strives to become a world-class securities firm by leveraging data technology, services, and data-driven decision-making.
- We foster close collaboration among various data roles, creating an enjoyable working environment.
- Regular Tech Weekly sessions are held to share expertise, allowing you to engage with and learn from other roles according to your interests.
Your Responsibilities Will Include:
- Designing clear, reliable table structures that are easy to understand and use: this covers architecture design, standards compliance, data processing logic management, data integrity validation, DQ monitoring, security reviews, and documentation in a metadata management system.
- Collaborating with data users to design data marts and establish pipelines for key business performance analysis.
- Setting the groundwork for effective data asset utilization through data cataloging and standard management.
- Proactively addressing essential data processing tasks in a rapidly growing service environment with your colleagues.
- Enhancing system efficiency by refactoring and optimizing existing mart tables through data modeling that considers consistency, reusability, and scalability.
- Designing data marts and constructing pipelines for external/public reporting requirements.
We Are Looking For Someone Who:
- Has a deep understanding of the securities domain or has actively engaged in stock trading.
- Can clearly define key concepts of the securities domain as a DW data modeler and take the lead in designing easy-to-understand data structures.
- Has experience simplifying complex data models or automating repetitive tasks.
- Can propose efficient data processing methods while adhering to data standards through smooth communication with various stakeholders.
- Has experience structuring enterprise tables through defining data standards and building data catalogs.
- Is capable of independently conducting data warehouse/mart modeling, pipeline construction, and operational tasks.
- Can propose standards grounded in clear data structures and efficient utilization, rather than simply processing requests as they come in.
- Is proficient in SQL and can write organized queries considering readability and efficiency.
- Has experience developing data pipelines based on Hadoop, Airflow, and DBT.
- May need intermediate to advanced PySpark skills, depending on the project.
- Would benefit from having experience with BI tools such as Tableau.
Resume Tips:
- Detail impactful projects you have worked on.
- If you have improved a service, quantify the results (omitting sensitive external information).
- Elaborate on your work related to data governance.
- Include business analysis or reporting experience.

