Golden Gate Ventures


Senior Data Analyst at Carousell
Love buying and selling on Carousell? Then meet the team that handcrafts various parts of the mobile applications, website and backend systems to deliver the best user experience. Here at Carousell, our engineering team works on a myriad of problem domains. You get to build the simplest buying and selling experience on our mobile applications, dive deep into the database systems that power the business, or work on tools that empower the rest of the teams in Carousell. Every month, we organise an engineering day with different topics, ranging from product hackdays to a Swift workshop run by engineering team members, to keep our minds sharp.

Ensuring that the user experience stays simple is complicated - and we take pride in our work to keep things that way.


Responsibilities:

Lead key practices in applying proper design of experiments for two-sided marketplaces
Work with product teams to understand their experimentation needs and design statistically sound experiments, along with analysis and measurement plans
Evaluate experiment results and validate the statistical significance and accuracy of reported numbers
Work with backend and system engineers to scale the testing process
Build relationships with business stakeholders across the Carousell organisation, understand their needs and data requirements, and provide relevant data reports, visualisations, analyses and meaningful insights in a timely manner
Support the various business functions of the company with the maintenance and generation of business management and operational reports
Spot patterns and trends in our data and propose solutions that respond to those trends and insights
Identify and resolve gaps and issues in our data, and propose long-term solutions that improve data integrity from the moment data is collected through ingestion, storage and processing
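To make the experiment-evaluation responsibility above concrete, here is a minimal sketch of validating statistical significance for a simple two-variant A/B test using a two-proportion z-test. The conversion counts are invented for illustration; this is not Carousell's actual tooling.

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates
    between a control group (a) and a variant group (b)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 10.0% vs 13.0% conversion, 1,000 users per arm
z, p = two_proportion_z_test(100, 1000, 130, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05, so significant at the 5% level
```

In practice one would also pre-register the metric, sample size and significance threshold before the experiment starts, rather than checking significance after the fact.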

You are someone who:

Has deep expertise in the statistics required to design and evaluate experiments
Is hands-on in the implementation of experiments
Is data-driven and passionate about solving problems through data
Is inquisitive and curious to delve deep into data to investigate trends or anomalies
Is detail-oriented and able to work efficiently in a fast-paced team environment
Can visualize and communicate findings and data insights to various stakeholders in a coherent and logical manner
Is keen on data technologies and picking up new skills and tools along the way
Has strong critical-thinking skills and the ability to frame issues logically

Preferred Skills:

Solid experience in designing, implementing, and evaluating statistical experiments (working knowledge of quasi-experiments is a plus) to help guide product owners in evaluating the success of their products
Solid experience with common analytics and visualisation tools (SQL, Python, and R) for ad hoc analysis to extract insights
Possess deep subject matter expertise in at least one key area e.g. Causal Inference, Design of Experiments, Econometrics
Expertise using Big Data ecosystem tools e.g. BigQuery, Hadoop, Pyspark, Sqoop, and MLlib
Comfortable with distilling complicated analyses into an accessible format for non-technical audiences (e.g. white papers, presentations, documents)
Strong communication skills (fluent spoken English, visual presentation skills, effective written communication skills)
Excellent interpersonal skills, with the ability to work in a team and communicate with business stakeholders and other analysts
A significant, demonstrated track record in analytics, supported by concrete examples of experience
Commercial acumen to understand and define the business significance of various parcels of data and analysis
Demonstrated ability to work effectively in a team
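As a flavour of the ad hoc SQL analysis mentioned in the skills above, the sketch below runs a marketplace-style aggregate (sell-through rate and average price per category) against an in-memory SQLite table. The schema and numbers are purely illustrative assumptions, not Carousell's data model.

```python
import sqlite3

# Hypothetical listings table; schema and rows are illustrative only.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE listings (category TEXT, price REAL, sold INTEGER)")
con.executemany(
    "INSERT INTO listings VALUES (?, ?, ?)",
    [("electronics", 250.0, 1), ("electronics", 120.0, 0),
     ("fashion", 35.0, 1), ("fashion", 60.0, 1), ("books", 12.0, 0)],
)

# Ad hoc aggregate: sell-through rate and average price per category
rows = con.execute("""
    SELECT category,
           AVG(sold)  AS sell_through_rate,
           AVG(price) AS avg_price
    FROM listings
    GROUP BY category
    ORDER BY sell_through_rate DESC
""").fetchall()

for category, rate, avg_price in rows:
    print(f"{category:12s} sell-through={rate:.2f} avg_price={avg_price:.2f}")
```

The same query shape translates directly to warehouse tools such as BigQuery for larger-scale analysis.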

Qualifications and Experience:

Bachelor's or Master's degree in a quantitative subject, e.g. Economics, Mathematics, Engineering, Computer Science, or the Physical Sciences
5 years of experience performing exploratory and summary analytics
5 years of experience using specialized tools and / or programming languages for batch and ad-hoc data analysis
3 years of experience implementing traditional data Extract, Transform, and Load (ETL) processes
3 years of experience designing, implementing, and evaluating statistical experiments to guide decision-making
3 years of big data analytics experience
3 years of experience with using and setting up analytics products and platforms