OnlyDataJobs.com

Epidemic Sound AB
  • Stockholm, Sweden

At Epidemic Sound we are reinventing the music industry. Our carefully curated music library, with over 30,000 tracks, is tailored for storytellers no matter what their story is. Countless customers around the world, from broadcasters and production companies to YouTubers, rely on our tracks to help them tell their stories. Epidemic Sound’s music is heard in hundreds of thousands of videos on online video platforms such as YouTube, Facebook, Twitch and Instagram. Our HQ is located in Stockholm, with offices in NYC, LA, Hamburg, Amsterdam and Madrid. We are growing fast, we have lots of fun and we are transforming the music industry.


We are now looking for a Software Engineer!


Job description


In addition to the hundreds of thousands of storytellers using our products, our music is heard tens of billions of times every month across YouTube, social media & music streaming platforms. We want to make maximum use of all this data to generate insights and enable data-driven decisions, both in our product development and for our musicians and business stakeholders.


As such, we are now looking to grow our Data Infrastructure Team with an experienced software engineer to help us design, develop and evolve our platform and products.


You can expect to:



  • Collaborate with our product-oriented teams during data-related projects to achieve reliable and scalable solutions

  • Design, develop and evolve the data pipelines that fuel our back-end services, machine learning platform and business intelligence systems.

  • Contribute to all stages of the product life-cycle: Design, implementation, testing, releasing and maintenance

  • Work in a lightweight agile process where we regularly evaluate how we work and try to improve

  • Collaborate constantly: We’re big believers in teamwork and the value of practices like careful code reviews, pair (or mob) programming, etc.

  • Learn a ton of new things: Be it through hack-days, courses, conferences or tech-talks, we emphasize learning and we also expect you to share your knowledge with your colleagues

  • Have fun and take a lot of pride in what you and all of Epidemic Sound are doing


What are we looking for?



  • You have a great understanding of modern web architectures, distributed systems, data processing and storage solutions

  • You have strong knowledge of relational databases and SQL

  • You are able to find opportunities for data acquisition and new uses for existing data

  • You enjoy working with multiple stakeholders and feel comfortable prioritizing internal and external requirements

  • You love teamwork

  • You speak English with professional working proficiency (Swedish is not a requirement)


It would also be music to our ears if:



  • You have experience in one or more of the following: Python, AWS ecosystem, Google Cloud Platform

  • You have experience working with data processing and querying engines such as Spark, Hadoop, BigQuery or Kafka

  • You have experience with multiple storage solutions such as column-oriented databases, NoSQL or graph databases


Application


Do you want to be a part of our fantastic team? Send us your cover letter and CV as soon as possible - interviews are held continuously. We look forward to hearing from you!

iMoney Group
  • Kuala Lumpur, Malaysia
  • Salary: $84k - 96k

Reporting to the CEO, as iMoney’s Head of Data Science, you will be the guru for the full suite of data-related services supporting the organization, including reporting and business intelligence, data analytics, data science and the overall data infrastructure.



  • Unlock the full potential of the huge amounts of data that has been and is continuously being collected at iMoney

  • Craft the data vision and own, identify and implement the data analytics and data science roadmap for the iMoney Group across all business areas

  • Be the strategic leader and developmental coach for our current data team comprising two data analysts and build out additional data capabilities within the iMoney Group

  • Partner with the different business units and leverage data analytics, insights and science to drive all aspects of the customer conversion funnel including marketing channel attribution and optimization, onsite and offline (call-centre) user behavior and conversion, recommendation engines and product matching, customer segmentation, predictive analysis and propensity modelling

  • Utilize best-in-class practices with respect to data analytics, visualization, reporting dashboards and data science modelling

  • Establish iMoney as the market leader in the field of data analytics and innovative data science

  • Collaborate with the technology and product teams in continuously enhancing and delivering a robust, efficient and scalable data collection, structuring and warehousing infrastructure


Requirements:



  • Passionate about data and its ability to drive high business impact and growth

  • 10 years of experience in the field of data analytics and data science, including at least 3 years in a leadership role at a scale-up stage digital consumer business such as e-commerce, online lending platforms, digital banks, online financial marketplaces or similar

  • Hands-on experience in any of the following tools: R, Python, Knime, SAS, SPSS

  • Clear understanding of databases and extensive knowledge of SQL, AWS Redshift, Hadoop, Hive, Teradata, Google BigQuery

  • Experience in implementing and leveraging Tableau for business reporting and intelligence

  • Expertise in applying advanced predictive statistical techniques to develop regression, time-series, and segmentation models. Exposure to design of experiments or neural networks is a plus.

  • Responsibility and Attention to Detail - take responsibility for delivery of precise and accurate business intelligence, data analytics and insights to tight timescales and work to resolve problems when they occur

  • Project management skills - ability to scope out and implement larger data related projects as per business requirements including clear understanding of resource and timing requirements

  • People leadership skills - coaching, inspiring, career counselling, mentoring and capability development of team members and peers

  • Excellent stakeholder management, communication and presentation skills – fluent in English, breaking down complex problems with data-driven solutions and having a service-oriented mindset

Recursion
  • Salt Lake City, UT

At Recursion we combine experimental biology, automation, and artificial intelligence to quickly and efficiently identify treatments for human diseases. We’re transforming drug discovery into a data science problem and to do that we’re building a platform for rapid biological experimentation, data generation, automated analysis, model training, and prediction.


THE PROBLEMS YOU’LL SOLVE


As a Machine Learning Engineer, you'll report to the VP of Engineering and will work with others on the data, machine learning, and engineering teams to build the infrastructure and systems that enable both ML prototyping and production-grade deployment of ML solutions, lifting our drug discovery platform to new levels of effectiveness and efficiency. We are looking for experienced Machine Learning Engineers who value experimentation and the rigorous use of the scientific method, high collaboration across multiple functions, and intense curiosity driving them to keep our systems cutting edge. In this role you will:



  • Build, scale, and operate compute clusters for deep learning. You’ll be a part of a team responsible for the ML infrastructure, whether that be large-scale on-prem GPU clusters or cloud-based TPU pods.

  • Create a world-class ML research platform. You’ll work with Data Scientists, ML Researchers, and Systems and Data Engineers to create an ML platform that allows them to efficiently prepare hundreds of terabytes of data for training and processing, train cutting-edge deep learning models, backtest them on thousands of past experiments, and deploy working solutions to production. Examples of ML platforms like this are Uber’s Michelangelo and Facebook’s FBLearner Flow.

  • Be a mentor to peers. You will share your technical knowledge and experiences, resulting in an increase in their productivity and effectiveness.


THE EXPERIENCE YOU’LL NEED



  • An ability to be resourceful and collaborative in order to complete large projects. You’ll be working cross-functionally to build these systems and must always have the end (internal) user in mind.

  • Experience implementing, training, and evaluating deep learning models using modern ML frameworks through collaboration with others, reading of ML papers, or primary research.

  • A demonstrated ability to accelerate ML research efforts through improved systems, processes and frameworks.

  • A track record of learning new technologies as needed to get things done. Our current tech stack uses Python and the pydata libraries, TensorFlow, Keras, Kubernetes + Docker, BigQuery, and other cloud services provided by Google Cloud Platform.

  • Biology background is not necessary, but intellectual curiosity is a must!


THE PERKS YOU’LL ENJOY



  • Coverage of health, vision, and dental insurance premiums (in most cases 100%)

  • 401(k) with generous matching (immediate vesting)

  • Stock option grants

  • Two one-week paid company closures (summer and winter) in addition to flexible, generous vacation/sick leave

  • Commuter benefit and vehicle parking to ease your commute

  • Complimentary chef-prepared lunches and well-stocked snack bars

  • Generous paid parental leave for birth, non-birth, and adoptive parents

  • Fully-paid gym membership to Metro Fitness, located just feet away from our new headquarters

  • Gleaming new 100,000 square foot headquarters complete with a 70-foot climbing wall, showers, lockers, and bike parking



WHAT WE DO


We have raised over $80M to apply machine learning to one of the most unique datasets in existence - over a petabyte of imaging data spanning more than 10 billion cells treated with hundreds of thousands of different biological and chemical perturbations, generated in our own labs - in order to find treatments for hundreds of diseases. Our long term mission is to decode biology to radically improve lives and we want to understand biology so well that we can fix most things that go wrong in our bodies. Our data scientists, machine learning researchers and engineers work on some of the most challenging and interesting problems in computational drug discovery, and collaborate with some of the brightest minds in the deep learning community (Yoshua Bengio is one of our advisors), who help our machine learning team design novel ways of tackling these problems.



Recursion is an equal opportunity employer and complies with all applicable federal, state, and local fair employment practices laws. Recursion strictly prohibits and does not tolerate discrimination against applicants because of race, color, religion, creed, national origin or ancestry, ethnicity, sex, pregnancy, gender (including gender nonconformity and status as a transgender individual), age, physical or mental disability, citizenship, past, current, or prospective service in the uniformed services, or any other characteristic protected under applicable federal, state, or local law.

Coolblue
  • Rotterdam, Netherlands
As an Advanced Data Analyst / Data Scientist you use the data of millions of visitors to help Coolblue act smarter.

Pros and cons

  • You're going to be working as a true Data Scientist. One who understands why you get the results that you do and applies this information to other experiments.
  • You're able to use the right tools for every job.
  • Your job starts with a problem and ends with you monitoring your own solution.
  • You have to crawl underneath the foosball table when you lose a game.

Description Data Scientist

Your challenge in this sprint is improving the weekly sales forecasting models for the Christmas period. Your cross-validation strategy is ready, but before you can begin, you have to query the data from our systems and process them in a way that allows you to view the situation with clarity.

First, you have a meeting with Matthias, who's worked on this problem before. During your meeting, you conclude that Christmas has a non-linear effect on sales. That's why you decide to experiment with a multiplicative XGBoost in addition to your Regularised-Regression model. You make a grid with various features and parameters for both models and analyze the effects of both approaches. You notice your Regression is overfitting, which means XGBoost isn't performing and the forecast isn't high enough, so you increase the regularization and assign the Christmas features to XGBoost alone.

Nice! You improved the precision of the Christmas forecast by an average of 2%. This will only yield results once the algorithm has been implemented, so you start thinking about how you want to implement it.
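
A rough, self-contained sketch of the cross-validation strategy a workflow like this relies on (all names are hypothetical; a real pipeline would fit the regularised regression and the XGBoost model on each fold and compare their forecast errors):

```python
# Hypothetical sketch: rolling-origin (expanding-window) splits for weekly
# sales data, so each test window lies strictly after its training data and
# no future information leaks into model fitting.

def rolling_origin_splits(n_weeks, n_folds, horizon=1):
    """Yield (train_indices, test_indices) pairs for time-ordered data."""
    first_cut = n_weeks - n_folds * horizon
    if first_cut < 1:
        raise ValueError("not enough data for the requested folds")
    for k in range(n_folds):
        cut = first_cut + k * horizon
        # Train on everything before the cut, test on the next horizon weeks.
        yield list(range(cut)), list(range(cut, cut + horizon))

# Example: 10 weekly observations, 3 folds, 1-week-ahead forecasts.
for train, test in rolling_origin_splits(10, 3):
    assert max(train) < min(test)  # training always precedes testing
```

Each fold would then score both candidate models over the feature and parameter grid, so the Christmas features can be moved between models without re-deriving the splits.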

Your specifications

  • You have at least 4 years of experience in a similar function.
  • You have a university degree, an MSc, or a PhD in Mathematics, Computer Science, or Statistics.
  • You have experience with Machine Learning techniques, such as Gradient Boosting, Random Forest, and Neural Networks, and you have proven experience with successfully applying these (or similar) techniques in a business environment.
  • You have some experience with Data mining, SQL, BigQuery, NoSQL, R, and monitoring.
  • You're highly knowledgeable about Python.
  • You have experience with Big Data technologies, such as Spark and Hadoop.

Included by default.

  • Money.
  • Travel allowance and a retirement plan.
  • 25 leave days. As long as you promise to come back.
  • A discount on all our products.
  • A picture-perfect office at a great location. You could crawl to work from Rotterdam Central Station. Though we recommend just walking for 2 minutes.
  • A horizontal organisation in the broadest sense. You could just go and have a beer with the boss.

Review



'I believe I'm working in a great team of enthusiastic and smart people, with a good mix of juniors and seniors. The projects that we work on are very interesting and diverse, think of marketing, pricing and recommender systems. For each project we try to use the latest research and machine learning techniques in order to create the best solutions. I like that we are involved in the projects start to end, from researching the problem to experimenting, to putting it in production, and to creating the monitoring dashboards and delivering the outputs on a daily basis to our stakeholders. The work environment is open, relaxed and especially fun'
- Cheryl Zandvliet, Data Scientist
Pythian
  • Dallas, TX

Google Cloud Solutions Architect (Pre Sales)

United States | Canada | Remote | Work from Home

Why You?

Are you a US- or Canada-based Cloud Solutions Architect who likes to operate with a high degree of autonomy and have diverse responsibilities that require strong leadership, deep technology skills and a dedication to customer service? Do you have big data and data-centric skills? Do you want to take part in the strategic planning of an organization's data estate, with a focus on fulfilling business requirements around cost, scalability and flexibility of the platform? Can you draft technology roadmaps and document best-practice gaps with precise steps for how to get there? Can you implement the details of the backlogs you have helped build? Do you demonstrate consistent best practices and deliver strong customer satisfaction? Do you enjoy pre-sales? Can you demonstrate adoption of new technologies and frameworks through the development of proofs of concept?

If you have a passion for solving complex problems and for pre-sales, then this could be the job for you!

What Will You Be Doing?  

  • Collaborating with and supporting Pythian sales teams in the pre-sales & account management process from the technical perspective, remotely and on-site (approx 75%).
  • Defining solutions for current and future customers that efficiently address their needs. Leading through example and influence, as a master of applying technology solutions to solve business problems.
  • Developing Proofs of Concept (PoCs) in order to demonstrate feasibility and value to Pythian's customers (approx 25%).
  • Identifying then executing solutions with a commitment to excellent customer service
  • Collaborating with others in refining solutions presented to customers
  • Conducting technical audits of existing architectures (Infrastructure, Performance, Security, Scalability and more) and documenting best practices and recommendations
  • Providing component or site-wide performance optimizations and capacity planning
  • Recommending best practices & improvements to current operational processes
  • Communicating status and planning activities to customers and team members
  • Participating in periodic overtime (occasionally on short notice) and travelling up to approx. 50%.

What Do We Need From You?

While we realise you might not have everything on the list, to be the successful candidate for the Solutions Architect job you will likely need at least 10 years' experience in a variety of positions in IT. The position requires specialized knowledge and experience in performing the following:

  • Undergraduate degree in computer science, computer engineering, information technology or related field or relevant experience.
  • Systems design experience
  • Understanding and experience with Cloud architectures specifically: Google Cloud Platform (GCP) or Microsoft Azure
  • In-depth knowledge of popular database and data warehouse technologies from Microsoft, Amazon and/or Google (Big Data & conventional RDBMS): Microsoft Azure SQL Data Warehouse, Teradata, Redshift, BigQuery, Snowflake, etc.
  • Fluency in a few languages, preferably Java and Python; familiarity with Scala and Go would be a plus.
  • Proficient in SQL. (Experience with Hive and Impala would be great)
  • Proven ability to work with software engineering teams and understand complex development systems, environments and patterns.
  • Experience presenting to high level executives (VPs, C Suite)
  • This is a North American based opportunity and it is preferred that the candidate live on the West Coast, ideally in San Francisco or the Silicon Valley area but strong candidates may be considered from anywhere in the US or Canada.
  • Ability to travel and work across North America frequently (occasionally on short notice) up to 50% with some international travel also expected.

Nice-to-Haves:

  • Experience Architecting Big Data platforms using Apache Hadoop, Cloudera, Hortonworks and MapR distributions.
  • Knowledge of real-time Hadoop query engines like Dremel, Cloudera Impala, Facebook Presto or Berkeley Spark/Shark.
  • Experience with BI platforms, reporting tools, data visualization products, ETL engines.
  • Experience with any MPP (Oracle Exadata/DW, Teradata, Netezza, etc)
  • Understanding of continuous delivery and deployment patterns and tools (Jenkins, Artifactory, Maven, etc)
  • Prior experience working as/with Machine Learning Engineers, Data Engineers, or Data Scientists.
  • A certification such as Google Cloud Professional Cloud Architect, Google Professional Data Engineer or related AWS Certified Solutions Architect / Big Data or Microsoft Azure Architect
  • Experience or strong interest in people management, in a player-coach style of leadership longer term would be great.

What Do You Get in Return?

  • Competitive total rewards package
  • Flexible work environment: Why commute? Work remotely from your home; there's no daily travel requirement to the office!
  • Outstanding people: Collaborate with the industrys top minds.
  • Substantial training allowance: Hone your skills or learn new ones; participate in professional development days, attend conferences, become certified, whatever you like!
  • Amazing time off: Start with a minimum 3 weeks vacation, 7 sick days, and 2 professional development days!
  • Office Allowance: A device of your choice, and the freedom to personalise your work environment!
  • Fun, fun, fun: Blog during work hours; take a day off and volunteer for your favorite charity.
Ripple
  • San Francisco, CA
  • Salary: $135k - 185k

Ripple is the world’s only enterprise blockchain solution for global payments. Today the world sends more than $155 trillion* across borders. Yet, the underlying infrastructure is dated and flawed. Ripple connects banks, payment providers, corporates and digital asset exchanges via RippleNet to provide one frictionless experience to send money globally.


Ripple is growing rapidly and we are looking for a results-oriented and passionate Senior Software Engineer, Data to help build and maintain infrastructure and empower the data-driven culture of the company. Ripple’s distributed financial technology outperforms today’s banking infrastructure by driving down costs, increasing processing speeds and delivering end-to-end visibility into payment fees, timing, and delivery.


WHAT YOU’LL DO:



  • Support our externally-facing data APIs and applications built on top of them

  • Build systems and services that abstract the underlying engines, allowing users to focus on business and application logic via higher-level programming models

  • Build data pipelines and tools to keep pace with the growth of our data and its consumers

  • Identify and analyze requirements and use cases from multiple internal teams (including finance, compliance, analytics, data science, and engineering); work with other technical leads to design solutions for the requirements


WHAT WE’RE LOOKING FOR:



  • Deep experience with distributed systems, distributed data stores, data pipelines, and other tools in cloud services environments (e.g. AWS, GCP)

  • Experience with distributed processing compute engines like Hadoop, Spark, and/or GCP data ecosystems (BigTable, BigQuery, Pub/Sub)

  • Experience with stream processing frameworks such as Kafka, Beam, Storm, Flink, Spark streaming

  • Experience building scalable backend services and data pipelines

  • Proficient in Python, Java, or Go

  • Able to support Node.js in production

  • Familiarity with Unix-like operating systems

  • Experience with database internals, database design, SQL and database programming

  • Familiarity with distributed ledger technology concepts and financial transaction/trading data

  • You have a passion for working with great peers and motivating teams to reach their potential

RentPath
  • Atlanta, GA

Join a winning team!  Become a part of something meaningful!


Looking to join a company in the midst of a digital transformation where the consumer is king and talent, technology and data are our greatest resources? Keep reading to see if this opportunity is of interest to you!


RentPath is looking for a Sr. Analytics Engineer to support the Consumer Product organization and leadership teams as we look for ways to optimize our site experience and troubleshoot changes in conversion, site performance, KPIs, and more. You will help build solutions that drive proactive analytics within the organization. You will be a problem solver with a Swiss Army knife set of skills that can be leveraged to break down barriers between business stakeholders and our data. You will be responsible for a wide variety of data acquisition, manipulation, and cleansing, and should be comfortable building solutions that enable analytics teams to perform complex deep-dive analyses.


A Day in the Life.....


  • You conduct thorough analyses to develop/validate consumer segmentations, develop and compute consumer value metrics and support ad-hoc requests for data extraction, modeling and analysis cross-functionally and within a disparate data ecosystem.
  • You will be expected to manage competing priorities and complexity in the face of ambiguity and build streamlined solutions that enable internal partners to effectively obtain and interpret data and platforms used in the operations of our business.
  • You leverage open-source programming languages such as Python to acquire, manipulate, and cleanse data from a variety of sources.
  • You prepare clear and concise data visualizations and presentations to enhance business decision making; experience leveraging Power BI, Tableau or similar software.
  • You are responsible for the development, deployment, and maintenance of dashboards for the consumer product team including product owners.
  • You work across the organization and leverage cross-functional teams to provide expert analytical service; experience traversing multiple databases to uncover trends.
  • You collect requirements, manage personal project intake, validate and update analyses and dashboards periodically, and provide reports as needed.


What we need from you.....


  • Demonstrated success, experience or proficiency in/with the following: A/B, multivariate, and other statistical testing methods
  • Experience with web services like REST & SOAP APIs (connecting, gathering data, automation)
  • Experience working in support of product development, collecting requirements, and delivering insight for product improvement
  • Experience supporting the development of analytics solutions leveraging tools like Power BI, Tableau Desktop, and Tableau Online
  • Open-source programming languages for data acquisition and manipulation, with a strong preference for object-oriented languages (e.g. Python), and experience with cloud technologies (e.g. BigQuery and Redshift)
  • Familiarity with agile and sprint methodologies
  • Demonstrated experience with the acquisition and interpretation of business, data and process requirements, with appropriate usage of data visualization tools
  • 4+ years experience working with SQL; exposure to NoSQL a plus!
  • Experience leveraging web traffic and consumer analytics tools (e.g. Adobe Analytics, Google Analytics, Mixpanel, Heap, etc.)
  • Familiarity with Optimizely, Google Optimize, or similar A/B testing platforms
  • Demonstrated understanding of data warehousing and data modeling
  • Comfortable narrating data-driven insights and translating technical concepts into simple terminology for a variety of technical and non-technical stakeholders at various levels
  • Excellent problem-solving skills, including the ability to analyze situations, identify existing or potential problems and recommend sound conclusions and implementation strategies
  • Strong project management skills; ability to prioritize and manage client expectations across multiple projects

Nice to Have:

  • Master's degree in a quantitative field
  • Experience with data mining and ETL in a Hadoop environment is a plus
  • Experience working with ElasticSearch and Apache Kafka highly preferred but not required




Why Choose RentPath?

We're a place where you can make an important difference, from day one. You'll have the opportunity to grow and build, both professionally and in the communities we serve. You'll work with smart, diverse, and unpretentious people, as we help renters find and enjoy their ideal home. In fact, we consider ourselves a very well-funded start-up that also has more than 40 years in the industry and strong financial performance. The challenge of leading our digital transformation has attracted talent from leading companies like Google, Microsoft, HomeAway, and Expedia. Will you be next?


  
Farfetch UK
  • London, UK

About the team:



We are a multidisciplinary team of Data Scientists and Software Engineers with a culture of empowerment, teamwork and fun. Our team is responsible for large-scale and complex machine learning projects, directly providing business-critical functionality to other teams and using the latest technologies in the field.



Working collaboratively as a team and with our business colleagues, both here in London and across our other locations, you’ll be shaping the technical direction of a critically important part of Farfetch. We are a team that surrounds ourselves with talented colleagues and we are looking for brilliant Software Engineers who are open to taking on plenty of new challenges.



What you’ll do:



Our team works with vast quantities of messy data, such as unstructured text and images collected from the internet, applying machine learning techniques, such as deep learning, natural language processing and computer vision, to transform it into a format that can be readily used within the business. As an Engineer within our team you will help to shape and deliver the engineering components of the services that our team provides to the business. This includes the following:




  • Work with Project Lead to help design and implement new or existing parts of the system architecture.

  • Work on surfacing the team’s output through the construction of ETLs, APIs and web interfaces.

  • Work closely with the Data Scientists within the team to enable them to produce clean production quality code for their machine learning solutions.



Who you are:



First and foremost, you’re passionate about solving complex, challenging and interesting business problems. You have solid professional experience with Python and its ecosystem, with a thorough approach to testing.



To be successful in this role you have strong experience with:



  • Python 3

  • Web frameworks, such as Flask or Django.

  • Celery, Airflow, PySpark or other processing frameworks.

  • Docker

  • ElasticSearch, Solr or a similar technology.



Bonus points if you have experience with:



  • Web scraping frameworks, such as Scrapy.

  • Terraform, Packer

  • Google Cloud Platform, such as Google BigQuery or Google Cloud Storage.



About the department:



We are the beating heart of Farfetch, supporting the running of the business and exploring new and exciting technologies across web, mobile and in-store to help us transform the industry. Split across three main offices - London, Porto and Lisbon - we are the fastest-growing teams in the business. We're committed to turning the company into the leading multi-channel platform and are constantly looking for brilliant people who can help us shape tomorrow's customer experience.





We are committed to equality of opportunity for all employees. Applications from individuals are encouraged regardless of age, disability, sex, gender reassignment, sexual orientation, pregnancy and maternity, race, religion or belief and marriage and civil partnerships.

Coolblue
  • Rotterdam, Netherlands
As an Advanced Data Analyst / Data Scientist you use the data of millions of visitors to help Coolblue act smarter.

Pros and cons

  • Youre going to be working as a true Data Scientist. One who understands why you get the results that you do and apply this information to other experiments.
  • Youre able to use the right tools for every job.
  • Your job starts with a problem and ends with you monitoring your own solution.
  • You have to crawl underneath the foosball table when you lose a game.

Description Data Scientist

Your challenge in this sprint is improving the weekly sales forecasting models for the Christmas period. Your cross-validation strategy is ready, but before you can begin, you have to query the data from our systems and process them in a way that allows you to view the situation with clarity.

First, you have a meeting with Matthias, whos worked on this problem before. During your meeting, you conclude that Christmas has a non-linear effect on sales.  Thats why you decide to experiment with a multiplicative XGBoost in addition to your Regularised-Regression model. You make a grid with various features and parameters for both models and analyze the effects of both approaches. You notice your Regression is overfitting, which means XGBoost isnt performing and the forecast isnt high enough, so you increase the regularization and appoint the Christmas features to XGBoost alone.

Nice! You improved the precision of the Christmas forecast by an average of 2%. This will only yield results once the algorithm has been implemented, so you start thinking about how you want to implement it.
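The approach described above - gridding over model variants and keeping the one that scores best - can be sketched in miniature. This is a purely illustrative toy (synthetic sales numbers, and a one-parameter "uplift" stand-in for the real XGBoost and regularised regression models):

```python
# Toy illustration only: grid-search an additive vs a multiplicative
# Christmas uplift on synthetic weekly sales, scored by MAPE. Real work
# would use XGBoost / regularised regression with proper cross-validation.

def forecast(base: float, is_xmas: bool, mode: str, uplift: float) -> float:
    """Apply a Christmas uplift either additively or multiplicatively."""
    if not is_xmas:
        return base
    return base + uplift if mode == "additive" else base * uplift

def mape(actual, predicted):
    """Mean absolute percentage error."""
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

# Synthetic data: rising baseline; Christmas weeks sell at double the baseline.
bases = [100.0, 110.0, 120.0, 130.0, 140.0, 150.0, 160.0, 170.0]
is_xmas = [False] * 6 + [True, True]
actual = [b * (2.0 if x else 1.0) for b, x in zip(bases, is_xmas)]

grid = [("additive", u) for u in (100.0, 150.0, 200.0)] + \
       [("multiplicative", u) for u in (1.5, 2.0, 2.5)]

scores = {}
for mode, uplift in grid:
    preds = [forecast(b, x, mode, uplift) for b, x in zip(bases, is_xmas)]
    scores[(mode, uplift)] = mape(actual, preds)

best = min(scores, key=scores.get)
print(best)  # ('multiplicative', 2.0) - the multiplicative effect fits best
```

Because the synthetic Christmas weeks are generated as exactly double their baseline, no constant additive uplift can match them once the baseline moves, and the multiplicative candidate wins the grid search.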

Your specifications

  • You have at least 4 years of experience in a similar function.
  • You have a university degree, an MSc, or a PhD in Mathematics, Computer Science, or Statistics.
  • You have experience with Machine Learning techniques, such as Gradient Boosting, Random Forest, and Neural Networks, and you have proven experience with successfully applying these (or similar) techniques in a business environment.
  • You have some experience with Data mining, SQL, BigQuery, NoSQL, R, and monitoring.
  • You're highly knowledgeable about Python.
  • You have experience with Big Data technologies, such as Spark and Hadoop.

Included by default.

  • Money.
  • Travel allowance and a retirement plan.
  • 25 leave days. As long as you promise to come back.
  • A discount on all our products.
  • A picture-perfect office at a great location. You could crawl to work from Rotterdam Central Station. Though we recommend just walking for 2 minutes.
  • A horizontal organisation in the broadest sense. You could just go and have a beer with the boss.

Review



'I believe I'm working in a great team of enthusiastic and smart people, with a good mix of juniors and seniors. The projects that we work on are very interesting and diverse, think of marketing, pricing and recommender systems. For each project we try to use the latest research and machine learning techniques in order to create the best solutions. I like that we are involved in the projects start to end, from researching the problem to experimenting, to putting it in production, and to creating the monitoring dashboards and delivering the outputs on a daily basis to our stakeholders. The work environment is open, relaxed and especially fun'
- Cheryl Zandvliet, Data Scientist
Gecko Robotics
  • Pittsburgh, PA
What is Gecko Portal?


Gecko Portal is a rapidly growing team at Gecko Robotics that is focused on transforming the company’s core product from the ground up by creating industry-leading tools for industrial asset management using the latest in modern Web technologies, machine learning, and advanced data visualization. We aim to leverage the tremendous amount of real-world data that Gecko Robotics has collected to generate predictive, never-before-seen insights and radically transform the way that plant owners and managers manage their industrial assets. The Gecko Portal product involves all levels of the stack, from 3D data visualization tools to dynamic React.js frontend applications powered by a backend that processes millions of data points, serves as a platform for comprehensive asset data management, and generates ML-driven analytics. We are looking for talented engineers who are comfortable with rapid iteration and development to join us in building Web applications at scale. 


What is Backend at Gecko Robotics?


Backend engineers on the Gecko Portal team create the core software services that power all of our internal and customer-facing applications, from data validation and signal processing platforms to customer-facing data visualization and inspection data management applications.  We use a wide range of modern languages and technologies such as Python (Bokeh, Django, Flask, Pandas), Javascript (React, Redux, D3, Kepler), Google Cloud Platform (Cloud Storage, Cloud Functions, BigQuery), Redis, PostgreSQL, and Docker. Some of our technical areas of focus include signal processing, real-time video processing / object detection and classification, computer vision, and machine learning / predictive analytics.
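As a small flavour of the backend-service work described above, here is a deliberately framework-free sketch of a JSON asset endpoint written against the bare WSGI interface. It is illustrative only: the team's real services use Django/Flask, and the asset names and fields below are hypothetical.

```python
# Framework-free sketch of one backend JSON endpoint (bare WSGI).
# Illustrative only: real services here use Django / Flask, and the
# asset data below is invented for the example.
import json

ASSETS = {"boiler-7": {"wall_loss_pct": 12.5, "last_inspection": "2019-04-02"}}

def app(environ, start_response):
    """Serve GET /assets/<id> as JSON; respond 404 for unknown assets."""
    path = environ.get("PATH_INFO", "")
    asset_id = path[len("/assets/"):]
    if path.startswith("/assets/") and asset_id in ASSETS:
        body = json.dumps(ASSETS[asset_id]).encode()
        start_response("200 OK", [("Content-Type", "application/json")])
    else:
        body = b'{"error": "not found"}'
        start_response("404 Not Found", [("Content-Type", "application/json")])
    return [body]
```

Any WSGI server (or the Django/Flask equivalent route) could mount `app`; the point is simply the request-in, JSON-out shape of such asset-management services.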


Minimum Requirements:



  • 2+ years of experience building backend software applications

  • Bachelor’s and/or Master’s degree in Computer Science, or equivalent experience

  • Demonstrated ability in writing performant, scalable code

  • Dedication to test-driven development and designing production-ready systems

  • Deep understanding of a backend web development framework (Django, Flask, Node.js, Spring, Ruby, or Laravel)

  • Familiarity with Computer Science fundamentals and ability to apply those fundamentals to code

  • Awareness of best practices for scaling backend architectures and databases


Additional requirements/responsibilities:



  • Design, implement, and verify new features in a continuous integration environment

  • Interact with frontend engineers and data analysts to design robust web application APIs

  • Lead design and code reviews

  • Use critical thinking skills to debug problems and develop solutions to challenging technical problems

  • Interact with other engineers from multiple disciplines in a team environment

  • Develop tests to ensure the integrity and availability of the application

  • Provide and review technical documentation

Recursion
  • Salt Lake City, UT

At Recursion we combine experimental biology, automation, and artificial intelligence to quickly and efficiently identify treatments for human diseases. We are transforming drug discovery into a data science problem and to do that we are building a platform for rapid biological experimentation, data generation, automated analysis, model training, and prediction.


THE PROBLEMS YOU’LL SOLVE


As a Software Engineer you’ll work closely with Biologists, Automation Scientists, and Data Scientists to build the infrastructure and applications needed to decode human biology and reinvent drug discovery. In this role, you will:



  • Build, scale, and operate a streaming data pipeline. You’ll be on a team responsible for moving, processing, and analyzing about 20 terabytes of images each week.

  • Create a world-class research platform. You’ll work with Data Scientists and Biologists to create a platform that allows them to generate and access petabytes of data, gives them tools to quickly iterate on novel analysis and research, and deploys new deep learning models into the production data pipeline.

  • Provide visibility into operations. You’ll create tools, dashboards, and metrics that will help everyone keep track of the work they care about, and alert them when they need to take corrective actions.

  • Act as a mentor to peers. You will share your technical knowledge and experiences, resulting in an increase in their productivity and effectiveness.


THE QUALITIES WE VALUE



  • Experimentation - We want Software Engineers who think critically and use data to measure results. Rigorous use of the scientific method allows our Software Engineers to quickly understand the critical aspects of the problems we’re trying to solve and whether or not we’re moving in the right direction.

  • Collaboration - We want Software Engineers who play well with others. The role will require close collaboration with our Biological, High Throughput Screening, and Data Science teams to help us achieve our mission to discover transformative new treatments.

  • Curiosity - We want Software Engineers who aren’t satisfied with the status quo. Our Software Engineers openly discuss the tradeoffs inherent in how we build software, and go beyond the traditional boundaries of engineering teams to enable us to get things done faster, cheaper, and more reliably than in traditional drug discovery.


THE EXPERIENCE YOU’LL NEED


We’re hiring a couple of Data Engineers and are interested in people with varying levels of experience.



  • An ability to be resourceful and collaborative in order to complete large projects. We don't have much in the way of project managers.

  • A track record of learning new technologies as needed to get things done. Our current tech stack uses Python and the pydata libraries, Clojure, Kafka, Kubernetes + Docker, PostgreSQL, BigQuery, and other cloud services provided by Google Cloud Platform. Experience with Python or the JVM will be helpful.

  • An ability to get things done using various tools from the nooks and crannies of software engineering: composing command line tools such as kubectl, jq, and xargs; creating SQL triggers and managing migrations; and operational support.

  • An ability to write well-tested and instrumented code that can be continuously deployed to a production environment with confidence.

  • An interest in learning from and teaching peers in areas of performance, scalability, and system architecture.

  • Biology background is not necessary, but intellectual curiosity is a must!


THE PERKS YOU’LL ENJOY



  • Coverage of health, vision, and dental insurance premiums (in most cases 100%)

  • 401(k) with generous matching (immediate vesting)

  • Stock option grants

  • Two one-week paid company closures (summer and winter) in addition to flexible, generous vacation/sick leave

  • Commuter benefit and vehicle parking to ease your commute

  • Complimentary chef-prepared lunches and well-stocked snack bars

  • Generous paid parental leave for birth, non-birth and adoptive parents

  • Fully-paid gym membership to Metro Fitness, located just feet away from our new headquarters

  • Gleaming new 100,000 square foot headquarters complete with a 70-foot climbing wall, showers, lockers, and bike parking



WHAT WE DO


We have raised over $80M to apply machine learning to one of the most unique datasets in existence - over a petabyte of imaging data spanning more than 10 billion cells treated with hundreds of thousands of different biological and chemical perturbations, generated in our own labs - in order to find treatments for hundreds of diseases. Our long-term mission is to decode biology to radically improve lives, and we want to understand biology so well that we can fix most things that go wrong in our bodies. Our data scientists and machine learning researchers work on some of the most challenging and interesting problems in computational drug discovery, and collaborate with some of the brightest minds in the deep learning community (Yoshua Bengio is one of our advisors), who help our machine learning team design novel ways of tackling these problems.


Recursion is an equal opportunity employer and complies with all applicable federal, state, and local fair employment practices laws. Recursion strictly prohibits and does not tolerate discrimination against applicants because of race, color, religion, creed, national origin or ancestry, ethnicity, sex, pregnancy, gender (including gender nonconformity and status as a transgender individual), age, physical or mental disability, citizenship, past, current, or prospective service in the uniformed services, or any other characteristic protected under applicable federal, state, or local law.

Wellframe
  • Boston, MA

As a Data Scientist at Wellframe, you will leverage your statistical thinking, problem-solving skills, communication skills, and curiosity to help us positively impact patient care. You will work alongside healthcare domain experts to uncover novel insights from raw data, and train production-grade models that will augment our platform’s capabilities. By staying up to date on cutting-edge innovations in machine learning and artificial intelligence, you will introduce new ways to help patients get the attention and care they need.



Example projects you’ll work on:




    • Enable care management teams to connect better with patients by leveraging machine translation models that help them converse in the patient’s native language.

    • Maximize the impact of care management teams by building targeting models that identify patients who would benefit the most from health management.

    • Create prescriptive models that guide care managers towards the next best action to improve the health of patients.



Our ideal candidate has:




    • A B.S. or M.S. degree in Math, Computer Science, Physics, Biostatistics, Epidemiology, or a closely related quantitative field of study

    • A deep understanding of machine learning fundamentals: multivariable calculus, linear algebra, optimization, probability, and statistics

    • Proficiency in Python, R, or both

    • Experience in building predictive models and performing statistical analysis starting from raw data

    • Up to date knowledge of innovations in machine learning and artificial intelligence

    • Excellent communication skills

    • Commitment to helping healthcare organizations improve the care they deliver to patients




Tools we use include:




    • Python, R, TensorFlow, Scikit-learn.

    • Airflow, Dataproc, BigQuery, Looker, Jupyter Notebooks. 

    • Kubernetes, Google Cloud Platform.




Wellframe, Inc. is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status or any other characteristic protected by local, state, or federal laws, rules, or regulations.

MailChimp
  • Atlanta, GA
Mailchimp is a leading marketing platform for small business. We empower millions of customers around the world to build their brands and grow their companies with a suite of marketing automation, multichannel campaign, CRM, and analytics tools.
The Growth team at Mailchimp uses data-driven experimentation to help Mailchimp users get more value out of our products and drive toward achieving our company goals. You will be responsible for producing meaningful analytics to inform the growth process and will serve as the subject matter expert in all matters related to A/B and MVT test analysis for the growth product team. You're a skilled collaborator who's able to work across departments and disciplines. You can communicate the value of data-driven experimentation to other cross-functional team members. The ideal candidate will have experience in a similar marketing optimization, product analytics, or growth analytics role, with strong technical and analytical abilities.
As a part of the Growth team, the work you do will have high visibility, as we quickly turn insights into action and drive change on Mailchimp.com and within the Mailchimp product. If this sounds like you, we would love to hear from you!
Responsibilities
    • Translate customer and business needs into actionable analytics that inform Growth strategy and generate test ideas to help meet the goals of the business; Independently act on your recommendations and deliver key insights to the team
    • Partner with qualitative research to work on better understanding the customer journey, provide quantitative insights that help inform the customer journey, and work together to complete analysis and generate test ideas
    • Be a subject matter expert within cross functional Growth Product team, including upholding best practices around A/B testing and educating others on key concepts such as sample size estimation, confidence intervals, and statistical significance
    • Take ideas put forward by the team and create a hypothesis that captures what the team is trying to learn, has clear and measurable KPIs, and can be tested in a reasonable amount of time
    • Provide regular updates and generate quantitative results to show how the Growth team is making progress towards the team's KPI/North Star Metric
    • Create and maintain automated reporting and dashboards to track key marketing and customer experience metrics across multiple properties leveraging SQL, Google BigQuery, and Google Data Studio; monitor for changes in trends, share insights and make recommendations
    • Create requirements for data tracking needs to ensure the hypothesis can be accurately measured and reported on at the end of a test
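The testing concepts listed above - sample size estimation, confidence intervals, statistical significance - reduce to a few lines under the normal approximation. A simplified sketch (not Mailchimp's tooling; the conversion numbers are made up):

```python
# Illustrative sketch only (not Mailchimp tooling): per-variant sample size
# and a two-sided, two-proportion z-test via the normal approximation.
from statistics import NormalDist

def sample_size(p_base: float, mde: float, alpha: float = 0.05, power: float = 0.8) -> int:
    """Per-variant sample size to detect an absolute lift `mde` over rate `p_base`."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_power = NormalDist().inv_cdf(power)
    p_avg = p_base + mde / 2                 # rough pooled rate under the lift
    variance = 2 * p_avg * (1 - p_avg)
    return int((z_alpha + z_power) ** 2 * variance / mde ** 2) + 1

def z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

print(sample_size(0.10, 0.02))          # roughly 3.8k users per variant
print(z_test(1000, 10000, 1100, 10000)) # ~0.02: a 10% -> 11% lift is significant
```

With a 10% baseline conversion rate and a 2-point minimum detectable effect, the formula lands near 3.8k users per variant at 80% power, which is the kind of estimate a growth analyst would feed back into test planning.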

Requirements
    • Extensive experience in an analytics focused role, product analytics experience is a plus
    • Bachelor's or graduate degree (business or mathematics a plus, or equivalent work experience)
    • Desire to work in a fast-paced environment
    • Expertise in A/B testing analytics and best practices, experience with Optimizely is a plus
    • Expertise in SQL, Web Analytics, Excel - R/Python is a plus
    • Proficiency in wrangling and transforming data
    • Strong communication, collaboration, and problem-solving abilities
    • Demonstrated, hands-on experience with data visualization tools
    • Expertise with web analytics tools, specifically Google Analytics (certification preferred)
    • Experience working with Google BigQuery a plus
    • Experience with statistical methods such as regression and hypothesis testing
    • Proven experience analyzing data from a variety of different sources (quantitative and qualitative), presenting the data in a clear and concise manner, and creating actionable insights

Mailchimp is a founder-owned and highly profitable company headquartered in the heart of Atlanta. Our purpose is to empower the underdog, and our mission is to democratize cutting edge marketing technology for small business. We offer our employees an exceptional workplace , extremely competitive compensation, fully paid benefits (for employees and their families), and generous profit sharing . We hire humble , collaborative, and ambitious people, and give them endless opportunities to grow and succeed.
We love our hometown and support sustainable urban renewal. Our headquarters is in the historic Ponce City Market , right on the Atlanta Beltline . If you'd like to be considered for this position, please apply below. We look forward to meeting you!
Mailchimp is an equal opportunity employer, and we value diversity at our company. We don't discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
Epidemic Sound AB
  • Stockholm, Sweden

At Epidemic Sound we are reinventing the music industry. Our carefully curated music library, with over 30,000 tracks, is tailored for storytellers no matter what their story is. Countless customers around the world, from broadcasters and production companies to YouTubers, rely on our tracks to help them tell their stories. Epidemic Sound’s music is heard in hundreds of thousands of videos on online video platforms such as YouTube, Facebook, Twitch and Instagram. Our HQ is located in Stockholm, with offices in NYC, LA, Hamburg, Amsterdam and Madrid. We are growing fast, we have lots of fun and we are transforming the music industry.


We are now looking for a Software Engineer!


Job description


In addition to the hundreds of thousands of storytellers using our products, our music is heard tens of billions of times every month across YouTube, social media and music streaming platforms. We want to make maximum use of all this data to generate insights and enable data-driven decisions both in our product development and for our musicians and business stakeholders.


As such, we are now looking to grow our Data Infrastructure Team with an experienced software engineer to help us design, develop and evolve our platform and products.


You can expect to:



  • Collaborate with our product-oriented teams during data-related projects to achieve reliable and scalable solutions

  • Design, develop and evolve the data pipelines that fuel our back-end services, machine learning platform and business intelligence systems.

  • Contribute to all stages of the product life-cycle: Design, implementation, testing, releasing and maintenance

  • Work in a lightweight agile process where we regularly evaluate how we work and try to improve

  • Collaborate constantly: We’re big believers in teamwork and the value of practices like careful code reviews and pair (or mob) programming

  • Learn a ton of new things: Be it through hack-days, courses, conferences or tech-talks, we emphasize learning and we also expect you to share your knowledge with your colleagues

  • Have fun and take a lot of pride in what you and all of Epidemic Sound are doing


What are we looking for?



  • You have a great understanding of modern web architectures, distributed systems, data processing and storage solutions

  • You have strong knowledge of relational databases and SQL

  • You are able to find opportunities for data acquisition and new uses for existing data

  • You enjoy working with multiple stakeholders and feel comfortable prioritizing internal and external requirements

  • You love teamwork

  • You speak English with professional working proficiency (Swedish is not a requirement)


It would also be music to our ears if:



  • You have experience in one or more of the following: Python, AWS ecosystem, Google Cloud Platform

  • You have experience working with data processing and querying engines such as Spark, Hadoop, BigQuery or Kafka

  • You have experience with multiple storage solutions such as column-oriented databases, NoSQL or graph databases


Application


Do you want to be a part of our fantastic team? Send us your cover letter and CV as soon as possible - interviews are held continuously. We look forward to hearing from you!

Zalando SE
  • Berlin, Germany

ABOUT THE TEAM



Department: Lounge Product Analytics
Reports to: Product Analytics Lead


Team Size: <5
Recruiter Name, E-mail: Simona, simona.cerneckyte@zalando.de


You will be a Senior Digital Data Analyst in one of Zalando Lounge’s newest teams, working closely with our Product Managers. You will have end-to-end ownership of data: define tracking requirements, provide actionable insights, carry out A/B tests and dive deep into the results. You will guide more junior team members, and help to establish the processes which drive the life of the team.


WHERE YOUR EXPERTISE IS NEEDED



  • Develop in-depth analytical (user) understanding for what drives growth for the product and how it can be improved

  • Challenge and contribute to product decisions based on insights

  • Drive a data mindset in the team: guide on methodology, support and execute testing, and act as a quality gatekeeper on the data behind product decisions - even asking to revise decisions that contradict the analytical insights

  • Give (analytical) evidence for why we do certain things - from discovery to roll-out.



WHAT WE’RE LOOKING FOR



  • 4+ years of experience in a similar role

  • Experience with analysing Google Analytics data, using BigQuery / SQL

  • Strong knowledge of A/B testing best practices and test analysis methods (Python or R)

  • Excellent communication and stakeholder management skills

  • Strong knowledge of e-commerce KPIs and how they can be defined.



PERKS AT WORK



  • Culture of trust, empowerment and constructive feedback, open source commitment, meetups, game nights, 70+ internal technical and fun guilds, knowledge sharing through tech talks, internal tech academy and blogs, product demos, parties & events

  • Competitive salary, employee share shop, 40% Zalando shopping discount, discounts from external partners, centrally located offices, public transport discounts, municipality services, great IT equipment, flexible working times, additional holidays and volunteering time off, free beverages and fruits, diverse sports and health offerings

  • Extensive onboarding, mentoring and personal development opportunities and an international team of experts

  • Relocation assistance for internationals, PME family service and parent & child rooms* (*available in select locations)





We celebrate diversity and are committed to building teams that represent a variety of backgrounds, perspectives and skills. All employment is decided on the basis of qualifications, merit and business need.



ABOUT ZALANDO


Zalando is Europe’s leading online platform for fashion, connecting customers, brands and partners across 17 markets. We drive digital solutions for fashion, logistics, advertising and research, bringing head-to-toe fashion to more than 23 million active customers through diverse skill-sets, interests and languages our teams choose to use.

Zalando SE
  • Berlin, Germany

ABOUT THE TEAM


DEPARTMENT: Consulting Consumer Insights


REPORTS TO: Team lead Brand Consulting


TEAM SIZE: >10


RECRUITER NAME: Simona, simona.cerneckyte@zalando.de


The Insights – Digital Data Analyst role provides data-driven value both to our brand partners and internally, acting as the go-to person for the Analytics team and the consultants when it comes to transactional and onsite data analysis.
This is an opportunity to be part of the team at the forefront of shaping a new business venture, and to grow with it personally. Insights is a data-driven Zalando offering that enables clients in the fashion industry to gain a holistic understanding of their business and consumers. It includes project-based consulting services and self-service web interfaces.


WHERE YOUR EXPERT IS NEEDED



  • You will work on consulting projects by scoping, extracting, analyzing and interpreting transactional and onsite tracking data to derive relevant insights together with our consultants.

  • You will develop new analytical concepts incl. predictive models and build minimum viable products (MVPs) in order to test them with our clients so that they can eventually be integrated into our self-service web interfaces.

  • You will analyze the success of our self-service web interfaces in terms of usage and user preferences and derive recommendations for improvement.

  • You will be responsible for monitoring and improving our data quality by analyzing data from various sources, coordinating with internal teams and reviewing calculation methods.


WHAT WE ARE LOOKING FOR



  • Extensive knowledge of the relevant tools and methods, preferably SQL, Google Analytics and BigQuery. Programming skills in Python or R are a plus, as is experience with forecasting and predictive models.

  • You have experience defining, modeling, testing and evolving analytical models with direct impact on business realities. Special focus on models concerning customers (e.g., customer management, segmentation, value, etc.) is a plus.

  • You possess relevant professional experience in analytics or controlling in industries such as e-business, retail, consultancy or market research, and are able to communicate results precisely.

  • You are an expert when it comes to working with large volumes of both transactional and onsite consumer data and enjoy working in a commercial context


PERKS AT WORK



  • A culture of trust, empowerment and constructive feedback, open source commitment, meetups, game nights, 70+ internal technical and fun guilds, knowledge sharing through tech talks, internal tech academy and blogs, product demos, parties & events

  • Competitive salary, employee share shop, 40% Zalando shopping discount, discounts from external partners, centrally located offices, public transport discounts, municipality services, great IT equipment, flexible working times, additional holidays and volunteering time off, free beverages and fruits, diverse sports and health offerings

  • Extensive onboarding, mentoring and personal development opportunities and an international team of experts

  • Relocation assistance for internationals, PME family service and parent & child rooms* (*available in select locations)


At Zalando Media Solutions we create meaningful connections between brands and consumers through a deep understanding of their personalities. We work with brands, agencies and publishers from the fashion world and beyond to produce innovative, impactful campaigns that reach the right audience at the right moment. Thanks to this holistic mix of cutting-edge, privacy-compliant technology, in-depth audience insights, expansive reach and compelling content, brands can run successful campaigns across multiple digital marketing channels.

Delivery Hero SE
  • Berlin, Germany

Are you passionate about food, data and intelligent applications? 


Delivery Hero is building the next generation global online food-delivery platform, with data at the center of delivering amazing food experiences. 


We’re a truly global team, working across 45 countries to ensure our customers are able to find, order and receive their favourite food in the fastest way possible. Since we started our journey in 2011, Delivery Hero has become the world’s largest food-delivery network, and we’re focused on a culture of growth, in both size and opportunities. 


If you’re an enthusiastic, creative problem solver hungry for a new adventure, an exciting job and an international workplace are waiting for you in the heart of Berlin!


Your mission:



  • Oversee the mapping of data sources, data movement, interfaces, and analytics, to ensure data quality, data and feature agility and compliance.

  • Develop and maintain logical and physical data models and assist with the definition of process models.

  • Identify the key facts and dimensions necessary to support the business and its requirements, performing the activities necessary to support the standardization of entities and attributes.

  • Develop entity and attribute descriptions and definitions for the models and facilitate the resolution of inconsistencies and conflicts in data models. 


Required qualifications:



  • MS in Information Technology, Computer Science, Software Engineering, Mathematics or related.

  • Extensive expertise (10+ years) in leading, designing, developing, testing, maintaining, implementing, monitoring, supporting, and documenting data architecture and data modeling (normalized, dimensional, logical, and physical) solutions for Big Data systems, Enterprise Data Warehouses or Enterprise Data Marts.


Your heroic skills:



  • Advanced expertise (8+ years) of data design and modeling on relational systems (Oracle, MS SQL Server, MySQL, ...).

  • Practical knowledge (4+ years) of data design and modeling on NoSQL environments (Cassandra, MongoDB, Redis, HBase and related).

  • Practical knowledge (4+ years) of BigData technologies in a cloud environment (Hadoop, Spark, BigQuery, RedShift, Athena and related).

  • Practical knowledge (4+ years) of formal object, schema, API and ERD modeling languages (e.g. UML, XSD, JsonSchema and similar) and tools (ER Studio, ERwin).

  • Exposure (2+ years) to master data management, data compliance, change management and data quality management in a BigData environment.

  • Experience (2+ years) curating and managing entities and attributes information with highly domain-specific semantics.

  • Familiarity (2+ years) with agile practices and processes (Scrum).


Aptitudes for success:



  • Thrive in complex, fast-moving environments.

  • Strong oral, written and interpersonal communication skills.

  • Able to express complex business and technical cases in a clear and didactic way.

  • Proficient at adapting interactions to participants’ backgrounds and proficiency levels.

  • Comfortable facilitating negotiations and shared understanding between stakeholders.


We offer you:



  • English is our working language, but our colleagues at Delivery Hero come from every corner of the globe, providing an incredibly diverse, international working atmosphere with cross-cultural teams.

  • A modern, recently refurbished office in the heart of Berlin.

  • We offer flexible working hours to fit around your personal or family life, whether you have to drop your kids off at kita or just want to come in a little later.

  • Great career opportunities following our development career plan.

  • Being part of a global family under the Delivery Hero umbrella, we can offer you the safety of a large company including a pension scheme and stability.

  • Moving can be stressful, so to help you settle in we provide applicants from abroad with a relocation package including visa help, temporary accommodation and a budget.

  • Our building has several kitchens where you’ll find fresh fruit, cereals, juice/ drinks, tea, coffee, etc.

  • On nearly every floor you’ll find a kicker or table tennis (or a lounge, and even a nap room) for when you need to take a break.

  • Being a food-ordering company, in between playing and working hard, we provide a generous number of monthly vouchers to use for ordering food on our platforms when you get the munchies.

  • Don’t speak German? No worries, we provide German classes for those expats who want to expedite their integration.

  • We know that time at the office with interesting and diverse problems, colleagues and technologies isn’t the only way to learn, so our employees get an education budget to attend conferences or training courses locally or around Europe to satisfy their pursuit of knowledge.

  • Every Friday we have a team get-together: we stop working a little earlier, grab a beer and relax with colleagues.

  • And sometimes we simply enjoy our time together at team and company events.


At Delivery Hero, we value diversity as a key element of our success. We are an equal opportunity employer, and welcome all regardless of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.


Have we caught your attention? Then please send us your application including cover letter, CV, salary expectations and earliest starting date. We’re looking forward to your application!

Delivery Hero SE
  • Berlin, Germany

Are you passionate about food, data and intelligent applications? 


Delivery Hero is building the next generation global online food-delivery platform, with data at the center of delivering amazing food experiences. 


We’re a truly global team, working across 45 countries to ensure our customers are able to find, order and receive their favourite food in the fastest way possible. Since we started our journey in 2011, Delivery Hero has become the world’s largest food-delivery network, and we’re focused on a culture of growth, in both size and opportunities. 


If you’re an enthusiastic, creative problem solver hungry for a new adventure, an exciting job and an international workplace are waiting for you in the heart of Berlin!


Your mission:



  • Install, configure and maintain data infrastructure and large scale data applications in a cloud environment.

  • Ensure security, compliance, operational resilience, stability and scalability of Big Data and Machine Learning systems.

  • Develop tools and processes to monitor and automate application management and incident response.

  • Build analytics tools to provide actionable insights into application performance, efficiency and quality.

  • Provide methods and practice leadership to engineering teams to improve development agility, security profile and operational resilience of data applications. 


Required qualifications:



  • MS in Information Technology, Telecommunications, Electrical Engineering, Computer Science, Software Engineering or related.

  • Advanced expertise (8+ years) deploying, managing and monitoring complex distributed applications at scale in a cloud environment.


Your heroic skills:



  • Advanced expertise (8+ years) of deployment, configuration and troubleshooting of Linux, Mac, and Windows systems.

  • Advanced expertise (8+ years) using at least one scripting (Python, bash, ...) and one object programming (Java, C#, ...) language.

  • Extensive familiarity (6+ years) with implementing, configuring, deploying, monitoring and ensuring the reliability of applications in a cloud environment.

  • Extensive familiarity (6+ years) with one or more cloud vendor services and management tools (AWS, GCP, Azure, OpenShift).

  • Practical knowledge (4+ years) managing BigData applications handling streaming data at scale (Hadoop, Spark, Kafka, AWS Kinesis, AWS S3, BigQuery).

  • Practical knowledge (4+ years) configuring and deploying serverless and containerized applications (Docker, DCOS, Kubernetes, AWS ECS, AWS Lambda, Google Compute Engine).

  • Practical knowledge (4+ years) deploying and using change management, release management and CI/CD tools and processes (GitHub, Jenkins, Puppet/Chef, Terraform, AWS CodeBuild, AWS CloudFormation).

  • Familiarity (2+ years) with agile practices and processes (Scrum).


Aptitudes for success:



  • Thrives in complex, fast-moving environments.

  • Strong oral, written and interpersonal communication skills.

  • Strong leadership skills with the ability to work effectively in cross-team collaborations.

  • Strong technical aptitude with an intense desire to learn new skills and industry trends.

  • Passionate about adopting data-driven approaches to ensure efficiency, reliability and agility.


We offer you:



  • English is our working language, but our colleagues at Delivery Hero come from every corner of the globe, providing an incredibly diverse, international working atmosphere with cross-cultural teams.

  • A modern, recently refurbished office in the heart of Berlin.

  • We offer flexible working hours to fit around your personal or family life, whether you have to drop your kids off at kita or just want to come in a little later.

  • Great career opportunities following our development career plan.

  • Being part of a global family under the Delivery Hero umbrella, we can offer you the safety of a large company including a pension scheme and stability.

  • Moving can be stressful, so to help you settle in we provide applicants from abroad with a relocation package including visa help, temporary accommodation and a budget.

  • Our building has several kitchens where you’ll find fresh fruit, cereals, juice/ drinks, tea, coffee, etc.

  • On nearly every floor you’ll find a kicker or table tennis (or a lounge, and even a nap room) for when you need to take a break.

  • Being a food-ordering company, in between playing and working hard, we provide a generous number of monthly vouchers to use for ordering food on our platforms when you get the munchies.

  • Don’t speak German? No worries, we provide German classes for those expats who want to expedite their integration.

  • We know that time at the office with interesting and diverse problems, colleagues and technologies isn’t the only way to learn, so our employees get an education budget to attend conferences or training courses locally or around Europe to satisfy their pursuit of knowledge.

  • Every Friday we have a team get-together: we stop working a little earlier, grab a beer and relax with colleagues.

  • And sometimes we simply enjoy our time together at team and company events.


At Delivery Hero, we value diversity as a key element of our success. We are an equal opportunity employer, and welcome all regardless of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.


Have we caught your attention? Then please send us your application including cover letter, CV, salary expectations and earliest starting date. We’re looking forward to your application!