OnlyDataJobs.com

Data Analytics Consultant at TalentBridge (Minneapolis, MN)

  • Minneapolis, MN


Job Requirements:

    • 4+ years of strong SQL or SAS analytics (not just ETL) used for data mining and root-cause analysis (see the sketch after this list)
    • Experience querying databases via SQL
    • Deep business systems and database knowledge to perform advanced data validation
    • 6+ years of experience with statistical/data modeling and forecasting methods
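
As a minimal illustration of the SQL-driven root-cause analysis named in the first requirement (hedged: the `claims` table, columns, and data below are hypothetical, and an in-memory SQLite database stands in for a real warehouse):

```python
# Sketch: drilling from a symptom (error volume) to a candidate root cause
# (region/error-code mix). Table, columns, and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (id INTEGER, region TEXT, error_code TEXT, amount REAL);
    INSERT INTO claims VALUES
        (1, 'MN', 'E42', 120.0), (2, 'MN', 'E42', 80.0),
        (3, 'WI', 'OK',  50.0), (4, 'MN', 'OK',  75.0);
""")

query = """
    SELECT region, error_code, COUNT(*) AS n, SUM(amount) AS exposure
    FROM claims
    WHERE error_code != 'OK'
    GROUP BY region, error_code
    ORDER BY n DESC;
"""
for row in conn.execute(query):
    print(row)  # ('MN', 'E42', 2, 200.0)
```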

Required Qualifications

6+ years of experience in one or a combination of the following: reporting, analytics, or modeling; or a master's degree or higher in a quantitative field such as applied math, statistics, engineering, physics, accounting, finance, economics, econometrics, computer science, or business/social and behavioral sciences with a quantitative emphasis, plus 4+ years of experience in one or a combination of the following: reporting, analytics, or modeling


Desired Qualifications

Extensive knowledge and understanding of research and analysis

Strong analytical skills with high attention to detail and accuracy

Excellent verbal, written, and interpersonal communication skills with the ability to negotiate actions and timelines with Team Members in other departments

Ability to work in ambiguous situations while meeting aggressive timelines

Willingness to quickly and independently learn new processes and systems

Ability to translate complex technical needs into straightforward requests for information and collaboration, working with a wide range of people

Willingness to ask questions in potentially difficult situations

4+ years of experience with SAS or SQL, or other data management, reporting and query tools

Intermediate Microsoft Office (Word, Excel, Outlook, and PowerPoint) skills

  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

As part of the Big Data ecosystem, the Comcast dx team defines and executes on data strategy to realize the promise of "data to the people." The Solution Manager plays a critical role in this effort by linking our customers' needs to the data ecosystem, both within the dx team and across the larger Comcast organization.

As a Solution Manager you:

-Lead client engagement, data discovery/analysis and business/process modeling efforts for the dx team.

-Operate across a number of technical domains with a focus in a primary area such as Product or Network Quality.

-Are a naturally curious problem solver with a passion for efficiency and execution excellence.

-Have performed in an analytical or technical role previously and have a strong understanding of the analytical workflow from requirements/data discovery through analysis to operationalization.

-Understand that to be successful in this role requires presence and confidence, and you have the ability to drive a team forward in the face of ambiguity and competing priorities.

-Understand the fundamental role that data plays in the competitive landscape and demonstrate a passion for data excellence.

-Embrace collaboration as a central tenet of success and understand the critical need to build trusting bonds with both our key stakeholders and delivery teams.

-Partner effectively with Solution Engineers, Architects, Product Owners and Tech Leads to define and scope work into delivery roadmaps.

-Ensure traceability and alignment of execution to critical priorities. Drive translation of business requirements into solution intent. This includes requirements identification and clarification, definition of project goals and objectives, scoping, estimation, and risk assessment.

-Anticipate needs, operate with a sense of urgency, and have the ability to adapt to change quickly.

-Fill resource gaps with hands-on work as needed.

-Could have written this job description better than we did :)

Qualifications:

-Bachelor's Degree (Advanced Degree preferred) in engineering, mathematics, computer science, statistics, physics, economics, operations research or a related field; graduate study extremely helpful.

-Minimum of 7 years of Tech Lead / Product Management / Project Management / Consulting experience, preferably in Data Warehousing, Big Data and Analytics.

-Experience with customers: managing consultant utilization, milestone success, setting and managing expectations, controlling outcomes and resolving customer issues.

-Understanding of data warehousing technologies and their evolution, including relational databases (SQL, Oracle) and big data technology (Cloud, AWS, Hadoop), business intelligence and analytical tools (Tableau, Python, R), architectural strategies, and previous hands-on development and engineering (SDLC and Agile).

-Strong communication, presentation and meeting facilitation skills. Ability to positively represent yourself and the team.

-Must be a team-player and be able to work closely with technical teams and business users.

-Capable of building strong relationships with leaders across enterprise for both sources of data and consumers.

-The ability to travel (approximately 25%) and to attend in-person client meetings.

-Expert in MS Office.

Comcast is an EOE/Veterans/Disabled/LGBT employer

  • West Chester, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

Position Summary

The dx Team is responsible for the Data Engineering part of Comcast. One of its major goals is to harmonize the data ingestion and consumption layer across Comcast. Creating enterprise data sources that serve as a single version of truth is one such goal.

With moderate guidance, the Big Data Software Developer will develop (code/program), test, and debug ETL (Extract/Transform/Load) processes to answer technically challenging business requirements (complex transformations, high data volume). All work must be documented as part of release management.
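
A minimal sketch of that extract/transform/load cycle in PySpark; the paths, fields, and aggregation are illustrative assumptions, not the team's actual pipeline:

```python
# Minimal ETL sketch in PySpark. All paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read a raw daily drop of JSON events.
raw = spark.read.json("hdfs:///data/raw/events/2020-01-01/")

# Transform: normalize types, drop bad rows, aggregate at volume.
clean = (
    raw.withColumn("event_ts", F.to_timestamp("event_time"))
       .filter(F.col("account_id").isNotNull())
       .groupBy("account_id", F.to_date("event_ts").alias("day"))
       .agg(F.count("*").alias("events"),
            F.countDistinct("device_id").alias("devices"))
)

# Load: write a partitioned, query-friendly table for downstream consumers.
clean.write.mode("overwrite").partitionBy("day") \
     .parquet("hdfs:///data/curated/daily_activity/")
```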

Employees at all levels are expected to:

-Understand our Operating Principles; make them the guidelines for how you do your job

-Own the customer experience-think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services

-Know your stuff-be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences

-Win as a team-make big things happen by working together and being open to new ideas

-Be an active part of the Net Promoter System-a way of working that brings more employee and customer feedback into the company-by joining huddles, making call backs and helping us elevate opportunities to do better for our customers

-Drive results and growth

-Respect and promote inclusion and diversity

-Do what's right for each other, our customers, investors and our communities

Core Responsibilities

-Play a key role as a senior-level engineer by implementing solid, robust, extensible designs that support key business flows.

-Analyzes and determines data integration needs.

-Evaluates and plans software designs, test results and technical manuals using the Big Data (Hadoop) ecosystem

-Build and maintain optimized ETL solutions to process/load source-system data into Hadoop using Sqoop or microservices leveraging Hadoop tools.

-Reviews literature and current practices relevant to the solution of assigned projects in the Data Warehousing / Data Lake and Reporting areas

-Programs new software using Spark, Scala, Kafka, Sqoop, SQL

-Supports existing and new applications and customization of current applications

-Edits and reviews technical requirements documentation

-Displays knowledge of software engineering methodologies, concepts, skills and their application in the area of specified engineering specialty (like Data warehousing)

-Displays knowledge of, and ability to apply, process software design and redesign skills

-Displays in-depth knowledge of, and ability to apply, project management skills

-Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input, and keeps supervisor informed as required

-Consistent exercise of independent judgment and discretion in matters of significance

-Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) as necessary

-Other duties and responsibilities as assigned

Education Level

Bachelor's Degree or Equivalent

Years of Experience

2-5 years of experience working as a data integration developer, SQL developer, ETL developer, or Java/C# developer, or related experience, is required.

Field of Study

Computer Science, Engineering

Compliance

Comcast is an EEO/AA/Drug Free Workplace

Disclaimer

The above information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications

Additional Information

Daily Responsibilities:

Technologies used day to day

Linux, Hadoop, Spark, UC4, SQL, Linux Shell Scripting, BI reporting tools

Business Units - what group(s) does the role support?

dx and dx business partner initiatives

Paired programming vs. individual tasks - what does that look like?

40% paired programming / 60% individual tasks

Business Purpose

Describe the core impact of this role and the team - project details

Ingest data from various data sources to create a harmonized layer of data

The dx Team is responsible for Data Engineering at Comcast; one of its major goals is to harmonize the data ingestion and consumption layer across Comcast.

Comcast is an EOE/Veterans/Disabled/LGBT employer

  • Água Branca, Brazil

We are in the process of migrating our monolithic application landscape to a microservice architecture.
We have also transitioned from a firefighting operations mode to an innovation and DevOps culture.
In this process, a new R&D area emerged, consisting of a small team of research and software engineers.


What we are doing



  • Building algorithms that help us with anticipatory shipping and purchasing forecasts and protect us against system failure (see the sketch after this list).

  • Using image recognition to give our users the highest possible convenience and the coolest features.

  • Using state-of-the-art game engines to build virtual reality into our customer experience.

  • Optimizing product search and building data-consistency monitoring.

  • Helping to build a large-scale architecture together with the entire IT team.
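
As a toy illustration of the forecasting bullet above, a sketch assuming a hypothetical weekly demand series and simple exponential smoothing; the production systems are certainly richer:

```python
# Toy purchasing-forecast sketch: simple exponential smoothing over a
# hypothetical weekly demand series. Illustrative only.
def ses_forecast(series, alpha=0.3):
    """One-step-ahead forecast: the smoothed value predicts next period."""
    forecast = series[0]
    for demand in series[1:]:
        forecast = alpha * demand + (1 - alpha) * forecast
    return forecast

weekly_units_sold = [120, 135, 128, 150, 149, 160]
print(f"suggested order quantity: {ses_forecast(weekly_units_sold):.0f}")
```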


The output will be nothing less than transforming the way e-commerce works and providing sustainable solutions.


What we are looking for


3+ years of professional or research experience



  • Comprehensive knowledge in statistics, probability and machine learning.

  • Passion for solving standard and non-standard mathematical problems.

  • Ability to explain data science insights to people with and without a quantitative background.

  • Fluency in R, Python, or Julia.

  • Hands-on/advanced experience with SQL.

  • Experience in data visualization.

  • Experience using a Big Data tool stack (e.g. Hadoop, Hive, Spark, Cassandra, HBase, or other non-relational DBs) is a plus.

  • West Chester, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

Job Responsibilities

Position Summary

This is a senior position within the SE Data Service organization, reporting to the Director of Data Experience. The DX group is responsible for Data Services at Comcast; one of its major goals is to harmonize the data ingestion and consumption layer across Comcast.

It creates the enterprise data sources (EDW, ODS, and transactional data sources) that serve as a single version of truth for data analysis across the Comcast business.

The position is especially responsible for working across multiple solution architects, business partners, programs, and projects to harmonize data platforms, processes, integrations, and data assets within Comcast.

Collaborate and partner with technical and business teams to present and define strategies, requirements, road maps, budgets, and solutions. Prepare and present gap analyses for technology and business solutions up to the VP level.

Manage and support all project-related architecture, design, development, and deployment of data-oriented integration across platforms and projects in a matrix organization.

The Data Integration and Solution Architect must have a strong understanding of, and hands-on working knowledge of, the following software components: Linux, Shell Scripting, Informatica, Teradata, SQL, BTEQ, Hadoop Hive, Pig, Flume, Sqoop, Spark, Storm, Kafka, Accumulo, HBase, Java and UC4.
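
As a hedged illustration of the ingestion work described above: a PySpark JDBC extract into HDFS, analogous in spirit to a Sqoop import. The Teradata URL, table, credentials, and paths are hypothetical, and a Teradata JDBC driver would need to be on the classpath:

```python
# Sketch: parallel extract of a relational table into HDFS via Spark's JDBC
# source. Connection string, table, and paths are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("ingest-sketch").getOrCreate()

orders = (
    spark.read.format("jdbc")
         .option("url", "jdbc:teradata://dw-host/DATABASE=sales")
         .option("dbtable", "orders")
         .option("user", "etl_user")
         .option("password", "***")
         .option("numPartitions", 8)             # parallel extract
         .option("partitionColumn", "order_id")  # split reads across workers
         .option("lowerBound", 1)
         .option("upperBound", 10_000_000)
         .load()
)

orders.write.mode("append").parquet("hdfs:///lake/sales/orders/")
```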

-Provides direction for diverse and complex initiatives and is accountable for a variety of tasks to architect and deliver data warehousing solutions that exceed customer expectations in content, usability, accuracy, reliability and performance while assuming a leading role within agile teams (both on-shore and off-shore)

-Interprets business strategy and develops organizational objectives to align with this strategy. Typically manages multiple teams of professionals.

-Development of End to End ETL data integration and solution architecture

-Work across technical and business teams to harmonize data assets

-Experience with SQL & BTEQ scripting: strong data management, data analysis, and performance tuning required.

-Experience with Hadoop Hive, Pig, Flume, Sqoop, Storm, Spark, Kafka, Accumulo, and HBase

-Experience with AWS services (EMR, Kinesis, S3, redshift, EC2)

-Candidates may have worked in one or more roles, e.g. data architect, data modeler, data integration developer/architect, or ETL developer/architect.

-Experience as a solution architect designing, developing and implementing ODS, EDW, data integration layers, etc.

-Experience with Presto query engine

-Creation of methods and procedures related to operations and administration activities.

-Managing and coordinating the DevOps process.

-Conduct operations-readiness and environment-compatibility reviews of any changes prior to deployment.

-Experience managing releases and process details

-Knowledge in data warehousing methodologies and best practices required.

-Strong verbal and written communication skills required.

-Effective interpersonal relations skills, ability to effectively collaborate with others and work as part of a team required.

-Skills in navigating a large organization in order to accomplish results required.

-Ability to initiate and follow through on complex projects of both short and long term duration required.

-Excellent organizational and time management skills required.

-Excellent analytical and problem solving skills required.

-Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input, and keeps supervisor informed as required.

-Participate on interdepartmental teams to support organizational goals

-Perform other related duties and tasks as assigned

-Punctual, regular, and consistent attendance

Required Skills/Experience:

Advanced degree in a technical discipline and/or management required. 10+ years of experience with data integration, Data Warehouse, and ETL architecture.

-Experience with Teradata and Hadoop Ecosystems.

-Experience working across multiple solution architects, business partners, programs, and projects to harmonize data platforms, processes, integrations, and data assets within Comcast.

-5+ years of experience working as a data integration architect, data solution architect (EDW, ODS), ETL architect, or in a similar role required.

-Five to seven years of experience leading data integration and development of ETL architecture using Linux, Informatica, Teradata, SQL, BTEQ, Hadoop Hive, Pig, Spark, Flume, Sqoop, and UC4

-Experience with AWS services (EMR, Kinesis, S3, redshift, EC2)

-Experience working as a solution architect supporting a Platform-as-a-Service organization.

-Experience with Presto query engine

-Requires understanding of the complete SDLC and experience with continuous integration, test-driven/behavior-driven development, and agile/scrum development methodologies

-Experience collaborating and partnering with technical and business teams to present and define strategies, requirements, road maps, budgets, and solutions.

-Experience presenting strategies, road maps, and gap analyses for technology and business solutions up to the VP level

-Managing and coordinating the DevOps process.

-Experience designing logical and physical architectures

-Experience managing teams of senior technologists

-Experience working in an Agile development methodology in a data warehouse environment

-Ability to work effectively across organizational boundaries

-Excellent oral, written, analytical, problem solving, and presentation skills

-Manage and coordinate 1-10 matrix resources

-Experience with managed services and on-shore and off-shore development is a must

Desired Skills/ Experience

-Telecommunications experience: knowledge of telecommunication/cable billing and customer care systems, e.g. DST, CSG, Amdocs, etc.

-Knowledge of NoSQL platforms

-Hadoop, Teradata, TOGAF, or AWS certifications

Comcast is an EEO/AA/Drug-Free Workplace

Comcast is an EOE/Veterans/Disabled/LGBT employer

  • Hamburg, Deutschland

System Administration | Hamburg | Full-Time




Are you a data analyst/scientist who wants to develop a deep understanding of the infrastructure behind Big Data systems? We are offering a cross-functional position where you can use your existing data processing skills and become the driving force behind the operation and further development of our data infrastructure.

You will be part of our System Administration department and work very closely with our data engineers and scientists to design and implement architecture changes for our Big Data systems. Our experienced system administrators will introduce you to our infrastructure. Once you are familiar with our data ecosystem, Puppet setup, Serveradmin and our operational structure, you will take over responsibility for our Big Data infrastructure step by step.

With our data architecture we are processing more than 1,000,000,000 game interaction events per day. Our Hadoop cluster runs on 1,500 CPUs, 5 TB of memory, and 1 PB of disk space.



Your profile:



  • Several years of professional experience as a Data Analyst, Scientist or Engineer

  • You understand distributed systems (e.g. CAP, Kappa, Lambda) and the interactions happening in the background (e.g. rebalancing)

  • You know stream processing and data integration/processing (ETL), and you can analyze and optimize SQL queries (a minimal stream-processing sketch follows this list)

  • Professional experience in Java and a scripting language, ideally Python

  • Open and friendly communication style and very good English skills

  • You know your way around a Linux command line

  • Knowledge of server hardware and, ideally, a configuration management tool (like Puppet, Chef or Ansible) would be a huge plus
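
To make the stream-processing point concrete, a minimal Python sketch (illustrative only, since our own applications are in Java, PHP and Go): a Kafka consumer keeping a running count per event type. Topic, broker, and field names are made up:

```python
# Minimal stream-processing sketch with kafka-python (pip install kafka-python).
# Topic name, broker address, and event fields are hypothetical.
import json
from collections import Counter

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "game-interactions",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

counts = Counter()
for message in consumer:
    event = message.value                   # e.g. {"type": "battle", ...}
    counts[event.get("type", "unknown")] += 1
    if sum(counts.values()) % 10_000 == 0:  # cheap periodic progress report
        print(counts.most_common(5))
```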



Your mission:



  • Work together with our Data Engineers and Data Scientists

  • Discuss and implement new technologies

  • Maintain the systems, update, monitor, and debug them

  • Automate and improve the environment



Our technologies:



  • The Hadoop ecosystem (HDFS, Hive, Impala, Spark)

  • Stream processing (Kafka, Flink)

  • Custom data applications in Java, PHP and Go

  • Jenkins for job scheduling and build processes

  • Debian and Puppet for configuration management

  • Supported by Nagios, Graphite, Grafana and Serveradmin



Why join us?



  • Be part of a great team in an international environment in a healthy and stable growing company

  • Choose your preferred device (Linux, Mac or even Windows) for your comfortable workplace

  • We will actively support your further development and give you all the resources you need to evaluate new technologies, participate in open source communities or improve your soft skills

  • Competitive compensation and an atmosphere to empower creative thinking and strong results

  • Exceptional benefits ranging from flawless relocation support to company gym, smartphone or tablet of your own choice for personal use, roof terrace with BBQ and much more



Excited to start your journey with InnoGames and join our dynamic team as a Data Engineer Base Technologies (for DevOps)? We look forward to receiving your application as well as your salary expectations and earliest possible start date through our online application form. Isabella Dettlaff would be happy to answer any questions you may have.

InnoGames, based in Hamburg, is one of the leading developers and publishers of online games with more than 200 million registered players around the world. Currently, more than 400 people from 30 nations are working in the Hamburg-based headquarters. We have been characterized by dynamic growth ever since the company was founded in 2007. In order to further expand our success and to realize new projects, we are constantly looking for young talents, experienced professionals, and creative thinkers.





Isabella


Talent Acquisition Manager




Phone +494078893350

  • London Borough of Ealing, UK

Cognizant Big Data Practice:


The Big Data service line is part of the Cognizant Digital Business / Artificial Intelligence and Analytics business unit. It is responsible for delivering digital data platforms and smart data lakes on premise and in the cloud, using different Hadoop distributions, NoSQL technologies, and data virtualization products available in the market, and it has built a proprietary data platform to deliver end-to-end data services, automation, and framework-driven data processing modules. We are a platinum/gold partner for the majority of platform vendors, which gives us seamless access to product upgrades, discussions, events, etc.


The members of the team work in a variety of industries and use a diverse set of tools to process complex data from IoT, social data, and larger operational systems such as SAP, Salesforce, and mainframes. They access a range of data stored in disparate systems, integrate it, and provide it in the formats needed to perform data mining, answering specific business questions as well as identifying unknown trends and relationships in the data.


A few snippets:



  • We are one of the fastest-growing and highest revenue-generating service lines within Cognizant

  • 170+ Big Data engagements across the globe, with over 2,000 Big Data consultants with expertise in various industry products & distributions

  • A repository of 170+ use cases with solutions for different Digital & IoT scenarios

  • Rated as a Leader in Everest’s PEAK Matrix “Big Data assessment” for the RCGTH, HC, Banking and Insurance domains

  • Rated a “Leader” in the Gartner Magic Quadrant for Data & Business Analytics Services, Worldwide


As a senior technologist, you will be part of the strategic drive, leading multi-discipline teams through the full delivery lifecycle of complex data products and pipelines with a clear understanding of large-scale data processing and data science solutions. You will deliver use cases for one of our customers, developing practical solutions and implementing them to give this customer a competitive edge within the enterprise. You will have breadth of experience in both engineering and architecture across technology disciplines and the unique challenges of each, including software development, automated test and quality assurance, data, and integration.


Your role


Delivering high-quality, innovative, data-driven solutions for our client’s advanced analytical needs. Working within a team on the client’s site, this will include understanding the client’s pain points, designing an analytical approach, implementing a solution, ensuring it is of high quality, and leading and mentoring multi-discipline technology teams.


Key Responsibilities:



  • Work at a client site as a member of an experienced onsite consulting/delivery team, developing & providing architectures for a highly resilient, fault-tolerant, horizontally scalable data platform.

  • Build multiple layers within the platform around data ingestion, integration, registration, metadata management, error handling & auditing, and data provisioning.

  • Working with Architecture to refine ideas on the analytics architecture and to help with standards and guidelines.

  • Building a Big Data engineering community and helping customers/clients build a centre of excellence for engineering.

  • Working closely with the Lead Software Engineers in your area of responsibility to champion the core tenets of engineering at the client

  • Ensuring development teams deliver the highest quality data products through the adoption and continuous improvement of patterns and practices, tools and frameworks and processes. 

  • Applying years of experience and expertise to help teams resolve and overcome technical challenges of any size or complexity and assist where necessary to accelerate problem resolution.


Job Requirements / Essential Skills:



  • Proven, strong data processing skill set with experience in Hadoop tools and techniques, for example (not exhaustive):

  • Spark processing, Streaming and performance tuning

  • Kafka real-time messaging

  • HBase modelling and development

  • HDFS file formats and partitioning, e.g. Parquet, Avro, etc. (see the partitioned-write sketch after this list)

  • Impala/Hive

  • Unix Shell Scripting

  • Proficiency in Scala

  • Working proficiency in developmental toolsets like Eclipse, IntelliJ

  • Exposure/competence with Agile Development approach

  • Git & Continuous integration tools such as Jenkins, TeamCity

  • Multi-threaded, OOP programming

  • Jenkins/Maven

  • FindBugs, Sonar, JUnit, performance, memory management

  • Strong experience in delivering Data Solutions using Big-Data technologies and cloud platforms.

  • Strong work experience in Azure Cloud platform

  • Expert in at least one of Java, Scala, Python with knowledge of the others

  • Experience in delivering big data solutions using a leading Hadoop distribution like Hortonworks, Cloudera or MapR

  • Knowledge of RDBMS, ETL and Data warehouse technologies 

  • Testing frameworks like JUnit and ScalaTest; mock testing
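
As a sketch of the file-format and partitioning item above: writing a dataset partitioned by date as Parquet and Avro with PySpark, then reading it back with a partition predicate so only the matching directories are scanned. Paths and columns are hypothetical; the Avro source is built into Spark 2.4+ (earlier versions need the spark-avro package):

```python
# Sketch: partitioned writes in Parquet/Avro and a partition-pruned read.
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("formats-sketch").getOrCreate()

df = spark.createDataFrame(
    [("2020-01-01", "click", 3), ("2020-01-02", "view", 7)],
    ["dt", "action", "n"],
)

# Partition by date so consumers can prune directories they don't need.
df.write.mode("overwrite").partitionBy("dt").parquet("hdfs:///lake/events_parquet")
df.write.mode("overwrite").partitionBy("dt").format("avro").save("hdfs:///lake/events_avro")

# A predicate on the partition column only touches dt=2020-01-02 files.
one_day = (spark.read.parquet("hdfs:///lake/events_parquet")
                .where(F.col("dt") == "2020-01-02"))
one_day.show()
```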

Nice to Have Skills:

  • Knowledge and experience in delivering real time solutions on sinks like Cassandra, MongoDB or equivalent NoSQL databases

  • Container technologies like Docker

  • Knowledge of API and Microservice architectures 

  • Knowledge of advanced analytics and insights techniques (e.g. predictive analytics, machine learning, segmentation)

  • Knowledge of deep learning frameworks like TensorFlow, Keras

  • Knowledge of machine learning libraries like MLLIB, sklearn

  • Strong domain experience in the retail/e-commerce and Communications & Technology verticals.


Qualifications: University degree with a specialization in Computer Science or Mathematics

  • Milpitas, CA

We are looking for experienced Big Data engineers to add to our team in Milpitas, CA. In this position, you will be working with one of the top 10 US retail stores with a high-priority demand for building continuous integration, delivery, and quality control processes.


Responsibilities:



  • Participate in design and development of Big Data analytical applications

  • Design, support, and continuously enhance the project code base, continuous integration pipeline, etc.

  • Write complex ETL processes and frameworks for analytics and data management

  • Implement large-scale real-time streaming data processing pipelines

  • Work inside a team of industry experts on cutting-edge Big Data technologies to develop solutions for deployment at a massive scale


Requirements:



  • Hadoop v2, MapReduce, HDFS on Google Cloud

  • Building stream-processing systems using solutions such as Storm or Spark Streaming (a minimal sketch follows this list)

  • Big Data querying tools, such as Pig, Hive, and Impala

  • Integration of data from multiple data sources

  • Spark, Scala, Java, Python, Bash, BigQuery, Azkaban, Airflow, and Dataflow

  • NoSQL databases, such as HBase, Cassandra, MongoDB

  • Messaging systems: Kafka, RabbitMQ, etc.
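
A minimal sketch of the stream-processing requirement, using Spark Structured Streaming over a Kafka source (requires the spark-sql-kafka connector package); the broker, topic, and event schema are assumptions:

```python
# Minimal Spark Structured Streaming sketch: Kafka source -> windowed counts.
# Broker address, topic, and JSON schema are assumptions for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("stream-sketch").getOrCreate()

schema = StructType([
    StructField("sku", StringType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "store-events")
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

# 5-minute tumbling-window counts per SKU, written to the console for demo.
counts = events.groupBy(F.window("event_time", "5 minutes"), "sku").count()
query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```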


What will be a plus:



  • Knowledge of Unix-based operating systems (bash/ssh/ps/grep etc.)

  • Experience with Github-based development processes

  • Experience with JVM build systems (SBT, Maven, Gradle)


What we offer:



  • Work in the Bay Area with terrific customers on large, innovative projects.

  • The high-energy atmosphere of an exponentially and successfully growing company.

  • An attractive compensation package with generous benefits (medical, dental, vision, and life)

  • 401(k) and Section 125 pre-tax offerings (POP and FSA plans).

  • Amsterdam, Netherlands

Enterprise Data Scientist



Are you ready to crunch any data to generate business value? Do you know when to use ANOVA and when to use the Kruskal-Wallis test? Are you fluent in Python, and do you know what it takes to produce high-quality code for analytics? Have you been working in a multidisciplinary environment using agile methodology? With this skillset, you are an ideal fit for our Enterprise Analytics and Data Management department, and we would like to hear from you!
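
That ANOVA/Kruskal-Wallis question has a concrete answer worth sketching: one-way ANOVA assumes roughly normal groups with similar variances, while Kruskal-Wallis is the rank-based alternative when those assumptions fail. A minimal scipy illustration with made-up samples:

```python
# When to use which: one-way ANOVA assumes (roughly) normal groups with
# similar variances; Kruskal-Wallis is the rank-based fallback otherwise.
from scipy import stats

group_a = [12.1, 11.8, 12.6, 12.0, 11.9]  # made-up samples
group_b = [12.9, 13.1, 12.7, 13.4, 12.8]
group_c = [11.5, 11.9, 11.2, 11.7, 11.6]

# Screen each group for normality first (Shapiro-Wilk p-value).
normal = all(stats.shapiro(g)[1] > 0.05 for g in (group_a, group_b, group_c))

if normal:
    stat, p = stats.f_oneway(group_a, group_b, group_c)  # parametric
else:
    stat, p = stats.kruskal(group_a, group_b, group_c)   # non-parametric
print(f"statistic={stat:.2f}, p-value={p:.4f}")
```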


What do we do?


We invent and implement algorithms and design and build analytical pipelines, all leveraging domains like statistics, machine learning, information and graph theory, data mining, optimization, complex network analysis, etc. We work in international teams with huge cultural diversity, and we interact not only with fellow data scientists but also with data engineers, DevOps, business owners, data architects, security and privacy experts, etc. We mostly write our analytical code in Python, but we're not afraid of R, Java or even Scala. Last but not least, we are strong supporters of Open Source.


Whom are we looking for?


As an ideal candidate, you:


·have an MSc or a PhD degree in Applied Mathematics, Computer Science, Physics, Electrical Engineering or a related field.


·have a proven track record of putting data science to work in domains such as FMCG, Finance & Banking, Telco, Transportation, etc.


·have a proven track record of hiring, training and retaining data science talents.


·know how to make your research reproducible.


·have a solid understanding of data structures and algorithms, specifically advanced statistics and machine learning.


·are experienced with the commonly used Big Data technology stack, such as the Hadoop Data Platform, Spark, Solr and Zeppelin, to name a few.


·know how to create stunning visualizations of data insights and how to deliver thrilling presentations to a non-tech-savvy audience.


·have experience working in multidisciplinary teams.


·have SCRUM certification or equivalent experience.


·know what GitFlow and Pull Requests are and how to use them.


·know what continuous delivery and continuous deployment are and how data scientists can benefit from them.


What awaits you?


Within the Enterprise Analytics and Data Management global function, you can use your talents to tackle enterprise-wide challenges and help shape our smoke-free future. As you move up the career ladder, or if you would like to explore new business areas, we support your professional and personal development through continuous training, conference participation, mentorship, etc. We also provide you with opportunities to put your ideas into practice across all levels of the company.


In return we offer:
•Truly dynamic worldwide team dedicated to a bold new vision
•Amazing career opportunities across the globe
•Working for a certified Global Top Employer and multi-awarded company (https://www.pmi.com/careers/our-awards)


About Philip Morris International



PMI is the world’s leading international tobacco company, with six of the world's top 15 international brands and products sold in more than 180 markets. In addition to the manufacture and sale of cigarettes, including the number one global cigarette brand, and other tobacco products, PMI is engaged in the development and commercialization of Reduced-Risk Products (“RRPs”).


RRPs is the term PMI uses to refer to products with the potential to reduce individual risk and population harm in comparison to smoking cigarettes. Through multidisciplinary capabilities in product development, state-of-the-art facilities, and industry-leading scientific substantiation, PMI aims to provide an RRP portfolio that meets a broad spectrum of adult smoker preferences and rigorous regulatory requirements. For more information, see www.pmi.com and www.pmiscience.com.



Benefits


 Performance Bonus
 Flexible Working Opportunities
 Professional Development & Tuition Assistance
 Equity Incentive Plan & Employee Stock Purchase Plan
 Retirement Plan
 Work From Home
 Insurance, Heath & Wellness Benefits
 Maternity & Paternity Leave, Adoption Assistance

  • London, UK

Risk Quant Developer


The Role


As a Risk Quant Developer at AHL you will be working hand-in-hand with our risk managers and researchers. Your challenges will be varied, and will involve implementing new risk models, building APIs and tools to allow flexible risk analysis, supporting and expanding existing tools, and building a new, scalable risk data infrastructure.
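
As a toy example of the risk-model side of the role (illustrative only, not AHL's actual models): one-day historical value-at-risk over a simulated return series, using the scientific Python stack the team works in:

```python
# Toy risk-model sketch: one-day historical Value-at-Risk (VaR) from a
# simulated return series. Purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
daily_returns = rng.normal(0.0005, 0.01, size=1000)  # simulated P&L history

def historical_var(returns, confidence=0.99):
    """Loss threshold exceeded on (1 - confidence) of historical days."""
    return -np.percentile(returns, (1 - confidence) * 100)

print(f"99% 1-day VaR: {historical_var(daily_returns):.4%}")
```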


The Team 


Quant Developers at AHL are all part of our broader technology team, members of a group of over sixty individuals representing eighteen nationalities. We have varied backgrounds including Computer Science, Mathematics, Physics, Engineering, even Classics - but what unifies us is a passion for technology and writing high-quality code.


Our developers are organised into small cross-functional teams, with our engineering roles broadly of two kinds: "Quant Platform Developers" otherwise known as our "Core Techs", and "Quant Developers" who are closely aligned with a particular asset class or market sector or business function. People often rotate teams in order to learn more about our system, as well as find the position that best matches their interests.


Our Technology


Our systems are almost all running on Linux and most of our code is in Python, with the full scientific stack: numpy, scipy, pandas, scikit-learn to name a few of the libraries we use extensively. We implement the systems that require the highest data throughput in Java. For storage, we rely heavily on MongoDB and Oracle.


We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log shipping and monitoring, Docker for containerisation, OpenStack for our private cloud, Ansible for architecture automation, and HipChat for internal communication. But our technology list is never static: we constantly evaluate new tools and libraries.


Working Here


AHL has a small-company, no-attitude feel. It is flat-structured, open, transparent and collaborative, and you will have plenty of opportunity to grow and have enormous impact on what we do. We are actively engaged with the broader technology community.



  • We host and sponsor London’s PyData and Machine Learning Meetups

  • We open-source some of our technology. See https://github.com/manahl

  • We regularly talk at leading industry conferences, and tweet about relevant technology and how we’re using it. See @manahltech


We’re fortunate enough to have a fantastic open-plan office overlooking the River Thames, and continually strive to make our environment a great place in which to work.



  • We organise regular social events, everything from photography through climbing, karting, wine tasting and monthly team lunches

  • We have annual away days and off-sites for the whole team

  • We have a canteen with a daily allowance for breakfast and lunch, and an on-site bar for the evening

  • As well as PCs and Macs, in our office you’ll also find numerous pieces of cool tech such as light cubes and 3D printers, guitars, ping-pong and table football, and a piano.


We offer competitive compensation, a generous holiday allowance, various health and other flexible benefits. We are also committed to continuous learning and development via coaching, mentoring, regular conference attendance and sponsoring academic and professional qualifications.


Technology and Business Skills


At AHL we strive to hire only the brightest, best, most highly skilled and passionate technologists.


Essential



  • Exceptional technology skills; recognised by your peers as an expert in your domain

  • A keen interest and understanding of financial markets and instruments

  • A proponent of strong collaborative software engineering techniques and methods: agile development, continuous integration, code review, unit testing, refactoring and related approaches

  • Strong knowledge of Python

  • Proficient on Linux platforms with knowledge of various scripting languages

  • Experience of data analysis techniques along with relevant libraries e.g. NumPy/SciPy/Pandas

  • Relevant mathematical knowledge e.g. statistics, asset pricing theory, optimisation algorithms.


Advantageous



  • Experience of front office quantitative software development e.g. in a hedge fund or investment bank

  • Experience of web based development and visualisation technology for portraying large and complex data sets and relationships

  • Experience of machine learning techniques, natural language processing and related libraries and frameworks e.g. scikit-learn, Tensorflow


Personal Attributes



  • Strong academic record and a degree with high mathematical and computing content e.g. Computer Science, Mathematics, Engineering or Physics from a leading university

  • Craftsman-like approach to building software; takes pride in engineering excellence and instils these values in others

  • Demonstrable passion for technology e.g. personal projects, open-source involvement

  • Intellectually robust with a keenly analytic approach to problem solving

  • Self-organised with the ability to effectively manage time across multiple projects and with competing business demands and priorities

  • Focused on delivering value to the business with relentless efforts to improve process

  • Strong interpersonal skills; able to establish and maintain a close working relationship with quantitative researchers, traders and senior business people alike

  • Confident communicator; able to argue a point concisely and deal positively with conflicting views.

  • London, UK

We are a growing team based in Central London, with talented teams working across the globe: China, Mexico and Europe. As we embark on our next phase of growth in the UK and expand to Sub-Saharan Africa, we are looking for talented Data Scientists to help us build, refine and implement scorecards and machine learning models to improve our customer journeys.


You will build models to:



  • Predict our customers’ behaviours

  • Detect fraudulent online applications (see the sketch after this list)

  • Predict which customers are likely to be receptive to an offer for a new loan or an increase in loan amount

  • You will also use NLP to identify opportunities to serve customers better by automating certain types of conversations.
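
As a minimal sketch of, say, the fraud-detection model above: a scikit-learn logistic regression over synthetic application features. Everything here (features, data, metric) is invented for illustration:

```python
# Minimal fraud-scoring sketch: logistic regression on synthetic features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
X = np.column_stack([
    rng.integers(0, 20, n),  # e.g. applications seen from the same device
    rng.uniform(0, 1, n),    # e.g. mismatch score on stated income
])
y = (0.2 * X[:, 0] + 3 * X[:, 1] + rng.normal(0, 1, n) > 3.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
scores = model.predict_proba(X_te)[:, 1]  # probability an application is fraud
print(f"AUC: {roc_auc_score(y_te, scores):.3f}")
```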


Who you are:



  • You have a degree in Computer Science, Statistics, Maths, Physics, or similar

  • You have an understanding of the theory behind machine learning and statistical modelling

  • You have experience building predictive models using either Python or R

  • You are able to proactively identify opportunities where models/computation can replace or redefine real business problems, i.e. you won’t wait for someone to tell you what to model

  • You communicate effectively about complex models to smart but non-technical colleagues

  • You are interested in applying machine learning to improve access to finance