• Phoenix, AZ

Provides management and leadership to the Big Data project teams and directs initiatives in the Freeport-McMoRan Big Data program. Provides analytical direction, expertise and support for the Big Data program, including project leadership for initiatives, coordination with business subject matter experts and travel to mine sites. This is a global role that coordinates with site and corporate stakeholders to ensure global alignment on service and project delivery, and works with business operations management to ensure the program focuses on the areas most beneficial to the company.

  • Work closely with business, engineering and technology teams to develop solutions to data-intensive business problems
  • Supervise internal and external science teams
  • Perform quality control of deliverables
  • Prepare reports and presentations, and communicate with Executives
  • Provide thought leadership in algorithmic and process innovations, and creativity in solving unconventional problems
  • Use statistical and programming tools such as R and Python to analyze data and develop machine-learning models
  • Perform other duties as required

Minimum Qualifications

  • Bachelor's degree in an analytical field (statistics, mathematics, etc.) and eight (8) years of relevant work experience, OR
  • Master's degree in an analytical field (statistics, mathematics, etc.) and six (6) years of relevant work experience
  • Proven track record of collaborating with business partners to translate business problems and needs into data-based analytical solutions
  • Proficient in predictive modeling:
      • Linear and logistic regression
      • Tree-based techniques (CART, Random Forest, Gradient Boosting)
      • Time-series analysis
      • Anomaly detection
      • Survival analysis
  • Strong experience with SQL/Hive environments
  • Skilled with R and/or Python analysis environments
  • Experience with Big Data tools for machine learning (R, Hive, Python)
  • Good communication skills
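As a minimal sketch of the first modeling technique listed (logistic regression), assuming nothing beyond the Python standard library and toy one-dimensional data invented for the example; in practice R's glm or scikit-learn would be used:

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit a 1-D logistic regression y ~ sigmoid(w*x + b) by gradient descent."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted probability
            gw += (p - y) * x                          # gradient wrt w
            gb += (p - y)                              # gradient wrt b
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

def predict(w, b, x):
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

# Toy, linearly separable data: small x -> class 0, large x -> class 1
xs = [0.5, 1.0, 1.5, 3.5, 4.0, 4.5]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(xs, ys)
```

The fitted model assigns low probability to points near the class-0 cluster and high probability near the class-1 cluster.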

Preferred Qualifications

  • Doctorate degree in an analytical field
  • Willing and able to travel 20-30% or more


  • Ability to understand and apply verbal and written work and safety-related instructions and procedures given in English
  • Ability to communicate in English with respect to job assignments, job procedures, and applicable safety standards
  • Must be able to work in a potentially stressful environment
  • Position is in busy, non-smoking office located in downtown Phoenix, AZ
  • Location requires mobility in an office environment; each floor is accessible by elevator
  • Occasionally work will be performed in a mine, outdoor or manufacturing plant setting
  • Must be able to frequently sit, stand and walk
  • Must be able to frequently lift and carry up to ten (10) pounds
  • Personal protective equipment is required when performing work in a mine, outdoor, manufacturing or plant environment, including hard hat, hearing protection, safety glasses, safety footwear, and as needed, respirator, rubber steel-toe boots, protective clothing, gloves and any other protective equipment as required
  • Freeport-McMoRan promotes a drug/alcohol-free work environment through the use of mandatory pre-employment drug testing and on-going random drug testing as allowed by applicable State laws

Freeport-McMoRan has reviewed the jobs at its various office and operating sites and determined that many of these jobs require employees to perform essential job functions that pose a direct threat to the safety or health of the employees performing these tasks or others. Accordingly, the Company has designated the following positions as safety-sensitive:

  • Site-based positions, or positions which require unescorted access to site-based operational areas, which are held by employees who are required to receive MSHA, OSHA, DOT, HAZWOPER and/or Hazard Recognition Training; or
  • Positions which are held by employees who operate equipment, machinery or motor vehicles in furtherance of performing the essential functions of their job duties, including operating motor vehicles while on Company business or travel (for this purpose motor vehicles includes Company owned or leased motor vehicles and personal motor vehicles used by employees in furtherance of Company business or while on Company travel); or
  • Positions which Freeport-McMoRan has designated as safety sensitive positions in the applicable job or position description and which upon further review continue to be designated as safety-sensitive based on an individualized assessment of the actual duties performed by a specifically identified employee.

Equal Opportunity Employer/Protected Veteran/Disability


  • Phoenix, AZ

Supports the activities of all Freeport-McMoRan Big Data programs. Provides analytical support and expertise for the Big Data program; this includes coordination with business subject matter experts and travel to mine sites. The role will provide analyses and statistical models as part of Big Data projects, and may be the project lead on analytics initiatives. The role will also provide visualizations and descriptive results of the analysis. This will be a global role that will coordinate with site and corporate stakeholders to ensure alignment on project delivery.

    • Work closely with business, engineering and technology teams to analyze data-intensive business problems
    • Research and develop appropriate statistical methodology to translate these business problems into analytics solutions
    • Perform quality control of deliverables
    • Develop visualizations of results and prepare deliverable reports and presentations, and communicate with business partners
    • Provide thought leadership in algorithmic and process innovations, and creativity in solving unconventional problems
    • Develop, implement and maintain analytical solutions in the Big Data environment
    • Work with onshore and offshore resources to implement and maintain analytical solutions
    • Perform variable selection and other standard modeling tasks
    • Produce model performance metrics
    • Use statistical and programming tools such as R and Python to analyze data and develop machine-learning models
    • Perform other duties as requested
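The "model performance metrics" task above usually means numbers like the following; a from-scratch Python sketch of two common binary-classification metrics (labels and scores invented for illustration):

```python
def confusion_counts(y_true, y_pred):
    """Return (tp, fp, fn, tn) for binary 0/1 labels and predictions."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    return tp, fp, fn, tn

def roc_auc(y_true, scores):
    """AUC via the Mann-Whitney formulation: the probability that a random
    positive example is scored above a random negative one (ties count half)."""
    pos = [s for s, t in zip(scores, y_true) if t == 1]
    neg = [s for s, t in zip(scores, y_true) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A production pipeline would reach for library implementations (e.g. scikit-learn), but the definitions are exactly these.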

Minimum Qualifications

  • Bachelor's degree in an analytical field (statistics, mathematics, etc.) and five (5) years of relevant work experience, OR
  • Master's degree in an analytical field (statistics, mathematics, etc.) and three (3) years of relevant work experience

  • Proven track record of collaborating with business partners to translate operational problems and needs into data-based analytical solutions

  • Proficient in predictive modeling:
      • Linear and logistic regression
      • Tree-based techniques (CART, Random Forest, Gradient Boosting)
      • Time-series analysis
      • Anomaly detection
      • Survival analysis

  • Strong experience with SQL/Hive environments

  • Skilled with R and/or Python analysis environments

  • Experience with Big Data tools for machine learning (R, Hive, Python)

  • Good communication skills

Preferred Qualifications

  • Master's degree in an analytical field
  • Willing and able to travel 20-30% or more


  • Ability to understand and apply verbal and written work and safety-related instructions and procedures given in English
  • Ability to communicate in English with respect to job assignments, job procedures, and applicable safety standards

  • Must be able to work in a potentially stressful environment

  • Position is in busy, non-smoking office located in Phoenix, AZ

  • Location requires mobility in an office environment; each floor is accessible by elevator and internal staircase

  • Occasionally work may be performed in a mine, outdoor or manufacturing plant setting

  • Must be able to frequently sit, stand and walk

  • Must be able to frequently lift and carry up to ten (10) pounds

  • Personal protective equipment is required when performing work in a mine, outdoor, manufacturing or plant environment, including hard hat, hearing protection, safety glasses, safety footwear, and as needed, respirator, rubber steel-toe boots, protective clothing, gloves and any other protective equipment as required

  • Freeport-McMoRan promotes a drug/alcohol-free work environment through the use of mandatory pre-employment drug testing and on-going random drug testing as per applicable State laws


Equal Opportunity Employer/Protected Veteran/Disability


  • Atlanta, GA

Music and data are at the heart of Pandora. As a member of the User Engagement data science team, you will help us build models and design experiments that impact the listening experience of millions of people every day.  We are looking for enthusiastic data scientists with experience in machine learning, statistical modeling and analysis, combined with strong CS fundamentals and coding abilities. Scientists on our team partner with engineers, product managers, and other key stakeholders in the product and marketing teams. You'll interact regularly with senior leadership to directly guide and shape Pandora's efforts in one or more of the following areas:

  • Building ML models and improving the core recommendation system that helps serve music to millions of listeners

  • Developing new ways to model user behavior using cutting-edge techniques

  • Optimizing models to drive user growth and engagement

  • Exploring new opportunities to integrate SiriusXM and Pandora radio programming

Successful candidates will have outstanding communication skills, demonstrated ability to work effectively in a small team, and a natural sense of curiosity and drive to experiment. We welcome diverse perspectives and a collaborative spirit.

Relocation and visa programs available.


  • PhD in a quantitative field (for example: Computer Science, Machine Learning, Statistics, Biology, Neuroscience, Physics, or Mathematics)

  • Demonstrated background in machine learning and applied statistics

  • 2+ years of industry experience in a data science role

  • Strong Python programming skills

  • Experience with Hive or SQL databases

  • Experience implementing ML models at a large scale in a production environment

  • Experience with experimental design and A/B testing
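The A/B testing experience asked for above typically involves tests like a two-proportion z-test on conversion rates. A plain-Python sketch (the counts are hypothetical, and the pooled-variance formulation is one of several valid choices):

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates between
    variants A and B, using the pooled proportion for the standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)          # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: 200/2000 conversions in A, 260/2000 in B
z, p_value = two_proportion_ztest(200, 2000, 260, 2000)
```

Here the lift from 10% to 13% is statistically significant at the usual 5% level; real experiments would also consider power, multiple testing, and sequential-peeking corrections.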

Plus Requirements:

  • Experience with the Hadoop technology stack
  • Experience with AWS (Amazon Web Services) or GCP (Google Cloud Platform) tools
  • Experience with deep learning APIs such as TensorFlow or PyTorch

Pandora is committed to diversity in its workforce. Pandora is an equal employment employer and considers qualified applicants without regard to gender, sexual orientation, gender identity, race, ethnicity, veteran or disability status. Women and people of color are encouraged to apply.

Pandora is also a VEVRAA federal contractor. Pandora requests priority referrals of protected veterans from each ESDS, as required by regulation.

If you believe you need a reasonable accommodation in order to search for a job opening or to apply for a position, please contact us by email. This email box is designed to assist job seekers who require a reasonable accommodation to the application process. A response to your request may take up to two business days.

In Your Email, Please Include The Following

The specific accommodation requested to complete the employment application process.

The location or office to which you would like to apply.

The subject of the email should read "Request for Reasonable Accommodation".
LPL Financial
  • San Diego, CA

As a Development Manager for Digital Experience Technology, you will be responsible for the strategy, analysis, design and implementation of modern, scalable, cloud-native digital and artificial intelligence solutions. This role will primarily lead the Intelligent Content Delivery program, with a focus on improving the content search engine and implementing a personalization engine and an AI-driven content ranking engine to deliver relevant, personalized intelligent content across Web, Mobile and AI chat channels. This role requires a successful and experienced digital, data and artificial intelligence practitioner, able to lead discussions with business and technology partners to identify business problems and deliver breakthrough solutions.

The successful candidate will have excellent verbal and written communication skills along with a demonstrated ability to mentor and manage a digital team. You must possess a unique blend of business and technical savvy; a big-picture vision, and the drive to make a vision a reality.

Key Responsibilities:

·       Define innovative Digital Content, AI and Automation offerings solving business problems with inputs from customers, business and product teams

·       Partner with business, product and technology teams to define problems, develop business case, build prototypes and create new offerings.

·       Evangelize and promote Cloud based micro-services architecture and Digital Technology capabilities adoption across organization

·       Lead and own technical design and development aspects of knowledge graphs, search engines, personalization engine and intelligent content engine platforms by applying digital technologies, machine learning and deep learning frameworks

·       Responsible for research, design and prototype robust and scalable models based on machine learning, data mining, and statistical modeling to answer key business problems

·       Manage onsite and offshore development teams implementing products and platforms in Agile

·       Collaborate with business, product, enterprise architecture and cross-functional teams to ensure strategic and tactical goals of project efforts are met

·       Work collaboratively with QA and DevOps teams to adopt the CI/CD tool chain and develop automation

·       Work with technical teams to ensure overall support and stability of platforms and assist with troubleshooting when production incidents arise

·       Be a mentor and leader on the team and within the organization

Basic Qualifications:

·       Overall 10+ years of experience, with 6+ years of development experience implementing Digital and Artificial Intelligence (NLP/ML) platforms

·       Solid working experience of Python, R and knowledge graphs

·       Expertise in at least one AI-related framework (NLTK, spaCy, scikit-learn, TensorFlow)

·       Experience with cloud platforms and products including Amazon AWS (Lex bots, Lambda), Microsoft Azure or similar cloud technologies

·       Solid working experience implementing the Solr search engine, SQL, Elasticsearch and Neo4j

·       Experience in data analysis, modelling and reporting using Power BI or similar tools

·       Experience with Enterprise Content Management systems such as Adobe AEM or any enterprise CMS

·       Experience implementing knowledge graphs using Facebook Open Graph or Google AMP pages is an added advantage

·       Excellent collaboration and negotiation skills

·       Results-driven with a positive can-do attitude

·       Experience implementing Intelligent Automation tools such as WorkFusion, UiPath or Automation Anywhere is an added advantage


·       MS or PhD degree in Computer Science, Statistics, Mathematics, Data Science or a related field

·       Previous industry experience or research experience in solving business problems applying machine learning and deep learning algorithms

·       Must be hands on technologist with prior experience in similar role

·       Good experience practicing and executing projects in Agile Scrum or SAFe iterative methodologies
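For readers unfamiliar with the knowledge graphs this role works with, a toy sketch: content modeled as subject-predicate-object triples with simple lookups, the kind of structure a search or content-ranking engine could build on. All names here are hypothetical, and a real system would use a graph database such as Neo4j rather than an in-memory set:

```python
# A tiny knowledge graph as (subject, predicate, object) triples.
triples = {
    ("Article:42", "mentions", "Topic:Retirement"),
    ("Article:42", "authoredBy", "Team:Research"),
    ("Article:7", "mentions", "Topic:Retirement"),
}

def objects(subject, predicate):
    """All objects reachable from `subject` via `predicate`."""
    return sorted(o for s, p, o in triples if s == subject and p == predicate)

def subjects(predicate, obj):
    """All subjects pointing at `obj` via `predicate` (reverse lookup)."""
    return sorted(s for s, p, o in triples if p == predicate and o == obj)
```

The reverse lookup is what powers "find all content about topic X" style retrieval; ranking would then score the returned subjects.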

iMoney Group
  • Kuala Lumpur, Malaysia
  • Salary: $84k - 96k

Reporting into the CEO as iMoney's Head of Data Science, you will own the full suite of data-related services supporting the organization, including reporting and business intelligence, data analytics, data science and the overall data infrastructure.

  • Unlock the full potential of the huge amounts of data that have been, and continue to be, collected at iMoney

  • Craft the data vision and own, identify and implement the data analytics and data science roadmap for the iMoney Group across all business areas

  • Be the strategic leader and developmental coach for our current data team comprising two data analysts and build out additional data capabilities within the iMoney Group

  • Partner with the different business units and leverage data analytics, insights and science to drive all aspects of the customer conversion funnel including marketing channel attribution and optimization, onsite and offline (call-centre) user behavior and conversion, recommendation engines and product matching, customer segmentation, predictive analysis and propensity modelling

  • Utilize best in class practices with respect to data analytics, visualization, reporting dashboards and data science modelling

  • Establish iMoney as the market leader in the field of data analytics and innovative data science

  • Collaborate with the technology and product teams in continuously enhancing and delivering a robust, efficient and scalable data collection, structuring and warehousing infrastructure


  • Passionate about data and its ability to drive high business impact and growth

  • 10 years of experience in the field of data analytics and data science, including at least 3 years in a leadership role at a scale-up stage digital consumer business such as e-commerce, online lending platforms, digital banks, online financial marketplaces or similar

  • Hands on experience in any of the following tools: R, Python, Knime, SAS, SPSS

  • Clear understanding of databases and extensive knowledge of SQL, AWS Redshift, Hadoop, Hive, Teradata, Google Big Query

  • Experience in implementing and leveraging Tableau for business reporting and intelligence

  • Expertise in applying advanced predictive statistical techniques to develop regression, time-series and segmentation models; exposure to design of experiments or neural networks

  • Responsibility and Attention to Detail - take responsibility for delivery of precise and accurate business intelligence, data analytics and insights to tight timescales and work to resolve problems when they occur

  • Project management skills - ability to scope out and implement larger data related projects as per business requirements including clear understanding of resource and timing requirements

  • People leadership skills - coaching, inspiring, career counselling, mentoring and capability development of team members and peers

  • Excellent stakeholder management, communication and presentation skills – fluent in English, breaking down complex problems with data-driven solutions and having a service-orientated mindset 
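As a minimal sketch of the customer-segmentation work described above: a from-scratch one-dimensional k-means that splits users by a single activity measure. The values are invented, and a production system would use a library implementation (e.g. scikit-learn) over richer features:

```python
def kmeans_1d(values, k, iters=25):
    """Tiny 1-D k-means for illustration (requires k >= 2).
    Returns final centroids and the clusters assigned to each."""
    lo, hi = min(values), max(values)
    # spread initial centroids evenly across the observed range
    centroids = [lo + (hi - lo) * i / (k - 1) for i in range(k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[idx].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# Hypothetical "sessions per month" values: a low-activity and a
# high-activity segment
centroids, clusters = kmeans_1d([1, 2, 2, 3, 20, 22, 25], k=2)
```

The two centroids land near the low- and high-activity groups, which is the basic shape of a propensity or engagement segmentation.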

Limelight Networks
  • Phoenix, AZ

Job Purpose:

The Sr. Data Services Engineer assists in maintaining the operational aspects of Limelight Networks platforms, provides guidance to the Operations group and acts as an escalation point for advanced troubleshooting of systems issues. The Sr. Data Services Engineer assists in the execution of tactical and strategic operational infrastructure initiatives by building and managing complex computing systems and processes that facilitate the introduction of new products and services while allowing existing services to scale.

Qualifications: Experience and Education (minimums)

  • Bachelor's degree or equivalent experience.
  • 2+ years experience working with MySQL (or other large-scale datastores such as MongoDB, Cassandra or Hadoop) in an enterprise environment.
  • 2+ years Linux systems administration experience.
  • 2+ years with version control, shell scripting and one or more scripting languages, including Python, Perl, Ruby and PHP.
  • 2+ years with configuration management systems such as Puppet, Chef or Salt.
  • Experienced with MySQL HA/clustering solutions; Corosync, Pacemaker and DRBD preferred.
  • Experience supporting open-source messaging solutions such as RabbitMQ or ActiveMQ preferred.

Knowledge, Skills & Abilities

  • Collaborative in a fast-paced environment while providing exceptional visibility to management and end-to-end ownership of incidents, projects and tasks.
  • Ability to implement and maintain complex datastores.
  • Knowledge of configuration management and release engineering processes and methodologies.
  • Excellent coordination, planning and written and verbal communication skills.
  • Knowledge of Agile project management methodologies preferred.
  • Knowledge of a NoSQL/Big Data platform; Hadoop, MongoDB or Cassandra preferred.
  • Ability to participate in a 24/7 on call rotation.
  • Ability to travel when necessary.

Essential Functions:

  • Develop and maintain core competencies of the team in accordance with applicable architectures and standards.
  • Participate in capacity management of services and systems.
  • Maintain plans, processes and procedures necessary for the proper deployment and operation of systems and services.
  • Identify gaps in the operation of products and services and drive enhancements.
  • Evaluate release processes and tools to find areas for improvement.
  • Contribute to the release and change management process by collaborating with the developers and other Engineering groups.
  • Participate in development meetings and implement required changes to the operational architecture, standards, processes or procedures and ensure they are in place prior to release (e.g., monitoring, documentation and metrics).
  • Maintain a positive demeanor and a high level of professionalism at all times.
  • Implement proactive monitoring capabilities that ensure minimal disruption to the user community including: early failure detection mechanisms, log monitoring, session tracing and data capture to aid in the troubleshooting process.
  • Implement HA and DR capabilities to support business requirements.
  • Troubleshoot and investigate database related issues.
  • Maintain migration plans and data refresh mechanisms to keep environments current and in sync with production.
  • Implement backup and recovery procedures utilizing various methods to provide flexible data recovery capabilities.
  • Work with management and security team to assist in implementing and enforcing security policies.
  • Create and manage user and security profiles ensuring application security policies and procedures are followed.

Huntech USA LLC
  • San Diego, CA

Great opportunity to work with the leader in the semiconductor industry, which unveiled the world's first 7-nanometer PC platform, created from the ground up for the next generation of personal computing by bringing new features with thin and light designs, allowing for new form factors in the always-on, always-connected category. It features the new octa-core CPU, the fastest CPU it has ever designed and built, with a larger cache than previous compute platforms, faster multi-tasking and increased productivity for users, disrupting the performance expectations of current thin, light and fanless PC designs. This platform is currently sampling to customers and is expected to begin shipping in commercial devices in Q3 of 2019.

Staff Data Analyst

You will study the performance of the Global Engineering Grid and design workflows across the engineering grid and provide insights and effective analytics in support of the Grid 2.0 program. You will conduct research, design statistical studies and analyze data in support of Grid 2.0. This job will challenge you to dive deep into the engineering grid and design flow world and understand the unique challenges of operating an engineering grid at a scale unrivaled in the industry. You should have experience working in an EDA or manufacturing environment and be comfortable working in an environment where problems are not always well-defined.


  • Identify and pursue opportunities to improve the efficiency of global engineering grid and design workflows.
  • Develop systems to ingest, analyze, and take automated action across real-time feeds of high-volume data.
  • Research and implement new analytics approaches for effective deployment of machine learning and data modeling to solve business problems. Identify patterns and trends from large, high-dimensional data sets; manipulate data into digestible and actionable reports.
  • Make business recommendations (e.g. cost-benefit, experiment analysis) with effective presentations of findings at multiple levels of stakeholders through visual displays of quantitative information.
  • Plan effectively to set priorities and manage projects, identify roadblocks and come up with technical options.

Leverage your 8+ years of experience articulating business questions and using mathematical techniques to arrive at an answer using available data. 3-4 years of advanced Tableau experience is a must. Experience translating analysis results into business recommendations; experience with statistical software (e.g. R, Python, MATLAB, pandas, Scala) and database languages like SQL; experience with data warehousing concepts (Hadoop, MapR) and visualization tools (e.g. QlikView, Tableau, Angular, ThoughtSpot). Strong business acumen, critical thinking ability, and attention to detail.
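The SQL skills called for here are of the kind sketched below: a grouped aggregation over a hypothetical table of grid-job runtimes, using Python's built-in sqlite3 (the table name, columns and figures are invented for illustration; a real grid analysis would query Hive or a warehouse):

```python
import sqlite3

# Hypothetical per-queue job-runtime data, loaded into an in-memory table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jobs (queue TEXT, runtime_min REAL)")
conn.executemany("INSERT INTO jobs VALUES (?, ?)", [
    ("regression", 12.0), ("regression", 18.0),
    ("synthesis", 45.0), ("synthesis", 55.0),
])

# Job count and mean runtime per queue -- the shape of a basic
# grid-utilization summary.
rows = conn.execute(
    "SELECT queue, COUNT(*), AVG(runtime_min) FROM jobs "
    "GROUP BY queue ORDER BY queue"
).fetchall()
```

The same aggregation translates directly to HiveQL or a warehouse dialect; only the connection layer changes.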

Background in data science, applied mathematics, or computational science and a history of solving difficult problems using a scientific approach with MS or BS degree in a quantitative discipline (e.g., Statistics, Applied Mathematics, Operations Research, Computer Science, Electrical Engineering) and understand how to design scientific studies. You should be familiar with the state of the art in machine learning/ data modeling/ forecasting and optimization techniques in a big data environment.

Data Analytics Software Test Engineer

As a member of the Corporate Engineering Services Group software test team, you will be responsible for testing various cutting edge data analytics products and solutions. You will be working with a dynamic engineering team to develop test plans, execute test plans, automate test cases, and troubleshoot and resolve issues.

Leverage your 1+ years of experience in the following:

  • Testing and systems validation for commercial software systems.
  • Testing of systems deployed in AWS Cloud.
  • Knowledge of SQL and databases.
  • Developing and implementing software and systems test plans.
  • Test automation development using Python or Java.
  • Strong problem solving and troubleshooting skills.
  • Experience in testing web-based and Android applications.
  • Familiar with Qualcomm QXDM and APEX tools.
  • Knowledge of software development in Python.
  • Strong written and oral communication skills
  • Working knowledge of JIRA and GitHub is preferred.


  • Required: Bachelor's, Computer Engineering and/or Computer Networks & Systems and/or Computer Science and/or Electrical Engineering
  • Preferred: Master's, Computer Engineering and/or Computer Networks & Systems and/or Computer Science and/or Electrical Engineering or equivalent experience

Interested? Please send a resume to our Founder & CEO, Raj Dadlani, and he will respond to interested candidates within 24 hours of resume receipt. We are dealing with a highly motivated hiring manager and shortlisting viable candidates by February 22, 2019.

Vector Consulting, Inc
  • Atlanta, GA

Our Government client is looking for an experienced ETL Developer on a renewable contract in Atlanta, GA

Position: ETL Developer

The desired candidate will be responsible for design, development, testing, maintenance and support of complex data extract, transformation and load (ETL) programs for an Enterprise Data Warehouse. An understanding of how complex data should be transformed from the source and loaded into the data warehouse is a critical part of this job.

  • Deep hands-on experience with OBIEE RPD & BIP reporting data models and development for seamless cross-functional and cross-system data reporting
  • Expertise and solid experience in BI Tools OBIEE, Oracle Data Visualization and Power BI
  • Strong Informatica technical knowledge in design, development and management of complex Informatica mappings, sessions and workflows on Informatica Designer Components -Source Analyzer, Warehouse Designer, Mapping Designer & Mapplet Designer, Transformation Developer, Workflow Manager and Workflow Monitor.
  • Strong programming skills, relational database skills with expertise in Advanced SQL and PL/SQL, indexing and query tuning
  • Experience implementing advanced analytical models in Python or R
  • Experienced in Business Intelligence and Data warehousing concepts and methodologies.
  • Extensive experience in data analysis and root cause analysis and proven problem solving and analytical thinking capabilities.
  • Analytical capabilities to slice and dice data and display data in reports for best user experience.
  • Demonstrated ability to review business processes and translate into BI reporting and analysis solutions.
  • Ability to follow Software Development Lifecycle (SDLC) process and should be able to work under any project management methodologies used.
  • Ability to follow best practices and standards.
  • Ability to identify BI application performance bottlenecks and tune.
  • Ability to work quickly and accurately under pressure and project time constraints
  • Ability to prioritize workload and work with minimal supervision
  • Basic understanding of software engineering principles and skills working on Unix/Linux/Windows Operating systems, Version Control and Office software
  • Exposure to data modeling using Star/Snowflake schema design, data marts, relational and dimensional data modeling, slowly changing dimensions, fact and dimension tables, physical and logical data modeling, and big data technologies
  • Experience with Big Data Lake / Hadoop implementations
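One of the warehousing concepts listed above, slowly changing dimensions, can be sketched in a few lines: a type-2 update expires the current dimension row when tracked attributes change and appends a new current row, preserving history. This is a plain-Python illustration with hypothetical fields, not an Informatica-specific implementation:

```python
from datetime import date

def scd2_upsert(dim_rows, key, new_attrs, today):
    """Type-2 slowly changing dimension update: if the current row for `key`
    differs from `new_attrs`, expire it (set end_date) and append a new
    current row; if nothing changed, leave the table alone."""
    for row in dim_rows:
        if row["key"] == key and row["end_date"] is None:  # current version
            if all(row[k] == v for k, v in new_attrs.items()):
                return dim_rows  # no attribute change: nothing to do
            row["end_date"] = today  # expire the old version
            break
    dim_rows.append(
        {"key": key, **new_attrs, "start_date": today, "end_date": None}
    )
    return dim_rows

# Hypothetical customer dimension: a customer moves city.
dim = [{"key": "C1", "city": "Atlanta",
        "start_date": date(2018, 1, 1), "end_date": None}]
scd2_upsert(dim, "C1", {"city": "Austin"}, date(2019, 2, 1))
```

After the update the table holds both versions: the Atlanta row closed out on the change date and an open-ended Austin row, which is exactly what lets the warehouse answer "as-of" questions.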

 Required Qualifications:

  • A bachelor's degree in Computer Science or a related field
  • 6 to 10 years of experience working with OBIEE / Data Visualization / Informatica / Python
  • Ability to design and develop complex Informatica mappings, sessions, workflows and identify areas of optimizations
  • Experience with Oracle RDBMS 12c
  • Effective communication skills (both oral and written) and the ability to work effectively in a team environment are required
  • Proven ability and desire to mentor/coach others in a team environment
  • Strong analytical, problem solving and presentation skills.

Preferred Qualifications:

  • Working knowledge with Informatica Change Data Capture installed on DB2 z/OS
  • Working knowledge of Informatica Power Exchange
  • Experience with relational, multidimensional and OLAP techniques and technology
  • Experience with OBIEE tools version 10.X
  • Experience with Visualization tools like MS Power BI, Tableau, Oracle DVD
  • Experience building predictive models in Python

Soft Skills:

  • Strong written and oral communication skills in English
  • Ability to work with the business and communicate technical solutions to solve business problems

About Vector:

Vector Consulting, Inc. (headquartered in Atlanta) is an IT Talent Acquisition Solutions firm committed to delivering results. Since our founding in 1990, we have been partnering with our customers, understanding their business, and developing solutions with a commitment to quality, reliability and value. Our continuing growth has been, and continues to be, built around successful relationships that are based on our organization's operating philosophy and commitment to "People, Partnerships, Purpose and Performance" - THE VECTOR WAY.

  • Austin, TX
Company Description
Visa operates the world's largest retail electronic payments network and is one of the most recognized global financial services brands. Visa facilitates global commerce through the transfer of value and information among financial institutions, merchants, consumers, businesses and government entities. We offer a range of branded payment product platforms, which our financial institution clients use to develop and offer credit, charge, deferred debit, prepaid and cash access programs to cardholders. Visa's card platforms provide consumers, businesses, merchants and government entities with a secure, convenient and reliable way to pay and be paid in 170 countries and territories.
Job Description
At Visa University, our mission is to turn our learning data into insights and get a deep understanding of how people use our resources to impact the product, strategy and direction of Visa University. In order to help us achieve this we are looking for someone who can build and scale an efficient analytics data suite and also deliver impactful dashboards and visualizations to track strategic initiatives and enable self-service insight delivery. The Staff Software Engineer, Learning & Development Technology is an individual contributor role within Corporate IT in our Austin-based Technology Hub. In this role you will participate in design, development, and technology delivery projects with many leadership opportunities. Additionally, this position provides application administration and end-user support services. There will be significant collaboration with business partners, multiple Visa IT teams and third-party vendors. The portfolio includes SaaS and hosted packaged applications as well as multiple content providers such as Pathgather (Degreed), Cornerstone, Watershed, Pluralsight, Lynda, Safari, and many others.
The ideal candidate will bring energy and enthusiasm to evolve our learning platforms, be able to easily understand business goals/requirements and be forward thinking to identify opportunities that may be effectively resolved with technology solutions. We believe in leading by example, ownership with high standards and being curious and creative. We are looking for an expert in business intelligence, data visualization and analytics to join the Visa University family and help drive a data-first culture across learning.
  • Engage with product managers, design team and student experience team in Visa University to ensure that the right information is available and accessible to study user behavior, to build and track key metrics, to understand product performance and to fuel the analysis of experiments
  • Build lasting solutions and datasets to surface critical data and performance metrics and optimize products
  • Build and own the analytics layer of our data environment to make data standardized and easily accessible
  • Design, build, maintain and iterate a suite of visual dashboards to track key metrics and enable self-service data discovery
  • Participate in technology project delivery activities such as business requirement collaboration, estimation, conceptual approach, design, development, test case preparation, unit/integration test execution, support process documentation, and status updates
  • Participate in vendor demo and technical deep dive sessions for upcoming projects
  • Collaborate with, and mentor, data engineers to build efficient data pipelines and impactful visualizations
  • Minimum 8 years of experience in a business intelligence, data analysis or data visualization role and a degree in science, computer science, statistics, economics, mathematics, or similar
  • Significant experience in designing analytical data layers and in conducting ETL with very large and complex data sets
  • Expertise with Tableau desktop software (techniques such as LOD calculations, calculated fields, table calculations, and dashboard actions)
  • Expert in data visualization
  • High level of ability with SQL and JSON
  • Experience with Python is a must; experience with data science libraries is a plus (NumPy, Pandas, SciPy, scikit-learn, NLTK) and deep learning (Keras)
  • Experience with Machine Learning algorithms (Linear Regression, Multiple Regression, Decision Trees, Random Forest, Logistic Regression, Naive Bayes, SVM, K-means, K-nearest neighbor, Hierarchical Clustering)
  • Experience with HTML and JavaScript
  • Basic SFTP and encryption knowledge
  • Experience with Excel (Vlookups, pivots, macros, etc.)
  • Experience with xAPI is a plus
  • Ability to leverage HR systems such as Workday, Salesforce etc., to execute the above responsibilities
  • Understanding of statistical analysis, quantitative aptitude and the ability to gather and interpret data and information
  • You have a strong business sense and you are able to translate business problems to data driven solutions with minimal oversight
  • You are a communicative person who values building strong relationships with colleagues and stakeholders, enjoys mentoring and teaching others and you have the ability to explain complex topics in simple terms
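As a rough, self-contained sketch of one of the techniques listed above (logistic regression, here trained by plain batch gradient descent on invented toy data), the following uses only NumPy and is an illustration rather than any production pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two well-separated 2-D clusters for binary classification.
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Batch gradient descent on the logistic (cross-entropy) loss.
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = sigmoid(X @ w + b)          # predicted probabilities
    grad_w = X.T @ (p - y) / len(y) # gradient w.r.t. weights
    grad_b = np.mean(p - y)         # gradient w.r.t. bias
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

preds = (sigmoid(X @ w + b) > 0.5).astype(int)
accuracy = np.mean(preds == y)
print(f"training accuracy: {accuracy:.2f}")
```

In practice a library implementation (e.g. scikit-learn's `LogisticRegression`) would be used; the hand-rolled loop is only to make the mechanics visible.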
Additional Information
All your information will be kept confidential according to EEO guidelines.
Job Number: REF15081Q
Brighter Brain
  • Atlanta, GA

Brighter Brain is seeking a skilled professional to serve as an internal resource for our consulting firm in the field of Data Science Development. Brighter Brain provides Fortune 500 clients throughout the United States with IT consultants in a wide-ranging technical sphere.

To fully support our incoming nationwide and international hires, we will be hiring a Senior Data Science SME (ML) with practical experience to relocate to Atlanta and coach/mentor our incoming classes of consultants. If you have a strong passion for the Data Science platform and are looking to join a seasoned team of IT professionals, this could be an advantageous next step.

Brighter Brain is an IT Management & Consulting firm providing a unique take on IT Consulting. We currently offer expertise to US clients in the fields of Mobile Development (iOS and Android), Hadoop, Microsoft SharePoint, and Exchange/Office 365. We are currently seeking a highly skilled professional to serve as an internal resource for our company in the field of Data Science with expertise in Machine Learning (ML).

The ideal candidate will be responsible for establishing our Data Science practice. The responsibilities include creating a comprehensive training program and training, mentoring, and supporting ideal candidates as they progress towards building their career in Data Science Consulting. This position is based out of our head office in Atlanta, GA.


The Senior Data Science SME will take on the following responsibilities:

-       Design, develop and maintain Data Science training material focused on ML; knowledge of DL, NN & NLP is a plus.

-       Interview potential candidates to ensure that they will be successful in the Data Science domain and training.

-       Train, guide and mentor junior to mid-level Data Science developers.

-       Prepare mock interviews to enhance the learning process provided by the company.

-       Prepare and support consultants for interviews for specific assignments involving development and implementation of Data Science.

-       Act as a primary resource for individuals working on a variety of projects throughout the US.

-       Interact with our Executive and Sales team to ensure that projects and employees are appropriately matched.

The ideal candidate will not only possess solid knowledge of the field, but must also be fluent in the following areas:

-       Hands-on expertise in using Data Science and building machine learning models and Deep learning models

-       Statistics and data modeling experience

-       Strong understanding of data sciences

-       Understanding of Big Data

-       Understanding of AWS and/or Azure

-       Understand the differences between TensorFlow, MXNet, etc.

Skills Include:

  • Master's degree in Computer Science or Mathematics

  • 10+ years of professional experience in the IT industry, in the AI realm

  • Strong understanding of MongoDB, Scala, Node.js, AWS, & Cognitive applications
  • Excellent knowledge of Python, Scala, JavaScript and its libraries, Node.js, R, MatLab, C/C++, Lua, or any proficient AI language of choice
  • NoSQL databases, bot frameworks, data streaming and integrating unstructured data; rules engines (e.g. Drools), ESBs (e.g. MuleSoft)
  • Computer Vision, Recommendation Systems, Pattern Recognition, Large-Scale Data Mining or Artificial Intelligence, Neural Networks
  • Deep Learning frameworks like TensorFlow, Torch, Caffe, Theano, CNTK; scikit-learn, NumPy, SciPy
  • Working knowledge of ML techniques such as: Naïve Bayes Classification, Ordinary Least Squares Regression, Logistic Regression, Support Vector Machines, Ensemble Methods, Clustering Algorithms, Principal Component Analysis, Singular Value Decomposition, and Independent Component Analysis
  • Natural Language Processing (NLP) concepts such as topic modeling, intents, entities, and NLP frameworks such as spaCy, NLTK, MeTA, gensim or other toolkits for Natural Language Understanding (NLU)
  • Experience with data profiling, data cleansing, data wrangling/munging, ETL
  • Familiarity with Spark MLlib, Mahout, and the Google, Bing and IBM Watson APIs
  • Hands-on experience training a variety of consultants as needed
  • Analytical and problem-solving skills
  • Knowledge of the IoT space
  • Understanding of Academic Data Science vs. Corporate Data Science
  • Knowledge of the Consulting/Sales structure
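
Two of the techniques listed above, Principal Component Analysis and Singular Value Decomposition, are closely related: PCA can be computed directly from the SVD of a centered data matrix. A minimal NumPy sketch on invented toy data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy data: 200 points stretched along one axis, so a single principal
# component should capture most of the variance.
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])
Xc = X - X.mean(axis=0)  # PCA requires centered data

# PCA via singular value decomposition of the centered data matrix.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)  # variance ratio per component
projected = Xc @ Vt[0]           # scores on the first principal component

print("explained variance ratios:", np.round(explained, 3))
```

The rows of `Vt` are the principal directions, and the squared singular values are proportional to the variance each component explains.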

Additional details about the position:

-       Able to relocate to Atlanta, GA (relocation package available)

-       Work schedule of 9 AM to 6 PM EST

Questions: Send your resume to Ansel Butler at Brighter Brain; make sure that there is a valid phone number and Skype ID either on the resume, or in the body of the email.

Ansel Essic Butler


404 791 5128


Senior Corporate Recruiter

Brighter Brain LLC.

1785 The Exchange, Suite 200

Atlanta, GA. 30339

Expedia, Inc.
  • Bellevue, WA

We are seeking a deeply experienced technical leader to lead the next generation of engineering investments and culture for the GCO Customer Care Platform (CCP). The technical leader in this role will help design, engineer and drive implementation of critical pieces of the EG-wide architecture (platform and applications) for customer care - these areas include, but are not limited to, unified voice support, partner onboarding with configurable rules, a Virtual Agent programming model for all partners, and intelligent fulfillment. In addition, a key focus of this leader's role will be to grow and mentor junior software engineers in GCO with a focus on building out a '2020 world-class engineering excellence' vision and culture.

What you’ll do:

  • Deep Technology Leadership (Design, Implementation, and Execution for the follow);

  • Ship next-gen EG-wide architecture (platform and applications) that enable 90% of automated self-service journeys with voice as a first-class channel from day zero

  • Design and ship a VA (Virtual Agent) Programming Model that enables partners to stand up intelligent virtual agents on CCP declaratively in minutes

  • Enable brand partners to onboard their own identity providers onto CCP

  • Enable partners to configure their workflows and business rules for their Virtual Agents

  • Programming Model for Intelligent actions in the Fulfillment layer

  • Integration of Context and Query as first-class entities into the Virtual Agent

  • Cross-Group Collaboration and Influence

  • Work with company-wide initiatives across AI Labs and BeX to build out a best-of-breed Conversational Platform for EG-wide apps

  • Engage with and translate internal and external partner requirements into platform investments for effective onboarding of customers

  • Represent GCO's Technical Architecture at senior leadership meetings (eCP and EG) to influence and bring back enhancements to improve CCP

Help land GCO 2020 Engineering and Operational Excellence Goals

Mentor junior developers on platform engineering excellence dimensions (re-usable patterns, extensibility, configurability, scalability, performance, and design / implementation of core platform pieces)

Help develop a level of engineering muscle across GCO that becomes an asset for EG (as a provider of platform service as well as for talent)

Who you are:

  • BS or MS in Computer Science

  • 20 years of experience designing and developing complex, mission-critical, distributed software systems on a variety of platforms in high-tech industries

  • Hands on experience in designing, developing, and delivering (shipping) V1 (version one) MVP enterprise software products and solutions in a technical (engineering and architecture) capacity

  • Experience in building strong relationships with technology partners, customers, and getting closure on issues including delivering on time and to specification

  • Skills: Linux/ Windows/VMS, Scala, Java, Python, C#, C++, Object Oriented Design (OOD), Spark, Kafka, REST/Web Services, Distributed Systems, Reliable and scalable transaction processing systems (HBase, Microsoft SQL, Oracle, Rdb)

  • Nice to have: Experience in building highly scalable real-time processing platforms that host machine learning algorithms for Guided / Prescriptive Learning

  • Identifies and solves problems at the company level while influencing product lines

  • Provides technical leadership in difficult times or serious crises

  • Key strategic player to long-term business strategy and vision

  • Recognized as an industry expert, mentor and leader at the company. Provides strategic influence across groups, projects and products

  • Provides long term product strategy and vision through group level efforts

  • Drive for results: Is sought out to lead company-wide initiatives that deliver cross-cutting lift to the organization and provides leadership in a crisis and is a key player in long-term business strategy and vision

  • Technical/Functional skills: Proves credentials as an industry expert by inventing and delivering transformational technology/direction, and helps drive change beyond the company and across the industry

  • Has the vision to impact long-term product/technology horizon to transform the entire industry

Why join us:

Expedia Group recognizes our success is dependent on the success of our people.  We are the world's travel platform, made up of the most knowledgeable, passionate, and creative people in our business.  Our brands recognize the power of travel to break down barriers and make people's lives better – that responsibility inspires us to be the place where exceptional people want to do their best work, and to provide them the tools to do so. 

Whether you're applying to work in engineering or customer support, marketing or lodging supply, at Expedia Group we act as one team, working towards a common goal; to bring the world within reach.  We relentlessly strive for better, but not at the cost of the customer.  We act with humility and optimism, respecting ideas big and small.  We value diversity and voices of all volumes. We are a global organization but keep our feet on the ground, so we can act fast and stay simple.  Our teams also have the chance to give back on a local level and make a difference through our corporate social responsibility program, Expedia Cares.

If you have a hunger to make a difference with one of the most loved consumer brands in the world and to work in the dynamic travel industry, this is the job for you.

Our family of travel brands includes: Brand Expedia®, Expedia® Partner Solutions, Egencia®, trivago®, HomeAway®, Orbitz®, Travelocity®, Wotif®, ebookers®, CheapTickets®, Hotwire®, Classic Vacations®, Expedia® Media Solutions, Expedia Local Expert®, Expedia® CruiseShipCenters®, SilverRail Technologies, Inc., ALICE and Traveldoo®.

Expedia is committed to creating an inclusive work environment with a diverse workforce.   All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.  This employer participates in E-Verify. The employer will provide the Social Security Administration (SSA) and, if necessary, the Department of Homeland Security (DHS) with information from each new employee's I-9 to confirm work authorization.

  • Pittsfield, MA
  • Salary: $65k - 85k

Who We’re Seeking

VidMob’s Ads Integration Engineer is a highly technical position that works with our strategic ad platform partners, integrating their APIs into the VidMob platform. You enjoy digging into complex campaign management and reporting frameworks allowing you to build elegant and scalable integrations. Your experience with ad tech makes you the expert on a team when talking about metrics, dimensions, KPIs, campaigns, squads, and formats.

What You’ll Do

You will engage with some of the world's largest companies to extend their campaign and media performance API offerings through the VidMob platform. You will build tools and automation to pull and report on data, and keep those integrations up to date. You'll work closely with our data engineers to maximize our data pipelines and write clear documentation so our front-end engineers can quickly build features around each integration.

This position is full time and is based in Pittsfield, MA.


  • Define and implement API integrations with our strategic partners

  • Work closely with our strategic partners staying up to date on product changes

  • Be an ads/reporting integration technical expert, and have a strategic influence on partners and internal teams at VidMob

  • Support VidMob engineering efforts in other areas as needed

Minimum Qualifications

  • 3+ years of previous experience as a software engineer

  • Ad Tech experience a must with a strong understanding of campaign management tools across the major platforms (Facebook, Google, Snapchat, Twitter, etc)

  • Experience with DSPs a plus

  • Solid software development skills with experience building software in Java

  • Additional experience in (at least one) Python, PHP, C/C++, Ruby, or Scala

  • Excellent communication skills including experience presenting to technical and business audiences

  • BA/BS in Computer Science or equivalent degree/experience

Ultra Tendency
  • Berlin, Deutschland

Big Data Software Engineer

Lead your own development team and our customers to success! Ultra Tendency is looking for someone who convinces not just by writing excellent code, but also through strong presence and leadership. 

At Ultra Tendency you would:

  • Work in our office in Berlin/Magdeburg and on-site at our customer's offices

  • Make Big Data useful (build program code, test and deploy to various environments, design and optimize data processing algorithms for our customers)

  • Develop outstanding Big Data applications following the latest trends and methodologies

  • Be a role model and strong leader for your team and oversee the big picture

  • Prioritize tasks efficiently, evaluating and balancing the needs of all stakeholders

Ideally you have:

  • Strong experience in developing software using Python, Scala or a comparable language

  • Proven experience with data ingestion, analysis, integration, and design of Big Data applications using Apache open-source technologies

  • Profound knowledge of data engineering technologies, e.g. Kafka, Spark, HBase, Kubernetes

  • Strong background in developing on Linux

  • Solid computer science fundamentals (algorithms, data structures and programming skills in distributed systems)

  • Languages: Fluent English; German is a plus

We offer:

  • Fascinating tasks and unique Big Data challenges of major players from various industries (automotive, insurance, telecommunication, etc.)

  • Fair pay and bonuses

  • Work with our open-source Big Data gurus, such as our Apache HBase committer and Release Manager

  • An international, diverse team

  • Possibility to work with the open-source community and become a contributor

  • Work with cutting edge equipment and tools

Confidentiality guaranteed

  • Kleinmachnow, Germany

About the team:

Core Product Technology (CPT) is a global team responsible for the end-to-end eBay product experience and technology platform. In addition, we are working on the strategy and execution of our payments initiative, transforming payments management on our Marketplace platform which will significantly improve the overall customer experience.

The opportunity

At eBay, we have started a new chapter in our iconic internet history of being the largest online marketplace in the world. With more than 1 billion listings (more than 80% of them selling new items) in over 400 markets, eBay is providing a robust platform where merchants of all sizes can compete and win. Every single day millions of users come to eBay to search for items in our diverse inventory of over a billion items.

eBay is starting a new Applied Research team in Germany and we are looking for a senior technologist to join the team. We’re searching for a hands-on person who has an applied research background with strong knowledge in machine learning and natural language processing (NLP). The German team’s mission is to improve the German and other European language search experience as well as to enhance our global search platform and machine learned ranking systems in partnership with our existing teams in San Jose California and Shanghai China.

This team will help customers find what they’re shopping for by developing full-stack solutions from indexing, to query serving and applied research to solve core ranking, query understanding and recall problems in our highly dynamic marketplace. The global search team works closely with the product management and quality engineering teams along with the Search Web and Native Front End and Search services, and Search Mobile. We build our systems using C++, Scala, Java, Hadoop/Spark/HBase, TensorFlow/Caffe, Kafka and other standard technologies. The team believes in agile development with autonomous and empowered teams.

Diversity and inclusion at eBay goes well beyond a moral necessity – it’s the foundation of our business model and absolutely critical to our ability to thrive in an increasingly competitive global landscape.
Webtrekk GmbH
  • Berlin, Deutschland
Your responsibilities:

In this role, you will set up your full-fledged research and development team of developers and data science engineers. You will evaluate and choose appropriate technologies and develop products that are powered by Artificial Intelligence and Machine Learning

  • Fast-paced development of experimental prototypes, POCs and products for our >400 customers

  • Manage fast feedback cycles, incorporate learnings and feedback, and ultimately deliver AI-powered products

  • You will develop new components and optimise existing ones, always with an eye on scalability, performance and maintenance

  • Organize and lead team planning meetings and provide advice, clarification and guidance during the execution of sprints

  • Lead your team's technical vision and drive the design and development of new innovative products and services from the technical side

  • Lead discussions with the team and management to define best practices and approaches

  • Set goals, objectives and priorities. Mentor team members and provide guidance by regular performance reviews.

The assets you bring to the team:

  • Hands on experience in agile software development on all levels based on profound technical understanding

  • Relevant experience in managing a team of software developers in an agile environment

  • At least 3 years of hands-on experience with developing in Frontend Technologies like Angular or React

  • Knowledge of backend technologies such as Java, Python or Scala is a big plus

  • Experience with distributed systems based on RESTful services

  • DevOps mentality and practical experience with tools for build and deployment automation (like Maven, Jenkins, Ansible, Docker)

  • Team and project-oriented leader with excellent problem solving and interpersonal skills

  • Excellent communication, coaching and conflict management skills as well as a strong assertiveness

  • Strong analytical capability, discipline, commitment and enthusiasm

  • Fluent in English, German language skills are a big plus

What we offer:

  • Prospect: We are a continuously growing team with experts in the most future-oriented fields of customer intelligence. We are dealing with real big data scenarios and data from various business models and industries. Apart from interesting tasks we offer you considerable freedom for your ideas and perspectives for the development of your professional and management skills.

  • Team oriented atmosphere: Our culture embraces integrity, team work and innovation. Our employees value the friendly atmosphere that is the most powerful driver within our company.

  • Goodies: Individual trainings, company tickets, team events, table soccer, fresh fruits and a sunny roof terrace.

  • TechCulture: Work with experienced developers who share the ambition for well-written and clean code. Choose your hardware, OS and IDE. Bring in your own ideas, work with open source and have fun at product demos, hackathons and meetups.

  • Blagnac, France

Description of the job

Vacancies for 3 Data Scientists (m/f) have arisen within Airbus Commercial Aircraft in Toulouse. You will join the PLM Systems & Integration Tests team within the IM Develop department.

The IM Develop organization is established to provide Product Life Cycle Management (PLM) support and services as requested by Programmes, CoE and CoC. The department is the home within Airbus for leading the development, implementation, maintenance and support of PLM for all Airbus programs in line with the corporate strategy.

Within the frame of its Digital Design, Manufacturing & Services (DDMS) project, Airbus is undergoing a significant digital transformation to benefit from the latest advances in new technologies and targets a major efficiency breakthrough across the program and product lifecycle. It will be enabled by a set of innovative concepts such as model based system engineering, modular product lines, digital continuity and concurrent co-design of the product, its industrial setup and operability features.

As a Data Scientist (m/f), you will be integrated into a team of the IM Develop department and appointed to dedicated missions. You will work in an international environment where you will be able to develop in-depth knowledge of local specificities: engineering, manufacturing, costing, etc.

Tasks & accountabilities

Your main tasks and responsibilities will be to:

  • Analyze large amounts of information to discover trends and patterns, build predictive models, implement cost models and machine-learning algorithms based on technical data and DMU models.

  • Combine models through ensemble modelling

  • Present information using data visualization techniques

  • Propose solutions and strategies to business challenges

  • Implement features extraction by analyzing CAD models and engineering Bill of Material

  • Collaborate with engineering and costing (FCC) teams to implement new costing models in Python

  • Design and propose new short/medium- and long-term forecasting methods

  • Consolidate, compare and enlarge the data required for the various types of modelling

  • Attend technical events/conferences and reinforce Data Science skills within Airbus
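
The ensemble-modelling task mentioned above can be sketched in its simplest form, averaging the predictions of two different models; the toy data and the choice of linear and quadratic fits below are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression target: a noisy straight line.
x = np.linspace(0, 10, 200)
y = 2.0 * x + 1.0 + rng.normal(0, 1.0, size=x.shape)

# Two deliberately different "models": a linear fit and a quadratic fit.
lin = np.polyval(np.polyfit(x, y, 1), x)
quad = np.polyval(np.polyfit(x, y, 2), x)

# The simplest ensemble: average the two models' predictions.
ensemble = (lin + quad) / 2.0

def rmse(pred):
    return float(np.sqrt(np.mean((pred - y) ** 2)))

print({"linear": rmse(lin), "quadratic": rmse(quad), "ensemble": rmse(ensemble)})
```

Because RMSE is convex in the predictions, the averaged ensemble can never do worse than the worse of its two members; in practice, weighted averaging or stacking is used rather than a plain mean.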

Required skills

We are looking for candidates with the following skills and experience:

  • Strong knowledge of python development in the frame of industrial projects

  • Experience in data mining & machine-learning

  • Knowledge of Scala, Java, C++, etc.; familiarity with R and SQL is an asset

  • Experience using business intelligence tools

  • Analytical mindset

  • Strong math skills (e.g. statistics, algebra)

  • Problem-solving aptitude

  • Excellent communication and presentation skills

  • PLM knowledge and 3D CAD programming would be a plus

  • French & English: advanced level

  • London, UK

The Role

As a Quant Platform Developer at AHL you will be building the tools, frameworks, libraries and applications which power our Quantitative Research and Systematic Trading. This includes responsibility for the continued success of “Raptor”, our in-house Quant Platform, next generation Data Engineering, and evolution of our production Trading System as we continually expand the markets and types of assets we trade, and the styles in which we trade them. Your challenges will be varied and might involve building new high performance data acquisition and processing pipelines, cluster-computing solutions, numerical algorithms, position management systems, visualisation and reporting tools, operational user interfaces, continuous build systems and other developer productivity tools.

The Team

Quant Platform Developers at AHL are all part of our broader technology team, members of a group of over sixty individuals representing eighteen nationalities. We have varied backgrounds including Computer Science, Mathematics, Physics, Engineering – even Classics - but what unifies us is a passion for technology and writing high-quality code.

Our developers are organised into small cross-functional teams, with our engineering roles broadly of two kinds: “Quant Platform Developers” otherwise known as our “Core Techs”, and “Quant Developers” which we often refer to as “Sector Techs”. We use the term “Sector Tech” because some of our teams are aligned with a particular asset class or market sector. People often rotate teams in order to learn more about our system, as well as find the position that best matches their interests.

Our Technology

Our systems are almost all running on Linux and most of our code is in Python, with the full scientific stack: numpy, scipy, pandas, scikit-learn to name a few of the libraries we use extensively. We implement the systems that require the highest data throughput in Java. For storage, we rely heavily on MongoDB and Oracle.
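
As a flavour of the scientific-stack code described above, here is a minimal, hypothetical pandas sketch (a toy price series invented for illustration, not AHL code):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)

# Hypothetical daily price series for a single instrument.
prices = pd.Series(
    100 + np.cumsum(rng.normal(0, 1, 250)),
    index=pd.bdate_range("2023-01-02", periods=250),
    name="price",
)

# Daily returns and a 20-day moving average: bread-and-butter
# operations in a pandas-based research stack.
returns = prices.pct_change()
ma20 = prices.rolling(20).mean()

print(prices.tail(1).round(2))
```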

We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log shipping and monitoring, Docker for containerisation, OpenStack for our private cloud, Ansible for architecture automation, and HipChat for internal communication. But our technology list is never static: we constantly evaluate new tools and libraries.

Working Here

AHL has a small-company, no-attitude feel. It is flat-structured, open, transparent and collaborative, and you will have plenty of opportunity to grow and have enormous impact on what we do. We are actively engaged with the broader technology community.

  • We host and sponsor London’s PyData and Machine Learning Meetups

  • We open-source some of our technology. See

  • We regularly talk at leading industry conferences, and tweet about relevant technology and how we’re using it. See @manahltech

We’re fortunate enough to have a fantastic open-plan office overlooking the River Thames, and continually strive to make our environment a great place in which to work.

  • We organise regular social events, everything from photography to climbing, karting and wine tasting, plus monthly team lunches

  • We have annual away days and off-sites for the whole team

  • We have a canteen with a daily allowance for breakfast and lunch, and an on-site bar for the evening

  • As well as PCs and Macs, in our office you’ll also find numerous pieces of cool tech such as light cubes and 3D printers, guitars, ping-pong and table-football, and a piano.

We offer competitive compensation, a generous holiday allowance, various health and other flexible benefits. We are also committed to continuous learning and development via coaching, mentoring, regular conference attendance and sponsoring academic and professional qualifications.

Technology and Business Skills

At AHL we strive to hire only the brightest, most highly skilled and most passionate technologists.


  • Exceptional technology skills; recognised by your peers as an expert in your domain

  • A proponent of strong collaborative software engineering techniques and methods: agile development, continuous integration, code review, unit testing, refactoring and related approaches

  • Expert knowledge in one or more programming languages, preferably Python, Java and/or C/C++

  • Proficient on Linux platforms with knowledge of various scripting languages

  • Strong knowledge of one or more relevant database technologies e.g. Oracle, MongoDB

  • Proficient with a range of open source frameworks and development tools e.g. NumPy/SciPy/Pandas, Pyramid, AngularJS, React

  • Familiarity with a variety of programming styles (e.g. OO, functional) and in-depth knowledge of design patterns.


  • An excellent understanding of financial markets and instruments

  • Experience of front office software and/or trading systems development e.g. in a hedge fund or investment bank

  • Expertise in building distributed systems with service-based or event-driven architectures, and concurrent processing

  • A knowledge of modern practices for data engineering and stream processing

  • An understanding of financial market data collection and processing

  • Experience of web based development and visualisation technology for portraying large and complex data sets and relationships

  • Relevant mathematical knowledge e.g. statistics, asset pricing theory, optimisation algorithms.

Personal Attributes

  • Strong academic record and a degree with high mathematical and computing content e.g. Computer Science, Mathematics, Engineering or Physics from a leading university

  • Craftsman-like approach to building software; takes pride in engineering excellence and instils these values in others

  • Demonstrable passion for technology e.g. personal projects, open-source involvement

  • Intellectually robust with a keenly analytic approach to problem solving

  • Self-organised with the ability to effectively manage time across multiple projects and with competing business demands and priorities

  • Focused on delivering value to the business with relentless efforts to improve process

  • Strong interpersonal skills; able to establish and maintain a close working relationship with quantitative researchers, traders and senior business people alike

  • Confident communicator; able to argue a point concisely and deal positively with conflicting views.

  • Zürich, Switzerland


Start: As soon as possible

Location: Zürich, 100%


FELFEL is a start-up based in Zürich, Switzerland. The company has revolutionized how people eat at work with its intelligent technology, the ‘FELFEL fridge’. The company is among the fastest-growing start-ups in Switzerland, with over 60 employees.

FELFEL's technology team of seven engineers has built the heart of FELFEL's product and is responsible for the company's success as a leading European foodtech company. The team consists of Front-End Developers, Back-End Developers, Data Scientists, and a Mobile Developer.

As a member of the technology team, you will work very directly with our founders, the CTO, and our other teams (product, growth, sales, support, etc.) to improve the experience for our end-customers eating at work every day, as well as for the internal users of our technology.


  • Lots of good (like really good ;)) food: just steps away from your desk – we make all food dreams come true…

  • Modern way of working: we use the latest technologies and offer lots of time flexibility to individual team members (what you do matters more than when in the day you do it).

  • Beautiful office in Zürich, Switzerland: a large open space on the top floor with plenty of room to hide away or to socialize with other team members. And of course, there is free lunch every day at our long, wooden table. P.s. there is a Lausanne office too where you can escape every now and then if you like…

  • International start-up atmosphere: the company language is English – our team is very diverse and consists of over 10 nationalities.

  • Work with direct impact: Exciting backend challenges to solve that will have very direct impact on our product.

  • Very little red tape. If you like to get things done using the latest in tech rather than talking about it, you'll feel right at home. You work directly with the CTO and other decision makers in the team (very little hierarchy).

  • Great place to work: we are a family business with strong values & beliefs - a great, warm-hearted team is waiting for you

  • Relocation support if you are based outside of Switzerland. We know that moving can be a challenge in terms of housing & documentation – we will do our best to make it as smooth as possible

  • Great team: we are very selective – and only hire the best chess players, cooks or video gamers ;)


  • You will develop and own part of the technical infrastructure for our data scientists to maintain high reliability and up-time

  • You will build and maintain live data visualizations and business intelligence tools to build a single source of truth for all of our teams to use on a daily basis

  • You will build and maintain customer e-mail engines to enhance engagement with our end-customers


  • You are an experienced full-stack developer with broad programming skills - meaning you are able to build production-ready software that meets the needs of our data science team

  • Python has been your language of choice for a long time

  • You can architect and implement robust software that embraces change, and you understand distributed systems (microservice) architecture and APIs

  • You enjoy building tools for data visualization and reporting

  • You are highly interested in Machine Learning and/or Big Data

  • You possess a degree in Computer Science or related fields

  • You thrive in a collaborative environment involving different stakeholders and subject matter experts

  • We like to learn new things, fail fast and explore new territories - so do you

  • Your Attitude: ‘The Answer is Yes’

  • The Company Language is English - you are fluent in written and spoken English. If you speak German and/or French - even better. 

  • Authentic Love & Passion for good food

  • You never smell like cigarettes and don’t smoke during the day: FELFEL is about healthy living.

  • EU/EFTA or Swiss work permit.
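As a purely illustrative aside, the microservices-and-APIs point above can be sketched with nothing but the Python standard library; the /health endpoint and its payload are hypothetical, not part of FELFEL's actual stack:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical sketch of the smallest possible JSON microservice
# endpoint: a /health check that a monitoring system could poll.
class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep the example quiet
        pass

# Bind to an ephemeral port, serve in the background, and query it once.
server = HTTPServer(("127.0.0.1", 0), HealthHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
with urllib.request.urlopen(f"http://127.0.0.1:{port}/health") as resp:
    payload = json.loads(resp.read())
server.shutdown()
```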


Interested? Then please send your CV to including a short motivation letter why you are the person who should take on this role.

Please make sure to mention the job title 'FULL-STACK PYTHON DEVELOPER ANALYTICS’ in the subject line!


FELFEL revolutionizes how people eat at work with smart technology. The intelligent fridge 'FELFEL' makes it possible: good food all day long for employees at work, by the best local chefs.

Over 30'000 employees in Switzerland already benefit from a FELFEL fridge at work in over 250 companies. 

FELFEL is a family-owned company and supports small, local producers in Switzerland. Sustainability is a core company value. The company has won several prestigious awards, among others the Swiss Economic Forum Award in 2017. 

The company was founded in 2013, and counts over 60 employees today. Friendship, respect and 'eating good food together' are key elements of its company culture. You can find more insights on our company and our team on LinkedIn, Facebook, Instagram and of course our Website

TRA Robotics
  • Berlin, Germany

We are engineers, designers and technologists, united by the idea of shaping the future. Our mission is to reimagine the manufacturing process. It will be fully software-defined. It will be driven entirely by AI. This will mean new products get to market much faster.

Now we are working on creating a flexible robotic factory managed by AI. We are developing and integrating a stack of products that will facilitate the whole production process from design to manufacturing. Our goal is complex and deeply rooted in science. We understand that it is only achievable in collaboration across diverse disciplines and knowledge domains.

We're looking for a Computer Vision Lead to become part of the team in our new Berlin office.

About the project:

We want our robots to have perfect vision. As a team leader, you will manage a distributed team of highly skilled engineers as well as create algorithms for identification, localization and tracking of objects based on both classic computer vision and deep learning.
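One classic building block of such tracking pipelines is associating detected objects between consecutive frames. The sketch below shows greedy nearest-neighbour association of object centroids using only the standard library; the coordinates, distance threshold and function name are invented for illustration, and real systems (e.g. with OpenCV or deep detectors) layer much more on top:

```python
import math

# Hypothetical sketch: greedily match each previous-frame centroid to
# the closest unclaimed new-frame centroid within max_dist pixels.
def associate(prev_centroids, new_centroids, max_dist=50.0):
    """Return {prev_index: new_index} matches within max_dist."""
    matches = {}
    used = set()
    for i, (px, py) in enumerate(prev_centroids):
        best_j, best_d = None, max_dist
        for j, (nx, ny) in enumerate(new_centroids):
            if j in used:
                continue
            d = math.hypot(nx - px, ny - py)
            if d < best_d:
                best_j, best_d = j, d
        if best_j is not None:
            matches[i] = best_j
            used.add(best_j)
    return matches

# Two objects swap positions in the detection list between frames;
# the association recovers the correct identities.
prev_frame = [(10.0, 10.0), (100.0, 100.0)]
next_frame = [(103.0, 98.0), (12.0, 11.0)]
```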

Your Qualifications:

  • Proficiency with C/C++, Python

  • Strong knowledge of CV algorithms, ML/DL algorithms

  • Extensive experience in OpenCV, TensorFlow, CUDA

  • Deep understanding of optimization methods, machine learning, linear algebra, probability theory, mathematical statistics and real-time systems

  • Experience managing distributed teams

  • Fluency in English

Will be an advantage:

  • IoT experience

  • Matlab

  • Java


  • location systems

Your tasks:

  • Working with sensors of various types (2D / 3D cameras, lidars, 3D scanners)

  • Development of computer vision algorithms

  • Solving the problem of identification and localization of the object

  • Development of visual quality control system

What we offer:

  • To join highly scientific-intensive culture and take part in developing the unique product

  • The ability to choose technology stack and approaches

  • Yearly educational budget - we support your ambitions to learn

  • Relocation package - we would like to make your start as smooth as possible

  • Flexible working environment - choose your working hours and equipment

  • Cozy co-working space in Berlin-Mitte with an access to a terrace

  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

As a Data Science Engineer in Comcast dx, you will research, model, develop and support data pipelines and deliver insights for key strategic initiatives. You will develop or utilize complex programmatic and quantitative methods to find patterns and relationships in data sets; lead statistical modeling or other data-driven problem-solving analysis to address novel or abstract business operation questions; and incorporate insights and findings into a range of products.

Assist in the design and development of collection and enrichment components focused on quality, timeliness, scale and reliability. Work on real-time data stores and a massive historical data store using best-of-breed, industry-leading technology.


-Develop and support data pipelines

-Analyze massive amounts of data in both real-time and batch processing utilizing Spark, Kafka, and AWS technologies such as Kinesis, S3, Elasticsearch, and Lambda

-Create detailed write-ups of the processes, logic, and methodologies used for creation, validation, analysis, and visualizations. Write-ups shall be produced within a week of a process being created, and updated in writing when changes occur.

-Prototype ideas for new ML/AI tools, products and services

-Centralize data collection and synthesis, including survey data, enabling strategic and predictive analytics to guide business decisions

-Provide expert and professional data analysis to implement effective and innovative solutions meshing disparate data types to discover insights and trends.

-Employ rigorous continuous delivery practices managed under an agile software development approach

-Support DevOps practices to deploy and operate our systems

-Automate and streamline our operations and processes

-Troubleshoot and resolve issues in our development, test and production environments
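The real-time versus batch distinction above often comes down to windowed aggregation. Here is a dependency-free Python sketch of a tumbling window, the pattern frameworks like Spark Streaming provide at scale; the event shape (timestamp, value) and the function name are invented for illustration:

```python
from collections import defaultdict

# Hypothetical sketch (not Spark itself): bucket timestamped events
# into fixed-size, non-overlapping ("tumbling") windows and sum the
# values in each window.
def tumbling_window_sums(events, window_seconds=60):
    """events: iterable of (timestamp_seconds, value) pairs."""
    windows = defaultdict(float)
    for ts, value in events:
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start] += value
    return dict(windows)

# Four events spread across three 60-second windows.
events = [(5, 1.0), (30, 2.0), (65, 4.0), (130, 8.0)]
```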

Here are some of the specific technologies and concepts we use:

-Spark Core and Spark Streaming

-Machine learning techniques and algorithms

-Java, Scala, Python, R

-Artificial Intelligence

-AWS services including EMR, S3, Lambda, ElasticSearch

-Predictive Analytics

-Tableau, Kibana

-Git, Maven, Jenkins



-Hadoop (HDFS, YARN)

Skills & Requirements:

-5-8 years of Java experience; Scala and Python experience a plus

-3+ years of experience as an analyst, data scientist, or related quantitative role.

-3+ years of relevant quantitative and qualitative research and analytics experience. Solid knowledge of statistical techniques.

-Bachelor's degree in Statistics, Math, Engineering, Computer Science or a related discipline. Master's Degree preferred.

-Experience in software development of large-scale distributed systems including proven track record of delivering backend systems that participate in a complex ecosystem

-Experience with more advanced modeling techniques (e.g. machine learning)

-Distinctive problem-solving and analysis skills and impeccable business judgment.

-Experience working with imperfect data sets that, at times, will require improvements to process, definition and collection

-Experience with real-time data pipelines and components including Kafka, Spark Streaming

-Proficient in Unix/Linux environments

-Test-driven development/test automation, continuous integration, and deployment automation

-Excellent communicator, able to analyze and clearly articulate complex issues and technologies understandably and engagingly

-Team player is a must

-Great design and problem-solving skills

-Adaptable, proactive and willing to take ownership

-Attention to detail and high level of commitment

-Thrives in a fast-paced agile environment

About Comcast dx:

Comcast dx is a result-driven big data engineering team responsible for delivery of the multi-tenant data infrastructure and platforms necessary to support our data-driven culture and organization. dx has an overarching objective to gather, organize, and make sense of Comcast data with the intention to reveal business and operational insight, discover actionable intelligence, enable experimentation, empower users, and delight our stakeholders. Members of the dx team define and leverage industry best practices, work on large-scale data problems, design and develop resilient and highly robust distributed data organizing and processing systems and pipelines, as well as research, engineer, and apply data science and machine intelligence disciplines.

Comcast is an EOE/Veterans/Disabled/LGBT employer