Novartis Institutes for BioMedical Research
  • Cambridge, MA

20 years of untapped data are waiting for a new Principal Scientific Computing Engineer/Scientific Programmer, Imaging and Machine Learning, to unlock the next breakthrough in innovative medicines for patients in need. You will be at the forefront of life sciences, tackling incredible challenges that cure diseases and improve patients' lives.

Your responsibilities include, but are not limited to: 
Collaborating with scientists to create, optimize, and accelerate workflows through the application of high-performance computing techniques. You will integrate algorithm and application development with state-of-the-art technologies to create scalable platforms that accelerate scientific research in a reproducible and standardized manner.

Key responsibilities:
• Collaborate with scientists and Research IT peers to provide consulting services around parallel algorithm development and workflow optimization for the High Performance Computing (HPC) platform.
• Teach and train the NIBR scientific and informatics community in areas of expertise.
• Research, develop and integrate new technologies and computational approaches to enhance and accelerate scientific research.
• Establish and maintain the technical partnership with one or more scientific functions in NIBR.

Minimum Requirements

What you will bring to this role:
• BSc in computer science or a related field, or equivalent experience
• A minimum of 5 years of relevant experience, including strong competencies in data structures, algorithms, and software design
• Experience with High Performance Computing and Cloud Computing
• Demonstrated proficiency in Python, C, C++, CUDA or OpenCL
• Demonstrated proficiency in Signal Processing, Advanced Imaging and Microscopy techniques.
• Solid project management skills and process analysis skills
• Demonstration of strong collaboration skills, effective communication skills, and creative problem-solving abilities

Preferred Qualifications
• MSc degree
• Demonstrated proficiency in 2 or more advanced machine learning frameworks and their application to natural language processing, action recognition, and micro- and macro-tracking.
• Interest in drug discovery and knowledge of the life sciences is a strong plus
• Knowledge of Deep visualization, Deep transfer learning and Generative Adversarial Networks is a plus.
• Demonstrated proficiency in MPI in a large-scale Linux HPC environment
• Experience with CellProfiler, Matlab, ImageJ and R is a plus.

Position will be filled commensurate with experience

  • Houston, TX

FAST GN&C Modeling and Simulation Engineer

Are you passionate about human space exploration and understanding the origins of the universe? Are you seeking a position that will offer you the opportunity to work with a dynamic and diverse team where you will make a difference? If so, we need you.  

We are excited about what we do, and we need you on our team as we take on new challenges for Johnson Space Center (JSC)/NASA's pursuits in deep space exploration.

Bring your GN&C modeling and simulation skills to the Flight Analysis and Simulation (FAST) tool. FAST is a generic 3- to 6-DOF multi-body ascent, aerocapture, entry, descent, and landing (A2EDL) simulation code based in the state-of-the-art, object-oriented Trick simulation environment. FAST is currently used to develop Orion trajectory designs, GN&C algorithms, and validate their implementation in flight software. FAST requires improvements to provide flight performance analysis, algorithm development, and flight software contributions for future programs including the Commercial Crew Program, Orion, and missions to the Moon and Mars.

As a FAST GN&C Modeling and Simulation Engineer you will:

    • Implement improvements and enhance FAST capabilities
      • Improve the usability of the FAST user interface to simplify prototyping of algorithms and models
      • Enhance 3- and 6-DOF powered ascent capabilities for lunar and Mars missions
      • Create an improved generic optimization capability
      • Incorporate new vehicle, environment, and aerodynamic models for Commercial Crew and lunar/Mars
      • Implement heritage and new flight guidance algorithms, control systems, and navigation models
      • Perform and document model/simulation verification and validation.
    • Achieve Class C certification for use in developing/verifying flight software
      • Expand unit testing and improve model documentation.
    • Improve on-boarding of new users
      • Create a detailed user's guide with example cases of historical vehicles.
    • Directly interact with the NASA customers, along with various NASA support contractors, during technical meetings and working groups.
    • Perform other duties as required.

Required Education/Experience/Skills:

    • BS degree in engineering from an accredited engineering school
    • A minimum of five (5) years of related engineering experience, or an MS degree from an accredited engineering school and a minimum of four (4) years of related engineering experience, or Ph.D. from an accredited engineering school and a minimum of zero (0) years of related experience
    • Expertise in C/C++, Linux, and Python
    • Experience with the development, verification and validation of 6-DOF GN&C spacecraft simulations
    • Ability to apply knowledge and experience towards timely completion of technical products and services
    • Excellent communication skills and the ability to work in a team environment consisting of NASA civil servants and various contractor employees
    • Strong leadership skills


Desired Education/Experience/Skills:

    • MS degree in Aerospace Engineering with an emphasis in GN&C principles and orbital/flight mechanics
    • Experience with:
      • JSC TRICK Simulation Environment
      • JEOD (JSC Engineering Orbital Dynamics) modeling package
      • LaTeX
    • Previous experience with simulation visualization and graphics packages, particularly JSC EDGE or DOUG
    • Familiarity with MATLAB/Simulink
    • Experience in product delivery and management utilizing source code management tools such as Jenkins and Git/GitLab

If you have the qualifications and passion for this amazing opportunity, please send your updated resume directly to Kim Cordray at

    • Must be a U.S. Citizen and successfully complete a U.S. government background investigation.
    • Management has the prerogative to select at any level for which this position has been advertised.

For more information on our partnership with NASA at Johnson Space Center, please visit

Essential Functions

Work Environment
Generally an office environment, but can involve inside or outside work depending on task.

Physical Requirements
Work may involve sitting or standing for extended periods (90% of time). May require lifting and carrying up to 25 lbs (5% of time).

Equipment and Machines
Standard office equipment (PC, telephone, printer, etc.).

Regular attendance in accordance with established work schedule is critical. Ability to work outside normal schedule and adjust schedule to meet peak periods and surge requirements when required.

Other Essential Functions
Must be able to work in a team atmosphere. Must put forward a professional behavior that enhances productivity and promotes teamwork and cooperation. Grooming and dress must be appropriate for the position and must not impose a safety risk/hazard to the employee or others.

JHU Applied Physics Laboratory
  • Laurel, MD
  • Salary: $100k - 140k

The Johns Hopkins Applied Physics Laboratory (APL) is a national leader in scientific research and development. APL is actively seeking a Senior Data Scientist for the Health Data Sciences & Analytics Group. The Senior Data Scientist will support the National Health Mission Area, whose aim is to revolutionize health through science and engineering. JHU/APL is located midway between Baltimore and Washington, DC.

The Health Data Science and Analytics Group provides cutting-edge analytics contributions to national and global public health and healthcare challenges, developing solutions in Health Informatics, Population Health, Precision Medicine, Digital Health, Analytics, and Software Systems. Our multidisciplinary team of engineers and scientists develops statistical and machine learning algorithms and incorporates visual analytics into software systems that process large and complex data sets. We are looking for data scientists, computer scientists, applied mathematicians, statisticians, and software developers who are creative problem solvers and team players dedicated to building world-class expertise to provide solutions for health and healthcare systems around the globe.

Job Summary:
Design and develop novel computational algorithms and statistical methods and design corresponding data architectures to analyze large and complex data for a variety of challenging health and healthcare problems.
1. Develop advanced algorithms and create software applications to perform analytics on large-scale and complex data for real-world health and healthcare applications. Promote a climate conducive to intellectual curiosity, creativity, innovation, collaboration, growth, life-long learning, productivity, and respect for others.
2. Be a leader in data science and analytics efforts. Provide input to team leads and other analysts to help define the team's vision, design and execute analytic projects from start to finish, inform technical direction, and support reporting of accomplishments. Ensure milestones are met on time and be responsive to sponsor needs. Build collaboration among health stakeholders, working across organizations to bring consensus to achieve objectives. Become a sought-after resource by consistently producing high-quality results.
3. Document and present papers to communicate impact of research and engage with sponsor and stakeholder community.
4. Communicate often and effectively with team, sponsors and JHU/APL leadership. Participate in the data science, analytics and APL community. Take advantage of collaboration and innovation opportunities to help ensure success of APL’s mission.

Note: This job summary and listing of duties is for the purpose of describing the position and its essential functions at time of hire and may change over time.

Qualifications - External
Required Qualifications:
• M.S. in Computer Science, Information Science, Mathematics, Statistics, Data Science, or related field.
• 5-10+ years of experience.
• Demonstrated ability in selecting, developing, and applying machine learning and data mining algorithms.
• Working knowledge of modern large-scale data systems and architectures; ability to manage and manipulate large disparate data sets.
• Experience with graph analytics.
• Experience with pattern recognition, statistical analysis, and machine learning; fluency, with hands-on experience, in some of the following implementation languages: Python, R, MATLAB, Java, or C/C++.
• Excellent interpersonal skills and outstanding written and oral communication skills; ability to articulate complex technical issues effectively and appropriately for a wide range of audiences.
• Strong problem-solving, analytical, and organizational skills; ability to work independently or within a group.
• Must be eligible for Secret clearance requiring background investigation.

Desired Qualifications:
• Ph.D. in the disciplines listed above.
• Demonstrated capability to carry out original machine learning research beyond incremental application of existing techniques, as evidenced by publications in premier conferences.
• Research records that illustrate in-depth understanding of underlying theory necessary to develop novel algorithms to address unique real-world challenges.
• Extensive experience in developing and applying machine learning algorithms in health and healthcare application settings.
• Research experience with advanced machine learning research topics.
• Experience with data-driven predictive model development, unstructured text mining, natural language processing, and anomaly and novelty detection.
• A strong technical writing background.
• Experience in medicine, emergency response, or public health applications and/or exposure to clinical information systems and medical data standards.

Special Working Conditions: Some travel to local sponsor sites and support for field testing may be required.

Security: Applicant selected will be subject to a government security clearance investigation and must meet the requirements for access to classified information. Eligibility requirements include U.S. citizenship.

Equal Employment Opportunity: Johns Hopkins University/Applied Physics Laboratory (APL) is an Equal Opportunity/Affirmative Action employer that complies with Title IX of the Education Amendments Acts of 1972, as well as other applicable laws. All qualified applicants will receive consideration for employment without regard to race, color, religion, sexual orientation, gender identity, national origin, disability, or protected Veteran status.

Huntech USA LLC
  • San Diego, CA

Great opportunity to work with the leader in the semiconductor industry, who unveiled the world's first 7-nanometer PC platform, created from the ground up for the next generation of personal computing by bringing new features with thin and light designs, allowing for new form factors in the always-on, always-connected category. It features a new octa-core CPU, the fastest CPU the company has ever designed and built, with a larger cache than previous compute platforms, faster multi-tasking, and increased productivity for users, disrupting the performance expectations of current thin, light, and fanless PC designs. This platform is currently sampling to customers and is expected to begin shipping in commercial devices in Q3 of 2019.

Staff Data Analyst

You will study the performance of the Global Engineering Grid/design workflows across the engineering grid and provide insights and effective analytics in support of the Grid 2.0 program. You will conduct research, design statistical studies, and analyze data in support of the Grid 2.0 program. This job will challenge you to dive deep into the engineering grid/design flow world and understand the unique challenges of operating an engineering grid at a scale unrivaled in the industry. You should have experience working in an EDA or manufacturing environment and be comfortable working in an environment where problems are not always well-defined.


  • Identify and pursue opportunities to improve the efficiency of the global engineering grid and design workflows.
  • Develop systems to ingest, analyze, and take automated action across real-time feeds of high-volume data.
  • Research and implement new analytics approaches for effective deployment of machine learning/data modeling to solve business problems; identify patterns and trends from large, high-dimensional data sets; manipulate data into digestible and actionable reports.
  • Make business recommendations (e.g., cost-benefit, experiment analysis) with effective presentations of findings at multiple levels of stakeholders through visual displays of quantitative information.
  • Plan effectively to set priorities and manage projects; identify roadblocks and come up with technical options.

Leverage your 8+ years of experience articulating business questions and using mathematical techniques to arrive at an answer using available data. Advanced Tableau experience (3-4 years) is a must. Experience translating analysis results into business recommendations. Experience with statistical software (e.g., R, Python, MATLAB, pandas, Scala) and database languages like SQL. Experience with data warehousing concepts (Hadoop, MapR) and visualization tools (e.g., QlikView, Tableau, Angular, ThoughtSpot). Strong business acumen, critical thinking ability, and attention to detail.

Background in data science, applied mathematics, or computational science and a history of solving difficult problems using a scientific approach, with an MS or BS degree in a quantitative discipline (e.g., Statistics, Applied Mathematics, Operations Research, Computer Science, Electrical Engineering) and an understanding of how to design scientific studies. You should be familiar with the state of the art in machine learning/data modeling/forecasting and optimization techniques in a big data environment.

Data Analytics Software Test Engineer

As a member of the Corporate Engineering Services Group software test team, you will be responsible for testing various cutting edge data analytics products and solutions. You will be working with a dynamic engineering team to develop test plans, execute test plans, automate test cases, and troubleshoot and resolve issues.

Leverage your 1+ years of experience in the following:

  • Testing and systems validation for commercial software systems.
  • Testing of systems deployed in AWS Cloud.
  • Knowledge of SQL and databases.
  • Developing and implementing software and systems test plans.
  • Test automation development using Python or Java.
  • Strong problem solving and troubleshooting skills.
  • Experience in testing web-based and Android applications.
  • Familiar with Qualcomm QXDM and APEX tools.
  • Knowledge of software development in Python.
  • Strong written and oral communication skills
  • Working knowledge of JIRA and GitHub is preferred.


  • Required: Bachelor's, Computer Engineering and/or Computer Networks & Systems and/or Computer Science and/or Electrical Engineering
  • Preferred: Master's, Computer Engineering and/or Computer Networks & Systems and/or Computer Science and/or Electrical Engineering or equivalent experience

Interested? Please send a resume to our Founder & CEO, Raj Dadlani at and he will respond to interested candidates within 24 hours of resume receipt. We are dealing with a highly motivated hiring manager and shortlisting viable candidates by February 22, 2019.

Brighter Brain
  • Atlanta, GA

Brighter Brain is seeking a skilled professional to serve as an internal resource for our consulting firm in the field of Data Science Development. Brighter Brain provides Fortune 500 clients throughout the United States with IT consultants in a wide-ranging technical sphere.

In order to fully maintain our incoming nationwide and international hires, we will be hiring a Senior Data Science SME (ML) with practical experience to relocate to Atlanta and coach/mentor our incoming classes of consultants. If you have a strong passion for the Data Science platform and are looking to join a seasoned team of IT professionals, this could be an advantageous next step.

Brighter Brain is an IT Management & Consulting firm providing a unique take on IT consulting. We currently offer expertise to US clients in the fields of Mobile Development (iOS and Android), Hadoop, Microsoft SharePoint, and Exchange/Office 365. We are currently seeking a highly skilled professional to serve as an internal resource for our company in the field of Data Science with expertise in Machine Learning (ML).

The ideal candidate will be responsible for establishing our Data Science practice. Responsibilities include creation of a comprehensive training program and training, mentoring, and supporting candidates as they progress toward building their careers in Data Science consulting. This position is based out of our head office in Atlanta, GA.


The Senior Data Science SME will take on the following responsibilities:

-       Design, develop, and maintain Data Science training material focused on ML; knowledge of DL, NN, and NLP is a plus.

-       Interview potential candidates to ensure that they will be successful in the Data Science domain and training.

-       Train, guide and mentor junior to mid-level Data Science developers.

-       Prepare mock interviews to enhance the learning process provided by the company.

-       Prepare and support consultants for interviews for specific assignments involving development and implementation of Data Science.

-       Act as a primary resource for individuals working on a variety of projects throughout the US.

-       Interact with our Executive and Sales team to ensure that projects and employees are appropriately matched.

The ideal candidate will not only possess a solid knowledge of the realm, but must also have fluency in the following areas:

-       Hands-on expertise in applying data science and building machine learning and deep learning models

-       Statistics and data modeling experience

-       Strong understanding of data sciences

-       Understanding of Big Data

-       Understanding of AWS and/or Azure

-       Understanding of the differences between TensorFlow, MXNet, etc.

Skills Include:

  • Master's degree in Computer Science or a mathematics field
  • 10+ years of professional experience in the IT industry, in the AI realm

  • Strong understanding of MongoDB, Scala, Node.js, AWS, & Cognitive applications
  • Excellent knowledge of Python, Scala, JavaScript and its libraries, Node.js, R, MATLAB, C/C++, Lua, or any proficient AI language of choice
  • NoSQL databases, bot frameworks, data streaming, and integrating unstructured data; rules engines (e.g., Drools) and ESBs (e.g., MuleSoft)
  • Computer Vision, Recommendation Systems, Pattern Recognition, Large-Scale Data Mining, or Artificial Intelligence and Neural Networks
  • Deep Learning frameworks such as TensorFlow, Torch, Caffe, Theano, and CNTK; scikit-learn, NumPy, SciPy
  • Working knowledge of ML methods such as Naïve Bayes Classification, Ordinary Least Squares Regression, Logistic Regression, Support Vector Machines, Ensemble Methods, Clustering Algorithms, Principal Component Analysis, Singular Value Decomposition, and Independent Component Analysis
  • Natural Language Processing (NLP) concepts such as topic modeling, intents, and entities, and NLP frameworks such as spaCy, NLTK, MeTA, gensim, or other toolkits for Natural Language Understanding (NLU)
  • Experience with data profiling, data cleansing, data wrangling/munging, and ETL
  • Familiarity with Spark MLlib, Mahout, and the Google, Bing, and IBM Watson APIs
  • Hands-on experience, as needed, with training a variety of consultants
  • Analytical and problem-solving skills
  • Knowledge of the IoT space
  • Understanding of academic data science vs. corporate data science
  • Knowledge of the Consulting/Sales structure

Additional details about the position:

-       Able to relocate to Atlanta, GA (relocation package available)

-       Work schedule of 9 AM to 6 PM EST

Questions: Send your resume to Ansel Butler at Brighter Brain; make sure that there is a valid phone number and Skype ID either on the resume, or in the body of the email.

Ansel Essic Butler


404 791 5128


Senior Corporate Recruiter

Brighter Brain LLC.

1785 The Exchange, Suite 200

Atlanta, GA. 30339

TRA Robotics
  • Berlin, Germany

We are engineers, designers, and technologists, united by the idea of shaping the future. Our mission is to reimagine the manufacturing process. It will be fully software-defined and driven entirely by AI. This will mean new products get to market much quicker.

Now we are working on creating a flexible robotic factory managed by AI. We are developing and integrating a stack of products that will facilitate the whole production process from design to manufacturing. Our goal is complex and deeply rooted in science. We understand that it is only achievable in collaboration across diverse disciplines and knowledge domains.

We're looking for a Computer Vision Lead to become part of the team in our new Berlin office.

About the project:

We want our robots to have perfect vision. As a team leader, you will manage a distributed team of highly skilled engineers and create algorithms for identification, localization, and tracking of objects based on both classic computer vision and deep learning.

Your Qualifications:

  • Proficiency with C/C++, Python

  • Strong knowledge of CV algorithms, ML/DL algorithms

  • Extensive experience in OpenCV, TensorFlow, CUDA

  • Deep understanding of optimization methods, machine learning, linear algebra, probability theory, mathematical statistics, and real-time systems

  • Experience managing distributed teams

  • Fluency in English

Will be an advantage:

  • IoT experience

  • Matlab

  • Java


  • location systems

Your tasks:

  • Working with sensors of various types (2D / 3D cameras, lidars, 3D scanners)

  • Development of computer vision algorithms

  • Solving the problem of identification and localization of the object

  • Development of visual quality control system

What we offer:

  • Join a highly science-intensive culture and take part in developing a unique product

  • The ability to choose technology stack and approaches

  • Yearly educational budget - we support your ambitions to learn

  • Relocation package - we would like to make your start as smooth as possible

  • Flexible working environment - choose your working hours and equipment

  • Cozy co-working space in Berlin-Mitte with access to a terrace

Burnett Specialists / Choice Specialists
  • Houston, TX

Medical Economics & Informatics Analyst


This position is responsible for supporting business analysis, utilization analysis, performance results monitoring, reports development, and analytical support within the Medical Economics and Informatics Department.


Extracts, manages, and analyzes operational, claims, and performance data to identify trends, patterns, insights, and outliers within data.

Translates transactional data into client ready deliverables using visualization tools available, such as Tableau.

Drives a repeatable analytic process and consistently delivers best-in-class reporting to multiple business stakeholders.

Studies client specific data at multiple geographical and clinical levels to identify utilization and cost trends and provide recommendations and insights.

Contributes in analyzing and developing reports on client specific utilization trends, program savings and performance to be shared with internal Client Services team.

Provides analytical and technical support for development and QA of Client Level Dashboards and other recurrent reporting.

Collaborates on design/development/automation of the Standard Client Reporting Package, including dashboards.

Contributes data deliverables & takeaways for Quarterly Business Review meetings.

Responsible for ad-hoc Client and Markets requests for mature programs.

Collaborates on processes to integrate new client data (claims/membership) and perform quality control tests.


·         Bachelor's degree or higher in Healthcare Informatics, Health Care Statistics, Public Health Economics, Epidemiology, Mathematics, Computer Science/IT, a related field, or equivalent experience.

·         3+ years of experience in the Healthcare Industry and/or Managed Care Organizations

·         Experience in analytics/informatics and report development is required

·         Experience using Medical Claims data (medical cost, utilization, cost benefit analysis, etc.) is required

·         Experience with Pharmacy claims data and healthcare records is preferred

·         1+ years of experience with analytics in a data warehouse environment

·         Experience using SQL for report writing and data management

·         Direct experience in business intelligence applications, advanced data visualization tools, and/or statistical analysis software (such as SQL/MySQL/R, SAS, Tableau, Minitab, MATLAB, Crystal Reports, Business Objects Desktop/Web Intelligence, etc.)

·         Intermediate to advanced skills with Microsoft Office tools (Word, Excel, PowerPoint, Visio, Project) necessary to document, track, and present information related to company programs/products/clients

·         Knowledge of the healthcare financial business cycle, healthcare quality reporting and analysis, and benchmarking is required

·         Knowledge of health system functions, terminology, and standard ICD-10 and CPT coding systems is highly desirable

·         Excellent critical and analytical thinking skills are highly desirable

·         Ability to compile information and prepare reports that are easily translatable for client delivery

UST Global
  • San Diego, CA


- 7+ years of experience with Python

- 4+ years of experience with Java

General Responsibilities
- Selecting features, building and optimizing classifiers using machine learning techniques
- Data mining using state of the art methods
- Extending business data with third party sources of information when needed
- Enhancing data collection procedures to include information that is relevant for building analytic systems
- Processing, cleansing, and verifying the integrity of data used for analysis
- Doing ad hoc analysis and presenting results in a clear manner
- Creating automated anomaly detection systems and constant tracking of their performance
Skills and Qualifications
- Min 8 yrs of experience
- Hands on experience in Python
- Excellent understanding of machine learning techniques and algorithms.
- Experience with common data science toolkits, such as R, Weka, NumPy, MATLAB, etc.; excellence in at least one of these is highly desirable
- Great communication skills
- Experience with data visualization tools, such as GGplot, etc.
- Proficiency in using query languages such as SQL, Hive, Pig
- Experience with NoSQL databases, such as MongoDB
- Good applied statistics skills, such as distributions, statistical testing, regression, etc.
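The "automated anomaly detection systems" named in the responsibilities above can take many forms; as one hedged, minimal sketch, a z-score detector flags values far from the sample mean (the threshold and readings below are invented for illustration, and a production system would use rolling windows and more robust statistics):

```python
import statistics

# Minimal z-score anomaly detector: flag values far from the sample mean.
# Threshold and readings are invented for illustration only.

def find_anomalies(values, z_threshold=2.5):
    mean = statistics.fmean(values)
    sd = statistics.pstdev(values)   # population standard deviation
    if sd == 0:
        return []                    # constant stream: nothing to flag
    return [v for v in values if abs(v - mean) / sd > z_threshold]

readings = [10, 11, 9, 10, 12, 10, 11, 95, 10, 9]
anomalies = find_anomalies(readings)
```

Here the detector flags the single outlier (95); "constant tracking of performance" would then mean monitoring the detector's false-positive and miss rates over time.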

UST Global
  • Atlanta, GA


- 7+ years of experience with Python

- 4+ years of experience with Java

General Responsibilities
- Selecting features, building and optimizing classifiers using machine learning techniques
- Data mining using state of the art methods
- Extending business data with third party sources of information when needed
- Enhancing data collection procedures to include information that is relevant for building analytic systems
- Processing, cleansing, and verifying the integrity of data used for analysis
- Doing ad hoc analysis and presenting results in a clear manner
- Creating automated anomaly detection systems and constant tracking of their performance
Skills and Qualifications
- Min 8 yrs of experience
- Hands on experience in Python
- Excellent understanding of machine learning techniques and algorithms.
- Experience with common data science toolkits, such as R, Weka, NumPy, MATLAB, etc.; excellence in at least one of these is highly desirable
- Great communication skills
- Experience with data visualization tools, such as GGplot, etc.
- Proficiency in using query languages such as SQL, Hive, Pig
- Experience with NoSQL databases, such as MongoDB
- Good applied statistics skills, such as distributions, statistical testing, regression, etc.

Computer Staff
  • Fort Worth, TX

We have been retained by our client, located in Fort Worth, Texas (south Fort Worth area), to deliver a Risk Modeler on a regular full-time basis. We prefer SAS experience but are interviewing candidates with R, SPSS, WPS, MATLAB, or similar statistical package experience if the candidate has experience in the financial loan credit risk analysis industry. Enjoy all the resources of a big company with none of the problems that small companies have. This company has doubled in size in 3 years. We have a keen interest in finding a business-minded statistical modeling candidate with some credit risk experience to build statistical models within the marketing and direct mail areas of financial services, lending, and loans. We are seeking a candidate with statistical modeling and data analysis skills who is interested in creating better ways to solve problems in order to increase loan originations, decrease loan defaults, and more. Our client is in business to find prospective borrowers and to originate, provide, service, and process loans and collect loan payments. The team works with third-party data vendors, credit reporting agencies, and data service providers on data augmentation, address standardization, fraud detection, decision sciences, and analytics, and this position includes creation of statistical models. They support one of the largest, if not the largest, decision management profiles in the US.

We require experience with statistical analysis tools such as SAS, MATLAB, R, WPS, SPSS, or Python if used for statistical analysis. This is a statistical modeling, risk modeling, model building, decision science, data analysis, and statistical analysis role requiring SQL and/or SQL Server experience and critical thinking skills to solve problems. We prefer candidates with experience in data analysis, SQL queries, joins (left, inner, outer, right), and reporting from data warehouses with tools such as Tableau, Cognos, Looker, or Business Objects. We prefer candidates with financial and loan experience, especially knowledge of loan originations, borrower profiles and demographics, modeling loan defaults, and statistical measures such as the Gini coefficient and the Kolmogorov-Smirnov (K-S) test for credit scoring and default prediction.
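As a hedged illustration of the credit-scoring metrics mentioned above (all data and parameters below are synthetic and invented), one common approach computes the K-S statistic between the score distributions of defaulters and non-defaulters, and derives the Gini coefficient from AUC via Gini = 2*AUC - 1:

```python
# Hedged sketch (synthetic data): separation metrics used in credit scoring.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
good = rng.normal(650, 50, 5000)  # scores of non-defaulting borrowers
bad = rng.normal(580, 50, 1000)   # scores of defaulting borrowers

# K-S statistic: max distance between the two empirical CDFs
ks_stat = ks_2samp(good, bad).statistic

# AUC as P(good score > bad score), via pairwise comparison (fine at this size;
# ties have probability ~0 for continuous scores)
auc = (good[:, None] > bad[None, :]).mean()
gini = 2 * auc - 1
print(f"K-S = {ks_stat:.3f}, Gini = {gini:.3f}")
```

Higher values of both indicate better separation between defaulters and non-defaulters; the thresholds considered acceptable vary by lender and portfolio.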

Above all, critical thinking, statistical modeling, and math/statistics skills are needed to fulfill the tasks of this very interesting and important role, which includes growing your skills within a small risk/modeling team. Take on challenges in the creation and use of statistical models. There is no use for Hadoop or NoSQL databases in this position; this is not a "big data" role, and no machine learning or artificial intelligence is needed. Your role is to create and use statistical models: build models for direct mail in the financial lending space to reach the right customers with the right profiles, demographics, and credit ratings; take credit risk, credit analysis, and loan data and build a new model, validate the existing model, recalibrate it, or rebuild it completely. The models are focused on delivering answers and solutions within these areas of financial loan lending: risk analysis, credit analysis, direct marketing, direct mail, and defaults. Expect logistic regression in SAS or Knowledge Studio and some light use of Looker as the BI tool on top of SQL Server data. Deliver solutions that help the business improve in these areas and become more profitable. Seek answers to questions and solutions to problems. Create models. Dig into the data. Explore and find opportunities to improve the business. You will be expected to work within the boundaries of defaults and loan values and help drive the business with ideas to get better models in place, or to explore data sources that enable better models. Use critical thinking to solve problems.

Answer questions or solve problems such as:

What are the statistical models needed to produce the answers to solve risk analysis and credit analysis problems?

Which customer profiles have the best demographics or credit risk for loans, and which should receive direct mail marketing pieces?

Why are loan defaults increasing or decreasing? What is impacting the increase or decrease of loan defaults?  

Required Skills

Bachelor's degree in Statistics, Finance, Economics, Management Information Systems, Math, Quantitative Business Analysis, Analytics, or another related math, science, or finance field. Some loan/lending business domain work experience.

Master's degree preferred, but not required.

Critical thinking skills.

Must have SQL skills (any database: SQL Server, MS Access, Oracle, PostgreSQL) and the ability to write queries and joins (inner, left, right, outer). SQL Server is highly preferred.
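As a quick illustration of the join types named above, here is a hedged sketch using an in-memory SQLite database with hypothetical borrowers/loans tables (table names and values are invented for the example):

```python
# Hedged sketch: INNER vs LEFT joins on hypothetical lending tables.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE borrowers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE loans (id INTEGER PRIMARY KEY, borrower_id INTEGER, amount REAL);
    INSERT INTO borrowers VALUES (1, 'Ann'), (2, 'Bob'), (3, 'Cal');
    INSERT INTO loans VALUES (10, 1, 5000.0), (11, 1, 2500.0), (12, 2, 9000.0);
""")

# INNER JOIN: only borrowers with at least one loan (3 rows here)
inner = cur.execute("""
    SELECT b.name, l.amount
    FROM borrowers b INNER JOIN loans l ON l.borrower_id = b.id
""").fetchall()

# LEFT JOIN: every borrower; amount is NULL where no loan exists (4 rows)
left = cur.execute("""
    SELECT b.name, l.amount
    FROM borrowers b LEFT JOIN loans l ON l.borrower_id = b.id
""").fetchall()
print(inner)
print(left)
```

A RIGHT or FULL OUTER join follows the same pattern; note that SQLite only added native RIGHT/FULL OUTER JOIN in version 3.39, whereas SQL Server has supported them all along.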

Experience with any statistical analysis system or package, including statistical modeling experience and excellent math skills: SAS, MATLAB, R, WPS, SPSS, or Python/R if used for statistical analysis. Must have significant statistical modeling skills and experience.

Preferred Skills:
Loan credit analysis highly preferred. SAS highly preferred.
Experience with Tableau, Cognos, Business Objects, Looker, or similar data warehouse reporting and analysis tools; creating reports from data warehouse data. SQL Server SSAS, but only to pull reports. Direct marketing, direct mail marketing, and loan/lending to somewhat higher-risk borrowers.

Employment Type:   Regular Full-Time

Salary Range: $85,000 to $130,000 / year

Benefits: health, medical, dental, and vision coverage costing the employee only about $100 per month.
401k with 4% matching after 1 year, bonus structure, paid vacation, paid holidays, and paid sick days.

Relocation assistance can be provided for a very well qualified candidate. Local candidates are preferred.

Location: Fort Worth, Texas
(area south of downtown Fort Worth, Texas)

Immigration: US citizens and those authorized to work in the US are encouraged to apply. We are unable to sponsor H1B candidates at this time.

Please apply with your resume (MS Word format preferred), or apply with your LinkedIn profile, via the buttons at the bottom of this job posting page.

Please call 817-424-1411 or send a text to 817-601-7238 to inquire or to follow up on your application. We recommend you call to leave a message, or send a text with at least your name. Thank you for your attention and efforts.

SBT Industries
  • Austin, TX


We are seeking an innovative and passionate engineer who is accomplished in developing cutting-edge machine learning technologies for audio processing. At heart you are a researcher, by nature you are an innovator, and by CV you are a deliverer of multiple groundbreaking machine learning solutions to sophisticated audio problems. Your career progression has contributed to numerous advancements in your field and has created a tangible effect in your space.

You will be an integral member of a small, curated team of elite machine learning engineers delivering next-generation audio processing solutions, with an emphasis in ASR research and development. Your advancements will be applied both internally and externally, in collaboration with groups worldwide, as they develop new products by integrating our deep learning technologies into their solutions. Being in a critical and high-growth role, you would be granted a high degree of trust, resources, and ownership in developing these technologies, from speculation to development to implementation.

This impactful engineer will have the following:

    • Vision for innovation and the end-product's global impact, not the commoditization of technology
    • Ability to expand and apply our novel deep learning noise mitigation techniques to other audio areas
    • Collaborative nature to work with global teams in integrating our advancements into their solutions
    • Team-focused vision to work closely with other engineers in concepting, designing, developing, and testing new speech enhancement and signal processing algorithms for the global market
    • Ability to communicate effectively across internal and external teams, technically and non-technically
    • Comfort working in a dynamic start-up environment
    • Knowledge in defining and executing team and organization-level R&D practices in an industry environment

Required background:

    • Ph.D. in a quantitative field with specialization in machine learning or a closely related area, with preferred emphasis on speech and/or signal processing (highly qualified MS candidates can be considered)
    • Expert knowledge of one or more major programming languages, e.g. MATLAB and Python
    • Strong fundamentals in problem-solving, complexity analyses, and algorithm development
    • Familiarity and hands-on experience with deep learning techniques (to include convolutional and recurrent networks), machine learning (data analysis, preparation, and training), and audio processing
    • Experience in Bayesian statistics, nonlinear manifold learning, stochastic optimization, and/or wavelet analysis
    • Background in conducting scientific analyses, including designing and developing various types of both supervised and unsupervised models for sophisticated projects
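To make the noise-mitigation theme above concrete, here is a hedged sketch of classical spectral subtraction, a non-deep-learning baseline often used as a reference point for learned noise mitigation. All signal parameters are invented, and this is emphatically not the employer's proprietary technique:

```python
# Hedged sketch: spectral subtraction as a classical denoising baseline.
import numpy as np
from scipy.signal import stft, istft

fs = 16000
t = np.arange(fs) / fs                      # one second of audio
clean = np.sin(2 * np.pi * 440 * t)         # 440 Hz tone
rng = np.random.default_rng(1)
noise = 0.5 * rng.standard_normal(fs)
noisy = clean + noise

# Estimate the average noise magnitude spectrum (here an oracle estimate from
# the noise itself; in practice it comes from noise-only frames).
_, _, Z_noise = stft(noise, fs=fs, nperseg=512)
noise_mag = np.abs(Z_noise).mean(axis=1, keepdims=True)

# Subtract it from the noisy magnitude spectrogram, floor at zero, and
# resynthesize using the noisy phase.
f, frames, Z = stft(noisy, fs=fs, nperseg=512)
mag = np.maximum(np.abs(Z) - noise_mag, 0.0)
_, denoised = istft(mag * np.exp(1j * np.angle(Z)), fs=fs, nperseg=512)

n = min(len(denoised), len(clean))
mse_noisy = np.mean((noisy[:n] - clean[:n]) ** 2)
mse_denoised = np.mean((denoised[:n] - clean[:n]) ** 2)
print(f"MSE noisy = {mse_noisy:.4f}, MSE denoised = {mse_denoised:.4f}")
```

Deep learning approaches to this problem typically learn the mask or the clean spectrogram directly, rather than relying on a stationary noise estimate.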
SparkCognition
  • Austin, TX


SparkCognition is an AI leader that offers business-critical solutions for customers in energy, oil and gas, manufacturing, finance, aerospace, defense, and security. A highly awarded company recognized for cutting-edge technology, SparkCognition develops AI-powered, cyber-physical software for the safety, security, reliability, and optimization of IT, OT, and the Industrial IoT.

SparkCognition is looking for innovative Data Scientists to join our team to help create the next generation of analytics and artificial intelligence solutions. At SparkCognition, you will immerse yourself in cutting-edge research and work with the latest technologies to deliver value in the Industrial IoT and Defense spaces.


Responsibilities:
  • Building models to solve specific problems
  • Processing, cleansing, and verifying the integrity of data used for analysis
  • Feature engineering using various techniques for the enhancement of data
  • Performing feature selection on original and generated data
  • Using machine learning tools to develop and train models
  • Performing efficacy testing of the models
  • Building automated tools that enable data scientists to more effectively perform tasks such as data cleaning, feature generation, feature selection, or model building
  • Performing ad-hoc analysis and presenting results in a clear manner
  • Working with a team to help solve new, never-before-solved challenges across multiple industries


Requirements:
  • Must be a US Citizen
  • Understanding and experience using machine learning techniques and algorithms, including but not limited to: Linear Models, Neural Networks, Decision Trees, Bayesian Techniques, Clustering, Anomaly Detection, and more
  • Experience with data science languages, such as Python, R, MATLAB, etc.
  • Experience with machine learning frameworks, such as PyTorch, TensorFlow, Theano, Keras, etc.
  • Great communication skills
  • Good applied statistics skills, such as distributions, statistical testing, etc.
  • Good scripting and programming skills, especially Python
  • Experience managing large volumes of data (terabytes or more)
  • Graduate degree (or equivalent industry experience) in Computer Science, Statistics, Physics, Mathematics, Neuroscience, Linguistics, Electrical Engineering, Economics, or a related scientific discipline
  • Experience with distributed computing, such as Hadoop, Spark, or an MPP environment a plus
  • Experience developing applications on the full stack of HTTP, JSON, REST, React, Java/C#, SQL, and NoSQL databases a plus
  • Experience with NLP, Big Data Analytics, and Graphing techniques a plus
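The "distributions, statistical testing" bullet above can be illustrated with a minimal hedged sketch: a two-sample t-test on synthetic data (all names and numbers below are invented for the example):

```python
# Hedged sketch: a basic applied-statistics workflow -- sample from two
# distributions and test whether their means differ.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
baseline = rng.normal(10.0, 2.0, 200)   # e.g. readings from a healthy asset
degraded = rng.normal(11.0, 2.0, 200)   # readings after a suspected fault

t_stat, p_value = stats.ttest_ind(baseline, degraded)
print(f"t = {t_stat:.2f}, p = {p_value:.2e}")
```

A small p-value suggests the shift in the mean is unlikely under the null hypothesis of identical distributions, which is the kind of evidence an anomaly-detection pipeline might surface.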

Daimler AG
  • Stuttgart, Deutschland


Automated driving is one of the major challenges facing the automotive industry in the coming years. The Environment Perception department is responsible both for advancing suitable sensor technology and for developing the algorithms required to understand traffic scenes.

Radar is a very important technology in this field. To develop a mature radar solution for use in automated driving, we are looking for a software developer, Radar Understanding, to strengthen the Radar team.

Your responsibilities in detail:
Software-side integration and commissioning of sensors (Autosar).
Development and maintenance of software infrastructure and base software.
Definition and testing of sensor interfaces.
Implementation of new algorithms and migration of existing algorithms to the target architecture (ARM, embedded GPU).
Specification and review of requirements documents.
Coordination of requirements with the supplier.
Release investigations and series production releases.
Close cooperation with the team's internal and external partners.


Completed degree in computer science, computer engineering, engineering, or another STEM subject

Experience and specific knowledge:
Very good knowledge of object-oriented programming with C++ and CUDA under Linux
Experience using common development tools (Eclipse CDT, gdb, gcc, CMake)
Several years of experience programming real-time systems
Knowledge of common design patterns and of UML
Good knowledge of scripting languages (Python, Linux shell)
Knowledge of common automotive bus systems (FlexRay, CAN, BroadR-Reach)
Several years of experience developing large software projects and applying agile methods (Scrum, Atlassian tools, git workflow, issue tracking)
Experience programming electronic control units an advantage
Knowledge of ISO 26262 and MISRA C an advantage
Good knowledge of MATLAB Simulink
Practical experience using MATLAB, DOORS, MS Project, MS Office

Business-fluent German and English

Personal competencies:
Strong goal and results orientation
Strong communication skills
Entrepreneurial approach and personal initiative

Class B driving licence
Willingness to take part in test drives and occasional travel

Cloudreach
  • Philadelphia, PA

Cloudreach is looking to add a key team member to tackle unique challenges as we grow our software products, which focus on optimizing computing performance, cost, and security, as well as enabling software modernization. As a key member of this team, your primary role will be to work directly with the R&D and engineering teams as well as the product team. A successful candidate will have several years of engineering technical leadership experience with many of the specific tools listed below and enjoys the challenge of building state-of-the-art automated solutions for infrastructure management and monitoring of large-scale systems. The position requires deep knowledge of computing performance, including how to measure, model, and predict it. The candidate must be able to lead the development of performance- and cost-optimization-oriented products, which includes developing computational systems that can reliably measure, model, and predict performance. This role requires comprehensive experimentation to ensure that performance models are valid across all scenarios. Lastly, a successful candidate must be able to design intuitive yet comprehensive user interfaces that let users leverage deep knowledge of computing performance without possessing that knowledge themselves. While this is not a research position, it shares the research methodology of isolating specific questions and developing rigorous solutions to them.

That said, while a key differentiator is the ability to formulate novel problems and solutions, there is also considerable work on more straightforward performance-oriented product development.
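The measure/model/predict loop described above can be sketched in miniature: time a stand-in workload at several input sizes, then fit runtime = c * n^k on a log-log scale to estimate the scaling exponent k. Everything here (the workload, the sizes) is a hypothetical illustration, not Cloudreach code:

```python
# Hedged sketch: empirically estimating a workload's scaling exponent.
import time
import numpy as np

def workload(n):
    # O(n^2) pairwise squared-distance computation as a stand-in workload
    x = np.random.rand(n, 2)
    return ((x[:, None, :] - x[None, :, :]) ** 2).sum()

workload(100)  # warm-up so one-time overhead doesn't skew the first timing

sizes = np.array([200, 400, 800, 1600])
times = []
for n in sizes:
    start = time.perf_counter()
    workload(n)
    times.append(time.perf_counter() - start)

# Slope of log(time) vs log(n) estimates the exponent k (~2 for this workload)
k, log_c = np.polyfit(np.log(sizes), np.log(times), 1)
print(f"estimated scaling exponent: {k:.2f}")
```

A production system would of course repeat measurements, control for noisy neighbors, and validate the fitted model on held-out sizes before using it for prediction.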

What will you do at Cloudreach?

    • Hands-on development using primarily, but not limited to, Java, Python, SQL, MATLAB, Scala, TensorFlow, C/C++, Linux modules, eBPF, or whatever is needed
    • Lead and mentor a growing group of analytics engineers
    • Explore and help the team explore new methods to gain insights into performance, cost, and applications
    • Work with the product team to explore ideas that maintain market leadership
    • Work with the entire engineering team to provide overall direction as it relates to performance and application analytics

What do we look for?

    • Strong understanding of computing performance in the context of enterprise and cloud computing
    • Significant experience in developing and implementing computational algorithms
    • Experience with building reliable data acquisition and processing systems
    • Experience in several of the following
      • Comprehensive performance evaluation for computing systems such as GPU, databases, enterprise applications, operating systems, networking, virtualization, and HPC
      • Deep and highly robust system monitoring and time series analysis
      • Anomaly detection
      • Cyber-security
      • Machine learning techniques
      • Code analysis
        • static and dynamic analysis
        • code complexity analysis
      • Natural language processing
        • Corpus generation
        • Named entity detection
        • Ontology generation
        • Text classification

Since Cloudreach focuses on cloud computing, networking, operating systems, enterprise applications, code development, etc., extensive domain knowledge in these or related areas is critical. Successful applicants should be able to read research papers and implement proposed solutions; research experience is helpful, but not required. Consequently, successful applicants will likely have a Master's degree or a Ph.D.


    • Education: Ph.D. preferred, or Master's degree in Computer Science or a related area
    • Experience: 8+ years of experience in software and performance-oriented analytics development
    • Leadership: Experience leading multiple product development teams in planning, executing and delivering on enterprise software projects
    • Architecture: Strong software architecture background and experience with the ability to mentor team members
    • Experience with one or more of the IaaS providers is a must (AWS, Azure, GCE cloud etc.)
    • Track record of delivering analytics-based products from conception to deployment and customer enablement

What are our cloudy perks?

    • A MacBook Pro and smartphone.
    • Unique cloudy culture -- we work hard and play hard.
    • Uncapped holidays and your birthday off.
    • World-class training and career development opportunities through our own Cloudy University.
    • Centrally-located offices.
    • Fully stocked kitchen and team cloudy lunches, taking over a restaurant or two every Friday.
    • Office amenities like pool tables and Xbox on the big screen TV.
    • Working with a collaborative, social team, and growing your skills faster than you will anywhere else.
    • Full benefits and 401k match.
    • Kick-off events at cool locations throughout the country.
Airbus Defence and Space
  • Getafe, Spain

A. Organisation:
The mission of the DTO (Digital Transformation Office) inside Airbus Defence and Space is to actively support the business and help drive its transformation. It is articulated around the following objectives:

• Animate the community of Airbus group digital stakeholders and connect the dots in order to promote information and best practices sharing.
• Promote and facilitate the portfolio of initiatives; capture lessons learnt to accelerate, scale-up and spread most promising projects beyond the current boundaries.
• Help establish a value proposition framework for digital initiatives, including new business models approaches when appropriate.
• Sponsor and/or drive a set of proof of concepts or breakthrough projects, in close coordination with key stakeholders.
• Drive Airbus Group data strategy.
• Ensure proper access to market best practices through structured and focused benchmarking or partnerships.
• Manage the appropriate coordination and synergies between Group initiatives.
• Drive and accompany business transformation in a digital environment.

B. Accountabilities:

Lead the development and deployment of data analytics activities and expertise on digital projects encompassing processes and solutions while making sure the following objectives are met:
• Time, cost, and quality, while respecting existing Airbus project management processes.
• Adherence to requirements.
• Challenging assumptions to create the best value proposition and solution for the business.
• Efficient stakeholder management.

C. Main activities:

• Supports the Digital Transformation Office in executing data analytics proofs-of-concept and projects with Airbus Defence and Space and Airbus Group organizations.
• Performs hands-on work to build integrated data sets, analyse data sources, and build compelling visualizations.
• Develop a broad knowledge base across the Airbus Defence and Space portfolio of business areas:
o Space Systems and Satellites.
o Secure Communications.
o Military Aircraft.
o Digital Security and Intelligence.
o Unmanned Aerial Systems and Drones.

• Linked to projects and initiatives within:
o Commercial Aerospace.
o Helicopters.

• The candidate will be guided and mentored by experienced Digital Transformation Office and other Airbus staff, with a focus on developing broader knowledge of Airbus, information systems, and business challenges.
• Presents results to project stakeholders, including business experts and middle management.
• Contribute to the creation of new ways of working and new data governance.
• Share knowledge and contribute to driving change within the digital network: presentations and communications within the digital community, dissemination of best practices (patterns), etc.

D. Outputs:

• Improved business performance through understanding of data analytics and the impact on the business projects and ways of working.
• Stakeholder management.
• Communication, knowledge and best practices sharing.
• Active part of the data analytics network within Airbus.

E. Skills and Competences:

Transversal skills:

• Curiosity for data and data-driven decisions, interest in business or industrial problem solving, creativity.
• Understanding of the core business context and key drivers of the Airbus Defence and Space division, with a global view of the general environment of the company.
• Knowledge of the company products /solutions/ service portfolio characteristics.
• Understanding of Company activities, legal organisation, internal organisation…
• Excellent internal and external networking skills.
• Fluent English.
• Project management experience – deliver within an Agile framework concrete results or recommendations to an end-customer.

Specific skills:

• Ability to understand data source structure and content, e.g., fields, data quality, ranges, volume, etc.
• Experience with data integration and management – combining independent sources to create a more complete view.
• Knowledge of statistical analysis, e.g., mean, standard deviation, correlations, etc.
• Programming experience in one or more of the following languages: R, Python, Java, Matlab.
• Experience using data visualization tools: RShiny, Spotfire, Tableau, etc.
• Good communication skills – tell an effective story with the data.
• Good operational background in more than one area: for instance Engineering, Manufacturing, Procurement / Supply chain, Support functions,…
• Capability to build up business cases and understand financial reports.
• Knowledge and understanding of digital technologies and related challenges.
• Familiarity with data analysis algorithms such as SVM, K-Means, etc.
• Understanding of big data system architectures, e.g., HDFS, Spark, etc.
• Experience working with unstructured data – text, images, video, etc.
• Aerospace business experience – marketing, sales, engineering, program management, etc.

Behavioural skills:

• Team player: collaborative mind-set and ways of working.
• Learning agility, entrepreneurship and a hacking mindset, independence of mind, and innovative "out of the box" thinking.
• Personal resilience to challenge the status quo.
• Demonstrated leadership capacity, either across networks or hierarchies.
• Demonstrated ability and autonomy to conduct independent research.
• Excellent and adaptive communication with diverse audiences & skills: engineering, experts, data scientists, IT experts, business experts, exec management.
• Active listening skills.
• Sound understanding of team dynamics and the working environment.

The Wellcome Trust Sanger Institute
  • Cambridge, UK

Salary range: £36,000-£44,000 per annum depending on experience plus excellent benefits. Fixed Term Contract for 3 Years.

Open Targets has recently launched Open Targets Genetics, a portal that aggregates large-scale GWAS data with functional genomics data to identify potential drug targets at disease-associated loci.

A Statistical Geneticist role, funded by Open Targets, is available at the Wellcome Sanger Institute in a new team under the leadership of Dr. Maya Ghoussaini. This is an exciting opportunity to participate in the enhancement of the existing Open Targets Genetics Portal through the development of new functionality and features.

You will actively engage in the integration of new eQTL datasets and tissue-specific chromatin interaction datasets.

You will have the opportunity to work across a range of analyses, such as:
  • Aggregate large-scale GWAS data from multiple consortia and across a wide range of diseases and traits.
  • Perform association analysis on UK Biobank data with a particular focus on therapeutic areas important for Open Targets
  • Work together with other members of the Open Targets team on statistical genetics analysis for large scale sequence analysis
  • Work with existing members of the team to integrate genetic and cell-specific genomic data to identify and validate causal links between targets and diseases and improve the Genetics Portal.

We welcome candidates with a background in statistical genetics or a relevant discipline and advanced programming skills suitable for statistical genetic analyses of complex diseases. Experience in functional genomics data analysis is highly desirable. You will have the opportunity to interact with active computational and experimental research teams using cutting-edge genomic techniques.
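As a minimal, hedged illustration of the association testing at the heart of a GWAS (not Open Targets code; the allele counts below are invented), a single-SNP case/control comparison can be run as an allelic chi-square test:

```python
# Hedged sketch: single-SNP allelic association test with invented counts.
import numpy as np
from scipy.stats import chi2_contingency

# 2x2 table of allele counts: rows = cases/controls,
# columns = risk allele / other allele
table = np.array([[620, 1380],    # cases
                  [500, 1500]])   # controls
chi2, p_value, dof, expected = chi2_contingency(table)

# Odds ratio for carrying the risk allele
odds_ratio = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])
print(f"chi2 = {chi2:.1f}, p = {p_value:.2e}, OR = {odds_ratio:.2f}")
```

A genome-wide scan repeats a test like this (usually a regression with covariates rather than a raw chi-square) across millions of variants, with the conventional genome-wide significance threshold of p < 5e-8 to account for multiple testing.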

Essential Skills

  • PhD in Statistical Genetics, Computational Biology or a closely related discipline.
  • Advanced level programming skills suitable for statistical genetic analyses, such as R, Python, MATLAB.
  • Firm grounding in statistical methods of complex disease genetics such as genome wide association studies, fine-mapping, high-throughput expression data, whole exome/genome sequencing, PheWAS, Mendelian Randomisation.
  • Previous experience in working with large-scale genetic datasets.
  • Ability to work to tight timelines.
  • Demonstrable good project management and organisational skills.
  • Fluent in written and spoken English.
  • Ability to communicate ideas and results effectively.
  • Ability to work independently and organise own workload.

Ideal Skills

  • Experience in functional genomics data analysis (RNAseq, ChIPseq, etc);
  • Experience with generating reproducible bioinformatics pipelines;
  • A strong track record in preparing publications and other written materials;
  • Interest in target validation and translational research.

Other information

Open Targets is a pioneering public-private initiative between GlaxoSmithKline (GSK), Biogen, Takeda, Celgene, Sanofi, EMBL-EBI (European Bioinformatics Institute) and the WSI (Wellcome Sanger Institute), located on the Wellcome Genome Campus in Hinxton, near Cambridge, UK.

Open Targets aims to generate evidence on the biological validity of therapeutic targets and provide an initial assessment of the likely effectiveness of pharmacological intervention on these targets, using genome-scale experiments and analysis. Open Targets aims to provide an R&D framework that applies to all aspects of human disease, to improve the success rate for discovering new medicines and share its data openly in the interests of accelerating drug discovery.

Genome Research Limited is an Equal Opportunity employer. As part of our dedication to gender equality and promoting women's careers in science, we hold an Athena SWAN Bronze Award. We will consider all qualified applicants without discrimination on grounds of disability, sexual orientation, pregnancy or maternity leave status, race or national or ethnic origin, age, religion or belief, gender identity or re-assignment, marital or civil partnership status, protected veteran status (if applicable) or any other characteristic protected by law.

Please include a covering letter and CV with your application

Closing date: 28th February; however, applications will be reviewed on an ongoing basis and therefore the post may be filled before the deadline.
Nielsen
  • San Francisco, CA
Explore and discover Nielsen! With offices located in 110 countries, we are a global independent measurement and big data analytics company focused on your future. Nielsen Portfolio is one of three core Nielsen businesses. The newly established Portfolio division is defined as a high-growth, innovation-based business that supports the needs of our clients. Portfolio includes Gracenote, Nielsen Sports/Esports/Games, Nielsen Music, Brandbank, SuperData, and Nielsen Book, making Nielsen Portfolio the largest supplier of metadata, data, measurement, and insights to the global entertainment industry. Gracenote, a Nielsen company, is the leading provider of entertainment metadata and media recognition technology that powers discovery features for top TV, music, sports, and automotive platforms.

We are presently looking for a Data Scientist for our Connectivity (Telecom) team who will be responsible for supporting custom analysis using data from existing Telecom products and assisting in the development of novel client solutions, whether as new products or by combining data across Nielsen Telecom platforms. This role can be based in San Francisco or Emeryville.
Job Responsibilities:
  • Data science product support (includes new feature development, methodology, client inquiry response)
  • Work across different products to help develop integrated vertical solutions for Connectivity
  • Work with the product leadership team to ensure compatibility of developing solutions with commercial need
  • Work with the Technology team to transform data streams into business insights
  • Coordinate with the Operations team to transition completed solutions to production
  • Assist cross-functional teams in design, implementation, and testing of integrated product solutions
  • Provide research support for the identification and implementation of methods and best practices to improve product quality
  • Work across products when advanced/custom analysis is required for specific business opportunities

Requirements:
  • Experience or thorough training in Python
  • Strong SQL skills and relevant experience
  • Ability to manipulate/munge data, analyze, and interpret results of data analysis
  • Experience in big data technologies and AI / machine learning
  • B.S. or Master's degree in Data Science, Statistics, Social Science, Operations Research, or other hard sciences (e.g., Engineering, Computer Science, Biology, Physics, etc.) with analytical experience
  • Experience in trend analyses, multivariate statistics, bias reduction, weighting
  • Data visualization experience (e.g., Tableau, Spotfire, MicroStrategy, Domo, etc.)
  • One to three years of relevant experience preferred
  • Strong written and verbal communication skills
  • Proficiency in SAS, SPSS, R, MATLAB, or other statistical packages preferred

Gracenote, a Nielsen company, is committed to hiring and retaining a diverse workforce.
We are proud to be an Equal Opportunity/Affirmative Action employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status, or any other protected class.
Autodesk
  • San Francisco, CA
Autodesk is seeking a Data Scientist to join our Advanced Analytics organization. The Advanced Analytics organization is chartered with building innovative data products and analytics solutions for Autodesk's strategy, product, marketing, sales, and customer support teams. This critical data scientist role will help us make machine intelligence an integral part of how Autodesk makes decisions and designs and builds its products. You will work alongside a product development team and apply data mining, analytics, and machine learning methods to understanding how customers use, adopt, and achieve successful outcomes with our products. The ideal candidate is a strong data scientist who thinks outside the box, is naturally curious, highly collaborative, and has a passion for tackling complex data-centric problems.

Responsibilities:
  • Work on a variety of problems that seek to better understand how customers use their products and what drives deeper adoption of products
  • Apply your quantitative analysis, data mining, and machine learning expertise to building models that make sense of user needs, usage patterns, and factors that drive deeper adoption or contribute to subscriber churn
  • Influence product development, strategy, and roadmap prioritization
  • Design and implement machine learning pipelines that improve Autodesk's evidence-based decision-making capabilities
  • Tackle complex problems requiring a creative mindset to find innovative and elegant solutions

Minimum Qualifications:
  • BS/MS in Mathematics, Statistics/Analytics, Computer Science, or another relevant field
  • 2+ years of applicable work experience
  • Experience working with relational SQL and NoSQL databases
  • Experience working with big data platforms (Hadoop, Spark, Hive)
  • Fluency with one or more scripting languages: Python, Java, Scala, etc.
  • Good understanding of CS fundamentals, e.g. algorithms and data structures
  • Experience in statistical programming tools such as R, MATLAB, SAS, etc.
  • Experience with data science toolkits: pandas, NumPy, scikit-learn, TensorFlow, etc.
  • Familiarity with statistics concepts and analysis, e.g. hypothesis testing, regression, etc.
  • Familiarity with machine learning techniques, e.g. classification, clustering, regularization, optimization, dimension reduction, etc.
  • Good communication skills and ability to explain complex topics to a non-technical audience

About Autodesk: With Autodesk software, you have the power to Make Anything. The future of making is here, bringing with it radical changes in the way things are designed, made, and used. It's disrupting every industry: architecture, engineering, and construction; manufacturing; and media and entertainment. With the right knowledge and tools, this disruption is your opportunity. Our software is used by everyone, from design professionals, engineers, and architects to digital scientists, students, and hobbyists. We constantly explore new ways to integrate all dimensions of diversity across our employees, customers, partners, and communities. Our ultimate goal is to expand opportunities for anyone to imagine, design, and make a better world.

Location: Americas-United States of America-California-San Francisco
  • New Delhi, India


Are you passionate about applying AI and machine learning to solve some of the world’s biggest challenges? So are we! Esri is the world leader in geographic information systems (GIS) and developer of ArcGIS, the leading mapping and analytics software used in 75 percent of Fortune 500 companies. At the Esri R&D Center-New Delhi, we are applying cutting-edge deep learning techniques to revolutionize geospatial analysis and derive insight from imagery and location data. 
Join our team of exceptional data scientists and software engineers to deliver a spatial data science platform, develop industry-leading AI models for satellite imagery, and build world-class Geo AI solutions.
In this role, you will apply geospatial analysis, data science, and machine learning to spatial problems. You will author Jupyter Notebooks that showcase ArcGIS AI capabilities and advocate patterns for solving such problems, and you will gain valuable experience testing and developing deep learning models, APIs, and Geo AI solutions.


Responsibilities:

  • Author and maintain Jupyter Notebook-based geospatial data science samples using ArcGIS and machine learning/deep learning libraries such as TensorFlow and PyTorch

  • Execute API tests using the continuous integration pipeline and test automation framework

  • Enhance test automation framework to test and compare AI models using appropriate evaluation metrics

  • Author, maintain, and update unit tests and test assets and help maintain high QA standards

  • Certify AI models and APIs ahead of product release cycles
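The responsibilities above include comparing AI models using appropriate evaluation metrics before certification. As a hedged illustration only (the model names, masks, and the choice of intersection-over-union are hypothetical, not Esri's actual framework), a minimal pure-Python comparison of two candidate segmentation models might look like:

```python
# Hypothetical sketch: pick the better of two segmentation models by IoU,
# the kind of metric a test-automation framework might use before release.
# Masks are flat 0/1 lists standing in for rasterized predictions.

def iou(pred, truth):
    """Intersection-over-union for binary masks given as flat 0/1 lists."""
    inter = sum(1 for p, t in zip(pred, truth) if p == 1 and t == 1)
    union = sum(1 for p, t in zip(pred, truth) if p == 1 or t == 1)
    return inter / union if union else 1.0  # two empty masks agree perfectly

truth   = [1, 1, 0, 0, 1, 0, 1, 1]
model_a = [1, 0, 0, 0, 1, 0, 1, 1]  # misses one positive pixel
model_b = [1, 1, 1, 0, 1, 0, 1, 0]  # one false positive, one miss

scores = {"model_a": iou(model_a, truth), "model_b": iou(model_b, truth)}
best = max(scores, key=scores.get)
print(scores, "->", best)  # model_a wins: 4/5 vs 4/6
```

In practice the same comparison would run inside the CI pipeline against held-out imagery rather than hand-written masks.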


Requirements:

  • 1+ years of experience with Python programming

  • Coursework and knowledge of machine learning and deep learning

  • Knowledge of GIS tools, libraries, and capabilities

  • Experience with Python machine learning and deep learning libraries such as scikit-learn, pandas, PyTorch/fastai, or TensorFlow/Keras

  • Experience with continuous integration and QA testing

  • Strong verbal and written communication skills

  • Bachelor's or master's in sciences, engineering, GIS, or related disciplines 


Preferred Qualifications:

  • Familiarity with the ArcGIS suite of products and concepts of GIS, including working with the ArcGIS API for Python

  • Experience with technical writing and teaching

  • Knowledge of scientific computing tools and libraries such as SciPy, NumPy, R, Matlab, etc.

  • Boston, MA
Company Overview:
REsurety is a mission-driven organization solving the challenge of resource intermittency for renewable energy.  We work in partnership with the world’s leading energy and risk management providers to enable renewable energy consumers and producers to manage the fuel risk of the future: the weather.  As a high-growth, profitable company that has already supported over 5,000 MW of clean energy transactions, we are a small team making a big impact! Our culture is open and collaborative.  We expect excellence from our team members and reward it with high ownership and flexibility.  If you’re a high-achiever with a passion for clean energy, then we look forward to receiving your application.

Position Overview:
As a Quality Assurance Engineer, you will support and maintain the verification and validation processes for the REsurety software suite to ensure unwavering accuracy of our results, which accelerate the development of renewable energy worldwide.

Key Responsibilities

  • Test new software features for quality and accuracy before they enter production, identify root causes of issues, and stress-test for unusual conditions

  • Validate input and output datasets across the spectrum of power, weather and risk metrics

  • Develop, maintain and monitor automated unit tests, regression tests, evaluation metrics and holistic verification frameworks, taking action when issues arise

  • Investigate live quality issues in active customer quoting and settlement operations

  • Advocate for quality needs throughout the software lifecycle, from requirements gathering through development

  • Identify sources of technical debt, reducing and mitigating its accumulation
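The responsibilities above center on automated regression tests that flag when numeric results drift. As a hedged sketch (the metric names, values, and tolerance are illustrative, not REsurety's actual verification framework, and Python stands in for the R tooling the posting prefers), such a check might compare fresh output against a stored baseline within a relative tolerance:

```python
# Hypothetical sketch of a numeric regression check: compare a fresh run's
# metrics against a stored baseline and report anything that drifted beyond
# a relative tolerance. All names and numbers are illustrative.
import math

BASELINE = {"p50_settlement": 41.72, "p95_settlement": 58.10}  # stored run
TOLERANCE = 1e-3  # relative tolerance for acceptable numeric drift

def check_regression(current, baseline=BASELINE, rel_tol=TOLERANCE):
    """Return the metric names whose values drifted beyond tolerance."""
    return [k for k, v in baseline.items()
            if not math.isclose(current.get(k, float("nan")), v, rel_tol=rel_tol)]

fresh = {"p50_settlement": 41.7201, "p95_settlement": 58.35}  # p95 drifted
failures = check_regression(fresh)
print("drifted metrics:", failures)
```

A missing metric fails too, because `math.isclose` against the NaN default is always false; that catches dropped outputs as well as drifted ones.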

Required Qualifications:

  • A bachelor’s or master’s degree in computer science or engineering, or a related field in the sciences

  • Professional experience with software quality processes and DevOps, including testing frameworks

  • Experience in scientific computing and statistical analysis, with R (preferred), Matlab, Python, or similar

Preferred Qualifications:

  • Proven results with R package development, S3/RC object-oriented programming, Roxygen documentation, testthat unit testing, and data.table frameworks

  • Experience with agile development methodologies and operational tools such as bug tracking, source control and continuous integration

  • Experience with SQL database systems, particularly in monitoring and addressing data quality 

  • Exposure to applications using time series data

  • Attention to detail and a devious mind capable of surfacing deep bugs
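The preferred qualifications above mention monitoring data quality in SQL databases holding time-series data. As a hedged sketch (table and column names are invented, and an in-memory SQLite database stands in for a production store), a gap check on an hourly feed might look like:

```python
# Hypothetical sketch: detect missing hourly readings in a SQL time-series
# table, a simple data-quality monitor. Schema and values are illustrative.
import sqlite3
from datetime import datetime, timedelta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE wind_output (ts TEXT PRIMARY KEY, mwh REAL)")

start = datetime(2023, 1, 1)
for h in range(6):
    if h == 3:  # simulate a dropped hour in the upstream feed
        continue
    conn.execute("INSERT INTO wind_output VALUES (?, ?)",
                 ((start + timedelta(hours=h)).isoformat(), 12.5 + h))
conn.commit()

# Any expected timestamp absent from the table is a gap to investigate.
expected = {(start + timedelta(hours=h)).isoformat() for h in range(6)}
actual = {row[0] for row in conn.execute("SELECT ts FROM wind_output")}
gaps = sorted(expected - actual)
print("missing hours:", gaps)
```

The same set-difference pattern scales to monitoring jobs that alert when a live feed falls behind its expected cadence.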


  • Location: Boston, MA


Benefits:

  • Unlimited Vacation Policy

  • Medical Insurance

  • Dental Insurance

  • Health Savings Account (HSA)

  • 401(k)

  • Gym Membership Reimbursement

  • Blue Bikes Gold Membership

REsurety, Inc.  is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, gender identity, sexual orientation or any other characteristic protected by law.