OnlyDataJobs.com

DISYS
  • Minneapolis, MN

Client: Large Bank/Financial Services

Work Location: 100% Remote/Telecommute (Preferred Location: Minneapolis, MN) 

Duration: 12 month contract, extendable to 24 months, and eligible for conversion to FTE.

Position Title: People Analytics Consultant

Description:

  • Role requires project management and research experience.
  • Intermediate to Expert level of experience with R strongly desired.
  • Must be extremely comfortable with the following:
      • Interacting with dashboards
      • Working with large datasets of structured and unstructured data
      • Data reporting
      • Using Tableau
      • Understanding reporting requirements and structural hierarchies for reporting in a collaborative environment
      • Working independently
      • Working on multiple projects
  • Experience with graphics and visuals for storytelling with data/data reporting for executives and senior leaders is a huge plus

Key things that I am looking for:

  • Comfort with data, both structured and unstructured
  • Survey design experience
  • Hands-on experience deploying and managing survey/measurement-related projects; people/employee data analysis and modeling experience; people analytics experience is a plus
  • Data visualization /graphic design for data reporting/storytelling experience is a HUGE plus
  • Comfort/experience with data analysis tools/programming languages such as SPSS, R, and/or Python
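As a rough sketch of the survey/people-data analysis the posting describes, here is a minimal Python example using only the standard library; the field names, teams, and scores are invented for illustration:

```python
# Hypothetical example: summarizing Likert-scale survey responses by group.
# The "team" and "score" fields are illustrative, not from the posting.
from collections import defaultdict
from statistics import mean, stdev

responses = [
    {"team": "Ops", "score": 4}, {"team": "Ops", "score": 5},
    {"team": "Ops", "score": 3}, {"team": "Eng", "score": 2},
    {"team": "Eng", "score": 4}, {"team": "Eng", "score": 3},
]

def summarize(rows):
    """Group scores by team and report count, mean, and standard deviation."""
    by_team = defaultdict(list)
    for row in rows:
        by_team[row["team"]].append(row["score"])
    return {
        team: {"n": len(scores), "mean": mean(scores), "sd": stdev(scores)}
        for team, scores in by_team.items()
    }

summary = summarize(responses)
```

In practice the same grouping would typically be done in R or SPSS as the posting suggests; this only illustrates the shape of the task.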


Desired Qualifications:

  • 1+ years of experience with R (Programming Language) and RStudio
  • 6+ years of experience in one or a combination of the following: reporting, analytics, or modeling; or a Master's degree or higher in a quantitative field such as applied math, statistics, engineering, physics, accounting, finance, economics, econometrics, computer science, or business/social and behavioral sciences with a quantitative emphasis, and 4+ years of experience in one or a combination of the following: reporting, analytics, or modeling
  • 2+ years of experience with Tableau; must be comfortable building out dashboards, creating visualizations, and working with complex structured and unstructured data in Tableau
  • 2+ years creating dashboards and reporting data in a clear and structured way using tools/code such as R, SQL, Python

R1 RCM
  • Salt Lake City, UT

Healthcare is at an inflection point. Businesses are quickly evolving, and new technologies are reshaping the healthcare experience. We are R1 - a revenue cycle management company that is passionate about simplifying the patient experience, removing the paperwork hassle and demystifying financial obligations. Our success enables our healthcare clients to focus on what matters most - providing excellent clinical care.

Great people make great companies and we are looking for a great Lead Software Engineer to join our team in Murray, UT. Our approach to building software is disciplined and quality-focused with an emphasis on creativity, craftsmanship and commitment. We are looking for smart, quality-minded individuals who want to be a part of a high-functioning, dynamic team. We believe in treating people fairly and your compensation should reflect that. Bring your passion for software engineering and help us disrupt ourselves as we build the next generation of healthcare revenue cycle management products and platforms. Now is the right time to join R1!

We are seeking a highly experienced Lead Platform Engineer to join our team. The lead platform engineer will be responsible for building and maintaining a real-time, scalable, and resilient platform for product teams and developers. This role will perform and supervise the design, development and implementation of platform services, tools and frameworks. You will work with software architects, software engineers, quality engineers, and other team members to design and build platform services. You will also provide technical mentorship to software engineers/developers and related groups.


Responsibilities:


  • Design and develop software solutions with an engineering mindset
  • Ensure SOLID principles and standard design patterns are applied across the organization to system architectures and implementations
  • Act as a technical subject matter expert: help fellow engineers, demonstrate technical expertise and engage in solving problems
  • Collaborate with stakeholders to help set and document technical standards
  • Evaluate, understand and recommend new technologies, languages or development practices that offer benefits for implementation
  • Participate in and/or lead technical development design sessions to formulate technical designs that minimize maintenance, maximize code reuse and minimize testing time

Required Qualifications:


  • 8+ years of experience building scalable, highly available, distributed solutions and services
  • 4+ years of experience with middleware technologies: Enterprise Service Bus (ESB), Message Queuing (MQ), Routing, Service Orchestration, Integration, Security, API Management, Gateways
  • Significant experience with RESTful API architectures, specifications and implementations
  • Working knowledge of progressive development processes like Scrum, XP, Kanban, TDD, BDD and continuous delivery using Jenkins
  • Significant experience working with most of the following technologies/languages: Java, C#, SQL, Python, Ruby, PowerShell, .NET/Core, WebAPI, Web Sockets, Swagger, JSON, REST, Git
  • Hands-on experience with microservices architecture, Kubernetes and Docker
  • Familiarity with the middleware platform Software AG webMethods is a plus
  • A conceptual understanding of cloud platforms, big data and machine learning is a major plus
  • Knowledge of the healthcare revenue cycle, EMRs, practice management systems, FHIR, HL7 and HIPAA is a major plus
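To make one of the middleware concepts above concrete, here is a minimal in-process sketch of topic-based message routing in Python. The topics and payloads are invented, and a real ESB/MQ deployment would use a message broker rather than in-memory queues; this only illustrates the routing idea:

```python
# Illustrative sketch of topic-based message routing (a publish/subscribe
# pattern). Topic names and message contents are hypothetical.
import queue

class Router:
    """Route published messages to every queue subscribed to the topic."""
    def __init__(self):
        self.subscribers = {}  # topic -> list of subscriber queues

    def subscribe(self, topic):
        q = queue.Queue()
        self.subscribers.setdefault(topic, []).append(q)
        return q

    def publish(self, topic, message):
        for q in self.subscribers.get(topic, []):
            q.put(message)

router = Router()
claims = router.subscribe("claims.created")
router.publish("claims.created", {"claim_id": 1})
router.publish("claims.denied", {"claim_id": 2})  # no subscriber; dropped
received = claims.get_nowait()
```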


Desired Qualifications:


  • Strong sense of ownership and accountability for delivering well designed, high quality enterprise software on schedule
  • Prolific learner, willing to refactor your understanding of emerging patterns, practices and processes as much as you refactor your code
  • Ability to articulate and illustrate software complexities to others (both technical and non-technical audiences)
  • Friendly attitude and available to mentor others, communicating what you know in an encouraging and humble way
  • Continuous Learner


Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions.  Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration and the freedom to explore professional interests.


Our associates are given valuable opportunities to contribute, to innovate and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package.  To learn more, visit: r1rcm.com

R1 RCM
  • Salt Lake City, UT

Healthcare is at an inflection point. Businesses are quickly evolving, and new technologies are reshaping the healthcare experience. We are R1 - a revenue cycle management company that is passionate about simplifying the patient experience, removing the paperwork hassle and demystifying financial obligations. Our success enables our healthcare clients to focus on what matters most - providing excellent clinical care.


Great people make great companies and we are looking for a great Application Architect to join our team in Murray, UT. Our approach to building software is disciplined and quality-focused with an emphasis on creativity, craftsmanship and commitment. We are looking for smart, quality-minded individuals who want to be a part of a high-functioning, dynamic team. We believe in treating people fairly and your compensation should reflect that. Bring your passion for software engineering and help us disrupt ourselves as we build the next generation of healthcare revenue cycle management products and platforms. Now is the right time to join R1!


R1 is a leading provider of technology-enabled revenue cycle management services which transform and solve challenges across health systems, hospitals and physician practices. Headquartered in Chicago, R1 is a publicly traded organization with employees throughout the US and in international locations.


Our mission is to be the one trusted partner to manage revenue, so providers and patients can focus on what matters most. Our priority is to always do what is best for our clients, patients and each other. With our proven and scalable operating model, we complement a healthcare organization's infrastructure, quickly driving sustainable improvements to net patient revenue and cash flows while reducing operating costs and enhancing the patient experience.


As an Application Architect you apply your problem solving, critical thinking and creative design to architect and build software products that achieve technical, business and customer experience goals.


Responsibilities:


  • Plans software architecture through the whole technology stack from customer facing features, to algorithmic innovation, down through APIs and datasets.
  • Ensures that software patterns and SOLID principles are applied across the organization to system architectures and implementations.
  • Works with product management, business stakeholders and architecture leadership to understand software requirements and helps shape, estimate and plan product roadmaps.
  • Plans and implements proof of concept prototypes.
  • Directly contributes to the test-driven development of product features and functionality, identifying risks and authoring integration tests.
  • Manages and organizes build steps, continuous integration systems and staging environments.
  • Mentors other members of the development team.
  • Evaluates, understands and recommends new technologies, languages or development practices that offer benefits for implementation.


Required Qualifications:


  • 8+ years experience programming enterprise web products with Visual Studio, C# and the .NET Framework.
  • Robust knowledge in software architecture principles including message and service buses, object-oriented programming, continuous integration / continuous delivery, SOLID principles, SaaS, microservices, master data management (MDM) and a deep understanding of design patterns and domain-driven design (DDD).
  • Significant experience working with most of the following technologies/languages: C#, .NET/Core, WCF, Entity Framework, UML, LINQ, JavaScript, Angular, Vue.js, HTML, CSS, Lucene, REST, WebApi, XML, TSQL, NoSQL, MS SQL Server, ElasticSearch, MongoDB, Node.js, Jenkins, Docker, Kubernetes, NUnit, NuGet, SpecFlow, Git.
  • Working knowledge of progressive development processes like Scrum, XP, Kanban, TDD, BDD and continuous delivery.
  • Strong sense of ownership and accountability for delivering well designed, high quality enterprise software on schedule.
  • Prolific learner, willing to refactor your understanding of emerging patterns, practices and processes as much as you refactor your code.
  • Ability to articulate and illustrate software complexities to others (both technical and non-technical audiences).
  • Friendly attitude and available to mentor others, communicating what you know in an encouraging and humble way.
    • Experience working with globally distributed teams.
    • Knowledge of the healthcare revenue cycle, EMRs, practice management systems, FHIR, HL7 or HIPAA is a major plus.


Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions.  Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration and the freedom to explore professional interests.


Our associates are given valuable opportunities to contribute, to innovate and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package.  To learn more, visit: r1rcm.com

FPC of Three Rivers PA
  • Dallas, TX

Summary

FPC of Three Rivers is happy to present a Senior Data Scientist role in the healthcare industry. You will be part of a team working to build next-gen data analytics / Machine Learning empowered products.

Day to Day

  • As part of a team of data scientists, software developers, and architects; engage business users to understand and solve complex problems using data analytics and insights.
  • Prototype / develop new Artificial Intelligence / Machine Learning based products, as well as enhancements/upgrades to existing products all of which provide business value.
  • Maintain a passion for innovation, specifically at the nexus of abstract mathematics, analytics, and computer science.
  • Be a multiplier, increasing group expertise through exploration of emerging domains, toolsets, and methodologies.

Required Qualifications

  • Master's degree in statistics, analytics, mathematics, machine learning or a related field.
  • 4+ years of experience using Machine Learning and/or Artificial Intelligence to solve tangible business problems within an established organization.
  • Demonstrated success in translating unstructured business problems into analytical models.
  • Experience with a subset of the following tools / libraries: Python, R, Jupyter, Zeppelin, SparkML, H20, and/or TensorFlow.
  • Experience with cloud computing environments.


Preferred Qualifications

  • Experience with healthcare data analytics.
  • Experience with healthcare industry processes, trends, and standards.

WorldLink US
  • Dallas, TX

Business Analyst

Dallas, TX

Full time, direct hire position

Seeking a bright, motivated individual with a unique, wide range of skills and the ability to process large data sets while communicating findings clearly and concisely.

Responsibilities

  • Analyze data from a myriad of sources and generate valuable insights
  • Interface with our sales team and clients to discuss issues related to data availability and customer targeting
  • Execute marketing list processing for mail, email and integrated multi-channel campaigns
  • Assist in development of tools to optimize and automate internal systems and processes
  • Assist in conceptualization and maintenance of business intelligence tools

Requirements

  • Bachelor's degree in math, economics, statistics or a related quantitative field
  • An ability to deal and thrive with imperfect, mixed, varied and inconsistent data from multiple sources
  • Must possess a rigorous, disciplined analytical approach, as well as dynamic, abstract problem-solving skills (get to the answer via both inspiration and perspiration)
  • Proven ability to work in a fast-paced environment and to meet changing deadlines / priorities on multiple simultaneous projects
  • Extensive experience writing queries for large, complex data sets in SQL (MySQL, PostgreSQL, Oracle, other SQL/RDBMS)
  • Highly proficient with Excel (or an alternate spreadsheet application like OpenOffice Calc) including macros, pivot tables, vlookups, charts and graphs
  • Solid knowledge of statistics and the ability to perform analysis proficiently in R, SAS or SPSS
  • Strong interpersonal skills as a team leader and team player
  • Self-learning attitude, constantly pushing towards new opportunities, approaches, ideas and perspectives
  • Bonus points for experience with high-level, dynamically typed programming languages: Python, Ruby, Perl, Lisp or PHP
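As a hedged sketch of the SQL query work described above, here is a small aggregation run against an in-memory SQLite database from Python; the campaign table and its columns are invented for illustration:

```python
# Hypothetical example: response rate per marketing channel, the kind of
# insight the role generates. Table and column names are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE campaign_sends (channel TEXT, responded INTEGER);
    INSERT INTO campaign_sends VALUES
        ('email', 1), ('email', 0), ('email', 0),
        ('mail',  1), ('mail',  1);
""")

rows = conn.execute("""
    SELECT channel,
           COUNT(*)       AS sends,
           AVG(responded) AS response_rate
    FROM campaign_sends
    GROUP BY channel
    ORDER BY channel
""").fetchall()
```

The same GROUP BY aggregation carries over directly to MySQL, PostgreSQL, or Oracle as listed in the requirements.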

  **No VISA Sponsorship available

Sogeti
  • Austin, TX

Sogeti is a leading provider of technology and engineering services. Sogeti delivers solutions that enable digital transformation and offers cutting-edge expertise in Cloud, Cybersecurity, Digital Manufacturing, Quality Assurance & Testing, and emerging technologies. Sogeti combines agility and speed of implementation with strong technology supplier partnerships, world class methodologies and its global delivery model, Rightshore®. Sogeti brings together more than 25,000 professionals in 15 countries, based in over 100 locations in Europe, USA and India. Sogeti is a wholly-owned subsidiary of Capgemini SE, listed on the Paris Stock Exchange. For more information please visit www.sogeti.com.


REQUIRED:

- 3-5 years of solid experience with developing back-end services with Python

- Development experience on a Unix-like system
- Experience with developing highly scalable RESTful APIs
- Expert in asynchronous programming

- Expert understanding of Python Core concepts, including both functional and class-based design, and new language features up to Python 3.5.
- Knowledge of Python ORM libraries 
- Strong understanding of relational database concepts and SQL writing skills, proven experience working with major relational database platforms such as SQL Server, Oracle, and PostgreSQL
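A minimal sketch of the asynchronous programming the role emphasizes, using Python's asyncio; the "service calls" here are simulated with sleeps rather than real back-end requests:

```python
# Illustrative asyncio example: run two simulated I/O-bound calls
# concurrently, so total time is roughly that of the slowest call.
import asyncio

async def fetch(name, delay):
    """Stand-in for an async I/O call such as a database or HTTP request."""
    await asyncio.sleep(delay)
    return name

async def main():
    # gather() schedules both coroutines concurrently and preserves order.
    return await asyncio.gather(fetch("users", 0.01), fetch("orders", 0.02))

results = asyncio.run(main())
```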


NICE TO HAVE:

- Experience with socket programming

- Professional experience using Python 3+ 

Glocomms
  • Dallas, TX

Data Scientist
Dallas, TX
Investment Bank
Compensation: $160,000-$190,000

(Unlimited PTO, Remote Work Options, Daily Catered Lunches)

Do you enjoy solving challenging puzzles? Protecting critical networks from cyber-attacks? Our Engineers are innovators and problem-solvers, building solutions in risk management, big data, mobile and more. We look for creative collaborators who evolve, adapt to change and thrive in a fast-paced global environment. We are leading threat, risk analysis and data science initiatives that are helping to protect the firm and our clients from information and cyber security risks. Our team equips the firm with the knowledge and tools to measure risk, identify and mitigate threats and protect against unauthorized disclosure of confidential information for our clients, internal business functions, and our extended supply chain.

You will be responsible for:

  • Designing and integrating state-of-the-art technical solutions as a Security Analyst in our Threat Management Center
  • Applying statistical methodology, machine learning, and Big Data analytics for network modelling, anomaly detection, forensics, and risk management
  • Creating innovative methodologies for extracting key parameters from big data coming from various sensors
  • Utilizing machine learning, statistical data analytics, and predictive analytics to help implement analytics tied to cyber security and hunting methodologies and applications
  • Designing, developing, testing, and delivering complex analytics in a range of programming environments on large data sets

Minimum Qualifications:
  • 6+ years of industry experience focused on Data Science
  • Hands-on experience in Cyber Security Analytics
  • Proficient coding capability in at least one major programming language such as Python, R, or Java
  • Proficient in data analysis, model building and data mining
  • Solid foundation in natural language processing and machine learning
  • Significant experience with deep learning
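As an illustrative sketch (not the firm's actual method) of one simple anomaly-detection idea relevant to this role, here is a z-score filter over event counts using only the Python standard library; the data and threshold are invented:

```python
# Flag hourly event counts that sit far from the historical mean.
# Real network anomaly detection would use far richer features and models;
# the threshold of 2.0 here is purely illustrative (a single large outlier
# also inflates the standard deviation, which robust methods would avoid).
from statistics import mean, stdev

def anomalies(counts, threshold=2.0):
    """Return the indices whose absolute z-score exceeds the threshold."""
    mu, sigma = mean(counts), stdev(counts)
    return [i for i, c in enumerate(counts) if abs(c - mu) / sigma > threshold]

logins_per_hour = [52, 48, 50, 49, 51, 47, 50, 300]  # 300 is the outlier
flagged = anomalies(logins_per_hour)
```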

Preferred Qualifications
  • Strong interpersonal and communication skills
  • Ability to communicate complex quantitative analysis in a clear, precise, and actionable manner
  • Experience with knowledge management best practices
  • A penchant for learning about new technologies, and creativity in applying them to business problems

BlueGranite
  • Houston, TX

BlueGranite is looking for a Data & Analytics Staff Consultant to join our team of data architects, engineers, analysts, and data scientists. 


We're looking for individuals with full-time work experience in a professional, project-based setting. Candidates should have experience with data and analytics solutions and workloads, which may include processing and combining data from disparate sources, collecting data from internal systems and preparing it for analysis, or building data visualizations to make the analytical results available to business users and decision makers.


Our consultants work side-by-side with other team members on client projects involving modern platforms and technologies such as SQL Server, Azure data services, Spark, Python, and Power BI.


This full-time position is for individuals who enjoy a hands-on, development role on client-facing project teams with up to 25% travel on an annual basis, plus the potential for regular travel to our Kalamazoo, Michigan area office for the first couple of months for training and onboarding (as needed).


At a minimum, we're looking for:


  • Experience processing and combining data from disparate sources such as relational databases, spreadsheets, delimited files, flat files or various user created documents
  • Strong skills and experience with Microsoft SQL Server, Azure, and/or Power BI for data warehousing, data lakes, reporting, and analytics solutions
  • Strong interest in building solutions for analytics and data manipulation with languages and tools including Spark, Python, R, SQL, or Hadoop
  • Knowledge of database languages, concepts, and environments
  • Passion for both business and technology
  • Curiosity about data and the ability to translate data-driven insights into decisions and actions
  • Self-starter, self-managed, quick learner, problem-solver with a positive, collaborative, and team-based attitude
  • Excellent communication skills, ability to engage with other project team members and clients
  • Degree in information technology, computer science, business, statistics, mathematics, or engineering
  • Ability to travel an average of 25% per year, plus the potential for regular travel to our Kalamazoo, Michigan area office for the first couple of months
  • Authorization to work in the U.S., without the requirement for an employment visa


Although not required of applicants, these are sought-after abilities:


  • Demonstration of thought leadership, community volunteering, recognition or awards
  • Professional work experience with SQL, Hadoop, Python, Spark, or R for data and analytics
  • Experience with the development of data warehouse and business intelligence solutions
  • Ability to implement on-premises or cloud-based data ingestion (ETL vs ELT) solutions
  • Aptitude and experience with SQL queries for analytics and data manipulation


Your primary responsibilities would include:


  • Developing and supporting data and analytics solutions using Microsoft SQL, Power BI, and Azure as part of BlueGranite's project delivery teams
  • Creating informative and interactive data visualizations/models
  • Writing relational and multidimensional database queries, flows and movements
  • Obtaining and maintaining technical certifications in these core areas of expertise


BlueGranite is the driving force of data innovation for our customers. We are strategic trusted advisers who use our aptitude and experience to deliver value to our clients. We invest in our team to become better at who we are and what we do. Our core strategy is to lead with offers for data, insights, and analytics that deliver business value on the Microsoft platform.

This position is for U.S. based applicants located in the following states: Michigan, Indiana, Illinois, Ohio, Minnesota, Texas, Florida, Georgia, North Carolina, South Carolina, New York, Massachusetts, or New Hampshire. We are not able to provide sponsorship for student or other employment work visas at this time.


The above statements are intended to describe the general nature and level of work being performed by individuals assigned to this classification. They are not to be construed as an exhaustive list of all responsibilities, duties and skills required of personnel so classified. All personnel may be required to perform duties outside of their normal responsibilities from time to time, as needed.

Turnberry Solutions
  • Philadelphia, PA

TITLE:  Data Scientist

Years of Experience: 7+
Education Required: Bachelor's Degree or Equivalent Work Experience

Interview Details:
- Phone Interview
- Face to Face Mandatory - Video Call is not an option


Purpose:
Looking for a Data Scientist who will support operations and data architecture teams with insights gained from analyzing company data. You must be the kind of person who can create value out of data. The ideal candidate is adept at using large data sets to find opportunities for product and process optimization and at using models to test the effectiveness of different courses of action. Such a person proactively fetches information from various sources and analyzes it for a better understanding of how the business performs. Additionally, they can utilize AI tools to automate certain processes.
 
This person must have strong experience using a variety of data mining/data analysis methods, using a variety of data tools, building and implementing models, using/creating algorithms and creating/running simulations. They must have a proven ability to drive business results with their data-based insights. They must be comfortable working with a wide range of stakeholders and functional teams. The right candidate will have a passion for discovering solutions hidden in large data sets and working with stakeholders to improve business outcomes. This Data Scientist must possess the skill-sets necessary to hit the ground running and must be willing to learn about the mobile phone business while solving problems quickly and efficiently.
 
See Yourself:

  • Working with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
  • Mining and analyzing data from company databases to drive optimization and improvement of product development, marketing techniques and business strategies.
  • Assessing the effectiveness and accuracy of new data sources and data gathering techniques.
  • Developing custom data models and algorithms to apply to data sets.
  • Using predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting and other business outcomes.
  • Coordinating with different functional teams to implement models and monitor outcomes.
  • Developing processes and tools to monitor and analyze model performance and data accuracy.

Position Requirements

Minimum Requirements:
  • Four-year degree in a related field of study such as Computer Science, Math or Statistics.
  • 7+ continuous years of professional experience as a Data Scientist.
  • Strong problem-solving skills with an emphasis on product development.
  • Experience using statistical computer languages (R, Python, SQL, etc.) to manipulate data and draw insights from large data sets.
  • Experience working with and creating data architectures.
  • Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks.
  • Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, etc.) and experience with applications.
  • A drive to learn and master new technologies and techniques.
  • Coding knowledge and experience with several languages: C, C++, Java, JavaScript, etc.
  • Superb communication skills with an emphasis on writing and interpretation abilities.
  • Excellent presentation skills, including the ability to convert complex data into digestible formats for non-technical business teams.
Extended Requirements:
  • Knowledge and experience in statistical and data mining techniques: GLM/Regression, Random Forest, Boosting, Trees, text mining, social network analysis, etc.
  • Experience using web services: Redshift, S3, Spark, etc.
  • Experience creating and using advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
  • Experience analyzing data from 3rd party providers.
  • Experience with distributed data/computing tools: Map/Reduce, Hadoop, Hive, Spark, Gurobi, MySQL, etc.
  • Experience visualizing/presenting data for stakeholders.
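As a small worked example of one technique from the lists above (ordinary least-squares regression), here is a closed-form fit using only the Python standard library; the data points are synthetic:

```python
# Fit y = slope * x + intercept by ordinary least squares.
# In practice this would be done with R, scikit-learn, or statsmodels;
# the closed form below just makes the underlying math explicit.
def fit_line(xs, ys):
    """Return (slope, intercept) minimizing the sum of squared errors."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# The synthetic data follows y = 2x + 1 exactly, so the fit recovers it.
slope, intercept = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
```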
Physical Environment and Working Conditions:
Agile open floor plan
Onsite 5 days a week

U.S. Bank
  • Minneapolis, MN

SUMMARY

The successful candidate for this Data Analyst position will be responsible for day-to-day reporting of the mobile app voice capability function. This is a business-line support function that will provide reporting of voice adoption and performance metrics used in communication plans with development and data science groups for targeting enhancements. The successful candidate should be able to provide and communicate reporting findings to peers and senior stakeholders.

RESPONSIBILITIES

This position's job functions will include:

- Reporting/KPIs: Agile reporting dashboards, Capex and Epic and tactical performance measurement.

- Raw data requests: active customers and customers by age, duration, income, etc.

- Adoption/Engagement: user metric tracking, engagement frequency, volume measurement, segmentation reporting and geo-location reporting.

- Digital voice funnel insights: success rates, processing and response timing and engagement and fallout.

- A/B testing: front-end test design with technical partners and test performance measurement (quality tracking, e.g., fall-out, success rates, usage rates and response time) and conversion (tracking to goal).

- Satisfaction tracking: CSAT, sentiment and leading satisfaction indicator identification.

- Visualization: self-service dashboard development and proven ability to develop in Excel and Tableau.
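The A/B-testing work above can be sketched as a two-proportion z-test; this is an illustrative Python example with invented conversion counts, not the bank's actual tooling:

```python
# Compare conversion rates between variant A and variant B of a front-end
# test. Counts are hypothetical; real analysis would also check power,
# sample-size planning, and multiple-testing corrections.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    phi = 0.5 * (1 + erf(abs(z) / sqrt(2)))  # standard normal CDF at |z|
    return z, 2 * (1 - phi)

z, p = two_proportion_z(120, 1000, 150, 1000)  # variant B converts better
```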

This position will help drive and contribute to the overall team success measures and goals related to:

- KPI Metrics (LOB Dashboards)

- Descriptive Statistics

 -Raw Data Requests

- CapEx Planning / Forecasting

- Benchmarking

- Funnel Analysis

- Forecasting (Plan)

- Optimization (Operational)

REQUIRED

- Bachelor's degree in a quantitative field such as econometrics, computer science, engineering or applied mathematics, or equivalent work experience.

- 6+ years of statistics or analytics experience.


PREFERRED

- Working knowledge of analytics and statistical software such as SQL, R, Python, Excel, Hadoop, SAS, SPSS and others to perform analysis and interpret data.

- Mastery of Microsoft Excel.

- Experience in report development, automation, and visualization.

- Strong reporting skills with the ability to extract, collect, organize, and interpret trends or patterns in complex data sets.

- Basic understanding of data science capabilities and language.

- Demonstrated project management skills.

- Demonstrated leadership and pro-activity skills.

- Effective interpersonal, verbal and written communication skills.

- Considerable knowledge of assigned business line or functional area.

- Working knowledge of customer self-service technology capabilities.

- Basic understanding of data architecture and development concepts.

ING
  • Amsterdam, Netherlands
ING is looking for experienced hires to help build on our Global Analytics ambition


About ING


Think Forward! Our purpose is to empower people to stay a step ahead in life and in business. We are an industry recognized strong brand with positive recognition from customers in many countries, a strong financial position, Omni-channel distribution strategy and international network. If you want to work at a place where we believe that you can make the difference by using machine learning to generate data driven products and solve the most pressing business challenges, please read on.


We are incredibly excited about Data Analytics and the great potential for progress and innovation. We believe that analytics is a key differentiator in bringing “anytime, anywhere, personalized” services to our customers.  We wish to improve our operational processes and create new and innovative data driven products that go beyond traditional banking, such as the platform models.  Achieving this vision will require us to build and expand on our analytics effort and organize ourselves around focused value buckets with strong coordination capabilities of data, technology, customer journey, UX, as well as external partnerships. 



Global Analytics


Global Analytics is a new unit responsible for realizing this vision for ING, differentiating ING as a leader in data-driven organization within the banking sector and beyond. The team consists of a number of Global Analytics Centers of Excellence built around the bank’s key capabilities (such as Pricing, Risk Management, Financial Crime, and Customer Intelligence), as well as strong coordination areas around data management, technology, customer journey, UX, and external partnerships.



Financial Crime & RegTech CoE (Center of Excellence)


Being a compliant and safe bank is a non-negotiable precondition of everything we do.


The purpose of the Financial Crime and RegTech center of excellence is to define the strategy and drive the development, implementation, and adoption of analytics capabilities in the financial crime domain, making ING a safer and more compliant bank.



Role Profile


As part of the center of excellence, you will help create innovative, scalable, data-driven solutions in the Financial Crime space. You will proactively work with teams to implement these solutions across the organization.


You will work collaboratively with an extended group of stakeholders, including but not limited to Operations, Compliance, Engineering, Legal, and Corporate Audit.


A background in anti-money laundering or fraud services is therefore a plus.



About you



  • Your Data Science knowledge enables you to build analytics solutions for mitigating Financial Crime related risks.

  • You are willing and able to learn and to improve your technical as well as your interpersonal skills.

  • You are a creative and curious Data Scientist who looks forward to working on a wide variety of Financial Economic Crime related problems.

  • You have a thorough understanding of machine learning algorithms and tooling, and are able to pass your knowledge on to others.

  • You are proficient in coding and able to deliver production-ready code.

  • You have extensive experience turning data into added value for your stakeholders. You are able to see things from a different perspective and make original solutions work in practice; when suitable, you propose such endeavors to stakeholders.

  • You are able to see where ING can take further steps towards becoming a truly data-driven bank. You’re always thinking one step ahead, for example when advising on the best way of implementation.

  • Your communication skills enable you to work together with many different parties throughout our organization.

  • You have extensive experience with stakeholder management from within data science projects.

  • Your enthusiasm is visible and you are good at mobilizing people for our data-driven purpose.




  • You like working in cross-functional teams to realize an ambitious goal.

  • You are not shy about asking other Data Scientists in the team for help, and you are happy to help them out by sharing your knowledge and capabilities. You are able and willing to guide junior Data Scientists and interns in their work, and have experience doing so in previous roles.

  • You are a team player who strikes an effective balance between independence and acting in the interest of the team.

  • You are perseverant and do not give up when a problem is hard; you know how to deal with setbacks.

  • Your enthusiasm is contagious and inspires others to act; your actions set an example for others.


Candidate Profile



  • MSc or PhD with excellent academic results in the field of Computer Science, Mathematics, Engineering, Econometrics or similar.

  • At least 3 to 4 years of work-related experience in a similar role and/or environment.

  • Machine Learning: Classification, Regression, Clustering, Unsupervised methods, Text Mining. You have an excellent understanding of Random Forests, Gradient Boosting, Neural Networks, Logistic Regression, SVM, KNN, K-Means, etc. Parametric and non-parametric statistics is a plus.

  • Programming Languages: Python and R. Scala is a plus.

  • Tools: Spark, Hadoop.

  • Database handling: SQL, Hive. Familiarity with Oracle, Netezza, HBase, Cassandra, and graph databases is a plus.

  • Visualisation tools: D3.js, Shiny, Angular.
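To illustrate the kind of classification skill the Machine Learning bullet refers to, here is a minimal sketch: a toy logistic-regression scorer trained by gradient descent on invented "transaction" features. The features, data, and thresholds are all hypothetical illustrations, not ING's tooling or methodology.

```python
# Illustrative only: a tiny logistic-regression classifier in pure Python.
# Features are hypothetical [normalized amount, normalized hour]; 1 = suspicious.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.1, epochs=2000):
    """Stochastic gradient descent on the logistic loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi                      # gradient of the logistic loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    return sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)

# Fabricated training data: low amount/hour = normal, high = suspicious.
X = [[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]
y = [0, 0, 1, 1]
w, b = train(X, y)
print(predict(w, b, [0.85, 0.85]))  # score for a high-amount, late-hour case
```

In practice a role like this would use library implementations (e.g. scikit-learn or SparkML, both named in the posting) rather than hand-rolled gradient descent; the sketch only shows the underlying mechanics.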



Do you recognize yourself in our description? Then please apply and join us to change the future of banking!

PrimeRevenue
  • Atlanta, GA

ARE YOU READY TO WORK AT PRIMEREVENUE?

Do you want to work for a high-growth FinTech company helping other companies innovate, grow, and create jobs? Do you enjoy working within an entrepreneurial environment that is mission-driven, results-driven, and community-oriented? We're looking for a Director of Data Architecture to continue the impressive development of our data enterprise, exposing our customers to a wealth of insights and predictive analytics. The Director will be responsible for the design, development, and execution of data product initiatives. This individual will be part of a multi-disciplinary team including data architects, BI developers, technical architects, data scientists, engineering, and operational teams for data products.

WHAT YOU GET TO DO:

    • Design, create, deploy and manage our organization's data architecture
    • Develop and own our Data Product Roadmap
    • Map the systems and interfaces used to manage data, set standards for data management, analyze current state and conceive desired future state, and conceive projects needed to close the gap between current state and future goals
    • Provide a standard common business vocabulary, express strategic data requirements, outline high level integrated designs to meet these requirements, and align with enterprise strategy and related business architecture
    • Set data architecture principles, create models of data that enable the implementation of the intended business architecture
    • Create diagrams showing key data entities, and create an inventory of the data needed to implement the architecture vision
    • Drive all phases of data modelling, from conceptualization to database optimization, including SQL development and any database administration
    • Design ETL architecture



WHAT ARE WE LOOKING FOR?

  • Bachelor's degree in Computer Science or related discipline
  • Minimum 10 years working in a data products organization
  • Knowledge of relational and dimensional data modelling
  • Knowledge of RDBMS solution (DB2, Oracle)
  • Excellent SQL skills
  • Experienced in Agile methodologies
  • Deep understanding of Data Management principles
  • Strong oral and written communication skills
  • Strong leadership skills
  • BI tools (Tableau, MicroStrategy)
  • Experience architecting enterprise data lakes in AWS
  • Hands-on experience with AWS and Hadoop-ecosystem frameworks/tools such as Kinesis, Glue, Redshift, Spark, Hive, Pig, etc.
  • Experience with distributions in Amazon EMR
  • Previous work with NoSQL databases such as MongoDB, as well as relational databases such as PostgreSQL


WHO ARE YOU?

SMART, HUNGRY, & HUMBLE PERSONALITY IS A MUST!


WORKING AT PRIMEREVENUE BENEFITS:

    • Professional growth within our company
    • Monthly fun TEAM events
    • Generous benefits package
    • Community Service-Oriented Culture
FlixBus
  • München, Germany

We're looking for a motivated and driven Data Scientist (m/f/d) who will help us shape our team, drive the company to the next level, and have the most direct influence on our success.


Your Tasks – Paint the world green



  • You analyse our data and prototype systems and algorithms for customer relationship management and customer behaviour modelling

  • You use statistical models to answer real-life questions such as

    • “Is departure time more important than trip duration?”

    • “What are the effects of a marketing campaign on customer demand?”

    • “Are robots better at writing advertisement texts than humans?”



  • You work closely with our marketing department, business intelligence and product development teams on solutions for data-driven decision making

  • You solve our complex business challenges with creative insights
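As a hedged sketch of the first question above ("Is departure time more important than trip duration?"), one crude first pass is to compare how strongly each feature correlates with demand. All numbers below are fabricated for illustration; a real analysis would fit a proper regression model on observed bookings.

```python
# Illustrative only: compare Pearson correlations of two toy features
# (departure hour, trip duration) with fabricated booking counts.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

departure_hour = [6, 8, 10, 12, 14, 16, 18, 20]   # invented trips
trip_hours     = [2, 3, 2, 4, 3, 5, 4, 6]
bookings       = [40, 55, 70, 80, 85, 75, 95, 60]

r_dep = pearson(departure_hour, bookings)
r_dur = pearson(trip_hours, bookings)
print(abs(r_dep), abs(r_dur))  # the larger magnitude suggests the stronger signal
```

Correlation alone cannot establish importance (features may be confounded), which is why the posting asks for statistical modelling skills; a multivariate model with standardized coefficients would be the next step.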


Your Profile – Ready to hop on board



  • Experience as a Data Scientist or a relevant role

  • Hands-on experience with various machine learning algorithms, statistics and algorithm design

  • Fluency in at least one high-level programming language (Python, Scala, R, etc)

  • Data visualization skills

  • Working knowledge of relational and NoSQL databases, Big data tools (Spark, Hadoop, etc) 

  • Experience with deep learning is a plus

  • Working knowledge of search engine optimization and marketing, web analytics, customer relationship management is a plus

Epidemic Sound AB
  • Stockholm, Sweden

At Epidemic Sound we are reinventing the music industry. Our carefully curated music library, with over 30,000 tracks, is tailored for storytellers no matter what their story is. Countless customers around the world, from broadcasters and production companies to YouTubers, rely on our tracks to help them tell their stories. Epidemic Sound’s music is heard in hundreds of thousands of videos on online video platforms such as YouTube, Facebook, Twitch and Instagram. Our HQ is located in Stockholm with offices in NYC, LA, Hamburg, Amsterdam and Madrid. We are growing fast, we have lots of fun and we are transforming the music industry.


We are now looking for a Software Engineer!


Job description


In addition to the hundreds of thousands of storytellers using our products, our music is heard tens of billions of times every month across YouTube, social media, and music streaming platforms. We want to make maximum use of all this data to generate insights and enable data-driven decisions both in our product development and for our musicians and business stakeholders.


As such, we are now looking to grow our Data Infrastructure Team with an experienced software engineer to help us design, develop and evolve our platform and products.


You can expect to:



  • Collaborate with our product-oriented teams during data-related projects to achieve reliable and scalable solutions

  • Design, develop and evolve the data pipelines that fuel our back-end services, machine learning platform and business intelligence systems.

  • Contribute to all stages of the product life-cycle: Design, implementation, testing, releasing and maintenance

  • Work in a lightweight agile process where we regularly evaluate how we work and try to improve

  • Collaborate constantly: We’re big believers in teamwork and the value of practices like careful code reviews, pair (or mob) programming etc

  • Learn a ton of new things: Be it through hack-days, courses, conferences or tech-talks, we emphasize learning and we also expect you to share your knowledge with your colleagues

  • Have fun and take a lot of pride in what you and all of Epidemic Sound are doing


What are we looking for?



  • You have a great understanding of modern web architectures, distributed systems, data processing and storage solutions

  • You have strong knowledge of relational databases and SQL

  • You are able to find opportunities for data acquisition and new uses for existing data

  • You enjoy working with multiple stakeholders and feel comfortable prioritizing internal and external requirements

  • You love teamwork

  • You speak English with professional working proficiency (Swedish is not a requirement)
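The relational/SQL skills listed above can be sketched in a few lines with Python's stdlib sqlite3 module. The schema and "track play" rows below are invented for illustration and bear no relation to Epidemic Sound's actual data model.

```python
# Illustrative only: a toy aggregation over invented play-count rows,
# using an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE plays (track TEXT, platform TEXT, n INTEGER)")
conn.executemany(
    "INSERT INTO plays VALUES (?, ?, ?)",
    [("Sunrise", "YouTube", 120), ("Sunrise", "Twitch", 30),
     ("Nightfall", "YouTube", 80)],
)
# Total plays per track, most-played first.
rows = conn.execute(
    "SELECT track, SUM(n) FROM plays GROUP BY track ORDER BY SUM(n) DESC"
).fetchall()
print(rows)
```

A production data pipeline would of course run queries like this against a proper warehouse or processing engine (the posting mentions BigQuery and Spark), but the SQL itself carries over directly.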


It would also be music to our ears if:



  • You have experience in one or more of the following: Python, AWS ecosystem, Google Cloud Platform

  • You have experience working with data processing and querying engines such as Spark, Hadoop, BigQuery or Kafka

  • You have experience with multiple storage solutions such as column-oriented, NoSQL or graph databases


Application


Do you want to be a part of our fantastic team? Send us your cover letter and CV as soon as possible; interviews are held continuously. We look forward to hearing from you!

Accenture
  • Detroit, MI
Position: Full time
Travel: 100% travel
Join Accenture Digital and leave your digital mark on the world, enhancing millions of lives through digital transformation. Here you can create the best customer experiences and be a catalyst for first-to-market digital products and solutions using machine learning, AI, big data and analytics, cloud, mobility, robotics, and the industrial internet of things. Your work will redefine the way entire industries work in every corner of the globe.
You'll be part of a team with incredible end-to-end digital transformation capabilities that shares your passion for digital technology and takes pride in making a tangible difference. If you want to contribute on an incredible array of the biggest and most complex projects in the digital space, consider a career with Accenture Digital.
Basic Qualifications
    • Bachelor's degree in Computer Science, Engineering, or Technical Science, or 3 years of IT/programming experience.
    • Minimum 2 years of experience designing and implementing large-scale data pipelines for data curation and analysis, operating in production environments, using Spark, PySpark, and Spark SQL with Java, Scala or Python, on premise or in the cloud (AWS, Google or Azure).
    • Minimum 1 year of experience designing and building performant data models at scale using Hadoop, NoSQL, graph, or cloud-native data stores and services.
    • Minimum 1 year of experience designing and building secured Big Data ETL pipelines, using Talend or Informatica Big Data Editions, for data curation and analysis of large-scale production-deployed solutions.
    • Minimum 6 months of experience designing and building data models to support large-scale BI, analytics, and AI solutions for Big Data.
Preferred Skills
    • Minimum 6 months of implementation experience with Databricks.
    • Experience in machine learning using Python (scikit-learn), SparkML, H2O and/or SageMaker.
    • Knowledge of deep learning (CNN, RNN, ANN) using TensorFlow.
    • Knowledge of AutoML tools (H2O, DataRobot, Google AutoML).
    • Minimum 2 years designing and implementing large-scale data warehousing and analytics solutions with RDBMSs (e.g. Oracle, Teradata, DB2, Netezza, SAS), with an understanding of the challenges and limitations of these traditional solutions.
    • Minimum 1 year of experience implementing SQL-on-Hadoop solutions using tools like Presto, AtScale, Jethro and others.
    • Minimum 1 year of experience building data management (metadata, lineage, tracking, etc.) and governance solutions for big data platforms, on premise or on AWS, Google or Azure cloud.
    • Minimum 1 year of experience re-architecting and rationalizing traditional data warehouses with Hadoop, Spark or NoSQL technologies, on premise or transitioning to AWS or Google clouds.
    • Experience implementing data-preparation technologies such as Paxata, Trifacta or Tamr to enable self-service solutions.
    • Minimum 1 year of experience building business data catalogs or data marketplaces on top of a hybrid data platform containing Big Data technologies (e.g. Alation, Informatica or custom portals).
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H-1B visa, F-1 visa (OPT), TN visa, or any other non-immigrant status). Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration. Accenture is an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law. Job Candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women .
DTCC
  • Dallas, TX

Business Unit: Group Chief Risk Office

Department: Quantitative Risk Management - QRM

Job Family: Oversight and Control

Job Title: Senior Quantitative Analyst

Corporate Title: Associate Director

FLSA Code (US Only): Exempt


Location: Dallas TX or Tampa FL

Business Unit Description:

Our Risk Management teams work to protect the safety and soundness of our systems and are responsible for identifying, managing, measuring and mitigating a spectrum of key risk types including credit, market, liquidity, systemic, operational and technology in all existing and new products, activities, processes and systems.

Position Summary:

The Senior Quantitative Analyst is primarily responsible for designing, developing and testing new quantitative risk management models, while enhancing existing ones. The role is also responsible for model performance monitoring and ad-hoc studies.

Principal Responsibilities:

Work with the team to build and maintain state-of-the-practice quantitative risk management models and tools, including market risk, liquidity risk, and credit risk models.

Deliver both strategic and tactical solutions to risk modeling issues and problems.

Collaborate with internal departments including model validation, market risk, legal and audit to maintain transparency when assessing potential risk exposures within member firms. By doing so, the candidate will gain a solid understanding of risk management practices within a CCP (central counterparty).

Build prototypes of new models or model enhancements and work with technology teams to implement proposed models.

Stay abreast of the latest developments and best practices in quantitative financial modeling and risk management.

Experience:

3+ years of experience in financial risk management / financial engineering / data analysis / modeling.

Knowledge and Skills Required:

Understanding of traded products, market conventions, as well as risk measurement for equities and/or fixed income products.

Working experience with basic market risk management models such as Value at Risk and financial time series models.  

Ability to handle large data sets and perform data cleansing. Basic understanding of and exposure to data science techniques such as regression, clustering, decision trees, etc.

Ability to operate independently as well as being an effective team player.

Excellent communication and presentation skills.

Programming skills in SQL, Python or R; additional programming languages are a plus.
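One of the "basic market risk management models" named above, Value at Risk, can be sketched in a few lines as a one-day historical simulation. The returns and confidence level below are invented for illustration; this is not DTCC's methodology, just the textbook form of the technique.

```python
# Illustrative only: one-day historical-simulation Value at Risk.
def historical_var(returns, confidence=0.99):
    """Loss threshold not exceeded with the given confidence level."""
    losses = sorted(-r for r in returns)      # convert returns to losses, ascending
    idx = int(confidence * len(losses))       # empirical quantile index
    return losses[min(idx, len(losses) - 1)]

# Fabricated daily returns for a single toy position
rets = [0.001, -0.004, 0.002, -0.012, 0.005,
        -0.007, 0.003, -0.002, 0.006, -0.009]
var_95 = historical_var(rets, confidence=0.95)
print(var_95)  # worst-case daily loss at 95% confidence
```

Production VaR engines use far longer return histories, portfolio-level repricing, and careful quantile interpolation; the sketch only shows the core empirical-quantile idea.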

Education, Training &/or Certification:

Master's degree in finance, computer science, or a quantitative field (minimum requirement).

ASML
  • Veldhoven, Netherlands

Introduction



When people think ‘software’, they often think of companies like Google or Microsoft. Even though ASML is classified as a hardware company, we in fact have one of the world's largest and most pioneering Java communities. The ASML Java environment is extremely attractive for prospective Java engineers because it combines big data with extreme complexity. From Hadoop retrieval to machine learning to full-stack development, the possibilities are endless. At ASML, our Java teams create and implement software designs that run in the most modern semiconductor fabs in the world, helping customers like Samsung, Intel and TSMC make computer chips faster, smaller, and more efficient. Here, we’re always pushing the boundaries of what’s possible.


We are always looking for talented Java developers who know how to apply the latest Java SE or Java EE technologies, to join the teams responsible for creating software for high volume manufacturing automation in semiconductor fabs. Could this be your next job? Apply now!



Job Mission



As a Java developer you will join one of our multinational Scrum teams to create state-of-the-art software solutions. Teams are composed of five to ten developers, a Scrum Master and a Product Owner. We are committed to following a (scaled) Agile way of working, with sprints and demos every two weeks, aiming for frequent releases of working software. In all teams we cooperate with internal and external experts from different knowledge domains to discover and build the best solutions possible. We use tools like Continuous Integration with Git, Jira and Bamboo. We move fast to help our customers reach their goals, and we strive to create reliable and well-tested software, because failures in our software stack can severely impact customers' operations.



Job Description



All these dedicated Java teams work in unison on different products and platforms across ASML. Here’s a brief description of what the different Java teams do: 



  • Create software infrastructure using Java EE, which provides access to SQL and NoSQL storage, reliably manages job queues with switch-over and fail-over features, periodically collects information from networked systems in the fab, and offers big-data-like storage and computational capabilities;

  • Create on-site solutions that continuously monitor all scanners in a customer’s domain. The server can detect system failures before they happen and identify needed corrective actions;

  • Provide industrial automation tasks that take care of unattended complex adjustments to the manufacturing process, enabling the highest yields in high-volume manufacturing;

  • Implement and validate algorithms that give our customers the power to reach optimal results during manufacturing;

  • Create applications that help fine-tune the manufacturing process, helping process engineers to navigate the complexities of process set-up through excellent UX design.

  • Create visualization and analytics applications for visual performance monitoring, which also help to sift through huge amounts of data to pinpoint potential issues and solutions. For these applications, we use Spotfire and R as main technologies.

  • Select and manage IT infrastructure that helps us run the software on a multi-blade server with plenty of storage. In this area we use virtualization technologies, Linux, Python and Splunk, in addition to Java;

  • Use emerging technologies to turn vision into reality, e.g. using big data and machine learning;


Responsibilities:



  • Designing and implementing software, working on the product backlog defined by the Product Owner;

  • Ensuring the quality of personal deliverables by designing and implementing automated tests on unit and integration levels;

  • Cooperating with other teams to ensure consistent implementation of the architecture, and agreeing on interfaces and timing of cross-team deliveries;

  • Troubleshooting, analyzing, and solving integration issues both from internal alpha and beta tests as well as those reported by our customers;

  • Writing or updating product documentation in accordance with company processes;

  • Suggesting improvements to our technical solutions and way of working, and implementing them in alignment with your team and their stakeholders.


Main technologies: Java SE and EE, ranging from version 6 to the latest. JUnit, Mockito, XML, SQL, Linux, Hibernate, Git, Jira.



Education



A relevant BSc or MSc in the area of IT, electronics or computer engineering.



Experience



If you already have Java software development experience in the high-tech industry, you should have the following:



  • At least 4 years of Java SE or Java EE development experience;

  • Design and development of server-side software using object-oriented paradigm;

  • Creation of automated unit and integration tests using mocks and stubs;

  • Working with Continuous Integration;

  • Working as a part of a Scrum team;

  • Experience with the following is a plus:

  • Creation of web-based user interface;

  • Affinity with math, data science or machine learning.

  • Nice-to-have skills include experience with Hadoop, Spark, and MongoDB.




Personal skills




  • First of all, you’re passionate about technology and are excited by the idea that your work will impact millions of end-users worldwide;

  • You’re analytical, and product- and quality-oriented;

  • You like to investigate issues, and you’re a creative problem solver;

  • You’re open-minded, you like to discuss technical challenges, and you want to push the boundaries of technology;

  • You’re an innovator and you constantly seek to improve your knowledge and your work;

  • You take ownership and you support your team - you’re the backbone of your group;

  • You’re client and quality oriented – you don’t settle for second-best solutions, but strive to find the best ways to acquire top results.


Important information


So, you’ve read the vacancy and are about to click the “apply for this job” button. That’s great! To make everything go as smoothly as possible, we would really appreciate it if you could take a look at the following tips from us.
Although you have the option to upload your LinkedIn profile during the application process, we would like to ask you to upload a detailed CV and a personalized cover letter. Adding your LinkedIn, GitHub, or any other relevant profile to your CV is always a good idea!

Before uploading your CV, can you please make sure that the following information is present:



  • Short information about your experience: company / sector / domain / product.

  • Context of the project (how big / complex were the projects / what were the goals).

  • What was the size of the team, who was involved (think of developers, testers, architects, product owners, scrum master), and what was your role in the team.

  • Which languages / tools / frameworks / versions (recent) were used during the project (full stack, server side, client side).

  • Which projects used an agile methodology.

  • What were the results of the projects (as a team and individually)  / your contribution and achievements.

Accenture
  • Minneapolis, MN
Position: Full time
Travel: 100% travel
Join Accenture Digital and leave your digital mark on the world, enhancing millions of lives through digital transformation. Here you can create the best customer experiences and be a catalyst for first-to-market digital products and solutions using machine learning, AI, big data and analytics, cloud, mobility, robotics, and the industrial internet of things. Your work will redefine the way entire industries work in every corner of the globe.
You'll be part of a team with incredible end-to-end digital transformation capabilities that shares your passion for digital technology and takes pride in making a tangible difference. If you want to contribute on an incredible array of the biggest and most complex projects in the digital space, consider a career with Accenture Digital.
Basic Qualifications
    • Bachelor's degree in Computer Science, Engineering, or Technical Science, or 3 years of IT/programming experience.
    • Minimum 2 years of experience designing and implementing large-scale data pipelines for data curation and analysis, operating in production environments, using Spark, PySpark, and Spark SQL with Java, Scala or Python, on premise or in the cloud (AWS, Google or Azure).
    • Minimum 1 year of experience designing and building performant data models at scale using Hadoop, NoSQL, graph, or cloud-native data stores and services.
    • Minimum 1 year of experience designing and building secured Big Data ETL pipelines, using Talend or Informatica Big Data Editions, for data curation and analysis of large-scale production-deployed solutions.
    • Minimum 6 months of experience designing and building data models to support large-scale BI, analytics, and AI solutions for Big Data.
Preferred Skills
    • Minimum 6 months of implementation experience with Databricks.
    • Experience in machine learning using Python (scikit-learn), SparkML, H2O and/or SageMaker.
    • Knowledge of deep learning (CNN, RNN, ANN) using TensorFlow.
    • Knowledge of AutoML tools (H2O, DataRobot, Google AutoML).
    • Minimum 2 years designing and implementing large-scale data warehousing and analytics solutions with RDBMSs (e.g. Oracle, Teradata, DB2, Netezza, SAS), with an understanding of the challenges and limitations of these traditional solutions.
    • Minimum 1 year of experience implementing SQL-on-Hadoop solutions using tools like Presto, AtScale, Jethro and others.
    • Minimum 1 year of experience building data management (metadata, lineage, tracking, etc.) and governance solutions for big data platforms, on premise or on AWS, Google or Azure cloud.
    • Minimum 1 year of experience re-architecting and rationalizing traditional data warehouses with Hadoop, Spark or NoSQL technologies, on premise or transitioning to AWS or Google clouds.
    • Experience implementing data-preparation technologies such as Paxata, Trifacta or Tamr to enable self-service solutions.
    • Minimum 1 year of experience building business data catalogs or data marketplaces on top of a hybrid data platform containing Big Data technologies (e.g. Alation, Informatica or custom portals).
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H-1B visa, F-1 visa (OPT), TN visa, or any other non-immigrant status). Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration. Accenture is an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law. Job Candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women .
Accenture
  • Atlanta, GA
Position: Full time
Travel: 100% travel
Join Accenture Digital and leave your digital mark on the world, enhancing millions of lives through digital transformation. Here you can create the best customer experiences and be a catalyst for first-to-market digital products and solutions using machine learning, AI, big data and analytics, cloud, mobility, robotics, and the industrial internet of things. Your work will redefine the way entire industries work in every corner of the globe.
You'll be part of a team with incredible end-to-end digital transformation capabilities that shares your passion for digital technology and takes pride in making a tangible difference. If you want to contribute on an incredible array of the biggest and most complex projects in the digital space, consider a career with Accenture Digital.
Basic Qualifications
    • Bachelor's degree in Computer Science, Engineering, or Technical Science, or 3 years of IT/programming experience.
    • Minimum 2 years of experience designing and implementing large-scale data pipelines for data curation and analysis, operating in production environments, using Spark, PySpark, and Spark SQL with Java, Scala or Python, on premise or in the cloud (AWS, Google or Azure).
    • Minimum 1 year of experience designing and building performant data models at scale using Hadoop, NoSQL, graph, or cloud-native data stores and services.
    • Minimum 1 year of experience designing and building secured Big Data ETL pipelines, using Talend or Informatica Big Data Editions, for data curation and analysis of large-scale production-deployed solutions.
    • Minimum 6 months of experience designing and building data models to support large-scale BI, analytics, and AI solutions for Big Data.
Preferred Skills
    • Minimum 6 months of implementation experience with Databricks.
    • Experience in machine learning using Python (scikit-learn), Spark MLlib, H2O, and/or SageMaker.
    • Knowledge of deep learning (CNNs, RNNs, ANNs) using TensorFlow.
    • Knowledge of automated machine learning tools (H2O, DataRobot, Google AutoML).
    • Minimum 2 years designing and implementing large-scale data warehousing and analytics solutions with RDBMS platforms (e.g., Oracle, Teradata, DB2, Netezza, SAS), with an understanding of the challenges and limitations of these traditional solutions.
    • Minimum 1 year of experience implementing SQL-on-Hadoop solutions using tools such as Presto, AtScale, and Jethro.
    • Minimum 1 year of experience building data management (metadata, lineage, tracking, etc.) and governance solutions for big data platforms, on premise or on AWS, Google Cloud, or Azure.
    • Minimum 1 year of experience re-architecting and rationalizing traditional data warehouses with Hadoop, Spark, or NoSQL technologies, on premise or transitioning to AWS or Google Cloud.
    • Experience implementing data preparation technologies such as Paxata, Trifacta, or Tamr to enable self-service solutions.
    • Minimum 1 year of experience building business data catalogs or data marketplaces on top of a hybrid data platform containing big data technologies (e.g., Alation, Informatica, or custom portals).
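To give a flavor of the metadata and lineage work mentioned above, here is a toy, hypothetical sketch of a lineage-tracking catalog: each registered dataset records which upstream datasets it was derived from, so the dependencies of any dataset can be traced transitively. Commercial catalogs such as Alation manage far richer metadata; this only shows the core lineage idea.

```python
# Toy lineage catalog: maps each dataset to its direct upstream sources
# and walks the graph to answer "what does this dataset depend on?".
# Dataset names below are invented for illustration.

class DataCatalog:
    def __init__(self):
        self._lineage = {}  # dataset name -> list of direct upstream names

    def register(self, name, upstream=()):
        """Record a dataset and the datasets it was derived from."""
        self._lineage[name] = list(upstream)

    def upstream_of(self, name):
        """Return all transitive upstream dependencies, nearest first."""
        seen = []
        stack = list(self._lineage.get(name, []))
        while stack:
            dep = stack.pop()
            if dep not in seen:
                seen.append(dep)
                stack.extend(self._lineage.get(dep, []))
        return seen

catalog = DataCatalog()
catalog.register("raw_events")
catalog.register("clean_events", upstream=["raw_events"])
catalog.register("daily_report", upstream=["clean_events"])
print(catalog.upstream_of("daily_report"))
```

Inverting the same graph answers the governance question in the other direction: which downstream reports are affected when a source table changes.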
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H-1B visa, F-1 visa (OPT), TN visa, or any other non-immigrant status). Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration. Accenture is an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Accenture
  • Philadelphia, PA
Position: Full time
Travel: 100% travel
Join Accenture Digital and leave your digital mark on the world, enhancing millions of lives through digital transformation. Here you can create the best customer experiences and be a catalyst for first-to-market digital products and solutions using machine learning, AI, big data and analytics, cloud, mobility, robotics, and the industrial internet of things. Your work will redefine the way entire industries work in every corner of the globe.
You'll be part of a team with incredible end-to-end digital transformation capabilities that shares your passion for digital technology and takes pride in making a tangible difference. If you want to contribute to an incredible array of the biggest and most complex projects in the digital space, consider a career with Accenture Digital.
Basic Qualifications
    • Bachelor's degree in Computer Science, Engineering, or Technical Science, or 3 years of IT/programming experience.
    • Minimum 2 years of experience designing and implementing large-scale data pipelines for data curation and analysis, operating in production environments, using Spark (PySpark, Spark SQL) with Java, Scala, or Python, on premise or in the cloud (AWS, Google Cloud, or Azure).
    • Minimum 1 year of experience designing and building performant data models at scale using Hadoop, NoSQL, graph, or cloud-native data stores and services.
    • Minimum 1 year of experience designing and building secured big data ETL pipelines, using Talend or Informatica Big Data Edition, for data curation and analysis in large-scale production deployments.
    • Minimum 6 months of experience designing and building data models to support large-scale BI, analytics, and AI solutions for big data.
Preferred Skills
    • Minimum 6 months of implementation experience with Databricks.
    • Experience in machine learning using Python (scikit-learn), Spark MLlib, H2O, and/or SageMaker.
    • Knowledge of deep learning (CNNs, RNNs, ANNs) using TensorFlow.
    • Knowledge of automated machine learning tools (H2O, DataRobot, Google AutoML).
    • Minimum 2 years designing and implementing large-scale data warehousing and analytics solutions with RDBMS platforms (e.g., Oracle, Teradata, DB2, Netezza, SAS), with an understanding of the challenges and limitations of these traditional solutions.
    • Minimum 1 year of experience implementing SQL-on-Hadoop solutions using tools such as Presto, AtScale, and Jethro.
    • Minimum 1 year of experience building data management (metadata, lineage, tracking, etc.) and governance solutions for big data platforms, on premise or on AWS, Google Cloud, or Azure.
    • Minimum 1 year of experience re-architecting and rationalizing traditional data warehouses with Hadoop, Spark, or NoSQL technologies, on premise or transitioning to AWS or Google Cloud.
    • Experience implementing data preparation technologies such as Paxata, Trifacta, or Tamr to enable self-service solutions.
    • Minimum 1 year of experience building business data catalogs or data marketplaces on top of a hybrid data platform containing big data technologies (e.g., Alation, Informatica, or custom portals).
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H-1B visa, F-1 visa (OPT), TN visa, or any other non-immigrant status). Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration. Accenture is an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.