OnlyDataJobs.com

Analytics Anti-Money Laundering Senior Consultant at DTTS (New York, NY)

  • New York, NY
 

As used in this document, Deloitte Risk & Financial Advisory means Deloitte & Touche LLP, which provides audit and enterprise risk services; Deloitte Financial Advisory Services LLP, which provides forensic, dispute, and other consulting services; and its affiliate, Deloitte Transactions and Business Analytics LLP, which provides a wide range of advisory and analytics services. Deloitte Transactions and Business Analytics LLP is not a certified public accounting firm. These entities are separate subsidiaries of Deloitte LLP. Please see www.deloitte.com/us/about for a detailed description of the legal structure of Deloitte LLP and its subsidiaries. Certain services may not be available to attest clients under the rules and regulations of public accounting.

 

At Deloitte, you can have a rewarding career on every level. In addition to challenging and meaningful work, you'll have the chance to give back to your community, make a positive impact on the environment, participate in a range of diversity and inclusion initiatives, and get the support, coaching, and training it takes to advance your career. Our commitment to individual choice lets you customize everything from your career path to your educational opportunities to your benefits. And our culture of innovation means your ideas on how to improve our business and your clients' businesses will be heard.

 

Visit www.deloitte.com/us/careers to learn more about our culture, benefits, and opportunities.

 

Deloitte Transactions and Business Analytics LLP (DTBA) advises clients on managing business controversy and conflict, executing deals, and maintaining regulatory compliance. We provide services to companies throughout their lifecycle from purchasing a company to investigating potential fraud.

                               

AML Analytics is a fast-growth practice within Deloitte Regulatory that centers on several of the hottest areas of business today (analytics, forensic analysis, and litigation support) and the specialized skills that make careers in these areas both fascinating and in high demand. Our Analytics team makes extensive use of data, statistical and quantitative analysis, rules-based methods, and explanatory and predictive modeling to bring insights to client issues in the forensic and transaction domains. We work with Deloitte specialists in many areas, and we apply our solutions to a wide range of highly interesting and complex corporate events, such as forensic investigations, anti-corruption compliance, restructuring, valuation, anti-money laundering compliance, and construction management, to explain what has occurred in the past and to support informed decision making for the future. Our work increasingly employs specialized competencies, such as advanced analytics, visualization, and geospatial techniques.

Data Analytics:

* Data Analytics: This involves strategic analysis of structured (e.g., transactional) and unstructured (e.g., text) electronic data, from initial data scoping and collection through transformation and validation, for purposes of deep data analysis and mining. This analysis is performed as the basis for larger efforts that include rules- and indicator-based testing, inductive reasoning, and visualization techniques. These methods are used to identify, deter, and prevent fraud schemes and scenarios, develop models that depict or describe historical events, and build predictive models for future anti-fraud and anti-corruption efforts. Analytical methods are also commonly used to mine and develop insights into large volumes of data to help navigate intricate legal processes and identify areas of opportunity for business process improvement. Additionally, analytics is often used to assist troubled companies in peer analysis, quantifying opportunity cost, and discerning areas of strategic improvement. (A minimal rules-based testing sketch follows this list.)
* Application Development: From standalone tools to enterprise web applications, we leverage a modern proprietary services-based development platform and agile methodology to assist clients with processing data, presenting findings, and documenting decisions. We serve both external and internal clients, with a focus on enhancing existing products and service offerings.
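
To make the rules- and indicator-based testing described above concrete, here is a minimal, hypothetical sketch in Python (pandas); the data, column names, and thresholds are invented for illustration and do not represent a Deloitte methodology.

```python
# Hypothetical rules-based indicator: flag accounts with repeated just-below-threshold
# transactions inside a short window. All values here are illustrative assumptions.
import pandas as pd

transactions = pd.DataFrame({
    "account_id": ["A1", "A1", "A1", "B2", "B2"],
    "amount":     [9500, 9800, 9900, 120, 45000],
    "tx_date":    pd.to_datetime(["2023-01-02", "2023-01-03", "2023-01-04",
                                  "2023-01-02", "2023-01-05"]),
})

# Indicator: amount just below an assumed 10,000 reporting threshold
near_threshold = transactions[transactions["amount"].between(9000, 9999)]

# Rule: three or more such transactions by the same account within a 7-day bucket
counts = (near_threshold
          .groupby(["account_id", pd.Grouper(key="tx_date", freq="7D")])
          .size()
          .reset_index(name="hits"))

alerts = counts[counts["hits"] >= 3]
print(alerts)  # accounts whose activity matches the structuring-style rule
```

In practice, many such indicators would be combined, tuned against historical outcomes, and layered with the visualization and predictive-modeling techniques described above.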

Examples of areas in which AML Analytics has delivered value to clients include:
* Enhancing a transaction-monitoring program for a financial services client by assisting them in the development of a methodology to analyze, test, and tune various detection scenarios, and assisting them in deploying that methodology.
* Delivering an analytics solution that created a predictive coding model to assist a financial services client with identifying and categorizing an enormous volume of privileged and non-privileged documents related to a large bankruptcy case.
* Helping to resolve a royalty distribution dispute in the client's favor by developing and articulating an economic model that showed how to determine the relative market value of the copyrighted content.

Our dedicated Deloitte professionals bring vast experience, specialized skill sets and deep industry knowledge to our clients. This personalized level of service, combined with the market reach and technical resources of the Deloitte Touche Tohmatsu Limited (DTTL) member firms and their affiliates, enables us to respond to the complex and diverse needs of our clients around the world.

 

Role Description

Senior associates are professionals with a high degree of academic and professional achievement who have demonstrated the capacity and desire for continuous growth and development. Specialists at this level will be expected to query and mine large data sets to discover transaction patterns, examine financial data, and filter for targeted information using traditional tools (SQL Server 2012, Oracle 11.2, Microsoft Access, etc.) as well as advanced visualization (Tableau, QlikView, etc.) and predictive analytics (SAS, neural networks, machine learning, etc.) methodologies and software packages. Senior associates will usually work with a manager to design and develop user-defined application modules, perform data quality control, develop database reports and user interfaces, and normalize relational data.
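
The transaction-monitoring example above mentions analyzing, testing, and tuning detection scenarios. Below is a minimal, hypothetical sketch of the kind of above/below-the-line threshold analysis such tuning can involve; the alert data, columns, and candidate thresholds are invented for illustration.

```python
# Hypothetical scenario-tuning sketch: compare alert volume against productive alerts
# across candidate thresholds. Data and column names are illustrative assumptions.
import pandas as pd

alerts = pd.DataFrame({
    "amount":     [2500, 4800, 7200, 9100, 15000, 22000, 31000, 54000],
    "productive": [0,    0,    0,    1,    1,     0,     1,     1],  # investigator disposition
})

rows = []
for threshold in [2000, 5000, 10000, 25000]:
    fired = alerts[alerts["amount"] >= threshold]
    rows.append({
        "threshold": threshold,
        "alert_volume": len(fired),
        "productive_alerts": int(fired["productive"].sum()),
        "missed_productive": int(alerts.loc[alerts["amount"] < threshold, "productive"].sum()),
    })

print(pd.DataFrame(rows))  # trade-off between alert volume and missed activity
```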

 

Senior consultants may work on multiple project work streams, small engagement teams or as part of large, complex engagements.  The projects involve detailed data analysis for developing meaningful insights to address regulatory and business challenges across industries.  These challenges include helping clients address serious business concerns involving fraud, forensic investigations, litigation, and reorganization. We also advise businesses on valuation issues and other matters (e.g., FCPA, Anti-Money Laundering) to help them remain compliant in today's rigorous regulatory environment.  

 

Typically, Deloitte senior consultants will work on projects that will allow them to collaborate with attorneys, forensic accountants, clients, and other investigators. As part of these projects, you'll have the opportunity to demonstrate your technical, project management, and leadership skills in an environment that provides for outstanding growth and advancement. Senior consultants will also be looked upon to supervise, lead, and train junior associates in the aforementioned activities. Project teams work either onsite with the client team or from a Deloitte office; either location may necessitate regular travel, and client service projects require flexibility with significant travel requirements.

 

As a senior consultant you will be asked to make significant contributions in a relatively short time frame, take responsibility for the way you manage your time, develop your skill set, and assist in the delivery of innovative solutions to the client. Your career success is dependent on your ability to personalize your career path and identify and grow your internal and external network. We don't expect you to do this alone; our environment provides multiple opportunities for you to further develop your skills through our training curriculum and mentoring programs.

 

Sr. Consultant Responsibilities

 

Engagements include a wide variety of solutions, tailored to the client's needs, and are often performed in conjunction with industry and subject matter experts from throughout Deloitte. Responsibilities typical of a senior associate on one of these projects could include:

 

* Create, manage, and utilize high-performance relational databases (SQL Server 2012, Oracle 11.2, Microsoft Access, OLAP, and other proprietary software)
* Query and mine large data sets to discover transaction patterns, examine financial data and filter for targeted information - using traditional as well as predictive/advanced analytic methodologies
* Design and develop user-defined application modules (standard and web-based), perform data quality control, develop database reports and user interfaces, and normalize relational data (a minimal normalization and reporting sketch follows this list)
* Supervise, lead, and train junior associates in the aforementioned activities
* Provide input with respect to practice technology initiatives and investments
* Assist in preparing and presenting complex written and verbal materials (reports, findings and presentations)
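
As a rough illustration of the "normalize relational data" and "develop database reports" responsibilities above, here is a minimal sketch using Python's built-in sqlite3 module; the schema, sample rows, and report are assumptions, not client artifacts.

```python
# Hypothetical sketch: normalize a flat extract into customer and transaction tables,
# then run a simple report. Schema and data are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Flat, denormalized extract as it might arrive from a client system
cur.execute("""CREATE TABLE raw_extract (
    txn_id INTEGER, txn_amount REAL, customer_name TEXT, customer_country TEXT)""")
cur.executemany("INSERT INTO raw_extract VALUES (?, ?, ?, ?)",
                [(1, 950.0, "Acme Ltd", "US"),
                 (2, 120.5, "Acme Ltd", "US"),
                 (3, 78.0,  "Globex",   "DE")])

# Normalize: customers in one table, transactions referencing them by key
cur.execute("""CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY, name TEXT UNIQUE, country TEXT)""")
cur.execute("""CREATE TABLE txn (
    txn_id INTEGER PRIMARY KEY, amount REAL,
    customer_id INTEGER REFERENCES customer(customer_id))""")
cur.execute("""INSERT INTO customer (name, country)
               SELECT DISTINCT customer_name, customer_country FROM raw_extract""")
cur.execute("""INSERT INTO txn (txn_id, amount, customer_id)
               SELECT r.txn_id, r.txn_amount, c.customer_id
               FROM raw_extract r JOIN customer c ON c.name = r.customer_name""")

# A simple database report: transaction count and total amount per customer
for row in cur.execute("""SELECT c.name, COUNT(*) AS txn_count, SUM(t.amount) AS total
                          FROM txn t JOIN customer c USING (customer_id)
                          GROUP BY c.name"""):
    print(row)
```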

 

  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

As part of the Big Data ecosystem, the Comcast dx team defines and executes on data strategy to realize the promise of "data to the people." The Solution Manager plays a critical role in this effort by linking our customers' needs to the data ecosystem, both within the dx team and across the larger Comcast organization.

As a Solution Manager you:

-Lead client engagement, data discovery/analysis and business/process modeling efforts for the dx team.

-Operate across a number of technical domains with a focus in a primary area such as Product or Network Quality.

-Are a naturally curious problem solver with a passion for efficiency and execution excellence.

-Have performed in an analytical or technical role previously and have a strong understanding of the analytical workflow from requirements/data discovery through analysis to operationalization.

-Understand that to be successful in this role requires presence and confidence, and you have the ability to drive a team forward in the face of ambiguity and competing priorities.

-Understand the fundamental role that data plays in the competitive landscape and demonstrate a passion for data excellence.

-Embrace collaboration as a central tenet to being successful and understand the critical need to build trusting bonds both with our key stakeholders and delivery teams.

-Partner effectively with Solution Engineers, Architects, Product Owners and Tech Leads to define and scope work into delivery roadmaps.

-Ensure traceability and alignment of execution to critical priorities. Drive translation of business requirements into solution intent. This includes requirements identification and clarification, project goals and objective definition, scoping, estimation, and risk assessment.

-Anticipate needs, operate with a sense of urgency, and have the ability to adapt to change quickly.

-Fill resource gaps with hands-on work as needed.

-Should have the ability to write this job description better than we did :)

Qualifications:

-Bachelor's Degree (Advanced Degree Preferred) in engineering, mathematics, computer science, statistics, physics, economics, operations research or a related field; graduate study extremely helpful.

-Minimum of 7 years Tech Lead / Product Management / Project Management / Consulting experience, preferably in Data Warehousing, Big Data and Analytics.

-Experience with customers: managing consultant utilization, milestone success, setting and managing expectations, controlling outcomes and resolving customer issues.

-Understanding of data warehousing technologies and evolution, including relational databases (SQL, Oracle) and big data technology (Cloud, AWS, Hadoop), business intelligence and analytical tools (Tableau, Python, R), architectural strategies and previous hands on development and engineering (SDLC and Agile).

-Strong communication, presentation and meeting facilitation skills. Ability to positively represent yourself and the team.

-Must be a team-player and be able to work closely with technical teams and business users.

-Capable of building strong relationships with leaders across enterprise for both sources of data and consumers.

-The ability to travel (~25%) and to attend in-person client meetings.

-Expert in MS Office.

Comcast is an EOE/Veterans/Disabled/LGBT employer

  • West Chester, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

Position Summary

The dx Team is responsible for the data engineering part of Comcast. One of its major goals is to harmonize the data ingestion and consumption layers across Comcast. Creating enterprise data sources as a single version of the truth is a goal of the dx Team.

With moderate guidance, the Big Data Software Developer will develop (code/program), test, and debug ETL (Extract/Transform/Load) jobs that answer technically challenging business requirements (complex transformations, high data volumes). All work needs to be documented as part of release management.
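
As a rough illustration of the ETL development described above, here is a minimal, hypothetical PySpark sketch that extracts from a relational source over JDBC, applies a simple transformation, and loads a partitioned Hive table; the JDBC URL, table, and column names are placeholders rather than Comcast systems.

```python
# Hypothetical extract-transform-load sketch in PySpark. All connection details,
# table names, and columns are placeholders for illustration only.
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("example-etl")
         .enableHiveSupport()
         .getOrCreate())

# Extract: pull from a relational source over JDBC
source = (spark.read.format("jdbc")
          .option("url", "jdbc:oracle:thin:@//source-host:1521/SRC")  # placeholder
          .option("dbtable", "billing.usage_events")                  # placeholder
          .option("fetchsize", "10000")
          .load())

# Transform: standardize types, derive a partition column, drop duplicates
cleaned = (source
           .withColumn("event_ts", F.to_timestamp("event_ts"))
           .withColumn("event_date", F.to_date("event_ts"))
           .dropDuplicates(["event_id"]))

# Load: write into a partitioned table in the data lake
(cleaned.write
        .mode("overwrite")
        .partitionBy("event_date")
        .saveAsTable("dx_harmonized.usage_events"))                   # placeholder
```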

Employees at all levels are expected to:

-Understand our Operating Principles; make them the guidelines for how you do your job

-Own the customer experience-think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services

-Know your stuff-be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences

-Win as a team-make big things happen by working together and being open to new ideas

-Be an active part of the Net Promoter System-a way of working that brings more employee and customer feedback into the company-by joining huddles, making call backs and helping us elevate opportunities to do better for our customers

-Drive results and growth

-Respect and promote inclusion and diversity

-Do what's right for each other, our customers, investors and our communities

Core Responsibilities

-Play a key role as a senior-level engineer by implementing a solid, robust, extensible design that supports key business flows.

-Analyzes and determines data integration needs.

-Evaluates and plans software designs, test results and technical manuals using Big Data (Hadoop) ecosystem

-Build and maintain optimized ETL solutions to process and load source-system data into Hadoop using Sqoop or microservices, leveraging Hadoop tools.

-Reviews literature, current practices relevant to the solution of assigned projects in Data Warehousing/ Data Lake and Reporting areas

-Programs new software using Spark, Scala, Kafka, Sqoop, SQL

-Supports existing and new applications and customization of current applications

-Edits and reviews technical requirements documentation

-Displays knowledge of software engineering methodologies, concepts, skills and their application in the area of specified engineering specialty (like Data warehousing)

-Displays knowledge of, and ability to apply, process software design and redesign skills

-Displays in-depth knowledge of, and ability to apply, project management skills

-Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input, and keeps supervisor informed as required

-Consistent exercise of independent judgment and discretion in matters of significance

-Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) as necessary

-Other duties and responsibilities as assigned

Education Level

Bachelor's Degree or Equivalent

Years of Experience

2-5 years of experience working as a data integration developer, SQL developer, ETL developer, or Java/C# developer, or related experience, is required.

Field of Study

Computer Science, Engineering

Compliance

Comcast is an EEO/AA/Drug Free Workplace

Disclaimer

The above information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications

Additional Information

Daily Responsibilities:

Technologies used day to day

Linux, Hadoop, Spark, UC4, SQL, Linux Shell Scripting, BI reporting tools

Business Units -what group(s) does the role support

dx and dx business partner initiatives

Paired programming vs. individual tasks - what does that look like?

40% paired programming / 60% individual tasks

Business Purpose

Describe the core impact of this role and the team -project details

Ingest data from various data sources to create a harmonized layer of data

The dx Team is responsible for data engineering across Comcast; one of its major goals is to harmonize the data ingestion and consumption layers across Comcast.

Comcast is an EOE/Veterans/Disabled/LGBT employer

  • Água Branca, Brazil

We are in the process of migrating our monolithic application landscape to a microservice architecture.
We have also transitioned from a firefighting operations mode to an innovation and DevOps culture.
In this process, a new R&D area emerged, which consists of a small team of research and software engineers.


What we are doing



  • Building algorithms that help us with anticipated shipping and purchasing forecasts and protect us against system failure (a minimal forecasting sketch follows this list).

  • Using image recognition to give our users the highest possible convenience and the coolest features.

  • Using state-of-the-art game engines to build virtual reality into our customer experience.

  • Optimizing product search and building data-consistency monitoring.

  • Helping to build a large-scale architecture together with the entire IT team.
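
As a rough illustration of the purchasing-forecast work mentioned in the first item above, here is a minimal, hypothetical moving-average baseline in Python (pandas); the demand series, window, and safety factor are invented for illustration.

```python
# Hypothetical purchasing-forecast baseline: a simple moving average of daily demand.
import pandas as pd

daily_demand = pd.Series(
    [120, 135, 128, 150, 160, 155, 170, 180, 175, 190],
    index=pd.date_range("2023-01-01", periods=10, freq="D"),
    name="units_sold",
)

# 7-day moving average as a naive forecast for tomorrow's purchasing need
forecast_next_day = daily_demand.rolling(window=7).mean().iloc[-1]
safety_factor = 1.2  # assumed buffer against demand spikes or system failure

print(f"Suggested purchase quantity: {forecast_next_day * safety_factor:.0f} units")
```

A production version would use richer models, but the baseline shows the shape of the problem.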


The output will be nothing less than transforming the way e-commerce works and providing sustainable solutions.


What we are looking for


3+ years of professional or research experience



  • Comprehensive knowledge in statistics, probability and machine learning.

  • Passion for solving standard and non-standard mathematical problems.

  • Ability to explain data science insights to people with and without quantitative background.

  • Fluency in R, Python, or Julia.

  • Hands on/advanced experience with SQL.

  • Experience in data visualization.

  • Experience using Big Data toolstack (e.g. Hadoop, Hive, Spark, Cassandra, Hbase, or other non-relational DB) is a plus.

  • West Chester, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

Job Responsibilities

Position Summary

This is a senior position within the SE Data Service organization, reporting to the Director of Data Experience. The DX group is responsible for data services at Comcast; one of its major goals is to harmonize the data ingestion and consumption layers across Comcast.

It creates enterprise data sources (EDW, ODS, and transactional data sources) as a single version of the truth for data analysis across the Comcast business.

The position is especially responsible for working across multiple solution architects, business partners, programs, and projects to harmonize data platforms, processes, integrations, and data assets within Comcast.

Collaborate and partner with technical and business teams to present and define strategies, requirements, roadmaps, budgets, and solutions. Prepare and present gap analyses for technology and business solutions up to the VP level.

Manage and support all project-related architecture, design, development, and deployment of data-oriented integrations across platforms and projects in a matrix organization.

The Data Integration and Solution Architect must have a strong understanding of, and hands-on working knowledge of, the following software components: Linux, Shell Scripting, Informatica, Teradata, SQL, BTEQ, Hadoop Hive, Pig, Flume, Sqoop, Spark, Storm, Kafka, Accumulo, HBase, Java, and UC4

-Provides direction for diverse and complex initiatives and is accountable for a variety of tasks to architect and deliver data warehousing solutions that exceed customer expectations in content, usability, accuracy, reliability and performance while assuming a leading role within agile teams (both on-shore and off-shore)

-Interprets business strategy and develops organizational objectives to align with this strategy. Typically manages multiple teams of professionals.

-Development of End to End ETL data integration and solution architecture

-Work across technical and business teams to harmonize data assets

-Experience with SQL & BTEQ scripting: strong data management, data analysis, and performance tuning skills required.

-Experience with Hadoop Hive, Pig, Flume, Sqoop, Storm, Spark, Kafka, Accumulo, and HBase

-Experience with AWS services (EMR, Kinesis, S3, redshift, EC2)

-Candidates may have worked in one or more roles, e.g., data architect, data modeler, data integration developer/architect, or ETL developer/architect.

-Experienced solution architect able to design, develop, and implement ODS, EDW, data integration layers, etc.

-Experience with Presto query engine

-Creation of methods and procedures related to operations and administration activities.

-Managing and coordinating the DevOps process.

-Conduct operations-readiness and environment-compatibility reviews of any changes prior to deployment.

-Experience managing releases and process details

-Knowledge in data warehousing methodologies and best practices required.

-Strong verbal and written communication skills required.

-Effective interpersonal relations skills, ability to effectively collaborate with others and work as part of a team required.

-Skills in navigating a large organization in order to accomplish results required.

-Ability to initiate and follow through on complex projects of both short and long term duration required.

-Excellent organizational and time management skills required.

-Excellent analytical and problem solving skills required.

-Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input, and keeps supervisor informed as required.

-Participate on interdepartmental teams to support organizational goals

-Perform other related duties and tasks as assigned

-Punctual, regular, and consistent attendance

Required Skills/Experience:

Advanced degree in a technical discipline and/or management required. 10+ years of experience with data integration, data warehouse, and ETL architecture.

-Experience with Teradata and Hadoop Ecosystems.

-Experience working across multiple solution architects, business partners, programs, and projects to harmonize data platforms, processes, integrations, and data assets within Comcast.

-5+ years of experience working as a data integration or data solution architect (EDW, ODS, ETL) or in a similar role required.

-Five to seven years' experience leading data integration and development of ETL architecture using Linux, Informatica, Teradata, SQL, BTEQ, Hadoop Hive, Pig, Spark, Flume, Sqoop, and UC4

-Experience with AWS services (EMR, Kinesis, S3, redshift, EC2)

-Experience working as a solution architect supporting a Platform-as-a-Service organization.

-Experience with Presto query engine

-Requires understanding of the complete SDLC and experience with continuous integration, test-driven/behavior-driven development, and agile/scrum development methodologies

-Experience collaborating and partnering with technical and business teams to present and define strategies, requirements, roadmaps, budgets, and solutions.

-Experience presenting strategies, roadmaps, and gap analyses for technology and business solutions up to the VP level

-Managing and coordinating the DevOps process.

-Experience designing logical and physical architectures

-Experience managing teams of senior technologists

-Experience working in an Agile development methodology in a data warehouse environment

-Ability to work effectively across organizational boundaries

-Excellent oral, written, analytical, problem solving, and presentation skills

-Manage and coordinate 1-10 matrix resources.

-Experience with managed services and onshore/offshore development is a must

Desired Skills/ Experience

-Telecommunications experience: knowledge of telecommunications/cable billing and customer care systems, e.g., DST, CSG, AMDOC, etc.

-Knowledge of NoSQL platforms;

-Hadoop, Teradata, TOGAF, AWS Certified

Comcast is an EEO/AA/ Drug Free Workplace

Comcast is an EOE/Veterans/Disabled/LGBT employer

  • Hamburg, Germany

System Administration | Hamburg | Full-Time




Are you a data analyst/scientist who wants to develop a deep understanding of the infrastructure behind Big Data systems? We are offering a cross-functional position where you can use your existing data-processing skills and become the driving force behind the operation and further development of our data infrastructure.

You will be part of our System Administration department and work very closely with our data engineers and scientists to design and implement architecture changes for our Big Data systems. Our experienced system administrators will introduce you to our infrastructure. Once you are familiar with our data ecosystem, Puppet setup, Serveradmin, and our operational structure, you will take over responsibility for our Big Data infrastructure step by step.

With our data architecture, we process more than 1,000,000,000 game interaction events every day. Our Hadoop cluster runs on 1,500 CPUs, 5 TB of memory, and 1 PB of disk space.
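
As a rough illustration of working with this event stream, here is a minimal, hypothetical consumer sketch using the kafka-python client; the topic name, broker address, and event fields are assumptions, not InnoGames specifics.

```python
# Hypothetical Kafka consumer that keeps a rough running tally of events per game.
import json
from collections import Counter
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "game-interactions",                      # placeholder topic
    bootstrap_servers=["kafka-1:9092"],       # placeholder broker
    group_id="ops-monitoring",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
    auto_offset_reset="latest",
)

events_per_game = Counter()
for message in consumer:
    event = message.value
    events_per_game[event.get("game_id", "unknown")] += 1
    total = sum(events_per_game.values())
    if total % 100_000 == 0:
        print(total, dict(events_per_game))   # crude monitoring output
```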



Your profile:



  • Several years of professional experience as a Data Analyst, Data Scientist, or Data Engineer

  • You understand distributed systems (e.g. CAP, Kappa, Lambda) and the interactions happening in the background (e.g. rebalancing)

  • You know stream processing, Data integration and processing (ETL) and can analyze and optimize SQL queries

  • Professional experience in Java and a Scripting language, ideally Python

  • Open and friendly communication style and very good English skills

  • You know your way around a Linux command line

  • Knowledge of server hardware and, ideally, of a configuration management tool (like Puppet, Chef or Ansible) would be a huge plus



Your mission:



  • Work together with our Data Engineers and Data Scientists

  • Discuss and implement new technologies

  • Maintain the systems, update, monitor, and debug them

  • Automate and improve the environment



Our technologies:



  • The Hadoop ecosystem (HDFS, Hive, Impala, Spark)

  • Stream processing (Kafka, Flink)

  • Custom data applications with Java, PHP and go

  • Jenkins for job scheduling and build processes

  • Debian and Puppet for configuration management

  • Supported by Nagios, Graphite, Grafana and Serveradmin



Why join us?



  • Be part of a great team in an international environment in a healthy and stable growing company

  • Choose your preferred device (Linux, Mac or even Windows) for your comfortable workplace

  • We will actively support your further development and give you all needed resources to evaluate new technologies, participate in open source communities or improve your soft skills

  • Competitive compensation and an atmosphere to empower creative thinking and strong results

  • Exceptional benefits ranging from flawless relocation support to company gym, smartphone or tablet of your own choice for personal use, roof terrace with BBQ and much more



Excited to start your journey with InnoGames and join our dynamic team as a Data Engineer Base Technologies (for DevOps)? We look forward to receiving your application as well as your salary expectations and earliest possible start date through our online application form. Isabella Dettlaff would be happy to answer any questions you may have.

InnoGames, based in Hamburg, is one of the leading developers and publishers of online games with more than 200 million registered players around the world. Currently, more than 400 people from 30 nations are working in the Hamburg-based headquarters. We have been characterized by dynamic growth ever since the company was founded in 2007. In order to further expand our success and to realize new projects, we are constantly looking for young talents, experienced professionals, and creative thinkers.





Isabella


Talent Acquisition Manager




Phone +494078893350

  • London Borough of Richmond upon Thames, UK
Gumtree, part of eBay Classifieds Group, is the UK’s leading classifieds site with over 14.5M unique visitors every month and over 9.5M app downloads. Founded in London in 2000, on Gumtree you can buy and sell everything from cars to home items and find jobs, local services, community events and even somewhere to live.



We’re looking for a Head of Engineering to join the Product Development team, reporting to the CTO. You will play a key role within engineering leadership, responsible for hiring, mentoring and managing 15-20 people, while maintaining strong development standards.



Based in beautiful Richmond, London, just by the riverside, you will join a team of over 40 engineers constantly innovating and delivering value to millions of people every single day. We work in an Agile environment in cross-functional squads, building features using continuous integration and constant testing with our users.



In this role you will be responsible for:

People management of 15-20 back-end and front-end engineers, Scrum Masters and QA engineers.

Lead coaching, mentoring, career development and hiring across the team.

Work closely with product management to meet company objectives and goals.

Partner with leaders from across the business, including other eBay Classifieds Group brands worldwide.

Continually innovate and optimise our technology, people and performance.

Promote clean, testable and maintainable code that is modular and scalable.



We’re looking for:

Experience managing large teams of engineers, testers and managers.

Knowledge and experience of software engineering and standard methodologies.

Familiar with organising work to follow Agile methodologies.

Excellent interpersonal skills with both technical and non-technical people.

Real passion for developing your people to ensure they succeed.



Although this is not a hands-on role, you should have strong development experience, ideally in some of the tools and technologies that we currently use:

Java, Spring, Scala, Akka, Elasticsearch, PostgreSQL, MongoDB, Redis, Hadoop.

React (with Flux), JavaScript ES6, SASS, HTML5.

Jenkins, JUnit, ScalaTest, TestNG, Mockito, Cucumber.



Benefits

Flexible working patterns and occasional work-from-home supported.

Full medical, dental and vision healthcare cover.

Pension scheme.

Life and disability insurance.

Childcare vouchers, parental leave policy and Cyclescheme available.

Networking, learning and global travel opportunities across eBay Classifieds Group.

Regular Tech Talks, Hackathons and workshops.

Phenomenal working environment with height-adjustable desks and Aeron chairs.

Free breakfast, fruit, snacks, soft drinks, coffee and tea.

Free on-site massages, yoga, pilates and fitness bootcamps.



Interview process

Introductory phone call with CTO.

Face-to-face interviews including meetings with a senior engineer, engineering manager and CTO.
  • London Borough of Ealing, UK

Cognizant Big Data Practice:


The Big Data service line is part of the Cognizant Digital Business/Artificial Intelligence and Analytics business unit. It is responsible for delivering digital data platforms and smart data lakes on premise and in the cloud using different Hadoop distributions, NoSQL technologies, and data virtualization products available in the market, and it has built a proprietary data platform to deliver end-to-end data services, automation, and framework-driven data processing modules. We are a platinum/gold partner for the majority of platform vendors, which gives us seamless access to product upgrades, discussions, events, etc.


The members of the team work in a variety of industries and use a diverse set of tools to process complex data types such as IoT and social data, as well as data from larger operational systems like SAP, Salesforce, and mainframes. The work involves accessing a range of data stored in disparate systems, integrating that data, and providing it in the necessary formats to perform data mining, answer specific business questions, and identify unknown trends and relationships in the data.


Few Snippets:



  • We are one of the fastest-growing and top revenue-generating service lines within Cognizant

  • More than 170 Big Data engagements across the globe, with over 2,000 Big Data consultants with expertise in various industry products & distributions

  • A repository of 170+ use cases with solutions for different Digital & IoT scenarios

  • Rated as a Leader by Everest’s PEAK matrix in their “Big Data assessment” for  RCGTH, HC, Banking and Insurance domains

  • “Leader”  Gartner Magic Quadrant for Data & Business Analytics Services, Worldwide 


As a senior technologist, you will be part of the strategic drive to lead multi-discipline teams through the full delivery lifecycle of complex data products and pipelines, with a clear understanding of large-scale data processing and data science solutions. You will deliver use cases for one of our customers and develop and implement practical solutions to give this customer a competitive edge within the enterprise. You will have breadth of experience in both engineering and architecture across technology disciplines and the unique challenges of each, including software development, automated test and quality assurance, data, and integration.


Your role


Delivering high-quality, innovative, data-driven solutions to our client’s advanced analytical needs. Working within a team on the client’s site, this will include understanding the client’s pain points, designing an analytical approach, implementing a solution, ensuring it is of high quality, and leading and mentoring multi-discipline technology teams.


Key Responsibilities:



  • Work at a client site as a member of an experienced onsite consulting/delivery team, developing and providing architectures for a highly resilient, fault-tolerant, horizontally scalable data platform.

  • Build multiple layers within the platform around data ingestion, integration, registration, metadata management, error handling & auditing, and data provisioning (a minimal ingestion-layer sketch follows this list).

  • Working with Architecture to refine ideas on the Analytics Architecture and help with standards and guidelines.

  • Building a Big Data Engineering community and helping customers/clients to build centre of excellence for engineering.

  • Working closely with the Lead Software Engineers in your area of responsibility to champion the core tenets of engineering at client

  • Ensuring development teams deliver the highest quality data products through the adoption and continuous improvement of patterns and practices, tools and frameworks and processes. 

  • Applying years of experience and expertise to help teams resolve and overcome technical challenges of any size or complexity and assist where necessary to accelerate problem resolution.
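
As a rough illustration of the ingestion, registration, error handling, and provisioning layers mentioned above, here is a minimal, hypothetical PySpark sketch; the paths, schema, and quality rule are assumptions for illustration only.

```python
# Hypothetical ingestion-layer slice: land a source file, stamp registration/audit
# metadata, divert bad rows to an error zone, and write partitioned Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingestion-layer-example").getOrCreate()

raw = (spark.read
       .option("header", "true")
       .csv("/landing/sales/2023-06-01/*.csv"))                  # placeholder path

registered = (raw
              .withColumn("ingest_ts", F.current_timestamp())    # audit column
              .withColumn("source_system", F.lit("sales_feed"))  # registration tag
              .withColumn("load_date", F.current_date()))

# Error & auditing: divert rows failing a simple quality rule to an error zone
valid = registered.filter(F.col("order_id").isNotNull())
rejects = registered.filter(F.col("order_id").isNull())
rejects.write.mode("append").parquet("/lake/errors/sales_feed")  # placeholder

# Data provisioning: partitioned, columnar output for downstream consumers
(valid.write
      .mode("append")
      .partitionBy("load_date")
      .parquet("/lake/curated/sales_feed"))                      # placeholder
```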


Job Requirements  Essential Skills:



  • Proven, Strong Data Processing skillset with experience in Hadoop tools and techniques. For example (not exhaustive):

  • Spark processing, Streaming and performance tuning

  • Kafka Real time messaging

  • HBase modelling and development

  • HDFS file formats and partitioning, e.g., Parquet, Avro, etc.

  • Impala/Hive

  • Unix Shell Scripting

  • Proficiency in Scala

  • Working proficiency in developmental toolsets like Eclipse, IntelliJ

  • Exposure/competence with Agile Development approach

  • Git & Continuous integration tools such as Jenkins, TeamCity

  • Multi-threaded, OOPS Programming

  • Jenkins/Maven

  • FindBugs, Sonar, JUNIT, Performance, Memory Management

  • Strong experience in delivering Data Solutions using Big-Data technologies and cloud platforms.

  • Strong work experience in Azure Cloud platform

  • Expert in at least one of Java, Scala, Python with knowledge of the others

  • Experience in delivering big data solutions using a leading Hadoop distribution like Hortonworks, Cloudera or MapR

  • Knowledge of RDBMS, ETL and Data warehouse technologies 

  • Testing frameworks like Junit, Scalatest, Mock testing

  • Nice-to-have skills:

  • Knowledge and experience in delivering real time solutions on sinks like Cassandra, MongoDB or equivalent NoSQL databases

  • Container technologies like Docker

  • Knowledge of API and Microservice architectures 

  • Knowledge of advanced analytics and insights techniques (e.g. predictive analytics, machine learning, segmentation)

  • Knowledge of deep learning frameworks like TensorFlow, Keras

  • Knowledge of machine learning libraries like MLLIB, sklearn

  • Strong domain experience in retail/e-commerce,  Communications & Technology verticals. 


Qualifications:  University Degree with a specialization in Computer Science, Mathematics 

  • San Mateo, CA
Overview


Lumiata is seeking a Software Engineer with a focus on handling interesting data problems to join our nimble and growing team. Our new team member will play a critical role in helping develop and maintain Lumiata’s core data science infrastructure. You will be working on and helping to develop automated pipelines for training and deploying machine learning models and building high performance systems to understand complex patient data.



To apply, please visit our main jobs site (https://lumiata.bamboohr.com/jobs/view.php?id=5) and our Talent Acquisition team will review and follow up. Thanks! 


As an Engineer at Lumiata you will learn in detail how medical information flows from patient all the way through to meaningful insights and the many intricacies involved along the way including but not limited to:



  • How AI and machine learning can transform healthcare.

  • How to build and deploy novel machine learning products.

  • How medical information is stored and communicated between different actors in the healthcare system.

  • What modern, open standards have been developed to better communicate and represent medical data.

  • What specific standards must be respected and how to ensure compliance to handle sensitive healthcare data including HIPAA, SOC2 and HITRUST among others.

  • How to apply TensorFlow, Apache Spark, and other cutting edge tools in a production environment.


We are a diverse, international, creative team; join us in building a better medical system for everyone by tackling looming health problems using large scale machine learning.


Key Responsibilities



  • Research and design of algorithms to solve problems using medical & healthcare data.

  • Building and developing pipelines to:

  • Handle large volumes of heterogeneous medical data.

  • Train/update/version/maintain production machine learning systems (a minimal training-and-versioning sketch follows this list).

  • Expose multilayer machine learning systems to customers while maintaining first-class privacy and security standards.

  • Extract and expose external public data sources to enhance medical machine learning models.
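
As a rough illustration of the train/update/version responsibilities above, here is a minimal, hypothetical sketch in Python (scikit-learn and joblib) of a training step that versions its model artifact; the synthetic data, model choice, and artifact layout are assumptions, not Lumiata's pipeline.

```python
# Hypothetical train-and-version step: fit a model on toy data, record a test metric,
# and save a timestamp-versioned artifact so production systems can roll forward/back.
from datetime import datetime, timezone

import joblib
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)                 # toy stand-in for de-identified features
X = rng.normal(size=(1000, 20))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

version = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
joblib.dump({"model": model, "test_auc": float(auc)}, f"risk_model_{version}.joblib")
print(f"saved risk_model_{version}.joblib with test AUC {auc:.3f}")
```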


 Qualifications 



  • Strong developer skills building python infrastructure and applications

  • Solid working experience building applications in Scala or other similar language

  • Working experience with Hadoop, Apache Spark and distributed databases preferred

  • Development of products using distributed computing & big data products will be a huge plus

  • Experience in high concurrency platforms, graph databases, applied mathematics - a plus.

  • B.S. or MS degree in Computer Science or related fields.


Lumiata delivers AI powered health analytics to make healthcare smarter. At the intersection of clinical knowledge, data science and machine learning, Lumiata provides cost and risk analytics to health plans, care providers and employers.

Based in Silicon Valley, Lumiata’s team is comprised of data scientists, engineers and industry experts. Lumiata is backed by Khosla Ventures, BlueCross BlueShield Venture Fund, Intel Capital, Sandbox Industries and other leaders in healthcare and AI.


Diversity creates a healthier atmosphere: Lumiata is an Equal Employment Opportunity/Affirmative Action employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin, protected veteran status, disability status, sexual orientation, gender identity or expression, marital status, genetic information, or any other characteristic protected by law.

  • London, UK
Apple Media Products is the team behind the App Store, Apple Music, iTunes, and many other high profile products on iPhone, Mac and AppleTV.
Our Data Engineering team is looking for talented, performance-savvy engineers to build out the big data platform and services which power many of these customer features, existing and new.


Key Qualifications:
Significant experience in designing, implementing and supporting highly scalable data systems and services in Java
Degree in Computer Science or related discipline or meaningful career experience
Experience in Hadoop/Spark or Kafka

Description:
This is your opportunity to help engineer highly visible global-scale systems with petabytes of data, supporting hundreds of millions of users.

Experience of any of the following is an advantage:

* Hadoop-ecosystem technologies in particular MapReduce, Spark, Hive, YARN/MR2

* Building and running large-scale data pipelines, including distributed messaging such as Kafka, data ingest to/from multiple sources to feed batch and near-realtime/streaming compute components (see the sketch after this list)

* Data-modelling and data-architecture optimised for big data patterns (warehousing concepts; efficient storage and query on HDFS; data security and privacy techniques)

* Knowledgeable about distributed storage and network resources, at the level of hosts, clusters and DCs, to troubleshoot and prevent performance issues
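
As a rough illustration of a Kafka-fed near-realtime pipeline leg of the kind described above, here is a minimal, hypothetical PySpark Structured Streaming sketch (it assumes the spark-sql-kafka connector is on the classpath); the broker, topic, schema, and paths are placeholders, not Apple systems.

```python
# Hypothetical streaming ingest: parse JSON events from Kafka and land them as
# date-partitioned Parquet for batch consumers. All names are placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import LongType, StringType, StructField, StructType

spark = SparkSession.builder.appName("stream-ingest-example").getOrCreate()

event_schema = StructType([
    StructField("user_id", StringType()),
    StructField("item_id", StringType()),
    StructField("ts", LongType()),            # epoch seconds
])

stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "kafka-1:9092")   # placeholder
          .option("subscribe", "play-events")                  # placeholder
          .load())

parsed = (stream
          .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
          .select("e.*")
          .withColumn("event_date", F.to_date(F.from_unixtime("ts"))))

query = (parsed.writeStream
         .format("parquet")
         .option("path", "/data/events")                       # placeholder
         .option("checkpointLocation", "/data/_checkpoints/events")
         .partitionBy("event_date")
         .outputMode("append")
         .start())
query.awaitTermination()
```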

Apple is an equal opportunity employer that is committed to inclusion and diversity. We also take affirmative action to offer employment and advancement opportunities to all applicants, including minorities, women, protected veterans, and individuals with disabilities. Apple will not discriminate or retaliate against applicants who inquire about, disclose, or discuss their compensation or that of other applicants.

  • Hamburg, Germany

We are a software company and a community of passionate, purpose-led individuals. We think disruptively to deliver technology to address our clients' toughest challenges, all while seeking to revolutionize the IT industry and create positive social change.


You should be a passionate developer / engineer, who cares about software excellence. You love learning new skills and tools and approach software development with best practices in mind.

What really excites you is problem solving and delivering innovative technology solutions that create value for people and organisations. There are opportunities to make an impact beyond your day-day project too - our consultants regularly speak at conferences, get published and provide thought leadership as specialists.

As a Senior Developer at ThoughtWorks, you will do much more than just coding. You will:



  • Learn something new every day and work with creators of tools, blogs and books who have inspired you

  • Work on large-scale, custom-designed software development projects using languages such as Java, Scala, JavaScript and Python

  • Champion the latest best practices

  • Work in a dynamic, collaborative, transparent, non-hierarchical, and ego-free culture

  • Develop your career outside of the confinements of a traditional career path by focusing on what you’re passionate about rather than a predetermined one-size-fits-all plan

  • Help to grow the next generation of developers and have a positive impact on the industry






Here's what you'll bring:



  • Hands-on development experience with a broad mix of languages and technologies

  • Passion for software engineering and craftsman-like coding prowess

  • Great OO skills, including strong design patterns knowledge

  • Experience with development best practices, such as Continuous Delivery or TDD

  • Ability to work in a variety of client settings and in a team-oriented, collaborative environment

  • Open to travel across Germany 






"Stop Careering. Start Contributing."


At ThoughtWorks we promote diversity in all its forms and reject discrimination and inequality. We proudly, passionately and actively strive to make both ThoughtWorks and our industry more reflective and inclusive of the society that we serve.


What are you waiting for?


So, you can see yourself at ThoughtWorks? Then tell us who you are and let us know why you want to join by sending us your LinkedIn/Xing profile or CV today!