OnlyDataJobs.com

MRE Consulting, Ltd.
  • Houston, TX

Candidates for this U.S. position must be a U.S. citizen or national, or an alien admitted as a permanent resident, refugee, asylee or temporary resident under 8 U.S.C. 1160(a) or 1255(a)(1). Individuals with temporary visas such as A, B, C, D, E, F, G, H, I, J, L, M, NATO, O, P, Q, R or TN, or who need sponsorship for work authorization in the United States now or in the future, are not eligible for hire.


Our client is seeking to hire an Enterprise Data Architect. The position reports to the VP of IT. The Data Architect is responsible for providing a standard common business vocabulary across all applications and data elements, expressing and defining strategic data requirements, outlining high-level integrated designs to meet the various business unit requirements, and aligning with the overall enterprise strategy and related business architecture.


Essential Duties & Responsibilities:
  • Provide insight and strategies for changing database storage and utilization requirements for the company, and provide direction on potential solutions
  • Assist in the definition and implementation of a federated data model consisting of a mixture of multi-cloud and on-premises environments to support operations and business strategies
  • Assist in managing vendor cloud environments and multi-cloud database connectivity
  • Analyze structural data requirements for new and existing applications and platforms
  • Submit reports to management that outline the changing data needs of the company, and develop related solutions
  • Align database implementation methods to ensure they support company policies and any external regulations
  • Interpret data, analyze results and provide ongoing reporting and support
  • Implement data collection systems and other strategies that optimize efficiency and data quality
  • Acquire available data sources and maintain data systems
  • Identify, analyze, and interpret trends or patterns in data sets
  • Scrub data as needed; review reports, printouts, and performance indicators to identify inconsistencies
  • Develop database design and architecture documentation for the management and executive teams
  • Monitor various database systems to confirm optimal performance standards are met
  • Contribute to content updates within resource portals and other operational needs
  • Assist in presentations and interpretations of analytical findings, and actively participate in discussions of results, internally and externally
  • Help maintain the integrity and security of the company database
  • Ensure transactional activities are processed in accordance with standard operating procedures

The employee will be on call 24 hours a day, 7 days per week.
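The data-scrubbing duty above (normalize values, drop duplicates, flag inconsistencies) can be sketched in a few lines. This is an illustrative example only, not code from the posting; the `region` field and the record layout are invented for the demonstration.

```python
# Minimal data-scrubbing pass: normalize fields, drop exact duplicates,
# and flag rows with missing values for manual review.

def scrub_records(records):
    """Clean a list of dict records and report rows that need review."""
    seen = set()
    clean, flagged = [], []
    for row in records:
        # Normalize: trim whitespace, unify case on the (hypothetical) region field
        normalized = {k: v.strip() if isinstance(v, str) else v
                      for k, v in row.items()}
        if isinstance(normalized.get("region"), str):
            normalized["region"] = normalized["region"].upper()
        # Flag rows with missing required values
        if any(v in (None, "") for v in normalized.values()):
            flagged.append(normalized)
            continue
        # Drop exact duplicates (after normalization)
        key = tuple(sorted(normalized.items()))
        if key in seen:
            continue
        seen.add(key)
        clean.append(normalized)
    return clean, flagged

rows = [
    {"id": 1, "region": " south ", "value": 10},
    {"id": 1, "region": "SOUTH", "value": 10},  # duplicate once normalized
    {"id": 2, "region": "", "value": 7},        # missing region -> flagged
]
clean, flagged = scrub_records(rows)
print(len(clean), len(flagged))  # -> 1 1
```

A production version would typically run as a scheduled job against the database rather than over in-memory lists, but the normalize/flag/deduplicate structure is the same.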


Qualifications
Minimum of 10+ years of experience.
Proven work experience as a Data Architect, Data Scientist, or similar role
In-depth understanding of database structure principles
Strong knowledge of data mining and segmentation techniques
Expertise in MS SQL and other database platforms
Familiarity with data visualization tools
Experience with formal Enterprise Architecture tools (such as BiZZdesign)
Experience in managing cloud-based environments
Aptitude for data models, data mining, and cloud-based applications
Advanced analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
Adept at report writing and presenting findings
Proficiency in systems support and monitoring
Experience with complex data structures in the Oil and Gas Industry a plus


Education 
A bachelor's degree in Computer Science, Math, Statistics, or a related quantitative field is required.


Travel Requirements
The percentage of travel anticipated for this position is 10-20%, including overnight extended stays.


All qualified candidates should apply by providing a current Word resume and denoting skill set experience as it relates to this requirement.

Vector Consulting, Inc
  • Atlanta, GA
 

Our Government client is looking for an experienced ETL Developer on a renewable contract in Atlanta, GA.

Position: ETL Developer

The desired candidate will be responsible for design, development, testing, maintenance and support of complex data extract, transformation and load (ETL) programs for an Enterprise Data Warehouse. An understanding of how complex data should be transformed from the source and loaded into the data warehouse is a critical part of this job.

  • Deep hands-on experience with OBIEE RPD and BIP reporting data models, and development for seamless cross-functional and cross-system data reporting
  • Expertise and solid experience with BI tools: OBIEE, Oracle Data Visualization and Power BI
  • Strong Informatica technical knowledge in the design, development and management of complex Informatica mappings, sessions and workflows using the Informatica Designer components: Source Analyzer, Warehouse Designer, Mapping Designer & Mapplet Designer, Transformation Developer, Workflow Manager and Workflow Monitor
  • Strong programming skills, relational database skills with expertise in Advanced SQL and PL/SQL, indexing and query tuning
  • Experience implementing advanced analytical models in Python or R
  • Experienced in Business Intelligence and Data warehousing concepts and methodologies.
  • Extensive experience in data analysis and root cause analysis and proven problem solving and analytical thinking capabilities.
  • Analytical capabilities to slice and dice data and display data in reports for best user experience.
  • Demonstrated ability to review business processes and translate into BI reporting and analysis solutions.
  • Ability to follow Software Development Lifecycle (SDLC) process and should be able to work under any project management methodologies used.
  • Ability to follow best practices and standards.
  • Ability to identify BI application performance bottlenecks and tune.
  • Ability to work quickly and accurately under pressure and project time constraints
  • Ability to prioritize workload and work with minimal supervision
  • Basic understanding of software engineering principles and skills working on Unix/Linux/Windows Operating systems, Version Control and Office software
  • Exposure to data modeling using Star/Snowflake schema design, data marts, relational and dimensional data modeling, slowly changing dimensions, fact and dimension tables, physical and logical data modeling, and big data technologies
  • Experience with Big Data Lake / Hadoop implementations

 Required Qualifications:

  • A bachelor's degree in Computer Science or a related field
  • 6 to 10 years of experience working with OBIEE / Data Visualization / Informatica / Python
  • Ability to design and develop complex Informatica mappings, sessions, workflows and identify areas of optimizations
  • Experience with Oracle RDBMS 12c
  • Effective communication skills (both oral and written) and the ability to work effectively in a team environment are required
  • Proven ability and desire to mentor/coach others in a team environment
  • Strong analytical, problem solving and presentation skills.

Preferred Qualifications:

  • Working knowledge with Informatica Change Data Capture installed on DB2 z/OS
  • Working knowledge of Informatica Power Exchange
  • Experience with relational, multidimensional and OLAP techniques and technology
  • Experience with OBIEE tools version 10.X
  • Experience with Visualization tools like MS Power BI, Tableau, Oracle DVD
  • Experience with Python building predictive models

Soft Skills:

  • Strong written and oral communication skills in English Language
  • Ability to work with Business and communicate technical solution to solve business problems

About Vector:

Vector Consulting, Inc. (headquartered in Atlanta) is an IT Talent Acquisition Solutions firm committed to delivering results. Since our founding in 1990, we have been partnering with our customers, understanding their business, and developing solutions with a commitment to quality, reliability and value. Our continuing growth has been, and continues to be, built around successful relationships that are based on our organization's operating philosophy and commitment to People, Partnerships, Purpose and Performance - THE VECTOR WAY.

Visa
  • Austin, TX
Company Description
Visa operates the world's largest retail electronic payments network and is one of the most recognized global financial services brands. Visa facilitates global commerce through the transfer of value and information among financial institutions, merchants, consumers, businesses and government entities. We offer a range of branded payment product platforms, which our financial institution clients use to develop and offer credit, charge, deferred debit, prepaid and cash access programs to cardholders. Visa's card platforms provide consumers, businesses, merchants and government entities with a secure, convenient and reliable way to pay and be paid in 170 countries and territories.
Job Description
At Visa University, our mission is to turn our learning data into insights and get a deep understanding of how people use our resources to impact the product, strategy and direction of Visa University. In order to help us achieve this we are looking for someone who can build and scale an efficient analytics data suite and also deliver impactful dashboards and visualizations to track strategic initiatives and enable self-service insight delivery. The Staff Software Engineer, Learning & Development Technology is an individual contributor role within Corporate IT in our Austin-based Technology Hub. In this role you will participate in design, development, and technology delivery projects with many leadership opportunities. Additionally, this position provides application administration and end-user support services. There will be significant collaboration with business partners, multiple Visa IT teams and third-party vendors. The portfolio includes SaaS and hosted packaged applications as well as multiple content providers such as Pathgather (Degreed), Cornerstone, Watershed, Pluralsight, Lynda, Safari, and many others.
The ideal candidate will bring energy and enthusiasm to evolve our learning platforms, be able to easily understand business goals/requirements and be forward thinking to identify opportunities that may be effectively resolved with technology solutions. We believe in leading by example, ownership with high standards and being curious and creative. We are looking for an expert in business intelligence, data visualization and analytics to join the Visa University family and help drive a data-first culture across learning.
Responsibilities
  • Engage with product managers, design team and student experience team in Visa University to ensure that the right information is available and accessible to study user behavior, to build and track key metrics, to understand product performance and to fuel the analysis of experiments
  • Build lasting solutions and datasets to surface critical data and performance metrics and optimize products
  • Build and own the analytics layer of our data environment to make data standardized and easily accessible
  • Design, build, maintain and iterate a suite of visual dashboards to track key metrics and enable self-service data discovery
  • Participate in technology project delivery activities such as business requirement collaboration, estimation, conceptual approach, design, development, test case preparation, unit/integration test execution, support process documentation, and status updates
  • Participate in vendor demo and technical deep dive sessions for upcoming projects
  • Collaborate with, and mentor, data engineers to build efficient data pipelines and impactful visualizations
Qualifications
  • Minimum 8 years of experience in a business intelligence, data analysis or data visualization role and a degree in science, computer science, statistics, economics, mathematics, or similar
  • Significant experience in designing analytical data layers and in conducting ETL with very large and complex data sets
  • Expertise with Tableau desktop software (techniques such as LOD calculations, calculated fields, table calculations, and dashboard actions)
  • Expert in data visualization
  • High level of ability in JSON, SQL
  • Experience with Python is a must; experience with data science libraries (NumPy, Pandas, SciPy, scikit-learn, NLTK) and deep learning (Keras) is a plus
  • Experience with Machine Learning algorithms (Linear Regression, Multiple Regression, Decision Trees, Random Forest, Logistic Regression, Naive Bayes, SVM, K-means, K-nearest neighbor, Hierarchical Clustering)
  • Experience with HTML and JavaScript
  • Basic SFTP and encryption knowledge
  • Experience with Excel (VLOOKUPs, pivots, macros, etc.)
  • Experience with xAPI is a plus
  • Ability to leverage HR systems such as Workday and Salesforce to execute the above responsibilities
  • Understanding of statistical analysis, quantitative aptitude and the ability to gather and interpret data and information
  • You have a strong business sense and you are able to translate business problems to data driven solutions with minimal oversight
  • You are a communicative person who values building strong relationships with colleagues and stakeholders, enjoys mentoring and teaching others and you have the ability to explain complex topics in simple terms
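Of the classical algorithms listed in the qualifications above, k-nearest neighbor is the simplest to show end to end. The sketch below is purely illustrative (toy data, pure Python rather than scikit-learn, which the posting actually names): classify a point by majority vote among its k closest labeled neighbors.

```python
# k-nearest-neighbor classification: majority vote among the k closest
# training points, using Euclidean distance.
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Toy training set: two well-separated clusters labeled "a" and "b"
train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
         ((5.0, 5.0), "b"), ((5.1, 4.9), "b"), ((4.8, 5.2), "b")]
print(knn_predict(train, (4.9, 5.0)))  # -> b
print(knn_predict(train, (0.05, 0.1)))  # -> a
```

In practice one would reach for `sklearn.neighbors.KNeighborsClassifier`, which adds distance weighting, efficient neighbor search, and multi-dimensional scaling of features.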
Additional Information
All your information will be kept confidential according to EEO guidelines.
Job Number: REF15081Q
Expedia, Inc.
  • Bellevue, WA

We are seeking a deeply experienced technical leader to lead the next generation of engineering investments and culture for the GCO Customer Care Platform (CCP). The technical leader in this role will help design, engineer and drive implementation of critical pieces of the EG-wide architecture (platform and applications) for customer care. These areas include, but are not limited to, unified voice support, partner onboarding with configurable rules, a virtual agent programming model for all partners, and intelligent fulfillment. In addition, a key focus of this leader's role will be to grow and mentor junior software engineers in GCO, with a focus on building out a '2020 world-class engineering excellence' vision and culture.


What you’ll do:



  • Deep technology leadership (design, implementation, and execution for the following):

  • Ship next-gen EG-wide architecture (platform and applications) that enable 90% of automated self-service journeys with voice as a first-class channel from day zero

  • Design and ship a VA (Virtual Agent) Programming Model that enables partners to stand up intelligent virtual agents on CCP declaratively in minutes

  • Enable brand partners to onboard their own identity providers onto CCP

  • Enable partners to configure their workflows and business rules for their Virtual Agents

  • Programming Model for Intelligent actions in the Fulfillment layer

  • Integration of Context and Query as first-class entities into the Virtual Agent

  • Cross-Group Collaboration and Influence

  • Work with company-wide initiatives across AI Labs and BeX to build out a best-of-breed conversational platform for EG-wide apps

  • Engage with and translate internal and external partner requirements into platform investments for effective onboarding of customers

  • Represent GCO's Technical Architecture at senior leadership meetings (eCP and EG) to influence and bring back enhancements to improve CCP



Help land GCO 2020 Engineering and Operational Excellence Goals

Mentor junior developers on platform engineering excellence dimensions (re-usable patterns, extensibility, configurability, scalability, performance, and design / implementation of core platform pieces)

Help develop a level of engineering muscle across GCO that becomes an asset for EG (as a provider of platform service as well as for talent)

Who you are:



  • BS or MS in Computer Science

  • 20 years of experience designing and developing complex, mission-critical, distributed software systems on a variety of platforms in high-tech industries

  • Hands on experience in designing, developing, and delivering (shipping) V1 (version one) MVP enterprise software products and solutions in a technical (engineering and architecture) capacity

  • Experience in building strong relationships with technology partners, customers, and getting closure on issues including delivering on time and to specification

  • Skills: Linux/ Windows/VMS, Scala, Java, Python, C#, C++, Object Oriented Design (OOD), Spark, Kafka, REST/Web Services, Distributed Systems, Reliable and scalable transaction processing systems (HBase, Microsoft SQL, Oracle, Rdb)

  • Nice to have: Experience in building highly scalable real-time processing platforms that host machine learning algorithms for Guided / Prescriptive Learning

  • Identifies and solves problems at the company level while influencing product lines

  • Provides technical leadership in difficult times or serious crises

  • Key strategic player to long-term business strategy and vision

  • Recognized as an industry expert and as a mentor and leader at the company

  • Provides strategic influence across groups, projects and products

  • Provides long term product strategy and vision through group level efforts

  • Drive for results: Is sought out to lead company-wide initiatives that deliver cross-cutting lift to the organization and provides leadership in a crisis and is a key player in long-term business strategy and vision

  • Technical/Functional skills: Proves credentials as industry experts by inventing and delivering transformational technology/direction and helps drive change beyond the company and across the industry

  • Has the vision to impact long-term product/technology horizon to transform the entire industry



Why join us:

Expedia Group recognizes our success is dependent on the success of our people.  We are the world's travel platform, made up of the most knowledgeable, passionate, and creative people in our business.  Our brands recognize the power of travel to break down barriers and make people's lives better – that responsibility inspires us to be the place where exceptional people want to do their best work, and to provide them the tools to do so. 


Whether you're applying to work in engineering or customer support, marketing or lodging supply, at Expedia Group we act as one team, working towards a common goal; to bring the world within reach.  We relentlessly strive for better, but not at the cost of the customer.  We act with humility and optimism, respecting ideas big and small.  We value diversity and voices of all volumes. We are a global organization but keep our feet on the ground, so we can act fast and stay simple.  Our teams also have the chance to give back on a local level and make a difference through our corporate social responsibility program, Expedia Cares.


If you have a hunger to make a difference with one of the most loved consumer brands in the world and to work in the dynamic travel industry, this is the job for you.


Our family of travel brands includes: Brand Expedia®, Hotels.com®, Expedia® Partner Solutions, Egencia®, trivago®, HomeAway®, Orbitz®, Travelocity®, Wotif®, lastminute.com.au®, ebookers®, CheapTickets®, Hotwire®, Classic Vacations®, Expedia® Media Solutions, CarRentals.com™, Expedia Local Expert®, Expedia® CruiseShipCenters®, SilverRail Technologies, Inc., ALICE and Traveldoo®.



Expedia is committed to creating an inclusive work environment with a diverse workforce.   All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.  This employer participates in E-Verify. The employer will provide the Social Security Administration (SSA) and, if necessary, the Department of Homeland Security (DHS) with information from each new employee's I-9 to confirm work authorization.

Expedia, Inc.
  • Bellevue, WA

What is the first thing you do while planning your travel? Do you want to work on a team that helps travelers like you go places and make our world more connected?

The Expedia Flights team is the traffic powerhouse for the Expedia group, and our flights shopping platform is one of the largest in the world, serving over 150 million queries a day and powering some of the strongest brands in the industry, such as Orbitz, Expedia, Travelocity, Wotif, Hotwire and ebookers.

Our technology operations are global, with representation in US, Mexico, Australia and India.


What makes Flights technology unique?



  • We are one of the few companies in the world that develop a proprietary flight search engine which is used by millions of users every single day

  • We are moving one of the world’s biggest flights platforms to AWS

  • We handle several hundred thousand booking transactions daily and connect with all the major GDS partners in the world

  • We collect terabytes of flight data and are actively looking to use ML to show the right content to our customers


Expedia is looking for an extraordinary Distinguished Engineer to join the Flight Search Team.  Best Fare Search, Expedia’s proprietary flight search and pricing engine, performs complex manipulations on massive and highly volatile datasets to power airline flight shopping for millions of customers every single day.


You will have the opportunity to understand and shape the marketplace. This role will pursue extremely hard problems, craft solutions and make design decisions which can have a large impact across the company. The systems you design and implement will be expected to meet the levels of scalability and robustness needed for this high-volume and high-visibility product.


Bring your programming smarts, problem solving skills, and passion for software engineering and join us as we solidify and grow our position as the leaders in the travel industry.


What you’ll do: 



  • Lead, influence, and be a contributor across our entire technology team while acting as an area expert for your team and flight search services

  • Primary designer and architect for the continued evolution of Best Fare Search and flight search services for Expedia Group

  • Design for high-performance, highly scalable, and reliable server applications in our data center and the cloud

  • Produce production quality code and have a strong eye for the operational aspects of the platform such as performance tuning, monitoring, and fault-tolerance

  • Design, interpret, analyze and work with large amounts of data to identify issues and patterns

  • Contribute to advancing the team’s design methodology and quality programming practices

  • Technical ownership of critical flight search systems and services from inception through operating in production


Who you are: 



  • Functional Expertise

  • At least 15 years of industry experience in a variety of contexts, during which you’ve built remarkably scalable, robust, and fault-tolerant systems

  • Expertise in solving large scale flight search problems a significant plus

  • Exceptional coding skills in C#, C++ or Java and proficiency with XML and SQL

  • Experience working in a cloud or virtual environment

  • Expertise with continuous integration/delivery and leveraging a dev ops mindset

  • Previous experience delivering data insights by querying datasets in a big data environment (Hadoop, SQL, AWS Aurora, S3, etc.) and performing real-time streaming analytics

  • Production focus: previous history of being hands on in solving critical production issues that affect our valued customers and drive those insights back into the product in true dev ops style

  • Knowledge of airline and/or global distribution systems (GDS) preferred
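The real-time streaming analytics called out above is usually built from windowed primitives. As a hypothetical illustration (pure Python, not the Spark/Kinesis-style tooling a production system would use), here is a sliding-window event counter of the kind a streaming query, such as "shopping requests in the last minute", is built on:

```python
# Sliding-window counter: count events whose timestamp falls within the
# last `window` seconds, evicting expired events as time advances.
from collections import deque

class SlidingWindowCounter:
    def __init__(self, window):
        self.window = window
        self.events = deque()  # timestamps in arrival order

    def record(self, ts):
        self.events.append(ts)
        self._evict(ts)

    def count(self, now):
        self._evict(now)
        return len(self.events)

    def _evict(self, now):
        # Drop timestamps that have fallen out of the window
        while self.events and self.events[0] <= now - self.window:
            self.events.popleft()

c = SlidingWindowCounter(window=60)
for ts in (0, 10, 30, 70):
    c.record(ts)
print(c.count(75))  # -> 2 (only the events at t=30 and t=70 remain)
```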


People Leadership

  • Inspiring and approachable as a leader
  • Create an environment where people can realize their full potential
  • Be humble and lead with open, candid relationships
  • Inspire peripheral relationships across Expedia Group
  • Passionate about engaging and developing talent; attract, develop, engage and retain talented individuals with a compelling, unifying vision that steers and motivates
  • Strong people skills and the ability to successfully lead up, down, and across the organization
  • Demonstrated ability to mentor and grow junior developers into strong, leading engineers
  • Proven capacity to establish trusted, effective relationships across diverse sets of partners


Additional Competencies



  • Natural bar-raiser: curious and passionate, with a desire to continuously learn more, which you use to understand basic business operations and the organizational levers that drive profitable growth

  • Bias to action, being familiar with methods and approaches needed to get things done in a collaborative, lean and fast-moving environment

  • Respond effectively to complex and ambiguous problems and situations

  • Lead mostly with questions rather than opinions, thriving on the opportunity to own, innovate, create, and constantly re-evaluate

  • Comfortable making recommendations across competing and equally critical business needs

  • Simplify, clearly and succinctly convey complex information and ideas to individuals at all levels of the organization

  • Motivated by goal achievement and continuous improvement, with the enthusiasm and drive to motivate your team and the wider organization




AIRBUS
  • Blagnac, France

Description of the job



Vacancies for 3 Data Scientists (m/f) have arisen within Airbus Commercial Aircraft in Toulouse. You will join the PLM Systems & Integration Tests team within the IM Develop department.



The IM Develop organization is established to ensure Product Lifecycle Management (PLM) support and services as requested by Programmes, CoE and CoC. The department is the home within Airbus for leading the development, implementation, maintenance and support of PLM for all Airbus programs, in line with the corporate strategy.



Within the frame of its Digital Design, Manufacturing & Services (DDMS) project, Airbus is undergoing a significant digital transformation to benefit from the latest advances in new technologies and targets a major efficiency breakthrough across the program and product lifecycle. It will be enabled by a set of innovative concepts such as model based system engineering, modular product lines, digital continuity and concurrent co-design of the product, its industrial setup and operability features.



As a Data Scientist (m/f), you will be integrated into a team of the IM Develop department and appointed to dedicated missions. You will work in an international environment where you will be able to develop in-depth knowledge of local specificities: engineering, manufacturing, costing, etc.



Tasks & accountabilities



Your main tasks and responsibilities will be to:




  • Analyze large amounts of information to discover trends and patterns, build predictive models, implement cost models and machine-learning algorithms based on technical data and DMU models.

  • Combine models through ensemble modelling

  • Present information using data visualization techniques

  • Propose solutions and strategies to business challenges

  • Implement features extraction by analyzing CAD models and engineering Bill of Material

  • Collaborate with engineering and costing (FCC) to implement new costing models in Python

  • Design and propose new short/medium- and long-term forecasting methods

  • Consolidate, compare and enlarge the data required for the various types of modelling

  • Attend technical events/conferences and reinforce Data Science skills within Airbus
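The "combine models through ensemble modelling" task above can be reduced to its simplest form: blend the predictions of several models, optionally with weights. This is an invented toy sketch (the two lambda "models" stand in for trained cost regressors), not Airbus code:

```python
# Ensemble by weighted averaging: combine predictions from several models.

def ensemble_predict(models, x, weights=None):
    """Return the weighted average of the individual model predictions."""
    weights = weights or [1.0] * len(models)
    total = sum(weights)
    return sum(w * m(x) for w, m in zip(weights, models)) / total

# Two toy cost "models" (hypothetical stand-ins for trained regressors)
linear = lambda x: 2.0 * x + 1.0   # predicts 7.0 at x = 3.0
flat = lambda x: 10.0              # constant baseline

print(ensemble_predict([linear, flat], 3.0))          # -> 8.5
print(ensemble_predict([linear, flat], 3.0, [3, 1]))  # -> 7.75
```

Averaging is the simplest ensemble; stacking (training a meta-model on the base models' outputs) and boosting follow the same combine-the-learners idea with more machinery.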




Required skills



We are looking for candidates with the following skills and experience:




  • Strong knowledge of python development in the frame of industrial projects

  • Experience in data mining & machine-learning

  • Knowledge of Scala, Java or C++; familiarity with R and SQL is an asset

  • Experience using business intelligence tools

  • Analytical mindset

  • Strong math skills (e.g. statistics, algebra)

  • Problem-solving aptitude

  • Excellent communication and presentation skills

  • PLM knowledge and 3D CAD programming would be a plus

  • French & English: advanced level

Citizens Advice
  • London, UK
  • Salary: £40k - 45k

As a Database engineer in the DevOps team here at Citizens Advice you will help us develop and implement our data strategy. You will have the opportunity to work with both core database technologies and big data solutions.


Past


Starting from scratch, we have built a deep tech-stack with AWS services at its core. We created a new CRM system, migrated a huge amount of data to AWS Aurora PG and used AWS RDS to run some of our business critical databases.


You will have gained a solid background and in-depth knowledge of AWS RDS and SQL/administration against DBMSs such as PostgreSQL / MySQL / SQL Server and Dynamo / Aurora. You will have dealt with data warehousing, ETL, DB mirroring/replication, and DB security mechanisms and techniques.


Present


We use AWS RDS including Aurora as the standard DB implementation for our applications. We parse data in S3 using Spark jobs and we are planning to implement a data lake solution in AWS.


Our tools and technologies include:



  • Postgres on AWS RDS

  • SQL Server for our Data Warehouse

  • Liquibase for managing the DW schema

  • Jenkins 2 for task automation

  • Spark / Parquet / AWS Glue for parsing raw data

  • Docker / docker-compose for local testing


You will be developing, supporting and maintaining automation tools to drive database, reporting and maintenance tasks.
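As a flavor of the automation tooling described above, here is a minimal sketch of a maintenance-style check: report row counts for every table so a scheduled job can alert on unexpected drift. It uses Python's built-in sqlite3 purely so the example is self-contained; the production stack described here is Postgres on RDS, where the same idea would query `pg_catalog` via psycopg2.

```python
# Maintenance-job building block: enumerate user tables and count their rows.
import sqlite3

def table_row_counts(conn):
    """Return {table_name: row_count} for every user table in the database."""
    cur = conn.execute(
        "SELECT name FROM sqlite_master "
        "WHERE type='table' AND name NOT LIKE 'sqlite_%'")
    return {name: conn.execute(f"SELECT COUNT(*) FROM {name}").fetchone()[0]
            for (name,) in cur.fetchall()}

# Demo against an in-memory database with one (hypothetical) table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cases (id INTEGER PRIMARY KEY, topic TEXT)")
conn.executemany("INSERT INTO cases (topic) VALUES (?)",
                 [("debt",), ("housing",)])
counts = table_row_counts(conn)
print(counts)  # -> {'cases': 2}
```

A real job would wire this into Jenkins, compare counts against yesterday's snapshot, and page on anomalies.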


As part of our internal engineering platform offering, R&D time will give you the opportunity to develop POC solutions to integrate with the rest of the business.


Future


You will seek continuous improvement and implement solutions to help Citizens Advice deliver digital products better and quicker.


You will be helping us implement a data lake solution to improve operations and to offer innovative services.


You will have dedicated investment time at Citizens Advice to learn new skills, technologies, research topics or work on tools that make this possible.

Intercontinental Exchange
  • Atlanta, GA
Job Purpose
The Data Analytics team is seeking a dynamic, self-motivated Data Scientist who is able to work independently on data analysis, data mining, report development, and customer requirement gathering.
Responsibilities
  • Applies data analysis and data modeling techniques, based upon a detailed understanding of the corporate information requirements, to establish, modify, or maintain data structures and their associated components
  • Participates in the development and maintenance of corporate data standards
  • Supports stakeholders and business users to define data and analytic requirements
  • Works with the business to identify additional internal and external data sources to bring into the data environment and mesh with existing data
  • Storyboard, create, and publish standard reports, data visualizations, analyses, and presentations
  • Develop and implement workflows using Alteryx and/or R
  • Develop and implement various operational and sales Tableau dashboards
Knowledge And Experience
  • Bachelor's degree in statistics/engineering/math/quantitative analytics/economics/finance or a related quantitative discipline required
  • Master's degree in engineering/physics/statistics/economics/math/science preferred
  • 1+ years of experience with data science techniques and real-world application experience
  • 2+ years of experience supporting the development of analytics solutions leveraging tools like Tableau Desktop and Tableau Online
  • 1+ years of experience working with SQL, developing complex SQL queries, and leveraging SQL in Tableau
  • 1+ years of experience in Alteryx, and R coding
  • Deep understanding of Data Governance and Data Modeling
  • Ability to translate requirements into working solutions
  • Advanced written and oral communication skills with the ability to summarize findings and present in a clear, concise manner to peers, management, and others
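
The SQL expectations above lean toward analytical queries rather than simple lookups. As a minimal, self-contained sketch of that style of query, the following uses Python's built-in sqlite3; the table, columns, and data are invented for illustration:

```python
import sqlite3

# Hypothetical trade data; table and column names are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE trades (trade_day TEXT, desk TEXT, volume INTEGER);
INSERT INTO trades VALUES
  ('2024-01-01', 'energy', 100),
  ('2024-01-02', 'energy', 150),
  ('2024-01-01', 'metals', 80),
  ('2024-01-02', 'metals', 60);
""")

# A windowed aggregate: each day's volume alongside a per-desk running total.
rows = conn.execute("""
SELECT desk, trade_day, volume,
       SUM(volume) OVER (PARTITION BY desk ORDER BY trade_day) AS running_volume
FROM trades
ORDER BY desk, trade_day
""").fetchall()

for row in rows:
    print(row)
```

Window constructs like `SUM(...) OVER (PARTITION BY ...)` are also the kind of SQL that carries over directly into custom SQL data sources in Tableau.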
Additional Information
    • Job Type Standard
    • Schedule Full-time
TRA Robotics
  • Berlin, Germany

We are engineers, designers and technologists, united by the idea of shaping the future. Our mission is to reimagine the manufacturing process. It will be fully software-defined and driven entirely by AI. This means new products will get to market much quicker.


Now we are working on creating a flexible robotic factory managed by AI. We are developing and integrating a stack of products that will facilitate the whole production process from design to manufacturing. Our goal is complex and deeply rooted in science. We understand that it is only achievable in collaboration across diverse disciplines and knowledge domains.


Currently, we are looking for a Senior Software Engineer, experienced in Java or Scala, to join our Operations Control System team.


About the project:


The main goal is to create Distributed Fault-Tolerant Middleware. It will coordinate all robotic operations in the factory: from the interaction between a robotic arm and various sensors to the projection of requirements onto the actual distributed sequence of robot actions.


In our work, we use cutting-edge technologies and approaches: Scala/Akka, Apache Ignite, Apache Spark, Akka Streams. As for data analysis, it is entirely up to the team to choose which methods and tools to use. We are in continuous research and development; nothing is set in stone yet. You can influence the decisions and technologies we use.


So far, the Operations Control System supports several architecture patterns:



  • Blackboard storage, based on modern in-memory approaches

  • A multi-agent system for managing operations

  • A knowledge base of technological operations with declarative semantics

  • A rule-engine-based system for algorithm selection

  • Factory monitoring tools (Complex Event Processing)


Therefore, as a part of the team, you will create a new core technology for all the projects and for the whole industry.


Your Qualifications:



  • Proficiency with Java or Scala Core (3+ years)

  • Strong knowledge of SQL (2+ years)

  • Extensive experience in enterprise development (2+ years)

  • Excellent knowledge of algorithms and data structures

  • Experience with git/maven

  • Fluency in English


Will be an advantage:



  • Experience in Akka, GridGain/Ignite, Hadoop/Spark

  • Understanding of the main concepts of distributed systems

  • Knowledge of multi-agent systems

  • Basic knowledge of rule-based engines

  • Experience in building high load systems

  • Experience in Linux (as a power user)


What we offer:



  • Join a highly science-intensive culture and take part in developing a unique product

  • The freedom to choose the technology stack and approaches

  • Yearly educational budget - we support your ambitions to learn

  • Relocation package - we would like to make your start as smooth as possible

  • Flexible working environment - choose your working hours and equipment

  • Cosy co-working space in Berlin-Mitte with access to a terrace

Comcast
  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

As a Data Science Engineer in Comcast dx, you will research, model, develop, and support data pipelines and deliver insights for key strategic initiatives. You will develop or utilize complex programmatic and quantitative methods to find patterns and relationships in data sets; lead statistical modeling or other data-driven problem-solving analyses to address novel or abstract business operation questions; and incorporate insights and findings into a range of products.

Assist in design and development of collection and enrichment components focused on quality, timeliness, scale and reliability. Work on real-time data stores and a massive historical data store using best-of-breed and industry leading technology.

Responsibilities:

-Develop and support data pipelines

-Analyze massive amounts of data, both in real time and in batch, using Spark, Kafka, and AWS technologies such as Kinesis, S3, Elasticsearch, and Lambda

-Create detailed write-ups of the processes, logic, and methodologies used for creation, validation, analysis, and visualization. Write-ups shall be produced within a week of a process being created and updated when changes occur.

-Prototype ideas for new ML/AI tools, products and services

-Centralize data collection and synthesis, including survey data, enabling strategic and predictive analytics to guide business decisions

-Provide expert and professional data analysis to implement effective and innovative solutions meshing disparate data types to discover insights and trends.

-Employ rigorous continuous delivery practices managed under an agile software development approach

-Support DevOps practices to deploy and operate our systems

-Automate and streamline our operations and processes

-Troubleshoot and resolve issues in our development, test and production environments

Here are some of the specific technologies and concepts we use:

-Spark Core and Spark Streaming

-Machine learning techniques and algorithms

-Java, Scala, Python, R

-Artificial Intelligence

-AWS services including EMR, S3, Lambda, ElasticSearch

-Predictive Analytics

-Tableau, Kibana

-Git, Maven, Jenkins

-Linux

-Kafka

-Hadoop (HDFS, YARN)
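
To make the real-time side of the stack above concrete: a Spark Streaming job over a Kafka topic often reduces to a windowed aggregation. The following is a pure-Python analogue of a tumbling-window count (no Spark or Kafka required); the event shape and names are invented for illustration:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, key) events into fixed-size windows and count per key.

    A pure-Python analogue of the windowed aggregation a Spark Streaming
    job would run over a Kafka topic; not Spark's actual API.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # align to window boundary
        counts[window_start][key] += 1
    return {w: dict(per_key) for w, per_key in counts.items()}

events = [(5, "play"), (42, "pause"), (61, "play"), (119, "play"), (130, "pause")]
print(tumbling_window_counts(events))
# {0: {'play': 1, 'pause': 1}, 60: {'play': 2}, 120: {'pause': 1}}
```

In Spark Structured Streaming the same idea is expressed declaratively with a `window()` grouping on event time; the logic of bucketing by window start is identical.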

Skills & Requirements:

-5-8 years of Java experience; Scala and Python experience a plus

-3+ years of experience as an analyst, data scientist, or related quantitative role.

-3+ years of relevant quantitative and qualitative research and analytics experience. Solid knowledge of statistical techniques.

-Bachelor's in Statistics, Math, Engineering, Computer Science, or a related discipline. Master's degree preferred.

-Experience in software development of large-scale distributed systems including proven track record of delivering backend systems that participate in a complex ecosystem

-Experience with more advanced modeling techniques (e.g., ML)

-Distinctive problem-solving and analysis skills and impeccable business judgment.

-Experience working with imperfect data sets that, at times, will require improvements to process, definition and collection

-Experience with real-time data pipelines and components including Kafka, Spark Streaming

-Proficient in Unix/Linux environments

-Test-driven development/test automation, continuous integration, and deployment automation

-Excellent communicator, able to analyze and clearly articulate complex issues and technologies understandably and engagingly

-Team player is a must

-Great design and problem-solving skills

-Adaptable, proactive and willing to take ownership

-Attention to detail and high level of commitment

-Thrives in a fast-paced agile environment

About Comcast dx:

Comcast dx is a results-driven big data engineering team responsible for delivering the multi-tenant data infrastructure and platforms necessary to support our data-driven culture and organization. dx has an overarching objective to gather, organize, and make sense of Comcast data with the intention to reveal business and operational insight, discover actionable intelligence, enable experimentation, empower users, and delight our stakeholders. Members of the dx team define and leverage industry best practices, work on large-scale data problems, design and develop resilient and highly robust distributed data organizing and processing systems and pipelines, as well as research, engineer, and apply data science and machine intelligence disciplines.

Comcast is an EOE/Veterans/Disabled/LGBT employer

Comcast
  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

As a Data Science Engineer in Comcast dx, you will research, model, develop, and support data pipelines and deliver insights for key strategic initiatives. You will develop or utilize complex programmatic and quantitative methods to find patterns and relationships in data sets; lead statistical modeling or other data-driven problem-solving analyses to address novel or abstract business operation questions; and incorporate insights and findings into a range of products.

Assist in design and development of collection and enrichment components focused on quality, timeliness, scale and reliability. Work on real-time data stores and a massive historical data store using best-of-breed and industry leading technology.

Responsibilities:

-Develop and support data pipelines

-Analyze massive amounts of data, both in real time and in batch, using Spark, Kafka, and AWS technologies such as Kinesis, S3, Elasticsearch, and Lambda

-Create detailed write-ups of the processes, logic, and methodologies used for creation, validation, analysis, and visualization. Write-ups shall be produced within a week of a process being created and updated when changes occur.

-Prototype ideas for new ML/AI tools, products and services

-Centralize data collection and synthesis, including survey data, enabling strategic and predictive analytics to guide business decisions

-Provide expert and professional data analysis to implement effective and innovative solutions meshing disparate data types to discover insights and trends.

-Employ rigorous continuous delivery practices managed under an agile software development approach

-Support DevOps practices to deploy and operate our systems

-Automate and streamline our operations and processes

-Troubleshoot and resolve issues in our development, test and production environments

Here are some of the specific technologies and concepts we use:

-Spark Core and Spark Streaming

-Machine learning techniques and algorithms

-Java, Scala, Python, R

-Artificial Intelligence

-AWS services including EMR, S3, Lambda, ElasticSearch

-Predictive Analytics

-Tableau, Kibana

-Git, Maven, Jenkins

-Linux

-Kafka

-Hadoop (HDFS, YARN)

Skills & Requirements:

-3-5 years of Java experience; Scala and Python experience a plus

-2+ years of experience as an analyst, data scientist, or related quantitative role.

-2+ years of relevant quantitative and qualitative research and analytics experience. Solid knowledge of statistical techniques.

-Bachelor's in Statistics, Math, Engineering, Computer Science, or a related discipline. Master's degree preferred.

-Experience in software development of large-scale distributed systems including proven track record of delivering backend systems that participate in a complex ecosystem

-Experience with more advanced modeling techniques (e.g., ML)

-Distinctive problem-solving and analysis skills and impeccable business judgment.

-Experience working with imperfect data sets that, at times, will require improvements to process, definition and collection

-Experience with real-time data pipelines and components including Kafka, Spark Streaming

-Proficient in Unix/Linux environments

-Test-driven development/test automation, continuous integration, and deployment automation

-Excellent communicator, able to analyze and clearly articulate complex issues and technologies understandably and engagingly

-Team player is a must

-Great design and problem-solving skills

-Adaptable, proactive and willing to take ownership

-Attention to detail and high level of commitment

-Thrives in a fast-paced agile environment

About Comcast dx:

Comcast dx is a results-driven big data engineering team responsible for delivering the multi-tenant data infrastructure and platforms necessary to support our data-driven culture and organization. dx has an overarching objective to gather, organize, and make sense of Comcast data with the intention to reveal business and operational insight, discover actionable intelligence, enable experimentation, empower users, and delight our stakeholders. Members of the dx team define and leverage industry best practices, work on large-scale data problems, design and develop resilient and highly robust distributed data organizing and processing systems and pipelines, as well as research, engineer, and apply data science and machine intelligence disciplines.

Comcast is an EOE/Veterans/Disabled/LGBT employer

Accenture
  • San Diego, CA
Job Title: Accenture Digital, Accenture Analytics, Data Science, Consultant
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture and make delivering innovative work part of your extraordinary career.
People in our Client & Market career track drive profitable growth by developing market-relevant insights to increase market share or create new markets. They progress through required promotion into market-facing roles that have a direct impact on sales.
Accenture Analytics
Accenture Analytics delivers insight driven outcomes at scale to help organizations improve performance. Our extensive capabilities range from accessing and reporting on data to advanced modeling, forecasting, and sophisticated statistical analysis. Specifically we...
    • Help companies better understand consumers and how to connect with them across markets and channels, using data and analytics
    • Design and develop reusable analytical assets using advanced statistical and computational methods
    • Proactively monitor and analyze complex systems to understand, diagnose, and continuously improve key performance indicators
    • Pilot sophisticated advanced analytics & innovative analytics solutions to prove value
    • Work closely with business leaders to understand needs and assist with architecting and deploying innovative solutions
    • Operate at the frontier of innovative analytics; introduce and implement newest market developments & trends in analytics
    • Optimize Resources processes and integrate across the enterprise in order to maximize the opportunities of data, analytics, and outcomes
    • Build, deploy, maintain and scale advanced analytic solutions that reduce complexity and cost
    • Partner and team with technology solution providers to deliver the best solution meeting the needs of our clients
    • Offer our clients end to end solutions and services
Job Description
The Consultant is responsible for delivery on Resources analytics projects. This work requires the use of advanced statistical analysis in the Resources industry.
The Key Responsibilities Include
    • Client-facing interaction including providing analyses, recommendations, presentations and advice to clients.
    • Adapts existing methods and procedures to create possible alternative solutions to moderately complex problems.
    • Understands the strategic direction set by senior management as it relates to team goals.
    • Understands client use cases / user stories and maps them to solutions based on best practice.
    • Uses considerable judgment to determine solution and seeks guidance on complex problems.
    • Primary upward interaction is with direct supervisor. May interact with peers and/or management levels at a client and/or within Accenture.
    • Determines methods and procedures on new assignments with guidance.
    • Decisions often impact the team in which they reside.
    • Project-based analytics including but not limited to: Machine Learning, Predictive Analytics, Comparative Effectiveness Analysis, Failure Analysis, Big Data Analytics, Optimization, Demand Forecasting, Customer Segmentation, Customer Analytic Record.
Basic Qualifications
    • Minimum of Bachelor's Degree required in related field; strong preference for fields of study in the data science, mathematics, economics, statistics, engineering and information management
    • Minimum of 3 years delivery experience in advanced modeling environment: strong understanding of statistical concepts and predictive modeling. (e.g., AI neural networks, multi-scalar dimensional models, logistic regression techniques, machine-based learning, big data platforms, SQL, etc.)
    • Minimum 3 years experience with predictive analytics tools, including at least two of the following: R, SAS, Alteryx, Python, Spark, and Tableau
Preferred Qualifications
    • Minimum of 2 years of experience with consulting or implementing transformational change
    • Experience in the analysis of marketing databases using SAS or other statistical modeling tools.
    • Experience in the following areas: Applied Statistics/Econometrics, Statistical Programming, Database Management & Operations, Digital, Comparative Effectiveness Research
    • Possess a blend of marketing acumen, consulting expertise and analytical capabilities that can create value and insights for our clients.
All of our consulting professionals receive comprehensive training covering business acumen, technical and professional skills development. You'll also have opportunities to hone your functional skills and expertise in an area of specialization. We offer a variety of formal and informal training programs at every level to help you acquire and build specialized skills faster. Learning takes place both on the job and through formal training conducted online, in the classroom, or in collaboration with teammates. The sheer variety of work we do, and the experience it offers, provide an unbeatable platform from which to build a career.
Infosys
  • Houston, TX
Responsibilities

-Hands on experience with Big Data systems, building ETL pipelines, data processing, and analytics tools
-Understanding of data structures & common methods in data transformation.
-Familiar with the concepts of dimensional modeling.
-Sound knowledge of one programming language - Python or Java
-Programming experience using tools such as Hadoop and Spark.
-Strong proficiency in using query languages such as SQL, Hive and SparkSQL
-Experience in Kafka & Scala would be a plus

Migo
  • Taipei, Taiwan

  • Responsibility 

    • Collaborate with data scientists to phase statistical, predictive machine learning, and AI models into production scale, and continuously optimize performance.

    • Design, build, optimize, launch and support new and existing data models and ETL processes in production based on data products and stakeholder needs.

    • Define and manage SLA and accuracy for all data sets in allocated areas of ownership.

    • Design and continuously improve data infrastructure and identify infra issues and drive to resolution.

    • Support software development team to build and maintain data collectors in Migo application ecosystem based on data warehouse and analytics user requirements.





  • Basic Qualification:

    • Bachelor's degree in Computer Science, Information Management or related field.

    • 2+ years hands-on experience in the data warehouse space, custom ETL design, implementation and maintenance.

    • 2+ years hands-on experience in SQL or similar languages and development experience in at least one scripting language (Python preferred).

    • Strong data architecture, data modeling, schema design and effective project management skills.

    • Excellent communication skills and proven experience in leading data driven projects from definition through interpretation and execution.

    • Experience with large data sets and data profiling techniques.

    • Ability to initiate and drive projects, and communicate data warehouse plans to internal clients/stakeholders.





  • Preferred Qualification:

    • Experience with big data and distributed computing technology such as Hive, Spark, Presto, Parquet

    • Experience building and maintaining production level data lake with Hadoop Cluster or AWS S3.

    • Experience with batch processing and streaming data pipeline/architecture design patterns such as lambda architecture or kappa architecture.
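
For readers unfamiliar with the lambda architecture named above: the pattern pairs a periodically recomputed batch view with an incrementally updated speed view, merging the two at query time. A toy Python sketch (all names and data are invented for illustration):

```python
from collections import Counter

batch_view = Counter()  # rebuilt wholesale from the master dataset
speed_view = Counter()  # incremental, covers only events newer than the batch

def batch_recompute(master_events):
    """Batch layer: rebuild the view from scratch over all historical events."""
    global batch_view
    batch_view = Counter(master_events)
    speed_view.clear()  # recent events are now folded into the batch view

def on_new_event(event):
    """Speed layer: update incrementally as each event arrives."""
    speed_view[event] += 1

def query(event):
    """Serving layer: merge the batch and real-time views."""
    return batch_view[event] + speed_view[event]

batch_recompute(["click", "click", "view"])
on_new_event("click")
print(query("click"))  # 3
```

The kappa architecture, by contrast, drops the batch layer entirely and recomputes by replaying the event log through the streaming path.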








Catapult Systems
  • Houston, TX

High-performing team members. Challenging projects. A stable and profitable company. And a great place to work! This is what you can expect if you join the Catapult Systems team. Founded in 1993 and headquartered in Austin, Texas, Catapult is an award-winning Microsoft National Solution Provider and was recently named Microsoft Partner of the Year (U.S.) and a Microsoft Partner of the Year Finalist in Cloud Productivity.


What do we attribute our award-winning success to? The people we hire, of course! We provide you the tools and leadership that you need to be successful, and then let you do what you do best. We enable you to make the decisions that you feel are in the best interest of our clients, and we trust your judgment. This type of ownership and independence, and an ongoing commitment to solving real business problems through the innovative use of Microsoft technologies, has resulted in Catapult being voted one of the best places to work year after year!


It is a very exciting time of growth for Catapult Systems, and we are currently hiring a Data Analytics Developer to provide technical leadership for our expanding team.

What will my role be?


As a Data Analytics Developer you will work with customers to identify opportunities and scenarios where Power BI and Azure Data Services can benefit their business. You will deliver short and long term projects utilizing strong business, technical and data modeling skills.

Responsibilities will include:

    • Working with customers to analyze business requirements and define functional specifications
    • Facilitating client meetings and strategy sessions
    • Providing pre-sales technical support by attending sales calls and creating demos for customers
    • Support and implementation of Data Analytics projects

What's required?

    • First and foremost, you should enjoy what you do and enjoy working in teams!
    • Ability to engage in customer settings and discern client business needs
    • Strong working knowledge and track record of Data Analytics development
    • 5+ years of experience with data sourcing, star-schema and relational data modeling, ETL, and processing
    • Expert-level knowledge of SSIS, SSAS, SSRS, and Power BI, and of tools such as SSMS and SSDT
    • Experience supporting large scale analytical platforms
    • Experience designing automated processing, data validation, error checks and alerts, and performance testing techniques
    • Experience working with SQL Azure and cloud data solutions
    • 5+ years of experience with Microsoft SQL Server and proficiency in T-SQL
    • 1+ years of experience in migrating from on-prem to cloud (PaaS or IaaS)
    • Excellent presentation, verbal and written communication, and time management skills
    • Ability to travel up to 25%

What else would make me stand out?

    • Previous consulting experience
    • Knowledge of database optimization techniques
    • Experience with Python and/or R
    • Proficiency in MDX and/or DAX queries
    • Experience with Microsoft Office 365 and cloud data solutions
    • Reporting experience with SharePoint and CRM
    • Relevant Microsoft Certifications and Non-Microsoft data platform certifications
    • Ability to work with cloud and hybrid environments
    • Good understanding of statistics
    • Knowledge in government analytics and policy objectives
    • Experience working with Big Data technologies and NoSQL
    • Multidimensional and Tabular Cube design, development, performance tuning and troubleshooting
    • Experience working with Data Visualization, Auditing, Data Validation, and Data Mining

So what are you waiting for? If you are passionate about being a leader and want to work with smart people who are committed to accomplishing great things, then apply today!

Catapult offers an outstanding benefits package including 401(k) match, paid time off, flex spending accounts, identity theft protection, and medical, dental, and life insurance just to name a few.

Catapult was recently named a Texas Monthly magazine Best Place to Work!

118118Money
  • Austin, TX

Seeking an individual with a keen eye for good design combined with the ability to communicate those designs through informative design artifacts. Candidates should be familiar with an Agile development process (and understand its limitations), and able to mediate between product/business needs and developer architectural needs. They should be ready to get their hands dirty coding complex pieces of the overall architecture.

We are .NET Core on the backend, Angular 2 on a mobile web front-end, and native on Android and iOS. We host our code across AWS and on-premises VMs, and use various data backends (SQL Server, Oracle, Mongo).

Very important is interest in (and hopefully, experience with) modern big data pipelines and machine learning. Experience with streaming platforms feeding Apache Spark jobs that train machine learning models would be music to our ears. Financial platforms generate massive amounts of data, and re-architecting aspects of our microservices to support that will be a key responsibility.

118118 Money is a private financial services company with R&D headquartered in Austin along Highway 360, in front of the Bull Creek Nature Preserve. We have offices around the world, so candidates should be open to occasional travel abroad. The atmosphere is casual and has a startup feel. You will see your software creations deployed quickly.

Responsibilities

    • Help us to build a big data pipeline and add machine learning capability to more areas of our platform.
    • Manage code from development through deployment, including support and maintenance.
    • Perform code reviews, assist and coach more junior developers to adhere to proper design patterns.
    • Build fault-tolerant distributed systems.

Requirements

    • Expertise in .NET, C#, HTML5, CSS3, Javascript
    • Experience with some flavor of ASP.NET MVC
    • Experience with SQL Server
    • Expertise in the design of elegant and intuitive REST APIs.
    • Cloud development experience (Amazon, Azure, etc)
    • Keen understanding of security principles as they pertain to service design.
    • Expertise in object-oriented design principles.

Desired

    • Machine Learning experience
    • Mobile development experience
    • Kafka / message streaming experience
    • Apache Spark experience
    • Knowledge of the ins and outs of Docker containers
    • Experience with MongoDB
FCA Fiat Chrysler Automobiles
  • Detroit, MI

Fiat Chrysler Automobiles is looking to fill the full-time position of a Data Scientist. This position is responsible for delivering insights to the commercial functions in which FCA operates.


The Data Scientist is a role in the Business Analytics & Data Services (BA) department and reports through the CIO. They will play a pivotal role in the planning, execution and delivery of data science and machine learning-based projects. The bulk of the work will be in the areas of data exploration and preparation, data collection and integration, machine learning (ML) and statistical modelling, and data pipelining and deployment.

The newly hired data scientist will be a key interface between the ICT Sales & Marketing team, the Business and the BA team. Candidates need to be very much self-driven, curious and creative.

Primary Responsibilities:

    • Problem Analysis and Project Management:
      • Guide and inspire the organization about the business potential and strategy of artificial intelligence (AI)/data science
      • Identify data-driven/ML business opportunities
      • Collaborate across the business to understand IT and business constraints
      • Prioritize, scope and manage data science projects and the corresponding key performance indicators (KPIs) for success
    • Data Exploration and Preparation:
      • Apply statistical analysis and visualization techniques to various data, such as hierarchical clustering, t-distributed stochastic neighbor embedding (t-SNE), and principal component analysis (PCA)
      • Generate and test hypotheses about the underlying mechanics of the business process.
      • Network with domain experts to better understand the business mechanics that generated the data.
    • Data Collection and Integration:
      • Understand new data sources and process pipelines. Catalog and document their use in solving business problems.
      • Create data pipelines and assets that enable more efficient and repeatable data science activities.
    • Machine Learning and Statistical Modelling:
      • Apply various ML and advanced analytics techniques to perform classification or prediction tasks
      • Integrate domain knowledge into the ML solution; for example, from an understanding of financial risk, customer journey, quality prediction, sales, marketing
      • Testing of ML models, such as cross-validation, A/B testing, bias and fairness
    • Operationalization:
      • Collaborate with ML operations (MLOps), data engineers, and IT to evaluate and implement ML deployment options
      • (Help to) integrate model performance management tools into the current business infrastructure
      • (Help to) implement champion/challenger test (A/B tests) on production systems
      • Continuously monitor execution and health of production ML models
      • Establish best practices around ML production infrastructure
    • Other Responsibilities:
      • Train other business and IT staff on basic data science principles and techniques
      • Train peers on specialist data science topics
      • Promote collaboration with the data science COE within the organization.
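
The exploration techniques named in the responsibilities above (PCA, t-SNE, hierarchical clustering) can be illustrated with a minimal sketch. This assumes scikit-learn and SciPy and uses a stock dataset purely for illustration; it does not reflect FCA's actual tooling or data:

```python
# Minimal sketch of the listed exploration techniques: PCA for linear
# dimensionality reduction, t-SNE for a non-linear embedding, and
# hierarchical (Ward) clustering. Assumes scikit-learn and SciPy.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE
from scipy.cluster.hierarchy import linkage, fcluster

X, _ = load_iris(return_X_y=True)  # 150 samples, 4 features

# PCA: project onto the two directions of greatest variance
X_pca = PCA(n_components=2).fit_transform(X)

# t-SNE: non-linear embedding that preserves local neighborhoods
X_tsne = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)

# Hierarchical clustering with Ward linkage, cut into 3 clusters
labels = fcluster(linkage(X, method="ward"), t=3, criterion="maxclust")

print(X_pca.shape, X_tsne.shape, len(set(labels)))
```

In practice the 2-D projections would be plotted and inspected against the cluster labels to generate hypotheses about structure in the data.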

Basic Qualifications:

    • A bachelor's degree in computer science, data science, operations research, statistics, applied mathematics, or a related quantitative field is required. Alternate experience and education in equivalent areas, such as economics, engineering or physics, is acceptable. Experience in more than one area is strongly preferred.
    • Candidates should have three to six years of relevant project experience in successfully launching, planning and executing data science projects, preferably in the domains of automotive or customer behavior prediction.
    • Coding knowledge and experience in several languages: for example, R, Python, SQL, Java, C++, etc.
    • Experience of working across multiple deployment environments including cloud, on-premises and hybrid, multiple operating systems and through containerization techniques such as Docker, Kubernetes, AWS Elastic Container Service, and others.
    • Experience with distributed data/computing and database tools: MapReduce, Hadoop, Hive, Kafka, MySQL, Postgres, DB2 or Greenplum, etc.
    • All candidates must be self-driven, curious and creative.
    • They must demonstrate the ability to work in diverse, cross-functional teams.
    • Should be confident, energetic self-starters, with strong moderation and communication skills.

Preferred Qualifications:

    • A master's degree or PhD in statistics, ML, computer science or the natural sciences, especially physics or any engineering disciplines or equivalent.
    • Experience in one or more of the following commercial/open-source data discovery/analysis platforms: RStudio, Spark, KNIME, RapidMiner, Alteryx, Dataiku, H2O, SAS Enterprise Miner (SAS EM) and/or SAS Visual Data Mining and Machine Learning, Microsoft AzureML, IBM Watson Studio or SPSS Modeler, Amazon SageMaker, Google Cloud ML, SAP Predictive Analytics.
    • Knowledge and experience in statistical and data mining techniques: generalized linear model (GLM)/regression, random forest, boosting, trees, text mining, hierarchical clustering, deep learning, convolutional neural network (CNN), recurrent neural network (RNN), T-distributed Stochastic Neighbor Embedding (t-SNE), graph analysis, etc.
    • A specialization in text analytics, image recognition, graph analysis or other specialized ML techniques such as deep learning, etc., is preferred.
    • Ideally, the candidates are adept in agile methodologies and well-versed in applying DevOps/MLOps methods to the construction of ML and data science pipelines.
    • Knowledge of industry standard BA tools, including Cognos, QlikView, Business Objects, and other tools that could be used for enterprise solutions
    • Should exhibit superior presentation skills, including storytelling and other techniques to guide and inspire and explain analytics capabilities and techniques to the organization.
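
Two of the modelling techniques listed above, a GLM (logistic regression) and a random forest, are commonly compared via k-fold cross-validation, also mentioned in the responsibilities. The sketch below assumes scikit-learn; the dataset and hyperparameters are illustrative only:

```python
# Hedged sketch: compare a GLM (logistic regression) with a random
# forest using 5-fold cross-validation. Assumes scikit-learn.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

glm = LogisticRegression(max_iter=5000)
forest = RandomForestClassifier(n_estimators=100, random_state=0)

# Cross-validation yields a distribution of scores, not a single
# point estimate, which makes the comparison more robust.
glm_scores = cross_val_score(glm, X, y, cv=5)
forest_scores = cross_val_score(forest, X, y, cv=5)

print(f"GLM mean accuracy:    {glm_scores.mean():.3f}")
print(f"Forest mean accuracy: {forest_scores.mean():.3f}")
```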
Ventula Consulting
  • Northampton, UK
  • Salary: £70k - 75k

Lead Software Engineer – Java – Global Bank – Machine Learning / Big Data, to £75k + Exceptional Package


Lead Software Engineer with a strong background in Java development required to join a new innovation focused team working on greenfield projects.


My client is working on a number of cutting-edge Machine Learning and AI solutions which are set to revolutionise fraud detection and prevention, so this is a great opportunity to make a real impact on the future of banking technology.


This role would suit a highly ambitious Software Developer who is looking for a genuine challenge.


You will be joining a newly established innovation team within the Bank which consists of highly skilled technical individuals and industry thought leaders.


There is a very real opportunity for rapid career progression into both technical and management focused roles due to the high profile nature of this function.


The ideal Lead Software Engineer will have the following experience:



  • Expert in Java software development – Java 8 or later versions

  • Experience developing Business Critical systems with low latency performance

  • Development background creating solutions using AWS

  • Any experience in Big Data, MongoDB, Spark, MySQL and React / Node would be nice to have although not a necessity


This role will be based in Northampton and offers a salary of between £70-£75k plus an exceptional package including bonus, strong pension, private healthcare and a host of other benefits.

BIZX, LLC / Slashdot Media / SourceForge.net
  • San Diego, CA

Job Description (your role):


The Senior Data Engineer position is a challenging role that bridges the gap between data management and software development. This role reports directly to and works closely with the Director of Data Management while teaming with our software development group. You will work with the team that is designing and implementing the next generation of our internal systems, replacing legacy technical debt with state-of-the-art design to enable faster product and feature creation in our big data environment.


Our Industry and Company Environment:

The candidate must have the desire to work and collaborate in a fast-paced entrepreneurial environment in the B2B technology marketing and big data space, working with highly motivated co-workers in our downtown San Diego office.


Responsibilities


  • Design interfaces allowing the operations department to fully utilize large data sets
  • Implement machine learning algorithms to sort and organize large data sets
  • Participate in the research, design, and development of software tools
  • Identify, design, and implement process improvements: automating manual processes
  • Optimize data delivery, re-designing infrastructure for greater scalability
  • Analyze and interpret large data sets
  • Build reliable services for gathering & ingesting data from a wide variety of sources
  • Work with peers and stakeholders to plan approach and define success
  • Create efficient methods to clean and curate large data sets
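
The last responsibility, creating efficient methods to clean and curate large data sets, might look like the following minimal sketch. It assumes pandas; the column names and cleaning rules are hypothetical examples, not this company's actual schema:

```python
# Illustrative data-cleaning sketch: drop rows missing a required
# field, normalize string keys, and deduplicate. Assumes pandas;
# the "email"/"company" columns are hypothetical.
import pandas as pd

raw = pd.DataFrame({
    "email": ["a@x.com", "A@X.COM ", None, "b@y.com"],
    "company": ["Acme", "Acme", "Beta", "Beta"],
})

clean = (
    raw.dropna(subset=["email"])                        # require an email
       .assign(email=lambda d: d["email"].str.strip().str.lower())
       .drop_duplicates(subset=["email"])               # keep first of dupes
       .reset_index(drop=True)
)

print(clean)  # 2 rows remain: a@x.com and b@y.com
```

At scale the same rules would typically run inside a pipeline stage rather than in-memory, but the curation logic is the same.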


Qualifications

    • Have a B.S., M.S. or Ph.D. in Computer Science or equivalent degree and work experience

    • Deep understanding of developing high efficiency data processing systems

    • Experience with development of applications in mission-critical environments
    • Experience with our stack:
    •      3+ years experience developing in JavaScript, PHP, Symfony
    •      3+ years experience developing and implementing machine learning algorithms
    •      4+ years experience with data science tool sets
    •      3+ years MySQL
    •      Experience with ElasticSearch a plus
    •      Experience with Ceph a plus
 

About  BIZX, LLC / Slashdot Media / SourceForge.net


BIZX, including its Slashdot Media division, is a global leader in online professional technology communities such as sourceforge.net, serving over 40M website viewers and over 150M page views each month to an enthusiastic and engaged audience of IT professionals, decision makers, developers and enthusiasts around the world. Our Passport demand generation platform leverages our huge B2B database and is considered best in class by our list of Fortune 1000 customers. Our impressive growth in the demand generation space is fueled through our use of AI, big data technologies, sophisticated systems automation - and great people.


Location - 101 W Broadway, San Diego, CA

Accenture
  • Minneapolis, MN
Title : Solution Architect - Oracle Cloud Big Data Management
Organization: Technology/National Data Governance
Locations: Atlanta, Austin, Boston, Charlotte, Chicago, Dallas, Denver, District of Columbia, Florham Park, Houston, Minneapolis, New York, Philadelphia, Phoenix, Raleigh, or Tampa
Are you ready to step up to the New and take your technology expertise to the next level?
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture and make delivering innovative work part of your extraordinary career.
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward. We partner with our clients to help transform their data into an Appreciating Business Asset.
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day, and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of technology and data experts who are highly collaborative, taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in Technology at Accenture!
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
Oracle Cloud professionals develop deep data integration skills to support Accenture's Cloud Big Data engineering and analytics agendas, including skills such as cloud big data platform architecture, data and container services, serverless (data) computing, on-premises to cloud data migration, data ingestion, data curation, and data migration.
    • Act as a technical and solution expert in the areas of Oracle cloud platforms, Oracle cloud data management, Oracle Cloud data integration, Oracle Exadata, AI / Machine learning capabilities in Data Platforms
    • Advise clients on Data on Cloud adoption & journey leveraging Nextgen Information platforms on Cloud, Cloud data architecture patterns, platform selection.
    • Build Senior stakeholder relationships
    • Build personal brand within Accenture and drive thought leadership through participation in Business Development efforts, client meetings and workshops, speaking in industry conferences, publishing white papers, etc.
    • Partner with Client teams and clients in helping them in Data Monetization initiatives - making business sense of structured, semi-structured, unstructured and streaming data, to develop new business strategies, customer engagement models, manage product portfolios, and optimize enterprise assets
    • Develop industry relevant data analytics solutions for enterprise business functions
    • Collaborate with GoTo market teams in generating demand and pipeline for data analytics solutions
    • Collaborate with partners (software vendors) to build joint industry solutions
    • Serve as data supply chain expert for the vision and integration of emerging data technologies on cloud, anticipation of new trends and resolution of complex business and technical problems.
    • Lead the discussions and early analysis of data-on-cloud concepts as they relate to Accenture's Data supply chain service offerings, so that clear use cases are developed and prioritized, and transition these concepts from ideas to working prototypes with the full support of the appropriate teams who will develop the new offerings.
    • Evaluate alliance technologies for potential go-to-market partnerships.
    • Lead the development of offering proofs-of-concept and effectively transition those concepts to the lines of business for full architecture, engineering, deployment and commercialization.
    • Coach and mentor both senior and junior members across OGs and IDC.
    • Develop practical solutions, methodologies, solution patterns and point-of-view documents.
    • Manage and grow Data, Data on Cloud pipeline
    • Participate in industry events to project Accenture's thought leadership
A professional at this position level within Accenture has the following responsibilities:
  • Utilizes existing methods and procedures to create designs within the proposed solution to solve business problems
  • Understands the strategic direction set by senior management as it relates to team goals
  • Contributes to design of solution, executes development of design, and seeks guidance on complex technical challenges where necessary
  • Primary upward interaction is with direct supervisor
  • May interact with peers, client counterparts and/or management levels within Accenture
  • Understands methods and procedures on new assignments and executes deliverables with guidance as needed
  • May interact with peers and/or management levels at a client and/or within Accenture
  • Determines methods and procedures on new assignments with guidance
  • Decisions often impact the team in which they reside
  • Manages teams and/or work efforts (if in an individual contributor role) at a client or within Accenture
Basic Qualifications
    • Minimum of 12 years of experience in multiple disciplines including: solution or technical architecture, Data Management, Cloud, or Big Data
    • Minimum of 5 years of experience designing and implementing large and complex Data Lakes, Data Warehouses, and Analytics projects
    • Minimum of 3 years of experience and a working knowledge of Oracle Cloud Big Data Management
    • Minimum of 3 years of experience and a working knowledge of Oracle Cloud Integration, Apache Hadoop, NoSQL, Spark, and Hive
    • Minimum of 3 years of experience developing solutions utilizing a combination of the following:
    • Autonomous Data Warehouse Cloud
    • Autonomous NoSQL Database Cloud
    • Big Data / Big Data SQL Cloud
    • Data Hub
    • Event Hub
    • Exadata
    • Exadata Express
    • MySQL Database
    • Apiary, Data Integrator
    • GoldenGate
    • Self-Service Integration
Preferred Qualifications
    • Experience as a consulting manager in a top-tier consulting firm preferred
    • Deep understanding of public cloud platforms like AWS, MS Azure and Google Cloud Platform
    • Experience with delivering Big Data Solutions in the public cloud platforms
    • Ability to configure and support API and Opensource integrations
    • Experience administering Hadoop or other Data Science and Analytics platforms using the technologies above
    • Designing ingestion, low latency, visualization clusters to sustain data loads and meet business and IT expectations
    • Over 2 years of experience in sales / pre-sales functions, leading pursuits, proposal development, effort estimations, and statements of work

Professional Skill Requirements
  • Proven success in contributing to a team-oriented environment
  • Proven ability to work creatively and analytically in a problem-solving environment
  • Desire to work in an information systems environment
  • Excellent communication (written and oral) and interpersonal skills
  • Ability to work with senior client executives
Our consulting professionals receive comprehensive training covering business acumen, technical and professional skills development. You'll also have opportunities to hone your functional skills and expertise in an area of specialization. We offer a variety of formal and informal training programs at every level to help you acquire and build specialized skills faster. Learning takes place both on the job and through formal training conducted online, in the classroom, or in collaboration with teammates. The sheer variety of work we do, and the experience it offers, provide an unbeatable platform from which to build a career.
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H1-B visa, F-1 visa (OPT), TN visa or any other non-immigrant status).
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women.