OnlyDataJobs.com

Vector Consulting, Inc
  • Atlanta, GA
 

Our Government client is looking for an experienced ETL Developer on a renewable contract in Atlanta, GA.

Position: ETL Developer

The desired candidate will be responsible for design, development, testing, maintenance and support of complex data extract, transformation and load (ETL) programs for an Enterprise Data Warehouse. An understanding of how complex data should be transformed from the source and loaded into the data warehouse is a critical part of this job.

  • Deep hands-on experience with OBIEE RPD & BIP reporting data models and development for seamless cross-functional and cross-system data reporting
  • Expertise and solid experience in BI Tools OBIEE, Oracle Data Visualization and Power BI
  • Strong Informatica technical knowledge in the design, development and management of complex Informatica mappings, sessions and workflows using the Informatica Designer components (Source Analyzer, Warehouse Designer, Mapping Designer & Mapplet Designer, Transformation Developer), Workflow Manager and Workflow Monitor.
  • Strong programming skills, relational database skills with expertise in Advanced SQL and PL/SQL, indexing and query tuning
  • Experience implementing advanced analytical models in Python or R
  • Experienced in Business Intelligence and Data warehousing concepts and methodologies.
  • Extensive experience in data analysis and root cause analysis and proven problem solving and analytical thinking capabilities.
  • Analytical capabilities to slice and dice data and display data in reports for best user experience.
  • Demonstrated ability to review business processes and translate into BI reporting and analysis solutions.
  • Ability to follow the Software Development Lifecycle (SDLC) process and to work under whichever project management methodology is in use.
  • Ability to follow best practices and standards.
  • Ability to identify and tune BI application performance bottlenecks.
  • Ability to work quickly and accurately under pressure and project time constraints
  • Ability to prioritize workload and work with minimal supervision
  • Basic understanding of software engineering principles and skills working on Unix/Linux/Windows Operating systems, Version Control and Office software
  • Exposure to data modeling using star/snowflake schema design, data marts, relational and dimensional data modeling, slowly changing dimensions (a sketch follows this list), fact and dimension tables, and physical and logical data modeling, as well as to big data technologies
  • Experience with Big Data Lake / Hadoop implementations
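
For illustration, here is a minimal Python sketch of the Slowly Changing Dimension (Type 2) pattern named above; the customer dimension, its columns, and the data are hypothetical, not from the client's warehouse:

    from datetime import date

    def apply_scd2(dimension, incoming, key, tracked, today=None):
        """Expire changed dimension rows and append new versions (SCD Type 2)."""
        today = today or date.today()
        for row in incoming:
            current = next((d for d in dimension
                            if d[key] == row[key] and d["end_date"] is None), None)
            if current is None:
                # brand-new member: open its first version
                dimension.append({**row, "start_date": today, "end_date": None})
            elif any(current[c] != row[c] for c in tracked):
                # a tracked attribute changed: close the old version, open a new one
                current["end_date"] = today
                dimension.append({**row, "start_date": today, "end_date": None})
        return dimension

    # hypothetical customer dimension: the city change creates a second version
    dim = [{"cust_id": 1, "city": "Atlanta", "start_date": date(2018, 1, 1), "end_date": None}]
    apply_scd2(dim, [{"cust_id": 1, "city": "Austin"}], key="cust_id", tracked=["city"])

In practice this logic would typically live in an Informatica mapping or a SQL MERGE statement rather than application code.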

 Required Qualifications:

  • A bachelor's degree in Computer Science or a related field
  • 6 to 10 years of experience working with OBIEE / Data Visualization / Informatica / Python
  • Ability to design and develop complex Informatica mappings, sessions, workflows and identify areas of optimizations
  • Experience with Oracle RDBMS 12c
  • Effective communication skills (both oral and written) and the ability to work effectively in a team environment are required
  • Proven ability and desire to mentor/coach others in a team environment
  • Strong analytical, problem solving and presentation skills.

Preferred Qualifications:

  • Working knowledge with Informatica Change Data Capture installed on DB2 z/OS
  • Working knowledge of Informatica Power Exchange
  • Experience with relational, multidimensional and OLAP techniques and technology
  • Experience with OBIEE tools version 10.X
  • Experience with Visualization tools like MS Power BI, Tableau, Oracle DVD
  • Experience with Python building predictive models

Soft Skills:

  • Strong written and oral communication skills in English Language
  • Ability to work with Business and communicate technical solution to solve business problems

About Vector:

Vector Consulting, Inc. (headquartered in Atlanta) is an IT Talent Acquisition Solutions firm committed to delivering results. Since our founding in 1990, we have been partnering with our customers, understanding their business, and developing solutions with a commitment to quality, reliability and value. Our continuing growth has been and continues to be built around successful relationships that are based on our organization's operating philosophy and commitment to People, Partnerships, Purpose and Performance - THE VECTOR WAY.

Visa
  • Austin, TX
Company Description
Visa operates the world's largest retail electronic payments network and is one of the most recognized global financial services brands. Visa facilitates global commerce through the transfer of value and information among financial institutions, merchants, consumers, businesses and government entities. We offer a range of branded payment product platforms, which our financial institution clients use to develop and offer credit, charge, deferred debit, prepaid and cash access programs to cardholders. Visa's card platforms provide consumers, businesses, merchants and government entities with a secure, convenient and reliable way to pay and be paid in 170 countries and territories.
Job Description
At Visa University, our mission is to turn our learning data into insights and get a deep understanding of how people use our resources to impact the product, strategy and direction of Visa University. In order to help us achieve this we are looking for someone who can build and scale an efficient analytics data suite and also deliver impactful dashboards and visualizations to track strategic initiatives and enable self-service insight delivery. The Staff Software Engineer, Learning & Development Technology is an individual contributor role within Corporate IT in our Austin-based Technology Hub. In this role you will participate in design, development, and technology delivery projects with many leadership opportunities. Additionally, this position provides application administration and end-user support services. There will be significant collaboration with business partners, multiple Visa IT teams and third-party vendors. The portfolio includes SaaS and hosted packaged applications as well as multiple content providers such as Pathgather (Degreed), Cornerstone, Watershed, Pluralsight, Lynda, Safari, and many others.
The ideal candidate will bring energy and enthusiasm to evolve our learning platforms, be able to easily understand business goals/requirements and be forward thinking to identify opportunities that may be effectively resolved with technology solutions. We believe in leading by example, ownership with high standards and being curious and creative. We are looking for an expert in business intelligence, data visualization and analytics to join the Visa University family and help drive a data-first culture across learning.
Responsibilities
  • Engage with product managers, design team and student experience team in Visa University to ensure that the right information is available and accessible to study user behavior, to build and track key metrics, to understand product performance and to fuel the analysis of experiments
  • Build lasting solutions and datasets to surface critical data and performance metrics and optimize products
  • Build and own the analytics layer of our data environment to make data standardized and easily accessible
  • Design, build, maintain and iterate a suite of visual dashboards to track key metrics and enable self-service data discovery
  • Participate in technology project delivery activities such as business requirement collaboration, estimation, conceptual approach, design, development, test case preparation, unit/integration test execution, support process documentation, and status updates
  • Participate in vendor demo and technical deep dive sessions for upcoming projects
  • Collaborate with, and mentor, data engineers to build efficient data pipelines and impactful visualizations
Qualifications
  • Minimum 8 years of experience in a business intelligence, data analysis or data visualization role and a degree in science, computer science, statistics, economics, mathematics, or similar
  • Significant experience in designing analytical data layers and in conducting ETL with very large and complex data sets
  • Expertise with Tableau desktop software (techniques such as LOD calculations, calculated fields, table calculations, and dashboard actions)
  • Expert in data visualization
  • High level of ability with JSON and SQL
  • Experience with Python is a must; experience with data science libraries is a plus (NumPy, Pandas, SciPy, scikit-learn, NLTK, deep learning with Keras) - see the modeling sketch after this list
  • Experience with Machine Learning algorithms (Linear Regression, Multiple Regression, Decision Trees, Random Forest, Logistic Regression, Naive Bayes, SVM, K-means, K-nearest neighbor, Hierarchical Clustering)
  • Experience with HTML and JavaScript
  • Basic SFTP and encryption knowledge
  • Experience with Excel (Vlookups, pivots, macros, etc.)
  • Experience with xAPI is a plus
  • Ability to leverage HR systems such as Workday and Salesforce to execute the above responsibilities
  • Understanding of statistical analysis, quantitative aptitude and the ability to gather and interpret data and information
  • You have a strong business sense and you are able to translate business problems to data driven solutions with minimal oversight
  • You are a communicative person who values building strong relationships with colleagues and stakeholders, enjoys mentoring and teaching others and you have the ability to explain complex topics in simple terms
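
As a concrete illustration of the modeling skills listed above, here is a minimal scikit-learn sketch using logistic regression, one of the algorithms named; the features and labels are synthetic stand-ins for learning data:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))                   # stand-ins for learner-activity features
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # synthetic "completed course" label

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LogisticRegression().fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))
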
Additional Information
All your information will be kept confidential according to EEO guidelines.
Job Number: REF15081Q
Brighter Brain
  • Atlanta, GA

Brighter Brain is seeking a skilled professional to serve as an internal resource for our consulting firm in the field of Data Science Development. Brighter Brain provides Fortune 500 clients throughout the United States with IT consultants in a wide-ranging technical sphere.

To fully support our incoming nationwide and international hires, we will be hiring a Senior Data Science SME (ML) with practical experience to relocate to Atlanta and coach/mentor our incoming classes of consultants. If you have a strong passion for the Data Science platform and are looking to join a seasoned team of IT professionals, this could be an advantageous next step.

Brighter Brain is an IT Management & Consulting firm providing a unique take on IT Consulting. We currently offer expertise to US clients in the fields of Mobile Development (iOS and Android), Hadoop, Microsoft SharePoint, and Exchange/Office 365. We are currently seeking a highly skilled professional to serve as an internal resource for our company in the field of Data Science, with expertise in Machine Learning (ML).

The ideal candidate will be responsible for establishing our Data Science practice. The responsibilities include creation of a comprehensive training program and training, mentoring, and supporting ideal candidates as they progress towards building their career in Data Science Consulting. This position is based out of our head office in Atlanta, GA.

If you have a strong passion for Data Science and are looking to join a seasoned team of IT professionals, this could be an advantageous next step.

The Senior Data Science SME will take on the following responsibilities:

-       Design, develop and maintain Data Science training material, focused around ML. (Knowledge of DL, NN & NLP is a plus.)

-       Interview potential candidates to ensure that they will be successful in the Data Science domain and training.

-       Train, guide and mentor junior to mid-level Data Science developers.

-       Prepare mock interviews to enhance the learning process provided by the company.

-       Prepare and support consultants for interviews for specific assignments involving development and implementation of Data Science.

-       Act as a primary resource for individuals working on a variety of projects throughout the US.

-       Interact with our Executive and Sales team to ensure that projects and employees are appropriately matched.

The ideal candidate will not only possess solid knowledge of the realm, but must also have fluency in the following areas:

-       Hands-on expertise in applying Data Science and building machine learning and deep learning models

-       Statistics and data modeling experience

-       Strong understanding of data sciences

-       Understanding of Big Data

-       Understanding of AWS and/or Azure

-       Understand the differences between TensorFlow, MXNet, etc.

Skills Include:

  • Master's Degree in Computer Science or a mathematics field
  • 10+ years of professional experience in the IT industry, in the AI realm
  • Strong understanding of MongoDB, Scala, Node.js, AWS, & Cognitive applications
  • Excellent knowledge of Python, Scala, JavaScript and its libraries, Node.js, R, MATLAB, C/C++, Lua or any proficient AI language of choice
  • NoSQL databases, bot frameworks, data streaming and integrating unstructured data; rules engines (e.g. Drools), ESBs (e.g. MuleSoft)
  • Computer Vision, Recommendation Systems, Pattern Recognition, Large Scale Data Mining or Artificial Intelligence, Neural Networks
  • Deep Learning frameworks like TensorFlow, Torch, Caffe, Theano, CNTK; scikit-learn, NumPy, SciPy
  • Working knowledge of ML techniques such as Naïve Bayes Classification, Ordinary Least Squares Regression, Logistic Regression, Support Vector Machines, Ensemble Methods, Clustering Algorithms, Principal Component Analysis, Singular Value Decomposition, and Independent Component Analysis
  • Natural Language Processing (NLP) concepts such as topic modeling (see the sketch after this list), intents, entities, and NLP frameworks such as spaCy, NLTK, MeTA, gensim or other toolkits for Natural Language Understanding (NLU)
  • Experience with data profiling, data cleansing, data wrangling/munging, ETL
  • Familiarity with Spark MLlib and Mahout, and with the Google, Bing, and IBM Watson APIs
  • Hands-on experience training a variety of consultants, as needed
  • Analytical and problem-solving skills
  • Knowledge of the IoT space
  • Understand Academic Data Science vs. Corporate Data Science
  • Knowledge of the Consulting/Sales structure
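
The sketch below illustrates topic modeling, one of the NLP concepts listed above, using gensim's LdaModel on a toy corpus; the documents are invented for the example:

    from gensim import corpora, models

    docs = [["data", "science", "model", "training"],
            ["neural", "network", "deep", "learning"],
            ["data", "model", "deep", "network"]]
    dictionary = corpora.Dictionary(docs)                 # token -> id mapping
    bow = [dictionary.doc2bow(doc) for doc in docs]       # bag-of-words corpus
    lda = models.LdaModel(bow, num_topics=2, id2word=dictionary, random_state=0)
    for topic_id, words in lda.print_topics():
        print(topic_id, words)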

Additional details about the position:

-       Able to relocate to Atlanta, GA (relocation package available)

-       Work schedule of 9 AM to 6 PM EST

Questions: Send your resume to Ansel Butler at Brighter Brain; make sure that there is a valid phone number and Skype ID either on the resume, or in the body of the email.

Ansel Essic Butler

EMAIL: ANSEL.BUTLER@BRIGHTERBRAIN.COM

404 791 5128

SKYPE: ANSEL.BUTLER@OUTLOOK.COM

Senior Corporate Recruiter

Brighter Brain LLC.

1785 The Exchange, Suite 200

Atlanta, GA. 30339

Webtrekk GmbH
  • Berlin, Germany
Your responsibilities:

In this role, you will set up your own full-fledged research and development team of developers and data science engineers. You will evaluate and choose appropriate technologies and develop products that are powered by Artificial Intelligence and Machine Learning.



  • Fast pace development of experimental prototypes, POCs and products for our >400 customers

  • Manage fast feedback cycles, incorporate learnings and feedback, and ultimately deliver AI-powered products

  • You will develop new components and optimise existing ones, always with an eye on scalability, performance and maintenance

  • Organize and lead team planning meetings and provide advice, clarification and guidance during the execution of sprints

  • Lead your team's technical vision and drive the design and development of new innovative products and services from the technical side

  • Lead discussions with the team and management to define best practices and approaches

  • Set goals, objectives and priorities. Mentor team members and provide guidance by regular performance reviews.




The assets you bring to the team:


  • Hands on experience in agile software development on all levels based on profound technical understanding

  • Relevant experience in managing a team of software developers in an agile environment

  • At least 3 years of hands-on experience with developing in Frontend Technologies like Angular or React

  • Knowledge of backend technologies such as Java, Python or Scala is a big plus

  • Experience with distributed systems based on RESTful services

  • DevOps mentality and practical experience with tools for build and deployment automation (like Maven, Jenkins, Ansible, Docker)

  • Team and project-oriented leader with excellent problem solving and interpersonal skills

  • Excellent communication, coaching and conflict management skills as well as strong assertiveness

  • Strong analytical capability, discipline, commitment and enthusiasm

  • Fluent in English; German language skills are a big plus




What we offer:


  • Prospect: We are a continuously growing team with experts in the most future-oriented fields of customer intelligence. We are dealing with real big data scenarios and data from various business models and industries. Apart from interesting tasks we offer you considerable freedom for your ideas and perspectives for the development of your professional and management skills.

  • Team oriented atmosphere: Our culture embraces integrity, team work and innovation. Our employees value the friendly atmosphere that is the most powerful driver within our company.

  • Goodies: Individual trainings, company tickets, team events, table soccer, fresh fruits and a sunny roof terrace.

  • TechCulture: Work with experienced developers who share the ambition for well-written and clean code. Choose your hardware, OS and IDE. Bring in your own ideas, work with open source and have fun at product demos, hackathons and meetups.

Citizens Advice
  • London, UK
  • Salary: £40k - 45k

As a Database engineer in the DevOps team here at Citizens Advice you will help us develop and implement our data strategy. You will have the opportunity to work with both core database technologies and big data solutions.


Past


Starting from scratch, we have built a deep tech-stack with AWS services at its core. We created a new CRM system, migrated a huge amount of data to AWS Aurora PG and used AWS RDS to run some of our business critical databases.


You will have gained a solid background and in-depth knowledge of AWS RDS and of SQL and administration against DBMSs such as PostgreSQL, MySQL, SQL Server, Dynamo and Aurora. You will have dealt with Data Warehousing, ETL, DB Mirroring/Replication, and DB Security Mechanisms & Techniques.


Present


We use AWS RDS including Aurora as the standard DB implementation for our applications. We parse data in S3 using Spark jobs and we are planning to implement a data lake solution in AWS.


Our tools and technologies include:



  • Postgres on AWS RDS

  • SQL Server for our Data Warehouse

  • Liquibase for managing the DW schema

  • Jenkins 2 for task automation

  • Spark / Parquet / AWS Glue for parsing raw data (see the sketch after this list)

  • Docker / docker-compose for local testing
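
A minimal sketch of the Spark/Parquet step described above, assuming hypothetical bucket names rather than our actual layout (reading from S3 also requires the hadoop-aws package on the Spark classpath):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("parse-raw-data").getOrCreate()

    raw = spark.read.json("s3a://example-raw-bucket/events/")      # raw events in S3
    cleaned = raw.dropDuplicates(["event_id"]).filter(raw.event_type.isNotNull())
    cleaned.write.mode("overwrite").parquet("s3a://example-curated-bucket/events/")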


You will be developing, supporting and maintaining automation tools to drive database, reporting and maintenance tasks.


As part of our internal engineering platform offering, R&D time will give you the opportunity to develop POC solutions to integrate with the rest of the business.


Future


You will seek continuous improvement and implement solutions to help Citizens Advice deliver digital products better and quicker.


You will be helping us implement a data lake solution to improve operations and to offer innovative services.


You will have dedicated investment time at Citizens Advice to learn new skills, technologies, research topics or work on tools that make this possible.

Man AHL
  • London, UK

The Role


As a Quant Platform Developer at AHL you will be building the tools, frameworks, libraries and applications which power our Quantitative Research and Systematic Trading. This includes responsibility for the continued success of “Raptor”, our in-house Quant Platform, next generation Data Engineering, and evolution of our production Trading System as we continually expand the markets and types of assets we trade, and the styles in which we trade them. Your challenges will be varied and might involve building new high performance data acquisition and processing pipelines, cluster-computing solutions, numerical algorithms, position management systems, visualisation and reporting tools, operational user interfaces, continuous build systems and other developer productivity tools.


The Team


Quant Platform Developers at AHL are all part of our broader technology team, members of a group of over sixty individuals representing eighteen nationalities. We have varied backgrounds including Computer Science, Mathematics, Physics, Engineering – even Classics - but what unifies us is a passion for technology and writing high-quality code.



Our developers are organised into small cross-functional teams, with our engineering roles broadly of two kinds: “Quant Platform Developers” otherwise known as our “Core Techs”, and “Quant Developers” which we often refer to as “Sector Techs”. We use the term “Sector Tech” because some of our teams are aligned with a particular asset class or market sector. People often rotate teams in order to learn more about our system, as well as find the position that best matches their interests.


Our Technology


Our systems are almost all running on Linux and most of our code is in Python, with the full scientific stack: numpy, scipy, pandas, scikit-learn to name a few of the libraries we use extensively. We implement the systems that require the highest data throughput in Java. For storage, we rely heavily on MongoDB and Oracle.



We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log shipping and monitoring, Docker for containerisation, OpenStack for our private cloud, Ansible for architecture automation, and HipChat for internal communication. But our technology list is never static: we constantly evaluate new tools and libraries.
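
As an illustration of the workflow management mentioned above, here is a minimal Airflow DAG sketch, assuming a recent Airflow 2.x install; the DAG and task names are hypothetical:

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def load_market_data():
        print("pulling market data...")   # placeholder for a real loader

    with DAG(dag_id="market_data_daily",
             start_date=datetime(2024, 1, 1),
             schedule="@daily",           # keyword name varies by Airflow version
             catchup=False) as dag:
        load = PythonOperator(task_id="load_market_data",
                              python_callable=load_market_data)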


Working Here


AHL has a small company, no-attitude feel. It is flat structured, open, transparent and collaborative, and you will have plenty of opportunity to grow and have enormous impact on what we do.  We are actively engaged with the broader technology community.



  • We host and sponsor London’s PyData and Machine Learning Meetups

  • We open-source some of our technology. See https://github.com/manahl

  • We regularly talk at leading industry conferences, and tweet about relevant technology and how we’re using it. See @manahltech



We’re fortunate enough to have a fantastic open-plan office overlooking the River Thames, and continually strive to make our environment a great place in which to work.



  • We organise regular social events, everything from photography through climbing, karting, wine tasting and monthly team lunches

  • We have annual away days and off-sites for the whole team

  • We have a canteen with a daily allowance for breakfast and lunch, and an on-site bar for in the evening

  • As well as PCs and Macs, in our office you’ll also find numerous pieces of cool tech such as light cubes and 3D printers, guitars, ping-pong and table-football, and a piano.



We offer competitive compensation, a generous holiday allowance, various health and other flexible benefits. We are also committed to continuous learning and development via coaching, mentoring, regular conference attendance and sponsoring academic and professional qualifications.


Technology and Business Skills


At AHL we strive to hire only the brightest, best, most highly skilled and passionate technologists.



Essential



  • Exceptional technology skills; recognised by your peers as an expert in your domain

  • A proponent of strong collaborative software engineering techniques and methods: agile development, continuous integration, code review, unit testing, refactoring and related approaches

  • Expert knowledge in one or more programming languages, preferably Python, Java and/or C/C++

  • Proficient on Linux platforms with knowledge of various scripting languages

  • Strong knowledge of one or more relevant database technologies e.g. Oracle, MongoDB

  • Proficient with a range of open source frameworks and development tools e.g. NumPy/SciPy/Pandas, Pyramid, AngularJS, React

  • Familiarity with a variety of programming styles (e.g. OO, functional) and in-depth knowledge of design patterns.



Advantageous



  • An excellent understanding of financial markets and instruments

  • Experience of front office software and/or trading systems development e.g. in a hedge fund or investment bank

  • Expertise in building distributed systems with service-based or event-driven architectures, and concurrent processing

  • A knowledge of modern practices for data engineering and stream processing

  • An understanding of financial market data collection and processing

  • Experience of web based development and visualisation technology for portraying large and complex data sets and relationships

  • Relevant mathematical knowledge e.g. statistics, asset pricing theory, optimisation algorithms.


Personal Attributes



  • Strong academic record and a degree with high mathematical and computing content e.g. Computer Science, Mathematics, Engineering or Physics from a leading university

  • Craftsman-like approach to building software; takes pride in engineering excellence and instils these values in others

  • Demonstrable passion for technology e.g. personal projects, open-source involvement

  • Intellectually robust with a keenly analytic approach to problem solving

  • Self-organised with the ability to effectively manage time across multiple projects and with competing business demands and priorities

  • Focused on delivering value to the business with relentless efforts to improve process

  • Strong interpersonal skills; able to establish and maintain a close working relationship with quantitative researchers, traders and senior business people alike

  • Confident communicator; able to argue a point concisely and deal positively with conflicting views.

Comcast
  • Philadelphia, PA

Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast.

Job Summary:

Comcast Interactive Media (CIM) is seeking a Software Engineering Senior Manager to join our Digital Home Application Engineering team.  This group is responsible for customer-facing mobile, web and set-top box application development for Comcast’s Home Security, Home Network Management and Internet of Things products.  This includes the native iOS and Android Xfinity Home applications in the iTunes App and Google Play stores, a native Android application on the Xfinity Home Hub (Touch Screen), a Polymer Web Component based web portal, and a Java native X1 Set Top Box application. We are passionate about our products, our engineering, and our people! #DigitalHome

In this role, you will directly manage 5-10 people including full-time-employees and contractors of Mobile Software Engineers and Test Engineers.  These Engineers range in skill-level and experience from Principal to Junior and Entry-Level.  You will collaborate with cross-functional teams (management, web, mobile and API developers, UX designers, test engineers, operations engineers, product and business managers) to lead project initiatives for Digital Home applications.  You must be a strong technical leader, an independent, critical and analytic thinker, have excellent communication skills and rapidly adapt to changing business and customer demands.

REQUIREMENTS:

- 1+ years managing software engineers, setting goals and developing career plans

- 4+ years professionally developing native mobile applications

- Excellent interpersonal and relationship-building skills

- Knowledge of Software Development Life Cycle and Agile methodologies

- Knowledge of Continuous Integration and Deployment tools (Jenkins, TeamCity)

- Fundamental understanding of HTTP protocol and caching

- Experience with HTTP and RESTful web services

- A strong sense of ownership and responsibility for code quality that follows best practices

- A strong passion for learning and adapting to new technologies

CORE RESPONSIBILITIES:
 

- Directly managing multiple Engineering teams (approx. 5-10 Engineers including full time employees and contractors)

- Providing regular feedback and coaching regarding both job performance and career development

- Directing the development and release of multiple projects across Digital Home mobile applications

- Providing guidance to engineering teams and other departments in identifying product and technical requirements, contributing to functional strategy development

- Self-development via an awareness of one’s strengths and weaknesses, and openness to others’ viewpoints

- Demonstrating interest in others, encouraging adaptive development and promoting a sustainable team culture.

- Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) as necessary.
- Other duties and responsibilities as assigned.

Comcast is an EOE/Veterans/Disabled/LGBT employer

Comcast
  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

As a Data Science Engineer in Comcastdx, you will research, model, develop, support data pipelines and deliver insights for key strategic initiatives. You will develop or utilize complex programmatic and quantitative methods to find patterns and relationships in data sets; lead statistical modeling, or other data-driven problem-solving analysis to address novel or abstract business operation questions; and incorporate insights and findings into a range of products.

Assist in design and development of collection and enrichment components focused on quality, timeliness, scale and reliability. Work on real-time data stores and a massive historical data store using best-of-breed and industry leading technology.

Responsibilities:

-Develop and support data pipelines

-Analyze massive amounts of data both in real-time and batch processing utilizing Spark, Kafka, and AWS technologies such as Kinesis, S3, Elasticsearch, and Lambda (see the streaming sketch after this list)

-Create detailed write-ups of the processes, logic, and methodologies used for creation, validation, analysis, and visualizations. Write-ups shall occur initially, within a week of when a process is created, and be updated in writing when changes occur.

-Prototype ideas for new ML/AI tools, products and services

-Centralize data collection and synthesis, including survey data, enabling strategic and predictive analytics to guide business decisions

-Provide expert and professional data analysis to implement effective and innovative solutions meshing disparate data types to discover insights and trends.

-Employ rigorous continuous delivery practices managed under an agile software development approach

-Support DevOps practices to deploy and operate our systems

-Automate and streamline our operations and processes

-Troubleshoot and resolve issues in our development, test and production environments
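
As a sketch of the real-time processing described above, the following reads a Kafka topic with Spark Structured Streaming; the broker address and topic name are placeholders, and the spark-sql-kafka package must be on the classpath:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("events-stream").getOrCreate()

    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "broker:9092")
              .option("subscribe", "events")
              .load())

    # decode the raw Kafka value and stream it to the console sink
    query = (events.selectExpr("CAST(value AS STRING) AS payload")
             .writeStream
             .format("console")
             .start())
    query.awaitTermination()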

Here are some of the specific technologies and concepts we use:

-Spark Core and Spark Streaming

-Machine learning techniques and algorithms

-Java, Scala, Python, R

-Artificial Intelligence

-AWS services including EMR, S3, Lambda, ElasticSearch

-Predictive Analytics

-Tableau, Kibana

-Git, Maven, Jenkins

-Linux

-Kafka

-Hadoop (HDFS, YARN)

Skills & Requirements:

-5-8 years of Java experience, Scala and Python experience a plus

-3+ years of experience as an analyst, data scientist, or related quantitative role.

-3+ years of relevant quantitative and qualitative research and analytics experience. Solid knowledge of statistical techniques.

-Bachelor's in Statistics, Math, Engineering, Computer Science or a related discipline. Master's degree preferred.

-Experience in software development of large-scale distributed systems including proven track record of delivering backend systems that participate in a complex ecosystem

-Experience with more advanced modeling techniques (e.g., ML).

-Distinctive problem solving and analysis skills and impeccable business judgement.

-Experience working with imperfect data sets that, at times, will require improvements to process, definition and collection

-Experience with real-time data pipelines and components including Kafka, Spark Streaming

-Proficient in Unix/Linux environments

-Test-driven development/test automation, continuous integration, and deployment automation

-Excellent communicator, able to analyze and clearly articulate complex issues and technologies understandably and engagingly

-Team player is a must

-Great design and problem-solving skills

-Adaptable, proactive and willing to take ownership

-Attention to detail and high level of commitment

-Thrives in a fast-paced agile environment

About Comcast dx:

Comcast dx is a results-driven big data engineering team responsible for delivery of the multi-tenant data infrastructure and platforms necessary to support our data-driven culture and organization. dx has an overarching objective to gather, organize, and make sense of Comcast data with the intention to reveal business and operational insight, discover actionable intelligence, enable experimentation, empower users, and delight our stakeholders. Members of the dx team define and leverage industry best practices, work on large-scale data problems, design and develop resilient and highly robust distributed data organizing and processing systems and pipelines, as well as research, engineer, and apply data science and machine intelligence disciplines.

Comcast is an EOE/Veterans/Disabled/LGBT employer

Comcast
  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

As a Data Science Engineer in Comcastdx, you will research, model, develop, support data pipelines and deliver insights for key strategic initiatives. You will develop or utilize complex programmatic and quantitative methods to find patterns and relationships in data sets; lead statistical modeling, or other data-driven problem-solving analysis to address novel or abstract business operation questions; and incorporate insights and findings into a range of products.

Assist in design and development of collection and enrichment components focused on quality, timeliness, scale and reliability. Work on real-time data stores and a massive historical data store using best-of-breed and industry leading technology.

Responsibilities:

-Develop and support data pipelines

-Analyze massive amounts of data both in real-time and batch processing utilizing Spark, Kafka, and AWS technologies such as Kinesis, S3, Elasticsearch, and Lambda (see the Kinesis sketch after this list)

-Create detailed write-ups of the processes, logic, and methodologies used for creation, validation, analysis, and visualizations. Write-ups shall occur initially, within a week of when a process is created, and be updated in writing when changes occur.

-Prototype ideas for new ML/AI tools, products and services

-Centralize data collection and synthesis, including survey data, enabling strategic and predictive analytics to guide business decisions

-Provide expert and professional data analysis to implement effective and innovative solutions meshing disparate data types to discover insights and trends.

-Employ rigorous continuous delivery practices managed under an agile software development approach

-Support DevOps practices to deploy and operate our systems

-Automate and streamline our operations and processes

-Troubleshoot and resolve issues in our development, test and production environments
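
For the AWS side of the stack described above, here is a minimal boto3 sketch that reads records from a Kinesis stream; the stream name is a placeholder:

    import boto3

    kinesis = boto3.client("kinesis")
    stream = "example-events"

    # read the most recent records from the first shard
    shard = kinesis.describe_stream(StreamName=stream)["StreamDescription"]["Shards"][0]
    iterator = kinesis.get_shard_iterator(StreamName=stream,
                                          ShardId=shard["ShardId"],
                                          ShardIteratorType="LATEST")["ShardIterator"]
    records = kinesis.get_records(ShardIterator=iterator, Limit=100)["Records"]
    for record in records:
        print(record["Data"])   # raw bytes as written by the producer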

Here are some of the specific technologies and concepts we use:

-Spark Core and Spark Streaming

-Machine learning techniques and algorithms

-Java, Scala, Python, R

-Artificial Intelligence

-AWS services including EMR, S3, Lambda, ElasticSearch

-Predictive Analytics

-Tableau, Kibana

-Git, Maven, Jenkins

-Linux

-Kafka

-Hadoop (HDFS, YARN)

Skills & Requirements:

-3-5 years of Java experience, Scala and Python experience a plus

-2+ years of experience as an analyst, data scientist, or related quantitative role.

-2+ years of relevant quantitative and qualitative research and analytics experience. Solid knowledge of statistical techniques.

-Bachelor's in Statistics, Math, Engineering, Computer Science or a related discipline. Master's degree preferred.

-Experience in software development of large-scale distributed systems including proven track record of delivering backend systems that participate in a complex ecosystem

-Experience with more advanced modeling techniques (e.g., ML).

-Distinctive problem solving and analysis skills and impeccable business judgement.

-Experience working with imperfect data sets that, at times, will require improvements to process, definition and collection

-Experience with real-time data pipelines and components including Kafka, Spark Streaming

-Proficient in Unix/Linux environments

-Test-driven development/test automation, continuous integration, and deployment automation

-Excellent communicator, able to analyze and clearly articulate complex issues and technologies understandably and engagingly

-Team player is a must

-Great design and problem-solving skills

-Adaptable, proactive and willing to take ownership

-Attention to detail and high level of commitment

-Thrives in a fast-paced agile environment

About Comcast dx:

Comcast dx is a results-driven big data engineering team responsible for delivery of the multi-tenant data infrastructure and platforms necessary to support our data-driven culture and organization. dx has an overarching objective to gather, organize, and make sense of Comcast data with the intention to reveal business and operational insight, discover actionable intelligence, enable experimentation, empower users, and delight our stakeholders. Members of the dx team define and leverage industry best practices, work on large-scale data problems, design and develop resilient and highly robust distributed data organizing and processing systems and pipelines, as well as research, engineer, and apply data science and machine intelligence disciplines.

Comcast is an EOE/Veterans/Disabled/LGBT employer

Infosys
  • Houston, TX
Responsibilities

-Hands on experience with Big Data systems, building ETL pipelines, data processing, and analytics tools
-Understanding of data structures & common methods in data transformation.
-Familiar with the concepts of dimensional modeling.
-Sound knowledge of one programming language - Python or Java
-Programming experience using tools such as Hadoop and Spark.
-Strong proficiency in using query languages such as SQL, Hive and SparkSQL
-Experience in Kafka & Scala would be a plus (see the sketch after this list)
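
A minimal sketch of Kafka messaging, the "plus" skill above, using the kafka-python package; the broker address and topic are placeholders:

    from kafka import KafkaProducer, KafkaConsumer

    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    producer.send("clickstream", b'{"user": 42, "action": "page_view"}')
    producer.flush()   # block until the message is actually delivered

    consumer = KafkaConsumer("clickstream",
                             bootstrap_servers="localhost:9092",
                             auto_offset_reset="earliest",
                             consumer_timeout_ms=5000)   # stop after 5s of silence
    for message in consumer:
        print(message.value)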

Impetus
  • Phoenix, AZ

      Multiple Positions | Multiple Locations: Phoenix, AZ / Richmond, VA / Tampa, FL

      Employment Type: Full time or Contract


      As a Big Data Engineer, you will have the opportunity to make a significant impact, both to our organization and those of our Fortune 500 clients. You will work directly with clients at the intersection of business and technology. You will leverage your experience with Hadoop and software engineering to help our clients use insights gleaned from their data to drive Value.


      You will also be given substantial opportunities to develop and expand your technical skillset with emerging Big Data technologies so you can continually innovate, learn, and hit the gas pedal on your career.



Required:
  • 4+ years of IT experience
  • Very good experience with Hadoop, Hive and Spark batch processing (streaming experience is good to have)
  • Experience with one NoSQL store (HBase/Cassandra) is good to have
  • Experience with Java/J2EE & Web Services; Scala/Python is good to have
  • AWS experience (ETL implementation with AWS on Hadoop) is good to have
  • Writing utilities/program to enhance product capability to fulfill specific customer requirement
  • Learning new technology/solution to solve customer problems
  • Provide feedback/learning to product team


Soft Skills:

    A team player who understands the roles and responsibilities of all the team members and facilitates a one-team culture
    Strong communication skills, both verbal and written
    A quick learner who can work independently on the tasks assigned after initial hand-holding
Migo
  • Taipei, Taiwan

  • Responsibilities

    • Collaborate with data scientists to bring statistical, predictive machine learning and AI models to production scale and continuously optimize performance.

    • Design, build, optimize, launch and support new and existing data models and ETL processes in production based on data products and stakeholder needs.

    • Define and manage SLA and accuracy for all data sets in allocated areas of ownership.

    • Design and continuously improve data infrastructure; identify infrastructure issues and drive them to resolution.

    • Support software development team to build and maintain data collectors in Migo application ecosystem based on data warehouse and analytics user requirements.





  • Basic Qualification:

    • Bachelor's degree in Computer Science, Information Management or related field.

    • 2+ years hands-on experience in the data warehouse space, custom ETL design, implementation and maintenance.

    • 2+ years hands-on experience in SQL or similar languages and development experience in at least one scripting language (Python preferred).

    • Strong data architecture, data modeling, schema design and effective project management skills.

    • Excellent communication skills and proven experience in leading data driven projects from definition through interpretation and execution.

    • Experience with large data sets and data profiling techniques.

    • Ability to initiate and drive projects, and communicate data warehouse plans to internal clients/stakeholders.





  • Preferred Qualification:

    • Experience with big data and distributed computing technology such as Hive, Spark, Presto, Parquet

    • Experience building and maintaining production-level data lakes with Hadoop clusters or AWS S3.

    • Experience with batch processing and streaming data pipeline/architecture design patterns such as lambda architecture or kappa architecture.
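
As a toy illustration of the lambda architecture named above, a serving layer can merge a precomputed batch view with fresh speed-layer increments; the data below is invented:

    import pandas as pd

    batch_view = pd.DataFrame({"user": [1, 2], "events": [100, 250]})   # nightly batch job
    speed_view = pd.DataFrame({"user": [2, 3], "events": [5, 7]})       # streaming increments

    # serving layer: user totals = batch counts + real-time deltas
    serving = (pd.concat([batch_view, speed_view])
               .groupby("user", as_index=False)["events"].sum())
    print(serving)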









Catapult Systems
  • Houston, TX

High performing team members. Challenging projects. A stable and profitable company. And a great place to work! This is what you can expect if you join the Catapult Systems team. Founded in 1993 and headquartered in Austin, Texas, Catapult is an award-winning Microsoft National Solution Provider and was recently named the Microsoft Partner of the Year (U.S.) and Microsoft Partner of the Year Finalist in Cloud Productivity.


What do we attribute our award-winning success to? The people we hire, of course! We provide you the tools and leadership that you need to be successful, and then let you do what you do best. We enable you to make the decisions that you feel are in the best interest of our clients, and we trust your judgment. This type of ownership and independence, and an ongoing commitment to solving real business problems through the innovative use of Microsoft technologies, has resulted in Catapult being voted one of the best places to work year after year!


It is a very exciting time of growth for Catapult Systems, and we are currently hiring a Data Analytics Developer to provide technical leadership for our expanding team.

What will my role be?


As a Data Analytics Developer you will work with customers to identify opportunities and scenarios where Power BI and Azure Data Services can benefit their business. You will deliver short and long term projects utilizing strong business, technical and data modeling skills.

Responsibilities will include:

    • Working with customers to analyze business requirements and define functional specifications
    • Facilitating client meetings and strategy sessions
    • Providing pre-sales technical support by attending sales calls and creating demos for customers
    • Support and implementation of Data Analytics projects

What's required?

    • First and foremost, you should enjoy what you do and enjoy working in teams!
    • Ability to engage in customer settings and discern client business needs
    • Strong working knowledge and track record of Data Analytics development
    • 5+ years of experience sourcing, star schema & relational data modeling, ETL and processing
    • Expert level knowledge around SSIS, SSAS, SSRS, PowerBI and tools such as SSMS and SSDT
    • Experience supporting large scale analytical platforms
    • Experience designing automated processing, data validation, error checks and alerts, and performance testing techniques
    • Experience working with SQL Azure and cloud data solutions
    • 5+ years of experience with Microsoft SQL Server and proficiency in T-SQL (see the sketch after this list)
    • 1+ years of experience in migrating from on-prem to cloud (PaaS or IaaS)
    • Excellent presentation, verbal and written communication, and time management skills
    • Ability to travel up to 25%
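
A minimal sketch of the T-SQL proficiency called for above, run from Python via pyodbc; the server, database, and star-schema tables are assumptions for the example:

    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=localhost;DATABASE=SalesDW;Trusted_Connection=yes;")
    cursor = conn.cursor()

    # T-SQL aggregation against a hypothetical fact/dimension pair
    cursor.execute("""
        SELECT TOP 10 c.Region, SUM(f.Amount) AS Revenue
        FROM FactSales f JOIN DimCustomer c ON c.CustomerKey = f.CustomerKey
        GROUP BY c.Region ORDER BY Revenue DESC;
    """)
    for region, revenue in cursor.fetchall():
        print(region, revenue)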

What else would make me stand out?

    • Previous consulting experience
    • Knowledge of database optimization techniques
    • Experience with Python and/or R
    • Proficiency in MDX and/or DAX queries
    • Experience with Microsoft Office 365 and cloud data solutions
    • Reporting experience with SharePoint and CRM
    • Relevant Microsoft Certifications and Non-Microsoft data platform certifications
    • Ability to work with cloud and hybrid environments
    • Good understanding of statistics
    • Knowledge in government analytics and policy objectives
    • Experience working with Big Data technologies and NoSQL
    • Multidimensional and Tabular Cube design, development, performance tuning and troubleshooting
    • Experience working with Data Visualization, Auditing, Data Validation, and Data Mining

So what are you waiting for? If you are passionate about being a leader and want to work with smart people who are committed to accomplishing great things, then apply today!

Catapult offers an outstanding benefits package including 401(k) match, paid time off, flex spending accounts, identity theft protection, and medical, dental, and life insurance just to name a few.

Catapult was recently named a Texas Monthly magazine Best Place to Work!

FlixBus
  • Berlin, Germany

Your Tasks – Paint the world green



  • Holistic cloud-based infrastructure automation

  • Distributed data processing clusters as well as data streaming platforms based on Kafka, Flink and Spark

  • Microservice platforms based on Docker

  • Development infrastructure and QA automation

  • Continuous Integration/Delivery/Deployment


Your Profile – Ready to hop on board



  • Experience in building and operating complex infrastructure

  • Expert-level: Linux, System Administration

  • Experience with Cloud Services, Expert-Level with either AWS or GCP  

  • Experience with server and operating-system-level virtualization is a strong plus, in particular practical experience with Docker and cluster technologies like Kubernetes, AWS ECS and OpenShift

  • Mindset: "Automate Everything", "Infrastructure as Code", "Pipelines as Code", "Everything as Code"

  • Hands-on experience with "Infrastructure as Code" tools: Terraform, CloudFormation, Packer

  • Experience with provisioning/configuration management tools (Ansible, Chef, Puppet, Salt)

  • Experience designing, building and integrating systems for instrumentation, metrics/log collection, and monitoring: CloudWatch, Prometheus, Grafana, DataDog, ELK

  • At least basic knowledge in designing and implementing Service Level Agreements

  • Solid knowledge of Network and general Security Engineering

  • At least basic experience with systems and approaches for Test, Build and Deployment automation (CI/CD): Jenkins, TravisCI, Bamboo

  • At least basic hands-on DBA experience, experience with data backup and recovery

  • Experience with JVM-based build automation is a plus: Maven, Gradle, Nexus, JFrog Artifactory

Pyramid Consulting, Inc
  • Atlanta, GA

Job Title: Tableau Engineer

Duration: 6-12 Months+ (potential to go perm)

Location: Atlanta, GA (30328) - Onsite

Notes from Manager:

We need a data analyst who knows Tableau, scripting (JSON, Python), the Alteryx API, AWS, and analytics.

Description

The Tableau Software Engineer will be a key resource working across our Software Engineering BI/Analytics stack to ensure stability, scalability, and the delivery of valuable BI & Analytics solutions for our leadership teams and business partners. Key to this position is the ability to excel at identifying problems or analytic gaps and at mapping and implementing pragmatic solutions. An excellent blend of analytical, technical and communication skills in a team-based environment is essential for this role.

Tools we use: Tableau, Business Objects, AngularJS, OBIEE, Cognos, AWS, Opinion Lab, JavaScript, Python, Jaspersoft, Alteryx and R packages, Spark, Kafka, Scala, Oracle

Your Role:

·         Able to design, build, maintain & deploy complex reports in Tableau

·         Experience integrating Tableau into another application or native platforms is a plus

·         Expertise in Data Visualization including effective communication, appropriate chart types, and best practices.

·         Knowledge of best practices and experience optimizing Tableau for performance.

·         Experience reverse engineering and revising Tableau Workbooks created by other developers.

·         Understand basic statistical routines (mean, percentiles, significance, correlations), with the ability to apply them in data analysis (see the sketch after this list)

·         Able to turn ideas into creative & statistically sound decision support solutions
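
A minimal sketch of the basic statistical routines listed above (mean, percentiles, correlation, significance), on toy data:

    import numpy as np
    from scipy import stats

    a = np.array([12.0, 15.5, 9.8, 14.2, 11.1, 13.7])
    b = np.array([10.4, 14.9, 9.1, 13.8, 10.7, 12.9])

    print("mean:", a.mean())
    print("95th percentile:", np.percentile(a, 95))
    print("correlation:", stats.pearsonr(a, b)[0])           # Pearson r
    print("t-test p-value:", stats.ttest_ind(a, b).pvalue)   # two-sample significance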

Education and Experience:

·         Bachelor's degree in Computer Science or equivalent work experience

·         3-5 years of hands-on experience in data warehousing & BI technologies (Tableau/OBIEE/Business Objects/Cognos)

·         Three or more years of experience developing reports in Tableau

·         Have a good understanding of Tableau architecture, design, development and the end-user experience.

What We Look For:

·         Very proficient in working with large databases in Oracle; experience with Big Data technologies is a plus.

·         Deep understanding & working experience of data warehouse and data mart concepts.

·         Understanding of Alteryx and R packages is a plus

·         Experience designing and implementing high volume data processing pipelines, using tools such as Spark and Kafka.

·         Experience with Scala, Java or Python and a working knowledge of AWS technologies such as GLUE, EMR, Kinesis and Redshift preferred.

·         Excellent knowledge with Amazon AWS technologies, with a focus on highly scalable cloud-native architectural patterns, especially EMR, Kinesis, and Redshift

·         Experience with software development tools and build systems such as Jenkins

MINDBODY Inc.
  • Irvine, CA
  • Salary: $96k - 135k

The Senior Data Engineer focuses on designing, implementing and supporting new and existing data solutions: data processing and data sets to support various advanced analytical needs. You will be designing, building and supporting data pipelines that consume data from multiple different source systems and transform it into valuable and insightful information. You will have the opportunity to contribute to end-to-end platform design for our cloud architecture and work multi-functionally with operations, data science and the business segments to build batch and real-time data solutions. The role will be part of a team supporting our Corporate, Sales, Marketing, and Consumer business lines.


 
MINIMUM QUALIFICATIONS AND REQUIREMENTS:



  • 7+ years of relevant experience in one of the following areas: Data engineering, business intelligence or business analytics

  • 5-7 years of supporting a large data platform and data pipelining

  • 5+ years of experience in scripting languages like Python etc.

  • 5+ years of experience with AWS services including S3, Redshift, EMR and RDS

  • 5+ years of experience with Big Data Technologies (Hadoop, Hive, HBase, Pig, Spark, etc.)

  • Expertise in database design and architectural principles and methodologies

  • Experienced in Physical data modeling

  • Experienced in Logical data modeling

  • Technical expertise should include data models, database design and data mining



PRINCIPAL DUTIES AND RESPONSIBILITIES:



  • Design, implement, and support a platform providing access to large datasets

  • Create unified enterprise data models for analytics and reporting

  • Design and build robust and scalable data integration (ETL) pipelines using SQL, Python, and Spark (a sketch follows this list).

  • As part of an Agile development team contribute to architecture, tools and development process improvements

  • Work in close collaboration with product management, peer system and software engineering teams to clarify requirements and translate them into robust, scalable, operable solutions that work well within the overall data architecture

  • Coordinate data models, data dictionaries, and other database documentation across multiple applications

  • Lead design reviews of data deliverables such as models, data flows, and data quality assessments

  • Promote data modeling standardization; define and drive adoption of the standards

  • Work with Data Management to establish governance processes around metadata to ensure an integrated definition of data for enterprise information, and to ensure the accuracy, validity, and reusability of metadata
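
As a compact illustration of the ETL pipelines described above, here is a standard-library-only Python sketch; the table and source rows are invented:

    import sqlite3

    source_rows = [("2024-01-01", "atlanta", "29.99"),
                   ("2024-01-01", "austin", "14.50")]           # extract (stubbed source)

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE daily_sales (day TEXT, city TEXT, amount REAL)")

    transformed = [(day, city.title(), float(amount))           # transform: clean types/casing
                   for day, city, amount in source_rows]
    conn.executemany("INSERT INTO daily_sales VALUES (?, ?, ?)", transformed)  # load

    for row in conn.execute("SELECT city, SUM(amount) FROM daily_sales GROUP BY city"):
        print(row)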

GrubHub Seamless
  • New York, NY

Got a taste for something new?

We’re Grubhub, the nation’s leading online and mobile food ordering company. Since 2004 we’ve been connecting hungry diners to the local restaurants they love. We’re moving eating forward with no signs of slowing down.

With more than 90,000 restaurants and over 15.6 million diners across 1,700 U.S. cities and London, we’re delivering like never before. Incredible tech is our bread and butter, but amazing people are our secret ingredient. Rigorously analytical and customer-obsessed, our employees develop the fresh ideas and brilliant programs that keep our brands going and growing.

Long story short, keeping our people happy, challenged and well-fed is priority one. Interested? Let’s talk. We’re eager to show you what we bring to the table.

About the Opportunity: 

Senior Site Reliability Engineers are embedded in Big Data specific Dev teams to focus on the operational aspects of our services, and our SREs run their respective products and services from conception to continuous operation. We're looking for engineers who want to be part of developing, maintaining, and scaling infrastructure software. If you enjoy focusing on reliability, performance, capacity planning, and automating everything, you'd probably like this position.

Some Challenges You'll Tackle

TOOLS OUR SRE TEAM WORKS WITH:

  • Python – our primary infrastructure language

  • Cassandra

  • Docker (in production!)

  • Splunk, Spark, Hadoop, and PrestoDB

  • AWS

  • Python and Fabric for automation and our CD pipeline (a minimal sketch follows this list)

  • Jenkins for builds and task execution

  • Linux (CentOS and Ubuntu)

  • DataDog for metrics and alerting

  • Puppet
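
Because the tools list calls out Python and Fabric for automation and the CD pipeline, here is a minimal deployment sketch assuming Fabric 2.x; the host name, artifact path, and service name are hypothetical:

from fabric import Connection

def deploy(host: str, version: str) -> None:
    """Upload a release artifact to a host and restart the service (hypothetical layout)."""
    c = Connection(host)
    c.put(f"dist/app-{version}.tar.gz", remote="/opt/app/releases/")
    c.run(f"tar -xzf /opt/app/releases/app-{version}.tar.gz -C /opt/app/current")
    c.sudo("systemctl restart app")  # assumes sudo is configured for this user

deploy("web-01.example.com", "1.4.2")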





You Should Have

  • Experience with AWS services such as Kinesis, IAM, EMR, Redshift, and S3

  • Experience managing Linux systems

  • Experience with configuration management tools such as Puppet, Chef, or Ansible

  • Continuous integration, testing, and deployment using Git, Jenkins, and Jenkins DSL

  • Exceptional communication and troubleshooting skills


NICE TO HAVE:



  • Python or Java / Scala development experience

  • Bonus points for deploying/operating large-ish Hadoop clusters in AWS/GCP and using EMR, DC/OS, or Dataproc

  • Experience with streaming data platforms (Spark Streaming, Kafka)

  • Experience developing solutions leveraging Docker

Avaloq Evolution AG
  • Zürich, Switzerland

The position


Are you passionate about data architecture? Are you interested in shaping the next generation of data science driven products for the financial industry? Do you enjoy working in an agile environment involving multiple stakeholders?

You will be responsible for selecting appropriate technologies from open source, commercial on-premises, and cloud-based offerings, and for integrating a new generation of tools within the existing environment to ensure access to accurate and current data. You will consider not only the functional requirements, but also the non-functional attributes of platform quality such as security, usability, and stability.

We want you to help us strengthen and further develop Avaloq's transformation into a data-driven product company: make analytics scalable and accelerate the process of data science innovation.


Your profile


  • PhD, Master's, or Bachelor's degree in Computer Science, Math, Physics, Engineering, Statistics, or another technical field

  • Knowledgeable about Big Data technologies and architectures (e.g. Hadoop, Spark, data lakes, stream processing)

  • Practical experience with container platforms (OpenShift) and/or containerization software (Kubernetes, Docker)

  • Hands-on experience developing data extraction and transformation pipelines (ETL process)

  • Expert knowledge in RDBMS, NoSQL and Data Warehousing

  • Familiar with information retrieval software such as Elasticsearch/Lucene/Solr (a minimal sketch follows this list)

  • Firm understanding of major programming/scripting languages such as Java/Scala, PHP, Python, and/or R, plus working knowledge of Linux

  • High integrity, responsibility, and confidentiality, as required when dealing with sensitive data

  • Strong presentation and communication skills

  • Good planning and organisational skills

  • Collaborative mindset to sharing ideas and finding solutions

  • Fluent in English; German, Italian and French a plus
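
As a concrete illustration of the information-retrieval bullet above, here is a minimal sketch using the official Elasticsearch Python client (8.x-style API assumed); the index name and query are hypothetical:

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Full-text match query against a hypothetical "documents" index
resp = es.search(
    index="documents",
    query={"match": {"body": "data science"}},
    size=5,
)
for hit in resp["hits"]["hits"]:
    print(hit["_id"], hit["_score"])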





Professional requirements


  • Be a thought leader on best practices for developing and deploying data science products & services

  • Provide an infrastructure to make data driven insights scalable and agile

  • Liaise and coordinate with stakeholders regarding setting up and running a Big Data and analytics platform

  • Lead the evaluation of business and technical requirements

  • Support data-driven activities and a data-driven mindset where needed



Main place of work
Zurich

Contact
Avaloq Evolution AG
Anna Drozdowska, Talent Acquisition Professional
Allmendstrasse 140 - 8027 Zürich - Switzerland

www.avaloq.com/en/open-positions

Please only apply online.

Note to Agencies: All unsolicited résumés will be treated as direct applications, and no referral fee will be acknowledged.

Accenture
  • San Diego, CA
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture, and make delivering innovative work part of your extraordinary career.
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software, better! There will never be a typical day, and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a highly collaborative and growing network of technology and data experts, who are taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. You will have an opportunity to work in roles such as Data Scientist, Data Engineer, or Chief Data Officer, covering all aspects of Data including Data Management, Data Governance, Data Intelligence, Knowledge Graphs, and IoT. Come grow your career in Technology at Accenture!
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward.
Business & Technology Integration professionals advise upon, design, develop, and/or deliver technology solutions that support best-practice business changes.
The Business & Industry Integration Associate Manager aligns technology with business strategy and goals, working directly with the client to gather requirements and to analyze, design, and/or implement technology best-practice business changes. They are sought out as experts internally and externally for their deep functional or industry expertise, domain knowledge, or offering expertise. They enhance Accenture's marketplace reputation.
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
Data Management professionals define strategies and develop/deliver solutions and processes for managing enterprise-wide data throughout the data lifecycle from capture to processing to usage across all layers of the application architecture.
A professional at this position level within Accenture has the following responsibilities:
Identifies, assesses and solves complex business problems for area of responsibility, where analysis of situations or data requires an in-depth evaluation of variable factors.
Closely follows the strategic direction set by senior management when establishing near-term goals.
Interacts with senior management at a client and/or within Accenture on matters where they may need to gain acceptance of an alternate approach.
Has some latitude in decision-making. Acts independently to determine methods and procedures on new assignments.
Decisions have a major day-to-day impact on area of responsibility.
Manages medium to large sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
Basic Qualifications
    • Minimum of 3+ years of hands-on technical experience implementing Big Data solutions utilizing Hadoop or other Data Science and Analytics platforms
    • Minimum of 3+ years of experience with full life-cycle development, from functional design to deployment
    • Minimum of 2+ years of hands-on technical experience delivering Big Data solutions in the cloud with AWS or Azure
    • Minimum of 3+ years of hands-on technical experience developing solutions utilizing at least two of the following:
    • Kafka-based streaming services (a minimal sketch follows this list)
    • R Studio
    • Cassandra, MongoDB
    • MapReduce, Pig, Hive
    • Scala, Spark
    • Knowledge of Jenkins, Chef, Puppet
  • Bachelor's degree or equivalent years of work experience
  • Ability to travel 100%, Monday through Thursday
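As one concrete reading of the Kafka streaming bullet above, here is a minimal Spark Structured Streaming sketch in Python; it assumes PySpark with the spark-sql-kafka connector on the classpath, and the broker address and topic name are hypothetical:

from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.appName("kafka_stream").getOrCreate()

# Subscribe to a hypothetical "orders" topic
stream = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "orders")
         .load()
)

# Kafka values arrive as bytes; cast to string before downstream parsing
query = (
    stream.select(F.col("value").cast("string").alias("payload"))
          .writeStream.format("console")
          .option("checkpointLocation", "/tmp/checkpoints/orders")
          .start()
)
query.awaitTermination()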
Professional Skill Requirements
    • Proven ability to build, manage and foster a team-oriented environment
    • Proven ability to work creatively and analytically in a problem-solving environment
    • Desire to work in an information systems environment
    • Excellent communication (written and oral) and interpersonal skills
    • Excellent leadership and management skills
All of our consulting professionals receive comprehensive training covering business acumen, technical and professional skills development. You'll also have opportunities to hone your functional skills and expertise in an area of specialization. We offer a variety of formal and informal training programs at every level to help you acquire and build specialized skills faster. Learning takes place both on the job and through formal training conducted online, in the classroom, or in collaboration with teammates. The sheer variety of work we do, and the experience it offers, provide an unbeatable platform from which to build a career.
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture.
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is a federal contractor and an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
ITCO Solutions, Inc.
  • Austin, TX

The Sr. Engineer will be building pipelines using Spark and Scala.

Must Haves:
  • Expertise in Big Data processing and ETL pipelines
  • Designing large-scale ETL pipelines, both batch and real-time
  • Expertise in Spark Scala coding and the DataFrame API (rather than the SQL-based APIs); a minimal sketch follows this list
  • Expertise in the core DataFrame APIs
  • Expertise in unit testing Spark DataFrame API based code
  • Strong scripting knowledge in Python and shell
  • Experience and expertise in performance tuning of large-scale data pipelines
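
The role calls for Scala, but to illustrate what unit testing DataFrame API code looks like, here is a minimal, analogous PySpark sketch runnable under pytest; the transformation and column names are hypothetical:

from pyspark.sql import SparkSession
import pyspark.sql.functions as F

def add_revenue(df):
    # Transformation under test: pure DataFrame API, no spark.sql() strings
    return df.withColumn("revenue", F.col("price") * F.col("quantity"))

def test_add_revenue():
    spark = (
        SparkSession.builder.master("local[1]").appName("df_test").getOrCreate()
    )
    df = spark.createDataFrame([(2.0, 3), (5.0, 1)], ["price", "quantity"])
    result = add_revenue(df).collect()
    assert [row["revenue"] for row in result] == [6.0, 5.0]
    spark.stop()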