OnlyDataJobs.com

Levatas
  • Palm Beach Gardens, FL

Levatas is an AI solutions company. We help our partners understand and deploy artificial intelligence solutions across their enterprise using Data Science, Machine Learning, Predictive Analytics, Automation, and Natural Language Processing.



Our ability to create the future of business is only as strong as the smart, creative people who make up our team. We believe that doing the best work of our lives shouldn't come at the expense of happiness and balance, which is why we're consistently recognized as a best place to work and for our top company culture.



Levatas is seeking a qualified Senior Software Engineer to join our Technology team at Levatas headquarters. We're looking for someone who's ready to do the best work of their career architecting, designing, and implementing software solutions, with a specific concentration in cross-disciplinary problem solving and collaborative development for our growing base of first-class business clients.



Core responsibilities:




  • Design, develop, and maintain software solutions to meet clients' business requirements

  • Develop and document project deliverables such as requirement specifications, proposed solutions, detailed designs, project plans, system impact analysis and proof of concepts

  • Program, test, build, integrate and implement web based multi-tier applications of varying complexities

  • Analyze and fully understand project requirements to formulate and implement programmatic solutions that efficiently and effectively address clients' requirements

  • Integrate applications by designing database architecture and server scripting; studying and establishing connectivity with network systems, search engines, and information servers

  • Use engineering principles to conduct technical investigations involved with the modification of material, component, or process specifications and requirements

  • Advise, mentor, train or assist engineers and developers at other skill levels, as needed, to ensure timely releases of high quality code

  • Provide technical consulting for projects and production system issues.



The following are profile items that interest us:




  • 2-5 years of experience in Amazon Web Services (AWS) native serverless application development

  • Experience working with various AWS services such as EC2, ECS, EBS, S3, Glacier, SNS, SQS, IAM, Auto Scaling, OpsWorks, Route 53, VPC, CloudFront, Direct Connect, CloudTrail, and CloudWatch, and building CI/CD on AWS using CodeCommit, CodeBuild, CodeDeploy, and CodePipeline.

  • 3-5 years of Node.js development experience, preferably writing AWS Lambda functions.

  • 5 years of full-time work experience in software engineering, information technology, or related domains.

  • An unlimited passion for software development

  • Willing to work across the stack to tackle technical challenges anywhere in the system.

  • Interest in working in a cross-functional team that touches many of the core systems and user flows of our customers.

  • Demonstrable experience in compiled languages such as C#/.NET, Java, Swift, or Kotlin.

  • Demonstrable experience working with relational and NoSQL database technologies

  • Demonstrable experience building web applications with HTML5, JavaScript, and CSS, using JavaScript frameworks like AngularJS, Vue.js, or React

  • Knowledge and understanding of data science, machine learning, TensorFlow (or a similar platform like Keras), and Python a huge plus

  • Knowledge and understanding of data transformation processes, ETL, etc. a plus.

  • Experience with designing and building large scale production systems or features.

  • Ability to leverage and integrate with third party APIs.

  • Experience with SOA (Service Oriented Architecture) designs a plus.

  • Advanced analytical thinking; experienced with making product decisions based on data and A/B testing.

  • Exposure to architectural patterns of a large, high-scale web application.

  • Strong interpersonal and communication skills

  • Experience working with Scrum or a similar Agile management process



This role is based in Palm Beach County, Florida.

Lawrence Harvey
  • Austin, TX

Lawrence Harvey is recruiting a top Data Engineer for one of the leading Artificial Intelligence organizations in Austin.


In this position as a Data Engineer, you will work on developing Machine Learning models and APIs for the company's AI Platform. You will interact with multiple teams across the business to evolve their architecture as well as optimize their data acquisition, selection, and pipeline.

Requirements: 


  • 5+ years of experience owning and building data pipelines
  • Extensive knowledge of data engineering tools, technologies and approaches for both batch and streaming environments
  • Proven experience building or extending data platforms from scratch for data consumption across a wide variety of use cases (e.g., data science, ML, etc.)
  • Experience with data transformations, API wrappers and output formats used with Machine Learning
  • Ability to absorb business problems and understand how to service required data needs
  • Demonstrated ability to build complex, scalable systems with high quality
  • Experience with multiple data technologies and concepts such as Airflow, Kafka, Hadoop, Hive, Spark, MapReduce, SQL, NoSQL, and Columnar databases
  • Experience in one or more of Java, Scala, and Python
  • Experience with serverless technologies a plus (e.g. AWS Lambda)
  • Hands-on, in-depth experience with AWS or other cloud infrastructure technologies
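
The list above spans both batch and streaming pipeline tooling, including Airflow and Kafka. As a rough illustration of the batch-orchestration side only (a minimal sketch assuming Apache Airflow 2.x; the DAG id, task names, and callables are placeholders invented for this sketch, not details from the posting):

# Hypothetical sketch of a daily batch pipeline in Apache Airflow 2.x.
# DAG id, task names, and the two callables are placeholders, not posting details.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull a day's worth of raw events from a source system.
    print("extracting raw events")


def transform_and_load():
    # Placeholder: clean, join, and write analytics-ready tables.
    print("loading curated tables")


with DAG(
    dag_id="example_daily_batch_pipeline",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="transform_and_load", python_callable=transform_and_load)

    extract_task >> load_task  # extract runs before transform/load
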
The Home Depot
  • Toronto, ON, Canada

Askuity is a Toronto-based retail analytics software company operating as an innovative new division within The Home Depot. Askuity's mission is to transform merchant-vendor collaboration between The Home Depot and its product suppliers by enabling best-in-class data-driven decision-making and real-time retail execution.



Askuity is seeking a Senior Back End Developer eager to join our growing team. As a member of the development team at Askuity, you will help drive the technical direction of our industry-leading analytics platform that has a meaningful and immediate impact on our growing customer base. If you’re committed to great work and are constantly looking for ways to improve the systems you are responsible for, then we’d love to hear from you.

Position Responsibilities:



  • Get deeply involved in our technical direction while delivering well-tested, performant, and maintainable code for our data powered SaaS platform

  • Gain an understanding of the real-world problems of our users to help build a great product that exceeds the expectations of our customers

  • Collaborate with product stakeholders to understand, design, and implement product features and realize our product vision to help our customers make better sense of their retail data

  • Deploy and run software in Amazon Web Services

  • Evaluate new technologies and assess their suitability to solve challenges of today and into the future



Experience/Knowledge Required:



  • 5+ years experience building service-oriented architectures and web applications – we have a lot of things that are HTTP and REST-based

  • Proven experience as a successful technical leader in a highly collaborative environment

  • Strong, hands-on technical expertise and demonstrated ability to design and implement reliable, scalable, high performing solutions

  • Exceptional interpersonal and communication skills

  • Proven experience working with AWS and/or GCP products, optimizing performance and costs

  • A thirst to innovate with machine intelligence

  • A self-starter attitude – at Askuity, the drive to do better and be better is baked into our culture



Experience with the following is a bonus:



  • Functional programming – Scala, Kotlin, Elixir, Erlang

  • AWS – Athena, EC2, RDS, EMR, Lambda, Elasticsearch

  • GCP – BigQuery

  • SQL – PostgreSQL, OLAP

  • Big data tech – MapReduce, Hadoop, Presto

  • DevOps – Docker, Kubernetes

Comcast
  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

As a Data Science Engineer in Comcast dx, you will research, model, develop, and support data pipelines and deliver insights for key strategic initiatives. You will develop or utilize complex programmatic and quantitative methods to find patterns and relationships in data sets; lead statistical modeling, or other data-driven problem-solving analysis to address novel or abstract business operation questions; and incorporate insights and findings into a range of products.

Assist in design and development of collection and enrichment components focused on quality, timeliness, scale and reliability. Work on real-time data stores and a massive historical data store using best-of-breed and industry leading technology.

Responsibilities:

-Develop and support data pipelines

-Analyze massive amounts of data in both real-time and batch processing utilizing Spark, Kafka, and AWS technologies such as Kinesis, S3, Elasticsearch, and Lambda

-Create detailed write-ups of processes used, logic applied, and methodologies used for creation, validation, analysis, and visualizations. Write-ups shall be produced within a week of when a process is created and updated in writing when changes occur.

-Prototype ideas for new ML/AI tools, products and services

-Centralize data collection and synthesis, including survey data, enabling strategic and predictive analytics to guide business decisions

-Provide expert and professional data analysis to implement effective and innovative solutions meshing disparate data types to discover insights and trends.

-Employ rigorous continuous delivery practices managed under an agile software development approach

-Support DevOps practices to deploy and operate our systems

-Automate and streamline our operations and processes

-Troubleshoot and resolve issues in our development, test and production environments

Here are some of the specific technologies and concepts we use:

-Spark Core and Spark Streaming

-Machine learning techniques and algorithms

-Java, Scala, Python, R

-Artificial Intelligence

-AWS services including EMR, S3, Lambda, Elasticsearch

-Predictive Analytics

-Tableau, Kibana

-Git, Maven, Jenkins

-Linux

-Kafka

-Hadoop (HDFS, YARN)
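
The stack above pairs Spark Streaming with Kafka and S3. As a hedged illustration only (broker, topic, and bucket names are invented, and the Structured Streaming API is used here), a minimal PySpark job that reads a Kafka topic and lands the events on S3 might look like:

# Hypothetical sketch: consume a Kafka topic with Spark Structured Streaming
# and append the raw events to S3 as Parquet. All names are placeholders.
# Note: the spark-sql-kafka connector package must be available on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("example-kafka-to-s3").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")  # placeholder broker
    .option("subscribe", "example-events")               # placeholder topic
    .load()
    .select(col("key").cast("string"), col("value").cast("string"), col("timestamp"))
)

query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://example-bucket/raw/events/")                 # placeholder bucket
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/events/")
    .trigger(processingTime="1 minute")
    .start()
)

query.awaitTermination()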

Skills & Requirements:

-5-8 years of Java experience, Scala and Python experience a plus

-3+ years of experience as an analyst, data scientist, or related quantitative role.

-3+ years of relevant quantitative and qualitative research and analytics experience. Solid knowledge of statistical techniques.

-Bachelor's degree in Statistics, Math, Engineering, Computer Science, or a related discipline. Master's Degree preferred.

-Experience in software development of large-scale distributed systems including proven track record of delivering backend systems that participate in a complex ecosystem

-Experience with more advanced modeling techniques (e.g., ML)

-Distinctive problem solving and analysis skills and impeccable business judgement.

-Experience working with imperfect data sets that, at times, will require improvements to process, definition and collection

-Experience with real-time data pipelines and components including Kafka, Spark Streaming

-Proficient in Unix/Linux environments

-Test-driven development/test automation, continuous integration, and deployment automation

-Excellent communicator, able to analyze and clearly articulate complex issues and technologies understandably and engagingly

-Team player is a must

-Great design and problem-solving skills

-Adaptable, proactive and willing to take ownership

-Attention to detail and high level of commitment

-Thrives in a fast-paced agile environment

About Comcastdx:

Comcast dx is a results-driven big data engineering team responsible for delivery of multi-tenant data infrastructure and platforms necessary to support our data-driven culture and organization. dx has an overarching objective to gather, organize, and make sense of Comcast data with the intention to reveal business and operational insight, discover actionable intelligence, enable experimentation, empower users, and delight our stakeholders. Members of the dx team define and leverage industry best practices, work on large-scale data problems, design and develop resilient and highly robust distributed data organizing and processing systems and pipelines, as well as research, engineer, and apply data science and machine intelligence disciplines.

Comcast is an EOE/Veterans/Disabled/LGBT employer

GSquared Group
  • Atlanta, GA

Data Engineer
Atlanta, Georgia
Full Time Opportunity

What are we looking for?

Our highly respected client in the Atlanta area is currently seeking a full time Data Engineer to play a pivotal role in a highly visible business unit within the organization.

Our client is leveraging machine learning and its vast data assets to personalize products for customers. You will join a cross-functional team developing machine learning solutions for its ecosystem of applications.

As the Data Engineer, you will collaborate with data scientists and subject matter experts to build platforms that leverage AWS native functionality in order to transform raw data into analytics-ready information. In addition, you will build solutions for the continuous integration and deployment of machine learning solutions within AWS.
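
As a rough sketch of what operationalizing a trained model as a SageMaker managed service can look like (the image URI, model artifact path, role ARN, and resource names below are placeholders invented for this sketch, not details from the listing), a boto3 deployment might resemble:

# Hypothetical sketch: deploy an already-trained model as a SageMaker endpoint.
# Image URI, model artifact path, role ARN, and names are placeholders.
import boto3

sm = boto3.client("sagemaker", region_name="us-east-1")

sm.create_model(
    ModelName="example-model",
    PrimaryContainer={
        "Image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/example-image:latest",
        "ModelDataUrl": "s3://example-bucket/models/example-model/model.tar.gz",
    },
    ExecutionRoleArn="arn:aws:iam::123456789012:role/example-sagemaker-role",
)

sm.create_endpoint_config(
    EndpointConfigName="example-endpoint-config",
    ProductionVariants=[
        {
            "VariantName": "AllTraffic",
            "ModelName": "example-model",
            "InstanceType": "ml.m5.large",
            "InitialInstanceCount": 1,
        }
    ],
)

sm.create_endpoint(
    EndpointName="example-endpoint",
    EndpointConfigName="example-endpoint-config",
)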

How will you make an impact?


  • Design, build, test and maintain AWS platforms architected to support batch and real-time data processing
  • Operationalize machine learning solutions that are deployed as either serverless lambda applications, SageMaker managed services or EMR/Spark platforms
  • Ensure deployed applications meet architecture guidelines
  • Design data ingestion processes for acquiring new datasets supporting key business initiatives
  • Create continuous integration and code deployment processes for versioning and deploying machine learning solutions


Qualifications needed to be successful in this role:


  • 5+ years of experience engineering data ingestion solutions
  • 3+ years of experience building data-related solutions in AWS
  • Experience with Python, NodeJS or R
  • Experience using AWS technologies: Lambda, S3, Step Functions, Glue, Athena, EMR, Hive or Spark
  • High level of understanding of machine learning concepts applicable to recommendation logic and forecasting
  • Knowledge of AWS Machine Learning resources such as SageMaker
  • GitHub and automated code deployment solutions such as AWS Code Pipeline or Jenkins


Education
Bachelor's degree in Computer Science


(#11933536)

BTI360
  • Ashburn, VA

Our customers are inundated with information from news articles, video feeds, social media and more. We're looking to help them parse through it faster and focus on the most important information so they can make better decisions. We're in the process of building a next-generation analysis and exploitation platform for video, audio, documents, and social media data. This platform will help users identify, discover, and triage information via a UI that leverages best-in-class speech-to-text, machine translation, image recognition, OCR, and entity extraction services.


We're looking for data engineers to develop the infrastructure and systems behind our platform. The ideal contributor has experience building and maintaining data and ETL pipelines, works well in a collaborative environment, and communicates well with teammates and customers. This is a great opportunity to work with a high-performing team in a fun environment.


At BTI360, we’re passionate about building great software and developing our people. Software doesn't build itself; teams of people do. That's why our primary focus is on developing better engineers, better teammates, and better leaders. By putting people first, we give our teammates more opportunities to grow and raise the bar of the software we develop.


Interested in learning more? Apply today!


Required Skills/Experience:



  • U.S. Citizenship - Must be able to obtain a security clearance

  • Bachelor's degree in Computer Science, Computer Engineering, Electrical Engineering or related field

  • Experience with Java, Kotlin, or Scala

  • Experience with scripting languages (Python, Bash, etc.)

  • Experience with object-oriented software development

  • Experience working within a UNIX/Linux environment

  • Experience working with a message-driven architecture (JMS, Kafka, Kinesis, SNS/SQS, etc.)

  • Ability to determine the right tool or technology for the task at hand

  • Works well in a team environment

  • Strong communication skills


Desired Skills:



  • Experience with massively parallel processing systems like Spark or Hadoop

  • Familiarity with data pipeline orchestration tools (Apache Airflow, Apache NiFi, Apache Oozie, etc.)

  • Familiarity in the AWS ecosystem of services (EMR, EKS, RDS, Kinesis, EC2, Lambda, CloudWatch)

  • Experience working with recommendation engines (Spark MLlib, Apache Mahout, etc.)

  • Experience building custom machine learning models with TensorFlow

  • Experience with natural language processing tools and techniques

  • Experience with Kubernetes and/or Docker container environment

  • Ability to identify external data specifications for common data representations

  • Experience building monitoring and alerting mechanisms for data pipelines

  • Experience with search technologies (Solr, ElasticSearch, Lucene)


BTI360 is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status, or any other protected class. 

Elev8 Hire Solutions
  • Atlanta, GA
Jr/Mid-Level Data Scientist
Our client in the Buckhead area is looking for someone to create next-generation data science to power a company reshaping an industry.

Responsibilities:
  • Create predictive models to power innovative ticketing products
  • Acquire and explore data from creative sources to power data science initiatives
  • Create data pipelines and applications in Amazon Web Services environments
  • Mine and analyze data from company databases to drive optimization and improvement of product development, marketing techniques and business strategies
  • Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting and other business outcomes
  • Coordinate with different functional teams to implement models and monitor outcomes
  • Develop processes and tools to monitor and analyze model performance and data accuracy
  • Interact with project stakeholders to provide data science solutions addressing business requirements.


Required Skills:

  • Bachelor's degree
  • Experience with Python, AWS, SQL, and R
  • 2 - 3 years of data science and analytics experience
  • Experience creating and using advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
  • Experience creating and deploying data science solutions in a cloud environment (AWS preferred)
  • Experience using statistical computer languages (Python, SQL, Scala) to manipulate data and draw insights from large data sets
  • Experience working with and creating data architectures
  • Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks
  • Experience visualizing/presenting data for stakeholders using: Tableau, Excel, etc.
  • Strong problem-solving skills with an emphasis on product development
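
As a small, generic illustration of the predictive-modeling work listed above (synthetic data and an arbitrary model choice, not anything prescribed by the posting), a scikit-learn baseline might look like:

# Hypothetical sketch: train and evaluate a simple classifier on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for real ticketing data.
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

probs = model.predict_proba(X_test)[:, 1]
print("holdout AUC:", round(roc_auc_score(y_test, probs), 3))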


Preferred Skills:
  • Groovy & Grails, Spring, or other JVM language (Java, Scala, etc.)
  • Advanced degree in math, statistics, computer science, economics, or another data-related field
  • Coding knowledge and experience with several languages: Grails, Java, JavaScript, Python
  • Experience with multiple AWS products: Lambda, Sagemaker, Athena, EC2, EMR, Redshift, Glue, S3
  • Experience managing digital data across web pages and mobile applications
  • Tableau experience
  • Experience in digital analytics tools: Google Analytics, Adobe Analytics, MixPanel, etc.


Perks:

  • Medical/Dental/Vision
  • 401k match
  • In-office massage therapists
  • New MacBook Pro Retina Laptop
  • Open, fun and comfortable workspace (no cubes!)
  • Ping-pong table, XBOX, and Nintendo 64 (bring your A game!)
  • Market Salary Expectation: $70k-73k
GTN Technical Staffing and Consulting
  • Dallas, TX

Software Engineer - Scala

HIGHLIGHTS
Location: Dallas/Fort Worth Area
Position Type: Direct Hire
Residency Status: US Citizens and US Permanent Residents only, as sponsorship is not being offered at this time. Local candidates only.


JOB SUMMARY

This position requires a broad array of programming skills and experience as well as the desire to learn and grow in an entrepreneurial environment. You will be responsible for creating and developing client onboarding, product provisioning and real-time analytics software services. You will work closely with other members of the architecture team to make strategic decisions about product development, devops and future technology choices.

The ideal candidate should:

  • Demonstrate a proven track record of rapidly building, delivering, and maintaining complex software products.
  • Possess excellent communication skills.
  • Have high integrity.
  • Embrace learning and have a thirst for knowledge.

  • Be able to rapidly learn new frameworks.

  • Be responsible for creating and implementing core product architecture. Be comfortable developing frontend and backend solutions.


This position reports directly to the CTO.


Required experience:

7+ years of hands-on experience

  • AWS (EC2, Lambda, ECS)
  • Docker/Kubernetes
  • 3+ years programming in Scala
  • 3+ years programming in Node.js
  • ES6/modern JavaScript
  • Microservices


Preferred experience:

  • MongoDB
  • SBT
  • Serverless Computing

The ideal candidate will:

  • Possess excellent communication and organization skills
  • Embrace learning and have a thirst for knowledge
  • Rapidly learn new technological paradigms
  • Understand and program in multiple programming languages
  • Enjoy being part of a growing team
  • Be a self-motivated team player


Benefits

  • Medical / Dental / Vision Insurance
  • 401(k)
  • Competitive compensation
  • Work with leaders in the industry
  • Opportunities to learn and grow every day
  • Play a meaningful role on a successful team
Mobimeo GmbH
  • Berlin, Germany

About the Mobimeo Engineering Team


We are a growing engineering team of 50+, developing technology to solve major urban mobility challenges. From navigation and ticketing to applying machine learning and data analysis, we are on a mission to improve the way people move around our cities.


Our technology stack is versatile and we try to choose the right tools for the right jobs.


We run on the AWS cloud, using API Gateway, S3, Athena, Lambda, EMR, RDS, SES, EC2, CloudWatch, and multiple other services, all codified through Terraform (infrastructure as code) and Helm for the Kubernetes cluster that powers our Docker containers.


The different backend services are written in JVM technologies like Scala and Kotlin. Some of the frameworks we use are Play!, Akka, and Spring, and we also use third-party tools like Kibana, Zipkin, StatusCake, Runscope, Datadog, and PagerDuty.


Our frontends are written in Swift and Kotlin, and our internal tooling in HTML5/CSS3/React.


Working in autonomous, cross-functional teams, we believe in continuous delivery through high automation and a DevOps culture.



YOUR RESPONSIBILITIES



  • You will develop a suite of digital mobility applications

  • You will work alongside product managers, UI/UX designers and data scientists

  • You will build high performance, scalable systems

  • You will work across different technologies and platforms


YOUR PROFILE



  • You have extensive experience building applications in Java, Scala or Kotlin

  • You have a deep understanding of software engineering best practices

  • You have an interest in deploying machine learning based systems

  • You have a DevOps mindset - we automate everything


WHY MOBIMEO



  • Early stage - Build a product from the ground up

  • Help create seamless travel experiences for tens of millions of users

  • Stable - Backed by a leading German transportation company

  • Develop a sophisticated product leveraging bleeding edge technologies

  • Join a diverse and highly experienced engineering team

  • Yearly personnel development budget

  • Weekly team lunches as well as free drinks and snacks

  • A highly collaborative, engineering driven environment

  • Nice office space in Berlin Kreuzberg

Comcast
  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

Job Summary:

Billions of requests. Millions of Customers. Be part of Comcast's TPX X1 Stream Apps' Platform and APIs team! Our team designs, builds and operates the APIs that power Comcast's X1 web, mobile, Roku and SmartTV properties. Reliability and performance at this scale require complex information systems to be made simple. We are looking for an engineer who is able to listen to stakeholders and clients, understand technical requirements, collaborate on solutions, and deliver technology services in a high velocity, dynamic, "always on" environment. As a member of our team you will work with other engineers and DevOps practitioners to produce mission-critical applications & infrastructure, tools, and processes that enable our systems to scale at a rapid pace. One day might involve creating an API that returns a customer's channel lineup or performance tuning of a Java web application; the next may be building tools to enable continuous delivery or infrastructure as code.

Technology snapshot: AWS, Apache, EC2, Git/Gerrit, Graphite, Grafana, Java, Lambda, Linux, Memcached, Scala, Akka, Splunk, Spring, Tomcat, VMware, OpenStack, Terraform, Ansible, Concourse

Where are we headed?

Our goal is to build, scale and guard the systems that delight our customers. To do so, you will need strong skills in the following areas:

Responsibilities

As a member of Advanced Application Engineering's Platform and APIs Team, you will provide technical expertise and guidance within our cross-functional project team, and you'll work closely with other software and QA engineers to build quality, scalable products that delight our customers. Responsibilities range from high-level logical architecture through low-level detailed design and implementation, including:

  • Design, build, deliver and scale sophisticated high-volume web properties and agreed upon solutions from the catalog of TVX application services.
  • Collaborate with project stakeholders to identify product and technical requirements. Conduct analysis to determine integration needs.
  • Write code that meets functional requirements and is testable and maintainable. Have a passion for test driven development.
  • Work with Quality Assurance team to determine if applications fit specification and technical requirements.
  • Produce technical designs and documentation at varying levels of granularity.

Desired Qualifications

  • 2+ years software development experience in Java with a solid understanding of Spring, Hibernate frameworks and REST based architecture.
  • Experience with Java application servers and J2EE containers (Tomcat).
  • Knowledge of object-oriented design methodology and standard software design patterns.
  • Firm grasp of testing methodologies and frameworks.
  • Experience with caching, especially HTTP-compliant caches.
  • Fundamental understanding of the HTTP protocol.
  • Experience developing with Major MVC frameworks (Spring MVC).
  • Familiarity with consolidating and normalizing data across many data sources, specifically Internet data aggregation and metadata processing.
  • Familiar with agile development methodologies such as Scrum.
  • Strong technical written and verbal communication skills.
  • A sense of ownership, initiative, and drive and a love of learning!

Additional Qualifications

  • UNIX background (Solaris/Linux)
  • CDN Experience is a plus
  • Familiarity with cloud computing (OpenStack, S3, SQS, Hadoop...).
  • Experience with Scala, Ruby on Rails, Akka

Education

Bachelor's degree in Engineering or Computer Science or a related field, or relevant work experience.

Comcast is an EOE/Veterans/Disabled/LGBT employer

Daman, Inc
  • Phoenix, AZ

As part of the Data Engineering team of a large insurance firm, you will be architecting a highly scalable data integration and transformation platform that processes high volumes of data under defined SLAs. You will be creating and building the platform that supports cloud-to-cloud integration, connecting cloud apps to the AWS analytical stack.

Qualifications

      • 3+ years working in Big Data and related technologies
      • 3+ years of programming experience in Java
      • 1+ year experience working with Snowflake
      • Experience building high-performance, and scalable distributed systems
      • AWS cloud experience (EC2, S3, Lambda, EMR, RDS, Redshift)
      • Experience in a variety of relevant technologies including Cassandra, AWS DynamoDB, Kafka, AWS Kinesis, Elasticsearch, Machine Learning, Spark, Hadoop, Hive
      • Experience in ETL and ELT workflow management
      • Familiarity with AWS Data and Analytics technologies such as Glue, Athena, Redshift, Spectrum, Data Pipeline
      • Experience building internal cloud to cloud integrations is ideal
      • Experience with Insurance Apps on the cloud is a nice to have.
      • Bachelor's degree in Computer Science, Mathematics, or Engineering is required
Comcast
  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

As a Data Science Engineer in Comcast dx, you will research, model, develop, and support data pipelines and deliver insights for key strategic initiatives. You will develop or utilize complex programmatic and quantitative methods to find patterns and relationships in data sets; lead statistical modeling, or other data-driven problem-solving analysis to address novel or abstract business operation questions; and incorporate insights and findings into a range of products.

Assist in design and development of collection and enrichment components focused on quality, timeliness, scale and reliability. Work on real-time data stores and a massive historical data store using best-of-breed and industry leading technology.

Responsibilities:

-Develop and support data pipelines

-Analyze massive amounts of data in both real-time and batch processing utilizing Spark, Kafka, and AWS technologies such as Kinesis, S3, Elasticsearch, and Lambda

-Create detailed write-ups of processes used, logic applied, and methodologies used for creation, validation, analysis, and visualizations. Write-ups shall be produced within a week of when a process is created and updated in writing when changes occur.

-Prototype ideas for new ML/AI tools, products and services

-Centralize data collection and synthesis, including survey data, enabling strategic and predictive analytics to guide business decisions

-Provide expert and professional data analysis to implement effective and innovative solutions meshing disparate data types to discover insights and trends.

-Employ rigorous continuous delivery practices managed under an agile software development approach

-Support DevOps practices to deploy and operate our systems

-Automate and streamline our operations and processes

-Troubleshoot and resolve issues in our development, test and production environments

Here are some of the specific technologies and concepts we use:

-Spark Core and Spark Streaming

-Machine learning techniques and algorithms

-Java, Scala, Python, R

-Artificial Intelligence

-AWS services including EMR, S3, Lambda, Elasticsearch

-Predictive Analytics

-Tableau, Kibana

-Git, Maven, Jenkins

-Linux

-Kafka

-Hadoop (HDFS, YARN)

Skills & Requirements:

-3-5 years of Java experience, Scala and Python experience a plus

-2+ years of experience as an analyst, data scientist, or related quantitative role.

-2+ years of relevant quantitative and qualitative research and analytics experience. Solid knowledge of statistical techniques.

-Bachelor's degree in Statistics, Math, Engineering, Computer Science, or a related discipline. Master's Degree preferred.

-Experience in software development of large-scale distributed systems including proven track record of delivering backend systems that participate in a complex ecosystem

-Experience with more advanced modeling techniques (e.g., ML)

-Distinctive problem solving and analysis skills and impeccable business judgement.

-Experience working with imperfect data sets that, at times, will require improvements to process, definition and collection

-Experience with real-time data pipelines and components including Kafka, Spark Streaming

-Proficient in Unix/Linux environments

-Test-driven development/test automation, continuous integration, and deployment automation

-Excellent communicator, able to analyze and clearly articulate complex issues and technologies understandably and engagingly

-Team player is a must

-Great design and problem-solving skills

-Adaptable, proactive and willing to take ownership

-Attention to detail and high level of commitment

-Thrives in a fast-paced agile environment

About Comcastdx:

Comcast dx is a results-driven big data engineering team responsible for delivery of multi-tenant data infrastructure and platforms necessary to support our data-driven culture and organization. dx has an overarching objective to gather, organize, and make sense of Comcast data with the intention to reveal business and operational insight, discover actionable intelligence, enable experimentation, empower users, and delight our stakeholders. Members of the dx team define and leverage industry best practices, work on large-scale data problems, design and develop resilient and highly robust distributed data organizing and processing systems and pipelines, as well as research, engineer, and apply data science and machine intelligence disciplines.

Comcast is an EOE/Veterans/Disabled/LGBT employer

ParkMobile, LLC
  • Atlanta, GA

What You’ll Do


ParkMobile is looking for a rockstar Data Engineer to join our growing team! In this role, the Data Engineer will join a cross-functional team developing machine learning solutions for the Company's ecosystem of applications. Collaborating with data scientists and subject matter experts, the Data Engineer will build platforms that leverage AWS native functionality in order to transform raw data into analytics-ready information. In addition, this individual will build solutions for the continuous integration and deployment of machine learning solutions within AWS. The Data Engineer will also partner with other AWS teams across the Company to define architecture guidelines and provide relevant thought leadership. The ideal candidate is highly motivated, passionate about solving complex problems, collaborative, and a strong communicator.
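
As one hedged illustration of serverless data ingestion on AWS (the bucket names, prefixes, and cleaning logic are placeholders invented for this sketch, not details from the posting), an S3-triggered Lambda handler in Python could look like:

# Hypothetical sketch: an AWS Lambda handler triggered by S3 object-created events.
# It reads a raw JSON-lines file and writes a lightly cleaned copy to a curated prefix.
# Bucket names, prefixes, and the cleaning logic are placeholders.
import json
import urllib.parse

import boto3

s3 = boto3.client("s3")
CURATED_BUCKET = "example-curated-bucket"  # placeholder


def handler(event, context):
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

        cleaned_lines = []
        for line in body.splitlines():
            if not line.strip():
                continue
            row = json.loads(line)
            row["source_key"] = key  # keep lineage for downstream analytics
            cleaned_lines.append(json.dumps(row))

        s3.put_object(
            Bucket=CURATED_BUCKET,
            Key=f"curated/{key}",
            Body="\n".join(cleaned_lines).encode("utf-8"),
        )

    return {"processed_records": len(records)}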



  • Design, build, test and maintain AWS platforms architected to support batch and real-time data processing.

  • Operationalize machine learning solutions that are deployed as either serverless lambda applications, SageMaker managed services or EMR/Spark platforms.

  • Ensure deployed applications meet architecture guidelines defined by the Company.

  • Design data ingestion processes for acquiring new datasets supporting key business initiatives.

  • Create continuous integration and code deployment processes for versioning and deploying machine learning solutions.

  • Identify opportunities to improve data reliability, efficiency and quality.

  • Provide thought-leadership on AWS functionality to management and determine the business value of new features.

  • Work together with the team's data scientist and product manager to ensure machine learning solutions are compatible with the AWS architecture.



Who You Are



  • Bachelor's degree in Computer Science or a related field

  • 5+ years of experience engineering data ingestion solutions

  • 3+ years of experience building data-related solutions in AWS

  • Experience with programming languages such as Python, NodeJS or R

  • Experience using AWS technologies such as Lambda, S3, Step Functions, Kinesis, Glue, Athena, EMR, Kafka, Hive or Spark

  • High level understanding of machine learning concepts applicable to recommendation logic and forecasting

  • Knowledge of AWS Machine Learning resources such as SageMaker is helpful

  • Experience with GitHub and automated code deployment solutions such as AWS Code Pipeline or Jenkins



Other Skills



  • Able to effectively communicate complex technical ideas to business stakeholders

  • Willing to collaborate with colleagues across Technology functions in order to establish architecture guidelines, provide thought-leadership and solve complex problems

  • Passionate about learning new technologies that will improve functionality, scalability and reliability of relevant solutions

  • High personal and professional standards; unassailable integrity and ethics

  • High-energy and self-starter personality

  • Strong structured coding skills



Why You’ll Love It Here



  • Competitive compensation and an extensive benefits package

  • A comprehensive list of medical, dental, and vision coverage plans for you to pick from

  • Company-paid short-term disability, long-term disability, and life insurance

  • 401K with a 3-year vesting schedule and up to 4% matching of your elected contribution

  • 11 observed, company-paid holidays

  • 21 vacation days to start and an additional day of PTO at your anniversary date each year (up to 28 days total) – and we expect you to use all of it

  • Generous paid parental leave, including 4 paid weeks for primary caregivers and 2 paid weeks for secondary caregivers

  • A light-filled Midtown Atlanta office with both private and collaborative spaces to work, unlimited snacks and beverages, and easy access to MARTA

  • Company-paid volunteer days to support the community that supports us

  • Company and team outings because we play just as hard as we work

  • Employee referral bonuses to encourage the addition of stellar people to the team

  • Spontaneous nerf-gun wars to wake you up and Thursday happy hours to wind you down



About Us


ParkMobile, LLC is the leading provider of smart parking and mobility solutions in North America, helping millions of people easily find, reserve, and pay for parking from their mobile devices. The company’s technology is used in over 3,000 locations across the country, including 7 of the top 10 cities, as well as airports, stadiums, and universities. ParkMobile has been named to the Inc. 5000, Deloitte Fast 500, Entrepreneur360, and Atlanta Business Chronicle’s Pacesetter list. At ParkMobile, we aim to build an inclusive culture where differences are used to inform better creative, strategic, and business decisions. We actively seek diversity of backgrounds, education, beliefs, and ways of thinking. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

TracFone Wireless
  • Miami, FL

TracFone Wireless, Inc. is seeking an individual who will be responsible for constructing and optimizing our data lake and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The incumbent is expected to be an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects.

Responsibilities:

  • Build data streams to ingest, load, transform, group, logically join, and assemble data ready for data analysis, analytics, reporting, and next-best-action / next-best-offer use cases. Pipeline data using cloud or on-premises technologies: AWS, Google, Big Data, Hadoop, AI/deep learning APIs, SQL/NoSQL, and unstructured databases.


  • Identify root causes for tier-2 escalated issues pertaining to customer-facing applications from Operations, Incident Management/Workforce Management/IT-OPS/IT Development.


  • Effectively communicate each root cause to management, business units, and end users.


  • Receive and process requests to provide data analyzing transactions, including but not limited to Activations, Reactivations, Redemptions, Payments, Throttling, Billing, and Enrollment, by business or IT units. Requests could come through SD Tickets/BCRs or email.


  • Support and coordinate with the Data Governance Body/Committee to review and establish coding standards and design procedures, and to review proposed designs for projects and BCRs.


  • Maintain knowledge and proficiency of current and upcoming hardware/software technologies. Mentor junior staff in ramping up analytical and technical skills.



Requirements:


  • Bachelor's degree from an accredited college in Computer Science or equivalent.
  • Must have 5+ years of PL/SQL and stored procedures experience with multiple OS platforms.
  • Must have 2+ years of Python and PySpark.
  • Must have 3+ years of AWS or alternative cloud systems.
  • Shell scripting, SQL stored procedures on IBM or other databases, and Java skills are desired.
  • Strong knowledge of AWS data-related services (DMS, Glue, Athena, Lambda, Redshift, DynamoDB, SageMaker).
  • Strong knowledge of Unix/AIX and Windows operating systems, standard concepts, practices, and procedures within the relational database field.
  • Hadoop, Kafka, Spark, Scala (preferred).
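
The requirements above lean heavily on AWS data services such as Glue, Athena, and Redshift. As one small, hedged example (the database, table, and output location below are invented for this sketch), running an Athena query from Python with boto3 might look like:

# Hypothetical sketch: submit an Athena query with boto3 and poll until it finishes.
# Database, table, and S3 output location are placeholders.
import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")

execution = athena.start_query_execution(
    QueryString="SELECT status, COUNT(*) AS n FROM activations GROUP BY status",
    QueryExecutionContext={"Database": "example_db"},
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)
query_id = execution["QueryExecutionId"]

while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]
    state = status["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([field.get("VarCharValue") for field in row["Data"]])
else:
    print("query ended in state:", state)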




Jampp
  • Palermo, Argentina

We’re looking for outstanding Software Engineers of all shapes and sizes, capable and passionate about technology.


Our challenges span many technologies, from extremely performant and scalable code to big data tools that let us manage the large stream of data we handle. We work in small teams that build reliable, fast, and beautiful software.


WHAT YOU'LL DO



  • Tackle really tough engineering and product problems

  • Design and own end-to-end, multi-tier architectures

  • Work with the latest open-source projects for extreme scale

  • Code systems that make decisions in milliseconds.

  • Work with a massive amount of data and with the latest machine learning techniques.

  • Collaborate with an extraordinarily small and talented team of designers and engineers

  • Develop junior members of the team.


REQUIREMENTS



  • Great engineering skills and strong CS fundamentals

  • 2+ years of hands on experience building software

  • Strong hands-on experience in at least one low-level programming language

  • Good knowledge of SQL and databases; PostgreSQL is a plus

  • Ability to work both independently and in cooperation with others

  • Ability to build a minimum working product quickly


BONUS POINTS



  • Expert knowledge in large scale distributed systems.

  • Deep knowledge in AWS services (Kinesis, Lambda, EMR, etc.)


TOOLS WE LIKE


We don’t confine ourselves to a single programming language. We believe in using the best tool for the job while maintaining a slight bias toward the tools the team knows the best.



  • Python / Cython

  • Go

  • MySQL and PostgreSQL taken to their furthest extent

  • JavaScript (hand-crafted JS + jQuery)

  • Memcached and Redis

  • Nginx

  • Asynchronous services and queues (e.g. Node.js, ZeroMQ)

  • Mercurial and Git

  • Amazon Web Services (AWS e.g. S3, EC2)

  • Presto

  • Hadoop

  • Spark


WHAT WE OFFER



  • Join a fast-paced, entrepreneurial environment. Your work will be seen daily by thousands of people.

  • Learn a ton & help build the hottest area of growth in Internet advertising - Mobile!

  • A great level of responsibility from day one and the chance to develop your potential without any limitations

  • A budget for conference and education

  • Coffee, loads of snacks and a fridge full of drinks!

  • Other benefits? Competitive salary and our policy is: “do what you want with it”. Oh, and yes, quite a few cool Jampp t-shirts per year

  • Did we mention an annual off-site?

Comcast
  • Sunnyvale, CA

Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast.

Backend Developer (Java + Scala)

Job description

You will be responsible for developing back end software and services for Comcast OTT Billing platform. Your responsibilities will include rapid development of prototypes/concepts along with regular development. You are experienced with agile development and a champion of software development best practices.

Required Skills:

  • Must be self-motivated and able to work independently; a fast learner who pays attention to detail.
  • Ability to think like an architect and produce high-quality code. Understanding of Service-Oriented Architectures.
  • Use TDD and ATDD, with Cucumber-JVM and ScalaTest.
  • Must be able to build REST services from the ground up.
  • Technologies: Scala 2.11, Http4s, Play2, Akka, Kafka, ELK, Scalaz, Hadoop, Apache Spark, Amazon Web Services (Lambda, S3, Kinesis, SQS).
  • Strong in OOP & Functional paradigm

Minimum Qualifications: 

  • Bachelor's degree in computer science, engineering, or management information systems, or a combination of education and equivalent working experience.
  • Experience in Scala.
  • Strong software design skills and knowledge of design patterns
  • Experience with Agile/Scrum methodologies and associated tools (Jira)

This description portrays in general terms the type and level(s) of work performed and is not intended to be all-inclusive, nor the specific duties of any one incumbent. 

Comcast is an EOE/Veterans/Disabled/LGBT employer

Accenture
  • Houston, TX
Are you ready to step up to the New and take your technology expertise to the next level?
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture and make delivering innovative work part of your extraordinary career.
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward. We partner with our clients to help transform their data into an Appreciating Business Asset.
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of technology and data experts who are highly collaborative, taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in Technology at Accenture!
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
AWS Big Data Engineering professionals develop deep data integration skills to support Accenture's Cloud Big data engineering and analytics agendas, including skills such as cloud big data platform architecture, AWS data services, on premise to cloud data migration, data ingestion, data curation, and data migration.
    • Act as a technical and solution expert in the areas of Big Data Management, Data on Cloud, AWS Data Services AI / Machine learning capabilities in Data Platforms
    • Advise clients on Data on Cloud adoption & journey leveraging Nextgen Information platforms on Cloud, Cloud data architecture patterns, platform selection.
    • Build Senior stakeholder relationships
    • Build personal brand within Accenture and drive thought leadership through participation in Business Development efforts, client meetings and workshops, speaking in industry conferences, publishing white papers, etc.
    • Partner with Client teams and clients in helping them in Data Monetization initiatives - making business sense of structured, semi-structured, unstructured and streaming data, to develop new business strategies, customer engagement models, manage product portfolios, and optimize enterprise assets
    • Develop industry relevant data analytics solutions for enterprise business functions
    • Collaborate with GoTo market teams in generating demand and pipeline for data analytics solutions
    • Collaborate with partners (software vendors) to build joint industry solutions
    • Serve as data supply chain expert for the vision and integration of emerging data technologies on cloud, anticipation of new trends and resolution of complex business and technical problems.
    • Lead the discussions and early analysis of the data-on-cloud concepts as they relate to Accenture's Data supply chain service offerings, so that clear use cases are developed and prioritized, as well as transitioning these concepts from ideas to working prototypes with the full support of the appropriate teams who will develop the new offerings.
    • Evaluate alliance technologies for potential go-to-market partnerships.
    • Lead the development of offering proofs-of-concept and effectively transition those concepts to the lines of business for full architecture, engineering, deployment and commercialization.
    • Coach and mentor both senior and junior members across OGs and IDC.
    • Develop practical solutions, methodologies, solution patterns and point-of-view documents.
    • Manage and grow Data, Data on Cloud pipeline
    • Participate in industry events to project Accenture's thought leadership
A professional at this position level within Accenture has the following responsibilities:
    • Evaluates emerging technologies, shapes Accenture's point of view, defines new architecture patterns and standards, and leads proofs of concept of innovative solutions.
    • Leads and supports client sales pursuits.
    • Collaborates with highly talented resource pool and helps lead the Community of Data practice.
    • Key participant in setting strategic direction to establish near term goals for area of responsibility.
    • Interacts with senior management levels at a client and/or within Accenture, which involves negotiating or influencing on significant matters.
    • Has latitude in decision-making and determining objectives and approaches to critical assignments.
    • Decisions have a lasting impact on area of responsibility with impact outside area of responsibility.
    • Manages large teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.

    Basic Qualifications
      • 5 years of hands-on experience with public cloud platforms like AWS, MS Azure and GCP
      • 12 plus years of experience in multiple disciplines, such as solution or technical architecture, Data Management, Cloud architecture, or Big Data
      • 5 years of experience with a combination of AWS Cloud Platform, S3, Glacier, AWS Data Services, Redshift, Redshift Spectrum, Lambda functions, Apache Spark, Kafka, and Python.
      • Must be able to travel 100% (Mon-Thurs)
      • Bachelor's degree or 12 years of professional experience
      • 5 years of experience developing solutions utilizing:
        • Kafka based streaming services
        • R Studio
        • RDS, S3, glacier
        • MapReduce, Pig, Hive
        • Scala, Spark
    Preferred Qualifications
      • Experience as a consulting manager in a top-tier consulting firm preferred
      • Ten or more years of experience dealing with complex business/technical architectures and complex client delivery.
      • Working knowledge of Big Data tools like MongoDB, Cassandra, Hadoop, NoSQL, Spark, and Hive
      • Experience with delivering Big Data Solutions in the cloud with AWS
      • Ability to configure and support API and Opensource integrations
      • Experience administering Hadoop or other Data Science and Analytics platforms using the technologies above
      • Over 5 years of experience in sales / pre-sales functions, leading pursuits, proposal development, effort estimations, statement of work
      • Experience with DevOps support
      • Designing ingestion, low latency, visualization clusters to sustain data loads and meet business and IT expectations
    Professional Skill Requirements
    • Proven success in contributing to a team-oriented environment
    • Proven ability to work creatively and analytically in a problem-solving environment
    • Desire to work in an information systems environment
    • Excellent communication (written and oral) and interpersonal skills
    • Ability to work with senior client executives
    Our consulting professionals receive comprehensive training covering business acumen, technical and professional skills development. You'll also have opportunities to hone your functional skills and expertise in an area of specialization. We offer a variety of formal and informal training programs at every level to help you acquire and build specialized skills faster. Learning takes place both on the job and through formal training conducted online, in the classroom, or in collaboration with teammates. The sheer variety of work we do, and the experience it offers, provide an unbeatable platform from which to build a career.
    Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H-1B visa, F-1 visa (OPT), TN visa, or any other non-immigrant status).
    Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
    Accenture is an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
    Equal Employment Opportunity
    All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
    Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
    Accenture is committed to providing veteran employment opportunities to our service men and women.
Accenture
  • Atlanta, GA
Are you ready to step up to the New and take your technology expertise to the next level?
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture and make delivering innovative work part of your extraordinary career.
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward. We partner with our clients to help transform their data into an Appreciating Business Asset.
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software, better! There will never be a typical day, and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of technology and data experts who are highly collaborative, taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in Technology at Accenture!
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
AWS Big Data Engineering professionals develop deep data integration skills to support Accenture's cloud Big Data engineering and analytics agendas, including skills such as cloud Big Data platform architecture, AWS data services, on-premise-to-cloud data migration, data ingestion, and data curation.
    • Act as a technical and solution expert in the areas of Big Data management, Data on Cloud, AWS data services, and AI/machine learning capabilities in data platforms.
    • Advise clients on their Data-on-Cloud adoption journey, leveraging next-generation information platforms on cloud, cloud data architecture patterns, and platform selection.
    • Build senior stakeholder relationships.
    • Build a personal brand within Accenture and drive thought leadership through participation in business development efforts, client meetings and workshops, speaking at industry conferences, publishing white papers, etc.
    • Partner with client teams and clients on data monetization initiatives, making business sense of structured, semi-structured, unstructured, and streaming data to develop new business strategies and customer engagement models, manage product portfolios, and optimize enterprise assets.
    • Develop industry-relevant data analytics solutions for enterprise business functions.
    • Collaborate with go-to-market teams to generate demand and pipeline for data analytics solutions.
    • Collaborate with partners (software vendors) to build joint industry solutions.
    • Serve as the data supply chain expert for the vision and integration of emerging data technologies on cloud, the anticipation of new trends, and the resolution of complex business and technical problems.
    • Lead the discussions and early analysis of data-on-cloud concepts as they relate to Accenture's Data supply chain service offerings, so that clear use cases are developed and prioritized, and transition those concepts from ideas to working prototypes with the full support of the teams who will develop the new offerings.
    • Evaluate alliance technologies for potential go-to-market partnerships.
    • Lead the development of offering proofs-of-concept and effectively transition those concepts to the lines of business for full architecture, engineering, deployment, and commercialization.
    • Coach and mentor both senior and junior members across OGs and IDC.
    • Develop practical solutions, methodologies, solution patterns, and point-of-view documents.
    • Manage and grow the Data and Data-on-Cloud pipeline.
    • Participate in industry events to project Accenture's thought leadership.
A professional at this position level within Accenture has the following responsibilities:
    • Evaluates emerging technologies, shapes Accenture's point of view, defines new architecture patterns and standards, and leads proofs of concept for innovative solutions.
    • Leads and supports client sales pursuits.
    • Collaborates with a highly talented resource pool and helps lead the Community of Data practice.
    • Key participant in setting strategic direction to establish near-term goals for the area of responsibility.
    • Interacts with senior management levels at a client and/or within Accenture, which involves negotiating or influencing on significant matters.
    • Has latitude in decision-making and in determining objectives and approaches to critical assignments.
    • Decisions have a lasting impact on the area of responsibility, with impact outside that area.
    • Manages large teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.

    Basic Qualifications
      • 5 years of hands-on experience with public cloud platforms such as AWS, Microsoft Azure, and GCP
      • 12+ years of experience in multiple disciplines, such as solution or technical architecture, data management, cloud architecture, or Big Data
      • 5 years of experience with a combination of the AWS cloud platform, S3, Glacier, AWS data services, Redshift, Redshift Spectrum, Lambda functions, Apache Spark, Kafka, and Python
      • Must be able to travel 100% (Mon-Thurs)
      • Bachelor's degree or 12 years of professional experience
      • 5 years of experience developing solutions utilizing:
        • Kafka-based streaming services
        • RStudio
        • RDS, S3, Glacier
        • MapReduce, Pig, Hive
        • Scala, Spark
    Preferred Qualifications
      • Experience as a consulting manager at a top-tier consulting firm
      • Ten or more years of experience with complex business/technical architectures and complex client delivery
      • Working knowledge of Big Data tools such as MongoDB, Cassandra, Apache Hadoop, NoSQL databases, Spark, and Hive
      • Experience delivering Big Data solutions in the cloud with AWS
      • Ability to configure and support API and open-source integrations
      • Experience administering Hadoop or other data science and analytics platforms using the technologies above
      • Over 5 years of experience in sales/pre-sales functions, leading pursuits, proposal development, effort estimation, and statements of work
      • Experience with DevOps support
      • Experience designing ingestion, low-latency, and visualization clusters to sustain data loads and meet business and IT expectations
    Professional Skill Requirements
    • Proven success in contributing to a team-oriented environment
    • Proven ability to work creatively and analytically in a problem-solving environment
    • Desire to work in an information systems environment
    • Excellent communication (written and oral) and interpersonal skills
    • Ability to work with senior client executives
    Our consulting professionals receive comprehensive training covering business acumen, technical and professional skills development. You'll also have opportunities to hone your functional skills and expertise in an area of specialization. We offer a variety of formal and informal training programs at every level to help you acquire and build specialized skills faster. Learning takes place both on the job and through formal training conducted online, in the classroom, or in collaboration with teammates. The sheer variety of work we do, and the experience it offers, provide an unbeatable platform from which to build a career.
    Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H-1B visa, F-1 visa (OPT), TN visa, or any other non-immigrant status).
    Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
    Accenture is an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
    Equal Employment Opportunity
    All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
    Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
    Accenture is committed to providing veteran employment opportunities to our service men and women.
Accenture
  • Dallas, TX
Are you ready to step up to the New and take your technology expertise to the next level?
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture and make delivering innovative work part of your extraordinary career.
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward. We partner with our clients to help transform their data into an Appreciating Business Asset.
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software, better! There will never be a typical day, and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of technology and data experts who are highly collaborative, taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in Technology at Accenture!
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
AWS Big Data Engineering professionals develop deep data integration skills to support Accenture's cloud Big Data engineering and analytics agendas, including skills such as cloud Big Data platform architecture, AWS data services, on-premise-to-cloud data migration, data ingestion, and data curation.
    • Act as a technical and solution expert in the areas of Big Data management, Data on Cloud, AWS data services, and AI/machine learning capabilities in data platforms.
    • Advise clients on their Data-on-Cloud adoption journey, leveraging next-generation information platforms on cloud, cloud data architecture patterns, and platform selection.
    • Build senior stakeholder relationships.
    • Build a personal brand within Accenture and drive thought leadership through participation in business development efforts, client meetings and workshops, speaking at industry conferences, publishing white papers, etc.
    • Partner with client teams and clients on data monetization initiatives, making business sense of structured, semi-structured, unstructured, and streaming data to develop new business strategies and customer engagement models, manage product portfolios, and optimize enterprise assets.
    • Develop industry-relevant data analytics solutions for enterprise business functions.
    • Collaborate with go-to-market teams to generate demand and pipeline for data analytics solutions.
    • Collaborate with partners (software vendors) to build joint industry solutions.
    • Serve as the data supply chain expert for the vision and integration of emerging data technologies on cloud, the anticipation of new trends, and the resolution of complex business and technical problems.
    • Lead the discussions and early analysis of data-on-cloud concepts as they relate to Accenture's Data supply chain service offerings, so that clear use cases are developed and prioritized, and transition those concepts from ideas to working prototypes with the full support of the teams who will develop the new offerings.
    • Evaluate alliance technologies for potential go-to-market partnerships.
    • Lead the development of offering proofs-of-concept and effectively transition those concepts to the lines of business for full architecture, engineering, deployment, and commercialization.
    • Coach and mentor both senior and junior members across OGs and IDC.
    • Develop practical solutions, methodologies, solution patterns, and point-of-view documents.
    • Manage and grow the Data and Data-on-Cloud pipeline.
    • Participate in industry events to project Accenture's thought leadership.
A professional at this position level within Accenture has the following responsibilities:
    • Evaluates emerging technologies, shapes Accenture's point of view, defines new architecture patterns and standards, and leads proofs of concept for innovative solutions.
    • Leads and supports client sales pursuits.
    • Collaborates with a highly talented resource pool and helps lead the Community of Data practice.
    • Key participant in setting strategic direction to establish near-term goals for the area of responsibility.
    • Interacts with senior management levels at a client and/or within Accenture, which involves negotiating or influencing on significant matters.
    • Has latitude in decision-making and in determining objectives and approaches to critical assignments.
    • Decisions have a lasting impact on the area of responsibility, with impact outside that area.
    • Manages large teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.

    Basic Qualifications
      • 5 years of hands-on experience with public cloud platforms such as AWS, Microsoft Azure, and GCP
      • 12+ years of experience in multiple disciplines, such as solution or technical architecture, data management, cloud architecture, or Big Data
      • 5 years of experience with a combination of the AWS cloud platform, S3, Glacier, AWS data services, Redshift, Redshift Spectrum, Lambda functions, Apache Spark, Kafka, and Python
      • Must be able to travel 100% (Mon-Thurs)
      • Bachelor's degree or 12 years of professional experience
      • 5 years of experience developing solutions utilizing:
        • Kafka-based streaming services
        • RStudio
        • RDS, S3, Glacier
        • MapReduce, Pig, Hive
        • Scala, Spark
    Preferred Qualifications
      • Experience as a consulting manager at a top-tier consulting firm
      • Ten or more years of experience with complex business/technical architectures and complex client delivery
      • Working knowledge of Big Data tools such as MongoDB, Cassandra, Apache Hadoop, NoSQL databases, Spark, and Hive
      • Experience delivering Big Data solutions in the cloud with AWS
      • Ability to configure and support API and open-source integrations
      • Experience administering Hadoop or other data science and analytics platforms using the technologies above
      • Over 5 years of experience in sales/pre-sales functions, leading pursuits, proposal development, effort estimation, and statements of work
      • Experience with DevOps support
      • Experience designing ingestion, low-latency, and visualization clusters to sustain data loads and meet business and IT expectations
    Professional Skill Requirements
    • Proven success in contributing to a team-oriented environment
    • Proven ability to work creatively and analytically in a problem-solving environment
    • Desire to work in an information systems environment
    • Excellent communication (written and oral) and interpersonal skills
    • Ability to work with senior client executives
    Our consulting professionals receive comprehensive training covering business acumen, technical and professional skills development. You'll also have opportunities to hone your functional skills and expertise in an area of specialization. We offer a variety of formal and informal training programs at every level to help you acquire and build specialized skills faster. Learning takes place both on the job and through formal training conducted online, in the classroom, or in collaboration with teammates. The sheer variety of work we do, and the experience it offers, provide an unbeatable platform from which to build a career.
    Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H-1B visa, F-1 visa (OPT), TN visa, or any other non-immigrant status).
    Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
    Accenture is an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
    Equal Employment Opportunity
    All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
    Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
    Accenture is committed to providing veteran employment opportunities to our service men and women.
Accenture
  • Minneapolis, MN
Are you ready to step up to the New and take your technology expertise to the next level?
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture and make delivering innovative work part of your extraordinary career.
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward. We partner with our clients to help transform their data into an Appreciating Business Asset.
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software, better! There will never be a typical day, and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of technology and data experts who are highly collaborative, taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in Technology at Accenture!
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
AWS Big Data Engineering professionals develop deep data integration skills to support Accenture's cloud Big Data engineering and analytics agendas, including skills such as cloud Big Data platform architecture, AWS data services, on-premise-to-cloud data migration, data ingestion, and data curation.
    • Act as a technical and solution expert in the areas of Big Data management, Data on Cloud, AWS data services, and AI/machine learning capabilities in data platforms.
    • Advise clients on their Data-on-Cloud adoption journey, leveraging next-generation information platforms on cloud, cloud data architecture patterns, and platform selection.
    • Build senior stakeholder relationships.
    • Build a personal brand within Accenture and drive thought leadership through participation in business development efforts, client meetings and workshops, speaking at industry conferences, publishing white papers, etc.
    • Partner with client teams and clients on data monetization initiatives, making business sense of structured, semi-structured, unstructured, and streaming data to develop new business strategies and customer engagement models, manage product portfolios, and optimize enterprise assets.
    • Develop industry-relevant data analytics solutions for enterprise business functions.
    • Collaborate with go-to-market teams to generate demand and pipeline for data analytics solutions.
    • Collaborate with partners (software vendors) to build joint industry solutions.
    • Serve as the data supply chain expert for the vision and integration of emerging data technologies on cloud, the anticipation of new trends, and the resolution of complex business and technical problems.
    • Lead the discussions and early analysis of data-on-cloud concepts as they relate to Accenture's Data supply chain service offerings, so that clear use cases are developed and prioritized, and transition those concepts from ideas to working prototypes with the full support of the teams who will develop the new offerings.
    • Evaluate alliance technologies for potential go-to-market partnerships.
    • Lead the development of offering proofs-of-concept and effectively transition those concepts to the lines of business for full architecture, engineering, deployment, and commercialization.
    • Coach and mentor both senior and junior members across OGs and IDC.
    • Develop practical solutions, methodologies, solution patterns, and point-of-view documents.
    • Manage and grow the Data and Data-on-Cloud pipeline.
    • Participate in industry events to project Accenture's thought leadership.
A professional at this position level within Accenture has the following responsibilities:
    • Evaluates emerging technologies, shapes Accenture's point of view, defines new architecture patterns and standards, and leads proofs of concept for innovative solutions.
    • Leads and supports client sales pursuits.
    • Collaborates with a highly talented resource pool and helps lead the Community of Data practice.
    • Key participant in setting strategic direction to establish near-term goals for the area of responsibility.
    • Interacts with senior management levels at a client and/or within Accenture, which involves negotiating or influencing on significant matters.
    • Has latitude in decision-making and in determining objectives and approaches to critical assignments.
    • Decisions have a lasting impact on the area of responsibility, with impact outside that area.
    • Manages large teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.

    Basic Qualifications
      • 5 years of hands-on experience with public cloud platforms such as AWS, Microsoft Azure, and GCP
      • 12+ years of experience in multiple disciplines, such as solution or technical architecture, data management, cloud architecture, or Big Data
      • 5 years of experience with a combination of the AWS cloud platform, S3, Glacier, AWS data services, Redshift, Redshift Spectrum, Lambda functions, Apache Spark, Kafka, and Python
      • Must be able to travel 100% (Mon-Thurs)
      • Bachelor's degree or 12 years of professional experience
      • 5 years of experience developing solutions utilizing:
        • Kafka-based streaming services
        • RStudio
        • RDS, S3, Glacier
        • MapReduce, Pig, Hive
        • Scala, Spark
    Preferred Qualifications
      • Experience as a consulting manager at a top-tier consulting firm
      • Ten or more years of experience with complex business/technical architectures and complex client delivery
      • Working knowledge of Big Data tools such as MongoDB, Cassandra, Apache Hadoop, NoSQL databases, Spark, and Hive
      • Experience delivering Big Data solutions in the cloud with AWS
      • Ability to configure and support API and open-source integrations
      • Experience administering Hadoop or other data science and analytics platforms using the technologies above
      • Over 5 years of experience in sales/pre-sales functions, leading pursuits, proposal development, effort estimation, and statements of work
      • Experience with DevOps support
      • Experience designing ingestion, low-latency, and visualization clusters to sustain data loads and meet business and IT expectations
    Professional Skill Requirements
    • Proven success in contributing to a team-oriented environment
    • Proven ability to work creatively and analytically in a problem-solving environment
    • Desire to work in an information systems environment
    • Excellent communication (written and oral) and interpersonal skills
    • Ability to work with senior client executives
    Our consulting professionals receive comprehensive training covering business acumen, technical and professional skills development. You'll also have opportunities to hone your functional skills and expertise in an area of specialization. We offer a variety of formal and informal training programs at every level to help you acquire and build specialized skills faster. Learning takes place both on the job and through formal training conducted online, in the classroom, or in collaboration with teammates. The sheer variety of work we do, and the experience it offers, provide an unbeatable platform from which to build a career.
    Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H-1B visa, F-1 visa (OPT), TN visa, or any other non-immigrant status).
    Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
    Accenture is an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
    Equal Employment Opportunity
    All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
    Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
    Accenture is committed to providing veteran employment opportunities to our service men and women.