OnlyDataJobs.com

SafetyCulture
  • Surry Hills, Australia
  • Salary: A$120k - 140k

The Role



  • Be an integral member of the team responsible for designing, implementing and maintaining a distributed, big-data-capable system built from high-quality components (Kafka, EMR + Spark, Akka, etc.).

  • Embrace the challenge of dealing with big data on a daily basis (Kafka, RDS, Redshift, S3, Athena, Hadoop/HBase), perform data ETL, and build tools for reliable data ingestion from multiple data sources (a minimal sketch follows this list).

  • Collaborate closely with data infrastructure engineers and data analysts across different teams to find bottlenecks and solve them

  • Design, implement and maintain the heterogeneous data processing platform to automate the execution and management of data-related jobs and pipelines

  • Implement automated data workflows in collaboration with data analysts, and continue to maintain and improve the system in line with growth

  • Collaborate with Software Engineers on application events, ensuring the right data can be extracted

  • Contribute to resource management for computation and capacity planning

  • Dive deep into code and constantly innovate
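
To make the ETL work above concrete, here is a minimal PySpark sketch of a batch ingestion job of the kind this role describes: read raw events from S3, cleanse them, and write curated, partitioned Parquet back out. The bucket paths, column names and schema are illustrative assumptions, not SafetyCulture's actual pipeline.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("events-etl").getOrCreate()

    # Read raw application events landed in S3 (bucket and layout are assumptions).
    raw = spark.read.json("s3://example-bucket/raw/events/")

    # Basic cleansing: drop rows missing key fields, normalise timestamps, de-duplicate.
    clean = (
        raw.dropna(subset=["event_id", "event_type", "event_ts"])
           .withColumn("event_ts", F.to_timestamp("event_ts"))
           .dropDuplicates(["event_id"])
    )

    # Write partitioned Parquet for downstream consumers (e.g. Athena or Redshift Spectrum).
    clean.write.mode("overwrite").partitionBy("event_type").parquet(
        "s3://example-bucket/curated/events/"
    )

In a production pipeline a job like this would typically be scheduled and monitored by the heterogeneous data processing platform mentioned above.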


Requirements



  • Experience with AWS data technologies (EC2, EMR, S3, Redshift, ECS, Data Pipeline, etc) and infrastructure.

  • Working knowledge of big data frameworks such as Apache Spark, Kafka, ZooKeeper, Hadoop, Flink, Storm, etc.

  • Rich experience with Linux and database systems

  • Experience with relational and NoSQL databases, query optimization, and data modelling

  • Familiar with one or more of the following: Scala/Java, SQL, Python, Shell, Golang, R, etc

  • Experience with container technologies (Docker, k8s), Agile development, DevOps and CI tools.

  • Excellent problem-solving skills

  • Excellent verbal and written communication skills 

Accenture
  • Atlanta, GA
Are you ready to step up to the New and take your technology expertise to the next level?
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a highly collaborative and growing network of technology and data experts, who are taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. You will have an opportunity to work in roles such as Data Scientist, Data Engineer, or Chief Data Officer covering all aspects of Data including Data Management, Data Governance, Data Intelligence, Knowledge Graphs, and IoT. Come grow your career in Technology at Accenture!
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture, and make delivering innovative work part of your extraordinary career.
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward.
As part of our Advanced Technology & Architecture (AT&A) practice, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of highly collaborative technology experts taking on today's biggest, most complex business challenges. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in technology at Accenture!
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
    • Produce clean, standards-based, modern code with an emphasis on advocacy toward end users, yielding high-quality software designs that are well-documented.
    • Demonstrate an understanding of technology and digital frameworks in the context of data integration.
    • Ensure code and design quality through the execution of test plans and assist in development of standards, methodology and repeatable processes, working closely with internal and external design, business, and technical counterparts.
Big Data professionals develop deep next generation Analytics skills to support Accenture's data and analytics agendas, including skills such as Data Modeling, Business Intelligence, and Data Management.
A professional at this position level within Accenture has the following responsibilities:
    • Utilizes existing methods and procedures to create designs within the proposed solution to solve business problems.
    • Understands the strategic direction set by senior management as it relates to team goals.
    • Contributes to design of solution, executes development of design, and seeks guidance on complex technical challenges where necessary.
    • Primary upward interaction is with direct supervisor.
    • May interact with peers, client counterparts and/or management levels within Accenture.
    • Understands methods and procedures on new assignments and executes deliverables with guidance as needed.
    • May interact with peers and/or management levels at a client and/or within Accenture.
    • Determines methods and procedures on new assignments with guidance.
    • Decisions often impact the team in which they reside.
    • Manages small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
Basic Qualifications
    • Minimum 2+ years of hands-on technical experience implementing or supporting Big Data solutions utilizing Hadoop.
    • Experience developing solutions utilizing at least two of the following:
      • Kafka-based streaming services
      • R Studio
      • Cassandra, MongoDB
      • MapReduce, Pig, Hive
      • Scala, Spark
      • Knowledge of Jenkins, Chef, Puppet
Preferred Qualifications
    • Hands-on technical experience utilizing Python
    • Full life cycle development experience
    • Experience with delivering Big Data Solutions in the cloud with AWS or Azure
    • Ability to configure and support API and OpenSource integrations
    • Experience administering Hadoop or other Data Science and Analytics platforms using the technologies above
    • Experience with DevOps support
    • Designing ingestion, low latency, visualization clusters to sustain data loads and meet business and IT expectations
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H1-B visa, F-1 visa (OPT), TN visa or any other non-immigrant status).
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is a Federal Contractor and an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women.
Accenture
  • Raleigh, NC
Are you ready to step up to the New and take your technology expertise to the next level?
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a highly collaborative and growing network of technology and data experts, who are taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. You will have an opportunity to work in roles such as Data Scientist, Data Engineer, or Chief Data Officer covering all aspects of Data including Data Management, Data Governance, Data Intelligence, Knowledge Graphs, and IoT. Come grow your career in Technology at Accenture!
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture, and make delivering innovative work part of your extraordinary career.
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward.
As part of our Advanced Technology & Architecture (AT&A) practice, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of highly collaborative technology experts taking on today's biggest, most complex business challenges. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in technology at Accenture!
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
    • Produce clean, standards-based, modern code with an emphasis on advocacy toward end users, yielding high-quality software designs that are well-documented.
    • Demonstrate an understanding of technology and digital frameworks in the context of data integration.
    • Ensure code and design quality through the execution of test plans and assist in development of standards, methodology and repeatable processes, working closely with internal and external design, business, and technical counterparts.
Big Data professionals develop deep next generation Analytics skills to support Accenture's data and analytics agendas, including skills such as Data Modeling, Business Intelligence, and Data Management.
A professional at this position level within Accenture has the following responsibilities:
    • Utilizes existing methods and procedures to create designs within the proposed solution to solve business problems.
    • Understands the strategic direction set by senior management as it relates to team goals.
    • Contributes to design of solution, executes development of design, and seeks guidance on complex technical challenges where necessary.
    • Primary upward interaction is with direct supervisor.
    • May interact with peers, client counterparts and/or management levels within Accenture.
    • Understands methods and procedures on new assignments and executes deliverables with guidance as needed.
    • May interact with peers and/or management levels at a client and/or within Accenture.
    • Determines methods and procedures on new assignments with guidance.
    • Decisions often impact the team in which they reside.
    • Manages small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
Basic Qualifications
    • Minimum 2+ years of hands-on technical experience implementing or supporting Big Data solutions utilizing Hadoop.
    • Experience developing solutions utilizing at least two of the following:
      • Kafka-based streaming services
      • R Studio
      • Cassandra, MongoDB
      • MapReduce, Pig, Hive
      • Scala, Spark
      • Knowledge of Jenkins, Chef, Puppet
Preferred Qualifications
    • Hands-on technical experience utilizing Python
    • Full life cycle development experience
    • Experience with delivering Big Data Solutions in the cloud with AWS or Azure
    • Ability to configure and support API and OpenSource integrations
    • Experience administering Hadoop or other Data Science and Analytics platforms using the technologies above
    • Experience with DevOps support
    • Designing ingestion, low latency, visualization clusters to sustain data loads and meet business and IT expectations
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H1-B visa, F-1 visa (OPT), TN visa or any other non-immigrant status).
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is a Federal Contractor and an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women.
Ultra Tendency
  • Riga, Latvia

Are you a developer who also loves to look at infrastructure? A systems engineer who likes to write code? Ultra Tendency is looking for you!


Your Responsibilities:



  • Support our customers and development teams in transitioning capabilities from development and testing to operations

  • Deploy and extend large-scale server clusters for our clients

  • Fine-tune and optimize our clusters to process millions of records every day 

  • Learn something new every day and enjoy solving complex problems


Job Requirements:



  • You know Linux like the back of your hand

  • You love to automate all the things – SaltStack, Ansible, Terraform and Puppet are your daily business

  • You can write code in Python, Java, Ruby or similar languages.

  • You are driven by high quality standards and attention to detail

  • Understanding of the Hadoop ecosystem and knowledge of Docker is a plus


We offer:



  • Work with our open-source Big Data gurus, such as our Apache HBase committer and Release Manager

  • Work in the open-source community and become a contributor. Learn from open-source enthusiasts you will find nowhere else in Germany!

  • Work in an English-speaking, international environment

  • Work with cutting edge equipment and tools

UST Global
  • San Diego, CA

KEY SKILLSETS

- 7+ years experience with Python

- 4+ years experience with Java


General Responsibilities
- Selecting features, building and optimizing classifiers using machine learning techniques
- Data mining using state of the art methods
- Extending business data with third party sources of information when needed
- Enhancing data collection procedures to include information that is relevant for building analytic systems
- Processing, cleansing, and verifying the integrity of data used for analysis
- Doing ad hoc analysis and presenting results in a clear manner
- Creating automated anomaly detection systems and constantly tracking their performance (a minimal sketch follows this list)
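
As a rough illustration of the anomaly detection item above, here is a minimal scikit-learn sketch that fits an IsolationForest to a toy feature matrix and flags outliers. The data, feature count and contamination rate are assumptions for illustration, not the actual system.

    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(42)
    # Toy feature matrix: 1000 "normal" rows plus 20 injected outliers.
    normal = rng.normal(loc=0.0, scale=1.0, size=(1000, 4))
    outliers = rng.normal(loc=6.0, scale=1.0, size=(20, 4))
    X = np.vstack([normal, outliers])

    # Fit an unsupervised anomaly detector and label each row: -1 = anomaly, 1 = inlier.
    model = IsolationForest(n_estimators=200, contamination=0.02, random_state=42)
    labels = model.fit_predict(X)

    print("flagged anomalies:", int((labels == -1).sum()))

In production, a detector like this would be retrained on a schedule and its precision and recall tracked over time, which is what constant performance tracking refers to.
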
Skills and Qualifications
- Minimum 8 years of experience
- Hands-on experience in Python
- Excellent understanding of machine learning techniques and algorithms.
- Experience with common data science toolkits, such as R, Weka, NumPy, MATLAB, etc. Excellence in at least one of these is highly desirable
- Great communication skills
- Experience with data visualization tools, such as ggplot, etc.
- Proficiency in using query languages such as SQL, Hive, Pig
- Experience with NoSQL databases, such as MongoDB
- Good applied statistics skills, such as distributions, statistical testing, and regression

UST Global
  • Atlanta, GA

KEY SKILLSETS

- 7+ years experience with Python

- 4+ years experience with Java


General Responsibilities
- Selecting features, building and optimizing classifiers using machine learning techniques
- Data mining using state of the art methods
- Extending business data with third party sources of information when needed
- Enhancing data collection procedures to include information that is relevant for building analytic systems
- Processing, cleansing, and verifying the integrity of data used for analysis
- Doing ad hoc analysis and presenting results in a clear manner
- Creating automated anomaly detection systems and constantly tracking their performance
Skills and Qualifications
- Minimum 8 years of experience
- Hands-on experience in Python
- Excellent understanding of machine learning techniques and algorithms.
- Experience with common data science toolkits, such as R, Weka, NumPy, MATLAB, etc. Excellence in at least one of these is highly desirable
- Great communication skills
- Experience with data visualization tools, such as ggplot, etc.
- Proficiency in using query languages such as SQL, Hive, Pig
- Experience with NoSQL databases, such as MongoDB
- Good applied statistics skills, such as distributions, statistical testing, and regression

inovex GmbH
  • Karlsruhe, Germany

Together with your supervisor, you will work on your project from one of the topic areas mentioned above and complete your thesis independently with us. Various experts from the IT Engineering & Operations team will also be available to support you.

During the application process you are welcome to work out a topic with your future supervisor, or you can choose one of the following topics:



  • Interplay between classical networks and SDNs

  • VM workloads on container clusters (scheduling, orchestration)

  • Trusted Container Computing

  • Encrypted persistent volumes for containers

  • Service scale-out based on anycast

  • Automated performance optimization of Hadoop clusters

  • Multi-tenant container clusters (overlay and underlay clusters)

  • Hadoop in the cloud (containers, HaaS, PaaS)

  • Kubernetes-as-a-Service (KaaS)

  • Bursting Hadoop load peaks to the cloud

  • Hadoop hybrid cloud (data on premise, compute in the cloud)

  • Intel QuickAssist in an open-source context

  • Impact of agile development on IT operations and the company (DevOps)

  • Application performance management in the context of microservice architectures


Who would be a good fit for us:



  • Initial experience and knowledge of Linux and networks in general, gained privately, at school or during your studies

  • A strong willingness to learn and perform

  • Passion and enthusiasm for new technologies and topics around Linux

  • Good communication skills and very good written and spoken German and English

GTN Technical Staffing and Consulting
  • Dallas, TX

This position requires a broad array of programming skills and experience as well as the desire to learn and grow in an entrepreneurial environment. You will be responsible for creating and developing client onboarding, product provisioning and real-time analytics software services. You will work closely with other members of the architecture team to make strategic decisions about product development, devops and future technology choices.

The ideal candidate should:

  • Demonstrate a proven track record of rapidly building, delivering, and maintaining complex software products.
  • Possess excellent communication skills.
  • Have high integrity.
  • Embrace learning and have a thirst for knowledge.
  • Enjoy solving problems and finding generic solutions to challenging situations.
  • Enjoy being a mentor for junior team members.
  • Be self-motivated to innovate and develop cutting edge technology functionality.
  • Be able to rapidly learn new frameworks.
  • Be responsible for creating and implementing core product architecture. Be comfortable developing frontend and backend solutions.

This position reports directly to the CTO.

Required experience:

7+ years hands-on experience

  • AWS (EC2, Lambda, ECS)
  • Docker/Kubernetes
  • 3+ years programming in Scala
  • 3+ years programming in Node.js
  • ES6/modern JavaScript
  • Microservices

Preferred experience:

  • MongoDB
  • SBT
  • Serverless Computing

The ideal candidate will:

  • Possess excellent communication and organization skills
  • Embrace learning and have a thirst for knowledge
  • Rapidly learn new technological paradigms
  • Understand and program in multiple programming languages
  • Enjoy being part of a growing team
  • Self-motivated team player

Benefits

  • Medical / Dental / Vision Insurance
  • 401(k)
  • Competitive compensation
  • Work with leaders in the industry
  • Opportunities to learn and grow every day
  • Play a meaningful role on a successful team
SoftClouds LLC
  • San Diego, CA

Job Overview: SoftClouds is looking for a Data Engineer to join our analytics platform team in designing and developing the next generation data and analytics solutions. The candidate should have deep technical skills as well as the ability to understand data and analytics, and an openness to working with disparate platforms, data sources and data formats.


Roles and Responsibilities:
  • Experience with MySQL, MS SQL Server, Hadoop, or MongoDB.
  • Writing SQL queries and table joins (a small Spark SQL sketch follows this list).
  • AWS, Python, or Bash shell scripting
  • Have some experience pulling data from Hadoop.
  • Analyze data, system and data flows and develop effective ways to store and present data in BI applications
  • ETL experience a plus.
  • Work with data from disparate environments including Hadoop, MongoDB, Talend, and other SQL and NoSQL data stores
  • Help develop the next generation analytics platform
  • Proactively ensure data integrity and focus on continuous performance improvements of existing processes.
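
As a small illustration of the SQL, joins and Spark items in this posting, the sketch below registers two datasets as temporary views and runs a reporting join with Spark SQL. The table paths and column names are assumptions for illustration, not the actual schema.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("bi-report").getOrCreate()

    # Register source datasets as temporary views (paths and names are illustrative).
    spark.read.parquet("/data/orders").createOrReplaceTempView("orders")
    spark.read.parquet("/data/customers").createOrReplaceTempView("customers")

    # A typical reporting join: total order amount per customer region.
    report = spark.sql("""
        SELECT c.region, SUM(o.amount) AS total_amount
        FROM orders o
        JOIN customers c ON o.customer_id = c.customer_id
        GROUP BY c.region
    """)
    report.show()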


Required skills and experience:
  • 5 or more years of experience in software development
  • 3 years of experience in writing data applications using Spark
  • Experience in Java and Python
  • Familiarity with Agile development methodology
  • Experience with Scala is a plus
  • Experience with NoSQL databases (e.g., Cassandra) is a plus
  • Expertise in Apache Spark & Hadoop.
  • Expertise in machine learning algorithms


Education / Experience:

  • Bachelor's Degree in Engineering or Computer Science or related field required.
  • U.S. Citizens/GC/GC EAD are encouraged to apply. We are unable to sponsor at this time. NO C2C or third-party agencies.



Hulu
  • Santa Monica, CA

WHAT YOU’LL DO



  • Build robust and scalable micro-services

  • End to end ownership of backend services: Ideate, review design, build, code-review, test, load-test, launch, monitor performance

  • Identify opportunities to optimize the ad delivery algorithm: measure and monitor ad-break utilization for ad count and ad duration.

  • Work with product team to translate requirements into well-defined technical implementation

  • Define technical and operational KPIs to measure ad delivery health

  • Build Functional and Qualitative Test frameworks for ad server

  • Challenge our team and software to be even better


WHAT TO BRING



  • BS or MS in Computer Science/Engineering

  • 7+ years of relevant software engineering experience

  • Strong analytical skills

  • Strong programming (Java/C#/C++ or other related programming languages) and scripting skills

  • Great communication, collaboration skills and a strong teamwork ethic

  • Strive for excellence


NICE-TO-HAVES



  • Experience with non-relational database technologies (MongoDB, Cassandra, DynamoDB)

  • Experience with Redis and/or MemCache

  • Experience with Apache Kafka and/or Kinesis

  • AWS

  • Big Data technologies and data warehouses – Spark, Hadoop, Redshift

idealo internet GmbH
  • Berlin, Deutschland

We have made it our mission to develop the best price comparison and the most enjoyable shopping experience for customers on smartphones and tablets. The team needs energetic full stack support to create a new product from scratch. Are topics like event streaming, GraphQL or OpenShift exciting to you? Do you find elegant solutions for complex architectural problems? If so, then we would like to get to know you.


Your tasks:



  • At idealo you belong to a cross-functional team of concept developers, designers and software engineers who develop products and software at the highest level.

  • We work in an agile team, and therefore you accompany the entire development process from an idea to a product that hundreds of thousands of users hold in their hands every day.

  • You work with modern technologies such as Kotlin, GraphQL, Apache Kafka, MongoDB, Kubernetes, Spring Boot and the Spring Cloud/Netflix OSS stack.

  • Design and implement scalable and resilient microservices and apps, working closely with other teams on common components

  • You continuously improve the code, the development process and contribute with your knowledge to the growth of the team.


What you bring with you:



  • You have solid experience as a software developer or software architect

  • You have sound knowledge in the design of distributed systems and parallel data processing.

  • You have excellent knowledge of Java and the Spring Framework, and ideally already some Kotlin.

  • Concepts such as Software Craftsmanship, Clean Code and Continuous Deployment are top priorities for you

  • With an agile mindset, you prefer a fast go-live without losing sight of quality.

  • You have a well-founded opinion and stand by it, and you respect it when others do the same.

  • You like to communicate and are open to new things

idealo internet GmbH
  • Berlin, Deutschland

We have made it our mission to develop the best price comparison and the most enjoyable shopping experience for customers on smartphones and tablets. The team needs energetic full stack support to create a new product from scratch. Are topics like event streaming, GraphQL or OpenShift exciting to you? Do you find elegant solutions for complex architectural problems? If so, then we would like to get to know you.


Your tasks:



  • At idealo you belong to a cross-functional team of concept developers, designers and software engineers who develop products and software at the highest level.

  • We work in an agile team, and therefore you accompany the entire development process from an idea to a product that hundreds of thousands of users hold in their hands every day.

  • You work with modern technologies such as Kotlin, GraphQL, Apache Kafka, MongoDB, Kubernetes, Spring Boot and the Spring Cloud/Netflix OSS stack.

  • Design and implement scalable and resilient microservices and apps, working closely with other teams on common components

  • You continuously improve the code, the development process and contribute with your knowledge to the growth of the team.


Skills & Requirements:



  • You have solid experience as a software developer or software architect

  • You have sound knowledge in the design of distributed systems and parallel data processing.

  • You have excellent knowledge of Java and the Spring Framework, and ideally already some Kotlin.

  • Concepts such as Software Craftsmanship, Clean Code and Continuous Deployment are top priorities for you

  • With an agile mindset, you prefer a fast go-live without losing sight of quality.

  • You have a well-founded opinion and stand by it, and you respect it when others do the same.

  • You like to communicate and are open to new things

Hulu
  • Santa Monica, CA

WHAT YOU’LL DO



  • Build elegant systems that are robust and scalable

  • Challenge our team and software to be even better

  • Use a mix of technologies including Scala, Ruby, Python, and Angular JS


WHAT TO BRING



  • BS or MS in Computer Science/Engineering

  • 5+ years of relevant software engineering experience

  • Strong programming (Java/C#/C++ or other related programming languages) and scripting skills

  • Great communication, collaboration skills and a strong teamwork ethic

  • Strive for excellence


NICE-TO-HAVES



  • Experience with both statically typed languages and dynamic languages

  • Experience with relational (Oracle, MySQL) and non-relational database technologies (MongoDB, Cassandra, DynamoDB)

Beamery
  • London, UK
  • Salary: £45k - 75k

We’re building a new generation of AI-driven recruiting tech that changes the way people do their job. To make that happen, we work hard at understanding what our users need, and we build it using cutting-edge infrastructures and a diverse stack of languages and frameworks. Our customers are some of the most innovative and quality-driven companies in the world, and we can’t bring them a product that is less than brilliant. We’re always looking for engineers who have the same passion for quality and customer happiness.


At Beamery, you will constantly be learning and teaching others. You will have a sense of ownership over the product and will take pride in your work. The best practices of the team will be influenced by your voice, and there will always be space and time for you to experiment and bring new ideas to the table.    


As a Back End Developer on the Beamteam, you will be working with a team of experienced engineers to build the next generation back end architecture of our services.


The right engineer will:



  • Have strong Node.js skills with ES6+ and TypeScript, covering both functional and object-oriented programming.

  • Have a good understanding of microservice architectures, and have experience using pub/sub architectures and Apache Kafka.

  • Have in-depth understanding of MongoDB and ORM systems, and good knowledge of the ELK stack.

  • Have a good understanding of TDD/BDD and test automation suites.

  • Enjoy using a wide variety of tools, and will be happy to pick up and learn new things.

  • Enjoy our regular teach-in sessions delivered by industry experts.

  • Have excellent communication skills, both written and spoken.

  • Enjoy being a part of a collaborative team that is focused on building a product that will delight customers.

  • Not be biased toward a specific technology: finding the right tools for the job.


At Beamery you will:



  • Take part in regular collaborative teach-ins, and learn new skills.

  • Build beautiful, scalable products with wide market exposure.

  • Be a lead contributor in our projects and shape the architecture of our back-end services.

  • Have the opportunity to mentor junior developers.

  • Work as part of an Agile team with support from Product and Design.

  • Be exposed to leadership training and experience opportunities.

  • Have access to in-house training courses as well as external conferences and workshops.

Acxiom
  • Austin, TX
As a Hadoop Administrator, you will assist leadership on projects related to Big Data technologies and provide software development support for client research projects. You will analyze the latest Big Data analytics technologies and their innovative applications in both business intelligence analysis and new service offerings. You will bring these insights and best practices to Acxiom's Big Data projects. You must be able to benchmark systems, analyze system bottlenecks and propose solutions to eliminate them. You will develop a highly scalable and extensible Big Data platform that enables collection, storage, modeling, and analysis of massive data sets from numerous channels. You must be a self-starter who continuously evaluates new technologies, innovates and delivers solutions for business-critical applications.


 

What you will do:


  • Responsible for implementation and ongoing administration of Hadoop infrastructure
  • Provide technical leadership and collaboration with engineering organization, develop key deliverables for Data Platform Strategy - Scalability, optimization, operations, availability, roadmap.
  • Own the platform architecture and drive it to the next level of effectiveness to support current and future requirements
  • Cluster maintenance as well as creation and removal of nodes using tools like Cloudera Manager Enterprise, etc.
  • Performance tuning of Hadoop clusters and Hadoop MapReduce routines
  • Screen Hadoop cluster job performances and capacity planning
  • Help optimize and integrate new infrastructure via continuous integration methodologies (DevOps CHEF)
  • Manage and review Hadoop log files with the help of log management technologies (ELK)
  • Provide top-level technical help desk support for the application developers
  • Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality, availability and security
  • Collaborating with application teams to perform Hadoop updates, patches, version upgrades when required
  • Mentor Hadoop engineers and administrators
  • Work with Vendor support teams on support tasks


Do you have?


  • Bachelor's degree in related field of study, or equivalent experience
  • 3+ years of Big Data Administration experience
  • Extensive knowledge of Hadoop based data manipulation/storage technologies such as HDFS, MapReduce, Yarn, HBASE, HIVE, Pig, Impala and Sentry
  • Experience in capacity planning, cluster designing and deployment, troubleshooting and performance tuning
  • Great operational expertise such as good troubleshooting skills, understanding of system's capacity, bottlenecks, basics of memory, CPU, OS, storage, and networks
  • Experience in Hadoop cluster migrations or upgrades
  • Strong Linux/SAN administration skills and RDBMS/ETL knowledge
  • Good experience with Cloudera/Hortonworks/MapR distributions along with monitoring/alerting tools (Nagios, Ganglia, Zenoss, Cloudera Manager)
  • Scripting skills in Perl, Python, Shell Scripting, and/or Ruby on Rails
  • Knowledge of Java/J2EE and other web technologies
  • Understanding of On-premise and Cloud network architectures
  • DevOps experience is a great plus (CHEF, Puppet and Ansible)
  • Excellent verbal and written communication skills


 

GTN Technical Staffing and Consulting
  • Dallas, TX

Senior Software Engineer - Scala
HIGHLIGHTS
Location: North Dallas, TX
Position Type: Direct Hire
Hourly / Salary: BOE
Residency Status: US Citizens and US Permanent Residents only, as sponsorship is not being offered at this time.

This position requires a broad array of programming skills and experience as well as the desire to learn and grow in an entrepreneurial environment. You will be responsible for creating and developing client onboarding, product provisioning and real-time analytics software services. You will work closely with other members of the architecture team to make strategic decisions about product development, devops and future technology choices.

The ideal candidate should:

  • Demonstrate a proven track record of rapidly building, delivering, and maintaining complex software products.
  • Possess excellent communication skills.
  • Have high integrity.
  • Embrace learning and have a thirst for knowledge.
  • The ideal candidate will possess a broad array of programming skills and experience, with a focus on writing scalable, high-performance microservices on a platform such as AWS Lambda.
  • Be self-motivated to innovate and develop cutting edge technology functionality.
  • Be able to rapidly learn new frameworks.
  • Be responsible for creating and implementing core product architecture. Be comfortable developing frontend and backend solutions.

Required experience:

7+ years hands-on experience

  • AWS (EC2, Lambda, ECS)
  • Docker/Kubernetes
  • 3+ years programming in Scala
  • 3+ years programming in Node.js
  • ES6/modern JavaScript
  • Microservices

Preferred experience:

  • MongoDB
  • SBT
  • Serverless Computing

The ideal candidate will:

  • Possess excellent communication and organization skills
  • Embrace learning and have a thirst for knowledge
  • Rapidly learn new technological paradigms
  • Understand and program in multiple programming languages
  • Enjoy being part of a growing team
  • Self-motivated team player

Benefits

  • Medical / Dental / Vision Insurance
  • 401(k)
  • Competitive compensation
  • Work with leaders in the industry
  • Opportunities to learn and grow every day
  • Play a meaningful role on a successful team
APIvista
  • Raleigh, NC
  • Location: Richmond, VA
  • Type: Full-time
  • Up to 25% travel possible


Job Description:

APIvista is looking to hire a full-time Data Engineer. APIvista is a development and managed services company that drives business value by leveraging API integrations. Our clients rely on us to ensure their APIs and data integrations are secure and reliably available to their clients and employees.


Our experienced team prides itself on utilizing leading technologies, proven best practices, and driving growth of APIs while emphasizing professional growth for our associates.


While we are headquartered in Richmond, Virginia, we work with clients across the United States.


You:
  • View your clients' success as your own
  • Are passionate about what you do
  • Love to teach yourself new skills
  • Seek opportunities to learn
  • Thrive in ambiguity
  • Enjoy working on a team
  • Easily adapt to new project requirements and client expectations


What You'll Do:
  • Partner with clients to develop and maintain first-class data platforms that include JupyterHub, Databricks, and other data science tools
  • Develop streaming data processors that crunch numbers in real time to help our clients make smart decisions (a minimal sketch follows this list)
  • Collaborate with API developers to build data-driven microservices for our clients
  • Write concise, fun cookbooks to empower our clients
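
As a hedged sketch of the streaming data processors mentioned above, the snippet below consumes JSON events from Kafka with the kafka-python client and keeps a running total per client. The topic name, broker address and message fields are assumptions for illustration only.

    import json
    from collections import defaultdict

    from kafka import KafkaConsumer  # pip install kafka-python

    consumer = KafkaConsumer(
        "client-events",                     # topic name is an assumption
        bootstrap_servers="localhost:9092",  # broker address is an assumption
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    )

    running_totals = defaultdict(float)

    # Update a per-client running total as each event arrives.
    for message in consumer:
        event = message.value
        running_totals[event["client_id"]] += event["amount"]
        print(event["client_id"], running_totals[event["client_id"]])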


Preferred Skills (but not limited to):
  • Designing, developing, scaling, and maintaining data pipelines or reports using Spark, Kafka, Hive, Python or Scala
  • Hadoop Developer
  • Putting modern data platforms into use, including platform as a service variant
  • Communicating complex ideas with clients and technical staff
  • Using Git or GitHub in a CI/CD development workflow
  • Developing microservices using languages like Java, Python or JavaScript and using REST APIs
  • Writing effective technical documentation
  • Automating deployments using DevOps tools like Docker, Ansible, Terraform, or Kubernetes
Anaconda, Inc.
  • Austin, TX

Role: Sr. Product Manager

Reports to: Director, Product Management

Department: Product

Location: Austin, TX

Job Type: Full Time, Exempt      

Help Us Shape the Future of Data

With over 11 million users, Anaconda is the world's most popular Python data science platform. Anaconda, Inc. continues to lead open source projects like Anaconda, NumPy, and SciPy that form the foundation of modern data science. Anaconda's flagship product, Anaconda Enterprise, allows organizations to secure, govern, scale, and extend Anaconda to deliver actionable insights that drive businesses and industries forward.


Anaconda is seeking people who want to play a role in shaping the future of enterprise AI, machine learning, and data science. We strive to create a culture that is both relaxed and focused, and we stress empathy and collaboration with our customers, open source users, and each other. Our primary employee perk is that we are actively working on projects that have a global impact.

Summary

Anaconda is seeking a talented Sr. Product Manager for Anaconda Enterprise. This is an excellent opportunity to play a key role in enabling any company to run a state-of-the-art machine learning discipline, while working with passionate, talented people.


What You'll Do:

    • Understand and clearly define user problems. Work with engineering and design to deliver great solutions in a collaborative fashion.
    • Develop and refine user personas.
    • Use data-driven methodology to prioritize roadmap themes.
    • Present product features and technical direction to internal stakeholders.
    • Ensure that sales, customer success, and support are in the know regarding the direction of the product.
    • Partner with Product Marketing to craft messaging, positioning, and collateral.
    • Tap into the pulse of the open source, data science, and developer communities.
    • Be the interface between disparate stakeholders and roles: Anaconda engineers; community contributors; end users who are data scientists, software developers, sysadmins, and devops engineers.


What You Need:

    • Minimum of 5 years of Product Management experience.
    • Working knowledge of Kubernetes.
    • Superb public speaking and presentation skills.
    • A passion for data science.
    • Ability to travel up to 20% of the time.


What Will Make You Stand Out:

    • You have previous experience as a software developer or devops engineer.
    • You have specific experience working with open source data science technologies, like Hadoop, Spark, or Tensorflow.
    • You have been part of a team that has deployed complex models into production at an enterprise company.


Why You'll Like Working Here:

    • Dynamic company that rewards high-performers
    • Be on the cutting edge of new technologies and services
    • Collaborative team environment that values multiple perspectives and fresh thinking
    • Employees First culture
    • Casual dress code
    • Flexible working hours
    • Medical, Dental, Vision, HSA, Life, and 401K
    • Pre-IPO stock options
    • Unlimited Vacation!
Enghouse Interactive
  • Phoenix, AZ
DevOps Engineer
The DevOps Engineer at Enghouse Interactive Americas will be responsible for the installation, configuration and monitoring of hosted platforms including, but not limited to, contact center, IVR and dialer platforms. These platforms are hosted in both Enghouse Interactive's infrastructure as well as strategic partner data centers and must maintain maximum uptime to meet customer SLAs. The DevOps Engineer will be working closely with R&D, implementation services, and pre-sales efforts to ensure platform capabilities are aligned with go-to-market initiatives and overall operability of the platforms.
This DevOps Engineer position reports to the Director of DevOps and is part of the Enghouse Interactive Americas team focused mainly on North American operations. This is a unique opportunity to become part of a team that is fundamental to the Enghouse Interactive strategy to offer a broad set of cloud (SaaS) solutions to the market.
This position can be located in any of our Enghouse US offices or our Toronto Canada office.
Essential Responsibilities
    • Continually reviews and identifies costs associated and assists migration plans to achieve further cost savings where appropriate
    • Works with R&D teams to continually improve the security, monitoring, and management of hosted platforms
    • Works with Sales Engineering to ensure capabilities of hosted platforms are well understood and align with customer expectations
    • Assists in troubleshooting hosted platforms as part of support investigations and/or onboarding of new customers
    • Participates in an on-call rotation as the escalation path to ensure 24/7/365 availability of hosted platforms

Key Skills And Qualifications
The candidate will possess the following skills
    • Strong background in administering Microsoft Windows based software and databases
    • In-depth knowledge of one or more virtualization platforms (VMware, AWS, etc.)
    • Experience in troubleshooting network related issues including VoIP audio quality issues
    • Experience with the SIP and related telephony protocols as well as integrating to Telco providers
    • Process oriented and ability to clearly document operational procedures
    • Familiarity with coding and scripting languages aimed at automating and streamlining DevOps processes

Enghouse is an equal opportunity employer
Mercedes-Benz USA
  • Atlanta, GA

Job Overview

Mercedes-Benz USA is recruiting a Big Data Architect, a newly created position within the Information Technology Infrastructure Department. This position is responsible for refining and creating the next step in technology for our organization. In this role you will act as the contact person and agile enabler for all questions regarding new IT infrastructure and services in the context of Big Data solutions.

Responsibilities

  • Leverage sophisticated Big Data technologies into current and future business applications

  • Lead infrastructure projects for the implementation of new Big Data solutions

  • Design and implement modern, scalable data center architectures (on premise, hybrid or cloud) that meet the requirements of our business partners

  • Ensure the architecture is optimized for large dataset acquisition, analysis, storage, cleansing, transformation and reclamation

  • Create the requirements analysis, the platform selection and the design of the technical architecture

  • Develop IT infrastructure roadmaps and implement strategies around data science initiatives

  • Lead the research and evaluation of emerging technologies, industry and market trends to assist in project development and operational support activities

  • Work closely together with the application teams to exceed our business partners' expectations

Qualifications 

Education

Bachelor's Degree (accredited school) with emphasis in:

Computer/Information Science

Information Technology

Engineering

Management Information System (MIS)

Must have 5-7 years of experience in the following:

  • Architecture, design, implementation, operation and maintenance of Big Data solutions

  • Hands-on experience with major Big Data technologies and frameworks including Hadoop, MapReduce, Pig, Hive, HBase, Oozie, Mahout, Flume, ZooKeeper, MongoDB, and Cassandra.

  • Experience with Big Data solutions deployed in large cloud computing infrastructures such as AWS, GCE and Azure

  • Strong knowledge of programming and scripting languages such as Java, Linux, PHP, Ruby, Python

  • Big Data query tools such as Pig, Hive and Impala

  • Project Management Skills:

  • Ability to develop plans/projects from conceptualization to implementation

  • Ability to organize workflow and direct tasks as well as document milestones and ROIs and resolve problems

Proven experience with the following:

  • Open source software such as Hadoop and Red Hat

  • Shell scripting

  • Servers, storage, networking, and data archival/backup solutions

  • Industry knowledge and experience in areas such as Software Defined Networking (SDN), IT infrastructure and systems security, and cloud or network systems management

Additional Skills
Focus on problem resolution and troubleshooting
Knowledge of hardware capabilities and software interfaces and applications
Ability to produce quality digital assets/products
 
EEO Statement
Mercedes-Benz USA is committed to fostering an inclusive environment that appreciates and leverages the diversity of our team. We provide equal employment opportunity (EEO) to all qualified applicants and employees without regard to race, color, ethnicity, gender, age, national origin, religion, marital status, veteran status, physical or other disability, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local law.