OnlyDataJobs.com

Accenture
  • Atlanta, GA
Are you ready to step up to the New and take your technology expertise to the next level?
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day, and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a highly collaborative and growing network of technology and data experts who are taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. You will have an opportunity to work in roles such as Data Scientist, Data Engineer, or Chief Data Officer, covering all aspects of Data, including Data Management, Data Governance, Data Intelligence, Knowledge Graphs, and IoT. Come grow your career in Technology at Accenture!
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture, and make delivering innovative work part of your extraordinary career.
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward.
As part of our Advanced Technology & Architecture (AT&A) practice, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day, and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of highly collaborative technology experts taking on today's biggest, most complex business challenges. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in technology at Accenture!
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
    • Produce clean, standards-based, modern code with an emphasis on advocacy for end users, producing high-quality software designs that are well-documented.
    • Demonstrate an understanding of technology and digital frameworks in the context of data integration.
    • Ensure code and design quality through the execution of test plans and assist in development of standards, methodology and repeatable processes, working closely with internal and external design, business, and technical counterparts.
Big Data professionals develop deep, next-generation analytics skills to support Accenture's data and analytics agendas, including skills such as Data Modeling, Business Intelligence, and Data Management.
A professional at this position level within Accenture has the following responsibilities:
    • Utilizes existing methods and procedures to create designs within the proposed solution to solve business problems.
    • Understands the strategic direction set by senior management as it relates to team goals.
    • Contributes to design of solution, executes development of design, and seeks guidance on complex technical challenges where necessary.
    • Primary upward interaction is with direct supervisor.
    • May interact with peers, client counterparts and/or management levels within Accenture.
    • Understands methods and procedures on new assignments and executes deliverables with guidance as needed.
    • May interact with peers and/or management levels at a client and/or within Accenture.
    • Determines methods and procedures on new assignments with guidance.
    • Decisions often impact the team in which they reside.
    • Manages small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
Basic Qualifications
    • Minimum of 2 years of hands-on technical experience implementing or supporting Big Data solutions utilizing Hadoop.
    • Experience developing solutions utilizing at least two of the following:
      • Kafka-based streaming services
      • RStudio
      • Cassandra, MongoDB
      • MapReduce, Pig, Hive
      • Scala, Spark
      • Jenkins, Chef, Puppet
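For candidates new to the Hadoop ecosystem named above, the core idea behind MapReduce can be sketched in a few lines of plain Python. This is only an illustration of the programming model (a toy word count on hypothetical data), not Accenture's tooling; a real job would run on a Hadoop cluster:

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def shuffle(pairs):
    # Shuffle: group values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data big ideas", "data pipelines"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # {'big': 2, 'data': 2, 'ideas': 1, 'pipelines': 1}
```

Pig and Hive compile higher-level scripts and SQL down to jobs with this same map/shuffle/reduce shape.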
Preferred Qualifications
    • Hands-on technical experience utilizing Python
    • Full life cycle development experience
    • Experience with delivering Big Data Solutions in the cloud with AWS or Azure
    • Ability to configure and support API and open-source integrations
    • Experience administering Hadoop or other Data Science and Analytics platforms using the technologies above
    • Experience with DevOps support
    • Designing ingestion, low-latency, and visualization clusters to sustain data loads and meet business and IT expectations
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (e.g., H-1B visa, F-1 visa (OPT), TN visa, or any other non-immigrant status).
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is a Federal Contractor and an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women.
Accenture
  • Raleigh, NC
Are you ready to step up to the New and take your technology expertise to the next level?
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day, and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a highly collaborative and growing network of technology and data experts who are taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. You will have an opportunity to work in roles such as Data Scientist, Data Engineer, or Chief Data Officer, covering all aspects of Data, including Data Management, Data Governance, Data Intelligence, Knowledge Graphs, and IoT. Come grow your career in Technology at Accenture!
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture, and make delivering innovative work part of your extraordinary career.
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward.
As part of our Advanced Technology & Architecture (AT&A) practice, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day, and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of highly collaborative technology experts taking on today's biggest, most complex business challenges. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in technology at Accenture!
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
    • Produce clean, standards-based, modern code with an emphasis on advocacy for end users, producing high-quality software designs that are well-documented.
    • Demonstrate an understanding of technology and digital frameworks in the context of data integration.
    • Ensure code and design quality through the execution of test plans and assist in development of standards, methodology and repeatable processes, working closely with internal and external design, business, and technical counterparts.
Big Data professionals develop deep, next-generation analytics skills to support Accenture's data and analytics agendas, including skills such as Data Modeling, Business Intelligence, and Data Management.
A professional at this position level within Accenture has the following responsibilities:
    • Utilizes existing methods and procedures to create designs within the proposed solution to solve business problems.
    • Understands the strategic direction set by senior management as it relates to team goals.
    • Contributes to design of solution, executes development of design, and seeks guidance on complex technical challenges where necessary.
    • Primary upward interaction is with direct supervisor.
    • May interact with peers, client counterparts and/or management levels within Accenture.
    • Understands methods and procedures on new assignments and executes deliverables with guidance as needed.
    • May interact with peers and/or management levels at a client and/or within Accenture.
    • Determines methods and procedures on new assignments with guidance.
    • Decisions often impact the team in which they reside.
    • Manages small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
Basic Qualifications
    • Minimum of 2 years of hands-on technical experience implementing or supporting Big Data solutions utilizing Hadoop.
    • Experience developing solutions utilizing at least two of the following:
      • Kafka-based streaming services
      • RStudio
      • Cassandra, MongoDB
      • MapReduce, Pig, Hive
      • Scala, Spark
      • Jenkins, Chef, Puppet
Preferred Qualifications
    • Hands-on technical experience utilizing Python
    • Full life cycle development experience
    • Experience with delivering Big Data Solutions in the cloud with AWS or Azure
    • Ability to configure and support API and open-source integrations
    • Experience administering Hadoop or other Data Science and Analytics platforms using the technologies above
    • Experience with DevOps support
    • Designing ingestion, low-latency, and visualization clusters to sustain data loads and meet business and IT expectations
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (e.g., H-1B visa, F-1 visa (OPT), TN visa, or any other non-immigrant status).
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is a Federal Contractor and an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women.
UST Global
  • San Diego, CA

KEY SKILLSETS

- 7+ years of experience with Python

- 4+ years of experience with Java


General Responsibilities
- Selecting features, building and optimizing classifiers using machine learning techniques
- Data mining using state of the art methods
- Extending business data with third party sources of information when needed
- Enhancing data collection procedures to include information that is relevant for building analytic systems
- Processing, cleansing, and verifying the integrity of data used for analysis
- Doing ad hoc analysis and presenting results in a clear manner
- Creating automated anomaly detection systems and constantly tracking their performance
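As a rough illustration of the anomaly detection responsibility above, a minimal detector can flag points whose z-score exceeds a threshold. This is a toy sketch on made-up sensor readings, not a production system:

```python
import statistics

def zscore_anomalies(values, threshold=3.0):
    """Flag values whose z-score exceeds the threshold (toy example)."""
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    return [x for x in values if abs(x - mean) / stdev > threshold]

readings = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 42.0]  # one obvious outlier
print(zscore_anomalies(readings, threshold=2.0))  # [42.0]
```

Real systems would track the detector's own precision/recall over time, which is what "constantly tracking their performance" refers to.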
Skills and Qualifications
- Minimum 8 years of experience
- Hands-on experience in Python
- Excellent understanding of machine learning techniques and algorithms
- Experience with common data science toolkits, such as R, Weka, NumPy, MATLAB, etc.; excellence in at least one of these is highly desirable
- Great communication skills
- Experience with data visualization tools, such as ggplot2
- Proficiency in query languages such as SQL, Hive, and Pig
- Experience with NoSQL databases, such as MongoDB
- Good applied statistics skills, such as distributions, statistical testing, and regression
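As a sample of the applied statistics skills listed (regression in particular), here is the closed-form ordinary least squares fit for a single feature, in plain Python with hypothetical data:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = slope*x + intercept (one feature)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]          # exactly y = 2x + 1
slope, intercept = fit_line(xs, ys)
print(slope, intercept)    # 2.0 1.0
```

Toolkits like R, NumPy, or MATLAB provide the multivariate, numerically robust versions of this.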

UST Global
  • Atlanta, GA

KEY SKILLSETS

- 7+ years of experience with Python

- 4+ years of experience with Java


General Responsibilities
- Selecting features, building and optimizing classifiers using machine learning techniques
- Data mining using state of the art methods
- Extending business data with third party sources of information when needed
- Enhancing data collection procedures to include information that is relevant for building analytic systems
- Processing, cleansing, and verifying the integrity of data used for analysis
- Doing ad hoc analysis and presenting results in a clear manner
- Creating automated anomaly detection systems and constantly tracking their performance
Skills and Qualifications
- Minimum 8 years of experience
- Hands-on experience in Python
- Excellent understanding of machine learning techniques and algorithms
- Experience with common data science toolkits, such as R, Weka, NumPy, MATLAB, etc.; excellence in at least one of these is highly desirable
- Great communication skills
- Experience with data visualization tools, such as ggplot2
- Proficiency in query languages such as SQL, Hive, and Pig
- Experience with NoSQL databases, such as MongoDB
- Good applied statistics skills, such as distributions, statistical testing, and regression

GTN Technical Staffing and Consulting
  • Dallas, TX

This position requires a broad array of programming skills and experience as well as the desire to learn and grow in an entrepreneurial environment. You will be responsible for creating and developing client onboarding, product provisioning and real-time analytics software services. You will work closely with other members of the architecture team to make strategic decisions about product development, devops and future technology choices.

The ideal candidate should:

  • Demonstrate a proven track record of rapidly building, delivering, and maintaining complex software products.
  • Possess excellent communication skills.
  • Have high integrity.
  • Embrace learning and have a thirst for knowledge.
  • Enjoy solving problems and finding generic solutions to challenging situations.
  • Enjoy being a mentor for junior team members.
  • Be self-motivated to innovate and develop cutting-edge technology functionality.
  • Be able to rapidly learn new frameworks.
  • Be responsible for creating and implementing core product architecture.
  • Be comfortable developing frontend and backend solutions.

This position reports directly to the CTO.

Required experience:

7+ years hands-on experience

  • AWS (EC2, Lambda, ECS)
  • Docker/Kubernetes
  • 3+ years programming in Scala
  • 3+ years programming in Node.js
  • ES6/modern JavaScript
  • Microservices

Preferred experience:

  • Mongodb
  • SBT
  • Serverless Computing

The ideal candidate will:

  • Possess excellent communication and organization skills
  • Embrace learning and have a thirst for knowledge
  • Rapidly learn new technological paradigms
  • Understand and program in multiple programming languages
  • Enjoy being part of a growing team
  • Be a self-motivated team player

Benefits

  • Medical / Dental / Vision Insurance
  • 401(k)
  • Competitive compensation
  • Work with leaders in the industry
  • Opportunities to learn and grow every day
  • Play a meaningful role on a successful team
SoftClouds LLC
  • San Diego, CA

Job Overview: SoftClouds is looking for a Data Engineer to join our analytics platform team in designing and developing the next generation data and analytics solutions. The candidate should have deep technical skills as well as the ability to understand data and analytics, and an openness to working with disparate platforms, data sources and data formats.


Roles and Responsibilities:
  • Experience with MySQL, MS SQL Server, Hadoop, or MongoDB.
  • Writing SQL queries and table joins.
  • AWS, Python, or bash shell scripting.
  • Some experience pulling data from Hadoop.
  • Analyze data, systems, and data flows, and develop effective ways to store and present data in BI applications.
  • ETL experience a plus.
  • Work with data from disparate environments including Hadoop, MongoDB, Talend, and other SQL and NoSQL data stores.
  • Help develop the next-generation analytics platform.
  • Proactively ensure data integrity and focus on continuous performance improvement of existing processes.
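Since writing SQL queries and table joins is called out above, here is the general shape of a join-plus-aggregation, shown with Python's built-in sqlite3 and an invented two-table schema (the posting's actual data stores would be MySQL, MS SQL Server, Hadoop, or MongoDB):

```python
import sqlite3

# In-memory database standing in for a BI data store (illustrative schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0);
""")

# Join orders to customers and aggregate revenue per customer.
rows = conn.execute("""
    SELECT c.name, SUM(o.total)
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()
print(rows)  # [('Acme', 350.0), ('Globex', 75.0)]
```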


Required skills and experience:
  • 5 or more years of experience in software development
  • 3 years of experience writing data applications using Spark
  • Experience in Java and Python
  • Familiarity with Agile development methodology
  • Experience with Scala is a plus
  • Experience with NoSQL databases, e.g., Cassandra, is a plus
  • Expertise in Apache Spark and Hadoop
  • Expertise in machine learning algorithms


Education / Experience:

  • Bachelor's Degree in Engineering or Computer Science or related field required.
  • U.S. Citizens/GC/GC EAD are encouraged to apply. We are unable to sponsor at this time. NO C2C or third-party agencies.



Hulu
  • Santa Monica, CA

WHAT YOU’LL DO



  • Build robust and scalable micro-services

  • End-to-end ownership of backend services: ideate, review design, build, code-review, test, load-test, launch, monitor performance

  • Identify opportunities to optimize ad delivery algorithm – measure and monitor ad-break utilization for ad count and ad duration.

  • Work with product team to translate requirements into well-defined technical implementation

  • Define technical and operational KPIs to measure ad delivery health

  • Build Functional and Qualitative Test frameworks for ad server

  • Challenge our team and software to be even better


WHAT TO BRING



  • BS or MS in Computer Science/Engineering

  • 7+ years of relevant software engineering experience

  • Strong analytical skills

  • Strong programming (Java/C#/C++ or other related programming languages) and scripting skills

  • Great communication, collaboration skills and a strong teamwork ethic

  • Strive for excellence


NICE-TO-HAVES



  • Experience with non-relational database technologies (MongoDB, Cassandra, DynamoDB)

  • Experience with Redis and/or MemCache

  • Experience with Apache Kafka and/or Kinesis

  • AWS

  • Big Data technologies and data warehouses – Spark, Hadoop, Redshift

idealo internet GmbH
  • Berlin, Deutschland

We have made it our mission to develop the best price comparison and the most enjoyable shopping experience for customers on smartphones and tablets. The team needs energetic full-stack support to create a new product from scratch. Are topics like event streaming, GraphQL, or OpenShift exciting to you? Do you find elegant solutions to complex architectural problems? If so, we would like to get to know you.


Your tasks:



  • At idealo you belong to a cross-functional team of concept developers, designers and software engineers who develop products and software at the highest level.

  • We work in an agile team, and therefore you accompany the entire development process from an idea to a product that hundreds of thousands of users hold in their hands every day.

  • You work with modern technologies such as Kotlin, GraphQL, Apache Kafka, MongoDB, Kubernetes, Spring Boot and the Spring Cloud/Netflix OSS stack.

  • Design and implement scalable and resilient microservices and apps, working closely with other teams on common components

  • You continuously improve the code, the development process and contribute with your knowledge to the growth of the team.


What you bring with you:



  • You have solid experience as a software developer or software architect

  • You have sound knowledge of distributed systems design and of parallel data processing.

  • You have excellent knowledge of Java and the Spring Framework, and ideally of Kotlin as well.

  • Concepts such as Software Craftsmanship, Clean Code, and Continuous Deployment are top priorities for you.

  • With an agile mindset, you prefer getting to production quickly without losing sight of quality.

  • You have a well-founded opinion and you represent it - and respect it when others do the same.

  • You like to communicate and are open for new things

idealo internet GmbH
  • Berlin, Deutschland

We have made it our mission to develop the best price comparison and the most enjoyable shopping experience for customers on smartphones and tablets. The team needs energetic full-stack support to create a new product from scratch. Are topics like event streaming, GraphQL, or OpenShift exciting to you? Do you find elegant solutions to complex architectural problems? If so, we would like to get to know you.


Your tasks:



  • At idealo you belong to a cross-functional team of concept developers, designers and software engineers who develop products and software at the highest level.

  • We work in an agile team, and therefore you accompany the entire development process from an idea to a product that hundreds of thousands of users hold in their hands every day.

  • You work with modern technologies such as Kotlin, GraphQL, Apache Kafka, MongoDB, Kubernetes, Spring Boot and the Spring Cloud/Netflix OSS stack.

  • Design and implement scalable and resilient microservices and apps, working closely with other teams on common components

  • You continuously improve the code, the development process and contribute with your knowledge to the growth of the team.


Skills & Requirements:



  • You have solid experience as a software developer or software architect

  • You have sound knowledge of distributed systems design and of parallel data processing.

  • You have excellent knowledge of Java and the Spring Framework, and ideally of Kotlin as well.

  • Concepts such as Software Craftsmanship, Clean Code, and Continuous Deployment are top priorities for you.

  • With an agile mindset, you prefer getting to production quickly without losing sight of quality.

  • You have a well-founded opinion and you represent it - and respect it when others do the same.

  • You like to communicate and are open to new things

Hulu
  • Santa Monica, CA

WHAT YOU’LL DO



  • Build elegant systems that are robust and scalable

  • Challenge our team and software to be even better

  • Use a mix of technologies including Scala, Ruby, Python, and Angular JS


WHAT TO BRING



  • BS or MS in Computer Science/Engineering

  • 5+ years of relevant software engineering experience

  • Strong programming (Java/C#/C++ or other related programming languages) and scripting skills

  • Great communication, collaboration skills and a strong teamwork ethic

  • Strive for excellence


NICE-TO-HAVES



  • Experience with both statically typed languages and dynamic languages

  • Experience with relational (Oracle, MySQL) and non-relational database technologies (MongoDB, Cassandra, DynamoDB)

Beamery
  • London, UK
  • Salary: £45k - 75k

We’re building a new generation of AI-driven recruiting tech that changes the way people do their job. To make that happen, we work hard at understanding what our users need, and we build it using cutting-edge infrastructures and a diverse stack of languages and frameworks. Our customers are some of the most innovative and quality-driven companies in the world, and we can’t bring them a product that is less than brilliant. We’re always looking for engineers who have the same passion for quality and customer happiness.


At Beamery, you will constantly be learning and teaching others. You will have a sense of ownership over the product and will take pride in your work. The best practices of the team will be influenced by your voice, and there will always be space and time for you to experiment and bring new ideas to the table.    


As a Back End Developer on the Beamteam, you will be working with a team of experienced engineers to build the next generation back end architecture of our services.


The right engineer will:



  • Have strong NodeJS skills with ES6+ and TypeScript. Functional and Object Oriented programming.

  • Have worked with microservice architectures in the past, and have experience using pub/sub architectures and Apache Kafka.

  • Have in-depth understanding of MongoDB and ORM systems, and good knowledge of the ELK stack.

  • Have a good understanding of TDD/BDD and test automation suites.

  • Enjoy using a wide variety of tools, and will be happy to pick up and learn new things.

  • Enjoy our regular teach-in sessions delivered by industry experts.

  • Have excellent communication skills, both written and spoken.

  • Enjoy being a part of a collaborative team that is focused on building a product that will delight customers.

  • Not be biased toward a specific technology: finding the right tools for the job.


At Beamery you will:



  • Take part in regular collaborative teach-ins, and learn new skills.

  • Build beautiful, scalable products with wide market exposure.

  • Be a lead contributor in our projects, shaping the architecture of our back-end services.

  • Have the opportunity to mentor junior developers.

  • Work as part of an Agile team with support from Product and Design.

  • Be exposed to leadership training and experience opportunities.

  • Have access to in-house training courses as well as external conferences and workshops.

Perficient, Inc.
  • Dallas, TX

At Perficient, you'll deliver mission-critical technology and business solutions to Fortune 500 companies and some of the most recognized brands on the planet. And you'll do it with cutting-edge technologies, thanks to our close partnerships with the world's biggest vendors. Our network of offices across North America, as well as locations in India and China, will give you the opportunity to spread your wings, too.

We're proud to be publicly recognized as a Top Workplace year after year. This is due, in no small part, to our entrepreneurial attitude and collaborative spirit that sets us apart and keeps our colleagues impassioned, driven, and fulfilled.

About Our Data Governance Practice:


We provide exceptional data integration services in the ETL, Data Catalog, Data Quality, Data Warehouse, Master Data Management (MDM), Metadata Management & Governance space.

Perficient currently has a career opportunity for a Python Developer who resides in the vicinity of Jersey City, NJ or Dallas, TX.

Job Overview:

As a Python developer, you will participate in all aspects of the software development lifecycle, including estimating, technical design, implementation, documentation, testing, deployment, and support of applications developed for our clients. Working in a team environment, you will take direction from solution architects and leads on development activities.


Required skills:

  • 6+ years of experience in architecting, building and maintaining software platforms and large-scale data infrastructures in a commercial or open source environment
  • Excellent knowledge of Python
  • Good knowledge of, and hands-on experience working with, quant/data Python libraries (pandas, NumPy, etc.)
  • Good knowledge of, and hands-on experience with, designing APIs in Python (using Django, Flask, etc.)
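As a small taste of the quant/data library work mentioned above, here is a trailing moving average over a price series using NumPy (invented numbers, purely illustrative):

```python
import numpy as np

def moving_average(values, window):
    """Trailing moving average over a 1-D series via convolution."""
    sums = np.convolve(values, np.ones(window), mode="valid")
    return sums / window

prices = np.array([10.0, 11.0, 12.0, 13.0, 14.0])
print(moving_average(prices, 3))  # [11. 12. 13.]
```

In pandas the same computation would be a one-liner with `Series.rolling(window).mean()`.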

Nice to have skills (in the order of priority):

  • Comfortable and hands-on with AWS cloud services (S3, EC2, EMR, Lambda, Athena, QuickSight, etc.) and EMR tools (Hive, Zeppelin, etc.)
  • Experience building and optimizing big data pipelines, architectures, and data sets.
  • Hands-on experience with Hadoop MapReduce or other big data technologies and pipelines (Hadoop, Spark/PySpark, MapReduce, etc.)
  • Bash Scripting
  • Understanding of Machine Learning and Data Science processes and techniques
  • Experience in Java / Scala


Perficient full-time employees receive complete and competitive benefits. We offer a collaborative work environment, competitive compensation, generous work/life opportunities, and an outstanding benefits package that includes paid time off plus holidays. In addition, all colleagues are eligible for a number of rewards and recognition programs including billable bonus opportunities. Encouraging a healthy work/life balance and providing our colleagues with great benefits are just part of what makes Perficient a great place to work.

GTN Technical Staffing and Consulting
  • Dallas, TX

Senior Software Engineer - Scala
HIGHLIGHTS
Location: North Dallas, TX
Position Type: Direct Hire
Hourly / Salary: BOE
Residency Status: US Citizens and US Permanent Residents only, as sponsorship is not being offered at this time.

This position requires a broad array of programming skills and experience as well as the desire to learn and grow in an entrepreneurial environment. You will be responsible for creating and developing client onboarding, product provisioning and real-time analytics software services. You will work closely with other members of the architecture team to make strategic decisions about product development, devops and future technology choices.

The ideal candidate should:

  • Demonstrate a proven track record of rapidly building, delivering, and maintaining complex software products.
  • Possess excellent communication skills.
  • Have high integrity.
  • Embrace learning and have a thirst for knowledge.
  • The ideal candidate will possess a broad array of programming skills and experience, with a focus on writing scalable, high-performance microservices on a platform such as AWS lambda.

Be self-motivated to innovate and develop cutting edge technology functionality.

  • Be able to rapidly learn new frameworks.
  • Be responsible for creating and implementing core product architecture. Be comfortable developing frontend and backend solutions.
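The Lambda-style microservice work described above can be sketched as a minimal Python handler. The event schema and `customer_id` field are hypothetical (the role's stack is Scala and Node.js), so treat this as an illustration of the pattern only:

```python
import json

def handler(event, context):
    """Minimal AWS Lambda-style handler for a provisioning microservice.

    Expects an API Gateway proxy-style event whose body carries a JSON
    payload with a 'customer_id' field (hypothetical schema).
    """
    try:
        body = json.loads(event.get("body") or "{}")
        customer_id = body["customer_id"]
    except (json.JSONDecodeError, KeyError):
        return {"statusCode": 400,
                "body": json.dumps({"error": "customer_id required"})}

    # A real service would enqueue provisioning work here; we echo back.
    result = {"customer_id": customer_id, "status": "provisioning"}
    return {"statusCode": 200, "body": json.dumps(result)}
```

The same request/response shape carries over to Node.js handlers; only the serialization idioms change.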

Required experience:

7+ years of hands-on experience

  • AWS (EC2, Lambda, ECS)
  • Docker/Kubernetes
  • 3+ years programming in Scala
  • 3+ years programming in Node.js
  • ES6/modern JavaScript
  • Microservices

Preferred experience:

  • MongoDB
  • SBT
  • Serverless computing

The ideal candidate will:

  • Possess excellent communication and organization skills
  • Embrace learning and have a thirst for knowledge
  • Rapidly learn new technological paradigms
  • Understand and program in multiple programming languages
  • Enjoy being part of a growing team
  • Be a self-motivated team player

Benefits

  • Medical / Dental / Vision Insurance
  • 401(k)
  • Competitive compensation
  • Work with leaders in the industry
  • Opportunities to learn and grow every day
  • Play a meaningful role on a successful team
National Oilwell Varco
  • Houston, TX

NOV is seeking an experienced Data Modeler to assist in building and supporting RigSystems & Aftermarket's Data Warehouse. This resource will be responsible for organizing different types of data into structures that can be easily processed by various systems, and will also focus on a variety of issues, such as improving data migration from one system to another and eliminating data redundancy.

Duties and responsibilities include:

  • Understand and translate business needs into data models supporting long-term solutions.

  • Work with the Application Development team to implement data strategies, build data flows and develop conceptual data models.

  • Create logical and physical data models using best practices to ensure high data quality and reduced redundancy.

  • Optimize and update logical and physical data models to support new and existing projects.

  • Maintain conceptual, logical and physical data models along with corresponding metadata.

  • Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.

  • Recommend opportunities for reuse of data models in new environments.

  • Perform reverse engineering of physical data models from databases and SQL scripts.

  • Evaluate data models and physical databases for variances and discrepancies.

  • Validate business data objects for accuracy and completeness.

  • Analyze data-related system integration challenges and propose appropriate solutions.

  • Develop data models according to company standards.

  • Guide System Analysts, Engineers, Programmers and others on project limitations and capabilities, performance requirements and interfaces.

  • Review modifications to existing software to improve efficiency and performance.

  • Examine new application design and recommend corrections if required.
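As an illustration of the reverse-engineering duty above, the sketch below pulls a simple physical model (tables and typed columns) out of a live database. SQLite and the `rig` table are stand-ins chosen for self-containment; dedicated tools such as Erwin automate this against Oracle or SQL Server:

```python
import sqlite3

def reverse_engineer(conn):
    """Extract a physical data model (table -> columns) from a live database.

    Uses SQLite's catalog and PRAGMA interface; other RDBMSs expose the
    same information via information_schema or system views.
    """
    model = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
        model[table] = [(name, ctype, bool(pk))
                        for _, name, ctype, _, _, pk in cols]
    return model

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rig (rig_id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
print(reverse_engineer(conn))
```

The extracted dictionary is the raw material for a logical model; a modeler would then layer on relationships, naming conventions, and metadata.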

Qualifications / Requirements:

  • Bachelor's degree in Information Technology or Computer Science

  • 3+ years of experience as a Data Modeler/Data Architect

  • Proficient in the use of data modeling tools; Erwin proficiency is a must.

  • Experience in metadata management and data integration engines such as BizTalk or Informatica

  • Experience in supporting as well as implementing Oracle and SQL data infrastructures

  • Knowledge of the entire process behind software development including design and deployment (SOA knowledge and experience is a bonus)

  • Expert analytical and problem-solving traits

  • Knowledge of the design, development and maintenance of various data models and their components

  • Understand BI tools and technologies as well as the optimization of underlying databases

Mercedes-Benz USA
  • Atlanta, GA

Job Overview

Mercedes-Benz USA is recruiting a Big Data Architect, a newly created position within the Information Technology Infrastructure Department. This position is responsible for refining and creating the next step in technology for our organization. In this role, you will act as the contact person and agile enabler for all questions regarding new IT infrastructure and services in the context of Big Data solutions.

Responsibilities

  • Leverage sophisticated Big Data technologies into current and future business applications

  • Lead infrastructure projects for the implementation of new Big Data solutions

  • Design and implement modern, scalable data center architectures (on premise, hybrid or cloud) that meet the requirements of our business partners

  • Ensure the architecture is optimized for large dataset acquisition, analysis, storage, cleansing, transformation and reclamation

  • Create the requirements analysis, the platform selection and the design of the technical architecture

  • Develop IT infrastructure roadmaps and implement strategies around data science initiatives

  • Lead the research and evaluation of emerging technologies, industry and market trends to assist in project development and operational support activities

  • Work closely with the application teams to exceed our business partners' expectations

Qualifications 

Education

Bachelor's degree (accredited school) with emphasis in:

Computer/Information Science

Information Technology

Engineering

Management Information Systems (MIS)

Must have 5-7 years of experience in the following:

  • Architecture, design, implementation, operation and maintenance of Big Data solutions

  • Hands-on experience with major Big Data technologies and frameworks including Hadoop, MapReduce, Pig, Hive, HBase, Oozie, Mahout, Flume, ZooKeeper, MongoDB, and Cassandra.

  • Experience with Big Data solutions deployed in large cloud computing infrastructures such as AWS, GCE and Azure

  • Strong knowledge of programming and scripting languages such as Java, PHP, Ruby and Python, plus Linux shell scripting

  • Big Data query tools such as Pig, Hive and Impala

  • Project Management Skills:

  • Ability to develop plans/projects from conceptualization to implementation

  • Ability to organize workflow and direct tasks as well as document milestones and ROIs and resolve problems

Proven experience with the following:

  • Open source software such as Hadoop and Red Hat

  • Shell scripting

  • Servers, storage, networking, and data archival/backup solutions

  • Industry knowledge and experience in areas such as Software Defined Networking (SDN), IT infrastructure and systems security, and cloud or network systems management

Additional Skills
Focus on problem resolution and troubleshooting
Knowledge on hardware capabilities and software interfaces and applications
Ability to produce quality digital assets/products
 
EEO Statement
Mercedes-Benz USA is committed to fostering an inclusive environment that appreciates and leverages the diversity of our team. We provide equal employment opportunity (EEO) to all qualified applicants and employees without regard to race, color, ethnicity, gender, age, national origin, religion, marital status, veteran status, physical or other disability, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local law.

TomTom
  • Ghent, Belgium
At TomTom…

You’ll move the world forward. Every day, we create the most innovative mapping and location technologies to shape tomorrow’s mobility for the better.


We are proud to be one team of more than 5,000 unique, curious, passionate problem-solvers spread across the world. We bring out the best in each other. And together, we help the automotive industry, businesses, developers, drivers, citizens and cities move towards a safe, autonomous world that is free of congestion and emissions.

What you’ll do



  • You will work on incremental data transformation algorithms and deliver high quality software through agile development methodologies

  • You will dive into the pile of metadata we gathered on our processes and write tooling to gather insights

  • Use those insights to make recommendations to streamline the process

  • Quickly assess and apply the potential of emerging technologies

  • You will keep looking at the bigger picture for the problem at hand



What you’ll need



  • Your Data Mining techniques and tooling know-how are impressive. Same goes for your experience with cloud computing.

  • You have the skills and desire to build and improve visual simulations of complex data structures.

  • You love working as part of a self-organizing Scrum team in a scaled, agile environment.

  • You work with automated testing, CI techniques and tools, design patterns and clean code principles.

  • You have knowledge of Java and of streaming concepts.
     


What’s nice to have



  • You might have knowledge in one (or several) of these areas: Prototyping, Geospatial processing, Lambda and Kappa architectures, Unix-based environments, Hadoop, HBase or HDFS.
     


Meet your team

We’re Maps, a product unit within TomTom’s Location Technology Products technical unit. Our team is made up of over 2,000 people in 40 countries – all driven to deliver the most up-to-date, accurate and detailed maps for the hundreds of millions of people using TomTom maps around the world. Joining our team, you’ll help continuously innovate our map-making processes, create a real-time closed loop between detected changes in the real world and the users’ map, and build maps that will enable the future of autonomous driving.

Achieve more

We are self-starters who play well with others. Every day, we solve new problems with creativity, meet new people and learn rapidly at our offices around the world. We will invest in your growth and are committed to supporting you. In everything we do, we’re guided by six values: We care, putting our heart into what we do; we build trust (you can count on us); we create – driven to make a difference; we are confident, but don’t boast; we keep it simple, since life is complex enough; and we have fun because life’s too short to be boring. 

After you apply

Our recruitment team will work hard to give you a meaningful experience throughout the process, no matter the outcome. Your application will be screened closely and you can rest assured that all follow-up actions will be thorough, from assessments and interviews through your onboarding.

TomTom is an equal opportunity employer

We celebrate diversity, thrive on each other’s differences and are committed to creating an inclusive environment at our offices around the world. Naturally, we do not discriminate against any employee or job applicant because of race, religion, color, sexual orientation, gender, gender identity or expression, marital status, disability, national origin, genetics, or age.

Ready to move the world forward?
 

inovex GmbH
  • München, Germany

As a Linux Systems Engineer focusing on Hadoop and Search, you will be responsible for the design, installation and configuration of Linux-based big data clusters at our customers. Your responsibilities also include evaluating existing big data systems and extending existing environments in a future-proof way.

You look after these systems holistically, from the Linux operating system up to the big data stack. To automate the often complex big data clusters, you preferably use configuration management tools.

In our interdisciplinary project teams you play a formative role, and you often have the freedom to choose your own tools.


To fill this position, we are looking for experts who bring the following skills and qualities:



  • A successfully completed degree related to computer science, or a comparable qualification such as vocational training as an IT specialist (Fachinformatiker), plus relevant professional experience

  • Passion and enthusiasm for new technologies and topics around Linux and big data

  • Practical experience with Hadoop and common Hadoop ecosystem tools, as well as initial experience with Hadoop security

  • Ideally, you have already gained practical experience with one or more of the following technologies and products:

    • Flume, Kafka

    • Flink, Hive, Spark

    • Cassandra, Elasticsearch, HBase, MongoDB, CouchDB

    • Amazon EMR, Cloudera, Hortonworks, MapR

    • Java



  • Good knowledge of networking and storage

  • Knowledge of a configuration management tool (e.g. Puppet, Chef or Salt) is a plus

  • Strong communication skills and very good written and spoken German and English

  • High motivation to achieve excellent project results together with other “inovexperts”

  • Mobility and flexibility for project work on site at our customers

Siemens AG Österreich
  • Vienna, Austria
  • Salary: $45k - 65k

Your new role – challenging and future-oriented



  • Create and maintain optimal data pipeline architecture.

  • Assemble large, complex data sets that meet functional and non-functional use-case requirements

  • Build the infrastructure and procedures required for optimal Extraction, Transformation, and Loading (ETL) of data from a wide variety of data sources.



  • Build the data pipelines that support the data analytics aiming to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.

  • Work alongside Data Scientists to productionize advanced computer science algorithms based on statistical and machine learning models.



  • Work with stakeholders including the Executive, Sales, and Data Science teams to assist with data-related technical issues and support their data infrastructure needs.

  • Keep our data access compliant with GDPR regulations within the company boundaries and the AWS data center.

  • Design and implement data access restriction policies for the different stakeholders and company functions.

  • Identify, design, and implement optimal data processing technologies and tools to strive for greater data lab functionality.
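A minimal sketch of the ETL work described in the responsibilities above, assuming a hypothetical `readings` schema and a Wh-to-kWh normalization step (a production pipeline would run on Spark or a managed ETL service rather than in-process):

```python
import csv
import io
import sqlite3

def etl(raw_csv, conn):
    """Minimal ETL sketch: extract rows from CSV, transform (normalize
    units, drop invalid records), load into a relational table.
    """
    conn.execute("CREATE TABLE IF NOT EXISTS readings (sensor TEXT, kwh REAL)")
    loaded = 0
    for row in csv.DictReader(io.StringIO(raw_csv)):
        try:
            wh = float(row["wh"])               # extract + validate
        except (KeyError, ValueError):
            continue                            # skip malformed records
        conn.execute("INSERT INTO readings VALUES (?, ?)",
                     (row["sensor"], wh / 1000.0))  # transform Wh -> kWh
        loaded += 1
    conn.commit()
    return loaded

conn = sqlite3.connect(":memory:")
n = etl("sensor,wh\na,1500\nb,oops\nc,500\n", conn)
print(n)  # 2 valid rows loaded
```

The separation into extract, validate/transform, and load steps is what scales: each stage can be swapped for a distributed equivalent without changing the overall shape.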


Your profile    



  • A Graduate degree in Computer Science, Informatics, Information Systems or equivalent quantitative field.

  • Advanced working SQL knowledge and experience working with and designing relational databases, optimized query authoring, as well as working familiarity with a variety of databases.

  • Experience building and optimizing data pipelines, architectures and data sets.

  • Ability to build processes supporting data transformation and cleaning, data structures, metadata, dependency and workload management.

  • A successful history of manipulating, processing and extracting value from large disconnected datasets.

  • Ability to assist in documenting requirements, as well as resolve conflicts or ambiguities.

  • Strong analytic skills related to working with heterogeneous datasets.

  • Experience supporting and working with cross-functional teams in a dynamic environment.

  • Strong project management and organizational skills.

Carbon Lighthouse, Inc.
  • San Francisco, CA

Target Start Date: Ongoing


Reports To: Director of Software


Location: San Francisco, CA


Carbon Lighthouse is looking for an adaptable, self-motivated, full stack Senior Software Engineer to work and grow with us to envision and build out our software platform and accelerate environmental impact. Along the way, you’ll learn all about energy efficiency, solar, real estate markets, construction, and probably get your hands dirty (or at least dusty) on site visits.

We have taken the first step of transforming our data analysis and thermodynamics modeling toolset into a robust software platform, called CLUES®, built on the latest web technologies. Now it’s time to take our platform to the next level and convert it into the tool we use to fulfill our mission and have a global impact. Your role will initially concentrate on streamlining the flow of data from wireless sensors through CLUES, implementing optimization and machine learning to drive the automation of our analytics and modeling process, and looking for ways to scale CLUES to a platform that enables us to stop climate change.

Your immediate career path will concentrate on designing and building out both the frontend and backend of CLUES, with many growth opportunities for specialization and management as our company expands. You should have a bachelor’s degree in computer science (or similar), 5+ years of professional development experience, and be excited by complex and open-ended engineering problems. You genuinely enjoy working on collaborative teams, and want to be involved in large scale challenges that require long term effort. While this role involves significant software development work, you will also spend time interacting with Carbon Lighthouse mechanical engineers, project managers, and energy performance engineers to connect our software to our real-world mission.

The role is based at our headquarters in downtown San Francisco, but may require occasional travel to our satellite offices or client sites.

About Carbon Lighthouse: 
Carbon Lighthouse is on a mission to stop climate change by making it easy and profitable for building owners to eliminate carbon emissions caused by wasted energy. The company’s unique approach to Efficiency Production goes deep into buildings to uncover and continuously correct hidden inefficiencies that add up to meaningful financial value and carbon elimination that lasts. Since 2010, commercial real estate, educational, hospitality and industrial customers nationwide have chosen Carbon Lighthouse to enhance building comfort, increase net operating income and achieve their sustainability goals. Backed by notable investors, we are a team of 84 that highly values question asking, getting it done, integrity, and teamwork. We appreciate a fulfilling work-life balance, prize transparency and communication, hold ourselves to high standards of performance and professionalism, strive for dynamism and innovation, and support our team members’ professional development. Every person has both the opportunity and responsibility to make an impact on our growing organization.


Responsibilities:



  • Develop full stack web applications in an Agile environment to increase the speed and efficiency of Carbon Lighthouse process and grow our overall product offering

  • Participate in the full product development cycle, including brainstorming, architecting, release planning and estimation, implementing and iterating on code, coordinating with internal and external clients, internal code and design reviews, MVP and production releases, quality assurance, and product support.

  • Collaboratively work on simultaneous projects with multiple stakeholders, both on the software team and company-wide

  • Work in a team environment, expressing ideas and being open to those of others, to effectively drive cross-team solutions that have complex dependencies and requirements

  • Participate in, study, and improve the current Carbon Lighthouse process



Required Qualifications:



  • Dedication to Carbon Lighthouse's environmental mission

  • BS in Computer Science or similar technical field, or have demonstrated exceptional experience in technology environments

  • 5+ years of development experience

  • Proven track record of building production web applications using Python, or other comparable technology



Relevant Framework and Tool Experience

Proven experience with the majority of the following:



  • Server side web frameworks (e.g. Express, Flask, etc.)

  • Building user interfaces using JavaScript, HTML, CSS, and front end frameworks (e.g. Angular, React, etc.)

  • Developing applications backed by RDBMS or NoSQL data stores (e.g. MySQL, MongoDB, etc.)

  • Developing scalable, robust, and fault-tolerant REST services and microservices

  • Unit testing frameworks (e.g. Mocha, Chai, unittest, pytest, etc.)

  • Version control systems (e.g. Git, SVN, etc.)

  • Test driven development, continuous integration, and continuous deployment

  • AWS
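As a small illustration of the unit-testing and test-driven practices listed above, here is a stdlib `unittest` example; the `slugify` helper is hypothetical, standing in for any small unit of application code:

```python
import unittest

def slugify(title):
    """Hypothetical helper under test: normalize a report title to a URL slug."""
    return "-".join(title.lower().split())

class SlugifyTest(unittest.TestCase):
    # In a test-driven workflow these assertions are written before the
    # implementation, then the code is grown until they pass.
    def test_basic(self):
        self.assertEqual(slugify("Site Visit Report"), "site-visit-report")

    def test_already_normalized(self):
        self.assertEqual(slugify("site-visit-report"), "site-visit-report")

# Run the suite programmatically; CI would invoke a runner such as pytest.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(SlugifyTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The same structure maps directly onto Mocha/Chai on the JavaScript side of the stack: one suite per unit, one assertion-focused test per behavior.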



Bonus Qualifications:



  • Data science background

  • HVAC systems background

  • HVAC controls software/hardware background

  • UX/UI design experience

  • Startup experience



Compensation and Benefits:



  • Salary + equity

  • Medical, dental, vision, and disability insurance

  • Generous vacation policy

  • Fully paid maternity and paternity leave benefits

  • Subsidized public transit and bike to work benefits

  • 401(k)



Carbon Lighthouse is an equal employment opportunity employer and considers qualified applicants without regard to gender, sexual orientation, gender identity, race, veteran or disability status. 

If you’re excited about our environmental mission and this looks like a fit, you should apply! Please fill out the brief application form and be prepared to submit three references at a later date.

Giesecke+Devrient Currency Technology GmbH
  • München, Germany

Help shape a future that moves. For our Currency Management Solutions division, we are looking for you as a



Big Data Engineer (m/f/d)



Your responsibilities:




  • You are responsible for the design, implementation and operation of effective data processing architectures within our innovative microservices- and cloud-based data analytics, IIoT and digital solutions

  • Ingestion, integration, organization, batch and stream processing, and lifecycle management of data

  • Ensuring data quality, integrity and privacy

  • Setup, monitoring and tuning of the Hadoop clusters and databases (“you build it, you run it”)

  • Close collaboration in agile teams with data scientists, development teams and product owners on data modeling, data analysis and technology consulting



Your profile:




  • Degree (Master's, from a university or university of applied sciences) in computer science or a comparable field

  • Very good knowledge of data ingestion/integration (Flume, Sqoop, NiFi), data storage (PostgreSQL, MongoDB), distributed storage (Hadoop, Cloudera), messaging (Kafka, MQTT), data processing (Spark, Scala) and scheduling (Oozie, Pig)

  • Practical experience in developing and operating large-scale data processing pipelines in scalable microservice/REST architectures

  • Experience with cloud environments such as Microsoft Azure or AWS is desirable

  • Very good written and spoken German and English






We look forward to receiving your online application at www.gi-de.com/karriere.




Giesecke+Devrient Currency Technology GmbH · Prinzregentenstraße 159 · 81677 München