OnlyDataJobs.com

Accenture
  • Atlanta, GA
Are you ready to step up to the New and take your technology expertise to the next level?
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day, and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a highly collaborative and growing network of technology and data experts who are taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. You will have an opportunity to work in roles such as Data Scientist, Data Engineer, or Chief Data Officer, covering all aspects of Data including Data Management, Data Governance, Data Intelligence, Knowledge Graphs, and IoT. Come grow your career in Technology at Accenture!
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture, and make delivering innovative work part of your extraordinary career.
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward.
As part of our Advanced Technology & Architecture (AT&A) practice, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day, and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of highly collaborative technology experts who are taking on today's biggest, most complex business challenges. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in technology at Accenture!
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
    • Produce clean, standards-based, modern code with an emphasis on advocacy toward end users, resulting in high-quality software designs that are well documented.
    • Demonstrate an understanding of technology and digital frameworks in the context of data integration.
    • Ensure code and design quality through the execution of test plans and assist in development of standards, methodology and repeatable processes, working closely with internal and external design, business, and technical counterparts.
Big Data professionals develop deep next generation Analytics skills to support Accenture's data and analytics agendas, including skills such as Data Modeling, Business Intelligence, and Data Management.
A professional at this position level within Accenture has the following responsibilities:
    • Utilizes existing methods and procedures to create designs within the proposed solution to solve business problems.
    • Understands the strategic direction set by senior management as it relates to team goals.
    • Contributes to design of solution, executes development of design, and seeks guidance on complex technical challenges where necessary.
    • Primary upward interaction is with direct supervisor.
    • May interact with peers, client counterparts and/or management levels within Accenture.
    • Understands methods and procedures on new assignments and executes deliverables with guidance as needed.
    • May interact with peers and/or management levels at a client and/or within Accenture.
    • Determines methods and procedures on new assignments with guidance.
    • Decisions often impact the team in which they reside.
    • Manages small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
Basic Qualifications
    • Minimum of 2+ years of hands-on technical experience implementing or supporting Big Data solutions utilizing Hadoop.
    • Experience developing solutions utilizing at least two of the following:
      • Kafka-based streaming services
      • RStudio
      • Cassandra, MongoDB
      • MapReduce, Pig, Hive
      • Scala, Spark
      • Knowledge of Jenkins, Chef, Puppet
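
The Kafka and Spark items above often come together in practice. As a hedged illustration in Python (PySpark) rather than Scala, the sketch below reads a Kafka topic with Spark Structured Streaming and maintains a running count per key; the broker address, topic name, and checkpoint path are assumptions, and the spark-sql-kafka connector is assumed to be available at submit time.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Requires the spark-sql-kafka connector package at spark-submit time.
spark = (SparkSession.builder
         .appName("kafka-streaming-sketch")
         .getOrCreate())

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")   # assumed broker address
          .option("subscribe", "events")                       # assumed topic name
          .load())

# Running count of messages per key, written to the console sink for illustration.
counts = (events
          .select(F.col("key").cast("string").alias("key"))
          .groupBy("key")
          .count())

query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .option("checkpointLocation", "/tmp/checkpoints")     # assumed path
         .start())

query.awaitTermination()
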
Preferred Qualifications
    • Hands-on technical experience utilizing Python
    • Full life cycle development experience
    • Experience with delivering Big Data Solutions in the cloud with AWS or Azure
    • Ability to configure and support API and OpenSource integrations
    • Experience administering Hadoop or other Data Science and Analytics platforms using the technologies above
    • Experience with DevOps support
    • Experience designing ingestion, low-latency, and visualization clusters to sustain data loads and meet business and IT expectations
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H-1B visa, F-1 visa (OPT), TN visa or any other non-immigrant status).
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is a Federal Contractor and an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women.
Accenture
  • Raleigh, NC
Are you ready to step up to the New and take your technology expertise to the next level?
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day, and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a highly collaborative and growing network of technology and data experts who are taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. You will have an opportunity to work in roles such as Data Scientist, Data Engineer, or Chief Data Officer, covering all aspects of Data including Data Management, Data Governance, Data Intelligence, Knowledge Graphs, and IoT. Come grow your career in Technology at Accenture!
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture, and make delivering innovative work part of your extraordinary career.
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward.
As part of our Advanced Technology & Architecture (AT&A) practice, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day, and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of highly collaborative technology experts who are taking on today's biggest, most complex business challenges. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in technology at Accenture!
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
    • Produce clean, standards-based, modern code with an emphasis on advocacy toward end users, resulting in high-quality software designs that are well documented.
    • Demonstrate an understanding of technology and digital frameworks in the context of data integration.
    • Ensure code and design quality through the execution of test plans and assist in development of standards, methodology and repeatable processes, working closely with internal and external design, business, and technical counterparts.
Big Data professionals develop deep next generation Analytics skills to support Accenture's data and analytics agendas, including skills such as Data Modeling, Business Intelligence, and Data Management.
A professional at this position level within Accenture has the following responsibilities:
    • Utilizes existing methods and procedures to create designs within the proposed solution to solve business problems.
    • Understands the strategic direction set by senior management as it relates to team goals.
    • Contributes to design of solution, executes development of design, and seeks guidance on complex technical challenges where necessary.
    • Primary upward interaction is with direct supervisor.
    • May interact with peers, client counterparts and/or management levels within Accenture.
    • Understands methods and procedures on new assignments and executes deliverables with guidance as needed.
    • May interact with peers and/or management levels at a client and/or within Accenture.
    • Determines methods and procedures on new assignments with guidance.
    • Decisions often impact the team in which they reside.
    • Manages small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
Basic Qualifications
    • Minimum of 2+ years of hands-on technical experience implementing or supporting Big Data solutions utilizing Hadoop.
    • Experience developing solutions utilizing at least two of the following:
      • Kafka-based streaming services
      • RStudio
      • Cassandra, MongoDB
      • MapReduce, Pig, Hive
      • Scala, Spark
      • Knowledge of Jenkins, Chef, Puppet
Preferred Qualifications
    • Hands-on technical experience utilizing Python
    • Full life cycle development experience
    • Experience with delivering Big Data Solutions in the cloud with AWS or Azure
    • Ability to configure and support API and OpenSource integrations
    • Experience administering Hadoop or other Data Science and Analytics platforms using the technologies above
    • Experience with DevOps support
    • Experience designing ingestion, low-latency, and visualization clusters to sustain data loads and meet business and IT expectations
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H-1B visa, F-1 visa (OPT), TN visa or any other non-immigrant status).
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is a Federal Contractor and an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women.
DISYS
  • Minneapolis, MN
Client: Banking/Financial Services
Location: 100% Remote
Duration: 12 month contract-to-hire
Position Title: NLU/NLP Predictive Modeling Consultant


***Client requirements will not allow OPT/CPT candidates for this position, or any other visa type requiring sponsorship. 

This is a new team within the organization set up specifically to perform analyses and gain insights into the "voice of the customer" through the following activities:
Review inbound customer emails, phone calls, survey results, etc.
Review unstructured "natural language" text and speech data
Maintain focus on customer complaint identification and routing
Build machine learning models to scan customer communications (emails, voice, etc.)
Distinguish complaints from non-complaints
Classify complaints into categories
Identify escalated/high-risk complaints, e.g., claims of bias, discrimination, bait-and-switch, lying, etc.
Ensure complaints are routed to the appropriate EO for special handling

Responsible for:
Focused on inbound retail (home mortgage/equity) emails
Email cleansing: removal of extraneous information (disclaimers, signatures, headers, PII)
Modeling: training models using state-of-the-art techniques
Scoring: "productionalizing" models to be consumed by the business
Governance: model documentation and Q/A with model risk group.
Implementation of model monitoring processes
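
The cleansing, modeling, and scoring steps above describe a fairly standard supervised text-classification workflow. A minimal Python sketch follows, assuming scikit-learn; the toy emails, labels, and the patterns used to strip disclaimers and signatures are illustrative assumptions rather than details of the client's data.

# Minimal sketch of the cleanse -> train -> score workflow for complaint detection.
# Sample data, labels, and cleansing patterns are illustrative assumptions.
import re
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

def cleanse(email_body: str) -> str:
    """Strip a (hypothetical) disclaimer block and anything after a '--' signature marker."""
    body = re.split(r"(?i)this e-?mail and any attachments", email_body)[0]
    body = body.split("\n--")[0]   # drop signature block, if present
    return body.strip()

# Toy training data: 1 = complaint, 0 = non-complaint.
emails = [
    "I was promised a lower rate and then charged more. This is bait and switch.",
    "Can you resend the payoff statement for my mortgage?",
    "Nobody has returned my calls about the escrow error for three weeks.",
    "Please update my mailing address on the home equity account.",
]
labels = [1, 0, 1, 0]

model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=1)),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit([cleanse(e) for e in emails], labels)

# "Scoring": probability that a new inbound email is a complaint.
new_email = "You told me there were no closing costs and then added fees. I want this escalated."
print(model.predict_proba([cleanse(new_email)])[0][1])
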

Desired Qualifications:
Real-world experience building/deploying predictive models, any industry (must)
SQL background (must)
Self-starter, able to excel in a fast-paced environment without much direction (must)
Good communication skills (must)
Experience in text/speech analytics (preferred)
Python, SAS background (preferred)
Linux (nice to have)
Spark (Scala or PySpark) (nice to have)

UST Global
  • San Diego, CA

KEY SKILLSETS

- 7+ years of experience with Python

- 4+ years of experience with Java


General Responsibilities
- Selecting features, building and optimizing classifiers using machine learning techniques
- Data mining using state of the art methods
- Extending business data with third party sources of information when needed
- Enhancing data collection procedures to include information that is relevant for building analytic systems
- Processing, cleansing, and verifying the integrity of data used for analysis
- Doing ad hoc analysis and presenting results in a clear manner
- Creating automated anomaly detection systems and constantly tracking their performance
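
As a small illustration of the anomaly-detection responsibility above, the sketch below fits scikit-learn's IsolationForest to a toy feature matrix and flags outliers; the synthetic data and contamination rate are assumptions for the example.

# Minimal sketch: automated anomaly detection with scikit-learn's IsolationForest.
# The feature matrix and contamination rate are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
normal = rng.normal(loc=0.0, scale=1.0, size=(200, 3))     # typical observations
outliers = rng.normal(loc=6.0, scale=1.0, size=(5, 3))     # injected anomalies
X = np.vstack([normal, outliers])

detector = IsolationForest(contamination=0.03, random_state=0)
detector.fit(X)

scores = detector.decision_function(X)   # lower = more anomalous
flags = detector.predict(X)              # -1 = anomaly, 1 = normal
print(f"flagged {np.sum(flags == -1)} of {len(X)} rows as anomalous")
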
Skills and Qualifications
- Min 8 yrs of experience
- Hands on experience in Python
- Excellent understanding of machine learning techniques and algorithms.
- Experience with common data science toolkits, such as R, Weka, NumPy, MATLAB, etc. Excellence in at least one of these is highly desirable
- Great communication skills
- Experience with data visualization tools, such as ggplot, etc.
- Proficiency in using query languages such as SQL, Hive, Pig
- Experience with NoSQL databases, such as MongoDB
- Good applied statistics skills, such as distributions, statistical testing, regression, etc.

UST Global
  • Atlanta, GA

KEY SKILLSETS

- 7+ years of experience with Python

- 4+ years of experience with Java


General Responsibilities
- Selecting features, building and optimizing classifiers using machine learning techniques
- Data mining using state of the art methods
- Extending business data with third party sources of information when needed
- Enhancing data collection procedures to include information that is relevant for building analytic systems
- Processing, cleansing, and verifying the integrity of data used for analysis
- Doing ad hoc analysis and presenting results in a clear manner
- Creating automated anomaly detection systems and constantly tracking their performance
Skills and Qualifications
- Min 8 yrs of experience
- Hands on experience in Python
- Excellent understanding of machine learning techniques and algorithms.
- Experience with common data science toolkits, such as R, Weka, NumPy, MATLAB, etc. Excellence in at least one of these is highly desirable
- Great communication skills
- Experience with data visualization tools, such as ggplot, etc.
- Proficiency in using query languages such as SQL, Hive, Pig
- Experience with NoSQL databases, such as MongoDB
- Good applied statistics skills, such as distributions, statistical testing, regression, etc.

GTN Technical Staffing and Consulting
  • Dallas, TX

This position requires a broad array of programming skills and experience as well as the desire to learn and grow in an entrepreneurial environment. You will be responsible for creating and developing client onboarding, product provisioning and real-time analytics software services. You will work closely with other members of the architecture team to make strategic decisions about product development, devops and future technology choices.

The ideal candidate should:

  • Demonstrate a proven track record of rapidly building, delivering, and maintaining complex software products.
  • Possess excellent communication skills.
  • Have high integrity.
  • Embrace learning and have a thirst for knowledge.

  • Enjoy solving problems and finding generic solutions to challenging situations.

  • Enjoy being a mentor for junior team members.

  • Be self-motivated to innovate and develop cutting-edge technology functionality.

  • Be able to rapidly learn new frameworks.
  • Be responsible for creating and implementing core product architecture. Be comfortable developing frontend and backend solutions.

This position reports directly to the CTO.

Required experience:

7+ years of hands-on experience

  • AWS (EC2, Lambda, ECS)
  • Docker/Kubernetes
  • 3+ years programming in Scala
  • 3+ years programming in Node.js
  • ES6/modern JavaScript
  • Microservices

Preferred experience:

  • MongoDB
  • SBT
  • Serverless Computing

The ideal candidate will:

  • Possess excellent communication and organization skills
  • Embrace learning and have a thirst for knowledge
  • Rapidly learn new technological paradigms
  • Understand and program in multiple programming languages
  • Enjoy being part of a growing team
  • Be a self-motivated team player

Benefits

  • Medical / Dental / Vision Insurance
  • 401(k)
  • Competitive compensation
  • Work with leaders in the industry
  • Opportunities to learn and grow every day
  • Play a meaningful role on a successful team
SoftClouds LLC
  • San Diego, CA

Job Overview: SoftClouds is looking for a Data Engineer to join our analytics platform team in designing and developing the next generation data and analytics solutions. The candidate should have deep technical skills as well as the ability to understand data and analytics, and an openness to working with disparate platforms, data sources and data formats.


Roles and Responsibilities:
  • Experience with MySQL, MS SQL Server, Hadoop, or MongoDB.
  • Writing SQL queries and table joins.
  • AWS, Python, or Bash shell scripting
  • Have some experience pulling data from Hadoop.
  • Analyze data, systems, and data flows and develop effective ways to store and present data in BI applications
  • ETL experience a plus.
  • Work with data from disparate environments including Hadoop, MongoDB, Talend, and other SQL and NoSQL data stores
  • Help develop the next generation analytics platform
  • Proactively ensure data integrity and focus on continuous performance improvements of existing processes.
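
Much of the work above reduces to joining and aggregating tables with Spark SQL. A minimal PySpark sketch follows, assuming Hive support is configured and that tables named orders and customers exist; those table and column names are illustrative assumptions.

# Minimal sketch: read two Hive-backed tables with PySpark and join them for a BI-style summary.
# Table and column names (orders, customers, customer_id, amount, region) are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("bi-join-sketch")
         .enableHiveSupport()     # assumes a Hive metastore is configured
         .getOrCreate())

orders = spark.table("orders")        # assumed Hive table
customers = spark.table("customers")  # assumed Hive table

summary = (orders
           .join(customers, on="customer_id", how="inner")
           .groupBy("region")
           .agg(F.count("*").alias("order_count"),
                F.sum("amount").alias("total_amount")))

summary.show()
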


Required skills and experience:
  • 5 or more years of experience in software development
  • 3 years of experience writing data applications using Spark
  • Experience in Java and Python
  • Familiarity with Agile development methodology
  • Experience with Scala is a plus
  • Experience with NoSQL databases (e.g., Cassandra) is a plus
  • Expertise in Apache Spark & Hadoop.
  • Expertise in machine learning algorithms


Education / Experience:

  • Bachelor's Degree in Engineering or Computer Science or related field required.
  • U.S. Citizens/GC/GC EAD are encouraged to apply. We are unable to sponsor at this time. NO C2C or third-party agencies.



Hulu
  • Santa Monica, CA

WHAT YOU’LL DO



  • Build robust and scalable micro-services

  • End to end ownership of backend services: Ideate, review design, build, code-review, test, load-test, launch, monitor performance

  • Identify opportunities to optimize the ad delivery algorithm – measure and monitor ad-break utilization for ad count and ad duration.

  • Work with product team to translate requirements into well-defined technical implementation

  • Define technical and operational KPIs to measure ad delivery health

  • Build Functional and Qualitative Test frameworks for ad server

  • Challenge our team and software to be even better


WHAT TO BRING



  • BS or MS in Computer Science/Engineering

  • 7+ years of relevant software engineering experience

  • Strong analytical skills

  • Strong programming (Java/C#/C++ or other related programming languages) and scripting skills

  • Great communication, collaboration skills and a strong teamwork ethic

  • Strive for excellence


NICE-TO-HAVES



  • Experience with non-relational database technologies (MongoDB, Cassandra, DynamoDB)

  • Experience with Redis and/or MemCache

  • Experience with Apache Kafka and/or Kinesis

  • AWS

  • Big Data technologies and data warehouses – Spark, Hadoop, Redshift

idealo internet GmbH
  • Berlin, Germany

We have made it our mission to develop the best price comparison and the most enjoyable shopping experience for customers on smartphones and tablets. The team needs energetic full-stack support to create a new product from scratch. Are topics like event streaming, GraphQL, or OpenShift exciting to you? Do you find elegant solutions for complex architectural problems? If so, then we would like to get to know you.


Your tasks:



  • At idealo you belong to a cross-functional team of concept developers, designers and software engineers who develop products and software at the highest level.

  • We work in an agile team, and therefore you accompany the entire development process from an idea to a product that hundreds of thousands of users hold in their hands every day.

  • You work with modern technologies such as Kotlin, GraphQL, Apache Kafka, MongoDB, Kubernetes, Spring Boot and the Spring Cloud/Netflix OSS stack.

  • Design and implement scalable and resilient microservices and apps, working closely with other teams on common components

  • You continuously improve the code, the development process and contribute with your knowledge to the growth of the team.


What you bring with you:



  • You have solid experience as a software developer or software architect

  • You have sound knowledge in the design of distributed systems and in the conception of parallel data processing.

  • You have excellent knowledge in Java and the Spring Framework and ideally already in Kotlin.

  • Concepts such as Software Craftsmanship, Clean Code, and Continuous Deployment are a top priority for you

  • With an agile mindset, you prefer getting to production quickly without losing sight of quality.

  • You have well-founded opinions and stand by them - and respect it when others do the same.

  • You like to communicate and are open to new things

idealo internet GmbH
  • Berlin, Germany

We have made it our mission to develop the best price comparison and the most enjoyable shopping experience for customers on smartphones and tablets. The team needs energetic full-stack support to create a new product from scratch. Are topics like event streaming, GraphQL, or OpenShift exciting to you? Do you find elegant solutions for complex architectural problems? If so, then we would like to get to know you.


Your tasks:



  • At idealo you belong to a cross-functional team of concept developers, designers and software engineers who develop products and software at the highest level.

  • We work in an agile team, and therefore you accompany the entire development process from an idea to a product that hundreds of thousands of users hold in their hands every day.

  • You work with modern technologies such as Kotlin, GraphQL, Apache Kafka, MongoDB, Kubernetes, Spring Boot and the Spring Cloud/Netflix OSS stack.

  • Design and implement scalable and resilient microservices and apps, working closely with other teams on common components

  • You continuously improve the code, the development process and contribute with your knowledge to the growth of the team.


Skills & Requirements:



  • You have solid experience as a software developer or software architect

  • You have sound knowledge in the design of distributed systems and in the conception of parallel data processing.

  • You have excellent knowledge in Java and the Spring Framework and ideally already in Kotlin.

  • Concepts such as Software Craftsmanship, Clean Code, and Continuous Deployment are a top priority for you

  • With an agile mind-set, you prefer getting to production quickly without losing sight of quality.

  • You have well-founded opinions and stand by them - and respect it when others do the same.

  • You like to communicate and are open to new things

Hulu
  • Santa Monica, CA

WHAT YOU’LL DO



  • Build elegant systems that are robust and scalable

  • Challenge our team and software to be even better

  • Use a mix of technologies including Scala, Ruby, Python, and Angular JS


WHAT TO BRING



  • BS or MS in Computer Science/Engineering

  • 5+ years of relevant software engineering experience

  • Strong programming (Java/C#/C++ or other related programming languages) and scripting skills

  • Great communication, collaboration skills and a strong teamwork ethic

  • Strive for excellence


NICE-TO-HAVES



  • Experience with both statically typed languages and dynamic languages

  • Experience with relational (Oracle, MySQL) and non-relational database technologies (MongoDB, Cassandra, DynamoDB)

Beamery
  • London, UK
  • Salary: £45k - 75k

We’re building a new generation of AI-driven recruiting tech that changes the way people do their job. To make that happen, we work hard at understanding what our users need, and we build it using cutting-edge infrastructures and a diverse stack of languages and frameworks. Our customers are some of the most innovative and quality-driven companies in the world, and we can’t bring them a product that is less than brilliant. We’re always looking for engineers who have the same passion for quality and customer happiness.


At Beamery, you will constantly be learning and teaching others. You will have a sense of ownership over the product and will take pride in your work. The best practices of the team will be influenced by your voice, and there will always be space and time for you to experiment and bring new ideas to the table.    


As a Back End Developer on the Beamteam, you will be working with a team of experienced engineers to build the next generation back end architecture of our services.


The right engineer will:



  • Have strong NodeJS skills with ES6+ and TypeScript. Functional and Object Oriented programming.

  • Have worked with microservice architectures in the past, and have experience using pub/sub architectures and Apache Kafka.

  • Have in-depth understanding of MongoDB and ORM systems, and good knowledge of the ELK stack.

  • Have a good understanding of TDD/BDD and test automation suites.

  • Enjoy using a wide variety of tools, and will be happy to pick up and learn new things.

  • Enjoy our regular teach-in sessions delivered by industry experts.

  • Have excellent communication skills, both written and spoken.

  • Enjoy being a part of a collaborative team that is focused on building a product that will delight customers.

  • Not be biased toward a specific technology: finding the right tools for the job.


At Beamery you will:



  • Take part in regular collaborative teach-ins, and learn new skills.

  • Build beautiful, scalable products with wide market exposure.

  • Be a lead contributor in our projects, shaping the architecture of our back-end services.

  • Have the opportunity to mentor junior developers.

  • Work as part of an Agile team with support from Product and Design.

  • Be exposed to leadership training and experience opportunities.

  • Have access to in-house training courses as well as external conferences and workshops.

Perficient, Inc.
  • Dallas, TX

At Perficient, you'll deliver mission-critical technology and business solutions to Fortune 500 companies and some of the most recognized brands on the planet. And you'll do it with cutting-edge technologies, thanks to our close partnerships with the world's biggest vendors. Our network of offices across North America, as well as locations in India and China, will give you the opportunity to spread your wings, too.

We're proud to be publicly recognized as a Top Workplace year after year. This is due, in no small part, to our entrepreneurial attitude and collaborative spirit that sets us apart and keeps our colleagues impassioned, driven, and fulfilled.

About Our Data Governance Practice:


We provide exceptional data integration services in the ETL, Data Catalog, Data Quality, Data Warehouse, Master Data Management (MDM), Metadata Management & Governance space.

Perficient currently has a career opportunity for a Python Developer who resides in the vicinity of Jersey City, NJ or Dallas, TX.

Job Overview:

As a Python developer, you will participate in all aspects of the software development lifecycle, which includes estimating, technical design, implementation, documentation, testing, deployment, and support of applications developed for our clients. As a member working in a team environment, you will take direction from solution architects and leads on development activities.


Required skills:

  • 6+ years of experience in architecting, building and maintaining software platforms and large-scale data infrastructures in a commercial or open source environment
  • Excellent knowledge of Python
  • Good knowledge of and hands-on experience working with quant/data Python libraries (pandas/NumPy, etc.)
  • Good knowledge of and hands-on experience designing APIs in Python (using Django/Flask, etc.)
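
As a hedged illustration of the pandas-plus-API combination listed above, the sketch below exposes a small pandas aggregation through a Flask endpoint; the route, columns, and sample data are assumptions for the example rather than anything project-specific.

# Minimal sketch: a Flask API serving a pandas aggregation.
# Route name, columns, and sample data are illustrative assumptions.
import pandas as pd
from flask import Flask, jsonify

app = Flask(__name__)

# In a real project this would come from a database or data lake.
trades = pd.DataFrame({
    "desk":     ["rates", "rates", "credit", "credit"],
    "notional": [1_000_000, 250_000, 500_000, 750_000],
})

@app.route("/api/notional-by-desk")
def notional_by_desk():
    summary = trades.groupby("desk")["notional"].sum()
    return jsonify(summary.to_dict())

if __name__ == "__main__":
    app.run(debug=True)  # e.g. GET http://127.0.0.1:5000/api/notional-by-desk
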

Nice to have skills (in the order of priority):

  • Comfortable with and hands-on experience using the AWS cloud (S3, EC2, EMR, Lambda, Athena, QuickSight, etc.) and EMR tools (Hive, Zeppelin, etc.)
  • Experience building and optimizing big data pipelines, architectures, and data sets.
  • Hands-on experience with Hadoop MapReduce or other big data technologies and pipelines (Hadoop, Spark/PySpark, MapReduce, etc.)
  • Bash Scripting
  • Understanding of Machine Learning and Data Science processes and techniques
  • Experience in Java / Scala


Perficient full-time employees receive complete and competitive benefits. We offer a collaborative work environment, competitive compensation, generous work/life opportunities, and an outstanding benefits package that includes paid time off plus holidays. In addition, all colleagues are eligible for a number of rewards and recognition programs including billable bonus opportunities. Encouraging a healthy work/life balance and providing our colleagues with great benefits are just part of what makes Perficient a great place to work.

GTN Technical Staffing and Consulting
  • Dallas, TX

Senior Software Engineer - Scala
HIGHLIGHTS
Location: North Dallas, TX
Position Type: Direct Hire
Hourly / Salary: BOE
Residency Status: US Citizens and US Permanent Residents only, as sponsorship is not being offered at this time.

This position requires a broad array of programming skills and experience as well as the desire to learn and grow in an entrepreneurial environment. You will be responsible for creating and developing client onboarding, product provisioning and real-time analytics software services. You will work closely with other members of the architecture team to make strategic decisions about product development, devops and future technology choices.

The ideal candidate should:

  • Demonstrate a proven track record of rapidly building, delivering, and maintaining complex software products.
  • Possess excellent communication skills.
  • Have high integrity.
  • Embrace learning and have a thirst for knowledge.
  • The ideal candidate will possess a broad array of programming skills and experience, with a focus on writing scalable, high-performance microservices on a platform such as AWS Lambda.

Be self-motivated to innovate and develop cutting edge technology functionality.

  • Be able to rapidly learn new frameworks.
  • Be responsible for creating and implementing core product architecture. Be comfortable developing frontend and backend solutions.

Required experience:

7+ years of hands-on experience

  • AWS (EC2, Lambda, ECS)
  • Docker/Kubernetes
  • 3+ years programming in Scala
  • 3+ years programming in Node.js
  • ES6/modern JavaScript
  • Microservices

Preferred experience:

  • MongoDB
  • SBT
  • Serverless Computing

The ideal candidate will:

  • Possess excellent communication and organization skills
  • Embrace learning and have a thirst for knowledge
  • Rapidly learn new technological paradigms
  • Understand and program in multiple programming languages
  • Enjoy being part of a growing team
  • Be a self-motivated team player

Benefits

  • Medical / Dental / Vision Insurance
  • 401(k)
  • Competitive compensation
  • Work with leaders in the industry
  • Opportunities to learn and grow every day
  • Play a meaningful role on a successful team
Acxiom
  • Austin, TX
The Data Analytics Engineer leverages Acxiom and third-party software to create solutions to business problems defined by specific business requirements. As a Data Analytics Engineer, you will draw upon technical and data processing knowledge to solve moderately complex marketing and data warehousing problems on very large volumes of data.

This position can be home-based. Corporate office locations include: Downers Grove, IL; Conway, AR; Austin, TX; New York City, NY.

 

Responsibilities:


  • Understands requirements to build, enhance, or integrate data programs and processes for one or more Acxiom Client solutions and/or applications. Able to read and interpret application design and functional specifications to write or enhance application code.
  • Interacts with client/stakeholders to understand and resolve problems in a timely manner, prioritizing multiple issue response based on the severity of the case.
  • Develops automation jobs for data orchestration, data analysis and data transfers
  • Provides input on functional requirements and participates in and presents code in code review sessions. Helps accurately estimate requirements in order to deliver client solutions within time and quality standards.
  • Utilizes standard/Acxiom methodologies to ensure overall solution and data integrity is maintained.
  • Understands Acxiom solution software. Defines solutions standards, policies and procedures.
  • Identifies and diagnoses areas of maintenance and process improvement.
  • Responds to client/stakeholder problems in a timely manner, prioritizing multiple issue response based on the severity of the case.
  • Using a relevant software language, develops/ executes unit test cases and tests software applications that fulfill functional specifications. Documents and interprets test results and corrects application coding errors
  • Draws on past technical experience to adapt to new programming languages and technologies


What you will need:


  • 2-4 years of experience in data engineering/programming and data analytics at a large organization
  • Analytic problem-solving skills with the ability to think outside-the-box
  • Analytical thinker that excels at analyzing and understanding data to answer questions
  • Good communication skills: communicate ideas clearly and effectively to other members of the analytics team and to the client at multiple levels (both technical and business)
  • Excellent understanding of data concepts, data architecture, data manipulation/engineering and data engineering design
  • Able to leverage SQL or other data manipulation languages such as SAS, Python pandas, R, Spark, etc. to answer business questions effectively
  • Passion for considering how projects fit into the wider business picture
  • Expert programming experience in Python, shell scripting, or other object-oriented and structured programming languages
  • Understanding of multiple types of programming languages in order to be adaptable (statically typed vs. dynamically typed, and object-oriented vs. procedural)
  • Self-starter: able to work independently with little guidance
  • Multitasker: able to prioritize and deliver on multiple projects and tasks that are happening simultaneously
  • Adaptable - Able to adapt to diverse technical challenges and systems
  • Up to 25% travel
  • Up to 50% direct client interaction
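
As a small illustration of using Python pandas to answer a MarTech-style business question like those described above, the sketch below computes email response rates per campaign from a toy send/response table; the column names and data are illustrative assumptions.

# Minimal sketch: answering a marketing question with pandas.
# Columns and sample data are illustrative assumptions.
import pandas as pd

sends = pd.DataFrame({
    "campaign":  ["spring_promo", "spring_promo", "loyalty", "loyalty", "loyalty"],
    "customer":  ["a1", "a2", "b1", "b2", "b3"],
    "responded": [True, False, True, True, False],
})

# Response volume and rate per campaign.
summary = (sends
           .groupby("campaign")["responded"]
           .agg(sent="count", responses="sum"))
summary["response_rate"] = summary["responses"] / summary["sent"]
print(summary.sort_values("response_rate", ascending=False))
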


What will set you apart:


  • Client facing consulting experience
  • Experience with 3rd party MarTech data (Email send/response, Direct Mail send/response, Prospect Lists) and/or AdTech data (Digital Ad Impressions/Activity, Social, Website activity)
  • Experience working in environments with strong data privacy and data governance
  • SAS & building macros in SAS
  • Hadoop architecture (Cloudera, Hortonworks, MapR)
  • Hive
  • Spark/PySpark
  • R
  • AWS experience
  • Building reports on Tableau or other BI Tools

 

Mercedes-Benz USA
  • Atlanta, GA

Job Overview

Mercedes-Benz USA is recruiting a Big Data Architect, a newly created position within the Information Technology Infrastructure Department. This position is responsible for refining and creating the next step in technology for our organization. In this role you will act as the contact person and agile enabler for all questions regarding new IT infrastructure and services in the context of Big Data solutions.

Responsibilities

  • Leverage sophisticated Big Data technologies into current and future business applications

  • Lead infrastructure projects for the implementation of new Big Data solutions

  • Design and implement modern, scalable data center architectures (on premise, hybrid or cloud) that meet the requirements of our business partners

  • Ensure the architecture is optimized for large dataset acquisition, analysis, storage, cleansing, transformation and reclamation

  • Create the requirements analysis, the platform selection and the design of the technical architecture

  • Develop IT infrastructure roadmaps and implement strategies around data science initiatives

  • Lead the research and evaluation of emerging technologies, industry and market trends to assist in project development and operational support activities

  • Work closely together with the application teams to exceed our business partners' expectations

Qualifications 

Education

Bachelor's Degree (accredited school) with emphasis in:

Computer/Information Science

Information Technology

Engineering

Management Information System (MIS)

Must have 5-7 years of experience in the following:

  • Architecture, design, implementation, operation and maintenance of Big Data solutions

  • Hands-on experience with major Big Data technologies and frameworks including Hadoop, MapReduce, Pig, Hive, HBase, Oozie, Mahout, Flume, ZooKeeper, MongoDB, and Cassandra.

  • Experience with Big Data solutions deployed in large cloud computing infrastructures such as AWS, GCE and Azure

  • Strong knowledge of programming and scripting languages such as Java, Linux shell, PHP, Ruby, Python

  • Big Data query tools such as Pig, Hive and Impala

  • Project Management Skills:

  • Ability to develop plans/projects from conceptualization to implementation

  • Ability to organize workflow and direct tasks as well as document milestones and ROIs and resolve problems

Proven experience with the following:

  • Open source software such as Hadoop and Red Hat

  • Shell scripting

  • Servers, storage, networking, and data archival/backup solutions

  • Industry knowledge and experience in areas such as Software Defined Networking (SDN), IT infrastructure and systems security, and cloud or network systems management

Additional Skills
Focus on problem resolution and troubleshooting
Knowledge of hardware capabilities, software interfaces, and applications
Ability to produce quality digital assets/products
 
EEO Statement
Mercedes-Benz USA is committed to fostering an inclusive environment that appreciates and leverages the diversity of our team. We provide equal employment opportunity (EEO) to all qualified applicants and employees without regard to race, color, ethnicity, gender, age, national origin, religion, marital status, veteran status, physical or other disability, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local law.

inovex GmbH
  • München, Germany

As a Linux Systems Engineer with a focus on Hadoop and Search, you will be responsible for designing, installing, and configuring Linux-based Big Data clusters for our customers. Your tasks also include evaluating existing Big Data systems and extending existing environments in a future-proof way.

You look after these systems holistically, supporting them from the Linux operating system all the way up to the Big Data stack. To automate the often complex Big Data clusters, you preferably use configuration management tools.

In our interdisciplinary project teams you play a formative role and often have the freedom to choose the tools you work with.


To fill this position, we are looking for experts who bring the following skills and qualities:



  • A successfully completed degree related to computer science, or a comparable qualification such as vocational training as an IT specialist (Fachinformatiker), plus relevant professional experience

  • Passion and enthusiasm for new technologies and topics around Linux and Big Data

  • Practical experience with Hadoop and common Hadoop ecosystem tools, as well as initial experience with Hadoop security

  • Ideally, you have already gained practical experience with one or more of the following technologies or products:

    • Flume, Kafka

    • Flink, Hive, Spark

    • Cassandra, Elasticsearch, HBase, MongoDB, CouchDB

    • Amazon EMR, Cloudera, Hortonworks, MapR

    • Java



  • Good knowledge of networking and storage

  • Knowledge of a configuration management tool (e.g. Puppet, Chef, or Salt) is an advantage

  • Good communication skills and very good written and spoken German and English

  • High motivation to achieve excellent project results together with other "inovexperts"

  • Mobility and flexibility for project work on-site at our customers

Carbon Lighthouse, Inc.
  • San Francisco, CA

Target Start Date: Ongoing


Reports To: Director of Software


Location: San Francisco, CA


Carbon Lighthouse is looking for an adaptable, self-motivated, full stack Senior Software Engineer to work and grow with us to envision and build out our software platform and accelerate environmental impact. Along the way, you’ll learn all about energy efficiency, solar, real estate markets, construction, and probably get your hands dirty (or at least dusty) on site visits.

We have taken the first step of transforming our data analysis and thermodynamics modeling toolset into a robust software platform, called CLUES®, built on the latest web technologies. Now it’s time to take our platform to the next level and convert it into the tool we use to fulfill our mission and have a global impact. Your role will initially concentrate on streamlining the flow of data from wireless sensors through CLUES, implementing optimization and machine learning to drive the automation of our analytics and modeling process, and looking for ways to scale CLUES to a platform that enables us to stop climate change.

Your immediate career path will concentrate on designing and building out both the frontend and backend of CLUES, with many growth opportunities for specialization and management as our company expands. You should have a bachelor’s degree in computer science (or similar), 5+ years of professional development experience, and be excited by complex and open-ended engineering problems. You genuinely enjoy working on collaborative teams, and want to be involved in large scale challenges that require long term effort. While this role involves significant software development work, you will also spend time interacting with Carbon Lighthouse mechanical engineers, project managers, and energy performance engineers to connect our software to our real-world mission.

The role is based at our headquarters in downtown San Francisco, but may require occasional travel to our satellite offices or client sites.

About Carbon Lighthouse: 
Carbon Lighthouse is on a mission to stop climate change by making it easy and profitable for building owners to eliminate carbon emissions caused by wasted energy. The company’s unique approach to Efficiency Production goes deep into buildings to uncover and continuously correct hidden inefficiencies that add up to meaningful financial value and carbon elimination that lasts. Since 2010, commercial real estate, educational, hospitality and industrial customers nationwide have chosen Carbon Lighthouse to enhance building comfort, increase net operating income and achieve their sustainability goals. Backed by notable investors, we are a team of 84 that highly values question asking, getting it done, integrity, and teamwork. We appreciate a fulfilling work-life balance, prize transparency and communication, hold ourselves to high standards of performance and professionalism, strive for dynamism and innovation, and support our team members’ professional development. Every person has both the opportunity and responsibility to make an impact on our growing organization.


Responsibilities:



  • Develop full stack web applications in an Agile environment to increase the speed and efficiency of Carbon Lighthouse process and grow our overall product offering

  • Participate in the full product development cycle, including brainstorming, architecting, release planning and estimation, implementing and iterating on code, coordinating with internal and external clients, internal code and design reviews, MVP and production releases, quality assurance, and product support.

  • Collaboratively work on simultaneous projects with multiple stakeholders, both on the software team and company-wide

  • Work in a team environment, expressing ideas and being open to those of others, to effectively drive cross-team solutions that have complex dependencies and requirements

  • Participate in, study, and improve the current Carbon Lighthouse process



Required Qualifications:



  • Dedication to Carbon Lighthouse's environmental mission

  • BS in Computer Science or similar technical field, or have demonstrated exceptional experience in technology environments

  • 5+ years of development experience

  • Proven track record of building production web applications using Python, or other comparable technology



Relevant Framework and Tool Experience

Proven experience with the majority of the following:



  • Server side web frameworks (e.g. Express, Flask, etc.)

  • Building user interfaces using JavaScript, HTML, CSS, and front end frameworks (e.g. Angular, React, etc.)

  • Developing applications backed by RDBMS or NoSQL data stores (e.g. MySQL, MongoDB, etc.)

  • Developing scalable, robust, and fault-tolerant REST services and microservices

  • Unit testing frameworks (e.g. Mocha, Chai, unittest, pytest, etc.)

  • Version control systems (e.g. Git, SVN, etc.)

  • Test driven development, continuous integration, and continuous deployment

  • AWS
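
As a hedged illustration of the REST, Flask, and pytest items above, the sketch below defines a tiny Flask endpoint and a unit test that exercises it through the test client; the route and payload are assumptions for the example.

# Minimal sketch: a tiny Flask REST service plus a pytest-style unit test using the test client.
# The endpoint name and payload are illustrative assumptions.
from flask import Flask, jsonify, request

def create_app():
    app = Flask(__name__)

    @app.route("/api/setpoints", methods=["POST"])
    def create_setpoint():
        payload = request.get_json(force=True)
        # In a real service this would be validated and persisted.
        return jsonify({"zone": payload["zone"], "status": "accepted"}), 201

    return app

# --- pytest unit test (would normally live in tests/test_api.py) ---
def test_create_setpoint_returns_201():
    client = create_app().test_client()
    resp = client.post("/api/setpoints", json={"zone": "AHU-1", "value": 72})
    assert resp.status_code == 201
    assert resp.get_json()["status"] == "accepted"
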



Bonus Qualifications:



  • Data science background

  • HVAC systems background

  • HVAC controls software/hardware background

  • UX/UI design experience

  • Startup experience



Compensation and Benefits:



  • Salary + equity

  • Medical, dental, vision, and disability insurance

  • Generous vacation policy

  • Fully paid maternity and paternity leave benefits

  • Subsidized public transit and bike to work benefits

  • 401(k)



Carbon Lighthouse is an equal employment opportunity employer and considers qualified applicants without regard to gender, sexual orientation, gender identity, race, veteran or disability status. 

If you’re excited about our environmental mission and this looks like a fit, you should apply! Please fill out the brief application form and be prepared to submit three references at a later date.

Giesecke+Devrient Currency Technology GmbH
  • München, Germany

Work with us on a future that moves things forward. For our Currency Management Solutions division, we are looking for you as a



Big Data Engineer (m/f/d)



Your tasks:




  • You are responsible for the design, implementation, and operation of effective data processing architectures within our innovative microservices- and cloud-based data analytics, IIoT, and digital solutions

  • Ingestion, integration, organization, batch and stream processing, and lifecycle management of data

  • Ensuring the quality, integrity, and privacy of data

  • Setup, monitoring, and tuning of Hadoop clusters and databases ("you build it, you run it")

  • Close collaboration in agile teams with data scientists, development teams, and product owners on data modeling, data analysis, and technology consulting
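
As a small illustration of the ingestion and stream-processing tasks above, the sketch below consumes a Kafka topic in Python using the kafka-python client; the broker address, topic name, and message schema are assumptions for the example.

# Minimal sketch: a Kafka consumer for an IIoT-style ingestion step, using kafka-python.
# Broker address, topic name, and message schema are illustrative assumptions.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "machine-telemetry",                          # assumed topic
    bootstrap_servers="localhost:9092",           # assumed broker
    group_id="ingestion-sketch",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value                         # e.g. {"machine_id": "...", "temp_c": 41.7}
    if event.get("temp_c", 0) > 80:
        print(f"alert: {event['machine_id']} running hot")
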



Your profile:




  • A degree (Master's, from a university or university of applied sciences) in computer science or a comparable field

  • Very good knowledge of data ingestion/integration (Flume, Sqoop, NiFi), data storage (PostgreSQL, MongoDB), distributed storage (Hadoop, Cloudera), messaging (Kafka, MQTT), data processing (Spark, Scala), and scheduling (Oozie, Pig)

  • Practical experience in developing and operating large-scale data processing pipelines in scalable microservice/REST architectures

  • Experience with cloud environments such as Microsoft Azure or AWS is desirable

  • Very good written and spoken German and English






We look forward to receiving your online application at www.gi-de.com/karriere.




Giesecke+Devrient Currency Technology GmbH · Prinzregentenstraße 159 · 81677 München

Farfetch UK
  • London, UK

About the team:



We are a multidisciplinary team of Data Scientists and Software Engineers with a culture of empowerment, teamwork and fun. Our team is responsible for large-scale and complex machine learning projects, directly providing business-critical functionality to other teams and using the latest technologies in the field.



Working collaboratively as a team and with our business colleagues, both here in London and across our other locations, you’ll be shaping the technical direction of a critically important part of Farfetch. We are a team that surrounds ourselves with talented colleagues and we are looking for brilliant Software Engineers who are open to taking on plenty of new challenges.



What you’ll do:



Our team works with vast quantities of messy data, such as unstructured text and images collected from the internet, applying machine learning techniques, such as deep learning, natural language processing and computer vision, to transform it into a format that can be readily used within the business. As an Engineer within our team you will help to shape and deliver the engineering components of the services that our team provides to the business. This includes the following:




  • Work with Project Lead to help design and implement new or existing parts of the system architecture.

  • Work on surfacing the team’s output through the construction of ETLs, APIs and web interfaces.

  • Work closely with the Data Scientists within the team to enable them to produce clean production quality code for their machine learning solutions.



Who you are:



First and foremost, you’re passionate about solving complex, challenging and interesting business problems. You have solid professional experience with Python and its ecosystem, with a thorough approach to testing.



To be successful in this role you have strong experience with:



  • Python 3

  • Web frameworks, such as Flask or Django.

  • Celery, Airflow, PySpark or other processing frameworks.

  • Docker

  • ElasticSearch, Solr or a similar technology.
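
As a hedged illustration of the processing frameworks mentioned above, the sketch below defines a small Airflow (2.x) DAG that chains extract, transform, and load steps; the DAG id, schedule, and task bodies are assumptions for the example.

# Minimal sketch: an Airflow DAG that extracts, transforms, and indexes product data daily.
# DAG id, schedule, and the callables' bodies are illustrative assumptions (Airflow 2.x imports).
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw product pages")       # e.g. via a scraping job or partner API

def transform():
    print("clean text / run NLP models")  # e.g. PySpark or plain Python

def load():
    print("index documents")              # e.g. into a search backend

with DAG(
    dag_id="catalogue_enrichment_sketch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
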



Bonus points if you have experience with:



  • Web scraping frameworks, such as Scrapy.

  • Terraform, Packer

  • Google Cloud Platform, such as Google BigQuery or Google Cloud Storage.



About the department:



We are the beating heart of Farfetch, supporting the running of the business and exploring new and exciting technologies across web, mobile and instore to help us transform the industry. Split across three main offices - London, Porto and Lisbon - we are the fastest growing teams in the business. We're committed to turning the company into the leading multi-channel platform and are constantly looking for brilliant people who can help us shape tomorrow's customer experience.





We are committed to equality of opportunity for all employees. Applications from individuals are encouraged regardless of age, disability, sex, gender reassignment, sexual orientation, pregnancy and maternity, race, religion or belief and marriage and civil partnerships.