OnlyDataJobs.com

Wipro Limited
  • Dallas, TX

Below is the Job Description

    • 15-20 years of experience as a Data Engineer/Architect working on end-to-end architecture and consulting on Cloud Platforms
    • Strong experience in large-scale initiatives building world-class products on Cloud and Big Data platforms, managing large data repositories, and delivering distributed, highly scalable applications
    • Experience defining the roadmap and migration journey onto a Cloud Platform, covering Data Lake, Data Warehouse, Object Stores, data processing with Spark, real-time processing, deep learning, AI/ML, notebooks, etc.
    • Strong DevOps skill set and experience working in an Agile environment
    • Good customer-facing, interpersonal, and communication skills
    • Experience addressing non-functional requirements and large application deployment architectures, with concerns such as scalability, performance, availability, reliability, enterprise security, data governance, data quality, etc.
    • Ability to quickly prototype, architect, and build software using the latest technologies on Google Cloud Platform, AWS, and/or Azure
    • Strong experience with the following technologies: Spark, Snowflake, Redshift, Cloudera, Kafka, and ETL tools (a brief illustrative sketch follows this list)
    • Candidates should have delivered at least one or two projects at production scale on the above technology stacks.
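As a purely illustrative sketch of the kind of stack this role names (Spark reading from Kafka and landing data in a cloud object store), the snippet below shows a minimal PySpark Structured Streaming job. The broker, topic, and bucket names are placeholders, not details from this posting, and it assumes the spark-sql-kafka connector is available on the Spark classpath:

    # Hypothetical sketch: a minimal Spark Structured Streaming job that reads
    # events from a Kafka topic and writes them as Parquet to an object store.
    # Broker, topic and bucket names are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("kafka-to-object-store").getOrCreate()

    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
        .option("subscribe", "clickstream")                  # placeholder topic
        .load()
        .select(col("key").cast("string"), col("value").cast("string"), col("timestamp"))
    )

    query = (
        events.writeStream
        .format("parquet")
        .option("path", "s3a://example-data-lake/clickstream/")   # placeholder bucket
        .option("checkpointLocation", "s3a://example-data-lake/_checkpoints/clickstream/")
        .trigger(processingTime="1 minute")
        .start()
    )
    query.awaitTermination()

The same pattern generally extends to feeding a warehouse such as Snowflake or Redshift through the respective connectors.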

Thanks

Harsha Awtaney

408-203-3108

Accenture
  • Houston, TX
  • Atlanta, GA
  • Dallas, TX
  • Minneapolis, MN
Are you ready to step up to the New and take your technology expertise to the next level?
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements, and the way we collaborate, operate and deliver value, provide an unparalleled opportunity to grow and advance. Choose Accenture and make delivering innovative work part of your extraordinary career.
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward. We partner with our clients to help transform their data into an Appreciating Business Asset.
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of technology and data experts who are highly collaborative, taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in Technology at Accenture!
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
AWS Big Data Engineering professionals develop deep data integration skills to support Accenture's Cloud Big Data engineering and analytics agendas, including skills such as cloud big data platform architecture, AWS data services, on-premises-to-cloud data migration, data ingestion, and data curation.
    • Act as a technical and solution expert in the areas of Big Data management, Data on Cloud, AWS Data Services, and AI/Machine Learning capabilities in data platforms
    • Advise clients on their Data on Cloud adoption journey, leveraging next-generation information platforms on Cloud, cloud data architecture patterns, and platform selection
    • Build senior stakeholder relationships
    • Build a personal brand within Accenture and drive thought leadership through participation in business development efforts, client meetings and workshops, speaking at industry conferences, publishing white papers, etc.
    • Partner with client teams and clients on data monetization initiatives: making business sense of structured, semi-structured, unstructured, and streaming data to develop new business strategies and customer engagement models, manage product portfolios, and optimize enterprise assets
    • Develop industry-relevant data analytics solutions for enterprise business functions
    • Collaborate with go-to-market teams in generating demand and pipeline for data analytics solutions
    • Collaborate with partners (software vendors) to build joint industry solutions
    • Serve as the data supply chain expert for the vision and integration of emerging data technologies on cloud, the anticipation of new trends, and the resolution of complex business and technical problems
    • Lead the discussion and early analysis of data-on-cloud concepts as they relate to Accenture's data supply chain service offerings, so that clear use cases are developed and prioritized, and transition these concepts from ideas to working prototypes with the full support of the teams that will develop the new offerings
    • Evaluate alliance technologies for potential go-to-market partnerships.
    • Lead the development of offering proofs-of-concept and effectively transition those concepts to the lines of business for full architecture, engineering, deployment and commercialization.
    • Coach and mentor both senior and junior members across OGs and IDC.
    • Develop practical solutions, methodologies, solution patterns and point-of-view documents.
    • Manage and grow the Data and Data on Cloud pipeline
    • Participate in industry events to project Accenture's thought leadership
A professional at this position level within Accenture has the following responsibilities:
    • Evaluates emerging technologies, shapes Accenture's point of view, defines new architecture patterns and standards, and leads proofs of concept of innovative solutions.
    • Leads and supports client sales pursuits.
    • Collaborates with a highly talented resource pool and helps lead the Community of Data practice.
    • Key participant in setting strategic direction to establish near-term goals for area of responsibility.
    • Interacts with senior management levels at a client and/or within Accenture, which involves negotiating or influencing on significant matters.
    • Has latitude in decision-making and determining objectives and approaches to critical assignments.
    • Decisions have a lasting impact on area of responsibility with impact outside area of responsibility.
    • Manages large teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.

    Basic Qualifications
      • 5 years of hands-on experience with public cloud platforms such as AWS, Microsoft Azure, and GCP
      • 12+ years of experience in multiple disciplines, such as solution or technical architecture, data management, cloud architecture, or Big Data
      • 5 years of experience with a combination of the AWS Cloud Platform, S3, Glacier, AWS Data Services, Redshift, Redshift Spectrum, Lambda functions, Apache Spark, Kafka, and Python
      • Must be able to travel 100% (Mon-Thurs)
      • Bachelor's degree or 12 years of professional experience
      • 5 years of experience developing solutions utilizing the following (a brief illustrative sketch follows this list):
        • Kafka-based streaming services
        • RStudio
        • RDS, S3, Glacier
        • MapReduce, Pig, Hive
        • Scala, Spark
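As a hedged illustration of the Python, Lambda, and S3 items above (not a requirement of the role), a minimal AWS Lambda handler that reacts to S3 object-created events might look like the following; the bucket names and downstream steps are placeholders:

    # Hypothetical sketch: a small AWS Lambda handler that reacts to S3
    # "object created" events and logs a simple row count per object.
    import boto3
    from urllib.parse import unquote_plus

    s3 = boto3.client("s3")

    def handler(event, context):
        """Triggered by S3 PUT notifications; object keys arrive URL-encoded."""
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = unquote_plus(record["s3"]["object"]["key"])
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
            row_count = body.count(b"\n")
            print(f"{bucket}/{key}: {row_count} rows")  # visible in CloudWatch Logs
        return {"status": "ok"}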
    Preferred Qualifications
      • Experience as a consulting manager in a top-tier consulting firm
      • Ten or more years of experience dealing with complex business/technical architectures and complex client delivery.
      • Working knowledge of Big Data tools such as MongoDB, Cassandra, Apache Hadoop, NoSQL databases, Spark, and Hive
      • Experience delivering Big Data solutions in the cloud with AWS
      • Ability to configure and support API and open-source integrations
      • Experience administering Hadoop or other Data Science and Analytics platforms using the technologies above
      • Over 5 years of experience in sales/pre-sales functions, leading pursuits, proposal development, effort estimation, and statements of work
      • Experience with DevOps support
      • Experience designing ingestion, low-latency, and visualization clusters to sustain data loads and meet business and IT expectations
    Professional Skill Requirements
    • Proven success in contributing to a team-oriented environment
    • Proven ability to work creatively and analytically in a problem-solving environment
    • Desire to work in an information systems environment
    • Excellent communication (written and oral) and interpersonal skills
    • Ability to work with senior client executives
    Our consulting professionals receive comprehensive training covering business acumen, technical and professional skills development. You'll also have opportunities to hone your functional skills and expertise in an area of specialization. We offer a variety of formal and informal training programs at every level to help you acquire and build specialized skills faster. Learning takes place both on the job and through formal training conducted online, in the classroom, or in collaboration with teammates. The sheer variety of work we do, and the experience it offers, provide an unbeatable platform from which to build a career.
    Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H-1B visa, F-1 visa (OPT), TN visa or any other non-immigrant status).
    Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
    Accenture is an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
    Equal Employment Opportunity
    All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
    Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
    Accenture is committed to providing veteran employment opportunities to our service men and women.
Expedia Group - Europe
  • London, UK

Expedia needs YOU!


At Hotels.com (part of the Expedia Group) ensuring that our customers find the perfect hotel is an exciting mission for our technology teams.

We are looking for a Software Development Engineer to join our awesome team of bright minds. We are building Hotels.com's data streaming and ingestion platform, reliably handling many thousands of messages every second.

Our platform enables data producers and consumers to easily share and process information. We have built and continue to improve our platform using open-source projects such as Kafka, Kubernetes, and Java Spring, all running in the cloud, and we love contributing our work back to the open source community.


Who we're looking for:


Are you interested in building a world-class, fast-growing data platform in the cloud?



  • Are you keen to work with and contribute to open source projects?

  • Do you have a real passion for clean code and finding elegant solutions to problems?

  • Are you eager to learn streaming, cloud and big data technologies and keep your skills up to date?

  • If any of those are true... Hotels.com is looking for you!

  • We seek an enthusiastic, experienced and reliable engineer who enjoys getting things done.

  • You should have good communication skills and be equally comfortable explaining technical problems and solutions clearly to analysts, engineers and product managers.

  • We welcome your fresh ideas and approaches as we constantly aim to improve our development methodologies.

  • Our team has years of streaming, cloud and big data experience and works with a wide range of cutting-edge technologies.

  • We are always learning and growing, so we guarantee that you won't be bored with us!


We don’t believe in skill matching against a list of buzzwords…

However, we do believe in hiring smart, friendly and creative people with excellent programming abilities who are on a journey to mastery through craftsmanship. We believe great developers can learn new technologies quickly and effectively, but it wouldn't hurt if you have experience with some of the following (or a passion to learn them):

Technologies:
Kafka, Kubernetes, Spring, AWS, Spark Streaming, Hive, Flink, Docker (a brief consumer sketch follows the Experience list below).

Experience:
Modern core and server-side Java (concurrency, streams, reactive, lambdas).
Microservice architecture, design, and standard methodologies with an eye towards scale.
Building and debugging highly scalable, performant systems.
Actively contributing to open-source software.
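The platform itself is largely Java and Spring, but the core Kafka consume loop looks much the same in any client. Below is a language-agnostic sketch using the Python kafka-python package; this is an illustration only, and the topic and broker names are placeholders rather than anything from the team's actual system:

    # Illustrative only: a minimal Kafka consumer loop. The team's platform is
    # Java/Spring; this uses the kafka-python client with placeholder names.
    import json
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "hotel-search-events",                 # placeholder topic
        bootstrap_servers="broker:9092",       # placeholder broker
        group_id="example-ingestion-service",
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
        auto_offset_reset="earliest",
    )

    for message in consumer:
        # A real service would validate, enrich and forward each event to
        # downstream storage; here we just print the record coordinates.
        print(message.topic, message.partition, message.offset, message.value)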


What you’ll do:



  • Write clean, efficient, thoroughly tested code, backed up with pair programming and code reviews.

  • Much of our code is Java, but we use all kinds of languages and frameworks.

  • Be part of an agile team that is continuously learning and improving.

  • Develop scalable and highly performant distributed systems, with everything this entails (availability, monitoring, resiliency).

  • Work with our business partners to flesh out and deliver on requirements in an agile manner.

  • Evolve development standards and practices.

  • Take architectural ownership for various critical components and systems.

  • Proactively solve problems at the organization level.

  • Communicate and document solutions and design decisions.

  • Build bridges between technical teams to enable valuable collaborations.


As a team we love to:



  • Favor clean code, and simple, robust architecture.

  • Openly share knowledge in order to grow and develop our team members.

  • Handle massive petabyte-scale data sets.

  • Host and attend meetups and conferences to present our work. This year we've presented at the Dataworks Summit in Berlin and the Devoxx conference in London.

  • Contribute to Open Source. In recent years our team created Circus Train, Waggle Dance, BeeJU, CORC, Plunger, Jasvorno and contributed to Confluent, aws-alb-ingress-controller, S3Proxy, Cascading, Hive, HiveRunner, Kafka and Terraform. In addition, we are currently working towards open sourcing our streaming platform.

  • Create an inclusive and fun workplace.


Do all of this in a comfortable, modern office with a massive roof terrace, in a great location in central London!


We’ll take your career on a journey that’s flexible and right for you, recognizing and rewarding your achievements:

A conversation around flexible working and what will best fit you is encouraged at Hotels.com.
Competitive salaries and many growth opportunities within the wider global Expedia Group.
Option to attend conferences globally and enrich the technology skills you are passionate about.
Cash and Stock rewards for outstanding performance.
Extensive travel rewards and discounts for all employees, perfect for ticking some destinations off your bucket list!


We believe that a diverse and inclusive workforce is the most awesome workforce…
We believe in being Different. We seek new ideas, different ways of thinking, diverse backgrounds and approaches, because averages can lie and conformity is dangerous.
Expedia is committed to crafting an inclusive work environment with a diverse workforce. You will receive consideration for employment without regard to race, religion, gender, sexual orientation, national origin, disability or age.

Why join us:
Expedia Group recognizes our success is dependent on the success of our people.  We are the world's travel platform, made up of the most knowledgeable, passionate, and creative people in our business.  Our brands recognize the power of travel to break down barriers and make people's lives better – that responsibility inspires us to be the place where exceptional people want to do their best work, and to provide them the tools to do so. 


Whether you're applying to work in engineering or customer support, marketing or lodging supply, at Expedia Group we act as one team, working towards a common goal; to bring the world within reach.  We relentlessly strive for better, but not at the cost of the customer.  We act with humility and optimism, respecting ideas big and small.  We value diversity and voices of all volumes. We are a global organization but keep our feet on the ground so we can act fast and stay simple.  Our teams also have the chance to give back on a local level and make a difference through our corporate social responsibility program, Expedia Cares.


If you have a hunger to make a difference with one of the most loved consumer brands in the world and to work in the dynamic travel industry, this is the job for you.


Our family of travel brands includes: Brand Expedia®, Hotels.com®, Expedia® Partner Solutions, Egencia®, trivago®, HomeAway®, Orbitz®, Travelocity®, Wotif®, lastminute.com.au®, ebookers®, CheapTickets®, Hotwire®, Classic Vacations®, Expedia® Media Solutions, CarRentals.com™, Expedia Local Expert®, Expedia® CruiseShipCenters®, SilverRail Technologies, Inc., ALICE and Traveldoo®.


We’re excited for you to make Expedia an even more awesome place to work!
So what are you waiting for? Apply now and join us on our journey to become the #1 travel company in the world!



Expedia is committed to creating an inclusive work environment with a diverse workforce. All qualified applicants will receive consideration for employment without regard to race, religion, gender, sexual orientation, national origin, disability or age.

KI labs GmbH
  • München, Germany
  • Salary: €70k - 95k

About Us


At KI labs we design and build state-of-the-art software and data products and solutions for the major brands of Germany and Europe. We aim to push the status quo of technology in corporations, with special focus areas of software, data and culture. Inside, we are a team of software developers, designers, product managers and data scientists who are passionate about building the products of the future today. We believe in open-source and independent teams, follow Agile practices and the lean startup method, and aim to share this culture with our clients. We are mainly located in Munich and, more recently, Lisbon.


Your Responsibilities



  • Lead and manage a team of talented Data Engineers and Data Scientists in a diverse environment.

  • Ensure success of your team members and foster their career growth.

  • Ensure that the technology stacks, infrastructure, software architecture and development methods we use provide for an efficient project delivery and sustainable company growth.

  • Use your extensive practical expertise to help teams design, build and scale efficient and elegant data products on cloud computing platforms such as AWS, Azure, or Google Cloud Platform.

  • Enable and facilitate a data-driven culture for internal and external clients, and use advanced data pipelines to generate insights.


Skills and qualifications



  • Advanced degree in Computer Science, Engineering, Data Science, Applied Mathematics, Statistics or related fields.

  • Substantial practical experience in software and data engineering.

  • You are an authentic leader, keen on leading by example rather than by direct top-down management.

  • You are committed to operational excellence and to facilitating team communication and collaboration on a daily basis.

  • You like to see the big picture and you have a deep understanding of the building blocks of large-scale data architectures and pipelines.

  • You have a passion for data, be it crunching, transforming or visualising large data streams.

  • You have working knowledge of programming languages such as Python, Java, Scala, or Go.

  • You have working knowledge of modern big data tooling and frameworks (Hadoop, Spark, Flink, Kafka, etc.), data storage systems, analytics tools, and ideally machine learning platforms.


Why work with us



  • You will have an opportunity to be at the frontline of innovation together with our prominent clients, influencing the car you drive in five years, the services you have on your flight, and the way you pay for your morning coffee.

  • Working closely with our leadership team, you will have a real chance to influence what our quickly growing company looks like in 3 years.

  • You get a challenging working environment located in the center of Munich and Lisbon and an ambitious team of individuals with unique backgrounds and expertise.

  • You get the chance to work on various interesting projects in short time frames; we do not work on maintenance or purely linear projects.

  • We have an open-door work culture where ideas and initiatives are encouraged.

  • We offer a performance-based competitive salary.

Epidemic Sound AB
  • Stockholm, Sweden

At Epidemic Sound we are reinventing the music industry. Our carefully curated music library, with over 30,000 tracks, is tailored for storytellers, no matter what their story is. Countless customers around the world, from broadcasters and production companies to YouTubers, rely on our tracks to help them tell their stories. Epidemic Sound’s music is heard in hundreds of thousands of videos on online video platforms such as YouTube, Facebook, Twitch and Instagram. Our HQ is located in Stockholm, with offices in NYC, LA, Hamburg, Amsterdam and Madrid. We are growing fast, we have lots of fun and we are transforming the music industry.


We are now looking for a Software Engineer!


Job description


In addition to the hundreds of thousands of storytellers using our products, our music is heard tens of billions of times every month across YouTube, social media and music streaming platforms. We want to make maximum use of all this data to generate insights and enable data-driven decisions, both in our product development and for our musicians and business stakeholders.


As such, we are now looking to grow our Data Infrastructure Team with an experienced software engineer to help us design, develop and evolve our platform and products.


You can expect to:



  • Collaborate with our product-oriented teams during data-related projects to achieve reliable and scalable solutions

  • Design, develop and evolve the data pipelines that fuel our back-end services, machine learning platform and business intelligence systems.

  • Contribute to all stages of the product life-cycle: Design, implementation, testing, releasing and maintenance

  • Work in a lightweight agile process where we regularly evaluate how we work and try to improve

  • Collaborate constantly: We’re big believers in teamwork and the value of practices like careful code reviews, pair (or mob) programming, etc.

  • Learn a ton of new things: Be it through hack-days, courses, conferences or tech-talks, we emphasize learning and we also expect you to share your knowledge with your colleagues

  • Have fun and take a lot of pride in what you and all of Epidemic Sound are doing


What are we looking for?



  • You have a great understanding of modern web architectures, distributed systems, data processing and storage solutions

  • You have strong knowledge of relational databases and SQL

  • You are able to find opportunities for data acquisition and new uses for existing data

  • You enjoy working with multiple stakeholders and feel comfortable prioritizing internal and external requirements

  • You love teamwork

  • You speak English with professional working proficiency (Swedish is not a requirement)


It would also be music to our ears if:



  • You have experience in one or more of the following: Python, AWS ecosystem, Google Cloud Platform

  • You have experience working with data processing and querying engines such as Spark, Hadoop, BigQuery or Kafka (a short pipeline sketch follows this list)

  • You have experience with multiple storage solutions, such as column-oriented databases, NoSQL or graph databases
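As a hedged sketch of the kind of batch pipeline described above, the snippet below aggregates daily play counts per track from raw event files with PySpark. The paths and column names are invented placeholders, not Epidemic Sound's actual schema:

    # Hypothetical sketch: aggregate daily play counts per track from raw
    # JSON events. Paths and column names are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily-track-plays").getOrCreate()

    plays = spark.read.json("gs://example-raw-events/plays/")        # placeholder input

    daily_counts = (
        plays
        .withColumn("play_date", F.to_date("played_at"))             # assumed timestamp column
        .groupBy("play_date", "track_id")                            # assumed track identifier
        .agg(F.count("*").alias("play_count"))
    )

    daily_counts.write.mode("overwrite").partitionBy("play_date").parquet(
        "gs://example-warehouse/daily_track_plays/"                  # placeholder output
    )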


Application


Do you want to be a part of our fantastic team? Send us your cover letter and CV as soon as possible - interviews are held continuously. We look forward to hearing from you!

Acxiom
  • Austin, TX
As a Senior Hadoop Administrator, you will assist leadership with projects related to Big Data technologies and software development support for client research projects. You will analyze the latest Big Data analytic technologies and their innovative applications in both business intelligence analysis and new service offerings, and bring these insights and best practices to Acxiom's Big Data projects. You are able to benchmark systems, analyze system bottlenecks and propose solutions to eliminate them. You will develop a highly scalable and extensible Big Data platform that enables the collection, storage, modeling, and analysis of massive data sets from numerous channels. You are also a self-starter, able to continuously evaluate new technologies, innovate and deliver solutions for business-critical applications.


What you will do:


  • Responsible for implementation and ongoing administration of Hadoop infrastructure
  • Provide technical leadership and collaborate with the engineering organization; develop key deliverables for the Data Platform Strategy: scalability, optimization, operations, availability, and roadmap
  • Lead the platform architecture and drive it to the next level of effectiveness to support current and future requirements
  • Perform cluster maintenance, including creation and removal of nodes, using tools such as Cloudera Manager Enterprise
  • Tune the performance of Hadoop clusters and Hadoop MapReduce routines
  • Monitor Hadoop cluster job performance and handle capacity planning (see the monitoring sketch after this list)
  • Help optimize and integrate new infrastructure via continuous integration methodologies (DevOps, Chef)
  • Review and manage Hadoop log files with the help of log management technologies (ELK)
  • Provide top-level technical help desk support for the application developers
  • Team diligently with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality, availability and security
  • Collaborate with application teams to perform Hadoop updates, patches and version upgrades when required
  • Mentor Hadoop engineers and administrators
  • Work with vendor support teams on support tasks
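As a hedged illustration of the monitoring and capacity-planning duties above, the short Python helper below polls the YARN ResourceManager REST API for basic cluster metrics. The host, port, and alert threshold are placeholders; the field names follow the publicly documented /ws/v1/cluster/metrics response:

    # Hypothetical monitoring helper: poll the YARN ResourceManager REST API
    # for basic capacity metrics. Host, port and threshold are placeholders.
    import requests

    RM_URL = "http://resourcemanager.example.com:8088"   # placeholder ResourceManager

    def cluster_metrics():
        resp = requests.get(f"{RM_URL}/ws/v1/cluster/metrics", timeout=10)
        resp.raise_for_status()
        return resp.json()["clusterMetrics"]

    if __name__ == "__main__":
        m = cluster_metrics()
        used_pct = 100.0 * m["allocatedMB"] / max(m["totalMB"], 1)
        print(f"active nodes: {m['activeNodes']}, running apps: {m['appsRunning']}")
        print(f"memory allocated: {used_pct:.1f}%")
        if used_pct > 85:   # placeholder alert threshold
            print("WARNING: cluster memory allocation above 85%")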


Do you have?


  • Bachelor's degree in a related field of study, or equivalent experience
  • 6+ years of Big Data Administration Experience
  • Extensive knowledge of and hands-on experience with Hadoop-based data manipulation/storage technologies such as HDFS, MapReduce, YARN, Spark/Kafka, HBase, Hive, Pig, Impala, R and Sentry/Ranger/Knox
  • Experience in capacity planning, cluster designing and deployment, troubleshooting and performance tuning
  • Experience supporting Data Science teams and Analytics teams on complex code deployment, debugging and performance optimization problems
  • Strong operational expertise, including excellent troubleshooting skills and an understanding of system capacity, bottlenecks and core resource utilization (CPU, OS, storage and network)
  • Experience in Hadoop cluster migrations or upgrades
  • Strong scripting skills in Perl, Python, shell scripting, and/or Ruby on Rails
  • Linux/SAN administration skills and RDBMS/ETL knowledge
  • Solid experience with Cloudera, Hortonworks and/or MapR distributions, along with monitoring/alerting tools (Nagios, Ganglia, Zenoss, Cloudera Manager)
  • Strong problem solving and critical thinking skills
  • Excellent verbal and written communication skills


What will set you apart:


  • Solid understanding of and hands-on experience with Big Data on private/public cloud technologies (AWS/GCP/Azure)
  • DevOps experience (Chef, Puppet and Ansible)
  • Strong knowledge of Java/J2EE and other web technologies

 
TalentBurst, an Inc 5000 company
  • Austin, TX

Position: Hadoop Administrator

Location: Austin, TX

Duration: 12+ months

Interview: Skype

Job Description:
The person in this role will work 24x7 shifts (US hours), providing Hadoop platform support and performing administrative tasks on production Hadoop clusters.

Must have skills:
3+ years of hands-on administration experience with large distributed Hadoop systems on Linux

Technical knowledge of YARN, MapReduce, HDFS, HBase, ZooKeeper, Pig and Hive

Hands-on experience as a Linux sysadmin

Nice to have skills:
Knowledge of Spark and Kafka is a plus; Hadoop certification is preferred

Roles and responsibilities:
Hadoop cluster set up, performance fine-tuning, monitoring and administration

Skill Requirements:
Minimum of 3 years of hands-on experience with large distributed Hadoop systems on Linux.
Strong technical knowledge of the Hadoop ecosystem, including YARN, MapReduce, HDFS, HBase, ZooKeeper, Pig, and Hive.
Hands-on experience in Hadoop cluster setup, performance fine-tuning, monitoring, and administration.
Hands-on experience as a Linux sysadmin.
Knowledge of Spark and Kafka is a plus.
Hadoop certification is preferred.

Expedia, Inc.
  • Bellevue, WA

We are seeking a deeply experienced technical leader to lead the next generation of engineering investments and culture for the GCO Customer Care Platform (CCP). The technical leader in this role will help design, engineer, and drive implementation of critical pieces of the EG-wide architecture (platform and applications) for customer care - these areas include, but are not limited to, unified voice support, partner onboarding with configurable rules, a virtual agent programming model for all partners, and intelligent fulfillment. In addition, a key focus of this leader's role will be to grow and mentor junior software engineers in GCO with a focus on building out a '2020 world-class engineering excellence' vision / culture.


What you’ll do:



  • Deep Technology Leadership (Design, Implementation, and Execution for the following):

  • Ship a next-gen EG-wide architecture (platform and applications) that enables 90% of automated self-service journeys with voice as a first-class channel from day zero

  • Design and ship a VA (Virtual Agent) Programming Model that enables partners to stand up intelligent virtual agents on CCP declaratively in minutes

  • Enable brand partners to onboard their own identity providers onto CCP

  • Enable partners to configure their workflows and business rules for their Virtual Agents

  • Programming Model for Intelligent actions in the Fulfillment layer

  • Integration of Context and Query as first-class entities into the Virtual Agent

  • Cross-Group Collaboration and Influence

  • Work with company-wide initiatives across AI Labs and BeX to build out a best-of-breed conversational platform for EG-wide apps

  • Engage with and translate internal and external partner requirements into platform investments for effective onboarding of customers

  • Represent GCO's Technical Architecture at senior leadership meetings (eCP and EG) to influence and bring back enhancements to improve CCP



Help land GCO 2020 Engineering and Operational Excellence Goals

Mentor junior developers on platform engineering excellence dimensions (re-usable patterns, extensibility, configurability, scalability, performance, and design / implementation of core platform pieces)

Help develop a level of engineering muscle across GCO that becomes an asset for EG (as a provider of platform service as well as for talent)

Who you are:



  • BS or MS in Computer Science

  • 20 years of experience designing and developing complex, mission-critical, distributed software systems on a variety of platforms in high-tech industries

  • Hands on experience in designing, developing, and delivering (shipping) V1 (version one) MVP enterprise software products and solutions in a technical (engineering and architecture) capacity

  • Experience building strong relationships with technology partners and customers, and getting closure on issues, including delivering on time and to specification

  • Skills: Linux/Windows/VMS, Scala, Java, Python, C#, C++, Object Oriented Design (OOD), Spark, Kafka, REST/Web Services, Distributed Systems, Reliable and scalable transaction processing systems (HBase, Microsoft SQL, Oracle, Rdb)

  • Nice to have: Experience in building highly scalable real-time processing platforms that host machine learning algorithms for guided/prescriptive learning

  • Identifies and solves problems at the company level while influencing product lines

  • Provides technical leadership in difficult times or serious crises

  • Key strategic player in long-term business strategy and vision

  • Recognized as an industry expert and a mentor and leader at the company; provides strategic influence across groups, projects, and products

  • Provides long term product strategy and vision through group level efforts

  • Drive for results: Sought out to lead company-wide initiatives that deliver cross-cutting lift to the organization; provides leadership in a crisis and is a key player in long-term business strategy and vision

  • Technical/Functional skills: Proves credentials as an industry expert by inventing and delivering transformational technology/direction, and helps drive change beyond the company and across the industry

  • Has the vision to impact long-term product/technology horizon to transform the entire industry



Why join us:

Expedia Group recognizes our success is dependent on the success of our people.  We are the world's travel platform, made up of the most knowledgeable, passionate, and creative people in our business.  Our brands recognize the power of travel to break down barriers and make people's lives better – that responsibility inspires us to be the place where exceptional people want to do their best work, and to provide them the tools to do so. 


Whether you're applying to work in engineering or customer support, marketing or lodging supply, at Expedia Group we act as one team, working towards a common goal; to bring the world within reach.  We relentlessly strive for better, but not at the cost of the customer.  We act with humility and optimism, respecting ideas big and small.  We value diversity and voices of all volumes. We are a global organization but keep our feet on the ground, so we can act fast and stay simple.  Our teams also have the chance to give back on a local level and make a difference through our corporate social responsibility program, Expedia Cares.


If you have a hunger to make a difference with one of the most loved consumer brands in the world and to work in the dynamic travel industry, this is the job for you.


Our family of travel brands includes: Brand Expedia®, Hotels.com®, Expedia® Partner Solutions, Egencia®, trivago®, HomeAway®, Orbitz®, Travelocity®, Wotif®, lastminute.com.au®, ebookers®, CheapTickets®, Hotwire®, Classic Vacations®, Expedia® Media Solutions, CarRentals.com™, Expedia Local Expert®, Expedia® CruiseShipCenters®, SilverRail Technologies, Inc., ALICE and Traveldoo®.



Expedia is committed to creating an inclusive work environment with a diverse workforce.   All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.  This employer participates in E-Verify. The employer will provide the Social Security Administration (SSA) and, if necessary, the Department of Homeland Security (DHS) with information from each new employee's I-9 to confirm work authorization.

Ultra Tendency
  • Berlin, Deutschland

Big Data Software Engineer


Lead your own development team and our customers to success! Ultra Tendency is looking for someone who convinces not just by writing excellent code, but also through strong presence and leadership. 


At Ultra Tendency you would:



  • Work in our office in Berlin/Magdeburg and on-site at our customer's offices

  • Make Big Data useful (build program code, test and deploy to various environments, design and optimize data processing algorithms for our customers)

  • Develop outstanding Big Data applications following the latest trends and methodologies

  • Be a role model and strong leader for your team and oversee the big picture

  • Prioritize tasks efficiently, evaluating and balancing the needs of all stakeholders


Ideally you have:



  • Strong experience in developing software using Python, Scala or a comparable language

  • Proven experience with data ingestion, analysis, integration, and design of Big Data applications using Apache open-source technologies

  • Profound knowledge of data engineering technologies, e.g. Kafka, Spark, HBase, Kubernetes

  • Strong background in developing on Linux

  • Solid computer science fundamentals (algorithms, data structures and programming skills in distributed systems)

  • Languages: Fluent English; German is a plus


We offer:



  • Fascinating tasks and unique Big Data challenges of major players from various industries (automotive, insurance, telecommunication, etc.)

  • Fair pay and bonuses

  • Work with our open-source Big Data gurus, such as our Apache HBase committer and Release Manager

  • International diverse team

  • Possibility to work with the open-source community and become a contributor

  • Work with cutting edge equipment and tools


Confidentiality guaranteed

eBay
  • Kleinmachnow, Germany

About the team:



Core Product Technology (CPT) is a global team responsible for the end-to-end eBay product experience and technology platform. In addition, we are working on the strategy and execution of our payments initiative, transforming payments management on our Marketplace platform which will significantly improve the overall customer experience.


The opportunity

At eBay, we have started a new chapter in our iconic internet history of being the largest online marketplace in the world. With more than 1 billion listings (more than 80% of them selling new items) in over 400 markets, eBay is providing a robust platform where merchants of all sizes can compete and win. Every single day millions of users come to eBay to search for items in our diverse inventory of over a billion items.



eBay is starting a new Applied Research team in Germany and we are looking for a senior technologist to join the team. We’re searching for a hands-on person with an applied research background and strong knowledge in machine learning and natural language processing (NLP). The German team’s mission is to improve the German and other European language search experience as well as to enhance our global search platform and machine-learned ranking systems in partnership with our existing teams in San Jose, California and Shanghai, China.



This team will help customers find what they’re shopping for by developing full-stack solutions from indexing, to query serving and applied research to solve core ranking, query understanding and recall problems in our highly dynamic marketplace. The global search team works closely with the product management and quality engineering teams along with the Search Web and Native Front End and Search services, and Search Mobile. We build our systems using C++, Scala, Java, Hadoop/Spark/HBase, TensorFlow/Caffe, Kafka and other standard technologies. The team believes in agile development with autonomous and empowered teams.
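Purely as a hedged illustration of the machine-learned ranking work described above (this is not eBay's actual model or feature set), a minimal TensorFlow/Keras sketch of a pointwise relevance scorer over hand-crafted query/item features could look like the following; the feature count, labels, and training data are invented for the example.

```python
# Illustrative pointwise ranking model in TensorFlow/Keras.
# Features, labels, and sizes are synthetic; real learned-ranking systems use
# far richer query/item signals and often pairwise or listwise objectives.
import numpy as np
import tensorflow as tf

NUM_FEATURES = 16  # hypothetical query/item match features

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(NUM_FEATURES,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # relevance score in [0, 1]
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["AUC"])

# Toy training data: random features with binary click labels.
X = np.random.rand(1000, NUM_FEATURES).astype("float32")
y = (np.random.rand(1000) > 0.5).astype("float32")
model.fit(X, y, epochs=3, batch_size=64, verbose=0)

# At query time, score candidate items and return them best-first.
candidates = np.random.rand(5, NUM_FEATURES).astype("float32")
scores = model.predict(candidates, verbose=0).ravel()
print("Ranking (best first):", np.argsort(-scores))
```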






Diversity and inclusion at eBay goes well beyond a moral necessity – it’s the foundation of our business model and absolutely critical to our ability to thrive in an increasingly competitive global landscape. To learn about eBay’s Diversity & Inclusion click here: https://www.ebayinc.com/our-company/diversity-inclusion/.
Man AHL
  • London, UK

The Role


As a Quant Platform Developer at AHL you will be building the tools, frameworks, libraries and applications which power our Quantitative Research and Systematic Trading. This includes responsibility for the continued success of “Raptor”, our in-house Quant Platform, next generation Data Engineering, and evolution of our production Trading System as we continually expand the markets and types of assets we trade, and the styles in which we trade them. Your challenges will be varied and might involve building new high performance data acquisition and processing pipelines, cluster-computing solutions, numerical algorithms, position management systems, visualisation and reporting tools, operational user interfaces, continuous build systems and other developer productivity tools.


The Team


Quant Platform Developers at AHL are all part of our broader technology team, members of a group of over sixty individuals representing eighteen nationalities. We have varied backgrounds including Computer Science, Mathematics, Physics, Engineering – even Classics - but what unifies us is a passion for technology and writing high-quality code.



Our developers are organised into small cross-functional teams, with our engineering roles broadly of two kinds: “Quant Platform Developers” otherwise known as our “Core Techs”, and “Quant Developers” which we often refer to as “Sector Techs”. We use the term “Sector Tech” because some of our teams are aligned with a particular asset class or market sector. People often rotate teams in order to learn more about our system, as well as find the position that best matches their interests.


Our Technology


Our systems are almost all running on Linux and most of our code is in Python, with the full scientific stack: numpy, scipy, pandas, scikit-learn to name a few of the libraries we use extensively. We implement the systems that require the highest data throughput in Java. For storage, we rely heavily on MongoDB and Oracle.
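As a small, hedged illustration of that scientific stack in use (a toy example, not AHL's actual research code), the snippet below builds a synthetic daily price series with NumPy and computes a simple moving-average crossover signal with pandas; the window lengths and data are invented.

```python
# Toy moving-average crossover signal on a synthetic price series, using the
# NumPy/pandas stack mentioned above. All data and parameters are invented.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
dates = pd.date_range("2023-01-01", periods=250, freq="B")
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0, 0.01, len(dates)))),
                   index=dates, name="price")

fast = prices.rolling(20).mean()
slow = prices.rolling(60).mean()
signal = np.sign(fast - slow).fillna(0.0)           # +1 long, -1 short, 0 during warm-up
daily_pnl = signal.shift(1) * prices.pct_change()   # trade on the prior day's signal

sharpe = np.sqrt(252) * daily_pnl.mean() / daily_pnl.std()
print(f"Toy annualised Sharpe: {sharpe:.2f}")
```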



We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log shipping and monitoring, Docker for containerisation, OpenStack for our private cloud, Ansible for architecture automation, and HipChat for internal communication. But our technology list is never static: we constantly evaluate new tools and libraries.


Working Here


AHL has a small company, no-attitude feel. It is flat structured, open, transparent and collaborative, and you will have plenty of opportunity to grow and have enormous impact on what we do.  We are actively engaged with the broader technology community.



  • We host and sponsor London’s PyData and Machine Learning Meetups

  • We open-source some of our technology. See https://github.com/manahl

  • We regularly talk at leading industry conferences, and tweet about relevant technology and how we’re using it. See @manahltech



We’re fortunate enough to have a fantastic open-plan office overlooking the River Thames, and continually strive to make our environment a great place in which to work.



  • We organise regular social events, everything from photography through climbing, karting, wine tasting and monthly team lunches

  • We have annual away days and off-sites for the whole team

  • We have a canteen with a daily allowance for breakfast and lunch, and an on-site bar for the evenings

  • As well as PCs and Macs, in our office you’ll also find numerous pieces of cool tech such as light cubes and 3D printers, guitars, ping-pong and table-football, and a piano.



We offer competitive compensation, a generous holiday allowance, various health and other flexible benefits. We are also committed to continuous learning and development via coaching, mentoring, regular conference attendance and sponsoring academic and professional qualifications.


Technology and Business Skills


At AHL we strive to hire only the brightest and best and most highly skilled and passionate technologists.



Essential



  • Exceptional technology skills; recognised by your peers as an expert in your domain

  • A proponent of strong collaborative software engineering techniques and methods: agile development, continuous integration, code review, unit testing, refactoring and related approaches

  • Expert knowledge in one or more programming languages, preferably Python, Java and/or C/C++

  • Proficient on Linux platforms with knowledge of various scripting languages

  • Strong knowledge of one or more relevant database technologies e.g. Oracle, MongoDB

  • Proficient with a range of open source frameworks and development tools e.g. NumPy/SciPy/Pandas, Pyramid, AngularJS, React

  • Familiarity with a variety of programming styles (e.g. OO, functional) and in-depth knowledge of design patterns.



Advantageous



  • An excellent understanding of financial markets and instruments

  • Experience of front office software and/or trading systems development e.g. in a hedge fund or investment bank

  • Expertise in building distributed systems with service-based or event-driven architectures, and concurrent processing

  • A knowledge of modern practices for data engineering and stream processing

  • An understanding of financial market data collection and processing

  • Experience of web based development and visualisation technology for portraying large and complex data sets and relationships

  • Relevant mathematical knowledge e.g. statistics, asset pricing theory, optimisation algorithms.


Personal Attributes



  • Strong academic record and a degree with high mathematical and computing content e.g. Computer Science, Mathematics, Engineering or Physics from a leading university

  • Craftsman-like approach to building software; takes pride in engineering excellence and instils these values in others

  • Demonstrable passion for technology e.g. personal projects, open-source involvement

  • Intellectually robust with a keenly analytic approach to problem solving

  • Self-organised with the ability to effectively manage time across multiple projects and with competing business demands and priorities

  • Focused on delivering value to the business with relentless efforts to improve process

  • Strong interpersonal skills; able to establish and maintain a close working relationship with quantitative researchers, traders and senior business people alike

  • Confident communicator; able to argue a point concisely and deal positively with conflicting views.

Comcast
  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

As a Data Science Engineer in Comcastdx, you will research, model, develop, support data pipelines and deliver insights for key strategic initiatives. You will develop or utilize complex programmatic and quantitative methods to find patterns and relationships in data sets; lead statistical modeling, or other data-driven problem-solving analysis to address novel or abstract business operation questions; and incorporate insights and findings into a range of products.

Assist in design and development of collection and enrichment components focused on quality, timeliness, scale and reliability. Work on real-time data stores and a massive historical data store using best-of-breed and industry leading technology.

Responsibilities:

-Develop and support data pipelines

-Analyze massive amounts of data, both in real time and in batch, utilizing Spark, Kafka, and AWS technologies such as Kinesis, S3, Elasticsearch, and Lambda

-Create detailed write-ups of the processes used, logic applied, and methodologies used for creation, validation, analysis, and visualizations. Write-ups shall be produced initially, within a week of when a process is created, and updated in writing when changes occur.

-Prototype ideas for new ML/AI tools, products and services

-Centralize data collection and synthesis, including survey data, enabling strategic and predictive analytics to guide business decisions

-Provide expert and professional data analysis to implement effective and innovative solutions meshing disparate data types to discover insights and trends.

-Employ rigorous continuous delivery practices managed under an agile software development approach

-Support DevOps practices to deploy and operate our systems

-Automate and streamline our operations and processes

-Troubleshoot and resolve issues in our development, test and production environments

Here are some of the specific technologies and concepts we use:

-Spark Core and Spark Streaming

-Machine learning techniques and algorithms

-Java, Scala, Python, R

-Artificial Intelligence

-AWS services including EMR, S3, Lambda, ElasticSearch

-Predictive Analytics

-Tableau, Kibana

-Git, Maven, Jenkins

-Linux

-Kafka

-Hadoop (HDFS, YARN)
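To make the streaming pieces in the list above a little more concrete, here is a minimal, hedged PySpark Structured Streaming sketch that reads JSON events from Kafka and keeps per-minute counts; the broker address, topic name, and event schema are placeholders, not Comcast's.

```python
# Minimal sketch: consume JSON events from Kafka with Spark Structured Streaming
# and maintain per-minute counts by event type. Broker, topic, and schema are
# placeholders for the example.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

schema = StructType([
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "events")                       # placeholder topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

counts = (
    events.withWatermark("event_time", "10 minutes")
    .groupBy(F.window("event_time", "1 minute"), "event_type")
    .count()
)

query = counts.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```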

Skills & Requirements:

-5-8 years of Java experience, Scala and Python experience a plus

-3+ years of experience as an analyst, data scientist, or related quantitative role.

-3+ years of relevant quantitative and qualitative research and analytics experience. Solid knowledge of statistical techniques.

-Bachelor's in Statistics, Math, Engineering, Computer Science, or a related discipline. Master's degree preferred.

-Experience in software development of large-scale distributed systems including proven track record of delivering backend systems that participate in a complex ecosystem

-Experience with more advanced modeling techniques (e.g., ML)

-Distinctive problem solving and analysis skills and impeccable business judgement.

-Experience working with imperfect data sets that, at times, will require improvements to process, definition and collection

-Experience with real-time data pipelines and components including Kafka, Spark Streaming

-Proficient in Unix/Linux environments

-Test-driven development/test automation, continuous integration, and deployment automation

-Excellent communicator, able to analyze and clearly articulate complex issues and technologies understandably and engagingly

-Team player is a must

-Great design and problem-solving skills

-Adaptable, proactive and willing to take ownership

-Attention to detail and high level of commitment

-Thrives in a fast-paced agile environment

About Comcastdx:

Comcastdx is a results-driven big data engineering team responsible for delivery of the multi-tenant data infrastructure and platforms necessary to support our data-driven culture and organization. dx has an overarching objective to gather, organize, and make sense of Comcast data with the intention to reveal business and operational insight, discover actionable intelligence, enable experimentation, empower users, and delight our stakeholders. Members of the dx team define and leverage industry best practices, work on large-scale data problems, design and develop resilient and highly robust distributed data organizing and processing systems and pipelines, and research, engineer, and apply data science and machine intelligence disciplines.

Comcast is an EOE/Veterans/Disabled/LGBT employer

Comcast
  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

As a Data Science Engineer in Comcastdx, you will research, model, develop, support data pipelines and deliver insights for key strategic initiatives. You will develop or utilize complex programmatic and quantitative methods to find patterns and relationships in data sets; lead statistical modeling, or other data-driven problem-solving analysis to address novel or abstract business operation questions; and incorporate insights and findings into a range of products.

Assist in design and development of collection and enrichment components focused on quality, timeliness, scale and reliability. Work on real-time data stores and a massive historical data store using best-of-breed and industry leading technology.

Responsibilities:

-Develop and support data pipelines

-Analyze massive amounts of data, both in real time and in batch, utilizing Spark, Kafka, and AWS technologies such as Kinesis, S3, Elasticsearch, and Lambda

-Create detailed write-ups of the processes used, logic applied, and methodologies used for creation, validation, analysis, and visualizations. Write-ups shall be produced initially, within a week of when a process is created, and updated in writing when changes occur.

-Prototype ideas for new ML/AI tools, products and services

-Centralize data collection and synthesis, including survey data, enabling strategic and predictive analytics to guide business decisions

-Provide expert and professional data analysis to implement effective and innovative solutions meshing disparate data types to discover insights and trends.

-Employ rigorous continuous delivery practices managed under an agile software development approach

-Support DevOps practices to deploy and operate our systems

-Automate and streamline our operations and processes

-Troubleshoot and resolve issues in our development, test and production environments

Here are some of the specific technologies and concepts we use:

-Spark Core and Spark Streaming

-Machine learning techniques and algorithms

-Java, Scala, Python, R

-Artificial Intelligence

-AWS services including EMR, S3, Lambda, ElasticSearch

-Predictive Analytics

-Tableau, Kibana

-Git, Maven, Jenkins

-Linux

-Kafka

-Hadoop (HDFS, YARN)

Skills & Requirements:

-3-5 years of Java experience, Scala and Python experience a plus

-2+ years of experience as an analyst, data scientist, or related quantitative role.

-2+ years of relevant quantitative and qualitative research and analytics experience. Solid knowledge of statistical techniques.

-Bachelor's in Statistics, Math, Engineering, Computer Science, or a related discipline. Master's degree preferred.

-Experience in software development of large-scale distributed systems including proven track record of delivering backend systems that participate in a complex ecosystem

-Experience with more advanced modeling techniques (e.g., ML)

-Distinctive problem solving and analysis skills and impeccable business judgement.

-Experience working with imperfect data sets that, at times, will require improvements to process, definition and collection

-Experience with real-time data pipelines and components including Kafka, Spark Streaming

-Proficient in Unix/Linux environments

-Test-driven development/test automation, continuous integration, and deployment automation

-Excellent communicator, able to analyze and clearly articulate complex issues and technologies understandably and engagingly

-Team player is a must

-Great design and problem-solving skills

-Adaptable, proactive and willing to take ownership

-Attention to detail and high level of commitment

-Thrives in a fast-paced agile environment

About Comcastdx:

Comcastdx is a results-driven big data engineering team responsible for delivery of the multi-tenant data infrastructure and platforms necessary to support our data-driven culture and organization. dx has an overarching objective to gather, organize, and make sense of Comcast data with the intention to reveal business and operational insight, discover actionable intelligence, enable experimentation, empower users, and delight our stakeholders. Members of the dx team define and leverage industry best practices, work on large-scale data problems, design and develop resilient and highly robust distributed data organizing and processing systems and pipelines, and research, engineer, and apply data science and machine intelligence disciplines.

Comcast is an EOE/Veterans/Disabled/LGBT employer

Infosys
  • Houston, TX
Responsibilities

-Hands-on experience with Big Data systems, building ETL pipelines, data processing, and analytics tools
-Understanding of data structures and common methods in data transformation.
-Familiarity with the concepts of dimensional modeling.
-Sound knowledge of one programming language - Python or Java
-Programming experience using tools such as Hadoop and Spark.
-Strong proficiency in using query languages such as SQL, Hive, and Spark SQL (see the brief sketch after this list)
-Experience in Kafka & Scala would be a plus
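As a brief, hedged sketch of the Spark SQL proficiency listed above (table, columns, and data are invented for the example):

```python
# Illustrative Spark SQL usage: register a DataFrame as a temp view and query it.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sparksql-sketch").getOrCreate()

orders = spark.createDataFrame(
    [("2024-01-01", "electronics", 120.0),
     ("2024-01-01", "grocery", 35.5),
     ("2024-01-02", "electronics", 80.0)],
    ["order_date", "category", "amount"],
)
orders.createOrReplaceTempView("orders")

daily_totals = spark.sql("""
    SELECT order_date, category, SUM(amount) AS total_amount
    FROM orders
    GROUP BY order_date, category
    ORDER BY order_date, category
""")
daily_totals.show()
```

The same aggregation could be expressed against a Hive table with HiveQL; the temp view here simply keeps the sketch self-contained.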

118118Money
  • Austin, TX

Seeking an individual with a keen eye for good design combined with the ability to communicate those designs through informative design artifacts. Candidates should be familiar with an Agile development process (and understand its limitations) and be able to mediate between product/business needs and developer architectural needs. They should be ready to get their hands dirty coding complex pieces of the overall architecture.

We are .NET Core on the backend, Angular 2 on a mobile web front-end, and native on Android and iOS. We host our code across AWS and on-premises VMs, and use various data backends (SQL Server, Oracle, Mongo).

Very important is interest in (and hopefully, experience with) modern big data pipelines and machine learning. Experience with streaming platforms feeding Apache Spark jobs that train machine learning models would be music to our ears. Financial platforms generate massive amounts of data, and re-architecting aspects of our microservices to support that will be a key responsibility.
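As a hedged sketch of the kind of pipeline hinted at above - streaming events landing somewhere durable and a Spark job training a model on them - the batch-training half might look like the snippet below. It is written in Python/PySpark purely for brevity rather than the team's .NET stack, and the Parquet path, feature columns, and label are invented for the example.

```python
# Illustrative sketch: train a simple risk model on features that an upstream
# streaming job (e.g. Kafka -> Spark) has already written to Parquet.
# The path, feature columns, and label are hypothetical.
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.feature import VectorAssembler
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("risk-model-sketch").getOrCreate()

df = spark.read.parquet("s3://example-bucket/loan_features/")  # placeholder path

assembler = VectorAssembler(
    inputCols=["income", "utilisation", "payment_delays"],  # hypothetical features
    outputCol="features",
)
train = assembler.transform(df).select("features", "defaulted")

model = LogisticRegression(labelCol="defaulted").fit(train)
print("Training AUC:", model.summary.areaUnderROC)
```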

118118 Money is a private financial services company with R&D headquartered in Austin along highway 360, in front of the Bull Creek Nature preserve. We have offices around the world, so the candidate should be open to occasional travel abroad. The atmosphere is casual, and has a startup feel. You will see your software creations deployed quickly.

Responsibilities

    • Help us to build a big data pipeline and add machine learning capability to more areas of our platform.
    • Manage code from development through deployment, including support and maintenance.
    • Perform code reviews, assist and coach more junior developers to adhere to proper design patterns.
    • Build fault-tolerant distributed systems.

Requirements

    • Expertise in .NET, C#, HTML5, CSS3, Javascript
    • Experience with some flavor of ASP.NET MVC
    • Experience with SQL Server
    • Expertise in the design of elegant and intuitive REST APIs.
    • Cloud development experience (Amazon, Azure, etc)
    • Keen understanding of security principles as they pertain to service design.
    • Expertise in object-oriented design principles.

Desired

    • Machine Learning experience
    • Mobile development experience
    • Kafka / message streaming experience
    • Apache Spark experience
    • Knowledge of the ins and outs of Docker containers
    • Experience with MongoDB
FCA Fiat Chrysler Automobiles
  • Detroit, MI

Fiat Chrysler Automobiles is looking to fill the full-time position of a Data Scientist. This position is responsible for delivering insights to the commercial functions in which FCA operates.


The Data Scientist is a role in the Business Analytics & Data Services (BA) department and reports through the CIO. They will play a pivotal role in the planning, execution, and delivery of data science and machine learning-based projects. The bulk of the work will be in the areas of data exploration and preparation, data collection and integration, machine learning (ML) and statistical modelling, and data pipelining and deployment.

The newly hired data scientist will be a key interface between the ICT Sales & Marketing team, the Business and the BA team. Candidates need to be very much self-driven, curious and creative.

Primary Responsibilities:

    • Problem Analysis and Project Management:
      • Guide and inspire the organization about the business potential and strategy of artificial intelligence (AI)/data science
      • Identify data-driven/ML business opportunities
      • Collaborate across the business to understand IT and business constraints
      • Prioritize, scope and manage data science projects and the corresponding key performance indicators (KPIs) for success
    • Data Exploration and Preparation:
      • Apply statistical analysis and visualization techniques to various data, such as hierarchical clustering, t-distributed Stochastic Neighbor Embedding (t-SNE), and principal components analysis (PCA); a brief illustrative sketch follows this list
      • Generate and test hypotheses about the underlying mechanics of the business process.
      • Network with domain experts to better understand the business mechanics that generated the data.
    • Data Collection and Integration:
      • Understand new data sources and process pipelines. Catalog and document their use in solving business problems.
      • Create data pipelines and assets that enable more efficiency and repeatability of data science activities.
    • Machine Learning and Statistical Modelling:
      • Apply various ML and advanced analytics techniques to perform classification or prediction tasks
      • Integrate domain knowledge into the ML solution; for example, from an understanding of financial risk, customer journey, quality prediction, sales, marketing
      • Testing of ML models, such as cross-validation, A/B testing, bias and fairness
    • Operationalization:
      • Collaborate with ML operations (MLOps), data engineers, and IT to evaluate and implement ML deployment options
      • (Help to) integrate model performance management tools into the current business infrastructure
      • (Help to) implement champion/challenger test (A/B tests) on production systems
      • Continuously monitor execution and health of production ML models
      • Establish best practices around ML production infrastructure
    • Other Responsibilities:
      • Train other business and IT staff on basic data science principles and techniques
      • Train peers on specialist data science topics
      • Promote collaboration with the data science COE within the organization.
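As a brief, hedged illustration of the exploration techniques named in the responsibilities above (PCA, t-SNE, hierarchical clustering), a scikit-learn sketch on synthetic data might look like this; the data, cluster count, and perplexity are invented for the example.

```python
# Illustrative exploration sketch: PCA, t-SNE, and hierarchical clustering on
# synthetic data with scikit-learn. All data and parameters are invented.
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(100, 10)) for c in (0, 4, 8)])
X_scaled = StandardScaler().fit_transform(X)

# Linear projection: variance explained by the first two components.
pca = PCA(n_components=2).fit(X_scaled)
print("Explained variance ratio:", pca.explained_variance_ratio_)

# Non-linear embedding for visualisation.
embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X_scaled)

# Hierarchical clustering on the embedded points.
labels = AgglomerativeClustering(n_clusters=3).fit_predict(embedding)
print("Cluster sizes:", np.bincount(labels))
```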

Basic Qualifications:

    • A bachelor's degree in computer science, data science, operations research, statistics, applied mathematics, or a related quantitative field is required; equivalent experience and education in areas such as economics, engineering, or physics is acceptable. Experience in more than one area is strongly preferred.
    • Candidates should have three to six years of relevant project experience in successfully launching, planning, and executing data science projects, preferably in the domains of automotive or customer behavior prediction.
    • Coding knowledge and experience in several languages: for example, R, Python, SQL, Java, C++, etc.
    • Experience of working across multiple deployment environments including cloud, on-premises and hybrid, multiple operating systems and through containerization techniques such as Docker, Kubernetes, AWS Elastic Container Service, and others.
    • Experience with distributed data/computing and database tools: MapReduce, Hadoop, Hive, Kafka, MySQL, Postgres, DB2 or Greenplum, etc.
    • All candidates must be self-driven, curious and creative.
    • They must demonstrate the ability to work in diverse, cross-functional teams.
    • Should be confident, energetic self-starters, with strong moderation and communication skills.

Preferred Qualifications:

    • A master's degree or PhD in statistics, ML, computer science or the natural sciences, especially physics or any engineering disciplines or equivalent.
    • Experience in one or more of the following commercial/open-source data discovery/analysis platforms: RStudio, Spark, KNIME, RapidMiner, Alteryx, Dataiku, H2O, SAS Enterprise Miner (SAS EM) and/or SAS Visual Data Mining and Machine Learning, Microsoft AzureML, IBM Watson Studio or SPSS Modeler, Amazon SageMaker, Google Cloud ML, SAP Predictive Analytics.
    • Knowledge and experience in statistical and data mining techniques: generalized linear model (GLM)/regression, random forest, boosting, trees, text mining, hierarchical clustering, deep learning, convolutional neural network (CNN), recurrent neural network (RNN), T-distributed Stochastic Neighbor Embedding (t-SNE), graph analysis, etc.
    • A specialization in text analytics, image recognition, graph analysis or other specialized ML techniques such as deep learning, etc., is preferred.
    • Ideally, the candidates are adept in agile methodologies and well-versed in applying DevOps/MLOps methods to the construction of ML and data science pipelines.
    • Knowledge of industry standard BA tools, including Cognos, QlikView, Business Objects, and other tools that could be used for enterprise solutions
    • Should exhibit superior presentation skills, including storytelling and other techniques to guide, inspire, and explain analytics capabilities and techniques to the organization.