OnlyDataJobs.com

FlixBus
  • Berlin, Germany

Your Tasks – Paint the world green



  • Holistic cloud-based infrastructure automation

  • Distributed data processing clusters as well as data streaming platforms based on Kafka, Flink and Spark

  • Microservice platforms based on Docker

  • Development infrastructure and QA automation

  • Continuous Integration/Delivery/Deployment


Your Profile – Ready to hop on board



  • Experience in building and operating complex infrastructure

  • Expert-level: Linux, System Administration

  • Experience with Cloud Services, Expert-Level with either AWS or GCP  

  • Experience with server and operating-system-level virtualization is a strong plus, in particular practical experience with Docker and cluster technologies like Kubernetes, AWS ECS, OpenShift

  • Mindset: "Automate Everything", "Infrastructure as Code", "Pipelines as Code", "Everything as Code"

  • Hands-on experience with "Infrastructure as Code" tools: Terraform, CloudFormation, Packer (see the sketch at the end of this list)

  • Experience with provisioning / configuration management tools (Ansible, Chef, Puppet, Salt)

  • Experience designing, building and integrating systems for instrumentation, metrics/log collection, and monitoring: CloudWatch, Prometheus, Grafana, DataDog, ELK

  • At least basic knowledge in designing and implementing Service Level Agreements

  • Solid knowledge of Network and general Security Engineering

  • At least basic experience with systems and approaches for Test, Build and Deployment automation (CI/CD): Jenkins, TravisCI, Bamboo

  • At least basic hands-on DBA experience, experience with data backup and recovery

  • Experience with JVM-based build automation is a plus: Maven, Gradle, Nexus, JFrog Artifactory
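
The Infrastructure-as-Code bullet above is central to this role. Purely as an illustrative sketch (not part of the posting), a CloudFormation template can be expressed as Python code, here using the troposphere library; the resource names and tags are hypothetical.

```python
# Illustrative only: "Infrastructure as Code" with the troposphere library,
# which renders Python objects into a CloudFormation template.
# Bucket name and tags are hypothetical.
from troposphere import Template, Tags, Output, Ref
from troposphere.s3 import Bucket

template = Template()
template.set_description("Example data-platform bucket defined as code")

bucket = template.add_resource(
    Bucket(
        "RawEventsBucket",
        Tags=Tags(team="data-platform", managed_by="cloudformation"),
    )
)

template.add_output(Output("BucketName", Value=Ref(bucket)))

# The JSON below would be fed to CloudFormation (or reviewed in CI).
print(template.to_json())
```

Terraform or raw CloudFormation YAML would serve the same purpose; the point is that infrastructure changes go through code review and CI like any other code.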

Ventula Consulting
  • Northampton, UK
  • Salary: £70k - 75k

Lead Software Engineer – Java – Global Bank – Machine Learning / Big Data, to £75k + Exceptional Package


Lead Software Engineer with a strong background in Java development required to join a new innovation focused team working on greenfield projects.


My client is working on a number of cutting-edge Machine Learning and AI solutions which are set to revolutionise fraud detection and prevention, so this is a great opportunity to make a real impact on the future of banking technology.


This role would suit a highly ambitious Software Developer who is looking for a genuine challenge.


You will be joining a newly established innovation team within the Bank which consists of highly skilled technical individuals and industry thought leaders.


There is a very real opportunity for rapid career progression into both technical and management focused roles due to the high profile nature of this function.


The ideal Lead Software Engineer will have the following experience:



  • Expert in Java software Development – Java 8 or later versions

  • Experience developing Business Critical systems with low latency performance

  • Development background creating solutions using AWS

  • Any experience in Big Data, MongoDB, Spark, MySQL and React / Node would be nice to have although not a necessity


This role will be based in Northampton and offers a salary of between £70k and £75k plus an exceptional benefits package including bonus, strong pension, private healthcare and a host of other benefits.

BIZX, LLC / Slashdot Media / SourceForge.net
  • San Diego, CA

Job Description (your role):


The Senior Data Engineer position is a challenging role that bridges the gap between data management and software development. This role reports directly to and works closely with the Director of Data Management while teaming with our software development group. You will work with the team that is designing and implementing the next generation of our internal systems, replacing legacy technical debt with state-of-the-art design to enable faster product and feature creation in our big data environment.


Our Industry and Company Environment:

Candidates must have the desire to work and collaborate in a fast-paced entrepreneurial environment in the B2B technology marketing and big data space, working with highly motivated co-workers in our downtown San Diego office.


Responsibilities


  • Design interfaces allowing the operations department to fully utilize large data sets
  • Implement machine learning algorithms to sort and organize large data sets
  • Participate in the research, design, and development of software tools
  • Identify, design, and implement process improvements: automating manual processes
  • Optimize data delivery, re-designing infrastructure for greater scalability
  • Analyze and interpret large data sets
  • Build reliable services for gathering & ingesting data from a wide variety of sources
  • Work with peers and stakeholders to plan approach and define success
  • Create efficient methods to clean and curate large data sets
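
Purely as an illustration of the "machine learning algorithms to sort and organize large data sets" responsibility above (not BIZX code), a hedged sketch using scikit-learn's mini-batch k-means, which scales to large data sets by fitting on small chunks; the input file, feature columns, and cluster count are hypothetical.

```python
# Illustrative sketch only: organizing a large data set into groups with
# mini-batch k-means, which processes the data in small chunks.
# The CSV path, feature columns, and cluster count are hypothetical.
import pandas as pd
from sklearn.cluster import MiniBatchKMeans
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("events.csv", usecols=["sessions", "page_views", "downloads"])

# Standardize features so no single column dominates the distance metric.
features = StandardScaler().fit_transform(df.values)

km = MiniBatchKMeans(n_clusters=8, batch_size=10_000, random_state=0)
df["segment"] = km.fit_predict(features)

print(df.groupby("segment").size())
```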


Qualifications

    • Have a B.S., M.S. or Ph.D. in Computer Science or an equivalent degree and work experience

    • Deep understanding of developing high-efficiency data processing systems

    • Experience with development of applications in mission-critical environments
    • Experience with our stack:
      • 3+ years of experience developing in JavaScript, PHP, Symfony
      • 3+ years of experience developing and implementing machine learning algorithms
      • 4+ years of experience with data science tool sets
      • 3+ years of MySQL
      • Experience with ElasticSearch a plus
      • Experience with Ceph a plus

About BIZX, LLC / Slashdot Media / SourceForge.net


BIZX, including its Slashdot Media division, is a global leader in online professional technology communities such as SourceForge.net, serving over 40M website visitors and over 150M page views each month to an enthusiastic and engaged audience of IT professionals, decision makers, developers and enthusiasts around the world. Our Passport demand generation platform leverages our huge B2B database and is considered best in class by our list of Fortune 1000 customers. Our impressive growth in the demand generation space is fueled by our use of AI, big data technologies, sophisticated systems automation - and great people.


Location - 101 W Broadway, San Diego, CA

AXA Schweiz
  • Winterthur, Switzerland

Do agility, product-driven IT, cloud computing and machine learning appeal to you?
Are you performance-driven, with the courage to try out new things?

We have anchored digital transformation in our DNA!


Your contribution:



  • The role primarily covers engineering (IBM MQ on Linux, z/OS) and operation of middleware components (file transfer, web service infrastructure).

  • In detail, this means component ownership (including lifecycle management, provision of APIs and self-services, automation of workflows, and creation and maintenance of documentation), ensuring reliable operations (you take the necessary measures autonomously and are willing to do occasional weekend/on-call duty), as well as maintaining and sharing knowledge.

  • In an agile environment, you help migrate our components to the cloud.


Your skills and talents:



  • You have a completed degree in computer science or comparable experience.

  • Your know-how covers messaging middleware components, ideally IBM MQ on Linux enriched with z/OS knowledge; familiarity with RabbitMQ and Kafka would be a bonus (see the sketch after this list).

  • Other middleware components (file transfer and web services) are not entirely unfamiliar to you, and you know transfer protocols and the Linux world in particular.

  • You bring solid automation experience to the table (Bash, Python), and REST, APIs and Java(Script) are not foreign concepts to you. Initial programming experience in an object-oriented language, preferably Java, rounds out your profile.

  • You are a team builder, look at challenges from different perspectives, and ask uncomfortable questions when it matters.

  • You are confident in both German and English.
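
As a hedged, purely illustrative sketch of scripting against messaging middleware (the posting centres on IBM MQ; the RabbitMQ pika client is used here only as a stand-in, and the host, queue name and payload are hypothetical):

```python
# Illustrative sketch only: publish and consume a message via RabbitMQ
# using the pika client. Host, queue name, and payload are hypothetical;
# IBM MQ automation would use its own client library instead.
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()

# Declaring the queue is idempotent, so automation can run it safely on each deploy.
channel.queue_declare(queue="file-transfer.events", durable=True)

channel.basic_publish(
    exchange="",
    routing_key="file-transfer.events",
    body=b"transfer-completed",
)

method, properties, body = channel.basic_get(queue="file-transfer.events", auto_ack=True)
print(body)

connection.close()
```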

Accenture
  • San Diego, CA
Organization: Accenture Applied Intelligence
Position: Artificial Intelligence Engineer - Consultant
The digital revolution is changing everything. It's everywhere, transforming how we work and play. Accenture Digital's 36,000 professionals are driving these exciting changes and bringing them to life across 40 industries in more than 120 countries. At the forefront of digital, you'll create it, own it and make it a reality for clients looking to better serve their connected customers and operate always-on enterprises. Join us and become an integral part of our experienced digital team with the credibility, expertise and insight clients depend on.
Accenture Applied Intelligence, part of Accenture Digital, helps clients to use analytics and artificial intelligence to drive actionable insights, at scale. We apply sophisticated algorithms, data engineering and visualization to extract business insights and help clients turn those insights into actions that drive tangible outcomes to improve their performance and disrupt their markets. Accenture Applied Intelligence is a leader in big data analytics, with deep industry and technical experience. We provide services and solutions that include Analytics Advisory, Data Science, Data Engineering and Analytics-as-a-Service.
Role Description
As an AI engineer, you will facilitate the transfer of advanced AI technologies from the research labs to the domain testbeds and thus the real world. You will participate in the full research-to-deployment pipeline. You will help conceptualize and develop research experiments, and then implement the systems to execute these experiments. You will lead or work with a team and interact closely with experienced machine learning engineers and researchers as well as industry partners. You will attend reading groups and seminars, master research techniques and engineering practices, and design research tools and experimental testbeds. You will apply state-of-the-art AI algorithms, explore new solutions, and build working prototypes. You will also learn to deploy the systems and solutions at scale.
Responsibilities
    • Use Deep Learning and Machine Learning to create scalable solutions for business problems.
    • Deliver Deep Learning/Machine Learning projects from beginning to end, including business understanding, data aggregation, data exploration, model building, validation and deployment.
    • Define Architecture Reference Assets - Apply Accenture methodology, Accenture reusable assets, and previous work experience to deliver consistently high-quality work. Deliver written or oral status reports regularly. Stay educated on new and emerging market offerings that may be of interest to our clients. Adapt existing methods and procedures to create possible alternative solutions to moderately complex problems.
    • Work hands on to demonstrate and prototype integrations in customer environments. Primary upward interaction is with direct supervisor. May interact with peers and/or management levels at a client and/or within Accenture.
    • Solution and Proposal Alignment - Through a formal sales process, work with the Sales team to identify and qualify opportunities. Conduct full technical discovery, identifying pain points, business and technical requirements, and as-is and to-be scenarios.
    • Understand the strategic direction set by senior management as it relates to team goals. Use considerable judgment to define solutions and seek guidance on complex problems.
Qualifications
    • Bachelor's degree in AI, Computer Science, Engineering, Statistics, or Physics.
    • Minimum of 1 year of experience in production deployed solutions using artificial intelligence or machine learning techniques.
    • Minimum of 1 year of previous consulting or client service delivery experience
    • Minimum of 2 years of experience with system integration architectures, private and public cloud architectures, pros/cons, transformation experience
    • Minimum of 1 year of full lifecycle deployment experience
Preferred Skills
    • Master's or PhD in Analytics, Statistics or other quantitative disciplines
    • Deep learning architectures: convolutional, recurrent, autoencoders, GANs, ResNets
    • Experience in Cognitive tools like Microsoft Bot Framework & Cognitive Services, IBM Watson, Amazon AI services
    • Deep understanding of Data structures and Algorithms
    • Deep experience in Python, C# (.NET), Scala
    • Deep knowledge of MXNet, CNTK, R, H2O, TensorFlow, PyTorch (a minimal Keras sketch follows this list)
    • Highly desirable to have experience in: cuDNN, NumPy, SciPy
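
As a hedged illustration of the convolutional architectures and TensorFlow experience listed above (not Accenture's code), a minimal Keras model definition; the input shape and class count are hypothetical.

```python
# Illustrative sketch only: a small convolutional network in Keras.
# Input shape and number of classes are hypothetical placeholders.
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, kernel_size=3, activation="relu"),
    layers.GlobalAveragePooling2D(),
    layers.Dense(10, activation="softmax"),
])

model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
```
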
Professional Skill Requirements
    • Recent success in contributing to a team-oriented environment
    • Proven ability to work creatively and analytically in a problem-solving environment
    • Excellent communication (written and oral) and interpersonal skills
    • Demonstrated leadership in professional setting; either military or civilian
    • Demonstrated teamwork and collaboration in a professional setting; either military or civilian
    • Ability to travel extensively
OUR COMMITMENT TO YOU
    • Your entrepreneurial spirit and vision will be rewarded, and your success will fuel opportunities for career advancement.
    • You will make a difference for some pretty impressive clients. Accenture serves 94 of the Fortune Global 100 and more than 80 percent of the Fortune Global 500.
    • You will be an integral part of a market-leading analytics organization, including the largest and most diversified group of digital, technology, business process and outsourcing professionals in the world. You can leverage our global team to support analytics innovation workshops, rapid capability development, enablement and managed services.
    • You will have access to Accenture's deep industry and functional expertise. We operate across more than 40 industries and have hundreds of offerings addressing key business and technology issues. Through our global network, we bring unparalleled experience and comprehensive capabilities across industries and business functions, and extensive research on the world's most successful companies. You will also be able to tap into the continuous innovation of our Accenture Technology Labs and Innovation Centers, as well as top universities such as MIT through our academic alliance program.
    • You will have access to distinctive analytics assets that we use to accelerate delivering value to our clients including more than 550 analytics assets underpinned by a strong information management and BI technology foundation. Accenture has earned more than 475 patents and patents pending globally for software assets, data- and analytic-related methodologies and content.
    • As the world's largest independent technology services provider, we are agnostic about technology but have very clear viewpoints about what is most appropriate for a client's particular challenge. You will have access to our alliances with market-leading technology providers and collaborative relationships with emerging players in the analytics and big data space, the widest ecosystem in the industry. These alliances bring together Accenture's extensive analytics capabilities and alliance providers' technology, experience and innovation to power analytics-based solutions.
    • You will have access to the best talent. Accenture has a team of more than 36,000 digital professionals including technical architects, big data engineers, data scientists and business analysts, as well as user digital strategists and experience designers.
    • Along with a competitive salary, Accenture offers a comprehensive package that includes generous paid time off, 401(k) match and an employee healthcare plan. Learn more about our extensive rewards and benefits on our Benefits page.
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H1-B visa, F-1 visa (OPT), TN visa or any other non-immigrant status).
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women.
American Express
  • Phoenix, AZ

Our Software Engineers not only understand how technology works, but how that technology intersects with the people who count on it every single day. Today, creative ideas, insight and new points of view are at the core of how we craft a more powerful, personal and fulfilling experience for all our customers. So if you're passionate about a career building breakthrough software and making an impact on an audience of millions, look no further.

There are hundreds of chances for you to make your mark on Technology and life at American Express. Here's just some of what you'll be doing:

    • Take your place as a core member of an Agile team driving the latest application development practices.
    • Find your opportunity to execute new technologies, write code and perform unit tests, as well as working with data science, algorithms and automation processing
    • Engage your collaborative spirit by collaborating with fellow engineers to craft and deliver recommendations to Finance, Business, and Technical users on Finance Data Management. 


Qualifications:

  

Are you up for the challenge?


    • 4+ years of Software Development experience.
    • BS or MS Degree in Computer Science, Computer Engineering, or other Technical discipline including practical experience effectively interpreting Technical and Business objectives and challenges and designing solutions.
    • Ability to effectively collaborate with Finance SMEs and partners of all levels to understand their business processes and take overall ownership of Analysis, Design, Estimation and Delivery of technical solutions for Finance business requirements and roadmaps, including a deep understanding of Finance and other LOB products and processes. Experience with regulatory reporting frameworks is preferred.
    • Hands-on expertise with application design and software development across multiple platforms, languages, and tools is preferred: Java, Hadoop, Python, streaming, Flink, Spark, Hive, MapReduce, Unix, NoSQL and SQL databases (see the PySpark sketch after this list).
    • Working SQL knowledge and experience with relational databases and query authoring, including working familiarity with a variety of databases (DB2, Oracle, SQL Server, Teradata, MySQL, HBase, Couchbase, MemSQL).
    • Experience in architecting, designing, and building customer dashboards with data visualization tools such as Tableau using accelerator database Jethro.
    • Extensive experience in application, integration, system and regression testing, including demonstration of automation and other CI/CD efforts.
    • Experience with version control software like Git and SVN, plus CI/CD testing/automation experience.
    • Proficient with Scaled Agile application development methods.
    • Deals well with ambiguous/under-defined problems; Ability to think abstractly.
    • Willingness to learn new technologies and exploit them to their optimal potential, including substantiated ability to innovate and take pride in quickly deploying working software.
    • Ability to enable business capabilities through innovation is a plus.
    • Ability to get results with an emphasis on reducing time to insights and increased efficiency in delivering new Finance product capabilities into the hands of Finance constituents.
    • Focuses on the Customer and Client with effective consultative skills across a multi-functional environment.
    • Ability to communicate effectively verbally and in writing, including effective presentation skills. Strong analytical skills, problem identification and resolution.
    • Delivering business value using creative and effective approaches
    • Possesses strong business knowledge about the Finance organization, including industry standard methodologies.
    • Demonstrates a strategic/enterprise viewpoint and business insights with the ability to identify and resolve key business impediments.
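
Purely as a hedged illustration of the Spark/Hive work named in the list above (not American Express code), a small PySpark aggregation; the input path and column names are hypothetical.

```python
# Illustrative sketch only: a PySpark aggregation of transaction data.
# The input path and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("finance-daily-rollup").getOrCreate()

txns = spark.read.parquet("s3://example-bucket/finance/transactions/")

daily = (
    txns.groupBy("account_id", F.to_date("posted_at").alias("posting_date"))
        .agg(F.sum("amount").alias("total_amount"),
             F.count("*").alias("txn_count"))
)

# Written as partitioned Parquet so downstream Hive/SQL users can query it.
daily.write.mode("overwrite").partitionBy("posting_date").parquet(
    "s3://example-bucket/finance/daily_rollup/"
)
```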


Employment eligibility to work with American Express in the U.S. is required as the company will not pursue visa sponsorship for these positions.

Pyramid Consulting, Inc
  • Atlanta, GA

Job Title: Tableau Engineer

Duration: 6-12 Months+ (potential to go perm)

Location: Atlanta, GA (30328) - Onsite

Notes from Manager:

We need a data analyst who knows Tableau, scripting (JSON, Python), the Alteryx API, AWS, and analytics.

Description

The Tableau Software Engineer will be a key resource working across our Software Engineering BI/Analytics stack to ensure stability, scalability, and the delivery of valuable BI & Analytics solutions for our leadership teams and business partners. Keys to this position are the ability to excel at identifying problems or analytic gaps and at mapping and implementing pragmatic solutions. An excellent blend of analytical, technical and communication skills in a team-based environment is essential for this role.

Tools we use: Tableau, Business Objects, AngularJS, OBIEE, Cognos, AWS, Opinion Lab, JavaScript, Python, Jaspersoft, Alteryx and R packages, Spark, Kafka, Scala, Oracle

Your Role:

  • Able to design, build, maintain & deploy complex reports in Tableau

  • Experience integrating Tableau into another application or native platforms is a plus (see the sketch after this list)

  • Expertise in data visualization, including effective communication, appropriate chart types, and best practices

  • Knowledge of best practices and experience optimizing Tableau for performance

  • Experience reverse-engineering and revising Tableau workbooks created by other developers

  • Understanding of basic statistical routines (mean, percentiles, significance, correlations), with the ability to apply them in data analysis

  • Able to turn ideas into creative & statistically sound decision support solutions
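
As a hedged illustration of integrating Tableau with another application (flagged in the list above), the sketch below signs in to Tableau Server's REST API with the requests library and lists workbooks; the server URL, API version, site, and credentials are hypothetical, and a real deployment would typically use a personal access token.

```python
# Illustrative sketch only: sign in to Tableau Server's REST API and list
# workbooks. Server URL, API version, site, and credentials are hypothetical.
import requests

SERVER = "https://tableau.example.com"
API = f"{SERVER}/api/3.9"

signin_payload = {
    "credentials": {
        "name": "report_bot",
        "password": "not-a-real-password",
        "site": {"contentUrl": ""},
    }
}

resp = requests.post(f"{API}/auth/signin", json=signin_payload,
                     headers={"Accept": "application/json"})
resp.raise_for_status()
creds = resp.json()["credentials"]
token, site_id = creds["token"], creds["site"]["id"]

workbooks = requests.get(
    f"{API}/sites/{site_id}/workbooks",
    headers={"X-Tableau-Auth": token, "Accept": "application/json"},
)
print(workbooks.json())
```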

Education and Experience:

  • Bachelor's degree in Computer Science or equivalent work experience

  • 3-5 years of hands-on experience in data warehousing & BI technologies (Tableau/OBIEE/Business Objects/Cognos)

  • Three or more years of experience developing reports in Tableau

  • A good understanding of Tableau architecture, design, development and the end-user experience

What We Look For:

  • Proficiency working with large databases in Oracle; experience with Big Data technologies is a plus

  • Deep understanding of and working experience with data warehouse and data mart concepts

  • Understanding of Alteryx and R packages is a plus

  • Experience designing and implementing high-volume data processing pipelines, using tools such as Spark and Kafka

  • Experience with Scala, Java or Python and a working knowledge of AWS technologies such as Glue, EMR, Kinesis and Redshift preferred

  • Excellent knowledge of Amazon AWS technologies, with a focus on highly scalable cloud-native architectural patterns, especially EMR, Kinesis, and Redshift

  • Experience with software development tools and build systems such as Jenkins

Recursion
  • Salt Lake City, UT

At Recursion we combine experimental biology, automation, and artificial intelligence to quickly and efficiently identify treatments for human diseases. We’re transforming drug discovery into a data science problem and to do that we’re building a platform for rapid biological experimentation, data generation, automated analysis, model training, and prediction.


THE PROBLEMS YOU’LL SOLVE


As a Machine Learning Engineer, you'll report to the VP of Engineering and will work with others on the data, machine learning, and engineering teams to build the infrastructure and systems to enable both ML prototyping and production grade deployment of ML solutions that lift our drug discovery platform to new levels of effectiveness and efficiency. We are looking for experienced Machine Learning Engineers who value experimentation and the rigorous use of the scientific method, high collaboration across multiple functions, and intense curiosity driving them to keep our systems cutting edge. In this role you will:



  • Build, scale, and operate compute clusters for deep learning. You'll be a part of a team responsible for the ML infrastructure, whether that be large-scale on-prem GPU clusters or cloud-based TPU pods.

  • Create a world-class ML research platform. You'll work with Data Scientists, ML Researchers, and Systems and Data Engineers to create an ML platform that allows them to efficiently prepare hundreds of terabytes of data for training and processing, train cutting-edge deep learning models, backtest them on thousands of past experiments, and deploy working solutions to production. Examples of ML platforms like this are Uber's Michelangelo and Facebook's FBLearner Flow.

  • Be a mentor to peers. You will share your technical knowledge and experiences, resulting in an increase in their productivity and effectiveness.


THE EXPERIENCE YOU’LL NEED



  • An ability to be resourceful and collaborative in order to complete large projects. You’ll be working cross-functionally to build these systems and must always have the end (internal) user in mind.

  • Experience implementing, training, and evaluating deep learning models using modern ML frameworks through collaboration with others, reading of ML papers, or primary research.

  • A demonstration of accelerating ML research efforts through improved systems, processes and frameworks.

  • A track record of learning new technologies as needed to get things done. Our current tech stack uses Python and the pydata libraries, TensorFlow, Keras, Kubernetes + Docker, BigQuery, and other cloud services provided by Google Cloud Platform (an input-pipeline sketch follows this list).

  • Biology background is not necessary, but intellectual curiosity is a must!
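
Purely as a hedged illustration of the tech stack noted above (not Recursion's code), one small piece of an ML platform is an input pipeline that streams large datasets into training; the tf.data sketch below uses a hypothetical file pattern and record schema.

```python
# Illustrative sketch only: a tf.data input pipeline for TFRecord image data.
# The file pattern and feature schema are hypothetical placeholders.
import tensorflow as tf

def parse_example(serialized):
    features = tf.io.parse_single_example(
        serialized,
        {
            "image": tf.io.FixedLenFeature([], tf.string),
            "label": tf.io.FixedLenFeature([], tf.int64),
        },
    )
    image = tf.io.decode_png(features["image"], channels=1)
    image = tf.image.convert_image_dtype(image, tf.float32)
    return image, features["label"]

files = tf.data.Dataset.list_files("gs://example-bucket/cells/train-*.tfrecord")
dataset = (
    tf.data.TFRecordDataset(files, num_parallel_reads=tf.data.AUTOTUNE)
    .map(parse_example, num_parallel_calls=tf.data.AUTOTUNE)
    .shuffle(10_000)
    .batch(64)
    .prefetch(tf.data.AUTOTUNE)
)
```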


THE PERKS YOU’LL ENJOY



  • Coverage of health, vision, and dental insurance premiums (in most cases 100%)

  • 401(k) with generous matching (immediate vesting)

  • Stock option grants

  • Two one-week paid company closures (summer and winter) in addition to flexible, generous vacation/sick leave

  • Commuter benefit and vehicle parking to ease your commute

  • Complimentary chef-prepared lunches and well-stocked snack bars

  • Generous paid parental leave for birth, non-birth, and adoptive parents

  • Fully-paid gym membership to Metro Fitness, located just feet away from our new headquarters

  • Gleaming new 100,000 square foot headquarters complete with a 70-foot climbing wall, showers, lockers, and bike parking



WHAT WE DO


We have raised over $80M to apply machine learning to one of the most unique datasets in existence - over a petabyte of imaging data spanning more than 10 billion cells treated with hundreds of thousands of different biological and chemical perturbations, generated in our own labs - in order to find treatments for hundreds of diseases. Our long term mission is to decode biology to radically improve lives and we want to understand biology so well that we can fix most things that go wrong in our bodies. Our data scientists, machine learning researchers and engineers work on some of the most challenging and interesting problems in computational drug discovery, and collaborate with some of the brightest minds in the deep learning community (Yoshua Bengio is one of our advisors), who help our machine learning team design novel ways of tackling these problems.



Recursion is an equal opportunity employer and complies with all applicable federal, state, and local fair employment practices laws. Recursion strictly prohibits and does not tolerate discrimination against applicants because of race, color, religion, creed, national origin or ancestry, ethnicity, sex, pregnancy, gender (including gender nonconformity and status as a transgender individual), age, physical or mental disability, citizenship, past, current, or prospective service in the uniformed services, or any other characteristic protected under applicable federal, state, or local law.

The HT Group
  • Austin, TX

Full Stack Engineer, Java/Scala Direct Hire Austin

Do you have a track record of building both internal- and external-facing software services in a dynamic environment? Are you passionate about introducing disruptive and innovative software solutions for the shipping and logistics industry? Are you ready to deliver immediate impact with the software you create?

We are looking for Full Stack Engineers to craft, implement and deploy new features, services, platforms, and products. If you are curious, driven, and naturally explore how to build elegant and creative solutions to complex technical challenges, this may be the right fit for you. If you value a sense of community and shared commitment, you'll collaborate closely with others in a full-stack role to ship software that delivers immediate and continuous business value. Are you up for the challenge?

Tech Tools:

  • Application stack runs entirely on Docker frontend and backend
  • Infrastructure is 100% Amazon Web Services and we use AWS services whenever possible. Current examples: EC2, Elastic Container Service (Docker), Kinesis, SQS, Lambda and Redshift
  • Java and Scala are the languages of choice for long-lived backend services
  • Python for tooling and data science
  • Postgres is the SQL database of choice
  • Actively migrating to a modern JavaScript-centric frontend built on Node, React/Relay, and GraphQL as some of our core UI technologies

Responsibilities:

  • Build both internal and external REST/JSON services running on our 100% Docker-based application stack or within AWS Lambda
  • Build data pipelines around event-based and streaming-based AWS services and application features
  • Write deployment, monitoring, and internal tooling to operate our software with as much efficiency as we build it
  • Share ownership of all facets of software delivery, including development, operations, and test
  • Mentor junior members of the team and coach them to be even better at what they do
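
As a hedged illustration of the "data pipelines around event-based and streaming-based AWS services" responsibility above (shown in Python rather than the team's Java/Scala, to keep a single language across the sketches in this document), a minimal AWS Lambda handler consuming Kinesis records; the payload fields are hypothetical, while the event envelope is the standard Kinesis-to-Lambda shape.

```python
# Illustrative sketch only: an AWS Lambda handler consuming a Kinesis event.
# Payload fields are hypothetical; the event envelope is the standard
# Kinesis-to-Lambda shape (base64-encoded data per record).
import base64
import json

def handler(event, context):
    shipments = []
    for record in event.get("Records", []):
        raw = base64.b64decode(record["kinesis"]["data"])
        payload = json.loads(raw)
        # Hypothetical payload: {"shipment_id": "...", "status": "..."}
        shipments.append((payload["shipment_id"], payload["status"]))

    # In a real service this would write to Postgres, SQS, Redshift, etc.
    print(f"processed {len(shipments)} shipment events")
    return {"processed": len(shipments)}
```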

Requirements:

  • Embrace the AWS + DevOps philosophy and believe this is an innovative approach to creating and deploying products and technical solutions that require software engineers to be truly full-stack
  • Have high-quality standards, pay attention to details, and love writing beautiful, well-designed and tested code that can stand the test of time
  • Have built high-quality software, solved technical problems at scale and believe in shipping software iteratively and often
  • Proficient in and have delivered software in Java, Scala, and possibly other JVM languages
  • Developed a strong command over Computer Science fundamentals
ettain group
  • Raleigh, NC

Role: Network Engineer R/S

Location: RTP, primarily onsite but with some flexibility for remote work after the initial ramp-up

Pay Rate: $35-60/hr depending on experience.

Interview Process:
Video WebEx (30 min screen)
Panel interview with 3-4 CPOC engineers - in-depth technical screen

Personality:

  • Customer facing

  • Experience dealing with high-pressure situations

  • Able to handle technology at the level the customer will throw at them

  • Customers test the engineers to see if the tech truly is working

  • Has to be able to figure out how to make it work

Must-have tech:

  • Core R/S (routing and switching)

  • VMware


Who You'll Work With:

The POV Services Team (dCloud, CPOC, CXC, etc) provides services, tools and content for Cisco field sales and channel partners, enabling them to highlight Cisco solutions and technologies to customers.

What You'll Do

As a Senior Engineer, you are responsible for the development, delivery, and support of a wide range of Enterprise Networking content and services for Cisco Internal, Partner and Customer audiences.

Content Roadmap, Design and Project Management 25%

    • You will document and scope all projects prior to entering project build phase.
    • Youll work alongside our platform/automation teams to review applicable content to be hosted on Cisco dCloud.
    • You specify and document virtual and hardware components, resources, etc. required for content delivery.
    • You can identify and prioritize all project-related tasks while working with the Project Manager to develop a timeline with high expectations to meet project deadlines.
    • You will successfully collaborate and work with a globally-dispersed team using collaboration tools, such as email, instant messaging (Cisco Jabber/Spark), and teleconferencing (WebEx and/or TelePresence).

Content Engineering and Documentation 30%

    • Document device connectivity requirements of all components (virtual and HW) and build as part of pre-work.
    • Work with the Netops team on the racking, cabling, imaging, and access required for the content project.
    • As part of the development cycle, the developer will work collaboratively with the business unit technical marketing engineers (TME) and WW EN Sales engineers to configure solution components, including Cisco routers, switches, wireless LAN controllers (WLC), SD-Access, DNA Center, Meraki, SD-WAN (Viptela), etc.
    • Work with BU, WW EN Sales and marketing resources to draft, test and troubleshoot compelling demo/lab/story guides that contribute to the field sales teams and generate high interest and utilization.
    • Work with POV Services Technical Writer to format/edit/publish content and related documents per POV Services standards.
    • Work as the liaison to the operations and support teams to resolve issues identified during the development and testing process, providing technical support and making design recommendations for fixes.
    • Perform resource studies using VMware vCenter to ensure an optimal balance of content performance, efficiency and stability before promoting/publishing production content.

Content Delivery 25%

    • SD-Access POV, SD-WAN POV Presentations, Webex and Video recordings, TOI, SE Certification Proctor, etc.
    • Customer engagement at the customer location, at a Cisco office, or remotely, delivering proof of value, as well as delivering Test Drive and/or Technical Solutions Workshop content at a Cisco office.
    • Deliver training, TOI, and presentations at events (Cisco Live, GSX, SEVT, Partner VT, etc).
    • Work with the POV Services owners, architects, and business development team to market, train, and increase global awareness of new/revised content releases.

Support and Other 20%

    • You provide transfer of information and technical support to Level 1 & 2 support engineers, program managers and others ensuring that content is understood and in working order.
    • You will test and replicate issues, isolate the root cause, and provide timely workarounds and/or short/long term fixes.
    • You will be monitoring any support trends for assigned content. Track and log critical issues effectively using Jira.
    • You provide Level 3 user support directly/indirectly to Cisco and Partner sales engineers while supporting and mentoring peer/junior engineers as required.

Who You Are

    • You are well versed in the use of standard design templates and tools (Microsoft Office including Visio, Word, Excel, PowerPoint, and Project).
    • You bring an uncanny ability to multitask between multiple projects, user support, training, events, etc. and shifting priorities.
    • Demonstrated, in-depth working knowledge/certification of routing, switching and WLAN design, configuration and deployment. Cisco Certifications including CCNA, CCNP and or CCIE (CCIE preferred) in R&S.
    • You possess professional or expert knowledge/experience with Cisco Service Provider solutions.
    • You have associate- or professional-level knowledge of and experience with Cisco Security, including Cisco ISE, Stealthwatch, ASA, Firepower, AMP, etc.
    • You have the ability to travel to Cisco internal, partner and customer events, roadshows, etc. to train and raise awareness to drive POV Services adoption and sales. Up to 40% travel.
    • You bring VMWare/ESXi experience building servers, install VMware, deploying virtual appliances, etc.
    • You have Linux experience or certifications including CompTIA Linux+, Red Hat, etc.
    • You're experienced using Tool Command Language (Tcl), Perl, Python, etc., as well as Cisco and 3rd-party traffic, event and device generation applications/tools/hardware (IXIA, Sapro, Pagent, etc.); a scripting sketch follows this list.
    • You've used Cisco and 3rd-party management/monitoring/troubleshooting solutions; Cisco: DNA Center, Cisco Prime, Meraki, Viptela, CMX.
    • 3rd-party solutions: Solarwinds, Zenoss, Splunk, LiveAction or others to monitor and/or manage an enterprise network.
    • Experience using Wireshark and PCAP files.
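
As the scripting sketch referenced above, a hedged, purely illustrative example of pulling interface state from a Cisco IOS device in Python; the netmiko library is an assumption (the posting names Tcl, Perl and Python generally, not this library), and the host and credentials are hypothetical.

```python
# Illustrative sketch only: query a Cisco IOS device with the netmiko library.
# netmiko is an assumption (not named in the posting); host and credentials
# are hypothetical placeholders.
from netmiko import ConnectHandler

device = {
    "device_type": "cisco_ios",
    "host": "lab-switch-01.example.com",
    "username": "labadmin",
    "password": "not-a-real-password",
}

with ConnectHandler(**device) as conn:
    output = conn.send_command("show ip interface brief")

print(output)
```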

Why Cisco

At Cisco, each person brings their unique talents to work as a team and make a difference.

Yes, our technology changes the way the world works, lives, plays and learns, but our edge comes from our people.

    • We connect everything: people, process, data and things, and we use those connections to change our world for the better.
    • We innovate everywhere - From launching a new era of networking that adapts, learns and protects, to building Cisco Services that accelerate businesses and business results. Our technology powers entertainment, retail, healthcare, education and more from Smart Cities to your everyday devices.
    • We benefit everyone - We do all of this while striving for a culture that empowers every person to be the difference, at work and in our communities.
MINDBODY Inc.
  • Irvine, CA
  • Salary: $96k - 135k

The Senior Data Engineer focuses on designing, implementing and supporting new and existing data solutions: data processing and data sets to support various advanced analytical needs. You will be designing, building and supporting data pipelines that consume data from multiple different source systems and transform it into valuable and insightful information. You will have the opportunity to contribute to end-to-end platform design for our cloud architecture and work multi-functionally with operations, data science and the business segments to build batch and real-time data solutions. The role will be part of a team supporting our Corporate, Sales, Marketing, and Consumer business lines.


 
MINIMUM QUALIFICATIONS AND REQUIREMENTS:



  • 7+ years of relevant experience in one of the following areas: Data engineering, business intelligence or business analytics

  • 5-7 years of experience supporting a large data platform and data pipelining

  • 5+ years of experience in scripting languages like Python etc.

  • 5+ years of experience with AWS services including S3, Redshift, EMR and RDS

  • 5+ years of experience with Big Data Technologies (Hadoop, Hive, HBase, Pig, Spark, etc.)

  • Expertise in database design and architectural principles and methodologies

  • Experienced in Physical data modeling

  • Experienced in Logical data modeling

  • Technical expertise should include data models, database design and data mining



PRINCIPAL DUTIES AND RESPONSIBILITIES:



  • Design, implement, and support a platform providing access to large datasets

  • Create unified enterprise data models for analytics and reporting

  • Design and build robust and scalable data integration (ETL) pipelines using SQL, Python, and Spark.

  • As part of an Agile development team contribute to architecture, tools and development process improvements

  • Work in close collaboration with product management, peer system and software engineering teams to clarify requirements and translate them into robust, scalable, operable solutions that work well within the overall data architecture

  • Coordinate data models, data dictionaries, and other database documentation across multiple applications

  • Lead design reviews of data deliverables such as models, data flows, and data quality assessments

  • Promote data modeling standardization; define and drive adoption of the standards

  • Work with Data Management to establish governance processes around metadata to ensure an integrated definition of data for enterprise information, and to ensure the accuracy, validity, and reusability of metadata
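
Purely as a hedged illustration of the S3/Redshift pipeline work described above (not MINDBODY's implementation), a minimal Python sketch that stages a file to S3 with boto3 and loads it into Redshift via a COPY statement issued through psycopg2; the bucket, table, IAM role and connection details are hypothetical.

```python
# Illustrative sketch only: stage a file to S3 and COPY it into Redshift.
# Bucket, table, IAM role, and connection details are hypothetical.
import boto3
import psycopg2

s3 = boto3.client("s3")
s3.upload_file("daily_sales.csv", "example-analytics-bucket", "staging/daily_sales.csv")

conn = psycopg2.connect(
    host="example-cluster.abc123.us-west-2.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="etl_user",
    password="not-a-real-password",
)

copy_sql = """
    COPY sales.daily_sales
    FROM 's3://example-analytics-bucket/staging/daily_sales.csv'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy'
    CSV IGNOREHEADER 1;
"""

with conn, conn.cursor() as cur:
    cur.execute(copy_sql)

conn.close()
```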

GrubHub Seamless
  • New York, NY

Got a taste for something new?

We’re Grubhub, the nation’s leading online and mobile food ordering company. Since 2004 we’ve been connecting hungry diners to the local restaurants they love. We’re moving eating forward with no signs of slowing down.

With more than 90,000 restaurants and over 15.6 million diners across 1,700 U.S. cities and London, we’re delivering like never before. Incredible tech is our bread and butter, but amazing people are our secret ingredient. Rigorously analytical and customer-obsessed, our employees develop the fresh ideas and brilliant programs that keep our brands going and growing.

Long story short, keeping our people happy, challenged and well-fed is priority one. Interested? Let’s talk. We’re eager to show you what we bring to the table.

About the Opportunity: 

Senior Site Reliability Engineers are embedded in Big Data specific Dev teams to focus on the operational aspects of our services, and our SREs run their respective products and services from conception to continuous operation. We're looking for engineers who want to be a part of developing infrastructure software, maintaining it and scaling it. If you enjoy focusing on reliability, performance, capacity planning, and the automation of everything, you'd probably like this position.





Some Challenges You’ll Tackle





TOOLS OUR SRE TEAM WORKS WITH:



  • Python – our primary infrastructure language

  • Cassandra

  • Docker (in production!)

  • Splunk, Spark, Hadoop, and PrestoDB

  • AWS

  • Python and Fabric for automation and our CD pipeline

  • Jenkins for builds and task execution

  • Linux (CentOS and Ubuntu)

  • DataDog for metrics and alerting

  • Puppet
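
As a hedged illustration of "Python and Fabric for automation" from the tools list above (not Grubhub's actual tooling), a minimal Fabric 2 sketch that checks a service across hosts; the hostnames, user and service name are hypothetical.

```python
# Illustrative sketch only: check a service on several hosts with Fabric 2.
# Hostnames, user, and service name are hypothetical placeholders.
from fabric import Connection

HOSTS = ["app-01.example.net", "app-02.example.net"]

for host in HOSTS:
    conn = Connection(host=host, user="deploy")
    result = conn.run("systemctl is-active order-service", hide=True, warn=True)
    print(f"{host}: {result.stdout.strip()}")
    conn.close()
```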





You Should Have






  • Experience in AWS services like Kinesis, IAM, EMR, Redshift, and S3

  • Experience managing Linux systems

  • Configuration management tool experiences like Puppet, Chef, or Ansible

  • Continuous integration, testing, and deployment using Git, Jenkins, Jenkins DSL

  • Exceptional communication and troubleshooting skills.


NICE TO HAVE:



  • Python or Java / Scala development experience

  • Bonus points for deploying/operating large-ish Hadoop clusters in AWS/GCP and use of EMR, DC/OS, Dataproc.

  • Experience in Streaming data platforms, (Spark streaming, Kafka)

  • Experience developing solutions leveraging Docker

Accenture
  • San Diego, CA
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture, and make delivering innovative work part of your extraordinary career.
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a highly collaborative and growing network of technology and data experts, who are taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. You will have an opportunity to work in roles such as Data Scientist, Data Engineer, or Chief Data Officer covering all aspects of Data including Data Management, Data Governance, Data Intelligence, Knowledge Graphs, and IoT. Come grow your career in Technology at Accenture!
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward.
Business & Technology Integration professionals advise upon, design, develop and/or deliver technology solutions that support best practice business changes
The Business & Industry Integration Associate Manager aligns technology with business strategy and goals, working directly with the client to gather requirements and to analyze, design and/or implement best-practice technology and business changes. They are sought out as experts internally and externally for their deep functional or industry expertise, domain knowledge, or offering expertise. They enhance Accenture's marketplace reputation.
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
Data Management professionals define strategies and develop/deliver solutions and processes for managing enterprise-wide data throughout the data lifecycle from capture to processing to usage across all layers of the application architecture.
A professional at this position level within Accenture has the following responsibilities:
Identifies, assesses and solves complex business problems for area of responsibility, where analysis of situations or data requires an in-depth evaluation of variable factors.
Closely follows the strategic direction set by senior management when establishing near term goals.
Interacts with senior management at a client and/or within Accenture on matters where they may need to gain acceptance of an alternate approach.
Has some latitude in decision-making. Acts independently to determine methods and procedures on new assignments.
Decisions have a major day to day impact on area of responsibility.
Manages large to medium-sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
Basic Qualifications
    • Minimum of 3 plus years of hands-on technical experience implementing Big Data solutions utilizing Hadoop or other Data Science and Analytics platforms.
    • Minimum of 3 plus years of experience with a full life cycle development from functional design to deployment
    • Minimum 2 plus years of hands-on technical experience with delivering Big Data Solutions in the cloud with AWS or Azure
    • Minimum 3 plus years of hands-on technical experience in developing solutions utilizing at least two of the following:
    • Kafka based streaming services
    • R Studio
    • Cassandra , MongoDB
    • MapReduce, Pig, Hive
    • Scala, Spark
    • Knowledge of Jenkins, Chef, Puppet
  • Bachelor's degree or equivalent years of work experience
  • Ability to travel 100%, Monday- Thursday
Professional Skill Requirements
    • Proven ability to build, manage and foster a team-oriented environment
    • Proven ability to work creatively and analytically in a problem-solving environment
    • Desire to work in an information systems environment
    • Excellent communication (written and oral) and interpersonal skills
    • Excellent leadership and management skills
All of our consulting professionals receive comprehensive training covering business acumen, technical and professional skills development. You'll also have opportunities to hone your functional skills and expertise in an area of specialization. We offer a variety of formal and informal training programs at every level to help you acquire and build specialized skills faster. Learning takes place both on the job and through formal training conducted online, in the classroom, or in collaboration with teammates. The sheer variety of work we do, and the experience it offers, provide an unbeatable platform from which to build a career.
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture.
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is a federal contractor and an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
ITCO Solutions, Inc.
  • Austin, TX

The Sr. Engineer will be building pipelines using Spark Scala.

Must Haves:
Expertise in Big Data processing and ETL pipelines
Designing large-scale ETL pipelines - batch and real-time
Expertise in Spark Scala coding and the DataFrame API (rather than the SQL-based APIs)
Expertise in the core DataFrame APIs
Expertise in unit testing Spark DataFrame API based code (see the sketch after this list)
Strong scripting knowledge in Python and shell
Experience and expertise in performance tuning of large-scale data pipelines
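
As the sketch referenced above: the posting centres on Spark's DataFrame API and unit testing DataFrame-based code, so here is a hedged illustration of a small transformation plus a pytest-style test against a local session, shown in PySpark rather than Scala to keep a single language across the sketches in this document; column names are hypothetical.

```python
# Illustrative sketch only (PySpark stand-in for the Scala DataFrame API):
# a small transformation and a pytest-style unit test against a local session.
# Column names are hypothetical placeholders.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F

def add_revenue(df: DataFrame) -> DataFrame:
    """Pure DataFrame-API transformation: revenue = quantity * unit_price."""
    return df.withColumn("revenue", F.col("quantity") * F.col("unit_price"))

def test_add_revenue():
    spark = SparkSession.builder.master("local[1]").appName("unit-test").getOrCreate()
    df = spark.createDataFrame(
        [(2, 10.0), (3, 5.0)],
        ["quantity", "unit_price"],
    )
    result = add_revenue(df).select("revenue").collect()
    assert [row["revenue"] for row in result] == [20.0, 15.0]
    spark.stop()
```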

GeoPhy
  • New York, NY

We're already working with some of the largest real estate lenders and investors across the globe, and we believe that our AVM will truly disrupt the commercial real estate industry.  Using your machine learning and analytical skills, you will contribute to the development of GeoPhy's core information products. This includes working on the development of our flagship product, the Automated Valuation Model (AVM) that we've developed for the commercial real estate market.



What you'll be responsible for



  • Developing and maintaining predictive valuation algorithms for the commercial real estate market, based on stochastic modeling

  • Identifying and analyzing new data sources to improve model accuracy, closely working with our data sourcing teams

  • Conducting statistical analysis to identify patterns and insights, and process and feature engineer data as needed to support model development and business products

  • Bringing models to production, in collaboration with the development and data engineering teams 

  • Supporting data sourcing strategy and the validation of related infrastructure and technology

  • Contributing to the development of methods in data science, including statistical analysis and model development related to real estate, economics, the built environment, or financial markets (a baseline modeling sketch follows this list)
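
As the baseline modeling sketch referenced above, a hedged illustration only (not GeoPhy's AVM): a predictive valuation model might start as a simple cross-validated regression baseline; the data file, features and target are hypothetical.

```python
# Illustrative sketch only: a baseline regression for property valuation.
# The data file, features, and target are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

df = pd.read_csv("listings.csv")  # hypothetical: one row per property
features = ["floor_area_sqm", "year_built", "distance_to_transit_km", "vacancy_rate"]

X = df[features]
y = df["sale_price"]

model = GradientBoostingRegressor(random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
print(f"MAE across folds: {-scores.mean():,.0f}")
```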



What we're looking for



  • Creative and intellectually curious with hands-on experience as a data scientist

  • Flexible, resourceful, and a reliable team player

  • Rigorous analyst, critical thinker, and problem solver with experience in hypothesis testing and experimental design

  • Excellent at communicating, including technical documentation and presenting work across a variety of audiences

  • Experienced working with disparate data sources and the engineering and statistical challenges that presents, particularly with time series, socio-economic-demographic (SED) data, and/or geo-spatial data

  • Strong at data exploration and visualization

  • Experienced implementing predictive models across a full suite of statistical learning algorithms (regression/classification, unsupervised/semi-supervised/supervised)

  • Proficient in Python or R as well as critical scientific and numeric programming packages and tools

  • Intermediate knowledge of SQL

  • Full working proficiency in English

  • An MSc/PhD degree in Computer Science, Mathematics, Statistics or a related subject, or commensurate technical experience



Bonus points for



  • International mind set

  • Experience in an Agile organization

  • Knowledge or experience with global real estate or financial markets

  • Experience with complex data and computing architectures, including cloud services and distributed computing

  • Direct experience implementing models in production or delivering a data product to market



What’s in it for you?



  • You will have the opportunity to accelerate our rapidly growing organisation.

  • We're a lean team, so your impact will be felt immediately.

  • Personal learning budget.

  • Agile working environment with flexible working hours and location.

  • No annual leave allowance; take time off whenever you need.

  • We embrace diversity and foster inclusion. This means we have a zero-tolerance policy towards discrimination.

  • GeoPhy is a family and pet friendly company.

  • Get involved in board games, books, and lego.

SafetyCulture
  • Surry Hills, Australia
  • Salary: A$120k - 140k

The Role



  • Be an integral member of the team responsible for designing, implementing and maintaining a distributed, big-data-capable system with high-quality components (Kafka, EMR + Spark, Akka, etc.); a Kafka consumer sketch follows this list.

  • Embrace the challenge of dealing with big data on a daily basis (Kafka, RDS, Redshift, S3, Athena, Hadoop/HBase), perform data ETL, and build tools for proper data ingestion from multiple data sources.

  • Collaborate closely with data infrastructure engineers and data analysts across different teams to find bottlenecks and solve problems

  • Design, implement and maintain the heterogeneous data processing platform to automate the execution and management of data-related jobs and pipelines

  • Implement automated data workflows in collaboration with data analysts; continue to maintain and improve the system in line with growth

  • Collaborate with Software Engineers on application events, ensuring the right data can be extracted

  • Contribute to resources management for computation and capacity planning

  • Diving deep into code and constantly innovating
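
As the Kafka sketch referenced above, a hedged illustration only (not SafetyCulture's code): ingestion from Kafka could start with a consumer like this, using the kafka-python client; the topic, brokers and message shape are hypothetical.

```python
# Illustrative sketch only: consume application events from Kafka with
# kafka-python. Topic, brokers, and message shape are hypothetical.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "inspection-events",
    bootstrap_servers=["broker-1.example.com:9092"],
    group_id="data-platform-etl",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # A real pipeline would validate, transform, and land this in S3/Redshift.
    print(event.get("event_type"), event.get("occurred_at"))
```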


Requirements



  • Experience with AWS data technologies (EC2, EMR, S3, Redshift, ECS, Data Pipeline, etc) and infrastructure.

  • Working knowledge of big data frameworks such as Apache Spark, Kafka, Zookeeper, Hadoop, Flink, Storm, etc.

  • Rich experience with Linux and database systems

  • Experience with relational and NoSQL databases, query optimization, and data modelling

  • Familiar with one or more of the following: Scala/Java, SQL, Python, Shell, Golang, R, etc

  • Experience with container technologies (Docker, k8s), Agile development, DevOps and CI tools.

  • Excellent problem-solving skills

  • Excellent verbal and written communication skills 

Accenture
  • Atlanta, GA
Are you ready to step up to the New and take your technology expertise to the next level?
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a highly collaborative and growing network of technology and data experts, who are taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. You will have an opportunity to work in roles such as Data Scientist, Data Engineer, or Chief Data Officer covering all aspects of Data including Data Management, Data Governance, Data Intelligence, Knowledge Graphs, and IoT. Come grow your career in Technology at Accenture!
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture, and make delivering innovative work part of your extraordinary career.
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward.
As part of our Advanced Technology & Architecture (AT&A) practice, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of technology experts who are highly collaborative, taking on today's biggest, most complex business challenges. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in technology at Accenture!
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
    • Produce clean, standards-based, modern code with an emphasis on advocacy toward end users to produce high-quality software designs that are well documented.
    • Demonstrate an understanding of technology and digital frameworks in the context of data integration.
    • Ensure code and design quality through the execution of test plans and assist in development of standards, methodology and repeatable processes, working closely with internal and external design, business, and technical counterparts.
Big Data professionals develop deep next generation Analytics skills to support Accenture's data and analytics agendas, including skills such as Data Modeling, Business Intelligence, and Data Management.
A professional at this position level within Accenture has the following responsibilities:
    • Utilizes existing methods and procedures to create designs within the proposed solution to solve business problems.
    • Understands the strategic direction set by senior management as it relates to team goals.
    • Contributes to design of solution, executes development of design, and seeks guidance on complex technical challenges where necessary.
    • Primary upward interaction is with direct supervisor.
    • May interact with peers, client counterparts and/or management levels within Accenture.
    • Understands methods and procedures on new assignments and executes deliverables with guidance as needed.
    • May interact with peers and/or management levels at a client and/or within Accenture.
    • Determines methods and procedures on new assignments with guidance.
    • Decisions often impact the team in which they reside.
    • Manages small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
Basic Qualifications
    • Minimum of 2+ years of hands-on technical experience implementing or supporting Big Data solutions utilizing Hadoop.
    • Experience developing solutions utilizing at least two of the following (a brief Spark-and-Hive illustration follows this list):
      • Kafka-based streaming services
      • R Studio
      • Cassandra, MongoDB
      • MapReduce, Pig, Hive
      • Scala, Spark
      • Knowledge on Jenkins, Chef, Puppet
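
Purely as a hedged illustration of the Spark and Hive items above, the sketch below queries a Hive-managed table from PySpark and writes an aggregate back to the metastore. The table and column names are hypothetical.

# Minimal sketch: aggregate a Hive table with PySpark and persist the result.
# Table and column names are illustrative assumptions only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("hive-aggregation")
         .enableHiveSupport()          # read tables registered in the Hive metastore
         .getOrCreate())

# Placeholder table of daily transaction records.
txns = spark.table("warehouse.transactions")

daily_totals = (txns
                .groupBy("txn_date", "merchant_id")
                .agg(F.sum("amount").alias("total_amount"),
                     F.count("*").alias("txn_count")))

# Persist the aggregate as a new Hive table for downstream reporting.
daily_totals.write.mode("overwrite").saveAsTable("warehouse.daily_merchant_totals")

spark.stop()
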
Preferred Qualifications
    • Hands-on technical experience utilizing Python
    • Full life cycle development experience
    • Experience with delivering Big Data Solutions in the cloud with AWS or Azure
    • Ability to configure and support API and OpenSource integrations
    • Experience administering Hadoop or other Data Science and Analytics platforms using the technologies above
    • Experience with DevOps support
    • Designing ingestion, low-latency, and visualization clusters to sustain data loads and meet business and IT expectations
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H1-B visa, F-1 visa (OPT), TN visa or any other non-immigrant status).
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is a Federal Contractor and an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women.
Accenture
  • Raleigh, NC
Are you ready to step up to the New and take your technology expertise to the next level?
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a highly collaborative and growing network of technology and data experts, who are taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. You will have an opportunity to work in roles such as Data Scientist, Data Engineer, or Chief Data Officer covering all aspects of Data including Data Management, Data Governance, Data Intelligence, Knowledge Graphs, and IoT. Come grow your career in Technology at Accenture!
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture, and make delivering innovative work part of your extraordinary career.
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward.
As part of our Advanced Technology & Architecture (AT&A) practice, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of technology experts who are highly collaborative, taking on today's biggest, most complex business challenges. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in technology at Accenture!
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
    • Produce clean, standards-based, modern code with an emphasis on advocacy toward end users to produce high-quality software designs that are well documented.
    • Demonstrate an understanding of technology and digital frameworks in the context of data integration.
    • Ensure code and design quality through the execution of test plans and assist in development of standards, methodology and repeatable processes, working closely with internal and external design, business, and technical counterparts.
Big Data professionals develop deep next generation Analytics skills to support Accenture's data and analytics agendas, including skills such as Data Modeling, Business Intelligence, and Data Management.
A professional at this position level within Accenture has the following responsibilities:
    • Utilizes existing methods and procedures to create designs within the proposed solution to solve business problems.
    • Understands the strategic direction set by senior management as it relates to team goals.
    • Contributes to design of solution, executes development of design, and seeks guidance on complex technical challenges where necessary.
    • Primary upward interaction is with direct supervisor.
    • May interact with peers, client counterparts and/or management levels within Accenture.
    • Understands methods and procedures on new assignments and executes deliverables with guidance as needed.
    • May interact with peers and/or management levels at a client and/or within Accenture.
    • Determines methods and procedures on new assignments with guidance.
    • Decisions often impact the team in which they reside.
    • Manages small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
Basic Qualifications
    • Minimum of 2+ years of hands-on technical experience implementing or supporting Big Data solutions utilizing Hadoop.
    • Experience developing solutions utilizing at least two of the following:
      • Kafka-based streaming services
      • R Studio
      • Cassandra, MongoDB
      • MapReduce, Pig, Hive
      • Scala, Spark
      • Knowledge on Jenkins, Chef, Puppet
Preferred Qualifications
    • Hands-on technical experience utilizing Python
    • Full life cycle development experience
    • Experience with delivering Big Data Solutions in the cloud with AWS or Azure
    • Ability to configure and support API and OpenSource integrations
    • Experience administering Hadoop or other Data Science and Analytics platforms using the technologies above
    • Experience with DevOps support
    • Designing ingestion, low-latency, and visualization clusters to sustain data loads and meet business and IT expectations
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H1-B visa, F-1 visa (OPT), TN visa or any other non-immigrant status).
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is a Federal Contractor and an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women.
Riccione Resources
  • Dallas, TX

Sr. Data Engineer Hadoop, Spark, Data Pipelines, Growing Company

One of our clients is looking for a Sr. Data Engineer in the Fort Worth, TX area! Build your data expertise with projects centering on large Data Warehouses and new data models! Think outside the box to solve challenging problems! Thrive in the variety of technologies you will use in this role!

Why should I apply here?

    • Culture built on creativity and respect for engineering expertise
    • Nominated as one of the Best Places to Work in DFW
    • Entrepreneurial environment, growing portfolio and revenue stream
    • One of the fastest growing mid-size tech companies in DFW
    • Executive management with past successes in building firms
    • Leader of its technology niche, setting the standards
    • A robust, fast-paced work environment
    • Great technical challenges for top-notch engineers
    • Potential for career growth, emphasis on work/life balance
    • A remodeled office with a bistro, lounge, and foosball

What will I be doing?

    • Building data expertise and owning data quality for the transfer pipelines that you create to transform and move data to the company's large Data Warehouse (a brief data-quality check sketch follows this list)
    • Architecting, constructing, and launching new data models that provide intuitive analytics to customers
    • Designing and developing new systems and tools to enable clients to optimize and track advertising campaigns
    • Using your expert skills across a number of platforms and tools such as Ruby, SQL, Linux shell scripting, Git, and Chef
    • Working across multiple teams in high visibility roles and owning the solution end-to-end
    • Providing support for existing production systems
    • Broadly influencing the company's clients and internal analysts
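
As a hedged example of the data-quality ownership described above, the sketch below runs a few row-level checks on a staged CSV extract before it is loaded into the warehouse. The file path and column names are invented for illustration.

# Minimal sketch: basic data-quality checks on a staged CSV extract before loading.
# File path and column names are illustrative assumptions only.
import csv
from datetime import datetime

REQUIRED_COLUMNS = {"order_id", "customer_id", "order_date", "amount"}

def check_extract(path):
    """Return a list of human-readable data-quality violations found in the file."""
    violations = []
    seen_ids = set()
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
        if missing:
            return [f"missing required columns: {sorted(missing)}"]
        for line_no, row in enumerate(reader, start=2):
            if row["order_id"] in seen_ids:
                violations.append(f"line {line_no}: duplicate order_id {row['order_id']}")
            seen_ids.add(row["order_id"])
            try:
                if float(row["amount"]) < 0:
                    violations.append(f"line {line_no}: negative amount")
            except ValueError:
                violations.append(f"line {line_no}: non-numeric amount")
            try:
                datetime.strptime(row["order_date"], "%Y-%m-%d")
            except ValueError:
                violations.append(f"line {line_no}: bad order_date format")
    return violations

if __name__ == "__main__":
    for problem in check_extract("staged/orders_extract.csv"):   # placeholder path
        print(problem)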

What skills/experiences do I need?

    • B.S. or M.S. degree in Computer Science or a related technical field
    • 5+ years of experience working with Hadoop and Spark
    • 5+ years of experience with Python or Ruby development
    • 5+ years of experience with efficient SQL (Postgres, Vertica, Oracle, etc.)
    • 5+ years of experience building and supporting applications on Linux-based systems
    • Background in engineering Spark data pipelines
    • Understanding of distributed systems

What will make my résumé stand out?

    • Ability to customize an ETL or ELT pipeline
    • Experience building an actual data warehouse schema (a short star-schema sketch follows this list)
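
As a small, hedged sketch of what "building an actual data warehouse schema" can look like, the snippet below creates one fact table and two dimension tables of a star schema using Python's built-in sqlite3 module; the table and column names are invented, and a production warehouse would of course target Postgres, Vertica, Oracle, or similar.

# Minimal sketch: a tiny star schema (one fact table, two dimensions) built with sqlite3.
# Table and column names are illustrative assumptions only.
import sqlite3

DDL = """
CREATE TABLE dim_date (
    date_key    INTEGER PRIMARY KEY,   -- e.g. 20240131
    full_date   TEXT    NOT NULL,
    month       INTEGER NOT NULL,
    year        INTEGER NOT NULL
);

CREATE TABLE dim_campaign (
    campaign_key  INTEGER PRIMARY KEY,
    campaign_name TEXT NOT NULL,
    channel       TEXT NOT NULL
);

CREATE TABLE fact_ad_impressions (
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    campaign_key INTEGER NOT NULL REFERENCES dim_campaign(campaign_key),
    impressions  INTEGER NOT NULL,
    clicks       INTEGER NOT NULL,
    spend        REAL    NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
tables = [r[0] for r in conn.execute("SELECT name FROM sqlite_master WHERE type='table'")]
print("star schema created:", tables)
conn.close()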

Location: Fort Worth, TX

Citizenship: U.S. citizens and those authorized to work in the U.S. are encouraged to apply. This company is currently unable to provide sponsorship (e.g., H1B).

Salary: $115-130k + 401k Match

---------------------------------------------------


~SW1317~

Signify Health
  • Dallas, TX

Position Overview:

Signify Health is looking for a savvy Data Engineer to join our growing team of deep learning specialists. This position is responsible for evolving and optimizing data and data pipeline architectures, as well as optimizing data flow and collection for cross-functional teams. The Data Engineer will support software developers, database architects, data analysts, and data scientists. The ideal candidate is self-directed, passionate about optimizing data, and comfortable supporting the data wrangling needs of multiple teams, systems and products.

If you enjoy providing expert-level IT technical services, including the direction, evaluation, selection, configuration, implementation, and integration of new and existing technologies and tools, while working closely with IT team members, data scientists, and data engineers to build our next generation of AI-driven solutions, we will give you the opportunity to grow personally and professionally in a dynamic environment. Our projects are built on cooperation and teamwork, and you will find yourself working together with other talented, passionate and dedicated team members, all working towards a shared goal.

Essential Job Responsibilities:

  • Assemble large, complex data sets that meet functional / non-functional business requirements
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing data models for greater scalability, etc.
  • Leverage Azure for extraction, transformation, and loading of data from a wide variety of data sources in support of AI/ML initiatives (a brief sketch follows this list)
  • Design and implement high performance data pipelines for distributed systems and data analytics for deep learning teams
  • Create tool-chains for analytics and data scientist team members that assist them in building and optimizing AI workflows
  • Work with data and machine learning experts to strive for greater functionality in our data and model life cycle management capabilities
  • Communicate results and ideas to key decision makers in a concise manner
  • Comply with applicable legal requirements, standards, policies and procedures including, but not limited to, compliance requirements and HIPAA.
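
As a hedged sketch of the Azure-based extraction and transformation step above, the snippet below pulls a CSV extract from Azure Blob Storage with the azure-storage-blob SDK and applies a simple cleanup in pandas before hand-off to a load step. The connection string, container, blob path, and column names are hypothetical.

# Minimal sketch: pull a CSV extract from Azure Blob Storage and clean it with pandas.
# Connection string, container, blob path, and column names are illustrative assumptions only.
import io
import os

import pandas as pd
from azure.storage.blob import BlobServiceClient

# Assume the storage connection string is supplied via environment configuration.
service = BlobServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONNECTION_STRING"])
blob = service.get_blob_client(container="raw-extracts", blob="claims/2024-01/claims.csv")

raw_bytes = blob.download_blob().readall()
claims = pd.read_csv(io.BytesIO(raw_bytes))

# Simple transformation: normalize column names and drop obviously invalid rows.
claims.columns = [c.strip().lower() for c in claims.columns]
clean = claims.dropna(subset=["member_id", "claim_amount"])
clean = clean[clean["claim_amount"] >= 0]

# Hand the cleaned frame off to the load step (for example, a bulk insert into the warehouse).
print(f"{len(clean)} clean rows out of {len(claims)} extracted")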


Qualifications:Education/Licensing Requirements:
  • High school diploma or equivalent.
  • Bachelor's degree in Computer Science, Electrical Engineering, Statistics, Informatics, Information Systems, or another quantitative field, or equivalent work experience.


Experience Requirements:
  • 5+ years of experience in a Data Engineer role.
  • Experience using the following software/tools preferred:
    • Experience with big data tools: Hadoop, Spark, Kafka, etc.
    • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
    • Experience with AWS or Azure cloud services.
    • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
    • Experience with object-oriented/object function scripting languages: Python, Java, C#, etc.
  • Strong work ethic, the ability to work both collaboratively and independently without a lot of direct supervision, and solid problem-solving skills
  • Must have strong communication skills (written and verbal), and possess good one-on-one interpersonal skills.
  • Advanced working knowledge of SQL, experience with relational databases and query authoring, and working familiarity with a variety of databases
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable big data stores.
  • 2 years of experience in data modeling, ETL development, and Data warehousing
 

Essential Skills:

  • Fluently speak, read, and write English
  • Fantastic motivator and leader of teams with a demonstrated track record of mentoring and developing staff members
  • Strong point of view on who to hire and why
  • Passion for solving complex system and data challenges and desire to thrive in a constantly innovating and changing environment
  • Excellent interpersonal skills, including teamwork and negotiation
  • Excellent leadership skills
  • Superior analytical abilities, problem solving skills, technical judgment, risk assessment abilities and negotiation skills
  • Proven ability to prioritize and multi-task
  • Advanced skills in MS Office

Essential Values:

  • In Leadership: Do what's right, even if it's tough
  • In Collaboration: Leverage our collective genius, be a team
  • In Transparency: Be real
  • In Accountability: Recognize that if it is to be, it's up to me
  • In Passion: Show commitment in heart and mind
  • In Advocacy: Earn trust and business
  • In Quality: Ensure that what we do, we do well

Working Conditions:
  • Fast-paced environment
  • Requires working at a desk and use of a telephone and computer
  • Normal sight and hearing ability
  • Use office equipment and machinery effectively
  • Ability to ambulate to various parts of the building
  • Ability to bend, stoop
  • Work effectively with frequent interruptions
  • May require occasional overtime to meet project deadlines
  • Lifting requirements of